Monthly Archives: January 2012

16×2 LCD Display Contrast

A few weeks ago, I put in an order to Jameco for some parts that’ll help me complete some of the projects in Michael McRoberts’s book Beginning Arduino.

One of the more interesting components was this KST1602B 16×2 LCD Display.  I dunno about you all, but ever since I saw one on a digital calculator, I’ve wanted to bend one of these bad boys to my will… and now I have! … well, sorta.

The first time I wired this up, I got a totally blank screen. I had assumed that I did *something* wrong and went through and re-verified that things were connected right.  Finally, I eyed the resistor that I had been using on the “Contrast” pin on the LCD.  (V0 in the case of this one).  I had chosen a resistor close to that described in the book, but meh… so I pulled out the potentiometer from my last project and hooked that up to pin V0.  It turns out that the resistance needs to be a lot more than what I had expected!

To the left you can see an image with three contrast settings on the potentiometer.  The top one is max (about 10k ohms), the middle one is around 8k ohms, and the last one represents anything less than, say, 7k ohms.

When you look at the data sheet PDF, the section called “Adjusting Display Contrast” seems to suggest that you can use a resistance of between 20k and 50k ohms… that’s a HUGE range considering that you can go from visible to NO contrast in less than 2k ohms.  I feel like the contrast range should be a little more linear… but that’s just me.


Using Background Images in Blender

While I was building the iPhone4 model, I began to realize that my proportions were off.  The Home Button seemed out of place. The front camera and speaker were not quite the right shape.  The buttons and breaks in the seams were a bit off from where they needed to be… the list went on and on.

It turns out that Blender has a feature for exactly this problem: background images.  This lets you set an image as the background of your modeling space, allowing you to match up your mesh boundaries to features of the image!  Obviously, care has to be taken when choosing these kinds of images, but assuming you can find fairly orthogonal shots of the front, back, left, right, top and bottom, you can model very precisely from actual photographs.

Ok. How can you set background images?  Go to your View menu and click Properties. That should pop up a properties pane.  Near the bottom of the properties pane, you’ll see the widget for background images.  Check the check box and click “Add Image”.  Select which view you’d like this image to show up in (front, left, right, back, etc.) and expand the “Not Set” arrow. That reveals a small dialog with a file picker.

You can add an image for every orthogonal view, so when you hit a perspective key on the number pad, the appropriate background image is displayed.

After digging around the web for a while, I finally found a set of images that could help me re-tune my iPhone mesh… here’s the mesh with the “front” background image

Smooth Analog Input Class for Arduino

In the above photo, you see my Arduino Uno with a mini-breadboard ProtoShield.

Here, I wired it up to accept data from a potentiometer on analog input A0.  I also have a standard servo connected to digital pin 7.  The point is: turning the potentiometer produces a corresponding motion in the servo.

Here’s the initial code that I used for making this happen:

#include <Servo.h>
#include <SmoothAnalogInput.h>

Servo servo;
int last;
void setup() {
    servo.attach(7);
    Serial.begin(9600);
    last = -1;
}

void loop() {
  int current = analogRead(A0);
  int scaled = map(current, 0, 1023, 0, 180);  // analogRead() tops out at 1023
  if (scaled != last) {
    last = scaled;
    Serial.println(scaled);
    servo.write(scaled);
  }
}

Pretty simple stuff.  You see I attach the Servo object to pin 7 and within the loop() section, read from A0, scale the value, then write it to the servo! It works!

One kind of annoying thing that I noticed was that the input from the potentiometer wasn’t terribly stable.  Even when it was just sitting there, it would output values that varied by up to plus/minus 5.  Because of this, my servo was constantly twitching.  Not only is that probably not good for the servo, you could constantly hear it too… so it was annoying.

It turns out that in one of the Arduino Tutorials, they talk about analog input smoothing.  Essentially, you take 10 readings and average them.  That’ll give you a more stable value.  The problem with using the code from the tutorial is, it’s ungainly.  It would be nice if we could create an object, attach it to A0 and have it automatically smooth incoming values for us… well, that’s what I wrote!

You can grab the actual code from here: https://github.com/rl337/Arduino/blob/master/libraries/SmoothAnalogInput

The prototype for the object looks like this:

#define SMOOTH_ANALOG_INPUT_SIZE 32

class SmoothAnalogInput {
    public:
        SmoothAnalogInput();
        void attach(int pin);
        void scale(int min, int max);
        int read();
        int raw();
};

The object smooths data over 32 samples. You attach() the object to whatever pin you want it to smooth input for, in our case, A0. The raw() method takes a reading and returns the raw value from the analog input. The read() method returns the smoothed result, which can be automatically scaled between a min/max value if you call the scale() method. Here’s the program reworked to use the SmoothAnalogInput object:

#include <Servo.h>
#include <SmoothAnalogInput.h>

Servo servo;
SmoothAnalogInput ai;
int last;
void setup() {
  Serial.begin(9600);
  servo.attach(7);
  ai.attach(A0);
  last = -1;
}

void loop() {
  int sensorReading = ai.read();

  int scaled = map(sensorReading, 0, 1023, 0, 180);
  if (scaled != last) {
    last = scaled;
    Serial.println(scaled);
    servo.write(scaled);
  }
}

Looking at the Serial Monitor, there are still a few cases where the servo fidgets, but by and large, the values become stable quickly after you stop moving the potentiometer.

iPhone4 Blender Model – Still incomplete, but oh well..

Continuing my efforts to fill in all the models needed to render the first Little Robots Script, I started on modeling a phone… in this case, my iPhone4.

I thought, incorrectly, that the simple shape of the phone would make the modeling go by pretty quickly… but it turns out that there is some trickiness involved in getting the little buttons and indentations into the body itself.  There are also texturing complications.  You want the screen to be a single rectangular texture so that when you animate, you don’t have to worry about strange artifacts involving the screen content being stretched or otherwise transformed.

On this model, I still need to add the speaker holes and interface jack at the bottom as well as the buttons on the side.  I need to revisit the front surface though. I’d like to actually put a transparent layer over it just like the real thing… I’ll be able to easily texture that with smudges and blemishes.

Here’s what I have rendered in-scene…

The metal texturing still isn’t quite right.  For some reason I’m getting strange textured reflections from the surface of the phone. You can see it toward the home button in the render.  It looks like it’s reflecting the wall, but the stucco pattern is way too emphasized.

N-Body Simulation

A long time ago, I created a simple animation using POV-Ray that simulated a Neptune-like planet that I named Augustus-Voltaire 4.  I used POV-Ray’s built-in programmatic texturing features to create a blue banded atmosphere that had white speckled high atmospheric clouds. You can see the video here… It’s got some pretty bad aliasing features due to compression, but you can get a basic idea of what I was going for…

Ultimately, I wasn’t happy with the result, so now years later, I’ve decided to take a stab at the problem again…  This time, armed with a much faster computer!  But how do you simulate turbulent, banded clouds?

I came up with a couple of different ideas for how to do it… but the only one that seemed to produce anything vaguely interesting was N-Body simulation.  Essentially, you assume that the atmosphere is a fluid and realize that fluids can be approximated with particles.  In the simulation, you throw in a few thousand simulated particles that interact with each other in set ways (in my case, they all repel each other in an enclosed space).

I wrote the simulator in Java… and it is kind of slow.  Here is my first real attempt… it took about 18 hours to render.  The source is too large to include here in the post.  If you’re interested in seeing it, drop me an email.

In this simulation, the Red and Blue components of the color describe a particle’s mass: the redder the particle, the heavier it is.  The Green component of the color is controlled by how fast the particle is going.   I introduced a force pushing through the center, with an intensity that tapers off toward the top and bottom.  The simulation quickly reached a form of equilibrium… which is kinda cool, but doesn’t make for very interesting weather patterns.

One thing I noted as I watched this run was that the particles tended to segregate themselves based on mass.  The heavier stuff accumulated near the center of rotation and the lighter stuff was pushed out to the edges.

In my second N-Body simulation, I used far fewer particles but made each particle way more massive.  I also changed the shape of the force pushing through the center of the simulation, making it much more narrow.

Once again, you can see that we quickly reach a stable pattern.  This one seems to have much more circulation, though.  It seems like bigger particles are the way to go.   I have 2 more avenues to explore as far as particles go…

  • Make the simulation have a 3rd dimension.  This ought to cause different masses to group in layers. That might be visually interesting.
  • Add more complex ambient forces than the simple down-the-middle force.