Andrew's official page

2005 Jul. 11

Wow, it seems it has been exactly three months since I last updated this page! Not that nothing happened in the meantime... Anyway, the news of today is that I've integrated the two webcams into Tumbleweed and started seriously working on its vision system. I've bought two fish-eye lenses for the two cameras with a 120° field-of-view. I need that so that the robot will see into the curves it's turning into without actually moving the cameras. I got the lenses from Fry's and I was fortunate enough that, though they were designed for PAL/NTSC security cameras, they fit my old D-Link webcams. I've also mounted the cameras on the back side of the R/C car platform and will reverse the drive as well. The reason for that is that in reverse it is easier to see into the curve of the turn, which means I can make sharper turns with the available field-of-view.

Well, the plan with the fixed but wide-angle cameras was great, and in fact I think it will work out well, but there are two catches. Catch No. 1 is that with such a wide angle (and with such cheap equipment) the edges of the image are rather blurred. In other words, the sensor cannot be fully in focus over its entire area. This is a small issue for my particular application, however; in fact, I might even benefit from the removed noise and clutter in the images.

The second problem, however, is bigger: since the lenses have such a wide field-of-view, they produce significant barrel distortion. Since the disparity calculation that I would like to use is extremely sensitive to vertical shifts between corresponding pixels, this is just not acceptable.

I needed a lens-distortion correction algorithm. Unfortunately the most popular way of doing it, publicized on the web in many places, does not work for fish-eye lenses. It only works for relatively small distortions. I had to come up with a solution on my own. Knowing, however, that the fish-eye projection is fairly similar to a spherical projection, I was able to figure out the right equations to undistort the image.
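To give a rough idea of the math involved (this is a sketch only, assuming the common equidistant r = f·θ fish-eye model and a made-up focal length; my actual lenses may follow a slightly different projection), mapping an undistorted pixel back to its source location in the fish-eye image looks something like this:

```python
import math

def fisheye_source(x_u, y_u, cx, cy, f):
    """Map an undistorted (rectilinear) pixel back to its source
    coordinate in the fish-eye image, assuming an equidistant
    (r_d = f * theta) projection model."""
    dx, dy = x_u - cx, y_u - cy
    r_u = math.hypot(dx, dy)       # radius in the undistorted image
    if r_u == 0.0:
        return cx, cy              # the optical center maps to itself
    theta = math.atan(r_u / f)     # angle of the incoming ray
    r_d = f * theta                # radius in the fish-eye image
    scale = r_d / r_u
    return cx + dx * scale, cy + dy * scale
```

Note that since atan(x) < x, points away from the center always map inward, which is exactly the barrel distortion being undone.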

The image looks rather strange, due to the extreme wide angle, but if you look at the straight lines of the chair or the rug, you'll see that the projection is actually pretty accurate. There was a new problem however: the equations needed trigonometric functions (arc-tangent) to be calculated for each pixel. That's not something I can handle in real time, so I needed to do something else. After some tinkering I came up with a good way of caching the required data. I basically pre-calculate all the distortion vectors for each pixel and store them in two arrays. Since the image resolution I'm working with is only 320x240, those arrays are not too big. That way, all I have to do in real time is index into these arrays and access the right pixel.
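The caching scheme boils down to this (a sketch, not my actual code: `source_of` stands for whatever undistortion math is in use, and the frames are plain nested lists here for clarity):

```python
def build_lut(width, height, source_of):
    """Offline step: for every output pixel, pre-compute the integer
    source pixel in the raw (distorted) image.  'source_of(x, y)' is
    the expensive trigonometric mapping -- it runs only once."""
    map_x = [[0] * width for _ in range(height)]
    map_y = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            sx, sy = source_of(x, y)
            # clamp so the real-time code can index without bounds checks
            map_x[y][x] = min(max(int(sx), 0), width - 1)
            map_y[y][x] = min(max(int(sy), 0), height - 1)
    return map_x, map_y

def undistort(frame, map_x, map_y):
    """Real-time step: pure table lookups, no trigonometry at all."""
    return [[frame[map_y[y][x]][map_x[y][x]] for x in range(len(map_x[0]))]
            for y in range(len(map_x))]
```

At 320x240 the two tables hold 76,800 entries each, small change for the per-frame savings.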

Well, almost. There's one more thing left to do, which is interpolation to reduce the sampling artifacts towards the edges. With all that, the algorithm is still a bit slow: it takes about 9 ms to un-distort an image. When you plan on something around 5 fps of total processing, the per-frame budget is about 200 ms, so 18-20 ms just to pre-process the images (twice the time, since we have a left and a right image) is around 10% of the total processing power available. I'll see how it goes and whether I need to optimize this algorithm even further, but for now I'm satisfied.
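The interpolation I'm referring to is plain bilinear resampling: instead of snapping a fractional source coordinate to the nearest pixel, blend the four surrounding pixels weighted by distance. A sketch (for a single-channel frame stored as nested lists):

```python
def bilinear_sample(frame, sx, sy):
    """Sample the raw image at a fractional source coordinate (sx, sy)
    by blending the four surrounding pixels."""
    x0, y0 = int(sx), int(sy)
    x1 = min(x0 + 1, len(frame[0]) - 1)   # clamp at the right/bottom edge
    y1 = min(y0 + 1, len(frame) - 1)
    fx, fy = sx - x0, sy - y0             # fractional parts = blend weights
    top = frame[y0][x0] * (1 - fx) + frame[y0][x1] * fx
    bot = frame[y1][x0] * (1 - fx) + frame[y1][x1] * fx
    return top * (1 - fy) + bot * fy
```

The fractional parts can be pre-computed and cached in the same pass as the lookup tables, so the per-pixel cost stays at a few multiply-adds.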

I will follow up with a short discussion of the algorithm I'm using and its implementation shortly.

2005 Apr. 11

The big work that I've started was the integration of the robot base. I've had a 1:10 scale R/C car from Traxxas, called the Stampede. I've used it as a base, but removed all the electronics from it. I've used my servo controller to control the steering and the H-bridge to control the speed. I bought an i815e single-board computer off of eBay and planned on using that as the brains of the robot. I also bought an M1 DC-DC ATX power supply. The former is a socket-370 motherboard the size of a 5 1/4" floppy drive, and the latter is an automotive power supply. What was interesting for me about the power supply is the claimed wide input range: 6-24V. Since I didn't know whether I would run my robot from one or two 6-pack NiMH batteries (7.2 or 14.4 volts), this supply was a perfect match. I've added a 1GHz PIII CPU and half a gig of RAM, plus a small laptop hard drive with Windows XP on it, to complete the setup. Later on I will try to reduce the power consumption by using a lower-power CPU (or switching to a VIA EDEN based motherboard) and replacing the hard drive with compact flash, but first I have to see how much processing power I need. The OS is also transitional; in the end I will have something embedded (Windows XP Embedded, WinCE or embedded Linux) running on it. I've created a separate webpage to document the progress of this robot.

For the integration work I had to interface the TWI bus controlled µModules with this PC. One obvious approach, and the one I'd been using while developing the software for them, is the printer port and some support electronics. However, I had a better idea. As it turns out, this motherboard contains an IT8712F interface chip to handle the serial ports, the printer port, the floppy and all the rest of the 'legacy' peripherals. Now, the datasheet of the chip reveals that it also contains a smart-card interface that happens to be wired out to a connector on my motherboard. It also turns out that most of the pins of this chip can be used as general-purpose I/O pins, including the smart-card reader ones. These GPIO pins are rather intelligent as well. Not only can they be programmed for input or output independently, they also have an optional pull-up resistor. That's all I needed for a bit-banged TWI (alias I2C) implementation. So I decided to go down that route and whipped up a simple program to interface TWI devices over two of those GPIO pins. The experiment worked out beautifully, and now I have both µModules hooked up to this motherboard using two of the GPIO pins of this device.
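To illustrate what "bit-banged TWI" means, here's a sketch of the master side of a byte write (in Python for readability; the real program pokes the IT8712F GPIO registers directly, and the `set_scl`/`set_sda`/`get_sda` names on the `bus` object are made up for this illustration):

```python
class BitBangTWI:
    """Sketch of a bit-banged TWI (I2C) master over two GPIO pins.
    Setting a line to 1 means releasing it and letting the pull-up
    raise it; setting it to 0 means actively driving it low."""

    def __init__(self, bus):
        self.bus = bus   # anything with set_scl / set_sda / get_sda

    def start(self):
        # START condition: SDA falls while SCL is high
        self.bus.set_sda(1); self.bus.set_scl(1)
        self.bus.set_sda(0); self.bus.set_scl(0)

    def stop(self):
        # STOP condition: SDA rises while SCL is high
        self.bus.set_sda(0); self.bus.set_scl(1)
        self.bus.set_sda(1)

    def write_byte(self, byte):
        """Clock out 8 bits MSB-first, then read the slave's ACK."""
        for i in range(7, -1, -1):
            self.bus.set_sda((byte >> i) & 1)
            self.bus.set_scl(1)        # slave samples SDA while SCL is high
            self.bus.set_scl(0)
        self.bus.set_sda(1)            # release SDA for the ACK bit
        self.bus.set_scl(1)
        ack = self.bus.get_sda() == 0  # slave pulls SDA low to acknowledge
        self.bus.set_scl(0)
        return ack
```

The optional pull-ups on the GPIO pins matter because I2C is an open-drain bus: no device ever drives a line high, it only releases it.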

Looking ahead I have big plans: now that I have the basic pieces together, I have to integrate the servo- and motor-control applications. I will also have to re-tune the PID loop, since the added load on the motor obviously threw it out of its stable range.

Once it's all done I should be able to start measuring the precision of the navigation using back-EMF feedback alone. My guess is that it will not be that great, but it will at least give me a rough estimate of the location of the robot.
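The idea, in a nutshell: the back-EMF measurement gives a speed reading, and integrating speed along the commanded steering angle gives a pose estimate. One update step of that dead reckoning might look like this (a sketch only, using a simple bicycle model; the wheelbase and time step are illustrative, not measured from the Stampede):

```python
import math

def dead_reckon(pose, speed, steer_angle, wheelbase, dt):
    """One dead-reckoning update using a bicycle model.
    'speed' would come from the back-EMF reading, 'steer_angle'
    from the servo command.  pose = (x, y, heading) in meters/radians."""
    x, y, heading = pose
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += speed * math.tan(steer_angle) / wheelbase * dt
    return (x, y, heading)
```

Since both inputs are noisy and the errors accumulate, this can only ever give a rough estimate, which is exactly what I expect to measure.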

I think I should also tell you about the reasons for the M1 power supply. You might recall that in December I received the PCBs for my own ATX DC-DC power supply. Well, that turned out to be a disaster. I have not been able to stabilize the control loop of the PWM controller chip when both channels were turned on. My guess is that the power supply pin picked up too much noise from its own switching circuit, which threw the loop out of its stable zone. But I have not been able to mitigate that noise no matter what I've tried. One of my last ideas was that maybe the over-current protection was acting up and that was the reason for the behavior I'd seen, so I disconnected it. That turned out to be a fatal mistake. Shortly after, I managed to short-circuit one of the outputs... Well, I hadn't seen electronics catching on fire until then. Now I have. I guess I don't have to say that the board was toast afterwards; there wasn't much I could do. And I didn't feel like putting even more money into it and possibly burning more components with the experiments. I went the sure-fire way of ordering something that is expected to work and, if not, would at least have a warranty on it. To my greatest surprise, when the power supply arrived, I discovered that they had used the same PWM controller chip that I had so much trouble with. Well, at least I can't say it can't be done...

8/30/2004 - New engine deployed

The new website engine is done, and I've finished converting the site content to the new layout. This is a huge update that moves all the processing from the client side to a separate compilation step. Neither the server nor the client is required to support scripting.

© 2004 Andras Tantos