Yet Another Robot
So, while I was flying back home from RobotsConf (which was awesome, btw), I was paging through the copy of Make Magazine that they gave us. On page 64, I saw this cool little robot called CamBot, shown below.
[Image: CamBot, by Dave Astolfo]
One of the issues I ran into with Roz while working on getting him walking was the constant need to reboot the BeagleBone Black, whether because of USB, networking, or power problems. I decided I wanted a robot to play with that went back to using a microcontroller, but I wanted something more powerful than your typical AVR. I backed the MicroPython Kickstarter, but in the meantime I'm going to use a Teensy 3.1 to control this thing.
So, I fired up my CAD package and came up with the model below. It uses treads from a Lego Technic set, but everything else is 3D printed.
Now, a week later, I have this:
The gear motors are placeholders for the real ones, which I've ordered, along with the optical encoders they support. You can see in the image below that they mount nicely using 1.6mm machine screws. This picture also gives you an idea of the scale - the robot will be 102mm long, 91mm wide, and 50mm high.
Shown below are some of the parts I'm using, including a dual-motor H-bridge and a bunch of tiny connectors.
The robot will have three Sharp analog IR sensors: one on each side and one in front. In addition to the Teensy 3.1 mentioned above, it will have a 9-axis IMU and a Bluetooth module. The Bluetooth module will let me write a simple app on my Android phone (a Nexus 5) to control the robot and get feedback from it. By "control", I really mean things like choosing a mission to run, since this robot (like all my others) will be fully autonomous.

The robot also has an optical encoder on each motor, giving 500 counts per revolution of the drive wheel. That, along with full PWM control of the motors, will let me use a proper PID loop for speed control.
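To give a sense of how the encoders and PWM come together, here's a rough sketch of the kind of PID speed loop I have in mind, written as an Arduino-style program for the Teensy. The pin numbers, gains, and the use of PJRC's Encoder library are placeholders I've made up for illustration, not the final wiring.

```cpp
// Rough sketch of a PID speed loop for one motor on the Teensy 3.1.
// Pin assignments and gains are placeholders, not the real wiring.
#include <Encoder.h>

const int PWM_PIN = 3;               // PWM output to the H-bridge (assumed pin)
const int DIR_PIN = 4;               // direction input on the H-bridge (assumed pin)
const float COUNTS_PER_REV = 500.0;  // encoder counts per drive-wheel revolution

Encoder leftEncoder(5, 6);           // quadrature inputs A/B (assumed pins)

float targetRevPerSec = 1.0;         // desired wheel speed
float kp = 400.0, ki = 150.0, kd = 0.0;  // made-up gains, to be tuned on the real robot

float integral = 0.0, lastError = 0.0;
long lastCount = 0;
unsigned long lastTime = 0;

void setup() {
  pinMode(PWM_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  lastTime = micros();
}

void loop() {
  // Run the control loop at roughly 100 Hz.
  unsigned long now = micros();
  float dt = (now - lastTime) * 1e-6;
  if (dt < 0.01) return;
  lastTime = now;

  // Measure speed from the change in encoder counts since the last pass.
  long count = leftEncoder.read();
  float revPerSec = (count - lastCount) / COUNTS_PER_REV / dt;
  lastCount = count;

  // Standard PID on the speed error.
  float error = targetRevPerSec - revPerSec;
  integral += error * dt;
  float derivative = (error - lastError) / dt;
  lastError = error;

  float output = kp * error + ki * integral + kd * derivative;

  // Drive the H-bridge: sign sets direction, magnitude sets PWM duty.
  digitalWrite(DIR_PIN, output >= 0 ? HIGH : LOW);
  analogWrite(PWM_PIN, constrain((int)abs(output), 0, 255));
}
```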
The motor driver board also provides analog feedback on the current draw of each motor, so I'll be able to detect if/when a motor is stalled.
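Stall detection can then be as simple as watching that current-sense line for a sustained spike. Something along these lines, where the pin and threshold are guesses that will depend on the actual driver board and its sense-resistor scaling:

```cpp
// Rough sketch of stall detection from the driver's current-sense output.
// The pin, threshold, and sample count are assumptions for illustration.
const int CURRENT_SENSE_PIN = A0;    // analog current feedback (assumed pin)
const int STALL_THRESHOLD   = 700;   // raw ADC reading that suggests a stall (assumed)
const int STALL_SAMPLES     = 20;    // require several high readings in a row

int stallCount = 0;

bool motorStalled() {
  // A stalled motor draws much more current than one that is spinning,
  // so a sustained high reading on the sense line is a reasonable stall signal.
  if (analogRead(CURRENT_SENSE_PIN) > STALL_THRESHOLD) {
    stallCount++;
  } else {
    stallCount = 0;
  }
  return stallCount >= STALL_SAMPLES;
}
```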
The robot will use a pair of tiny 250 mAh LiPo batteries in series.
At some point in the future I want to add a very small camera to this robot, and start playing with some very simple visual processing, like blob tracking and fiducial recognition.
Speaking of Roz, I managed to get him walking quite nicely at the conference using the NUKE engine running in Python on the BeagleBone Black. I didn't get any video, but I'll get some this week and post it here.