Brainbot is finally Autonomous
So, it's been a while since I updated here. Things have been busy: I ran into some mechanical issues, and then had a hard time finding a compass sensor that would work on this platform, given the uneven ground it encounters off-road.
I finally solved the compass issue by buying the new SparkFun Razor 9-DOF IMU and loading the provided AHRS software onto it, but even that didn't work quite right: the heading it reported was very non-linear over the full 360 degrees. I fixed it in a brute-force way, by building a turntable I could place Brainbot on and then coding a lookup table that converts the IMU heading into the real heading.
To the right is a graph showing the actual heading versus the IMU heading.
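To illustrate the lookup-table approach, here's a minimal Python sketch (the actual controller is written in Squeak Smalltalk, and the calibration pairs below are invented placeholders, not real turntable data):

```python
import bisect

# (imu_heading_deg, true_heading_deg) pairs collected on the turntable,
# sorted by IMU heading and covering the full circle. These values are
# made up for illustration.
CALIBRATION = [
    (0.0, 0.0),
    (80.0, 90.0),
    (170.0, 180.0),
    (265.0, 270.0),
    (360.0, 360.0),  # wrap-around copy of the first entry
]

def true_heading(imu_deg):
    """Linearly interpolate the calibration table to correct an IMU heading."""
    imu_deg %= 360.0
    xs = [p[0] for p in CALIBRATION]
    i = bisect.bisect_right(xs, imu_deg)
    if i == 0:
        return CALIBRATION[0][1]
    if i == len(CALIBRATION):
        return CALIBRATION[-1][1]
    (x0, y0), (x1, y1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (imu_deg - x0) / (x1 - x0)
    return (y0 + t * (y1 - y0)) % 360.0
```

The denser the turntable samples, the better the interpolation tracks the kind of non-linearity shown in the graph.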
Here's a video showing the first autonomous run:
This is a video showing the second autonomous run, which is the same mission as the first one, but I start the robot on the "wrong" heading, and it auto-corrects (at the beginning):
This is a screenshot of the mission editor, with the mission path (from top left to bottom right) and the logged vehicle track overlaid on top of it. All the navigation right now is done by dead reckoning, using the wheel encoders and the 9-axis IMU.
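One dead-reckoning step might look like the following Python sketch, assuming a corrected heading and a distance increment from the wheel encoders (the names, units, and heading convention are my assumptions; the real controller is written in Squeak Smalltalk):

```python
import math

def dead_reckon(x, y, distance_m, heading_deg):
    """Advance the (x, y) position by one encoder increment along the
    current heading (0 deg = +y "north", increasing clockwise)."""
    theta = math.radians(heading_deg)
    return (x + distance_m * math.sin(theta),
            y + distance_m * math.cos(theta))

# Example: drive 1 m due east (heading 90 deg) from the origin.
x, y = dead_reckon(0.0, 0.0, 1.0, 90.0)
```

Integrating many such steps is what produces the logged track in the screenshot; any residual heading error accumulates, which is why the lookup-table correction mattered so much.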
Now that the heading issue is (hopefully) solved, things should progress at a much more rapid pace. Next up is integrating obstacle avoidance, using the Hokuyo laser scanner, followed by visual servoing using the camera.
2 Comments:
Nice work! What did you code the mission editor in?
By bluehash, May 16, 2010 at 7:04 AM
The mission editor, like the entire autonomous controller, is written in Squeak Smalltalk...
By Unknown, May 16, 2010 at 7:13 AM