Polymax 9000 at the 2009 AHRC Robot Rally
What worked, what didn't
On May 16th, 2009, the Polymax 9000 competed against other bots of various designs in 7 contests at the AHRC Robot Rally. Polymax 9000 took first place in the overall Polyathlon but did not turn in the best score in every event. In fact it was rather disappointing in some and downright embarrassing in others. Here's the rundown on the various events.
Basic line follower
The bot did exactly as expected here. It was beaten by both 3pi robots; when it comes to raw-speed line following, the 3pi design is very fast. We took 3rd place with a time of 5.52 seconds. The 3pi bots ran the course in around 4.4 seconds.
How to improve? More speed! A 20% increase will match the 3pi bots' speed. I've already exchanged the L293D motor driver chip for an LMD18200T. The new chip has MOSFET drivers with 0.3-ohm on-resistance, which means the motors get about 2 more volts to run with. That's a 20% increase and should bring it to parity with the 3pi bots. Adding a couple more AA cells should put it on top, if it can hold the line at such high speeds.
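As a back-of-the-envelope check on that "2 more volts" figure, here is the arithmetic the swap rests on. The saturation voltages and motor current below are typical datasheet-style assumptions, not measurements from the Polymax 9000:

```python
# Rough estimate of motor voltage recovered by swapping the L293D
# (bipolar Darlington outputs) for the LMD18200T (MOSFET outputs).
# Drop figures and motor current are assumed typical values, not
# measured from the actual bot.

I_MOTOR = 1.0  # assumed motor current, amps

# L293D: roughly 1.4 V high-side + 1.2 V low-side saturation drop
l293d_drop = 1.4 + 1.2

# LMD18200T: two 0.3-ohm MOSFETs in the motor current path (I * R each)
lmd18200_drop = 2 * 0.3 * I_MOTOR

gain = l293d_drop - lmd18200_drop
print(f"voltage regained at the motor: ~{gain:.1f} V")
```

Under those assumptions the bridge swap alone returns roughly 2 V to the motors, which is where the parity claim comes from.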
Advanced Line Follower
Once again the 3pi bots dominated by virtue of pure speed and a solid line following algorithm. Polymax 9000 came in second. I tried increasing the speed on the 2nd run, but the lost-line recovery algorithm failed and caused the bot to do a 180-degree turn. At this time I do not know why that happened.
Improvements? Yeah, now that the speed is on par with the 3pi bots, the line tracking algorithm needs some tweaking.
Table Clearing
This was the event I expected to dominate, but instead I took 3rd place: the bot ran off the edge on both tries and got the default 90-second time plus 10 seconds for each object left on the table. Yes, the "well tested" optical edge detectors failed to do their job properly. I take full credit for this screwup.
By the way, the 1st place bot, EEB, was blind. It ran a zig-zag pattern that covered the full table area in 29 seconds, knocking off everything in its path. It only used edge sensors (which actually worked!).
What went wrong? I used off-the-shelf reflective IR optical sensors and chose to power the LEDs all the time and simply measure the reflected light. Bzzzzt! Wrong! The lighting in the room was about twice the level of the room at home where I did all the testing. Enough room light was reflected from the carpet to make the bot believe it was still on the table. I've now fixed that issue: I installed pulsed IR LEDs and photodiodes that totally ignore room lighting. This was fairly easy since I had 2 unused channels in the existing active IR object detector circuit. It works this way: with the LED off, read the photodiode signal and store it in a variable named "bias". Turn on the LED and read the photodiode signal again. Subtract "bias" from this reading to get the true reflected light value.
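The bias-subtraction read described above can be sketched like this. The `set_led()` and `read_photodiode()` functions are hypothetical stand-ins for the real ADC and I/O routines; here they simulate a sensor that sees 300 counts of ambient room light plus 500 counts of LED reflection, so the logic can actually run:

```python
# Sketch of the pulsed-IR edge detector read described above.
# set_led() and read_photodiode() are invented stand-ins for the
# real firmware's I/O; they simulate a photodiode seeing ambient
# room light plus (when the LED is on) light bounced off the table.

AMBIENT = 300    # simulated room-light level (ADC counts)
REFLECTED = 500  # simulated LED light reflected off the table surface
_led_on = False

def set_led(on):
    global _led_on
    _led_on = on

def read_photodiode():
    return AMBIENT + (REFLECTED if _led_on else 0)

def read_edge_sensor():
    set_led(False)
    bias = read_photodiode()   # room light only
    set_led(True)
    raw = read_photodiode()    # room light + LED reflection
    set_led(False)
    return raw - bias          # true reflected-light value

print(read_edge_sensor())  # 500, regardless of the ambient level
```

Because the ambient term appears in both readings, it cancels in the subtraction, which is why the contest hall's brighter lighting no longer matters.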
Beacon Killer + Obstacles
This is another contest I expected to dominate, but I only took 3rd place. The bot had serious problems with initial aiming at the beacon and with obstacle avoidance. It tended to head in a random direction that was unfortunately not towards the beacon, yet it worked perfectly every time at home. We tried turning off the room lights and increasing the beacon brightness (added a 200-watt flood light), but the problems persisted. When obstacles were added, the bot hit one on both runs even though it had performed perfectly at home.
What went wrong? Once again the bot was highly dependent on a specific level of ambient room light. Too high and it thought the beacon was found and headed off in a random direction. When we added an additional flood light and turned off the room lights, the bot measured the brightness of the back wall as sufficient and headed in that direction.
The obstacle non-avoidance was also caused by room lighting and beacon intensity differences. When the room lights were off and my original beacon was used, it hit the first obstacle even after initially avoiding it! Yes, it steered to miss it, noticed a decrease in beacon intensity, and started spinning in place to re-acquire it. The left front corner of the bot clipped the obstacle during the spin cycle. Since the bot sees beacon light + room light, the reduced room lighting caused it to falsely lose beacon lock.
On the second run through the obstacles we turned on the extra flood light to help prevent loss of beacon lock during avoidance maneuvers. That corrected that problem but introduced another that caused the bot to mow down the obstacle closest to the beacon. Why? Well, the bot is not allowed to avoid the beacon itself, so it disables obstacle detection when the light intensity reaches a certain level, indicating it's real close to the beacon. Due to the higher-intensity beacon it turned off the obstacle detector too soon. Oops.
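The cutoff logic just described can be sketched in a few lines, which also shows why a brighter beacon trips it early. The threshold and the intensity numbers are invented for illustration; the real firmware's values are unknown:

```python
# Sketch of the "disable obstacle avoidance near the beacon" rule.
# BEACON_CLOSE_THRESHOLD and the sample readings are made up for
# illustration, not taken from the Polymax 9000 firmware.

BEACON_CLOSE_THRESHOLD = 800  # assumed: above this, "at the beacon"

def obstacles_enabled(beacon_intensity):
    return beacon_intensity < BEACON_CLOSE_THRESHOLD

# Original beacon: reads ~600 at the last obstacle, avoidance still on.
print(obstacles_enabled(600))  # True
# Brighter flood light: reads ~900 at the same spot, avoidance off too soon.
print(obstacles_enabled(900))  # False
```

A fixed absolute threshold bakes one particular beacon brightness into the code; normalizing readings against the intensity measured at the starting line would be one way to make the cutoff relative instead of absolute.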
Needless to say the beacon navigation system sucks and needs a lot of work.
Obstacle Non-avoidance Video
Navigation by Dead Reckoning
We took 1st place in this contest with a 2-inch error after traveling three legs of a 4-foot triangle. The bot performed exactly as expected. WooHoo!
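For reference, the core of any dead-reckoning run is integrating odometry into a pose. This is a generic sketch of that math, not the Polymax 9000's actual firmware; the turn/distance command format is an assumption:

```python
# Generic dead-reckoning pose update: turn by some angle, then
# advance the heading vector by the distance traveled. Not the
# Polymax 9000's code, just the standard math such a run relies on.
import math

def dead_reckon(legs, x=0.0, y=0.0, heading=0.0):
    """legs: list of (turn_degrees, distance_feet) commands."""
    for turn, dist in legs:
        heading += math.radians(turn)
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    return x, y

# Three 4-foot legs of an equilateral triangle (120-degree turns)
# should bring the bot back to its starting point.
x, y = dead_reckon([(0, 4.0), (120, 4.0), (120, 4.0)])
print(round(x, 6), round(y, 6))  # both ~0.0
```

In practice the 2-inch error comes from everything this idealized model ignores: wheel slip, encoder quantization, and turn-angle error accumulating over the three legs.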
Maze Solving
This was not a Polyathlon event but was a separate part of the AHRC Robot Rally. I expected to lose to the 3pi bots big time because I had only programmed right-hand and left-hand following routines. There was no memory or maze reduction algorithm. But surprise! I took 1st place. The Polymax 9000 performed exactly as expected, but the 3pi failed to complete the second run, when it was supposed to have memorized the fastest route.
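A left-hand following routine like the one mentioned boils down to a fixed priority at each intersection: prefer left, then straight, then right, else turn around. A minimal sketch, with generic move names that are mine rather than the real code's:

```python
# Sketch of a left-hand-rule maze policy: at each intersection,
# prefer Left, then Straight, then Right, else U-turn. The L/S/R/U
# labels are a generic convention, not Polymax 9000 identifiers.

PRIORITY = ["L", "S", "R", "U"]

def choose_turn(exits):
    """exits: set of available moves at an intersection, e.g. {"S", "R"}."""
    for move in PRIORITY:
        if move == "U" or move in exits:
            return move

print(choose_turn({"L", "S"}))  # L
print(choose_turn({"S", "R"}))  # S
print(choose_turn(set()))       # U (dead end)
```

A right-hand routine is the same policy with the priority list reversed to R, S, L, U.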
I hope to have a maze reduction algorithm for the next Robot Rally.
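One common way to build that reduction, used in line-maze solvers such as the 3pi demos, is to record the turn sequence and collapse any triple containing a U-turn, since a U-turn means the preceding choice led into a dead end. A sketch under that convention (the L/R/S/B letters and angle bookkeeping are the standard trick, not anything from my current code):

```python
# Sketch of classic line-maze path reduction: any "X B Y" triple
# (B = U-turn at a dead end) collapses to the single turn whose net
# angle equals the three combined. Letters follow the common L/R/S/B
# convention, not actual Polymax 9000 identifiers.

DEG = {"L": 270, "R": 90, "B": 180, "S": 0}  # clockwise degrees
INV = {v: k for k, v in DEG.items()}

def reduce_path(path):
    path = list(path)
    i = 1
    while 0 < i < len(path) - 1:
        if path[i] == "B":
            total = (DEG[path[i - 1]] + 180 + DEG[path[i + 1]]) % 360
            path[i - 1:i + 2] = [INV[total]]  # collapse the triple
            i = max(i - 1, 1)                 # recheck after collapsing
        else:
            i += 1
    return "".join(path)

# Left, dead end, left nets out to going straight past the junction:
print(reduce_path("LBL"))  # S
```

Run repeatedly until no "B" remains (as the loop above does), the recorded exploration path shrinks to the direct route, which is exactly what a memorized second run needs.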
Most important lesson learned: Design sensor systems so they ignore ambient room light, or at least work over a wide range of lighting conditions, including the contest site!
UPDATE: For a detailed description of what I changed and improved click here.
Polyathlon highlights video