Zero to Autonomous

Last weekend DIYRobocars held the biggest race in its short history. We raced our 1/10th-scale cars (and smaller) at ThunderHill Raceways as part of the Self Racing Cars event, where others raced full-sized cars and one go-kart on the 2-mile ThunderHill West track. Of the three classes of cars at the combined Self Racing Cars and DIYRobocars event, I am the track master for what may have been the most active: the 1/10th-scale competition. Our modest 60-meter track, laid out with tape in the parking lot, drew 12 cars, 7 of which completed fully autonomous laps. While all used vision, the cars were built on 5 different software stacks (2 based on OpenCV, 1 on a depth camera, and 2 different CNN implementations). Most racers are from the greater Bay Area, but we had racers join us from all over North America, including Miami, Toronto, and San Diego.

DIYRobocars Tape Track at ThunderHill in the Early Morning

After this huge milestone it is worth reflecting on the short time that the DIYRobocars league has been in existence. The 1/10th-scale league kicked off with a hackathon organized by Chris Anderson at Carl Bass's warehouse on November 13, 2016. As I look back on the last 4.5 months, it is amazing what we have accomplished. Here are some stats:

  • 23 total cars built
  • 15 cars have raced 
  • 10 cars have completed an autonomous lap
  • 21s – fastest autonomous time
  • 8s – fastest human time
  • And there is much more to come…

In addition, times have consistently improved even as we have added new cars and racers, as can be seen in the chart below.

Compared to Self Racing Cars (full-sized cars) at ThunderHill, our little cars did exceptionally well, especially considering that many of the full-sized car projects have been running for years. Even with all of their funding, only 4 full-sized cars were able to run autonomous laps, and not one was able to make a lap with vision only.

Winner at ThunderHill, Will Roscoe.

How was the 1/10th-scale track able to demonstrate so much success in so much less time? I am not totally certain, but I suspect two things are the primary causes:

First, the cost of failure is zero – the amount of caution required for the development of an autopilot for a full-sized car must greatly hamper the speed of innovation. At 1/10th scale we are able to take risks, test anywhere, and move fast.

Second is less obvious – what makes DIYRobocars special is that it is a collaborative league. I cannot express how unusual this feels. While we are all competitors, we share our code and our secrets for winning, and we brainstorm with our competitors about how to make our cars faster. Fierce competitors one moment are looking to merge code bases the next. In the full-sized car league, many cars are sponsored by competing companies; there is much less sharing and collaboration, which puts the brakes on innovation.

While the last 4 months have been great, I am also looking toward all we will accomplish in the next 4. While many designs will be refined and lap times will drop, the next big step is to mix up the format to tackle the next set of technical challenges. Our next big rule and format change is to incorporate wheel-to-wheel racing, which introduces a new set of technical problems. More on the rule changes in the coming weeks.

Post-race report from Thunderhill

Our outdoor hack/race at Thunderhill was a blast, both for the opportunity to compete over a whole weekend and for the intermixing with the full-size autonomous car teams at the Self Racing Cars event we were part of. We didn’t do anything that couldn’t have been done indoors at our regular monthly races in Oakland (people didn’t use GPS, for example), but the preparation that went into a full weekend brought out the best in everyone.

Here are some quick observations from the weekend.

  • There was lots of improvement over the two days. Times more than halved from Saturday to Sunday, from 45 seconds to winning times all in the low twenties.
  • Humans are still faster. The fastest human driven time was 8 seconds, the fastest autonomous time was 21 seconds. We’ve still got work to do.
  • Wheel-to-wheel racing is the future (see above). We’re going to be doing a lot more of that in the monthly events, including seeded rounds, ladders and a “Final Four”.
  • Of the 14 teams that entered, 10 finished the course. All used computer vision to navigate (no GPS, on the grounds that GPS is too easy; we’ve been doing that for nearly a decade at the SparkFun Autonomous Vehicle Competition).
  • By contrast, on the full-size track next door, *no* cars successfully did a full autonomous lap with vision. The few that used vision, such as Autonomous Stuff and Right Turn Clyde (our own Carl Bass and Otavio Good, shown below), were not able to complete autonomous laps. And the cars that did complete autonomous laps, such as Comma.ai, used GPS waypoints to do it.
  • On the 1/10th-scale course, the jury is still out on whether traditional computer vision or neural networks are best. First place, Will Roscoe, the author of Donkey, uses TensorFlow and Keras (a neural-network approach). But just one second behind, in 2nd place, was Elliot Anderson using traditional OpenCV-style computer vision on a $55 OpenMV M7 board.
  • Traditional computer vision is easier to understand and debug, but it requires hand-tuning. Neural networks, on the other hand, are black boxes that take a long time to train, but they don’t need to be hand-tuned (ideally, they can just figure out correlations themselves, without having to be told to look for specific shapes or colors); the sketch after this list illustrates the difference. I prefer things I understand, so I’m drawn to CV. But I fully accept that there will soon come a day when CNNs are so easy to train and use that I make the switch. Just not yet.
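To make that difference concrete, here is a rough Python sketch of the two approaches. This is not either team's actual code: the HSV bounds, steering gain, and image size are hypothetical values chosen for illustration, and the CNN shown is a representative small model, not Donkey's real architecture.

```python
import cv2
import numpy as np
from tensorflow import keras

# --- Hand-tuned computer vision: threshold the tape color, steer toward it ---
# These constants are hypothetical; in practice they get tuned per track.
LOWER_HSV = np.array([0, 0, 180])      # lower bound for white-ish tape
UPPER_HSV = np.array([180, 40, 255])   # upper bound for white-ish tape
STEERING_GAIN = 2.0                    # maps pixel offset to steering command

def cv_steering(frame_bgr):
    """Return a steering value in [-1, 1] from one camera frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)   # isolate the tape
    m = cv2.moments(mask)
    if m["m00"] == 0:                               # no tape in view
        return 0.0
    cx = m["m10"] / m["m00"]                        # tape centroid (x)
    half_width = frame_bgr.shape[1] / 2
    offset = (cx - half_width) / half_width         # -1 (left) .. +1 (right)
    return float(np.clip(STEERING_GAIN * offset, -1.0, 1.0))

# --- CNN approach: learn the same mapping from recorded (image, steering) pairs ---
model = keras.Sequential([
    keras.layers.Input(shape=(120, 160, 3)),
    keras.layers.Conv2D(24, 5, strides=2, activation="relu"),
    keras.layers.Conv2D(32, 5, strides=2, activation="relu"),
    keras.layers.Conv2D(64, 3, strides=2, activation="relu"),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1, activation="tanh"),       # steering in [-1, 1]
])
model.compile(optimizer="adam", loss="mse")
# model.fit(images, steering_labels, ...)           # trained on recorded driving logs
```

The CV version has a handful of knobs that someone has to tune by hand for each track and lighting condition; the CNN version has none, but it needs a pile of labeled driving data and training time before it does anything useful.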