iStrike 2014

iStrike is a competition where a bot has to navigate the field autonomously using an overhead camera. The arena consists of a road with a boom barrier at which the bot has to stop (until the barrier opens), a T-shaped bend, and two zones, one at either end of the crossbar of the T (for a clearer idea, see the shot of the arena from the overhead camera below).

Controlling the Arduino from MATLAB was simple, since I had already worked with it before, when I had plotted, in MATLAB, the distances reported by an ultrasonic sensor on the bot.
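I haven’t reproduced our exact code here, but the gist is just writing commands over a serial connection. Here’s a minimal sketch, assuming the Arduino sketch maps single-character commands (‘F’, ‘L’, ‘R’, ‘S’) to forward, left, right and stop, and that the board shows up on COM3 (both assumptions, not our actual setup):

    s = serial('COM3', 'BaudRate', 9600);   % placeholder port name
    fopen(s);
    fprintf(s, 'F');                        % drive forward
    pause(1);
    fprintf(s, 'S');                        % stop
    fclose(s);
    delete(s);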

Here’s how our bot looked:

[Photos of the bot]

And here’s how the arena looked from the overhead camera:

iStrike Round 1 Arena

Unfortunately, I didn’t capture an image of the bot on the arena, or of how things looked after thresholding the image.

There were a few things we had to take care of during the calibration phase:

  • Ensuring the bot is detected, by slightly tweaking the operations performed on the image (although we had done most of this beforehand, a few finishing touches were needed)
  • Rotating the image appropriately, so that the arena appeared as a “T”, as mentioned earlier
  • Ensuring that, after thresholding, stray white pixels were removed, by setting an appropriate minimum number of pixels a component had to contain to survive bwareaopen (a rough sketch of these steps follows this list)
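
For the curious, here’s roughly what those steps look like in MATLAB. This is a sketch rather than our actual code; the file name, rotation angle and pixel count are illustrative:

    im   = imread('frame.jpg');              % frame grabbed from the overhead camera (illustrative name)
    im   = imrotate(im, 90);                 % rotate until the arena reads as a "T"
    gray = rgb2gray(im);
    bw   = im2bw(gray, graythresh(gray));    % Otsu's threshold via graythresh
    bw   = bwareaopen(bw, 50);               % drop components smaller than ~50 pixels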

We didn’t need to determine the threshold ourselves, since MATLAB’s graythresh (which implements Otsu’s method) does this brilliantly.

Our algorithm first thresholded the image and removed the stray pixels (or groups of stray pixels) using bwareaopen. We then used the “Extrema” region property to determine the extreme points of the arena. Through this, we obtained the locations of the topmost, bottom-most, left-most and right-most edges, as shown:

[Extreme points of the thresholded arena]
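
Pulling those points out with regionprops looks something like this (a sketch; the variable names are made up, but the “Extrema” property really does return an 8-by-2 matrix of [x y] points):

    stats    = regionprops(bw, 'Extrema');   % bw is the cleaned binary image from above
    ext      = stats(1).Extrema;             % 8x2 matrix of [x y] extreme points
    top_y    = min(ext(:,2));
    bottom_y = max(ext(:,2));
    left_x   = min(ext(:,1));
    right_x  = max(ext(:,1));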

Now, we needed to find the bottom edge of the horizontal bar of the “T”. For this, we used the discarded red component, as shown:

[Extrema of the red component]

where the red component was detected using something similar to red_component = 2*im(:,:,1) - im(:,:,2) - im(:,:,3);
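
Turning that into the edge we wanted looked roughly like the sketch below. The threshold and pixel count are illustrative, and the image is converted to double first so the subtraction doesn’t saturate on uint8 data:

    imd     = double(im);
    red     = 2*imd(:,:,1) - imd(:,:,2) - imd(:,:,3);
    red_bw  = bwareaopen(red > 100, 50);     % illustrative threshold and minimum size
    stats   = regionprops(red_bw, 'Extrema');
    red_ext = stats(1).Extrema;
    t_bar_bottom = max(red_ext(:,2));        % bottom edge of the T's horizontal bar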

We then detected the bot along similar lines, but for yellow instead of red, and extracted its centroid. The idea was to keep the bot’s centroid within the left and right edges (with some region kept as a buffer, of course) until it crossed the lower edge of the T’s horizontal bar, at which point it would turn towards the direction of green.
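
A hedged sketch of that steering logic, reusing imd, left_x, right_x and the serial object s from the earlier sketches (the yellow measure, threshold and buffer are all illustrative):

    yellow  = imd(:,:,1) + imd(:,:,2) - 2*imd(:,:,3);   % crude "yellowness" measure
    yel_bw  = bwareaopen(yellow > 100, 50);
    stats   = regionprops(yel_bw, 'Centroid');
    cx      = stats(1).Centroid(1);                      % x-coordinate of the bot's centroid
    buffer  = 20;                                        % pixels kept as margin
    if cx < left_x + buffer
        fprintf(s, 'R');                                 % drifted left, steer right
    elseif cx > right_x - buffer
        fprintf(s, 'L');                                 % drifted right, steer left
    else
        fprintf(s, 'F');                                 % keep going straight
    end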

This worked well, except that we had overlooked two key aspects:

  • Orientation of the bot: The direction the bot was facing could not be obtained from a single patch of yellow. This would not, however, pose too much of a problem, provided the bot didn’t do something unexpected (such as magically take a U-turn), since in the long run the bot would correct itself by turning the other way whenever it drifted too far towards one side.
  • “The Barrier”: The barrier is a black sheet that would be thrown across the track in front of the bot before the first (and only) turn, at which point the bot would have to stop until the barrier was removed. One thing that we didn’t realize was that The Barrier was compulsory. We had inquired, and had been informed that if we didn’t stop at the barrier, we would be penalized a few points. However, at the event, we were disqualified outright. Sigh…

All in all it was a wonderful experience, and we’re definitely participating next year!
