Archive | Roboticity

iStrike 2014

iStrike is a competition where a bot has to navigate the field autonomously using an overhead camera. The arena consists of a road with a boom barrier at which the bot has to stop (until the barrier opens up), a T-shaped bend, and 2 zones at either end of the little dash above the T (for a clearer idea, refer to the image of the arena from the overhead camera below).

Controlling the Arduino from Matlab was simple, since I had already worked on it before, when I wanted to plot distances from the bot on Matlab from data received from an ultrasonic sensor.

Here’s how our bot looked:


And here’s how the arena looked like from the overhead camera:


iStrike Round 1 Arena

Unfortunately, I didn’t capture an image of the bot on the arena, or an image of how things looked after thresholding the image.

There were a few things we had to take care of during the calibration phase:

  • Ensuring the bot was detected, by slightly modifying the operations performed on the image (although we had done most of this beforehand, we needed a few finishing touches)
  • Rotating the image appropriately, so that it was in a “T” shape as initially mentioned
  • Ensuring that, after thresholding, stray white pixels were removed, by setting an appropriate minimum number of pixels that a connected component had to contain in order not to be removed by bwareaopen

We didn’t need to determine the threshold ourselves, since MATLAB’s graythresh, which uses Otsu’s thresholding method, picks one brilliantly.

Our algorithm first thresholded the image, and removed the stray pixels (or groups of stray pixels) using bwareaopen. We then used the “Extrema” region property to determine the extreme points of the arena. Through this, we obtain the location of the topmost edge, the bottom-most edge, the left-most edge and the right-most edge as shown:
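The thresholding and extrema steps can be sketched in Python (our original code was in MATLAB; this is a rough stand-in, and bwareaopen’s small-component removal is omitted for brevity):

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: pick the threshold that maximises the
    between-class variance of the grayscale histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    hist /= hist.sum()
    cum_w = np.cumsum(hist)                       # class-0 weight up to t
    cum_mu = np.cumsum(hist * np.arange(256))     # class-0 mean numerator
    mu_total = cum_mu[-1]
    best_t, best_var = 0, 0.0
    for t in range(1, 255):
        w0, w1 = cum_w[t], 1 - cum_w[t]
        if w0 == 0 or w1 == 0:
            continue
        mu0 = cum_mu[t] / w0
        mu1 = (mu_total - cum_mu[t]) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def extrema(mask):
    """Topmost, bottom-most, left-most and right-most white pixels:
    a rough stand-in for MATLAB's "Extrema" region property."""
    ys, xs = np.nonzero(mask)
    return ys.min(), ys.max(), xs.min(), xs.max()
```

In MATLAB, graythresh and regionprops did the equivalent work for us out of the box.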


Now, we need to find the bottom edge of the horizontal line of the “T”. For this, we used the discarded red component, as shown:


where the red component was detected using something similar to red_component = 2*im(:,:,1) - im(:,:,2) - im(:,:,3);

We then detected the bot along similar lines, but for yellow instead of red, and extracted its centroid. We then tried to ensure that the bot’s centroid stayed within the left and right edges (with some region kept as a buffer, of course) until it crossed the lower edge of the T’s horizontal part, at which point it would turn towards the direction of green.
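The steering check described above boils down to a few comparisons per frame; here is a sketch (the command names and buffer value are hypothetical, and image coordinates have y increasing downwards):

```python
def steer(cx, cy, left, right, lower_edge, buffer=20):
    """Decide a drive command from the bot's centroid (cx, cy).

    left/right are the arena's side edges from the extrema step,
    lower_edge is the bottom of the T's horizontal bar; all values
    are pixel coordinates.
    """
    if cy < lower_edge:
        # Past the T's horizontal bar: head for the green zone.
        return "turn_towards_green"
    if cx < left + buffer:
        return "steer_right"
    if cx > right - buffer:
        return "steer_left"
    return "forward"
```

On the bot itself, each returned command mapped to motor signals sent from MATLAB to the Arduino.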

This worked well, except for the fact that we forgot to keep in mind two key aspects:

  • Orientation of the bot: The direction the bot was facing could not be obtained with the patch of yellow. This would not, however, pose too much of a problem, provided the bot didn’t do something unexpected (such as magically take a U-turn), as in the long run, the bot would turn the other way if it turned too much towards one side.
  • “The Barrier”: The barrier is a black sheet that would be thrown across the track in front of the bot before the first (and only) turn, at which point the bot would have to stop until the barrier was removed. One thing that we didn’t realize was that The Barrier was compulsory. We had inquired, and had been informed that if we didn’t stop at the barrier, we would be penalized a few points. However, at the event, we were disqualified outright. Sigh…

All in all it was a wonderful experience, and we’re definitely participating next year!

Modulated Colour Sensors

In order to have reliable line following, we at Team Robocon realized that calibration is one of the most important (and unreliable) aspects (fresh batteries aside :p ). So, we (Anirudh and I) started working on fool-proof line sensors that work unaffected by ambient light. Here is what we planned, what we did, and the problems we faced.

What we planned

Our plan was this: the intensity of ambient light is generally constant, and if it varies, it does so really REALLY slowly (if one keeps switching a light on and off as fast as one can, one JUST ABOUT MIGHT be able to achieve a frequency of… *gasp* 10Hz. I tried. I reached 2Hz :P). So, we planned to give an LED a sinusoidally varying voltage as its source, and, so that it doesn’t get reverse biased, the source would have a DC voltage superimposed on it so that the net voltage never goes below 0V, the LED is always on, and only its brightness varies.

The light from this LED gets reflected by white and absorbed by black. In the presence of ambient light, the light sensor always detects light, and the micro-controller ends up being tricked into thinking that the particular sensor is above white, even though it may be above black. So, this reflected LED light shines onto a photo-detector/LDR (in series with a resistor), and a voltage is dropped across the combo as usual. Since the received light’s amplitude is modulated, the resistance of the LDR, and hence the net voltage dropped across it, varies. Thus, the voltage across it will be a sine wave coupled with a DC voltage. In the absence of modulated light, the sine wave will not be present in the output voltage. It is this sine wave that is the solution to our long quest for ambient-light-free-happy-happy-land.

To detect this sine wave, we pass the output voltage through a high-pass filter, which permits ONLY this sine wave through (provided, of course, that the sine wave’s frequency is above the filter’s cut-off). We don’t really need a band-pass filter, as ambient light, being un-modulated, has no frequency (in varying amplitude, I mean) at all, let alone a frequency high enough to cross the high-pass filter and mandate the use of a band-pass filter.

The output from the high-pass filter now has to be rectified, so that only the positive half of the sine is let through. Then, we smoothen the bumpy wave with a capacitor, and voila! If there is modulated light shining on the photodiode/LDR, we have a “high” output across the capacitor; else the output is low.
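The whole chain (high-pass, rectify, smooth) can be sanity-checked numerically. Below is a minimal discrete-time sketch; the first-order filters, the 100Hz cut-off, and the signal levels are illustrative assumptions, not our actual circuit values (the 620Hz figure comes from later in this post):

```python
import math

def detect(signal, dt, f_cut=100.0):
    """High-pass -> half-wave rectify -> smooth; returns the final
    output level. Both filters are first-order RC with the same
    cut-off (an arbitrary choice for this sketch)."""
    rc = 1.0 / (2 * math.pi * f_cut)
    a = rc / (rc + dt)          # high-pass filter coefficient
    b = dt / (rc + dt)          # low-pass (smoothing) coefficient
    hp, prev_x, out = 0.0, signal[0], 0.0
    for x in signal[1:]:
        hp = a * (hp + x - prev_x)   # first-order RC high-pass
        prev_x = x
        rect = max(hp, 0.0)          # ideal-diode half-wave rectifier
        out += b * (rect - out)      # capacitor 'smoothener'
    return out

dt = 1e-4                            # 10kHz sampling step
t = [i * dt for i in range(5000)]
dc_only = [2.5 for _ in t]           # un-modulated ambient light
modulated = [2.5 + 0.5 * math.sin(2 * math.pi * 620 * ti) for ti in t]
```

With the 620Hz sine present the output settles well above zero; with only the DC (ambient) level it stays at zero, which is exactly the distinction the micro-controller needs.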

Not really.

All this will be interfaced with a micro-controller, right? So, here comes the million dollar question: where on earth do we get a sine wave from?

There are a number of possible (surprisingly, not-so-simple) ways:

  1. Use an R-C cascade:
    What this does is basically keep filtering the square wave. The Fourier series of a square wave consists of sine waves at a number of frequencies (refer to point 2 below). The RC cascade keeps filtering out the higher-frequency ones to end up giving us a sine wave. Here is the cascade that we modeled, along with the expected results:

    Square to sine- circuit 1


    The wave in red shows a pure sine wave, and the one in cyan is the output of this circuit, which is fairly close.

    Here is the actual output we got at 30kHz, the frequency at which we input the square wave:

    Actual Output


    which is, in fact, fairly close. Here’s a pic of how the circuit looked:

    RC circuit pic


  2. Use TI’s UAF42 IC:
    This method would involve converting one of the Arduino’s PWM square wave outputs, or the square wave output from a 555 timer IC, into a sine wave of the same frequency. Involving a single IC and 3 resistors, we thought it’d be a “cleaner” implementation - way more manageable (read: fewer pesky loose contacts). Details of how to use this can be found here, and its data sheet here. The how-to paper basically says that if the square wave is passed through a band-pass filter, all higher order terms in its Fourier series get filtered out, leaving a pure sine wave. If the frequency is not among the ones specified in the paper, then the software filter42 needs to be downloaded, installed, and run… on DOS-BOX.
    We tried this method, making the exact circuit given in the documentation above. However, this only made it LOOK like a sine wave. We got the output as:

    UAF42 output


    which isn’t exactly a sine wave, but is close to it (sort of). What we think should be done to improve this is to cascade a second filter stage to get a higher order filter, but this defeats the very purpose: to keep the circuit simple.

  3. Use a Wien bridge Oscillator (or another of n number of oscillators that uses an op-amp):
    Another method that we had in mind, but didn’t actually try, since method 1 gave us decent results.
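Method 1 can also be checked on paper: a unit square wave’s Fourier series has only odd harmonics, with amplitude 4/(nπ), and each first-order RC low-pass stage attenuates a harmonic at frequency f by 1/√(1 + (f/f_cut)²). A quick sketch (the cut-off placed at the fundamental is an assumption for illustration, and inter-stage loading in a real cascade is ignored):

```python
import math

def harmonic_amplitude(n, f0, f_cut, stages):
    """Amplitude of the n-th harmonic of a unit square wave (fundamental
    f0) after `stages` identical first-order RC low-pass filters,
    ignoring loading between stages."""
    if n % 2 == 0:
        return 0.0                       # square wave has odd harmonics only
    amp = 4 / (math.pi * n)              # Fourier coefficient of harmonic n
    gain = (1 / math.sqrt(1 + (n * f0 / f_cut) ** 2)) ** stages
    return amp * gain

f0 = 30e3        # input square wave frequency (30kHz, as in the post)
f_cut = 30e3     # cut-off at the fundamental (an illustrative choice)
```

After 3 stages the 3rd harmonic is suppressed to a few percent of the fundamental, which is why the cascade’s output starts to look sinusoidal.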

So, with the sine wave generation done, we used a waveform generator to input a ready-made sine wave so that we could see the output on a CRO, and analyse how it plays out. That’s when we realized something we hadn’t thought of before: when the LED is given an input sine wave as voltage, if the frequency is too high, the LDR doesn’t respond fast enough. We found that the lower the frequency, the better the variation in voltage across the LDR looked (the LDR being connected to a voltage source through a constant series resistance, and the LED to a sine wave + DC voltage source, with the LED shining directly onto the LDR). We experimented a little, and realized the LDR doesn’t respond to anything above 10kHz, and responds well to frequencies below 1kHz. But we had designed circuit 1 for 30kHz. Ahhhh!!! Back to the drawing board… (or in this case, edX’s circuits and electronics circuit simulator). Oh well, we chose a frequency of around 620Hz, and continued with the wave generator (620Hz is one of the frequencies listed in the document above).

We had simulated the high-pass filter : diode rectifier : capacitor ‘smoothener’, and here is what we got:

LDR to output- circuit 2


Note that above, the voltage source is the voltage across the LDR.

What we observed in the circuit, though, was similar, but the output voltage was much lower (a very sad 20mV).

Here is a video showing output across R in the above circuit:

Here is a video showing the output at the point marked by the red probe above, but without adding the capacitor:

Here is a video of the output shown by the probe, but in real life:

So, ya, it’s 20mV. Waaaaaaaaaayyyy too small. Even amplification isn’t really helpful, as any noise, if present, will be amplified too.


Telepathic Line Follower at Aavishkar 2013

A few of us from Team Robocon went to Aavishkar 2013, UIET, Punjab University in Chandigarh. There were 2 Rounds in this event, held over 2 days: 14th and 15th of September.

The first round was an elimination round, where we had to follow a blue line on a white background. Here is the track that they had specified:

Telepathic Follower Arena Specifications- Round 1


We realized that it was minuscule (we even called them to confirm!), and as they had specified a low % of error, we decided to hard code the bot, with a simple backup code just in case. What we basically did was slow down the bot as time passed, so that it could take tighter turns more easily. Further, after a certain amount of time, it would rotate thrice at approximately 90° angles, which was the only way for the bot to take the last few turns to reach the center. Here’s a video of how our bot ran on our track, made as per the specifications provided:


The second round was where the “Telepathic” part of things came into the picture. A bot had to follow the given track, while another bot had to trace out the path made by the “master” bot using wireless data received from the master. We decided to use a one-way RF transmitter-receiver pair for this task. The master would send a signal via the transmitter module by using digitalWrite() on 8 of its pins to encode a signal, while the “telepathic” bot would read this signal from the RF receiver via 8 digital pins using digitalRead(). Here’s a video of a prototype we developed, where, for testing things out, we used a remote control of sorts on the telepathic bot rather than the master directly:
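The 8-pin scheme is just parallel bit encoding; a Python sketch of the idea (the command codes are hypothetical, since our actual mapping isn’t recorded, and on the bots this was done with digitalWrite()/digitalRead() on the Arduinos):

```python
def encode(command):
    """Split an 8-bit command into the HIGH/LOW levels driven onto
    the transmitter's 8 parallel pins (bit 0 on pin 0)."""
    return [(command >> bit) & 1 for bit in range(8)]

def decode(pin_levels):
    """Rebuild the command byte from the 8 levels read off the
    receiver's digital pins."""
    value = 0
    for bit, level in enumerate(pin_levels):
        value |= (level & 1) << bit
    return value

# Hypothetical command codes for illustration only.
FORWARD, LEFT, RIGHT, STOP = 0x01, 0x02, 0x04, 0x08
```

The master would drive the `encode`d levels onto its pins, and the telepathic bot would `decode` what it read and replay the corresponding motion.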

Here’s what the track was supposed to be like:


Telepathic Follower Arena Specifications- Round 2

Though seemingly complex, it was specified that only the following blue loop had to be followed:

Telepathic Follower Arena Specifications- Round 2- path


This worked extremely well in our favor, since we were using line sensors with red LEDs on them, and the blue would be the line that reflected best. Further, there were no sharp turns, which would mean that the telepathic bot could trace the path fairly easily with the instructions received via the RF receiver.

We received the shock of our lives when we reached there! We were shown the track before bot submission, and it was huge! Well… at least compared to the one they had specified initially. It was over four times the size! We didn’t protest though, and loaded our backup code into the bot before submitting it. We then calibrated the bot, and started it up. To our dismay, the bot stopped after 3/4th of the track was done. We realized that this was because, post-calibration, all the tube lights had been switched on for a photo shoot before the start of round 1. In our second run, with the lights all off, i.e., with lighting exactly like it had been during calibration, the bot ran (almost) perfectly. We went on to the second round! Here’s the video of round 1:


The second round was initially supposed to be on a different track than the first round, but they decided to use the first round’s track anyway. The organisers thought this would be easier, considering the first round track wasn’t coloured, but this change ended up working against us: we had used a colour sensor with red LEDs (which meant multiple colours wouldn’t really be an issue), and the initially specified track involved following a blue line with no sharp turns, while the first round track had 90° turns.

To make matters worse, during the round, the master bot ran out of battery. We had brought 2 rechargeable 9V batteries with us; one had failed during the testing phase (and died totally, providing a massive voltage of 0.02V), while the other drained fairly quickly, so we had to use non-rechargeable 9V batteries, which drain quicker than you can say “Ouch”. Meanwhile, the telepathic bot went crazy (read: the RF module had a short-circuit). Here’s an image from the mayhem that ensued:


Moral of the story: always be prepared for the worst (which we (almost) were, considering we had a backup line following code in spite of being assured that the track dimensions wouldn’t be changed), and one can never, ever have enough batteries.

Making a colour detector

Recently, I decided to make a colour detector, which might come in handy if ever we need it to follow a coloured line.

The idea was simple: have 4 coloured LEDs (red, green, yellow, blue) connected through resistors to an Arduino. Have an LDR in between them, connected to a high voltage with another resistor in series. Now make the LEDs shine one at a time, for an equal interval of time each. Read the voltage across the LDR, which changes based on the colour which the entire setup faces. Calibrate it by setting thresholds for each colour you wish to measure, and we’re done!
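A sketch of the classification step, assuming one calibrated set of readings per colour (the nearest-reference matching and all numbers below are our own illustration; the post only mentions thresholds, and on the Arduino this would run over analogRead() values):

```python
def classify(readings, references):
    """Match one LDR reading per LED flash against calibrated references.

    readings:   4 LDR values, one per LED flash (red, green, yellow, blue)
    references: {colour_name: its 4 calibrated readings}
    Returns the reference colour with the smallest summed absolute
    difference from the readings.
    """
    def distance(ref):
        return sum(abs(a - b) for a, b in zip(readings, ref))
    return min(references, key=lambda name: distance(references[name]))

# Hypothetical calibration table (ADC counts for each of the 4 flashes).
refs = {
    "red":   [850, 300, 600, 250],
    "green": [300, 820, 550, 400],
    "white": [900, 880, 870, 860],
}
```

Calibration then amounts to recording `refs` for each surface colour before a run.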

My first line follower

During Apogee 2013, my sidee, Arjun, and I decided to participate in Track-o-Mania, an event which involved line following, with a twist.
The bot which we had in mind was a line follower which involved the following:

  • A PID line following algorithm: PID stands for Proportional-Integral-Derivative. A very nice explanation of how PID can be used to rapidly reach a steady value is given here, and one on how PID can be implemented in robotics is given here.
  • ADC: Analog-to-digital conversion; we used ADC to calibrate the IR sensors of the bot in our code rather than manually adjusting potentiometers at the venue.
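A minimal sketch of the PID correction for line following (the gains and the sensor-to-error mapping are illustrative, not our actual values; our real code ran on the Arduino):

```python
class PID:
    """Proportional-Integral-Derivative controller for line following."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def correction(self, error, dt):
        """error is the line's offset from centre, e.g. computed from
        the weighted positions of the 5 IR sensor readings."""
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

The returned correction is typically added to one motor’s speed and subtracted from the other’s, steering the bot back onto the line.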

We didn’t know how to get the PID constants automatically using the auto-tune library, nor did we know exactly how to use the Ziegler-Nichols method.
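For reference, the classic Ziegler-Nichols rule derives the gains from the ultimate gain Ku (the proportional gain at which the closed loop oscillates steadily) and the oscillation period Tu:

```python
def ziegler_nichols_pid(ku, tu):
    """Classic Ziegler-Nichols PID tuning: Kp = 0.6*Ku, Ti = Tu/2,
    Td = Tu/8, returned as the parallel-form gains (Kp, Ki, Kd)."""
    kp = 0.6 * ku
    ti = tu / 2           # integral time
    td = tu / 8           # derivative time
    return kp, kp / ti, kp * td
```

In practice one would raise Kp on the bot until it oscillates steadily about the line, time the oscillation, and plug both numbers in as a starting point for hand-tuning.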

The bot primarily consisted of motors, wheels, an L293D motor driver, an acrylic chassis, and an IR emitter-detector array consisting of 5 emitter-detector pairs.

For the second round, we had planned to put 2 IR emitter-detector pairs on either side of the bot to detect the soldiers and terrorists at different heights, an IR emitter-detector pair in front to detect a wall, a motor with a platform on which to keep the first aid kit (which would rotate to drop the kit), and a few LEDs on which to keep a count of soldiers and terrorists. However, we were not able to implement these, as the bot decided to trouble us with a loose contact, and we had to remove a soldered motor driver (an error that took us a massive amount of time to figure out!) and plonk a breadboard there instead.

Strangely enough, not a single participant had prepared for the second round! Our bot was very fast compared to the others as we had used PID, but sadly, the track had bumps in it (as opposed to the smooth track promised in the rules), and our motors didn’t have high enough torque to deal with this…

But it was a brilliant learning experience, and it is this event that motivated me to join Robocon even more.