Pushing Boundaries – Navigating with Ultrasonic & Colour (Light) Sensors


When we started our LEGO robotics program a few months ago, I wasn’t sure how far we could push our FLL girls’ programming skills this year. Time and time again, however, they have surprised me with their persistence, curiosity, and drive to solve the problems I have put in front of them. This was especially true of our forays into the ultrasonic and colour (light) sensors, which I had originally planned to introduce next year.

The Ultrasonic Sensor

[Image: EV3 Ultrasonic Sensor (45504)]

The ultrasonic sensor, rather appropriately, functions as a set of ‘eyes’ for the Mindstorms robot. It bounces sound waves off objects to measure the distance to an obstacle, in a similar fashion to how sonar works on a submarine. In the limited time we had, only a few students experimented with this sensor. I am hopeful that one or two might have a go at learning how to use it in the upcoming FLL season, but it will definitely be a focus for Year 6 robotics in 2017.
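For anyone curious what this behaviour looks like in text-based code (the girls work in the graphical EV3 software), here is a minimal sketch using the python-ev3dev2 library: the robot drives forward until the sensor reports an obstacle closer than about 10 cm. The ports, speed and threshold are my own assumptions, purely for illustration.

from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor import INPUT_4
from ev3dev2.sensor.lego import UltrasonicSensor

# Assumed setup: large motors on ports B and C, ultrasonic sensor on port 4.
tank = MoveTank(OUTPUT_B, OUTPUT_C)
eyes = UltrasonicSensor(INPUT_4)

tank.on(left_speed=30, right_speed=30)       # drive forward
while eyes.distance_centimeters > 10:        # keep going while the way ahead is clear
    pass
tank.off()                                   # obstacle within ~10 cm: stop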

The Colour Sensor

[Image: EV3 Colour Sensor (45506)]

The colour sensor can be used to detect different coloured lines & the colours of obstacles on the mat. Our primary focus this year was introducing a few key team members to basic black line following using the colour sensor and the switch (IF/ELSE) programming block. This wasn’t an easy process to learn (or teach)!

With their current line-following program, the Year 6 girls can follow straight and slightly wavy lines, but they struggle to follow tightly curved lines (which requires two colour sensors & more advanced programming skills). We will return to line following later in the year.
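For readers wondering what the switch (IF/ELSE) block boils down to, here is a rough text-based equivalent of single-sensor edge following, again sketched with python-ev3dev2 rather than the graphical blocks the girls actually use. The port, speeds and light threshold are illustrative guesses: when the sensor sees dark it steers one way, and when it sees light it steers back, so the robot wobbles along the edge of the line.

from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor import INPUT_3
from ev3dev2.sensor.lego import ColorSensor

# Assumed setup: motors on B and C, colour sensor on port 3, pointing down at the mat.
tank = MoveTank(OUTPUT_B, OUTPUT_C)
eye = ColorSensor(INPUT_3)

THRESHOLD = 40   # reflected light (%): below = black line, above = white mat

while True:
    if eye.reflected_light_intensity < THRESHOLD:
        tank.on(left_speed=25, right_speed=10)   # seeing black: curve away from the line
    else:
        tank.on(left_speed=10, right_speed=25)   # seeing white: curve back towards it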

“Robots and Walls don’t Mix!”

[Image: Miele robot vacuum, IFA 2015. Photo by Kārlis Dambrāns via Compfight (Creative Commons License)]

Have you ever wondered how a robot vacuum cleaner detects and avoids obstacles? This was a question our girls sought to answer when they began exploring the role of sensors in aiding robot navigation.

The first sensor we worked with was the touch sensor, which is ‘activated’ by a ‘push’, a ‘release’, or a ‘bump’. As the girls discovered, this sensor can be extremely useful for detecting obstacles in front of the robot.

[Image: EV3 Touch Sensor (45507)]
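In code, the three states map onto three different waits. The snippet below (python-ev3dev2 again, with the touch sensor assumed on port 1) is only there to show the distinction; it isn’t a program the girls wrote.

from ev3dev2.sensor import INPUT_1
from ev3dev2.sensor.lego import TouchSensor

touch = TouchSensor(INPUT_1)    # assumed: touch sensor plugged into port 1

touch.wait_for_pressed()    # 'push': the button is being held in
touch.wait_for_released()   # 'released': the button has been let go
touch.wait_for_bump()       # 'bump': a quick press and release
print("All three states detected!")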

The first challenge required the girls to work out the difference between the ‘push’ and ‘bump’ sensor states. They had to program their robot to move forward until a team member ‘bumped’ the sensor with their hand. The resulting code looked something like this:

[Screenshot: EV3 program for the first touch sensor challenge]

Move Forward –> Wait until Touch Sensor is ‘bumped’ –> Play “Sorry” –> Move Back 1 rotation.
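Here is roughly the same program sketched in python-ev3dev2 for anyone who prefers text to blocks. The ports and speeds are assumptions, and the brick’s text-to-speech stands in for the ‘Sorry’ sound file.

from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor import INPUT_1
from ev3dev2.sensor.lego import TouchSensor
from ev3dev2.sound import Sound

# Assumed setup: motors on B and C, touch sensor on port 1.
tank = MoveTank(OUTPUT_B, OUTPUT_C)
touch = TouchSensor(INPUT_1)
speaker = Sound()

tank.on(left_speed=30, right_speed=30)     # move forward
touch.wait_for_bump()                      # wait until someone bumps the sensor
tank.off()
speaker.speak("Sorry")                     # stands in for the 'Sorry' sound file
tank.on_for_rotations(left_speed=-30, right_speed=-30, rotations=1)   # back 1 rotation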

Robots and Walls don’t Mix!

The next challenge proved to be rather entertaining. The girls were asked to program their robot to move until it detected a wall, reverse 20 cm or so, and then turn 90 degrees. Then, after adding a loop, they had to create a physical obstacle course for the robot to navigate. Judging by the number of robots trying to drive through (and climb) walls, this wasn’t an easy challenge. 🙂

The key to success lies in understanding the difference between the ‘bump’ and ‘push’ states of the touch sensor. A ‘bump’ is like a quick tap, whereas a ‘push’ is registered when the sensor detects a sustained pushing force (e.g. what happens when the robot hits a wall).

[Screenshot: EV3 program for the wall-avoiding challenge]
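And a text-based sketch of the same idea, with the usual caveats: python-ev3dev2 instead of the graphical blocks, assumed ports, and a reverse distance and turn amount that would need tuning on a real robot (the 90-degree spin in particular depends on the robot’s wheelbase).

from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor import INPUT_1
from ev3dev2.sensor.lego import TouchSensor

# Assumed setup: motors on B and C, touch sensor on port 1, standard 5.6 cm EV3 wheels.
tank = MoveTank(OUTPUT_B, OUTPUT_C)
touch = TouchSensor(INPUT_1)

while True:                                    # the loop block
    tank.on(left_speed=30, right_speed=30)     # drive forward
    touch.wait_for_pressed()                   # 'push': we have hit a wall
    tank.off()
    # Back up roughly 20 cm (about 1.1 rotations of a 5.6 cm diameter wheel).
    tank.on_for_rotations(left_speed=-30, right_speed=-30, rotations=1.1)
    # Spin roughly 90 degrees; the exact value depends on the robot, so tune it.
    tank.on_for_rotations(left_speed=30, right_speed=-30, rotations=0.5)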

And, in one of the funniest pre-season moments to date … 

  • Student: “Mr Graffin, our touch sensor doesn’t work! Our robot is stupid!”
  • Teammate: “Have you tried plugging it in …?”

They had to pick me up off the floor after that one. 🙂