Navigational Systems


Overall Strategy

The robot will navigate its way through the maze using several distance sensors mounted on its front, back, and sides. Its primary strategy will be to follow the wall on one side of it, staying more or less parallel to that wall so that it travels in a straight path down the hallway. It will use two distance sensors mounted on that side (one near the front and the other near the back) to maintain a parallel course; a difference between the two readings indicates that the robot has veered from its parallel position, and the robot will adjust accordingly. When it reaches a doorway or a point at which the hallway branches, its sensors will detect this. It will then decide whether to turn or continue straight, do so, and continue through the maze. Click here to see a more detailed overview of the control flow and our code.
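Our actual control code is written in Cricket Logo and is linked above; purely as an illustration of the parallel-course correction described here, the Python-style sketch below shows one way the two side readings could be compared and turned into a steering adjustment. The names (read_front_side, read_rear_side, set_motor_speeds) and the gain and speed values are made up for the example and are not part of our code.

    BASE_SPEED = 5   # nominal power for both wheels (arbitrary units)
    GAIN = 0.5       # how strongly a difference in readings steers the robot

    def follow_wall_step(read_front_side, read_rear_side, set_motor_speeds):
        """One control step of the parallel-course correction (wall on the left).

        If the front side sensor reads farther from the wall than the rear one,
        the nose is drifting away from the wall, so the robot steers back toward
        it by slowing the left wheel and speeding up the right wheel (and vice
        versa when the nose drifts toward the wall).
        """
        front = read_front_side()
        rear = read_rear_side()
        error = front - rear            # positive: nose angled away from the wall
        correction = GAIN * error
        left_speed = BASE_SPEED - correction
        right_speed = BASE_SPEED + correction
        set_motor_speeds(left_speed, right_speed)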

Since a candle emits a great deal of infrared light, we plan to use infrared sensors to find and approach it. The robot will navigate its way through the maze and check each room with an IR sensor, stopping when the sensor reading crosses a specified threshold, indicating the presence of the candle. At that point, the robot will enter the room (marked by a white line in the doorway, as per the contest rules) and move toward the candle. It will have two IR sensors on its front; by comparing their readings, it will determine in which direction the candle lies and move accordingly. Since the wheels are powered independently (see the Final Structural Design section), the robot can control the direction in which it travels by powering the wheels at different speeds. Using its IR sensors, it will zigzag toward the candle. When it arrives within 12" of the candle (marked by a white circle around the candle, as per the contest rules), it will stop, turn on the motor that controls the pushing mechanism, and shoot shaving cream at the flame, extinguishing the candle.
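Again as a rough illustration rather than our actual Cricket Logo code, the sketch below shows the idea of comparing the two front IR readings and steering the independently powered wheels accordingly. The names (read_left_ir, read_right_ir, on_white_line, set_motor_speeds, stop) and the speed value are hypothetical stand-ins; in the real robot the stop is triggered when we reach the white circle around the candle.

    SPEED = 4   # nominal wheel power (arbitrary units)

    def approach_candle(read_left_ir, read_right_ir, on_white_line,
                        set_motor_speeds, stop):
        """Zigzag toward the stronger IR source until the white circle is reached."""
        while not on_white_line():      # the white circle marks the 12" boundary
            left = read_left_ir()
            right = read_right_ir()
            if left > right:
                set_motor_speeds(SPEED // 2, SPEED)   # candle to the left: veer left
            elif right > left:
                set_motor_speeds(SPEED, SPEED // 2)   # candle to the right: veer right
            else:
                set_motor_speeds(SPEED, SPEED)        # readings equal: drive straight
        stop()                          # within 12": ready to extinguish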

To control the robot, we are using Crickets, which are programmed in Cricket Logo, a dialect of the procedural language Logo. Click here to see our reasons for using the Crickets, as well as the drawbacks of doing so. (Click here to see the work done with Crickets by the MIT Media Lab.) We are currently working on making the robot navigate its way through the maze. Click here to see the navigation designs that didn't work. Click here to see our Final Navigational System.


Our only way of troubleshooting at the moment is to watch the LEDs that display sensor readings, observe the behavior of the robot, and try to determine at what point in the code it does the wrong thing. However, the LEDs have a slight lag time, they update only when a call is made to the procedure that gets the current sensor readings, and it is difficult for us to make accurate diagnoses from what we can observe of the readings as displayed on the LEDs. We are therefore working on a system that will let us record data during a run, including sensor readings and the moment the robot starts each procedure, in the hope that being able to look at all the data at once will help us determine where the problems lie.
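A minimal sketch of the kind of recording we have in mind is below (in Python for readability, not Cricket Logo): every sensor reading and every procedure start gets a timestamp and is appended to a log that can be printed and read all at once after a run. The names log_event and dump_log are invented for the example.

    import time

    _log = []   # in-memory record of one run

    def log_event(kind, name, value=None):
        """Record one event: a procedure start or a sensor reading."""
        _log.append((time.monotonic(), kind, name, value))

    def dump_log():
        """Print the whole run at once so problem spots can be located."""
        for timestamp, kind, name, value in _log:
            if value is None:
                print(f"{timestamp:10.3f}  {kind:10s} {name}")
            else:
                print(f"{timestamp:10.3f}  {kind:10s} {name} = {value}")

    # Example use during a run:
    # log_event("procedure", "follow_wall")
    # log_event("sensor", "front_side_distance", 42)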
