EV3 Intellisearch is an Alexa-assisted, intelligent search-and-rescue bot built with the LEGO MINDSTORMS EV3 and an Amazon Echo Dot.
The idea. Intellisearch was born out of a desire to make a LEGO MINDSTORMS robot perform a real-life activity on a small scale. Starting from the base LEGO MINDSTORMS Track3r bot and adding color and gyro sensors to the build, we set out to create an Alexa-enabled bot capable of locating objects within a search grid.
We chose this project because we wanted to solve a real-world problem using tools that are readily available in an educational setting. We decided to develop a robot capable of carrying out a simple task that would otherwise require human intervention. There are many situations where bots could replace humans to save lives. One example is using a bot to deliver communication equipment and/or supplies to a trapped hiker, helping them survive until rescuers can safely extract them from the situation. Or imagine if, instead of sending a human firefighter to fight a wildfire, we could send an army of bots to perform the task. The list of possibilities is endless.
Next, the build. We followed the building directions for the LEGO MINDSTORMS Track3r bot with a few variations in order to allow the use of a wireless USB dongle, as follows:
- Follow the directions through step 4.
- Complete steps 7 and 9, omitting step 8.
- Go to page 11 and complete steps 1 through 10.

This is where we added our adaptation to expose the USB port and support the wireless dongle.

- Go to page 32 and follow steps 1 through 4 (infrared sensor attachment).
- Refer to the build video for the additional sensor attachments.
The original bot was programmed with the LEGO MINDSTORMS builder interface, a drag-and-drop coding platform. The following video is a screen capture with voice recording of our base program.
For this challenge, we took the logic from our rudimentary implementation and converted it to Python. This split lets some portions of the code run on the LEGO EV3 brick and others run in our AWS Lambda function, allowing the bot to communicate with our Alexa skill and Echo Dot. All code has been submitted and stored in the EV3 Intellisearch GitHub repository.
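To illustrate the Lambda-side half of that split, here is a minimal sketch of how a spoken request like "find the green target" could be turned into a custom gadget directive for the brick. The intent name, slot name, and the "search" control payload are illustrative assumptions, not the exact identifiers in our skill; only the `CustomInterfaceController.SendDirective` directive type comes from the Alexa Gadgets Toolkit.

```python
# Hypothetical Lambda-side handoff: an Alexa intent carrying a color slot
# is wrapped in a custom directive that the EV3 brick listens for.
# "SearchIntent", the "Color" slot, and the "Custom.Mindstorms.Gadget"
# namespace are illustrative names, not confirmed from the project code.

def build_search_directive(endpoint_id, color):
    """Wrap a search command in an Alexa Gadgets custom-interface directive."""
    return {
        "type": "CustomInterfaceController.SendDirective",
        "header": {"name": "Control", "namespace": "Custom.Mindstorms.Gadget"},
        "endpoint": {"endpointId": endpoint_id},
        "payload": {"type": "search", "color": color},
    }

def handle_search_intent(intent_request, endpoint_id):
    """Pull the color slot out of the intent request and build the directive."""
    color = intent_request["intent"]["slots"]["Color"]["value"].lower()
    return build_search_directive(endpoint_id, color)
```

On the brick, a companion listener would match the `Custom.Mindstorms.Gadget` namespace and dispatch on the payload's `type` field.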
The final demonstration shows Intellisearch working on a small scale. In the first demo video, the bot is told to search for a green piece of paper as its target. The second video explains and demonstrates the bot's ability to autonomously walk a grid perimeter based on the grid height, the grid width, and the bot's position within the grid. This feature allows us to provide search dimensions dynamically at run time.
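The perimeter walk described above can be sketched as a small planning function: given the grid width, height, and the bot's starting cell, produce the ordered corner waypoints that trace the grid's outer edge. The cell coordinates, the clockwise ordering, and the nearest-corner heuristic are our illustrative assumptions, not the exact code running on the brick.

```python
# Sketch of the dynamic perimeter-walk planning: compute the grid's four
# corners from width and height supplied at run time, start from the corner
# nearest the bot, and loop back to close the circuit. Assumes (0, 0) is
# one grid corner; the real bot code may differ.

def perimeter_waypoints(width, height, start=(0, 0)):
    """Return corner waypoints of a width x height grid, starting and
    ending at the corner nearest the bot's current position."""
    corners = [(0, 0), (width - 1, 0), (width - 1, height - 1), (0, height - 1)]
    sx, sy = start
    # Pick the corner closest to the bot by Manhattan distance.
    nearest = min(range(4), key=lambda i: abs(corners[i][0] - sx) + abs(corners[i][1] - sy))
    # Rotate the corner list so the path begins at that corner,
    # then append it again to return to the starting point.
    path = corners[nearest:] + corners[:nearest]
    return path + [path[0]]
```

The bot would then drive between consecutive waypoints, using the gyro sensor to hold heading on each leg and to execute the turns at the corners.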