Droid Bot - Floor Bot With Navigation & Image Recognition
DroidBot is a medium-sized bot meant for table-top or floor environments. The bot is designed to use vision and image recognition alongside other forms of sensing: a lidar, a time-of-flight 'radar', and edge sensors.
A key new hardware subsystem this bot sports is my new drive control unit, based on the Mark-Toys Esp32 board as seen on EspressoBot and on the IoT page of this site.
The new Esp32 motor controller subsystem is made up of our
Esp32 Dev board
which then controls an off-the-shelf H-bridge style motor driver, so you can pick whatever power level your motors require. For DroidBot the H-bridge controller is the ArduMoto mid-sized driver.
The new motor controller's Esp32 software accepts serial packet commands from our base_control ROS node running on the Raspberry Pi 3.
Because the Esp32 runs a BLE GATT server, we can even remove the Raspberry Pi 3 entirely and run this bot from the motor controller alone: it accepts commands over its onboard BLE (Bluetooth) GATT server, so with our EspBot Android App, away we go in manual mode under human control.
In mid 2018 a couple of edge sensors plus a time-of-flight (ToF) distance sensor mounted on a servo were added to sense objects in front of the bot. The edge sensors and the ToF 'radar' are connected to the Esp32 board, which sweeps the servo to scan the ToF sensor. A ROS node on the Raspberry Pi then queries the Esp32 for the edge-sensor and ToF results. A demo at an HBRC meeting can be seen in
this live demo
A small 128x64-pixel OLED display shows the ToF radar results as well as edge-sensor and motor parameters for real-time viewing.
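The servo-swept ToF scan is easiest to reason about once the readings are in Cartesian form. Here is a minimal sketch of how a host-side node might interpret a sweep, assuming it has already collected (servo angle, distance) pairs from the Esp32; the function names and sample format are illustrative, not the actual DroidBot protocol:

```python
import math

def scan_to_points(samples):
    """Convert (servo_angle_deg, distance_m) pairs into (x, y) points.

    Assumes (by convention, for this sketch only) that 90 degrees points
    straight ahead of the bot, so x is lateral offset and y is forward range.
    """
    return [
        (d * math.cos(math.radians(a)), d * math.sin(math.radians(a)))
        for a, d in samples
    ]

def nearest_obstacle(samples):
    """Return the (angle, distance) sample with the smallest range."""
    return min(samples, key=lambda s: s[1])

if __name__ == "__main__":
    # One hypothetical sweep across the front of the bot.
    sweep = [(30, 1.2), (60, 0.8), (90, 0.4), (120, 0.9), (150, 1.5)]
    print(nearest_obstacle(sweep))   # the closest return in this sweep
    print(scan_to_points(sweep))     # the whole sweep as (x, y) points
```

A real node would feed these points into obstacle avoidance or publish them as a ROS LaserScan-like message, but the polar-to-Cartesian step is the same.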
The EspressoBot code has been enhanced and reused here by adding a serial input that accepts packets from the host (the RasPi3 or another system) to query or set parameters in the motor controller. The most valuable command sets both the right and left motor speeds at once; on the RasPi3 side we map onto that command whenever we receive a ROS 'twist' message, which is the ROS standard for specifying the required drive movement.
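The twist-to-motors mapping described above is standard differential-drive kinematics. A hedged sketch of the RasPi3-side math follows; the wheel-separation value and the packet format are made up for illustration and are not the real Mark-Toys serial protocol:

```python
def twist_to_wheels(linear_x, angular_z, wheel_separation_m):
    """Map a Twist (m/s forward, rad/s yaw) to left/right wheel speeds.

    Standard differential-drive kinematics: the yaw term speeds up one side
    and slows the other by half the wheel separation times the yaw rate.
    """
    left = linear_x - angular_z * wheel_separation_m / 2.0
    right = linear_x + angular_z * wheel_separation_m / 2.0
    return left, right

def make_speed_packet(left, right):
    """Build a hypothetical 'set both motors at once' serial packet
    (NOT the real Mark-Toys format): newline-terminated ASCII."""
    return f"M {left:.3f} {right:.3f}\n".encode("ascii")

if __name__ == "__main__":
    # Drive forward at 0.2 m/s while yawing left at 1 rad/s,
    # assuming a 0.3 m wheel separation.
    l, r = twist_to_wheels(0.2, 1.0, 0.3)
    print(make_speed_packet(l, r))
```

In a real base_control node the twist callback would run this math and write the resulting packet to the serial port connected to the Esp32.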
The Esp32 dev board used for the motor control system can be found
on this webpage.
This bot runs the motors from a 7.2-volt LiPo battery and powers the Pi and Esp32 from a 5-volt cell-phone-charger style supply. It is generally a good idea to keep these supplies separate.
This bot was only recently created, and a variety of other abilities, adding further forms of vision and sensing, are planned for 2018.
Mark-World - Tech Projects To Amuse The Curious