Welcome to the Robotics page of Mark-World, where you will see unique mini-robots designed and created from scratch.
Robotics has been my main focus at Mark-World since 2014. These robots have each had their fair share of mechanical, electronic, and software challenges, and each ends up being more complex and more capable than the last. Robots require a multi-disciplined engineering approach, which makes each of them a very satisfying project.
A list of my robot projects that have reached some point of completion:
- FiddlerBot - A robot running the ROS framework with a claw-like arm that acts on his image recognition
- RoboMadge - A mid-sized wheeled bot with a webcam to see, a rotating Lidar laser for navigation, and a precision drive system
- BugBot - A 6-legged insect-like bot that moves about using muscle wire and no motors. He does not bite (very hard)
- MotoBot - A small bot that has a unique home-brew cycloptic eye enabling him to seek an object while roaming about.
- MotoMouse - A second configuration of MotoBot that is meant to solve a maze (and also look cute)
- RobotArm - This is the 'arm' part of a robot with 5 degrees of freedom and a functioning claw.
- MotorCtrl - Dual Motor controller and servo driver board, a Mark-World.com creation.
- MarkWorldMotorControlBoard - An I2C board with two quadrature encoder inputs, a dual-channel motor driver signal generator, and a 3-channel servo signal generator, controlled by my BaseControl ROS node for a nicely packaged general base-control solution for a robot.
- rosbits github repository - Contains common ROS node code used in my bots running under ROS (the Robot Operating System)
I hope that you enjoy these creations and perhaps are even inspired to create your own robots or devices with moving parts for fun!
Many Mark-World Bots On display at Maker Faire 2016
Many of the Mark-World robots I have made were on display as part of the Home Brew Robotics Club booth.
A video of the robots doing my bidding can be seen on YouTube.
This robot, Robo Madge Ellen, is a mid-sized bot with navigation abilities; her name is thus a play on 'Robo Magellan'.
In January of 2015 I re-purposed the bot for indoor navigation when I got hold of a Lidar unit (a 360-degree laser distance device), seen at the top of the picture, which allows detection of nearby objects including walls. In the picture below we see Madge Ellen with the spinning Lidar unit on top (with its classic time-tunnel pattern). At a robot club meeting in late January 2016 she was shown following the nearest object while also turning her head (with cute eyes) to follow a person. It seemed a nice indoor use for this mid-sized robot.
Also in late 2015, I converted the bot to differential motors that are geared down for slower speeds. The motors have the added advantage of magnetic encoders and are now driven by a RoboClaw drive controller for precise wheel control, with the ability to travel precise distances or rotate through precise angles.
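As a rough sketch of how precise distances and rotations fall out of wheel encoders, here is the arithmetic with made-up numbers; the tick count, wheel size, and wheel base below are illustrative assumptions, not the bot's real values.

```python
import math

# Hypothetical drive parameters -- the real RoboClaw setup will differ.
TICKS_PER_REV = 1632        # encoder counts per wheel revolution (assumed)
WHEEL_DIAMETER_MM = 120.0   # wheel diameter (assumed)
WHEEL_BASE_MM = 300.0       # distance between the two drive wheels (assumed)

MM_PER_TICK = math.pi * WHEEL_DIAMETER_MM / TICKS_PER_REV

def ticks_for_distance(distance_mm):
    """Encoder ticks each wheel must turn to travel a straight distance."""
    return round(distance_mm / MM_PER_TICK)

def ticks_for_rotation(angle_deg):
    """Ticks for an in-place rotation: each wheel travels an arc of the
    turning circle whose diameter is the wheel base."""
    arc_mm = math.pi * WHEEL_BASE_MM * angle_deg / 360.0
    return round(arc_mm / MM_PER_TICK)
```

A controller like the RoboClaw does the closed-loop part; the conversion from "go 1 meter" to a tick target is this simple geometry.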
A brief introduction of Madge-Ellen to the Home Brew Robotics Club can be seen at 6 min 50 sec into this video, and I hope to soon link to the demo of the version shown below from the Jan 2016 meeting.
This bot also has GPS and a compass and understands high-level commands like 'face a given heading' or 'face a given waypoint'. The bot has a WiFi access point and Bluetooth joystick control, as well as a webcam that feeds OpenCV visual recognition routines, which at this time can identify an orange traffic cone.
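For a flavor of what 'face a given waypoint' involves, here is the standard great-circle bearing calculation; the function names are mine and this is only a sketch of the kind of math involved, not the bot's actual code.

```python
import math

def bearing_to_waypoint(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing (degrees, 0 = north, clockwise)
    from the bot's GPS fix to a target waypoint."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

def turn_needed(current_heading, target_bearing):
    """Signed turn in degrees (-180..180) to face the target;
    positive means turn clockwise."""
    return (target_bearing - current_heading + 180.0) % 360.0 - 180.0
```

The compass supplies `current_heading`; subtracting it from the waypoint bearing (with wrap-around handling) gives the turn command.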
I will offer more technical details on RoboMadge at some later point, and my goal is to come out of this with several pieces of the system to share with the open source community so others may use them for their robots. These will appear on my GitHub once they are mature. I am shooting for sharing the LCD display node, the servo controller node, and the Lsm303 compass/accelerometer node code. The GPS node is a fairly standard, already-available piece of open source software. Again, more details will appear as this bot becomes fully functional by the end of 2015.
My 3rd robot, FiddlerBot, has been steadily gaining abilities and now recognizes and interacts with objects using vision and a custom 'claw-arm', thus the name FiddlerBot. Command goals can be loaded into him to perform sequences of actions all on his own that require identifying, grabbing, and moving objects, as he can tell the distance and direction of things from his video image on a full pan-tilt camera mount.
In this video FiddlerBot moves back and forth across a table, sensing the edge and doing precise turn-abouts, before he uses vision to find object #2, turns on his 'laser' to highlight it, and then approaches the object at a precise distance he knows from his vision, placing the object in his grasp. Then he was told to find marker #6, go to that marker, and in doing so push object #2 off the table. The objects have little bar codes on them and can be seen in a picture of the web browser interface.
Here is the video showing FiddlerBot doing smart stuff.
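For the curious, the way a known-size marker yields a distance can be sketched with the standard pinhole-camera relation; the focal length and marker width below are made-up illustrative numbers, not FiddlerBot's actual calibration.

```python
# Pinhole-camera range estimate: a marker of known physical width appears
# smaller in the image the farther away it is.
FOCAL_PX = 600.0        # assumed focal length in pixels (from calibration)
MARKER_WIDTH_MM = 50.0  # assumed printed marker width

def distance_mm(marker_width_px):
    """Estimated camera-to-marker distance from apparent width in pixels."""
    return FOCAL_PX * MARKER_WIDTH_MM / marker_width_px
```

So a marker spanning 100 pixels would sit about six focal-length-scaled widths away; the bearing comes from where the marker lands left or right of image center.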
You can see a couple of videos of FiddlerBot at two stages, although I am talking to a robot club, so pay no attention to the techno-mumbo-jumbo and just watch for fun. Watch this YouTube video and go to 6 min 0 sec for my segment. This 1st video is more entertaining and shorter but does not show his ability to find and grab objects all by himself.
In the second video he is able to find and approach objects by number and can run lists of goals too, but since the video shows the web interface I ran him manually. Go to this video on YouTube and skip to 7 min 40 sec for the 2nd video; my show-and-tell to the robot club runs about 7 minutes.
FiddlerBot can be controlled using a full-function web interface, pictured below, with buttons for manual control or to manually tell him to find and pick up things and so on. FiddlerBot feeds back the status of his many sensors, his current goal, and other things, so you can monitor his progress following the directions you gave him, AND you can see what FiddlerBot sees in the video part of the web interface. This guy, in short, can be controlled remotely from anywhere over the internet, and you see what he sees.
To keep this page less cluttered, I have moved the cool details and pics to the FiddlerBot Progress page.
FiddlerBot With Camera and 'The Claw' to Grab and Move Things
The remote web interface, which can be operated from right next to him OR from across the country over the internet, is shown below. The top right has controls for manual movement, and below that the Bot Current State section shows his current goal and whether it is active or complete, as well as the distance and angle to the object he is looking for (if in view). Below that lies feedback from his assorted sensors. At the very bottom right are several check-boxes so you can manually tell FiddlerBot to search for, approach, and pick up things OR fire his 'Deadly Laser' ;-)
FiddlerBot Operates On His Own
FiddlerBot is shown in this video navigating in a square. Separate commands like 'move forward 20 cm' or 'turn 90 degrees' are sent to him in a list, and from there FiddlerBot executes all the commands until done. FiddlerBot has a WiFi hotspot, so we connect to his network and give him the commands over WiFi.
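The goal-list idea can be sketched as a tiny interpreter; the real bot uses ROS messages, so the command names and pose bookkeeping here are purely illustrative.

```python
import math

def run_goals(goals):
    """Execute ('forward', cm) and ('turn', degrees) commands in order,
    returning the final (x_cm, y_cm, heading_deg) pose."""
    x, y, heading = 0.0, 0.0, 0.0
    for cmd, value in goals:
        if cmd == "forward":
            # Advance along the current heading.
            x += value * math.cos(math.radians(heading))
            y += value * math.sin(math.radians(heading))
        elif cmd == "turn":
            heading = (heading + value) % 360.0
        else:
            raise ValueError("unknown command: " + cmd)
    return x, y, heading

# Driving in a square: four repeats of forward-then-turn return to start.
square = [("forward", 20), ("turn", 90)] * 4
```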
If curious, there is a HUGE amount of detail on the FiddlerBot Progress page.
At a high level, FiddlerBot runs a current version of Ubuntu Linux on a fairly powerful processor and makes extensive use of a very modular set of functions built on top of the Robot Operating System (ROS) toolkit.
Again, you may track or view past progress on this project on my FiddlerBot Progress page.
A 4" long, 6-legged radio-controlled 'bug' was developed using only 'muscle wire' to move his legs. This was a challenging project, mostly from a mechanical and electrical design perspective, but he is still 'cute', more or less. This is my oldest electro-mechanical bot, and hopefully you will agree that he has been physically laid out to give the image of a bug, as that was part of the goal.
The PIC microcontroller, with its digital and analog IO, allows creators of electronic projects to make intelligent devices that are both low-power and low-cost. Several projects shown here were developed with PIC microcontrollers.
The 'Bot' devices described were developed using the 'SourceBoost IDE', which I cannot say enough good things about. Projects typically use recent models of the Microchip PIC microcontrollers, which I have found to be full-featured, energy efficient, and very low cost.
BugBot And His Muscle Wires
BugBot is a 4 inch long robot who uses muscle wire on his legs and body to move. He uses a PIC microprocessor to sequence his movements, and his 'master' controls him via a radio control unit taken from one of the common 1.5-inch-long radio-controlled cars found on technical toy sites. This little guy presented me with many mechanical as well as electrical problems, mostly due to the constraints of working with muscle wire and a 3.7-volt battery. His body is 65 mm long (2.5 inches), and the LiPoly battery he carries below his body supplies only 3.7 volts, used to drive about 200 milliamps through the wires; this heats them and causes the roughly 5 percent contraction of the muscle wire. This little guy moves very slowly because each step covers about 3-4 mm while taking a couple of seconds, due to the heating and cooling speeds of the muscle wire.
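The numbers above combine into a quick back-of-envelope estimate; the wire length per leg is an assumed figure for illustration, while the other values come straight from the text.

```python
# Back-of-envelope numbers for BugBot's gait.
CONTRACTION = 0.05      # muscle wire shortens about 5 percent when heated
WIRE_LEN_MM = 70.0      # assumed active wire length driving one leg
STEP_MM = 3.5           # forward travel per step (text says 3-4 mm)
STEP_SECONDS = 2.0      # heat-plus-cool time per step (a couple of seconds)

stroke_mm = CONTRACTION * WIRE_LEN_MM     # pull available from one wire
speed_mm_per_s = STEP_MM / STEP_SECONDS   # overall walking speed
```

With these assumptions the available stroke is only a few millimeters and the top speed is under 2 mm per second, which is why every step matters mechanically.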
Movie of BugBot in Two Video Flavors
(however slow that may seem ... it is still 'action')
Avi Version: BugBot Movie
Mpeg Version: BugBot Movie
MotoBot is a self-contained, thinking bot with a proprietary 'cycloptic' linear vision system that allows him to find a little light, know how far away it is, and know whether he has to turn right or left to get it in the center. MotoBot can also tell when he gets to the edge of a table. MotoBot now converts back to MotoMouse as required, so see below for that configuration.
You can see a demo I did in Sept 2014 for the Home Brew Robotics Club at this MotoBot demo video link; go about 3 minutes into the video to find my discussion. The video shows many of his 'intelligent' features, and my demo segment is only about 5 minutes long, so enjoy.
MotoBot is thus able to wander about on a table; when he sees his little light he approaches it and tracks it until he loses interest, then continues to roam about. MotoBot has a few sensors to see the edge of the table and so on. MotoBot can also be controlled with a handheld radio transmitter for basic movement and rotations.
A nice little two-line display shows you what he is thinking or seeing and tells the distance to his little light (if it is in front).
Technical details (which I recommend you just scroll past to my next bot, as these details will be rather boring):
MotoBot is powered by a LiPoly battery and has two bi-directional wheels. This bot is a fully self-contained thinking bot who thinks with a PIC microprocessor programmed in C and can be controlled with a radio hand-controller. His 'phase 2', as shown above, still allows radio control as in phase 1, so he can be told to move in assorted patterns using separately controlled dual drive wheels. This bot can detect the distance and angle to a small lighted object using a 256-element linear-detection optical eye, enabling detection of a little light if it is within a 35-degree area in front of him.
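Turning a bright spot on the linear eye into a left/right angle can be sketched like this; the 35-degree field of view is from the description above, while the simple linear pixel-to-angle mapping is an assumption about the optics.

```python
# Mapping a bright pixel on a 256-element linear sensor to a bearing.
PIXELS = 256     # sensor elements across the field of view
FOV_DEG = 35.0   # field of view from the text

def bearing_deg(pixel_index):
    """Angle of the light relative to straight ahead; negative = left."""
    center = (PIXELS - 1) / 2.0
    return (pixel_index - center) * FOV_DEG / PIXELS
```

The sign of the result tells MotoBot whether to turn left or right, and its magnitude how hard; apparent brightness or spot size would give the distance cue.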
MotoBot With LCD Display
MotoBot first came to life in late 2007 and had his vision and so on working by Jan 2008. MotoBot will be able to have a lot more fun once he can sense other things (like with his ears, not yet in service but seen in a picture above). In MotoBot's Phase 3 development he will begin to physically interact with objects, and perhaps communicate in simple audio conversations.
In mid 2016 we configured the MotoBot platform to convert into MotoMouse to solve a maze for an HBRC (Home Brew Robotics Club) competition: a large 8-foot-by-8-foot maze on the floor. He was able to do it, and the video is posted on the web.
Here is the MotoMouse configuration of this bot.
MotoMouse In All His Cuteness
Here is the line sensor, which uses a PC board I made through OshPark and has separate down-facing IR sensors to find the line, the same type used in his MotoBot configuration to find the edge of a table or the edge of the box that MotoBot moves about within.
MotoMouse Line Sensor
Here we have a robotic arm that is not yet attached to any wheeled robot. The demo video below shows that this robot can move about in many ways due to its 6 degrees of freedom, or rather its 6-axis servo motor control. This project is really at a very early stage, so I made a video of the arm moving through a repeating sequence. It is kind of neat that it has so many different movements, and combining them using ROS packages such as 'MoveIt' is my goal. Even though this is a fairly low-end arm, I have made a 'model' for it and can run it virtually in a sort of robot CAD tool (not shown here).
This robot, along with MotoBot, was quite a hit at the 2015 Maker Faire, held in the first part of 2015 at the San Mateo Fairgrounds. It was shown at the Home Brew Robotics Club booth and was a real crowd-drawing display, similar to the movie, but I made the movements even fancier for the Maker Faire exhibit.
A picture is below, and you can also view the moving robotic arm video.
(This video may play sideways but you will still get the idea and besides isn't that a bit of fun?)
Here we have a small PC board that can be attached as an I2C slave to supply several functions common in robotic systems. It is implemented as two separate dsPIC30F2010 microcontrollers, and the board runs nicely on 3.3 V or 5 V. Each channel (processor) is accessed at an I2C address that differs by 1. The A4 jumper allows the board to appear at a 7-bit address base of 0x60 or 0x70, so hopefully we can avoid other I2C devices nicely.
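The addressing scheme works out to a tiny bit of arithmetic; this is just a restatement of the jumper and channel rules above, sketched for clarity.

```python
def channel_address(a4_jumper, channel):
    """7-bit I2C address for channel 0 or 1 of one board.
    a4_jumper False -> base 0x60, True -> base 0x70; channels differ by 1."""
    if channel not in (0, 1):
        raise ValueError("board has two channels: 0 or 1")
    base = 0x70 if a4_jumper else 0x60
    return base + channel
```

So two boards (one per jumper setting) can share a bus, occupying 0x60/0x61 and 0x70/0x71.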
As of this new board, built in Sept 2016, each side can now run closed-loop control for motors that have QEI encoders for speed readback. I have made a ROS node, and this board is suitable to drive a few servos on my FiddlerBot as well as the Robo Madge Ellen bot. I have not yet published the base-control ROS node on my GitHub, as it is still being tweaked while I work on RoboMadge Ellen in 2016.
- Each side always has one 16-bit quadrature encoder input. For fast motors this subsystem does not miss counts, whereas a software solution can. Each channel has a simple LED that shows the encoder is changing, as a quick check. Each read over I2C returns a 6-byte packet with a checksum, where 2 bytes are the encoder count.
- Each side can use its 6-signal header in one of 4 ways (full PID control mode added Sept 2016):
- 6 simple digital input lines
- 3 servo digital PWM lines that supply pulses from 400 us to 2.5 ms, scaled from an input value of 0-100 (soon to be 0-1000 as well). We only supply the signals, so the user must supply ground and power to his servos using separate connections.
- Two PWM 0-100% motor controller lines, each with PWM plus 2 control lines for braking or forward/reverse behavior on devices such as the high-power VNH2SP30 control bits, or just use the 0-100% PWM signal at a 5 ms cycle by itself.
- One proportional closed-loop motor control. Here each side reads the QEI encoder and drives a PWM signal to a driver. The driver board is external; a nice small one is the AdaFruit TB6612 model, but the control on other higher-power units is similar.
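Two of the details above, sketched in Python for illustration: the servo pulse scaling is as described, while the exact checksum rule is not given in the text, so a simple sum-of-bytes scheme is assumed here along with the byte layout.

```python
def servo_pulse_us(value):
    """Map a 0-100 servo command to a pulse width of 400 us .. 2.5 ms."""
    if not 0 <= value <= 100:
        raise ValueError("servo value must be 0-100")
    return 400 + value * (2500 - 400) / 100.0

def parse_encoder_packet(packet):
    """Pull the 16-bit encoder count out of a 6-byte read.
    The checksum (last byte = low 8 bits of the sum of the first five)
    and the big-endian count position are assumptions for illustration."""
    if len(packet) != 6:
        raise ValueError("expected 6 bytes")
    if sum(packet[:5]) & 0xFF != packet[5]:
        raise ValueError("bad checksum")
    return (packet[0] << 8) | packet[1]
```

The checksum matters because a corrupted I2C read could otherwise look like a huge encoder jump to the control loop.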
The board has a header for programming the microcontrollers, and a jumper selects which device will be programmed.
This project also led to development of an Arduino sketch that controls this device over I2C, with high-level drivers for the features for simple hookup to I2C on the Arduino. I have started to enjoy using the very low-cost Arduino Nano boards from China for all sorts of things; at about $3 or less, that board is a wonderful tool to have around. Just leave them programmed for assorted tasks, then burn another for the next task. Re-burn as required. So nice and available.
I hope you have been entertained by viewing the mini-robots from Mark-World!