Python Augmentation on Wheels
This is a new challenge for me: driving motors and communicating with a robot using a Raspberry Pi.
I'll break the build up into the following parts, with more detailed information on linked pages for anyone who wants to dive in further:
- Hardware for Droid
- Camera streaming from Raspberry Pi
- Motor control, including hooking up an Arduino to Pi
- Communication with Raspberry Pi
- PyGame augmentation
Tools of choice are:
- Raspberry Pi Zero W and Arduino Pro Mini 328
- Sparkfun Motor Driver
- Pimoroni's VL53L1X sensor
- Adafruit 500 PowerBoost Charger
- Sparkfun Fuel Gauge
- 2200mAh LiPo battery
- Python to just get things done, with loads of libraries for powerful functionality
- PyGame and SDL to create a remote view of the droid
- OpenCV to do some of the video heavy lifting
- Raspivid for streaming video, which comes as standard with the Raspberry Pi
- Netcat to manage the video streaming from Raspberry Pi
Hardware for Droid
The tracks are on the Pololu Zumo chassis, which is fairly compact yet still takes the micro-motors and batteries for a 6 V supply.
Zumo chassis and attached control box. Blu-Tack in good use holding the camera and ToF sensor
A Pi Zero has been used as it's compact and can still use the standard Pi Camera (with a different cable). Power usage is also much lower, which is better for the LiPo supply.
The Adafruit 500 PowerBoost combines well with the SparkFun Fuel Gauge. A battery can be added to either holder (but not two batteries in both!), charged, and used to provide power. I added the USB connector so a standard USB cable can run to the Pi, which means the protection circuits on the Pi are used.
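The SparkFun Fuel Gauge is based on a MAX17043, which reports state of charge over I2C. Assuming its I2C lines are wired to the Pi, a minimal reading sketch could look like this (address 0x36 and SOC register 0x04 come from the MAX17043 datasheet; the wiring itself is an assumption):

from smbus2 import SMBus

with SMBus(1) as bus:
    raw = bus.read_word_data(0x36, 0x04)  # SOC register, MSB sent first by the chip
    # read_word_data returns the bytes little-endian, so swap back to MSB.LSB
    soc = (((raw & 0xFF) << 8) | (raw >> 8)) / 256.0
    print(f"Battery: {soc:.1f}%")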
500 PowerBoost with soldered-on Fuel Gauge
Getting everything in the box was very snug. As the Pi Zero only uses two-thirds of the width, there is still room for the LiPo.
Snug battery fits below the prototype board
Rotary encoders are fitted to both motors, but the faff of soldering and fitting them wasn't worth it in the end. The idea was to provide better control and detect stalling, but it was more a curiosity than a practical need. I get random inputs from the light sensors through the Arduino, so I've given up on them for now. The majority of wires in the photos are from the encoders; each motor adds an extra four wires which aren't really useful.
Taking a recharge break with the lid off
Camera streaming
Raspivid comes installed with Raspbian images. There are alternatives such as GStreamer, which require additional installation and setup. I will try out GStreamer, but for now the results from Raspivid are good enough.
raspivid -t 0 -fps 15 -w 640 -h 480 --inline -b 10000000 -n -o - | nc -lk -p 5001
Options
- -t 0 creates a continuous output
- -fps 15 sets the framerate to a slow 15 fps; this is slow enough to prevent buffering
- -w 640 -h 480 sets the desired resolution to send. This is reasonable for speed and quality
- --inline inserts H.264 header data into the stream so the connecting Pygame code can pick the video up mid-stream
- -b 10000000 sets the bit rate to 10 Mbit/s
- -n to prevent showing a preview
- -o output to stdout
- | nc pipes the output to netcat
- -lk listen for connections and keep open after disconnects
- -p 5001 listen on port 5001
For Windows, Netcat is provided in the Nmap package, available to download at https://nmap.org/download.html.
OpenCV can process an incoming video stream. Opening the stream is as simple as
cam = cv2.VideoCapture("tcp://192.168.1.10:5001")
Add the IP address or hostname of the Raspberry Pi with the attached camera into the connection string.
Processing images for Pygame is easy with the following code
import cv2
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
cam = cv2.VideoCapture("tcp://192.168.1.10:5001")

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    ret, image = cam.read()
    if not ret:
        continue
    # OpenCV frames are BGR; convert to RGB rather than reversing the whole
    # buffer, which would also rotate the image 180 degrees
    rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    pygame_surface_image = pygame.image.frombuffer(rgb.tobytes(), rgb.shape[1::-1], "RGB")
    screen.blit(pygame_surface_image, (0, 0))
    pygame.display.flip()
Camera images take some time to catch up, but once in sync there's only a small amount of lag.
Motor control
The micro-motors are controlled by the SparkFun motor driver, which provides H-bridge control for each motor.
Controlling motors from a Raspberry Pi is perfectly possible, but I've had problems with PWM on GPIO 18 when the processor is even slightly busy. Arduino controllers are much better at PWM control as they provide real-time processing, so the Pi works with a SparkFun Pro Mini 328 running at 5 V.
The Arduino and Raspberry Pi run at different voltages, so communication over serial requires dropping the Arduino's 5 V TX line to 3.3 V. This is achieved with a voltage divider: for example, 1 kΩ and 2 kΩ resistors give 5 V × 2k/(1k+2k) ≈ 3.3 V. Only the TX and RX need connecting to talk to the Arduino. Programming is possible from the Pi, but the reset button will need pressing at the right time to upload code.
More information can be found on my Pi to Arduino page.
Commands are sent over serial to move the motors in two directions and at different speeds; a sketch of what this could look like is below. On a Raspberry Pi Zero the serial device is /dev/serial0.
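As a rough sketch of the Pi side of that serial link using pySerial, something like the following would work. The "M<left>,<right>" command format and the 9600 baud rate here are made up for illustration; the real protocol is whatever the Arduino code defines.

import serial

ser = serial.Serial("/dev/serial0", 9600, timeout=1)

def move(left, right):
    # Hypothetical command format: signed speeds -255..255 per motor
    ser.write(f"M{left},{right}\n".encode("ascii"))

move(128, 128)   # forward at half speed
move(-128, 128)  # spin left on the spot
move(0, 0)       # stop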
Communication with Raspberry Pi
The Zero W is great with its WiFi capability, and streaming video over this link is ideal. Video has already been covered above, but the other communication to issue move commands, activate LEDs and use the time-of-flight sensor needs an interface.
MQTT jumps to mind immediately as I've used it before for sensors. It's great for a publisher/subscriber situation, but REST APIs may fit this a little better (I'm still thinking of providing a hybrid with an MQTT bus as well for future changes).
REST APIs can be provided using Connexion and Flask. I use Python 3 for all my new code, so it's referenced in the installations below:
sudo apt-get install python3-flask
sudo pip3 install connexion
sudo pip3 install connexion[swagger-ui]
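As a minimal sketch of how the service could then be wired up (the spec filename and port below are assumptions for illustration, not the actual project files):

import connexion

# droid_api.yaml is a hypothetical OpenAPI spec describing the /droid/... endpoints
app = connexion.App(__name__, specification_dir=".")
app.add_api("droid_api.yaml")

if __name__ == "__main__":
    app.run(port=5000)

Connexion routes each operation in the spec to a Python handler function, so the Flask plumbing mostly stays out of the way.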
A client machine (running the PyGame software) will need the Python package requests:
pip install requests
The following API endpoints are created (a client sketch follows the list):
- /droid/range provides feedback and configuration of the VL53L1X time-of-flight sensor
- /droid/battery provides capacity information
- /droid/move allows specification of move commands to Arduino
- /droid/lights illuminates LED headlights
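As a rough idea of how the PyGame client talks to these endpoints (the host, port and JSON payload fields below are illustrative guesses, not the actual API schema):

import requests

DROID = "http://192.168.1.10:5000"  # assumed address of the Flask service

# Poll the time-of-flight sensor
range_info = requests.get(f"{DROID}/droid/range").json()

# Send a move command; the JSON fields are hypothetical
requests.post(f"{DROID}/droid/move", json={"direction": "forward", "speed": 128})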
With swagger-ui installed for Connexion, the API can be explored by browsing to /droid/ui. This provides a nice interface to set and query values.
PyGame Augmentation
This is the fun part. A client machine (Windows or another Raspberry Pi on Linux) can connect to the Video Droid and control it.
I made the display appear as though you are viewing through the windows of a vehicle.
Control view with HUD and control panel
Pressing the Video button connects the video stream from the Raspberry Pi droid. Ranging information appears as a HUD overlay.
Land of the giant hands
All button presses send a command over REST to the API endpoints hosted by Flask. Some polling has to happen to get the range distance information; this may have been better over an MQTT bus.
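A sketch of that polling-plus-overlay step inside the PyGame loop could look like this (the URL and the "distance" field name are assumptions):

import pygame
import requests

pygame.init()
font = pygame.font.SysFont(None, 24)

def draw_hud(screen):
    # Poll the REST API for the latest range reading
    distance = requests.get("http://192.168.1.10:5000/droid/range").json().get("distance")
    hud = font.render(f"Range: {distance} mm", True, (0, 255, 0))
    screen.blit(hud, (10, 10))

In practice the poll should run on a timer or a background thread so a slow HTTP response doesn't stall the frame rate.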
There is also a battery level indicator (not shown, as the battery wasn't in use for the shots) which appears when running on LiPo power. See the video at the start of this post for an internal and external view of the Video Droid.
I'm currently having loads of fun driving this around the house and making obstacle courses for it to explore. There is another droid in the pipeline to explore OpenCV further and hook up a NoIR camera for night vision.