Laustracker: Using computer vision to track little robots moving through a labyrinth
Overview
The robots
They are named “Robolaus”: “Robo” stands for robot, and “Laus” is the German word for louse.
Used software
OpenCV (http://opencv.org/)
OpenCV (Open Source Computer Vision) is a library of programming functions for real time computer vision.
ROS (http://www.ros.org/wiki/)
ROS (Robot Operating System) provides libraries and tools to help software developers create robot applications. It provides hardware abstraction, device drivers, libraries, visualizers, message-passing, package management, and more. ROS is licensed under an open source, BSD license.
OpenSceneGraph (http://www.openscenegraph.org/projects/osg)
The OpenSceneGraph is an open source high performance 3D graphics toolkit, used by application developers in fields such as visual simulation, games, virtual reality, scientific visualization and modelling. Written entirely in Standard C++ and OpenGL it runs on all Windows platforms, OSX, GNU/Linux, IRIX, Solaris, HP-Ux, AIX and FreeBSD operating systems.
Pygame (http://www.pygame.org/)
Pygame is a set of Python modules designed for writing games. Pygame adds functionality on top of the excellent SDL library. This allows you to create fully featured games and multimedia programs in the python language. Pygame is highly portable and runs on nearly every platform and operating system.
How it works
We have several ROS nodes that communicate with each other:
The camera node connects to a uEye USB camera and publishes each new image as soon as it receives it from the camera.
The Laustracker server processes the images, locates the labyrinth, the walls, and the robots, and publishes the results as ROS messages.
Laustracker3D and Laustracker2D are used to visualize the maze and the robot positions, both subscribe to messages from the Laustracker server.
The Laustracker XBee node receives robot positions and map data from the Laustracker server and sends that data wirelessly via XBee to the robots.
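A client only needs to subscribe to the server's topics to receive this data; that is essentially what the Laustracker2D script shown further below does. Here is a minimal sketch that reuses the topic and message names from that script:

#!/usr/bin/env python
# Minimal Laustracker client sketch: print where each robot currently is.
# The topic name and message type are taken from the Laustracker2D script below.
import roslib
roslib.load_manifest('laustracker2d')
import rospy
from ltserver.msg import Robots


def robots_callback(data):
    # The server publishes one entry per detected robot.
    for robot in data.robots:
        rospy.loginfo("%s is on field (%d, %d)", robot.name,
                      robot.field_idx.x, robot.field_idx.y)

if __name__ == "__main__":
    rospy.init_node('laustracker_listener', anonymous=True)
    rospy.Subscriber("robots_msg_publisher", Robots, robots_callback)
    rospy.spin()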
There is also laustracker-cli, a small console application that is useful for testing new features:
[chris@thinkpad cli]$ ../../bin/laustracker-cli --help
Allowed options:
--help Show Help Message
--debug Show Debug Images
--no-undistort Whether the image should not be undistorted
--mask Use a mask image to dedect walls
--black Use black lines to dedect walls
--manual Ask the user to select the labyrinth edges
--reddots Use red dots to find the labyrinth edges
--walls-img arg Use a image file and not the camera
--robots-img arg Search for robots in an local image
--robots-conf arg Use a specific robots configuration file
--labyrinth-conf arg Use a specific labyrinth configuration file
--debuglevel arg Set the debuglevel for Laustracker library functions
--benchmark Use robots-img to measure the performance (max FPS)
--no-wrap Do not wrap the image, should be faster
--fast Fast mode, the result may be a bit inaccurate but the
FPS is higher
Starting point
The starting point is an unprocessed camera image:
The camera is configured to output only an AOI (area of interest, http://en.ids-imaging.com/manuals/uEye_SDK/EN/uEye_Manual/is_aoi.html) and not the whole image to get higher FPS.
The camera needs to be calibrated; the images can then be undistorted. Here is a nice tutorial on how this works:
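The calibration yields an intrinsic camera matrix and distortion coefficients, which Laustracker stores in intrinsics.xml (the values below are taken from the server log further down). A minimal sketch of the undistortion step with OpenCV:

# Undistort a raw camera image, using the calibration values that the
# Laustracker server prints below (normally they are read from intrinsics.xml).
import cv2
import numpy as np

camera_matrix = np.array([[413.4837556918991, 0.0, 339.5],
                          [0.0, 413.3793124538085, 339.5],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.array([-0.3264955586115522, 0.130044296806931,
                        0.0, 0.0, -0.02822119522596339])

img = cv2.imread("testdata/aoi_labyrinth_empty.jpg")  # raw camera image
undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)
cv2.imwrite("undistorted.jpg", undistorted)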
Locating the labyrinth
Three methods are implemented:
- Selecting the labyrinth edges manually
- Locating the labyrinth via square detection (a sketch of this approach follows below)
- Locating the labyrinth with red markers at the corners
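The square-detection variant follows the usual OpenCV recipe: find contours in an edge image and keep the largest convex quadrilateral. A rough sketch of the idea (the actual implementation lives in the C++ Laustracker library and may differ in the details):

# Find the labyrinth outline as the largest convex quadrilateral in the image.
import cv2

img = cv2.imread("testdata/aoi_labyrinth_empty.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)

# OpenCV 2.x signature; OpenCV 3.x returns an extra value.
contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)

best_quad, best_area = None, 0
for cnt in contours:
    # Approximate each contour by a polygon and keep large convex quadrilaterals.
    approx = cv2.approxPolyDP(cnt, 0.02 * cv2.arcLength(cnt, True), True)
    area = cv2.contourArea(approx)
    if len(approx) == 4 and cv2.isContourConvex(approx) and area > best_area:
        best_quad, best_area = approx, area

# best_quad now holds the four labyrinth corners (if the detection succeeded).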
Locating the walls
[chris@thinkpad cli]$ ../../bin/laustracker-cli --walls-img=../../testdata/aoi_labyrinth_with_robots_leds_on.jpg --mask --reddots --debug
- Produce a black-and-white image.
- Use a mask to get the wall positions.
- Count white pixels to decide whether there is a wall or not (see the sketch below).
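A rough sketch of the pixel-counting idea, not the exact library code: threshold the image, look only at one wall region, and compare the number of white pixels against a limit. The region coordinates and the limit below are hypothetical; the real values come from the mask image and the labyrinth configuration.

# Decide whether one wall segment is present by counting white pixels.
import cv2

img = cv2.imread("testdata/aoi_labyrinth_with_robots_leds_on.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
_, bw = cv2.threshold(gray, 128, 255, cv2.THRESH_BINARY)  # black-and-white image

x, y, w, h = 100, 100, 60, 10        # hypothetical wall region in pixels
roi = bw[y:y + h, x:x + w]           # same effect as applying a mask image
white_pixels = cv2.countNonZero(roi)

wall_present = white_pixels > 0.5 * w * h  # hypothetical decision threshold
print("wall" if wall_present else "no wall")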
Locating the robots
[chris@thinkpad cli]$ ../../bin/laustracker-cli --robots-img=../../testdata/aoi_robots_dark.jpg --walls-img=../../testdata/aoi_labyrinth_empty.jpg --debug
- Decrease the exposure time and mask everything outside the labyrinth.
- Search for blue, red and white regions and decide whether they belong to a robot or not.
- Measure the distances from the white LED to the blue LED and to the red LED and use their ratio to assign names to the robots (see the sketch below).
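A sketch of the naming step, using the robot names, ratios, and max_ratio_diff from the robots.conf values shown in the server log below; the LED coordinates are made up, and the real code may pick the closest match rather than the first one:

# Assign a name to a detected robot from the geometry of its three LEDs.
import math

robot_names = ["Andi", "Pete"]      # from robots.conf (see the server log below)
robot_ratios = [1.38, 2.1]
max_ratio_diff = 0.4


def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Hypothetical LED centroids found in the image:
white_led, blue_led, red_led = (120, 80), (100, 60), (160, 45)

ratio = dist(white_led, red_led) / dist(white_led, blue_led)

name = None
for candidate, expected in zip(robot_names, robot_ratios):
    if abs(ratio - expected) < max_ratio_diff:
        name = candidate
        break
print(name, ratio)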
The Laustracker server
Before you can use Laustracker2D or Laustracker3D, you need to start the Laustracker server and roscore. I am using some images from the testdata/ directory here so that you can try it out yourself without needing a uEye camera.
Start roscore:
[chris@thinkpad ~]$ roscore
... logging to /home/chris/.ros/log/27d3a4e8-8027-11e2-ba83-a088b49df024/roslaunch-thinkpad-2599.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.
started roslaunch server http://thinkpad:42212/
ros_comm version 1.9.41
SUMMARY
========
PARAMETERS
* /rosdistro
* /rosversion
NODES
auto-starting new master
process[master]: started with pid [2612]
ROS_MASTER_URI=http://thinkpad:11311/
setting /run_id to 27d3a4e8-8027-11e2-ba83-a088b49df024
process[rosout-1]: started with pid [2625]
started core service [/rosout]
Start the Laustracker server:
[chris@thinkpad ~]$ rosrun ltserver laustrackerd --robots-img=${LAUSTRACKER_PATH}/testdata/aoi_robots_dark.jpg --walls-img=${LAUSTRACKER_PATH}/testdata/aoi_labyrinth_empty.jpg
Map created
Trying to open ../share/laustracker/robots.conf
Warning: could not open ../share/laustracker/robots.conf
Used intrinsics.xml: /mnt/data/Uni/Projects/laustracker.git/laustracker/ros/devel/lib/ltserver/../share/laustracker/intrinsics.xml
Warning: could not open intrinsics.xml
Using image file: /home/chris/data/Uni/Projects/laustracker.git/laustracker/testdata/aoi_labyrinth_empty.jpg.
Using image file: /home/chris/data/Uni/Projects/laustracker.git/laustracker/testdata/aoi_robots_dark.jpg.
Used intrinsics.xml: /mnt/data/Uni/Projects/laustracker.git/laustracker/ros/devel/lib/ltserver/../../../../share/laustracker/intrinsics.xml
used: /mnt/data/Uni/Projects/laustracker.git/laustracker/ros/devel/lib/ltserver/../../../../share/laustracker/intrinsics.xml
image width: 680
image height: 680
intrinsic matrix:[413.4837556918991, 0, 339.5;
0, 413.3793124538085, 339.5;
0, 0, 1]
distortion coefficients: [-0.3264955586115522, 0.130044296806931, 0, 0, -0.02822119522596339]
mask_filename: /mnt/data/Uni/Projects/laustracker.git/laustracker/ros/devel/lib/ltserver/../../../../share/laustracker/mask.png
Trying to open ../../../../share/laustracker/robots.conf
robot_names = Andi
Pete
robot_ratios = 1.38
2.1
max_ratio_diff = 0.4
Finished all contours
============ Begin Map ===================
#########################
# # #
# ########## ##########
# # # #
# # ################ #
# # # # # #
# # # ####### #### #
# # # # #
### # #### ##########
# # # #
# ####### #############
# # # #
# # ####### ####### #
# # # # # #
# ###### # # # # #
# # # # #
#########################
============ End Map =====================
The Laustracker2D Python script
The Laustracker2D Python script connects to the Laustracker server and then displays the map.
[chris@thinkpad ~]$ rosrun laustracker2d laustracker2d.py --debug
running in debug mode
[INFO] [WallTime: 1361892176.475582] /laustracker2D_2892_1361892170234 I heard:
START_ROBOTS:
Pete 5 2 0 0 0.109589 0.0547945 1 1.88147 19.78 37.2156
Andi 2 1 2 270 0.493151 -0.109589 1 1.44278 25 36.0694
:END_ROBOTS
[INFO] [WallTime: 1361892176.481251] /laustracker2D_2892_1361892170234 I heard:
START_MAP:[00001010][00001100][00001100][00001100][00001001][00001110][00001100][00001101][00000011][00001010][00001100][00001101][00000110][00001100][00001100][00001001][00000011][00000011][00001010][00001100][00001100][00001001][00001111][00000011][00000110][00000001][00000011][00001110][00001001][00000110][00001100][00000101][00001010][00000101][00000110][00001001][00000110][00001100][00001100][00001101][00000011][00001010][00001101][00000110][00001000][00001100][00001100][00001001][00000011][00000110][00001100][00001001][00000011][00001010][00001001][00000011][00000110][00001100][00001100][00000101][00000110][00000101][00000111][00000111]:END_MAP
#########################
# # #
# ########## ##########
# # < # #
# # ################ #
# # # ^ # # #
# # # ####### #### #
# # # # #
### # #### ##########
# # # #
# ####### #############
# # # #
# # ####### ####### #
# # # # # #
# ###### # # # # #
# # # # #
#########################
...
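Each bracketed group in the START_MAP debug string encodes one labyrinth field. The bits are read in the same order as in the mapFromString() function of the script below: buoy_top, buoy_bottom, buoy_left, buoy_right, wall_top, wall_bottom, wall_left, wall_right. A quick decoding sketch:

# Decode one bracketed field of the START_MAP debug string shown above.
FLAGS = ("buoy_top", "buoy_bottom", "buoy_left", "buoy_right",
         "wall_top", "wall_bottom", "wall_left", "wall_right")


def decode_field(bits):
    # bits is the 8-character string between '[' and ']'.
    return dict(zip(FLAGS, (bit == '1' for bit in bits)))

# The first field, the top-left corner of the maze:
print(decode_field("00001010"))
# -> only wall_top and wall_left are True, as expected for the corner field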
The source code
#!/usr/bin/env python
# -*- coding: UTF-8 -*-
'''
Laustracker2D
~~~~~~~~~~~~~
This script connects to the Laustracker server and displays
a simple 2D map.
:copyright: Copyright 2013 by Christian Jann
:license: GPLv3+, see COPYING file
'''
from time import sleep
from optparse import OptionParser
import sys
import math
import pygame
import roslib
roslib.load_manifest('laustracker2d')
import rospy
from std_msgs.msg import String
from ltserver.msg import Map, Robots
MARKER_MAP_BEGIN = "START_MAP:"
MARKER_MAP_END = ":END_MAP"
MARKER_ROBOTS_BEGIN = "START_ROBOTS:"
MARKER_ROBOTS_END = ":END_ROBOTS"
FIELD_WIDTH_PX = 60
LAB_SIZE_PX = 8 * FIELD_WIDTH_PX + 9 + 1
FIELD_WIDTH_CM = 30.0
ROBOT_WIDTH_CM = 17.0
ROB_RADIUS = int(ROBOT_WIDTH_CM / FIELD_WIDTH_CM * FIELD_WIDTH_PX / 2.0)
NO_RELATIV_POS = False
TOP = 0
BOTTOM = 1
LEFT = 2
RIGHT = 3
screen = None
robots = []
fields = []
debug = False
show_robot_pos_cm = False
class Field:
def __init__(self):
self.wall_top = False
self.wall_bottom = False
self.wall_left = False
self.wall_right = False
self.buoy_top = False
self.buoy_bottom = False
self.buoy_left = False
self.buoy_right = False
class Robot:
def __init__(self):
self.name = ""
self.field_idx = (0, 0)
self.direction = 0
self.angle = 0
self.relative_pos = (0, 0)
self.third_led_found = False
self.ratio = 0
self.dist_blue_white = 0
self.dist_red_white = 0
self.absolut_pos_cm = (0, 0)
def print_map():
MAP_SIZE_H = 8 * 3 + 1
MAP_SIZE_V = 8 * 2 + 1
# create a MAP_SIZE_V x MAP_SIZE_H matrix of '#'
pmap = [['#' for col in range(MAP_SIZE_V)] for row in range(MAP_SIZE_H)]
for i in range(1, MAP_SIZE_V - 1, 2):
for j in range(1, MAP_SIZE_H - 1, 3):
pmap[j][i] = ' '
pmap[j + 1][i] = ' '
for y in range(0, 8, 1):
for x in range(0, 8, 1):
if(fields[(x, y)].wall_top == False):
pmap[x * 3 + 1][2 * y] = ' '
pmap[x * 3 + 2][2 * y] = ' '
if (fields[(x, y)].wall_bottom == False):
pmap[x * 3 + 1][2 * y + 1] = ' '
pmap[x * 3 + 2][2 * y + 1] = ' '
if (fields[(x, y)].wall_left == False):
pmap[x * 3][2 * y + 1] = ' '
if (fields[(x, y)].wall_right == False):
pmap[x * 3 + 3][2 * y + 1] = ' '
if (y > 0):
if (fields[(x, y)].wall_top == False
and fields[(x, y)].wall_left == False
and fields[(x, y - 1)].wall_left == False):
pmap[x * 3][2 * y] = ' '
if(robots != None):
for robot in robots:
if(robot.direction == TOP):
pmap[robot.field_idx[0] * 3 + 1]\
[2 * robot.field_idx[1] + 1] = '^'
if(robot.direction == BOTTOM):
pmap[robot.field_idx[0] * 3 + 1]\
[2 * robot.field_idx[1] + 1] = 'v'
if(robot.direction == LEFT):
pmap[robot.field_idx[0] * 3 + 1]\
[2 * robot.field_idx[1] + 1] = '<'
if(robot.direction == RIGHT):
pmap[robot.field_idx[0] * 3 + 1]\
[2 * robot.field_idx[1] + 1] = '>'
map_str = ""
for i in range(0, MAP_SIZE_V, 1):
for j in range(0, MAP_SIZE_H, 1):
map_str += pmap[j][i]
map_str += "\n"
rospy.loginfo("Map:\n%s\n" % map_str)
def chr2bool(string, idx):
if (idx > -1 and idx < len(string)):
if (string[idx] == '0'):
return False
else:
return True
else:
print("wrong index")
return False
def mapFromString(map_str):
fields = {}
for x in range(0, 8):
for y in range(0, 8):
fields[(x, y)] = Field()
begin = map_str.find(MARKER_MAP_BEGIN)
end = map_str.find(MARKER_MAP_END)
if (begin > -1) and (end > -1):
idx = begin + len(MARKER_MAP_BEGIN)
for y in range(0, 8, 1):
for x in range(0, 8, 1):
idx += 1
fields[(x, y)].buoy_top = chr2bool(map_str, idx)
idx += 1
fields[(x, y)].buoy_bottom = chr2bool(map_str, idx)
idx += 1
fields[(x, y)].buoy_left = chr2bool(map_str, idx)
idx += 1
fields[(x, y)].buoy_right = chr2bool(map_str, idx)
idx += 1
fields[(x, y)].wall_top = chr2bool(map_str, idx)
idx += 1
fields[(x, y)].wall_bottom = chr2bool(map_str, idx)
idx += 1
fields[(x, y)].wall_left = chr2bool(map_str, idx)
idx += 1
fields[(x, y)].wall_right = chr2bool(map_str, idx)
idx += 1
idx += 1
else:
print("begin or end missing, or both")
return fields
def robotsFromString(robots_str):
robots = []
begin = robots_str.find(MARKER_ROBOTS_BEGIN)
end = robots_str.find(MARKER_ROBOTS_END)
robot_str = robots_str[begin + len(MARKER_ROBOTS_BEGIN) + 1:end - 1]
for line in robot_str.splitlines():
values = line.split()
rob = Robot()
rob.name = values[0]
rob.field_idx = (int(values[1]), int(values[2]))
rob.direction = int(values[3])
rob.angle = float(values[4])
rob.relative_pos = (float(values[5]), float(values[6]))
        rob.third_led_found = bool(int(values[7]))  # '0'/'1' flag; bool() on a non-empty string would always be True
rob.ratio = float(values[8])
rob.dist_blue_white = float(values[9])
rob.dist_red_white = float(values[10])
rob.absolut_pos_cm = (float(values[11]), float(values[12]))
robots.append(rob)
if debug:
print_robots(robots)
return robots
def print_robots(robots):
robots_str = "Robots:\n"
for robot in robots:
robots_str += " ".join([repr(el) for el in [robot.name, robot.field_idx,
robot.direction, robot.angle,
robot.absolut_pos_cm]])
robots_str += "\n"
rospy.loginfo("\n%s" % robots_str)
def wall(x, y, where):
xc = FIELD_WIDTH_PX / 2 + 1 + (FIELD_WIDTH_PX + 1) * x
yc = FIELD_WIDTH_PX / 2 + 1 + (FIELD_WIDTH_PX + 1) * y
if(where == TOP):
pygame.draw.line(
screen, (255, 255, 255), (xc - FIELD_WIDTH_PX / 2,
yc - FIELD_WIDTH_PX / 2),
(xc + FIELD_WIDTH_PX / 2, yc - FIELD_WIDTH_PX / 2), 3)
if(where == BOTTOM):
pygame.draw.line(
screen, (255, 255, 255), (xc - FIELD_WIDTH_PX / 2,
yc + FIELD_WIDTH_PX / 2),
(xc + FIELD_WIDTH_PX / 2, yc + FIELD_WIDTH_PX / 2), 3)
if(where == LEFT):
pygame.draw.line(
screen, (255, 255, 255), (xc - FIELD_WIDTH_PX / 2,
yc + FIELD_WIDTH_PX / 2),
(xc - FIELD_WIDTH_PX / 2, yc - FIELD_WIDTH_PX / 2), 3)
if(where == RIGHT):
pygame.draw.line(
screen, (255, 255, 255), (xc + FIELD_WIDTH_PX / 2,
yc - FIELD_WIDTH_PX / 2),
(xc + FIELD_WIDTH_PX / 2, yc + FIELD_WIDTH_PX / 2), 3)
def draw_bot(field_idx, angle, relative_pos=None):
xc = int(FIELD_WIDTH_PX / 2 + 1 + (FIELD_WIDTH_PX + 1) * field_idx[0])
yc = int(FIELD_WIDTH_PX / 2 + 1 + (FIELD_WIDTH_PX + 1) * field_idx[1])
if relative_pos:
xc = int(xc - relative_pos[0] * FIELD_WIDTH_PX)
yc = int(yc - relative_pos[1] * FIELD_WIDTH_PX)
pygame.draw.circle(screen, (0, 255, 0), (xc, yc), ROB_RADIUS, 4)
dx = math.cos(math.radians(angle - 90)) * ROB_RADIUS
dy = math.sin(math.radians(angle - 90)) * ROB_RADIUS
pygame.draw.line(screen, (0, 255, 0), (xc, yc), (xc + dx, yc + dy), 4)
def update_display():
screen.fill((0, 0, 150))
for x in xrange(0, LAB_SIZE_PX, FIELD_WIDTH_PX + 1):
pygame.draw.line(screen, (0, 0, 0), (x, 0), (x, LAB_SIZE_PX))
for y in xrange(0, LAB_SIZE_PX, FIELD_WIDTH_PX + 1):
pygame.draw.line(screen, (0, 0, 0), (0, y), (LAB_SIZE_PX, y))
if fields:
for y in range(0, 8, 1):
for x in range(0, 8, 1):
if(fields[(x, y)].wall_top == True):
wall(x, y, TOP)
if(fields[(x, y)].wall_bottom == True):
wall(x, y, BOTTOM)
if(fields[(x, y)].wall_right == True):
wall(x, y, RIGHT)
if(fields[(x, y)].wall_left == True):
wall(x, y, LEFT)
for robot in robots:
if NO_RELATIV_POS:
draw_bot(robot.field_idx, robot.angle)
else:
draw_bot(robot.field_idx, robot.angle, robot.relative_pos)
if show_robot_pos_cm:
if len(robots) != 0:
pygame.display.set_caption(
'Laustracker2D: Robot 1: ' +
robots[0].name + ' [' +
str(round(robots[0].absolut_pos_cm[0], 2)) + ',' +
str(round(robots[0].absolut_pos_cm[1], 2)) + ']')
pygame.display.update()
def handle_events():
for event in pygame.event.get():
if event.type == pygame.KEYDOWN and event.key == pygame.K_ESCAPE:
rospy.signal_shutdown("pygame.quit()")
sleep(1)
return False
elif event.type == pygame.QUIT:
return False
return True
def mapCallback(data):
msg = str(data.data)
if debug:
rospy.loginfo(rospy.get_caller_id() + " I heard:\n%s\n", msg)
if(msg.startswith(MARKER_MAP_BEGIN)):
global fields
fields = mapFromString(msg)
if debug:
print_map()
update_display()
def mapMsgCallback(data):
if debug:
rospy.loginfo(rospy.get_caller_id() + " I heard:\n%s\n", data)
global fields
tmp_fields = {}
for x in range(0, 8, 1):
for y in range(0, 8, 1):
tmp_fields[(x, y)] = Field()
tmp_fields[(x, y)].buoy_top = data.map[x + y * 8].buoy_top
            tmp_fields[(x, y)].buoy_bottom = data.map[x + y * 8].buoy_bottom
tmp_fields[(x, y)].buoy_left = data.map[x + y * 8].buoy_left
tmp_fields[(x, y)].buoy_right = data.map[x + y * 8].buoy_right
tmp_fields[(x, y)].wall_top = data.map[x + y * 8].wall_top
tmp_fields[(x, y)].wall_bottom = data.map[x + y * 8].wall_bottom
tmp_fields[(x, y)].wall_left = data.map[x + y * 8].wall_left
tmp_fields[(x, y)].wall_right = data.map[x + y * 8].wall_right
fields = tmp_fields
if debug:
print_map()
update_display()
def robotsCallback(data):
msg = str(data.data)
if debug:
rospy.loginfo(rospy.get_caller_id() + " I heard:\n%s\n", msg)
if(msg.startswith(MARKER_ROBOTS_BEGIN)):
global robots
robots = robotsFromString(msg)
update_display()
def robotsMsgCallback(data):
if debug:
rospy.loginfo(rospy.get_caller_id() + " I heard:\n%s\n", data)
global robots
tmp_robots = []
for robot in data.robots:
rob = Robot()
rob.name = robot.name
rob.field_idx = (robot.field_idx.x, robot.field_idx.y)
rob.direction = robot.direction
rob.angle = robot.angle
rob.relative_pos = (robot.relative_pos.x, robot.relative_pos.y)
rob.third_led_found = robot.third_led_found
rob.ratio = robot.ratio
rob.dist_blue_white = robot.dist_blue_white
rob.dist_red_white = robot.dist_red_white
rob.absolut_pos_cm = (robot.absolut_pos_cm.x, robot.absolut_pos_cm.y)
tmp_robots.append(rob)
robots = tmp_robots
if debug:
print_robots(robots)
update_display()
if __name__ == "__main__":
parser = OptionParser()
parser.add_option("-d", "--debug",
action="store_true", dest="debug",
help="Print status messages to stdout")
parser.add_option("-p", "--pos",
action="store_true", dest="pos",
help="Show robot position in window title")
(options, args) = parser.parse_args()
if options.debug:
print("Running in debug mode")
debug = True
if options.pos:
show_robot_pos_cm = True
pygame.init()
screen = pygame.display.set_mode((LAB_SIZE_PX, LAB_SIZE_PX), 0, 32)
pygame.display.set_caption('Laustracker2D using pygame')
update_display()
rospy.init_node('laustracker2D', anonymous=True)
# rospy.Subscriber("map_publisher", String, mapCallback)
rospy.Subscriber("map_msg_publisher", Map, mapMsgCallback)
# rospy.Subscriber("robots_publisher", String, robotsCallback)
rospy.Subscriber("robots_msg_publisher", Robots, robotsMsgCallback)
while handle_events() and not rospy.is_shutdown():
rospy.sleep(0.5)
pygame.quit()
Laustracker3D
[chris@thinkpad ~]$ rosrun laustracker3d laustracker3d
Trying to open ../share/laustracker/robots.conf
Warning: could not open ../share/laustracker/robots.conf
Map created
Screensize: 1366x768
FPS: 10.45
FPS: 30.1
FPS: 29.87
FPS: 30.18
...
Using robotsvideo.avi as input
First start the Laustracker server with the demo video:
[chris@thinkpad ~]$ rosrun ltserver laustrackerd --walls-img=${LAUSTRACKER_PATH}/testdata/aoi_labyrinth_empty.jpg --robots-vid=${LAUSTRACKER_PATH}/testdata/robotsvideo.avi --fast --headless --fps=20
If you use the --robots-vid option, the specified video will be repeated infinitely.
Then you can run Laustracker3D as usual:
[chris@thinkpad ~]$ rosrun laustracker3d laustracker3d
Source code and API documentation
The source code is hosted at Bitbucket because, unlike the better-known GitHub, Bitbucket allows unlimited free private repositories.
At the moment only the laustracker-main repository is publicly available:
And the API documentation is located here:
Get the source code
[chris@thinkpad Downloads]$ mkdir laustracker-src
[chris@thinkpad Downloads]$ cd laustracker-src
[chris@thinkpad laustracker-src]$ git clone https://bitbucket.org/christianjann/laustracker-main.git laustracker
Cloning into 'laustracker'...
remote: Counting objects: 3, done.
remote: Compressing objects: 100% (2/2), done.
remote: Total 3 (delta 0), reused 0 (delta 0)
Unpacking objects: 100% (3/3), done.
[chris@thinkpad laustracker-src]$