<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom"><title>Johns Blog</title><link href="http://www.johnstowers.co.nz/blog/" rel="alternate"></link><link href="http://www.johnstowers.co.nz/blog/feeds/ROS.atom.xml" rel="self"></link><id>http://www.johnstowers.co.nz/blog/</id><updated>2014-05-27T21:05:00+02:00</updated><entry><title>FlyMAD - The Fly Mind Altering Device</title><link href="http://www.johnstowers.co.nz/blog/2014/05/27/flymad/" rel="alternate"></link><updated>2014-05-27T21:05:00+02:00</updated><author><name>John Stowers</name></author><id>tag:www.johnstowers.co.nz/blog,2014-05-27:2014/05/27/flymad/</id><summary type="html">&lt;p&gt;Today I'm proud to announce the availability of &lt;a class="reference external" href="https://github.com/strawlab/flymad"&gt;all source code&lt;/a&gt;,
and the &lt;a class="reference external" href="http://www.nature.com/doifinder/10.1038/nmeth.2973"&gt;advanced online publication&lt;/a&gt;
of our paper&lt;/p&gt;
&lt;blockquote&gt;
Bath DE*, Stowers JR*, Hörmann D, Poehlmann A, Dickson BJ, Straw AD (* equal contribution) (2014)
&lt;strong&gt;FlyMAD: Rapid thermogenetic control of neuronal activity in freely-walking Drosophila.&lt;/strong&gt;
&lt;a class="reference external" href="http://www.nature.com/doifinder/10.1038/nmeth.2973"&gt;Nature Methods.&lt;/a&gt; doi 10.1038/nmeth.2973&lt;/blockquote&gt;
&lt;p&gt;&lt;a class="reference external" href="http://flymad.strawlab.org"&gt;FlyMAD&lt;/a&gt; (Fly Mind Altering Device) is a system for targeting freely walking
flies (Drosophila) with lasers. This allows rapid thermo- and optogenetic manipulation of the
fly nervous system in order to study neuronal function.&lt;/p&gt;
&lt;a class="reference external image-reference" href="http://www.johnstowers.co.nz/blog/static/images/strawlab/flymad_intro.png"&gt;&lt;img alt="|filename|images/strawlab/flymad_intro_sml.png" class="align-center" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/flymad_intro_sml.png" /&gt;&lt;/a&gt;
&lt;p&gt;The scientific aspects of the publication are better summarised on
&lt;a class="reference external" href="http://www.nature.com/news/laser-beam-makes-flies-flirt-1.14794"&gt;nature.com&lt;/a&gt;,
&lt;a class="reference external" href="http://www.sciencedaily.com/releases/2014/05/140525154744.htm"&gt;here&lt;/a&gt;,
on our &lt;a class="reference external" href="http://strawlab.org/2014/05/25/flymad/"&gt;laboratory website&lt;/a&gt;, or
in the video at the bottom of this post.&lt;/p&gt;
&lt;p&gt;Briefly, however: if one wishes to link function to specific neurons, one can conceive of
two broad approaches. First, observe the firing of the neurons in
&lt;a class="reference external" href="http://www.imp.ac.at/news/press-releases/press-release/press-release-high-speed-imaging-method-captures-entire-brain-activity/"&gt;real time&lt;/a&gt;
using fluorescence or other microscopy techniques. Second, use genetic techniques to engineer organisms with
light or temperature sensitive proteins bound to specific neuronal classes such that by the application
of heat or light, activity in those neurons can be modulated.&lt;/p&gt;
&lt;p&gt;Our system takes the second approach, our innovation being that by using real time computer vision and
control techniques we are able to track freely walking Drosophila and apply precise (sub-0.2 mm)
opto- or thermogenetic stimulation to study the role of specific neurons in a diverse array of behaviours.&lt;/p&gt;
&lt;p&gt;This blog post will cover a few of the technical and architectural decisions I made in the creation of the system.
Perhaps it is easiest to start with a screenshot and a schematic of the system in operation.&lt;/p&gt;
&lt;a class="reference external image-reference" href="http://www.johnstowers.co.nz/blog/static/images/strawlab/flymad_screenshot.png"&gt;&lt;img alt="|filename|images/strawlab/flymad_screenshot_sml.png" class="align-center" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/flymad_screenshot_sml.png" /&gt;&lt;/a&gt;
&lt;p&gt;Here one can see two windows showing images from the two tracking cameras, together with the
associated image processing configuration parameters (and their results, at 120 fps). At the bottom
centre, the ROS-based experimental control UI is visible.
Schematically, the two cameras and lasers are arranged as follows:&lt;/p&gt;
&lt;a class="reference external image-reference" href="http://www.johnstowers.co.nz/blog/static/images/strawlab/render2.png"&gt;&lt;img alt="|filename|images/strawlab/render2_sml.png" class="align-center" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/render2_sml.png" /&gt;&lt;/a&gt;
&lt;p&gt;In this image you can also see the Thorlabs 2D galvanometers (top left), and the dichroic mirror
which allows aligning the camera and laser on the same optical axis.&lt;/p&gt;
&lt;p&gt;By pointing the laser at flies freely walking in the arena below, one can subsequently
deliver heat or light to specific body regions.&lt;/p&gt;
&lt;div class="section" id="general-architecture"&gt;
&lt;h2&gt;General Architecture&lt;/h2&gt;
&lt;p&gt;The system consists of hardware and software elements. A small microcontroller and
digital to analogue converter generate analog control signals to point the
2D galvanometers and to control laser power. The device communicates with the host
PC over a serial link. There are two cameras in the system; a wide camera for fly position
tracking, and a second high magnification camera for targeting specific regions of the fly.
This second camera is aligned with the laser beam, and its view can be pointed
anywhere in the arena by the galvanometers.&lt;/p&gt;
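&lt;p&gt;As a rough sketch of this link (the packet format, DAC scaling, and port name below are invented for illustration; the real protocol lives in the FlyMAD firmware), pointing the galvanometers amounts to mapping coordinates to DAC counts and writing a few bytes over the serial port:&lt;/p&gt;

```python
import struct

DAC_MAX = 0xFFF  # hypothetical 12-bit DAC full-scale count

def galvo_packet(x, y, laser_power):
    """Pack normalised galvanometer coordinates and laser power (all 0..1)
    into a hypothetical 7-byte serial command: a 'G' marker followed by
    three big-endian 16-bit DAC counts."""
    def to_counts(v):
        return int(max(0.0, min(1.0, v)) * DAC_MAX)
    return struct.pack('>cHHH', b'G',
                       to_counts(x), to_counts(y), to_counts(laser_power))

packet = galvo_packet(0.5, 0.25, 1.0)
# With real hardware this would be written over the serial link, e.g.
#   serial.Serial('/dev/ttyUSB0', 115200).write(packet)
```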
&lt;p&gt;The software is conceptually three parts: image processing code, tracking and targeting code, and
experimental logic. All software elements communicate using the Robot Operating
System (ROS) interprocess communication layer. The great majority of the
code is written in Python.&lt;/p&gt;
&lt;img alt="|filename|images/strawlab/path8510_sml.png" class="align-center" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/path8510_sml.png" /&gt;
&lt;/div&gt;
&lt;div class="section" id="robot-operating-system-ros"&gt;
&lt;h2&gt;Robot Operating System (ROS)&lt;/h2&gt;
&lt;p&gt;&lt;a class="reference external" href="http://www.ros.org"&gt;ROS&lt;/a&gt; is a framework traditionally used for building
complex robotic systems. In particular it has relatively good performance and a
simple, strongly typed inter-process communication framework and serialization format.&lt;/p&gt;
&lt;p&gt;Through its (pure) python interface one can build a complex system of multiple
processes that communicate (primarily) by publishing and subscribing to
message &amp;quot;topics&amp;quot;. An example of the ROS processes running during a FlyMAD
experiment can be seen below.&lt;/p&gt;
&lt;a class="reference external image-reference" href="http://www.johnstowers.co.nz/blog/static/images/strawlab/s3_crop.png"&gt;&lt;img alt="|filename|images/strawlab/s3_crop_sml.png" class="align-center" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/s3_crop_sml.png" /&gt;&lt;/a&gt;
&lt;p&gt;The lines connecting the nodes represent the flow of information across the
network, and all messages can be simultaneously recorded (see &lt;tt class="docutils literal"&gt;/recorder&lt;/tt&gt;)
for analysis later. Furthermore, the isolation of the individual processes
improves robustness and defers some of the responsibility for realtime
performance from myself / Python to the kernel and to the overall
architecture.&lt;/p&gt;
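&lt;p&gt;The publish/subscribe pattern itself can be illustrated in a few lines of plain Python (this is a toy dispatcher showing the semantics, not the rospy API, and the topic name is made up):&lt;/p&gt;

```python
from collections import defaultdict

class Topics:
    """Toy publish/subscribe dispatcher illustrating ROS topic semantics:
    publishers send messages to a named topic, and every subscriber
    callback registered on that topic receives each message."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for callback in self._subs[topic]:
            callback(msg)

bus = Topics()
targeter_msgs, recorder_msgs = [], []
bus.subscribe('/flymad/tracked', targeter_msgs.append)  # e.g. a targeting node
bus.subscribe('/flymad/tracked', recorder_msgs.append)  # e.g. the /recorder node
bus.publish('/flymad/tracked', {'obj_id': 1, 'x': 12.0, 'y': 34.0})
```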
&lt;blockquote&gt;
For more details on ROS and on why I believe it is a good tool for
creating reliable reproducible science, see my
&lt;a class="reference external" href="http://johnstowers.co.nz/blog/2013/10/11/ros-freeze/"&gt;previous post&lt;/a&gt;,
my &lt;a class="reference external" href="http://www.youtube.com/watch?v=marMd_K8Z1M"&gt;Scipy2013 video&lt;/a&gt; and
&lt;a class="reference external" href="https://speakerdeck.com/nzjrs/managing-complex-experiments-automation-and-analysis-using-robot-operating-system"&gt;presentation&lt;/a&gt;&lt;/blockquote&gt;
&lt;/div&gt;
&lt;div class="section" id="image-processing"&gt;
&lt;h2&gt;Image Processing&lt;/h2&gt;
&lt;p&gt;There are two image processing tasks in the system. Both are implemented as
&lt;a class="reference external" href="http://code.astraw.com/projects/motmot/fview.html"&gt;FView&lt;/a&gt;
plugins and communicate with the rest of the system using ROS.&lt;/p&gt;
&lt;p&gt;Firstly, the position of the fly (flies) in the arena, as seen by the
wide camera, must be determined. Here, a simple threshold approach is used to
find candidate points and image moments around those points are used to find the
center and slope of the fly body. A lookup table is used to point the
galvanometers in an open-loop fashion approximately at the fly.&lt;/p&gt;
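&lt;p&gt;The centre and slope computation is standard image-moment arithmetic; a minimal numpy sketch (computing the same quantities cv2.moments gives access to) looks like this:&lt;/p&gt;

```python
import numpy as np

def body_centre_and_slope(binary):
    """Centroid and body-axis angle of a thresholded blob, computed
    from first- and second-order image moments."""
    ys, xs = np.nonzero(binary)
    cx, cy = xs.mean(), ys.mean()
    # central second-order moments
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    # orientation of the principal axis, i.e. the body slope
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return cx, cy, theta

# synthetic horizontally-elongated "fly"
img = np.zeros((20, 20), dtype=bool)
img[9:12, 5:16] = True
cx, cy, theta = body_centre_and_slope(img)
```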
&lt;p&gt;With the fly now located in the field of view of the high magnification camera a
second real time control loop is initiated. Here, the fly body or head is detected,
and a closed loop PID controller finely adjusts the galvanometer position to achieve
maximum targeting accuracy. The accuracy of this through-the-mirror (TTM) system asymptotically
approaches 200 μm, and at 50 ms from onset the accuracy of head detection is 400 ± 200 μm.
Considering the other latencies in the system (gigabit Ethernet, 5 ms; USB delay, 4 ms;
galvanometer response time, 7 ms; image processing, 8 ms; and image acquisition,
5–13 ms; roughly 32 ms in total), this shows that from the onset of TTM mode the real time
targeting stabilises after 2–3 frames and comfortably operates at better than 120 frames
per second.&lt;/p&gt;
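&lt;p&gt;Conceptually, the fine adjustment is a textbook discrete PID loop acting on the pixel error between the detected head and the desired target position. A minimal sketch, with entirely illustrative gains:&lt;/p&gt;

```python
class PID:
    """Minimal discrete PID controller; the gains used below are purely
    illustrative, not FlyMAD's tuned values."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def step(self, err, dt):
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# drive a per-axis pixel error towards zero at ~120 fps (dt about 8.3 ms)
pid = PID(kp=0.5, ki=0.1, kd=0.001)
dt = 1.0 / 120.0
err = 40.0
for _ in range(10):
    err -= pid.step(err, dt)  # toy plant: the command directly reduces the error
```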
&lt;a class="reference external image-reference" href="http://www.johnstowers.co.nz/blog/static/images/strawlab/s1_crop.png"&gt;&lt;img alt="|filename|images/strawlab/s1_crop_sml.png" class="align-center" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/s1_crop_sml.png" /&gt;&lt;/a&gt;
&lt;p&gt;To reliably track freely walking flies, the head and body detection image processing
steps must take less than 8 ms. Somewhat frustratingly, a traditional template
matching strategy worked best. On the binarized, filtered image, the largest contour
is detected (c, red). Using an ellipse fit to the contour points (c,green), the contour
is rotated into an upright orientation (d). A template of the fly (e) is compared with the
fly in both orientations and the best match is taken.&lt;/p&gt;
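&lt;p&gt;The final disambiguation step, comparing the upright-rotated fly against the template in both orientations, can be sketched as follows (the toy score here is a plain sum of squared differences, not the matcher actually used):&lt;/p&gt;

```python
import numpy as np

def best_orientation(upright, template):
    """Compare the upright-rotated fly image against the template at 0 and
    180 degrees and return the orientation with the lower score."""
    def ssd(a, b):  # sum of squared differences
        return float(((a.astype(float) - b.astype(float)) ** 2).sum())
    scores = {0: ssd(upright, template),
              180: ssd(np.rot90(upright, 2), template)}
    return min(scores, key=scores.get)

# asymmetric toy template: bright "head" pixel at the top
template = np.zeros((5, 3))
template[0, 1] = 1.0
flipped = np.rot90(template, 2)  # the same fly, head-down
orientation = best_orientation(flipped, template)
```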
&lt;p&gt;I mention the template strategy as being disappointing only because I spent considerable time
evaluating newer, shinier, feature based approaches and could not achieve the closed loop
performance I needed. While the newer descriptors (BRISK, FREAK, ORB) were faster than the previous
class, none of them was significantly more reliable under changes in illumination than
SURF, which itself could not reliably meet the &amp;lt;8 ms deadline. I also spent considerable time testing
edge based (binary) descriptors such as edgelets, and edge based (gradient) approaches such as
dominant orientation templates and gradient response maps. The most promising of this class was local
shape context descriptors, but there too I could not get the runtime below 8 ms. Furthermore, one advantage
of the contour based template matching strategy I implemented was that graceful degradation was
possible: should a template match not be found (which occurred in &amp;lt;1% of frames), an estimate
of the centre of mass of the fly was still available, allowing degraded but functional targeting.
No such graceful fallback was possible using feature correspondence based strategies.&lt;/p&gt;
&lt;p&gt;There are two implementations of the template match operation: GPU and CPU based. The CPU matcher
uses the Python OpenCV bindings (and numpy in places); the GPU matcher uses Cython to wrap a small
C++ library that does the same thing using OpenCV 2.4 CUDA GPU support (which is not otherwise
accessible from Python). Conveniently, the Python OpenCV bindings use numpy arrays to store
image data, so passing data from Python to native code is trivial and efficient.&lt;/p&gt;
&lt;blockquote&gt;
I also gave &lt;a class="reference external" href="https://speakerdeck.com/nzjrs/interfacing-with-native-code-from-python"&gt;a presentation&lt;/a&gt;
comparing different strategies of interfacing python with native code. The
&lt;a class="reference external" href="https://github.com/strawlab/py4science-vbc/tree/master/2012-04-13"&gt;provided source code&lt;/a&gt;
includes examples using python/ctypes/cython/numpy and permutations thereof.&lt;/blockquote&gt;
&lt;p&gt;The GPU code-path is only necessary / beneficial for very large templates and
higher resolution cameras (as used by our collaborators); in general the CPU
implementation is used.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="experimental-control-gui"&gt;
&lt;h2&gt;Experimental Control GUI&lt;/h2&gt;
&lt;p&gt;To make FlyMAD easier to manage and use for biologists I wrote a small GUI using
Gtk (PyGObject), and my ROS utility GUI library
&lt;a class="reference external" href="https://github.com/strawlab/rosgobject"&gt;rosgobject&lt;/a&gt;.&lt;/p&gt;
&lt;a class="reference external image-reference" href="http://www.johnstowers.co.nz/blog/static/images/strawlab/gflymad.png"&gt;&lt;img alt="|filename|images/strawlab/gflymad_sml.png" class="align-center" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/gflymad_sml.png" /&gt;&lt;/a&gt;
&lt;p&gt;On the left you can see buttons for launching individual ROS nodes. On the right
are widgets for adjusting the image processing and control parameters (these
widgets display and set ROS parameters). At the bottom are realtime statistics showing
the TTM image processing performance (as published to ROS topics).&lt;/p&gt;
&lt;p&gt;Following good ROS practice, once reliable values are found for all adjustable parameters
they can be recorded in a &lt;tt class="docutils literal"&gt;roslaunch&lt;/tt&gt; file, allowing the whole system to
be started with a known configuration from a single command.&lt;/p&gt;
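&lt;p&gt;Such a &lt;tt class="docutils literal"&gt;roslaunch&lt;/tt&gt; file looks roughly like the following (the node and parameter names here are illustrative placeholders, not FlyMAD's actual ones):&lt;/p&gt;
&lt;pre class="literal-block"&gt;
&amp;lt;launch&amp;gt;
  &amp;lt;!-- node and parameter names are illustrative placeholders --&amp;gt;
  &amp;lt;node pkg="flymad" type="flymad_micro" name="flymad_micro" /&amp;gt;
  &amp;lt;node pkg="flymad" type="targeter" name="targeter"&amp;gt;
    &amp;lt;param name="ttm_kp" value="0.5" /&amp;gt;
  &amp;lt;/node&amp;gt;
&amp;lt;/launch&amp;gt;
&lt;/pre&gt;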
&lt;/div&gt;
&lt;div class="section" id="manual-scoring-of-videos"&gt;
&lt;h2&gt;Manual Scoring of Videos&lt;/h2&gt;
&lt;p&gt;For certain experiments (such as courtship) videos recorded during the experiment
must be watched and behaviours must be manually annotated. To my surprise, no tools
exist to make this relatively common behavioural neuroscience task any easier
(and easier matters; it is not uncommon to score tens to hundreds of hours of video).&lt;/p&gt;
&lt;p&gt;During every experiment, &lt;a class="reference external" href="http://code.astraw.com/projects/motmot/fly-movie-format.html"&gt;RAW uncompressed videos&lt;/a&gt;
from both cameras are written to disk (uncompressed videos are chosen for performance reasons, because
SSDs are cheap, and because each frame can be precisely timestamped).
Additionally, &lt;tt class="docutils literal"&gt;rosbag&lt;/tt&gt; files record the complete state of the experiment at
every instant in time (as described by all messages passing
between ROS nodes). After each experiment finishes, the uncompressed videos from
each camera are composited together, along with metadata such as the frame
timestamp, and an H.264 encoded mp4 video is created for scoring.&lt;/p&gt;
&lt;p&gt;After completing a full day of experiments one can then score / annotate
videos in bulk. The scorer is written in Python, uses Gtk+ and PyGObject for the
UI, and &lt;a class="reference external" href="https://wiki.videolan.org/Python_bindings"&gt;vlc.py&lt;/a&gt; for decoding the
video (I chose VLC due to the lack of working GStreamer PyGObject support on Ubuntu
12.04).&lt;/p&gt;
&lt;a class="reference external image-reference" href="http://www.johnstowers.co.nz/blog/static/images/strawlab/scorer.png"&gt;&lt;img alt="|filename|images/strawlab/scorer_sml.png" class="align-center" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/scorer_sml.png" /&gt;&lt;/a&gt;
&lt;p&gt;In addition to allowing play, pause and single frame scrubbing through the video,
pressing one of the &lt;tt class="docutils literal"&gt;qw,as,zx,cv&lt;/tt&gt; key pairs indicates that a
behaviour has started or finished. At this instant the current video frame is
extracted from the video, and optical character recognition is performed on the
top left region of the frame in order to extract the timestamp. When the video
is finished, a &lt;a class="reference external" href="http://pandas.pydata.org/"&gt;pandas dataframe&lt;/a&gt; is created which contains
all of the original experimental &lt;tt class="docutils literal"&gt;rosbag&lt;/tt&gt; data and the manually annotated behaviours
on a common timebase.&lt;/p&gt;
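&lt;p&gt;Aligning the annotations with the &lt;tt class="docutils literal"&gt;rosbag&lt;/tt&gt; data is essentially an as-of join on timestamps; with pandas it can be sketched as follows (the frames and column names here are invented stand-ins for the real data):&lt;/p&gt;

```python
import pandas as pd

# Hypothetical frames: 'tracking' stands in for the rosbag-derived data,
# 'scores' for the OCR-timestamped key presses; all names are invented.
tracking = pd.DataFrame({
    't': [0.00, 0.04, 0.08, 0.12, 0.16],
    'x': [1.0, 1.1, 1.3, 1.6, 2.0],
})
scores = pd.DataFrame({
    't': [0.05, 0.13],
    'behaviour': ['courtship_start', 'courtship_stop'],
})

# attach each tracking sample to the most recent preceding annotation
merged = pd.merge_asof(tracking, scores, on='t', direction='backward')
```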
&lt;/div&gt;
&lt;div class="section" id="distributing-complex-experimental-software"&gt;
&lt;h2&gt;Distributing complex experimental software&lt;/h2&gt;
&lt;p&gt;The system was run not only by myself but also by &lt;a class="reference external" href="http://janelia.org/lab/dickson-lab"&gt;collaborators&lt;/a&gt;,
and, we hope, in future by others too. To make this possible we generate a single
file self installing executable using &lt;a class="reference external" href="https://github.com/megastep/makeself"&gt;makeself&lt;/a&gt;,
and we officially support only one distribution: Ubuntu 12.04 LTS on x86_64.&lt;/p&gt;
&lt;p&gt;The &lt;tt class="docutils literal"&gt;makeself&lt;/tt&gt; installer performs the following steps:&lt;/p&gt;
&lt;ol class="arabic simple"&gt;
&lt;li&gt;Adds our Debian repository to the system&lt;/li&gt;
&lt;li&gt;Adds the official ROS Debian repository to the system&lt;/li&gt;
&lt;li&gt;Adds our custom ROS stacks (FlyMAD from tarball and rosgobject from git)
to the ROS environment&lt;/li&gt;
&lt;li&gt;Calls &lt;tt class="docutils literal"&gt;rosmake flymad&lt;/tt&gt; to install all system dependencies and build
the non-binary ROS packages&lt;/li&gt;
&lt;li&gt;Creates a FlyMAD desktop file to start the software easily&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;We also include a version check utility in the FlyMAD GUI which notifies the user
when a newer version of the software is available.&lt;/p&gt;
&lt;/div&gt;
&lt;div class="section" id="the-results"&gt;
&lt;h2&gt;The Results&lt;/h2&gt;
&lt;p&gt;Using FlyMAD and the architecture I have described above, we created a novel system
to perform temporally and spatially precise opto- and thermogenetic activation
of freely moving Drosophila. To validate the system we showed distinct timing
relationships for two neuronal cell types previously linked to courtship song, and
demonstrated the system's compatibility with visual behaviour experiments.&lt;/p&gt;
&lt;p&gt;Practically, we were able to develop and simultaneously operate this complex
real-time assay in two countries. The system was conceived and built in approximately
one year using Python. FlyMAD utilises many best-in-class libraries and frameworks
(OpenCV, numpy, ROS) in order to meet the demanding real time requirements.&lt;/p&gt;
&lt;p&gt;We are proud to make the entire system available to the Drosophila community
under an open source license, and we look forward to its adoption by our peers.&lt;/p&gt;
&lt;p&gt;For those still reading, I encourage you to view the supplementary video below,
where the system's operation can be seen.&lt;/p&gt;
&lt;iframe width="670" height="361" src="http://www.youtube.com/embed/SDIEBgtSSJk?rel=0" frameborder="0" allowfullscreen&gt;&lt;/iframe&gt;&lt;p&gt;Comments, suggestions or corrections can be &lt;a class="reference external" href="http://www.johnstowers.co.nz/blog/pages/about-me.html"&gt;emailed to me&lt;/a&gt;
or left on &lt;a class="reference external" href="https://plus.google.com/113069881945587451927/posts/fPQn2GMmsbC"&gt;Google Plus&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
</summary><category term="ROS"></category><category term="Nerd"></category><category term="Planet GNOME"></category><category term="Tracking"></category><category term="PyGObject"></category><category term="Gtk"></category></entry><entry><title>Distributing Pure Python ROS Applications</title><link href="http://www.johnstowers.co.nz/blog/2013/10/11/ros-freeze/" rel="alternate"></link><updated>2013-10-11T00:46:32+02:00</updated><author><name>John Stowers</name></author><id>tag:www.johnstowers.co.nz/blog,2013-10-11:2013/10/11/ros-freeze/</id><summary type="html">&lt;p&gt;In June 2013 I was lucky to speak at the fantastic SciPy2013 conference
(scientific computing with Python). I spoke about a workflow and tools we have
developed at &lt;a class="reference external" href="http://www.strawlab.org"&gt;strawlab&lt;/a&gt;. The title of my talk
was &lt;strong&gt;Managing Complex Experiments, Automation, and
Analysis using Robot Operating System&lt;/strong&gt;.
The &lt;a class="reference external" href="http://www.youtube.com/watch?v=marMd_K8Z1M"&gt;video&lt;/a&gt; of that
talk is included below;&lt;/p&gt;
&lt;iframe width="670" height="361" src="http://www.youtube.com/embed/marMd_K8Z1M?rel=0" frameborder="0" allowfullscreen&gt;&lt;/iframe&gt;&lt;p&gt;And here are the &lt;a class="reference external" href="https://speakerdeck.com/nzjrs/managing-complex-experiments-automation-and-analysis-using-robot-operating-system"&gt;accompanying slides&lt;/a&gt;;&lt;/p&gt;
&lt;script async class="speakerdeck-embed" data-id="6ae4eb60c4690130f2271247c9814889" data-ratio="1.33333333333333" src="http://speakerdeck.com/assets/embed.js"&gt;&lt;/script&gt;&lt;p&gt;This post describes a tool I developed for
distributing ROS packages to scientific collaborators. That software is
called &lt;a class="reference external" href="https://github.com/strawlab/ros-freeze"&gt;ros-freeze&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;For those of you not aware, &lt;a class="reference external" href="http://www.ros.org"&gt;ROS&lt;/a&gt; is a great framework
traditionally targeted at robotics but usable in other fields too. In particular
it has relatively good performance and a simple, strongly typed inter-process communication
framework and serialization format. This is simultaneously useful for creating
distributed realtime-ish systems with comprehensive logging of the system
state. Best of all, the python interface to ROS is very clean.&lt;/p&gt;
&lt;img alt="|filename|images/strawlab/rospyramid.png" class="align-center" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/rospyramid.png" /&gt;
&lt;p&gt;Unfortunately, being a framework, ROS is rather all-or-nothing (going as
far as to describe itself as a meta-operating system). The basic ROS
install is several gigabytes, and building it yourself can be rather
difficult. Furthermore, as I mentioned in my presentation, it is attractive
to use the built in ROS tool &lt;tt class="docutils literal"&gt;rosbag&lt;/tt&gt; for recording timestamped data
to disk. Unfortunately, reading these files again requires ROS,
thus necessarily coupling experimental data to the software used
to collect it.&lt;/p&gt;
&lt;p&gt;To remedy this I wrote &lt;a class="reference external" href="https://github.com/strawlab/ros-freeze"&gt;ros-freeze&lt;/a&gt;,
a python tool to convert any ROS package into a pure-python package including
all of the dependencies. Collaborators can then install the python
package and immediately have access to all the same ROS packages and libraries
without having to build the whole ROS stack.&lt;/p&gt;
&lt;div class="section" id="converting-your-ros-package"&gt;
&lt;h2&gt;Converting your ROS package&lt;/h2&gt;
&lt;ol class="arabic"&gt;
&lt;li&gt;&lt;p class="first"&gt;Download ros-freeze from &lt;a class="reference external" href="https://github.com/strawlab/ros-freeze"&gt;here&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p class="first"&gt;Modify setup-freeze.py according to your needs&lt;/p&gt;
&lt;div class="highlight"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;setuptools&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;setup&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;rosfreeze&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;import_ros_package&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;get_disutils_cmds&lt;/span&gt;

&lt;span class="n"&gt;MY_PACKAGE&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;&amp;#39;foo&amp;#39;&lt;/span&gt;

&lt;span class="n"&gt;setup&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;python-ros-&lt;/span&gt;&lt;span class="si"&gt;%s&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;&lt;/span&gt; &lt;span class="o"&gt;%&lt;/span&gt; &lt;span class="n"&gt;MY_PACKAGE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;version&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;1.0&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;description&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;Pure Python ROS &lt;/span&gt;&lt;span class="si"&gt;%s&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;&lt;/span&gt; &lt;span class="o"&gt;%&lt;/span&gt; &lt;span class="n"&gt;MY_PACKAGE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;author&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="n"&gt;author_email&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;&amp;#39;&amp;#39;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;get_disutils_cmds&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;srcdir&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;bindir&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;datadir&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
 &lt;span class="p"&gt;)&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p class="first"&gt;Build a python egg (for example)&lt;/p&gt;
&lt;p&gt;&lt;tt class="docutils literal"&gt;$ python &lt;span class="pre"&gt;setup-freeze.py&lt;/span&gt; bdist_egg&lt;/tt&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p class="first"&gt;Install that egg into your virtual environment&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;
&lt;div class="section" id="caveats-and-other-notes"&gt;
&lt;h2&gt;Caveats and other Notes&lt;/h2&gt;
&lt;ul class="simple"&gt;
&lt;li&gt;this is currently working on ROS Electric (an old release, at work
we have chosen to stick with Ubuntu 12.04 LTS)&lt;/li&gt;
&lt;li&gt;changes for other ROS distributions might be necessary, so please get in touch&lt;/li&gt;
&lt;li&gt;this is successfully tested on ROS packages containing tens of thousands of lines of
code and dozens of ROS dependencies.&lt;/li&gt;
&lt;li&gt;although recent ROS releases have improved the package management situation (by embracing
deb packaging, yay!) this tool provides an unprecedentedly easy way to distribute your
pure python ROS applications&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;div class="section" id="a-pure-python-ros-distribution"&gt;
&lt;h2&gt;A Pure Python ROS Distribution?&lt;/h2&gt;
&lt;p&gt;One side effect of this was the packaging of the pure python ROS core as an easily installable
python egg. This means that you can write, debug and test python ROS nodes without having to
install the whole ROS distribution.&lt;/p&gt;
&lt;p&gt;One can even go as far as running &lt;tt class="docutils literal"&gt;rosmaster&lt;/tt&gt; and the command line tools
(&lt;tt class="docutils literal"&gt;rosnode&lt;/tt&gt;, &lt;tt class="docutils literal"&gt;rosparam&lt;/tt&gt;, etc)!&lt;/p&gt;
&lt;p&gt;You can download the &lt;a class="reference external" href="https://github.com/strawlab/ros-freeze/releases"&gt;python-ros-electric package from here&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
</summary><category term="ROS"></category><category term="Nerd"></category><category term="Planet GNOME"></category></entry><entry><title>ROS and Gtk for Laboratory Control</title><link href="http://www.johnstowers.co.nz/blog/2013/01/27/lab-with-gtk-ros-1/" rel="alternate"></link><updated>2013-01-27T23:15:32+01:00</updated><author><name>John Stowers</name></author><id>tag:www.johnstowers.co.nz/blog,2013-01-27:2013/01/27/lab-with-gtk-ros-1/</id><summary type="html">&lt;p&gt;At the lab in which I work &lt;a href="http://strawlab.org"&gt;(Andrew Straw, strawlab)&lt;/a&gt; we
study the visual flight behaviour of Drosophila using virtual reality. The
implementation of this will be explained in future posts and papers; however, in
this post I am going to describe how I used Gtk&lt;sup id="fnref:1"&gt;&lt;a class="footnote-ref" href="#fn:1" rel="footnote"&gt;1&lt;/a&gt;&lt;/sup&gt; and
&lt;a href="http://www.ros.org"&gt;ROS&lt;/a&gt; to build an interface to control and monitor
running experiments (called the 'Operator Console').&lt;/p&gt;
&lt;p&gt;A future post will address and release all the ROS+GObject&lt;sup id="fnref:2"&gt;&lt;a class="footnote-ref" href="#fn:2" rel="footnote"&gt;2&lt;/a&gt;&lt;/sup&gt; glue that lets these
interfaces scale dynamically as nodes (dis)appear. This just shows the relevant
Gtk parts and has some comments on what I would like from Gtk to make these
sort of interfaces easier.&lt;/p&gt;
&lt;p&gt;The screenshot shows the first tab of the 'Operator Console'&lt;sup id="fnref:3"&gt;&lt;a class="footnote-ref" href="#fn:3" rel="footnote"&gt;3&lt;/a&gt;&lt;/sup&gt;.&lt;/p&gt;
&lt;p&gt;&lt;a href="http://www.johnstowers.co.nz/blog/static/images/strawlab/oc-tab1.png"&gt;&lt;img alt="Operator Console" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/oc-tab1-sml.png" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt; Implementation Notes &lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;I use the secondary icon support of &lt;code&gt;Gtk.Entry&lt;/code&gt; to show whether the contents contain
   sensible data. Maybe validation support in Gtk would be useful here
   &lt;a href="https://bugzilla.gnome.org/show_bug.cgi?id=446056"&gt;bug&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;The 'Description' entry is a &lt;code&gt;Gtk.TextView&lt;/code&gt;, not a &lt;code&gt;Gtk.Entry&lt;/code&gt;. It was necessary
   to apply custom CSS to make it look reasonably similar. Sadly, it does not
   support the full/same set of CSS properties as &lt;code&gt;Gtk.Entry&lt;/code&gt;, so it was impossible
   to show the same border radius and focus colors
   &lt;a href="https://bugzilla.gnome.org/show_bug.cgi?id=687363"&gt;bug&lt;/a&gt;. Perhaps a multi-line
   &lt;code&gt;Gtk.Entry&lt;/code&gt; would be better.&lt;/li&gt;
&lt;li&gt;The bottom half of the window shows the utilisation of all computers. I tried
   a few versions of this, and simple sensibly formatted monospaced text looked
   much better than anything else I tried. Any suggestions?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This screenshot shows an example screen where we mix the control and monitoring
of many instances of the same ROS node.&lt;/p&gt;
&lt;p&gt;&lt;a href="http://www.johnstowers.co.nz/blog/static/images/strawlab/oc-tab2.png"&gt;&lt;img alt="Operator Console" src="http://www.johnstowers.co.nz/blog/static/images/strawlab/oc-tab2-sml.png" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;em&gt; Implementation Notes &lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;The &lt;code&gt;Gtk.Switch&lt;/code&gt; simultaneously displays the status of the projector and also
   allows control of the node. This is a common use-case in the software, and
   due to the asynchronous nature of the ROS messages, I need to distinguish these
   from user-generated signals. I have wrappers such as the following
   for many widgets&lt;sup id="fnref:4"&gt;&lt;a class="footnote-ref" href="#fn:4" rel="footnote"&gt;4&lt;/a&gt;&lt;/sup&gt;. Advice on how better to handle this use-case would be
   appreciated.&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="codehilite"&gt;&lt;pre&gt;&lt;span&gt;&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;UpdateableGtkSwitch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Gtk&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Switch&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;Gtk&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Switch&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_changing&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;False&lt;/span&gt;
        &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;connect_after&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;notify::active&amp;quot;&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_changed&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_changed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_changing&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;stop_emission&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;&amp;quot;notify::active&amp;quot;&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;set_active&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;is_active&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_changing&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;
        &lt;span class="n"&gt;Gtk&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Switch&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;set_active&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;is_active&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_changing&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;False&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;connect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="bp"&gt;self&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;connect_after&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/pre&gt;&lt;/div&gt;
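The guard-flag idea above, stripped of GTK so it can run anywhere, looks like this; the class and method names are my invention for illustration only:

```python
# GTK-free sketch of the guard-flag pattern: programmatic updates set a
# flag so the change handler can tell them apart from user changes.
class GuardedValue:
    def __init__(self):
        self._value = None
        self._changing = False
        self._callbacks = []

    def connect(self, cb):
        self._callbacks.append(cb)

    def set_value_from_user(self, value):
        # Simulates the user flipping the switch: callbacks fire.
        self._value = value
        self._emit()

    def set_value_programmatically(self, value):
        # Simulates mirroring external (e.g. ROS topic) state:
        # the flag suppresses the resulting notification.
        self._changing = True
        try:
            self._value = value
            self._emit()
        finally:
            self._changing = False

    def _emit(self):
        if self._changing:
            return  # drop events caused by programmatic updates
        for cb in self._callbacks:
            cb(self._value)

events = []
g = GuardedValue()
g.connect(events.append)
g.set_value_programmatically(True)   # mirrored state: no callback
g.set_value_from_user(False)         # user action: callback fires
```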


&lt;ul&gt;
&lt;li&gt;The "Standby (Computer1)" is for display only and mirrors the status of a ROS
   topic. I would like some way visually to inicate that this widget is not actually
   an editable &lt;code&gt;Gtk.Entry&lt;/code&gt;. Currently the &lt;code&gt;Gtk.Entry&lt;/code&gt; is set
   &lt;code&gt;editable = False&lt;/code&gt;, it looks to out of place with &lt;code&gt;sensitive = False&lt;/code&gt;. 
   Perhaps I should add some custom &lt;code&gt;CSS&lt;/code&gt; to color it slightly different.
   Suggestions are appreciated.&lt;/li&gt;
&lt;/ul&gt;
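One possible CSS fragment for such a display-only entry is sketched below. The `readonly-entry` class name is my invention (it would be attached in code with `entry.get_style_context().add_class("readonly-entry")`), and the named colors assume a GTK 3 theme:

```css
/* Hypothetical style: tint a display-only Gtk.Entry so it reads as
   non-interactive without greying it out entirely. */
.readonly-entry {
    background-color: shade(@theme_bg_color, 0.95);
    color: @insensitive_fg_color;
}
```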
&lt;h2&gt;Closing Remarks&lt;/h2&gt;
&lt;p&gt;I'm really happy with the status of the &lt;code&gt;PyGObject&lt;/code&gt; bindings. We have a few quite
large applications built using them (and ROS) and I have no complaints about
performance&lt;sup id="fnref:5"&gt;&lt;a class="footnote-ref" href="#fn:5" rel="footnote"&gt;5&lt;/a&gt;&lt;/sup&gt; or otherwise. The conventional wisdom was that PyGTK (and GTK) were
not suitable for threaded workloads, but thanks to the threading model of ROS the
'operator-console' shown above manages upwards of 50 background threads asynchronously
updating the GUI state.&lt;/p&gt;
&lt;div class="footnote"&gt;
&lt;hr /&gt;
&lt;ol&gt;
&lt;li id="fn:1"&gt;
&lt;p&gt;Actually PyGObject, argh why didn't we keep the name as pygtk?&amp;#160;&lt;a class="footnote-backref" href="#fnref:1" rev="footnote" title="Jump back to footnote 1 in the text"&gt;&amp;#8617;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id="fn:2"&gt;
&lt;p&gt;I'll blog about this later. For the curious,
  &lt;a href="https://github.com/strawlab/rosgobject"&gt;rosgobject&lt;/a&gt; lives here.&amp;#160;&lt;a class="footnote-backref" href="#fnref:2" rev="footnote" title="Jump back to footnote 2 in the text"&gt;&amp;#8617;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id="fn:3"&gt;
&lt;p&gt;In real operation this GUI shows the state of many more
  machines/nodes/computers. This screenshot is running on my laptop because
  showing too much more might give away the game ;-).&amp;#160;&lt;a class="footnote-backref" href="#fnref:3" rev="footnote" title="Jump back to footnote 3 in the text"&gt;&amp;#8617;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id="fn:4"&gt;
&lt;p&gt;&lt;code&gt;freeze_notify&lt;/code&gt; and &lt;code&gt;thaw_notify&lt;/code&gt; would almost work, if the events
could be dropped and not queued. Also, not all widgets use &lt;code&gt;notify::active&lt;/code&gt;,
&lt;code&gt;GtkComboBox(Text)&lt;/code&gt; for example. A general way to do this would be preferred.&amp;#160;&lt;a class="footnote-backref" href="#fnref:4" rev="footnote" title="Jump back to footnote 4 in the text"&gt;&amp;#8617;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id="fn:5"&gt;
&lt;p&gt;Excluding plotting / graphing performance. But that is fodder for a later post.&amp;#160;&lt;a class="footnote-backref" href="#fnref:5" rev="footnote" title="Jump back to footnote 5 in the text"&gt;&amp;#8617;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;</summary><category term="gnome"></category><category term="ROS"></category><category term="Nerd"></category><category term="Planet GNOME"></category><category term="PyGObject"></category><category term="Gtk"></category></entry></feed>