Telerobotics and Bodytracking - The Rendezvous

Siddhant Shrivastava

July 31, 2015


Hi! The past week was a refreshingly positive one. I was able to solve some of the insidious issues that were plaguing the efforts that I was putting in last week.

Virtual Machine Networking Issues Solved!

I was able to use the Tango server across the Windows 7 Virtual Machine and the Tango host on my Ubuntu 14.04 host machine. The networking mode that made this work turns out to be Bridged mode, which effectively puts the Virtual Machine on the same network as the host.

In bridged mode, the Virtual Machine exposes a virtual network interface with its own IP address and networking stack. In my case it was vmnet8, with an IP address outside the ranges used by the real Ethernet and WiFi network interface cards. Using bridged mode, I was able to keep the Tango device database server on Ubuntu and use Vito's Bodytracking device on Windows. The Virtual Machine didn't noticeably slow down communication between the Tango devices.

This image explains what I’m talking about -

Jive on Windows and Ubuntu machines

In bridged mode, I chose the IP Address on the host which corresponds to the Virtual Machine interface - vmnet8 in my case. I used the vmnet8 interface on Ubuntu and a similar interface on the Windows Virtual Machine. I read quite a bit about how Networking works in Virtual Machines and was fascinated by the Virtualization in place.

Bodytracking meets Telerobotics

With Tango up and running, I had to ensure that Vito's Bodytracking application works on the Virtual Machine. To that end, I installed the Kinect for Windows SDK, Kinect Developer Tools, Visual Python, Tango-Controls, and PyTango. Setting up a new virtual machine slowed me down a little but was a necessary step in the development.

Once I had that bit running, I was able to visualize the simulated Martian Motivity walk done in Innsbruck in a training station. The Bodytracking server created by Vito published events corresponding to the moves attribute, which is a list of two metrics: the user's position and orientation.

I was able to read the attributes that the Bodytracking device was publishing by subscribing to change events on that attribute. This is done in the following way -

    # Subscribe once to change events on the 'moves' attribute
    # from the Bodytracking interface; cb is the callback invoked
    # on every change
    moves_event = device_proxy.subscribe_event('moves',
                                               PyTango.EventType.CHANGE_EVENT,
                                               cb, [])
    while TRIGGER:
        # Wait for at least REFRESH_RATE seconds for the next callback
        time.sleep(REFRESH_RATE)

This ensures that the subscriber doesn't consume the polled attributes faster than they are published. In that unfortunate case, an EventManagerException is raised and must be handled properly.

Note the cb argument: it refers to the callback function that is triggered when an event change occurs. The callback is responsible for reading and processing the attributes.
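PyTango can deliver events either to a plain callable or to an object with a push_event method. As a rough sketch of what such a callback might look like (the event field names follow PyTango's event data, but the handling logic and the history buffer are my own illustrative assumptions, not the actual Bodytracking code):

```python
from collections import deque

class MovesCallback:
    """Sketch of an event callback; PyTango calls push_event() on each change.

    The processing here (buffering recent values) is illustrative,
    not the project's actual implementation.
    """
    def __init__(self):
        self.history = deque(maxlen=100)  # keep the most recent readings

    def push_event(self, event):
        # Guard against error events before touching the value
        if getattr(event, "err", False):
            print("Event error:", getattr(event, "errors", None))
            return
        # event.attr_value.value holds the published 'moves' list
        self.history.append(event.attr_value.value)
```

An instance of this class would then be passed as the cb argument to subscribe_event.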

The processing part in our case is the core of the Telerobotics-Bodytracking interface. It acts as the intermediary between Telerobotics and Bodytracking - converting the position and orientation values to linear and angular velocities that Husky can understand. I use a high-performance container from the collections module known as deque. It can act both as a stack and a queue using deque.append, deque.appendleft, deque.pop, and deque.popleft.
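As a quick illustration of why deque covers both use cases (this is standard-library behavior, not project code):

```python
from collections import deque

events = deque()
events.append('e1')        # enqueue at the right
events.append('e2')
events.appendleft('e0')    # push at the left

assert events.pop() == 'e2'      # pop from the right (stack-like)
assert events.popleft() == 'e0'  # pop from the left (queue-like)
assert list(events) == ['e1']
```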

To calculate velocity, I compute the differences between consecutive events and their corresponding timestamps. The events are stored in a deque, popped when necessary, and subtracted from the current event values.

For instance, this is how linear velocity processing takes place -

  # Position and linear velocity processing:
  # compare the current event with the previous one stored in the deque
  position_previous = position_events.pop()
  position_current = position
  linear_displacement = position_current - position_previous
  # time_delta is the gap between the two events' timestamps
  linear_speed = linear_displacement / time_delta
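The differencing logic above can be wrapped into a small estimator. This is a sketch under the assumption that each event carries a scalar position and a timestamp in seconds; the names are mine, not the project's:

```python
from collections import deque

class VelocityEstimator:
    """Illustrative finite-difference velocity estimator over event pairs."""

    def __init__(self):
        # Store (timestamp, position) pairs; only the last event is needed
        self.events = deque(maxlen=2)

    def update(self, timestamp, position):
        """Return linear speed since the previous event, or None on the first call."""
        if self.events:
            t_prev, p_prev = self.events[-1]
            time_delta = timestamp - t_prev
            speed = (position - p_prev) / time_delta if time_delta > 0 else 0.0
        else:
            speed = None  # no previous event to difference against
        self.events.append((timestamp, position))
        return speed
```

For example, two events half a second apart with a displacement of 1.0 would yield a speed of 2.0.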

ROS-Telerobotics Interface

We are halfway through the Telerobotics-Bodytracking architecture. Once the velocities are obtained, we have everything we need to send to ROS. The challenge here is to produce velocities that ROS and the Husky UGV can understand. Messages are published to ROS only when the velocity changes, which has the added advantage of minimizing communication between ROS and Tango. When working with multiple distributed systems, it is always wise to keep the communication between them minimal, and that's what I've aimed to do. I'll be enhancing the interface further by adding trigger overrides for emergency situations. The speeds are currently not ROS-friendly, so I am writing high-pass and low-pass filters to limit the velocities to what Husky can sustain. Vito and I will be refining the user step estimation and the corresponding robot movements respectively.
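The publish-only-on-change gating, combined with speed limiting, can be sketched in plain Python. A real implementation would hand the command to a ROS publisher (e.g. a geometry_msgs/Twist topic for Husky); here publish_fn stands in for that, and the velocity limits are illustrative, not Husky's actual specs:

```python
def clamp(value, low, high):
    """Saturate a speed to the range the robot can sustain."""
    return max(low, min(high, value))

class ChangeOnlyPublisher:
    """Publish a (linear, angular) velocity only when it differs from the last one sent.

    publish_fn stands in for a real ROS publisher; max_linear and
    max_angular are assumed limits, not Husky's actual specifications.
    """
    def __init__(self, publish_fn, max_linear=1.0, max_angular=2.0):
        self.publish_fn = publish_fn
        self.max_linear = max_linear
        self.max_angular = max_angular
        self.last = None

    def send(self, linear, angular):
        cmd = (clamp(linear, -self.max_linear, self.max_linear),
               clamp(angular, -self.max_angular, self.max_angular))
        if cmd != self.last:   # skip duplicates to minimize ROS-Tango traffic
            self.publish_fn(cmd)
            self.last = cmd
```

Duplicate velocities are silently dropped, and out-of-range speeds are saturated before publishing.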

GSoC is only becoming more exciting. I’m certain that I will be contributing to this project after GSoC as well. The Telerobotics scenario is full of possibilities, most of which I’ve tried to cover in my GSoC proposal.

I’m back to my university now and it has become hectic but enjoyably challenging to complete this project. My next post will hopefully be a culmination of the Telerobotics/Bodytracking interface and the integration of 3D streaming with Oculus Rift Virtual Reality.

Ciao!