The Engineering Behind Making Robots Dance

Written by our software engineer, Kevin Liang.

Since our crowdfunding campaign 2 months ago, the engineering team has been heads-down building our software platform API for tablet devices. One use case for the API is to give people a way to program Bo & Yana using visual animations. We decided it would be fun to make the robots dance using the popular animation tool Maya. In this first of many tech blog articles to come, we share how we solved the engineering challenges we faced.

Here is the official version of Bo & Yana dancing to a song you just might recognize, filmed/edited by the amazing students at Palo Alto High School:

The Challenge: Making Robots Dance

While our creative team was busy trying to come up with some sweet robot dance moves, the engineers got to work on building this demo, which has three major components:

  1. Write a Maya parser to convert animation data into a set of sequential robot commands that describe complex actions to be performed at specified time intervals.
  2. Design the API to convert the command sequence into instructions carried out by Bo & Yana.
  3. Create the final app that will synchronize the dance routine with the song.

Step 1: Building a Maya Animation Parser

We built a parser for Maya because it’s a very popular 3D animation tool, and we had been using it extensively to simulate Bo & Yana’s personality traits and actions. Maya represents an animation as a sequence of frames over the entire animation period (30 frames per second in our case). Each frame describes the state of the animated object at that point in time. The 30th frame, for instance, contains all of the robot’s output values at the 1-second mark of the animation.
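Conceptually, each exported frame is just a snapshot of robot state stamped with a time. Here is a minimal sketch of that idea; the RobotFrame fields below are our own illustration, not Maya’s actual export schema:

```swift
import Foundation

// A minimal sketch of how one exported Maya frame can be modeled.
// The field names (bodyAngle, headPan, ...) are illustrative only.
struct RobotFrame {
    let index: Int          // frame number within the animation
    let bodyAngle: Double   // heading of the robot body, in degrees
    let headPan: Double     // head pan angle, in degrees
    let headTilt: Double    // head tilt angle, in degrees
    let eyeColor: (r: Double, g: Double, b: Double)
}

let frameRate = 30.0        // Maya exports 30 frames per second in our case

/// Timestamp (in seconds) at which a given frame should take effect.
func timestamp(for frame: RobotFrame) -> TimeInterval {
    return Double(frame.index) / frameRate
}
// e.g. frame 30 maps to t = 1.0 s, matching the description above.
```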

While most static animation data (like color and lights) translates easily, converting robot movements from Maya data is much more complicated. Maya expresses movement as a series of object coordinates and angles rather than motion curves, which does not match how we normally control Bo’s movements: his motors take power as input. It was fun to exercise our trigonometry and vector-math muscles translating these spatial coordinate/angle data points into motor power curves.
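To give a flavor of that math, here is a simplified sketch: difference the heading angle across consecutive frames to get an angular velocity, then map it to a motor power. The linear velocity-to-power mapping and the maxAngularSpeed constant are assumptions for illustration; the real conversion relies on a calibrated motor model.

```swift
import Foundation

// Sketch: convert per-frame heading angles into motor power values.
let frameInterval = 1.0 / 30.0    // seconds between frames
let maxAngularSpeed = 360.0       // deg/s that corresponds to full power (assumed)

/// Angular velocity between two consecutive frames, in degrees per second.
func angularVelocity(from previous: Double, to current: Double) -> Double {
    // Unwrap across the ±180° boundary so a 359° -> 1° step reads as +2°, not -358°.
    var delta = current - previous
    if delta > 180 { delta -= 360 }
    if delta < -180 { delta += 360 }
    return delta / frameInterval
}

/// Map angular velocity to a motor power in [-1, 1] (assumed linear model).
func motorPower(forAngularVelocity omega: Double) -> Double {
    return max(-1.0, min(1.0, omega / maxAngularSpeed))
}
```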


Step 2: Creating a “Robot Action” API

The concept of representing animation as a sequence of frames is very expressive and widely used in other media (video, film, etc.). We decided to use the same concept to express complex actions for Bo & Yana. A developer can now define a “wiggle for 1 second” action for Bo as a sequence of robot states, and represent even more complex actions by stitching existing actions together! This is the design we used to choreograph the Harlem Shake dance routine.
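To give a flavor of the design (the type names below are ours for illustration, not the shipped API), an action can be modeled as a timed sequence of robot states, and stitching is just concatenation with shifted timestamps:

```swift
import Foundation

// Sketch of the "robot action" idea: an action is a timed sequence of
// states, and bigger actions are built by stitching smaller ones together.
struct RobotState {
    let leftPower: Double
    let rightPower: Double
}

struct TimedState {
    let time: TimeInterval  // offset from the start of the action
    let state: RobotState
}

struct RobotAction {
    let frames: [TimedState]
    var duration: TimeInterval { frames.last?.time ?? 0 }

    /// Stitch two actions together: play `self`, then `next`.
    func then(_ next: RobotAction) -> RobotAction {
        let shifted = next.frames.map {
            TimedState(time: $0.time + duration, state: $0.state)
        }
        return RobotAction(frames: frames + shifted)
    }
}

// Example: a 1-second "wiggle" built from alternating left/right turns,
// then chained into a longer routine.
let wiggle = RobotAction(frames: stride(from: 0.0, to: 1.0, by: 1.0 / 30.0).map { t in
    let direction: Double = (Int(t * 6) % 2 == 0) ? 1 : -1
    return TimedState(time: t, state: RobotState(leftPower: 0.3 * direction,
                                                 rightPower: -0.3 * direction))
})
let routine = wiggle.then(wiggle)   // a 2-second double wiggle
```

Composing actions this way keeps everything, from a single wiggle to the full dance, in one uniform representation.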

Step 3: Putting it All Together

Once we had the parser and the command sequence completed, it was time to put it all together. While everything was supposed to work in theory, we were a little anxious to see it come to life. We put a lot of stress on the iPad, asking it to do three things at once (a sketch of the playback loop follows the list):

  • Play the song
  • Control 3 robots in sync by sending 30 instructions per second to each robot
  • Process Yana’s accelerometer data in real time; Yana’s role was to start or pause the dance sequence every time you shake her.
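Here is a simplified sketch of that playback loop, building on the action types above. The sendCommand transport and the 2.5 g shake threshold are assumptions on our part; the real app talks to the robots over Bluetooth with tuned shake detection:

```swift
import Foundation

// Accelerometer samples as streamed from Yana (units of g) — illustrative.
struct AccelerometerSample { let x, y, z: Double }

final class DancePlayer {
    private var timer: Timer?
    private var frameIndex = 0
    private var isPlaying = false
    private let routine: RobotAction    // from the earlier sketch
    private let robots: [String]        // identifiers for the 3 robots

    init(routine: RobotAction, robots: [String]) {
        self.routine = routine
        self.robots = robots
    }

    func start() {
        // 30 ticks per second; each tick sends one command to each robot.
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { [weak self] _ in
            self?.tick()
        }
    }

    private func tick() {
        guard isPlaying, frameIndex < routine.frames.count else { return }
        let state = routine.frames[frameIndex].state
        for robot in robots {
            sendCommand(to: robot, state: state)   // hypothetical transport call
        }
        frameIndex += 1
    }

    /// Called whenever a new accelerometer sample arrives from Yana.
    /// A simple magnitude threshold stands in for real shake detection.
    func handleAccelerometerSample(_ s: AccelerometerSample) {
        let magnitude = (s.x * s.x + s.y * s.y + s.z * s.z).squareRoot()
        if magnitude > 2.5 {            // well above the ~1 g reading at rest
            isPlaying.toggle()          // shake to start/pause the dance
        }
    }

    private func sendCommand(to robot: String, state: RobotState) {
        // Placeholder: the real app sends this over Bluetooth to Bo & Yana.
    }
}
```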

With all this heavy computation, the iPad Air we used was processing ~15% slower than we needed. By optimizing the dispatch queues and system-level timers, we brought performance to within ~3% of our desired processing rate!
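Our original loop lived in Objective-C and GCD; the same idea in Swift is a dispatch timer on its own serial queue, scheduled strictly with minimal leeway so the system doesn’t coalesce ticks. The queue label and exact values below are illustrative:

```swift
import Dispatch

// Sketch of the timing fix: a GCD timer on a dedicated serial queue,
// scheduled with strict flags and tight leeway so ticks don't drift
// under load (the original code used the dispatch_source equivalent).
let commandQueue = DispatchQueue(label: "com.example.robot-commands")  // hypothetical label
let commandTimer = DispatchSource.makeTimerSource(flags: .strict, queue: commandQueue)
commandTimer.schedule(deadline: .now(),
                      repeating: .milliseconds(33),  // ~30 Hz
                      leeway: .milliseconds(1))       // keep ticks from being coalesced
commandTimer.setEventHandler {
    // Send this tick's commands to each robot off the main thread,
    // so audio playback and the UI stay responsive.
}
commandTimer.resume()
```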

Here is our very first, unedited Harlem Shake dance sequence, complete with shake detection to start and pause the routine:

At the end of the two weeks, we had gotten the job done and had a lot of fun in the process, from whiteboarding solutions to making performance optimizations on iOS. In the coming months, we will put a lot more work into designing the robot interface API. We are excited to see how third-party developers will use it to make our robots do even more wonderful, creative things!

If you like what we are doing and want to join us to shape the way children learn and play with robots all day, we’d love to talk to you! Please check out our career page for more information.

Comments on “The Engineering Behind Making Robots Dance”


  • Will the robots support Bluetooth? I like to write Android apps with AppInventor that use Bluetooth to control robots.

      • Awesome! This is going to be great! Will you please publish documentation for the protocol, so I don’t have to reverse-engineer it? Thanks!

        • The API will be the main way for you to access the robots, and you will have granular control over all the actuators and sensor values.

          You can see the initial documentation here: https://github.com/playi/PIRobotKit (please send us your GitHub screen name so we can give you read access)

          Over the next 2 months, we will release the actual framework library so your iOS application can interact with the robots.
