Deep in the center of Chicago's Adler Planetarium is the Space Visualization Lab, which creates many of the planetarium's interactive installations. At the time, those installations relied on the Kinect as a controller and often dealt not only with visualizations of space but also with explorations of related artifacts. A partner and I, working with the lab during a week-long Alternative Spring Break, were tasked with two goals:
I worked on exploring what the Leap Motion could detect and how we could leverage that to control our flyover installation, but one snag became apparent: given a combination of limited documentation, our skill sets, and the available APIs, the only way we could get the controller to send data to WorldWide Telescope was over the planetarium's local area network, and that hop introduced latency. The controller also wasn't very precise, which ruled out a variety of motions we had considered.
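The relay pattern described above can be sketched roughly as follows. This is only an illustration of the idea, not the installation's actual code: the real setup used the Leap Motion SDK and WorldWide Telescope's networked API, neither of which is assumed here, and every name below (`encode_sample`, `send_sample`, the loopback address standing in for the LAN) is hypothetical.

```python
import json
import socket

def encode_sample(palm_x, palm_y, palm_z):
    """Pack one hand-position sample as a JSON datagram payload.
    (Hypothetical format; the real hand data came from the Leap SDK.)"""
    return json.dumps({"palm": [palm_x, palm_y, palm_z]}).encode("utf-8")

def send_sample(sock, payload, addr):
    """Fire-and-forget UDP send: one datagram per controller frame.
    Each network hop like this adds the kind of latency we ran into."""
    sock.sendto(payload, addr)

if __name__ == "__main__":
    # Loopback stand-in for the machine running the visualization.
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", 0))
    addr = receiver.getsockname()

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_sample(sender, encode_sample(10.0, 150.0, -20.0), addr)

    data, _ = receiver.recvfrom(1024)
    print(json.loads(data)["palm"])
```

Even on a quiet network, serializing every frame and pushing it through a socket adds delay between a gesture and the on-screen response, which is why the indirection hurt an interactive exhibit.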
The issues we had noticed while developing the installation continued to manifest: the controller was finicky and not particularly precise, and the controller-to-software setup we used introduced latency that further diminished its efficacy. It was clear that finding a better way to integrate the Leap Motion would take more time and resources, but it was also clear that the controller itself had limitations compared to a Kinect.