NYU ITP Spring Show 2011: Apps at an Exhibition

Last night, NYU’s ITP program put on its vernal show of student projects. I couldn’t resist.

I made the trip to lower Broadway to see solenoids, DC motors, pulleys, gears, transformers, cameras, OR gates, laser-etched plexiglass, magnets, propellers, large touch screens, finger puppets, spare parts from an Xbox, and clever software (of course) mashed up to create something pretty, gadgety, joyful (sometimes dark), and at times vaguely practical.

Can you believe that none of the objets de gadget made even a reference to Twitter, Facebook, or other social media?

Technically, the finger puppets from Mindy Tchieu’s Burger Island didn’t involve any silicon, but I always enjoy games employing animals knit from wool and characters based on Rachael Ray and Alice Waters.

One inescapable conclusion from last night’s show was that image recognition software has gotten pretty darn good.

I witnessed Jihyun Moon and Tamar Ziv’s LoLWell, which had a project mission statement to “add meanings to a usage of plastic cups”.

They more than succeeded. With its assembled hardware and crafty algorithms, the LoLWell system keeps track of water-filled cups on a table, projecting images into them. You move the cup, the image follows along. You pick the cup up, the projector simulates a water stain.
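I have no idea how the pair actually built it, but the behavior they described can be sketched as a tiny state machine: a hypothetical tracker reports each cup’s table position (or nothing when the cup is lifted), and the projection logic either follows the cup or paints a stain. All the names here are invented for illustration.

```python
# Toy sketch of LoLWell-style behavior (names hypothetical; the real
# system presumably uses a camera and a computer-vision pipeline).

def project_frame(cup_positions, last_seen):
    """Decide what to project for each tracked cup.

    cup_positions: dict of cup_id -> (x, y) table coordinates,
                   or None if the cup has been lifted off the table.
    last_seen:     dict of cup_id -> last known (x, y), updated in place.
    Returns a dict of cup_id -> drawing command.
    """
    commands = {}
    for cup_id, pos in cup_positions.items():
        if pos is None:
            # Cup lifted: simulate a water stain at its last position.
            commands[cup_id] = ("stain", last_seen.get(cup_id))
        else:
            # Cup on the table: the projected image follows it.
            last_seen[cup_id] = pos
            commands[cup_id] = ("image", pos)
    return commands
```

Move the cup and the "image" command tracks it; lift it and the last known position becomes a "stain".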

Another application of image recognition was Brett Murphy’s SoundStage, “an ambisonic surround sound mixer easy enough for a three year old to use.” Three-year-olds and other age groups place various farm animals, tractors, dolphins, and other objects associated with distinctive sounds onto a camera-equipped and illuminated play table.

Moon and Ziv’s LoLWell: adding the magic back to tap water.

SoundStage’s image system identifies each object by interpreting the unique geometric pattern embedded in its base and then serves up the appropriate audio.

Pretty soon, with enough figurines on the table, you’ve simulated the background noise of a working farm.

And the SoundStage system even detects limited gestures, so dragging the model locomotive will deliver the choo-choo-choo of an engine picking up steam.
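The mechanics are reminiscent of fiducial-marker systems like reacTIVision: each base pattern decodes to an ID, and a lookup table maps IDs to audio clips, with gestures modifying the choice. A toy version, with every ID and file name invented:

```python
# Toy sketch of a fiducial-style sound lookup (all IDs and file names
# are invented; SoundStage's actual marker decoding is its own vision code).

SOUND_TABLE = {
    17: "cow_moo.wav",
    42: "tractor_idle.wav",
    63: "dolphin_click.wav",
    88: "locomotive_chug.wav",
}

def sound_for(marker_id, is_dragging=False):
    """Map a decoded marker ID (plus an optional drag gesture) to audio."""
    base = SOUND_TABLE.get(marker_id)
    if base is None:
        return None  # unrecognized pattern: stay silent
    if is_dragging and marker_id == 88:
        # Dragging the locomotive swaps in the picking-up-steam loop.
        return "locomotive_accelerating.wav"
    return base
```

Put enough markers on the table and you layer the resulting clips into that working-farm soundscape.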

There was Stepan Bolatalin and Ezer Longinus’s Exotraveller, which allows participants to walk through a virtual planetary system based on data supplied by the SETI organization. The pair has hacked into Microsoft’s Kinect, which is used to detect the motion and gestures of participants. As you space walk in this virtual environment, planets and other astronomical objects move by on the projected screen.

There be robots and computer-controlled mechanical pieces, toys, and thingies at ITP as well.

Belisario Russell de la Torre hacked out a four-propellered, auto-stabilizing helicopter, or quadcopter, called Scout. I’ve only seen these aeronautical things on YouTube, but de la Torre had the real item, though he wasn’t able to demo it due to the restricted flight patterns in the ITP space.

I asked him whether his project could have been pulled off a few years ago, and learned that Scout depends on an open-source platform called ArduPilot that has come together only very recently.

Scary to think about what’s being hatched now in the open-source equivalent of “skunk works”.

I also liked Shahar Zak’s SteadyState, a colony of gear modules stacked one on top of the other that independently rotate based on signals from a light sensor. Part robot and part interactive sculpture.

For cuddly non-shedding fun, there was Natalie Be’er’s Wigglebot, which simulates a frisky pet. Though I think this clear plastic, two-wheeled robot could use a wool-knit covering from the Burger Island people. In any case, its sensors can detect obstructions and when it’s being held, and then respond with various pet-like behaviors.

If my 3-year-old kitten doesn’t shape up, I may consider bringing a Wigglebot on board as a role model.

Final shout-out to Saul Kessler and his seer-like Q, a pirate doll with enough silicon smarts (read: Android) to understand speech and deliver answers parsed from Web sites.

He’s got image recognition working on this, so it’s conceivable that Q could deliver Goodnight Moon to the appropriate child and then answer questions about the narrative.

The perfect solution for overworked parents.
