Tech Explorations

A selection of hacks and fun explorations with code

Story-like Captions for Images using Machine Learning
My aim was to generate captions for images that didn't just describe the image. I wanted them to add meaning, with a dystopian sense. This Ross Goodwin article sparked my interest in using Recurrent Neural Networks to train a machine to generate text in a particular style. I trained a machine on horror novels and assorted prose. With a literal image caption as input, it could generate a paragraph of narrative. I then used computation to extract the most meaningful part of that paragraph and used it as the image's caption. More details here.

Technology Used: Recurrent Neural Networks, Processing, Python
The captions were generated using Torch RNN, NeuralTalk2 and Python. These frames were generated using Processing.
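A minimal sketch of the "extract the most meaningful part" step, in Python. The scoring heuristic below (word overlap with the literal caption plus some novelty) is an assumption for illustration, not necessarily the exact one I used; it takes the literal caption from NeuralTalk2 and a paragraph sampled from Torch RNN, and returns one sentence to use as the final caption.

import re

def pick_caption(literal_caption, generated_paragraph):
    # Split the generated paragraph into sentences.
    sentences = re.split(r"(?<=[.!?])\s+", generated_paragraph.strip())
    caption_words = set(re.findall(r"[a-z']+", literal_caption.lower()))

    def score(sentence):
        words = set(re.findall(r"[a-z']+", sentence.lower()))
        overlap = len(words & caption_words)   # stays tied to the image
        novelty = len(words - caption_words)   # adds narrative meaning
        return overlap + 0.5 * novelty

    return max(sentences, key=score)

# Hypothetical example: a NeuralTalk2 caption and a Torch RNN paragraph.
literal = "a man standing on a beach holding a surfboard"
paragraph = ("The man stood at the edge of the grey water. He did not remember "
             "the shore ever being this quiet. Something under the sand was waiting.")
print(pick_caption(literal, paragraph))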
Lighting Up the Randy Pausch Bridge
Along with three others, I designed and exhibited a light show on the Randy Pausch Bridge. Our show was loosely structured around Plato's Allegory of the Cave. In the first part, muted, pastel-like colors fade in slowly, reflecting the prisoner's limited understanding of the world. In the second segment, the prisoner is exposed to the world and is overwhelmed by it, which appears as parallel Morse code messages flashing rapidly. Slowly the prisoner begins to make sense of it all, and we see bright, vivid patterns across the bridge.

Technology Used: Pharos Lighting System, Python
Collaborators: Ruth Scherr, Nate Lamkin, Hannah Ng
The intermittent flashing of the bridge is the Morse code translation of messages sent to the bridge in real time. Don't miss the psychedelic sequence at 0:43! Note that the music was edited into the video and wasn't part of the show.
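The message-to-Morse translation itself is straightforward to sketch. Below is a minimal, hypothetical version in Python that converts a message into a sequence of on/off durations following standard Morse timing (a dash is three dots, letters are separated by three dots, words by seven); the actual show fed equivalent cues to the Pharos lighting system, whose interface isn't shown here.

MORSE = {
    'a': '.-',   'b': '-...', 'c': '-.-.', 'd': '-..',  'e': '.',
    'f': '..-.', 'g': '--.',  'h': '....', 'i': '..',   'j': '.---',
    'k': '-.-',  'l': '.-..', 'm': '--',   'n': '-.',   'o': '---',
    'p': '.--.', 'q': '--.-', 'r': '.-.',  's': '...',  't': '-',
    'u': '..-',  'v': '...-', 'w': '.--',  'x': '-..-', 'y': '-.--',
    'z': '--..',
}

DOT = 0.2  # base unit in seconds (an assumed value)

def to_flashes(message):
    # Convert a message into (state, duration) pairs for the bridge lights.
    flashes = []
    for word in message.lower().split():
        for letter in word:
            for symbol in MORSE.get(letter, ''):
                flashes.append(('on', DOT if symbol == '.' else 3 * DOT))
                flashes.append(('off', DOT))      # gap between symbols
            flashes.append(('off', 2 * DOT))      # letter gap totals 3 dots
        flashes.append(('off', 4 * DOT))          # word gap totals 7 dots
    return flashes

print(to_flashes("hello bridge"))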
Druid - A VR-based rehabilitation assistant for people with paralysis
Druid is an Android app that helps people who have lost motor function due to stroke or injury. It makes the otherwise mundane process of recovery exciting and immersive. I worked on this project along with three others at the 2015 PennApps hackathon. We built a Google Cardboard VR interface in which a user lifts objects such as bowling balls. The user's arm movements are captured by a Myo armband and drive the Unity-based VR environment. This information is transmitted to a Meteor.js-based server, which a therapist can access to tweak the difficulty of the experience. More details can be found here.

Technology Used: Google Cardboard VR, Android, Unity, Meteor.js
Collaborators: Paloma Sodhi, Sachin Nayak, Priyamvada Tiwari
Top: A glimpse of the VR environment that a user sees. Bottom: Me influencing that environment (lifting a ball) by raising my arm.
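The arm-to-lift mapping at the heart of the experience can be sketched in a few lines. The Python below is a simplified, hypothetical stand-in for logic that actually lived in the Unity scene and the Meteor.js server: it scales the arm pitch reported by the Myo armband by a therapist-set difficulty, so a higher setting demands a larger raise for the same lift.

def lift_height(arm_pitch_deg, difficulty=1.0, max_height=1.0):
    # Map the arm's pitch angle (0-90 degrees, from the Myo armband) to a
    # normalized ball lift height. 'difficulty' is the therapist-set
    # multiplier (hypothetical parameterization).
    raised = max(0.0, min(arm_pitch_deg, 90.0)) / 90.0
    return max_height * min(1.0, raised / difficulty)

# Raising the difficulty from 1.0 to 1.5 means the same 45-degree raise
# lifts the ball only two-thirds as high.
print(lift_height(45, difficulty=1.0))   # 0.5
print(lift_height(45, difficulty=1.5))   # ~0.33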

Other Projects

Educational Poster
Anna Karenina
Orient
Spin
Lemme Get That
Alarms & Clock
Proximiti
Capstone Project with Odigo