Saturday, May 4, 2013

Final Piece - Reflections

After the final presentations of our piece, "Zwischenkoerper", I would like to offer a few reflections.

Positive Notes:
- I feel like our biggest improvement between the April 29th performance and the later performances was getting all three pillows to communicate reliably with the base station at the same time. Once we accomplished this, it was much easier for the audience to figure out how the pillows related to the visualization on the projection screen.
- Andrea did a very good job each and every time with the choreography and interacting with all the various elements of our system.
- We got all three pillows to communicate!! I had my doubts, but we were able to pull it off. Thanks to Catherine and Thomas for putting in extra long hours to get them all to work.
- The Kinect worked like a charm. We never had to update the code or change anything about the motion capture system after our initial successful implementation.

Negative Notes:
- The interaction with the pillows was still much more coarse than we had originally imagined. We wanted very particular interactions to show on the screen from particular sensors on the pillows. It turned out that the best we could do was: the small pillow shows up when rotated, the medium pillow shows up when thrown around on the ground, and the large pillow shows up when crushed or laid upon.
- I wish we had had the growing and shrinking visualization for Andrea in the final show. I think it would have made Andrea's part of the piece more interactive.
- It would have been cool for the audience to be able to interact with the pillows while Andrea was dancing. As it was, our piece lacked the audience interaction we were originally aiming for.

Conclusion:
I feel very proud of our team and of our final project. It turned out way better than I imagined it would, and I feel like the system we created was both fun for Andrea to work with and fun for the audience to watch. Once again, the talents of our multidisciplinary team were put to good use to create a fascinating result.

Final Piece - Motion Capture Strategy and Code

For our final project, we had two methods for motion capture: Kinect video processing of Andrea's movement and the various sensors embedded in the 3 pillow bodies.

For the Kinect video processing, we wanted to keep a very minimalistic and natural approach to motion capture. We had seen some of the shortcomings of the Kinect in the earlier sketches by other teams, and we wanted to keep the Kinect as "out of sight, out of mind" as possible. After playing around with the Kinect SDK, I first decided to base my program on the DepthBasics sample. This sample was fairly accurate at capturing depth data, and my idea was to capture the convex shape of Andrea's body and track how big or how small its area was. This led me to learn more about the user-tracking feature of the Kinect. It turns out, however, that the Kinect must have skeleton tracking enabled in order to use the user-tracking feature.
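
As an illustration of that area idea (not code we ultimately shipped), a user-area measurement over the SDK v1 packed depth format can be as simple as counting the pixels tagged with a player index; the function name here is mine:

// Illustrative sketch only: estimate how "big" a user appears by counting
// depth pixels tagged with a player index. Assumes the Kinect SDK v1 packed
// depth format, where the low 3 bits of each 16-bit pixel hold the player
// index (0 = no user). Frame acquisition is omitted.
#include <cstddef>

typedef unsigned short USHORT;

size_t userPixelArea(const USHORT* depthPixels, size_t pixelCount)
{
    size_t area = 0;
    for (size_t i = 0; i < pixelCount; ++i)
    {
        if ((depthPixels[i] & 0x0007) != 0)  // pixel belongs to players 1-6
            ++area;
    }
    return area;  // larger area = body spread out, smaller = curled up
}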

This led me to experiment with the SkeletonBasics sample program. I soon found that skeleton tracking did not work well when the body was upside down or lying on the floor. The solution to this problem was to rely on the center-of-mass data instead. After capturing the center of mass, I was able to pipe this information into Touch Designer, where we could use it to drive the visualization based on position and speed of movement.
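
The full source is linked just below; for reference, the center-of-mass read is only a small piece of the SkeletonBasics loop. A minimal sketch, assuming the Kinect for Windows SDK v1 (sensor initialization and the actual pipe into Touch Designer are omitted):

// Minimal sketch (Kinect for Windows SDK v1): grab one skeleton frame and
// print each detected body's center-of-mass position. The Position field is
// filled in even for position-only users, which is what makes it survive
// poses (upside down, lying on the floor) that break full joint tracking.
#include <Windows.h>
#include <NuiApi.h>
#include <cstdio>

void printCentersOfMass()
{
    NUI_SKELETON_FRAME frame = {0};
    if (FAILED(NuiSkeletonGetNextFrame(0, &frame)))
        return;  // no new frame available yet

    for (int i = 0; i < NUI_SKELETON_COUNT; ++i)
    {
        const NUI_SKELETON_DATA& body = frame.SkeletonData[i];
        if (body.eTrackingState != NUI_SKELETON_NOT_TRACKED)
        {
            // x, y, z are in skeleton space (meters, relative to the sensor)
            printf("%f %f %f\n",
                   body.Position.x, body.Position.y, body.Position.z);
        }
    }
}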

https://dl.dropboxusercontent.com/u/104459740/SkeletonBasics.cpp

https://dl.dropboxusercontent.com/u/104459740/SkeletonBasics-D2D.exe


The next part of the motion-tracking strategy was to create the three interactive pillows and capture data from each one. Our original goals for the pillows were as follows:

Small - Use an accelerometer to determine the orientation and speed of movement of the pillow.
Medium - Have multiple "hairs" (flex sensors) which would respond when petted.
Large - Have multiple squish sensors which would respond when hugged.

These goals were more or less accomplished in our final product, but not to the granularity we originally imagined. We successfully utilized the small pillow and its accelerometer. The medium pillow turned out to be very finicky and only responded when significant force was applied (as when Andrea threw it around). The large pillow acted similarly, requiring a good deal of force to set off the sensors, and even then they were not 100% consistent.
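
For a sense of scale, the small pillow's firmware amounts to little more than the sketch below. This is a simplified reconstruction rather than our exact code; the pin assignments and baud rate are assumptions, and the XBee simply forwards whatever goes out over the hardware serial port:

// Simplified Arduino sketch for the small pillow: sample a 3-axis analog
// accelerometer and stream the readings to the base station through an XBee
// on the hardware serial port. Pins and baud rate are assumptions.
const int X_PIN = A0;
const int Y_PIN = A1;
const int Z_PIN = A2;

void setup() {
  Serial.begin(9600);          // XBee's default baud rate
}

void loop() {
  int x = analogRead(X_PIN);   // 0-1023; at rest, gravity dominates one axis
  int y = analogRead(Y_PIN);
  int z = analogRead(Z_PIN);

  // One comma-separated sample per line; the base station parses these to
  // estimate orientation (from gravity) and movement speed (from changes).
  Serial.print(x); Serial.print(',');
  Serial.print(y); Serial.print(',');
  Serial.println(z);

  delay(50);                   // roughly 20 samples per second
}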

The lessons learned in this project were twofold: figure out exactly what type of motion tracking is desired, and correctly gauge the time and resources necessary to accomplish it.


Monday, April 22, 2013

It's working!!

I finally figured out the error in the sample circuit in the large pillow. The + and - leads for the vibrator were both touching the vibrator housing, short-circuiting it and diverting the current around the vibrator.

Simple fix -- tape the leads down.

Yay!

*Old Post* Final Presentation - Concepts

***Old Post for Final Project. We eventually went to a new idea***

Yesterday, our team (minus Federico, plus Katherine) started hashing out ideas for our final piece. We began by making a list of things we thought were fun and embodied play. Our list included everything from bouncy castles to throwing paint.

At the end, our current idea (which still needs to be readjusted and fleshed out) is to make a two-room piece in which actions the audience performs in one room affect the dance performed in the next room. The audience will think that the first room is self-contained, and will be moderately entertained by playing around with things.

Work Day 4.21

Work Day on Sunday Night 4.21

Debugging the electronics of the pillow creatures...

6:00-7:15
Started by talking to Catherine about what's currently wrong with each creature. I think the list is as follows:

Large Pillow:
      - None of the vibrators are working
      - Voltage connection to XBee is shaky
      - The voltages on the transistors are not very intuitive

Medium Pillow
      - Flex sensor data coming from pillow is not intuitive (We have no idea why it's sending what it's sending)
      - LEDs are not mapped to the correct flex sensors

Small Pillow
      - Unknown... Wait on this one

7:15 -- Catherine had to go home; started working on understanding one isolated circuit of the large pillow.

7:30 -- taped down the conductive-thread connections, which stopped the flickering of power to the circuit.

7:45 -- I think the voltages going into the vibrator are reversed...

8:00 -- No, they're correct.

--Break from 8:15-11--

11:13 -- The battery is dead, and the low battery has been causing the problem: it can't supply enough voltage for the Arduino to properly drive 5 volts on 5 pins. With the USB cable attached, the proper voltages are sent out.


Friday, April 12, 2013

Project Meeting 4/12/13


Today's goals:
- Get two XBees connected.
- Get Pillow Creature 1 to talk to Touch Designer
- Get Touch Designer to receive data from Kinect

Wednesday, April 10, 2013

Find-a-conference/festival!!

While searching around the internet, I found a conference called SPLASH (Systems, Programming, Languages and Applications: Software for Humanity). The home page is here:

http://splashcon.org/2013/

This conference discusses how to use different systems and/or computing techniques to better integrate software with humanity. I thought our project would fit nicely here because our project deals with two different types of human/software integration: the Kinect, and a physical computing sensor blob.

Here is the website for the Student Research Competition:
http://splashcon.org/2013/cfp/due-june-28-2013/664-acm-student-research-competition

And here are the details involving submission:

Submission Summary
Due on: June 28, 2013
Notifications: July 28, 2013
Camera-ready copy due: August 28, 2013
Format: ACM Proceedings format
Contact: Isil Dillig and Sam Guyer (chair)

Sunday, March 24, 2013

Sketch 2 - Final Product

See my groupmate's blog to get a picture of the final product:

http://thomasrussellstorey.wordpress.com/2013/03/05/sketch-two-recap/

-Jonathan

Monday, March 4, 2013

Technical Side of Sketch 2

Sketch 2 involves a dancer, a Wii remote, and a visualization. Andrea is our dancer, Thomas and Federico took care of the visuals, and I took care of the Wii remote interaction.

We decided to use a Wiimote for this sketch for numerous reasons:
1) It can communicate over Bluetooth (meaning Andrea could dance wherever she wanted).
2) It has a clean interface for accessing the accelerometer, buttons, and infrared data.
3) The Visualization Department had them on site.

After deciding on the Wiimote, my job became figuring out how to use it to gather interesting information about a dancer's movements. I played around with the Wiimote and a little pairing tool called DarwiinRemote, which provided a nice graphical representation of the acceleration data (on all 3 axes) versus time. My discovery was that the accelerometer reliably captured both human-made accelerations and the acceleration due to gravity, so I decided to try to utilize both in our performance.

Acceleration due to gravity:
The acceleration due to gravity enabled me to characterize the orientation of the Wiimote. Based on which axes were experiencing gravitational force, I was able to figure out which way the Wiimote was leaning. I created the "equalizer" visual after brainstorming with Thomas and Federico about how best to show off this data. The equalizer visual plays off the building because of the horizontal beams between the windows.


The code for the equalizer is shown below:

// Linearly remaps target from the range [oldmin, oldmax] to [newmin, newmax].
// Passing oldmin > oldmax reverses the mapping, which draw() relies on below.
int convertFromRange(int target, int oldmin, int oldmax, int newmin, int newmax){
  float newRange = newmax - newmin;
  float oldRange = oldmax - oldmin;
  float newValue = (((target - oldmin) / oldRange) * newRange) + newmin;
  return (int) newValue;
}

void draw(){
  // Paint three gradient columns, one per accelerometer channel.
  gradientRect(0,0,width/3-10,height,light,dark);
  gradientRect(width/3,0,width/3-10,height,light,dark);
  gradientRect(2*width/3,0,width/3-10,height,light,dark);
  fill(0,0,0);

  // Left column: x axis, mapped in reverse (300..200), so a gradient bar of
  // height "left" stays uncovered at the bottom as x falls.
  int left = convertFromRange((int)wiiController.acc.x, 300, 200, 0, height);
  rect(0,0,width/3,height-left);

  // Middle column: y axis.
  int mid = convertFromRange((int)wiiController.acc.y, 0, 350, 0, height);
  rect(width/3,0,width/3,height-mid);

  // Right column: x axis in the forward direction (200..300).
  int right = convertFromRange((int)wiiController.acc.x, 200, 300, 0, height);
  rect(2*width/3,0,width/3,height-right);
}

This code draws dynamically sized rectangles based on the values given for the x and y axes.

Human-acceleration:
The next piece of information I wanted to grab from the Wiimote was the flick (or slash) action. I figured this would be a useful piece of information because of the nature of sketch 2's dance requirements (Laban's "A scale"). The visualization our team settled on to demonstrate this data was spheres floating in a 3D space.

The procedure I followed in capturing the flick data was to observe the waveforms on the accelerometer's 3 axes, recognize a pattern for when the Wiimote was "flicked", and capture the occurrence of that pattern in code. I noticed that in a flick, there is an initial change in acceleration in the flick direction, followed immediately by a spike in the opposite direction. This is the waveform I wanted to recognize:

[waveform plot: right flick]

The algorithm to determine a (right) flick turned out to be something like this:

// Runs once per accelerometer sample; prevX holds the previous x reading.
if (xVal < minThreshold && xVal < prevX) {
  spikeNeg++;  // acceleration spiking in the negative x direction
}
if (xVal > maxThreshold && xVal > prevX) {
  spikePos++;  // acceleration spiking in the positive x direction
}
prevX = xVal;

Depending on the order in which spikeNeg and spikePos begin incrementing, either a left flick or a right flick has occurred.


Final Implementation
We changed a few things for our final implementation. Most notably, I coded the flick recognition to consider all 3 axes and we changed the floating spheres to floating cubes.


Friday, March 1, 2013

Working on sketch 2

Friday, March 1:

Working on creating an equalizer-like effect to track Andrea's body:


Wednesday, February 20, 2013

Sketch 2 Research

I really like the realism and interaction with the windows on this video:

http://m.youtube.com/watch?v=rPr0CgvmBM4

It was projected on the Ralph Lauren Bond Street Store.

I enjoyed the fact that there were human figures on the building. I also enjoyed the "pretend" lighting of the building from the inside: the moving light lit up the windows as if it were really inside the building.

Monday, February 18, 2013

Sketch One Reflections

Reflections on the Dancing Egg
====

There were several positive aspects to our performance and also a few negatives:

Positives:
Scene setup
   - Andrea's costume, the lighting, and the aesthetics surrounding the egg all turned out very nicely.
   - We were aiming for a mysterious, earthy atmosphere, and I feel like we achieved it.

Sound production
   - I feel like this was the best part of our production.
   - We created meaningful sounds that helped the audience understand what the egg was feeling even though it wasn't moving :(

Choreography
   - Andrea's dancing was very believable and it added to the performance quite a bit


Negatives
No motion :(
   - The egg's battery had enough charge to run the Arduino board, but too little to actually move the motors.
(Lessons -- 1. Always check the battery and test the full range of operability before the performance
                   2. Make the battery easily replaceable)

Lack of Responsiveness
   - Although Andrea did a good job of learning the system, the proximity detection method we implemented just wasn't fully robust.

Limited performance space
   - Andrea was limited to the viewing area of the webcam we used, which turned out to be roughly an 8-foot square.

All in all, I was proud of what we accomplished, but I wish we had been able to demonstrate our product flawlessly.

-Jonathan
The Chronicles of the Dancing Egg
Part 6 - Performance

Wednesday 2/13/13 1:30pm - 4:30pm

- Finished documentation
The Chronicles of the Dancing Egg
Part 5

Tuesday 2/12/13 2pm - 2am

- Finished the construction of the egg
- Covered the egg in twine to add to aesthetics and cover up electronics
- Brainstormed and decided on placement of Thomas's laptop
- Added sound and created musical phrasing
- Andrea practiced with the completed module
- Tested proximity sensing
The Chronicles of the Dancing Egg
Part 4
Monday 2/11/13 1:30pm - 5pm

Technical Rehearsal
====

The technical rehearsal went fairly well; however, we noticed several issues.

First, we were still unsure how the egg would fare once the top portion was added.
Second, the proximity tracking program would lose sight of Andrea as she moved her head in different directions.
Third, our control station was literally right inthe way of the performance space
The Chronicles of the Dancing Egg
Part 3

Sunday 2/10/13 2pm - 5pm

Thomas and I successfully soldered the two infrared transmitter modules (one for the egg and one for Andrea's head) onto PCBs.
====
In other news, Thomas, Federico, and Andrea had worked Saturday night, testing the proximity detection and creating the new version of balancing and moving the egg (using a lead egg fishing weight).
====

The two infrared LED modules were relatively simple to design but harder to solder. After getting one module soldered, we realized that the infrared light wasn't very bright, so we decided to remove one resistor from each LED circuit to let more current pass through (with the LED's forward voltage roughly fixed, halving the series resistance roughly doubles the current, and the brightness rises with it).
The Chronicles of the Dancing Egg
Part 2

Friday 2/8/13 9am - 5pm

Today I figured out the wireless transmission over the two XBee modules
====

After running into the problems on Wednesday night, I brought all the solutions together and worked on configuring the modules.

====

I then went out and bought us a new servomotor. The one we had received turned out to be a continuous-rotation servo, which caused problems in creating a reproducible "wobbling" motion.

I visited a local hobby shop called "HobbyTown USA" and the manager there was very helpful in guiding my selection of the correct servo and giving other advice about the project.

====

Thomas, Federico, and I worked together to integrate the motor and wireless system with the egg. Thomas had coded the software to track infrared light that morning, so we folded that part in as well and got the Arduino and his software talking to each other over wireless serial communication.
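
I won't reproduce Thomas's tracker here, but the core idea behind webcam-based IR tracking is to scan each (IR-filtered) camera frame for its brightest spot. A toy sketch of that general technique, with frame capture omitted and all names my own:

// Toy sketch of brightest-spot IR tracking: scan a grayscale frame for the
// brightest pixel, which (behind an IR-pass filter) is the IR LED marker.
// This is my illustration of the usual approach, not Thomas's actual code.
#include <cstddef>

struct Spot { int x; int y; unsigned char brightness; };

Spot findBrightestPixel(const unsigned char* gray, int width, int height)
{
    Spot best = {0, 0, 0};
    for (int y = 0; y < height; ++y)
    {
        for (int x = 0; x < width; ++x)
        {
            unsigned char v = gray[(size_t)y * width + x];
            if (v > best.brightness)
            {
                best.x = x; best.y = y; best.brightness = v;
            }
        }
    }
    return best;  // track two such spots to estimate egg-to-dancer proximity
}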
====

At the end of the day, we demonstrated the project to Dr. Seo.
The Chronicles of the Dancing Egg
Part 1

Wednesday 2/6/13 4pm - 2am

Today Federico and I worked on finishing the construction of the Egg and I began working on the Wireless Communication setup using the XBee transmitters.
====

Our first challenge was to get the Egg to stand up on its own. We brainstormed and thought of many different alternatives:

- Using a brick
- Using a rock
- Using small rocks
- Using sand
- Cutting a flat surface on the bottom
- Using rope to flatten the bottom

We finally decided on using small pebbles (found outside the Architecture building) because when the base of the egg was scooped out, the smaller pebbles could fit better and added density.

Both the brick and the rock were too irregularly shaped and caused the egg to fall over often.
====

The next challenge involved getting the egg to move. We came up with these alternatives:

- An upside-down pendulum using a servo motor at the base of the egg and a stick with a weight on the end.
   (This method proved unsuccessful, as the weight would not come back up after the egg had tilted to the side)

- A normal pendulum
   (This method also proved unsuccessful as the top portion of the egg could not hold the weight of the motor and pendulum. In addition, putting any weight on the top of the egg caused even more balance issues)

- Water weight
   (As shown in this video: <insert link>, we tried using the shifting weight of water to "wobble" the egg, but it was too inconsistent.)
====

Work with the XBee shields:
When I first plugged the XBee into the Arduino and the computer, I thought it would work perfectly.
It didn't.
After working with it all night, these problems surfaced:

First I figured out that the computer needed to be configured with FTDI drivers. These drivers enabled the computer to "see" the XBee that was attached.

Second, I needed an interface to configure the XBee. The standard program recommended by Sparkfun is only built for Windows, so I had to find another one called "CoolTerm".
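
For the record, once CoolTerm could see the module, the configuration itself came down to a short AT-command session. The values below are made-up examples (the second module would mirror them with ATMY and ATDL swapped):

+++              enter command mode (module answers OK)
ATID 3001        shared PAN ID; both modules must match
ATMY 1           this module's 16-bit address
ATDL 2           destination address (the other module's ATMY)
ATWR             save settings to non-volatile memory
ATCN             leave command mode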
====

Wednesday, February 6, 2013

Arduino XBEE resources

http://www.instructables.com/id/Xbee-quick-setup-guide-Arduino/step5/Done/

https://www.sparkfun.com/pages/xbee_guide

Wednesday, January 23, 2013

Live Media: Interactive Technology & Theatre

In this paper, David Z. Saltz writes about his experiences using "interactive media" in theatre productions as Assistant Professor of Drama and Director of the Interactive Performance Laboratory at the University of Georgia. Dr. Saltz begins the paper by defining interactive media in a general way: interactive media is a way to store and "summon" media through interactions with sensory triggers. His three major determinants of interactive media are random access, arbitrary links between trigger and output, and media manipulation. These three concepts separate interactive media from both static media and linear media; they require dynamic reactions and indeterminate programmatic flow.

Saltz then discusses the connection between interactive media and theatre. Dancers have clear connections to interactive media, as they often want specific reactions to accompany their movements. He then details the possibilities of using interactive media in scripted performances, illustrating how linear media does not fit this situation while interactive media can fill a real niche. In the next part of the paper, Dr. Saltz describes productions he directed at the University of Georgia with the Interactive Performance Laboratory (IPL) that heavily utilized interactive media. In their productions of Hair, Kaspar, and The Tempest, the students at UGA were able to integrate media into live, scripted theatre performances.

To end his paper, Dr. Saltz focuses on twelve different ways he has found to connect media and performers in live productions. Virtual scenery and interactive costumes provide interesting setting opportunities. Several perspective options, including "alternate perspective", "subjective perspective", and illustration, give audience members a greater understanding of what is going on in a scene. The remaining elements -- commentary, diegetic media, affective media, synesthesia, instrumental media, virtual puppetry, and dramatic media -- can all breathe more life into a production, both by adding more possibilities for failure and by giving more dynamic cues to the human performers. In all, Dr. Saltz's paper is an intriguing study of the possibilities of using media in theatre in ways never before seen.

Portfolio Presentation

Last semester I was on a team that made a 3D point-and-click adventure game.
This is a one-minute presentation about it:

Surveillance Presentation