Name: Owen Campbell

Program: Final Project (Julia Set Music Visualizer)

Description: My project uses low-level audio features extracted using the Aubio library to drive dynamic Julia Sets on the GPU.

I have a video, but QuickTime’s audio/screen capture was too laggy to keep the audio in sync with the visuals, so this does a poor job of exemplifying the relationship between the two mediums. 

https://www.youtube.com/watch?v=kl2kUrNgLS4
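
As a rough illustration of the mapping (not the project's actual code), the sketch below pushes two per-frame audio features into the complex constant c of the Julia iteration z -> z^2 + c and counts escape iterations for one sample point. In the real program this inner loop runs per pixel on the GPU, and the feature names are stand-ins for whatever Aubio descriptors are used.

// Sketch only: maps two per-frame audio features (e.g. from Aubio) onto the
// Julia constant c, then counts escape-time iterations for one sample point.
// In the real project this loop would run per pixel in a GPU shader.
#include <complex>
#include <cstdio>

// Hypothetical feature values for one audio frame (0..1), stand-ins for
// whatever low-level descriptors the project extracts.
struct AudioFrame { float onset; float centroid; };

int juliaEscape(std::complex<float> z, std::complex<float> c, int maxIter) {
    int i = 0;
    while (i < maxIter && std::norm(z) < 4.0f) {   // |z|^2 < 4  <=>  |z| < 2
        z = z * z + c;                             // Julia iteration z -> z^2 + c
        ++i;
    }
    return i;
}

int main() {
    AudioFrame f{0.7f, 0.3f};                      // pretend analysis result
    // Remap features into an interesting region of the c-plane.
    std::complex<float> c(-0.8f + 0.6f * f.onset, 0.156f + 0.2f * f.centroid);
    int n = juliaEscape({0.3f, 0.2f}, c, 256);     // one sample point
    std::printf("escape iterations: %d\n", n);
}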

Sean Phillips:  Reverb in 2D Space, Hybrid Wave Equation/Convolution

The thrust of my final project was to use the wave equation to calculate the reverb characteristics of virtual spaces.  In the final version I hoped spaces could be quickly sketched out and auditioned through a simple GUI.  The algorithm went through three distinct implementations during development:

1. Wave equation reverberation of entire input signal

2. Wave equation IR, 1D (time domain) convolution

3. Wave equation IR, frequency domain convolution

The third image, from my 200C final project presentation, illustrates the speed of the three implementations.  The first version, wave equation only, was far too slow for practical sketching of spaces.  

The second version obtained a virtual impulse response of the 2D space using the wave equation.  This IR was then convolved with the input signal using time-domain convolution.  It was significantly faster than using the wave equation alone, but still far too slow for practical use in quickly sketching and auditioning virtual spaces.
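
A minimal sketch of this stage, assuming a leapfrog finite-difference update of the 2D wave equation on a rectangular grid: excite the source cell with an impulse, step the field, and record the value at the listener cell each step to build the virtual IR. The grid size, Courant factor, and zero boundary are assumptions, not the project's settings.

// Sketch: leapfrog update of the 2D wave equation u_tt = c^2 (u_xx + u_yy),
// recording the field at a listener cell to build a virtual impulse response.
#include <cstddef>
#include <utility>
#include <vector>

std::vector<float> simulateIR(int nx, int ny, int steps,
                              int srcX, int srcY, int lisX, int lisY) {
    const float c2 = 0.25f;   // (c*dt/dx)^2, kept below the 2D stability limit of 0.5
    auto idx = [nx](int x, int y) { return static_cast<std::size_t>(y) * nx + x; };
    std::vector<float> prev(nx * ny, 0.0f), cur(nx * ny, 0.0f), next(nx * ny, 0.0f);
    cur[idx(srcX, srcY)] = 1.0f;              // impulse at the virtual source

    std::vector<float> ir;
    ir.reserve(steps);
    for (int t = 0; t < steps; ++t) {
        // Border cells are never written, so they stay at zero (a simple reflecting boundary).
        for (int y = 1; y < ny - 1; ++y)
            for (int x = 1; x < nx - 1; ++x) {
                float lap = cur[idx(x + 1, y)] + cur[idx(x - 1, y)]
                          + cur[idx(x, y + 1)] + cur[idx(x, y - 1)]
                          - 4.0f * cur[idx(x, y)];
                next[idx(x, y)] = 2.0f * cur[idx(x, y)] - prev[idx(x, y)] + c2 * lap;
            }
        ir.push_back(next[idx(lisX, lisY)]);  // listener pick-up for this time step
        std::swap(prev, cur);
        std::swap(cur, next);
    }
    return ir;
}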

For the final version I tried the convolution in the frequency domain, which was fast enough to make the program workable.
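
The speed-up comes from replacing the O(N·M) time-domain sum with a pointwise multiply of spectra. A minimal sketch of that idea, assuming a simple radix-2 FFT and zero-padding to the next power of two to avoid circular wrap-around (the project itself may well use a library FFT):

// Sketch: fast convolution of a dry signal with an impulse response by
// multiplying their FFTs; not Sean's actual implementation.
#include <complex>
#include <vector>

using cd = std::complex<double>;
const double PI = 3.14159265358979323846;

// Minimal recursive radix-2 FFT (inverse transform when invert == true).
void fft(std::vector<cd>& a, bool invert) {
    const std::size_t n = a.size();
    if (n == 1) return;
    std::vector<cd> even(n / 2), odd(n / 2);
    for (std::size_t i = 0; i < n / 2; ++i) { even[i] = a[2 * i]; odd[i] = a[2 * i + 1]; }
    fft(even, invert);
    fft(odd, invert);
    const double ang = 2.0 * PI / n * (invert ? -1.0 : 1.0);
    for (std::size_t k = 0; k < n / 2; ++k) {
        cd w = std::polar(1.0, ang * k);
        a[k] = even[k] + w * odd[k];
        a[k + n / 2] = even[k] - w * odd[k];
        if (invert) { a[k] /= 2.0; a[k + n / 2] /= 2.0; }  // accumulates to a 1/n scale
    }
}

std::vector<double> fastConvolve(const std::vector<double>& x, const std::vector<double>& ir) {
    std::size_t outLen = x.size() + ir.size() - 1, n = 1;
    while (n < outLen) n <<= 1;                        // zero-pad to next power of two
    std::vector<cd> X(x.begin(), x.end()), H(ir.begin(), ir.end());
    X.resize(n); H.resize(n);
    fft(X, false); fft(H, false);
    for (std::size_t i = 0; i < n; ++i) X[i] *= H[i];  // pointwise spectral multiply
    fft(X, true);
    std::vector<double> y(outLen);
    for (std::size_t i = 0; i < outLen; ++i) y[i] = X[i].real();
    return y;
}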

The final image shows the GUI.  Walls are drawn in white, the virtual source in red, and the virtual listening positions in green (two for stereo).  The user can easily clear the screen, sketch another arrangement, and then hear the convolved output.

Scroll down to my previous post to see a sketch of a more elaborate virtual space, and hear an audio sample.

Some photos of the 6-channel vibrotactile sleeve device as it was presented in the demonstration “Vibrotactile Sleeves” for the MAT End of the Year Show 2014.

Alexis Story Crawshaw

FINITE DIFFERENCE SPATIALIZATION TECHNIQUES

IN VIBROTACTILE MUSIC DIFFUSION

With this project, I sought to develop the potential of physical simulation (specifically using the Heat and Wave Equations) to control spatial-amplitude envelopes for sound signals diffused to the body through touch, using a circular 6-channel, 1D vibrotactile actuator sleeve array. I mapped the visual parameter of brightness (or rather, the level of heat) to the level of gain.  The hope was that these simulations would create aesthetically salient ways to indicate the spread of a phantom tactile source across the arms, or the “hopping” that produces the cutaneous rabbit phenomenon.

The implementation was done in the C++ music library Gamma. I created two separate “instruments,” each generating its own field that controls the spatial amplitude of any given note. One can set the azimuth position from which the simulation is seeded, as well as the seed’s width and initial amplitude.

The 6 channels of vibrotactile actuators sit at azimuths 210 (left shoulder), 270 (left elbow), 330 (left hand), 30 (right hand), 90 (right elbow), and 150 (right shoulder). The gain for each channel is read out from the single bin of a 360-bin array corresponding to its azimuth. The video, taken in Max, also shows the level read-outs of a higher-resolution 60-channel actuator array, to better illustrate the behavior of the equations across the entire 1D field.
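
A minimal sketch of the general idea, not the Gamma implementation: a 1D heat-equation field over a 360-bin azimuth array, seeded with a Gaussian bump at a chosen azimuth, width, and amplitude, with each actuator's gain read from the bin at its azimuth. The diffusion constant and the seeding shape are assumptions.

// Sketch: 1D heat equation on a circular 360-bin azimuth field; each actuator's
// gain is read from the bin at its azimuth. Constants are assumptions.
#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

struct HeatField {
    std::array<float, 360> u{};                    // "heat" level == gain per degree

    // Seed a Gaussian bump at an azimuth (degrees) with a given width and amplitude.
    void seed(float azimuth, float width, float amp) {
        for (int i = 0; i < 360; ++i) {
            float d = std::fabs(i - azimuth);
            d = std::min(d, 360.0f - d);           // wrap-around distance on the circle
            u[i] += amp * std::exp(-(d * d) / (2.0f * width * width));
        }
    }

    // One explicit Euler step of u_t = alpha * u_xx (periodic boundary).
    void step(float alpha = 0.2f) {
        std::array<float, 360> next{};
        for (int i = 0; i < 360; ++i) {
            int l = (i + 359) % 360, r = (i + 1) % 360;
            next[i] = u[i] + alpha * (u[l] - 2.0f * u[i] + u[r]);
        }
        u = next;
    }

    // Gains for the six actuators at their azimuths.
    std::vector<float> channelGains() const {
        const int az[6] = {210, 270, 330, 30, 90, 150};
        std::vector<float> g;
        for (int a : az) g.push_back(u[a]);
        return g;
    }
};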

Sean Phillips:  Reverb in 2D Space, Hybrid Wave Equation/Convolution

This file is a four-track recording of a barbershop quartet, spatialized with the 2D virtual-space program written for my 200C final project.  A separate impulse response was generated for each singer, at four different locations in a virtual room.  The virtual microphones were placed as a spaced pair, and virtual boundaries were used for additional source separation.  Click the accompanying photo for a screenshot of the arrangement.

The algorithm then used these four IRs to reverberate each singer’s vocal track, through frequency domain convolution.  

The result is an experimental way to “mix down” the four track recording of this quartet to stereo.

Name: Nataly Moreno

Program: Newton Fractal Exploration

Description:

This video shows what I call “Proper” mode in my program. In Proper mode, the program generates a random equation along with its correct derivative. I show a couple of iterations of this and interact with the colors and the “a” parameter; the last run is the colorful one, in which I saturated the colors. After this, to prevent confusion, I change the color back to red and run the “Experimental” mode of my program. In Experimental mode, the program generates a random equation and a random “derivative” (that is, not the correct derivative). However, the results for this are uninteresting.
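
For reference, a minimal sketch of the generalized Newton iteration behind this kind of fractal, z <- z - a*f(z)/f'(z), here with a fixed f(z) = z^3 - 1 rather than the randomly generated equations; passing a function that is not the true derivative corresponds to the "Experimental" mode described above.

// Sketch: generalized Newton fractal iteration z <- z - a * f(z)/f'(z).
// Here f(z) = z^3 - 1; the real program generates f (and its derivative) randomly.
#include <complex>
#include <functional>

using cplx = std::complex<double>;

// Iterate until convergence or maxIter; return the iteration count, which a
// renderer could map to color. 'df' need not be the true derivative
// (that is the "Experimental" mode).
int newtonIterations(cplx z, cplx a,
                     const std::function<cplx(cplx)>& f,
                     const std::function<cplx(cplx)>& df,
                     int maxIter = 64, double tol = 1e-6) {
    for (int i = 0; i < maxIter; ++i) {
        cplx fz = f(z);
        if (std::abs(fz) < tol) return i;          // converged to a root
        z -= a * fz / df(z);                       // generalized Newton step
    }
    return maxIter;
}

int main() {
    auto f  = [](cplx z) { return z * z * z - cplx(1.0, 0.0); };
    auto df = [](cplx z) { return 3.0 * z * z; };
    cplx a(1.0, 0.1);                              // the "a" parameter from the post
    int n = newtonIterations({0.4, 0.7}, a, f, df);
    (void)n;
}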

Name: Nataly Moreno

Program: Newton Fractal Exploration

Description:

~ Special Mode ~

This video shows how I interact with different trig functions. The function being used is printed to the terminal window behind the running program. I show different colors and changes to the “a” term for both the real and imaginary components, which cause interesting effects on the fractal. I also add a random complex number to the equation and derivative.

Visualizing Perlin Noise

By Mohit Hingorani

MAT200C Spring 2014 Final project

For my final project I explored Perlin noise and interactivity. The end result is a prototype installation that uses a Kinect to sense people in the space and a projector to display the visuals.

My visuals are based on creating particles and moving them with a velocity derived from Perlin noise. The particles leave a trail behind, forming aesthetically pleasing lines that are pseudo-random in nature. The DLA walkers, on the other hand, quickly fade away, creating grid-line patterns.
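
A minimal sketch of that particle update, with a generic noise2D function standing in for Processing's Perlin noise(): the noise value at the particle's position sets its heading, and each step is appended to the trail.

// Sketch: a particle steered by a noise field, as in the Processing sketch
// (where noise() would be Perlin noise). A tiny hash-based value noise stands
// in for Perlin noise here just to keep the example self-contained.
#include <cmath>
#include <cstdint>
#include <utility>
#include <vector>

float hashNoise(int x, int y) {                    // deterministic pseudo-random in [0,1)
    std::uint32_t h = static_cast<std::uint32_t>(x) * 374761393u
                    + static_cast<std::uint32_t>(y) * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (h & 0xffffff) / 16777216.0f;
}

float noise2D(float x, float y) {                  // bilinear value noise (Perlin stand-in)
    int xi = static_cast<int>(std::floor(x)), yi = static_cast<int>(std::floor(y));
    float tx = x - xi, ty = y - yi;
    float a = hashNoise(xi, yi),     b = hashNoise(xi + 1, yi);
    float c = hashNoise(xi, yi + 1), d = hashNoise(xi + 1, yi + 1);
    float top = a + (b - a) * tx, bot = c + (d - c) * tx;
    return top + (bot - top) * ty;
}

struct Particle {
    float x, y;
    std::vector<std::pair<float, float>> trail;    // the line the particle leaves behind

    void update(float scale = 0.01f, float speed = 2.0f) {
        // Noise value -> heading angle; step the particle and record the trail.
        float angle = noise2D(x * scale, y * scale) * 2.0f * 3.14159265f;
        x += speed * std::cos(angle);
        y += speed * std::sin(angle);
        trail.emplace_back(x, y);
    }
};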

Some of the features of the current project:

Mouse Interaction: To quickly test the visuals, I use mouse clicks to launch the particles/walkers.

Kinect Interactions: When a person is detected in the space, I begin to track their hands.

As soon as the distance between a hand and the corresponding hip exceeds 35 cm, the hand becomes active and the particles/walkers appear at the corresponding position on the projected display.

Gestures: Particles/walkers are added to the system when viewers wave their hands or push (a gesture in the SimpleOpenNI library).

Perlin Noise: The number of people present in the space varies the noise detail. When no one is present, only 2 octaves are in use; for every additional person detected, the number of octaves is incremented. At this point I am not using any falloff, so all octaves are equally weighted.

Random Walkers: I additionally implemented random walkers, where a group of walkers is launched into the space, performs a grid-like random walk, and slowly fades away.

Color Palettes:

Demo Mode: Using 5 colors, which are chosen randomly.

Installations Mode:  Every person is allotted a distinct color.

Art mode: Using 2 complementary colors that switch randomly.

 

 

Here are the links to the websites I referred to:

www.processing.org

http://www.behance.net/gallery/Perlin-Noise-Patterns/4246255

http://flashyprogramming.wordpress.com/2009/12/26/visualizing-perlin-noise/

http://www.colourlovers.com/palette/2968653/Neon_craze

http://blog.3dsense.org/programming/programming-for-kinect-4-kinect-app-with-skeleton-tracking-openni-2-0/

 

Name: Cecilia Wu

Project Title:

Lotus, Buddha, and Enlightenment

― Experimenting with the Buddhabrot rendering algorithm

Description (Part 4 of 4):

This is a video demonstration of my project: I change the color field’s RGB values over time, then normalize and brighten the field. The equation factors I changed are factor1, 2, 3, 4, and 5, as shown below:

pNew.x = factor1 * pow(p.x, factor4) - factor2 * pow(p.y, factor5);
pNew.y = factor3 * p.x * p.y;
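
As a rough sketch (not the project's renderer), here is where a map like the one above could sit inside a Buddhabrot-style accumulation: for each random c, iterate the map, keep only orbits that escape, and add every visited point into a hit-count field, which would then be normalized, brightened, and mapped to RGB as in the video. The "+ c" terms and the example factor values are assumptions.

// Sketch: Buddhabrot-style accumulation around the factor-based map.
#include <cmath>
#include <cstdlib>
#include <vector>

struct Vec2 { double x, y; };

Vec2 step(Vec2 p, Vec2 c, double f1, double f2, double f3, double f4, double f5) {
    Vec2 pNew;
    pNew.x = f1 * std::pow(p.x, f4) - f2 * std::pow(p.y, f5) + c.x;  // factor map + c
    pNew.y = f3 * p.x * p.y + c.y;
    return pNew;
}

int main() {
    const int W = 512, H = 512, maxIter = 200, samples = 100000;
    std::vector<double> hits(W * H, 0.0);

    for (int s = 0; s < samples; ++s) {
        Vec2 c{4.0 * std::rand() / RAND_MAX - 2.0, 4.0 * std::rand() / RAND_MAX - 2.0};
        Vec2 p{0.0, 0.0};
        std::vector<Vec2> orbit;
        bool escaped = false;
        for (int i = 0; i < maxIter; ++i) {
            p = step(p, c, 1.0, 1.0, 2.0, 2.0, 2.0);   // these factors give classic z^2 + c
            orbit.push_back(p);
            if (p.x * p.x + p.y * p.y > 4.0) { escaped = true; break; }
        }
        if (!escaped) continue;                        // Buddhabrot keeps only escaping orbits
        for (const Vec2& q : orbit) {                  // accumulate hit counts along the orbit
            int px = static_cast<int>((q.x + 2.0) / 4.0 * W);
            int py = static_cast<int>((q.y + 2.0) / 4.0 * H);
            if (px >= 0 && px < W && py >= 0 && py < H) hits[py * W + px] += 1.0;
        }
    }
    // 'hits' would then be normalized, brightened, and mapped to RGB over time.
}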


Name: 
Rob Miller (4 of 4 - Modification)

Rob’s Game of Life: An Extension of Conway’s Game of Life in 3D as a Continuous Spatial Automaton

Rob’s Game of Life extends Conway’s Game of Life into 3D as a continuous spatial automaton. The degree to which a cell is alive is represented by a floating-point value, which differs from Conway’s GoL, where a Boolean determines life and death. Additionally, the simulation uses isosurfaces (via the marching cubes algorithm) to generate continuous shapes and patterns. A single 16-point neighborhood is generated multiple times, both to create interesting patterns and for efficiency.
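
A minimal sketch of what a continuous-state update can look like, with assumed parameters rather than Rob's actual rule: each cell holds a float in [0, 1] and relaxes toward "alive" when its neighborhood sum falls in a target band, a smooth analogue of Conway's birth/survival counts. The 26-cell neighborhood here is a placeholder for the project's 16-point neighborhood, and the marching cubes pass is omitted.

// Sketch: continuous-state "life" update on a 3D float grid. Aliveness is a
// float in [0,1]; the update pushes a cell toward life when its neighborhood
// sum falls in a target band. Neighborhood, band, and rate are assumptions.
#include <cmath>
#include <vector>

struct Grid3D {
    int n;                                   // cube side length
    std::vector<float> v;                    // aliveness per cell, in [0,1]
    explicit Grid3D(int side) : n(side), v(side * side * side, 0.0f) {}
    float& at(int x, int y, int z) { return v[(z * n + y) * n + x]; }
    float  at(int x, int y, int z) const { return v[(z * n + y) * n + x]; }
};

void stepLife(const Grid3D& cur, Grid3D& next) {
    const float target = 4.0f, width = 2.0f, rate = 0.3f;   // assumed rule parameters
    int n = cur.n;
    for (int z = 1; z < n - 1; ++z)
        for (int y = 1; y < n - 1; ++y)
            for (int x = 1; x < n - 1; ++x) {
                // Sum of the 26 surrounding cells (placeholder neighborhood).
                float sum = 0.0f;
                for (int dz = -1; dz <= 1; ++dz)
                    for (int dy = -1; dy <= 1; ++dy)
                        for (int dx = -1; dx <= 1; ++dx)
                            if (dx || dy || dz) sum += cur.at(x + dx, y + dy, z + dz);
                // Smooth "alive" response: near 1 inside the target band, near 0 outside.
                float d = (sum - target) / width;
                float desire = std::exp(-d * d);
                float a = cur.at(x, y, z);
                next.at(x, y, z) = a + rate * (desire - a);  // relax toward the desired state
            }
}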

Running on The Default Network
by Boyce