Biomedia project 3 – Beehive Bots

Using audio samples taken from inside a Colorado top-bar beehive, my group (Amanda Kriss, Leo Liu, Bianca Garcia, and I) created generative pieces of physical artwork. The hive sounds were fed into a Max patch whose purpose was to translate the sound data into four movement commands: “left”, “right”, “forward”, and “backward”. This was done by setting amplitude thresholds for the forward/backward controls and frequency thresholds for the left/right controls. The commands were then sent to a Hexbug with markers attached to its legs, which drew as it walked around the canvas. This way, the Hexbug behaved differently for different sound inputs; i.e., different stages in the beehive’s formation (see final pieces).
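The thresholding step can be sketched in a few lines. This is an illustrative Python sketch, not the actual Max logic; the function name and threshold values are made up, since ours were tuned by ear in the patch:

```python
# Hypothetical sketch of the patch's thresholding: amplitude picks
# forward/backward, frequency picks left/right. Values are illustrative.

def hive_sound_to_direction(amplitude, frequency,
                            amp_threshold=0.5, freq_threshold=440.0):
    """Map one analysis frame of hive audio to Hexbug commands.

    Amplitude above its threshold drives the bug forward, below drives
    it backward; frequency above/below its threshold steers right/left.
    """
    drive = "forward" if amplitude > amp_threshold else "backward"
    steer = "right" if frequency > freq_threshold else "left"
    return drive, steer

print(hive_sound_to_direction(0.8, 220.0))  # ('forward', 'left')
```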

The patch. Code available here.


The Max patch communicates with an Arduino sketch created by Chip Audette, accessible here, designed to output left, right, up, and down controls. The leads from the Arduino were soldered to the Hexbug remote-control board in order to take over control of the Hexbugs.


Short videos of the bots in action:

 

 

Finished Pieces:

Using sounds from a cold hive:

Hexbug1

Using sounds from a hive just upon formation, drawing with the crab bot:

Hexbug3

Using sounds from a hive just upon formation, drawing with the spider bot:

Hexbug2


Biomedia project 2 – Biosensing

This project involved building and using a wearable sensor that detects muscle movement. After we built the sensors, my role was programmer: I edited and assembled the Max patch we hooked up to the sensor. The patch allowed us to manipulate the information coming in from our muscles via the sensor and generate a sonic and visual output. The sonic output’s frequencies depended on muscle movement, producing an eerie chorus of generated synth sounds. The visual output was driven by four videos representing different aspects of biology, like flowers and plant cells. These videos faded in and out and mixed together with one another depending on the amplitude of the sounds generated by our muscles. Our final presentation was performance-based, with one of our group members performing Tai Chi movements while wearing the biosensor as the visual and sonic output unfolded around and behind him. I feel the project was very successful, and it improved my understanding of our ability to harness and utilize biological data.
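As a rough sketch of the two mappings described above (in Python rather than Max; the function names, ranges, and banding scheme are illustrative assumptions, not the patch's actual scaling):

```python
# Illustrative sketch of the biosensing patch's two mappings; the real
# work was done with Max objects, and these numbers are assumptions.

def muscle_to_pitch(emg_level, base_hz=110.0, span_hz=660.0):
    """Scale a normalized muscle-sensor reading (0..1) to a synth frequency."""
    return base_hz + max(0.0, min(1.0, emg_level)) * span_hz

def video_mix_weights(amplitude, n_videos=4):
    """Fade four clips in and out by amplitude: each clip peaks in a
    different amplitude band, so louder output shifts the visual mix."""
    centers = [(i + 0.5) / n_videos for i in range(n_videos)]
    weights = [max(0.0, 1.0 - abs(amplitude - c) * n_videos) for c in centers]
    total = sum(weights) or 1.0
    return [w / total for w in weights]

print(muscle_to_pitch(0.5))      # mid-range muscle tension -> 440.0 Hz
print(video_mix_weights(0.5))    # mid amplitude favors the middle clips
```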

Biomedia project 1 – Prehistoric Soundscape

My goal for this project was to take audio samples from an ordinary, somewhat urban landscape and turn them into a foreign soundscape. My samples were collected from Wash Park right by DU, and I sought to turn them into a piece resembling prehistoric times. Most of the samples were first cleaned in Adobe Audition to remove background noise, like voices and traffic. Audition was also used for some of the pitch and speed distortion. I then created a patch in Max and incorporated these altered samples, editing them further once the patch was complete. The editing done in Max included pitch and speed shifting, reverb, echo, and altering the samples’ sound envelopes. I then linked it all together, mixed it live several times, and used the Quick Record extra to export a .wav file once I was satisfied with it.
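The tape-style pitch/speed distortion can be illustrated with simple resampling. This is a minimal Python/NumPy sketch assuming a sample loaded as an array (the real editing used Audition's and Max's built-in tools, not this code):

```python
# Illustrative tape-speed manipulation by index resampling; the function
# name is mine, and the real project used Audition/Max instead.
import numpy as np

def resample_speed(samples, factor):
    """Play a sample back at `factor` x speed; like varying tape speed,
    speeding up also raises pitch and slowing down lowers it."""
    idx = np.arange(0, len(samples), factor)
    return np.interp(idx, np.arange(len(samples)), samples)

# One second of 440 Hz at 44.1 kHz, then played at half speed:
tone = np.sin(2 * np.pi * 440 * np.arange(44100) / 44100)
slowed = resample_speed(tone, 0.5)
print(len(slowed))  # 88200 samples: twice as long, an octave lower
```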

F-111 TFR behind the scenes

Development of my TFR (terrain-following radar) simulator here. Overall pretty simple, with the help of a MaKey MaKey. The Processing code is posted at the bottom. I couldn’t attach any audio files, so if you’re running the code you’ll have to delete anything that references audio or replace it with your own sounds. The code isn’t too pretty and probably could have been done more simply, but it works!


//Brian Franceschi:

//F-111 sim full demo

int xspacing = 1;
int w;
float planeheight = 300;
int i;
int beamheight;
int directionY = 5;
int dieDirectionY = 2;
int dieX = 375;
int dieY = 490;
float alpha = 0.0;
float amplitude = 158.0;
float period = 500.0;
float dx;
float[] yvalues;
float altitudeUp = 1.0075;
float altitudeDown = 1;
int points = 0;

//timer
int actualSecs;
int actualMins;
int startSec = 0;
int startMin = 0;
int scrnSecs;
int scrnMins=0;
int restartSecs=0;
int restartMins=0;

import ddf.minim.*;

AudioPlayer player;
AudioPlayer player2;
Minim minim;//audio context

void setup() {
size(1080,720);
w = width+200;
dx = (TWO_PI / period) * xspacing;
beamheight = 300;
yvalues = new float[w/xspacing];
frameRate(220);
minim = new Minim(this);
player = minim.loadFile("F-111 sim 2.mp3", 2048);
player2 = minim.loadFile("explosion.mp3", 2048);
player.play();
PFont font;
}

void draw() {
background(0);
fill(250,100,100);
text("OFF", 70, 90);

//Timer: turns to night every 10 seconds
actualSecs = millis()/1000;
actualMins = millis() /1000 / 60;
scrnSecs = actualSecs - restartSecs;
scrnMins = actualMins - restartMins;

if (keyPressed == true){
if (key == 'a'){
restartSecs = actualSecs;
scrnSecs = startSec;
restartMins = actualMins;
scrnMins = startMin;
}
}
if (actualSecs % 60 == 0) {
restartSecs = actualSecs;
scrnSecs = startSec;
}

println(scrnSecs);
println(scrnMins);

textAlign(LEFT);
fill(255);
text("Time:",856,85);
text(nf(scrnMins, 2) + " : " + nf(scrnSecs, 2), 932, 85);
//TFR Zone
stroke(10,40,180,255);
strokeWeight(80);
line(0,280,width,280);
fill(0,0,0);
noStroke();
rect(275,250,150,20,3);
rect(340,250,20,65,3);

rect(435,250,150,20,3);
rect(435,250,20,65,3);
rect(435,280,110,20,3);

rect(595,250,150,20,3,10,3,3);
rect(595,267,150,20,3,3,10,3);
rect(595,250,20,65,3);
rect(700,250,20,65,3);
fill(10,40,180,255);
rect(615,269,115,8,3,10,10,3);
stroke(255,255,255);
fill(255,255,255);

//Scoring points. More points if you go lower
text("Score: "+points,850,50);
points=points+1;
if((planeheight>250) && (planeheight<800)){
points=points+5;
player.play();
player2.rewind();
}

stroke(0);
calcWave();
renderWave();

//Setting the death sensor
if(dieY > 510 || dieY < 200){
dieDirectionY = -1 * dieDirectionY;
}

dieY = dieY + dieDirectionY;

//Ceiling:
stroke(59,200,111,230);
strokeWeight(7);
line(0,110,1080,110);

//Death by ceiling:
if (planeheight < 110){
planeheight=planeheight+10000;
background(0,0,0);
textAlign(CENTER);
text("you died.",width/2,300);
points=points-1;
text("Score: "+points,width/2,400);
textAlign(BASELINE);
player.pause();
player.rewind();
player2.play();
player2.rewind();
}
endShape();

fill(255,255,255);
noStroke();

ellipse(dieX,dieY,1,1);
//front radar (TFR):
fill(230,60,60,100);
beginShape();
vertex(405,planeheight);
vertex(850,-50+beamheight);
vertex(850,0+beamheight);
vertex(405,planeheight);
endShape();

//bottom radar (LARA):
beginShape();
noStroke();
vertex(380, planeheight);
vertex(350, planeheight+160);
vertex(410, planeheight+160);
vertex(380,planeheight);
endShape();
stroke(230,60,60,100);
strokeWeight(3);
line(320,planeheight+40,440,planeheight+40);
noStroke();
//F-111:
fill(200,200,200);
ellipse(370,planeheight,65,10);
beginShape();
vertex(342,planeheight);
vertex(330,planeheight-14);
vertex(345,planeheight-16);
vertex(372,planeheight);
endShape();
fill(255,255,255);
ellipse(375,planeheight+1,30,5);
//Description/labels:
textSize(25);
fill(255,255,255);
text("F-111 Terrain Following Radar System", 10, 30);
text("ride control: Hard", 10, 60);
text("TFR: ", 10, 90);

//Manual up:
if (keyPressed == true) {
if(key=='w'){
planeheight = planeheight-altitudeUp*2;
}

//Manual down:
if(key=='s'){
planeheight = planeheight+altitudeDown*3;
}

//TFC on:
if(key=='d'){
if (yvalues[1000]>1) {
planeheight = planeheight-altitudeUp;
fill(0,0,0);
rect(70,70,70,25);
fill(100,255,100);
text("ON!", 70, 90);
}
else { planeheight = planeheight+altitudeDown;}
fill(0,0,0);
rect(70,70,70,25);
fill(100,255,100);
text("ON!", 70, 90);

}
}
else { planeheight = planeheight+altitudeDown;}

if(beamheight > 500 || beamheight < 200){
directionY = -1 * directionY;
}
beamheight = beamheight + directionY;

//Death by mountains:
if (dieY < planeheight){
planeheight=planeheight+10000;
background(0,0,0);
textAlign(CENTER);
text("you died.",width/2,300);
points=points-1;
text("Score: "+points,width/2,400);
textAlign(BASELINE);
player.pause();
player.rewind();
player2.play();
}
else{
player.play();
}

//Reset game
if (key=='a'){
planeheight=180;
points=0;
player.play();
}

//Nighttime:
if(scrnSecs > 9 && scrnSecs < 20 || scrnSecs > 29 && scrnSecs < 40 || scrnSecs > 49 && scrnSecs < 60){

fill(0,0,0,255);
rect(0,105,1200,1200);
fill(255,255,255);

if (dieY < planeheight){
planeheight=planeheight+10000;
background(0,0,0);
textAlign(CENTER);
text("you died.",width/2,300);
text("Score: "+points,width/2,400);

textAlign(BASELINE);
player.pause();
player.rewind();
player2.play();
}
}
//Wave creation. Basic sine wave ellipses originally from Daniel Shiffman:
//https://processing.org/examples/sinewave.html
}
void calcWave() {
alpha += 0.02;
float x = alpha;
for (int i = 0; i < yvalues.length; i++) {
yvalues[i] = sin(x)*amplitude;
x+=dx;
}

}

void renderWave() {
noStroke();
fill(59,200,111);
for (int x = 0; x < yvalues.length; x++) {
ellipse(x*xspacing, height/2-yvalues[x], 10, 10);
}

}
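Stripped of the drawing code, the terrain-following branch above boils down to one decision per frame: climb when the sampled terrain wave ahead is high, otherwise descend. A minimal Python restatement, using the sketch's altitude constants (the function name is mine; remember that screen y grows downward, so climbing means subtracting):

```python
# Distilled version of the 'd'-key terrain-following branch in the sketch
# above, simplified to one step per frame. Constants come from the sketch
# (altitudeUp = 1.0075, altitudeDown = 1); the function name is mine.

def tfr_step(planeheight, terrain_sample, climb=1.0075, descend=1.0):
    """One frame of terrain following: climb while the terrain wave ahead
    is above the trigger level, otherwise sink back toward it."""
    if terrain_sample > 1:
        return planeheight - climb
    return planeheight + descend

h = 300.0
h = tfr_step(h, 50.0)  # high terrain ahead -> climb slightly
```

Because the climb rate only barely exceeds the descent rate, the plane hugs the rising and falling wave rather than overshooting it, which is what makes the TFR mode feel like it is riding the terrain.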

 

Cultural Event 3: Tim Stutts

Tim Stutts is a designer, developer, and prototyper who focuses on data visualization, user interface/experience, and motion graphics. On November 12, Tim came to talk to the EDP department with a presentation he called “Prototyping in the Work World”. The presentation focused on his digital business and personal achievements, and gave a strong sense of the role that digital media and motion graphics play in our modern world.

Tim started out focusing on music composition, and one of his first big involvements was composing music on the computer via MIDI. He channeled this interest into a more arts-driven education at CalArts and began using Max/MSP, a language he still uses today. From sound design to design work at the Cooper Hewitt design museum, Tim began exploring the artistic side of computer programming with projects like his Lumen Installation (http://cargocollective.com/timstutts/Lumen-Installation), where he associated dynamic sounds with people’s movement around a space.

In exploring his interests post-graduation in San Francisco, Tim began wondering how he could be both a designer and a developer. Thus the title of “Prototyper” (creative programmer) was born. One of his first achievements under this title was his work on the IBM “Smarter Planet” commercial (http://cargocollective.com/timstutts/IBM-Smarter-Planet-Commercial?cm_mc_uid=53552595840014452024633&cm_mc_sid_50200000=1447641926), where he used openFrameworks and Processing to attach generative motion graphics to live footage. It was especially interesting to hear that those were the tools he used to create such a professional-looking commercial, for a company like IBM no less; it really makes me value my ability to use programs like Processing and the kind of logic that goes into creating those art pieces.

Tim’s work continued through events like the EA E3 conference, where he again used openFrameworks to visualize data, such as generative triangle designs based on live tweets about the event. He also worked with Honda on their defensive-driving heads-up display, prototyping in openFrameworks. Here he assigned geometry to things moving in physical space, like people crossing a street or cars on the side of the road, and presented them back to the driver for quicker, easier identification of what’s ahead of the car.

The overall takeaway I found most interesting was that Tim, and many “prototypers” like him, can create professional, business-ready products with software that is so easy to “play around on” once you have a firm understanding of the language and logic. I think this shows how accessible creativity is in this field, because the final products are often simply creative generations (perhaps sometimes from playing around with the code) with business-world applications. Ever since his talk, I’ve been noticing every instance of his style of “prototyping” around me, whether it be small snippets in TV commercials or ESPN’s cut-scene animations. This culture of digital design is, and will continue to be, an extremely prominent part of our society; simply by being more conscious of it, I was able to recognize it nearly everywhere I went. It’s exciting, too, to know that I have a beginner’s understanding of the logic and techniques behind these pieces we see every day.

The final bit I’d like to note is that Tim mentioned, multiple times, the importance of not “killing the magic” of his designs. Oftentimes the “magic” of his creations, that initial feeling the designs evoke, lies in their first presentation to the audience, as in his IBM commercial. There are quick cuts showing the graphics he created, intended to inspire a specific feeling, in that case one of energy and sustainability, just for that moment. When we study these graphics for a longer period of time, we can see imperfections and inconsistencies we didn’t notice at first, because the average television viewer wouldn’t have time to recognize them in the short time they’re shown. This also goes for presenting products to clients: you always want to leave something up to question and interpretation in the presentation. “How did he do that?” “What does that part of the piece mean?” Don’t be too explicit in presenting these prototypes; it is, after all, an art form.

P.S. Lack of pictures is because I didn’t want to take pictures during Tim’s presentation.