
Project Brief: 

The Music Player is an innovative final project in Creative Coding that aims to provide users with an interactive and visually engaging music player. The player features a unique sound visual effect in the middle of the screen, which reacts dynamically to the audio being played.

Users can control the player using the built-in buttons, allowing them to play/pause, switch between songs, and adjust the volume. To enhance accessibility, the player also includes motion control options that enable users to interact with it in various situations, such as when their hands are wet or dirty.

Additionally, the Music Player offers two different scenes for visualizing the song, giving users a customized experience based on their preferences.

With a timeline of two weeks, this project showcases the power of creative coding in developing innovative and interactive music players that offer accessibility and a personalized experience to users.


Timeline: 2 weeks. 


Fluid Scene


Dynamic Scene

Project Display: 

 

Arduino setup

Mouse control demo

Motion control demo

Process

My original idea from the beginning was to create an interactive p5 sketch that would react to sound and to the user's motion. I was thinking of using a mic input and an Arduino ultrasonic sensor for it.
 
I found some inspiration on Instagram for the idea. The plan: a music player built from p5 3D objects and animations that play to the music's rhythm, displayed in a cold colorway. Whenever the sensor detects someone passing by, the sketch switches to another animation driven by the person's motion (one hand up, both hands up, etc.).

I started by playing with some math and creating a wave effect for the fluid scene. 

for (let j = 0; j < 100; j++) {
  push()
  rotate(sin(frameCount + j * 2) * 50)
  rect(0, 0, 600 - j * 3, 20 - j * 3, 200 - j)
  // ellipse(0, 0, 10 - j * 6)
  pop()

  push()
  rotate(sin(frameCount + j * 2) * 50)
  rect(-windowWidth / 6, -windowHeight / 3, 600 - j * 3, 1000 - j * 3, 200 - j)
  pop()
}

 

Then I got the mic input from my computer, mapped it out, and played with noise(). I created some circles inside that are affected by the mic input. Because the result of getLevel() is very low, I multiply the volume by 1000.

  // mic level is tiny, so scale it up before mapping
  let vol = mic.getLevel() * 1000;
  let v = map(vol, 0, 20, 0, 40);

  let space = 0.1
  for (let i = 0; i < 360; i += space) {
    // walk around a circle in noise space so the ring loops seamlessly
    let xOff = map(cos(i), -1, 1, 0, 2)
    let yOff = map(sin(i), -1, 1, 0, 2)

    let n = noise(xOff + start, yOff + start)
    let h = map(n, 0, 1, -150, 150)
    let l = map(n, 0, 1, -500, 500)

    rotate(space)
    // ellipse(windowWidth / 8, 0, h)
    ellipse(windowWidth / 8, 0, abs(v / l), v / 4)
    ellipse(windowWidth / 4, 0, abs(v / l) * 3)
  }
  start += 0.01  // drift through the noise field over time

Looking at the scene, I felt that I liked it as it was, and there wasn't much room left for motion interaction.

I asked myself, why wouldn't I create a music player with motion control? 

Then I preload all the songs. For better visualization, I picked techno, pop, and rock-style tracks. I use p5.FFT to get the waveform of the song and map it so it interacts with the shapes behind it.
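The mapping step can be sketched as a pure function (a sketch, not the exact project code: mapRange stands in for p5's map() so this runs outside the browser, and the -50..50 output range matches the spotlight code below):

```javascript
// Standalone version of p5's map(): linearly rescale n from [a, b] to [c, d].
function mapRange(n, a, b, c, d) {
  return c + (n - a) * (d - c) / (b - a);
}

// A waveform sample from p5.FFT's waveform() lies in [-1, 1];
// map it to a signed radius so louder samples push shapes further out.
function sampleToRadius(sample) {
  return mapRange(sample, -1, 1, -50, 50);
}
```

A silent sample (0) maps to a radius of 0, so the shapes collapse to their base position between beats.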

Besides the shapes I already had in the scene, I played around and created a spotlight/flower effect in the middle.

  // ----- create a spotlight effect in the middle, based on the waveform
  // ----- reference: https://www.youtube.com/watch?v=uk96O7N1Yo0&t=440s&ab_channel=ColorfulCoding
  push()
  // console.log(wave)
  for (let t = -1; t <= 1; t += 2) {
    beginShape();
    for (let i = 0; i <= 180; i++) {
      let index = floor(map(i, 0, 180, 0, songWaveform.length - 1))
      let r = map(songWaveform[index], -1, 1, -50, 50)
      let x = r * sin(i + frameCount * 20) * t
      let y = r * cos(i + frameCount * 10)
      stroke(255, 60 + songWaveform[index] * 10)
      rotate(r * sin(frameCount))
      vertex(x, y)
      vertex(x * 2, y * 2)
      vertex(x * 4, y * 4)
    }
    endShape()
  }
  pop()
 

I felt the scene didn't present techno music very well, and it got boring after watching it for a while. So I changed the angle mode to radians in order to play around with another scene.

It's basically the same code as the fluid scene, but I played around with the math to create something different. To create a sense of depth, I put another layer of shapes outside, with a darker stroke color.


 


    beginShape();
    for (let i = 0; i < songWaveform.length; i++) {
      // rotate(TWO_PI / float(5));
      // ----- using volume to control opacity
      let q = map(vol, 0, 50, 0, 100)
      stroke(255, 60 + q)
      strokeWeight(.2)
      // console.log(songWaveform[i] * 80)
      let r = map(songWaveform[i] * 80, 0, 256, 40, 400)
      let angle = map(i, 0, songWaveform.length, 0, 360)
      let x = r * cos(angle) * 3
      let y = r * sin(angle) * 3

      vertex(x, y);
    }
    endShape();

As with the fluid scene, I created several ellipses around it for a sense of depth, and the inner circles have different effects than the outer ones.

In the middle, I created several meteorites, which blow up based on the music.
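The blow-up idea can be sketched as a size function driven by the current amplitude (a hypothetical sketch, not the project's exact code; the constants and the helper name meteoriteSize are illustrative, and level is assumed to be a 0..1 value like p5.Amplitude's getLevel()):

```javascript
// Hypothetical sketch: scale a meteorite's diameter by the current amplitude.
const BASE_SIZE = 10;  // resting diameter in pixels (illustrative value)
const MAX_BOOST = 60;  // extra diameter at full volume (illustrative value)

function meteoriteSize(level) {
  // Clamp so a spike above 1 cannot blow the shape up off-screen.
  const clamped = Math.min(Math.max(level, 0), 1);
  return BASE_SIZE + clamped * MAX_BOOST;
}
```

Each meteorite would then be drawn with `ellipse(x, y, meteoriteSize(level))` inside draw(), so loud passages visibly inflate them.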

Fluid Scene

Dynamic Scene


I started testing with my ultrasonic sensors. I plan to have two sensors, one on each side of my laptop's screen.

After setting up the Echo and Trigger pins of both sensors, I cap the distance at 50 cm: any reading greater than 50 cm is treated as 50 cm.

I noticed that when my hands are too close to the sensor, the reading sometimes jumps to 1000+, so I zero the value out when that happens.

 

void getDistance(int Trigger, int Echo){
  // fire a 10 µs pulse on the trigger pin
  digitalWrite(Trigger, LOW);
  delayMicroseconds(2);
  digitalWrite(Trigger, HIGH);
  delayMicroseconds(10);
  digitalWrite(Trigger, LOW);

  Duration = pulseIn(Echo, HIGH);
  Distance = Duration * 0.034 / 2;  // ~0.034 cm/µs speed of sound, halved for the round trip

  // ----- zero out spurious 1000+ readings first (hands too close to the sensor)
  if (Distance > 1000){
    Distance = 0;
  }
  // ----- cap the range at 50 cm to prevent unintended activation; it also helps get a proper read
  if (Distance > 50){
    Distance = 50;
  }
}
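One subtlety in this clamping is ordering: the spurious-reading check (> 1000) has to run before the 50 cm cap, or the cap turns every glitch into 50 and the check can never fire. A pure-function sketch in JavaScript (for easy testing; clampDistance is an illustrative name, not from the project):

```javascript
// Clamp a raw ultrasonic reading (in cm) the way the Arduino sketch does.
// Order matters: spurious too-close readings (1000+) must be zeroed
// before capping at 50, or the cap would hide them.
function clampDistance(raw) {
  if (raw > 1000) return 0;  // sensor glitch when a hand is too close
  return Math.min(raw, 50);  // anything beyond 50 cm counts as 50 cm
}
```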

In my head, the motion logic is:

  • Hold either hand in front of a sensor to play/pause the song. (Accessible for both left-handed and right-handed users.)

  • Hold both hands in front of both sensors for 1 second to change the scene.

  • Next and previous song follow common sense: sliding from left to right is the next song, and from right to left is the previous song. Both sensors run hand detection. When either sensor detects a hand, the sketch reads the sensors again; if the other sensor now detects a hand, the user has slid across, and it prints "next song" or "last song".

  • Another gesture handles volume change: after the user moves a hand up and down, the sensor is read again and the difference is calculated, then mapped in p5 to control the volume.
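The slide gesture above boils down to comparing which sensor fired first and which fired second. A sketch of that decision in JavaScript (not the Arduino code itself; detectSlide and the sensor labels are illustrative):

```javascript
// Sketch of the slide-detection decision: when one sensor fires and the
// other fires shortly after, the user has slid a hand across the screen.
function detectSlide(firstSensor, secondSensor) {
  if (firstSensor === "left" && secondSensor === "right") return "Next Song";
  if (firstSensor === "right" && secondSensor === "left") return "Last Song";
  return null;  // same sensor twice, or no second reading: not a slide
}
```

Returning null for a repeated reading is what separates a slide from simply holding a hand in front of one sensor (which is the play/pause gesture).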

I used p5.serialport to connect the Arduino to p5. In p5, I use the incoming strings to toggle some booleans.

Here is an example of what I wrote. 

For more please check my GitHub here.

 

function serialEvent() {
  // read a line from the serial port
  inString = serial.readStringUntil('\r\n');
  console.log(inString)

  if (inString == "Play/Pause") {
    pause = !pause
    // console.log(pause)
  }
  if (inString == "Change Display Scene") {
    fluidScene = !fluidScene
  }
  if (inString == "Next Song") {
    nextSong = !nextSong
  }
  if (inString == "Last Song") {
    lastSong = !lastSong
  }
  if (inString == "Left hand detected") {
    leftHandDetected = true
  }
  if (inString == "Right hand detected") {
    rightHandDetected = true
  }
  if (inString == "Both hand detected") {
    bothHandDetected = true
  }
  if (inString == 'Volume Mode') {
    volMode = true
  }

  if (nextSong) {
    // stop the current track if it is playing, then advance with wraparound
    if (song[i].isPlaying()) {
      song[i].stop()
    }
    i = (i + 1) % song.length
    song[i].play()
    song[i].setVolume(songVol)
    nextSong = false
  }
}
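The next/last song handlers both reduce to modular index arithmetic. One detail worth noting for the previous-song case: the index has to be kept non-negative, since JavaScript's % can return a negative result. A small sketch (nextIndex/prevIndex are illustrative helper names, not from the project):

```javascript
// Modular song-index helpers; `length` stands for song.length.
function nextIndex(i, length) {
  return (i + 1) % length;
}

// The extra `+ length` keeps the result non-negative when i is 0,
// because in JavaScript (-1 % 3) evaluates to -1, not 2.
function prevIndex(i, length) {
  return (i - 1 + length) % length;
}
```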

Going through it, I did user testing with my girlfriend. Everything worked great besides these two key points:

  • The user doesn't know when a hand has been detected or when to move.

  • Not having a physical button as a fallback is inconvenient.

With the feedback I gained, I used contains() and a class to create play/pause, previous/next song, and volume bar controls. I also print out the name of the song that is currently playing.
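A minimal sketch of the kind of class this describes (the real sketch draws with p5; only the hit-test geometry is shown, and the Button name and fields are illustrative):

```javascript
// Minimal clickable button with a contains() hit test.
class Button {
  constructor(x, y, w, h, label) {
    this.x = x; this.y = y;
    this.w = w; this.h = h;
    this.label = label;
  }
  // True when (px, py) — e.g. mouseX/mouseY — falls inside the button.
  contains(px, py) {
    return px >= this.x && px <= this.x + this.w &&
           py >= this.y && py <= this.y + this.h;
  }
}
```

In p5, mousePressed() would loop over the buttons and trigger whichever one `contains(mouseX, mouseY)`.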

I use the mouseDragged() function for the volume bar. When the user releases the bar after dragging, the loop resets the volume of the currently playing song.
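The volume bar math is a sketch of mapping the drag position along the bar to a 0..1 volume (dragToVolume and the bar geometry are illustrative, not the project's exact layout):

```javascript
// Map the mouse's x position along the bar to a 0..1 volume.
function dragToVolume(mouseX, barX, barWidth) {
  const t = (mouseX - barX) / barWidth;  // 0 at the left edge, 1 at the right
  return Math.min(Math.max(t, 0), 1);    // clamp drags past either end
}
```

The result would then be passed to `song[i].setVolume(...)` when the drag ends.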
 

I also create a circular animation when the Arduino sends back "hand detected".
 

Both hand detected

Volume Mode

Right hand detected

Left hand detected
