This week we implemented several different kinds of instruments, so the app is becoming more of a music pad than just a piano.

All instruments can be played by gesture and by keyboard.

The interface already includes several buttons that will let the user switch instruments by voice and gesture.

The buttons sit on the left-hand side of the piano.
The current idea: when the user switches instruments, the buttons disappear and a list of instruments appears in the window, with the piano shifting further to the left.
We are still not sure about this decision, though. Alternatively, the list could appear in the middle of the screen and vanish after a selection, so the piano would not need to move at all.

We are currently struggling with the voice recognition. Although it worked fine before, it has suddenly become very slow: it now takes around 20 seconds before a response comes back. We suspect the noisy classroom may be the cause.

For next time: we want to implement different kinds of rhythms, as well as working buttons for switching instruments and rhythms.