This article describes just one of the many ways music and technology are becoming more integrated every day:
Georgia Tech Article
In summary, Georgia Tech's Director of Music Technology has created a robot that can not only play music but also improvise, playing WITH live humans. It listens to a human player and mirrors and responds to their style, timbre, dynamics, and so on. It began as a drummer (on a drum set) but has since been reprogrammed to play the xylophone as well.
I like the article's quote: "The project is also designed to shed light on humans' cognitive and physical ability to enjoy and create music. This is one of the most unique human traits that has not been explained by science as of yet." We discuss ideas like this all the time in music classes: how music is so emotive, how it can represent such deep feelings, and the advantages it provides beyond music itself. But why? What physically makes the brain respond this way? Perhaps this project is a step in the right direction toward figuring that out.
A new model of the robot will delve further into the relationship between musicians (or, in this case, between a robot and a musician) and how they interact in performance.
Beyond being a fascinating project, I am interested in what this means for music education. Is it good or bad? On one hand, we could one day model techniques for our students using a robot; the robot could accompany students while they practice (for instance, during jazz improvisation). We could also use the robot in a cross-curricular project with science classes, either analyzing it or perhaps even building one. On the other hand, relying too heavily on such a robot could be detrimental. Moreover, the robot technically cannot think; it runs programmed algorithms that let it generate a quasi-random response. At some point those algorithms were the thoughts of a person, not something created from nothing.
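To make the "programmed algorithms" point concrete, here is a toy sketch of how a program might "improvise" a reply in a human's style. This is purely illustrative and is not the actual Georgia Tech system (the article does not detail its methods): it builds a simple first-order transition table from the notes a human just played, then walks that table with a random number generator to produce a response phrase. The note names and phrase are made-up examples.

```python
import random

def build_transitions(notes):
    """Map each note to the list of notes that followed it in the human's phrase."""
    transitions = {}
    for a, b in zip(notes, notes[1:]):
        transitions.setdefault(a, []).append(b)
    return transitions

def improvise(transitions, start, length, seed=0):
    """Walk the transition table to generate a response phrase of the given length."""
    rng = random.Random(seed)
    current = start
    response = []
    for _ in range(length):
        # Prefer notes that followed the current one; fall back to any known note.
        choices = transitions.get(current, list(transitions))
        current = rng.choice(choices)
        response.append(current)
    return response

# Example: the robot "listens" to a short human phrase, then answers in kind.
human_phrase = ["C4", "E4", "G4", "E4", "C4", "D4", "E4"]
table = build_transitions(human_phrase)
print(improvise(table, human_phrase[-1], 6))
```

Even this tiny example shows the point made above: the output feels responsive, but every "choice" traces back to rules a person wrote down, plus randomness.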
Regardless of how we view this development, it is a whole new step and an extremely interesting one.