Wednesday, May 2, 2012

Final Project

GoogleDoc Matrix


                I chose my lesson plan from my inclusion class last semester. It is about teaching students in grades four through six how to play the recorder. At first, integrating technology into learning to play an instrument seemed like a challenge. It was a challenge, but I believe the technologies I chose will greatly benefit the students, not only in learning the recorder but in learning other musical instruments as well.
                The first row discusses the items we will have learned before picking up the recorder that are necessary to play it. These are basic musical foundations needed for any instrument or for basic musical training. While we are reviewing these items, we can integrate technology by using online sites that allow us to practice our knowledge. I enjoy MusicTheory.net because it allows students to quiz themselves (with both written and aural examples) on a specific unit of study. It also allows you to change the difficulty of the exercises for learners at different levels, something that is virtually impossible in the classroom without this technology. Imagine trying to give each student a different, fair assessment without wasting an immense amount of time and energy. NETS 1C applies here because it concerns using models to explore specific issues. MusicTheory.net lets the students practice at their own pace while the teacher can arrange quizzes, so this can be a student activity or a teacher-guided one.
                The second row concerns using a variety of instruments to learn musical nuances, and also playing alone or with other students. I did not include a NETS standard in this group because none is relevant, although a variety of technologies can be used to learn some of the same things. The only way for this standard to be accomplished is through playing an instrument, which in this lesson is the recorder.
                The next row is very important, as it discusses imitating and improvising melodies or patterns. The NETS standard of communicating ideas to an audience correlates directly with the improvisation standard. It is also very possible to integrate technology effectively here. The traditional way to do these activities would be for a teacher to play a melody or pattern that a student then copies, or for the teacher to play backgrounds while the student improvises over them. With new technology, however, we can hear a melody on music notation software, such as Finale, and have the students repeat it. We can even have them practice this at home by sending them the Finale file and having them decipher it. SmartMusic is another fantastic way of monitoring practice at home. This program can tell students when they are playing notes incorrectly, which would be invaluable when trying to figure out a melody by ear. Lastly, ImproVisor is a great program, with a website full of ideas, for learning how to improvise. It allows students to play a solo over a chord change played beneath them. It can even write a solo automatically, for the beginning soloist who does not know what to do during an improv section.
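                To give a sense of what "automatically written" can mean, here is a tiny sketch in Python (my own illustration, not ImproVisor's actual method; the chords and note choices are made-up examples) that generates a solo by picking notes from each chord's tones:

    import random

    # Chord changes as lists of chord tones (MIDI note numbers), one chord per bar
    changes = {
        "C":  [60, 64, 67],       # C  E  G
        "F":  [65, 69, 72],       # F  A  C
        "G7": [67, 71, 74, 77],   # G  B  D  F
    }

    def auto_solo(progression, notes_per_bar=4):
        """Pick random notes from each chord's tones -- the crudest version
        of 'write a solo for me' that a beginning soloist might lean on."""
        solo = []
        for chord in progression:
            solo.extend(random.choice(changes[chord]) for _ in range(notes_per_bar))
        return solo

    print(auto_solo(["C", "F", "G7", "C"]))  # e.g. [64, 60, 67, 64, 72, ...]

                A real program would add rhythm, phrasing, and voice leading, but even this much shows why a generated solo can sound plausible over the changes without any "musicianship" behind it.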
                The next row takes the playing from ear and adds a visual aspect to it: reading sheet music. Hopefully by this point the students will know their clefs and how to read music, but it is still very possible that they will not, or that they will struggle with it. Melodies and patterns will still be played, but from written music this time. The students will need to use their ears and eyes to follow the pattern of the music on the board. This is a teacher/student activity, because it is based around the students but needs a teacher to mediate and help out. I picked SmartMusic for this section because of its previously described ability to tell a student when they are playing notes incorrectly. In this case, it will even show them which notes are wrong.
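                I do not know exactly how SmartMusic does its checking, but the core idea can be sketched in a few lines of Python (the notes here are just an invented example): compare the pitches the program hears against the sheet music, note by note:

    # Toy sketch: flag wrong notes by comparing detected pitches to the score
    expected = ["G", "A", "B", "C", "D"]    # notes written on the page
    played   = ["G", "A", "Bb", "C", "D"]   # notes detected from the student

    for beat, (want, got) in enumerate(zip(expected, played), start=1):
        if got != want:
            print(f"Note {beat}: played {got}, expected {want}")
    # -> Note 3: played Bb, expected B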
                The next row, on Orff instruments, is similar to the second row on the proper use of basic instruments. The recorder will be demonstrated by both the teacher and YouTube videos. If the teacher can find something intriguing on YouTube for their students, the students will be more willing to play enthusiastically.
                The last row covers the playing of actual melodies and songs, in AB or ABA patterns. These are very common forms that the students need to know. As NETS states, preexisting knowledge will form the basis of this section of the lesson. This group activity with guided practice brings together everything the students have been working on. SmartMusic is again a fantastic tool. This will be the culmination of our short lesson or group of lessons, as we will be playing a full song on the recorder.
                Overall, adding technology to this lesson plan was greatly beneficial and not nearly as difficult as I first thought it would be. As long as it benefits and interests my students, I would be willing to try any of these technologies, or others suggested, for their good.

Tuesday, May 1, 2012

Musical Robots

This article describes just one of the many ways music and technology are becoming more integrated every day:
Georgia Tech Article

In summary, their Director of Music Technology has created a robot that can not only play music but improvise, playing WITH live humans. It can listen to a human player and copy and respond to their style, timbre, dynamics, etc. It began as a drummer (on a drum set) but has been reprogrammed to play the xylophone as well.

I like the article's quote: "The project is also designed to shed light on humans' cognitive and physical ability to enjoy and create music. This is one of the most unique human traits that has not been explained by science as of yet." We discuss things like this all the time in music classes: how music is so emotive, how it can represent such feelings, and the other advantages it provides outside of music. But why? What physically makes the brain respond this way? Perhaps this project is a step in the right direction toward figuring this out.

A new model of the robot will delve further into exploring the relationship between musicians (or in this case a robot and a musician) and how they interact in performance.

Other than this being a fascinating project, I am interested in what it means for music education. Is it good or bad? On one hand, we could one day model things for our students using a robot; the robot could accompany students while they are practicing (such as during improv for jazz). We could also use this robot as a cross-curricular project with science classes, either analyzing it or perhaps even building one. On the other hand, relying too much on this robot could be detrimental. Also, the robot technically cannot think; it has algorithms programmed into it that allow it to create a randomly generated response. At some point, those responses were the thoughts of a person; they are not created from nothing.
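To illustrate what I mean by a programmed, randomly generated response, here is a toy sketch in Python (my own invention, nothing like the actual Georgia Tech code): the robot's "reply" is random, but only within a rule a person wrote, such as staying near the note it just heard:

    import random

    C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # scale tones as MIDI note numbers

    def respond(heard_note, phrase_length=4):
        """Reply with a short phrase of random scale tones near the heard note."""
        nearby = [n for n in C_MAJOR if abs(n - heard_note) <= 5]
        return [random.choice(nearby) for _ in range(phrase_length)]

    print(respond(64))  # e.g. [62, 67, 64, 65] -- random, but rule-bound

Every "choice" the sketch makes traces back to the list and the rule a human typed in, which is my point about the robot not truly thinking.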

Regardless of how we view this development, it is a whole new step and extremely interesting.

Wednesday, April 11, 2012

Interactivity #5

Link to Google Spreadsheet


I interviewed a teacher from Hillsborough, NJ. She teaches ninth- through twelfth-grade band and works closely with the teacher who teaches music theory to tenth, eleventh, and twelfth graders. The teacher was not familiar with NETS for Students 2007 at all and had no knowledge of whether they were being incorporated into other classrooms in the school. Her first reaction was skepticism as to why there are separate sets for teachers and for students, since every other type of standard has only one set, with the understanding that the teacher uses it to create lessons that indirectly achieve those goals with students. She also noted that while the NETS are a good idea, technology should be integrated into the classroom, not the lesson plan integrated into the technology.

As far as she is aware, the school is not implementing NETS specifically, although it encourages the use of technology to aid the lesson. Most of the more advanced or newer technology appears in specific tech and programming classes, both at the school and in the Vo Tech program. The school system also reinforces using technology throughout the district, exposing younger students to internet basics and how to use the internet safely, as well as other common programs. These skills are reinforced and built upon as students progress through the school system.

I was not surprised by her responses. This is a very middle-class district that cannot afford expensive technology in every single one of its classes. For her personally, the most technology used in an ensemble is a CD and the stereo system. She plans on introducing SmartMusic within the next three years if the budget allows. It lets students record their practice sessions for the teacher to hear as a homework assignment, and it gives them tips on their playing, such as correcting wrong or out-of-tune notes. For her colleague's music theory class, they use several programs for ear training and theory exercises on the computer.
I feel the best way to expose teachers to the NETS is to show them how much they could benefit from using technology in the classroom. Making that a reality and having them actually do it may be difficult, though, especially for older teachers who are stubbornly set in their ways. It will also be difficult to have teachers create a lesson plan based around technology, as opposed to integrating technology into a lesson plan. Finally, it is difficult enough for teachers to plan valuable lessons while following their subject-area standards without bringing another set of parameters into it.

Wednesday, March 21, 2012

Interactivity #4



I chose to use a lesson plan on the science behind how sounds are made because it is different from most music lessons. I find the concept of sound being something you can actually see and feel interesting. For many years, we did not have the technology to see sound; now that we do, students can understand so much more. For example, when musicians are out of tune, we say there are "beats" in the music. To the ear, it sounds almost like a vibrating fan. But we now know that it looks like sound waves beating against each other because they are not in sync. I think if students understand the science behind sound, certain aspects of music will make more sense.

The difficulty in this lesson was justifying it with a specific standard. This lesson covers many of the standards and promotes a deep understanding of the general topic of sound in music, but it is difficult to pin it to a few specific ones. A well-balanced variety of strategies will engage different types of learners, causing students to think critically and learn about the topic not only from the teacher but from professional videos by scientists and musicians. The technology used in this lesson is not cutting-edge, but I do not think it has to be. The use of a variety of free online resources allows us to teach things we would never have had the knowledge or ability to share in the past. By adding an oscilloscope to this lesson, the students will be able to do the same experimenting they have just watched on a professional video. Being able to watch professionals via video and sound is essential for a budding musician. It shows what they are striving for, and it often inspires a student to continue their music career.
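For anyone curious about the numbers behind the "beats" I mentioned above, here is a small Python sketch (using NumPy, with example frequencies of my choosing): two tones at 440 Hz and 443 Hz combine into a signal whose loudness swells and fades three times per second, the difference of the two frequencies:

    import numpy as np

    sample_rate = 44100                      # samples per second
    t = np.linspace(0, 2, 2 * sample_rate)   # two seconds of time
    tone_a = np.sin(2 * np.pi * 440 * t)     # in-tune pitch
    tone_b = np.sin(2 * np.pi * 443 * t)     # slightly sharp pitch

    combined = tone_a + tone_b               # what the ear or oscilloscope receives

    # The beat rate is the difference of the frequencies: |443 - 440| = 3 Hz
    print("Beats per second:", abs(443 - 440))

Plotting the combined signal, or feeding it to the oscilloscope, shows the slow rise and fall of its envelope, which is exactly what students hear as that "vibrating fan" sound.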

Wednesday, February 29, 2012

I can't think of a creative title... (Interactivity #3)



                As far as this activity being "authentically collaborative," it was valuable to see what others in my field found. I am sure the first things we all wrote down were those we already knew about. But then we had to go searching for technologies we did not know of. I came across a lot of interesting things along the way, including some new technologies I had never heard of. I even learned a bit about some technologies I had heard of but had no experience with or exposure to.
            While this spreadsheet could be useful in the future, I feel that there are some things on it that I would not consider technology. For example, several items were websites from which music educators often order music and other supplies. I was trying to decide if this counted as technology: it is on the internet and makes it easier to order things for our students, but on the other hand, paper catalogs existed for this. So it is a convenience, not a necessity.
            There are other items, music composition software, and prototypes of technology that offer amazing opportunities in the classroom for the present and the future. Composition is being pushed in music classrooms now more than ever, and while paper and pen or improvisation have their places, software such as Finale allows students to write their compositions and hear them played back. Someday soon, our students will be able to conduct a virtual orchestra in a program run with something like a Wii remote. Keeping in mind that we may be teaching in an urban setting one day, the free digital tuners and metronomes available online are a fantastic resource. Although musicians should own these tools, we must keep in mind that our students may not have the money.

Wednesday, February 15, 2012

From Bach to Rock (Interactivity #2.5)

I could not decide which picture to use, so here are two:



Practical usage of the keyboard/synthesizer in the classroom.




"Any good music must be an innovation." -Les Baxter

From Bach to Rock (Interactivity #2)

                I would have to say that the radio network beginning in the 1920s affected the music education classroom the most in this period. Previously, the music that could be listened to in the classroom was limited: essentially vinyl recordings played for the class, or live music production. To own enough records to give students a good variety, plus a record player in each music classroom, was too expensive and could not have been very effective. Short of going to see a performance by a professional group, live music production was limited to what the students themselves produced, which is not an effective way to actively listen to a piece of music. When the radio came along, however, a much wider range of exposure occurred. Not only could more people be exposed to different genres, but these genres could expand and share more pieces with the general public. So, instead of listening to the same three records all of the time, classrooms were able to switch on the radio to whatever prerecorded or even live performance was occurring at the time. This provided wider exposure to music for the students, and they could even hear live performances they never would have been able to hear otherwise.

                In the given time period, there are several things that could be considered as having the biggest impact on music education. Some of my fellow music educators have already chosen the phonograph (obviously very important), so I have chosen a different path: the synthesizer. The earliest origins of the synthesizer were in 1896, but not as we know it today; the first semblance of one was steam-powered and weighed 200 tons! The Theremin in 1919 took the next step, but ended up becoming a completely different instrument. In the 1950s, the term "synthesizer" was used for the first time. RCA came out with one, but it was ridiculously complicated and almost impossible to play effectively. The transistor, invented in the late 1940s, would allow the synthesizer to become smaller and more portable. In 1963, Moog, the father of the modern synthesizer, combined different sound modules he was working on into one instrument, which was introduced in 1966. After refining the idea into something more accessible and cheaper, he released his Model D in 1970.

                Synthesizers began to go mainstream in the 1960s and 70s, when rock bands in particular began to experiment with the different sounds and effects that could be achieved. Synthesizers are still being improved to this day, but the largest period of growth was in the years leading into the late 1980s. Today we do not realize all of the work that went into it; we only see a keyboard that makes cool sounds when we press certain buttons.

                The reality is that music education in the general music classroom setting would be entirely different if we did not have the synthesizer. First, teaching students piano would be nearly impossible. Each student would have to take a turn on the piano or have a private lesson with the teacher. Classrooms are lucky if they have even one piano, and it is most likely not in tune. Instead, we can have classrooms with several keyboards (a term I use interchangeably here with "synthesizers") so instruction can be given to multiple students at one time. Just as Beulah Mae in A Social History of Media and Technology in Schools wanted her students to learn to read instead of copying what the teacher said, students can work at their own pace to learn piano with the keyboard. Headphones can be plugged into most keyboards so students can hear themselves better or practice in "silence." More advanced classrooms are even set up with a system so that, when everyone has headphones on, they can hear the teacher speaking or playing, the teacher can hear each individual playing, or the students can play together from opposite ends of the room. The keyboard also makes it possible for students to practice piano at home, where they most likely do not have a piano (because of cost, space, etc.), with the option of playing "silently."

                Aside from learning to play piano, the keyboard is also used for other educational activities, including singing songs, learning music theory, and even using the different settings as backgrounds for teaching improvisation. As Grandma Bessie's journal states, "The modern system of schooling is nothing short of blasphemous in its ignorance of human creativity" (p. 43). Grace even briefly mentions the advantage of using a piano in non-musical classes.