Music Education Research

Music and Literacy: Research Update

In what has become a rapidly expanding area of research, music (auditory skills) and its effects on the brain are being studied in multiple fields, including music education, neuroscience, education, and neuropsychology. When one considers the different parts of the brain responsible for coordination, vision, audition, motor control, problem solving, emotion, and spatial awareness, music truly is “a total brain experience.” This experience must be reinforced and practiced for the connections to become permanent.

The same processes involved in speech and reading are also involved in music literacy. Phonological awareness is the understanding of sound. The ability to see a letter, produce its sound, and then combine sounds into words and sentences is not much different from a musician looking at a note, singing or playing it, and shaping notes into a musical phrase or melody. Fluency skills that involve automatic word recognition and prosody, such as expression, pitch, accents, and stress, are all necessary skills for a musician. The ability to read music automatically, in both pitch and rhythm and with correct fingerings, produces musical fluency and allows the musician to concentrate on those same expressive elements.

The process of reading music is the same as reading written words. Students must learn to decode, or translate, a symbol into sound. For instance, when reading vocal music, students must be able to decode on multiple levels at the same time, including words (lyrics), pitches (melody), rhythm, expression (accents and dynamics), and the accompaniment (piano or instrumental part).

For music activities to assist students in acquiring these foundational literacy skills, they must be musical in nature, have the specific objective of discriminating sounds, form auditory-visual associations, and involve sequences of auditory and visual stimuli. In the music classroom, students learn sound discrimination when composing, listening, performing, or responding to music. Both music and language are acoustically and functionally complex and engage high-level cognitive processes.

According to the Auditory Neuroscience Laboratory at Northwestern University:

· Musicians show neural enhancement of timbre

· Musical training generalizes to linguistic pitch

· Musicians have more robust and faithful encoding of linguistic pitch information

· Musicians show neural enhancement of pitch

· Musicians show neural enhancement of timing

· Musicians' responses to speech stimuli occur earlier than nonmusicians'

· Musicians are better at encoding not just music but also speech

· Musical experience generalizes to perceiving emotion

· Pitch, time, and timbre are components of music perception and are also keys to the expression of emotion in speech

· Musical experience changes brainstem encoding of emotional sounds

Primary Source:

Hansen, D. (2013). “What Brain Research Tells Us About Music Learning.” Presentation at the MMEA All-State Conference, Boston, MA, February 28.

Additional Sources:

· Auditory Neuroscience Laboratory, Northwestern University: www.brainvolts.northwestern.edu

· Hansen, D., Bernstorf, E., & Stuber, G. (2004). The Music and Literacy Connection. MENC.

· Gromko, J. E. (2005). Journal of Research in Music Education, 53(3), Fall 2005. Bowling Green State University.

· Miller, J., & Schwanenflugel, P. (2008). A Longitudinal Study of the Development of Reading Prosody as a Dimension of Oral Reading Fluency in Early Elementary School Children. Reading Research Quarterly, 43(4), 336-354.

· www.pbs.org

· Wolfe, P. (2001). Brain Matters. Alexandria, VA: ASCD.

Strong Correlations Between Language and Music in Research

· Working memory transfer: Gromko & Hansen (2009), Lucas & Gromko (2007)

· Language development: Kraus, Skoe, Parbery-Clark, & Ashley (2009)

· Reading (phonemic awareness): Hansen et al. (2012), Loui et al. (2011), Rubinson (2010), Stegemoller, Skoe, Nicol, Warrier, & Kraus (2008)

· Prosody (fluency): Anvari (2002), Slevc & Miyake (2006), Meyer, Elmer, & Jäncke (2012)