Cirencester Deer Park School - Music Technology
There are two main track types used in any DAW: MIDI tracks and audio tracks. On this page we are going to look at the main differences between them.
MIDI tracks (software instruments, external MIDI devices, MIDI loops from the library)
Audio tracks (microphones, instruments, audio loops from the library)
Recording: MIDI data
MIDI can be played into the DAW using a MIDI controller. This is most often a musical keyboard, but it could also be an electronic drum kit or a pad controller.
Editing:
As MIDI is just data, you can move individual notes and change the voicing (the sound the DAW uses to play them). You can also correct the timing of the notes using a function called quantising.
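The idea behind quantising can be sketched in a few lines of Python. This is a simplified illustration, not how any real DAW implements it: note start times (in beats) are snapped to the nearest grid position, here a quarter of a beat.

```python
# Simplified sketch of quantising: snap MIDI note start times (in beats)
# to the nearest grid position (0.25 beats = a semiquaver at 4/4).
def quantise(start_times, grid=0.25):
    return [round(t / grid) * grid for t in start_times]

# Slightly early/late performed notes pulled onto the grid:
print(quantise([0.23, 1.01, 1.52, 2.74]))  # -> [0.25, 1.0, 1.5, 2.75]
```

Because MIDI stores timing as numbers rather than sound, this kind of correction is lossless: the original audio never existed, so nothing is stretched or degraded.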
File Size:
As MIDI files are just performance data they are quite small, comparable in size to Word documents. A DAW project that contains only MIDI would still be very small and might only be a few megabytes of data.
Recording: An Audio Signal
Audio can be recorded by plugging a microphone or an instrument into the DAW. The DAW then captures the sound as detailed below.
Editing:
As audio tracks are recorded sound waves, you can't edit individual notes in the same way as you can with MIDI. You can correct the timing of sounds using flex editing or audio quantising, but this is more limited than with MIDI.
File Size:
Uncompressed audio files can be quite large: roughly 10 MB per minute of CD-quality stereo audio. A DAW project with lots of audio in it can run to gigabytes of data.
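The "10 MB per minute" figure comes straight from the numbers introduced later on this page (44,100 samples per second, 16 bits per sample, two stereo channels). A quick Python check of that arithmetic:

```python
# Uncompressed (PCM) audio size =
#   sample rate x bytes per sample x channels x seconds
def audio_size_mb(seconds, sample_rate=44_100, bit_depth=16, channels=2):
    bytes_total = sample_rate * (bit_depth // 8) * channels * seconds
    return bytes_total / 1_000_000  # convert bytes to megabytes

print(round(audio_size_mb(60), 1))  # one minute of CD-quality stereo -> 10.6 MB
```

So a three-minute stereo recording is already over 30 MB before any extra takes or tracks are added, which is why audio-heavy projects grow so quickly.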
Below is some more detailed information about MIDI and audio.
What is MIDI?
MIDI is a protocol that allows computers, musical instruments and other hardware to communicate. MIDI was first developed in the early 1980s to standardise the growing amount of digital music hardware. Manufacturers needed a simple way to make their products compatible with those of other brands, so they agreed on a standard language, which became MIDI. The finished MIDI standard was unveiled in 1983, and all manufacturers of MIDI equipment adopted it.
MIDI never transmits an actual audio signal; it is information only. That means that if a MIDI keyboard doesn't have an onboard sound source such as a synth or sampler, it won't make any sound! When you connect a MIDI controller to your DAW to play virtual instruments, you're simply feeding them real-time MIDI information.
What messages get sent between devices?
These messages include:
Note on and off: when the note was pressed and released, i.e. how long it was held down
Pitch: which note was triggered, for example C1 or G2
Velocity: how hard the note was pressed down, represented by a value from 0 to 127
Program/channel message: which instrument sound is assigned to the note; in General MIDI, piano sounds occupy the lowest program numbers and string sounds sit further up the list
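To show how little data these messages contain, here is a sketch of the raw bytes of a MIDI note-on and note-off event in Python. The status byte values (0x90 for note on, 0x80 for note off, both on channel 1) and note number 60 for middle C come from the MIDI standard; the velocity of 100 is just an example value.

```python
# A MIDI note-on message is only three bytes:
#   status byte (0x90 = note on, channel 1), note number, velocity
note_on = bytes([0x90, 60, 100])   # middle C pressed with velocity 100
note_off = bytes([0x80, 60, 0])    # middle C released

print(len(note_on))  # -> 3 bytes per event
```

Three bytes per event is why, as noted above, a whole MIDI performance is comparable in size to a text document.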
What is Audio?
Audio is sound that is captured by the DAW. This involves converting sound waves into digital information.
The red line is the sound wave. The blue bars are the snapshots (samples) that the computer takes to convert the sound wave into digital information.
To convert a sound wave into digital information the computer takes a sample of the sound at regular time intervals; how often it does this is known as the sample rate. The standard sample rate for commercial music is 44,100 Hz, which means the computer takes a snapshot of the sound wave 44,100 times every second.
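The sampling process described above can be imitated in Python: generate a pure sine-wave tone and take 44,100 measurements per second. The 440 Hz frequency (the note A above middle C) is just a convenient example tone.

```python
import math

SAMPLE_RATE = 44_100  # snapshots of the sound wave per second

def sample_sine(freq_hz, seconds):
    """Measure a sine wave's amplitude SAMPLE_RATE times per second."""
    n = int(SAMPLE_RATE * seconds)
    return [math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE) for i in range(n)]

samples = sample_sine(440, 0.01)  # just 10 milliseconds of an A tone
print(len(samples))  # -> 441 samples
```

Even a hundredth of a second of audio needs 441 numbers, compared with the 3 bytes of a whole MIDI note event, which is the root of the file-size difference discussed earlier.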
What about bit depth?
The bit depth is how much information is captured at each sample. The more data taken at each sample (every blue bar), the more detail of the sound wave's amplitude is captured. The standard bit depth is 16 bits of data per sample, which allows 65,536 different amplitude levels to be captured.
The picture above shows 16-bit depth; each box represents 1 bit of data.
24-bit depth has more bits of data per sample, which means it can represent the sound wave more accurately and gives a higher-quality representation of the sound. The downside is that it takes up more storage space and computing power when it is played back.
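The jump in accuracy from 16-bit to 24-bit comes from how the number of amplitude levels doubles with every extra bit, which a couple of lines of Python make concrete:

```python
# Each extra bit doubles the number of amplitude levels a sample can record.
for bits in (16, 24):
    print(f"{bits}-bit depth -> {2 ** bits:,} amplitude levels")

# 16-bit gives 65,536 levels; 24-bit gives 16,777,216 levels,
# 256 times finer, at the cost of 50% more data per sample.
```

This is the trade-off the paragraph above describes: 24-bit audio captures the wave far more precisely, but every sample needs more storage and processing.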