
Instructors’ Manual Outline for Modern Recording Techniques, 8th Edition, by David Miles Huber

The intent of this manual is to help you design your course. You can adapt this material to suit your particular class needs and focus. I would advocate following the layout of the text as closely as possible to fit your particular course in recording or audio production. The author of Modern Recording Techniques, 8th Edition, David Miles Huber, has taken the lead provided by the original author of the text, Robert E. Runstein, to carefully guide the student and reader from introductory subjects to those much more complex – each chapter building upon the previous.

As an instructor, you may also find that there are legitimate time constraints on the content that can be taught in your given time frame. Where the academic semester length prohibits teaching the entire text, pick and choose from the more advanced chapters according to your course needs. For those of you who are teaching the contents of this book within a single semester, here is a proposed chapter coverage plan that might best suit your needs:

Week 1: Chapters 1 – Introduction, 2 – Sound and Hearing
Week 2: Chapter 3 – Studio Acoustics and Design
Week 3: Chapter 4 – Microphones: Design and Application
Week 4: Chapter 5 – The Analog Tape Recorder
Week 5: Chapters 6 – Digital Audio Technology, 7 – The Digital Audio Workstation
Week 6: Chapters 8 – Groove Tools and Techniques, 9 – MIDI and Electronic Music Technology
Week 7: Chapter 11 – Synchronization and review for the Midterm Exam
Week 8: Midterm Exam
Week 9: Chapters 12 – Amplifiers, 13 – Power- and Ground-related Issues, 17 – Monitoring
Week 10: Chapter 14 – The Art and Technology of Mixing
Week 11: Chapters 15 – Signal Processing, 16 – Noise Reduction
Week 12: Chapter 10 – Multimedia and the Web
Week 13: Chapters 19 – Mastering, 20 – Product Manufacture
Week 14: Chapters 18 – Surround Sound, 21 – Studio Tips and Tricks
Week 15: Chapter 22 – Yesterday, Today and Tomorrow and review for the Final Exam
Week 16: Final Exam

Now, a note about the presentation of material to your students using this text: to make the most of the text and its presentation, please consider where your lectures will be taught. Of course, the best environment would be a state-of-the-art studio control room outfitted with all of the latest equipment, adjacent to a recording booth stocked with the best new and vintage microphones. Add to that a laundry list of vintage outboard processing gear and recording equipment that you could bring out for “show and tell” at a moment’s notice or patch in at will. And, of course, add to this a computer connected to the web and to the monitor system in the control room, along with a projection system so all can see. However, I realize that not all of us have that luxury in our teaching environments, so for each chapter this manual provides some additional in-class possibilities.

Above all else, have fun teaching your course with this text. Keep in mind that you are training the next generation of recording engineers, producers, inventors, and gurus of all types whose thirst for knowledge about the subject of recording is insatiable. If you are having fun with the material, they will have fun too. And having fun is a great way to learn!

Chapter 1: Introduction

• Objectives
  • To introduce students to the various elements of a recording studio, such as the recording room, control room, iso booth, and machine room
  • To define the various types of recording venues and their differences – whether a commercial recording studio, personal project studio, portable studio, live sound venue, or sound-for-picture studio
  • To serve as an introduction to the recording process, from initial concept through to distribution of the final product
  • To provide a thumbnail sketch of the various roles assumed by personnel in the music and recording industry

• Lecture and Discussion Ideas
The lecture on chapter 1 should focus on the different types of recording facilities and what differentiates one from the other. It would be good to provide a brief historical overview of the development of various studio types. For example, you may want to trace the changes in professional recording studio design throughout the history of recorded audio: initially classically oriented designs that grew out of the radio industry, then the rise of independent pop music recording studios in the late sixties and into the seventies, the change from live rooms to dead rooms in the late seventies to provide isolation at the cost of real-sounding liveliness, and the return to larger, live room designs in the 1980s. Also, discuss the development and proliferation of small, personal or project rooms made possible by the cost reduction of semi-professional equipment beginning in the late 1980s.
Find out how many of your students have their own (or have access to) personal or project studios, either in their homes or band rehearsal areas. Have them describe the hardware/software setups they are currently using. You might be surprised! Following their reading of chapter 1, have students discuss in class where exactly they see themselves working in the industry. You might even break it down further by having each student describe what they would like to do initially upon graduation and then where they see themselves five years from then.

• Terminology
Acoustically “Dead” (p.6)
A/D (Analog-to-Digital) Converter (p.42) Arranger (p.20) Artist (p.20) Assistant Engineer (p.22) Audio-for-film (p.18) Auto-locator (p.9) Commercial Music Studio (p.3) Comping (p.35) Control Room (p.7) DAWs (p.8) DJ (Disc Jockey)/VJ (Video Jockey, p.23) Effects Devices (p.9) Engineer (p.22) Game Sound (p.18) Iso Booths (p.7) Large-scale Integrated (LSI) Circuit (p.1) Live/On-Location Recording (p.17) Machine Room (p.9) Maintenance Engineer (p.23) Mastering (p.39) Mastering Engineer (p.23) MIDI (Musical Instrument Digital Interface, p.1) Mixdown (p.37) Mixing (p.8) Multimedia Audio (p.19) Multitrack Production (p.29) Music Lawyer (p.25) Networking (p.28) Overdubbing (p.34) Portable Studio (p.13) Preparation (Pre-production, p.29) Producer (p.21) Product Manufacturing (p.40) Project Studio (p.11) Recording (p.30) Recording Console (or Board or Desk, p.8) Routing (p.8) Spatial Positioning (p.8) “Studio in the palm of your hand” (p.14) Studio Manager (p.24) Studio Musician (p.20) Track sheet (p.32) Transducer (p.40) Women’s Audio Mission (p.26)

• Sample Exercise (in class)
Have your students create a flowchart of the modern recording process, beginning with initial song ideas and demo creation through to tracking, overdubbing, mixing, mastering and distribution. Each of the major segments should be in boxes with arrows leading to the next step in the process. At each stage of the process, the students should indicate the types of facilities that might be utilized. You may choose to break the class into small groups to work on this, with one group working on a large-scale professional recording in a commercial recording studio while the other works on a project-studio version. Compare the results of the two at the end of class and invite feedback from the entire class.

• Examination questions: short answer
Is there a “perfect” recording studio design? Why?
Why was it determined that acoustically “dead” recording spaces and control rooms were not necessarily good for audio recording purposes?
What is the purpose of a dedicated “machine room” in a professional recording facility?
Briefly describe the overdubbing process. What device(s) have made this practice a possibility?
What factors have contributed to the growth of audio for multimedia and sound for picture?
Define the concept of a transducer. How is this device utilized in recording? Name one example of a transducer.
The person often hired to guide an artist throughout the recording process is the:
a. Engineer  b. Manager  c. Producer  d. Mastering Engineer  e. Assistant Engineer
The person who tweaks aspects of the final recording including level, equalization and dynamics is the:
a. Engineer  b. Manager  c. Producer  d. Mastering Engineer  e. Assistant Engineer
The person responsible for all session documentation and notes is the:
a. Engineer  b. Manager  c. Producer  d. Mastering Engineer  e. Assistant Engineer
The device that converts one form of energy into another is called:
a. LSI  b. DAW  c. Transducer  d. Autolocator
The device that allows for digital audio recording and manipulation is:
a. LSI  b. DAW  c. Transducer  d. Autolocator

• Sample Assignment (out of class)
Have your students locate a recording facility on the web that they think would match up with their career intentions in audio engineering. You could have them copy some of the web pages from that facility and prepare a short paragraph indicating why they think it would be a good match for their career expectations.

• List of further reading/resources
Modern Recording Techniques Website: www.modrec.com
Women’s Audio Mission: www.womensaudiomission.org
The Recording Academy: www.grammy.org
Grammy U (for students): www.grammy.org/recording-academy/grammy-u
Audio Engineering Society (AES): www.aes.org
AES Education (for students): www.aes.org/education/

Chapter 2: Sound and Hearing

• Objectives
  • To describe and illustrate the basic physical attributes of sound
  • To define the decibel and demonstrate the relationship between various decibel levels and audio-related quantities
  • To show students how the ear converts sound into neural information
  • To describe psycho-acoustic aspects of human hearing

• Lecture and Discussion Ideas
The lecture(s) on chapter 2 should focus on the physics of sound and the perception of hearing. This is potentially the most math-intensive chapter in the book, and it is important that your students have at least a rudimentary grasp of the formulas used for measuring sound. As you will see from the in-class exercises below, the mathematical concepts are stressed. In your discussion of the concepts covered in chapter 2, it will be important to relate the physical concepts or theories to real-world perception. Try to provide examples of the various concepts covered, either by bringing audible demonstrations to class for playback or by using the materials provided in the Audio Tutorial section of www.modrec.com. Most current DAW systems have integrated signal generators and metering tools that can be used to demonstrate the concepts of amplitude, frequency, and phase in class. Visual representations of waveforms moving with respect to frequency, amplitude and phase are good tools for learning these concepts.

Another useful tool for demonstrating acoustic waveforms, both simple and complex, is some form of synthesizer or even a simple tone generator. Hook the audio outputs of the device to both the classroom amplifier and the visual metering inputs. This way, students will be able to both see and hear the changes taking place in the waveform as it is manipulated. This may take a little pre-class prep, but the results are worth it.
When discussing the ear and hearing, it is important to stress the potential for permanent hearing damage if a student is exposed to loud sound levels. Huber covers this both with respect to studio monitoring and live sound venues. Point your students to the House Research Institute website at www.hei.org for more information.
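If a hardware generator is not handy, the same frequency, amplitude, and phase relationships (and the velocity and wavelength questions that appear later in this chapter) can be demonstrated from a short script. This is a minimal sketch of my own, assuming Python with NumPy is available; the temperature coefficient is the common 1.1 ft/sec-per-degree approximation, so check it against the formula the text uses.

```python
# Minimal waveform demo: amplitude, frequency, and phase (values are arbitrary).
import numpy as np

sample_rate = 48000                       # samples per second
t = np.arange(0, 0.02, 1 / sample_rate)   # 20 ms worth of time points

def sine(amplitude, freq_hz, phase_deg):
    """Return a sine wave with the given amplitude, frequency, and phase."""
    return amplitude * np.sin(2 * np.pi * freq_hz * t + np.radians(phase_deg))

a = sine(1.0, 440, 0)      # reference tone
b = sine(1.0, 440, 180)    # same tone, 180 degrees out of phase
c = sine(0.5, 880, 0)      # second harmonic at half the amplitude

print("a + b peaks at", round(np.max(np.abs(a + b)), 3))  # ~0: full cancellation
print("a + c peaks at", round(np.max(np.abs(a + c)), 3))  # a complex wave results

# Speed of sound and wavelength, using the common approximation of roughly
# 1,130 ft/sec near 72 degrees F, rising about 1.1 ft/sec per degree F.
temp_f = 77
velocity = 1130 + 1.1 * (temp_f - 72)     # ft/sec (approximate)
print("velocity at 77 F:", round(velocity, 1), "ft/sec")
print("wavelength of a 50 Hz wave:", round(velocity / 50, 1), "ft")
```

Projecting the three arrays with any plotting tool while changing the arguments gives the same see-and-hear effect as the hardware setup described above.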

• Terminology Acoustic Trauma (p.65) Amplitude (p.45) Attack (p.58) Beats (p.68) Binaural Localization (p.69) Cochlea (p.64) Combination Tones (p.68) Complex Wave (p.56) Compression (p.44) Cycle (p.47) Decay (p.58) Decay Time (or Reverb Time, p.73) Decibel (p.59) Diffraction of sound (p.49) Direct Sound (p.72) Eardrum (p.64) Early Reflections (p.72) Effects of the Pinnae (p.71) Envelope (p.58) Equal-Loudness Contour Curves (Fletcher-Munson Curves, p.66) Flat Frequency Response Curve (p.50) Frequency (p.47) Frequency Response Curve (p.50) Hammer, Anvil and Stirrup (p.64) Harmonic (p.54) Harmonic Content (p.54) Hertz (p.47) In Phase (p.53) Interaural Arrival-time Differences (p.70) Interaural Intensity Differences (p.70) Logarithm (p.60) Masking (p.69) Out of Phase (p.51) Overtones (p.54)

Panning (p.71) Partials (p.54) Peak Amplitude (p.46) Peak-to-Peak Value (p.46) Period of a Wave (p.48) Permanent Threshold Shift (p.65) Phase (p.51) Phase shift (p.53) Pinna (p.64) Psychoacoustics (p.66) Power (p.62) Rarefaction (p.44) Reflection of Sound (p.48) Release (p.58) Reverberation (p.73) RMS Voltage (p.46) Sawtooth Wave (p.56) Simple Wave (p.56) Sine Wave (p.54) Sound-Pressure Level (p.61) Sound-Pressure Wave (p.43) Square Wave (p.56) Sustain (p.58) Temporal Fusion (p.72) Temporary Threshold Shift (p.65) Threshold of Feeling (p.65) Threshold of Hearing (p.64) Threshold of Pain (p.65) Timbre (p.56) Triangle Wave (p.56) Velocity (p.48) Voltage (p.62) Waveform (p.44) Wavelength (p.48) Wave Propagation (p.44)

• Sample Exercise (in class)
When covering this chapter in class, try to provide as many visual correlations of the aural principles being discussed as possible. As mentioned in chapter 1, a signal generator and metering plug-ins can be used to good effect. Here are some other ideas for discussing the ear and hearing: Bring some cardboard tubes, such as those used for gift-wrapping, to class. After splitting your class into smaller groups, have them experiment with the changes that occur in their hearing by holding the tubes up to the sides of their heads at the pinna. This will demonstrate the effect of the pinnae on hearing and produce some odd results! This effect can also be achieved by having students cup their hands behind their ears. Have the students attempt to describe the timbre change that they perceive as music is played, encouraging them to move their hands periodically to change the timbre. Additionally, you might take a short “field trip” around your educational institution and demonstrate the behavior of sound in a variety of acoustic environments. Especially useful for demonstration are areas with dense reverberation, such as an auditorium, and any large acoustically untreated room with flutter echoes. Students will more easily hear acoustic phenomena in spaces such as these. A good tool for demonstrating the behavior of sound is a toy clicker, or simply clap your hands. This exercise can be repeated when covering recording studio acoustics.

• Examination questions:
How are sound pressure waves transmitted through air?
When one sound differs from another even though both are at the same amplitude and frequency, what is the cause of the difference?
What is the difference between “peak” and “RMS” amplitude level measurements? Express the difference between peak amplitude and RMS amplitude mathematically:
Why are there different decibel scales? Express each decibel scale mathematically:
What is the velocity of sound, measured in ft/sec, when the temperature is 77 degrees Fahrenheit? Show your work:
What is the wavelength of a 50 Hz sine wave? Show your work:
Why might it be important to be careful when placing multiple microphones around the same sound source (e.g., a guitar)?
The range of human hearing is:
a. 20 – 2,000 Hz  b. 200 – 20,000 Hz  c. 20 Hz – 20 kHz  d. 20 Hz – 200 kHz
When two identical waveforms combine in phase the result is:
a. Cancellation  b. Addition  c. A complex waveform  d. No change at all
When two identical waveforms combine out of phase the result is:
a. Cancellation  b. Addition  c. A complex waveform  d. No change at all
Whole-number multiples of a fundamental frequency are termed:
a. overtones  b. timbre  c. odd harmonics  d. octaves
Which of the following does the ear perceive as harsh or distorted?
a. overtones  b. timbre  c. odd harmonics  d. octaves

• List of further reading/resources
Modern Recording Techniques: www.modrec.com
House Research Institute (formerly the House Ear Institute): www.hei.org

• List of key people from the text
Heinrich Hertz, Alexander Graham Bell, Harvey Fletcher and Wilden Munson

Chapter 3: Studio Acoustics and Design

• Objectives
  • To discuss the acoustic principles and design considerations for recording studio spaces
  • To discuss design principles specific to control room acoustics, such as acoustic isolation, symmetry, and frequency balance
  • To provide a practical overview of the architectural and acoustic principles that govern room reflection, absorption and reverberation
  • To include cost factors for design approaches regardless of the project’s scope

• Lecture and Discussion Ideas
It was a good idea to have the chapter on studio design and acoustics follow immediately after the chapter on sound and hearing. Though not as mathematically challenging as the concepts presented in chapter 2, chapter 3 does bring additional mathematical concepts to bear. At this point it might be a good idea to remind your students that although this material may seem a little dry for now, grasping the concepts presented in both chapters 2 and 3 is extremely important before they proceed to the potentially more immediately gratifying chapters on recording equipment and techniques ahead.
In your lecture on chapter 3, you should begin by reviewing the various studio types initially discussed in chapter 1. Each studio type will have its own special set of variables to address with respect to studio design and construction. However, it must be emphasized that no matter what the scale, the principles of acoustics remain the same for all designs and applications. This may prove to be an especially interesting chapter and lecture discussion for those students who are already attempting to do their own recording at home or in their band practice spaces. They have probably already found, to their chagrin, that the sounds they are getting are not what they intended, or they may have learned the hard way that they need additional isolation from the neighbors. Your discussion of acoustic sound control and isolation will be extremely valuable to them.
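When reverberation time comes up, a quick calculated example helps tie the math to a real room. Below is a minimal sketch of my own using Sabine’s classic RT60 approximation in imperial units; the absorption coefficients are rough illustrative guesses, so substitute the values and the formula the text itself provides.

```python
# Rough Sabine reverberation-time estimate: RT60 = 0.049 * V / A (imperial
# units: V in cubic feet, A = total absorption in sabins). Coefficients assumed.
def rt60_sabine(volume_cu_ft, surfaces):
    """surfaces: list of (area_sq_ft, absorption_coefficient) pairs."""
    total_absorption = sum(area * coeff for area, coeff in surfaces)  # sabins
    return 0.049 * volume_cu_ft / total_absorption

# Hypothetical 25 x 20 x 10 ft room with drywall walls, a hard ceiling,
# and a carpeted floor (all coefficients are ballpark numbers).
volume = 25 * 20 * 10
surfaces = [
    (2 * (25 * 10) + 2 * (20 * 10), 0.10),  # four walls
    (25 * 20, 0.05),                        # ceiling
    (25 * 20, 0.30),                        # carpeted floor
]
print(f"Estimated RT60: {rt60_sabine(volume, surfaces):.2f} seconds")
```

Swapping the carpet for a reflective floor (a coefficient closer to 0.02) roughly doubles the estimate, which makes a nice before/after discussion of why treatment choices matter.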

• Terminology Absorption (p.100) Absorption coefficient (p.100) Acoustic echo chamber (p.107) Acoustic isolation (p.75) Acoustics (p.75) Acoustic separation (p.75) Anechoic (p.107) Angle of incidence/reflection (p.97) Audio-for-visual production environment (p.77) Bass traps (p.103) Dead (p.98) Diffuser (p.98) Direct signal (p.106) Early reflection (p.106) Floating construction (p.84) Flutter (slap) echo (p.98) Frequency balance (p.75) Functional trap (p.105) Gobos/flats (p.92) Imaging (p.92) Isolation room/booth (p.90) Live (p.98) Live end/dead end (p.98) Partition (p.92) Pressure-zone trap (p.104) Professional recording studio (p.76) Project studio (p.78) Quarter wavelength trap (p.104) Reverberation (p.75) Riser (p.86) Room modes (p.97)

RT60 (p.107) Soffit (p.83) Sound lock (p.89) Standing waves (p.97) Transmission loss (p.81) Vocal booths (p.90)

• Sample Exercise (in class)
Divide your students into groups and, using the text as a guide, give them 10–15 minutes to produce a simplified studio design. You might provide some parameters for the studio location, such as proximity to freeways, subways, airports, residences, etc. This will help the students think not only about the sound quality produced in the studio, but also about sound penetrating into the studio and transmitting out of it. Collect the ideas and discuss the results. If your classes are located in studio facilities, give a short tour to demonstrate the various acoustic principles discussed in the chapter. As in the class exercise provided in the last chapter, you might want to bring a toy clicker to quickly demonstrate the acoustic properties of the various rooms, although hand claps work well also.

• Examination questions:
What is the definition of acoustics?
Determine the transmission loss (TL) through 12” concrete (hint: you will not need a calculator). Show your work:
Assuming that all standard building materials have some resonant frequency, what might be a good idea when constructing walls in studio spaces?
How might you go about treating the floor of a studio space to prevent sound generated inside from bleeding into other adjacent spaces?
The amount of time it takes a sound to decay by 60 dB in an acoustic environment is termed:
a. RT60  b. SPL  c. RF  d. Live End/Dead End
True or False: In recording studio construction it is a good idea to have all surface layers match up exactly end to end:
a. True  b. False
True or False: In recording studio design it is a good idea to have walls that are not parallel to one another:
a. True  b. False
The structure at the front of the control room that normally houses such items as large monitors and video monitors is termed a:
a. bass trap  b. soffit  c. reverb chamber  d. gobo
The acoustic structure designed to absorb low-frequency energy is termed a:
a. bass trap  b. soffit  c. reverb chamber  d. gobo

• Sample Assignment (out of class) Have your students do a web search for studio designers or acousticians and have them turn in their favorite search result with a listing of the company’s studio design credits. A few designers are listed below, but a search with keywords like recording+studio+design should provide a wide variety of sources.

• List of further reading/resources
Russ Berger Design Group: www.rbdg.com/home/
Primacoustic Acoustical Materials: www.primacoustic.com
Argosy Studio Furniture: www.argosyconsole.com
Auralex Acoustics: www.auralex.com
Acoustics 101 (by Auralex): http://www.acoustics101.com
Tubetrap Acoustical Materials: www.tubetrap.com

Chapter 4: Microphones: Design and Application

• Objectives
  • Discuss the basic microphone capsule designs, including moving-coil, ribbon, and condenser microphones
  • Describe the basic technical characteristics of microphones, including transient response, frequency response, and output characteristics
  • Illustrate the basic microphone polar patterns
  • Demonstrate the most commonly used single and stereo microphone placement techniques
  • Describe typical microphone placement techniques used when recording common instruments such as piano, guitar, drums, voice, horns, etc.

• Lecture and Discussion Ideas

Chapter 4 of Modern Recording Techniques is a rather extensive chapter on the subject of microphones. It is likely that you will need more than one lecture to cover all of the relevant material. Visual aids used throughout your lectures will prove very worthwhile in describing the various microphone transduction principles and polar response characteristics. For this reason it would be good to make overheads or create a slide show for projection in class. Start by describing transducers and then dive into the three major transducer types described in the text. Next, describe the major polar response patterns defined in the text. You may also wish to consult the web pages of various microphone manufacturers (several are listed below under resources) to get descriptions of their microphone designs and polar responses. If you are able to display web pages in the classroom, so much the better. An in-class exercise is described below which may further aid your students in their understanding of microphone designs and polar patterns.
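If projection is available, the standard first-order polar equations can also be computed or plotted live, which makes front-to-back discrimination very concrete. The sketch below is my own illustration using the commonly cited textbook pattern coefficients (it is not drawn from the chapter):

```python
# First-order polar response: sensitivity = A + B*cos(theta), with A + B = 1.
# The coefficient pairs below are the usual textbook values (assumed here).
import math

PATTERNS = {
    "omnidirectional": (1.00, 0.00),
    "cardioid":        (0.50, 0.50),
    "supercardioid":   (0.37, 0.63),
    "hypercardioid":   (0.25, 0.75),
    "bidirectional":   (0.00, 1.00),
}

def sensitivity_db(pattern, angle_deg):
    """Relative level (dB) of a pattern at a given angle off-axis."""
    a, b = PATTERNS[pattern]
    s = abs(a + b * math.cos(math.radians(angle_deg)))
    return 20 * math.log10(s) if s > 1e-6 else float("-inf")

for name in PATTERNS:
    print(f"{name:>16}: 90 deg = {sensitivity_db(name, 90):6.1f} dB, "
          f"180 deg = {sensitivity_db(name, 180):6.1f} dB")
```

A cardioid, for example, comes out 6 dB down at 90 degrees and (ideally) fully rejects sound arriving from 180 degrees, which is exactly the walk-around behavior students will hear in the demonstration exercise below.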

• Terminology
3:1 Distance Rule (p.134) Accent Miking (p.137) Ambient Miking (p.138) Attenuation Pad (p.124) Balanced Line (p.124) Bi-directional (p.117) Blumlein (p.139) Boundary Microphone (p.132) Capacitor (p.113) Cardioid (p.117) Close Miking (p.133) Condenser Microphone (p.113) Decca Tree (p.142) Diaphragm (p.110) Direct Injection (DI, p.143) Directional (p.117) Directional Response/Polar Pattern (p.116) Distant Miking (p.130) Dynamic Microphone (p.110) Electret Condenser (p.116) Electromagnetic Induction (p.110) Electrostatic principle (p.113) Equivalent Noise Rating/Self-noise (p.124) Frequency Response Curve (p.121) Front-to-back Discrimination (p.120) Hypercardioid (p.117) Leakage (p.133) Microphone (or Mic, p.109) Microphone Preamplifier (p.128) M/S (Mid-Side, p.140) Monaural Compatibility (p.141) Omnidirectional (p.117)

Output Characteristics (p.124) Output Impedance (p.125) Overload Characteristics (p.124) Phantom Power (p.114) Plosive (p.166) Pressure-gradient (p.117) Proximity Effect (p.122) Ribbon Microphone (p.111) Rumble (p.122) Reamping (p.144) Sensitivity Rating (p.124) Shock Mount (p.122) Sibilance (p.165) Spaced Pair (p.139) Stereo Miking Techniques (p.139) Supercardioid (p.117) Surround Miking (p.142) Transducer (p.109) Transient Response (p.123) Unbalanced Line (p.124) Voice Coil (p.110) Working Distance (p.130) XLR connector (p.127) X/Y (p.139)

• Sample Exercise (in class) If you have access to a variety of microphones (and an amplification system of some type) in your lecture area, bring the mics to class to demonstrate the various microphone transduction principles discussed in the chapter. You can also demonstrate polar response patterns by performing a “walk around” test with the mic on a stand. While walking around the mic, either state the same thing over and over, or use some portable type of sound generating device such as a pitch pipe or electronic tuner. Your students will not only become acquainted with microphone operation and response characteristics, but will also become more familiar with the various microphone makes and model numbers used in the demonstration.

• Examination questions:
Provide a brief definition of a microphone:
List the three types of microphone transducers:
Broadly speaking, what directional pickup patterns might a microphone exhibit?
Briefly describe why directional microphones exhibit an increase in bass response:
What is the function of a Direct Injection box?
A ribbon microphone works on the principle of:
a. electromagnetic induction  b. capacitance  c. electrostatic principle  d. voice coils
A condenser microphone works on the principle of:
a. electromagnetic induction  b. capacitance  c. electrostatic principle  d. voice coils
Condensers might be thought of as:
a. voice coils  b. variable capacitors  c. low impedance  d. low performance
A boost in the low-frequency pickup of a directional microphone is referred to as:
a. proximity effect  b. discrimination  c. rear entry ports  d. cardioid
Directional microphones can be designed to reduce acoustic signals arriving from the rear of the mic by virtue of:
a. proximity effect  b. discrimination  c. rear entry ports  d. cardioid

• Sample Assignment (out of class) Have your students perform the “Do-It-Yourself” tutorial exercises from the “Mic Types” section of the www.modrec.com web page.

• List of further reading/resources
Shure Microphones: www.shure.com/microphones/
Neumann Microphones: www.neumann.com/
Electro-Voice Microphones: www.electrovoice.com/
Audio-Technica Microphones: www.audio-technica.com
AKG Microphones: www.akg.com/
Royer Ribbon Microphones: http://www.royerlabs.com/
Beyer Dynamic Microphones: http://www.beyerdynamic.de/international/
Earthworks Condenser Microphones: www.earthworksaudio.com/
Telefunken Condenser Microphones: http://www.telefunkenusa.com/

Chapter 5: The Analog Tape Recorder

• Objectives
  • Provide an overview of analog recording technology
  • Give a brief description of analog vs. digital recording processes
  • Discuss the analog magnetic tape transport mechanism and the function of magnetic tape heads
  • Describe analog magnetic tape machine calibration parameters such as bias, equalization, and head alignment
  • Explain strategies for storing, archiving, and backing up analog tapes

• Lecture and Discussion Ideas
Please, please, don’t disregard this chapter at face value! As much as we, as recording engineers, have embraced digital technology and use it daily, I feel that this is an extremely important chapter. Given my druthers (and budget) I would continue to track at least drums and bass guitar to 2” 24- or 16-track on some immaculately aligned machine at 15 IPS. Regardless of sample rate or bit depth, nothing in the digital domain sounds as good on the foundation of a groove. Nothing in the digital domain has provided me with the punch of analog. Sure, I’ll bounce to some other format for overdubs, but I may even mix from the original 2”!
As I am writing this, the last manufacturer of analog tape has just signaled that it is going out of business, so you’ll pardon my tear-stained pages here. However, there are concerned organizations, such as the Society of Professional Audio Recording Services (SPARS), that are committed to the continued manufacture of analog tape. A company based in the Netherlands, Recordable Media Group International, purchased all of the remaining stock of BASF/Emtec analog recording tape and the equipment to manufacture it, so analog recording does indeed remain viable.
An important side note: if your students graduate from your institution with a good, working knowledge of analog recording and alignment, they will be sought after by freaks like me who own the last bastion of the “Rebellion”. They will be considered Jedis in training. So, if possible, try to keep analog tape in your curricula. OK, I’ll jump off the soapbox and let you get into the chapter.

• Terminology Analog Tape Recorder (ATR, p.179) Bias Current (p.188) Cue Point (p.186) Degaussing (p.195)

Domain (p.181) Dump Edit (p.185) Erase Head (p.185) Equalization (p.187) Flux (p.188) Head Gap (p.186) Input/Source Monitoring Mode (p.188) Magnetic Tape Media (p.180) Modulation Noise (p.194) Playback Head (p.185) Print-through (p.192) Punch-in/Punch-out (p.189) Record Head (p.185) Reproduce Monitoring Mode (p.188) Separation Loss (p.195) Shuttle Control (p.185) Supply Reel (p.193) Sync Monitoring Mode (p.188) Tails-out (p.193) Take-up Reel (p.193) Tape Counter (p.185) Tape Transport (p.182) Total Transport Logic (TTL, p.184) Vari-speed (p.185)

• Sample Exercise (in class)
If willing and able, you might actually demonstrate the use of an analog tape machine for the class. Put the machine through its paces, recording and playing back. Then, if you are adventurous, demonstrate the process of analog tape machine alignment. Since this is becoming less and less of a possibility, you can also show video tapes or DVDs that demonstrate the steps required. Here’s another idea: let’s say that your class meets twice a week. In the first meeting on analog recording, have the class break up into groups and discuss favorite recordings. Following their discussion, ask them as a group to find out before the second meeting how their favorite recordings were made: analog or digital (or a combination of both). In the second meeting, have each group detail how the recording was made (what studio, format, etc. – all available in general terms at www.allmusic.com or at the various webzines for the group itself). Ask each group if they have heard differences between analog and digital recordings of the group they have selected. This might provoke some interesting dialog.

• Examination questions:
What is the function of bias current in analog recording?
Briefly describe what is meant by “sel-sync” or sync mode on an ATR:
The smallest known permanent magnets are referred to as:
a. PVC  b. domains  c. oxide  d. asperity
Print-through might occur if an analog tape is:
a. Wound tails out  b. Wound tails in  c. Wound heads on  d. Wound heads off
Tails out:
a. Refers to the tape saturation point  b. Refers to the tape being rewound to the beginning  c. Refers to the tape being fast-forwarded to the end  d. Refers to the tape being stored in a temperature-controlled environment

• List of further reading/resources
Society of Professional Audio Recording Services: http://www.spars.com/
RMG International (magnetic tape manufacturer): http://www.rmgi.eu/
Producers & Engineers Wing of the National Academy of Recording Arts & Sciences: http://www.grammy.com/Recording_Academy/Producers_And_Engineers/

Chapter 6: Digital Audio Technology

• Objectives
  • Provide a thorough examination of digital audio technology, including binary systems and pulse-code modulation
  • Detail the digital recording and reproduction process, including basic parameters of each step such as sample rate, bit depth, and quantization
  • Examine digital audio transmission standards
  • Provide an introduction to digital audio recording systems such as samplers and hard-disk recorders

• Lecture and Discussion Ideas
Obviously this is an important chapter, considering the exponential growth of audio devices utilizing digital processors. From the cell phone to multi-track audio recorders, digital conversion rules the day. And, as sadly noted in the last chapter, analog recording is becoming more and more just a vague memory. In your lecture(s) on this chapter it will be important to describe the full analog-to-digital conversion process and the corollary digital-to-analog conversion. I am a big fan of the graphics used throughout the text. If you have the capability, be sure to display the graphics on digital subjects such as conversion, quantization, aliasing, and dither while you are describing them in class. Showing the graphics supplied will help to solidify the concepts you are teaching. This chapter can get a bit math-heavy, but virtually everything in the world now relies upon digital technology, so do not be afraid to impress the mathematical concepts of digital audio upon your students. Do not overlook the sections of this chapter on digital interfaces. Your students will be (or are) faced with the interconnection of digital devices on a daily basis.
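While you walk through the conversion graphics, a tiny script can make sample rate and bit depth tangible by showing the quantization error directly. This is a minimal sketch of my own (Python with NumPy assumed); the parameters are deliberately coarse for demonstration and are not taken from the text.

```python
# Sample a 440 Hz tone, quantize it to a small bit depth, and measure the
# signal-to-error ratio. Note the rule of thumb of roughly 6 dB per bit.
import numpy as np

sample_rate = 8000                        # deliberately low (discuss aliasing)
bit_depth = 4                             # deliberately coarse (audible error)
t = np.arange(0, 0.01, 1 / sample_rate)
signal = np.sin(2 * np.pi * 440 * t)      # full-scale test tone

steps = 2 ** (bit_depth - 1) - 1          # quantization steps per polarity
quantized = np.round(signal * steps) / steps

error = signal - quantized
ser_db = 10 * np.log10(np.mean(signal ** 2) / np.mean(error ** 2))
print(f"{bit_depth}-bit signal-to-error ratio: {ser_db:.1f} dB")
```

Re-running it at 8 or 16 bits shows the signal-to-error ratio climbing by about 6 dB for every added bit, which is a nice bridge into why dither and longer word lengths matter.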

• Terminology
A/D Converter (p.207) ADAT Lightpipe (p.218) AES/EBU (p.215) Aliasing (p.206) Anti-aliasing Filter (p.206) Binary Number System (p.199) Bit (p.203) Bit Truncation (p.208) BNC connectors (p.223) Brick Wall Filter (p.206) D/A Converter (p.207) D/A Conversion (p.212) D-sub Connector (p.219) Data Coding (p.210) Digital Signal Processing (DSP, p.226) Dither (p.208) Error Correction (p.211) Flash memory (p.227) Hard-disk Recording (p.226) Helical Scan (p.229) Interleaving (p.211) Jitter (p.220) Least Significant Bit (LSB, p.204) MADI (p.217) Non-destructive Editing (p.226) Nyquist Theorem (p.205) Oversampling (p.207) Parity Bit (p.211) Pulse-code Modulation (p.211) Quantization (p.203)

Random-access Editing (p.226) RCA connectors (p.223) Rotating Head Technology (p.228) Sample-and-hold Circuit (p.209) Sampler (p.225) Sampling (p.201) Sampling Rate (p.202) SCMS (p.217) Signal-to-error Ratio (p.207) S/PDIF (p.216) TDIF (p.219) Toslink (p.216) Wordclock (p.222)

• Sample Exercise (in class) For this class it would be appropriate to demonstrate the functionality of a digital audio recorder and put it through its paces. If available, I would suggest demonstrating an older DAT, DTRS or ADAT machine. They are all still around in abundance and your students are sure to run across them. Should you demonstrate an ADAT or DTRS machine, walk your students through the tape formatting process – covering aspects such as sample rate and bit rate, digital black, and time code striping (if available on the machine used). Also, if two or more machines are available demonstrate digital transmission standards for digital audio copying via either AES/EBU or S/PDIF/Toslink.

• Examination questions:
How many values are described in a binary system?
In digital audio, what are the digital corollaries to the analog measurements of amplitude and frequency?
What is the benefit of increasing the sampling rate in digital audio?
Provide a brief description of the Nyquist theorem and how it relates to digital audio:
What are some methods used to reduce the effects of error in a digital audio system?
Which of the following is not a digital audio transmission specification?
a. AES/EBU  b. SMPTE  c. S/PDIF  d. mLAN
What is the function of SCMS?
a. Provides better digital audio transmission  b. Provides better analog-to-digital conversion  c. Filters unwanted aliasing frequencies  d. Provides a means for digital copy protection
Time-based errors in a digital audio system are known as:
a. Jitter  b. Dither  c. Quantization  d. Latency
Distribution of a common digital reference pulse is termed:
a. TDM  b. Word Clock  c. “Y” cabling  d. MADI

• List of further reading/resources
www.modrec.com
www.digido.com

Chapter 7: The Digital Audio Workstation

• Objectives
  • Provide a thorough examination of digital audio workstations, including basic practices used for mixing and editing
  • Examine the hardware and software platforms currently used for digital audio workstation technology
  • Discuss the technical aspects of the various forms of digital interconnectivity used, such as USB, FireWire, and Thunderbolt
  • Describe common sound file formats and DAW compatibility protocols
  • Explore currently available software plug-ins for DAWs, including the various plug-in formats available
  • Examine control and timing possibilities afforded among multiple software applications via ReWire
  • Provide the various computer hardware parameters that impact DAW performance, such as processing speed, RAM, etc., as well as methods for maximizing DAW workflow in the studio

• Lecture and Discussion Ideas
Students will likely be very interested in the material from this chapter. The more interactive the presentation of the chapter content, the greater the impact it will make on students. If the class is being taught in a studio environment, begin with a basic overview of the components that make up the DAW system and how they are connected. If you have access to a projector and multiple software platforms on your computer, switch between the platforms during your presentation to show the commonalities between the programs. Instead of just discussing the non-destructive editing process, have a session file prepared to use for demonstration purposes. When demonstrating plug-in effects, it’s best to select material that lends itself to fairly dramatic processing, so that it will be easier for students to hear the changes being made. When discussing computer hardware such as RAM, hard drives, and processors, it is often helpful to open a computer case and physically identify the components in question. It may surprise you how many students have never seen the inside of a computer, and doing so allows them to better visualize the concepts being discussed.
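If it helps to show what the most common edit operations actually do to sample data while you demonstrate them in the DAW, here is a tiny sketch of my own (Python with NumPy assumed). It is a whiteboard illustration of gain changing, peak normalization, and a linear cross-fade, not any particular DAW’s implementation.

```python
# What a few basic DAW edits do to the underlying samples.
import numpy as np

def gain(region, db):
    """Gain change: scale the region by a decibel amount."""
    return region * 10 ** (db / 20.0)

def normalize(region, peak=1.0):
    """Peak normalization: scale so the loudest sample hits the target peak."""
    return region * (peak / np.max(np.abs(region)))

def crossfade(region_a, region_b, fade_len):
    """Linear cross-fade: fade region A out while fading region B in."""
    ramp = np.linspace(0.0, 1.0, fade_len)
    mixed = region_a[-fade_len:] * (1 - ramp) + region_b[:fade_len] * ramp
    return np.concatenate([region_a[:-fade_len], mixed, region_b[fade_len:]])

a = np.random.uniform(-0.5, 0.5, 1000)   # stand-ins for two audio regions
b = np.random.uniform(-0.5, 0.5, 1000)
edit = crossfade(normalize(gain(a, -3.0)), b, fade_len=200)
print("edited region length:", len(edit))   # 1000 + 1000 - 200 samples
```

The point worth stressing is that in a non-destructive editor none of this math is applied to the sound file on disk; the DAW stores the edit decisions and renders them on playback or export.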

• Terminology Accelerator Card (p.277) Advanced Authoring Format (AAF, p.255) AIFF (p.254) ASIO Driver (p.250) Audio Driver Protocol (p.250) AU Format (p.274) Audio Interface (p.248) AudioSuite Format (p.276) Automatic Power-saving (p.242) Automation (p.235) BIOS (p.241) Broadcast Wave (p.254) Boot Camp (p.237) Boot Drive (p.284) Bus-power (p.245) Comp (p.265) CoreAudio Driver (p.250) Cross-fade (p.263) DAW Controller (p.251) DAW Guidelines: Producers and Engineers Wing of NARAS (p.287) Daisy-chaining (p.245) Delay (p.272) Digital Audio Workstation (DAW, p.233) Digital Mixer Interface (p.268) DirectX Format (p.274) DSP Effects (p.269)

Dynamic Range Processor (p.270) Equalization (p.270) Export (p.279) Fading (p.262) FireWire (p.246) Gain Changing (p.262) Graphic Editing (p.258) Hot-pluggable (p.245) Interleaved File (p.279) Latency (p.250) Local Area Network (LAN, p.246) Marker/Marker Track (p.286) MAS Driver (p.250) MAS Format (p.276) MIDI Sequencer (p.298) Networking (p.246) Non-destructive Editing (p.263) Non-real Time DSP (p.270) Non-real Time Export (p.279) Normalization (p.262) Open Media Framework Interchange (OMFI, p.255) Operating System (OS, p.236) Pitch Change (p.274) Plug-in (p.274) Punch-in, Punch-out (p.265) Random Access Memory (RAM, p.237) Real-time DSP (p.270) Region (p.260) ReWire (p.276) RTAS Format (p.276) Rubberband Control (p.278) Server (p.247) Shared Web Connection (p.248) Software Instrument (p.298) Sound File Format (p.254) TDM Format (p.276) Thunderbolt (p.246) USB (p.244) USB Data Hub (p.245) VST Format (p.276) Wave File Format (p.254) WDM Driver (p.250) Zero-crossing (p.259)

• Sample Exercise (in class)
When covering this chapter in class, it would be extremely helpful if the digital audio workstation software being taught could be projected on a large monitor or screen for the entire class to view. Even better, if the class takes place in a lab setting, it would be great if each of the students also had the same workstation software loaded on a computer in front of them. Then they can follow along with you as you go through the software highlights and features. Have the class go through the basics of operation: starting and naming a session; recording some audio; performing some DSP work such as cut, copy and paste; mixing two or more tracks of audio; and saving the session. Be sure to cover session management procedures at each step, including naming tracks, keeping track of DSP plug-in settings, and where to save their work. Finally, have them relaunch the session to make sure that everything works and is ready to be turned in.

• Examination questions:
What device has single-handedly revolutionized the traditional way in which audio can be recorded, processed and mixed?
Newer computer systems allow for faster calculations and increased track counts due to:
List four of the potential pitfalls to be aware of and avoid when using a laptop computer as part of a Digital Audio Workstation platform:
What are the two common interconnection protocols used with external digital audio interfaces for digital audio workstations?
Small software programs that run in the background and allow data to be communicated between a system’s operating system and external audio hardware are termed:
Accumulated delay in a DAW, typically measured in milliseconds, caused when audio is converted to digital and processed, is termed:
Data flow on FireWire and USB is:
a. Uni-directional (data flows one way)  b. Bi-directional (data flows both ways)
Which of the following is not a standard for transferring entire sessions from one DAW software platform to another?
a. FireWire  b. AAF  c. OMF  d. OpenTL
Which of the following is not a digital audio sound file format?
a. Wav  b. AIFF  c. USB  d. Sound Designer 2
Audio can be streamed between two simultaneously running applications via this protocol developed by Propellerhead Software and Steinberg:
a. AudioSuite  b. ReWire  c. RTAS  d. VST

• List of further reading/resources
http://content.grammy.com/PDFs/Recording_Academy/Producers_And_Engineers/DAWGuidelineShort.pdf

Chapter 8: Groove Tools and Techniques

• Objectives
  • Demonstrate effective looping techniques on a Digital Audio Workstation
  • Describe the various ways to change the timing of loops without affecting the pitch of the soundfile
  • Provide an overview of popular software platforms for groove- and loop-based music production
  • Expand the concept of groove-based software and hardware into the DJ entertainment marketplace

• Lecture and Discussion Ideas
A recent development in the world of computer-based music composition is the groove (or rhythmic) compositional tool. In this chapter of Modern Recording Techniques, David Miles Huber provides a detailed examination of these exciting production tools. Since many groove-based music production software programs are on the market or available on the web, you will likely find that your students are already very familiar with their concept and use. This lecture is almost certain to be hands-on, with your students trying things out as you illustrate the functionality of the groove-based software your institution has at its disposal. I have found that this is a very rewarding lecture, since everyone in the class will grasp the concepts almost intuitively and want to go off in their own directions right from the start! Have fun with this chapter in your lecture plan – your students surely will!
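For a quick whiteboard illustration of tempo matching and why plain resampling also shifts pitch (the trade-off that time-stretching DSP is designed to avoid), the arithmetic below can help. It is my own example with made-up tempos, not a calculation taken from the text.

```python
# How much must a loop be stretched to fit a new tempo, and how far would
# simple resampling (no time-stretch DSP) shift its pitch?
import math

loop_tempo = 100.0      # BPM of the original loop
session_tempo = 120.0   # BPM of the session it has to fit

stretch_ratio = loop_tempo / session_tempo          # new length / old length
print(f"The loop must play at {session_tempo / loop_tempo:.2f}x speed, "
      f"i.e. be shortened to {stretch_ratio:.0%} of its original length")

# If the loop is simply resampled (sped up), pitch rises along with tempo:
semitones = 12 * math.log2(session_tempo / loop_tempo)
print(f"Plain resampling would raise its pitch by about {semitones:.1f} semitones")
```

Running the numbers both ways sets up the chapter’s point nicely: time-stretching and pitch-shifting DSP exist precisely so that tempo and pitch can be changed independently.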

• Terminology

Beat Slicing (p.292) Loop-based Audio Software (p.294) Pitch Change (p.291) Resampling (p.290) ReWire (p.300) Slice (p.292) Tempo Matching (p.289) Time Change (p.291) Time-stretching (p.290) Virtual Instrument Warping (p.291)

• Sample Exercise (in class)
David Miles Huber includes two quite good tutorials in this chapter: “Manual Looping with a DAW” and “Having Fun with a Loop-Based Editor.” I would suggest having your class follow along with you in the lab while performing these tutorials. Websites for downloading free versions of the software described in the tutorials are also provided if necessary. Again, have fun with your students in these exercises!

• Examination questions:
What factors need to be managed when creating cyclic-based loops?
Why do multiple loops need to be synchronized with one another?
What are the three possible combinations of time and pitch manipulation that may occur when using DSP to synchronize and tune loops?
Briefly describe “warping” as it applies to soundfile looping:
How is “beat slicing” within a soundfile different from time stretching using DSP?
True or False: Propellerhead’s Reason lets you record audio into its sequencer:
a. True  b. False
The underlying software program that allows a program such as Live or Reason to synchronize sample-accurately with another DAW program is termed:
a. ProTools  b. ReWire  c. Apple GarageBand  d. Actually this isn’t possible
Most looping software programs handle not only audio files but _____ as well:
a. MIDI files  b. OMF files  c. VST files  d. All of these

• List of further reading/resources
Acid Planet: http://www.acidplanet.com
Sony Creative Software: http://www.sonycreativesoftware.com
GarageBand Homepage: http://www.apple.com/ilife/garageband/
Propellerhead Software: http://www.propellerheads.se/
Spectrasonics Virtual Instruments: http://www.spectrasonics.net
Cakewalk/Roland Software: http://www.cakewalk.com
Steinberg Software: http://www.steinberg.net
Fxpansion Virtual Instruments: http://www.fxpansion.com
M-Audio Software and Interfaces: http://www.m-audio.com

Chapter 9: MIDI and Electronic Music Technology

• Objectives
  • Provide a thorough examination of Musical Instrument Digital Interface (MIDI) technology
  • Demonstrate how MIDI devices may be interconnected
  • Describe the basic MIDI message types and their uses
  • Provide a survey of common electronic instruments
  • Demonstrate the basic features of a MIDI sequencer and how it is integrated into a recording studio environment

• Lecture and Discussion Ideas
Since the mid-1980s, MIDI has played an ever-increasing role in the development of musical styles, sounds, and the production of music overall. David Miles Huber has witnessed this growth within the recording industry first hand and has written extensively on the subject. Depending upon the structure of the course you teach in audio recording, the scope of this chapter may prove to be deeper than you need, but again, MIDI is a very important topic for any student wishing to gain employment in the industry. If necessary, you might cut down some of the subject matter in this chapter for your course needs, but be sure to stress the main points. Your curriculum in recording may even include other courses focused entirely upon MIDI production, thereby relieving some of the attention this chapter needs within your course.
In your discussion of MIDI, certainly cover the main topics listed above and be sure to emphasize the role that MIDI plays in the modern recording environment. During your lecture(s) on the subject, you may even want to get some input from your students as to their current familiarity with MIDI technology. Find out how many of your students are already using MIDI keyboards, drum machines and sequencers, as their level of familiarity may already be high. So much the better, but with this familiarity there are also certain to be questions about interfacing components and the use of particular devices or software. Therefore, it will be incumbent upon you to prepare well for the presentation of this chapter and to make sure that class discussions do not stray too far from the main topics.
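When you get to message types, a board-friendly way to show the difference between status bytes and data bytes is to assemble a channel voice message by hand. The sketch below follows the standard MIDI 1.0 byte layout (it is my own illustration, not code from the text):

```python
# Build a 3-byte Note On message: one status byte (high bit set) followed by
# two data bytes (high bit clear). Serial MIDI transmits at 31,250 baud.
def note_on(channel, note, velocity):
    """channel is 1-16; note and velocity are 0-127."""
    status = 0x90 | (channel - 1)          # 0x9n = Note On on channel n+1
    return bytes([status, note & 0x7F, velocity & 0x7F])

def note_off(channel, note):
    return bytes([0x80 | (channel - 1), note & 0x7F, 0])   # 0x8n = Note Off

msg = note_on(channel=1, note=60, velocity=100)   # middle C, channel 1
print(" ".join(f"{byte:02x}" for byte in msg))    # -> 90 3c 64
```

Because the channel number lives in the low nibble of the status byte, students can see immediately why a single MIDI cable carries exactly 16 channels.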

• Terminology
Additive Synthesis (p.342) Attack Velocity (p.327) Byte (p.323) Channel Pressure/Aftertouch (p.328) Channel Voice Message (p.327) Continuous Controller (p.320) Control Change Message (p.329) Controller ID Number (p.330) Data Byte (p.324) Data Controller (p.320) Drum Controller (p.349) End of Exclusive Message (p.331) FM Synthesis (p.342) General MIDI (p.311) Keyboard Controller (p.347) MIDI (Musical Instrument Digital Interface, p.309) MIDI Cable (p.315) MIDI Channel (p.324) MIDI Mode (p.326) MIDI Daisy Chain (p.320) MIDI Echo Port (p.318) MIDI File Type (p.359) MIDI In Port (p.318) MIDI Interface (p.338) MIDI Manufacturers Association (MMA, p.331) MIDI Message (p.322) MIDI Out Port (p.318) MIDI Thru Port (p.318) MIDI Time Code (MTC, p.331) MSB (Most Significant Bit, p.324) Multiport MIDI Interface (p.338) Nibble (p.324)

Note Off Message (p.327) Note On Message (p.327) Percussion Controller (p.350) Performance Controller (p.340) Pitch Bend Change (p.329) Polyphonic Key Pressure (p.327) Program Change Message (p.328) Quantization (p.361) Sampler (p.342) Song Position Pointer (p.331) Song Select (p.331) Status Byte (p.324) Step-time Entry (p.357) System Exclusive Message (Sysex, p.335) System Message (p.331) Switch Controller (p.320) Synchronization (p.338) Transposition (p.360) Tune Request (p.331) Voice (p.345) Wavetable Synthesis (p.342)

• Sample Exercise (in class)
An appropriate in-class exercise for this chapter is to set up a number of MIDI devices (keyboards, drum machines, effects devices, controllers) and a MIDI interface such as a Thru box or even something like a MOTU MIDI FastLane or an M-Audio MIDIsport. Demonstrate interconnection of the devices using the MIDI In, Out, and Thru connections. If able, show interconnection with the computer as well via USB. This will provide a graphic illustration of the connections. If on a Macintosh system, be sure to show how CoreMIDI works. Following your demonstration, have your students break up into groups and give each a separate MIDI interfacing scenario using the equipment you can put together. Each group should then demonstrate how to solve their MIDI patching situation. Ask questions as to what they hear if they play a note on a particular device: do they hear one device or two, and so forth. Obviously this will take some preparation prior to the class, but the results are worth it and will definitely drive home the function of the MIDI communication protocol.

• Examination questions:
At what rate does a MIDI interface transmit data?
What are some of the non-musical instruments which may accept MIDI Machine Control sys-ex data?
What does General MIDI guarantee?
Briefly describe the function of a MIDI Daisy Chain:
How is a MIDI Status Byte differentiated from a MIDI Data Byte?
How many channels of information are transmitted on a single MIDI cable?
a. 4  b. 16  c. 8  d. 24
The internal software data communications protocol that has been developed to communicate MIDI, audio, timing sync, and control data between an effect plug-in and a host DAW program/CPU processor is:
a. ReWire  b. MAS  c. VST  d. All of these

• List of further reading/resources
http://www.harmony-central.com
http://board.midibuddy.net/t11293.html

Chapter 10: Multimedia and the Web

• Objectives
  • Show the integration of digital audio files into multimedia and web-based applications
  • Examine the various devices utilized for the dissemination of multimedia material
  • Trace the deployment of web-based multimedia and its operation
  • Discuss digital audio file types (uncompressed and compressed) for use in multimedia and on the web

• Lecture and Discussion Ideas

To be honest, much of this chapter should be a “gimme” for most of your students today. I do not intend to fault the chapter’s importance, but merely wish to state the obvious: the people you are teaching right now in this area have probably spent the majority of their lives in the internet age. File compression to them is a fact of life and a way of getting legal (or illegal) music onto their iPods to jam with. But in deference to all of the artists we have recorded in the past or present, please make sure that your students are aware of copyright and royalties. Therefore, one of the main points to bring up in this chapter is copyright protection. Have your students discuss the topic and the availability of material on the web. Do they take advantage of it? Are they willing to pay for it? If so, how much? Are the current pay rates for legal downloads reasonable?
Another area to discuss is the ability to compress sound files. To do this you will need to discuss a bit of psychoacoustics and human hearing. You may even need to reinforce some of the principles learned in chapter 2 – Sound and Hearing. The ability to create a lossy-compressed sound file depends entirely upon how we hear and upon the ear’s ability to mask the unwanted noise that results from the compression.
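A quick data-rate calculation makes the case for perceptual coding before you dive into the psychoacoustics. The numbers below are my own illustrative figures (CD-quality PCM against a common MP3 setting), not values quoted from the text.

```python
# How big is uncompressed CD-quality PCM compared with a typical lossy encode?
sample_rate = 44_100          # Hz
bit_depth = 16                # bits per sample
channels = 2                  # stereo

pcm_bps = sample_rate * bit_depth * channels          # bits per second
mp3_bps = 128_000                                     # a common MP3 bit rate

print(f"PCM: {pcm_bps / 1000:.0f} kbps "
      f"(about {pcm_bps * 60 / 8 / 1e6:.1f} MB per minute)")
print(f"A 128 kbps MP3 is roughly a {pcm_bps / mp3_bps:.0f}:1 reduction")
```

That eleven-to-one reduction is exactly the gap that masking and the other perceptual-coding techniques in the chapter have to hide from the listener.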

• Terminology
AAC Audio Format (p.380) AC3 Standard (p.375) Audio Codec (p.377) Authoring (p.375) AVI Video Format (p.388) Blu-Ray (p.375) CD Format/Book Standards (p.374) Cloud Computing (p.373) Compact Disc (CD, p.373) Constant Bit Rate (p.379) Data Compression (p.378) DRM (p.380) DTS Standard (p.375) DVD (p.373) FLAC Audio Format (p.381) Flash Memory (p.373) General MIDI (p.384) Internet (p.372) Internet Radio (p.392) Internet Service Provider (ISP, p.372) IP Address (p.372) Lossy Coding (p.378) Metadata (p.381) MPEG I, II, and IV Video Formats (p.388) MP3 Audio Format (p.379) MP4 Audio Format (p.380) MPEG-1 Standard (p.375) MPEG-2 Standard (p.375)

Multimedia (p.369) Network (p.371) PCM Standard (p.375) Perceptual Coding (p.377) Physical Media (p.373) Quicktime Video Format (p.388) Raster Graphic (p.385) Resource Interchange File Format (RIFF, p.377) SDDS Standard (p.375) SDMI (p.380) Standard MIDI File (p.384) URL (p.372) Variable Bit Rate (p.379) Vector Animation (p.385) Vector Graphic (p.385) Video Codec (p.388) Video Frame Rate (p.388) WMA Audio Format (p.380) World Wide Web (p.371)

• Sample Exercise (in class) Probably the best in-class exercise for this chapter would be to break the students into groups and have them determine on just how many sites they can find their favorite band’s last “hit.” How many sites were legal downloads? How many were not? Have each group discuss how they may or may not use the files. Was something distinctive about a file that they really wanted whether legal or not? What constituted whether it was legal?

• Examination questions:
What distinguishes the CD Red Book standard from the CD Orange Book standard?
What does “authoring” refer to?
What is the basis of perceptual coding?
What is the function of the SDMI?
Which of the following CD standards was developed to promote cross-platform data? a. CD Yellow Book b. CD Blue Book c. ISO 9660 d. CD HFS
Which of the following might not be placed in a multimedia file as tagged metadata? a. Album titles b. Artist bios c. Song lyrics d. All may be
The following MIDI protocol standardizes voice patch numbers: a. Standard MIDI b. MIDI 1 c. General MIDI d. Lieutenant MIDI
Which is the fastest Internet connection? a. Ethernet b. T1 c. DSL d. Cable

• List of further reading/resources http://www.cd-info.com http://en.wikipedia.org/wiki/Secure_Digital_Music_Initiative http://www.riaa.org

Chapter 11: Synchronization

• Objectives
• Provide a description of the process whereby the transports of two or more devices may be synchronized
• Develop an understanding of SMPTE Time Code and its function
• Discuss MIDI Time Code and its implementation
• Describe the need for a stable clock reference for both digital audio and digital video applications

• Lecture and Discussion Ideas The subject of synchronization is quite important, as noted in the text, due to the proliferation of material requiring synchronization across media types. In your lecture be sure to discuss SMPTE Time Code and its importance within a synchronized system. Also be sure to point out the applications requiring MIDI Time Code and the very important need for a central, stable clock reference for all machines under synchronization. This may also serve as a refresher of some of the concepts brought to bear in your discussion of digital audio.
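One way to make the structure of a time code address concrete is to convert hours:minutes:seconds:frames into an absolute frame count and back. The Python sketch below assumes a non-drop-frame count at whatever frame rate you pass in; drop-frame arithmetic is more involved and is best left to the text’s own discussion:

    # Convert a non-drop-frame SMPTE address (hh:mm:ss:ff) to an absolute frame
    # count and back. The frame rate is a parameter (e.g., 24, 25, or 30 fps).
    def to_frames(address, fps=30):
        hh, mm, ss, ff = (int(part) for part in address.split(":"))
        return ((hh * 60 + mm) * 60 + ss) * fps + ff

    def to_address(frames, fps=30):
        ff = frames % fps
        ss = (frames // fps) % 60
        mm = (frames // (fps * 60)) % 60
        hh = frames // (fps * 3600)
        return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

    start = to_frames("01:00:10:15")        # an arbitrary example address
    print(start)                            # absolute frame number at 30 fps
    print(to_address(start + 90))           # ninety frames (three seconds) later

A SMPTE offset between two machines is then nothing more mysterious than the difference between two such frame counts.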

• Terminology


Black Burst Generator (p.407) Broadcast Wave Format (p.399) Continuous Jam Sync (p.403) Crosstalk (p.405) Drop Frame Time Code (p.398) EBU (European Broadcast Union, p.398) Frame (p.394) Free-wheeling Jam Sync (p.403) House Sync Generator (p.407) Jam Sync (p.403) LTC (Longitudinal Time Code, p.402) MIDI Cueing Message (p.401) MIDI Full Message (p.401) MIDI Quarter-frame Message (p.401) MIDI Time Code (p.400) Non-drop Frame Time Code (p.398) NTSC (National Television Standards Committee, p.398) PAL (p.398) SMPTE (p.394) SMPTE Offset (p.404) SMPTE Time Code (p.394) Stripe (p.411) Synchronization (p.393) Synchronizer (p.405) Sony 9-pin Connector (p.405) Sync Master (p.406) Sync Slave (p.406) Time Code Address (p.394) Time Code Frame Rate (p.396) Time Code Refresher (p.403) Time Code Word (p.395) Timing Clock (p.394)

• Sample Exercise (in class) In class, you might call up www.modrec.com to play an illustration of time code, or you may have other examples at your disposal. The fact that the signal is audible should indicate that it can be recorded on tape, although it should be made evident that proper recording levels must always be observed to prevent the time code from bleeding into a final mix. You may be aware of some songs that were mixed where it is still possible to hear time code bleed on the released version. Also discuss the various time code types and their necessity for the various film and video standards. Stress the importance of understanding this information for those students wishing to work in any sort of multimedia application environment.

• Examination questions:


What information is encoded in a SMPTE Time Code word?
Why is it important to use an extremely stable clock reference for all digital devices under synchronization?
Similarly, what is necessary for larger video production houses so that all video transmissions take place in sync?
If your audio production company is working on a video project produced “out of house”, what time code should be used for the remainder of the production?
What is the frame rate for NTSC monochrome video? a. 24 fps b. 25 fps c. 29.97 fps d. 30 fps
What is the frame rate for PAL video? a. 24 fps b. 25 fps c. 29.97 fps d. 30 fps
What is the frame rate for NTSC color video? a. 24 fps b. 25 fps c. 29.97 fps d. 30 fps
How many masters can there be in a synchronized audio/video system? a. two b. one c. as many as necessary d. zero
How many slaves can there be in a synchronized audio/video system? a. two b. one c. as many as necessary d. zero

• List of further reading/resources www.smpte.org

Chapter 12: Amplifiers


• Objectives
• Provide a basic explanation of amplifier design extrapolated to both solid-state and tube devices
• Discuss the function of operational amplifiers
• Guide the reader through the many uses and designs of amplifiers in an audio system
• Discuss the function of voltage- and digitally-controlled amplifiers

• Lecture and Discussion Ideas Your coverage of this brief chapter might best be combined with that of Chapter 17, Monitoring, as discussed at the beginning of this manual. Obviously, the two subjects work hand-in-hand, and this may provide you a way to cover all of the material in a single semester. Most likely you will not need to get too in-depth on amplifier design in your lecture, but there are some good graphics in the text to help describe amplifier technology in a basic sense. What will be most important to stress is that amplifier design must take nonlinearity into account, along with the penalties incurred if amplifiers are operated beyond their design limitations.

• Terminology Amplifier (p.413) Attenuate (p.417) Base (p.413) Cathode (p.414) Clipping (p.414) Collector (p.414) Cutoff Region (p.414) DC Bias Signal (p.414) Digitally-controlled Amplifier (p.420) Distribution amplifier (p.418) Emitter (p.413) Equalizer (p.416) Linear (p.414) Negative Feedback Loop (p.417) Odd-order Harmonics (p.415) Operational amplifier (p.416) Plate (p.414) Power amplifier (p.418) Preamplifier (p.416) Saturation (p.414) Saturation Point (p.414) Semiconductor (p.413) Summing amplifier (p.417) Transistor (p.413) Triode (p.414) Vacuum tube (p.413) Valve (p.414)


Voltage-controlled Amplifier (p.420)

• Sample Exercise (in class) An interesting in-class exercise might be to demonstrate the difference between tube and solid-state clipping and distortion. This might most effectively be done with demonstration CDs of various guitar amplifiers or by simply plugging in to a small “combo” version of each type of amplifier (you do not want to get too loud in your classroom!)
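If you cannot bring both amplifier types into the room, the shape of the two clipping behaviors can at least be shown numerically (or plotted). The Python sketch below uses a hard clip to stand in for abrupt solid-state limiting and a tanh curve as a rough stand-in for gradual tube-style saturation; it is an approximation for discussion, not a circuit model:

    import math

    def hard_clip(x, limit=1.0):
        # Abrupt, solid-state-style clipping: anything past the limit is flattened.
        return max(-limit, min(limit, x))

    def soft_clip(x):
        # Gradual, tube-style saturation approximated with tanh: nearly linear
        # for small signals, bending smoothly toward +/-1 as the level rises.
        return math.tanh(x)

    for level in [0.25, 0.5, 0.9, 1.2, 2.0]:    # input samples; 1.0 = "full scale"
        print(f"in {level:4.2f} -> hard {hard_clip(level):5.2f}   soft {soft_clip(level):5.2f}")

The little table makes the point quickly: the hard clipper is perfectly linear until it suddenly is not, while the soft curve bends well before the limit – which is also why the two produce such different harmonic content.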

• Examination questions:
In what ways is an amplifier like a valve?
What is done to force a transistor into linear operation?
What is meant by amplifier saturation?
How is negative feedback applied? Why is negative feedback necessary?
What protective measures might be used on power amplifiers?
A frequency-discriminating amplifier is called a: a. VCA b. Equalizer c. Summing amplifier d. Isolation amplifier
Which type of amplifier is used on the mix bus of a console: a. VCA b. Equalizer c. Summing amplifier d. Isolation amplifier
Amplifiers used in console automation systems are a: a. VCA b. Equalizer c. Summing amplifier d. Isolation amplifier

Chapter 13: Power- and Ground-Related Issues

• Objectives
• Discuss the need for proper grounding in audio applications
• Explain power conditioning applications as they pertain to the recording studio environment
• Describe the benefits of multi-phase power and balanced power systems

• Lecture and Discussion Ideas This is an extremely brief, but critically important chapter. If time is short during the semester, the information here can be combined with Chapter 12 – Amplifiers, and Chapter 17 – Monitoring. While this can be a very complicated subject, the emphasis should be on the benefit to audio quality and on overall safety. If possible, point out to students the noise induced into recordings by ground loops and “noisy” power. For example, guitar amplifiers are, unfortunately, often good choices for demonstrating 60-cycle hum.
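If no hum-prone guitar amplifier is handy, a rough facsimile of ground-loop hum can be synthesized and played over the classroom system. The Python sketch below writes a short WAV file containing a 60 Hz fundamental plus a couple of odd harmonics – a loose approximation of typical North American AC hum put together for demonstration purposes, not a measurement of any real fault:

    import math, struct, wave

    RATE, SECONDS = 44_100, 3
    partials = [(60, 1.0), (180, 0.4), (300, 0.2)]   # (frequency in Hz, relative level)

    frames = bytearray()
    for n in range(RATE * SECONDS):
        t = n / RATE
        sample = sum(level * math.sin(2 * math.pi * freq * t) for freq, level in partials)
        sample *= 0.2 / 1.6                          # keep the level well below full scale
        frames += struct.pack("<h", int(sample * 32767))   # 16-bit signed samples

    with wave.open("hum_demo.wav", "wb") as wav:
        wav.setnchannels(1)          # mono
        wav.setsampwidth(2)          # 16-bit
        wav.setframerate(RATE)
        wav.writeframes(bytes(frames))

(Change the fundamental to 50 Hz if you are teaching where the mains frequency is 50 Hz.)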

• Terminology Balanced Power (p.423) Brown-out (p.423) Grounding (p.421) Ground Loop (p.422) Multi-phase Power (p.423) Power Conditioning (p.422) Transient Spike (p.423) Uninterruptible Power Supply (UPS, p.423) Voltage Regulation (p.423) Voltage Sag (p.423) Voltage Surge (p.423)

• Sample Exercise (in class) A useful exercise for students is to divide them into small groups and have them research the many power and grounding related solutions now on the market for studio owners. Numerous companies such as Furman have well designed websites with a great deal of information regarding these problems and various products that can help solve them. If time is available, have student groups briefly report on their findings.


• Examination questions:
What are some of the audible results of poor grounding in a studio?
True or False: Whenever possible, audio wiring and AC wiring should be laid together as closely as possible.
Power conditioning devices can provide all of the following benefits EXCEPT:

a. voltage regulation b. elimination of power interruptions c. keeping AC lines quiet d. elimination of radio frequency interference

True or False: Bad cables and connections can occasionally cause audible grounding problems. A rise in voltage that can harm or reduce the working life of audio equipment is referred to as:

a. voltage sag b. voltage surge c. transient spike d. brown-out

Sharp, high-level energy event from lightning or other sources that can do serious damage to audio equipment is referred to as:

a. voltage sag b. voltage surge c. transient spike d. brown-out

Long-term sags in the available voltage on an AC line is referred to as:

a. voltage sag b. voltage surge c. transient spike d. brown-out

A system in which major electrical systems are placed on multiple AC circuits in a recording studio is called:

a. balanced power b. grounded power c. regulated power d. multiple-phase power

• List of further reading/resources Furman Power Products: http://www.furmansound.com RaneNote “Grounding and Shielding Audio Devices”: http://www.rane.com/note151.html

Chapter 14: The Art and Technology of Mixing

• Objectives
• Provide a thorough examination of the “mixing surface”
• Trace the use of audio consoles through all stages of a recording
• Break down each of the sections of audio console design
• Discuss the development of digital console design and application
• Examine audio console automation systems

• Lecture and Discussion Ideas This very thorough chapter is essential to the discussion of audio recording. In previous editions of this text, the chapter was entirely devoted to the concept of the audio recording console. However, David Miles Huber has updated the philosophy of the approach with developments in the recording industry. Now, the chapter not only examines traditional console design and practice, but incorporates the present day “Mixing Surface” as well – the hardware implementation of a virtual DAW console. It will be important in your discussion of audio consoles to make your examples very clear to avoid confusion on the part of your students. There is a lot of terminology to cover in this chapter, therefore, it might be a good idea to provide the terms list below for your students to use as a study guide and as a method to follow your lecture. Concurrent with the discussion of console components is an examination of the techniques used throughout the recording process. You might break your lecture down by sections of the console or it may be more appropriate in your case to cover the technology by identifying the particular process at hand and what sections of the console become more prominent for each task.

• Terminology Attenuation Pad (p.435) Audio Production Console (p.425) Automation Read Mode (p.467) Automation Write Mode (p.465) Auxiliary Send (p.441) Basic/Rhythm/Bed Tracks (p.428)


Bus (p.430) Channel Fader (p.448) Channel Input (p.434) Channel/Track assignment matrix (p.450) Cue Mix (p.429) Direct Insert Monitoring (p.445) Drawn/Rubberband Automation (p.467) Dynamics Section (p.443) Foldback (p.447) Gain Level Optimization (p.437) Grouping (p.430) Half-normalled Patch Point (p.454) I/O Module (p.433) In-Line Monitoring (p.445) Input Strip (p.428) Insert Point (p.438) Line Trim (p.435) Master Output Fader (p.450) Meter (p.457) Mic Trim (p.435) Mini Bantam/TT Connector (p.454) Mixdown (p.431) Monitor Level Section (p.453) Monitor Section (p.443) Monitor Mix (p.429) Mute (p.448) Normaled Patch Point (p.455) Open Patch Point (p.454) Outboard line mixer (p.471) Output Section (p.449) Overdubbing (p.431) Pan Pot (p.448) Parallel Patch Point (p.455) Patch Bay (p.454) Peak Program Meter (p.456) Phase Reverse (p.437) Recording (p.427) Scratch Track (p.428) Signal Path (p.432) Signal-to-noise Ratio (p.428) Solo (p.448) Tip/ring/sleeve ¼-inch Connector (p.454) Tip/sleeve ¼-inch Connector (p.454) TRS Connector (p.438) VU Meter (p.459)

• Sample Exercise (in class)


Obviously a great way to discuss recording console design and technology is to have one in front of you as you discuss each of the components and controls. Barring this, use of the graphics provided in this chapter would be a second best solution. Most students will learn best with the hands-on approach and excellent in-class applications can be developed to get your students in front of the console, turning the knobs and hearing (and/or seeing) the results. Below are some generic questions on the subject of console design and application, but you may find that tailoring questions around your specific in-class demonstrations will be more appropriate.
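When you reach the pan pot during the hands-on work, it can help to show the arithmetic behind it. A constant-power pan law is one common approach – individual consoles and DAWs use slightly different laws – and a minimal Python sketch of the idea looks like this:

    import math

    def constant_power_pan(position):
        """position: 0.0 = hard left, 0.5 = center, 1.0 = hard right."""
        angle = position * math.pi / 2
        return math.cos(angle), math.sin(angle)      # (left gain, right gain)

    for pos in [0.0, 0.25, 0.5, 0.75, 1.0]:
        left, right = constant_power_pan(pos)
        print(f"pan {pos:4.2f}:  L {20 * math.log10(max(left, 1e-9)):6.1f} dB"
              f"   R {20 * math.log10(max(right, 1e-9)):6.1f} dB")

The roughly -3 dB per side at center is the number students will meet again when discussing why a center-panned signal does not jump in level as it moves across the stereo field.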

• Examination questions:
What are the three stages of the modern multi-track recording process?
Provide a brief definition of an I/O module:
What is the function of an auxiliary send?
How are insert points generally utilized on the mixing console?
What are some of the advantages to modern digital console technology?
What are some of the benefits to using automation?
What switch is used to reduce gain on an incoming audio signal? a. Attenuation pad b. Solo c. Pan d. Phase reverse
What control is used to place audio signals spatially? a. Attenuation pad b. Solo c. Pan d. Phase reverse
What control is used to listen to just one track in the mix? a. Attenuation pad b. Solo c. Pan d. Phase reverse


If we say that two points on a patch bay are “normalled” it means?

a. That we need a patch cord to connect them b. That we do not need a patch cord to connect them c. That the two points can not be connected d. That we will need two patch cords to connect them

On the patch bay, what are “parallels” used for?

a. Combining signals together b. Creating a “Y” c. Creating a mono output of a stereo feed d. All of the above

• List of further reading/resources www.ams-neve.com www.solid-state-logic.com www.mackie.com www.tascam.com www.behringer.de www.sony.com/proaudio

Chapter 15: Signal Processing

• Objectives
• Present a thorough overview of audio signal processing in the analog and digital domains
• Provide examples of typical ways signal processors are connected
• Break down signal processing devices into three main categories: spectral-based, amplitude-based, and time-based

• Lecture and Discussion Ideas Perhaps the best way to present the material in this chapter is to take the lead of the author and break down the enormous subject of audio signal processing into the three more manageable areas of spectral-based, amplitude-based and time-based effects. This chapter is also well suited to in-class demonstrations of the processors being discussed. These demonstrations could take place as audio-only examples played back from CDs or, better yet, if your situation allows, patch in representative examples of the devices being discussed for both visual and aural presentation. Another good way to do this in a classroom setting is to use a DAW and apply the ever more convenient plug-ins that replicate the effects of signal processing. Don’t forget to relate examples of past analog or acoustic versions of signal processing in this lecture. As mentioned in the chapter, everything old is new again, and it is very exciting to experiment with combinations of old and new signal processing. Your students will most likely not only benefit from these revelations, but enjoy them as well.
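For the amplitude-based section, the static behavior of a compressor – threshold, ratio, and make-up gain – can be demonstrated with a few lines of arithmetic before patching in a real unit or plug-in. This Python sketch is a simplified static gain computer only; it deliberately ignores attack, release, and knee:

    # Static compressor curve, levels in dBFS. Below threshold the signal passes
    # unchanged; above threshold, each extra dB of input yields only 1/ratio dB of output.
    def compress(input_db, threshold_db=-20.0, ratio=4.0, makeup_db=0.0):
        if input_db <= threshold_db:
            return input_db + makeup_db
        return threshold_db + (input_db - threshold_db) / ratio + makeup_db

    for level in [-40, -30, -20, -10, 0]:
        out = compress(level)
        print(f"in {level:4d} dBFS -> out {out:6.1f} dBFS  (gain reduction {level - out:4.1f} dB)")

Running it with a few different ratios makes the difference between compression and limiting (very high ratios) immediately visible, and it maps directly onto the threshold and ratio examination questions later in this section.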

• Terminology Active Filter (p.483) Attack (p.497) Average Signal Level (p.494) AU Format (p.475) AudioSuite Format (p.475) Bandpass Filter (p.486) Bandwidth (p.485) Chamber Reverb (p.511) Chorus (p.507) Combing (p.507) Compressor (p.495) De-esser (p.498) Delay (p.506) Digital Delay Line (p.506) Direct Signal (p.509) DirectX (p.475) Doubling (p.507) “Dry” (p.480) DSP (p.474) Dynamic Range Processor (p.494) Early Reflections (p.509) Equalization (p.482) Expander (p.504) External Key (p.478) Flanging (p.507) Frequency-selective Compression (p.498) Gate Reverb (p.511) Graphic Equalizer (p.488) Hall Reverb (p.510) High-pass Filter (p.486) Input Gain (p.496) Insertion Loss (p.483) Insert Routing (p.476) Key Input (p.505) Limiter (p.502) Live Reverb (p.511) Low-pass Filter (p.486)

MAS Format (p.475) Multiband Compression (p.501) Noise Gate (p.504) Notch Filter (p.489) Octave Bands (p.485) Output Gain (p.496) Parametric Equalizer (p.487) Passband (p.486) Passive Component (p.483) Peaking Filter (p.484) Phasing (p.507) Pitch Shifting (p.512) Plate Reverb (p.511) Plug-in (p.475) Psycho-acoustic Enhancement (p.511) Q/Quality Factor (p.485) Ratio (p.496) Release (p.497) Reverberation (p.509) Reverse Reverb (p.511) Room Reverb (p.511) Saturation (p.494) Selectable Frequency Equalizer (p.487) Send Routing (p.479) Shelving Filter (p.486) Sidechain (p.478) Signal Processor (p.473) Spatialization (p.512) Spring Reverb (p.511) Stopband (p.486) Threshold (p.496) Time-based Effects (p.505) Time Shifting (p.512) Turnover/Cutoff Frequency (p.486) TDM Format (p.475) VST (p.475) “Wet” (p.480)

• Sample Exercises For every major section of signal processing the author has provided tutorials at www.modrec.com. In addition to the lecture, these can be very beneficial either as additions to the lecture, or as suggested outside class work.

• Examination questions: What is the difference between Insert and Send signal routing?


What does equalization refer to?
What is the difference between a Parametric and a Graphic equalizer?
Provide a brief description of the function of the audio compressor:
What are the three components making up reverberation cues?
What must the user be aware of when applying pitch shifting plug-ins or devices to vocal tracks?
Narrow bandwidths of sound may be attenuated with the following: a. Shelf filter b. Low Pass filter c. High Pass filter d. Notch filter
Removal of low frequency spectrum sound may be accomplished with the following: a. Shelf filter b. Low Pass filter c. High Pass filter d. Notch filter
What determines when a compressor will go into action?

a. When signals go below threshold b. When signals rise above threshold c. The ratio setting d. The attack setting

What determines by how much a signal may be compressed?

a. When signals go below threshold b. When signals rise above threshold c. The ratio setting d. The attack setting

“Humanizing” a sound can be accomplished with which Time-based processor? a. Reverb b. Pitch shifting c. Delay d. Time compression


Chapter 16: Noise Reduction

• Objectives
• Provide a description of the techniques used for the reduction of unwanted noise from recordings
• Examine noise reduction in both the analog and digital domains
• Explain noise reduction in both double-ended and single-ended methods

• Lecture and Discussion Ideas As mentioned at the beginning of this manual, this chapter might be combined with another short chapter (suggested was the chapter on Multimedia and the Web) in order to maximize your use of the text in a standard semester. You might also fold this chapter into the coverage of Chapter 5 – The Analog Tape Recorder. Regardless of whether you follow this idea or not, this chapter is instrumental in covering the specialized equipment used to reduce noise in recordings. Some discussion areas for your lecture include the necessity for noise reduction to maximize dynamic range in recordings and to restore older recordings marred by asperity noise. You might also wish to discuss the field of forensic audio restoration when covering this important subject.

• Terminology Adaptive Filter Noise Reduction (p.521) Breathing (p.522) Chirping (p.519) De-clicking (p.520) De-popping (p.520) Digital Noise Reduction (p.517) Fast Fourier Transform (p.518) Noise Gate (p.522) Pumping (p.522) Single-ended Noise Reduction (p.521) Tape Hiss (p.518)

• Sample Exercise (in class) There are a couple DIY Tutorials in this chapter that would be good for in-class presentation. The second DIY does require the use of a noise reduction plug-in algorithm that you may not have access to, however.
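If the plug-in for the second DIY is not available, the underlying “noise snapshot” idea can still be demonstrated with a toy script. The Python sketch below (it assumes NumPy is installed) subtracts an averaged noise magnitude spectrum from a single frame of noisy signal and resynthesizes the result with the original phase. It is a teaching illustration of single-ended, FFT-based reduction, not the algorithm used by any particular product:

    import numpy as np

    # Toy single-frame spectral subtraction. Real tools work frame by frame with
    # windowing, overlap-add and smoothing; this only shows the core idea.
    rng = np.random.default_rng(0)
    n = 1024
    t = np.arange(n) / 44_100.0

    clean = 0.5 * np.sin(2 * np.pi * 440 * t)       # the "program material"
    noisy = clean + 0.05 * rng.standard_normal(n)   # program plus broadband hiss
    noise_only = 0.05 * rng.standard_normal(n)      # a separate noise-only "snapshot"

    noise_profile = np.abs(np.fft.rfft(noise_only))     # magnitude of the snapshot
    spectrum = np.fft.rfft(noisy)
    reduced_mag = np.maximum(np.abs(spectrum) - noise_profile, 0.0)
    cleaned = np.fft.irfft(reduced_mag * np.exp(1j * np.angle(spectrum)), n)

    print("RMS error before:", np.sqrt(np.mean((noisy - clean) ** 2)))
    print("RMS error after :", np.sqrt(np.mean((cleaned - clean) ** 2)))

Over-subtracting quickly produces audible artifacts of its own, which is a useful talking point about why commercial tools smooth and scale the noise estimate rather than subtracting it blindly.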

• Examination questions:


What is meant by Single-Ended Noise Reduction?
What does digital technology provide to enhance the noise reduction process?
A variable filter and a dynamic downward expander are combined in this: a. Double-Ended NR b. Single-Ended NR c. Noise gate d. Fast Fourier Transform
Taking a digital “snapshot” of offending noise to be reduced may be utilized in this: a. Double-Ended NR b. Single-Ended NR c. Noise gate d. Fast Fourier Transform

Chapter 17: Monitoring

• Objectives
• Discuss both loudspeaker design and the correlation of the loudspeaker with control room acoustics
• Demonstrate room tuning operations
• Explain monitor cabinet, driver, and crossover designs
• Detail various monitoring configurations and provide tips for monitoring mixes in progress

• Lecture and Discussion Ideas As mentioned at the beginning of this manual, it might be a good idea to couple this chapter with the companion Chapter 12, Amplifiers. Discussions of one will obviously compare the two regardless. Several facets of this chapter refer back to the chapters on Sound and Hearing and Studio Acoustics – especially with regard to control room design and the reaction of a studio monitor within a control room. Probably one of the best ways to introduce your students to the differences among monitor systems is to play the same material through all of the systems you can muster. Any degree of A/B switching you can do to reinforce differences in monitor design vs., say, amplifier/crossover topology would be a good idea. For this you should have some ability in the classroom or studio to switch between multiple monitors or playback systems at will. Obviously monitoring plays an important part in the recording process. It is not uncommon for engineers and producers to book sessions at one particular facility because they are “comfortable” with the monitoring system there. That is, they have mixed there previously and the results translated well to the outside world (boom box, CD in the car, audiophile home system, etc.). If, for whatever reason, that comfort level is compromised, those clients will be running to the next “greatest” place to do their work. This chapter is not only important for the material developed within, but also because it covers the very area that is so willfully abused by practitioners of the trade. It is very important that monitor levels be checked at every stage of the recording process. It is an easy trap to keep everyone in the session “happy” by playing everything “Just Bloody Loud,” but the tracks at reasonable levels may sound terrible. Besides which, hearing is a sense that cannot recuperate and feel better after an aspirin. Once it’s gone, it’s gone.
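A quick geometry check can accompany the “sweet spot” examination question below. Assuming the common equilateral-triangle rule of thumb – monitors and listener forming 60-degree angles – this Python sketch computes where the listening position falls for a given monitor spacing. Treat the results as a starting point only, since real rooms always force compromises:

    import math

    def sweet_spot(spacing_m):
        # With the monitors spaced d apart, the equilateral (60-60-60) layout puts
        # the listener d * sqrt(3) / 2 back from the line between the monitors,
        # at a distance d from each one, with each monitor toed in toward that seat.
        depth = spacing_m * math.sqrt(3) / 2
        return depth, spacing_m

    for spacing in [1.0, 1.5, 2.0]:
        depth, distance = sweet_spot(spacing)
        print(f"monitors {spacing:.1f} m apart -> listener {depth:.2f} m back, "
              f"{distance:.2f} m from each monitor")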

• Terminology Active Crossover Network (p.533) Active Monitors (p.527) Air suspension (p.530) Bass Management (p.539) Bass reflex (p.530) Bi-amplified (p.533) Crossover Network (p.531) Crossover Point (p.532) Farfield Monitors (p.525) Headphones (p.529) LFE Channel (p.538) Monaural (p.540) Monitor (p.523) Mono Compatibility (p.540) Nearfield Monitors (p.526) Passive Crossover Network (p.531) Passive Monitors (p.527) Spectral Analyzer (p.543) Speaker Polarity (p.534) Stereophonic (p.540) Subwoofer (p.538) Three-way System (p.531) Tri-amplified (p.533) Two-way System (p.531)

• Sample Exercise (in class) If the classroom you are teaching in has multiple sets of monitors, one useful exercise is to divide the class into smaller groups and have each group research and discuss one set of monitors in your studio. Have them download the specification sheets from each manufacturer’s website and discuss how they expect the monitors to affect what they hear. Allow each group to briefly report on their findings and then play several pieces of music to determine the accuracy of each group’s findings.


• Examination questions:

What are some of the reasons why a monitor system may sound different in one room from the next?
What type of signal might be fed through your monitor system in order to tune it to the room you are working in?
Broadly speaking, what are the two types of professional speaker enclosures?
What is the difference between active and passive crossover networks?
What might be a good monitoring level for mixing?
The signal most often used to tune loudspeakers in a room is termed: a. Black noise b. Pink noise c. Grey noise d. Purple noise
The “sweet spot” in a control room is determined by a triangle based upon the following angular alignments: a. 30-30-120 b. 45-45-90 c. 60-60-60 d. 90-90-0
What might a “spectral analyzer” accomplish? a. Provide aural and visual cues for potential problems in speaker-to-room acoustics b. Define the aural spectrum of a desired recording c. Replace the graphic equalizer as an insert point in a mix so that EQ judgments are more soundly based d. None – these are rarely used any more and have been replaced by modern software counterparts
Why might monitoring on a “Nearfield System” be appropriate during mixdown? a. To demonstrate how loud the mix can be b. To indicate to the producer how wrong she is about a part c. To monitor the mix as it might be played on a home or car system d. To prove to the bass player that his/her part can be heard in the mix
What is the benefit to “sealed” headphones in the studio? a. They provide the artist with an uncompromised cue mix of the session in progress b. If open headphone types are used, the artist can run the session and play Halo II simultaneously c. To a great extent, they block the sound of other instruments played simultaneously d. Both a. and c.

Chapter 18: Surround Sound

• Objectives
• Discuss the history and development of surround sound
• Examine various surround sound formats including 5.1
• Provide illustrations of how one might set up their studio and monitoring environment for 5.1

• Lecture and Discussion Ideas

This is a very relevant chapter for your students. Many of them will graduate and work in environments where the norm was stereo (2-channel) but now is multi-channel – especially in the areas of broadcast, authoring, and DVD preparation and mastering. The chapter is presented very cogently – try as much as possible to follow its lead in your lecture. If you have 5.1 music examples and a 5.1 playback system, use them to demonstrate the power of surround music mixing. Unfortunately there are many bad examples out there too. Feel free to play these as examples of what might not work! For music mixing and distribution, the multichannel road has been rocky. Many very successful pioneers and producers of music material attempted multichannel presentations of their material before the waters were tested. Really bad experiments like quadraphonic left some people burned. Above all else, it is important for a producer to be aware of his audience and how they will use the material. In the quad days something might be released that could only be played back correctly on a fraction of the hi-fi systems in existence. However, with the standardization of DVD-V and other advanced multichannel delivery formats, multichannel mixes can again be attempted and, at the consumer’s location, be replayed in their original intent.

• Terminology CinemaScope (p.548) Dolby AC3 (p.550) Dolby Pro Logic (p.550) Dolby Surround (p.549) DTS (p.556) Fantasound (p.548)


FLAC File Format (p.558) ITU (p.551) LFE (p.552) MP4 (p.558) Quadraphonic Sound (p.549) Todd AO (p.548) Up-mix (p.558) WMA (p.558)

• Sample Exercise (in class) For this lecture it would be a good idea to play various multichannel music mixes to compare the ways different engineers and producers approach multichannel music mixing. Some mixes are very traditional with surround material representing reflections off back walls and ambience, while others fly material around the room. Some music producers resist the use of the center speaker, others use it for material traditionally panned in the center. Some engineers like to utilize the LFE channel, others claim it should be used only for movie soundtracks. To show the different approach, play snippets of movie soundtracks as well. Play excerpts of several selections to get students to respond to the use of all speakers in the mixes. If you have the capability, turn off selected speakers to aurally demonstrate mix elements left in the remaining speakers. This can be especially illuminating for your students.
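If you want to sanity-check a temporary 5.1 setup before playing your examples, the commonly cited ITU-style layout (center at 0 degrees, front pair at about +/-30 degrees, surrounds near +/-110 degrees) can be turned into floor positions with a little trigonometry. The angles in this Python sketch are the usual rule-of-thumb values rather than figures quoted from the text, so let the text and the room have the final word:

    import math

    # Nominal 5.1 speaker angles in degrees from center front (positive = to the right).
    # LFE/subwoofer placement is left out because it is far more room-dependent.
    SPEAKER_ANGLES = {"C": 0, "L": -30, "R": 30, "Ls": -110, "Rs": 110}

    def positions(radius_m=2.0):
        coords = {}
        for name, angle in SPEAKER_ANGLES.items():
            rad = math.radians(angle)
            coords[name] = (radius_m * math.sin(rad), radius_m * math.cos(rad))  # (x right, y forward)
        return coords

    for name, (x, y) in positions(2.0).items():
        print(f"{name:>2}: x = {x:+.2f} m, y = {y:+.2f} m")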

• Examination questions:
How were the four channels of quadraphonic information pressed in a phonograph record?
What is the ITU recommendation for placement of 5.1 monitors?
What recommendations are there for placement of the LFE speaker in a room?
In its most basic form, how does Dolby ProLogic derive four channels of information from a two-channel interface?
What is the best method for monitoring a Dolby ProLogic mix in process?
The first popular multichannel movie was: a. Star Wars b. Fantasia c. One Hundred Men and a Girl d. Oklahoma
Which of the following is not a surround sound format? a. Dolby ProLogic b. AC3 c. DTS d. Dolby Type A
Perceptual coding is used in the following surround sound format: a. Dolby ProLogic b. AC3 c. DTS d. Dolby Type A
True or False: Video information may be included on a DVD-Audio disc. a. True b. False
The soundtrack for a DTS movie release is located on: a. CD-ROMs b. The space in between the sprocket holes c. The film edges d. Space next to the picture

• List of further reading/resources www.dolby.com www.dtsonline.com www.steinberg.net www.m-audio.com www.mackie.com

Chapter 19: Mastering

• Objectives
• Guide students through the mastering process and explain the role of the mastering engineer
• Discuss the various pieces of equipment used for mastering
• Explain the considerations in choosing a mastering engineer and in preparing for a mastering session

• Lecture and Discussion Ideas As suggested at the beginning of this manual, this short chapter may be combined with the following chapter on Product Manufacture. The one follows the other logically. The art of mastering and the entire mastering process is something that has always been shrouded in mystery and practiced by engineers regarded as “voodoo high priests” – very secretive about their craft. This chapter will assist the reader in understanding just what is involved in the mastering process and provides useful guidelines for those who are about to have their work mastered.


In class, a description of the role of mastering – the final stage before product manufacture – would be a good way to present the material.

• Terminology Crossfades (p.569) Dither (p.574) Dynamics (p.572) Equalization (p.571) Mastering (p.563) Mastering Engineer (p.563) Multiband Dynamic Processing (p.573) Transitions (p.569) Soundfile Resolution (p.574)
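Because dither and soundfile resolution appear in the list above, a very small numeric demonstration of why dither is added before reducing word length can be worthwhile. The Python sketch below quantizes a quiet ramp to a coarse step size with and without TPDF dither; it is a simplified illustration of the principle, not a description of any particular mastering tool:

    import random

    STEP = 1.0   # quantization step size (think: one LSB at the new, lower bit depth)

    def quantize(x):
        # Without dither, anything much smaller than one step simply vanishes
        # or turns into correlated, signal-dependent distortion.
        return STEP * round(x / STEP)

    def quantize_with_tpdf(x):
        # TPDF dither: the sum of two uniform random values (spanning about +/- one
        # step in total), added before rounding so the error becomes benign noise.
        dither = (random.random() - 0.5 + random.random() - 0.5) * STEP
        return STEP * round((x + dither) / STEP)

    signal = [0.3 * (i % 10) / 10 for i in range(20)]     # a quiet, repeating ramp
    print("plain   :", [quantize(s) for s in signal])
    print("dithered:", [quantize_with_tpdf(s) for s in signal])

The undithered output is pure silence – the low-level detail is simply gone – while the dithered output preserves the information as the average of many noisy rounding decisions, which is exactly the trade a mastering engineer accepts when reducing resolution.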

• Sample Exercise (in class) Just for fun, you might go to the AMG website – www.allmusic.com – and check out the credits of a number of selected albums to see who mastered each project. Have your students volunteer favorite albums to check out. You might see the same names coming up time and again; the best mastering engineers may work on hundreds of albums a year, so their names will come up quite often. Another website to check out for its articles on mastering is www.mixonline.com. Under the Recording tab are further breakdowns by category, including mastering. Another great site is www.soundonsound.com: under “Articles”, then “Technique”, is a section on mastering. This is an excellent resource for your students.

• Examination questions:
What are some of the reasons/needs for mastering an album?
What should be avoided when processing the dynamics of a mix?
List some of the tools used by mastering engineers.
Explain some of the choices made when assembling the sequence of a music project.


• List of further reading/resources www.mixonline.com www.soundonsound.com www.digido.com

Chapter 20: Product Manufacture

• Objectives
• Provide a broad overview of the manufacturing process for audio recordings
• Discuss the manufacturing of digital physical media such as CDs, DVDs, and Blu-Ray discs
• Discuss the process of disc cutting for vinyl records
• Discuss downloadable media and music streaming

• Lecture and Discussion Ideas To be honest, most textbooks on the subject of recording do not include this type of information. This is a good chapter, for it provides the reader with some ammunition to use when it comes time for them to “try to do it on their own.” The manufacturing process is not one of the more tantalizing aspects of the record industry, but it is a necessity. And that necessity is changing, as is evidenced by the author’s references to web publishing. In your lecture, which might be dovetailed with the previous chapter on Mastering, it would be appropriate to reinforce the concept that this process is the end of the chain. Aside from the marketing and publicity of the recorded work, there are no more technological hurdles to cross. In this case, it might be interesting for you as a class to take a look at not only some traditionally produced and marketed label releases, but also to examine some of the newer, non-traditional releases. At the last Grammy Awards, an award was given to an artist who never manufactured a single copy of her work. Her entire presence was mediated by the web. We are living in an age where the ability to have CDs pressed or cassettes duplicated no longer matters. In this case, not only were there no duplication costs, but there were no other tangible costs such as printing, publicity, or tour support. It is indeed a brave new world out there for the dissemination of recorded works. In your class, you might wish to hold an open discussion with your students about where they see the record industry headed over the next five years or so.

• Terminology


45/45 Cutting System (p.591) Art Proof (p.585) Blu-Ray (p.588) CD Burning (p.585) CD-R/CD-RW (p.581) Coarse Pitch (p.593) Compact Disc (CD, p.579) Delayed Path (p.593) Disc Cutting Head (p.592) Disc-Cutting Lathe (p.592) Double-layer DVD (p.588) DVD (p.588) Glass Master (p.582) Groove Pitch (p.593) Lacquer (p.594) Land (p.593) Lathe Bed (p.592) Lathe Cutting Head (p.592) Lathe Sled (p.592) Microgroove Pitch (p.593) Mother (p.596) Music Streaming (p.579) Overcut (p.593) Physical Media (p.579) Playback Stylus (p.594) Pitch/depth Control Computer (p.592) Stamper (p.596) Subcodes (p.580) Test Pressing (p.585) Turntable (p.592) Twinning (p.593) Undelayed Path (p.593) Vinylite Biscuit (p.596)

• Sample Exercise (in class) In class you might break your students into groups – each group representing a record label. Have each group determine how they are going to break a new artist and how they as a label are going to distribute their new, hot, artist’s material. Without getting too much into the music business side of things (we’ll reserve that for other classes) have your students focus upon the technology required to distribute the material. Have them map out a plan with respect to duplication or distribution of the recorded work. Collect the results after 15 minutes or so and discuss their ideas.

• Examination questions:


What is the difference between a manufacturing facility that performs services “in-house” vs. those that provide “out-sourcing” of their services?
What is the purpose of a “test” pressing?
On a compact disc, what represents a logical binary “1” or “0”?
What are some of the advantages to DVD-R over CD-R?
What is the maximum size of a DVD?
The CDs you buy of your favorite artist conform to what standard? a. CD-Orange Book b. CD-Red Book c. CD-Blue Book d. CD-Yellow Book

• List of further reading/resources www.billboard.com/ www.mixonline.com www.isourcebook.com www.artistpro.com www.cdrfaq.org Billboard Tape/Disc Directory Mix Master Directory The Recording Industry Sourcebook

Chapter 21: Studio Tips and Techniques

• Objectives
• Discuss the role of a producer
• Examine the steps necessary to produce a recording
• Provide the reader with a checklist of what to get ready prior to going into the studio
• Detail the steps for recording music in the studio


• Lecture and Discussion Ideas This chapter represents a culmination of all of the material presented thus far in the book. It is in this chapter that the entire process of going into the studio and recording is laid out step by step. In your class it would be a good idea to review each of the steps beginning with preparation, advancing to tracking rhythm parts, proceeding to overdubs and punches, to the final mix. What are the roles played by the producer? How about the various studio personnel? What role does the record label play? In your class, engage your students in all of these areas to get their input.

• Terminology Archive (p.606) Compositing (p.605) Copyright (p.601) Mixdown (p.606) Overdubbing (p.604) Producer (p.598) Punch-in (p.605)

• Sample Exercise (in class) For this chapter, it might make sense to break your class up into small groups of four or five and declare that each is a record label. Have them determine within each group what kind of label they are, what kind of music they are going to produce, and what position each person in the group will hold. Each “label” is in the process of getting an artist they’ve signed into the studio. They should tell you something about the artist or the band (genre of music, influences, etc.). Now, have them hire a producer and select a studio where they will record. Will additional musicians be needed? Is the material practiced and “worked up”? Have them define the recording schedule. Once the recording is complete they’ll need to hire someone to mix the project. Following this where will they have their album mastered. Where will it be duplicated? Each group should create a flow chart to indicate all of the steps and procedures they’ll follow to get their next “hit” released. To make things more interesting establish a budget for the entire project and have each “label” determine where and how the money will be spent.

• Examination questions:
What are the two possible roles a producer may take?
What are some of the things that an artist should do prior to going into the studio?
Why is it important to copyright your songs?
Is the headphone (cue) mix important? Discuss.
Describe some of the things that should be done when setting up in the studio for a recording session.
What are some of the concerns to deal with when backing up data after a session?

• List of further reading/resources www.copyright.gov/forms/formsri.pdf How to Make and Sell Your Own Recording, 5th Edition, Dianne Sward Rapaport The Guerilla Music Marketing Handbook, Bob Baker This Business of Music Marketing and Promotion, Tad Lathrop “The Sound of Money” in Audio Recording for Profit, Chris Stone

Chapter 22: Yesterday, Today, and Tomorrow

• Objectives
• Provide an historical overview of the recording process
• Examine the state of recording technology today
• Predict what the future of recording might entail

• Lecture and Discussion Ideas Well, it is the end of the semester. Time to review all that’s been covered thus far for that infamous final exam. This chapter puts everything in a nice package and will allow you to spend as much or as little time with it as you can to make time for that all important review. Hope you had fun with this book and the material!

• Terminology


Alan Dower Blumlein (p.613) Alexander M. Poniatoff (p.614) Ampex 200 (p.613) Bing Crosby (p.614) Integrated Circuit (p.614) Jack Mullin (p.614) Mary C. Bell (p.614) Peter Gotcher (p.616) Sound Designer (p.617)

• List of further reading/resources www.aes.org/aeshc www.museumofsound.org www.synthmuseum.org www.tinfoil.com www.rockhall.com www.usd.edu/smm www.computerhistory.org www.emplive.com www.lovesphere.org/mosr/