Key Takeaways
1. Sound Fundamentals: The Building Blocks of Audio Engineering
Understanding these characteristics is essential to make a decent recording, become a competent mix engineer, and generally increase your knowledge about audio.
Sound as Vibration. Sound is fundamentally a vibration that travels through a medium such as air, characterized by compression and rarefaction. Compression occurs when molecules are pushed together; rarefaction occurs when they spread apart. These vibrations create sound waves, which possess seven key characteristics: amplitude (volume), frequency (pitch), phase (timing), velocity (speed), wavelength (physical length), harmonics (timbre), and envelope (ADSR).
Sine Wave Illustration. While real-world sounds are complex, the sine wave serves as a basic model for understanding sound wave characteristics. Amplitude, measured in decibels (dB), relates to the height of the wave and corresponds to volume. Frequency, measured in hertz (Hz), indicates the number of cycles per second and determines pitch.
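These two measurements are easy to see in code. The sketch below is illustrative only (it is not from the book; the function names and the reference amplitude `a_ref` are my own assumptions) and shows a sine wave as a function of time plus the standard amplitude-to-decibel conversion:

```python
import math

def sine_sample(t, freq_hz, amplitude=1.0, phase=0.0):
    """One sample of a sine wave at time t (seconds):
    amplitude sets wave height (volume), freq_hz sets cycles per second (pitch)."""
    return amplitude * math.sin(2 * math.pi * freq_hz * t + phase)

def amplitude_to_db(a, a_ref=1.0):
    """Relative level in decibels: dB = 20 * log10(A / A_ref)."""
    return 20 * math.log10(a / a_ref)

# Halving the amplitude of a tone lowers its level by about 6 dB:
print(round(amplitude_to_db(0.5), 1))  # -6.0
```

The -6 dB result for a halved amplitude is a useful rule of thumb when reasoning about fader moves and gain staging.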
Seven Key Characteristics. Mastering these seven characteristics is crucial for audio engineers. Understanding how they interact allows for precise control over sound during recording and mixing. For example, adjusting amplitude affects volume, while manipulating frequency shapes the tonal balance. Recognizing harmonics helps differentiate instruments, and understanding the envelope allows for shaping the attack, decay, sustain, and release of a sound.
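The envelope in particular lends itself to a quick sketch. The following simplified ADSR generator is not from the book (the function and parameter names are my own), and it assumes the note is held past the attack and decay phases:

```python
def adsr(t, attack, decay, sustain_level, release, note_off):
    """Envelope value (0..1) at time t for a note released at note_off seconds.
    Simplified: assumes note_off falls after attack + decay have completed."""
    if t < attack:                      # Attack: ramp 0 -> 1
        return t / attack
    if t < attack + decay:              # Decay: ramp 1 -> sustain_level
        frac = (t - attack) / decay
        return 1.0 - frac * (1.0 - sustain_level)
    if t < note_off:                    # Sustain: hold at sustain_level
        return sustain_level
    if t < note_off + release:          # Release: ramp sustain_level -> 0
        return sustain_level * (1.0 - (t - note_off) / release)
    return 0.0
```

Multiplying a raw waveform by this envelope is what turns a steady tone into something with a percussive attack or a slow swell.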
2. The Human Ear: Understanding How We Perceive Sound
According to the Fletcher–Munson curve, most humans will hear all frequency ranges equally between 80 and 85 dB.
Ear as Transducer. The human ear, like a microphone, acts as a transducer, converting acoustic vibrations into mechanical energy, which is then transformed into electrical impulses sent to the brain. The ear is divided into three sections: outer, middle, and inner. The middle ear amplifies sound and helps protect against loud noises via its acoustic reflex, while the inner ear converts the vibrations into the nerve impulses the brain interprets.
Frequency Perception. Humans perceive frequencies differently, with midrange frequencies (200 Hz to 5 kHz) being heard best, followed by highs (5 kHz to 20 kHz) and then lows (20 Hz to 200 Hz). However, at volumes between 80 and 85 dB, all frequency ranges are heard more equally, according to the Fletcher-Munson curve. This is why monitoring mixes at this level is recommended.
Protecting Your Hearing. Extended exposure to loud sounds can cause permanent hearing loss. Earplugs are essential for loud concerts and other high-volume events. Different types of earplugs are available, ranging from foam plugs that reduce volume and high frequencies to custom-molded plugs that retain the frequency range while decreasing amplitude.
3. EQ Essentials: Mastering Frequencies for Clear Audio
Understanding the different frequency ranges and how to describe them is a necessary skill before you can begin to equalize.
Equalization Defined. Equalization (EQ) involves adjusting the amplitude of specific frequencies or frequency ranges to alter the tone of a sound. This can be achieved through standalone rack units, channel strips, or software plugins. Understanding frequency ranges is crucial for effective EQ.
Frequency Ranges. Low frequencies (20-200 Hz) are omnidirectional and provide power, while mid frequencies (200 Hz-5 kHz) are more directional and add attack. High frequencies (5-20 kHz) are extremely directional and add brightness. Each range has distinct characteristics, and understanding these helps in describing and manipulating sound.
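The three ranges above can be captured in a tiny helper. This is illustrative only (the function name and labels are mine, not the book's; the boundaries are the ones given above):

```python
def frequency_band(hz):
    """Label a frequency with one of the three broad ranges described above."""
    if hz < 20:
        return "sub-audible"
    if hz < 200:
        return "low"    # omnidirectional, provides power
    if hz < 5000:
        return "mid"    # more directional, adds attack
    if hz <= 20000:
        return "high"   # extremely directional, adds brightness
    return "ultrasonic"

print(frequency_band(80), frequency_band(1000), frequency_band(10000))
```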
Subtractive EQ. Subtractive equalization involves cutting frequencies to create space and clarity in a mix. This technique is favored by professional engineers to avoid masking, where louder sounds obscure quieter ones. By removing unwanted frequencies, space is created for other instruments, resulting in a clearer, more defined mix.
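At its simplest, subtractive EQ is a filter that removes a range of frequencies. As an illustration only (this is not the book's method, just a standard one-pole high-pass recurrence, with names of my choosing), here is a filter that cuts low end below a chosen cutoff, e.g. to clear mud out from under a bass guitar:

```python
import math

def high_pass(samples, cutoff_hz, sample_rate):
    """One-pole high-pass filter: attenuates content below cutoff_hz.
    Recurrence: y[n] = a * (y[n-1] + x[n] - x[n-1]), a = RC / (RC + dt)."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    a = rc / (rc + dt)
    out, y = [], 0.0
    prev_x = samples[0] if samples else 0.0
    for x in samples:
        y = a * (y + x - prev_x)  # only changes in the input pass through
        prev_x = x
        out.append(y)
    return out
```

Feeding it a constant (0 Hz) signal yields silence, while a signal near the Nyquist frequency passes almost untouched, which is exactly the "cut the lows, keep the highs" behavior subtractive EQ relies on.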
4. People Skills: The Unsung Hero of Recording Success
One of the easiest ways to influence a recording does not require a knob or fader, or any technical skills at all. It requires you to be positive and supportive of the artist(s).
Likeability Matters. In the recording industry, people skills are often as important as technical expertise. Being positive and supportive creates a comfortable environment for artists, leading to better performances. Word-of-mouth referrals are crucial, so maintaining a good reputation is essential.
Vibe and Atmosphere. The "vibe" of a recording session can significantly impact the music. Engineers should strive to maintain a positive atmosphere by offering encouragement and avoiding negativity. Creating a comfortable and inspiring space can enhance the artist's performance.
Opinions and Boundaries. While technical opinions are welcome, creative input should be offered cautiously. It's important to remember that the artist's vision takes precedence. Patience, especially with inexperienced musicians, is key to a successful session.
5. Microphone Mastery: Capturing Sound with Precision
Microphones, or mics, are used to capture a sound much like our ears.
Microphone as Palette. Microphones are essential tools for audio engineers, akin to a painter's color palette. Each microphone choice influences the texture, tone, and color of a recording. Three major categories to consider when choosing a microphone are transducer type, directional characteristic, and frequency response.
Transducer Types. Dynamic microphones are rugged and handle loud sounds well, making them ideal for drums and amplifiers. Condenser microphones offer excellent transient response and clarity, suitable for vocals and acoustic instruments. Ribbon microphones provide a smoother, darker tone, often used for room sounds and drum overheads.
Directional Characteristics. A microphone's pickup pattern determines the direction from which it captures sound. Cardioid patterns are most common, rejecting sound from the rear. Omnidirectional patterns pick up sound equally from all directions. Understanding these patterns helps control isolation and environmental sound.
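The common first-order pickup patterns can all be described by one formula, gain = p + (1 - p)·cos(θ), where θ is the angle off-axis. This sketch is mine, not the book's (the parameter `p` is a standard way to blend the omnidirectional and figure-8 components):

```python
import math

def pickup_gain(theta_deg, p):
    """First-order polar pattern: gain = p + (1 - p) * cos(theta).
    p = 1.0 -> omnidirectional, p = 0.5 -> cardioid, p = 0.0 -> figure-8."""
    return p + (1 - p) * math.cos(math.radians(theta_deg))

# Cardioid: full sensitivity on-axis, near-zero at the rear.
print(pickup_gain(0, 0.5))    # -> 1.0
print(pickup_gain(180, 0.5))  # -> ~0.0 (rejects sound from behind)
```

Evaluating the same formula at p = 1.0 confirms the omnidirectional case: the gain stays 1.0 at every angle, which is why omni mics capture more of the room.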
6. Mixing Consoles: Command Central for Audio Control
Think of the console as command central, controlling audio and its various components.
Console Functions. Mixing consoles, whether physical or virtual, serve as command centers for audio control. They feature faders for volume, pan knobs for positioning, auxiliary sections for effects, and various other functions. Mastering the console is essential for audio production.
Channel Strip Components. The channel strip is the fundamental building block of a mixing console. Key components include the preamp, EQ section, auxiliary sends, pan control, fader, and bus assignments. Understanding the function of each component is crucial for manipulating sound.
Master Section. In addition to channel strips, mixing consoles have a master section that controls overall volume, auxiliary sends, and speaker selection. Features like the mono button and talkback mic enhance workflow and monitoring capabilities.
7. Signal Processors: Enhancing and Manipulating Sound
Signal processors take an original signal and manipulate it to produce a new sound.
Signal Processor Defined. Signal processors manipulate audio signals to create new sounds or control existing ones. Common types include compressors, limiters, gates, equalizers, reverbs, and delays. These processors can be used to enhance, correct, or creatively alter audio.
Dynamic Range Control. Compression is used to control the dynamic range of a sound, making its volume more consistent. Key parameters include threshold, ratio, attack, and release. Gates switch a signal on or off based on its amplitude and are commonly applied to drums.
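The threshold/ratio relationship is simple arithmetic on decibel levels. The sketch below is illustrative (function names and default values are my own, and it models only the static curve, ignoring attack and release timing):

```python
def compress_db(level_db, threshold_db=-20.0, ratio=4.0):
    """Static compressor curve: below the threshold the signal is untouched;
    above it, the output rises only 1 dB for every `ratio` dB of input."""
    if level_db <= threshold_db:
        return level_db
    return threshold_db + (level_db - threshold_db) / ratio

def gate_db(level_db, threshold_db=-50.0):
    """Noise gate: mute anything below the threshold, pass the rest."""
    return level_db if level_db >= threshold_db else float("-inf")

# A peak at -8 dB is 12 dB over a -20 dB threshold; at 4:1 that 12 dB
# of overshoot becomes 3 dB, so the peak comes out at -17 dB.
print(compress_db(-8.0))  # -17.0
```

The 9 dB of gain reduction in that example is what would show on a compressor's meter, and it is why heavily compressed tracks need make-up gain afterward.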
Time-Based Effects. Reverb simulates acoustic spaces, adding depth and dimension to recordings. Delay creates repeating signals, adding space and texture. Other time-based effects include chorus, flanger, and phaser, each offering unique sonic characteristics.
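A basic feedback delay can be sketched in a few lines. This is my own minimal illustration, not the book's (a sample-buffer echo with `feedback` controlling how fast the repeats die away and `mix` blending dry against wet):

```python
def feedback_delay(samples, delay_samples, feedback=0.5, mix=0.5):
    """Echo effect: output mixes the dry input with a delayed copy of the
    wet signal; feedback < 1.0 makes each successive repeat quieter."""
    wet_buf = [0.0] * len(samples)
    out = []
    for n, x in enumerate(samples):
        delayed = wet_buf[n - delay_samples] if n >= delay_samples else 0.0
        wet_buf[n] = x + feedback * delayed   # feed the echo back on itself
        out.append((1 - mix) * x + mix * delayed)
    return out
```

Running a single impulse through it produces the familiar train of echoes, each one `feedback` times quieter than the last; chorus, flanger, and phaser effects are built on the same delayed-copy idea with much shorter, modulated delay times.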
8. Signal Flow: Directing Audio Traffic with Expertise
A key component of audio engineering is to fully understand the analog model of signal flow, that is, how sound travels from the beginning to the end of its audio path.
Signal Flow Importance. Understanding signal flow is crucial for troubleshooting and maintaining a smooth recording session. It involves tracing the path of audio from the microphone to the speakers, identifying each component and its function. Key terms include preamp, mic, line, tape send, tape return, and stereo bus.
Analog Model. In a traditional setup, signal flow starts with the microphone, which is connected to a preamp. The preamp boosts the signal, which is then sent to a recorder. The recorded signal returns to the console for monitoring and mixing.
Digital Application. The analog model of signal flow applies to digital recording as well. Whether using a separate console and recorder or a laptop with an audio interface, understanding the signal path is essential for capturing, storing, and monitoring sound.
9. Studio Session Procedures: Orchestrating a Smooth Recording
Although every recording session tends to be different, most recording sessions will include these seven stages...
Seven Stages. Most recording sessions follow seven stages: preproduction, setup, basic tracks, overdubs, rough mixes, mixing, and mastering. Each stage requires specific tasks and considerations. Understanding these stages helps in planning and managing time effectively.
Preproduction. Preproduction involves song selection, arrangement, tempo determination, and rehearsal. This stage is crucial for ensuring a smooth and efficient recording session. Well-prepared artists save time and money.
Mixing and Mastering. Mixing involves balancing, EQing, and processing recorded tracks to create a stereo mix. Mastering is the final stage, where the overall sound is polished and prepared for distribution. Each stage requires specific skills and attention to detail.
10. Basic Acoustics: Optimizing Your Recording Space
Acoustics is the science of sound and can help determine the quality of sound transmitted in a room or space.
Acoustic Principles. Acoustics is the science of sound and how it behaves in a space. Key concepts include reflection, absorption, diffraction, and diffusion. Understanding these principles helps in optimizing a recording environment.
Room Treatment. Acoustic treatment involves using materials to control reflections and standing waves. Common treatments include bass traps, diffusers, and absorption panels. The goal is to create a balanced and accurate sound field.
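Standing waves are tied directly to room dimensions: the axial modes for one dimension fall at f = n·c / (2L). The helper below is illustrative only (names and the 343 m/s speed-of-sound default are my own assumptions, not the book's):

```python
def axial_modes(length_m, count=4, speed_of_sound=343.0):
    """Axial standing-wave (room mode) frequencies for one room dimension:
    f_n = n * c / (2 * L), for n = 1..count."""
    return [n * speed_of_sound / (2 * length_m) for n in range(1, count + 1)]

# A 4 m wall-to-wall dimension piles up low-end energy near these frequencies:
print([round(f) for f in axial_modes(4.0)])  # [43, 86, 129, 172]
```

Knowing roughly where these build-ups sit tells you why bass traps target the low end and where to listen for boomy notes when evaluating a room.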
DIY Solutions. Affordable acoustic treatments can be built using readily available materials. Gobos, diffusers, and bass traps can be constructed to improve the sound of a home studio. Experimentation and careful listening are key to achieving optimal results.
11. Audio History: Learning from the Past to Shape the Future
To fully understand digital audio technology, you need to understand how, when, and why we ended up digitizing audio.
Analog's Legacy. Analog recording, characterized by continuous voltage levels, dominated the industry for over a century. Key milestones include the invention of the phonograph, the development of stereo recording, and the introduction of multi-track tape machines. Pioneers like Les Paul and Tom Dowd shaped modern recording techniques.
Digital Revolution. Digital audio, which samples sound and encodes it using binary code, revolutionized the industry. The development of the Compact Disc (CD) and software like Pro Tools marked significant advancements. Digital technology offered greater flexibility and accessibility.
Analog vs. Digital. Analog offers a warm, organic sound, while digital provides clarity and precision. Each has its advantages and disadvantages. Many modern studios combine the best of both worlds, using analog gear for its tonal qualities and digital tools for editing and manipulation.
12. Internships: Gaining Real-World Experience
An internship is a way to get your foot in the door. It is not to teach you audio basics. This is an important distinction.
Value of Internships. Internships provide invaluable real-world experience and networking opportunities. They offer a chance to learn from experienced professionals and gain practical skills. However, interns should possess basic audio knowledge before starting.
Intern Responsibilities. Interns should expect to perform a variety of tasks, ranging from cleaning and organizing to assisting with recording sessions. A positive attitude, willingness to learn, and strong work ethic are essential.
Getting an Internship. Networking, demonstrating enthusiasm, and possessing basic skills are key to securing an internship. Persistence and a willingness to start at the bottom are often necessary. A good internship can launch a successful career in audio engineering.
Review Summary
Audio Engineering 101 receives mostly positive reviews, with an average rating of 4.06 out of 5. Readers appreciate its comprehensive coverage of basic audio engineering concepts, making it ideal for beginners and those interested in home recording or podcasting. The book explains fundamental topics like EQ, compression, and microphone selection. While some find it too basic for experienced users, many praise its clear explanations and practical advice. A few reviewers note its focus on music production may limit its usefulness for podcasters, but overall, it's considered a valuable resource for building a strong foundation in audio engineering.