Unit I: Fundamentals of Sound and Audio Basics
1.1 Fundamentals of Sound, Perception of Sound
Sound as a Mechanical Wave
Definition:
Sound is a mechanical wave that requires a medium (such as air, water, or solid materials) to
travel. Unlike light, it cannot propagate through a vacuum.
Nature of Sound Waves:
Sound waves are longitudinal waves, meaning that the particles of the medium vibrate
parallel to the direction of wave travel. As the sound wave moves, it creates alternating areas
of compression (high pressure) and rarefaction (low pressure).
Characteristics of Sound
Frequency (Pitch):
Frequency refers to the number of wave cycles that pass a point per second, measured in
Hertz (Hz). Higher frequencies are perceived as higher-pitched sounds. For example, a violin
produces higher frequencies than a bass guitar.
Amplitude (Loudness):
Amplitude is the magnitude of the pressure variation in a sound wave and determines how
loud or soft the sound is perceived. Loudness levels are expressed in decibels (dB): a whisper
may be around 30 dB, while a rock concert can exceed 100 dB.
Wavelength:
Wavelength is the distance between two consecutive points in phase on a wave (such as two
compressions). It is inversely proportional to frequency (wavelength = velocity / frequency):
a higher frequency means a shorter wavelength.
Velocity:
The speed of sound varies depending on the medium. In air at room temperature, it travels
at approximately 343 m/s, but it moves faster in liquids and solids.
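These three quantities are tied together by the relation velocity = frequency × wavelength. The snippet below illustrates this in Python; the speed values are the usual textbook approximations, not exact figures for a specific temperature or medium.

```python
# v = f * wavelength, so wavelength = v / f
SPEED_OF_SOUND_AIR = 343.0     # m/s, air at roughly room temperature
SPEED_OF_SOUND_WATER = 1480.0  # m/s, approximate value for fresh water

def wavelength(frequency_hz, speed_mps=SPEED_OF_SOUND_AIR):
    """Return the wavelength in meters for a given frequency and medium speed."""
    return speed_mps / frequency_hz

print(f"440 Hz in air:   {wavelength(440):.2f} m")                        # ~0.78 m
print(f"440 Hz in water: {wavelength(440, SPEED_OF_SOUND_WATER):.2f} m")  # ~3.36 m
print(f"20 kHz in air:   {wavelength(20_000):.4f} m")  # higher frequency -> shorter wave
```

The same 440 Hz tone is more than four times longer in water than in air, because sound travels faster there while the frequency stays the same.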
Human Perception of Sound
The Hearing Process:
When sound enters the ear, it causes the eardrum to vibrate. These vibrations are
transmitted through the ossicles (tiny bones in the middle ear) to the cochlea, which
contains hair cells that convert mechanical energy into electrical signals sent to the brain via
the auditory nerve.
Pitch Perception:
Pitch is how high or low a sound is perceived. It is directly related to frequency. The human
ear can typically hear frequencies between 20 Hz and 20,000 Hz (20 kHz).
Loudness Perception:
Loudness is the subjective perception of sound intensity. It’s not only about amplitude but
also depends on frequency and ear sensitivity. For instance, we are more sensitive to
frequencies between 2 kHz and 5 kHz.
Timbre (Tone Quality):
Timbre distinguishes different sound sources even if they play the same pitch and loudness.
It’s influenced by the harmonics and overtones present in the sound. For example, a piano
and a flute playing the same note sound different because of their unique timbres.
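One way to make timbre concrete is additive synthesis: the sketch below builds two tones at the same pitch but with different harmonic recipes, so they match in pitch yet differ in tone color. The harmonic weights are invented caricatures for illustration, not measured instrument spectra.

```python
import numpy as np

SAMPLE_RATE = 44100  # samples per second

def tone_with_harmonics(f0, harmonic_amps, duration=1.0):
    """Sum the fundamental f0 and its integer harmonics with the given amplitudes."""
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    signal = np.zeros_like(t)
    for n, amp in enumerate(harmonic_amps, start=1):  # n = 1 is the fundamental
        signal += amp * np.sin(2 * np.pi * n * f0 * t)
    return signal / np.max(np.abs(signal))  # normalize to avoid clipping

# Same pitch (220 Hz), different harmonic balance -> different timbre.
mostly_fundamental = tone_with_harmonics(220, [1.0, 0.2, 0.05])
rich_in_harmonics = tone_with_harmonics(220, [1.0, 0.8, 0.6, 0.5, 0.4])
```

Played back at equal loudness, both tones register as the same note, but the second sounds brighter and fuller because of its stronger upper harmonics.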
Influences on Sound Perception
Environmental Acoustics:
The characteristics of the room or space, such as its size, shape, and materials, affect how
sound reflects, is absorbed, and resonates within it. This is why sound in a cathedral differs
from sound in a small studio.
Listener Positioning:
The position of the listener relative to the sound source affects perception. Sitting closer to
speakers or instruments can make the sound seem louder or clearer. In stereo or surround
setups, positioning also affects spatial perception.
Psychoacoustic Principles
Masking:
One sound can make it difficult to hear another, especially if both are at similar frequencies.
For example, background music can mask dialogue in a film if not mixed properly.
Directionality (Localization):
Humans use binaural hearing (two ears) to determine the direction and distance of a sound
source. This includes interaural time difference (ITD) and interaural level difference (ILD).
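ITD can be approximated with a simple path-difference model: a sound arriving from an angle θ off center travels roughly d·sin(θ) farther to reach the far ear, where d is the spacing between the ears. The sketch below uses an assumed spacing of 0.18 m and ignores diffraction around the head, so the numbers are only indicative.

```python
import math

EAR_SPACING_M = 0.18    # assumed distance between the ears
SPEED_OF_SOUND = 343.0  # m/s in air

def interaural_time_difference(azimuth_deg):
    """Approximate ITD in seconds for a source at the given azimuth
    (0 = straight ahead, 90 = fully to one side)."""
    return EAR_SPACING_M * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

for angle in (0, 30, 90):
    print(f"{angle:>2} degrees -> ITD ~ {interaural_time_difference(angle) * 1e6:.0f} us")
```

Even the maximum delay, for a source directly to one side, is only about half a millisecond, yet the brain resolves far smaller differences when localizing sound.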
Auditory Illusions:
These are tricks the brain plays based on how it interprets sound. Famous examples include
the Shepard tone (a tone that seems to ascend endlessly) and phantom words (illusory words
that listeners believe they hear in repeating, overlapping speech sounds).
1.2 Sound Intensity & Levels, Tone Controls
Sound Intensity and Measurement
Definition of Sound Intensity:
Sound intensity refers to the amount of energy that a sound wave carries through a unit area
per second, measured in watts per square meter (W/m²). It is a measure of how powerful a
sound is and correlates with how loud listeners perceive it to be.
Unit of Measurement – Decibels (dB):
Sound intensity level is expressed in decibels (dB), a logarithmic unit. The scale is relative to
a reference intensity (10⁻¹² W/m², roughly the threshold of hearing), and each increase of
10 dB represents a tenfold increase in intensity.
o Example: A normal conversation is around 60 dB, while a jet engine at close range
can reach 120 dB or more.
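This relationship is usually written as L = 10·log₁₀(I/I₀), where I₀ = 10⁻¹² W/m² is the standard reference intensity. A quick check in Python confirms the tenfold rule:

```python
import math

I0 = 1e-12  # reference intensity in W/m^2 (approximate threshold of hearing)

def intensity_to_db(intensity_w_m2):
    """Convert a sound intensity in W/m^2 to a sound intensity level in dB."""
    return 10 * math.log10(intensity_w_m2 / I0)

print(intensity_to_db(1e-6))  # 60.0 dB: roughly the conversation example above
print(intensity_to_db(1e-5))  # 70.0 dB: ten times the intensity, only +10 dB
print(intensity_to_db(1.0))   # 120.0 dB: around the jet-engine example above
```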
Dynamic Range:
The range between the softest and loudest sounds that can be heard or recorded is called
the dynamic range. In audio production, managing this range is key to achieving clarity and
impact.
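In digital recording these ideas surface as the noise floor (the residual hiss beneath the signal) and headroom (the space left before clipping at 0 dBFS). Below is a minimal sketch of measuring both, assuming a mono float signal in the range -1.0 to 1.0 and a separate noise-only segment to estimate the floor:

```python
import numpy as np

def peak_dbfs(signal):
    """Peak level in dB relative to digital full scale (|sample| of 1.0 = 0 dBFS)."""
    return 20 * np.log10(np.max(np.abs(signal)))

def dynamic_range_db(signal, noise_segment):
    """Rough dynamic range: loudest peak versus the RMS level of a noise-only segment."""
    noise_rms = np.sqrt(np.mean(noise_segment ** 2))
    return peak_dbfs(signal) - 20 * np.log10(noise_rms)

t = np.linspace(0, 1, 44100, endpoint=False)
music = 0.5 * np.sin(2 * np.pi * 440 * t)  # peak 0.5 -> about -6 dBFS
hiss = 0.001 * np.random.randn(44100)      # simulated noise floor
print(f"headroom:      {-peak_dbfs(music + hiss):.1f} dB")  # ~6 dB left before clipping
print(f"dynamic range: {dynamic_range_db(music + hiss, hiss):.1f} dB")
```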
Tone Controls
Definition and Purpose:
Tone controls are used to adjust the frequency response of an audio signal, helping to tailor
the sound’s tonal character to suit a specific mix, environment, or listener preference.
Basic Tone Control Bands (a rough code sketch follows this list):
o Bass (Low Frequencies): Typically ranges from 20 Hz to 250 Hz. Boosting bass gives
warmth or fullness; cutting it can reduce muddiness.
o Midrange (Middle Frequencies): Covers 250 Hz to 4 kHz. These frequencies affect
clarity, presence, and vocal definition. The midrange is crucial in most audio.
o Treble (High Frequencies): From 4 kHz to 20 kHz, affecting brightness and detail.
Boosting treble makes audio sparkle; too much can lead to harshness.
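As a rough code illustration of those three bands, the sketch below splits a signal at 250 Hz and 4 kHz with Butterworth filters and applies a separate gain to each band before recombining them. It is a crude stand-in for a real tone-control circuit and assumes SciPy is available.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def three_band_tone_control(x, fs, bass_gain=1.0, mid_gain=1.0, treble_gain=1.0):
    """Crude bass/mid/treble control: split at 250 Hz and 4 kHz, scale, recombine.
    Gains are linear: 1.0 leaves a band unchanged, 2.0 is roughly +6 dB."""
    low = butter(4, 250, btype="lowpass", fs=fs, output="sos")
    mid = butter(4, [250, 4000], btype="bandpass", fs=fs, output="sos")
    high = butter(4, 4000, btype="highpass", fs=fs, output="sos")
    return (bass_gain * sosfilt(low, x)
            + mid_gain * sosfilt(mid, x)
            + treble_gain * sosfilt(high, x))

fs = 44100
x = np.random.randn(fs)  # stand-in for one second of real audio
y = three_band_tone_control(x, fs, bass_gain=0.8, treble_gain=1.3)  # less mud, more sparkle
```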
Applications in Audio Systems:
o Equalizer (EQ) knobs or sliders on mixers and speakers adjust tone controls.
o Useful in live sound setups, home audio systems, and studio mixing to create a
more pleasing or accurate sound.
Importance in Audio Production
Mixing and Mastering:
Proper understanding of intensity levels and tone controls allows for professional balancing
of tracks, avoiding distortion, ensuring consistency across playback systems, and enhancing
the overall sonic experience.
Adaptation to Playback Systems:
Different environments (e.g., headphones, studio monitors, car speakers) may emphasize or
mask certain frequencies. Tone controls help compensate for these variations.
Creative Control:
Artists and sound designers use intensity and tone shaping not just for correction, but also
for emotional and narrative effects, such as making a scene feel tense, warm, distant, or
aggressive.
1.3 Equalization, Dynamics & Compression, Noise Floor & Headroom
Equalization (EQ)
Definition:
Equalization is the process of adjusting the balance of different frequency components in an
audio signal. It is used to shape the tonal quality of sound.
Purpose of EQ:
o Enhance clarity, presence, and definition of audio.
o Remove or reduce unwanted frequencies (e.g., hum, rumble, sibilance).
o Carve space in the mix for each element (instrument or voice).
Types of Equalizers (a parametric-EQ sketch follows this list):
o Graphic EQ: Fixed frequency bands with sliders to boost/cut.
o Parametric EQ: Allows adjustment of frequency, gain, and Q (bandwidth) for precise
control.
o Shelving EQ: Boosts/cuts all frequencies above (high shelf) or below (low shelf) a
certain point.
o Notch/Peak EQ: Used to isolate and remove very specific frequencies.
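To make the parametric type concrete, here is a minimal peaking-EQ biquad based on the widely cited Audio EQ Cookbook (Robert Bristow-Johnson) formulas; its frequency, gain, and Q arguments correspond directly to the three parametric controls described above. This is one standard digital realization, not the only way to build such a filter.

```python
import math
import numpy as np
from scipy.signal import lfilter

def peaking_eq_coefficients(fs, f0, gain_db, q):
    """Biquad (b, a) coefficients for a peaking EQ, per the RBJ Audio EQ Cookbook."""
    A = 10 ** (gain_db / 40)        # linear amplitude factor from the dB gain
    w0 = 2 * math.pi * f0 / fs      # center frequency in radians per sample
    alpha = math.sin(w0) / (2 * q)  # bandwidth term controlled by Q
    cos_w0 = math.cos(w0)
    b = np.array([1 + alpha * A, -2 * cos_w0, 1 - alpha * A])
    a = np.array([1 + alpha / A, -2 * cos_w0, 1 - alpha / A])
    return b / a[0], a / a[0]  # normalize so a[0] = 1

fs = 44100
b, a = peaking_eq_coefficients(fs, f0=3000, gain_db=-4.0, q=4.0)  # narrow 4 dB cut at 3 kHz
x = np.random.randn(fs)  # stand-in for a real signal
y = lfilter(b, a, x)     # apply the filter
```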
1.4 Ambient Sounds, Spot Effects, Foley Sound Effects
Ambient Sounds
Ambient sounds are background noises that set the overall atmosphere and establish the
environment of a scene. Under this topic, note the following subtopics:
Definition & Purpose: Ambient sounds serve as the underlying audio landscape that defines
a location and contributes to the overall mood without drawing attention to any particular
element on screen.
Examples & Settings: Typical examples include natural sounds like a forest’s birdsong, the
rustle of leaves, or urban sounds such as city traffic. These sounds are continuous and help
to create a sense of place.
Spot Effects
Spot effects are targeted, discrete sounds that are directly linked to specific actions or events
occurring in the visual narrative. They are crucial for emphasizing key moments. The subtopics here
include:
Definition & Function: Spot effects are short, isolated audio cues used to highlight actions or
events. Unlike ambient sounds, they focus on individual, on-screen occurrences.
Examples: Common examples include a gunshot, door slamming, or the sound of glass
shattering. Such effects draw the viewer’s attention to important plot points or transitions.
Foley Sound Effects
Foley sound effects refer to custom-created audio that is recorded in synchronization with the
characters’ movements and interactions with objects. This area is broken down as follows:
Definition & Role: Foley is the art of reproducing everyday sounds in a controlled studio
environment to match the on-screen actions, thereby enhancing the realism and immersive
experience of the visual storytelling.
Process & Techniques: Foley artists use various props and surfaces to create sounds like
footsteps, clothing rustling, or the handling of objects. The goal is to mimic what would
naturally occur in the depicted environment, ensuring that the audio seamlessly aligns with
the movement and events.
Impact on Storytelling: By integrating realistic sound effects, Foley work adds depth to the
narrative, making each scene more engaging and believable for the audience.
1.5 Designing & Mixing Sound Boards
Soundboard/Audio Mixer Overview
A soundboard or audio mixer is a device designed to control multiple audio inputs at once. It is a
central tool in audio production that allows an operator to manage and shape the sound output in
real time.
Purpose and Functionality:
o Enables simultaneous management of several audio sources.
o Acts as the control center to adjust, modify, and enhance sound for various
applications.
Designing a Soundboard Setup
Creating an effective soundboard setup involves several key technical considerations. These elements
ensure that each audio channel is optimally routed and processed to achieve the desired sound
output.
Channel Routing:
o Directing audio signals to appropriate channels.
o Ensuring that each source is properly assigned and can be individually controlled.
Equalization (EQ):
o Adjusting frequency balance for each audio channel.
o Enhancing or reducing specific frequency ranges to improve clarity and blend.
Gain Staging:
o Managing input levels so the signal is neither too weak (buried in the noise floor) nor
so hot that it clips and distorts.
o Balancing the levels across channels to maintain consistency (a level-checking sketch
follows this list).
Assigning Effects:
o Integrating audio effects to enrich the sound.
o Typically involves pre-setting channels with desired effects for real-time
manipulation.
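A simple digital counterpart to gain staging is checking each channel's peak level and suggesting a trim toward a working target. The sketch below assumes float samples in the range -1.0 to 1.0; the -18 dBFS target is a common rule-of-thumb choice, not a fixed standard.

```python
import numpy as np

def check_gain_staging(channels, target_peak_dbfs=-18.0):
    """Report each channel's peak in dBFS and the trim needed to hit a target.
    `channels` maps a name to a NumPy array of float samples in [-1.0, 1.0]."""
    for name, samples in channels.items():
        peak_db = 20 * np.log10(max(np.max(np.abs(samples)), 1e-12))
        trim_db = target_peak_dbfs - peak_db
        status = "CLIPPING RISK" if peak_db > -1.0 else "ok"
        print(f"{name:>8}: peak {peak_db:6.1f} dBFS, trim {trim_db:+5.1f} dB [{status}]")

check_gain_staging({
    "vocal": 0.99 * np.random.randn(1000).clip(-1, 1),  # too hot
    "guitar": 0.01 * np.random.randn(1000).clip(-1, 1),  # too weak
})
```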
Mixing Process
Mixing is the art of blending and balancing audio elements to produce a polished final product. This
process takes place after the initial setup, ensuring that all elements harmonize effectively.
Balancing Levels:
o Adjusting the volume of each channel to achieve a coherent mix.
o Maintaining dynamic range and clarity across the entire audio track.
Panning Audio Sources:
o Distributing sounds across the stereo field.
o Creating spatial awareness and depth in the final mix by positioning audio sources
appropriately (a constant-power panning sketch appears at the end of this section).
Applying Additional Effects:
o Utilizing effects such as reverb or delay.
o Enhancing the overall atmosphere and texture, making the audio suitable for its
intended medium.
Target Applications:
o Tailoring the mix for various media including film, television, games, or live
performances.
o Ensuring that the final sound is cohesive and meets the professional standards
required for each format.
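Panning itself reduces to a pair of channel gains. The sketch below uses the common constant-power (equal-power) pan law, which keeps the combined power steady as a mono source moves across the stereo field; it is one of several pan laws in practical use.

```python
import numpy as np

def constant_power_pan(mono, pan):
    """Pan a mono signal into stereo. `pan` runs from -1.0 (hard left) to +1.0
    (hard right); sine/cosine gains keep total power constant across the field."""
    angle = (pan + 1.0) * np.pi / 4.0  # map [-1, 1] onto [0, pi/2]
    left = np.cos(angle) * mono
    right = np.sin(angle) * mono
    return np.stack([left, right], axis=-1)

# Place a 440 Hz source slightly right of center.
mono = np.sin(2 * np.pi * 440 * np.linspace(0, 1, 44100, endpoint=False))
stereo = constant_power_pan(mono, pan=0.3)
```

At center (pan = 0) each channel carries about 70.7% of the amplitude, so the left and right powers sum to the original; hard left or right sends everything to one channel.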