Glossary

µ-Law

µ-Law (mu-Law) is a companded compression algorithm for voice signals defined by the Geneva Recommendations (G.711). The G.711 recommendation defines µ-Law as a method of encoding 16-bit PCM signals into a non-linear 8-bit format. The algorithm is commonly used in North American and Japanese telecommunications. µ-Law is very similar to A-Law; however, each uses a slightly different coder and decoder.

Adaptive Delta Pulse Code Modulation (ADPCM)

A method of compressing audio data. Although the theory for compression using ADPCM is standard, there are many different algorithms employed. For example, Microsoft's ADPCM algorithm is not compatible with the Interactive Multimedia Association's (IMA) approved ADPCM.

A-Law

A companded compression algorithm for voice signals defined by the Geneva Recommendations (G.711). The G.711 recommendation defines A-Law as a method of encoding 16-bit PCM signals into a non-linear 8-bit format. The algorithm is commonly used in European telecommunications. A-Law is very similar to µ-Law; however, each uses a slightly different coder and decoder.

Aliasing

A type of distortion that occurs when digitally recording high frequencies with a low sample rate. For example, in a motion picture, when a car's wheels appear to slowly spin backward while the car is quickly moving forward, you are seeing the effects of aliasing. Similarly, when you try to record a frequency greater than one half of the sampling rate (the Nyquist Frequency), instead of hearing a high pitch, you may hear a low-frequency rumble.

To prevent aliasing, an anti-aliasing filter is used to remove high-frequencies before recording. Once the sound has been recorded, aliasing distortion is impossible to remove without also removing other frequencies from the sound. This same anti-aliasing filter must be applied when resampling to a lower sample rate.
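For illustration only, here is a minimal Python sketch (not part of the software) that estimates the frequency an unfiltered recording would capture by folding the input frequency around the Nyquist Frequency; the function name is hypothetical.

```python
def aliased_frequency(freq_hz, sample_rate_hz):
    """Frequency actually captured when freq_hz is sampled at sample_rate_hz
    with no anti-aliasing filter (illustrative sketch only)."""
    nyquist = sample_rate_hz / 2.0
    folded = freq_hz % sample_rate_hz          # wrap into one sampling period
    return sample_rate_hz - folded if folded > nyquist else folded

# A 30,000 Hz tone recorded at 44,100 Hz aliases down to 14,100 Hz.
print(aliased_frequency(30_000, 44_100))       # 14100
```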

Alpha Channel

Alpha is a fourth channel that determines how transparency is handled in an image file. The RGB channels are blended to determine each pixel's color, and the corresponding alpha channel determines each pixel's transparency. The alpha channel can have up to 256 shades of gray: 0 represents a transparent pixel, 255 represents an opaque pixel, and intermediate values are semitransparent.

Amplitude Modulation (AM)

A process whereby the amplitude (loudness) of a sound is varied over time. When varied slowly, a tremolo effect occurs. If the frequency of modulation is high, many side frequencies are created which can strongly alter the timbre of a sound.
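As a rough illustration of slow amplitude modulation (a tremolo), here is a small Python sketch; the function and its parameters are hypothetical and not part of the software.

```python
import math

def tremolo(samples, sample_rate, mod_freq_hz=5.0, depth=0.5):
    """Vary the loudness of a list of samples with a slow sine curve."""
    out = []
    for n, s in enumerate(samples):
        # Gain swings between (1 - depth) and 1.0 at mod_freq_hz.
        lfo = 0.5 + 0.5 * math.sin(2 * math.pi * mod_freq_hz * n / sample_rate)
        out.append(s * (1.0 - depth * lfo))
    return out
```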

Analog

When discussing audio, this term refers to a method of reproducing a sound wave with voltage fluctuations that are analogous to the pressure fluctuations of the sound wave. This is different from digital recording in that these fluctuations are infinitely varying rather than discrete changes at sample time. See "Quantization."

Aspect Ratio

Describes the frame size of your video as a ratio of its width to its height. For example, video shot in NTSC DV format has a frame size of 720 by 480 pixels, which displays at roughly a 1.33:1 aspect ratio once the pixel aspect ratio is taken into account:

720 (frame width) ÷ 480 (frame height) = 1.5

1.5 x 0.9091 (pixel aspect ratio) = 1.36365

Frame Size                          Aspect Ratio

4:3 Standard Television             1.33:1
16x9 Widescreen Television          1.78:1
Academy Flat Theatrical Frame       1.85:1
Academy Scope Theatrical Frame      2.35:1
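The calculation above can be written as a one-line helper; this is an illustrative Python sketch, not part of the software.

```python
def frame_aspect_ratio(width_px, height_px, pixel_aspect=1.0):
    """Frame aspect ratio = (frame width x pixel aspect ratio) / frame height."""
    return (width_px * pixel_aspect) / height_px

# NTSC DV: 720 x 480 pixels with a 0.9091 pixel aspect ratio.
print(round(frame_aspect_ratio(720, 480, 0.9091), 3))   # 1.364 (the 1.36365 figure above, rounded)
```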

Attack

The attack of a sound is the initial portion of the sound. Percussive sounds (drums, piano, guitar plucks) are said to have a fast attack. This means that the sound reaches its maximum amplitude in a very short time. Sounds that slowly swell up in volume (soft strings and wind sounds) are said to have a slow attack.

Audio Compression Manager (ACM)

The Audio Compression Manager, from Microsoft, is a standard interface for audio compression and signal processing for the Windows operating system. The ACM can be used by software to compress and decompress .wav files.

Bandwidth

When discussing audio equalization, each frequency band has a width associated with it that determines the range of frequencies that are affected by the EQ. An EQ band with a wide bandwidth will affect a wider range of frequencies than one with a narrow bandwidth.

When discussing network connections, refers to the rate of signals transmitted; the amount of data that can be transmitted in a fixed amount of time (stated in bits/second): a 56 Kbps network connection is capable of receiving 56,000 bits of data per second.

Beats Per Measure

The time signature of a piece of music contains two pieces of information: the number of beats in each measure of music, and which note value gets one beat. The software uses this notion to determine the number of ticks to put on the time ruler above the track view and to determine the spacing when the ruler is displaying Measures & Beats.

Beats Per Minute (BPM)

The tempo of a piece of music can be written as a number of beats in one minute. If the tempo is 60 BPM, a single beat will occur once every second.

Bit

A bit is the most elementary unit in digital systems. Its value can only be 1 or 0, corresponding to a voltage in an electronic circuit. Bits are used to represent values in the binary numbering system. As an example, the 8-bit binary number 10011010 represents the unsigned value of 154 in the decimal system. In digital sampling, a binary number is used to store individual sound levels, called samples.
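For example, the conversion mentioned above can be checked in a couple of lines of Python (illustrative only):

```python
value = 0b10011010           # the 8-bit pattern 10011010
print(value)                 # 154 (unsigned decimal value)
print(format(value, '08b'))  # '10011010'
```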

Bit Depth

The number of bits used to represent a single sample. The software uses 8-, 16-, or 24-bit samples. Higher values increase the quality of playback and of any recordings that you make. While 8-bit samples take up less memory (and hard disk space), they are inherently noisier than 16- or 24-bit samples.

Brightness

Adjusting brightness adds or subtracts a value from the color channels in an image to make the image lighter or darker. The maximum brightness setting adds 255 (pure white), and the minimum setting subtracts 255 (pure black).

Bus

A virtual pathway where signals from tracks and effects are mixed. A bus's output is a physical audio device in the computer where the signal is routed. The configuration of busses is saved with the project whereas the routing of busses to hardware is saved with the system. In this way, projects can be easily moved from one system to another without modifying the original layout of the project.

Byte

Refers to a set of 8 bits. An 8-bit sample requires one byte of memory to store, while a 16-bit sample takes two bytes of memory to store.
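As a quick worked example (the stereo, 44,100 Hz figures are illustrative, not a statement about any particular project), one second of 16-bit stereo audio occupies:

```python
sample_rate = 44_100      # samples per second
bytes_per_sample = 2      # 16-bit samples take two bytes each
channels = 2              # stereo

print(sample_rate * bytes_per_sample * channels)   # 176400 bytes per second
```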

CCD

Charge coupled device. The image sensor in a digital camera.

Chroma

The values that convey chrominance information.

Chrominance

The color content of an image without respect to its brightness.

Clipboard

The clipboard is where data that you have cut or copied is stored. You can then paste the data into a different location on the timeline or into other applications, such as Microsoft Word or another instance of the software.

Clipping

Clipping is what occurs when the amplitude of a sound is above the maximum allowed recording level. In digital systems, clipping is seen as a clamping of the data to a maximum value, such as 32,767 in 16-bit data. Clipping causes sound to distort.

CODEC

An acronym for Coder/Decoder. Commonly used when working with data compression.

Complementary Color

Complementary colors are colors that are 180 degrees apart on the color wheel: red and cyan are complementary colors, as are magenta and green, and blue and yellow.

Contrast

Adjusting contrast multiplies color values in an image to stretch the existing color channel values across a broader or narrower portion of the spectrum. The contrast center determines the anchor point for stretching.

For example, when contrast is decreased, the histogram of the original image is squeezed into a narrower portion of the spectrum, anchored according to the contrast center:

  • A contrast center of 0.5 anchors the squeezed histogram at the center of the graph.

  • A contrast center of 0.0 anchors it at the left edge of the graph.

  • A contrast center of 1.0 anchors it at the right edge of the graph.
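The stretch-around-a-center behavior described above can be sketched as follows; this is a simplified Python illustration, not the plug-in's actual math.

```python
def adjust_contrast(value, amount, center=0.5):
    """Stretch (amount > 1.0) or squeeze (amount < 1.0) a color value in the
    range 0.0-1.0 around the chosen contrast center, then clamp it."""
    result = (value - center) * amount + center
    return min(1.0, max(0.0, result))

# Decreasing contrast pulls values toward the anchor point.
print(adjust_contrast(0.9, 0.5, center=0.5))   # 0.7
print(adjust_contrast(0.9, 0.5, center=0.0))   # 0.45
```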

Controller Track

A controller track is an audio track that controls the behavior of one or more other tracks. In VEGAS Pro, it can be used in the Auto Ducking feature to control the volume of listener tracks. However, a controller track can also be used more broadly in audio production to trigger effects like sidechain compression or to adjust parameters such as equalization or filter settings on other tracks.

For example, when using Auto Ducking, the controller track (such as a voiceover) lowers the volume of the background music on listener tracks when speech is detected, ensuring clear vocal audio.

Crossfade

Mixing two pieces of media by fading one out as the other fades in.
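A minimal sketch of a linear crossfade between two equal-length pieces of audio (illustrative Python; real crossfades can use other curve shapes):

```python
def crossfade(tail, head):
    """Fade out `tail` while fading in `head`; both are equal-length sample lists."""
    n = len(tail)
    mixed = []
    for i in range(n):
        fade_in = i / (n - 1) if n > 1 else 1.0
        mixed.append(tail[i] * (1.0 - fade_in) + head[i] * fade_in)
    return mixed

print(crossfade([1.0, 1.0, 1.0], [0.0, 0.5, 1.0]))   # [1.0, 0.75, 1.0]
```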

Cutoff Frequency

The cutoff frequency of a filter is the frequency at which the filter changes its response. For example, in a low-pass filter, frequencies greater than the cutoff frequency are attenuated, while frequencies below the cutoff frequency are not affected.

DC Offset

DC offset occurs when hardware, such as a sound card, adds DC current to a recorded audio signal. This current causes the audio signal to alternate around a point above or below the normal -infinity dB (center) line in the sound file. To check for DC offset visually, zoom all the way into a sound file and see whether the waveform appears to float above or below the center line.
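A minimal illustration of the idea in Python (dedicated DC-offset tools typically use filtering rather than a single average, so treat this as a sketch):

```python
def remove_dc_offset(samples):
    """Re-center a signal by subtracting its average (DC) component."""
    offset = sum(samples) / len(samples)
    return [s - offset for s in samples]

# A signal floating 0.1 above the center line:
print(remove_dc_offset([0.1, 0.6, 0.1, -0.4]))   # approximately [0.0, 0.5, 0.0, -0.5]
```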

Decibel (dB)

A unit used to represent a ratio between two numbers on a logarithmic scale. For example, when comparing the numbers 14 and 7, you could say that 14 is twice as large as 7, or you could say that 14 is 6 dB greater than 7. Engineers use the equation dB = 20 x log10(V1/V2) when comparing two instantaneous values. Decibels are commonly used when dealing with sound because the ear perceives loudness on a logarithmic scale.

In the software, most measurements are given in decibels. For example, if you want to double the amplitude of a sound, you apply a 6 dB gain.
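The formula can be checked with a few lines of Python (illustrative only):

```python
import math

def ratio_to_db(v1, v2):
    """dB = 20 x log10(V1 / V2)."""
    return 20 * math.log10(v1 / v2)

print(round(ratio_to_db(14, 7), 2))   # 6.02, so doubling a level adds about 6 dB
print(round(ratio_to_db(10, 1), 2))   # 20.0
```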

Device Driver

A program that enables the Windows operating system to connect different hardware and software. For example, a sound card device driver is used by the software to control sound card recording and playback.

Digital Signal Processing (DSP)

A general term describing anything that alters digital data. Digital Signal Processors alter the data after it has been digitized by using a combination of programming and mathematical techniques. DSP techniques are used to perform many effects such as equalization and reverb simulation.

Dithering

The practice of adding noise to a signal to mask quantization noise.
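A minimal sketch of the idea, assuming triangular (TPDF) noise of about one quantization step; this is illustrative Python and not the software's dithering algorithm.

```python
import random

def dither_to_8_bit(sample_16_bit):
    """Reduce a 16-bit sample (-32768..32767) to 8 bits, adding a little
    triangular noise before rounding to mask quantization error."""
    step = 256                                         # one 8-bit step in 16-bit units
    noise = (random.random() - random.random()) * step
    quantized = round((sample_16_bit + noise) / step)
    return max(-128, min(127, quantized))
```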

Drag and Drop

A quick way to perform certain operations using the mouse. To drag and drop, click and hold a highlighted selection, drag it (hold the left mouse button down and move the mouse), and drop it (let go of the mouse button) at another position on the screen.

Dynamic Range

The difference between the maximum and minimum signal levels. It can refer to a musical performance (high volume vs. low volume signals) or to electrical equipment (peak level before distortion vs. noise floor). For example, orchestral music has a wide dynamic range while thrash metal has a very small (always loud) range.

Endian (Little and Big)

Little and Big Endian describe the ordering of multi-byte data that is used by a computer's microprocessor. Little Endian specifies that data is stored in a low to high-byte format; this ordering is used by the Intel microprocessors. Big Endian specifies that data is stored in a high to low-byte format; this ordering is used by the Motorola microprocessors.
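For illustration, Python's standard struct module shows the two byte orders for the 16-bit value 0x1234:

```python
import struct
import sys

print(sys.byteorder)                    # 'little' on Intel-style processors
print(struct.pack('<H', 0x1234).hex())  # '3412': low byte first (little endian)
print(struct.pack('>H', 0x1234).hex())  # '1234': high byte first (big endian)
```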

Envelopes (Audio and Video)

Envelopes allow you to automate the change of a certain parameter over time. In the case of volume envelopes, you can create a fade-out (which requires a change over time) by adding an envelope, creating an extra point at the place where the fade should start, and then pulling the envelope's end point down to negative infinity (silence).

Equalization (EQ)

The process by which certain frequency bands are raised or lowered in level. EQ has various uses; the most common is simply to adjust the subjective timbral qualities of a sound.

Event

An event is an occurrence of a media file on the timeline. Each event can contain more than one media file. Each media file in an event is called a take. An event also provides access to features that allow cropping, fade in and out, opacity, gain, and other tools to control and edit events. Events are divided into two broad categories: Video events and Audio events. Some of the tools are specific to the type of event: audio or video.

File Format

A file format specifies the way in which data is stored on your floppy disks or hard drive. In the Windows operating system, the most common sound file format is the Microsoft .WAV format. However, the software can read and write many other file formats so you can maintain compatibility with other software and hardware configurations.

Frame Rate (Audio)

Audio uses frame rates only for the purposes of synching to video or other audio. When synching to other audio, a rate of 30 frames per second non-drop is typically used; when synching to video, 30 drop is usually used.

Frame Rate (Video)

The speed at which individual images in the video are displayed on the screen during playback. A faster frame rate results in smoother motion in the video. The television frame rate in the US (NTSC) is 29.97 frames per second (fps). In many parts of Europe, the television standard is PAL at 25 fps.

Frequency Spectrum

The Frequency Spectrum of a signal refers to its range of frequencies. In audio, the frequency range is basically 20 Hz to 20,000 Hz. The frequency spectrum sometimes refers to the distribution of these frequencies. For example, bass-heavy sounds have a large frequency content in the low end (20 Hz - 200 Hz) of the spectrum.

Gamma

Determines the brightness of the video and is used to compensate for differences between the source and output video; it sometimes needs to be calibrated to match the source or destination. Higher gamma values result in lighter or brighter video as displayed on your computer's monitor.

Gamut

Gamut refers to the complete range of something. In video editing, you want to ensure that your colors are within the acceptable range for your broadcast standard. When colors are outside the NTSC or PAL gamut, you can introduce image problems or noise into the video stream. You can use the video scopes to analyze your video before rendering and correct out-of-gamut colors with video plug-ins.

Hertz (Hz)

The unit of measurement for frequency or cycles per second (CPS).

In-Place Plug-In

An in-place plug-in processes audio data so that the output length always matches the input length. A non-in-place plug-in's output length need not match a given input length at any time: for example, Time Stretch, Gapper/Snipper, Pitch-Shift (without preserving duration), and some Vibrato settings can create an output that is longer or shorter than the input.

Insertion Point

The insertion point (also referred to as the cursor position) is analogous to the cursor in a word processor. It is where pasted data will be placed or other data may be inserted depending on the operation.

Inverse Telecine (IVTC)

Telecine is the process of converting 24 fps (cinema) source to 30 fps video (television) by adding pulldown fields. Inverse telecine, then, is the process of converting 30 fps (television) video to 24 fps (cinema) by removing pulldown.

Invert Data

Inverting sound data reverses the polarity of a waveform around its baseline. Inverting a waveform does not change the sound of a file; however, when you mix different sound files, phase cancellation can occur, producing a "hollow" sound. Inverting one of the files can prevent phase cancellation.
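Inverting is simply a sign flip on every sample; the Python sketch below also shows why mixing a waveform with its inverted copy cancels to silence (illustrative only).

```python
def invert(samples):
    """Reverse the polarity of a waveform by negating each sample."""
    return [-s for s in samples]

wave = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0]
print([a + b for a, b in zip(wave, invert(wave))])   # all zeros: complete cancellation
```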

ISRC Code

Industry Standard Recording Codes (ISRC) were designed to identify CD tracks. The ISRC code is a 12-character alphanumeric sequence in the following format:

Field   Sample   Description

A       SE       Country: Represents the recording's country of origin.

B       T38      First Owner: An ID assigned to the producer of the project. Each country has a board that assigns these codes.

C       86       Year of Recording: Represents the year the recording was made.

D       302      Recording: Represents the serial number of the recording among those made by the same producer in that year. This value uses three digits (300-999) when the CD has 10 or more tracks, and four digits (0001-2999) when the CD has 9 or fewer tracks.

E       12       Recording Item (1 or 2 digits): Identifies tracks on a CD (each track can have a different ISRC code).
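For illustration, a small Python helper (hypothetical, not part of the software) that splits a 12-character code using the field widths shown in the sample above (2 + 3 + 2 + 3 + 2 characters):

```python
def split_isrc(isrc):
    """Split a 12-character ISRC into the fields described above."""
    assert len(isrc) == 12
    return {
        'country': isrc[0:2],
        'first_owner': isrc[2:5],
        'year': isrc[5:7],
        'recording': isrc[7:10],
        'recording_item': isrc[10:12],
    }

print(split_isrc('SET388630212'))
# {'country': 'SE', 'first_owner': 'T38', 'year': '86', 'recording': '302', 'recording_item': '12'}
```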

Listener Track

A listener track is an audio track that responds to the input or behavior of a controller track. In VEGAS Pro, listener tracks automatically lower their volume in response to the controller track in Auto Ducking. In a broader sense, listener tracks can be affected by other control mechanisms such as sidechain compression, where their audio properties change in reaction to the controller track’s input.

Luma

The values that convey luminance information.

Luminance

The brightness of an image without respect to its color content.

Markers

Saved locations in the sound file. Markers are stored in the Regions List and can be used for quick navigation.

Media Control Interface (MCI)

A standard way for applications to communicate with multimedia devices like sound cards and CD players. If a device has an MCI device driver, it can easily be controlled by most multimedia software.

Media Player

A Microsoft Windows program that can play digital sounds or videos using MCI devices. Media Player is useful for testing your sound card setup.

MIDI Clock

A MIDI-device-specific timing reference. It is not absolute time like MTC; instead, it is a tempo-dependent number of "ticks" per quarter note. MIDI Clock is convenient for synching devices that need to follow tempo changes mid-song. The software supports MIDI Clock out but does not support MIDI Clock in.

MIDI Port

A MIDI port is the physical MIDI connection on a piece of MIDI gear. This port can be a MIDI in, out, or thru. Your computer must have a MIDI port to output MIDI Time Code to an external device or to receive MIDI Time Code from an external device.

MIDI Time Code (MTC)

MTC is an addendum to the MIDI 1.0 Specification and provides a way to specify absolute time for synchronizing MIDI capable applications. Basically, it is a MIDI representation of SMPTE timecode.

Mix

A function the software performs inherently when events are added to multiple audio tracks: the tracks' signals are combined into a single output.

Multiple Stereo

A mixer configuration that allows you to assign individual tracks to any number of stereo output busses. In single stereo mode, all tracks go out the same stereo bus. Multiple stereo configuration allows you to keep your signals from the tracks discrete if you want them to be.

Musical Instrument Digital Interface (MIDI)

A standard language of control messages that provides for communication between any MIDI-compliant devices. Anything from synthesizers to lights to factory equipment can be controlled via MIDI. The software utilizes MIDI for synchronization purposes.

Noise-shaping

Noise-shaping is a technique that can minimize the audibility of quantization noise by shifting its frequency spectrum. For example, in 44,100 Hz audio, quantization noise is shifted toward the Nyquist Frequency of 22,050 Hz.

Nondestructive Editing

This type of editing involves a pointer-based system of keeping track of edits. When you delete a section of audio in a nondestructive system, the audio on disk is not actually deleted. Instead, a set of pointers is established to tell the program to skip the deleted section during playback.

Normalize

Refers to raising the volume so that the highest-level sample in the file reaches a user-defined level. Use this function to make sure you are fully utilizing the dynamic range available to you.
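A minimal Python sketch of peak normalization (illustrative; actual normalize tools may offer additional options):

```python
def normalize(samples, target_peak=1.0):
    """Scale a signal so its loudest sample reaches the requested peak level."""
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return list(samples)              # silence: nothing to scale
    gain = target_peak / peak
    return [s * gain for s in samples]

print(normalize([0.1, -0.25, 0.2]))       # [0.4, -1.0, 0.8]
```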

Nyquist Frequency

The Nyquist Frequency (or Nyquist Rate) is one half of the sample rate and represents the highest frequency that can be recorded using the sample rate without aliasing. For example, the Nyquist Frequency of 44,100 Hz is 22,050 Hz. Any frequencies higher than 22,050 Hz will produce aliasing distortion in the sample if no anti-aliasing filter is used while recording.

Pan

To place a mono or stereo sound source perceptually between two or more speakers.

Peak Data File

The file created by the software when a file is opened for the first time. This file stores the information regarding the graphic display of the waveform so that opening a file is almost instantaneous in direct edit mode. The peak data file is stored in the same directory as the sound file and has a .sfk extension.

Pixel Aspect

The pixel aspect ratio determines whether pixels are square (a value of 1.0), as on computer displays, or rectangular (values other than 1.0), as is typical of television formats. The pixel aspect ratio, together with the frame size, determines the frame aspect ratio.

Plug-In

An effect that can be added to the product to enhance its feature set. The software supports all DirectX plug-ins. The built-in EQ, Compression, and Dithering effects are also considered plug-ins because they work in other DirectX-compatible applications.

Plug-In Chain

Plug-ins can be strung together into a chain so that the output of one effect feeds the input of another. This allows for complex effects that could not otherwise be created.

Pre-roll/Post-roll

Pre-roll is the amount of time elapsed before an event occurs. Post-roll is the amount of time after the event. The time selection defines the pre- and post-roll when recording into a selected event.

Preset

A snapshot of the current settings in a plug-in. Presets are created and named so that you can easily get back to a sound that you have previously created.

Pulse Code Modulation (PCM)

PCM is the most common representation of uncompressed audio signals. This method of coding yields the highest fidelity possible when using digital storage.

Pulldown

In telecine conversion, fields are added to convert 24 fps film to 30 fps video. In 2-3 pulldown, for example, the first frame is scanned into two fields, the second frame is scanned into three fields, and so on for the duration of the film. 2-3 pulldown is the standard for NTSC broadcasts of 24p material. Use 2-3 pulldown when printing to tape, but not when you intend to use the rendered video in a project.

FIGURE 24 fps film (top) and resulting NTSC video with 2-3 pulldown fields (bottom)

Use 2-3-3-2 pulldown when you plan to use your rendered video as source media. When removing 2-3-3-2 pulldown, Frame three is discarded, and the pulldown fields in the remaining frames are merged:

FIGURE 24 fps film (top) and resulting NTSC video with 2-3-3-2 pulldown fields (bottom)

Punch-In

Punching-in during recording means automatically starting and stopping recording at user-specified times.

Quadraphonic

A mixing implementation that allows for four discrete audio channels. These are usually routed to two front speakers and two back speakers to create immersive audio mixes.

Quantization

The process by which measurements are rounded to discrete values. Specifically with respect to audio, quantization is a function of the analog-to-digital conversion process. The continuous variation of the voltage of an analog audio signal is quantized to discrete amplitude values represented by digital, binary numbers. The number of bits available to describe these values determines the resolution or accuracy of quantization.
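A simplified Python illustration of rounding a sample to a given bit depth; the rounding error is what is heard as quantization noise (see the next entry).

```python
def quantize(value, bits):
    """Round a sample in the range -1.0..1.0 to the nearest level available
    at the given bit depth."""
    levels = 2 ** (bits - 1)              # e.g. 128 steps per polarity at 8 bits
    return round(value * levels) / levels

print(quantize(0.3701, 16))   # 0.370086669921875  (fine-grained)
print(quantize(0.3701, 8))    # 0.3671875          (coarser, more error)
```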

Quantization Noise

A result of describing an analog signal in discrete digital terms. This noise is most easily heard in low-resolution digital sounds that have low bit depths and is similar to a "shhhhh" type sound while the audio is playing. It becomes more apparent when the signal is at low levels, such as when doing a fade out.

Region

A subsection of a sound file. You can define any number of regions in a project or media file.

Ripple

"Ripple" refers to the automatic adjustment or linking of adjacent elements to accommodate changes. When you edit events on a track with the Ripple function active, adjacent keyframes and envelopes on the relevant tracks are automatically adjusted (depending on the settings) so that the edit stays consistent.

Resample

The act of recalculating samples in a sound file at a different rate than the file was originally recorded. If a sample is resampled at a lower rate, sample points are removed from the sound file decreasing its size, but also decreasing its available frequency range. When resampling to a higher sample rate, extra sample points in the sound file are interpolated. This increases the size of the sound file but does not increase the quality. When down-sampling one must be aware of aliasing.

Ruler Tags

Small tab-shaped controls above the time ruler that represent the location of markers, regions, and loop points in the waveform display.

Ruler, Time

The time ruler is the area on a data window above the tracks display window that shows the horizontal axis units.

Sample

The word sample is used in many different (and often confusing) ways when talking about digital sound. Here are some of the different meanings:

  • A discrete point in time that a sound signal is divided into when digitizing. For example, an audio CD-ROM contains 44,100 samples per second. Each sample is really only a number that contains the amplitude value of a waveform measured over time.

  • A sound that has been recorded in a digital format; used by musicians who make short recordings of musical instruments to be used for composition and performance of music or sound effects. These recordings are called samples. In this help system, we try to use sound file instead of sample whenever referring to a digital recording.

  • The act of recording sound digitally, i.e., to sample an instrument means to digitize and store it.

Sample Rate

The sample rate (also referred to as the sampling rate) is the number of samples per second used to store a sound, measured in hertz (Hz). Higher sample rates, such as 44,100 Hz, can capture higher frequencies (see "Nyquist Frequency") but require more storage space than lower rates.

Sample Value

The sample value (also referred to as sample amplitude) is the number stored by a single sample. In 16-bit audio, these values range from -32768 to 32767. In 8-bit audio, they range from -128 to 127. The maximum allowed sample value is often referred to as 100% or 0 dB.
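For illustration, a sample value can be expressed in decibels relative to full scale; the helper below is hypothetical and assumes 16-bit audio by default.

```python
import math

def sample_value_to_db(value, bit_depth=16):
    """Express a sample value in decibels relative to the maximum (0 dB)."""
    full_scale = 2 ** (bit_depth - 1)     # 32768 for 16-bit audio
    return 20 * math.log10(abs(value) / full_scale)

print(round(sample_value_to_db(32767), 2))   # -0.0  (essentially 0 dB, or 100%)
print(round(sample_value_to_db(16384), 2))   # -6.02 (half of full scale)
```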

Shortcut Menu

A context-sensitive menu that appears when you right-click on certain areas of the screen. The functions available in the shortcut menu depend on the object being clicked on as well as the state of the program. As with any menu, you can select an item from the shortcut menu to perform an operation. Shortcut menus are used frequently for quick access to many commands. An example of a shortcut menu can be found by right-clicking on any waveform display in a data window.

Signal-to-Noise Ratio

The signal-to-noise ratio (SNR) is a measurement of the difference between a recorded signal and noise levels. A high SNR is always the goal. The maximum signal-to-noise ratio of digital audio is determined by the number of bits per sample: in 16-bit audio the signal-to-noise ratio is 96 dB, while in 8-bit audio it is 48 dB. However, in practice this SNR is never achieved, especially when using low-end electronics.
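The 96 dB and 48 dB figures follow from roughly 6 dB per bit, as this short check shows (illustrative Python):

```python
import math

def theoretical_snr_db(bit_depth):
    """Best-case signal-to-noise ratio for a given bit depth."""
    return 20 * math.log10(2 ** bit_depth)

print(round(theoretical_snr_db(16), 1))   # 96.3 dB
print(round(theoretical_snr_db(8), 1))    # 48.2 dB
```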

Small Computer Systems Interface (SCSI)

A standard interface protocol for connecting devices to your computer. The SCSI bus can accept up to seven devices at a time including CD ROM drives, hard drives and samplers.

Society of Motion Picture and Television Engineers (SMPTE)

SMPTE timecode is used to synchronize time between devices. The timecode is calculated in Hours:Minutes:Seconds:Frames, where frames are fractions of a second based on the frame rate. Frame rates for SMPTE timecode are 24, 25, 29.97, and 30 frames per second.

Sound Card

The sound card is the audio interface between your computer and the outside world. It is responsible for converting analog signals to digital and vice versa. The software will work with any Windows-compatible sound card.

Stereo

A mixer implementation that includes two discrete channels (left and right).

Surround

5.1 surround is a mixer implementation that includes six discrete channels: left, right, center, left surround, right surround, and a low-frequency effects (LFE) channel.

Telecine

The process of creating 30 fps video (television) from 24 fps film (cinema).

Tempo

Tempo is the rhythmic rate of a musical composition, usually specified in beats per minute (BPM).

Time Format

The format used to display the time ruler and selection times. These include Time, Seconds, Frames, and all standard SMPTE frame rates. The status format is set for each sound file individually.

Track

A discrete timeline for audio data. Audio events sit on audio tracks and determine when a sound starts and stops. Multiple audio tracks are mixed together to give you a composite sound that you hear through your speakers.

Trim/Crop

A function that deletes all data in a sound file outside of the current selection.

Undo Buffer

The temporary file created before you do any processing to a project. The undo buffer allows you to return to previous versions of the project if you decide you do not like changes you have made. The undo buffer is erased when the file is closed or when the Clear Undo History command is invoked.

Undo/Redo

These commands allow you to change a project back to a previous state when you do not like the changes you have made, or to reapply changes after you have undone them. The ability to undo and redo is limited only by the size of your hard drive.

Undo/Redo History

A list of all of the functions that have been performed on a file and that are available to be undone or redone. The Undo/Redo History gives you the ability to undo or redo multiple functions, as well as preview the functions for quick A/B comparison of the processed and unprocessed material.

Video for Windows (AVI)

A file format for digital video.

Virtual MIDI Router (VMR)

A software-only router for MIDI data between programs. The software uses the VMR to receive MIDI Time Code and send MIDI Clock. No MIDI hardware or cables are required for a VMR, so routing can only be performed between programs running on the same PC.

Zero-crossing

A zero-crossing is the point where a fluctuating signal crosses the zero-amplitude axis. By making edits at zero-crossings with the same slope, the chance of creating glitches is minimized. The fade edit edges setting creates zero-crossings at event edges by fading the waveform to zero amplitude over a short period of time.

Zipper noise

Zipper noise occurs when you apply a changing gain to a signal, such as when fading out. If the gain does not change in small enough increments, zipper noise can become very noticeable. In the software, fades are accomplished using 64-bit arithmetic, so they create no audible zipper noise.