Broadcast Industry Glossary

A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z

Section 0-9

1080/24p: The standardized international High Definition format having a sampling structure of 1920(H) x 1080(V) and operating at 24-frames/second progressively scanned.

1080/60i: The standardized international High Definition format having a sampling structure of 1920(H) x 1080(V) and operating in interlaced scan mode at 60-fields/second.

1280x720: Refers to the High Definition sampling structure of 1280(H) x 720(V). All 1280x720 images are scanned progressively.

1920x1080: Refers to the High definition sampling structure of 1920(H) x 1080(V). 1920x1080 images can be scanned either interlaced or progressive.

23.98 or 23.976: Refers to a video image rate of 23.976 (truncated to 23.98) frames/second. This is deliberately offset from 24 frames so that a simple 3:2 process will produce the standard 59.94-fields/second interlaced video.

24PsF: Term used to describe 24-frame (23.98) progressive video in which each frame is divided into two segments of odd and even lines for transmission, storage and display.

2K: A film image scanned into a computer file at a resolution of 2048 horizontal pixels per line.

30p: 30 full-frames/second digital video progressively captured. In practice it usually runs at 29.97 frames/second and is therefore often referred to as 29.97p.

3:2 Pulldown: The process used to convert 24-frame/second film or 24p video into 59.94i video. The 3:2 process consists of two (2) parts — the "Pulldown" and the "3:2" cadence. The pulldown process is the slowing of the film or video to 23.976-frames/second. The 3:2 cadence is created by filling three (3) of the 59.94 fields from one (1) frame of the 24-frame source, two (2) fields from the next frame, and then repeating this pattern.
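
As an illustration, a minimal Python sketch of the cadence (the function name and frame labels are hypothetical, not part of any standard):

    # Sketch: mapping 24p source frames to 59.94i fields with the 3:2 cadence.
    # Even-indexed frames fill three fields, odd-indexed frames fill two.
    def pulldown_fields(num_film_frames):
        fields = []
        for i in range(num_film_frames):
            fields.extend([i] * (3 if i % 2 == 0 else 2))
        return fields

    # Four film frames (A, B, C, D) become ten fields, i.e. five interlaced frames.
    print(pulldown_fields(4))   # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]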

4:2:2 : A commonly used term for component digital video format. The numeral 4:2:2 denotes the ratio of the sampling frequencies of the single luminance channel to the two color difference channels. For every four (4) luminance samples, there are two (2) samples of each color channel.

4:4:4 : A sampling ratio in which the luminance channel and both chrominance channels are sampled equally, i.e. at the full rate.

4K: A film image scanned into a computer file at a resolution of 4096 horizontal pixels per line. 4K is considered to be the full-resolution scan of 35mm film.

4-Point Edit: Marking all four edit points (In and Out on both the source clip and the program) to place the source clip into the program. The speed of the source clip is adjusted to fit the space allowed for it in the program.

5.1 Audio: An arrangement of five (5) audio channels (left; center; right; left-surround; right-surround) and one (1) sub-woofer channel.

16X9: A wide-screen format in which the aspect-ratio of the screen is 16 units wide by 9 units high as opposed to the 4X3 of normal television.

24P: Video captured at 24 full frames/second using progressive scanning, i.e. the lines of each frame are scanned in order (1, 2, 3, 4, etc...) rather than split into the odd- and even-line fields of interlaced video, where two fields make up each frame and there are 60 fields (30 frames) per second. Progressive 24-frame capture gives more of a film-like look.

59.94 Fields/Second: The field rate of NTSC color television.

60 Fields/Second: The field rate of SMPTE HDEP standard.

720/60p: Refers to the High Definition format of a sampling structure of 1280(H) x 720(V) and operating at 60-frames/second progressively scanned.

8:8:8: Defines standard definition signals where all signals are sampled at 27MHz.

Section A

A/B Roll: Creates fades, wipes and other transitions from one video source to another.

A to D Converter: An electronic device used at the input of digital audio equipment to convert analog signals to digital values.

Aaton Code: In-camera keycode/timecode reader.

Accommodation: The ability of our eyes to refocus at a new point of interest. In normal vision, the processes of focusing on objects at different distances (accommodation) and convergence/divergence (the angle between the lines of sight of our eyes) are linked by muscle reflex. A change in one creates a complementary change in the other. However, watching a stereoscopic film or TV programme requires the viewer to break the link between these different processes by accommodating at a fixed distance (the screen) while dynamically varying eye convergence and divergence (something we don't do in life) to view objects at different stereoscopic distances.

Action Safe Area: The area of a television picture that is visible on consumer television sets.

ADR: Automatic Dialogue Replacement. Recording/re-recording dialogue where the production sound is unstable or obscured.

AES/EBU: Informal name for a digital audio standard established by the Audio Engineering Society & European Broadcast Union. It is the transmission of two (2) channels of digital audio data on a single twisted-pair cable using a 3-pin (XLR) connector.

AGC: An acronym for "Automatic Gain Control". It is the circuitry used to ensure that output signals are maintained at constant levels despite widely varying input signals.

AIF/AIFF: An acronym for "Audio Interchange (File) Format". Platform-independent file format for digital-audio signals that can be used for audio-clips. It is capable of storing multiple mono or stereo channels. See also Audio clip.

AIV: An acronym for "audio in video". Digital audio can be transmitted either via a separate connection (AES/EBU) or embedded in the video signal, meaning that audio is sent over the same data connection as the video signal. See also AES/EBU and Embedded.

Aliasing: Defects or distortions in a television picture due to sampling limitations. Defects are commonly seen as jagged edges or diagonal lines and a pulsing/brightening in picture detail.

Alpha Channel: An additional channel that stores a relative transparency value alongside the color information. The alpha values facilitate the layering of media objects on top of each other. In the common four-digit digital-sampling notation, such as 4:2:2:4, the alpha channel is represented by the last digit.
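
As an illustration, a minimal Python sketch of how an alpha value is used to layer a foreground pixel over a background pixel (a simplified "over" blend; the function name is hypothetical):

    # Sketch: blend foreground over background per channel; alpha 1.0 = fully opaque.
    def composite_over(fg, bg, alpha):
        return tuple(round(alpha * f + (1.0 - alpha) * b) for f, b in zip(fg, bg))

    print(composite_over((255, 0, 0), (0, 0, 255), 0.5))   # (128, 0, 128)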

Anaglyph: A type of stereoscopy in which the two pictures are individually coloured and then superimposed as a single image rather than two separate images. Each eye sees only the required image through the use of coloured filters (e.g. red and green or red and cyan). Anaglyph glasses have been popular over the years for viewing 3D comics and some 3D films (particularly on VHS and DVD). Although Anaglyph itself has fallen out of favour for quality Stereo work, there is modern work going on with other somewhat anaglyph-like, colour-based systems (e.g. Trioviz or ColorCode-3D).

Analog: A continuous electrical signal that carries information in the form of various physical values such as amplitude or frequency modulation. A pixel is represented by a voltage or current rather than a set of digital numbers.

Anamorphic: A film image horizontally compressed by a special lens to fit the width of a standard Academy ratio film frame then expanded during projection to its normal width and appearance on the screen. The vertical axis is not disturbed during this process.

Answer Print: The first film print combining picture and sound in release form offered by the film processing laboratory to the producer for acceptance.

API: An acronym for "Application Program Interface". Using this source code interface a programmer can make requests of the operating system or another application.

Array: A group of multiple data-storage devices across which information is stored.

Artifacts: Refers to video blemishes, noise or any physical interruption of the video image.

Aspect Ratio: The ratio of the width of a picture to the height.

Assemble Edit: Building a videotape in which a series of clips are placed in order, one after another.

ATSC: An acronym for "Advanced Television Systems Committee". It is the organization that is defining the standard for high-definition television in the United States.

Audio Clip: In a non-linear editing environment a clip indicates data of either video or audio that has been clipped out (copied) from a larger environment such as a reel or a video tape.

Auto Assembly: Automatic combining of edits on videotape conforming to a prepared edit decision list (EDL) with little or no human involvement.

Autoconforming: In general, autoconforming is the process where an offline-edited edit decision list (EDL) or cut list is used to reproduce the high-quality content of video and audio with the original source material.

Autodetection: Autodetection is a function of the DVS CLIPSTER/Pronto video systems that allows the operator to automatically detect and set the video format and raster of the incoming video signal, for example, prior to capturing.

Autoscaling: Scaling generally indicates a change of the resolution of images, i.e. the images are made larger or smaller. If the resolution of the original material differs from the configured raster, it will be scaled either up or down to its maximum allowable width and/or height according to the selected video format. The autoscaling setting makes sure that no image information gets lost, i.e. the images will not be cropped nor will they be too small for the selected video format.
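
As an illustration, a minimal Python sketch of such a "fit inside" scaling calculation (the function and example raster are assumptions, not the actual DVS implementation):

    # Sketch: largest size that fits a target raster without cropping,
    # preserving the source aspect ratio (letterbox/pillarbox result).
    def fit_inside(src_w, src_h, dst_w, dst_h):
        scale = min(dst_w / src_w, dst_h / src_h)
        return round(src_w * scale), round(src_h * scale)

    print(fit_inside(2048, 1556, 1920, 1080))   # 2K full aperture into HD -> (1421, 1080)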

Section B

B Negative: Film term referring to takes not originally intended to be printed from dailies but later called to be printed. This term has been carried over into videotape and refers to non-circle takes that are later transferred as alternative takes.

Back Timing: Marking an "In" and "Out" point on a source clip and an "Out" point in a program, then allowing the edit software to calculate the "In" point on the program. This is also known as a "3 Point Edit".

Backup: Copying files or databases so that they will be preserved.

Bandwidth: Data throughput, i.e. the amount of data that can be sent. The term describes the amount of information that can be transmitted over a wire, line or other link between communication devices, and therefore defines the range of transmission frequencies a network can use. The greater the bandwidth, the larger the amount of information that can be transferred over that network.
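
As a rough worked example (the numbers are illustrative and exclude blanking and protocol overhead), the bandwidth needed for uncompressed 10-bit 4:2:2 HD video can be estimated in Python:

    # Sketch: 4:2:2 averages two 10-bit samples per pixel = 20 bits/pixel.
    width, height, fps, bits_per_pixel = 1920, 1080, 30, 20
    bits_per_second = width * height * fps * bits_per_pixel
    print(bits_per_second / 1e6)   # ~1244 Mbit/s of active picture data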

Batch Capture: Combining your video capture card with deck control so that you define the "in" and "out" points first, then capture only the footage you want.

Bin: On non-linear editing systems the bin is an organization tool for one or more film scenes.

Bit: Short for "Binary Digit". The smallest piece of binary digital data, represented by either a "1" or a "0". Numbers of bits are used in digital video as a representation of signal quality (i.e. an 8-bit signal can have 256 levels from black to white while a 10-bit signal can have 1024 levels).

Bit Depth: The bit depth is an indication of the color depth that a pixel in a digital image may have. For example, when the image is available in 8 bit, each pixel in the image will provide one of 256 colors (2^8); when the image is in 10 bit, up to 1024 colors (2^10) are available.

Black Burst: A composite video signal consisting of all horizontal & vertical synchronization information. It is typically used as the house reference synchronization signal in television facilities.

BMP: An abbreviation for "Bitmap" i.e. the Windows Bitmap Format. An image file format that can be used for video clips.

BNC: An abbreviation for "Bayonet Neill-Concelman" (often informally expanded as "British Naval Connector"). A bayonet-style coaxial cable connector widely used in television and video equipment.

Breaking the Frame: Stereo objects in front of the screen plane (negative parallax) are problematic if they intersect the edge of frame, as contradictory depth cues are sent to the viewer. Essentially one cue is saying that the object is in front of the screen and another is saying that the object is behind it.

This problem can be reduced in Post by a technique known as a 'floating window'. This involves applying a partially transparent mask on the left of the left image and on the right of the right image, reducing the strength of the cues on whichever side the object is breaking frame (or on both sides simultaneously if there are objects breaking frame both left and right).

Another kind of issue is caused by objects moving backwards and forwards over the edge of frame. As an object moves off the edge of a screen one stereo camera signal is lost before the other. The result is that the stereo signal temporarily 'switches off'. This can sometimes be solved by sizing up both images in Post, causing the object to move off screen altogether.

Objects breaking the frame aren't necessarily a problem. It happens in IMAX all the time and also is common in conventional stereo films - the audience is encouraged to concentrate away from such an object by well thought out shooting.

Breakout Box: A box to be connected to a computer system to provide further connections. In a digital video environment a breakout box may provide further connections for the video system, for example, to in- or output audio or video.

Breakout Cable: At hardware (e.g. a video system) input / output connections that are usually distributed over several standardized connectors can be combined and offered via a single connector for the sake of space.

Broadcast Quality: Footage that meets the high technical standards for broadcast or cablecast. Quality that does not meet this standard is referred to as "reference quality".

Browsing Tools: A browsing tool is, for instance, a standard file manager such as 'My Computer' or the Windows Explorer on the Windows operating system.

Burn-in: Burn-in means to superimpose certain information on another image. With such a feature you can provide each image with individual information such as timecode, frame or keycode data or comments.

Bus: A Bus is a group of data, control and/or addressing lines that extend from device to device and act as a conduit for signals. A Bus is often shared by several devices.

Byte: A byte consists of 8 bits. (Individual video samples may be 8 or 10 bits, but a byte is always 8 bits.)

Section C

Cache: An especially fast memory holding duplicated copies of data values so that they can be accessed more quickly than from their original location.

Camera Report: The form used to identify what is on each exposed camera roll and any special printing or transfer instructions.

Capture: Process of feeding media material from outside sources into a computer. When capturing media from an outside source it requires special hardware, the video capture card. Special software is needed too, when capturing video of what is displayed on the computer screen.

Capture Rate: A term used to describe the number of times/second that a picture is taken or captured in an imaging system. In a progressive system, the capture rate is equal to the frame-rate. In an interlaced system, the capture rate is double the frame rate. This is due to the fact that at each capture interval only one (1) field (a half resolution image) is acquired. It takes two (2) fields to make a complete frame.

Cardboarding: Lack of true 3D feel to a shot making it look like it is made from cardboard cut-outs. This is also referred to as Cut-out Planar Effect. Caused by inadequate depth resolution due to an incorrect matching between the focal length of the recording lens (or CGI camera) and the interocular distance between the cameras. See: Interocular

CAV: An acronym for "Component Analog Video". Component video signals in which an analog voltage or current represents the value of a pixel.

CE: CE is an acronym for Conformite Europeenne. The certificate or the CE mark is placed on products to signify that they conform to European Union regulations.

Centaurus: Centaurus is the industry standard for high-end, uncompressed video I/O hardware.

CG: An acronym for "computer graphics". Usually it stands for images either partially or completely created at a computer workstation. However, in the field of digital video CG-matrices are used to color convert images from the YUV color space (the color space of television signals) to the RGB color space (the color space used on computers) and vice versa.

Check Print: First film print used to check color corrections.

Chromakeying / Chromakeyer: Overlaying one video signal over another is defined as chromakeying. The areas to be replaced are defined by a specific range of color, or chrominance, in the foreground signal. The chrominance must have sufficient bandwidth or resolution. Chromakeying is also called blue screen or green screen keying, depending on the color being replaced.
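
As an illustration only (real keyers are far more sophisticated), a naive per-pixel key in Python; the key color and tolerance are arbitrary assumptions:

    # Sketch: pixels close to the key color are replaced by the background.
    def chroma_key(fg_pixel, bg_pixel, key=(0, 255, 0), tolerance=90):
        distance = sum(abs(c - k) for c, k in zip(fg_pixel, key))
        return bg_pixel if distance < tolerance else fg_pixel

    print(chroma_key((10, 240, 20), (50, 50, 200)))   # near-green pixel -> background pixel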

Chrominance: The portion of the video signal that contains color information.

Cineon: This is a file format that was specifically designed to represent scanned film images.

Client: A computer system that wants to access a service - sometimes a remote one - on another computer is called a client. Typically this happens within a network.

Client-Server Architecture: Network structure which separates server applications from client applications. A central server manages all data for different clients and provides them with the required data. The system's scalability depends on the server performance and the expandability of its hardware resources.

CMYK: An acronym for "Cyan; Magenta; Yellow; Black". It is the designation for the subtractive color system used in pigment printers.

Coding: Ink stamping or burning numbers into the edges of work print and work track to mark sync points. It is done with a "coding" machine.

Coercivity: Measures the force required to erase a video tape that has been recorded to the maximum possible level. Coercivity is measured in "Oersteds".

Color Bars: A video test signal widely used for system and monitor set-up. The test signal typically contains eight (8) basic colors (white; yellow; cyan; green; magenta; red; blue; black) and is used to check chrominance functions of color television systems.

Color-Correction: Alteration of the tonal values of colored objects or images.

Color Space: This term describes the color range between specified references. References commonly used in television include RGB; Y, R-Y, B-Y; YIQ; YUV; Hue, Saturation and Luminance (HSL); and XYZ.

Color Temperature: A concept formulated for the purpose of reference and standardization of color light sources. The "Color Temperature" is expressed in kelvins, on a scale that begins at absolute zero (the Kelvin scale).

Component: Component signal keeps luminance and chrominance separate. It provides better picture quality than composite video.

Component Analog: The unencoded output of a camera, videotape, etc...consisting of three (3) primary color signals, i.e. red, green & blue (RGB), that together convey all necessary picture information.

Component Digital: A digital representation of a component analog signal set, most often Y, B-Y, R-Y.

Composite: Composite combines luminance and chrominance. It is usually less expensive than component.

Composite Analog: An encoded video signal, such as NTSC or PAL video, that includes horizontal & vertical synchronization information.

Composite Digital: A digitally encoded video signal, such as NTSC or PAL video, that includes horizontal & vertical synchronization information.

Composite Print: A film print incorporating picture and sound elements on the same strip of film.

Compositing: Layering multiple pictures on top of each other. Used primarily for special effects.

Compress: The process of converting video & audio data into a more compact form for storage or transmission.

Compression Ratio: A value that indicates by what factor an image file has been reduced after compression. The higher the ratio, the greater the compression.

Conforming: Preparing a complete version of your project for viewing or playing out. The conformed version might either be an intermediate working version or the final cut.

Content Management System (CMS): A CMS helps its users administer large amounts of content, for example a large body of documents, multimedia or image resources.

Convergence: In human eyesight, the ability of our eyes to turn their optical axes horizontally inward. The convergence 'near point' is the closest point at which it is still possible to perceive a single image. In practice, the eyes can easily converge inward but have much less ability to diverge outward, as this is something we do not do in real life and only occurs when looking at 3D images that have positive parallax beyond the individual human interocular.

In cameras — 'toeing in' the cameras (to simulate the eyes converging) focuses them on a depth point in the scene, either in front of, behind or at the point of interest. The 'convergence point' is where the axes of toed-in cameras align on the Z-axis. Convergence can be adjusted in Post by horizontal movement. Note that sometimes the term 'vergence' is used to describe both convergence and divergence. Convergence pullers are camera-crew members on a Stereoscopic shoot who are responsible for setting up and shifting the convergence during a shot. See: Parallax

Cropping: A rectangular cutting off of image edges. By cropping you remove a part of your image, for example, to receive a letterbox effect (black borders at the top and bottom). As opposed to zoom and pan where you can create a similar effect, the remaining image is normally not scaled back to the size of the video format but remains in its original size.

CRT: An acronym for "Cathode Ray Tube". Used in monitors and television sets to show images on a screen. The current of the video signal is used to control electron rays generated by a cathode and directed onto a phosphorescing plane in a vacuum tube. Wherever the electrons hit the phosphorescing plane they illuminate a dot on the plane in the brightness of the electron ray's strength. The image dots (i.e. electron rays) are guided by electromagnetic fields from the left to the right and line by line. Thus an image is created on a screen.

Cursor: The vertical bar that represents an exact point in an active (text) object.

Cut: Instant change between two sources of video, also called 'hard cut'. However, you can also create cuts or 'cutting points' by simply dividing a video clip at a certain position.

Cut List: A cut list is usually provided as a file and used to determine a sequence of video and audio clips. It describes a timeline with video and audio clips via timecode data of succeeding in- and outpoints. Cut lists may also contain information about transitions between clips (hard cut or wipe) and exist in various different, not standardized formats. See also EDL and Timeline.

Section D

D-1: A non-compressed digital video recording format that uses data conforming to the ITU-R BT.601-2 standard. D-1 records on high-end 19mm (3/4") tape.

D-2: A non-compressed digital recording format that uses data conforming to SMPTE 244M and four (4) 20-bit audio channels. D-2 records on high-end 19mm (3/4") tape.

D-3: A non-compressed digital recording format that uses data conforming to SMPTE 244M and four (4) 20-bit audio channels. D-3 records on high-end 1/2" tape.

D-5: A non-compressed, 10-bit, 270 Mbit/second, component or composite digital video recording format. D-5 records on high-end 1/2" magnetic tape.

D-7: DVC-Pro.

DA: An acronym for "Distribution Amplifier". A device used to multiply a video signal so that the signal stays constant throughout a number of devices.

DA-88: A Tascam-brand eight-track digital audio tape machine using the 8 mm video format.

Dailies: Picture and sound work-prints of a day's shooting without regard to color-balance. Referred to as "rushes" in England. They are produced so that the best takes can be selected.

Direct Attached Storage (DAS): A storage unit directly attached to the device recording the data.

DAT: An acronym for "Digital Audio Tape". A consumer digital audio recording & playback system with a signal quality surpassing that of the Compact Disc (CD).

DCDM: DCDM is an acronym for Digital Cinema Distribution Master. The DCDM is made using the original finished picture-data from the DI process - normally 10-bit DPX files. To create the DCDM the data is encoded into 12 or 16-bit Tiff files for picture and 24-bit WAV files for audio. The DCDM provides the uncompressed master elements that enable the creation of the Digital Cinema Package (DCP).

DCP: An abbreviation for "Digital Cinema Package". A DCP is a collection of digital files used to store, organize and convey Digital Cinema image, audio and data streams. These include MXF, XML and JPEG2000 files. SMPTE standards are used to conform the various vendors, producers and distributors.

D-Cinema: Digital Cinema. It encompasses digital distribution and projection of digital cinematic material.

DDR: An acronym for "Digital Disk Recorder". Systems that record video or audio programs on one or more hard drives. They are mostly used in broadcast or radio broadcasting when editing or recording is required. The benefit of these systems: they offer immediate access to the material that was recorded before, without requiring pre-roll/post-roll or expensive maintenance of tape heads.

Decibel (dB): A unit of measure used to represent audio transmission levels, gains or losses. One decibel is approximately the smallest perceptible change in audio level.
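
As a worked example, the standard formula expresses an amplitude (voltage) ratio as 20 x log10 of the ratio (10 x log10 for power ratios); a small Python sketch:

    import math

    # Sketch: doubling the voltage corresponds to roughly +6 dB.
    def db_from_voltage_ratio(v_out, v_in):
        return 20.0 * math.log10(v_out / v_in)

    print(db_from_voltage_ratio(2.0, 1.0))   # ~6.02 dB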

Decoder: A device used to recover component signals from a composite (encoded) source.

Decompression: Expanding a compressed file back into its original form; the result is said to be uncompressed (or decompressed).

Defragmentation: Storing and deleting data on a storage medium such as a hard disk will in time cause the data on the storage to become fragmented. At that point the information is no longer stored as a large block in one place but scattered all over the storage. Though one will hardly experience this as a problem in normal computer work, it is of special importance when dealing with digital video: the data on the storage should be optimally aligned to be suited for real-time operations. To achieve this you have to defragment the storage at regular intervals, which will physically realign the data properly on the storage. For this DVS developed a special defragmenter that even observes video clips consisting of individual image files: the files belonging to clips will be aligned in blocks, thus truly facilitating real-time processes.

Depth of Field: A term used to describe the areas of a picture, both in front of and behind the main focus point, which remain in focus.

Depth Grading: A post production process where negative and positive parallax convergence and divergence are adjusted. This is not only a creative tool used to place objects on the Z axis but also a way to ensure that stereoscopic content can be comfortably watched on the screen size it is intended for. For example, in a Post suite the Director may be viewing a film on a small projection screen but the final delivery format may be a large theatre or Imax. In practice the eyes have little ability to diverge (up to one degree is considered the rule of thumb) and this is especially a consideration in depth grading for very large screens with positive parallax images, where the distance between the left and right representations of an image may be very widely spaced. Sometimes the term Depth Budget is used to refer to the combined value of positive and negative parallax and expressed as a % of screen width. See: Parallax

DFTC: Drop-Frame Timecode. SMPTE timecode created to match run-time or clock-time exactly. Two frame numbers are dropped every minute except every tenth minute. Broadcasters require masters to be delivered with DFTC.
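
As an illustration, a minimal Python sketch of converting a frame count at 29.97 fps into drop-frame timecode (a commonly published approach; the function name is hypothetical):

    # Sketch: frame numbers 00 and 01 are skipped at the start of every minute
    # except minutes divisible by 10, i.e. 2 x 54 = 108 numbers per hour.
    def drop_frame_timecode(frame_count):
        fps = 30                                   # nominal frame numbers per second
        per_minute = fps * 60 - 2                  # 1798 numbered frames in a "drop" minute
        per_ten_minutes = per_minute * 10 + 2      # 17982 frames in a 10-minute block
        blocks, rest = divmod(frame_count, per_ten_minutes)
        extra = 0 if rest < 2 else 2 * ((rest - 2) // per_minute)
        frame_count += 18 * blocks + extra         # re-insert the skipped numbers
        ff = frame_count % fps
        ss = (frame_count // fps) % 60
        mm = (frame_count // (fps * 60)) % 60
        hh = frame_count // (fps * 3600)
        return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

    print(drop_frame_timecode(17982))   # 00:10:00;00 - timecode matches clock time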

DI: An abbreviation for "Digital Intermediate". DI is the process of digitally processing a film (e.g. color correction, inserting transitions, format conversion, conforming, etc...) before it is distributed to movie theaters.

Digitizing: The act of taking analog audio and/or video and converting it to digital form.

Digital Intermediate (DI): A Digital Intermediate is the result of the process of shooting in High Definition or shooting on film followed by scanning to film quality data files, editing the project in High Definition and applying the creative process of color correction and color treatment to the completed master. This Digital Intermediate (DI) then becomes the master for video, DVD or for theatrical output by transferring this data master back to film.

Digital Television (DTV): The transmitting of a broadcast signal that consists of digital data.

Director's Cut: A rough-cut created by the director once the editor's cut is complete.

DirectShow: The Microsoft DirectShow application programming interface (API) is an architecture for streaming media on Windows. With DirectShow a software developer can implement all kinds of video and audio play-out and capture solutions in a software application. Various applications such as Windows Media Player use DirectShow already to display the video or audio content of files.

Disk Mirror: A Disk Mirror is a complete copy of data that resides on one physical disk to another physical disk. It doubles the data storage requirement when implemented. Also known as "RAID Level 1".

Disk Striping: Disk Striping is a technique used for spreading data over multiple disk drives. Disk striping speeds-up operations that retrieve data from disk storage.

Dissolve: A certain style of transition where one clip blends into the next. Most common are dissolve rates from a half-second to two seconds.

Down-Conversion: The process of converting high-resolution video to lower-resolution video.

DPX: An acronym for "Digital Picture Exchange". This file format can be found in digital film work and is considered an ANSI/SMPTE 268M standard. DPX files can store image data and additional metadata in their file headers.

Drift: When an element does not keep a steady pace during playback. Usually caused when there is no timecode to lock to or when the record machine power supply is faulty. It can also refer to a color-correction setting on a telecine which has changed over time due to light-tube burn.

Driver: A program interacting with a special kind of software or particular device. The driver has special knowledge of the device or particular software interface that programs using the driver do not have.

Drop Frames: Frames/image files that cannot be read from or written to a storage device during a real-time operation have to be dropped i.e. they will be omitted during a play-out or record. A drop can be caused by all kinds of reasons, for example, a fragmented video storage, where data is physically scattered over the hard disk so that it takes too long to read it in time.

Dual Link: SDTV and HDTV in YUV 4:2:2 can be transmitted via a single BNC connector (single link). However, other video formats (e.g. RGB transmissions) exceed the data rate provided by a single BNC connector. For such a video signal two BNCs for a parallel (HD-)SDI connection are required, which is called a dual-link connection.

Duplicate Negative: A back-up or safety copy of a cut negative used for creating prints thus preserving the original negative.

DV: A video tape format designed primarily for the consumer market that records a 4:1:1 standard definition signal with a 5:1 compression ratio for a total bit-rate of 25Mb/second. DV cassettes come in two (2) sizes — Standard & Mini.

DVI: An acronym for "Digital Visual Interface". It is a video standard interface that will maximize the visual quality of digital display devices (e.g. computer displays, LCD panels, digital projectors and more). It is especially suited for uncompressed digital video data.

DVCPro HD: A High-Definition format developed by Panasonic. It uses ¼-inch wide tape stock and records 4:2:2 8-bit HD video.

DVD: An acronym for "Digital Versatile Disk". It is the same size as a compact disk (CD) but with a storage capacity up to 17 Gbytes.

Section E

EE: E-to-E (electronics to electronics), the live signal. An incoming video signal (input signal) is immediately routed to the output.

EDL: An acronym for "Edit Decision List". A list of a video production's edit points. It is a record of all edit decisions made for a program in the form of a printed copy, paper tape or floppy disk file, which is used to assemble the project at a later date.

Edge Crop: A technique whereby just the center portions of a wide aspect ratio format are viewable.

Edge Numbers: Sequential numbers printed along the edge of a film strip by the manufacturer which allow frames to be easily identified either by human or machine.

Effect: Adding an image or sound that was not in the original piece of film or video material, usually to make it more interesting. The process is mostly carried out electronically.

Embedded: Embedded usually stands for embedded audio in video (AIV). Via SDI or HD-SDI up to 16 channels of audio (AES/EBU) can be transmitted. While this is the easiest way to transmit audio together with video, working on the audio alone is normally not possible with this connection. See also AES/EBU and AIV.

Ethernet: A network technology for data transmission. A star-topology with twisted pair wiring is the most popular form. Common data rates are 10 Mbit/sec (Ethernet, 10 Base-T), 100 Mbit/sec (Fast Ethernet,100 Base-T), 1000 Mbit/sec (Gigabit Ethernet, 1000 Base-T) and 10,000 Mbit/sec (10 Gigabit Ethernet).

Section F

Fade: A transition to or from a blank screen; see also cross-fade. A fade-in is a transition from a blank screen to an image. A fade-out, also called fade to black, is a transition from an image to a blank (usually black) screen.

Fader: A console control which allows an operator to perform manual dissolves, fades & wipes.

Fail-Over: Automatically switch over to a backup or redundant system with equal characteristics. Optimally no data loss will occur thanks to fail-over.

FAT: FAT is an acronym for "File Allocation Table". It is a table that the system builds on a hard disk to keep track of which sectors are bad, which are in use, by what file and in what sequence. Damage to the FAT is catastrophic.

Fault Tolerance: Fault Tolerance is a system's ability to remain operational in the event of a component, device or environmental failure.

FC-AL: An acronym for "Fibre Channel - Arbitrated Loop". An architecture used to maintain high data-rate transfer rates over long distances. It allows storage arrays to be separated by as much as 120 kilometers, connected by one (1) non-amplified Fibre Channel optical link.

FC-Drives: These drives use the copper version of the Fibre Channel interface with a SCSI protocol. The maximum data rate is 4 Gbps.

FCC: An acronym for "Frame Count Cueing". This describes the process of tracking scene changes within an element, e.g. a clip.

Fibre Channel: A communications protocol designed to meet the requirements related to the demand for high performance data transfer. It supports data transmission and framing protocols for SCSI, HIPPI, Ethernet, Internet Protocol (IP) & ATM.

Field: A Field is a half of a video frame, either odd or even scan lines.

Film Scanner: Refers to a High Resolution Film to Data device that does not operate at "real-time".

Filter: A Filter is a computer software module used to process digital video for adding special effects to a program.

FIFO API: FIFO describes the process of first in, first out. API is the abbreviation for application program interface. The term FIFO explains the principle of a queue: data that comes in first, will be handled first. Then, the next data package will be handled. Thus, the data is organized and manipulated relative to time and prioritization.
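
As an illustration, a minimal Python sketch of the FIFO principle (the frame names are placeholders):

    # Sketch: items are handled strictly in the order they arrive.
    from collections import deque

    fifo = deque()
    for frame in ("frame_001", "frame_002", "frame_003"):
        fifo.append(frame)          # enqueue at the tail

    while fifo:
        print(fifo.popleft())       # dequeue from the head: handled in arrival order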

File System: A method for storing and organizing computer files and their accompanying metadata. A file system may reside on a storage device (e.g. a hard disk), in which case maintaining the physical location of the files is of importance: the file system translates the file name used by the user into the physical address on the storage device. Another option is that the file system grants access to data on a file server, then acting as a client for a network protocol. File systems may also be virtual and then only exist as an access method for virtual data.

Finalizing: The process used to finish a video sequence. On a video system it is a process that generates a new clip from the project's timeline while the original material is not touched or altered. It saves the contents of the timeline in a freely selectable file and video/audio format to a new location, thereby applying all effects and cutting away material that is not needed.

Finishing: The complete process after fine-tuning the cutting and applying primary color corrections, such as applying secondary color corrections and titling.

Firewire: A special high-speed bus standard capable of over 100 Mbits/second sustained data rate. Also known as IEEE 1394.

Flash Frames: White frames between frames with images on them. In video, these are mistimings in the EDL or editing that leave empty frames between cuts.

Flip-Flop: An effect on a video system where the video images are mirrored either horizontally (flip) or vertically (flop). See also Effect.

Flying Head: A video head that engages when a video deck is on "pause", providing a clear still-frame image.

FMV: An abbreviation for "Full Motion Video". Video that plays at 30 fps (NTSC) or 25 fps (PAL).

Foley: Background sounds added during audio sweetening to heighten realism.

Forced Display: A DVD feature that forces the display of a sub-picture regardless of the wishes of the user.

Format: (1) The size, resolution, aspect ratio, color space, bit depth, format rate, etc. for a given image. (2) The file format for a given image. (3) The physical medium (such as film, video, etc.) used to capture or display an image sequence.

FPS: An abbreviation for "Frames Per Second".

Fragmentation: The scattering of data over a disk caused by successive recording and deletion operations. Data fragmentation occurs when a piece of data in memory is divided into several parts being physically far apart. Generally, this is the result of attempting to insert a large block of data into several small free spaces on the storage.

Frame: A frame consists of all the information required for a complete picture. Each video frame has 2 interlaced fields. In the NTSC system, a frame has 525 interlaced horizontal lines of picture information, at 29.97 frames per second. In the PAL system, a frame has 625 interlaced horizontal lines of picture information, at 25 frames per second.

Frame Rate: Used to describe the number of times per second that a complete picture is updated in an imaging system. (see Capture Rate)

FTP: An acronym for "File Transfer Protocol". It allows users to transfer files over a TCP/IP network.

Full-Field: A complete video image consisting of 2 fields per video frame.

Section G

Gain: The increase or decrease in the strength of an electronic signal.

Gamut: The boundary of a color space. Colors outside the "gamut" of a specific color space are considered "illegal" for that color space even though they may be well within the "gamut" of a different color space.

Genlock: The process of locking both the sync & burst of one signal to the burst & sync of another signal thus making the two signals synchronous.

Ghosting: Artefacts typically caused by signal leakage (crosstalk) between the two 'eyes'. A secondary 'ghost' image can be seen. There are several possible causes that can introduce the problem during acquisition, post production and display. One reason can be high contrast levels between an object and its background.

Gigabyte: A digital storage capacity equivalent to one-billion bytes.

Gigantism: Confusing Visual cues in a stereoscopic scene that can make an object appear to be the 'wrong' size i.e. the impression of strangely enlarged size of objects. This is due to the choice of interocular distance relative to the focal length of the camera lenses, e.g. shooting with an interocular distance much less than adult human eyesight can make a figure appear to be a giant. See: Miniaturization, Interocular

GPI: An acronym for "General-Purpose Interface". This interface is mostly used in broadcast and post production equipment. Some of these external devices do not have the ability to be directly controlled by the editor. In this case the GPI signal is used to synchronously "start" this equipment at the same time.

Gray Scale: A chart with varying shades of gray which is photographed during production and used by the film processing lab to color correct film.

GUI: An acronym for "Graphical User Interface". An interactive graphic displayed on a screen, being a means of operating software.

Section H

HD: An acronym for "High Definition". It is frequently used to abbreviate HDEP & HDTV.

HD-SDI: An acronym for "High Definition Serial Digital Interface". Describes the serial transmission of digital video in HDTV resolutions (1920 x 1080 or 1280 x 720) over a single coaxial cable.

HDCam: A High Definition videotape format developed by Sony Electronics. It utilizes 1/2-inch wide tape stock and a compression ratio of 2.7:1 at 440Mb/second.

HD D-5: A recording system that uses compression at about 4:1 to record HD material on standard D-5 cassettes.

HDEP: An acronym for "High Definition Electronic Production". This standard denotes 1125 scanning lines per frame; 60 fields per second; 2:1 interlace; an aspect ratio of 16X9; extended colorimetry; a 30MHz base bandwidth for each of its three color components. Also known as SMPTE 240.

HDTV: Collective term for television and video formats of a resolution higher than standard TV. There are various proposals and standards. The most common formats that are standardized by SMPTE and others have 1280 x 720 pixels (SMPTE 296M) and 1920 x 1080 pixels (SMPTE 274M). In some countries they are already used for broadcasting television programs. Besides television applications the HDTV equipment is also used in production and post production of feature films. Both formats can be used with frame rates from 23.976 up to 60 frames per second. While 1920 x 1080 typically is used with interlaced scanning, in this case with a maximum frame rate of 30 fps, 1280 x 720 is always progressive but with frame rates up to 60 fps, i.e. the frame rate of the 1280 x 720 format is normally twice the frame rate of the 1920 x 1080 format. With that said, the data rates of both formats are about the same. That is why most HDTV devices support both formats. If 1920 x 1080 is used with 50 or 60 fps in progressive mode the data rate is about twice as high.

Head and Full: Terms for color value ranges indicating a restricted value range that provides headroom (head) and the full value range (full). A scene in post production is most likely worked upon in RGB in the full value range (8 bit: 0 = black, 255 = white). However, during broadcast the images have to be color converted to receive a "legal" broadcast signal. Such a signal must provide headroom in the color values to account for tolerances and a possible signal overshooting that may occur during the sampling of analog video signals. The color values have to be converted from RGB in the full value range to YUV in the restricted value range (mostly 16 = black, 235 = white).
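
As an illustration, a minimal Python sketch of mapping an 8-bit full-range value into the restricted ("head") range (the exact conversion used by a given system may differ):

    # Sketch: full range 0-255 mapped linearly to legal range 16-235.
    def full_to_legal(value):
        return round(16 + value * (235 - 16) / 255)

    print(full_to_legal(0), full_to_legal(128), full_to_legal(255))   # 16 126 235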

Head and Tail: Video or audio material at the beginning (head) or end (tail) of a clip that is available on the storage of a non-linear editing system but not used nor visible in the timeline due to an adjustment (trimming) of the clip's in or out point. Clips that are recorded with heads and/or tails offer reserves in their content for further corrections during editing.

High-Definition Image: 1920 X 1080

Hold: An interpolation setting that maintains settings from one key frame until the next key frame and uses only one frame to jump to the next setting.

Host: A Host is a parent or base system that is accessing a RAID array for the purpose of data storage; more generally, any system connected to a network.

Hot Spare: In order to provide reliability in different system configurations, a hot spare can be installed that works as a fail-over mechanism. The hot spare is connected but not actively working. If one part fails, the hot spare part will take over its job.

House Sync: The blackburst signal used to synchronize all the devices in a studio or a station.

HSDL: An acronym for "High-Speed Data Link". It is used to transmit and receive uncompressed 2K or 4K images. It is an expansion of the dual-link HD-SDI interface offering an easy way at a production site to share such data. With HSDL the frame rate has to be reduced to 15 to 20 fps for 2K or even 5 fps for 4K images. See also Frame rate, HD-SDI, SDI, and Dual link.

Hyperstereo: Using widely spaced cameras (e.g. beyond 70mm interocular) which record more stereo effect than the eyes can see. Such a large interocular distance can produce the effect of miniaturization. Also used in order to achieve the effect of more stereo depth and less scale in a scene. For close up work (e.g. miniatures etc.) special Interocular camera set ups of 5mm or less have been used (known as Hypostereo). For stereo effects on very long shots (e.g. landscapes) Interocular camera set ups of several meters have been used (Hyperstereo). One extreme example of Hyperstereo is from cameras mounted in space to record the Sun in 3D.

Hypostereo: Using closely spaced cameras (e.g. less than 50 mm interocular) which record less stereo effect than the eyes can see. Such a small interocular distance can produce the effect of gigantism. If standard cameras are used, the minimum interocular distance is typically limited by the thickness of the cameras so a mirror or beam splitter system is often used, enabling interoculars down to millimetres. See: Gigantism

Section I

Insert Edit: Placing a section of a source clip in the timeline; the media currently to the right of the insert point is moved farther to the right to accommodate the new clip.

Insertion Cursor: Double triangles that appear on the FX tracks showing where the Filter will be inserted.

Interface: A boundary between adjacent components, circuits or systems that enables the devices to exchange information.

Interlace: Technique for increasing picture repetition rate without increasing bandwidth by dividing a frame into separate fields.

Interlaced: A display system in which two (2) interleaved fields are used to create one (1) frame. The number of field lines is equal to one-half of the frame lines. Interlacing fields allows the level of light on a screen to be more constant thus reducing flicker.

Interlock: A system that electronically links a projector with a sound recorder.

Internegative: A duplicating film stock that produces a negative image when printed from a positive print (such as an interpositive). It is used as a source for release prints.

Interocular distance: The distance between the centers of the lenses of two recording cameras. A typical distance would be 63.5 mm (approximating average adult eye layout). The term 'Interaxial' is sometimes also used interchangeably with 'Interocular' (when referring to eyesight, 'Interpupillary' is often used).

Interpolation: Progressive calculation of a parameter between key frames.

Interpositive: Color master positive printed from the original negative, used for making duplicate negatives. These are used to manufacture release prints.

IP: An acronym for "Internet Protocol". The network layer protocol for the internet protocol suite.

I/O: This stands for "Input/Output". This term is used in situations where data is transferred to and/or from a system or devices.

Section J

JBOD: An acronym for "Just a Bunch Of Disks". A collection of optical/magnetic disks used for storing data.

Jog/Shuttle: To move through a clip or sequence frame by frame with different speeds forward or backward.

Jogging: Single frame forward or backward movement on videotape.

JPEG: An acronym for "Joint Photographic Expert Group". Compression technique for still images and motion video. It is not as effective as MPEG which is optimized for motion video.

JPEG2000: An acronym for "Joint Photographic Experts Group 2000". It is a standard for compressing single images with high quality. JPEG2000 is the successor of the standard format JPEG and is being established as a standard image format in the film and post production business.

Jump Cut: Transition between two scenes which makes the subject appear to "jump". A cutaway shot remedies this alignment situation.

Section K

Kelvin: A system or scale for measuring temperature. Absolute zero is 0° Kelvin or -273° C. The 'color' of white light is expressed in degrees Kelvin.

Key: A signal that can electronically "cut a hole" in a video picture to allow for the insertion of other elements.

Key Channel: Also called "Alpha Channel". A black-and-white video signal that can be added to the existing channels of a video signal (YUV[A] or RGB[A]). Normally used to determine parts of a video image that can be replaced by other content.

Keycode: A machine-readable code printed along the edge of the camera negative film (outside the perforations). It provides data e.g. about the film type, the name of the manufacturer and the film stock. Additionally, a reference number for the first frame on the film is given in order to match the film with a particular position of an EDL/cut list. In transmissions of digital video keycodes can also be used to replace or supplement timecode information. See also Timecode.

Keycode Number: Kodak's machine-readable key numbers that include a 10-digit key number, identification code, film code and off-set in perforations.

Keystoning: The result arising when the film plane in a camera or projector is not parallel to the view or screen, leading to a keystone (trapezoidal) shape. On a stereoscopic image, where the cameras are 'toed-in' so that the object of interest coincides when viewed, there can be some mismatching of the outlines or borders of the two images. Techniques like corner pinning can be used to help correct this.

kHz: Kilohertz. Equivalent to 1,000 cycles per second.

Kilobyte: A digital storage capacity equivalent to one-thousand bytes.

Section L

Layback: Transferring the finished audio track back to the master video tape.

LCD: An acronym for "Liquid Crystal Display". A text/graphics display technology where minute electrical currents change selected parts of the display screen.

LCRS: Denotes an audio system that has four (4) full-range channels (Left, Center, Right, Surround). This signal is frequently encoded in Dolby Pro-Logic for distribution.

Letterbox: When a wide-screen image is projected onto a television screen, a space is left on the top and bottom of the screen.

Letterbox Format: A technique for displaying a wide aspect ratio format on a narrower aspect ratio screen.

Limiter: A compressor with a ratio ≥ (greater than or equal to) 10:1.

Locked: A video system is considered to be "locked" when the receiver is producing horizontal syncs that are in time with the transmitter.

Locked Cut/Locked Picture: The final version of a show after all the changes have been incorporated.

Log: At the beginning of an editing process, the information about source material is entered into bins, i.e. it is logged. Logging can either be done automatically or manually, before capturing or while capturing material.

Loop: Playing back a section of a timeline or clip again and again.

LTC: An acronym for "Linear Time Code". A timecode type (defined by SMPTE) recorded on the audio track of a videotape. LTC can be easily read when the tape is moving forwards or backwards.

Luminance: The portion of the video signal that contains the brightness information for the picture, without color information. A black & white picture contains luminance information only.

LUT: An acronym for "Look-Up Table". The LUT is used to transform data; it can, for example, map indexed-color pixels into a set of true-color values or perform gamma correction.
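
As an illustration, a minimal Python sketch of building and applying a simple 1D LUT that performs gamma correction on 8-bit values (the gamma value is an arbitrary example):

    # Sketch: one output value precomputed for each of the 256 input values.
    gamma = 2.2
    lut = [round(255 * (i / 255) ** (1 / gamma)) for i in range(256)]

    pixels = [0, 64, 128, 255]
    print([lut[p] for p in pixels])   # [0, 136, 186, 255]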

LVD: LVD is an acronym for Low Voltage Differential. It is a subset of Ultra 2 SCSI technology. It has lower voltage swings and is less susceptible to noise than standard Ultra SCSI technology.

Section M

Mac OS: Operating system of Apple Macintosh computers.

Mapping: A technique for taking 2D images and applying them as a surface onto a 3D object.

Match-Frame Edit: An edit in which a scene already recorded on the master is continued with no apparent interruption.

Matte: The black bars found at the top and bottom of the picture when a wide-screen format is projected on a television set.

MB: Megabyte. A standard unit for measuring the information storage capacity of disks and memory. 1,000 kilobytes equals one (1) Megabyte.

Mbps, Mb/s: An abbreviation for "Megabits per Second". A measurement of data transfer speed equivalent to one-million bits per second.

Media Management: Managing media means moving and storing digital content assets in a safe way, while managing requests for duplicates.

Memory: A computer's internal storage area. It is either data storage that comes in the form of chips or data that exists on tapes or disks. Also, the term memory is used as a shorthand for physical memory.

Metadata: Data that describes other data. Generally structured information that describes a (possibly unstructured) set of data. For example, a title can be a metadata item of a movie which is stored as a clip in a file. The frame rate and resolution of a clip are also metadata items.

MHz: An abbreviation for "Megahertz". A measurement of frequency equivalent to one-million Hertz.

Miniaturization: Confusing visual cues in a stereoscopic scene that can make an object appear to be the 'wrong' size i.e. the impression of being strangely reduced in size. This is due to the choice of an interaxial distance of greater than 63.5 mm relative to the focal length of the camera lenses e.g. shooting with very widely spaced cameras. Subjectively this makes the audience feel like a giant looking at tiny objects, which is why miniaturization is sometimes referred to as Lilliputianism. See: Gigantism, Interocular.

Mixing: Combining all sound tracks onto a single master source.

MPEG: An acronym for "Moving Picture Experts Group". Standards designed for the handling of highly compressed moving images in real-time.

MTBF: An acronym for "Mean Time Between Failure". A statistical value for the reliability of a device. Higher numbers indicate higher reliability.

Multi-Channel: A term describing a signal or system with multiple channels of audio or video.

MXF: An acronym for "Material Exchange Format". MXF describes how program material is exchanged between tapes, archives and file servers. MXF may comprise one whole sequence but can also contain a sequence of program segments or sequence of clips.

Section N

NAB: An acronym for "National Association of Broadcasters". An association which has standardized the equalization used in recording and reproducing. NAB lobbies for the interests of broadcasting as a delivery mechanism.

Nagra: A professional .25-inch audiotape recorder.

NAS: An acronym for "Network Attached Storage". Data storage technology that can be connected directly to a computer network to provide centralized data access and storage to heterogeneous network clients. Storage space is usually made available through regular network connections. Due to the standard interface technology it is relatively inexpensive, but does not deliver sufficient data rates for real-time HD or film transfer.

Native Material: Certain video systems such as digital disk recorders or VTRs can only be configured to accept one video format at a time. With VTRs the dependence on a definite video format is determined by the format of the used tape. With digital disk recorders it may depend on the format of the recorder's storage. This is then called the native material or the material native to the device.

Near-Line: Intermediate type of data storage, i.e. the on-site storage of data on removable media. It is a compromise between online storage (very quick to access) and offline storage (mostly backup or long-term storage). Near-line storage provides reliable, inexpensive and unlimited data backup and archiving, but with less speed and accessibility than with integrated online storage.

NLE: An acronym for "Non-Linear Editing". See Non-Linear Editing.

No Single Point of Failure: Describes a configuration in which any single component may fail without loss of the system's functionality or data.

Noise Reduction: Electronic reduction of observable grain in the picture.

Non-Drop Frame: System of time code that retains all frame numbers in chronological order.

Non-Linear Editing: This term describes a form of the editing process. Here, the recording medium is not a tape; therefore, editing can be performed in a non-linear manner, i.e. the editor is independent of the sequence of the program. NLE has the advantage of editing with quick access to source clips and record space (e.g. on computer disks). Moreover, it removes the need for the winding and pre-rolling of VTR operations and hence speeds up work. Even greater speed and flexibility are possible when real-time random access to any frame (true random access) is applied. The term NLE is mostly used when discussing offline editing systems storing highly compressed images, but increasingly online non-linear systems are available as well. Nowadays quite a range of systems claim online quality with video compression. Still, prospective users have to judge the suitability of the results for their application and bear in mind that for transmission/distribution the signals will be decompressed and re-compressed.

NTFS: An acronym for Microsoft Windows "New Technology File System", the file system Windows uses for storing and retrieving files. It provides data security on fixed and removable disks.

NTSC: NTSC is an acronym for "National Television System Committee". It is the standard for broadcast color television in the United States, Canada, Central America and Japan. The NTSC image format has a 4X3 aspect ratio; 525 horizontal lines; 60 Hz and 4 MHz bandwidth with a total 6 MHz video channel width.

Nyquist Rule: States that in order to be able to reconstruct a sampled signal without aliases, sampling must occur at a rate of more than twice (2x) the highest desired frequency. This rule is usually observed in digital systems. When the rule is violated, spurious lower frequencies called "aliases" appear in the reconstructed signal.
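
A minimal sketch of the rule in Python (used here purely for illustration, since the glossary itself contains no code); the 7 kHz tone and 10 kHz sample rate are illustrative values chosen to violate the rule:

import math

# A 7 kHz tone sampled at 10 kHz violates the Nyquist rule (10 kHz < 2 x 7 kHz),
# so its samples are indistinguishable from those of a 3 kHz tone, the alias at
# (sample rate - tone frequency).
sample_rate = 10_000.0       # samples per second
tone = 7_000.0               # actual signal frequency in Hz
alias = sample_rate - tone   # 3 kHz alias frequency

for n in range(5):                               # first few sample instants
    t = n / sample_rate
    s_tone = math.cos(2 * math.pi * tone * t)    # sample of the 7 kHz tone
    s_alias = math.cos(2 * math.pi * alias * t)  # sample of the 3 kHz alias
    print(f"{s_tone:+.6f}  {s_alias:+.6f}")      # the two columns are identical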

Section O

OEM: An acronym for "Original Equipment Manufacturer". The term has two different meanings: 1) A company supplying equipment to other companies in order to resell or incorporate this equipment into another product using the reseller's brand name.
2) A company acquiring a product or component but then reusing or incorporating it into a new product with its own brand name.

Oersted: A unit of magnetic field strength.

Off-Line Editing: Editing that is done to produce an edit decision list (EDL) which is used to assemble the program at a later date.

On-Line Editing: Editing that is done to produce a finished program master.

OMF: An acronym for "Open Media Framework". A file type that allows for sharing of interchangeable digital media.

OpenFX: Open-source animation and modeling program. Popular in the post production world because of flexible effects like lens flare, fog, explosions, waves and dissolving.

Operating System: Every computer needs a base program, the so-called operating system, that manages the computer and grants control of various functions. Common examples are MS-DOS and Windows(R) for PCs, Mac OS for the Apple(R) Macintosh, and UNIX and Linux(R). On top of the operating system, specific applications are installed. General-purpose operating systems allow a wide range of applications to be used, but they do not necessarily allow the most efficient or fastest possible use of the hardware for any one application.

Opticals: Refers to film effects, film titles and film dissolves and fades.

Optical Fiber: A glass strand designed to carry light in a fashion similar to the manner in which wires carry electrical signals. Optical fibers can carry much more information than wires and over longer distances; the main limit on their capacity over distance is a characteristic called "pulse dispersion".

Orthostereoscopic: A one-to-one condition where what is being displayed is the same as the 'real world'. For example, IMAX 3D is often shot with parallel cameras spaced at the average human adult interpupillary distance (approx 63.5 mm) and with wide angle lenses that closely match an audience member's view of the screen.

Overlay: Keyed insertion of one image onto another.

Section P

Paintbox: Digital graphics generator manufactured by Quantel.

PAL: An acronym for "Phase Alternating Line". It is a composite color standard used in many parts of the world. The format consists of 625 scan lines of resolution at 25 fps (25 Hz). The phase alternation makes the signal less susceptible to distortion.

Pan and Scan: Describes part of the editing process in which the aspect ratio is changed by showing only a portion of the original image at any one time, panning across the frame as needed.

Parallax: This refers to the separation of the left and right images on the projection device or display screen. Positive Parallax puts an object behind the screen (on screen objects in the left eye image are to the left of the same objects in the right eye image). Negative Parallax puts an object in front of the screen (on screen objects in the left eye image are to the right of the same objects in the right eye image). Zero or neutral Parallax puts an object on the screen (on screen objects in the left eye image are overlaid on the same objects in the right eye image).

The only difference between stereo cameras should be parallax or the angle between the axes of the lenses, as in Camera Convergence; anything else can disturb the stereo viewing experience. This requires close attention, so that the cameras are set up the same and with the same filters. Color differences, skewing, vertical misalignment, differential weave and hop, lens flares, poor VFX fixes, scratches and dirt can all cause problems.

Fast cuts between shots with strong positive and strong negative parallax can be unsettling in some circumstances. This is because the eyes and brain are being asked to jump uncomfortably quickly between positions and then make sense of the result. This can be mitigated by the use of 'handing off', i.e. dynamically changing the convergence of an outgoing shot in relation to an incoming shot. Another method of dealing with this is trying wherever possible to cut between shots that are somewhat close in parallax. Vertical parallax is a vertical offset between stereo images and is very uncomfortable to watch, so it is necessary to remove it during post production.

Note: The term 'Parallax' is sometimes used interchangeably with 'Congruence' or 'Disparity'.
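
A minimal sketch of the sign convention described above, in Python (illustrative only; the helper name and pixel positions are assumptions, not part of any stereo toolset):

def screen_depth(x_left, x_right):
    """Classify apparent depth from the horizontal position of the same point
    in the left-eye and right-eye images."""
    parallax = x_right - x_left
    if parallax > 0:
        return "behind the screen (positive parallax)"
    if parallax < 0:
        return "in front of the screen (negative parallax)"
    return "on the screen (zero parallax)"

print(screen_depth(100, 104))   # left-eye point is left of the right-eye point
print(screen_depth(100, 96))    # left-eye point is right of the right-eye point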

Parallel Digital: A digital video interface which uses twisted-pair wiring and 25-pin D connectors to convey the bits of a digital video signal in parallel.

Parity: An extra "bit" appended to a character as an accuracy check. It is one of the simplest error detection techniques and can detect a single-bit failure.
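
A minimal sketch of the technique in Python (illustrative; the function names are hypothetical):

def add_even_parity(bits):
    """Append a parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(bits_with_parity):
    """A single flipped bit makes the count of 1s odd, exposing the error."""
    return sum(bits_with_parity) % 2 == 0

word = add_even_parity([1, 0, 1, 1, 0, 0, 1])   # -> [1, 0, 1, 1, 0, 0, 1, 0]
print(parity_ok(word))                          # True: no error
word[2] ^= 1                                    # simulate a single-bit failure
print(parity_ok(word))                          # False: error detected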

Patch Panel: A manual method of routing signals using a panel of receptacles for sources and destinations and wire jumpers to interconnect them.

PCI: An acronym for "Peripheral Component Interconnect". A high-speed interconnect system that runs at processor speed. PCI is designed so that all processors, co-processors and support chips can be linked together. PCI bus mastering provides perfect audio sync and sustained throughput levels over three (3) megabits per second.

Petabyte: A digital storage capacity equivalent to one-quadrillion bytes.

Pixel: The digital representation of the smallest area of a television image capable of being delineated by the bit stream. The smaller and closer together the pixels, the higher the picture resolution.

Play List: A prepared list of video sequences that can be fed to players, such as a digital disk recorder, to be played out in the order of the list.

Play-Out: The playing out of video and/or audio material from a video system.

Plug-In: Additions to software that can be installed afterwards. Plug-ins provide special effects or features for the respective software.

Post House: Abbreviation for Post-production House. Usually a company specializing in the business of cutting, color grading, finishing or conforming a clip, movie or film.

Post Production: All production work performed after the raw video footage and audio elements have been captured. Editing, titles, special effects insertion, image enhancement and audio mixing are done during post production.

Prerender: A term describing graphics, image or video material that is rendered in advance rather than in real time. Usually the material has been rendered beforehand on other, often equivalent, equipment.

Primary Colors: Colors that are combined to produce the full range of other colors within the limits of a system.

Primary Source Clip: A clip created from a source clip when the source clip is placed in the time line on the sequencer. It does NOT contain any digitized media. It only refers to the primary clip.

Production: Creation of recorded image information with associated audio elements to achieve the thematic and artistic content desired for distribution.

Production Rolls: A generic term used for various types of production elements before they are cut and assembled into reels.

Production Sound: Audio recorded during principal photography on location.

Progressive Scanning: A display mode for electronic imaging in which all of the scanned lines are presented successively and each field has the same number of lines as a frame. It is also known as sequential scanning. It requires twice (2x) the bandwidth of interlaced scanning.

Protocol: A set of syntax rules defining the exchange of data.

Proxy: Material rendered in a lower quality, normally unsuitable for broadcast. Mainly used for preview or offline-editing purposes.

PsF Imaging: An acronym for "Progressive-Segmented Frame imaging". Whole frames are captured at the same instant. Each frame represents a single moment in time. After the frame is captured, it is "segmented" or separated into two (2) halves. One-half consists of odd lines and the other half consists of even lines.

Pseudoscopic: If a stereoscopic signal is reversed (e.g. each eye is being fed the opposite eye's signal, or if there is a one-frame offset between the eyes) a strange 'punched in' effect appears. This is also referred to as inverted stereo or reversed stereo.

Pull-Down: A technique used when converting between film material (24 fps) and NTSC video (30 fps): fields are repeated in a regular cadence to reach the video rate, and removing the pulldown eliminates those redundant fields again.
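
A minimal sketch of the 2:3 cadence in Python (the frame labels A-D are illustrative; the cadence values are the standard ones):

# Four film frames fill ten video fields, so 24 film frames per second map
# onto 60 video fields (30 video frames) per second.
cadence = [2, 3, 2, 3]                 # fields contributed by each film frame
film_frames = ["A", "B", "C", "D"]

fields = []
for frame, count in zip(film_frames, cadence):
    fields.extend([frame] * count)

print(fields)        # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(fields))   # 10 fields from 4 film frames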

Pulfrich Effect: Horizontal motion that can be interpreted as binocular depth. A stereo effect which is produced when 2D images moving laterally on a single plane are viewed at slightly different times by each eye.

Section Q

Quad Split: The visual effect of dividing a picture in four (4) segments, each of which may display video from a separate source.

Quantization: One of the steps in converting an analog signal to a digital one. A sample is measured to determine a representative numerical value that is then encoded. There are three steps in analog-to-digital conversion: sampling, quantizing and encoding. The coded values are typically represented with binary numbers. Video signals are often coded using 8 or 10 bits, which allow 256 and 1024 different values respectively. Audio uses 16 or 24 bits with 65,536 and 16,777,216 different values. For video and audio coding, increasing the bit depth does not increase the maximum or minimum values but the number of steps between minimum and maximum, which normally gives better quality.
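
A minimal sketch of uniform quantization in Python (illustrative; the 0.0 to 1.0 value range and the function name are assumptions):

def quantize(value, bits, v_min=0.0, v_max=1.0):
    """Map an analog value in [v_min, v_max] to the nearest of 2**bits levels."""
    levels = 2 ** bits                                   # 8 bits -> 256, 10 bits -> 1024
    code = round((value - v_min) * (levels - 1) / (v_max - v_min))
    step = (v_max - v_min) / (levels - 1)                # size of one quantization step
    return code, v_min + code * step                     # code and its reconstructed value

print(quantize(0.5, 8))    # (128, 0.50196...) with 256 levels
print(quantize(0.5, 10))   # (512, 0.50048...) finer steps over the same range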

QuickTime: A QuickTime(R) file works as a multimedia container file. It contains one or more tracks, each of which stores a particular type of data, like video, audio, effects, or text (for subtitles, for example). Each track in turn contains track media. This might be either the digitally-encoded media stream (using a specific codec, e.g. JPEG, MP3, DivX, or PNG) or a data reference to the media stored in another file or elsewhere on a network. An 'edit list' indicates what parts of the media to use.

Section R

R&D: An acronym for "Research and Development".

RAID: RAID is an acronym for "Redundant Array of Independent Disks". RAID is a method of enabling several physical hard disk drives to act as a single orchestrated storage device. It provides Fault Tolerance in the event of a disk drive failure. It also provides higher data rate throughput than a single disk drive.

RAM: An acronym for "Random Access Memory". The chips in a computer that contain its working memory.

Raster: The scanning pattern of an imaging system. It usually moves from left to right and repeats over the image from top to bottom, forming the horizontal scan lines.

Raw Stock: Unexposed film or audio tape.

RCA Connector: A type of connector used on VCRs and camcorders to carry standard composite video and audio signals. Also known as a "phono" connector.

Real Time: The concept of a system that reacts and responds to events as fast as they happen. A good example can be seen in the games industry: moving the joystick and seeing the image on screen react simultaneously; the processes needed to achieve this effect are called "real-time".

Record/Capture: Analog video (or audio) signals are converted into digital formats.

Recovery: Recreation of the original stored data in a RAID storage system e.g. after a hard disk failure. The missing data is recreated from the stored parity information.
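
A minimal sketch of parity-based recovery in Python (illustrative block contents; this mirrors the XOR parity used in RAID levels such as RAID 5, not any specific product):

def xor_blocks(blocks):
    """XOR equal-length blocks together byte by byte."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

drive_a = b"\x10\x20\x30\x40"
drive_b = b"\x0f\x0e\x0d\x0c"
drive_c = b"\xaa\xbb\xcc\xdd"
parity  = xor_blocks([drive_a, drive_b, drive_c])   # stored parity information

# Drive B fails: its data is recreated from the parity and the surviving drives.
rebuilt_b = xor_blocks([parity, drive_a, drive_c])
print(rebuilt_b == drive_b)   # True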

Reference Black Level: The video signal level that represents black. See the discussion of horizontal timing.

Reference Clip: A clip created from a source clip when the source clip is placed in the time line on the sequencer. Also known as a "sub-clip" and "secondary clip".

Reference Genlock: Describes the process of signals being synchronized. When combining more than one signal, one specific reference signal helps to synchronize the different sources.

Release Print: Numerous duplicate prints of a subject made for general theatre release. It is printed from an internegative.

Remote Control: To control a system by remote. Most video systems can be controlled remotely, for example via an RS-422 interface, a common control interface in the field of video equipment. With it you can, for instance, command a video system to start a play-out operation from another system while simultaneously recording the played-out material on that other system. Tape machines such as VTRs can also be controlled this way, making simultaneous play-out and record operations between different systems an easy task. See also RS-422.

Remote Diagnostic: To diagnose a system remotely, e.g. with the help of diagnostics software, by accessing the system via the Internet or a network.

Rendering: The process by which video editing software and hardware convert raw video, effects, transitions and filters into a new continuous video file. A non-real time drawing of a picture relying on computer processing speed for graphics and compositing.

Resolution: The sharpness or "crispness" of a picture. It can be measured numerically by establishing the number of scanning lines used to create each frame of video.

Resolution Independent: Equipment that can work in more than one resolution.

RF: An acronym for "Radio Frequency". A term used to describe the radio signal band of the electromagnetic spectrum i.e. 3 MHz — 300 GHz. RF connectors carry RF television signals.

RF Splitter: A device that provides multiple RF signals. It is used to send a signal from one VTR to two or more monitors/televisions.

RGB: The basic parallel component set (red; green; blue) in which a signal is used for each primary color. May also be referred to as "GBR", the mechanical sequence of the connectors in the SMPTE interconnect standard.

ROM: An acronym for "Read Only Memory". Permanently programmed memory.

Rotation: An effect on a video system which rotates and turns the images of a video clip at a freely definable angle. See also Effect.

Rough Cut: Assembly of edited shots prior to picture lock.

Routing: Describes the activity of a device within a computer network that decides the destination of a data package. The router is connected to more than one network and is often included as part of a network switch.

RS-422: A medium range balanced serial data transmission standard. Full specification includes 9-way, D-type connectors. It is widely used for control links around production and post areas for a range of equipment.

Rushes: Footage that is shot in a day; referred to as "dailies" in the U.S.

Section S

Sampling: The process where analog signals are measured, often millions of times per second in video, in order to convert analog signals to digital. The official sampling standard for television is ITU-R 601.

SAN: An acronym for "Storage Area Network".

SAS: An acronym for "Serial Attached SCSI (Small Computer System Interface)". SAS is a market replacement for parallel SCSI. It is a serial interface that has the same electrical specifications as SATA, but uses SCSI protocol and two SATA links for a data rate of 300 MB/sec.

SATA: An acronym for "Serial ATA (Advanced Technology Attachment)". It is a further development of ATA also known as IDE. SATA is the successor of ATA, but it is a serial interface, resulting in easier cabling and fewer errors. The maximum data rate is 150 MB/sec. for SATA and 300 MB/sec. for SATA-2, a newer version of SATA.

SATA-2: A newer version of Serial ATA (SATA) with a maximum data rate of 300 MB/sec. See SATA.

Saturation: Term used to describe color brilliance or purity.

Scaler Control: To control a scaling (by remote). See Scaling.

Scaling: Scaling generally indicates a change of the resolution of images, i.e. the images are made larger or smaller. Auto-scaling is a setting of a video system that enables an automatic scaling and re-sizing of the original video material. If the resolution of the original material differs from the configured raster, it will be scaled either up or down to its maximum allowable width and/or height according to the selected video format. The auto-scaling setting makes sure that no image information gets lost i.e. the images will not be cropped nor will they be too small for the selected video format. See also Video Format.
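
A minimal sketch of the auto-scaling behaviour described above, in Python (illustrative; the function name and the rounding are assumptions):

def auto_scale(src_w, src_h, raster_w, raster_h):
    """Return the largest size that fits the raster while keeping the source
    aspect ratio, so the image is neither cropped nor left needlessly small."""
    scale = min(raster_w / src_w, raster_h / src_h)   # up- or down-scaling factor
    return round(src_w * scale), round(src_h * scale)

print(auto_scale(1280, 720, 1920, 1080))   # (1920, 1080): scaled up, not cropped
print(auto_scale(720, 576, 1920, 1080))    # (1350, 1080): fits the height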

Scan Converter: An external device that converts a computer's VGA output to video so that it can be displayed on a TV or VCR.

Scrub Bar: A scrub bar is a software item that represents a timeline of video material. It provides a slider/cursor that can be used to move through the images of the material i.e. with it you "scrub" the bar. See also Timeline

SCSI: An acronym for "Small Computer System Interface". It is a parallel interface that is used by computer systems to connect peripheral devices; a connection of up to 15 drives to one interface port are possible thanks to the BUS architecture. The maximum data rate is 320 MB/sec.

SCSI Address: A number from 0 to 7 that uniquely identifies a SCSI device to a system. No two SCSI devices that are physically connected to the same workstation can have the same SCSI address.

SCSI Termination: A metal cap that plugs into any open SCSI port on a SCSI bus line. All SCSI ports need to be occupied by a cable or terminator to ensure proper function.

SD: An acronym for "Standard-Definition".

SDI: An acronym for "Serial Digital Interface". It is a standard based on 270 Mbps transfer rate. It is a 10-bit interface for both component and composite digital video with four (4) channels of embedded digital audio. It uses 75-Ohm BNC connectors and coax cable and can transmit signal over 600 feet.

Serial Port: A computer input/output (I/O) port through which the computer communicates with the external environment. The standard serial port uses RS-232 & RS-422 protocols.

Server: When a computer provides services to other computing systems (clients) over a network, it is defined as a server. Most complex computer systems today require a server, but the term can also refer to the software or hardware elements of such a system.

Shortcut: There are two types of shortcuts:
1. Computer shortcuts are small files containing the location of other files. Computer shortcuts are usually located on the desktop to start programs without using a command line.
2. Keyboard shortcuts are a key or set of keys that perform a predefined function. Sequences such as using a menu or typing commands can be reduced to a few keystrokes.

Shuttle: Functionality: 1. Viewing footage at speeds greater than real time. 2. A removable drive unit for easy transport of data and media files from one system to another without connecting and disconnecting cables.

Signal to Noise Ratio (S/N): The ratio between the strength of an electronic signal and the amount of electronic background noise. S/N is measured in decibels (dB). Video specifications include three (3) figures: Video (Luminance); Color (Chrominance); Audio (Sound). The larger the ratio, the better the signal.
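
A minimal sketch of expressing the ratio in decibels, in Python (illustrative amplitudes; for amplitude or voltage levels the usual formula is 20 * log10(signal / noise)):

import math

def snr_db(signal_amplitude, noise_amplitude):
    """Signal-to-noise ratio in dB for amplitude (voltage) levels."""
    return 20.0 * math.log10(signal_amplitude / noise_amplitude)

print(f"{snr_db(1000.0, 1.0):.1f} dB")   # 60.0 dB
print(f"{snr_db(100.0, 1.0):.1f} dB")    # 40.0 dB, a smaller ratio and a noisier signal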

Single Link: SDTV and HDTV in YUV 4:2:2 can be transmitted via HD-SDI and thus via a single BNC connector. This is called a single-link connection in contrast to a dual-link connection. See also HD-SDI and Dual link.

Slow-PAL: Utilizes the proximity of 25 Hz to 24 Hz common in film applications. It is the PAL video format in 24 frames per second (interlaced). Another application of Slow-PAL can be found when converting video sequences from PAL (25 fps) to NTSC (29.97 fps). In contrast to a direct conversion where an interpolation of video fields is required resulting in a slight loss of image quality, with Slow-PAL the conversion will leave the images untouched. However, the final result will be about 4% slower. By converting to Slow-PAL the frame rate is reduced to 24 frames per second which can then easily be converted to the near 30 frames per second of NTSC by simply doubling video fields.

SMART: SMART is an acronym for "Self-Monitoring Analysis & Reporting Technology". It allows disk drives to perform sophisticated self-diagnosis and auto-correction when possible and reports faults to the computer's Operating System (OS) when necessary.

Smart Slate: Production clapper that includes a lighted readout of the timecode being recorded onto the production sound audiotape.

SMTP: An acronym for "Simple Mail Transfer Protocol" which is a TCP/IP protocol. In sending and receiving e-mail it is usually used with POP3 or IMAP protocols to generate a user-friendly e-mailing process. SMTP is typically used by programs for sending e-mail while either POP3 or IMAP support programs for receiving e-mail.

SMPTE: SMPTE is an acronym for "Society of Motion Picture & Television Engineers". This group establishes and enforces industry technical standards.

Source Clip: Refers directly to physical media.

Source Timecode: Timecode information that is stored directly in the video clip or the individual frames of an image sequence (stored in the file headers). See also Timecode.

Spline: An interpolation that produces movement between key frames along curved lines creating a smooth, flowing motion.

Split Edit: A type of edit transition where either the video or the audio is delayed from being recorded for a given time.

Split-Screen: An effect that displays two images separated by a horizontal or vertical wipe line.

Squeeze: A change in the aspect ratio. Anamorphic lenses sometimes "squeeze" a widescreen scene by a factor of 2 horizontally so that it will fit on a 1.33:1 aspect ratio frame.

Stand-Alone: Stand-alone describes programs which run without the services of other programs (except maybe firmware).

Standard-Definition Image: 720 x 480 (NTSC) or 720 x 576 (PAL).

Stereo: Stereophonic sound. Two independent audio channels are used to create a spatial sound effect.

Stereoscopic Window: The amount of stereo image available to the viewer is dictated by the frame surrounding a stereoscopic image, e.g. the size of the TV or projection screen. This boundary is called the Stereo Window. Depending on their Parallax, objects will appear either in front of, at, or behind this Window. IMAX has the largest window.

Stills: Short for "Still Images", i.e. the output when a play-out of video is paused. When a play-out is stopped on a digital video system, the last image will be continuously repeated at its video outputs. With video material in a progressive video format this results in an acceptable output. However, when interlaced video is used and the currently displayed image contains rapid movement, a simple frame repetition (the repetition of two video fields) is not acceptable because the image output will jitter. In that case the repetition of a single video field is preferable. See also Interlaced and Frame repetition.

Stripe Set: A stripe set is a storage configuration that consists of multiple hard drives. The total capacity is the sum of the individual drives. The performance in general is higher than the performance of a single hard drive. A stripe set has no redundancy and is the same as a RAID 0 configuration.
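
A minimal sketch of striping in Python (illustrative; a three-drive set and simple round-robin block placement are assumed):

def drive_for_block(block_number, drive_count):
    """Successive blocks are written round-robin across the member drives."""
    return block_number % drive_count

for block in range(6):
    print(f"block {block} -> drive {drive_for_block(block, drive_count=3)}")
# blocks 0..5 land on drives 0, 1, 2, 0, 1, 2: capacity adds up, nothing is duplicated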

Sustained Data Rate: Sustained data rates are the average of data rates measured over a longer period. The best way to measure the data rate of a storage system is to run the same application that is intended to be used with the storage.

S-Video: An abbreviation for "Separated Video", in which the luminance (Y) and chrominance (C) portions of the signal are carried separately on the cable itself. S-Video is a hardware standard that defines the physical cable and jacks; the industry has settled on a 4-pin mini-plug connector. S-Video has no relationship to the resolution or refresh rate of the signal.

Sweetening: The final combining and enhancing of a video program's audio tracks.

Sync Sound: Sound that is recorded with the intention of being married to a picture at an exact point.

Synchronization: Also referred to as "Sync". It is a transmission procedure by which the bit and character streams are controlled by accurately synchronized clocks, both at the receiving and sending end.

Section T

Tail: See "Head and Tail".

TBC: An acronym for "Time Base Corrector". A device used to correct time-base errors and stabilize the timing of the video output from a tape machine. It corrects problems in a video signal's sync pulses by generating a new, clean time base and synchronizing any other incoming video to this reference.

TCP: An acronym for "Transmission Control Protocol". It is the major transport protocol in the Internet suite of protocols, providing reliable, connection-oriented, full-duplex streams. It uses IP for delivery.

Telecine: The process of transferring film to videotape.

Terabyte: Equivalent to 1 trillion bytes or 1 thousand gigabytes.

TGA: Abbreviation for "Targa", i.e. the Targa Image Format. An RGB image file format with 8 bits per channel, with or without a key (alpha) channel, that can be used for video clips. See also Key Channel and Video Clip.

Third-Party: Software or hardware developed by other manufacturers.

Thumbnail: A down-converted image that provides a preview of its original material. Thumbnails are used to show the contents of video clips as still images, so a clip does not have to be completely loaded and played out just to check its contents.

TIFF: An acronym for "Tag Image File Format". It is the standard file format for high-resolution bit-mapped graphics. TIFF files may be compressed or uncompressed.

Time-Code: A system for numbering video frames where a code denoting hours/minutes/seconds/frames is assigned to each frame. In North America, the Time-Code standard is SMPTE.
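
A minimal sketch of the hours/minutes/seconds/frames numbering in Python (illustrative; a non-drop-frame count at 30 frames/second is assumed):

def frames_to_timecode(frame_count, fps=30):
    """Convert a running frame count to an hh:mm:ss:ff timecode string."""
    frames = frame_count % fps
    seconds = (frame_count // fps) % 60
    minutes = (frame_count // (fps * 60)) % 60
    hours = frame_count // (fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

print(frames_to_timecode(1799))   # 00:00:59:29
print(frames_to_timecode(1800))   # 00:01:00:00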

Time-Code Generator: A signal generator designed to generate and transmit SMPTE time code.

Time-Line: The graphic representation of a program displayed in the sequencer window.

Tracking: The angle and speed at which the tape passes the video heads.

Transcoder: A device that converts one component format to another, e.g. to convert Y, R-Y, B-Y signals to RGB signals.

Transition: A change from one clip to the next. A popular example is a cut, when the first frame of the starting video segment directly follows the last frame of the segment that is ending. Other transitions are dissolves, wipes, fades, or DVEs.

Trim: The adjusting of transitions in a sequence.

Trim Handles: The frames before or after the "In" and "Out" points for a source clip allowing for trimming and transitions.

Section U

Uncompressed: Uncompressing (or decompressing) is the act of expanding a compressed file back into its original form. Software often comes in a compressed package, e.g. as an internet download, and often uncompresses itself when you click on it. Files can be uncompressed using popular tools such as PKZIP under DOS, WinZip under Windows, and MacZip on the Macintosh.

Up-Conversion: The process of converting low resolution video to higher resolution video.

USB: An acronym for "Universal Serial Bus".

Section V

VANC: An abbreviation for "Vertical Ancillary Data".

Vectorscope: A specialized oscilloscope which demodulates the video signal and presents a display of R-Y versus B-Y. It allows for the accurate evaluation of the chrominance portion of a signal.

Versioning: Also known as "Version Control" and "Revision Control". The management of different versions of a set of data. In digital video it can stand for a comparison of multiple revisions of an image/clip sequence.

Vertical Interval: The vertical interval signals the picture monitor to go back to the top of the screen to begin another vertical scan. It is the portion of the video signal that occurs between the end of one field and the beginning of the next.

Video Clip: In a non-linear editing environment, a clip indicates data of either video or audio that has been clipped out (copied) from a larger environment such as a reel or a video tape. In essence a video clip is a snippet of video. Video clips usually are folders/directories that contain a great number of individual image files (the frames) which combined form the video sequence. The image files can be stored in a wide variety of picture file formats (e.g. BMP or TIFF). However, video clips can also be stored in a single file in a container file format such as QuickTime or Windows Media.

Video Format: Determines the way video is transmitted or received. For example, for a record operation it determines how a video signal is received at the inputs. Most notably, the setting of a video format must detail the video raster (resolution, e.g. 1920 x 1080).

Video Switcher: A device that switches between different video sources for transmission. May contain special effects generators.

VITC: An acronym for "Vertical Interval Timecode". It is a popular method for recording timecode. A timecode address for each frame of video is inserted in the vertical interval of the video signal, where it is invisible on-screen yet easy to retrieve. Professional videotape machines can read VITC in either the play or the jog (manual) mode, making it ideal for editing.

Volume: An identifiable unit of data storage in computers or storage systems. It might be physically removable. In tape storage systems, a volume may be a tape cartridge (or in older systems, a tape reel). In mainframe storage systems, a volume may be a removable hard disk. Each volume can be specified by the user via its system-unique name or number. In some systems, the physical unit may be divided into several separately identifiable volumes.

VTR: An acronym for "Video Tape Recorder".

VU: An abbreviation for "Volume Units". A unit of measure for complex audio signals, usually in decibels (dB). The reference level of -20 dB is 0 VU.

Section W

Watchdog: Software or hardware that determines what should be sent out in case of a failure. For example, if dropped frames are detected, the watchdog reacts and outputs certain images instead. The watchdog output can be configured, for example, to a color bar image, a black frame or the last played-out image.

WAV: An abbreviation for "Wave" i.e. the Wave File Format. File format for digital audio (waveform) data under Windows that can be used for audio clips. It is capable of storing multiple mono or stereo channels.

Waveform Monitor: A specialized oscilloscope that displays analog video signals at a horizontal and/or vertical rate. It is used for evaluating television signals.

Wetgate Print: A print created using a chemical process that coats the negative to help conceal imperfections, such as scratches, in the image.

Window Dub: Copies of videotape with "burnt-in" time-code display. Hours, minutes, seconds and frames appear on the recorded image.

Windows: Operating system for IBM compatible PCs developed by the company Microsoft.

Wipe: A shaped transition between video sources. A margin or border moves across the screen, wiping out the image of one scene and replacing it with an image of the next scene.

WMV: An abbreviation for "Windows Media Video", a set of video codec technologies developed by Microsoft. WMV is part of the Windows(R) Media framework.

Wordclock: A word clock or wordclock (sometimes called a sample clock) is a clock signal (not the actual device) used to synchronize other devices, such as digital audio tape machines and players. Wordclock is used entirely to keep a perfectly timed and constant bit rate to avoid data errors. The devices are interconnected via their digital audio connections. A wordclock is used by formats such as S/P-DIF, AES/EBU and ADAT, among others. Various audio-over-Ethernet protocols use broadcast packets for the wordclock. On a network the wordclock is controlled by a master clock.

Work Print: Any picture or sound track print, usually positive, intended for use in the editing process to establish the finished version of a film.

Workflow: Workflow is the operational aspect of a work procedure. It describes how tasks are structured, who performs them, what their relative order is, how they are synchronized, how information flows to support the tasks and how tasks are being tracked. Along the dimension of time, workflow considers "throughput" as a distinct measure.

Workstation: A high-end, specialized computer system intended for use by engineers or imaging professionals.

Section X

XFR: a slang expression for "transfer."

XLR: A secure three (3) pronged audio-connector covered by a metal sheath often found on high quality audio / video equipment; a type of audio connector featuring three leads: two for the signal and one for overall system grounding. XLR is often used for microphones.

Section Y

Y, R-Y, B-Y: The general set of CAV signals used in the PAL system as well as some encoder/decoder applications in NTSC systems. Y represents the luminance signal, R-Y is the 1st color difference signal and B-Y is the 2nd color difference signal.

Y, U, V: Luminance and color difference components for the PAL system. U and V are scaled versions of B-Y and R-Y respectively; the derivation from RGB is otherwise identical to Y, R-Y, B-Y.

YUV: YUV is the abbreviation for the differential brightness and color signals. It is the color space used by NTSC and PAL video systems. While the "Y" is the luma component, the "U" and "V" are the color difference components. Some may mistake the Y'UV notation for Y'CbCr data. Most use the YUV notation rather than Y'UV or Y'U'V'. Technically correct is Y'U'V' since all three components are derived from RGB. YUV is also the name for some component analog interfaces of consumer equipment.
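
A minimal sketch of the derivation from R'G'B' in Python (the BT.601 luma coefficients and the classic U/V scale factors are assumed here; the entry itself does not give the constants):

def rgb_to_yuv(r, g, b):
    """Derive Y, U and V from gamma-corrected R'G'B' values in the range 0..1."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma
    u = 0.492 * (b - y)                     # scaled B-Y color difference
    v = 0.877 * (r - y)                     # scaled R-Y color difference
    return y, u, v

print(rgb_to_yuv(1.0, 1.0, 1.0))   # white: approximately (1.0, 0.0, 0.0)
print(rgb_to_yuv(1.0, 0.0, 0.0))   # pure red: negative U, large positive V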

Section Z

Zero Timing Point: The point at which all video signals must be in synchronization, typically the switcher input.

Zoom and Pan: Zoom increases the length of the camera lens, magnifying an aspect of a scene. The results of a zoom and a dolly are different. A dolly physically moves the camera closer to the point of interest without changing the length of the lens. Zoom increases the size of the point of interest by increasing the lens length. Pan adjusts the focal point by pivoting the camera direction, usually slowly across a scene.