Synchronization

Signal Processors–Videos

Posted on October 8, 2010. Filed under: Audio, Communications, Digital Communication, Signal Processors, Sound, Synchronization

Sound Made Simple – Equalization – Definition

Adobe Audition 3.0 Introduction and New Features

Adobe Audition 3.0 Tutorial (in English)

Adobe Audition Editing and Effects

Main Points To Remember

1. Signal processors are used to alter some characteristic of a sound. They can be grouped into four categories: (1) spectrum, (2) time, (3) amplitude (or dynamics), and (4) noise.

2. The equalizer and the filter are examples of spectrum processors because they alter the spectral balance of a signal. The equalizer increases or decreases the level of a signal at a selected frequency by boost or cut (also known as peak and dip) or by shelving; a peaking-EQ sketch appears after this list. The filter attenuates certain frequencies above, below, between, or at a preset point or points.

3. Two types of equalizers in common use are the fixed-frequency and the parametric.

4. The most common filters are high-pass (low-cut), low-pass (high-cut), band-pass, and notch (a low-cut example is included in the sketch after this list).

5. Psychoacoustic processors add clarity, definition, and overall presence to sound.

6. Time signal processors affect the time relationships of signals. Reverberation and delay are two such effects.

7. The three most common types of reverberation systems used today are digital, plate, and acoustic chamber.

8. Digital reverberation electronically reproduces the sound of different acoustic environments.

9. An important feature of most digital reverb units is predelay: the amount of time between the onset of the direct sound and the appearance of the first reflections.

10. Delay is the time interval between a sound or signal and its repetition.

11. Delay effects, such as doubling, chorus, slapback echo, and prereverb delay, are usually produced electronically with a digital delay device (see the delay and flanger sketch after this list).

12. Flanging and phasing split a signal and slightly delay one part to create controlled phase cancellations that generate a pulsating sound. Flanging uses a time delay; phasing uses a phase shifter.

13. Amplitude (dynamic) signal processors affect a sound’s dynamic range. These effects include compressing, limiting, de-essing, expanding, noise gating, and pitch shifting.

14. With compression, as the input level increases, the output level also increases but at a slower rate, reducing dynamic range. With limiting, the output level stays at or below a preset point regardless of the input level. With expansion, as the input level increases, the output level also increases but at a greater rate, increasing dynamic range. (See the gain-curve sketch after this list.)

15. A de-esser is basically a fast-acting compressor that attenuates high-frequency sibilance.

16. A noise gate is used primarily to reduce or eliminate unwanted low-level noise, such as ambience and leakage; a simple gate sketch follows this list. It is also used creatively to produce dynamic special effects.

17. A pitch shifter uses both compression and expansion to change the pitch of a signal or the running time of a program.

18. Noise processors are designed to reduce or eliminate noise from an audio signal. Most are double-ended; they prevent noise from entering a signal. Single-ended noise processors reduce existing noise in the signal.

19. Noise reduction is also possible using the digital signal processing (DSP) in a hard-disk recording/editing system. Virtually any unwanted sound can be eliminated.

20. Multieffects signal processors combine several of the functions of individual signal processors in a single unit.

21. A voice, or vocal, processor can enhance, modify, pitch-correct, harmonize, and completely change the sound of a voice, even to the extent of altering its apparent gender.

22. Plug-ins incorporated in hard-disk recording/editing systems add to digital signal processing not only the familiar processing tools (EQ, compression and expansion, reverb, delay, pitch shifting, and so on) but also an array of capabilities that place at the recordist's fingertips a virtually limitless potential for creating entirely new effects, at comparatively little cost.

23. Plug-in programs are available separately or in bundles; a bundle includes a number of different plug-in programs.
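
Points 2 and 4 describe boosting or cutting at a selected frequency and filtering out a frequency range. As a rough illustration of how these might be realized digitally, here is a minimal Python sketch using the widely published RBJ Audio EQ Cookbook peaking-filter formulas plus a Butterworth low-cut; the function names and parameter values are illustrative, not from the original post.

```python
import numpy as np
from scipy.signal import butter, lfilter

def peaking_eq(x, fs, f0, gain_db, q=1.0):
    """Boost (gain_db > 0) or cut (gain_db < 0) a band centered on f0 Hz.
    Biquad coefficients follow the RBJ Audio EQ Cookbook."""
    a = 10.0 ** (gain_db / 40.0)                 # amplitude factor
    w0 = 2.0 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2.0 * q)
    b = np.array([1 + alpha * a, -2 * np.cos(w0), 1 - alpha * a])
    den = np.array([1 + alpha / a, -2 * np.cos(w0), 1 - alpha / a])
    return lfilter(b / den[0], den / den[0], x)

def low_cut(x, fs, cutoff_hz=80.0, order=2):
    """High-pass (low-cut) filter: attenuates frequencies below cutoff_hz,
    e.g. to remove rumble."""
    b, a = butter(order, cutoff_hz, btype="highpass", fs=fs)
    return lfilter(b, a, x)

# Example: boost 3 kHz by 6 dB, then roll off below 80 Hz, at 48 kHz
fs = 48_000
x = np.random.default_rng(0).standard_normal(fs)   # one second of noise
y = low_cut(peaking_eq(x, fs, f0=3_000, gain_db=6.0), fs)
```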
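
Points 10 through 12 cover delay and flanging. A bare-bones sketch of both effects, assuming a mono floating-point signal; the delay times, feedback, and mix values are illustrative only.

```python
import numpy as np

def feedback_delay(x, fs, delay_ms, feedback=0.3, mix=0.5):
    """Fixed delay line with regeneration: roughly 15-35 ms gives doubling,
    roughly 60-150 ms gives slapback echo."""
    d = max(1, int(fs * delay_ms / 1000.0))
    buf = np.zeros(d)                       # circular delay buffer
    out = np.zeros(len(x))
    idx = 0
    for n in range(len(x)):
        delayed = buf[idx]
        out[n] = (1.0 - mix) * x[n] + mix * delayed
        buf[idx] = x[n] + feedback * delayed
        idx = (idx + 1) % d
    return out

def flanger(x, fs, max_delay_ms=5.0, rate_hz=0.25, mix=0.5):
    """Flanging: mix the signal with a copy whose very short delay is swept
    by an LFO, creating the moving comb-filter cancellations."""
    max_d = fs * max_delay_ms / 1000.0
    out = np.zeros(len(x))
    for n in range(len(x)):
        lfo = 0.5 * (1.0 + np.sin(2.0 * np.pi * rate_hz * n / fs))
        i = n - int(lfo * max_d)
        delayed = x[i] if i >= 0 else 0.0
        out[n] = (1.0 - mix) * x[n] + mix * delayed
    return out
```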
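
The compression, limiting, and expansion relationships in point 14 reduce to a static gain curve. A minimal sketch, with the threshold and ratios chosen purely for illustration (it follows the post's definition of expansion, in which the output rises faster than the input):

```python
import numpy as np

def output_level_db(in_db, threshold_db=-20.0, ratio=4.0):
    """Above the threshold the output rises only 1/ratio dB per input dB.
    ratio > 1 compresses; a very high ratio acts as a limiter;
    ratio < 1 makes the output rise faster than the input (expansion)."""
    over = np.maximum(in_db - threshold_db, 0.0)
    return in_db - over + over / ratio

levels = np.array([-40.0, -20.0, -10.0, 0.0])   # input levels in dB
print(output_level_db(levels, ratio=4.0))    # 4:1 compression: 0 dB in -> -15 dB out
print(output_level_db(levels, ratio=20.0))   # 20:1 acts as a limiter: 0 dB in -> -19 dB out
print(output_level_db(levels, ratio=0.5))    # 1:2 expansion: 0 dB in -> +20 dB out
```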
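
Point 16's noise gate can be sketched as an envelope follower driving an on/off gain. The threshold and time constants below are illustrative, and a real gate would smooth the gain change rather than switch it hard.

```python
import numpy as np

def noise_gate(x, fs, threshold_db=-50.0, attack_ms=1.0, release_ms=100.0):
    """Mute the signal whenever its envelope falls below the threshold,
    suppressing low-level ambience and leakage between wanted sounds."""
    thresh = 10.0 ** (threshold_db / 20.0)
    atk = np.exp(-1.0 / (fs * attack_ms / 1000.0))
    rel = np.exp(-1.0 / (fs * release_ms / 1000.0))
    env = 0.0
    out = np.zeros(len(x))
    for n in range(len(x)):
        level = abs(x[n])
        coef = atk if level > env else rel   # fast attack, slow release
        env = coef * env + (1.0 - coef) * level
        out[n] = x[n] if env >= thresh else 0.0
    return out
```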


Synchronization And Transfers–Videos

Posted on October 8, 2010. Filed under: Audio, Law, Radio, Recordings, Sound, Sound Effects, Speech, Synchronization, Web

Midi Time Code & SMPTE Synchronization for Midi Composers

How to Synchronize Audio and Video

How to Make a Good Basic YouTube Video

Cheap Wireless Radio Mics From eBay – any good?

Radio Mics & Alternatives for Low Budget Filmmaking?

Main Points To Remember

1. Synchronization allows the locking of two or more devices that have microprocessor intelligence so that they operate at precisely the same rate.

2. Accurate synchronization requires a system to code the recording media as well as a synchronizer to read the codes, compare them, and adjust the positions and speeds of machine transports so that they run at exactly the same rate.

3. There are three basic time codes: longitudinal time code and vertical interval time code, both of which are forms of SMPTE time code; MIDI time code; and the IEC standard.

4. SMPTE time code is a high-frequency electronic digital signal consisting of a stream of pulses produced by a time code generator. Its identifying code numbers are broken down into hours, minutes, seconds, and frames (a frame-count conversion sketch follows this list).

5. SMPTE longitudinal time code (LTC) is a digital signal converted to audio frequencies so it can be recorded on an audio track.

6. Vertical interval time code (VITC) carries the same information as SMPTE code, but it is used with videotape and encodes the information vertically within the video signal, outside the visible picture area.

7. MIDI time code (MTC) translates SMPTE time code into MIDI messages.

8. The IEC (International Electrotechnical Commission) standard is the time code system used in digital audio-cassette recorders to ensure compatibility among all R-DAT equipment.

9. Because all time code readouts are in the same form, it is easy to confuse them if any numbers are duplicated on the same tape or on multiple tapes in a production. Two ways to avoid this confusion are to use the zero-start or the time-of-day logging method.

10. In recording SMPTE time code, be careful to record it at the recommended level. If the signal is recorded at too low a level, synchronization is adversely affected. If it is recorded at too high a level, the time code signal will distort.

11. Every digital audio system has a signal, known as a word clock, generated inside the device that controls the sampling frequency, or sampling rate. With digital audio, sampling rate is the determining sync factor.

12. A degradation in word-clock signals among the digital devices being interfaced can create jitter: a variation in time from sample to sample that causes changes in the shape of the audio waveform (a small numeric illustration follows this list).

13. Five frame rate standards are used within SMPTE time code: 23.976, 24, 25, 29.97, and 30 frames per second (fps).

14. Frame rates for television are in either drop frame or non-drop frame format. Drop frame time code is time-accurate because it makes up for the error that results from the difference between the 29.97-fps rate of video and the nominal 30-fps rate (see the drop-frame sketch after this list). Non-drop frame is the original video time code calculated at 30 fps. The two modes are not interchangeable.

15. In double-system recording, sound and picture are recorded separately and in sync; the camera records the picture, and an audio recorder handles the sound.

16. In single-system recording, both sound and picture are recorded on the same medium.

17. Two methods used to synchronize the film camera and the audio recorder in double-system recording are crystal synchronization and time code synchronization.

18. In double-system recording, a clapslate is used to make a visible and an audible sync mark on the film and the audio recording, respectively. This helps identify and synchronize scenes during their transfer from the audio recording to magnetic film or, more commonly, hard disk, and in editing.

19. Time code permits the accurate interlocking of two or more recorders, but a synchronizer is necessary to ensure that their transports run together simultaneously.

20. Copying sound (or picture) from one audio, film, or video device to another is usually called a transfer. Dub is another often-used term for this process.

21. Common audio transfers are analog to analog, analog to digital, and digital to digital.

22. In transferring audio, the sound can be altered for special effects.

23. The process of transferring a double-system film recording for postproduction to align the audio and the film is called resolving.
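
The hours:minutes:seconds:frames readout in point 4 is simple place-value arithmetic on a running frame count. A minimal non-drop frame sketch in Python (the function names are illustrative, not SMPTE's):

```python
def frames_to_timecode(frame_count, fps=30):
    """Non-drop frame: every frame number is used, so the readout is
    plain place-value arithmetic."""
    f = frame_count % fps
    s = (frame_count // fps) % 60
    m = (frame_count // (fps * 60)) % 60
    h = frame_count // (fps * 3600)
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

def timecode_to_frames(tc, fps=30):
    """Inverse conversion from an hh:mm:ss:ff string to a frame count."""
    h, m, s, f = (int(p) for p in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

print(frames_to_timecode(107999))         # 00:59:59:29 at 30 fps
print(timecode_to_frames("01:00:00:00"))  # 108000
```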
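
Point 12's jitter can be illustrated numerically: sampling the same waveform with a slightly unsteady clock changes the sample values and hence the waveform shape. A tiny sketch, with an arbitrary example jitter magnitude:

```python
import numpy as np

fs = 48_000
n = np.arange(256)
clean = np.sin(2.0 * np.pi * 1_000 * n / fs)          # 1 kHz tone, steady clock
jit = np.random.default_rng(1).normal(0.0, 2e-6, n.size)  # ~2 us RMS clock error
jittered = np.sin(2.0 * np.pi * 1_000 * (n / fs + jit))   # same tone, unsteady clock
print(np.max(np.abs(jittered - clean)))   # waveform error caused by jitter
```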
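
Point 14's drop frame correction works by skipping frame numbers 00 and 01 at the start of every minute except every tenth minute, so the readout tracks 29.97-fps clock time. A sketch of that renumbering arithmetic, based on the commonly published algorithm (variable names are mine):

```python
def frames_to_drop_frame_timecode(frame_count):
    """29.97 fps drop frame: frame numbers 00 and 01 are skipped at the start
    of each minute, except minutes 00, 10, 20, ... (by convention the frame
    field is written after a semicolon)."""
    frames_per_10min = 17_982   # 10 * 60 * 30 minus 9 minutes * 2 dropped numbers
    frames_per_min = 1_798      # 60 * 30 minus 2 dropped numbers
    d, m = divmod(frame_count, frames_per_10min)
    frame_count += 18 * d       # 2 numbers dropped in 9 of every 10 minutes
    if m > 1:
        frame_count += 2 * ((m - 2) // frames_per_min)
    f = frame_count % 30
    s = (frame_count // 30) % 60
    mins = (frame_count // 1_800) % 60
    h = frame_count // 108_000
    return f"{h:02d}:{mins:02d}:{s:02d};{f:02d}"

print(frames_to_drop_frame_timecode(1_800))   # 00:01:00;02 (00 and 01 skipped)
print(frames_to_drop_frame_timecode(17_982))  # 00:10:00;00 (tenth minute keeps 00)
```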

