diff --git a/manual/xml/synchronization_concepts.xml b/manual/xml/synchronization_concepts.xml
index 765bb69fe5..0947baf340 100644
]>
<section id="sn-synchronization_concepts">
  <title>Synchronization Concepts</title>
  <para>
    As soon as you start handling audio on more than one device, it is
    important to understand and to think about
    <emphasis>synchronization</emphasis>: how to get the devices to have
    the same sense of time and speed.
  </para>
  <para>
    However, there are two fundamentally different kinds of synchronization:
  </para>
  <section id="sample-clock">
    <title>Sample Clock</title>
    <para>
      As outlined in the <emphasis>introductory concepts</emphasis>
      section, digital audio is created by taking a "sample" of an analog
      signal level on a periodic basis, say 48000 times per second (the
      "sample rate"). A dedicated clock (the "sample clock"; actually an
      oscillating crystal, but technology people call such things clocks)
      "ticks" at that rate, and every time it does, a new sample is
      measured. The way the clock is used to convert digital audio back
      to an analog signal (i.e. to be sent to some loudspeakers) is more
      complex, but the clock is still an absolutely fundamental part of
      the mechanism.
    </para>
    <para>
      Whenever you connect two digital audio devices together in order to
      move audio data from one to the other, you <emphasis>must ensure
      they share the same sample clock</emphasis>. Why is this necessary?
      The oscillating crystals used for the sample clock are generally
      very stable (they always tick at the same speed), but there are
      always minute differences in the speed at which any two clocks
      tick. Used by themselves, this makes no difference, but connect two
      digital audio devices together and these minute differences
      accumulate over time. Eventually, one of the devices will be trying
      to read a sample "in the middle" of the other device's tick, and
      the result is a small click or pop in the audio stream.
    </para>
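    <para>
      The speed of this accumulation is easy to estimate. A minimal
      sketch (the 10 ppm clock offset below is an illustrative
      assumption; real crystals vary) of how long two free-running
      48 kHz clocks take to drift a full sample apart:
    </para>

```python
# How long until two nominally identical sample clocks are a whole
# sample apart?  A clock offset of "ppm" (parts per million) means the
# clocks disagree by one sample after 1e6/ppm samples have elapsed.
# The 10 ppm figure below is an illustrative assumption.

def seconds_until_one_sample_drift(sample_rate_hz, offset_ppm):
    samples_until_slip = 1e6 / offset_ppm
    return samples_until_slip / sample_rate_hz

drift = seconds_until_one_sample_drift(48000, 10)
print(f"one full sample of drift after {drift:.2f} s")  # about 2.08 s
```

    <para>
      Even a tiny mismatch therefore produces measurable drift within
      seconds, which is why a shared sample clock is mandatory.
    </para>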
  </section>
  <section id="timeline-sync">
    <title>Timeline Sync</title>
    <para>
      The concept of a timeline comes up over and over again when working
      with a digital audio workstation, and also with video editing
      systems. By "timeline" we mean nothing more than some way to define
      a "name" for the point where certain sounds (and/or visual images)
      occur. When you work in Ardour's editor window, the rulers near the
      top provide one or more timelines in different units. You can look
      at the editor window and say "this sound starts at 1 minute 32
      seconds" or "this track fades out starting at bar 13 beat 22".
    </para>
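    <para>
      A position like "bar 13 beat 22" only maps to wall-clock time once
      a tempo and meter are known. A minimal sketch of that mapping,
      assuming one fixed tempo and meter (real sessions can contain tempo
      changes, which this ignores):
    </para>

```python
# Convert a musical position (bar, beat) to seconds, assuming one fixed
# tempo and meter for the whole timeline.  Bars and beats are numbered
# from 1, matching the convention used by Ardour's rulers.

def bbt_to_seconds(bar, beat, beats_per_bar=4, bpm=120.0):
    total_beats = (bar - 1) * beats_per_bar + (beat - 1)
    return total_beats * 60.0 / bpm

print(bbt_to_seconds(13, 1))  # bar 13, beat 1 at 120 bpm in 4/4 -> 24.0
```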
    <para>
      But what happens when you want to share a timeline between two
      different devices? For example, you may want to run a hardware
      video editor in conjunction with Ardour, and always have the visual
      and audio playback be at the same point "in time". How do they each
      know what "in time" means? How do they know where the other one is?
      A mechanism for answering these questions provides
      <emphasis>timeline synchronization</emphasis>.
    </para>
    <para>
      Timeline synchronization is entirely different from sample clock
      synchronization. Two devices can share a sample clock but never use
      timeline information. Two devices can share timeline information
      but run on different sample clocks; they might not even have sample
      clocks if they are analog devices.
    </para>
  </section>
  <section id="word-clock">
    <title>Word Clock</title>
    <para>
      "Word clock" is the name given to a signal used to distribute the
      "ticks" of a sample clock to multiple devices. Most digital audio
      devices intended for professional use have a word clock connector
      and a way to tell the device either to use its internal sample
      clock (for standalone use) or to use the word clock signal as the
      sample clock. Because of the electrical characteristics of the
      signal, it is very important that any length of cable used to
      distribute word clock is "terminated" with a 75 ohm resistor at
      both ends. Unfortunately, some devices include this terminator
      themselves, some contain a switchable resistor, and some contain
      none at all. Worse still, the user manuals for many devices do not
      provide any information on their termination configuration. It is
      often necessary to ask the manufacturer in cases where it is not
      made obvious by markings near the word clock connectors on the
      device.
    </para>
  </section>
  <section id="timecode">
    <title>Timecode</title>
    <para>
      "Timecode" is a signal that contains positional or "timeline"
      information. There are several different kinds of timecode signal,
      but by far the most important is known as SMPTE. Its name is an
      acronym for the Society of Motion Picture and Television Engineers;
      timecode is just one of the standards they have defined, but it is
      the best known. Because of its origins in the film/video world,
      SMPTE is very centered on the time units that matter to film/video
      editors. The base unit is called a "frame" and corresponds to a
      single still image in a film or video. There are typically on the
      order of 20-30 frames per second, so the actual resolution of SMPTE
      timecode is not very good compared to audio-based units, where
      there are tens of thousands of "frames" per second.
    </para>
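    <para>
      The gap in resolution is easy to quantify: at a 48000 Hz sample
      rate, each timecode frame spans well over a thousand audio
      samples. A small illustration:
    </para>

```python
# Number of audio samples that elapse during one timecode frame, for a
# few common film/video frame rates, at a 48 kHz sample rate.

sample_rate = 48000
for fps in (24, 25, 30):
    print(f"{fps} fps -> {sample_rate // fps} samples per frame")
# 24 fps -> 2000 samples per frame
# 25 fps -> 1920 samples per frame
# 30 fps -> 1600 samples per frame
```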
  </section>
  <section id="SMPTE">
    <title>SMPTE</title>
    <para>
      SMPTE defines time using a combination of hours, minutes, seconds,
      frames and subframes, together with the frame rate. In a film/video
      environment, SMPTE is typically stored on the film/video media and
      sent from the device used to play it. There are different ways of
      storing it on the media (you may come across terms like LTC and
      VITC), but the crucial idea to grasp is that the film/video has a
      timecode signal "stamped" into it, so that it is always possible to
      determine "what time it is" when any given image is visible.
    </para>
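    <para>
      The arithmetic behind these units is straightforward. A minimal
      sketch converting between an hh:mm:ss:ff position and an absolute
      frame count (subframes and the drop-frame variant of 29.97 fps
      counting are deliberately ignored here):
    </para>

```python
# Convert between an hours:minutes:seconds:frames timecode and an
# absolute frame count.  Subframes and drop-frame counting are ignored
# to keep the sketch minimal.

def timecode_to_frames(hh, mm, ss, ff, fps=25):
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames, fps=25):
    ss, ff = divmod(frames, fps)
    mm, ss = divmod(ss, 60)
    hh, mm = divmod(mm, 60)
    return hh, mm, ss, ff

n = timecode_to_frames(1, 2, 3, 4)        # 01:02:03:04 at 25 fps
assert frames_to_timecode(n) == (1, 2, 3, 4)
```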
    <para>
      SMPTE timecode is sent from one system to another as an analog
      audio signal. You could listen to it if you wanted to, though it
      sounds like a screeching, generally unpleasant noise. What the
      SMPTE standard defines is a way to encode and decode the
      hrs:mins:secs:frames:subframes time into or from this audio signal.
    </para>
  </section>
  <section id="mtc">
    <title>MTC</title>
    <para>
      The other very common form of timecode is known as "MTC" (MIDI Time
      Code). However, MTC is actually nothing more than a different way
      to transmit SMPTE timecode. It uses the exact same units as SMPTE
      timecode, but rather than sending the signal as audio, MTC defines
      a transmission method that uses a MIDI cable and a data protocol.
      MTC consumes a measurable, but small, percentage of the available
      bandwidth on a MIDI cable (on the order of 2-3%). Most of the time,
      it is wise to use a single cable for MTC and MMC (MIDI Machine
      Control) and not share it with "musical" MIDI data (the kind that
      an instrument would send while being played).
    </para>
  </section>
  <section id="jack-transport">
    <title>JACK Transport</title>
    <para>
      For Ardour and other programs that use <emphasis>JACK</emphasis>,
      there is another method of doing timeline synchronization that is
      not based on SMPTE or MTC.
    </para>
  </section>
<!--
<xi:include xmlns:xi="http://www.w3.org/2001/XInclude"
href="Some_Subsection.xml" />