Diffstat (limited to 'manual/xml/jack.xml')
-rw-r--r--  manual/xml/jack.xml  292
1 file changed, 292 insertions(+), 0 deletions(-)
diff --git a/manual/xml/jack.xml b/manual/xml/jack.xml
new file mode 100644
index 0000000000..8798a45686
--- /dev/null
+++ b/manual/xml/jack.xml
@@ -0,0 +1,292 @@
+<?xml version="1.0" standalone="no"?>
+
+<!DOCTYPE section PUBLIC "-//OASIS//DTD DocBook XML V4.4//EN" "http://www.oasis-open.org/docbook/xml/4.4/docbookx.dtd" [
+
+]>
+
+<section id="sn-configuring-jack">
+ <title>Getting Audio In, Out and Around Your Computer</title>
+ <para>
+ Before you can begin to use Ardour, you will need to get the audio
+ input/output capabilities of your system working and properly
+ configured. There are two aspects to this process: getting your audio
+ interface (soundcard) working, and configuring it to work with the JACK
+ Audio Connection Kit (<ulink url="http://jackaudio.org/">JACK</ulink>).
+ </para>
+
+ <section id="sn-jack">
+ <title>JACK</title>
+ <para>
+ It is extremely important to understand that Ardour does not interact
+ directly with your audio interface when it is running. Instead, all of
+ the audio signals that Ardour receives and generates are sent to
+ and from JACK, a piece of software that routes audio data between an
+ audio interface and audio applications, in real time.
+ </para>
+
+ <para>
+ Traditionally, most of the audio sources that you would want to
+ record, as well as a lot of the more significant effects processing,
+ existed outside the computer. Consequently, one of the biggest issues
+ in integrating a computer into the operation of the studio is how to
+ move audio data in and out of the computer.
+ </para>
+
+ <para>
+ However, it is becoming increasingly common for studios to use audio
+ sources and effects processing that are composed entirely of
+ software, quite often running on the same machine as an audio
+ sequencer or digital audio workstation (DAW). A new problem arises in
+ such situations, because moving audio in and out of the DAW no longer
+ involves your hardware audio interface. Instead, data has to be moved
+ from one piece of software to another, preferably with the same kind
+ of sample synchronisation you’d have in a properly configured
+ digital hardware system. This is a problem that has been solved at
+ least a couple of times (ReWire from PropellerHeads and DirectConnect
+ from Digidesign are the two most common examples), but JACK is a new
+ design developed as an open source software project, and is thus
+ available for anyone to use, learn from, extend, fix, or modify.
+ </para>
+
+ <para>
+ New users may not initially realize that by using JACK, their computer
+ becomes an extremely flexible and powerful audio tool - especially
+ with Ardour acting as the "heart" of the system.
+ </para>
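Once JACK is running, the routing described above is visible and controllable from the command line. The sketch below assumes the JACK example clients (<command>jack_lsp</command> and <command>jack_connect</command>) are installed; the port names shown are only illustrative, so use the names <command>jack_lsp</command> actually prints on your system:

```shell
# List every input and output port currently registered with JACK
jack_lsp

# Connect a client's output port to the soundcard's first playback
# channel (both port names here are examples, not guaranteed to exist)
jack_connect "ardour:master/out 1" system:playback_1
```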
+ </section>
+
+ <section id="getting-audio-working">
+ <title>Getting Your Audio Interface Working</title>
+ <note>
+ <para>
+ Although Ardour runs on OS X as well as Linux, this documentation
+ describes only a Linux (ALSA) system. The issues faced on OS X tend
+ to be entirely different, and are centered mostly on JACK. There are
+ also alternative audio device driver families for Linux, but they
+ are not discussed here either.
+ </para>
+ </note>
+
+ <para>
+ Getting your audio interface working can be the hardest part of
+ setting your computer up to run Ardour, or it could be one of the
+ easiest. The level of difficulty you will face depends on the type of
+ audio interface ("soundcard") you are using, the operating system
+ version you are using, and your own understanding of how it all works.
+ </para>
+
+ <para>
+ In an ideal world, your computer already has a working audio
+ interface, and all you need to do is start up qjackctl and run JACK.
+ You can determine if you face this ideal situation by doing a few
+ simple tests on your machine. The most obvious test is whether
+ you’ve already heard audio coming out of your computer. If you are
+ in this situation, you can skip ahead to
+ <xref linkend="selecting-capture-source"/>.
+ </para>
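One quick check of whether you are in that ideal situation is to ask ALSA which soundcards it has detected. This is a sketch assuming a Linux system with the alsa-utils package installed:

```shell
# List the playback devices ALSA has detected
aplay -l

# The kernel's own view of detected soundcards
cat /proc/asound/cards
```

If neither command shows your interface, the driver is not loaded, and no amount of JACK configuration will help until it is.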
+ </section>
+
+ <section id="checking-for-an-audio-interface">
+ <title>Checking For an Audio Interface</title>
+ <para>
+ If you’ve never tried to play audio on your computer before, you
+ should use a basic playback program such as <command>play</command>,
+ <command>aplay</command> or possibly <command>xmms</command>. Find
+ an audio file on your machine (<command>locate
+ .wav</command> may help here), and try to play it. There are several
+ possibilities:
+ </para>
+
+ <itemizedlist>
+ <listitem>
+ <para>
+ You may get an error from the program
+ </para>
+ </listitem>
+
+ <listitem>
+ <para>
+ You may hear nothing
+ </para>
+ </listitem>
+
+ <listitem>
+ <para>
+ You may hear something, but it’s too quiet
+ </para>
+ </listitem>
+
+ <listitem>
+ <para>
+ You may hear something, but from the wrong loudspeakers
+ </para>
+ </listitem>
+ </itemizedlist>
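A minimal version of this playback test from a terminal might look like the following sketch. It assumes alsa-utils is installed; the file path is only an example, so substitute a WAV file that actually exists on your machine:

```shell
# Find candidate audio files (may print nothing if none are indexed)
locate .wav | head -n 5

# Try to play one through the default ALSA device
# (this path is an example shipped with many ALSA installations)
aplay /usr/share/sounds/alsa/Front_Center.wav
```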
+ </section>
+
+ <section id="selecting-capture-source">
+ <title>Selecting Capture Source</title>
+ <para>
+ Many audio interfaces, particularly the cheaper varieties that are
+ often found built into computers, have ways to plug in both
+ microphones and instruments or other audio equipment to be recorded.
+ This immediately poses a question: how does Ardour (or any software)
+ know which signal to record, the one coming into the microphone input,
+ or the one arriving at the "line in" socket? The same question also
+ arises for "high-end" audio interfaces, though in different ways.
+ </para>
+
+ <para>
+ The short answer is: Ardour doesn’t. Instead, this is a choice you
+ have to make using a program that understands how to control
+ the mixing hardware on the audio interface. Linux/ALSA has a number of
+ such programs: alsamixer, gamix, aumix, kmix are just a few of them.
+ Each of them offers you a way to select which of the possible
+ recordable signals will be used as the "capture source". How you
+ select the preferred signal varies from program to program, so you
+ will have to consult the help documentation for whichever program you
+ choose to use.
+ </para>
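The same selection can also be made non-interactively with <command>amixer</command>, the command-line companion to alsamixer. This is only a sketch: the control names ('Input Source', 'Capture') and the card number vary from interface to interface, so check the output of the first command before relying on the others:

```shell
# Show the mixer controls your first soundcard exposes
amixer -c 0 scontrols

# Select the microphone as the capture source
# ('Input Source' is a common control name, but not universal)
amixer -c 0 sset 'Input Source' 'Mic'

# Enable capture on the capture control
amixer -c 0 sset 'Capture' cap
```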
+
+ <para>
+ There are also a few programs that offer ways to control just one
+ particular kind of audio interface. For example, the
+ <application>hdspmixer</application> program offers control over the
+ very powerful matrix mixer present on several RME audio interfaces.
+ <application>envy24ctrl</application> does the same for a number of
+ interfaces built around the common ice1712/envy24 chipset, found in
+ devices from M-Audio, Terratec and others. Please note that this is
+ quite similar to the situation for Windows and MacOS users, where each audio
+ interface often comes with its own control program that allows certain
+ critical configuration choices to be made.
+ </para>
+
+ <section id="problems-with-input-signal">
+ <title>"I don’t get any signal when I record …"</title>
+ <para>
+ The most common problem for first-time audio users on Linux is to
+ try to record something and get no signal at all, or alternatively,
+ a very low signal. The low signal problem typically arises from one
+ or more of the following issues:
+ </para>
+
+ <itemizedlist>
+ <listitem>
+ <para>
+ a microphone plugged into the "line in" socket of the
+ interface. The signal levels delivered by microphones are very
+ small, and require amplification before they can be used by most
+ audio circuitry. In professional recording studios, this is done
+ using a dedicated box called a "pre-amplifier". If your audio
+ interface has a "mic input" socket, then it has its own
+ pre-amplifier built in, although it’s probably not a very good
+ one. If you make the mistake of plugging a microphone into the
+ "line in" socket, you will get either an inaudible or very quiet
+ signal.
+ </para>
+ </listitem>
+
+ <listitem>
+ <para>
+ the wrong capture source selected in the audio interface’s
+ hardware mixer (see above)
+ </para>
+ </listitem>
+
+ <listitem>
+ <para>
+ the "capture" gain level in the audio interface’s hardware
+ mixer is turned down too low. You will need to use a hardware
+ mixer application (as described above) to increase this.
+ </para>
+ </listitem>
+ </itemizedlist>
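Raising the capture gain can likewise be done from the command line. Again a sketch: the control name, card number and percentage are only examples:

```shell
# Raise the capture gain on card 0 to 80% and make sure capture is on
amixer -c 0 sset 'Capture' 80% cap
```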
+
+ <note>
+ <para>
+ You will notice in the mixer strip for each track in Ardour that
+ you can change the selection of the monitoring source between
+ input/pre/post. Adjusting the fader while watching the ’input’
+ levels will NOT have any effect on the levels. As mentioned above,
+ Ardour is dependent on external mixer settings for the source level.
+ </para>
+ </note>
+ </section>
+ </section>
+
+ <section id="monitoring-choices">
+ <title>Monitoring Choices</title>
+ <para>
+ It’s unfortunate that we have to raise this issue at a point in the
+ manual where you, the reader, may not even know what "monitoring"
+ means. However, it is such an absolutely critical aspect of using any
+ digital audio workstation that we need to at least cover the basics
+ here. The only people who don’t need to care about monitoring are
+ those who will never use Ardour to record a live performance (even
+ one performed using a software synthesizer).
+ </para>
+
+ <para>
+ Monitoring is the term we use to describe listening to what Ardour is
+ recording. If you are playing a guitar and recording it with Ardour,
+ you can probably hear the guitar’s own sound, but there are many
+ situations where relying on the sound of the instrument is completely
+ inadequate. For example, with an electronic instrument, there is no
+ sound until the electrical signal that it generates has been processed
+ by an amplifier and fed to a loudspeaker. But if Ardour is recording
+ the instrument’s signal, what is responsible for sending it to the
+ amplifier and loudspeakers? It can get a lot more complex than that: if you are
+ recording multiple performers at the same time, each performer needs
+ to hear their own playing/singing, but they also probably need to hear
+ some of their colleagues’ sound as well. You might be overdubbing
+ yourself - playing a new line on an instrument while listening to
+ tracks you’ve already recorded - how do you hear the new material as
+ well as the existing stuff?
+ </para>
+
+ <para>
+ Well, hopefully, you’re convinced that there are some questions to
+ be dealt with surrounding monitoring; see
+ <xref linkend="sn-monitoring"/> for more in-depth information.
+ </para>
+ </section>
+
+ <section id="using-multiple-soundcards">
+ <title>Can I use multiple soundcards?</title>
+ <para>
+ There are lots of good reasons why you should not even attempt
+ to do this. But seriously, save your money for a while and buy
+ yourself a properly designed multichannel soundcard.
+ </para>
+ </section>
+
+ <section id="qjackctl">
+ <title>Qjackctl</title>
+ <para>
+ JACK itself does not come with a graphical user interface - to start
+ JACK and control it you need to have access to a command line and a
+ basic knowledge of Unix-like operating systems. However,
+ <ulink url="http://qjackctl.sourceforge.net/">qjackctl</ulink> is a
+ wonderful application that wraps JACK up with a graphical interface
+ that is both nice to look at and useful at the same time. qjackctl is
+ the recommended way of using JACK.
+ </para>
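For comparison, starting JACK by hand from a terminal looks something like the sketch below. The device name hw:0, the sample rate and the buffer settings are examples you would adapt to your own interface:

```shell
# Start JACK with the ALSA backend on the first soundcard,
# at 48 kHz, with two periods of 256 frames each
jackd -d alsa -d hw:0 -r 48000 -p 256 -n 2
```

Smaller period sizes (-p) lower latency but increase the risk of audible dropouts; qjackctl sets these same options for you through its Setup dialog.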
+ <mediaobject>
+ <imageobject>
+ <imagedata fileref="images/qjackctl.png"/>
+ </imageobject>
+ </mediaobject>
+ <para>
+ You should be able to start qjackctl from the “application menu”
+ of your system, typically found on the panel/appbar/dock or whatever
+ it’s called that lives at the top/bottom/left/right of your screen.
+ </para>
+
+ <para>
+ [ need screenshot of GNOME/KDE/OSX menus here ]
+ </para>
+ </section>
+<!--
+ <xi:include xmlns:xi="http://www.w3.org/2001/XInclude"
+ href="Some_Subsection.xml" />
+ -->
+</section>