% user/user-setup.tex
\label{user-setup-chapter}
\lsection[Installing DABC]{user-install}{Installing \dabc}
\index{DABC!Installation}
When working on the GSI linux cluster, the \dabc\ framework is already installed and is maintained by the GSI EE department. In this case \dabc\ just needs to be activated from any GSI shell by typing \verba{.~dabclogin} (dot, space); please skip this installation section and proceed with the following section \paref{user-env}, which describes the set-up of the user environment.

However, when working on a separate DAQ cluster outside GSI, it is mandatory to install the \dabc\ software from scratch. The \dabc\ distribution is available for download at \hyperref{http://dabc.gsi.de}{}{}{http://dabc.gsi.de}. It is provided as a compressed source tarball \verba{dabc\_vn.m.ss.tar.gz}, where \verba{n}, \verba{m}, and \verba{ss} are version numbers. The following steps describe the recommended installation procedure:
\bnum
\item {\bf Unpack the \dabc\ distribution} in an appropriate installation directory, e.~g.~:
\begin{small}
\begin{verbatim}
cd /opt/dabc; tar zxvf dabc_v1.0.00.tar.gz
\end{verbatim}
\end{small}
This will extract the archive into a subdirectory labelled with the current version number, e.~g.\ \verba{/opt/dabc/dabc\_v1.0.00}. This becomes the future \dabc\ system directory.
\item {\bf Prepare the \dabc\ environment login script}: A template for this script can be found at
\begin{small}
\begin{verbatim}
scripts/dabclogin.sh
\end{verbatim}
\end{small}
\bbul
\item Edit the \verba{DABCSYS} environment variable according to your local installation directory. This is done in the following line:
\begin{small}
\begin{verbatim}
export DABCSYS=/opt/dabc/dabc_v1.0.00
\end{verbatim}
\end{small}
\item Specify the correct location of your Java installation. This is done in the following line (shown here is an example; make sure to use the path that contains the \verba{include} directory):
\begin{small}
\begin{verbatim}
export JAVA_HOME=/usr/lib/jvm
\end{verbatim}
\end{small}
\item Copy the script to a location in your global \verba{\$PATH} for later login, e.~g.\ \verba{/usr/bin}. Alternatively, you may set an \func{alias} to the full pathname of \verba{dabclogin.sh} in your shell profile.
\ebul
\item Execute the just-modified login script in your shell to set the environment:
\begin{small}
\begin{verbatim}
. dabclogin.sh
\end{verbatim}
\end{small}
This will set the environment for the compilation.
\item Change to the \dabc\ installation directory and start the build:
\begin{small}
\begin{verbatim}
cd $DABCSYS
make
\end{verbatim}
\end{small}
This will compile the \dabc\ framework and install a suitable version of DIM in the subdirectory \verba{\$DABCSYS/dim}.
\enum
After successful compilation, the \dabc\ framework installation is complete and can be used from any shell after invoking \verba{.~dabclogin.sh}.
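For orientation, a minimal \verba{dabclogin.sh} might look like the following sketch. Only the \verba{DABCSYS} and \verba{JAVA\_HOME} settings are prescribed by the steps above; the remaining exports are assumptions about what such a login script typically contains and may differ from the actual template shipped with the distribution.
\begin{small}
\begin{verbatim}
# Sketch of a dabclogin.sh (sourced with ". dabclogin.sh"); adapt paths to your installation.
export DABCSYS=/opt/dabc/dabc_v1.0.00   # DABC system directory (see above)
export JAVA_HOME=/usr/lib/jvm           # Java installation containing the include directory

# The following lines are assumptions about a typical login script:
export PATH=$DABCSYS/bin:$PATH                        # DABC executables on the PATH
export LD_LIBRARY_PATH=$DABCSYS/lib:$LD_LIBRARY_PATH  # DABC libraries for the dynamic linker
\end{verbatim}
\end{small}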
The next sections \paref{user-env} and \paref{user-setup} describe further steps to set up the \dabc\ working environment for each user.

\lsection[Set-up the DABC environment]{user-env}{Set-up the \dabc\ environment}
\index{DABC!Environment set-up}
Once the general \dabc\ framework is installed on a system, each user must still "activate" the environment and make further preparations to work with it.
\bnum
\item Execute the \dabc\ login script in a linux shell to set the environment. At the GSI linux installation, this is done by
\begin{small}
\begin{verbatim}
. dabclogin
\end{verbatim}
\end{small}
For a user installation as described in section \paref{user-install} above, the script is by default named \verba{dabclogin.sh} and is invoked as
\begin{small}
\begin{verbatim}
. dabclogin.sh
\end{verbatim}
\end{small}
The login script already enables the \dabc\ framework for the compilation of user-written components. Additionally, the general executable \verba{dabc\_run} now provides the \dabc\ runtime environment and may be started directly for simple "batch mode" applications on a single node. However, further preparations are necessary if \dabc\ shall be used with the DIM control system and the GUI.
\item Open a dedicated shell on the machine that shall provide the DIM name server, e.~g.~
\begin{small}
\begin{verbatim}
ssh nsnode.cluster.domain
export DIM_DNS_NODE=nsnode.cluster.domain
. dabclogin.sh
dimDns &
dimDid &
\end{verbatim}
\end{small}
to launch the DIM name server. This is done \strong{once} at the beginning of the DAQ setup; usually the DIM name server does not need to be shut down when \dabc\ applications terminate. The DID is useful for inspecting DIM services.
\item Set the DIM name server environment variable in any \dabc\ working shell (e.~g.\ the shell that will later start the \dabc\ GUI):
\begin{small}
\begin{verbatim}
. dabclogin.sh
export DIM_DNS_NODE=nsnode.cluster.domain
\end{verbatim}
\end{small}
\item Now the \dabc\ GUI can be started in such a prepared shell by typing \verba{dabc} (or \verba{mbs} for a plain \mbs\ GUI); see the GUI description in chapter \paref{user-gui-chapter}.
\enum
\medskip
To operate a \dabc\ application one should create a dedicated working directory to keep all relevant files:
\bbul
\item Setup files for \dabc\ (XML).
\item Log files (text).
\ebul
The following section \paref{user-setup} gives a general description of the setup file syntax.

The GUI may run on a machine without access to the \dabc\ working directory, e.~g.\ a Windows PC. Therefore the GUI may use a different working directory, containing:
\bbul
\item Data files for startup panels (XML).
\item Configuration files for the GUI (XML).
\ebul
These GUI configuration files are described in more detail in chapter \paref{user-gui-chapter}. Of course both setups, for the \dabc\ application and for the GUI, can be put into one working directory if the GUI has access to it.

\lsection[DABC setup file]{user-setup}{\dabc\ setup file}
\index{DABC!Setup file}
The setup file is an XML file in a \dabc-specific format, which contains values for some or all configuration parameters of the system.

\lsubsection[Setup file example]{user-setup-configfile}{Setup file example}
Let us consider a simple but functional configuration file: an XML file for an MBS generator, which produces MBS events and provides them to an {\em MBS transport} server. A sketch of such a file is shown at the end of the following subsection. This use case is described further in section \paref{user-app-mbs-eventserver}. Other examples of \dabc\ setup files can be found in sections \paref{user:mbsapp}, \paref{user-app-bnet-dabc}, and \paref{user-app-bnetmbs} of this manual.

\lsubsection[Basic syntax]{user-setup-syntax}{Basic syntax}
A \dabc\ configuration file always contains a single root node. Inside the root node one or several \verba{Context} nodes should exist. Each \verba{Context} node represents the {\em application context} which runs as an independent executable. Optionally, the root node can have \verba{Variables} and \verba{Defaults} nodes, which are described further in the following sections \paref{user-setup-variables} and \paref{user-setup-defaults}.
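For illustration, the overall structure of such a setup file for the MBS generator example might look like the following sketch. This is not the original example file: the parameter names and values used here (library and function names, \verba{Module}, \verba{Output}, \verba{MbsServerPort}, etc.) are assumptions chosen for this sketch and may differ in detail from the files shipped with your \dabc\ version.
\begin{small}
\begin{verbatim}
<?xml version="1.0"?>
<!-- Illustrative sketch only; names and values are assumptions -->
<dabc>
  <Context host="localhost" name="Generator">
    <Run>
      <lib value="libDabcMbs.so"/>      <!-- plug-in library to load (name assumed) -->
      <func value="InitUserExample"/>   <!-- function creating the modules (name assumed) -->
      <logfile value="Generator.log"/>
    </Run>
    <Module name="Generator">
      <!-- parameters of the output port of the generator module (names assumed) -->
      <Output name="Output0">
        <MbsServerKind value="Stream"/>
        <MbsServerPort value="6000"/>
      </Output>
    </Module>
  </Context>
</dabc>
\end{verbatim}
\end{small}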
\lsubsection[Context]{user-setup-context}{Context}
A \verba{Context} node can have two optional attributes:
\bdes
\item["host"] host name where the executable should run; default is "localhost".
\item["name"] name of the application (manager); default is the host name.
\edes
A \verba{Context} node contains the configuration parameters for modules, devices, and memory pools. In the setup-file sketch above one sees several parameters for the output port of the generator module.

\lsubsection[Run arguments]{user-setup-run}{Run arguments}
Usually a \verba{Context} node has a \verba{Run} subnode, where the user may define various parameters relevant for running the \dabc\ executable:
\bdes
\item[lib] name of a library which should be loaded. Several libraries can be specified.
\item[func] name of a function which should be called to create modules. This is an alternative to instantiating a subclass of \class{dabc::Application} (compare section \paref{prog_plugin_applicaton}).
\item[runfunc] name of a function that runs a sequence of operations (start, stop, reconfigure) over the application; useful for batch mode.
\item[port] ssh port number of the remote host.
\item[user] account name to be used for ssh (login without password should be possible).
\item[init] init script which is called before the \dabc\ application starts.
\item[test] test script which is called when the test sequence is run by the \verba{run.sh} script.
\item[timeout] ssh timeout.
\item[debugger] arguments for running under a debugger. The value should look like "gdb -x run.txt --args", where the file \verba{run.txt} should contain the commands "r", "bt", "q".
\item[workdir] directory where the \dabc\ executable should start.
\item[debuglevel] level of debug output on the console; default is 1.
\item[logfile] file name for log output; default is none.
\item[loglevel] level of log output to the file; default is 2.
\item[DIM\_DNS\_NODE] node name of the DIM name server, used by the DIM controls implementation.
\item[DIM\_DNS\_PORT] port number of the DIM name server, used by the DIM controls implementation.
\item[cpuinfo] instantiate a \class{dabc::CpuInfoModule} to show CPU and memory usage information. The value must be $\geq$ 0: if it is 0, only two parameters are created; if it is 15, several ratemeters will be created.
\item[parslevel] level of parameter visibility for the control system; default is 1.
\edes

\lsubsection[Variables]{user-setup-variables}{Variables}
In the root node one can insert a \verba{Variables} node which may contain the definitions of one or several variables. Once defined, such variables can be referenced anywhere in the configuration file to set parameter values. It is allowed to define a variable as a combination of text with another variable, but neither arithmetic nor string operations are supported.

Using variables, the generator example can be modified such that the context name and the module name are set via a \verba{myname} variable, and the MBS server socket port via a \verba{myport} variable; a sketch is given at the end of this subsection.

There are several variables which are predefined by the configuration system:
\bbul
\item DABCSYS - top directory of the \dabc\ installation
\item DABCUSERDIR - user-specified directory
\item DABCWORKDIR - current working directory
\item DABCNUMNODES - number of nodes in the configuration file
\item DABCNODEID - sequence number of the current node in the configuration file
\ebul
Any shell environment variable is also available as a variable in the configuration file to set parameter values.
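As an illustration, the variable definitions and references could look like the following sketch. The \verba{\$\{...\}} reference syntax and the element names are assumptions made for this sketch and may differ in detail from your \dabc\ version.
\begin{small}
\begin{verbatim}
<?xml version="1.0"?>
<!-- Illustrative sketch only; variable reference syntax assumed to be ${...} -->
<dabc>
  <Variables>
    <myname value="Generator"/>
    <myport value="6000"/>
  </Variables>
  <Context host="localhost" name="${myname}">
    <Module name="${myname}">
      <Output name="Output0">
        <MbsServerPort value="${myport}"/>
      </Output>
    </Module>
  </Context>
</dabc>
\end{verbatim}
\end{small}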
\lsubsection[Default values]{user-setup-defaults}{Default values}
There are situations when one needs to set the same value for several similar parameters, for instance the same queue length for all output ports of a module. One possible way is to use the variable syntax described above. The disadvantage of this approach is that one must expand the XML file to set each queue length explicitly from the appropriate variable; with a large number of ports the file becomes very long and confusing to the user.

Another possibility to set several parameters at once is to use \strong{wildcard rules} with the "*" or "?" symbols. These can be defined in a \verba{Defaults} node; a sketch is given at the end of this subsection. With such a rule, all ports whose names begin with the string "Output", in any module, get an output queue length of 5. A wildcard rule of this form is applied in all contexts of the configuration file, i.~e.\ it sets the output queue length for all modules on all nodes. This allows a big multi-node cluster to be configured with a compact XML file.

Another possibility to set a default value for some parameters is to create a parameter with the same name in the parent object. Here the word \strong{create} is crucial: one should use the \func{CreateParInt()} method in the module constructor; it is not enough to just put an additional tag into the XML file. For instance, one can create a parameter "MbsServerPort" in the generator module, and the MBS server transport created for the output port will then use that value as the default server port number.
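The sketch below illustrates such a \verba{Defaults} node with a wildcard rule for the output queue length; the element and attribute names are assumptions made for this sketch and may differ from those of your \dabc\ version.
\begin{small}
\begin{verbatim}
<!-- Illustrative sketch only; element and attribute names are assumptions -->
<dabc>
  <Defaults>
    <!-- any module, any port whose name begins with "Output": queue length 5 -->
    <Module name="*">
      <Output name="Output*" queue="5"/>
    </Module>
  </Defaults>
  <!-- Context nodes follow as usual -->
</dabc>
\end{verbatim}
\end{small}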
\lsection[Installation of additional plug-ins]{user-plugins}{Installation of additional plug-ins}
\index{DABC!Plug-in installation}
Apart from the \dabc\ base package, there may be additional plug-in packages for specific use cases. Generally, these plug-in packages may consist of a \strong{plugins} part and an \strong{applications} part. The {\em plugins} part offers a library containing new components (like \class{Devices}, \class{Transports}, or \class{Modules}). The {\em applications} part mostly contains the XML setup files to use these new components in the \dabc\ runtime environment; however, it may also contain an additional library defining the \dabc\ \class{Application} class.

As an example, we may consider a plug-in package for reading out data from specific PCIe hardware like the Active Buffer Board \ABB\ \cite{AbbDescription}. This package is separately available for download at \hyperref{http://dabc.gsi.de}{}{}{http://dabc.gsi.de} and described in detail in chapter \paref{prog-exa-pci-chapter} of the \dabc\ programmer's manual.

There are in principle two different ways to install such separate plug-in packages: either within the general \verba{DABCSYS} directory as part of the central \dabc\ installation, as described in section \paref{user-plugins-dabcsys}, or at an independent location in a user directory, as described in section \paref{user-plugins-userdir}.

\lsubsection[Add plug-in packages to \$DABCSYS]{user-plugins-dabcsys}{Add plug-in packages to \$DABCSYS}
This is the recommended way to install a plug-in package if the package shall be provided to all users of the \dabc\ installation. A typical scenario would be an experimental group that owns dedicated DAQ machines with system manager privileges. In this case, the plug-in package may be installed under the same account as the central \dabc\ installation (possibly, but not necessarily, the \keyw{root} account).
The new plug-in package is then installed directly in the \verba{\$DABCSYS} directory, with the following steps:
\bnum
\item Download the plug-in package tarball, e.~g.\ \verba{abb1.tar.gz}.
\item Call the \verba{dabclogin.sh} script of the \dabc\ installation (see section \paref{user-env}).
\item Copy the downloaded tarball to the \verba{\$DABCSYS} directory and unpack it there:
\begin{verbatim}
cp abb1.tar.gz $DABCSYS
cd $DABCSYS
tar zxvf abb1.tar.gz
\end{verbatim}
This will extract the new components into the appropriate \verba{plugins} and \verba{applications} folders below \verba{\$DABCSYS}.
\item Build the new components with the top Makefile of \verba{\$DABCSYS}:
\begin{verbatim}
make
\end{verbatim}
\item To work with the new components, the configuration file(s) of the {\em applications} part should be copied to the personal workspace of each user (see section \paref{user-setup}). For the \ABB\ example, this is found at
\begin{verbatim}
$DABCSYS/applications/bnet-test/SetupBnetIB-ABB.xml
\end{verbatim}
\enum

\lsubsection[Plug-in packages in user directory]{user-plugins-userdir}{Plug-in packages in user directory}
This is the appropriate way when \dabc\ is installed centrally on the file server of an institute and several experimental groups shall use different plug-ins. It is also the recommended way if several users want to modify the source code of a plug-in library independently without affecting the general installation.
The new plug-in package is then installed in a user directory, with the following steps:
\bnum
\item Download the plug-in package tarball, e.~g.\ \verba{abb1.tar.gz}.
\item Create a directory to contain your additional \dabc\ plug-in packages:
\begin{verbatim}
mkdir $HOME/mydabcpackages
\end{verbatim}
\item Call the \verba{dabclogin.sh} script of the \dabc\ installation (see section \paref{user-env}).
\item Copy the downloaded tarball to this package directory and unpack it there:
\begin{verbatim}
cp abb1.tar.gz $HOME/mydabcpackages
cd $HOME/mydabcpackages
tar zxvf abb1.tar.gz
\end{verbatim}
This will extract the new components into the appropriate \verba{plugins} and \verba{applications} folders below this directory.
\item To build the {\em plugins} part, change to the appropriate package plugin directory and invoke the local Makefile, e.~g.\ for the \ABB\ example:
\begin{verbatim}
cd $HOME/mydabcpackages/plugins/abb
make
\end{verbatim}
This will create the corresponding plug-in library in a subfolder named after the computer architecture, e.~g.~:
\begin{verbatim}
$HOME/mydabcpackages/plugins/abb/x86_64/lib/libDabcAbb.so
\end{verbatim}
\item For some plug-ins, there may also be small test executables with separate Makefiles in the subfolder \verba{test}. These can optionally be built and executed independently of the \dabc\ runtime environment.
\item The \dabc\ working directory for the new plug-in will be located in the subfolder
\begin{verbatim}
applications/plugin-name
\end{verbatim}
For the \ABB\ example, the application will set up a builder network with optional Active Buffer Board readouts, so this is at
\begin{verbatim}
$HOME/mydabcpackages/applications/bnet-test
\end{verbatim}
As in this example, there may be an additional library to be built containing the actual \class{Application} class.
This is done by invoking the Makefile within that directory:
\begin{verbatim}
cd $HOME/mydabcpackages/applications/bnet-test
make
\end{verbatim}
Here the application library is produced directly in the working directory:
\begin{verbatim}
$HOME/mydabcpackages/applications/bnet-test/libBnetTest.so
\end{verbatim}
\item The actual locations of the newly built libraries (the plugins part and optionally the applications part) have to be entered in the library (\keyw{lib}) entries of the \keyw{Run} node of the corresponding \dabc\ setup file (here: \verba{SetupBnetIB-ABB.xml}); see the sketch after this list. The default set-up examples in the plug-in packages assume that the library is located at \verba{\$DABCSYS/lib}, as in the alternative installation case described in section \paref{user-plugins-dabcsys}.
\enum
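For illustration, the library entries for a plug-in installed in a user directory could look like the following sketch. The element names and the exact library paths are assumptions and have to be adapted to the actual setup file; note that shell environment variables such as \verba{HOME} can be referenced in the configuration file, as described in section \paref{user-setup-variables}.
\begin{small}
\begin{verbatim}
<!-- Illustrative sketch only; adapt element names and paths to your setup file -->
<Context host="localhost" name="Worker">
  <Run>
    <lib value="${HOME}/mydabcpackages/plugins/abb/x86_64/lib/libDabcAbb.so"/>
    <lib value="${HOME}/mydabcpackages/applications/bnet-test/libBnetTest.so"/>
  </Run>
  <!-- module and device configuration as in the original setup file -->
</Context>
\end{verbatim}
\end{small}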