Server Configuration

Database Creation

A SQLite database file can be prepared for Crab using the schema provided:

% sqlite3 crab.db < doc/schema.sql

Alternatively, if you are going to use MySQL for your Crab database, create the database:

% mysqladmin -u root -p create crab

and create a user account for crab, changing the password (the “identified by” clause) to something suitable:

% mysql -u root -p mysql
> create user 'crab'@'localhost' identified by 'crab';
> grant all on crab.* to 'crab'@'localhost';
> flush privileges;

You can prepare a table creation script suitable for MySQL using the Makefile in the doc directory of the source package:

% make -C doc schema_mysql.sql
% mysql -u crab -p crab < doc/schema_mysql.sql

Configuration

The Crab server is configured by a crabd.ini file which can be placed either in /etc/crab/ or ~/.crab/. Note that this is a CherryPy configuration file, which is read slightly differently from typical .ini files, which use Python's ConfigParser.

% mkdir -p ~/.crab
% cp doc/crabd.ini ~/.crab/

The example crabd.ini file should be edited to uncomment the [crab] and [store] sections. The home and file entries must point to the location of Crab’s data files and the database file just created. By default the data files are installed in share/crab relative to the Python system prefix (sys.prefix).
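After uncommenting, those two sections might read as follows, assuming the data files are installed under /usr/share/crab and the SQLite database file created above has been placed at /var/lib/crab/crab.db (both paths are illustrative; substitute your own locations):

```ini
[crab]
# Directory containing Crab's res/ and templ/ directories.
home = '/usr/share/crab'

[store]
# Main storage backend: the SQLite database created earlier.
type = 'sqlite'
file = '/var/lib/crab/crab.db'
```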

There is also an [outputstore] section in the server configuration file. This allows the output from cron jobs and raw crontab files to be stored separately, and can be used to prevent the main database from becoming excessively large.

If you would like Crab to delete the history of job events over a certain age, you can have it run a cleaning service by enabling the [clean] section of the server configuration file. Here you can select the cleaning schedule and the length of history to keep. A fairly frequent cleaning schedule is recommended: it prevents a large number of old events from accumulating, so that each cleaning operation stays quick. If the file output store is being used, the cleaning service removes only the event records, not the output text. You can remove old output text separately, for example by running the following in your output store directory:

% find output -type f -mtime +90 -delete
% find output -type d -empty -delete
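For reference, enabling the [clean] section amounts to uncommenting lines like these from the distributed example file (daily cleaning at 00:15 UTC, keeping 90 days of events):

```ini
[clean]
# Cron-style schedule for cleaning operations.
schedule = '15 0 * * *'
# Timezone to use for the cleaning schedule.
timezone = 'UTC'
# Number of days for which to keep events.
keep_days = 90
```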

Running

The Crab server is run as crabd. When the server is executed directly, it will stay in the foreground:

% crabd

It can also be run in the background with the crabd-check script, which first checks that the server is not still running from a previous invocation. This makes it suitable for running from cron to keep the server alive; the CRABIGNORE=yes setting prevents this cron job itself from being reported to the Crab server:

PYTHONPATH=/path/to/crab/lib
PATH=/path/to/crab/scripts:/bin:/usr/bin
7-57/10 * * * * CRABIGNORE=yes crabd-check
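The minute field 7-57/10 in the crontab line above uses cron's range/step syntax: the job fires at minutes 7 through 57 in steps of 10. The matching minutes can be listed with seq:

```shell
# Minutes matched by the cron field "7-57/10" (range 7-57, step 10):
seq 7 10 57
```

giving 7, 17, 27, 37, 47 and 57, i.e. every ten minutes, offset to avoid the top of the hour.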

With the server running, the Crab dashboard should be visible in a web browser, by default on port 8000. The Crab clients use this same web service to communicate with the server.

Migrating Job Information

The Crab server can export and import cron job information, including:

  • The list of cron jobs.

  • The configuration and notifications attached to each job.

  • General host/user-based notifications.

  • Raw crontabs.

You can write this information to a JSON file using the --export option:

% crabd --export job_information.json

Similarly you can read information with the --import option:

% crabd --import job_information.json

This merges the information from the file with the server’s existing configuration. You can also give a file name of - to export to standard output or read from standard input.

Example Configuration File

Here is the example server configuration file crabd.ini which is distributed with Crab:

# This file is read by CherryPy rather than ConfigParser
# and the following differences apply:  strings must be
# quoted, and it appears that if you include a section,
# you must include all settings in that section as the
# defaults are not kept.

# [crab]
# # Directory in which to find the res/ and templ/ directories.
# home = '/usr/share/crab'
#
# # Base URL to use when generating links to be used from
# # outside the Crab web interface, e.g. in notification
# # emails.
# base_url = 'http://crabserver.example.com:8000'
# # To generate automatically:
# base_url = None

# [store]
# # Main storage backend.
# type = 'sqlite'
# file = '/var/lib/crab/crab.db'
# # Alternatively for MySQL:
# # type = 'mysql'
# # host = 'localhost'
# # database = 'crab'
# # user = 'crab'
# # password = 'crab'

# [outputstore]
# # Storage backend to be used for storing job output
# # and raw crontabs.
# # (This is optional, unless the selected main backend
# # is not capable of storing output.)
# type = 'file'
# dir = '/var/lib/crab'

# [global]
# engine.autoreload.on = False
#
# server.socket_port = 8000
#
# # To listen on localhost only:
# server.socket_host = '127.0.0.1'
#
# # To listen on a specific address:
# server.socket_host = '0.0.0.0'

# [email]
# # Server through which to send email notifications.
# server = 'mailhost'
#
# # Name (and address) to send email from.
# from = 'Crab Daemon'
#
# # Subjects to use for different severity levels.
# subject_ok = 'Crab notification'
# subject_warning = 'Crab notification (WARNING)'
# subject_error = 'Crab notification (ERROR)'

# [notify]
# # Cron-style schedule for sending "daily" notifications,
# # to be used for notifications without specified schedules.
# daily = '0 0 * * *'
#
# # Timezone to use for the daily notification schedule.
# timezone = 'UTC'

# # Uncomment this section if you wish to use the automated cleaning
# # service to delete the history of old events.
# [clean]
# # Cron-style schedule for cleaning operations.
# schedule = '15 0 * * *'
# # Timezone to use for the cleaning schedule.
# timezone = 'UTC'
# # Number of days for which to keep events.
# keep_days = 90

# # This section applies if crabd is run with the --accesslog option
# # giving the base access log file name (e.g. via crabd-check).
# [access_log]
# # Maximum size of log files (MiB), or 0 to disable rotation.
# max_size = 10
# # Number of past log files to keep, or 0 to disable rotation.
# backup_count = 10

# # This section applies if crabd is run with the --errorlog option
# # giving the base error log file name (e.g. via crabd-check).
# [error_log]
# # Maximum size of log files (MiB), or 0 to disable rotation.
# max_size = 10
# # Number of past log files to keep, or 0 to disable rotation.
# backup_count = 10