Quick start guide to CLASSIC

This document will walk through the basic setup, compilation, and running of CLASSIC at FLUXNET sites.


This guide assumes that you are using a Linux machine with Singularity and tar installed. Singularity's official documentation covers installation on Linux, as well as setup instructions for Windows and Mac machines; tar is available through your distribution's package manager.

Obtaining the Source Code

The first step is setting up a directory structure. For the sake of this guide, we will work out of a directory located at /home/eccc. So first we navigate to our chosen directory location:

cd /home/eccc

The CLASSIC source code is hosted on GitLab, and can be cloned with:

git clone https://gitlab.com/cccma/CLASSIC.git

If you are familiar with git’s workflow and wish to contribute to the codebase in the future, then it is a good idea to first fork a copy to your own GitLab account. Once you have a fork, you can clone the repository with:

git clone https://gitlab.com/_your_gitlab_username_/CLASSIC.git

Once the cloning process is complete, there should be a directory titled CLASSIC inside /home/eccc.


Obtaining other necessary files

All other necessary files are found on Zenodo in the form of compressed packages. We will use the tar command to unpack them beside the CLASSIC repository.

Navigate to the CLASSIC community Zenodo page and download the FLUXNET and CLASSIC_container tar files.

Your directory should now contain the following:

CLASSIC/
CLASSIC_container.tar.gz
FLUXNET.tar.gz

From the /home/eccc directory, we will decompress the tarballs and move the decompressed files to their correct locations. The following list of commands will accomplish this:

tar xzvf CLASSIC_container.tar.gz
tar xzvf FLUXNET.tar.gz

mv CLASSIC_container.simg CLASSIC/

mv FLUXNET/TRENDY_CO2_1700_2018.nc CLASSIC/inputFiles/
mv FLUXNET/observationalDataFLUXNET CLASSIC/inputFiles/
mv FLUXNET/benchmark_CLASSIC_output CLASSIC/benchmark/
mv FLUXNET/benchmark_CLASSIC_plots CLASSIC/benchmark/
mv FLUXNET/benchmark_CLASSIC_AMBER CLASSIC/benchmark/
rm -rf FLUXNET/
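If you want to confirm that everything landed where CLASSIC expects it, a quick check along these lines can help (a sketch only; the paths simply restate the moves above):

```shell
# Report any expected file or directory that is missing after the moves above.
# Run from /home/eccc.
missing=0
for path in CLASSIC/CLASSIC_container.simg \
            CLASSIC/inputFiles/TRENDY_CO2_1700_2018.nc \
            CLASSIC/inputFiles/observationalDataFLUXNET \
            CLASSIC/benchmark/benchmark_CLASSIC_output \
            CLASSIC/benchmark/benchmark_CLASSIC_plots \
            CLASSIC/benchmark/benchmark_CLASSIC_AMBER; do
  if [ ! -e "$path" ]; then
    echo "MISSING: $path"
    missing=$((missing + 1))
  fi
done
echo "$missing item(s) missing"
```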

Using the container

Navigate to the CLASSIC root directory. The rest of the tutorial will assume that you stay in this working directory:

cd /home/eccc/CLASSIC

Since much of CLASSIC’s functionality relies on the libraries contained within the container, we execute commands through the container with this syntax:

singularity exec CLASSIC_container.simg [commands here]

Most such calls are wrapped inside the automated run scripts, so you will rarely need to type them directly. However, if you wish to deviate from the automated run scripts, this syntax is the way to do it.
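For instance, to verify that the container runs at all, you could execute a harmless command through it (the if/else guard is only so the snippet fails gracefully on a machine without Singularity):

```shell
# Run a trivial command inside the container to confirm it is usable.
if command -v singularity >/dev/null 2>&1; then
  singularity exec CLASSIC_container.simg uname -a
else
  echo "singularity not found; install it before proceeding"
fi
```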

First, we’ll make sure we have a clean working directory.

singularity exec CLASSIC_container.simg make mode=serial clean

We are now ready to compile the source code. This is done with the command:

singularity exec CLASSIC_container.simg make mode=serial

The compilation process can take several minutes. If you get errors, check that you have the latest stable version of CLASSIC pulled from the repository.

Setting up FLUXNET site-level runs

With the FLUXNET sites in place and CLASSIC compiled, we can now set up the job options file(s) to run over the FLUXNET sites. To do this, run the job options script by invoking:


This script places a job options file, tailored to each particular site, in every FLUXNET site directory. The singularity exec command is not necessary for this particular script.

Running CLASSIC over FLUXNET sites

This guide will cover two methods for running the CLASSIC binary on FLUXNET sites. They can be run individually, or through the batch-run script.

A script is provided in the CLASSIC repository to run all FLUXNET sites, provided the directory structure of this document is followed. The script will run for all sites it finds within the inputFiles/FLUXNETsites directory and is invoked by:


All model restart files are provided in a ‘pre-spunup’ state. Still, running all sites can be a lengthy process on older machines, and sometimes you may not want or need all of them. To reduce the number of sites being run, remove the undesired site directories from inputFiles/FLUXNETsites. For example, if you don’t need to process DK-Sor, you could do:

mv inputFiles/FLUXNETsites/DK-Sor inputFiles

This would simply move the site out of the way, and it could be placed back into the FLUXNETsites directory when needed.
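To set several sites aside at once, a small loop does the same thing; the site names and the parkedSites holding directory below are only illustrative:

```shell
# Move a list of unwanted sites out of FLUXNETsites into a holding directory.
mkdir -p inputFiles/parkedSites
for site in DK-Sor AU-Tum; do
  if [ -d "inputFiles/FLUXNETsites/$site" ]; then
    mv "inputFiles/FLUXNETsites/$site" inputFiles/parkedSites/
  fi
done
```

Moving the directories back into inputFiles/FLUXNETsites restores them for future batch runs.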

Running individual sites (advanced)

Individual sites are run through the Singularity container by directly referencing the job options file for that particular site. The command takes the form:

singularity exec [Location of CLASSIC container] [location of CLASSIC binary] [Location of job options file for that site] [Longitude/Latitude of the site]

Since this is a site-level simulation, the shorthand 0/0 or 0/0/0/0 can be used in place of the actual longitude and latitude of the site.

Piecing it all together, if we want to run on the site AU-Tum, the command would be:

singularity exec CLASSIC_container.simg bin/CLASSIC_serial inputFiles/FLUXNETsites/AU-Tum/job_options_file.txt 0/0/0/0

If you decide to shell into the Singularity container (recommended if you’re familiar with Singularity and terminal commands), the command simplifies to:

bin/CLASSIC_serial inputFiles/FLUXNETsites/AU-Tum/job_options_file.txt 0/0/0/0
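The same pattern extends to a loop over every site directory, essentially a hand-rolled version of the batch script (a sketch only, assuming you are shelled into the container at the CLASSIC root):

```shell
# Run CLASSIC for each site found under inputFiles/FLUXNETsites.
for site_dir in inputFiles/FLUXNETsites/*/; do
  [ -d "$site_dir" ] || continue   # skip when the glob matches nothing
  echo "Running site $(basename "$site_dir")"
  bin/CLASSIC_serial "${site_dir}job_options_file.txt" 0/0/0/0
done
```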

More information on running CLASSIC can be found in the CLASSIC manual.

Processing output

Whether you ran multiple sites with the batch job, or a single site individually, the output can be found in outputFiles/FLUXNETsites/{sitename}/. These outputs are in the form of netCDF files; briefly, netCDF is a machine-independent, array-oriented data storage format, documented in detail by Unidata.
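For a first look at any of these files, the standard ncdump utility prints the header (variables, dimensions, attributes) without dumping the data. This assumes ncdump is available inside the container, and the file name below is only a placeholder:

```shell
# Print the header of one output file; adjust the path to a file that exists.
if command -v singularity >/dev/null 2>&1; then
  singularity exec CLASSIC_container.simg \
    ncdump -h outputFiles/FLUXNETsites/AU-Tum/example_output.nc
else
  echo "singularity not found on this machine"
fi
```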

A script is provided to convert this data to CSV format, which may be easier to work with if you are unfamiliar with netCDF, and to generate plots of several output variables against observational values of those variables. The script is run with:


NOTE: if not all sites were run with the batch script, this script will report errors for the missing sites. This is expected and does not mean that the processing has failed.

Once the output plots have been generated, the process_outputs script will run the Automated Model Benchmarking (AMBER) tool. More information on AMBER can be found in its documentation.

After the script completes, PDF copies of the plots are found in outputFiles/plots, while AMBER results are in outputFiles/AMBER.