If you have ever used ANTS, you know it is amazing for image registration. Unfortunately, it can take hours or even days to complete. That is where Condor comes in. Here is how I created a study-specific template with ANTS and Condor.
This walkthrough assumes that you are familiar with *unix, ANTS, and Condor. HTCondor is a specialized workload management system for compute-intensive jobs, and I will assume that you already have it running on your favorite computing cluster. Using the scripts below, you should be able to create a study-specific MRI template using ANTS on your Condor system.
Ok, here we go. To summarize: I wrote a bash script that creates a DAG file, which describes how to submit jobs to Condor; those jobs, in turn, execute bash scripts that run ANTS.
To make this work, you will need three things:
- A set of programs to run and files to run them on. In my case, these are the ANTS executables, bash scripts to link the executables together, and NIfTI-format T1 images.
- A submit file for each command you want to run in Condor.
- A DAG (directed acyclic graph) file that links the Condor jobs together in the right order. Because DAG files can be complicated and study-specific, I write scripts that create them.
Once you have these parts, you will be able to run a script and submit the DAG it creates to your Condor system.
First, set up a directory with the executables in it. The executables you will need are all part of the ANTS package: ANTS, ImageMath, N3BiasFieldCorrection, SetOrigin, WarpImageMultiTransform, AverageImages, AverageAffineTransform, and MultiplyImages.
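If it helps, here is one way to gather them (a minimal sketch: the ants_condor directory name is arbitrary, and the ANTSPATH variable is an assumption, so point it at wherever your ANTS build actually lives):

    # copy the ANTS executables this walkthrough uses into a working directory
    mkdir -p ants_condor
    for exe in ANTS ImageMath N3BiasFieldCorrection SetOrigin \
               WarpImageMultiTransform AverageImages AverageAffineTransform MultiplyImages; do
        cp "${ANTSPATH}/${exe}" ants_condor/
    done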
Prepare to run ANTS on a single subject using Condor.
In order to build the template, we will have to run ANTS repeatedly on a number of subjects, which means we first need a way to run it once, on a single subject.
Once you are done, you should be able to run it like this:
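The exact call depends on how you write your script; in the sketches below I assume it takes the template, a template mask, and a subject image, in that order (the filenames are placeholders):

./go_ants_nifti.sh template.nii templateMask.nii subject01.nii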
This script will intensity normalize, bias correct, and zero the origin of both the test and template images, then run the image registration from the test image to the template.
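Here is a minimal sketch of what such a script could look like. The ANTS flag choices and iteration counts are illustrative, and the exact argument order of SetOrigin may differ across ANTS versions, so treat this as a starting point rather than a finished script:

    #!/bin/bash
    # go_ants_nifti.sh -- preprocess both images, then register subject to template
    # usage: ./go_ants_nifti.sh template.nii templateMask.nii subject.nii
    template=$1
    mask=$2
    subject=$3
    prefix=toTemplate_$(basename "$subject" .nii)

    # intensity normalize, bias correct, and zero the origin of both images
    for img in "$template" "$subject"; do
        ./ImageMath 3 "$img" Normalize "$img"        # rescale intensities to [0,1]
        ./N3BiasFieldCorrection 3 "$img" "$img" 4    # N3 bias-field correction
        ./SetOrigin 3 "$img" "$img" 0 0 0            # zero the image origin
    done

    # deformable registration from subject to template, restricted to the mask
    ./ANTS 3 -x "$mask" -m CC["$template","$subject",1,4] \
        -i 30x90x20 -r Gauss[3,0] -t SyN[0.25] -o "$prefix".nii

    # apply the resulting warp and affine to move the subject into template space
    ./WarpImageMultiTransform 3 "$subject" "$prefix".nii -R "$template" \
        "$prefix"Warp.nii "$prefix"Affine.txt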
Once you have the script, you can create a submit file for it. Each submit file describes an executable that can be run on a number of different input files. In this case, the submit file will describe how to run the script we just looked at, go_ants_nifti.sh. The core components of a submit file are the executable, transfer_input_files, transfer_output_files, and Arguments lines. These lines specify what files you will need to upload, what command to run, and what files to transfer back to your computer upon completion.
In addition to these basic components, this submit file has a few lines that will help you (a sketch of a full submit file follows this list):
- document your successes and failures (i.e., the Log, Output, and Error files),
- help your script run on the right computers (i.e., Requirements, request_cpus, request_memory, and request_disk),
- and manage your job a bit by restarting it if it takes too long or gets booted from a compute node unexpectedly (i.e., periodic_hold and periodic_release).
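Here is a hedged sketch of what go_ants_nifti.submit could look like. The resource numbers and time limits are guesses to tune for your cluster, and the $(template), $(mask), $(subject), and $(subjectbase) macros are assumptions that a DAG would fill in via VARS ($(subjectbase) being the subject filename without its .nii extension):

    # go_ants_nifti.submit -- sketch of a submit file for go_ants_nifti.sh
    universe                = vanilla
    executable              = go_ants_nifti.sh
    arguments               = $(template) $(mask) $(subject)

    # ship the ANTS binaries and images to the execute node, and bring results back
    transfer_input_files    = ANTS, ImageMath, N3BiasFieldCorrection, SetOrigin, WarpImageMultiTransform, $(template), $(mask), $(subject)
    transfer_output_files   = toTemplate_$(subjectbase).nii, toTemplate_$(subjectbase)Affine.txt, toTemplate_$(subjectbase)Warp.nii
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT

    # document your successes and failures
    log    = $(subjectbase).log
    output = $(subjectbase).out
    error  = $(subjectbase).err

    # help the job land on a suitable machine
    requirements   = (OpSys == "LINUX") && (Arch == "X86_64")
    request_cpus   = 1
    request_memory = 4GB
    request_disk   = 4GB

    # hold jobs that have run for more than a day, and release held jobs
    # (up to five attempts) after ten minutes so they can restart
    periodic_hold    = (JobStatus == 2) && ((time() - JobCurrentStartDate) > 86400)
    periodic_release = (JobStatus == 5) && (NumJobStarts < 5) && ((time() - EnteredCurrentStatus) > 600)

    queue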
Assuming you set up your files appropriately, you could, at least in theory, run:
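condor_submit go_ants_nifti.submit

(One caveat: if your submit file leans on DAG-supplied macros like $(subject), as the sketch above does, condor_submit alone will leave them empty; hard-code the filenames if you want to test a single job by hand.)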
Once you have ANTS running, the next step will be to create your study-specific template. This step will create a mean image of all subjects in template space, and then transform that mean according to the average of the transformations it took to get the subjects into template space. We will use the same strategy we used above, namely, write a script and a submit file that will manage that script.
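Here is a sketch of what go_shapeUpdateTemplate.sh could look like, modeled on the shape-update step in ANTS's own buildtemplateparallel.sh; the filenames and the gradient step are illustrative:

    #!/bin/bash
    # go_shapeUpdateTemplate.sh -- average the registered subjects, then sharpen
    # the result using the mean transformation
    # usage: ./go_shapeUpdateTemplate.sh template_1.nii   (name for the new template)
    newtemplate=$1
    gradientstep=-0.25   # how far to move the template along the mean transform

    # mean image of all subjects already registered into template space
    ./AverageImages 3 "$newtemplate" 1 toTemplate_subject*.nii

    # average the warps and affines that took the subjects into template space
    ./AverageImages 3 templateWarp.nii 0 toTemplate_subject*Warp.nii
    ./AverageAffineTransform 3 templateAffine.txt toTemplate_subject*Affine.txt

    # scale the mean warp by a (negative) gradient step...
    ./MultiplyImages 3 templateWarp.nii $gradientstep templateWarp.nii

    # ...and apply it four times, along with the inverse of the mean affine,
    # as buildtemplateparallel.sh does, to sharpen the template
    ./WarpImageMultiTransform 3 "$newtemplate" "$newtemplate" -i templateAffine.txt \
        templateWarp.nii templateWarp.nii templateWarp.nii templateWarp.nii \
        -R "$newtemplate"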
Then you create the submit file that manages this bash script. This submit file will tell Condor how to run and manage go_shapeUpdateTemplate.sh above. If you are so inclined, you can update this submit file with some of the requirements, requests, or periodic_ commands above. (I didn't, because I don't have a really good idea how much memory/disk this process takes, and I don't want to preemptively put it on hold.)
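A deliberately bare-bones sketch (the file lists are illustrative, and note that the per-subject warp and affine files must also reach the job, e.g. by adding them to transfer_input_files, since Condor will not glob them for you):

    # go_shapeUpdateTemplate.submit -- bare-bones sketch, no requirements
    # or periodic_ lines
    universe                = vanilla
    executable              = go_shapeUpdateTemplate.sh
    arguments               = $(newtemplate)
    transfer_input_files    = AverageImages, AverageAffineTransform, MultiplyImages, WarpImageMultiTransform
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    log    = shapeUpdate.log
    output = shapeUpdate.out
    error  = shapeUpdate.err
    queue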
Create your DAG of parallel and dependent processes.
Now you have two scripts and two submit files for those scripts. The DAG you create will allow you to run these commands in the right order on the right subjects. The format of a DAG file is pretty simple: for every job you run, you define which submit file to use, any variables you want to pass to the submit file, and any dependencies the job has.
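For example, a fragment like this (reconstructed to match the description below; the VARS values are illustrative):

    JOB myANTS go_ants_nifti.submit
    VARS myANTS template="template.nii" mask="templateMask.nii" subject="subject01.nii" subjectbase="subject01"
    PARENT previousTemplateCreation CHILD myANTS
    PARENT myANTS CHILD nextTemplateCreation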
This code will run the job myANTS. It defines myANTS as a single instance of go_ants_nifti.sh, run via the go_ants_nifti.submit submit file with the specified VARS, which dictate which subject to run on. Importantly, DAGMan will not run myANTS until another job, previousTemplateCreation, has finished, and will wait to start nextTemplateCreation until myANTS has finished.
Because this can get crazy complicated when you have many subjects and repeated template creation iterations, I wrote a bash script that will create a bunch of these commands.
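In that spirit, here is a sketch of what such a generator could look like. The four-iteration count, the job-naming scheme, and the VARS macros are assumptions chosen to line up with the submit-file sketches above:

    #!/bin/bash
    # createAntsDag_nifti.sh -- sketch of a DAG generator for iterative
    # template creation
    # usage: ./createAntsDag_nifti.sh template.nii templateMask.nii subject*.nii > myDag.dag
    template=$1; mask=$2; shift 2
    subjects=("$@")

    tmpl="$template"
    for iter in 1 2 3 4; do
        # one registration job per subject; these run in parallel
        for s in "${subjects[@]}"; do
            base=$(basename "$s" .nii)
            echo "JOB ants_${iter}_${base} go_ants_nifti.submit"
            echo "VARS ants_${iter}_${base} template=\"$tmpl\" mask=\"$mask\" subject=\"$s\" subjectbase=\"$base\""
        done

        # one template-update job per iteration, gated on every registration
        echo "JOB shape_${iter} go_shapeUpdateTemplate.submit"
        echo "VARS shape_${iter} newtemplate=\"template_${iter}.nii\""
        for s in "${subjects[@]}"; do
            echo "PARENT ants_${iter}_$(basename "$s" .nii) CHILD shape_${iter}"
        done

        # the next iteration registers everyone to the updated template
        if [ "$iter" -lt 4 ]; then
            for s in "${subjects[@]}"; do
                echo "PARENT shape_${iter} CHILD ants_$((iter + 1))_$(basename "$s" .nii)"
            done
        fi
        tmpl="template_${iter}.nii"
    done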
Feel free to run it on your data and see what DAG file it creates.
createAntsDag_nifti.sh template.nii templateMask.nii subject*.nii > myDag.dag
more myDag.dag
If you have gotten this far, you should be ready to submit your DAG file.
condor_submit_dag myDag.dag
Once you submit this file, your Condor system should manage the execution of your scripts and give you many, many output files. In particular, it should create a file for each subject*.nii in standard space, called toTemplate_subject*.nii, as well as toTemplate_subject*Affine.txt and toTemplate_subject*Warp.nii files that describe the transformation. It will also create a study-specific template called template_4.nii.
Hopefully this will help you get your ANTSing done faster.