Usage¶
Execution and the BIDS format¶
The Multi-Scale Brain Parcellator
workflow takes as principal input the path of the dataset
that is to be processed.
The input dataset is required to be in valid BIDS format, and it must include at least one T1w or MPRAGE structural image.
We highly recommend that you validate your dataset with the free, online
BIDS Validator.
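If you prefer to validate locally rather than with the online tool, one possible approach is the BIDS Validator's official Docker image (the image name bids/validator and the read-only mount are assumptions based on the BIDS-Apps conventions; adjust the dataset path to your own):

```shell
# Validate a local BIDS dataset with the dockerized BIDS Validator;
# the dataset is mounted read-only since validation never writes to it.
docker run -ti --rm \
    -v /home/localadmin/data/ds001:/data:ro \
    bids/validator /data
```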
Commandline Arguments¶
The command to run Multi-Scale Brain Parcellator
follows the BIDS-Apps definition, with additional options specific to this pipeline.
Multi-scale Brain Parcellator BIDS App.
usage: multiscalebrainparcellator [-h]
[--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]]
[--number_of_cores NUMBER_OF_CORES]
[--isotropic_resolution ISOTROPIC_RESOLUTION]
[--thalamic_nuclei]
[--hippocampal_subfields]
[--brainstem_structures]
[--skip_bids_validator] [-v]
bids_dir output_dir {participant}
Positional Arguments¶
bids_dir
  The directory with the input dataset formatted according to the BIDS standard.
output_dir
  The directory where the output files should be stored. If you are running group level analysis, this folder should be prepopulated with the results of the participant level analysis.
analysis_level
  Possible choices: participant. Level of the analysis that will be performed. Note that only participant level analysis is available. Multiple participant level analyses can be run independently (in parallel) using the same output_dir.
Named Arguments¶
--participant_label
  The label(s) of the participant(s) that should be analyzed. The label corresponds to sub-<participant_label> from the BIDS spec (so it does not include "sub-"). If this parameter is not provided, all subjects will be analyzed. Multiple participants can be specified with a space-separated list.
--number_of_cores
  The number of cores to be used for processing (the maximum number of available processing cores is used by default).
--isotropic_resolution
  The isotropic resolution in mm used to resample the original anatomical images, applied at the beginning of the processing pipeline.
--thalamic_nuclei
  Whether or not to parcellate the thalamic nuclei. Default: False
--hippocampal_subfields
  Whether or not to parcellate the hippocampal subfields. Default: False
--brainstem_structures
  Whether or not to parcellate the brainstem structures. Default: False
--skip_bids_validator
  Whether or not to skip BIDS dataset validation. Default: False
-v, --version
  Display the version of Multi-scale Brain Parcellator BIDS-App.
Participant Level Analysis¶
To run the docker image in participant level mode (for one participant):
docker run -it --rm \
-v /home/localadmin/data/ds001:/bids_dataset \
-v /media/localadmin/data/ds001/derivatives:/bids_dataset/derivatives \
-v /usr/local/freesurfer/license.txt:/opt/freesurfer/license.txt \
sebastientourbier/multiscalebrainparcellator:latest \
/bids_dataset /bids_dataset/derivatives participant --participant_label 01 \
--isotropic_resolution 1.0 \
--thalamic_nuclei \
--hippocampal_subfields \
--brainstem_structures
Note
The local directory of the input BIDS dataset (here: /home/localadmin/data/ds001)
and the output directory (here: /media/localadmin/data/ds001/derivatives)
have to be mapped to the container folders /bids_dataset
and /bids_dataset/derivatives
respectively, using the -v
docker run option.
Debugging¶
Logs are written to
<output dir>/cmp/sub-<participant_label>/sub-<participant_label>_log-multiscalebrainparcellator.txt.
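For example, the log path can be built from the participant label and output directory of the docker command above (a sketch; the values below are placeholders to be replaced with your own):

```shell
# Build the expected log file path for a given participant label.
# Both values are placeholders taken from the example above.
participant_label=01
output_dir=/media/localadmin/data/ds001/derivatives
log_file="${output_dir}/cmp/sub-${participant_label}/sub-${participant_label}_log-multiscalebrainparcellator.txt"
echo "${log_file}"
```

You can then follow the pipeline's progress with `tail -f "${log_file}"`.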
Support and communication¶
The documentation of this project is found here: http://multiscalebrainparcellator.readthedocs.org/en/latest/.
All bugs, concerns and enhancement requests for this software can be submitted here: https://github.com/sebastientourbier/multiscalebrainparcellator/issues.
If you run into any problems or have any questions, you can post to the CMTK-users group.
Not running on a local machine? - Data transfer¶
If you intend to run multiscalebrainparcellator
on a remote system, you will need to
make your data available within that system first. Comprehensive solutions such as Datalad will handle data transfers with the appropriate
settings and commands. Datalad also performs version control over your data.
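As a sketch of what such a transfer could look like (this assumes DataLad is installed on the remote system, and the dataset URL is a placeholder to be replaced with your own data source):

```shell
# Clone the dataset's version-controlled skeleton onto the remote system
# (metadata only; file contents are not downloaded yet).
datalad clone https://github.com/OpenNeuroDatasets/ds000001.git ds000001
# Fetch the contents for just the subject you intend to process.
datalad get ds000001/sub-01
```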