Hello, thanks again for your tremendous help.
Sorry it took me so long to reply; I ran into a couple of things that I wanted to understand/fix, and your help with the following would be much appreciated.
I started encountering an error with version 1.3.0, whether using its default pipeline or the replication pipeline that I ultimately want to run. I fixed this error by making a directory on my machine and binding it to the scratch/working directory in the command (sketched after the error below). I am including this only for completeness; this error and its fix are not needed for the latest container (CPAC v1.8.3).
Command: running default pipeline without binding scratch/working:
singularity run -B /home/ammar/cpac_inputs:/bids_dataset -B /home/ammar/cpac_outputs:/outputs -B /tmp:/tmp -B /home/ammar/cpac_outputs/log:/log c-pac-release-v1.3.0.post2.simg /bids_dataset /outputs participant --pipeline_file /home/ammar/cpac_inputs/default_pipeline_1.3.0.yml --skip_bids_validator
Error:
Namespace(analysis_level='participant', aws_input_creds=None, aws_output_creds=None, bids_dir='/bids_dataset', bids_validator_config=None, data_config_file=None, disable_file_logging=False, mem_gb=None, mem_mb=None, n_cpus='1', output_dir='/outputs', participant_label=None, participant_ndx=None, pipeline_file='/home/ammar/cpac_inputs/default_pipeline_1.3.0.yml', save_working_dir=False, skip_bids_validator=True)
skipping bids-validator...
#### Running C-PAC
Number of participants to run in parallel: 1
Input directory: /bids_dataset
Output directory: /outputs/output
Working directory: /scratch/working
Crash directory: /outputs/crash
Log directory: /outputs/log
Remove working directory: True
Available memory: 6.0 (GB)
Available threads: 1
Number of threads for ANTs: 1
Starting participant level processing
Traceback (most recent call last):
File "/code/run.py", line 388, in <module>
plugin='MultiProc', plugin_args=plugin_args)
File "/code/CPAC/pipeline/cpac_runner.py", line 481, in run
raise Exception(err)
Exception:
[!] CPAC says: Could not create the working directory: /scratch/working
Make sure you have permissions to write to this directory.
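For completeness, the fix amounted to something like creating the directory on the host and then binding it into the container:
mkdir -p /home/ammar/cpac_outputs/scratch/working
followed by adding -B /home/ammar/cpac_outputs/scratch/working:/scratch/working to the singularity call, as shown in the replication command below.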
Unfortunately, even after fixing that error with the binding step, the run still fails eventually, with different errors for the 1.3.0 default pipeline and the replication pipeline.
Command: replication run with CPAC v1.3.0:
singularity run -B /home/ammar/cpac_inputs:/bids_dataset -B /home/ammar/cpac_outputs:/outputs -B /tmp:/tmp -B /home/ammar/cpac_outputs/scratch/working:/scratch/working c-pac-release-v1.3.0.post2.simg /bids_dataset /outputs participant --pipeline_file /home/ammar/cpac_inputs/replication_pipeline.yml --skip_bids_validator
Error:
Namespace(analysis_level='participant', aws_input_creds=None, aws_output_creds=None, bids_dir='/bids_dataset', bids_validator_config=None, data_config_file=None, disable_file_logging=False, mem_gb=None, mem_mb=None, n_cpus='1', output_dir='/outputs', participant_label=None, participant_ndx=None, pipeline_file='/home/ammar/cpac_inputs/replication_pipeline.yml', save_working_dir=False, skip_bids_validator=True)
skipping bids-validator...
#### Running C-PAC
Number of participants to run in parallel: 1
Input directory: /bids_dataset
Output directory: /outputs/output
Working directory: /scratch/working
Crash directory: /outputs/crash
Log directory: /outputs/log
Remove working directory: True
Available memory: 6.0 (GB)
Available threads: 1
Number of threads for ANTs: 1
Starting participant level processing
Traceback (most recent call last):
File "/code/run.py", line 388, in <module>
plugin='MultiProc', plugin_args=plugin_args)
File "/code/CPAC/pipeline/cpac_runner.py", line 419, in run
strategies = sorted(build_strategies(c))
File "/code/CPAC/pipeline/cpac_runner.py", line 112, in build_strategies
'_threshold': configuration.spikeThreshold,
AttributeError: 'Configuration' object has no attribute 'spikeThreshold'
After debugging the latter error, I found that the formatting of the nuisance regressors section differs between the replication pipeline I want to use and the default pipeline of CPAC v1.3.0. Kindly note that the replication pipeline I am trying right now is that of the HBN dataset; I got it by connecting to their S3 bucket via Cyberduck, where it can be found under the cpac_deriv folder. While I can try to make those changes myself, I just wanted to check whether you have more info or an easier way. Thanks!
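For reference, a quick (hypothetical) way to see the mismatch is to grep both pipeline YAMLs for the key named in the traceback, and to diff them, using the same paths as in the commands above:
grep -n -i "spike" /home/ammar/cpac_inputs/default_pipeline_1.3.0.yml /home/ammar/cpac_inputs/replication_pipeline.yml
diff /home/ammar/cpac_inputs/default_pipeline_1.3.0.yml /home/ammar/cpac_inputs/replication_pipeline.yml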
On the other hand, running the default pipeline of CPAC v1.3.0 or v1.8.3 with their respective containers also causes an error, which I think is related to nipype. The commands are included below. The errors for both versions are similar; the error for v1.3.0 is included in the attachment, along with the log file contents (pipeline.log).
Command: default pipeline run v1.3.0
singularity run -B /home/ammar/cpac_inputs:/bids_dataset -B /home/ammar/cpac_outputs:/outputs -B /tmp:/tmp -B /home/ammar/cpac_outputs/scratch/working:/scratch/working c-pac-release-v1.3.0.post2.simg /bids_dataset /outputs participant --skip_bids_validator
Command: default pipeline run v1.8.3
singularity run -B /home/ammar/cpac_inputs:/bids_dataset -B /home/ammar/cpac_outputs:/outputs -B /tmp:/tmp c-pac-latest.simg /bids_dataset /outputs participant --skip_bids_validator
Finally, thanks again for your help, and sorry if my post is a bit too long. Kindly note that my main goal is to run the replication pipeline mentioned earlier; everything else is included for reference.
EDIT: I cannot attach files, and the error + log file contents are too big to be posted here. Please let me know how I can send them if you need them.