Batch coding for multisubjects analysis

I have Python code that I want to execute on an HPC cluster, so I need to use job arrays with sbatch. I have never used them, so I am trying both to understand how they work (to adapt them to the needs of my future work) and to apply them to the code I am currently working on.

As my problem was very similar to the one described in #21124 (Example Code for Batch (analyzing multiple subjects)), I followed the steps proposed there:

  1. I created the submission script, which I adapted to my case by writing the directory that contains all of my subject folders in place of $DIRECTORY_WHERE_SUBJECT_FOLDERS_ARE, and by changing subjs=($(ls sub*/ -d)) to subjs=($(ls [0-9]*/ -d)), because my subject folders are named with numbers only and there were other files in that directory that I did not want to include.
  2. I created the script that calls Python for each subject, replacing $ with my Python projects directory and changing the header to:
#SBATCH --job-name=test_job
#SBATCH --output=res_test_job_%A_%a.txt
#SBATCH --partition=normal,gpu
#SBATCH --gpus=1
#SBATCH --cpus-per-gpu=8
#SBATCH --error error_%A_%a.out
  3. I added the indicated header to my Python file and then changed the passed argument to sub.
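For reference, the mechanism the steps above rely on can be sketched as follows. This is only an illustration with made-up subject IDs: inside a real array job, SLURM sets SLURM_ARRAY_TASK_ID itself, and the subject list arrives via the script's command-line arguments ("$@") rather than a hard-coded array; the python call is hypothetical.

```shell
# Sketch of the indexing a per-subject job script performs. SLURM sets
# SLURM_ARRAY_TASK_ID inside an array job; here we set it by hand so the
# logic can be checked outside the cluster.
SLURM_ARRAY_TASK_ID=1
subjs=(100307 100408 100610)          # stand-ins for "$@", the IDs sbatch passes along
sub="${subjs[$SLURM_ARRAY_TASK_ID]}"  # each array task picks its own subject
echo "Task $SLURM_ARRAY_TASK_ID -> subject $sub"
# python my_analysis.py "$sub"        # hypothetical call to the analysis code
```

With --array=0-$len, SLURM launches one copy of this script per index, so every subject in the list is processed by exactly one task.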

However, when I run sbatch I obtain a slurm-JOBID.out file with the error: Batch job submission failed: Invalid job array specification.

My current python code structure is the following:

  • Declaration of libraries used
  • Definition of functions to read certain files of the subjects (I pass the subject number and it generates the path to the file)
  • Definition of the model function, which receives the subject directory as input (the function is called MAPMRI_SUBJECT(subjectdr))
  • The main code, which is:
from os.path import exists

subjectdr = subject_directory(sub)
file_check = subjectdr + '/mask_subject.nii.gz'

if exists(file_check):
    print('Subject ' + str(sub) + ' has the correct files to proceed.')
else:
    print('Subject ' + str(sub) + ' does not have the needed files.')

This goes through every subject in the subjs list, but it passes only one subject number to the code at a time, via the sub variable. I do not understand why I receive the error.

Hi Gabriella,

What is your full script? Have you tried running the lines in the submit array script line by line to see if the subjects are gathered as expected?


My full script is the following:

pushd '/home/mind/ggomezji/data/HCP900'
subjs=($(ls [0-9]*/ -d)) # Get list of subject directory names
subjs=("${subjs[@]///}") # Remove the trailing slash

# take the length of the array
# this will be useful for indexing later
len=$(expr ${#subjs[@]} - 1) # len - 1 because 0 index
echo Spawning ${#subjs[@]} sub-jobs. # print to command line

sbatch --array=0-$len /home/mind/ggomezji/projects/ ${subjs[@]}

In the terminal I ran cd '/home/mind/ggomezji/data/HCP900' and then ls [0-9]*/ -d to check the values saved in subjs, and it showed just the subject numbers, without the other files in the folder.
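The same gathering steps can also be reproduced outside the cluster in a throwaway directory (the folder names below are made up for the test):

```shell
# Rebuild the subject-gathering steps in a temporary directory to confirm
# that only numeric folder names are picked up and the trailing slash is
# stripped from each one.
tmp=$(mktemp -d)
mkdir -p "$tmp"/100307 "$tmp"/100408 "$tmp"/notes
cd "$tmp"
subjs=($(ls [0-9]*/ -d))    # matches only directories whose names start with a digit
subjs=("${subjs[@]///}")    # remove the trailing slash from each entry
echo "${subjs[@]}"          # -> 100307 100408
```

The notes directory is skipped by the [0-9]*/ glob, which is the behavior the original script depends on.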

This is the intended behavior of the script. Good to hear it is working properly.

Since it looks like the job array script is working as intended, I would guess your cluster does not allow you to run jobs this large. This would be something to ask your system admin.
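If the array size is indeed the problem, SLURM's configured cap can be inspected with scontrol show config | grep -i MaxArraySize, and the submission can be split into chunks under that cap. A minimal sketch, assuming a made-up limit, stand-in subject IDs, and a hypothetical job script name (echo stands in for the real sbatch call):

```shell
# Split a subject list into chunks no larger than an assumed per-array cap
# and print one submission command per chunk.
subjs=($(seq 100307 1 100406))   # 100 stand-in subject IDs
chunk=40                         # assumed cluster cap (check MaxArraySize)
for ((start=0; start<${#subjs[@]}; start+=chunk)); do
    batch=("${subjs[@]:start:chunk}")  # next slice of at most $chunk subjects
    end=$(( ${#batch[@]} - 1 ))        # last array index for this chunk
    echo "sbatch --array=0-$end run_subject.sh ${batch[@]} (${#batch[@]} subjects)"
done
```

Each chunk becomes its own array job of at most $chunk tasks, so no single submission exceeds the cluster's limit.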


Okay, thank you, I'll check.