Resolving FreeSurfer Errors on macOS While Processing HCP Data

Summary of what happened:

Hello

I encountered errors when running a FreeSurfer script on macOS to process Human Connectome Project (HCP) data.

Command used (and if a helper script was used, a link to the helper script or the command generated):

export FREESURFER_HOME="/System/Volumes/Data/Applications/freesurfer/8.0.0"
source $FREESURFER_HOME/SetUpFreeSurfer.sh

BASE_DIR="/Volumes/MyDRIVE/HCPDATA"
DATA_DIR="$BASE_DIR"
SUBJECTS_DIR="$BASE_DIR/HCP_Freesurfer_Outputs"
OUTPUT_DIR="$SUBJECTS_DIR/group_stats"

mkdir -p "$SUBJECTS_DIR"
mkdir -p "$OUTPUT_DIR"

SUBJECTS=(100206 100206-2 100307 100307-2 100408 100408-2 100610 100610-2 101006 101006-2 101107 101107-2 101309 101309-2 101410 101410-2 101915 101915-2 102008 102008-2 102109 102109-2 102311 102311-2 1002062 1002062-2 1003072 1003072-2 1004082 1004082-2 1006102 1006102-2 1006062 1006062-2 1010072 1010072-2 1013092 1013092-2
1014102 1014102-2 1019152 1019152-2 1020082 1020082-2 1021092 1021092-2 1023112 1023112-2)

for SUBJ in "${SUBJECTS[@]}"; do
    echo "### Processing $SUBJ ###"
    SUBJ_DIR="$DATA_DIR/$SUBJ"

    T1=$(find "$SUBJ_DIR" -iname "T1w_acpc_dc_restore.nii.gz" | head -n 1)
    if [[ -z "$T1" ]]; then
        T1=$(find "$SUBJ_DIR" -iname "T1w.nii.gz" | grep -v Divided | head -n 1)
    fi

    T2=$(find "$SUBJ_DIR" -iname "T2w_acpc_dc_restore.nii.gz" | head -n 1)

    if [[ -f "$T1" ]]; then
        echo "Found T1: $T1"
        if [[ -f "$T2" ]]; then
            echo "Found T2: $T2"
            recon-all -s "$SUBJ" -i "$T1" -T2 "$T2" -T2pial -all
        else
            echo "T2 not found. Running with only T1."
            recon-all -s "$SUBJ" -i "$T1" -all
        fi
    else
        echo "No suitable T1 found for $SUBJ. Skipping."
    fi

    echo "### Done with $SUBJ ###"
done 

Version:

8.0.0

Environment (Docker, Singularity / Apptainer, custom installation):

Bare metal on macOS

Data formatted according to a validatable standard? Please provide the output of the validator:

PASTE VALIDATOR OUTPUT HERE

Relevant log outputs (up to 20 lines):

ERROR: You are trying to re-run an existing subject with (possibly)
new input data (-i). If this is truly new input data, you should delete the subject folder and re-run, or specify a different subject name.
If you are just continuing an analysis of an existing subject, then omit all -i flags.

Screenshots / relevant information:


Hi @NILOUFAR and welcome to neurostars!

In the future, please use the Software Support post category and template. You can see I have edited it in for you this time.

The error message is pretty self-contained: if you are trying to continue an already existing subject (that is, the subject already exists in your $SUBJECTS_DIR), then you do not specify a T1 image with -i and instead specify only the subject ID. Read more in the recon-all documentation: recon-all - Free Surfer Wiki
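In your loop, a minimal sketch of that fix is to check whether the subject folder already exists under $SUBJECTS_DIR before deciding which flags to pass. The `recon_args` helper below is hypothetical (not part of FreeSurfer), and only builds the argument list; you would splice its output into your `recon-all` call:

```shell
#!/usr/bin/env bash
# Hypothetical helper: choose recon-all arguments based on whether the
# subject already exists in $SUBJECTS_DIR, so -i is only passed for new
# subjects (avoiding the "re-run an existing subject" error above).
recon_args() {
    local subj="$1" t1="$2" subjects_dir="$3"
    if [[ -d "$subjects_dir/$subj" ]]; then
        # Existing subject: continue the analysis, omit all -i flags
        echo "-s $subj -all"
    else
        # New subject: import the T1 with -i
        echo "-s $subj -i $t1 -all"
    fi
}
```

Then inside the loop you could run `recon-all $(recon_args "$SUBJ" "$T1" "$SUBJECTS_DIR")` (unquoted expansion is fine here only if your paths contain no spaces). Alternatively, simply delete the existing subject folder first if you truly want a fresh run with new input data, as the error message suggests.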

But also, why are you running recon-all on HCP data when it is publicly available?

Best,
Steven