Heudiconv Locking

Hello all. I seem to be running into an issue when trying to use a Heudiconv singularity container in combination with a Slurm job scheduler.

My current implementation worked for the initial run, but now I am getting issues with heudiconv locking up:
INFO: Running heudiconv version 0.8.0 latest 0.8.0
INFO: Need to process 1 study sessions
INFO: PROCESSING STARTS: {'subject': '221228', 'outdir': '/bgfs/adombrovski/DNPL_DataMesh/Data/BIDS/Test_temp_BIDS/', 'session': None}
INFO: Processing 3970 dicoms
INFO: Analyzing 3970 dicoms
INFO: Generated sequence info for 21 accession numbers with 3953 entries total
INFO: Doing conversion using dcm2niix
INFO: Lock 140213581456720 acquired on /bgfs/adombrovski/DNPL_DataMesh/Data/BIDS/Test_temp_BIDS/heudiconv.lock
INFO: Populating template files under /bgfs/adombrovski/DNPL_DataMesh/Data/BIDS/Test_temp_BIDS/
INFO: Lock 140213581456720 released on /bgfs/adombrovski/DNPL_DataMesh/Data/BIDS/Test_temp_BIDS/heudiconv.lock
INFO: PROCESSING DONE: {'subject': '221228', 'outdir': '/bgfs/adombrovski/DNPL_DataMesh/Data/BIDS/Test_temp_BIDS/', 'session': None}

after running the following command:
singularity run --cleanenv -B /bgfs/adombrovski/DNPL_DataMesh/Data ./…/…/…/…/Code/Singularity/heudiconv_latest.sif -d /bgfs/adombrovski/DNPL_DataMesh/Data/WPC-6605/{subject}/scans//resources/DICOM/files/ -s 221228 -o /bgfs/adombrovski/DNPL_DataMesh/Data/BIDS/Test_temp_BIDS/ -f /bgfs/adombrovski/DNPL_DataMesh/Data/BIDS/BIDS_Heuristics/learn2explore_clock_heuristic.py -b -c dcm2niix -g accession_number

A previous post I found suggested deleting the hidden .heudiconv cache directory, but that particular solution does not work in my case.

This is actually my second attempt to get this working. The first implementation tried to add subjects directly to an existing BIDS directory created with heudiconv. This version instead copies the files into a temporary directory; however, it yields the same error during the creation of that temp directory.

Hopefully I was clear in my description of the problem. Any help on working around this heudiconv.lock issue or more information on how the locking works will be greatly appreciated!

TL;DR: Heudiconv singularity image seems to lock any and all BIDS directories after initial use.

Solved my own issue.

Turns out the subjects I was feeding into heudiconv did not have the task I was searching for.
Also, passing `notop` as an argument to the --bids flag (`--bids notop`) circumvents the lock without having to delete the .heudiconv directory; however, this skips creating the top-level BIDS files such as the .json and .tsv files.
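For reference, my earlier command with `notop` passed to the BIDS flag would look like this (same paths as above; a sketch only, so double-check the flag against your heudiconv version):

```shell
singularity run --cleanenv -B /bgfs/adombrovski/DNPL_DataMesh/Data \
    ./…/…/…/…/Code/Singularity/heudiconv_latest.sif \
    -d /bgfs/adombrovski/DNPL_DataMesh/Data/WPC-6605/{subject}/scans//resources/DICOM/files/ \
    -s 221228 \
    -o /bgfs/adombrovski/DNPL_DataMesh/Data/BIDS/Test_temp_BIDS/ \
    -f /bgfs/adombrovski/DNPL_DataMesh/Data/BIDS/BIDS_Heuristics/learn2explore_clock_heuristic.py \
    --bids notop \
    -c dcm2niix -g accession_number
```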

Seems like the race condition they are avoiding is over those top-level files. For now I will keep the temp-directory implementation without `--bids notop`, so that I can merge my new .tsv, unless the original approach of adding directly to the already-created BIDS directory handles this automatically (which I believe it does). Will update afterwards in case anyone ends up down this rabbit hole in the future.
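To illustrate the idea (this is my own sketch with made-up paths, not heudiconv's actual code; heudiconv does the equivalent in Python), an inter-process file lock like heudiconv.lock serializes writes to shared top-level files, the same way flock(1) does here:

```shell
#!/bin/sh
# Sketch only (not heudiconv's code): mimic how a heudiconv.lock-style
# file lock serializes concurrent writes to shared top-level BIDS files.
OUT=/tmp/bids_lock_demo            # hypothetical output directory
mkdir -p "$OUT"
rm -f "$OUT/participants.tsv"      # start fresh for the demo

add_participant() {
    # flock(1) blocks until heudiconv.lock is free, so two Slurm jobs
    # cannot interleave their appends to participants.tsv.
    flock "$OUT/heudiconv.lock" -c "echo sub-$1 >> $OUT/participants.tsv"
}

add_participant 221228
add_participant 010
cat "$OUT/participants.tsv"        # sub-221228, then sub-010
```

This is why parallel jobs writing into the same output directory contend on the lock, while per-job temp directories (or `--bids notop`) avoid it.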

Merges fine into itself. Temp directory not needed. All finished talking to myself. :slight_smile:

Dear Shane, thank you very much for posting this. I have run into the same issue and am getting the following message:

singularity run --cleanenv -B /data/pt_02333/SOSENS/data/BIDS:/base ./heudiconv-latest.simg -d /base/DICOM/sub-{subject}/ses-{session}/0009_T1_MPR_sag_iso_FIL/*.dcm -o /base/nifti/ -f /base/nifti/code/heuristic.py -s 010 -ss V1 -c dcm2niix -b --overwrite

INFO: Running heudiconv version 0.8.0 latest 0.8.0
INFO: Need to process 1 study sessions
INFO: PROCESSING STARTS: {'subject': '010', 'outdir': '/base/nifti/', 'session': 'V1'}
INFO: Processing 192 dicoms
INFO: Reloading existing filegroup.json because /base/nifti/.heudiconv/010/ses-V1/info/010_ses-V1.edit.txt exists
INFO: Doing conversion using dcm2niix
INFO: Lock 139701978964432 acquired on /base/nifti/heudiconv.lock
INFO: Populating template files under /base/nifti/
INFO: Lock 139701978964432 released on /base/nifti/heudiconv.lock
INFO: PROCESSING DONE: {'subject': '010', 'outdir': '/base/nifti/', 'session': 'V1'}

I only managed to make it work by moving the .heudiconv directory outside of the nifti directory. Is there a different way of getting around this? You mentioned adding the --bids flag, but I did not quite understand that; if you could provide an example I would be very grateful.

Best wishes,

Removing or moving the entire .heudiconv/ directory should not be necessary. If memory serves from when I was experimenting with this, removing individual files within the .heudiconv/ directory was enough. It seems your issue is arising from the line I have quoted above.

My guess here is that you could remove the info directory from that file path and get the same result.
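Something like the following, with /tmp stand-ins for your /base/nifti tree (obviously a guess on my part, so back the directory up first):

```shell
#!/bin/sh
# Sketch: recreate the cached-info layout from the log above, then remove
# the info directory so heudiconv regenerates it on the next run.
BIDS=/tmp/nifti_demo                        # stand-in for /base/nifti
INFO="$BIDS/.heudiconv/010/ses-V1/info"
mkdir -p "$INFO"
touch "$INFO/010_ses-V1.edit.txt"           # the file the log complains about

rm -rf "$INFO"                              # the actual workaround

# the session directory itself is untouched, so nothing else is lost
test -d "$BIDS/.heudiconv/010/ses-V1" && echo "session dir kept"
```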

Hopefully this helps. I am confused about one thing, though: are you attempting to overwrite a previously run session?

This may also help:

The .heudiconv hidden directory

  • The Good: Every time you run conversion to create the nifti files and directories, a record of what you did is recorded in the .heudiconv directory. This includes a copy of the convertall.py module that you ran for each subject and session.
  • The Bad: If you rerun convertall.py for some subject and session that has already been run, heudiconv quietly uses the conversion routines it stored in .heudiconv. This can be really annoying if you are troubleshooting convertall.py.
  • More Good: You can remove subject and session information from .heudiconv and run it fresh. In fact, you can entirely remove the .heudiconv directory and still run the convertall.py you put in the code directory. This will give you a fresh start. It obviously also means you can send someone else the convertall.py for a particular project and they can run it too.

source: https://neuroimaging-core-docs.readthedocs.io/en/latest/pages/heudiconv.html

Dear Shane, thank you for your reply. I am trying to create a new BIDS directory with BOLD and T1w images. I had run a previous session that I wanted to overwrite, but even if I delete all the BIDS files and start new, it gives me this message, converts only the first DICOM files (the BOLD), and then stops.

Alright, then I believe the solution is deleting the auto.txt and edit.txt files, as stated before and confirmed here.

Keep in mind that heudiconv is still in the 0.x.y stage of release, meaning it is undergoing active development. It is likely that --overwrite is not fully implemented and does not account for these auto.txt and edit.txt files.
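A minimal sketch of that cleanup, again with a /tmp stand-in for your /base/nifti tree (adapt the subject/session names to yours):

```shell
#!/bin/sh
# Sketch: delete only the cached auto.txt/edit.txt conversion tables so
# heudiconv re-runs the heuristic instead of reloading stale results.
BIDS=/tmp/nifti_cleanup_demo                 # stand-in for /base/nifti
INFO="$BIDS/.heudiconv/010/ses-V1/info"
mkdir -p "$INFO"
touch "$INFO/010_ses-V1.auto.txt" \
      "$INFO/010_ses-V1.edit.txt" \
      "$INFO/filegroup.json"                 # other cache files stay put

rm -f "$INFO"/*.auto.txt "$INFO"/*.edit.txt  # the actual fix

ls "$INFO"                                   # only filegroup.json is left
```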

Good luck in your endeavors!