fMRIPrep and Slice Timing Correction



Hi NeuroStars,

I have provided slice timing information via JSON files when running fMRIPrep. However, despite this, the HTML report states “Slice timing correction: Not applied” (see attached). Is there a way to ensure slice timing correction is performed? Or am I doing something wrong?

Below I have attached a screenshot of the JSON file, the HTML report, and the commands used when running fMRIPrep.
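In case it is useful, here is a minimal sketch of the sanity check I have been doing on the sidecar by hand. The metadata values below are placeholders, not my actual acquisition parameters; the point is that fMRIPrep looks for a “SliceTiming” array in the BOLD sidecar JSON, with one onset (in seconds) per slice, each within the TR:

```python
# Placeholder sidecar content; real values come from the scanner export
# (e.g. dcm2niix). "SliceTiming" lists the acquisition onset of each
# slice in seconds, one entry per slice, all within the RepetitionTime.
sidecar = {
    "RepetitionTime": 2.0,
    "TaskName": "rest",
    "SliceTiming": [0.0, 1.0, 0.05, 1.05, 0.1, 1.1],
}

def has_usable_slice_timing(meta, n_slices):
    """Return True if SliceTiming exists, matches the slice count,
    and every onset falls within [0, RepetitionTime)."""
    st = meta.get("SliceTiming")
    if not isinstance(st, list) or len(st) != n_slices:
        return False
    return all(0.0 <= t < meta["RepetitionTime"] for t in st)

print(has_usable_slice_timing(sidecar, 6))  # -> True
```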

Thapa :slight_smile:

  1. Please run the bids-validator on the input dataset and make sure there are no warnings concerning slice timing.
  2. Please share the full HTML report (including the figures, which are in a separate folder).


Hi Chris,

Thank you for your reply.

I’ve double-checked my data with the BIDS validator. Below are the steps I took:

  1. Ran the data via the BIDS validator, and got the following errors:

File names:

  2. After reading into the errors, I removed the underscore before “task-”.

  3. I then ran the data through the BIDS validator again. While this removed 2 errors, I still got the following 2 errors:

  4. I have tried renaming the files again based on the BIDS validator documentation, but I seem to get the same errors as shown above.


  5. Further, fMRIPrep seems to recognise the data and run its analysis with or without the underscore removed. However, I get the same message: “Slice timing correction: Not applied”.

So, the questions I now have are:

i) How is fMRIPrep running its analysis on data that cannot be fully validated by the BIDS validator?

ii) Why am I getting a “Task_Name” error despite naming everything following the BIDS documentation?

iii) Lastly, does this error explain why fMRIPrep is not able to apply slice timing correction?

Thank you,



Out of curiosity - why did you use this URL instead of ?

fMRIPrep assumes the input data is BIDS compatible, but never checks this assumption. In the next version, we will start running the validator as part of fMRIPrep.

This error concerns the content of the JSON files not the filenames. See

Nope. I would have to see the reports (please include the figures, which are in a separate folder, when sharing the HTML file) to tell more.


It looks like you’re using multi-echo data – is that correct, @TribikramT ?


@ChrisGorgolewski: Thank you for your answers. Really appreciate your quick replies.

Re: URL vs clone from GitHub: it was more about convenience, as I came across the link (shared in the earlier post) in a previous post regarding BIDS validation. Is there a difference between the two? If yes, I am happy to cross-reference my data using the clone from GitHub.

Re: fMRIPrep and the BIDS validator: I wasn’t aware of this. Thank you for clearing this up. Yes, it would be awesome to include the BIDS validator as part of fMRIPrep.

Re Task Name error: Thank you for clarifying this too. I realised my previous JSON files had “ProtocolName” instead of “TaskName”. After changing this, the only error I get is “DATASET_DESCRIPTION_JSON_MISSING”
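For anyone else hitting “DATASET_DESCRIPTION_JSON_MISSING”: as far as I can tell, it just means there is no dataset_description.json file at the root of the dataset. A minimal one looks like this (the Name value is a placeholder):

```json
{
    "Name": "TMS-fMRI pilot",
    "BIDSVersion": "1.1.1"
}
```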

Result from BIDS validator

Link to figures:

@emdupre That is correct. I am using multi-echo data.

Once again, thank you for your help and replies.



Thanks @TribikramT - I will also need the HTML file.

@emdupre - is slice timing correction turned off for multi echo data?


Yes, slice-time correction is not implemented in master for multi-echo. It is in #1296!



Hi again,

Just in case you’re still encountering this issue @TribikramT , slice-time correction for multi-echo was implemented in 1.2.4! So I would recommend upgrading to the latest release :slight_smile:



@ChrisGorgolewski @emdupre Cool! Thank you for the update. Will give it a go :slight_smile:


@emdupre @ChrisGorgolewski


This took longer than expected. However, I have managed to run fMRIPrep v1.2.5 on ME data, and slice timing correction has been applied! Yay!

Initially, though, I ran into memory issues despite increasing the memory to 12 GB. I think this happened when fMRIPrep tried running the BIDS validation step, because fMRIPrep was only able to process my data once I skipped that step. Please see the error below:



                         ----- 1 sub-PILOT010 -----


Making sure the input data is BIDS compliant (warnings can be ignored in most cases).

<--- Last few GCs --->

[7852:0x367af80]    24038 ms: Mark-sweep 1392.2 (1398.5) -> 1392.2 (1397.5) MB, 5.0 / 0.0 ms  (average mu = 0.985, current mu = 0.002) last resort GC in old space requested

[7852:0x367af80]    24043 ms: Mark-sweep 1392.2 (1397.5) -> 1392.2 (1397.5) MB, 4.8 / 0.0 ms  (average mu = 0.974, current mu = 0.002) last resort GC in old space requested

<--- JS stacktrace --->

==== JS stack trace =========================================

    0: ExitFrame [pc: 0x2beb882dc01d]

Security context: 0x11a55fc63249 <JSObject>

    1: stringSlice(aka stringSlice) [0x1bc12ce1b9d1] [buffer.js:595] [bytecode=0x1bc12ce557b1 offset=91](this=0x198a1cc826f1 <undefined>,buf=0x2b61d02fe9f9 <Uint8Array map = 0x85b17250599>,encoding=0x05a2a02a0bf1 <String[4]: utf8>,start=0,end=49250463)

    2: toString [0x5a2a029b749] [buffer.js:668] [bytecode=0x1bc12ce552a1 offset=145](this=0x2b61d02fe9f9 <U...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory

 1: 0x8d20d0 node::Abort() [bids-validator]

 2: 0x8d211c  [bids-validator]

 3: 0xb02b6e v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [bids-validator]

 4: 0xb02da4 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [bids-validator]

 5: 0xef02e2  [bids-validator]

 6: 0xeffb4f v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [bids-validator]

 7: 0xec7e35  [bids-validator]

 8: 0xecf79b v8::internal::Factory::NewRawTwoByteString(int, v8::internal::PretenureFlag) [bids-validator]

 9: 0xed0072 v8::internal::Factory::NewStringFromUtf8(v8::internal::Vector<char const>, v8::internal::PretenureFlag) [bids-validator]

10: 0xb10979 v8::String::NewFromUtf8(v8::Isolate*, char const*, v8::NewStringType, int) [bids-validator]

11: 0x994878 node::StringBytes::Encode(v8::Isolate*, char const*, unsigned long, node::encoding, v8::Local<v8::Value>*) [bids-validator]

12: 0x8edf32  [bids-validator]

13: 0xb8b32f  [bids-validator]

14: 0xb8be99 v8::internal::Builtin_HandleApiCall(int, v8::internal::Object**, v8::internal::Isolate*) [bids-validator]

15: 0x2beb882dc01d

Traceback (most recent call last):

  File "/usr/local/miniconda/bin/fmriprep", line 11, in <module>


  File "/usr/local/miniconda/lib/python3.6/site-packages/fmriprep/cli/", line 353, in main

    validate_input_dir(exec_env, opts.bids_dir, opts.participant_label)

  File "/usr/local/miniconda/lib/python3.6/site-packages/fmriprep/cli/", line 534, in validate_input_dir

    subprocess.check_call(['bids-validator', bids_dir, '-c', temp.name])

  File "/usr/local/miniconda/lib/python3.6/", line 291, in check_call

    raise CalledProcessError(retcode, cmd)

subprocess.CalledProcessError: Command '['bids-validator', '/home/ttha0011/kg98/Thapa/TMS-fMRI_Project/Pilot_testing/MEICAvfMRIPrep/fMRIPrep/rawdata', '-c', '/tmp/tmplep73xbs']' died with <Signals.SIGABRT: 6>.

Sentry is attempting to send 1 pending error messages

Waiting up to 2.0 seconds

Press Ctrl-C to quit

                         ----- DONE ---- 
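For completeness, here is roughly how I skipped the validation step. The paths are placeholders, the flag spelling may differ between fMRIPrep versions, and raising Node’s heap via NODE_OPTIONS is an alternative I have not verified inside the container:

```shell
# Placeholder paths; adjust to your own layout.
# NODE_OPTIONS raises the heap available to the bundled bids-validator
# (when running via Docker it would need to be passed in with -e, an
# assumption I have not tested).
export NODE_OPTIONS="--max-old-space-size=4096"

# --skip-bids-validation bypasses fMRIPrep's built-in validation step;
# some releases spell it --skip_bids_validation. Composed here rather
# than executed:
cmd="fmriprep /data/rawdata /data/derivatives participant --skip-bids-validation"
echo "$cmd"
```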

P.S. I’m happy to raise this as a separate issue in a different post.

Thank you both for your help :slight_smile:



This is unusual. How large is your input dataset (in GB and number of files)?
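If it helps, a quick way to tot that up with the standard library; the demo directory created below is just a stand-in for the real BIDS root:

```python
import os
import tempfile

def dataset_summary(root):
    """Return (file_count, total_bytes) for everything under root."""
    n_files, n_bytes = 0, 0
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            n_files += 1
            n_bytes += os.path.getsize(os.path.join(dirpath, name))
    return n_files, n_bytes

# Demo on a throwaway directory standing in for the BIDS root:
demo_root = tempfile.mkdtemp()
os.makedirs(os.path.join(demo_root, "sub-01"))
with open(os.path.join(demo_root, "sub-01", "bold.json"), "w") as f:
    f.write('{"RepetitionTime": 2.0}')
print(dataset_summary(demo_root))  # -> (1, 23)
```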


Hi @ChrisGorgolewski

Does the image below answer your question?


Is that it? There is only one participant in the input dataset?


Yes, that is correct. There is only one participant, as we’re still pilot testing some aspects of the project.


Any chance you could share the dataset so I could try to recreate the issue?


Sure! Is there a way I could send you the files?


Perhaps upload them to Dropbox or Google drive?


Hi @ChrisGorgolewski

Here’s the link:

Do let me know if you have trouble accessing the data :slight_smile:


I was not able to replicate it with 1.2.5 in Docker limited to 4 GB. What OS and Docker versions are you using? What’s the content of