Summary of what happened:
When trying to run FitLins with any model (a lightly modified example model, a model generated with the pyBIDS auto-model, etc.), FitLins crashes while loading the BIDS model with the error message:
NotImplementedError: 'Dict' nodes are not implemented
Command used (and if a helper script was used, a link to the helper script or the command generated):
apptainer run --cleanenv --home /arc/burst/st-toddrebe-1/ \
/arc/project/st-toddrebe-1/software_envs/fitlins/fitlins_latest.sif \
"${BIDS_dir}" \
"${out_dir}" \
dataset \
-vvv \
-m "${model}" \
-d "${BIDS_dir}"/derivatives/fmriprep \
--space "MNI152NLin6Asym" \
--n-cpus "${num_proc}" \
-w "${work_dir}"
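The command above references several shell variables set by a helper script. For reproducibility, hypothetical values would look like the following (all paths and the model filename here are assumptions for illustration, not taken from the original script):

```shell
# Hypothetical values for the variables used in the command above -- adjust to your system.
BIDS_dir=/arc/project/st-toddrebe-1/data/bids
out_dir="${BIDS_dir}/derivatives/fitlins"
model="${BIDS_dir}/models/model-mixedmotivation_smdl.json"
num_proc=8
work_dir=/scratch/st-toddrebe-1/fitlins_work
echo "Model: ${model}"
```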
Version:
0.11.0
Environment (Docker, Singularity / Apptainer, custom installation):
Apptainer via HPC system
Data formatted according to a validatable standard? Please provide the output of the validator:
BIDS-formatted dataset with a derivatives folder from fMRIPrep. The first warning below did not cause any problems for fMRIPrep. The data dictionaries were causing other, unrelated issues, so I did not include them - hence the second warning.
1: [WARN] Not all subjects/sessions/runs have the same scanning parameters. (code: 39 - INCONSISTENT_PARAMETERS)
2: [WARN] Tabular file contains custom columns not described in a data dictionary (code: 82 - CUSTOM_COLUMN_WITHOUT_DESCRIPTION)
Relevant log outputs (up to 20 lines):
Traceback (most recent call last):
File "/opt/miniconda-latest/envs/neuro/lib/python3.9/site-packages/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
result["result"] = node.run(updatehash=updatehash)
File "/opt/miniconda-latest/envs/neuro/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 527, in run
result = self._run_interface(execute=True)
File "/opt/miniconda-latest/envs/neuro/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 645, in _run_interface
return self._run_command(execute)
File "/opt/miniconda-latest/envs/neuro/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 771, in _run_command
raise NodeExecutionError(msg)
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node loader.
...
File "/opt/miniconda-latest/envs/neuro/lib/python3.9/site-packages/pandas/core/computation/expr.py", line 263, in f
raise NotImplementedError(f"'{node_name}' nodes are not implemented")
NotImplementedError: 'Dict' nodes are not implemented
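For context, this exception comes from pandas' expression parser (reached through pybids when the model is applied to the variables dataframe): the `eval`/`query` mini-language does not support Python dict literals, so any expression string containing one raises exactly this error. A minimal reproduction outside of FitLins, assuming only pandas:

```python
import pandas as pd

df = pd.DataFrame({"x": [1, 2, 3]})
try:
    # A dict literal in the expression string is an unsupported AST node,
    # which raises the same NotImplementedError seen in the traceback.
    df.eval("{'a': 1}")
    err = None
except NotImplementedError as exc:
    err = str(exc)
print(err)  # 'Dict' nodes are not implemented
```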
Screenshots / relevant information:
Model file:
{
"Name": "MixedMotivation",
"BIDSModelVersion": "1.0.0",
"Input": {
"subject": [
"010",
"011",
"012"
],
"task": "ADAPTAVOID"
},
"Nodes": [
{
"Level": "Run",
"Name": "run_level",
"GroupBy": [
"run",
"subject"
],
"Model": {
"X": [
1,
"trial_type.GoAppetitive",
"trial_type.GoAversive",
"trial_type.NoGoAppetitive",
"trial_type.NoGoAversive"
],
"Type": "glm"
},
"Contrasts": [
{
"Name": "AppetitiveAversiveGo",
"ConditionList": [
"trial_type.GoAppetitive",
"trial_type.GoAversive"
],
"Weights": [
1,
-1
],
"Test": "t"
}
]
},
{
"Level": "Subject",
"Name": "subject_level",
"GroupBy": [
"subject",
"contrast"
],
"Model": {
"X": [
1
],
"Type": "meta"
},
"DummyContrasts": {
"Test": "t"
}
},
{
"Level": "Dataset",
"Name": "one-sample_dataset",
"GroupBy": [
"contrast"
],
"Model": {
"X": [
1
],
"Type": "glm"
},
"DummyContrasts": {
"Test": "t"
}
}
]
}
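One thing worth ruling out with model files like this: `json.load` accepts any key names, so a typo in a key (for example, a stray comma typed into `"Type"`) parses silently and only surfaces later inside pybids. A quick sanity check against a hand-written subset of the BIDS Stats Models vocabulary can catch that (this key list is my own assumption, not the official schema):

```python
import json

# Partial, hand-written subset of allowed keys -- not the official schema,
# just enough to flag obvious typos in a model file.
NODE_KEYS = {"Level", "Name", "GroupBy", "Transformations", "Model",
             "Contrasts", "DummyContrasts"}
MODEL_KEYS = {"X", "Type", "Formula", "HRF", "Variance", "Options", "Software"}

def find_unknown_keys(model):
    """Yield (node_name, key) pairs for keys outside the expected sets."""
    for node in model.get("Nodes", []):
        for key in node:
            if key not in NODE_KEYS:
                yield node.get("Name"), key
        for key in node.get("Model", {}):
            if key not in MODEL_KEYS:
                yield node.get("Name"), key

# Demo on an inline snippet with a stray comma typed into the "Type" key:
spec = json.loads('{"Nodes": [{"Level": "Run", "Name": "run_level", '
                  '"Model": {"X": [1], ",Type": "glm"}}]}')
unknown = list(find_unknown_keys(spec))
print(unknown)  # [('run_level', ',Type')]
```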
Events file example:
| onset | duration | task | run | trial_type | response_time | corr_resp | rt_criterion | rt_mean | outcome | num_presses_centered | outcome_type |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 19.5338435 | 6.012443701 | ADAPTAVOID | 01 | GoAversive | 20.0594266 | 0.0 | 20.7451566 | 20.9974999 | 31.5606178 | 5.052631579 | Go_punishment |
| 36.1745694 | 5.9803594 | ADAPTAVOID | 01 | GoAversive | n/a | 0.0 | n/a | 37.6382258 | 48.8014002 | n/a | Go_punishment |
| 55.3823226 | 6.0251623 | ADAPTAVOID | 01 | GoAppetitive | 55.9894561 | 0.0 | 56.6296086 | 56.845979 | 64.4120456 | 6.052631579 | other |
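Also worth checking is how the `n/a` entries are parsed: per the BIDS spec, `n/a` marks a missing value, and pandas (which pybids uses to read events files) treats it as NaN. A minimal sketch with a trimmed, hand-copied excerpt of the rows above:

```python
import io
import pandas as pd

# Trimmed, tab-separated excerpt of the events file above (most columns omitted)
tsv = (
    "onset\tduration\ttrial_type\tresponse_time\n"
    "19.5338435\t6.012443701\tGoAversive\t20.0594266\n"
    "36.1745694\t5.9803594\tGoAversive\tn/a\n"
)
# "n/a" is already in pandas' default NA values; passing na_values just makes
# the BIDS missing-value convention explicit.
events = pd.read_csv(io.StringIO(tsv), sep="\t", na_values="n/a")
print(events["response_time"].isna().sum())  # 1
```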
I’m just trying to test a simple model on a few participants to debug quickly, and I have gotten the same error with different contrasts, HRF setups, etc. Any help or suggestions would be greatly appreciated, thank you!