Problem running datalad in Google Colab

Hello colleagues,

I'm relatively new to analyzing neuroimaging data in Python. I'm currently following the DartBrains tutorials to get started. However, I ran into an issue when downloading the data as suggested here:

https://dartbrains.org/content/Download_Data.html

When trying to import the Python API for datalad (import datalad.api as dl) I get the following error message:

RuntimeError                              Traceback (most recent call last)
<ipython-input-...> in <module>()
----> 1 import datalad.api as dl

7 frames
/usr/lib/python3.7/asyncio/base_events.py in _check_running(self)
    521     def _check_running(self):
    522         if self.is_running():
--> 523             raise RuntimeError('This event loop is already running')
    524         if events._get_running_loop() is not None:
    525             raise RuntimeError(

RuntimeError: This event loop is already running

This unfortunately prevents me from further following a lot of the tutorials.

For the installation of datalad I followed the instructions provided here:

http://neuro.debian.net/install_pkg.html?p=datalad

When using the platform.platform() command I see that my Colab notebook runs the following version: 'Linux-5.4.144+-x86_64-with-Ubuntu-18.04-bionic'
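For reference, this is how I checked which release the runtime is based on (platform is part of the Python standard library; the output is specific to my runtime):

import platform
platform.platform()   # -> 'Linux-5.4.144+-x86_64-with-Ubuntu-18.04-bionic' on my runtime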

This all resulted in the following attempt to install datalad:

!wget -O- http://neuro.debian.net/lists/bionic.de-md.libre | sudo tee /etc/apt/sources.list.d/neurodebian.sources.list

!sudo apt-key adv --recv-keys --keyserver hkps://keyserver.ubuntu.com 0xA5D32F012649A5A9

!sudo apt-get update

!sudo apt-get install datalad
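To verify the installation before moving on, the standard version checks can be run from the notebook (both are stock datalad/git-annex commands, nothing Colab-specific):

!datalad --version    # confirms the datalad CLI is on the PATH
!git annex version    # datalad relies on git-annex for retrieving file content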

datalad seems to get installed just fine. However, when I then run import datalad.api as dl, I get the error message shown above.

I am really struggling to get this to run and would greatly appreciate any tips and recommendations!

Thank you very much!

For those who might be interested: I managed to work around the problem with the following solution:

!pip install tensorflow_federated  # the workaround I found; installs nest_asyncio along the way
import tensorflow as tf            # not needed for datalad itself, just part of the copied snippet
import tensorflow_federated as tff
import nest_asyncio
nest_asyncio.apply()               # lets the already-running Colab event loop be re-entered
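In hindsight, nest_asyncio seems to be the only piece that actually matters here, so installing it directly should work as well (a minimal sketch; I have not verified that the TensorFlow packages are truly unnecessary):

!pip install nest_asyncio

import nest_asyncio
nest_asyncio.apply()       # patch the running event loop before importing datalad

import datalad.api as dl   # should now import without the RuntimeError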

However, continuing with the Download Data tutorial on DartBrains, I now get another error message when trying to actually retrieve the files I want using datalad.

When following the instructions from the tutorial, everything seems to work fine until I try to execute ds.get(file_list[0]), where ds = dl.Dataset(localizer_path), dl is datalad.api, and file_list is a list of strings with the file paths of the participants' preprocessed NIfTI files.
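For context, the steps leading up to that call look roughly like this (a sketch reconstructed from the tutorial; the GIN source URL and the localizer_path value are my assumptions, so check them against the tutorial page):

import glob
import os
import datalad.api as dl

localizer_path = '/data/Localizer'  # assumed path, matching the error message below

# Clone the dataset metadata (no file content is downloaded yet);
# the source URL is reproduced from memory and may differ from the tutorial.
ds = dl.clone(source='https://gin.g-node.org/ljchang/Localizer', path=localizer_path)

# Collect the preprocessed NIfTI files of all participants.
file_list = glob.glob(os.path.join(localizer_path, 'derivatives', 'fmriprep',
                                   'sub-*', 'func', '*desc-preproc_bold.nii.gz'))
file_list.sort()

ds.get(file_list[0])  # this is the call that fails below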

When trying to execute ds.get(file_list[0]) I get the following error message:

[ERROR] git-annex: user error (git ["--git-dir=.git","--work-tree=.","--literal-pathspecs","-c","annex.dotfiles=true","-c","annex.retry=3","commit-tree","fcf8c179e07afbe0fe2d0bcf5eea1cb8000d5e0a","--no-gpg-sign","-p","refs/heads/git-annex"] exited 128) [get(/data/Localizer/derivatives/fmriprep/sub-S01/func/sub-S01_task-localizer_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz)]

IncompleteResultsError                    Traceback (most recent call last)
<ipython-input-...> in <module>()
----> 1 result = ds.get(file_list[0])

3 frames
/usr/lib/python3/dist-packages/datalad/interface/utils.py in generator_func(*_args, **_kwargs)
    459                     raise IncompleteResultsError(
    460                         failed=incomplete_results,
--> 461                         msg="Command did not complete successfully")
    462 
    463             if return_type == 'generator':

IncompleteResultsError: Command did not complete successfully. 1 failed:
[{'action': 'get',
  'annexkey': 'MD5E-s112128274--6cd22d5f4bcc1f5179ed8db4188bf888.nii.gz',
  'message': 'git-annex: user error (git '
             '["--git-dir=.git","--work-tree=.","--literal-pathspecs","-c","annex.dotfiles=true","-c","annex.retry=3","commit-tree","fcf8c179e07afbe0fe2d0bcf5eea1cb8000d5e0a","--no-gpg-sign","-p","refs/heads/git-annex"] '
             'exited 128)',
  'path': '/data/Localizer/derivatives/fmriprep/sub-S01/func/sub-S01_task-localizer_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz',
  'refds': '/data/Localizer',
  'status': 'error',
  'type': 'file'}]

What am I doing wrong?

For those who might still be interested.

The problem arose because the credentials for Git/GitHub had to be specified. I had forgotten to do so, which made it impossible to download the data via datalad.
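If I reconstruct it correctly, the commit-tree call exited with 128 because the fresh Colab runtime had no Git commit identity configured; a sketch of setting it before calling ds.get (the email and name values are placeholders):

!git config --global user.email "you@example.com"   # placeholder, use your own
!git config --global user.name "Your Name"          # placeholder, use your own

result = ds.get(file_list[0])   # should succeed once the identity is configured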