jdkent, that sounds really cool! I think every additional medium is very welcome, and I’m happy to help where I can.
Awesome, thanks for being on board!
Also, I don’t know Datacamp yet, but there’s the possibility to upgrade one’s account to a paid version. Is this necessary for the suggested Nipype tutorial? I ask because I cannot access the proof-of-concept link, and it’s not clear to me why. Free and open access to the content would be best.
The course should be free and open, but right now it’s in development (they have a special link-sharing mechanism for courses in development). I got a 404 when I tried the link too, but when I logged into my account, went to edit the course, and then tried the link again, it worked. Maybe the link is only kept alive for courses that are actively being developed (e.g. within the past day)?
Additionally, the current nipype_tutorial docker image is rather large (above 8 GB). Is it possible to use docker images to create the computation environment on Datacamp, or can we only use Python software packages?
This is where I’m currently having some issues. Datacamp uses Docker, but we can only use their base images (which currently appear to be based on Ubuntu xenial 16.04). Our only (apparent) way to modify the image is through the `requirements.sh` script. The boilerplate text inside the script suggests all they want is for the developer to add `pip install` commands, but I found I can install FSL, AFNI, and other packages through `apt-get` as long as the commands are in `requirements.sh`. However, I have not found a way to make the fsl/afni/etc. binaries accessible (via `$PATH`) during the build, or really to set any environment variables at all. In a Dockerfile we can use `ENV` to set environment variables, but I don’t see how I can get the same functionality within `requirements.sh`. Right now there is a Dockerfile in the github repository (I put it there to test it), but it is not used during the build.
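To illustrate the `$PATH` problem, here’s a minimal sketch of what I believe is happening (how exactly DataCamp invokes `requirements.sh` is my assumption, and the `/opt/fsl/bin` location is purely illustrative): an `export` made inside a child script dies with that script, so exporting the binary location inside `requirements.sh` can’t make FSL/AFNI visible to later shells, unlike a Dockerfile `ENV` instruction, which persists in the image.

```shell
#!/bin/sh
# Hypothetical stand-in for DataCamp's requirements.sh: install software
# and try to put its binaries on $PATH. (The /opt/fsl path is an
# illustrative assumption, not DataCamp's actual layout.)
cat > /tmp/requirements_demo.sh <<'EOF'
#!/bin/sh
# apt-get install -y fsl-core         # installs fine from requirements.sh
export PATH="/opt/fsl/bin:$PATH"      # only lives as long as this script
echo "inside script, PATH=$PATH"
EOF
chmod +x /tmp/requirements_demo.sh

# The script runs as a child process, so its exports vanish afterwards:
/tmp/requirements_demo.sh
case "$PATH" in
  */opt/fsl/bin*) echo "parent PATH was modified" ;;
  *)              echo "parent PATH unchanged" ;;
esac

# A possible workaround (assumption: the shells DataCamp starts later are
# login shells that source /etc/profile.d/*.sh): persist the export in a
# profile snippet instead of relying on the export itself.
echo 'export PATH="/opt/fsl/bin:$PATH"' > /tmp/fake_profile_d_fsl.sh
```

Whether the profile.d workaround helps depends entirely on how DataCamp launches the exercise shells, which I haven’t been able to verify yet.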