Openneuro-cli problem

Hi @alexfoias

Darn! It appears you ran into the limits of the soft fixes we had put in place previously. Those fixes were deployed to help the publication submission go through. We are working on enhancements to our underlying storage configuration that are poised to alleviate the timeout issues (the case here). I apologize for the inconvenience.

It appears the dataset rendering has resolved.

Thank you,
Franklin

Hello @franklin,

I don’t know if this is the right thread to report this, but I’m having issues with both the browser and the CLI upload.
I’m trying to upload a new dataset (it’s still a draft). It took approximately 18 hours for the browser upload to finish.
When I checked the dataset, I saw that only one subject had been partially uploaded, and I got an error message saying the dataset wasn’t valid anymore due to missing files.

Assuming the upload went wrong (I wasn’t at the screen when it finished), I tried to use the CLI to finish the upload. I installed it using npm (node v10.19.0, npm v6.14.4, openneuro v3.20.0).
I logged in with the API key and didn’t get any message back (I think it went OK though, because I tried to download a draft dataset of mine and it worked). Just in case, I tried another couple of times with new API keys, once even as a super user.

When I start the upload, it gets stuck at 0 for a very long time.
This is what I get from my terminal:

$ openneuro upload --dataset ds003192 -i .
Adding files to "ds003192"
bids-validator@1.5.4

        Summary:                   Available Tasks:                           Available Modalities: 
        1880 Files, 74.66GB        TODO: full task name for breathhold        T2w                   
        7 - Subjects                                                          T1w                   
        10 - Sessions                                                         bold                  
                                                                              physio                
                                                                              sbref                 
                                                                              fieldmap              


        If you have any questions, please post on https://neurostars.org/tags/bids.

=======================================================================
1686 files to be uploaded with a total size of 67.3 GB
? Begin upload? Yes
=======================================================================
Starting a new upload (6e385bc8-8254-42ca-87da-26a085ce6352) to dataset: 'ds003192'
ds003192 [----------------------------------------] 0% | ETA: 0s | 0/1686(node:2495066) UnhandledPromisen
    at Proxy.uploadParallel (/usr/local/lib/node_modules/openneuro-cli/node_modules/openneuro-client/src)
    at uploadFiles (/usr/local/lib/node_modules/openneuro-cli/src/upload.js:161:17)
    at uploadDataset (/usr/local/lib/node_modules/openneuro-cli/src/actions.js:74:11)
    at process._tickCallback (internal/process/next_tick.js:68:7)
(node:2495066) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated eith)
(node:2495066) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future,.
ds003192 [----------------------------------------] 0% | ETA: 508s | 11/1686

Am I missing something?

Hi @smoia

Thank you for your message! This may be caused by an out-of-date Node package. Could you please try with Node 12?

Thank you,
Franklin
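As an aside, a quick programmatic way to confirm the runtime meets this requirement is to check the Node version from within Node itself. This is a minimal sketch, not part of openneuro-cli; the threshold of 12 comes from the advice above:

```javascript
// Minimal sketch: check that the running Node major version is 12 or newer.
// (Illustrative only; openneuro-cli does not ship this check itself.)
const major = Number(process.versions.node.split(".")[0]);
if (major >= 12) {
  console.log("Node version OK: " + process.versions.node);
} else {
  console.log("Node " + process.versions.node + " is too old; please upgrade to Node 12+");
}
```

Running it under the old v10.19.0 installation would print the "too old" message; after upgrading, it should report the version as OK.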

Hi @franklin,

Thank you so much, the Node update (v12.18.4) solved that issue!

However, now I do see this issue quite often:

Error [ERR_STREAM_PREMATURE_CLOSE]: Premature close
    at ClientHttp2Stream.onclose (internal/streams/end-of-stream.js:80:15)
    at ClientHttp2Stream.emit (events.js:327:22)
    at ClientHttp2Stream.EventEmitter.emit (domain.js:483:12)
    at closeStream (internal/http2/core.js:1727:14)
    at Http2Stream.onStreamClose (internal/http2/core.js:505:5) {
  code: 'ERR_STREAM_PREMATURE_CLOSE'
}

Are my files too big?
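For context, ERR_STREAM_PREMATURE_CLOSE is a generic Node error meaning a stream (here, an HTTP/2 upload stream) was closed before it finished, which usually points to a transient network or server-side interruption rather than file size. Since the upload apparently kept going after each occurrence, the failing transfers seem to be retried or skipped. A generic retry wrapper for a transient error like this might look as follows (purely illustrative; this is not openneuro-cli's actual code):

```javascript
// Illustrative only: retry an async operation when it fails with a
// transient stream error, giving up after a fixed number of attempts.
async function withRetry(operation, attempts = 3) {
  for (let i = 1; i <= attempts; i++) {
    try {
      return await operation();
    } catch (err) {
      const transient = err.code === "ERR_STREAM_PREMATURE_CLOSE";
      if (!transient || i === attempts) throw err; // rethrow non-transient or final failure
      console.error(`attempt ${i} failed (${err.code}); retrying...`);
    }
  }
}
```

The point of the sketch is that a premature close on one file need not abort the whole upload, which matches the behavior seen in the terminal output above.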

Hi @smoia

Good to hear we were able to resolve that one!

Could you please clarify when this message appears? Did the error occur during the upload process? Is there an error log associated with it (e.g. an error code)?

Thank you,
Franklin

Hello @franklin
This error was (repeatedly) received during the upload, which finished about three hours ago. There is no error log that I can see, though — I looked in the dataset folder, in the folder I ran the process from, and in the home folder.

The website is not showing the updated dataset yet, though. I imagine it’s due to the delay that was discussed in earlier posts in this topic, so I’ll wait a couple of days to see if my files were correctly uploaded.

Cheers,
Stefano

Hi @smoia

Just to clarify: while the upload is in progress, do you see a file being uploaded, then this error message appear, but the upload continue? I am trying to piece together how this message comes up.

It looks like I was able to get through a stuck commit. There is another subject there along with a session folder (I presume done through the upload directory button).

Thank you,
Franklin

Hi @franklin

Indeed: while the upload is in progress, I see a file being uploaded, then this error message appears, AND the upload continues.

Technically, this dataset should have seven subjects with ten sessions each. There should not be “free” session folders.

Here is the output of some ls commands run from the folder I’m uploading.

~/Data/EuskalIBUR$ ls *
CHANGES  dataset_description.json  participants.tsv  task-breathhold_bold.json

sub-001:
ses-01  ses-02  ses-03  ses-04  ses-05  ses-06  ses-07  ses-08  ses-09  ses-10

sub-002:
ses-01  ses-02  ses-03  ses-04  ses-05  ses-06  ses-07  ses-08  ses-09  ses-10

sub-003:
ses-01  ses-02  ses-03  ses-04  ses-05  ses-06  ses-07  ses-08  ses-09  ses-10

sub-004:
ses-01  ses-02  ses-03  ses-04  ses-05  ses-06  ses-07  ses-08  ses-09  ses-10

sub-007:
ses-01  ses-02  ses-03  ses-04  ses-05  ses-06  ses-07  ses-08  ses-09  ses-10

sub-008:
ses-01  ses-02  ses-03  ses-04  ses-05  ses-06  ses-07  ses-08  ses-09  ses-10

sub-009:
ses-01  ses-02  ses-03  ses-04  ses-05  ses-06  ses-07  ses-08  ses-09  ses-10


~/Data/EuskalIBUR$ ls */*
sub-001/ses-01:
anat  fmap  func

sub-001/ses-02:
fmap  func

[...]

~/Data/EuskalIBUR$ ls sub-001/ses-01/*
sub-001/ses-01/anat:
sub-001_ses-01_acq-inv1_T1w.json    sub-001_ses-01_acq-inv2_T1w.json    sub-001_ses-01_acq-uni_T1w.json    sub-001_ses-01_T2w.json
sub-001_ses-01_acq-inv1_T1w.nii.gz  sub-001_ses-01_acq-inv2_T1w.nii.gz  sub-001_ses-01_acq-uni_T1w.nii.gz  sub-001_ses-01_T2w.nii.gz

sub-001/ses-01/fmap:
sub-001_ses-01_acq-breathhold_dir-AP_epi.json  sub-001_ses-01_acq-breathhold_dir-AP_epi.nii.gz  sub-001_ses-01_acq-breathhold_dir-PA_epi.json  sub-001_ses-01_acq-breathhold_dir-PA_epi.nii.gz

sub-001/ses-01/func:
sub-001_ses-01_task-breathhold_echo-1_bold.json    sub-001_ses-01_task-breathhold_echo-5_bold.json                   sub-001_ses-01_task-breathhold_rec-magnitude_echo-3_sbref.json
sub-001_ses-01_task-breathhold_echo-1_bold.nii.gz  sub-001_ses-01_task-breathhold_echo-5_bold.nii.gz                 sub-001_ses-01_task-breathhold_rec-magnitude_echo-3_sbref.nii.gz
sub-001_ses-01_task-breathhold_echo-2_bold.json    sub-001_ses-01_task-breathhold_physio.json                        sub-001_ses-01_task-breathhold_rec-magnitude_echo-4_sbref.json
sub-001_ses-01_task-breathhold_echo-2_bold.nii.gz  sub-001_ses-01_task-breathhold_physio.tsv.gz                      sub-001_ses-01_task-breathhold_rec-magnitude_echo-4_sbref.nii.gz
sub-001_ses-01_task-breathhold_echo-3_bold.json    sub-001_ses-01_task-breathhold_rec-magnitude_echo-1_sbref.json    sub-001_ses-01_task-breathhold_rec-magnitude_echo-5_sbref.json
sub-001_ses-01_task-breathhold_echo-3_bold.nii.gz  sub-001_ses-01_task-breathhold_rec-magnitude_echo-1_sbref.nii.gz  sub-001_ses-01_task-breathhold_rec-magnitude_echo-5_sbref.nii.gz
sub-001_ses-01_task-breathhold_echo-4_bold.json    sub-001_ses-01_task-breathhold_rec-magnitude_echo-2_sbref.json
sub-001_ses-01_task-breathhold_echo-4_bold.nii.gz  sub-001_ses-01_task-breathhold_rec-magnitude_echo-2_sbref.nii.gz

[...]

Should I try the upload again?

Hi @smoia

Thank you for sharing the directory structure and additional information! Could you please update to the latest version of the CLI and try the upload again? It may be something particular to that version.

Thank you,
Franklin

Hi @franklin,
I don’t want to jinx it, but with openneuro 3.21.0 the upload seems to be working. If I encounter further issues I’ll let you know.

Just one question: the website front-end might take some time to show the updated dataset, right?

Hi @smoia

Good to hear!

Yep, it may take a little bit.

Thank you,
Franklin

Hi @franklin,
I’d say enough days have passed since I uploaded my files to OpenNeuro. Unfortunately, I can’t see the updated version on the website: it still shows the dataset as it was before my last upload.

During the last upload I didn’t encounter any problem.

What can I do? Should I start adding only one subject at a time?

Hi @smoia

Thank you for following up! We have been making several updates to the CLI to resolve the root issues. Could you please update to the latest version of the CLI and upload to your dataset again? Perhaps start with one subject to confirm our patch.

Thank you,
Franklin

Hello @franklin!
Unfortunately, I still have issues with the dataset and the CLI. This time, it uploaded one subject out of five.
Anything else I can do to try and get my dataset online?
Should I try to delete all the files from the dataset, or make a new one?

Hi @smoia

Darn — thank you for testing! We have been addressing related CLI issues. Could you please try this suggestion?

Another approach would be to upload as a new dataset and see how the CLI responds.

Thank you,
Franklin

Hello Franklin,

I tried again with the upload following the suggestions you linked.

It seems to have worked (woo-hoo!), but I also got this message:

./node_modules/.bin/openneuro upload -d ds003192 -i -v ~/Data/EuskalIBUR
Adding files to "ds003192"
bids-validator@1.5.7

        Summary:                   Available Tasks:        Available Modalities: 
        1880 Files, 74.66GB        Breath-holds            T2w                   
        7 - Subjects                                       T1w                   
        10 - Sessions                                      bold                  
                                                           physio                
                                                           sbref                 
                                                           fieldmap              


	If you have any questions, please post on https://neurostars.org/tags/bids.

=======================================================================
1879 files to be uploaded with a total size of 74.7 GB
? Begin upload? Yes
=======================================================================
Starting a new upload (439373bc) to dataset: 'ds003192'
ds003192 [========================================] 100% | ETA: 0s | 1879/1879
(node:2297945) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'end' of null
    at uploadDataset (/home/nemo/Data/openneuro/node_modules/openneuro-cli/src/actions.js:95:49)
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:93:5)
(Use `node --trace-warnings ...` to show where the warning was created)
(node:2297945) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:2297945) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

Thank you for your help!

Cheers,
Stefano
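For what it's worth, the `Cannot read property 'end' of null` in the trace above suggests the CLI tried to call `.end()` on a handle that was never initialized, after the transfer itself had completed. A defensive guard for that general pattern looks like this (illustrative only; the function and parameter names are hypothetical, not openneuro-cli's actual internals):

```javascript
// Hypothetical guard for the failure mode in the trace above:
// only call .end() when the handle actually exists and supports it.
function closeIfOpen(handle) {
  if (handle && typeof handle.end === "function") {
    handle.end();
    return true;
  }
  return false; // nothing to close; avoids "Cannot read property 'end' of null"
}
```

Because the crash happens after the progress bar reaches 100%, it is plausibly a cleanup/readout bug rather than a failure of the upload itself, which matches Franklin's reading below.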

Hi @smoia

Thank you for your message and for sending over the error! One thing to clarify: the data appear to have been uploaded successfully? This error may come from the final status readout rather than from the upload itself.

Thank you,
Franklin

Hi @franklin,

The data appear to be uploaded. I didn’t try to download the dataset, but at least the validator is passing — and that’s a first. All subjects and sessions seem to be there, and the size of the dataset is similar to my local copy.

For future reference: I deleted all files and folders from the uploaded dataset (through the web interface), followed the suggestion you indicated earlier, and re-uploaded the dataset after making sure I wasn’t getting any errors from the BIDS validator.

Thank you again for your help!

Hi @smoia

Great to hear!

Thank you,
Franklin

Hello,

Thank you for your work on openneuro-cli! I’ve previously been able to use the command to update my dataset, but recently I got an error after passing BIDS validation (which I can’t reproduce). I therefore tried updating the relevant software and followed the suggestions in this and other threads, but I’m still unable to upload anything. I’m unsure where the problem is and how to update properly. I’ve pasted my error trail below. Please let me know if you have any suggestions for getting things working again, or if you need any further information to help pin down the problem.

Many thanks,
Nicole

$ node -v
v14.15.1
$ npm -v
6.14.9
# uninstall openneuro-cli to start from scratch
$ sudo npm uninstall -g openneuro-cli
removed 926 packages in 8.676s
# check that uninstalled
$ openneuro -V
-bash: /usr/local/bin/openneuro: No such file or directory
# attempt to re-install
$ sudo npm install -g openneuro-cli
npm WARN deprecated resolve-url@0.2.1: https://github.com/lydell/resolve-url#deprecated
npm WARN deprecated urix@0.1.0: Please see https://github.com/lydell/urix#deprecated
npm WARN deprecated request@2.88.2: request has been deprecated, see https://github.com/request/request/issues/3142
npm WARN deprecated har-validator@5.1.5: this library is no longer supported
npm WARN deprecated mkdirp-promise@5.0.1: This package is broken and no longer maintained. 'mkdirp' itself supports promises now, please switch to that.
/usr/local/bin/openneuro -> /usr/local/lib/node_modules/openneuro-cli/src/index.js
npm WARN @octokit/plugin-request-log@1.0.2 requires a peer of @octokit/core@>=3 but none is installed. You must install peer dependencies yourself.
+ openneuro-cli@3.25.3
added 1009 packages from 552 contributors and updated 4 packages in 42.846s
# seems like something has gone wrong
$ openneuro -V
/usr/local/lib/node_modules/openneuro-cli/src/cli.js:1
#!/usr/bin/env node
SyntaxError: Invalid or unexpected token
  at new Script (vm.js:100:7)