Validator internal error due to dataset size?

I am getting the following error with bids-validator 1.5.8:

    1: [ERR] Internal error. SOME VALIDATION STEPS MAY NOT HAVE OCCURRED (code: 0 - INTERNAL ERROR)
            .undefined
            Evidence: RangeError: Maximum call stack size exceeded
        at missingSessionFiles (/usr/lib/node_modules/bids-validator/validators/session.js:20:10)
        at validateMisc.then.then.then.then.then.then.then.then.eventsIssues (/usr/lib/node_modules/bids-validator/validators/bids/fullTest.js:199:42)
        at processTicksAndRejections (internal/process/task_queues.js:86:5)

The error I am experiencing seems similar to what is described in the thread below, but I did not find any resolution or workaround mentioned there.

Hi @ins0mniac2,

Thank you for your message. Could you please share the size of the dataset you are trying to validate? One way to reduce the validation load is to add the `--ignoreSubjectConsistency` flag. Another is to break your dataset into a few subsets and validate each subset separately.
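The subset approach can be sketched like this. This is a minimal, illustrative script (the batch size is arbitrary, and it builds a toy dataset in a temporary directory rather than touching real data): it groups `sub-*` directories into symlinked subsets, each of which also links the top-level metadata so it remains a valid standalone BIDS root.

```shell
#!/bin/sh
# Sketch: split a BIDS dataset into symlinked subsets so each one
# can be validated separately, reducing per-run memory pressure.
set -e

DATASET=$(mktemp -d)   # stand-in for the real BIDS root
OUT=$(mktemp -d)       # where the subsets will be created

# Build a toy dataset: 5 subjects plus a top-level metadata file.
for n in 1 2 3 4 5; do mkdir "$DATASET/sub-00$n"; done
touch "$DATASET/dataset_description.json"

BATCH=2   # subjects per subset (illustrative)
i=0
batch=0
for sub in "$DATASET"/sub-*/; do
    if [ $((i % BATCH)) -eq 0 ]; then
        batch=$((batch + 1))
        dest="$OUT/subset-$batch"
        mkdir -p "$dest"
        # Every subset needs the top-level metadata to be a valid BIDS root.
        ln -s "$DATASET/dataset_description.json" "$dest/"
    fi
    ln -s "${sub%/}" "$dest/"
    i=$((i + 1))
done

ls "$OUT"   # lists subset-1 subset-2 subset-3
```

Each subset directory can then be validated on its own, e.g. `bids-validator --ignoreSubjectConsistency "$OUT/subset-1"`.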

Thank you,
Franklin

Thanks so much for responding. The dataset has more than 400 subjects, but we started getting the error only after adding data from an additional modality. Using that flag eliminates the error. I am curious what it actually does and how it helps with what appears to be a memory-related issue. I still get the following warning (which is not a problem, as it is a heterogeneous dataset):

    [WARN] Not all subjects/sessions/runs have the same scanning parameters. (code: 39 - INCONSISTENT_PARAMETERS)

Hi @ins0mniac2,

Thank you for your message. That flag skips the subject-consistency check, which shrinks the validator log and the associated memory needs. The warning is expected, as you noted, given the heterogeneous dataset.

Thank you,
Franklin
