Problem generating connections with on-the-fly outputs (a la JSONFileGrabber)

Hello all,

I’m having a weird (to me) problem and perhaps someone has a quick answer or solution.

I’m trying to make a node like the JSONFileGrabber, but one that pulls data from a local web service instead of a static file. I’ve used basically the same code as the JSONFileGrabber, replacing the file-reading code with a PyCurl snippet. Both the File and Web versions work with the standalone test code provided in the header comments, and the FileGrabber works in my pipeline if I make a static file out of the desired dynamic content.

So far so good … but when I try to use the Web version it tells me:

“Exception: Some connections were not found
Module scan_info has no output called onset_map”

The output in question (onset_map) isn’t defined in the FileGrabber version either, but the pipeline doesn’t complain in that case, only with my Web version. If I remove the outgoing connections and force the Web node to run, the output directory contains all of the JSON elements listed in report.rst.

I’m including the code for the Web node, if that helps. It isn’t very big and I tried to strip it down to essentials. Any pointers are greatly appreciated; I must be missing something obvious. TIA!

Bill


# Imports needed by this snippet (pycurl for the web request, BytesIO for
# its output buffer, plus the nipype base classes)
import json

import pycurl
from io import BytesIO

from nipype.interfaces.base import (BaseInterfaceInputSpec, DynamicTraitedSpec,
                                    traits)
from nipype.interfaces.io import IOBase


class ScanInfoInputSpec(DynamicTraitedSpec, BaseInterfaceInputSpec):
    project = traits.Str(mandatory=True, desc='Project ID', default='sibstudy')
    session = traits.Str(mandatory=True, desc='Experiment ID')
    scan_type = traits.Str(mandatory=True, desc='Scan description/Task')
    scan_num = traits.Str(mandatory=True,
                          desc='Scan (i.e. number in session sequence)')

    server = traits.Str(desc='Web service location',
                        default='webserv.example.org:9002')


class ScanInfo(IOBase):

    input_spec = ScanInfoInputSpec
    output_spec = DynamicTraitedSpec
    _always_run = True

    def _list_outputs(self):
        outputs = {}

        # Build the request URL for the local web service
        url = "http://{0}/scaninfo/{1}/{2}/{3}/{4}".format(
            self.inputs.server,
            self.inputs.project,
            self.inputs.scan_type,
            self.inputs.session,
            int(self.inputs.scan_num)
        )

        # Fetch the JSON; pycurl writes bytes, so buffer with BytesIO
        buffer = BytesIO()
        c = pycurl.Curl()
        c.setopt(c.URL, url)
        c.setopt(c.WRITEDATA, buffer)
        c.perform()
        c.close()

        try:
            data = json.loads(buffer.getvalue().decode('utf-8'))
        except Exception:
            raise RuntimeError('No onsets')

        if not isinstance(data, dict):
            raise RuntimeError('JSON input has no dictionary structure')

        # Expose each top-level JSON key as a dynamic output
        for key, value in data.items():
            outputs[key] = value

        return outputs
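
In case it helps, this is roughly how the node is wired into the workflow (the downstream node, field, and workflow names here are placeholders, not the real pipeline); the exception is raised by connect() itself, before anything runs:

import nipype.pipeline.engine as pe
import nipype.interfaces.utility as niu

scan_info = pe.Node(ScanInfo(), name='scan_info')
scan_info.inputs.session = 'SESS01'
scan_info.inputs.scan_type = 'task'
scan_info.inputs.scan_num = '4'

# Placeholder downstream node; the real pipeline connects to a model node
model = pe.Node(niu.IdentityInterface(fields=['onset_map']), name='model')

wf = pe.Workflow(name='prep')
# connect() validates output names up front, so this is the line that raises
# "Module scan_info has no output called onset_map" for the web-based node,
# even though the equivalent connection works with JSONFileGrabber.
wf.connect(scan_info, 'onset_map', model, 'onset_map')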

This is an error and has to do with the assumption that any IO interface is included in the nipype package itself.

Relevant lines of code: here and here.

I have fixed this and will be submitting a pull request shortly. Currently, you need to add your interface to nipype/interfaces/io.py for it to work properly.
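
For anyone landing here with the same problem, the gist of the behaviour is roughly the following (an illustrative paraphrase, not the actual nipype source): connect-time validation only skips the output check for interfaces whose class lives in nipype's own io module, detected from the class's module path, so a dynamic-output interface defined in your own module falls back to a check that fails before the node has run.

def output_check_is_skipped(interface):
    # Dynamic outputs are only trusted for interfaces that ship inside
    # nipype's io module (illustrative paraphrase, not the real check)
    return interface.__class__.__module__.startswith('nipype.interfaces.io')

# A ScanInfo defined in a user module fails this test, so the workflow falls
# back to checking hasattr(node.outputs, 'onset_map'), which is False before
# the node has run -- hence "Some connections were not found". Copying the
# class into nipype/interfaces/io.py makes the module path match, which is
# why that workaround holds until the proper fix is merged.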

Thanks for the quick response! I’ll give it a go right away.

Cheers,

Bill