Nipype: FILMGLS threshold vs. FSL film_gls command



Hi everyone,

I’ve been looking around for an explanation of what exactly the threshold input of Nipype’s FILMGLS interface and FSL’s film_gls command is about. For example, what units is it in? Is it for the pre-whitening or for the background? This becomes particularly important because FSL’s default is 1000 while Nipype’s is 0. I’ve also seen code with threshold = 10 in Nipype, but also with threshold = 1000 (using a SUSAN mask). For my own data, if I run it in Nipype with threshold = 1000, all copes and parameter estimates end up empty, and the numTS property in the FILMGLS logfile is 0. This data has already been processed through SPM and this is more to test out Nipype, so it’s not conceivable that no voxel passes a legitimate threshold.
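To illustrate what I mean, here is a hypothetical sketch (not FSL code) of the effect I’m seeing: if film_gls only keeps voxels whose mean intensity exceeds the threshold, then an unscaled image peaking around 400 loses every voxel at threshold = 1000, which would match numTS = 0 in the logfile:

```python
# Hypothetical illustration of intensity thresholding (not actual FSL code):
# count how many voxels survive a given brain/background threshold.

def count_voxels_above(mean_intensities, threshold):
    """Count voxels whose mean time-series intensity exceeds the threshold."""
    return sum(1 for v in mean_intensities if v > threshold)

# Toy "mean functional image": a few background voxels plus in-brain voxels,
# in arbitrary scanner units peaking around 400 (as in my data).
voxels = [5, 12, 30, 250, 300, 350, 400]

print(count_voxels_above(voxels, 1000))  # 0 -> every cope comes out empty
print(count_voxels_above(voxels, 100))   # 4 -> only in-brain voxels survive
```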

So I’m wondering:

  1. What are the FILMGLS/film_gls threshold’s units and purpose?
  2. How is FILMGLS different from FSL’s film_gls terminal command if at all?
  3. Does the threshold’s meaning change depending on whether one is in native space or MNI/standard space? Currently I’m running FILMGLS on preprocessed native-space data.

Would appreciate your thoughts,


You can check @satra’s explanation of why he didn’t set the threshold to 1000 for FSL 5.0.7 or higher.

The threshold was actually set to -1000, but we have a small bug that I’m trying to fix, so it’s treated as 0.


Thank you for your response! I asked around and got a somewhat satisfactory and very pragmatic answer. Essentially, the threshold “units” are arbitrary scanner units, which are unique to each scanner. If you follow FSL’s GUI, it scales whatever the scanner units are up to around 10000. In that case the threshold in film_gls is a good way to get rid of non-brain voxels. However, if you didn’t scale the data up and your mean functional image is below 1000, all data will be lost. For my data, the mean functional images were not scaled up, so peak intensities were around 400 (arbitrary scanner units). In this case it made sense to set the threshold to 100 to exclude any non-brain signal, essentially masking out noise.
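One way to express this pragmatic rule is to pick the threshold as a fraction of the data’s own peak intensity, so it tracks whatever arbitrary units the scanner produced. This is a hypothetical helper, not part of Nipype or FSL, and the 10% fraction here is my own assumption rather than a documented FSL value:

```python
# Hypothetical helper sketching the pragmatic rule above (not Nipype/FSL API):
# scale the brain/background threshold to the data's own intensity range.
# The default fraction of 10% is an assumption, not an FSL-documented value.

def suggest_threshold(mean_intensities, fraction=0.1):
    """Return a threshold as a fraction of the peak mean intensity."""
    return fraction * max(mean_intensities)

# Unscaled data peaking around 400 arbitrary scanner units: the suggestion
# is the same order of magnitude as the threshold of 100 I ended up using.
print(suggest_threshold([30, 250, 350, 400]))   # 40.0

# After FSL-style scaling (intensities around 10000), the same rule lands
# near the classic film_gls default of 1000.
print(suggest_threshold([800, 9500, 10000]))
```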

I think this shows how much using Nipype really forces you to understand what you are doing in your analysis. Much appreciated!