Hi all, I’ve been getting conflicting opinions as to whether defacing of full-head NIfTI volumes is required for anonymization. The HIPAA guidelines state that “full face photographic images and any comparable images” are protected health information (PHI). Are full-head volumes considered comparable to a photo of the face? I know a 3D image of the head can be reconstructed from such a volume, but I find it hard to recognize individuals from such reconstructions.
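To be concrete, the kind of reconstruction I mean takes only a few lines. Here is a rough sketch using nibabel and scikit-image; the filename and the intensity threshold are just placeholders and would need tuning per scan:

```python
# Rough sketch: extract a head surface from a T1w NIfTI volume.
# "sub-01_T1w.nii.gz" and the threshold of 50 are placeholders.
import nibabel as nib
import numpy as np
from skimage import measure

img = nib.load("sub-01_T1w.nii.gz")
data = np.asanyarray(img.dataobj)

# Marching cubes over a crude skin/air intensity threshold yields a
# face-bearing mesh that can be rendered in any 3D viewer.
verts, faces, normals, values = measure.marching_cubes(data, level=50)
```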
Any input on what your local ethics board requires, and why, would be appreciated. I would just deface the scans to be safe, but we may need the face for EEG source localization.
Many thanks,
-David
Hi @David_Groppe,
well… that’s a tricky question, as it heavily depends on what you plan to do with the data and where you are recording it. Could you maybe provide a bit more information on that? Quite often even different ethics boards at the same university require different things. Maybe your colleagues can share their experience, or you can ask the ethics board directly?
This topic was discussed recently in an Open Science Room emergent session during this year’s OHBM (it should also be on YouTube soon). You can also check the Open Brain Consent, which has a lot of information and for which we are currently working on a GDPR-adapted version, as well as @DorienHuijser’s great work on a decision tree for data sharing.
Re EEG source localization: I guess you could do the localization on the non-defaced images and then share the defaced ones plus the derived parameters!?
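A minimal sketch of that workflow, assuming you deface with e.g. the pydeface command-line tool (the filenames are placeholders):

```python
# Sketch of "localize first, share defaced": run source localization on
# the original image, keep only the derived parameters, then deface the
# anatomical image before sharing. Assumes pydeface is installed;
# filenames are placeholders.
import subprocess

anat = "sub-01_T1w.nii.gz"

# 1) Run the EEG source localization on the non-defaced `anat` and keep
#    the derived parameters (head model, source estimates) for sharing.

# 2) Deface the anatomical image; only the defaced file leaves the lab.
subprocess.run(
    ["pydeface", anat, "--outfile", "sub-01_T1w_defaced.nii.gz"],
    check=True,
)
```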
Tagging the gang for further input: @StephanHeunis, @Cyril_Pernet, @yarikoptic, @robert (please add folks I forgot).
HTH, cheers, Peer
Hi David,
Re-identifying someone on the basis of facial structure, without skin color, hair, clearly visible eyes, glasses, etc., might be hard for us humans to do, but I am pretty sure that for the software behind e.g. Face ID (which we use to unlock our smartphones) it is actually not a problem. I also don’t know how (re)identification through fingerprints works; to me they are just a bunch of squiggly lines, but that too works well if you leave it to computers.
From a European privacy perspective, I would say it becomes problematic if biometric details (face, fingerprints, DNA) in publicly shared data can be mapped to publicly available databases (or private/governmental ones, e.g. for fingerprints). See https://www.slideshare.net/RobertOostenveld/ohbm-2020-osr-brain-research-data-sharing-and-personal-data-privacy for the slides I presented in the OHBM-OSR emergent session.
BIDS allows storing the locations of the anatomical landmarks (e.g. the nasion) and fiducials (the coils) in relation to an anatomical MRI, even if that anatomical MRI has been defaced (but you have to identify the points prior to defacing). See https://bids-specification.readthedocs.io/en/stable/04-modality-specific-files/02-magnetoencephalography.html#coordinate-system-json-_coordsystemjson. If you follow BIDS, whether or not to deface the MRIs is no longer such a big deal for the (re)use of the data.
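For illustration, a minimal *_coordsystem.json written from Python could look like the sketch below; the field names follow the linked part of the spec, while the coordinate values and filenames are made up:

```python
# Sketch of a BIDS *_coordsystem.json that records anatomical landmarks
# relative to an anatomical MRI (identify the points before defacing!).
# Coordinate values and filenames are made-up examples.
import json

coordsystem = {
    "IntendedFor": "anat/sub-01_T1w.nii.gz",
    "AnatomicalLandmarkCoordinateSystem": "Other",
    "AnatomicalLandmarkCoordinateSystemDescription": "Coordinates defined relative to the associated T1w image",
    "AnatomicalLandmarkCoordinateUnits": "mm",
    "AnatomicalLandmarkCoordinates": {
        "NAS": [12.7, 21.3, 13.9],  # nasion
        "LPA": [5.2, 11.3, 9.6],    # left pre-auricular point
        "RPA": [20.2, 11.3, 9.1],   # right pre-auricular point
    },
}

with open("sub-01_coordsystem.json", "w") as f:
    json.dump(coordsystem, f, indent=2)
```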
Thanks very much @PeerHerholz and @robert. I have no doubt that MRIs are identifiable (even with defacing). My question is whether they are identifiable enough to be considered comparable to photographic images of the face, and therefore PHI. There seems to be no clear consensus on this: some centers release non-defaced MRIs, while others require defacing and even de-earing.
And to help clarify the issue: the data I’m concerned about are from people being evaluated for epilepsy with high-density EEG systems that include facial coverage. It would be ideal to share the raw data so that the facial electrodes could be included in the source modeling.
I will check out the links you kindly provided for more guidance, but since it is clinical data, I think we will take the most conservative approach (even if that means losing the facial electrodes).