Suggestions for getting 9.4T data into BIDS format

I’m trying to help Naila get some 9.4T data into BIDS format. From what I’ve previously heard, the DICOM to nifti conversion for the 9.4T data is non-trivial, so autobids isn’t really an option. Does anyone with more experience with 9.4T data know how that DICOM to nifti conversion works and whether it produces the JSON sidecar files?

Thanks,
suzanne

I don’t see why autobids/tar2bids isn’t an option. We have used tar2bids for 9.4T data. The built-in 9.4T heuristic (cfmm_bruker) would likely need updating for Naila’s study, and there is another fix we may want to merge in from this PR: https://github.com/khanlab/tar2bids/pull/25. Otherwise, tar2bids is the first place I would start.

If any additional scripts or tools need to be run specifically on 9.4T data, we can always add them to tar2bids as well.
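If you want to try that unmerged fix before it lands, one way (just a sketch, assuming you clone the repo and build the container yourself) is to fetch the PR branch locally:

git clone https://github.com/khanlab/tar2bids
cd tar2bids
git fetch origin pull/25/head:pr-25   # pulls the PR #25 branch into a local branch
git checkout pr-25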

Ah. I must have misunderstood my last conversation about this with Igor. He said that the usual dicom-to-nifti conversion we use for the 3T and 7T scanner data caused orientation errors with the 9.4T data and didn’t recommend running the 9.4T data through tar2bids.

Let me touch base again with Naila and get some more information on what scan data she has.

Hello guys,

A few years ago, I dealt with a number of issues converting multi-frame (particularly DTI) datasets from the Bruker scanner to NIfTI using dcm2niix. The problem was usually with converting Enhanced MR DICOM datasets, where the entire series is stored in a single multi-frame DICOM file. There was a last-resort workaround (sometimes feasible, depending on the number of volumes and slices) of converting the dataset to a bunch of single-slice “non-Enhanced” MR type DICOMs, then converting with dcm2niix. This, and some known issues, are in a separate discourse topic which was migrated from our old wiki.
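For illustration, a minimal sketch of that kind of workaround using dcm4che’s emf2sf utility (not necessarily the exact tool I used back then, so treat the flags as an assumption to verify):

# Split the Enhanced multi-frame file into legacy single-frame DICOMs
emf2sf --out-dir single_frame/ enhanced_dwi_series.dcm

# Then convert the resulting “classic” DICOMs as usual
mkdir -p nifti_out
dcm2niix -z y -o nifti_out/ single_frame/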

Here’s a list of issues and PRs I’ve made on dcm2niix. You can look through those, particularly where these 2018/2019 issues were referenced in more recent issues and PRs, in case you run into problems like incorrect slice ordering during conversion of EPI datasets, bval/bvec not being extracted from DTIs, etc.


Is there a specific version of tar2bids that works best with the cfmm_bruker.py heuristic? I’m currently running tar2bids:v0.0.5h (the default in neuroglia-helpers) but am getting nowhere.

[switt4@gra-login3 uStructConcussion]$ tar2bids -P '*_{subject}' -h cfmm_bruker.py Baron_uStructConcussion_20220526_NR24_20220526_01.73F9D000.tar 
CMD: singularity run -e /project/6050199/akhanf/singularity/bids-apps/khanlab_tar2bids_v0.0.5h.sif -P *_{subject} -h cfmm_bruker.py Baron_uStructConcussion_20220526_NR24_20220526_01.73F9D000.tar
  Using custom PatientName search: *_{subject}
	Overriding heuristic file as: cfmm_bruker.py
  PI=Baron Study=uStructConcussion Date=20220526 PatientName=NR24
session not found in *_{subject}
Baron_uStructConcussion_20220526_NR24_20220526_01.73F9D000.tar -> sub-NR24
  Running: heudiconv ... 
heudiconv  --overwrite -b -d /scratch/switt4/uStructConcussion/Baron_uStructConcussion_20220526_\*_\{subject\}_20220526_01.73F9D000.tar -o /scratch/switt4/uStructConcussion/bids -f /opt/tar2bids/heuristics/cfmm_bruker.py -s NR24  | tee /scratch/switt4/uStructConcussion/bids/code/tar2bids_2022-08-17_13h14m_16225/logs/heudiconv.NR24
INFO: Running heudiconv version 0.5.4
INFO: Need to process 0 study sessions
  Moving hidden .heudiconv folder to /scratch/switt4/uStructConcussion/bids/code/tar2bids_2022-08-17_13h14m_16225/heudiconv
mv: cannot stat '/scratch/switt4/uStructConcussion/bids/.heudiconv': No such file or directory
 Running script to compute UNI-DEN from UNI & inversions (if it doesn't exist already)
/opt/tar2bids/etc/correctUniDen /scratch/switt4/uStructConcussion/bids NR24 | tee -a /scratch/switt4/uStructConcussion/bids/code/tar2bids_2022-08-17_13h14m_16225/logs/tuneup.NR24
  Cleaning participants and other tsv files (sorting, removing extra columns)...
Cleaning /scratch/switt4/uStructConcussion/bids/participants.tsv to remove all but 1st column
  Adding default .bidsignore file...
'/opt/tar2bids/etc/bidsignore' -> '/scratch/switt4/uStructConcussion/bids/.bidsignore'
  Running bids-validator...
bids-validator@1.2.5

	1: [ERR] Quick validation failed - the general folder structure does not resemble a BIDS dataset. Have you chosen the right folder (with "sub-*/" subfolders)? Check for structural/naming issues and presence of at least one subject. (code: 61 - QUICK_VALIDATION_FAILED)
		.bids

	Please visit https://neurostars.org/search?q=QUICK_VALIDATION_FAILED for existing conversations about this issue.

It seems from the screen dump that the naming scheme of Naila’s files perhaps doesn’t match what the cfmm_bruker.py heuristic is expecting (the dicominfo.tsv file isn’t being created, so I can’t easily verify this), but I’ve also seen similar behavior with tar2bids where the solution was just to use a specific version.

You could try the latest tar2bids version perhaps (0.1.3), but I’m not sure that’s it…
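For example, a pull along these lines (assuming the khanlab Docker Hub image publishes that tag):

singularity pull docker://khanlab/tar2bids:v0.1.3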

Weird that the dicoms are not being found –

Looking at the logs, this is the heudiconv command below. You can try running it in a singularity shell to debug it; maybe the path to the tar file is getting mangled somehow…

heudiconv  --overwrite -b -d /scratch/switt4/uStructConcussion/Baron_uStructConcussion_20220526_\*_\{subject\}_20220526_01.73F9D000.tar -o /scratch/switt4/uStructConcussion/bids -f /opt/tar2bids/heuristics/cfmm_bruker.py -s NR24  

Try with this as well (single-quoting the whole pattern instead of backslash-escaping it, so the shell passes the glob and braces through to heudiconv unchanged):

heudiconv  --overwrite -b -d '/scratch/switt4/uStructConcussion/Baron_uStructConcussion_20220526_*_{subject}_20220526_01.73F9D000.tar' -o /scratch/switt4/uStructConcussion/bids -f /opt/tar2bids/heuristics/cfmm_bruker.py -s NR24  

What is the path to the heudiconv container? I’m not seeing it in $SINGULARITY_DIR. Or, do I need to download my own copy?

You can use the heudiconv in the tar2bids container.

Tried both of the heudiconv commands and got the same log message about there being no DICOM images to process.

[switt4@gra-login1 uStructConcussion]$ singularity shell tar2bids_v0.1.3.sif 
Singularity> heudiconv  --overwrite -b -d /scratch/switt4/uStructConcussion/Baron_uStructConcussion_20220526_\*_\{subject\}_20220526_01.73F9D000.tar -o /scratch/switt4/uStructConcussion/bids -f /opt/tar2bids/heuristics/cfmm_bruker.py -s NR24  
WARNING: Could not check for version updates: Could not find a suitable TLS CA certificate bundle, invalid path: /etc/pki/tls/certs/ca-bundle.crt
INFO: Running heudiconv version 0.10.0 latest Unknown
INFO: Need to process 0 study sessions
Singularity> heudiconv  --overwrite -b -d '/scratch/switt4/uStructConcussion/Baron_uStructConcussion_20220526_*_{subject}_20220526_01.73F9D000.tar' -o /scratch/switt4/uStructConcussion/bids -f /opt/tar2bids/heuristics/cfmm_bruker.py -s NR24  
WARNING: Could not check for version updates: Could not find a suitable TLS CA certificate bundle, invalid path: /etc/pki/tls/certs/ca-bundle.crt
INFO: Running heudiconv version 0.10.0 latest Unknown
INFO: Need to process 0 study sessions

The only thing I can think of is that something went wrong during the cfmm2tar pull? But I didn’t get any connection errors or timeouts with graham when downloading two different tar files.

I spent a bit of time looking at this, and I think the tar path doesn’t match up with the file name. If I remove the _\*, yielding -d tar-files/Baron_uStructConcussion_20220526_\{subject\}_20220526_01.73F9D000.tar, heudiconv runs. I don’t really know if the run yields something satisfactory or how to get tar2bids to produce this invocation, though. Here’s my output:

Singularity> heudiconv --overwrite -b -d tar-files/Baron_uStructConcussion_20220526_\{subject\}_20220526_01.73F9D000.tar -o tar2bids-out-bruker/ -f /opt/tar2bids/heuristics/cfmm_bruker.py -s NR24
WARNING: Could not check for version updates: Connection to server could not be made
INFO: Running heudiconv version 0.10.0 latest Unknown
INFO: Need to process 1 study sessions
INFO: PROCESSING STARTS: {'subject': 'NR24', 'outdir': '/home/tkuehn/Code/tar2bids-out-bruker/', 'session': None}
INFO: Processing 15 dicoms
INFO: Analyzing 15 dicoms
INFO: Generated sequence info for 15 studies with 15 entries total
INFO: Doing conversion using dcm2niix
INFO: Converting /home/tkuehn/Code/tar2bids-out-bruker/sub-NR24/anat/sub-NR24_acq-TurboRARE_run-01_T2w (1 DICOMs) -> /home/tkuehn/Code/tar2bids-out-bruker/sub-NR24/anat . Converter: dcm2niix . Output types: ('nii.gz',)
220818-14:45:49,402 nipype.workflow INFO:
	 [Node] Setting-up "convert" in "/tmp/dcm2niix76pguni2/convert".
INFO: [Node] Setting-up "convert" in "/tmp/dcm2niix76pguni2/convert".
220818-14:45:49,407 nipype.workflow INFO:
	 [Node] Executing "convert" <nipype.interfaces.dcm2nii.Dcm2niix>
INFO: [Node] Executing "convert" <nipype.interfaces.dcm2nii.Dcm2niix>
220818-14:45:54,514 nipype.interface INFO:
	 stderr 2022-08-18T14:45:54.513878:Error: Anatomical Orientation Type (0010,2210) is QUADRUPED: rotate coordinates accordingly
INFO: stderr 2022-08-18T14:45:54.513878:Error: Anatomical Orientation Type (0010,2210) is QUADRUPED: rotate coordinates accordingly
220818-14:45:54,515 nipype.interface INFO:
	 stdout 2022-08-18T14:45:54.515075:Chris Rorden's dcm2niiX version v1.0.20210317  (JP2:OpenJPEG) (JP-LS:CharLS) GCC5.5.0 x86-64 (64-bit Linux)
INFO: stdout 2022-08-18T14:45:54.515075:Chris Rorden's dcm2niiX version v1.0.20210317  (JP2:OpenJPEG) (JP-LS:CharLS) GCC5.5.0 x86-64 (64-bit Linux)
220818-14:45:54,515 nipype.interface INFO:
	 stdout 2022-08-18T14:45:54.515075:Found 1 DICOM file(s)
INFO: stdout 2022-08-18T14:45:54.515075:Found 1 DICOM file(s)
220818-14:45:54,515 nipype.interface INFO:
	 stdout 2022-08-18T14:45:54.515075:Convert 1 DICOM as /home/tkuehn/Code/tar2bids-out-bruker/sub-NR24/anat/sub-NR24_acq-TurboRARE_run-01_T2w_heudiconv786 (128x96x35x1)
INFO: stdout 2022-08-18T14:45:54.515075:Convert 1 DICOM as /home/tkuehn/Code/tar2bids-out-bruker/sub-NR24/anat/sub-NR24_acq-TurboRARE_run-01_T2w_heudiconv786 (128x96x35x1)
220818-14:45:54,543 nipype.interface INFO:
	 stdout 2022-08-18T14:45:54.543540:Compress: "/usr/bin/pigz" -n -f -6 "/home/tkuehn/Code/tar2bids-out-bruker/sub-NR24/anat/sub-NR24_acq-TurboRARE_run-01_T2w_heudiconv786.nii"
INFO: stdout 2022-08-18T14:45:54.543540:Compress: "/usr/bin/pigz" -n -f -6 "/home/tkuehn/Code/tar2bids-out-bruker/sub-NR24/anat/sub-NR24_acq-TurboRARE_run-01_T2w_heudiconv786.nii"
220818-14:45:54,543 nipype.interface INFO:
	 stdout 2022-08-18T14:45:54.543540:Conversion required 0.032808 seconds (0.005307 for core code).
INFO: stdout 2022-08-18T14:45:54.543540:Conversion required 0.032808 seconds (0.005307 for core code).
220818-14:45:54,572 nipype.workflow INFO:
	 [Node] Finished "convert", elapsed time 0.082812s.
INFO: [Node] Finished "convert", elapsed time 0.082812s.
WARNING: Failed to get date/time for the content: 'FileDataset' object has no attribute 'AcquisitionDate'
WARNING: Failed to find task field in /home/tkuehn/Code/tar2bids-out-bruker/sub-NR24/anat/sub-NR24_acq-TurboRARE_run-01_T2w.json.
220818-14:45:59,604 nipype.workflow INFO:
	 [Node] Setting-up "embedder" in "/tmp/embedmetaq9v9rbf7/embedder".
INFO: [Node] Setting-up "embedder" in "/tmp/embedmetaq9v9rbf7/embedder".
220818-14:45:59,609 nipype.workflow INFO:
	 [Node] Executing "embedder" <nipype.interfaces.utility.wrappers.Function>
INFO: [Node] Executing "embedder" <nipype.interfaces.utility.wrappers.Function>
220818-14:45:59,653 nipype.workflow INFO:
	 [Node] Finished "embedder", elapsed time 0.042903s.
INFO: [Node] Finished "embedder", elapsed time 0.042903s.
INFO: Post-treating /home/tkuehn/Code/tar2bids-out-bruker/sub-NR24/anat/sub-NR24_acq-TurboRARE_run-01_T2w.json file
INFO: Populating template files under /home/tkuehn/Code/tar2bids-out-bruker/
INFO: PROCESSING DONE: {'subject': 'NR24', 'outdir': '/home/tkuehn/Code/tar2bids-out-bruker/', 'session': None}

Yes, just saw this too while looking into it…

The fix is to use '{subject}' as the subject search in tar2bids, instead of the default '*_{subject}'.

In the tar name:

Baron_uStructConcussion_20220526_NR24_20220526_01.73F9D000.tar

the PatientName is actually just ‘NR24’ in this case. You can confirm this by looking on the dicom server. What probably threw you off is that for 3T and 7T scans, the string after the patient name is always a single digit, whereas for the 9.4T scans it is ‘YYYYMMDD_##’.
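So with that one change, the invocation from your earlier post becomes:

tar2bids -P '{subject}' -h cfmm_bruker.py Baron_uStructConcussion_20220526_NR24_20220526_01.73F9D000.tar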

That should at least get you started @switt4. The heuristic will need modifying to capture the different series (it only grabs the T2w for now), but you will need to work through that with Naila to describe the data and organization properly.

With qMRI in the BIDS standard now, it would be good to adhere to that:
https://bids-specification.readthedocs.io/en/stable/99-appendices/10-file-collections.html

You will probably want MTS-suffix images for the MT_GRE (assuming MTsat?) data, and dwi (with different acq- flags) for the DWEpi data. The B1 mapping data would go in the fmap folder, but the suffix would depend on the B1 mapping method used.
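As a rough sketch (hypothetical filenames; the exact entities and the B1 suffix depend on the acquisitions), the layout might end up looking like:

sub-NR24/
  anat/sub-NR24_flip-1_mt-on_MTS.nii.gz    (MT_GRE; one member of the MTS file collection)
  dwi/sub-NR24_acq-advDiff_dwi.nii.gz      (DWEpi, with matching .bval/.bvec/.json)
  fmap/sub-NR24_TB1map.nii.gz              (if the B1 mapping yields a transmit field map)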

That makes sense. I’ll give it another try tomorrow morning using the correct subject search string. I probably would have stared at that for days before finally seeing it…

I took a look at the 9.4T heuristic, and it looks as if the only scans of Naila’s that aren’t covered are the two advanced diffusion scans. But I will check the output against what is now the standard for qMRI sequences and make any necessary edits for the scans already covered by the heuristic.

Using -P '{subject}' cleared the error I was getting. tar2bids is now successfully converting the TurboRARE sequence.