How to get help with ANTs ANTs wiki: https://github.com/ANTsX/ANTs/wiki Search issues or open a new one: https://github.com/ANTsX/ANTs/issues Please use the issue templates to ensure that you provide us with the information we need to help.
okay, thank you!
Please post this as an issue over at the ANTs github site.
Hi, I read the paper "Multivariate Analysis of Structural and Diffusion Imaging in Traumatic Brain Injury" by Avants et al. (https://pubmed.ncbi.nlm.nih.gov/18995188/). This paper is very interesting for my internship because I would like to do something similar. I searched the ANTs documentation for an option to use SyNMN during registration, but I can't find anything. Does anyone know where I can find this option (if it actually exists)? Thank you in advance!!
Done: https://github.com/ANTsX/ANTs/issues/1007 .
Hello, I am sending the V02_to_V01_Warped.nii.gz and the V02_to_V01_Using_trans.nii.gz in the link. https://drive.google.com/open?id=18ayG9b6DeF8Ui76uf9tKlcqn85x7OYp6
Please post this as an issue over at the ANTs github repo. We can continue the conversation over there.
We recently ran the following buildtemplateparallel command: /usr/lib/ants/buildtemplateparallel.sh -m 30x90x20 -r 1 -g 0.25 -i 4 -t GR -s CC -d 3 -o btp1_ -c 2 -n 1 -j 96 -z /flywheel/v0/input/template/btp_test_template.nii.gz subj-*-*_MPRAGE_reoriented_upsampled.nii.gz ...and found that it needed a little less than 3TB of disk space and took about 30 hours to run when given 10 scans. We want to run this same command on 245 scans. What are some reasonable estimates for the amount of disk space we...
V02_to_V01_Using_trans.nii.gz is different than V02_to_V01_Warped.nii.gz
How much different?
Hello, thank you for your response. I did what you told me:
antsRegistration --verbose 1 --dimensionality 3 --float 1 --collapse-output-transforms 1 -o [V02_to_V01_,V02_to_V01_Warped.nii.gz,V02_to_V01_InverseWarped.nii.gz] --initial-fixed-transform V01_transform2.txt --initial-fixed-transform V01_transform1.mat --initial-moving-transform V02_transform3_0Warp.nii.gz --initial-moving-transform V02_transform2.txt --initial-moving-transform V02_transform1.mat -t SyN[ 0.5,3,0 ] --interpolation BSpline...
done :) Thanks
Please post your questions as an issue over at the ANTs GitHub repo.
Dear All, I have a cohort of 55 pediatric subjects (age 7-14 years) and I want to calculate cortical thickness and gyrification index via ANTs. Can anyone suggest how to create a pediatric template using ANTs, which can then be used for brain extraction and for calculating cortical thickness and gyrification? Thank you.
The flag needs to precede each transform: --initial-moving-transform $transform1 --initial-moving-transform $transform2 ...
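As a concrete sketch of repeating the flag (the transform file names below are hypothetical, and the ordering convention should be checked against the antsRegistration help, which treats the listed transforms as a stack):

```shell
# Repeat --initial-moving-transform once per transform file. This only
# assembles and prints the call; it does not run the registration.
cmd="antsRegistration --dimensionality 3 \
 --initial-moving-transform subject_0GenericAffine.mat \
 --initial-moving-transform subject_1Warp.nii.gz"
echo "$cmd"
```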
Hello, I read in the antsRegistration help that I can do the registration while providing the transformation files for the fixed and the moving image using the --initial-fixed-transform and --initial-moving-transform flags, instead of applying the transformations and registering the transformed volumes. This is my command: antsRegistration --verbose 1 --dimensionality 3 --float 1 --collapse-output-transforms 1 -o [V02_to_V01_,V02_to_V01_Warped.nii.gz,V02_to_V01_InverseWarped.nii.gz] --initial-fixed-transform...
That would be great.
I have trouble getting Sourceforge emails consistently. We could set preferred support to Github rather than this forum?
Please post this over at the ANTs GitHub site as an issue. SourceForge tends to be not as visible to the developers. Also, if you can, please post the actual data.
Hi, after reviewing the literature on registration of diffusion MRI data, I've realized that registering the tensor image is better, as it allows using orientational, as well as spatial, information during registration (I had previously applied registration to my FA and MD maps). I followed the steps outlined here: https://github.com/ANTsX/ANTs/wiki/Warp-and-reorient-a-diffusion-tensor-image, and I started by testing the deformation and reorientation with a simple affine transformation (so no warps...
Ok, sorry, I thought this would be the appropriate place, because it's also general question about the mathematics/implementation of the displacement transform, which probably is ITK. I don't expect it to be a bug.
Please post this over at the appropriate GitHub forum depending on the package you're using (e.g., ANTsPy or ANTs).
I currently initialise a displacement transform from a numpy array. Do I understand correctly that, in principle, the effects of this transform should be more or less confined to a local environment within the limits of ~ px-coordinate + displacement? I ask because I see different results when I compare a cutout of a volume plus displacement field plus transform versus a cutout of the transformed whole volume plus displacement field. Thank you!
Please post this as an issue over at the ANTsPy github website.
Using ANTsPy, when I supply images as numpy arrays that contain negative values I get the error above. Is this expected?
Sorry, I can't make sense of the error log. It looks to be a system-specific issue that is really difficult to debug through the help forum.
Sorry, I re-sent the files; the CMake error and output log files are attached. CMakeError.log https://docs.google.com/document/d/18GEfMkW211MJeujpcnMcMrJho8e_ViNaNnTy5NOFto4/edit?usp=drivesdk CMakeOutput.log https://docs.google.com/document/d/1FIQCdFbubf59k3uM2A5DLgR_rddQg8PzSY593pgEuv4/edit?usp=drivesdk
Have you looked at the images?
Hi, I have a cohort of aphasic patients whose age range is 50-90. I have applied an imputation method to the lesioned brains. I wanted to use the age-matched IXI template on the data but was confused about which template to use. I found this link while reading through the forum: https://drive.google.com/drive/folders/0B4SvObeEfaRySUNDOE5DWksyQ0k In the IXI folder, there are three templates: template0, 1, 2. I wasn't sure which one to use. There is also an AgeGender-specific folder with different...
It sounds like a system-specific issue, which is obviously difficult to debug through a help forum. Know that all the main developers primarily use Macs, so you should be fine in that respect.
Yes, as strange as it can be, the registration results look correct. However, I think that the problem comes from my setup. I was doing some testing of my preprocessing pipeline on my MacBook Pro (macOS Catalina 10.15.4, clang version 11.0.3). But when I run the same code on the computing cluster of my organization, the deformation maps look just fine. Thank you. Kilian
Your second link is incomplete and I don't have permission to read the first link.
The cmake error log file and output log file were attached. CMakeOutput.log https://docs.google.com/document/d/10kV2TxrpaNwt98pE0_eGq21YyLs9IVwA6GXBzPEiwVc/edit?usp=drivesdk CMakeOutput.log https://docs.google.com/document/d/1FIQCdFbubf59k3uM2A5DLgR_rddQg8PzSY593pgEuv4/edit?usp=drivesdk
Please include the entire error log file. Also, if you're compiling, I would recommend compiling from the latest source.
Hi, I followed the instructions on this website (https://brianavants.wordpress.com/2012/04/13/updated-ants-compile-instructions-april-12-2012/) to install ANTs version 2.1.0. However, after the command "make -j 4", it stopped with the following error. I have tried many times with different ANTs versions (v2.1.0, v2.1.rc1, rc2, rc3, v2.2) but it always fails with the same error. How can I solve this problem?
Do the registration results look correct?
Hello, I want to analyze the deformation fields produced by the non-rigid registration. However, I have a problem that I struggle to solve. When I look at the Jacobian determinant, only part of the brain is represented. It seems that the deformation field has been cropped. I don't understand why (the same thing happens for InverseWarp or Warp). Can someone help me understand what's going on? Thanks in advance. Kilian
One other difference is I didn't use the mask. I suspect that would be more of a cause than the metric.
That is so bizarre. That worked for me as well. Any thoughts on why SyNQuick vs SyN produced markedly different outcomes?
Here's what I ran and the results look fine:
$ antsRegistrationSyNQuick.sh -d 3 -f DSURQE_40micron_average.nii -m sub-apoe0084_acq-2avg_space-T2w_desc-brain_unbiased.nii -o templatexsub -n 4
$ antsApplyTransforms -d 3 -i DSURQE_40micron_labels.nii -r sub-apoe0084_acq-2avg_space-T2w_desc-brain_unbiased.nii -o sub-apoe0084_acq-2avg_space-T2w_desc-brain_unbiased_labels.nii.gz -n GenericLabel -t [ templatexsub0GenericAffine.mat , 1] -t templatexsub1InverseWarp.nii.gz -v 1
I can sure try that. Like I said in the original post, I used ITK-SNAP to reorient the image so that they were both the same. Here's what I fed to antsRegistrationSyN.sh
Are these the inputs to antsRegistrationSyN.sh? If so, that's the first problem. ANTs registration won't correct orientation issues like this. You can try to use antsAI but it might be easier for you to place corresponding landmarks in the two images and use antsLandmarkBasedTransformInitializer to determine an initial transform.
Here is the original image for the mouse brain without any processing (just in case either the N4 or the reorientation somehow caused the issue I'm seeing). The template/atlas files are here: http://repo.mouseimaging.ca/repo/DSURQE_40micron_nifti/ Please note that the template mask is in a different orientation than the template or labels file (I don't know why they saved it that way). Thanks for the help debugging this. I really appreciate it.
Yeah, the partial FOV would seem to be incorrect and not consistent with the input images or typical ANTs registration performance. Can you post the data for us to look at?
I'll give that a shot. What about the partial FOV in the warped images? Is that a problem?
The command calls appear to be correct. My guess is that the headers are different between $template and /templates/DSURQE_40micron_labels.nii. You can try copying the header information from the former to the latter using CopyImageHeaderInformation.
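If it helps, here's a hedged sketch of that header copy. The output file name is made up, and the trailing 1 1 1 flags (copy direction, origin, and spacing) should be verified against the program's usage output:

```shell
# Sketch only: build and print the call rather than running it, since the
# exact boolean flag order should be checked against the usage message.
cmd="CopyImageHeaderInformation DSURQE_40micron_average.nii \
 DSURQE_40micron_labels.nii DSURQE_40micron_labels_fixed.nii 1 1 1"
echo "$cmd"
```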
I'm certain that I'm asking a very basic question that I've just somehow missed in the documentation somewhere. I'm attempting to register ex-vivo mouse brains to a template and use the inverse warp to get the template-space atlas into subject space. I'm running this in a Singularity container using ANTs v2.3.1. The template brain and accompanying mask are 0.04 mm isotropic while our acquired brains are 0.075 mm isotropic. My mouse brain has been unbiased as well as reoriented using ITK-SNAP to LPI...
Instead of trying to run template construction, try to run a single ANTs program, like bias correction. Also, this appears to be a compilation/system issue so you might want to do some google searching in addition to further consultation with your IT support.
Hello, I recently tried to use antsMTC2.sh on our university cluster, which uses SLURM, to build a template. After I executed the script in the terminal (please see the attached file named Script_MRI_analysis), I first got an error message in the log file (please see the attached file named testslurm_0505) with a segmentation fault. I found that there was no "Repaired.nii.gz" file, which should be generated by N4BiasFieldCorrection, so it failed in further steps. I checked the slurm...
ExtractRegionFromImage
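For the "center 100x100 of a 256x256 image" case asked about in this thread, the window indices work out to (256-100)/2 = 78 through 78+100-1 = 177 on each in-plane axis. A hedged sketch (file names hypothetical; check the ExtractRegionFromImage usage for the exact index arguments):

```shell
# Compute a centered 100-voxel window inside a 256-voxel axis.
size=256; win=100
lo=$(( (size - win) / 2 ))   # 78
hi=$(( lo + win - 1 ))       # 177
# Hypothetical 2-D call built from those indices (printed, not executed):
echo "ExtractRegionFromImage 2 input.nii.gz cropped.nii.gz ${lo}x${lo} ${hi}x${hi}"
```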
Thanks Nick! Is there a command to crop the images? For example, I only want to take the 'center' 100x100 voxels of the original 256x256 image.
What exactly is "PaddingSize"? Is it the number of voxels added on each side of the image, or the final size after padding?
The former.
Also, is there a function to get the size of the image?
PrintHeader
Hi, I have a question, maybe trivial. What exactly is "PaddingSize"? Is it the number of voxels added on each side of the image, or the final size after padding? Also, is there a function to get the size of the image?
There isn't ANTs-specific functionality to handle that data. Although you tried ITK Dicom, you might want to ask over at the ITK discussion forums to see if anybody has a solution.
Hello ANTs experts, I have a linear transformation file (translation + rotation) in DICOM format (Modality tagged as REG in the header) and I want to either convert this to a mat/tfm/nifti/nrrd/mha etc. format or directly apply it to an image using antsApplyTransforms. I tried the latter but it didn't work. I also tried to convert it using the ITK DICOM-to-NIfTI code, but that doesn't seem to be designed for a REG transformation as it only supports GDCMImageIO. Is this possible somehow? Thanks, Seth
Hello Nick, thank you very much for the advice, it was really helpful. I had misplaced some landmarks indeed. The result is better now but still needs work. Thank you very much for all your help and quick responses!
Cluster system requirements and protocols vary from institution to institution so I really can't provide any direction in that respect. You should ask your cluster administrators who will likely point you to a tutorial or other resources.
Thank you Nick. Can I ask: do you have any advice on how to use antsMotionCorr for a large number of volumes and benefit from a PBS cluster, e.g. how to submit the job? Thanks, Johan
% ImageMath | grep PadImage
PadImage : If Pad-Number is negative, de-Padding occurs
Usage : PadImage ImageIn PaddingSize [PaddingVoxelValue=0]
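A quick sanity check on that usage: since PaddingSize voxels are added on each side, padding a 256-voxel axis by 10 yields 276 voxels, and a negative value crops instead (file names below are hypothetical):

```shell
# PaddingSize is per side, so each axis grows by 2 * PaddingSize.
orig=256; pad=10
new=$(( orig + 2 * pad ))
echo "$new"   # 276
# Hypothetical ImageMath call (printed, not executed in this sketch):
echo "ImageMath 3 padded.nii.gz PadImage input.nii.gz $pad"
```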
Thanks. I also want to pad the boundaries of some images with zeros or a background value to make them a bigger size. Is there a function to do that?
Okay, well you need to calculate those transforms. Regarding mixing packages: I already provided the requisite caveat, but obviously you're welcome to do whatever you want. As far as I know, there's no universal/general solution that works for every case, so just proceed with caution and always look at your results.
No, I don't think so. By the way, on a side note, it seems that applying afni's 3dcopy can restore the sform matrix that we talked about.
Okay, do you have the transform from each average image to its corresponding T1?
Yes, I have the average image from the EPI. Thanks
Okay, have you made your average motion-corrected image? The code is in one of my previous responses.
Thanks Nick, I'm afraid I lost you there, would you please elaborate more? or point me to an example? Matt
Sure. Now you can register each subject's motion corrected data to the template by concatenating the transforms that go from the average EPI sequence through the corresponding T1-w image to the template.
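A hedged sketch of that concatenation (every file name here is hypothetical; `-e 3` marks the input as a time series, and antsApplyTransforms applies the `-t` list right-to-left, so the EPI-to-T1 transform is listed last and applied first):

```shell
# Assemble (but do not execute) the concatenated warp EPI -> T1 -> template.
cmd="antsApplyTransforms -d 3 -e 3 \
 -i motionCorrectedEpi.nii.gz \
 -r template0.nii.gz \
 -o epiInTemplateSpace.nii.gz \
 -t t1ToTemplate1Warp.nii.gz \
 -t t1ToTemplate0GenericAffine.mat \
 -t epiToT10GenericAffine.mat"
echo "$cmd"
```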
Try ResampleImage.
Hi, I have used ANTs for registration and intensity normalization of MRI images. Is there a function to resize the x, y dimensions of all the images to the same size, say from 192x192 to 64x64?
Hi Nick, you are right. I ran those two lines of motion correction for one subject, and although the orientations look OK in fsleyes and ITK-SNAP (the latter only works on the first volume), the sform matrix is not showing. So there is some effect here. If I skip the motion correction in ANTs, can we proceed to the next step using the files pre-processed in FSL? P.S. Interestingly, the template that I created seems to be unaffected and all its matrices are OK. Thanks, Matt
You might first want to see if your data is affected. Try looking at the registered images in both ITK-SNAP and fsleyes.
Oh, I see... then this could cause a problem down the line. I want to run group ICA on the final registered EPI files. On the other hand, FSL's registration would not give the best outcome, so I like to use ANTs for registration. Any work around you might think of? Thanks!
You have to be careful about mixing toolkits. There's a well-known difference in how FSL treats the header information vs. ITK/ANTs which dictates the image orientation in physical space. You can read more about it here.
Thanks Nick, my EPI data is already motion corrected using FSL's tools. Since you mentioned this set of parameters might not be optimal for my dataset, I might skip the motion correction for now. In that case, what would be my next step? Matt
Okay, so now you want to compute the transformations from your 4-D EPI sequences to your structural data. Typically, we'll do a motion correction of that sequence using antsMotionCorr:
$ antsMotionCorr -d 3 -a ${input4dEpi} -o ${averageEpi}
$ antsMotionCorr -d 3 -m MI[${averageEpi},${input4dEpi},1,32,0.1] -t Rigid[0.2] -i 500x100x50x0 -s 3x2x1x0 -f 5x4x2x1 -o [${outputPrefix},${outputPrefix}Warped.nii.gz] -v 1
Note that this is simply one set of parameters which might not be optimal for your sequence...
Hello, I have followed the script you mentioned, now I have all these files for each subject. Ready for next step. Thanks!
Okay, those landmarks look better but you also need to check your bspline landmarks. Run this:
antsLandmarkBasedTransformInitializer 3 \
  $landmarks_human \
  $landmarks_macaque \
  bspline \
  $bspline_init
and take a look at the resulting displacement field. An easy way to do this is in ITK-SNAP: Layer inspector --> General --> Display mode: Deformation Grid Display. You definitely have some mismatched landmarks. Also, instead of antsRegistrationSyN.sh use antsRegistrationSyNQuick.sh. You'll get much quicker...
Hi Nick, thank you for your answer. The error in the affine transform was due to a lack of landmarks in the posterior part of the brain. This is what I obtained with
antsApplyTransforms -d 3 -v 1 \
  -i $macaque_template \
  -r $human_template \
  -o test.nii.gz \
  -n linear \
  -t $affine_init
after adding two landmarks at the surface of the marginal and the parieto-occipital sulci: https://we.tl/t-9ftEwzAgR5
Okay, when I debug normalization issues, I think it's best to look at the results after each step. If we take a look at the results after the first landmark initialization, it's clear that something is off with one or more of your landmark pairs. Take a look at the attached screenshot, which shows the registered $macaque_template in the space of the $human_template. Specifically, you can do
antsApplyTransforms -d 3 -v 1 \
  -i $macaque_template \
  -r $human_template \
  -o test.nii.gz \
  -n linear \
  -t...
Hi Nick! Sorry about that. Here is the full script and the whole data and outputs via a WeTransfer link: https://we.tl/t-qZNicTMGQ4
datadir=
human_template=${datadir}/Human.nii.gz
macaque_template=${datadir}/Macaque.nii.gz
landmarks_human_few=${datadir}/Affine_landmarks_human.nii.gz
landmarks_macaque_few=${datadir}/Affine_landmarks_macaque.nii.gz
affine_init=${datadir}/macaque2human_affine.txt
landmarks_human=${datadir}/Bspline_landmarks_human.nii.gz
landmarks_macaque=${datadir}/Bspline_landmarks_macaque.nii.gz
...
It can handle 3-D images. It's just that your images are atypical, probably incorrect, and that is probably why the program segfaults.
Thanks for your suggestion. I'm wondering if this package can only process 2D images rather than 3D volumes? If so, is there any thickness-analysis-relevant package from ANTs that can process 3D volumes?
OK Nick, now I have this for all subjects.
No, just for one. for all, it is in progress...
Great. Did you do that for all your subjects?
Hi Nick, thanks! This is how I ran it:
antsRegistrationSyNQuick.sh -d 3 -f template0.nii.gz -m sbj_MPRAGE_brain.nii.gz -o sub_reg -n 1
It took ~5 mins to finish and it created these files:
sub_reg0GenericAffine.mat
sub_reg1InverseWarp.nii.gz
sub_reg1Warp.nii.gz
sub_regInverseWarped.nii.gz
sub_regWarped.nii.gz
Matt
Okay, the first thing to do is delete all those files except for template0.nii.gz. The subject-specific files are not optimal (outdated) and the other template* files are not useful. Once you do that you can continue with your processing. Let's start with registering each subject to the template. First try:
$ antsRegistrationSyNQuick.sh -d 3 -f $template -m $subjectT1 -o $outputPrefix -n ${numberOfThreads}
Once you get that done, let me know and we'll move on to the second step.
Hello, thanks for your help. I managed to create my template using antsMultivariateTemplateConstruction.sh from each subject's MPRAGE image! Since I am very new to ANTs, I'd appreciate your help again: my goal is to use these to do nonlinear registration of each subject's 4D data (EPI image) to this newly created template. So, antsMultivariateTemplateConstruction.sh creates 4 files as the output template:
template0.nii.gz (the output template)
template0Affine.txt
template0warp.nii.gz
templatewarplog.txt
...
These aren't typical inputs to the LaplacianThickness program, for which we don't really provide support. The program was designed for gray and white matter probability images. Although you could probably adapt the program, it would be on you to actually make the changes to the code.
Hi, attached are two 3D volume segmentation labels. I obtain that error by using these two volumes as inputs
Can you post your images?
I'm trying to use the LaplacianThickness function to calculate 3D retinal thickness via the Nipype module in Python. I can successfully pass in 3D NIfTI volumes, but a segmentation fault (core dumped) error pops up after all the computation is done. Does anyone know what the issue may be? I have tried the ANTs package on Anaconda and on Linux; neither of them works.