<!DOCTYPE html>
<html>
<head>
<meta charset='utf-8' />
<meta http-equiv="X-UA-Compatible" content="chrome=1" />
<meta name="description" content="Neuroimaging codes : Pipeline for NeuroImaging" />
<link rel="stylesheet" type="text/css" media="screen" href="stylesheets/stylesheet.css">
<title>Neuroimaging codes</title>
</head>
<body>
<!-- HEADER -->
<div id="header_wrap" class="outer">
<header class="inner">
<a id="forkme_banner" href="https://github.com/mrhunsaker/NeuroImaging_Codes">View on GitHub</a>
<h1 id="project_title">Neuroimaging codes</h1>
<h2 id="project_tagline">Pipeline for NeuroImaging</h2>
<section id="downloads">
<a class="zip_download_link" href="https://github.com/mrhunsaker/NeuroImaging_Codes/zipball/master">Download this project as a .zip file</a>
<a class="tar_download_link" href="https://github.com/mrhunsaker/NeuroImaging_Codes/tarball/master">Download this project as a tar.gz file</a>
</section>
</header>
</div>
<!-- MAIN CONTENT -->
<div id="main_content_wrap" class="outer">
<section id="main_content" class="inner">
<h2>
<a name="required-programs" class="anchor" href="#required-programs"><span class="octicon octicon-link"></span></a>Required Programs</h2>
<p>These are the programs that we use in this pipeline. All of these except for Mango and a batch renaming application are called from the terminal.</p>
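<p>Before starting, it is worth confirming that the command-line tools listed below are actually on the PATH. The following is a minimal sketch of such a check; it assumes the binaries use their usual names and that the ANTSPATH and FSLDIR environment variables are set as they are in the commands later on this page.</p>
<pre><code># hypothetical dependency check; adjust names and paths to your installation
for tool in dcm2nii c3d R ; do
  command -v "$tool" >/dev/null || echo "MISSING: $tool"
done
for tool in "$ANTSPATH/ANTS" "$ANTSPATH/N4BiasFieldCorrection" "$ANTSPATH/WarpImageMultiTransform" "$FSLDIR/bin/fslstats" "$FSLDIR/bin/fslroi" ; do
  [ -x "$tool" ] || echo "MISSING: $tool"
done
</code></pre>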
<ul>
<li> <a href="http://www.mccauslandcenter.sc.edu/mricro/mricron/dcm2nii.html">dcm2nii</a>
</li>
<li> ANTS (nightly build: git://github.com/stnava/ANTs.git)</li>
<li> ITK (nightly build: git://itk.org/ITK.git)</li>
<li> <a href="http://fsl.fmrib.ox.ac.uk/fsl/fslwiki/">FMRIB Software Library (FSL)</a>
</li>
<li> <a href="http://www.itksnap.org/pmwiki/pmwiki.php?n=Convert3D.Documentation">Convert3D</a>
</li>
<li> <a href="http://www.r-project.org">R Statistical Computing Language</a>
</li>
<li> <a href="http://ric.uthscsa.edu/mango/">Mango</a>
</li>
<li> Batch renaming application capable of using regular expressions (e.g., <a href="http://www.publicspace.net/ABetterFinderRename/">Better Rename</a>)</li>
<li> Terminal set to the Bourne Again Shell (bash)</li>
</ul><hr><h2>
<a name="unzip-targz-files" class="anchor" href="#unzip-targz-files"><span class="octicon octicon-link"></span></a>unzip .tar.gz files</h2>
<pre><code>gunzip ./original/*.tar.gz
for i in ./original/*.tar ; do
sudo tar xopvf $i
done
</code></pre>
<hr><h2>
<a name="convert-dicom-files-to-nifti-format" class="anchor" href="#convert-dicom-files-to-nifti-format"><span class="octicon octicon-link"></span></a>Convert DICOM files to NIfTI format</h2>
<p>This program batch converts DICOM files into NIfTI format (nii.gz). It also crops neck and non-brain tissue from the output, resulting in a file with a "co" prefix, and attempts to align the images to a standard orientation.</p>
<pre><code>./dcm2nii -a y -d n -e n -f n -g n -i n -p n -r n -x y ./original/*/
find ./original/ -type f -name co*.nii.gz -exec cp {} ./data/ \;
</code></pre>
<p>At this point, rename the files with BetterRename.app or another program capable of batch renaming files using regular expressions, so that names follow the format mmu12345_Xwk.nii.gz for later steps. Use the following regular expression:</p>
<pre><code>(\w{2})+(\d{8})+(\_)+(\d{10})+(\w{6})+(\w{3})+(\d{5})(\d{1})+(\w*)
</code></pre>
<p>Replace each match with the following, where X is the age of the subject at the time of the scan:</p>
<pre><code>\6\7_Xwk
</code></pre>
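<p>As an alternative to the GUI rename, a loop like the following could preview the same rename from the terminal. This is a hypothetical sketch: it assumes Perl is available and that the cropped filenames actually match the pattern above. AGE_WEEKS plays the role of X, so it still has to be set by hand for each batch of scans.</p>
<pre><code># sketch only: prints the mv commands instead of running them
AGE_WEEKS=12   # X = age of the subject at scan, set manually
for f in ./data/co*.nii.gz ; do
  new=$(basename "$f" .nii.gz | perl -pe 's/(\w{2})+(\d{8})+(_)+(\d{10})+(\w{6})+(\w{3})+(\d{5})(\d{1})+(\w*)/$6$7/')
  echo mv -v "$f" "./data/${new}_${AGE_WEEKS}wk.nii.gz"   # drop "echo" once the preview looks right
done
</code></pre>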
<hr><h2>
<a name="rigid-affine-alignment-to-a-template-to-ac-pc-align" class="anchor" href="#rigid-affine-alignment-to-a-template-to-ac-pc-align"><span class="octicon octicon-link"></span></a>Rigid affine alignment to a template to AC-PC Align</h2>
<p>This aligns all of the scans within the same general template space, but does not affect the size or shape of the scan being aligned to the template. However, this does <strong>not</strong> explicitly AC-PC align the images and does <strong>not</strong> change the scan origin the way the acpcdetect program in the ART package does.</p>
<pre><code>for i in ./data/*.nii.gz ; do
cd "./Template Directory/"
$ANTSPATH/ANTS 3 -m MI[&lt;Template&gt;.nii.gz,$(dirname $i)/$(basename $i .nii.gz).nii.gz,1,32] -o $(dirname $i)/$(basename $i .nii.gz)ACPC -i 0 --number-of-affine-iterations 1x1x1 --rigid-affine true
$ANTSPATH/WarpImageMultiTransform 3 $(dirname $i)/$(basename $i .nii.gz).nii.gz $(dirname $i)/$(basename $i .nii.gz)_acpc.nii.gz $(dirname $i)/$(basename $i .nii.gz)ACPCAffine.txt -R &lt;Template&gt;.nii.gz
done
</code></pre>
<hr><h2>
<a name="reslice-images-to-35mm-isotropic-resolution-and-set-a-consistent-field-of-view" class="anchor" href="#reslice-images-to-35mm-isotropic-resolution-and-set-a-consistent-field-of-view"><span class="octicon octicon-link"></span></a>Reslice images to .35mm isotropic resolution and set a consistent field of view</h2>
<pre><code>for i in ./data/*_acpc.nii.gz ; do
./c3d $(dirname $i)/$(basename $i .nii.gz).nii.gz -interpolation Cubic -resample-mm .35x.35x.35mm -trim-to-size 256x256x256vox -verbose -o $(dirname $i)/$(basename $i .nii.gz)_resampled.nii.gz
done
</code></pre>
<hr><h2>
<a name="n4itk-bias-field-correction" class="anchor" href="#n4itk-bias-field-correction"><span class="octicon octicon-link"></span></a>N4ITK Bias Field Correction</h2>
<p>This removes the bias field generated by magnetic field inhomogeneity in the MRI scanning procedure. The correction is run three times in sequence with increasingly fine parameters (shrink factors of 8, 4, and 2).</p>
<pre><code>for i in /Users/thehunsakers/Documents/MPRAGE/*/*_resampled.nii.gz ; do
$ANTSPATH/N4BiasFieldCorrection -d 3 -i $(dirname $i)/$(basename $i .nii.gz).nii.gz -o $(dirname $i)/$(basename $i .nii.gz)_n4.nii.gz -s 8 -b [200] -c [50x50x50x50,0.000001]
$ANTSPATH/N4BiasFieldCorrection -d 3 -i $(dirname $i)/$(basename $i .nii.gz)_n4.nii.gz -o $(dirname $i)/$(basename $i .nii.gz)_n4.nii.gz -s 4 -b [200] -c [50x50x50x50,0.000001]
$ANTSPATH/N4BiasFieldCorrection -d 3 -i $(dirname $i)/$(basename $i .nii.gz)_n4.nii.gz -o $(dirname $i)/$(basename $i .nii.gz)_n4.nii.gz -s 2 -b [200] -c [50x50x50x50,0.000001]
done
</code></pre>
<hr><h2>
<a name="landmark-guided-template-matching-to-generate-region-of-interest" class="anchor" href="#landmark-guided-template-matching-to-generate-region-of-interest"><span class="octicon octicon-link"></span></a>Landmark Guided Template Matching to Generate Region of Interest</h2>
<p>Place control point landmarks within the ROI at regions of maximal anatomical variability across scans; the script then takes the traced, gold-standard ROI in the templates folder and maps it onto each scan. The guided registration script is located <a href="https://github.com/mrhunsaker/ANTS_Scripts/blob/master/guidedregistration.sh">here</a>.</p>
<pre><code>for i in ./data/*_n4.nii.gz ; do
cd "./data/Template Folder/"
sh guidedregistration.sh &lt;Template&gt;.nii.gz &lt;Template_roi&gt;.nii.gz $(dirname $i)/$(basename $i .nii.gz).nii.gz $(dirname $i)/$(basename $i .nii.gz)_roi.nii.gz $(dirname $i)/$(basename $i .nii.gz)_OUTPUT 100x100x10 3
done
for i in ./data/*_OUTPUTSEGMENTED.nii.gz ; do
./c3d $(dirname $i)/$(basename $i .nii.gz).nii.gz -binarize -o $(dirname $i)/$(basename $i .nii.gz)_binary.nii.gz
done
</code></pre>
<hr><h2>
<a name="learning-based-wrapper-to-fix-landmark-matching-roi" class="anchor" href="#learning-based-wrapper-to-fix-landmark-matching-roi"><span class="octicon octicon-link"></span></a>Learning Based Wrapper to Fix Landmark Matching ROI</h2>
<p>At this point, trace a number of primate regions of interest to serve as gold standards and run them through the SegAdapter pipeline, which uses machine learning to correct systematic errors introduced during the Landmark Guided Template Matching step.</p>
<pre><code>./bl ./data/inputIMAGEFILE.txt ./data/manualSegmentationFile.txt ./data/autoSegmentationFile.txt 1 2 4x4x4 .1 500 ./data/TRAINING/training
for i in ./data/*.nii.gz ; do
./sa $(dirname $i)/$(basename $i .nii.gz).nii.gz $(dirname $i)/$(basename $i .nii.gz).nii.gz ./data/TRAINING/training $(dirname $i)/$(basename $i .nii.gz)_CORR.nii.gz
done
</code></pre>
<h2>
<a name="compute-dice-coefficient" class="anchor" href="#compute-dice-coefficient"><span class="octicon octicon-link"></span></a>Compute DICE Coefficient</h2>
<pre><code>for i in ./data/*.nii.gz; do
c3d -verbose -overlap 1 $(dirname $i)/$(basename $i .nii.gz).nii.gz $(dirname $i)/$(basename $i .nii.gz)_OUTPUThipp_binary.nii.gz $(dirname $i)/$(basename $i .nii.gz)_CORR.nii.gz > ./data/$(basename $i .nii.gz).txt
done
</code></pre>
<hr><h2>
<a name="extract-roi-volumes-for-analysis" class="anchor" href="#extract-roi-volumes-for-analysis"><span class="octicon octicon-link"></span></a>Extract ROI Volumes for Analysis</h2>
<p>This will compute the volumes and number of voxels contained within each region of interest for analysis.</p>
<pre><code>find ./data/ -type f -name *_forANALYSIS.nii.gz -exec cp -fpv {} ./data/FINAL_ROI/ \;
for i in $(find ./data/FINAL_ROI/ -type f -name "_CORR.nii.gz"); do
val=$(fslval $i dim1)
xsize=$(echo "$val/2" | bc)
fslroi $i $(dirname $i)/$(basename $i .nii.gz)_HPCl.nii.gz 0 $xsize 0 -1 0 -1
xmin=$xsize; xsize=$(echo "$val-$xmin" | bc)
fslroi $i $(dirname $i)/$(basename $i .nii.gz)_HPCr.nii.gz $xmin $xsize 0 -1 0 -1
fslmaths $(dirname $i)/$(basename $i .nii.gz)_HPCr.nii.gz -mul 2 $(dirname $i)/$(basename $i .nii.gz)_HPCr.nii.gz
fslmerge -x $(dirname $i)/$(basename $i .nii.gz)_HPC.nii.gz $(dirname $i)/$(basename $i .nii.gz)_HPCr.nii.gz $(dirname $i)/$(basename $i .nii.gz)_HPCl.nii.gz
c3d $(dirname $i)/$(basename $i .nii.gz)_HPC.nii.gz -split -oo $(dirname $i)/temp.nii.gz $(dirname $i)/$(basename $i .nii.gz)_HPCl.nii.gz $(dirname $i)/$(basename $i .nii.gz)_HPCr.nii.gz
done
for i in ./data/FINAL_ROI/*_forANALYSIS.nii.gz; do
$FSLDIR/bin/fslstats $i -V >$(dirname $i)/$(basename $i .nii.gz).txt
done
for i in ./data/FINAL_ROI/*right.txt; do
newname=`echo "$i" | sed 's/_acpc_resampled_n4_OUTPUTSEGMENTED_binary_Corrected_forANALYSIS_right/_right/g'`
mv "$i" "$newname"
done
for i in ./data/FINAL_ROI/*left.txt; do
newname=`echo "$i" | sed 's/_acpc_resampled_n4_OUTPUTSEGMENTED_binary_Corrected_forANALYSIS_left/_left/g'`
mv "$i" "$newname"
done
for i in ./data/FINAL_ROI/*forANALYSIS.txt; do
rm -f $i
done
find ./data/FINAL_ROI/ -type f -name "*.txt" -exec cp -fpv {} ./data/FINAL_ROI/Volumes/ \;
</code></pre>
<hr><h2>
<a name="trim-rois-for-3d-rendering" class="anchor" href="#trim-rois-for-3d-rendering"><span class="octicon octicon-link"></span></a>Trim ROIs for 3D Rendering</h2>
<p>This trims each ROI to a 5 voxel padding, which greatly reduces the file size and makes 3D visualization much clearer.</p>
<pre><code>for i in ./data/*_Corrected.nii.gz ; do
./c3d $(dirname $i)/$(basename $i) -trim 5vox -o $(dirname $i)/$(basename $i .nii.gz)_forRENDERING.nii.gz
done
</code></pre>
<hr><h2>
<a name="organize-roi-into-a-file-using-r" class="anchor" href="#organize-roi-into-a-file-using-r"><span class="octicon octicon-link"></span></a>Organize ROI into a File using R</h2>
<p>This script takes the region of interest volumes generated above and tabulates them in a single text file with Subject ID, Age, Right Volume, and Left Volume.</p>
<pre><code>R
rm(list = ls(all = TRUE))
library(limma)
library(reshape)
library(outliers)
library(psych)
library(doBy)
library(gdata)
library(gplots)
library(lattice)
setwd("./data/FINAL_ROI/Volumes/")
files=list.files(path=".", recursive=TRUE)
all_files=data.frame(files=NULL)
for(i in seq(along=files))
{
orig=read.table(files[i], header=FALSE, sep=" ")
orig=rename(orig,c(V1="voxels",V2="volume"))
all_files=rbind(all_files,orig)
}
all_files=subset(all_files, select=-c(V3))
files=removeExt(files)
all_files=cbind(files,all_files)
data=all_files
allRIGHT=subset(data,grepl("_HPCr",data$files))
allLEFT=subset(data,grepl("_HPCl",data$files))
Subjects=allRIGHT$files
Subjects=as.character(Subjects)
Age=gsub("(\\w+)_(\\d+)(\\w+)", "\\2", Subjects)
Subjects=gsub("(\\w+)_(\\d+)(\\w+)", "\\1", Subjects)
R_Voxels=allRIGHT$voxels
R_Volume=allRIGHT$volume
L_Voxels=allLEFT$voxels
L_Volume=allLEFT$volume
Hippocampal_Volumes=cbind(Subjects,Age,R_Volume,L_Volume)
write.table(Hippocampal_Volumes, file="Hippocampal_ROI.txt",sep=",",row.names=FALSE,col.names=TRUE)  # written into ./data/FINAL_ROI/Volumes/, the current working directory
</code></pre>
<h2>
<a name="have-yet-to-be-implemented" class="anchor" href="#have-yet-to-be-implemented"><span class="octicon octicon-link"></span></a>Have yet to be implemented</h2>
<ul>
<li> Set up and pilot acpcdetect for primate scans</li>
<li> Histogram clipping to normalize grayscale levels</li>
<li> Brain extraction (via Atropos)</li>
<li> DBM analyses to analyze ROI change over time</li>
</ul>
</section>
</div>
<!-- FOOTER -->
<div id="footer_wrap" class="outer">
<footer class="inner">
<p class="copyright">Neuroimaging codes maintained by <a href="https://github.com/mrhunsaker">mrhunsaker</a></p>
<p>Published with <a href="http://pages.github.com">GitHub Pages</a></p>
</footer>
</div>
</body>
</html>