[Tutorial] How to analyze large neuroimaging datasets - the NeuroJSON way #11
fangq started this conversation in NeuroJSON.io and data hosting service
A large dataset typically contains several gigabytes to hundreds of gigabytes of binary data files. Conventionally, a user has to download the entire zipped data package (many GB in size) regardless of whether they need all of the embedded large data files.
Using our lightweight REST-API and JSON-encoded datasets, a user can selectively download and locally cache only the data files that are needed for the analysis, making it much faster to start working with large datasets. Below are three tutorial videos showing how to do this in MATLAB and Octave.
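As a quick illustration, the entire layout of a dataset can be retrieved as a small JSON document before any binary file is fetched. Here is a minimal sketch, assuming JSONLab (https://github.com/fangq/jsonlab) is installed and using an OpenNeuro dataset name for illustration; recent JSONLab releases can read directly from a URL, otherwise download the JSON file first:

```matlab
% fetch only the JSON-encoded dataset document (kilobytes, not gigabytes)
data = loadjson('https://neurojson.io:7777/openneuro/ds000001');

% the struct mirrors the dataset folder layout; large binary files appear
% as _DataLink_ URLs and are NOT downloaded at this point
fieldnames(data)
```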
Tutorial Part 1 - Using the REST-API to download data and query a dataset
jcache1.mp4
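A sketch of the kind of query Part 1 demonstrates, using JSONLab's jsonpath() function; the JSONPath expression and the anat field name are illustrative and depend on the dataset:

```matlab
% load the dataset document over the REST-API (see the sketch above)
data = loadjson('https://neurojson.io:7777/openneuro/ds000001');

% list the download URLs of all anatomical scans without fetching any data
links = jsonpath(data, '$..anat.._DataLink_')
```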
Tutorial Part 2 - Downloading and rendering linked data files
jcache2.mp4
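A sketch of the selective download that Part 2 demonstrates, assuming JSONLab's jdlink() helper, which resolves _DataLink_ URLs and caches the downloaded files locally; the 'regex' filter and the return value shown here are my recollection of the interface, so please check `help jdlink` for the exact options:

```matlab
data  = loadjson('https://neurojson.io:7777/openneuro/ds000001');
links = jsonpath(data, '$..anat.._DataLink_');

% download (or reuse from the local cache) only the files matching 'sub-01';
% jdlink returns the parsed content of the retrieved files
downloaded = jdlink(links, 'regex', 'sub-01');
```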
Tutorial Part 3 - Using the JSON API in Octave
jcache3.mp4
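Because JSONLab is written in pure MATLAB code, the same calls run unmodified in GNU Octave. A sketch, with a placeholder path:

```matlab
addpath('/path/to/jsonlab');   % placeholder; point this to your JSONLab copy

data = loadjson('https://neurojson.io:7777/openneuro/ds000001');
savejson('', data, 'filename', 'ds000001.json');   % save a local copy
```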