Merge pull request #93 from Hzaatiti/operational-protocol
Add KIT .con file reading using FieldTrip.
Showing 5 changed files with 11 additions and 17 deletions.
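The commit description refers to reading KIT `.con` recordings with FieldTrip. The pipeline code itself is not shown in this diff, so the snippet below is only a minimal sketch of that step; the FieldTrip path and the `.con` file name are placeholders.

```matlab
% Minimal sketch (not part of this commit): load a KIT .con recording with FieldTrip.
% The FieldTrip path and the file name are placeholders.
addpath('path/to/fieldtrip');                 % hypothetical install location
ft_defaults;

confile = 'sub-01_task-audiovisualmotor.con'; % hypothetical KIT recording

% Inspect the header: sampling rate, channel count
hdr = ft_read_header(confile);
fprintf('Fs = %g Hz, %d channels\n', hdr.Fs, hdr.nChans);

% Read the continuous data into a FieldTrip raw data structure
cfg            = [];
cfg.dataset    = confile;
cfg.continuous = 'yes';
data           = ft_preprocessing(cfg);
```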
@@ -2,19 +2,22 @@ Experiment example 9: Auditory vs Visual vs Motor stimulus
----------------------------------------------------------

In this experiment, a random sequence of three stimuli is presented:

- an auditory stimulus with a 200 Hz tone
- a visual stimulus with a white flash appearing on screen
- a motor stimulus requiring a button press

-Data is stored safely in NYU Box under `audio-visual-motor` file.
+Acquired datasets are stored safely in NYU Box under the `audio-visual-motor` folder.

`MEG Data Directory <https://nyu.box.com/v/meg-datafiles>`_

Author: Hadi Zaatiti <[email protected]>

Download code from: https://github.com/Hzaatiti/meg-pipeline/tree/main/experiments/psychtoolbox/auditory-vs-visual

.. dropdown:: Audio vs Visual vs Motor experiment

   .. literalinclude:: ../../../../experiments/psychtoolbox/auditory-vs-visual/auditory_vs_visual.m
      :language: matlab
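The documentation above lists the three trial types (200 Hz tone, white flash, button press) presented in a random sequence. The real task is `auditory_vs_visual.m`, pulled in by the `literalinclude` above; the sketch below only illustrates that loop structure with Psychtoolbox, and the trial count, flash duration, and inter-trial jitter are arbitrary placeholders.

```matlab
% Rough sketch only (the real task is auditory_vs_visual.m, included above).
% Assumes Psychtoolbox is installed; all timings and counts are placeholders.
PsychDefaultSetup(2);
screenNumber = 0;
[win, ~] = PsychImaging('OpenWindow', screenNumber, BlackIndex(screenNumber));

% Auditory stimulus: 0.5 s, 200 Hz tone through PsychPortAudio
fs   = 48000;
tone = MakeBeep(200, 0.5, fs);
pa   = PsychPortAudio('Open', [], 1, 1, fs, 1);
PsychPortAudio('FillBuffer', pa, tone);

% Random sequence over the three conditions (1 = auditory, 2 = visual, 3 = motor)
nTrials    = 30;                                   % placeholder trial count
conditions = repmat(1:3, 1, nTrials/3);
conditions = conditions(randperm(numel(conditions)));

for c = conditions
    switch c
        case 1                                     % auditory: play the tone
            PsychPortAudio('Start', pa, 1, 0, 1);
        case 2                                     % visual: brief white flash
            Screen('FillRect', win, WhiteIndex(screenNumber));
            Screen('Flip', win);
            WaitSecs(0.2);
            Screen('FillRect', win, BlackIndex(screenNumber));
            Screen('Flip', win);
        case 3                                     % motor: wait for a button press
            KbWait(-1);
    end
    WaitSecs(1 + rand);                            % jittered inter-trial interval
end

PsychPortAudio('Close', pa);
sca;
```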
File renamed without changes.
@@ -1,12 +1,3 @@
-# oddball
-MATLAB code for an auditory oddball task with eyetracking.
-This is just a very simple example of how to play sounds and record eyetracking using Psychtoolbox (http://psychtoolbox.org/) in MATLAB.
+# Auditory Vs Visual Vs Motor activity

-SPECIAL NOTE: The sounds in the folder 'SoundFiles' have not been RMS equalised!
-SPECIAL NOTE 2: There might be **many bugs!**

-## Stimuli
-In one block, deviant sounds are presented against standard sounds. The standard sound is a 500-Hz pure tone, and there are two deviant types: white noise and harmonic tones (f0 = 200 Hz; 30 harmonics). All sounds are 500 ms long, with 30-ms cosine onset and offset ramps (they should also be RMS equalised). Each deviant type is presented 10 times, for 20 deviant trials in total, and the standard 60 times, giving an overall deviant probability of 25%. The inter-sound-onset interval is randomised between 3 and 3.5 seconds.
-## Procedure
-After calibration, subjects are instructed to fixate on a fixation cross and listen to the sounds through headphones.
-Before the presentation of the first sound, there is a 15-second resting-state recording.
+This experiment provides three conditions (visual, auditory, and motor), presented in a random permutation.
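For reference, the `Stimuli` section deleted above fully specifies the oddball sounds: a 500-Hz standard, plus white-noise and 30-harmonic (f0 = 200 Hz) deviants, each 500 ms with 30-ms cosine ramps. A plain-MATLAB sketch of generating those waveforms, with the sampling rate assumed and no RMS equalisation applied, might look like this:

```matlab
% Sketch only: generate the oddball stimuli described in the deleted README text.
% Sampling rate is an assumption; no RMS equalisation is applied here.
fs  = 44100;                    % assumed sampling rate (Hz)
dur = 0.5;                      % all sounds are 500 ms
t   = (0:round(dur*fs)-1)/fs;

standard = sin(2*pi*500*t);     % standard: 500-Hz pure tone
noiseDev = randn(size(t));      % deviant 1: white noise

f0 = 200; harmDev = zeros(size(t));
for k = 1:30                    % deviant 2: harmonic tone, 30 harmonics of 200 Hz
    harmDev = harmDev + sin(2*pi*k*f0*t);
end

% 30-ms raised-cosine onset/offset ramps
nr   = round(0.03*fs);
ramp = 0.5*(1 - cos(pi*(0:nr-1)/nr));
env  = [ramp, ones(1, numel(t) - 2*nr), fliplr(ramp)];

applyEnv = @(x) (x ./ max(abs(x))) .* env;   % peak-normalise, then apply ramps
standard = applyEnv(standard);
noiseDev = applyEnv(noiseDev);
harmDev  = applyEnv(harmDev);

% sound(standard, fs);          % audition one of the stimuli
```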