depth-weighted version for wMEM (#13)
* be_scores2alpha

* add stable clustering v1

* add stable clustering v2

* fix units for NIRS

* Compute MNE as separate step in the pipeline

Move MNE call outside be_normalize_and_unit

* use jMNE in be_memstruct

* fix name of jwMNE

* call mne on wavelet coefficient

* more changes on MNE

* better handle ratioAmp for cMEM and wMEM

* oops

* rename be_scores2alpha for wMEM

* New noise covariance matrix

* new noise covariance (panel)

* [pipeline] Add detrending step

* [MNE] Use MNE on the correct data; normalize each time point

* WIP - use new wavelet

* Fix panel

* first commit

* WIP

* better load the options

* [WIP] better handle baseline

* More update

* fix panel

* Revert "WIP - use new wavelet"

This reverts commit 0818b31.

* minor changes

* Remove wMSP (duplicate MSP)

* Refactor new alpha initialization (not using msp)

* small optimization

* register l-curve in MEM figure

* fix wmad

* modified std for shuffled baseline

* baseline shuffling proposal

* added baseline function, corrected bug

* fixed shuffle for single baseline

* explicitly mention the window length used for the baseline shuffling

* keep track of the time associated with each baseline

* Select the correct baseline

* better handle baseline selection

* [optim] optimize MNE computation

* bugfix: fix MNE :)

* [optim] More mne optimization

* change baseline duration to 20s

* move baseline option to optional

* fix selection of boxes to also work if pc_power = 1

* several bugfixes

* Rename baseline to be_shuffle_baseline

* Add default options

* [bugfix] remove the dependence on bst_process for the standalone

* bugfix: fix sample selection without depth-weighting in MNE

* Put the new alpha method as option in the panel

* [bugfix] Don't use already pre-computed MNE for alpha

* [bugfix] fix bug related to fusion of modalities

* Fixup elements in panel

* Add glue to improve vertical space distribution

* [bugfix] Allow shuffle baseline to work with data fusion

* [bugfix] Have be_stable clustering work with data fusion

* [bugfix] make MNE-based alpha use multimodal data (does not work)

* [gain2alpha] removing threshold to keep all clusters

* improve scaling

* show/hide the correct option for each pipeline

* more fixes

* Update .gitignore

* add new options

* Reorganize options

* [bugfix] empty MEMglobal

* [bugfix] better handle the baseline

* update default options

* adjust window size

* restore 'activate MEM Display' option while changing pipeline

* Saving time window settings on pipeline changes

* replace optimization routine text field with combo box

* Saving optimization routine on pipeline change

* creating "Can use fminunc" function

* Fix function output var name

* remove fminunc function from combo if not available

* Saving baseline segments on pipeline change

* [optim] pre-compute sigma_s before the loop

* more optim

* [optim] optimize the storage of the wMEM solution

* optimize wMEM output

* only call be_wavelet_inverse once

* Move the sample selection to be_launch_mem

* Change the logic used to show/hide expert panels

* fix UI overflow bug

* Fix inversion in button text: "Hide"/"Show"

* Fix accidentally added tab

* [bugfix] fix mne

* fix option export

* [bugfix] use be_struct_copy_fields when possible

* update MNE implementation

* Move the Bye message

* TYPO

* add MNE solver (for testing purpose only)

* final changes

* Remove sources with 0 gains

* [bugfix] Baseline selection

* Minor fixes

* Improve error detection when the forward model is wrong

---------

Co-authored-by: Jawata Afnan <[email protected]>
Co-authored-by: rcassani <[email protected]>
Co-authored-by: iliaazzo <[email protected]>
Co-authored-by: ilianAZZ <[email protected]>
5 people authored Sep 5, 2024
1 parent 5c0cdfc commit 5ad7cdb
Showing 44 changed files with 2,592 additions and 1,710 deletions.
5 changes: 3 additions & 2 deletions .gitignore
@@ -1,2 +1,3 @@

.DS_Store

.DS_Store
*.asv
8 changes: 8 additions & 0 deletions CHANGELOG
@@ -1,3 +1,11 @@
% 11-07-2024 - v3
- wMEM: Support of depth-weighting
- wMEM: stable parcellation in time
- wMEM: New alpha initialization using MNE ( alpha method = 6)
- wMEM: New noise covariance calculation (NoiseCov method = 6)
- wMEM: Baseline shuffling for resting state analysis
- GUI: new graphical interface to call MEM

% 26 - 10 - 2023
- MEM : Merge pipeline for EEG/MEG and fNIRS
- cMEM: Include option for depth-weighting
2 changes: 1 addition & 1 deletion best/VERSION.txt
@@ -1 +1 @@
2.7.4
3.0.0
4 changes: 2 additions & 2 deletions best/cmem/clusters/be_scores2alpha.m
@@ -78,9 +78,9 @@
case 4 % Method 4 (alpha = 1/2)
alpha(idCLS,jj) = 0.5;

case 5 % Method 4 (alpha = 1/2)
case 5 % Method 5 (alpha = 1 )
alpha(idCLS,jj) = 1;

otherwise
error('Wrong ALPHA Method')
end
45 changes: 17 additions & 28 deletions best/cmem/preprocess/be_normalize_and_units.m
@@ -35,7 +35,11 @@
for ii = 1 : numel(OPTIONS.mandatory.DataTypes) %For every Modality (Data Type)

% Std deviation for every channels on a modality
SD = std(OPTIONS.automatic.Modality(ii).baseline');
if isfield(OPTIONS.optional, 'baseline_shuffle') && OPTIONS.optional.baseline_shuffle
SD = std(OPTIONS.automatic.Modality(ii).baseline(:,:,1)');
else
SD = std(OPTIONS.automatic.Modality(ii).baseline');
end

% Define the mean standard deviation (MSD) of the present modality
MSD = mean(SD);
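For context, when the new baseline_shuffle option is enabled the baseline is apparently stored as a 3-D array (channels x samples x shuffled segments), and the per-channel standard deviation is taken from the first segment only. A minimal MATLAB sketch with made-up sizes (illustrative only, not the toolbox data structures):

% Illustrative only: assumed layout is channels x samples x shuffled segments.
baseline = randn(64, 2000, 10);     % 64 channels, 2000 samples, 10 shuffled segments
SD  = std(baseline(:,:,1)');        % per-channel std of the first segment (1 x 64)
MSD = mean(SD);                     % mean standard deviation across channels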
@@ -52,45 +56,30 @@
switch OPTIONS.optional.normalization

case 'fixed'
units_dipoles = 1e-9; % nAm
if any(ismember( 'NIRS', OPTIONS.mandatory.DataTypes))
units_dipoles = 1;
else
units_dipoles = 1e-9; % nAm
end

for ii = 1 : numel(OPTIONS.mandatory.DataTypes)
ratioG = 1/max( max(OPTIONS.automatic.Modality(ii).gain) );
OPTIONS.automatic.Modality(ii).units.Gain_units = ratioG;
OPTIONS.automatic.Modality(ii).units.Data_units = ratioG/(units_dipoles);
OPTIONS.automatic.Modality(ii).units.Cov_units = (ratioG/(units_dipoles))^2;
OPTIONS.automatic.Modality(ii).units_dipoles = units_dipoles;

OPTIONS.automatic.Modality(ii).units_dipoles = units_dipoles;
end

case 'adaptive'

%Local fusion of data and gain matrix to compute the
%regularisation parameter (J)
if numel(OPTIONS.mandatory.DataTypes)>1
M = [OPTIONS.automatic.Modality(1).data;OPTIONS.automatic.Modality(2).data];
G = [OPTIONS.automatic.Modality(1).gain;OPTIONS.automatic.Modality(2).gain];
else
M = OPTIONS.automatic.Modality(1).data;
G = OPTIONS.automatic.Modality(1).gain;
end

% % Add option OPTIONS.model.depth_weigth_MNE
if OPTIONS.model.depth_weigth_MNE > 0 || any(strcmp( OPTIONS.mandatory.DataTypes,'NIRS'))
J = be_jmne_lcurve(G,M,OPTIONS); % use depth weighting MNE for NIRS MEM
else
J = be_jmne(G,M,OPTIONS);
end
ratioAmp = 1 / max(max(abs(J))); %Same for both modalities


for ii = 1 : numel(OPTIONS.mandatory.DataTypes)

ratioG = 1 / max(max(OPTIONS.automatic.Modality(ii).gain));
ratioAmp = 1 / OPTIONS.automatic.Modality(ii).MNEAmp; %Same for every modalities
ratioG = 1 / max(max(OPTIONS.automatic.Modality(ii).gain));

OPTIONS.automatic.Modality(ii).units.Data_units = ratioAmp*ratioG;
OPTIONS.automatic.Modality(ii).units.Cov_units = (ratioAmp*ratioG)^2;
OPTIONS.automatic.Modality(ii).units.Gain_units = ratioG;

OPTIONS.automatic.Modality(ii).Jmne = J * ratioAmp;
OPTIONS.automatic.Modality(ii).ratioAmp = ratioAmp;

end
end
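In the adaptive case, the MNE solution is no longer computed inside this function; an amplitude ratio is instead derived from a per-modality MNEAmp value assumed to be filled in by the earlier MNE step. A rough, self-contained sketch of how such unit ratios are formed (toy values; the field layout is simplified and not the toolbox code):

% Illustrative only: toy leadfield and MNE solution in physical units.
gain     = randn(32, 500) * 1e-7;    % channels x sources
Jmne     = randn(500, 100) * 1e-9;   % sources x time, in Am
MNEAmp   = max(max(abs(Jmne)));      % peak MNE amplitude (assumed computed upstream)

ratioG   = 1 / max(max(abs(gain)));  % absorbs the leadfield units
ratioAmp = 1 / MNEAmp;               % absorbs the source-amplitude units (same idea for all modalities)

Data_units = ratioAmp * ratioG;      % scaling applied to the data
Cov_units  = (ratioAmp * ratioG)^2;  % corresponding scaling for the noise covariance
Gain_units = ratioG;                 % scaling applied to the leadfield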

5 changes: 3 additions & 2 deletions best/cmem/solver/be_cmem_pipelineoptions.m
@@ -4,11 +4,12 @@
DEF.clustering.clusters_type = 'static';
DEF.clustering.MSP_window = 1;
DEF.clustering.MSP_scores_threshold = 0.0;

DEF.model.alpha_threshold = 0.0;
DEF.model.active_mean_method = 2;
DEF.model.alpha_method = 3;
DEF.model.depth_weigth_MNE = 0;
DEF.model.depth_weigth_MEM = 0;
DEF.model.depth_weigth_MNE = 0.5;
DEF.model.depth_weigth_MEM = 0.5;

% automatic
DEF.automatic.selected_samples = [];
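The depth_weigth_MNE exponent (default now 0.5) controls how strongly deep sources are compensated in the minimum-norm prior; 0 disables depth weighting. A self-contained sketch of a depth-weighted MNE with a fixed regularization, under assumed variable names (the toolbox's be_jmne_lcurve additionally selects the regularization via an L-curve):

% Illustrative only: depth-weighted minimum-norm estimate with a fixed lambda.
G      = randn(32, 500);                      % leadfield: channels x sources
M      = randn(32, 100);                      % data: channels x time
gamma  = 0.5;                                 % depth-weighting exponent (cf. depth_weigth_MNE)
w      = sum(G.^2, 1) .^ gamma;               % per-source leadfield energy raised to gamma
Winv   = spdiags(1 ./ w(:), 0, 500, 500);     % diagonal source prior, boosting deep (weak) sources
lambda = 0.1 * trace(G * Winv * G') / 32;     % crude scaling for the regularization
J      = Winv * G' / (G * Winv * G' + lambda * eye(32)) * M;   % estimated sources x time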
22 changes: 12 additions & 10 deletions best/cmem/solver/be_cmem_solver.m
@@ -95,12 +95,10 @@
%%%%%%%%%%%%%%%%%%%%%%%%%%%%% END OF TO DO LIST %%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%


global MEMglobal

if OPTIONS.optional.verbose
fprintf('\n\n===== pipeline cMEM\n');
end
obj = struct('hfig', [] , 'hfigtab', [] );

%% Retrieve vertex connectivity - needed for clustering
[OPTIONS, obj.VertConn] = be_vertex_connectivity(HeadModel, OPTIONS);
@@ -143,14 +141,20 @@
% we keep leadfields of interest; we compute svd of normalized leadfields
[OPTIONS, obj] = be_main_leadfields(obj, OPTIONS);

%% ===== Apply temporal data window ===== %%
% check for a time segment to be localized
[OPTIONS] = be_apply_window( OPTIONS, [] );

%% ===== Compute Minimum Norm Solution ==== %%
% we compute MNE (using l-curve for nirs or depth-weighted version)
[obj, OPTIONS] = be_main_mne(obj, OPTIONS);

%% ===== Normalization ==== %%
% we absorb units (pT, nA) in the data, leadfields; we normalize the data
% and the leadfields
[OPTIONS] = be_normalize_and_units(OPTIONS);

%% ===== Apply temporal data window ===== %%
% check for a time segment to be localized
[OPTIONS] = be_apply_window( OPTIONS, [] );


%% ===== Double precision to single ===== %%
% relax the double precision for the msp (leadfield and data)
@@ -202,9 +206,7 @@
'MEMoptions', OPTIONS); % ...
%'MEMdata', obj);


return


disp('Bye.')
end
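In the first hunk of this file, be_apply_window is moved before the MNE and normalization steps, so the MNE (and the unit ratios derived from it) presumably operate only on the segment being localized. A generic sketch of restricting multichannel data to a time window (values and variable names are illustrative, not the be_apply_window implementation):

% Illustrative only: keep the samples that fall inside the analysis window.
fs   = 250;                          % assumed sampling rate (Hz)
t    = (0:999) / fs;                 % time axis for 1000 samples
data = randn(32, numel(t));          % channels x samples
win  = [1.0 2.5];                    % window of interest, in seconds
keep = t >= win(1) & t <= win(2);    % logical sample selection
data = data(:, keep);                % data restricted to the window
t    = t(keep);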


15 changes: 12 additions & 3 deletions best/main/be_main_clustering.m
@@ -54,12 +54,21 @@
switch OPTIONS.mandatory.pipeline
case 'cMEM'
[CLS, SCR, OPTIONS] = be_cmem_clusterize_multim(obj, OPTIONS); %TO DO: CHECK IF DATA SHOULD BE MINUS BSL OR NOT
[ALPHA, CLS, OPTIONS] = be_scores2alpha(SCR, CLS, OPTIONS);
case 'wMEM'
[CLS, SCR, OPTIONS] = be_wfdr_clustering_multim(obj, OPTIONS);
[CLS, SCR, OPTIONS] = be_wmem_clusterize_multim(obj, OPTIONS);

if OPTIONS.model.alpha_method ~= 6
[ALPHA, CLS, OPTIONS] = be_scores2alpha(SCR, CLS, OPTIONS);
else % We compute the score using MNE
[ALPHA, CLS, OPTIONS] = be_gain2alpha(obj , CLS, OPTIONS);
end

case 'rMEM'
[CLS, SCR, OPTIONS] = be_rmem_clusterize_multim(obj, OPTIONS);
[ALPHA, CLS, OPTIONS] = be_scores2alpha(SCR, CLS, OPTIONS);

end
[ALPHA, CLS, OPTIONS] = be_scores2alpha(SCR, CLS, OPTIONS);

elseif strcmp( OPTIONS.mandatory.pipeline, 'wMEM' )
ALPHA = OPTIONS.optional.clustering.initial_alpha * ones(1,size(OPTIONS.automatic.Modality(1).selected_jk, 2));
@@ -77,7 +86,7 @@
obj.CLS = CLS;
obj.ALPHA = ALPHA;


end
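For wMEM, the new MNE-based alpha initialization is selected through the alpha method option (value 6, per the CHANGELOG); any other value keeps the classic score-based initialization. A minimal illustration of the relevant option fields (names taken from the diff; the surrounding MEM call is not shown):

% Illustrative only: selecting the MNE-based alpha initialization for wMEM.
OPTIONS = struct();
OPTIONS.mandatory.pipeline = 'wMEM';
OPTIONS.model.alpha_method = 6;   % 6 -> be_gain2alpha (MNE-based); other values -> be_scores2alpha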



1 change: 1 addition & 0 deletions best/main/be_main_leadfields.m
@@ -45,6 +45,7 @@

for ii = 1 : numel(OPTIONS.mandatory.DataTypes)
OPTIONS.automatic.Modality(ii).gain = OPTIONS.automatic.Modality(ii).gain(:, obj.iModS);
obj.VertConn = obj.VertConn(obj.iModS,obj.iModS);
OPTIONS.automatic.Modality(ii).gain_struct = be_decompose_gain(OPTIONS.automatic.Modality(ii).gain);
end
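This one-line addition keeps the vertex-connectivity matrix consistent with the restricted leadfield, which matters once sources with zero gain are dropped (see the "Remove sources with 0 gains" commit above). A toy MATLAB sketch of the same idea (variable names mirror the diff but the selection of iModS here is a stand-in):

% Illustrative only: restrict leadfield and adjacency to the same source subset.
gain     = randn(32, 500);
gain(:, [3 17 42]) = 0;              % pretend a few sources have a zero leadfield
VertConn = sparse(500, 500);         % toy vertex-connectivity (adjacency) matrix
iModS    = find(any(gain ~= 0, 1));  % indices of the sources kept
gain     = gain(:, iModS);           % leadfield restricted to kept sources
VertConn = VertConn(iModS, iModS);   % adjacency restricted to the same sources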

2 changes: 2 additions & 0 deletions best/main/be_main_sources.m
@@ -54,6 +54,8 @@
% what we keep:
obj.nb_sources = numel(obj.iModS);

assert(obj.nb_sources > 0, 'The source space is empty' );

if OPTIONS.optional.verbose
fprintf(' done.\n');
end