Outlier detection #36

Open · wants to merge 42 commits into base: main
cfcb6bd
Comment apex code
taslimisina Jun 9, 2021
13a8e54
Add Cifar10 outlier dataset and dataloader
taslimisina Jun 12, 2021
3b0daf3
Fix __getitem__ to get item directly from cifar10 dataset
taslimisina Jun 12, 2021
a3a0b11
Calculate attention mse loss with random noise
taslimisina Jun 12, 2021
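A minimal sketch of what this commit's attention-consistency objective could look like: MSE between the attention maps of clean and randomly noised inputs. The `(logits, attn_weights)` return convention matches the later modeling.py diff; the noise scale and mean reduction are assumptions.

```python
import torch
import torch.nn.functional as F

def attention_mse_loss(model, images, noise_std=0.1):
    """MSE between attention maps of clean and randomly noised inputs.
    Assumes model(x) -> (logits, attn_weights) with attn_weights a list
    of per-layer tensors; noise_std is a hypothetical default."""
    noisy = (images + noise_std * torch.randn_like(images)).clamp(0, 1)
    _, attn_clean = model(images)
    _, attn_noisy = model(noisy)
    per_layer = [F.mse_loss(a, b) for a, b in zip(attn_clean, attn_noisy)]
    return torch.stack(per_layer).mean()
```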
417663c
Fix outlier out of bound class error
taslimisina Jun 12, 2021
d38523b
Add attention loss average meter for log
taslimisina Jun 12, 2021
c9a1940
replaced random noise with adversarial
soroushtaslimi Jun 17, 2021
0d22bd6
add support for mnist dataset
sajjad2014 Jun 17, 2021
acdc3d9
add outliner_dataset.py
sajjad2014 Jun 17, 2021
0d00ad0
bug regarding positional arguments removed
soroushtaslimi Jun 19, 2021
c42a383
validation noise changed to adversarial
soroushtaslimi Jun 19, 2021
a076932
attention changed to tensor
soroushtaslimi Jun 23, 2021
78681bc
roc_auc score added for validation
soroushtaslimi Jun 24, 2021
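The ROC AUC validation metric added here can be computed with scikit-learn. The scores below are made-up stand-ins for whatever per-sample outlier score the PR produces (e.g. softmax confidence); `1` marks in-distribution samples and `0` marks outliers.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical validation scores: 1 = normal (in-distribution), 0 = outlier
y_true = np.array([1, 1, 1, 0, 0, 0])
scores = np.array([0.9, 0.5, 0.7, 0.4, 0.6, 0.2])

# AUC = fraction of (normal, outlier) pairs ranked in the right order
auc = roc_auc_score(y_true, scores)
```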
14e9b78
Merge pull request #1 from taslimisina/outliner_detection_dataset
soroushtaslimi Jun 26, 2021
37a1ed9
fgsm_attack defaults changed
soroushtaslimi Jun 28, 2021
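For reference, a self-contained sketch of the FGSM attack these commits tune. The `eps` default and the `(logits, attn_weights)` return convention are assumptions, not values taken from the PR.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, images, labels, eps=8 / 255):
    """One-step FGSM: move each pixel by eps along the sign of the loss
    gradient, then clamp back into the valid [0, 1] pixel range."""
    images = images.clone().detach().requires_grad_(True)
    logits, _ = model(images)          # assumes (logits, attn_weights) output
    loss = F.cross_entropy(logits, labels)
    grad, = torch.autograd.grad(loss, images)
    return (images + eps * grad.sign()).clamp(0, 1).detach()
```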
ac0fb57
valid put fgsm outside no_grad, train added labels
soroushtaslimi Jul 6, 2021
c11627e
CSV_Writer added
soroushtaslimi Jul 6, 2021
4e50b20
fgsm arguments added
soroushtaslimi Jul 11, 2021
52ea74b
add label_loss_coef argument
soroushtaslimi Jul 20, 2021
c6b7192
replace fgsm attack by pgd attack, remove extra files
sajjad2014 Jul 27, 2021
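A sketch of the PGD attack that replaces FGSM in this commit: the same signed-gradient step applied repeatedly, with the perturbation projected back into an L-infinity eps-ball around the clean images after every step. Step size and iteration count are assumed defaults.

```python
import torch
import torch.nn.functional as F

def pgd_attack(model, images, labels, eps=8 / 255, alpha=2 / 255, steps=10):
    """Projected Gradient Descent: iterated signed-gradient ascent on the
    loss, projected into the eps-ball and the valid pixel range."""
    orig = images.clone().detach()
    adv = orig.clone()
    for _ in range(steps):
        adv.requires_grad_(True)
        logits, _ = model(adv)         # assumes (logits, attn_weights) output
        loss = F.cross_entropy(logits, labels)
        grad, = torch.autograd.grad(loss, adv)
        adv = adv.detach() + alpha * grad.sign()
        adv = orig + (adv - orig).clamp(-eps, eps)  # project into eps-ball
        adv = adv.clamp(0, 1)                       # keep pixels valid
    return adv.detach()
```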
3d87ed2
add attn_loss_coef argument, change how attention loss is calculated
sajjad2014 Jul 27, 2021
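The two coefficients introduced in these commits presumably weight a combined objective; a hypothetical sketch (the exact combination used in the PR is not shown on this page):

```python
import torch

def combined_loss(label_loss, attn_loss, label_loss_coef=1.0, attn_loss_coef=1.0):
    """Weighted sum of the classification loss and the attention loss;
    the coefficient names follow the commit messages, the formula is assumed."""
    return label_loss_coef * label_loss + attn_loss_coef * attn_loss
```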
6284961
change global step to epoch
sajjad2014 Jul 27, 2021
2d82f2c
add more statistics to csv_writer, change default value of attention …
sajjad2014 Jul 27, 2021
9f90783
add normal_loss to val_csv_writer
sajjad2014 Jul 27, 2021
15c975e
fgsm change to PGD in parser
Jul 28, 2021
e711dcf
fgsm argument change to pgd in parser
Jul 28, 2021
d4e788e
fgsm argument change to pgd in parser
Jul 28, 2021
cbe2e39
delete extra new line HEAD
Jul 28, 2021
f695bd2
fgsm_eps changed to pgd_eps (bug fix)
Jul 31, 2021
ee90fa8
function for showing image (bug not fixed!) just for help
Aug 1, 2021
d61972e
import matplotlib
Aug 1, 2021
81d97cf
add attacked and normal image visualization, fixed minor bugs
sajjad2014 Aug 4, 2021
75a3bf2
fix visualization bug
sajjad2014 Aug 4, 2021
97f3fc0
add is_normal to visualization
sajjad2014 Aug 4, 2021
b09e39e
draw perturbation instead of noised image
sajjad2014 Aug 5, 2021
ea0cfa1
attention visualize function deleted
Aug 8, 2021
cc633f0
add validation, add coefficient to noise diff for better visualization
sajjad2014 Aug 8, 2021
413c364
fix wrong import bug
sajjad2014 Aug 8, 2021
07c406b
change visualization, change loss from normal to adversarial
sajjad2014 Aug 18, 2021
61332c2
add model saving/loading, add attention visualization, add new metric…
sajjad2014 Sep 14, 2021
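Model saving/loading, as added in this commit, is typically done through state dicts in PyTorch. A minimal sketch using an in-memory buffer in place of a file; the checkpoint layout and the stand-in model are assumptions.

```python
import io
import torch
import torch.nn as nn

model = nn.Linear(8, 2)          # stand-in for the repo's ViT model

# Save: serialize the state dict (plus any bookkeeping) to a checkpoint.
buffer = io.BytesIO()            # in-memory stand-in for a checkpoint file
torch.save({"model": model.state_dict(), "epoch": 3}, buffer)

# Load: restore the weights into a freshly constructed model.
buffer.seek(0)
ckpt = torch.load(buffer)
restored = nn.Linear(8, 2)
restored.load_state_dict(ckpt["model"])
```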
7f49731
add attention visualization during training
sajjad2014 Sep 14, 2021
24252fc
fix bug in csv writer, fix softmax auc
sajjad2014 Sep 15, 2021
1 change: 1 addition & 0 deletions .gitignore

@@ -144,3 +144,4 @@ cython_debug/
 !/logs/
 !/output/
 .idea
+.vscode
3 changes: 2 additions & 1 deletion models/modeling.py

@@ -244,6 +244,7 @@ def forward(self, hidden_states):
             if self.vis:
                 attn_weights.append(weights)
         encoded = self.encoder_norm(hidden_states)
+        # attn_weights = torch.stack(attn_weights, dim=1)
         return encoded, attn_weights

@@ -276,7 +277,7 @@ def forward(self, x, labels=None):
         if labels is not None:
             loss_fct = CrossEntropyLoss()
             loss = loss_fct(logits.view(-1, self.num_classes), labels.view(-1))
-            return loss
+            return loss, attn_weights
         else:
             return logits, attn_weights
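The second hunk above changes the labeled forward pass to return the attention weights alongside the loss, so call sites must now unpack a tuple. A toy stand-in illustrating the new signature (input shape, class count, and the placeholder attention maps are invented):

```python
import torch
import torch.nn as nn
from torch.nn import CrossEntropyLoss

class TinyModel(nn.Module):
    """Mirrors the modified forward(): with labels it now returns
    (loss, attn_weights) instead of a bare loss."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.num_classes = num_classes
        self.fc = nn.Linear(8, num_classes)

    def forward(self, x, labels=None):
        logits = self.fc(x)
        attn_weights = [torch.ones(x.size(0), 4, 4)]  # placeholder maps
        if labels is not None:
            loss_fct = CrossEntropyLoss()
            loss = loss_fct(logits.view(-1, self.num_classes), labels.view(-1))
            return loss, attn_weights   # changed: was `return loss`
        return logits, attn_weights

model = TinyModel()
x = torch.rand(2, 8)
y = torch.tensor([0, 1])
loss, attn = model(x, labels=y)       # call sites now unpack two values
```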