Fix/435 broken imports #436

Merged
merged 39 commits, Oct 5, 2023
Commits
524c526
Fix imports in notebooks
schroedk Sep 20, 2023
25b6474
Change artist name for example
schroedk Sep 20, 2023
7cbf4e6
Changes in executed notebook shapley_basic_spotify
schroedk Sep 21, 2023
3ae8579
Fix import of sklearn datasets
schroedk Sep 21, 2023
26002c3
Update output of notebooks/shapley_knn_flowers
schroedk Sep 21, 2023
9f9b289
Fix DataLoader creation, due to dimension mismatch
schroedk Sep 21, 2023
7fff43c
Fix pre-allocation of tensor bug, due to incorrect data length comput…
schroedk Sep 21, 2023
d0c73ab
Update output of notebooks/influence_synthetic
schroedk Sep 21, 2023
52f00ff
Update output of notebooks/influence_wine
schroedk Sep 21, 2023
c02678a
Fix broken notebook/influence_imagenet:
schroedk Sep 21, 2023
7a3b5f0
Fix issue, due to incorrect handling of partial models (only subset o…
schroedk Sep 25, 2023
f5d617d
Changed output of notebooks/influence_imagenet
schroedk Sep 25, 2023
9ea62e6
Use backward autodiff for hessian vector product computation
schroedk Sep 25, 2023
303d877
Adapt descriptive text for influence value distribution, to focus on …
schroedk Oct 1, 2023
de181fd
Fix references and descriptions in notebooks/influence_imagenet
schroedk Oct 1, 2023
b31697c
Add "hide" tag to cells
schroedk Oct 1, 2023
03cc7b3
Add matplotlib settings to remove background from plots
schroedk Oct 1, 2023
8f10eb3
Revert "Add "hide" tag to cells"
schroedk Oct 1, 2023
a5c05c6
Add %%capture magic command to suppress print and logging output for …
schroedk Oct 1, 2023
230cf4e
Increase batch size for training
schroedk Oct 1, 2023
f192e75
Add blank lines and fix indentation to correctly show formulas
schroedk Oct 1, 2023
5cd6280
Remove deprecated "interpretation" of influences
schroedk Oct 1, 2023
ef1bdc8
Add updated output for notebooks/influence_imagenet
schroedk Oct 1, 2023
1d7fbe2
Merge remote-tracking branch 'origin/develop' into fix/435-broken-imp…
schroedk Oct 1, 2023
3d42883
Add a CSS-filter for inverting images in notebook cells, add tags to …
schroedk Oct 4, 2023
fd9b5fa
Hide boilerplate cells in influence notebooks
schroedk Oct 4, 2023
6868fc2
Add warning for omitted code in documentation
schroedk Oct 4, 2023
f691a13
Update output and hide cells in notebooks/influence_synthetic
schroedk Oct 4, 2023
81643f2
Update output of notebooks/influence_wine, add warning for omitted ce…
schroedk Oct 4, 2023
798d85c
Remove broken link from CONTRIBUTING.md
schroedk Oct 4, 2023
5112a43
Modify xticks in plots to render nicely
schroedk Oct 4, 2023
6f075dd
Update output of notebooks/data_oob
schroedk Oct 4, 2023
4a655f8
Capture the output of cells in notebooks/least_core_basic, due to job…
schroedk Oct 4, 2023
ba40203
Update output of notebooks/least_core_basic
schroedk Oct 4, 2023
9840d70
Merge remote-tracking branch 'origin/develop' into fix/435-broken-imp…
schroedk Oct 4, 2023
9429c3e
Remove capture from notebooks/influence_imagenet and use tag to hide …
schroedk Oct 5, 2023
43c2231
Update output of notebooks/influence_imagenet
schroedk Oct 5, 2023
e29d9f7
Remove capture from notebooks/least_core_basic and use tag to hide ou…
schroedk Oct 5, 2023
71bc46d
Update output of notebooks/least_core_basic
schroedk Oct 5, 2023
222 changes: 109 additions & 113 deletions notebooks/influence_imagenet.ipynb

Large diffs are not rendered by default.

130 changes: 77 additions & 53 deletions notebooks/influence_synthetic.ipynb

Large diffs are not rendered by default.

136 changes: 100 additions & 36 deletions notebooks/influence_wine.ipynb

Large diffs are not rendered by default.

94 changes: 55 additions & 39 deletions notebooks/shapley_basic_spotify.ipynb

Large diffs are not rendered by default.

40 changes: 16 additions & 24 deletions notebooks/shapley_knn_flowers.ipynb

Large diffs are not rendered by default.

10 changes: 6 additions & 4 deletions src/pydvl/influence/general.py
@@ -104,8 +104,9 @@ def test_grads() -> Generator[TensorType, None, None]:
     ) # type:ignore

     try:
-        # if provided input_data implements __len__, pre-allocate the result tensor to reduce memory consumption
-        resulting_shape = (len(test_data), model.num_params) # type:ignore
+        # in case input_data is a torch DataLoader created from a Dataset,
+        # we can pre-allocate the result tensor to reduce memory consumption
+        resulting_shape = (len(test_data.dataset), model.num_params) # type:ignore
         rhs = cat_gen(
             test_grads(), resulting_shape, model # type:ignore
         ) # type:ignore
@@ -174,8 +175,9 @@ def train_grads() -> Generator[TensorType, None, None]:
     ) # type:ignore

     try:
-        # if provided input_data implements __len__, pre-allocate the result tensor to reduce memory consumption
-        resulting_shape = (len(input_data), model.num_params) # type:ignore
+        # in case input_data is a torch DataLoader created from a Dataset,
+        # we can pre-allocate the result tensor to reduce memory consumption
+        resulting_shape = (len(input_data.dataset), model.num_params) # type:ignore
         train_grad_tensor = cat_gen(
             train_grads(), resulting_shape, model # type:ignore
         ) # type:ignore
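The two hunks above fix the pre-allocation bug: for a torch `DataLoader`, `len(loader)` counts *batches*, while the result tensor's first dimension must equal the number of *samples*, i.e. `len(loader.dataset)`. A dependency-free sketch of the pitfall, using a hypothetical `MiniLoader` class standing in for `DataLoader` (not pydvl or torch API):

```python
import math

class MiniLoader:
    """Minimal stand-in for torch.utils.data.DataLoader: iterates a
    dataset in fixed-size batches; len() counts batches, not samples."""

    def __init__(self, dataset, batch_size):
        self.dataset = dataset
        self.batch_size = batch_size

    def __len__(self):
        return math.ceil(len(self.dataset) / self.batch_size)

    def __iter__(self):
        for i in range(0, len(self.dataset), self.batch_size):
            yield self.dataset[i:i + self.batch_size]

data = list(range(10))               # 10 samples
loader = MiniLoader(data, batch_size=4)

print(len(loader))                   # 3 -> batches; too small for pre-allocation
print(len(loader.dataset))           # 10 -> samples; the correct first dimension
```

Pre-allocating with `len(loader)` rows would leave room for only 3 of the 10 gradient rows, which is exactly the tensor pre-allocation bug mentioned in the commit list.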
10 changes: 5 additions & 5 deletions src/pydvl/influence/torch/functional.py
@@ -91,7 +91,7 @@ def batch_hvp_gen(

     for inputs, targets in iter(data_loader):
         batch_loss = batch_loss_function(model, loss, inputs, targets)
-        model_params = dict(model.named_parameters())
+        model_params = {k: p for k, p in model.named_parameters() if p.requires_grad}

         def batch_hvp(vec: torch.Tensor):
             return flatten_tensors_to_vector(
@@ -166,9 +166,7 @@ def batch_loss_function(
     """

     def batch_loss(params: Dict[str, torch.Tensor]):
-        outputs = functional_call(
-            model, params, (to_model_device(x, model),), strict=True
-        )
+        outputs = functional_call(model, params, (to_model_device(x, model),))
         return loss(outputs, y)

     return batch_loss
@@ -209,7 +207,9 @@ def get_hvp_function(
     """

     params = {
-        k: p if track_gradients else p.detach() for k, p in model.named_parameters()
+        k: p if track_gradients else p.detach()
+        for k, p in model.named_parameters()
+        if p.requires_grad
     }

     def hvp_function(vec: torch.Tensor) -> torch.Tensor:
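Both changes in `functional.py` restrict the parameter dict to trainable parameters, so models with frozen layers (`requires_grad=False`) no longer feed non-differentiable parameters into the Hessian-vector product. The filtering pattern itself, sketched with a hypothetical `Param` stand-in rather than real torch parameters:

```python
from dataclasses import dataclass

@dataclass
class Param:
    """Stand-in for a torch parameter carrying a requires_grad flag."""
    requires_grad: bool

# hypothetical model: frozen backbone, trainable head
named_parameters = {
    "backbone.weight": Param(requires_grad=False),
    "head.weight": Param(requires_grad=True),
    "head.bias": Param(requires_grad=True),
}

# before the fix: dict(model.named_parameters()) kept every entry,
# including frozen ones
all_params = dict(named_parameters)

# after the fix: only trainable parameters take part in the HVP
trainable = {k: p for k, p in named_parameters.items() if p.requires_grad}

print(sorted(trainable))   # ['head.bias', 'head.weight']
```

The frozen `backbone.weight` is excluded, matching the commit "Fix issue, due to incorrect handling of partial models".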
1 change: 0 additions & 1 deletion src/pydvl/influence/torch/torch_differentiable.py
@@ -149,7 +149,6 @@ def model_func(param):
                 param,
             ),
             (x.to(self.device),),
-            strict=True,
         )
         return self.loss(outputs, y.to(self.device))

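Dropping `strict=True` matters because `torch.func.functional_call` in strict mode raises when the supplied dict does not cover every parameter and buffer of the module, which is exactly the situation with partial models whose frozen parameters were filtered out above. A dependency-free analogue of that behaviour; the `call_with_params` helper is hypothetical, not pydvl or torch API:

```python
def call_with_params(model_params, overrides, strict=False):
    """Analogue of functional_call's parameter handling: substitute
    `overrides` into the model's parameters; strict mode demands that
    every parameter be covered."""
    if strict:
        missing = set(model_params) - set(overrides)
        if missing:
            raise RuntimeError(f"Missing key(s): {sorted(missing)}")
    return {**model_params, **overrides}

model_params = {"backbone.weight": 0.5, "head.weight": 1.0}
partial = {"head.weight": 2.0}            # only the trainable subset

merged = call_with_params(model_params, partial)   # non-strict: works
try:
    call_with_params(model_params, partial, strict=True)
    error = ""
except RuntimeError as e:
    error = str(e)                        # strict mode rejects partial dicts

print(merged["head.weight"])              # prints 2.0
```

With strict mode removed, the override dict may name only the trainable subset and untouched parameters keep their existing values, which is what the partial-model fix relies on.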