[Attention block] relative positional embedding #7346

Merged
13 commits merged into Project-MONAI:gen-ai-dev on Jan 18, 2024

Conversation

@vgrau98 (Contributor) commented Dec 29, 2023

Fixes #7356

Description

Add relative positional embedding to the attention block, as described in https://arxiv.org/pdf/2112.01526.pdf
Largely inspired by https://github.com/facebookresearch/segment-anything/blob/main/segment_anything/modeling/image_encoder.py

Can be useful for #6357
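
For reference, a minimal PyTorch sketch of the decomposed 2D relative positional bias used in those references (equal query/key sizes assumed; names here loosely follow the SAM code and are illustrative, not the merged MONAI API):

```python
import torch


def add_decomposed_rel_pos(attn, q, rel_pos_h, rel_pos_w, hw):
    """Add a decomposed 2D relative positional bias to raw attention logits.

    attn: (B, H*W, H*W) attention logits, q: (B, H*W, C) queries,
    rel_pos_h: (2*H - 1, C) and rel_pos_w: (2*W - 1, C) learnable tables.
    """
    h, w = hw
    # Relative coordinates (i - j), shifted to be valid indices into the tables.
    rel_h_idx = torch.arange(h)[:, None] - torch.arange(h)[None, :] + (h - 1)
    rel_w_idx = torch.arange(w)[:, None] - torch.arange(w)[None, :] + (w - 1)
    Rh = rel_pos_h[rel_h_idx]  # (H, H, C)
    Rw = rel_pos_w[rel_w_idx]  # (W, W, C)

    b, _, c = q.shape
    r_q = q.reshape(b, h, w, c)
    # Bias terms along each spatial axis, contracted against the queries.
    rel_h = torch.einsum("bhwc,hkc->bhwk", r_q, Rh)
    rel_w = torch.einsum("bhwc,wkc->bhwk", r_q, Rw)

    attn = attn.view(b, h, w, h, w) + rel_h[:, :, :, :, None] + rel_w[:, :, :, None, :]
    return attn.view(b, h * w, h * w)


# Usage inside a self-attention forward pass (before the softmax):
# attn = (q * scale) @ k.transpose(-2, -1)
# attn = add_decomposed_rel_pos(attn, q, rel_pos_h, rel_pos_w, (h, w))
# attn = attn.softmax(dim=-1)
```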

Types of changes

  • Non-breaking change (fix or new feature that would not break existing functionality).
  • Breaking change (fix or new feature that would cause existing functionality to change).
  • New tests added to cover the changes.
  • Integration tests passed locally by running ./runtests.sh -f -u --net --coverage.
  • Quick tests passed locally by running ./runtests.sh --quick --unittests --disttests.
  • In-line docstrings updated.
  • Documentation updated, tested make html command in the docs/ folder.

@vgrau98 force-pushed the attention-rel-pos-embedd branch 4 times, most recently from 67ccdbe to a46b374, on December 29, 2023 at 22:54
@vgrau98 marked this pull request as ready for review on December 29, 2023 at 23:42
@vgrau98 force-pushed the attention-rel-pos-embedd branch 4 times, most recently from c1d9754 to 57fc23b, on January 1, 2024 at 22:14
@marksgraham (Contributor) commented:

I wonder if this is the point at which we should start splitting the different types of attention block out into their own blocks? Thoughts @KumoLiu @ericspod?

Note that, as it stands, this will also have some merge conflicts with the local window attention PR.

@ericspod (Member) commented Jan 6, 2024

@marksgraham @KumoLiu I feel that, with the generative merging in progress, we should think about refactoring how these blocks work in general, to try to separate these differing features into separate classes. We could perhaps have a single attention block class that is customisable with features; if not, we can use multiple classes, but doing this cleanly will probably involve refactoring the existing code that has been merged from GenerativeModels. This PR adds something good, however, so we should keep it open but perhaps postpone it until the other pieces are merged.

@vgrau98 force-pushed the attention-rel-pos-embedd branch from 4b2c852 to fa72b81 on January 7, 2024 at 22:42
@vgrau98 (Contributor, author) commented Jan 8, 2024

Following @ericspod's suggestions, a factory pattern is now used to implement relative positional embedding in the self-attention block (LayerFactory for now, but I'm open to any suggestion for organising the code better). Anyone can extend the RelPosEmbedding feature or add new transformer/attention features.
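
As a rough illustration of that organisation (a hypothetical minimal registry standing in for MONAI's LayerFactory; class and argument names are placeholders, not the merged API):

```python
import torch
import torch.nn as nn

# Hypothetical registry: the attention block only knows a feature name,
# and the factory supplies the implementation class.
REL_POS_EMBEDDINGS = {}


def register_rel_pos_embedding(name):
    def decorator(cls):
        REL_POS_EMBEDDINGS[name] = cls
        return cls
    return decorator


@register_rel_pos_embedding("decomposed")
class DecomposedRelPosEmbedding(nn.Module):
    """Learnable decomposed relative positional bias for a 2D input."""

    def __init__(self, input_size, head_dim):
        super().__init__()
        h, w = input_size
        self.rel_pos_h = nn.Parameter(torch.zeros(2 * h - 1, head_dim))
        self.rel_pos_w = nn.Parameter(torch.zeros(2 * w - 1, head_dim))

    def forward(self, attn, q, hw):
        ...  # add the bias to attn, e.g. as in the sketch in the description


class SABlock(nn.Module):
    """Self-attention block that looks its optional feature up by name."""

    def __init__(self, hidden_size, num_heads, rel_pos_embedding=None, input_size=None):
        super().__init__()
        head_dim = hidden_size // num_heads
        self.rel_positional_embedding = (
            REL_POS_EMBEDDINGS[rel_pos_embedding](input_size, head_dim)
            if rel_pos_embedding is not None
            else None
        )
```

New embedding variants can then be added by registering another class, without touching the attention block itself.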

@KumoLiu (Contributor) left a review comment:

Thanks for the quick update; it looks cleaner now. I only have a few concerns inline.

monai/networks/blocks/rel_pos_embedding.py (review comments resolved)
monai/networks/blocks/selfattention.py (review comments resolved)
@ericspod (Member) commented:
I think this is looking much better. Would it be possible to merge into gen-ai-dev instead of dev so we can continue the generative refactoring? @KumoLiu @marksgraham any other feedback?

@vgrau98 changed the base branch from dev to gen-ai-dev on January 14, 2024 at 18:37
@vgrau98 (Contributor, author) commented Jan 14, 2024

> I think this is looking much better. Would it be possible to merge into gen-ai-dev instead of dev so we can continue the generative refactoring? @KumoLiu @marksgraham any other feedback?

Target branch updated to gen-ai-dev.

@KumoLiu (Contributor) commented Jan 15, 2024

Hi @vgrau98, could you please try `git rebase --onto <newparent> <oldparent> <until>` to rebase your commits onto gen-ai-dev? That should avoid pulling in the commits added from the dev branch, which are causing the DCO error.
https://womanonrails.com/git-rebase-onto
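For this branch, that would be something like `git rebase --onto upstream/gen-ai-dev upstream/dev attention-rel-pos-embedd` (assuming the Project-MONAI repository is configured as the `upstream` remote).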

vgrau98 and others added 9 commits January 15, 2024 20:42
@vgrau98 force-pushed the attention-rel-pos-embedd branch from d75b341 to 6fb51f5 on January 15, 2024 at 19:53
@vgrau98 (Contributor, author) commented Jan 15, 2024

> Hi @vgrau98, could you please try `git rebase --onto <newparent> <oldparent> <until>` to rebase your commits onto gen-ai-dev? That should avoid pulling in the commits added from the dev branch, which are causing the DCO error. https://womanonrails.com/git-rebase-onto

Should be OK now.

@KumoLiu (Contributor) commented Jan 16, 2024

/build

@KumoLiu (Contributor) commented Jan 16, 2024

Hi @vgrau98, could you please cherry-pick the commit here once it is merged? It fixes the CI error that is preventing this PR from merging.
Thanks.

@vgrau98 (Contributor, author) commented Jan 16, 2024

> Hi @vgrau98, could you please cherry-pick the commit here once it is merged? It fixes the CI error that is preventing this PR from merging. Thanks.

Is this still relevant? According to your comments on PR #7391, the bug seems to have been resolved.

@KumoLiu (Contributor) commented Jan 17, 2024

Yes, I'm creating another PR to fix the error caused by the upgrade of flake8-bugbear:
https://github.com/Project-MONAI/MONAI/actions/runs/7544331654/job/20537411217#step:2:2601

You can either cherry-pick that one or make the same change in this PR.

@KumoLiu (Contributor) commented Jan 18, 2024

/build

@KumoLiu enabled auto-merge (squash) on January 18, 2024 at 15:25
@KumoLiu merged commit 41fb3ff into Project-MONAI:gen-ai-dev on Jan 18, 2024
28 checks passed
@vgrau98 deleted the attention-rel-pos-embedd branch on January 24, 2024 at 22:24
marksgraham pushed a commit to marksgraham/MONAI that referenced this pull request Jan 30, 2024
Linked issue: Self attention with relative positional embedding

4 participants