
Sequence #60

Closed
pereverges opened this issue May 23, 2022 · 6 comments · Fixed by #61
Labels
enhancement New feature or request

Comments

@pereverges
Collaborator

pop, popleft, replace: avoid passing the tensor to pop or replace.

@pereverges pereverges added the enhancement New feature or request label May 23, 2022
@pereverges
Collaborator Author

The replace and from_tensor functions do not work properly.

@mikeheddes
Member

For all the methods we opted for a design where the user has to provide the item that will be replaced, so that they can first recover the exact item from the approximate one using whatever kind of cleanup memory they prefer. Otherwise we would either have to keep a reference to an item memory in each data structure, or we would have to remove the approximate version, which would introduce a lot of noise into the value hypervector. What do you think would improve this design? What do you have in mind?

What is the problem with the replace and from_tensor functions?
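To make the design concrete: the caller first recovers the exact stored item from a noisy query using a cleanup memory, then passes that exact item to replace/pop. Below is a minimal, self-contained sketch of that cleanup step in plain Python. This is not torchhd's actual API; `cleanup`, `item_memory`, and the toy vectors are illustrative stand-ins for hypervectors.

```python
import math

def cosine(a, b):
    # cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def cleanup(query, item_memory):
    # return the stored item most similar to the noisy query
    return max(item_memory, key=lambda item: cosine(query, item))

# toy "hypervectors": the query is a noisy copy of the second item
item_memory = [[1.0, -1.0, 1.0, 1.0], [-1.0, 1.0, 1.0, -1.0]]
query = [-1.0, 1.0, 0.5, -1.0]

exact = cleanup(query, item_memory)
# exact is item_memory[1]; it is now safe to pass to e.g. S.replace(i, exact, new_hv)
```

In torchhd terms, the stored item hypervectors play the role of `item_memory`, and the similarity search would use `torchhd.functional.cosine_similarity` against the whole set at once.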

@pereverges
Collaborator Author

Could we look up the last (or first) element and then do the pop/popleft ourselves? We could also give users the option of keeping track of a memory themselves.

I am not able to make replace actually replace a value. The from_tensor function gives me an error because of the dimension.

@mikeheddes
Member

>>> hv = torchhd.random_hv(10, 10000)
>>> S = torchhd.structures.Sequence.from_tensor(hv)
>>> len(S)
0
>>> S.value
tensor([-2., -2.,  0.,  ..., -2., -2.,  4.])

The from_tensor method doesn't give me a dimension error, but there is a bug: the size of the sequence isn't set correctly. I will fix that.
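For context, the symptom above (`len(S) == 0` after constructing from 10 hypervectors) is consistent with from_tensor building the bundled value but never recording how many items it holds. A toy sketch of the shape of the fix, using a minimal stand-in class (this is not torchhd's actual source; the class and attribute names are hypothetical):

```python
class ToySequence:
    """Minimal stand-in for a bundled-sequence structure (illustrative only)."""

    def __init__(self, value, size=0):
        self.value = value  # the bundled hypervector
        self.size = size    # number of items bundled into it

    def __len__(self):
        return self.size

    @classmethod
    def from_tensor(cls, items):
        # bundle the items by elementwise sum; the bug corresponds to
        # forgetting size=len(items), which leaves len(S) == 0
        value = [sum(col) for col in zip(*items)]
        return cls(value, size=len(items))

items = [[1.0, -1.0], [-1.0, -1.0], [1.0, 1.0]]
S = ToySequence.from_tensor(items)
# len(S) now reports 3 instead of 0
```

The actual fix in torchhd would set the sequence's stored size from the first dimension of the input tensor.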

@mikeheddes
Member

Fixed both issues, will make a PR now.

>>> hv = torchhd.random_hv(10, 10000)
>>> S = torchhd.structures.Sequence.from_tensor(hv)
>>> len(S)
10
>>> S.value
tensor([4., 0., 4.,  ..., 0., 0., 0.])
>>> torchhd.functional.cosine_similarity(S[2], hv)
tensor([ 0.0077, -0.0160,  0.3011, -0.0088, -0.0022,  0.0140,  0.0063,  0.0093,
         0.0094, -0.0055])
>>> S.replace(2, hv[2], hv[5])
>>> torchhd.functional.cosine_similarity(S[2], hv)
tensor([ 0.0011, -0.0180, -0.0102, -0.0068,  0.0024,  0.3229,  0.0038,  0.0244,
         0.0104, -0.0119])

@mikeheddes mikeheddes linked a pull request May 24, 2022 that will close this issue
@mikeheddes
Member

For the general design discussion on passing the previous value in data structures, see #62.
