
Added nnlib layers and test cases, added upsampling layers #38

Merged Sep 8, 2020 (2 commits)

Conversation

@bjosv79 commented Aug 25, 2020

Added nnlib layers and layer tests (not exhaustive)
Fixed issues revealed by nnlib tests
Added upsampling layers

@DhairyaLGandhi (Member) left a comment

Thanks for looking into this!

```julia
    Tensor{T,N}(ptr[], on(t))
end

function _maxpool(t::Tensor{T,N}, kernel_size; stride = [1], padding = [0], dilation = [1]) where {T,N}
```
@DhairyaLGandhi (Member) commented:

Why do we need this method?

@bjosv79 (Author) commented Aug 26, 2020

I did not originally intend to make a PR, so I changed a few things that were not really needed. Sorry for that.

In my mind it is cleaner to have the NNlib ConvDims and PoolDims only in nnlib.jl and not in ops.jl. It also makes sense to align the arguments of _meanpool and _maxpool. My own preference would be to remove the _maxpool method that takes a PoolDims.
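The split described above can be sketched in plain Julia. This is an illustrative stand-in, not the actual Torch.jl code: `FakePoolDims`, `_maxpool_kw`, and `_maxpool_pd` are hypothetical names, with `FakePoolDims` standing in for `NNlib.PoolDims`. The point is that the PoolDims-to-keyword translation can live in one file (nnlib.jl), leaving the low-level op with only the keyword signature shared with `_meanpool`:

```julia
# Stand-in for NNlib.PoolDims; illustrative only.
struct FakePoolDims
    kernel::NTuple{2,Int}
    stride::NTuple{2,Int}
    padding::NTuple{2,Int}
end

# Keyword-style method, mirroring the _meanpool signature:
_maxpool_kw(kernel_size; stride = [1], padding = [0]) =
    (kernel_size, stride, padding)

# PoolDims-style method; under the proposal this translation would live
# only in nnlib.jl, keeping ops.jl free of NNlib types:
_maxpool_pd(pd::FakePoolDims) =
    _maxpool_kw(collect(pd.kernel);
                stride = collect(pd.stride),
                padding = collect(pd.padding))

pd = FakePoolDims((2, 2), (2, 2), (0, 0))
_maxpool_pd(pd)  # both entry points funnel into the keyword method
```

Either entry point ends up in the same place; removing the PoolDims method would simply delete the thin adapter.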

@DhairyaLGandhi (Member) commented:

Best to keep the consistency with the method signatures and API for now.

@bjosv79 (Author) commented:

I was not aware of issue #21. Is this PR still interesting? I can clean it up to include only the tests and the changes needed to pass them; in other words, remove the added layers and leave the method signatures as they are. Reversing strides, pads, dilations, etc. to match the spatial sizes in ops.jl is, however, still needed.

I guess this will be resolved by #21 at some point.
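The dim reversal mentioned above can be sketched as follows. This is a hedged illustration, not the PR's code: the assumption (consistent with the comment, but not spelled out in it) is that per-dimension arguments arrive in NNlib's spatial ordering and must be reversed before being handed to the torch side, whose convention orders the spatial dims the other way. `reverse_dims` is a hypothetical helper name:

```julia
# Hypothetical helper: flip per-dimension arguments (stride, padding,
# dilation, ...) so their order matches the spatial-size convention
# expected on the other side of the NNlib/torch boundary.
reverse_dims(v::AbstractVector) = reverse(v)

stride  = [1, 2]   # per-dimension values in NNlib's ordering
padding = [0, 1]

torch_stride  = reverse_dims(stride)    # [2, 1]
torch_padding = reverse_dims(padding)   # [1, 0]
```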

@DhairyaLGandhi (Member) commented:

The PR is definitely interesting. #21 should be fixed in the latest release too.

Keeping the signatures consistent is important however. The layers themselves should be fine. Reversing the dims is right too, I think.

@DhairyaLGandhi (Member) commented:
bors try

bors bot added a commit that referenced this pull request Sep 3, 2020

bors bot commented Sep 3, 2020

try

Build failed:

@DhairyaLGandhi

bors try

bors bot added a commit that referenced this pull request Sep 3, 2020

bors bot commented Sep 3, 2020

try

Build succeeded:

@DhairyaLGandhi (Member) commented:

I think we can do some of the changes in future PRs, thanks @bjosv79 !

bors r+

@bjosv79 (Author) commented Sep 8, 2020

Regarding method signatures: I assumed that any kwargs sent to NNlib for pooling and conv eventually result in an NNlib call using ConvDims/PoolDims. If that assumption is correct, it seems easier to let NNlib process the kwargs and only provide methods without kwargs.
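The assumption in the comment above can be illustrated with a minimal funnel pattern. This is a sketch, not NNlib's actual implementation: `SimpleDims`, `pool_frontend`, and `pool_backend` are hypothetical names, with `SimpleDims` standing in for `NNlib.PoolDims`. If keyword arguments are only ever consumed to build a dims object, a backend wrapper would only need to implement the dims-based method:

```julia
# Stand-in for NNlib.PoolDims; illustrative only.
struct SimpleDims
    kernel::Vector{Int}
    stride::Vector{Int}
end

# Frontend processes the kwargs exactly once, building the dims object
# (as NNlib does when constructing a PoolDims from keyword arguments):
pool_frontend(x, kernel; stride = ones(Int, length(kernel))) =
    pool_backend(x, SimpleDims(collect(kernel), collect(stride)))

# Backend: the only method a wrapper such as Torch.jl would then need
# to provide, with no kwargs of its own.
pool_backend(x, d::SimpleDims) = (x, d.kernel, d.stride)
```

Under this assumption, every kwarg-bearing call path collapses into the single kwarg-free backend method.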

@bors (bot) commented Sep 8, 2020

Build succeeded:

bors bot merged commit 4ac1ae9 into FluxML:master Sep 8, 2020
@DhairyaLGandhi (Member) commented:

Following the method signatures defined by upstream packages is best, since that gives a clear future deprecation path and makes it easier for dependent packages to maintain backwards compatibility.
