More accurate Winograd/Cook/Toom F(4x4, 3x3) transforms #224
Here are some more accurate transform matrices for the Winograd F(4x4, 3x3) kernel. They are about 4X as accurate as the original transforms, and within a factor of 2X of the accuracy of direct convolution.

AT (the inverse transform) requires a couple more flops to compute than the original transform, because some of the values of +/-1 were replaced with rational numbers. I think the other transforms require the same number of flops.
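As a sketch of how any F(4x4, 3x3) transform set is applied, and how its accuracy can be checked against direct convolution, here is a minimal NumPy example. It uses the original transforms from the Lavin & Gray paper as a stand-in; the more accurate matrices proposed here (not restated in this copy) would be substituted into AT, G, and BT the same way.

```python
# Sketch of 2D Winograd F(4x4, 3x3): Y = AT @ [(G g G^T) * (BT d B)] @ A.
# The matrices below are the ORIGINAL F(4, 3) transforms from the
# Lavin & Gray paper; drop in alternative transforms the same way.
import numpy as np

BT = np.array([
    [4,  0, -5,  0, 1, 0],
    [0, -4, -4,  1, 1, 0],
    [0,  4, -4, -1, 1, 0],
    [0, -2, -1,  2, 1, 0],
    [0,  2, -1, -2, 1, 0],
    [0,  4,  0, -5, 0, 1],
], dtype=np.float32)

G = np.array([
    [ 1/4,      0,    0],
    [-1/6,   -1/6, -1/6],
    [-1/6,    1/6, -1/6],
    [1/24,   1/12,  1/6],
    [1/24,  -1/12,  1/6],
    [   0,      0,    1],
], dtype=np.float32)

AT = np.array([
    [1, 1,  1, 1,  1, 0],
    [0, 1, -1, 2, -2, 0],
    [0, 1,  1, 4,  4, 0],
    [0, 1, -1, 8, -8, 1],
], dtype=np.float32)

def winograd_f4x4_3x3(d, g):
    """One 6x6 input tile d and one 3x3 filter g -> one 4x4 output tile."""
    U = G @ g @ G.T            # filter transform (6x6)
    V = BT @ d @ BT.T          # data transform (6x6)
    return AT @ (U * V) @ AT.T # elementwise product, then inverse transform

def direct_4x4(d, g):
    """Reference: direct 'valid' correlation of a 6x6 tile with a 3x3 filter."""
    y = np.zeros((4, 4), dtype=np.float64)
    for i in range(4):
        for j in range(4):
            y[i, j] = np.sum(d[i:i+3, j:j+3].astype(np.float64) *
                             g.astype(np.float64))
    return y

rng = np.random.default_rng(0)
d = rng.standard_normal((6, 6)).astype(np.float32)
g = rng.standard_normal((3, 3)).astype(np.float32)
print("max |error|:", np.abs(winograd_f4x4_3x3(d, g) - direct_4x4(d, g)).max())
```

Comparing the maximum error of this routine against the float64 direct convolution, once with the original matrices and once with the proposed ones, is one way to reproduce the accuracy comparison above.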
Comments

This transform appears to be the same or slightly more accurate:
I had posted a variation of these matrices earlier that scaled some of the columns of AT and rows of G, but that seemed to hurt accuracy slightly.
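As an aside on why such a scaling is even legal: in exact arithmetic it is a no-op, and only the placement of rounding error changes. A sketch of the identity for the 1D case, with $S$ any invertible diagonal matrix:

$$
A^T\big[(Gg)\odot(B^Td)\big] = (A^TS)\Big[\big((S^{-1}G)g\big)\odot(B^Td)\Big]
$$

since $(S^{-1}u)\odot v = S^{-1}(u\odot v)$ for diagonal $S$. Scaling column $i$ of $A^T$ by $s_i$ and row $i$ of $G$ by $1/s_i$ therefore leaves the exact result unchanged while redistributing the floating-point error.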
This one is done and is pending merge for release.
Could anyone advise how to obtain this improved version of Winograd convolution from the original version, which is described by equation (7) in the paper?
Hi @ProMach, I generated the above transforms using an early version of the winCNN software, now available here: https://github.com/andravin/wincnn

Matrix AT is a Vandermonde matrix, and the "polynomial roots" used by winCNN are just the numbers in the second row, leaving off the last column.

Since these matrices were released, others have researched possibly better transforms. Please refer to apache/tvm#3553
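A minimal sketch of that workflow, assuming wincnn's showCookToomFilter(a, n, r) entry point (a = the tuple of polynomial points, n = output size, r = filter size). The points (0, 1, -1, 2, -2) shown here are the ones behind the original paper's F(4, 3), used only as an illustration:

```python
# Sketch: generating F(4, 3) transforms with wincnn
# (https://github.com/andravin/wincnn). Assumes the
# showCookToomFilter(a, n, r) interface, where a is the tuple of
# n + r - 2 polynomial interpolation points, n the output size,
# and r the filter size.
import wincnn

# Points used for the original F(4, 3); choosing different points
# (e.g. rationals) is what produces more accurate variants.
points = (0, 1, -1, 2, -2)
wincnn.showCookToomFilter(points, 4, 3)  # prints AT, G, and BT
```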
I do not understand why the proposed Toom-Cook Algorithm 1 in the paper "Error Analysis and Improving the Accuracy of Winograd Convolution for Deep Neural Networks" does not need a polynomial interpolation stage.
@andravin What exactly do you mean by 'polynomial roots', and why are they located in the second row?