This repository has been archived by the owner on Oct 15, 2019. It is now read-only.
Hi, just a suggestion from a passer-by: I am interested in understanding the advantages of this library over raw mxnet and would love to see more discussion of it. Auto-differentiation definitely seems like one feature, but other than that it seems like MinPy has the same expressiveness as MXNet, with only-slightly-worse performance. It'd be great to hear more about how you folks are intending to provide the performance of a symbolic API on top of the usability of an imperative API; in my very vague understanding, MinPy sounds like more of a middle-ground than a solution that encompasses the benefits of both sides. That's not to say that that can't be an interesting place to be, but it seems like not quite what is advertised. I am definitely interested in being corrected if my understanding isn't right though!
Also, I would recommend reworking the "data-dependent branches" section from your readme -- it's pretty misleading. The two programs you're comparing have different semantics: the TensorFlow program is (I think) not representable in MinPy, and the MinPy program can be just as easily represented in TensorFlow (ie without if_ and the lambdas).
Nice comments. We will think about how to fix the intro.
On the API front, the key thing we wanted to do is go "back to NumPy", by subtracting interfaces rather than adding them. With that settled, our next step is to push performance closer to, or even beyond, that of today's systems. The technical direction is similar to a very early system we built, called Minerva: https://stanford.edu/~rezab/nips2014workshop/submits/minerva.pdf
Regarding expressiveness, MXNet is not Turing-complete because it lacks a loop construct. That is not an inherent limitation of symbolic programming, since one could add an interface like TensorFlow's while to achieve it; in our view, though, that amounts to reinventing a new language. MinPy takes a different approach: it supports Python's native if and loops, so a MinPy program is just a Python program. Whatever Python (which is obviously Turing-complete) can express, MinPy can express. So in terms of expressiveness, MinPy == TF > MXNet.
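To make the point concrete, here is a minimal sketch of what "just a Python program" buys you: a computation whose loop count depends on the data itself. Plain NumPy stands in for minpy.numpy here, and the function name is made up for illustration; the point is that the control flow is ordinary Python, whereas a static symbolic graph would need a dedicated While operator to express the same thing.

```python
import numpy as np

def amplify_until(x, threshold):
    """Double x until its norm exceeds threshold.

    The number of iterations depends on the *values* in x, so it cannot
    be fixed at graph-construction time; in an imperative library this
    is just a native Python while loop.
    """
    steps = 0
    while np.linalg.norm(x) < threshold:  # data-dependent loop condition
        x = x * 2.0
        steps += 1
    return x, steps

x0 = np.array([0.5, 0.5])
result, steps = amplify_until(x0, threshold=10.0)
# norm doubles each step: ~0.71 -> 1.41 -> 2.83 -> 5.66 -> 11.31, so 4 steps
```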
For the "data-dependent branch" example, the two programs do in fact have the same semantics: each chooses a branch based on whether x is greater than y (please assume x and y are scalars here). The only difference is the syntax. Again, as @zzhang-cn said, we believe subtraction (having as few new APIs as possible) is better than adding yet another interface for users to learn.
Perhaps I am not familiar enough with MXNet; why can't you use Python if statements while constructing your MXNet graph?
For the "data-dependent branch" examples, the MinPy program evaluates the conditional at graph-construction time, whereas the TF example emits it into the graph. Or if the conditional is non-symbolic, then in TF you could have used a python 'if' statement just like the MinPy program.
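The distinction being drawn can be sketched without TensorFlow at all, using plain Python closures to stand in for a symbolic graph (all names here are hypothetical, and "building a graph" just means composing a function to run later). A native if evaluated during construction bakes one branch into the graph; a conditional embedded in the graph, like tf.cond's, is re-evaluated each time the graph runs.

```python
def build_construction_time(x, y):
    # The Python `if` runs NOW, while the "graph" is being built:
    # only one branch ever makes it into the returned function.
    if x > y:
        return lambda: x + 1.0
    else:
        return lambda: y + 1.0

def build_deferred(x_fn, y_fn):
    # The conditional is part of the graph itself (as with tf.cond):
    # which branch runs is decided at execution time from fresh inputs.
    return lambda: (x_fn() + 1.0) if x_fn() > y_fn() else (y_fn() + 1.0)

state = {"x": 3.0, "y": 1.0}

# Branch chosen from the values seen at build time (x > y, so the x-branch).
graph = build_construction_time(state["x"], state["y"])
state["x"], state["y"] = 0.0, 5.0    # inputs change before execution
baked = graph()                       # still the x-branch: 3.0 + 1.0 = 4.0

# Branch chosen at run time, so it tracks the updated inputs.
deferred = build_deferred(lambda: state["x"], lambda: state["y"])
fresh = deferred()                    # y-branch now: 5.0 + 1.0 = 6.0
```

If the conditional's inputs are ordinary Python values fixed at construction time, the two styles coincide, which is the non-symbolic case described above.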