Hello, sorry for disturbing you.
I am currently testing a machine learning compiler that only accepts sequential models. I would like to know if there is a straightforward way to configure NNSmith so that it exclusively generates sequential models. (By "sequential model" I mean a model that can be implemented using a single torch.nn.Sequential().)
Looking forward to your response!
Thanks a lot!
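For concreteness, here is a minimal sketch of the kind of model I mean; the specific layers and sizes are arbitrary and just for illustration:

```python
import torch
import torch.nn as nn

# A model the compiler accepts: a single linear chain of layers,
# expressible as exactly one nn.Sequential (layer choices are arbitrary).
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 10),
)

x = torch.randn(1, 3, 8, 8)  # single input tensor
y = model(x)                 # single output tensor, no branching in between
```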
A similar feature request was proposed in #117 and implemented in #119, which aims to generate models with only one input and one output.
But that is slightly different from your goal of generating a sequential model, since such a model can still have parallel intermediate nodes.
That said, any model can be considered sequential or non-sequential depending on how a "layer" is defined. If you regard compound operators as a whole, then any group of operators can be treated as one compound operator, which makes every model sequential. But if you look at the primitive operators (such as add), then so-called sequential models (e.g., those built from transformer blocks) are actually non-sequential. In other words, I think this problem first needs to be settled by definition. :D
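To illustrate the granularity point, here is a toy sketch (the block itself is made up for illustration): at the block level the model below is a plain nn.Sequential, but at the primitive-op level the residual add introduces parallel paths.

```python
import torch
import torch.nn as nn

# A toy residual block: viewed as one "compound operator" it is a single layer,
# but at the level of primitive ops the residual add makes the graph non-sequential.
class ResidualBlock(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.fc = nn.Linear(dim, dim)

    def forward(self, x):
        return x + self.fc(x)  # two parallel paths (identity and fc) merged by `add`

# At block granularity this is a perfectly sequential model...
model = nn.Sequential(ResidualBlock(8), ResidualBlock(8))

# ...yet the underlying op graph branches, so whether it counts as "sequential"
# depends on which operator granularity you commit to.
y = model(torch.randn(2, 8))
```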
Thank you for your quick response!
The concept of the "compound operator" you mentioned is indeed relevant to the compiler I am going to test. I am currently confirming the extent to which the compiler can express compound operators.