Great Work on Mirage!
I just wanted to take a moment to commend you on the fantastic work you've done with Mirage. It's truly impressive and showcases a lot of hard work and innovation, especially in discovering optimizations similar to those in FlashAttention.
I'm curious how this work compares to previous superoptimizer efforts in the field, such as TASO, PET, and EinNet. In particular, I'd like to know how their performance compares on some common neural networks (or several representative layers).
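To make the request concrete, the kind of per-layer micro-benchmark I have in mind is sketched below, using plain PyTorch as a stand-in for the kernels each system generates (the tensor shapes, the `bench` helper, and the naive-attention baseline are all my own illustrative choices, not anything from Mirage or the paper):

```python
import torch

def bench(fn, *args, warmup=10, iters=100):
    # Average CUDA kernel time in milliseconds, measured with CUDA events.
    for _ in range(warmup):
        fn(*args)
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    torch.cuda.synchronize()
    start.record()
    for _ in range(iters):
        fn(*args)
    end.record()
    torch.cuda.synchronize()
    return start.elapsed_time(end) / iters

def naive_attention(q, k, v):
    # Unfused attention: materializes the full score matrix in memory.
    scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
    return torch.softmax(scores, dim=-1) @ v

# One attention layer: (batch, heads, seq_len, head_dim).
shape = (8, 16, 1024, 64)
q, k, v = (torch.randn(shape, device="cuda", dtype=torch.float16) for _ in range(3))

print("naive :", bench(naive_attention, q, k, v), "ms")
print("fused :", bench(torch.nn.functional.scaled_dot_product_attention, q, k, v), "ms")
```

Even a small table of numbers like this, with Mirage's discovered kernels alongside what TASO/PET/EinNet produce for one or two common layers, would fully answer my question.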
Although the paper states that "We do not compare Mirage with existing superoptimizers," demonstrating Mirage's performance advantages over them would make this project even more compelling; alternatively, a more detailed explanation would help readers understand why such a comparison was not included.
Thank you for your time, and I look forward to your insights!