Hello, I've run your code on my own dataset. However, the d_loss decreases from 10 (which equals lambda) to a very large negative number (around -10000), the Wasserstein distance keeps growing to the order of millions, and the gradient penalty drops from 10 to 0 and then climbs to the order of thousands. I've worked on this problem for several days but still can't solve it. Can anyone help me with this? @caogang
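For reference, this is the standard WGAN-GP penalty I believe the code is computing; a minimal PyTorch sketch with my own variable names (`netD`, `gradient_penalty`), not the repo's exact implementation:

```python
import torch
import torch.autograd as autograd

LAMBDA = 10  # gradient penalty coefficient (the "10" my d_loss starts from)

def gradient_penalty(netD, real_data, fake_data):
    # Random interpolation between real and fake samples, one alpha per sample
    alpha_shape = [real_data.size(0)] + [1] * (real_data.dim() - 1)
    alpha = torch.rand(alpha_shape, device=real_data.device)
    interpolates = (alpha * real_data + (1 - alpha) * fake_data).requires_grad_(True)

    disc_interpolates = netD(interpolates)

    # Gradient of the critic output w.r.t. the interpolated inputs
    gradients = autograd.grad(
        outputs=disc_interpolates,
        inputs=interpolates,
        grad_outputs=torch.ones_like(disc_interpolates),
        create_graph=True,
        retain_graph=True,
    )[0]

    # Penalize deviation of the per-sample gradient norm from 1
    grad_norm = gradients.view(gradients.size(0), -1).norm(2, dim=1)
    return LAMBDA * ((grad_norm - 1) ** 2).mean()

# d_loss = netD(fake).mean() - netD(real).mean() + gradient_penalty(netD, real, fake)
# wasserstein_distance = netD(real).mean() - netD(fake).mean()
```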