@yuanqing-wang The goal here was to create an (n x n x (2 * Dx2)) tensor temp such that each
temp[i, j, :] is the concatenation of the vector representations of nodes i and j. The hope was that the neural network
could use information from the representations of both nodes to predict whether there is an edge between them.
# zx, Atilde -> E_tilde
temp1 = zx.repeat(1, n).view(n * n, h)  # row i of zx repeated n times; shape (n * n, h)
temp2 = z.repeat(n, 1)  # the n rows of z tiled n times; shape is also (n * n, h)
temp = torch.cat(
    (temp1, temp2), 1
)  # shape (n * n, 2 * h); viewed as (n, n, 2 * h), this is what we ultimately want
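The repeat/view/cat pattern above can be checked with a plain-Python sketch (no PyTorch; `n`, `d`, and `emb` are toy stand-ins for the issue's `n`, `h`, and `zx`): entry [i][j] of the result should be the concatenation of embedding i and embedding j.

```python
# Plain-Python sketch of the pairwise-concatenation construction:
# given n node embeddings of width d, build an (n, n, 2*d) table
# whose entry [i][j] is emb[i] concatenated with emb[j].
n, d = 3, 2
emb = [[10 * i + k for k in range(d)] for i in range(n)]  # toy embeddings

# mimics zx.repeat(1, n).view(n, n, d): row i is repeated across j
temp1 = [[emb[i] for _ in range(n)] for i in range(n)]
# mimics zx.repeat(n, 1).view(n, n, d): row j varies across j
temp2 = [[emb[j] for j in range(n)] for _ in range(n)]
# mimics torch.cat((temp1, temp2), dim=-1)
temp = [[temp1[i][j] + temp2[i][j] for j in range(n)] for i in range(n)]

assert temp[1][2] == emb[1] + emb[2]  # pairwise concatenation holds
```

An edge predictor applied to `temp[i][j]` then sees both endpoints' features at once, which is the motivation stated above.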
Hi,
It looks like specifying any embedding_dim other than 64 results in an error? For example, gives me: