
GraphGallery 0.1.7

@EdisonLeeeee released this 07 Aug 13:47
· 688 commits to master since this release

Changes

New features:

  • Add reset_lr, reset_optimizer, and reset_weights methods for semi-supervised models
  • Remove the final softmax activation, so the model now outputs raw (non-activated) logits; see the sketch after this list
  • Add a null context manager
  • Add a Gather layer
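
A minimal sketch of how the reset methods and the new logit outputs might be used together (it assumes a built model as in the GCN example below; the argument of reset_lr and the existence of a predict method returning logits are assumptions, not confirmed by this release note):

import tensorflow as tf
from graphgallery.nn.models import GCN

# adj, x, labels and the idx_* index arrays are prepared as in the example below
model = GCN(adj, x, labels, device='GPU', seed=123)
model.build()
model.train(idx_train, idx_val, verbose=1, epochs=100)

# start a fresh run without re-building the model
model.reset_weights()    # re-initialize the trainable weights
model.reset_optimizer()  # clear the optimizer state
model.reset_lr(0.01)     # assumed to take the new learning rate as its argument
model.train(idx_train, idx_val, verbose=1, epochs=100)

# the last softmax has been removed, so outputs are raw logits;
# apply softmax yourself if you need probabilities (`predict` is assumed here)
probs = tf.nn.softmax(model.predict(idx_test))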

Bug fixes

  • Rename astensor to astensors and add an astensor method (see the sketch after this list)
  • infer_type now accepts SciPy sparse matrices and TensorFlow tensors
  • LGCN now accepts dense matrices
  • Fix a normalize_adj bug when it is passed a dense matrix
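
A hedged illustration of the renamed conversion helpers; the import path and the ability of astensors to take several inputs at once are assumptions based on this changelog, not the confirmed API:

import numpy as np
import scipy.sparse as sp
from graphgallery import astensor, astensors  # import path is an assumption

adj = sp.random(100, 100, density=0.01, format='csr')  # SciPy sparse adjacency matrix
x = np.random.rand(100, 16).astype('float32')           # dense feature matrix

adj_tensor = astensor(adj)       # convert a single object to a TensorFlow tensor
adj_t, x_t = astensors(adj, x)   # convert several objects at once (the former astensor)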

Others

  • Move GCNF, EdgeGCN, SimplifiedOBVAT, MedianSAGE, and GCN_MIX to graphgallery/nn/models/semisupervised/experimental/
  • The default is now norm_x=None instead of norm_x='l1'
  • The default integer type is now int32 instead of int64

Example of the GCN model

from graphgallery.nn.models import GCN
# adj is a SciPy sparse adjacency matrix, x is a NumPy feature matrix
model = GCN(adj, x, labels, device='GPU', norm_x='l1', seed=123)
# build your GCN model (pass custom hyper-parameters here if needed)
model.build()
# train your model; idx_train and idx_val are NumPy arrays of node indices
his = model.train(idx_train, idx_val, verbose=1, epochs=100)
his = model.train(idx_train, idx_val, verbose=1, epochs=100)
# test your model
loss, accuracy = model.test(idx_test)
print(f'Test loss {loss:.5}, Test accuracy {accuracy:.2%}')

On the Cora dataset:

loss 1.02, acc 95.00%, val_loss 1.41, val_acc 77.40%: 100%|██████████| 100/100 [00:02<00:00, 37.07it/s]
Test loss 1.4123, Test accuracy 81.20%

Build your model

You can use the following statements to build your model:

# one hidden layer with 32 units and ReLU activation
>>> model.build(hiddens=32, activations='relu')

# two hidden layers with 32 and 64 units, both using ReLU activation
>>> model.build(hiddens=[32, 64], activations='relu')

# two hidden layers with 32 and 64 units, using ReLU and ELU activations respectively
>>> model.build(hiddens=[32, 64], activations=['relu', 'elu'])

# other parameters such as `dropouts` and `l2_norms` (if present) follow the same convention

Train or test your model

More details can be found in the docstrings of the model.train and model.test methods.
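
For instance, you can read their accepted arguments and defaults directly:

# print the full signature and docstring of each method
help(model.train)
help(model.test)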

Hyper-parameters

You can simply use model.show() to display all your hyper-parameters.
Alternatively, use model.show('model') or model.show('train') to display only the model parameters or the training parameters.
NOTE: you need to install texttable first.
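
All three calls side by side:

# pip install texttable   (required for the tables)
model.show()           # show all hyper-parameters
model.show('model')    # show only the model parameters
model.show('train')    # show only the training parameters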

Visualization

  • Accuracy
import matplotlib.pyplot as plt
plt.plot(his.history['acc'])
plt.plot(his.history['val_acc'])
plt.legend(['Accuracy', 'Val Accuracy'])
plt.ylabel('Accuracy')
plt.xlabel('Epochs')
plt.show()

[figure: training and validation accuracy curves]

  • Loss
import matplotlib.pyplot as plt
plt.plot(his.history['loss'])
plt.plot(his.history['val_loss'])
plt.legend(['Loss', 'Val Loss'])
plt.ylabel('Loss')
plt.xlabel('Epochs')
plt.show()

[figure: training and validation loss curves]