Each time I execute this script I get different values as output. The output is the same for every node, which makes sense since it is a complete graph and the input features are identical for each node. However, I would expect to get the same values on every execution.
I need the forward pass to produce identical values whenever the input matrices are the same, because I want to use this script to test a GCN layer that I have implemented in another language for a software project.
As you can see in the code, I set the weight matrix and the bias vector explicitly. I also tried specifying the random seed. I saw threads mentioning dropout, but I am only using a single layer. Still, the script produces different values on every run. What could be the problem?
```python
import torch
from torch_geometric.nn import GCNConv

x = torch.tensor([[1.0], [1.0], [1.0]], dtype=torch.float)
edge_index = torch.tensor([[0, 0, 1, 1, 2, 2],
                           [1, 2, 0, 2, 0, 1]], dtype=torch.long)

conv_layer = GCNConv(in_channels=1, out_channels=1)

new_weight_values = torch.tensor([[1.0]])
new_bias_values = torch.tensor([[0.0]])
conv_layer.weight = torch.nn.Parameter(new_weight_values)
conv_layer.bias = torch.nn.Parameter(new_bias_values)

output = conv_layer.forward(x, edge_index)

print("Input Features:")
print(x)
print("Output Features:")
print(output)
```
`GCNConv` stores its linear mapping in a `.lin` submodule, and you are not making that part deterministic: its weight is randomly initialized each time the layer is constructed, so the output changes from run to run unless you overwrite it (or seed the RNG before creating the layer).
If you change `conv_layer.weight = torch.nn.Parameter(new_weight_values)` to `conv_layer.lin.weight = torch.nn.Parameter(new_weight_values)`, it should work as expected.
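For completeness, here is a minimal sketch of the corrected script, assuming a recent torch_geometric release where `GCNConv` keeps its transform in `conv_layer.lin` (the bias is set with shape `[out_channels]`, matching the parameter `GCNConv` registers):

```python
import torch
from torch_geometric.nn import GCNConv

x = torch.tensor([[1.0], [1.0], [1.0]], dtype=torch.float)
edge_index = torch.tensor([[0, 0, 1, 1, 2, 2],
                           [1, 2, 0, 2, 0, 1]], dtype=torch.long)

conv_layer = GCNConv(in_channels=1, out_channels=1)

# Overwrite the weight of the internal linear transform; GCNConv itself
# has no top-level .weight attribute.
conv_layer.lin.weight = torch.nn.Parameter(torch.tensor([[1.0]]))
# GCNConv's own bias parameter has shape [out_channels], so use a 1-D tensor.
conv_layer.bias = torch.nn.Parameter(torch.tensor([0.0]))

output = conv_layer(x, edge_index)  # calling the module is the idiomatic form

# With weight 1 and bias 0, symmetric normalization with self-loops gives
# each node 3 * (1/3) * 1.0 = 1.0, so this should print [[1.], [1.], [1.]]
# on every run.
print(output)
```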
In fact, there is no such attribute as `conv_layer.weight`; your code simply creates a new one. If you tried to print it before setting it, you would get an `AttributeError`. (`conv_layer.bias` does exist, though.)
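One quick way to see which parameters the layer actually registers (and to confirm there is no top-level `weight`) is to list them:

```python
from torch_geometric.nn import GCNConv

conv_layer = GCNConv(in_channels=1, out_channels=1)

# In recent torch_geometric releases this prints 'bias' and 'lin.weight';
# there is no registered parameter named 'weight'.
for name, param in conv_layer.named_parameters():
    print(name, tuple(param.shape))
```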