This repository was archived by the owner on Dec 19, 2023. It is now read-only.

DataParallel is not supported #38

@HryMurph

Description


The project does not yet support DataParallel or multiple GPUs. I tried to add the code myself but found several bugs while running the new version.

  1. Line 106, layers.py: edge = torch.cat((edge[:, :], edge_list_nhop[:, :]), dim=1). In the original code, the parameter 'edge' has shape (2, n). With DataParallel, however, the scatter step splits inputs along dim 0, so 'edge' arrives with shape (1, n). This line then raises an error because 'edge' and 'edge_list_nhop' differ in dimension 0 (1 vs. 2). A fix is to transpose Corpus.train_adj_matrix[0] when initializing Corpus, and then transpose the 'adj' parameter back in the forward function of class SpKBGATModified.
  2. RuntimeError: Expected all tensors to be on the same device, but found at least two devices. This error is raised at line 114, layers.py: edge_m = self.a.mm(edge_h). I have no idea how to fix this one; I hope someone can provide a solution.
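A minimal sketch of the workaround described in point 1. The class name EdgeConsumer and the variable names here are hypothetical, not from the project: the idea is that DataParallel scatters inputs along dim 0, so an edge index stored as (2, n) is split into (1, n) chunks, while one stored transposed as (n, 2) is split along n and can be transposed back inside forward:

```python
import torch
import torch.nn as nn

class EdgeConsumer(nn.Module):
    """Hypothetical module illustrating the transpose workaround."""

    def forward(self, adj_t, edge_list_nhop_t):
        # Inputs are stored as (n, 2) so DataParallel's scatter splits
        # them along n (dim 0) instead of splitting the 2 index rows.
        edge = adj_t.t()                    # back to (2, n)
        edge_list_nhop = edge_list_nhop_t.t()
        # Both tensors now have 2 rows, so concatenation along dim=1 works.
        return torch.cat((edge, edge_list_nhop), dim=1)

edge = torch.randint(0, 10, (5, 2))   # edge list stored transposed: (n, 2)
nhop = torch.randint(0, 10, (3, 2))
out = EdgeConsumer()(edge, nhop)
print(out.shape)  # torch.Size([2, 8])
```

Wrapped in nn.DataParallel, each replica would receive a slice of the (n, 2) tensors along n, which is the split you actually want.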
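For point 2, one common cause of this error under DataParallel is that 'self.a' is a plain tensor placed on one GPU by hand, so replicate() does not copy it to the other devices. A sketch of the usual remedy, assuming 'self.a' is a weight matrix (the class name and dimensions here are illustrative, not the project's actual layer): register it as an nn.Parameter so DataParallel replicates it onto each device along with the module.

```python
import torch
import torch.nn as nn

class SpGraphAttention(nn.Module):
    """Illustrative layer: self.a as a registered parameter."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        # nn.Parameter is part of the module's state, so DataParallel's
        # replicate() moves a copy to every GPU; a bare .cuda() tensor
        # would stay on its original device and trigger the error.
        self.a = nn.Parameter(torch.empty(out_dim, in_dim))
        nn.init.xavier_normal_(self.a)

    def forward(self, edge_h):
        # self.a lives on the same device as edge_h within each replica.
        return self.a.mm(edge_h)

layer = SpGraphAttention(in_dim=4, out_dim=8)
edge_h = torch.randn(4, 16)
edge_m = layer(edge_h)
print(edge_m.shape)  # torch.Size([8, 16])
```

The same applies to any tensor created inside forward: build it with edge_h.device (or via torch.zeros_like) rather than a hard-coded device.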
