ModuleNotFoundError: No module named 'core' #143

Open
annyhou opened this issue Mar 18, 2020 · 3 comments

Comments

annyhou commented Mar 18, 2020

Hi,
I want to convert the PyTorch model to Caffe with MMdnn, so I tried to save the model together with its architecture, not only the weights. But when I load it, I get the error below:
import torch
path = "../../cornernet_lite-model.pth"
model = torch.load(path)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/install/[email protected]/lib/python3.7/site-packages/torch/serialization.py", line 367, in load
    return _load(f, map_location, pickle_module)
  File "/install/[email protected]/lib/python3.7/site-packages/torch/serialization.py", line 538, in _load
    result = unpickler.load()
ModuleNotFoundError: No module named 'core'

Thanks.
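
For context: torch.load on a model pickled with its architecture re-imports the classes it was saved from, so the core package from the CornerNet-Lite repository must be importable in the process doing the loading. A minimal sketch of two possible fixes (the repository path below is an assumption, adjust it to your checkout):

import os
import sys
import torch

# Assumption: the CornerNet-Lite checkout (the directory containing the "core"
# package) lives at ~/CornerNet-Lite; point this at your actual checkout.
sys.path.append(os.path.expanduser("~/CornerNet-Lite"))

# With "core" importable, unpickling the full model can resolve its classes.
model = torch.load("../../cornernet_lite-model.pth")

# Alternative: save only the weights and rebuild the architecture in code,
# which avoids pickling module paths in the first place.
# torch.save(model.state_dict(), "cornernet_lite-weights.pth")
# net.load_state_dict(torch.load("cornernet_lite-weights.pth"))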


annyhou commented Mar 25, 2020

I switched to another approach: converting PyTorch to Caffe through ONNX. But when I run torch.onnx.export(model, args, f, ...), I get the error below:
RuntimeError: ONNX export failed: Couldn't export Python operator TopPoolFunction.
Is this because the operator is compiled with C++?
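
TopPoolFunction is a torch.autograd.Function backed by the compiled C++ corner-pooling extension, and the ONNX exporter can only translate operators it knows about, so it gives up on the custom Python operator. One possible workaround (a sketch, not code from this project) is to re-implement corner pooling with plain PyTorch ops and swap it into the pooling modules before calling torch.onnx.export. The direction names below follow my reading of CornerNet's corner pooling, so verify them against the C++ kernels, and how cleanly the trace exports still depends on your PyTorch/opset version:

import torch

def _scan_max(slices):
    # Running elementwise max over an ordered list of slices.
    out = [slices[0]]
    for s in slices[1:]:
        out.append(torch.max(s, out[-1]))
    return out

def corner_pool(x, dim, reverse):
    # Corner pooling as a cumulative max along one spatial dimension, built
    # only from narrow/max/cat so tracing sees nothing but standard ops.
    n = x.size(dim)
    order = list(range(n))[::-1] if reverse else list(range(n))
    pooled = _scan_max([x.narrow(dim, i, 1) for i in order])
    if reverse:
        pooled = pooled[::-1]
    return torch.cat(pooled, dim=dim)

def top_pool(x):    return corner_pool(x, dim=2, reverse=True)   # max over rows i..H-1
def bottom_pool(x): return corner_pool(x, dim=2, reverse=False)  # max over rows 0..i
def left_pool(x):   return corner_pool(x, dim=3, reverse=True)   # max over cols j..W-1
def right_pool(x):  return corner_pool(x, dim=3, reverse=False)  # max over cols 0..j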


annyhou commented May 21, 2020

https://github.com/xxradon/PytorchToCaffe
code:
import os
import sys
# expand "~" explicitly, otherwise the appended path is never found
sys.path.append(os.path.expanduser("~/CornerNet-Lite/PytorchToCaffe/"))
import pytorch_to_caffe
import torch
from types import MethodType

from core.models.CornerNet_Saccade import model
from core.nnet.py_factory import DummyModule

def load_model(model, pretrained_model):
    print("loading from {}".format(pretrained_model))
    with open(pretrained_model, "rb") as f:
        params = torch.load(f)
    model.load_state_dict(params)
    return model

def forward(self, *xs, **kwargs):
    return self.module(*xs, **kwargs)

net = DummyModule(model())
net.forward = MethodType(forward, net)
load_model(net, './cache/nnet/CornerNet_Saccade/CornerNet_Saccade_500000.pkl')

name = 'CornerNet_Lite'
net.eval()

input = torch.zeros([1, 3, 383, 383])
pytorch_to_caffe.trans_net(net, input, name)
pytorch_to_caffe.save_prototxt('./cache/nnet/CornerNet_Saccade/{}.prototxt'.format(name))
pytorch_to_caffe.save_caffemodel('./cache/nnet/CornerNet_Saccade/{}.caffemodel'.format(name))
Running this prints: WARNING: CANNOT FOUND blob 140500196353368
Maybe I need to define custom layers for the corner pooling?
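
The "CANNOT FOUND blob" warning looks like the same root cause as the ONNX failure: pytorch_to_caffe records Caffe layers by hooking standard torch functions, so the outputs of the compiled corner-pooling Functions never get registered as blobs. A sketch of one way around it, assuming the pure-PyTorch top_pool/bottom_pool/left_pool/right_pool from the previous comment, and assuming the compiled pooling modules are named TopPool, BottomPool, LeftPool and RightPool (check the actual class names in your CornerNet-Lite checkout):

import torch.nn as nn

class PurePool(nn.Module):
    # Wraps one of the pure-PyTorch pooling functions so it can stand in for a
    # compiled pooling module inside the network.
    def __init__(self, pool_fn):
        super().__init__()
        self.pool_fn = pool_fn

    def forward(self, x):
        return self.pool_fn(x)

def swap_pool_modules(root, mapping):
    # Replace every submodule whose class name appears in `mapping` with a
    # PurePool around the corresponding pure-PyTorch pooling function.
    for name, module in list(root.named_modules()):
        fn = mapping.get(type(module).__name__)
        if fn is None or not name:
            continue
        parent = root
        *parent_names, child = name.split(".")
        for p in parent_names:
            parent = getattr(parent, p)
        setattr(parent, child, PurePool(fn))

# Hypothetical usage, before pytorch_to_caffe.trans_net(net, input, name):
# swap_pool_modules(net, {"TopPool": top_pool, "BottomPool": bottom_pool,
#                         "LeftPool": left_pool, "RightPool": right_pool})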

@Zhuquantao

Hi, how did you solve this problem? I ran into the same issue.
