
Can't import onnx model, converted from BigGAN-PyTorch #184

Open

beac0n opened this issue Feb 27, 2021 · 2 comments

beac0n commented Feb 27, 2021

I've created an ONNX model from the PyTorch model in this repo: https://github.com/ajbrock/BigGAN-PyTorch.git

I can use the model in Python without PyTorch, using only onnx.

However, I can't unmarshal the ONNX model with onnx-go. See the stack trace:

GOROOT=/usr/lib/go #gosetup
GOPATH=/home/beac0n/go #gosetup
/usr/lib/go/bin/go build -o /tmp/___go_build_go_stego_src go-stego/src #gosetup
/tmp/___go_build_go_stego_src
panic: runtime error: index out of range [0] with length 0

goroutine 1 [running]:
gorgonia.org/tensor.StdEng.makeArray(0xc029612ff8, 0xc1e040, 0xad47c0, 0x0)
        /home/beac0n/dev/beac0n/go-stego/vendor/gorgonia.org/tensor/defaultengine.go:22 +0xf5
gorgonia.org/tensor.(*Dense).makeArray(0xc029612fc0, 0x0)
        /home/beac0n/dev/beac0n/go-stego/vendor/gorgonia.org/tensor/dense.go:87 +0x192
gorgonia.org/tensor.(*Dense).fix(0xc029612fc0)
        /home/beac0n/dev/beac0n/go-stego/vendor/gorgonia.org/tensor/dense.go:292 +0x11d
gorgonia.org/tensor.New(0xc000200760, 0x3, 0x4, 0xc00000f7d0)
        /home/beac0n/dev/beac0n/go-stego/vendor/gorgonia.org/tensor/tensor.go:93 +0x77
github.com/owulveryck/onnx-go/internal/onnx/ir.(*TensorProto).Tensor(0xc0000b06c0, 0x0, 0xc013c93b18, 0x40e318, 0x30)
        /home/beac0n/dev/beac0n/go-stego/vendor/github.com/owulveryck/onnx-go/internal/onnx/ir/tensor.go:51 +0x391
github.com/owulveryck/onnx-go.toOperationAttribute(0xc000100900, 0x1, 0x0, 0xc029605d70, 0xc0f228)
        /home/beac0n/dev/beac0n/go-stego/vendor/github.com/owulveryck/onnx-go/attributes.go:30 +0x794
github.com/owulveryck/onnx-go.toOperationAttributes(0xc00009e320, 0x1, 0x1, 0xc0f228, 0xc029602e00, 0x0)
        /home/beac0n/dev/beac0n/go-stego/vendor/github.com/owulveryck/onnx-go/attributes.go:10 +0x8b
github.com/owulveryck/onnx-go.(*Model).applyModelProtoGraphNodeOperations(0xc013c93f30, 0xc00011e000, 0x0, 0x0)
        /home/beac0n/dev/beac0n/go-stego/vendor/github.com/owulveryck/onnx-go/decoder.go:235 +0x1a8
github.com/owulveryck/onnx-go.(*Model).applyModelProtoGraph(0xc013c93f30, 0xc00011e000, 0x139ec4db, 0x0)
        /home/beac0n/dev/beac0n/go-stego/vendor/github.com/owulveryck/onnx-go/decoder.go:149 +0x35a
github.com/owulveryck/onnx-go.(*Model).decodeProto(0xc013c93f30, 0xc00011e000, 0x139ec4db, 0xc16468)
        /home/beac0n/dev/beac0n/go-stego/vendor/github.com/owulveryck/onnx-go/decoder.go:112 +0x197
github.com/owulveryck/onnx-go.(*Model).UnmarshalBinary(0xc013c93f30, 0xc000280000, 0x139ec4da, 0x139ec4db, 0x139ec4db, 0x0)
        /home/beac0n/dev/beac0n/go-stego/vendor/github.com/owulveryck/onnx-go/decoder.go:38 +0xae
main.main()
        /home/beac0n/dev/beac0n/go-stego/src/main.go:25 +0x2ac

Process finished with exit code 2

The code I'm using:

package main

import (
	"github.com/owulveryck/onnx-go"
	"github.com/owulveryck/onnx-go/backend/x/gorgonnx"
	"io/ioutil"
	"log"
)

func main() {

	// Create a backend receiver
	backend := gorgonnx.NewGraph()
	// Create a model and set the execution backend
	model := onnx.NewModel(backend)

	// read the onnx model
	b, err := ioutil.ReadFile("biggan-512.onnx")
	if err != nil {
		log.Fatal(err)
	}
	// Decode it into the model

	err = model.UnmarshalBinary(b)
	if err != nil {
		log.Fatal(err)
	}
}
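
For context, once UnmarshalBinary succeeds, the remaining steps would follow the onnx-go README: set the input tensor(s), run the backend, and read the outputs. The sketch below continues the main function above; the input index and the (1, 128) shape are placeholders and depend entirely on how the generator was exported.

	// Sketch: continuation of main() above, following the onnx-go README.
	// Requires an extra import: "gorgonia.org/tensor".

	// Build a dummy input tensor. The real exported BigGAN-512 graph may
	// expect several inputs (latent vector, class label/embedding, ...)
	// with different shapes; (1, 128) here is only a placeholder.
	z := tensor.New(
		tensor.WithShape(1, 128),
		tensor.WithBacking(make([]float32, 128)),
	)
	model.SetInput(0, z) // error handling elided, as in the README example

	// Execute the graph on the gorgonnx backend.
	if err = backend.Run(); err != nil {
		log.Fatal(err)
	}

	// Read back the output tensors.
	outputs, err := model.GetOutputTensors()
	if err != nil {
		log.Fatal(err)
	}
	log.Println(outputs[0].Shape())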

You can download the model here:
https://drive.google.com/file/d/1idIe8_GzoqyHqXWkEnYd8H_IuvTjfn4L/view?usp=sharing
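
Side note, in case it helps with debugging: the panic is raised deep inside UnmarshalBinary (while converting a node attribute's TensorProto into a gorgonia tensor, per the stack trace above), so it never reaches the error return. A plain recover wrapper, sketched below with nothing onnx-go specific in it, at least turns the crash into a loggable error while investigating:

// decode wraps model.UnmarshalBinary so that a panic inside the decoder
// is reported as an error instead of killing the process. This is just a
// standard Go recover wrapper, not an onnx-go feature.
func decode(model *onnx.Model, b []byte) (err error) {
	defer func() {
		if r := recover(); r != nil {
			err = fmt.Errorf("onnx-go panicked while decoding: %v", r)
		}
	}()
	return model.UnmarshalBinary(b)
}

Replacing the model.UnmarshalBinary(b) call above with decode(model, b) (and adding "fmt" to the imports) surfaces the same index-out-of-range message as an ordinary error.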

zhanghongyong123456 commented

1. Are you converting a PyTorch-based model or a TensorFlow model?
2. How did you convert the BigGAN model to ONNX? Can you share the conversion code? Thank you.
3. Can you share the inference code for the ONNX model?
Thanks again, looking forward to your reply.


aiwaki commented Mar 31, 2023

I got the same error
