ONNX Model Exporter for TensorLayerX. It is maintained on both OpenI and GitHub, and you can get a free GPU on OpenI to run this project.
TLX2ONNX enables users to convert models from TensorLayerX to ONNX.
Install the latest release via pip:
pip install tlx2onnx
Or install from source:
git clone https://github.com/tensorlayer/TLX2ONNX.git
cd TLX2ONNX
python setup.py install
TLX2ONNX can convert models built with TensorLayerX Module subclasses and Layers; the supported Layers can be found in the Operator list.
The following is an example of converting a multi-layer perceptron. You can get the code from here.
import os
os.environ["TL_BACKEND"] = 'tensorflow'
import tensorlayerx as tlx
from tensorlayerx.nn import Module
from tensorlayerx.nn import Linear, Concat, Elementwise
from tlx2onnx.main import export
import onnxruntime as rt
import numpy as np
class CustomModel(Module):
    def __init__(self):
        super(CustomModel, self).__init__(name="custom")
        self.linear1 = Linear(in_features=20, out_features=10, act=tlx.ReLU, name='relu1_1')
        self.linear2 = Linear(in_features=20, out_features=10, act=tlx.ReLU, name='relu2_1')
        self.concat = Concat(concat_dim=1, name='concat_layer')

    def forward(self, inputs):
        d1 = self.linear1(inputs)
        d2 = self.linear2(inputs)
        outputs = self.concat([d1, d2])
        return outputs
net = CustomModel()
input = tlx.nn.Input(shape=(3, 20), init=tlx.initializers.RandomNormal())
net.set_eval()
output = net(input)
print("tlx out", output)
onnx_model = export(net, input_spec=input, path='concat.onnx')
# Infer Model
sess = rt.InferenceSession('concat.onnx', providers=['CPUExecutionProvider'])  # providers is required in onnxruntime >= 1.9
input_name = sess.get_inputs()[0].name
output_name = sess.get_outputs()[0].name
input_data = np.array(input, dtype=np.float32)
result = sess.run([output_name], {input_name: input_data})
print('onnx out', result)
The converted ONNX file can be viewed with Netron.
The converted model's outputs match the original with almost no loss of accuracy, and the graph shows the input and output sizes of each layer, which is very helpful for checking the model.
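The accuracy claim above can be checked numerically by comparing the two outputs element-wise. A minimal sketch using NumPy's allclose (the arrays here are placeholders standing in for the TensorLayerX output and the ONNX Runtime output; in practice they come from net(input) and sess.run(...)):

```python
import numpy as np

# Placeholder arrays standing in for the two framework outputs.
tlx_out = np.array([[0.1, 0.2], [0.3, 0.4]], dtype=np.float32)
onnx_out = tlx_out + np.float32(1e-7)  # simulate tiny numerical drift

# rtol/atol tolerances absorb float32 rounding differences between runtimes.
close = np.allclose(tlx_out, onnx_out, rtol=1e-5, atol=1e-6)
print(close)  # True
```

If the comparison fails, tightening the tolerances one decade at a time helps locate how large the conversion error actually is.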
If you find TensorLayerX or TLX2ONNX useful for your project, please cite the following papers:
@article{tensorlayer2017,
author = {Dong, Hao and Supratak, Akara and Mai, Luo and Liu, Fangde and Oehmichen, Axel and Yu, Simiao and Guo, Yike},
journal = {ACM Multimedia},
title = {{TensorLayer: A Versatile Library for Efficient Deep Learning Development}},
url = {http://tensorlayer.org},
year = {2017}
}
@inproceedings{tensorlayer2021,
title={TensorLayer 3.0: A Deep Learning Library Compatible With Multiple Backends},
author={Lai, Cheng and Han, Jiarong and Dong, Hao},
booktitle={2021 IEEE International Conference on Multimedia \& Expo Workshops (ICMEW)},
pages={1--3},
year={2021},
organization={IEEE}
}