Repository contents: compressai/, demo/, README.md
"Joint Autoregressive and Hierarchical Priors for Learned Image Compression"
1. Translated the model from PyTorch to MindSpore.
2. Successfully ran the forward pass, backward pass, and parameter updates.
3. Tested and compared the MindSpore version against the PyTorch version using pretrained parameters.
The translated model file is JointAutoregressiveHierarchicalPriors/demo/model_baseline.py.
The pretrained MindSpore parameters are in JointAutoregressiveHierarchicalPriors/demo/model_test/epoch_3.ckpt.
To use the translated model, check
Use my_train.py to train the MindSpore model:
1. cd JointAutoregressiveHierarchicalPriors/demo
2. python train.py -d "/DATA/DATANAS1/liyao/waseda/dataset/" --seed 0 --batch-size 4 --test-batch-size 1 --save --lambda 0.01
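The `--lambda` flag sets the rate-distortion trade-off. In CompressAI-style training the objective is typically λ · 255² · MSE + bpp; whether this repository's train.py uses exactly this form is an assumption, so the sketch below is illustrative only:

```python
# Hypothetical sketch of the CompressAI-style rate-distortion loss that the
# --lambda flag most likely controls (assumption, not verified against this
# repository's train.py).

def rate_distortion_loss(mse: float, bpp: float, lmbda: float = 0.01) -> float:
    """Weighted sum of distortion (MSE, scaled to the 8-bit range) and rate (bpp)."""
    return lmbda * 255 ** 2 * mse + bpp

# Example with made-up values: distortion term 0.01 * 65025 * 0.01 = 6.5025,
# plus a rate of 0.3 bpp.
loss = rate_distortion_loss(mse=0.01, bpp=0.3, lmbda=0.01)
```

A larger λ weights distortion more heavily, yielding higher-quality but larger compressed images.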
Quality measurements (MindSpore version):
Loss: 7.845 | MSE loss: 0.01107 | PSNR (dB): 22.568 | Bpp loss: 0.3231 | Aux loss: 10432.27
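For reference, PSNR for images normalized to [0, 1] is 10 · log10(1 / MSE). Note that reported PSNR is usually averaged per image, so it will not exactly match a value derived from the aggregate MSE loss above. A minimal sketch:

```python
import math

def psnr(mse: float, max_val: float = 1.0) -> float:
    """Peak signal-to-noise ratio in dB for a given mean squared error."""
    return 10 * math.log10(max_val ** 2 / mse)

# e.g. an MSE of 0.01 on [0, 1] images corresponds to 20.0 dB
value = psnr(0.01)
```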
@misc{minnen2018joint,
title={Joint Autoregressive and Hierarchical Priors for Learned Image Compression},
author={David Minnen and Johannes Ballé and George Toderici},
year={2018},
eprint={1809.02736},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
Author: Yao Li