antiAI protects images, audio, and video from AI recognition using multiple algorithms. Only Fawkes is supported for now; more algorithms will be added soon.

antiAI is developed by researchers at the Department of Industrial Engineering, Tsinghua University.
Dependencies (`requirements.txt`):

```
bleach>=2.1.0
click>=7.0
cryptography>=37.0.1
fawkes
keras==2.4.3
mtcnn
numpy>=1.19.5
opencv-python>=4.6.0.66
pillow>=7.0.0
stego-lsb
tensorflow==2.4.1
```

Python >= 3.8 is required.
You can download it by:

```shell
git clone https://github.com/antiAIbyTsinghua/antiAI
cd antiAI
```
Run:

```shell
python main.py
```

Options:

- `--test`: run a test demo.
- `--genkey`: generate a new key to encrypt or decrypt images.
- `--protect`: protect images against AI.
- `--recover`: recover the original images from sealed images.
- `-d`, `--directory`: the directory that contains images.
- `-m`, `--method`: the algorithm used to protect images (only Fawkes is supported for now).
- `-k`, `--key`: the key used to encrypt or decrypt images.
- `--clean`: delete all generated files.

For first-time use, you can run a test:
```shell
python main.py --test --clean
```
When it finishes, it will print `Test finished!`.
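The option surface described above can be sketched with Python's standard-library `argparse`. Note that the repository's requirements list `click`, so this is only an illustrative approximation and every name here is hypothetical, not the project's actual code:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical mirror of antiAI's CLI options, for illustration only.
    p = argparse.ArgumentParser(prog="main.py")
    p.add_argument("--test", action="store_true", help="run a test demo")
    p.add_argument("--genkey", action="store_true", help="generate a new key")
    p.add_argument("--protect", action="store_true", help="protect images against AI")
    p.add_argument("--recover", action="store_true", help="recover original images")
    p.add_argument("-d", "--directory", help="directory that contains images")
    p.add_argument("-m", "--method", default="fawkes", help="protection algorithm")
    p.add_argument("-k", "--key", help="key used to encrypt or decrypt images")
    p.add_argument("--clean", action="store_true", help="delete all generated files")
    return p

# Example: parse the "protect" invocation shown later in this README.
args = build_parser().parse_args(["--protect", "-d", "images", "-m", "fawkes", "-k", "key"])
```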
To generate a new key:

```shell
python main.py --genkey
```
We strongly recommend that you keep the key safe!
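The requirements list `cryptography`, which suggests (but does not confirm) a Fernet-style key: 32 random bytes, urlsafe-base64 encoded. Under that assumption, key generation can be sketched with the standard library alone:

```python
import base64
import os

def generate_key() -> bytes:
    # 32 random bytes, urlsafe-base64 encoded -- the format Fernet keys use.
    # Assumption: antiAI's actual key format is not documented in this README.
    return base64.urlsafe_b64encode(os.urandom(32))
```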
To protect files from AI recognition and encrypt them with a given key:

```shell
python main.py --protect -d images -m fawkes -k key
```
To recover the original files from sealed (encrypted) files with the same key:

```shell
python main.py --recover -d images -k key
```
If none of `--test`, `--genkey`, `--protect`, or `--recover` is specified, both protection and recovery will be performed using an existing key. For example:

```shell
python main.py -d images -m fawkes -k key
```
You will get three kinds of generated images. Images ending in `_cloaked` are produced by the antiAI algorithms; they look similar to the originals but are difficult for AI to recognize. Images ending in `_sealed` are visually identical to the cloaked images, but the original images have been hidden inside them. Images ending in `_recovered` are the original images recovered from the `_sealed` images.
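The requirements include `stego-lsb`, which hints that sealing hides the original image in the least significant bits of the cloaked image's pixels. As a minimal illustration of LSB hiding (not the repository's actual code), operating on a flat list of 8-bit pixel values:

```python
def hide(pixels, payload):
    """Embed payload bytes, bit by bit, into the LSBs of cover pixels."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    out = list(pixels)
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & ~1) | bit  # overwrite only the lowest bit
    return out

def recover(pixels, n_bytes):
    """Read back n_bytes previously hidden by hide()."""
    data = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)
```

A real sealing step would presumably also encrypt the payload with the key before embedding it, which this sketch omits.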
antiAI helps users protect their privacy against AI recognition. Users can keep the `_sealed` images on their devices and recover the original images whenever they want, while an unauthorized visitor can only reach the cloaked images and never the originals.
All images in `images/` are provided for reference only.