Model: 2.6B
mindspore: 1.2
GPU: NVIDIA V100
(pangu) root@iZ2ze15skiya5kcq1eryyuZ:/mnt/pangu/pangu-alpha# ./scripts/infer.sh
rank_id:0 rank_id str:0
local_rank:0, device id:0 start to run...
===config is: [PANGUALPHAConfig]==============================
batch_size:0
seq_length:1024
vocab_size:40000
embedding_size:2560
num_layers:32
num_heads:32
expand_ratio:4
post_layernorm_residual:False
dropout_rate:0.1
compute_dtype:Float16
use_past:False
dp:0
mp:8
self_layernorm:True
forward_reduce_scatter:True
stage_num:1
micro_size:1
word_emb_dp:True
eod_reset:False
load_ckpt_path:/mnt/pangu/pangu-alpha/checkpoint_file
=====args_opt is: Namespace(data_url=None, device_id=0, device_num=1, distribute='false', embedding_size=2560, load_ckpt_name='PANGUALPHA3.ckpt', load_ckpt_path='/mnt/pangu/pangu-alpha/checkpoint_file', micro_size=1, mode='2.6B', num_heads=32, num_layers=32, per_batch_size=1, run_type='predict', seq_length=1024, stage_num=1, strategy_load_ckpt_path='/mnt/pangu/pangu-alpha/strategy_load_ckpt/pangu_alpha_2.6B_ckpt_strategy.ckpt', tensor_model_parallel_num=8, tokenizer_path='/mnt/pangu/pangu-alpha/tokenizer/', train_url=None, vocab_size=40000)
[WARNING] ME(24520:140708633995072,MainProcess):2021-04-28-16:12:46.601.022 [mindspore/common/_decorator.py:32] 'GatherV2' is deprecated from version 1.1 and will be removed in a future version, use 'Gather' instead.
[WARNING] ME(24520:140708633995072,MainProcess):2021-04-28-16:12:46.642.878 [mindspore/common/_decorator.py:32] 'TensorAdd' is deprecated from version 1.1 and will be removed in a future version, use 'Add' instead.
[WARNING] ME(24520:140708633995072,MainProcess):2021-04-28-16:12:46.643.323 [mindspore/common/_decorator.py:32] 'TensorAdd' is deprecated from version 1.1 and will be removed in a future version, use 'Add' instead.
[WARNING] ME(24520:140708633995072,MainProcess):2021-04-28-16:12:46.644.994 [mindspore/common/_decorator.py:32] 'TensorAdd' is deprecated from version 1.1 and will be removed in a future version, use 'Add' instead.
[WARNING] ME(24520:140708633995072,MainProcess):2021-04-28-16:12:46.645.344 [mindspore/common/_decorator.py:32] 'TensorAdd' is deprecated from version 1.1 and will be removed in a future version, use 'Add' instead.
[WARNING] ME(24520:140708633995072,MainProcess):2021-04-28-16:12:46.843.766 [mindspore/common/_decorator.py:32] 'TensorAdd' is deprecated from version 1.1 and will be removed in a future version, use 'Add' instead.
[WARNING] ME(24520:140708633995072,MainProcess):2021-04-28-16:12:46.845.435 [mindspore/common/_decorator.py:32] 'TensorAdd' is deprecated from version 1.1 and will be removed in a future version, use 'Add' instead.
Traceback (most recent call last):
  File "/mnt/pangu/pangu-alpha/run_pangu_alpha_predict.py", line 92, in <module>
    run_predict(args_opt)
  File "/mnt/pangu/pangu-alpha/pangu_alpha_predict.py", line 228, in run_predict
    run_predict_no_pipeline(args_opt)
  File "/mnt/pangu/pangu-alpha/pangu_alpha_predict.py", line 195, in run_predict_no_pipeline
    pangu_alpha = PANGUALPHA(config)
  File "/mnt/pangu/pangu-alpha/pangu_alpha.py", line 886, in __init__
    self.backbone = PANGUALPHA_Model(config)
  File "/mnt/pangu/pangu-alpha/pangu_alpha.py", line 720, in __init__
    per_block = Block(config, i + 1).set_comm_fusion(int(i / fusion_group_size) + 2)
  File "/mnt/pangu/pangu-alpha/pangu_alpha.py", line 459, in __init__
    self.attention = Attention(config, scale, layer_idx)
  File "/mnt/pangu/pangu-alpha/pangu_alpha.py", line 285, in __init__
    self.dropout.dropout_gen_mask.shard(((config.dp, 1, 1),))
  File "/mnt/anaconda3/envs/pangu/lib/python3.7/site-packages/mindspore/nn/cell.py", line 266, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'.".format(type(self).__name__, name))
AttributeError: 'Dropout' object has no attribute 'dropout_gen_mask'.
Currently, this only supports running on Ascend devices; please check the specific environment requirements.
Hi, I also hit this error when running on Ascend: 'Dropout' object has no attribute 'dropout_gen_mask'
The dropout_gen_mask and dropout_do_mask interfaces were both unified into a single dropout primitive in MindSpore 1.3 on Ascend; please update the related code accordingly.