imyzx
  • Joined on Apr 23, 2021

imyzx commented on issue PCL-Platform.Inte.../PanGu-Alpha#7

Running inference on a single NVIDIA V100: AttributeError: 'Dropout' object has no attribute 'dropout_gen_mask'.

The dropout_gen_mask and dropout_do_mask interfaces have both been unified into dropout in Ascend MindSpore 1.3; please update the relevant code accordingly.
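
For illustration, a minimal sketch of what the migration could look like under MindSpore 1.3, where (per the note above) the two mask primitives are folded into a single dropout primitive; the attribute name and the dp value below are assumptions for the sketch, not code from the repository:

```
import mindspore.nn as nn

dp = 1  # data-parallel degree; 1 for single-card V100 inference (assumption)

dropout = nn.Dropout(keep_prob=0.9)

# MindSpore < 1.3 style -- raises AttributeError on 1.3:
# dropout.dropout_gen_mask.shard(((dp, 1, 1),))
# dropout.dropout_do_mask.shard(((dp, 1, 1),))

# Unified primitive in MindSpore 1.3 (attribute name assumed from the note above):
dropout.dropout.shard(((dp, 1, 1),))
```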

4 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx pushed to master at imyzx/PanguAlpha-NPU-openI

6 days ago

imyzx renamed repository from PanguAlpha-NPU to imyzx/PanguAlpha-NPU-openI

1 week ago

imyzx commented on issue PCL-Platform.Inte.../PanGu-Alpha#22

Error when running the finetune training script with Cloud Brain on the OpenI platform

Please execute the script with bash finetune_pangu_distributed.sh, then modify the configuration in the .sh file according to your resources and data locations.

1 month ago

imyzx pushed to master at PCL-Platform.Inte.../PanGu-Alpha

6 months ago

imyzx pushed to master at PCL-Platform.Inte.../PanGu-Alpha

6 months ago

imyzx pushed to master at PCL-Platform.Inte.../PanGu-Alpha

6 months ago

imyzx pushed to master at PCL-Platform.Inte.../PanGu-Alpha

6 months ago

imyzx pushed to master at PCL-Platform.Inte.../PanGu-Alpha

  • 316d82242c Update the description of the progress on opening the PanGu-α model to the public

6 months ago

imyzx commented on issue PCL-Platform.Inte.../PanGu-Alpha#11

'Dropout' object has no attribute 'dropout_gen_mask'.

```
class Output(nn.Cell):
    """
    The output mapping module for each layer
    Args:
        config(PANGUALPHAConfig): the config of network
        scale: scale factor for initialization
    Inputs:
        x: output of the self-attention module
    Returns:
        output: Tensor, the output of this layer after mapping
    """
    def __init__(self, config, scale=1.0):
        super(Output, self).__init__()
        input_size = config.embedding_size
        output_size = config.embedding_size * config.expand_ratio
        self.mapping = Mapping_output(config, input_size, output_size)
        self.projection = Mapping(config, output_size, input_size, scale)
        self.activation = nn.GELU()
        self.activation.gelu.shard(((config.dp, 1, config.mp),))
        self.dropout = nn.Dropout(1 - config.dropout_rate)
        # self.dropout.dropout_gen_mask.shard(((config.dp, 1, 1),))
        # self.dropout.dropout_do_mask.shard(((config.dp, 1, 1),))

    def construct(self, x):
        hidden = self.activation(self.mapping(x))
        output = self.projection(hidden)
        output = self.dropout(output)
        return output
```
Hi, dropout is not used during inference, so you can comment out these two lines.
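
For what it's worth, a minimal sketch of why the two sharding lines can safely stay commented out for inference: once the network is switched to inference mode with set_train(False), nn.Dropout passes its input through unchanged, so no dropout mask (and hence no sharding of it) is involved. The tensor shape below is arbitrary and only for illustration:

```
import numpy as np
import mindspore as ms
import mindspore.nn as nn

net = nn.Dropout(keep_prob=0.9)
net.set_train(False)  # inference mode: nn.Dropout acts as an identity op

x = ms.Tensor(np.ones((2, 4), np.float32))
print(net(x))  # output equals the input; no dropout mask is generated
```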

6 months ago