#24 pangu-alpha is composed of encoders rather than decoders

Closed
created 2 years ago by fandd · 1 comment
fandd commented 2 years ago
According to the latest pangu-alpha code [link](https://gitee.com/mindspore/models/blob/master/official/nlp/pangu_alpha/src/pangu_alpha.py), the pangu-alpha model appears to be composed of transformer encoders rather than the transformer decoders described in the technical report [link](https://git.openi.org.cn/PCL-Platform.Intelligence/PanGu-Alpha/src/branch/master/PANGU-%ce%b1.pdf). ![image](/attachments/d4e9a6a0-6bfd-4202-83b3-339ac791e9f8) ![image](/attachments/4578124d-794b-4706-88a6-e17b658557c7)
superqing commented 2 years ago
Owner
This is a naming issue in the code. The main structural difference between encoder and decoder blocks is the attention mask: apply a causal mask to an "encoder" block and it behaves as a decoder.
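For anyone landing here from search, here is a minimal NumPy sketch of the point above (not taken from the PanGu-α repository; the function and names are illustrative): the very same self-attention block gives bidirectional "encoder" attention with no mask and causal "decoder" attention with a lower-triangular mask.

```python
import numpy as np

def self_attention(x, causal=False):
    """Single-head self-attention over x of shape (seq_len, d_model).
    Projection weights are omitted for brevity; only the mask matters here."""
    seq_len, d_model = x.shape
    scores = x @ x.T / np.sqrt(d_model)  # (seq_len, seq_len) attention scores
    if causal:
        # Decoder-style: mask out future positions (strict upper triangle),
        # so position i can only attend to positions <= i.
        mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
        scores = np.where(mask, -1e9, scores)
    # Row-wise softmax over the (possibly masked) scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

x = np.random.default_rng(0).standard_normal((4, 8))
enc_out = self_attention(x, causal=False)  # "encoder": full bidirectional attention
dec_out = self_attention(x, causal=True)   # "decoder": same block, causal mask
```

In other words, the modules named `*Encoder*` in the PanGu-α source are used with a causal attention mask, so the model is a decoder-only (GPT-style) architecture as the technical report describes.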
superqing closed this issue 2 years ago