#886 add auto_grad

Closed
lvyufeng wants to merge 3 commits from lvyufeng/MSAdapter:autograd into master
frelam reviewed 1 month ago
mindtorch/torch/nn/parameter.py
@@ -80,2 +64,2 @@
-Tensor_.data_sync(self.data, True)
-return f'Parameter containing: {Tensor_.__repr__(self.data)}, requires_grad={self.requires_grad})'
+raise ValueError(f'not support type {type(data)}.')
+self.tensor = msParameter(self.tensor)
frelam commented 1 month ago
Two questions: 1. How is the parameter's value kept in sync with the value of self.tensor? For example, after the user performs various in-place operations (e.g. parameter.add_(3)), will the value of self.tensor be updated? 2. When self.tensor is updated, do its id and type change? If they do, the user loses the original parameter object: suppose self.tensor has already been passed to an optimizer; the optimizer would then be optimizing a discarded object, and the original parameter would never change.
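To make question 2 concrete, here is a minimal, framework-free sketch; FakeParameter, its add_ method, and the list payload are illustrative stand-ins, not mindtorch's actual implementation. If an "in-place" update rebinds self.tensor to a new object rather than mutating it, a reference captured earlier (e.g. by an optimizer) keeps pointing at stale data:

```python
# Hypothetical stand-in, NOT mindtorch's real class: an "in-place" op that
# rebinds self.tensor creates a brand-new object, so any earlier reference
# silently goes stale.

class FakeParameter:
    def __init__(self, value):
        self.tensor = [value]              # stand-in for the wrapped tensor

    def add_(self, n):
        # Rebinding instead of mutating: old references are lost.
        self.tensor = [self.tensor[0] + n]

p = FakeParameter(1.0)
held_by_optimizer = p.tensor               # optimizer captures the inner object
p.add_(3)                                  # user's "in-place" update
print(p.tensor)                            # [4.0] -- parameter looks updated
print(held_by_optimizer)                   # [1.0] -- optimizer sees old data
assert held_by_optimizer is not p.tensor   # the id has changed
```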
frelam reviewed 1 month ago
mindtorch/torch/nn/parameter.py
@@ -40,3 +42,3 @@
 raise ValueError("The argument 'init' should be number or string, but got {}.".format(type(init)))

-class Parameter(ms.Parameter):
+class Parameter(Tensor):
frelam commented 1 month ago
Since Parameter no longer inherits from ms.Parameter, will the existing flows that rely on MindSpore be affected, e.g. ms.Cell.__setattr__? Can MindSpore still recognize mindtorch's Parameter type?
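A minimal sketch of why the base class matters, assuming parameter registration is done with an isinstance check inside __setattr__ (the classes below are illustrative stand-ins, not MindSpore's actual implementation): a Parameter that no longer derives from the framework's own Parameter type falls through the check and is never registered.

```python
# Illustrative only: MsParameter, Cell, and TorchLikeParameter are stand-ins
# for ms.Parameter, ms.Cell, and mindtorch's Parameter(Tensor).

class MsParameter:                     # stand-in for ms.Parameter
    pass

class Cell:                            # stand-in for ms.Cell
    def __init__(self):
        object.__setattr__(self, '_params', {})

    def __setattr__(self, name, value):
        if isinstance(value, MsParameter):   # the type check in question
            self._params[name] = value       # registered -> visible to optimizers
        object.__setattr__(self, name, value)

class TorchLikeParameter:              # no longer derives from MsParameter
    pass

net = Cell()
net.w1 = MsParameter()                 # recognized and registered
net.w2 = TorchLikeParameter()          # isinstance fails: never registered
print(list(net._params))               # ['w1'] -- w2 is invisible
```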
Erpim reviewed 1 month ago
@@ -5,6 +5,7 @@ from typing import Iterable
 # from functools import lru_cache
 import numpy as np
 import mindspore as ms
+from mindspore import ops
Erpim commented 1 month ago
Please remove the imports that are not used.
Erpim closed this pull request 1 week ago
Some checks failed
continuous-integration/drone/pr Build is failing
Please reopen this pull request to perform a merge.