I'm using PyTorch 1.4 and need to export a model whose forward contains a loop of convolutions:
class MyCell(torch.nn.Module):
    def __init__(self):
        super(MyCell, self).__init__()

    def forward(self, x):
        for i in range(5):
            conv = torch.nn.Conv1d(1, 1, 2*i+3)
            x = torch.nn.ReLU()(conv(x))
        return x
torch.jit.script(MyCell()) produces the following error:
RuntimeError:
Arguments for call are not valid.
The following variants are available:

  _single(float[1] x) -> (float[]):
  Expected a value of type 'List[float]' for argument 'x' but instead found type 'Tensor'.

  _single(int[1] x) -> (int[]):
  Expected a value of type 'List[int]' for argument 'x' but instead found type 'Tensor'.

The original call is:
File "***/torch/nn/modules/conv.py", line 187
                 padding=0, dilation=1, groups=1,
                 bias=True, padding_mode='zeros'):
        kernel_size = _single(kernel_size)
                      ~~~~~~~ <--- HERE
        stride = _single(stride)
        padding = _single(padding)
'Conv1d.__init__' is being compiled since it was called from 'Conv1d'
File "***", line ***
    def forward(self, x):
        for i in range(5):
            conv = torch.nn.Conv1d(1, 1, 2*i+3)
                   ~~~~~~~~~~~~~~~ <--- HERE
            x = torch.nn.ReLU()(conv(x))
        return x
'Conv1d' is being compiled since it was called from 'MyCell.forward'
File "***", line ***
    def forward(self, x):
        for i in range(5):
            conv = torch.nn.Conv1d(1, 1, 2*i+3)
            ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
            x = torch.nn.ReLU()(conv(x))
        return x

I also tried pre-defining the convolutions and storing them in a plain Python list in __init__, but TorchScript does not accept that type:
class MyCell(torch.nn.Module):
    def __init__(self):
        super(MyCell, self).__init__()
        self.conv = [torch.nn.Conv1d(1, 1, 2*i+3) for i in range(5)]

    def forward(self, x):
        for i in range(len(self.conv)):
            x = torch.nn.ReLU()(self.conv[i](x))
        return x
torch.jit.script(MyCell()) instead gives:
RuntimeError:
Module 'MyCell' has no attribute 'conv' (This attribute exists on the Python module, but we failed to convert Python type: 'list' to a TorchScript type.):
File "***", line ***
    def forward(self, x):
        for i in range(len(self.conv)):
                           ~~~~~~~~~ <--- HERE
            x = torch.nn.ReLU()(self.conv[i](x))
        return x
So how can I export this module? Background: I am exporting Mixed-scale Dense Networks (source) to TorchScript. While nn.Sequential might work for this simplified case, in practice each iteration has to convolve with all of the previous convolution outputs, so this is more than just chaining layers.
Posted 2020-03-05 01:49:12
You can use nn.ModuleList() as shown below.
Also note that you currently cannot subscript an nn.ModuleList, possibly due to the bug reported in issue #16123, but iterating over it, as done below, works.
class MyCell(nn.Module):
    def __init__(self):
        super(MyCell, self).__init__()
        self.conv = nn.ModuleList([torch.nn.Conv1d(1, 1, 2*i+3) for i in range(5)])
        self.relu = nn.ReLU()

    def forward(self, x):
        for mod in self.conv:
            x = self.relu(mod(x))
        return x
>>> torch.jit.script(MyCell())
RecursiveScriptModule(
  original_name=MyCell
  (conv): RecursiveScriptModule(
    original_name=ModuleList
    (0): RecursiveScriptModule(original_name=Conv1d)
    (1): RecursiveScriptModule(original_name=Conv1d)
    (2): RecursiveScriptModule(original_name=Conv1d)
    (3): RecursiveScriptModule(original_name=Conv1d)
    (4): RecursiveScriptModule(original_name=Conv1d)
  )
  (relu): RecursiveScriptModule(original_name=ReLU)
)

Posted 2020-03-05 16:32:20
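As a quick sanity check that the scripted ModuleList version actually runs (the input length of 40 is arbitrary, my choice for illustration): the five kernels of sizes 3, 5, 7, 9, 11 are applied with no padding, so the sequence length shrinks by 2+4+6+8+10 = 30 samples in total.

```python
import torch
import torch.nn as nn

class MyCell(nn.Module):
    def __init__(self):
        super(MyCell, self).__init__()
        # ModuleList registers the submodules, so TorchScript can convert them
        self.conv = nn.ModuleList([nn.Conv1d(1, 1, 2*i+3) for i in range(5)])
        self.relu = nn.ReLU()

    def forward(self, x):
        # iterating (rather than subscripting) the ModuleList is scriptable
        for mod in self.conv:
            x = self.relu(mod(x))
        return x

scripted = torch.jit.script(MyCell())
y = scripted(torch.randn(1, 1, 40))
# each Conv1d with kernel size k and no padding removes k-1 samples:
# 40 - (2+4+6+8+10) = 10
assert y.shape == (1, 1, 10)
```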
As an alternative to kharshit's suggestion (https://stackoverflow.com/users/6210807/kharshit), you can define the network in a functional way:
class MyCell(torch.nn.Module):
    def __init__(self):
        super(MyCell, self).__init__()
        self.w = []
        for i in range(5):
            self.w.append(torch.Tensor(1, 1, 2*i+3))
            # init w[i] here, maybe make it "requires grad"

    def forward(self, x):
        for i in range(5):
            x = torch.nn.functional.conv1d(x, self.w[i])
            x = torch.nn.functional.relu(x)
        return x

https://stackoverflow.com/questions/60530703