
How does get_all_param_values() read a lasagne.layer?

Stack Overflow user
Asked on 2016-02-09 08:54:43
2 answers · 1.5K views · Score: 2

I am running Lasagne and Theano to create my convolutional neural network. It is currently composed of the following:

l_shape = lasagne.layers.ReshapeLayer(l_in, (-1, 3,130, 130))
l_conv1 = lasagne.layers.Conv2DLayer(l_shape, num_filters=32, filter_size=3, pad=1)
l_conv1_1 = lasagne.layers.Conv2DLayer(l_conv1, num_filters=32, filter_size=3, pad=1)
l_pool1 = lasagne.layers.MaxPool2DLayer(l_conv1_1, 2)
l_conv2 = lasagne.layers.Conv2DLayer(l_pool1, num_filters=64, filter_size=3, pad=1)
l_conv2_2 = lasagne.layers.Conv2DLayer(l_conv2, num_filters=64, filter_size=3, pad=1)
l_pool2 = lasagne.layers.MaxPool2DLayer(l_conv2_2, 2)
l_conv3 = lasagne.layers.Conv2DLayer(l_pool2, num_filters=64, filter_size=3, pad=1)
l_conv3_2 = lasagne.layers.Conv2DLayer(l_conv3, num_filters=64, filter_size=3, pad=1)
l_pool3 = lasagne.layers.MaxPool2DLayer(l_conv3_2, 2)
l_conv4 = lasagne.layers.Conv2DLayer(l_pool3, num_filters=64, filter_size=3, pad=1)
l_conv4_2 = lasagne.layers.Conv2DLayer(l_conv4, num_filters=64, filter_size=3, pad=1)
l_pool4 = lasagne.layers.MaxPool2DLayer(l_conv4_2, 2)
l_conv5 = lasagne.layers.Conv2DLayer(l_pool4, num_filters=64, filter_size=3, pad=1)
l_conv5_2 = lasagne.layers.Conv2DLayer(l_conv5, num_filters=64, filter_size=3, pad=1)
l_pool5 = lasagne.layers.MaxPool2DLayer(l_conv5_2, 2)
l_out = lasagne.layers.DenseLayer(l_pool5, num_units=2, nonlinearity=lasagne.nonlinearities.softmax)

My last layer is a DenseLayer that uses softmax to produce my classification. My ultimate goal is to retrieve the probabilities rather than the classification (0 or 1).

When I call get_all_param_values(), it gives me an extensive array. I only want the weights and biases of the last dense layer. How do you do that? I have tried l_out.W, l_out.b and get_values().

Thanks in advance!
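(Editor's note: for context, lasagne.layers.get_all_param_values(l_out) returns a flat list of numpy arrays ordered from the input layer to the output layer, so the final DenseLayer's weights and biases are the last two entries. A minimal numpy stand-in illustrating that slicing, with placeholder arrays rather than real trained weights:)

```python
import numpy as np

# Stand-in for lasagne.layers.get_all_param_values(l_out):
# a flat list of numpy arrays, ordered [W1, b1, W2, b2, ..., W_out, b_out].
all_values = [
    np.zeros((32, 3, 3, 3)), np.zeros(32),  # l_conv1
    # ... intermediate layers elided ...
    np.zeros((1024, 2)), np.zeros(2),       # l_out: W, b
]

W_out, b_out = all_values[-2], all_values[-1]  # last dense layer
print(W_out.shape, b_out.shape)
```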


2 Answers

Stack Overflow user

Answered on 2016-02-09 18:25:06

You can get the parameters of an individual layer using get_params. This is explained in the documentation.

Score: 1

Stack Overflow user

Answered on 2016-02-13 22:14:32

I modified your code, because what you pasted refers to l_in but does not include it. I defined the following network:

l_shape = lasagne.layers.InputLayer(shape = (None, 3, 130, 130))
l_conv1 = lasagne.layers.Conv2DLayer(l_shape, num_filters=32, filter_size=3, pad=1)
l_conv1_1 = lasagne.layers.Conv2DLayer(l_conv1, num_filters=32, filter_size=3, pad=1)
l_pool1 = lasagne.layers.MaxPool2DLayer(l_conv1_1, 2)
l_conv2 = lasagne.layers.Conv2DLayer(l_pool1, num_filters=64, filter_size=3, pad=1)
l_conv2_2 = lasagne.layers.Conv2DLayer(l_conv2, num_filters=64, filter_size=3, pad=1)
l_pool2 = lasagne.layers.MaxPool2DLayer(l_conv2_2, 2)
l_conv3 = lasagne.layers.Conv2DLayer(l_pool2, num_filters=64, filter_size=3, pad=1)
l_conv3_2 = lasagne.layers.Conv2DLayer(l_conv3, num_filters=64, filter_size=3, pad=1)
l_pool3 = lasagne.layers.MaxPool2DLayer(l_conv3_2, 2)
l_conv4 = lasagne.layers.Conv2DLayer(l_pool3, num_filters=64, filter_size=3, pad=1)
l_conv4_2 = lasagne.layers.Conv2DLayer(l_conv4, num_filters=64, filter_size=3, pad=1)
l_pool4 = lasagne.layers.MaxPool2DLayer(l_conv4_2, 2)
l_conv5 = lasagne.layers.Conv2DLayer(l_pool4, num_filters=64, filter_size=3, pad=1)
l_conv5_2 = lasagne.layers.Conv2DLayer(l_conv5, num_filters=64, filter_size=3, pad=1)
l_pool5 = lasagne.layers.MaxPool2DLayer(l_conv5_2, 2)
l_out = lasagne.layers.DenseLayer(l_pool5, num_units=2, nonlinearity=lasagne.nonlinearities.softmax)
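(Editor's note: as a sanity check on this architecture, with filter_size=3 and pad=1 each convolution preserves the spatial size, and each MaxPool2DLayer(..., 2) halves it with flooring, so the flattened feature count entering l_out can be worked out by hand. A quick sketch of that arithmetic, not Lasagne itself:)

```python
# Spatial size through the five pooling stages; convs with pad=1 keep size.
size = 130
for _ in range(5):
    size //= 2  # 130 -> 65 -> 32 -> 16 -> 8 -> 4

# The last conv block has 64 filters, so l_out sees 64 * 4 * 4 features.
features = 64 * size * size
print(size, features)
```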

To implement Daniel Renshaw's answer:

params = l_out.get_params()
W = params[0].get_value()

When you print params, you will see all the parameters of l_out:

[W, b] 

So each element of params, i.e. params[0] and params[1], is a Theano shared variable, and you can obtain the numerical values via params[i].get_value().
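(Editor's note: since the original goal was probabilities rather than hard labels, once W and b have been extracted this way, the softmax output of the final dense layer can be reproduced by hand. A numpy sketch with made-up values standing in for params[0].get_value() and params[1].get_value(); in practice, computing probabilities through the whole network is what lasagne.layers.get_output(l_out) is for:)

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(1024, 2))  # stand-in for params[0].get_value()
b = np.zeros(2)                 # stand-in for params[1].get_value()
x = rng.normal(size=1024)       # flattened features entering l_out

# The DenseLayer with a softmax nonlinearity computes softmax(x @ W + b).
z = x @ W + b
p = np.exp(z - z.max())         # numerically stable softmax
p /= p.sum()
print(p)                        # two class probabilities summing to 1
```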

Score: 1
Original page content provided by Stack Overflow.
Original link:

https://stackoverflow.com/questions/35282146
