In the logistic regression example from the Theano tutorial, the negative_log_likelihood function is defined as follows:
def negative_log_likelihood(self, y):
    """Return the mean of the negative log-likelihood of the prediction
    of this model under a given target distribution.

    .. math::

        \frac{1}{|\mathcal{D}|} \mathcal{L} (\theta=\{W,b\}, \mathcal{D}) =
        \frac{1}{|\mathcal{D}|} \sum_{i=0}^{|\mathcal{D}|}
            \log(P(Y=y^{(i)}|x^{(i)}, W,b)) \\
        \ell (\theta=\{W,b\}, \mathcal{D})

    :type y: theano.tensor.TensorType
    :param y: corresponds to a vector that gives for each example the
              correct label

    Note: we use the mean instead of the sum so that
          the learning rate is less dependent on the batch size
    """
    # y.shape[0] is (symbolically) the number of rows in y, i.e.,
    # the number of examples (call it n) in the minibatch.
    # T.arange(y.shape[0]) is a symbolic vector which will contain
    # [0, 1, 2, ..., n-1]. T.log(self.p_y_given_x) is a matrix of
    # log-probabilities (call it LP) with one row per example and
    # one column per class. LP[T.arange(y.shape[0]), y] is a vector
    # v containing [LP[0,y[0]], LP[1,y[1]], ..., LP[n-1,y[n-1]]],
    # and T.mean(LP[T.arange(y.shape[0]), y]) is the mean (across
    # minibatch examples) of the elements in v, i.e., the mean
    # log-likelihood across the minibatch.
    return -T.mean(T.log(self.p_y_given_x)[T.arange(y.shape[0]), y])

Can someone explain the exact use of the square brackets in the last line of the code above? How should [T.arange(y.shape[0]), y] be interpreted?
Thanks!
Posted on 2013-11-29 11:56:41
Most of the information you need is already in the function's comments.

T.log(self.p_y_given_x) is a symbolic matrix of log-probabilities (when the expression is evaluated, it yields a NumPy array with one row per example and one column per class).

So [T.arange(y.shape[0]), y] selects one element from each row of that matrix. This is NumPy-style advanced (integer-array) indexing. See: http://docs.scipy.org/doc/numpy/reference/arrays.indexing.html
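To make the indexing concrete, here is a small sketch in plain NumPy (the matrix values are made up for illustration; Theano's symbolic indexing behaves the same way once evaluated):

```python
import numpy as np

# A hypothetical log-probability matrix LP: 3 examples, 4 classes.
LP = np.log(np.array([[0.10, 0.20, 0.30, 0.40],
                      [0.70, 0.10, 0.10, 0.10],
                      [0.25, 0.25, 0.25, 0.25]]))

# The correct class label for each example.
y = np.array([3, 0, 1])

# Advanced (integer-array) indexing: the two index arrays are paired
# element-wise, so this picks [LP[0, y[0]], LP[1, y[1]], LP[2, y[2]]].
v = LP[np.arange(y.shape[0]), y]

print(v)  # the log-probability of the correct class for each example
```

Each position i in the result comes from row i and column y[i], which is exactly the per-example log-likelihood the tutorial code averages.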
Posted on 2016-03-06 03:13:59
I was also confused by the matrix indexing here. T.arange(y.shape[0]) is a one-dimensional vector [0, 1, ..., n-1], where n (i.e., y.shape[0]) depends on the minibatch size you set. y is a vector of labels with the same length as T.arange(y.shape[0]). So, following the reference @William gave, the indexing means: for each row i of the T.log(self.p_y_given_x) matrix, we select the entry in column y[i] (where y[i] is the gold label of example i, used here as a column index).
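Putting the pieces together, the whole expression can be sketched as a NumPy function (a minimal stand-in for the Theano version; the probability matrix and labels below are made-up example data):

```python
import numpy as np

def negative_log_likelihood(p_y_given_x, y):
    """Mean over the minibatch of -log P(Y = y[i] | x[i]).

    p_y_given_x: (n, k) array of class probabilities, one row per example.
    y:           (n,) array of gold labels, used as column indices.
    """
    n = y.shape[0]
    # Row i, column y[i]: the log-probability assigned to the correct class.
    return -np.mean(np.log(p_y_given_x)[np.arange(n), y])

# Two examples, two classes; the model puts 0.9 and 0.8 on the gold labels.
p = np.array([[0.9, 0.1],
              [0.2, 0.8]])
y = np.array([0, 1])
print(negative_log_likelihood(p, y))  # -(log 0.9 + log 0.8) / 2
```

Because the mean (rather than the sum) is taken, the loss scale does not grow with the minibatch size, which is the point made in the docstring's note about the learning rate.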
https://stackoverflow.com/questions/20284663