In the traditional hidden Markov model (HMM), the Viterbi algorithm takes a start-probability matrix (see the Viterbi algorithm Wikipedia page), but TensorFlow's viterbi_decode only takes a transition-probability matrix and an emission-probability matrix as parameters. How should this be understood?
def viterbi_decode(score, transition_params):
  """Decode the highest scoring sequence of tags outside of TensorFlow.

  This should only be used at test time.

  Args:
    score: A [seq_len, num_tags] matrix of unary potentials.
    transition_params: A [num_tags, num_tags] matrix of binary potentials.

  Returns:
    viterbi: A [seq_len] list of integers containing the highest scoring tag
        indices.
    viterbi_score: A float containing the score for the Viterbi sequence.
  """

Posted on 2018-07-12 19:51:36
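The docstring above already hints at the answer: `viterbi_decode` initializes its trellis with the first row of `score`, so the first timestep's unary potentials play the role that an HMM's start probabilities would. Below is a NumPy sketch of that decoding loop (not the TensorFlow source itself, just an illustration of the same dynamic program):

```python
import numpy as np

def viterbi_decode_np(score, transition_params):
    """NumPy sketch of Viterbi decoding as done by tf.contrib.crf.viterbi_decode.

    score: A [seq_len, num_tags] matrix of unary potentials.
    transition_params: A [num_tags, num_tags] matrix of binary potentials.
    """
    trellis = np.zeros_like(score)
    backpointers = np.zeros_like(score, dtype=np.int32)
    # No separate start-probability matrix: the unary potentials of the
    # first timestep serve as the initial scores of the trellis.
    trellis[0] = score[0]
    for t in range(1, score.shape[0]):
        # Score of reaching each tag at time t from every tag at time t-1.
        v = np.expand_dims(trellis[t - 1], 1) + transition_params
        trellis[t] = score[t] + np.max(v, 0)
        backpointers[t] = np.argmax(v, 0)
    # Follow the backpointers from the best final tag.
    viterbi = [int(np.argmax(trellis[-1]))]
    for bp in reversed(backpointers[1:]):
        viterbi.append(int(bp[viterbi[-1]]))
    viterbi.reverse()
    return viterbi, float(np.max(trellis[-1]))
```

In other words, whatever prior you would encode as HMM start probabilities can be folded into `score[0]` before decoding.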
Posted on 2018-08-12 02:32:52
I have created a complete, detailed tutorial on the Viterbi algorithm in TensorFlow; you can take a look at it here:

Suppose your data looks like this:
# logits: A [batch_size, max_seq_len, num_tags] tensor of unary potentials to use as input to the CRF layer.
# labels_a: A [batch_size, max_seq_len] matrix of tag indices for which we compute the log-likelihood.
# sequence_len: A [batch_size] vector of true sequence lengths.

Then:
log_likelihood, transition_params = tf.contrib.crf.crf_log_likelihood(
    logits, labels_a, sequence_len)

# Returns of the crf_log_likelihood function:
# log_likelihood: A scalar containing the log-likelihood of the given sequence of tag indices.
# transition_params: A [num_tags, num_tags] transition matrix.
# This is either provided by the caller or created in this function.

Now we can compute the Viterbi score:
# score: A [seq_len, num_tags] matrix of unary potentials.
# transition_params: A [num_tags, num_tags] matrix of binary potentials.

https://stackoverflow.com/questions/51301061
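As a self-check on what `crf_log_likelihood` returns: the log-likelihood of a tag sequence is its path score (unary potentials along the path plus transitions between consecutive tags, again with no separate start term) minus the log partition function computed by the forward algorithm. A NumPy sketch, with helper names chosen here for illustration (they are not public TensorFlow APIs):

```python
import numpy as np

def crf_sequence_score(score, tags, transition_params):
    # Unnormalized score of one tag sequence: unary potentials along the
    # path plus the transition potential between each pair of consecutive tags.
    unary = sum(score[t, tag] for t, tag in enumerate(tags))
    binary = sum(transition_params[tags[t], tags[t + 1]]
                 for t in range(len(tags) - 1))
    return unary + binary

def crf_log_norm(score, transition_params):
    # Forward algorithm: log of the partition function Z, summing the
    # exponentiated scores of all possible tag sequences.
    alpha = score[0]
    for t in range(1, score.shape[0]):
        alpha = score[t] + np.log(
            np.sum(np.exp(alpha[:, None] + transition_params), axis=0))
    return float(np.log(np.sum(np.exp(alpha))))
```

The log-likelihood of a given sequence is then `crf_sequence_score(...) - crf_log_norm(...)`; exponentiating it over all possible tag sequences sums to 1.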