I'm not a Python person, and I'm trying to understand how to convert some Python code to C#:
# model parameters
with default_options(bias=False): # all the projections have no bias
attn_proj_enc = Stabilizer(enable_self_stabilization=enable_self_stabilization) >> Dense(attention_dim, init=init, input_rank=1) # projects input hidden state, keeping span axes intact
attn_proj_dec = Stabilizer(enable_self_stabilization=enable_self_stabilization) >> Dense(attention_dim, init=init, input_rank=1) # projects decoder hidden state, but keeping span and beam-search axes intact
attn_proj_tanh = Stabilizer(enable_self_stabilization=enable_self_stabilization) >> Dense(1 , init=init, input_rank=1) # projects tanh output, keeping span and beam-search axes intact
attn_final_stab = Stabilizer(enable_self_stabilization=enable_self_stabilization)
The code above is a snippet from: CNTK/bindings/python/cntk/layers/models/attention.py
My question is: what exactly is the >> operator doing here?
Stabilizer is a layer, just like Dense, so what is happening to these layers? Is some fancy bitwise operation going on here?
Posted on 2020-01-21 09:41:39
I had never seen this before, but model instantiation seems to be the answer:
Example:
>>> model = Dense(500) >> Activation(C.relu) >> Dense(10)
>>> # is the same as
>>> model = Dense(500) >> C.relu >> Dense(10)
>>> # and also the same as
>>> model = Dense(500, activation=C.relu) >> Dense(10)
From: python/cntk/layers/layers.py#L1361
And also:
Another example is a GRU layer with projection, which could be realized as
``Recurrence(GRU(500) >> Dense(200))``,
where the projection is applied to the hidden state as fed back to the next
step.
``F>>G`` is a short-hand for ``Sequential([F, G])``.
From: python/cntk/layers/sequence.py#L331
In other words, it is sequential composition of models: F >> G builds a model that feeds the output of F into G.
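The mechanism behind this is Python operator overloading: a class can define __rshift__ so that >> means "compose" instead of "bit-shift". Here is a minimal, self-contained sketch of the idea (this is illustrative plain Python, not CNTK's actual implementation; the Layer and Sequential names here are stand-ins for CNTK's real classes):

```python
class Layer:
    """A callable layer that supports >> chaining."""
    def __init__(self, fn):
        self.fn = fn

    def __call__(self, x):
        return self.fn(x)

    def __rshift__(self, other):
        # F >> G is short-hand for Sequential([F, G])
        return Sequential([self, other])


class Sequential(Layer):
    """Applies a list of layers in order: Sequential([F, G])(x) == G(F(x))."""
    def __init__(self, layers):
        self.layers = layers

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

    def __rshift__(self, other):
        # Keep the chain flat: (F >> G) >> H -> Sequential([F, G, H])
        return Sequential(self.layers + [other])


# Toy "layers": double the input, then add one
double = Layer(lambda x: 2 * x)
add_one = Layer(lambda x: x + 1)

model = double >> add_one   # same as Sequential([double, add_one])
print(model(3))             # add_one(double(3)) = 7
```

So in the attention code, Stabilizer(...) >> Dense(...) builds a single composite layer that first applies the stabilizer and then the dense projection, with no bitwise arithmetic involved.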
https://stackoverflow.com/questions/59832654