<p>Here is a simple activation example. It uses one of PyTorch's built-in activation functions internally, but it works and can be extended to a fully custom one.</p>
<pre><code>import torch as pt
import torch.nn as nn
from torch.nn.modules import Module
# custom activation
class Act(Module):
def forward(self, z):
if(do_ratio > 0):
return nn.functional.dropout(pt.tanh(z), do_ratio)
else:
return pt.tanh(z)
act_fn = Act()
model = pt.nn.Sequential(
pt.nn.Linear(features, n_layer0, bias=enable_bias),
act_fn,
pt.nn.Linear(n_layer0, n_layer1, bias=enable_bias),
act_fn,
pt.nn.Linear(n_layer1, n_layer2, bias=enable_bias)
)
</code></pre>
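<p>One caveat: calling <code>nn.functional.dropout</code> directly like this leaves dropout active even after <code>model.eval()</code>, because the functional form defaults to <code>training=True</code>. A variant that respects the module's train/eval state wraps <code>nn.Dropout</code> instead (a sketch; the class name and sizes here are illustrative, not from the original answer):</p>

```python
import torch as pt
import torch.nn as nn

# tanh + dropout that follows the module's train/eval state
class ActEval(nn.Module):
    def __init__(self, do_ratio=0.0):
        super().__init__()
        self.drop = nn.Dropout(do_ratio)  # automatically disabled in eval mode

    def forward(self, z):
        return self.drop(pt.tanh(z))

act = ActEval(0.5)
act.eval()                # puts the submodule nn.Dropout into eval mode too
x = pt.randn(3, 4)
out = act(x)
# in eval mode dropout is the identity, so the output is plain tanh
assert pt.allclose(out, pt.tanh(x))
```

<p>With this version, switching between training and inference behaves the same way as for the built-in layers in <code>nn.Sequential</code>.</p>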