torch.nn.ReLU


Applies the rectified linear unit (ReLU) activation function element-wise.

torch.nn.ReLU(inplace=False)

inplace: whether to modify the input tensor in place. If True, the input is overwritten directly, which saves memory; if False (the default), the input is left unchanged and a new tensor is returned. Note that an in-place activation can raise an autograd error when the original input values are still needed for gradient computation.

\(\operatorname{ReLU}(x)=(x)^{+}=\max (0, x)\)

import torch
from torch import nn

a = torch.randn(10)
relu = nn.ReLU()
b = relu(a)
print("before ReLU")
print(a)
print("after ReLU")
print(b)
before ReLU
tensor([-0.4821,  0.8092,  0.3960, -1.3074, -0.8350, -1.0724,  0.0626, -0.6775,
        -0.9568, -0.4655])
after ReLU
tensor([0.0000, 0.8092, 0.3960, 0.0000, 0.0000, 0.0000, 0.0626, 0.0000, 0.0000,
        0.0000])
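The output above can be checked directly against the definition \(\max(0, x)\): the module should produce exactly the same result as an element-wise maximum with zero. A minimal sketch of that check:

```python
import torch
from torch import nn

x = torch.randn(100)
out = nn.ReLU()(x)

# ReLU(x) = max(0, x): compare against an explicit element-wise maximum
expected = torch.maximum(x, torch.zeros_like(x))
print(torch.equal(out, expected))  # True
```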
c = torch.randn(10)
print("c before ReLU")
print(c)
relu = nn.ReLU(inplace=True)
d = relu(c)
print("c after ReLU")
print(c)
print("d after ReLU")
print(d)
c before ReLU
tensor([ 0.0543,  0.8413,  0.9151,  1.0090, -0.8394,  0.7382, -0.0474,  0.2219,
         1.6900,  0.1074])
c after ReLU
tensor([0.0543, 0.8413, 0.9151, 1.0090, 0.0000, 0.7382, 0.0000, 0.2219, 1.6900,
        0.1074])
d after ReLU
tensor([0.0543, 0.8413, 0.9151, 1.0090, 0.0000, 0.7382, 0.0000, 0.2219, 1.6900,
        0.1074])
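In practice, nn.ReLU is rarely called standalone as above; it is typically inserted as a layer between linear or convolutional layers of a network. A minimal sketch (the layer sizes here are arbitrary illustration values):

```python
import torch
from torch import nn

# A tiny MLP: nn.ReLU sits between the two linear layers as a module,
# so it is applied automatically during the forward pass.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 2),
)

x = torch.randn(3, 4)   # batch of 3 samples, 4 features each
y = model(x)
print(y.shape)          # torch.Size([3, 2])
```

Because nn.ReLU has no learnable parameters, the same module instance can safely be reused at several places in a network.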