torch.nn.ReLU

Applies the rectified linear unit (ReLU) activation function element-wise.

torch.nn.ReLU(inplace=False)

inplace: whether to modify the input in place. If set to True, the input tensor is overwritten directly; if set to False (the default), the input is left unchanged and the result is returned as a new tensor.
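Because inplace=True avoids allocating a second output tensor, it is sometimes used between layers to save a little memory. A minimal sketch of that usage (the model and layer sizes here are arbitrary, chosen only for illustration):

import torch
from torch import nn

# hypothetical two-layer model; ReLU overwrites the Linear output in place
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(inplace=True),
    nn.Linear(8, 2),
)
x = torch.randn(3, 4)
print(model(x).shape)  # torch.Size([3, 2])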

\(\operatorname{ReLU}(x)=(x)^{+}=\max (0, x)\)
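The definition above can be cross-checked against equivalent element-wise operations; a quick sketch, assuming only standard torch functions (the tensor values are arbitrary):

import torch
import torch.nn.functional as F

x = torch.randn(5)
# ReLU is exactly max(0, x), applied element-wise
print(torch.equal(F.relu(x), torch.clamp(x, min=0)))                  # True
print(torch.equal(F.relu(x), torch.maximum(x, torch.zeros_like(x))))  # True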

import torch
from torch import nn

a = torch.randn(10)
relu = nn.ReLU()  # inplace=False by default: the input is not modified
b = relu(a)
print("before relu")
print(a)
print("after relu")
print(b)
before relu
tensor([-1.1869, -0.2303, -1.5306,  0.1550, -0.3371,  0.3551, -0.0140,  0.1734,
         1.0465, -1.4751])
after relu
tensor([0.0000, 0.0000, 0.0000, 0.1550, 0.0000, 0.3551, 0.0000, 0.1734, 1.0465,
        0.0000])
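As the output shows, with the default inplace=False the module leaves a untouched and writes the result into a newly allocated tensor. One way to confirm the two tensors do not share storage (a small sketch, using the same setup as above):

import torch
from torch import nn

a = torch.randn(10)
b = nn.ReLU()(a)                     # inplace=False by default
print(a.data_ptr() == b.data_ptr())  # False: b lives in its own storage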
c = torch.randn(10)
print("c before relu")
print(c)
relu = nn.ReLU(inplace=True)  # inplace=True: the input tensor itself is modified
d = relu(c)
print("c after relu")
print(c)
print("d after relu")
print(d)
c before relu
tensor([ 0.5177,  0.3214, -2.4516,  0.0706,  0.1252, -2.1519, -0.4721,  0.4154,
         0.6978,  0.0918])
c after relu
tensor([0.5177, 0.3214, 0.0000, 0.0706, 0.1252, 0.0000, 0.0000, 0.4154, 0.6978,
        0.0918])
d after relu
tensor([0.5177, 0.3214, 0.0000, 0.0706, 0.1252, 0.0000, 0.0000, 0.4154, 0.6978,
        0.0918])
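With inplace=True, c and d print identically because the module returns the very tensor it was given, after overwriting it. A minimal sketch of checking that aliasing:

import torch
from torch import nn

c = torch.randn(10)
d = nn.ReLU(inplace=True)(c)
print(d is c)                        # True: the same Python object comes back
print(d.data_ptr() == c.data_ptr())  # True: same underlying storage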