torch.nn.ReLU#
Applies the rectified linear unit (ReLU) function element-wise to the input.
torch.nn.ReLU(inplace=False)
inplace: whether to modify the input tensor in place. If set to True, the input is overwritten with the result; if set to False (the default), the input is left unchanged and a new tensor is returned.
\(\operatorname{ReLU}(x) = (x)^{+} = \max(0, x)\)
import torch
from torch import nn

a = torch.randn(10)  # random input containing both positive and negative values
relu = nn.ReLU()     # default inplace=False: the input tensor is not modified
b = relu(a)
print("before relu")
print(a)
print("after relu")
print(b)
before relu
tensor([-0.1479, 0.5758, -0.9307, -0.2921, -0.3493, -0.0309, -0.8106, 0.1998,
0.2892, 1.2551])
after relu
tensor([0.0000, 0.5758, 0.0000, 0.0000, 0.0000, 0.0000, 0.0000, 0.1998, 0.2892,
1.2551])
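The same activation is also available without constructing a module. Below is a minimal sketch (assuming only the public torch API) showing that the module form, torch.nn.functional.relu, and torch.relu all agree on a plain forward pass:

import torch
from torch import nn
import torch.nn.functional as F

x = torch.randn(10)
y1 = nn.ReLU()(x)   # module form
y2 = F.relu(x)      # functional form
y3 = torch.relu(x)  # tensor-level op
print(torch.equal(y1, y2) and torch.equal(y2, y3))  # True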
c = torch.randn(10)
print("c before relu")
print(c)
relu = nn.ReLU(inplace=True)  # inplace=True: the result overwrites c itself
d = relu(c)
print("c after relu")
print(c)
print("d after relu")
print(d)
c before relu
tensor([ 0.9313, -0.0663, 0.7632, 0.5169, -1.0624, 0.9546, -0.4565, 0.4186,
-0.9414, 1.6503])
c after relu
tensor([0.9313, 0.0000, 0.7632, 0.5169, 0.0000, 0.9546, 0.0000, 0.4186, 0.0000,
1.6503])
d after relu
tensor([0.9313, 0.0000, 0.7632, 0.5169, 0.0000, 0.9546, 0.0000, 0.4186, 0.0000,
1.6503])
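As the output shows, c and d hold the same values: with inplace=True the activation writes its result into the input's storage and returns that same tensor, saving one allocation. Below is a small illustrative check (not from the original text) that makes the storage sharing visible; note that in-place operations can interfere with autograd when the overwritten values are needed for the backward pass, in which case PyTorch raises a runtime error:

import torch
from torch import nn

relu = nn.ReLU(inplace=True)
x = torch.randn(5)
y = relu(x)
# y shares storage with x: no new tensor was allocated by the activation.
print(y.data_ptr() == x.data_ptr())  # True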