KL divergence, also known as relative entropy, is computed as:

KL(P ‖ Q) = Σ_x P(x) · log(P(x) / Q(x))
import torch
import torch.nn as nn
import torch.nn.functional as F

if __name__ == '__main__':
    x_o = torch.tensor([[1., 2.], [3., 4.]])
    y_o = torch.tensor([[0.1, 0.2], [0.3, 0.4]])
    # nn.KLDivLoss expects the input as log-probabilities and the
    # target as probabilities, so use log_softmax for the input.
    x = F.log_softmax(x_o, dim=1)
    y = F.softmax(y_o, dim=1)
    criterion = nn.KLDivLoss(reduction='batchmean')
    klloss = criterion(x, y)
    print('klloss', klloss)
    # Equivalent functional form; reduction='sum' sums over all
    # elements instead of averaging per batch, so the value differs.
    kl = F.kl_div(x_o.softmax(dim=-1).log(), y_o.softmax(dim=-1), reduction='sum')
    print('kl', kl)
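To make the argument order concrete, here is a minimal check (not from the original post): `F.kl_div` takes the log-probabilities of the approximating distribution first and the target probabilities second, computing Σ target · (log target − log_input). We can verify this against the formula by hand:

```python
import torch
import torch.nn.functional as F

# Target distribution P and approximating distribution Q,
# built from the same example tensors via softmax.
p = torch.tensor([[0.1, 0.2], [0.3, 0.4]]).softmax(dim=-1)  # target
q = torch.tensor([[1.0, 2.0], [3.0, 4.0]]).softmax(dim=-1)  # approximation

# Built-in: note the log-probs of Q go first, probabilities of P second.
kl_builtin = F.kl_div(q.log(), p, reduction='sum')

# Manual: KL(P || Q) = sum P * (log P - log Q)
kl_manual = (p * (p.log() - q.log())).sum()

assert torch.allclose(kl_builtin, kl_manual)
```

The two values agree, confirming that `F.kl_div(q.log(), p)` computes KL(P ‖ Q), i.e. the divergence of the target from the input, not the other way around.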