PyTorch Study Notes (3): Tensor Transformations

view/reshape
a = torch.rand(4,1,28,28)
print(a.shape)                     # torch.Size([4, 1, 28, 28])
print(a.view(4, 28*28).shape)      # torch.Size([4, 784])
print(a.reshape(4, 28*28).shape)   # torch.Size([4, 784])
print(a.view(4, -1).shape)         # torch.Size([4, 784])
print(a.reshape(4, -1).shape)      # torch.Size([4, 784])
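The two calls look interchangeable above, but they differ on non-contiguous tensors: `view` only re-interprets the existing storage and raises an error when the memory layout does not allow it, while `reshape` silently falls back to copying the data. A minimal sketch of that difference:

```python
import torch

a = torch.rand(4, 1, 28, 28)

# On a contiguous tensor, both calls return views sharing a's storage.
v = a.view(4, -1)
r = a.reshape(4, -1)
assert v.data_ptr() == a.data_ptr()
assert r.data_ptr() == a.data_ptr()

# Transposing produces a non-contiguous view: now view() fails,
# while reshape() succeeds by copying.
t = a.squeeze(1).transpose(1, 2)
try:
    t.view(4, -1)
except RuntimeError:
    print("view() fails on a non-contiguous tensor")
print(t.reshape(4, -1).shape)   # torch.Size([4, 784])
```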
squeeze/unsqueeze
a = torch.rand(4,1,28,28)
print(a.shape)                  # torch.Size([4, 1, 28, 28])
print(a.unsqueeze(0).shape)     # torch.Size([1, 4, 1, 28, 28])
print(a.unsqueeze(4).shape)     # torch.Size([4, 1, 28, 28, 1])
print(a.unsqueeze(-1).shape)    # torch.Size([4, 1, 28, 28, 1])
print(a.squeeze().shape)        # torch.Size([4, 28, 28])
print(a.squeeze(1).shape)       # torch.Size([4, 28, 28])
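Note that `squeeze(dim)` only removes a dimension whose size is 1; on any other dimension it is a no-op rather than an error. A common use of `unsqueeze` is padding a tensor with size-1 dimensions so it broadcasts, as in this sketch (the bias tensor here is an illustrative assumption, not from the original notes):

```python
import torch

a = torch.rand(4, 1, 28, 28)

# squeeze(dim) silently does nothing when that dim is not of size 1.
print(a.squeeze(0).shape)   # torch.Size([4, 1, 28, 28])  dim 0 is 4: unchanged
print(a.squeeze(1).shape)   # torch.Size([4, 28, 28])     dim 1 is 1: removed

# unsqueeze for broadcasting: turn a per-channel value (C=1 here)
# into shape [1, C, 1, 1] so it adds across batch, height, and width.
bias = torch.rand(1)
bias4d = bias.unsqueeze(0).unsqueeze(-1).unsqueeze(-1)  # [1, 1, 1, 1]
print((a + bias4d).shape)   # torch.Size([4, 1, 28, 28])
```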
transpose/permute
a = torch.rand(10,3,32,32)
print(a.shape)                    # torch.Size([10, 3, 32, 32])
print(a.transpose(1,3).shape)     # torch.Size([10, 32, 32, 3])
print(a.permute(0,3,2,1).shape)   # torch.Size([10, 32, 32, 3])
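`transpose` swaps exactly two dimensions, while `permute` reorders all of them at once; in the example above the two calls happen to produce the same result. Both return views over the same storage without moving data, which means the result is no longer contiguous. A short sketch of those properties:

```python
import torch

a = torch.rand(10, 3, 32, 32)

# transpose(1, 3) is the same reordering as permute(0, 3, 2, 1).
t = a.transpose(1, 3)
p = a.permute(0, 3, 2, 1)
assert torch.equal(t, p)

# Both are views: same storage, but no longer contiguous.
assert t.data_ptr() == a.data_ptr()
assert not t.is_contiguous()

# A non-contiguous view must be made contiguous before view() works.
flat = t.contiguous().view(10, -1)
print(flat.shape)   # torch.Size([10, 3072])
```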
expand/repeat
b = torch.randint(1, 10, (1, 3))
print(b)
print(b.shape)
print(b.storage())              # underlying storage layout
print(b.storage().data_ptr())   # storage address
# Output
tensor([[7, 8, 9]])
torch.Size([1, 3])
 7
 8
 9
[torch.LongStorage of size 3]
2530665948608

# expand: broadcasting
b_1 = b.expand(3, 3)
print(b_1)
print(b_1.shape)
print(b_1.storage())
print(b_1.storage().data_ptr())
# Output
tensor([[7, 8, 9],
        [7, 8, 9],
        [7, 8, 9]])
torch.Size([3, 3])
 7
 8
 9
[torch.LongStorage of size 3]
2530665948608

# repeat: memory copied
b_2 = b.repeat(3, 1)
print(b_2)
print(b_2.shape)
print(b_2.storage())
print(b_2.storage().data_ptr())
# Output
tensor([[7, 8, 9],
        [7, 8, 9],
        [7, 8, 9]])
torch.Size([3, 3])
 7
 8
 9
 7
 8
 9
 7
 8
 9
[torch.LongStorage of size 9]
2530678187136
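The `data_ptr()` values above show the key difference: `expand` returns a stride-0 view over the original 3 elements, while `repeat` allocates 9 elements of new memory. A consequence worth remembering is that writing to a repeated tensor leaves the original untouched, whereas an expanded view still aliases it, and `expand` can only enlarge size-1 dimensions. A minimal sketch:

```python
import torch

b = torch.randint(1, 10, (1, 3))

# expand: a view with stride 0 along the expanded dim; no new memory.
b_1 = b.expand(3, 3)
assert b_1.data_ptr() == b.data_ptr()

# repeat: real copies, independent of b.
b_2 = b.repeat(3, 1)
assert b_2.data_ptr() != b.data_ptr()
b_2[0, 0] = 0
print(b)   # unchanged, because b_2 owns its own storage

# expand can only grow dimensions of size 1; expanding the size-3
# dim fails, so repeat (or unsqueeze then expand) is needed there.
try:
    b.expand(1, 9)
except RuntimeError:
    print("expand cannot enlarge a non-singleton dimension")
```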
