Tensor to numpy
np.array(Tensor)
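A minimal sketch of this conversion (note that `np.array()` copies the data, while the tensor's own `.numpy()` method shares memory with it):

```python
import numpy as np
import torch

t = torch.arange(6, dtype=torch.float32).reshape(2, 3)
arr = np.array(t)   # copies the data into a new ndarray
arr2 = t.numpy()    # shares memory with the tensor (no copy)
```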
numpy to Tensor
torch.Tensor(numpy.ndarray)
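Note that `torch.Tensor()` copies the array and casts it to float32, while `torch.from_numpy()` shares memory with the array and keeps its dtype; a small sketch:

```python
import numpy as np
import torch

a = np.ones((2, 3), dtype=np.float64)
t1 = torch.Tensor(a)      # copies and casts to torch.float32
t2 = torch.from_numpy(a)  # shares memory, keeps dtype (float64)
```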
PIL.Image.Image to numpy
np.array(PIL.Image.Image)
numpy to PIL.Image.Image
Image.fromarray(numpy.ndarray)
First make sure the numpy.ndarray has dtype np.uint8, via numpy.astype(np.uint8), with pixel values in [0, 255].
For a grayscale image, numpy.shape must be (H, W) — no channel axis is allowed, so apply np.squeeze() if needed. For a color image, numpy.shape must be (H, W, 3).
Then call Image.fromarray(numpy.ndarray).
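A minimal sketch of these shape and dtype requirements (random data stands in for a real image here):

```python
import numpy as np
from PIL import Image

# Grayscale: drop the channel axis so the shape is (H, W)
gray = (np.random.rand(64, 64, 1) * 255).astype(np.uint8)
gray = np.squeeze(gray, axis=2)  # (64, 64, 1) -> (64, 64)
img_l = Image.fromarray(gray)    # mode 'L'

# Color: shape must be (H, W, 3)
rgb = (np.random.rand(64, 64, 3) * 255).astype(np.uint8)
img_rgb = Image.fromarray(rgb)   # mode 'RGB'
```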
PIL.Image.Image to Tensor
torchvision.transforms
from PIL import Image
import torchvision.transforms as transforms

img = Image.open('00381fa010_940422.tif').convert('L')
trans = transforms.Compose([transforms.ToTensor()])
a = trans(img)
Tensor to PIL.Image.Image
Convert to numpy first, then to PIL.Image.Image.
Grayscale image
import numpy as np
from PIL import Image
import torchvision.transforms as transforms

img = Image.open('00381fa010_940422.tif').convert('L')
trans = transforms.Compose([transforms.ToTensor()])
a = trans(img)
b = np.array(a)            # b.shape: (1, 64, 64)
maxi = b.max()
b = b * 255. / maxi
b = b.transpose(1, 2, 0).astype(np.uint8)
b = np.squeeze(b, axis=2)  # (64, 64, 1) -> (64, 64)
xx = Image.fromarray(b)
xx
Color image
import numpy as np
from PIL import Image
import torchvision.transforms as transforms

img2 = Image.open('00381fa010_940422.tif').convert('RGB')
trans = transforms.Compose([transforms.ToTensor()])
a = trans(img2)
a = np.array(a)
maxi = a.max()
a = a / maxi * 255
a = a.transpose(1, 2, 0).astype(np.uint8)
b = Image.fromarray(a)
b
python-opencv
import cv2
from PIL import Image

a = cv2.imread('00381fa010_940422.tif')     # a.shape: (64, 64, 3)
cv2.imwrite('asd.jpg', a)
Image.fromarray(a)
b = cv2.imread('00381fa010_940422.tif', 0)  # b.shape: (64, 64)
Image.fromarray(b)
cv2.imread() returns a numpy.ndarray: a grayscale read has shape (64, 64) and a color read (64, 64, 3), so the result can be passed directly to Image.fromarray(). Note, however, that cv2 stores color channels in BGR order, so convert with cv2.cvtColor(a, cv2.COLOR_BGR2RGB) first if correct colors matter.
When writing an image with cv2, a grayscale image may have shape (H, W) or (H, W, 1); a color image must be (H, W, 3).
To get a PIL.Image.Image from a numpy.ndarray, a grayscale array must have shape (H, W) and a color array (H, W, 3).
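One caveat worth demonstrating: cv2 stores channels in BGR order, so reversing the channel axis (which is what cv2.COLOR_BGR2RGB does) is needed before Image.fromarray() gives correct colors. A numpy-only sketch with synthetic data:

```python
import numpy as np
from PIL import Image

bgr = np.zeros((4, 4, 3), dtype=np.uint8)
bgr[..., 0] = 255               # channel 0 is blue in BGR order
rgb = bgr[:, :, ::-1].copy()    # BGR -> RGB (same as cv2.COLOR_BGR2RGB)
img = Image.fromarray(rgb)      # now displays as blue, not red
```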
A Variable cannot be converted to numpy.ndarray directly; take its .data first:
np.array(a.data)
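A small sketch of this (in current PyTorch, Variable has been merged into Tensor, and .detach() is the preferred spelling — np.array() on a tensor that requires grad raises an error):

```python
import numpy as np
import torch

t = torch.ones(2, 2, requires_grad=True)
# np.array(t) would raise: can't convert a tensor that requires grad
arr = np.array(t.data)      # .data detaches from the autograd graph
arr2 = t.detach().numpy()   # the modern equivalent
```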
原文鏈接:https://blog.csdn.net/yskyskyer123/article/details/80707038