How to convert YUV420SP data to JPEG data


For formats such as YUV422P or YUV420P you can easily find a ready-made yuv422p_to_jpeg DCT conversion function. A blurry or corrupted result can have several causes: the Y, U and V planes being split at the wrong offsets, an incorrect YUV-to-RGB conversion, or ignoring the line padding (alignment) of frames captured from V4L2.
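As an illustration of the alignment problem: frames captured from V4L2 often have each line padded to fmt.pix.bytesperline bytes, which can be larger than the visible width. Below is a minimal sketch (my own addition, with a hypothetical helper name strip_stride) that copies such a padded plane into a tightly packed buffer before encoding:

#include <string.h>

/* Copy a plane whose lines are padded to 'bytesperline' bytes into a
 * tightly packed destination of 'width' bytes per line. */
void strip_stride(const unsigned char *src, unsigned char *dst,
                  int width, int height, int bytesperline)
{
    for (int row = 0; row < height; row++) {
        memcpy(dst + row * width,        /* packed destination line */
               src + row * bytesperline, /* padded source line      */
               width);
    }
}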

Here is a piece of code I used before that works correctly; you can use it as a reference.

/**
 * \brief JPEG encoding; the input is planar YUV data (Y plane, then U, then V)
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <jpeglib.h>

void write_YUV_JPEG_file(char *filename, unsigned char *yuvData, int quality,
                         int image_width, int image_height)
{
    struct jpeg_compress_struct cinfo;
    struct jpeg_error_mgr jerr;
    FILE *outfile;                          /* target file */
    JSAMPIMAGE buffer;                      /* one sample array per component */
    int band, i, buf_width[3], buf_height[3];

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_compress(&cinfo);

    if ((outfile = fopen(filename, "wb")) == NULL) {
        fprintf(stderr, "can't open %s\n", filename);
        exit(1);
    }
    jpeg_stdio_dest(&cinfo, outfile);

    cinfo.image_width = image_width;        /* image width and height, in pixels */
    cinfo.image_height = image_height;
    cinfo.input_components = 3;             /* # of color components per pixel */
    cinfo.in_color_space = JCS_YCbCr;       /* colorspace of input image */

    jpeg_set_defaults(&cinfo);
    jpeg_set_quality(&cinfo, quality, TRUE);

    /* Feed raw YCbCr planes directly instead of interleaved scanlines */
    cinfo.raw_data_in = TRUE;
    cinfo.jpeg_color_space = JCS_YCbCr;
    cinfo.comp_info[0].h_samp_factor = 2;   /* luma sampled 2x1 relative to chroma (4:2:2) */
    cinfo.comp_info[0].v_samp_factor = 1;

    jpeg_start_compress(&cinfo, TRUE);

    /* Allocate a buffer of DCTSIZE-aligned rows for each component */
    buffer = (JSAMPIMAGE) (*cinfo.mem->alloc_small) ((j_common_ptr) &cinfo,
                                                     JPOOL_IMAGE, 3 * sizeof(JSAMPARRAY));
    for (band = 0; band < 3; band++) {
        buf_width[band]  = cinfo.comp_info[band].width_in_blocks * DCTSIZE;
        buf_height[band] = cinfo.comp_info[band].v_samp_factor * DCTSIZE;
        buffer[band] = (*cinfo.mem->alloc_sarray) ((j_common_ptr) &cinfo,
                                                   JPOOL_IMAGE, buf_width[band], buf_height[band]);
    }

    /* rawData[band] points at the start of the Y, U and V planes respectively */
    unsigned char *rawData[3];
    rawData[0] = yuvData;
    rawData[1] = yuvData + image_width * image_height;
    rawData[2] = yuvData + image_width * image_height * 3 / 2;

    int max_line = cinfo.max_v_samp_factor * DCTSIZE;
    for (int counter = 0; cinfo.next_scanline < cinfo.image_height; counter++) {
        /* Copy one band of rows from each plane into the sample buffer */
        for (band = 0; band < 3; band++) {
            int mem_size = buf_width[band];
            unsigned char *pDst = (unsigned char *) buffer[band][0];
            unsigned char *pSrc = (unsigned char *) (rawData[band] +
                                  counter * buf_height[band] * buf_width[band]);
            for (i = 0; i < buf_height[band]; i++) {
                memcpy(pDst, pSrc, mem_size);
                pSrc += buf_width[band];
                pDst += buf_width[band];
            }
        }
        jpeg_write_raw_data(&cinfo, buffer, max_line);
    }

    jpeg_finish_compress(&cinfo);
    fclose(outfile);
    jpeg_destroy_compress(&cinfo);
}
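The question above is about YUV420SP (NV12/NV21), which stores a full Y plane followed by interleaved UV byte pairs, while the function above reads three separate planes. Below is a minimal de-interleaving sketch (my own addition, not from the original post; the helper name nv12_to_planar and the NV12 byte order are assumptions, and for real 4:2:0 data the encoder's comp_info[0].v_samp_factor would also need to be 2, with the chroma offsets adjusted to the smaller planes):

#include <stdlib.h>
#include <string.h>

/* Split NV12 (Y plane + interleaved UVUV...) into planar Y, U, V.
 * For NV21 the pair order is VU, so the two assignments swap. */
unsigned char *nv12_to_planar(const unsigned char *nv12, int width, int height)
{
    int y_size  = width * height;
    int uv_size = y_size / 4;                 /* each 4:2:0 chroma plane */
    unsigned char *planar = malloc(y_size + 2 * uv_size);
    if (planar == NULL)
        return NULL;

    unsigned char *u = planar + y_size;
    unsigned char *v = planar + y_size + uv_size;

    memcpy(planar, nv12, y_size);             /* Y plane is already contiguous */
    const unsigned char *uv = nv12 + y_size;  /* interleaved chroma block */
    for (int i = 0; i < uv_size; i++) {
        u[i] = uv[2 * i];                     /* even bytes -> U (Cb) */
        v[i] = uv[2 * i + 1];                 /* odd bytes  -> V (Cr) */
    }
    return planar;
}

The caller is responsible for freeing the returned buffer after encoding.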

The Y channel carries the luminance information, which matters most for perceived image detail, while the U and V channels carry the chrominance (hue and saturation), which is less critical than luminance, so relatively fewer chroma samples can be kept. This is where sampling schemes such as YUV411 and YUV422 come from (the 4:2:2 ratio you wrote above was mistyped).
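As a quick illustration of how much data each scheme carries (my own arithmetic, assuming 8-bit samples and a hypothetical 640x480 frame):

#include <stdio.h>

int main(void)
{
    int w = 640, h = 480;                         /* hypothetical frame size    */
    printf("4:4:4 : %d bytes\n", w * h * 3);      /* full-resolution chroma     */
    printf("4:2:2 : %d bytes\n", w * h * 2);      /* chroma halved horizontally */
    printf("4:2:0 : %d bytes\n", w * h * 3 / 2);  /* chroma halved in both axes */
    return 0;
}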

