Linux-V4L2-USB-Camera

Preface

Driving a USB camera with the V4L2 framework on a Linux development board.

V4L2 Overview

V4L2 (Video for Linux 2) is the API in the Linux kernel for video capture devices. It provides a standard set of methods for accessing and controlling video capture hardware such as cameras and video encoders.

Cameras come in countless models from many vendors, and writing a separate driver for each one would be tedious, which is why the V4L2 framework exists. Most cameras today work with V4L2, which makes it easy to drive cameras with all kinds of interfaces.

V4L2 defines its important data structures in include/uapi/linux/videodev2.h; capturing images essentially comes down to manipulating these structures to obtain the final image data.

V4L2 supports two ways of acquiring images: memory mapping (mmap) and direct reads (read). The former is generally used for continuous video capture, the latter for still images. Since this project captures video from a camera, mmap is used.

Steps for an application to capture video through the V4L2 interface:

  1. Open the video device file (/dev/videoX) and initialize the capture parameters: use the V4L2 interface to set the capture window, frame size, and pixel format;
  2. Request a number of frame buffers for capture and map them from kernel space into user space so the application can read and process the video data;
  3. Queue the requested frame buffers into the capture input queue and start capturing;
  4. The driver captures data; the application takes a frame buffer from the capture output queue, processes it, and puts it back into the capture input queue, looping to capture continuous video;
  5. Stop capturing.

The concrete implementation flow is illustrated by the flow chart below:

(figure: V4L2 capture flow chart)

The implementation basically boils down to using ioctl() to set and query parameters.

Buffer management is the most important part. Once video capture starts, the driver captures one frame of data and writes it into the first frame buffer of the capture input queue. When that frame is complete (the first buffer holds one full frame), the driver moves the buffer to the capture output queue, where it waits for the application to take it out. The driver then captures the next frame into the second buffer; once that buffer holds a full frame, it too is moved to the capture output queue.

The application takes a buffer containing video data out of the capture output queue and processes the data, for example storing or compressing it.

Finally, the application puts the processed buffer back into the capture input queue, so that capture can continue in a loop, as shown in the figures.

(figures: frame buffers circulating between the capture input and output queues)

Using V4L2

An application acquiring video data through V4L2 interacts with the driver entirely via ioctl commands. Commonly used V4L2 ioctl requests:

```c
VIDIOC_QUERYCAP    /* query the operations the device supports */
VIDIOC_G_FMT       /* get the current video format */
VIDIOC_S_FMT       /* set the capture video format */
VIDIOC_REQBUFS     /* request buffer allocation from the driver */
VIDIOC_QUERYBUF    /* query the buffers the driver allocated */
VIDIOC_QBUF        /* queue an empty buffer into the capture queue */
VIDIOC_DQBUF       /* dequeue a buffer holding captured video */
VIDIOC_STREAMON    /* start the video stream */
VIDIOC_STREAMOFF   /* stop the video stream */
VIDIOC_QUERYCTRL   /* query whether the driver supports a control */
VIDIOC_G_CTRL      /* get the current control value */
VIDIOC_S_CTRL      /* set a new control value */
VIDIOC_G_TUNER     /* get tuner information */
VIDIOC_S_TUNER     /* set tuner information */
VIDIOC_G_FREQUENCY /* get the tuner frequency */
VIDIOC_S_FREQUENCY /* set the tuner frequency */
```

Opening/closing the camera

open() opens the camera device node /dev/videoX.

close() closes the camera.

```c
int fd = open("/dev/videoX", O_RDWR); /* open the device */
close(fd);                            /* close the device */
```

Querying device capabilities

VIDIOC_QUERYCAP (Query Capability): check whether the device is a capture device, and whether it supports mmap streaming or only read/write I/O.

The v4l2_capability structure:

```c
/**
 * struct v4l2_capability - Describes V4L2 device caps returned by VIDIOC_QUERYCAP
 *
 * @driver:	name of the driver module (e.g. "bttv")
 * @card:	name of the card (e.g. "Hauppauge WinTV")
 * @bus_info:	name of the bus (e.g. "PCI:" + pci_name(pci_dev) )
 * @version:	KERNEL_VERSION
 * @capabilities: capabilities of the physical device as a whole
 * @device_caps:  capabilities accessed via this particular device (node)
 * @reserved:	reserved fields for future extensions
 */
struct v4l2_capability {
	__u8	driver[16];
	__u8	card[32];
	__u8	bus_info[32];
	__u32	version;
	__u32	capabilities;
	__u32	device_caps;
	__u32	reserved[3];
};
```

Enumerating supported frame formats

VIDIOC_ENUM_FMT enumerates the pixel formats the camera supports.

```c
/*
 *	F O R M A T   E N U M E R A T I O N
 */
struct v4l2_fmtdesc {
	__u32	index;			/* Format number */
	__u32	type;			/* enum v4l2_buf_type */
	__u32	flags;
	__u8	description[32];	/* Description string */
	__u32	pixelformat;		/* Format fourcc */
	__u32	reserved[4];
};
```

Getting/setting the frame format

VIDIOC_G_FMT / VIDIOC_S_FMT. The v4l2_format structure holds the video format:

```c
/**
 * struct v4l2_format - stream data format
 * @type:	enum v4l2_buf_type; type of the data stream
 * @pix:	definition of an image format
 * @pix_mp:	definition of a multiplanar image format
 * @win:	definition of an overlaid image
 * @vbi:	raw VBI capture or output parameters
 * @sliced:	sliced VBI capture or output parameters
 * @raw_data:	placeholder for future extensions and custom formats
 */
struct v4l2_format {
	__u32	 type;
	union {
		struct v4l2_pix_format		pix;	/* V4L2_BUF_TYPE_VIDEO_CAPTURE */
		struct v4l2_pix_format_mplane	pix_mp;	/* V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE */
		struct v4l2_window		win;	/* V4L2_BUF_TYPE_VIDEO_OVERLAY */
		struct v4l2_vbi_format		vbi;	/* V4L2_BUF_TYPE_VBI_CAPTURE */
		struct v4l2_sliced_vbi_format	sliced;	/* V4L2_BUF_TYPE_SLICED_VBI_CAPTURE */
		struct v4l2_sdr_format		sdr;	/* V4L2_BUF_TYPE_SDR_CAPTURE */
		__u8	raw_data[200];			/* user-defined */
	} fmt;
};
```

type takes one of the values of the v4l2_buf_type enumeration:

```c
enum v4l2_buf_type {
	V4L2_BUF_TYPE_VIDEO_CAPTURE        = 1,
	V4L2_BUF_TYPE_VIDEO_OUTPUT         = 2,
	V4L2_BUF_TYPE_VIDEO_OVERLAY        = 3,
	V4L2_BUF_TYPE_VBI_CAPTURE          = 4,
	V4L2_BUF_TYPE_VBI_OUTPUT           = 5,
	V4L2_BUF_TYPE_SLICED_VBI_CAPTURE   = 6,
	V4L2_BUF_TYPE_SLICED_VBI_OUTPUT    = 7,
	V4L2_BUF_TYPE_VIDEO_OUTPUT_OVERLAY = 8,
	V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE = 9,
	V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE  = 10,
	V4L2_BUF_TYPE_SDR_CAPTURE          = 11,
	V4L2_BUF_TYPE_SDR_OUTPUT           = 12,
	/* Deprecated, do not use */
	V4L2_BUF_TYPE_PRIVATE              = 0x80,
};
```

The v4l2_pix_format structure holds the frame format attributes of a capture device:

```c
/*
 *	V I D E O   I M A G E   F O R M A T
 */
struct v4l2_pix_format {
	__u32	width;
	__u32	height;
	__u32	pixelformat;
	__u32	field;		/* enum v4l2_field */
	__u32	bytesperline;	/* for padding, zero if unused */
	__u32	sizeimage;
	__u32	colorspace;	/* enum v4l2_colorspace */
	__u32	priv;		/* private data, depends on pixelformat */
	__u32	flags;		/* format flags (V4L2_PIX_FMT_FLAG_*) */
	__u32	ycbcr_enc;	/* enum v4l2_ycbcr_encoding */
	__u32	quantization;	/* enum v4l2_quantization */
	__u32	xfer_func;	/* enum v4l2_xfer_func */
};
```

Requesting buffers

VIDIOC_REQBUFS: requesting buffers is the application's job. Several buffers can be requested, but the driver is not guaranteed to grant them all. The v4l2_requestbuffers structure asks the driver to allocate a contiguous chunk of memory for caching video data:

```c
/*
 *	M E M O R Y - M A P P I N G   B U F F E R S
 */
struct v4l2_requestbuffers {
	__u32	count;		/* number of frame buffers requested */
	__u32	type;		/* enum v4l2_buf_type; buffer data format */
	__u32	memory;		/* enum v4l2_memory; memory-mapped vs user-pointer I/O */
	__u32	reserved[2];
};
```

The v4l2_memory enumeration; mmap is the method normally used.

```c
enum v4l2_memory {
	V4L2_MEMORY_MMAP    = 1,
	V4L2_MEMORY_USERPTR = 2,
	V4L2_MEMORY_OVERLAY = 3,
	V4L2_MEMORY_DMABUF  = 4,	/* added in newer kernel versions */
};
```

Querying buffers

VIDIOC_QUERYBUF queries buffer information. If N buffers were requested, the ioctl must be issued N times; after mmap(), the application can read and write these buffers directly.

The v4l2_buffer structure holds a buffer's information:

```c
/**
 * struct v4l2_buffer - video buffer info
 * @index:	id number of the buffer
 * @type:	enum v4l2_buf_type; buffer type (type == *_MPLANE for
 *		multiplanar buffers)
 * @bytesused:	number of bytes occupied by data in the buffer (payload);
 *		unused (set to 0) for multiplanar buffers
 * @flags:	buffer informational flags
 * @field:	enum v4l2_field; field order of the image in the buffer
 * @timestamp:	frame timestamp
 * @timecode:	frame timecode
 * @sequence:	sequence count of this frame
 * @memory:	enum v4l2_memory; the method, in which the actual video data is
 *		passed
 * @offset:	for non-multiplanar buffers with memory == V4L2_MEMORY_MMAP;
 *		offset from the start of the device memory for this plane,
 *		(or a "cookie" that should be passed to mmap() as offset)
 * @userptr:	for non-multiplanar buffers with memory == V4L2_MEMORY_USERPTR;
 *		a userspace pointer pointing to this buffer
 * @fd:		for non-multiplanar buffers with memory == V4L2_MEMORY_DMABUF;
 *		a userspace file descriptor associated with this buffer
 * @planes:	for multiplanar buffers; userspace pointer to the array of plane
 *		info structs for this buffer
 * @length:	size in bytes of the buffer (NOT its payload) for single-plane
 *		buffers (when type != *_MPLANE); number of elements in the
 *		planes array for multi-plane buffers
 *
 * Contains data exchanged by application and driver using one of the Streaming
 * I/O methods.
 */
struct v4l2_buffer {
	__u32			index;
	__u32			type;
	__u32			bytesused;
	__u32			flags;
	__u32			field;
	struct timeval		timestamp;
	struct v4l2_timecode	timecode;
	__u32			sequence;

	/* memory location */
	__u32			memory;
	union {
		__u32		offset;
		unsigned long	userptr;
		struct v4l2_plane *planes;
		__s32		fd;
	} m;
	__u32			length;
	__u32			reserved2;
	__u32			reserved;
};
```

Queueing/dequeueing buffers

VIDIOC_QBUF puts a buffer into the queue; if N buffers were requested, the ioctl must be issued N times.

VIDIOC_DQBUF takes a buffer out of the queue.

Starting/stopping the video stream

VIDIOC_STREAMON / VIDIOC_STREAMOFF. Once the stream is started, the application usually enters a loop: poll/select waits until a buffer holds data, the buffer is dequeued from the output queue, processed, and then requeued into the input queue.

和菜鸟一起学linux之V4L2摄像头应用流程-CSDN博客

v4l2 编程接口(一) — ioctl_v4l2 帧id-CSDN博客

Configuring kernel support for a USB camera

Cameras come with various interfaces: MIPI, DVP, USB, and so on. A USB camera is used here to get started; other interfaces will follow later. USB cameras support the UVC (USB Video Class) protocol, so the V4L2 framework can be explored without writing a driver file first, which is why USB cameras are often called "driver-free".

Enabling the UVC driver

The Linux kernel already ships a UVC driver; it only needs to be enabled in the kernel configuration.

Enter the kernel's graphical configuration interface:

```
make menuconfig

Device Drivers --->
    Multimedia support --->
        Media USB Adapters --->
            USB Video Class (UVC)
```

(screenshot: USB Video Class (UVC) option in menuconfig)

If the USB Video Class (UVC) entry is missing, search for USB_VIDEO_CLASS to see its dependencies, select all of them, save, and reopen menuconfig.

(screenshot: USB_VIDEO_CLASS dependency search result)

Next, enable Soc camera support and platform camera support:

```
Device Drivers --->
    Multimedia support --->
        V4L platform devices
```

(screenshot: V4L platform devices menu)

Adding the camera's PID and VID

Plug the camera into a PC and look up its PID and VID in Device Manager:

(screenshot: camera VID/PID shown in Device Manager)

Open drivers/media/usb/uvc/uvc_driver.c in the kernel source, find the uvc_ids table, and add an entry for the camera modeled on the existing ones:

```c
/* add my camera: replace idVendor/idProduct with your VID and PID */
{ .match_flags		= USB_DEVICE_ID_MATCH_DEVICE
			| USB_DEVICE_ID_MATCH_INT_INFO,
  .idVendor		= 0x1bcf,
  .idProduct		= 0x0b09,
  .bInterfaceClass	= USB_CLASS_VIDEO,
  .bInterfaceSubClass	= 1,
  .bInterfaceProtocol	= 0,
  .driver_info		= UVC_QUIRK_RESTRICT_FRAME_RATE },
```

Finally, rebuild the kernel and flash the new image to the board; the camera is then detected when plugged in.

(screenshot: kernel log after plugging in the camera)

My board already has a /dev/video0 node by default, so the camera appears as /dev/video1 after plugging in.

Writing a test application

The files under the plugins/input_uvc directory of the mjpg-streamer project are a good reference.

jacksonliam/mjpg-streamer: Fork of http://sourceforge.net/projects/mjpg-streamer/ (github.com)

The uvc_camera_capture.h file:

```c
#ifndef UVC_CAMERA_CAPTURE
#define UVC_CAMERA_CAPTURE

#include <stdio.h>
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <string.h>
#include <sys/mman.h>
#include <signal.h>
#include <poll.h>
#include <linux/fb.h>
#include <stdlib.h>
#include <jpeglib.h>


#define DEBUG 1

#ifdef DEBUG
#define DBG(...) printf(__VA_ARGS__)
#else
#define DBG(...)
#endif


#define NB_BUFFER 4

struct vdIn {
	int fd;
	char *videodevice;
	struct v4l2_fmtdesc fmtdesc;
	struct v4l2_frmsizeenum frmsizeenum;
	struct v4l2_streamparm streamparm;
	struct v4l2_capability cap;
	struct v4l2_format fmt;
	struct v4l2_buffer buf;
	struct v4l2_requestbuffers rb;
	void *mem[NB_BUFFER];
	int width;
	int height;
	int fps;
	int formatIn;
};

struct lcdFramebuffer {
	int fd;
	struct fb_var_screeninfo var;	/* LCD variable screen parameters */
	unsigned int *base;		/* framebuffer mmap base address */
	int width, height;
};

#endif
```

The uvc_camera_capture.c file:

```c
#include "uvc_camera_capture.h"
#include <time.h>

static int jpeg_to_rgb(const void *jpegData, unsigned char *rgbData, int size)
{
	clock_t start_time, end_time;
	start_time = clock();

	struct jpeg_error_mgr jerr;
	struct jpeg_decompress_struct cinfo;
	cinfo.err = jpeg_std_error(&jerr);
	// 1. create the decompression object and initialize it
	jpeg_create_decompress(&cinfo);
	// 2. hand it the compressed data
	//jpeg_stdio_src(&cinfo, infile);
	jpeg_mem_src(&cinfo, (unsigned char *)jpegData, size);
	// 3. read the parameters from the JPEG header
	jpeg_read_header(&cinfo, TRUE);
	/* Step 4: set parameters for decompression */
	// 5. start decompression
	jpeg_start_decompress(&cinfo);
	/*
	 * After jpeg_start_decompress() these fields of the decompression
	 * object become useful:
	 *   cinfo.output_width:      output width, pixels per scanline
	 *   cinfo.output_height:     output height, number of scanlines
	 *   cinfo.output_components: bytes per pixel
	 *     3: R G B
	 *     4: A R G B
	 * Total size: width * height * components. After
	 * jpeg_start_decompress(), storage for one scanline of decompressed
	 * pixels is usually allocated:
	 * one line = output_width * output_components
	 */
	// 6. allocate memory for one scanline
	int row_stride = cinfo.output_width * cinfo.output_components;
	//printf("output_width: %d, output_components: %d\n", cinfo.output_width, cinfo.output_components);
	unsigned char *buffer = malloc(row_stride);
	// cinfo.output_scanline counts how many lines have been read so far
	int i = 0;
	while (cinfo.output_scanline < cinfo.output_height) {
		jpeg_read_scanlines(&cinfo, &buffer, 1);
		memcpy(rgbData + i * row_stride, buffer, row_stride);
		i++;
	}
	/*
	 * Scanlines are read from top to bottom: the topmost scanline of the
	 * image is returned by jpeg_read_scanlines() first, then the second
	 * one, and the scanline at the bottom edge of the image last.
	 */
	// 7. decompression finished
	jpeg_finish_decompress(&cinfo);
	// 8. destroy the decompression object
	jpeg_destroy_decompress(&cinfo);
	free(buffer);

	end_time = clock();
	DBG("mjpeg to rgb waste time %f s\n", (double)(end_time - start_time) / CLOCKS_PER_SEC);

	return 1;
}

/* copy the stream, three bytes per pixel, into the RGB framebuffer */
static void lcd_show_rgb(struct lcdFramebuffer *lfb, unsigned char *rgbData, int width, int height)
{
	clock_t start_time, end_time;
	start_time = clock();

	unsigned int *row_start = lfb->base;
	unsigned int *ptr = NULL;
	unsigned char *srcData = NULL;
	unsigned char r, g, b;
	for (int i = 0; i < height; i++) {
		ptr = row_start;	// point at the start of the current row
		srcData = rgbData;
		for (int j = 0; j < width; j++) {
			/* read the components one at a time: several srcData++
			 * in a single expression would be unsequenced (UB) */
			r = *srcData++;
			g = *srcData++;
			b = *srcData++;
			*ptr++ = (0xFFu << 24) | ((unsigned int)r << 16) |
				 ((unsigned int)g << 8) | (unsigned int)b;
		}
		rgbData += width * 3;		// advance by the image width
		row_start += lfb->width;	// advance by the screen width
	}

	end_time = clock();
	DBG("Write fb waste time %f s\n", (double)(end_time - start_time) / CLOCKS_PER_SEC);
}

/* fcc2s - convert pixelformat to string
 * (obtained from v4l-utils: v4l2-ctl.cpp)
 * args:
 *   fmtString   - char* to hold string
 *   size        - size of allocated memory for string
 *   pixelformat - v4l2 pixel format identifier
 */
static void fcc2s(char *fmtString, unsigned int size, unsigned int pixelformat)
{
	if (size < 8) {
		fmtString[0] = '\0';
		return;
	}

	fmtString[0] = pixelformat & 0x7f;
	fmtString[1] = (pixelformat >> 8) & 0x7f;
	fmtString[2] = (pixelformat >> 16) & 0x7f;
	fmtString[3] = (pixelformat >> 24) & 0x7f;
	if (pixelformat & (1U << 31)) {
		fmtString[4] = '-';
		fmtString[5] = 'B';
		fmtString[6] = 'E';
		fmtString[7] = '\0';
	} else {
		fmtString[4] = '\0';
	}
}

static int init_v4l2(struct vdIn *vd)
{
	int i;
	int ret = 0;
	int pixelFormatNum = 0;
	int formatFrameSizeNum = 0;
	vd->fd = open(vd->videodevice, O_RDWR);
	if (vd->fd == -1) {
		perror("ERROR opening V4L interface");
		return -1;
	}

	// query the device capabilities
	memset(&vd->cap, 0, sizeof(struct v4l2_capability));
	ret = ioctl(vd->fd, VIDIOC_QUERYCAP, &vd->cap);
	if (ret < 0) {
		fprintf(stderr, "Error opening device %s: unable to query device.\n", vd->videodevice);
		goto fatal;
	}

	DBG("driver:\t\t%s\n", vd->cap.driver);
	DBG("card:\t\t%s\n", vd->cap.card);
	DBG("bus_info:\t%s\n", vd->cap.bus_info);
	DBG("version:\t%d\n", vd->cap.version);
	DBG("capabilities:\t%x\n", vd->cap.capabilities);

	if ((vd->cap.capabilities & V4L2_CAP_VIDEO_CAPTURE) == 0) {
		fprintf(stderr, "Error opening device %s: video capture not supported.\n",
			vd->videodevice);
		goto fatal;
	}
	DBG("%s supports capture.\n", vd->videodevice);

	/* note the parentheses: == binds tighter than &, so the mask has to
	 * be tested explicitly */
	if (!(vd->cap.capabilities & V4L2_CAP_STREAMING)) {
		fprintf(stderr, "%s does not support streaming i/o\n", vd->videodevice);
		goto fatal;
	}
	DBG("%s supports streaming.\n", vd->videodevice);

	// enumerate the camera's formats
	memset(&vd->fmtdesc, 0, sizeof(struct v4l2_fmtdesc));
	memset(&vd->streamparm, 0, sizeof(struct v4l2_streamparm));
	memset(&vd->frmsizeenum, 0, sizeof(struct v4l2_frmsizeenum));
	while (1) {
		vd->fmtdesc.index = pixelFormatNum++;
		vd->fmtdesc.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;	// video capture device type

		// enumerate the pixel formats the camera supports
		ret = ioctl(vd->fd, VIDIOC_ENUM_FMT, &vd->fmtdesc);
		if (ret != 0)
			break;

		// read the default frame rate
		vd->streamparm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
		ret = ioctl(vd->fd, VIDIOC_G_PARM, &vd->streamparm);
		if (ret == 0)
			DBG("Default FPS: %d fps\n", vd->streamparm.parm.capture.timeperframe.denominator);

		// list the frame sizes supported for this format
		while (1) {
			vd->frmsizeenum.index = formatFrameSizeNum++;
			vd->frmsizeenum.pixel_format = vd->fmtdesc.pixelformat;

			ret = ioctl(vd->fd, VIDIOC_ENUM_FRAMESIZES, &vd->frmsizeenum);
			if (ret == 0)
				DBG("Support Format:%s, %d, FrameSize %d: %d x %d\n",
				    vd->fmtdesc.description, vd->fmtdesc.pixelformat,
				    formatFrameSizeNum, vd->frmsizeenum.discrete.width,
				    vd->frmsizeenum.discrete.height);
			else
				break;
		}
		formatFrameSizeNum = 0;
	}

	// set the capture format
	memset(&vd->fmt, 0, sizeof(struct v4l2_format));
	vd->fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	vd->fmt.fmt.pix.width = vd->width;
	vd->fmt.fmt.pix.height = vd->height;
	vd->fmt.fmt.pix.pixelformat = vd->formatIn;
	vd->fmt.fmt.pix.field = V4L2_FIELD_ANY;
	ret = ioctl(vd->fd, VIDIOC_S_FMT, &vd->fmt);
	if (ret < 0) {
		fprintf(stderr, "Unable to set format: %d res: %dx%d\n", vd->formatIn, vd->width, vd->height);
		goto fatal;
	}

	// if the driver adjusted the resolution, accept the adjusted values
	if ((vd->fmt.fmt.pix.width != vd->width) || (vd->fmt.fmt.pix.height != vd->height)) {
		vd->width = vd->fmt.fmt.pix.width;
		vd->height = vd->fmt.fmt.pix.height;
	}

	// if the driver substituted another pixel format, fall back to it
	if (vd->formatIn != vd->fmt.fmt.pix.pixelformat) {
		char fmtStringRequested[8];
		char fmtStringObtained[8];
		fcc2s(fmtStringObtained, 8, vd->fmt.fmt.pix.pixelformat);
		fcc2s(fmtStringRequested, 8, vd->formatIn);
		fprintf(stderr, " i: Could not obtain the requested pixelformat: %s , driver gave us: %s\n", fmtStringRequested, fmtStringObtained);
		fprintf(stderr, " i: The specified resolution is unavailable, using: width %d height %d instead \n", vd->fmt.fmt.pix.width, vd->fmt.fmt.pix.height);

		switch (vd->fmt.fmt.pix.pixelformat) {
		case V4L2_PIX_FMT_JPEG:
			// Fall-through intentional
		case V4L2_PIX_FMT_MJPEG:
			fprintf(stderr, " ... Falling back to the faster MJPG mode (consider changing cmd line options).\n");
			vd->formatIn = vd->fmt.fmt.pix.pixelformat;
			break;
		case V4L2_PIX_FMT_YUYV:
			fprintf(stderr, " ... Falling back to YUV mode (consider using -yuv option). Note that this requires much more CPU power\n");
			vd->formatIn = vd->fmt.fmt.pix.pixelformat;
			break;
		case V4L2_PIX_FMT_UYVY:
			fprintf(stderr, " ... Falling back to UYVY mode (consider using -uyvy option). Note that this requires much more CPU power\n");
			vd->formatIn = vd->fmt.fmt.pix.pixelformat;
			break;
		case V4L2_PIX_FMT_RGB24:
			fprintf(stderr, " ... Falling back to RGB24 mode (consider using -fourcc RGB24 option). Note that this requires much more CPU power\n");
			vd->formatIn = vd->fmt.fmt.pix.pixelformat;
			break;
		case V4L2_PIX_FMT_RGB565:
			fprintf(stderr, " ... Falling back to RGB565 mode (consider using -fourcc RGBP option). Note that this requires much more CPU power\n");
			vd->formatIn = vd->fmt.fmt.pix.pixelformat;
			break;
		default:
			goto fatal;
		}
	}

	char fmtStringRequested[8];
	fcc2s(fmtStringRequested, 8, vd->formatIn);
	DBG("Set Format: %s, FrameSize: %d x %d\n", fmtStringRequested, vd->width, vd->height);

	// request buffers
	memset(&vd->rb, 0, sizeof(struct v4l2_requestbuffers));
	vd->rb.count = NB_BUFFER;
	vd->rb.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	vd->rb.memory = V4L2_MEMORY_MMAP;	// memory-mapped I/O
	ret = ioctl(vd->fd, VIDIOC_REQBUFS, &vd->rb);
	if (ret < 0) {
		perror("Unable to allocate buffers");
		goto fatal;
	}

	// map the buffers
	for (i = 0; i < NB_BUFFER; i++) {
		memset(&vd->buf, 0, sizeof(struct v4l2_buffer));
		vd->buf.index = i;
		vd->buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
		vd->buf.memory = V4L2_MEMORY_MMAP;

		ret = ioctl(vd->fd, VIDIOC_QUERYBUF, &vd->buf);	// query the buffer
		if (ret < 0) {
			perror("Unable to query buffer");
			goto fatal;
		}

		vd->mem[i] = mmap(0, vd->buf.length, PROT_READ | PROT_WRITE,
				  MAP_SHARED, vd->fd, vd->buf.m.offset);
		if (vd->mem[i] == MAP_FAILED) {
			perror("Unable to map buffer");
			goto fatal;
		}
	}

	// queue the buffers
	for (i = 0; i < NB_BUFFER; i++) {
		memset(&vd->buf, 0, sizeof(struct v4l2_buffer));
		vd->buf.index = i;
		vd->buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
		vd->buf.memory = V4L2_MEMORY_MMAP;
		ret = ioctl(vd->fd, VIDIOC_QBUF, &vd->buf);
		if (ret < 0) {
			perror("Unable to queue buffer");
			goto fatal;
		}
	}

	return 0;

fatal:
	fprintf(stderr, "Init v4L2 failed !! exit fatal\n");
	return -1;
}

static int init_videoIn(struct vdIn *vd, char *device, int width,
			int height, int fps, int format)
{
	if (vd == NULL || device == NULL)
		return -1;
	if (width == 0 || height == 0)
		return -1;

	vd->videodevice = (char *)calloc(1, 16 * sizeof(char));
	if (snprintf(vd->videodevice, (16 - 1), "%s", device) < 0) {
		free(vd->videodevice);
		vd->videodevice = NULL;
		return -1;
	}
	vd->width = width;
	vd->height = height;
	vd->fps = fps;
	vd->formatIn = format;
	if (init_v4l2(vd) < 0) {
		free(vd->videodevice);
		vd->videodevice = NULL;
		if (vd->fd >= 0)
			close(vd->fd);
		return -1;
	}

	return 0;
}

static int video_enable(struct vdIn *vd)
{
	int type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	int ret;

	ret = ioctl(vd->fd, VIDIOC_STREAMON, &type);
	if (ret < 0) {
		perror("Unable to start capture");
		return ret;
	}
	DBG("Starting capture\n");
	return 0;
}

static int video_disable(struct vdIn *vd)
{
	int type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	int ret;

	ret = ioctl(vd->fd, VIDIOC_STREAMOFF, &type);
	if (ret != 0) {
		perror("Unable to stop capture");
		return ret;
	}
	DBG("Stopping capture\n");
	return 0;
}

static int video_handle(struct vdIn *vd, struct lcdFramebuffer *lfb)
{
	int ret = 0;
	struct pollfd fds[1];

	// storage for the decoded RGB frame; static because ~900 KB would be
	// heavy on the stack
	static unsigned char rgbData[640 * 480 * 3];

	clock_t start_time, end_time;
	while (1) {
		start_time = clock();
		// 1. wait for a filled buffer
		memset(fds, 0, sizeof(fds));
		fds[0].fd = vd->fd;
		fds[0].events = POLLIN;
		if (poll(fds, 1, -1) > 0) {
			// 2. dequeue the buffer
			memset(&vd->buf, 0, sizeof(struct v4l2_buffer));
			vd->buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
			vd->buf.memory = V4L2_MEMORY_MMAP;

			ret = ioctl(vd->fd, VIDIOC_DQBUF, &vd->buf);
			if (ret < 0) {
				perror("Unable to dequeue buffer");
				return -1;
			}

			// // 3. alternative: save the buffer contents as a file
			// char filename[128];
			// static int file_cnt = 0;
			// sprintf(filename, "video_raw_data_%04d.jpg", file_cnt++);
			// int fd_file = open(filename, O_RDWR | O_CREAT, 0666);
			// if (fd_file < 0) {
			// 	perror("create file error");
			// 	return -1;
			// }
			// write(fd_file, vd->mem[vd->buf.index], vd->buf.bytesused);
			// close(fd_file);
			// printf("create %s", filename);

			// 3. decode MJPEG to RGB and show it on the LCD;
			// bytesused is the payload size, not the buffer length
			jpeg_to_rgb(vd->mem[vd->buf.index], rgbData, vd->buf.bytesused);
			lcd_show_rgb(lfb, rgbData, 640, 480);

			// 4. requeue the buffer
			ret = ioctl(vd->fd, VIDIOC_QBUF, &vd->buf);
			if (ret < 0) {
				perror("Unable to queue buffer");
				return -1;
			}
		}
		end_time = clock();
		DBG("Each frame waste time: %f s\n", (double)(end_time - start_time) / CLOCKS_PER_SEC);
	}
	return 0;
}

static int init_framebuffer(struct lcdFramebuffer *lfb, int width, int height)
{
	if (lfb == NULL)
		return -1;

	// open the LCD
	lfb->fd = open("/dev/fb0", O_RDWR);
	if (lfb->fd < 0) {
		perror("/dev/fb0");
		return -1;
	}

	// get the LCD screen info
	if (ioctl(lfb->fd, FBIOGET_VSCREENINFO, &lfb->var)) {
		perror("Unable to get fb_var_screeninfo");
		return -1;
	}

	// map the framebuffer: 4 bytes per pixel, ARGB8888
	lfb->base = (unsigned int *)mmap(NULL, width * height * 4,
					 PROT_READ | PROT_WRITE, MAP_SHARED, lfb->fd, 0);
	if (lfb->base == MAP_FAILED) {
		perror("Unable to map buffer");
		return -1;
	}
	return 0;
}

int main(int argc, char **argv)
{
	int i;
	struct vdIn *vd;
	struct lcdFramebuffer *lfb;
	lfb = (struct lcdFramebuffer *)malloc(sizeof(struct lcdFramebuffer));
	vd = (struct vdIn *)malloc(sizeof(struct vdIn));

	// initialize the capture structures
	init_videoIn(vd, "/dev/video1", 640, 480, 30, V4L2_PIX_FMT_MJPEG);

	// start the camera stream
	video_enable(vd);

	// initialize the framebuffer
	lfb->width = 1024;
	lfb->height = 600;
	init_framebuffer(lfb, 1024, 600);

	// process frames
	video_handle(vd, lfb);

	// release the video buffers
	for (i = 0; i < NB_BUFFER; i++) {
		munmap(vd->mem[i], vd->buf.length);
	}
	free(vd->videodevice);
	vd->videodevice = NULL;
	video_disable(vd);
	close(vd->fd);
	free(vd);

	// release the framebuffer mapping (4 bytes per pixel)
	munmap(lfb->base, lfb->width * lfb->height * 4);
	free(lfb);
	return 0;
}
```

Run log:

(screenshot: program run log)

The final display resolution is 640x480 at roughly 25 fps; converting MJPEG to RGB accounts for most of the per-frame time.

Since the LCD in hand is 1024 x 600, the camera's other two modes, 1280 x 720 and 1920 x 1080, cannot be written to the framebuffer directly; the captured image would need scaling, which is cumbersome, so the OpenCV library will be tried next.

在linux虚拟机上显示摄像头视频(V4L2编程)_v4l2命令行显示视频-CSDN博客

嵌入式linux之在lcd上显示摄像头图像_esp32摄像头显示在lcd上-CSDN博客

V4L2图像采集+图片格式转换(YUYV、RGB、JPEG)_qt中使用v4l2将jpeg格式的图像转化为rbg-CSDN博客