CN114025088A - Method for realizing all-around image safety monitoring by arranging intelligent camera on commercial vehicle - Google Patents


Info

Publication number
CN114025088A
CN114025088A · CN202111280660A · CN114025088B
Authority
CN
China
Prior art keywords
panoramic
image
around
matrix
terminal
Prior art date
Legal status
Granted
Application number
CN202111280660.6A
Other languages
Chinese (zh)
Other versions
CN114025088B (en)
Inventor
汤超
刘延
周金应
苏梦月
程前
陈金晶
Current Assignee
Caac Chongqing Automobile Inspection Co ltd
China Automotive Engineering Research Institute Co Ltd
Original Assignee
Caac Chongqing Automobile Inspection Co ltd
China Automotive Engineering Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Caac Chongqing Automobile Inspection Co ltd, China Automotive Engineering Research Institute Co Ltd filed Critical Caac Chongqing Automobile Inspection Co ltd
Priority to CN202111280660.6A priority Critical patent/CN114025088B/en
Publication of CN114025088A publication Critical patent/CN114025088A/en
Application granted granted Critical
Publication of CN114025088B publication Critical patent/CN114025088B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2624 Studio circuits for obtaining an image which is composed of whole input images, e.g. splitscreen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10 Details of viewing arrangements characterised by the type of camera system used
    • B60R2300/105 Details of viewing arrangements using multiple cameras
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Abstract

The invention provides a method for realizing all-around image safety monitoring by arranging an intelligent camera on a commercial vehicle, which comprises detecting a vehicle to be tested and detecting the brightness consistency of the spliced panoramic all-around image. The brightness consistency detection method comprises the following steps: S4-A, acquiring the spliced image and carrying out graying processing on it to obtain a grayscale image; S4-B, carrying out image denoising on the grayscale image; S4-C, dividing the grayscale image into equal-size regions according to the image size; S4-D, converting the regions into the LAB color space, extracting the L (lightness) component, calculating the average L value of each region, and computing the differences between the averages of all regions: if any difference is greater than or equal to a preset difference threshold, the brightness is inconsistent; otherwise the brightness is consistent. The invention can detect and analyze the brightness consistency of the spliced image.

Description

Method for realizing all-around image safety monitoring by arranging intelligent camera on commercial vehicle
Technical Field
The invention relates to the technical field of vehicle safety, and in particular to a method for realizing all-around image safety monitoring by arranging an intelligent camera on a commercial vehicle.
Background
A vehicle is a device driven by wheels for transporting people or goods. Recently, intelligent vehicles have been actively developed and commercialized for the safety and convenience of drivers and pedestrians. Intelligent vehicles are advanced vehicles incorporating information technology (IT): they not only introduce advanced on-board systems but also provide optimal traffic efficiency through links to intelligent transportation systems. In particular, intelligent vehicles maximize the safety and convenience of drivers, passengers and pedestrians by performing automatic driving, adaptive cruise control (ACC), obstacle detection, collision detection, accurate map provision, route setting to a destination, and provision of the locations of major places. As a means of maximizing the safety and convenience of drivers, passengers and pedestrians, panoramic all-around control devices have attracted attention. A panoramic all-around control device provides a panoramic all-around image of the vehicle using cameras, so that the driver can view the vehicle's surroundings in real time through the panoramic all-around image.
Disclosure of Invention
The invention aims to solve at least the above technical problems in the prior art, and in particular provides a method for realizing all-around image safety monitoring by arranging an intelligent camera on a commercial vehicle.
In order to achieve this purpose, the invention provides a method for realizing all-around image safety monitoring by arranging an intelligent camera on a commercial vehicle. The method involves a vehicle to be tested, on whose body A panoramic all-around cameras are arranged, A being a positive integer greater than or equal to 1; these are the 1st panoramic all-around camera, the 2nd panoramic all-around camera, the 3rd panoramic all-around camera, … and the A-th panoramic all-around camera;
a panoramic all-around display screen fixing mounting seat for fixedly mounting a panoramic all-around display screen is arranged in the vehicle cab to be tested, and the panoramic all-around display screen is fixedly mounted on the panoramic all-around display screen fixing mounting seat;
a panoramic all-around controller is arranged in the vehicle to be tested. The panoramic all-around image data output end of the a-th panoramic all-around camera is connected with panoramic all-around image data input end a of the panoramic all-around controller, where a is a positive integer less than or equal to A; that is, the output end of the 1st camera is connected with input end 1, the output end of the 2nd camera with input end 2, the output end of the 3rd camera with input end 3, …, and the output end of the A-th camera with input end A. The display data output end of the panoramic all-around controller is connected with the display data input end of the panoramic all-around display screen;
the panoramic all-around vision controller carries out brightness consistency detection on the spliced panoramic all-around vision image according to the panoramic all-around vision image data collected by the panoramic all-around vision camera arranged on the vehicle body to be tested; the brightness consistency detection method comprises the following steps:
S4-A, acquiring spliced images, and carrying out gray processing on the spliced images to obtain gray images;
S4-B, carrying out image denoising processing on the gray level image;
S4-C, dividing the acquired gray level image into equal-size areas according to the size of the image;
S4-D, converting the color space of the regions into an LAB color space, extracting L brightness components in the color space, calculating the average value of the L components of each region, and performing difference operation on the average values of all the regions, wherein if the difference value is greater than or equal to a preset difference threshold value, the brightness is inconsistent; otherwise the brightness is consistent.
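The steps above can be sketched as a minimal check, assuming equal-size rectangular regions and a pairwise comparison of region means. The function name, tile grid and threshold are illustrative, and the LAB conversion of step S4-D is approximated here by comparing grayscale tile means directly, to keep the sketch dependency-free:

```python
import numpy as np

# Minimal sketch of steps S4-C/S4-D: split the image into equal-size
# regions, average the lightness of each region, and flag inconsistency
# when any two region means differ by at least a preset threshold.
def brightness_consistent(gray, rows=2, cols=2, threshold=10.0):
    h, w = gray.shape
    th, tw = h // rows, w // cols  # tile height and width (S4-C)
    means = [gray[r * th:(r + 1) * th, c * tw:(c + 1) * tw].mean()
             for r in range(rows) for c in range(cols)]
    # the largest pairwise difference of region means is max - min (S4-D)
    return max(means) - min(means) < threshold
```

The pairwise "difference operation on the average values of all the regions" reduces to comparing the largest and smallest region means, since every other pair differs by less.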
In a preferred embodiment of the present invention, in step S4-A, the grayscale image is computed as:
F(i,j) = Red(i,j) × Re + Green(i,j) × Gr + Blue(i,j) × Bl,
wherein Red(i,j), Green(i,j) and Blue(i,j) represent the red, green and blue color components at image pixel (i, j); Re, Gr and Bl represent the corresponding fusion scale factors, with Re + Gr + Bl = 1.
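The weighted fusion above can be sketched as follows. The function name and the BT.601-style default weights (0.299, 0.587, 0.114) are assumptions for illustration, since the patent only requires that the three fusion scale factors sum to 1:

```python
import numpy as np

# Sketch of step S4-A: weighted grayscale fusion
#   F(i,j) = Red(i,j)*Re + Green(i,j)*Gr + Blue(i,j)*Bl,  Re + Gr + Bl = 1.
def to_grayscale(rgb, Re=0.299, Gr=0.587, Bl=0.114):
    assert abs(Re + Gr + Bl - 1.0) < 1e-6, "fusion factors must sum to 1"
    # weighted sum over the last (channel) axis of an H x W x 3 array
    return Re * rgb[..., 0] + Gr * rgb[..., 1] + Bl * rgb[..., 2]
```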
In a preferred embodiment of the present invention, in step S4-B, the image denoising of the grayscale image comprises the following steps:
S0-21, identifying the noise points among the image pixels. Whether pixel (i, j) is a noise point is determined by:
G(i,j) = 0, if F(i,j) ≥ A_max or F(i,j) ≤ A_min;
G(i,j) = 1, otherwise;
wherein F(i,j) represents the gray value at image pixel (i, j), i.e. the gray value of the pixel in row i, column j of the image; G(i,j) indicates whether pixel (i, j) is a noise point: G(i,j) = 0 means it is a noise point, and G(i,j) = 1 means it is not; "or" denotes the logical OR; A_max and A_min represent the maximum and minimum gray values within the p × p window centered at pixel (i, j), where p = 2p′ + 1 and p′ is a positive integer greater than or equal to 1 and less than or equal to 3;
S0-22, arranging the pixels in the window into a matrix, denoted G′;
S0-23, sorting the gray values in the matrix G′ from small to large and updating the gray value of each noise-point pixel: the updated gray value is taken as the average of two rank-ordered gray values near the middle of the sorted sequence, i.e. a median-style estimate of the local gray level.
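A minimal sketch of steps S0-21 to S0-23, under the assumption that a pixel counts as noise when its gray value lies strictly outside the range of its p × p neighbours, and with the window median used as the replacement value (the patent's exact rank indices are given in unrendered equation figures, so the median stands in for them); the function name is illustrative:

```python
import numpy as np

def denoise(F, p=3):
    """Min-max noise detection (S0-21) and median-style replacement (S0-23)."""
    h, w = F.shape
    r = p // 2  # window radius, p = 2p' + 1
    out = F.astype(float).copy()
    for i in range(r, h - r):
        for j in range(r, w - r):
            win = F[i - r:i + r + 1, j - r:j + r + 1].astype(float).ravel()
            neighbours = np.delete(win, win.size // 2)  # window G' minus centre
            # G(i,j) = 0 (noise) when the centre lies outside the neighbour range
            if F[i, j] > neighbours.max() or F[i, j] < neighbours.min():
                out[i, j] = float(np.median(np.sort(win)))  # sorted-window estimate
    return out
```

Applied to an image with one salt-noise spike, the spike is pulled back to the local gray level while uniform neighbourhoods are left untouched.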
In conclusion, by adopting the above technical scheme, brightness consistency detection and analysis can be performed on the spliced image.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic block diagram of the process of the present invention.
Fig. 2 is a schematic circuit connection diagram of a second power supply module, a third power supply module, a fourth power supply module and a fifth power supply module according to the present invention.
Fig. 3 is a schematic circuit connection diagram of the first power supply module, the sixth power supply module and the seventh power supply module according to the present invention.
Fig. 4 is a schematic circuit connection diagram of an eighth power supply module, a ninth power supply module and a fault detection module according to the present invention.
Fig. 5 is a schematic circuit diagram of the panoramic all-round view acquisition module of the present invention.
Fig. 6 is a schematic diagram of the controller module circuit connection of the present invention.
FIG. 7 is a schematic circuit diagram of a storage module according to the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
The invention provides a method for realizing all-around image safety monitoring by arranging an intelligent camera on a commercial vehicle. The method involves a vehicle to be tested, on whose body A panoramic all-around cameras are arranged, A being a positive integer greater than or equal to 1; these are the 1st panoramic all-around camera, the 2nd panoramic all-around camera, the 3rd panoramic all-around camera, … and the A-th panoramic all-around camera;
a panoramic all-around display screen fixing mounting seat for fixedly mounting a panoramic all-around display screen is arranged in the vehicle cab to be tested, and the panoramic all-around display screen is fixedly mounted on the panoramic all-around display screen fixing mounting seat;
a panoramic all-around controller is arranged in the vehicle to be tested. The panoramic all-around image data output end of the a-th panoramic all-around camera is connected with panoramic all-around image data input end a of the panoramic all-around controller, where a is a positive integer less than or equal to A; that is, the output end of the 1st camera is connected with input end 1, the output end of the 2nd camera with input end 2, the output end of the 3rd camera with input end 3, …, and the output end of the A-th camera with input end A. The display data output end of the panoramic all-around controller is connected with the display data input end of the panoramic all-around display screen;
the panoramic all-around vision controller carries out brightness consistency detection on the spliced panoramic all-around vision image according to the panoramic all-around vision image data collected by the panoramic all-around vision camera arranged on the vehicle body to be tested; the brightness consistency detection method comprises the following steps:
As shown in fig. 1: S4-A, acquiring a spliced image, and carrying out graying processing on the spliced image to obtain a grayscale image;
S4-B, carrying out image denoising processing on the gray level image;
S4-C, dividing the acquired gray level image into equal-size areas according to the size of the image;
S4-D, converting the color space of the regions into an LAB color space, extracting L brightness components in the color space, calculating the average value of the L components of each region, and performing difference operation on the average values of all the regions, wherein if the difference value is greater than or equal to a preset difference threshold value, the brightness is inconsistent; otherwise the brightness is consistent.
In a preferred embodiment of the present invention, in step S4-A, the grayscale image is computed as:
F(i,j) = Red(i,j) × Re + Green(i,j) × Gr + Blue(i,j) × Bl,
wherein Red(i,j), Green(i,j) and Blue(i,j) represent the red, green and blue color components at image pixel (i, j); Re, Gr and Bl represent the corresponding fusion scale factors, with Re + Gr + Bl = 1.
In a preferred embodiment of the present invention, in step S4-B, the image denoising of the grayscale image comprises the following steps:
S0-21, identifying the noise points among the image pixels. Whether pixel (i, j) is a noise point is determined by:
G(i,j) = 0, if F(i,j) ≥ A_max or F(i,j) ≤ A_min;
G(i,j) = 1, otherwise;
wherein F(i,j) represents the gray value at image pixel (i, j), i.e. the gray value of the pixel in row i, column j of the image; G(i,j) indicates whether pixel (i, j) is a noise point: G(i,j) = 0 means it is a noise point, and G(i,j) = 1 means it is not; "or" denotes the logical OR; A_max and A_min represent the maximum and minimum gray values within the p × p window centered at pixel (i, j), where p = 2p′ + 1 and p′ is a positive integer greater than or equal to 1 and less than or equal to 3;
S0-22, arranging the pixels in the window into a matrix, denoted

G′ = [ G′(1,1)  G′(1,2)  G′(1,3)  …  G′(1,p)
       G′(2,1)  G′(2,2)  G′(2,3)  …  G′(2,p)
       G′(3,1)  G′(3,2)  G′(3,3)  …  G′(3,p)
         ⋮        ⋮        ⋮      ⋱    ⋮
       G′(p,1)  G′(p,2)  G′(p,3)  …  G′(p,p) ],

wherein G′(r,c) represents the pixel in row r, column c of the matrix G′;
S0-23, sorting the gray values in the matrix G′ from small to large and updating the gray value of each noise-point pixel: the updated gray value is taken as the average of two rank-ordered gray values near the middle of the sorted sequence, i.e. a median-style estimate of the local gray level.
The vehicle panoramic all-around image test system of the invention can be as follows: the test system comprises a vehicle to be tested, on whose body A panoramic all-around cameras are arranged, A being a positive integer greater than or equal to 1; these are the 1st panoramic all-around camera, the 2nd panoramic all-around camera, the 3rd panoramic all-around camera, … and the A-th panoramic all-around camera;
each panoramic all-around camera (the a-th camera, a being a positive integer less than or equal to A) comprises a panoramic all-around camera shooting module, a panoramic all-around camera controller and a wireless Bluetooth data transmission module. In each camera, the panoramic all-around image data output end of the shooting module is connected with the panoramic all-around image data input end of the camera controller, and the panoramic all-around image data transmission end of the controller is connected with the panoramic all-around image data transmission end of the wireless Bluetooth data transmission module; this holds for the 1st, 2nd, 3rd, … and A-th panoramic all-around cameras alike;
a panoramic all-around display screen fixing mounting seat for fixedly mounting a panoramic all-around display screen is arranged in the vehicle cab to be tested, and the panoramic all-around display screen is fixedly mounted on the panoramic all-around display screen fixing mounting seat;
the panoramic all-around display screen comprises a panoramic all-around display module, a panoramic all-around display controller and a wireless Bluetooth data transmission module; a panoramic all-around image data transmission end of the wireless Bluetooth data transmission module is connected with a panoramic all-around image data transmission end of the panoramic all-around display controller, and a panoramic all-around image data output end of the panoramic all-around display controller is connected with a panoramic all-around image data input end of the panoramic all-around display module;
the a-th panoramic all-around camera transmits the panoramic all-around image data shot by its shooting module to the panoramic all-around display screen through the wireless Bluetooth data transmission module; the panoramic all-around display screen receives the panoramic all-around image data sent by the a-th panoramic all-around camera through its wireless Bluetooth data transmission module and displays the spliced panoramic all-around image data on the panoramic all-around display module. Brightness consistency detection is then performed on the spliced panoramic all-around image; the brightness consistency detection method comprises the following steps:
S4-A, acquiring spliced images, and carrying out gray processing on the spliced images to obtain gray images;
S4-B, carrying out image denoising processing on the gray level image;
S4-C, dividing the acquired gray level image into equal-size areas according to the size of the image;
S4-D, converting the color space of the regions into an LAB color space, extracting L brightness components in the color space, calculating the average value of the L components of each region, and performing difference operation on the average values of all the regions, wherein if the difference value is greater than or equal to a preset difference threshold value, the brightness is inconsistent; otherwise the brightness is consistent.
In a preferred embodiment of the present invention, in step S4-A, the grayscale image is computed as:
F(i,j) = Red(i,j) × Re + Green(i,j) × Gr + Blue(i,j) × Bl,
wherein Red(i,j), Green(i,j) and Blue(i,j) represent the red, green and blue color components at image pixel (i, j); Re, Gr and Bl represent the corresponding fusion scale factors, with Re + Gr + Bl = 1.
In a preferred embodiment of the present invention, in step S4-B, the method for performing image denoising processing on a grayscale image includes the following steps:
s0-21, finding out the noise point of the image pixel point, wherein the calculation method of the noise point of the pixel point in the image is as follows:
Figure BDA0003330604040000061
wherein ,F(i,j)Representing the gray value at the image pixel (i, j); that is, the gray value of the pixel point in the ith row and the jth column of the image;
G(i,j)indicating whether a pixel point (i, j) in the image is a noise point, G(i,j)When the pixel point (i, j) of the image is a noise point, G, is 0(i,j)1 represents that the image pixel point (i, j) is a non-noise point;
or represents a logical condition or;
amaxrepresenting that p multiplied by p with the image pixel point (i, j) as the center is the maximum mean value of the gray levels in the window; p is 2p '+ 1, p' is a positive integer greater than or equal to 1 and less than or equal to 3;
aminrepresenting that p multiplied by p with the image pixel point (i, j) as the center is the minimum mean value of the gray level in the window;
s0-22, numbering the pixel points in the window in matrix, and recording the number as
Figure BDA0003330604040000062
wherein ,G(1,1)'represents a pixel point in the 1 st row and 1 st column in the matrix G'; g(1,2)'represents a pixel point in the 1 st row and 2 nd column in the matrix G'; g(1,3)'represents a pixel point in the 1 st row and 3 rd column of the matrix G'; g(1,p)'represents the pixel point in the 1 st row and the p th column in the matrix G'; g(2,1)' is represented in matrixG' is the pixel point of the 2 nd row and the 1 st column; g(2,2)'denotes a pixel in row 2 and column 2 in the matrix G'; g(2,3)'denotes a pixel point in row 2 and column 3 in the matrix G'; g(2,p)'denotes a pixel point in the 2 nd row and p th column in the matrix G'; g(3,1)'denotes a pixel point in row 3 and column 1 in the matrix G'; g(3,2)'denotes a pixel point in row 3 and column 2 in the matrix G'; g(3,3)'represents a pixel point in row 3 and column 3 in the matrix G'; g(3,p)'denotes a pixel point in the p-th column of row 3 in the matrix G'; g(p,1)'denotes a pixel point at the 1 st row in the matrix G'; g(p,2)'denotes a pixel point at the p-th row and 2 nd column in the matrix G'; g(p,3)'denotes a pixel point at the p-th row and 3 rd column in the matrix G'; g(p,p)'represents the pixel point in the p row and p column in the matrix G';
S0-23, arrange the gray values in the matrix G′ from small to large and update the gray value of the pixel at each noise point; the gray value of the pixel at a noise point is computed as:

F̂(i,j) = ( G′_(int(p²/2)) + G′_(int(p²/2)+1) ) / 2,

where F̂(i,j) denotes the updated gray value of the pixel at the noise point; int() denotes the rounding function; G′_(int(p²/2)) denotes the gray value ranked int(p²/2)-th in the ascending ordering; G′_(int(p²/2)+1) denotes the gray value ranked (int(p²/2)+1)-th in the ascending ordering.
The vehicle panoramic looking-around image test system of the invention may also be configured as follows: the test system comprises a vehicle to be tested, on whose body A panoramic looking-around cameras are arranged, A being a positive integer greater than or equal to 1; these are the 1st panoramic looking-around camera, the 2nd panoramic looking-around camera, the 3rd panoramic looking-around camera, ……, and the A-th panoramic looking-around camera;
a fixed mounting seat for the panoramic looking-around display screen is arranged in the cab of the vehicle to be tested, and the panoramic looking-around display screen is fixedly mounted on this mounting seat;
a panoramic looking-around controller and a WiFi wireless data transmission module are arranged in the vehicle to be tested; for each a, a being a positive integer less than or equal to A, the panoramic looking-around image data output of the a-th panoramic looking-around camera is connected with panoramic looking-around image data input a of the panoramic looking-around controller; the display data output of the panoramic looking-around controller is connected with the display data input of the panoramic looking-around display screen; the WiFi data terminal of the panoramic looking-around controller is connected with the data terminal of the WiFi wireless data transmission module;
the panoramic looking-around controller sends the panoramic looking-around image data collected by the panoramic looking-around cameras arranged on the body of the vehicle to be tested to the processing terminal via the WiFi wireless data transmission module, and the processing terminal displays the processed panoramic looking-around image on the panoramic looking-around display screen. Brightness consistency detection is performed on the panoramic looking-around image displayed on the panoramic looking-around display screen; the brightness consistency detection method comprises the following steps:
S4-A, acquiring spliced images, and carrying out gray processing on the spliced images to obtain gray images;
S4-B, carrying out image denoising processing on the gray level image;
S4-C, dividing the acquired gray level image into equal-size areas according to the size of the image;
S4-D, converting each region to the LAB color space, extracting the L (lightness) component, computing the mean of the L component of each region, and performing difference operations on the means of all the regions; if any difference is greater than or equal to a preset difference threshold, the brightness is inconsistent; otherwise the brightness is consistent.
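The steps S4-A to S4-D above can be sketched as follows. This is a minimal NumPy illustration, not the patented implementation: the sRGB-to-L* conversion and the region grid are our stand-ins for steps S4-A and S4-D, and the denoising of step S4-B is omitted for brevity.

```python
import numpy as np

def lightness_L(rgb):
    """Approximate CIE L* (the L component of LAB) from sRGB in [0, 1], D65 white."""
    c = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    y = 0.2126 * c[..., 0] + 0.7152 * c[..., 1] + 0.0722 * c[..., 2]
    f = np.where(y > (6 / 29) ** 3, np.cbrt(y), y / (3 * (6 / 29) ** 2) + 4 / 29)
    return 116.0 * f - 16.0

def brightness_consistent(rgb, rows, cols, threshold):
    """Split the image into rows x cols equal regions; brightness is consistent
    when every pairwise difference of region L means stays below the threshold."""
    L = lightness_L(rgb)
    h, w = L.shape
    means = [L[i * h // rows:(i + 1) * h // rows,
               j * w // cols:(j + 1) * w // cols].mean()
             for i in range(rows) for j in range(cols)]
    return max(means) - min(means) < threshold
```

A uniformly lit frame passes the check, while a frame whose halves come from differently exposed cameras fails it.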
In a preferred embodiment of the present invention, in step S4-A, the grayscale image is computed as:

F(i,j) = Red(i,j) × Re + Green(i,j) × Gr + Blue(i,j) × Bl,

where Red(i,j) denotes the red component at image pixel (i, j);
Re denotes the fusion scale factor of the red component, with Re + Gr + Bl = 1;
Green(i,j) denotes the green component at image pixel (i, j);
Gr denotes the fusion scale factor of the green component;
Blue(i,j) denotes the blue component at image pixel (i, j);
Bl denotes the fusion scale factor of the blue component.
In a preferred embodiment of the present invention, in step S4-B, the method for performing image denoising processing on a grayscale image includes the following steps:
S0-21, find the noise points among the image pixels; the noise point decision for a pixel in the image is computed as:

G(i,j) = 0, if F(i,j) ≥ a_max or F(i,j) ≤ a_min;  G(i,j) = 1, otherwise,

where F(i,j) denotes the gray value at image pixel (i, j), i.e., the gray value of the pixel in row i, column j of the image;
G(i,j) indicates whether pixel (i, j) of the image is a noise point: G(i,j) = 0 means pixel (i, j) is a noise point, and G(i,j) = 1 means pixel (i, j) is a non-noise point;
or denotes the logical OR condition;
a_max denotes the maximum gray-level mean within the p × p window centered on image pixel (i, j), with p = 2p′ + 1, p′ being a positive integer greater than or equal to 1 and less than or equal to 3;
a_min denotes the minimum gray-level mean within the p × p window centered on image pixel (i, j);
S0-22, number the pixels in the window as a matrix, denoted

G′ = [ G′(1,1)  G′(1,2)  G′(1,3)  …  G′(1,p) ;
       G′(2,1)  G′(2,2)  G′(2,3)  …  G′(2,p) ;
       G′(3,1)  G′(3,2)  G′(3,3)  …  G′(3,p) ;
       …
       G′(p,1)  G′(p,2)  G′(p,3)  …  G′(p,p) ],

where G′(i,j) denotes the pixel in row i, column j of the matrix G′ (i, j = 1, 2, 3, …, p);
S0-23, arrange the gray values in the matrix G′ from small to large and update the gray value of the pixel at each noise point; the gray value of the pixel at a noise point is computed as:

F̂(i,j) = ( G′_(int(p²/2)) + G′_(int(p²/2)+1) ) / 2,

where F̂(i,j) denotes the updated gray value of the pixel at the noise point; int() denotes the rounding function; G′_(int(p²/2)) denotes the gray value ranked int(p²/2)-th in the ascending ordering; G′_(int(p²/2)+1) denotes the gray value ranked (int(p²/2)+1)-th in the ascending ordering.
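Steps S0-21 to S0-23 amount to an extreme-value noise detector followed by a rank-order replacement. The sketch below is a simplified NumPy rendering under one stated assumption: the plain window maximum and minimum stand in for the thresholded window means a_max and a_min defined later in the text.

```python
import numpy as np

def denoise(F, p=3):
    """Flag pixels that hit the extremes of their p x p window (G(i,j) = 0)
    and replace them by the mean of the two middle-ranked window values."""
    assert p % 2 == 1, "p = 2p' + 1 must be odd"
    H, W = F.shape
    r = p // 2
    out = F.astype(float).copy()
    pad = np.pad(F.astype(float), r, mode='edge')
    k = (p * p) // 2                     # int(p^2 / 2), with 1-based ranks
    for i in range(H):
        for j in range(W):
            win = pad[i:i + p, j:j + p]
            # simplified a_max / a_min: window extremes replace thresholded means
            if F[i, j] >= win.max() or F[i, j] <= win.min():
                s = np.sort(win, axis=None)
                out[i, j] = (s[k - 1] + s[k]) / 2.0
    return out
```

A single salt pixel in a flat region is pulled back to the background level by the rank-order replacement.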
In the embodiment, the device further comprises a calibration device, wherein the calibration device comprises a transverse support plate, a rail capable of sliding linearly is arranged on the transverse support plate, a moving device is arranged on the rail, a vertical support rod is arranged on the moving device, a calibration target plate is arranged on the vertical support rod, and black and white color grids are arranged on any one surface or two surfaces of the calibration target plate;
the moving device comprises a pulley capable of sliding on the rail and a load arranged on the pulley, wherein the load comprises a box body, a driving module, a calibration controller and a wireless Bluetooth transmission unit, wherein the driving module is arranged in the box body and drives the pulley to slide;
the driving end of the calibration controller is connected with the control end of the driving module, and the wireless data end of the calibration controller is connected with the wireless data end of the wireless Bluetooth transmission unit; a panoramic all-around Bluetooth transmission module is correspondingly arranged on the vehicle to be tested, and a panoramic all-around image data transmission end of the panoramic all-around controller is connected with a panoramic all-around image data transmission end of the panoramic all-around Bluetooth transmission module;
the calibration device is arranged beside the panoramic all-around camera, so that the panoramic all-around camera is opposite to any surface of the calibration target plate, and the calibration controller adjusts the distance between the panoramic all-around camera and the calibration target plate according to a control signal sent by the panoramic all-around controller.
In a preferred embodiment of the present invention, 4 panoramic looking-around cameras are arranged on the body of the vehicle to be tested, which are respectively a1 st panoramic looking-around camera, a2 nd panoramic looking-around camera, a3 rd panoramic looking-around camera and a4 th panoramic looking-around camera;
a1 st panoramic all-around camera fixed mounting seat for fixedly mounting a1 st panoramic all-around camera is arranged in the middle of the head of the vehicle to be tested, and the 1 st panoramic all-around camera is fixedly mounted on the 1 st panoramic all-around camera fixed mounting seat; a2 nd panoramic looking around camera fixing mounting seat for fixedly mounting a2 nd panoramic looking around camera is arranged in the middle of the tail of the vehicle to be tested, and the 2 nd panoramic looking around camera is fixedly mounted on the 2 nd panoramic looking around camera fixing mounting seat; a3 rd panoramic all-around camera fixed mounting seat for fixedly mounting a3 rd panoramic all-around camera is arranged in the middle of the left side of the vehicle body of the vehicle to be tested, and the 3 rd panoramic all-around camera is fixedly mounted on the 3 rd panoramic all-around camera fixed mounting seat; a4 th panoramic looking around camera fixing mounting seat for fixedly mounting a4 th panoramic looking around camera is arranged in the middle of the right side of the vehicle body of the vehicle to be tested, and the 4 th panoramic looking around camera is fixedly mounted on the 4 th panoramic looking around camera fixing mounting seat;
the panoramic all-round looking image data output end of the 1 st panoramic all-round looking camera is connected with the 1 st end of the panoramic all-round looking image data input of the panoramic all-round looking controller, the panoramic all-round looking image data output end of the 2 nd panoramic all-round looking camera is connected with the 2 nd end of the panoramic all-round looking image data input of the panoramic all-round looking controller, the panoramic all-round looking image data output end of the 3 rd panoramic all-round looking camera is connected with the 3 rd end of the panoramic all-round looking image data input of the panoramic all-round looking controller, and the panoramic all-round looking image data output end of the 4 th panoramic all-round looking camera is connected with the 4 th end of the panoramic all-round looking image data input of the panoramic all-round looking controller.
The invention also discloses a method for testing the panoramic all-round looking image of the vehicle, which comprises the following steps:
s0, correcting the lens of the panoramic all-round looking camera;
s1, acquiring panoramic view image data shot by the panoramic view camera;
s2, carrying out image noise filtering and updating on the shot panoramic all-around image data;
s3, carrying out matching calibration according to the images shot at adjacent moments;
and S4, extracting image feature points in the images to realize the splicing of the panoramic all-around images.
In a preferred embodiment of the present invention, in step S0, the method for correcting the lens of the panoramic looking-around camera includes the steps of:
s01, acquiring the relation between the image pixel and the environment coordinate system of the vehicle to be tested;
and S02, correcting the lens of the panoramic all-round looking camera according to the shot calibration target plate.
In a preferred embodiment of the present invention, in step S01, the method for calculating the relationship between the image pixels and the coordinate system of the environment where the vehicle to be tested is located includes the following steps:
S011, acquire the relation between image pixels and image size; the relation between image pixels and image size is:

(U, V, 1)^T = [ 1/dx  0  U0 ; 0  1/dy  V0 ; 0  0  1 ] · (x, y, 1)^T,

where (x, y, 1)^T denotes the homogeneous image coordinates in units of millimeters (mm); T denotes transposition;
(U0, V0)^T denotes the principal-point coordinates of the image;
dx denotes the pixel size of the panoramic looking-around camera in the x-axis direction;
dy denotes the pixel size of the panoramic looking-around camera in the y-axis direction;
(U, V, 1)^T denotes the homogeneous image coordinates in units of pixels;
S012, acquire the relation between the panoramic looking-around camera coordinate system and the image size; the relation is:

s · (x, y, 1)^T = [ f  0  0 ; 0  f  0 ; 0  0  1 ] · (X, Y, Z)^T,

where (X, Y, Z)^T denotes coordinates in the panoramic looking-around camera coordinate system; T denotes transposition;
(x, y, 1)^T denotes the homogeneous image coordinates in units of millimeters (mm);
s denotes the adjustment parameter of the panoramic looking-around camera;
f denotes the focal length of the panoramic looking-around camera;
S013, from the relation between image pixels and image size in step S011 and the relation between the camera coordinate system and image size in step S012, obtain the relation between image pixels and the panoramic looking-around camera coordinate system: substituting

(U, V, 1)^T = [ 1/dx  0  U0 ; 0  1/dy  V0 ; 0  0  1 ] · (x, y, 1)^T

into

s · (x, y, 1)^T = [ f  0  0 ; 0  f  0 ; 0  0  1 ] · (X, Y, Z)^T

gives:

s · (U, V, 1)^T = [ f/dx  0  U0 ; 0  f/dy  V0 ; 0  0  1 ] · (X, Y, Z)^T;
S014, acquire the relation between the panoramic looking-around camera coordinate system and the coordinate system of the environment where the vehicle to be tested is located; the relation is:

(X, Y, Z)^T = R · (X0, Y0, Z0)^T + M,

where (X0, Y0, Z0)^T denotes coordinates in the environment coordinate system where the vehicle to be tested is located; T denotes transposition;
R denotes the coordinate-system rotation matrix of the panoramic looking-around camera;
(X, Y, Z)^T denotes coordinates in the panoramic looking-around camera coordinate system;
M denotes the coordinate-system translation matrix of the panoramic looking-around camera;
S015, from the relation between image pixels and the panoramic looking-around camera coordinate system in step S013 and the relation between the camera coordinate system and the environment coordinate system in step S014, obtain the relation between image pixels and the environment coordinate system where the vehicle to be tested is located: substituting

(X, Y, Z)^T = R · (X0, Y0, Z0)^T + M

into

s · (U, V, 1)^T = [ f/dx  0  U0 ; 0  f/dy  V0 ; 0  0  1 ] · (X, Y, Z)^T

gives:

s · (U, V, 1)^T = [ f/dx  0  U0 ; 0  f/dy  V0 ; 0  0  1 ] · ( R · (X0, Y0, Z0)^T + M ),

where (X0, Y0, Z0)^T denotes coordinates in the environment coordinate system where the vehicle to be tested is located; T denotes transposition;
R denotes the coordinate-system rotation matrix of the panoramic looking-around camera;
s denotes the adjustment parameter of the panoramic looking-around camera;
f denotes the focal length of the panoramic looking-around camera;
dx denotes the pixel size of the panoramic looking-around camera in the x-axis direction;
dy denotes the pixel size of the panoramic looking-around camera in the y-axis direction;
(U0, V0)^T denotes the principal-point coordinates of the image;
M denotes the coordinate-system translation matrix of the panoramic looking-around camera.
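The chain of relations in steps S011 to S015 composes the intrinsic mapping (f, dx, dy, U0, V0) with the extrinsic transform (R, M). A minimal numeric sketch, with all parameter values illustrative rather than taken from the patent:

```python
import numpy as np

# Illustrative parameters (not values from the patent)
dx = dy = 0.01         # pixel size in mm
f = 4.0                # focal length in mm
U0, V0 = 320.0, 240.0  # principal point in pixels
R = np.eye(3)          # rotation: camera axes aligned with the environment
M = np.array([0.0, 0.0, 100.0])  # translation: scene 100 mm ahead of the camera

def world_to_pixel(Xw):
    """s*(U, V, 1)^T = K(R*Xw + M): project an environment point to pixels."""
    Xc = R @ Xw + M            # S014: environment -> camera coordinates
    x = f * Xc[0] / Xc[2]      # S012: perspective projection to mm
    y = f * Xc[1] / Xc[2]
    return x / dx + U0, y / dy + V0  # S011: mm -> pixels

# a point on the optical axis projects to the principal point (U0, V0)
U, V = world_to_pixel(np.array([0.0, 0.0, 0.0]))
```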
In a preferred embodiment of the present invention, in step S012, the panoramic looking-around camera adjustment parameter s is computed as follows:

S0121, take any three points (X1, Y1, Z1), (X2, Y2, Z2) and (X3, Y3, Z3) in the environment coordinate system where the vehicle to be tested is located, and determine the system of plane equations:

a1·Xk + a2·Yk + a3·Zk + a4 = 0,  k = 1, 2, 3,

where (X1, Y1, Z1) ≠ (X2, Y2, Z2) ≠ (X3, Y3, Z3); solving the system gives

a1 = (Y2 − Y1)(Z3 − Z1) − (Z2 − Z1)(Y3 − Y1),
a2 = (Z2 − Z1)(X3 − X1) − (X2 − X1)(Z3 − Z1),
a3 = (X2 − X1)(Y3 − Y1) − (Y2 − Y1)(X3 − X1),
a4 = −(a1·X1 + a2·Y1 + a3·Z1),

where a1 denotes the coefficient of the X-axis in the environment coordinate system;
a2 denotes the coefficient of the Y-axis in the environment coordinate system;
a3 denotes the coefficient of the Z-axis in the environment coordinate system;
a4 denotes the offset coefficient in the environment coordinate system;
S0122, obtain the plane equation from the system of plane equations: substituting the coefficients a1, a2, a3 and a4 obtained in step S0121 into

a1·X + a2·Y + a3·Z + a4 = 0

gives the equation of the plane passing through the three points;
S0123, obtain the panoramic looking-around camera adjustment parameter from the plane equation: substituting

(X, Y, Z)^T = s · [ dx/f  0  −U0·dx/f ; 0  dy/f  −V0·dy/f ; 0  0  1 ] · (U, V, 1)^T

into

a1·X + a2·Y + a3·Z + a4 = 0

gives:

s = −a4 / ( a1·dx·(U − U0)/f + a2·dy·(V − V0)/f + a3 ),

where (X1, Y1, Z1), (X2, Y2, Z2) and (X3, Y3, Z3) are any three points in the environment coordinate system where the vehicle to be tested is located;
s denotes the adjustment parameter of the panoramic looking-around camera;
f denotes the focal length of the panoramic looking-around camera;
dx denotes the pixel size of the panoramic looking-around camera in the x-axis direction;
dy denotes the pixel size of the panoramic looking-around camera in the y-axis direction.
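The plane coefficients of step S0121 are, up to scale, the components of the normal obtained from two in-plane difference vectors. A small NumPy sketch (the function name is ours, not the patent's):

```python
import numpy as np

def plane_coeffs(P1, P2, P3):
    """Coefficients (a1, a2, a3, a4) of a1*X + a2*Y + a3*Z + a4 = 0 for the
    plane through three non-collinear points."""
    n = np.cross(P2 - P1, P3 - P1)   # normal = cross of two in-plane vectors
    a4 = -float(n @ P1)              # offset so that P1 satisfies the equation
    return float(n[0]), float(n[1]), float(n[2]), a4

# the plane Z = 1 through three of its points -> (0, 0, 1, -1) up to scale
coeffs = plane_coeffs(np.array([0., 0., 1.]),
                      np.array([1., 0., 1.]),
                      np.array([0., 1., 1.]))
```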
In a preferred embodiment of the present invention, in step S02, the correction of the panoramic looking-around camera lens according to the photographed calibration target plate is computed as follows:

S021, obtain the mapping relation between the calibration target plate and the panoramic looking-around camera; the mapping relation is:

s × O = D [R M] o, with

D = [ fx  ft  U0 ; 0  fy  V0 ; 0  0  1 ],

where D denotes the internal parameter matrix of the panoramic looking-around camera;
fx denotes the scale factor of the panoramic looking-around camera pixels in the x-axis direction;
fy denotes the scale factor of the panoramic looking-around camera pixels in the y-axis direction;
ft denotes the skew coefficient accounting for the x-axis not being perpendicular to the y-axis;
(U0, V0)^T denotes the principal-point coordinates of the image;
[r1 r2 r3 m] = [R M] denotes the rotation matrix R and translation matrix M of the panoramic looking-around camera coordinate system;
r1 denotes the 1st column of the coordinate-system rotation matrix R of the panoramic looking-around camera;
r2 denotes the 2nd column of the coordinate-system rotation matrix R of the panoramic looking-around camera;
r3 denotes the 3rd column of the coordinate-system rotation matrix R of the panoramic looking-around camera;
m denotes the translation vector of the coordinate-system translation matrix M of the panoramic looking-around camera;
o = (X0, Y0, 0, 1)^T denotes the homogeneous coordinates on the calibration target plate (the Z-axis coordinate is 0);
O = (u′, v′, 1)^T denotes the homogeneous coordinates of the image;
S022, obtain the transformation matrix between the calibration target plate and the panoramic looking-around camera; since the Z-axis coordinate of the target plate is 0, the mapping reduces to:

s × O = h o, with h = [h1 h2 h3] = λ D [r1 r2 m],

where h denotes the transformation matrix;
λ denotes the conversion parameter;
h1 denotes column 1 of the transformation matrix h;
h2 denotes column 2 of the transformation matrix h;
h3 denotes column 3 of the transformation matrix h;
S023, compute the transformation matrix h using the objective function

min Σi ||Oi − Ôi||₂²,

where min denotes taking the minimum value;
|| ||₂ denotes the 2-norm;
Oi denotes the i-th column of the homogeneous coordinates of the image;
Ôi denotes the i-th column of the computed homogeneous coordinates of the image;
S024, from the transformation matrix h, and using the constraints that r1 and r2 are orthonormal, obtain:

h1^T D^(−T) D^(−1) h2 = 0 and h1^T D^(−T) D^(−1) h1 = h2^T D^(−T) D^(−1) h2,

where D^(−T) denotes the inverse matrix of D^T;
D^(−1) denotes the inverse matrix of D.

Let

L = D^(−T) D^(−1) = [ L11  L12  L13 ; L21  L22  L23 ; L31  L32  L33 ],

where Lij denotes the element in row i, column j of the matrix L (i, j = 1, 2, 3);
then h_i^T L h_j = c_ij^T l,

where c_ij = [h_i1·h_j1, h_i1·h_j2 + h_i2·h_j1, h_i2·h_j2, h_i3·h_j1 + h_i1·h_j3, h_i3·h_j2 + h_i2·h_j3, h_i3·h_j3]^T;
h_ik (k = 1, 2, 3) denotes the k-th element of the i-th column h_i of the transformation matrix h, and h_jk the k-th element of the j-th column h_j;
c_ij denotes the constraint vector formed from columns i and j of h;
l denotes the parameter vector formed from the independent elements of the matrix L;

rewriting h_i^T L h_j = c_ij^T l according to the two constraints of step S024 gives

[ c_12^T ; (c_11 − c_22)^T ] l = 0;

combining g such systems of equations, g being a positive integer greater than or equal to 6 and less than or equal to 8, gives C l = 0, where the rows of the matrix C are the stacked constraint vectors; the matrix L can then be solved, and all internal parameters of the panoramic looking-around camera are calculated by optimized search.
S025, after the matrix L is obtained, the internal parameter matrix D can be recovered from L = D^(−T) D^(−1), and the corrected rotation matrix R and translation matrix M of the panoramic looking-around camera coordinate system can be obtained:

λ = 1 / ||D^(−1) h1||,
r1 = λ D^(−1) h1,
r2 = λ D^(−1) h2,
r3 = r1 × r2,
m = λ D^(−1) h3,

where D^(−1) denotes the inverse matrix of D;
h1 denotes the 1st column of the transformation matrix h;
h2 denotes the 2nd column of the transformation matrix h;
h3 denotes the 3rd column of the transformation matrix h.
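Steps S022 and S025 together say that, for a planar target, the homography factors as h = λ D [r1 r2 m], so R and M can be read back once D is known. A round-trip NumPy sketch with illustrative values (D, the angle and m are made up for the demonstration; the noise-free h makes the recovery exact):

```python
import numpy as np

# Illustrative intrinsics D and pose (R, m); not values from the patent
D = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
th = 0.1
R = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0,         0.0,        1.0]])
m = np.array([1.0, 2.0, 10.0])

# Forward model of S022: h = D [r1 r2 m] (lambda = 1, noise-free)
h = D @ np.column_stack([R[:, 0], R[:, 1], m])

# Recovery as in S025
Dinv = np.linalg.inv(D)
lam = 1.0 / np.linalg.norm(Dinv @ h[:, 0])   # lambda = 1 / ||D^-1 h1||
r1 = lam * Dinv @ h[:, 0]
r2 = lam * Dinv @ h[:, 1]
r3 = np.cross(r1, r2)                        # r3 = r1 x r2
m_rec = lam * Dinv @ h[:, 2]
```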
In a preferred embodiment of the present invention, in step S2, the image noise filtering of the captured panoramic looking-around image data is computed as follows:

S21, find the noise points among the image pixels; the noise point decision for a pixel in the image is computed as:

G(i,j) = 0, if F(i,j) ≥ a_max or F(i,j) ≤ a_min;  G(i,j) = 1, otherwise,

where F(i,j) denotes the gray value at image pixel (i, j), i.e., the gray value of the pixel in row i, column j of the image;
G(i,j) indicates whether pixel (i, j) of the image is a noise point: G(i,j) = 0 means pixel (i, j) is a noise point, and G(i,j) = 1 means pixel (i, j) is a non-noise point;
or denotes the logical OR condition;
a_max denotes the maximum gray-level mean within the p × p window centered on image pixel (i, j), with p = 2p′ + 1, p′ being a positive integer greater than or equal to 1 and less than or equal to 3;
a_min denotes the minimum gray-level mean within the p × p window centered on image pixel (i, j);
S22, number the pixels in the window as a matrix, denoted

G′ = [ G′(1,1)  G′(1,2)  G′(1,3)  …  G′(1,p) ;
       G′(2,1)  G′(2,2)  G′(2,3)  …  G′(2,p) ;
       G′(3,1)  G′(3,2)  G′(3,3)  …  G′(3,p) ;
       …
       G′(p,1)  G′(p,2)  G′(p,3)  …  G′(p,p) ],

where G′(i,j) denotes the pixel in row i, column j of the matrix G′ (i, j = 1, 2, 3, …, p);
S23, arrange the gray values in the matrix G′ from small to large and update the gray value of the pixel at each noise point; the gray value of the pixel at a noise point is computed as:

F̂(i,j) = ( G′_(int(p²/2)) + G′_(int(p²/2)+1) ) / 2,

where F̂(i,j) denotes the updated gray value of the pixel at the noise point; int() denotes the rounding function; G′_(int(p²/2)) denotes the gray value ranked int(p²/2)-th in the ascending ordering; G′_(int(p²/2)+1) denotes the gray value ranked (int(p²/2)+1)-th in the ascending ordering.
In a preferred embodiment of the present invention, in step S21, the maximum gray-level mean is computed as follows:

S211, obtain the gray-level mean from the gray values in the window; the gray-level mean is computed as:

ā = int( Σ G′(i,j) / p² ), the sum running over i, j = 1, …, p,

where G′(i,j) denotes the gray value corresponding to the pixel in row i, column j of the matrix G′;
int() denotes the rounding function;
ā denotes the gray-level mean;

S212, obtain the maximum gray-level mean a_max from the gray-level mean ā, a preset first gray threshold ψ1 and the maximum gray value G_max in the matrix G′, where ψ1 is a positive number greater than 0;
or/and, in step S21, the minimum gray-level mean is computed as follows:

S21-1, obtain the gray-level mean from the gray values in the window; the gray-level mean is computed as:

ā = int( Σ G′(i,j) / p² ), the sum running over i, j = 1, …, p,

where G′(i,j) denotes the gray value corresponding to the pixel in row i, column j of the matrix G′;
int() denotes the rounding function;
ā denotes the gray-level mean;

S21-2, obtain the minimum gray-level mean a_min from the gray-level mean ā, a preset second gray threshold ψ2 and the minimum gray value in the matrix G′, where ψ2 is a positive number greater than or equal to ψ1.
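The rounded window mean of steps S211/S21-1 can be written directly. The clamped forms of a_max and a_min below are our assumption (the exact formulas appear only as equation images in the source), combining the mean, the thresholds ψ1 and ψ2, and the gray extremes of the window:

```python
import numpy as np

def window_gray_mean(G):
    """int()-rounded mean of the gray values in a p x p window (S211 / S21-1)."""
    p = G.shape[0]
    return int(G.sum() / (p * p))

# Assumed forms: clamp mean +/- threshold to the gray range of the window.
def a_max(G, psi1):
    return min(window_gray_mean(G) + psi1, int(G.max()))

def a_min(G, psi2):
    return max(window_gray_mean(G) - psi2, int(G.min()))
```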
In a preferred embodiment of the present invention, in step S21, the gray value of the image is computed as:

F(i,j) = Red(i,j) × Re + Green(i,j) × Gr + Blue(i,j) × Bl,

where Red(i,j) denotes the red component at image pixel (i, j);
Re denotes the fusion scale factor of the red component, with Re + Gr + Bl = 1;
Green(i,j) denotes the green component at image pixel (i, j);
Gr denotes the fusion scale factor of the green component;
Blue(i,j) denotes the blue component at image pixel (i, j);
Bl denotes the fusion scale factor of the blue component.
In a preferred embodiment of the present invention, in step S3, the matching calibration according to the images captured at adjacent times is computed as follows:

S31, construct the image-matching equation from the picture taken by the panoramic looking-around camera at time t−1 and the picture taken at time t; the matching equation is:

P_{a′,t}(x_c, y_d) = P̃_{a,t−1}(x_c, y_d),

where P_{a,t−1} denotes the picture taken by the a-th panoramic looking-around camera at time t−1, a = 1, 2, 3, ……, A;
P_{a′,t} denotes the picture taken by the a-th panoramic looking-around camera at time t;
(x_c, y_d) denotes the coordinates of a pixel point in the image, with c ∈ [0, W×r−1], d ∈ [0, H×r−1] and c, d ∈ Z+, where ∈ denotes set membership, W denotes the width of the image taken by the panoramic looking-around camera, H denotes its height, r denotes the resolution of the captured image, and Z+ denotes the set of positive integers;
(Δx, Δy) denotes the image displacement: Δx > 0 means the pixel points move left by Δx, Δx < 0 means the pixel points move right by |Δx| (| | denoting the absolute value), Δy > 0 means the pixel points move up by Δy, Δy < 0 means the pixel points move down by |Δy|, and Δx = Δy = 0 means the pixel points do not move;
θ denotes the rotation angle: θ > 0 means the image is rotated clockwise by θ, θ < 0 means the image is rotated counterclockwise by |θ|, and θ = 0 means the image is not rotated;
P_{a′,t}(x_c, y_d) denotes the chroma at pixel coordinates (x_c, y_d) of the image taken by the a-th panoramic looking-around camera at time t;
P̃_{a,t−1}(x_c, y_d) denotes the matching chroma of the image taken by the a-th panoramic looking-around camera at time t−1 after translation by (Δx, Δy) and rotation by θ;
S32, performing Fourier transform on both sides of the matching equation in step S31 to obtain:

F(a',t)(u, v) = e^(-j2π(uΔx+vΔy)) · F~(a,t-1,φ)(u, v),

wherein F(a',t)(u, v) represents the data of the picture P(a',t)(xc, yd) after Fourier transform;
F~(a,t-1,φ)(u, v) represents the data of the translated and rotated picture after Fourier transform;
e represents the natural base;
j represents the imaginary unit;
u represents the Fourier u-axis;
v represents the Fourier v-axis;

S33, taking the amplitude on both sides of the equation in step S32; since the phase factor e^(-j2π(uΔx+vΔy)) has unit modulus, this yields:

|F(a',t)(u, v)| = |F~(a,t-1,φ)(u, v)|,

wherein | · | represents the Fourier modulus;

S34, obtaining the translation amount (Δx, Δy) and the rotation amount φ by the phase correlation method.
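Steps S32-S34 can be sketched for the translation part with NumPy FFTs: the two frames differ only by a phase ramp in the Fourier domain, so the inverse transform of the normalized cross-power spectrum peaks at the shift. Recovering the rotation amount φ (by resampling the magnitude spectrum on a polar grid) is omitted here, and the cyclically shifted test image is a stand-in for real camera frames.

```python
import numpy as np

def phase_correlate(prev, curr):
    """Recover the cyclic translation of `curr` relative to `prev`."""
    F1 = np.fft.fft2(prev)
    F2 = np.fft.fft2(curr)
    cross = F2 * np.conj(F1)
    cross /= np.abs(cross) + 1e-12            # keep only the phase ramp
    corr = np.abs(np.fft.ifft2(cross))        # impulse at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map peaks in the far half of the plane back to negative shifts
    if dy > prev.shape[0] // 2:
        dy -= prev.shape[0]
    if dx > prev.shape[1] // 2:
        dx -= prev.shape[1]
    return dx, dy

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, shift=(3, -5), axis=(0, 1))    # shift down 3 rows, left 5 cols
dx, dy = phase_correlate(a, b)                # recovers dx = -5, dy = 3
```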
In a preferred embodiment of the present invention, in step S4, the panoramic image stitching method includes the steps of:

S41, acquiring the image feature point group at time t-1 and the image feature point group at time t;

S42, calculating the image feature point pair similarity values between the feature points in the time t-1 image feature point group and the feature points in the time t image feature point group, wherein the similarity value is calculated as:

[similarity-value formula, reproduced as an image in the original]

wherein x(t-1)i represents the i-th dimension of the image vector of a feature point at time t-1;
xti represents the i-th dimension of the image vector of a feature point at time t;
ID represents the total number of image dimensions;
θηε represents the image feature point pair similarity value;

S43, arranging the image feature point pair similarity values in descending order and selecting the feature points corresponding to the first two similarity values; aligning the two selected feature point pairs to fuse the images at adjacent times and obtain the panoramic stitched image.
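Steps S42-S43 can be sketched as below. The patent's similarity formula is reproduced only as an image, so cosine similarity between the ID-dimensional feature vectors is assumed here as a stand-in; the two highest-scoring pairs then serve as the alignment anchors.

```python
import numpy as np

def top2_pairs(feats_prev, feats_curr):
    """Return the two (prev_index, curr_index) pairs with the largest similarity."""
    a = feats_prev / np.linalg.norm(feats_prev, axis=1, keepdims=True)
    b = feats_curr / np.linalg.norm(feats_curr, axis=1, keepdims=True)
    sim = a @ b.T                              # pairwise similarity values
    flat = np.argsort(sim, axis=None)[::-1]    # largest similarity first
    return [np.unravel_index(k, sim.shape) for k in flat[:2]]

prev = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])   # t-1 feature vectors
curr = np.array([[0.9, 0.1], [0.1, 0.9]])               # t feature vectors
pairs = top2_pairs(prev, curr)   # pairs (0,0) and (1,1) score highest
```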
In a preferred embodiment of the present invention, the method further includes step S5, in which definition (resolution) analysis is performed on the panoramic all-around image displayed on the panoramic all-around display screen, comprising the following steps:

S51, acquiring the panoramic all-around image displayed on the panoramic all-around display screen and performing gray processing on it to obtain a gray image, using the gray processing method of step S21:

F(i,j) = Red(i,j)×Re + Green(i,j)×Gr + Blue(i,j)×Bl,

wherein Red(i,j) represents the red color component at the image pixel (i, j);
Re represents the fusion scale factor of the red color component; Re + Gr + Bl = 1;
Green(i,j) represents the green color component at the image pixel (i, j);
Gr represents the fusion scale factor of the green color component;
Blue(i,j) represents the blue color component at the image pixel (i, j);
Bl represents the fusion scale factor of the blue color component;
S52, dividing the acquired gray image into regions of equal size according to the image size;

S53, performing gradient calculation on each region using the Laplacian operator;

S54, calculating the gradient variance of each region image and judging the definition of the image from the mean of the gradient variances; the gradient variance mean is calculated as the sum of the gradient variances δi over the gradient variance set divided by the total number of gradient variances in the set;

the definition of the image is then judged as follows: Pc = 1 when the gradient variance mean exceeds pmax; Pc = 0 when it lies between pmin and pmax; Pc = -1 when it falls below pmin,

wherein Pc = 1 indicates that the panoramic all-around image is an ultra-clear image;
Pc = 0 indicates that the panoramic all-around image is a high-definition image;
Pc = -1 indicates that the panoramic all-around image is a standard-definition image;
pmax represents the definition segmentation maximum;
pmin represents the definition segmentation minimum;

S55, displaying the real-time definition of the image in the upper right corner of the panoramic all-around display screen.
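Steps S52-S54 can be sketched as below: split the gray image into equal regions, apply the 4-neighbour Laplacian stencil [[0,1,0],[1,-4,1],[0,1,0]], take the variance of each region's response, and grade the mean variance against two thresholds. The p_min/p_max values are illustrative choices, not from the patent.

```python
import numpy as np

def laplacian(img):
    """4-neighbour Laplacian; border left at zero for simplicity."""
    out = np.zeros_like(img, dtype=np.float64)
    out[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                       img[1:-1, :-2] + img[1:-1, 2:] - 4.0 * img[1:-1, 1:-1])
    return out

def sharpness_grade(gray, blocks=2, p_min=1.0, p_max=50.0):
    """Return 1 (ultra-clear), 0 (high-definition) or -1 (standard definition)."""
    h, w = gray.shape
    variances = []
    for rows in np.array_split(np.arange(h), blocks):
        for cols in np.array_split(np.arange(w), blocks):
            variances.append(laplacian(gray[np.ix_(rows, cols)]).var())
    mean_var = float(np.mean(variances))
    if mean_var > p_max:
        return 1
    if mean_var >= p_min:
        return 0
    return -1

flat = np.full((8, 8), 128.0)   # featureless image: gradient variance is zero
```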
In a preferred embodiment of the present invention, in step S0, after the lens of the panoramic all-around camera is corrected, a symmetry test is performed on the image captured by the camera; the symmetry test method includes the following steps:

S0-1, acquiring the image shot of the calibration target board and performing gray processing on it, consistent with the gray processing method above, to obtain a gray image;

S0-2, performing image denoising on the gray image; the denoising method includes the steps of:

S0-21, finding the noise points among the image pixel points, judged as follows: G(i,j) = 0 when F(i,j) ≥ amax or F(i,j) ≤ amin, and G(i,j) = 1 otherwise,

wherein F(i,j) represents the gray value at the image pixel (i, j), that is, the gray value of the pixel point in the i-th row and j-th column of the image;
G(i,j) indicates whether the pixel point (i, j) in the image is a noise point: G(i,j) = 0 means the pixel point (i, j) is a noise point, and G(i,j) = 1 means it is a non-noise point;
"or" represents the logical OR;
amax represents the maximum gray mean value within the p×p window centered on the image pixel point (i, j); p = 2p'+1, where p' is a positive integer greater than or equal to 1 and less than or equal to 3;
amin represents the minimum gray mean value within the p×p window centered on the image pixel point (i, j);
S0-22, numbering the pixel points in the window as a matrix, recorded as

G' = [ G'(1,1) G'(1,2) G'(1,3) ... G'(1,p) ; G'(2,1) G'(2,2) G'(2,3) ... G'(2,p) ; G'(3,1) G'(3,2) G'(3,3) ... G'(3,p) ; ... ; G'(p,1) G'(p,2) G'(p,3) ... G'(p,p) ],

wherein G'(m,n) represents the pixel point in the m-th row and n-th column of the matrix G';
S0-23, arranging the gray values in the matrix G' in ascending order and updating the gray value of each pixel at a noise point; the updated gray value is calculated as:

[update formula, reproduced as an image in the original]

wherein the left-hand side represents the updated gray value of the pixel point at the noise point, and the two terms on the right-hand side represent the gray values at the two specified positions in the ascending ordering of the window gray values;
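Steps S0-21 to S0-23 can be sketched as below. A pixel is flagged as noise when its gray value reaches the window extreme (the min-max criterion of S0-21), and a flagged pixel is then replaced from the ordered window gray values; plain median replacement is assumed here, since the exact ranking formula is shown only as an image in the original.

```python
import numpy as np

def denoise(gray, p=3):
    """Min-max noise detection plus ordered-value (median) replacement."""
    h = p // 2
    out = gray.astype(np.float64).copy()
    padded = np.pad(gray.astype(np.float64), h, mode="edge")
    for i in range(gray.shape[0]):
        for j in range(gray.shape[1]):
            win = padded[i:i + p, j:j + p]     # p x p window centered at (i, j)
            if gray[i, j] >= win.max() or gray[i, j] <= win.min():
                # noise point: replace from the sorted window gray values
                out[i, j] = np.median(np.sort(win, axis=None))
    return out

img = np.full((5, 5), 100.0)
img[2, 2] = 255.0                  # isolated bright noise point
clean = denoise(img)               # the spike is pulled back to 100
```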
S0-3, performing edge detection on the denoised gray image using the Canny operator to obtain an edge image;

S0-4, using Hough line detection to obtain the outermost edge lines of the calibration target board in the edge image from the top, bottom, left and right, and calculating the included angles between these edge lines and the horizontal and vertical directions respectively; when a detected included angle is greater than or equal to a preset angle threshold, the image is asymmetric. The preset angle threshold is preferably 5 to 8 degrees.
In a preferred embodiment of the present invention, in step S4, a seam detection method for detecting seams in the stitched image includes the steps of:

S4-1, acquiring the stitched image and performing gray processing on it, consistent with the gray processing method above, to obtain a gray image;

S4-2, performing image denoising on the gray image, consistent with the image denoising method above;

S4-3, performing edge detection on the denoised gray image using the Canny operator to obtain an edge image;

S4-4, cutting out the image of the automobile model part, performing Hough line detection on the edge image in the transverse and longitudinal directions to obtain a transverse line data set and a longitudinal line data set, calculating the distance between adjacent transverse lines by the line distance formula and recording the distance data, then checking whether any distance deviates markedly from the recorded distance data; if it does, that location is judged to be a seam.
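Step S4-4 can be sketched as below: given the positions of the grid lines found by Hough detection, compute the spacings between adjacent lines and flag any spacing that deviates strongly from the typical (median) spacing as a stitching seam. The 1.5x tolerance is an illustrative choice, not from the patent.

```python
import numpy as np

def find_seams(line_positions, tol=1.5):
    """Return (pos_before, pos_after) for each abnormally wide line spacing."""
    pos = np.sort(np.asarray(line_positions, dtype=np.float64))
    gaps = np.diff(pos)                  # distances between adjacent lines
    typical = np.median(gaps)            # typical grid spacing
    return [(pos[k], pos[k + 1])
            for k, g in enumerate(gaps) if g > tol * typical]

# grid lines every 10 px, with one 25-px jump where two images meet
lines = [0, 10, 20, 45, 55, 65]
seams = find_seams(lines)                # flags the gap between 20 and 45
```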
In a preferred embodiment of the present invention, in step S4, ghost detection is performed on the stitched image; the ghost detection method includes the steps of:

S4-a, acquiring the stitched image and performing gray processing on it to obtain a gray image;

S4-b, performing image denoising on the gray image;

S4-c, performing edge detection on the denoised gray image to obtain an edge image;

S4-d, cutting out the image of the model part to obtain an edge image containing only the calibration target board, and using Hough line detection to extract the grid edge lines of the calibration target board in the transverse and longitudinal directions from the edge image with the model removed;

S4-e, after removing the seam lines according to the seam detection method, sorting the detected lines of the calibration target board by direction, and locating the upper, lower, left and right lines of each square according to the relation that the distances between adjacent lines are close;

S4-f, comparing the coordinates of the same square at different positions, selecting the minimum coordinates of the left and upper boundaries and the maximum coordinates of the right and lower boundaries as the standard for dividing a square region, dividing the image into these small regions for contour detection, and judging that ghosting appears when two or more square contours are extracted in a region.
The first power supply module includes: as shown in fig. 3 and 6, the vehicle battery BAT is connected to the first terminal of the diode D202, the second terminal of the diode D202 is connected to the first terminal of the inductor L34, the second terminal of the inductor L34 is connected to the first terminal of the capacitor C21, the first terminal of the capacitor C23, the first terminal of the capacitor C24, the first terminal of the capacitor C25, the first terminal of the resistor R27, and the power supply voltage input terminal VIN of the buck chip U8, the second terminal of the resistor R27 is connected to the first terminal of the resistor R161, the first terminal of the capacitor C25, and the enable input terminal EN of the buck chip U8, the second terminal of the resistor R161 is connected to the power ground, the first terminal of the capacitor C25 is connected to the power ground, the second terminal of the capacitor C21, the second terminal of the capacitor C23, the second terminal of the capacitor C24, and the second terminal of the capacitor C25 are connected to the power supply ground, the timing resistor RT/SY of the buck chip U8 is connected to the first terminal of the resistor R111, a starting capacitor terminal SS of the buck chip U8 is connected to a first terminal of a capacitor C33, a second terminal of the capacitor C33 is connected to a power ground, a capacitor terminal BOOT of the buck chip U8 is connected to a first terminal of a capacitor C19, a power voltage output terminal of the buck chip U8 is connected to a second terminal of a capacitor C19, a first terminal of an inductor L404 and a first terminal of a diode D82, a second terminal of a diode D82 is connected to the power ground, a voltage feedback terminal FB of the buck chip U8 is connected to a first terminal of a resistor R66 and a first terminal of a resistor R61, a second terminal of a resistor R61 is connected to the power ground, a power ground terminal GND of the buck 
chip U8 is connected to the power ground, the second terminal of the inductor L404 is connected to the first terminal of the capacitor C44, the first terminal of the capacitor C26, the first terminal of the capacitor C77, the first terminal of the capacitor C27 and the second terminal of the resistor R66, the second terminal of the inductor L404 outputs the power supply voltage VCC_5V, and the second terminal of the capacitor C44, the second terminal of the capacitor C26, the second terminal of the capacitor C77 and the second terminal of the capacitor C27 are respectively connected to the power ground. The power supply voltage of the vehicle battery BAT is converted into a stable 5V power supply voltage VCC_5V (+5V power supply) by the step-down chip U8. The resistor R161 and the resistor R27 divide the voltage to provide an enable level signal to the enable input terminal EN of the buck chip U8, so that the buck chip U8 operates.
The second power supply module includes: the 5V power supply voltage VCC_5V is respectively connected with the first end of the capacitor C35, the first end of the resistor R255 and the emitter of the triode Q101; the second end of the capacitor C35 is connected with the power ground; the second end of the resistor R255 and the base of the triode Q101 are respectively connected with the first end of the resistor R36; the second end of the resistor R36 is connected with the collector of the triode Q111; the emitter of the triode Q111 is connected with the power ground; the base of the triode Q111 is respectively connected with the first end of the resistor R303, the first end of the resistor R34 and the first end of the capacitor C222; the second end of the resistor R34 and the second end of the capacitor C222 are respectively connected to the power ground; the second end of the resistor R303 is connected to the power supply voltage control terminal suspandb of the first controller U1; the collector of the triode Q101 is connected to the first end of the capacitor C277 and outputs the 5V power supply voltage VDD_5V, and the second end of the capacitor C277 is connected to the power ground. When the power supply voltage control terminal suspandb of the first controller U1 sends a turn-off level to the base of the triode Q111, the collector of the triode Q101 outputs a stable 5V power supply voltage VDD_5V (+5V power supply); correspondingly, when the power supply voltage control terminal suspandb of the first controller U1 sends a turn-on level to the base of the triode Q111, the collector of the triode Q101 has no power output.
The third power supply module includes: the 5V power supply voltage VDD_5V is connected to the first terminal of the inductor L122; the second terminal of the inductor L122 is respectively connected to the first terminal of the capacitor C511, the first terminal of the capacitor C521, the first terminal of the capacitor C531, the first terminal of the capacitor C541, the first terminal of the capacitor C551 and the first terminal of the capacitor C561, and outputs the 5V power supply voltage VDD1_5V; the second terminals of the capacitors C511, C521, C531, C541, C551 and C561 are respectively connected to the power ground. When the power supply voltage control terminal suspandb of the first controller U1 sends a turn-off level to the base of the triode Q111, the second terminal of the inductor L122 outputs a stable 5V power supply voltage VDD1_5V (+5V power supply); correspondingly, when the power supply voltage control terminal suspandb of the first controller U1 sends a turn-on level to the base of the triode Q111, the second terminal of the inductor L122 has no power output.
In this embodiment, the capacitance value of the capacitor C52 is 471nF, the models of the capacitors C24 and C25 are VEJ M1ETR, the model of the diode D288 is NRVBAF440T3G, the capacitance value of the capacitor C23 is 15nF, the model of the diode D202 is NRVBA160T3G, the model of the inductor L34 is NRS5040T220MMGKV, the resistance value of the resistor R111 is 65K, the capacitance value of the capacitor C33 is 23nF, the resistance value of the resistor R55 is 101K, the capacitance value of the capacitor C19 is 120nF, the capacitance value of the capacitor C21 is 4.7uF, the model of the capacitor C26 is TAJC 016K TNJ, the resistance value of the resistor R161 is 95K, the resistance value of the resistor R27 is 365K, the capacitance value of the capacitor C77 is 22uF, the capacitance value of the capacitor C35 is 22uF, the resistance value of the resistor R255 is 95K, the resistance value of the resistor R36 is 502 Ω, the capacitance value of the capacitor C521 is 1uF, the model of the triode Q111 is BCW66GLT1G, the model of the inductor L122 is MPZ1608S601A, the capacitance value of the capacitor C511 is 2.2uF, the capacitance value of the capacitor C222 is 4.7nF, the capacitance values of the capacitor C531 and the capacitor C541 are 120nF, and the capacitance values of the capacitor C551 and the capacitor C561 are 100nF.
The fourth power supply module includes: as shown in fig. 4 to 6, a 5V power voltage VCC _5V is connected to a first end of an inductor L422, a second end of the inductor L422 is respectively connected to a first end of a resistor R253, a first end of a capacitor C89, a first end of a capacitor C92, and a power voltage input end VIN of a buck chip U4, a delay reset end REDLAY of the buck chip U4 is connected to a first end of a capacitor C93, a second end of a capacitor C89, a second end of a capacitor C92, and a second end of a capacitor C93 are respectively connected to a power ground, a frequency oscillation end ROSE of the buck chip U4 is connected to a first end of the resistor R93, a second end of the resistor R93 is connected to the power ground, a power ground end GND of the buck chip U4 is connected to the power ground, and an enable input end EN of the buck chip U4 is connected to a second end of the resistor R253; the power supply voltage output end VOUT of the voltage reduction chip U4 is respectively connected with the first end of the capacitor C254, the first end of the capacitor C299 and the first end of the capacitor C292, the power supply voltage output end VOUT of the voltage reduction chip U4 outputs 3.3V power supply voltage VCC _3.3V, and the second end of the capacitor C254, the second end of the capacitor C299 and the second end of the capacitor C292 are respectively connected with a power ground. The +5V power supply is converted into a stable 3.3V power supply voltage VCC _3.3V (+3.3V power supply) by the buck chip U4.
The fifth power supply module includes: a 3.3V power supply voltage VCC _3.3V is respectively connected to the first terminal of the capacitor C188, the first terminal of the resistor R218, and the emitter of the transistor Q21, the second terminal of the capacitor C188 is connected to the power ground, the second terminal of the resistor R218 and the base of the transistor Q21 are respectively connected to the first terminal of the resistor R238, the second terminal of the resistor R238 is connected to the collector of the transistor Q31, the emitter of the transistor Q31 is connected to the power ground, the base of the transistor Q31 is respectively connected to the first terminal of the resistor R244, a first terminal of the resistor R73 is connected to a first terminal of the capacitor C208, a second terminal of the resistor R73 and a second terminal of the capacitor C208 are respectively connected to a power ground, a second terminal of the resistor R244 is connected to a power voltage control terminal CS _ DD of the second controller U2, a collector of the transistor Q21 is connected to a first terminal of the capacitor C195, a collector of the transistor Q21 outputs a 3.3V power voltage VDD _3.3V, and a second terminal of the capacitor C195 is connected to the power ground. The power voltage control terminal CS _ DD of the second controller U2 inputs a cut-off level to the base of the transistor Q31, the collector of the transistor Q21 outputs a 3.3V power voltage VDD _3.3V (3.3V power), and accordingly, the power voltage control terminal CS _ DD of the second controller U2 inputs a turn-on level to the base of the transistor Q31, and the collector of the transistor Q21 has no power output.
Still included is a fault detection circuit: the feedback input terminal WD of the buck chip U4 is respectively connected to the first terminal of the resistor R98 and the first terminal of the resistor R358; the second terminal of the resistor R98 is connected to the power ground; the second terminal of the resistor R358 is respectively connected to the first terminal of the capacitor C217 and the feedback output terminal SET of the second controller U2; the second terminal of the capacitor C217 is connected to the power ground; the reset terminal nRST of the buck chip U4 is connected to the first terminal of the resistor R404; the second terminal of the resistor R404 is respectively connected to the first terminal of the resistor R331, the first terminal of the resistor R99, the first terminal of the resistor R411 and the detection terminal RSTBB of the first controller U1; the second terminal of the resistor R99 is connected to the reset input terminal RESETB of the second controller U2; the second terminal of the resistor R331 is connected to the 3.3V power supply voltage VCC_3.3V; the second terminal of the resistor R411 is connected to the fault output terminal FLT of the buck chip U4; the first terminal of the resistor R466 is connected to the 3.3V power supply voltage VCC_3.3V, and the second terminal of the resistor R466 is respectively connected to the enable input terminal NWD_EN of the buck chip U4 and the collector of the triode Q177; the second terminal of the resistor R555 is connected to the power ground; the emitter of the triode Q177 is connected to the power ground; the base of the triode Q177 is respectively connected to the first terminal of the resistor R377, the first terminal of the resistor R351 and the first terminal of the capacitor C229; the second terminal of the resistor R351 and the second terminal of the capacitor C229 are respectively connected to the power ground; and the second terminal of the resistor R377 is 
connected with the detection enable terminal CS_D of the second controller U2. When the feedback input terminal WD of the buck chip U4 does not receive the first feedback signal output by the feedback output terminal SET of the second controller U2 within the first specified time, or does not receive the second feedback signal within the second specified time, the fault output terminal WD_FLT of the buck chip U4 outputs a fault signal or the reset terminal nRST of the buck chip U4 outputs a reset signal, so that the second controller U2 is reset and restarted. In this embodiment, the resistance value of the resistor R253 is 12K, the capacitance value of the capacitor C89 is 120nF, the capacitance value of the capacitor C92 is 22uF, the capacitance value of the capacitor C93 is 120pF, the capacitance value of the capacitor C217 is 5.6pF, the resistance value of the resistor R331 is 5.1K, and the model of the voltage chip U44 is TPS7a6333QPWPRQ1; the capacitance value of the capacitor C188 is 22uF, the resistance value of the resistor R218 is 15K, the resistance values of the resistors R244 and R73 are 11.4K, the capacitance value of the capacitor C208 is 1.5nF, the model of the transistor Q21 is BCW68, the resistance value of the resistor R236 is 47 Ω, the resistance value of the resistor R555 is 48K, the capacitance value of the capacitor C299 is 10nF, the capacitance value of the capacitor C229 is 22nF, the resistance values of the resistors R404 and R411 are 45 Ω, the resistance value of the resistor R93 is 160K, the resistance value of the resistor R98 is 12K, the model of the inductor L422 is MPZ S601A, the resistance value of the resistor R358 is 5.2K, the capacitance value of the capacitor C199 is 100nF, the model of the transistor Q31 is BCW66GLT1, the resistance value of the resistor R466 is 4.7K, the capacitance value of the capacitor C257 is 10uF, the model of the transistor Q177 is 66 w 1, the resistance value of the resistor R493 is 23K, the resistance value of the resistor R31 is 3K, and the resistance value of the resistor R377 is 4.42K.
The sixth power supply module includes: as shown IN fig. 2, a 5V power supply voltage VCC _5V is respectively connected to the first terminal of the capacitor C1, the first terminal of the capacitor C2, and the power supply voltage input terminal IN of the buck chip U44, the second terminal of the capacitor C1 and the second terminal of the capacitor C2 are respectively connected to the power ground, the power supply ground terminal GND of the buck chip U44 is connected to the power ground, the power supply voltage output terminal OUT of the buck chip U44 is respectively connected to the first terminal of the capacitor C3 and the first terminal of the capacitor C4, the power supply voltage output terminal OUT of the buck chip U44 outputs a 3.3V power supply voltage 3.3V, and the second terminal of the capacitor C3 and the second terminal of the capacitor C4 are respectively connected to the power ground. The input 5V power voltage VCC _5V is converted into a stable 3.3V power voltage 3.3V (3.3V power) and output.
The seventh power supply module includes: the 3.3V power supply voltage 3.3V is respectively connected with the first end of the capacitor C8, the first end of the capacitor C9 and the first end of the diode D1; the second end of the capacitor C8 and the second end of the capacitor C9 are respectively connected with the power ground; the second end of the diode D1 is respectively connected with the first end of the capacitor C10 and the first end of the capacitor C11, and outputs the 2.5V power supply voltage 2.5V_D; the second end of the capacitor C10 and the second end of the capacitor C11 are respectively connected with the power ground. The diode D1 absorbs a voltage drop of about 0.8V, so that its second end outputs a stable 2.5V power supply voltage 2.5V_D (2.5V power supply).
The eighth power supply module includes: the 3.3V power supply voltage 3.3V is respectively coupled to the first terminal of the resistor R03 and the emitter of the transistor Q01; the second terminal of the resistor R03 and the base of the transistor Q01 are respectively coupled to the first terminal of the resistor R04; the second terminal of the resistor R04 is coupled to the collector of the transistor Q02; the emitter of the transistor Q02 is coupled to the power ground; the base of the transistor Q02 is coupled to the first terminal of the resistor R05, and the second terminal of the resistor R05 is coupled to the power supply voltage control terminal CS_PWDB of the first controller U1; the collector of the transistor Q01 is coupled to the first terminal of the diode D2; the second terminal of the diode D2 is respectively coupled to the first terminal of the capacitor C05, the first terminal of the capacitor C06 and the first terminal of the resistor R02, and outputs the 2.8V power supply voltage 2.8V_CSD; the second terminal of the capacitor C05 and the second terminal of the capacitor C06 are respectively coupled to the power ground; the second terminal of the resistor R02 is coupled to the first end of the power indicator LED, and the second end of the power indicator LED is connected with the power ground. When the power supply voltage control terminal CS_PWDB of the first controller U1 sends a cut-off level to the base of the transistor Q02, the second terminal of the diode D2 outputs a stable 2.8V power supply voltage 2.8V_CSD (2.8V power supply), and the power indicator LED lights up to indicate that there is power output; correspondingly, when the power supply voltage control terminal CS_PWDB of the first controller U1 sends a conduction level to the base of the transistor Q02, the second terminal of the diode D2 has no power output.
The ninth power supply module includes: the 2.8V power supply voltage 2.8V _ CSD is connected with the first end of the resistor R01, the second end of the resistor R01 is connected with the first end of the capacitor C07, the second end of the resistor R01 outputs 2.8V power supply voltage 2.8V _ CSA, and the second end of the capacitor C07 is connected with the power ground.
In this embodiment, the capacitance of the capacitor C1 is 4.7uF, the capacitor C2 is a 104 capacitor, the model of the buck chip U44 is FS8853-3.3CL, the capacitor C3 is a 104 capacitor, the capacitance of the capacitor C4 is 47uF, the capacitors C8 and C9 are 104 capacitors, the model of the diode D1 is LL4148, the capacitance of the capacitor C10 is 10nF, the capacitor C11 is a 104 capacitor, the model of the transistor Q01 is 8550, the model of the diode D2 is 1N5819, the resistance of the resistor R03 is 1M, the resistance of the resistor R04 is 1.2K, the resistance of the resistor R05 is 12K, the capacitance of the capacitor C06 is 47uF, the capacitor C05 is a 104 capacitor, the resistance of the resistor R01 is 7.2Ω, and the resistance of the resistor R02 is 7.07K.
As shown in fig. 5 and 6, the clock terminal MCLK of the second controller U2 is connected to the clock terminal CS_CLK of the first controller U1; the analog ground terminal AGND of the second controller U2 is connected to power ground; the analog power terminal AVDD of the second controller U2 is connected to the first end of a capacitor C22, the first end of a capacitor C23 and the 2.8V supply voltage 2.8V_CSA, and the second ends of the capacitors C22 and C23 are connected to power ground; the digital ground terminal DGND of the second controller U2 is connected to power ground; the enable terminal ENB of the second controller U2 is connected to the enable output terminal CS_EN of the first controller U1; the clock terminal SCK of the second controller U2 is connected to the clock terminal CS_SCK of the first controller U1; the data terminal SDA of the second controller U2 is connected to the data terminal CS_SDA of the first controller U1; the horizontal sync terminal HSYNC of the second controller U2 is connected to the horizontal sync terminal CS_HSYN of the first controller U1; the vertical sync terminal VSYNC of the second controller U2 is connected to the vertical sync terminal CS_VSYN of the first controller U1; the digital power terminal DVDD of the second controller U2 is connected to the first end of a capacitor C24, the first end of a capacitor C25 and the 2.8V supply voltage 2.8V_CSD, and the second ends of the capacitors C24 and C25 are connected to power ground; and the data terminals DATA2 to DATA9 of the second controller U2 are connected to the data terminals CS_D1 to CS_D8 of the first controller U1, respectively (in particular, the data terminal DATA7 of the second controller U2 is connected to the data terminal CS_D6 of the first controller U1, the data terminal DATA8 to the data terminal CS_D7, and the data terminal DATA9 to the data terminal CS_D8). In this embodiment, the capacitors C22, C23, C24 and C25 are 104 capacitors, and the model of the second controller U2 is HY7131R.
As shown in fig. 6, the power ground terminal OVSS of the first controller U1 is connected to power ground; the power terminal OVDD of the first controller U1 is connected to the 3.3V supply voltage 3.3V; the clock terminal SCL of the first controller U1 is connected to the first end of a resistor R31, and the second end of the resistor R31 is connected to the 2.8V supply voltage 2.8V_CSD; the data terminal SDA of the first controller U1 is connected to the first end of a resistor R32 and the first end of a resistor R33, the second end of the resistor R32 is connected to the 2.8V supply voltage 2.8V_CSD, and the second end of the resistor R33 is connected to power ground; the reset terminal RST of the first controller U1 is connected to the first end of a resistor R13 and the first end of a capacitor C12, the second end of the capacitor C12 is connected to power ground, and the second end of the resistor R13 is connected to the 3.3V supply voltage 3.3V; one terminal of the first controller U1 is connected to the first end of a resistor R24 and another terminal of the first controller U1 is connected to the first end of a resistor R28, and the second ends of the resistors R24 and R28 are connected to the 3.3V supply voltage 3.3V; the TEST terminal TEST of the first controller U1 is connected to the first end of a resistor R22, and the second end of the resistor R22 is connected to power ground; the reference ground terminal GND_REF of the first controller U1 is connected to power ground and to the power ground terminal GND_A of the first controller U1; the lock terminal SNAPB of the first controller U1 is connected to the first end of a capacitor C18, the first end of a resistor R19 and the first end of a lock key S1, the second end of the capacitor C18 is connected to power ground, the second end of the lock key S1 is connected to power ground, and the second end of the resistor R19 is connected to the 3.3V supply voltage 3.3V; the data terminal DM of the first controller U1 is connected to the first end of a capacitor C17 and the first end of a resistor R17, and the data terminal DP of the first controller U1 is connected to the first end of a capacitor C16 and the first end of a resistor R16; the second end of the resistor R16 is connected to the first end of a resistor R18 and the data terminal D+ of a data interface JP1, the second end of the resistor R18 is connected to the 3.3V supply voltage 3.3V, and the second end of the resistor R17 is connected to the data terminal D- of the data interface JP1; the power terminal VCC of the data interface JP1 is connected to the 5V supply voltage VCC_5V, and the power ground terminals GND of the data interface JP1 are connected to power ground; a power terminal of the first controller U1 is connected to the 2.5V supply voltage 2.5V_D, and the power ground terminal VSS_USB and the power ground terminal DVSS of the first controller U1 are connected to power ground; the crystal terminal CLKOUT of the first controller U1 is connected to the first end of a resistor R14 and the first end of a resistor R15, the second end of the resistor R15 is connected to the first end of a capacitor C15 and the first end of a crystal oscillator Y1, and the second end of the capacitor C15 is connected to the first end of an inductor L1; the crystal terminal CLKIN of the first controller U1 is connected to the second end of the resistor R14, the first end of a capacitor C13 and the second end of the crystal oscillator Y1; and the second end of the capacitor C13, the second end of a capacitor C14 and the second end of the inductor L1 are connected to power ground.
In this embodiment, the resistances of the resistors R31 and R32 are 4.7K, the resistance of the resistor R33 is 47K, the resistances of the resistors R24 and R28 are 4.7K, the resistance of the resistor R22 is 47K, the resistance of the resistor R19 is 12K, the resistance of the resistor R18 is 1.5K, the capacitor C18 is a 104 capacitor, the resistances of the resistors R16 and R17 are 22Ω, the capacitances of the capacitors C16 and C17 are 10pF, the resistance of the resistor R14 is 110K, the resistance of the resistor R15 is 45Ω, the frequency of the crystal oscillator Y1 is 48MHz, the capacitances of the capacitors C13 and C14 are 12pF, the capacitor C15 is a 102 capacitor, the inductance of the inductor L1 is 3.3uH, the resistance of the resistor R13 is 12.5K, the capacitor C12 is a 105 capacitor, and the model of the first controller U1 is ZC0301.
The storage module includes: as shown in fig. 6 and 7, the address input terminals A0, A1 and A2 of a memory chip U33 and the power ground terminal GND of the memory chip U33 are connected to power ground; the power voltage input terminal VCC of the memory chip U33 is connected to the 3.3V supply voltage 3.3V; the write-protect terminal WP of the memory chip U33 is connected to power ground; the clock terminal SCL of the memory chip U33 is connected to the clock terminal ESCK of the first controller U1; the data terminal SDA of the memory chip U33 is connected to the first end of a resistor R34 and to the data terminal ESDA of the first controller U1; and the second end of the resistor R34 is connected to the 3.3V supply voltage 3.3V. In this embodiment, the resistance of the resistor R34 is 4.7K, and the model of the memory chip U33 is AT24C02. The storage and reading of data are realized through the memory chip U33.
While embodiments of the invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (3)

1. A method for realizing all-around image safety monitoring by arranging intelligent cameras on a commercial vehicle, comprising a vehicle to be tested, characterized in that A panoramic all-around cameras are arranged on the body of the vehicle to be tested, where A is a positive integer greater than or equal to 1, the cameras being respectively the 1st panoramic all-around camera, the 2nd panoramic all-around camera, the 3rd panoramic all-around camera, ..., and the A-th panoramic all-around camera;
a fixed mounting seat for a panoramic all-around display screen is arranged in the cab of the vehicle to be tested, and the panoramic all-around display screen is fixedly mounted on the fixed mounting seat;
the method further comprises a panoramic all-around controller arranged in the vehicle to be tested, wherein the panoramic all-around image data output end of the a-th panoramic all-around camera is connected with a panoramic all-around image data input end of the panoramic all-around controller, a being a positive integer less than or equal to A, and the display data output end of the panoramic all-around controller is connected with the display data input end of the panoramic all-around display screen;
the panoramic all-around controller performs brightness consistency detection on the stitched panoramic all-around image according to the panoramic all-around image data collected by the panoramic all-around cameras arranged on the body of the vehicle to be tested; the brightness consistency detection method comprises the following steps:
S4-A, acquiring spliced images, and carrying out gray processing on the spliced images to obtain gray images;
S4-B, carrying out image denoising processing on the gray level image;
S4-C, dividing the acquired gray level image into equal-size areas according to the size of the image;
S4-D, converting the regions into the LAB color space, extracting the L (brightness) component, calculating the mean value of the L component for each region, and computing the differences between the mean values of the regions; if any difference is greater than or equal to a preset difference threshold, the brightness is inconsistent; otherwise the brightness is consistent.
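The regional check in steps S4-C and S4-D can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes the L component of the stitched image is already available as a 2-D array, and the 2x2 region grid and the threshold of 10.0 are placeholder values chosen for the example (the patent leaves them as presets).

```python
import numpy as np

def brightness_consistent(L, rows=2, cols=2, threshold=10.0):
    """Brightness consistency check over a stitched image (steps S4-C / S4-D).

    L: 2-D array holding the L (lightness) component in LAB space.
    The image is split into rows x cols equal-size regions; brightness is
    inconsistent when any two regional mean-L values differ by >= threshold.
    """
    h, w = L.shape
    bh, bw = h // rows, w // cols
    means = [L[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw].mean()
             for i in range(rows) for j in range(cols)]
    # The largest pairwise difference is max(means) - min(means).
    return (max(means) - min(means)) < threshold
```

A uniformly lit stitch passes the check, while a stitch whose quadrants differ strongly in mean lightness fails it.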
2. The method for realizing all-around image safety monitoring by arranging intelligent cameras on a commercial vehicle according to claim 1, wherein in step S4-A, the gray-scale image is obtained as follows:
F(i,j)=Red(i,j)×Re+Green(i,j)×Gr+Blue(i,j)×Bl,
wherein Red(i,j) represents the red component at image pixel (i, j);
Re represents the fusion scale factor of the red component, with Re + Gr + Bl = 1;
Green(i,j) represents the green component at image pixel (i, j);
Gr represents the fusion scale factor of the green component;
Blue(i,j) represents the blue component at image pixel (i, j);
Bl represents the fusion scale factor of the blue component.
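The weighted fusion above translates directly into code. The sketch below uses the common luma weights Re = 0.299, Gr = 0.587, Bl = 0.114 purely as an example of factors satisfying Re + Gr + Bl = 1; the patent does not fix their values.

```python
def to_gray(red, green, blue, Re=0.299, Gr=0.587, Bl=0.114):
    """Gray fusion F(i,j) = Red(i,j)*Re + Green(i,j)*Gr + Blue(i,j)*Bl.

    red, green, blue: equally sized 2-D lists of color components.
    The fusion scale factors must sum to 1.
    """
    assert abs(Re + Gr + Bl - 1.0) < 1e-9
    rows, cols = len(red), len(red[0])
    return [[red[i][j] * Re + green[i][j] * Gr + blue[i][j] * Bl
             for j in range(cols)] for i in range(rows)]
```

Because the factors sum to 1, a pixel whose three components are equal keeps that value in the gray image.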
3. The method for realizing all-around image safety monitoring by arranging intelligent cameras on a commercial vehicle according to claim 1, wherein in step S4-B, the image denoising processing on the gray-scale image comprises the following steps:
S0-21, finding the noise points among the image pixel points; whether a pixel point is a noise point is calculated as:

G(i,j) = 0, if F(i,j) = amax or F(i,j) = amin; G(i,j) = 1, otherwise,

wherein F(i,j) represents the gray value at image pixel (i, j), that is, the gray value of the pixel in the i-th row and j-th column of the image;

G(i,j) indicates whether the image pixel (i, j) is a noise point: G(i,j) = 0 indicates that the pixel (i, j) is a noise point, and G(i,j) = 1 indicates that it is a non-noise point;

"or" denotes the logical OR;

amax represents the maximum gray value in the p×p window centered on the image pixel (i, j), where p = 2p' + 1 and p' is a positive integer greater than or equal to 1 and less than or equal to 3;

amin represents the minimum gray value in the p×p window centered on the image pixel (i, j).
S0-22, numbering the pixel points in the window in matrix form, recorded as

G' = | G'(1,1)  G'(1,2)  G'(1,3)  …  G'(1,p) |
     | G'(2,1)  G'(2,2)  G'(2,3)  …  G'(2,p) |
     | G'(3,1)  G'(3,2)  G'(3,3)  …  G'(3,p) |
     |    …        …        …     …     …    |
     | G'(p,1)  G'(p,2)  G'(p,3)  …  G'(p,p) |

wherein G'(m,n) represents the pixel point in the m-th row and n-th column of the matrix G', for m and n from 1 to p;
S0-23, arranging the gray values in the matrix G' from small to large and updating the gray value of the pixel at each noise point; the updated gray value at a noise point is calculated as

F'(i,j) = ( g(⌊p²/2⌋) + g(⌊p²/2⌋+1) ) / 2,

wherein F'(i,j) represents the updated gray value of the pixel at the noise point; g(⌊p²/2⌋) represents the ⌊p²/2⌋-th gray value in the ordering; and g(⌊p²/2⌋+1) represents the (⌊p²/2⌋+1)-th gray value in the ordering.
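Steps S0-21 through S0-23 amount to an extreme-value (min/max) noise detector followed by an ordered-statistics replacement. The sketch below is an illustration under stated assumptions: the exact order-statistic indices in the claim's formula are not fully legible in the source, so the window median is used as the standard replacement choice, and p = 3 is one example window size within the claimed range.

```python
import numpy as np

def denoise(F, p=3):
    """Detect noise pixels whose gray value equals the max or min of the
    p x p window centered on them (G(i,j) = 0 in the claim), and replace
    each detected pixel with the median of the sorted window values."""
    F = np.asarray(F, dtype=float)
    out = F.copy()
    r = p // 2
    h, w = F.shape
    for i in range(r, h - r):          # interior pixels only
        for j in range(r, w - r):
            win = F[i - r:i + r + 1, j - r:j + r + 1]
            if F[i, j] == win.max() or F[i, j] == win.min():
                out[i, j] = np.median(win)   # S0-23 replacement
    return out
```

On a flat 100-gray patch with a single salt pixel of 255, the salt pixel is flagged (it equals the window maximum) and replaced by the window median of 100.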
CN202111280660.6A 2021-10-31 2021-10-31 Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile Active CN114025088B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111280660.6A CN114025088B (en) 2021-10-31 2021-10-31 Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111280660.6A CN114025088B (en) 2021-10-31 2021-10-31 Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile

Publications (2)

Publication Number Publication Date
CN114025088A true CN114025088A (en) 2022-02-08
CN114025088B CN114025088B (en) 2023-08-22

Family

ID=80059371

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111280660.6A Active CN114025088B (en) 2021-10-31 2021-10-31 Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile

Country Status (1)

Country Link
CN (1) CN114025088B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005277582A (en) * 2004-03-23 2005-10-06 Sony Corp Panorama imaging apparatus, control program thereof, panorama imaging method, monitoring system, and program recording medium
US20140111605A1 (en) * 2012-10-22 2014-04-24 National Chung Cheng University Low-complexity panoramic image and video stitching method
CN105005963A (en) * 2015-06-30 2015-10-28 重庆市勘测院 Multi-camera images stitching and color homogenizing method
CN107566730A (en) * 2017-09-27 2018-01-09 维沃移动通信有限公司 A kind of panoramic picture image pickup method and mobile terminal
CN108650495A (en) * 2018-06-28 2018-10-12 华域视觉科技(上海)有限公司 A kind of automobile-used panoramic looking-around system and its adaptive light compensation method
CN109978765A (en) * 2019-03-11 2019-07-05 上海保隆汽车科技股份有限公司 Panoramic picture brightness correcting method and joining method and its device
CN112102168A (en) * 2020-09-03 2020-12-18 成都中科合迅科技有限公司 Image splicing method and system based on multiple threads
WO2021031458A1 (en) * 2019-08-16 2021-02-25 域鑫科技(惠州)有限公司 Method and device for image color correction applicable in endoscope, and storage medium

Also Published As

Publication number Publication date
CN114025088B (en) 2023-08-22

Similar Documents

Publication Publication Date Title
Zhang et al. CCTSDB 2021: a more comprehensive traffic sign detection benchmark
CN107154022B (en) A kind of dynamic panorama mosaic method suitable for trailer
CN110660023B (en) Video stitching method based on image semantic segmentation
US9053372B2 (en) Road marking detection and recognition
CN112819094B (en) Target detection and identification method based on structural similarity measurement
JP6688277B2 (en) Program, learning processing method, learning model, data structure, learning device, and object recognition device
CN102867417B (en) Taxi anti-forgery system and taxi anti-forgery method
CN108305291B (en) Monocular vision positioning and attitude determination method utilizing wall advertisement containing positioning two-dimensional code
CN103295021A (en) Method and system for detecting and recognizing feature of vehicle in static image
CN113449632B (en) Vision and radar perception algorithm optimization method and system based on fusion perception and automobile
CN110736472A (en) indoor high-precision map representation method based on fusion of vehicle-mounted all-around images and millimeter wave radar
CN114022438A (en) System for realizing safety test of panoramic all-round looking images by mounting panoramic camera on vehicle
US11216905B2 (en) Automatic detection, counting, and measurement of lumber boards using a handheld device
Kruber et al. Vehicle position estimation with aerial imagery from unmanned aerial vehicles
CN112069906A (en) Traffic light identification method based on OpenCV and comprehensive matching distance
CN109508714B (en) Low-cost multi-channel real-time digital instrument panel visual identification method and system
CN112053407B (en) Automatic lane line detection method based on AI technology in traffic law enforcement image
CN111652937A (en) Vehicle-mounted camera calibration method and device
CN114025088A (en) Method for realizing all-around image safety monitoring by arranging intelligent camera on commercial vehicle
CN113902965A (en) Multi-spectral pedestrian detection method based on multi-layer feature fusion
CN114037670A (en) System for realizing image splicing test by mounting panoramic camera on vehicle running safely
CN114037611A (en) Working method for realizing image splicing by mounting panoramic camera on safe-driving automobile
CN114037669A (en) Working method for realizing panoramic all-around image test of automobile
CN114040155A (en) Panoramic all-around image testing system for vehicle
CN108876755B (en) Improved method for constructing color background of gray level image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant