CN114025088B - Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile


Info

Publication number
CN114025088B
CN114025088B
Authority
CN
China
Prior art keywords
image
panoramic
matrix
column
row
Prior art date
Legal status
Active
Application number
CN202111280660.6A
Other languages
Chinese (zh)
Other versions
CN114025088A (en)
Inventor
汤超
刘延
周金应
苏梦月
程前
陈金晶
Current Assignee
Caac Chongqing Automobile Inspection Co ltd
China Automotive Engineering Research Institute Co Ltd
Original Assignee
Caac Chongqing Automobile Inspection Co ltd
China Automotive Engineering Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Caac Chongqing Automobile Inspection Co ltd and China Automotive Engineering Research Institute Co Ltd
Priority to CN202111280660.6A
Publication of CN114025088A
Application granted
Publication of CN114025088B
Legal status: Active


Classifications

    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • H04N17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N5/2624 Studio circuits for obtaining an image which is composed of whole input images, e.g. splitscreen
    • H04N7/181 Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • B60R2300/105 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, using multiple cameras
    • Y02T10/40 Engine management systems


Abstract

The invention provides a method for realizing safety monitoring of an all-around image by arranging intelligent cameras on an operating automobile, which comprises performing brightness consistency detection on the stitched panoramic all-around image of a vehicle to be tested. The brightness consistency detection method comprises the following steps: S4-A, acquiring the stitched image and graying it to obtain a gray image; S4-B, denoising the gray image; S4-C, dividing the gray image into equal-sized regions according to the image size; S4-D, converting each region into the LAB color space, extracting the L luminance component, computing the mean of the L component for each region, and taking the differences between the region means; if a difference is greater than or equal to a preset difference threshold, the brightness is inconsistent; otherwise, the brightness is consistent. The invention can detect and analyze the brightness consistency of the stitched image.

Description

Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile
Technical Field
The invention relates to the technical field of vehicle safety, in particular to a method for realizing safety monitoring of an all-around image by arranging intelligent cameras on a commercial vehicle.
Background
A vehicle is any device that travels on driven wheels for the purpose of transporting people or goods. Recently, intelligent vehicles have been actively developed and commercialized for the safety and convenience of drivers and pedestrians. Intelligent vehicles are advanced vehicles incorporating information technology (IT): they not only introduce advanced systems into the vehicle itself but also provide optimal traffic efficiency through links to intelligent transportation systems. In particular, an intelligent vehicle maximizes the safety and convenience of drivers, passengers and pedestrians by performing automatic driving, adaptive cruise control (ACC), obstacle detection, collision detection, precise map provision, route setting to a destination, and provision of the locations of main places. As one means of maximizing the safety and convenience of drivers, passengers and pedestrians, the panoramic all-around control device has attracted attention. The panoramic all-around control device provides a panoramic all-around image of the vehicle using image pickup devices, and the driver can view the surroundings of the vehicle in real time through the panoramic all-around image.
Disclosure of Invention
The invention aims to solve at least the above technical problems in the prior art, and in particular provides a method for realizing safety monitoring of the all-around image by arranging intelligent cameras on a commercial vehicle.
In order to achieve the above purpose, the present invention provides a method for implementing panoramic image safety monitoring by arranging intelligent cameras on an operating vehicle. The system comprises a vehicle to be tested, on whose body A panoramic all-around cameras are arranged, A being a positive integer greater than or equal to 1; the cameras are respectively the 1st panoramic all-around camera, the 2nd panoramic all-around camera, the 3rd panoramic all-around camera, ……, and the A-th panoramic all-around camera;
A panoramic all-around display screen fixed mounting seat for fixedly mounting a panoramic all-around display screen is arranged in the cab of the vehicle to be tested, and the panoramic all-around display screen is fixedly mounted on the fixed mounting seat;
A panoramic all-around controller is arranged in the vehicle to be tested, and the panoramic all-around image data output end of the a-th panoramic all-around camera is connected with panoramic all-around image data input end a of the panoramic all-around controller, a being a positive integer less than or equal to A. Thus the output end of the 1st panoramic all-around camera is connected with input end 1 of the panoramic all-around controller, the output end of the 2nd panoramic all-around camera with input end 2, the output end of the 3rd panoramic all-around camera with input end 3, ……, and the output end of the A-th panoramic all-around camera with input end A. The display data output end of the panoramic all-around controller is connected with the display data input end of the panoramic all-around display screen;
The panoramic all-round controller carries out brightness consistency detection on the spliced panoramic all-round image according to the panoramic all-round image data acquired by the panoramic all-round camera arranged on the vehicle body to be tested; the brightness consistency detection method comprises the following steps:
S4-A, acquiring the stitched image and graying it to obtain a gray image;
S4-B, denoising the gray image;
S4-C, dividing the gray image into equal-sized regions according to the image size;
S4-D, converting each region into the LAB color space, extracting the L luminance component, computing the mean of the L component for each region, and taking the differences between the region means; if a difference is greater than or equal to a preset difference threshold, the brightness is inconsistent; otherwise, the brightness is consistent.
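As an illustration of steps S4-A to S4-D, the following minimal Python sketch (assuming OpenCV and NumPy are available) performs the regional brightness check. The 4×4 region grid, the 3×3 median filter, and the threshold of 10 L-units are illustrative assumptions rather than values fixed by the patent, and the L-channel statistics are taken directly from the denoised color image rather than from a separate gray image.

```python
import cv2
import numpy as np

def brightness_consistent(stitched_bgr, grid=(4, 4), diff_threshold=10.0):
    """Brightness consistency check over a stitched panoramic image (S4-A..S4-D)."""
    # S4-A/S4-B: denoise the stitched frame; a 3x3 median filter is one option
    denoised = cv2.medianBlur(stitched_bgr, 3)

    # S4-D prerequisite: convert to LAB and keep the L (luminance) channel
    luminance = cv2.cvtColor(denoised, cv2.COLOR_BGR2LAB)[:, :, 0].astype(np.float32)

    # S4-C: divide the image into equal-sized regions
    h, w = luminance.shape
    rows, cols = grid
    means = [luminance[i * h // rows:(i + 1) * h // rows,
                       j * w // cols:(j + 1) * w // cols].mean()
             for i in range(rows) for j in range(cols)]

    # S4-D: largest difference between region means vs. the preset threshold
    return (max(means) - min(means)) < diff_threshold
```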
In a preferred embodiment of the present invention, in step S4-A, the graying method is:

$$F_{(i,j)} = Red_{(i,j)} \times Re + Green_{(i,j)} \times Gr + Blue_{(i,j)} \times Bl,$$

where $Red_{(i,j)}$ denotes the red channel value at image pixel (i, j);
$Re$ denotes the fusion proportionality coefficient of the red channel, with $Re + Gr + Bl = 1$;
$Green_{(i,j)}$ denotes the green channel value at image pixel (i, j);
$Gr$ denotes the fusion proportionality coefficient of the green channel;
$Blue_{(i,j)}$ denotes the blue channel value at image pixel (i, j);
$Bl$ denotes the fusion proportionality coefficient of the blue channel.
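For concreteness, this weighted graying can be written in NumPy as below; the ITU-R BT.601 weights (0.299, 0.587, 0.114) are one common choice satisfying Re + Gr + Bl = 1, used here as an assumption since the patent does not fix the coefficients.

```python
import numpy as np

def to_gray(rgb, re=0.299, gr=0.587, bl=0.114):
    """F(i,j) = Red(i,j)*Re + Green(i,j)*Gr + Blue(i,j)*Bl, with Re + Gr + Bl = 1."""
    assert abs(re + gr + bl - 1.0) < 1e-9, "fusion coefficients must sum to 1"
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return np.rint(r * re + g * gr + b * bl).astype(np.uint8)
```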
In a preferred embodiment of the present invention, in step S4-B, the method of denoising the gray image comprises the following steps:
S0-21, finding the noise points among the pixel points of the image, where a pixel point is judged to be a noise point as follows:

$$G_{(i,j)} = \begin{cases} 0, & F_{(i,j)} \geq A_{max} \ \text{or} \ F_{(i,j)} \leq A_{min} \\ 1, & \text{otherwise,} \end{cases}$$

where $F_{(i,j)}$ denotes the gray value of image pixel (i, j), that is, the gray value of the pixel point in row i, column j of the image;
$G_{(i,j)}$ indicates whether image pixel (i, j) is a noise point: $G_{(i,j)} = 0$ means pixel (i, j) is a noise point, and $G_{(i,j)} = 1$ means it is not;
or denotes the logical OR condition;
$A_{max}$ denotes the maximum gray mean in the p×p window centered at image pixel (i, j), where p = 2p′ + 1 and p′ is a positive integer greater than or equal to 1 and less than or equal to 3;
$A_{min}$ denotes the minimum gray mean in the p×p window centered at image pixel (i, j);
S0-22, numbering the pixel points in the window as a matrix, denoted G′;
S0-23, arranging the gray values in the matrix G′ from smallest to largest, and updating the gray value of the pixel point at each noise point as:

$$\hat{F}_{(i,j)} = \frac{1}{2} \left( G'_{\left(\lceil p^2/2 \rceil\right)} + G'_{\left(\lceil p^2/2 \rceil + 1\right)} \right),$$

where $\hat{F}_{(i,j)}$ denotes the updated gray value of the pixel point at the noise point;
$G'_{(k)}$ denotes the k-th smallest gray value in the rank-ordered matrix G′.
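A minimal NumPy sketch of steps S0-21 to S0-23 follows. Since the exact A_max/A_min statistics are specified later via window means and the thresholds ψ1 and ψ2, the window maximum and minimum are used here as stand-ins for the noise test; a detected noise pixel is replaced by the mean of the two middle-ranked gray values of the sorted window, and the window size p = 3 (p′ = 1) is an assumption.

```python
import numpy as np

def denoise(gray, p=3):
    """Window-based noise detection and median-style replacement (S0-21..S0-23)."""
    half = p // 2
    padded = np.pad(gray.astype(np.int32), half, mode="edge")
    out = gray.copy()
    h, w = gray.shape
    for i in range(h):
        for j in range(w):
            window = padded[i:i + p, j:j + p]        # S0-22: the p*p matrix G'
            # S0-21: treat pixels at the window extremes as noise (G = 0)
            if gray[i, j] >= window.max() or gray[i, j] <= window.min():
                ranked = np.sort(window, axis=None)  # S0-23: sort small -> large
                mid = (p * p) // 2
                out[i, j] = (int(ranked[mid]) + int(ranked[mid + 1])) // 2
    return out
```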
In summary, by adopting the above technical scheme, the invention can perform brightness consistency detection and analysis on the stitched image.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and may be better understood from the following description of embodiments taken in conjunction with the accompanying drawings in which:
fig. 1 is a schematic block diagram of a flow of the present invention.
Fig. 2 is a schematic circuit connection diagram of a second power supply module, a third power supply module, a fourth power supply module, and a fifth power supply module according to the present invention.
Fig. 3 is a schematic circuit connection diagram of a first power supply module, a sixth power supply module and a seventh power supply module according to the present invention.
Fig. 4 is a schematic circuit connection diagram of an eighth power supply module, a ninth power supply module and a fault detection module according to the present invention.
Fig. 5 is a schematic diagram of circuit connection of the panoramic all-around acquisition module of the present invention.
FIG. 6 is a schematic diagram of the circuit connections of the controller module of the present invention.
FIG. 7 is a schematic diagram of a circuit connection of a storage module according to the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the invention.
The invention provides a method for realizing panoramic image safety monitoring by arranging intelligent cameras on an operating automobile. The system comprises a vehicle to be tested, on whose body A panoramic all-around cameras are arranged, A being a positive integer greater than or equal to 1; the cameras are respectively the 1st panoramic all-around camera, the 2nd panoramic all-around camera, the 3rd panoramic all-around camera, ……, and the A-th panoramic all-around camera;
A panoramic all-around display screen fixed mounting seat for fixedly mounting a panoramic all-around display screen is arranged in the cab of the vehicle to be tested, and the panoramic all-around display screen is fixedly mounted on the fixed mounting seat;
A panoramic all-around controller is arranged in the vehicle to be tested, and the panoramic all-around image data output end of the a-th panoramic all-around camera is connected with panoramic all-around image data input end a of the panoramic all-around controller, a being a positive integer less than or equal to A. Thus the output end of the 1st panoramic all-around camera is connected with input end 1 of the panoramic all-around controller, the output end of the 2nd panoramic all-around camera with input end 2, the output end of the 3rd panoramic all-around camera with input end 3, ……, and the output end of the A-th panoramic all-around camera with input end A. The display data output end of the panoramic all-around controller is connected with the display data input end of the panoramic all-around display screen;
The panoramic all-round controller carries out brightness consistency detection on the spliced panoramic all-round image according to the panoramic all-round image data acquired by the panoramic all-round camera arranged on the vehicle body to be tested; the brightness consistency detection method comprises the following steps:
S4-A, as shown in FIG. 1, acquiring the stitched image and graying it to obtain a gray image;
S4-B, denoising the gray image;
S4-C, dividing the gray image into equal-sized regions according to the image size;
S4-D, converting each region into the LAB color space, extracting the L luminance component, computing the mean of the L component for each region, and taking the differences between the region means; if a difference is greater than or equal to a preset difference threshold, the brightness is inconsistent; otherwise, the brightness is consistent.
In a preferred embodiment of the present invention, in step S4-A, the graying method is:

$$F_{(i,j)} = Red_{(i,j)} \times Re + Green_{(i,j)} \times Gr + Blue_{(i,j)} \times Bl,$$

where $Red_{(i,j)}$ denotes the red channel value at image pixel (i, j);
$Re$ denotes the fusion proportionality coefficient of the red channel, with $Re + Gr + Bl = 1$;
$Green_{(i,j)}$ denotes the green channel value at image pixel (i, j);
$Gr$ denotes the fusion proportionality coefficient of the green channel;
$Blue_{(i,j)}$ denotes the blue channel value at image pixel (i, j);
$Bl$ denotes the fusion proportionality coefficient of the blue channel.
In a preferred embodiment of the present invention, in step S4-B, the method of denoising the gray image comprises the following steps:
S0-21, finding the noise points among the pixel points of the image, where a pixel point is judged to be a noise point as follows:

$$G_{(i,j)} = \begin{cases} 0, & F_{(i,j)} \geq A_{max} \ \text{or} \ F_{(i,j)} \leq A_{min} \\ 1, & \text{otherwise,} \end{cases}$$

where $F_{(i,j)}$ denotes the gray value of image pixel (i, j), that is, the gray value of the pixel point in row i, column j of the image;
$G_{(i,j)}$ indicates whether image pixel (i, j) is a noise point: $G_{(i,j)} = 0$ means pixel (i, j) is a noise point, and $G_{(i,j)} = 1$ means it is not;
or denotes the logical OR condition;
$A_{max}$ denotes the maximum gray mean in the p×p window centered at image pixel (i, j), where p = 2p′ + 1 and p′ is a positive integer greater than or equal to 1 and less than or equal to 3;
$A_{min}$ denotes the minimum gray mean in the p×p window centered at image pixel (i, j);
S0-22, numbering the pixel points in the window as a matrix, denoted

$$G' = \begin{pmatrix} G'_{(1,1)} & G'_{(1,2)} & G'_{(1,3)} & \cdots & G'_{(1,p)} \\ G'_{(2,1)} & G'_{(2,2)} & G'_{(2,3)} & \cdots & G'_{(2,p)} \\ G'_{(3,1)} & G'_{(3,2)} & G'_{(3,3)} & \cdots & G'_{(3,p)} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ G'_{(p,1)} & G'_{(p,2)} & G'_{(p,3)} & \cdots & G'_{(p,p)} \end{pmatrix},$$

where $G'_{(i,j)}$ denotes the pixel point in row i, column j of the matrix G′, for i, j = 1, 2, …, p;
S0-23, arranging the gray values in the matrix G′ from smallest to largest, and updating the gray value of the pixel point at each noise point as:

$$\hat{F}_{(i,j)} = \frac{1}{2} \left( G'_{\left(\lceil p^2/2 \rceil\right)} + G'_{\left(\lceil p^2/2 \rceil + 1\right)} \right),$$

where $\hat{F}_{(i,j)}$ denotes the updated gray value of the pixel point at the noise point;
$G'_{(k)}$ denotes the k-th smallest gray value in the rank-ordered matrix G′.
The panoramic all-around image testing system for a vehicle according to the invention can be as follows: the system comprises a vehicle to be tested, on whose body A panoramic all-around cameras are arranged, A being a positive integer greater than or equal to 1; the cameras are respectively the 1st panoramic all-around camera, the 2nd panoramic all-around camera, the 3rd panoramic all-around camera, ……, and the A-th panoramic all-around camera;
the a-th panoramic looking-around camera comprises a panoramic looking-around camera shooting module, a panoramic looking-around camera controller and a wireless data Bluetooth transmission module; the panoramic looking-around image data transmission end of the controller is connected with the panoramic looking-around image data transmission end of the wireless data Bluetooth transmission module; a is a positive integer less than or equal to A, and the 1 st panoramic looking-around camera comprises a panoramic looking-around camera shooting module, a panoramic looking-around camera controller and a wireless data Bluetooth transmission module; the panoramic looking-around image data transmission end of the controller is connected with the panoramic looking-around image data transmission end of the wireless data Bluetooth transmission module; the 2 nd panoramic looking-around camera comprises a panoramic looking-around camera shooting module, a panoramic looking-around camera controller and a wireless data Bluetooth transmission module; the panoramic looking-around image data transmission end of the controller is connected with the panoramic looking-around image data transmission end of the wireless data Bluetooth transmission module; the 3 rd panoramic looking-around camera comprises a panoramic looking-around camera shooting module, a panoramic looking-around camera controller and a wireless data Bluetooth transmission module; the panoramic looking-around image data transmission end of the controller is connected with the panoramic looking-around image data transmission end of the wireless data Bluetooth transmission module; … …; the A-th panoramic looking-around camera comprises a panoramic looking-around camera shooting module, a panoramic looking-around camera controller and a wireless data Bluetooth transmission module; the panoramic looking-around image data transmission end of the controller is connected with the panoramic looking-around image data transmission end of the wireless data Bluetooth transmission module;
A panoramic all-around display screen fixed mounting seat for fixedly mounting a panoramic all-around display screen is arranged in the cab of the vehicle to be tested, and the panoramic all-around display screen is fixedly mounted on the fixed mounting seat;
the panoramic all-round display screen comprises a panoramic all-round display module, a panoramic all-round display controller and a wireless Bluetooth data transmission module; the panoramic looking-around image data transmission end of the wireless Bluetooth transmission data module is connected with the panoramic looking-around image data transmission end of the panoramic looking-around display controller, and the panoramic looking-around image data output end of the panoramic looking-around display controller is connected with the panoramic looking-around image data input end of the panoramic looking-around display module;
The a-th panoramic all-around camera transmits the panoramic all-around image data shot by its shooting module to the panoramic all-around display screen through the wireless data Bluetooth transmission module; the panoramic all-around display screen receives the panoramic all-around image data sent by the A panoramic all-around cameras through its wireless Bluetooth data transmission module, and the stitched panoramic all-around image data is displayed on the panoramic all-around display module. Brightness consistency detection is then performed on the stitched panoramic all-around image; the brightness consistency detection method comprises the following steps:
S4-A, acquiring the stitched image and graying it to obtain a gray image;
S4-B, denoising the gray image;
S4-C, dividing the gray image into equal-sized regions according to the image size;
S4-D, converting each region into the LAB color space, extracting the L luminance component, computing the mean of the L component for each region, and taking the differences between the region means; if a difference is greater than or equal to a preset difference threshold, the brightness is inconsistent; otherwise, the brightness is consistent.
In a preferred embodiment of the present invention, in step S4-A, the graying method is:

$$F_{(i,j)} = Red_{(i,j)} \times Re + Green_{(i,j)} \times Gr + Blue_{(i,j)} \times Bl,$$

where $Red_{(i,j)}$ denotes the red channel value at image pixel (i, j);
$Re$ denotes the fusion proportionality coefficient of the red channel, with $Re + Gr + Bl = 1$;
$Green_{(i,j)}$ denotes the green channel value at image pixel (i, j);
$Gr$ denotes the fusion proportionality coefficient of the green channel;
$Blue_{(i,j)}$ denotes the blue channel value at image pixel (i, j);
$Bl$ denotes the fusion proportionality coefficient of the blue channel.
In a preferred embodiment of the present invention, in step S4-B, the method of denoising the gray image comprises the following steps:
S0-21, finding the noise points among the pixel points of the image, where a pixel point is judged to be a noise point as follows:

$$G_{(i,j)} = \begin{cases} 0, & F_{(i,j)} \geq A_{max} \ \text{or} \ F_{(i,j)} \leq A_{min} \\ 1, & \text{otherwise,} \end{cases}$$

where $F_{(i,j)}$ denotes the gray value of image pixel (i, j), that is, the gray value of the pixel point in row i, column j of the image;
$G_{(i,j)}$ indicates whether image pixel (i, j) is a noise point: $G_{(i,j)} = 0$ means pixel (i, j) is a noise point, and $G_{(i,j)} = 1$ means it is not;
or denotes the logical OR condition;
$A_{max}$ denotes the maximum gray mean in the p×p window centered at image pixel (i, j), where p = 2p′ + 1 and p′ is a positive integer greater than or equal to 1 and less than or equal to 3;
$A_{min}$ denotes the minimum gray mean in the p×p window centered at image pixel (i, j);
S0-22, numbering the pixel points in the window as a matrix, denoted

$$G' = \begin{pmatrix} G'_{(1,1)} & G'_{(1,2)} & G'_{(1,3)} & \cdots & G'_{(1,p)} \\ G'_{(2,1)} & G'_{(2,2)} & G'_{(2,3)} & \cdots & G'_{(2,p)} \\ G'_{(3,1)} & G'_{(3,2)} & G'_{(3,3)} & \cdots & G'_{(3,p)} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ G'_{(p,1)} & G'_{(p,2)} & G'_{(p,3)} & \cdots & G'_{(p,p)} \end{pmatrix},$$

where $G'_{(i,j)}$ denotes the pixel point in row i, column j of the matrix G′, for i, j = 1, 2, …, p;
S0-23, arranging the gray values in the matrix G′ from smallest to largest, and updating the gray value of the pixel point at each noise point as:

$$\hat{F}_{(i,j)} = \frac{1}{2} \left( G'_{\left(\lceil p^2/2 \rceil\right)} + G'_{\left(\lceil p^2/2 \rceil + 1\right)} \right),$$

where $\hat{F}_{(i,j)}$ denotes the updated gray value of the pixel point at the noise point;
$G'_{(k)}$ denotes the k-th smallest gray value in the rank-ordered matrix G′.
The panoramic all-around image testing system for a vehicle according to the invention can also be as follows: the system comprises a vehicle to be tested, on whose body A panoramic all-around cameras are arranged, A being a positive integer greater than or equal to 1; the cameras are respectively the 1st panoramic all-around camera, the 2nd panoramic all-around camera, the 3rd panoramic all-around camera, ……, and the A-th panoramic all-around camera;
A panoramic all-around display screen fixed mounting seat for fixedly mounting a panoramic all-around display screen is arranged in the cab of the vehicle to be tested, and the panoramic all-around display screen is fixedly mounted on the fixed mounting seat;
A panoramic all-around controller and a WiFi wireless data transmission module are arranged in the vehicle to be tested. The panoramic all-around image data output end of the a-th panoramic all-around camera is connected with panoramic all-around image data input end a of the panoramic all-around controller: the output end of the 1st panoramic all-around camera is connected with input end 1 of the panoramic all-around controller, the output end of the 2nd panoramic all-around camera with input end 2, the output end of the 3rd panoramic all-around camera with input end 3, ……, and the output end of the A-th panoramic all-around camera with input end A. The display data output end of the panoramic all-around controller is connected with the display data input end of the panoramic all-around display screen; the WiFi data transmission end of the panoramic all-around controller is connected with the data transmission end of the WiFi wireless data transmission module;
The panoramic all-around controller transmits the panoramic all-around image data acquired by the panoramic all-around cameras arranged on the body of the vehicle to be tested to the processing terminal through the WiFi wireless data transmission module, and the processing terminal displays the processed panoramic all-around image on the panoramic all-around display screen. Brightness consistency detection is performed on the panoramic all-around image displayed on the panoramic all-around display screen; the brightness consistency detection method comprises the following steps:
S4-A, acquiring the stitched image and graying it to obtain a gray image;
S4-B, denoising the gray image;
S4-C, dividing the gray image into equal-sized regions according to the image size;
S4-D, converting each region into the LAB color space, extracting the L luminance component, computing the mean of the L component for each region, and taking the differences between the region means; if a difference is greater than or equal to a preset difference threshold, the brightness is inconsistent; otherwise, the brightness is consistent.
In a preferred embodiment of the present invention, in step S4-A, the graying method is:

$$F_{(i,j)} = Red_{(i,j)} \times Re + Green_{(i,j)} \times Gr + Blue_{(i,j)} \times Bl,$$

where $Red_{(i,j)}$ denotes the red channel value at image pixel (i, j);
$Re$ denotes the fusion proportionality coefficient of the red channel, with $Re + Gr + Bl = 1$;
$Green_{(i,j)}$ denotes the green channel value at image pixel (i, j);
$Gr$ denotes the fusion proportionality coefficient of the green channel;
$Blue_{(i,j)}$ denotes the blue channel value at image pixel (i, j);
$Bl$ denotes the fusion proportionality coefficient of the blue channel.
In a preferred embodiment of the present invention, in step S4-B, the method of denoising the gray image comprises the following steps:
S0-21, finding the noise points among the pixel points of the image, where a pixel point is judged to be a noise point as follows:

$$G_{(i,j)} = \begin{cases} 0, & F_{(i,j)} \geq A_{max} \ \text{or} \ F_{(i,j)} \leq A_{min} \\ 1, & \text{otherwise,} \end{cases}$$

where $F_{(i,j)}$ denotes the gray value of image pixel (i, j), that is, the gray value of the pixel point in row i, column j of the image;
$G_{(i,j)}$ indicates whether image pixel (i, j) is a noise point: $G_{(i,j)} = 0$ means pixel (i, j) is a noise point, and $G_{(i,j)} = 1$ means it is not;
or denotes the logical OR condition;
$A_{max}$ denotes the maximum gray mean in the p×p window centered at image pixel (i, j), where p = 2p′ + 1 and p′ is a positive integer greater than or equal to 1 and less than or equal to 3;
$A_{min}$ denotes the minimum gray mean in the p×p window centered at image pixel (i, j);
S0-22, numbering the pixel points in the window as a matrix, denoted

$$G' = \begin{pmatrix} G'_{(1,1)} & G'_{(1,2)} & G'_{(1,3)} & \cdots & G'_{(1,p)} \\ G'_{(2,1)} & G'_{(2,2)} & G'_{(2,3)} & \cdots & G'_{(2,p)} \\ G'_{(3,1)} & G'_{(3,2)} & G'_{(3,3)} & \cdots & G'_{(3,p)} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ G'_{(p,1)} & G'_{(p,2)} & G'_{(p,3)} & \cdots & G'_{(p,p)} \end{pmatrix},$$

where $G'_{(i,j)}$ denotes the pixel point in row i, column j of the matrix G′, for i, j = 1, 2, …, p;
S0-23, arranging the gray values in the matrix G′ from smallest to largest, and updating the gray value of the pixel point at each noise point as:

$$\hat{F}_{(i,j)} = \frac{1}{2} \left( G'_{\left(\lceil p^2/2 \rceil\right)} + G'_{\left(\lceil p^2/2 \rceil + 1\right)} \right),$$

where $\hat{F}_{(i,j)}$ denotes the updated gray value of the pixel point at the noise point;
$G'_{(k)}$ denotes the k-th smallest gray value in the rank-ordered matrix G′.
In this embodiment, the system further comprises a calibration device. The calibration device comprises a transverse supporting plate on which a linearly sliding track is arranged; a moving device is arranged on the track, a vertical supporting rod is arranged on the moving device, a calibration target plate is arranged on the vertical supporting rod, and black-and-white array grids are arranged on either one side or both sides of the calibration target plate;
the moving device comprises a pulley capable of sliding on the track and a load arranged on the pulley, wherein the load comprises a box body, a driving module arranged in the box body for driving the pulley to slide, a calibration controller, and a wireless Bluetooth transmission unit;
the driving end of the calibration controller is connected with the control end of the driving module, and the wireless data end of the calibration controller is connected with the wireless data end of the wireless Bluetooth transmission unit; a panoramic looking-around Bluetooth transmission module is correspondingly arranged on the vehicle to be tested, and a panoramic looking-around image data transmission end of the panoramic looking-around controller is connected with a panoramic looking-around image data transmission end of the panoramic looking-around Bluetooth transmission module;
The calibration device is arranged beside the panoramic looking-around camera, so that the panoramic looking-around camera is opposite to any surface of the calibration target plate, and the calibration controller adjusts the distance between the panoramic looking-around camera and the calibration target plate according to the control signal sent by the panoramic looking-around controller.
In a preferred embodiment of the invention, 4 panoramic looking-around cameras are arranged on the vehicle body of the vehicle to be tested, namely a 1 st panoramic looking-around camera, a 2 nd panoramic looking-around camera, a 3 rd panoramic looking-around camera and a 4 th panoramic looking-around camera;
a 1 st panoramic looking-around camera fixed mounting seat for fixedly mounting a 1 st panoramic looking-around camera is arranged in the middle of the head of the vehicle to be tested, and the 1 st panoramic looking-around camera is fixedly mounted on the 1 st panoramic looking-around camera fixed mounting seat; a 2 nd panoramic looking-around camera fixed mounting seat for fixedly mounting a 2 nd panoramic looking-around camera is arranged in the middle of the tail of the vehicle to be tested, and the 2 nd panoramic looking-around camera is fixedly mounted on the 2 nd panoramic looking-around camera fixed mounting seat; a 3 rd panoramic looking-around camera fixed mounting seat for fixedly mounting the 3 rd panoramic looking-around camera is arranged in the middle of the left side of the vehicle body of the vehicle to be tested, and the 3 rd panoramic looking-around camera is fixedly mounted on the 3 rd panoramic looking-around camera fixed mounting seat; a 4 th panoramic looking-around camera fixed mounting seat for fixedly mounting a 4 th panoramic looking-around camera is arranged in the middle of the right side of the vehicle body of the vehicle to be tested, and the 4 th panoramic looking-around camera is fixedly mounted on the 4 th panoramic looking-around camera fixed mounting seat;
The panoramic looking-around image data output end of the 1 st panoramic looking-around camera is connected with the panoramic looking-around image data input 1 st end of the panoramic looking-around controller, the panoramic looking-around image data output end of the 2 nd panoramic looking-around camera is connected with the panoramic looking-around image data input 2 nd end of the panoramic looking-around controller, the panoramic looking-around image data output end of the 3 rd panoramic looking-around camera is connected with the panoramic looking-around image data input 3 rd end of the panoramic looking-around controller, and the panoramic looking-around image data output end of the 4 th panoramic looking-around camera is connected with the panoramic looking-around image data input 4 th end of the panoramic looking-around controller.
The invention also discloses a method for testing the panoramic all-around image of the vehicle, which comprises the following steps:
s0, correcting the panoramic camera lens;
s1, acquiring panoramic surrounding image data shot by a panoramic surrounding camera;
s2, filtering and updating image noise of the shot panoramic looking-around image data;
s3, matching calibration is carried out according to images shot at adjacent moments;
s4, extracting image characteristic points in the images to realize the stitching of panoramic all-around images.
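The test flow S0 to S4 can be outlined as a Python skeleton; every function name below is a placeholder for the corresponding procedure detailed in the following paragraphs, not an API defined by the patent.

```python
def test_panoramic_image(cameras):
    """Skeleton of the vehicle panoramic all-around image test (steps S0-S4)."""
    params = [correct_lens(cam) for cam in cameras]           # S0: lens correction
    frames = [capture_frame(cam) for cam in cameras]          # S1: acquire image data
    frames = [denoise(frame) for frame in frames]             # S2: filter image noise
    offsets = match_adjacent_frames(frames)                   # S3: matching calibration
    return stitch_by_feature_points(frames, params, offsets)  # S4: stitch the image
```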
In a preferred embodiment of the present invention, in step S0, a method for correcting a panoramic camera lens includes the steps of:
S01, acquiring the relation between an image pixel and an environment coordinate system where a vehicle to be tested is located;
s02, correcting the panoramic all-around camera lens according to the shot calibration target plate.
In a preferred embodiment of the present invention, in step S01, the method for calculating the relationship between the image pixels and the environmental coordinate system in which the vehicle to be tested is located includes the steps of:
S011, acquiring the relation between the image pixels and the image size, where the relation is:

$$\begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = \begin{pmatrix} dx & 0 & -U_0\,dx \\ 0 & dy & -V_0\,dy \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} U \\ V \\ 1 \end{pmatrix},$$

where $(x, y, 1)^T$ denotes homogeneous image coordinates in millimeters (mm), and T denotes the transpose;
$(U_0, V_0)^T$ denotes the principal point coordinates of the image;
$dx$ denotes the size of a panoramic all-around camera pixel in the x-axis direction;
$dy$ denotes the size of a panoramic all-around camera pixel in the y-axis direction;
$(U, V, 1)^T$ denotes homogeneous image coordinates in pixels;
S012, obtaining the relation between the panoramic all-around camera coordinate system and the image size, where the relation is:

$$s \begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = \begin{pmatrix} f & 0 & 0 \\ 0 & f & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} X \\ Y \\ Z \end{pmatrix},$$

where $(X, Y, Z)^T$ denotes coordinates in the panoramic all-around camera coordinate system, and T denotes the transpose;
$(x, y, 1)^T$ denotes homogeneous image coordinates in millimeters (mm);
$s$ denotes the panoramic all-around camera adjusting parameter;
$f$ denotes the focal length of the panoramic all-around camera;
S013, according to the relation between the image pixels and the image size in step S011 and the relation between the panoramic all-around camera coordinate system and the image size in step S012, obtaining the relation between the image pixels and the panoramic all-around camera coordinate system. Substituting the former into the latter gives:

$$s \begin{pmatrix} U \\ V \\ 1 \end{pmatrix} = \begin{pmatrix} f/dx & 0 & U_0 \\ 0 & f/dy & V_0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} X \\ Y \\ Z \end{pmatrix};$$
S014, obtaining the relation between the panoramic all-around camera coordinate system and the environment coordinate system in which the vehicle to be tested is located:

$$\begin{pmatrix} X \\ Y \\ Z \end{pmatrix} = R \begin{pmatrix} X_0 \\ Y_0 \\ Z_0 \end{pmatrix} + M,$$

where $(X_0, Y_0, Z_0)^T$ denotes coordinates in the environment coordinate system of the vehicle to be tested, and T denotes the transpose;
$R$ denotes the panoramic all-around camera coordinate system rotation matrix;
$(X, Y, Z)^T$ denotes coordinates in the panoramic all-around camera coordinate system;
$M$ denotes the panoramic all-around camera coordinate system translation matrix;
S015, according to the relation between the image pixels and the panoramic all-around camera coordinate system in step S013 and the relation between the panoramic all-around camera coordinate system and the environment coordinate system in step S014, obtaining the relation between the image pixels and the environment coordinate system of the vehicle to be tested. Substituting the latter into the former gives:

$$s \begin{pmatrix} U \\ V \\ 1 \end{pmatrix} = \begin{pmatrix} f/dx & 0 & U_0 \\ 0 & f/dy & V_0 \\ 0 & 0 & 1 \end{pmatrix} \left( R \begin{pmatrix} X_0 \\ Y_0 \\ Z_0 \end{pmatrix} + M \right),$$

where $(X_0, Y_0, Z_0)^T$ denotes coordinates in the environment coordinate system of the vehicle to be tested, and T denotes the transpose;
$R$ denotes the panoramic all-around camera coordinate system rotation matrix;
$s$ denotes the panoramic all-around camera adjusting parameter;
$f$ denotes the focal length of the panoramic all-around camera;
$dx$ denotes the size of a panoramic all-around camera pixel in the x-axis direction;
$dy$ denotes the size of a panoramic all-around camera pixel in the y-axis direction;
$(U_0, V_0)^T$ denotes the principal point coordinates of the image;
$M$ denotes the panoramic all-around camera coordinate system translation matrix.
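A compact NumPy rendering of the chained relation of step S015 follows, projecting a point from the vehicle's environment coordinate system to pixel coordinates; the numerical values of f, dx, dy, U0, V0, R and M in the example are illustrative assumptions.

```python
import numpy as np

def world_to_pixel(p_world, f, dx, dy, u0, v0, R, M):
    """s*(U, V, 1)^T = K (R (X0, Y0, Z0)^T + M), per step S015."""
    K = np.array([[f / dx, 0.0, u0],
                  [0.0, f / dy, v0],
                  [0.0, 0.0, 1.0]])
    p_cam = R @ np.asarray(p_world, dtype=float) + M  # S014: environment -> camera
    uvs = K @ p_cam                                   # S013: camera -> scaled pixels
    return uvs[:2] / uvs[2]                           # divide by s (here s = Z)

# Example with assumed values: camera frame aligned with the environment frame
R, M = np.eye(3), np.zeros(3)
print(world_to_pixel([0.5, 0.2, 2.0], f=4.0, dx=0.002, dy=0.002, u0=640, v0=360, R=R, M=M))
```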
In a preferred embodiment of the present invention, in step S012, the method for calculating the panoramic all-around camera adjusting parameter s comprises the following steps:

S0121, taking any three points $(X_1, Y_1, Z_1)$, $(X_2, Y_2, Z_2)$ and $(X_3, Y_3, Z_3)$ in the environment coordinate system of the vehicle to be tested, with $(X_1, Y_1, Z_1) \ne (X_2, Y_2, Z_2) \ne (X_3, Y_3, Z_3)$, and determining the plane equation set:

$$\begin{cases} a_1 X_1 + a_2 Y_1 + a_3 Z_1 + a_4 = 0 \\ a_1 X_2 + a_2 Y_2 + a_3 Z_2 + a_4 = 0 \\ a_1 X_3 + a_2 Y_3 + a_3 Z_3 + a_4 = 0, \end{cases}$$

where $a_1$ denotes the coefficient of the X-axis in the environment coordinate system;
$a_2$ denotes the coefficient of the Y-axis in the environment coordinate system;
$a_3$ denotes the coefficient of the Z-axis in the environment coordinate system;
$a_4$ denotes the offset coefficient in the environment coordinate system;
S0122, obtaining the plane equation from the plane equation set: solving the set for $(a_1, a_2, a_3, a_4)$ and substituting into

$$a_1 X + a_2 Y + a_3 Z + a_4 = 0$$

yields the plane passing through the three points;
S0123, obtaining the panoramic all-around camera adjusting parameter from the plane equation: substituting the relation of step S013, namely $X = s(U - U_0)\,dx / f$, $Y = s(V - V_0)\,dy / f$ and $Z = s$, into the plane equation gives:

$$s = \frac{-a_4}{a_1 \dfrac{(U - U_0)\,dx}{f} + a_2 \dfrac{(V - V_0)\,dy}{f} + a_3},$$

where $(X_1, Y_1, Z_1)$, $(X_2, Y_2, Z_2)$ and $(X_3, Y_3, Z_3)$ are any three points in the environment coordinate system of the vehicle to be tested, on which the coefficients $a_1, a_2, a_3, a_4$ depend;
$s$ denotes the panoramic all-around camera adjusting parameter;
$f$ denotes the focal length of the panoramic all-around camera;
$dx$ denotes the size of a panoramic all-around camera pixel in the x-axis direction;
$dy$ denotes the size of a panoramic all-around camera pixel in the y-axis direction.
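Steps S0121 to S0123 can be checked numerically: the plane coefficients follow from a cross product of two in-plane vectors, after which s is evaluated per pixel. A sketch with three assumed ground points (all camera parameters illustrative):

```python
import numpy as np

def plane_coefficients(p1, p2, p3):
    """Plane a1*X + a2*Y + a3*Z + a4 = 0 through three non-collinear points (S0121-S0122)."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    a123 = np.cross(p2 - p1, p3 - p1)   # normal vector (a1, a2, a3)
    return (*a123, -a123.dot(p1))       # a4 = -(a1*X1 + a2*Y1 + a3*Z1)

def adjusting_parameter(u, v, coeffs, f, dx, dy, u0, v0):
    """S0123: s = -a4 / (a1*(U-U0)*dx/f + a2*(V-V0)*dy/f + a3)."""
    a1, a2, a3, a4 = coeffs
    return -a4 / (a1 * (u - u0) * dx / f + a2 * (v - v0) * dy / f + a3)

# Assumed ground plane Z = 2 in the camera frame: s should evaluate to 2 everywhere
coeffs = plane_coefficients((0, 0, 2), (1, 0, 2), (0, 1, 2))
print(adjusting_parameter(700, 400, coeffs, f=4.0, dx=0.002, dy=0.002, u0=640, v0=360))
```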
In a preferred embodiment of the present invention, in step S02, the calculation method for correcting the panoramic all-around camera lens according to the photographed calibration target plate comprises:

S021, obtaining the mapping relation between the calibration target plate and the panoramic all-around camera:

$$s\,O = D\,[R\ \ m]\,o, \qquad D = \begin{pmatrix} f_x & f_t & U_0 \\ 0 & f_y & V_0 \\ 0 & 0 & 1 \end{pmatrix},$$

where $D$ denotes the internal parameter matrix of the panoramic all-around camera;
$f_x$ denotes the scale coefficient of the panoramic all-around camera pixels in the x-axis direction;
$f_y$ denotes the scale coefficient of the panoramic all-around camera pixels in the y-axis direction;
$f_t$ denotes the non-perpendicularity coefficient of the x-axis and the y-axis;
$(U_0, V_0)^T$ denotes the principal point coordinates of the image;
$[r_1\ r_2\ r_3\ m] = [R\ m]$ denotes the panoramic all-around camera coordinate system rotation matrix R together with the translation matrix M, where $r_1$, $r_2$ and $r_3$ denote columns 1, 2 and 3 of R, and $m$ denotes the translation column;
$o = (X_0, Y_0, 0, 1)^T$ denotes the homogeneous coordinates of a point on the calibration target plate (the Z-axis coordinate being 0 at this time);
$O = (u', v', 1)^T$ denotes the homogeneous coordinates of the corresponding image point;
S022, obtaining the conversion matrix between the calibration target plate and the panoramic all-around camera. Since the Z-axis coordinate on the target plane is 0, the mapping reduces to

$$s\,O = h\,o, \qquad h = \lambda D\,[r_1\ \ r_2\ \ m] = [h_1\ \ h_2\ \ h_3],$$

where $h$ denotes the conversion matrix, with $o = (X_0, Y_0, 1)^T$ in this reduced form;
$\lambda$ denotes a conversion parameter;
$h_1$, $h_2$ and $h_3$ denote columns 1, 2 and 3 of the conversion matrix $h$;
S023, calculating the conversion matrix $h$ using the objective function

$$\min_h \sum_i \left\| O_i - \hat{O}_i \right\|^2,$$

where min denotes taking the minimum value;
$\| \cdot \|^2$ denotes the squared 2-norm;
$O_i$ denotes the i-th observed homogeneous image coordinate;
$\hat{O}_i$ denotes the i-th homogeneous image coordinate calculated from $h$;
S024, from the conversion matrix $h$, the following constraints can be obtained:

$$h_1^T D^{-T} D^{-1} h_2 = 0, \qquad h_1^T D^{-T} D^{-1} h_1 = h_2^T D^{-T} D^{-1} h_2,$$

where $D^{-T}$ denotes the inverse matrix of $D^T$, and $D^{-1}$ is the inverse of $D$.

Let

$$L = D^{-T} D^{-1} = \begin{pmatrix} L_{11} & L_{12} & L_{13} \\ L_{21} & L_{22} & L_{23} \\ L_{31} & L_{32} & L_{33} \end{pmatrix},$$

where $L_{uv}$ denotes the element in row u, column v of the matrix L.
Then

$$h_i^T L\,h_j = c_{ij}^T\,l,$$

where $c_{ij} = [\,h_{i1} h_{j1} \quad h_{i1} h_{j2} + h_{i2} h_{j1} \quad h_{i2} h_{j2} \quad h_{i3} h_{j1} + h_{i1} h_{j3} \quad h_{i3} h_{j2} + h_{i2} h_{j3} \quad h_{i3} h_{j3}\,]^T$;
$h_{i1}$, $h_{i2}$ and $h_{i3}$ denote the 1st, 2nd and 3rd elements of column $h_i$ of the conversion matrix $h$, and likewise $h_{j1}$, $h_{j2}$ and $h_{j3}$ for column $h_j$;
$c_{ij}$ denotes the constraint vector indexed by row i, column j of the matrix C;
$l$ denotes the parameter vector collecting the independent elements of L.

Rewriting $h_i^T L\,h_j = c_{ij}^T\,l$ as

$$\begin{pmatrix} c_{12}^T \\ (c_{11} - c_{22})^T \end{pmatrix} l = 0,$$

where $c_{11}$, $c_{12}$ and $c_{22}$ are the constraint vectors defined above, and combining the equation sets of the g pictures shot of the calibration target plate, g being a positive integer greater than or equal to 6 and less than or equal to 8, gives $C\,l = 0$. The matrix L can then be obtained, and all internal parameters of the panoramic all-around camera follow from an optimized search calculation;
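The assembly of Cl = 0 in step S024 can be sketched as below, assuming NumPy and one 3×3 conversion matrix h per calibration picture; l, which collects the six independent entries of the symmetric matrix L, is recovered as the null vector of C by SVD. This sketches only the constraint stacking, not the subsequent optimized search for the intrinsic parameters.

```python
import numpy as np

def c_vec(H, i, j):
    """The 6-vector c_ij built from columns h_i and h_j of the conversion matrix h."""
    hi, hj = H[:, i], H[:, j]
    return np.array([hi[0] * hj[0],
                     hi[0] * hj[1] + hi[1] * hj[0],
                     hi[1] * hj[1],
                     hi[2] * hj[0] + hi[0] * hj[2],
                     hi[2] * hj[1] + hi[1] * hj[2],
                     hi[2] * hj[2]])

def solve_l(homographies):
    """Stack c_12 and (c_11 - c_22) for each picture and solve C l = 0 (S024)."""
    rows = []
    for H in homographies:                            # ideally 6 <= g <= 8 pictures
        rows.append(c_vec(H, 0, 1))                   # h1^T L h2 = 0
        rows.append(c_vec(H, 0, 0) - c_vec(H, 1, 1))  # h1^T L h1 = h2^T L h2
    C = np.vstack(rows)
    return np.linalg.svd(C)[2][-1]  # right singular vector of the smallest singular value
```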
S025, having obtained the matrix L, the corrected panoramic all-around camera coordinate system rotation matrix R and translation matrix M are obtained as:

$$r_1 = \lambda L^{-1} h_1,$$

$$r_2 = \lambda L^{-1} h_2,$$

$$r_3 = r_1 \times r_2,$$

$$m = \lambda L^{-1} h_3,$$

where $L^{-1}$ denotes the inverse matrix of L;
$h_1$ denotes column 1 of the conversion matrix $h$;
$h_2$ denotes column 2 of the conversion matrix $h$;
$h_3$ denotes column 3 of the conversion matrix $h$.
In a preferred embodiment of the present invention, in step S2, the calculation method for filtering image noise from captured panoramic looking-around image data includes the steps of:
S21, finding the noise points among the pixel points of the image, where a pixel point is judged to be a noise point as follows:

$$G_{(i,j)} = \begin{cases} 0, & F_{(i,j)} \geq A_{max} \ \text{or} \ F_{(i,j)} \leq A_{min} \\ 1, & \text{otherwise,} \end{cases}$$

where $F_{(i,j)}$ denotes the gray value of image pixel (i, j), that is, the gray value of the pixel point in row i, column j of the image;
$G_{(i,j)}$ indicates whether image pixel (i, j) is a noise point: $G_{(i,j)} = 0$ means pixel (i, j) is a noise point, and $G_{(i,j)} = 1$ means it is not;
or denotes the logical OR condition;
$A_{max}$ denotes the maximum gray mean in the p×p window centered at image pixel (i, j), where p = 2p′ + 1 and p′ is a positive integer greater than or equal to 1 and less than or equal to 3;
$A_{min}$ denotes the minimum gray mean in the p×p window centered at image pixel (i, j);
S22, numbering the pixel points in the window as a matrix, denoted

$$G' = \begin{pmatrix} G'_{(1,1)} & G'_{(1,2)} & G'_{(1,3)} & \cdots & G'_{(1,p)} \\ G'_{(2,1)} & G'_{(2,2)} & G'_{(2,3)} & \cdots & G'_{(2,p)} \\ G'_{(3,1)} & G'_{(3,2)} & G'_{(3,3)} & \cdots & G'_{(3,p)} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ G'_{(p,1)} & G'_{(p,2)} & G'_{(p,3)} & \cdots & G'_{(p,p)} \end{pmatrix},$$

where $G'_{(i,j)}$ denotes the pixel point in row i, column j of the matrix G′, for i, j = 1, 2, …, p;
S23, arranging the gray values in the matrix G′ from smallest to largest, and updating the gray value of the pixel point at each noise point as:

$$\hat{F}_{(i,j)} = \frac{1}{2} \left( G'_{\left(\lceil p^2/2 \rceil\right)} + G'_{\left(\lceil p^2/2 \rceil + 1\right)} \right),$$

where $\hat{F}_{(i,j)}$ denotes the updated gray value of the pixel point at the noise point;
$G'_{(k)}$ denotes the k-th smallest gray value in the rank-ordered matrix G′.
In a preferred embodiment of the present invention, in step S21, the method for calculating the maximum gray mean is:
S211, obtaining the gray mean from the gray values in the window, the gray mean being the rounded average, over all pixel points of the matrix G′, of the corresponding gray values,
wherein the gray value corresponding to the pixel point in row i′, column j′ of the matrix G′ enters the average, and int() denotes the rounding function;
S212, obtaining the maximum gray mean from the gray mean,
wherein ψ_1 denotes a preset first gray threshold, a positive number greater than 0, and G_max denotes the maximum gray value in the matrix G′;
or/and, in step S21, the method for calculating the minimum gray mean is:
S21-1, obtaining the gray mean from the gray values in the window in the same way as in step S211, wherein the gray value corresponding to the pixel point in row i′, column j′ of the matrix G′ enters the average, and int() denotes the rounding function;
S21-2, obtaining the minimum gray mean from the gray mean,
wherein ψ_2 denotes a preset second gray threshold, a positive number greater than or equal to ψ_1, and G_max denotes the maximum gray value in the matrix G′.
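Since the defining expressions for the two means were lost in extraction, the following sketch shows one plausible reading only: the maximum gray mean as the mean of window values within ψ₁ of the window maximum, and the minimum gray mean as the mean of values within ψ₂ of the window minimum; the thresholds and the rounded window mean follow the definitions above:

import numpy as np

def window_gray_means(win, psi1, psi2):
    # win: p x p array of window gray values (the matrix G')
    gray_mean = int(win.mean())                    # rounded gray mean, per int()
    a_max = win[win >= win.max() - psi1].mean()    # assumed maximum gray mean
    a_min = win[win <= win.min() + psi2].mean()    # assumed minimum gray mean
    return gray_mean, a_max, a_min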
In a preferred embodiment of the present invention, in step S21, the method for calculating the gray value of the image is:
F(i,j) = Red(i,j)×Re + Green(i,j)×Gr + Blue(i,j)×Bl,
wherein Red(i,j) denotes the amount of the red color mode at image pixel (i, j);
Re denotes the fusion proportionality coefficient of the red color mode amount; Re + Gr + Bl = 1;
Green(i,j) denotes the amount of the green color mode at image pixel (i, j);
Gr denotes the fusion proportionality coefficient of the green color mode amount;
Blue(i,j) denotes the amount of the blue color mode at image pixel (i, j);
Bl denotes the fusion proportionality coefficient of the blue color mode amount.
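A minimal sketch of this weighted graying follows; the patent only requires Re + Gr + Bl = 1, so the common luminance weights 0.299/0.587/0.114 are used here purely as an assumed example:

import numpy as np

RE, GR, BL = 0.299, 0.587, 0.114   # assumed fusion coefficients, summing to 1

def to_gray(rgb):
    # rgb: H x W x 3 array with channels ordered (R, G, B)
    return RE * rgb[..., 0] + GR * rgb[..., 1] + BL * rgb[..., 2]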
In a preferred embodiment of the present invention, in step S3, the calculation method for performing matching calibration on images photographed at adjacent times comprises the following steps:
S31, obtaining the picture shot by the panoramic looking-around camera at time t−1 and the picture shot by the panoramic looking-around camera at time t, and constructing the image matching equation,
wherein P_(a,t−1) denotes the picture shot by the a-th panoramic looking-around camera at time t−1, a = 1, 2, 3, …, A;
P_(a′,t) denotes the picture shot by the a′-th panoramic looking-around camera at time t;
(x_c, y_d) denotes the coordinates of a pixel point in the image, with c ∈ [0, W×r−1], d ∈ [0, H×r−1] and c, d ∈ Z+, where ∈ denotes set membership, W denotes the width of the image shot by the panoramic camera, H denotes its height, r denotes the resolution of the shot image, and Z+ denotes the set of positive integers;
(Δx, Δy) denotes the image translation amount: Δx > 0 indicates that the pixel points move left and Δx < 0 that they move right; Δy > 0 indicates that they move up and Δy < 0 that they move down; Δx = Δy = 0 indicates no movement;
Δθ denotes the rotation angle value: Δθ > 0 indicates clockwise rotation by Δθ, Δθ < 0 indicates counterclockwise rotation by |Δθ|, and Δθ = 0 indicates that the image is not rotated;
P_(a′,t)(x_c, y_d) denotes the color degree at the pixel coordinates (x_c, y_d) of the image shot by the a′-th panoramic camera at time t;
the right-hand side denotes the matching color degree of the image shot by the a-th panoramic camera at time t−1 after translation by (Δx, Δy) and rotation by Δθ;
S32, performing Fourier transformation on both sides of the matching equation in step S31 to obtain the transformed equation,
wherein F_(a′,t)(u, v) denotes the Fourier transform of the picture P_(a′,t)(x_c, y_d);
F_(a,t−1)(u, v) denotes the Fourier transform of the translated and rotated picture of the a-th panoramic camera at time t−1;
e denotes the natural base;
j denotes the imaginary unit;
u denotes the Fourier u-axis;
v denotes the Fourier v-axis;
S33, taking amplitude values on both sides of the transformed equation, wherein |·| denotes the Fourier modulus;
S34, obtaining the translation (Δx, Δy) and the rotation Δθ by the phase correlation method.
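A minimal sketch of the translation estimate of S32–S34 via the cross-power spectrum follows; the rotation Δθ would additionally require a log-polar resampling of the Fourier magnitudes, which is omitted here, and the peak-wrapping convention is an assumption:

import numpy as np

def phase_correlate(img_prev, img_curr):
    # Estimate the translation (dx, dy) between two equal-size gray images
    F1 = np.fft.fft2(img_prev)
    F2 = np.fft.fft2(img_curr)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12          # normalized cross-power spectrum
    corr = np.real(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    H, W = img_prev.shape
    if dx > W // 2: dx -= W                 # wrap large offsets to negative shifts
    if dy > H // 2: dy -= H
    return dx, dy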
In a preferred embodiment of the present invention, in step S4, the panoramic all-around image stitching method comprises the following steps:
S41, acquiring the image feature point group at time t−1 and the image feature point group at time t;
S42, calculating the image feature point pair similarity values between the feature points in the image feature point group at time t−1 and those in the image feature point group at time t,
wherein x_((t−1)i) denotes the i-th dimension of the image vector of an image feature point at time t−1;
x_(ti) denotes the i-th dimension of the image vector of an image feature point at time t;
I_D denotes the total number of image dimensions;
θ_(ηε) denotes the image feature point pair similarity value;
S43, arranging the image feature point pair similarity values in order from large to small, and selecting the feature points corresponding to the two largest similarity values; the two selected feature points are aligned with the corresponding two feature points to fuse the images at adjacent times, thus obtaining the panoramic all-around spliced image.
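The similarity expression itself was lost in extraction; the sketch below assumes a cosine-type similarity over the I_D-dimensional feature vectors and then picks the two largest similarity values, as in S43:

import numpy as np

def top_two_matches(feats_prev, feats_curr):
    # feats_*: N x I_D arrays of feature-point vectors
    a = feats_prev / np.linalg.norm(feats_prev, axis=1, keepdims=True)
    b = feats_curr / np.linalg.norm(feats_curr, axis=1, keepdims=True)
    sim = a @ b.T                                 # assumed cosine-type similarity
    flat = np.argsort(sim, axis=None)[::-1][:2]   # two largest similarity values
    return [np.unravel_index(k, sim.shape) for k in flat]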
In a preferred embodiment of the present invention, the method further comprises a step S5 of performing sharpness analysis on the panoramic all-around image displayed on the panoramic all-around display screen, which comprises the following steps:
S51, obtaining the panoramic all-around image displayed on the panoramic all-around display screen, and performing graying treatment on it to obtain a gray image, the graying method being:
F(i,j) = Red(i,j)×Re + Green(i,j)×Gr + Blue(i,j)×Bl,
wherein Red(i,j) denotes the amount of the red color mode at image pixel (i, j);
Re denotes the fusion proportionality coefficient of the red color mode amount; Re + Gr + Bl = 1;
Green(i,j) denotes the amount of the green color mode at image pixel (i, j);
Gr denotes the fusion proportionality coefficient of the green color mode amount;
Blue(i,j) denotes the amount of the blue color mode at image pixel (i, j);
Bl denotes the fusion proportionality coefficient of the blue color mode amount;
S52, dividing the acquired gray image into regions of equal size according to the image size;
S53, performing gradient calculation on each region using the Laplacian operator;
S54, calculating the gradient variance of each region image, and judging the sharpness of the image according to the mean of the gradient variances,
wherein δ_i denotes a gradient variance, the δ_i form the gradient variance set, and the mean is taken over the total number of gradient variances in the set;
the sharpness of the image is judged as follows:
P_c = 1 indicates that the panoramic looking-around image is an ultra-clear image;
P_c = 0 indicates that the panoramic looking-around image is a high-definition image;
P_c = −1 indicates that the panoramic looking-around image is a standard-definition image;
p_max denotes the sharpness segmentation maximum;
p_min denotes the sharpness segmentation minimum;
S55, displaying the real-time sharpness of the image in the upper right corner of the panoramic all-around display screen.
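The following sketch grades sharpness by the mean Laplacian gradient variance over equal regions, as in S52–S54; the grid size n and the segmentation thresholds p_min and p_max are assumed values:

import cv2
import numpy as np

def sharpness_grade(gray, n=4, p_min=50.0, p_max=300.0):
    # Returns 1 (ultra-clear), 0 (high-definition) or -1 (standard-definition)
    H, W = gray.shape
    variances = []
    for bi in range(n):
        for bj in range(n):
            block = gray[bi*H//n:(bi+1)*H//n, bj*W//n:(bj+1)*W//n]
            lap = cv2.Laplacian(block, cv2.CV_64F)   # Laplacian gradient (S53)
            variances.append(lap.var())              # gradient variance (S54)
    mean_var = float(np.mean(variances))
    if mean_var >= p_max:
        return 1
    return 0 if mean_var >= p_min else -1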
In a preferred embodiment of the present invention, in step S0, after the panoramic camera lens is corrected, a symmetry test is performed on the image captured by the panoramic camera; the symmetry test method comprises the following steps:
S0-1, acquiring the image shot of the calibration target board, and performing graying treatment on it to obtain a gray image, using the same graying method as above;
S0-2, performing image denoising treatment on the gray image; the method for performing image denoising treatment on the gray image comprises the following steps:
S0-21, finding the noise points among the pixel points of the image; a pixel point is classified as follows:
G(i,j) = 0, if F(i,j) ≥ a_max or F(i,j) ≤ a_min; G(i,j) = 1, otherwise;
wherein F(i,j) denotes the gray value of image pixel (i, j), that is, the gray value of the pixel point in the i-th row and j-th column of the image;
G(i,j) indicates whether image pixel (i, j) is a noise point: G(i,j) = 0 indicates that pixel (i, j) is a noise point, and G(i,j) = 1 indicates that pixel (i, j) is a non-noise point;
"or" denotes the logical OR condition;
a_max denotes the maximum gray mean in the p×p window centered on image pixel (i, j), where p = 2p′+1 and p′ is a positive integer greater than or equal to 1 and less than or equal to 3;
a_min denotes the minimum gray mean in the p×p window centered on image pixel (i, j);
S0-22, numbering the pixel points in the window in matrix form, recorded as the p×p matrix G′, wherein G′(i′,j′) denotes the pixel point in row i′, column j′ of the matrix G′ for i′, j′ = 1, 2, 3, …, p;
S0-23, arranging the gray values in the matrix G′ in order from small to large, and updating the gray value of the pixel point at each noise point; the updated gray value of the pixel point at the noise point is taken from the middle-ranked gray values of the sorted sequence, i.e., a median-type replacement over the window;
S0-3, performing edge detection on the denoised gray image using the Canny operator to obtain an edge image;
S0-4, using Hough line detection to obtain the outermost upper, lower, left and right edge lines of the calibration target board in the edge image, and calculating the included angles between the upper and lower edge lines and the horizontal direction and between the left and right edge lines and the vertical direction respectively; when a detected included angle is greater than or equal to a preset angle threshold, the image is judged asymmetric; preferably, the preset angle threshold is 5 to 8 degrees.
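A sketch of the S0-3/S0-4 check follows; the Canny thresholds and Hough vote threshold are assumed values, and the angle test measures each detected line's deviation from the nearest horizontal or vertical direction against the 5–8 degree range above:

import cv2
import numpy as np

def is_symmetric(gray, angle_thresh_deg=5.0):
    edges = cv2.Canny(gray, 50, 150)                      # S0-3 edge image
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 100)    # S0-4 line detection
    if lines is None:
        return True
    for rho, theta in lines[:, 0]:
        deg = np.degrees(theta) % 90
        deviation = min(deg, 90 - deg)   # deviation from horizontal/vertical
        if deviation >= angle_thresh_deg:
            return False
    return True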
In a preferred embodiment of the present invention, in step S4, gap detection is performed on the spliced images; the gap detection method comprises the following steps:
S4-1, acquiring the spliced image and performing graying treatment on it to obtain a gray image, using the same graying method as above;
S4-2, performing image denoising treatment on the gray image, using the same denoising method as above;
S4-3, performing edge detection on the denoised gray image using the Canny operator to obtain an edge image;
S4-4, masking out the vehicle-model part of the image, performing Hough line detection on the edge image in the transverse and longitudinal directions to obtain a transverse line data set and a longitudinal line data set, calculating the distance between adjacent transverse lines by the line distance formula and recording the distance data, and checking whether any distance deviates greatly from the recorded distance data; if so, it is judged to be a gap.
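The sketch below illustrates S4-4 for the transverse direction: near-horizontal Hough lines are collected, the spacings between adjacent lines are compared, and a spacing deviating strongly from the recorded (median) spacing is flagged as a gap; the Hough parameters and relative tolerance are assumed values:

import cv2
import numpy as np

def has_gap(edges, rel_tol=0.2):
    # edges: binary edge image of the spliced picture with the vehicle model masked out
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 80,
                            minLineLength=edges.shape[1] // 4, maxLineGap=5)
    if lines is None:
        return False
    ys = sorted((y1 + y2) / 2.0 for x1, y1, x2, y2 in lines[:, 0]
                if abs(y2 - y1) < abs(x2 - x1))   # keep near-horizontal lines
    gaps = np.diff(ys)
    if len(gaps) < 2:
        return False
    med = np.median(gaps)                         # recorded spacing
    return bool(np.any(np.abs(gaps - med) > rel_tol * med))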
In a preferred embodiment of the present invention, in step S4, ghost detection is performed on the spliced images; the ghost detection method comprises the following steps:
S4-a, acquiring the spliced image and performing graying treatment on it to obtain a gray image;
S4-b, performing image denoising treatment on the gray image;
S4-c, performing edge detection on the denoised gray image to obtain an edge image;
S4-d, masking out the vehicle-model part of the image to obtain an edge image containing only the calibration target board, and using Hough line detection to extract the square edge lines of the calibration target board in the transverse and longitudinal directions;
S4-e, after discarding the seam lines according to the gap detection method, sorting the detected target-board lines by direction, and locating the upper, lower, left and right lines of each square from the near-equal spacing between adjacent lines;
S4-f, comparing the coordinates of the same square at different positions, selecting the minimum left and upper boundary coordinates and the maximum right and lower boundary coordinates as the standard for dividing the square region, dividing the image into small regions for contour detection, and judging that ghosting occurs when two or more square contours are extracted in one region.
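For S4-f, one small region (one expected target-board square) can be checked by contour counting; the polygon-approximation tolerance and minimum contour area are assumed values:

import cv2

def has_ghost(cell):
    # cell: binary edge image of one expected target-board square region (S4-f)
    contours, _ = cv2.findContours(cell, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    squares = 0
    for c in contours:
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4 and cv2.contourArea(approx) > 100:   # square-like contour
            squares += 1
    return squares >= 2                      # two or more squares: ghosting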
The first power supply module includes: as shown in fig. 3 and 6, the vehicle battery BAT is connected to a first terminal of the diode D202, a second terminal of the diode D202 is connected to a first terminal of the inductor L34, a second terminal of the inductor L34 is connected to a first terminal of the capacitor C21, a first terminal of the capacitor C23, a first terminal of the capacitor C25, a first terminal of the resistor R27, and a power supply voltage input terminal VIN of the buck chip U8, a second terminal of the resistor R27 is connected to a first terminal of the resistor R161, a first terminal of the capacitor C25, and an enable input terminal EN of the buck chip U8, a second terminal of the resistor R161 is connected to power supply ground, a first terminal of the capacitor C25 is connected to power supply ground, a second terminal of the capacitor C21, a second terminal of the capacitor C23, a second terminal of the capacitor C24, and a second terminal of the capacitor C25 are connected to power supply ground, a time-series resistor terminal RT/SY of the buck chip U8 is connected to a first terminal of the resistor R111, a second terminal of the resistor R111 is connected to power supply ground, the start capacitor end SS of the voltage-reducing chip U8 is connected with the first end of the capacitor C33, the second end of the capacitor C33 is connected with the power ground, the capacitor end BOOT of the voltage-reducing chip U8 is connected with the first end of the capacitor C19, the power voltage output end of the voltage-reducing chip U8 is respectively connected with the second end of the capacitor C19, the first end of the inductor L404 and the first end of the diode D82, the second end of the diode D82 is connected with the power ground, the voltage feedback end FB of the voltage-reducing chip U8 is respectively connected with the first end of the resistor R66 and the first end of the resistor R61, the second end of the resistor R61 is connected with the power ground, the power ground end GND of the voltage-reducing chip U8 is connected with the power ground, the second end of the inductor L404 is respectively connected with the first end of the capacitor C44, the first end of the capacitor C26, the first end of the capacitor C77, the first end of the capacitor C27 and the second end of the resistor R66, the second end of the inductor L404 outputs 5V power voltage VCC_5V, the second end of the capacitor C27 is connected with the power ground, the second terminal of capacitor C44, the second terminal of capacitor C26, and the second terminal of capacitor C77 are each connected to power ground. The power supply voltage of the vehicle battery BAT is converted into a stable 5V power supply voltage vcc_5v (+5v power supply) by the step-down chip U8. The resistor R161 and the resistor R27 are used for realizing voltage division, and an enable level signal is provided for an enable input end EN of the buck chip U8, so that the buck chip U8 works.
The second power supply module includes: the 5V power supply voltage VCC_5V is connected to the first end of the capacitor C35, the first end of the resistor R255 and the emitter of the triode Q101 respectively; the second end of the capacitor C35 is connected to power ground; the second end of the resistor R255 and the base of the triode Q101 are connected to the first end of the resistor R36 respectively; the second end of the resistor R36 is connected to the collector of the triode Q111; the emitter of the triode Q111 is connected to power ground; the base of the triode Q111 is connected to the first end of the resistor R303, the first end of the resistor R34 and the first end of the capacitor C222 respectively; the second end of the resistor R34 and the second end of the capacitor C222 are connected to power ground respectively; the second end of the resistor R303 is connected to the power supply voltage control end SUSPENDB of the first controller U1; the collector of the triode Q101 is connected to the first end of the capacitor C277; the collector of the triode Q101 outputs the 5V power supply voltage VDD_5V; and the second end of the capacitor C277 is connected to power ground. When the power supply voltage control end SUSPENDB of the first controller U1 sends a cut-off level to the base of the triode Q111, the collector of the triode Q101 outputs a stable 5V power supply voltage VDD_5V (+5V power supply); correspondingly, when the power supply voltage control end SUSPENDB of the first controller U1 sends a conduction level to the base of the triode Q111, the collector of the triode Q101 has no power supply output.
The third power supply module includes: the 5V supply voltage VDD_5V is connected to the first end of the inductor L122; the second end of the inductor L122 is connected to the first end of the capacitor C511, the first end of the capacitor C521, the first end of the capacitor C531, the first end of the capacitor C541, the first end of the capacitor C551 and the first end of the capacitor C561; the second end of the inductor L122 outputs the 5V supply voltage VDD1_5V; and the second end of the capacitor C511, the second end of the capacitor C521, the second end of the capacitor C531, the second end of the capacitor C541 and the second end of the capacitor C561 are connected to power ground respectively. When the power supply voltage control terminal SUSPENDB of the first controller U1 sends a cut-off level to the base electrode of the triode Q111, the second end of the inductor L122 outputs a stable 5V power supply voltage VDD1_5V (+5V power supply); correspondingly, when the power supply voltage control terminal SUSPENDB of the first controller U1 sends a conduction level to the base electrode of the triode Q111, the second end of the inductor L122 has no power supply output. In this embodiment, the capacitance of the capacitor C52 is 471nF, the model of the capacitor C24 and the capacitor C25 is VEJ M1ETR, the model of the diode D288 is NRVBAF440T3G, the capacitance of the capacitor C23 is 15nF, the model of the diode D202 is NRVBA160T3G, the model of the inductor L34 is NRS5040T220MMGKV, the resistance of the resistor R111 is 65K, the capacitance of the capacitor C33 is 23nF, the resistance of the resistor R55 is 101K, the capacitance of the capacitor C19 is 120nF, the capacitance of the capacitor C21 is 4.7uF, the model of the capacitor C26 is TAJC476K016TNJ, the resistance of the resistor R161 is 95K, the resistance of the resistor R27 is 365K, the capacitance of the capacitor C77 is 22uF, the capacitance of the capacitor C35 is 22uF, the resistance of the resistor R255 is 12K, the resistances of the resistor R30 and the resistor R34 are 15.8K, the model of the capacitor C27 is UCD1C101MCL1GS, the model of the triode Q101 is BCW68, the capacitance of the capacitor C277 is 130nF, the model of the inductor L404 is NRS5030T100MMGJV, the resistance of the resistor R51 is 17.9K, the model of the buck chip U8 is LMR14020SSQDDARQ1, the capacitance of the capacitor C44 is 150nF, the resistance of the resistor R36 is 502Ω, the capacitance of the capacitor C521 is 1uF, the model of the triode Q111 is BCW66GLT1G, the model of the inductor L122 is MPZ1608S601A, the capacitance of the capacitor C511 is 2.2uF, the capacitance of the capacitor C222 is 4.7nF, the capacitance of the capacitor C531 and the capacitor C541 is 120nF, and the capacitance of the capacitor C551 and the capacitor C561 is 100nF.
The fourth power supply module includes: as shown in fig. 4 to 6, the 5V power supply voltage VCC_5V is connected to the first end of the inductor L422; the second end of the inductor L422 is connected to the first end of the resistor R253, the first end of the capacitor C89, the first end of the capacitor C92 and the power supply voltage input end VIN of the buck chip U4; the delay reset end of the buck chip U4 is connected to the first end of the capacitor C93; the second end of the capacitor C89, the second end of the capacitor C92 and the second end of the capacitor C93 are connected to power ground; the frequency oscillation end ROSC of the buck chip U4 is connected to the first end of the resistor R93, and the second end of the resistor R93 is connected to power ground; the power ground end GND of the buck chip U4 is connected to power ground; and the enable input end EN of the buck chip U4 is connected to the second end of the resistor R253. The power supply voltage output end VOUT of the buck chip U4 is connected to the first end of the capacitor C254, the first end of the capacitor C299 and the first end of the capacitor C292 respectively; the power supply voltage output end VOUT of the buck chip U4 outputs the power supply voltage VCC_3.3V; and the second end of the capacitor C254, the second end of the capacitor C299 and the second end of the capacitor C292 are connected to power ground respectively. The input +5V power supply is converted into a stable 3.3V power supply voltage VCC_3.3V (+3.3V power supply) by the buck chip U4.
The fifth power supply module includes: the 3.3V power supply voltage VCC_3.3V is connected to the first end of the capacitor C188, the first end of the resistor R218 and the emitter of the triode Q21 respectively; the second end of the capacitor C188 is connected to power ground; the second end of the resistor R218 and the base of the triode Q21 are connected to the first end of the resistor R238 respectively; the second end of the resistor R238 is connected to the collector of the triode Q31; the emitter of the triode Q31 is connected to power ground; the base of the triode Q31 is connected to the first end of the resistor R244, the first end of the resistor R73 and the first end of the capacitor C208 respectively; the second end of the resistor R73 and the second end of the capacitor C208 are connected to power ground respectively; the second end of the resistor R244 is connected to the power supply voltage control terminal CS_DD of the second controller U2; the collector of the triode Q21 is connected to the first end of the capacitor C195; the collector of the triode Q21 outputs the 3.3V power supply voltage VDD_3.3V; and the second end of the capacitor C195 is connected to power ground. When the power supply voltage control terminal CS_DD of the second controller U2 inputs a cut-off level to the base of the triode Q31, the collector of the triode Q21 outputs the 3.3V power supply voltage VDD_3.3V (3.3V power supply); correspondingly, when the power supply voltage control terminal CS_DD of the second controller U2 inputs a conduction level to the base of the triode Q31, the collector of the triode Q21 has no power supply output.
The fault detection circuit also includes: the feedback input end WD of the buck chip U4 is connected to the first end of the resistor R98 and the first end of the resistor R358 respectively; the second end of the resistor R98 is connected to power ground; the second end of the resistor R358 is connected to the first end of the capacitor C217 and the feedback output end SET of the second controller U2 respectively; the second end of the capacitor C217 is connected to power ground; the reset end nRST of the buck chip U4 is connected to the first end of the resistor R404; the second end of the resistor R404 is connected to the first end of the resistor R331, the first end of the resistor R99, the first end of the resistor R411 and the detection end RSTBB of the first controller U1 respectively; the second end of the resistor R99 is connected to the reset input end RESETB of the second controller U2; the second end of the resistor R331 is connected to the 3.3V power supply voltage VCC_3.3V; the second end of the resistor R411 is connected to the fault output end FLT of the buck chip U4; the first end of the resistor R466 is connected to the 3.3V power supply voltage VCC_3.3V; and the second end of the resistor R466 is connected to the triode Q177. When the feedback input end WD of the buck chip U4 does not receive the first feedback signal output by the feedback output end SET of the second controller U2 within the first specified time, or the feedback input end WD of the buck chip U4 does not receive the second feedback signal output by the feedback output end SET of the second controller U2 within the second specified time, the fault output end FLT of the buck chip U4 outputs a fault signal or the reset end nRST of the buck chip U4 outputs a reset signal, so that the second controller U2 resets and restarts.
In this embodiment, the resistance of the resistor R253 is 12K, the capacitance of the capacitor C89 is 120nF, the capacitance of the capacitor C92 is 22uF, the capacitance of the capacitor C93 is 120pF, the capacitance of the capacitor C217 is 5.6pF, the resistance of the resistor R331 is 5.1K, and the model of the buck chip U4 is TPS7A6333QPWPRQ1; the capacitance of the capacitor C188 is 22uF, the resistance of the resistor R218 is 15K, the resistances of the resistor R244 and the resistor R73 are 11.4K, the capacitance of the capacitor C208 is 1.5nF, the model of the triode Q21 is BCW68, the resistance of the resistor R236 is 47Ω, the resistance of the resistor R555 is 48K, the capacitance of the capacitor C299 is 10nF, the capacitance of the capacitor C229 is 22nF, the resistances of the resistor R404 and the resistor R411 are 45Ω, the resistance of the resistor R93 is 160K, the resistance of the resistor R98 is 12K, the model of the inductor L422 is MPZ1608S601A, the resistance of the resistor R358 is 5.2K, the capacitance of the capacitor C199 is 100nF, the model of the triode Q31 is BCW66GLT1G, the resistance of the resistor R466 is 4.7K, the capacitance of the capacitor C257 is 10uF, the model of the triode Q177 is BCW66GLT1G, the capacitance of the capacitor C292 is 100nF, and the resistance of the resistor R99 is 12K.
The sixth power supply module includes: as shown in fig. 2, the 5V power supply voltage VCC_5V is connected to the first end of the capacitor C1, the first end of the capacitor C2 and the power supply voltage input end IN of the buck chip U44; the second end of the capacitor C1 and the second end of the capacitor C2 are connected to power ground; the power ground end GND of the buck chip U44 is connected to power ground; the power supply voltage output end OUT of the buck chip U44 is connected to the first end of the capacitor C3 and the first end of the capacitor C4; the power supply voltage output end OUT of the buck chip U44 outputs the 3.3V power supply voltage 3.3V; and the second end of the capacitor C3 and the second end of the capacitor C4 are connected to power ground. The input 5V power supply voltage VCC_5V is converted into a stable 3.3V power supply voltage 3.3V (3.3V power supply) output by the buck chip U44.
The seventh power supply module includes: the 3.3V power supply voltage 3.3V is connected to the first end of the capacitor C8, the first end of the capacitor C9 and the first end of the diode D1 respectively; the second end of the capacitor C8 and the second end of the capacitor C9 are connected to power ground respectively; the second end of the diode D1 is connected to the first end of the capacitor C10 and the first end of the capacitor C11 respectively; the second end of the diode D1 outputs the 2.5V power supply voltage 2.5V_D; and the second end of the capacitor C10 and the second end of the capacitor C11 are connected to power ground respectively. The diode D1 absorbs a voltage drop of about 0.8V, so that its second end outputs a stable 2.5V power supply voltage 2.5V_D (2.5V power supply).
The eighth power supply module includes: the 3.3V power supply voltage 3.3V is connected to the first end of the resistor R03 and the emitter of the triode Q01 respectively; the second end of the resistor R03 and the base of the triode Q01 are connected to the first end of the resistor R04 respectively; the second end of the resistor R04 is connected to the collector of the triode Q02; the emitter of the triode Q02 is connected to power ground; the base of the triode Q02 is connected to the first end of the resistor R05; the second end of the resistor R05 is connected to the power supply voltage control end CS_PWDB of the first controller U1; the collector of the triode Q01 is connected to the first end of the diode D2; the second end of the diode D2 is connected to the first end of the capacitor C05, the first end of the capacitor C06 and the first end of the resistor R02 respectively; the second end of the diode D2 outputs the 2.8V power supply voltage 2.8V_CSD; the second end of the capacitor C05 and the second end of the capacitor C06 are connected to power ground; the second end of the resistor R02 is connected to the first end of the power indicator LED; and the second end of the power indicator LED is connected to power ground. When the power supply voltage control end CS_PWDB of the first controller U1 sends a cut-off level to the base of the triode Q02, the second end of the diode D2 outputs a stable 2.8V power supply voltage 2.8V_CSD (2.8V power supply), and the power indicator LED lights up to indicate that there is a power supply output; correspondingly, when the power supply voltage control end CS_PWDB of the first controller U1 sends a conduction level to the base of the triode Q02, the second end of the diode D2 has no power supply output.
The ninth power supply module includes: the 2.8V power supply voltage 2.8V_CSD is connected with the first end of a resistor R01, the second end of the resistor R01 is connected with the first end of a capacitor C07, the second end of the resistor R01 outputs the 2.8V power supply voltage 2.8V_CSA, and the second end of the capacitor C07 is connected with power supply ground.
In this embodiment, the capacitance of the capacitor C1 is 4.7uF, the model of the capacitor C2 is a 104 capacitor, the model of the buck chip U44 is FS8853-3.3CL, the model of the capacitor C3 is a 104 capacitor, the capacitance of the capacitor C4 is 47uF, the models of the capacitor C8 and the capacitor C9 are 104 capacitors, the model of the diode D1 is LL4148, the capacitance of the capacitor C10 is 10nF, the model of the capacitor C11 is a 104 capacitor, the model of the triode Q01 is MMBT8550, the model of the diode D2 is 1N5819HW, the resistance of the resistor R03 is 1M, the resistance of the resistor R04 is 1.2K, the resistance of the resistor R05 is 12K, the model of the triode Q02 is MMBT9013, the capacitance of the capacitor C06 is 47uF, the model of the capacitor C05 is a 104 capacitor, the resistance of the resistor R01 is 7.2Ω, the resistance of the resistor R02 is 1.5K, and the model of the capacitor C07 is a 105 capacitor.
As shown in fig. 5 and 6, the clock terminal MCLK of the second controller U2 is connected to the clock terminal CS_CLK of the first controller U1; the analog ground terminal AGND of the second controller U2 is connected to power ground; the analog power terminal AVDD of the second controller U2 is connected to the first terminal of the capacitor C22, the first terminal of the capacitor C23 and the 2.8V power supply voltage 2.8V_CSA; the second terminal of the capacitor C22 and the second terminal of the capacitor C23 are connected to power ground; the digital ground terminal DGND of the second controller U2 is connected to power ground; the enable terminal ENB of the second controller U2 is connected to the enable output terminal CS_EN of the first controller U1; the clock terminal SCK of the second controller U2 is connected to the clock terminal SCL of the first controller U1; the data terminal SDA of the second controller U2 is connected to the data terminal SDA of the first controller U1; the horizontal sync terminal HSYNC of the second controller U2 is connected to the horizontal sync input terminal HSYNC of the first controller U1; the vertical sync terminal VSYNC of the second controller U2 is connected to the vertical sync terminal VSYNC of the first controller U1; the digital power terminal DVDD of the second controller U2 is connected to the first terminal of the capacitor C24, the first terminal of the capacitor C25 and the 2.8V power supply voltage 2.8V_CSD respectively; the digital power ground terminal DGND of the second controller U2 is connected to the second terminal of the capacitor C24 and the second terminal of the capacitor C25 respectively; the DATA terminal DATA1 of the second controller U2 is connected to the data terminal CS_D0 of the first controller U1; the DATA terminal DATA2 of the second controller U2 is connected to the data terminal CS_D1 of the first controller U1; the DATA terminal DATA3 of the second controller U2 is connected to the data terminal CS_D2 of the first controller U1; the DATA terminal DATA4 of the second controller U2 is connected to the data terminal CS_D3 of the first controller U1; the DATA terminal DATA5 of the second controller U2 is connected to the data terminal CS_D4 of the first controller U1; the DATA terminal DATA6 of the second controller U2 is connected to the data terminal CS_D5 of the first controller U1; the DATA terminal DATA7 of the second controller U2 is connected to the data terminal CS_D6 of the first controller U1; the DATA terminal DATA8 of the second controller U2 is connected to the data terminal CS_D7 of the first controller U1; and the DATA terminal DATA9 of the second controller U2 is connected to the data terminal CS_D8 of the first controller U1. In the present embodiment, the models of the capacitor C22, the capacitor C23, the capacitor C24 and the capacitor C25 are 104 capacitors, and the model of the second controller U2 is HY7131R.
As shown in fig. 6, the power ground end OVSS of the first controller U1 is connected to power ground; the power end OVDD of the first controller U1 is connected to the 3.3V power supply voltage 3.3V; the clock terminal SCL of the first controller U1 is connected to the first end of the resistor R31, and the second end of the resistor R31 is connected to the 2.8V power supply voltage 2.8V_CSD; the data terminal SDA of the first controller U1 is connected to the first end of the resistor R32 and the first end of the resistor R33, the second end of the resistor R32 is connected to the 2.8V power supply voltage 2.8V_CSD, and the second end of the resistor R33 is connected to power ground; the reset terminal RST of the first controller U1 is connected to the first end of the resistor R13 and the first end of the capacitor C12, the second end of the capacitor C12 is connected to power ground, and the second end of the resistor R13 is connected to the 3.3V power supply voltage 3.3V; the data terminal PIO3 of the first controller U1 is connected to the first end of the resistor R24, the data terminal PIO2 of the first controller U1 is connected to the first end of the resistor R28, and the second end of the resistor R24 and the second end of the resistor R28 are connected to the 3.3V power supply voltage 3.3V respectively; the TEST end TEST of the first controller U1 is connected to the first end of the resistor R22, and the second end of the resistor R22 is connected to power ground; the reference ground end GND_REF of the first controller U1 is connected to power ground and to the power ground end GND_A of the first controller U1 respectively; the locking end SNAPB of the first controller U1 is connected to the first end of the capacitor C18, the first end of the resistor R19 and the first end of the locking key S1 respectively, the second end of the capacitor C18 is connected to power ground, the second end of the locking key S1 is connected to power ground, and the second end of the resistor R19 is connected to the 3.3V power supply voltage 3.3V; the data end DM of the first controller U1 is connected to the first end of the capacitor C17 and the first end of the resistor R17 respectively; the data end DP of the first controller U1 is connected to the first end of the capacitor C16 and the first end of the resistor R16 respectively; the second end of the capacitor C16 and the second end of the capacitor C17 are connected to power ground respectively; the second end of the resistor R16 is connected to the first end of the resistor R18 and the data end D+ of the data interface JP1 respectively, and the second end of the resistor R18 is connected to the 3.3V power supply voltage 3.3V; the second end of the resistor R17 is connected to the data end D− of the data interface JP1; the power end VCC of the data interface JP1 is connected to the 5V power supply voltage VCC_5V, and the power ground ends GND(S) of the data interface JP1 are connected to power ground; the power ground end VSS_USB of the first controller U1 and the power ground end DVSS of the first controller U1 are connected to power ground respectively; and the crystal end CLKOUT of the first controller U1 is connected to the first end of the resistor R14 and the first end of the resistor R15. The second ends of the resistor R14 and the resistor R15 lead to the crystal oscillator Y1 and its associated components, the capacitor C13, the capacitor C14, the capacitor C15 and the inductor L1, which form the clock circuit of the first controller U1 as shown in fig. 6.
In this embodiment, the resistances of the resistor R31 and the resistor R32 are 4.7K, the resistance of the resistor R33 is 47K, the resistances of the resistor R24 and the resistor R28 are 4.7K, the resistance of the resistor R22 is 47K, the resistance of the resistor R19 is 12K, the resistance of the resistor R18 is 1.5K, the type of the capacitor C18 is 104 capacitor, the resistance of the resistor R16 and the resistor R17 is 22Ω, the capacitance of the capacitor C16 and the capacitor C17 is 10pF, the resistance of the resistor R14 is 110K, the resistance of the resistor R15 is 45Ω, the frequency of the crystal oscillator Y1 is 48MHz, the capacitance of the capacitor C13 and the capacitor C14 is 12pF, the type of the capacitor C15 is 102 capacitor, the inductance of the inductor L1 is 3.3uH, the resistance of the resistor R13 is 12.5K, the type of the capacitor C12 is 105 capacitor, and the type of the first controller U1 is ZC0301.
The storage module includes: as shown in fig. 6 and 7, the address input terminal A0 of the memory chip U33, the address input terminal A1 of the memory chip U33 and the power ground terminal GND of the memory chip U33 are connected to power ground; the power voltage input terminal VCC of the memory chip U33 is connected to the 3.3V power supply voltage 3.3V; the write-protect terminal WP of the memory chip U33 is connected to power ground; the clock terminal SCL of the memory chip U33 is connected to the clock terminal ESCK of the first controller U1; the data terminal SDA of the memory chip U33 is connected to the first terminal of the resistor R34 and the data terminal ESDA of the first controller U1 respectively; and the second terminal of the resistor R34 is connected to the 3.3V power supply voltage 3.3V. In the present embodiment, the resistance of the resistor R34 is 4.7K, and the model of the memory chip U33 is AT24C02. The storage and reading of data are realized by the memory chip U33.
While embodiments of the present invention have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the spirit and principles of the invention, the scope of which is defined by the claims and their equivalents.

Claims (3)

1. A method for realizing panoramic looking-around image safety monitoring by arranging intelligent cameras on an operating automobile, comprising a vehicle to be tested, characterized in that A panoramic cameras are arranged on the vehicle body of the vehicle to be tested, wherein A is a positive integer greater than or equal to 1, the cameras being respectively the 1st panoramic camera, the 2nd panoramic camera, the 3rd panoramic camera, …, and the A-th panoramic camera;
a panoramic looking-around display screen fixed mounting seat for fixedly mounting a panoramic looking-around display screen is arranged in the cab of the vehicle to be tested, and the panoramic looking-around display screen is fixedly mounted on the panoramic looking-around display screen fixed mounting seat;
the panoramic all-round controller is arranged in the vehicle to be tested, the panoramic all-round image data output end of the a-th panoramic all-round camera is connected with the panoramic all-round image data input end a of the panoramic all-round controller, a is a positive integer less than or equal to A, and the display data output end of the panoramic all-round controller is connected with the display data input end of the panoramic all-round display screen;
The panoramic all-round controller carries out brightness consistency detection on the spliced panoramic all-round image according to the panoramic all-round image data acquired by the panoramic all-round camera arranged on the vehicle body to be tested; the brightness consistency detection method comprises the following steps:
S4-A, acquiring the spliced image, and performing graying treatment on the spliced image to obtain a gray image;
S4-B, performing image denoising treatment on the gray image;
S4-C, dividing the acquired gray image into regions of equal size according to the image size;
S4-D, converting the color space of each region into the LAB color space, extracting the L brightness component, calculating the mean of the L component of each region, and performing difference operations on the means of all regions; if a difference value is greater than or equal to a preset difference threshold, the brightness is inconsistent; otherwise, the brightness is consistent.
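By way of illustration only (not part of the claim language), the brightness-consistency check of steps S4-C and S4-D can be sketched as follows; the region grid n and the difference threshold are assumed values:

import cv2
import numpy as np

def brightness_consistent(bgr, n=4, diff_thresh=12.0):
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)      # S4-D: LAB color space
    L = lab[..., 0].astype(float)                   # L brightness component
    H, W = L.shape
    means = [L[i*H//n:(i+1)*H//n, j*W//n:(j+1)*W//n].mean()
             for i in range(n) for j in range(n)]   # S4-C: equal-size regions
    return (max(means) - min(means)) < diff_thresh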
2. The method for realizing panoramic looking-around image safety monitoring by arranging intelligent cameras on an operating automobile according to claim 1, wherein in step S4-A, the processing method of the gray image is:
F(i,j) = Red(i,j)×Re + Green(i,j)×Gr + Blue(i,j)×Bl,
wherein Red(i,j) denotes the amount of the red color mode at image pixel (i, j);
Re denotes the fusion proportionality coefficient of the red color mode amount; Re + Gr + Bl = 1;
Green(i,j) denotes the amount of the green color mode at image pixel (i, j);
Gr denotes the fusion proportionality coefficient of the green color mode amount;
Blue(i,j) denotes the amount of the blue color mode at image pixel (i, j);
Bl denotes the fusion proportionality coefficient of the blue color mode amount.
3. The method for realizing panoramic looking-around image safety monitoring by arranging intelligent cameras on an operating automobile according to claim 1, wherein in step S4-B, the method for performing image denoising treatment on the gray image comprises the following steps:
S0-21, finding the noise points among the pixel points of the image; a pixel point is classified as follows:
G(i,j) = 0, if F(i,j) ≥ a_max or F(i,j) ≤ a_min; G(i,j) = 1, otherwise;
wherein F(i,j) denotes the gray value of image pixel (i, j), that is, the gray value of the pixel point in the i-th row and j-th column of the image;
G(i,j) indicates whether image pixel (i, j) is a noise point: G(i,j) = 0 indicates that pixel (i, j) is a noise point, and G(i,j) = 1 indicates that pixel (i, j) is a non-noise point;
"or" denotes the logical OR condition;
a_max denotes the maximum gray mean in the p×p window centered on image pixel (i, j), where p = 2p′+1 and p′ is a positive integer greater than or equal to 1 and less than or equal to 3;
a_min denotes the minimum gray mean in the p×p window centered on image pixel (i, j);
S0-22, numbering the pixel points in the window in matrix form, recorded as the p×p matrix G′,
wherein G′(1,1) denotes the pixel point in row 1, column 1 of the matrix G′;
G′(1,2) denotes the pixel point in row 1, column 2 of the matrix G′;
G′(1,3) denotes the pixel point in row 1, column 3 of the matrix G′;
G′(1,p) denotes the pixel point in row 1, column p of the matrix G′;
G′(2,1) denotes the pixel point in row 2, column 1 of the matrix G′;
G′(2,2) denotes the pixel point in row 2, column 2 of the matrix G′;
G′(2,3) denotes the pixel point in row 2, column 3 of the matrix G′;
G′(2,p) denotes the pixel point in row 2, column p of the matrix G′;
G′(3,1) denotes the pixel point in row 3, column 1 of the matrix G′;
G′(3,2) denotes the pixel point in row 3, column 2 of the matrix G′;
G′(3,3) denotes the pixel point in row 3, column 3 of the matrix G′;
G′(3,p) denotes the pixel point in row 3, column p of the matrix G′;
G′(p,1) denotes the pixel point in row p, column 1 of the matrix G′;
G′(p,2) denotes the pixel point in row p, column 2 of the matrix G′;
G′(p,3) denotes the pixel point in row p, column 3 of the matrix G′;
G′(p,p) denotes the pixel point in row p, column p of the matrix G′;
S0-23, arranging the gray values in the matrix G′ in order from small to large, and updating the gray value of the pixel point at each noise point; the updated gray value of the pixel point at the noise point is taken from the middle-ranked gray values of the sorted sequence, i.e., a median-type replacement over the window.
CN202111280660.6A 2021-10-31 2021-10-31 Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile Active CN114025088B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111280660.6A CN114025088B (en) 2021-10-31 2021-10-31 Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111280660.6A CN114025088B (en) 2021-10-31 2021-10-31 Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile

Publications (2)

Publication Number Publication Date
CN114025088A CN114025088A (en) 2022-02-08
CN114025088B true CN114025088B (en) 2023-08-22

Family

ID=80059371

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111280660.6A Active CN114025088B (en) 2021-10-31 2021-10-31 Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile

Country Status (1)

Country Link
CN (1) CN114025088B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI435162B (en) * 2012-10-22 2014-04-21 Nat Univ Chung Cheng Low complexity of the panoramic image and video bonding method

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005277582A (en) * 2004-03-23 2005-10-06 Sony Corp Panorama imaging apparatus, control program thereof, panorama imaging method, monitoring system, and program recording medium
CN105005963A (en) * 2015-06-30 2015-10-28 重庆市勘测院 Multi-camera images stitching and color homogenizing method
CN107566730A (en) * 2017-09-27 2018-01-09 维沃移动通信有限公司 A kind of panoramic picture image pickup method and mobile terminal
CN108650495A (en) * 2018-06-28 2018-10-12 华域视觉科技(上海)有限公司 A kind of automobile-used panoramic looking-around system and its adaptive light compensation method
CN109978765A (en) * 2019-03-11 2019-07-05 上海保隆汽车科技股份有限公司 Panoramic picture brightness correcting method and joining method and its device
WO2021031458A1 (en) * 2019-08-16 2021-02-25 域鑫科技(惠州)有限公司 Method and device for image color correction applicable in endoscope, and storage medium
CN112102168A (en) * 2020-09-03 2020-12-18 成都中科合迅科技有限公司 Image splicing method and system based on multiple threads

Also Published As

Publication number Publication date
CN114025088A (en) 2022-02-08

Similar Documents

Publication Publication Date Title
Zhang et al. CCTSDB 2021: a more comprehensive traffic sign detection benchmark
Choi et al. Thermal image enhancement using convolutional neural network
CN112184818B (en) Vision-based vehicle positioning method and parking lot management system applying same
CN107154022B (en) A kind of dynamic panorama mosaic method suitable for trailer
CN111160172B (en) Parking space detection method, device, computer equipment and storage medium
CN104766058B (en) A kind of method and apparatus for obtaining lane line
CN107229908B (en) A kind of method for detecting lane lines
US20170294027A1 (en) Remote determination of quantity stored in containers in geographical region
CN108629292B (en) Curved lane line detection method and device and terminal
CN103065323B (en) Subsection space aligning method based on homography transformational matrix
CN113874927A (en) Parking detection method, system, processing device and storage medium
TWI726278B (en) Driving detection method, vehicle and driving processing device
CN111243003A (en) Vehicle-mounted binocular camera and method and device for detecting road height limiting rod
CN110736472A (en) indoor high-precision map representation method based on fusion of vehicle-mounted all-around images and millimeter wave radar
CN112818834A (en) Method, device and medium for judging avoidance of emergency vehicle at intersection
CN116030194A (en) Air-ground collaborative live-action three-dimensional modeling optimization method based on target detection avoidance
CN108847031A (en) Traffic behavior monitoring method, device, computer equipment and storage medium
CN114025088B (en) Method for realizing safety monitoring of all-round image by arranging intelligent camera on operating automobile
CN114372919A (en) Method and system for splicing panoramic all-around images of double-trailer train
CN114022438A (en) System for realizing safety test of panoramic all-round looking images by mounting panoramic camera on vehicle
CN112053407B (en) Automatic lane line detection method based on AI technology in traffic law enforcement image
CN110826364A (en) Stock position identification method and device
Wang et al. Lane-line detection algorithm for complex road based on OpenCV
CN114037611A (en) Working method for realizing image splicing by mounting panoramic camera on safe-driving automobile
CN114037670A (en) System for realizing image splicing test by mounting panoramic camera on vehicle running safely

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant