CN118075616A - Vehicle-mounted camera orientation angle calibration system - Google Patents

Vehicle-mounted camera orientation angle calibration system

Info

Publication number
CN118075616A
Authority
CN
China
Prior art keywords
vehicle
target
calibration
camera
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410145791.0A
Other languages
Chinese (zh)
Inventor
张维忠
姜宇盘
钱登林
黄硕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Faw Nanjing Technology Development Co ltd
FAW Group Corp
Original Assignee
Faw Nanjing Technology Development Co ltd
FAW Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Faw Nanjing Technology Development Co ltd, FAW Group Corp filed Critical Faw Nanjing Technology Development Co ltd
Priority to CN202410145791.0A priority Critical patent/CN118075616A/en
Publication of CN118075616A publication Critical patent/CN118075616A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661Transmitting camera control signals through networks, e.g. control via the Internet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The embodiment of the invention discloses a vehicle-mounted camera orientation angle calibration system, which comprises a server and a vehicle-mounted terminal integrated on a target vehicle. A control module and at least two vehicle-mounted cameras are integrated in the vehicle-mounted terminal, and a judging module and a calibration module are integrated in the server. The at least two vehicle-mounted cameras acquire at least two driving images. The judging module judges whether a target driving image transmitted by the vehicle-mounted terminal, obtained from the at least two driving images, meets a preset image requirement, and sends the judgment result to the calibration module. When the judgment result shows that a target camera whose orientation angle needs to be calibrated exists among the at least two vehicle-mounted cameras, the calibration module generates a calibration instruction and transmits it to the control module, and the control module calibrates the orientation angle of the target camera according to the calibration instruction. The technical scheme provided by the embodiment of the invention enables automatic calibration of the orientation angle of each vehicle-mounted camera on the target vehicle.

Description

Vehicle-mounted camera orientation angle calibration system
Technical Field
The embodiment of the invention relates to the technical field of vehicles, in particular to a vehicle-mounted camera orientation angle calibration system.
Background
In the field of autonomous driving, an autonomous vehicle usually needs to collect driving images with vehicle-mounted cameras serving as perception sensors in order to interpret the driving scene.
Currently, a plurality of vehicle-mounted cameras are usually installed on an autonomous vehicle, and their orientation angles are calibrated manually so that driving images can be acquired with the calibrated cameras.
However, the manual calibration scheme currently employed is inefficient and costly, and needs to be improved.
Disclosure of Invention
The embodiment of the invention provides a vehicle-mounted camera orientation angle calibration system to realize automatic calibration of the orientation angle of each vehicle-mounted camera on a target vehicle.
According to an aspect of the present invention, there is provided an in-vehicle camera orientation angle calibration system including: the system comprises a server and a vehicle-mounted terminal integrated on a target vehicle, wherein a control module and at least two vehicle-mounted cameras are integrated in the vehicle-mounted terminal, and a judging module and a calibrating module are integrated in the server;
The vehicle-mounted cameras are respectively used for acquiring driving images of the target vehicle so as to obtain at least two driving images;
The judging module is used for receiving the target driving image transmitted by the vehicle-mounted terminal, judging whether the target driving image meets the preset image requirement or not, and sending the obtained judging result to the calibration module, wherein the target driving image is obtained based on at least two driving images;
The calibration module is used for generating a calibration instruction for calibrating the orientation angle of the target camera and transmitting the calibration instruction to the control module under the condition that the target camera with the orientation angle to be calibrated exists in at least two vehicle-mounted cameras according to the judging result;
And the control module is used for calibrating the orientation angle of the target camera according to the received calibration instruction.
The technical scheme of the embodiment of the invention comprises a server and a vehicle-mounted terminal integrated on a target vehicle, wherein a control module and at least two vehicle-mounted cameras are integrated in the vehicle-mounted terminal, and a judging module and a calibration module are integrated in the server. The at least two vehicle-mounted cameras respectively acquire driving images of the target vehicle to obtain at least two driving images. The judging module receives the target driving image transmitted by the vehicle-mounted terminal, judges whether it meets the preset image requirement, and sends the judgment result to the calibration module, the target driving image being obtained from the at least two driving images; by first judging whether any vehicle-mounted camera actually needs to be calibrated and handling the cases separately, calibration efficiency is improved. When the judgment result shows that a target camera whose orientation angle is to be calibrated exists among the at least two vehicle-mounted cameras, the calibration module generates a calibration instruction for calibrating the orientation angle of the target camera and transmits it to the control module; automatically generating the calibration instruction improves both calibration efficiency and accuracy. The control module calibrates the orientation angle of the target camera according to the received calibration instruction, so the target camera is calibrated automatically without manual participation, which reduces cost and improves calibration efficiency. According to this technical scheme, the driving images acquired by the vehicle-mounted cameras are analysed, a calibration instruction is generated automatically whenever calibration is needed, and the orientation angles of the vehicle-mounted cameras are calibrated automatically according to that instruction, thereby reducing the calibration cost of the vehicle-mounted cameras and improving calibration efficiency and accuracy.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention, nor is it intended to be used to limit the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for describing the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and a person skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a block diagram of a vehicle-mounted camera orientation angle calibration system according to an embodiment of the present invention;
Fig. 2 is a block diagram of another vehicle-mounted camera orientation angle calibration system according to an embodiment of the present invention;
Fig. 3 is a block diagram of a further vehicle-mounted camera orientation angle calibration system according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a specific example of a further vehicle-mounted camera orientation angle calibration system according to an embodiment of the present invention;
Fig. 5 is a flowchart of a specific example of a further vehicle-mounted camera orientation angle calibration system according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first", "second" and the like in the description, the claims and the above drawings are used to distinguish between similar objects and do not necessarily describe a particular sequence or chronological order. It is to be understood that data so designated may be interchanged where appropriate, so that the embodiments of the invention described herein can be implemented in orders other than those illustrated or described herein. Terms such as "target" and "original" are used in a similar way and are not described in detail here. Furthermore, the terms "comprises", "comprising" and "having", and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such a process, method, article or apparatus.
Fig. 1 is a block diagram of a vehicle camera orientation angle calibration system according to an embodiment of the present invention. The embodiment is applicable to the case of automatically calibrating the orientation angles of a plurality of in-vehicle cameras on a target vehicle.
Referring to fig. 1, an in-vehicle camera orientation angle calibration system according to an embodiment of the present invention includes: the system comprises a server 11 and an on-board terminal 12 integrated on a target vehicle, wherein a control module 121 and at least two on-board cameras 122 are integrated in the on-board terminal 12, and a judging module 111 and a calibration module 112 are integrated in the server 11;
At least two vehicle-mounted cameras 122 for respectively acquiring driving images of the target vehicle to obtain at least two driving images;
The judging module 111 is configured to receive the target driving image transmitted by the vehicle-mounted terminal 12, judge whether the target driving image meets a preset image requirement, and send the obtained judging result to the calibration module 112, where the target driving image is obtained based on at least two driving images;
the calibration module 112 is configured to generate a calibration instruction for calibrating the orientation angle of the target camera and transmit the calibration instruction to the control module 121 when it is determined that the target camera whose orientation angle is to be calibrated exists in the at least two vehicle-mounted cameras 122 according to the determination result;
And the control module 121 is used for calibrating the orientation angle of the target camera according to the received calibration instruction.
The in-vehicle terminal 12 can be understood as a terminal device integrated in the target vehicle at least for capturing a driving image of the target vehicle and for calibrating the angle of orientation of the in-vehicle camera. The in-vehicle terminal 12 has integrated therein a control module 121 usable for calibrating an orientation angle of the in-vehicle camera and an in-vehicle camera 122 for acquiring a running image of the target vehicle. In an embodiment of the present invention, the number of in-vehicle cameras 122 may be two or more, and the two or more in-vehicle cameras 122 are mounted on the target vehicle at the same or different orientation angles.
Specifically, the in-vehicle terminal 12 acquires driving images of the target vehicle through the two or more in-vehicle cameras 122, thereby obtaining two or more driving images. The in-vehicle terminal 12 then obtains a target driving image based on these driving images. In practical applications, the driving images may optionally be used directly as target driving images, or the driving images may be stitched, for example in pairs or in larger groups, and the stitched result used as the target driving image, for example a 180-degree, 270-degree or panoramic driving image of the target vehicle. The in-vehicle terminal 12 then transmits the target driving image to the server 11.
The server 11 may be understood as a local server or a cloud server capable of data interaction with the vehicle-mounted terminal 12, so as to process data quickly and efficiently. The server 11 integrates a judging module 111 for judging whether the target driving image transmitted by the vehicle-mounted terminal 12 meets the preset image requirement. The preset image requirement can be understood as an image requirement set in advance that the target driving image must satisfy; for example, it may be at least one of an image definition requirement, an image integrity requirement, an image distortion requirement, and the like. The server 11 further integrates a calibration module 112 for generating a calibration instruction corresponding to the target camera when it is determined, according to the judgment result sent by the judging module 111, that a target camera whose orientation angle is to be calibrated exists among the vehicle-mounted cameras 122.
Specifically, the server 11 judges, through the judging module 111, whether the target driving image meets the preset image requirement, and sends the judgment result to the calibration module 112. Further, when the calibration module 112 analyses the judgment result and determines that a target camera exists among the vehicle-mounted cameras 122, the server 11 generates a calibration instruction for calibrating the orientation angle of the target camera and transmits it to the control module 121. On this basis, after receiving the calibration instruction through the control module 121, the in-vehicle terminal 12 calibrates the orientation angle of the target camera according to the calibration instruction.
In practical applications, the calibration module 112 may determine whether a target camera exists in a number of ways. For example, it may obtain the current shooting angle of each vehicle-mounted camera from the judgment result and decide whether that camera is a target camera according to the shooting angle, e.g. by treating any vehicle-mounted camera whose shooting angle lies outside a preset angle range as a target camera. Alternatively, when the judgment result shows that the target driving image does not meet the preset image requirement, the driving image that causes the target driving image to fail the requirement may be located among the driving images, and the vehicle-mounted camera corresponding to that driving image is the target camera. For example, when a large overlapping area exists in the target driving image, the driving images corresponding to the overlapping area may be taken as the located driving images; when a black area exists in the target driving image, the driving images corresponding to the regions adjacent to the black area may be taken as the located driving images.
In addition, after determining that a target camera exists, the calibration module 112 may generate the calibration instruction in several ways. It may generate an instruction that adjusts the target camera by a preset angle in a preset direction. Alternatively, the calibration angle to which the target camera is to be calibrated may be calculated from the preset image requirement and an instruction corresponding to that calibration angle generated; for example, the first area of an overlapping region in the target driving image is calculated, the direction and angle by which the target camera must be calibrated are determined from the first area, and the calibration angle is obtained from that direction and angle.
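As an illustration of this decision logic, the following Python sketch checks each camera's current shooting angle against a preset range and emits a fixed-step calibration instruction. All names and numeric values (PRESET_ANGLE_RANGE, STEP_DEG, the instruction format) are illustrative assumptions and are not prescribed by the embodiment.

```python
# Sketch of the calibration module's decision logic described above.
# All names and thresholds are illustrative assumptions, not part of the patent.
from dataclasses import dataclass

PRESET_ANGLE_RANGE = (-5.0, 5.0)   # allowed deviation from the nominal orientation, degrees
STEP_DEG = 2.0                     # preset adjustment step, degrees

@dataclass
class CalibrationInstruction:
    camera_id: str
    delta_deg: float               # signed rotation to apply to the camera's mount

def find_target_cameras(current_angles: dict[str, float],
                        nominal_angles: dict[str, float]) -> list[str]:
    """First approach described above: a camera whose shooting angle falls
    outside the preset range around its nominal orientation is a target camera."""
    low, high = PRESET_ANGLE_RANGE
    return [cam for cam, angle in current_angles.items()
            if not (low <= angle - nominal_angles[cam] <= high)]

def make_instructions(target_cameras: list[str],
                      current_angles: dict[str, float],
                      nominal_angles: dict[str, float]) -> list[CalibrationInstruction]:
    """Simplest instruction generation: step each target camera back toward its
    nominal orientation by a preset angle in the appropriate direction."""
    out = []
    for cam in target_cameras:
        error = current_angles[cam] - nominal_angles[cam]
        out.append(CalibrationInstruction(cam, -STEP_DEG if error > 0 else STEP_DEG))
    return out

if __name__ == "__main__":
    current = {"front": 0.0, "front_left": 52.0}   # front_left drifted from its nominal 60°
    nominal = {"front": 0.0, "front_left": 60.0}
    targets = find_target_cameras(current, nominal)
    print(targets, make_instructions(targets, current, nominal))
```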
In the technical scheme of this embodiment, the judging module judges whether the target driving image obtained from the driving images meets the preset image requirement, so that only the vehicle-mounted cameras that actually need calibration are processed, which improves calibration efficiency; the calibration module automatically generates a calibration instruction when a target camera whose orientation angle is to be calibrated exists, which improves calibration efficiency and accuracy; and the control module calibrates the orientation angle of the target camera automatically according to the calibration instruction, without manual participation, which reduces cost. In other words, by analysing the driving images acquired by the vehicle-mounted cameras on the target vehicle, automatically generating a calibration instruction whenever any of those cameras needs to be calibrated, and automatically completing calibration of the orientation angles according to that instruction, the calibration cost of the vehicle-mounted cameras is reduced and calibration efficiency and accuracy are improved.
In an optional technical scheme, cradle heads (pan-tilt heads) corresponding respectively to the at least two vehicle-mounted cameras are integrated in the vehicle-mounted terminal, so that the at least two vehicle-mounted cameras are each mounted on their corresponding cradle head, and the control module is specifically used for: receiving the calibration instruction, and controlling rotation of the cradle head carrying the target camera based on the calibration instruction so as to calibrate the orientation angle of the target camera.
The vehicle-mounted cameras are mounted on cradle heads, in particular the at least two vehicle-mounted cameras are mounted on different cradle heads, so that automatic calibration of the corresponding orientation angles can be realized by controlling rotation of the cradle heads. No manual calibration is needed, which reduces labour cost and improves calibration precision and efficiency.
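A minimal sketch of the control module driving such a cradle head, assuming a hypothetical Gimbal interface; a real pan-tilt head would expose a vendor-specific control protocol instead of the in-memory stand-in used here.

```python
# Sketch of the control module rotating a pan-tilt head ("cradle head") to
# calibrate a camera's orientation. The Gimbal class is hypothetical.
class Gimbal:
    def __init__(self, camera_id: str, pan_deg: float = 0.0):
        self.camera_id = camera_id
        self.pan_deg = pan_deg

    def rotate_to(self, pan_deg: float) -> None:
        # A real system would send a command to the pan-tilt motor here;
        # this stand-in only records the commanded orientation.
        self.pan_deg = pan_deg

class ControlModule:
    def __init__(self, gimbals: dict[str, Gimbal]):
        self.gimbals = gimbals

    def apply(self, camera_id: str, calibration_angle_deg: float) -> None:
        """Rotate the cradle head carrying the target camera to the calibration angle."""
        self.gimbals[camera_id].rotate_to(calibration_angle_deg)

if __name__ == "__main__":
    ctrl = ControlModule({"front_left": Gimbal("front_left", pan_deg=52.0)})
    ctrl.apply("front_left", 60.0)
    print(ctrl.gimbals["front_left"].pan_deg)   # 60.0
```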
The calibration module is further configured to generate a calibration termination instruction when it is determined that the target camera does not exist in the at least two vehicle-mounted cameras according to the determination result;
The control module is further used for terminating calibration of the orientation angles of the at least two vehicle-mounted cameras based on the calibration termination instruction sent by the calibration module.
The calibration termination instruction can be understood as an instruction indicating that the orientation angles of the vehicle-mounted cameras are already in place, so the existing orientation angles are maintained and calibration ends.
When the target driving image meets the preset image requirement, the orientation angles of the vehicle-mounted cameras are already appropriate and no target camera needs to be calibrated. The calibration module can then generate a calibration termination instruction, so that the control module ends its control of the cameras according to that instruction and the current orientation angles of the vehicle-mounted cameras are kept fixed.
In another optional technical scheme, a first transmission module is further integrated in the vehicle-mounted terminal, and a second transmission module is further integrated in the server: the first transmission module is used for receiving the target driving image, uploading the target driving image to the second transmission module, receiving the calibration instruction issued by the second transmission module and sending the calibration instruction to the control module; the second transmission module is used for receiving the target driving image uploaded by the first transmission module, transmitting the target driving image to the judging module, receiving the calibration instruction sent by the calibration module and sending the calibration instruction to the first transmission module.
The data transmission between the vehicle-mounted terminal and the server can be realized through the first transmission module and the second transmission module. In practical application, optionally, the transmission modes of the first transmission module and the second transmission module can flexibly select a wired transmission mode or a wireless transmission mode according to practical situations. For example, a wired transmission mode can be used when the distance between the server and the vehicle-mounted terminal is relatively short, so as to improve the efficiency and stability of data transmission, and a wireless transmission mode can be used when the distance between the server and the vehicle-mounted terminal is relatively long, so as to improve the convenience of interaction.
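For illustration only, the relay behaviour of the two transmission modules can be sketched with in-process queues standing in for the wired or wireless link; the class and queue names are assumptions, not part of the embodiment.

```python
# In-process stand-in for the first/second transmission modules: two queues
# model the uplink (target driving image) and downlink (calibration
# instruction). A real system would use a wired or wireless link.
import queue

uplink: "queue.Queue[bytes]" = queue.Queue()    # vehicle-mounted terminal -> server
downlink: "queue.Queue[dict]" = queue.Queue()   # server -> vehicle-mounted terminal

class FirstTransmissionModule:                  # integrated in the vehicle-mounted terminal
    def upload_target_image(self, image_bytes: bytes) -> None:
        uplink.put(image_bytes)

    def receive_instruction(self) -> dict:
        return downlink.get()

class SecondTransmissionModule:                 # integrated in the server
    def receive_target_image(self) -> bytes:
        return uplink.get()

    def send_instruction(self, instruction: dict) -> None:
        downlink.put(instruction)

if __name__ == "__main__":
    terminal_tx, server_tx = FirstTransmissionModule(), SecondTransmissionModule()
    terminal_tx.upload_target_image(b"stitched-panorama-bytes")
    print(server_tx.receive_target_image())      # forwarded to the judging module
    server_tx.send_instruction({"camera_id": "front_left", "calibration_angle_deg": 60.0})
    print(terminal_tx.receive_instruction())     # forwarded to the control module
```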
Fig. 2 is a block diagram of another vehicle camera orientation angle calibration system according to an embodiment of the present invention. The present embodiment is optimized based on the above technical solutions. In this embodiment, optionally, a processor is integrated in the vehicle-mounted terminal; and the processor is used for receiving at least two driving images, and splicing the at least two driving images to obtain a target driving image. Wherein, the explanation of the same or corresponding terms as the above embodiments is not repeated herein.
Specifically, referring to fig. 2, the vehicle camera orientation angle calibration system according to the embodiment of the present invention includes: the system comprises a server 21 and an on-board terminal 22 integrated on a target vehicle, wherein a control module 221, at least two on-board cameras 222 and a processor 223 are integrated in the on-board terminal 22, and a judging module 211 and a calibration module 212 are integrated in the server 21;
At least two vehicle-mounted cameras 222 respectively used for acquiring driving images of the target vehicle to obtain at least two driving images;
A processor 223, configured to receive at least two driving images, and splice the at least two driving images to obtain a target driving image;
the judging module 211 is configured to receive the target driving image transmitted by the vehicle-mounted terminal 22, judge whether the target driving image meets a preset image requirement, and send the obtained judging result to the calibration module 212, where the target driving image is obtained based on at least two driving images;
the calibration module 212 is configured to generate a calibration instruction for calibrating the orientation angle of the target camera and transmit the calibration instruction to the control module 221 when it is determined that the target camera whose orientation angle is to be calibrated exists in the at least two vehicle-mounted cameras 222 according to the determination result;
The control module 221 is configured to calibrate an orientation angle of the target camera according to the received calibration instruction.
The processor 223 may be understood as a functional module for generating the target driving image. Optionally, the processor 223 may stitch only part of the received driving images to generate a target driving image, for example stitching them in pairs or in groups of three, or it may stitch all of the received driving images into a single target driving image; this is not specifically limited herein.
In this embodiment, the at least two received driving images are stitched by the processor integrated in the vehicle-mounted terminal to generate the target driving image, which is then transmitted to the server. Compared with transmitting the at least two driving images to the server directly, this increases transmission efficiency, reduces transmission load, reduces the processing burden on the server, and improves calibration efficiency.
In an alternative solution, the processor is specifically configured to: receive the at least two driving images, and stitch the at least two driving images based on the current orientation angles of the at least two vehicle-mounted cameras to generate the target driving image.
Each of the at least two vehicle-mounted cameras has its own orientation angle, and these orientation angles reflect the spatial ranges covered by the driving images acquired by the respective cameras, so the at least two driving images can be stitched according to the orientation angles to generate the target driving image. Optionally, the driving images may be stitched according to the numerical values of the cameras' orientation angles, or driving images that are adjacent in direction may be stitched according to the directions represented by the orientation angles.
Stitching the driving images according to the cameras' orientation angles yields a seamlessly connected target driving image, which facilitates its subsequent analysis.
On this basis, the control module is optionally further used for setting a reference camera among the at least two vehicle-mounted cameras to an orientation of 0 degrees before the at least two vehicle-mounted cameras acquire the driving images; the processor is further configured to determine, for each vehicle-mounted camera other than the reference camera, the yaw angle of that camera with respect to the reference camera and to take the yaw angle as the camera's current orientation angle.
The processor calculates a deflection angle of each of the at least two vehicle-mounted cameras except the reference camera relative to the reference camera, and uses the deflection angle as an orientation angle of each of the vehicle-mounted cameras.
According to the technical scheme, the orientation of the reference camera is controlled to be 0 degrees, so that the deflection angle of the vehicle-mounted camera except the reference camera relative to the reference camera can be directly used as the orientation angle of the vehicle-mounted camera, and the quick determination of the orientation angle of each vehicle-mounted camera is realized.
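A sketch of this angle-driven ordering and stitching, under the assumption that simple horizontal concatenation of same-height frames stands in for real panorama stitching and that the front camera serves as the 0-degree reference:

```python
# Sketch of ordering and stitching driving images by the cameras' current
# orientation angles, with the front camera as the 0° reference. Real systems
# would use feature-based stitching; concatenation of synthetic frames here is
# only a stand-in to show the angle-driven ordering.
import numpy as np

def orientation_angles(yaw_deg: dict[str, float], reference: str = "front") -> dict[str, float]:
    """Yaw of each camera relative to the reference camera, used directly as
    its current orientation angle (the reference is forced to 0°)."""
    ref = yaw_deg[reference]
    return {cam: (yaw - ref) % 360.0 for cam, yaw in yaw_deg.items()}

def stitch_by_angle(frames: dict[str, np.ndarray], angles: dict[str, float]) -> np.ndarray:
    """Concatenate frames in order of increasing orientation angle to form the
    target driving image (a crude panorama)."""
    order = sorted(frames, key=lambda cam: angles[cam])
    return np.hstack([frames[cam] for cam in order])

if __name__ == "__main__":
    yaws = {"front": 0.0, "front_right": 60.0, "rear_right": 120.0,
            "rear": 180.0, "rear_left": 240.0, "front_left": 300.0}
    frames = {cam: np.full((4, 8, 3), i, dtype=np.uint8) for i, cam in enumerate(yaws)}
    panorama = stitch_by_angle(frames, orientation_angles(yaws))
    print(panorama.shape)   # (4, 48, 3): six 8-pixel-wide frames side by side
```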
Fig. 3 is a block diagram of a further vehicle-mounted camera orientation angle calibration system according to an embodiment of the present invention. The present embodiment is optimized on the basis of the above technical solutions. In this embodiment, optionally, the target driving image includes a panoramic driving image, and the preset image requirement includes a preset overlapping area requirement and/or a black area requirement; the judging module is specifically used for: receiving the panoramic driving image transmitted by the vehicle-mounted terminal, and respectively calculating a first area of an overlapping region in the panoramic driving image and/or a second area of a black region in the panoramic driving image; judging whether the overlapping region meets the overlapping area requirement based on the first area, and/or judging whether the black region meets the black area requirement based on the second area, so as to obtain a judgment result, wherein the overlapping area requirement includes that the area of the overlapping region is smaller than or equal to a first preset area, and/or the black area requirement includes that the area of the black region is smaller than or equal to a second preset area; and sending the judgment result to the calibration module. Explanations of terms that are the same as or correspond to those in the above embodiments are not repeated herein.
Specifically, referring to fig. 3, the vehicle camera orientation angle calibration system according to the embodiment of the present invention includes: the system comprises a server 31 and a vehicle-mounted terminal 32 integrated on a target vehicle, wherein a control module 321, at least two vehicle-mounted cameras 322 and a processor 323 are integrated in the vehicle-mounted terminal 32, and a judging module 311 and a calibration module 312 are integrated in the server 31;
At least two vehicle-mounted cameras 322 respectively used for acquiring driving images of the target vehicle to obtain at least two driving images;
the processor 323 is configured to receive at least two driving images, and splice the at least two driving images based on the current corresponding orientation angles of the at least two vehicle-mounted cameras, so as to generate a target driving image, where the target driving image includes a panoramic driving image;
The judging module 311 is configured to receive the panoramic driving image transmitted by the vehicle-mounted terminal, and calculate a first area of an overlapping region in the panoramic driving image and/or a second area of a black region in the panoramic driving image respectively;
The judging module 311 is further configured to judge whether the overlapping region meets the overlapping area requirement based on the first area, and/or judge whether the black region meets the black area requirement based on the second area, so as to obtain a judgment result, wherein the overlapping area requirement includes that the area of the overlapping region is smaller than or equal to a first preset area, and/or the black area requirement includes that the area of the black region is smaller than or equal to a second preset area, and to send the judgment result to the calibration module;
the calibration module 312 is configured to generate a calibration instruction for calibrating the orientation angle of the target camera and transmit the calibration instruction to the control module 321 when determining that the target camera whose orientation angle is to be calibrated exists in the at least two vehicle-mounted cameras 322 according to the determination result;
the control module 321 is configured to calibrate an orientation angle of the target camera according to the received calibration instruction.
The panoramic driving image can be understood as a 360-degree driving image around the target vehicle captured while the target vehicle is driving; the overlapping area can be understood as an area where images overlap in the panoramic driving image obtained by stitching the driving images; the black area can be understood as an area without image information in the panoramic driving image obtained by stitching the driving images; the overlapping area requirement can be understood as a requirement that the overlapping area must meet, for example that its area is smaller than or equal to a first preset area; and the black area requirement can be understood as a requirement that the black area must meet, for example that its area is smaller than or equal to a second preset area.
The processor 323 may stitch the driving images into the panoramic driving image. After receiving the panoramic driving image transmitted by the vehicle-mounted terminal 32, the judging module 311 calculates the first area of the overlapping region and the second area of the black region in the panoramic driving image, judges whether the overlapping region meets the overlapping area requirement, i.e. whether the first area is smaller than or equal to the first preset area, judges whether the black region meets the black area requirement, i.e. whether the second area is smaller than or equal to the second preset area, and finally sends the judgment result to the calibration module.
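A sketch of these area checks, assuming the panorama is accompanied by a per-pixel coverage count (how many camera images contribute to each pixel); the thresholds and the coverage-map representation are illustrative assumptions:

```python
# Sketch of the judging module's area checks: pixels covered by two or more
# camera images form the overlapping region (first area), pixels covered by
# none form the black region (second area).
import numpy as np

FIRST_PRESET_AREA = 500     # maximum allowed overlap area, in pixels
SECOND_PRESET_AREA = 200    # maximum allowed black area, in pixels

def judge(coverage: np.ndarray) -> dict:
    """coverage[h, w] = number of camera images contributing to that pixel."""
    first_area = int(np.count_nonzero(coverage >= 2))   # overlapping region
    second_area = int(np.count_nonzero(coverage == 0))  # black region
    return {
        "first_area": first_area,
        "second_area": second_area,
        "overlap_ok": first_area <= FIRST_PRESET_AREA,
        "black_ok": second_area <= SECOND_PRESET_AREA,
    }

if __name__ == "__main__":
    cov = np.ones((100, 600), dtype=np.uint8)   # panorama coverage map
    cov[:, 95:105] = 2                          # 10-column overlap between two cameras
    cov[:, 300:302] = 0                         # 2-column black gap
    print(judge(cov))   # first_area=1000 -> overlapping area requirement not met
```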
In this embodiment, the judging module judges whether the first area of the overlapping area in the panoramic driving image meets the requirement of the overlapping area and/or whether the second area of the black area in the panoramic driving image meets the requirement of the black area, and generates a judging result, so that the calibrating module can calibrate according to the classification of the judging result, and the efficiency and the accuracy of the calibration of the orientation angle of the vehicle-mounted camera are improved.
On this basis, in an optional technical scheme, the number of overlapping regions is at least one and the number of black regions is at least one, and the calibration module is further used for: when it is determined from the judgment result that a target overlapping region that does not meet the overlapping area requirement exists among the at least one overlapping region, taking the vehicle-mounted cameras associated with the target overlapping region among the at least two vehicle-mounted cameras as target cameras whose orientation angles are to be calibrated; and/or, when it is determined from the judgment result that a target black region that does not meet the black area requirement exists among the at least one black region, taking the vehicle-mounted cameras associated with the target black region among the at least two vehicle-mounted cameras as target cameras whose orientation angles are to be calibrated.
When a target overlapping region that does not meet the overlapping area requirement exists among the at least one overlapping region, the driving images corresponding to the target overlapping region are determined, and the vehicle-mounted cameras corresponding to those driving images are taken as target cameras that need to be calibrated. For example, in the target driving image, if the target overlapping region is formed where the driving images acquired by camera A and camera B overlap, then camera A and camera B are taken as the target cameras to be calibrated.
When a target black region that does not meet the black area requirement exists among the at least one black region, the vehicle-mounted cameras corresponding to the driving images adjacent to that black region can be taken as the target cameras that need to be calibrated.
By taking the vehicle-mounted cameras associated with the target overlapping region and/or the target black region as the target cameras whose orientation angles need to be calibrated, the target overlapping region and/or the target black region can be effectively reduced after the orientation angles of the target cameras are calibrated.
In another alternative solution, the calibration module is specifically configured to: under the condition that the target cameras with the orientation angles to be calibrated exist in at least two vehicle-mounted cameras according to the judging result, calculating the calibration angles of the target cameras based on the current orientation angles of the target cameras and the first area and/or the second area; based on the calibration angle, a calibration instruction for calibrating the orientation angle of the target camera is generated and transmitted to the control module.
In practical application, the calibration angle of the target camera may be calculated according to the current orientation angle and the first area of the target camera.
The acquisition field of view of each vehicle-mounted camera is a sector, and the acquisition distance and acquisition height of each camera's field of view are known. The target overlapping region can be understood as the arc-shaped surface of the sector in which the fields of view of two target cameras overlap, and this surface can be approximated as a rectangle whose height is the camera's acquisition height and whose area is the first area. Once the first area and the acquisition height are determined, the width of the target overlapping region can be calculated. Taking this width as the arc length of the sector, and with the acquisition distance known, the overlap angle between the two cameras' fields of view is obtained, and half of that angle is taken as the adjustment angle. The target camera corresponding to the driving image on the left of the target overlapping region subtracts the adjustment angle from its current orientation angle, and the target camera corresponding to the driving image on the right of the target overlapping region adds the adjustment angle to its current orientation angle, giving the calibration angle of each target camera. Based on these calibration angles, calibration instructions are generated to adjust each target camera to its corresponding calibration angle.
Similarly, the calibration angle of the target camera can be calculated according to the current orientation angle of the target camera and the second area.
The black region can be understood as the arc-shaped surface of the sector corresponding to the gap between the two target cameras' acquisition fields of view, and it can likewise be approximated as a rectangle whose height is the camera's acquisition height and whose area is the second area. Once the second area and the acquisition height are determined, the width of the black region can be calculated. Taking this width as the arc length of the sector, and with the acquisition distance known, the angle of the black region between the two cameras' fields of view is obtained, and half of that angle is taken as the adjustment angle. The target camera corresponding to the driving image on the left of the black region adds the adjustment angle to its current orientation angle, and the target camera corresponding to the driving image on the right of the black region subtracts the adjustment angle from its current orientation angle, giving the calibration angle of each target camera. Based on these calibration angles, calibration instructions are generated to adjust each target camera to its corresponding calibration angle.
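The geometry above can be sketched as follows, with the acquisition height, acquisition distance and region areas chosen as illustrative assumptions:

```python
# Geometric sketch of the calibration-angle computation: the overlap (or black)
# region is approximated as a rectangle whose height is the acquisition height,
# so its width follows from the measured area; treating that width as an arc
# length at the known acquisition distance gives the overlap (or gap) angle,
# half of which is applied to each of the two adjacent cameras with opposite
# signs for overlap versus gap. All numeric values are illustrative assumptions.
import math

ACQ_HEIGHT_M = 2.0     # acquisition height of the field of view, metres
ACQ_DISTANCE_M = 10.0  # acquisition distance (sector radius), metres

def adjustment_angle_deg(region_area_m2: float) -> float:
    """Half of the angle subtended by the overlap/black region."""
    width_m = region_area_m2 / ACQ_HEIGHT_M          # rectangle approximation
    full_angle_rad = width_m / ACQ_DISTANCE_M        # arc length / radius
    return math.degrees(full_angle_rad) / 2.0

def calibration_angles(left_deg: float, right_deg: float,
                       region_area_m2: float, is_overlap: bool) -> tuple[float, float]:
    """New orientation angles for the cameras left and right of the region.
    Overlap: rotate the cameras apart (left -, right +).
    Black gap: rotate them toward each other (left +, right -)."""
    adj = adjustment_angle_deg(region_area_m2)
    if is_overlap:
        return left_deg - adj, right_deg + adj
    return left_deg + adj, right_deg - adj

if __name__ == "__main__":
    # 1.4 m² of overlap between the 0° and 60° cameras
    print(calibration_angles(0.0, 60.0, region_area_m2=1.4, is_overlap=True))
```

With these assumed values, a 1.4 m² overlap between the 0° and 60° cameras yields an adjustment of roughly 2 degrees for each camera, i.e. calibration angles of about -2° and 62°.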
Of course, the calibration angle of the target camera may also be calculated according to the current orientation angle, the first and second areas of the target camera, and the like, which are not specifically limited herein.
By accurately calculating the calibration angle of the target camera and generating the calibration instruction from it, the control module can calibrate the orientation angle of the target camera precisely, which improves calibration accuracy.
In order to better understand the above technical solutions as a whole, an exemplary description is given below with reference to a specific example. Illustratively, as shown in Fig. 4, the vehicle-mounted camera orientation angle calibration system of this example may include a server and a vehicle-mounted terminal, wherein the server includes a calibration module, a judging module and a second transmission module, and the vehicle-mounted terminal includes a first transmission module, a processor, a control module, cradle heads and vehicle-mounted cameras. In this specific example, the number of vehicle-mounted cameras is 6, the first transmission module and the second transmission module exchange data wirelessly, each vehicle-mounted camera is mounted on its corresponding cradle head, and the cradle heads are mounted directly in front of, at the front left of, at the front right of, at the rear left of, at the rear right of, and directly behind the target vehicle.
On this basis, the workflow of the on-vehicle camera orientation angle calibration system illustrated in this example is shown in fig. 5, and the specific steps are as follows:
1. The vehicle-mounted camera directly in front of the target vehicle is taken as the reference camera, and its orientation angle is set to 0 degrees by rotating the cradle head on which it is mounted. The processor in the vehicle-mounted terminal then calculates the deflection angles of the other five vehicle-mounted cameras relative to the reference camera, thereby obtaining the current orientation angle of each of the other five cameras.
2. The processor receives the driving images collected by the six vehicle-mounted cameras, stitches the six driving images based on the cameras' current orientation angles to generate a panoramic driving image, and uploads the panoramic driving image to the server through the first transmission module.
3. The second transmission module in the server receives the panoramic driving image; the judging module calculates the first area of the overlapping region and the second area of the black region in the panoramic driving image, judges whether the first area is smaller than or equal to the first preset area and whether the second area is smaller than or equal to the second preset area, generates a judgment result, and transmits it to the calibration module.
4. When the calibration module determines from the judgment result that the first area is smaller than or equal to the first preset area and the second area is smaller than or equal to the second preset area, no target camera to be calibrated exists among the other five vehicle-mounted cameras; a calibration termination instruction is then generated and transmitted to the control module via the second transmission module and the first transmission module so as to terminate calibration.
5. When the calibration module determines from the judgment result that the first area is larger than the first preset area and/or the second area is larger than the second preset area, it determines that a target camera to be calibrated exists among the other five vehicle-mounted cameras, identifies the target camera, calculates the calibration angle of the target camera based on the target camera's current orientation angle and the first area and/or the second area, generates a calibration instruction based on the calibration angle, and transmits the calibration instruction to the control module via the second transmission module and the first transmission module, so that the control module controls the cradle head carrying the target camera to rotate according to the calibration instruction, thereby calibrating the orientation angle of the target camera.
According to the specific embodiment, the modules are matched with each other, so that the automatic calibration of the orientation angles of the vehicle-mounted cameras is realized, and compared with manual calibration, the calibration cost is reduced and the calibration efficiency is improved.
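Reduced to orientation angles only, the closed loop of steps 1 to 5 can be sketched as below; the 60-degree field of view, the tolerance and the drifted starting angles are illustrative assumptions, and unlike the example above all six cameras are allowed to move rather than keeping the front camera fixed as the 0-degree reference:

```python
# Closed-loop sketch of the workflow above, reduced to angles only: six cameras
# with a 60° field of view should tile 360° exactly; any angular overlap or gap
# between neighbouring fields of view is split evenly and fed back as a
# calibration instruction until everything is within tolerance.
FOV_DEG = 60.0
TOLERANCE_DEG = 0.5

def boundary_error(left_angle: float, right_angle: float) -> float:
    """>0: overlap between the two fields of view, <0: black gap."""
    spacing = (right_angle - left_angle) % 360.0
    return FOV_DEG - spacing

def calibrate(angles: list[float], max_rounds: int = 20) -> list[float]:
    n = len(angles)
    for _ in range(max_rounds):
        adjusted = False
        for i in range(n):                      # each camera and its right neighbour
            j = (i + 1) % n
            err = boundary_error(angles[i], angles[j])
            if abs(err) > TOLERANCE_DEG:        # judgment result: calibration needed
                angles[i] -= err / 2.0          # rotate apart (overlap) or together (gap)
                angles[j] += err / 2.0
                adjusted = True
        if not adjusted:                        # termination instruction: all in place
            break
    return [a % 360.0 for a in angles]

if __name__ == "__main__":
    drifted = [0.0, 52.0, 121.0, 180.0, 238.0, 300.0]   # drifted orientation angles
    print([round(a, 1) for a in calibrate(drifted)])
```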
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. An in-vehicle camera orientation angle calibration system, comprising: the system comprises a server and a vehicle-mounted terminal integrated on a target vehicle, wherein a control module and at least two vehicle-mounted cameras are integrated in the vehicle-mounted terminal, and a judging module and a calibrating module are integrated in the server;
the at least two vehicle-mounted cameras are respectively used for acquiring driving images of the target vehicle so as to obtain at least two driving images;
The judging module is used for receiving the target driving image transmitted by the vehicle-mounted terminal, judging whether the target driving image meets the preset image requirement, and sending the obtained judging result to the calibration module, wherein the target driving image is obtained based on the at least two driving images;
the calibration module is used for generating a calibration instruction for calibrating the orientation angle of the target camera under the condition that the target camera with the orientation angle to be calibrated exists in the at least two vehicle-mounted cameras according to the judging result, and transmitting the calibration instruction to the control module;
And the control module is used for calibrating the orientation angle of the target camera according to the received calibration instruction.
2. The system of claim 1, wherein the vehicle terminal further has a processor integrated therein;
the processor is used for receiving the at least two driving images, and splicing the at least two driving images to obtain the target driving image.
3. The system of claim 2, wherein the processor is specifically configured to:
and receiving the at least two driving images, and splicing the at least two driving images based on the current orientation angles of the at least two vehicle-mounted cameras to generate the target driving image.
4. A system according to claim 3, wherein the target driving image comprises a panoramic driving image, the preset image requirements comprising preset overlap region requirements and/or black region requirements;
the judging module is specifically configured to:
receiving the panoramic driving image transmitted by the vehicle-mounted terminal, and respectively calculating a first area of an overlapping area in the panoramic driving image and/or a second area of a black area in the panoramic driving image;
Judging whether the overlapping area meets the overlapping area requirement based on the first area, and/or judging whether the black area meets the black area requirement based on the second area, and obtaining a judging result, wherein the overlapping area requirement includes that the area of the overlapping area is smaller than or equal to a first preset area, and/or the black area requirement includes that the area of the black area is smaller than or equal to a second preset area;
and sending the judging result to the calibration module.
5. The system of claim 4, wherein the number of overlapping regions comprises at least one and the number of black regions comprises at least one, the calibration module further to:
under the condition that a target overlapping area which does not meet the requirement of the overlapping area exists in at least one overlapping area according to the judging result, taking the vehicle-mounted camera which is associated with the target overlapping area in the at least two vehicle-mounted cameras as a target camera with an orientation angle to be calibrated;
and/or,
And under the condition that the target black area which does not meet the black area requirement exists in the at least one black area according to the judging result, taking the vehicle-mounted camera which is associated with the target black area in the at least two vehicle-mounted cameras as a target camera with an orientation angle to be calibrated.
6. The system of claim 4, wherein the calibration module is specifically configured to:
Under the condition that a target camera with an orientation angle to be calibrated exists in the at least two vehicle-mounted cameras according to the judging result, calculating the calibration angle of the target camera based on the current orientation angle of the target camera and the first area and/or the second area;
Based on the calibration angle, a calibration instruction for calibrating the orientation angle of the target camera is generated and transmitted to the control module.
7. A system according to claim 3, characterized in that:
The control module is further used for controlling a reference camera among the at least two vehicle-mounted cameras to an orientation of 0 degrees before the at least two vehicle-mounted cameras acquire the driving images;
The processor is further configured to determine, for each of the at least two vehicle-mounted cameras except for the reference camera, a yaw angle of the vehicle-mounted camera with respect to the reference camera, and take the yaw angle as a current orientation angle of the vehicle-mounted camera.
8. The system according to claim 1, wherein the vehicle-mounted terminal is further integrated with a cradle head corresponding to the at least two vehicle-mounted cameras respectively, so as to mount the at least two vehicle-mounted cameras on the corresponding cradle heads respectively, and the control module is specifically configured to:
And receiving the calibration instruction, and controlling the rotation of a cradle head provided with the target camera based on the calibration instruction so as to calibrate the orientation angle of the target camera.
9. The system according to claim 1, wherein:
the calibration module is further configured to generate a calibration termination instruction when it is determined that the target camera does not exist in the at least two vehicle-mounted cameras according to the determination result;
The control module is further configured to terminate calibration of the orientation angles corresponding to the at least two vehicle-mounted cameras respectively based on the calibration termination instruction sent by the calibration module.
10. The system of claim 1, wherein the vehicle-mounted terminal further comprises a first transmission module, and the server further comprises a second transmission module:
The first transmission module is used for receiving the target driving image, uploading the target driving image to the second transmission module, receiving the calibration instruction issued by the second transmission module and sending the calibration instruction to the control module;
The second transmission module is configured to receive the target driving image uploaded by the first transmission module, transmit the target driving image to the judgment module, receive the calibration instruction sent by the calibration module, and send the calibration instruction to the first transmission module.
CN202410145791.0A 2024-02-01 2024-02-01 Vehicle-mounted camera orientation angle calibration system Pending CN118075616A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410145791.0A CN118075616A (en) 2024-02-01 2024-02-01 Vehicle-mounted camera orientation angle calibration system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410145791.0A CN118075616A (en) 2024-02-01 2024-02-01 Vehicle-mounted camera orientation angle calibration system

Publications (1)

Publication Number Publication Date
CN118075616A true CN118075616A (en) 2024-05-24

Family

ID=91103464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410145791.0A Pending CN118075616A (en) 2024-02-01 2024-02-01 Vehicle-mounted camera orientation angle calibration system

Country Status (1)

Country Link
CN (1) CN118075616A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination