CN109754363B - Looking-around image synthesis method and device based on fisheye camera - Google Patents

Looking-around image synthesis method and device based on fisheye camera

Info

Publication number
CN109754363B
CN109754363B (application CN201811598527.3A)
Authority
CN
China
Prior art keywords
looking-around image
fisheye
images
stereoscopic
Prior art date
Legal status
Active
Application number
CN201811598527.3A
Other languages
Chinese (zh)
Other versions
CN109754363A (en)
Inventor
彭惠东
卢彦斌
吴颖谦
Current Assignee
Zebred Network Technology Co Ltd
Original Assignee
Zebred Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zebred Network Technology Co Ltd
Priority to CN201811598527.3A
Publication of CN109754363A
Application granted
Publication of CN109754363B
Legal status: Active

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Abstract

The application provides a method and a device for synthesizing a looking-around image based on fisheye cameras. The method comprises the following steps: acquiring a plurality of fisheye images collected by a plurality of fisheye cameras arranged on a vehicle; splicing the plurality of fisheye images according to a pre-acquired corresponding relation among the plurality of fisheye cameras to obtain a planar looking-around image corresponding to the plurality of fisheye images, wherein the corresponding relation is used for representing the position relation among the fisheye images collected by the plurality of fisheye cameras; and performing spatial alignment processing on the same spatial object in the plurality of fisheye images to obtain a stereoscopic looking-around image corresponding to the planar looking-around image. By the method, a looking-around image of the real scene around the vehicle can be obtained, the user experience is improved, blind areas during vehicle running are eliminated, and running safety is improved.

Description

Looking-around image synthesis method and device based on fisheye camera
Technical Field
The application relates to the technical field of image processing, in particular to a method and a device for synthesizing a looking-around image based on a fisheye camera.
Background
When a vehicle is being driven, and particularly when it is reversing, clearly displaying live-action image information around the vehicle is important for driving safety.
In the prior art, most cameras arranged on a vehicle work independently, and the images they acquire are usually returned to the user without being spliced, resulting in a poor user experience. In addition, although some high-end vehicle models provide a look-around function, it is mainly implemented with wide-angle lenses mounted on the vehicle; because a wide-angle lens has a narrow angle of view and the returned view is generally a planar view, the user experience is still poor.
Disclosure of Invention
In order to solve the problems in the prior art, the application provides a method and a device for synthesizing a looking-around image based on a fisheye camera, which provide the user with a looking-around image of the real scene around the vehicle, improve the user experience, eliminate blind areas while the vehicle is running, and improve running safety.
In a first aspect, an embodiment of the present application provides a method for synthesizing a looking-around image based on a fisheye camera, including:
acquiring a plurality of fisheye images acquired by a plurality of fisheye cameras arranged on a vehicle;
splicing the fisheye images according to the pre-acquired corresponding relation among the fisheye cameras to acquire plane looking-around images corresponding to the fisheye images, wherein the corresponding relation is used for representing the position relation among the fisheye images acquired by the fisheye cameras;
and carrying out space alignment processing on the same space object in the plurality of fisheye images to obtain a stereoscopic looking-around image corresponding to the plane looking-around image.
Optionally, the method further comprises:
and optimizing the plane looking-around image to obtain the optimized plane looking-around image.
Optionally, the method further comprises:
acquiring a plurality of calibration plate images acquired by the plurality of fisheye cameras, wherein the fisheye images comprise the calibration plate images;
and acquiring the corresponding relation of the fisheye cameras according to the calibration plate images.
Optionally, the obtaining the correspondence of the plurality of fisheye cameras according to the plurality of calibration plate images includes:
acquiring characteristic points of the plurality of calibration plate images according to a characteristic point detection algorithm;
acquiring position coordinates of the feature points according to the feature points of the plurality of calibration plate images;
and acquiring the corresponding relation of the plurality of fisheye cameras according to the position coordinates of the feature points.
Optionally, the performing spatial alignment processing on the same spatial object in the plurality of fisheye images to obtain a stereoscopic looking-around image corresponding to the planar looking-around image includes:
carrying out space alignment processing on the same space object in the plurality of fish-eye images to obtain a first stereoscopic looking-around image;
performing color mixing processing on the first stereoscopic looking-around image to obtain a second stereoscopic looking-around image;
and acquiring a stereoscopic looking-around image corresponding to the plane looking-around image according to the second stereoscopic looking-around image and a pre-established three-dimensional model, wherein the stereoscopic looking-around image is the projection of the second stereoscopic looking-around image on the pre-established three-dimensional model.
Optionally, the method further comprises:
and displaying the stereoscopic looking-around image corresponding to the planar looking-around image.
Optionally, the number of the fisheye cameras is four.
In a second aspect, an embodiment of the present application provides a fisheye-camera-based looking-around image synthesis device, including:
an acquisition module for acquiring a plurality of fisheye images acquired by a plurality of fisheye cameras provided on a vehicle;
the processing module is used for splicing the plurality of fisheye images according to the pre-acquired corresponding relation among the plurality of fisheye cameras, so as to acquire a plane looking-around image corresponding to the plurality of fisheye images, wherein the corresponding relation is used for representing the position relation among the fisheye images acquired by the plurality of fisheye cameras;
and carrying out space alignment processing on the same space object in the plurality of fisheye images to obtain a stereoscopic looking-around image corresponding to the plane looking-around image.
Optionally, the processing module is further configured to optimize the plane view-around image, and obtain an optimized plane view-around image.
Optionally, the acquiring module is further configured to acquire a plurality of calibration plate images acquired by the plurality of fisheye cameras, where the fisheye images include the calibration plate images;
and acquiring the corresponding relation of the fisheye cameras according to the calibration plate images.
Optionally, the obtaining module is further configured to obtain feature points of the plurality of calibration plate images according to a feature point detection algorithm;
acquiring position coordinates of the feature points according to the feature points of the plurality of calibration plate images;
and acquiring the corresponding relation of the plurality of fisheye cameras according to the position coordinates of the feature points.
Optionally, the processing module is further configured to perform spatial alignment processing on the same spatial object in the plurality of fisheye images, so as to obtain a first stereoscopic looking-around image;
performing color mixing processing on the first stereoscopic looking-around image to obtain a second stereoscopic looking-around image;
the acquisition module is further configured to acquire a stereoscopic looking-around image corresponding to the planar looking-around image according to the second stereoscopic looking-around image and a pre-established three-dimensional model, where the stereoscopic looking-around image is a projection of the second stereoscopic looking-around image on the pre-established three-dimensional model.
Optionally, the apparatus further includes:
and the display module is used for displaying the stereoscopic looking-around image corresponding to the planar looking-around image.
Optionally, the number of the fisheye cameras is four.
In a third aspect, an embodiment of the present application provides an in-vehicle apparatus, including:
a processor;
a memory; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, the computer program causing a server to execute the method of the first aspect.
The application provides a method and a device for synthesizing a looking-around image based on fisheye cameras. The method comprises the following steps: acquiring a plurality of fisheye images collected by a plurality of fisheye cameras arranged on a vehicle; splicing the plurality of fisheye images according to a pre-acquired corresponding relation among the plurality of fisheye cameras to obtain a planar looking-around image corresponding to the plurality of fisheye images, wherein the corresponding relation is used for representing the position relation among the fisheye images collected by the plurality of fisheye cameras; and performing spatial alignment processing on the same spatial object in the plurality of fisheye images to obtain a stereoscopic looking-around image corresponding to the planar looking-around image. By the method, a looking-around image of the real scene around the vehicle can be obtained, the user experience is improved, blind areas during vehicle running are eliminated, and running safety is improved.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings described below show only some embodiments of the application, and that a person skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a first schematic flow chart of a method for synthesizing a looking-around image based on a fisheye camera according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a fisheye image according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a top-view planar looking-around image provided by an embodiment of the present application;
Fig. 4 is a second schematic flow chart of a method for synthesizing a looking-around image based on a fisheye camera according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a calibration plate according to an embodiment of the present application;
Fig. 6 is a third schematic flow chart of a method for synthesizing a looking-around image based on a fisheye camera according to an embodiment of the present application;
Fig. 7 is a schematic diagram of spliced fisheye images provided in an embodiment of the present application;
Fig. 8 is a schematic diagram of a stereoscopic looking-around image according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of a fisheye-camera-based looking-around image synthesis device according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of an in-vehicle apparatus according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The method for synthesizing a looking-around image of a fisheye camera provided by the application can be applied to scenes in which the environment around a vehicle needs to be viewed, such as while the vehicle is running or parking. During running or parking, to ensure safety, the driver needs to check whether obstacles exist around the vehicle. At present, most cameras arranged on vehicles work independently, and the images they collect are returned to the user without being spliced, giving the user an unnatural live-action experience and therefore a poor user experience. In addition, although some high-end vehicles have a look-around function, it is mainly implemented with wide-angle lenses arranged on the vehicle; because a wide-angle lens has a narrow and relatively fixed viewing angle and the returned view is generally a planar view, a stereoscopic looking-around image cannot be provided to the user, the user cannot manually rotate the image by touch to an arbitrary viewing angle, the speed of returning the image is low, and the cost is high, so the user experience is poor.
In view of the above problems, an embodiment of the present application provides a method for synthesizing a looking-around image of a fisheye camera, in which a vehicle-mounted device acquires a plurality of fisheye images acquired by a plurality of fisheye cameras provided on a vehicle, and splices the plurality of fisheye images according to a correspondence between the plurality of fisheye cameras acquired in advance, acquires a planar looking-around image corresponding to the plurality of fisheye images, where the correspondence is used to represent a positional relationship between the fisheye images acquired by the plurality of fisheye cameras, and performs spatial alignment processing on the same spatial object in the plurality of fisheye images, so as to acquire a stereoscopic looking-around image corresponding to the planar looking-around image. By the method, the looking-around image of the real scenes around the vehicle can be obtained, the user experience is improved, the blind area of the vehicle during running is eliminated, and the running safety is improved.
The technical scheme of the application is described in detail below by specific examples. The following embodiments may be combined with each other, and some embodiments may not be repeated for the same or similar concepts or processes.
Fig. 1 is a first schematic flow chart of a method for synthesizing a looking-around image based on a fisheye camera according to an embodiment of the present application. The embodiment of the application provides a method for synthesizing a looking-around image of a fisheye camera, which can be executed by any device capable of executing the method, and the device can be realized by software and/or hardware. In this embodiment, the device may be integrated in a vehicle-mounted device. As shown in fig. 1, the method for synthesizing the looking-around image of the fisheye camera provided by the embodiment of the application comprises the following steps:
s101, acquiring a plurality of fisheye images acquired by a plurality of fisheye cameras arranged on a vehicle.
The vehicle can be provided with a plurality of fisheye cameras, which can be respectively arranged around the vehicle and used for collecting images of the vehicle in different directions. The number of the fisheye cameras is not particularly limited, as long as the images collected by the fisheye cameras can be synthesized to completely display the surrounding environment of the vehicle.
Optionally, the number of the fisheye cameras is 4, and the fisheye cameras can be respectively arranged at the head, the tail, the left side and the right side of the vehicle.
S102, splicing the fisheye images according to the corresponding relation among the fisheye cameras acquired in advance, and acquiring the plane looking-around images corresponding to the fisheye images.
The corresponding relation is used for representing the position relation among fisheye images acquired by the fisheye cameras, and if the positions of the fisheye cameras are not changed, the corresponding relation is fixed.
Alternatively, the correspondence may be a geometric transformation relationship.
Taking 4 fisheye cameras arranged on a vehicle as an example, referring to fig. 2 and 3, fig. 2 is a schematic diagram of a fisheye image provided by an embodiment of the present application, and fig. 3 is a schematic diagram of a top view looking-around image provided by an embodiment of the present application.
As can be seen from fig. 2, the 4 fisheye cameras are disposed on the front, rear, left and right sides of the vehicle. Since the maximum shooting range of a fisheye camera is 180°, the fisheye image collected by fisheye camera 1 and the fisheye image collected by fisheye camera 2 have an overlapping area; likewise, the fisheye images collected by fisheye camera 2 and fisheye camera 3, by fisheye camera 3 and fisheye camera 4, and by fisheye camera 4 and fisheye camera 1 have overlapping areas. The four fisheye images collected by the four fisheye cameras can therefore be spliced according to the positional relationships between them, so as to obtain the planar looking-around image corresponding to the four fisheye images.
It should be noted that the splicing here means splicing any two fisheye images that have an overlapping area: when their overlapping areas coincide completely, the two fisheye images are spliced, and the result is then spliced with the next image in the same way until the four fisheye images are spliced together into a complete planar image.
Further, a two-dimensional model is built in advance to filter out information that does not need to be displayed to the user, for example objects in the sky. By combining the spliced image with this two-dimensional model, the top-view planar looking-around image shown in fig. 3 can be obtained.
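As a concrete illustration of this splicing step (not the patent's exact procedure), the following Python sketch warps each undistorted fisheye view onto a common ground-plane canvas and overlays them. The camera intrinsics K, distortion coefficients D, and the per-camera ground-plane homographies H are assumed to be known from a prior calibration; all function and parameter names here are illustrative assumptions.

```python
# Minimal sketch: compose a top-view planar looking-around image from four fisheye views.
import cv2
import numpy as np

def undistort_fisheye(img, K, D):
    """Remove fisheye distortion with the OpenCV fisheye model."""
    h, w = img.shape[:2]
    map1, map2 = cv2.fisheye.initUndistortRectifyMap(
        K, D, np.eye(3), K, (w, h), cv2.CV_16SC2)
    return cv2.remap(img, map1, map2, interpolation=cv2.INTER_LINEAR)

def compose_plane_view(fisheye_imgs, Ks, Ds, Hs, canvas_size=(800, 800)):
    """Warp each undistorted view onto the ground plane and overlay on a common canvas."""
    canvas = np.zeros((canvas_size[1], canvas_size[0], 3), dtype=np.uint8)
    for img, K, D, H in zip(fisheye_imgs, Ks, Ds, Hs):
        und = undistort_fisheye(img, K, D)
        warped = cv2.warpPerspective(und, H, canvas_size)
        mask = warped.any(axis=2)        # pixels actually covered by this camera
        canvas[mask] = warped[mask]      # simple overwrite; overlap handling comes later
    return canvas
```

The plain overwrite in the overlap regions is only a placeholder; the overlap alignment and optimization are addressed in the later steps of the description.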
S103, carrying out space alignment processing on the same space object in the plurality of fisheye images, and obtaining a stereoscopic looking-around image corresponding to the plane looking-around image.
When constructing the stereoscopic looking-around image, the objects on the ground in the planar looking-around image are already substantially aligned in the top-down view (the splicing is complete), but objects in space still show alignment deviations; therefore, spatial alignment processing needs to be performed on the spatial objects that have such deviations.
In one possible implementation manner, the spatial alignment process may be performed on the same spatial object in the plurality of fisheye images to obtain a first stereoscopic looking-around image, then the color mixing process is performed on the first stereoscopic looking-around image to obtain a second stereoscopic looking-around image, and then the stereoscopic looking-around image corresponding to the planar looking-around image is obtained according to the second stereoscopic looking-around image and the pre-established three-dimensional model, where the stereoscopic looking-around image is the projection of the second stereoscopic looking-around image on the pre-established three-dimensional model.
Specifically, the spatial alignment processing may calculate an appearance function according to the geometric shape, appearance color and the like of the same object in different fisheye images. The function may be a custom function reflecting the rate of change between the different fisheye images, such as the rate of change of color shade. Taking color as an example, if the color-shade change rate of the same object is larger than a preset threshold, the dark area can be stretched or the light area can be reduced, so that the same object is displayed with a natural color.
After the spatial alignment processing, the first stereoscopic looking-around image is obtained. Because the incident light received by each fisheye camera is different, the brightness of the different fisheye images may differ considerably, so the different fisheye images need to be color-mixed to look visually natural. Specifically, the unit brightness values of the same object in the different fisheye images can be calculated and averaged, and the object in the different fisheye images is then recolored accordingly, so as to obtain the second stereoscopic looking-around image.
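The brightness-equalization idea can be illustrated with a short sketch. As a hedged simplification of the color mixing described above, it measures the mean luminance that two adjacent views report for a shared region and rescales both toward the common average; equalizing over the shared overlap region rather than per object, together with the mask and names used, are assumptions for illustration only.

```python
# Minimal sketch: equalize brightness of two adjacent camera views over their overlap.
import cv2
import numpy as np

def equalize_pair(img_a, img_b, overlap_mask):
    """Scale both images so the shared overlap region has the same mean luminance."""
    gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)
    mean_a = gray_a[overlap_mask].mean()
    mean_b = gray_b[overlap_mask].mean()
    target = 0.5 * (mean_a + mean_b)            # common average luminance
    gain_a = target / max(mean_a, 1e-6)
    gain_b = target / max(mean_b, 1e-6)
    out_a = np.clip(img_a.astype(np.float32) * gain_a, 0, 255).astype(np.uint8)
    out_b = np.clip(img_b.astype(np.float32) * gain_b, 0, 255).astype(np.uint8)
    return out_a, out_b
```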
A three-dimensional model is pre-established; it is used to filter out objects that do not need to be displayed to the user, and the points on the three-dimensional model have a preset correspondence with the points of the stereoscopic image.
Alternatively, the three-dimensional model may be a bowl-shaped mesh three-dimensional model; the shape of the three-dimensional model is not limited in this scheme.
The obtained second stereoscopic looking-around image can be projected onto the three-dimensional model to obtain a stereoscopic looking-around image corresponding to the planar looking-around image.
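The projection step can be pictured with a small sketch of a bowl-shaped mesh whose flat floor carries the ground and whose wall rises with radius; each vertex samples the stitched image at its ground-plane footprint. The bowl parameters, the quadratic wall profile, and the pixels-per-metre scale below are illustrative assumptions, not values from the patent.

```python
# Minimal sketch: bowl-shaped mesh and its texture lookup into the stitched image.
import numpy as np

def build_bowl(radius=8.0, flat_radius=4.0, rings=64, sectors=128, wall_k=0.5):
    """Return bowl vertices (N, 3) in vehicle coordinates: flat floor, quadratic wall."""
    r = np.linspace(0.0, radius, rings)
    theta = np.linspace(0.0, 2.0 * np.pi, sectors, endpoint=False)
    rr, tt = np.meshgrid(r, theta)
    z = np.where(rr <= flat_radius, 0.0, wall_k * (rr - flat_radius) ** 2)
    return np.stack([rr * np.cos(tt), rr * np.sin(tt), z], axis=-1).reshape(-1, 3)

def texture_coords(vertices, px_per_m=50.0, canvas_size=(800, 800)):
    """Map each vertex to a pixel of the stitched image via its (x, y) footprint."""
    cx, cy = canvas_size[0] / 2.0, canvas_size[1] / 2.0
    u = cx + vertices[:, 0] * px_per_m
    v = cy - vertices[:, 1] * px_per_m
    return np.stack([u, v], axis=-1)
```

In this sketch the vertex-to-pixel table plays the role of the preset correspondence between points of the three-dimensional model and points of the stereoscopic image mentioned above.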
According to the method for synthesizing the looking-around images of the fisheye camera, the vehicle-mounted equipment acquires a plurality of fisheye images acquired by a plurality of fisheye cameras arranged on a vehicle, the fisheye images are spliced according to the corresponding relation among the fisheye cameras acquired in advance, the plane looking-around images corresponding to the fisheye images are acquired, the corresponding relation is used for representing the position relation among the fisheye images acquired by the fisheye cameras, the spatial alignment processing is carried out on the same spatial objects in the fisheye images, and the stereoscopic looking-around images corresponding to the plane looking-around images are acquired. By the method, the looking-around image of the real scenes around the vehicle can be obtained, the user experience is improved, the blind area of the vehicle during running is eliminated, and the running safety is improved.
On the basis of the embodiment of fig. 1, fig. 4 is a second flow chart of a method for synthesizing a looking-around image based on a fisheye camera according to the embodiment of the present application, as shown in fig. 4, and the method further includes the following steps:
s201, acquiring a plurality of calibration plate images acquired by a plurality of fisheye cameras.
The calibration plate can be arranged on the ground around the vehicle, and the fisheye camera can be arranged around the vehicle and used for collecting calibration plate images, wherein the fisheye images comprise the calibration plate images.
Optionally, the number of the calibration plates is 4, and the calibration plates are respectively arranged around the vehicle.
In this embodiment, the type of the calibration plate is not limited; the calibration cloth in fig. 5 may consist of two nested squares (a black outer ring and a white inner ring) or of a single square with alternating black and white.
S202, according to the plurality of calibration plate images, obtaining the corresponding relation of the plurality of fisheye cameras.
According to the plurality of calibration plate images acquired by the fisheye cameras, the positions of the same calibration plate in different calibration plate images can be determined, and the position relation of the plurality of calibration plate images can be determined, so that the corresponding relation of the plurality of fisheye cameras is obtained.
In one possible implementation manner, feature points of a plurality of calibration plate images are obtained according to a feature point detection algorithm, position coordinates of the feature points are obtained according to the feature points of the plurality of calibration plate images, and corresponding relations of a plurality of fisheye cameras are obtained according to the position coordinates of the feature points.
Specifically, this step is explained with reference to fig. 5, which is a schematic diagram of a calibration plate provided in an embodiment of the present application. As shown in fig. 5, the calibration plates are disposed at the front left, rear left, front right and rear right of the vehicle, the vehicle is located at the center, and the fisheye cameras are respectively disposed on the front, rear, left and right sides of the vehicle, so that each fisheye camera can capture at least two calibration plates.
The feature points in fig. 5 may be the vertices of the black squares and the white squares, eight in number.
Then, the upper-left corner A of the front-left calibration plate is initially set as the origin of coordinates, and the position coordinates of each calibration plate relative to this origin are measured in advance.
Further, feature points in different fisheye images are paired according to their determined position coordinates. Taking two calibration plate images as an example, the matching point N, in calibration plate image 1, of feature point M in calibration plate image 2 is found; because the two fisheye images have an overlapping area (see the specific process of fig. 2), the position relationship between calibration plate image 1 and calibration plate image 2 can be obtained by combining the position coordinates of the feature points M and N. Similarly, matching with the feature points of calibration plate image 3 and calibration plate image 4 can be completed in sequence according to this method, so that the correspondence of the four fisheye cameras is obtained.
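A minimal sketch of this calibration step is given below. It detects board corners in an already-undistorted camera image, pairs them with their measured ground coordinates relative to origin A, and estimates a per-camera homography; a chessboard-corner detector stands in for the unspecified feature-point detection algorithm, and the pattern size, world coordinates and names are illustrative assumptions.

```python
# Minimal sketch: per-camera image-to-ground homography from a calibration plate.
import cv2
import numpy as np

def board_homography(undistorted_img, pattern_size, world_pts_xy):
    """Estimate the image-to-ground-plane homography from one calibration plate."""
    gray = cv2.cvtColor(undistorted_img, cv2.COLOR_BGR2GRAY)
    ok, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not ok:
        raise RuntimeError("calibration plate not found")
    img_pts = corners.reshape(-1, 2).astype(np.float32)
    H, _ = cv2.findHomography(img_pts, world_pts_xy.astype(np.float32), cv2.RANSAC)
    return H

# The correspondence between two cameras that see the same plate is then the
# composition of their homographies: points from camera 2 map to camera 1 via
#   H_12 = inv(H_1) @ H_2   (both H expressed w.r.t. the same ground coordinates).
```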
According to the method for synthesizing the looking-around images of the fisheye cameras, the vehicle-mounted equipment acquires the plurality of calibration plate images acquired by the plurality of fisheye cameras, the fisheye images comprise the calibration plate images, and the corresponding relations of the plurality of fisheye cameras are acquired according to the plurality of calibration plate images, so that the corresponding relations of the plurality of fisheye cameras arranged on the vehicle are obtained in advance, and in the actual running process of the vehicle, the looking-around images of the real scenes around the vehicle can be obtained according to the corresponding relations obtained in advance and the fisheye images acquired in real time, so that the user experience is improved.
On the basis of the embodiments of fig. 1 and fig. 4, a specific implementation process of the present solution is described with reference to fig. 6. Fig. 6 is a third schematic flow chart of a method for synthesizing a looking-around image based on a fisheye camera according to an embodiment of the present application; as shown in fig. 6, the method specifically includes the following steps:
s301, acquiring a plurality of fisheye images acquired by a plurality of fisheye cameras arranged on a vehicle.
S302, splicing the fisheye images according to the corresponding relation among the fisheye cameras acquired in advance, and acquiring the plane looking-around images corresponding to the fisheye images.
S303, optimizing the plane looking-around image, and obtaining the optimized plane looking-around image.
Because the position coordinates of each calibration plate relative to the origin of coordinates are measured in advance in step S202, and because these measured coordinates contain certain errors for various reasons, the obtained correspondence between the fisheye cameras also contains certain errors. As a result, the same object in different fisheye images may not be aligned in the obtained planar looking-around image, so the spliced image is imperfect and needs to be optimized.
Specifically, referring to fig. 7, which is a schematic diagram of spliced fisheye images provided in an embodiment of the present application, take the splicing of two fisheye images as an example: fisheye image 1 and fisheye image 2 contain an overlapping area, but after they are preliminarily spliced according to the above method they do not overlap completely, and the residual error is area A. The overlapping regions are then stretched toward each other until fisheye image 1 and fisheye image 2 overlap completely, thereby eliminating error area A.
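The patent eliminates error area A by stretching the two overlapping regions toward each other. As a simpler stand-in (not the patent's method), the sketch below feathers the overlap with distance-based weights so that residual misalignment fades out instead of producing a hard seam; the masks and names are illustrative assumptions.

```python
# Minimal sketch: distance-weighted feathering of two warped views in their overlap.
import cv2
import numpy as np

def feather_blend(img1, img2, mask1, mask2):
    """Blend two warped views; weights grow with distance from each view's border."""
    w1 = cv2.distanceTransform(mask1.astype(np.uint8), cv2.DIST_L2, 5).astype(np.float32)
    w2 = cv2.distanceTransform(mask2.astype(np.uint8), cv2.DIST_L2, 5).astype(np.float32)
    total = w1 + w2
    total[total == 0] = 1.0                     # avoid division by zero outside coverage
    w1, w2 = (w1 / total)[..., None], (w2 / total)[..., None]
    blended = img1.astype(np.float32) * w1 + img2.astype(np.float32) * w2
    return blended.astype(np.uint8)
```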
S304, carrying out space alignment processing on the same space object in the plurality of fisheye images to obtain a stereoscopic looking-around image corresponding to the plane looking-around image.
The implementation process of steps S301 to S302 is similar to that of steps S101 to S102, and the implementation process of step S304 is similar to that of step S103; details are not repeated here.
And S305, displaying the stereoscopic looking-around image corresponding to the planar looking-around image.
Fig. 8 is a schematic diagram of a stereoscopic looking-around image provided by the embodiment of the present application. As shown in fig. 8, a stereoscopic looking-around image of the real scene around the vehicle is presented to the user, and the black area in the middle is the vehicle. Specifically, the obtained stereoscopic looking-around image can be displayed on a display screen of the vehicle-mounted device and rotated to any viewing angle according to the user's requirement, so that the user can perceive the live-action information around the vehicle more intuitively and realistically.
According to the method for synthesizing the looking-around images of the fisheye camera, the vehicle-mounted equipment acquires a plurality of fisheye images acquired by the fisheye cameras arranged on the vehicle, the fisheye images are spliced according to the corresponding relation among the fisheye cameras acquired in advance, the plane looking-around images corresponding to the fisheye images are acquired, the corresponding relation is used for representing the position relation among the fisheye images acquired by the fisheye cameras, the plane looking-around images are optimized, the plane looking-around images after optimization are acquired, the space alignment processing is carried out on the same space objects in the fisheye images, and the three-dimensional looking-around images corresponding to the plane looking-around images are acquired. By the method, the looking-around image of the real scenes around the vehicle can be obtained, the user experience is improved, the blind area of the vehicle during running is eliminated, and the running safety is improved.
Fig. 9 is a schematic structural diagram of a fisheye-camera-based looking-around image synthesis device according to an embodiment of the present application, where the fisheye-camera-based looking-around image synthesis device may be an independent vehicle-mounted device, or may be a device integrated in a vehicle-mounted device, and the device may be implemented by software, hardware, or a combination of software and hardware. As shown in fig. 9, the fisheye camera-based looking-around image synthesis device 10 includes: an acquisition module 11, a processing module 12 and a display module 13;
an acquisition module 11 for acquiring a plurality of fisheye images acquired by a plurality of fisheye cameras provided on a vehicle;
the processing module 12 is configured to splice the plurality of fisheye images according to a pre-acquired correspondence between the plurality of fisheye cameras, and acquire a planar looking-around image corresponding to the plurality of fisheye images, where the correspondence is used to represent a positional relationship between the fisheye images acquired by the plurality of fisheye cameras;
and carrying out space alignment processing on the same space object in the plurality of fisheye images to obtain a stereoscopic looking-around image corresponding to the plane looking-around image.
Optionally, the processing module 12 is further configured to optimize the planar looking-around image, and obtain an optimized planar looking-around image.
Optionally, the acquiring module 11 is further configured to acquire a plurality of calibration plate images acquired by the plurality of fisheye cameras, where the fisheye images include the calibration plate images;
and acquiring the corresponding relation of the fisheye cameras according to the calibration plate images.
Optionally, the obtaining module 11 is further configured to obtain feature points of the plurality of calibration plate images according to a feature point detection algorithm;
acquiring position coordinates of the feature points according to the feature points of the plurality of calibration plate images;
and acquiring the corresponding relation of the plurality of fisheye cameras according to the position coordinates of the feature points.
Optionally, the processing module 12 is further configured to perform spatial alignment processing on the same spatial object in the plurality of fisheye images, so as to obtain a first stereoscopic looking-around image;
performing color mixing processing on the first stereoscopic looking-around image to obtain a second stereoscopic looking-around image;
the obtaining module 11 is further configured to obtain a stereoscopic looking-around image corresponding to the planar looking-around image according to the second stereoscopic looking-around image and a pre-established three-dimensional model, where the stereoscopic looking-around image is a projection of the second stereoscopic looking-around image on the pre-established three-dimensional model.
The display module 13 is used for displaying the stereoscopic looking-around image corresponding to the planar looking-around image.
Optionally, the number of the fisheye cameras is four.
The device for synthesizing the looking-around image of the fisheye camera provided by the embodiment of the application can execute the method embodiment, and the implementation principle and the technical effect are similar, and are not repeated here.
Fig. 10 is a schematic hardware structure of an in-vehicle apparatus according to an embodiment of the present application, as shown in fig. 10, an in-vehicle apparatus 80 according to the present embodiment includes: a processor 801 and a memory 802; wherein:
a memory 802 for storing computer-executable instructions;
a processor 801 for executing the computer-executable instructions stored in the memory to implement the steps performed by the vehicle-mounted device in the above-described embodiments. Reference may be made in particular to the relevant description of the above method embodiments.
Alternatively, the memory 802 may be separate or integrated with the processor 801.
When the memory 802 is provided separately, the in-vehicle apparatus further comprises a bus 803 for connecting the memory 802 and the processor 801.
The embodiment of the application also provides a computer readable storage medium, wherein the computer readable storage medium stores a computer program, and the computer program enables a server to execute the fisheye camera-based looking-around image synthesis method provided by any one of the previous embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in the embodiments of the present application may be integrated in one processing unit, or each module may exist alone physically, or two or more modules may be integrated in one unit. The units formed by the modules can be realized in a form of hardware or a form of hardware and software functional units.
The integrated modules, which are implemented in the form of software functional modules, may be stored in a computer readable storage medium. The software functional module is stored in a storage medium, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor (english: processor) to perform some of the steps of the methods according to the embodiments of the application.
It should be understood that the above processor may be a central processing unit (english: central Processing Unit, abbreviated as CPU), or may be other general purpose processors, digital signal processors (english: digital Signal Processor, abbreviated as DSP), application specific integrated circuits (english: application Specific Integrated Circuit, abbreviated as ASIC), or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of a method disclosed in connection with the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in a processor for execution.
The memory may comprise a high-speed RAM memory, and may further comprise a non-volatile memory NVM, such as at least one magnetic disk memory, and may also be a U-disk, a removable hard disk, a read-only memory, a magnetic disk or optical disk, etc.
The bus may be an industry standard architecture (Industry Standard Architecture, ISA) bus, a peripheral component interconnect (Peripheral Component Interconnect, PCI) bus, or an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, among others. Buses may be divided into address buses, data buses, control buses, and so on. For ease of illustration, the buses in the drawings of the present application are not limited to only one bus or one type of bus.
The storage medium may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application specific integrated circuit (Application Specific Integrated Circuits, ASIC for short). Alternatively, the processor and the storage medium may reside as discrete components in an electronic device or a master device.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the method embodiments described above may be performed by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs steps including the method embodiments described above; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.

Claims (9)

1. A method for synthesizing a looking-around image based on a fisheye camera, characterized by comprising the following steps:
acquiring a plurality of fisheye images acquired by a plurality of fisheye cameras arranged on a vehicle;
splicing the fisheye images according to the pre-acquired corresponding relation among the fisheye cameras to acquire plane looking-around images corresponding to the fisheye images, wherein the corresponding relation is used for representing the position relation among the fisheye images acquired by the fisheye cameras;
carrying out spatial alignment processing on the same spatial object in the plurality of fisheye images to obtain a stereoscopic looking-around image corresponding to the plane looking-around image;
the step of performing spatial alignment processing on the same spatial object in the plurality of fisheye images to obtain a stereoscopic looking-around image corresponding to the planar looking-around image includes:
carrying out space alignment processing on the same space object in the plurality of fish-eye images to obtain a first stereoscopic looking-around image;
performing color mixing processing on the first stereoscopic looking-around image to obtain a second stereoscopic looking-around image;
and acquiring a stereoscopic looking-around image corresponding to the plane looking-around image according to the second stereoscopic looking-around image and a pre-established three-dimensional model, wherein the stereoscopic looking-around image is the projection of the second stereoscopic looking-around image on the pre-established three-dimensional model.
2. The method according to claim 1, wherein the method further comprises:
and optimizing the plane looking-around image to obtain the optimized plane looking-around image.
3. The method according to claim 1, wherein the method further comprises:
acquiring a plurality of calibration plate images acquired by the plurality of fisheye cameras, wherein the fisheye images comprise the calibration plate images;
and acquiring the corresponding relation of the fisheye cameras according to the calibration plate images.
4. A method according to claim 3, wherein the acquiring the correspondence of the plurality of fisheye cameras from the plurality of calibration plate images comprises:
acquiring characteristic points of the plurality of calibration plate images according to a characteristic point detection algorithm;
acquiring position coordinates of the feature points according to the feature points of the plurality of calibration plate images;
and acquiring the corresponding relation of the plurality of fisheye cameras according to the position coordinates of the feature points.
5. The method according to claim 1, wherein the method further comprises:
and displaying the stereoscopic looking-around image corresponding to the planar looking-around image.
6. The method of any one of claims 1 to 5, wherein the number of fisheye cameras is four.
7. A looking-around image synthesis device based on a fisheye camera, characterized by comprising:
an acquisition module for acquiring a plurality of fisheye images acquired by a plurality of fisheye cameras provided on a vehicle;
the processing module is used for splicing the plurality of fisheye images according to the pre-acquired corresponding relation among the plurality of fisheye cameras, so as to acquire a plane looking-around image corresponding to the plurality of fisheye images, wherein the corresponding relation is used for representing the position relation among the fisheye images acquired by the plurality of fisheye cameras;
carrying out spatial alignment processing on the same spatial object in the plurality of fisheye images to obtain a stereoscopic looking-around image corresponding to the plane looking-around image;
the processing module is further used for performing space alignment processing on the same space object in the plurality of fisheye images to obtain a first stereoscopic looking-around image;
performing color mixing processing on the first stereoscopic looking-around image to obtain a second stereoscopic looking-around image;
the acquisition module is further configured to acquire a stereoscopic looking-around image corresponding to the planar looking-around image according to the second stereoscopic looking-around image and a pre-established three-dimensional model, where the stereoscopic looking-around image is a projection of the second stereoscopic looking-around image on the pre-established three-dimensional model.
8. An in-vehicle apparatus, characterized by comprising:
a processor;
a memory; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor, the computer program comprising instructions for performing the method of any of claims 1-6.
9. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program, which causes a server to perform the method of any one of claims 1-6.
CN201811598527.3A 2018-12-26 2018-12-26 Looking-around image synthesis method and device based on fisheye camera Active CN109754363B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811598527.3A CN109754363B (en) 2018-12-26 2018-12-26 Looking-around image synthesis method and device based on fisheye camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811598527.3A CN109754363B (en) 2018-12-26 2018-12-26 Looking-around image synthesis method and device based on fisheye camera

Publications (2)

Publication Number Publication Date
CN109754363A CN109754363A (en) 2019-05-14
CN109754363B 2023-08-15

Family

ID=66404056

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811598527.3A Active CN109754363B (en) 2018-12-26 2018-12-26 Looking-around image synthesis method and device based on fisheye camera

Country Status (1)

Country Link
CN (1) CN109754363B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110466533A (en) * 2019-07-25 2019-11-19 东软睿驰汽车技术(沈阳)有限公司 A kind of control method for vehicle, apparatus and system
CN110580724B (en) * 2019-08-28 2022-02-25 贝壳技术有限公司 Method and device for calibrating binocular camera set and storage medium
CN110910311B (en) * 2019-10-30 2023-09-26 同济大学 Automatic splicing method of multi-path looking-around camera based on two-dimension code
CN113066158B (en) * 2019-12-16 2023-03-10 杭州海康威视数字技术股份有限公司 Vehicle-mounted all-round looking method and device
CN111754390A (en) * 2020-05-11 2020-10-09 上海欧菲智能车联科技有限公司 Ring view display method and device, computer equipment and storage medium
CN113132708B (en) * 2021-04-22 2022-02-22 贝壳找房(北京)科技有限公司 Method and apparatus for acquiring three-dimensional scene image using fisheye camera, device and medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104330074A (en) * 2014-11-03 2015-02-04 熊友谊 Intelligent surveying and mapping platform and realizing method thereof
CN106023070A (en) * 2016-06-14 2016-10-12 北京岚锋创视网络科技有限公司 Real-time panoramic splicing method and device
WO2018035845A1 (en) * 2016-08-26 2018-03-01 深圳市赛亿科技开发有限公司 Method for eliminating photographic blind spot based on 720 degree panoramic video camera
CN106534670A (en) * 2016-10-25 2017-03-22 成都通甲优博科技有限责任公司 Panoramic video generating method based on fixedly connected fisheye lens camera unit
CN106875339A (en) * 2017-02-22 2017-06-20 长沙全度影像科技有限公司 A kind of fish eye images joining method based on strip scaling board
CN107256535A (en) * 2017-06-06 2017-10-17 斑马信息科技有限公司 The display methods and device of panoramic looking-around image
CN107492125A (en) * 2017-07-28 2017-12-19 哈尔滨工业大学深圳研究生院 The processing method of automobile fish eye lens panoramic view picture
CN108257183A (en) * 2017-12-20 2018-07-06 歌尔科技有限公司 A kind of camera lens axis calibrating method and device
CN108629828A (en) * 2018-04-03 2018-10-09 中德(珠海)人工智能研究院有限公司 Scene rendering transition method in the moving process of three-dimensional large scene
CN108846796A (en) * 2018-06-22 2018-11-20 北京航空航天大学青岛研究院 Image split-joint method and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fisheye image stitching algorithm based on vehicle-mounted multi-view; Zhou Fang et al.; Data Communication; 31 May 2017 (No. 5); pp. 29-34 *

Also Published As

Publication number Publication date
CN109754363A (en) 2019-05-14

Similar Documents

Publication Publication Date Title
CN109754363B (en) Looking-around image synthesis method and device based on fisheye camera
US11303806B2 (en) Three dimensional rendering for surround view using predetermined viewpoint lookup tables
US9866752B2 (en) Systems and methods for producing a combined view from fisheye cameras
CN107472135B (en) Image generation device, image generation method, and recording medium
CN110456967B (en) Information processing method, information processing apparatus, and program
EP3195584B1 (en) Object visualization in bowl-shaped imaging systems
KR101389884B1 (en) Dynamic image processing method and system for processing vehicular image
CN109040596B (en) Method for adjusting camera, mobile terminal and storage medium
US20160048992A1 (en) Information processing method, information processing device, and program
CN112224132A (en) Vehicle panoramic all-around obstacle early warning method
CN111757057A (en) Panoramic all-around display method, device, equipment and storage medium
CN111815752B (en) Image processing method and device and electronic equipment
CN110400255B (en) Vehicle panoramic image generation method and system and vehicle
US20180376130A1 (en) Image processing apparatus, image processing method, and image processing system
CN115880142A (en) Image generation method and device of trailer, storage medium and terminal
CN115734086A (en) Image processing method, device and system based on off-screen shooting and storage medium
CN113542463A (en) Video shooting device and method based on folding screen, storage medium and mobile terminal
CN117173014B (en) Method and device for synthesizing 3D target in BEV image
JP2007180719A (en) Vehicle drive support apparatus
CN116051379A (en) AVM image fusion method and device, vehicle and readable storage medium
CN116416321A (en) Vehicle-mounted panoramic looking-around automatic calibration method, system, equipment and storage medium
CN218839318U (en) 360-degree panoramic multi-interface visual system of loader
CN117853326A (en) Image stitching method and device of vehicle-mounted looking-around system, electronic equipment and vehicle
CN117830492A (en) Vehicle model rendering method and device, electronic equipment and storage medium
JP2004289733A (en) Vehicle periphery-monitoring apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant