CN112837207A - Panoramic depth measuring method, four-eye fisheye camera and binocular fisheye camera


Info

Publication number
CN112837207A
CN112837207A
Authority
CN
China
Prior art keywords
fisheye
depth map
camera
eye
binocular
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911164667.4A
Other languages
Chinese (zh)
Other versions
CN112837207B (en)
Inventor
谢亮
姜文杰
刘靖康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Insta360 Innovation Technology Co Ltd
Original Assignee
Insta360 Innovation Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Insta360 Innovation Technology Co Ltd filed Critical Insta360 Innovation Technology Co Ltd
Priority to CN201911164667.4A, priority patent CN112837207B
Priority claimed from CN201911164667.4A, external-priority patent CN112837207B
Priority to PCT/CN2020/131506, patent WO2021104308A1
Publication of CN112837207A
Application granted
Publication of CN112837207B
Legal status: Active

Classifications

    • G06T3/047 Fisheye or wide-angle transformations (under G06T3/00 Geometric image transformations in the plane of the image; G06T3/04 Context-preserving transformations, e.g. by using an importance map)
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration (under G06T7/00 Image analysis)
    • G06T7/593 Depth or shape recovery from multiple images, from stereo images (under G06T7/50 Depth or shape recovery; G06T7/55 from multiple images)
    (GPHYSICS; G06 Computing, calculating or counting; G06T Image data processing or generation, in general)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention provides a panoramic depth measuring method, a four-eye fisheye camera and a binocular fisheye camera. The method comprises the following steps: acquiring a fisheye image shot by a fisheye lens; performing stereo matching on the fisheye image and calculating a depth map of an overlapping area; and obtaining a panoramic depth map according to the depth map. The invention aims to perform stereo matching on images acquired by a plurality of fisheye lenses on a panoramic camera to form a panoramic depth image and measure the depth of a target object, providing in real time a depth map, that is, the bearing of the target object, for the motion of the panoramic camera or of its carrier, such as an unmanned aerial vehicle, so as to avoid obstacles.

Description

Panoramic depth measuring method, four-eye fisheye camera and binocular fisheye camera
Technical Field
The invention belongs to the field of panoramic images, and particularly relates to a panoramic depth measuring method, a four-eye fisheye camera and a binocular fisheye camera.
Background
Most obstacle avoidance techniques adopted in the prior art for real-time obstacle avoidance rely on a large number of sensors such as ultrasonic rangefinders and laser radars. These methods have drawbacks: the detection distance may be too short to avoid obstacles in time, or the equipment is too large in size and weight to be simply assembled.
Panoramic cameras generally use fisheye lenses to capture a 360-degree view and achieve a panoramic effect. The maximum viewing angle of a fisheye image shot by a fisheye lens can reach 180 degrees or even 270 degrees, so determining the direction of a target in the real environment from a picture shot by a fisheye lens has become an important application of the panoramic camera.
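By way of illustration (not part of the patent text), the sketch below maps a fisheye pixel to a 3D viewing ray under the equidistant projection model r = focal * theta, one common model for lenses whose field of view reaches or exceeds 180 degrees; the function name and parameter values are hypothetical, and a real lens requires calibrated intrinsics and a distortion model.

```python
import numpy as np

def fisheye_pixel_to_ray(u, v, cx, cy, focal):
    """Map a fisheye pixel (u, v) to a unit 3D viewing ray under the
    equidistant projection model r = focal * theta, where (cx, cy) is
    the image centre and focal is in pixels."""
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)      # radial distance from the image centre
    theta = r / focal         # angle from the optical axis
    phi = np.arctan2(dy, dx)  # azimuth around the optical axis
    return np.array([
        np.sin(theta) * np.cos(phi),
        np.sin(theta) * np.sin(phi),
        np.cos(theta),
    ])
```

At the image centre the ray points along the optical axis; at a radius of focal * pi / 2 the ray is 90 degrees off-axis, which is how a single fisheye image can cover a hemisphere or more.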
Disclosure of Invention
The invention provides a panoramic depth measuring method, a four-eye fisheye camera and a binocular fisheye camera, and aims to perform stereo matching on images acquired by a plurality of fisheye lenses on the panoramic camera to form a panoramic depth image and measure the depth of a target object.
The invention provides a panoramic depth measuring method using a plurality of fisheye lenses, which can calculate the 3D coordinates of a scene and the direction of an object, and can provide a depth map of a target object in real time for the motion of the panoramic camera or of its carrier, such as an unmanned aerial vehicle, thereby achieving obstacle avoidance.
The invention provides a panoramic depth measuring method, which is suitable for a four-eye fisheye camera and comprises the following steps: acquiring a fisheye image shot by a fisheye lens; carrying out stereo matching on the fisheye image, and calculating a depth map of an overlapping area; and obtaining a panoramic depth map according to the depth map.
Further, in the above method, the four-eye fisheye camera sets two fisheye lenses on parallel surfaces, respectively, and performs stereo matching on the fisheye images and calculates a depth map of an overlapping region, and the method further includes: performing stereo matching on the fisheye images shot by the fisheye lenses on different surfaces of the four-eye fisheye camera, and calculating a first depth map of a first overlapping area; the fisheye images shot by the two fisheye lenses on the same surface of the four-eye fisheye camera are subjected to stereo matching respectively, and a second depth map of a second overlapping area and a third depth map of a third overlapping area are calculated respectively; the obtaining of the panoramic depth map according to the depth map further comprises: and merging the first depth map, the second depth map and the third depth map to obtain a panoramic depth map.
Further, in the above method, the stereo matching of the fisheye images captured by the fisheye lenses on different surfaces of the four-eye fisheye camera and calculating the first depth map of the first overlapping area further includes: and respectively carrying out stereo matching on the fisheye images shot by the two fisheye lenses positioned at the same end on different surfaces of the four-eye fisheye camera, and calculating a first depth map of a first overlapping area.
Further, in the above method, the acquiring the fisheye image captured by the fisheye lens includes acquiring a current picture or video frame captured by each fisheye lens.
Further, in the above method, the stereo matching includes finding a matching corresponding point from different fisheye images.
Further, in the above method, the four-eye fisheye camera is the body of an unmanned aerial vehicle or an external device thereof.
Further, in the above method, the overlapping area includes a 360-degree panoramic area.
Further, the method further comprises the following steps: determining an obstacle from the depth map.
A second aspect of the present invention provides a four-eye fisheye camera, comprising: the image acquisition module is used for acquiring a fisheye image shot by the fisheye lens; the stereo matching module is used for carrying out stereo matching on the fisheye image and calculating a depth map of an overlapping area; and the panoramic synthesis module is used for obtaining a panoramic depth map according to the depth map.
Further, in the above four-eye fisheye camera, two fisheye lenses are respectively disposed on parallel surfaces, and the stereo matching is performed on the fisheye images, and a depth map of an overlapping region is calculated, further including: performing stereo matching on the fisheye images shot by the fisheye lenses on different surfaces of the four-eye fisheye camera, and calculating a first depth map of a first overlapping area; the fisheye images shot by the two fisheye lenses on the same surface of the four-eye fisheye camera are subjected to stereo matching respectively, and a second depth map of a second overlapping area and a third depth map of a third overlapping area are calculated respectively; the obtaining of the panoramic depth map according to the depth map further comprises: and merging the first depth map, the second depth map and the third depth map to obtain a panoramic depth map.
Further, in the above four-eye fisheye camera, the stereo matching of the fisheye images captured by the fisheye lenses on different surfaces of the four-eye fisheye camera and calculating the first depth map of the first overlapping area further includes: and respectively carrying out stereo matching on the fisheye images shot by the two fisheye lenses positioned at the same end on different surfaces of the four-eye fisheye camera, and calculating a first depth map of a first overlapping area.
Further, in the above four-eye fisheye camera, the acquiring of the fisheye image shot by the fisheye lens includes acquiring a current picture or video frame shot by each fisheye lens.
Further, in the above four-eye fisheye camera, the stereo matching includes finding matched corresponding points from different fisheye images.
Further, in the above four-eye fisheye camera, the four-eye fisheye camera is the body of an unmanned aerial vehicle or an external device thereof.
Further, in the above four-eye fisheye camera, the overlapping area includes a 360-degree panoramic area.
Further, in the above four-eye fisheye camera, the method further includes: and the obstacle detection module is used for determining obstacles from the depth map.
The third aspect of the invention provides a panoramic depth measuring method, which is suitable for a binocular fisheye camera and comprises the following steps: obtaining fisheye images shot by the fisheye lens when the binocular fisheye camera is at different positions; carrying out stereo matching on the fisheye image, and calculating a depth map of an overlapping area; and obtaining a panoramic depth map according to the depth map.
Further, in the above method, the acquiring of the fisheye images shot by the fisheye lens when the binocular fisheye camera is at different positions further includes: acquiring a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a first position, and acquiring a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a second position; and calculating the depth map according to the displacement of the binocular fisheye camera from the first position to the second position. The overlapping region includes a 360-degree panoramic area.
Further, in the above method, the binocular fisheye camera is the body of an unmanned aerial vehicle or an external device thereof.
Further, the method further comprises the following steps: determining an obstacle from the depth map.
A fourth aspect of the present invention provides a binocular fisheye camera comprising: the image module is used for acquiring fisheye images shot by the fisheye lens when the binocular fisheye camera is at different positions; the computing module is used for carrying out stereo matching on the fisheye image and computing a depth map of an overlapping area; and the depth module is used for obtaining a panoramic depth map according to the depth map.
Further, in the above binocular fisheye camera, the acquiring of the fisheye images shot by the fisheye lens when the binocular fisheye camera is at different positions further includes: acquiring a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a first position, and acquiring a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a second position; and calculating the depth map according to the displacement of the binocular fisheye camera from the first position to the second position. The overlapping region includes a 360-degree panoramic area.
Further, in the above binocular fisheye camera, the binocular fisheye camera is the body of an unmanned aerial vehicle or an external device thereof.
Further, in the above-mentioned binocular fisheye camera, still include: and the obstacle avoidance module is used for determining obstacles from the depth map.
According to the invention, the images shot by the upper and lower and/or left and right fisheye lenses of the panoramic camera are subjected to stereo matching, and the depth of the object is calculated according to the matched characteristic points.
Drawings
Fig. 1 is a flowchart of a panoramic depth measuring method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a four-eye fisheye camera according to an embodiment of the invention;
fig. 3 is a schematic diagram of a four-eye fisheye camera according to another embodiment of the invention;
fig. 4 is a schematic diagram of a binocular fisheye camera according to an embodiment of the invention;
fig. 5 is a schematic diagram of a motion state of a binocular fisheye camera according to an embodiment of the invention;
fig. 6 is a schematic view of a binocular fisheye camera according to another embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more clearly apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Referring to fig. 1, the embodiment of the invention discloses a panoramic depth measuring method, which is suitable for a four-eye fisheye camera and is characterized by comprising the following steps:
s101, obtaining a fisheye image shot by a fisheye lens;
s102, performing stereo matching on the fisheye image, and calculating a depth map of an overlapping area;
and S103, obtaining a panoramic depth map according to the depth map.
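The three steps S101-S103 can be sketched as a toy pipeline. Every helper below is a hypothetical stand-in, not the patent's method: random arrays replace real fisheye captures, and a single global-shift search replaces true per-pixel stereo matching; only the shape of the pipeline is meaningful.

```python
import numpy as np

def acquire_fisheye_images(n_lenses=4, h=8, w=8, seed=0):
    # S101: grab one frame per fisheye lens. Simulated here as shifted
    # copies of one random "scene" so that the views overlap.
    rng = np.random.default_rng(seed)
    base = rng.random((h, w))
    return [np.roll(base, shift=i, axis=1) for i in range(n_lenses)]

def depth_of_overlap(img_a, img_b, baseline=0.1, focal=4.0):
    # S102: find the horizontal shift that best aligns the two views;
    # this single global disparity stands in for a dense disparity map.
    shifts = range(1, img_a.shape[1])
    disparity = min(
        shifts,
        key=lambda s: np.abs(np.roll(img_a, s, axis=1) - img_b).sum(),
    )
    # Triangulation: depth is inversely proportional to disparity.
    return np.full(img_a.shape, baseline * focal / disparity)

def merge_depth_maps(depth_maps):
    # S103: combine the per-pair depth maps into one panoramic map,
    # keeping the nearest depth wherever the maps overlap.
    return np.minimum.reduce(depth_maps)

imgs = acquire_fisheye_images()
d1 = depth_of_overlap(imgs[0], imgs[1])
d2 = depth_of_overlap(imgs[2], imgs[3])
panorama = merge_depth_maps([d1, d2])
```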
The obtaining of the fisheye image shot by the fisheye lens comprises obtaining a current picture or video frame shot by each fisheye lens. In this embodiment, a photograph taken by a fisheye lens is acquired.
Referring to fig. 2, in this embodiment the four-eye fisheye camera has two fisheye lenses disposed on each of two parallel surfaces, four fisheye lenses in total, namely f1, f2, f3 and f4. In S102, performing stereo matching on the fisheye images and calculating a depth map of an overlapping region further includes: performing stereo matching on the fisheye images shot by the fisheye lenses on different surfaces of the four-eye fisheye camera, and calculating a first depth map of a first overlapping region. The fisheye images shot by the two lens pairs f1 and f3, and f2 and f4, on the same surface of the four-eye fisheye camera are stereo-matched respectively, and a second depth map S5 of a second overlapping region and a third depth map S6 of a third overlapping region are calculated respectively. In S103, obtaining a panoramic depth map according to the depth map further comprises: merging the first depth maps S3 and S3', the second depth map S5 and the third depth map S6 to obtain the panoramic depth map.
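One plausible way to merge the per-pair depth maps S3, S3', S5 and S6 is sketched below. It assumes each map has already been rendered into a shared panoramic grid, with NaN marking pixels outside that pair's overlap region; this representation is an assumption for illustration, not something the patent specifies.

```python
import numpy as np

def merge_panoramic(depth_maps):
    """Merge per-pair depth maps (same shape, NaN = no coverage) into
    one panoramic depth map."""
    stacked = np.stack(depth_maps)          # (n_pairs, H, W)
    valid = ~np.isnan(stacked)
    # Where several pairs cover the same direction, keep the nearest
    # (smallest) depth, the conservative choice for obstacle avoidance;
    # where only one pair covers it, keep that single value.
    stacked = np.where(valid, stacked, np.inf)
    merged = stacked.min(axis=0)
    merged[np.isinf(merged)] = np.nan       # no pair covered this pixel
    return merged
```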
The shooting angles of view of the four fisheye lenses extend well beyond 180 degrees, for example to 240 degrees; in other embodiments, the number of fisheye lenses of the fisheye camera may be greater than or equal to 4.
In this embodiment, the step of performing stereo matching on the fisheye images shot by the fisheye lenses on different surfaces of the four-eye fisheye camera and calculating the first depth map of the first overlapping area specifically includes: and respectively carrying out stereo matching on the fisheye images shot by the two fisheye lenses positioned at the same end on different surfaces of the four-eye fisheye camera, and calculating a first depth map of a first overlapping area. Specifically, a depth map of the annular view angle overlapping region S3 is calculated by stereo matching f1 and f2, a depth map of the annular view angle overlapping region S3 'is calculated by stereo matching f3 and f4, and S3 and S3' together constitute a first depth map of the first overlapping region. It should be understood that the above-described overlapping regions are all three-dimensional volumetric spatial regions.
In other embodiments, S102 may include taking the image of any one fisheye lens on one side of the four-eye fisheye camera and performing binocular stereo matching with the image of any other fisheye lens, obtaining an overlapping region of their views, such as the annular region S0 in fig. 4; this region then combines with the overlapping regions on the other two sides of the camera to form a region equal to or exceeding 360 degrees.
It should be understood that the above stereo matching includes finding matching corresponding points across the different fisheye images, and may use a matching method such as dense optical flow or sparse optical flow.
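As a hedged illustration of the correspondence search, the toy function below performs a brute-force sum-of-absolute-differences patch search along one image row (a rectified-stereo simplification of the general problem). A production system would use dense or sparse optical flow as the text suggests; the function name and parameters here are illustrative only.

```python
import numpy as np

def match_row(left_row, right_row, patch=3):
    """For each pixel in left_row, find the best-matching patch along
    right_row and return the per-pixel disparity (left x - right x)."""
    half = patch // 2
    disparities = np.zeros(len(left_row), dtype=int)
    for x in range(half, len(left_row) - half):
        ref = left_row[x - half:x + half + 1]
        # Sum of absolute differences over all candidate positions.
        costs = [
            np.abs(ref - right_row[c - half:c + half + 1]).sum()
            for c in range(half, len(right_row) - half)
        ]
        best_c = int(np.argmin(costs)) + half
        disparities[x] = x - best_c
    return disparities
```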
It should be appreciated that in order to obtain a 360 degree panoramic depth map, the overlap region also correspondingly includes a 360 degree panoramic region. Since the distance of an object can be distinguished in a region in a depth map, an obstacle can be determined from the depth map.
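A minimal sketch of obstacle determination follows, assuming obstacles are simply directions whose depth falls below a safety threshold; the patent does not fix a specific criterion, so both the rule and the threshold value are assumptions.

```python
import numpy as np

def find_obstacles(panoramic_depth, threshold=2.0):
    # Directions whose measured depth is below the safety threshold
    # are flagged as obstacles. NaN (no depth estimate) compares as
    # False, i.e. unknown regions are treated as free space here;
    # a cautious planner might instead treat them as blocked.
    return panoramic_depth < threshold
```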
As an application scenario of this embodiment, the four-eye fisheye camera may be the body of an unmanned device or an external device attached to it. The unmanned device may be an unmanned aerial vehicle or an unmanned robot. Applied on an unmanned aerial vehicle, the four-eye fisheye camera of this embodiment provides a depth map for sensing the surrounding environment and can detect obstacles, thereby assisting the unmanned aerial vehicle in avoiding obstacles or in path planning.
According to the invention, the images shot by the upper and lower and/or left and right fisheye lenses of the panoramic camera are subjected to stereo matching, and the depth of the object is calculated according to the matched characteristic points.
Referring to fig. 3, another embodiment of the present invention discloses a four-eye fisheye camera 100, where the four-eye fisheye camera 100 has two fisheye lenses respectively disposed on parallel surfaces, and four fisheye lenses in total, that is, f1, f2, f3, and f4, and the step of performing stereo matching on the fisheye images in the stereo matching module 12 and calculating a depth map of an overlapping region further includes: and carrying out stereo matching on the fisheye images shot by the fisheye lenses on different surfaces of the four-eye fisheye camera, and calculating a first depth map of a first overlapping area. And respectively carrying out stereo matching on the fisheye images shot by the two fisheye lenses f1 and f3, and f2 and f4 on the same surface of the four-eye fisheye camera, and respectively calculating a second depth map S5 of a second overlapping region and a third depth map S6 of a third overlapping region. The obtaining of the panoramic depth map according to the depth map in the panoramic synthesis module 13 further includes: and merging the first depth maps S3 and S3', the second depth map S5 and the third depth map S6 to obtain a panoramic depth map.
In this embodiment, the step of performing stereo matching on the fisheye images shot by the fisheye lenses on different surfaces of the four-eye fisheye camera 100 and calculating the first depth map of the first overlapping area specifically includes: and respectively carrying out stereo matching on the fisheye images shot by the two fisheye lenses positioned at the same end on different surfaces of the four-eye fisheye camera, and calculating a first depth map of a first overlapping area. Specifically, a depth map of the overlapping region S3 is calculated by stereo matching f1 and f2, a depth map of the overlapping region S3 'is calculated by stereo matching f3 and f4, and S3 and S3' together constitute a first depth map of the first overlapping region. It should be understood that the above-described overlapping regions are all three-dimensional volumetric spatial regions.
It should be understood that the above stereo matching includes finding matching corresponding points from different of the fisheye images.
It should be appreciated that in order to obtain a 360 degree panoramic depth map, the overlap region also correspondingly includes a 360 degree panoramic region. Since the distance of an object can be distinguished in a region in a depth map, an obstacle can be determined from the depth map.
As an application scenario of this embodiment, the four-eye fisheye camera may be the body of an unmanned device or an external device attached to it. The unmanned device may be an unmanned aerial vehicle or an unmanned robot. Applied on an unmanned aerial vehicle, the four-eye fisheye camera of this embodiment provides a depth map for sensing the surrounding environment and can detect obstacles, thereby assisting the unmanned aerial vehicle in avoiding obstacles or in path planning.
According to the invention, the images shot by the upper and lower and/or left and right fisheye lenses of the panoramic camera are subjected to stereo matching, and the depth of the object is calculated according to the matched characteristic points.
Referring to fig. 4 and 5, the embodiment of the invention also discloses a panoramic depth measuring method, which is suitable for a binocular fisheye camera and comprises the following steps: obtaining fisheye images shot by the fisheye lens when the binocular fisheye camera is at different positions; carrying out stereo matching on the fisheye image, and calculating a depth map of an overlapping area; and obtaining a panoramic depth map according to the depth map.
In this embodiment, the acquiring of the fisheye images shot by the fisheye lens when the binocular fisheye camera is at different positions further includes: acquiring a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a first position t1, and acquiring a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a second position t2; and calculating the depth map according to the displacement of the binocular fisheye camera from the first position t1 to the second position t2. The overlapping region includes a 360-degree panoramic area.
In the embodiment of fig. 4, the fisheye lenses f1 and f2 of the binocular fisheye camera are arranged back to back, and the depth maps of the overlapping regions S3 and S4 can be calculated as described above; however, the regions S1 and S2 have no overlap, so no depth map can be obtained for them. As shown in fig. 5, when the binocular fisheye camera undergoes a certain displacement, which can be measured, the images taken at the front and rear positions t1 and t2 can be reused: the regions that originally had no overlap are now covered by overlapping views, so stereo matching can be performed on them to obtain their depth maps, and a complete 360-degree depth map can be synthesized. By using the images shot at the two positions, the binocular fisheye camera effectively achieves the result of a four-eye fisheye camera.
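The motion-as-baseline idea can be illustrated with the standard triangulation relation Z = baseline * focal / disparity, here in a pinhole approximation of the de-warped region. The patent itself does not state this formula, so treat the sketch as an assumed simplification; a full solution would triangulate the fisheye viewing rays directly.

```python
import numpy as np

def depth_from_motion(disparity_px, displacement_m, focal_px):
    """Depth from the camera's own motion: the measured displacement
    between positions t1 and t2 plays the role of the stereo baseline,
    so Z = baseline * focal / disparity (pinhole approximation)."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return displacement_m * focal_px / disparity_px
```

For example, with a 0.5 m displacement, an 800 px focal length and a 10 px disparity, the point lies 40 m away; larger disparities correspond to nearer points.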
It should be understood that the above stereo matching includes finding matching corresponding points from different of the fisheye images.
It should be appreciated that in order to obtain a 360 degree panoramic depth map, the overlap region also correspondingly includes a 360 degree panoramic region. Since the distance of an object can be distinguished in a region in a depth map, an obstacle can be determined from the depth map.
As an application scenario of this embodiment, the binocular fisheye camera may be the body of an unmanned device or an external device attached to it. The unmanned device may be an unmanned aerial vehicle or an unmanned robot. Applied on an unmanned aerial vehicle, the binocular fisheye camera of this embodiment provides a depth map for sensing the surrounding environment and can detect obstacles, thereby assisting the unmanned aerial vehicle in avoiding obstacles or in path planning.
According to the invention, the images shot by the upper and lower and/or left and right fisheye lenses of the panoramic camera are subjected to stereo matching, and the depth of the object is calculated according to the matched characteristic points.
Referring to fig. 6, an embodiment of the present invention further discloses a binocular fisheye camera 200, including: the image module 21 is configured to obtain fisheye images shot by the fisheye lens when the binocular fisheye camera is at different positions; the calculation module 22 is configured to perform stereo matching on the fisheye image and calculate a depth map of an overlapping region; a depth module 23, configured to obtain a panoramic depth map according to the depth map.
In this embodiment, the acquiring of the fisheye images shot by the fisheye lens when the binocular fisheye camera 200 is at different positions further includes: acquiring a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a first position t1, and acquiring a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a second position t2; and calculating the depth map according to the displacement of the binocular fisheye camera from the first position t1 to the second position t2. The overlapping region includes a 360-degree panoramic area.
Referring to fig. 4, the fisheye lenses f1 and f2 of the binocular fisheye camera 200 are arranged back to back, and the depth maps of the overlapping regions S3 and S4 can be calculated as described above; however, the regions S1 and S2 have no overlap, so no depth map can be obtained for them. As shown in fig. 5, when the binocular fisheye camera undergoes a certain displacement, which can be measured, the images taken at the front and rear positions t1 and t2 can be reused: the regions that originally had no overlap are now covered by overlapping views, so stereo matching can be performed on them to obtain their depth maps, and a complete 360-degree depth map can be synthesized. By using the images shot at the two positions, the binocular fisheye camera effectively achieves the result of a four-eye fisheye camera.
It should be understood that the above stereo matching includes finding matching corresponding points from different of the fisheye images.
It should be appreciated that in order to obtain a 360 degree panoramic depth map, the overlap region also correspondingly includes a 360 degree panoramic region. Since the distance of an object can be distinguished in a region in a depth map, an obstacle can be determined from the depth map.
As an application scenario of this embodiment, the binocular fisheye camera may be the body of an unmanned device or an external device attached to it. The unmanned device may be an unmanned aerial vehicle or an unmanned robot. Applied on an unmanned aerial vehicle, the binocular fisheye camera of this embodiment provides a depth map for sensing the surrounding environment and can detect obstacles, thereby assisting the unmanned aerial vehicle in avoiding obstacles or in path planning.
According to the invention, the images shot by the upper and lower and/or left and right fisheye lenses of the panoramic camera are subjected to stereo matching, and the depth of the object is calculated according to the matched characteristic points.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (24)

1. A panoramic depth measurement method applicable to a four-eye fisheye camera, characterized by comprising the following steps:
acquiring fisheye images shot by the fisheye lenses of the four-eye fisheye camera;
performing stereo matching on the fisheye images, and calculating a depth map of an overlapping area;
and obtaining a panoramic depth map according to the depth map.
2. The method of claim 1,
the four-eye fisheye camera is provided with two fisheye lenses on each of two parallel surfaces;
the performing stereo matching on the fisheye images and calculating a depth map of an overlapping area further comprises:
performing stereo matching on the fisheye images shot by the fisheye lenses on different surfaces of the four-eye fisheye camera, and calculating a first depth map of a first overlapping area;
performing stereo matching on the fisheye images shot by the two fisheye lenses on each same surface of the four-eye fisheye camera, and calculating a second depth map of a second overlapping area and a third depth map of a third overlapping area, respectively;
the obtaining a panoramic depth map according to the depth map further comprises:
merging the first depth map, the second depth map and the third depth map to obtain the panoramic depth map.
3. The method of claim 2,
the performing stereo matching on the fisheye images shot by the fisheye lenses on different surfaces of the four-eye fisheye camera and calculating a first depth map of a first overlapping area further comprises:
performing stereo matching on the fisheye images shot by the two fisheye lenses located at the same end of different surfaces of the four-eye fisheye camera, and calculating the first depth map of the first overlapping area.
4. The method of claim 2, wherein said obtaining a fisheye image taken by said fisheye lens comprises obtaining a current picture or video frame taken by each of said fisheye lenses.
5. The method of claim 2, wherein the stereo matching comprises finding matching corresponding points from the different fisheye images.
6. The method of claim 2, wherein the four-eye fisheye camera serves as the fuselage of an unmanned aerial vehicle or as an external device thereof.
7. The method of claim 1, wherein the overlapping area comprises a 360-degree panoramic area.
8. The method of claim 1, further comprising the step of: determining an obstacle from the depth map.
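Claims 1 to 8 describe a numeric pipeline: pairwise stereo matching over the overlap areas of the fisheye views yields per-pair depth maps (the first, second and third depth maps of claim 2), which are then merged into one panoramic depth map. The sketch below illustrates the two arithmetic steps under the usual rectified-stereo assumption; the function names, the NaN invalid marker and the nearest-valid merge rule are illustrative choices, not taken from the patent:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m, eps=1e-6):
    """Rectified stereo relation: depth = f * B / d.
    Non-positive disparities are marked invalid (NaN)."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.nan)
    valid = d > eps
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

def merge_depth_maps(depth_maps):
    """Merge the per-pair depth maps (the first, second and third
    overlap areas of claim 2) into one panoramic depth map.
    Where areas overlap, keep the nearest valid depth; NaN means
    'no measurement in any pair'."""
    stack = np.stack([np.asarray(m, dtype=float) for m in depth_maps])
    return np.nanmin(stack, axis=0)  # per-pixel minimum, NaNs ignored
```

Taking the per-pixel minimum over the stacked maps keeps the nearest measured surface wherever two overlap areas see the same direction, which is the conservative choice if the map feeds obstacle avoidance.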
9. A four-eye fisheye camera, comprising:
the image acquisition module is used for acquiring a fisheye image shot by the fisheye lens;
the stereo matching module is used for carrying out stereo matching on the fisheye image and calculating a depth map of an overlapping area;
and the panoramic synthesis module is used for obtaining a panoramic depth map according to the depth map.
10. The four-eye fisheye camera of claim 9,
the four-eye fisheye camera is provided with two fisheye lenses on each of two parallel surfaces;
the stereo matching module is further configured to:
perform stereo matching on the fisheye images shot by the fisheye lenses on different surfaces of the four-eye fisheye camera, and calculate a first depth map of a first overlapping area;
perform stereo matching on the fisheye images shot by the two fisheye lenses on each same surface of the four-eye fisheye camera, and calculate a second depth map of a second overlapping area and a third depth map of a third overlapping area, respectively;
and the panoramic synthesis module is further configured to:
merge the first depth map, the second depth map and the third depth map to obtain the panoramic depth map.
11. The four-eye fisheye camera of claim 10,
the stereo matching module is further configured to perform stereo matching on the fisheye images shot by the two fisheye lenses located at the same end of different surfaces of the four-eye fisheye camera, and calculate the first depth map of the first overlapping area.
12. The four-eye fisheye camera of claim 10, wherein said obtaining fisheye images taken by said fisheye lenses comprises obtaining a current picture or video frame taken by each of said fisheye lenses.
13. The four-eye fisheye camera of claim 10, wherein the stereo matching comprises finding matching corresponding points from the different fisheye images.
14. The four-eye fisheye camera of claim 10, wherein the four-eye fisheye camera serves as the fuselage of an unmanned aerial vehicle or as an external device thereof.
15. The four-eye fisheye camera of claim 9, wherein the overlapping area comprises a 360-degree panoramic area.
16. The four-eye fisheye camera of claim 9, further comprising:
and the obstacle detection module is used for determining obstacles from the depth map.
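Claims 8 and 16 determine obstacles from the depth map but leave the decision rule open. A minimal, hypothetical rule is to flag any sufficiently large region that lies closer than a range threshold; both parameters below are illustrative assumptions, not specified by the patent:

```python
import numpy as np

def detect_obstacles(depth_map, max_range_m=2.0, min_pixels=4):
    """Flag an obstacle when enough pixels of the (panoramic) depth
    map lie closer than max_range_m. NaN entries mean 'no depth
    measurement' and are ignored."""
    depth = np.asarray(depth_map, dtype=float)
    close = np.isfinite(depth) & (depth < max_range_m)
    return bool(close.sum() >= min_pixels)
```

Requiring a minimum pixel count suppresses single-pixel matching outliers, at the cost of missing very small or very distant obstacles.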
17. A panoramic depth measurement method applicable to a binocular fisheye camera, characterized by comprising the following steps:
acquiring fisheye images shot by the fisheye lenses when the binocular fisheye camera is at different positions;
performing stereo matching on the fisheye images, and calculating a depth map of an overlapping area;
and obtaining a panoramic depth map according to the depth map.
18. The method of claim 17,
the acquiring fisheye images shot by the fisheye lenses when the binocular fisheye camera is at different positions further comprises:
acquiring a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a first position, and acquiring a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a second position;
the depth map is calculated according to the displacement of the binocular fisheye camera from the first position to the second position;
and the overlapping area comprises a 360-degree panoramic area.
19. The method of claim 18, wherein the binocular fisheye camera serves as the fuselage of an unmanned aerial vehicle or as an external device thereof.
20. The method of claim 17, further comprising the step of: determining an obstacle from the depth map.
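Claim 18 computes depth from the camera's own displacement: the first and second positions act as the two ends of a stereo baseline. Since each fisheye pixel corresponds to a viewing direction, a natural formulation is to triangulate matched bearing rays against the displacement vector. The following midpoint-triangulation sketch assumes known camera poses and a matched ray pair; the function name and least-squares formulation are illustrative, not taken from the patent:

```python
import numpy as np

def triangulate_rays(c1, r1, c2, r2):
    """Midpoint triangulation of two viewing rays.
    c1, c2: camera centers at the first and second positions
    (their difference is the displacement of claim 18);
    r1, r2: bearing rays of the matched point in each view.
    Returns (3-D point, depth along r1 from c1)."""
    r1 = np.asarray(r1, dtype=float); r1 = r1 / np.linalg.norm(r1)
    r2 = np.asarray(r2, dtype=float); r2 = r2 / np.linalg.norm(r2)
    # Solve [r1 -r2] @ [s, t]^T ~= c2 - c1 in least squares,
    # i.e. find where the two rays pass closest to each other.
    A = np.stack([r1, -r2], axis=1)          # 3x2 system matrix
    b = np.asarray(c2, dtype=float) - np.asarray(c1, dtype=float)
    (s, t), *_ = np.linalg.lstsq(A, b, rcond=None)
    p1 = c1 + s * r1                         # closest point on ray 1
    p2 = c2 + t * r2                         # closest point on ray 2
    return 0.5 * (p1 + p2), s                # midpoint and depth
```

For noise-free rays the two closest points coincide and the midpoint is the exact intersection; with matching noise the midpoint is a standard compromise between the two rays.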
21. A binocular fisheye camera, comprising:
the image module is used for acquiring fisheye images shot by the fisheye lens when the binocular fisheye camera is at different positions;
the computing module is used for carrying out stereo matching on the fisheye image and computing a depth map of an overlapping area;
and the depth module is used for obtaining a panoramic depth map according to the depth map.
22. The binocular fisheye camera of claim 21, wherein
the image module is further configured to acquire a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a first position, and acquire a fisheye image shot by the fisheye lens when the binocular fisheye camera is at a second position;
the computing module is further configured to calculate the depth map according to the displacement of the binocular fisheye camera from the first position to the second position;
and the overlapping area comprises a 360-degree panoramic area.
23. The binocular fisheye camera of claim 22, wherein the binocular fisheye camera serves as the fuselage of an unmanned aerial vehicle or as an external device thereof.
24. The binocular fisheye camera of claim 21, further comprising: an obstacle avoidance module configured to determine obstacles from the depth map.
CN201911164667.4A 2019-11-25 2019-11-25 Panoramic depth measurement method, four-eye fisheye camera and binocular fisheye camera Active CN112837207B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911164667.4A CN112837207B (en) 2019-11-25 Panoramic depth measurement method, four-eye fisheye camera and binocular fisheye camera
PCT/CN2020/131506 WO2021104308A1 (en) 2019-11-25 2020-11-25 Panoramic depth measurement method, four-eye fisheye camera, and binocular fisheye camera


Publications (2)

Publication Number Publication Date
CN112837207A true CN112837207A (en) 2021-05-25
CN112837207B CN112837207B (en) 2024-06-21


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023041884A1 (en) 2021-09-17 2023-03-23 Lerity Hemispherical optronic system for detecting and locating threats, employing real-time processing
FR3127297A1 (en) 2021-09-17 2023-03-24 Lerity HEMISPHERIC OPTRONIC SYSTEM FOR DETECTION AND LOCATION OF THREATS WITH REAL-TIME PROCESSING
WO2023130465A1 (en) * 2022-01-10 2023-07-13 深圳市大疆创新科技有限公司 Aerial vehicle, image processing method and apparatus, and movable platform
WO2024103366A1 (en) * 2022-11-18 2024-05-23 影石创新科技股份有限公司 Panoramic unmanned aerial vehicle

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101000461A (en) * 2006-12-14 2007-07-18 上海杰图软件技术有限公司 Method for generating stereoscopic panorama by fish eye image
CN106931961A (en) * 2017-03-20 2017-07-07 成都通甲优博科技有限责任公司 A kind of automatic navigation method and device
CN108230392A (en) * 2018-01-23 2018-06-29 北京易智能科技有限公司 A kind of dysopia analyte detection false-alarm elimination method based on IMU
CN108269234A (en) * 2016-12-30 2018-07-10 成都观界创宇科技有限公司 A kind of lens of panoramic camera Attitude estimation method and panorama camera
CN108322730A (en) * 2018-03-09 2018-07-24 嘀拍信息科技南通有限公司 A kind of panorama depth camera system acquiring 360 degree of scene structures


Also Published As

Publication number Publication date
WO2021104308A1 (en) 2021-06-03

Similar Documents

Publication Publication Date Title
JP6484729B2 (en) Unmanned aircraft depth image acquisition method, acquisition device, and unmanned aircraft
US10085011B2 (en) Image calibrating, stitching and depth rebuilding method of a panoramic fish-eye camera and a system thereof
WO2019127445A1 (en) Three-dimensional mapping method, apparatus and system, cloud platform, electronic device, and computer program product
KR101666959B1 (en) Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor
JP5739584B2 (en) 3D image synthesizing apparatus and method for visualizing vehicle periphery
CN107113376B (en) A kind of image processing method, device and video camera
US8350894B2 (en) System and method for stereoscopic imaging
US20190012804A1 (en) Methods and apparatuses for panoramic image processing
JP6192853B2 (en) Optical flow imaging system and method using ultrasonic depth detection
CN106960454B (en) Depth of field obstacle avoidance method and equipment and unmanned aerial vehicle
JP2007192832A (en) Calibrating method of fish eye camera
CN110889873A (en) Target positioning method and device, electronic equipment and storage medium
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN210986289U (en) Four-eye fisheye camera and binocular fisheye camera
CN115035235A (en) Three-dimensional reconstruction method and device
WO2021116078A1 (en) A method for measuring the topography of an environment
WO2021104308A1 (en) Panoramic depth measurement method, four-eye fisheye camera, and binocular fisheye camera
CN109658451B (en) Depth sensing method and device and depth sensing equipment
CN110800023A (en) Image processing method and equipment, camera device and unmanned aerial vehicle
Lin et al. Real-time low-cost omni-directional stereo vision via bi-polar spherical cameras
CN116929290A (en) Binocular visual angle difference three-dimensional depth measurement method, binocular visual angle difference three-dimensional depth measurement system and storage medium
CN111243021A (en) Vehicle-mounted visual positioning method and system based on multiple combined cameras and storage medium
CN112837207B (en) Panoramic depth measurement method, four-eye fisheye camera and binocular fisheye camera
CN113034615B (en) Equipment calibration method and related device for multi-source data fusion
CN113674356A (en) Camera screening method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant