CN113645459B - High-dynamic 3D imaging method and device, electronic equipment and storage medium - Google Patents

High-dynamic 3D imaging method and device, electronic equipment and storage medium

Info

Publication number
CN113645459B
CN113645459B (application CN202111190592.4A)
Authority
CN
China
Prior art keywords
depth camera
information
spatial light
depth
target scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111190592.4A
Other languages
Chinese (zh)
Other versions
CN113645459A (en
Inventor
国学理
徐永奎
齐伟
任家辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Lanxin Technology Co ltd
Original Assignee
Hangzhou Lanxin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Lanxin Technology Co ltd filed Critical Hangzhou Lanxin Technology Co ltd
Priority to CN202111190592.4A priority Critical patent/CN113645459B/en
Publication of CN113645459A publication Critical patent/CN113645459A/en
Application granted granted Critical
Publication of CN113645459B publication Critical patent/CN113645459B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography

Abstract

The invention discloses a high-dynamic 3D imaging method and device, an electronic device, and a storage medium. The method comprises: acquiring grayscale information of a target scene collected by a depth camera; controlling a spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information; after the spatial light modulation is performed, acquiring a plurality of pieces of phase information of the target scene collected by the depth camera; and obtaining depth information of the target scene from the plurality of pieces of phase information. By combining the depth camera with a spatial light modulator and regulating the spatial illumination according to the grayscale information, an image of uniform intensity is obtained, which widens the measurement dynamic range of the system and avoids the frame-rate and timeliness penalties caused by multiple exposures.

Description

High-dynamic 3D imaging method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of imaging technologies, and in particular, to a high dynamic 3D imaging method and apparatus, an electronic device, and a storage medium.
Background
A depth camera senses three-dimensional depth information of its environment and is therefore widely used in mobile-robot navigation, aerospace, aviation, augmented reality, surveying and mapping, and other fields. Depth cameras currently fall into active depth cameras, represented by structured-light and time-of-flight cameras, and passive depth cameras, represented by binocular (stereo) cameras. Compared with a passive depth camera, an active depth camera can acquire distance information of a real environment without heavy computation, and the acquired distance information is more reliable.
Currently, active depth cameras are mainly classified into optical time-of-flight depth cameras and structured-light depth cameras. Taking the optical time-of-flight camera as an example, its spatial perception is mainly affected by the intensity of the reflected light signal, which depends on the reflectivity of the target object and its distance. Effective depth information generally requires that the returned optical signal neither be overexposed, which causes ranging errors, nor be so weak that the ranging information is submerged in noise. The most common remedy is multiple exposures, but multiple exposures inevitably cost measurement frame rate and timeliness.
Disclosure of Invention
An object of the embodiments of the present application is to provide a high-dynamic 3D imaging method and apparatus, an electronic device, and a storage medium, so as to solve the technical problem of frame-rate and timeliness loss caused by multiple exposures in the related art.
According to a first aspect of embodiments of the present application, there is provided a high dynamic 3D imaging method, including:
acquiring grayscale information of a target scene collected by a depth camera;
controlling a spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information;
after the spatial light modulation is performed, acquiring a plurality of pieces of phase information of the target scene collected by the depth camera;
and obtaining depth information of the target scene from the plurality of pieces of phase information.
According to a second aspect of embodiments of the present application, there is provided a high dynamic 3D imaging apparatus, comprising:
the first acquisition module is used for acquiring grayscale information of a target scene collected by the depth camera;
the control module is used for controlling a spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information;
the second acquisition module is used for acquiring a plurality of pieces of phase information of the target scene collected by the depth camera after the spatial light modulation is performed;
and the imaging module is used for obtaining depth information of the target scene from the plurality of pieces of phase information.
According to a third aspect of embodiments of the present application, there is provided a high dynamic 3D imaging apparatus, comprising:
the depth camera is used for acquiring grayscale information of a target scene and, after spatial light modulation by the spatial light modulator, acquiring a plurality of pieces of phase information of the target scene;
the spatial light modulator is used for performing spatial light modulation for the depth camera;
and the calculation module is used for acquiring the grayscale information of the target scene collected by the depth camera, controlling the spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information, acquiring a plurality of pieces of phase information of the target scene collected by the depth camera after the spatial light modulation is performed, and obtaining depth information of the target scene from the phase information.
According to a fourth aspect of embodiments of the present application, there is provided an electronic apparatus, comprising:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described in the first aspect.
According to a fifth aspect of embodiments herein, there is provided a computer-readable storage medium having stored thereon computer instructions, characterized in that the instructions, when executed by a processor, implement the steps of the method according to the first aspect.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
according to the embodiment, the depth camera and the spatial light modulator are combined to sense the spatial depth, so that the efficiency is high, and the realization is simple; the spatial light modulator is adopted for spatial illumination modulation, and the depth information with a high dynamic range can be obtained only by collecting a single image, so that the calculation complexity is reduced; the method and the device can perform physical spatial light modulation on the original phase diagram, the optimized measuring range can reach 2 orders of magnitude, and meanwhile, the output frame rate of depth information is not reduced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a flow chart illustrating a method of high dynamic 3D imaging according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating step S14 according to an exemplary embodiment.
Fig. 3 is a schematic structural diagram illustrating a high dynamic 3D imaging apparatus according to an exemplary embodiment.
Fig. 4 is a block diagram illustrating a high dynamic 3D imaging device according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
Fig. 1 is a flow chart illustrating a method of high dynamic 3D imaging according to an exemplary embodiment, which may include the following steps, as shown in fig. 1:
step S11, acquiring grayscale information of a target scene collected by a depth camera;
step S12, controlling a spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information;
step S13, after the spatial light modulation is performed, acquiring a plurality of pieces of phase information of the target scene collected by the depth camera;
and step S14, obtaining depth information of the target scene from the plurality of pieces of phase information.
In this embodiment, the depth camera and the spatial light modulator are combined to sense spatial depth, which is efficient and simple to implement. Because the spatial light modulator performs the spatial illumination modulation, depth information with a high dynamic range can be obtained from a single acquisition, reducing computational complexity. The original phase map is physically modulated in the spatial light domain, extending the measurement range by up to two orders of magnitude without reducing the output frame rate of the depth information.
In the specific implementation of step S11, acquiring grayscale information of a target scene acquired by the depth camera;
specifically, the depth camera is configured to be in a gray level acquisition mode, that is, the depth camera is in a non-demodulation exposure mode, and meanwhile, a light source of the depth camera is in a direct current exposure mode, and a target scene of the depth camera is imaged under active illumination to obtain gray level information of the target scene. The depth camera of this example includes an imaging lens, an illumination source disposed beside the imaging lens to provide illumination, and a depth sensor disposed behind the imaging lens.
The grayscale information of the target scene collected by the depth camera is acquired and transmitted to an FPGA.
In a specific implementation of step S12, controlling a spatial light modulator to perform spatial light modulation on the depth camera according to the gray scale information;
specifically, the FPGA analyzes the gray scale information to obtain a video signal, the video signal is adjusted, and the adjusted signal is input into the spatial light modulator to perform spatial light modulation on the depth camera. The spatial light modulator can reduce the light intensity transmittance of the overexposed area of the depth camera and increase the light intensity transmittance of the underexposed area, so that a target scene image with uniform exposure can be obtained. The spatial light modulator is arranged between the depth sensor and the imaging lens, the FPGA is connected with the depth sensor, receives the digital signal, converts the digital signal into a target scene brightness signal and is used for representing the illumination condition of a target scene. The spatial light modulator is arranged between the depth sensor and the imaging lens, so that the light modulation control of the whole imaging area is realized, and the dynamic range of the camera is improved.
The spatial light modulator may be a liquid crystal panel, an adjustable attenuator, a DMD, or the like; this embodiment takes a liquid crystal panel as an example. The FPGA transmits the target-scene brightness signal to the liquid crystal panel: the output is split into line- and field-synchronization signals plus a video signal fed to the input of the liquid crystal driving board, which drives and controls the panel. Local regulation of frame transmittance is achieved by controlling the light-intensity transmittance of each pixel independently; the pixels of the liquid crystal panel match those of the depth sensor in one-to-one correspondence. The panel can regulate the light intensity of every pixel in real time, with the regulation strength set by the FPGA according to the grayscale information of the target scene and thresholds configured in advance.
Adjusting the video signal includes correction, gain, and bias adjustment; the adjusted signal is input to the liquid crystal driving board and from there directly to the liquid crystal panel. The purpose of the correction is to improve the quality of the video signal.
Because of how pixel transmittance is controlled, the liquid crystal panel must display a grayscale image that is the inverse of the original image: where the original image is brighter, the corresponding area of the panel is driven darker (lower transmittance); where the original image is darker, the corresponding area is driven brighter (higher transmittance). In this way the gray level of each pixel of the final image is accurately adjusted.
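The inverted-grayscale mapping described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the 8-bit value range, and the `lo`/`hi` well-exposure thresholds (standing in for the FPGA's preset thresholds) are all assumptions.

```python
import numpy as np

def lcd_drive_image(gray, lo=60, hi=200):
    """Build the inverted drive image for the LCD spatial light modulator.

    Bright (near-overexposed) regions of the captured grayscale image get a
    low drive value (low transmittance); dark (underexposed) regions get a
    high one, so the modulated exposure becomes more uniform. The thresholds
    `lo` and `hi` are assumed stand-ins for the preset FPGA thresholds.
    """
    gray = gray.astype(np.float32)
    # Invert: brighter input pixel -> darker LCD pixel (lower transmittance).
    drive = 255.0 - gray
    # Only regulate pixels outside the well-exposed band [lo, hi];
    # leave well-exposed pixels at full transmittance (drive value 255).
    well_exposed = (gray >= lo) & (gray <= hi)
    drive[well_exposed] = 255.0
    return drive.astype(np.uint8)

# A toy 2x2 scene: overexposed, well-exposed, underexposed, well-exposed
scene = np.array([[250, 120], [30, 180]], dtype=np.uint8)
mask = lcd_drive_image(scene)
```

In a real system this per-pixel drive image would be streamed to the liquid crystal driving board each frame; here it simply demonstrates that only over- and underexposed pixels are regulated.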
Further, to regulate the light intensity as accurately as possible, the grayscale image of the target scene should be captured with as strong a signal as possible while avoiding overexposure.
In a specific implementation of step S13, after the spatial light modulation is performed, a plurality of pieces of phase information of the target scene collected by the depth camera are acquired;
specifically, after the depth camera performs spatial light modulation, the depth camera is configured to be in a depth acquisition mode, that is, the depth camera is in a demodulation exposure mode, and simultaneously, a light source of the depth camera is in a modulation exposure mode, so that phase information of a plurality of target scenes is acquired.
For a sine-modulated TOF camera, at least four phase maps with different phase shifts must be acquired to recover the depth information of the target scene. For a square-wave-modulated TOF camera, illumination information from at least three acquisitions of the target scene is needed to calculate the depth information.
In a specific implementation of step S14, obtaining depth information of the target scene according to the plurality of pieces of phase information;
Specifically, fig. 2 is a flowchart illustrating step S14 according to an exemplary embodiment; referring to fig. 2, step S14 may include the following sub-steps:
step S141, calculating the round-trip phase difference of the light at each pixel according to the plurality of pieces of phase information;
Specifically, for a sine-modulated TOF camera, four phase maps with different phase shifts are acquired to recover the depth information of the target scene; for a square-wave-modulated TOF camera, the round-trip phase difference can be calculated from illumination information of three acquisitions of the target scene;
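For the four-phase sine-modulated case, the per-pixel phase difference can be recovered with the standard four-bucket demodulation formula. The sketch below is illustrative, not taken from the patent; sign conventions vary between sensors, and the array names and simulated values are assumptions.

```python
import numpy as np

def phase_difference(i0, i90, i180, i270):
    """Per-pixel round-trip phase difference from four correlation samples
    taken at phase shifts of 0, 90, 180 and 270 degrees (the standard
    four-bucket demodulation for a sine-modulated TOF camera)."""
    num = np.asarray(i90, dtype=float) - np.asarray(i270, dtype=float)
    den = np.asarray(i0, dtype=float) - np.asarray(i180, dtype=float)
    # Constant offsets (e.g. ambient light) cancel in both differences.
    return np.mod(np.arctan2(num, den), 2.0 * np.pi)  # wrapped to [0, 2*pi)

# Simulated samples for a known phase of 1.0 rad, amplitude 100, offset 20
phi = 1.0
samples = [20 + 100 * np.cos(phi - k * np.pi / 2) for k in range(4)]
dphi = phase_difference(*samples)
```

Because the ambient offset cancels in the numerator and denominator, the recovered phase depends only on the modulated component, which is why four samples suffice.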
step S142, deriving the time of flight of the light from the phase difference;
specifically, the following relationship exists between the optical flight time t, the phase difference Δ Φ, and the modulation frequency f:
Figure DEST_PATH_IMAGE001
the phase difference Δ Φ and the modulation frequency f are substituted into the above formula, and the optical flight time t can be obtained.
And step S143, calculating depth information of the target scene according to the light flight time.
Specifically, half the product of the speed of light and the round-trip time of flight gives the depth value of the target point.
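Putting sub-steps S142 and S143 together, the conversion from phase difference to depth is a two-line calculation. This is a generic sketch of the standard continuous-wave ToF relations, not code from the patent; the 20 MHz modulation frequency in the usage example is an assumption, and the factor 1/2 accounts for the light travelling to the target and back.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def depth_from_phase(delta_phi, f_mod):
    """Depth from round-trip phase difference and modulation frequency:
    t = delta_phi / (2*pi*f), then d = c*t/2 (halved for the round trip)."""
    t = delta_phi / (2.0 * math.pi * f_mod)  # round-trip time of flight, s
    return C * t / 2.0                       # depth, m

# Example: a pi/2 phase shift at an assumed 20 MHz modulation frequency
d = depth_from_phase(math.pi / 2, 20e6)
```

Note that a full 2π phase wrap at 20 MHz corresponds to c/(2f) ≈ 7.5 m, the unambiguous range of such a camera; larger distances alias back into this interval.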
Fig. 3 is a schematic structural diagram illustrating a high dynamic 3D imaging apparatus according to an exemplary embodiment. As shown in fig. 3, an embodiment of the present invention further provides a high dynamic 3D imaging apparatus, which may include:
the depth camera 10 is used for acquiring grayscale information of a target scene and, after spatial light modulation by the spatial light modulator, acquiring a plurality of pieces of phase information of the target scene;
the spatial light modulator 20 is used for performing spatial light modulation for the depth camera;
and the calculation module 30 is configured to acquire grayscale information of the target scene collected by the depth camera, control the spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information, acquire a plurality of pieces of phase information of the target scene collected by the depth camera after the spatial light modulation is performed, and obtain depth information of the target scene from the phase information.
Specifically, the depth camera 10 includes an imaging lens 11, an illumination light source 12, and a depth sensor 13, the illumination light source 12 being disposed beside the imaging lens 11 to provide illumination, and the depth sensor 13 being disposed behind the imaging lens 11.
The spatial light modulator 20 is arranged between the depth sensor 13 and the imaging lens 11, and the calculation module 30 is implemented on an FPGA. The spatial light modulator 20 of this embodiment is illustrated with a liquid crystal panel as an example: the panel is fitted with and connected to a liquid crystal driver, and the FPGA is connected to the depth sensor 13 and the liquid crystal driver respectively. Arranging the liquid crystal panel between the depth sensor 13 and the imaging lens 11 enables light control over the whole imaging area and widens the dynamic range of the camera.
The specific way in which the depth camera 10, the spatial light modulator 20 and the calculation module 30 perform operations has been described in detail in relation to the embodiment of the method, and will not be elaborated upon here.
Fig. 4 is a block diagram illustrating a high dynamic 3D imaging device according to an exemplary embodiment. Referring to fig. 4, the apparatus includes:
the first acquisition module 21 is configured to acquire grayscale information of a target scene collected by the depth camera;
the control module 22 is configured to control a spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information;
the second acquisition module 23 is configured to acquire a plurality of pieces of phase information of the target scene collected by the depth camera after the spatial light modulation is performed;
and the imaging module 24 is configured to obtain depth information of the target scene from the plurality of pieces of phase information.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the present application also provides an electronic device, comprising: one or more processors; a memory for storing one or more programs; when executed by the one or more processors, cause the one or more processors to implement a highly dynamic 3D imaging method as described above.
Accordingly, the present application also provides a computer readable storage medium having stored thereon computer instructions, wherein the instructions, when executed by a processor, implement a high dynamic 3D imaging method as described above.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described device embodiments are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. A method of high dynamic 3D imaging, comprising:
acquiring grayscale information of a target scene collected by a depth camera;
controlling a spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information, so as to reduce the light-intensity transmittance over an overexposed area of the depth camera and increase it over an underexposed area;
after the spatial light modulation is performed, acquiring a plurality of pieces of phase information of the target scene collected by the depth camera;
and obtaining depth information of the target scene from the plurality of pieces of phase information.
2. The method of claim 1, wherein acquiring grayscale information of a target scene collected by a depth camera comprises:
acquiring grayscale information of a target scene collected by a depth camera, wherein the depth camera is in a non-demodulation exposure mode while its light source is in a direct-current exposure mode.
3. The method of claim 1, wherein controlling a spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information comprises:
parsing the grayscale information to obtain a video signal, adjusting the video signal, and inputting the adjusted signal into the spatial light modulator to perform spatial light modulation for the depth camera.
4. The method of claim 1, wherein acquiring a plurality of pieces of phase information of the target scene collected by the depth camera after the spatial light modulation comprises:
acquiring a plurality of pieces of phase information of the target scene collected by the depth camera, wherein the depth camera is in a demodulation exposure mode while its light source is in a modulated exposure mode.
5. The method of claim 1, wherein obtaining depth information of the target scene from the plurality of pieces of phase information comprises:
calculating the round-trip phase difference of the light at each pixel according to the plurality of pieces of phase information;
deriving the time of flight of the light from the phase difference;
and calculating the depth information of the target scene according to the time of flight.
6. A high dynamic 3D imaging apparatus, comprising:
the first acquisition module is used for acquiring grayscale information of a target scene collected by the depth camera;
the control module is used for controlling a spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information, so as to reduce the light-intensity transmittance over an overexposed area of the depth camera and increase it over an underexposed area;
the second acquisition module is used for acquiring a plurality of pieces of phase information of the target scene collected by the depth camera after the spatial light modulation is performed;
and the imaging module is used for obtaining depth information of the target scene from the plurality of pieces of phase information.
7. The apparatus of claim 6, wherein controlling a spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information comprises:
parsing the grayscale information to obtain a video signal, adjusting the video signal, and inputting the adjusted signal into the spatial light modulator to perform spatial light modulation for the depth camera.
8. A high dynamic 3D imaging apparatus, comprising:
the depth camera is used for acquiring grayscale information of a target scene and, after spatial light modulation by the spatial light modulator, acquiring a plurality of pieces of phase information of the target scene;
the spatial light modulator is used for performing spatial light modulation for the depth camera, so as to reduce the light-intensity transmittance over an overexposed area of the depth camera and increase it over an underexposed area;
and the calculation module is used for acquiring the grayscale information of the target scene collected by the depth camera, controlling the spatial light modulator to perform spatial light modulation for the depth camera according to the grayscale information, acquiring a plurality of pieces of phase information of the target scene collected by the depth camera after the spatial light modulation is performed, and obtaining depth information of the target scene from the phase information.
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
10. A computer-readable storage medium having stored thereon computer instructions, which when executed by a processor, perform the steps of the method according to any one of claims 1-5.
CN202111190592.4A 2021-10-13 2021-10-13 High-dynamic 3D imaging method and device, electronic equipment and storage medium Active CN113645459B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111190592.4A CN113645459B (en) 2021-10-13 2021-10-13 High-dynamic 3D imaging method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111190592.4A CN113645459B (en) 2021-10-13 2021-10-13 High-dynamic 3D imaging method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113645459A CN113645459A (en) 2021-11-12
CN113645459B true CN113645459B (en) 2022-01-14

Family

ID=78426584

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111190592.4A Active CN113645459B (en) 2021-10-13 2021-10-13 High-dynamic 3D imaging method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113645459B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013012335A1 (en) * 2011-07-21 2013-01-24 Ziv Attar Imaging device for motion detection of objects in a scene, and method for motion detection of objects in a scene
CN109506591A (en) * 2018-09-14 2019-03-22 天津大学 An adaptive illumination optimization method for complex illumination scenes
CN111540042A (en) * 2020-04-28 2020-08-14 上海盛晃光学技术有限公司 Method, device and related equipment for three-dimensional reconstruction
CN113237435A (en) * 2021-05-08 2021-08-10 北京航空航天大学 High-light-reflection surface three-dimensional vision measurement system and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10061028B2 (en) * 2013-09-05 2018-08-28 Texas Instruments Incorporated Time-of-flight (TOF) assisted structured light imaging


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Three-dimensional shape measurement of highly reflective objects based on adaptive fringe projection; Wang Liu et al.; 《应用光学》 (Journal of Applied Optics); 2018-05-15 (Issue 03) *

Also Published As

Publication number Publication date
CN113645459A (en) 2021-11-12

Similar Documents

Publication Publication Date Title
US11877086B2 (en) Method and system for generating at least one image of a real environment
US11131753B2 (en) Method, apparatus and computer program for a vehicle
CN103973989B (en) Obtain the method and system of high-dynamics image
CN110300292B (en) Projection distortion correction method, device, system and storage medium
US20160142615A1 (en) Robust layered light-field rendering
JP5536071B2 (en) Generation of depth data based on spatial light patterns
US10567646B2 (en) Imaging apparatus and imaging method
US20130010067A1 (en) Camera and Method for Focus Based Depth Reconstruction of Dynamic Scenes
US9807372B2 (en) Focused image generation single depth information from multiple images from multiple sensors
CN108063932B (en) Luminosity calibration method and device
CN105245785A (en) Brightness balance adjustment method of vehicle panoramic camera
US20190355101A1 (en) Image refocusing
US20210377432A1 (en) Information processing apparatus, information processing method, program, and interchangeable lens
CN105141841A (en) Camera equipment and method therefor
US20170307869A1 (en) Microscope and method for obtaining a high dynamic range synthesized image of an object
CN115150561B (en) High dynamic imaging system and method
US10341546B2 (en) Image processing apparatus and image processing method
JP2015073185A (en) Image processing device, image processing method and program
US10529057B2 (en) Image processing apparatus and image processing method
CN113645459B (en) High-dynamic 3D imaging method and device, electronic equipment and storage medium
US10306146B2 (en) Image processing apparatus and image processing method
KR20190074455A (en) Method, apparatus and program sotred in recording medium for refocucing of planar image
CN109495694B (en) RGB-D-based environment sensing method and device
Abedi et al. Multi-view high dynamic range reconstruction via gain estimation
KR101909392B1 (en) Surround view monitoring system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant