WO2020238481A1 - Image acquisition method, image acquisition device, electronic device and readable storage medium - Google Patents

Image acquisition method, image acquisition device, electronic device and readable storage medium

Info

Publication number
WO2020238481A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
laser
processor
frame
operating frequency
Prior art date
Application number
PCT/CN2020/085783
Other languages
English (en)
French (fr)
Inventor
徐乃江 (XU, Naijiang)
Original Assignee
Guangdong OPPO Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Priority date
Filing date
Publication date
Application filed by Guangdong OPPO Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Priority to EP20815599.4A (published as EP 3975537 A4)
Publication of WO2020238481A1
Priority to US17/525,544 (published as US 2022/0067951 A1)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/51Housings
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • H04N5/2226Determination of depth image, e.g. for foreground/background separation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/22Measuring arrangements characterised by the use of optical techniques for measuring depth
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Definitions

  • This application relates to the field of imaging technology, and in particular to an image acquisition method, image acquisition device, electronic equipment and non-volatile computer-readable storage medium.
  • The depth camera obtains the depth information of a scene by projecting a laser pattern with spots onto the scene. Specifically, the depth camera projects an infrared laser (for example, a 940nm infrared laser) into the scene, the infrared laser forms a speckle pattern, and the depth camera collects the speckle pattern reflected by objects in the scene to obtain the depth information of the objects in the scene.
  • However, if the depth camera is used in a scene with high brightness, such as an outdoor scene with strong sunlight, the ambient light contains a large amount of 940nm infrared light, and this infrared light enters the depth camera and is imaged. The brightness of the speckle-pattern image is then close to that of the ambient-infrared image, the algorithm cannot distinguish the laser speckles, laser speckle matching fails, and depth information is partially or completely lost.
  • the embodiments of the present application provide an image acquisition method, an image acquisition device, an electronic device, and a non-volatile computer-readable storage medium.
  • the image acquisition method of the embodiment of the present application is used in an electronic device.
  • The electronic device includes a depth camera, the depth camera includes a laser projector, and the image acquisition method includes: projecting laser light onto a scene at a first operating frequency; acquiring captured images at a second operating frequency, where the second operating frequency is greater than the first operating frequency; distinguishing, among the captured images, a first image collected when the laser projector is not projecting laser light and a second image collected when the laser projector is projecting laser light; and calculating a depth image according to the first image, the second image, and a reference image.
  • the image acquisition device of the embodiment of the present application is used in an electronic device.
  • the electronic device includes a depth camera, and the depth camera includes a laser projector.
  • the image acquisition device includes: a transmission module, a first acquisition module, a distinguishing module, and a calculation module.
  • the transmitting module is used to project laser light to the scene at the first working frequency.
  • the first acquisition module is configured to acquire the acquired image at a second operating frequency, where the second operating frequency is greater than the first operating frequency.
  • the distinguishing module is used for distinguishing, from the collected images, a first image collected when the laser projector is not projecting laser light and a second image collected when the laser projector is projecting laser light.
  • the calculation module calculates a depth image according to the first image, the second image, and the reference image.
  • the electronic device of the embodiment of the present application includes a depth camera and a processor.
  • the depth camera includes a laser projector and an image collector.
  • the laser projector is used to project laser light to the scene at the first operating frequency.
  • the image collector is used to obtain a captured image at a second operating frequency, and the second operating frequency is greater than the first operating frequency.
  • The processor is configured to distinguish, among the captured images, a first image collected when the laser projector is not projecting laser light and a second image collected when the laser projector is projecting laser light, and to calculate a depth image according to the first image, the second image, and a reference image.
  • The non-volatile computer-readable storage medium of the embodiments of the present application contains computer-readable instructions that, when executed by a processor, cause the processor to execute the aforementioned image acquisition method.
  • In the image acquisition method, image acquisition device, electronic device, and non-volatile computer-readable storage medium of the embodiments of the present application, the laser projector and the image collector work at different operating frequencies. The image collector can collect a first image formed only by ambient infrared light as well as a second image formed by both the ambient infrared light and the infrared laser emitted by the laser projector, and the image portion formed by the ambient infrared light in the second image can be removed based on the first image. In this way the laser speckles can be distinguished, the captured image formed only by the infrared laser emitted by the laser projector can be used to calculate the depth image, laser speckle matching is not affected, and partial or complete loss of depth information can be avoided, thereby improving the accuracy of the depth image.
  • FIGS. 1 and 2 are schematic diagrams of the structure of an electronic device according to some embodiments of the present application.
  • FIG. 3 is a schematic diagram of a system architecture of an electronic device according to some embodiments of the present application.
  • FIG. 4 is a schematic flowchart of an image acquisition method according to some embodiments of the present application.
  • Fig. 5 is a schematic diagram of modules of an image acquisition device according to some embodiments of the present application.
  • Fig. 6 is a schematic diagram of the principle of an image acquisition method according to some embodiments of the present application.
  • FIGS. 7 to 10 are schematic flowcharts of image acquisition methods according to some embodiments of the present application.
  • FIG. 11 is a schematic diagram of modules of an image acquisition device according to some embodiments of the present application.
  • FIG. 12 is a schematic diagram of the principle of an image acquisition method in some embodiments of the present application.
  • FIG. 13 is a schematic flowchart of an image acquisition method according to some embodiments of the present application.
  • FIG. 14 is a schematic diagram of modules of an image acquisition device according to some embodiments of the present application.
  • FIG. 15 is a schematic diagram of interaction between a non-volatile computer-readable storage medium and a processor in some embodiments of the present application.
  • the present application provides an electronic device 100.
  • the electronic device 100 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (smart watch, smart bracelet, smart helmet, smart glasses, etc.), a virtual reality device, and the like.
  • the electronic device 100 is a mobile phone as an example, but the form of the electronic device 100 is not limited to a mobile phone.
  • the electronic device 100 includes a depth camera 10, a visible light camera 30, a processor 40 and a housing 50.
  • the processor 40 is housed in the housing 50.
  • the depth camera 10 and the visible light camera 30 are mounted on the housing 50.
  • the housing 50 includes a main body 51 and a movable bracket 52.
  • the movable bracket 52 can move relative to the main body 51 under the driving of the driving device.
  • the movable bracket 52 can slide relative to the main body 51 to slide into or out of the main body 51.
  • the depth camera 10 and the visible light camera 30 may be installed on a movable bracket 52, and the movement of the movable bracket 52 can drive the depth camera 10 and the visible light camera 30 to retract into the main body 51 or extend from the main body 51.
  • One or more collection windows are opened on the housing 50, and the collection windows can be opened on the front or back of the housing 50.
  • Both the depth camera 10 and the visible light camera 30 are aligned and installed with the collection window, so that the depth camera 10 and the visible light camera 30 can receive light incident from the collection window.
  • When the user needs to use the depth camera 10 and the visible light camera 30, the movable bracket 52 can be triggered to slide out of the main body 51 to drive the depth camera 10 and the visible light camera 30 to extend from the main body 51; when the user does not need to use the depth camera 10 and the visible light camera 30, the movable bracket 52 can be triggered to slide into the main body 51 to drive the depth camera 10 and the visible light camera 30 to retract into the main body 51.
  • one or more through holes are opened on the housing 50, and the depth camera 10 and the visible light camera 30 are installed in the housing 50 and aligned with the through holes.
  • the through hole may be opened on the front or back of the housing 50, and the depth camera 10 and the visible light camera 30 may receive light passing through the through hole.
  • the depth camera 10 includes a laser projector 11 and an image collector 12.
  • the laser projector 11 can emit laser light.
  • the laser projector 11 includes a laser light source 111 and a first driver 112.
  • The first driver 112 can be used to drive the laser light source 111 to project laser light.
  • The laser light can be an infrared laser or other invisible light, such as an ultraviolet laser, etc.
  • the image collector 12 can receive the laser light reflected by the object.
  • In this application, the laser being an infrared laser and the image collector 12 being an infrared camera is taken as an example, but the forms of the laser and the image collector 12 are not limited to this; for example, the laser can also be an ultraviolet laser and the image collector 12 an ultraviolet camera. Both the laser projector 11 and the image collector 12 are connected to the processor 40.
  • the processor 40 may provide an enable signal for the laser projector 11, specifically, the processor 40 may provide an enable signal for the first driver 112.
  • the image collector 12 is connected to the processor 40 through an I2C bus.
  • The image collector 12 can control the projection timing of the laser projector 11 through a strobe signal, where the strobe signal is generated according to the timing at which the image collector 12 acquires captured images. The strobe signal can be regarded as an electrical signal with alternating high and low levels, and the laser projector 11 projects laser light according to the laser projection timing indicated by the strobe signal.
  • the processor 40 can send an image acquisition instruction through the I2C bus to enable the depth camera 10 to work.
  • After the image collector 12 receives the image acquisition instruction, it controls the switching device 61 through the strobe signal. If the strobe signal is high, the switching device 61 sends a first pulse signal (pwn1) to the first driver 112, and the first driver 112 drives the laser light source 111 to project laser light into the scene according to the first pulse signal; if the strobe signal is low, the switching device 61 stops sending the first pulse signal to the first driver 112, and the laser light source 111 does not project laser light. Alternatively, when the strobe signal is low, the switching device 61 sends the first pulse signal to the first driver 112 and the first driver 112 drives the laser light source 111 to project laser light into the scene according to the first pulse signal, and when the strobe signal is high, the switching device 61 stops sending the first pulse signal to the first driver 112 and the laser light source 111 does not project laser light.
  • the strobe signal may not be used when the image collector 12 cooperates with the laser projector 11.
  • the processor 40 sends an image acquisition command to the image collector 12 and simultaneously sends a laser projection command to the first driver 112.
  • After the image collector 12 receives the image acquisition instruction, it starts to acquire captured images; when the first driver 112 receives the laser projection instruction, it drives the laser light source 111 to project laser light.
  • When the laser projector 11 projects laser light, the laser light forms a laser pattern with spots that is projected onto objects in the scene.
  • The image collector 12 collects the laser pattern reflected by the objects to obtain a speckle image and sends the speckle image to the processor 40 through the Mobile Industry Processor Interface (MIPI). Each time the image collector 12 sends a speckle image to the processor 40, the processor 40 receives one data stream. The processor 40 can calculate the depth image according to the speckle image and a reference image pre-stored in the processor 40.
  • the visible light camera 30 is also connected to the processor 40 through the I2C bus.
  • the visible light camera 30 can be used to collect visible light images. Each time the visible light camera 30 sends a frame of visible light image to the processor 40, the processor 40 receives a data stream.
  • the visible light camera 30 is used alone, that is, when the user only wants to obtain visible light images, the processor 40 sends an image acquisition instruction to the visible light camera 30 through the I2C bus to enable the visible light camera 30 to work.
  • the visible light camera 30 collects the visible light image of the scene after receiving the image acquisition instruction, and sends the visible light image to the processor 40 through the mobile industry processor interface.
  • When the visible light camera 30 is used in conjunction with the depth camera 10, that is, when the user wants to obtain a three-dimensional image from the visible light image and the depth image, and the image collector 12 and the visible light camera 30 work at the same operating frequency, the image collector 12 and the visible light camera 30 achieve hardware synchronization through a sync signal.
  • the processor 40 sends an image acquisition instruction to the image acquisition device 12 through the I2C bus.
  • After the image collector 12 receives the image acquisition instruction, it can control the switching device 61 through the strobe signal to send the first pulse signal (pwn1) to the first driver 112, so that the first driver 112 drives the laser light source 111 to emit laser light according to the first pulse signal. At the same time, the image collector 12 and the visible light camera 30 are synchronized by a sync signal, which controls the visible light camera 30 to collect visible light images.
  • the electronic device 100 also includes a floodlight 20.
  • the floodlight 20 can emit uniform surface light into the scene.
  • the floodlight 20 includes a floodlight source 21 and a second driver 22.
  • the second driver 22 can be used to drive the floodlight source 21 to emit uniform surface light.
  • the light emitted by the floodlight 20 may be infrared light or other invisible light, such as ultraviolet light. In this application, the infrared light emitted by the floodlight 20 is taken as an example for description, but the form of light emitted by the floodlight 20 is not limited to this.
  • the floodlight 20 is connected to the processor 40, and the processor 40 can provide an enable signal for the floodlight 20. Specifically, the processor 40 can provide an enable signal for the second driver 22.
  • the floodlight 20 can work with the image collector 12 to collect infrared images.
  • The image collector 12 can control the emission timing of the infrared light emitted by the floodlight 20 through a strobe signal (this strobe signal and the strobe signal with which the image collector 12 controls the laser projector 11 are two independent strobe signals).
  • the strobe signal is generated according to the timing of the image collector 12 acquiring the captured image.
  • The strobe signal can be regarded as an electrical signal with alternating high and low levels, and the floodlight 20 emits infrared light according to the infrared light emission timing indicated by the strobe signal.
  • the processor 40 may send an image acquisition instruction to the image acquisition device 12 via the I2C bus. After the image acquisition device 12 receives the image acquisition instruction, it controls the switching device 61 through the strobe signal. If the strobe signal is high, the switching device 61 sends a second pulse signal (pwn2) to the second driver 22. The second driver 22 controls the flood light source 21 to emit infrared light according to the second pulse signal.
  • If the strobe signal is low, the switching device 61 stops sending the second pulse signal to the second driver 22, and the flood light source 21 does not emit infrared light. Alternatively, when the strobe signal is low, the switching device 61 sends the second pulse signal to the second driver 22 and the second driver 22 controls the flood light source 21 to emit infrared light according to the second pulse signal, and when the strobe signal is high, the switching device 61 stops sending the second pulse signal to the second driver 22 and the flood light source 21 does not emit infrared light.
  • When the floodlight 20 emits infrared light, the image collector 12 receives the infrared light reflected by objects in the scene to form an infrared image and sends the infrared image to the processor 40 through the Mobile Industry Processor Interface. Each time the image collector 12 sends a frame of infrared image to the processor 40, the processor 40 receives one data stream. This infrared image is usually used for iris recognition, face recognition, and the like.
  • The image acquisition method includes: 01, projecting laser light onto a scene at a first operating frequency; 02, acquiring captured images at a second operating frequency, the second operating frequency being greater than the first operating frequency; 03, distinguishing, among the captured images, a first image collected when the laser projector 11 is not projecting laser light and a second image collected when the laser projector 11 is projecting laser light; and 04, calculating a depth image according to the first image, the second image, and a reference image.
  • the present application also provides an image acquisition device 90 used in the electronic device 100 of the embodiment shown in FIGS. 1-3.
  • the image acquisition method of this application can be implemented by the image acquisition device 90 of this application.
  • the image acquisition device 90 includes a transmission module 91, a first acquisition module 92, a distinguishing module 93 and a calculation module 94.
  • Step 01 can be implemented by the transmitting module 91.
  • Step 02 can be implemented by the first obtaining module 92.
  • Step 03 can be implemented by the distinguishing module 93.
  • Step 04 can be implemented by the calculation module 94.
  • the transmitting module 91 can be used to project laser light onto the scene at the first operating frequency.
  • the first acquisition module 92 may be used to acquire the acquired image at a second operating frequency, which is greater than the first operating frequency.
  • the distinguishing module 93 can be used to distinguish the first image collected when the laser projector 11 is not projecting laser light and the second image collected when the laser projector 11 is projecting laser light from the collected images.
  • the calculation module 94 may be used to calculate the depth image according to the first image, the second image, and the reference image. At this time, the transmitting module 91 is the laser projector 11, and the first acquiring module 92 is the image collector 12.
  • step 01 can be implemented by the laser projector 11.
  • step 02 can be implemented by the image collector 12.
  • Both step 03 and step 04 can be implemented by the processor 40.
  • the laser projector 11 can be used to project laser light to the scene at the first operating frequency.
  • the image collector 12 can be used to acquire images at a second operating frequency, and the second operating frequency is greater than the first operating frequency.
  • The processor 40 can be used to distinguish, among the captured images, the first image collected when the laser projector 11 is not projecting laser light and the second image collected when the laser projector 11 is projecting laser light, and to calculate the depth image according to the first image, the second image, and the reference image.
  • the processor 40 simultaneously sends an image acquisition instruction for acquiring a depth image to the image acquisition device 12 and the first driver 112 via the I2C bus.
  • After receiving the image acquisition instruction, the first driver 112 drives the laser light source 111 to emit infrared laser light to the scene at the first operating frequency; after receiving the image acquisition instruction, the image collector 12 acquires, at the second operating frequency, the infrared laser light reflected back by objects in the scene to obtain captured images. For example, as shown in FIG. 6, the solid line represents the timing of laser emission by the laser projector 11, the dashed line represents the timing at which the image collector 12 acquires captured images and the number of frames of the captured images, and the dot-dash line represents the number of frames of the third image obtained from the first image and the second image; from top to bottom in FIG. 6, the lines are the solid line, the dashed line, and the dot-dash line in sequence. In this example, the second operating frequency is twice the first operating frequency.
  • The image collector 12 first receives infrared light in the environment (hereinafter referred to as ambient infrared light) while the laser projector 11 is not projecting laser light, to obtain the Nth frame of captured image (the first image, which can also be called a background image), and sends the Nth frame of captured image to the processor 40 through the Mobile Industry Processor Interface. Then, while the laser projector 11 projects laser light, the image collector 12 receives both the ambient infrared light and the infrared laser emitted by the laser projector 11, to obtain the (N+1)th frame of captured image (the second image, which can also be called an interference speckle image), and sends the (N+1)th frame of captured image to the processor 40 through the Mobile Industry Processor Interface. Subsequently, the image collector 12 again receives ambient infrared light while the laser projector 11 is not projecting laser light, to obtain the (N+2)th frame of captured image (again a first image), and sends it to the processor 40 through the Mobile Industry Processor Interface, and so on; the image collector 12 obtains the first image and the second image alternately.
  • the processor 40 sends a collection instruction for acquiring a depth image to the image collector 12 through the I2C bus.
  • After the image collector 12 receives the image acquisition instruction, it controls the switching device 61 through the strobe signal to send a first pulse signal to the first driver 112, and the first driver 112 drives the laser light source 111 to project laser light at the first operating frequency according to the first pulse signal.
  • While the laser projector 11 projects laser light at the first operating frequency, the image collector 12 collects the infrared laser light reflected by objects in the scene at the second operating frequency to obtain captured images. As shown in FIG. 6, the solid line represents the timing of laser emission by the laser projector 11, the dashed line represents the timing at which the image collector 12 acquires captured images and the number of frames of the captured images, and the dot-dash line represents the number of frames of the third image obtained from the first image and the second image; from top to bottom in FIG. 6, the lines are the solid line, the dashed line, and the dot-dash line in sequence, and the second operating frequency is twice the first operating frequency. The image collector 12 first receives ambient infrared light while the laser projector 11 is not projecting laser light, to obtain the Nth frame of captured image (the first image, also called the background image), and sends the Nth frame of captured image to the processor 40 through the Mobile Industry Processor Interface; then, while the laser projector 11 projects laser light, the image collector 12 receives both the ambient infrared light and the infrared laser emitted by the laser projector 11, to obtain the (N+1)th frame of captured image (the second image, also called the interference speckle image), and sends the (N+1)th frame of captured image to the processor 40 through the Mobile Industry Processor Interface; subsequently, the image collector 12 again receives ambient infrared light while the laser projector 11 is not projecting laser light, to obtain the (N+2)th frame of captured image (again a first image), and sends the (N+2)th frame of captured image to the processor 40 through the Mobile Industry Processor Interface, and so on; the image collector 12 obtains the first image and the second image alternately.
  • It should be noted that the image collector 12 may continue to acquire captured images while sending captured images to the processor 40.
  • the image collector 12 may also obtain the second image first, then obtain the first image, and alternately perform the acquisition of the collected images according to this sequence.
  • The above-mentioned multiple relationship between the second operating frequency and the first operating frequency is only an example; in other embodiments, the multiple between the second operating frequency and the first operating frequency may also be three, four, five, six, and so on.
  • After the processor 40 receives a frame of captured image, it distinguishes the received captured image and determines whether the captured image is a first image or a second image. After the processor 40 has received at least one frame of the first image and at least one frame of the second image, it can calculate the depth image according to the first image, the second image, and the reference image. Specifically, since the first image is collected when the laser projector 11 is not projecting laser light, the light forming the first image includes only ambient infrared light, while the second image is collected when the laser projector 11 is projecting laser light, so the light forming the second image includes both the ambient infrared light and the infrared laser emitted by the laser projector 11. Therefore, the processor 40 can remove, according to the first image, the portion of the second image formed by the ambient infrared light, thereby obtaining a captured image formed only by the infrared laser (that is, the speckle image formed by the infrared laser).
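  • As a rough illustration of the ambient-light removal described above, the sketch below performs a per-pixel subtraction of a first image from a second image; it is a minimal NumPy example, not the patent's implementation, and the function name and clipping behaviour are assumptions.

```python
import numpy as np

def remove_ambient_infrared(first_image: np.ndarray, second_image: np.ndarray) -> np.ndarray:
    """Estimate the speckle-only ("third") image from a first/second image pair.

    first_image  -- frame captured with the laser projector off (ambient infrared only)
    second_image -- frame captured with the laser projector on (ambient infrared + laser speckle)
    Both are assumed to be single-channel 8-bit arrays of the same shape.
    """
    first = first_image.astype(np.int16)
    second = second_image.astype(np.int16)
    # Per-pixel subtraction; negative values caused by sensor noise are clipped to zero.
    return np.clip(second - first, 0, 255).astype(np.uint8)

# Usage sketch: third = remove_ambient_infrared(frame_n, frame_n_plus_1)
```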
  • It can be understood that the ambient light includes infrared light with the same wavelength as the infrared laser emitted by the laser projector 11 (for example, ambient infrared light at 940nm), and when the image collector 12 acquires captured images, this part of the infrared light is also received by the image collector 12.
  • When the brightness of the scene is high, the proportion of ambient infrared light in the light received by the image collector 12 increases, making the laser speckles in the captured image inconspicuous and thereby affecting the calculation of the depth image.
  • The image acquisition method of the present application controls the laser projector 11 and the image collector 12 to work at different operating frequencies, so the image collector 12 can collect both the first image formed only by ambient infrared light and the second image formed by the ambient infrared light together with the infrared laser emitted by the laser projector 11. The image portion formed by the ambient infrared light can be removed from the second image based on the first image, and the captured image formed only by the infrared laser can then be used to calculate the depth image. Laser speckle matching is thus not affected, partial or complete loss of depth information can be avoided, and the accuracy of the depth image is improved.
  • In some embodiments, step 03 includes: 031, adding an image type to each frame of captured image; and 032, distinguishing the first image from the second image according to the image type. Step 031 includes: 0311, determining, according to the collection time of each frame of captured image, the working state of the laser projector 11 at that collection time; and 0312, adding the image type to each frame of captured image according to the working state.
  • step 031, step 032, step 0311, and step 0312 can all be implemented by the distinguishing module 93.
  • the distinguishing module 93 can also be used to add an image type to each frame of the captured image, and to distinguish the first image from the second image according to the image type.
  • The distinguishing module 93, when used to add an image type to each frame of captured image, is specifically used to determine the working state of the laser projector 11 at the collection time according to the collection time of each frame of captured image, and to add the image type to each frame of captured image according to the working state.
  • step 031, step 032, step 0311, and step 0312 may all be implemented by the processor 40.
  • the processor 40 may also be used to add an image type to each frame of the captured image, and to distinguish the first image from the second image according to the image type.
  • The processor 40, when used to add an image type to each frame of captured image, is specifically used to determine the working state of the laser projector 11 at the collection time according to the collection time of each frame of captured image, and to add the image type to each frame of captured image according to the working state.
  • the processor 40 adds an image type (stream_type) to the captured image every time it receives a frame of the captured image from the image collector 12, so that the first image and the second image can be distinguished according to the image type in subsequent processing.
  • Specifically, the processor 40 monitors the working state of the laser projector 11 in real time via the I2C bus. Each time the processor 40 receives a frame of captured image from the image collector 12, it first acquires the collection time of the captured image, then determines, according to the collection time of the captured image, whether the working state of the laser projector 11 at that collection time was projecting laser light or not projecting laser light, and adds the image type to the captured image based on the judgment result.
  • The collection time of a captured image may be the start time or the end time at which the image collector 12 obtains that frame of captured image, or any time between the start time and the end time.
  • the structure of the image type stream_type is shown in Table 1:
  • When stream in Table 1 is 0, it means that the data stream at this time is an image formed by infrared light and/or an infrared laser.
  • When light is 00, it means that the data stream at this time was acquired without any device projecting infrared light and/or an infrared laser (only ambient infrared light); the processor 40 can then add an image type of 000 to the captured image to identify this captured image as a first image.
  • When light is 01, it means that the data stream at this time was obtained while the laser projector 11 was projecting infrared laser light (both ambient infrared light and infrared laser light); the processor 40 can then add an image type of 001 to the captured image to identify this captured image as a second image.
  • the processor 40 can then distinguish the image types of the collected images according to stream_type.
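  • As an illustration of this bookkeeping, the sketch below tags each received frame with a stream_type-like value derived from the projector's recorded on/off intervals at the frame's collection time; the interval representation and helper names are assumptions for illustration, not the patent's data structures.

```python
from dataclasses import dataclass
from typing import List, Tuple

STREAM_IR = 0          # stream bit 0: data stream formed by infrared light and/or infrared laser
LIGHT_AMBIENT = 0b00   # light 00: no device projecting, ambient infrared only -> first image (000)
LIGHT_LASER = 0b01     # light 01: laser projector projecting                  -> second image (001)

@dataclass
class TaggedFrame:
    image: object        # raw frame data
    collect_time: float  # collection time recorded for the frame
    stream_type: int     # 3-bit image type: (stream << 2) | light

def laser_on_at(t: float, laser_intervals: List[Tuple[float, float]]) -> bool:
    """Return True if the laser projector was projecting laser light at time t."""
    return any(start <= t < end for start, end in laser_intervals)

def tag_frame(image, collect_time: float, laser_intervals: List[Tuple[float, float]]) -> TaggedFrame:
    """Add a stream_type-style image type to a received frame based on the projector state."""
    light = LIGHT_LASER if laser_on_at(collect_time, laser_intervals) else LIGHT_AMBIENT
    return TaggedFrame(image, collect_time, (STREAM_IR << 2) | light)
```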
  • In some embodiments, step 04 includes: 041, calculating a third image according to the first image and the second image, where the difference between the acquisition time of the first image and the acquisition time of the second image is less than a predetermined difference; and 042, calculating the depth image according to the third image and the reference image.
  • Both step 041 and step 042 can be implemented by the calculation module 94. That is to say, the calculation module 94 can be used to calculate the third image according to the first image and the second image and to calculate the depth image according to the third image and the reference image, where the difference between the acquisition time of the first image and the acquisition time of the second image is less than the predetermined difference.
  • In some embodiments, both step 041 and step 042 can also be implemented by the processor 40.
  • the processor 40 may also be used to calculate a third image based on the first image and the second image, and calculate a depth image based on the third image and the reference image, where the acquisition time of the first image is the same as that of the second image. The difference in acquisition time is less than the predetermined difference.
  • Specifically, the processor 40 may first distinguish the first image from the second image, and then, according to the acquisition time, select any frame of second image and the specific frame of first image corresponding to that frame of second image, where the difference between the acquisition time of the specific frame of first image and the acquisition time of that frame of second image is less than the predetermined difference. Subsequently, the processor 40 calculates a third image based on the specific frame of first image and that frame of second image. The third image is the captured image formed only by the infrared laser emitted by the laser projector 11, also called the actual speckle image.
  • Subsequently, the processor 40 can calculate the depth image according to the third image and the reference image, where the number of frames of the second image, the number of frames of the third image, and the number of frames of the depth image are all equal. It can be understood that, since the difference between the acquisition time of the first image and the acquisition time of the second image is small, the intensity of the ambient infrared light in the first image is closer to the intensity of the ambient infrared light in the second image, so the third image calculated from the first image and the second image is more accurate, which helps to further reduce the influence of ambient infrared light on the acquisition of the depth image.
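  • The patent does not spell out the matching algorithm used between the third image and the reference image. As one hedged illustration of how such a speckle image could be compared against a reference image, the sketch below searches for a per-block horizontal shift and converts it to depth with a reference-plane triangulation relation; the block size, the search range, and the parameters f (focal length in pixels), b (baseline), and z0 (reference-plane distance) are illustrative assumptions, not values from the patent.

```python
import numpy as np

def block_disparity(third: np.ndarray, reference: np.ndarray,
                    block: int = 16, max_shift: int = 32) -> np.ndarray:
    """Per-block horizontal shift (pixels) between the actual speckle image and the reference image."""
    h, w = third.shape
    disp = np.zeros((h // block, w // block), dtype=np.float32)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            patch = third[y:y + block, x:x + block].astype(np.float32)
            best_score, best_shift = -np.inf, 0
            for s in range(-max_shift, max_shift + 1):
                if x + s < 0 or x + s + block > w:
                    continue
                ref = reference[y:y + block, x + s:x + s + block].astype(np.float32)
                # Zero-mean correlation as a simple matching score.
                score = np.sum((patch - patch.mean()) * (ref - ref.mean()))
                if score > best_score:
                    best_score, best_shift = score, s
            disp[by, bx] = best_shift
    return disp

def disparity_to_depth(disp: np.ndarray, f: float, b: float, z0: float) -> np.ndarray:
    """Convert disparity against a reference plane at distance z0 into depth.

    Uses the common structured-light relation 1/z = 1/z0 + d/(f*b); the sign of d
    depends on the chosen shift direction, so this is a sketch, not a calibration.
    """
    return 1.0 / (1.0 / z0 + disp / (f * b))
```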
  • In some embodiments, the processor 40 may also add an image type to the third image and the depth image, so as to distinguish the data streams obtained after processing the captured images, as shown in Table 2. The processor 40 can add an image type of 011 to the data stream obtained after background subtraction processing to identify this data stream as the third image.
  • When light is XX, X indicates that the value is not limited; the processor 40 may add an image type of 1XX to the data stream obtained after performing the depth calculation to identify this data stream as a depth image.
  • The acquisition time of the first image may be before or after the acquisition time of the second image; this is not restricted.
  • The first image and the second image may be images of adjacent frames or of non-adjacent frames. When the multiple between the second operating frequency and the first operating frequency is two, the first image and the second image whose acquisition-time difference is less than the predetermined difference are images of adjacent frames; when the multiple between the second operating frequency and the first operating frequency is greater than two, the first image and the second image whose acquisition-time difference is less than the predetermined difference can be images of adjacent frames or of non-adjacent frames (in the latter case, there may still be a frame of first image between the first image and the second image).
  • the number of frames of the first image participating in the depth image calculation may also be multiple frames.
  • Specifically, the processor 40 may first perform fusion processing on the two frames of first image, for example adding the pixel values of the corresponding pixels of the two frames of first image and then taking the average to obtain a fusion-processed first image, and then use the fusion-processed first image and the adjacent frame of second image to calculate the third image.
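  • A minimal sketch of that fusion step, averaging corresponding pixels of two first images before the subtraction, could look as follows (it reuses the remove_ambient_infrared helper sketched earlier, which is itself an assumption):

```python
import numpy as np

def fuse_first_images(first_a: np.ndarray, first_b: np.ndarray) -> np.ndarray:
    """Average two ambient-only frames pixel by pixel before background subtraction."""
    return ((first_a.astype(np.uint16) + first_b.astype(np.uint16)) // 2).astype(np.uint8)

# Usage sketch, assuming frame_n and frame_n_plus_2 are first images and frame_n_plus_1 is a second image:
# fused = fuse_first_images(frame_n, frame_n_plus_2)
# third = remove_ambient_infrared(fused, frame_n_plus_1)
```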
  • In some embodiments, the processor 40 may calculate multiple frames of third image, such as the (N+1)−N frame of third image, the (N+3)−(N+2) frame of third image, the (N+5)−(N+4) frame of third image, and so on in FIG. 6, and calculate multiple frames of depth image corresponding to the multiple frames of third image.
  • the processor 40 may also calculate only one frame of the third image, and calculate a frame of depth image corresponding to one frame of the third image.
  • The number of frames of the third image can be determined according to the security level of the application scenario. Specifically, when the security level of the application scenario is high, for example an application scenario such as payment, the number of frames of the third image should be larger, so that the payment action is executed only when matching succeeds for multiple frames, thereby improving the security of payment. For an application scenario with a lower security level, for example portrait beautification based on depth information, the number of frames of the third image can be smaller, for example one frame; one frame of depth image is then sufficient for portrait beautification. In this way, the amount of computation and the power consumption of the processor 40 can be reduced, and the image processing speed can be increased.
  • In some embodiments, the image acquisition method further includes: 05, collecting visible light images at a third operating frequency, the third operating frequency being greater than or less than the second operating frequency; 06, adding a collection time to each frame of visible light image and each frame of captured image; and 07, determining the frame-synchronized visible light image and second image according to the collection time of the visible light image, the collection time of the captured image, and the image type of the captured image.
  • the image acquisition device 90 further includes an acquisition module 95, an adding module 96, and a determining module 97.
  • Step 05 can be implemented by the acquisition module 95.
  • Step 06 can be implemented by the adding module 96.
  • Step 07 can be implemented by the determining module 97.
  • the collection module 95 can be used to collect visible light images at a third operating frequency, which is greater than or less than the second operating frequency.
  • the adding module 96 can be used to add a collection time for each frame of visible light image and each frame of collected image.
  • the determining module 97 may be used to determine the frame-synchronized visible light image and the second image according to the acquisition time of the visible light image, the acquisition time of the acquired image, and the image type of the acquired image.
  • the collection module 95 is the visible light camera 30.
  • step 05 may be implemented by a visible light camera 30.
  • Step 06 and step 07 can be implemented by the processor 40.
  • The visible light camera 30 can be used to collect visible light images at a third operating frequency, which is greater than or less than the second operating frequency.
  • The processor 40 can be used to add a collection time to each frame of visible light image and each frame of captured image, and to determine the frame-synchronized visible light image and second image according to the collection time of the visible light image, the collection time of the captured image, and the image type of the captured image.
  • In some application scenarios, for example in the application scenario of three-dimensional modeling of objects in the scene, it is necessary to use the depth camera 10 to obtain the depth information of objects in the scene and the visible light camera 30 to obtain the color information of objects in the scene, in order to achieve three-dimensional modeling.
  • the processor 40 needs to turn on the depth camera 10 to obtain a depth image and simultaneously turn on the visible light camera 30 to obtain a visible light image.
  • Specifically, the processor 40 can send an image acquisition instruction to the image collector 12 through the I2C bus. After the image collector 12 receives the image acquisition instruction, the image collector 12 and the visible light camera 30 are synchronized by a sync signal, which controls the visible light camera 30 to start collecting visible light images, so as to achieve hardware synchronization between the image collector 12 and the visible light camera 30. In this case, the number of frames of captured images is consistent with the number of frames of visible light images, and each frame of captured image corresponds one-to-one with a frame of visible light image.
  • However, when the third operating frequency is not equal to the second operating frequency, the processor 40 needs to synchronize the image collector 12 and the visible light camera 30 through software synchronization. Specifically, the processor 40 sends an image acquisition instruction to the image collector 12 through the I2C bus connected to the image collector 12, and at the same time sends an image acquisition instruction to the visible light camera 30 through the I2C bus connected to the visible light camera 30.
  • Whenever the processor 40 receives a frame of captured image, it adds an image type to that frame of captured image and also adds a collection time to that frame of captured image. In addition, each time the processor 40 receives a frame of visible light image, it adds a collection time to that frame of visible light image.
  • The collection time of a captured image can be the start time or the end time at which the image collector 12 collects that frame of captured image, or any time between them; the collection time of a visible light image can be the start time or the end time at which the visible light camera 30 collects that frame of visible light image, or any time between them, and so on.
  • In this case, the processor 40 may first determine the frame-synchronized visible light image and second image according to the collection time of the visible light image, the collection time of the captured image, and the image type of the captured image.
  • frame synchronization means that the determined difference between the acquisition time of the second image and the acquisition time of the visible light image is less than the preset time difference.
  • The acquisition time of the visible light image may be before or after the acquisition time of the second image.
  • the processor 40 selects the first image according to the determined second image to further calculate the depth image according to the second image, the first image, and the reference image. Finally, the processor 40 performs subsequent processing based on the depth image and the determined visible light image.
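  • The frame-synchronization rule described above, pairing frames whose collection times differ by less than a preset threshold, can be sketched as a nearest-timestamp match; the data layout and the threshold value below are illustrative assumptions.

```python
def match_frames(second_images, visible_images, max_dt=0.01):
    """Pair each second image with the visible-light frame whose collection time is closest,
    keeping only pairs whose time difference is below max_dt (seconds).

    second_images / visible_images: lists of (collect_time, frame) tuples.
    Returns a list of (second_image_frame, visible_light_frame) pairs.
    """
    pairs = []
    for t_depth, depth_frame in second_images:
        best = min(visible_images, key=lambda item: abs(item[0] - t_depth), default=None)
        if best is not None and abs(best[0] - t_depth) < max_dt:
            pairs.append((depth_frame, best[1]))
    return pairs
```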
  • the processor 40 may also add acquisition time to each frame of depth image, and then determine the frame-synchronized visible light image and the depth image according to the acquisition time of the visible light image and the acquisition time of the depth image, and finally the frame synchronization The visible light image and depth image are processed later.
  • the acquisition time of each frame of depth image is the acquisition time of the second image corresponding to the frame of depth image.
  • In some embodiments, the captured images also include infrared images, where an infrared image is an image obtained by the image collector 12 collecting the infrared light emitted by the floodlight 20.
  • When the processor 40 adds an image type to each frame of captured image, it also adds an image type to the infrared image.
  • the image types of infrared images are shown in Table 3:
  • When stream in Table 3 is 0, it means that the data stream at this time is an image formed by infrared light and/or an infrared laser. When light is 10, it means that the data stream at this time was obtained while the floodlight 20 was projecting infrared light and the laser projector 11 was not projecting laser light. When the processor 40 adds an image type of 010 to a captured image, it identifies that frame of captured image as an infrared image.
  • the image collector 12 needs to be used in conjunction with the floodlight 20 and the laser projector 11.
  • The image collector 12 can obtain the first image, the second image, and the infrared image in a time-sharing manner. As shown in FIG. 12, the solid line represents the timing of laser emission by the laser projector 11, the double-dot chain line represents the timing of infrared light emitted by the floodlight 20, the dashed line represents the timing at which the image collector 12 acquires captured images and the number of frames of the captured images, and the dash-dot line represents the number of frames of the third image obtained from the first image and the second image. In this example, the second operating frequency is three times the first operating frequency, and the second operating frequency is three times the fourth operating frequency.
  • The processor 40 can monitor the working state of the floodlight 20 in real time via the I2C bus. Each time the processor 40 receives a frame of captured image from the image collector 12, it first acquires the collection time of the captured image, then determines, according to the collection time of the captured image, whether the working state of the floodlight 20 at that collection time was emitting infrared light or not emitting infrared light, and adds the image type to the captured image based on the judgment result.
  • The processor 40 may subsequently determine, based on the acquisition time of the infrared image and the acquisition time of the second image, an infrared image and a second image whose acquisition-time difference is less than a set difference. Further, the processor 40 may determine the depth image from that second image and use the infrared image and the depth image for identity verification.
  • In some embodiments, the image acquisition method further includes: 08, acquiring the brightness and the type of the scene; and 09, determining whether the brightness is greater than a brightness threshold and the type is an outdoor scene; step 01 (projecting laser light onto the scene at the first operating frequency) is executed when the brightness is greater than the brightness threshold and the type is an outdoor scene.
  • the image acquisition device 90 further includes a second acquisition module 98 and a judgment module 99.
  • Step 08 can be implemented by the second acquisition module 98.
  • Step 09 can be implemented by the judgment module 99.
  • the second acquiring module 98 can be used to acquire the brightness and type of the scene.
  • the judging module 99 can be used to judge whether the brightness is greater than the brightness threshold and the type is an outdoor scene.
  • The transmitting module 91 may be used to project laser light onto the scene at the first operating frequency when the brightness is greater than the brightness threshold and the type is an outdoor scene.
  • both step 08 and step 09 can be implemented by the processor 40.
  • the processor 40 can be used to obtain the brightness and type of the scene and determine whether the brightness is greater than the brightness threshold and the type is an outdoor scene.
  • the laser projector 11 can be used to project laser light to the scene at the first operating frequency when the brightness is greater than the brightness threshold and the type is an outdoor scene.
  • The brightness of the scene can be obtained by analyzing the captured image obtained by the image collector 12 or the visible light image obtained by the visible light camera 30; alternatively, the brightness of the scene can be detected directly by a light sensor, and the processor 40 reads the detected signal from the light sensor to obtain the brightness of the scene.
  • The type of the scene can be obtained by analyzing the captured image obtained by the image collector 12 or the visible light image obtained by the visible light camera 30, for example by analyzing the captured image or the visible light image to determine whether the scene is an outdoor scene or an indoor scene; the type of the scene can also be determined directly according to the geographic location.
  • Specifically, the processor 40 can obtain the positioning result of a global satellite positioning system for the scene and then judge the type of the scene according to the positioning result. For example, if the positioning result is a certain office building, the scene is an indoor scene; if the positioning result is a certain park, the scene is an outdoor scene; if the positioning result is a certain street, the scene is an outdoor scene, and so on.
  • It can be understood that when the brightness of the scene is high, the ambient infrared light takes up a larger proportion of the captured image and has a greater impact on spot recognition; at this time, the interference of the ambient infrared light needs to be removed. When the brightness of the scene is low, the proportion of ambient infrared light in the captured image is smaller, its impact on spot recognition is small and can be ignored.
  • In that case, the image collector 12 and the laser projector 11 can work at the same operating frequency, and the processor 40 calculates the depth image directly from the captured image (i.e., the second image) acquired by the image collector 12 and the reference image. In this way, the operating frequency of the image collector 12 can be reduced, and the power consumption of the image collector 12 can be reduced.
  • the image acquisition method can also determine whether to perform step 01 based only on the brightness of the scene.
  • the processor 40 only obtains the brightness of the scene, determines whether the brightness of the scene is greater than the brightness threshold, and the laser projector 11 projects laser light onto the scene at the first operating frequency when the brightness is greater than the brightness threshold.
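  • The mode decision described in this passage can be summarized as in the sketch below; the threshold value and the way brightness and scene type are obtained are placeholders rather than values from the patent.

```python
from typing import Optional

def use_dual_frequency_mode(brightness: float,
                            scene_type: Optional[str] = None,
                            brightness_threshold: float = 600.0) -> bool:
    """Decide whether the laser projector and image collector should run at different
    operating frequencies so that ambient infrared light can be removed.

    brightness -- scene brightness estimated from a captured/visible image or read from a light sensor
    scene_type -- "outdoor", "indoor", or None when the decision uses brightness only
    """
    if scene_type is None:
        return brightness > brightness_threshold
    return brightness > brightness_threshold and scene_type == "outdoor"
```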
  • the processor 40 may also add status information (status) to each data stream.
  • In Table 4, 0000 indicates the first image; 0010 indicates the second image; 0100 indicates the infrared image acquired by the image collector 12 when the floodlight 20 is turned on; 0111 indicates the third image; 1XX1 indicates a depth image obtained after background subtraction processing; and 1XX0 indicates a depth image obtained without background subtraction processing. In this way, status information is added to each data stream so that the processor 40 can distinguish whether each data stream has undergone background subtraction processing.
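  • For reference, the status codes listed above can be collected as constants; the mapping below restates the values given in the text (the enum name, and the representative values chosen for the unrestricted 1XX1/1XX0 codes, are assumptions).

```python
from enum import IntEnum

class StreamStatus(IntEnum):
    FIRST_IMAGE = 0b0000        # frame captured with the laser projector off (background image)
    SECOND_IMAGE = 0b0010       # frame captured with the laser projector on (interference speckle image)
    INFRARED_IMAGE = 0b0100     # frame captured with the floodlight 20 turned on
    THIRD_IMAGE = 0b0111        # speckle image after background subtraction
    # For depth images the middle bits are unrestricted ("1XX1" / "1XX0");
    # only one representative value of each family is listed here.
    DEPTH_SUBTRACTED = 0b1001   # depth image computed after background subtraction (a 1XX1 code)
    DEPTH_RAW = 0b1000          # depth image computed without background subtraction (a 1XX0 code)
```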
  • the processor 40 includes a first storage area, a second storage area, and a logical subtraction circuit, and the logical subtraction circuit is connected to both the first storage area and the second storage area.
  • the first storage area is used to store the first image
  • the second storage area is used to store the second image
  • the logical subtraction circuit is used to process the first image and the second image to obtain the third image.
  • The logical subtraction circuit reads the first image from the first storage area and the second image from the second storage area, and, after acquiring both, performs subtraction on the first image and the second image to obtain the third image.
  • The logical subtraction circuit is also connected to a depth calculation module in the processor 40 (for example, a dedicated depth-calculation ASIC).
  • The logical subtraction circuit sends the third image to the depth calculation module, and the depth calculation module calculates the depth image from the third image and the reference image. A sketch of this subtraction and hand-off is given after this list.
  • the present application also provides a non-volatile computer-readable storage medium 200 containing computer-readable instructions.
  • When the computer-readable instructions are executed by the processor 300, the processor 300 is caused to execute the image acquisition method described in any one of the foregoing embodiments.
  • the processor 300 may be the processor 40 in FIG. 1.
  • For example, when the computer-readable instructions are executed by the processor 300, the processor 300 is caused to perform the following steps: projecting laser light onto a scene at a first operating frequency; acquiring collected images at a second operating frequency greater than the first operating frequency; distinguishing, among the collected images, a first image collected while the laser projector 11 is not projecting laser light and a second image collected while the laser projector 11 is projecting laser light; and calculating a depth image from the first image, the second image, and a reference image.
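As a purely illustrative sketch (not part of the original disclosure), the brightness-and-scene-type gate described above could look roughly as follows in Python. The names estimate_brightness, classify_scene, need_dual_frequency and BRIGHTNESS_THRESHOLD are assumptions introduced here for illustration only.

    import numpy as np

    BRIGHTNESS_THRESHOLD = 128   # assumed 8-bit mean-luminance threshold
    OUTDOOR = "outdoor"
    INDOOR = "indoor"

    def estimate_brightness(visible_image: np.ndarray) -> float:
        """Mean luminance of a visible-light frame (H x W x 3, uint8)."""
        # Rec. 601 luma weights; any reasonable luminance estimate would do.
        return float(np.dot(visible_image[..., :3].mean(axis=(0, 1)),
                            [0.299, 0.587, 0.114]))

    def classify_scene(positioning_result: str) -> str:
        """Very rough indoor/outdoor decision from a geolocation label."""
        outdoor_keywords = ("park", "street")
        is_outdoor = any(k in positioning_result.lower() for k in outdoor_keywords)
        return OUTDOOR if is_outdoor else INDOOR

    def need_dual_frequency(visible_image: np.ndarray, positioning_result: str) -> bool:
        """True when ambient-infrared removal (dual-frequency mode) is worthwhile:
        the scene is bright AND the scene is an outdoor scene."""
        bright = estimate_brightness(visible_image) > BRIGHTNESS_THRESHOLD
        outdoor = classify_scene(positioning_result) == OUTDOOR
        return bright and outdoor

If need_dual_frequency returned True, the laser projector would be driven at the first operating frequency and the image collector at the higher second operating frequency; otherwise both could work at the same frequency, as described above.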
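Similarly, here is a minimal sketch of how the stream_type and status bits of Table 4 might be packed and unpacked. The bit layout shown (stream, light, status) is inferred from the description above and is an assumption, not a normative register definition of the device.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class StreamTag:
        stream: int  # 0: infrared/laser image, 1: depth image
        light: int   # 00 ambient only, 01 laser, 10 floodlight, 11 background-subtracted
        status: int  # 0: no background subtraction, 1: background subtraction done

        def encode(self) -> int:
            # Pack as stream (1 bit) | light (2 bits) | status (1 bit),
            # e.g. 0b0111 corresponds to the third image.
            return (self.stream << 3) | (self.light << 1) | self.status

        @staticmethod
        def decode(code: int) -> "StreamTag":
            return StreamTag(stream=(code >> 3) & 0b1,
                             light=(code >> 1) & 0b11,
                             status=code & 0b1)

    # Examples matching Table 4: 0000 first image, 0010 second image,
    # 0100 infrared image, 0111 third image.
    assert StreamTag(0, 0b00, 0).encode() == 0b0000
    assert StreamTag(0, 0b01, 0).encode() == 0b0010
    assert StreamTag(0, 0b10, 0).encode() == 0b0100
    assert StreamTag(0, 0b11, 1).encode() == 0b0111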
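Finally, a sketch of the background subtraction performed by the logical subtraction circuit and the hand-off to the depth calculation module. The per-pixel subtraction follows the description above; the depth computation itself is only stubbed, since the real module (e.g. a dedicated ASIC) matches speckles against the reference image in hardware.

    import numpy as np

    def subtract_background(first_image: np.ndarray, second_image: np.ndarray) -> np.ndarray:
        """Third image = second image (ambient IR + laser speckle) minus
        first image (ambient IR only), clamped so pixels never go negative."""
        third = second_image.astype(np.int16) - first_image.astype(np.int16)
        return np.clip(third, 0, 255).astype(np.uint8)

    def compute_depth(third_image: np.ndarray, reference_image: np.ndarray) -> np.ndarray:
        """Placeholder for the depth calculation module: the real device would
        match speckles between the third image and the reference image and
        triangulate a depth value per pixel."""
        # Stub: return a zero depth map with the same spatial size.
        return np.zeros(third_image.shape[:2], dtype=np.float32)

    # Example with synthetic frames of the same size.
    ambient = np.full((480, 640), 40, dtype=np.uint8)                          # first image
    speckle = ambient + np.random.randint(0, 60, (480, 640), dtype=np.uint8)   # second image
    depth = compute_depth(subtract_background(ambient, speckle),
                          reference_image=ambient)  # placeholder reference image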

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Computing Systems (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An image acquisition method, an image acquisition apparatus (90), an electronic device (100), and a readable storage medium. The image acquisition method includes: projecting laser light onto a scene at a first operating frequency; acquiring collected images at a second operating frequency greater than the first operating frequency; distinguishing, among the collected images, a first image collected while the laser projector is not projecting laser light and a second image collected while the laser projector is projecting laser light; and calculating a depth image from the first image, the second image, and a reference image.

Description

图像获取方法、图像获取装置、电子设备和可读存储介质
优先权信息
本申请请求2019年5月24日向中国国家知识产权局提交的、专利申请号为201910437665.1的专利申请的优先权和权益,并且通过参照将其全文并入此处。
技术领域
本申请涉及成像技术领域,特别涉及一种图像获取方法、图像获取装置、电子设备和非易失性计算机可读存储介质。
背景技术
深度相机通过向场景投射带有斑点的激光图案来获取场景的深度信息。具体地,深度相机向场景中投射出红外激光(例如940nm的红外激光),红外激光形成散斑图案,深度相机采集被场景中物体反射形成的散斑图案以进行场景中物体的深度信息的获取。然而,如果在亮度较高的场景下使用深度相机,例如处于阳光强烈的户外场景下使用深度相机,此时的环境光线中包含有大量940nm的红外光,这部分红外光会进入到深度相机中成像,导致散斑图案成像与环境红外光成像的亮度比较接近,算法无法区分激光散斑点,导致激光散斑匹配失败,深度信息出现部分或全部缺失。
发明内容
本申请实施方式提供了一种图像获取方法、图像获取装置、电子设备和非易失性计算机可读存储介质。
本申请实施方式的图像获取方法用于电子设备。所述电子设备包括深度相机,所述深度相机包括激光投射器,所述图像获取方法包括:以第一工作频率向场景投射激光;以第二工作频率获取采集图像,所述第二工作频率大于所述第一工作频率;在所述采集图像中区分出在所述激光投射器未投射激光时采集的第一图像及在所述激光投射器投射激光时采集的第二图像;根据所述第一图像、所述第二图像及参考图像计算深度图像。
本申请实施方式的图像获取装置用于电子设备。所述电子设备包括深度相机,所述深度相机包括激光投射器。所述图像获取装置包括:发射模块、第一获取模块、区分模块和计算模块。发射模块用于以第一工作频率向场景投射激光。第一获取模块用于以第二工作频率获取采集图像,所述第二工作频率大于所述第一工作频率。区分模块用于在所述采集图像中区分出在所述激光投射器未投射激光时采集的第一图像及在所述激光投射器投射激光时采集的第二图像。计算模块根据所述第一图像、所述第二图像及参考图像计算深度图像。
本申请实施方式的电子设备包括深度相机和处理器。所述深度相机包括激光投射器和图像采集器。所述激光投射器用于以第一工作频率向场景投射激光。所述图像采集器用于以第二工作频率获取采集图像,所述第二工作频率大于所述第一工作频率。所述处理器用于:在所述采集图像中区分出在所述激光投射器未投射激光时采集的第一图像及在所述激光投射器投射激光时采集的第二图像;根据所述第一图 像、所述第二图像及参考图像计算深度图像。
本申请实施方式的包含计算机可读指令的非易失性计算机可读存储介质,所述计算机可读指令被处理器执行时,使得所述处理器执行上述的图像获取方法。
本申请实施方式的图像获取方法、图像获取装置、电子设备和非易失性计算机可读存储介质,激光投射器与图像采集器以不同的工作频率工作,图像采集器可以采集到仅由环境红外光形成的第一图像以及同时由环境红外光和激光投射器发射的红外激光形成的第二图像,并基于第一图像去除掉第二图像中由环境红外光形成的图像部分,由此能够区分出激光散斑点,并能采用仅由激光投射器发射的红外激光形成的采集图像来计算深度图像,激光散斑匹配不受影响,可以避免深度信息出现部分或全部缺失,从而提升深度图像的精确度。
本申请实施方式的附加方面和优点将在下面的描述中部分给出,部分将从下面的描述中变得明显,或通过本申请的实践了解到。
附图说明
本申请的上述和/或附加的方面和优点可以从结合下面附图对实施方式的描述中将变得明显和容易理解,其中:
图1和2是本申请某些实施方式的电子设备的结构示意图。
图3是本申请某些实施方式的电子设备的系统架构示意图。
图4是本申请某些实施方式的图像获取方法的流程示意图。
图5是本申请某些实施方式的图像获取装置的模块示意图。
图6是本申请某些实施方式的图像获取方法的原理示意图。
图7至图10是本申请某些实施方式的图像获取方法的流程示意图。
图11是本申请某些实施方式的图像获取装置的模块示意图。
图12是本申请某些实施方式的图像获取方法的原理示意图。
图13是本申请某些实施方式的图像获取方法的流程示意图。
图14是本申请某些实施方式的图像获取装置的模块示意图。
图15是本申请某些实施方式的非易失性计算机可读存储介质与处理器的交互示意图。
具体实施方式
下面详细描述本申请的实施方式,所述实施方式的示例在附图中示出,其中,相同或类似的标号自始至终表示相同或类似的元件或具有相同或类似功能的元件。下面通过参考附图描述的实施方式是示例性的,仅用于解释本申请的实施方式,而不能理解为对本申请的实施方式的限制。
请一并参阅图1至图3,本申请提供一种电子设备100。其中,电子设备100可以是手机、平板电脑、笔记本电脑、智能穿戴设备(智能手表、智能手环、智能头盔、智能眼镜等)、虚拟现实设备等。本申请以电子设备100是手机为例进行说明,但电子设备100的形式并不限于手机。电子设备100包括深度相机10、可见光相机30、处理器40及壳体50。
处理器40收容在壳体50中。深度相机10和可见光相机30安装在壳体50上。在一个例子中,壳体50包括主体51和可动支架52。可动支架52在驱动装置的驱动下可以相对主体51运动,例如,可动支架52可以相对于主体51滑动,以滑入主体51或从主体51滑出。深度相机10和可见光相机30可以安装在可动支架52上,可动支架52运动可带动深度相机10和可见光相机30缩回主体51内或从主体51中伸出。壳体50上开设有一个或多个采集窗口,采集窗口可以开设在壳体50的正面或背面。深度相机10和可见光相机30均与采集窗口对准安装,以使深度相机10和可见光相机30能够接收从采集窗口入射的光线。用户在需要使用深度相机10和可见光相机30中的任意一个时,可以触发可动支架52从主体51中滑出以带动深度相机10和可见光相机30从主体51中伸出;用户不需要使用深度相机10或可见光相机30时,可以触发可动支架52滑入主体51以带动深度相机10和可见光相机30缩回主体51中。在另一个例子中,壳体50上开设有一个或多个通孔,深度相机10和可见光相机30安装在壳体50内并与通孔对准。通孔可以开设在壳体50的正面或背面,深度相机10和可见光相机30可以接收经过通孔的光线。
深度相机10包括激光投射器11和图像采集器12。激光投射器11可以发射激光,其中,激光投射器11包括激光光源111和第一驱动器112,第一驱动器111可用于驱动激光光源111投射激光,激光可以是红外激光或其他不可见光,例如紫外激光等。图像采集器12可以接收被物体反射回的激光。本申请以激光是红外激光,图像采集器12是红外摄像头为例进行说明,但激光及图像采集器12的形式并不限于此,例如,激光还可以是紫外激光,图像采集器12是紫外光摄像头。激光投射器11和图像采集器12均与处理器40连接。处理器40可以为激光投射器11提供使能信号,具体地,处理器40可以为第一驱动器112提供使能信号。图像采集器12通过I2C总线与处理器40连接。图像采集器12与激光投射器11配合使用时,在一个例子中,图像采集器12可以通过选通信号(strobe信号)控制激光投射器11的投射时序,其中,strobe信号是根据图像采集器12获取采集图像的时序来生成的,strobe信号可视为高低电平交替的电信号,激光投射器11根据strobe指示的激光投射时序来投射激光。具体地,处理器40可以通过I2C总线发送图像采集指令以启用深度相机10使其工作,图像采集器12接收到图像采集指令后,通过strobe信号控制开关器件61,若strobe信号为高电平,则开关器件61向第一驱动器112发送第一脉冲信号(pwn1),第一驱动器112根据第一脉冲信号驱动激光光源111向场景中投射激光,若strobe信号为低电平,则开关器件61停止发送第一脉冲信号至第一驱动器112,激光光源111不投射激光;或者,也可以是在strobe信号为低电平时,开关器件61向第一驱动器112发送第一脉冲信号,第一驱动器112根据第一脉冲信号驱动激光光源111向场景中投射激光,在strobe信号为高电平时,开关器件61停止发送第一脉冲信号至第一驱动器112,激光光源111不投射激光。在另一个例子中,图像采集器12与激光投射器11配合时可以无需用到strobe信号,此时,处理器40发送图像采集指令至图像采集器12并同时发送激光投射指令至第一驱动器112,图像采集器12接收到图像采集指令后开始获取采集图像,第一驱动器112接收到激光投射指令时驱动激光光源111投射激光。激光投射器11投射激光时,激光形成带有斑点的激光图案投射在场景中的物体上。图像采集器12采集被物体反射的激光图案得到散斑图像,并通过移动产业处理器接口(Mobile Industry Processor Interface,MIPI)将散斑图像发送给处理器40。图像采集器12每发送一帧散斑图像给处理器40,处理器40就接收到一个数据流。 处理器40可以根据散斑图像和预存在处理器40中的参考图像做深度图像的计算。
可见光相机30也通过I2C总线与处理器40连接。可见光相机30可用于采集可见光图像。可见光相机30每发送一帧可见光图像给处理器40,处理器40就接收到一个数据流。可见光相机30单独使用,即用户仅仅想要获取可见光图像时,处理器40通过I2C总线向可见光相机30发送图像采集指令以启用可见光相机30使其工作。可见光相机30接收到图像采集指令后采集场景的可见光图像,并通过移动产业处理器接口向处理器40发送可见光图像。可见光相机30和深度相机10配合使用,即用户想要通过可见光图像与深度图像获取三维图像时,若图像采集器12与可见光相机30工作频率相同,则图像采集器12与可见光相机30通过sync信号实现硬件同步。具体地,处理器40通过I2C总线向图像采集器12发送图像采集指令。图像采集器12接收到图像采集指令后,可以通过strobe信号控制开关器件61向第一驱动器112发送第一脉冲信号(pwn1),以使第一驱动器112根据第一脉冲信号驱动激光光源111发射激光;同时,图像采集器12与可见光相机30之间通过sync信号同步,该sync信号控制可见光相机30采集可见光图像。
电子设备100还包括泛光灯20。泛光灯20可以向场景中发射均匀的面光,泛光灯20包括泛光光源21及第二驱动器22,第二驱动器22可用于驱动泛光光源21发射均匀的面光。泛光灯20发出的光可以是红外光或其他不可见光,例如紫外光等。本申请以泛光灯20发射红外光为例进行说明,但泛光灯20发射的光的形式并不限于此。泛光灯20与处理器40连接,处理器40可以为泛光灯20提供使能信号,具体地,处理器40可以为第二驱动器22提供使能信号。泛光灯20可以与图像采集器12配合工作以采集红外图像。图像采集器12与泛光灯20配合使用时,在一个例子中,图像采集器12可以通过选通信号(strobe信号,该strobe信号与图像采集器12控制激光投射器11的strobe信号为两个独立的strobe信号)控制泛光灯20发射红外光的发射时序,strobe信号是根据图像采集器12获取采集图像的时序来生成的,strobe信号可视为高低电平交替的电信号,泛光灯20根据strobe信号指示的红外光发射时序来发射红外光。具体地,处理器40可以通过I2C总线向图像采集器12发送图像采集指令,图像采集器12接收到图像采集指令后,通过strobe信号控制开关器件61,若strobe信号为高电平,则开关器件61向第二驱动器22发送第二脉冲信号(pwn2),第二驱动器22根据第二脉冲信号控制泛光光源21发射红外光,若strobe信号为低电平,则开关器件61停止发送第二脉冲信号至第二驱动器22,泛光光源21不发射红外光;或者,也可以是在strobe信号为低电平时,开关器件61向第二驱动器22发送第二脉冲信号,第二驱动器22根据第二脉冲信号控制泛光光源21发射红外光,在strobe信号为高电平时,开关器件61停止发送第二脉冲信号至第二驱动器22,泛光光源21不发射红外光。泛光灯20发射红外光时,图像采集器12接收被场景中的物体反射的红外光以形成红外图像,并通过移动产业处理器接口将红外图像发送给处理器40。图像采集器12每发送一帧红外图像给处理器40,处理器40就接收到一个数据流。该红外图像通常用于虹膜识别、人脸识别等。
请参阅图4,本申请还提供一种用于图1-3所示实施例的电子设备100的图像获取方法。图像获取方法包括:
01:以第一工作频率向场景投射激光;
02:以第二工作频率获取采集图像,第二工作频率大于第一工作频率;
03:在采集图像中区分出在激光投射器11未投射激光时采集的第一图像及在激光投射器11投射激光时采集的第二图像;和
04:根据第一图像、第二图像及参考图像计算深度图像。
请参阅图5,本申请还提供一种用于图1-3所示实施例的电子设备100的图像获取装置90。本申请的图像获取方法可以由本申请的图像获取装置90实现。图像获取装置90包括发射模块91、第一获取模块92、区分模块93及计算模块94。步骤01可以由发射模块91实现。步骤02可以由第一获取模块92实现。步骤03可以由区分模块93实现。步骤04可以由计算模块94实现。也即是说,发射模块91可用于以第一工作频率向场景投射激光。第一获取模块92可用于以第二工作频率获取采集图像,第二工作频率大于第一工作频率。区分模块93可用于在采集图像中区分出在激光投射器11未投射激光时采集的第一图像及在激光投射器11投射激光时采集的第二图像。计算模块94可用于根据第一图像、第二图像及参考图像计算深度图像。此时,发射模块91为激光投射器11,第一获取模块92即为图像采集器12。
请再参阅图1,本申请的图像获取方法还可以由电子设备100实现。其中,步骤01可以由激光投射器11实现。步骤02可以由图像采集器12实现。步骤03和步骤04均可以由处理器40实现。也即是说,激光投射器11可用于以第一工作频率向场景投射激光。图像采集器12可用于以第二工作频率获取采集图像,第二工作频率大于第一工作频率。处理器40可用于在采集图像中区分出在激光投射器11未投射激光时采集的第一图像及在激光投射器11投射激光时采集的第二图像、以及根据第一图像、第二图像及参考图像计算深度图像。
具体地,图像采集器12与激光投射器11工作频率不同(即第二工作频率大于第一工作频率)时,若需要获取深度图像,比如在解锁、支付、解密、三维建模等使用场景下,在一个例子中,处理器40通过I2C总线向图像采集器12和第一驱动器112同时发送获取深度图像的图像采集指令。第一驱动器112接收到图像采集指令后,驱动激光光源111以第一工作频率向场景发射红外激光;图像采集器12接收到图像采集指令后,以第二工作频率采集被场景中的物体反射回的红外激光以获取采集图像。例如图6所示,实线表示激光投射器11发射激光的时序,虚线表示图像采集器12获取采集图像的时序及采集图像的帧数,点划线表示根据第一图像和第二图像得到的第三图像的帧数,图6中由上至下,依次为实线、虚线及点划线,其中,第二工作频率为第一工作频率的两倍。请参阅图6中实线与虚线部分,图像采集器12在激光投射器11未投射激光时先接收环境中的红外光(下称环境红外光)以获取第N帧采集图像(此时为第一图像,也可称作背景图像),并通过移动产业处理器接口将第N帧采集图像发送给处理器40;随后,图像采集器12可以在激光投射器11投射激光时接收环境红外光以及由激光投射器11发射的红外激光以获取第N+1帧采集图像(此时为第二图像,也可称作干扰散斑图像),并通过移动产业处理器接口将第N+1帧采集图像发送给处理器40;随后,图像采集器12再在激光投射器11未投射激光时接收环境红外光以获取第N+2帧采集图像(此时为第一图像),并通过移动产业处理器接口将第N+2帧采集图像发送给处理器40,依此类推,图像采集器12交替地获取第一图像和第二图像。
在另一个例子中,处理器40通过I2C总线向图像采集器12发送获取深度图像的采集指令。图像采集器12接收到图像采集指令后,通过strobe信号控制开关器向第一驱动器112发送第一脉冲信号,第 一驱动器112根据第一脉冲信号驱动激光光源111以第一工作频率投射激光(即激光投射器11以第一工作频率投射激光),同时图像采集器12以第二工作频率采集被场景中的物体反射回的红外激光以获取采集图像。如图6所示,实线表示激光投射器11发射激光的时序,虚线表示图像采集器12获取采集图像的时序及采集图像的帧数,点划线表示根据第一图像和第二图像得到的第三图像的帧数,图6中由上至下,依次为实线、虚线及点划线,其中,第二工作频率为第一工作频率的两倍。请参阅图6中实线与虚线部分,图像采集器12在激光投射器11未投射激光时先接收环境红外光以获取第N帧采集图像(此时为第一图像,也可称作背景图像),并通过移动产业处理器接口将第N帧采集图像发送给处理器40;随后,图像采集器12可以在激光投射器11投射激光时接收环境红外光以及由激光投射器11发射的红外激光以获取第N+1帧采集图像(此时为第二图像,也可称作干扰散斑图像),并通过移动产业处理器接口将第N+1帧采集图像发送给处理器40;随后,图像采集器12再在激光投射器11未投射激光时接收环境红外光以获取第N+2帧采集图像(此时为第一图像),并通过移动产业处理器接口将第N+2帧采集图像发送给处理器40,依此类推,图像采集器12交替地获取第一图像和第二图像。
需要说明的是,图像采集器12可以在发送采集图像给处理器40的过程中同时执行采集图像的获取。并且,图像采集器12也可以先获取第二图像,再获取第一图像,并根据这个顺序交替执行采集图像的获取。另外,上述的第二工作频率与第一工作频率之间的倍数关系仅为示例,在其他实施例中,第二工作频率与第一工作频率之间的倍数关系还可以是三倍、四倍、五倍、六倍等等。
处理器40每接收到一帧采集图像后,都会对接收到的采集图像进行区分,判断采集图像是第一图像还是第二图像。处理器40接收到至少一帧第一图像和至少一帧第二图像后,即可根据第一图像、第二图像以及参考图像计算深度图像。具体地,由于第一图像是在激光投射器11未投射激光时采集的,形成第一图像的光线仅包括环境红外光,而第二图像是在激光投射器11投射激光时采集的,形成第二图像的光线同时包括环境红外光和激光投射器11发射的红外激光,因此,处理器40可以根据第一图像来去除第二图像中的由环境红外光形成的采集图像的部分,从而得到仅由红外激光形成的采集图像(即由红外激光形成的散斑图像)。
可以理解,环境光包括与激光投射器11发射的红外激光波长相同的红外光(例如,包含940nm的环境红外光),图像采集器12获取采集图像时,这部分红外光也会被图像采集器12接收。在场景的亮度较高时,图像采集器12接收的光线中环境红外光的占比会增大,导致采集图像中的激光散斑点不明显,从而影响深度图像的计算。
本申请的图像获取方法控制激光投射器11与图像采集器12以不同的工作频率工作,图像采集器12可以采集到仅由环境红外光形成的第一图像以及同时由环境红外光和激光投射器11发射的红外激光形成的第二图像,并基于第一图像去除掉第二图像中由环境红外光形成的图像部分,由此能够区分出激光散斑点,并能采用仅由激光投射器11发射的红外激光形成的采集图像来计算深度图像,激光散斑匹配不受影响,可以避免深度信息出现部分或全部缺失,从而提升深度图像的精确度。
请参阅图7和图8,在某些实施方式中,步骤03包括:
031:为每一帧采集图像添加图像类型;和
032:根据图像类型区分第一图像与第二图像。
其中,步骤031包括:
0311:根据每一帧采集图像的采集时间确定在采集时间下激光投射器11的工作状态;和
0312:根据工作状态为每一帧采集图像添加图像类型。
请再参阅图5,在某些实施方式中,步骤031、步骤032、步骤0311及步骤0312均可以由区分模块93实现。也即是说,区分模块93还可用于为每一帧采集图像添加图像类型、以及根据图像类型区分第一图像与第二图像。区分模块93用于为每一帧采集图像添加图像类型时,具体用于根据每一帧采集图像的采集时间确定在采集时间下激光投射器11的工作状态、以及根据工作状态为每一帧采集图像添加图像类型。
请再参阅图3,在某些实施方式中,步骤031、步骤032、步骤0311、步骤0312均可以由处理器40实现。也即是说,处理器40还可用于为每一帧采集图像添加图像类型、以及根据图像类型区分第一图像与第二图像。处理器40用于为每一帧采集图像添加图像类型时,具体用于根据每一帧采集图像的采集时间确定在采集时间下激光投射器11的工作状态、以及根据工作状态为每一帧采集图像添加图像类型。
具体地,处理器40每从图像采集器12接收到一帧采集图像,都会为采集图像添加图像类型(stream_type),以便于后续处理中可以根据图像类型区分出第一图像和第二图像。具体地,在图像采集器12获取采集图像的期间,处理器40会通过I2C总线实时监测激光投射器11的工作状态。处理器40每从图像采集器12接收到一帧采集图像,会先获取采集图像的采集时间,再根据采集图像的采集时间来判断在采集图像的采集时间下激光投射器11的工作状态是投射激光还是未投射激光,并基于判断结果为采集图像添加图像类型。其中,采集图像的采集时间可以是图像采集器12获取每一帧采集图像的开始时间、结束时间、介于开始时间至结束时间之间的任意一个时间等等。如此,可以实现每一帧采集图像与激光投射器11在该帧采集图像获取期间的工作状态(投射激光或未投射激光)的对应,准确区分出采集图像的类型。在一个例子中,图像类型stream_type的结构如表1所示:
表1
stream | light | stream_type | 含义
0 | 00 | 000 | 第一图像（仅环境红外光）
0 | 01 | 001 | 第二图像（环境红外光及红外激光）
表1中stream为0时,表示此时的数据流为由红外光和/或红外激光形成的图像。light为00时,表示此时的数据流是在没有任何设备投射红外光和/或红外激光(仅有环境红外光)的情形下获取的,那么处理器40可以对采集图像添加000的图像类型,以标识这一采集图像为第一图像。light为01时,表示此时的数据流是在激光投射器11投射红外激光(既有环境红外光,又有红外激光)的情形下获取的。处理器40可以对采集图像添加001的图像类型,以标识这一采集图像为第二图像。处理器40后续即可根据stream_type来区分采集图像的图像类型。
请参阅图9,在某些实施方式中,步骤04包括:
041:根据第一图像和第二图像计算第三图像,第一图像的采集时间与第二图像的采集时间的差值 小于预定差值;和
042:根据第三图像和参考图像计算深度图像。
请再参阅图5,在某些实施方式中,步骤041及步骤042均可以由计算模块94实现。也即是说,计算模块94可用于根据第一图像和第二图像计算第三图像、以及根据第三图像和参考图像计算深度图像,其中,第一图像的采集时间与第二图像的采集时间的差值小于预定差值。
请再参阅图3,在某些实施方式中,步骤041及步骤042均可以由计算模块94实现。也即是说,处理器40还可以用于根据第一图像和第二图像计算第三图像、以及根据第三图像和参考图像计算深度图像,其中,第一图像的采集时间与第二图像的采集时间的差值小于预定差值。
在计算深度图像的过程中,处理器40可以先区分出第一图像与第二图像,再根据采集时间选出任意帧第二图像和与该任意帧第二图像对应的特定帧的第一图像,其中该特定帧的第一图像的采集时间与该任意帧的第二图像的采集时间的差值小于预定差值。随后,处理器40再根据该特定帧的第一图像和该任意帧的第二图像来计算第三图像,第三图像即为仅由激光投射器11发射的红外激光形成的采集图像,也可以称作实际散斑图像。具体地,第一图像中的多个像素点与第二图像中的多个像素点是一一对应的,假设第一图像为P1,第二图像为P2,第三图像为P3,处理器40可以将第二图像中的像素点P2 i,j的像素值减去第一图像中的像素点P1 i,j的像素值以得到第三图像中像素点P3 i,j的像素值,即P3 i,j=P2 i,j-P1 i,j,i∈N+,j∈N+。随后,处理器40可以根据第三图像与参考图像计算出深度图像,其中,第二图像的帧数、第三图像的帧数及深度图像的帧数均相等。可以理解,由于第一图像的采集时间和第二图像的采集时间的差值较小,那么第一图像中环境红外光的强度与第二图像中环境红外光的强度更为接近,基于第一图像和第二图像计算出的第三图像的精度更高,有利于进一步减小环境红外光对深度图像获取的影响。
在某些实施方式中,处理器40也可以为第三图像和深度图像添加图像类型,以便于对处理采集图像后得到的各个数据流进行区分。如表2所示:
表2
stream | light | stream_type | 含义
0 | 11 | 011 | 第三图像（减背景处理后的图像）
1 | XX | 1XX | 深度图像
表2中的stream为0时,表示此时的数据流为由红外光和/或红外激光形成的图像,stream为1时,表示此时的数据流为深度图像。light为11时,表示减背景处理,减背景处理即去除采集图像中由环境红外光形成的部分,那么处理器40可以对减背景处理后的数据流添加011的图像类型,以标识这一数据流为第三图像。Light为XX时,X表示不限定取值,处理器40可对进行深度计算后得到的数据流添加1XX的图像类型,以标识这一数据流为深度图像。
在某些实施方式中,参与深度图像计算的第一图像和第二图像中,第一图像的采集时间可以位于第二图像的采集时间之前,也可以位于第二图像的采集时间之后,在此不作限制。
在某些实施方式中,第一图像的采集时间与第二图像的采集时间的差值小于预定差值时,第一图像 和第二图像可以是相邻帧的图像,也可以是非相邻帧的图像。例如,在第二工作频率是第一工作频率的两倍时,差值小于预定差值的第一图像和第二图像是相邻帧的图像;在第二工作频率与第一工作频率之间的倍数大于两倍,例如第二工作频率是第一工作频率的三倍时,差值小于预定差值的第一图像和第二图像可以是相邻帧的图像,也可以是非相邻帧的图像(此时第一图像与第二图像之间还间隔有一帧第一图像)。
在某些实施方式中,参与深度图像计算的第一图像的帧数还可以为多帧。比如,在第二工作频率是第一工作频率的三倍时,可以选取两帧相邻的第一图像以及与这两帧第一图像相邻的一帧第二图像来计算第三图像。此时,处理器40可以先对两帧第一图像做融合处理,例如,将两帧第一图像对应像素点的像素值相加再取均值得到融合处理后的第一图像,再利用融合处理后的第一图像和该相邻的一帧第二图像计算第三图像。
在某些实施方式中,处理器40可以计算出多帧第三图像,如图6中的第(N+1)-N帧第三图像、第(N+3)-(N+2)帧第三图像、第(N+5)-(N+4)帧第三图像等等,并对应多帧第三图像计算出多帧深度图像。当然,在其他实施方式中,处理器40也可以仅计算出一帧第三图像,并对应一帧第三图像计算出一帧深度图像。第三图像的帧数可以根据应用场景的安全级别来确定。具体地,当应用场景的安全级别较高时,例如对于支付等安全级别较高的应用场景,第三图像的帧数应该较多,此时需要多帧深度图像与用户的深度模板的匹配均成功才执行支付动作,以提升支付的安全性;而对于应用场景的安全级别较低,例如对于基于深度信息进行人像美颜的应用场景,第三图像的帧数可以较少,例如,为一帧,此时利用一帧深度图像即足够进行人像美颜,如此,可以减少处理器40的计算量及功耗,并可以提升图像处理的速度。
请参阅图10,在某些实施方式中,图像获取方法还包括:
05:以第三工作频率采集可见光图像,第三工作频率大于或小于第二工作频率;
06:为每一帧可见光图像和每一帧采集图像添加采集时间;和
07:根据可见光图像的采集时间、采集图像的采集时间及采集图像的图像类型确定帧同步的可见光图像和第二图像。
请参阅图11,在某些实施方式中,图像获取装置90还包括采集模块95、添加模块96及确定模块97。步骤05可以由采集模块95实现。步骤06可以由添加模块96实现。步骤07可以由确定模块97实现。也即是说,采集模块95可用于以第三工作频率采集可见光图像,第三工作频率大于或小于第二工作频率。添加模块96可用于为每一帧可见光图像和每一帧采集图像添加采集时间。确定模块97可用于根据可见光图像的采集时间、采集图像的采集时间及采集图像的图像类型确定帧同步的可见光图像和第二图像。其中,采集模块95即为可见光相机30。
请再参阅图3,在某些实施方式中,步骤05可以由可见光相机30实现。步骤06和步骤07可以由处理器40实现。也即是说,可见光相机30可用于以第三工作频率采集可见光图像,第三工作频率大于或小于第二工作频率。处理器40可用于为每一帧可见光图像和每一帧采集图像添加采集时间、以及根据可见光图像的采集时间、采集图像的采集时间及采集图像的图像类型确定帧同步的可见光图像和第二图像。
在一些应用场景,例如,对场景中的物体进行三维建模的应用场景下,需要借助深度相机10获取场景中物体的深度信息,并且借助可见光相机30获取场景中物体的色彩信息,才能实现三维建模。此时,处理器40需要开启深度相机10获取深度图像并同时开启可见光相机30获取可见光图像。
若图像采集器12与可见光相机30具有相同的工作频率,即图像采集器12与可见光相机30均以第二工作频率工作,那么处理器40可以通过I2C总线发送图像采集指令至图像采集器12,图像采集器12接收到图像采集指令后,图像采集器12与可见光相机30之间通过sync信号同步,该sync信号控制可见光相机30开启采集可见光图像,以实现图像采集器12与可见光相机30的硬件同步。此时,采集图像的帧数与可见光图像的帧数一致,每一帧采集图像与每一帧可见光图像一一对应。
但在图像采集器12与可见光相机30的工作频率不同,即图像采集器12以第二工作频率工作,可见光相机30以不等于第二工作频率的第三工作频率工作时,图像采集器12与可见光相机30无法实现硬件同步。此时,处理器40需要通过软件同步的方式来实现图像采集器12与可见光相机30的同步。具体地,处理器40通过与图像采集器12连接的I2C总线发送图像采集指令至图像采集器12,同时通过与可见光相机30连接的I2C总线发送图像采集指令至可见光相机30。处理器40每接收到一帧采集图像时,会为每一帧采集图像添加图像类型,还会为每一帧采集图像添加采集时间。并且,处理器40每接收到一帧可见光图像时,会为每一帧可见光图像添加采集时间。其中,采集图像的采集时间可以是图像采集器12采集每一帧采集图像的开始时间、结束时间、介于开始时间至结束时间之间的任意一个时间等等;可见光图像的采集时间可以是可见光相机30采集每一帧可见光图像的开始时间、结束时间、介于开始时间至结束时间之间的任意一个时间等等。那么,在后续基于深度图像和可见光图像做进一步处理(如三维建模、借助深度信息做人像美颜等处理)时,处理器40可以先根据可见光图像的采集时间、采集图像的采集时间及采集图像的类型先确定帧同步的可见光图像和第二图像。其中,帧同步指的是确定出的第二图像的采集时间与可见光图像的采集时间的差值小于预设的时间差值,可见光图像的采集时间可以位于第二图像的采集时间之前也可以位于第二图像的采集时间之后。随后,处理器40再根据确定的第二图像选出第一图像以进一步根据第二图像、第一图像及参考图像计算深度图像。最后,处理器40基于深度图像和确定出的可见光图像进行后续处理。
在某些实施方式中,处理器40也可以为每一帧深度图像添加采集时间,再根据可见光图像的采集时间和深度图像的采集时间确定帧同步的可见光图像以及深度图像,最后对帧同步的可见光图像以及深度图像做后续处理。其中,每一帧深度图像的采集时间为与该帧深度图像对应的第二图像的采集时间。
请参阅图12,在某些实施方式中,采集图像还包括红外图像,红外图像为图像采集器12采集泛光灯20发射的红外光所得到的图像。处理器40为每一帧采集图像添加图像类型时,还会为红外图像添加图像类型。在一个例子中,红外图像的图像类型如表3所示:
表3
stream | light | stream_type | 含义
0 | 10 | 010 | 红外图像（泛光灯发射红外光时采集）
表3中的stream为0时,表示此时的数据流为由红外光和/或红外激光形成的图像。light为10时, 表示此时的数据流是在泛光灯20投射红外光且激光投射器11未投射激光的情形下获取的。那么,处理器40为采集图像添加010的图像类型时,即标识这一帧采集图像为红外图像。
在某些应用场景中,比如同时基于深度图像与深度模板的匹配以及红外图像与红外模板的匹配实现身份验证的应用场景,图像采集器12需要与泛光灯20及激光投射器11配合使用,图像采集器12可以分时获取第一图像、第二图像及红外图像。如图12所示,实线表示激光投射器11发射激光的时序,双点划线表示泛光灯20发射红外光的时序,虚线表示图像采集器12获取采集图像的时序及采集图像的帧数,点划线表示根据第一图像和第二图像得到的第三图像的帧数,图12中由上至下,依次为实线、双点划线、虚线及点划线,其中,第二工作频率为第一工作频率的三倍,第二工作频率为第四工作频率的三倍。处理器40可以通过I2C总线实时监测泛光灯20的工作状态。处理器40每从图像采集器12接收到一帧采集图像,会先获取采集图像的采集时间,再根据采集图像的采集时间来判断在采集图像的采集时间下泛光灯20的工作状态是发射红外光还是未发射红外光,并基于判断结果为采集图像添加图像类型。处理器40后续可以基于红外图像的采集时间和第二图像的采集时间确定出采集时间的差值小于设定差值的红外图像和第二图像,进一步地,处理器40可以确定出红外图像和深度图像,并利用该红外图像和该深度图像进行身份验证。
请参阅图13,在某些实施方式中,图像获取方法还包括:
08:获取场景的亮度及类型;
09:判断亮度是否大于亮度阈值且类型为户外场景;
01:若是,则进入以第一工作频率向场景投射激光的步骤。
请参阅图14,在某些实施方式中,图像获取装置90还包括第二获取模块98和判断模块99。步骤08可以由第二获取模块98实现。步骤09可以由判断模块99实现。也即是说,第二获取模块98可用于获取场景的亮度及类型。判断模块99可用于判断亮度是否大于亮度阈值且类型为户外场景。发射模块91可用于在亮度大于亮度阈值且类型为户外场景时以第一工作频率向场景投射激光。
请再参阅图3,在某些实施方式中,步骤08和步骤09均可以由处理器40实现。也即是说,处理器40可用于获取场景的亮度及类型、以及判断亮度是否大于亮度阈值且类型为户外场景。激光投射器11可用于在亮度大于亮度阈值且类型为户外场景时以第一工作频率向场景投射激光。
具体地,场景的亮度可以通过分析图像采集器12获取的采集图像或可见光相机30获取的可见光图像得到;或者,场景的亮度也可以由光线感应器来直接检测,处理器40从光线感应器读取检测得到的信号以获取场景的亮度。场景的类型可以通过分析图像采集器12获取的采集图像或可见光相机30获取的可见光图像得到,例如分析采集图像或可见光相机30获取的可见光图像中的物体来判断场景的类型为户外场景还是户内场景;场景的类型也可以直接根据地理位置来确定,具体地,处理器40可以获取全球卫星定位系统对场景的定位结果,再根据定位结果进一步判断场景的类型,例如,定位结果为某某办公楼,则说明场景为户内场景;定位场景为某某公园,则说明场景为户外场景;定位场景为某某街道,则说明场景为户外场景等等。
可以理解,在场景的亮度较高(例如亮度大于亮度阈值)时,采集图像中环境红外光的占比会较多,对斑点的识别会较大影响,此时需要去除环境红外光的干扰。但是在场景的亮度较低时,采集图像中环 境红外光的占比会较少,对斑点的识别产生的影响较小,可以忽略不计,此时图像采集器12和激光投射器11可以采用相同工作频率工作,处理器40直接根据图像采集器12获取的采集图像(即第二图像)与参考图像计算深度图像。另外,场景的亮度较高时可能是户内的灯光光线较强引起的,由于灯光光线不包括红外光,不会对斑点的识别产生较大影响,此时图像采集器12和激光投射器11采用相同工作频率工作,处理器40直接根据图像采集器12获取的采集图像(即第二图像)与参考图像计算深度图像。如此,可以减小图像采集器12的工作频率,减少图像采集器12的功耗。
当然,在某些实施方式中,图像获取方法也可以仅仅基于场景的亮度来判断是否执行步骤01。具体地,处理器40仅仅获取场景的亮度,判断场景的亮度是否大于亮度阈值,激光投射器11在亮度大于亮度阈值时以第一工作频率向场景投射激光。
在某些实施方式中,处理器40还可以为每一个数据流添加状态信息(status)。在一个例子中,如表4所示:
表4
stream_type | status | 含义
000 | 0 | 0000：第一图像
001 | 0 | 0010：第二图像
010 | 0 | 0100：红外图像
011 | 1 | 0111：第三图像
1XX | 1 | 1XX1：经过减背景处理的深度图像
1XX | 0 | 1XX0：未经过减背景处理的深度图像
状态信息status为0时,表示该数据流未经过减背景处理,状态信息status为1时,表示该数据流经过减背景处理。表4中的0000即表示第一图像;0010即表示第二图像;0100即表示泛光灯20开启时图像采集器12获取的红外图像;0111即表示第三图像;1XX1即表示经过减背景处理的深度图像;1XX0即表示未经过减背景处理的深度图像。如此,为每个数据流添加状态信息以便于处理器40分辨各个数据流是否经过减背景处理。
在某些实施方式中,处理器40包括第一存储区、第二存储区以及逻辑减电路,逻辑减电路与第一存储区及第二存储区均连接。其中,第一存储区用于存储第一图像,第二存储区用于存储第二图像,逻辑减电路用于处理第一图像和第二图像得到第三图像。具体地,逻辑减电路从第一存储区读取第一图像,从第二存储区读取第二图像,在获取到第一图像和第二图像后,对第一图像和第二图像执行减法处理得到第三图像。逻辑减电路还与处理器40中的深度计算模块(例如,可以是专门用于计算深度的集成电路ASIC等)连接,逻辑减电路将第三图像发送到深度计算模块中,由深度计算模块根据第三图像和参考图像计算深度图像。
请参阅图14,本申请还提供一种包含计算机可读指令的非易失性计算机可读存储介质200。计算机可读指令被处理器300执行时,使得处理器300执行上述任意一项实施方式所述的图像获取方法。处理器300可以是图1中的处理器40。
例如,请结合图4,计算机可读指令被处理器300执行时,使得处理器300执行以下步骤:
01:以第一工作频率向场景投射激光;
02:以第二工作频率获取采集图像,第二工作频率大于第一工作频率;
03:在采集图像中区分出在激光投射器11未投射激光时采集的第一图像及在激光投射器11投射激光时采集的第二图像;和
04:根据第一图像、第二图像及参考图像计算深度图像。
再例如,请结合图8,计算机可读指令被处理器300执行时,使得处理器300执行以下步骤:
031:为每一帧采集图像添加图像类型;和
032:根据图像类型区分第一图像与第二图像。
在本说明书的描述中,参考术语“一个实施方式”、“一些实施方式”、“示意性实施方式”、“一个例子”“示例”、“具体示例”或“一些示例”等的描述意指结合所述实施方式或示例描述的具体特征、结构、材料或者特点包含于本申请的至少一个实施方式或示例中。在本说明书中,对上述术语的示意性表述不一定指的是相同的实施方式或示例。而且,描述的具体特征、结构、材料或者特点可以在任何的一个或多个实施方式或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现特定逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分,并且本申请的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被本申请的实施例所属技术领域的技术人员所理解。
尽管上面已经示出和描述了本申请的实施方式,可以理解的是,上述实施方式是示例性的,不能理解为对本申请的限制,本领域的普通技术人员在本申请的范围内可以对上述实施方式进行变化、修改、替换和变型。

Claims (12)

  1. 一种图像获取方法,用于电子设备,其特征在于,所述电子设备包括深度相机,所述深度相机包括激光投射器,所述图像获取方法包括:
    以第一工作频率向场景投射激光;
    以第二工作频率获取采集图像,所述第二工作频率大于所述第一工作频率;
    在所述采集图像中区分出在所述激光投射器未投射激光时采集的第一图像及在所述激光投射器投射激光时采集的第二图像;和
    根据所述第一图像、所述第二图像及参考图像计算深度图像。
  2. 根据权利要求1所述的图像获取方法,其特征在于,所述在所述采集图像中区分出在所述激光投射器未投射激光时采集的第一图像及在所述激光投射器投射激光时采集的第二图像,包括:
    为每一帧所述采集图像添加图像类型;和
    根据所述图像类型区分所述第一图像与所述第二图像。
  3. 根据权利要求2所述的图像获取方法,其特征在于,所述为每一帧所述采集图像添加图像类型,包括:
    根据每一帧所述采集图像的采集时间确定在所述采集时间下所述激光投射器的工作状态;和
    根据所述工作状态为每一帧所述采集图像添加所述图像类型。
  4. 根据权利要求2所述的图像获取方法,其特征在于,所述根据所述第一图像、所述第二图像及参考图像计算深度图像,包括:
    根据所述第一图像和所述第二图像计算第三图像,所述第一图像的所述采集时间与所述第二图像的采集时间的差值小于预定差值;和
    根据所述第三图像和所述参考图像计算所述深度图像。
  5. 根据权利要求2所述的图像获取方法,其特征在于,所述图像获取方法还包括:
    以第三工作频率采集可见光图像,所述第三工作频率大于或小于所述第二工作频率;
    为每一帧所述可见光图像和每一帧所述采集图像添加采集时间;和
    根据所述可见光图像的所述采集时间、所述采集图像的所述采集时间及所述采集图像的图像类型确定帧同步的所述可见光图像和所述第二图像。
  6. 一种图像获取装置,用于电子设备,其特征在于,所述电子设备包括深度相机,所述深度相机包括激光投射器,所述图像获取装置包括:
    发射模块,用于以第一工作频率向场景投射激光;
    第一获取模块,用于以第二工作频率获取采集图像,所述第二工作频率大于所述第一工作频率;
    区分模块,用于在所述采集图像中区分出在所述激光投射器未投射激光时采集的第一图像及在所述激光投射器投射激光时采集的第二图像;和
    计算模块,用于根据所述第一图像、所述第二图像及参考图像计算深度图像。
  7. 一种电子设备,其特征在于,所述电子设备包括深度相机和处理器,所述深度相机包括激光投射器和图像采集器;
    所述激光投射器用于以第一工作频率向场景投射激光;
    所述图像采集器用于以第二工作频率获取采集图像,所述第二工作频率大于所述第一工作频率;
    所述处理器用于:
    在所述采集图像中区分出在所述激光投射器未投射激光时采集的第一图像及在所述激光投射器投射激光时采集的第二图像;和
    根据所述第一图像、所述第二图像及参考图像计算深度图像。
  8. 根据权利要求7所述的电子设备,其特征在于,所述处理器还用于:
    为每一帧所述采集图像添加图像类型;和
    根据所述图像类型区分所述第一图像与所述第二图像。
  9. 根据权利要求8所述的电子设备,其特征在于,所述处理器还用于:
    根据每一帧所述采集图像的采集时间确定在所述采集时间下所述激光投射器的工作状态;和
    根据所述工作状态为每一帧所述采集图像添加所述图像类型。
  10. 根据权利要求8所述的电子设备,其特征在于,所述处理器还用于:
    根据所述第一图像和所述第二图像计算第三图像,所述第一图像的所述采集时间与所述第二图像的采集时间的差值小于预定差值;和
    根据所述第三图像和所述参考图像计算所述深度图像。
  11. 根据权利要求8所述的电子设备,其特征在于,所述电子设备还包括可见光相机,所述可见光相机用于以第三工作频率采集可见光图像,所述第三工作频率大于或小于所述第二工作频率;
    所述处理器还用于:
    为每一帧所述可见光图像和每一帧所述采集图像添加采集时间;和
    根据所述可见光图像的所述采集时间、所述采集图像的所述采集时间及所述采集图像的图像类型确定帧同步的所述可见光图像和所述第二图像。
  12. 一种包含计算机可读指令的非易失性计算机可读存储介质,所述计算机可读指令被处理器执行时,使得所述处理器执行权利要求1-5任意一项所述的图像获取方法。
PCT/CN2020/085783 2019-05-24 2020-04-21 图像获取方法、图像获取装置、电子设备和可读存储介质 WO2020238481A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20815599.4A EP3975537A4 (en) 2019-05-24 2020-04-21 IMAGE CAPTURE METHOD, IMAGE CAPTURE DEVICE, ELECTRONIC DEVICE AND READABLE STORAGE MEDIA
US17/525,544 US20220067951A1 (en) 2019-05-24 2021-11-12 Method for Acquiring Image, Electronic Device and Readable Storage Medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910437665.1 2019-05-24
CN201910437665.1A CN110012206A (zh) 2019-05-24 2019-05-24 图像获取方法、图像获取装置、电子设备和可读存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/525,544 Continuation US20220067951A1 (en) 2019-05-24 2021-11-12 Method for Acquiring Image, Electronic Device and Readable Storage Medium

Publications (1)

Publication Number Publication Date
WO2020238481A1 true WO2020238481A1 (zh) 2020-12-03

Family

ID=67177819

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/085783 WO2020238481A1 (zh) 2019-05-24 2020-04-21 图像获取方法、图像获取装置、电子设备和可读存储介质

Country Status (4)

Country Link
US (1) US20220067951A1 (zh)
EP (1) EP3975537A4 (zh)
CN (1) CN110012206A (zh)
WO (1) WO2020238481A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110012206A (zh) * 2019-05-24 2019-07-12 Oppo广东移动通信有限公司 图像获取方法、图像获取装置、电子设备和可读存储介质
CN116033273A (zh) * 2022-12-15 2023-04-28 杭州海康慧影科技有限公司 消除激光分层的图像处理方法、系统及装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103268608A (zh) * 2013-05-17 2013-08-28 清华大学 基于近红外激光散斑的深度估计方法及装置
CN106550228A (zh) * 2015-09-16 2017-03-29 上海图檬信息科技有限公司 获取三维场景的深度图的设备
CN107995434A (zh) * 2017-11-30 2018-05-04 广东欧珀移动通信有限公司 图像获取方法、电子装置和计算机可读存储介质
CN108716982A (zh) * 2018-04-28 2018-10-30 Oppo广东移动通信有限公司 光学元件检测方法、装置、电子设备和存储介质
US10282857B1 (en) * 2017-06-27 2019-05-07 Amazon Technologies, Inc. Self-validating structured light depth sensor system
CN110012206A (zh) * 2019-05-24 2019-07-12 Oppo广东移动通信有限公司 图像获取方法、图像获取装置、电子设备和可读存储介质

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2064676B1 (en) * 2006-09-21 2011-09-07 Thomson Licensing A method and system for three-dimensional model acquisition
JP5408400B1 (ja) * 2012-04-04 2014-02-05 コニカミノルタ株式会社 画像生成装置及びプログラム
CN102706452A (zh) * 2012-04-28 2012-10-03 中国科学院国家天文台 月球卫星干涉成像光谱仪实时数据的处理方法
US10268885B2 (en) * 2013-04-15 2019-04-23 Microsoft Technology Licensing, Llc Extracting true color from a color and infrared sensor
CN103971405A (zh) * 2014-05-06 2014-08-06 重庆大学 一种激光散斑结构光及深度信息的三维重建方法
US9762781B2 (en) * 2015-10-30 2017-09-12 Essential Products, Inc. Apparatus and method to maximize the display area of a mobile device by increasing the size of the display without necessarily increasing the size of the phone
US10438493B2 (en) * 2016-08-24 2019-10-08 Uber Technologies, Inc. Hybrid trip planning for autonomous vehicles
CN106454287B (zh) * 2016-10-27 2018-10-23 深圳奥比中光科技有限公司 组合摄像系统、移动终端及图像处理方法
US10404916B2 (en) * 2017-08-30 2019-09-03 Qualcomm Incorporated Multi-source video stabilization
CN107682607B (zh) * 2017-10-27 2019-10-22 Oppo广东移动通信有限公司 图像获取方法、装置、移动终端和存储介质
CN109461181B (zh) * 2018-10-17 2020-10-27 北京华捷艾米科技有限公司 基于散斑结构光的深度图像获取方法及系统
US10939090B2 (en) * 2019-02-06 2021-03-02 Canon Kabushiki Kaisha Control apparatus, imaging apparatus, illumination apparatus, image processing apparatus, image processing method, and storage medium
CN110062145B (zh) * 2019-05-24 2021-07-20 Oppo广东移动通信有限公司 深度相机、电子设备及图像获取方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103268608A (zh) * 2013-05-17 2013-08-28 清华大学 基于近红外激光散斑的深度估计方法及装置
CN106550228A (zh) * 2015-09-16 2017-03-29 上海图檬信息科技有限公司 获取三维场景的深度图的设备
US10282857B1 (en) * 2017-06-27 2019-05-07 Amazon Technologies, Inc. Self-validating structured light depth sensor system
CN107995434A (zh) * 2017-11-30 2018-05-04 广东欧珀移动通信有限公司 图像获取方法、电子装置和计算机可读存储介质
CN108716982A (zh) * 2018-04-28 2018-10-30 Oppo广东移动通信有限公司 光学元件检测方法、装置、电子设备和存储介质
CN110012206A (zh) * 2019-05-24 2019-07-12 Oppo广东移动通信有限公司 图像获取方法、图像获取装置、电子设备和可读存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3975537A4

Also Published As

Publication number Publication date
CN110012206A (zh) 2019-07-12
EP3975537A1 (en) 2022-03-30
EP3975537A4 (en) 2022-06-08
US20220067951A1 (en) 2022-03-03

Similar Documents

Publication Publication Date Title
WO2020238506A1 (zh) 电子装置的控制方法及电子装置
US9148637B2 (en) Face detection and tracking
CN110062145B (zh) 深度相机、电子设备及图像获取方法
US11143879B2 (en) Semi-dense depth estimation from a dynamic vision sensor (DVS) stereo pair and a pulsed speckle pattern projector
US9390487B2 (en) Scene exposure auto-compensation for differential image comparisons
WO2020238481A1 (zh) 图像获取方法、图像获取装置、电子设备和可读存储介质
WO2020238569A1 (zh) 终端的控制方法及控制装置、终端及计算机可读存储介质
WO2020259334A1 (zh) 调节方法、调节装置、终端及计算机可读存储介质
JP2008015915A (ja) 画像処理方法、入力インタフェース装置
WO2020238491A1 (zh) 电子装置的控制方法及电子装置
CN105912145A (zh) 一种激光笔鼠标系统及其图像定位方法
US11947045B2 (en) Controlling method for electronic device, and electronic device
CN110191279B (zh) 深度相机、电子设备及图像获取方法
JP2018067300A (ja) 情報処理装置及びその制御方法
WO2020237657A1 (zh) 电子设备的控制方法、电子设备和计算机可读存储介质
US20120044421A1 (en) Projector apparatus and method for dynamically masking objects
TWI753344B (zh) 混合型深度估算系統
WO2020087486A1 (zh) 深度图像处理方法、深度图像处理装置和电子装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20815599

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020815599

Country of ref document: EP

Effective date: 20211222