WO2020259385A1 - Image processing method and apparatus, and storage medium - Google Patents

Image processing method and apparatus, and storage medium

Info

Publication number
WO2020259385A1
WO2020259385A1 (application PCT/CN2020/096822)
Authority
WO
WIPO (PCT)
Prior art keywords
module
image
image processing
infrared camera
response
Prior art date
Application number
PCT/CN2020/096822
Other languages
English (en)
French (fr)
Inventor
Wang Lu (王路)
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Priority to EP20831054.0A (published as EP3979202A4)
Publication of WO2020259385A1
Priority to US17/559,672 (published as US20220114743A1)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141Control of illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/145Illumination specially adapted for pattern recognition, e.g. using gratings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/04Synchronising
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face

Definitions

  • the embodiments of the present application relate to structured light technology, and in particular, to an image processing method and device, and a storage medium.
  • mobile devices such as mobile phones are increasingly adopting three-dimensional (3D, 3-Dimension) structured light technology to realize image application functions such as face unlocking, face payment, and three-dimensional modeling.
  • the structured light transmitter, driven by a depth processing chip, emits light with certain structural characteristics and projects it onto the object to be detected.
  • the infrared camera shoots the object to be inspected and collects the speckle image of the object to be inspected.
  • the depth processing chip receives the speckle image from the infrared camera.
  • the depth calculation of the speckle image is performed to obtain the depth image of the object to be detected.
  • the depth image reflects the spatial position information of the object to be detected, and the image application function can be realized based on the depth image.
  • because the laser emission of the structured light emitter is controlled by the depth processing chip, and the depth processing chip is one-time programmable and cannot be changed afterwards, the emission driving function in the depth processing chip cannot be directly repaired when it fails or becomes inapplicable; as a result, there is no guarantee that the depth image can be obtained stably, resulting in poor stability of the image application function.
  • the present application provides an image processing method, device, and storage medium, which can improve the stability of image application functions.
  • the embodiment of the application provides an image processing method, which is applied to an image processing device.
  • the image processing device includes an infrared camera, a structured light transmitter, an emission drive module, and an image processing module.
  • the method includes:
  • in response to the trigger signal, controlling the structured light emitter to emit laser light according to the emission parameters through the emission drive module, and transmitting a synchronization signal to the infrared camera through the emission drive module;
  • in response to the synchronization signal, controlling the infrared camera to collect a speckle image of the object to be detected;
  • controlling the image processing module to perform depth calculation on the speckle image to obtain a depth image of the object to be detected, and implementing the image application function based on the depth image, where the depth image represents the three-dimensional structural features of the object to be detected.
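As a concrete illustration, the request, trigger, synchronization, and depth-calculation sequence of the steps above can be sketched in Python. Every class, method, and parameter name below is a hypothetical software stand-in for the hardware modules described in the claims, not part of the patent:

```python
# Sketch of the claimed control flow; all names are illustrative stand-ins
# for the hardware modules (emission drive module, infrared camera, etc.).

class StructuredLightEmitter:
    def emit_laser(self, params):
        # A real emitter would drive the laser according to `params`
        # (pulse frequency, duty cycle, period).
        self.last_params = params


class EmissionDriveModule:
    def __init__(self, emitter, camera):
        self.emitter = emitter
        self.camera = camera
        self.params = None

    def load_params(self, params):
        # Emission parameters transmitted by the control module.
        self.params = params

    def on_trigger(self):
        # Trigger signal from the infrared camera: start laser emission
        # and feed a synchronization signal back to the camera.
        self.emitter.emit_laser(self.params)
        self.camera.on_sync()


class InfraredCamera:
    def __init__(self):
        self.drive = None
        self.speckle_image = None

    def on_request(self):
        # Request instruction forwarded by the control module.
        self.drive.on_trigger()

    def on_sync(self):
        # Exposure starts in sync with the laser emission.
        self.speckle_image = "speckle"  # placeholder for a captured frame


def handle_request(camera, drive, depth_calc, params):
    drive.load_params(params)                 # step 1: transmit emission parameters
    camera.on_request()                       # step 2: camera triggers the drive module
    return depth_calc(camera.speckle_image)   # step 3: depth calculation
```

Note how the camera, not the depth processor, triggers the drive module; that separation is the point of the claimed arrangement.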
  • the image processing module includes a signal modulation module
  • the transmitting the acquired transmission parameters to the transmission driving module in response to the request instruction includes:
  • in response to the request instruction, controlling the signal modulation module to transmit the transmission parameter to the transmission drive module.
  • the infrared camera includes a signal modulation module, a trigger module and a timing control circuit;
  • transmitting the acquired emission parameters to the emission driving module and controlling the infrared camera to transmit a trigger signal to the emission driving module includes:
  • the signal modulation module is controlled to transmit the transmission parameter to the transmission drive module
  • the trigger module is controlled to transmit the trigger signal to the transmission drive module
  • the infrared camera further includes an image acquisition module
  • the controlling the infrared camera to collect a speckle image of the object to be detected includes:
  • in response to the synchronization signal, the image acquisition module is controlled to periodically acquire the speckle image through the timing control circuit.
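The periodic acquisition driven by the timing control circuit can be sketched as a software loop that captures one frame per interval. The 30 fps default interval and the function names are illustrative assumptions; a real timing control circuit does this in hardware:

```python
import time

def periodic_acquisition(capture_frame, frame_count, interval_s=1 / 30):
    """Sketch of timing-controlled periodic capture: invoke `capture_frame`
    once per frame interval and collect the resulting speckle frames.
    The interval is a made-up default, not a value from the patent."""
    frames = []
    for _ in range(frame_count):
        frames.append(capture_frame())
        time.sleep(interval_s)  # the hardware timing circuit paces exposures
    return frames
```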
  • the infrared camera includes a signal modulation module
  • the method further includes:
  • the signal modulation module is controlled to modify the transmission parameter to obtain the updated transmission parameter.
  • controlling the structured light emitter to emit laser light according to the emission parameters through the emission driving module includes:
  • the structured light emitter is controlled to emit laser light according to the updated emission parameter through the emission driving module.
  • the request instruction includes a face unlock request
  • the image processing device further includes a visible light camera
  • transmitting the acquired emission parameters to the emission driving module and controlling the infrared camera to transmit a trigger signal to the emission driving module includes:
  • the transmission parameter is transmitted to the transmission drive module, and the infrared camera is controlled to transmit the trigger signal to the transmission drive module.
  • the realization of the image application function based on the depth image includes:
  • a face unlock response is generated, and in response to the face unlock response, an unlock page is opened.
  • the request instruction includes a three-dimensional modeling request
  • the image processing device further includes a visible light camera
  • the method further includes:
  • the visible light camera is controlled to collect the visible light image of the object to be detected.
  • the realization of the image application function based on the depth image includes:
  • after transmitting the acquired emission parameters to the emission driving module and controlling the infrared camera to transmit a trigger signal to the emission driving module, the method further includes:
  • the timing duration is obtained
  • the infrared camera is controlled to generate abnormality prompt information and display it, where the abnormality prompt information indicates that the emission drive module is abnormal.
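The timeout handling described above, presumably an abnormality prompt when no synchronization signal arrives within the timing duration, might be sketched as follows. The timeout value, polling interval, and function names are all illustrative assumptions, not taken from the patent:

```python
import time

SYNC_TIMEOUT_S = 0.1  # hypothetical timing duration; the patent gives no number

def wait_for_sync(sync_received, timeout=SYNC_TIMEOUT_S, poll=0.005):
    """Poll for the synchronization signal; report an abnormality on timeout.

    `sync_received` is a zero-argument callable that returns True once the
    emission drive module has fed the synchronization signal back.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if sync_received():
            return "ok"
        time.sleep(poll)
    # The emission drive module did not respond: surface the abnormality prompt.
    return "abnormality: emission drive module did not return a sync signal"
```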
  • An embodiment of the present application provides an image processing device.
  • the image processing device includes a control module, an infrared camera, a structured light transmitter, an emission drive module, and an image processing module.
  • the control module is connected to the infrared camera and the image processing module through a bus; the infrared camera is connected to the control module, the emission drive module and the image processing module; and the image processing module is connected to the control module and the infrared camera;
  • the control module is configured to, when a request instruction for an image application function is detected, in response to the request instruction, transmit the acquired emission parameters to the emission drive module, and control the infrared camera to transmit a trigger signal to the emission drive module;
  • the emission driving module is configured to control the structured light transmitter to emit laser light according to the emission parameters in response to the trigger signal, and transmit a synchronization signal to the infrared camera;
  • the infrared camera is configured to collect a speckle image of the object to be detected in response to the synchronization signal; transmit the speckle image to the image processing module;
  • the image processing module is configured to perform depth calculation on the speckle image to obtain a depth image of the object to be detected, and implement the image application function based on the depth image, where the depth image represents the three-dimensional structural characteristics of the object to be detected.
  • An embodiment of the present application provides an image processing device.
  • the image processing device includes an infrared camera, a structured light transmitter, an emission drive module, and an image processing module.
  • the infrared camera is connected to the emission drive module and the image processing module
  • the structured light transmitter is connected to the emission drive module
  • the image processing module further includes: a processor, a memory, and a communication bus.
  • the memory communicates with the processor through the communication bus, and the memory stores one or more programs executable by the processor; when the one or more programs are executed, the infrared camera, the structured light transmitter, the emission drive module and the image processing module execute any one of the above image processing methods.
  • the embodiments of the present application provide a computer-readable storage medium, and the computer-readable storage medium stores one or more programs, and the one or more programs can be executed by one or more processors to realize any of the above An image processing method.
  • the embodiment of the application provides an image processing method and device, and a storage medium.
  • the method includes: first transmitting the acquired emission parameters to the emission driving module, and then controlling the infrared camera to transmit a trigger signal to the emission driving module.
  • the emission drive module controls the structured light transmitter to emit laser light according to the emission parameters.
  • the emission drive module transmits a synchronization signal to the infrared camera.
  • the infrared camera collects a speckle image of the object to be detected while the structured light transmitter emits laser light.
  • the image processing module performs depth calculation on the speckle image to obtain the depth image of the object to be detected, and realizes the image application function based on the depth image.
  • the emission drive module controls the structured light emitter to emit laser light based on the acquired emission parameters and the trigger signal transmitted from the infrared camera, and controls the infrared camera to collect speckle images while emitting laser light.
  • compared with the related art, in which the depth processing chip alone controls the structured light emitter to emit laser light and the infrared camera to collect speckle images, the emission drive module can work independently without relying on the depth processing chip. In this way, it is easier to repair the emission drive module when it fails or becomes inapplicable, thereby ensuring that the speckle image is collected normally and the depth image is obtained normally, and ultimately improving the stability of the image application function.
  • FIG. 1 is a schematic diagram of an image processing device provided by an embodiment of the application
  • FIG. 2 is a schematic diagram of the positional relationship of devices when collecting speckle images according to an embodiment of the application
  • Figure 3(a) is a schematic diagram of a light spot image provided by an embodiment of the application.
  • FIG. 3(b) is a schematic diagram of a speckle image provided by an embodiment of this application.
  • Fig. 3(c) is a schematic diagram of a depth image provided by an embodiment of this application.
  • Fig. 4(a) is a schematic structural diagram of another image processing device provided by an embodiment of the application.
  • FIG. 4(b) is a schematic structural diagram of another image processing device provided by an embodiment of this application.
  • FIG. 5 is a schematic flowchart of an image processing method provided by an embodiment of the application.
  • FIG. 6 is a third structural diagram of an image processing device provided by an embodiment of the application.
  • FIG. 7 is a fourth structural diagram of an image processing device provided by an embodiment of the application.
  • Fig. 8(a) is a schematic structural diagram of an infrared camera provided by an embodiment of the application.
  • Figure 8(b) is a schematic structural diagram of another infrared camera provided by an embodiment of the application.
  • FIG. 9 is a schematic flowchart of another image processing method provided by an embodiment of the application.
  • FIG. 10 is a schematic flowchart of another image processing method provided by an embodiment of this application.
  • FIG. 11 is a fifth structural diagram of an image processing apparatus provided by an embodiment of the application.
  • FIG. 1 is a schematic diagram of an image processing device provided by an embodiment of the application.
  • the image processing device 1 includes a control module 11, an infrared camera 12 (IR camera, Infrared Radiation Camera), a structured light emitter 13, a visible light camera 14, and a depth processing chip 15.
  • the depth processing chip 15 includes an emission driver module (Driver) 15-1, and the control module 11 is connected to the infrared camera 12 and visible light camera 14 and the depth processing chip 15, the infrared camera 12 is connected to the control module 11 and the depth processing chip 15, the depth processing chip 15 is connected to the control module 11, the infrared camera 12 and the structured light emitter 13, and the visible light camera 14 is connected to the control module 11.
  • based on the image processing device shown in Figure 1, when the control module 11 detects that the user starts the image application function, it generates a request instruction for the image application function; the control module 11 transmits the request instruction through the integrated circuit bus (I2C bus, Inter-Integrated Circuit) to the infrared camera 12, the visible light camera 14 and the depth processing chip 15; in response to the request instruction, the depth processing chip 15 controls the emission drive module 15-1 to drive the structured light emitter 13 to emit laser light with certain structural characteristics and project it onto the object to be detected; in synchronization with the start of the laser emission, the infrared camera 12 starts to photograph the object to be detected and collects speckle images of it.
  • the infrared camera 12 is controlled to transmit the speckle image to the depth processing chip 15, and the depth processing chip 15 processes the speckle image, converting the change of the light structure into depth information to obtain the depth image of the object to be detected; the three-dimensional structural features of the object to be detected can be obtained from the depth image, and the image application function is realized based on those three-dimensional structural features.
  • control module 11 may include a sensor and a processor.
  • the sensor detects an operation input by the user and transmits the operation to the processor.
  • the processor determines whether the operation meets the operating conditions for starting the image application function. When the operating conditions for starting the image application function are met, the request instruction of the image application function is generated; otherwise, the request instruction is not generated.
  • the depth processing chip 15 includes an application specific integrated circuit (ASIC, Application Specific Integrated Circuit), and the ASIC includes an emission drive module 15-1, an ARM processor (Advanced RISC Machines) 15-2, a flash memory (Flash Memory) 15-3 and an arithmetic core (Core) 15-4. When the control module 11 transmits the request command of the image application function to the ASIC via the I2C bus, the ARM processor 15-2 receives the request command via the I2C bus, stores the request command in the flash memory 15-3, and, according to the request instruction and a program programmed in advance, controls the emission drive module 15-1 to drive the structured light transmitter 13; on the other hand, when the infrared camera 12 transmits the speckle image to the ASIC, the ARM processor 15-2 transmits the speckle image to the arithmetic core 15-4, and the arithmetic core 15-4 performs depth calculation on the speckle image to obtain a depth image.
  • FIG. 2 is a schematic diagram of the positional relationship of equipment when collecting speckle images according to an embodiment of the application.
  • the structured light transmitter 13 includes a near-infrared laser 13-1 and a diffractive optical element (DOE, Diffractive Optical Element) 13-2; the near-infrared laser 13-1 and the diffractive optical element 13-2 are at the same position in the image processing device 1, the diffractive optical element 13-2 is located on the laser emitting side of the near-infrared laser 13-1, and the near-infrared laser 13-1 and the infrared camera 12 are arranged side by side in the image processing device 1 at a certain interval.
  • the distance between the near-infrared laser 13-1 and the infrared camera 12 is the baseline distance, for example, 40 mm; the object 20 to be detected is located within the light projection area of the structured light transmitter 13 and the imaging area of the infrared camera 12; the light projection area includes the area within the two dotted lines connected to the diffractive optical element 13-2 in FIG. 2, and the imaging area includes the area within the two dotted lines connected to the infrared camera 12 in FIG. 2.
  • the distance between the object 20 to be detected and the line connecting the near-infrared laser 13-1 and the infrared camera 12 is the vertical distance; the irradiation angle from the near-infrared laser 13-1 to the object 20 to be detected is α, and the viewing angle from the infrared camera 12 to the object 20 to be detected is β;
  • the sensing wavelength of the infrared sensor (IR Sensor) in the infrared camera 12 is equal to the laser wavelength of the near-infrared laser 13-1, and the infrared camera 12 may be a near-infrared camera.
  • the near-infrared laser 13-1 emits laser light of a certain wavelength onto the diffractive optical element 13-2, and the diffractive optical element 13-2 diffuses the laser light to form a spot pattern 21; the spot pattern 21 is projected onto the object 20 to be detected, and the spot pattern 21 is composed of tens of thousands of diffraction spots within a certain angle range; the infrared camera 12 photographs the object 20 to be detected with the spot pattern 21 projected onto it, obtaining a black-and-white speckle image; depth information is calculated from the speckle image to obtain the depth image.
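Given the geometry above (baseline between laser and camera, angles α and β), depth can be recovered by triangulation; equivalently, a calibrated structured-light system computes depth from the pixel disparity of each speckle against a reference image. A minimal sketch using the standard relations; the focal length, reference distance, and disparity values are made-up calibration numbers, not patent values:

```python
import math

def depth_from_angles(baseline_mm, alpha_deg, beta_deg):
    """Triangulated perpendicular distance to the object, given the
    irradiation angle alpha (laser side) and the viewing angle beta
    (camera side), both measured from the baseline joining laser and
    camera:  z = b / (cot(alpha) + cot(beta))."""
    a = math.radians(alpha_deg)
    b = math.radians(beta_deg)
    return baseline_mm / (1.0 / math.tan(a) + 1.0 / math.tan(b))

def depth_from_disparity(f_px, baseline_mm, z_ref_mm, disparity_px):
    """Structured-light depth from the pixel shift of a speckle relative
    to a reference image captured at known distance z_ref:
        1/z = 1/z_ref + d / (f * b)
    Positive disparity means the surface is nearer than the reference."""
    return 1.0 / (1.0 / z_ref_mm + disparity_px / (f_px * baseline_mm))
```

With the 40 mm baseline mentioned above and equal 45° angles, `depth_from_angles` gives a 20 mm vertical distance; real face-unlock distances correspond to much shallower angles.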
  • the near-infrared laser 13-1 is a laser that emits laser light with a wavelength of 940 nm, for example, a vertical cavity surface emitting laser (VCSEL, Vertical Cavity Surface Emitting Laser), and the diffractive optical element 13-2 diffracts the laser light with a wavelength of 940 nm.
  • a spot pattern is formed; when the spot pattern is projected onto a standard object, the infrared camera 12 is used to photograph the standard object, and the obtained spot image is shown in Figure 3(a).
  • the standard object includes a plane; when the spot pattern is projected onto the portrait sculpture, the infrared camera 12 photographs the portrait sculpture, and the obtained speckle image is shown in Figure 3(b); depth information is calculated from the speckle image, and the obtained depth image is shown in Figure 3(c);
  • the color depth of the depth image represents the distance between various places in the portrait sculpture and the structured light emitter 13.
  • the image processing device 1 further includes a visible light camera 14 and a light-emitting source.
  • the visible light camera 14, the near-infrared laser 13-1, and the infrared camera 12 are arranged side by side in the image processing device 1 at a certain interval; the light-emitting source is located in the vicinity of the visible light camera 14 and is used to supplement light for the visible light camera 14; the visible light camera 14 includes a three-primary-color camera (RGB camera).
  • since the image processing device shown in FIG. 1 integrates the emission drive module 15-1 of the structured light emitter 13 into the depth processing chip, and the circuit space of the emission drive module 15-1 is relatively large, the requirements for the integration of the depth processing chip are relatively high, and the volume of the depth processing chip becomes too large. Moreover, since the emission drive module 15-1 is integrated in the depth processing chip, the control and modulation of the structured light signal, such as the laser pulse signal, is also carried out inside the depth processing chip; because the depth processing chip is usually pre-programmed and cannot be changed, the modulation mode of the structured light signal cannot be flexibly adjusted. Furthermore, when the emission drive module 15-1 in the current image processing device fails or becomes inapplicable, it is difficult and costly to replace or repair, so stable acquisition of the depth image cannot be guaranteed, resulting in poor stability of the image application function.
  • FIG. 4(a) is a schematic structural diagram of another image processing device provided by an embodiment of the application.
  • the image processing device 41 includes a control module 11-0, an infrared camera 12-0, a structured light transmitter 13-0, an emission drive module 45-0 and an image processing module 46-0; the control module 11-0 is connected to the infrared camera 12-0 and the image processing module 46-0 via the I2C bus;
  • the infrared camera 12-0 is connected to the control module 11-0, the emission drive module 45-0 and the image processing module 46-0, and the image processing module 46-0 is connected to the control module 11-0 and the infrared camera 12-0.
  • the image processing device may include more or fewer components than shown in the figure, or a combination of some components, or different component arrangements; the image processing device can be implemented in various forms, for example, it can include mobile terminals such as mobile phones, tablet computers, notebook computers and palmtop computers, and fixed terminals such as desktop computers.
  • when the terminal starts an image application function based on structured light technology, for example, when the user starts the face unlocking or payment function on the terminal, the control module 11-0 detects the request command of the corresponding image application function; in response to the request command, the control module 11-0 sends a start command to the infrared camera 12-0 and the image processing module 46-0 through the I2C bus.
  • in response to the start instruction, the image processing module 46-0 enters the ready-to-operate state, and the infrared camera 12-0 enters the on state and sends a trigger signal to the emission drive module 45-0; in response to the trigger signal, the emission drive module 45-0 starts, drives and controls the structured light transmitter 13-0 to emit laser light, and at the same time feeds back a synchronization signal to the infrared camera 12-0; in response to the synchronization signal, the infrared camera 12-0 starts to expose, thereby obtaining a speckle image of the object to be detected, such as a human face; the infrared camera 12-0 transmits the speckle image to the image processing module 46-0.
  • after receiving the speckle image, the image processing module 46-0 performs depth calculation on the speckle image to obtain a depth image of the object to be detected; the depth image reflects the three-dimensional structural features of the object to be detected, such as a human face. In this way, the image processing device 41 can complete the corresponding image application function in the request instruction, such as face unlocking or payment, according to the depth image of the object to be detected, which is not limited in the embodiment of the present application.
  • the emission drive module 45-0 in FIG. 4(a) can work independently of the depth processing chip 15 or the image processing module 46-0 that realizes the depth image processing function, thereby improving the maintainability of the image processing device; moreover, the laser emission time of the structured light transmitter 13-0 and the exposure collection time of the infrared camera 12-0 can be synchronized by the synchronization signal, avoiding a complicated pre-commissioning process, thus ensuring that the speckle image and the depth image are collected normally, and finally improving the stability of the image application function.
  • FIG. 4(b) is a schematic structural diagram of another image processing device provided by an embodiment of the application.
  • the image processing device 4 includes a control module 11, an infrared camera 12, a structured light transmitter 13, a visible light camera 14, an emission drive module 45, and an image processing module 46; the control module 11 is connected to the infrared camera 12, the visible light camera 14 and the image processing module 46 via the I2C bus; the infrared camera 12 is connected to the control module 11, the emission drive module 45 and the image processing module 46; the image processing module 46 is connected to the control module 11 and the infrared camera 12; and the visible light camera 14 is connected to the control module 11.
  • When the control module 11 detects a request instruction for an image application function, in response to the request instruction, the control module 11 sends a start instruction through the I2C bus, and in response to the start instruction, the visible light camera 14 enters the on state.
  • The image processing device may include more or fewer components than shown in the figure, combine certain components, or use a different arrangement of components. The image processing device can be implemented in various forms; for example, it can include mobile terminals such as mobile phones, tablet computers, notebook computers, and palmtop computers, as well as fixed terminals such as desktop computers.
  • FIG. 5 is a schematic flowchart of an image processing method provided by an embodiment of the present application. As shown in FIG. 5, the image processing method includes the following steps:
  • S501: When the image processing device detects the request instruction of an image application function, it obtains the emission parameters and transmits them to the emission drive module 45; at the same time, it also transmits the request instruction to the infrared camera 12, and in response to the request instruction, the infrared camera 12 transmits a trigger signal to the emission drive module 45. Here, the emission parameters are the parameters of the laser, including the laser pulse frequency, the laser pulse duty cycle, the laser pulse period, and so on; the trigger signal is a signal that drives the emission drive module 45 to start emitting laser light, for example, a high level.
  • In some embodiments, the image processing module 46 includes a signal modulation module. After the image processing device detects the request instruction of the image application function, it transmits the request instruction to the image processing module 46; in response to the request instruction, the image processing module 46 controls the signal modulation module to transmit the emission parameters to the emission drive module 45.
  • When the image processing device detects the request instruction, it transmits the request instruction to the image processing module 46. The image processing module 46 includes a signal modulation module that generates the emission parameters, and in response to the request instruction, the image processing module 46 controls the signal modulation module to transmit the emission parameters to the emission drive module 45.
  • Based on FIG. 4(b), a schematic structural diagram of an image processing device provided by an embodiment of the present application may be as shown in FIG. 6, where the image processing module 46 shown in FIG. 4(b) may be an ASIC. The image processing module 46 may also include a signal modulation module 46-1, which is integrated in the ASIC in the form of a signal modulation circuit; after the image processing module 46 receives the request instruction, in response to the request instruction, the signal modulation circuit is turned on and the emission parameters are transmitted to the emission drive module 45.
  • A schematic structural diagram of an image processing device provided by an embodiment of the present application may be as shown in FIG. 7, where the image processing module 46 shown in FIG. 4(b) may be a digital signal processor (DSP, Digital Signal Processor). The signal modulation module 46-1 is written into the DSP register 46-2 (Register) in the form of a signal modulation program; after the image processing module 46 receives the request instruction, it calls and runs the signal modulation program to transmit the emission parameters to the emission drive module 45. The image processing device can read and write the signal modulation program in the DSP through the control module 11, so as to modify the emission parameters.
  • In some embodiments, the infrared camera 12 includes a signal modulation module 12-1, a trigger module 12-2, and a timing control circuit 12-3. After the image processing device detects the request instruction of the image application function, it transmits the request instruction to the timing control circuit 12-3; in response to the request instruction, the timing control circuit 12-3 controls the signal modulation module 12-1 to transmit the emission parameters to the emission drive module 45, and controls the trigger module 12-2 to transmit the trigger signal to the emission drive module 45.
  • When the control module 11 in the image processing device detects the request instruction, it forwards the request instruction to the timing control circuit 12-3; in response to the request instruction, the timing control circuit 12-3 transmits an instruction to the signal modulation module 12-1 so that the signal modulation module 12-1 transmits the emission parameters to the emission drive module 45, and also transmits an instruction to the trigger module 12-2 so that the trigger module 12-2 transmits the trigger signal to the emission drive module 45.
  • In this arrangement, the timing control circuit 12-3 is connected to the signal modulation module 12-1 and the trigger module 12-2 respectively, and is also connected to the control module 11 and the emission drive module 45. When the control module 11 detects the request instruction, it transmits the request instruction to the timing control circuit 12-3 in the infrared camera 12; in response to the request instruction, the timing control circuit 12-3 controls the signal modulation module 12-1 to transmit the emission parameters to the emission drive module 45 and controls the trigger module 12-2 to transmit the trigger signal to the emission drive module 45.
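The dispatch just described — the timing control circuit, on receiving a request instruction, having the signal modulation module pass the emission parameters to the emission drive module and the trigger module raise the trigger signal — can be sketched in hypothetical Python. All class and field names here are illustrative assumptions, not from the patent.

```python
# Illustrative sketch (not the patent's implementation) of the request
# dispatch performed by the timing control circuit 12-3.

class EmissionDriveModule:
    """Stand-in for the emission drive module 45."""
    def __init__(self):
        self.params = None
        self.trigger = 0              # 0 = low level, 1 = high level

    def receive_params(self, params):
        self.params = params          # emission parameters arrive here

    def receive_trigger(self, level):
        self.trigger = level          # trigger signal arrives here

class TimingControlCircuit:
    """Stand-in for the timing control circuit 12-3."""
    def __init__(self, drive, params):
        self.drive = drive
        self.params = params          # e.g. pulse frequency / duty cycle

    def on_request(self):
        # Signal-modulation path: pass the emission parameters on.
        self.drive.receive_params(self.params)
        # Trigger path: the trigger signal is, for example, a high level.
        self.drive.receive_trigger(1)

drive = EmissionDriveModule()
tcc = TimingControlCircuit(drive, {"freq_hz": 33, "duty": 0.1})
tcc.on_request()
```

After `on_request()`, the drive module holds both the parameters and a raised trigger, mirroring the two control paths described above.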
  • When the image processing module 46 includes the signal modulation module 46-1, the infrared camera 12 does not include the signal modulation module 12-1, and the structural diagram of the infrared camera 12 in that case is shown in FIG. 8(b).
  • In some embodiments, the image application functions include a face unlock function, a face payment function, and a three-dimensional modeling function. When the control module 11 in the image processing device detects that the user has started any image application function, it generates a request instruction for that image application function. For example, when the control module 11 detects a screen lighting operation and determines that the image processing device is in the locked state, it determines that the user has activated the face unlock function, and generates a request instruction for the face unlock function, that is, a face unlock request.
  • S502: In response to the trigger signal, control the structured light emitter to emit laser light according to the emission parameters through the emission drive module, and transmit a synchronization signal to the infrared camera through the emission drive module;
  • After the emission drive module 45 receives the emission parameters and the trigger signal, it controls the structured light emitter 13 to start emitting laser light according to the emission parameters, and transmits a synchronization signal to the infrared camera 12 once. The emission parameters include the laser pulse frequency, the laser pulse duty cycle, and the laser pulse period, and the emission drive module 45 controls the structured light emitter 13 to periodically emit laser light according to these parameters. For example, the laser pulse frequency may be 33 pulses per second; correspondingly, the period of each laser pulse is approximately 30 ms, and the duty cycle of the laser pulse may be 10%.
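As an illustration of how an emission drive module might derive laser pulse timing from the emission parameters named above (pulse frequency and duty cycle), consider the following sketch; the function and parameter names are assumptions for illustration, not from the patent.

```python
# Hypothetical sketch: derive the laser pulse period and per-period
# laser-on time from the emission parameters (frequency, duty cycle).

def pulse_timing(frequency_hz: float, duty_cycle: float):
    """Return (period_ms, on_time_ms) for a periodic laser pulse train."""
    if frequency_hz <= 0 or not 0 < duty_cycle <= 1:
        raise ValueError("invalid emission parameters")
    period_ms = 1000.0 / frequency_hz      # one pulse period in milliseconds
    on_time_ms = period_ms * duty_cycle    # laser-on time within each period
    return period_ms, on_time_ms

# Example values from the text: 33 pulses per second at a 10% duty cycle.
period, on_time = pulse_timing(33.0, 0.10)   # period ~30.3 ms, on ~3.03 ms
```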
  • In the embodiment of the present application, the emission drive module 45 and the infrared camera 12 are directly connected. The emission drive module 45 only needs to transmit a synchronization signal directly to the infrared camera 12 when it starts to emit laser light, and the infrared camera 12 is controlled to start shooting after receiving the synchronization signal. In this way, the start shooting time of the infrared camera 12 is synchronized with the start emission time of the structured light emitter 13, avoiding the complicated synchronization debugging of the laser pulse frequency of the depth processing chip and the collection frequency of the infrared camera 12.
  • In some embodiments, the infrared camera 12 includes a signal modulation module 12-1, and the signal modulation module 12-1 is connected to the control module 11 through a timing control circuit 12-3 in the infrared camera 12. When the control module 11 in the image processing device receives a parameter modification instruction, it transmits the parameter modification instruction to the signal modulation module 12-1 through the timing control circuit 12-3; in response to the parameter modification instruction, the signal modulation module 12-1 is controlled to modify the emission parameters to obtain updated emission parameters. That is, after the control module 11 receives the parameter modification instruction, it forwards the parameter modification instruction to the signal modulation module 12-1 in the infrared camera 12 through the timing control circuit 12-3, so that the signal modulation module 12-1 modifies the emission parameters according to the parameter modification instruction and obtains the updated emission parameters.
  • Further, the structured light emitter 13 is controlled through the emission drive module 45 to emit laser light according to the updated emission parameters; that is, after the emission drive module 45 receives the updated emission parameters and the trigger signal, it controls the structured light emitter 13 to start emitting laser light according to the updated emission parameters. When the image processing module 46 is a DSP and the DSP includes a signal modulation module, the control module 11 controls the signal modulation module in the DSP to modify the emission parameters, in the same way as it controls the signal modulation module in the infrared camera 12 to do so. It can be understood that the parameter control and adjustment of the structured light signal are realized jointly by the signal modulation module inside the image processing module and the control module outside the image processing module, so that the structured light signal can be conveniently controlled and adjusted, which improves the flexibility of the terminal in modulating the structured light signal.
  • S503 In response to the synchronization signal, control the infrared camera to collect a speckle image of the object to be detected;
  • After the infrared camera 12 receives the synchronization signal from the emission drive module 45, in response to the synchronization signal, it photographs the object to be detected and collects a speckle image of the object to be detected.
  • In some embodiments, the infrared camera 12 further includes an image acquisition module 12-4. The emission drive module 45 transmits the synchronization signal to the timing control circuit 12-3; in response to the synchronization signal, the timing control circuit 12-3 controls the image acquisition module 12-4 so that the image acquisition module 12-4 periodically acquires speckle images.
  • The infrared camera 12 further includes an image acquisition module 12-4 connected to the timing control circuit 12-3. The image acquisition module 12-4 includes a row selection circuit 12-41, a column selection circuit 12-42, an image array 12-43, an amplifier 12-44, an auto focus circuit 12-45, an auto exposure circuit 12-46, and an analog-to-digital converter (ADC, Analog-to-Digital Converter) 12-47. Among them, the row selection circuit 12-41, the column selection circuit 12-42, the amplifier 12-44, the auto focus circuit 12-45, and the auto exposure circuit 12-46 are all connected to the timing control circuit 12-3; the image array 12-43 is connected to the amplifier 12-44, the auto focus circuit 12-45, and the auto exposure circuit 12-46; the amplifier 12-44 is also connected to the analog-to-digital converter 12-47; and the analog-to-digital converter 12-47 is connected to the image processing module 46.
  • The timing control circuit 12-3 controls the image array 12-43 to receive the exposure time from the auto exposure circuit 12-46 and the exposure level from the auto focus circuit 12-45, and also selectively turns on the row selection circuit 12-41 or the column selection circuit 12-42. The image array 12-43 starts to expose according to the exposure time and exposure level, that is, it converts the light signal into electric charge, and then transfers the charge to the amplifier 12-44; after the charge is amplified by the amplifier 12-44, it is transferred to the analog-to-digital converter 12-47, and the analog-to-digital converter 12-47 generates the speckle image.
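The acquisition chain just described — exposure converts light into charge, the amplifier applies a gain, and the ADC quantizes the result — can be illustrated with a toy per-pixel model. All constants, units, and names below are invented for illustration and are not from the patent.

```python
# Toy sketch of one pixel passing through the chain: image array
# (exposure: light -> charge), amplifier (gain), ADC (quantization
# with saturation at full scale).

def acquire_pixel(photon_flux: float, exposure_ms: float,
                  gain: float = 2.0, adc_bits: int = 10) -> int:
    charge = photon_flux * exposure_ms        # exposure: light -> charge
    amplified = charge * gain                 # amplifier stage
    full_scale = 2 ** adc_bits - 1            # ADC full-scale code
    return min(int(amplified), full_scale)    # quantize with saturation

sample = acquire_pixel(photon_flux=5.0, exposure_ms=20.0)    # -> 200
bright = acquire_pixel(photon_flux=500.0, exposure_ms=20.0)  # saturates at 1023
```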
  • After the infrared camera 12 collects each frame of the speckle image, it transmits each frame to the image processing module 46, so that the image processing module 46 processes each frame of the speckle image.
  • S505 Control the image processing module to perform depth calculation on the speckle image to obtain a depth image of the object to be detected, and implement an image application function based on the depth image, and the depth image represents the three-dimensional structural feature of the object to be detected.
  • The image processing module 46 performs depth calculation on each frame of the speckle image to obtain each depth image, and transmits each depth image to the control module 11, so that the control module 11 realizes the image application function based on all the depth images. In some embodiments, the infrared camera 12 periodically collects speckle images and transmits them to the image processing module 46; the image processing module 46 performs depth calculation on each frame of the collected speckle images to obtain each depth image, and may further generate video stream information based on the multiple depth images and transmit the video stream information to the control module 11.
  • In some embodiments, the image processing method further includes: controlling the infrared camera 12 to start timing at the transmission moment of the trigger signal; obtaining the timing duration when it is detected that the infrared camera 12 has received the synchronization signal; and, when the timing duration is greater than or equal to a preset duration, controlling the infrared camera 12 to generate abnormality prompt information and display it, the abnormality prompt information indicating that the emission drive module 45 is abnormal.
  • Here, the transmission moment of the trigger signal is the start moment of the timing, the moment at which the infrared camera 12 receives the synchronization signal is the stop moment, and the timing duration is obtained by subtracting the start moment from the stop moment. When the infrared camera 12 determines that the timing duration is not less than the preset duration, it generates abnormality prompt information and transmits it to the control module 11, and the control module 11 displays the abnormality prompt information on the display screen of the image processing device. The content of the abnormality prompt information includes that the synchronization signal feedback of the emission drive module 45 is abnormal, or that the image acquisition time and the laser emission time cannot be synchronized.
  • The preset duration may be the maximum interval duration set by the user according to the exposure time and the laser pulse frequency. In the case where the infrared camera 12 receives the synchronization signal of the emission drive module 45 within the maximum interval duration after transmitting the trigger signal to the emission drive module 45, the start shooting time of the infrared camera 12 and the start emission time of the structured light emitter 13 are kept synchronized.
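As a sketch, the timeout check described above might look like the following; the function name, millisecond units, and prompt text are assumptions for illustration, not from the patent.

```python
# Hypothetical sketch of the synchronization watchdog: timing starts when
# the trigger signal is sent and stops when the synchronization signal
# arrives; a duration at or above the preset duration yields an
# abnormality prompt about the emission drive module.

def sync_status(trigger_ms: float, sync_ms: float, preset_ms: float):
    """Return (timing_duration_ms, verdict) for one trigger/sync cycle."""
    duration = sync_ms - trigger_ms       # stop moment minus start moment
    if duration >= preset_ms:
        # abnormality prompt: sync feedback from the drive module failed
        return duration, "emission drive module abnormal"
    return duration, "ok"

status = sync_status(trigger_ms=10.0, sync_ms=25.0, preset_ms=50.0)
# status == (15.0, "ok"): the sync signal arrived within the preset duration
```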
  • It can be understood that the terminal obtains the emission parameters, responds to the trigger signal, and transmits the synchronization signal to the infrared camera through an independent emission drive module, thereby reducing the terminal's dependence on the depth processing chip and the depth processing chip's occupation of terminal resources, lowering the maintenance cost of the terminal, and ultimately improving the stability of the image application functions.
  • FIG. 9 is a schematic flowchart of another image processing method provided by an embodiment of the present application. As shown in FIG. 9, the image processing method includes the following steps:
  • In some embodiments, the image processing device also includes a light source. When the control module 11 detects a face unlock request, it controls the light source to emit light and transmits the face unlock request to the visible light camera 14; in response to the face unlock request, the visible light camera 14 photographs the object to be detected and collects a visible light image.
  • the visible light image includes a two-dimensional color image.
  • The control module 11 judges, from the visible light image received from the visible light camera 14, whether there is a human face in the visible light image, and obtains a detection result; the detection result indicates either that there is a human face in the visible light image, or that there is no human face in the visible light image.
  • The visible light images collected by the visible light camera are generally two-dimensional images. The image processing device can implement the face detection function through an image feature matching algorithm inside the control module 11, so as to use the control module to perform face detection on the visible light image. The control module 11 may also be used to call other image detection modules on the terminal where the image processing device is located to perform face detection on the visible light image; the embodiments of the present application do not limit the functional modules and specific methods used for face detection.
  • For the detection result that there is a human face in the visible light image, the control module 11 determines that the detection result indicates the object to be detected is the target object; for the detection result that there is no human face in the visible light image, the control module 11 determines that the detection result indicates the object to be detected is not the target object, where the target object is a person.
  • The implementation of step S903 is the same as that of the corresponding solution in step S501.
  • S904: In response to the trigger signal, control the structured light emitter to emit laser light according to the emission parameters through the emission drive module, and transmit a synchronization signal to the infrared camera through the emission drive module;
  • S905 In response to the synchronization signal, control the infrared camera to collect a speckle image of the object to be detected;
  • S906 Control the infrared camera to transmit the speckle image to the image processing module
  • The implementation of steps S904-S906 is the same as that of steps S502-S504.
  • S907 Control the image processing module to perform depth calculation on the speckle image to obtain a depth image of the object to be detected;
  • The implementation of step S907 is the same as that of the corresponding solution in step S505.
  • S908 Acquire a target depth image when the detection result indicates that the object to be detected is a target object
  • When determining that the detection result indicates the object to be detected is a person, the control module 11 obtains a target depth image from the memory of the image processing device; the target depth image is a previously obtained depth image of the target user. The control module 11 calculates the similarity between the depth image and the target depth image, and determines whether the similarity is less than a preset similarity threshold; the preset similarity threshold is set based on the minimum similarity between different depth images of the same person. When the control module 11 determines that the similarity is not less than the preset similarity threshold, it generates a face unlock response; otherwise, no face unlock response is generated. In response to the face unlock response, the unlock page is opened, and the control module 11 controls the display of the image processing device to show the unlock page.
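A minimal sketch of the unlock decision above: compare the similarity between the captured depth image and the stored target depth image against a preset threshold. The similarity metric used here (inverse of the mean absolute difference) and the threshold value are illustrative assumptions; the patent does not fix a particular metric.

```python
# Hedged sketch: face unlock as a thresholded depth-image similarity test.
# depth images are represented as flat lists of per-pixel depth values.

def depth_similarity(depth_a, depth_b) -> float:
    """Illustrative metric: 1.0 for identical images, falling toward 0."""
    diffs = [abs(x - y) for x, y in zip(depth_a, depth_b)]
    mad = sum(diffs) / len(diffs)          # mean absolute difference
    return 1.0 / (1.0 + mad)

def face_unlock(depth, target, threshold: float = 0.8) -> bool:
    """Unlock when similarity is not less than the preset threshold."""
    return depth_similarity(depth, target) >= threshold

same = face_unlock([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])   # identical -> unlock
other = face_unlock([1.0, 2.0, 3.0], [5.0, 6.0, 7.0])  # dissimilar -> reject
```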
  • The embodiment of the present application does not limit the execution order of step S908; step S908 may be executed at any position after step S902 and before step S909.
  • FIG. 10 is a schematic flowchart of yet another image processing method provided by an embodiment of the present application. As shown in FIG. 10, the image processing method includes the following steps:
  • When the control module 11 detects a three-dimensional modeling request, it controls the visible light camera 14 to photograph the object to be detected and collect a visible light image; the three-dimensional modeling request is the request instruction of the three-dimensional modeling function.
  • In response to the trigger signal, control the structured light emitter to emit laser light according to the emission parameters through the emission drive module, and transmit the synchronization signal to the infrared camera through the emission drive module;
  • The implementation of steps S1002-S1004 is the same as that of steps S502-S504, and the implementation of step S1005 is the same as that of the corresponding solution in step S505.
  • S1006 Use the visible light image to perform color rendering on the depth image to obtain a three-dimensional color image, and display the three-dimensional color image.
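Step S1006 pairs each depth value with the color at the corresponding pixel of the visible light image to form a colored three-dimensional result. A toy sketch follows; registration between the two cameras is omitted, and all names are illustrative assumptions rather than the patent's implementation.

```python
# Toy sketch of color rendering: zip per-pixel depths from the depth
# image with per-pixel (r, g, b) colors from the visible light image
# into (pixel_index, depth, color) tuples.

def render_color_depth(depth, rgb):
    """depth: list of per-pixel depths; rgb: list of (r, g, b) tuples."""
    assert len(depth) == len(rgb), "images must be pixel-aligned"
    return [(i, z, c) for i, (z, c) in enumerate(zip(depth, rgb))]

points = render_color_depth([0.5, 0.7], [(255, 0, 0), (0, 255, 0)])
# points[0] == (0, 0.5, (255, 0, 0))
```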
  • It can be understood that the emission drive module 45 controls the structured light emitter 13 to emit laser light and controls the infrared camera 12 to collect the speckle image while the laser is being emitted. The emission drive module 45 is not written once and for all, and it works independently without depending on the depth processing chip. In this way, the emission drive module 45 can easily be repaired when it fails or becomes inapplicable, ensuring that the speckle image is collected normally and therefore that the depth image is obtained normally, which improves the stability of the image application functions.
  • The image processing device 1100 includes an infrared camera 12, a structured light emitter 13, an emission drive module 45, and an image processing module 46; the infrared camera 12 is connected to the emission drive module 45 and the image processing module 46, and the structured light emitter 13 is connected to the emission drive module 45.
  • The image processing device 1100 also includes a processor 1101, a memory 1102, and a communication bus 1103. The memory 1102 communicates with the processor 1101 through the communication bus 1103 and stores one or more programs executable by the processor 1101. When the one or more programs are executed, the processor 1101 controls the infrared camera 12, the structured light emitter 13, the emission drive module 45, and the image processing module 46 to execute any image processing method of the embodiments of the present application.
  • the image processing device 1100 further includes a visible light camera 14.
  • the embodiment of the present application provides a computer-readable storage medium.
  • the computer-readable storage medium stores one or more programs.
  • the one or more programs can be executed by one or more processors 1101.
  • the disclosed device and method may be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • The division of the units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components can be combined or integrated into another system, or some features can be ignored or not implemented.
  • The coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
  • The units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
  • The functional units in the embodiments of the present application can all be integrated into one processing unit, or each unit can be used individually as a unit, or two or more units can be integrated into one unit; the integrated unit can be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • The foregoing program can be stored in a computer-readable storage medium; when executed, the program performs the steps of the foregoing method embodiments. The foregoing storage medium includes various media that can store program code, such as a removable storage device, a read-only memory (ROM, Read-Only Memory), a magnetic disk, or an optical disc.
  • When the aforementioned integrated unit of the present invention is implemented in the form of a software function module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the methods described in the various embodiments of the present invention. The aforementioned storage media include media that can store program code, such as removable storage devices, ROMs, magnetic disks, or optical discs.
  • In the embodiments of the present application, the emission drive module can work independently; it does not need to be integrated in the depth processing chip and does not need to rely on the depth processing chip, so that the emission drive module can be repaired more easily when it fails or becomes inapplicable. This ensures that the speckle image is collected normally and the depth image is obtained normally, improving the stability of the image application functions. Moreover, the parameter control and adjustment of the structured light signal are realized jointly by the signal modulation module inside the image processing module and the control module outside the image processing module, so that the structured light signal can be conveniently controlled and adjusted, improving the flexibility of the terminal in modulating the structured light signal.

Abstract

An image processing method and device, and a computer-readable storage medium. The method is applied to an image processing device and includes: when a request instruction for an image application function is detected, transmitting, in response to the request instruction, the acquired emission parameters to an emission drive module, and controlling an infrared camera to transmit a trigger signal to the emission drive module (S501); in response to the trigger signal, controlling, by means of the emission drive module, a structured light emitter to emit laser light according to the emission parameters, and transmitting a synchronization signal to the infrared camera by means of the emission drive module (S502); in response to the synchronization signal, controlling the infrared camera to collect a speckle image of an object to be detected (S503); controlling the infrared camera to transmit the speckle image to an image processing module (S504); and controlling the image processing module to perform depth calculation on the speckle image to obtain a depth image of the object to be detected, and implementing the image application function on the basis of the depth image, the depth image representing three-dimensional structural features of the object to be detected (S505).

Description

Image processing method and device, and storage medium
Cross-Reference to Related Applications
This application is filed on the basis of, and claims priority to, Chinese patent application No. 201910551552.4 filed on June 24, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present application relate to structured light technology, and in particular to an image processing method and device, and a storage medium.
Background
At present, mobile devices such as mobile phones increasingly use three-dimensional (3D, 3-Dimension) structured light technology to implement image application functions such as face unlocking, face payment, and three-dimensional modeling. Specifically, a depth processing chip drives a structured light emitter to emit light with certain structural features, which is projected onto an object to be detected; at the same time, an infrared camera photographs the object to be detected and collects a speckle image of it. The depth processing chip receives the speckle image from the infrared camera and performs depth calculation on it to obtain a depth image of the object to be detected; the depth image reflects the spatial position information of the object, and the image application functions can then be implemented on the basis of the depth image.
However, since the laser emission of the structured light emitter is controlled by the depth processing chip, and the depth processing chip is written once and cannot be changed, the emission drive function in the depth processing chip cannot be directly repaired when it fails or becomes inapplicable. In that case, stable acquisition of the depth image cannot be guaranteed, which results in poor stability of the image application functions.
Summary
The present application provides an image processing method and device, and a storage medium, which can improve the stability of image application functions.
The technical solutions of the present application are implemented as follows:
An embodiment of the present application provides an image processing method applied to an image processing device, the image processing device including an infrared camera, a structured light emitter, an emission drive module, and an image processing module, the method including:
when a request instruction for an image application function is detected, transmitting, in response to the request instruction, the acquired emission parameters to the emission drive module, and controlling the infrared camera to transmit a trigger signal to the emission drive module;
in response to the trigger signal, controlling, by means of the emission drive module, the structured light emitter to emit laser light according to the emission parameters, and transmitting a synchronization signal to the infrared camera by means of the emission drive module;
in response to the synchronization signal, controlling the infrared camera to collect a speckle image of an object to be detected;
controlling the infrared camera to transmit the speckle image to the image processing module; and
controlling the image processing module to perform depth calculation on the speckle image to obtain a depth image of the object to be detected, and implementing the image application function on the basis of the depth image, the depth image representing three-dimensional structural features of the object to be detected.
In the above solution, the image processing module includes a signal modulation module;
the transmitting, in response to the request instruction, the acquired emission parameters to the emission drive module includes:
transmitting the request instruction to the image processing module; and
in response to the request instruction, controlling the signal modulation module to transmit the emission parameters to the emission drive module.
In the above solution, the infrared camera includes a signal modulation module, a trigger module, and a timing control circuit;
the transmitting, in response to the request instruction, the acquired emission parameters to the emission drive module and controlling the infrared camera to transmit the trigger signal to the emission drive module includes:
transmitting the request instruction to the timing control circuit; and
in response to the request instruction, controlling, by means of the timing control circuit, the signal modulation module to transmit the emission parameters to the emission drive module, and controlling the trigger module to transmit the trigger signal to the emission drive module.
In the above solution, the infrared camera further includes an image acquisition module;
the controlling the infrared camera to collect the speckle image of the object to be detected includes:
transmitting the synchronization signal to the timing control circuit; and
in response to the synchronization signal, controlling, by means of the timing control circuit, the image acquisition module to periodically collect the speckle image.
In the above solution, the infrared camera includes a signal modulation module;
before the controlling, in response to the trigger signal and by means of the emission drive module, the emission drive module to emit laser light according to the emission parameters, the method further includes:
when a parameter modification instruction is received, transmitting the parameter modification instruction to the signal modulation module; and
in response to the parameter modification instruction, controlling the signal modulation module to modify the emission parameters to obtain updated emission parameters.
In the above solution, the controlling, in response to the trigger signal and by means of the emission drive module, the structured light emitter to emit laser light according to the emission parameters includes:
in response to the trigger signal, controlling, by means of the emission drive module, the structured light emitter to emit laser light according to the updated emission parameters.
In the above solution, the request instruction includes a face unlock request, and the image processing device further includes a visible light camera;
the transmitting, when the request instruction for the image application function is detected and in response to the request instruction, the acquired emission parameters to the emission drive module and controlling the infrared camera to transmit the trigger signal to the emission drive module includes:
when the face unlock request is detected, controlling, in response to the face unlock request, the visible light camera to collect a visible light image of the object to be detected;
performing face detection on the visible light image to obtain a detection result; and
when the detection result indicates that the object to be detected is a target object, transmitting the emission parameters to the emission drive module, and controlling the infrared camera to transmit the trigger signal to the emission drive module.
In the above solution, the implementing the image application function on the basis of the depth image includes:
acquiring a target depth image;
calculating a similarity between the depth image and the target depth image; and
when the similarity is greater than or equal to a preset similarity threshold, generating a face unlock response, and, in response to the face unlock response, opening an unlock page.
In the above solution, the request instruction includes a three-dimensional modeling request, and the image processing device further includes a visible light camera;
before the transmitting the acquired emission parameters to the emission drive module, the method further includes:
when the three-dimensional modeling request is detected, controlling, in response to the three-dimensional modeling request, the visible light camera to collect a visible light image of the object to be detected.
In the above solution, the implementing the image application function on the basis of the depth image includes:
performing color rendering on the depth image by using the visible light image to obtain a three-dimensional color image, and displaying the three-dimensional color image.
In the above solution, after the transmitting the acquired emission parameters to the emission drive module and controlling the infrared camera to transmit the trigger signal to the emission drive module, the method further includes:
controlling the infrared camera to start timing, taking the transmission moment of the trigger signal as the start moment;
obtaining a timing duration when it is detected that the infrared camera has received the synchronization signal; and
when the timing duration is greater than or equal to a preset duration, controlling the infrared camera to generate abnormality prompt information and displaying the abnormality prompt information, the abnormality prompt information indicating that an abnormality has occurred in the emission drive module.
An embodiment of the present application provides an image processing device, the image processing device including a control module, an infrared camera, a structured light emitter, an emission drive module, and an image processing module; the control module is connected to the infrared camera and the image processing module via a bus, the infrared camera is connected to the control module, the emission drive module, and the image processing module, and the image processing module is connected to the control module and the infrared camera;
the control module is configured to, when a request instruction for an image application function is detected, transmit, in response to the request instruction, the acquired emission parameters to the emission drive module, and control the infrared camera to transmit a trigger signal to the emission drive module;
the emission drive module is configured to, in response to the trigger signal, control the structured light emitter to emit laser light according to the emission parameters, and transmit a synchronization signal to the infrared camera;
the infrared camera is configured to collect, in response to the synchronization signal, a speckle image of an object to be detected, and to transmit the speckle image to the image processing module; and
the image processing module is configured to perform depth calculation on the speckle image to obtain a depth image of the object to be detected, and to implement the image application function on the basis of the depth image, the depth image representing three-dimensional structural features of the object to be detected.
An embodiment of the present application provides an image processing device, the image processing device including an infrared camera, a structured light emitter, an emission drive module, and an image processing module; the infrared camera is connected to the emission drive module and the image processing module, and the structured light emitter is connected to the emission drive module; the image processing device further includes a processor, a memory, and a communication bus, the memory communicating with the processor via the communication bus and storing one or more programs executable by the processor; when the one or more programs are executed, the processor controls the infrared camera, the structured light emitter, the emission drive module, and the image processing module to execute any one of the image processing methods described above.
An embodiment of the present application provides a computer-readable storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement any one of the image processing methods described above.
The embodiments of the present application provide an image processing method and device, and a storage medium. The method includes: first transmitting the acquired emission parameters to the emission drive module and controlling the infrared camera to transmit a trigger signal to the emission drive module; the emission drive module then controls the structured light emitter to emit laser light according to the emission parameters and at the same time transmits a synchronization signal to the infrared camera; in response to the synchronization signal, the infrared camera collects a speckle image of the object to be detected while the structured light emitter is emitting laser light; finally, the image processing module performs depth calculation on the speckle image to obtain a depth image of the object to be detected, and the image application function is implemented on the basis of the depth image. With the above technical solution, the emission drive module, based on the acquired emission parameters and the trigger signal transmitted from the infrared camera, controls the structured light emitter to emit laser light and controls the infrared camera to collect the speckle image while the laser is being emitted, instead of integrating the emission drive function into the depth processing chip and having the entire depth processing chip control the laser emission of the structured light emitter and the speckle image collection of the infrared camera. The emission drive module can thus work independently without depending on the depth processing chip. In this way, when the emission drive module fails or becomes inapplicable, it is easier to repair, which ensures that the speckle image is collected normally and the depth image is obtained normally, and ultimately improves the stability of the image application functions.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of an image processing device provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of the positional relationship of devices when collecting a speckle image, provided by an embodiment of the present application;
FIG. 3(a) is a schematic diagram of a spot pattern image provided by an embodiment of the present application;
FIG. 3(b) is a schematic diagram of a speckle image provided by an embodiment of the present application;
FIG. 3(c) is a schematic diagram of a depth image provided by an embodiment of the present application;
FIG. 4(a) is a schematic structural diagram of another image processing device provided by an embodiment of the present application;
FIG. 4(b) is a schematic structural diagram of yet another image processing device provided by an embodiment of the present application;
FIG. 5 is a schematic flowchart of an image processing method provided by an embodiment of the present application;
FIG. 6 is a third schematic structural diagram of an image processing device provided by an embodiment of the present application;
FIG. 7 is a fourth schematic structural diagram of an image processing device provided by an embodiment of the present application;
FIG. 8(a) is a schematic structural diagram of an infrared camera provided by an embodiment of the present application;
FIG. 8(b) is a schematic structural diagram of another infrared camera provided by an embodiment of the present application;
FIG. 9 is a schematic flowchart of another image processing method provided by an embodiment of the present application;
FIG. 10 is a schematic flowchart of yet another image processing method provided by an embodiment of the present application;
FIG. 11 is a fifth schematic structural diagram of an image processing device provided by an embodiment of the present application.
Detailed Description
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述。
目前,对于基于深度图像实现图像应用功能的图像处理装置的示意图如图1所示,图1为本申请实施例提供的一种图像处理装置的示意图,图像处理装置1包括控制模块11、红外摄像头12(IR camera,Infrared Radiation Camera)、结构光发射器13、可见光摄像头14和深度处理芯片15,深度处理芯片15包括发射驱动模块(Driver)15-1,控制模块11连接红外摄像头12、可见光摄像头14和深度处理芯片15,红外摄像头12连接控制模块11和深度处理芯片15,深度处理芯片15连接控制模块11、红外摄像头12和结构光发射器13,可见光摄像头14连接控制模块11。
基于图1所示的图像处理装置,控制模块11检测到用户启动图像应用功能时,生成图像应用功能的请求指令;控制模块11通过集成电路总线(I2C总线,Inter-Integrated Circuit),将请求指令传输至红外摄像头12、可见光摄像头14和深度处理芯片15;响应于请求指令,深度处理芯片15控制发射驱动模块15-1来驱动结构光发射器13发射出具有一定结构特征的激光,投射到待检测对象上,与激光的开始发射时间同步,由红外摄像头12开始对待检测对象进行拍摄,采集到待检测物体的散斑图像,由于待检测物体中各处与结构光发射器13的距离不同,待检测物体中各处的光线结构的变化也不同;控制红外摄像头12将散斑图像传输至深度处理芯片15中,由深度处理芯片15对散斑图像进行处理,将待检测物体中各处的光线结构的变化换算成深度信息,得到待检测物体的深度图像,根据深度图像可以获得待检测物体的三维结构特征,基于待检测物体的三维结构特征实现图像应用功能。
需要说明的是,目前,基于图1所示的图像处理装置,为了使得红外摄像头12的开始拍摄时间与结构光发射器13的开始发射时间同步,需要预先对深度处理芯片15的激光脉冲频率、以及红外摄像头12采集频率进行同步调试。
示例性地,控制模块11可以包括传感器和处理器,通过传感器检测用户输入的操作,将该操作传输至处理器中,由处理器判断该操作是否符合启动图像应用功能的操作条件,当该操作符合启动图像应用功能的操作条件时,生成图像应用功能的请求指令,否则,不生成请求指令。
示例性地,深度处理芯片15包括专用集成电路(ASIC,Application Specific Integrated Circuit),ASIC包括发射驱动模块15-1、ARM处理器(Advanced RISC Machines)15-2、闪存(Flash Memory)15-3和运算核心(Core)15-4,当控制模块11通过I2C总线向ASIC传输图像应用功能的请求指令时,ARM处理器15-2通过I2C总线接收请求指令,将请求指令存储至闪存15-3中,并根据请求指令和用户预先编写的程序,控制发射驱动模块15-1驱动结构光发射器13;另一方面,当红外摄像头12向ASIC传输散斑图像时,ARM处理器15-2将散斑图像传输至运算核心15-4中,运算核心15-4对散斑图像进行深度计算,得到深度图像。
在一些实施例中,图2为本申请实施例提供的一种采集散斑图像时的设备位置关系示意图,结构光发射器13包括近红外激光器13-1和衍射光学元件(DOE,Diffractive Optical Elements)13-2,近红外激光器13-1和衍射光学元件13-2在图像处理装置1中的同一个位置,衍射光学元件13-2位于近红外激光器13-1的激光发射侧,近红外激光器13-1和红外摄像头12按照一定间隔,并列排布在图像处理装置1中,近红外激光器13-1和红外摄像头12之间的距离即基线距离,例如,40毫米;待检测物体20位于结构光发射器13的光线投射区域内、以及红外摄像头12的拍摄区域内,光线投射区域包括图2中的与衍射光学元件13-2连接的两条虚线内的区域,拍摄区域包括图2中的与红外摄像头12连接的两条虚线内的区域,待检测物体20到近红外激光器13-1和红外摄像头12的连线的距离为垂直距离,近红外激光器13-1到待检测物体20的照射角为α,红外摄像头12到待检测物体20的视角为β;其中,红外摄像头12中的红外传感器(IR Sensor)的波长等于近红外激光器13-1的激光波长,红外摄像头12可以为近红外摄像头。
进一步地,近红外激光器13-1发射出一定波长的激光到衍射光学元件13-2上,衍射光学元件13-2对激光进行扩散后形成光斑图案21;将光斑图案21投射到待检测物体20上,光斑图案21是由数以万计的一定角度范围内的衍射斑点组成的;利用红外摄像头12对投射有光斑图案21的待检测物体20进行拍摄,得到黑白的散斑图像;对散斑图像计算深度信息,得到深度图像。
示例性地,近红外激光器13-1为发射波长为940nm的激光的激光器,例如,垂直腔面发射激光器(VCSEL,Vertical Cavity Surface Emitting Laser),衍射光学元件13-2对波长为940nm的激光进行扩散后形成光斑图案;当光斑图案投射到一个标准物体上,利用红外摄像头12对该标准物体进行拍摄,得到的光斑图像如图3(a)所示,标准物体包括平面;当光斑图像投射到人像雕塑上,利用红外摄像头12对该人像雕塑进行拍摄,得到的散斑图像如图3(b)所示;对该散斑图像计算深度信息,得到的深度图像如图3(c)所示,深度图像的颜色深浅表示人像雕塑中各处与结构光发射器13的距离大小。
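为便于理解上文基线距离与视差换算深度的原理,下面给出一个极简的三角测量示意(假设性示例,并非本专利公开的具体深度算法;其中焦距 500 像素、视差 25 像素均为虚构参数,基线 40 毫米对应正文示例):

```python
# 结构光/双目测距常用的近似公式:深度 Z = f * B / d,
# 其中 f 为焦距(像素)、B 为基线距离(毫米)、d 为散斑的像素视差。

def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """由视差估算深度(毫米)。disparity_px 必须为正。"""
    if disparity_px <= 0:
        raise ValueError("视差必须为正")
    return focal_px * baseline_mm / disparity_px

# 示例:焦距 500 像素(虚构)、基线 40 毫米(对应正文示例)、视差 25 像素(虚构)
z = depth_from_disparity(500.0, 40.0, 25.0)
print(z)  # 800.0,即待检测物体距离约 0.8 米
```

实际的结构光深度计算通常还包含散斑块匹配、标定与畸变校正等步骤,此处仅示意视差到深度的换算关系。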
在一些实施例中,图像处理装置1还包括可见光摄像头14和发光源,可见光摄像头14、近红外激光器13-1和红外摄像头12按照一定间隔,并列排布在图像处理装置1中;发光源位于可见光摄像头14的附近,用于为可见光摄像头14补光;其中,可见光摄像头14包括三原色摄像头(RGB camera)。
需要说明的是,目前,由于图1所示的图像处理装置将结构光发射器13的发射驱动模块15-1集成在深度处理芯片中,而发射驱动模块15-1的电路空间占用较大,因此会造成对深度处理芯片的集成度要求较高,且深度处理芯片的体积也过大;并且,由于发射驱动模块15-1被集成在深度处理芯片中,因此目前对结构光信号,如激光脉冲信号的控制和调制也是在深度处理芯片内部进行的,而深度处理芯片通常是预先编写好不可更改的,因此无法灵活地对结构光信号的调制模式进行调整;并且,目前的图像处理装置中,当发射驱动模块15-1出现失效或不适用的情况时,更换或修复的难度大、成本高,无法保证稳定地获取深度图像,进而导致图像应用功能的稳定性较差。
本申请实施例提供了一种图像处理方法,应用于图像处理装置,图4(a)为本申请实施例提供的另一种图像处理装置的结构示意图,图像处理装置41包括控制模块11-0、红外摄像头12-0、结构光发射器13-0、发射驱动模块45-0和图像处理模块46-0,控制模块11-0通过I2C总线连接红外摄像头12-0和图像处理模块46-0,红外摄像头12-0连接控制模块11-0、发射驱动模块45-0和图像处理模块46-0,图像处理模块46-0连接控制模块11-0和红外摄像头12-0。
本领域技术人员可以理解,图4(a)中示出的图像处理装置的结构并不构成对图像处理装置的限定,图像处理装置可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置;图像处理装置可以以各种形式来实施,例如,可以为诸如手机、平板电脑、笔记本电脑、掌上电脑等移动终端,以及诸如台式计算机等固定终端。
本申请实施例中,基于图4(a)所示的图像处理装置41,当终端上启动基于结构光技术的图像应用功能,例如当用户在终端上启动人脸解锁或支付功能时,控制模块11-0会检测到来自人脸解锁对应的图像应用功能的请求指令;响应于该请求指令,控制模块11-0通过I2C总线,向红外摄像头12-0和图像处理模块46-0发送启动指令。响应于该启动指令,图像处理模块46-0进入准备工作状态,红外摄像头12-0进入开启状态,并发送触发信号至发射驱动模块45-0;响应于该触发信号,发射驱动模块45-0启动,驱动并控制结构光发射器13-0发射激光,并且同时向红外摄像头12-0反馈同步信号;响应于该同步信号,红外摄像头12-0开始曝光,从而得到待检测物体例如人脸的散斑图像;红外摄像头12-0将散斑图像传输至图像处理模块46-0,图像处理模块46-0在接收到散斑图像之后,对散斑图像进行深度计算,得到待检测物体的深度图像,该深度图像中体现了待检测物体如人脸的三维结构特征。这样,图像处理装置41即可根据待检测物体的深度图像,完成请求指令中对应的图像应用功能,如人脸解锁,支付等,本申请实施例不作限制。
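上述图4(a)中"触发信号—发射激光—同步信号—曝光采图"的信号流程,可以用如下极简模拟来示意(假设性示例:类名与方法均为说明用的虚构接口,并非实际驱动实现):

```python
class EmitterDriver:
    """发射驱动模块的示意:收到触发信号后驱动发射激光,并回传同步信号。"""
    def __init__(self, camera):
        self.camera = camera
        self.events = []

    def on_trigger(self, params):
        self.events.append(f"emit@{params['freq_hz']}Hz")  # 按发射参数驱动结构光发射器(此处仅记录)
        self.camera.on_sync()                              # 向红外摄像头回传同步信号


class IRCamera:
    """红外摄像头的示意:发出触发信号,并在同步信号到来时曝光采图。"""
    def __init__(self):
        self.frames = []
        self.driver = None

    def start(self, params):
        self.driver.on_trigger(params)        # 向发射驱动模块传输触发信号

    def on_sync(self):
        self.frames.append("speckle_frame")   # 响应同步信号,采集散斑图像


camera = IRCamera()
camera.driver = EmitterDriver(camera)
camera.start({"freq_hz": 33})                 # 控制模块下发启动指令后的流程
print(camera.frames)  # ['speckle_frame']
```

可以看到,发射时刻与曝光时刻靠同步信号天然对齐,无需额外的频率预调试,这正是上文所述结构的要点。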
可以理解的是,相比于图1,图4(a)中的发射驱动模块45-0可以独立于实现深度图像处理功能的深度处理芯片15或图像处理模块46-0工作,从而提高了图像处理装置的可维护性,并且,结构光发射器13-0发射激光的时间与红外摄像头12-0的曝光采集时间可以通过同步信号进行同步,避免了复杂的预调试过程,从而保证了能够正常地采集到散斑图像以及获取深度图像,最终提高图像应用功能的稳定性。
图4(b)为本申请实施例提供的又一种图像处理装置的结构示意图,图像处理装置4包括控制模块11、红外摄像头12、结构光发射器13、可见光摄像头14、发射驱动模块45和图像处理模块46,控制模块11通过I2C总线连接红 外摄像头12、可见光摄像头14和图像处理模块46,红外摄像头12连接控制模块11、发射驱动模块45和图像处理模块46,图像处理模块46连接控制模块11和红外摄像头12,可见光摄像头14连接控制模块11。
本申请实施例中,基于图4(b),当控制模块11检测到图像应用功能的请求指令时,响应于该请求指令,控制模块11通过I2C总线,向可见光摄像头14发送启动指令,响应于该启动指令,可见光摄像头14进入开启状态。
本领域技术人员可以理解,图4(b)中示出的图像处理装置的结构并不构成对图像处理装置的限定,图像处理装置可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置;图像处理装置可以以各种形式来实施,例如,可以为诸如手机、平板电脑、笔记本电脑、掌上电脑等移动终端,以及诸如台式计算机等固定终端。
本申请实施例提供的图像处理方法可以基于图4(b)所示的图像处理装置所实现,图5为本申请实施例提供的一种图像处理方法的流程示意图,如图5所示,该图像处理方法包括以下步骤:
S501、当检测到图像应用功能的请求指令时,响应于请求指令,向发射驱动模块传输获取到的发射参数,并控制红外摄像头向发射驱动模块传输触发信号;
图像处理装置检测到图像应用功能的请求指令时,获取发射参数,将发射参数传输到发射驱动模块45中,与此同时,还向红外摄像头12传输请求指令,红外摄像头12响应请求指令,向发射驱动模块45传输触发信号;其中,发射参数为激光的发射参数,包括激光脉冲频率、激光脉冲占空比和激光脉冲周期等;触发信号为驱动发射驱动模块45开始发射激光的信号,例如,高电平。
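发射参数中的激光脉冲频率、占空比与周期之间的换算可示意如下(假设性示例,函数名为虚构;每秒 33 次、占空比 10% 对应正文示例):

```python
def pulse_timing(freq_hz: float, duty_cycle: float):
    """由脉冲频率与占空比求 (脉冲周期 ms, 每周期发光时长 ms)。"""
    period_ms = 1000.0 / freq_hz          # 周期 = 1 / 频率
    return period_ms, period_ms * duty_cycle

period, on_time = pulse_timing(33.0, 0.10)  # 每秒 33 次、占空比 10%
print(round(period, 1), round(on_time, 1))  # 30.3 3.0,即周期约 30ms、每周期发光约 3ms
```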
在一些实施例中,图像处理模块46包括信号调制模块;图像处理装置检测到图像应用功能的请求指令后,向图像处理模块46传输请求指令;响应于请求指令,控制信号调制模块向发射驱动模块45传输发射参数。
图像处理装置在检测到请求指令时,向图像处理模块46传输请求指令,图像处理模块46包括生成发射参数的信号调制模块,图像处理模块46响应于请求指令,控制信号调制模块向发射驱动模块45传输发射参数。
示例性地,基于图4(b),本申请实施例提供的一种图像处理装置的结构示意图可以如图6所示,图4(b)中示出的图像处理模块46可以为ASIC,图像处理模块46还可以包括信号调制模块46-1,信号调制模块46-1以信号调制电路的形式集成在ASIC中;图像处理模块46接收到请求指令后,响应于请求指令,接通信号调制电路,实现向发射驱动模块45传输发射参数。
示例性地,基于图4(b),本申请实施例提供的一种图像处理装置的结构示意图可以如图7所示,图4(b)中示出的图像处理模块46可以为数字信号处理器(DSP,Digital Signal Processor),信号调制模块46-1以信号调制程序的形式写在DSP的寄存器46-2(Register)中;图像处理模块46接收到请求指令后,响应于请求指令,调用和运行信号调制程序,实现向发射驱动模块45传输发射参数。
需要说明的是,图像处理装置可以通过控制模块11对DSP中的信号调制程序进行读写操作,以修改发射参数。
在一些实施例中,红外摄像头12包括信号调制模块12-1、触发模块12-2和时序控制电路12-3;图像处理装置检测到图像应用功能的请求指令后,向时序控制电路12-3传输请求指令;响应于请求指令,通过时序控制电路12-3,控制信号调制模块12-1向发射驱动模块45传输发射参数、以及控制触发模块12-2向发射驱动模块45传输触发信号。
图像处理装置中的控制模块11检测到请求指令时,向时序控制电路12-3转发请求指令;时序控制电路12-3响应于请求指令,向信号调制模块12-1传输指令,以使得信号调制模块12-1向发射驱动模块45传输发射参数,还向触发模块12-2传输指令,以使得触发模块12-2向发射驱动模块45传输触发信号。
示例性地,如图8(a)所示的一种红外摄像头的结构示意图,红外摄像头12包括信号调制模块12-1、触发模块12-2和时序控制电路12-3,时序控制电路12-3分别连接信号调制模块12-1和触发模块12-2,时序控制电路12-3还连接控制模块11和发射驱动模块45;控制模块11在检测到请求指令时,向红外摄像头12中的时序控制电路12-3传输请求指令,响应于请求指令,由时序控制电路12-3控制信号调制模块12-1向发射驱动模块45传输发射参数、以及控制触发模块12-2向发射驱动模块45传输触发信号。
需要说明的是,当图像处理模块46包括信号调制模块46-1时,红外摄像头12不包括信号调制模块12-1,该红外摄像头12的结构示意图如图8(b)所示。
在一些实施例中,图像应用功能包括人脸解锁功能、人脸支付功能和三维建模功能,图像处理装置中的控制模块11检测到用户启动任一个图像应用功能时,生成该图像应用功能的请求指令。
示例性地,以人脸解锁功能为例,用户通过图像处理装置的实体按键或虚拟按键点亮图像处理装置的屏幕后,控制模块11检测到屏幕点亮操作、且确定图像处理装置处于锁屏状态时,确定用户启动人脸解锁功能,并生成人脸解锁功能的请求指令,即人脸解锁请求。
S502、响应于触发信号,通过发射驱动模块,控制结构光发射器按照发射参数发射激光,并通过发射驱动模块向红外摄像头传输同步信号;
发射驱动模块45在接收到发射参数和触发信号后,由发射驱动模块45控制结构光发射器13按照发射参数开始发射激光,并向红外摄像头12传输一次同步信号。
在一些实施例中,发射参数包括激光脉冲频率、激光脉冲占空比和激光脉冲周期,由发射驱动模块45控制结构光发射器13按照发射参数,周期性地发射激光;例如,激光脉冲频率可以为每秒33次,对应地,每个激光脉冲的发射周期时长约为30ms;激光脉冲占空比可以为10%。
可以看出,发射驱动模块45和红外摄像头12直接连接,发射驱动模块45只需要在开始发射激光时直接向红外摄像头12传输一次同步信号,控制红外摄像头12在接收到同步信号后开始拍摄,实现了红外摄像头12的开始拍摄时间与结构光发射器13的开始发射时间同步,避免了对深度处理芯片的激光脉冲频率、以及红外摄像头12的采集频率进行复杂的同步调试。
在一些实施例中,红外摄像头12包括信号调制模块12-1;信号调制模块12-1通过红外摄像头12中的时序控制电路12-3与控制模块11连接。在响应于触发信号,通过发射驱动模块45,控制结构光发射器13按照发射参数发射激光之前,图像处理装置中的控制模块11在接收到参数修改指令时,通过时序控制电路12-3向信号调制模块12-1传输参数修改指令;响应于参数修改指令,控制信号调制模块12-1修改发射参数,得到更新后的发射参数。
控制模块11接收到参数修改指令后,通过时序控制电路12-3向红外摄像头12中的信号调制模块12-1转发参数修改指令,以使得信号调制模块12-1根据参数修改指令对发射参数进行修改,得到更新后的发射参数。
进一步地,响应于触发信号,通过发射驱动模块45,控制结构光发射器13按照更新后的发射参数发射激光。
发射驱动模块45在接收到更新后的发射参数和触发信号后,由发射驱动模块45控制结构光发射器13按照更新后的发射参数开始发射激光。
需要说明的是,图像处理模块46为DSP、且DSP包括信号调制模块时,与控制红外摄像头12中的信号调制模块修改发射参数同理,控制模块11控制DSP中的信号调制模块修改发射参数。
可以理解的是,结构光信号的参数控制和调整是由图像处理模块内部的信号调制模块,以及图像处理模块外部的控制模块共同实现的,从而可以方便的对结构光信号进行控制和调整,提高了终端调制结构光信号的灵活性。
S503、响应于同步信号,控制红外摄像头采集待检测物体的散斑图像;
红外摄像头12从发射驱动模块45接收到同步信号后,响应于同步信号,对待检测物体进行拍摄,采集得到待检测物体的散斑图像。
在一些实施例中,所述红外摄像头12还包括图像采集模块12-4;由发射驱动模块45向时序控制电路12-3传输同步信号;响应于同步信号,通过时序控制电路12-3,控制图像采集模块12-4周期性地采集散斑图像。
时序控制电路12-3响应于同步信号,对图像采集模块12-4进行控制,以使得图像采集模块12-4周期性地采集散斑图像。
示例性地,如图8(a)所示的一种红外摄像头的结构示意图,红外摄像头12还包括图像采集模块12-4,图像采集模块12-4连接时序控制电路12-3,图像采集模块12-4包括行选电路12-41、列选电路12-42、图像阵列12-43、放大器12-44、自动对焦电路12-45、自动曝光电路12-46和模数转换器(ADC,Analog-to-Digital Converter)12-47;其中,行选电路12-41、列选电路12-42、放大器12-44、自动对焦电路12-45和自动曝光电路12-46都与时序控制电路12-3连接,图像阵列12-43连接放大器12-44、自动对焦电路12-45和自动曝光电路12-46,放大器12-44还连接模数转换器12-47,模数转换器12-47连接图像处理模块46。
进一步地,响应于同步信号,由时序控制电路12-3控制图像阵列12-43从自动曝光电路12-46中接收到曝光时间、以及从自动对焦电路12-45中接收到曝光程度,还由时序控制电路12-3选择接通行选电路12-41或列选电路12-42, 图像阵列12-43开始按照曝光时间和曝光程度进行曝光,即将光信号转换为电荷,再由图像阵列12-43将电荷传输到放大器12-44;由放大器12-44对电荷进行放大后,传输至模数转换器12-47,由模数转换器12-47生成散斑图像。
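上述"曝光—放大—模数转换"流程可以用一个极简的数值示意来说明(假设性示例,所有参数均为虚构,仅示意光信号到数字像素值的转换与饱和截断):

```python
def quantize_pixel(light_intensity: float, exposure_ms: float, gain: float, bits: int = 8) -> int:
    charge = light_intensity * exposure_ms   # 曝光:光信号按曝光时间积分为电荷
    amplified = charge * gain                # 放大器对电荷信号放大
    max_code = (1 << bits) - 1
    return min(max_code, int(amplified))     # 模数转换器量化,并对过曝做饱和截断

print(quantize_pixel(0.5, 10.0, 20.0))  # 100
print(quantize_pixel(5.0, 10.0, 20.0))  # 255(过曝饱和)
```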
S504、控制红外摄像头将散斑图像传输至图像处理模块;
红外摄像头12采集到每一帧散斑图像后,由红外摄像头12将每一帧散斑图像传输至图像处理模块46,以使得图像处理模块46对每一帧散斑图像进行处理。
S505、控制图像处理模块对散斑图像进行深度计算,得到待检测物体的深度图像,并基于深度图像实现图像应用功能,深度图像表征待检测物体的三维结构特征。
由图像处理模块46对每一帧散斑图像进行深度计算,得到每一张深度图像,图像处理模块46将每一张深度图像传输至控制模块11,以使得控制模块11基于所有深度图像实现图像应用功能。
在一些实施例中,由红外摄像头12周期性地采集散斑图像,并传输至图像处理模块46,由图像处理模块46对采集到的每一帧散斑图像进行深度计算,得到每一张深度图像,进而,图像处理模块46可以在得到多张深度图像后,基于多张深度图像生成视频流信息,并将视频流信息传输至控制模块11。
在一些实施例中,在步骤S501之后,图像处理方法还包括:控制红外摄像头12以触发信号的传输时刻为开始时刻,进行计时;直至检测到红外摄像头12接收到同步信号,获得计时时长;当计时时长大于或等于预设时长时,控制红外摄像头12生成异常提示信息,并显示异常提示信息,异常提示信息表征发射驱动模块45发生异常。
控制红外摄像头12向发射驱动模块45传输触发信号的同时,以传输时刻为开始时刻进行计时,直至红外摄像头12接收到同步信号停止计时,并用停止计时时刻减去开始时刻,得到计时时长;由红外摄像头12判断计时时长不小于预设时长时,生成异常提示信息,并传输至控制模块11,由控制模块11控制在图像处理装置的显示屏上显示异常提示信息;其中,异常提示信息的内容包括发射驱动模块45的同步信号反馈发生异常、或者图像采集时间和激光发射时间无法同步。
示例性地,预设时长可以是用户根据曝光时间和激光脉冲频率设置的最大间隔时长,在红外摄像头12向发射驱动模块45传输触发信号后,经过该最大间隔时长,红外摄像头12接收到发射驱动模块45的同步信号的情况下,红外摄像头12的开始拍摄时间和结构光发射器13的开始发射时间保持同步。
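上述以触发信号传输时刻开始计时、超过预设时长仍未收到同步信号即判定异常的逻辑,可以用如下草图示意(假设性示例,使用 Python 的 threading.Event 模拟同步信号,并非实际固件实现):

```python
import threading

def wait_for_sync(sync_event: threading.Event, timeout_s: float) -> str:
    """以触发时刻为起点等待同步信号:收到返回 'ok',超时返回异常提示。"""
    if sync_event.wait(timeout=timeout_s):   # 等价于"计时直至收到同步信号"
        return "ok"
    return "发射驱动模块发生异常"            # 计时时长达到预设时长,生成异常提示信息

sync = threading.Event()
print(wait_for_sync(sync, 0.01))  # 未收到同步信号:输出异常提示
sync.set()                        # 模拟发射驱动模块回传同步信号
print(wait_for_sync(sync, 0.01))  # ok
```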
可以理解的是,终端通过独立的发射驱动模块进行发射参数的获取、触发信号的响应,以及向红外摄像头进行同步信号传输等操作,从而减少了终端对深度处理芯片的依赖,降低了深度处理芯片的资源占用,降低了终端的维护成本,最终提高了图像应用功能的稳定性。
需要说明的是,针对人脸解锁功能,请求指令包括人脸解锁请求,图9为本申请实施例提供的另一种图像处理方法的流程示意图,如图9所示,该图像处理方法包括以下步骤:
S901、当检测到人脸解锁请求时,响应于人脸解锁请求,控制可见光摄像头采集待检测物体的可见光图像;
图像处理装置还包括发光源,控制模块11检测到人脸解锁请求时,控制模块11控制发光源发射光源,并向可见光摄像头14传输人脸解锁请求,响应于人脸解锁请求,由可见光摄像头14对待检测物体进行拍摄,采集得到可见光图像;其中,可见光图像包括二维彩色图像。
S902、对可见光图像进行人脸检测,得到检测结果;
控制模块11对从可见光摄像头14接收到的可见光图像进行人脸检测,判断可见光图像中是否存在人脸,得到检测结果,检测结果包括可见光图像中存在人脸、或可见光图像中不存在人脸。
在一些实施例中,可见光摄像头14采集到的可见光图像一般为2D图像,图像处理装置可以在控制模块11内部通过图像特征匹配算法等实现人脸检测功能,以使用控制模块11对可见光图像进行人脸检测,在其他的一些实施例中,也可以使用控制模块11调用图像处理装置所在终端上的其他图像检测模块,对可见光图像进行人脸检测。本申请实施例对进行人脸检测的功能模块和具体方法不做限定。
S903、当检测结果表征待检测物体为目标对象时,向发射驱动模块传输获取到的发射参数,并控制红外摄像头向发射驱动模块传输触发信号;
控制模块11针对可见光图像中存在人脸的检测结果,确定检测结果表征待检测物体为目标对象;针对可见光图像中不存在人脸的检测结果,确定检测结果表征待检测物体不是目标对象,其中,目标对象为人。
步骤S903的实现过程与步骤S501中的相同方案的实现过程相同。
S904、响应于触发信号,通过发射驱动模块,控制结构光发射器按照发射参数发射激光,并通过发射驱动模块向红外摄像头传输同步信号;
S905、响应于同步信号,控制红外摄像头采集待检测物体的散斑图像;
S906、控制红外摄像头将散斑图像传输至图像处理模块;
步骤S904-S906的实现过程与步骤S502-S504的实现过程相同。
S907、控制图像处理模块对散斑图像进行深度计算,得到待检测物体的深度图像;
步骤S907的实现过程与步骤S505中的相同方案的实现过程相同。
S908、当检测结果表征待检测物体为目标对象时,获取目标深度图像;
控制模块11在确定检测结果表征待检测物体为人时,从图像处理装置的存储器中获取目标深度图像,目标深度图像为预先获取到的目标用户的深度图像。
S909、计算深度图像和目标深度图像之间的相似度;
控制模块11计算深度图像和目标深度图像之间的相似度,并判断相似度是否小于预设相似度阈值,预设相似度阈值为基于同一个人的不同深度图像之间的最小相似度设置的。
S910、当相似度大于或等于预设相似度阈值时,生成人脸解锁响应,响应于人脸解锁响应,开启解锁页面。
控制模块11确定相似度不小于预设相似度阈值时,生成人脸解锁响应,否则,不生成人脸解锁响应;响应于人脸解锁响应,开启解锁页面,并由控制模块11控制在图像处理装置的显示屏上显示解锁页面。
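深度图像与目标深度图像的相似度比较可以用如下草图示意(假设性示例:此处采用 1 减去归一化平均绝对差作为相似度度量,专利并未限定具体的相似度计算方法,深度值与阈值均为虚构):

```python
def depth_similarity(a, b, depth_range: float) -> float:
    """用 1 - 归一化平均绝对差近似两张深度图的相似度,越接近 1 越相似。"""
    assert len(a) == len(b) and depth_range > 0
    mad = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return 1.0 - mad / depth_range

current = [500.0, 510.0, 498.0, 505.0]   # 当前采集的深度图(展平,虚构数据,单位毫米)
target  = [502.0, 508.0, 500.0, 505.0]   # 预存的目标深度图(虚构数据)
sim = depth_similarity(current, target, depth_range=100.0)
unlocked = sim >= 0.95                   # 预设相似度阈值(虚构)
print(round(sim, 3), unlocked)  # 0.985 True,即生成人脸解锁响应
```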
本申请实施例不对步骤S908的执行顺序进行限定,步骤S908的执行顺序可以位于步骤S902之后、步骤S909之前的任意位置。
需要说明的是,针对三维建模功能,请求指令包括三维建模请求,图10为本申请实施例提供的又一种图像处理方法的流程示意图,如图10所示,该图像处理方法包括以下步骤:
S1001、当检测到三维建模请求时,响应于三维建模请求,控制可见光摄像头采集待检测物体的可见光图像,向发射驱动模块传输获取到的发射参数,并控制红外摄像头向发射驱动模块传输触发信号;
控制模块11检测到三维建模请求时,控制模块11控制可见光摄像头14对待检测物体进行拍摄,采集得到可见光图像;其中,三维建模请求就是三维建模功能的请求指令。
S1002、响应于触发信号,通过发射驱动模块,控制结构光发射器按照发射参数发射激光,并通过发射驱动模块向红外摄像头传输同步信号;
S1003、响应于同步信号,控制红外摄像头采集待检测物体的散斑图像;
S1004、控制红外摄像头将散斑图像传输至图像处理模块;
步骤S1002-S1004的实现过程与步骤S502-S504的实现过程相同。
S1005、控制图像处理模块对散斑图像进行深度计算,得到待检测物体的深度图像;
步骤S1005的实现过程与步骤S505中的相同方案的实现过程相同。
S1006、利用可见光图像,对深度图像进行颜色渲染,得到三维彩色图像,并显示三维彩色图像。
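利用可见光图像对深度图像进行颜色渲染,本质上是把对齐后的 RGB 颜色赋给各深度点。下面是一个极简草图(假设性示例:假定两路图像已逐像素对齐,实际实现还需要摄像头间的标定与配准):

```python
def render_color(depth, rgb):
    """depth: 二维深度值列表;rgb: 同尺寸的 (r, g, b) 列表;返回彩色三维点列表。"""
    points = []
    for y, row in enumerate(depth):
        for x, z in enumerate(row):
            r, g, b = rgb[y][x]
            points.append((x, y, z, r, g, b))   # 每个深度点附上对应像素的颜色
    return points

depth = [[100.0, 101.0]]                 # 1x2 的深度图(虚构,单位毫米)
rgb = [[(255, 0, 0), (0, 255, 0)]]       # 对应的可见光像素颜色
print(render_color(depth, rgb))
# [(0, 0, 100.0, 255, 0, 0), (1, 0, 101.0, 0, 255, 0)]
```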
可以理解的是,由发射驱动模块45基于获取到的发射参数、以及从红外摄像头12传输来的触发信号,控制结构光发射器13发射激光、以及控制红外摄像头12在激光发射的同时采集到散斑图像,而不是由集成有发射驱动功能的深度处理芯片,来控制结构光发射器13发射激光和红外摄像头12采集散斑图像,该发射驱动模块45不是一次性集成好的,不依赖于深度处理芯片,而是自身独立工作的,如此,发射驱动模块45在失效或不适用时容易地进行修复,保证正常地采集到散斑图像,进而保证正常地获取深度图像,提高图像应用功能的稳定性。
基于同一技术构思,本申请实施例提供了一种图像处理装置,如图11所示,图像处理装置1100包括红外摄像头12、结构光发射器13、发射驱动模块45和图像处理模块46,红外摄像头12连接发射驱动模块45和图像处理模块46,结构光发射器13连接发射驱动模块45,图像处理装置1100还包括:处理器1101、存储器1102以及通信总线1103,存储器1102通过通信总线1103与处理器1101进行通信,存储器1102存储处理器1101可执行的一个或者多个程序,当一个或者多个程序被执行时,通过处理器1101控制红外摄像头12、结构光发射器13、发射驱动模块45和图像处理模块46,执行如本申请实施例中任一种图像处理方法。
需要说明的是,图像处理装置1100还包括可见光摄像头14。
本申请实施例提供了一种计算机可读存储介质,计算机可读存储介质存储有一个或者多个程序,一个或者多个程序可被一个或者多个处理器1101执行,程序被处理器1101执行时实现如本申请实施例中任一种图像处理方法。
应理解,说明书通篇中提到的“一个实施例”或“一实施例”意味着与实施例有关的特定特征、结构或特性包括在本申请的至少一个实施例中。因此,在整个说明书各处出现的“在一个实施例中”或“在一实施例中”未必一定指相同的实施例。此外,这些特定的特征、结构或特性可以任意适合的方式结合在一个或多个实施例中。应理解,在本申请的各种实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。
在本申请所提供的几个实施例中,应该理解到,所揭露的设备和方法,可以通过其它的方式实现。以上所描述的设备实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,如:多个单元或组件可以结合,或可以集成到另一个系统,或一些特征可以忽略,或不执行。另外,所显示或讨论的各组成部分相互之间的耦合、或直接耦合、或通信连接可以是通过一些接口,设备或单元的间接耦合或通信连接,可以是电性的、机械的或其它形式的。
上述作为分离部件说明的单元可以是、或也可以不是物理上分开的,作为单元显示的部件可以是、或也可以不是物理单元;既可以位于一个地方,也可以分布到多个网络单元上;可以根据实际的需要选择其中的部分或全部单元来实现本实施例方案的目的。
另外,在本申请各实施例中的各功能单元可以全部集成在一个处理单元中,也可以是各单元分别单独作为一个单元,也可以两个或两个以上单元集成在一个单元中;上述集成的单元既可以采用硬件的形式实现,也可以采用硬件加软件功能单元的形式实现。
本领域普通技术人员可以理解:实现上述方法实施例的全部或部分步骤可以通过程序指令相关的硬件来完成,前述的程序可以存储于计算机可读取存储介质中,该程序在执行时,执行包括上述方法实施例的步骤;而前述的存储介质包括:移动存储设备、只读存储器(Read Only Memory,ROM)、磁碟或者光盘等各种可以存储程序代码的介质。
或者,本发明上述集成的单元如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读取存储介质中。基于这样的理解,本发明实施例的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机、服务器、或者网络设备等)执行本发明各个实施例所述方法的全部或部分。而前述的存储介质包括:移动存储设备、ROM、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以所述权利要求的保护范围为准。
工业实用性
本申请实施例中,发射驱动模块可以独立工作,不需集成在深度处理芯片中,也不需依赖于深度处理芯片,从而使得发射驱动模块在失效或不适用时可以更容易地进行修复,保证正常地采集到散斑图像,进而保证正常地获取深度图像,提高图像应用功能的稳定性,并且结构光信号的参数控制和调整是由图像处理模块内部的信号调制模块,以及图像处理模块外部的控制模块共同实现的,从而可以方便的对结构光信号进行控制和调整,提高了终端调制结构光信号的灵活性。

Claims (22)

  1. 一种图像处理方法,应用于图像处理装置,所述图像处理装置包括红外摄像头、结构光发射器、发射驱动模块和图像处理模块,所述方法包括:
    当检测到图像应用功能的请求指令时,响应于所述请求指令,向所述发射驱动模块传输获取到的发射参数,并控制所述红外摄像头向所述发射驱动模块传输触发信号;
    响应于所述触发信号,通过所述发射驱动模块,控制所述结构光发射器按照所述发射参数发射激光,并通过所述发射驱动模块向所述红外摄像头传输同步信号;
    响应于所述同步信号,控制所述红外摄像头采集待检测物体的散斑图像;
    控制所述红外摄像头将所述散斑图像传输至所述图像处理模块;
    控制所述图像处理模块对所述散斑图像进行深度计算,得到所述待检测物体的深度图像,并基于所述深度图像实现所述图像应用功能,所述深度图像表征所述待检测物体的三维结构特征。
  2. 根据权利要求1所述的方法,其中,所述图像处理模块包括信号调制模块;
    所述响应于所述请求指令,向所述发射驱动模块传输获取到的发射参数,包括:
    向所述图像处理模块传输所述请求指令;
    响应于所述请求指令,控制所述信号调制模块向所述发射驱动模块传输所述发射参数。
  3. 根据权利要求1所述的方法,其中,所述红外摄像头包括信号调制模块、触发模块和时序控制电路;
    所述响应于所述请求指令,向所述发射驱动模块传输获取到的发射参数,并控制所述红外摄像头向所述发射驱动模块传输触发信号,包括:
    向所述时序控制电路传输所述请求指令;
    响应于所述请求指令,通过所述时序控制电路,控制所述信号调制模块向所述发射驱动模块传输所述发射参数、以及控制所述触发模块向所述发射驱动模块传输所述触发信号。
  4. 根据权利要求3所述的方法,其中,所述红外摄像头还包括图像采集模块;
    所述控制所述红外摄像头采集待检测物体的散斑图像,包括:
    向所述时序控制电路传输所述同步信号;
    响应于所述同步信号,通过所述时序控制电路,控制所述图像采集模块周期性地采集所述散斑图像。
  5. 根据权利要求1所述的方法,其中,所述红外摄像头包括信号调制模块;
    在所述响应于所述触发信号,通过所述发射驱动模块,控制所述结构光发射器按照所述发射参数发射激光之前,所述方法还包括:
    当接收到参数修改指令时,向所述信号调制模块传输所述参数修改指令;
    响应于所述参数修改指令,控制所述信号调制模块修改所述发射参数,得到更新后的发射参数。
  6. 根据权利要求5所述的方法,其中,所述响应于所述触发信号,通过所述发射驱动模块,控制所述结构光发射器按照所述发射参数发射激光,包括:
    响应于所述触发信号,通过所述发射驱动模块,控制所述结构光发射器按照所述更新后的发射参数发射激光。
  7. 根据权利要求1所述的方法,其中,所述请求指令包括人脸解锁请求,所述图像处理装置还包括可见光摄像头;
    所述当检测到图像应用功能的请求指令时,响应于所述请求指令,向所述发射驱动模块传输获取到的发射参数,并控制所述红外摄像头向所述发射驱动模块传输触发信号,包括:
    当检测到所述人脸解锁请求时,响应于所述人脸解锁请求,控制所述可见光摄像头采集所述待检测物体的可见光图像;
    对所述可见光图像进行人脸检测,得到检测结果;
    当所述检测结果表征所述待检测物体为目标对象时,向所述发射驱动模块传输所述发射参数,并控制所述红外摄像头向所述发射驱动模块传输所述触发信号。
  8. 根据权利要求7所述的方法,其中,所述基于所述深度图像实现所述图像应用功能,包括:
    获取目标深度图像;
    计算所述深度图像和所述目标深度图像之间的相似度;
    当所述相似度大于或等于预设相似度阈值时,生成人脸解锁响应,响应于所述人脸解锁响应,开启解锁页面。
  9. 根据权利要求1所述的方法,其中,所述请求指令包括三维建模请求,所述图像处理装置还包括可见光摄像头;
    在所述向所述发射驱动模块传输获取到的发射参数之前,所述方法还包括:
    当检测到所述三维建模请求时,响应于所述三维建模请求,控制所述可见光摄像头采集所述待检测物体的可见光图像。
  10. 根据权利要求9所述的方法,其中,所述基于所述深度图像实现所述图像应用功能,包括:
    利用所述可见光图像,对所述深度图像进行颜色渲染,得到三维彩色图像,并显示所述三维彩色图像。
  11. 根据权利要求1所述的方法,其中,在所述向所述发射驱动模块传输获取到的发射参数,并控制所述红外摄像头向所述发射驱动模块传输触发信号之后,所述方法还包括:
    控制所述红外摄像头以所述触发信号的传输时刻为开始时刻,进行计时;
    直至检测到所述红外摄像头接收到所述同步信号,获得计时时长;
    当所述计时时长大于或等于预设时长时,控制所述红外摄像头生成异常提示信息,并显示所述异常提示信息,所述异常提示信息表征所述发射驱动模块发生异常。
  12. 一种图像处理装置,所述图像处理装置包括控制模块、红外摄像头、结构光发射器、发射驱动模块和图像处理模块,所述控制模块通过总线连接所述红外摄像头和所述图像处理模块,所述红外摄像头连接所述控制模块、所述发射驱动模块和所述图像处理模块,所述图像处理模块连接所述控制模块和所述红外摄像头;
    所述控制模块,用于当检测到图像应用功能的请求指令时,响应于所述请求指令,向所述发射驱动模块传输获取到的发射参数,并控制所述红外摄像头向所述发射驱动模块传输触发信号;
    所述发射驱动模块,用于响应于所述触发信号,控制所述结构光发射器按照所述发射参数发射激光,并向所述红外摄像头传输同步信号;
    所述红外摄像头,用于响应于所述同步信号,采集待检测物体的散斑图像;将所述散斑图像传输至所述图像处理模块;
    所述图像处理模块,用于对所述散斑图像进行深度计算,得到所述待检测物体的深度图像,并基于所述深度图像实现所述图像应用功能,所述深度图像表征所述待检测物体的三维结构特征。
  13. 根据权利要求12所述的图像处理装置,其中,所述图像处理模块包括信号调制模块;
    所述控制模块,还用于向所述图像处理模块传输所述请求指令;
    所述图像处理模块,还用于响应于所述请求指令,控制所述信号调制模块向所述发射驱动模块传输所述发射参数。
  14. 根据权利要求13所述的图像处理装置,其中,所述图像处理模块为专用集成电路,所述信号调制模块以信号调制电路的形式集成在所述专用集成电路中;
    所述图像处理模块,还用于响应于所述请求指令,接通所述信号调制电路,实现向所述发射驱动模块传输所述发射参数。
  15. 根据权利要求13所述的图像处理装置,其中,所述图像处理模块为数字信号处理器,所述信号调制模块以信号调制程序的形式写在所述数字信号处理器的寄存器中;
    所述图像处理模块,还用于响应于所述请求指令,调用和运行所述信号调制程序,实现向所述发射驱动模块传输所述发射参数。
  16. 根据权利要求12所述的图像处理装置,其中,所述红外摄像头包括信号调制模块、触发模块和时序控制电路;所述时序控制电路分别连接所述信号调制模块和所述触发模块,所述时序控制电路还连接所述控制模块和所述发射驱动模块;
    所述控制模块,还用于向所述时序控制电路传输所述请求指令;
    所述时序控制电路,还用于响应于所述请求指令,控制所述信号调制模块向所述发射驱动模块传输所述发射参数、以及控制所述触发模块向所述发射驱动模块传输所述触发信号。
  17. 根据权利要求16所述的图像处理装置,其中,所述红外摄像头还包括图像采集模块;所述图像采集模块连接所述时序控制电路;
    所述红外摄像头,还用于向所述时序控制电路传输所述同步信号;
    所述时序控制电路,还用于响应于所述同步信号,控制所述图像采集模块周期性地采集所述散斑图像。
  18. 根据权利要求12所述的图像处理装置,其中,所述红外摄像头包括信号调制模块;所述信号调制模块通过所述红外摄像头中的时序控制电路与所述控制模块连接;
    所述控制模块,还用于在所述发射驱动模块响应于所述触发信号,通过所述发射驱动模块,控制所述结构光发射器按照所述发射参数发射激光之前,当接收到参数修改指令时,通过所述时序控制电路向所述信号调制模块传输所述参数修改指令;
    所述信号调制模块,用于响应于所述参数修改指令,修改所述发射参数,得到更新后的发射参数。
  19. 根据权利要求18所述的图像处理装置,其中,所述发射驱动模块,还用于响应于所述触发信号,控制所述结构光发射器按照所述更新后的发射参数发射激光。
  20. 根据权利要求12所述的图像处理装置,其中,所述请求指令包括人脸解锁请求,所述图像处理装置还包括可见光摄像头;所述可见光摄像头与所述控制模块通过总线连接;
    所述控制模块,还用于当检测到所述人脸解锁请求时,响应于所述人脸解锁请求,控制所述可见光摄像头采集所述待检测物体的可见光图像;对所述可见光图像进行人脸检测,得到检测结果;当所述检测结果表征所述待检测物体为目标对象时,向所述发射驱动模块传输所述发射参数,并控制所述红外摄像头向所述发射驱动模块传输所述触发信号;或者,在所述向所述发射驱动模块传输获取到的发射参数之前,当检测到所述三维建模请求时,响应于所述三维建模请求,控制所述可见光摄像头采集所述待检测物体的可见光图像。
  21. 一种图像处理装置,其中,所述图像处理装置包括红外摄像头、结构光发射器、发射驱动模块和图像处理模块,所述红外摄像头连接所述发射驱动模块和所述图像处理模块,所述结构光发射器连接所述发射驱动模块,所述图像处理模块还包括:处理器、存储器以及通信总线,所述存储器通过所述通信总线与所述处理器进行通信,所述存储器存储所述处理器可执行的一个或者多个程序,当所述一个或者多个程序被执行时,通过所述处理器控制所述红外摄像头、所述结构光发射器、所述发射驱动模块和所述图像处理模块,执行如权利要求1-11任一项所述的方法。
  22. 一种计算机可读存储介质,其中,所述计算机可读存储介质存储有一个或者多个程序,所述一个或者多个程序可被一个或者多个处理器执行,以实现如权利要求1-11任一项所述的方法。
PCT/CN2020/096822 2019-06-24 2020-06-18 图像处理方法和装置、存储介质 WO2020259385A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20831054.0A EP3979202A4 (en) 2019-06-24 2020-06-18 METHOD AND DEVICE FOR PROCESSING IMAGES AND RECORDING MEDIUM
US17/559,672 US20220114743A1 (en) 2019-06-24 2021-12-22 Image processing method and apparatus, and computer-readable non-transitory storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910551552.4A CN110335303B (zh) 2019-06-24 2019-06-24 图像处理方法和装置、及存储介质
CN201910551552.4 2019-06-24

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/559,672 Continuation US20220114743A1 (en) 2019-06-24 2021-12-22 Image processing method and apparatus, and computer-readable non-transitory storage medium

Publications (1)

Publication Number Publication Date
WO2020259385A1 true WO2020259385A1 (zh) 2020-12-30

Family

ID=68142797

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/096822 WO2020259385A1 (zh) 2019-06-24 2020-06-18 图像处理方法和装置、存储介质

Country Status (4)

Country Link
US (1) US20220114743A1 (zh)
EP (1) EP3979202A4 (zh)
CN (1) CN110335303B (zh)
WO (1) WO2020259385A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110335303B (zh) * 2019-06-24 2021-10-26 Oppo广东移动通信有限公司 图像处理方法和装置、及存储介质
CN111123277A (zh) * 2019-12-31 2020-05-08 深圳奥比中光科技有限公司 一种深度测量系统的同步装置
CN114640842A (zh) * 2022-03-17 2022-06-17 Oppo广东移动通信有限公司 隐藏摄像头的检测方法、终端及可读存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106210568A (zh) * 2016-07-15 2016-12-07 深圳奥比中光科技有限公司 图像处理方法以及装置
CN107896274A (zh) * 2017-10-27 2018-04-10 广东欧珀移动通信有限公司 红外发射器控制方法、终端及计算机可读存储介质
CN108549867A (zh) * 2018-04-12 2018-09-18 Oppo广东移动通信有限公司 图像处理方法、装置、计算机可读存储介质和电子设备
CN108696682A (zh) * 2018-04-28 2018-10-23 Oppo广东移动通信有限公司 数据处理方法、装置、电子设备及计算机可读存储介质
CN110335303A (zh) * 2019-06-24 2019-10-15 Oppo广东移动通信有限公司 图像处理方法和装置、及存储介质

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10721420B2 (en) * 2015-06-02 2020-07-21 Intel Corporation Method and system of adaptable exposure control and light projection for cameras
CN109427086A (zh) * 2017-08-22 2019-03-05 上海荆虹电子科技有限公司 三维图像生成装置及方法
JP2019114914A (ja) * 2017-12-22 2019-07-11 キヤノン株式会社 撮像装置、その制御方法、プログラムならびに撮像システム
US20190306441A1 (en) * 2018-04-03 2019-10-03 Mediatek Inc. Method And Apparatus Of Adaptive Infrared Projection Control
CN108595928A (zh) * 2018-04-12 2018-09-28 Oppo广东移动通信有限公司 人脸识别的信息处理方法、装置及终端设备
CN108564033A (zh) * 2018-04-12 2018-09-21 Oppo广东移动通信有限公司 基于结构光的安全验证方法、装置及终端设备
CN208172809U (zh) * 2018-04-18 2018-11-30 深圳阜时科技有限公司 图像获取装置、图像重构装置、身份识别装置、电子设备
CN112668547A (zh) * 2018-04-28 2021-04-16 Oppo广东移动通信有限公司 图像处理方法、装置、电子设备和计算机可读存储介质
JP7189499B2 (ja) * 2018-05-07 2022-12-14 オムロン株式会社 センサシステム
CN109190484A (zh) * 2018-08-06 2019-01-11 北京旷视科技有限公司 图像处理方法、装置和图像处理设备
US20200082160A1 (en) * 2018-09-12 2020-03-12 Kneron (Taiwan) Co., Ltd. Face recognition module with artificial intelligence models
US10990805B2 (en) * 2018-09-12 2021-04-27 Apple Inc. Hybrid mode illumination for facial recognition authentication

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106210568A (zh) * 2016-07-15 2016-12-07 深圳奥比中光科技有限公司 图像处理方法以及装置
CN107896274A (zh) * 2017-10-27 2018-04-10 广东欧珀移动通信有限公司 红外发射器控制方法、终端及计算机可读存储介质
CN108549867A (zh) * 2018-04-12 2018-09-18 Oppo广东移动通信有限公司 图像处理方法、装置、计算机可读存储介质和电子设备
CN108696682A (zh) * 2018-04-28 2018-10-23 Oppo广东移动通信有限公司 数据处理方法、装置、电子设备及计算机可读存储介质
CN110335303A (zh) * 2019-06-24 2019-10-15 Oppo广东移动通信有限公司 图像处理方法和装置、及存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3979202A4 *

Also Published As

Publication number Publication date
US20220114743A1 (en) 2022-04-14
CN110335303A (zh) 2019-10-15
CN110335303B (zh) 2021-10-26
EP3979202A4 (en) 2022-07-27
EP3979202A1 (en) 2022-04-06

Similar Documents

Publication Publication Date Title
WO2020259385A1 (zh) 图像处理方法和装置、存储介质
JP7195422B2 (ja) 顔認識方法および電子デバイス
US20180329517A1 (en) Controlling handheld object light sources for tracking
CN104730827B (zh) 投影仪、投影系统以及投影仪的控制方法
CN104243800B (zh) 控制装置和存储介质
US20140307953A1 (en) Active stereo with satellite device or devices
WO2019196683A1 (zh) 图像处理方法、装置、计算机可读存储介质和电子设备
KR102661185B1 (ko) 전자 장치 및 그의 이미지 촬영 방법
CN108650472A (zh) 控制拍摄的方法、装置、电子设备及计算机可读存储介质
JP6562124B2 (ja) 位置検出システム、及び、位置検出システムの制御方法
JP2015158885A (ja) 位置検出装置、プロジェクター、位置検出システム、及び、位置検出装置の制御方法
EP3621294A1 (en) Method and device for image processing, computer readable storage medium and electronic device
KR20070066382A (ko) 두 대의 카메라를 이용한 3차원 이미지 생성 방법 및 이를구현하는 카메라 단말기
US11081516B2 (en) Display screen, electronic device and method for three-dimensional feature recognition
KR20190035358A (ko) 외부 광에 기반하여 카메라를 제어하는 전자 장치 및 제어 방법
JP6740614B2 (ja) 物体検出装置、及び物体検出装置を備えた画像表示装置
CN108833885A (zh) 图像处理方法、装置、计算机可读存储介质和电子设备
CN108881712B (zh) 图像处理方法、装置、计算机可读存储介质和电子设备
CN112468723A (zh) 对焦方法及支付设备
TW201547275A (zh) 深度攝影機系統
TW201624098A (zh) 閃光燈控制系統及方法
CN113691791B (zh) 激光投影设备及其投影图像的显示方法
JP6340860B2 (ja) 位置検出装置、プロジェクター、位置検出システム、及び、位置検出装置の制御方法
US11438486B2 (en) 3D active depth sensing with laser pulse train bursts and a gated sensor
CN111886853B (zh) 图像数据处理方法及其设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20831054

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020831054

Country of ref document: EP

Effective date: 20211229