CN112802116A - Image processing method, camera calibration method, device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN112802116A
Authority
CN
China
Legal status
Pending (assumed; not a legal conclusion)
Application number
CN202011602675.5A
Other languages
Chinese (zh)
Inventor
郭子青
周海涛
欧锦荣
谭筱
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011602675.5A
Publication of CN112802116A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/50 — Depth or shape recovery
    • G06T 7/55 — Depth or shape recovery from multiple images
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 — Image acquisition modality
    • G06T 2207/10028 — Range image; Depth image; 3D point clouds
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 — Subject of image; Context of image processing
    • G06T 2207/30204 — Marker

Landscapes

  • Engineering & Computer Science
  • Computer Vision & Pattern Recognition
  • Physics & Mathematics
  • General Physics & Mathematics
  • Theoretical Computer Science
  • Studio Devices
  • Length Measuring Devices By Optical Means

Abstract

The application relates to an image processing method, a camera calibration method, a device, a computer-readable storage medium, and an electronic device. The method comprises: controlling the temperature of a camera module to reach at least two different specified temperatures, the camera module comprising a light emitter and a camera; controlling the camera to collect a reference image formed when the light emitter irradiates a reference plane at each specified temperature; and storing each specified temperature in correspondence with its reference image. The reference image carries reference depth information, which is used to calculate the depth information of an object. The image processing method, camera calibration method, device, computer-readable storage medium, and electronic device can improve the accuracy of image processing.

Description

Image processing method, camera calibration method, device, storage medium and electronic equipment
The present application is a divisional application of the patent application with application number 201810690949.7, filed on June 28, 2018 and entitled "Camera calibration method and device, computer-readable storage medium, and electronic device".
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method, a camera calibration method, an apparatus, a computer-readable storage medium, and an electronic device.
Background
Structured light may be used for unlocking, payment, beauty applications, and the like. Specifically, infrared light with certain structural characteristics can be emitted by a laser emitter, the image formed by the infrared light is then collected by a laser camera, and the depth information from an object to the camera can be calculated from the collected image. If the laser emitter or the laser camera deforms, the acquired image changes, and errors arise in the calculated depth information.
Disclosure of Invention
The embodiments of the application provide a camera calibration method, a camera calibration device, a computer-readable storage medium, and an electronic device, which can improve the accuracy of image processing.
A camera calibration method, the method comprising:
controlling the temperature of a camera module to reach at least two different specified temperatures, wherein the camera module comprises a light emitter and a camera;
controlling the camera to collect a reference image formed when the light emitter irradiates a reference plane at the specified temperature;
storing the specified temperature and a reference image correspondingly; the reference image is provided with reference depth information, and the reference depth information is used for calculating the depth information of the object.
A camera calibration apparatus, the apparatus comprising:
the temperature control module is used for controlling the temperature of the camera module to reach at least two different specified temperatures, wherein the camera module comprises a light emitter and a camera;
the image acquisition module is used for controlling the camera to acquire a reference image formed when the light emitter irradiates a reference plane at the specified temperature;
the image storage module is used for correspondingly storing the specified temperature and a reference image; the reference image is provided with reference depth information, and the reference depth information is used for calculating the depth information of the object.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
controlling the temperature of a camera module to reach at least two different specified temperatures, wherein the camera module comprises a light emitter and a camera;
controlling the camera to collect a reference image formed when the light emitter irradiates a reference plane at the specified temperature;
storing the specified temperature and a reference image correspondingly; the reference image is provided with reference depth information, and the reference depth information is used for calculating the depth information of the object.
An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of:
controlling the temperature of a camera module to reach at least two different specified temperatures, wherein the camera module comprises a light emitter and a camera;
controlling the camera to collect a reference image formed when the light emitter irradiates a reference plane at the specified temperature;
storing the specified temperature and a reference image correspondingly; the reference image is provided with reference depth information, and the reference depth information is used for calculating the depth information of the object.
The camera calibration method, camera calibration device, computer-readable storage medium, and electronic device can control the temperature of the camera module to reach at least two different specified temperatures, control the camera module to collect the reference images formed at those specified temperatures, and store each reference image in correspondence with its specified temperature. Because the camera module deforms at different temperatures, and temperature itself also affects the images the module collects, the camera module is controlled during calibration to collect images at different specified temperatures. In this way, the reference image corresponding to the current temperature of the camera module can be retrieved and the depth information of the object calculated from the reference depth information of that image, avoiding errors caused by temperature changes of the camera module and improving the accuracy of image processing.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a diagram of an application environment for a camera calibration method in one embodiment;
FIG. 2 is a schematic view of an electronic device with a camera module mounted thereon in one embodiment;
FIG. 3 is a flow diagram of a camera calibration method in one embodiment;
FIG. 4 is a flow chart of a camera calibration method in another embodiment;
FIG. 5 is a schematic diagram of computing depth information in one embodiment;
FIG. 6 is a flow chart of a camera calibration method in yet another embodiment;
FIG. 7 is a flow chart of a camera calibration method in yet another embodiment;
FIG. 8 is a diagram of a hardware configuration for implementing the camera calibration method in one embodiment;
FIG. 9 is an interaction diagram for implementing a camera calibration method in one embodiment;
FIG. 10 is a schematic structural diagram of a camera calibration device in one embodiment;
FIG. 11 is a schematic structural diagram of a camera calibration device in another embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is a diagram of an application environment of a camera calibration method according to an embodiment. As shown in fig. 1, the application environment includes a calibration apparatus 10 and an electronic device 12. A camera module can be installed on the electronic device 12, and the camera module includes a light emitter and a camera. The electronic device 12 is fixed on the calibration apparatus 10, and the camera module of the electronic device 12 is calibrated by the calibration apparatus 10. Specifically, the calibration apparatus 10 includes a surface light source 100, a reference plane 102, a laser sensor 104, an electric angular position table 106, and an electric lifting table 108. The electric angular position table 106 can adjust the angle of the electronic device 12 so that the optical axis of its camera module is perpendicular to the reference plane 102; the electric lifting table 108 can adjust the vertical distance between the electronic device 12 and the reference plane 102, which is measured by the laser sensor 104. The surface light source 100 is used to illuminate a coding region on the reference plane 102. The electronic device 12 can control the temperature of the camera module to reach at least two different specified temperatures; when the light emitted by the light emitter at each specified temperature irradiates the reference plane 102, the reference image formed on the reference plane 102 is collected through the camera, the correspondence between the specified temperature and the reference image is established, and the specified temperature and the reference image are then stored accordingly.
Fig. 2 is a schematic diagram of an electronic device with a camera module mounted in one embodiment. As shown in fig. 2, the electronic device 20 is mounted with a camera module, which includes a light emitter 202 and a camera 204. In the process of calibrating the camera, the electronic device 20 may control the temperature of the camera module to reach different specified temperatures, emit light through the light emitter 202 at different specified temperatures, and collect a reference image formed when the light irradiates the reference plane through the camera 204. Finally, the electronic device 20 may establish a correspondence between the specified temperature and the reference image, and store the specified temperature and the reference image.
Fig. 3 is a flowchart of a camera calibration method in an embodiment. As shown in fig. 3, the camera calibration method includes steps 302 to 306. Wherein:
and step 302, controlling the temperature of the camera module to reach at least two different specified temperatures, wherein the camera module comprises a light emitter and a camera.
In one embodiment, a camera may be mounted on the electronic device, and images may be acquired through the mounted camera. Cameras can be divided into types such as laser cameras and visible light cameras according to the images they obtain: a laser camera obtains the image formed when laser light irradiates an object, and a visible light camera obtains the image formed when visible light irradiates an object. The electronic device may be provided with several cameras, and their installation positions are not limited. For example, one camera may be installed on the front panel of the electronic device and two cameras on the back panel, or the cameras may be embedded inside the electronic device and opened by rotating or sliding. Specifically, a front camera and a rear camera may be mounted on the electronic device; they acquire images from different viewing angles, the front camera from the front of the electronic device and the rear camera from the back.
The electronic device may measure depth information from objects in the scene to the electronic device through the captured image, and may specifically measure the depth information through structured light. When the depth information is acquired through the structured light, a camera module comprising a light emitter and a camera can be installed on the electronic equipment, and the process of acquiring the depth information specifically comprises a camera calibration stage and a measurement stage. In the calibration stage of the camera, the light emitter can emit light rays, and when the light rays irradiate the reference plane, a reference image can be formed, and then the reference image is obtained through the camera. The distance from the reference plane to the electronic device is known and a correspondence between the known distance and the reference image can then be established. In the measurement phase, the actual distance of the object can be calculated according to the image acquired in real time and the corresponding relation.
It can be understood that the camera module generates heat during operation, and its parameters and shape may be affected by temperature changes. Therefore, in order to reduce errors caused by temperature, the camera module can be controlled to reach different temperatures during camera calibration and to collect reference images at those temperatures. Specifically, the temperature of the camera module is controlled to reach at least two different specified temperatures, and a complete calibration is performed at each specified temperature.
Step 304, controlling the camera to acquire a reference image formed when the light emitter irradiates the reference plane at a specified temperature.
The light emitter and the camera in the camera module are generally on the same horizontal line. The calibration equipment first needs to adjust the position of the electronic device so that the optical axis defined by the light emitter and the camera is perpendicular to the reference plane; the vertical distance from the electronic device to the reference plane can then be calculated. It will be appreciated that this vertical distance is adjustable, and that different vertical distances produce different reference images. When the temperature of the camera module reaches each specified temperature, the light emitter can be controlled to emit light, and the reference image formed when the light irradiates the reference plane is collected through the camera.
Step 306, storing the specified temperature and the reference image correspondingly; the reference image is provided with reference depth information, and the reference depth information is used for calculating the depth information of the object.
The light emitter can emit laser light containing a plurality of speckle points, and the reference image formed when this laser light irradiates the reference plane is collected through the camera. The reference depth information is the known distance from the electronic device to the reference plane, and a model for calculating depth information can be obtained from the reference image and the reference depth information. During measurement, a speckle image formed when the laser light irradiates an object can be obtained, and the depth information of the object contained in the speckle image is calculated according to the model.
In the camera calibration process, reference images corresponding to different specified temperatures are collected and stored. When measuring depth information, the temperature of the camera module is first obtained, the corresponding reference image is retrieved according to that temperature, and the depth information of the object is calculated from the retrieved reference image. For example, the camera module is controlled to acquire one reference image at 30 ℃ (degrees Celsius) and one at 80 ℃, and each reference image is stored in correspondence with its specified temperature. During measurement, the current temperature of the camera module is first acquired, and the reference image whose specified temperature is closest to the current temperature is used to calculate the depth information.
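As a minimal sketch of the nearest-temperature lookup described above (the table contents and function name are illustrative assumptions, not taken from the patent):

```python
# Maps each specified calibration temperature (in deg C) to its stored
# reference image; the identifiers here are illustrative placeholders.
REFERENCE_TABLE = {30: "ref_30C", 80: "ref_80C"}

def select_reference(current_temp, table=REFERENCE_TABLE):
    """Return the reference image whose specified temperature is closest
    to the camera module's current temperature."""
    nearest = min(table, key=lambda t: abs(t - current_temp))
    return table[nearest]
```

For example, a module currently at 42 ℃ would use the 30 ℃ reference image, since 30 ℃ is the closer of the two specified temperatures.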
The camera calibration method provided by the above embodiment can control the temperature of the camera module to reach at least two different specified temperatures, control the camera module to collect the reference images formed at those temperatures, and store each reference image in correspondence with its specified temperature. Because the camera module deforms at different temperatures, and temperature itself also affects the images the module collects, the camera module is controlled during calibration to collect images at different specified temperatures. In this way, the reference image corresponding to the current temperature of the camera module can be retrieved and the depth information of the object calculated from the reference depth information of that image, avoiding errors caused by temperature changes of the camera module and improving the accuracy of image processing.
Fig. 4 is a flowchart of a camera calibration method in another embodiment. As shown in fig. 4, the camera calibration method includes steps 402 to 412. Wherein:
Step 402, inputting pulse width modulation (PWM) signals of at least two different frequencies to the light emitter, and controlling the temperature of the light emitter to reach at least two different specified temperatures through the PWM signals.
In one embodiment, the light emitter may be connected to the processor, and the processor may transmit instructions to the light emitter to switch it on and off. Specifically, during camera calibration, laser speckle can be emitted through the light emitter, and the reference image formed by the laser speckle irradiating an object is then collected through the laser camera. The operation of the light emitter can be controlled by a pulse wave: the higher the operating frequency, the higher the temperature of the light emitter, and hence of the camera module. Therefore, during calibration, the temperature of the camera module can be adjusted by controlling the operating frequency of the light emitter. Specifically, the light emitter is controlled to operate at a specified frequency, and the temperature of the camera module is brought to a specified temperature by the light emitter operating at that frequency.
Specifically, the processor can be connected to the camera module and used to control the operating frequency of the light emitter. The processor inputs a pulse signal to the light emitter and controls the switching of the light emitter through the pulse signal. The pulse signal may be a PWM signal, and the processor may input PWM signals of different frequencies to the light emitter so that the light emitter reaches different specified temperatures.
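As a rough sketch of the frequency-to-temperature control described above, a calibration table could pair each target specified temperature with a PWM drive frequency. All names and values below are hypothetical assumptions; the patent does not give concrete frequencies.

```python
# Hypothetical pairing of specified temperatures (deg C) with PWM drive
# frequencies (Hz): per the text, a higher operating frequency heats the
# light emitter more, so hotter targets get higher frequencies.
PWM_TABLE_HZ = {30: 10_000, 60: 30_000, 90: 60_000}

def pwm_frequency_for(target_temp_c):
    """Return the PWM frequency assumed to drive the emitter to the given
    specified temperature; raises KeyError for uncalibrated targets."""
    return PWM_TABLE_HZ[target_temp_c]
```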
And step 404, controlling the camera to acquire a reference image formed when the light emitter irradiates the reference plane at a specified temperature.
Each time a reference image is acquired, the electronic device may establish a correspondence between the acquired reference image and the specified temperature. After acquiring the reference image, the electronic device stores the reference image and the corresponding specified temperature. Therefore, in the actual shooting process, the corresponding reference image can be obtained according to the temperature of the camera module.
Step 406, establishing a correspondence between the specified temperature and the reference image, and writing the specified temperature and the reference image into a secure operating environment of the terminal for storage.
It can be understood that the correspondence between a specified temperature and a reference image can be established directly, or a temperature range can be defined for each specified temperature, the correspondence between the temperature range and the reference image established, and the temperature range and the reference image then written into the terminal. For example, the reference images formed by the light emitter at the specified temperatures of 30 ℃, 60 ℃, and 90 ℃ are collected as "pic-01", "pic-02", and "pic-03", respectively. If the temperature ranges corresponding to the specified temperatures are [0 ℃, 50 ℃), [50 ℃, 90 ℃), and [90 ℃, +∞), respectively, then during distance measurement the temperature range into which the light emitter falls can be determined, and the corresponding reference image obtained from that range.
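The temperature-range variant in the example above can be sketched as follows (the ranges and image names follow the text; the function itself is an illustrative assumption):

```python
# Half-open temperature ranges (deg C) mapped to stored reference images,
# matching the example: [0, 50) -> "pic-01", [50, 90) -> "pic-02",
# [90, +inf) -> "pic-03".
RANGES = [(0, 50, "pic-01"), (50, 90, "pic-02"), (90, float("inf"), "pic-03")]

def reference_for(temp):
    """Return the reference image for the range the temperature falls in."""
    for low, high, image in RANGES:
        if low <= temp < high:
            return image
    return None  # temperature below the calibrated ranges
```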
Generally, to ensure the security of image processing, the electronic device calculates depth information in a secure operating environment. Therefore, the acquired reference image and the corresponding specified temperature can be stored in the secure operating environment, and during measurement the depth information can be calculated directly in that environment. For example, when an upper-layer application of the electronic device initiates a face payment instruction, depth information can be acquired through the camera module during face payment and used to judge whether the face belongs to a living body, so it must be ensured that the depth information is calculated in a secure operating environment.
In the embodiment provided by the application, the secure operating environment of the terminal can be divided into a first secure operating environment and a second secure operating environment, the storage space of the first being larger than that of the second. To avoid excessively occupying the storage space of the second secure operating environment and affecting image processing, the specified temperature and the reference image can be written into the first secure operating environment for storage during calibration. When the terminal is detected to start up, the specified temperature and the reference image are loaded from the first secure operating environment into the second secure operating environment for storage.
Step 408, controlling the camera module to collect a speckle image when the camera module is opened.
Specifically, the processing unit of the terminal can receive instructions from an upper-layer application program. When the processing unit receives an image acquisition instruction, it can control the camera module to operate and collect a speckle image through the camera. The processing unit is connected to the camera, so the image obtained by the camera can be transmitted to the processing unit for processing such as cropping, brightness adjustment, face detection, and face recognition. Specifically, when the processing unit receives an image acquisition instruction, it controls the light emitter to operate, and when the light emitter is on, the speckle image formed by the light emitter irradiating an object is collected through the laser camera.
It will be appreciated that the light emitter may be a laser emitter. When laser light irradiates an optically rough surface whose average height fluctuation is greater than the wavelength, the wavelets scattered by randomly distributed surface elements superpose on each other, giving the reflected light field a random spatial intensity distribution with a granular structure; this is laser speckle. The laser speckle formed is highly random, so the laser speckle generated by different laser emitters differs, and the speckle images formed when laser speckle is projected onto objects of different depths and shapes are not identical. The laser speckle formed by a particular laser emitter is unique, and the speckle images obtained from it are therefore also unique. The laser speckle formed by the light emitter can be projected onto an object, and the speckle image formed by the laser speckle on the object is then collected through the laser camera.
An image acquisition instruction is an instruction that triggers an image acquisition operation. For example, when a user unlocks a smartphone, the smartphone can be verified and unlocked by acquiring a face image; an upper-layer application can initiate an image acquisition instruction and control the camera module to acquire images through that instruction. Specifically, the first processing unit may receive an image acquisition instruction initiated by an upper-layer application; when the first processing unit detects the instruction, it controls the camera module to open and then to collect a speckle image. The speckle image collected by the camera module can be sent to the second processing unit, which then calculates the depth information from the speckle image.
Step 410, when the temperature change of the light emitter is detected to exceed the temperature threshold, obtaining the current temperature of the light emitter.
After the camera module is detected to be opened, the temperature of the light emitter can be detected periodically through a temperature sensor, and the detected temperature is sent to the second processing unit. The second processing unit judges whether the temperature change of the light emitter exceeds a temperature threshold; if so, that temperature is taken as the current temperature of the light emitter, the corresponding target reference image is obtained according to the current temperature, and the depth information is calculated from the obtained target reference image. For example, the temperature threshold may be 5 ℃; when the temperature change of the light emitter exceeds 5 ℃, the corresponding target reference image is determined according to the acquired temperature of the light emitter. It will be appreciated that, to ensure accuracy, the interval between acquiring the speckle image and acquiring the current temperature cannot be too long.
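The threshold-triggered refresh described above can be sketched as a small stateful check (class and attribute names are illustrative assumptions):

```python
# Per the text, the target reference image is re-selected only when the
# emitter temperature has drifted more than the threshold (5 deg C in
# the example) since the last selection.
TEMP_THRESHOLD_C = 5.0

class EmitterMonitor:
    def __init__(self, initial_temp):
        self.last_temp = initial_temp

    def needs_update(self, current_temp):
        """True when the change since the last update exceeds the threshold;
        records the new temperature when an update is triggered."""
        if abs(current_temp - self.last_temp) > TEMP_THRESHOLD_C:
            self.last_temp = current_temp
            return True
        return False
```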
Step 412, acquiring a corresponding target reference image according to the current temperature of the light emitter, and calculating a depth image according to the speckle image and the target reference image, wherein the depth image is used for representing the depth information of the object.
Since each specified temperature is stored in correspondence with a reference image, during depth measurement the corresponding target reference image can be determined according to the current temperature of the light emitter, and the depth image then calculated from the speckle image and the target reference image. Specifically, the target reference image and the speckle image may be compared to obtain offset information, where the offset information indicates the horizontal offset of each speckle point in the speckle image relative to the corresponding speckle point in the target reference image, and the depth image is calculated according to the offset information and the reference depth information.
In one embodiment, each pixel (x, y) in the speckle image is traversed, and a pixel block of a preset size, for example 31 pixels by 31 pixels, is selected centered on that pixel. A matching pixel block is then searched for in the target reference image, and the horizontal offset between the coordinates of the matched pixel in the target reference image and the coordinates of the pixel (x, y) is calculated, a rightward offset being recorded as positive and a leftward offset as negative. The calculated horizontal offset is then substituted into formula (1) to obtain the depth information of the pixel (x, y). By calculating the depth information of each pixel in the speckle image in turn, the depth information corresponding to every pixel in the speckle image can be obtained.
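The per-pixel search can be sketched as follows. For brevity this uses a smaller block and search window than the 31 × 31 example, scores candidates by sum of squared differences (the patent does not specify the matching score, so that choice is an assumption), and keeps the text's sign convention (rightward positive, leftward negative):

```python
import numpy as np

def horizontal_offset(speckle, reference, x, y, half=2, search=5):
    """Best horizontal offset (in pixels) of the block centered at (x, y)
    in the speckle image relative to the reference image."""
    block = speckle[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    best_offset, best_score = 0, float("inf")
    for dx in range(-search, search + 1):
        cx = x + dx
        cand = reference[y - half:y + half + 1, cx - half:cx + half + 1]
        if cand.shape != block.shape:  # candidate block falls off the image
            continue
        score = float(np.sum((block - cand) ** 2))  # sum of squared differences
        if score < best_score:
            best_offset, best_score = dx, score
    return best_offset
```

Applied to a speckle image that is simply the reference shifted left by three columns, the function recovers an offset of +3, since the matching reference block lies three pixels to the right.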
The depth image can be used to represent the depth information corresponding to the photographed object, and each pixel point contained in the depth image represents one piece of depth information. Specifically, each speckle point in the reference image corresponds to one piece of reference depth information. After the horizontal offset between a speckle point in the speckle image and the corresponding speckle point in the reference image is obtained, the relative depth information from the object in the speckle image to the reference plane can be calculated from the horizontal offset, and the actual depth information from the object to the camera can then be calculated from the relative depth information and the reference depth information, yielding the final depth image.
FIG. 5 is a schematic diagram of computing depth information in one embodiment. As shown in fig. 5, the laser lamp 502 can generate laser speckles, which are reflected by an object, and the formed image is acquired by the laser camera 504. In the camera calibration process, laser speckles emitted by the laser lamp 502 are reflected by the reference plane 508, the reflected light is collected by the laser camera 504, and a reference image is obtained by imaging through the imaging plane 510. The reference depth L from the reference plane 508 to the laser lamp 502 is known. In the process of actually calculating depth information, laser speckles emitted by the laser lamp 502 are reflected by the object 506, the reflected light is collected by the laser camera 504, and an actual speckle image is obtained by imaging through the imaging plane 510. The actual depth information is obtained by the following calculation formula:
Dis = (CD × f × L) / (CD × f + L × AB)    (1)
where L is the distance between the laser lamp 502 and the reference plane 508, f is the focal length of the lens in the laser camera 504, CD is the distance between the laser lamp 502 and the laser camera 504, and AB is the offset distance between the image of the object 506 and the image of the reference plane 508. AB may be the product of the pixel offset n and the actual distance p of one pixel. When the distance Dis between the object 506 and the laser lamp 502 is greater than the distance L between the reference plane 508 and the laser lamp 502, AB is negative; when the distance Dis between the object 506 and the laser lamp 502 is less than the distance L between the reference plane 508 and the laser lamp 502, AB is positive.
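Formula (1) can be evaluated directly once the signed pixel offset n is known. The function name and the sample parameter values below are illustrative assumptions, not calibration data from the patent:

```python
def depth_from_offset(n_pixels, L, f, CD, p):
    """Evaluate formula (1): actual depth Dis from the pixel offset.

    n_pixels: signed horizontal pixel offset (AB = p * n_pixels),
    L: reference-plane depth, f: lens focal length,
    CD: emitter-to-camera baseline, p: physical size of one pixel.
    All lengths must share one unit.
    """
    AB = p * n_pixels
    return (CD * f * L) / (CD * f + L * AB)
```

At zero offset the formula returns the reference depth L; a positive offset gives a depth smaller than L and a negative offset a depth larger than L, matching the sign convention above.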
In one embodiment, the camera module may include a first camera module and a second camera module. The first camera module is composed of a floodlight and a laser camera, and the second camera module is composed of a laser lamp and a laser camera; the laser camera of the first camera module and the laser camera of the second camera module may be the same laser camera or different laser cameras, which is not limited here. The floodlight can emit infrared light, and an infrared image can be collected through the first camera module. The laser lamp can emit laser speckles, and a speckle image can be collected through the second camera module.
The infrared image can represent detail information of the photographed object, and depth information of the photographed object can be acquired from the speckle image. To ensure that the infrared image and the speckle image collected by the electronic device correspond to each other, the camera module needs to be controlled to collect them simultaneously; if the first camera module and the second camera module work in a time-sharing mode, the time interval between collecting the infrared image and collecting the speckle image must be very short. Specifically, the first camera module is controlled to collect the infrared image according to the image acquisition instruction, and the second camera module is controlled to collect the speckle image; the time interval between the first moment of acquiring the infrared image and the second moment of acquiring the speckle image is smaller than a first threshold.
The first threshold is generally a relatively small value; when the time interval is smaller than the first threshold, the photographed subject is considered unchanged, and the acquired infrared image and speckle image correspond to each other. It can be understood that the threshold can also be adjusted according to how the photographed subject changes: the faster the subject changes, the smaller the corresponding first threshold. If the subject can be assumed to remain stationary for a long period, the first threshold may be set to a large value. Specifically, the change speed of the photographed subject is acquired, and the corresponding first threshold is obtained according to that change speed.
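The speed-dependent choice of the first threshold could be sketched as a simple lookup; the breakpoints and millisecond values below are purely illustrative assumptions, since the embodiment only states the inverse relationship:

```python
# Hedged sketch: faster-changing subjects get a tighter allowed gap between
# the infrared capture and the speckle capture. All numbers are assumptions.

def first_threshold_ms(change_speed):
    """Map the subject's change speed (arbitrary units) to the first threshold
    in milliseconds; the faster the change, the smaller the threshold."""
    if change_speed > 10.0:   # fast motion, e.g. a moving face
        return 1.0
    if change_speed > 1.0:    # slow motion
        return 5.0
    return 20.0               # essentially stationary subject
```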
For example, when a mobile phone needs to authenticate and unlock through a human face, the user can click the unlock key to initiate an unlock instruction and aim the front camera at the face. The mobile phone sends the unlock instruction to the processing unit, which controls the camera to work. An infrared image is first collected through the first camera module; after an interval of 1 millisecond, the second camera module is controlled to collect a speckle image, and authentication and unlocking are performed using the collected infrared image and speckle image.
Furthermore, the camera module may be controlled to collect the infrared image at a first moment and the speckle image at a second moment, where the time interval between the first moment and a target moment is smaller than a second threshold, and the time interval between the second moment and the target moment is smaller than a third threshold. If the time interval between the first moment and the target moment is smaller than the second threshold, the camera module is controlled to collect the infrared image; if that interval is larger than the second threshold, a response-timeout prompt message can be returned to the application program, and the electronic device waits for the application program to reinitiate the image acquisition instruction.
After the camera module collects the infrared image, the processing unit can control it to collect the speckle image, where the interval between the second moment of collecting the speckle image and the first moment is smaller than the first threshold, and the interval between the second moment and the target moment is smaller than the third threshold. If the interval between the second moment and the first moment is greater than the first threshold, or the interval between the second moment and the target moment is greater than the third threshold, a response-timeout prompt message is returned to the application program, and the electronic device waits for the application program to reinitiate the image acquisition instruction. It can be understood that the second moment of acquiring the speckle image may be either later or earlier than the first moment of acquiring the infrared image, which is not limited here.
Specifically, the electronic device can be provided with a floodlight controller and a laser lamp controller, connected to the processing unit through two pulse width modulation (PWM) channels; the processing unit can input a PWM signal to the floodlight controller to control the floodlight to be turned on, or input a PWM signal to the laser lamp controller to control the laser lamp to be turned on.
In an embodiment provided by the present application, the step of storing the reference image may further include:
step 602, obtaining a module identifier of the camera module, and establishing a corresponding relationship between the module identifier, the designated temperature and the reference image.
It can be understood that, in the process of calibrating the camera, the camera module installed on the terminal can be calibrated, or the camera module can be calibrated independently. Therefore, if the camera module on the terminal is damaged, the calibrated reference image of the camera module can be directly written into the terminal after the camera module is replaced.
Specifically, each camera module has a corresponding module identifier, which can be used to distinguish different camera modules. When the camera module is calibrated, after the reference image is collected, the correspondence among the module identifier, the specified temperature, and the reference image can be established. In this way, after the camera module is reinstalled on the terminal, the corresponding specified temperature and reference image can be obtained according to the module identifier.
Step 604, store the module id, the specified temperature, and the reference image in the server.
In the process of calibrating the camera module independently, the acquired module identifier, specified temperature, and reference image can be stored in the server. The server can store them in the form of a list, so that the specified temperature and the reference image can be queried according to the module identifier. After the camera module is calibrated, the terminal can acquire the reference image from the server when the camera module is reinstalled. Specifically, the method comprises the following steps:
step 702, when the server receives a reference image acquisition request sent by the terminal, acquiring a corresponding specified temperature and a reference image according to a module identifier included in the reference image acquisition request.
The camera module can be reinstalled on the terminal, and after the camera module is installed, the terminal reads the module identifier of the installed camera module. A reference image acquisition request is then generated according to the module identifier and sent to the server. Specifically, when sending the reference image acquisition request, the terminal can encrypt the module identifier contained in the request and then send the encrypted request to the server.
Step 704, the specified temperature and the reference image are sent to the terminal.
After receiving the reference image acquisition request, the server can search for the corresponding specified temperature and reference image according to the module identifier, encrypt them, and send the encrypted specified temperature and reference image to the terminal. After receiving the specified temperature and the reference image, the terminal performs decryption processing and then writes the decrypted specified temperature and reference image into the terminal. The algorithm used to encrypt the module identifier, the specified temperature, and the reference image is not limited; for example, the Data Encryption Standard (DES), MD5 (Message-Digest Algorithm 5), or HAVAL may be used.
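The server-side correspondence of module identifier, specified temperature, and reference image described in steps 702 and 704 can be modelled as a simple in-memory table. This sketch omits the transport and encryption described above; the class and method names are assumptions:

```python
# Illustrative in-memory model of the server-side table:
# module identifier -> list of (specified temperature, reference image) pairs.
# Persistence, networking, and encryption are out of scope for this sketch.

class CalibrationStore:
    def __init__(self):
        self._table = {}

    def put(self, module_id, temp, reference_image):
        """Record one calibrated (temperature, reference image) pair."""
        self._table.setdefault(module_id, []).append((temp, reference_image))

    def get(self, module_id):
        """Return all (specified temperature, reference image) pairs
        recorded for the given module identifier."""
        return list(self._table.get(module_id, []))
```

A terminal request then reduces to `store.get(module_id)`, after which the pairs are written into the terminal's secure operating environment.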
The camera calibration method provided by the above embodiments can control the temperature of the camera module to reach at least two different specified temperatures, control the camera module to collect the reference images formed at the different specified temperatures, and store the reference images in correspondence with the specified temperatures. Because the camera module deforms at different temperatures, and temperature itself also affects the images the camera module collects, the camera module is controlled to collect images at different specified temperatures during camera calibration. In this way, the corresponding reference image can be obtained according to the temperature of the camera module, and the depth information of the object can be calculated according to the reference depth information in the reference image, avoiding errors caused by temperature changes of the camera module and improving the accuracy of image processing.
It should be understood that although the steps in the flowcharts of figs. 3, 4, 6, and 7 are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, there is no strict ordering restriction on these steps, and they may be performed in other orders. Moreover, at least some of the steps in figs. 3, 4, 6, and 7 may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 8 is a hardware configuration diagram for implementing the camera calibration method in one embodiment. As shown in fig. 8, the electronic device may include a camera module 810, a central processing unit (CPU) 820, and a second processing unit 830, where the camera module 810 includes a laser camera 812, a floodlight 814, an RGB (Red/Green/Blue color mode) camera 816, and a laser lamp 818. The second processing unit 830 includes a pulse width modulation module 832, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) module 834, a RAM (Random Access Memory) module 836, and a Depth Engine module 838. The first processing unit 822 may be a CPU core of the central processing unit 820 running in a TEE (Trusted Execution Environment), and the second processing unit 830 may be an MCU (Microcontroller Unit) processor. It can be understood that the central processing unit 820 may be in a multi-core operation mode, and a CPU core in the central processing unit 820 may operate in a TEE or an REE (Rich Execution Environment). Both the TEE and the REE are running modes of an ARM (Advanced RISC Machines) module. Generally, operations with higher security requirements in the electronic device need to be executed in the TEE, while other operations can be executed in the REE.
In the camera calibration process, the laser lamp 818 can be controlled by the pulse width modulation module 832 in the second processing unit 830 to reach at least two different specified temperatures, and at each specified temperature the laser camera 812 is controlled to collect a reference image formed when the laser lamp 818 irradiates the reference plane. The collected reference image and the specified temperature may be stored in the first processing unit 822 in a trusted operating environment (a first secure operating environment). When the electronic device is powered on, the specified temperature and the reference image are loaded from the first processing unit 822 into the second processing unit 830 for storage. It can be understood that the second processing unit 830 is external to the central processing unit 820, and its inputs and outputs are controlled by the first processing unit 822 in the trusted operating environment, so the second processing unit 830 can be considered to be in a second secure operating environment.
In the process of measuring depth information, when the central processing unit 820 receives an image acquisition instruction initiated by a target application program, the CPU core operating in the TEE, i.e., the first processing unit 822, sends the image acquisition instruction through a secure SPI/I2C bus to the SPI/I2C module 834 in the second processing unit 830. After receiving the image acquisition instruction, the second processing unit 830 transmits a pulse wave through the pulse width modulation module 832 to control the floodlight 814 in the camera module 810 to be turned on to collect an infrared image, and controls the laser lamp 818 in the camera module 810 to be turned on to collect a speckle image. The camera module 810 may transmit the collected infrared image and speckle image to the Depth Engine module 838 in the second processing unit 830, which calculates an infrared parallax image from the infrared image, calculates a depth image from the speckle image and the reference image, and obtains a depth parallax image from the depth image. The infrared parallax image and the depth parallax image are then sent to the first processing unit 822 operating in the TEE. The first processing unit 822 performs correction according to the infrared parallax image to obtain a corrected infrared image, and performs correction according to the depth parallax image to obtain a corrected depth image. Because the laser camera 812 and the RGB camera 816 are installed at different positions, the images collected by the two cameras need to be aligned and corrected to avoid errors caused by the shooting angle; that is, the infrared image and the depth image are corrected to obtain a corrected infrared image and a corrected depth image, respectively.
In one embodiment, face recognition can be performed according to the corrected infrared image, detecting whether a face exists in the corrected infrared image and whether the detected face matches a stored face. If the face recognition passes, living-body detection is performed according to the corrected infrared image and the corrected depth image to detect whether the face is a living face. After the corrected infrared image and the corrected depth image are acquired, living-body detection may also be performed before face recognition, or the two may be performed simultaneously. After the face recognition passes and the detected face is a living face, the first processing unit 822 may send one or more of the corrected infrared image, the corrected depth image, and the face recognition result to the target application program.
Fig. 9 is an interaction diagram for implementing a camera calibration method in one embodiment. As shown in fig. 9, the interactive process of the camera calibration method may include steps 902 to 920. Wherein:
step 902, the calibration device controls the temperature of the camera module to reach at least two different specified temperatures.
Step 904, the calibration device controls the camera to collect a reference image formed when the light emitter irradiates the reference plane at the specified temperature.
Step 906, the calibration device obtains the module identifier of the camera module, and establishes the correspondence among the module identifier, the specified temperature, and the reference image.
Step 908, the calibration device sends the module identifier, the specified temperature, and the reference image to the server.
In step 910, the server receives and stores the module id, the specified temperature, and the reference image.
Step 912, the terminal installs the camera module, acquires the module identifier of the installed camera module, and generates a reference image acquisition request according to the module identifier.
Step 914, the terminal sends the generated reference image acquisition request to the server.
Step 916, the server obtains the corresponding specified temperature and reference image according to the module identifier included in the reference image acquisition request.
In step 918, the server sends the acquired designated temperature and the reference image to the terminal.
And step 920, the terminal receives the designated temperature and the reference image sent by the server, and writes the designated temperature and the reference image into a safe operation environment of the terminal for storage.
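The interaction in steps 902 to 920 can be condensed into a toy end-to-end sketch, with the server modelled as a dictionary and reference images as placeholder strings (all names and values below are illustrative assumptions):

```python
# Toy model of the calibration-device -> server -> terminal flow above.
# The server is a dict; "images" are placeholder strings.

server = {}

def calibrate_and_upload(module_id, specified_temps):
    # Steps 902-908: one reference image is captured per specified temperature
    # and uploaded together with the module identifier.
    records = [(t, f"reference_image@{t}C") for t in specified_temps]
    server[module_id] = records  # step 910: server stores the records

def terminal_install(module_id):
    # Steps 912-920: the terminal requests by module identifier, then writes
    # the received (temperature, reference image) pairs into local storage,
    # standing in here for the secure operating environment.
    records = server.get(module_id, [])
    secure_storage = dict(records)
    return secure_storage

calibrate_and_upload("MOD-001", [25, 40])
local = terminal_install("MOD-001")
```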
The camera calibration device provided by the embodiment can acquire the corresponding reference image according to the temperature of the camera module, and calculate the depth information of the object according to the reference depth information in the reference image, thereby avoiding errors caused by the temperature change of the camera module and improving the accuracy of image processing.
Fig. 10 is a schematic structural diagram of a camera calibration device in one embodiment. As shown in fig. 10, the camera calibration device 1000 includes a temperature control module 1002, an image acquisition module 1004, and an image storage module 1006. Wherein:
and the temperature control module 1002 is used for controlling the temperature of the camera module to reach at least two different specified temperatures, wherein the camera module comprises a light emitter and a camera.
An image obtaining module 1004, configured to control the camera to collect a reference image formed when the light emitter irradiates a reference plane at the specified temperature.
An image storage module 1006, configured to store the specified temperature in correspondence with a reference image; the reference image is provided with reference depth information, and the reference depth information is used for calculating the depth information of the object.
The camera calibration device provided by this embodiment can control the temperature of the camera module to reach at least two different specified temperatures, control the camera module to collect the reference images formed at the different specified temperatures, and store the reference images in correspondence with the specified temperatures. Because the camera module deforms at different temperatures, and temperature itself also affects the images the camera module collects, the camera module is controlled to collect images at different specified temperatures during camera calibration. In this way, the corresponding reference image can be obtained according to the temperature of the camera module, and the depth information of the object can be calculated according to the reference depth information in the reference image, avoiding errors caused by temperature changes of the camera module and improving the accuracy of image processing.
Fig. 11 is a schematic structural diagram of a camera calibration device in another embodiment. As shown in fig. 11, the camera calibration apparatus 1100 includes a temperature control module 1102, an image acquisition module 1104, an image storage module 1106, and a depth calculation module 1108. Wherein:
and the temperature control module 1102 is used for controlling the temperature of the camera module to reach at least two different specified temperatures, wherein the camera module comprises a light emitter and a camera.
And the image acquisition module 1104 is used for controlling the camera to acquire a reference image formed when the light emitter irradiates a reference plane at the specified temperature.
An image storage module 1106, configured to store the specified temperature in correspondence with a reference image; the reference image is provided with reference depth information, and the reference depth information is used for calculating the depth information of the object.
The depth calculation module 1108 is configured to control the camera module to acquire a speckle image when it is detected that the camera module is opened; when the temperature change of the light emitter is detected to exceed a temperature threshold value, acquiring the current temperature of the light emitter; and acquiring a corresponding target reference image according to the current temperature of the light emitter, and calculating a depth image according to the speckle image and the target reference image, wherein the depth image is used for representing depth information of the object.
The camera calibration device provided by the embodiment can acquire the corresponding reference image according to the temperature of the camera module, and calculate the depth information of the object according to the reference depth information in the reference image, thereby avoiding errors caused by the temperature change of the camera module and improving the accuracy of image processing.
In one embodiment, the temperature control module 1102 is further configured to input pulse width modulation (PWM) signals of at least two different frequencies to the light emitter, so that the temperature of the light emitter is controlled by the PWM signals to reach at least two different specified temperatures.
In an embodiment, the image storage module 1106 is further configured to establish a corresponding relationship between the specified temperature and the reference image, and write the specified temperature and the reference image into a secure operating environment of the terminal for storage.
In one embodiment, the image storage module 1106 is further configured to write the specified temperature and the reference image into a first secure operating environment of the terminal for storage; and when the terminal is detected to be started, loading the specified temperature and the reference image from the first safe operation environment to a second safe operation environment for storage.
In an embodiment, the image storage module 1106 is further configured to obtain a module identifier of the camera module, and establish a corresponding relationship between the module identifier, the specified temperature, and the reference image; and storing the module identification, the specified temperature and the reference image into a server.
In an embodiment, the image storage module 1106 is further configured to, when the server receives a reference image acquisition request sent by a terminal, acquire a corresponding specified temperature and a reference image according to a module identifier included in the reference image acquisition request; and sending the specified temperature and the reference image to the terminal.
The division of the modules in the camera calibration device is merely used for illustration, and in other embodiments, the camera calibration device may be divided into different modules as needed to complete all or part of the functions of the camera calibration device.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the camera calibration methods provided by the above-described embodiments.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform the camera calibration method as provided by the above embodiments.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (18)

1. An image processing method is applied to an electronic device with a camera module, wherein the camera module comprises a light emitter and a camera, and the method comprises the following steps:
receiving an image acquisition instruction;
controlling the camera module to collect speckle images;
detecting the temperature of the light emitter, and judging whether the temperature change of the light emitter exceeds a temperature threshold value;
when the temperature change of the light emitter is detected to exceed a temperature threshold value, acquiring the current temperature of the light emitter;
and acquiring a corresponding target reference image according to the current temperature of the light emitter, and calculating a depth image according to the speckle image and the target reference image, wherein the depth image is used for representing depth information of the object.
2. The method of claim 1, wherein the obtaining a corresponding target reference image according to the current temperature of the light emitter comprises:
acquiring a corresponding target reference image from a safe operation environment of the terminal according to the current temperature of the light emitter;
the terminal comprises a terminal body, a terminal, a reference image and a temperature sensor, wherein the terminal is provided with a safety operation environment, and the terminal is provided with a designated temperature and a corresponding reference image in a storage mode.
3. The method of claim 2, wherein the obtaining of the corresponding target reference image from the safe operating environment of the terminal according to the current temperature of the light emitter comprises:
and when the terminal is detected to be started, loading the specified temperature and the reference image from a first safe operation environment to a second safe operation environment for storage, and acquiring a corresponding target reference image from the second safe operation environment of the terminal according to the current temperature of the light emitter, wherein the storage space in the first safe operation environment is larger than that in the second safe operation environment.
4. The method of claim 3,
the receiving an image acquisition instruction comprises:
sending the image acquisition instruction to a second processing unit in a second safe operation environment through a first processing unit in the first safe operation environment;
the calculating a depth image from the speckle image and a target reference image comprises:
and a second processing unit in the second safe operating environment calculates a depth image according to the speckle image and the target reference image.
5. The method of claim 4, further comprising:
controlling the camera module to collect an infrared image and sending the infrared image to a second processing unit, wherein the second processing unit calculates an infrared parallax image according to the infrared image;
obtaining a depth parallax image according to the depth image;
and sending the infrared parallax image and the depth parallax image to the first processing unit, wherein the first processing unit performs correction according to the infrared parallax image to obtain a corrected infrared image, and performs correction according to the depth parallax image to obtain a corrected depth image.
6. The method of claim 5, further comprising:
detecting whether a human face exists in the corrected infrared image, and whether the detected human face matches a stored human face;
detecting whether the human face is a living face according to the corrected infrared image and the corrected depth image;
after the face recognition passes and the detected face is determined to be a living face, sending, by the first processing unit, one or more of the corrected infrared image, the corrected depth image and the face recognition result to a target application program.
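Claim 6 gates what reaches the application on two conditions: the face must match and must be judged live. A hedged sketch of that gate, with the matching and liveness predicates as stand-ins (the actual models are outside the claim text):

```python
# Illustrative release gate: results reach the target application only
# when face matching succeeds AND liveness detection passes. The boolean
# inputs stand in for real recognition/liveness models.

def release_to_app(face_matches, is_live, corrected_ir, corrected_depth):
    if face_matches and is_live:
        return {"infrared": corrected_ir, "depth": corrected_depth,
                "face_recognition": "pass"}
    return None  # nothing is sent to the target application

print(release_to_app(True, True, "ir_img", "depth_img"))
print(release_to_app(True, False, "ir_img", "depth_img"))  # spoof -> None
```

Requiring depth-based liveness in addition to 2D matching is what blocks photo or screen replay attacks, which is the practical point of combining the corrected infrared and depth images.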
7. The method of claim 1, wherein the obtaining a corresponding target reference image according to the current temperature of the light emitter comprises:
sending a reference image acquisition request to a server, and acquiring the corresponding target reference image from the server according to the current temperature of the light emitter;
wherein the server stores a module identification of the camera module, the specified temperature and the reference image, and the module identification, the specified temperature and the reference image are in correspondence with one another.
8. The method of claim 1, further comprising:
sending a reference image acquisition request to a server, wherein the reference image acquisition request includes a module identification, the server stores the module identification of the camera module, the specified temperature and the reference image, and the module identification, the specified temperature and the reference image are in correspondence with one another;
receiving and storing the specified temperature and the reference image sent by the server.
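Claims 7 and 8 describe fetching a module's calibration data from a server keyed by module identification. A minimal sketch, with an in-memory dict standing in for the server and all identifiers invented:

```python
# Hedged sketch of the server exchange: the terminal sends its module ID,
# the server answers with the (specified temperature -> reference image)
# pairs calibrated for that specific module. The dict is a stand-in for
# real server storage; names are assumptions.

SERVER_DB = {  # module identification -> {specified temperature: reference image}
    "module_A1": {30.0: "refA_30C", 60.0: "refA_60C"},
}

def request_reference_images(module_id):
    # Terminal side: reference image acquisition request carrying the
    # module identification; returns None for an unknown module.
    return SERVER_DB.get(module_id)

pairs = request_reference_images("module_A1")
print(pairs[60.0])  # -> "refA_60C"
```

Keying by module identification matters because the speckle pattern and its temperature drift are specific to each physical emitter, so calibration data cannot be shared across modules.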
9. A camera calibration method, characterized in that the method comprises:
controlling the temperature of a camera module to reach at least two different specified temperatures, wherein the camera module comprises a light emitter and a camera;
controlling the camera to collect a reference image formed when the light emitter irradiates a reference plane at the specified temperature;
storing the specified temperature in correspondence with the reference image, wherein the reference image carries reference depth information, and the reference depth information is used for calculating depth information of an object.
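The calibration loop of claim 9 can be sketched as: drive the module to each specified temperature, capture the speckle pattern the emitter casts on the reference plane, and record the pair. The hardware calls below are stubs; only the loop structure mirrors the claim.

```python
# Illustrative calibration loop. set_module_temperature and
# capture_reference_image are stubs for real hardware control (e.g. PWM
# heating and a camera capture); their return values are invented.

def set_module_temperature(t):
    return t  # stub: the real device heats the light emitter to t

def capture_reference_image(t):
    return f"speckle_ref@{t}C"  # stub for capturing the reference plane

def calibrate(specified_temps):
    table = {}
    for t in specified_temps:
        set_module_temperature(t)
        table[t] = capture_reference_image(t)  # store temp -> image
    return table

table = calibrate([30.0, 55.0, 80.0])
print(table[55.0])  # -> "speckle_ref@55.0C"
```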
10. The method of claim 9, wherein controlling the temperature of the camera module to at least two different specified temperatures comprises:
inputting pulse width modulation (PWM) signals of at least two different frequencies to the light emitter, and controlling the temperature of the light emitter to reach the at least two different specified temperatures through the PWM signals.
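The idea behind claim 10 is that different PWM drive signals deposit different average power in the emitter, so each settles at a different steady-state temperature. The linear thermal model below is purely an assumption for illustration; real emitters need a measured thermal characteristic.

```python
# Toy thermal model: steady-state emitter temperature rises with the PWM
# drive frequency. The linear coefficient k and ambient value are invented;
# a real module would use a calibrated curve, not this formula.

def steady_state_temp(pwm_freq_hz, ambient=25.0, k=0.002):
    return ambient + k * pwm_freq_hz

temps = [steady_state_temp(f) for f in (10_000, 25_000)]
print(temps)  # two different specified temperatures from two frequencies
```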
11. The method of claim 9, wherein storing the specified temperature in correspondence with a reference image comprises:
establishing a correspondence between the specified temperature and the reference image, and writing the specified temperature and the reference image into a safe operating environment of the terminal for storage.
12. The method of claim 11, wherein writing the specified temperature and the reference image to a secure operating environment of the terminal for storage comprises:
writing the specified temperature and the reference image into a first safe operating environment of the terminal for storage;
when it is detected that the terminal is started, loading the specified temperature and the reference image from the first safe operating environment into a second safe operating environment for storage.
13. The method of claim 9, wherein storing the specified temperature in correspondence with a reference image comprises:
acquiring a module identification of the camera module, and establishing a correspondence among the module identification, the specified temperature and the reference image;
storing the module identification, the specified temperature and the reference image in a server.
14. The method of claim 13, wherein after storing the module identification, the specified temperature, and the reference image in a server, further comprising:
when the server receives a reference image acquisition request sent by a terminal, acquiring the corresponding specified temperature and reference image according to the module identification contained in the reference image acquisition request;
sending the specified temperature and the reference image to the terminal.
15. An image processing device, applied to an electronic device having a camera module, wherein the camera module comprises a light emitter and a camera, the device comprising:
the instruction receiving module is used for receiving an image acquisition instruction;
the temperature detection module is used for detecting the temperature of the light emitter and judging whether a temperature change of the light emitter exceeds a temperature threshold;
the depth calculation module is used for controlling the camera module to collect a speckle image; acquiring the current temperature of the light emitter when it is detected that the temperature change of the light emitter exceeds the temperature threshold; acquiring a corresponding target reference image according to the current temperature of the light emitter; and calculating a depth image according to the speckle image and the target reference image, wherein the depth image is used for representing depth information of an object.
16. A camera calibration device, characterized in that the device comprises:
the temperature control module is used for controlling the temperature of the camera module to reach at least two different specified temperatures, wherein the camera module comprises a light emitter and a camera;
the image acquisition module is used for controlling the camera to acquire a reference image formed when the light emitter irradiates a reference plane at the specified temperature;
the image storage module is used for correspondingly storing the specified temperature and a reference image; the reference image is provided with reference depth information, and the reference depth information is used for calculating the depth information of the object.
17. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1 to 14.
18. An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1-14.
CN202011602675.5A 2018-06-28 2018-06-28 Image processing method, camera calibration method, device, storage medium and electronic equipment Pending CN112802116A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011602675.5A CN112802116A (en) 2018-06-28 2018-06-28 Image processing method, camera calibration method, device, storage medium and electronic equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810690949.7A CN108921903B (en) 2018-06-28 2018-06-28 Camera calibration method, device, computer readable storage medium and electronic equipment
CN202011602675.5A CN112802116A (en) 2018-06-28 2018-06-28 Image processing method, camera calibration method, device, storage medium and electronic equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810690949.7A Division CN108921903B (en) 2018-04-28 2018-06-28 Camera calibration method, device, computer readable storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN112802116A true CN112802116A (en) 2021-05-14

Family

ID=64423398

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810690949.7A Active CN108921903B (en) 2018-04-28 2018-06-28 Camera calibration method, device, computer readable storage medium and electronic equipment
CN202011602675.5A Pending CN112802116A (en) 2018-06-28 2018-06-28 Image processing method, camera calibration method, device, storage medium and electronic equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201810690949.7A Active CN108921903B (en) 2018-04-28 2018-06-28 Camera calibration method, device, computer readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (2) CN108921903B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3621293B1 (en) 2018-04-28 2022-02-09 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method, apparatus and computer-readable storage medium
CN111381222A (en) * 2018-12-27 2020-07-07 浙江舜宇智能光学技术有限公司 Temperature calibration equipment and method for TOF camera module
CN109903345B (en) * 2019-04-09 2023-04-25 歌尔光学科技有限公司 Depth module calibration method, calibration device and computer readable storage medium
CN110136209B (en) * 2019-05-21 2021-04-20 Oppo广东移动通信有限公司 Camera calibration method and device and computer readable storage medium
CN110599550A (en) * 2019-09-09 2019-12-20 香港光云科技有限公司 Calibration system of RGB-D module and equipment and method thereof
CN110827363A (en) * 2019-11-08 2020-02-21 深圳深岚视觉科技有限公司 Data processing method, data processing device, computer equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107657635A (en) * 2017-10-17 2018-02-02 深圳奥比中光科技有限公司 The bearing calibration of depth camera temperature error and system
US20180061056A1 (en) * 2016-08-30 2018-03-01 Microsoft Technology Licensing, Llc Temperature Compensation for Structured Light Depth Imaging System

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105120257B (en) * 2015-08-18 2017-12-15 宁波盈芯信息科技有限公司 A kind of vertical depth sensing device based on structure light coding
CN105141939B (en) * 2015-08-18 2017-05-17 宁波盈芯信息科技有限公司 Three-dimensional depth perception method and three-dimensional depth perception device based on adjustable working range
CN107659985B (en) * 2017-08-09 2021-03-09 Oppo广东移动通信有限公司 Method and device for reducing power consumption of mobile terminal, storage medium and mobile terminal
CN108073891A (en) * 2017-11-10 2018-05-25 广东日月潭电源科技有限公司 A kind of 3 D intelligent face identification system


Also Published As

Publication number Publication date
CN108921903A (en) 2018-11-30
CN108921903B (en) 2021-01-15

Similar Documents

Publication Publication Date Title
CN108764052B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108921903B (en) Camera calibration method, device, computer readable storage medium and electronic equipment
EP3621293B1 (en) Image processing method, apparatus and computer-readable storage medium
CN108805024B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108804895B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108549867B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108668078B (en) Image processing method, device, computer readable storage medium and electronic equipment
US11256903B2 (en) Image processing method, image processing device, computer readable storage medium and electronic device
WO2019206020A1 (en) Image processing method, apparatus, computer-readable storage medium, and electronic device
CN108711054B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
EP3627827B1 (en) Method for controlling photographing, electronic device, and computer readable storage medium
CN109145653B (en) Data processing method and device, electronic equipment and computer readable storage medium
CN110191266B (en) Data processing method and device, electronic equipment and computer readable storage medium
CN108650472B (en) Method and device for controlling shooting, electronic equipment and computer-readable storage medium
CN108830141A (en) Image processing method, device, computer readable storage medium and electronic equipment
CN108833887B (en) Data processing method and device, electronic equipment and computer readable storage medium
CN108712400B (en) Data transmission method and device, computer readable storage medium and electronic equipment
EP3621294B1 (en) Method and device for image capture, computer readable storage medium and electronic device
CN108924421B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
WO2019196669A1 (en) Laser-based security verification method and apparatus, and terminal device
CN108833885B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108881712B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
EP3644261B1 (en) Image processing method, apparatus, computer-readable storage medium, and electronic device
CN108810516B (en) Data processing method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination