CN108833887B - Data processing method and device, electronic equipment and computer readable storage medium


Info

Publication number
CN108833887B
Authority
CN
China
Prior art keywords
processing unit
image
bus
laser
floodlight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810401326.3A
Other languages
Chinese (zh)
Other versions
CN108833887A (en)
Inventor
周海涛
谭国辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810401326.3A priority Critical patent/CN108833887B/en
Publication of CN108833887A publication Critical patent/CN108833887A/en
Priority to EP19792981.3A priority patent/EP3672223B1/en
Priority to PCT/CN2019/083854 priority patent/WO2019206129A1/en
Priority to ES19792981T priority patent/ES2938471T3/en
Priority to US16/743,533 priority patent/US11050918B2/en
Application granted granted Critical
Publication of CN108833887B publication Critical patent/CN108833887B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The embodiment of the application relates to a data processing method and device, electronic equipment and a computer readable storage medium. The electronic equipment comprises a camera module, a first processing unit and a second processing unit, wherein the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a laser camera, a floodlight and a laser lamp, and the laser camera, the floodlight, the laser lamp and the first processing unit are connected with the same two-wire serial I2C bus; and the first processing unit is used for controlling to start at least one of the floodlight and the laser lamp through the I2C bus when receiving the image acquisition instruction sent by the second processing unit, controlling the laser camera to acquire a target image through the I2C bus, processing the target image, and sending the processed target image to the second processing unit. The data processing method, the data processing device, the electronic equipment and the computer readable storage medium can reduce the complexity of a control circuit and reduce the cost.

Description

Data processing method and device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a data processing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
3D (3 Dimensions, three-dimensional) human faces play an important role in application scenarios such as face recognition, face beautification, 3D model building and the like. The electronic equipment can emit laser through a laser such as a laser lamp, collect through the camera the face image irradiated by the laser, and construct the 3D face through structured light. In the conventional approach, the control circuit with which an electronic device controls the laser, the camera and the like is complex and costly.
Disclosure of Invention
Embodiments of the present application provide a data processing method, an apparatus, an electronic device, and a computer-readable storage medium, which can reduce complexity of a control circuit and reduce cost.
A data processing method is applied to electronic equipment, wherein the electronic equipment comprises a camera module, a first processing unit and a second processing unit, and the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a laser camera, a floodlight and a laser lamp, and the laser camera, the floodlight, the laser lamp and the first processing unit are connected with the same bidirectional two-wire system synchronous serial I2C bus;
the method comprises the following steps:
when the first processing unit receives an image acquisition instruction sent by the second processing unit, controlling to turn on at least one of the floodlight and the laser light through the I2C bus;
the first processing unit controls the laser camera to collect a target image through the I2C bus;
and processing the target image through the first processing unit, and sending the processed target image to the second processing unit.
A data processing device is suitable for electronic equipment, wherein the electronic equipment comprises a camera module, a first processing unit and a second processing unit, and the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a laser camera, a floodlight and a laser lamp, and the laser camera, the floodlight, the laser lamp and the first processing unit are connected with the same two-wire serial I2C bus;
the apparatus, comprising:
the first control module is used for controlling to turn on at least one of the floodlight and the laser lamp through the I2C bus when the first processing unit receives an image acquisition instruction sent by the second processing unit;
the second control module is used for controlling the laser camera to collect a target image through the I2C bus;
and the processing module is used for processing the target image through the first processing unit and sending the processed target image to the second processing unit.
An electronic device comprises a camera module, a first processing unit and a second processing unit, wherein the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a laser camera, a floodlight and a laser lamp, and the laser camera, the floodlight, the laser lamp and the first processing unit are connected with the same two-wire serial I2C bus;
the first processing unit is used for controlling to start at least one of the floodlight and the laser lamp through the I2C bus when receiving an image acquisition instruction sent by the second processing unit, controlling the laser camera to acquire a target image through the I2C bus, processing the target image, and sending the processed target image to the second processing unit.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method as set forth above.
According to the data processing method, the data processing device, the electronic equipment and the computer readable storage medium, the laser camera, the floodlight, the laser lamp and the first processing unit are connected to the same I2C bus; the first processing unit controls turning on at least one of the floodlight and the laser lamp through the I2C bus and controls the laser camera to collect a target image through the same I2C bus. Controlling the floodlight, the laser lamp and the laser camera through the same I2C bus multiplexes the I2C bus, which can reduce the complexity of the control circuit and reduce the cost.
Drawings
FIG. 1 is a diagram of an exemplary implementation of a data processing method;
FIG. 2 is a diagram of an application scenario of a data processing method in another embodiment;
FIG. 3 is a block diagram of an electronic device in one embodiment;
FIG. 4 is a flow diagram illustrating a data processing method according to one embodiment;
FIG. 5 is a schematic flow chart illustrating controlling the turning on of at least one of the floodlight and the laser light via the I2C bus in one embodiment;
FIG. 6 is a schematic flow chart illustrating processing of a speckle image to obtain a depth image according to an embodiment;
FIG. 7 is a schematic flow chart of writing a reference speckle image in the first processing unit in one embodiment;
FIG. 8 is a schematic diagram illustrating a flow of a second processing unit sending image capture instructions to a first processing unit in one embodiment;
FIG. 9 is a block diagram of a data processing apparatus in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like used herein may be used to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is a diagram illustrating an application scenario of a data processing method according to an embodiment. As shown in fig. 1, the electronic device includes a laser camera 102, a laser light 104, a floodlight 106, a first processing unit 110, a second processing unit 120, and a controller 130. The first processing unit 110 may be an MCU (Microcontroller Unit) module or the like, and the second processing unit 120 may be a CPU (Central Processing Unit) module or the like. The first processing unit 110 may be connected with the laser camera 102, the laser light 104, the floodlight 106 and the second processing unit 120. The controller 130 can be connected to the laser lamp 104 and the floodlight 106 respectively, and the controller 130 can control the laser lamp 104 and the floodlight 106. The laser camera 102, the controller 130 and the first processing unit 110 are connected to the same two-wire serial I2C bus.
When the first processing unit 110 receives the image acquisition instruction sent by the second processing unit 120, at least one of the floodlight 106 and the laser light 104 can be controlled to be turned on through the I2C bus. The first processing unit 110 can send a control command to the controller 130 connected to the I2C bus; after receiving the control command, the controller 130 can control turning on at least one of the floodlight 106 and the laser light 104 according to the control command, and the first processing unit 110 can light the floodlight 106 or the laser light 104 through Pulse Width Modulation (PWM). The first processing unit 110 can control the laser camera 102 to capture a target image through the I2C (Inter-Integrated Circuit, bidirectional two-wire synchronous serial) bus. The first processing unit 110 processes the acquired target image, and may send the processed target image to the second processing unit 120.
Fig. 2 is an application scenario diagram of a data processing method in another embodiment. As shown in fig. 2, the electronic device 200 may include a camera module 210, a second processing unit 220, and a first processing unit 230. The second processing unit 220 may be a CPU module. The first processing unit 230 may be an MCU module. The first processing unit 230 is connected between the second processing unit 220 and the camera module 210, the first processing unit 230 can control the laser camera 212, the floodlight 214 and the laser light 218 in the camera module 210, and the second processing unit 220 can control the RGB camera 216 in the camera module 210.
The camera module 210 includes a laser camera 212, a floodlight 214, an RGB camera 216, and a laser light 218. The laser camera 212 may be an infrared camera for acquiring infrared images. The floodlight 214 is a surface light source capable of emitting infrared light; the laser lamp 218 is a point light source capable of emitting laser light and is a point light source with a pattern. When the floodlight 214 emits a surface light source, the laser camera 212 can obtain an infrared image according to the reflected light. When the laser lamp 218 emits a point light source, the laser camera 212 may obtain a speckle image according to the reflected light. The speckle image is an image of the pattern deformation after the point light source with the pattern emitted by the laser lamp 218 is reflected. The laser camera 212, the floodlight 214, the laser light 218 and the first processing unit 230 can be connected with the same I2C bus.
The second processing unit 220 may include a CPU core operating in a TEE (Trusted Execution Environment) and a CPU core operating in a REE (Rich Execution Environment). The TEE and the REE are both operation modes of an ARM module (Advanced RISC Machines, an advanced reduced instruction set computer processor). The security level of the TEE is higher, and only one CPU core in the second processing unit 220 can operate in the TEE at the same time. Generally, an operation with a higher security requirement in the electronic device 200 needs to be executed in the CPU core in the TEE, and an operation with a lower security requirement can be executed in the CPU core in the REE.
The first processing unit 230 includes a PWM module 232, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) interface 234, a RAM (Random Access Memory) module 236, and a depth engine 238. The first processing unit 230 can control the floodlight 214 or the laser light 218 via the connected I2C bus, and the PWM module 232 can emit pulses to the camera module to light the floodlight 214 or the laser light 218 that has been turned on. The first processing unit 230 can control the laser camera 212 to capture infrared images or speckle images through the I2C bus. The SPI/I2C interface 234 is used for receiving the image capturing instruction sent by the second processing unit 220. The depth engine 238 can process the speckle images to obtain a depth disparity map.
When the second processing unit 220 receives a data acquisition request of an application program, for example, when the application program needs to perform face unlocking or face payment, an image acquisition instruction may be sent to the first processing unit 230 through the CPU core operating in the TEE. After the first processing unit 230 receives the image acquisition instruction, it can control turning on the floodlight 214 in the camera module 210 through the I2C bus, light the floodlight 214 by emitting pulse waves through the PWM module 232, and control the laser camera 212 through the I2C bus to collect an infrared image; it can also control turning on the laser light 218 in the camera module 210 through the I2C bus and control the laser camera 212 through the I2C bus to collect a speckle image. The camera module 210 may send the collected infrared image and speckle image to the first processing unit 230. The first processing unit 230 may process the received infrared image to obtain an infrared disparity map, and process the received speckle image to obtain a speckle disparity map or a depth disparity map. The processing of the infrared image and the speckle image by the first processing unit 230 refers to correcting the infrared image or the speckle image to remove the influence of internal and external parameters of the camera module 210 on the images. The first processing unit 230 can be set to different modes, and the images output in different modes are different. When the first processing unit 230 is set to the speckle pattern mode, the first processing unit 230 processes the speckle image to obtain a speckle disparity map, from which a target speckle image can be obtained; when the first processing unit 230 is set to the depth map mode, the first processing unit 230 processes the speckle image to obtain a depth disparity map, from which a depth image can be obtained, the depth image being an image with depth information. The first processing unit 230 may send the infrared disparity map and the speckle disparity map to the second processing unit 220, or send the infrared disparity map and the depth disparity map to the second processing unit 220. The second processing unit 220 may obtain a target infrared image according to the infrared disparity map and obtain a depth image according to the depth disparity map. Further, the second processing unit 220 may perform face recognition, face matching, living body detection and depth information acquisition for the detected face according to the target infrared image and the depth image.
The communication between the first processing unit 230 and the second processing unit 220 goes through a fixed secure interface to ensure the security of the transmitted data. As shown in fig. 2, the data sent by the second processing unit 220 to the first processing unit 230 passes through a SECURE SPI/I2C 240, and the data sent by the first processing unit 230 to the second processing unit 220 passes through a SECURE MIPI (Mobile Industry Processor Interface) 250.
In an embodiment, the first processing unit 230 may also obtain a target infrared image according to the infrared disparity map, calculate and obtain a depth image according to the depth disparity map, and then send the target infrared image and the depth image to the second processing unit 220.
FIG. 3 is a block diagram of an electronic device in one embodiment. As shown in fig. 3, the electronic device includes a processor, a memory, a display screen, and an input device connected through a system bus. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium of the electronic device stores an operating system and a computer program, and the computer program is executed by the processor to implement the data processing method provided in the embodiments of the present application. The processor is used for providing calculation and control capability and supporting the operation of the whole electronic device. The internal memory in the electronic device provides an environment for the execution of the computer program in the non-volatile storage medium. The display screen of the electronic device may be a liquid crystal display screen or an electronic ink display screen, and the input device may be a touch layer covering the display screen, a key, a track ball or a touch pad arranged on a housing of the electronic device, or an external keyboard, touch pad or mouse. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like. Those skilled in the art will appreciate that the architecture shown in fig. 3 is a block diagram of only a portion of the architecture associated with the present application and does not constitute a limitation on the electronic devices to which the present application may be applied; a particular electronic device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
As shown in fig. 4, in one embodiment, there is provided a data processing method comprising the steps of:
Step 410, when the first processing unit receives the image acquisition instruction sent by the second processing unit, controlling to turn on at least one of the floodlight and the laser light through the I2C bus.
When an application program in the electronic device needs to acquire face data, the application program may send a data acquisition request to the second processing unit, where the face data may include, but is not limited to, data that needs to be subjected to face verification in scenes such as face unlocking and face payment, face depth information, and the like. After receiving the data acquisition request, the second processing unit may send an image acquisition instruction to the first processing unit, where the first processing unit may be an MCU module, and the second processing unit may be a CPU module.
The first processing unit and the laser camera, the floodlight and the laser light in the camera module can be connected to the same I2C bus. The I2C bus can implement data transfer between the devices connected to it via one data line and one clock line. After the first processing unit receives the image acquisition instruction sent by the second processing unit, it can send a control instruction through the I2C bus to the floodlight and/or the laser light that are also connected to the I2C bus, and control turning on at least one of the floodlight and the laser light.
In one embodiment, after receiving the image acquisition instruction, the first processing unit may determine, according to the image acquisition instruction, whether the floodlight or the laser light currently needs to be controlled. If the floodlight needs to be turned on, the first processing unit can address the floodlight connected to the I2C bus through the I2C bus and then send a control instruction to the floodlight to control the floodlight to be turned on. If the laser lamp needs to be turned on, the first processing unit can address the laser lamp connected to the I2C bus through the I2C bus and then send a control instruction to the laser lamp to control the laser lamp to be turned on.
Step 420, the first processing unit controls the laser camera to acquire a target image through the I2C bus.
After the first processing unit controls turning on at least one of the floodlight and the laser light through the I2C bus, the laser camera can be controlled through the I2C bus to collect a target image, where the target image may include an infrared image, a speckle image and the like. The first processing unit can control turning on the floodlight in the camera module through the I2C bus and control the laser camera through the I2C bus to collect an infrared image; the floodlight can be a point light source that irradiates uniformly in all directions, the light emitted by the floodlight can be infrared light, and the laser camera can capture the human face to obtain an infrared image. The first processing unit can also control turning on the laser lamp in the camera module through the I2C bus and control the laser camera through the I2C bus to collect a speckle image and the like. The laser emitted by the laser is diffracted by a lens and a DOE (Diffractive Optical Element) to generate a pattern with speckle particles, which is projected onto the target object; because each point of the target object is at a different distance from the electronic device, the speckle pattern is shifted by different amounts, and the laser camera captures the target object to obtain the speckle image.
In one embodiment, the first processing unit addresses the floodlight or the laser light connected to the I2C bus through the I2C bus and sends a control command to the floodlight or the laser light, and after the floodlight or the laser light is controlled to be turned on, the first processing unit addresses the laser camera connected to the I2C bus through the I2C bus and sends a control command to the laser camera to control the laser camera to acquire an infrared image or a speckle image.
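As an illustrative sketch only (not part of the patent text), the following C fragment shows how a processing unit running a Linux-style userspace I2C interface could address devices sharing one I2C bus and send them control commands, in the spirit of the addressing described above. The bus path, device addresses, register numbers and command values are invented placeholders; a real first processing unit (MCU) would typically use its vendor's I2C driver instead of /dev/i2c-*.

/* Illustrative sketch only: bus path, device addresses, registers and
 * command values are hypothetical placeholders, not taken from the patent. */
#include <fcntl.h>
#include <linux/i2c-dev.h>
#include <stdint.h>
#include <sys/ioctl.h>
#include <unistd.h>

#define I2C_DEV         "/dev/i2c-1"  /* the single shared I2C bus */
#define ADDR_CONTROLLER 0x3A          /* hypothetical floodlight/laser-lamp controller */
#define ADDR_LASER_CAM  0x10          /* hypothetical laser camera */

/* Address one device on the shared bus and write a single register. */
static int i2c_write_reg(int fd, uint8_t dev_addr, uint8_t reg, uint8_t val)
{
    uint8_t buf[2] = { reg, val };
    if (ioctl(fd, I2C_SLAVE, dev_addr) < 0)   /* select the target device */
        return -1;
    return (write(fd, buf, sizeof buf) == sizeof buf) ? 0 : -1;
}

int main(void)
{
    int fd = open(I2C_DEV, O_RDWR);
    if (fd < 0)
        return 1;

    /* Control instruction over the I2C bus: ask the controller to turn on
     * the floodlight (register 0x01 = light select, value 0x01 = floodlight;
     * both values are placeholders). */
    i2c_write_reg(fd, ADDR_CONTROLLER, 0x01, 0x01);

    /* Then address the laser camera on the same bus and trigger a capture
     * (register 0x20 = capture trigger; placeholder). */
    i2c_write_reg(fd, ADDR_LASER_CAM, 0x20, 0x01);

    close(fd);
    return 0;
}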
Step 430, processing the target image through the first processing unit, and sending the processed target image to the second processing unit.
The laser camera can send the collected target image to the first processing unit, and the first processing unit can process the target image. The first processing unit can be set to different modes; different modes can collect different target images and apply different processing to them. When the first processing unit is in an infrared mode, it can control turning on the floodlight through the I2C bus, control the laser camera through the I2C bus to collect an infrared image, and process the infrared image to obtain an infrared disparity map. When the first processing unit is in a speckle pattern mode, it can control turning on the laser lamp through the I2C bus, control the laser camera through the I2C bus to collect a speckle image, and process the speckle image to obtain a speckle disparity map. When the first processing unit is in a depth map mode, it can process the speckle image to obtain a depth disparity map.
In one embodiment, the first processing unit may perform correction processing on the target image. The correction processing corrects the image content offset of the target image caused by the internal and external parameters of the laser camera and the RGB camera, for example, the offset caused by a deflection angle of the laser camera or by the placement positions of the laser camera and the RGB camera. After the target image is corrected, a disparity map of the target image can be obtained: for example, correcting an infrared image yields an infrared disparity map, and correcting a speckle image yields a speckle disparity map or a depth disparity map. Performing the correction processing on the target image can prevent ghosting in the image finally presented on the screen of the electronic device.
The first processing unit processes the target image, and can send the processed target image to the second processing unit. The second processing unit can obtain required images such as infrared images, speckle images, depth images and the like according to the processed target images. The second processing unit can process the required image according to the requirement of the application program.
For example, when the application program needs to perform face verification, the second processing unit may perform face detection on the obtained required image, where the face detection may include face recognition, face matching and living body detection. Face recognition refers to recognizing whether a human face exists in the required image; face matching refers to matching the human face in the required image with a pre-stored human face; and living body detection refers to detecting whether the human face in the required image has biological activity. If the application program needs to acquire the depth information of the face, the generated depth image can be uploaded to the application program, and the application program can perform beautification processing, three-dimensional modeling and the like according to the received depth image.
In this embodiment, the laser camera, the floodlight, the laser lamp and the first processing unit are connected to the same I2C bus; the first processing unit controls turning on at least one of the floodlight and the laser lamp through the I2C bus, and controls the laser camera to collect the target image through the same I2C bus. Controlling the floodlight, the laser lamp and the laser camera through the same I2C bus multiplexes the I2C bus, which can reduce the complexity of the control circuit and reduce the cost.
As shown in FIG. 5, in one embodiment, the step of controlling the turning on of at least one of the floodlight and the laser lamp through the I2C bus comprises the following steps:
step 502, determining the type of the acquired image according to the image acquisition instruction.
The first processing unit receives the image acquisition instruction sent by the second processing unit and can determine the type of the image to be acquired according to the image acquisition instruction, where the image type can be one or more of an infrared image, a speckle image, a depth image and the like. The image type can be determined according to the face data required by the application program: after the second processing unit receives the data acquisition request, it can determine the image type according to the data acquisition request and send an image acquisition instruction containing the image type to the first processing unit. For example, when face unlocking requires an infrared image and a speckle image for face verification, the image type can be determined as infrared image and speckle image; when the application program requires face depth information, the image type can be determined as depth image; and so on, without being limited thereto.
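As a hedged illustration of how the image type might be carried in the image acquisition instruction and inspected by the first processing unit, the sketch below uses a bit-flag encoding; this encoding is an assumption made for the sketch, not the instruction format defined by the patent.

/* Illustrative sketch only: the bit-flag encoding of the image type is an
 * assumption, not the instruction format defined by the patent. */
#include <stdio.h>

#define IMG_INFRARED (1u << 0)
#define IMG_SPECKLE  (1u << 1)
#define IMG_DEPTH    (1u << 2)

/* The second processing unit builds the instruction from the application's
 * data acquisition request; the first processing unit inspects it. */
static void handle_acquisition_instruction(unsigned int image_types)
{
    if (image_types & IMG_INFRARED)
        printf("first control instruction: turn on floodlight, collect infrared image\n");
    if (image_types & (IMG_SPECKLE | IMG_DEPTH))
        printf("second control instruction: turn on laser lamp, collect speckle image\n");
}

int main(void)
{
    /* Face unlocking example: infrared image and speckle image are required. */
    handle_acquisition_instruction(IMG_INFRARED | IMG_SPECKLE);
    /* Face depth information example: depth image is required. */
    handle_acquisition_instruction(IMG_DEPTH);
    return 0;
}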
Step 504, if the image type is an infrared image, the first processing unit sends a first control instruction to the controller through the I2C bus, and the first control instruction is used for instructing the controller to turn on the floodlight.
The controller can be arranged in the electronic equipment, and the floodlight and the laser lamp can share the same controller. The controller can be connected to the floodlight and the laser lamp respectively and is used for controlling the floodlight and the laser lamp: it can control turning on the floodlight or the laser lamp, control switching between the floodlight and the laser lamp, and control the emission power of the floodlight and the laser lamp. The controller can be connected with the laser camera and the first processing unit through the same I2C bus.
If the image type is an infrared image, the first processing unit can send a first control instruction to the controller through the connected I2C bus, and the controller can switch to the floodlight according to the first control instruction and turn on the floodlight. The first processing unit may transmit pulses to the controller via the PWM module to light the floodlight. Alternatively, the first processing unit may address the controller via the I2C bus and send the first control instruction to the controller.
Step 506, if the image type is a speckle image or a depth image, the first processing unit sends a second control instruction to the controller through the I2C bus, and the second control instruction is used for instructing the controller to turn on the laser lamp.
If the image type is a speckle image or a depth image, the first processing unit can send a second control instruction to the controller through the connected I2C bus, and the controller can switch to the laser lamp according to the second control instruction and turn on the laser lamp. The first processing unit can emit pulses to the controller through the PWM module to light the laser lamp.
Alternatively, the image types may include multiple types, possibly including both infrared images and speckle images, or both infrared images and depth images, or infrared images, speckle images, depth images, and so forth. The first processing unit needs to respectively control to start the floodlight to collect the infrared image and control to start the laser light to collect the speckle image. The first processing unit can collect the infrared image firstly or the speckle image firstly, and does not limit the collecting sequence. When the image type includes an infrared image and a speckle image, or includes an infrared image and a depth image, the first processing unit may first send a first control instruction to the controller through the I2C bus, turn on the floodlight, control the laser camera to acquire the infrared image through the I2C bus, then send a second control instruction to the controller through the I2C bus, turn on the laser lamp, and control the laser camera to acquire the speckle image through the I2C bus.
In one embodiment, when the image type includes an infrared image and a speckle image, or includes an infrared image and a depth image, the first processing unit may also first send a second control command to the controller through the I2C bus, turn on the laser lamp, control the laser camera to acquire the speckle image through the I2C bus, then send a first control command to the controller through the I2C bus, turn on the floodlight, and control the laser camera to acquire the infrared image through the I2C bus. The time-sharing multiplexing of the same I2C bus can reduce the complexity of the control circuit and reduce the cost.
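The time-shared use of the single I2C bus described in the two preceding paragraphs can be summarised in a minimal sketch; the helper functions below are stubs that only print the action they stand for, and are assumptions rather than the patent's implementation.

/* Illustrative sketch of the time-shared I2C sequence; the helpers are stubs. */
#include <stdio.h>

static void send_first_control_instruction(void)  { puts("I2C: controller -> turn on floodlight"); }
static void send_second_control_instruction(void) { puts("I2C: controller -> turn on laser lamp"); }
static void trigger_infrared_capture(void)        { puts("I2C: laser camera -> collect infrared image"); }
static void trigger_speckle_capture(void)         { puts("I2C: laser camera -> collect speckle image"); }

int main(void)
{
    /* One possible order when the image type includes both an infrared image
     * and a speckle (or depth) image: infrared first, then speckle, with all
     * transactions time-shared over the same I2C bus. The opposite order is
     * equally possible, as the text notes. */
    send_first_control_instruction();
    trigger_infrared_capture();
    send_second_control_instruction();
    trigger_speckle_capture();
    return 0;
}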
In this embodiment, the floodlight and the laser light can be switched and controlled by one controller, so that the complexity of the control circuit can be further reduced, and the cost can be reduced.
As shown in FIG. 6, in one embodiment, step 430 of processing the target image through the first processing unit and sending the processed target image to the second processing unit includes the following steps:
step 602, a stored reference speckle image is obtained, the reference speckle image having reference depth information.
In the camera coordinate system, the straight line perpendicular to the imaging plane and passing through the center of the lens is taken as the Z axis; if the coordinates of an object in the camera coordinate system are (X, Y, Z), the Z value is the depth information of the object in the camera imaging plane. If the application program needs to acquire the depth information of the face, a depth image containing the face depth information needs to be collected. The first processing unit can control turning on the laser lamp through the I2C bus and control the laser camera through the I2C bus to collect a speckle image. A reference speckle image can be stored in advance in the first processing unit; the reference speckle image carries reference depth information, and the depth information of each pixel point contained in the collected speckle image can be obtained according to the collected speckle image and the reference speckle image.
Step 604, matching the reference speckle image with the speckle image to obtain a matching result.
The first processing unit may take each pixel point contained in the collected speckle image as a center in turn, select a pixel block of a predetermined size, for example 31 pixels by 31 pixels, and search the reference speckle image for a block that matches the selected pixel block. From the pixel block selected in the collected speckle image and the matched block in the reference speckle image, the first processing unit can find two points that lie on the same laser light path, one in the speckle image and one in the reference speckle image; the speckle information of two points on the same laser light path is consistent, and such two points can be identified as corresponding pixel points. The depth information of the points on each laser light path in the reference speckle image is known. The first processing unit can calculate the offset between the two corresponding pixel points of the target speckle image and the reference speckle image on the same laser light path, and calculate the depth information of each pixel point contained in the collected speckle image according to the offset.
In one embodiment, the first processing unit calculates an offset between the collected speckle image and the reference speckle pattern, and calculates depth information of each pixel point included in the speckle image according to the offset, where a calculation formula may be as shown in formula (1):
Z_D = (L × f × Z_0) / (L × f + P × Z_0)    (1)
where Z_D represents the depth information of a pixel point, that is, the depth value of the pixel point; L is the distance between the laser camera and the laser; f is the focal length of the lens in the laser camera; Z_0 is the depth value of the reference plane relative to the laser camera of the electronic equipment when the reference speckle image was collected; and P is the offset between corresponding pixel points in the collected speckle image and the reference speckle image. P can be obtained by multiplying the pixel offset between the target speckle image and the reference speckle image by the actual distance represented by one pixel. When the distance between the target object and the laser camera is greater than the distance between the reference plane and the laser camera, P is a negative value; when the distance between the target object and the laser camera is less than the distance between the reference plane and the laser camera, P is a positive value.
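The block matching and formula (1) can be combined into a small worked sketch. It is illustrative only: the exhaustive horizontal search with a sum-of-absolute-differences cost, the image size, the search range and the calibration constants are all assumptions, not the patent's depth engine.

/* Illustrative sketch of offset search and formula (1); constants and the
 * matching strategy are assumptions, not the patent's depth engine. */
#include <limits.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define W      640   /* image width  (placeholder) */
#define H      480   /* image height (placeholder) */
#define BLOCK  31    /* 31 x 31 pixel block, as in the text */
#define SEARCH 48    /* horizontal search range in pixels (placeholder) */

/* Sum of absolute differences between the block centred at (cx, cy) in the
 * collected speckle image and the block centred at (cx + dx, cy) in the
 * reference speckle image. */
static long block_cost(const uint8_t *img, const uint8_t *ref,
                       int cx, int cy, int dx)
{
    long cost = 0;
    for (int y = -BLOCK / 2; y <= BLOCK / 2; y++)
        for (int x = -BLOCK / 2; x <= BLOCK / 2; x++) {
            int ix = cx + x, iy = cy + y;
            int rx = cx + x + dx;
            if (ix < 0 || ix >= W || iy < 0 || iy >= H || rx < 0 || rx >= W)
                return LONG_MAX;  /* reject blocks that leave the image */
            cost += labs((long)img[iy * W + ix] - (long)ref[iy * W + rx]);
        }
    return cost;
}

/* Formula (1): Z_D = (L * f * Z0) / (L * f + P * Z0), with P signed as
 * described in the text. */
static double depth_from_offset(double P, double L, double f, double Z0)
{
    return (L * f * Z0) / (L * f + P * Z0);
}

int main(void)
{
    uint8_t *img = calloc(W * H, 1);  /* collected speckle image (all zeros here) */
    uint8_t *ref = calloc(W * H, 1);  /* stored reference speckle image           */

    /* Find the horizontal offset whose block matches best at one example pixel. */
    int best_dx = 0;
    long best = block_cost(img, ref, W / 2, H / 2, 0);
    for (int dx = -SEARCH; dx <= SEARCH; dx++) {
        long c = block_cost(img, ref, W / 2, H / 2, dx);
        if (c < best) { best = c; best_dx = dx; }
    }

    /* Placeholder calibration values (not from the patent). */
    const double L = 0.025, f = 0.004, Z0 = 0.5, pixel_size = 2e-6;
    double P = best_dx * pixel_size;  /* pixel offset converted to a distance */
    printf("offset %d px -> depth %.3f m\n", best_dx, depth_from_offset(P, L, f, Z0));

    free(img);
    free(ref);
    return 0;
}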
Step 606, generating a depth disparity map according to the reference depth information and the matching result, sending the depth disparity map to the second processing unit, and processing the depth disparity map through the second processing unit to obtain the depth map.
After obtaining the depth information of each pixel point contained in the collected speckle image, the first processing unit can correct the collected speckle image, that is, correct the image content offset of the collected speckle image caused by the internal and external parameters of the laser camera and the RGB camera. The first processing unit can generate a depth disparity map according to the corrected speckle image and the depth values of the pixel points in the speckle image, and send the depth disparity map to the second processing unit. The second processing unit can obtain a depth map according to the depth disparity map, and the depth map can contain the depth information of each pixel point. The second processing unit can upload the depth map to the application program, and the application program can perform beautification, three-dimensional modeling and the like according to the depth information of the face in the depth map. The second processing unit can also perform living body detection according to the depth information of the face in the depth map, which can prevent the collected face from being a two-dimensional plane face and the like.
In this embodiment, the depth information of the acquired image can be accurately obtained through the first processing unit, the data processing efficiency is high, and the accuracy of image processing is improved.
As shown in fig. 7, in one embodiment, before the step 602 of acquiring the stored reference speckle image, the method further includes the following steps:
Step 702, collecting the temperature of the laser lamp at every collection time period, and acquiring, through the second processing unit, a reference speckle image corresponding to the temperature.
The electronic equipment can be provided with a temperature sensor beside the laser lamp, and the temperature of the laser lamp and the like can be collected through the temperature sensor. The second processing unit may acquire, at every collection time period, the temperature of the laser lamp collected by the temperature sensor, where the collection time period may be set according to actual requirements, such as 3 seconds, 4 seconds and the like, but is not limited thereto. When the temperature of the laser lamp changes, the camera module may be deformed, which affects the internal and external parameters of the cameras. The influence on the cameras differs at different temperatures, so different temperatures can correspond to different reference speckle images.
The second processing unit can acquire the reference speckle image corresponding to the temperature, and process the speckle image collected at that temperature according to the corresponding reference speckle image to obtain the depth map. Optionally, the second processing unit may preset a plurality of different temperature intervals, such as 0°C (degrees Celsius) to 30°C, 30°C to 60°C, 60°C to 90°C and the like, but is not limited thereto, and different temperature intervals may correspond to different reference speckle images. After collecting the temperature, the second processing unit can determine the temperature interval in which the temperature falls, and acquire the reference speckle image corresponding to that temperature interval.
Step 704, when the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit, writing the reference speckle image acquired this time into the first processing unit through the second processing unit.
After the second processing unit acquires the reference speckle image corresponding to the collected temperature, it can determine whether the reference speckle image acquired this time is consistent with the reference speckle image stored in the first processing unit. An image identifier can be carried in the reference speckle image, and the image identifier can be composed of one or more of numbers, letters, characters and the like. The second processing unit can read from the first processing unit the image identifier of the stored reference speckle image and compare the image identifier of the reference speckle image acquired this time with the image identifier read from the first processing unit. If the two image identifiers are inconsistent, the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit, and the second processing unit can write the reference speckle image acquired this time into the first processing unit. The first processing unit may store the newly written reference speckle image and delete the previously stored reference speckle image.
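A minimal sketch of the temperature-interval lookup and identifier comparison described above is given below. It is an assumption-laden illustration: the interval boundaries, the identifiers, and the in-memory stand-in for "writing to the first processing unit" are all invented for the sketch.

/* Illustrative sketch; intervals, identifiers and the "write" are placeholders. */
#include <stdio.h>
#include <string.h>

struct ref_speckle {
    const char *image_id;  /* identifier carried by the reference speckle image */
};

/* Hypothetical reference speckle images calibrated for three temperature
 * intervals, e.g. 0-30, 30-60 and 60-90 degrees Celsius. */
static const struct ref_speckle REFS[] = {
    { "ref_0_30" }, { "ref_30_60" }, { "ref_60_90" },
};

static const struct ref_speckle *ref_for_temperature(double temp_c)
{
    if (temp_c < 30.0) return &REFS[0];
    if (temp_c < 60.0) return &REFS[1];
    return &REFS[2];
}

/* Identifier of the reference image currently stored in the first processing
 * unit (MCU), modelled here as a plain string. */
static char stored_id[32] = "ref_0_30";

static void write_reference_to_mcu(const struct ref_speckle *r)
{
    /* In the device this would transfer the image to the first processing unit
     * and delete the previously stored one; here only the identifier changes. */
    snprintf(stored_id, sizeof stored_id, "%s", r->image_id);
    printf("wrote %s into first processing unit\n", r->image_id);
}

int main(void)
{
    double temp_c = 47.5;  /* temperature sampled at this collection period (placeholder) */
    const struct ref_speckle *r = ref_for_temperature(temp_c);

    /* Only write when the image identifiers are inconsistent, as described above. */
    if (strcmp(r->image_id, stored_id) != 0)
        write_reference_to_mcu(r);
    return 0;
}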
In this embodiment, the reference speckle image corresponding to the temperature can be obtained according to the temperature of the laser lamp, so that the influence of the temperature on the finally output depth map is reduced, and the obtained depth information is more accurate.
As shown in fig. 8, in an embodiment, the data processing method further includes the following steps:
step 802, sending an image acquisition instruction to the first processing unit through a kernel of the second processing unit operating in a first operation mode, where the first operation mode is a trusted operation environment.
The second processing unit in the electronic device may include two operation modes, where the first operation mode may be a TEE, which is a trusted operation environment with a high security level, and the second operation mode may be a REE (Rich Execution Environment), which has a lower security level. After receiving a data acquisition request sent by an application program, the second processing unit can send the image acquisition instruction to the first processing unit through the first operation mode. When the second processing unit is a single-core CPU, the single core can be directly switched from the second operation mode to the first operation mode; when the second processing unit has multiple cores, one core can be switched from the second operation mode to the first operation mode while the other cores still operate in the second operation mode, and the image acquisition instruction is sent to the first processing unit through the core operating in the first operation mode.
Step 804, the first processing unit sends the processed target image to the kernel running in the first operation mode in the second processing unit.
After the first processing unit processes the collected target image, it can send the processed target image to the kernel running in the first operation mode, which can ensure that the first processing unit always operates in a trusted operation environment and improve security. The second processing unit can, in the kernel running in the first operation mode, obtain the required image according to the processed target image and process the required image according to the requirements of the application program. For example, the second processing unit may perform face detection on the required image in the kernel running in the first operation mode.
In one embodiment, since the kernel operating in the first operating mode is unique, the second processing unit performs face detection on the target image in the TEE environment, and can perform face recognition, face matching, live body detection and the like on the target image one by one in a serial manner. The second processing unit may perform face recognition on the target image, and when a face is recognized, match the face included in the target image with a face stored in advance, and determine whether the faces are the same face. If the face is the same face, then the living body detection is carried out on the face according to the target image, and the collected face is prevented from being a two-dimensional plane face and the like. When the face is not recognized, face matching and live body detection may not be performed, and the processing pressure of the second processing unit may be reduced.
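The serial, early-exit ordering of face recognition, face matching and living body detection on the single TEE core can be sketched as follows; the check functions are stubs invented for illustration and do not represent the patent's algorithms.

/* Illustrative sketch of the serial, early-exit pipeline; the checks are stubs. */
#include <stdbool.h>
#include <stdio.h>

static bool face_present(void)        { return true; }  /* face recognition stub       */
static bool face_matches_stored(void) { return true; }  /* face matching stub          */
static bool face_is_live(void)        { return true; }  /* living body detection stub  */

int main(void)
{
    /* Executed serially on the single kernel operating in the TEE: later,
     * more expensive checks are skipped as soon as an earlier one fails,
     * which reduces the processing pressure on the second processing unit. */
    if (!face_present()) {
        puts("no face recognised: skip matching and living body detection");
        return 0;
    }
    if (!face_matches_stored()) {
        puts("face does not match: skip living body detection");
        return 0;
    }
    puts(face_is_live() ? "verification passed" : "not a live face");
    return 0;
}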
In this embodiment, the kernel with high security of the second processing unit sends the image acquisition instruction to the first processing unit, so that the first processing unit can be ensured to be in an environment with high security, and the security of data is improved.
In one embodiment, a data processing method is provided and applied to an electronic device, where the electronic device includes a camera module, a first processing unit and a second processing unit, and the first processing unit is connected to the second processing unit and the camera module respectively; the camera module includes a laser camera, a floodlight and a laser lamp, and the laser camera, the floodlight, the laser lamp and the first processing unit are connected to the same bidirectional two-wire synchronous serial I2C bus.
The data processing method comprises the following steps:
and (1) when the first processing unit receives an image acquisition instruction sent by the second processing unit, controlling to turn on at least one of the floodlight and the laser light through an I2C bus.
In one embodiment, the electronic device further comprises a controller for controlling the floodlight and the laser light, the controller being connected to the I2C bus. Step (1) comprising: determining the type of the acquired image according to the image acquisition instruction; if the image type is an infrared image, the first processing unit sends a first control instruction to the controller through an I2C bus, and the first control instruction is used for indicating the controller to turn on the floodlight; if the image type is a speckle image or a depth image, the first processing unit sends a second control instruction to the controller through the I2C bus, and the second control instruction is used for instructing the controller to turn on the laser lamp.
In one embodiment, after the step of determining the type of the captured image according to the image capturing instruction, the method further comprises: when the image type comprises an infrared image and a speckle image, or comprises the infrared image and a depth image, the first processing unit sends a first control command to the controller through the I2C bus, turns on the floodlight, controls the laser camera to collect the infrared image through the I2C bus, then sends a second control command to the controller through the I2C bus, turns on the laser lamp, and controls the laser camera to collect the speckle image through the I2C bus.
In one embodiment, after the step of determining the type of the captured image according to the image capturing instruction, the method further comprises: when the image type comprises an infrared image and a speckle image, or comprises the infrared image and a depth image, the first processing unit sends a second control command to the controller through the I2C bus, starts the laser lamp, controls the laser camera to collect the speckle image through the I2C bus, then sends a first control command to the controller through the I2C bus, starts the floodlight, and controls the laser camera to collect the infrared image through the I2C bus.
Step (2): the first processing unit controls the laser camera to acquire a target image through the I2C bus.
Step (3): processing the target image through the first processing unit, and sending the processed target image to the second processing unit.
In one embodiment, step (3) comprises: acquiring a stored reference speckle image, wherein the reference speckle image is provided with reference depth information; matching the reference speckle image with the speckle image to obtain a matching result; and generating a depth parallax map according to the reference depth information and the matching result, sending the depth parallax map to a second processing unit, and processing the depth parallax map through the second processing unit to obtain the depth map.
In one embodiment, before the step of acquiring the stored reference speckle image, the method further includes: collecting the temperature of the laser lamp at every collection time period, and acquiring a reference speckle image corresponding to the temperature through the second processing unit; and when the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit, writing the reference speckle image acquired this time into the first processing unit through the second processing unit.
In one embodiment, before step (1), further comprising: sending an image acquisition instruction to a first processing unit through a kernel in a second processing unit, wherein the kernel operates in a first operation mode, and the first operation mode is a trusted operation environment; step (3), comprising: and the first processing unit sends the processed target image to a kernel running in the first running mode in the second processing unit.
In this embodiment, the laser camera, the floodlight, the laser lamp and the first processing unit are connected to the same I2C bus; the first processing unit controls turning on at least one of the floodlight and the laser lamp through the I2C bus, and controls the laser camera to collect the target image through the same I2C bus. Controlling the floodlight, the laser lamp and the laser camera through the same I2C bus multiplexes the I2C bus, which can reduce the complexity of the control circuit and reduce the cost.
It should be understood that, although the steps in the flow charts described above are shown in sequence as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated otherwise herein, the execution of these steps is not strictly ordered, and they may be performed in other orders. Moreover, at least a portion of the steps in the flow charts described above may include multiple sub-steps or multiple stages, which are not necessarily performed at the same moment but may be performed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
In one embodiment, an electronic device is provided. The electronic device includes a camera module, a first processing unit and a second processing unit, and the first processing unit can be connected to the second processing unit and the camera module respectively. The camera module can include a laser camera, a floodlight, a laser lamp and the like, and the laser camera, the floodlight, the laser lamp and the first processing unit are connected to the same two-wire serial I2C bus.
And the first processing unit is used for controlling to start at least one of the floodlight and the laser lamp through the I2C bus when receiving the image acquisition instruction sent by the second processing unit, controlling the laser camera to acquire a target image through the I2C bus, processing the target image, and sending the processed target image to the second processing unit.
In this embodiment, the laser camera, the floodlight, the laser lamp and the first processing unit are connected to the same I2C bus; the first processing unit controls turning on at least one of the floodlight and the laser lamp through the I2C bus, and controls the laser camera to collect the target image through the same I2C bus. Controlling the floodlight, the laser lamp and the laser camera through the same I2C bus multiplexes the I2C bus, which can reduce the complexity of the control circuit and reduce the cost.
In one embodiment, the electronic device further comprises a controller, the controller is respectively connected with the floodlight and the laser lamp, the controller is used for controlling the floodlight and the laser lamp, and the controller is connected with the I2C bus.
The first processing unit is further used for determining the type of the acquired image according to the image acquisition instruction; if the image type is an infrared image, sending a first control instruction to the controller through the I2C bus, where the first control instruction is used for instructing the controller to turn on the floodlight; and if the image type is a speckle image or a depth image, sending a second control instruction to the controller through the I2C bus, where the second control instruction is used for instructing the controller to turn on the laser lamp.
In one embodiment, the first processing unit is further configured to, when the image type includes an infrared image and a speckle image, or includes an infrared image and a depth image, send a first control command to the controller through the I2C bus, turn on the floodlight, control the laser camera to acquire the infrared image through the I2C bus, then send a second control command to the controller through the I2C bus, turn on the laser lamp, and control the laser camera to acquire the speckle image through the I2C bus.
In one embodiment, the first processing unit is further configured to, when the image type includes an infrared image and a speckle image, or includes an infrared image and a depth image, send a second control command to the controller through the I2C bus, turn on the laser lamp, control the laser camera to acquire the speckle image through the I2C bus, then send a first control command to the controller through the I2C bus, turn on the floodlight, and control the laser camera to acquire the infrared image through the I2C bus.
In this embodiment, the floodlight and the laser light can be switched and controlled by one controller, so that the complexity of the control circuit can be further reduced, and the cost can be reduced.
In an embodiment, the first processing unit is further configured to obtain a stored reference speckle image, match the reference speckle image with the speckle image to obtain a matching result, generate a depth disparity map according to the reference depth information and the matching result, and send the depth disparity map to the second processing unit, where the reference speckle image has the reference depth information.
And the second processing unit is used for processing the depth parallax map to obtain a depth map.
In this embodiment, the depth information of the acquired image can be accurately obtained through the first processing unit, the data processing efficiency is high, and the accuracy of image processing is improved.
In one embodiment, the second processing unit is further configured to collect the temperature of the laser lamp at every collection time period, acquire a reference speckle image corresponding to the temperature, and write the reference speckle image acquired this time into the first processing unit when the reference speckle image acquired this time is inconsistent with the reference speckle image stored in the first processing unit.
In this embodiment, the reference speckle image corresponding to the temperature can be obtained according to the temperature of the laser lamp, so that the influence of the temperature on the finally output depth map is reduced, and the obtained depth information is more accurate.
In an embodiment, the second processing unit is further configured to send the image capturing instruction to the first processing unit through a kernel of the second processing unit operating in a first operation mode, where the first operation mode is a trusted operation environment.
And the first processing unit is also used for sending the processed target image to a kernel which runs in the first running mode in the second processing unit.
In this embodiment, the kernel with high security of the second processing unit sends the image acquisition instruction to the first processing unit, so that the first processing unit can be ensured to be in an environment with high security, and the security of data is improved.
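The following conceptual C sketch shows the shape of this flow. The exact way a core is moved into the trusted execution environment is platform specific (for example via secure monitor calls on ARM TrustZone); enter_trusted_mode(), leave_trusted_mode(), send_to_first_unit() and the command layout are hypothetical names introduced only for the example.

#include <stdint.h>

struct capture_cmd {
    uint8_t image_type;   /* infrared / speckle / depth */
    uint8_t flags;
};

extern int  enter_trusted_mode(void);    /* hypothetical: pin this core in the TEE */
extern void leave_trusted_mode(void);
extern int  send_to_first_unit(const void *buf, uint32_t len);

int issue_capture_from_tee(const struct capture_cmd *cmd)
{
    if (enter_trusted_mode() != 0)       /* single-core: switch this core;    */
        return -1;                       /* multi-core: only this core moves  */

    int rc = send_to_first_unit(cmd, sizeof(*cmd));

    /* The processed target image is later returned to this same trusted
     * core, so the image data never passes through the normal environment. */
    leave_trusted_mode();
    return rc;
}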
As shown in fig. 9, in one embodiment, a data processing apparatus 900 is provided, which is suitable for an electronic device including a camera module, a first processing unit and a second processing unit, wherein the first processing unit is connected to the second processing unit and the camera module, respectively. The camera module comprises a laser camera, a floodlight and a laser lamp, and the laser camera, the floodlight, the laser lamp and the first processing unit are connected with the same two-wire serial I2C bus. The data processing apparatus 900 includes a first control module 910, a second control module 920, and a processing module 930.
And the first control module 910 is configured to control to turn on at least one of the floodlight and the laser light through the I2C bus when the first processing unit receives the image acquisition instruction sent by the second processing unit.
And the second control module 920 is used for controlling the laser camera to acquire a target image through the I2C bus.
The processing module 930 is configured to process the target image through the first processing unit, and send the processed target image to the second processing unit.
In this embodiment, the laser camera, the floodlight, the laser lamp and the first processing unit are connected to the same I2C bus. The first processing unit turns on at least one of the floodlight and the laser lamp through the I2C bus and also controls the laser camera through the same I2C bus to acquire the target image. Since the floodlight, the laser lamp and the laser camera are all controlled over one multiplexed I2C bus, the complexity of the control circuit can be reduced and the cost can be lowered.
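As a purely illustrative sketch, the three modules of the apparatus 900 could be grouped on the first processing unit as a small table of callbacks; the struct layout and names below are assumptions for the example, not part of the disclosure.

#include <stdint.h>

struct target_image;        /* opaque: raw frame from the laser camera      */
struct processed_image;     /* opaque: result forwarded to the second unit  */

struct data_processing_apparatus {
    /* first control module 910: turn on floodlight and/or laser lamp over the I2C bus */
    int (*first_control)(uint8_t image_type);
    /* second control module 920: trigger the laser camera over the same I2C bus */
    int (*second_control)(struct target_image *out);
    /* processing module 930: process the frame and send it to the second processing unit */
    int (*processing)(const struct target_image *in, struct processed_image *out);
};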
In one embodiment, the electronic device further comprises a controller for controlling the floodlight and the laser light, the controller being connected to the I2C bus. The first control module 910 includes a type determining unit and an instruction sending unit.
And the type determining unit is used for determining the type of the acquired image according to the image acquisition instruction.
And the instruction sending unit is used for sending a first control instruction to the controller through the I2C bus if the image type is an infrared image, and the first control instruction is used for instructing the controller to turn on the floodlight.
And the instruction sending unit is further used for sending a second control instruction to the controller through the I2C bus if the image type is the speckle image or the depth image, and the second control instruction is used for instructing the controller to turn on the laser lamp.
In one embodiment, the first control module 910 is further configured to, when the image type includes an infrared image and a speckle image, or includes an infrared image and a depth image, send a first control command to the controller through the I2C bus, turn on the floodlight, control the laser camera to acquire the infrared image through the I2C bus, then send a second control command to the controller through the I2C bus, turn on the laser light, and control the laser camera to acquire the speckle image through the I2C bus.
In one embodiment, the first control module 910 is further configured to, when the image type includes an infrared image and a speckle image, or includes an infrared image and a depth image, send a second control command to the controller through the I2C bus, turn on the laser light, control the laser camera to acquire the speckle image through the I2C bus, then send a first control command to the controller through the I2C bus, turn on the floodlight, and control the laser camera to acquire the infrared image through the I2C bus.
In this embodiment, the floodlight and the laser light can be switched and controlled by one controller, so that the complexity of the control circuit can be further reduced, and the cost can be reduced.
In one embodiment, the processing module 930 includes an image acquisition unit, a matching unit, and a generation unit.
And the image acquisition unit is used for acquiring the stored reference speckle images, and the reference speckle images are provided with reference depth information.
And the matching unit is used for matching the reference speckle image with the speckle image to obtain a matching result.
And the generating unit is used for generating a depth parallax map according to the reference depth information and the matching result, sending the depth parallax map to the second processing unit, and processing the depth parallax map through the second processing unit to obtain the depth map.
In this embodiment, the depth information of the acquired image can be accurately obtained through the first processing unit, the data processing efficiency is high, and the accuracy of image processing is improved.
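For completeness, one common reference-plane relation that the second processing unit could use to turn a depth parallax (disparity) map into a depth map is sketched below in C. The relation 1/Z = 1/Z0 + d/(f*b), the sign convention of the disparity and the unit choices are assumptions for the example; the calibration values Z0 (reference plane distance), f (focal length in pixels) and b (baseline) would come from the camera module.

#include <stdint.h>

void disparity_to_depth(const int16_t *disp, float *depth, int n,
                        float z0_mm, float f_px, float b_mm)
{
    for (int i = 0; i < n; ++i) {
        /* 1/Z = 1/Z0 + d/(f*b)  =>  Z = Z0*f*b / (f*b + Z0*d) */
        float d     = (float)disp[i];
        float denom = f_px * b_mm + z0_mm * d;
        depth[i] = (denom != 0.0f) ? (z0_mm * f_px * b_mm) / denom : 0.0f;
    }
}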
In one embodiment, the data processing apparatus 900 includes a temperature acquisition module and a write module in addition to the first control module 910, the second control module 920 and the processing module 930.
And the temperature acquisition module is used for acquiring the temperature of the laser lamp at intervals of acquisition time and acquiring a reference speckle image corresponding to the temperature through the second processing unit.
And the writing module is used for writing the reference speckle image acquired this time into the first processing unit through the second processing unit when the reference speckle image acquired this time is not consistent with the reference speckle image stored in the first processing unit.
In this embodiment, the reference speckle image corresponding to the temperature can be obtained according to the temperature of the laser lamp, so that the influence of the temperature on the finally output depth map is reduced, and the obtained depth information is more accurate.
In one embodiment, the data processing apparatus 900 further includes a sending module in addition to the first control module 910, the second control module 920, the processing module 930, the temperature acquisition module, and the writing module.
And the sending module is used for sending an image acquisition instruction to the first processing unit through the kernel in the second processing unit, which runs in the first running mode, wherein the first running mode is a trusted running environment.
The processing module 930 is further configured to send, by the first processing unit, the processed target image to a kernel of the second processing unit operating in the first operation mode.
In this embodiment, the kernel with high security of the second processing unit sends the image acquisition instruction to the first processing unit, so that the first processing unit can be ensured to be in an environment with high security, and the security of data is improved.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, carries out the above-mentioned data processing method.
In an embodiment, a computer program product is provided, comprising a computer program which, when run on a computer device, causes the computer device to carry out the above-mentioned data processing method.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
Any reference to memory, storage, database, or other medium as used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and all such variations and modifications fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (13)

1. A data processing method is applied to electronic equipment and is characterized in that the electronic equipment comprises a camera module, a controller, a first processing unit and a second processing unit, wherein the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a laser camera, a floodlight and a laser lamp, and the laser camera, the floodlight, the laser lamp and the first processing unit are connected with the same bidirectional two-wire system synchronous serial I2C bus; the controller is used for controlling the floodlight and the laser light and is connected with the I2C bus; the second processing unit comprises a CPU core running under a trusted running environment of a first running mode and a CPU core running under a natural running environment of a second running mode, and only one CPU core of the second processing unit runs under the trusted running environment; the security level of the trusted operating environment is higher than that of the natural operating environment;
the method comprises the following steps:
sending an image acquisition instruction to the first processing unit through a kernel in the second processing unit which runs in a first running mode; when the second processing unit is a single-core CPU, one core can be directly switched from the second operation mode to the first operation mode; when the second processing unit is multi-core, one kernel is switched to the first operation mode from the second operation mode, other kernels still operate in the second operation mode, and an image acquisition instruction is sent to the first processing unit through the kernels operating in the first operation mode;
when the first processing unit receives an image acquisition instruction sent by the second processing unit, controlling to turn on at least one of the floodlight and the laser light through the I2C bus;
the first processing unit controls the laser camera to collect a target image through the I2C bus; the target image comprises a speckle image;
collecting the temperature of the laser lamp at intervals of a collection period, and acquiring a reference speckle image corresponding to the temperature through the second processing unit;
when the reference speckle image acquired this time is not consistent with the reference speckle image stored in the first processing unit, writing the reference speckle image acquired this time into the first processing unit through the second processing unit;
acquiring a stored reference speckle image with reference depth information;
matching the reference speckle image with the speckle image to obtain a matching result;
and generating a depth parallax map according to the reference depth information and the matching result, sending the depth parallax map to the second processing unit, and processing the depth parallax map through the second processing unit to obtain the depth map.
2. The method of claim 1,
the control of the turning on of at least one of the floodlight and the laser light through the I2C bus comprises the following steps:
determining the type of the acquired image according to the image acquisition instruction;
if the image type is an infrared image, the first processing unit sends a first control instruction to the controller through the I2C bus, and the first control instruction is used for instructing the controller to turn on the floodlight;
if the image type is a speckle image or a depth image, the first processing unit sends a second control instruction to the controller through the I2C bus, and the second control instruction is used for instructing the controller to turn on the laser lamp.
3. The method of claim 2, wherein after the determining a type of image to capture from the image capture instructions, the method further comprises:
when the image type comprises an infrared image and a speckle image, or comprises an infrared image and a depth image, the first processing unit sends a first control command to the controller through the I2C bus, turns on the floodlight, controls the laser camera to collect the infrared image through the I2C bus, then sends a second control command to the controller through the I2C bus, turns on the laser lamp, and controls the laser camera to collect the speckle image through the I2C bus.
4. The method of claim 2, wherein after the determining a type of image to capture from the image capture instructions, the method further comprises:
when the image type comprises an infrared image and a speckle image, or comprises an infrared image and a depth image, the first processing unit sends a second control command to the controller through the I2C bus, turns on the laser lamp, controls the laser camera to collect the speckle image through the I2C bus, then sends a first control command to the controller through the I2C bus, turns on the floodlight, and controls the laser camera to collect the infrared image through the I2C bus.
5. A data processing device is suitable for electronic equipment, and is characterized in that the electronic equipment comprises a camera module, a controller, a first processing unit and a second processing unit, wherein the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a laser camera, a floodlight and a laser lamp, and the laser camera, the floodlight, the laser lamp and the first processing unit are connected with the same bidirectional two-wire system synchronous serial I2C bus; the controller is used for controlling the floodlight and the laser light and is connected with the I2C bus; the second processing unit comprises a CPU core running under a trusted running environment of a first running mode and a CPU core running under a natural running environment of a second running mode, and only one CPU core of the second processing unit runs under the trusted running environment; the security level of the trusted operating environment is higher than that of the natural operating environment;
the apparatus, comprising:
the first control module is used for sending an image acquisition instruction to the first processing unit through a kernel in the second processing unit, which runs in a first running mode; when the second processing unit is a single-core CPU, one core can be directly switched from the second operation mode to the first operation mode; when the second processing unit is multi-core, one kernel is switched to the first operation mode from the second operation mode, other kernels still operate in the second operation mode, and an image acquisition instruction is sent to the first processing unit through the kernels operating in the first operation mode; when the first processing unit receives an image acquisition instruction sent by the second processing unit, controlling to turn on at least one of the floodlight and the laser light through the I2C bus;
the second control module is used for controlling the laser camera to collect a target image through the I2C bus; the target image comprises a speckle image;
the temperature acquisition module is used for acquiring the temperature of the laser lamp at intervals of acquisition time, and acquiring a reference speckle image corresponding to the temperature through the second processing unit;
the writing module is used for writing the reference speckle image acquired this time into the first processing unit through the second processing unit when the reference speckle image acquired this time is not consistent with the reference speckle image stored in the first processing unit;
the processing module is used for acquiring a stored reference speckle image, and the reference speckle image is provided with reference depth information; matching the reference speckle image with the speckle image to obtain a matching result; and generating a depth parallax map according to the reference depth information and the matching result, sending the depth parallax map to the second processing unit, and processing the depth parallax map through the second processing unit to obtain the depth map.
6. The apparatus of claim 5, wherein the first control module comprises:
the type determining unit is used for determining the type of the acquired image according to the image acquisition instruction;
the instruction sending unit is used for sending a first control instruction to the controller through the I2C bus if the image type is an infrared image, wherein the first control instruction is used for instructing the controller to turn on the floodlight;
the instruction sending unit is further configured to send a second control instruction to the controller through the I2C bus if the image type is a speckle image or a depth image, where the second control instruction is used to instruct the controller to turn on the laser lamp.
7. The apparatus of claim 6, further comprising:
the first control module is further used for sending a first control command to the controller through the I2C bus, turning on the floodlight, controlling the laser camera to collect an infrared image through the I2C bus, then sending a second control command to the controller through the I2C bus, turning on the laser lamp, and controlling the laser camera to collect a speckle image through the I2C bus when the image type comprises an infrared image and a speckle image or comprises an infrared image and a depth image.
8. The apparatus of claim 6,
the first control module is further used for sending a second control command to the controller through the I2C bus when the image types comprise infrared images and speckle images or comprise infrared images and depth images, turning on the laser lamp, controlling the laser camera to collect speckle images through the I2C bus, then sending a first control command to the controller through the I2C bus, turning on the floodlight, and controlling the laser camera to collect infrared images through the I2C bus.
9. An electronic device is characterized by comprising a camera module, a controller, a first processing unit and a second processing unit, wherein the first processing unit is respectively connected with the second processing unit and the camera module; the camera module comprises a laser camera, a floodlight and a laser lamp, and the laser camera, the floodlight, the laser lamp and the first processing unit are connected with the same bidirectional two-wire system synchronous serial I2C bus; the controller is used for controlling the floodlight and the laser light and is connected with the I2C bus; the second processing unit comprises a CPU core running under a trusted running environment of a first running mode and a CPU core running under a natural running environment of a second running mode, and only one CPU core of the second processing unit runs under the trusted running environment; the security level of the trusted operating environment is higher than that of the natural operating environment;
the second processing unit is used for sending an image acquisition instruction to the first processing unit through a kernel in the second processing unit, wherein the kernel runs in a first running mode; when the second processing unit is a single-core CPU, one core can be directly switched from the second operation mode to the first operation mode; when the second processing unit is multi-core, one kernel is switched to the first operation mode from the second operation mode, other kernels still operate in the second operation mode, and an image acquisition instruction is sent to the first processing unit through the kernels operating in the first operation mode;
the first processing unit is used for controlling to turn on at least one of the floodlight and the laser lamp through the I2C bus and controlling the laser camera to collect a target image through the I2C bus when receiving an image collecting instruction sent by the second processing unit; the target image comprises a speckle image;
collecting the temperature of the laser lamp at intervals of a collection period, and acquiring a reference speckle image corresponding to the temperature through the second processing unit;
when the reference speckle image acquired this time is not consistent with the reference speckle image stored in the first processing unit, writing the reference speckle image acquired this time into the first processing unit through the second processing unit;
acquiring a stored reference speckle image with reference depth information;
matching the reference speckle image with the speckle image to obtain a matching result;
and generating a depth parallax map according to the reference depth information and the matching result, sending the depth parallax map to the second processing unit, and processing the depth parallax map through the second processing unit to obtain the depth map.
10. The electronic device of claim 9,
the first processing unit is further configured to determine a type of a collected image according to the image collection instruction, send a first control instruction to the controller through the I2C bus if the image type is an infrared image, where the first control instruction is used to instruct the controller to turn on the floodlight, and send a second control instruction to the controller through the I2C bus if the image type is a speckle image or a depth image, where the second control instruction is used to instruct the controller to turn on the laser light.
11. The electronic device of claim 10, wherein the first processing unit is further configured to, when the image type includes an infrared image and a speckle image, or includes an infrared image and a depth image, send a first control command to the controller via the I2C bus, turn on the floodlight, control the laser camera to capture an infrared image via the I2C bus, then send a second control command to the controller via the I2C bus, turn on the laser lamp, and control the laser camera to capture a speckle image via the I2C bus.
12. The electronic device of claim 10, wherein the first processing unit is further configured to send a second control command to the controller via the I2C bus, turn on the laser light, control the laser camera via the I2C bus to capture a speckle image, and then send a first control command to the controller via the I2C bus, turn on the floodlight, and control the laser camera via the I2C bus to capture an infrared image when the image type includes an infrared image and a speckle image, or includes an infrared image and a depth image.
13. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1 to 4.
CN201810401326.3A 2018-04-28 2018-04-28 Data processing method and device, electronic equipment and computer readable storage medium Active CN108833887B (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN201810401326.3A CN108833887B (en) 2018-04-28 2018-04-28 Data processing method and device, electronic equipment and computer readable storage medium
EP19792981.3A EP3672223B1 (en) 2018-04-28 2019-04-23 Data processing method, electronic device, and computer-readable storage medium
PCT/CN2019/083854 WO2019206129A1 (en) 2018-04-28 2019-04-23 Data processing method, apparatus, electronic device, and computer-readable storage medium
ES19792981T ES2938471T3 (en) 2018-04-28 2019-04-23 Data processing method, electronic device and computer-readable storage medium
US16/743,533 US11050918B2 (en) 2018-04-28 2020-01-15 Method and apparatus for performing image processing, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810401326.3A CN108833887B (en) 2018-04-28 2018-04-28 Data processing method and device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108833887A CN108833887A (en) 2018-11-16
CN108833887B true CN108833887B (en) 2021-05-18

Family

ID=64155651

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810401326.3A Active CN108833887B (en) 2018-04-28 2018-04-28 Data processing method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108833887B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3672223B1 (en) 2018-04-28 2022-12-28 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Data processing method, electronic device, and computer-readable storage medium
CN113826376B (en) * 2019-05-24 2023-08-15 Oppo广东移动通信有限公司 User equipment and strabismus correction method
CN110544335B (en) * 2019-08-30 2020-12-29 北京市商汤科技开发有限公司 Object recognition system and method, electronic device, and storage medium
CN112633181B (en) * 2020-12-25 2022-08-12 北京嘀嘀无限科技发展有限公司 Data processing method, system, device, equipment and medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102438111A (en) * 2011-09-20 2012-05-02 天津大学 Three-dimensional measurement chip and system based on double-array image sensor
CN106454287A (en) * 2016-10-27 2017-02-22 深圳奥比中光科技有限公司 Combined camera shooting system, mobile terminal and image processing method
CN107105217A (en) * 2017-04-17 2017-08-29 深圳奥比中光科技有限公司 Multi-mode depth calculation processor and 3D rendering equipment
CN107169483A (en) * 2017-07-12 2017-09-15 深圳奥比中光科技有限公司 Tasks carrying based on recognition of face

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9456201B2 (en) * 2014-02-10 2016-09-27 Microsoft Technology Licensing, Llc VCSEL array for a depth camera
US20170034456A1 (en) * 2015-07-31 2017-02-02 Dual Aperture International Co., Ltd. Sensor assembly with selective infrared filter array

Also Published As

Publication number Publication date
CN108833887A (en) 2018-11-16

Similar Documents

Publication Publication Date Title
CN110248111B (en) Method and device for controlling shooting, electronic equipment and computer-readable storage medium
CN108833887B (en) Data processing method and device, electronic equipment and computer readable storage medium
CN110324521B (en) Method and device for controlling camera, electronic equipment and storage medium
CN108549867B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN110191266B (en) Data processing method and device, electronic equipment and computer readable storage medium
CN108650472B (en) Method and device for controlling shooting, electronic equipment and computer-readable storage medium
WO2019205887A1 (en) Method and apparatus for controlling photographing, electronic device, and computer readable storage medium
EP3621293A1 (en) Image processing method, apparatus, computer-readable storage medium, and electronic device
CN111523499B (en) Image processing method, apparatus, electronic device, and computer-readable storage medium
WO2019196683A1 (en) Method and device for image processing, computer-readable storage medium, and electronic device
US11050918B2 (en) Method and apparatus for performing image processing, and computer readable storage medium
CN108573170B (en) Information processing method and device, electronic equipment and computer readable storage medium
US20200065562A1 (en) Method and Device for Processing Image, Computer Readable Storage Medium and Electronic Device
CN109213610B (en) Data processing method and device, computer readable storage medium and electronic equipment
TWI709110B (en) Camera calibration method and apparatus, electronic device
CN108985255B (en) Data processing method and device, computer readable storage medium and electronic equipment
EP3621294B1 (en) Method and device for image capture, computer readable storage medium and electronic device
EP3605393A1 (en) Image correction due to deviations caused by temperature changes of camera light emitter
CN108810516B (en) Data processing method and device, electronic equipment and computer readable storage medium
EP3644261B1 (en) Image processing method, apparatus, computer-readable storage medium, and electronic device
CN109064503B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109145772B (en) Data processing method and device, computer readable storage medium and electronic equipment
CN115223215A (en) Image acquisition method and image acquisition equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant