CN108986153B - Image processing method and device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN108986153B
CN108986153B
Authority
CN
China
Prior art keywords
image
feature
processor
infrared
rgb
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810864801.0A
Other languages
Chinese (zh)
Other versions
CN108986153A (en)
Inventor
郭子青
周海涛
欧锦荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810864801.0A priority Critical patent/CN108986153B/en
Publication of CN108986153A publication Critical patent/CN108986153A/en
Priority to PCT/CN2019/080429 priority patent/WO2020024603A1/en
Priority to US16/526,648 priority patent/US10929988B2/en
Priority to EP19189270.2A priority patent/EP3605452B1/en
Priority to ES19189270T priority patent/ES2847232T3/en
Application granted granted Critical
Publication of CN108986153B publication Critical patent/CN108986153B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/50 Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F 21/57 Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities

Abstract

The application relates to an image processing method and apparatus, an electronic device, and a computer-readable storage medium. The method comprises the following steps: acquiring an image acquisition instruction; acquiring an infrared image in a first operating environment according to the instruction and extracting a first feature from the infrared image; acquiring an RGB image in a second operating environment and extracting a second feature from the RGB image; transmitting the second feature to the first operating environment; and performing registration processing in the first operating environment according to the first feature and the second feature, wherein the security level of the first operating environment is higher than that of the second operating environment. Because the infrared image and the RGB image are acquired in separate operating environments and registration is performed in the first operating environment using the extracted image features, the security and convenience of image registration can be improved, which in turn improves image quality.

Description

Image processing method and device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of computer vision technologies, and in particular, to an image processing method and apparatus, an electronic device, and a computer-readable storage medium.
Background
With the development of computer vision technology, requirements on the quality of acquired images are increasingly high. To improve image quality, electronic devices often capture images simultaneously with an infrared camera and an RGB camera: the infrared camera acquires an infrared image, the RGB camera acquires an RGB image, and the two images are registered to obtain the final image.
However, registering the infrared image and the RGB image acquired in the electronic device is often inconvenient, which affects the quality of the image.
Disclosure of Invention
The embodiments of the application provide an image processing method, an image processing apparatus, an electronic device, and a computer-readable storage medium, which can register images securely and conveniently.
An image processing method comprising:
acquiring an image acquisition instruction;
acquiring an infrared image in a first operating environment according to the image acquisition instruction, and extracting a first feature in the infrared image, and acquiring an RGB image in a second operating environment, and extracting a second feature in the RGB image;
transmitting the second feature to the first operating environment, and performing registration processing according to the first feature and the second feature in the first operating environment;
wherein the first operating environment has a higher security level than the second operating environment.
An image processing apparatus comprising:
the instruction acquisition module is used for acquiring an image acquisition instruction;
the image acquisition module is used for acquiring an infrared image in a first operation environment according to the image acquisition instruction, extracting a first feature in the infrared image, acquiring an RGB image in a second operation environment and extracting a second feature in the RGB image;
the feature registration module is used for transmitting the second feature to the first operating environment and performing registration processing according to the first feature and the second feature in the first operating environment;
wherein the first operating environment has a higher security level than the second operating environment.
An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of:
acquiring an image acquisition instruction;
acquiring an infrared image in a first operating environment according to the image acquisition instruction, and extracting a first feature in the infrared image, and acquiring an RGB image in a second operating environment, and extracting a second feature in the RGB image;
transmitting the second feature to the first operating environment, and performing registration processing according to the first feature and the second feature in the first operating environment;
wherein the first operating environment has a higher security level than the second operating environment.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring an image acquisition instruction;
acquiring an infrared image in a first operating environment according to the image acquisition instruction, and extracting a first feature in the infrared image, and acquiring an RGB image in a second operating environment, and extracting a second feature in the RGB image;
transmitting the second feature to the first operating environment, and performing registration processing according to the first feature and the second feature in the first operating environment;
wherein the first operating environment has a higher security level than the second operating environment.
According to the image processing method and apparatus, the electronic device, and the computer-readable storage medium described above, an image acquisition instruction is obtained; an infrared image is acquired in a first operating environment according to the instruction and a first feature is extracted from it; an RGB image is acquired in a second operating environment and a second feature is extracted from it; the second feature is transmitted to the first operating environment; and registration processing is performed in the first operating environment according to the first feature and the second feature, wherein the security level of the first operating environment is higher than that of the second operating environment. Because the infrared image and the RGB image are acquired in separate operating environments and registration is performed in the first operating environment using the extracted image features, the security and convenience of image registration can be improved, which in turn improves image quality.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a diagram of an application scenario of an image processing method in one embodiment;
FIG. 2 is a diagram showing an application environment of an image processing method in another embodiment;
FIG. 3 is a block diagram of an electronic device in one embodiment;
FIG. 4 is a flow diagram that illustrates a method for image processing, according to one embodiment;
FIG. 5 is a flow diagram of a method for capturing images and extracting image features based on image type, according to one embodiment;
FIG. 6 is a diagram illustrating a first processor connected to a second light emitter and a first light emitter via an I2C bus according to one embodiment;
FIG. 7 is a schematic diagram of another embodiment in which a first processor is connected to a second light emitter and a first light emitter via an I2C bus;
FIG. 8 is a block diagram showing the structure of an image processing apparatus according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is a diagram illustrating an application scenario of an image processing method according to an embodiment. As shown in fig. 1, the electronic device includes an infrared camera 102, a second light emitter 104, a first light emitter 106, a first processor 110, a second processor 120, and a controller 130. The first processor 110 may be a Central Processing Unit (CPU) module, etc., and the second processor 120 may be a Micro Controller Unit (MCU) module, etc. The second processor 120 may be connected to the infrared camera 102 and the first processor 110, and the second processor 120 may be connected to the controller 130 through an I2C bus. The second processor 120 may include a PWM (Pulse Width Modulation) module 112 and be connected to the controller 130 through the PWM module 112, and the controller 130 may be connected to the second light emitter 104 and the first light emitter 106 respectively.
When the second processor 120 receives the image acquisition instruction sent by the first processor 110, it sends a control instruction to the controller 130 via the I2C bus, where the control instruction can be used to turn on at least one of the second light emitter 104 and the first light emitter 106. The second processor 120 may send a pulse to the controller 130 via the PWM module 112 to light whichever of the second light emitter 104 and the first light emitter 106 is turned on, and capture a target image via the infrared camera 102. The second processor 120 may process the target image and transmit the processed target image to the first processor 110.
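The control flow above can be sketched as follows. This is a minimal illustrative model, not the patent's implementation: the class and method names, and the use of a single "floodlight" emitter, are assumptions made for the example; the real I2C and PWM transactions are hardware operations.

```python
# Hypothetical sketch of the Fig. 1 control flow: the second processor receives
# a capture instruction, sends a control command to the controller (modeling the
# I2C bus), pulses the enabled emitter(s) (modeling the PWM module), and then
# captures a frame with the infrared camera.

class Controller:
    def __init__(self):
        self.enabled = set()

    def handle_i2c_command(self, emitters):
        # Control command: turn on at least one of the two light emitters.
        self.enabled = set(emitters)

class InfraredCamera:
    def capture(self):
        # Stand-in for a real sensor read-out.
        return "ir_frame"

class SecondProcessor:
    def __init__(self, controller, camera):
        self.controller = controller
        self.camera = camera

    def on_capture_instruction(self, emitters=("floodlight",)):
        # 1. Control command over the I2C bus: turn on the selected emitter(s).
        self.controller.handle_i2c_command(emitters)
        # 2. PWM pulse lights whichever emitters are on.
        pulsed = tuple(sorted(self.controller.enabled))
        # 3. Capture the target image with the infrared camera.
        frame = self.camera.capture()
        return {"pulsed": pulsed, "frame": frame}

result = SecondProcessor(Controller(), InfraredCamera()).on_capture_instruction()
```

The returned dictionary records which emitters were pulsed and the captured frame, mirroring the instruction-command-pulse-capture order described above.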
Fig. 2 is an application scenario diagram of an image processing method in another embodiment. As shown in fig. 2, the electronic device 200 may include a camera module 210, a first processor 220, and a second processor 230. The first processor 220 may be a CPU module. The second processor 230 may be an MCU module. The second processor 230 is connected between the first processor 220 and the camera module 210; the second processor 230 can control the infrared camera 212, the second light emitter 214, and the first light emitter 218 in the camera module 210, and the first processor 220 can control the RGB camera 216 in the camera module 210.
The camera module 210 includes an infrared camera 212, a second light emitter 214, an RGB camera 216, and a first light emitter 218. The infrared camera 212 is configured to acquire infrared images. The second light emitter 214 is a surface light source capable of emitting infrared light; the first light emitter 218 is a point light source capable of emitting laser light, and one that projects a pattern. When the second light emitter 214 emits surface-source light, the infrared camera 212 can obtain an infrared image from the reflected light. When the first light emitter 218 emits point-source light, the infrared camera 212 can obtain a speckle image from the reflected light; the speckle image is formed when the patterned point-source light emitted by the first light emitter 218 is reflected and its pattern is deformed. The first light emitter 218 may be a laser lamp, that is, a point light source capable of generating patterned infrared laser light. The second light emitter 214 may be a floodlight capable of generating infrared light. The infrared camera in the camera module can thus obtain an infrared image from the light reflected when the floodlight emits surface-source light, and a speckle image from the light reflected when the laser lamp emits point-source light.
The first processor 220 may include a CPU core operating in a first operating environment and a CPU core operating in a second operating environment. The first operating environment may be a TEE (Trusted Execution Environment), and the second operating environment may be a REE (Rich Execution Environment). The TEE and the REE are both running modes of an ARM (Advanced RISC Machines) processor. The security level of the TEE is higher, and only one CPU core in the first processor 220 can operate in the TEE at a time. Generally, operations with a higher security level in the electronic device 200 need to be executed by the CPU core in the TEE, while operations with a lower security level can be executed by the CPU core in the REE.
The second processor 230 includes a PWM module 232, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) interface 234, a RAM (Random Access Memory) module 236, and a depth engine 238. The second processor 230 may be coupled through the PWM module 232 to a controller, which may in turn be connected to the second light emitter 214 and the first light emitter 218 respectively to control them. The second processor 230 may also be connected to the controller via an I2C bus and may control turning on the second light emitter 214 or the first light emitter 218 over that bus, while the PWM module 232 transmits a pulse to the camera module to light whichever emitter is turned on. The second processor 230 may capture infrared images or speckle images via the infrared camera 212. The SPI/I2C interface 234 is used to receive image acquisition instructions sent by the first processor 220. The depth engine 238 may process the speckle images to obtain a depth disparity map.
When the first processor 220 receives a data acquisition request from an application, for example when the application needs to perform face unlocking or face payment, it may send an image acquisition instruction to the second processor 230 through the CPU core operating in the TEE environment. After receiving the image acquisition instruction, the second processor 230 may send a control command to the controller through the I2C bus to turn on the second light emitter 214 in the camera module 210, send a pulse to the controller through the PWM module 232 to light the second light emitter 214, and control the infrared camera 212 through the I2C bus to acquire an infrared image; it may then send a control command to the controller through the I2C bus to turn on the first light emitter 218 in the camera module 210, send a pulse to the controller through the PWM module 232 to light the first light emitter 218, and control the infrared camera 212 through the I2C bus to acquire a speckle image. The camera module 210 may send the acquired infrared image and speckle image to the second processor 230. The second processor 230 may process the received infrared image to obtain an infrared disparity map, and process the received speckle image to obtain a speckle disparity map or a depth disparity map. Here, processing the infrared image or the speckle image means correcting it to remove the influence of the internal and external parameters of the camera module 210 on the image. The second processor 230 can be set to different modes, and different modes output different images.
When the second processor 230 is set to the speckle pattern mode, the second processor 230 processes the speckle image to obtain a speckle disparity map, and a target speckle image can be obtained according to the speckle disparity map; when the second processor 230 is set to the depth map mode, the second processor 230 processes the speckle images to obtain a depth disparity map, and obtains a depth image according to the depth disparity map, where the depth image is an image with depth information. The second processor 230 may send the infrared disparity map and the speckle disparity map to the first processor 220, and the second processor 230 may also send the infrared disparity map and the depth disparity map to the first processor 220. The first processor 220 may obtain a target infrared image according to the infrared disparity map, and obtain a depth image according to the depth disparity map. Further, the first processor 220 may perform face recognition, face matching, living body detection, and depth information acquisition on the detected face according to the target infrared image and the depth image.
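The mode-dependent behaviour of the second processor can be sketched as a small dispatch function. This is an illustrative simplification under assumed names; the actual disparity computation is performed by the depth engine in hardware and is not shown.

```python
# Illustrative sketch: the second processor's output depends on its configured
# mode. In speckle-pattern mode it produces a speckle disparity map; in
# depth-map mode it produces a depth disparity map from the same speckle image.
# The dictionaries stand in for real image buffers.

def process_speckle_image(speckle_image, mode):
    if mode == "speckle":
        # Speckle-pattern mode: a target speckle image can later be derived
        # from this disparity map.
        return {"type": "speckle_disparity_map", "source": speckle_image}
    if mode == "depth":
        # Depth-map mode: a depth image (with depth information) can later be
        # derived from this disparity map.
        return {"type": "depth_disparity_map", "source": speckle_image}
    raise ValueError(f"unknown mode: {mode!r}")
```

For example, `process_speckle_image(img, "depth")` yields a depth disparity map, which the first processor would then convert into the final depth image.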
The communication between the second processor 230 and the first processor 220 is through a fixed security interface to ensure the security of the transmitted data. As shown in FIG. 2, the data sent by the first Processor 220 to the second Processor 230 is through SECURE SPI/I2C 240, and the data sent by the second Processor 230 to the first Processor 220 is through SECURE MIPI (Mobile Industry Processor Interface) 250.
In an embodiment, the second processor 230 may also obtain a target infrared image according to the infrared disparity map, calculate and obtain a depth image according to the depth disparity map, and then send the target infrared image and the depth image to the first processor 220.
FIG. 3 is a block diagram of an electronic device in one embodiment. As shown in fig. 3, the electronic device includes a processor, a memory, a display screen, and an input device connected through a system bus. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium of the electronic device stores an operating system and a computer program, which is executed by the processor to implement an image processing method provided in an embodiment of the present application. The processor is used to provide computation and control capability and to support the operation of the whole electronic device. The internal memory in the electronic device provides an environment for the execution of the computer program in the non-volatile storage medium. The display screen of the electronic device may be a liquid crystal display screen or an electronic ink display screen, and the input device may be a touch layer covering the display screen, a key, a track ball, or a touch pad arranged on the housing of the electronic device, or an external keyboard, touch pad, or mouse. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, etc. Those skilled in the art will appreciate that the architecture shown in fig. 3 is a block diagram of only a portion of the architecture associated with the subject application and does not constitute a limitation on the electronic devices to which the subject application may be applied; a particular electronic device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In one embodiment, an image processing method is provided and exemplified as applied to the above electronic device, as shown in fig. 4, the method includes the steps of:
step 402, acquiring an image acquisition instruction.
The image acquisition instruction may be an instruction for acquiring an image, and the electronic device may acquire the image acquisition instruction through a button or a touch screen of the electronic device. Specifically, a central processing unit in the electronic device may obtain the image acquisition instruction.
Step 404, acquiring an infrared image in a first operating environment according to the image acquisition instruction, and extracting a first feature in the infrared image, and acquiring an RGB image in a second operating environment, and extracting a second feature in the RGB image.
Wherein the security level of the first operating environment is higher than that of the second operating environment. The first operating environment may be a Trusted Execution Environment (TEE), and the second operating environment may be a Rich Execution Environment (REE). After the electronic device obtains the image acquisition instruction, it can acquire the infrared image in the first operating environment according to the instruction, and acquire the RGB image in the second operating environment according to the instruction.
After the electronic device collects the infrared image in the first operating environment, the electronic device can continue to extract the first feature in the infrared image in the first operating environment. Specifically, the electronic device may extract the first feature in the infrared image by using a Scale-invariant feature transform (SIFT) algorithm. The SIFT algorithm is a computer vision algorithm for detecting and describing local features in an image. Similarly, after the electronic device captures the RGB image in the second runtime environment, the electronic device may continue to extract the second feature in the RGB image in the second runtime environment. When the electronic device extracts the second feature in the RGB image in the second operating environment, the SIFT algorithm may also be used to extract the second feature in the RGB image.
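The feature-extraction step above names SIFT; a production implementation would typically use a library such as OpenCV. Purely to keep the idea of "extract a feature set from an image" visible in a few self-contained lines, the sketch below uses a toy detector (strict local maxima of pixel intensity) as a stand-in; the function name and the local-maximum criterion are assumptions for the example, not the SIFT algorithm itself.

```python
# Toy stand-in for SIFT-style keypoint extraction: mark every pixel whose
# intensity strictly exceeds all four of its direct neighbours. Each keypoint
# is recorded as (x, y, intensity).

def extract_features(image):
    h, w = len(image), len(image[0])
    keypoints = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = image[y][x]
            if all(v > image[y + dy][x + dx]
                   for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))):
                keypoints.append((x, y, v))
    return keypoints

# A tiny "infrared image" with two bright spots.
ir = [[0, 0, 0, 0],
      [0, 9, 0, 0],
      [0, 0, 5, 0],
      [0, 0, 0, 0]]
first_feature = extract_features(ir)
```

The same function could be applied to the RGB image in the second operating environment to obtain the second feature, after which only the compact feature lists (not the full images) need cross the environment boundary.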
Step 406, transmitting the second feature to the first operating environment, and performing a registration process in the first operating environment according to the first feature and the second feature.
The registration process is also called image registration, and is a process of matching and superimposing two or more images acquired at different times, with different imaging devices, or under different conditions. The first feature extracted by the electronic device is in a first operating environment and the second feature extracted by the electronic device is in a second operating environment, the electronic device being capable of transferring a second feature in the second operating environment to the first operating environment. After the electronic device transmits the second feature to the first operating environment, the first feature and the second feature exist in the first operating environment, and the electronic device can perform registration processing according to the first feature and the second feature in the first environment.
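As a minimal sketch of the registration step, the example below estimates a rigid translation from matched feature positions and applies it to align the second image's features with the first. This is an assumed simplification: real registration from SIFT matches would fit a full homography, but a translation keeps the matching-and-superimposing idea visible in a few lines.

```python
# Estimate the translation that maps the second (RGB) feature positions onto
# the first (infrared) feature positions, by averaging per-match offsets.

def estimate_translation(first_pts, second_pts):
    n = len(first_pts)
    dx = sum(a[0] - b[0] for a, b in zip(first_pts, second_pts)) / n
    dy = sum(a[1] - b[1] for a, b in zip(first_pts, second_pts)) / n
    return dx, dy

def register(second_pts, shift):
    # Apply the estimated shift to bring the second feature set into the
    # first image's coordinate frame.
    dx, dy = shift
    return [(x + dx, y + dy) for x, y in second_pts]

ir_pts = [(10, 10), (20, 15)]
rgb_pts = [(12, 13), (22, 18)]  # same scene points, offset by (+2, +3)
shift = estimate_translation(ir_pts, rgb_pts)
aligned = register(rgb_pts, shift)
```

After alignment, `aligned` coincides with `ir_pts`, which is the condition under which the two images can be superimposed.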
In summary: an image acquisition instruction is obtained; an infrared image is acquired in the first operating environment according to the instruction and a first feature is extracted from it; an RGB image is acquired in the second operating environment and a second feature is extracted from it; the second feature is transmitted to the first operating environment; and registration processing is performed in the first operating environment according to the first feature and the second feature, wherein the security level of the first operating environment is higher than that of the second operating environment. Because the infrared image and the RGB image are acquired in separate operating environments and registration is performed in the first operating environment using the extracted image features, the security and convenience of image registration can be improved, which in turn improves image quality.
In one embodiment, after the electronic device acquires the image acquisition instruction, the infrared image may be acquired in the first operating environment according to the image acquisition instruction, and the first feature in the infrared image is extracted. The electronic device may further capture an RGB image in the second runtime environment and transmit the RGB image captured in the second runtime environment to the first runtime environment, and the electronic device may extract a second feature in the RGB image in the first runtime environment and perform a registration process based on the first feature and the second feature in the first runtime environment.
As shown in fig. 5, in an embodiment, the provided image processing method may further include a process of acquiring an image according to an image type and extracting image features, and the specific steps include:
step 502, when the second processor receives the image acquisition instruction sent by the first processor, determining the image type according to the image acquisition instruction.
The first processor may be a Central Processing Unit (CPU), and the second processor may be a Micro Controller Unit (MCU). The image type may be one or more of an infrared image, an RGB image, a speckle image, a depth image, and the like.
Upon receiving the image acquisition instruction sent by the first processor, the second processor can determine the type of image to acquire from the instruction. For example, if the image acquisition instruction is for face unlocking, the image type may be determined to be an infrared image; if the instruction is for acquiring face depth information, the image type may be determined to be a depth image.
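The purpose-to-type determination in step 502 can be sketched as a lookup. The instruction names and the particular mapping are illustrative assumptions; the patent only fixes that face unlocking maps to an infrared image and face depth information to a depth image.

```python
# Hypothetical mapping from the purpose carried by an image acquisition
# instruction to the image type to acquire (step 502). Names are illustrative.

def determine_image_type(instruction):
    mapping = {
        "face_unlock": "infrared",
        "face_payment": "infrared",
        "face_depth": "depth",
    }
    try:
        return mapping[instruction]
    except KeyError:
        raise ValueError(f"unsupported instruction: {instruction!r}")
```

The image type returned here then selects the branch taken in step 504 or step 506 below.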
Step 504, if the image type is a first type, the first light emitter in the camera module is turned on, a pulse is sent to the first controller through the first PWM module to light the first light emitter, an infrared image corresponding to the first type is acquired in the first operating environment through the infrared camera in the camera module, and a first feature in the infrared image is extracted.
If the image type is a first type, which may be an infrared image, the first processor may send a control instruction to the first controller, where the control instruction may be used to turn on the first light emitter in the camera module. The first processor may send a pulse signal through the first PWM module to the first controller, which controls the first light emitter, to light it. Optionally, the first PWM module may light the first light emitter by continuously sending pulse signals of a certain voltage amplitude at certain time intervals. The first light emitter can be a point light source that irradiates uniformly in all directions; when it is lit, it can emit infrared light, and the infrared camera can acquire an infrared image. The first light emitter may be a laser lamp. The second light emitter may be a floodlight.
The electronic device can acquire the infrared image in the first operating environment through the infrared camera in the camera module, and can also extract the first feature from the acquired infrared image in the first operating environment.
Step 506, if the image type is a second type, the second light emitter in the camera module is turned on, a pulse is sent to the second controller through the second Pulse Width Modulation (PWM) module to light the second light emitter, an RGB image corresponding to the second type is acquired in the second operating environment through the RGB camera in the camera module, and a second feature in the RGB image is extracted.
If the image type is a second type, which may be an RGB image or the like, the first processor may send a control instruction to the second controller, where the control instruction may be used to turn on the second light emitter in the camera module. The first processor may send a pulse signal through the second PWM module to the second controller, which controls the second light emitter, to light it. Optionally, the second PWM module may light the second light emitter by continuously sending pulse signals of a certain voltage amplitude at certain time intervals. When the second light emitter is lit, an RGB image can be acquired through the RGB camera.
The electronic device can acquire an RGB image in the second operating environment through the RGB camera in the camera module, and can also perform the second feature extraction on the acquired RGB image in the second operating environment.
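The patent does not specify what the first and second features are (in a face-recognition context they might be keypoints). As a hedged illustration only, the following toy sketch treats a "feature" as a bright local maximum in a 2-D brightness array; the function name, threshold value, and feature definition are all invented for the sketch and are not part of the patent's method:

```python
def extract_features(image, threshold=128):
    """Toy stand-in for the unspecified first/second feature extraction:
    return (x, y) coordinates of pixels that are strict local maxima and
    brighter than `threshold`. `image` is a list of rows of intensities."""
    h, w = len(image), len(image[0])
    feats = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = image[y][x]
            if v < threshold:
                continue
            # Compare against the 8 surrounding pixels.
            neighbours = [image[y + dy][x + dx]
                          for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                          if (dy, dx) != (0, 0)]
            if all(v > n for n in neighbours):
                feats.append((x, y))
    return feats
```

In the patent's flow, one such extraction would run on the infrared image inside the first operating environment and another on the RGB image inside the second operating environment.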
When the second processor receives an image acquisition instruction sent by the first processor, it determines the image type according to the instruction. If the image type is the first type, the second processor lights the first light emitter through the first PWM module and acquires the infrared image corresponding to the first type through the infrared camera; if the image type is the second type, it lights the second light emitter by sending a pulse to the second controller through the second PWM module and acquires the RGB image corresponding to the second type through the RGB camera. Because the first light emitter and the second light emitter are controlled by two separate PWM modules, no real-time switching between them is needed, which reduces the complexity of data processing and relieves the processing pressure on the first processor.
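The two-branch dispatch described above — the first type routed to the first PWM module and light emitter, the second type to the second — can be sketched as follows. This is a minimal illustrative model, not the actual firmware; `PwmModule`, `dispatch_capture`, and the pulse parameters are all hypothetical names invented for the sketch:

```python
from dataclasses import dataclass, field
from typing import List

IMAGE_TYPE_INFRARED = 1   # "first type" in the patent's terminology
IMAGE_TYPE_RGB = 2        # "second type"

@dataclass
class PwmModule:
    """Hypothetical PWM module: records the pulses it sends to its controller."""
    pulses: List[dict] = field(default_factory=list)

    def send_pulse(self, amplitude_v: float, interval_ms: float) -> None:
        # A pulse at a fixed voltage amplitude and time interval lights the emitter.
        self.pulses.append({"amplitude_v": amplitude_v, "interval_ms": interval_ms})

def dispatch_capture(image_type: int, pwm1: PwmModule, pwm2: PwmModule) -> str:
    """Route an image-acquisition instruction to the matching PWM module,
    so each emitter has a dedicated module and no real-time switching is needed."""
    if image_type == IMAGE_TYPE_INFRARED:
        pwm1.send_pulse(amplitude_v=3.3, interval_ms=30)  # lights first (IR) emitter
        return "infrared"
    elif image_type == IMAGE_TYPE_RGB:
        pwm2.send_pulse(amplitude_v=3.3, interval_ms=30)  # lights second emitter
        return "rgb"
    raise ValueError(f"unknown image type: {image_type}")
```

The design point the patent stresses is visible here: because each branch touches only its own PWM module, neither branch has to reconfigure shared hardware before acting.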
In one embodiment, the first processor is connected to the second optical transmitter and the first optical transmitter via a bidirectional two-wire synchronous serial I2C bus, respectively. The provided image processing method further comprises the following steps: when the camera module is detected to be started, the first processor respectively configures the second light emitter and the first light emitter through the I2C bus.
When the electronic device needs to acquire image data through the camera module, it can start the camera module and collect images through it. When the electronic device detects that the camera module is started, the first processor can configure the second light emitter and the first light emitter respectively through an I2C bus, where the I2C bus implements data transmission among all devices connected to it through a data line and a clock line. The first processor may first read a configuration file and configure the second light emitter and the first light emitter according to the parameters contained in it. The configuration file may record parameters such as the emission power and emission current of the second light emitter and the first light emitter, but is not limited to these and may also record other parameters. The first processor may configure the emission power, emission current, and the like of the second light emitter and the first light emitter according to the parameters in the configuration file.
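The configure-on-startup step can be sketched as reading a configuration file and writing its power/current parameters to each emitter over the bus. Everything concrete here — the register map, the 7-bit device addresses, the JSON layout, and the class names — is an assumption for illustration; real values depend on the emitter driver ICs:

```python
import json

# Hypothetical register map; actual registers depend on the driver hardware.
REG_POWER, REG_CURRENT = 0x01, 0x02

class I2CBus:
    """Minimal stand-in for a two-wire (data line + clock line) bus:
    logs register writes per device address instead of touching hardware."""
    def __init__(self):
        self.writes = []

    def write_reg(self, dev_addr, reg, value):
        self.writes.append((dev_addr, reg, value))

def configure_emitters(bus, config_json, addr_first=0x50, addr_second=0x51):
    """Push emission power / emission current from a configuration file
    to the first and second emitters, as the first processor does at startup."""
    cfg = json.loads(config_json)
    for addr, key in ((addr_first, "first_emitter"), (addr_second, "second_emitter")):
        bus.write_reg(addr, REG_POWER, cfg[key]["power_mw"])
        bus.write_reg(addr, REG_CURRENT, cfg[key]["current_ma"])
```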
In one embodiment, the first processor may be connected to the second optical transmitter and the first optical transmitter through two separate I2C buses: one I2C bus to the second optical transmitter and another to the first optical transmitter. When the first processor configures the two optical transmitters, it may address and configure the second optical transmitter via the bus connected to it while addressing and configuring the first optical transmitter via the other bus. Because the two optical transmitters are connected through two separate I2C buses, they can be configured in parallel, which improves the data processing speed.
In one embodiment, the first processor, the second optical transmitter, and the first optical transmitter may all be connected to the same I2C bus. When the first processor configures the two optical transmitters, it may first address and configure the second optical transmitter through the I2C bus and then address and configure the first optical transmitter through the same bus. Optionally, the first processor may instead address and configure the first optical transmitter first and then the second optical transmitter. By time-sharing the same I2C bus, the complexity of the control circuit can be reduced, resources can be saved, and cost can be lowered.
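The time-shared variant — one bus, devices selected one after another by their addresses — can be sketched as below. The addresses, registers, and class names are hypothetical; the point is only that configuration is sequential on a shared bus, in whichever order the processor chooses:

```python
class SharedI2CBus:
    """Logs (address, register, value) writes on one bus shared by both emitters."""
    def __init__(self):
        self.writes = []

    def write_reg(self, addr, reg, value):
        self.writes.append((addr, reg, value))

def configure_time_shared(bus, device_configs):
    """Address and configure each device in turn over the same bus.
    `device_configs` is an ordered list of (7-bit address, {register: value}),
    so either emitter-first ordering from the patent can be expressed."""
    configured = []
    for addr, regs in device_configs:
        for reg, value in regs.items():
            bus.write_reg(addr, reg, value)   # device selected by its address
        configured.append(addr)
    return configured
```

With two separate buses (the parallel variant), the same writes could instead proceed concurrently, one device per bus, at the cost of a second set of bus lines.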
Fig. 6 is a schematic diagram of a first processor connected to a second optical transmitter and a first optical transmitter via an I2C bus in one embodiment. As shown in fig. 6, first processor 220 connects second optical transmitter 214 and first optical transmitter 218, respectively, via the same I2C bus.
Fig. 7 is a schematic diagram of the first processor connected to the second optical transmitter and the first optical transmitter via I2C buses in another embodiment. In fig. 7, first processor 220 is connected to second optical transmitter 214 and first optical transmitter 218 through two separate I2C buses: one I2C bus to second optical transmitter 214 and another to first optical transmitter 218.
In this embodiment, the first processor may configure the second light emitter and the first light emitter through the I2C bus when the camera module is started, so that image acquisition may be more accurately controlled, and data processing efficiency is improved.
In one embodiment, an image processing method is provided in which a timing at which a first PWM module sends a pulse to a first controller is different from a timing at which a second PWM module sends a pulse to a second controller, and a time interval between the timing at which the pulse is sent to the first controller and the timing at which the pulse is sent to the second controller is less than a time threshold.
The second processor determines the image type according to the image acquisition instruction, where the image type can include at least two types; for example, it can include the first type and the second type at the same time. When the image types include both an infrared image and an RGB image, the infrared image and the RGB image need to be acquired together. The second processor may then send a pulse to the first controller through the first PWM module and a pulse to the second controller through the second PWM module to light the first light emitter and the second light emitter. The moment at which the first PWM module sends its pulse to the first controller may differ from the moment at which the second PWM module sends its pulse to the second controller, so the two light emitters are lit at different moments. The second processor can acquire the infrared image through the infrared camera at the moment the first PWM module sends its pulse to the first controller, and acquire the RGB image through the RGB camera at the moment the second PWM module sends its pulse to the second controller.
The time interval between the moment the first PWM module sends its pulse to the first controller and the moment the second PWM module sends its pulse to the second controller is smaller than a time threshold, so the RGB camera collects the RGB image within an interval smaller than the time threshold after the infrared camera collects the infrared image. This keeps the content of the collected infrared image consistent with that of the RGB image, which is convenient for subsequent processing such as registration. The time threshold may be set according to actual requirements, for example, 20 milliseconds or 30 milliseconds.
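The timing constraint — distinct capture moments, but separated by less than the threshold — reduces to a small predicate. The function name and millisecond convention are illustrative assumptions:

```python
def captures_aligned(t_ir_ms, t_rgb_ms, threshold_ms=30):
    """True when the infrared and RGB frames were taken at different moments
    (the two PWM pulses are staggered) yet close enough together that their
    content can be treated as consistent for subsequent registration."""
    return t_ir_ms != t_rgb_ms and abs(t_rgb_ms - t_ir_ms) < threshold_ms
```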
By collecting the infrared image and the RGB image at different but closely spaced moments through the infrared camera and the RGB camera, the second processor can ensure that the content of the two images is consistent, improving the accuracy of subsequent face detection.
In an embodiment, the provided image processing method may further include a process of generating a binary file, specifically including: a binary file is generated and stored in the first runtime environment.
After the electronic device performs registration processing according to the first feature and the second feature in the first operating environment, it can generate a binary file, namely a .bin file. The electronic device may also store the generated binary file in the first operating environment.
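The patent does not define the .bin file's layout, so the sketch below invents a minimal one for illustration: a little-endian count header followed by float pairs for the registered feature coordinates. The format, function names, and path handling are all assumptions:

```python
import struct

def save_registration(features, path):
    """Serialize registered (x, y) feature pairs to a hypothetical .bin
    layout: a uint32 count, then little-endian float32 pairs."""
    with open(path, "wb") as f:
        f.write(struct.pack("<I", len(features)))
        for x, y in features:
            f.write(struct.pack("<ff", x, y))

def load_registration(path):
    """Read back the pairs written by save_registration."""
    with open(path, "rb") as f:
        (n,) = struct.unpack("<I", f.read(4))
        return [struct.unpack("<ff", f.read(8)) for _ in range(n)]
```

In the patent's scheme the file would live in storage belonging to the first (higher-security) operating environment, so the registered data never leaves the trusted side.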
In one embodiment, an image processing method is provided, and the specific steps for implementing the method are as follows:
First, the electronic device may obtain an image acquisition instruction. The image acquisition instruction may be an instruction for acquiring an image, and the electronic device may receive it through a button or a touch screen. Specifically, a central processing unit in the electronic device may obtain the image acquisition instruction.
The electronic device can acquire the infrared image in the first operating environment according to the image acquisition instruction and extract a first feature in the infrared image, and the electronic device can also acquire the RGB image in the second operating environment and extract a second feature in the RGB image. The first processor is respectively connected with the second optical transmitter and the first optical transmitter through a bidirectional two-wire system synchronous serial I2C bus. When the electronic equipment detects that the camera module is started, the first processor respectively configures the second light emitter and the first light emitter through the I2C bus.
The first processor may be connected to the second optical transmitter and the first optical transmitter through two I2C buses, respectively, and the first processor may be connected to the second optical transmitter through one I2C bus and to the first optical transmitter through another I2C bus. The first processor may also be connected to the second optical transmitter and the first optical transmitter, respectively, via the same I2C bus, and the second optical transmitter, the first optical transmitter, and the first processor may be connected to the same I2C bus.
When the second processor receives the image acquisition instruction sent by the first processor, the electronic device can determine the image type according to the image acquisition instruction. If the image type is the first type, the electronic equipment can start a first light emitter in the camera module, sends a pulse to the first controller through the first PWM module, lights the first light emitter, collects an infrared image corresponding to the first type in a first operating environment through an infrared camera in the camera module, and extracts a first feature in the infrared image. If the image type is the second type, the electronic device can start a second light emitter in the camera module, send a pulse to the second controller through the second Pulse Width Modulation (PWM) module, light the second light emitter, collect an RGB image corresponding to the second type in the second operating environment through the RGB camera in the camera module, and extract a second feature in the RGB image.
The time when the first PWM module sends the pulse to the first controller is different from the time when the second PWM module sends the pulse to the second controller, and the time interval between the time when the first PWM module sends the pulse to the first controller and the time when the second PWM module sends the pulse to the second controller is smaller than a time threshold.
The electronic device can then transmit the second feature to the first operating environment and perform registration processing according to the first feature and the second feature in the first operating environment. Registration processing, also called image registration, is the process of matching and superimposing two or more images acquired at different times, with different imaging devices, or under different conditions. Because the first feature extracted by the electronic device resides in the first operating environment while the second feature resides in the second operating environment, the electronic device transfers the second feature from the second operating environment to the first operating environment. After the transfer, both the first feature and the second feature exist in the first operating environment, and the electronic device can perform registration processing according to them there.
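The patent leaves the registration model unspecified; real systems typically fit an affine or homography transform to matched feature pairs. As a deliberately simplified, hypothetical stand-in, the sketch below estimates only a translation (the mean offset) mapping the second (RGB) features onto the first (infrared) features:

```python
def register_translation(first_features, second_features):
    """Estimate the translation aligning paired (x, y) features: a toy
    substitute for full registration, which would fit a richer transform."""
    assert first_features and len(first_features) == len(second_features)
    n = len(first_features)
    dx = sum(fx - sx for (fx, _), (sx, _) in zip(first_features, second_features)) / n
    dy = sum(fy - sy for (_, fy), (_, sy) in zip(first_features, second_features)) / n
    return dx, dy

def apply_translation(points, dx, dy):
    """Shift points by the estimated offset so the two feature sets overlap."""
    return [(x + dx, y + dy) for x, y in points]
```

In the patent's scheme both inputs to this step already live in the first (higher-security) operating environment, so the registration itself runs entirely on the trusted side.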
The electronic device may then also generate a binary file, namely a .bin file, after the registration processing according to the first feature and the second feature, and store the generated binary file in the first operating environment.
It should be understood that, although the steps in the above flowcharts are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and their execution order is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, an electronic device is provided and includes a camera module, a first processor, and a second processor, the second processor being connected to the first processor and the camera module, respectively. The camera module comprises a laser camera, a second light emitter, and a first light emitter; the first light emitter is connected with the first controller, and the second light emitter is connected with the second controller. The second processor comprises a first PWM module and a second PWM module; the second processor is connected with the first controller through the first PWM module and with the second controller through the second PWM module.
The first processor is used for acquiring an image acquisition instruction, acquiring an infrared image in a first operation environment according to the image acquisition instruction, extracting a first feature in the infrared image, acquiring an RGB image in a second operation environment, extracting a second feature in the RGB image, transmitting the second feature to the first operation environment, and performing registration processing according to the first feature and the second feature in the first operation environment.
The second processor is used for determining the image type according to the image acquisition instruction when receiving the image acquisition instruction sent by the first processor, if the image type is the first type, starting a first light emitter in the camera module, sending a pulse to the first controller through the first Pulse Width Modulation (PWM) module, lighting the first light emitter, and acquiring an infrared image corresponding to the first type through the infrared camera in the camera module; if the image type is the second type, a second light emitter in the camera module is started, a pulse is sent to a second controller through a second PWM module, the second light emitter is lightened, and then an RGB image corresponding to the second type is collected through an RGB camera in the camera module.
In this embodiment, when the second processor receives an image acquisition instruction sent by the first processor, it determines the image type according to the instruction. If the image type is the first type, the second processor lights the first light emitter through the first PWM module and acquires the infrared image corresponding to the first type through the infrared camera; if the image type is the second type, it lights the second light emitter by sending a pulse to the second controller through the second PWM module and acquires the RGB image corresponding to the second type through the RGB camera. Because the second light emitter and the first light emitter are controlled by two separate PWM modules, no real-time switching is required, which reduces the complexity of data processing and relieves the processing pressure on the first processor.
In one embodiment, the first processor is connected to the second optical transmitter and the first optical transmitter via a bidirectional two-wire synchronous serial I2C bus, respectively.
And the first processor is further used for respectively configuring the second light emitter and the first light emitter through the I2C bus when the camera module is detected to be started.
In one embodiment, the first processor is connected to the second optical transmitter and the first optical transmitter, respectively, through the same I2C bus.
In one embodiment, the first processor is connected to the second optical transmitter via one I2C bus and to the first optical transmitter via another I2C bus.
In this embodiment, the first processor may configure the second light emitter and the first light emitter through the I2C bus when the camera module is started, so that image acquisition may be more accurately controlled, and data processing efficiency is improved.
In one embodiment, the first PWM module sends a pulse to the first controller at a different time than the second PWM module sends a pulse to the second controller, and a time interval between the time of sending the pulse to the first controller and the time of sending the pulse to the second controller is less than a time threshold.
In this embodiment, the second processor may collect the infrared image and the RGB image at different times through the infrared camera and the RGB camera, and may ensure that the image contents of the collected infrared image and the RGB image are relatively consistent, thereby improving the accuracy of subsequent face detection.
In one embodiment, as shown in fig. 8, there is provided an image processing apparatus including: an instruction acquisition module 810, an image acquisition module 820, and a feature registration module 830, wherein:
and the instruction obtaining module 810 is used for obtaining an image acquisition instruction.
And the image acquisition module 820 is configured to acquire an infrared image in the first operating environment according to the image acquisition instruction and extract a first feature in the infrared image, and acquire an RGB image in the second operating environment and extract a second feature in the RGB image.
And a feature registration module 830, configured to transmit the second feature to the first operating environment, and perform a registration process according to the first feature and the second feature in the first operating environment.
Wherein the first operating environment has a higher security level than the second operating environment.
In one embodiment, the image acquisition module 820 may include an image type determination unit, a first feature extraction unit, and a second feature extraction unit, wherein:
and the image type determining unit is used for determining the image type according to the image acquisition instruction when the second processor receives the image acquisition instruction sent by the first processor.
And the first feature extraction unit is used for starting a first light emitter in the camera module if the image type is the first type, sending a pulse to the first controller through the first PWM module, lighting the first light emitter, acquiring an infrared image corresponding to the first type in the first operating environment through an infrared camera in the camera module, and extracting a first feature in the infrared image.
And the second feature extraction unit is used for starting a second light emitter in the camera module if the image type is the second type, sending a pulse to the second controller through the second Pulse Width Modulation (PWM) module, lighting the second light emitter, collecting an RGB image corresponding to the second type in a second operating environment through the RGB camera in the camera module, and extracting second features in the RGB image.
In one embodiment, the first processor is connected to the second optical transmitter and the first optical transmitter via a bidirectional two-wire synchronous serial I2C bus, respectively. When the camera module is detected to be started, the first processor respectively configures the second light emitter and the first light emitter through the I2C bus.
In one embodiment, the first processor is connected to the second optical transmitter via one I2C bus and to the first optical transmitter via another I2C bus.
In one embodiment, the first processor is connected to the second optical transmitter and the first optical transmitter, respectively, through the same I2C bus.
In one embodiment, the first PWM module sends a pulse to the first controller at a different time than the second PWM module sends a pulse to the second controller, and a time interval between the time of sending the pulse to the first controller and the time of sending the pulse to the second controller is less than a time threshold.
In one embodiment, the feature registration module 830 is further configured to generate a binary file and store the binary file in the first runtime environment.
The division of the modules in the image processing apparatus is only for illustration, and in other embodiments, the image processing apparatus may be divided into different modules as needed to complete all or part of the functions of the image processing apparatus.
For specific limitations of the image processing apparatus, reference may be made to the limitations of the image processing method above, which are not repeated here. Each module in the image processing apparatus may be implemented wholly or partially by software, hardware, or a combination thereof. Each module may be embedded in hardware form in, or independent of, a processor in the computer device, or stored in software form in a memory of the computer device, so that the processor can invoke and execute the operations corresponding to each module.
The implementation of each module in the image processing apparatus provided in the embodiment of the present application may be in the form of a computer program. The computer program may be run on a terminal or a server. The program modules constituted by the computer program may be stored on the memory of the terminal or the server. Which when executed by a processor, performs the steps of the method described in the embodiments of the present application.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the image processing method.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform an image processing method.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application, and while their description is specific and detailed, it should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An image processing method, comprising:
acquiring an image acquisition instruction;
acquiring an infrared image in a first operating environment according to the image acquisition instruction, and extracting a first feature in the infrared image, and acquiring an RGB image in a second operating environment, and extracting a second feature in the RGB image;
transmitting the second feature to the first operating environment, and performing registration processing according to the first feature and the second feature in the first operating environment;
wherein the first operating environment is higher in security level than the second operating environment;
the second processor collects infrared images and RGB images through the infrared camera and the RGB camera.
2. The method of claim 1, wherein capturing an infrared image in a first runtime environment and extracting a first feature in the infrared image and capturing an RGB image in a second runtime environment and extracting a second feature in the RGB image according to the image capture instructions comprises:
when the second processor receives an image acquisition instruction sent by the first processor, determining the image type according to the image acquisition instruction;
if the image type is a first type, starting a first light emitter in a camera module, sending a pulse to a first controller through a first PWM (pulse width modulation) module, lighting the first light emitter, acquiring an infrared image corresponding to the first type in a first operating environment through an infrared camera in the camera module, and extracting a first feature in the infrared image;
and if the image type is the second type, starting a second light emitter in the camera module, sending a pulse to a second controller through a second Pulse Width Modulation (PWM) module, lighting the second light emitter, collecting an RGB image corresponding to the second type in a second operating environment through an RGB camera in the camera module, and extracting a second feature in the RGB image.
3. The method of claim 2, wherein the first processor is connected to the second optical transmitter and the first optical transmitter via a bidirectional two-wire synchronous serial I2C bus;
the method further comprises the following steps:
when the camera module is detected to be started, the first processor respectively configures the second light emitter and the first light emitter through the I2C bus.
4. The method of claim 3, wherein the first processor is connected to the second optical transmitter via an I2C bus and to the first optical transmitter via another I2C bus.
5. The method of claim 3, wherein the first processor is connected to the second optical transmitter and the first optical transmitter via a same I2C bus.
6. The method of claim 2, wherein the first PWM module sends a pulse to the first controller at a different time than the second PWM module sends a pulse to the second controller, and wherein a time interval between the time of sending a pulse to the first controller and the time of sending a pulse to the second controller is less than a time threshold.
7. The method of any of claims 1 to 6, wherein after performing the registration processing step in the first runtime environment based on the first feature and the second feature, the method further comprises:
a binary file is generated and stored in the first runtime environment.
8. An image processing apparatus characterized by comprising:
the instruction acquisition module is used for acquiring an image acquisition instruction;
the image acquisition module is used for acquiring an infrared image in a first operation environment according to the image acquisition instruction, extracting a first feature in the infrared image, acquiring an RGB image in a second operation environment and extracting a second feature in the RGB image;
the feature registration module is used for transmitting the second feature to the first operating environment and performing registration processing according to the first feature and the second feature in the first operating environment;
the security level of the first operating environment is higher than that of the second operating environment, and the second processor collects infrared images and RGB images through the infrared camera and the RGB camera.
9. An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of the image processing method according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN201810864801.0A 2018-08-01 2018-08-01 Image processing method and device, electronic equipment and computer readable storage medium Active CN108986153B (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN201810864801.0A CN108986153B (en) 2018-08-01 2018-08-01 Image processing method and device, electronic equipment and computer readable storage medium
PCT/CN2019/080429 WO2020024603A1 (en) 2018-08-01 2019-03-29 Image processing method and apparatus, electronic device, and computer readable storage medium
US16/526,648 US10929988B2 (en) 2018-08-01 2019-07-30 Method and device for processing image, and electronic device
EP19189270.2A EP3605452B1 (en) 2018-08-01 2019-07-31 Method and device for processing image, and electronic device
ES19189270T ES2847232T3 (en) 2018-08-01 2019-07-31 Method and device for processing images and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810864801.0A CN108986153B (en) 2018-08-01 2018-08-01 Image processing method and device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108986153A CN108986153A (en) 2018-12-11
CN108986153B true CN108986153B (en) 2021-03-12

Family

ID=64551218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810864801.0A Active CN108986153B (en) 2018-08-01 2018-08-01 Image processing method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108986153B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020024603A1 (en) * 2018-08-01 2020-02-06 Oppo广东移动通信有限公司 Image processing method and apparatus, electronic device, and computer readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104333686A (en) * 2014-11-27 2015-02-04 Tianjin Tiandy Digital Technology Co., Ltd. Intelligent monitoring camera based on face and voiceprint recognition and control method of intelligent monitoring camera
US20150044995A1 (en) * 2012-11-16 2015-02-12 At&T Intellectual Property I, Lp Methods for provisioning universal integrated circuit cards
CN105306490A (en) * 2015-11-23 2016-02-03 Xiaomi Technology Co., Ltd. System, method and device for payment verification
CN105488679A (en) * 2015-11-23 2016-04-13 Xiaomi Technology Co., Ltd. Mobile payment equipment, method and device based on biological recognition technology
CN107808127A (en) * 2017-10-11 2018-03-16 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Face identification method and related product
CN108090477A (en) * 2018-01-23 2018-05-29 Beijing Yi Intelligent Technology Co., Ltd. Face recognition method and device based on multi-spectral image fusion

Also Published As

Publication number Publication date
CN108986153A (en) 2018-12-11

Similar Documents

Publication Publication Date Title
CN110191266B (en) Data processing method and device, electronic equipment and computer readable storage medium
CN110248111B (en) Method and device for controlling shooting, electronic equipment and computer-readable storage medium
US10215557B2 (en) Distance image acquisition apparatus and distance image acquisition method
CN111523499B (en) Image processing method, apparatus, electronic device, and computer-readable storage medium
US20140321700A1 (en) Light sensing module and system
CN110971836B (en) Method and device for controlling shooting, electronic equipment and computer-readable storage medium
WO2019206129A1 (en) Data processing method, apparatus, electronic device, and computer-readable storage medium
CN108573170B (en) Information processing method and device, electronic equipment and computer readable storage medium
CN109145653B (en) Data processing method and device, electronic equipment and computer readable storage medium
WO2019205887A1 (en) Method and apparatus for controlling photographing, electronic device, and computer readable storage medium
CN108833887B (en) Data processing method and device, electronic equipment and computer readable storage medium
CN111008671B (en) Gesture recognition method and apparatus, electronic device, and computer-readable storage medium
CN108986153B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108965716B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109064503B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN115604575A (en) Image acquisition device and image acquisition method
WO2020024603A1 (en) Image processing method and apparatus, electronic device, and computer readable storage medium
CN110609555B (en) Method, apparatus, electronic device, and computer-readable storage medium for signal control
KR20070066382A (en) 3D image creation method using two cameras and camera terminal implementing the method
CN110213407B (en) Electronic device, operation method thereof and computer storage medium
CN105807888A (en) Electronic equipment and information processing method
KR20200117460A (en) Electronic device and method for controlling heat generation thereof
CN108810516B (en) Data processing method and device, electronic equipment and computer readable storage medium
CN111669482A (en) Image processing method, system, medium, chip and structural optoelectronic device
US11170204B2 (en) Data processing method, electronic device and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant