CN110324521B - Method and device for controlling camera, electronic equipment and storage medium - Google Patents

Method and device for controlling camera, electronic equipment and storage medium

Info

Publication number
CN110324521B
Authority
CN
China
Prior art keywords
distance
laser
camera
target camera
target
Prior art date
Legal status
Active
Application number
CN201910600333.0A
Other languages
Chinese (zh)
Other versions
CN110324521A (en)
Inventor
谭国辉
周海涛
郭子青
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910600333.0A
Publication of CN110324521A
Application granted
Publication of CN110324521B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/026 Details of the structure or mounting of specific components of portable telephone sets
    • H04M1/0264 Details of the structure or mounting of specific components for a camera module assembly
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/56 Cameras or camera modules comprising electronic image sensors provided with illuminating means
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/611 Control of cameras or camera modules based on recognised objects, where the recognised objects include parts of the human body

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the application relate to a method and an apparatus for controlling a camera, an electronic device, and a storage medium. The method comprises the following steps: when a target camera is in an open state, acquiring the distance between a human face and the target camera every preset time period; adjusting the shooting frame rate of the target camera and/or the emission power of a laser according to the distance; and controlling the laser to emit laser light according to the emission power, and controlling the target camera to acquire a target image according to the shooting frame rate. The method, the apparatus, the electronic device, and the storage medium for controlling a camera can reduce the harm caused to human eyes by the laser light emitted by the laser and protect the safety of human eyes.

Description

Method and device for controlling camera, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of imaging technologies, and in particular, to a method and an apparatus for controlling a camera, an electronic device, and a storage medium.
Background
3D (three-dimensional) human faces play an important role in application scenarios such as face recognition, facial beautification, and 3D model building. By emitting laser light with a laser such as a laser lamp, images with speckles can be formed, and a depth map of the face can be generated from the speckle images, thereby obtaining the 3D face. In conventional approaches, the laser light emitted by the laser may cause certain damage to human eyes and endanger eye safety.
Disclosure of Invention
The embodiments of the application provide a method and an apparatus for controlling a camera, an electronic device, and a storage medium, which can reduce the harm caused to human eyes by the laser light emitted by a laser and protect the safety of human eyes.
A method of controlling a camera, comprising:
when a target camera is in an open state, acquiring the distance between a human face and the target camera at intervals of a preset time period;
adjusting the shooting frame rate of the target camera and/or the emission power of a laser according to the distance;
and controlling the laser to emit laser according to the emission power, and controlling the target camera to acquire a target image according to the shooting frame rate.
An apparatus for controlling a camera, comprising:
the distance acquisition module is used for acquiring the distance between the face and the target camera at intervals of a preset time period when the target camera is in a use state;
the adjusting module is used for adjusting the shooting frame rate of the target camera and/or the transmitting power of the laser according to the distance;
and the control module is used for controlling the laser to emit laser according to the emission power and controlling the target camera to acquire a target image according to the shooting frame rate.
An electronic device, comprising: a first processing unit, a second processing unit, and a camera module, wherein the camera module comprises a target camera and a laser, and the first processing unit is connected to the second processing unit and to the target camera and the laser in the camera module, respectively;
the second processing unit is used for acquiring the distance between the face and the target camera at intervals of a preset time period when the target camera is in an open state;
the second processing unit is further configured to adjust a shooting frame rate of the target camera and/or emission power of a laser according to the distance;
the first processing unit is used for controlling the laser to emit laser according to the emission power and controlling the target camera to acquire a target image according to the shooting frame rate.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method as set forth above.
According to the method, the apparatus, the electronic device, and the storage medium for controlling the camera, when the target camera is in a use state, the distance between the human face and the target camera is acquired every preset time period, the shooting frame rate of the target camera and/or the emission power of the laser is adjusted according to the distance, the laser is controlled to emit laser light according to the emission power, and the target camera is controlled to acquire a target image according to the shooting frame rate. Because the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted according to the distance between the human face and the target camera, the harm caused to human eyes by the laser light emitted by the laser can be reduced, and the safety of human eyes is protected.
Drawings
FIG. 1 is a block diagram of an electronic device in one embodiment;
FIG. 2 is a diagram of an application scenario of a method for controlling a camera in one embodiment;
FIG. 3 is a schematic flow chart illustrating a method for controlling a camera in one embodiment;
FIG. 4 is a schematic flow chart illustrating a process of obtaining a distance between a face and a target camera according to an embodiment;
FIG. 5 is a block diagram of an apparatus for controlling a camera in one embodiment;
FIG. 6 is a block diagram of a distance acquisition module in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
FIG. 1 is a block diagram of an electronic device in one embodiment. As shown in fig. 1, the electronic device includes a processor, a memory, a display screen, and an input device connected through a system bus. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium of the electronic device stores an operating system and a computer program, and the computer program is executed by the processor to implement the method for controlling a camera provided in the embodiments of the application. The processor is used to provide computing and control capabilities and to support the operation of the whole electronic device. The internal memory in the electronic device provides an environment for the execution of the computer program stored in the non-volatile storage medium. The display screen of the electronic device may be a liquid crystal display screen or an electronic ink display screen, and the input device may be a touch layer covering the display screen, a key, a trackball or a touchpad arranged on a housing of the electronic device, or an external keyboard, touchpad or mouse. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like. Those skilled in the art will appreciate that the architecture shown in fig. 1 is a block diagram of only a portion of the architecture associated with the present application and does not constitute a limitation on the electronic devices to which the present application may be applied; a particular electronic device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
Fig. 2 is an application scenario diagram of a method for controlling a camera in an embodiment. As shown in fig. 2, the electronic device 200 may include a camera module 210, a second processing unit 220, and a first processing unit 230. The second processing unit 220 may be a CPU (Central Processing Unit) module, and the first processing unit 230 may be an MCU (Microcontroller Unit) module. The first processing unit 230 is connected between the second processing unit 220 and the camera module 210; the first processing unit 230 can control the laser camera 212, the floodlight 214 and the laser lamp 218 in the camera module 210, and the second processing unit 220 can control the RGB (Red/Green/Blue) camera 216 in the camera module 210.
The camera module 210 includes a laser camera 212, a floodlight 214, an RGB camera 216, and a laser lamp 218. The laser camera 212 may be an infrared camera for acquiring infrared images. The floodlight 214 is a surface light source capable of emitting infrared light; the laser lamp 218 is a point light source capable of emitting laser light with a pattern. When the floodlight 214 emits light, the laser camera 212 can obtain an infrared image from the reflected light. When the laser lamp 218 emits light, the laser camera 212 can obtain a speckle image from the reflected light. The speckle image is an image showing the deformation of the pattern after the patterned light emitted by the laser lamp 218 is reflected.
The second processing unit 220 may include a CPU core operating in a TEE (Trusted Execution Environment) and a CPU core operating in a REE (Rich Execution Environment). The TEE and the REE are both running modes of an ARM (Advanced RISC Machine) processor. The security level of the TEE is higher, and only one CPU core in the second processing unit 220 can operate in the TEE at a time. Generally, operations with a higher security level in the electronic device 200 need to be executed in the CPU core in the TEE, while operations with a lower security level can be executed in the CPU core in the REE.
The first processing unit 230 includes a PWM (Pulse Width Modulation) module 232, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) Interface 234, a RAM (Random Access Memory) module 236, and a depth engine 238. The PWM module 232 may transmit pulses to the camera module to control the floodlight 214 or the laser 218 to be turned on, so that the laser camera 212 may collect infrared images or speckle images. The SPI/I2C interface 234 is used for receiving the face acquisition command sent by the second processing unit 220. The depth engine 238 may process the speckle images to obtain a depth disparity map.
When the second processing unit 220 receives a data acquisition request of an application program, for example, when the application program needs to perform face unlocking and face payment, a face acquisition instruction may be sent to the first processing unit 230 through the CPU core operating in the TEE environment. After the first processing unit 230 receives the face collecting instruction, the PWM module 232 emits a pulse wave to control the floodlight 214 in the camera module 210 to be turned on and collect an infrared image through the laser camera 212, and to control the laser light 218 in the camera module 210 to be turned on and collect a speckle image through the laser camera 212. The camera module 210 may send the collected infrared image and speckle image to the first processing unit 230. The first processing unit 230 may process the received infrared image to obtain an infrared disparity map; and processing the received speckle images to obtain a speckle parallax image or a depth parallax image. The processing of the infrared image and the speckle image by the first processing unit 230 refers to correcting the infrared image or the speckle image and removing the influence of internal and external parameters in the camera module 210 on the image. The first processing unit 230 can be set to different modes, and the images output by the different modes are different. When the first processing unit 230 is set to be in the speckle pattern mode, the first processing unit 230 processes the speckle image to obtain a speckle disparity map, and a target speckle map can be obtained according to the speckle disparity map; when the first processing unit 230 is set to the depth map mode, the first processing unit 230 processes the speckle images to obtain a depth disparity map, and obtains a depth image according to the depth disparity map, where the depth image is an image with depth information. The first processing unit 230 may send the infrared disparity map and the speckle disparity map to the second processing unit 220, and the first processing unit 230 may also send the infrared disparity map and the depth disparity map to the second processing unit 220. The second processing unit 220 may obtain an infrared image of the target according to the infrared disparity map and obtain a depth image according to the depth disparity map. Further, the second processing unit 220 may perform face recognition, face matching, living body detection, and depth information acquisition on the detected face according to the target infrared image and the depth image.
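Purely as an illustration of the data flow just described, the following Python sketch models the exchange between the two processing units; every function name is a hypothetical placeholder for a hardware or firmware operation and is not taken from the application.

```python
# Illustrative sketch of the acquisition flow described above; every function is a
# hypothetical placeholder standing in for a hardware or firmware operation.

def capture(light_source: str) -> str:
    # PWM pulse turns on the floodlight or the laser lamp, and the laser camera
    # collects an infrared image or a speckle image accordingly.
    return f"raw_image({light_source})"

def correct(image: str) -> str:
    # Removes the influence of the camera module's internal and external
    # parameters, producing a disparity map.
    return f"disparity({image})"

def first_unit_handle_face_acquisition(mode: str = "depth_map"):
    """First processing unit (MCU): collect and pre-process both images."""
    infrared_image = capture("floodlight")
    speckle_image = capture("laser_lamp")
    ir_disparity = correct(infrared_image)
    second_disparity = correct(speckle_image)  # speckle or depth disparity map, per mode
    return ir_disparity, second_disparity

def second_unit_on_data_request():
    """Second processing unit (CPU, TEE core): request images, then reconstruct them."""
    ir_disparity, depth_disparity = first_unit_handle_face_acquisition()
    target_infrared = f"infrared_from({ir_disparity})"
    depth_image = f"depth_from({depth_disparity})"
    # Downstream: face recognition, face matching, liveness detection, depth info.
    return target_infrared, depth_image

print(second_unit_on_data_request())
```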
The first processing unit 230 and the second processing unit 220 communicate through fixed secure interfaces to ensure the security of the transmitted data. As shown in fig. 2, the data sent by the second processing unit 220 to the first processing unit 230 passes through a SECURE SPI/I2C 240, and the data sent by the first processing unit 230 to the second processing unit 220 passes through a SECURE MIPI (Mobile Industry Processor Interface) 250.
In an embodiment, the first processing unit 230 may also obtain a target infrared image according to the infrared disparity map, calculate and obtain a depth image according to the depth disparity map, and then send the target infrared image and the depth image to the second processing unit 220.
As shown in fig. 3, in one embodiment, there is provided a method of controlling a camera, including the steps of:
Step 310: when the target camera is in an open state, acquire the distance between the human face and the target camera every preset time period.
When an application program in the electronic device needs to acquire face data, the target camera can be controlled to turn on, and a target image is acquired through the target camera. The target camera may refer to a laser camera in the camera module, and the laser camera can acquire invisible-light images of different wavelengths. The target image may include, but is not limited to, an infrared image, a speckle image, and the like, where the speckle image refers to an infrared image carrying speckles. The electronic device can turn on the floodlight in the camera module and collect an infrared image through the laser camera, and can turn on a laser such as the laser lamp in the camera module and collect a speckle image through the laser camera. The floodlight can be a point light source that shines uniformly in all directions, the light emitted by the floodlight can be infrared light, and the laser camera can capture the human face to obtain the infrared image. The laser light emitted by the laser can be diffracted by a lens and a DOE (Diffractive Optical Element) to produce a pattern with speckle particles; this pattern is projected onto the target object, and the speckle pattern is offset differently depending on the distance between each point of the target object and the electronic device. The laser camera captures the target object to obtain the speckle image.
The electronic device controls the target camera to turn on and controls the laser to turn on and emit laser light, so that the target image can be acquired through the target camera. When the distance between the human face and the target camera is too small, the laser light emitted by the laser may cause certain injury to the human eyes, and the smaller the distance, the greater the injury and the greater the risk to eye health. When the target camera is in an open state, the electronic device can acquire the distance between the face and the target camera every preset time period, where the acquisition period can be set according to actual requirements, such as 30 milliseconds, 1 second, and the like. The distance between the face and the target camera can also be understood as the distance between the face and the electronic device, or the distance between the face and the laser, and the like.
In one embodiment, the electronic device can adjust the acquisition time interval according to the degree of change in the distance between the face and the target camera. After obtaining the current distance between the face and the target camera, the electronic device can obtain the distance acquired the previous time, calculate the difference between the current distance and the previous distance, and adjust the acquisition time interval according to the difference. If the difference is large, the distance between the human face and the target camera has changed greatly, so the acquisition time period can be shortened and the acquisition frequency increased; if the difference is small, the distance has changed little, so the acquisition time period can be lengthened and the acquisition frequency reduced. By adjusting the acquisition time interval according to the degree of change in the distance between the human face and the target camera, the distance can be acquired more accurately and in a more timely manner.
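As a rough illustration of this adaptive sampling, the following Python sketch shortens or lengthens the acquisition period according to how much the measured distance changed since the previous measurement; the thresholds, bounds, and scaling factors are assumptions chosen for the example and are not specified by the embodiment.

```python
def next_acquisition_period(prev_distance_cm: float,
                            curr_distance_cm: float,
                            period_ms: float,
                            min_period_ms: float = 30.0,     # assumed lower bound
                            max_period_ms: float = 1000.0,   # assumed upper bound
                            change_threshold_cm: float = 5.0) -> float:
    """Shorten the period when the face-to-camera distance changes quickly,
    lengthen it when the distance is nearly stable."""
    change = abs(curr_distance_cm - prev_distance_cm)
    if change > change_threshold_cm:
        period_ms /= 2.0   # large change: sample more frequently
    else:
        period_ms *= 1.5   # small change: sample less frequently
    return min(max(period_ms, min_period_ms), max_period_ms)

# Example: the face moved 12 cm closer since the last sample.
print(next_acquisition_period(50.0, 38.0, 200.0))  # -> 100.0 (ms)
```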
In one embodiment, after the electronic device collects a target image such as an infrared image or a speckle image through the target camera, the depth information of the face can be obtained from the target image. In the camera coordinate system, the straight line that is perpendicular to the imaging plane and passes through the center of the lens is taken as the Z axis; if the coordinates of an object in the camera coordinate system are (X, Y, Z), the Z value is the depth information of the object relative to the imaging plane of the camera. The electronic device can determine the distance between the face and the target camera according to the depth information of the face.
In one embodiment, a distance sensor can be arranged on the electronic equipment, and the distance between the human face and the target camera is acquired through the distance sensor. It is to be understood that the electronic device may also use other methods to obtain the distance between the human face and the target camera, and is not limited to the above methods.
Step 320: adjust the shooting frame rate of the target camera and/or the emission power of the laser according to the distance.
The electronic device can adjust the shooting frame rate of the target camera and/or the emission power of the laser according to the distance between the human face and the target camera. The shooting frame rate refers to the frequency at which the target camera collects target images within a certain time, such as 1 frame/second or 3 frames/second; the emission power of the laser can be used to represent the intensity of the emitted laser light, and the higher the emission power, the higher the intensity of the emitted laser light.
The electronic device can adjust the shooting frame rate of the target camera and/or the emission power of the laser according to the distance between the face and the target camera, and when the distance between the face and the target camera is too small, the shooting frame rate of the target camera and/or the emission power of the laser can be reduced. Reducing the shooting frame rate of the target camera reduces the number of laser emissions within a certain time, and reducing the emission power of the laser reduces the intensity of the emitted laser light, so the harm caused to human eyes by the laser light emitted by the laser can be reduced. The smaller the distance between the human face and the target camera, the lower the shooting frame rate of the target camera and/or the emission power of the laser can be set.
Step 330: control the laser to emit laser light according to the emission power, and control the target camera to acquire a target image according to the shooting frame rate.
After the electronic equipment adjusts the shooting frame rate of the target camera and/or the emission power of the laser according to the distance between the face and the target camera, the laser can be controlled to emit laser according to the adjusted emission power, and the target camera is controlled to acquire a target image according to the shooting frame rate.
In one embodiment, the electronic device may include a camera module, a first processing unit, and a second processing unit, where the camera module may include a laser camera, a floodlight, a laser lamp, and an RGB camera, the first processing unit may be an MCU module, and the second processing unit may be a CPU module. When the second processing unit receives a face depth information acquisition request sent by an application program, it can send a face acquisition instruction to the first processing unit. The first processing unit can control, through the PWM module, the laser camera to collect target images such as infrared images and speckle images. When the laser camera in the camera module is in an open state, the second processing unit can acquire the distance between the human face and the laser camera every preset time period. The second processing unit may adjust the shooting frame rate of the laser camera and/or the emission power of the laser according to the distance. The first processing unit can then control the laser lamp to emit laser light according to the adjusted emission power, and control the laser camera to collect target images such as infrared images and speckle images according to the adjusted shooting frame rate.
In this embodiment, when the target camera is in a use state, the distance between the human face and the target camera is acquired every preset time period, the shooting frame rate of the target camera and/or the emission power of the laser is adjusted according to the distance, the laser is controlled to emit laser light according to the emission power, and the target camera is controlled to acquire a target image according to the shooting frame rate. Because the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted according to the distance between the human face and the target camera, the harm caused to human eyes by the laser light emitted by the laser can be reduced, and the safety of human eyes is protected.
In one embodiment, the step 320 of adjusting the shooting frame rate of the target camera and/or the emission power of the laser according to the distance includes: and when the distance is smaller than the first distance threshold and larger than the second distance threshold, reducing the shooting frame rate of the target camera.
The electronic device may set a first distance threshold and a second distance threshold, where the first distance threshold may be greater than the second distance threshold, and the first distance threshold may be a safe distance at which laser light emitted by the laser does not affect human eye health. After acquiring the distance from the face to the target camera, the electronic device can judge whether the distance is smaller than the first distance threshold; if the distance is smaller than the first distance threshold, the laser light emitted by the laser may cause damage to human eyes. The electronic device may further determine whether the distance from the face to the target camera is greater than the second distance threshold. If the distance from the human face to the target camera is smaller than the first distance threshold and larger than the second distance threshold, it indicates that the laser light emitted by the laser may cause certain damage to human eyes, but the damage is slight, and the electronic device may reduce only the shooting frame rate of the target camera without changing the emission power of the laser.
In an embodiment, if the distance from the human face to the target camera is less than or equal to the second distance threshold, it may be indicated that the human eye is damaged more by the laser emitted by the laser, and the electronic device may decrease the shooting frame rate of the target camera and decrease the emission power of the laser. Optionally, when the distance from the human face to the target camera is less than or equal to the second distance threshold, the electronic device may determine whether the distance is greater than a third distance threshold. If the distance from the face to the target camera is less than or equal to the second distance threshold and greater than the third distance threshold, the shooting frame rate of the target camera can be reduced, and the emission power of the laser can be reduced, wherein the reduction of the emission power of the laser can be the reduction of the driving current of the laser. The electronic device can reduce the driving current of the laser to a preset percentage of a rated driving current, wherein the preset percentage can be set according to actual requirements, for example, 30%, 20% and the like, and the rated driving current refers to a normal driving current of the laser when a human face is at a safe distance from the laser camera. If the distance between the human face and the target camera is smaller than or equal to the third distance threshold, it can be shown that the laser emitted by the laser will cause serious damage to the human eyes. The electronic device can reduce the shooting frame rate of the target camera, reduce the emission power of the laser, and reduce the driving current of the laser below a current threshold, wherein the current threshold can be smaller than a preset percentage of a rated driving current. The electronic equipment can greatly reduce the emission power of the laser, thereby protecting the safety of human eyes to the maximum extent.
In one embodiment, the electronic device may set a standard frame rate for the target camera and a rated power for the laser. When the distance from the human face to the target camera is greater than or equal to the first distance threshold, it may be indicated that the human face is at a relatively safe distance from the target camera; the laser may be controlled to emit laser light according to the rated power, and the target camera may be controlled to acquire the target image according to the standard frame rate.
Optionally, the electronic device may establish a correspondence between distance intervals and the shooting frame rate of the target camera and the emission power of the laser, where different distance intervals may correspond to different shooting frame rates of the target camera and different emission powers of the laser. For example, when the distance is greater than or equal to the first distance threshold of 20 centimeters, the corresponding shooting frame rate of the target camera is the standard frame rate of 30 frames/second, and the emission power of the laser is the rated power of 1000 milliwatts; when the distance is smaller than the first distance threshold of 20 centimeters and larger than the second distance threshold of 10 centimeters, the corresponding shooting frame rate of the target camera is 1 frame/second, and the emission power of the laser is 1000 milliwatts; when the distance is smaller than or equal to the second distance threshold of 10 centimeters and larger than the third distance threshold of 3 centimeters, the corresponding shooting frame rate of the target camera is 1 frame/second, and the emission power of the laser is 300 milliwatts; when the distance is smaller than or equal to the third distance threshold of 3 centimeters, the corresponding shooting frame rate of the target camera is 1 frame/second, the emission power of the laser is 125 milliwatts, and so on. It can be understood that the distance intervals may be set according to actual requirements, and the shooting frame rate of the target camera and the emission power of the laser corresponding to each distance interval may also be set according to actual requirements, which are not limited to the above. By setting different distance intervals and the shooting frame rate of the target camera and the emission power of the laser corresponding to each distance interval, the safety of human eyes can be protected to the greatest extent, and the loss in the acquired target image can be reduced as much as possible.
Every preset time period, the electronic device acquires the distance between the face and the target camera, determines the distance interval to which the distance belongs, and obtains the shooting frame rate of the target camera and the emission power of the laser corresponding to that distance interval. It can then control the laser to emit laser light according to the corresponding emission power, and control the target camera to acquire a target image according to the corresponding shooting frame rate.
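A minimal sketch of such a lookup, reusing the example thresholds, frame rates, and emission powers given above (20 cm, 10 cm, and 3 cm thresholds; a standard frame rate of 30 frames/second; a rated power of 1000 milliwatts); the function itself is illustrative and not part of the application.

```python
def settings_for_distance(distance_cm: float):
    """Return (shooting frame rate in frames/second, laser emission power in mW)."""
    if distance_cm >= 20.0:   # at or beyond the first distance threshold
        return 30, 1000       # standard frame rate, rated power
    if distance_cm > 10.0:    # between the second and first thresholds
        return 1, 1000        # reduce only the shooting frame rate
    if distance_cm > 3.0:     # between the third and second thresholds
        return 1, 300         # also reduce the emission power
    return 1, 125             # at or below the third threshold: lowest power

print(settings_for_distance(25.0))  # (30, 1000)
print(settings_for_distance(15.0))  # (1, 1000)
print(settings_for_distance(5.0))   # (1, 300)
print(settings_for_distance(2.0))   # (1, 125)
```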
In this embodiment, the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted according to the distance between the human face and the target camera, so that not only can a normal target image be acquired, but also the damage of laser emitted by the laser to human eyes can be reduced, and the safety of the human eyes can be protected.
As shown in fig. 4, in an embodiment, the step of obtaining the distance between the face and the target camera includes the following steps:
and step 402, calculating a depth map according to the acquired target speckle pattern and a stored reference speckle pattern, wherein the reference speckle pattern is the stored speckle pattern used for camera calibration, and the reference speckle pattern carries reference depth information.
The target image collected by the target camera can comprise a target speckle pattern, the electronic equipment can obtain the collected target speckle pattern and the reference speckle pattern, and the target speckle pattern and the reference speckle pattern can be compared to obtain a depth map, so that the depth information of the face can be obtained from the depth map. The electronic device may sequentially select a pixel block of a predetermined size, for example, 31 pixels by 31 pixels, centered on each pixel point included in the target speckle pattern, and search for a block on the reference speckle pattern that matches the selected pixel block. The electronic equipment can find two points of the target speckle pattern and the reference speckle pattern on the same laser light path respectively from a pixel block selected from the target speckle pattern and a block matched with the reference speckle pattern, wherein speckle information of the two points on the same laser light path is consistent, and the two points on the same laser light path can be identified as corresponding pixel points. The electronic device can calculate the offset between two corresponding pixel points of the target speckle pattern and the reference speckle pattern on the same laser light path. The electronic equipment can calculate the depth information of each pixel point contained in the target speckle pattern according to the offset, so that the depth map containing the depth information of each pixel point in the target speckle pattern can be obtained.
In one embodiment, the electronic device calculates an offset between the target speckle pattern and the reference speckle pattern, and calculates the depth information of each pixel point contained in the target speckle pattern according to the offset, where the depth information can be calculated according to formula (1):

Z_D = (L × f × Z_0) / (L × f + Z_0 × P)    (1)

where Z_D represents the depth information of a pixel point, namely the depth value of the pixel point; L is the distance between the laser camera and the laser; f is the focal length of the lens in the laser camera; Z_0 is the depth value of the reference plane from the laser camera of the electronic device when the reference speckle pattern is collected; and P is the offset between corresponding pixel points in the target speckle pattern and the reference speckle pattern. P can be obtained by multiplying the number of pixels of offset between the target speckle pattern and the reference speckle pattern by the actual distance of one pixel. When the distance between the target object and the laser camera is larger than the distance between the reference plane and the laser camera, P is a negative value; when the distance between the target object and the laser camera is smaller than the distance between the reference plane and the laser camera, P is a positive value.
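The reconstructed formula (1) can be checked numerically; in the following sketch the baseline L, focal length f, and reference-plane depth Z_0 are assumed example values, since the embodiment does not specify them.

```python
def depth_from_offset(offset_mm: float,
                      baseline_mm: float,
                      focal_length_mm: float,
                      reference_depth_mm: float) -> float:
    """Formula (1): Z_D = (L * f * Z_0) / (L * f + Z_0 * P).

    offset_mm (P) is the pixel offset multiplied by the physical size of one pixel;
    it is negative when the object is farther than the reference plane and positive
    when it is closer.
    """
    numerator = baseline_mm * focal_length_mm * reference_depth_mm
    denominator = baseline_mm * focal_length_mm + reference_depth_mm * offset_mm
    return numerator / denominator

# Assumed example values: L = 30 mm, f = 2 mm, Z_0 = 500 mm.
print(depth_from_offset(0.0, 30.0, 2.0, 500.0))    # 500.0 -> on the reference plane
print(depth_from_offset(-0.05, 30.0, 2.0, 500.0))  # > 500 -> farther than the reference plane
print(depth_from_offset(0.05, 30.0, 2.0, 500.0))   # < 500 -> closer than the reference plane
```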
Step 404: determine the proportion of the effective value area of the depth map in the depth map.
The electronic device can determine the distance between the face and the target camera according to the depth value of each pixel point in the depth map. The electronic equipment can perform face recognition on the target speckle pattern, determine a face region, and extract the depth value of each pixel point contained in the face region. The electronic device may calculate an average depth value of the face region and determine a distance between the face and the target camera according to the average depth value, for example, if the average depth value of the face region is 50 cm, the distance between the face and the target camera may be determined to be 50 cm. Optionally, the electronic device may also select a pixel block in a middle area of the target speckle pattern, for example, select a pixel block with a size of 25 pixels by 25 pixels in the middle area, and calculate an average depth value of the pixel block, which may be used as a distance between the human face and the target camera.
The electronic device can detect an effective value area of the depth map, where the effective value area refers to the area occupied by pixel points whose depth values are larger than a preset effective value, and the effective value can be set according to actual requirements. The electronic device can determine the proportion of the effective value area in the depth map and obtain the distance between the face and the target camera according to the proportion, because when the distance between the face and the target camera is too small, the pixel points of the face in the depth map may have no depth value or only small depth values.
The electronic device can establish a correspondence between the proportion of the effective value area in the depth map and the distance, and convert the proportion into the distance between the human face and the target camera according to this correspondence. For example, if a proportion of 80% corresponds to a distance of 20 centimeters, then when the proportion of the effective value area in the depth map is less than 80%, it can be determined that the distance between the face and the target camera is less than 20 centimeters. If the proportion of the effective value area in the depth map is 100%, the distance between the human face and the target camera can be determined directly from the depth value of each pixel point in the depth map.
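For illustration, the following sketch (assuming NumPy and a depth map expressed in centimeters) estimates the face-to-camera distance either directly from a central pixel block or from the proportion of valid depth values, reusing the 25 x 25 block and the 80%-to-20 cm correspondence from the examples above; the validity threshold and the linear ratio-to-distance mapping are assumptions.

```python
import numpy as np

def distance_from_center_block(depth_map_cm: np.ndarray, block: int = 25) -> float:
    """Average depth of a pixel block in the middle of the depth map."""
    h, w = depth_map_cm.shape
    top, left = (h - block) // 2, (w - block) // 2
    return float(depth_map_cm[top:top + block, left:left + block].mean())

def valid_area_ratio(depth_map_cm: np.ndarray, valid_threshold_cm: float = 1.0) -> float:
    """Proportion of the depth map occupied by pixels with a valid depth value."""
    return float((depth_map_cm > valid_threshold_cm).mean())

def distance_from_ratio(ratio: float,
                        known_ratio: float = 0.8,
                        known_distance_cm: float = 20.0) -> float:
    """Convert the valid-area ratio into a distance using the example
    correspondence (80% <-> 20 cm); a linear relation is assumed here."""
    return known_distance_cm * ratio / known_ratio

def estimate_distance(depth_map_cm: np.ndarray) -> float:
    ratio = valid_area_ratio(depth_map_cm)
    if ratio >= 1.0:
        # Every pixel has a depth value: read the distance directly from the map.
        return distance_from_center_block(depth_map_cm)
    return distance_from_ratio(ratio)

depth_map = np.full((240, 320), 50.0)   # synthetic map: whole scene at 50 cm
print(estimate_distance(depth_map))     # -> 50.0
```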
Step 406: acquire the distance between the human face and the target camera according to the proportion.
In one embodiment, the electronic device may be provided with a distance sensor, and the distance between the face and the target camera can be acquired through the distance sensor. The electronic device can calculate the depth information of the face according to the target speckle pattern collected by the target camera, and determine the distance between the face and the target camera according to the depth information. When the distance obtained in this way is smaller than the first distance threshold, the electronic device can acquire the distance between the face and the target camera collected by the distance sensor. By calculating the distance from the depth information, and acquiring the distance through the distance sensor when the face is close, the distance between the face and the target camera can be obtained more accurately.
In one embodiment, the second processing unit in the electronic device may have two operation modes: the first operation mode may be the TEE, a trusted execution environment with a high security level; the second operation mode may be the REE, a rich execution environment with a low security level. When an application program of the electronic device sends a face depth information acquisition request to the second processing unit, the second processing unit can acquire the application type of the application program and switch the operation mode according to the security level corresponding to the application type. The application types may include, but are not limited to, unlock applications, payment applications, camera applications, beauty applications, and the like. Different application types may have different security levels; for example, but not limited to, the security level corresponding to payment applications and unlock applications may be high, while the security level corresponding to camera applications and beauty applications may be low. If the security level corresponding to the application type is high, the second processing unit can switch to the first operation mode; if the security level corresponding to the application type is low, the second processing unit can switch to the second operation mode.
Optionally, when the second processing unit in the electronic device is single-core, that single core may be switched directly from the second operation mode to the first operation mode; when the second processing unit in the electronic device is multi-core, the electronic device switches one core from the second operation mode to the first operation mode, and the other cores still operate in the second operation mode. The second processing unit can send the face acquisition instruction to the first processing unit through the core switched to the first operation mode, so that the instruction input to the first processing unit is ensured to be secure. The first processing unit can control, through the PWM module, the laser camera to collect target images such as infrared images and speckle images. The first processing unit can calculate a disparity map from the target speckle pattern and the reference speckle pattern, and send the disparity map to the core of the second processing unit operating in the first operation mode. If the security level of the application program sending the face depth information acquisition request is high, the core operating in the first operation mode in the second processing unit can calculate the depth map according to the disparity map. If the security level of the application program sending the face depth information acquisition request is low, the core operating in the first operation mode can send the disparity map to another core operating in the second operation mode, and that core calculates the depth map.
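To make the mode selection concrete, here is a small Python sketch in which an application type is mapped to a security level and the depth-map calculation is dispatched to the first (TEE) or second (REE) operation mode, following the description above; the application-type names and the depth-calculation placeholder are hypothetical.

```python
# Hypothetical security levels per application type (payment/unlock high,
# camera/beauty low), as described above.
HIGH_SECURITY_TYPES = {"payment", "unlock"}
LOW_SECURITY_TYPES = {"camera", "beauty"}

def security_level(app_type: str) -> str:
    return "high" if app_type in HIGH_SECURITY_TYPES else "low"

def compute_depth_map(disparity_map, run_mode: str):
    # Placeholder for the depth calculation; in the embodiment it runs either on
    # the core switched to the first operation mode (TEE) or on a core in the
    # second operation mode (REE).
    return {"disparity": disparity_map, "computed_in": run_mode}

def handle_depth_request(app_type: str, disparity_map):
    """The disparity map is always received in the first operation mode; the
    depth map is then computed in the TEE or handed off to the REE."""
    if security_level(app_type) == "high":
        return compute_depth_map(disparity_map, run_mode="TEE")
    return compute_depth_map(disparity_map, run_mode="REE")

print(handle_depth_request("payment", "disparity_0"))  # computed in the TEE
print(handle_depth_request("beauty", "disparity_1"))   # computed in the REE
```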
In the embodiment, the distance between the human face and the target camera can be accurately calculated, so that the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted along with the change of the distance, and the safety of human eyes is protected.
In one embodiment, a method of controlling a camera is provided, comprising the steps of:
and (1) when the target camera is in an open state, acquiring the distance between the face and the target camera every a preset time period.
In one embodiment, step (1) comprises: calculating a depth map according to the acquired target speckle pattern and a stored reference speckle pattern, wherein the reference speckle pattern is a stored speckle pattern used for camera calibration, and the reference speckle pattern carries reference depth information; determining the proportion of the effective value area of the depth map in the depth map; and obtaining the distance between the human face and the target camera according to the proportion.
In one embodiment, after the step of obtaining the distance between the human face and the target camera according to the proportion, the method further comprises the following steps: and when the distance acquired according to the proportion is smaller than a first distance threshold value, acquiring the distance between the face acquired by the distance sensor and the target camera.
Step (2): adjust the shooting frame rate of the target camera and/or the emission power of the laser according to the distance.
In one embodiment, step (2) comprises: and when the distance is smaller than the first distance threshold and larger than the second distance threshold, reducing the shooting frame rate of the target camera.
In one embodiment, step (2) comprises: and when the distance is smaller than or equal to the second distance threshold, reducing the shooting frame rate of the target camera and the emission power of the laser.
In one embodiment, step (2) comprises: when the distance is smaller than or equal to the second distance threshold and larger than the third distance threshold, reducing the driving current of the laser to a preset percentage of the rated driving current; and when the distance is smaller than or equal to a third distance threshold, reducing the driving current of the laser to be lower than a current threshold, wherein the current threshold is smaller than a preset percentage of the rated driving current.
In one embodiment, step (2) comprises: and when the distance is greater than or equal to the first distance threshold value, the shooting frame rate of the target camera is restored to the standard frame rate, and the emission power of the laser is restored to the rated power.
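The following sketch combines the step (2) embodiments above into a single decision function; the standard frame rate, reduced frame rate, rated driving current, preset percentage, and current threshold are assumed example values rather than values fixed by the application.

```python
def adjust(distance_cm: float,
           first_cm: float = 20.0, second_cm: float = 10.0, third_cm: float = 3.0,
           standard_fps: int = 30, reduced_fps: int = 1,
           rated_current_ma: float = 100.0,
           preset_percentage: float = 0.3,       # assumed: 30% of the rated current
           current_threshold_ma: float = 10.0):  # assumed: below the preset percentage
    """Return (shooting frame rate, laser driving current in mA) for the distance."""
    if distance_cm >= first_cm:
        # At or beyond the safe distance: standard frame rate, rated driving current.
        return standard_fps, rated_current_ma
    if distance_cm > second_cm:
        # Slight risk: reduce only the shooting frame rate.
        return reduced_fps, rated_current_ma
    if distance_cm > third_cm:
        # Higher risk: also reduce the driving current to the preset percentage.
        return reduced_fps, rated_current_ma * preset_percentage
    # Serious risk: drive the laser below the current threshold.
    return reduced_fps, current_threshold_ma * 0.5

for d in (25.0, 15.0, 5.0, 2.0):
    print(d, adjust(d))
```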
Step (3): control the laser to emit laser light according to the emission power, and control the target camera to acquire a target image according to the shooting frame rate.
In this embodiment, when the target camera is in a use state, the distance between the human face and the target camera is acquired every preset time period, the shooting frame rate of the target camera and/or the emission power of the laser is adjusted according to the distance, the laser is controlled to emit laser light according to the emission power, and the target camera is controlled to acquire a target image according to the shooting frame rate. Because the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted according to the distance between the human face and the target camera, the harm caused to human eyes by the laser light emitted by the laser can be reduced, and the safety of human eyes is protected.
It should be understood that, although the steps in the flow charts described above are shown in the sequence indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated otherwise herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in the flow charts described above may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and the order in which these sub-steps or stages are performed is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
As shown in fig. 5, in one embodiment, an apparatus 500 for controlling a camera is provided, which includes a distance obtaining module 510, an adjusting module 520, and a control module 530.
And a distance obtaining module 510, configured to obtain a distance between the human face and the target camera every preset time period when the target camera is in a use state.
And an adjusting module 520, configured to adjust a shooting frame rate of the target camera and/or a transmission power of the laser according to the distance.
And the control module 530 is configured to control the laser to emit laser according to the emission power, and control the target camera to acquire the target image according to the shooting frame rate.
In this embodiment, when the target camera is in a use state, the distance between the human face and the target camera is acquired every preset time period, the shooting frame rate of the target camera and/or the emission power of the laser is adjusted according to the distance, the laser is controlled to emit laser light according to the emission power, and the target camera is controlled to acquire a target image according to the shooting frame rate. Because the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted according to the distance between the human face and the target camera, the harm caused to human eyes by the laser light emitted by the laser can be reduced, and the safety of human eyes is protected.
In one embodiment, the adjusting module 520 is further configured to decrease the shooting frame rate of the target camera when the distance is smaller than the first distance threshold and larger than the second distance threshold.
In one embodiment, the adjusting module 520 is further configured to reduce the shooting frame rate of the target camera and the emission power of the laser when the distance is less than or equal to the second distance threshold.
In one embodiment, the adjusting module 520 is further configured to reduce the driving current of the laser to a preset percentage of the rated driving current when the distance is less than or equal to the second distance threshold and greater than the third distance threshold.
The adjusting module 520 is further configured to reduce the driving current of the laser to be below a current threshold when the distance is smaller than or equal to a third distance threshold, where the current threshold is smaller than a preset percentage of the rated driving current.
In one embodiment, the adjusting module 520 is further configured to restore the shooting frame rate of the target camera to the standard frame rate and restore the emission power of the laser to the rated power when the distance is greater than or equal to the first distance threshold.
In this embodiment, the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted according to the distance between the human face and the target camera, so that not only can a normal target image be acquired, but also the damage of laser emitted by the laser to human eyes can be reduced, and the safety of the human eyes can be protected.
As shown in fig. 6, in one embodiment, the distance obtaining module 510 includes a depth calculating unit 512, a scale determining unit 514, and a obtaining unit 516.
And the depth calculating unit 512 is configured to calculate a depth map according to the acquired target speckle pattern and a stored reference speckle pattern, where the reference speckle pattern is a stored speckle pattern used for camera calibration, and the reference speckle pattern carries reference depth information.
And a proportion determining unit 514, configured to determine a proportion of the valid value area of the depth map to the depth map.
And an obtaining unit 516, configured to obtain a distance between the human face and the target camera according to the ratio.
In one embodiment, the distance obtaining module 510 is further configured to obtain a distance between the human face collected by the distance sensor and the target camera when the distance obtained according to the ratio is smaller than a first distance threshold.
In the embodiment, the distance between the human face and the target camera can be accurately calculated, so that the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted along with the change of the distance, and the safety of human eyes is protected.
In one embodiment, an electronic device is provided. The electronic equipment can include first processing unit, second processing unit and camera module, and camera module includes target camera and laser instrument, and target camera and laser instrument in second processing unit, the camera module are connected respectively to first processing unit.
And the second processing unit is used for acquiring the distance between the human face and the target camera every preset time period when the target camera is in an open state.
And the second processing unit is also used for adjusting the shooting frame rate of the target camera and/or the transmitting power of the laser according to the distance.
And the first processing unit is used for controlling the laser to emit laser according to the emission power and controlling the target camera to acquire a target image according to the shooting frame rate.
In this embodiment, when the target camera is in a use state, the distance between the human face and the target camera is acquired every preset time period, the shooting frame rate of the target camera and/or the emission power of the laser is adjusted according to the distance, the laser is controlled to emit laser light according to the emission power, and the target camera is controlled to acquire a target image according to the shooting frame rate. Because the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted according to the distance between the human face and the target camera, the harm caused to human eyes by the laser light emitted by the laser can be reduced, and the safety of human eyes is protected.
In one embodiment, the second processing unit is further configured to reduce the shooting frame rate of the target camera when the distance is smaller than the first distance threshold and greater than a second distance threshold.
In one embodiment, the second processing unit is further configured to reduce the shooting frame rate of the target camera and the emission power of the laser when the distance is less than or equal to the second distance threshold.
In one embodiment, the second processing unit is further configured to reduce the driving current of the laser to a preset percentage of the rated driving current when the distance is less than or equal to the second distance threshold and greater than a third distance threshold.
The second processing unit is further configured to reduce the driving current of the laser below a current threshold when the distance is less than or equal to the third distance threshold, where the current threshold is smaller than the preset percentage of the rated driving current.
In one embodiment, the second processing unit is further configured to, when the distance is greater than or equal to the first distance threshold, restore the shooting frame rate of the target camera to a standard frame rate and restore the emission power of the laser to a rated power.
In this embodiment, the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted according to the distance between the human face and the target camera, so that not only can a normal target image be acquired, but the damage to human eyes from the emitted laser can also be reduced, protecting the safety of human eyes.
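The staged thresholds of the preceding embodiments could be expressed, purely as an illustrative sketch, as the policy below; every numeric value (thresholds, frame rates, currents, percentages) is an assumption, since the patent leaves them unspecified.

```python
# Illustrative constants only; the patent does not disclose concrete values.
FIRST_THRESHOLD_MM = 500     # below this, start lowering the frame rate
SECOND_THRESHOLD_MM = 300    # at or below this, also lower the laser power
THIRD_THRESHOLD_MM = 100     # at or below this, clamp the drive current hard
STANDARD_FRAME_RATE_FPS = 30
REDUCED_FRAME_RATE_FPS = 10
RATED_DRIVE_CURRENT_MA = 200.0
PRESET_PERCENTAGE = 0.6      # "preset percentage of the rated driving current"
CURRENT_THRESHOLD_MA = 50.0  # smaller than the preset percentage of the rated current

def adjust_for_distance(distance_mm):
    """Return (frame_rate, drive_current_mA) for a given face-to-camera distance."""
    if distance_mm >= FIRST_THRESHOLD_MM:
        # Far enough away: restore the standard frame rate and rated power.
        return STANDARD_FRAME_RATE_FPS, RATED_DRIVE_CURRENT_MA
    if distance_mm > SECOND_THRESHOLD_MM:
        # Between the first and second thresholds: only the frame rate is reduced.
        return REDUCED_FRAME_RATE_FPS, RATED_DRIVE_CURRENT_MA
    if distance_mm > THIRD_THRESHOLD_MM:
        # Between the second and third thresholds: reduce the current to the preset percentage.
        return REDUCED_FRAME_RATE_FPS, RATED_DRIVE_CURRENT_MA * PRESET_PERCENTAGE
    # At or below the third threshold: drop the current strictly below the current threshold.
    return REDUCED_FRAME_RATE_FPS, CURRENT_THRESHOLD_MA * 0.5
```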
In one embodiment, the first processing unit is further configured to calculate a disparity map according to the acquired target speckle pattern and a stored reference speckle pattern, and transmit the disparity map to the second processing unit, where the reference speckle pattern is the stored speckle pattern used for camera calibration, and the reference speckle pattern carries reference depth information.
The second processing unit is further configured to calculate the depth map according to the disparity map and determine the proportion of the effective value area of the depth map to the depth map.
The second processing unit is further configured to obtain the distance between the human face and the target camera according to the proportion.
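For concreteness, the disparity-to-depth step could follow the commonly cited structured-light relation below; this is a generic sketch under assumed calibration values, not necessarily the exact formula used in the patent.

```python
import numpy as np

def depth_from_disparity(disparity_px, z0_mm=600.0, baseline_mm=30.0, focal_px=800.0):
    """Convert a disparity map (target vs. reference speckle) into a depth map.

    Uses the generic structured-light relation Z = Z0 * f * L / (f * L + Z0 * d),
    where Z0 is the depth of the reference (calibration) plane, L the baseline
    between laser and camera, f the focal length in pixels, and d the disparity.
    The calibration values here are placeholders.
    """
    d = np.asarray(disparity_px, dtype=np.float64)
    denom = focal_px * baseline_mm + z0_mm * d
    with np.errstate(divide="ignore", invalid="ignore"):
        depth = np.where(np.abs(denom) > 1e-6,
                         z0_mm * focal_px * baseline_mm / denom,
                         0.0)  # mark degenerate pixels as invalid (depth 0)
    return depth
```

Pixels left at depth 0 (or falling outside a plausible range) would then be excluded from the effective value area when the proportion is computed.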
In one embodiment, the second processing unit is further configured to obtain the application type of an application program and determine the security level corresponding to the application type, where the application program is one that sends a face depth information request to the second processing unit;
and the second processing unit is further configured to switch the operation mode according to the security level: when the security level is high, the disparity map is received through the first operation mode and the depth map is calculated from the disparity map in the first operation mode; when the security level is low, the disparity map is received through the first operation mode and the depth map is calculated from the disparity map in the second operation mode.
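One plausible reading of this mode switching, sketched below, is a dispatch between a higher-isolation and a normal execution path; the security table, the environment objects, and their `compute_depth` method are all hypothetical, since the patent only names a "first" and a "second" operation mode.

```python
from enum import Enum

class SecurityLevel(Enum):
    HIGH = "high"   # e.g. face payment or device unlock
    LOW = "low"     # e.g. beautification or AR effects

def handle_depth_request(app_type, disparity_map, first_mode_env, second_mode_env,
                         security_table=None):
    """Dispatch depth-map computation to an operation mode chosen by security level."""
    security_table = security_table or {"payment": SecurityLevel.HIGH,
                                        "unlock": SecurityLevel.HIGH,
                                        "beauty": SecurityLevel.LOW}
    level = security_table.get(app_type, SecurityLevel.LOW)
    if level is SecurityLevel.HIGH:
        # High security: receive and process the disparity map in the first operation mode.
        return first_mode_env.compute_depth(disparity_map)
    # Low security: compute the depth map in the second operation mode.
    return second_mode_env.compute_depth(disparity_map)
```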
In one embodiment, the electronic device further comprises a distance sensor, the distance sensor being connected to the second processing unit.
The distance sensor is configured to collect the distance between the human face and the target camera.
The second processing unit is further configured to obtain the distance, collected by the distance sensor, between the human face and the target camera when the distance obtained according to the proportion is smaller than the first distance threshold.
In this embodiment, the distance between the human face and the target camera can be calculated accurately, so that the shooting frame rate of the target camera and/or the emission power of the laser can be dynamically adjusted as the distance changes, thereby protecting the safety of human eyes.
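The fallback to the hardware distance sensor could look like the sketch below; `read_proximity_sensor` is a placeholder for a device-specific API, and the threshold value is assumed.

```python
def fused_face_distance(depth_based_distance_mm, read_proximity_sensor,
                        first_threshold_mm=500):
    """Cross-check a depth-map distance estimate against a dedicated distance sensor.

    When the depth-map estimate drops below the first distance threshold, the reading
    from the hardware sensor is preferred, since at close range the effective-area
    proportion becomes a coarse proxy for distance.
    """
    if depth_based_distance_mm < first_threshold_mm:
        return read_proximity_sensor()
    return depth_based_distance_mm
```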
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored. The computer program, when executed by a processor, implements the above-described method of controlling a camera.
In one embodiment, a computer program product comprising a computer program is provided. When the computer program runs on a computer device, it causes the computer device to carry out the above-described method of controlling a camera.
It will be understood by those skilled in the art that all or part of the processes of the methods in the embodiments described above can be implemented by a computer program. The program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or the like.
Any reference to memory, storage, database, or other medium as used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the embodiments described above are described; however, as long as there is no contradiction in a combination of these technical features, the combination should be considered to be within the scope of this specification.
The embodiments described above express only several implementations of the present application, and their description is relatively specific and detailed, but they are not to be construed as limiting the scope of the invention. It should be noted that several variations and modifications can be made by a person skilled in the art without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (15)

1. A method of controlling a camera, comprising:
when a target camera is in an open state, acquiring the distance between a human face and the target camera;
adjusting the shooting frame rate of the target camera and the emission power of a laser according to the distance;
and controlling the laser to emit laser according to the emission power, and controlling the target camera to acquire a target image according to the shooting frame rate.
2. The method of claim 1, wherein the adjusting the shooting frame rate of the target camera and the emission power of the laser according to the distance comprises:
when the distance is smaller than or equal to a second distance threshold, reducing the shooting frame rate of the target camera and the emission power of the laser.
3. The method of claim 2, wherein the adjusting the shooting frame rate of the target camera and the emission power of the laser according to the distance comprises:
when the distance is smaller than or equal to the second distance threshold and larger than a third distance threshold, reducing the driving current of the laser to a preset percentage of a rated driving current;
when the distance is less than or equal to the third distance threshold, reducing the drive current of the laser below a current threshold, the current threshold being less than a preset percentage of the rated drive current.
4. The method according to any one of claims 2 to 3, wherein the adjusting of the shooting frame rate of the target camera and the emission power of the laser according to the distance comprises:
and when the distance is greater than or equal to a first distance threshold value, restoring the shooting frame rate of the target camera to a standard frame rate, and restoring the emission power of the laser to a rated power.
5. The method of claim 1, wherein the target image comprises a target speckle pattern;
the acquiring the distance between the human face and the target camera comprises:
calculating a depth map according to the acquired target speckle pattern and a stored reference speckle pattern, wherein the reference speckle pattern is a stored speckle pattern used for camera calibration, and the reference speckle pattern carries reference depth information;
determining the proportion of the effective value area of the depth map to the depth map;
and acquiring the distance between the face and the target camera according to the proportion.
6. The method of claim 5, wherein after said obtaining the distance between the face and the target camera according to the ratio, the method further comprises:
when the distance acquired according to the proportion is smaller than a first distance threshold, acquiring, by a distance sensor, the distance between the face and the target camera.
7. An apparatus for controlling a camera, comprising:
the distance acquisition module is used for acquiring the distance between the face and the target camera when the target camera is in a use state;
the adjusting module is used for adjusting the shooting frame rate of the target camera and the emission power of the laser according to the distance;
and the control module is used for controlling the laser to emit laser according to the emission power and controlling the target camera to acquire a target image according to the shooting frame rate.
8. An electronic device, comprising a first processing unit, a second processing unit, and a camera module, wherein the camera module comprises a target camera and a laser, and the first processing unit is connected to the second processing unit and to the target camera and the laser in the camera module, respectively;
the second processing unit is used for acquiring the distance between the face and the target camera when the target camera is in an open state;
the second processing unit is further configured to adjust a shooting frame rate of the target camera and an emission power of the laser according to the distance;
the first processing unit is used for controlling the laser to emit laser according to the emission power and controlling the target camera to acquire a target image according to the shooting frame rate.
9. The electronic device according to claim 8, wherein the second processing unit is further configured to reduce the shooting frame rate of the target camera and the emission power of the laser when the distance is less than or equal to a second distance threshold.
10. The electronic device of claim 9, wherein the second processing unit is further configured to reduce the driving current of the laser to a preset percentage of a rated driving current when the distance is less than or equal to the second distance threshold and greater than a third distance threshold;
the second processing unit is further configured to reduce the driving current of the laser below a current threshold when the distance is less than or equal to the third distance threshold, where the current threshold is less than a preset percentage of the rated driving current.
11. The electronic device according to any one of claims 9 to 10, wherein the second processing unit is further configured to, when the distance is greater than or equal to a first distance threshold, restore the shooting frame rate of the target camera to a standard frame rate, and restore the emission power of the laser to a rated power.
12. The electronic device according to claim 8, wherein the first processing unit is further configured to calculate a disparity map according to the acquired target speckle pattern and a stored reference speckle pattern, and transmit the disparity map to the second processing unit, where the reference speckle pattern is a stored speckle pattern used for camera calibration, and the reference speckle pattern carries reference depth information;
the second processing unit is further used for calculating a depth map according to the disparity map and determining the proportion of an effective value area of the depth map to the depth map;
and the second processing unit is also used for acquiring the distance between the human face and the target camera according to the proportion.
13. The electronic device according to claim 12, wherein the second processing unit is further configured to obtain an application type of an application program, and determine a security level corresponding to the application type, where the application program is an application program that sends a request for face depth information to the second processing unit;
the second processing unit is further configured to switch an operation mode according to the security level, receive the disparity map through a first operation mode and calculate a depth map from the disparity map in the first operation mode when the security level is high, and receive the disparity map through the first operation mode and calculate a depth map from the disparity map in a second operation mode when the security level is low.
14. The electronic device of claim 12, further comprising a distance sensor coupled to the second processing unit;
the distance sensor is used for acquiring the distance between the human face and the target camera;
the second processing unit is further configured to acquire the distance, collected by the distance sensor, between the face and the target camera when the distance acquired according to the proportion is smaller than a first distance threshold.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1 to 6.
CN201910600333.0A 2018-04-28 2018-04-28 Method and device for controlling camera, electronic equipment and storage medium Active CN110324521B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910600333.0A CN110324521B (en) 2018-04-28 2018-04-28 Method and device for controlling camera, electronic equipment and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910600333.0A CN110324521B (en) 2018-04-28 2018-04-28 Method and device for controlling camera, electronic equipment and storage medium
CN201810404834.7A CN108769509B (en) 2018-04-28 2018-04-28 Control method, apparatus, electronic equipment and the storage medium of camera

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810404834.7A Division CN108769509B (en) 2018-02-27 2018-04-28 Control method, apparatus, electronic equipment and the storage medium of camera

Publications (2)

Publication Number Publication Date
CN110324521A CN110324521A (en) 2019-10-11
CN110324521B true CN110324521B (en) 2021-03-26

Family

ID=64008875

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810404834.7A Active CN108769509B (en) 2018-02-27 2018-04-28 Control method, apparatus, electronic equipment and the storage medium of camera
CN201910600333.0A Active CN110324521B (en) 2018-04-28 2018-04-28 Method and device for controlling camera, electronic equipment and storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201810404834.7A Active CN108769509B (en) 2018-02-27 2018-04-28 Control method, apparatus, electronic equipment and the storage medium of camera

Country Status (1)

Country Link
CN (2) CN108769509B (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019165956A1 (en) * 2018-02-27 2019-09-06 Oppo广东移动通信有限公司 Control method, control apparatus, terminal, computer device, and storage medium
CN109683698B (en) * 2018-12-25 2020-05-22 Oppo广东移动通信有限公司 Payment verification method and device, electronic equipment and computer-readable storage medium
CN111435973B (en) * 2019-01-15 2021-12-03 杭州海康威视数字技术股份有限公司 Laser camera
CN109743505B (en) * 2019-01-25 2021-01-19 Oppo广东移动通信有限公司 Video shooting method and device based on laser ranging and electronic equipment
CN111738246A (en) * 2019-03-25 2020-10-02 北京小米移动软件有限公司 Electronic device, method of controlling the same, and machine-readable storage medium
CN109885093A (en) * 2019-03-26 2019-06-14 Oppo广东移动通信有限公司 Control system and control method, the terminal of flight time component
CN109946704A (en) * 2019-03-26 2019-06-28 Oppo广东移动通信有限公司 Control system and control method, the terminal of flight time component
WO2020237657A1 (en) * 2019-05-31 2020-12-03 Oppo广东移动通信有限公司 Control method for electronic device, electronic device, and computer-readable storage medium
CN110324602A (en) * 2019-06-17 2019-10-11 Oppo广东移动通信有限公司 A kind of generation method and device, terminal, storage medium of 3-D image
CN110308458B (en) * 2019-06-27 2021-03-23 Oppo广东移动通信有限公司 Adjusting method, adjusting device, terminal and computer readable storage medium
CN110260823A (en) * 2019-07-16 2019-09-20 腾讯科技(深圳)有限公司 A kind of structured light control method, device and computer equipment
CN110944135B (en) * 2019-11-18 2022-05-31 深圳前海达闼云端智能科技有限公司 Power control method, electronic device and storage medium
CN111199198B (en) * 2019-12-27 2023-08-04 深圳市优必选科技股份有限公司 Image target positioning method, image target positioning device and mobile robot
CN113132613A (en) * 2019-12-31 2021-07-16 中移物联网有限公司 Camera light supplementing device, electronic equipment and light supplementing method
CN113223209A (en) * 2020-01-20 2021-08-06 深圳绿米联创科技有限公司 Door lock control method and device, electronic equipment and storage medium
CN111487632A (en) * 2020-04-06 2020-08-04 深圳蚂里奥技术有限公司 Laser safety control device and control method
CN111487633A (en) * 2020-04-06 2020-08-04 深圳蚂里奥技术有限公司 Laser safety control device and method
CN111427049A (en) * 2020-04-06 2020-07-17 深圳蚂里奥技术有限公司 Laser safety device and control method
CN112073708B (en) * 2020-09-17 2022-08-09 君恒新信息科技(深圳)有限公司 Power control method and equipment for TOF camera light emission module
CN112967953B (en) * 2020-12-31 2023-09-08 深圳中科飞测科技股份有限公司 Method for using semiconductor processing apparatus, and storage medium
CN113268137A (en) * 2021-02-03 2021-08-17 深圳赋能软件有限公司 Human eye protection device and method, identity recognition device and electronic equipment
CN113807172B (en) * 2021-08-11 2022-10-18 荣耀终端有限公司 Face recognition method and device
CN113780090B (en) * 2021-08-12 2023-07-28 荣耀终端有限公司 Data processing method and device
CN114054879A (en) * 2021-10-20 2022-02-18 深圳泰德激光技术股份有限公司 Molten tin welding method, three-dimensional five-axis laser processing device and readable storage medium
CN114500795B (en) * 2021-12-27 2024-03-15 奥比中光科技集团股份有限公司 Laser safety control method and device, intelligent door lock and storage medium
CN114531541B (en) * 2022-01-10 2023-06-02 荣耀终端有限公司 Control method and device for camera module
CN115460347A (en) * 2022-08-18 2022-12-09 科大讯飞股份有限公司 Control method of monitoring device, monitoring device and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6724467B1 (en) * 2002-04-19 2004-04-20 Richard I. Billmers System for viewing objects at a fire scene and method of use
CN1567982A (en) * 2003-07-07 2005-01-19 致伸科技股份有限公司 Image pick-up device finding a view by using laser beam
CN104349072A (en) * 2013-08-09 2015-02-11 联想(北京)有限公司 Control method, device and electronic equipment
CN105373223A (en) * 2015-10-10 2016-03-02 惠州Tcl移动通信有限公司 Lighting equipment capable of automatically adjusting luminous intensity and method
CN105791681A (en) * 2016-02-29 2016-07-20 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN106331517A (en) * 2016-09-26 2017-01-11 维沃移动通信有限公司 Soft light lamp brightness control method and electronic device
CN107436430A (en) * 2017-08-07 2017-12-05 周俊 High security photoelectric remote-sensing device scan detection device
CN107451561A (en) * 2017-07-31 2017-12-08 广东欧珀移动通信有限公司 Iris recognition light compensation method and device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007006016A (en) * 2005-06-22 2007-01-11 Sharp Corp Imaging equipment
JP5398341B2 (en) * 2009-05-11 2014-01-29 キヤノン株式会社 Object recognition apparatus and object recognition method
CN104967776B (en) * 2015-06-11 2018-03-27 广东欧珀移动通信有限公司 One kind is taken pictures method to set up and user terminal
CN205754594U (en) * 2016-02-15 2016-11-30 公安部第一研究所 A kind of light supply apparatus that human image collecting is carried out BLC
CN205921676U (en) * 2016-08-25 2017-02-01 北京旷视科技有限公司 Image capturing apparatus
CN107423716A (en) * 2017-07-31 2017-12-01 广东欧珀移动通信有限公司 Face method for monitoring state and device


Also Published As

Publication number Publication date
CN108769509A (en) 2018-11-06
CN110324521A (en) 2019-10-11
CN108769509B (en) 2019-06-21

Similar Documents

Publication Publication Date Title
CN110324521B (en) Method and device for controlling camera, electronic equipment and storage medium
US10455141B2 (en) Auto-focus method and apparatus and electronic device
WO2018161877A1 (en) Processing method, processing device, electronic device and computer readable storage medium
CN111126146B (en) Image processing method, image processing device, computer readable storage medium and electronic apparatus
CN108573170B (en) Information processing method and device, electronic equipment and computer readable storage medium
CN110248111B (en) Method and device for controlling shooting, electronic equipment and computer-readable storage medium
US11227368B2 (en) Method and device for controlling an electronic device based on determining a portrait region using a face region detection and depth information of the face region detected
CN108804895B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
US11256903B2 (en) Image processing method, image processing device, computer readable storage medium and electronic device
US11335028B2 (en) Control method based on facial image, related control device, terminal and computer device
CN108650472B (en) Method and device for controlling shooting, electronic equipment and computer-readable storage medium
CN110191266B (en) Data processing method and device, electronic equipment and computer readable storage medium
CN109213610B (en) Data processing method and device, computer readable storage medium and electronic equipment
US20200065562A1 (en) Method and Device for Processing Image, Computer Readable Storage Medium and Electronic Device
CN109993115A (en) Image processing method, device and wearable device
CN111523499B (en) Image processing method, apparatus, electronic device, and computer-readable storage medium
JP2021501517A (en) Systems and methods for improving the signal-to-noise ratio in object tracking under low illumination conditions
CN108924426B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108711054B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN111601373B (en) Backlight brightness control method and device, mobile terminal and storage medium
WO2020024576A1 (en) Camera calibration method and apparatus, electronic device, and computer-readable storage medium
CN108833887B (en) Data processing method and device, electronic equipment and computer readable storage medium
EP3621294B1 (en) Method and device for image capture, computer readable storage medium and electronic device
EP3605393A1 (en) Image correction due to deviations caused by temperature changes of camera light emitter
CN109120846B (en) Image processing method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant