US11335028B2 - Control method based on facial image, related control device, terminal and computer device - Google Patents
- Publication number: US11335028B2
- Authority
- US
- United States
- Legal status: Active, expires (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G03B7/08 — Control of exposure effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
- G06T7/521 — Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
- G06T7/536 — Depth or shape recovery from perspective effects, e.g. by using vanishing points
- G06T7/55 — Depth or shape recovery from multiple images
- G06V40/161 — Human faces: detection; localisation; normalisation
- G06V40/168 — Human faces: feature extraction; face representation
- H04N23/20 — Cameras or camera modules for generating image signals from infrared radiation only
- H04N23/45 — Cameras or camera modules for generating image signals from two or more image sensors being of different type or operating in different modes
- H04N23/56 — Cameras or camera modules provided with illuminating means
- H04N23/611 — Control of cameras or camera modules based on recognised objects, where the recognised objects include parts of the human body
- H04N23/74 — Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
- H04N5/2226 — Determination of depth image, e.g. for foreground/background separation
- H04N5/2256
- H04N5/23219
- G06F1/3231 — Monitoring the presence, absence or movement of users
- G06T2207/10028 — Range image; depth image; 3D point clouds
- G06T2207/30196 — Human being; person
- G06T2207/30201 — Face
- G06V2201/121 — Acquisition of 3D measurements of objects using special illumination
- G06V40/172 — Human faces: classification, e.g. identification
- G06V40/178 — Human faces: estimating age from face image; using age information for improving recognition
Definitions
- the present application relates to the field of image technology, and more particularly, to a control method of a camera module, a control device of a camera module, a terminal, a nonvolatile computer readable storage medium, and a computer device.
- a laser projector may emit laser light with preset pattern information and may project the laser light onto a target user in space.
- a laser light pattern reflected by the target user may be acquired by an imaging device, to further obtain a depth image of the target user.
- Embodiments of the present disclosure provide a control method of a camera module, a control device of a camera module, a terminal, a nonvolatile computer readable storage medium, and a computer device.
- the control method of a camera module includes: obtaining a projection distance between a target user and the camera module; and determining a control parameter of the camera module according to the projection distance and controlling the camera module based on the control parameter.
- the control device of a camera module includes a master distance obtaining module and a master control module.
- the master distance obtaining module is configured to obtain a projection distance between a target user and the camera module.
- the master control module is configured to determine a control parameter of the camera module according to the projection distance and control the camera module based on the control parameter.
- the terminal includes a camera module, a processor and a controller.
- the processor is configured to: obtain a projection distance between a target user and a camera module and determine a control parameter of the camera module according to the projection distance.
- the controller is configured to control the camera module based on the control parameter.
- the nonvolatile computer readable storage medium includes computer executable instructions.
- when the computer executable instructions are executed by one or more processors, the one or more processors are caused to execute the above control method.
- the computer device includes a memory and a processor.
- the memory is configured to store computer readable instructions.
- the processor is configured to perform the above control method.
- FIG. 1 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 2 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 3 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 4 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 5 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 6 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 7 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 8 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 9 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 10 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 11 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 12 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 13 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 14 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 15 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 16 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 17 is a schematic flowchart illustrating a control method according to some embodiments of the present disclosure.
- FIG. 18 is a schematic block diagram illustrating a control device according to some embodiments of the present disclosure.
- FIG. 19 is a schematic block diagram illustrating a control device according to some embodiments of the present disclosure.
- FIG. 20 is a schematic block diagram illustrating a second obtaining module of a control device according to some embodiments of the present disclosure.
- FIG. 21 is a schematic block diagram illustrating a second calculating unit of a control device according to some embodiments of the present disclosure.
- FIG. 22 is a schematic block diagram illustrating a second calculating unit of a control device according to some embodiments of the present disclosure.
- FIG. 23 is a schematic block diagram illustrating a second calculating unit of a control device according to some embodiments of the present disclosure.
- FIG. 24 is a schematic block diagram illustrating a control device according to some embodiments of the present disclosure.
- FIG. 25 is a schematic block diagram illustrating a control device according to some embodiments of the present disclosure.
- FIG. 26 is a schematic block diagram illustrating a third obtaining module of a control device according to some embodiments of the present disclosure.
- FIG. 27 is a schematic block diagram illustrating a control device according to some embodiments of the present disclosure.
- FIG. 28 is a schematic block diagram illustrating a control device according to some embodiments of the present disclosure.
- FIG. 29 is a schematic block diagram illustrating a control device according to some embodiments of the present disclosure.
- FIG. 30 is a schematic block diagram illustrating a control device according to some embodiments of the present disclosure.
- FIG. 31 is a schematic block diagram illustrating a fifth obtaining module of a control device according to some embodiments of the present disclosure.
- FIG. 32 is a schematic block diagram illustrating a control device according to some embodiments of the present disclosure.
- FIG. 33 is a schematic block diagram illustrating a control device according to some embodiments of the present disclosure.
- FIG. 34 is a schematic block diagram illustrating a sixth obtaining module of a control device according to some embodiments of the present disclosure.
- FIG. 35 is a schematic block diagram illustrating a terminal according to some embodiments of the present disclosure.
- FIG. 36 is a schematic block diagram illustrating a terminal according to some embodiments of the present disclosure.
- FIG. 37 is a schematic block diagram illustrating a terminal according to some embodiments of the present disclosure.
- FIG. 38 is a schematic block diagram illustrating a terminal according to some embodiments of the present disclosure.
- FIG. 39 is a diagram illustrating an application scenario of a control method according to some embodiments of the present disclosure.
- FIG. 40 is a schematic block diagram illustrating a computer device according to some embodiments of the present disclosure.
- FIG. 41 is a block diagram illustrating a computer device according to some embodiments of the present disclosure.
- FIG. 42 is a structural diagram illustrating a laser projector according to some embodiments of the present disclosure.
- FIG. 43 is a structural diagram illustrating a part of a laser projector according to some embodiments of the present disclosure.
- FIG. 44 is a structural diagram illustrating a part of a laser projector according to some embodiments of the present disclosure.
- FIG. 45 is a structural diagram illustrating a part of a laser projector according to some embodiments of the present disclosure.
- the present disclosure provides a control method, a control device, a terminal, a computer device, and a storage medium.
- the present disclosure provides a control method of a camera module 60 .
- the control method includes the following.
- a projection distance between a target user and the camera module 60 is obtained.
- a control parameter of the camera module 60 is determined according to the projection distance and the camera module 60 is controlled based on the control parameter.
- control device 10 may be applied to a computer device.
- the computer device may be a phone, a tablet computer, a notebook computer, a smart wristband, a smart watch, a smart helmet, smart glasses, or a game machine.
- the terminal 100 according to implementations of the present disclosure may also be one of computer devices.
- the camera module 60 includes a laser projector 30 .
- the control parameter includes projection power.
- the block 001 of obtaining the projection distance between the target user and the camera module 60 may include the following.
- a facial image of the target user captured with a preset parameter is obtained.
- a projection distance between the target user and the laser projector 30 is obtained according to the facial image.
- the block 002 of determining the control parameter of the camera module 60 according to the projection distance and controlling the camera module 60 based on the control parameter may include the following.
- the laser projector 30 is controlled to emit laser light with corresponding projection power, according to the projection distance.
- Embodiments of the present disclosure are described exemplarily by taking a phone as a terminal 100 .
- the terminal 100 includes a camera module 60 , a processor 40 and a controller 50 .
- the camera module 60 of the terminal 100 includes an image capturing device 20 and a laser projector 30 .
- the image capturing device 20 may be a visible camera to obtain an RGB facial image of the target user.
- the image capturing device 20 may also be an infrared camera to obtain an infrared facial image of the target user.
- there may be a plurality of image capturing devices 20 . For example, there may be two visible cameras, two infrared cameras, or both a visible camera and an infrared camera.
- the preset parameter may be a focal length of the image capturing device 20 when capturing the facial image. Therefore, faces captured by the image capturing device 20 at the same projection distance have the same size in the facial image.
- the laser projector 30 is configured to project a laser light pattern onto the target user.
- the image capturing device 20 may be further configured to capture a laser light pattern modulated by the target user and further generate a depth image of the target user for identity identification, dynamic capturing, and the like.
- the laser light may be infrared. When the projection power of the laser light is high or the projection distance from the target user to the laser projector 30 is short, the laser light emitted by the laser projector 30 may harm the user, for example, burning human eyes.
- the projection distance may be a distance between the user's face and a light exit surface of the laser projector 30 .
- a light entrance surface of the image capturing device 20 may be in a same plane as the light exit surface of the laser projector 30 .
- the controller 50 may be configured to control the laser projector 30 to emit the laser light with the corresponding projection power according to the projection distance.
- the control method according to implementations illustrated in FIG. 2 may control the laser projector 30 to emit the laser light with the corresponding projection power according to the projection distance between the target user and the laser projector 30 , thereby preventing the user from being harmed by excessive projection power of the laser projector 30 .
- the block 012 may include the following.
- a first ratio of the face to the facial image is determined.
- the projection distance is calculated based on the first ratio.
- a face region and a background region may be divided in the facial image by extracting and analyzing feature points of the face.
- the first ratio may be obtained by calculating a ratio of the number of pixels contained in the face region to the number of pixels contained in the facial image. It can be understood that the larger the first ratio, the closer the target user is to the image capturing device 20 . That is, when the target user is close to the laser projector 30 , the projection distance is small. Therefore, the laser projector 30 is configured to emit the laser light with low projection power, to prevent the emitted laser light from being strong enough to burn the user's eyes. Meanwhile, when the first ratio is small, it is indicated that the target user is far away from the image capturing device 20 .
- the laser projector 30 is configured to project the laser light with high projection power.
- in this way, after the laser light pattern is projected onto the target user and reflected by the target user, the laser light pattern still has appropriate intensity for forming a depth image.
- a face with a largest area is selected among the plurality of the faces as the face region to calculate the first ratio, and regions occupied by other faces are used as a part of the background region.
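The first-ratio computation described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the face regions are assumed to arrive as pixel counts from an upstream face detector, and the function name is hypothetical.

```python
# Hypothetical sketch of computing the "first ratio": the ratio of face-region
# pixels to total facial-image pixels, selecting the largest face when several
# are detected (other faces count as part of the background region).

def first_ratio(face_region_pixel_counts, image_width, image_height):
    """Return the first ratio for the largest detected face region."""
    if not face_region_pixel_counts:
        raise ValueError("no face detected in the facial image")
    largest_face = max(face_region_pixel_counts)
    total_pixels = image_width * image_height
    return largest_face / total_pixels

# A 640x480 image with two detected faces of 30,000 and 76,800 pixels:
print(first_ratio([30000, 76800], 640, 480))  # 76800 / 307200 = 0.25
```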
- the projection distance and the first ratio may be calibrated in advance.
- the user is directed to obtain the facial image at a preset projection distance.
- a calibration ratio corresponding to the facial image is calculated.
- a correspondence between the preset projection distance and the calibration ratio is stored, so as to calculate the projection distance according to an actual first ratio in subsequent operations.
- for example, the user is directed to capture the facial image at a projection distance of about 30 cm, and the calibration ratio corresponding to the facial image is calculated as about 45%.
- when the first ratio is calculated as R in actual measurement, based on a triangle similarity principle it may be derived that R / 45% = 30 cm / D, i.e., D = 30 cm × 45% / R, where D is the actual projection distance calculated based on the first ratio R. In this way, based on the first ratio of the face region to the facial image, the actual projection distance of the target user may be objectively reflected.
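The single-ratio calibration can be sketched in a few lines. The calibration pair (30 cm, 45%) comes from the example above; the constant and function names are illustrative.

```python
# Sketch of the single-ratio calibration: triangle similarity gives
# R / CAL_RATIO = CAL_DISTANCE_CM / D, hence D = CAL_DISTANCE_CM * CAL_RATIO / R.

CAL_DISTANCE_CM = 30.0  # preset calibration projection distance
CAL_RATIO = 0.45        # calibration ratio measured at that distance

def projection_distance_cm(first_ratio):
    if first_ratio <= 0:
        raise ValueError("first ratio must be positive")
    return CAL_DISTANCE_CM * CAL_RATIO / first_ratio

print(projection_distance_cm(0.45))  # 30.0 cm at the calibration ratio
print(projection_distance_cm(0.90))  # 15.0 cm: larger ratio, closer user
```

Note the inverse relation: doubling the first ratio halves the estimated distance, which matches the observation that a larger face region means a closer user.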
- the block 0122 of calculating the projection distance based on the first ratio includes the following.
- a second ratio of a preset feature region of the face to the face contained in the facial image is calculated.
- the projection distance is calculated based on the first ratio and the second ratio.
- the second ratio is a ratio of the preset feature region of the face to the face.
- a feature region with a small difference among different individuals may be selected as the preset feature region.
- the preset feature region may be the distance between the eyes of the user.
- when the second ratio is large, it is indicated that the user's face is small. Therefore, the projection distance calculated only based on the first ratio is overestimated.
- when the second ratio is small, it is indicated that the user's face is large. Therefore, the projection distance calculated only based on the first ratio is underestimated.
- the first ratio, the second ratio and the projection distance may be calibrated in advance.
- the user is directed to photograph the facial image at the preset projection distance.
- the first calibration ratio and the second calibration ratio corresponding to the facial image are calculated.
- the correspondence among the preset projection distance, the first calibration ratio and the second calibration ratio are stored, so as to calculate a projection distance according to an actual first ratio and an actual second ratio in subsequent operations.
- for example, the user is directed to obtain a facial image at the projection distance of about 25 cm, and the first calibration ratio corresponding to the facial image is calculated as about 50% and the second calibration ratio as about 10%.
- when the first ratio is calculated as R1 and the second ratio is calculated as R2 in actual measurement, based on a triangle similarity principle it may be derived that R1 / 50% = 25 cm / D1, i.e., D1 = 25 cm × 50% / R1, where D1 is an initial projection distance calculated based on the actually measured first ratio R1.
- a calibration projection distance D2 may be further calculated based on an equation
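The two-ratio estimate can be sketched as follows. The initial distance D1 follows the triangle-similarity relation from the 25 cm / 50% calibration example. The text does not spell out the D2 equation, so the correction below is an assumption: a measured second ratio larger than the calibrated 10% indicates a smaller-than-average face, so D1 is scaled down proportionally (D2 = D1 × R2_cal / R2).

```python
# Hedged sketch of the two-ratio calibration. The D2 correction formula is an
# assumption consistent with the stated over-/underestimation behavior, not a
# formula quoted from the patent.

CAL_DISTANCE_CM = 25.0
CAL_FIRST_RATIO = 0.50
CAL_SECOND_RATIO = 0.10

def initial_distance_cm(r1):
    # D1 = 25 cm * 50% / R1, from triangle similarity
    return CAL_DISTANCE_CM * CAL_FIRST_RATIO / r1

def calibrated_distance_cm(r1, r2):
    # Assumed correction: scale D1 by CAL_SECOND_RATIO / R2
    return initial_distance_cm(r1) * CAL_SECOND_RATIO / r2

print(initial_distance_cm(0.50))            # 25.0 cm at the calibration ratio
print(calibrated_distance_cm(0.50, 0.10))   # 25.0 cm: average face, no correction
print(calibrated_distance_cm(0.50, 0.125))  # 20.0 cm: smaller face, shorter distance
```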
- the block 0122 of calculating the projection distance based on the first ratio includes the following.
- the projection distance is calculated based on the first ratio and a preset distance coefficient.
- a healthy condition of the user's eyes may be characterized by whether the user wears the glasses.
- when the user wears glasses, it is indicated that the user's eyes may already suffer from related eye diseases or have poor eyesight.
- the preset distance coefficient may be between 0 and 1, such as 0.6, 0.78, 0.82 or 0.95.
- the initial projection distance or the calibration projection distance is multiplied by the distance coefficient to obtain a final projection distance.
- the final projection distance is used as the projection distance and used to calculate the projection power. In this way, in particular, it is possible to prevent a user who suffers from eye diseases or has poor eyesight from being harmed by high power of the projected laser light.
- the distance coefficient may be unfixed.
- the distance coefficient may be automatically adjusted according to intensity of visible light or infrared light in an ambient environment.
- an average value of intensity of the visible light of all the pixels contained in the facial image may be calculated. Different average values correspond to different distance coefficients. In detail, the larger the average value, the smaller the distance coefficient is.
- an average value of intensity of the infrared light of all the pixels contained in the facial image may be calculated. Different average values correspond to different distance coefficients. The larger the average value, the smaller the distance coefficient is, and the smaller the average value, the larger the distance coefficient is.
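The ambient-light-adaptive coefficient can be sketched as a simple monotonic mapping. The passage only states the relation (larger average intensity, smaller coefficient) and that the coefficient lies between 0 and 1; the breakpoints and the 0.6/0.78/0.82/0.95 values below reuse the example coefficients above and are otherwise illustrative assumptions.

```python
# Sketch of an ambient-light-adaptive distance coefficient: the brighter the
# visible or infrared light averaged over the facial image, the smaller the
# coefficient. Intensity breakpoints are invented for the example.

def distance_coefficient(pixel_intensities):
    """Map the mean pixel intensity (0-255) to a coefficient in (0, 1)."""
    mean = sum(pixel_intensities) / len(pixel_intensities)
    if mean < 64:
        return 0.95
    if mean < 128:
        return 0.82
    if mean < 192:
        return 0.78
    return 0.60

print(distance_coefficient([10, 20, 30]))     # dim scene  -> 0.95
print(distance_coefficient([200, 220, 240]))  # bright scene -> 0.60
```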
- the block 0122 of calculating the projection distance based on the first ratio includes the following.
- an age of the target user is determined according to the facial image.
- the projection distance is adjusted according to the first ratio and the age.
- an amount, a distribution and an area of the feature points of facial wrinkles can be extracted from the facial image to determine the age of the target user.
- the age of the target user may be determined by extracting the number of the wrinkles around the eyes, or further in combination with the number of wrinkles at forehead of the user.
- a proportional coefficient may be obtained according to the age of the user. In detail, a correspondence between the age and the proportional coefficient can be found in a lookup table.
- for example, when the age is under 15, the proportional coefficient is about 0.6; when the age is between 15 and 20, the proportional coefficient is about 0.8; when the age is between 20 and 45, the proportional coefficient is about 1.0; when the age is 45 or more, the proportional coefficient is about 0.8.
- the initial projection distance calculated based on the first ratio, or the calibration projection distance calculated based on the first ratio and the second ratio, may be multiplied by the proportional coefficient to obtain the final projection distance.
- the final projection distance is used as the projection distance and used to calculate the projection power. In this way, it is possible to avoid, in particular, the user of a low age or a high age from being harmed due to the high power of the projected laser light.
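The age-based scaling can be sketched directly from the lookup table quoted above. The coefficient values come from the passage; the function names are illustrative.

```python
# Sketch of the age-to-proportional-coefficient lookup and its application.
# Scaling the distance down for very young and older users makes the power
# calculation treat them as closer than measured, lowering the emitted power.

def proportional_coefficient(age):
    if age < 15:
        return 0.6
    if age < 20:
        return 0.8
    if age < 45:
        return 1.0
    return 0.8

def final_projection_distance_cm(distance_cm, age):
    return distance_cm * proportional_coefficient(age)

print(final_projection_distance_cm(30.0, 12))  # 18.0 cm for a child
print(final_projection_distance_cm(30.0, 30))  # 30.0 cm for a 30-year-old
```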
- the block 001 of obtaining the projection distance between the target user and the camera module 60 includes the following.
- the projection distance between the target user and the laser projector 30 is obtained.
- the block 002 of determining the control parameter of the camera module 60 according to the projection distance and controlling the camera module 60 based on the control parameter includes the following.
- projection power corresponding to the laser projector 30 is obtained according to the projection distance.
- the laser projector 30 is controlled to emit the laser light with the projection power.
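A distance-to-power mapping of the kind described can be sketched as a step function. The patent only requires that power decrease as the projection distance shrinks; the thresholds, wattages, and the cut-off behavior below are assumptions for illustration.

```python
# Illustrative sketch of choosing projection power from projection distance.
# All numeric values are invented for the example.

def projection_power_mw(distance_cm, max_power_mw=100.0):
    if distance_cm < 10.0:
        return 0.0  # assumed safety cut-off: too close, do not project
    if distance_cm < 30.0:
        return 0.3 * max_power_mw
    if distance_cm < 60.0:
        return 0.6 * max_power_mw
    return max_power_mw

print(projection_power_mw(5.0))   # 0.0 mW, projector kept off
print(projection_power_mw(45.0))  # 60.0 mW at a medium distance
```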
- Embodiments of the present disclosure are described exemplarily by taking a phone as a terminal 100 .
- the terminal 100 includes a distance detecting module 70 , a laser projector 30 , a processor 40 and a controller 50 .
- the distance detecting module 70 may be configured to obtain the projection distance between the target user and the laser projector 30 .
- a specific form of the distance detecting module 70 is not limited herein, which may be any distance detecting device.
- the distance detecting module 70 may actively send a detection signal to the target user and receive a detection signal reflected by the target user to obtain the projection distance.
- the distance detecting module 70 may be a proximity sensor or a depth camera.
- the distance detecting module 70 may also directly receive a detection signal sent by the target user or a detection signal reflected by the target user, to obtain the projection distance.
- the distance detecting module 70 may include a plurality of imaging devices. Images of the target user captured at different angles by the plurality of imaging devices are processed to obtain depth information (the projection distance) of the target user.
- the laser projector 30 is configured to project the laser light pattern onto a target user.
- the laser light pattern modulated by the target user may be captured by a target camera, to further generate a depth image of the target user for identity identification and dynamic capturing.
- the laser light may be infrared.
- When the projection power of the laser light is too large or the projection distance between the target user and the laser projector 30 is too small, the laser light emitted by the laser projector 30 may harm the user, for example, burning the human eyes.
- the projection distance may be a distance between the user's face and a light exit surface of the laser projector 30 .
- the laser projector 30 may be a part of the distance detecting module 70 . That is, the laser projector 30 may be configured to detect the projection distance.
- the distance detecting module 70 further includes the target camera.
- the laser projector 30 may project the laser light pattern onto the target user with rated power.
- the target camera is further configured to capture the laser light pattern modulated by the target user, to further obtain the depth information of the target user, i.e., the projection distance.
- the processor 40 may be configured to obtain the projection power corresponding to the laser projector 30 according to the projection distance.
- the processor 40 may be configured to search for the projection power corresponding to the projection distance from a pre-stored correspondence table between the projection distance and the projection power.
- the controller 50 is configured to control the laser projector 30 to emit the laser light with the corresponding projection power.
- the controller 50 may be configured to control emission power of a light source of the laser projector 30 .
- the light source may be a vertical cavity surface emitting laser (VCSEL) or a distributed feedback laser (DFB).
- the control method of the implementation illustrated in FIG. 7 may control the laser projector 30 to emit the laser light with the corresponding projection power, according to the projection distance between the target user and the laser projector 30 , thereby preventing the user from being harmed due to the high projection power of the laser projector 30 .
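As a sketch of the table lookup performed by the processor 40, the pre-stored correspondence between projection distance and projection power might look as follows; the breakpoints and power values are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical pre-stored correspondence table between projection distance
# and projection power; the breakpoints and values are assumed.
POWER_TABLE = [
    (0.10, 100),  # projection distance below 0.10 m -> 100 mW
    (0.30, 300),  # 0.10 m to 0.30 m -> 300 mW
    (0.60, 600),  # 0.30 m to 0.60 m -> 600 mW
]
RATED_POWER_MW = 1000  # beyond the last breakpoint, the rated power is used

def projection_power(distance_m: float) -> int:
    """Search the pre-stored table for the power matching a distance."""
    for max_distance_m, power_mw in POWER_TABLE:
        if distance_m < max_distance_m:
            return power_mw
    return RATED_POWER_MW
```

The controller 50 would then drive the light source (e.g., the VCSEL) with the returned power.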
- control method further includes the following.
- a scene image of a target scene is obtained.
- the laser projector 30 of the terminal 100 may be suitable for a variety of different occasions for different purposes. In these occasions, it is possible that a projection target of the laser projector 30 does not involve a human body.
- the laser projector 30 may be configured to measure a distance between the terminal 100 and a certain object, or to perform three-dimensional mapping of a certain object. At this time, the laser projector 30 may emit light with rated power, without obtaining the projection power by detecting in real time the projection distance, which is beneficial to reduce power consumption of the terminal 100 .
- the image capturing device 20 may be a visible camera to capture an RGB scene image of the target scene.
- the image capturing device 20 may also be an infrared camera to capture an infrared scene image of the target scene.
- the processor 40 is configured to obtain the scene image captured by the image capturing device 20 , and determine whether a feature point similar to a facial feature is contained in the scene image to determine whether a face exists in the scene image, and further determine whether a human body exists in the target scene. When it is determined by the processor 40 that a face exists in the scene image, the projection distance between the target user and the laser projector 30 is obtained.
- the processor 40 may transmit a judgment result whether a face exists to the distance detecting module 70 .
- the distance detecting module 70 may obtain the projection distance between the target user and the laser projector 30 . In this way, it is possible to prevent the human body within a projection range from being harmed by the laser projector 30 when the laser projector 30 is working with the high projection power.
- the block 021 includes the followings.
- a detection signal is transmitted to the target user.
- the projection distance is calculated based on a detection signal reflected by the target user.
- the detection signal may be infrared light.
- the distance detecting module 70 may include a transmitter 71 , a receiver 72 and a calculator 73 .
- the transmitter 71 is configured to transmit the infrared light to the target user.
- the infrared light is partially reflected by the target user and received by the receiver 72 .
- the calculator 73 may be configured to calculate the projection distance based on the reflected infrared light received by the receiver 72 .
- the calculator 73 may be configured to calculate a path length of the infrared light from being transmitted to being received based on the intensity of the infrared light received by the receiver 72 . A half of the path length is taken as the projection distance.
- the calculator 73 may be also configured to calculate the projection distance based on time information of receiving by the receiver 72 the infrared light.
- a total propagation time of the infrared light between the terminal 100 and the target user is a difference between the time when the transmitter 71 transmits the infrared light and the time when the receiver 72 receives the infrared light.
- the projection distance may be calculated by multiplying a half of the time difference by a propagation speed of the infrared light in the air.
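The time-of-flight calculation described in the two bullets above can be sketched as follows (function and variable names are illustrative):

```python
# Propagation speed of infrared light in the air, approximated by the
# speed of light in vacuum (m/s).
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_projection_distance(t_transmit_s: float, t_receive_s: float) -> float:
    """Projection distance = half the transmit-to-receive time difference
    multiplied by the propagation speed of the infrared light."""
    round_trip_s = t_receive_s - t_transmit_s
    return (round_trip_s / 2.0) * SPEED_OF_LIGHT_M_PER_S
```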
- the detection signal may also be another type of detection signal, such as an ultrasonic wave, which is not limited herein.
- control method further includes the following.
- a captured image of the target user is obtained.
- an age of the target user is calculated according to the captured image.
- the projection power of the laser projector is adjusted according to the age.
- an amount, a distribution and an area of the feature points of facial wrinkles may be extracted from the facial image to determine the age of the target user.
- the age of the target user may be determined by extracting the number of wrinkles around the eyes, or further in combination with the number of wrinkles on the forehead of the user.
- a proportional coefficient may be obtained according to the age of the user. In detail, a correspondence between the age and the proportional coefficient may be found in a lookup table.
- For example, when the age is under 15, the proportional coefficient is about 0.6; when the age is between 15 and 20, the proportional coefficient is about 0.8; when the age is between 20 and 45, the proportional coefficient is about 1.0; and when the age is 45 or more, the proportional coefficient is about 0.8.
- an initial projection power calculated based on the projection distance may be multiplied by the proportional coefficient to obtain final projection power.
- the controller 50 may control the laser projector 30 to emit the laser light with the final projection power. In this way, it is possible to prevent, in particular, a user of a low age or a high age from being harmed by the projected laser light with high power.
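The age-band lookup and power scaling just described can be sketched as follows; the handling of the exact band boundaries (e.g., whether age 20 falls in the 0.8 or 1.0 band) is an assumption:

```python
def age_coefficient(age: int) -> float:
    """Proportional coefficient for the age bands given above."""
    if age < 15:
        return 0.6
    if age < 20:   # between 15 and 20
        return 0.8
    if age < 45:   # between 20 and 45
        return 1.0
    return 0.8     # 45 or more

def final_projection_power(initial_power_mw: float, age: int) -> float:
    """Multiply the distance-derived initial projection power by the
    age-based proportional coefficient."""
    return initial_power_mw * age_coefficient(age)
```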
- control method further includes the following.
- a captured image of the target user is obtained.
- a healthy condition of the user's eyes may be characterized by whether the user wears the glasses.
- When the user wears glasses, it is indicated that the user's eyes may already suffer from related eye diseases or have poor eyesight.
- When the laser light is projected onto a user wearing glasses, it is required to project the laser light with low power to prevent the eyes of the user from being harmed.
- an initial projection power calculated based on the projection distance may be multiplied by a preset adjustment coefficient to obtain the final projection power.
- the preset adjustment coefficient may be a coefficient between 0 and 1, such as 0.6, 0.78, 0.82 or 0.95.
- the controller 50 may control the laser projector 30 to emit the laser light based on the final projection power. In this way, it is possible, in particular, to prevent a user suffering from eye diseases or having poor eyesight from being harmed by the high power of the projected laser light.
- the adjustment coefficient may be unfixed.
- the adjustment coefficient may be automatically adjusted according to intensity of visible light or infrared light in an ambient environment.
- an average value of intensity of the visible light of all the pixels contained in the captured image may be calculated. Different average values correspond to different adjustment coefficients. In detail, the larger the average value, the smaller the adjustment coefficient is.
- an average value of intensity of the infrared light of all the pixels contained in the captured image may be calculated. Different average values correspond to different adjustment coefficients. The larger the average value, the smaller the adjustment coefficient is; and the smaller the average value, the larger the adjustment coefficient is.
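The ambient-light-dependent adjustment coefficient could be sketched as below; the linear mapping and the coefficient bounds (0.6 to 0.95, taken from the example values above) are assumptions, since the disclosure only requires that a larger average intensity yield a smaller coefficient:

```python
def adjustment_coefficient(avg_intensity: float,
                           max_intensity: float = 255.0,
                           lo: float = 0.6,
                           hi: float = 0.95) -> float:
    """Map the average pixel intensity of the captured image to an
    adjustment coefficient between lo and hi: the larger the average
    intensity, the smaller the coefficient."""
    frac = min(max(avg_intensity / max_intensity, 0.0), 1.0)
    return hi - (hi - lo) * frac
```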
- the camera module 60 includes the laser projector 30 .
- the block 001 of obtaining the projection distance between the target user and the camera module 60 includes the following.
- the projection distance between the target user and the laser projector 30 is obtained.
- the block 002 of determining the control parameter of the camera module 60 according to the projection distance and controlling the camera module 60 based on the control parameter includes the following.
- an energy density produced when the laser projector 30 emits the laser light with a preset parameter at a current distance is obtained according to the projection distance.
- the preset parameter is adjusted according to the energy density and a safety energy threshold and the laser projector 30 is controlled based on the adjusted preset parameter.
- Embodiments of the present disclosure are described exemplarily by taking a phone as a terminal 100 .
- the terminal 100 includes an image capturing device 20 and a laser projector 30 .
- the laser projector 30 is configured to project a laser light pattern onto the target user.
- the image capturing device 20 may be an infrared camera. There may be one or more infrared cameras, for capturing a laser light pattern modulated by the target user, and further generating a depth image of the target user for identity identification, dynamic capturing or the like.
- the laser light may be infrared.
- the preset parameter of the laser projector 30 includes projection power, projection frequency or the like. It can be understood that the laser projector 30 projects the laser light pattern onto the target user based on a certain preset parameter.
- the human body or the human eyes have a threshold of receiving the energy density of laser light within a unit time. Generally, when the projection power of the laser light is too large or the projection distance between the target user and the laser projector 30 is too small and the energy density received by the human eye is greater than the threshold, the laser light emitted by the laser projector 30 may harm the user, such as burning the human eyes.
- the projection distance may be a distance between the user's face and a light exit surface of the laser projector 30 .
- a light entrance surface of the image capturing device 20 may be in the same plane as the light exit surface of the laser projector 30 .
- the projection distance between the target user and the laser projector 30 is inversely related to the energy density received by the human eyes.
- the distance between the target user and the laser projector 30 may be detected by various sensors of the terminal 100 , such as a proximity sensor, a depth camera, a front RGB camera, or a laser projector 30 , coordinated with the image capturing device, which is not limited herein.
- the processor 40 processes received data from an associated sensor or hardware to obtain the distance between the target user and the laser projector 30 .
- under a given preset parameter and projection distance, the energy density received by the human eyes is substantially fixed.
- a correspondence among the preset parameter, the projection distance and the energy density may be measured before being shipped, and may be stored in a memory of the terminal 100 .
- the energy density of the laser projector at the current distance emitting the laser with the preset parameter may be obtained by searching the preset correspondence between the projection distance and the energy density.
- the controller 50 may be configured to adjust the parameter of the laser projector 30 according to the energy density and the safety energy threshold acceptable by the human eye at the current projection distance, and to control the laser projector 30 to emit the laser light based on the adjusted parameter.
- the adjustment of the parameter may involve one or more of a plurality of parameters, which is not limited herein.
- the control method according to the implementation illustrated as FIG. 12 may control the projection parameter of the laser projector 30 according to the projection distance between the target user and the laser projector 30 and the projection energy density, thereby preventing the user from being harmed since the projection energy density of the laser projector 30 is too high.
- control method further includes the following.
- a scene image of a target scene is obtained.
- the image capturing device 20 may be a visible camera, such as a front camera of the terminal 100 . It can be understood that the laser projector 30 is disposed on a front panel of the terminal 100 and is faced towards the target user.
- the image capturing device 20 can obtain an RGB image of the target scene.
- the image capturing device 20 can also be an infrared camera to obtain an infrared image of the target scene.
- the preset parameter may be a focal length of the image capturing device 20 when capturing the scene image, to ensure that sizes of elements contained in a same scene at same projection distances and captured by the image capturing device 20 are the same.
- the block 041 includes the following.
- a facial image of a target user captured with a preset parameter is obtained.
- a ratio of a face to the facial image is calculated.
- a projection distance between the target user and a laser projector is calculated based on the ratio.
- the facial image may be divided into a face region and a background region by extracting and analyzing feature points of the face.
- the ratio may be obtained by calculating a ratio of the number of pixels in the face region to the number of pixels in the facial image. It can be understood that when the ratio is large, the target user is relatively close to the image capturing device 20 . That is, when the target user is close to the laser projection module 30 , the projection distance is small. At this time, the projection parameter of the laser projector 30 needs to be adjusted, in case the emitted laser light is too strong and the energy density of the projected laser light becomes too large and harms the user's eyes. Meanwhile, when the ratio is small, it is indicated that the target user is far away from the image capturing device 20 .
- When the target user is far away from the laser projection module 30 , the projection distance is large, and the energy density received by the human eyes is small.
- a face with a largest area is selected among the plurality of faces as the face region to calculate the ratio, and regions occupied by other faces are used as a part of the background region.
- the projection distance and the ratio may be calibrated in advance.
- the user is directed to obtain the facial image at a preset projection distance.
- a calibration ratio corresponding to the facial image is calculated.
- a correspondence between the preset projection distance and the calibration ratio is stored, in order to calculate the projection distance according to an actual ratio in subsequent operations.
- When the user is directed to obtain the facial image at a projection distance of 30 cm, the calibration ratio corresponding to the facial image is calculated as about 45%.
- When the ratio in an actual measurement is calculated as R, based on the principle of similar triangles, it may be derived that R/45% = 30 cm/D, i.e., D = 30 cm × 45%/R, where D is the actual projection distance calculated based on the ratio R in the actual measurement. In this way, based on the ratio of the face to the facial image, the actual projection distance of the target user can be objectively reflected.
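Using the 30 cm / 45% calibration example above, the distance calculation can be sketched as follows (names are illustrative):

```python
CAL_DISTANCE_CM = 30.0  # preset calibration projection distance
CAL_RATIO = 0.45        # calibration ratio of the face to the facial image

def projection_distance_from_ratio(ratio: float) -> float:
    """Similar-triangles relation R / CAL_RATIO = CAL_DISTANCE / D,
    hence D = CAL_DISTANCE * CAL_RATIO / R."""
    return CAL_DISTANCE_CM * CAL_RATIO / ratio
```

A larger face-to-image ratio therefore yields a smaller projection distance, consistent with the inverse relation described above.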
- the block 043 includes the following.
- the projection power of the laser projector 30 is reduced when the energy density is greater than the safety energy threshold.
- the projection power of the laser projector 30 may be reduced when the energy density, received by the user, of the laser light projected by the laser projector 30 with the preset parameter is greater than the safety energy threshold.
- the correspondence between the projection distance and the projection power can be determined when being shipped and stored in advance in a memory.
- the projection power corresponding to the projection distance not only enables the laser light pattern projected by the laser projector 30 onto the target user to be captured by the image capturing device, but also enables the energy density received by the human eyes to be lower than the safety energy threshold after the power is adjusted.
- the projection power corresponding to the projection distance is searched for from the pre-stored correspondence between the projection distance and the projection power, and the laser projector 30 is controlled to emit the laser light with the corresponding projection power, or the projection power corresponding to the projection distance is calculated in combination with a preset conversion coefficient.
- for example, when the conversion coefficient is K and the projection distance is D, the projection power corresponding to the projection distance may be calculated from K and D.
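The conversion-coefficient variant is truncated in the text; assuming the simplest reading, projection power proportional to projection distance and capped at the rated power, it might look like:

```python
def power_from_conversion_coefficient(k: float, distance: float,
                                      rated_power_mw: float = 1000.0) -> float:
    """Assumed form P = K * D, clamped so the computed power never exceeds
    the projector's rated power (the product form and the cap are
    assumptions, not stated in the disclosure)."""
    return min(k * distance, rated_power_mw)
```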
- the block 043 includes the following.
- a projection frequency of the laser projector 30 is reduced when the energy density is greater than the safety energy threshold.
- the projection frequency is reduced when the energy density, received by the user, of the laser light emitted by the laser projector 30 with the preset parameter is greater than the safety energy threshold.
- the laser projector 30 emits the laser light in pulses and reduces energy accumulated by reducing the projection frequency, thereby reducing the energy density received by the human eyes.
- the block 043 includes the following.
- a pulse width of the laser light projected by the laser projector 30 is reduced, when the energy density is greater than the safe energy threshold.
- the pulse width of the laser light emitted is reduced when the energy density, received by the user, of the laser light emitted by the laser projector 30 with the preset parameter is greater than the safe energy threshold.
- the laser projector 30 emits the laser light in pulses, and reduces the pulse width of the laser light to reduce energy in a single projection, thereby reducing the energy density received by the human eyes similar to reducing the projection power.
- the block 043 further includes the following.
- the laser projector 30 is controlled to turn off, when the projection distance is less than a safety distance threshold.
- When the projection distance is less than the safety distance threshold, the laser projector 30 may be temporarily turned off. When it is detected that the projection distance is greater than the safety distance threshold, the laser projector 30 is turned on to operate.
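Taken together, the block-043 variants (reduce power, reduce frequency, reduce pulse width, or turn off below the safety distance) can be sketched as one adjustment step; the thresholds and the 0.5 reduction factor are illustrative assumptions:

```python
SAFETY_ENERGY_THRESHOLD = 1.0  # normalized safety energy threshold (assumed)
SAFETY_DISTANCE_CM = 3.0       # safety distance threshold (assumed value)

def adjust_preset_parameter(distance_cm: float, energy_density: float,
                            params: dict) -> dict:
    """Turn the projector off below the safety distance; otherwise, when
    the energy density exceeds the safety threshold, reduce projection
    power, projection frequency and pulse width."""
    params = dict(params)  # do not mutate the caller's parameters
    if distance_cm < SAFETY_DISTANCE_CM:
        params["on"] = False
        return params
    params["on"] = True
    if energy_density > SAFETY_ENERGY_THRESHOLD:
        params["power_mw"] *= 0.5
        params["frequency_hz"] *= 0.5
        params["pulse_width_s"] *= 0.5
    return params
```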
- the camera module 60 includes a laser projector 30 and a target camera.
- the control parameter includes a capturing frame rate of the target camera and/or transmission power (i.e., projection power) of the laser projector 30 .
- the block 001 of obtaining the projection distance between the target user and the camera module 60 includes the following.
- a projection distance between a face of the target user and the target camera is obtained at a preset time interval, when the target camera is on.
- the block 002 of determining the control parameter of the camera module 60 according to the projection distance and controlling the camera module 60 based on the control parameter includes the following.
- the capturing frame rate of the target camera and/or the transmission power of the laser projector 30 is adjusted, according to the projection distance.
- the laser projector 30 is controlled to emit the laser light with the transmission power, and the target camera is controlled to capture the target image at the capturing frame rate.
- Embodiments of the present disclosure are described exemplarily by taking a phone as a terminal 100 .
- the target camera is the laser camera 61 .
- the laser projector 30 may be controlled by the terminal 100 to emit the laser light, such that the target image may be captured by the target camera.
- the laser light emitted by the laser projector 30 may cause certain damage to the human eyes, and the closer the projection distance is, the more serious the damage to the face is.
- the terminal 100 may obtain the projection distance between the face and the target camera at the preset time interval.
- the preset time interval for capturing may be set according to actual requirements, for example, 30 milliseconds, 1 second or the like.
- the projection distance between the face and the target camera may also be understood as the projection distance between the face and the terminal 100 , or the projection distance between the face and the laser projector 30 .
- the terminal 100 may adjust the time interval for capturing according to a change degree of the projection distance between the face and the target camera. After obtaining a current projection distance between the face and the target camera, the terminal 100 may obtain the latest previous projection distance between the face and the target camera, calculate a difference between the current projection distance and the previous projection distance, and adjust the time interval for capturing according to the difference.
- a large difference indicates that a change of the projection distance between the face and the target camera is great.
- a capturing time period may be reduced and a capturing frequency may be increased.
- a small difference indicates that a change of the projection distance between the face and the target camera is small.
- the capturing time period may be increased and the capturing frequency may be reduced. According to the change of the projection distance between the face and the target camera, the time interval for capturing may be adjusted. Therefore, the projection distance between the face and the target camera may be obtained accurately and timely.
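The adaptive capture interval described above might be sketched as follows; the difference threshold and the halve/double policy are assumptions (the text only requires shortening the interval for large changes and lengthening it for small ones):

```python
def next_capture_interval_ms(current_cm: float, previous_cm: float,
                             interval_ms: float,
                             diff_threshold_cm: float = 5.0,
                             min_ms: float = 30.0,
                             max_ms: float = 1000.0) -> float:
    """Halve the time interval for capturing when the projection distance
    changed a lot since the previous measurement, double it when the
    change is small, and clamp the result to [min_ms, max_ms]."""
    diff = abs(current_cm - previous_cm)
    new_interval = interval_ms / 2.0 if diff > diff_threshold_cm else interval_ms * 2.0
    return min(max(new_interval, min_ms), max_ms)
```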
- the terminal 100 may obtain the depth information of the face according to the target image.
- a line perpendicular to an imaging plane and passing through a lens center is Z axis.
- the Z value is the depth information of the object in the imaging plane of the camera.
- the terminal 100 may determine the projection distance between the face and the target camera according to the depth information of the face.
- a distance sensor may be disposed on the terminal 100 .
- the projection distance between the face and the target camera may be collected by the distance sensor. It may be understood that the terminal 100 may also obtain the projection distance between the face and the target camera in other manners, which is not limited to the above manner.
- the terminal 100 may adjust the capturing frame rate of the target camera and/or the transmission power of the laser projector 30 according to the projection distance between the face and the target camera.
- the capturing frame rate refers to a frequency of capturing the target image via the target camera within a certain time, for example, 1 frame per second, 3 frames per second.
- the transmission power of the laser projector 30 may be intensity of the laser light emitted. The higher the transmission power, the higher the intensity of the laser light emitted.
- When the projection distance between the face and the target camera is small, the capturing frame rate of the target camera may be reduced and/or the transmission power of the laser projector 30 may be reduced.
- Times of projection from the laser projector 30 within a certain period of time may be reduced by reducing the capturing frame rate of the target camera.
- Intensity of the laser light emitted by the laser projector 30 may be reduced by reducing the transmission power of the laser projector 30 . Therefore, the damage to the human eyes caused by the laser light emitted by the laser projector 30 may be reduced.
- the laser projector 30 may be controlled to emit the laser light according to the adjusted transmission power, and the target camera may be controlled to capture the target image at the capturing frame rate.
- the projection distance between the face and the target camera is obtained at the preset time interval.
- the capturing frame rate of the target camera and/or the transmission power of the laser projector 30 is adjusted according to the projection distance.
- the laser projector 30 is controlled to emit the laser light according to the transmission power.
- the target camera is controlled to capture the target image according to the capturing frame rate.
- the capturing frame rate of the target camera and/or the transmission power of the laser projector 30 can be dynamically adjusted according to the projection distance between the face and the target camera, thereby reducing the damage to the human eyes caused by the laser light emitted by the laser projector 30 and protecting the human eyes.
- the block 052 of adjusting the capturing frame rate of the target camera and/or the transmission power of the laser projector 30 according to the projection distance includes: reducing the capturing frame rate of the target camera, when the projection distance is less than a first distance threshold and greater than a second distance threshold.
- the first distance threshold and the second distance threshold may be set by the terminal 100 .
- the first distance threshold may be greater than the second distance threshold.
- the first distance threshold may be a security distance at which the laser light emitted by the laser projector 30 does not affect the human eyes.
- the terminal 100 may determine whether the projection distance between the face and the target camera is less than the first distance threshold after obtaining the projection distance between the face and the target camera. When the projection distance is smaller than the first distance threshold, it is indicated that the laser light emitted by the laser projector 30 may harm the human eyes.
- the terminal 100 may further determine whether the projection distance between the face and the target camera is greater than the second distance threshold.
- When the projection distance is less than the first distance threshold and greater than the second distance threshold, the terminal 100 can reduce only the capturing frame rate of the target camera without changing the transmission power of the laser projector 30 .
- the block 052 of adjusting the capturing frame rate of the target camera and/or the transmission power of the laser projector 30 according to the projection distance includes the following.
- the capturing frame rate of the target camera and the transmission power of the laser projector 30 are reduced, when the projection distance is less than or equal to the second distance threshold.
- a driving current of the laser projector 30 is reduced to a preset percentage of a rated driving current, when the projection distance is less than or equal to the second distance threshold and greater than a third distance threshold.
- the driving current of the laser projector 30 is reduced to be less than a current threshold, when the projection distance is less than or equal to the third distance threshold.
- the current threshold is less than the preset percentage of the rated driving current.
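The driving-current policy in the bullets above can be sketched as follows; the rated current, the 30% preset percentage (one of the example values mentioned in this section), the threshold distances and the deep-reduction current are illustrative assumptions:

```python
SECOND_THRESHOLD_CM = 10.0      # second distance threshold (assumed)
THIRD_THRESHOLD_CM = 3.0        # third distance threshold (assumed)
RATED_CURRENT_MA = 100.0        # rated driving current (assumed)
PRESET_PERCENT = 0.30           # preset percentage of the rated current (assumed)
CURRENT_THRESHOLD_MA = 15.0     # current threshold, below PRESET_PERCENT * rated
DEEP_REDUCED_CURRENT_MA = 12.0  # kept below CURRENT_THRESHOLD_MA

def driving_current_ma(distance_cm: float) -> float:
    """Rated current above the second threshold; a preset percentage of
    the rated current between the second and third thresholds; and a
    current below the current threshold at or under the third threshold."""
    if distance_cm > SECOND_THRESHOLD_CM:
        return RATED_CURRENT_MA
    if distance_cm > THIRD_THRESHOLD_CM:
        return RATED_CURRENT_MA * PRESET_PERCENT
    return DEEP_REDUCED_CURRENT_MA
```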
- the terminal 100 may reduce the capturing frame rate of the target camera and reduce the transmission power of the laser projector 30 .
- the terminal 100 may determine whether the projection distance is greater than the third distance threshold.
- When the projection distance is greater than the third distance threshold, the capturing frame rate of the target camera may be reduced and the transmission power of the laser projector 30 may be reduced.
- Reducing the transmission power of the laser projector 30 may be reducing the driving current of the laser projector 30 .
- the terminal 100 may reduce the driving current of the laser projector 30 to the preset percentage of the rated driving current.
- the preset percentage can be set according to the practical requirements, for example, 30%, 20%.
- the rated driving current refers to a normal driving current of the laser projector 30 when a distance between the face and the target camera is the security distance. When the projection distance between the face and the target camera is less than or equal to the third distance threshold, it is indicated that the laser light emitted by the laser projector 30 may cause serious damage to the human eyes.
- the terminal 100 can reduce the capturing frame rate of the target camera and reduce the transmission power of the laser projector 30 , such that the driving current of the laser projector 30 may be reduced to be less than the current threshold.
- the current threshold can be less than the preset percentage of the rated driving current.
- the terminal 100 can greatly reduce the transmission power of the laser projector 30 , thereby protecting the human eyes to a greatest extent.
- the block 052 includes recovering the capturing frame rate of the target camera to a standard frame rate and recovering the transmission power of the laser projector 30 to the rated power when the projection distance is greater than or equal to the first distance threshold.
- the terminal 100 may set the standard frame rate of the target camera and the rated power of the laser projector 30 .
- When the projection distance between the face and the target camera is greater than or equal to the first distance threshold, it is indicated that the face is at the security distance from the target camera.
- the laser projector 30 can be controlled to emit the laser light according to the rated power and the target camera may be controlled to capture the target image at the standard frame rate.
- the terminal 100 can establish a relational function among a projection distance range and the capturing frame rate of the target camera and the transmission power of the laser projector 30 .
- Different projection distance ranges correspond to different capturing frame rates of the target camera and different transmission power of the laser projector 30 .
- when the projection distance is greater than or equal to the first distance threshold, the capturing frame rate of the target camera is the standard frame rate, i.e., 30 frames/second, and the transmission power of the laser projector 30 is the rated power, i.e., 1000 mW.
- when the projection distance is less than the first distance threshold and greater than the second distance threshold, the capturing frame rate of the target camera is about 1 frame/second, and the transmission power of the laser projector 30 is about 1000 mW.
- when the projection distance is less than or equal to the second distance threshold of 10 cm and greater than the third distance threshold of 3 cm, the capturing frame rate of the target camera is about 1 frame/second and the transmission power of the laser projector 30 is about 300 mW.
- when the projection distance is less than or equal to the third distance threshold of 3 cm, the capturing frame rate of the target camera is about 1 frame/second and the transmission power of the laser projector 30 is about 125 mW.
- the projection distance range can be set according to practical requirements, and the capturing frame rate of the target camera and the transmission power of the laser projector 30 corresponding to the projection distance range can also be set according to practical requirements, which are not limited to the above conditions.
- the human eyes may be protected to the greatest extent and a loss of the captured target image may be minimized.
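The range-to-parameter mapping described above can be sketched as a table-driven lookup. This is an illustrative sketch rather than the patented implementation: the value of the first distance threshold is not stated in the text, so `FIRST_THRESHOLD_CM` is an assumed placeholder, while the 10 cm and 3 cm thresholds and the frame-rate/power pairs follow the figures given above.

```python
# Sketch of the projection-distance-range lookup described above.
# The second (10 cm) and third (3 cm) thresholds come from the text;
# FIRST_THRESHOLD_CM is an assumed placeholder, as its value is not stated.
FIRST_THRESHOLD_CM = 20.0
SECOND_THRESHOLD_CM = 10.0
THIRD_THRESHOLD_CM = 3.0

STANDARD_FRAME_RATE = 30  # frames/second
RATED_POWER_MW = 1000     # rated transmission power of the laser projector


def control_parameters(projection_distance_cm):
    """Return (capturing frame rate, transmission power in mW) for a distance."""
    if projection_distance_cm >= FIRST_THRESHOLD_CM:
        # Security distance: standard frame rate and rated power.
        return STANDARD_FRAME_RATE, RATED_POWER_MW
    if projection_distance_cm > SECOND_THRESHOLD_CM:
        # Reduce only the capturing frame rate.
        return 1, RATED_POWER_MW
    if projection_distance_cm > THIRD_THRESHOLD_CM:
        # Also reduce the transmission power.
        return 1, 300
    # Closest range: lowest transmission power.
    return 1, 125
```

At a preset time interval the terminal would recompute the distance, look up the pair, and apply it to the camera and projector.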
- the terminal 100 may obtain the projection distance between the face and the target camera at the preset time interval to determine the projection distance range to which the projection distance belongs, and obtain the capturing frame rate of the target camera and the transmission power of the laser projector 30 corresponding to the projection distance range.
- the laser projector 30 may be controlled to emit the laser light according to the corresponding transmission power.
- the target camera may be controlled to capture the target image at the corresponding capturing frame rate.
- the capturing frame rate of the target camera and/or the transmission power of the laser projector 30 may be dynamically adjusted according to the projection distance between the face and the target camera, thereby ensuring that a normal target image may be obtained, reducing the damage to the human eyes caused by the laser light emitted by the laser projector 30 , and protecting the security of the human eyes.
- the block 051 includes the following.
- a depth image is calculated according to a captured target speckle image and a stored reference speckle image.
- the reference speckle image is a stored speckle image for calibrating the camera.
- the reference speckle image carries reference depth information.
- a ratio of an effective value region of the depth image to the depth image is determined.
- the projection distance between the face and the target camera is obtained according to the ratio.
- the control method further includes obtaining the projection distance between the face and the target camera collected by the distance sensor, when the projection distance obtained according to the ratio is less than the first distance threshold.
- the target image captured by the target camera may include the target speckle image.
- the terminal 100 may obtain the captured target speckle image and the reference speckle image, and compare the target speckle image with the reference speckle image to obtain the depth image. Therefore, the depth information of the face can be obtained from the depth image.
- the terminal 100 may select a pixel block of a preset size, for example 31 × 31 pixels, centered on each pixel point contained in the target speckle image, and search the reference speckle image for a pixel block matching the selected pixel block.
- the terminal 100 can find two points, respectively from the target speckle image and the reference speckle image, that are on a same laser light path and contained in the selected pixel block of the target speckle image and contained in the matching pixel block of the reference speckle image. Speckle information of the two points on the same laser light path is identical to each other. The two points on the same laser light path can be identified as corresponding pixels. The terminal 100 may calculate an offset between the two corresponding pixels that are respectively from the target speckle image and the reference speckle image on the same laser light path. The terminal 100 may obtain the depth information of each pixel point contained in the target speckle image according to the offset value, so as to obtain the depth image including the depth information of each pixel point of the target speckle image.
- the terminal 100 calculates the offset value between the target speckle image and the reference speckle image and obtains the depth information of each pixel point included in the target speckle image according to the offset value.
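The block-matching step can be sketched as a sum-of-absolute-differences search along the horizontal line. This is a minimal illustration assuming grayscale NumPy arrays and a purely horizontal offset; the 31 × 31 window follows the text, while the search range is an assumption.

```python
import numpy as np


def block_offset(target, reference, x, y, half=15, search=32):
    """Find the horizontal offset (in pixels) at which the block centered on
    (x, y) in `target` best matches `reference`, by sum of absolute
    differences. half=15 gives the 31x31 window mentioned in the text;
    `search` is an assumed maximum disparity."""
    block = target[y - half:y + half + 1, x - half:x + half + 1]
    best_offset, best_cost = 0, np.inf
    for d in range(-search, search + 1):
        cx = x + d
        # Skip candidate windows that fall outside the reference image.
        if cx - half < 0 or cx + half + 1 > reference.shape[1]:
            continue
        cand = reference[y - half:y + half + 1, cx - half:cx + half + 1]
        cost = np.abs(block.astype(np.int32) - cand.astype(np.int32)).sum()
        if cost < best_cost:
            best_cost, best_offset = cost, d
    return best_offset
```

Running this for every pixel of the target speckle image yields the per-pixel offset map from which the depth image is computed.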
- the calculating may be according to equation (1):
- Z_D = (L × f × Z_0) / (L × f + Z_0 × P)   (1)
- Z_D is the depth information of the pixel point, i.e., the depth value of the pixel point.
- L is the distance between the laser camera 61 and the laser projector 30 (i.e., the laser 64 ).
- f is the focal length of the lens of the laser camera 61 .
- Z_0 is the depth value between the reference plane and the laser camera 61 of the terminal 100 when the reference speckle image is captured.
- P is an offset value between corresponding pixels respectively contained in the target speckle image and the reference speckle image.
- P can be obtained by multiplying the pixel offset between the target speckle image and the reference speckle image by the actual distance of a single pixel point.
- when the object is farther from the camera than the reference plane, P is negative; when the object is nearer than the reference plane, P is positive.
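Applied per pixel, equation (1) can be sketched as follows. The values of L, f and Z_0 used here are illustrative calibration values, not figures from the text, and P is assumed to be expressed in the same length unit as L and f.

```python
def depth_from_offset(P_mm, L_mm=20.0, f_mm=4.0, Z0_mm=500.0):
    """Equation (1): Z_D = (L * f * Z0) / (L * f + Z0 * P).

    L_mm, f_mm and Z0_mm are illustrative calibration values; P_mm is the
    pixel offset converted to millimetres (positive when the object is
    nearer than the reference plane, negative when farther)."""
    return (L_mm * f_mm * Z0_mm) / (L_mm * f_mm + Z0_mm * P_mm)
```

Note that a zero offset recovers the reference depth Z_0, and the sign of P moves the result to either side of the reference plane, consistent with the formula.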
- the terminal 100 may determine the projection distance between the face and the target camera according to the depth value of each pixel point in the depth image.
- the terminal 100 may perform face identification on the target speckle image to determine a face region and may extract the depth value of each pixel point included in the face region.
- the terminal 100 may calculate an average depth value of the face region and determine the projection distance between the face and the target camera according to the average depth value. For example, when the average depth value of the face region is 50 cm, it may be determined that the projection distance between the face and the target camera is about 50 cm.
- the terminal 100 may also select a pixel block in the middle of the target speckle image. For example, a pixel block of a size of 25 × 25 pixels in the middle is selected and the average depth value of that pixel block is calculated. The average depth value of the pixel block may be used as the projection distance between the face and the target camera.
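The center-block variant above can be sketched as follows, assuming the depth image is a NumPy array of depth values in centimetres (the 25 × 25 block size follows the text):

```python
import numpy as np


def center_block_distance(depth_cm, size=25):
    """Estimate the projection distance as the average depth of a
    size x size pixel block in the middle of the depth image."""
    h, w = depth_cm.shape
    top, left = (h - size) // 2, (w - size) // 2
    block = depth_cm[top:top + size, left:left + size]
    return float(block.mean())
```

The face-region variant works the same way, except that the averaged pixels are those inside the identified face region rather than a fixed central block.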
- the terminal 100 can detect the effective value region of the depth image.
- the effective value region refers to a region occupied by pixel points having a depth value greater than a preset effective value.
- the effective value can be set according to practical requirements.
- the terminal 100 may determine a ratio of the effective value region of the depth image to the depth image and obtain the projection distance between the face and the target camera according to the ratio. When the projection distance between the face and the target camera is too small, in the depth image, pixel points of the face may have no depth value or have small depth values.
- the terminal 100 may establish a correspondence between the ratio of the effective value region of the depth image to the depth image and the projection distance and may transform the ratio of the effective value region of the depth image to the depth image into the projection distance between the face and the target camera according to the correspondence.
- for example, when the ratio of the effective value region of the depth image to the depth image is about 80%, the corresponding projection distance is about 20 cm.
- when the ratio of the effective value region of the depth image to the depth image is less than 80%, it may be determined that the projection distance between the face and the target camera is less than 20 cm.
- when the ratio of the effective value region of the depth image to the depth image is 100%, the projection distance between the face and the target camera can be directly determined according to the depth value of each pixel point in the depth image.
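A sketch of the ratio-to-distance conversion follows. Only one calibration point (a ratio of about 80% corresponding to about 20 cm) is given above, so the linear scaling used here is an assumed stand-in for a calibrated correspondence.

```python
def distance_from_ratio(ratio, depth_based_distance_cm=None):
    """Map the effective-value-region ratio to a projection distance in cm.

    Only one calibration point (80% -> ~20 cm) is given in the text; the
    linear scaling below is an assumed, illustrative correspondence that
    would in practice be established by calibration."""
    if ratio >= 1.0 and depth_based_distance_cm is not None:
        # Full effective region: use the depth values directly.
        return depth_based_distance_cm
    # Assumed correspondence: a ratio of 0.8 maps to 20 cm, linearly scaled.
    return 20.0 * (ratio / 0.8)
```

Smaller ratios thus map to shorter projection distances, matching the observation that a face too close to the camera leaves much of the depth image without valid depth values.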
- the terminal 100 may be provided with a distance sensor.
- the projection distance between the face and the target camera may be collected by the distance sensor.
- the terminal 100 may calculate the depth information of the face according to the target speckle image captured by the target camera and determine the projection distance between the face and the target camera according to the depth information.
- the terminal 100 may obtain the projection distance between the face and the target camera collected by the distance sensor.
- the projection distance may be calculated initially based on the depth information.
- when the projection distance from the face is relatively short, the projection distance may be obtained again through the distance sensor. Therefore, the projection distance between the face and the target camera can be obtained accurately.
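The fallback logic can be sketched as below; `read_distance_sensor` is a hypothetical callable standing in for the terminal's distance sensor, and the first-threshold value is assumed for illustration.

```python
FIRST_THRESHOLD_CM = 20.0  # assumed value; the text does not state it


def projection_distance(depth_estimate_cm, read_distance_sensor):
    """Use the speckle-based depth estimate when the face is far enough;
    fall back to the distance sensor when the estimate falls below the
    first distance threshold, where depth values become unreliable."""
    if depth_estimate_cm < FIRST_THRESHOLD_CM:
        return read_distance_sensor()
    return depth_estimate_cm
```

The design choice is to trust the cheap image-based estimate at normal distances and spend a sensor read only in the short-range regime where it matters for eye safety.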
- the embodiment illustrated as FIG. 17 can accurately calculate the projection distance between the face and the target camera, so that the capturing frame rate of the target camera and/or the transmission power of the laser projector 30 can be dynamically adjusted with changes in the distance, thereby protecting security of the human eyes.
- although the blocks in the flowcharts illustrated in FIGS. 16 and 17 are displayed sequentially as indicated by the arrows, these blocks are not necessarily performed in the order indicated by the arrows. Unless specified explicitly, the execution of these blocks is not strictly limited in sequence, and the blocks may be performed in other orders. Moreover, at least a part of the blocks in the schematic flowcharts above may include multiple sub-steps or multiple stages, which are not necessarily performed at the same time and may be executed at different times. These sub-steps or stages are not necessarily performed sequentially, and may be performed alternately with at least a portion of other steps, or with sub-steps or stages of other steps.
- the present disclosure further provides a control device 10 of a camera module 60 .
- the control method described above according to the present disclosure may be implemented by the control device 10 of the present disclosure.
- the control device 10 includes a master distance obtaining module 101 and a master control module 102 .
- the block 001 may be implemented by the master distance obtaining module 101 .
- the block 002 may be implemented by the master control module 102 . That is, the master distance obtaining module 101 may be configured to obtain a projection distance between a target user and the camera module 60 .
- the master control module 102 may be configured to determine a control parameter of the camera module 60 based on the projection distance and to control the camera module 60 based on the control parameter.
- the master distance obtaining module 101 includes a first obtaining module 111 and a second obtaining module 112 .
- the master control module 102 includes a first control module 113 .
- the block 011 may be implemented by the first obtaining module 111 .
- the block 012 may be implemented by the second obtaining module 112 .
- the block 013 may be implemented by the first control module 113 . That is, the first obtaining module 111 may be configured to obtain a facial image of the target user captured with the preset parameter.
- the second obtaining module 112 may be configured to obtain a projection distance between the target user and a laser projector 30 according to the facial image.
- the first control module 113 may be configured to control the laser projector 30 to emit laser light with corresponding projection power according to the projection distance.
- the second obtaining module 112 includes a first calculating unit 1121 and a second calculating unit 1122 .
- the block 0121 may be implemented by the first calculating unit 1121 .
- the block 0122 may be implemented by the second calculating unit 1122 . That is, the first calculating unit 1121 may be configured to calculate a first ratio of the face to the facial image.
- the second calculating unit 1122 may be configured to calculate the projection distance based on the first ratio.
- the second calculating unit 1122 includes a first calculating subunit 11221 and a second calculating subunit 11222 .
- the block 01221 may be implemented by the first calculating subunit 11221 .
- the block 01222 may be implemented by the second calculating subunit 11222 . That is, the first calculating subunit 11221 is configured to calculate a second ratio of a preset feature region of the face to the face contained in the facial image.
- the second calculating subunit 11222 is configured to calculate the projection distance based on the first ratio and the second ratio.
- the second calculating unit 1122 includes a first determining subunit 11223 and a third calculating subunit 11224 .
- the first determining subunit 11223 may be configured to implement the block 01223 .
- the third calculating subunit 11224 may be configured to implement the block 01224 . That is, the first determining subunit 11223 may be configured to determine whether the target user wears glasses according to the facial image.
- the third calculating subunit 11224 may be configured to calculate the projection distance based on the first ratio and a preset distance coefficient, when the target user wears the glasses.
- the second calculating unit 1122 includes a second determining subunit 11225 and an adjusting subunit 11226 .
- the second determining subunit 11225 may be configured to implement the block 01225 .
- the adjusting subunit 11226 may be configured to implement the block 01226 . That is, the second determining subunit 11225 may be configured to determine an age of the target user according to the facial image.
- the adjusting subunit 11226 may be configured to adjust the projection distance according to the first ratio and the age.
- the master distance obtaining module 101 includes a third obtaining module 211 , a fourth obtaining module 212 and a second control module 213 .
- the block 021 may be implemented by the third obtaining module 211 .
- the block 022 may be implemented by the fourth obtaining module 212 .
- the block 023 may be implemented by the second control module 213 . That is, the third obtaining module 211 may be configured to obtain a projection distance between the target user and a laser projector 30 .
- the fourth obtaining module 212 may be configured to obtain projection power corresponding to the laser projector 30 according to the projection distance.
- the second control module 213 may be configured to control the laser projector 30 to emit laser light with the projection power.
- the control device 10 further includes a seventh obtaining module 214 and a first determining module 215 .
- the seventh obtaining module 214 may be configured to implement the block 024 .
- the first determining module 215 may be configured to implement the block 025 . That is, the seventh obtaining module 214 may be configured to obtain a scene image of a target scene.
- the first determining module 215 may be configured to determine whether a face exists in the scene image. When the face exists in the scene image, the third obtaining module 211 is configured to obtain the projection distance between the target user and the laser projector 30 .
- the third obtaining module 211 includes a transmitting unit 2111 and a receiving unit 2112 .
- the transmitting unit 2111 may be configured to implement the block 0211 .
- the receiving unit 2112 may be configured to implement the block 0212 . That is, the transmitting unit 2111 may be configured to transmit a detection signal to the target user.
- the receiving unit 2112 may be configured to calculate the projection distance based on a detection signal reflected by the target user.
- the control device 10 further includes an eighth obtaining module 216 , a first calculating module 217 and a first adjusting module 218 .
- the eighth obtaining module 216 may be configured to implement the block 026 .
- the first calculating module 217 may be configured to implement the block 027 .
- the first adjusting module 218 may be configured to implement the block 028 . That is, the eighth obtaining module 216 may be configured to obtain a captured image of the target user.
- the first calculating module 217 may be configured to calculate an age of the target user according to the captured image.
- the first adjusting module 218 may be configured to adjust the projection power of the laser projector 30 according to the age.
- the control device 10 further includes a ninth obtaining module 219 , a second determining module 230 and a second adjusting module 231 .
- the ninth obtaining module 219 may be configured to implement the block 029 .
- the second determining module 230 may be configured to implement the block 030 .
- the second adjusting module 231 may be configured to implement the block 031 . That is, the ninth obtaining module 219 may be configured to obtain a captured image of the target user.
- the second determining module 230 may be configured to determine whether the target user wears glasses according to the captured image.
- the second adjusting module 231 may be configured to reduce the projection power of the laser projector 30 , when the user wears the glasses.
- the master distance obtaining module 101 includes a fifth obtaining module 411 .
- the master control module 102 includes a second calculating module 412 and a third control module 413 .
- the block 041 may be implemented by the fifth obtaining module 411 .
- the block 042 may be implemented by the second calculating module 412 .
- the block 043 may be implemented by the third control module 413 . That is, the fifth obtaining module 411 may be configured to obtain the projection distance between the target user and the laser projector 30 .
- the second calculating module 412 may be configured to obtain an energy density when the laser projector 30 at a current distance emits the laser light with a preset parameter, according to the projection distance.
- the third control module 413 may be configured to adjust the preset parameter according to the energy density and a safety energy threshold and to control the laser projector 30 based on the adjusted preset parameter.
- the control device 10 further includes an image obtaining module 414 and a third determining module 415 .
- the block 044 may be implemented by the image obtaining module 414 .
- the block 045 may be implemented by the third determining module 415 .
- the block 046 may be implemented by the fifth obtaining module 411 . That is, the image obtaining module 414 is configured to obtain a scene image of a target scene.
- the third determining module 415 is configured to determine whether a face exists in the scene image.
- the fifth obtaining module 411 is configured to obtain the projection distance between the target user and the laser projector 30 , when the face exists in the scene image.
- the fifth obtaining module 411 includes a first obtaining unit 4111 , a third calculating unit 4112 and a fourth calculating unit 4113 .
- the block 0411 can be implemented by the first obtaining unit 4111 .
- the block 0412 can be implemented by the third calculating unit 4112 .
- the block 0413 can be implemented by the fourth calculating unit 4113 . That is, the first obtaining unit 4111 may be configured to obtain a facial image of a target user captured with the preset parameter.
- the third calculating unit 4112 may be configured to calculate a ratio of a face to the facial image.
- the fourth calculating unit 4113 may be configured to calculate a projection distance based on the ratio.
- the third control module 413 includes a first control unit 4131 .
- the block 0431 can be implemented by the first control unit 4131 . That is, the first control unit 4131 is configured to reduce the projection power of the laser projector 30 when the energy density is greater than the safety energy threshold.
- the third control module 413 includes a second control unit 4132 .
- the block 0432 may be implemented by the second control unit 4132 . That is, the second control unit 4132 is configured to reduce a projection frequency of the laser projector 30 when the energy density is greater than the safety energy threshold.
- the third control module 413 includes a third control unit 4133 .
- the block 0433 may be implemented by the third control unit 4133 . That is, the third control unit 4133 is configured to reduce a pulse width of the laser light projected by the laser projector 30 , when the energy density is greater than the safety energy threshold.
- the third control module 413 includes a fourth control unit 4134 .
- the block 0434 can be implemented by the fourth control unit 4134 . That is, the fourth control unit 4134 is configured to control the laser projector 30 to turn off, when the projection distance is greater than a safety distance threshold.
- the control device 10 further includes a sixth obtaining module 511 , a third adjusting module 512 and a fourth control module 513 .
- the block 051 can be implemented by the sixth obtaining module 511 .
- the block 052 can be implemented by the third adjusting module 512 .
- the block 053 can be implemented by the fourth control module 513 . That is, the sixth obtaining module 511 is configured to obtain a projection distance between a face of the target user and the target camera at a preset time interval, when the target camera is on.
- the third adjusting module 512 is configured to adjust the capturing frame rate of the target camera and/or the transmission power of the laser projector 30 according to the projection distance.
- the fourth control module 513 is configured to control the laser projector 30 to emit the laser light with the transmission power, and control the target camera to capture the target image at the capturing frame rate.
- the third adjusting module 512 is further configured to reduce the capturing frame rate of the target camera when the projection distance is less than a first distance threshold and greater than a second distance threshold.
- the third adjusting module 512 is further configured to reduce the capturing frame rate of the target camera and the transmission power of the laser projector 30 , when the projection distance is less than or equal to the second distance threshold, to reduce a driving current of the laser projector 30 to a preset percentage of a rated driving current, when the projection distance is less than or equal to the second distance threshold and greater than a third distance threshold, and to reduce the driving current of the laser projector 30 to be less than a current threshold, when the projection distance is less than or equal to the third distance threshold.
- the current threshold is less than the preset percentage of the rated driving current.
- the third adjusting module 512 is further configured to recover the capturing frame rate of the target camera to a standard frame rate and recover the transmission power of the laser projector 30 to the rated power, when the projection distance is greater than or equal to the first distance threshold.
- the sixth obtaining module 511 includes a fifth calculating unit 5111 , a determining unit 5112 and a second obtaining unit 5113 .
- the block 0511 can be implemented by the fifth calculating unit 5111 .
- the block 0512 can be implemented by the determining unit 5112 .
- the block 0513 can be implemented by the second obtaining unit 5113 . That is, the fifth calculating unit 5111 may be configured to calculate a depth image according to a captured target speckle image and a stored reference speckle image.
- the reference speckle image is a stored speckle image for calibrating the camera.
- the reference speckle image carries reference depth information.
- the determining unit 5112 may be configured to determine a ratio of an effective value region of the depth image to the depth image.
- the second obtaining unit 5113 may be configured to obtain the projection distance between the face and the target camera according to the ratio.
- the sixth obtaining module 511 is further configured to obtain the projection distance between the face and the target camera collected by the distance sensor, when the projection distance obtained according to the ratio is less than the first distance threshold.
- the present disclosure further provides a terminal 100 .
- the control method of the present disclosure may also be implemented by the terminal 100 of the present disclosure.
- the terminal includes a camera module 60 , a processor 40 and a controller 50 .
- the block 001 may be implemented by the processor 40 .
- the block 002 may be implemented by the processor 40 and the controller 50 together. That is, the processor 40 is configured to obtain the projection distance between the target user and the camera module 60 and to determine the control parameter of the camera module 60 according to the projection distance.
- the controller 50 is configured to control the camera module 60 based on the control parameter.
- the terminal 100 includes an image capturing device 20 .
- the camera module 60 includes a laser projector 30 .
- the image capturing device 20 may be configured to capture a facial image of the target user with a preset parameter.
- the block 011 and the block 012 may be implemented by a processor 40 .
- the block 013 may be implemented by a controller 50 . That is, the processor 40 may be configured to obtain a facial image of the target user captured by the image capturing device 20 with a preset parameter and to obtain a projection distance between the target user and the laser projector 30 according to the facial image.
- the controller 50 may be configured to control the laser projector 30 to emit laser light with corresponding projection power according to the projection distance.
- the processor 40 may be further configured to implement the block 0121 and the block 0122 . That is, the processor 40 may be configured to calculate a first ratio of the face to the facial image and calculate the projection distance based on the first ratio.
- the block 01221 and the block 01222 may also be implemented by the processor 40 . That is, the processor 40 is further configured to calculate a second ratio of a preset feature region of the face to the face contained in the facial image, and to calculate the projection distance based on the first ratio and the second ratio.
- the processor 40 is further configured to implement the block 01223 and the block 01224 . That is, the processor 40 is further configured to determine whether the target user wears glasses according to the facial image, and to calculate the projection distance based on the first ratio and a preset distance coefficient, when the target user wears the glasses.
- the processor 40 may be further configured to implement the block 01225 and the block 01226 . That is, the processor 40 is further configured to determine an age of the target user according to the facial image, and adjust the projection distance according to the first ratio and the age.
- the terminal 100 further includes a distance detecting module 70 .
- the distance detecting module 70 may be configured to implement the block 021 .
- the processor 40 may be configured to implement the block 022 .
- the controller 50 may be configured to implement the block 023 . That is, the distance detecting module 70 may be configured to obtain a projection distance between the target user and a laser projector 30 .
- the processor 40 may be configured to obtain projection power corresponding to the laser projector 30 according to the projection distance.
- the controller 50 may be configured to control the laser projector 30 to emit laser light with the projection power.
- the terminal 100 further includes the image capturing device 20 .
- the image capturing device 20 may be configured to capture a scene image of a target scene.
- the processor 40 may be further configured to implement the block 024 and the block 025 . That is, the processor 40 may be configured to obtain a scene image of a target scene and determine whether a face exists in the scene image. In detail, the processor 40 may be configured to obtain the scene image of the target scene captured by the image capturing device 20 .
- the distance detecting module 70 is configured to obtain the projection distance between the target user and the laser projector 30 .
- the distance detecting module 70 includes a transmitter 71 , a receiver 72 and a calculator 73 .
- the transmitter 71 may be configured to implement the block 0211 .
- the receiver 72 and the calculator 73 may be configured to implement the block 0212 together. That is, the transmitter 71 may be configured to transmit a detection signal to the target user.
- the receiver 72 may be configured to receive a detection signal reflected by the target user.
- the calculator 73 may be configured to calculate the projection distance based on the detection signal reflected by the target user.
- the terminal 100 further includes the image capturing device 20 .
- the image capturing device 20 may be configured to implement the block 026 .
- the processor 40 may also be configured to implement the block 027 and the block 028 . That is, the image capturing device 20 may be further configured to obtain a captured image of the target user.
- the processor 40 may be further configured to calculate an age of the target user according to the captured image, and adjust the projection power of the laser projector 30 according to the age.
- the terminal 100 further includes the image capturing device 20 .
- the image capturing device 20 may be configured to implement the block 029 .
- the processor 40 may also be configured to implement the blocks 030 and 031 . That is, the image capturing device 20 may be configured to obtain a captured image of the target user.
- the processor 40 may be configured to determine whether the target user wears the glasses according to the captured image, and to reduce the projection power of the laser projector 30 when the user wears the glasses.
- the block 041 and the block 042 may be implemented by the processor 40 .
- the block 043 may be implemented by the controller 50 . That is, the processor 40 may be configured to obtain the projection distance between the target user and the laser projector 30 , and to obtain an energy density when the laser projector 30 at a current distance emits the laser light with a preset parameter, according to the projection distance.
- the controller 50 may be configured to adjust the preset parameter according to the energy density and a safety energy threshold and to control the laser projector 30 based on the adjusted preset parameter.
- the image capturing device 20 of the terminal 100 is configured to implement the block 044 .
- the processor 40 is configured to implement the block 045 and the block 046 . That is, the image capturing device 20 is configured to obtain a scene image of a target scene.
- the processor 40 is configured to determine whether a face exists in the scene image and to obtain the projection distance between the target user and the laser projector 30 , when the face exists in the scene image.
- the processor 40 may be also configured to implement the block 0411 , the block 0412 , and the block 0413 . That is, the processor 40 may be configured to obtain a facial image of a target user captured by the image capturing device 20 with a preset parameter, to calculate a ratio of a face to the facial image, and to calculate a projection distance based on the ratio.
- the controller 50 is configured to reduce the projection power of the laser projector 30 when the energy density is greater than the safe energy threshold.
- the controller 50 is configured to reduce a projection frequency of the laser projector 30 when the energy density is greater than the safe energy threshold.
- the controller 50 is configured to reduce a pulse width of the laser light projected by the laser projector 30 , when the energy density is greater than the safe energy threshold.
- the controller 50 is configured to control the laser projector 30 to turn off, when the projection distance is less than a safety distance threshold.
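The controller decisions in the preceding blocks can be sketched as a simple dispatch. The threshold values and action names here are placeholders, not values from the disclosure:

```python
def control_laser(projection_distance_m, energy_density, *,
                  safety_distance_m=0.1, safe_energy_threshold=1.0):
    """Sketch of the controller 50's decisions (all thresholds are
    assumed placeholder values). Returns the action to apply."""
    if projection_distance_m < safety_distance_m:
        return "turn_off"  # too close: shut the projector down entirely
    if energy_density > safe_energy_threshold:
        # the text lists several interchangeable mitigations here
        return "reduce_power"  # or "reduce_frequency" / "reduce_pulse_width"
    return "keep_preset"
```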
- the processor 40 serves as a second processing unit.
- the controller 50 serves as the first processing unit.
- the terminal 100 includes a camera module 60 , the second processing unit 41 and the first processing unit 51 .
- the camera module 60 includes a laser camera 61 (i.e., the target camera), a floodlight 62 , an RGB (red/green/blue color mode) camera 63 , and a laser 64 (i.e., the laser projector 30 ).
- the second processing unit 41 can be a CPU (central processing unit) module.
- the first processing unit 51 can be an MCU (microcontroller unit) module or the like.
- the first processing unit 51 is coupled between the second processing unit 41 and the camera module 60 .
- the first processing unit 51 may be configured to control the laser camera 61 , the floodlight 62 and the laser 64 .
- the second processing unit 41 may be configured to control the RGB camera 63 .
- the laser camera 61 may be an infrared camera for obtaining an infrared image.
- the floodlight 62 is an area light source capable of emitting the infrared light.
- the laser 64 is a point light source capable of emitting the laser light that may form a pattern.
- the laser camera 61 may obtain an infrared image according to reflected light.
- the laser 64 emits the laser light
- the laser camera 61 may obtain a speckle image according to reflected light.
- the speckle image is an image whose pattern, after the laser light is reflected, is deformed relative to the pattern formed by the laser light emitted by the laser 64 .
- the second processing unit 41 may include a CPU core running in a TEE (trusted execution environment) and a CPU core running in an REE (rich execution environment). Both the TEE and the REE are operating modes of an ARM (advanced reduced instruction set computer (RISC) machines) module.
- a security level of the TEE is relatively high. Only one CPU core of the second processing unit 41 may run in the TEE at a time. Generally, an operation of a high security level on the terminal 100 needs to be performed in the CPU core under the TEE, while an operation of a low security level may be performed in the CPU core under the REE.
- the first processing unit 51 includes a PWM (pulse width modulation) module 52 , a SPI/I2C (serial peripheral interface/inter-integrated circuit) interface 53 , a RAM (random access memory) module 54 , and a depth engine 55 .
- the PWM module 52 may emit a pulse to the camera module 60 , to control the floodlight 62 or the laser 64 to turn on, so that the laser camera 61 may capture an infrared image or a speckle image.
- the SPI/I2C interface 53 is configured to receive a face capturing command sent by the second processing unit 41 .
- the depth engine 55 may be configured to process the speckle image to obtain a depth disparity map.
- the CPU core running in the TEE may send the face capturing command to the first processing unit 51 .
- the first processing unit 51 may control the floodlight 62 to turn on by transmitting a pulse wave through the PWM module 52 , such that the infrared image is captured by the laser camera 61 , and may control the laser 64 to turn on, such that the speckle image is captured by the laser camera 61 .
- the camera module 60 may transmit the captured infrared image and speckle image to the first processing unit 51 .
- the first processing unit 51 may process the received infrared image to obtain an infrared disparity map and process the received speckle images to obtain a speckle disparity map or a depth disparity map. Processing the infrared image and the speckle image by the first processing unit 51 refers to correcting the infrared image or the speckle image and removing influences of internal and external parameters of the camera module 60 on the images.
- the first processing unit 51 may be set to different modes. Images output by different modes are different. When the first processing unit 51 is set to a speckle image mode, the first processing unit 51 processes the speckle image to obtain a speckle disparity map. A target speckle image may be obtained according to the speckle disparity map.
- when the first processing unit 51 is set to a depth image mode, the first processing unit 51 processes the speckle image to obtain a depth disparity map.
- a depth image may be obtained according to the depth disparity map.
- the depth image refers to an image with depth information.
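The mode-dependent output of the first processing unit 51 described above can be summarized as follows; the string labels are illustrative, not identifiers from the patent:

```python
def disparity_output(mode):
    """Sketch of the two output modes of the first processing unit 51:
    a speckle image mode yields a speckle disparity map (from which a
    target speckle image is derived), while a depth image mode yields a
    depth disparity map (from which a depth image is derived)."""
    outputs = {
        "speckle": "speckle_disparity_map",
        "depth": "depth_disparity_map",
    }
    if mode not in outputs:
        raise ValueError(f"unknown mode: {mode}")
    return outputs[mode]
```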
- the first processing unit 51 may send the infrared disparity map and the speckle disparity map to the second processing unit 41 .
- the first processing unit 51 may also send the infrared disparity map and the depth disparity map to the second processing unit 41 .
- the second processing unit 41 may obtain a target infrared image according to the infrared disparity map and obtain a depth image according to the depth disparity map.
- the second processing unit 41 may perform face recognition, face matching, living body detection and obtain depth information of the detected face according to the target infrared image and the depth image.
- the first processing unit 51 may obtain the target infrared image according to the infrared disparity map, obtain the depth image through calculation according to the depth disparity map, and send the target infrared image and the depth image to the second processing unit 41 .
- Communication between the first processing unit 51 and the second processing unit 41 is done through a fixed security interface to ensure security of data transmission. As illustrated in FIG.
- the transmission of the data from the second processing unit 41 to the first processing unit 51 is done through the SECURE SPI/I2C 81
- the transmission of the data from the first processing unit 51 to the second processing unit 41 is done through the SECURE MIPI (mobile industry processor interface) 82 .
- the block 051 and the block 052 may be implemented by the second processing unit 41 .
- the block 053 may be implemented by the first processing unit 51 . That is, the second processing unit 41 may be configured to obtain the projection distance between a face of the target user and a target camera at a preset time interval when the target camera is on, and to adjust a capturing frame rate of the target camera and/or transmission power of the laser projector 30 according to the projection distance.
- the first processing unit 51 may be configured to control the laser projector 30 to emit the laser light with the transmission power and to control the target camera to capture the target image at the capturing frame rate.
- the target camera may be controlled to turn on to capture the target image.
- the target camera may refer to the laser camera 61 .
- the laser camera 61 may capture invisible images of different wavelengths.
- the target image may include, but is not limited to, an infrared image and a speckle image.
- the speckle image refers to an infrared image having a speckle pattern.
- the terminal 100 may turn the floodlight 62 on such that an infrared image is captured via the laser camera 61 and may turn a laser, such as the laser 64 , on such that the speckle image is captured by the laser camera 61 .
- the floodlight 62 may be a point light source that uniformly illuminates in all directions.
- the light emitted by the floodlight may be infrared.
- the laser camera may capture an infrared image by photographing the face.
- the laser light emitted by the laser projector 30 (i.e., the laser 64 ) may be diffracted by a lens and a DOE (diffractive optical element) to produce a pattern with speckle particles.
- the pattern with the speckle particles is projected onto the target object, and a shift in the speckle image is generated due to the different distances between points on the target object and the terminal 100 .
- the laser camera 61 photographs the target object to obtain the speckle image.
- the second processing unit 41 can obtain the projection distance between the face and the laser camera 61 at a preset time interval.
- the second processing unit 41 can adjust the capturing frame rate of the laser camera and/or the transmission power of the laser projector 30 according to the projection distance.
- the first processing unit 51 can control the laser 64 to emit the laser light according to the adjusted transmission power, and control the laser camera 61 to obtain the target image, such as the infrared image and the speckle image, according to the adjusted capturing frame rate.
- the second processing unit 41 is further configured to reduce the capturing frame rate of the target camera when the projection distance is less than a first distance threshold and greater than a second distance threshold.
- the second processing unit 41 is further configured to reduce the capturing frame rate of the target camera and the transmission power of the laser projector 30 , when the projection distance is less than or equal to the second distance threshold, to reduce a driving current of the laser projector 30 to a preset percentage of a rated driving current, when the projection distance is less than or equal to the second distance threshold and greater than a third distance threshold, and to reduce the driving current of the laser projector 30 to be less than a current threshold, when the projection distance is less than or equal to the third distance threshold.
- the current threshold is less than the preset percentage of the rated driving current.
- the second processing unit 41 is further configured to recover the capturing frame rate of the target camera to a standard frame rate and recover the transmission power of the laser projector 30 to the rated power, when the projection distance is greater than or equal to the first distance threshold.
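The threshold logic in the preceding blocks can be sketched as follows. The patent fixes only the ordering of the thresholds (first > second > third) and that the current threshold is below the preset percentage of the rated driving current, so every numeric value below is an assumption:

```python
def adjust_capture(projection_distance, *, d1=1.0, d2=0.5, d3=0.2,
                   standard_fps=30, reduced_fps=15,
                   rated_current_ma=500, preset_percent=0.6,
                   current_threshold_ma=100):
    """Sketch of the second processing unit 41's distance-based policy.
    Returns (capturing frame rate, driving current in mA)."""
    if projection_distance >= d1:
        # recover the standard frame rate and rated power
        return standard_fps, rated_current_ma
    if projection_distance > d2:
        # d2 < distance < d1: reduce the frame rate only
        return reduced_fps, rated_current_ma
    if projection_distance > d3:
        # d3 < distance <= d2: also reduce current to a preset percentage
        return reduced_fps, rated_current_ma * preset_percent
    # distance <= d3: drive current strictly below the current threshold
    return reduced_fps, current_threshold_ma * 0.9
```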
- the terminal 100 further includes a distance sensor.
- the distance sensor is coupled to the second processing unit 41 .
- the block 0511 can be implemented by the first processing unit 51 and the second processing unit 41 together.
- the block 0512 and the block 0513 can be implemented by the second processing unit 41 . That is, the first processing unit 51 is further configured to calculate the disparity image according to a captured target speckle image and a stored reference speckle image, and transmit the disparity image to the second processing unit 41 .
- the reference speckle image is a stored speckle image for calibrating the camera.
- the reference speckle image carries reference depth information.
- the second processing unit 41 is further configured to calculate a depth image according to the disparity image, and to determine a ratio of an effective value region of the depth image to the depth image.
- the second processing unit 41 is further configured to obtain the distance between the face and the target camera according to the ratio.
- the distance sensor is configured to collect the distance between the face and the target camera.
- the second processing unit 41 is further configured to obtain the projection distance between the face and the target camera collected by the distance sensor, when the projection distance obtained according to the ratio is less than the first distance threshold.
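A minimal sketch of the effective-region estimate with the distance-sensor fallback described above. The ratio-to-distance mapping used here is a hypothetical stand-in for the calibration the patent relies on:

```python
def projection_distance(depth_pixels, sensor_distance=None, *,
                        first_distance_threshold=0.5):
    """Estimate the face-to-camera distance from the ratio of the
    effective (non-zero) region of the depth image to the whole image,
    falling back to a hardware distance-sensor reading when the
    estimate drops below the first distance threshold."""
    total = len(depth_pixels)
    effective = sum(1 for d in depth_pixels if d > 0)
    ratio = effective / total
    # hypothetical mapping: a larger effective region means a closer face
    estimate = 1.0 - ratio
    if estimate < first_distance_threshold and sensor_distance is not None:
        return sensor_distance  # prefer the sensor reading at close range
    return estimate
```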
- the second processing unit 41 is further configured to obtain a type of the application and to determine a security level corresponding to the type of the application.
- the application is an application for transmitting a face depth information obtaining request to the second processing unit.
- the second processing unit 41 is further configured to switch among operation modes according to the security level. When the security level is high, the disparity image is received in a first operation mode, and the depth image is calculated according to the disparity image in the first operation mode. When the security level is low, the disparity image is received in the first operation mode, and the depth image is calculated according to the disparity image in a second operation mode.
- the second processing unit 41 may be operated in two operation modes.
- the first operation mode may be the TEE.
- the TEE is a trustable operating environment with a high security level.
- the second operation mode may be the REE.
- the REE is a normal operating environment with a low security level.
- when an application of the terminal 100 sends the face depth information obtaining request to the second processing unit 41 , the second processing unit 41 may obtain the type of the application and switch among the operation modes according to the security level corresponding to the type.
- the type of the application may include, but not limited to, an unlock application, a payment application, a camera application, a beauty application, and the like. Security levels corresponding to different types may be different.
- the security level corresponding to the payment application and the unlock application may be high, and the security level corresponding to the camera application and the beauty application may be low, but the present disclosure is not limited thereto.
- when the security level corresponding to the type is high, the second processing unit 41 may be switched to operate in the first operation mode; when the security level corresponding to the type is low, the second processing unit 41 may be switched to operate in the second operation mode.
- when the second processing unit 41 is single-core, the single core may be directly switched from the second operation mode to the first operation mode.
- when the second processing unit 41 is multi-core, one of the cores is switched by the terminal 100 from the second operation mode to the first operation mode, while the other cores continue to operate in the second operation mode.
- the second processing unit 41 can transmit a face capturing command to the first processing unit 51 via the core switched to operate in the first operation mode, thereby ensuring that a command input by the first processing unit 51 is secure.
- the first processing unit 51 can control the laser camera 61 to obtain the target image, such as the infrared image and the speckle image through the PWM module.
- the first processing unit 51 may obtain the disparity image through calculation based on the target speckle image and the reference speckle image, and may send the disparity image to the core of the second processing unit 41 operating in the first operation mode.
- the core operating in the first operation mode of the second processing unit 41 may obtain the depth image through the calculation based on the disparity image.
- the core operating in the first operation mode of the second processing unit 41 may send the disparity image to other cores operating in the second operation mode, and the depth image is obtained through the calculation by the other cores operating in the second operation mode.
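The security-level routing described above can be sketched as follows; the level mapping mirrors the examples in the text, and all names are illustrative:

```python
# illustrative mapping from application type to security level,
# following the examples given in the text
SECURITY_LEVELS = {
    "payment": "high", "unlock": "high",
    "camera": "low", "beauty": "low",
}

def plan_depth_pipeline(app_type):
    """Sketch of the mode switching: the disparity image is always
    received in the first (trusted) operation mode; the depth image is
    computed in the trusted mode only for high-security applications.
    Returns (receive mode, compute mode)."""
    level = SECURITY_LEVELS.get(app_type, "low")
    receive_mode = "TEE"  # first operation mode, always
    compute_mode = "TEE" if level == "high" else "REE"
    return receive_mode, compute_mode
```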
- a non-volatile computer readable storage medium includes one or more computer executable instructions.
- when the computer executable instructions are executed by one or more processors, the one or more processors are configured to perform the control method of any of the above embodiments.
- a block 001 of obtaining a projection distance between a target user and a camera module 60 is executed.
- a block 002 of determining a control parameter of the camera module 60 according to the projection distance and controlling the camera module 60 based on the control parameter is executed.
- a block 011 of obtaining a facial image of the target user captured based on a preset parameter is executed.
- a block 012 of obtaining a projection distance between the target user and a laser projector 30 according to the facial image is executed.
- a block 013 of controlling the laser projector 30 to emit laser light with corresponding projection power according to the projection distance is executed.
- a block 021 of obtaining a projection distance between the target user and the laser projector 30 is executed.
- a block 022 of obtaining projection power corresponding to the laser projector 30 according to the projection distance is executed.
- a block 023 of controlling the laser projector 30 to emit the laser light with the projection power is executed.
- a block 041 of obtaining a projection distance between the target user and the laser projector 30 is executed.
- a block 042 of obtaining an energy density of the laser light emitted by the laser projector 30 with the preset parameter at a current distance, according to the projection distance is executed.
- a block 043 of adjusting the preset parameter according to the energy density and a security energy threshold and controlling the laser projector 30 based on the adjusted preset parameter is executed.
- a block 051 of obtaining the projection distance between a face of the target user and the target camera at a preset time interval when the target camera is on is executed.
- a block 052 of adjusting a capturing frame rate of the target camera and/or transmission power of the laser projector 30 according to the projection distance is executed.
- a block 053 of controlling the laser projector 30 to emit the laser light with the transmission power and controlling the target camera to capture the target image at the capturing frame rate is executed.
- embodiments of the present disclosure further provide a computer device 1000 .
- the computer device 1000 includes a memory 91 and a processor 40 .
- the memory 91 is configured to store computer readable instructions.
- the processor 40 is configured to perform the control method of any of the above embodiments.
- a block 001 of obtaining a projection distance between the target user and the camera module 60 is executed.
- a block 002 of determining a control parameter of the camera module 60 according to the projection distance and controlling the camera module 60 based on the control parameter is executed.
- a block 011 of obtaining a facial image of the target user captured with a preset parameter is executed.
- a block 012 of obtaining the projection distance between the target user and the laser projector 30 according to the facial image is executed.
- a block 013 of controlling the laser projector 30 to emit laser light with corresponding projection power according to the projection distance is executed.
- a block 021 of obtaining a projection distance between the target user and the laser projector 30 is executed.
- a block 022 of obtaining projection power corresponding to the laser projector 30 according to the projection distance is executed.
- a block 023 of controlling the laser projector 30 to emit the laser light with the projection power is executed.
- a block 041 of obtaining a projection distance between the target user and the laser projector 30 is executed.
- a block 042 of obtaining an energy density of the laser light emitted by the laser projector 30 with the preset parameter at a current distance, according to the projection distance is executed.
- a block 043 of adjusting the preset parameter according to the energy density and a security energy threshold and controlling the laser projector 30 based on the adjusted preset parameter is executed.
- a block 051 of obtaining the projection distance between a face of the target user and the target camera at a preset time interval when the target camera is on is executed.
- a block 052 of adjusting a capturing frame rate of the target camera and/or transmission power of the laser projector 30 according to the projection distance is executed.
- a block 053 of controlling the laser projector 30 to emit the laser light with the transmission power and controlling the target camera to capture the target image at the capturing frame rate is executed.
- FIGS. 40 and 41 are schematic block diagrams illustrating internal modules of the computer device 1000 .
- the computer device 1000 includes a processor 40 , a memory 91 (e.g., a non-volatile storage medium 91 ), an internal memory 92 , a display screen 94 and an input device 93 connected by a system bus 95 .
- the memory 91 of the computer device 1000 has an operating system and computer readable instructions stored therein.
- the computer readable instructions are executable by the processor 40 to realize a control method described in any of the above implementations.
- the processor 40 may be configured to provide a calculating and controlling ability to support operations of the entire computer device 1000 .
- the internal memory 92 of the computer device 1000 provides an environment for running the computer readable instructions stored in the memory 91 .
- the display 94 of the computer device 1000 may be a liquid crystal display or an electronic ink display or the like.
- the input device 93 may be a touch layer covered on the display 94 , or may be a button, trackball or touch board provided on a housing of the computer device 1000 , or an external keyboard, touch board or mouse.
- the computer device 1000 may be a phone, a tablet computer, a laptop, a personal digital assistant, or a wearable device (e.g., a smart wrist strap, a smart watch, a smart helmet, smart glasses). It may be conceivable for those skilled in the art that the structures illustrated in FIGS. 40 and 41 are merely schematic diagrams of partial structures related to the solution of the present disclosure, and do not limit the computer device 1000 to which the solution of the present application is applied.
- the computer device 1000 may include more or fewer components than those illustrated in the figures, or some components may be combined, or different component arrangements may be included.
- the laser projector 30 in the control method illustrated in FIGS. 2, 7, and 12 includes a substrate component 31 , a lens tube 32 , a light source 33 , a collimating element 34 , a diffractive optical element (DOE) 35 and a protective cover 36 .
- the controller 50 is configured to control the projection power of the laser projector 30 by controlling emission power of the light source 33 of the laser projector 30 .
- the substrate component 31 includes a substrate 311 and a circuit board 312 .
- the circuit board 312 is disposed on the substrate 311 .
- the circuit board 312 is configured to couple the light source 33 and a main board of the terminal 100 .
- the circuit board 312 may be a hard board, a soft board or a soft and hard combination board. In the embodiment illustrated as FIG. 42 , the circuit board 312 is provided with a through hole 3121 .
- the light source 33 is fixed on the substrate 311 and electrically connected to the circuit board 312 .
- a heat dissipation hole 3111 can be formed on the substrate 311 .
- the heat generated by the light source 33 or the circuit board 312 when working can be dissipated by the heat dissipation hole 3111 .
- the heat dissipation hole 3111 can also be filled with thermal conductive adhesive to further improve the heat dissipation performance of the substrate component 31 .
- the lens tube 32 is fixedly connected to the substrate component 31 .
- the lens tube 32 is formed with an accommodation cavity 321 .
- the lens tube 32 includes a top wall 322 and an annular side wall 324 extending from the top wall 322 .
- the side wall 324 is disposed on the substrate component 31 .
- the top wall 322 is provided with a light through hole 3212 communicating with the accommodation cavity 321 .
- the side wall 324 can be coupled to the circuit board 312 by adhesive.
- the protective cover 36 is disposed on the top wall 322 .
- the protective cover 36 includes a baffle 362 having a light exit through hole 360 and an annular side wall 364 extending from the baffle 362 .
- the light source 33 and the collimating element 34 are both disposed in the accommodation cavity 321 .
- the diffractive optical element 35 is arranged on the lens tube 32 .
- the collimating element 34 and the diffractive optical element 35 are sequentially disposed on a light-emitting path of the light source 33 .
- the collimating element 34 collimates the laser light emitted by the light source 33 .
- the laser light passes through the collimating element 34 and the diffractive optical element 35 to form the laser light pattern.
- the light source 33 may be a vertical cavity surface emitting laser (VCSEL) or an edge-emitting laser (EEL). In the embodiment illustrated as FIG. 42 , the light source 33 is the edge-emitting laser. In detail, the light source 33 may be a distributed feedback laser (DFB).
- the light source 33 is configured to emit the laser light into the accommodation cavity 321 . As illustrated in FIG. 43 , the light source 33 is substantially columnar. A surface of the light source 33 away from the substrate component 31 forms a light emitting surface 331 . The laser light is emitted from the light emitting surface 331 . The light emitting surface 331 faces the collimating element 34 .
- the light source 33 is fixed on the substrate component 31 .
- the light source 33 may be attached to the substrate component 31 by sealant 37 .
- a side surface of the light source 33 opposite to the light-emitting surface 331 is attached to the substrate component 31 .
- the side surface 332 of the light source 33 may also be attached to the substrate component 31 .
- the sealant 37 may enclose the side surface 332 , attach a certain surface of the side surface 332 to the substrate component 31 or attach several surfaces to the substrate component 31 .
- the sealant 37 may be thermal conductive adhesive to transfer the heat generated by the light source 33 when operating to the substrate component 31 .
- the diffractive optical element 35 is supported by the top wall 322 and received in the protective cover 36 . Opposite sides of the diffractive optical element 35 respectively abut against the protective cover 36 and the top wall 322 .
- the baffle 362 includes an abutting surface 3622 adjacent to the light through hole 3212 . The diffractive optical element 35 abuts against the abutting surface 3622 .
- the diffractive optical element 35 includes a diffraction entrance surface 352 opposite to a diffraction exit surface 354 .
- the diffractive optical element 35 is supported by the top wall 322 .
- the diffraction exit surface 354 abuts against a surface of the baffle 362 close to the light through hole 3212 (the abutting surface 3622 ).
- the diffraction entrance surface 352 abuts against the top wall 322 .
- the light through hole 3212 is aligned with the accommodation cavity 321 .
- the light exit through hole 360 is aligned with the light through hole 3212 .
- the top wall 322 , the annular side wall 364 and the baffle 362 are in contact with the diffractive optical element 35 , thereby preventing the diffractive optical element 35 from falling out of the protective cover 36 in a light emitting direction.
- the protective cover 36 is adhered to the top wall 322 by glue.
- the light source 33 of the laser projector 30 is implemented as the edge-emitting laser.
- temperature drift of the edge-emitting laser is smaller than that of the VCSEL array.
- since the edge-emitting laser is a single-point illumination structure, there is no need to design an array structure, the manufacture is simple, and the cost of the light source of the laser projector 30 is low.
- the gain of the power is obtained through feedback of a grating structure.
- the distributed feedback laser generally has an elongated structure.
- when the edge-emitting laser is placed vertically, it may be prone to accidents such as dropping, shifting or shaking.
- the accidents such as dropping, shifting or shaking may be avoided by fixing the edge-emitting laser with the sealant 37 .
- the light source 33 can also be fixed to the substrate component 31 in the manner illustrated in FIG. 45 .
- the laser projector 30 includes a plurality of support blocks 38 .
- the support blocks 38 can be fixed to the substrate component 31 .
- the plurality of the support blocks 38 collectively surround the light source 33 .
- the light source 33 can be mounted directly among the plurality of the support blocks 38 during installation. In one example, the plurality of the support blocks 38 collectively clamp the light source 33 to further prevent the light source 33 from shaking.
- the protective cover 36 may be omitted.
- the diffractive optical element 35 may be disposed in the accommodation cavity 321 .
- a diffraction exit surface 354 of the diffractive optical element 35 may contact against the top wall 322 .
- the laser light passes through the diffractive optical element 35 and the light through hole 3212 .
- the diffractive optical element 35 is thus less likely to fall off.
- the substrate 311 can be omitted.
- the light source 33 can be directly fixed to the circuit board 312 to reduce an overall thickness of the laser projector 30 .
- the program can be stored in a nonvolatile computer readable storage medium.
- the program when being executed, may include the flow of the embodiments of the methods as described above.
- the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or the like.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Optics & Photonics (AREA)
- Controls And Circuits For Display Device (AREA)
- Projection Apparatus (AREA)
- Transforming Electric Information Into Light Information (AREA)
Abstract
Description
where D is an actual projection distance calculated based on the first ratio R in the actual measurements. In this way, based on the first ratio of the face region to the facial image, an actual projection distance of the target user may be objectively reflected.
where D1 is an initial projection distance calculated based on the actually measured first ratio R1. A calibration projection distance D2 may be further calculated based on an equation
and the actually measured second ratio R2, where D2 is taken as the projection distance. In this way, the calculation of the projection distance based on the first ratio and the second ratio takes individual differences among the users into account, thereby objectively obtaining the projection distance.
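The calibration equations themselves are not reproduced in this excerpt. One common form for such ratio-based ranging assumes the measured face-to-image ratio is inversely proportional to distance, in which case a first measurement (ratio R1 at initial distance D1) yields the calibration distance for a second measured ratio R2 as D2 = D1 * R1 / R2. This is an assumed reconstruction for illustration only, not the patent's formula:

```python
def calibrated_distance(r2, r1, d1):
    """Hypothetical ratio-based ranging: if the ratio R of the face
    region to the facial image is inversely proportional to the
    projection distance D, then D1 * R1 = D2 * R2, so
    D2 = D1 * R1 / R2. An assumption, not the disclosed equation."""
    return d1 * r1 / r2
```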
where D is an actual projection distance calculated based on the ratio R in the actual measurements. In this way, based on the ratio of the face to the facial image, an actual projection distance of the target user can be objectively reflected.
where ZD is the depth information of the pixel point, i.e., the depth value of the pixel point, L is the distance between the
Claims (24)
Applications Claiming Priority (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810161936.0A CN108376251B (en) | 2018-02-27 | 2018-02-27 | Control method, control device, terminal, computer device, and storage medium |
CN201810161946.4A CN108376252B (en) | 2018-02-27 | 2018-02-27 | Control method, control device, terminal, computer device, and storage medium |
CN201810161946.4 | 2018-02-27 | ||
CN201810162447.7A CN108281880A (en) | 2018-02-27 | 2018-02-27 | Control method, control device, terminal, computer equipment and storage medium |
CN201810161936.0 | 2018-02-27 | ||
CN201810162447.7 | 2018-02-27 | ||
CN201810404834.7 | 2018-04-28 | ||
CN201810404834.7A CN108769509B (en) | 2018-04-28 | 2018-04-28 | Control method, apparatus, electronic equipment and the storage medium of camera |
PCT/CN2019/076157 WO2019165956A1 (en) | 2018-02-27 | 2019-02-26 | Control method, control apparatus, terminal, computer device, and storage medium |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/076157 Continuation WO2019165956A1 (en) | 2018-02-27 | 2019-02-26 | Control method, control apparatus, terminal, computer device, and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190279398A1 US20190279398A1 (en) | 2019-09-12 |
US11335028B2 true US11335028B2 (en) | 2022-05-17 |
Family
ID=67804818
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/423,073 Active 2040-03-12 US11335028B2 (en) | 2018-02-27 | 2019-05-27 | Control method based on facial image, related control device, terminal and computer device |
Country Status (3)
Country | Link |
---|---|
US (1) | US11335028B2 (en) |
EP (1) | EP3564748A4 (en) |
WO (1) | WO2019165956A1 (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190306441A1 (en) * | 2018-04-03 | 2019-10-03 | Mediatek Inc. | Method And Apparatus Of Adaptive Infrared Projection Control |
CN108666861B (en) * | 2018-05-09 | 2019-12-06 | 歌尔股份有限公司 | Method and device for correcting driving current of multiple lasers and laser projector |
TWI777141B (en) * | 2020-03-06 | 2022-09-11 | 技嘉科技股份有限公司 | Face identification method and face identification apparatus |
CN111427049B (en) * | 2020-04-06 | 2024-08-27 | 深圳蚂里奥技术有限公司 | Laser safety device and control method |
CN111487632B (en) * | 2020-04-06 | 2024-09-06 | 深圳蚂里奥技术有限公司 | Laser safety control device and control method |
CN111487633B (en) * | 2020-04-06 | 2024-08-23 | 深圳蚂里奥技术有限公司 | Laser safety control device and method |
CN112416135A (en) * | 2020-12-15 | 2021-02-26 | 广州富港万嘉智能科技有限公司 | Projection parameter determination method and device based on indoor positioning and projection system |
CN113902790B (en) * | 2021-12-09 | 2022-03-25 | 北京的卢深视科技有限公司 | Beauty guidance method, device, electronic equipment and computer readable storage medium |
CN114500795B (en) * | 2021-12-27 | 2024-03-15 | 奥比中光科技集团股份有限公司 | Laser safety control method and device, intelligent door lock and storage medium |
CN116718435B (en) * | 2023-04-26 | 2024-07-02 | 广州医科大学附属第一医院(广州呼吸中心) | Intelligent mobile aerosol collection robot |
2019
- 2019-02-26 EP EP19736567.9A patent/EP3564748A4/en not_active Withdrawn
- 2019-02-26 WO PCT/CN2019/076157 patent/WO2019165956A1/en unknown
- 2019-05-27 US US16/423,073 patent/US11335028B2/en active Active
Patent Citations (55)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1555051A (en) | 1998-09-14 | 2004-12-15 | 松下电器产业株式会社 | Data recording medium and device for recording and copying data |
JP2007006016A (en) | 2005-06-22 | 2007-01-11 | Sharp Corp | Imaging equipment |
CN101384941A (en) | 2006-02-10 | 2009-03-11 | 松下电器产业株式会社 | Scanning unit and image display device |
CN101452181A (en) | 2007-12-03 | 2009-06-10 | 鸿富锦精密工业(深圳)有限公司 | Automatic focusing system and method OF electronic device |
CN101216881A (en) | 2007-12-28 | 2008-07-09 | 北京中星微电子有限公司 | A method and device for automatic image acquisition |
CN101751219A (en) | 2008-12-05 | 2010-06-23 | 索尼爱立信移动通信日本株式会社 | Terminal apparatus, display control method, and display control program |
CN102362220A (en) | 2009-03-26 | 2012-02-22 | Nec显示器解决方案株式会社 | Projector and method for controlling the same |
JP2010263581A (en) | 2009-05-11 | 2010-11-18 | Canon Inc | Object recognition apparatus and object recognition method |
CN101692281A (en) | 2009-06-03 | 2010-04-07 | 北京中星微电子有限公司 | Safety monitoring method, safety monitoring device and automatic teller machine (ATM) system |
US20130177210A1 (en) | 2010-05-07 | 2013-07-11 | Samsung Electronics Co., Ltd. | Method and apparatus for recognizing location of user |
US20130195316A1 (en) * | 2012-01-30 | 2013-08-01 | Accenture Global Services Limited | System and method for face capture and matching |
US20150226541A1 (en) * | 2012-09-28 | 2015-08-13 | Hitachi Automotive Systems, Ltd. | Imaging apparatus |
US20150244997A1 (en) | 2012-09-28 | 2015-08-27 | Rakuten, Inc. | Image processing device, image processing method, program and computer-readable storage medium |
CN103152525A (en) | 2013-03-19 | 2013-06-12 | 北京和普威视光电技术有限公司 | Safety laser camera |
CN203800971U (en) | 2013-05-21 | 2014-08-27 | 上海鼎讯电子有限公司 | Damping and sealing structure for mobile-phone camera |
US20160166145A1 (en) * | 2013-07-16 | 2016-06-16 | Fittingbox | Method for determining ocular measurements using a consumer sensor |
CN104349072A (en) | 2013-08-09 | 2015-02-11 | 联想(北京)有限公司 | Control method, device and electronic equipment |
US20150042871A1 (en) * | 2013-08-09 | 2015-02-12 | Lenovo (Beijing) Co., Ltd. | Control method, control apparatus and electronic device |
CN103418138A (en) | 2013-08-19 | 2013-12-04 | 上海市激光技术研究所 | Safety control device of laser interaction game and application method thereof |
CN103488980A (en) | 2013-10-10 | 2014-01-01 | 广东小天才科技有限公司 | Camera-based sitting posture judgment method and device |
CN105874473A (en) | 2014-01-02 | 2016-08-17 | 虹膜技术公司 | Apparatus and method for acquiring image for iris recognition using distance of facial feature |
CN105451011A (en) | 2014-08-20 | 2016-03-30 | 联想(北京)有限公司 | Method and device for adjusting power |
CN104268544A (en) | 2014-10-14 | 2015-01-07 | 陶晨 | Evaluation system for clothes visual effect |
US20160109232A1 (en) | 2014-10-21 | 2016-04-21 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
CN204130899U (en) | 2014-11-12 | 2015-01-28 | 河北科技大学 | A kind of laser projection device |
CN105959581A (en) | 2015-03-08 | 2016-09-21 | 联发科技股份有限公司 | Electronic device having dynamically controlled flashlight for image capturing and related control method |
CN104967776A (en) | 2015-06-11 | 2015-10-07 | 广东欧珀移动通信有限公司 | Photographing setting method and user terminal |
CN105120017A (en) | 2015-07-16 | 2015-12-02 | 广东欧珀移动通信有限公司 | Terminal |
US20170054962A1 (en) * | 2015-08-18 | 2017-02-23 | Yanhui Zhou | Three-dimensional depth perception method and apparatus with an adjustable working range |
CN105373223A (en) | 2015-10-10 | 2016-03-02 | 惠州Tcl移动通信有限公司 | Lighting equipment capable of automatically adjusting luminous intensity and method |
CN105354792A (en) | 2015-10-27 | 2016-02-24 | 深圳市朗形网络科技有限公司 | Method for trying virtual glasses and mobile terminal |
US20170180622A1 (en) * | 2015-12-21 | 2017-06-22 | Aviad Zabatani | Auto range control for active illumination depth camera |
CN106938370A (en) | 2015-12-30 | 2017-07-11 | 上海微电子装备有限公司 | A kind of laser-processing system and method |
CN205490686U (en) | 2015-12-31 | 2016-08-17 | 上海与德通讯技术有限公司 | Back camera casing fixed knot constructs and mobile terminal |
CN205754594U (en) | 2016-02-15 | 2016-11-30 | 公安部第一研究所 | A kind of light supply apparatus that human image collecting is carried out BLC |
CN105791681A (en) | 2016-02-29 | 2016-07-20 | 广东欧珀移动通信有限公司 | Control method, control device and electronic device |
CN105680320A (en) | 2016-03-16 | 2016-06-15 | 中国科学院长春光学精密机械与物理研究所 | High-power, tunable and narrow linewidth external cavity semiconductor laser |
CN106022275A (en) | 2016-05-26 | 2016-10-12 | 青岛海信移动通信技术股份有限公司 | Iris recognition method and apparatus, and mobile terminal |
CN106203285A (en) | 2016-06-28 | 2016-12-07 | 广东欧珀移动通信有限公司 | Control method, control device and electronic installation |
CN206508403U (en) | 2016-08-08 | 2017-09-22 | 上海白贝蝶信息科技有限公司 | A kind of phototherapy beauty instrument and cosmetic system |
CN205921676U (en) | 2016-08-25 | 2017-02-01 | 北京旷视科技有限公司 | Image capturing apparatus |
US20180088214A1 (en) * | 2016-08-29 | 2018-03-29 | James Thomas O'Keeffe | Laser range finder with smart safety-conscious laser intensity |
US20180084240A1 (en) * | 2016-09-16 | 2018-03-22 | Qualcomm Incorporated | Systems and methods for improved depth sensing |
CN106331517A (en) | 2016-09-26 | 2017-01-11 | 维沃移动通信有限公司 | Soft light lamp brightness control method and electronic device |
CN106597789A (en) | 2016-12-21 | 2017-04-26 | 海信集团有限公司 | Control method and device of laser projection apparatus |
CN106651406A (en) | 2017-02-22 | 2017-05-10 | 北京智慧云测科技有限公司 | Payment terminal detecting device, system and method |
US20190012784A1 (en) * | 2017-07-07 | 2019-01-10 | William F. WILEY | Application to determine reading/working distance |
CN107423716A (en) | 2017-07-31 | 2017-12-01 | 广东欧珀移动通信有限公司 | Face method for monitoring state and device |
CN107451561A (en) | 2017-07-31 | 2017-12-08 | 广东欧珀移动通信有限公司 | Iris recognition light compensation method and device |
CN107330316A (en) | 2017-07-31 | 2017-11-07 | 广东欧珀移动通信有限公司 | unlocking processing method and related product |
CN107680128A (en) | 2017-10-31 | 2018-02-09 | 广东欧珀移动通信有限公司 | Image processing method, device, electronic equipment and computer-readable recording medium |
CN108281880A (en) | 2018-02-27 | 2018-07-13 | 广东欧珀移动通信有限公司 | Control method, control device, terminal, computer equipment and storage medium |
CN108376251A (en) | 2018-02-27 | 2018-08-07 | 广东欧珀移动通信有限公司 | Control method, control device, terminal, computer equipment and storage medium |
CN108376252A (en) | 2018-02-27 | 2018-08-07 | 广东欧珀移动通信有限公司 | Control method, control device, terminal, computer equipment and storage medium |
CN108769509A (en) | 2018-04-28 | 2018-11-06 | Oppo广东移动通信有限公司 | Control method, apparatus, electronic equipment and the storage medium of camera |
Non-Patent Citations (10)
Title |
---|
EPO, Office Action for EP Application No. 19736567.9, dated Mar. 9, 2020. |
IPI, Office Action for IN Application No. 201917031567, dated Oct. 22, 2020. |
SIPO, First Office Action for CN Application No. 201810161936.0, dated Jun. 10, 2019. |
SIPO, First Office Action for CN Application No. 201810161946.4, dated Jun. 5, 2019. |
SIPO, First Office Action for CN Application No. 201810162447, dated Apr. 2, 2019. |
SIPO, First Office Action for CN Application No. 201810404834, dated Feb. 19, 2019. |
SIPO, Second Office Action for CN Application No. 201810161936.0, dated Sep. 9, 2019. |
SIPO, Second Office Action for CN Application No. 201810161946.4, dated Sep. 9, 2019. |
SIPO, Third Office Action for CN Application No. 201810161936.0, dated Dec. 11, 2019. |
WIPO, ISR for PCT/CN2019/076157, dated Jun. 3, 2019. |
Also Published As
Publication number | Publication date |
---|---|
EP3564748A1 (en) | 2019-11-06 |
US20190279398A1 (en) | 2019-09-12 |
WO2019165956A1 (en) | 2019-09-06 |
EP3564748A4 (en) | 2020-04-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11335028B2 (en) | Control method based on facial image, related control device, terminal and computer device | |
CN110324521B (en) | Method and device for controlling camera, electronic equipment and storage medium | |
TWI696391B (en) | Projector, detection method and detection device thereof, image capturing device, electronic device, and computer readable storage medium | |
CN107885023B (en) | Time-of-flight sensing for brightness and autofocus control in image projection devices | |
CN108376251B (en) | Control method, control device, terminal, computer device, and storage medium | |
WO2020038062A1 (en) | Control method and device, depth camera, electronic device, and readable storage medium | |
WO2020052284A1 (en) | Control method and device, depth camera, electronic device, and readable storage medium | |
WO2020038064A1 (en) | Control method and device, depth camera, electronic device, and readable storage medium | |
CN108281880A (en) | Control method, control device, terminal, computer equipment and storage medium | |
US20170180708A1 (en) | System and method for speckle reduction in laser projectors | |
WO2020259334A1 (en) | Adjustment method, adjustment apparatus, terminal and computer-readable storage medium | |
US11665334B2 (en) | Rolling shutter camera pipeline exposure timestamp error determination | |
CN108376252B (en) | Control method, control device, terminal, computer device, and storage medium | |
WO2019233106A1 (en) | Image acquisition method and device, image capture device, computer apparatus, and readable storage medium | |
US11441895B2 (en) | Control method, depth camera and electronic device | |
CN112204961A (en) | Semi-dense depth estimation from dynamic vision sensor stereo pairs and pulsed speckle pattern projectors | |
US20230199328A1 (en) | Method of removing interference and electronic device performing the method | |
CN212160703U (en) | Image sensing device and electronic apparatus | |
KR20210006605A (en) | Electronic device including sensor and method of operation thereof | |
US11353704B2 (en) | Head mounted device (HMD) coupled to smartphone executing personal authentication of a user | |
CN111598073A (en) | Image sensing device and electronic apparatus | |
US20230154368A1 (en) | Method and device for controlling luminance of augmented reality (ar) image | |
WO2022267645A1 (en) | Photography apparatus and method, electronic device, and storage medium | |
US11893698B2 (en) | Electronic device, AR device and method for controlling data transfer interval thereof | |
CN212484402U (en) | Image sensing device and electronic apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., LTD., CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BAI, JIAN;TAN, GUOHUI;ZHOU, HAITAO;AND OTHERS;SIGNING DATES FROM 20180605 TO 20190415;REEL/FRAME:049296/0379 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |