CN109726611A - Biometric recognition method and device, computer-readable storage medium, and electronic device - Google Patents
Biometric recognition method and device, computer-readable storage medium, and electronic device Download PDF Info
- Publication number
- CN109726611A CN109726611A CN201711023405.7A CN201711023405A CN109726611A CN 109726611 A CN109726611 A CN 109726611A CN 201711023405 A CN201711023405 A CN 201711023405A CN 109726611 A CN109726611 A CN 109726611A
- Authority
- CN
- China
- Prior art keywords
- depth information
- image
- biological characteristic
- electronic equipment
- trigger instruction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Abstract
The present disclosure relates to a biometric recognition method and device, a computer-readable storage medium, and an electronic device. The method includes: receiving a user trigger instruction directed at a camera module; in response to the user trigger instruction, determining depth information of a biological feature of a photographed subject; and constructing a three-dimensional image of the biological feature according to the depth information. Because the three-dimensional image is built from the acquired depth information of the biological feature, the electronic device can perform identity recognition based on the constructed three-dimensional image, which improves the security of the electronic device.
Description
Technical field
The present disclosure relates to the field of terminal technology, and in particular to a biometric recognition method and device, a computer-readable storage medium, and an electronic device.
Background technique
At present, an electronic device can capture an image of a photographed subject through a camera module, so that the subject is stored on the electronic device as two-dimensional image information. The user can view the image on the electronic device itself, or on another terminal device connected to it.
Summary of the invention
The present disclosure provides a biometric recognition method and device, a computer-readable storage medium, and an electronic device, to address deficiencies in the related art.
According to a first aspect of the embodiments of the present disclosure, a biometric recognition method is provided, including:
receiving a user trigger instruction directed at a camera module;
in response to the user trigger instruction, determining depth information of a biological feature of a photographed subject;
constructing a three-dimensional image of the biological feature according to the depth information.
Optionally, determining the depth information of the biological feature of the photographed subject includes:
emitting detection light toward the biological feature;
receiving reflected light produced after the detection light is reflected by the biological feature of the photographed subject;
obtaining the depth information according to the detection light and the reflected light.
Optionally, obtaining the depth information according to the detection light and the reflected light includes:
obtaining the depth information according to the phase difference and the time difference between the detection light and the reflected light.
Optionally, obtaining the depth information according to the phase difference and the time difference between the detection light and the reflected light includes:
obtaining sub-depth information multiple times according to the phase difference and the time difference between the detection light and the reflected light;
applying mean filtering or median filtering to the acquired sub-depth information to calculate the depth information.
Optionally, receiving the user trigger instruction directed at the camera module includes:
receiving a first trigger instruction directed at a physical button on the electronic device; or
receiving a second trigger instruction directed at a virtual button on a function page displayed on the electronic device.
Optionally, after constructing the three-dimensional image of the biological feature according to the depth information, the method further includes:
performing biometric recognition based on the three-dimensional image and a predefined image, and completing a preset function operation based on the recognition result.
According to a second aspect of the embodiments of the present disclosure, a biometric recognition device is provided, including:
a receiving module, configured to receive a user trigger instruction directed at a camera module;
a determining module, configured to determine, in response to the user trigger instruction received by the receiving module, depth information of a biological feature of a photographed subject;
an imaging module, configured to construct a three-dimensional image of the biological feature according to the depth information determined by the determining module.
Optionally, the determining module includes:
an emitting submodule, configured to emit detection light toward the biological feature;
a receiving submodule, configured to receive reflected light produced after the detection light emitted by the emitting submodule is reflected by the biological feature of the photographed subject;
an acquiring submodule, configured to obtain the depth information according to the detection light emitted by the emitting submodule and the reflected light received by the receiving submodule.
Optionally, the acquiring submodule includes:
an acquiring unit, configured to obtain the depth information according to the phase difference and the time difference between the detection light and the reflected light.
Optionally, the acquiring unit includes:
a first acquiring subunit, configured to obtain sub-depth information multiple times according to the phase difference and the time difference between the detection light and the reflected light;
a second acquiring subunit, configured to apply mean filtering to the sub-depth information obtained by the first acquiring subunit to obtain the depth information.
Optionally, the receiving module includes:
a first receiving submodule, configured to receive a first trigger instruction directed at a physical button on the electronic device; or
a second receiving submodule, configured to receive a second trigger instruction directed at a virtual button on a function page displayed on the electronic device.
Optionally, the device further includes:
a recognition module, configured to perform biometric recognition based on the three-dimensional image and a predefined image, and to complete a preset function operation based on the recognition result.
According to a third aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, storing computer instructions that, when executed by a processor, implement the steps of the method described in any of the above embodiments.
According to a fourth aspect of the embodiments of the present disclosure, an electronic device is provided, including:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the steps of the method described in any of the above embodiments.
The technical solutions provided by the embodiments of the present disclosure can have the following beneficial effects:
As can be seen from the above embodiments, the present disclosure can construct a corresponding three-dimensional image based on the acquired depth information of the biological feature, which enables the electronic device to perform identity recognition based on the constructed three-dimensional image and improves the security of the electronic device.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Detailed description of the invention
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.
Fig. 1 is a flowchart of a biometric recognition method according to an exemplary embodiment.
Fig. 2 is a flowchart of another biometric recognition method according to an exemplary embodiment.
Fig. 3 is a first application scenario diagram of another biometric recognition method according to an exemplary embodiment.
Fig. 4 is a second application scenario diagram of another biometric recognition method according to an exemplary embodiment.
Fig. 5 is a schematic diagram of a way of obtaining depth information according to an exemplary embodiment.
Figs. 6-11 are block diagrams of a biometric recognition device according to an exemplary embodiment.
Fig. 12 is a block diagram of a device for biometric recognition according to an exemplary embodiment.
Specific embodiment
Exemplary embodiments will be described in detail here, with examples illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of devices and methods consistent with some aspects of the present application as detailed in the appended claims.
The terminology used in the present application is for the purpose of describing particular embodiments only and is not intended to limit the present application. The singular forms "a", "the", and "said" used in the present application and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in the present application to describe various information, the information should not be limited by these terms; these terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present application, first information may also be called second information, and similarly, second information may also be called first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
Fig. 1 is a flowchart of a biometric recognition method according to an exemplary embodiment. As shown in Fig. 1, the method is applied to a terminal and may include the following steps.
In step 101, a user trigger instruction directed at a camera module is received.
In this embodiment, the trigger instruction may be a first trigger instruction directed at a virtual button on a function interface, where the function interface may include a payment interface, an unlock interface, a shooting interface, or the like; or the trigger instruction may be a second trigger instruction directed at a predetermined physical key on the electronic device, for example the "home" key or "home key + power key". The present disclosure is not limited in this respect.
In step 102, in response to the user trigger instruction, depth information of a biological feature of a photographed subject is determined.
In this embodiment, in response to the trigger instruction described above, the depth information of the biological feature of the photographed subject is determined. For example, a boundary region between the biological-feature region and the other regions distinct from it can first be determined from the grayscale information of the image formed by the photographed subject on the electronic device, and the part enclosed by the boundary region is then taken as the biological-feature region.
Further, the position of the biological feature can be determined from the biological-feature region, so that detection light is emitted toward the biological feature on the photographed subject and the reflected light produced after the detection light is reflected by the biological feature is received; the depth information of the biological feature on the photographed subject can then be obtained from the detection light and the reflected light.
Specifically, the time difference between the detection light and the reflected light can be determined from the phase difference between them, and the sub-depth information of the biological feature for the current detection can be calculated from the time difference. After multiple detections, mean filtering or median filtering is applied to the acquired sub-depth information to calculate the depth information.
In step 103, a three-dimensional image of the biological feature is constructed according to the depth information.
In this embodiment, the acquired depth information can be bound to the planar information of the biological feature captured by the camera module to construct the three-dimensional image of the corresponding biological feature.
The three-dimensional image of the biological feature obtained in the above embodiment can be compared against a preset image on the electronic device for biometric recognition, and a preset function operation can be completed based on the recognition result; for example, the electronic device can be unlocked when the three-dimensional image matches the preset image.
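The binding of per-pixel depth to planar image information can be sketched as a pinhole back-projection. The patent does not specify a projection model, so the pinhole assumption and the intrinsic parameters (fx, fy, cx, cy, normally obtained from camera calibration) are illustrative, not part of the disclosed method:

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Bind each pixel's depth to its planar position via a pinhole
    back-projection, yielding one (x, y, z) point per pixel -- a
    sketch of constructing the "three-dimensional image" of step 103."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            x = (u - cx) * z / fx   # offset left/right of the optical axis
            y = (v - cy) * z / fy   # offset above/below the optical axis
            points.append((x, y, z))
    return points

# 3x3 depth map with every point half a metre from the camera
depth = [[0.5] * 3 for _ in range(3)]
cloud = depth_to_points(depth, fx=500.0, fy=500.0, cx=1.0, cy=1.0)
print(len(cloud))   # 9
print(cloud[4])     # (0.0, 0.0, 0.5) -- the centre pixel lies on the axis
```

A real implementation would also carry each pixel's colour or grayscale value along with its 3-D coordinates so that the constructed image can be rendered and matched.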
As can be seen from the above embodiment, the present disclosure can construct a corresponding three-dimensional image based on the acquired depth information of the biological feature, which enables the electronic device to perform identity recognition based on the constructed three-dimensional image and improves the security of the electronic device.
To describe the technical solution of the present disclosure in detail, the following takes a mobile phone as an example to describe a specific implementation of the biometric recognition method. As shown in Fig. 2, the biometric recognition method may include the following steps.
In step 201, a predetermined physical key is clicked.
In this embodiment, as shown in Fig. 3, when the mobile phone 100 is in a locked-screen state, triggering the "home" key can cause the mobile phone 100 to start the camera module and construct a three-dimensional image according to depth information. Of course, in some other embodiments, whether to start the three-dimensional image construction function can also be decided according to the user's needs. The "home" key here is merely illustrative; preset virtual buttons such as "pay", "confirm", and "log in" may also be used.
In step 202, grayscale information of the image formed by the photographed subject on the mobile phone 100 is obtained.
In step 203, the biological-feature region and the other regions distinct from it are determined according to the grayscale information.
In this embodiment, because each position on the photographed subject differs in color and brightness, the gray values of the different regions of the image formed on the mobile phone 100 differ. Based on the grayscale discontinuity exhibited at the edges of the image, the boundary regions of the image can be obtained and used for segmentation, yielding the biological-feature region of the photographed subject and the other regions.
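The grayscale-discontinuity test of steps 202-203 can be illustrated with a minimal neighbour-difference sketch. The fixed threshold and the two-neighbour comparison rule are assumptions for illustration; the patent does not commit to a particular edge criterion, and a real pipeline would use a proper edge detector:

```python
def find_boundary_pixels(gray, threshold):
    """Mark pixels where the gray value jumps sharply relative to the
    pixel to the right or below -- a minimal grayscale-discontinuity
    test for locating the boundary region between image regions."""
    h, w = len(gray), len(gray[0])
    boundary = [[False] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            for dr, dc in ((0, 1), (1, 0)):   # right and down neighbours
                rr, cc = r + dr, c + dc
                if rr < h and cc < w and abs(gray[r][c] - gray[rr][cc]) > threshold:
                    boundary[r][c] = True
    return boundary

# A bright 2x2 "feature" in a dark background
img = [[10, 10, 10, 10],
       [10, 200, 200, 10],
       [10, 200, 200, 10],
       [10, 10, 10, 10]]
edges = find_boundary_pixels(img, threshold=50)
print(edges[0][1], edges[1][0], edges[2][2])  # True True True (boundary)
print(edges[1][1])                            # False (feature interior)
```

The part enclosed by the marked boundary pixels would then be taken as the biological-feature region, and everything else as background.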
For example, as shown in Fig. 4, the photographed subject may include a face part 10 and a background part 20. Because the physical distance between the face part 10 and the mobile phone 100 is shorter while the physical distance between the background part 20 and the mobile phone 100 is longer, the boundary region 30 between the background part 20 and the face part 10 can be determined where the change in depth information between adjacent areas exceeds a preset threshold. The first part enclosed by the boundary region 30 (i.e., the part enclosed by the boundary of the face part 10, which may also include a part enclosed jointly by that boundary and the edge of the whole image) is then identified as the face part, and the other parts of the image distinct from the first part (i.e., the background part 20) are identified as the background.
In step 204, detection light is emitted toward the biological feature on the photographed subject corresponding to the biological-feature region.
In step 205, reflected light is received.
In this embodiment, the detection light may be infrared light, which is reflected when it meets the photographed subject, producing the reflected light. The mobile phone 100 can then determine the physical distance between the current camera module and the biological feature of the photographed subject from the parameters of the detection light and the reflected light.
As shown in Fig. 5, when photographing an object 40, light signals can be emitted to each point of the object 40 in a preset order to measure the depth information of the respective points, for example in the left-to-right, top-to-bottom order shown in the figure. At the same time, the object 40 can be measured multiple times to obtain multiple groups of depth information, which are then weighted and averaged to obtain the final depth information; the weights can be set flexibly according to the actual situation, and the present disclosure is not limited in this respect. Measuring the object 40 multiple times in a preset order can improve the accuracy of the measured depth information and the three-dimensional effect of the image, further improving the accuracy of the subsequent identification of the subject and the background parts.
In step 206, the time difference is calculated from the phase difference between the detection light and the reflected light.
In step 207, the corresponding sub-depth information is calculated from the time difference.
In this embodiment, the sub-depth information may be the physical distance between the mobile phone 100 and the biological feature of the photographed subject measured in the current detection. Specifically, the physical distance can be calculated from the physical parameters of the emitted detection light and the received reflected light.
For example, the calculation can be performed according to the following relation:
d = c·Δτ/2 = c·Δφ/(4πf)
where d is the physical distance; c is the speed of light; π is the circle constant; f is the emission frequency of the detection light; Δφ is the phase difference; and Δτ = Δφ/(2πf) is the time difference.
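The computation of steps 206 and 207 can be checked numerically. This sketch assumes a standard continuous-wave time-of-flight relation, with Δτ = Δφ/(2πf) and the round trip halving the distance (d = cΔτ/2); the 10 MHz modulation frequency is an arbitrary example value:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_time_difference(delta_phi, f):
    """Step 206: convert the measured phase difference (radians) at
    modulation frequency f (Hz) into a time difference in seconds."""
    return delta_phi / (2 * math.pi * f)

def time_difference_to_distance(delta_tau):
    """Step 207: the light travels out and back, so the one-way
    distance is half the round-trip path, d = c * delta_tau / 2."""
    return C * delta_tau / 2

# Example: a 90-degree phase shift measured at a 10 MHz modulation frequency
delta_tau = phase_to_time_difference(math.pi / 2, f=10e6)
d = time_difference_to_distance(delta_tau)
print(round(d, 3))  # 3.747 (metres)
```

Each such distance is one piece of sub-depth information; repeating the measurement yields the values that are later filtered.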
In step 208, it is determined whether the quantity of sub-depth information has reached a preset threshold.
In this embodiment, when the quantity of sub-depth information reaches the preset threshold, step 209 is executed; when it has not reached the preset threshold, the method returns to step 204.
In step 209, the depth information is calculated using a mean filtering algorithm.
In this embodiment, applying mean filtering to the sub-depth information obtained from repeated detections helps improve the accuracy of the acquired depth information.
For example, suppose the three sub-depth values obtained for one of the measurement points on the object 40 are a, b, and c, with a > b > c. Then, using median filtering, the depth information of that measurement point is b; using mean filtering, it is (a + b + c)/3. Of course, in some other embodiments, for any preset area it is also possible to first apply median filtering to obtain the depth information of the corresponding points, then apply mean filtering over the preset area, and take the result of the mean filtering as the depth information of every point in that preset area.
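The two filters, and the combined per-point-median-then-area-mean variant described above, can be sketched directly. The sample readings are arbitrary illustrative values:

```python
def median_filter(values):
    """Median of repeated sub-depth readings for one measurement point."""
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def mean_filter(values):
    """Arithmetic mean of sub-depth readings."""
    return sum(values) / len(values)

# Three sub-depth readings a > b > c for one measurement point
a, b, c = 0.52, 0.50, 0.47
print(median_filter([a, b, c]))  # 0.5 -- the middle reading b
print(mean_filter([a, b, c]))    # (a + b + c) / 3

# Combined variant: median per point, then mean over the preset area
area = [[0.52, 0.50, 0.47], [0.55, 0.51, 0.49], [0.50, 0.50, 0.48]]
per_point = [median_filter(p) for p in area]   # [0.50, 0.51, 0.50]
print(round(mean_filter(per_point), 4))        # 0.5033
```

The median step suppresses outlier readings before averaging, which is why the combined variant can be more robust than mean filtering alone.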
In step 210, the three-dimensional image of the biological feature is constructed according to the depth information.
In step 211, the three-dimensional image is matched against a preset image to determine whether the match succeeds.
In this embodiment, when the three-dimensional image matches the preset image, step 212 is executed; when the match fails, the method returns to step 204 to reacquire the depth information of the corresponding biological feature and rebuild the three-dimensional image. Of course, in some other embodiments, the method may instead return to step 210 to rebuild the three-dimensional image directly, saving processing resources.
In step 212, the mobile phone 100 is unlocked.
In this embodiment, after the mobile phone is unlocked based on the recognition result, the user is allowed to perform subsequent operations on the phone. Of course, unlocking the phone is merely an example; the method can also be applied in a variety of scenarios such as payment and login.
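The comparison between the constructed three-dimensional image and the preset image can be sketched as a per-point depth comparison. The patent does not disclose a matching algorithm, so the tolerance, the agreement ratio, and the flat-list representation of the depth values are all illustrative assumptions:

```python
def match_depth_images(candidate, preset, tolerance=0.01, min_ratio=0.9):
    """Declare a match when at least min_ratio of corresponding depth
    values agree within tolerance -- an assumed stand-in for the
    matching step, not the patent's actual recognition algorithm."""
    pairs = list(zip(candidate, preset))
    close = sum(1 for x, y in pairs if abs(x - y) <= tolerance)
    return close / len(pairs) >= min_ratio

preset = [0.50, 0.51, 0.52, 0.53, 0.54]
print(match_depth_images([0.50, 0.51, 0.52, 0.53, 0.54], preset))  # True
print(match_depth_images([0.70, 0.71, 0.72, 0.73, 0.74], preset))  # False
```

On a match the device would proceed to unlock (or authorize payment, login, etc.); on a mismatch it would reacquire depth information and rebuild the image.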
Corresponding to the foregoing embodiments of the biometric recognition method, the present disclosure also provides embodiments of a biometric recognition device.
Fig. 6 is a block diagram of a biometric recognition device according to an exemplary embodiment. Referring to Fig. 6, the device 200 includes a receiving module 601, a determining module 602, and an imaging module 603.
The receiving module 601 is configured to receive a user trigger instruction directed at a camera module;
the determining module 602 is configured to determine, in response to the user trigger instruction received by the receiving module 601, depth information of a biological feature of a photographed subject;
the imaging module 603 is configured to construct a three-dimensional image of the biological feature according to the depth information determined by the determining module 602.
As shown in Fig. 7, which is a block diagram of another biometric recognition device according to an exemplary embodiment, this embodiment builds on the embodiment of Fig. 6, and the determining module 602 may include an emitting submodule 6021, a receiving submodule 6022, and an acquiring submodule 6023, in which:
the emitting submodule 6021 is configured to emit detection light toward the biological feature;
the receiving submodule 6022 is configured to receive reflected light produced after the detection light emitted by the emitting submodule 6021 is reflected by the biological feature of the photographed subject;
the acquiring submodule 6023 is configured to obtain the depth information according to the detection light emitted by the emitting submodule 6021 and the reflected light received by the receiving submodule 6022.
As shown in Fig. 8, which is a block diagram of another biometric recognition device according to an exemplary embodiment, this embodiment builds on the embodiment of Fig. 7, and the acquiring submodule 6023 may include an acquiring unit 6023A, in which:
the acquiring unit 6023A is configured to obtain the depth information according to the phase difference and the time difference between the detection light and the reflected light.
The acquiring unit 6023A may further include a first acquiring subunit and a second acquiring subunit, wherein the first acquiring subunit may be configured to obtain sub-depth information multiple times according to the phase difference and the time difference between the detection light and the reflected light, and the second acquiring subunit may be configured to apply mean filtering to the sub-depth information obtained by the first acquiring subunit to obtain the depth information.
As shown in Fig. 9, which is a block diagram of another biometric recognition device according to an exemplary embodiment, this embodiment builds on any of the embodiments of Figs. 6-8, and the receiving module 601 may include a first receiving submodule 6011, in which:
the first receiving submodule 6011 is configured to receive a first trigger instruction directed at a physical button on the electronic device.
Alternatively, as shown in Fig. 10, the receiving module 601 includes a second receiving submodule 6012, in which:
the second receiving submodule 6012 is configured to receive a second trigger instruction directed at a virtual button on a function page displayed on the electronic device.
As shown in Fig. 11, which is a block diagram of another biometric recognition device according to an exemplary embodiment, this embodiment builds on any of the embodiments of Figs. 6-9 and may further include:
a recognition module 604, configured to perform biometric recognition based on the three-dimensional image and a predefined image, and to complete a preset function operation based on the recognition result.
For the devices in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the related method, and will not be elaborated here.
Since the device embodiments substantially correspond to the method embodiments, reference may be made to the description of the method embodiments for relevant details. The device embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of the present disclosure, which those of ordinary skill in the art can understand and implement without creative effort.
Correspondingly, the present disclosure also provides a biometric recognition device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: receive a user trigger instruction directed at a camera module; in response to the user trigger instruction, determine depth information of a biological feature of a photographed subject; and construct a three-dimensional image of the biological feature according to the depth information.
Correspondingly, the present disclosure also provides a terminal including a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors, and include instructions for: receiving a user trigger instruction directed at a camera module; in response to the user trigger instruction, determining depth information of a biological feature of a photographed subject; and constructing a three-dimensional image of the biological feature according to the depth information.
Figure 12 is a kind of block diagram for biometric devices shown according to an exemplary embodiment.For example, dress
Setting 1200 can be mobile phone, computer, digital broadcasting terminal, messaging device, game console, tablet device, doctor
Treat equipment, body-building equipment, personal digital assistant etc..
Referring to Fig.1 2, device 1200 may include following one or more components: processing component 1202, memory 1204,
Power supply module 1206, multimedia component 1208, audio component 1210, the interface 1212 of input/output (I/O), sensor module
1214 and communication component 1216.
The processing component 1202 typically controls the overall operation of the device 1200, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 1202 may include one or more processors 1220 to execute instructions to perform all or part of the steps of the above methods. In addition, the processing component 1202 may include one or more modules to facilitate interaction between the processing component 1202 and other components; for example, the processing component 1202 may include a multimedia module to facilitate interaction between the multimedia component 1208 and the processing component 1202.
The memory 1204 is configured to store various types of data to support operation on the device 1200. Examples of such data include instructions for any application or method operated on the device 1200, contact data, phonebook data, messages, pictures, video, and so on. The memory 1204 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disc.
The power component 1206 provides power to the various components of the device 1200. The power component 1206 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1200.
The multimedia component 1208 includes a screen providing an output interface between the device 1200 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or slide action but also the duration and pressure associated with it. In some embodiments, the multimedia component 1208 includes a front camera and/or a rear camera. When the device 1200 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front and rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 1210 is configured to output and/or input audio signals. For example, the audio component 1210 includes a microphone (MIC) configured to receive external audio signals when the device 1200 is in an operation mode such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may be further stored in the memory 1204 or transmitted via the communication component 1216. In some embodiments, the audio component 1210 further includes a speaker for outputting audio signals.
The I/O interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
The sensor component 1214 includes one or more sensors for providing status assessments of various aspects of the device 1200. For example, the sensor component 1214 can detect the open/closed state of the device 1200 and the relative positioning of components (such as the display and keypad of the device 1200), and can also detect a change in position of the device 1200 or of one of its components, the presence or absence of contact between the user and the device 1200, the orientation or acceleration/deceleration of the device 1200, and a change in temperature of the device 1200. The sensor component 1214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 1214 may also include an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1216 is configured to facilitate wired or wireless communication between the device 1200 and other devices. The device 1200 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 1216 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 1216 further includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 1200 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, such as the memory 1204 including instructions, which are executable by the processor 1220 of the device 1200 to complete the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will readily occur to those skilled in the art after considering the specification and practicing the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles of the disclosure and include common knowledge or conventional techniques in the art not disclosed by this disclosure. The specification and examples are to be considered exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.
Claims (14)
1. A biometric recognition method, comprising:
receiving a user trigger instruction for a camera module;
in response to the user trigger instruction, determining depth information of a biological feature of a photographed object;
constructing a three-dimensional image of the biological feature according to the depth information.
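Read as a procedure, the three steps of claim 1 can be sketched as follows. This is a minimal illustration only; the `DepthFrame` stub and the `build_3d_image` helper are assumed names for the sketch, not terms from the patent:

```python
from dataclasses import dataclass


@dataclass
class DepthFrame:
    """Stub for the per-pixel depth information the camera module measures."""
    pixels: list  # rows of depth values (metres)


def build_3d_image(frame: DepthFrame) -> list:
    """Step 3: construct a 3-D image (here, a simple point cloud) from depth
    information, pairing each pixel coordinate with its measured depth."""
    return [(x, y, z)
            for y, row in enumerate(frame.pixels)
            for x, z in enumerate(row)]


def on_user_trigger(frame: DepthFrame) -> list:
    """Steps 1-2 (trigger received, depth measured) are assumed to have been
    handled by the hardware; this handler performs step 3."""
    return build_3d_image(frame)
```

A 1×2 depth frame, for instance, yields two 3-D points, one per pixel.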
2. The biometric recognition method according to claim 1, wherein determining the depth information of the biological feature of the photographed object comprises:
emitting detection light toward the biological feature;
receiving reflected light of the detection light after reflection by the biological feature of the photographed object;
obtaining the depth information according to the detection light and the reflected light.
3. The biometric recognition method according to claim 2, wherein obtaining the depth information according to the detection light and the reflected light comprises:
obtaining the depth information according to a phase difference and a time difference between the detection light and the reflected light.
4. The biometric recognition method according to claim 3, wherein obtaining the depth information according to the phase difference and the time difference between the detection light and the reflected light comprises:
obtaining sub-depth information multiple times according to the phase difference and the time difference between the detection light and the reflected light;
performing mean filtering or median filtering on the obtained sub-depth information to calculate the depth information.
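Claims 2 through 4 describe a time-of-flight-style measurement: depth follows from the round-trip time of the detection light (or, for modulated light, from its phase shift), and repeated sub-depth readings are smoothed by mean or median filtering. The following sketch illustrates those relationships; the specific formulas and the modulation-frequency parameter are standard TOF conventions assumed for illustration, not details given in the patent:

```python
import math
import statistics

C = 299_792_458.0  # speed of light, m/s


def depth_from_time_difference(delta_t_s: float) -> float:
    """Distance from the round-trip time: the light travels out and back,
    so the one-way distance is half the total path."""
    return C * delta_t_s / 2.0


def depth_from_phase_difference(delta_phi_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase shift of amplitude-modulated detection light."""
    return C * delta_phi_rad / (4.0 * math.pi * mod_freq_hz)


def fuse_sub_depths(sub_depths: list, use_median: bool = True) -> float:
    """Claim 4: combine repeatedly obtained sub-depth values. The median
    rejects outliers; the mean averages out zero-mean noise."""
    if use_median:
        return statistics.median(sub_depths)
    return statistics.fmean(sub_depths)
```

For example, a 2 ns round trip corresponds to roughly 0.3 m, and the median of the readings [0.29, 0.30, 0.31, 0.90] suppresses the outlying 0.90 m value.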
5. The biometric recognition method according to claim 1, wherein receiving the user trigger instruction for the camera module comprises:
receiving a first trigger instruction for a physical button on an electronic device; or,
receiving a second trigger instruction for a virtual button on a function page displayed on the electronic device.
6. The biometric recognition method according to claim 1, further comprising, after constructing the three-dimensional image of the biological feature according to the depth information:
performing biometric recognition based on the three-dimensional image and a predefined image, and completing a preset function operation based on the recognition result.
7. A biometric recognition apparatus, comprising:
a receiving module configured to receive a user trigger instruction for a camera module;
a determining module configured to determine, in response to the user trigger instruction received by the receiving module, depth information of a biological feature of a photographed object;
an imaging module configured to construct a three-dimensional image of the biological feature according to the depth information determined by the determining module.
8. The biometric recognition apparatus according to claim 7, wherein the determining module comprises:
an emitting submodule configured to emit detection light toward the biological feature;
a receiving submodule configured to receive reflected light of the detection light emitted by the emitting submodule after reflection by the biological feature of the photographed object;
an obtaining submodule configured to obtain the depth information according to the detection light emitted by the emitting submodule and the reflected light received by the receiving submodule.
9. The biometric recognition apparatus according to claim 8, wherein the obtaining submodule comprises:
an obtaining unit configured to obtain the depth information according to a phase difference and a time difference between the detection light and the reflected light.
10. The biometric recognition apparatus according to claim 9, wherein the obtaining unit comprises:
a first obtaining subunit configured to obtain sub-depth information multiple times according to the phase difference and the time difference between the detection light and the reflected light;
a second obtaining subunit configured to perform mean filtering on the sub-depth information obtained by the first obtaining subunit to obtain the depth information.
11. The biometric recognition apparatus according to claim 7, wherein the receiving module comprises:
a first receiving submodule configured to receive a first trigger instruction for a physical button on an electronic device; or,
a second receiving submodule configured to receive a second trigger instruction for a virtual button on a function page displayed on the electronic device.
12. The biometric recognition apparatus according to claim 7, further comprising:
a recognition module configured to perform biometric recognition based on the three-dimensional image and a predefined image, and to complete a preset function operation based on the recognition result.
13. A computer-readable storage medium having computer instructions stored thereon, wherein the instructions, when executed by a processor, implement the steps of the method according to any one of claims 1 to 6.
14. An electronic device, comprising:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the steps of the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711023405.7A CN109726611B (en) | 2017-10-27 | 2017-10-27 | Biological feature recognition method and device, readable storage medium and electronic equipment |
Publications (2)
Publication Number | Publication Date
---|---
CN109726611A | 2019-05-07
CN109726611B | 2021-07-23
Family
ID=66291686
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711023405.7A Active CN109726611B (en) | 2017-10-27 | 2017-10-27 | Biological feature recognition method and device, readable storage medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109726611B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110944112A (en) * | 2019-11-22 | 2020-03-31 | 维沃移动通信有限公司 | Image processing method and electronic equipment |
CN114584697A (en) * | 2020-11-16 | 2022-06-03 | 中国航发商用航空发动机有限责任公司 | Residue detection apparatus and method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1575524A (en) * | 2001-08-23 | 2005-02-02 | 华盛顿州大学 | Image acquisition with depth enhancement |
CN101866056A (en) * | 2010-05-28 | 2010-10-20 | 中国科学院合肥物质科学研究院 | 3D imaging method and system based on LED array common lens TOF depth measurement |
CN102073050A (en) * | 2010-12-17 | 2011-05-25 | 清华大学 | Depth-camera based three-dimensional scene depth measurement device |
CN104008366A (en) * | 2014-04-17 | 2014-08-27 | 深圳市唯特视科技有限公司 | 3D intelligent recognition method and system for biology |
CN104516560A (en) * | 2013-09-27 | 2015-04-15 | 联想(北京)有限公司 | Identification method, identification device and electronic equipment |
CN106485118A (en) * | 2016-09-19 | 2017-03-08 | 信利光电股份有限公司 | Electronic equipment and its identifying system, decryption method |
Also Published As
Publication number | Publication date |
---|---|
CN109726611B (en) | 2021-07-23 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |