CN109426758A - Method and device for acquiring skin characteristic information, and computer-readable storage medium - Google Patents
Method and device for acquiring skin characteristic information, and computer-readable storage medium
- Publication number
- CN109426758A (application number CN201710717477.5A / CN201710717477A)
- Authority
- CN
- China
- Prior art keywords
- skin
- image data
- information
- collected
- characteristic information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/13—Sensors therefor
- G06V40/1347—Preprocessing; Feature extraction
- G06V40/1365—Matching; Classification
Abstract
The present disclosure relates to a method and device for acquiring skin characteristic information, and a computer-readable storage medium. The method may include: obtaining image data of skin to be collected through a camera; extracting texture information from the image data, and using the extracted texture information as the characteristic information of the skin to be collected. With the technical solution of the disclosure, texture information can be acquired or recognized without the user's skin contacting a recognition region. Moreover, the texture information can be captured in a single shot, without collecting multiple times or repeatedly changing the skin region being collected, which improves the efficiency of acquiring skin characteristic information and helps improve the user experience.
Description
Technical field
The present disclosure relates to the field of terminal technology, and in particular to a method and device for acquiring skin characteristic information, and a computer-readable storage medium.
Background technique
Because skin textures such as fingerprints and palm prints are unique, recognition technologies based on them are widely used in identity authentication. Taking fingerprint recognition as an example: when a user first enrolls a fingerprint, the contact position between the finger and the fingerprint recognition region is varied multiple times so that complete fingerprint information can be enrolled as a fingerprint template. During subsequent fingerprint recognition, the detected fingerprint information (an analog signal) is converted into a digital signal by an ADC (analog-to-digital converter), the converted fingerprint information (grayscale information characterized by the digital signal) is matched against the fingerprint template, and when the matching degree exceeds a preset threshold, the currently recognized fingerprint is determined to pass authentication.
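The threshold check described in this background can be illustrated with a minimal sketch; the similarity measure (normalized cross-correlation) and the threshold value are illustrative assumptions, not what an actual fingerprint IC implements:

```python
import numpy as np

def match_score(sample: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation between two same-shaped grayscale images.

    Returns a matching degree in [-1, 1]; 1.0 means identical texture.
    """
    s = sample - sample.mean()
    t = template - template.mean()
    denom = np.linalg.norm(s) * np.linalg.norm(t)
    return float((s * t).sum() / denom) if denom else 0.0

def authenticate(sample: np.ndarray, template: np.ndarray, threshold: float = 0.8) -> bool:
    # Pass authentication when the matching degree exceeds the preset threshold.
    return match_score(sample, template) > threshold
```

In practice a fingerprint matcher compares minutiae rather than raw pixels, but the pass/fail decision has the same shape: a score against a stored template, gated by a preset threshold.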
Summary of the invention
The present disclosure provides a method and device for acquiring skin characteristic information, and a computer-readable storage medium, to address deficiencies in the related art.
According to a first aspect of embodiments of the present disclosure, a method for acquiring skin characteristic information is provided, applied to an electronic device. The method includes:
obtaining image data of skin to be collected through a camera;
extracting texture information from the image data, and using the extracted texture information as the characteristic information of the skin to be collected.
Optionally, obtaining the image data of the skin to be collected through the camera includes:
performing at least one shooting operation on the skin to be collected through the camera to obtain at least one corresponding piece of shot data;
wherein the image data in each piece of shot data contains at least a part of the skin to be collected, and the image data contained in all pieces of shot data can be combined to cover all regions to be collected of the skin.
Optionally, extracting the texture information from the image data includes:
performing grayscale processing on the image data to obtain first grayscale image data of the skin to be collected;
extracting first grayscale information from the first grayscale image data, and using the first grayscale information as the texture information.
Optionally, extracting the texture information from the image data includes:
extracting depth information of the skin to be collected contained in the image data, and using the depth information as the texture information.
Optionally, extracting the texture information from the image data includes:
extracting depth information of the skin to be collected contained in the image data;
performing grayscale processing on the image data according to the depth information to obtain second grayscale image data of the skin to be collected;
extracting second grayscale information from the second grayscale image data, and using the second grayscale information as the texture information.
Optionally, the characteristic information is used for comparison and verification against preset template characteristic information, or is used as preset template characteristic information.
According to a second aspect of embodiments of the present disclosure, a device for acquiring skin characteristic information is provided, applied to an electronic device. The device includes:
an acquiring unit, which obtains image data of skin to be collected through a camera;
an extraction unit, which extracts texture information from the image data and uses the extracted texture information as the characteristic information of the skin to be collected.
Optionally, the acquiring unit includes:
a shooting subunit, which performs at least one shooting operation on the skin to be collected through the camera to obtain at least one corresponding piece of shot data;
wherein the image data in each piece of shot data contains at least a part of the skin to be collected, and the image data contained in all pieces of shot data can be combined to cover all regions to be collected of the skin.
Optionally, the extraction unit includes:
a first grayscale-processing subunit, which performs grayscale processing on the image data to obtain first grayscale image data of the skin to be collected;
a first extraction subunit, which extracts first grayscale information from the first grayscale image data and uses the first grayscale information as the texture information.
Optionally, the extraction unit includes:
a second extraction subunit, which extracts depth information of the skin to be collected contained in the image data and uses the depth information as the texture information.
Optionally, the extraction unit includes:
a third extraction subunit, which extracts depth information of the skin to be collected contained in the image data;
a second grayscale-processing subunit, which performs grayscale processing on the image data according to the depth information to obtain second grayscale image data of the skin to be collected;
a fourth extraction subunit, which extracts second grayscale information from the second grayscale image data and uses the second grayscale information as the texture information.
Optionally, the characteristic information is used for comparison and verification against preset template characteristic information, or is used as preset template characteristic information.
According to a third aspect of embodiments of the present disclosure, a device for acquiring skin characteristic information is provided, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method described in any of the above embodiments.
According to a fourth aspect of embodiments of the present disclosure, a computer-readable storage medium is provided, on which computer instructions are stored; when the instructions are executed by a processor, the steps of the method described in any of the above embodiments are implemented.
The technical solutions provided by embodiments of the present disclosure may include the following beneficial effects:
As can be seen from the above embodiments, the present disclosure obtains the texture information of skin through a camera, whether to serve as template characteristic information for recognizing that skin texture or to verify that skin, so that texture information can be acquired or recognized without the user's skin contacting a recognition region. Moreover, the texture information can be captured in a single shot, without collecting multiple times or repeatedly changing the skin region being collected, which improves the efficiency of acquiring skin characteristic information and helps improve the user experience.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the disclosure.
Fig. 1-2 are schematic diagrams of collecting a fingerprint in the related art.
Fig. 3 is a flowchart of a method for acquiring skin characteristic information according to an exemplary embodiment.
Fig. 4 is a flowchart of another method for acquiring skin characteristic information according to an exemplary embodiment.
Fig. 5 is a schematic diagram of a front camera of an electronic device collecting fingerprint characteristic information according to an exemplary embodiment.
Fig. 6 is a schematic diagram of image data shot by the front camera of an electronic device according to an exemplary embodiment.
Fig. 7 is a schematic diagram of dividing subregions to be collected according to an exemplary embodiment.
Fig. 8 is a flowchart of yet another method for acquiring skin characteristic information according to an exemplary embodiment.
Fig. 9-10 are schematic diagrams of a TOF camera shooting depth information according to an exemplary embodiment.
Figure 11 is a block diagram of a device for acquiring skin characteristic information according to an exemplary embodiment.
Figure 12 is a block diagram of another device for acquiring skin characteristic information according to an exemplary embodiment.
Figure 13 is a block diagram of another device for acquiring skin characteristic information according to an exemplary embodiment.
Figure 14 is a block diagram of another device for acquiring skin characteristic information according to an exemplary embodiment.
Figure 15 is a block diagram of another device for acquiring skin characteristic information according to an exemplary embodiment.
Figure 16 is a schematic structural diagram of a device for acquiring skin characteristic information according to an exemplary embodiment.
Detailed description of embodiments
Exemplary embodiments will be described in detail here, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of devices and methods consistent with some aspects of the application, as detailed in the appended claims.
The terms used in this application are for the purpose of describing particular embodiments only and are not intended to limit the application. The singular forms "a", "an", "said" and "the" used in this application and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this application to describe various information, the information should not be limited by these terms. These terms are only used to distinguish information of the same type from each other. For example, without departing from the scope of the application, first information may also be referred to as second information, and similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
Skin textures such as fingerprints and palm prints are widely used in identity authentication because of their uniqueness. Taking fingerprint recognition as an example, Fig. 1-2 are schematic diagrams of collecting a fingerprint in the related art. As shown in Fig. 1, the finger 10 needs to contact the fingerprint recognition region 20 for its fingerprint to be collected. Meanwhile, because the fingerprint module is limited by the area of the fingerprint sensor's actual sensing region (typically 3 mm × 6 mm, much smaller than the fingerprint-bearing region of the finger 10), the user must repeatedly change the contact position between the finger 10 and the fingerprint recognition region 20 during collection. As shown in Fig. 2, in order to collect the fingerprint of regions of the finger 10 other than the region covered in Fig. 1, the finger 10 needs to move upward (compared with the position of the finger 10 in Fig. 1). It can be seen that the finger 10 ultimately needs to be moved to multiple positions to collect the fingerprint of its entire area, which reduces the efficiency of fingerprint collection and results in a poor user experience.
Therefore, the present disclosure improves the way fingerprints are collected to solve the above technical problems in the related art, as described in detail below with reference to the embodiments.
Fig. 3 is a flowchart of a method for acquiring skin characteristic information according to an exemplary embodiment. As shown in Fig. 3, the method is applied to an electronic device and may include the following steps:
In step 302, image data of skin to be collected is obtained through a camera.
In this embodiment, at least one shooting operation may be performed on the skin to be collected through the camera to obtain at least one corresponding piece of shot data, wherein the image data in each piece of shot data contains at least a part of the skin to be collected, and the image data contained in all pieces of shot data can be combined to cover all regions to be collected of the skin. In one case, the image data of all regions to be collected can be captured in a single shot (the image data captured at once contains all the texture information of the skin to be collected), which avoids collecting image data multiple times and improves the efficiency of acquiring skin characteristic information. In another case, the region to be collected can first be divided into multiple subregions, and the image-data collection operation then performed on each subregion separately, which improves the signal-to-noise ratio of the collected image data (i.e., improves the image quality of what is collected). In yet another case, the image data of the region to be collected can be shot multiple times from different angles, and the collected image data superimposed according to preset weights, to improve the accuracy of the finally obtained image data.
In step 304, texture information is extracted from the image data, and the extracted texture information is used as the characteristic information of the skin to be collected.
In one embodiment, grayscale processing may first be performed on the image data to obtain first grayscale image data of the skin to be collected, and first grayscale information may then be extracted from the first grayscale image data and used as the texture information. By converting the image data of the skin to be collected obtained by the camera to grayscale and then extracting the grayscale information as the texture information, texture information can be acquired without the user's skin contacting a recognition region, which helps improve the efficiency of acquiring texture information.
In another embodiment, since the skin to be collected has ridges (the raised parts between skin lines) and valleys (the recessed parts between skin lines), the difference in distance between ridges and valleys can serve as the basis for recognizing the skin to be collected. For example, depth information of the skin to be collected contained in the image data can be extracted and used as the texture information. The image data can be obtained by shooting with a depth camera, such as a camera using binocular RGB, structured light, or TOF (Time of Flight) technology.
In yet another embodiment, building on the depth-information acquisition in the above embodiment, the depth information of the skin to be collected contained in the image data may first be extracted, grayscale processing may be performed on the image data according to the depth information to obtain second grayscale image data of the skin to be collected, and second grayscale information may then be extracted from the second grayscale image data and used as the texture information. By performing grayscale processing according to the depth information of the skin to be collected and extracting the grayscale information as the texture information, texture information can likewise be acquired without the user's skin contacting a recognition region, which helps improve the efficiency of acquiring texture information.
It should be noted that the characteristic information can be used for comparison and verification against preset template characteristic information, or used as preset template characteristic information.
As can be seen from the above embodiments, the present disclosure obtains the texture information of skin through a camera, whether to serve as template characteristic information for recognizing that skin texture or to verify that skin, so that texture information can be acquired or recognized without the user's skin contacting a recognition region. Moreover, the texture information can be captured in a single shot, without collecting multiple times or repeatedly changing the skin region being collected, which improves the efficiency of acquiring skin characteristic information and helps improve the user experience.
For ease of understanding, the technical solution of the present disclosure is described in detail below with reference to the drawings. Referring to Fig. 4, Fig. 4 is a flowchart of another method for acquiring skin characteristic information according to an exemplary embodiment. As shown in Fig. 4, the method is applied to an electronic device and may include the following steps:
In step 402, image data of skin to be collected is obtained through a camera.
In this embodiment, the technical solution of the present disclosure can be provided as a dedicated collection mode or collection function; when the user enables that collection mode or function on an electronic device, the method for acquiring skin characteristic information based on the technical solution of the disclosure can be carried out.
Regarding the manner of collecting image data: in one case, the image data of all regions to be collected can be captured in a single shot. As shown in Fig. 5, taking the front camera of an electronic device (it could, of course, also be a rear camera) collecting fingerprint characteristic information as an example: the camera 30 captures all the image data of the finger 10 in one shot to obtain the fingerprint characteristic information of the finger 10. Assuming the shot image data is as shown in Fig. 6(a), all the shot image data can then be displayed for the user to select the region from which to collect fingerprint characteristic information. For example, as shown in Fig. 6(b), all the shot image data include a fingerprint region 101 (which contains the fingerprint 1011) and a non-fingerprint region 102, and the user can select the fingerprint region 101 as the region for collecting fingerprint characteristic information. Subsequently, characteristic information only needs to be extracted from the fingerprint region 101, which improves the efficiency of acquiring fingerprint characteristic information. And by capturing all the image data of the skin to be collected in a single shot, the operation of collecting image data multiple times can be avoided, which also improves the efficiency of acquiring skin characteristic information.
In another case, the region to be collected can be divided into multiple subregions (which, combined, cover all areas of the skin to be collected), and the image-data collection operation then performed on each subregion separately, which improves the signal-to-noise ratio of the collected image data (i.e., improves the image quality of what is collected). For example, as shown in Fig. 7, the region to be collected can be divided into subregion A and subregion B, and the image-data collection operation performed on subregions A and B separately.
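The subregion scheme just described can be sketched as follows; the split into two vertical halves and the averaging of several shots per subregion (one simple way to raise the SNR) are illustrative assumptions:

```python
import numpy as np

def split_into_subregions(region_shape):
    """Divide a (rows, cols) region into two vertical halves, A and B."""
    rows, cols = region_shape
    return {"A": (slice(0, rows), slice(0, cols // 2)),
            "B": (slice(0, rows), slice(cols // 2, cols))}

def collect_by_subregion(shoot, region_shape, shots_per_subregion=4):
    """Shoot each subregion several times, average to reduce noise,
    then stitch the subregion images back into the full region."""
    full = np.zeros(region_shape, dtype=float)
    for name, region_slices in split_into_subregions(region_shape).items():
        shots = [shoot(name) for _ in range(shots_per_subregion)]
        full[region_slices] = np.mean(shots, axis=0)  # averaging raises SNR
    return full
```

Here `shoot(name)` stands in for the camera capturing one frame of the named subregion; averaging N independent shots reduces the noise standard deviation by roughly a factor of sqrt(N).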
In yet another case, the image data of the region to be collected can be shot multiple times from different angles, and the collected image data superimposed according to preset weights, to improve the accuracy of the finally obtained image data. For example, image data of the finger 10 can be shot from the front and from the side respectively; when superimposing the image data from all angles, the superposition weights for the different angles can be preconfigured. For instance, the weight of the image data shot from the side can be relatively small compared with the weight of the image data shot from the front; of course, the present disclosure is not limited to this.
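The weighted superposition above can be sketched directly; the specific weight values and the normalization step are assumptions for illustration:

```python
import numpy as np

def weighted_superpose(images, weights):
    """Combine same-sized shots taken from different angles into one image,
    weighting each shot by how much its viewpoint is trusted."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()  # normalize so pixel values stay in range
    stack = np.stack([np.asarray(im, dtype=float) for im in images])
    # Weighted sum over the shot axis: sum_k weights[k] * stack[k]
    return np.tensordot(weights, stack, axes=1)

# E.g. trust the frontal shot more than the two side shots:
# combined = weighted_superpose([front, left, right], [0.6, 0.2, 0.2])
```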
In step 404, grayscale processing is performed on the collected image data to obtain first grayscale image data of the skin to be collected.
In this embodiment, the collected image data can be converted to grayscale by methods such as integer arithmetic or integer shifting; of course, the present application does not limit the manner of grayscale processing. After the grayscale processing is performed on the image data, the texture distribution of the skin to be collected falls within a certain grayscale range. For example, depending on the requirements of different fingerprint ICs (integrated circuits), it can be distributed within grayscale ranges such as 0-31 or 0-255.
In step 406, first grayscale information is extracted from the first grayscale image data and used as the texture information.
In step 408, the texture information is used as the characteristic information of the skin to be collected.
In this embodiment, the obtained characteristic information can be used for comparison and verification against preset template characteristic information, or used as preset template characteristic information. For example, a fingerprint to be verified can be recognized through the technical solution of the present disclosure, or the characteristic information of a fingerprint can be enrolled through the technical solution of the disclosure for subsequent fingerprint recognition. In addition, after the template characteristic information is determined, the image data of the skin to be collected obtained by the camera can be deleted, to prevent misappropriation by other people or other software.
By converting the image data of the skin to be collected obtained by the camera to grayscale and then extracting the grayscale information as the texture information, texture information can be acquired without the user's skin contacting a recognition region, which helps improve the efficiency of acquiring texture information.
Referring to Fig. 8, Fig. 8 is a flowchart of yet another method for acquiring skin characteristic information according to an exemplary embodiment. As shown in Fig. 8, the method is applied to an electronic device and may include the following steps:
In step 802, image data of skin to be collected is obtained through a camera.
In step 804, depth information of the skin to be collected contained in the collected image data is extracted.
In step 806, grayscale processing is performed on the image data according to the depth information to obtain second grayscale image data of the skin to be collected.
In step 808, second grayscale information is extracted from the second grayscale image data.
In step 810, following step 804, the depth information is used as the texture information, and the obtained texture information is used as the characteristic information of the skin to be collected; or, following step 808, the second grayscale information is used as the texture information, and the obtained texture information is used as the characteristic information of the skin to be collected.
In this embodiment, following step 804: since the skin to be collected has ridges (the raised parts between skin lines) and valleys (the recessed parts between skin lines), the difference in distance between ridges and valleys can serve as the basis for recognizing the skin to be collected. The image data can be obtained by shooting with a depth camera, such as a camera using binocular RGB, structured light, or TOF (Time of Flight) technology.
Taking TOF technology as an example, as shown in Figs. 9-10, the TOF camera 40 emits an optical signal (the emitted signal) toward the subject 50 and receives the returned optical signal (the return signal). From the phase difference φ between the emitted signal and the corresponding return signal, the round-trip time can be calculated as t = φ/(2πf), where f is the frequency of the optical signal. From the speed of light c, the distance between the TOF camera 40 and the subject 50 is then d = c·t/2 = cφ/(4πf). Similarly, the skin to be collected can be shot by a TOF camera (configured in an electronic device) to obtain the respective distances between the TOF camera and the ridges of the skin to be collected (the texture closer to the TOF camera) and its valleys (the texture farther from the TOF camera), and the obtained distances can be used as the texture information.
Following from step 808, based on the extraction of depth information in step 804, grayscale processing may first be applied to the image data according to the depth information to obtain second grayscale image data of the skin to be collected; the second grayscale information is then extracted from the second grayscale image data and used as the texture information. For example, since the ridges of the skin to be collected are closer to the TOF camera and the valleys are farther from it, the gray values of the portions of the second grayscale image data corresponding to the ridges are greater than those corresponding to the valleys. By performing grayscale processing according to the depth information of the skin to be collected and extracting the grayscale information as the texture information, the texture information can likewise be acquired without the user's skin contacting an identification region, which helps improve the efficiency of acquiring the texture information.
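The depth-based grayscale processing described above can be sketched as follows (a minimal illustration assuming a plain nested-list depth map; the patent does not prescribe a particular mapping, so a simple linear inversion is used so that closer points, i.e. ridges, receive larger gray values):

```python
def depth_to_gray(depth_map):
    """Grayscale the image data according to its depth information:
    smaller depth (ridge, closer to the TOF camera) -> larger gray value,
    larger depth (valley, farther away) -> smaller gray value."""
    flat = [d for row in depth_map for d in row]
    d_min, d_max = min(flat), max(flat)
    span = (d_max - d_min) or 1.0  # avoid division by zero on flat input
    return [[round(255 * (d_max - d) / span) for d in row]
            for row in depth_map]

# The ridge at 0.100 m maps to a brighter pixel than the valley at 0.101 m.
gray = depth_to_gray([[0.100, 0.101]])
assert gray[0][0] > gray[0][1]
```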
In the present embodiment, for the details of step 802 and steps 806-810, reference may be made to the corresponding steps of the embodiment shown in FIG. 4, which are not repeated here.
Corresponding to the foregoing embodiments of the method for acquiring skin characteristic information, the present disclosure further provides embodiments of an apparatus for acquiring skin characteristic information.
FIG. 11 is a block diagram of an apparatus for acquiring skin characteristic information according to an exemplary embodiment. Referring to FIG. 11, the apparatus includes an acquiring unit 110 and an extraction unit 120.
The acquiring unit 110 is configured to obtain image data of skin to be collected through a camera;
the extraction unit 120 is configured to extract texture information from the image data and use the extracted texture information as the characteristic information of the skin to be collected.
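The two units can be sketched as plain classes (an illustrative skeleton only; the camera object and the extractor callable are assumptions standing in for the real camera driver and texture-extraction routine):

```python
class AcquiringUnit:
    """Obtains image data of the skin to be collected through a camera."""
    def __init__(self, camera):
        self.camera = camera  # any object exposing capture()

    def acquire(self):
        return self.camera.capture()

class ExtractionUnit:
    """Extracts texture information from image data; the result serves
    as the characteristic information of the skin to be collected."""
    def __init__(self, extractor):
        self.extractor = extractor  # callable: image data -> texture info

    def extract(self, image_data):
        return self.extractor(image_data)

class FakeCamera:
    """Stand-in camera returning a tiny 2x2 image."""
    def capture(self):
        return [[10, 20], [30, 40]]

acquiring_unit = AcquiringUnit(FakeCamera())
extraction_unit = ExtractionUnit(lambda img: [v for row in img for v in row])
characteristic = extraction_unit.extract(acquiring_unit.acquire())
assert characteristic == [10, 20, 30, 40]
```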
As shown in FIG. 12, which is a block diagram of another apparatus for acquiring skin characteristic information according to an exemplary embodiment, this embodiment builds on the embodiment of FIG. 11; the acquiring unit 110 may include a shooting subunit 1101.
The shooting subunit 1101 is configured to perform at least one shooting operation on the skin to be collected through the camera, obtaining at least one corresponding set of photographed data;
wherein each set of photographed data contains image data of at least part of the skin to be collected, and the image data contained in all the photographed data can be combined to cover all regions to be collected of the skin to be collected.
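The coverage requirement above — that the image data from all shots combine to cover every region to be collected — can be sketched as a simple set-union check (the region labels are hypothetical; a real implementation would work with image coordinates):

```python
def covers_all_regions(shots, regions_to_collect):
    """Each shot records which regions of the skin to be collected its
    image data contains; the shots together must cover every region."""
    covered = set()
    for shot in shots:
        covered |= set(shot)
    return covered >= set(regions_to_collect)

regions = {"upper-left", "upper-right", "lower-left", "lower-right"}
shots = [{"upper-left", "upper-right"},  # first shooting operation
         {"lower-left"},                 # second
         {"lower-right"}]                # third
assert covers_all_regions(shots, regions)
assert not covers_all_regions(shots[:2], regions)  # one region missing
```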
As shown in FIG. 13, which is a block diagram of another apparatus for acquiring skin characteristic information according to an exemplary embodiment, this embodiment builds on the embodiment of FIG. 11; the extraction unit 120 may include a first grayscale processing subunit 1201 and a first extraction subunit 1202.
The first grayscale processing subunit 1201 is configured to perform grayscale processing on the image data to obtain first grayscale image data of the skin to be collected;
the first extraction subunit 1202 is configured to extract first grayscale information from the first grayscale image data and use the first grayscale information as the texture information.
It should be noted that the structures of the first grayscale processing subunit 1201 and the first extraction subunit 1202 in the apparatus embodiment shown in FIG. 13 may also be included in the apparatus embodiment of FIG. 12; the present disclosure is not limited in this respect.
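The grayscale processing performed by subunit 1201 can be sketched as follows (a minimal illustration; the patent does not specify the conversion formula, so the common ITU-R BT.601 luma weights are assumed):

```python
def rgb_to_gray(image):
    """Perform grayscale processing on RGB image data to obtain first
    grayscale image data (one gray value per pixel)."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b)
             for (r, g, b) in row]
            for row in image]

# One row of pure red, green, and blue pixels.
assert rgb_to_gray([[(255, 0, 0), (0, 255, 0), (0, 0, 255)]]) == [[76, 150, 29]]
```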
As shown in FIG. 14, which is a block diagram of another apparatus for acquiring skin characteristic information according to an exemplary embodiment, this embodiment builds on the embodiment of FIG. 11; the extraction unit 120 may include a second extraction subunit 1203.
The second extraction subunit 1203 is configured to extract the depth information of the skin to be collected contained in the image data and use the depth information as the texture information.
It should be noted that the structure of the second extraction subunit 1203 in the apparatus embodiment shown in FIG. 14 may also be included in the apparatus embodiment of FIG. 12; the present disclosure is not limited in this respect.
As shown in FIG. 15, which is a block diagram of another apparatus for acquiring skin characteristic information according to an exemplary embodiment, this embodiment builds on the embodiment of FIG. 11; the extraction unit 120 may include a third extraction subunit 1204, a second grayscale processing subunit 1205, and a fourth extraction subunit 1206.
The third extraction subunit 1204 is configured to extract the depth information of the skin to be collected contained in the image data;
the second grayscale processing subunit 1205 is configured to perform grayscale processing on the image data according to the depth information to obtain second grayscale image data of the skin to be collected;
the fourth extraction subunit 1206 is configured to extract second grayscale information from the second grayscale image data and use the second grayscale information as the texture information.
It should be noted that the structures of the third extraction subunit 1204, the second grayscale processing subunit 1205, and the fourth extraction subunit 1206 in the apparatus embodiment shown in FIG. 15 may also be included in the apparatus embodiment of FIG. 12; the present disclosure is not limited in this respect.
Optionally, the characteristic information is used for comparison and verification against preset template characteristic information, or serves as preset template characteristic information.
With regard to the apparatus in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the related method embodiments and will not be elaborated here.
Since the apparatus embodiments substantially correspond to the method embodiments, reference may be made to the relevant descriptions of the method embodiments. The apparatus embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solution of the present disclosure. Those of ordinary skill in the art can understand and implement the embodiments without creative effort.
Correspondingly, the present disclosure also provides an apparatus for acquiring skin characteristic information, comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to: obtain image data of skin to be collected through a camera; and extract texture information from the image data and use the extracted texture information as the characteristic information of the skin to be collected.
Correspondingly, the present disclosure also provides a terminal comprising a memory and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs including instructions for: obtaining image data of skin to be collected through a camera; and extracting texture information from the image data and using the extracted texture information as the characteristic information of the skin to be collected.
FIG. 16 is a block diagram of a device 1600 for ... according to an exemplary embodiment. For example, the device 1600 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to FIG. 16, the device 1600 may include one or more of the following components: a processing component 1602, a memory 1604, a power component 1606, a multimedia component 1608, an audio component 1610, an input/output (I/O) interface 1612, a sensor component 1614, and a communication component 1616.
The processing component 1602 generally controls the overall operations of the device 1600, such as operations associated with display, telephone calls, data communication, camera operations, and recording operations. The processing component 1602 may include one or more processors 1620 to execute instructions to perform all or part of the steps of the methods described above. In addition, the processing component 1602 may include one or more modules to facilitate interaction between the processing component 1602 and other components. For example, the processing component 1602 may include a multimedia module to facilitate interaction between the multimedia component 1608 and the processing component 1602.
The memory 1604 is configured to store various types of data to support operation of the device 1600. Examples of such data include instructions for any application or method operated on the device 1600, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1604 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disc.
The power component 1606 provides power for the various components of the device 1600. The power component 1606 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1600.
The multimedia component 1608 includes a screen providing an output interface between the device 1600 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1608 includes a front camera and/or a rear camera. When the device 1600 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capabilities.
The audio component 1610 is configured to output and/or input audio signals. For example, the audio component 1610 includes a microphone (MIC) configured to receive external audio signals when the device 1600 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may be further stored in the memory 1604 or transmitted via the communication component 1616. In some embodiments, the audio component 1610 further includes a speaker for outputting audio signals.
The I/O interface 1612 provides an interface between the processing component 1602 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, or the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 1614 includes one or more sensors for providing status assessments of various aspects of the device 1600. For example, the sensor component 1614 can detect the open/closed state of the device 1600 and the relative positioning of components, such as the display and keypad of the device 1600; the sensor component 1614 can also detect a change in position of the device 1600 or a component of the device 1600, the presence or absence of user contact with the device 1600, the orientation or acceleration/deceleration of the device 1600, and a change in temperature of the device 1600. The sensor component 1614 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 1614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1614 may also include an accelerometer, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1616 is configured to facilitate wired or wireless communication between the device 1600 and other devices. The device 1600 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 1616 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 1616 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 1600 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the methods described above.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 1604 including instructions executable by the processor 1620 of the device 1600 to perform the methods described above. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within common knowledge or customary practice in the art. The specification and examples are to be considered exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.
Claims (14)
1. A method for acquiring skin characteristic information, applied to an electronic device, the method comprising:
obtaining image data of skin to be collected through a camera; and
extracting texture information from the image data, and using the extracted texture information as characteristic information of the skin to be collected.
2. The method according to claim 1, wherein obtaining the image data of the skin to be collected through the camera comprises:
performing at least one shooting operation on the skin to be collected through the camera to obtain at least one corresponding set of photographed data;
wherein each set of photographed data contains image data of at least part of the skin to be collected, and the image data contained in all the photographed data can be combined to cover all regions to be collected of the skin to be collected.
3. The method according to claim 1, wherein extracting the texture information from the image data comprises:
performing grayscale processing on the image data to obtain first grayscale image data of the skin to be collected; and
extracting first grayscale information from the first grayscale image data, and using the first grayscale information as the texture information.
4. The method according to claim 1, wherein extracting the texture information from the image data comprises:
extracting depth information of the skin to be collected contained in the image data, and using the depth information as the texture information.
5. The method according to claim 1, wherein extracting the texture information from the image data comprises:
extracting depth information of the skin to be collected contained in the image data;
performing grayscale processing on the image data according to the depth information to obtain second grayscale image data of the skin to be collected; and
extracting second grayscale information from the second grayscale image data, and using the second grayscale information as the texture information.
6. The method according to claim 1, wherein the characteristic information is used for comparison and verification against preset template characteristic information, or serves as preset template characteristic information.
7. An apparatus for acquiring skin characteristic information, applied to an electronic device, the apparatus comprising:
an acquiring unit configured to obtain image data of skin to be collected through a camera; and
an extraction unit configured to extract texture information from the image data and use the extracted texture information as characteristic information of the skin to be collected.
8. The apparatus according to claim 7, wherein the acquiring unit comprises:
a shooting subunit configured to perform at least one shooting operation on the skin to be collected through the camera to obtain at least one corresponding set of photographed data;
wherein each set of photographed data contains image data of at least part of the skin to be collected, and the image data contained in all the photographed data can be combined to cover all regions to be collected of the skin to be collected.
9. The apparatus according to claim 7, wherein the extraction unit comprises:
a first grayscale processing subunit configured to perform grayscale processing on the image data to obtain first grayscale image data of the skin to be collected; and
a first extraction subunit configured to extract first grayscale information from the first grayscale image data and use the first grayscale information as the texture information.
10. The apparatus according to claim 7, wherein the extraction unit comprises:
a second extraction subunit configured to extract depth information of the skin to be collected contained in the image data and use the depth information as the texture information.
11. The apparatus according to claim 7, wherein the extraction unit comprises:
a third extraction subunit configured to extract depth information of the skin to be collected contained in the image data;
a second grayscale processing subunit configured to perform grayscale processing on the image data according to the depth information to obtain second grayscale image data of the skin to be collected; and
a fourth extraction subunit configured to extract second grayscale information from the second grayscale image data and use the second grayscale information as the texture information.
12. The apparatus according to claim 7, wherein the characteristic information is used for comparison and verification against preset template characteristic information, or serves as preset template characteristic information.
13. An apparatus for acquiring skin characteristic information, comprising:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method according to any one of claims 1-6.
14. A computer-readable storage medium having computer instructions stored thereon, wherein the instructions, when executed by a processor, implement the steps of the method according to any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710717477.5A CN109426758A (en) | 2017-08-21 | 2017-08-21 | Acquisition method and device, the computer readable storage medium of skin characteristic information |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109426758A true CN109426758A (en) | 2019-03-05 |
Family
ID=65497569
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710717477.5A Pending CN109426758A (en) | 2017-08-21 | 2017-08-21 | Acquisition method and device, the computer readable storage medium of skin characteristic information |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109426758A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110664383A (en) * | 2019-09-27 | 2020-01-10 | 北京小米移动软件有限公司 | Thermometer, and storage method and device of body temperature information |
TWI790449B (en) * | 2020-01-21 | 2023-01-21 | 神盾股份有限公司 | Fingerprint identification device and fingerprint identification method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102682440A (en) * | 2011-02-18 | 2012-09-19 | 佳能株式会社 | Image processing apparatus, image capturing apparatus, and image processing method |
CN103268483A (en) * | 2013-05-31 | 2013-08-28 | 沈阳工业大学 | Method for recognizing palmprint acquired in non-contact mode in open environment |
CN104361315A (en) * | 2014-10-27 | 2015-02-18 | 浙江工业大学 | 3D (three-dimensional) fingerprint recognition device based on monocular and multi-view stereoscopic machine vision |
CN104751040A (en) * | 2014-07-25 | 2015-07-01 | 北京智膜科技有限公司 | Fingerprint detection method based on intelligent mobile information equipment |
CN104751113A (en) * | 2014-07-25 | 2015-07-01 | 北京智膜科技有限公司 | Fingerprint recognition method based on intelligent mobile information device |
CN105022947A (en) * | 2015-07-28 | 2015-11-04 | 广东欧珀移动通信有限公司 | Fingerprint identification method for smartwatch and smartwatch |
CN106843588A (en) * | 2017-01-24 | 2017-06-13 | 北京小米移动软件有限公司 | Screen control method and device |
Non-Patent Citations (1)

Title |
---|
YONGCHANG WANG et al., "Data Acquisition and Processing of 3-D Fingerprints", IEEE Transactions on Information Forensics and Security |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190305 |