CN108345845A - Image sensor, camera module, mobile terminal, face recognition method and device - Google Patents
- Publication number
- CN108345845A CN108345845A CN201810083033.5A CN201810083033A CN108345845A CN 108345845 A CN108345845 A CN 108345845A CN 201810083033 A CN201810083033 A CN 201810083033A CN 108345845 A CN108345845 A CN 108345845A
- Authority
- CN
- China
- Prior art keywords
- image
- light
- region
- face
- photosensitive
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/17—Image acquisition using hand-held instruments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
Abstract
The invention discloses an image sensor, a camera module, a mobile terminal, and a face recognition method and device. The image sensor includes a microlens layer, a photosensitive element layer, and a filter layer arranged between the microlens layer and the photosensitive element layer. Incident light passes through the microlens layer and the filter layer in turn and reaches the photosensitive element layer, which includes a first photosensitive region corresponding to visible light and a second photosensitive region corresponding to infrared light. By matching the image features of an infrared image of the face to be verified, the invention can determine whether the face in front of the camera is a real face or merely a photograph of a face, improving the security and accuracy of face recognition.
Description
Technical field
The present invention relates to the field of image processing, and in particular to an image sensor, a camera module, a mobile terminal, and a face recognition method and device.
Background technology
With the adoption of face recognition technology on smart mobile terminals, face recognition has attracted growing attention. As a very important branch of biometric identification, it has become a highly active research field in computer vision and pattern recognition.
Face recognition technology identifies people by the distribution of their facial features and contours. Because faces and contours differ from person to person, face recognition is, compared with other biometric technologies, well suited to non-intrusive use: it achieves good recognition results without interfering with a person's normal behavior.
At present, when face recognition is used on a mobile terminal, it cannot be accurately determined whether the image captured by the camera shows a real face. A criminal can therefore pass off a photograph of a face as a real face to complete the recognition, so the security of face recognition is poor.
Summary of the invention
The present invention provides an image sensor, a camera module, a mobile terminal, and a face recognition method and device, to solve the problem that it cannot be accurately determined whether the image captured by a camera shows a real face, which makes face recognition less secure.
In a first aspect, an embodiment of the present invention provides an image sensor, including:
a microlens layer, a photosensitive element layer, and a filter layer arranged between the microlens layer and the photosensitive element layer;
wherein incident light passes through the microlens layer and the filter layer in turn and reaches the photosensitive element layer, and the photosensitive element layer includes a first photosensitive region corresponding to visible light and a second photosensitive region corresponding to infrared light.
In a second aspect, an embodiment of the present invention provides a camera module including the above image sensor.
In a third aspect, an embodiment of the present invention provides a mobile terminal including the above camera module.
In a fourth aspect, an embodiment of the present invention provides a face recognition method, including:
acquiring a visible-light image and an infrared image captured of the face to be verified;
determining whether the image features of the visible-light image match those of a pre-stored visible-light face image, and whether the image features of the infrared image match those of a pre-stored infrared face image;
if the image features of the visible-light image match those of the pre-stored visible-light face image, and the image features of the infrared image match those of the pre-stored infrared face image, determining that verification of the face to be verified succeeds.
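The decision rule of this fourth aspect can be sketched as follows. This is a minimal illustration, assuming the face images have already been reduced to feature vectors; the cosine-similarity metric and the 0.8 threshold are assumptions for the sketch, since the patent does not specify a feature extractor or matching method.

```python
import numpy as np

def features_match(features_a: np.ndarray, features_b: np.ndarray,
                   threshold: float = 0.8) -> bool:
    """Hypothetical matcher: cosine similarity between two feature vectors."""
    sim = float(np.dot(features_a, features_b) /
                (np.linalg.norm(features_a) * np.linalg.norm(features_b)))
    return sim >= threshold

def verify_face(visible_features: np.ndarray, ir_features: np.ndarray,
                stored_visible_features: np.ndarray,
                stored_ir_features: np.ndarray) -> bool:
    """Verification succeeds only if BOTH the visible-light features and the
    infrared features match their pre-stored counterparts."""
    return (features_match(visible_features, stored_visible_features) and
            features_match(ir_features, stored_ir_features))
```

Requiring both matches is what rules out a photograph: a printed face may match in the visible band but produces a very different infrared signature.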
In a fifth aspect, an embodiment of the present invention further provides a face recognition device, including:
an acquisition module, configured to acquire a visible-light image and an infrared image captured of the face to be verified;
a first determining module, configured to determine whether the image features of the visible-light image match those of a pre-stored visible-light face image, and whether the image features of the infrared image match those of a pre-stored infrared face image;
a second determining module, configured to determine that verification of the face to be verified succeeds if the image features of the visible-light image match those of the pre-stored visible-light face image and the image features of the infrared image match those of the pre-stored infrared face image.
In a sixth aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored in the memory and runnable on the processor, where the computer program, when executed by the processor, implements the steps of the above face recognition method.
In this way, in the embodiments of the present invention, by matching the image features of the infrared image captured of the face to be verified, it can be determined whether the face in front of the camera is a real face or a photograph of a face, which improves the security and accuracy of face recognition. Moreover, because verification of the face succeeds only after both the image features of the visible-light image and the image features of the infrared image have been matched successfully, the accuracy of face recognition is further improved.
Description of the drawings
To describe the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a first structural schematic diagram of an embodiment of the image sensor of the present invention;
Fig. 2 is a second structural schematic diagram of an embodiment of the image sensor of the present invention;
Fig. 3 is a third structural schematic diagram of an embodiment of the image sensor of the present invention;
Fig. 4 is a flowchart of an embodiment of the face recognition method of the present invention;
Fig. 5 is a first structural schematic diagram of an embodiment of the face recognition device of the present invention;
Fig. 6 is a second structural schematic diagram of an embodiment of the face recognition device of the present invention;
Fig. 7 is a block diagram of an embodiment of the mobile terminal of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to Figs. 1 to 3, an embodiment of the present invention provides an image sensor, applied to the above face recognition method, including:
a microlens layer 11, a photosensitive element layer, and a filter layer arranged between the microlens layer 11 and the photosensitive element layer;
wherein incident light passes through the microlens layer 11 and the filter layer in turn and reaches the photosensitive element layer, which includes a first photosensitive region 131 corresponding to visible light and a second photosensitive region 132 corresponding to infrared light. A visible-light image can be obtained from the sensor signals of the first photosensitive region 131, and an infrared image can be obtained from the sensor signals of the second photosensitive region 132.
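The dual readout described above can be illustrated with a minimal sketch: given one raw frame and a mask marking which pixels belong to the second (infrared) photosensitive region, the visible and infrared signals are separated. The mask-based representation is an assumption for illustration; the patent does not prescribe a readout format.

```python
import numpy as np

def split_sensor_readout(raw: np.ndarray, ir_mask: np.ndarray):
    """Separate one raw readout into visible-region and infrared-region
    signals. `ir_mask` is True where a pixel belongs to the second
    (infrared) photosensitive region 132, and False where it belongs to
    the first (visible) photosensitive region 131."""
    visible = np.where(ir_mask, 0, raw)   # keep only region-131 signals
    infrared = np.where(ir_mask, raw, 0)  # keep only region-132 signals
    return visible, infrared
```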
Specifically, the incident light here is natural light.
The image sensor in the embodiments of the present invention can perform both visible-light imaging and infrared imaging. By matching the image features of the infrared image captured of the face to be verified, it can be determined whether the face in front of the camera is a real face or a photograph of a face, which improves the security and accuracy of face recognition. Moreover, because verification succeeds only after both the image features of the visible-light image and the image features of the infrared image have been matched successfully, the accuracy of face recognition is further improved.
Specifically, depending on how the filter layer and the photosensitive element layer are designed, the image sensor in the embodiments of the present invention has three implementations. The microlens layer 11 is identical in all three, so only the respective filter layers and photosensitive element layers are described below.
The first implementation of the image sensor is described below with reference to Fig. 1. In the first implementation, the second photosensitive region 132 is provided with an infrared photodiode whose photosensitive wavelength range lies between 780 nm and 1 mm. The filter layer includes: a first filter layer, comprising a non-visible-light filtering region 1211 for passing visible light and a first pass-through region 1212a for passing natural light; and a second filter layer, comprising a color filtering region 1221 for passing visible light and a second pass-through region 1222a for passing natural light. The non-visible-light filtering region 1211 and the color filtering region 1221 correspond to the first photosensitive region 131, while the first pass-through region 1212a and the second pass-through region 1222a correspond to the second photosensitive region 132.
Here, the non-visible-light filtering region 1211 is a filtering region that blocks non-visible light and passes visible light. The first pass-through region 1212a of the first filter layer lets natural light through: no filtering structure or filtering material that would block natural light is provided at its position. That is, the light reaching the color filtering region 1221 is visible light, and the light reaching the second pass-through region 1222a of the second filter layer is natural light.
The color filtering region 1221 includes monochromatic pass-through regions for red, green, and blue light. Correspondingly, the first photosensitive region 131 includes a red-light photosensitive subregion, a green-light photosensitive subregion, and a blue-light photosensitive subregion. The red pass-through region of the color filtering region 1221 corresponds to the red-light photosensitive subregion of the first photosensitive region 131, the green pass-through region corresponds to the green-light photosensitive subregion, and the blue pass-through region corresponds to the blue-light photosensitive subregion. When visible light reaches any one of the monochromatic pass-through regions of the color filtering region 1221, the component of the visible light with the color associated with that region passes through and reaches the corresponding position on the first photosensitive region 131.
The second pass-through region 1222a of the second filter layer lets natural light through: no filtering structure or filtering material that would block natural light is provided at its position. That is, the light reaching the second photosensitive region 132 is natural light.
In the first implementation, the first photosensitive region 131 is provided with visible-light photodiodes, including a red-light photodiode, a green-light photodiode, and a blue-light photodiode, located respectively in the red-light, green-light, and blue-light photosensitive subregions. The photosensitive wavelength range of the red-light photodiode is 640 nm to 780 nm, that of the green-light photodiode is 505 nm to 525 nm, and that of the blue-light photodiode is 475 nm to 505 nm. Monochromatic light reaching the three photosensitive subregions of the first photosensitive region 131 is converted from an optical signal into an electrical signal by the corresponding monochromatic photodiode, and a visible-light image is then generated from the electrical signals.
Because the photosensitive wavelength range of the infrared photodiode in the second photosensitive region 132 is 780 nm to 1 mm, it converts only the infrared component of the natural light. The natural light reaching the second photosensitive region 132 therefore undergoes infrared signal conversion, from which an infrared image is generated.
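The wavelength bands stated here can be summarized in a small lookup that maps a wavelength to the photodiode that responds to it. The half-open interval boundaries are an assumption for the sketch, since the text gives only the endpoints.

```python
def responding_photodiode(wavelength_nm: float) -> str:
    """Which photodiode of the first implementation responds to a given
    wavelength, using the bands stated in the text (values in nm)."""
    bands = {
        "blue": (475, 505),
        "green": (505, 525),
        "red": (640, 780),
        "infrared": (780, 1_000_000),  # 780 nm up to 1 mm
    }
    for name, (lo, hi) in bands.items():
        if lo <= wavelength_nm < hi:  # half-open intervals (assumption)
            return name
    return "none"
```

Note that the stated bands leave a gap between 525 nm and 640 nm, for which no photodiode is specified.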
The second implementation of the image sensor is described below with reference to Fig. 2. In the second implementation, the filter layer includes: a first filter layer, comprising a non-visible-light filtering region 1211 for passing visible light and a third pass-through region 1212b for passing natural light; and a second filter layer, comprising a color filtering region 1221 for passing visible light and a first filtering region 1222b for passing infrared light. The non-visible-light filtering region 1211 and the color filtering region 1221 correspond to the first photosensitive region 131, while the third pass-through region 1212b and the first filtering region 1222b correspond to the second photosensitive region 132.
Here, the non-visible-light filtering region 1211 is a filtering region that blocks non-visible light and passes visible light. The third pass-through region 1212b of the first filter layer lets natural light through: no filtering structure or filtering material that would block natural light is provided at its position. That is, the light reaching the color filtering region 1221 is visible light, and the light reaching the first filtering region 1222b is natural light.
The color filtering region 1221 includes monochromatic pass-through regions for red, green, and blue light. Correspondingly, the first photosensitive region 131 includes a red-light photosensitive subregion, a green-light photosensitive subregion, and a blue-light photosensitive subregion. The red pass-through region of the color filtering region 1221 corresponds to the red-light photosensitive subregion of the first photosensitive region 131, the green pass-through region corresponds to the green-light photosensitive subregion, and the blue pass-through region corresponds to the blue-light photosensitive subregion. When visible light reaches any one of the monochromatic pass-through regions of the color filtering region 1221, the component of the visible light with the color associated with that region passes through and reaches the corresponding position on the first photosensitive region 131.
The first filtering region 1222b is a filtering region that blocks all other light and passes only infrared light. Natural light reaches the first filtering region 1222b through the third pass-through region 1212b of the first filter layer; the first filtering region 1222b then filters out the visible component of the natural light, so that only infrared light enters the second photosensitive region 132.
In the second implementation, the first photosensitive region 131 is provided with visible-light photodiodes and the second photosensitive region 132 is provided with infrared photodiodes; the photosensitive wavelength ranges of the visible-light photodiodes and the infrared photodiodes differ. The visible-light photodiodes in the first photosensitive region 131 include a red-light photodiode, a green-light photodiode, and a blue-light photodiode, located respectively in the red-light, green-light, and blue-light photosensitive subregions. The photosensitive wavelength range of the red-light photodiode is 640 nm to 780 nm, that of the green-light photodiode is 505 nm to 525 nm, and that of the blue-light photodiode is 475 nm to 505 nm. Monochromatic light reaching the three photosensitive subregions of the first photosensitive region 131 is converted from an optical signal into an electrical signal by the corresponding monochromatic photodiode, and a visible-light image is then generated from the electrical signals. The photosensitive wavelength range of the infrared photodiode in the second photosensitive region 132 is the infrared wavelength range, i.e., 780 nm to 1 mm; the infrared light reaching the second photosensitive region 132 is converted from an infrared signal into an electrical signal by the infrared photodiode, and an infrared image is then generated from the electrical signal.
Specifically, in the second implementation, the first photosensitive region 131 and the second photosensitive region 132 may instead be provided with photodiodes having the same photosensitive wavelength range.
In the second implementation of the image sensor of the present invention, the first filter layer has no region that filters visible light, which makes the first and second filter layers easier to manufacture. Compared with the image sensor of the first implementation, since only infrared light reaches the second photosensitive region 132, the image quality of the generated infrared image is better.
Referring to Fig. 3, the third implementation of the image sensor of the present invention is shown. In the third implementation, the filter layer of the image sensor includes: a first filter layer, comprising a non-visible-light filtering region 1211 for passing visible light and a second filtering region 1212c for passing infrared light; and a second filter layer, comprising a color filtering region 1221 for passing visible light and a fourth pass-through region 1222c for passing infrared light. The non-visible-light filtering region 1211 and the color filtering region 1221 correspond to the first photosensitive region 131, while the second filtering region 1212c and the fourth pass-through region 1222c correspond to the second photosensitive region 132.
Here, the non-visible-light filtering region 1211 is a filtering region that blocks non-visible light and passes visible light, and the second filtering region 1212c is a filtering region that blocks all other light and passes only infrared light. That is, the light reaching the color filtering region 1221 is visible light, and the light reaching the fourth pass-through region 1222c of the second filter layer is infrared light.
The fourth pass-through region 1222c of the second filter layer lets infrared light through: no filtering structure or filtering material that would block the light is provided at its position.
The color filtering region 1221 includes monochromatic pass-through regions for red, green, and blue light. Correspondingly, the first photosensitive region 131 includes a red-light photosensitive subregion, a green-light photosensitive subregion, and a blue-light photosensitive subregion. The red pass-through region of the color filtering region 1221 corresponds to the red-light photosensitive subregion of the first photosensitive region 131, the green pass-through region corresponds to the green-light photosensitive subregion, and the blue pass-through region corresponds to the blue-light photosensitive subregion. When visible light reaches any one of the monochromatic pass-through regions of the color filtering region 1221, the component of the visible light with the color associated with that region passes through and reaches the corresponding position on the first photosensitive region 131.
In the third implementation, the first photosensitive region 131 is provided with visible-light photodiodes and the second photosensitive region 132 is provided with infrared photodiodes; their photosensitive wavelength ranges differ. The visible-light photodiodes in the first photosensitive region include a red-light photodiode, a green-light photodiode, and a blue-light photodiode, located respectively in the red-light, green-light, and blue-light photosensitive subregions. The photosensitive wavelength range of the red-light photodiode is 640 nm to 780 nm, that of the green-light photodiode is 505 nm to 525 nm, and that of the blue-light photodiode is 475 nm to 505 nm. Monochromatic light reaching the three photosensitive subregions of the first photosensitive region 131 is converted from an optical signal into an electrical signal by the corresponding monochromatic photodiode, and a visible-light image is then generated from the electrical signals. The photosensitive wavelength range of the infrared photodiode in the second photosensitive region 132 is the infrared wavelength range, i.e., 780 nm to 1 mm; the infrared light reaching the second photosensitive region 132 is converted from an infrared signal into an electrical signal by the infrared photodiode, and an infrared image is then generated from the electrical signal.
Specifically, in the third implementation, the first photosensitive region 131 and the second photosensitive region 132 may instead be provided with photodiodes having the same photosensitive wavelength range.
Compared with the image sensor of the first implementation, since only infrared light reaches the second photosensitive region 132, the image quality of the generated infrared image is better. The image sensor of the third implementation differs from that of the second implementation only in the position of the visible-light filter layer; this arrangement can likewise image both visible light and infrared light.
The image sensors in the above three implementations may satisfy the following conditions.
The first condition is that the non-visible-light filtering region 1211 is either an infrared-filtering coating applied over the color filtering region 1221 or a separately provided infrared filter.
If an infrared-filtering coating is applied over the color filtering region 1221, then the first pass-through region 1212a and the second pass-through region 1222a in the first implementation are empty areas, or lenses provided with no filtering structure or filtering material; the third pass-through region 1212b in the second implementation is an empty area, or a lens provided with no filtering structure or filtering material; and the fourth pass-through region 1222c in the third implementation is an empty area, or a lens provided with no filtering structure or filtering material.
If a separately provided infrared filter is used, the same applies: the first pass-through region 1212a and the second pass-through region 1222a in the first implementation, the third pass-through region 1212b in the second implementation, and the fourth pass-through region 1222c in the third implementation are each an empty area, or a lens provided with no filtering structure or filtering material.
The image sensor may also satisfy one of a second condition and a third condition. The second condition is that the photosensitive element layer includes multiple pixel units, each of which includes a first photosensitive region 131 and a second photosensitive region 132.
In this case, within one pixel unit, the first photosensitive region 131 consists of three monochromatic pixel sub-units, namely one red pixel sub-unit, one green pixel sub-unit, and one blue pixel sub-unit, and the second photosensitive region 132 is one infrared pixel sub-unit.
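Under the second condition, each pixel unit can be pictured as a repeating cell of R, G, B, and IR sub-units. The 2x2 layout below is an assumption for illustration: the text specifies only the composition of a unit, not its geometry.

```python
# One pixel unit under the second condition: three monochromatic
# sub-units (R, G, B) plus one infrared sub-unit, laid out 2x2 here.
PIXEL_UNIT = [
    ["R", "G"],
    ["B", "IR"],
]

def tile_sensor(units_y: int, units_x: int):
    """Tile pixel units into a full color-filter-array pattern of
    (units_y * 2) x (units_x * 2) sub-units."""
    return [
        [PIXEL_UNIT[y % 2][x % 2] for x in range(units_x * 2)]
        for y in range(units_y * 2)
    ]
```

In this arrangement every pixel unit contributes one infrared sample, so the infrared image has one quarter of the sub-unit resolution.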
The third condition is that the photosensitive element layer includes multiple pixel units, of which one part includes both a first photosensitive region 131 and a second photosensitive region 132, and the other part includes only a first photosensitive region 131.
In this case, for the pixel units that include both a first photosensitive region 131 and a second photosensitive region 132, the first photosensitive region in each such pixel unit includes one red pixel sub-unit, one green pixel sub-unit, and one blue pixel sub-unit, and the second photosensitive region 132 is one infrared pixel sub-unit.
For the pixel units that include only a first photosensitive region 131, the first photosensitive region 131 in each such pixel unit includes two red pixel sub-units, one green pixel sub-unit, and one blue pixel sub-unit.
Also, under the third condition, the number of pixel units that include both a first photosensitive region 131 and a second photosensitive region 132 is smaller than the number of pixel units that include only a first photosensitive region.
Compared with a photosensitive element layer satisfying the second condition, a photosensitive element layer satisfying the third condition yields a better final visible light imaging effect.
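The two layout conditions above can be sketched in a few lines of Python. This is a minimal illustration only, not the patent's implementation; the function names and the subunit letters R, G, B and IR are hypothetical labels for the pixel subunits described in the text.

```python
def make_pixel_unit(with_ir: bool):
    """Return the subunit pattern of one pixel unit.

    Units with a second photosensitive region carry one R, one G, one B
    and one IR subunit; units without it carry two R, one G and one B
    subunit, as the third condition describes.
    """
    return ["R", "G", "B", "IR"] if with_ir else ["R", "R", "G", "B"]

def make_sensor_layout(n_units: int, ir_ratio: float):
    """Third condition: IR-capable units are fewer than visible-only units."""
    n_ir = int(n_units * ir_ratio)
    assert n_ir < n_units - n_ir, "IR-capable units must be the minority"
    return ([make_pixel_unit(True)] * n_ir
            + [make_pixel_unit(False)] * (n_units - n_ir))

layout = make_sensor_layout(16, 0.25)  # 4 IR-capable units, 12 visible-only
print(sum(1 for unit in layout if "IR" in unit))  # -> 4
```

Under the second condition every unit would simply be `make_pixel_unit(True)`; the third condition trades some IR subunits for extra visible-light subunits, which is why its visible light imaging is better.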
With the image sensor provided by the embodiment of the present invention, image feature matching is performed on the image features of the visible light image and of the infrared light image acquired from the face to be verified, so that it can be determined whether the face in front of the camera is a real face, improving the safety and accuracy of face recognition. Moreover, since verification of the face to be verified is determined to succeed only after both the image feature of the visible light image and the image feature of the infrared light image are successfully matched, the accuracy of face recognition is improved.
An embodiment of the present invention further provides a camera lens module including the above image sensor. The lens set of the camera lens module is arranged adjacent to the image sensor; that is, the infrared filter is omitted.
An embodiment of the present invention further provides a mobile terminal including the above camera lens module.
With reference to Fig. 4, an embodiment of the present invention provides a face identification method, including:
Step 201: obtaining a visible light image and an infrared light image acquired from the face to be verified.
When face recognition is required, the mobile terminal turns on the camera and performs face detection in the camera's shooting region; when a face is detected, the camera photographs the shooting region. Specifically, the way of photographing and of controlling the camera differs according to the functions of the cameras installed on the mobile terminal, as described in detail below.
Step 202: determining whether the image feature of the visible light image matches the image feature of a pre-stored visible light face image, and whether the image feature of the infrared light image matches the image feature of a pre-stored infrared light face image.
In step 202, determining whether the image feature of the visible light image matches the image feature of the pre-stored visible light face image includes:
Step 2021: extracting a first image feature from the visible light image, and extracting a second image feature from the pre-stored visible light face image;
Step 2022: comparing the first image feature with the second image feature;
Step 2023: if the comparison result of the first image feature and the second image feature is greater than or equal to a first predetermined threshold, determining that the image feature of the visible light image matches the image feature of the pre-stored visible light face image;
Step 2024: if the comparison result of the first image feature and the second image feature is less than the first predetermined threshold, determining that the image feature of the visible light image does not match the image feature of the pre-stored visible light face image.
In step 202, determining whether the image feature of the infrared light image matches the image feature of the pre-stored infrared light face image includes:
Step 2025: extracting a third image feature from the infrared light image, and extracting a fourth image feature from the pre-stored infrared light face image;
Step 2026: comparing the third image feature with the fourth image feature;
Step 2027: if the comparison result of the third image feature and the fourth image feature is greater than or equal to a second predetermined threshold, determining that the image feature of the infrared light image matches the image feature of the pre-stored infrared light face image;
Step 2028: if the comparison result of the third image feature and the fourth image feature is less than the second predetermined threshold, determining that the image feature of the infrared light image does not match the image feature of the pre-stored infrared light face image.
In step 2021, the first image feature and the second image feature refer to facial features of the face to be verified in the visible light image.
In step 2025, the third image feature and the fourth image feature refer to facial features of the face to be verified in the infrared light image.
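The threshold comparison in the steps above can be sketched as follows. This is a toy illustration under stated assumptions: the "comparison result" is modeled as a similarity score in [0, 1], and the component-matching similarity and the threshold value 0.8 are illustrative stand-ins, since the patent does not fix a particular feature representation or threshold.

```python
def compare_features(feat_a, feat_b):
    """Toy comparison result: fraction of matching feature components."""
    matches = sum(1 for a, b in zip(feat_a, feat_b) if a == b)
    return matches / max(len(feat_a), len(feat_b))

def features_match(probe_feat, stored_feat, threshold=0.8):
    """Match iff the comparison result reaches the predetermined threshold
    (steps 2023/2024 for visible light, 2027/2028 for infrared)."""
    return compare_features(probe_feat, stored_feat) >= threshold

print(features_match([1, 2, 3, 4, 5], [1, 2, 3, 4, 9]))  # 0.8 >= 0.8 -> True
print(features_match([1, 2, 3, 4, 5], [9, 9, 3, 4, 5]))  # 0.6 < 0.8 -> False
```

The same function serves both spectra; only the stored template and the threshold (first or second predetermined threshold) differ.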
Step 203: if the image feature of the visible light image matches the image feature of the pre-stored visible light face image, and the image feature of the infrared light image matches the image feature of the pre-stored infrared light face image, determining that verification of the face to be verified succeeds.
Conversely, if the image feature of the visible light image does not match the image feature of the pre-stored visible light face image, and/or the image feature of the infrared light image does not match the image feature of the pre-stored infrared light face image, it is determined that verification of the face to be verified fails.
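The decision rule of step 203 and its failure counterpart reduce to a logical AND over the two matching outcomes, which can be sketched as follows (function name illustrative):

```python
def verify_face(visible_matches: bool, infrared_matches: bool) -> bool:
    """Verification succeeds only when BOTH the visible-light feature
    match and the infrared-light feature match hold; if either one
    fails, verification of the face to be verified fails."""
    return visible_matches and infrared_matches

# A printed photograph may reproduce visible-light facial features but
# not the infrared response of a live face, so it fails the check.
print(verify_face(True, True))   # genuine face  -> True
print(verify_face(True, False))  # photo spoof   -> False
```

This AND structure is what lets the method distinguish a real face from a face photograph.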
Specifically, multiple pre-stored visible light face images and multiple pre-stored infrared light face images are stored in advance in the mobile terminal. When the image feature of the visible light image matches the image feature of any one of the pre-stored visible light face images, the image feature of the visible light image is considered to match the image feature of the pre-stored visible light face image; similarly, when the image feature of the infrared light image matches the image feature of any one of the pre-stored infrared light face images, the image feature of the infrared light image is considered to match the image feature of the pre-stored infrared light face image.
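Matching against multiple pre-stored face images is an any-of test: the probe matches if its feature matches at least one enrolled template. A minimal sketch, with the similarity measure and threshold again as illustrative assumptions:

```python
def similarity(a, b):
    """Toy similarity: fraction of matching components."""
    return sum(1 for x, y in zip(a, b) if x == y) / len(a)

def matches_any(probe_feature, stored_templates, threshold=0.8):
    """True if the probe matches ANY pre-stored face image's feature."""
    return any(similarity(probe_feature, t) >= threshold
               for t in stored_templates)

enrolled = [[1, 1, 2, 2, 3], [4, 4, 5, 5, 6]]       # two stored templates
print(matches_any([1, 1, 2, 2, 9], enrolled))       # 0.8 vs first -> True
print(matches_any([7, 7, 7, 7, 7], enrolled))       # no template  -> False
```

The same any-of rule is applied independently to the visible-light templates and to the infrared-light templates.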
With the face identification method provided by the embodiment of the present invention, image feature matching is performed on the image features of the infrared light image and of the visible light image acquired from the face to be verified, so that it can be determined whether the face in front of the camera is a real face or a photograph of a face, improving the safety and accuracy of face recognition. Moreover, since verification of the face to be verified is determined to succeed only after both the image feature of the visible light image and the image feature of the infrared light image are successfully matched, the accuracy of face recognition is improved.
Further, in the embodiment of the present invention, depending on the cameras installed on the mobile terminal, there are two ways of obtaining the visible light image and the infrared light image of the face to be verified.
When the mobile terminal is equipped with a visible light camera and an infrared light camera, obtaining the visible light image and the infrared light image acquired from the face to be verified includes:
photographing with the visible light camera to obtain the visible light image; and
photographing with the infrared light camera to obtain the infrared light image.
Here, the photosensitive element layer in the image sensor of the visible light camera contains only the first photosensitive region, which is sensitive only to visible light; the photosensitive element layer in the image sensor of the infrared light camera contains only the second photosensitive region, which is sensitive only to infrared light. That is, all pixel units of the photosensitive element layer in the image sensor of the visible light camera contain only visible light photodiodes, and all pixel units of the photosensitive element layer in the image sensor of the infrared light camera contain only infrared light photodiodes. The two cameras do not interfere with each other, so the finally imaged visible light image and infrared light image both have good imaging quality.
Further, in the embodiment of the present invention, obtaining the visible light image and the infrared light image acquired from the face to be verified may also include:
when the mobile terminal is in a visible light shooting mode, photographing with the camera and obtaining the visible light image taken by the camera; and
when the mobile terminal is in an infrared light shooting mode, photographing with the camera and obtaining the infrared light image taken by the camera.
Specifically, in this way, the visible light image and the infrared light image are both acquired by the above camera module. The photosensitive element layer of the image sensor of the camera module includes the first photosensitive region corresponding to visible light and the second photosensitive region corresponding to infrared light; visible light photodiodes are arranged in the first photosensitive region, and an infrared light photodiode is arranged in the second photosensitive region. Taking one pixel unit in the photosensitive element layer of the camera as an example: the first photosensitive region in the pixel unit is provided with visible light photodiodes, specifically one red-light photodiode, one green-light photodiode and one blue-light photodiode, and the second photosensitive region in the pixel unit is provided with an infrared light photodiode; these four photodiodes form one pixel unit. Each of the four photodiodes is connected with a control circuit, which specifically includes: a first switch transistor, whose base is connected with a controller and whose collector is connected with the photodiode; a second switch transistor, whose collector is connected with the emitter of the first switch transistor and whose base is connected with the controller; a third switch transistor, whose base is connected with the emitter of the first switch transistor and the collector of the second switch transistor, and whose emitter is connected with the emitter of the second switch transistor; and a fourth switch transistor, whose base is connected with the controller, whose emitter is connected with the collector of the third switch transistor, and whose collector is connected to the output. The second switch transistor, the third switch transistor and the fourth switch transistor are in the conducting state.
In the visible light shooting mode, the controller sends a signal to the first switch transistors connected with the visible light photodiodes, so that these first switch transistors conduct the connection between the visible light photodiodes and the corresponding third switch transistors; at this point, after a light signal incident on a visible light photodiode is converted into an electrical signal by that photodiode, the signal is output externally through the first, third and fourth switch transistors in sequence. In the visible light shooting mode, the controller does not turn on the first switch transistor connected with the infrared light photodiode, so that no connection is conducted between the infrared light photodiode and its corresponding third switch transistor, thereby achieving the purpose of imaging from visible light only.
In the infrared light shooting mode, the controller sends a signal to the first switch transistor connected with the infrared light photodiode, so that this first switch transistor conducts the connection between the infrared light photodiode and the corresponding third switch transistor; at this point, after a light signal incident on the infrared light photodiode is converted into an electrical signal by that photodiode, the signal is output externally through the first, third and fourth switch transistors in sequence. In the infrared light shooting mode, the controller does not turn on the first switch transistors connected with the visible light photodiodes, so that no connection is conducted between the visible light photodiodes and their corresponding third switch transistors, thereby achieving the purpose of imaging from infrared light only.
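The mode-controlled readout above can be modeled abstractly: in each shooting mode the controller turns on the first switch transistor only for photodiodes of that mode's type, so only their signals reach the output. This toy model reduces the four-transistor control circuit to a single boolean "first switch conducting" flag per photodiode, which is an assumption for illustration, not the circuit itself.

```python
def read_out(pixel_unit, mode):
    """pixel_unit maps diode type -> signal; mode is 'visible' or 'infrared'."""
    out = []
    for diode_type, signal in pixel_unit.items():
        first_switch_on = (
            (mode == "visible" and diode_type in ("R", "G", "B"))
            or (mode == "infrared" and diode_type == "IR")
        )
        if first_switch_on:  # signal passes through switches 1, 3 and 4
            out.append(signal)
    return out

unit = {"R": 10, "G": 20, "B": 30, "IR": 40}  # one pixel unit's four diodes
print(read_out(unit, "visible"))   # -> [10, 20, 30]
print(read_out(unit, "infrared"))  # -> [40]
```

Because the selection happens at the switch level, the same pixel unit serves both modes without one spectrum contaminating the other's image.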
Realizing visible light imaging and infrared imaging with a single camera frees up arrangement space inside the mobile terminal for other components and, at the same time, reduces the manufacturing cost of the mobile terminal.
Specifically, the face identification method in the embodiment of the present invention may be used for screen unlocking of the mobile terminal: in this implementation, when verification of the face to be verified succeeds, the screen is unlocked. Alternatively, the face identification method in the embodiment of the present invention may be applied to payment on the mobile terminal: in this implementation, the mobile terminal must first enter the corresponding payment interface before face recognition can be carried out, and payment is made when face recognition succeeds.
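The two application flows differ only in their preconditions, which can be sketched as follows (the function names are illustrative, not an actual mobile-OS API):

```python
def try_unlock(face_verified: bool) -> str:
    """Screen unlocking: unlock directly when verification succeeds."""
    return "unlocked" if face_verified else "locked"

def try_pay(at_payment_interface: bool, face_verified: bool) -> str:
    """Payment: recognition only runs once the payment interface is open."""
    if not at_payment_interface:
        return "no payment interface"
    return "paid" if face_verified else "payment refused"

print(try_unlock(True))      # -> unlocked
print(try_pay(True, True))   # -> paid
print(try_pay(False, True))  # -> no payment interface
```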
In the embodiment of the present invention, image feature matching is performed on the image features of the infrared light image and of the visible light image acquired from the face to be verified, so that it can be determined whether the face in front of the camera is a real face or a photograph of a face, improving the safety and accuracy of face recognition. Moreover, since verification of the face to be verified is determined to succeed only after both the image feature of the visible light image and the image feature of the infrared light image are successfully matched, the accuracy of face recognition is improved.
With reference to Fig. 5 to Fig. 7, an embodiment of the present invention further provides a face identification device 300, including:
an acquisition module 301, configured to obtain a visible light image and an infrared light image acquired from the face to be verified;
a first determining module 302, configured to determine whether the image feature of the visible light image matches the image feature of a pre-stored visible light face image, and whether the image feature of the infrared light image matches the image feature of a pre-stored infrared light face image; and
a second determining module 303, configured to determine that verification of the face to be verified succeeds if the image feature of the visible light image matches the image feature of the pre-stored visible light face image and the image feature of the infrared light image matches the image feature of the pre-stored infrared light face image.
With reference to Fig. 6, the acquisition module 301 includes:
a first acquisition unit 3011, configured to photograph with the camera when the mobile terminal is in the visible light shooting mode and obtain the visible light image taken by the camera; and
a second acquisition unit 3012, configured to photograph with the camera when the mobile terminal is in the infrared light shooting mode and obtain the infrared light image taken by the camera.
With reference to Fig. 6, the first determining module 302 includes:
a first extraction unit 3021, configured to extract the first image feature from the visible light image and extract the second image feature from the pre-stored visible light face image;
a first comparing unit 3022, configured to compare the first image feature with the second image feature;
a first determination unit 3023, configured to determine that the image feature of the visible light image matches the image feature of the pre-stored visible light face image if the comparison result of the first image feature and the second image feature is greater than or equal to the first predetermined threshold; and
a second determination unit 3024, configured to determine that the image feature of the visible light image does not match the image feature of the pre-stored visible light face image if the comparison result of the first image feature and the second image feature is less than the first predetermined threshold.
With reference to Fig. 6, the first determining module 302 further includes:
a second extraction unit 3025, configured to extract the third image feature from the infrared light image and extract the fourth image feature from the pre-stored infrared light face image;
a second comparing unit 3026, configured to compare the third image feature with the fourth image feature;
a third determination unit 3027, configured to determine that the image feature of the infrared light image matches the image feature of the pre-stored infrared light face image if the comparison result of the third image feature and the fourth image feature is greater than or equal to the second predetermined threshold; and
a fourth determination unit 3028, configured to determine that the image feature of the infrared light image does not match the image feature of the pre-stored infrared light face image if the comparison result of the third image feature and the fourth image feature is less than the second predetermined threshold.
The mobile terminal provided by the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiment of Fig. 4, which is not repeated here to avoid repetition. By performing image feature matching on the image features of the infrared light image acquired from the face to be verified, it can be determined whether the face in front of the camera is a real face, improving the safety and accuracy of face recognition; moreover, since verification of the face to be verified is determined to succeed only after both the image feature of the visible light image and the image feature of the infrared light image are successfully matched, the accuracy of face recognition is improved.
Fig. 7 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present invention.
The mobile terminal 400 includes, but is not limited to: a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, a processor 410, a power supply 411 and other components. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 7 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than illustrated, combine certain components, or arrange components differently. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a laptop, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer and the like.
The radio frequency unit 401 is configured to transmit and receive data under the control of the processor 410.
The processor 410 is configured to obtain a visible light image and an infrared light image acquired from the face to be verified, and to determine that verification of the face to be verified succeeds if the image feature of the visible light image matches the image feature of the pre-stored visible light face image and the image feature of the infrared light image matches the image feature of the pre-stored infrared light face image.
By performing image feature matching on the image features of the visible light image and of the infrared light image acquired from the face to be verified, and determining that verification of the face to be verified succeeds only after both image features are successfully matched, the accuracy of face recognition is improved.
It should be understood that in the embodiment of the present invention, the radio frequency unit 401 may be used to send and receive signals during information transmission and reception or during a call; specifically, it receives downlink data from a base station and delivers it to the processor 410 for processing, and sends uplink data to the base station. In general, the radio frequency unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer and the like. In addition, the radio frequency unit 401 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 402, for example helping the user send and receive e-mails, browse web pages and access streaming video.
The audio output unit 403 may convert audio data received by the radio frequency unit 401 or the network module 402, or stored in the memory 409, into an audio signal and output it as sound. Moreover, the audio output unit 403 may also provide audio output related to a specific function executed by the mobile terminal 400 (for example, a call signal reception sound, a message reception sound and the like). The audio output unit 403 includes a loudspeaker, a buzzer, a receiver and the like.
The input unit 404 is configured to receive audio or video signals. The input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042. The graphics processor 4041 processes image data of static pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 406. The image frames processed by the graphics processor 4041 may be stored in the memory 409 (or another storage medium) or sent via the radio frequency unit 401 or the network module 402. The microphone 4042 can receive sound and process it into audio data; in a telephone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 401 and output.
The mobile terminal 400 further includes at least one sensor 405, such as a light sensor, a motion sensor and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 4061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 4061 and/or the backlight when the mobile terminal 400 is moved to the ear. As a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used for recognizing the posture of the mobile terminal (such as landscape/portrait switching, related games and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer and tapping). The sensor 405 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor and the like, which are not described here.
The display unit 406 is configured to display information input by the user or information provided to the user. The display unit 406 may include a display panel 4061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display or the like.
The user input unit 407 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 407 includes a touch panel 4071 and other input devices 4072. The touch panel 4071, also referred to as a touch screen, collects touch operations by the user on or near it (for example, operations by the user on or near the touch panel 4071 with a finger, a stylus or any other suitable object or attachment). The touch panel 4071 may include a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection apparatus, converts it into contact coordinates, sends it to the processor 410, and receives and executes commands sent by the processor 410. Furthermore, the touch panel 4071 may be implemented in various types such as resistive, capacitive, infrared and surface acoustic wave. In addition to the touch panel 4071, the user input unit 407 may also include other input devices 4072. Specifically, the other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key and a switch key), a trackball, a mouse and a joystick, which are not described here.
Further, the touch panel 4071 may cover the display panel 4061. After detecting a touch operation on or near it, the touch panel 4071 transmits the operation to the processor 410 to determine the type of the touch event, and the processor 410 then provides a corresponding visual output on the display panel 4061 according to the type of the touch event. Although in Fig. 7 the touch panel 4071 and the display panel 4061 implement the input and output functions of the mobile terminal as two independent components, in some embodiments the touch panel 4071 and the display panel 4061 may be integrated to implement the input and output functions of the mobile terminal, which is not specifically limited here.
The interface unit 408 is an interface through which an external device is connected with the mobile terminal 400. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and the like. The interface unit 408 may be used to receive input (for example, data information, electric power and the like) from an external device and transmit the received input to one or more elements in the mobile terminal 400, or may be used to transmit data between the mobile terminal 400 and an external device.
The memory 409 may be used to store software programs and various data. The memory 409 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function and an image playing function) and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book) and the like. In addition, the memory 409 may include a high-speed random access memory, and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device or another solid-state storage component.
The processor 410 is the control center of the mobile terminal; it connects all parts of the entire mobile terminal through various interfaces and lines, and executes the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 409 and calling data stored in the memory 409, thereby monitoring the mobile terminal as a whole. The processor 410 may include one or more processing units; preferably, the processor 410 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 410.
The mobile terminal 400 may also include a power supply 411 (such as a battery) supplying power to the components; preferably, the power supply 411 may be logically connected with the processor 410 through a power management system, so as to implement functions such as managing charging, discharging and power consumption through the power management system.
In addition, the mobile terminal 400 includes some function modules that are not shown, which are not described here.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 410, a memory 409, and a computer program stored on the memory 409 and executable on the processor 410; when executed by the processor 410, the computer program implements each process of the above face identification method embodiments and can achieve the same technical effect, which is not repeated here to avoid repetition.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements each process of the above face identification method embodiments and can achieve the same technical effect, which is not repeated here to avoid repetition. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk or the like.
It should be noted that, herein, the terms "include", "comprise" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not explicitly listed, or further includes elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or device including that element.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general hardware platform, or of course by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk or an optical disk) and includes instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device or the like) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above specific embodiments, which are merely illustrative rather than restrictive. Inspired by the present invention, those skilled in the art may make many other forms without departing from the spirit of the present invention and the scope protected by the claims, all of which fall within the protection of the present invention.
The above are preferred embodiments of the present invention. It should be noted that those of ordinary skill in the art may further make several improvements and modifications without departing from the principles of the present invention, and such improvements and modifications also fall within the protection scope of the present invention.
Claims (17)
1. An image sensor, characterized in that the image sensor comprises a microlens layer, a photosensitive element layer, and a filter layer disposed between the microlens layer and the photosensitive element layer;
wherein incident light passes through the microlens layer and the filter layer in sequence and then reaches the photosensitive element layer, and the photosensitive element layer comprises a first photosensitive region corresponding to visible light and a second photosensitive region corresponding to infrared light.
2. The image sensor according to claim 1, characterized in that an infrared photosensitive diode with a photosensitive wavelength range of 780 nm to 1 mm is disposed in the second photosensitive region, and the filter layer comprises:
a first filter layer comprising a non-visible-light filtering region for passing visible light and a first pass region for passing natural light; and
a second filter layer comprising a color filtering region for passing visible light and a second pass region for passing natural light;
wherein the non-visible-light filtering region, the color filtering region, and the first photosensitive region correspond to one another, and the first pass region, the second pass region, and the second photosensitive region correspond to one another.
3. The image sensor according to claim 1, characterized in that the filter layer comprises:
a first filter layer comprising a non-visible-light filtering region for passing visible light and a third pass region for passing natural light; and
a second filter layer comprising a color filtering region for passing visible light and a first filtering region for passing infrared light;
wherein the non-visible-light filtering region, the color filtering region, and the first photosensitive region correspond to one another, and the third pass region, the first filtering region, and the second photosensitive region correspond to one another.
4. The image sensor according to claim 1, characterized in that the filter layer comprises:
a first filter layer comprising a non-visible-light filtering region for passing visible light and a second filtering region for passing infrared light; and
a second filter layer comprising a color filtering region for passing visible light and a fourth pass region for passing infrared light;
wherein the non-visible-light filtering region, the color filtering region, and the first photosensitive region correspond to one another, and the second filtering region, the fourth pass region, and the second photosensitive region correspond to one another.
5. The image sensor according to claim 3 or 4, characterized in that a visible-light photosensitive diode is disposed in the first photosensitive region and an infrared photosensitive diode is disposed in the second photosensitive region, wherein the visible-light photosensitive diode and the infrared photosensitive diode differ in photosensitive wavelength range.
6. The image sensor according to claim 1, characterized in that the photosensitive element layer comprises a plurality of pixel units, wherein each pixel unit comprises the first photosensitive region and the second photosensitive region.
7. The image sensor according to claim 1, characterized in that the photosensitive element layer comprises a plurality of pixel units, wherein some of the pixel units comprise the first photosensitive region and the second photosensitive region, and the remaining pixel units comprise only the first photosensitive region.
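As an editorial aside (not part of the claims), the layouts of claims 6 and 7 — pixel units that contain both a visible (first) and an infrared (second) photosensitive region — can be modeled as a raw mosaic plus a mask marking which readouts came from infrared regions. The array layout, helper name, and masking scheme below are assumptions for illustration only.

```python
# Sketch: separating a raw sensor mosaic into visible and infrared planes,
# given a mask marking the second (infrared) photosensitive regions.
import numpy as np

def split_planes(raw, ir_mask):
    """Separate a raw mosaic into a visible plane and an infrared plane.

    raw     -- 2-D array of readouts, one value per photosensitive region
    ir_mask -- boolean array of the same shape; True where the readout came
               from a second (infrared) photosensitive region
    """
    visible = np.where(~ir_mask, raw, 0.0)   # keep first-region samples
    infrared = np.where(ir_mask, raw, 0.0)   # keep second-region samples
    return visible, infrared

# Example: a 4x4 mosaic where each 2x2 pixel unit devotes one corner to IR,
# loosely corresponding to the per-unit layout of claim 6.
raw = np.arange(16, dtype=float).reshape(4, 4)
ir_mask = np.zeros((4, 4), dtype=bool)
ir_mask[1::2, 1::2] = True                   # one IR region per 2x2 unit
vis, ir = split_planes(raw, ir_mask)
assert vis[1, 1] == 0.0 and ir[1, 1] == raw[1, 1]
```

Under this model, claim 7's sparser layout is simply a mask with fewer True entries, so some pixel units contribute no infrared sample.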
8. A lens module, characterized by comprising the image sensor according to any one of claims 1 to 7.
9. The lens module according to claim 8, characterized in that a lens group of the lens module is disposed adjacent to the image sensor.
10. A mobile terminal, characterized by comprising the lens module according to claim 8 or 9.
11. A face recognition method, characterized by being applied to the mobile terminal according to claim 10, the face recognition method comprising:
obtaining a visible light image and an infrared light image acquired by performing image acquisition on a face to be verified;
determining whether image features of the visible light image match image features of a pre-stored visible light face image, and whether image features of the infrared light image match image features of a pre-stored infrared light face image; and
if the image features of the visible light image match the image features of the pre-stored visible light face image and the image features of the infrared light image match the image features of the pre-stored infrared light face image, determining that the face to be verified is successfully verified.
12. The face recognition method according to claim 11, characterized in that the step of obtaining a visible light image and an infrared light image acquired by performing image acquisition on a face to be verified comprises:
when the mobile terminal is in a visible light shooting mode, taking a picture by a camera, and obtaining the visible light image after the camera takes the picture; and
when the mobile terminal is in an infrared light shooting mode, taking a picture by the camera, and obtaining the infrared light image after the camera takes the picture.
13. The face recognition method according to claim 11, characterized in that the step of determining whether the image features of the visible light image match the image features of the pre-stored visible light face image comprises:
extracting first image features from the visible light image, and extracting second image features from the pre-stored visible light face image;
comparing the first image features with the second image features;
if the comparison result of the first image features and the second image features is greater than or equal to a first predetermined threshold, determining that the image features of the visible light image match the image features of the pre-stored visible light face image; and
if the comparison result of the first image features and the second image features is less than the first predetermined threshold, determining that the image features of the visible light image do not match the image features of the pre-stored visible light face image.
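As an editorial aside (not part of the claims), the matching logic of claims 11 and 13 can be sketched as follows. The claims do not specify how the "comparison result" is computed, so the cosine similarity of two feature vectors, the threshold value, and the function names below are all assumptions for illustration.

```python
# Sketch: threshold-based feature matching (claim 13) and the dual-modality
# verification rule (claim 11). Similarity metric and threshold are assumed.
import math

def features_match(first_features, second_features, first_threshold=0.8):
    """True if the similarity score reaches the first predetermined threshold."""
    dot = sum(a * b for a, b in zip(first_features, second_features))
    norm = (math.sqrt(sum(a * a for a in first_features))
            * math.sqrt(sum(b * b for b in second_features)))
    score = dot / norm if norm else 0.0       # cosine similarity in [-1, 1]
    return score >= first_threshold

def verify(visible_pair, infrared_pair):
    """Claim 11: the face is verified only if BOTH modalities match.

    Each pair is (features of the captured image, features of the
    pre-stored face image) for that modality.
    """
    return features_match(*visible_pair) and features_match(*infrared_pair)
```

Requiring both the visible and infrared comparisons to pass is what distinguishes this scheme from single-modality matching: a printed photograph that matches in visible light would still have to reproduce the face's infrared appearance.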
14. A face recognition device, characterized in that the face recognition device comprises:
an obtaining module, configured to obtain a visible light image and an infrared light image acquired by performing image acquisition on a face to be verified;
a first determining module, configured to determine whether the visible light image matches a pre-stored visible light face image, and whether the infrared light image matches a pre-stored infrared light face image; and
a second determining module, configured to determine that the face to be verified is successfully verified if it is determined that image features of the visible light image match image features of the pre-stored visible light face image and that image features of the infrared light image match image features of the pre-stored infrared light face image.
15. The face recognition device according to claim 14, characterized in that the obtaining module comprises:
a first obtaining unit, configured to take a picture by a camera when the mobile terminal is in a visible light shooting mode, and obtain the visible light image after the camera takes the picture; and
a second obtaining unit, configured to take a picture by the camera when the mobile terminal is in an infrared light shooting mode, and obtain the infrared light image after the camera takes the picture.
16. The face recognition device according to claim 14, characterized in that the first determining module comprises:
a first extracting unit, configured to extract first image features from the visible light image, and extract second image features from the pre-stored visible light face image;
a first comparing unit, configured to compare the first image features with the second image features;
a first determining unit, configured to determine that the image features of the visible light image match the image features of the pre-stored visible light face image if the comparison result of the first image features and the second image features is greater than or equal to a first predetermined threshold; and
a second determining unit, configured to determine that the image features of the visible light image do not match the image features of the pre-stored visible light face image if the comparison result of the first image features and the second image features is less than the first predetermined threshold.
17. A mobile terminal, characterized by comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the face recognition method according to any one of claims 11 to 13.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810083033.5A CN108345845B (en) | 2018-01-29 | 2018-01-29 | Image sensor, lens module, mobile terminal, face recognition method and device |
PCT/CN2019/073403 WO2019144956A1 (en) | 2018-01-29 | 2019-01-28 | Image sensor, camera module, mobile terminal, and facial recognition method and apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810083033.5A CN108345845B (en) | 2018-01-29 | 2018-01-29 | Image sensor, lens module, mobile terminal, face recognition method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108345845A true CN108345845A (en) | 2018-07-31 |
CN108345845B CN108345845B (en) | 2020-09-25 |
Family
ID=62961693
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810083033.5A Active CN108345845B (en) | 2018-01-29 | 2018-01-29 | Image sensor, lens module, mobile terminal, face recognition method and device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108345845B (en) |
WO (1) | WO2019144956A1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108897178A (en) * | 2018-08-31 | 2018-11-27 | 武汉华星光电技术有限公司 | Colored filter substrate and display panel |
CN109143704A (en) * | 2018-09-13 | 2019-01-04 | 合肥京东方光电科技有限公司 | Display panel and terminal device |
WO2019144956A1 (en) * | 2018-01-29 | 2019-08-01 | 维沃移动通信有限公司 | Image sensor, camera module, mobile terminal, and facial recognition method and apparatus |
CN110532992A (en) * | 2019-09-04 | 2019-12-03 | 深圳市捷顺科技实业股份有限公司 | A kind of face identification method based on visible light and near-infrared |
CN111447423A (en) * | 2020-03-25 | 2020-07-24 | 浙江大华技术股份有限公司 | Image sensor, imaging apparatus, and image processing method |
CN112131906A (en) * | 2019-06-24 | 2020-12-25 | Oppo广东移动通信有限公司 | Optical fingerprint sensor and electronic equipment with same |
CN114143427A (en) * | 2021-11-23 | 2022-03-04 | 歌尔科技有限公司 | Camera assembly, mobile terminal and camera-based body temperature measuring method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101504721A (en) * | 2009-03-13 | 2009-08-12 | 北京中星微电子有限公司 | Method and apparatus for identity authentication based on face image |
CN103839952A (en) * | 2012-11-20 | 2014-06-04 | 采钰科技股份有限公司 | Image-sensing apparatus |
CN105023005A (en) * | 2015-08-05 | 2015-11-04 | 王丽婷 | Face recognition apparatus and recognition method thereof |
CN106982329A (en) * | 2017-04-28 | 2017-07-25 | 广东欧珀移动通信有限公司 | Imaging sensor, focusing control method, imaging device and mobile terminal |
CN206370880U (en) * | 2017-01-25 | 2017-08-01 | 徐鹤菲 | A kind of dual camera imaging system and mobile terminal |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9531961B2 (en) * | 2015-05-01 | 2016-12-27 | Duelight Llc | Systems and methods for generating a digital image using separate color and intensity data |
US10148936B2 (en) * | 2013-07-01 | 2018-12-04 | Omnivision Technologies, Inc. | Multi-band image sensor for providing three-dimensional color images |
CN104394306B (en) * | 2014-11-24 | 2018-02-27 | 北京中科虹霸科技有限公司 | Camera module and equipment for the multichannel multizone plated film of iris recognition |
CN205666883U (en) * | 2016-03-23 | 2016-10-26 | 徐鹤菲 | Support compound imaging system and mobile terminal of formation of image of near infrared and visible light |
CN108345845B (en) * | 2018-01-29 | 2020-09-25 | 维沃移动通信有限公司 | Image sensor, lens module, mobile terminal, face recognition method and device |
- 2018
- 2018-01-29 CN CN201810083033.5A patent/CN108345845B/en active Active
- 2019
- 2019-01-28 WO PCT/CN2019/073403 patent/WO2019144956A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101504721A (en) * | 2009-03-13 | 2009-08-12 | 北京中星微电子有限公司 | Method and apparatus for identity authentication based on face image |
CN103839952A (en) * | 2012-11-20 | 2014-06-04 | 采钰科技股份有限公司 | Image-sensing apparatus |
CN105023005A (en) * | 2015-08-05 | 2015-11-04 | 王丽婷 | Face recognition apparatus and recognition method thereof |
CN206370880U (en) * | 2017-01-25 | 2017-08-01 | 徐鹤菲 | A kind of dual camera imaging system and mobile terminal |
CN106982329A (en) * | 2017-04-28 | 2017-07-25 | 广东欧珀移动通信有限公司 | Imaging sensor, focusing control method, imaging device and mobile terminal |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019144956A1 (en) * | 2018-01-29 | 2019-08-01 | 维沃移动通信有限公司 | Image sensor, camera module, mobile terminal, and facial recognition method and apparatus |
CN108897178A (en) * | 2018-08-31 | 2018-11-27 | 武汉华星光电技术有限公司 | Colored filter substrate and display panel |
WO2020042231A1 (en) * | 2018-08-31 | 2020-03-05 | 武汉华星光电技术有限公司 | Color filter substrate and display panel |
CN109143704A (en) * | 2018-09-13 | 2019-01-04 | 合肥京东方光电科技有限公司 | Display panel and terminal device |
CN109143704B (en) * | 2018-09-13 | 2021-03-12 | 合肥京东方光电科技有限公司 | Display panel and terminal equipment |
CN112131906A (en) * | 2019-06-24 | 2020-12-25 | Oppo广东移动通信有限公司 | Optical fingerprint sensor and electronic equipment with same |
WO2020259270A1 (en) * | 2019-06-24 | 2020-12-30 | Oppo广东移动通信有限公司 | Optical fingerprint sensor and electronic device having same |
US11783562B2 (en) | 2019-06-24 | 2023-10-10 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Optical fingerprint sensor and electronic device having same |
CN110532992A (en) * | 2019-09-04 | 2019-12-03 | 深圳市捷顺科技实业股份有限公司 | A kind of face identification method based on visible light and near-infrared |
CN111447423A (en) * | 2020-03-25 | 2020-07-24 | 浙江大华技术股份有限公司 | Image sensor, imaging apparatus, and image processing method |
CN114143427A (en) * | 2021-11-23 | 2022-03-04 | 歌尔科技有限公司 | Camera assembly, mobile terminal and camera-based body temperature measuring method |
Also Published As
Publication number | Publication date |
---|---|
WO2019144956A1 (en) | 2019-08-01 |
CN108345845B (en) | 2020-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108345845A (en) | Imaging sensor, camera lens module, mobile terminal, face identification method and device | |
CN108427876A (en) | A kind of fingerprint identification method and mobile terminal | |
CN109743498A (en) | A kind of shooting parameter adjustment method and terminal device | |
CN108174081B (en) | A kind of image pickup method and mobile terminal | |
CN108462826A (en) | A kind of method and mobile terminal of auxiliary photo-taking | |
CN109743503A (en) | Reminding method and terminal | |
CN108174012A (en) | A kind of authority control method and mobile terminal | |
CN109525837A (en) | The generation method and mobile terminal of image | |
CN110175449A (en) | Cipher-code input method and mobile terminal | |
CN109814799A (en) | Screen response control mehtod and terminal device | |
CN109803110A (en) | A kind of image processing method, terminal device and server | |
CN109819166A (en) | A kind of image processing method and electronic equipment | |
CN109544172A (en) | A kind of display methods and terminal device | |
CN108629280A (en) | Face identification method and mobile terminal | |
CN108229420A (en) | A kind of face identification method, mobile terminal | |
CN110516488A (en) | A kind of barcode scanning method and mobile terminal | |
CN109033912A (en) | A kind of recognition methods of identification code and mobile terminal | |
CN109246351A (en) | A kind of patterning process and terminal device | |
CN110312035A (en) | A kind of control method of terminal, mobile terminal and computer readable storage medium | |
CN109639981A (en) | A kind of image capturing method and mobile terminal | |
CN108600492A (en) | A kind of unlocking screen method and terminal | |
CN110519443A (en) | A kind of bright screen method and mobile terminal | |
CN108345780A (en) | A kind of solution lock control method and mobile terminal | |
CN108536513A (en) | A kind of picture display direction method of adjustment and mobile terminal | |
CN109963033A (en) | A kind of voice-based control method and terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||