CN105447483B - Living body detection method and device - Google Patents

Living body detection method and device

Info

Publication number
CN105447483B
CN105447483B (application number CN201511030874.2A)
Authority
CN
China
Prior art keywords
intensity
image
living body
frequency response
frequency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201511030874.2A
Other languages
Chinese (zh)
Other versions
CN105447483A (en)
Inventor
范浩强
印奇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xuzhou Kuang Shi Data Technology Co., Ltd.
Beijing Megvii Technology Co Ltd
Beijing Maigewei Technology Co Ltd
Original Assignee
Xuzhou Kuang Shi Data Technology Co Ltd
Beijing Megvii Technology Co Ltd
Beijing Maigewei Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xuzhou Kuang Shi Data Technology Co Ltd, Beijing Megvii Technology Co Ltd, Beijing Maigewei Technology Co Ltd
Priority to CN201511030874.2A
Publication of CN105447483A
Application granted
Publication of CN105447483B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40 Spoof detection, e.g. liveness detection
    • G06V40/45 Detection of the body part being alive

Abstract

Embodiments of the present invention provide a living body detection method and device. The living body detection method includes: receiving at least two groups of object images, the at least two groups of object images being acquired of an object while the object is respectively illuminated by structured light of at least two different spatial frequencies; computing at least two frequency response intensity images corresponding to the at least two groups of object images; and determining whether the object is a living body based on the at least two frequency response intensity images. According to the living body detection method and device of the embodiments of the present invention, whether the object is a living body is judged from the detected object's frequency response to structured light of multiple spatial frequencies; this approach achieves highly secure living body detection without requiring cooperation from the detected object.

Description

Living body detection method and device
Technical field
The present invention relates to the technical field of face recognition, and more specifically to a living body detection method and device.
Background art
At present, face recognition systems are increasingly applied to security, finance and other scenarios requiring identity authentication, such as remote bank account opening, access control systems and remote transaction verification. In these high-security applications, besides ensuring that the face of the person being verified matches the reference images stored in the database, it is first necessary to verify that the person being verified is a legitimate living body. That is, a face recognition system must be able to defend against attacks in which an attacker uses a photo, a video, a mask or a three-dimensional face model (made of paper, plaster, rubber or similar materials).
Some existing living body detection methods require the cooperation of the person being detected. When the person does not cooperate, existing living body detection methods can typically only defend against planar objects such as photos and videos; their protection against three-dimensional face models with the shape of a human head is insufficient.
Therefore, to solve the problem of living body detection for face recognition without requiring cooperation, a new living body detection method is needed.
Summary of the invention
The present invention has been made in view of the above problem. The present invention provides a living body detection method and device.
According to one aspect of the present invention, a living body detection method is provided, comprising: receiving at least two groups of object images, the at least two groups of object images being acquired of the object while the object is respectively illuminated by structured light of at least two different spatial frequencies; computing at least two frequency response intensity images corresponding to the at least two groups of object images; and determining whether the object is a living body based on the at least two frequency response intensity images.
Illustratively, determining whether the object is a living body based on the at least two frequency response intensity images includes: obtaining one or more intensity ratio images based on the intensity ratio relationships between the at least two frequency response intensity images; performing face detection on one of the at least two frequency response intensity images to determine a face region; for each of the one or more intensity ratio images, computing the average intensity value of the pixels in the region of that intensity ratio image corresponding to the face region; and judging whether a predetermined number of the one or more intensity ratio images have average intensity values within their corresponding preset ranges; if so, determining that the object is a living body, and if not, determining that the object is not a living body.
Illustratively, obtaining one or more intensity ratio images based on the intensity ratio relationships between the at least two frequency response intensity images includes: selecting a specific frequency response intensity image from the at least two frequency response intensity images; and, for each of the remaining frequency response intensity images among the at least two frequency response intensity images, computing the ratio between the intensity values of the pixels of that remaining frequency response intensity image and the intensity values of the corresponding pixels of the specific frequency response intensity image, and obtaining the one or more intensity ratio images from the computed intensity ratios.
Illustratively, for each of the one or more intensity ratio images, computing the average intensity value of the pixels in the region of that intensity ratio image corresponding to the face region includes: computing the total intensity value of the pixels in the region of that intensity ratio image corresponding to the face region; and computing the ratio between the total intensity value and the area of the region of that intensity ratio image corresponding to the face region, to obtain the average intensity value.
Illustratively, the predetermined number is equal to the total number of intensity ratio images among the one or more intensity ratio images.
Illustratively, the preset ranges are obtained by training on real human faces.
Illustratively, the structured light includes infrared light.
Illustratively, each of the at least two groups of object images includes at least two object images acquired of the object while the object is illuminated by structured light of the same spatial frequency but different phases.
Illustratively, before the at least two groups of object images are received, the living body detection method further comprises: illuminating the object with structured light of at least two different spatial frequencies; and, each time the object is illuminated, acquiring an object image of the object, to obtain the at least two groups of object images.
Illustratively, computing the at least two frequency response intensity images corresponding to the at least two groups of object images includes: computing the frequency response intensity image corresponding to each group of object images according to the relationships between the pixels at corresponding positions in the images of that group.
According to another aspect of the present invention, a living body detection device is provided, comprising: a receiving module for receiving at least two groups of object images, the at least two groups of object images being acquired of an object while the object is respectively illuminated by structured light of at least two different spatial frequencies; a computing module for computing at least two frequency response intensity images corresponding to the at least two groups of object images; and a living body determining module for determining whether the object is a living body based on the at least two frequency response intensity images.
Illustratively, the living body determining module includes: an intensity ratio image obtaining submodule for obtaining one or more intensity ratio images based on the intensity ratio relationships between the at least two frequency response intensity images; a face detection submodule for performing face detection on one of the at least two frequency response intensity images to determine a face region; an average intensity computing submodule for computing, for each of the one or more intensity ratio images, the average intensity value of the pixels in the region of that intensity ratio image corresponding to the face region; and a judging submodule for judging whether a predetermined number of the one or more intensity ratio images have average intensity values within their corresponding preset ranges, determining that the object is a living body if so, and determining that the object is not a living body otherwise.
Illustratively, the intensity ratio image obtaining submodule includes: a selecting unit for selecting a specific frequency response intensity image from the at least two frequency response intensity images; and an intensity ratio computing unit for computing, for each of the remaining frequency response intensity images among the at least two frequency response intensity images, the ratio between the intensity values of the pixels of that remaining frequency response intensity image and the intensity values of the corresponding pixels of the specific frequency response intensity image, and obtaining the one or more intensity ratio images from the computed intensity ratios.
Illustratively, the average intensity computing submodule includes: a first computing unit for computing the total intensity value of the pixels in the region of the intensity ratio image corresponding to the face region; and a second computing unit for computing the ratio between the total intensity value and the area of the region of the intensity ratio image corresponding to the face region, to obtain the average intensity value.
Illustratively, the predetermined number is equal to the total number of intensity ratio images among the one or more intensity ratio images.
Illustratively, the preset ranges are obtained by training on real human faces.
Illustratively, the structured light includes infrared light.
Illustratively, each of the at least two groups of object images includes at least two object images acquired of the object while the object is illuminated by structured light of the same spatial frequency but different phases.
Illustratively, the living body detection device further comprises: a light emitting module for illuminating the object with structured light of at least two different spatial frequencies; and an image acquisition module for acquiring an object image of the object each time the object is illuminated, to obtain the at least two groups of object images.
Illustratively, the computing module is specifically configured to compute the frequency response intensity image corresponding to each group of object images according to the relationships between the pixels at corresponding positions in the images of that group.
According to the living body detection method and device of the embodiments of the present invention, whether the object is a living body is judged from the detected object's frequency response to structured light of multiple spatial frequencies. This approach achieves highly secure living body detection without requiring cooperation from the detected object.
Brief description of the drawings
The above and other objects, features and advantages of the present invention will become more apparent from the following more detailed description of embodiments of the present invention taken in conjunction with the accompanying drawings. The drawings are provided for a further understanding of the embodiments of the present invention and constitute a part of the specification; together with the embodiments, they serve to explain the present invention and are not to be construed as limiting it. In the drawings, the same reference numerals generally denote the same components or steps.
Fig. 1 shows a schematic block diagram of an exemplary electronic device for implementing the living body detection method and device according to embodiments of the present invention;
Fig. 2 shows a schematic flow chart of a living body detection method according to an embodiment of the present invention;
Fig. 3 shows a schematic diagram of illuminating an object with structured light and acquiring object images according to an embodiment of the present invention;
Fig. 4 shows schematic grating fringe patterns of the structured light according to an embodiment of the present invention;
Fig. 5 shows a schematic flow chart of the step of determining whether the object is a living body according to an embodiment of the present invention;
Fig. 6 shows a schematic block diagram of a living body detection device according to an embodiment of the present invention;
Fig. 7 shows a schematic block diagram of the living body determining module in the living body detection device according to an embodiment of the present invention; and
Fig. 8 shows a schematic block diagram of a living body detection system according to an embodiment of the present invention.
Detailed description of the embodiments
In order to make the objects, technical solutions and advantages of the present invention more apparent, example embodiments according to the present invention are described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention rather than all of them, and it should be understood that the present invention is not limited by the example embodiments described herein. All other embodiments obtained by those skilled in the art based on the embodiments described herein without creative effort shall fall within the scope of the present invention.
First, an exemplary electronic device 100 for implementing the living body detection method and device according to embodiments of the present invention is described with reference to Fig. 1.
As shown in Fig. 1, the electronic device 100 includes one or more processors 102, one or more storage devices 104, an input device 106, an output device 108, an image acquisition device 110 and a light emitting device 114, which are interconnected via a bus system 112 and/or another form of connection mechanism (not shown). It should be noted that the components and structure of the electronic device 100 shown in Fig. 1 are merely exemplary and not limiting; the electronic device may have other components and structures as needed.
The processor 102 may be a central processing unit (CPU) or another form of processing unit with data processing capability and/or instruction execution capability, and may control other components in the electronic device 100 to perform desired functions.
The storage device 104 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory. The non-volatile memory may include, for example, read-only memory (ROM), a hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 102 may run the program instructions to implement the client functions (implemented by the processor) of the embodiments of the present invention described below and/or other desired functions. Various applications and various data, such as data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
The input device 106 may be a device used by a user to input instructions, and may include one or more of a keyboard, a mouse, a microphone, a touch screen, etc.
The output device 108 may output various information (such as images and/or sounds) to the outside (for example, to a user), and may include one or more of a display, a speaker, etc.
The image acquisition device 110 may capture desired images (such as photos, video frames, etc.) and store the captured images in the storage device 104 for use by other components. The image acquisition device 110 may be implemented by any suitable camera device, such as a camera, a video camera or the camera of a mobile terminal.
The light emitting device 114 may emit structured light. Illustratively, the light source used by the light emitting device 114 is an infrared light source, and the emitted structured light is infrared light. It should be understood that if the light emitting device 114 uses an infrared light source, the image acquisition device 110 should also be capable of infrared imaging. The light emitting device 114 may emit structured light with a desired spatial frequency and a desired phase at a desired time, either independently or under the control of the processor 102. The light emitting device 114 may be implemented with equipment such as a laser or a projector.
Illustratively, the exemplary electronic device for implementing the living body detection method and device according to embodiments of the present invention may be implemented as a smart phone, a tablet computer, the image acquisition end of an access control system, a personal computer, or the like.
A living body detection method according to an embodiment of the present invention will now be described with reference to Fig. 2. Fig. 2 shows a schematic flow chart of a living body detection method 200 according to an embodiment of the present invention. As shown in Fig. 2, the living body detection method 200 includes the following steps.
In step S210, at least two groups of object images are received, the at least two groups of object images being acquired of an object while the object is respectively illuminated by structured light of at least two different spatial frequencies.
The "object" described herein is the object participating in living body detection; it may be a real human face, or a photo, video, mask or face model used by an attacker.
According to one embodiment, the structured light may include infrared light (as described above). The wavelength of the infrared light may be, for example, 808 nm or 850 nm. Since people are insensitive to infrared light, using infrared light avoids disturbing the person being detected and thus improves the user experience. Further, optionally, line structured light may be used for living body detection of the object. The image data measured with line structured light contains a relatively large amount of information but is simple to process, which facilitates accurate and fast living body detection.
As an example, a light emitting device may be used to successively emit at least two kinds of structured light with different spatial frequencies to illuminate the object. Each time the light emitting device illuminates the object with structured light, an image acquisition device may acquire an image of the object, thereby obtaining an object image. The image acquisition device may then send the acquired object images to a memory for later processing by a processor, or transmit them directly to the processor. In the following description, the light emitting device is a projector and the image acquisition device is a camera. Fig. 3 shows a schematic diagram of illuminating an object with structured light and acquiring object images according to an embodiment of the present invention. In order to collect relatively rich and reliable information, the projector should be spatially close to the camera, as shown in Fig. 3.
The form of the structured light is described below. Fig. 4 shows schematic grating fringe patterns of the structured light according to an embodiment of the present invention. As shown in Fig. 4, the projector emits two kinds of structured light, which are divided into two groups, an L group and an R group. The left side of Fig. 4 shows the L group, which includes the structured light corresponding to three grating fringe patterns with a first spatial frequency; the right side of Fig. 4 shows the R group, which includes the structured light corresponding to three grating fringe patterns with a second spatial frequency. The six grating fringe patterns have sinusoidal stripes, as shown in Fig. 4. The first spatial frequency is different from the second spatial frequency. Illustratively, the first spatial frequency may be 1/24 of the second spatial frequency, that is, the spatial period of the L group structured light is 24 times the spatial period of the R group structured light. It should be understood, of course, that this ratio is merely exemplary and may be any suitable value; the present invention is not limited in this respect. The phases of the structured light corresponding to the three grating fringe patterns in the L group are different from one another, with a phase difference of 120 degrees between them. The phases of the structured light corresponding to the three grating fringe patterns in the R group are likewise different, again with a phase difference of 120 degrees between them. Of course, these phase differences are merely exemplary and may be any suitable values; the present invention is not limited in this respect.
The projector may be used to successively illuminate the object with the structured light corresponding to each grating fringe pattern, and the camera acquires an object image each time the object is illuminated. Fig. 4 shows six grating fringe patterns, so six object images should be collected. These six object images can be divided into two groups according to the spatial frequency of the corresponding structured light.
Although Fig. 4 takes two kinds of structured light with different frequencies as an example, it should be understood that there may be more than two spatial frequencies of structured light, and correspondingly more than two groups of object images. In addition, the number of phases of the structured light at the same spatial frequency may be fewer or more than three, and correspondingly the number of object images in each group may be fewer or more than three.
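To make the setup of Fig. 4 concrete, the following minimal sketch (in Python with NumPy) generates six sinusoidal grating fringe patterns: an L group at a low spatial frequency and an R group at 24 times that frequency, each with phases of 0, 120 and 240 degrees. The image size, pixel periods and function names are illustrative assumptions, not values taken from the patent.
```python
import numpy as np

def grating_pattern(width, height, spatial_freq, phase):
    # One sinusoidal grating fringe pattern with vertical stripes;
    # spatial_freq is in cycles per pixel along x, phase is in radians.
    x = np.arange(width)
    row = 0.5 + 0.5 * np.sin(2 * np.pi * spatial_freq * x + phase)
    return np.tile(row, (height, 1))

# Six patterns as in Fig. 4: L group period 240 px, R group period 10 px
# (a 24:1 ratio of spatial periods), three phases 120 degrees apart each.
phases = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
patterns_L = [grating_pattern(1280, 720, 1 / 240, p) for p in phases]
patterns_R = [grating_pattern(1280, 720, 1 / 10, p) for p in phases]
```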
In step S220, at least two frequency response intensity images corresponding to the at least two groups of object images are computed.
Taking each group of object images as a unit, one frequency response intensity image can be computed for that group. In one embodiment, the frequency response intensity image corresponding to a group of object images may be computed according to the relationships between the pixels at corresponding positions in the images of that group. The computation of the frequency response intensity image is described below.
For the i-th object image in a given group of object images, let I[i, x, y] denote the intensity value of the pixel at position (x, y) of the i-th object image, and let a_i denote the phase offset of the structured light corresponding to the i-th object image. The frequency response of the structured light on the object can be expressed by the following formula:
q[x, y] * sin(p + a_i) + r = I[i, x, y]    (1)
where p is the initial phase for this group of object images, r is the intensity of the background (ambient) light component and serves as an auxiliary variable, and q[x, y] is the intensity value of the pixel at position (x, y) of the frequency response intensity image corresponding to this group of object images. p, q[x, y] and r are unknowns.
For a given group of object images, formula (1) can be written for each object image in the group, forming a system of equations. Solving this system for q[x, y] then yields the frequency response intensity image.
The following takes a group containing four object images as an example. Assuming that the phase offsets of the structured light corresponding to the four object images are a_0, a_1, a_2 and a_3 respectively, the following system of equations can be written:
q[x, y] * sin(p + a_0) + r = I[1, x, y]
q[x, y] * sin(p + a_1) + r = I[2, x, y]
q[x, y] * sin(p + a_2) + r = I[3, x, y]
q[x, y] * sin(p + a_3) + r = I[4, x, y]    (2)
It will be understood that, since the frequency response formula has only three unknowns p, q[x, y] and r, the system of equations (2) is overdetermined. In this case, q[x, y] can be computed as the least squares solution of system (2), yielding the frequency response intensity image.
From the above description, when a group of object images contains four or more object images corresponding to structured light with different phases, the resulting system of equations is overdetermined and a least squares solution must be sought. When a group contains three such object images, the system is exactly determined and can be solved exactly. When a group contains only two such object images, the system is underdetermined and an approximate solution must be sought. In the case where a group of object images contains only two object images, an approximate solution can be obtained with the following formula:
q[x, y] = max(I[1, x, y], I[2, x, y]) - min(I[1, x, y], I[2, x, y])    (3)
Optionally, for each group of structured light with the same spatial frequency, structured light with three or more different phases may be used to illuminate the object and three or more object images may be acquired, which improves detection accuracy.
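System (2) can be solved per pixel by least squares. The minimal NumPy sketch below assumes the object images of one group are available as arrays; it rewrites sin(p + a_i) as cos(a_i)*sin(p) + sin(a_i)*cos(p) so that the model becomes linear in three unknowns, and recovers q[x, y] as the modulation amplitude. Function and variable names are illustrative and not part of the patent.
```python
import numpy as np

def frequency_response_intensity(images, phase_offsets):
    # Least-squares solution of system (2) for q[x, y].
    # With c1 = q*sin(p) and c2 = q*cos(p), formula (1) becomes
    #   I[i, x, y] = c1*cos(a_i) + c2*sin(a_i) + r,
    # which is linear in (c1, c2, r); then q = sqrt(c1**2 + c2**2).
    I = np.stack([np.asarray(img, dtype=np.float64) for img in images])  # (N, H, W)
    a = np.asarray(phase_offsets, dtype=np.float64)                      # (N,)
    design = np.column_stack([np.cos(a), np.sin(a), np.ones_like(a)])    # (N, 3)
    n, h, w = I.shape
    coeffs, *_ = np.linalg.lstsq(design, I.reshape(n, -1), rcond=None)   # (3, H*W)
    c1, c2 = coeffs[0], coeffs[1]
    return np.sqrt(c1 ** 2 + c2 ** 2).reshape(h, w)
```
For a group containing only two phases, formula (3) above would be used instead of this least squares fit.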
The computation of the frequency response intensity image is further described below with reference to the structured light shown in Fig. 4.
For the structured light corresponding to the six grating fringe patterns shown in Fig. 4, assume that the three object images acquired under illumination by the structured light corresponding to the three L group grating fringe patterns are the 1st, 2nd and 3rd object images, and that the three object images acquired under illumination by the structured light corresponding to the three R group grating fringe patterns are the 4th, 5th and 6th object images.
For the object images corresponding to the L group structured light, let A[x, y] denote the intensity value of the pixel at position (x, y) of the corresponding frequency response intensity image. It can then be derived that:
A[x, y] = ((2*I[1, x, y] - I[2, x, y] - I[3, x, y])^2 + 3*(I[2, x, y] - I[3, x, y])^2) / 9.
Similarly, for the object images corresponding to the R group structured light, let B[x, y] denote the intensity value of the pixel at position (x, y) of the corresponding frequency response intensity image; then B[x, y] = ((2*I[4, x, y] - I[5, x, y] - I[6, x, y])^2 + 3*(I[5, x, y] - I[6, x, y])^2) / 9.
In the embodiment described below, in which intensity ratio images are obtained based on the intensity ratio relationship between the frequency response intensity images, one may equally set A[x, y] = (2*I[1, x, y] - I[2, x, y] - I[3, x, y])^2 + 3*(I[2, x, y] - I[3, x, y])^2 and B[x, y] = (2*I[4, x, y] - I[5, x, y] - I[6, x, y])^2 + 3*(I[5, x, y] - I[6, x, y])^2, since the common factor 1/9 cancels in the ratio.
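Under the assumption that the six object images of Fig. 4 are available as NumPy arrays, the closed-form expressions above can be evaluated directly; the sketch below is illustrative, and the function name is not from the patent.
```python
import numpy as np

def three_step_response(i1, i2, i3):
    # Frequency response intensity for three images taken under structured
    # light of one spatial frequency with phases 0, 120 and 240 degrees.
    # This equals the squared amplitude c1**2 + c2**2 of the model above.
    i1, i2, i3 = (np.asarray(i, dtype=np.float64) for i in (i1, i2, i3))
    return ((2 * i1 - i2 - i3) ** 2 + 3 * (i2 - i3) ** 2) / 9.0

# A = three_step_response(img1, img2, img3)   # L group (low spatial frequency)
# B = three_step_response(img4, img5, img6)   # R group (high spatial frequency)
```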
In step S230, whether the object is a living body is determined based on the at least two frequency response intensity images.
After the at least two frequency response intensity images are computed, whether the object is a living body may be determined based on them.
Objects of different materials respond differently to structured light of various spatial frequencies, so the frequency response intensity images of human skin differ greatly from those of objects such as screens, paper, plaster or rubber; living faces and non-living attack instruments can therefore be distinguished by means of the frequency response intensity images. This approach requires no cooperation from the detected object and can distinguish a living face even from a three-dimensional face model with the shape of a human head; it is a detection method with both high security and high usability.
Illustratively, the living body detection method according to embodiments of the present invention may be implemented in a device, apparatus or system having a memory and a processor.
The living body detection method according to embodiments of the present invention may be deployed at a face image acquisition end; for example, in the security field it may be deployed at the image acquisition end of an access control system, and in the financial field it may be deployed at a personal terminal such as a smart phone, tablet computer or personal computer.
Alternatively, the living body detection method according to embodiments of the present invention may be deployed in a distributed manner at a server end (or cloud) and a personal terminal. For example, in the financial field, object images may be acquired at a personal terminal, the personal terminal transmits the object images to the server end (or cloud), and the server end (or cloud) performs living body detection based on the object images.
According to the living body detection method of embodiments of the present invention, whether the object is a living body is judged from the detected object's frequency response to structured light of multiple spatial frequencies; this approach achieves highly secure living body detection without requiring cooperation from the detected object.
Fig. 5 shows a schematic flow chart of the step of determining whether the object is a living body according to an embodiment of the present invention (i.e. step S230 shown in Fig. 2). As shown in Fig. 5, step S230 may include the following steps.
In step S231, one or more intensity ratio images are obtained based on the intensity ratio relationships between the at least two frequency response intensity images.
Illustratively, step S231 may specifically include: selecting a specific frequency response intensity image from the at least two frequency response intensity images; and, for each of the remaining frequency response intensity images among the at least two frequency response intensity images, computing the ratio between the intensity values of the pixels of that remaining frequency response intensity image and the intensity values of the corresponding pixels of the specific frequency response intensity image, and obtaining the one or more intensity ratio images from the computed intensity ratios. Intensity ratio images can be obtained quickly and simply in this way. Examples are given below.
Still taking the embodiment shown in Fig. 4 as an example, there are two frequency response intensity images A[x, y] and B[x, y]. Let R[x, y] denote the intensity ratio image; it may be computed as R[x, y] = B[x, y] / A[x, y] or R[x, y] = A[x, y] / B[x, y]. The intensity ratio image may also be computed as R[x, y] = (A[x, y] - B[x, y]) / A[x, y] = 1 - B[x, y] / A[x, y], or R[x, y] = (B[x, y] - A[x, y]) / B[x, y] = 1 - A[x, y] / B[x, y].
When there are more than two frequency response intensity images, the frequency response intensity images can be combined flexibly to compute the intensity ratio images. For example, assume that there are four frequency response intensity images A[x, y], B[x, y], C[x, y] and D[x, y]. One frequency response intensity image A[x, y] may be selected from the four and combined in turn with each of the remaining frequency response intensity images to form three combinations, and the intensity ratio image of each combination may then be computed in the manner described above for two frequency response intensity images. In this way three intensity ratio images are obtained, for example R1[x, y] = B[x, y] / A[x, y], R2[x, y] = C[x, y] / A[x, y] and R3[x, y] = D[x, y] / A[x, y]. When subsequently judging whether the average intensity values of the intensity ratio images are within the predetermined ranges (i.e. step S234), each intensity ratio image may be judged in turn. This way of computing the intensity ratio images is only an example; any other suitable combination may be used to obtain intensity ratio images, as described below. In one example, the above frequency response intensity images may be divided into two combinations, A[x, y] with B[x, y] and C[x, y] with D[x, y], and two intensity ratio images computed accordingly: R1[x, y] = B[x, y] / A[x, y] and R2[x, y] = C[x, y] / D[x, y]. In another example, a single intensity ratio image may be computed from the four frequency response intensity images, such as R[x, y] = (B[x, y] + C[x, y] + D[x, y]) / A[x, y] = B[x, y] / A[x, y] + C[x, y] / A[x, y] + D[x, y] / A[x, y].
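A minimal sketch of step S231 for the two-frequency case of Fig. 4 follows; the small epsilon guarding against division by zero is an assumption for illustration and is not part of the patent.
```python
import numpy as np

def intensity_ratio_image(numerator, denominator, eps=1e-9):
    # Pixel-wise ratio of two frequency response intensity images,
    # e.g. R[x, y] = B[x, y] / A[x, y].
    return np.asarray(numerator, dtype=np.float64) / (
        np.asarray(denominator, dtype=np.float64) + eps)

# R = intensity_ratio_image(B, A)
```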
Those skilled in the art will understand how the intensity ratio images of any number of frequency response intensity images can be computed in the manner described above, so the details are not repeated here. It should also be understood that the foregoing is only an example; any suitable implementation that obtains intensity ratio images based on the intensity ratio relationships between the frequency response intensity images shall fall within the scope of the present invention.
In step S232, face detection is performed on one of the at least two frequency response intensity images to determine a face region.
In this step, any one of the at least two frequency response intensity images may be selected; it is determined whether the selected frequency response intensity image contains a face, and if it does, the face region is located in the selected frequency response intensity image. For the embodiment shown in Fig. 4, the frequency response intensity image corresponding to the L group structured light is preferably selected, that is, the frequency response intensity image corresponding to the group of structured light with the lower spatial frequency is preferred. Compared with the other frequency response intensity images, face detection on the frequency response intensity image corresponding to the structured light with the lower spatial frequency is simpler and more accurate, and the face region in the image is easier to identify.
A pre-trained face detector may be used to locate the face region in the frequency response intensity image. For example, a face detector may be trained in advance on a large number of images using face detection and recognition algorithms such as Haar features and the Adaboost algorithm; for an input single-frame image, the pre-trained face detector can rapidly locate the face region.
It should be understood that the present invention is not limited by the specific face detection method used; both existing face detection methods and face detection methods developed in the future can be applied to the living body detection method according to embodiments of the present invention, and they shall also be included within the scope of the present invention.
In step S233, for each of the one or more intensity ratio images, the average intensity value of the pixels in the region of that intensity ratio image corresponding to the face region is computed.
Illustratively, step S233 may specifically include: computing the total intensity value of the pixels in the region of that intensity ratio image corresponding to the face region; and computing the ratio between the total intensity value and the area of the region of that intensity ratio image corresponding to the face region, to obtain the average intensity value. The average intensity value can be computed accurately in this way.
The computation of the average intensity value is illustrated below. Assume that the face region determined in step S232 is the rectangle (x0, y0) to (x1, y1). The average intensity value u of the pixels in the corresponding region of the intensity ratio image R[x, y] can be computed as u = (sum of R[x, y] over all pixels (x, y) in the rectangle) / (number of pixels in the rectangle).
When multiple intensity ratio images are obtained in step S231, one average intensity value may be computed for each intensity ratio image, yielding multiple average intensity values.
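A minimal sketch of step S233, assuming the face region is an axis-aligned rectangle (x0, y0) to (x1, y1) with inclusive bounds and that images are stored as row-major arrays indexed [y, x]; names are illustrative.
```python
import numpy as np

def face_region_mean(ratio_image, face_box):
    # Average intensity over the face rectangle: total intensity / area.
    x0, y0, x1, y1 = face_box
    region = np.asarray(ratio_image, dtype=np.float64)[y0:y1 + 1, x0:x1 + 1]
    return region.sum() / region.size
```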
In step S234, it is judged whether a predetermined number of the one or more intensity ratio images have average intensity values within their corresponding preset ranges; if so, the method proceeds to step S235, and if not, to step S236.
In step S235, it is determined that the object is a living body.
In step S236, it is determined that the object is not a living body.
The predetermined number may be set as needed. If only one intensity ratio image is obtained in step S231, the predetermined number is equal to 1; that is, it may be judged directly whether the average intensity value of this intensity ratio image is within the preset range, and if so the object is determined to be a living body, otherwise the object is determined not to be a living body. If multiple intensity ratio images are obtained in step S231, the predetermined number may be equal to or less than the total number of intensity ratio images. For example, if three intensity ratio images are obtained in step S231, the predetermined number may be equal to 3. Each of the three intensity ratio images has its own corresponding preset range, and it may be judged directly whether the average intensity values of all three intensity ratio images are within their corresponding preset ranges; if so, the three intensity ratio images are considered to meet the requirement and the object is determined to be a living body, otherwise the object is determined not to be a living body. As another example, if three intensity ratio images are obtained in step S231, the predetermined number may be equal to 2; it may then be judged whether two of the three intensity ratio images have average intensity values within their corresponding preset ranges. If so, the three intensity ratio images are considered to meet the requirement and the object is determined to be a living body; if not (that is, only one or none of the intensity ratio images has an average intensity value within its corresponding preset range), the object is determined not to be a living body.
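The decision rule of steps S234 to S236 can be sketched as follows, under the assumption that each preset range is represented as a (lower, upper) interval obtained by training on real faces; the names are illustrative.
```python
def is_living_body(average_values, preset_ranges, predetermined_number):
    # Count the intensity ratio images whose average intensity value lies
    # inside its corresponding preset range (step S234).
    in_range = sum(lower <= u <= upper
                   for u, (lower, upper) in zip(average_values, preset_ranges))
    return in_range >= predetermined_number
```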
The preset ranges may be bounded by one or more thresholds and may be obtained by training on real human faces. For example, real human faces may be illuminated in advance with structured light of various spatial frequencies and a large number of object images of real faces acquired. The frequency response intensity images corresponding to the structured light of the various spatial frequencies are then computed, and the characteristics of the intensity ratio images obtained from the intensity ratio relationships between the frequency response intensity images corresponding to structured light of different spatial frequencies are summarized, so as to obtain suitable thresholds, i.e. suitable preset ranges.
Judging whether the object is a living body according to the intensity ratio images is a living body detection approach with relatively high accuracy.
Illustratively, each of the at least two groups of object images includes at least two object images acquired of the object while the object is illuminated by structured light of the same spatial frequency but different phases. As described above, for each group of object images, a system of equations such as system (2) must be solved to compute the frequency response intensity image. Therefore, for the structured light of a given spatial frequency, at least two different phases may be set, and correspondingly at least two object images are acquired.
Illustratively, before step S210, the living body detection method 200 may further include: illuminating the object with structured light of at least two different spatial frequencies; and, each time the object is illuminated, acquiring an object image of the object, to obtain the at least two groups of object images. The method of illuminating the object with structured light and acquiring the object images has been described above in conjunction with Fig. 3 and Fig. 4. As described above, a light emitting device (for example, a projector) may be used to emit structured light towards the object, and an image acquisition device (for example, a camera) may be used to acquire the object images.
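Building on the sketches above, an end-to-end flow for the Fig. 4 setup might look as follows. project() and capture() stand in for the projector and camera drivers, and detect_face() for a pre-trained face detector; these, like the single-ratio decision, are assumptions made for illustration rather than parts of the patent.
```python
def capture_group(project, capture, patterns):
    # Show each grating fringe pattern and acquire one object image.
    images = []
    for pattern in patterns:
        project(pattern)
        images.append(capture())
    return images

def liveness_pipeline(project, capture, patterns_L, patterns_R,
                      detect_face, preset_range):
    imgs_L = capture_group(project, capture, patterns_L)   # low frequency
    imgs_R = capture_group(project, capture, patterns_R)   # high frequency
    A = three_step_response(*imgs_L)
    B = three_step_response(*imgs_R)
    ratio = intensity_ratio_image(B, A)                    # step S231
    face_box = detect_face(A)                              # step S232
    u = face_region_mean(ratio, face_box)                  # step S233
    lower, upper = preset_range
    return lower <= u <= upper                             # steps S234-S236
```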
Fig. 6 shows a schematic block diagram of a living body detection device 600 according to an embodiment of the present invention.
As shown in Fig. 6, the living body detection device 600 according to an embodiment of the present invention includes a receiving module 610, a computing module 620 and a living body determining module 630.
The receiving module 610 is configured to receive at least two groups of object images, the at least two groups of object images being acquired of an object while the object is respectively illuminated by structured light of at least two different spatial frequencies. The receiving module 610 may be implemented by the processor 102 in the electronic device shown in Fig. 1 running program instructions stored in the storage device 104.
The computing module 620 is configured to compute at least two frequency response intensity images corresponding to the at least two groups of object images. The computing module 620 may be implemented by the processor 102 in the electronic device shown in Fig. 1 running program instructions stored in the storage device 104.
The living body determining module 630 is configured to determine whether the object is a living body based on the at least two frequency response intensity images. The living body determining module 630 may be implemented by the processor 102 in the electronic device shown in Fig. 1 running program instructions stored in the storage device 104, and may execute steps S231 to S236 of the living body detection method according to embodiments of the present invention.
Fig. 7 shows a schematic block diagram of the living body determining module 630 in the living body detection device 600 according to an embodiment of the present invention.
As shown in Fig. 7, the living body determining module 630 may include an intensity ratio image obtaining submodule 6310, a face detection submodule 6320, an average intensity computing submodule 6330 and a judging submodule 6340.
The intensity ratio image obtaining submodule 6310 is configured to obtain one or more intensity ratio images based on the intensity ratio relationships between the at least two frequency response intensity images. The intensity ratio image obtaining submodule 6310 may be implemented by the processor 102 in the electronic device shown in Fig. 1 running program instructions stored in the storage device 104.
The face detection submodule 6320 is configured to perform face detection on one of the at least two frequency response intensity images to determine a face region. The face detection submodule 6320 may be a face detector and may be implemented by the processor 102 in the electronic device shown in Fig. 1 running program instructions stored in the storage device 104.
The average intensity computing submodule 6330 is configured to compute, for each of the one or more intensity ratio images, the average intensity value of the pixels in the region of that intensity ratio image corresponding to the face region. The average intensity computing submodule 6330 may be implemented by the processor 102 in the electronic device shown in Fig. 1 running program instructions stored in the storage device 104.
The judging submodule 6340 is configured to judge whether a predetermined number of the one or more intensity ratio images have average intensity values within their corresponding preset ranges; if so, it determines that the object is a living body, and if not, it determines that the object is not a living body. The judging submodule 6340 may be implemented by the processor 102 in the electronic device shown in Fig. 1 running program instructions stored in the storage device 104.
According to an embodiment of the present invention, the intensity ratio image obtaining submodule 6310 may include a selecting unit and an intensity ratio computing unit. The selecting unit is configured to select a specific frequency response intensity image from the at least two frequency response intensity images. The intensity ratio computing unit is configured to compute, for each of the remaining frequency response intensity images among the at least two frequency response intensity images, the ratio between the intensity values of the pixels of that remaining frequency response intensity image and the intensity values of the corresponding pixels of the specific frequency response intensity image, and to obtain the one or more intensity ratio images from the computed intensity ratios.
According to an embodiment of the present invention, the average intensity computing submodule 6330 includes a first computing unit and a second computing unit. The first computing unit is configured to compute the total intensity value of the pixels in the region of the intensity ratio image corresponding to the face region. The second computing unit is configured to compute the ratio between the total intensity value and the area of the region of the intensity ratio image corresponding to the face region, to obtain the average intensity value.
According to an embodiment of the present invention, the predetermined number is equal to the total number of intensity ratio images among the one or more intensity ratio images.
According to an embodiment of the present invention, the preset ranges are obtained by training on real human faces.
According to an embodiment of the present invention, the structured light includes infrared light.
According to an embodiment of the present invention, each of the at least two groups of object images includes at least two object images acquired of the object while the object is illuminated by structured light of the same spatial frequency but different phases.
According to an embodiment of the present invention, the living body detection device 600 further comprises a light emitting module and an image acquisition module. The light emitting module is configured to illuminate the object with the structured light of at least two different spatial frequencies. The image acquisition module is configured to acquire an object image of the object each time the object is illuminated, to obtain the at least two groups of object images. The light emitting module may be implemented by the light emitting device 114 shown in Fig. 1, and the image acquisition module may be implemented by the image acquisition device 110 shown in Fig. 1.
According to an embodiment of the present invention, the computing module 620 in the living body detection device 600 is specifically configured to compute the frequency response intensity image corresponding to each group of object images according to the relationships between the pixels at corresponding positions in the images of that group.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware or in a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered to be beyond the scope of the present invention.
Fig. 8 shows a schematic block diagram of a living body detection system 800 according to an embodiment of the present invention. The living body detection system 800 includes an image acquisition device 810, a light emitting device 820, a storage device 830 and a processor 840.
The image acquisition device 810 is configured to acquire object images. The light emitting device 820 is configured to emit structured light.
The storage device 830 stores program code for implementing the corresponding steps of the living body detection method according to embodiments of the present invention.
The processor 840 is configured to run the program code stored in the storage device 830 to perform the corresponding steps of the living body detection method according to embodiments of the present invention, and to implement the receiving module 610, the computing module 620 and the living body determining module 630 of the living body detection device 600 according to embodiments of the present invention.
In one embodiment, when the program code is run by the processor 840, the following steps are performed: receiving at least two groups of object images, the at least two groups of object images being acquired of an object while the object is respectively illuminated by structured light of at least two different spatial frequencies; computing at least two frequency response intensity images corresponding to the at least two groups of object images; and determining whether the object is a living body based on the at least two frequency response intensity images.
In one embodiment, when the program code is run by the processor 840, the step of determining whether the object is a living body based on the at least two frequency response intensity images includes: obtaining one or more intensity ratio images based on the intensity ratio relationships between the at least two frequency response intensity images; performing face detection on one of the at least two frequency response intensity images to determine a face region; for each of the one or more intensity ratio images, computing the average intensity value of the pixels in the region of that intensity ratio image corresponding to the face region; and judging whether a predetermined number of the one or more intensity ratio images have average intensity values within their corresponding preset ranges; if so, determining that the object is a living body, and if not, determining that the object is not a living body.
In one embodiment, when the program code is run by the processor 840, the step of obtaining one or more intensity ratio images based on the intensity ratio relationships between the at least two frequency response intensity images includes: selecting a specific frequency response intensity image from the at least two frequency response intensity images; and, for each of the remaining frequency response intensity images among the at least two frequency response intensity images, computing the ratio between the intensity values of the pixels of that remaining frequency response intensity image and the intensity values of the corresponding pixels of the specific frequency response intensity image, and obtaining the one or more intensity ratio images from the computed intensity ratios.
In one embodiment, when the program code is run by the processor 840, the step of computing, for each of the one or more intensity ratio images, the average intensity value of the pixels in the region of that intensity ratio image corresponding to the face region includes: computing the total intensity value of the pixels in the region of that intensity ratio image corresponding to the face region; and computing the ratio between the total intensity value and the area of the region of that intensity ratio image corresponding to the face region, to obtain the average intensity value.
In one embodiment, the predetermined number is equal to the total number of intensity ratio images among the one or more intensity ratio images.
In one embodiment, the preset ranges are obtained by training on real human faces.
In one embodiment, the structured light includes infrared light.
In one embodiment, each of the at least two groups of object images includes at least two object images acquired of the object while the object is illuminated by structured light of the same spatial frequency but different phases.
In one embodiment, when the program code is run by the processor 840, the step of computing the at least two frequency response intensity images corresponding to the at least two groups of object images includes: computing the frequency response intensity image corresponding to each group of object images according to the relationships between the pixels at corresponding positions in the images of that group.
In addition, according to embodiments of the present invention, additionally providing a kind of storage medium, storing program on said storage Instruction, when described program instruction is run by computer or processor for executing the biopsy method of the embodiment of the present invention Corresponding steps, and for realizing the corresponding module in living body detection device according to an embodiment of the present invention.The storage medium It such as may include the storage card of smart phone, the storage unit of tablet computer, the hard disk of personal computer, read-only memory (ROM), Erasable Programmable Read Only Memory EPROM (EPROM), portable compact disc read-only memory (CD-ROM), USB storage, Or any combination of above-mentioned storage medium.
In one embodiment, the computer program instructions may be implemented real according to the present invention when being run by computer Each functional module of the living body detection device of example is applied, and/or In vivo detection according to an embodiment of the present invention can be executed Method.
In one embodiment, the computer program instructions execute following steps when being run by computer: receiving extremely Few two group objects images, at least two groups object images structure light irradiation pair at least two with different space frequency respectively It is obtained as in the case where for object acquisition;It is strong to calculate at least two frequency responses corresponding at least two groups object images Spend image;And determine whether object is living body based at least two frequency response intensity images.
In one embodiment, when run by a computer, the computer program instructions perform the step of determining whether the object is a living body based on the at least two frequency response intensity images by: obtaining one or more intensity ratio images based on the intensity relationship between the at least two frequency response intensity images; performing face detection on one of the at least two frequency response intensity images to determine a face region; for each of the one or more intensity ratio images, calculating the average intensity value of the pixels in the region of the intensity ratio image corresponding to the face region; and judging whether there are, among the one or more intensity ratio images, a predetermined number of intensity ratio images whose average intensity values are within corresponding preset ranges, and if so, determining that the object is a living body, and if not, determining that the object is not a living body.
In one embodiment, when run by a computer, the computer program instructions perform the step of obtaining one or more intensity ratio images based on the intensity relationship between the at least two frequency response intensity images by: selecting a specific frequency response intensity image from the at least two frequency response intensity images; and, for each of the remaining frequency response intensity images among the at least two frequency response intensity images, calculating the ratios between the intensity values of the pixels of the remaining frequency response intensity image and the intensity values of the corresponding pixels in the specific frequency response intensity image, and obtaining the one or more intensity ratio images according to the calculated intensity value ratios.
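By way of illustration only, the following Python sketch shows how such intensity ratio images might be formed. Taking the first frequency response intensity image as the specific (reference) image and guarding the division with a small epsilon are assumptions, not requirements of the patent.

```python
import numpy as np

def intensity_ratio_images(freq_response_images, reference_index=0, eps=1e-6):
    """Divide every remaining frequency response intensity image, pixel by pixel,
    by the selected specific (reference) image, yielding one intensity ratio image
    per remaining spatial frequency."""
    reference = freq_response_images[reference_index].astype(np.float64)
    ratios = []
    for i, image in enumerate(freq_response_images):
        if i == reference_index:
            continue  # the specific image is not divided by itself
        ratios.append(image.astype(np.float64) / (reference + eps))
    return ratios
```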
In one embodiment, when run by a computer, the computer program instructions perform the step of calculating, for each of the one or more intensity ratio images, the average intensity value of the pixels in the region of the intensity ratio image corresponding to the face region by: calculating the total intensity value of the pixels in the region of the intensity ratio image corresponding to the face region; and calculating the ratio between the total intensity value and the area of the region of the intensity ratio image corresponding to the face region, to obtain the average intensity value.
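By way of illustration only, the following Python sketch combines the three steps just described into a single decision routine. OpenCV's Haar cascade merely stands in for the unspecified face detector, the reference index and epsilon guard are the same assumptions as above, and preset_ranges and predetermined_number are assumed to be supplied externally (for example, from a calibration such as the calibrate_preset_ranges sketch given earlier).

```python
import cv2
import numpy as np

def is_living_body(freq_response_images, preset_ranges, predetermined_number,
                   reference_index=0, eps=1e-6):
    """Decide liveness from frequency response intensity images: form intensity
    ratio images, detect a face region, average each ratio image over that region
    (total intensity divided by area), and count how many averages fall inside
    their preset ranges."""
    # Face detection on one of the frequency response intensity images.
    probe = cv2.normalize(freq_response_images[0], None, 0, 255,
                          cv2.NORM_MINMAX).astype(np.uint8)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(probe, 1.1, 5)
    if len(faces) == 0:
        return False                       # no face region found; treat as not live
    x, y, w, h = faces[0]

    reference = freq_response_images[reference_index].astype(np.float64)
    in_range = 0
    range_index = 0
    for i, image in enumerate(freq_response_images):
        if i == reference_index:
            continue
        ratio = image.astype(np.float64) / (reference + eps)
        face_patch = ratio[y:y + h, x:x + w]
        # Average intensity = total intensity of the face region / area of the region.
        average = face_patch.sum() / float(face_patch.size)
        low, high = preset_ranges[range_index]
        if low <= average <= high:
            in_range += 1
        range_index += 1
    return in_range >= predetermined_number
```

Setting predetermined_number equal to the number of intensity ratio images reproduces the strictest variant described above, in which every ratio image must fall within its preset range.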
In one embodiment, the predetermined number is equal to the total number of intensity ratio images among the one or more intensity ratio images.
In one embodiment, the preset range is obtained by training on real human faces.
In one embodiment, the structured light includes infrared light.
In one embodiment, each group of the at least two groups of object images includes at least two object images acquired of the object while the object is illuminated by structured light having the same spatial frequency and different phases.
In one embodiment, when run by a computer, the computer program instructions perform the step of calculating the at least two frequency response intensity images corresponding to the at least two groups of object images by: for each group of object images, calculating the frequency response intensity image corresponding to that group according to the relationship between the pixels at corresponding positions in the images of the group.
Each module of the living body detection system according to the embodiments of the present invention may be implemented by a processor of an electronic device for living body detection according to the embodiments of the present invention running computer program instructions stored in a memory, or may be implemented when computer instructions stored in a computer-readable storage medium of a computer program product according to the embodiments of the present invention are run by a computer.
With the living body detection method and device, the living body detection system and the storage medium according to the embodiments of the present invention, whether an object is a living body is determined from the object's frequency response to structured light of multiple spatial frequencies; in this way, highly secure living body detection can be achieved without requiring the cooperation of the detected object.
Although example embodiments have been described herein with reference to the accompanying drawings, it should be understood that the above example embodiments are merely exemplary and are not intended to limit the scope of the present invention thereto. Those of ordinary skill in the art may make various changes and modifications therein without departing from the scope and spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as claimed in the appended claims.
Those of ordinary skill in the art may appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementation should not be considered to be beyond the scope of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed device and method may be implemented in other ways. For example, the device embodiments described above are merely illustrative; for instance, the division of the units is only a logical function division, and there may be other division manners in actual implementation; for instance, multiple units or components may be combined or integrated into another device, or some features may be ignored or not executed.
In the specification provided herein, numerous specific details are set forth. It should be understood, however, that the embodiments of the present invention can be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail so as not to obscure the understanding of this specification.
Similarly, it should be understood that, in order to streamline the present disclosure and aid in the understanding of one or more of the various inventive aspects, in the description of the exemplary embodiments of the present invention, the features of the present invention are sometimes grouped together into a single embodiment, figure, or description thereof. However, the disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the corresponding claims reflect, the inventive aspects lie in less than all features of a single disclosed embodiment, so that the corresponding technical problem can be solved with fewer features. Thus, the claims following the detailed description are hereby expressly incorporated into the detailed description, with each claim standing on its own as a separate embodiment of the present invention.
Those skilled in the art will understand that, except where such features are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract and drawings) and all processes or units of any method or device so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract and drawings) may be replaced by an alternative feature serving the same, equivalent or similar purpose.
In addition, those skilled in the art will understand that, although some embodiments described herein include certain features that are included in other embodiments but not other features, combinations of features of different embodiments are meant to be within the scope of the present invention and to form different embodiments. For example, in the claims, any one of the claimed embodiments may be used in any combination.
The various component embodiments of the present invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will understand that a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some modules of the article analysis device according to the embodiments of the present invention. The present invention may also be implemented as a program of a device (for example, a computer program and a computer program product) for performing part or all of the method described herein. Such a program implementing the present invention may be stored on a computer-readable medium, or may take the form of one or more signals. Such a signal may be downloaded from an Internet website, or provided on a carrier signal, or provided in any other form.
It should be noted that the above embodiments illustrate rather than limit the present invention, and that those skilled in the art can design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The present invention can be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several devices, several of these devices may be embodied by one and the same item of hardware. The use of the words first, second, third and the like does not indicate any order; these words may be interpreted as names.
The above is merely the specific embodiments of the present invention or descriptions of the specific embodiments, and the protection scope of the present invention is not limited thereto. Any change or substitution that a person skilled in the art can readily conceive of within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (20)

1. A living body detection method, comprising:
receiving at least two groups of object images, the at least two groups of object images being acquired of an object while the object is respectively illuminated by at least two structured lights having different spatial frequencies;
calculating at least two frequency response intensity images corresponding to the at least two groups of object images; and
determining whether the object is a living body based on the at least two frequency response intensity images;
wherein determining whether the object is a living body based on the at least two frequency response intensity images comprises:
obtaining one or more intensity ratio images based on an intensity relationship between the at least two frequency response intensity images;
performing face detection on one of the at least two frequency response intensity images to determine a face region;
for each of the one or more intensity ratio images, calculating an average intensity value of pixels in a region of the intensity ratio image corresponding to the face region; and
judging whether there is, among the one or more intensity ratio images, an intensity ratio image whose average intensity value is within a preset range; if so, determining that the object is a living body, and if not, determining that the object is not a living body.
2. The living body detection method according to claim 1, wherein obtaining one or more intensity ratio images based on the intensity relationship between the at least two frequency response intensity images comprises:
selecting a specific frequency response intensity image from the at least two frequency response intensity images; and
for each of the remaining frequency response intensity images among the at least two frequency response intensity images, calculating ratios between intensity values of pixels of the remaining frequency response intensity image and intensity values of corresponding pixels in the specific frequency response intensity image, and obtaining the one or more intensity ratio images according to the calculated intensity value ratios.
3. The living body detection method according to claim 1, wherein calculating, for each of the one or more intensity ratio images, the average intensity value of the pixels in the region of the intensity ratio image corresponding to the face region comprises:
calculating a total intensity value of the pixels in the region of the intensity ratio image corresponding to the face region; and
calculating a ratio between the total intensity value and an area of the region of the intensity ratio image corresponding to the face region, to obtain the average intensity value.
4. The living body detection method according to claim 1, wherein judging whether there is, among the one or more intensity ratio images, an intensity ratio image whose average intensity value is within the preset range comprises:
judging whether there are, among the one or more intensity ratio images, a predetermined number of intensity ratio images whose average intensity values are within corresponding preset ranges.
5. The living body detection method according to claim 4, wherein the predetermined number is equal to a total number of all intensity ratio images among the one or more intensity ratio images.
6. The living body detection method according to claim 1, wherein the preset range is obtained by training on real human faces.
7. The living body detection method according to claim 1, wherein the structured light comprises infrared light.
8. The living body detection method according to claim 1, wherein each group of the at least two groups of object images comprises at least two object images acquired of the object while the object is illuminated by structured light having the same spatial frequency and different phases.
9. The living body detection method according to claim 1, wherein before receiving the at least two groups of object images, the living body detection method further comprises:
illuminating the object with the at least two structured lights having different spatial frequencies; and
acquiring object images of the object each time the object is illuminated, to obtain the at least two groups of object images.
10. The living body detection method according to claim 1, wherein calculating the at least two frequency response intensity images corresponding to the at least two groups of object images comprises: for each group of object images, calculating the frequency response intensity image corresponding to the group of object images according to a relationship between pixels at corresponding positions in the images of the group.
11. A living body detection device, comprising:
a receiving module configured to receive at least two groups of object images, the at least two groups of object images being acquired of an object while the object is respectively illuminated by at least two structured lights having different spatial frequencies;
a calculating module configured to calculate at least two frequency response intensity images corresponding to the at least two groups of object images; and
a living body determining module configured to determine whether the object is a living body based on the at least two frequency response intensity images;
wherein the living body determining module comprises:
an intensity ratio image obtaining submodule configured to obtain one or more intensity ratio images based on an intensity relationship between the at least two frequency response intensity images;
a face detection submodule configured to perform face detection on one of the at least two frequency response intensity images to determine a face region;
an average intensity calculating submodule configured to calculate, for each of the one or more intensity ratio images, an average intensity value of pixels in a region of the intensity ratio image corresponding to the face region; and
a judging submodule configured to judge whether there is, among the one or more intensity ratio images, an intensity ratio image whose average intensity value is within a preset range, and to determine that the object is a living body if there is, and that the object is not a living body if there is not.
12. The living body detection device according to claim 11, wherein the intensity ratio image obtaining submodule comprises:
a selecting unit configured to select a specific frequency response intensity image from the at least two frequency response intensity images; and
an intensity ratio calculating unit configured to, for each of the remaining frequency response intensity images among the at least two frequency response intensity images, calculate ratios between intensity values of pixels of the remaining frequency response intensity image and intensity values of corresponding pixels in the specific frequency response intensity image, and obtain the one or more intensity ratio images according to the calculated intensity value ratios.
13. The living body detection device according to claim 11, wherein the average intensity calculating submodule comprises:
a first calculating unit configured to calculate a total intensity value of the pixels in the region of the intensity ratio image corresponding to the face region; and
a second calculating unit configured to calculate a ratio between the total intensity value and an area of the region of the intensity ratio image corresponding to the face region, to obtain the average intensity value.
14. The living body detection device according to claim 11, wherein the judging submodule is specifically configured to judge whether there are, among the one or more intensity ratio images, a predetermined number of intensity ratio images whose average intensity values are within corresponding preset ranges.
15. The living body detection device according to claim 14, wherein the predetermined number is equal to a total number of all intensity ratio images among the one or more intensity ratio images.
16. The living body detection device according to claim 11, wherein the preset range is obtained by training on real human faces.
17. The living body detection device according to claim 11, wherein the structured light comprises infrared light.
18. The living body detection device according to claim 11, wherein each group of the at least two groups of object images comprises at least two object images acquired of the object while the object is illuminated by structured light having the same spatial frequency and different phases.
19. The living body detection device according to claim 11, further comprising:
a light emitting module configured to illuminate the object with the at least two structured lights having different spatial frequencies; and
an image acquisition module configured to acquire object images of the object each time the object is illuminated, to obtain the at least two groups of object images.
20. The living body detection device according to claim 11, wherein the calculating module is specifically configured to, for each group of object images, calculate the frequency response intensity image corresponding to the group of object images according to a relationship between pixels at corresponding positions in the images of the group.
CN201511030874.2A 2015-12-31 2015-12-31 Biopsy method and device Active CN105447483B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201511030874.2A CN105447483B (en) 2015-12-31 2015-12-31 Biopsy method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201511030874.2A CN105447483B (en) 2015-12-31 2015-12-31 Biopsy method and device

Publications (2)

Publication Number Publication Date
CN105447483A CN105447483A (en) 2016-03-30
CN105447483B true CN105447483B (en) 2019-03-22

Family

ID=55557643

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201511030874.2A Active CN105447483B (en) 2015-12-31 2015-12-31 Biopsy method and device

Country Status (1)

Country Link
CN (1) CN105447483B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107451510B (en) 2016-05-30 2023-07-21 北京旷视科技有限公司 Living body detection method and living body detection system
CN108881674B (en) * 2017-06-05 2020-09-18 北京旷视科技有限公司 Image acquisition device and image processing method
CN108875508B (en) * 2017-11-23 2021-06-29 北京旷视科技有限公司 Living body detection algorithm updating method, device, client, server and system
CN108875519B (en) * 2017-12-19 2023-05-26 北京旷视科技有限公司 Object detection method, device and system and storage medium
CN108509888B (en) * 2018-03-27 2022-01-28 百度在线网络技术(北京)有限公司 Method and apparatus for generating information
CN110598571A (en) * 2019-08-15 2019-12-20 中国平安人寿保险股份有限公司 Living body detection method, living body detection device and computer-readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1924892A (en) * 2006-09-21 2007-03-07 杭州电子科技大学 Method and device for vivi-detection in iris recognition
CN102622588A (en) * 2012-03-08 2012-08-01 无锡数字奥森科技有限公司 Dual-certification face anti-counterfeit method and device
CN104881632A (en) * 2015-04-28 2015-09-02 南京邮电大学 Hyperspectral face recognition method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9412007B2 (en) * 2003-08-05 2016-08-09 Fotonation Limited Partial face detector red-eye filter method and apparatus

Also Published As

Publication number Publication date
CN105447483A (en) 2016-03-30

Similar Documents

Publication Publication Date Title
CN105447483B (en) Biopsy method and device
CN105912986B (en) A kind of biopsy method and system
CN106203305B (en) Face living body detection method and device
CN102973242B (en) Image processing equipment, image processing method, image processing system, program and recording medium
US10599933B2 (en) Biometric image capturing apparatus and biometric image capturing method
CN104272731B (en) Apparatus and method for processing 3d information
CN106033601B (en) The method and apparatus for detecting abnormal case
CN109076148A (en) Everyday scenes reconstruction engine
CN109766876B (en) Non-contact fingerprint acquisition device and method
CN106407914A (en) Method for detecting human faces, device and remote teller machine system
CN108875546A (en) Face auth method, system and storage medium
CN108140255B (en) The method and system of reflecting surface in scene for identification
CN108876804A (en) It scratches as model training and image are scratched as methods, devices and systems and storage medium
CN106524909B (en) Three-dimensional image acquisition method and device
CN108875535A (en) image detecting method, device and system and storage medium
CN109190484A (en) Image processing method, device and image processing equipment
CN108509888B (en) Method and apparatus for generating information
US9846813B2 (en) Image pickup device
CN109508583A (en) A kind of acquisition methods and device of distribution trend
CN109241888A (en) Neural metwork training and object identifying method, device and system and storage medium
CN105491307B (en) Depth sensing system
JP2021047188A (en) Batch authentication of materials for automated anti-counterfeiting
CN110428394A (en) Method, apparatus and computer storage medium for target mobile detection
CN108875500A (en) Pedestrian recognition methods, device, system and storage medium again
CN110490058A (en) Training method, device, system and the computer-readable medium of pedestrian detection model

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 100190 Beijing, Haidian District Academy of Sciences, South Road, No. 2, block A, No. 313

Applicant after: MEGVII INC.

Applicant after: Beijing maigewei Technology Co., Ltd.

Address before: 100190 Beijing, Haidian District Academy of Sciences, South Road, No. 2, block A, No. 313

Applicant before: MEGVII INC.

Applicant before: Beijing aperture Science and Technology Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20180921

Address after: 221007 Fuxing North Road, Gulou District, Xuzhou, Jiangsu 219

Applicant after: Xuzhou Kuang Shi Data Technology Co., Ltd.

Applicant after: MEGVII INC.

Applicant after: Beijing maigewei Technology Co., Ltd.

Address before: 100190 A block 2, South Road, Haidian District Academy of Sciences, Beijing 313

Applicant before: MEGVII INC.

Applicant before: Beijing maigewei Technology Co., Ltd.

GR01 Patent grant