CN108881674B - Image acquisition device and image processing method - Google Patents


Info

Publication number
CN108881674B
Authority
CN
China
Prior art keywords
image
living body
light source
lights
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710412898.7A
Other languages
Chinese (zh)
Other versions
CN108881674A (en)
Inventor
范浩强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Kuangshi Technology Co Ltd
Beijing Megvii Technology Co Ltd
Original Assignee
Beijing Kuangshi Technology Co Ltd
Beijing Megvii Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Kuangshi Technology Co Ltd, Beijing Megvii Technology Co Ltd filed Critical Beijing Kuangshi Technology Co Ltd
Priority to CN201710412898.7A priority Critical patent/CN108881674B/en
Publication of CN108881674A publication Critical patent/CN108881674A/en
Application granted granted Critical
Publication of CN108881674B publication Critical patent/CN108881674B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene

Abstract

Embodiments of the invention provide an image acquisition device and an image processing method. The image acquisition device comprises a light source, a controller, an image sensor, and a signal processor. The light source simultaneously emits a plurality of lights with different wavelengths toward an image acquisition area of the image sensor. The controller turns on the light source and controls it to emit the plurality of lights with two or more sets of brightness values in two or more respective periods, where each set of brightness values comprises a plurality of brightness values in one-to-one correspondence with the plurality of lights. The image sensor acquires images of the image acquisition area. The signal processor processes the two or more living body mode images respectively acquired by the image sensor under illumination by the plurality of lights with the two or more sets of brightness values, to obtain living body information indicating whether a living body is present in the image acquisition area. The device and the method avoid the time cost of the existing interleaved light emission and imaging, and therefore operate quickly.

Description

Image acquisition device and image processing method
Technical Field
The invention relates to the field of artificial intelligence, in particular to an image acquisition device and an image processing method in the field of face recognition.
Background
Living body recognition refers to techniques for distinguishing a real human face from non-face objects (prostheses) such as a photograph, a screen, or a mask. The currently common living body recognition method is multispectral: it distinguishes real faces from non-faces by measuring the reflectivity of the object under lights with different wavelengths, and offers high security. However, existing multispectral living body recognition systems must image separately under alternating illumination by lights of different wavelengths. For example, given three light sources A, B, and C with different wavelengths, image acquisition is interleaved as follows: turn on light source A and turn off B and C to obtain image 1; turn on B and turn off A and C to obtain image 2; turn on C and turn off A and B to obtain image 3; then process images 1 to 3 to determine whether the object undergoing living body recognition is a living body. Because of this interleaved illumination and imaging, existing living body recognition systems either have complex and expensive hardware or are slow.
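For illustration only, the interleaved scheme just described can be sketched as follows; `set_light` and `grab_frame` are hypothetical stand-ins for the hardware interfaces, not part of the patent:

```python
def capture_interleaved(set_light, grab_frame, wavelengths=("A", "B", "C")):
    """Interleaved multispectral capture: one full exposure per wavelength.

    `set_light` and `grab_frame` are hypothetical hardware callbacks; this
    sketch only illustrates the sequencing cost of the existing scheme.
    """
    images = []
    for w in wavelengths:
        # Turn on only the current light source; keep all others off.
        set_light({name: (name == w) for name in wavelengths})
        images.append(grab_frame())  # one exposure per wavelength
    return images  # N wavelengths -> N sequential exposures
```

With three wavelengths this always costs three exposures per decision, which is the time overhead the invention removes by emitting all lights at once.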
Disclosure of Invention
The present invention has been made in view of the above problems. The invention provides an image acquisition device and an image processing method.
According to an aspect of the present invention, there is provided an image acquisition apparatus. The apparatus comprises a light source, a controller, an image sensor, and a signal processor, wherein the light source is configured to simultaneously emit a plurality of lights with different wavelengths toward an image acquisition area of the image sensor; the controller is configured to turn on the light source and control the light source to emit the plurality of lights with two or more sets of brightness values in two or more respective periods, wherein each set of brightness values includes a plurality of brightness values in one-to-one correspondence with the plurality of lights; the image sensor is configured to acquire images of the image acquisition area; and the signal processor is configured to process two or more living body mode images respectively acquired by the image sensor under illumination by the plurality of lights with the two or more sets of brightness values, to obtain living body information indicating whether a living body is present in the image acquisition area.
Illustratively, the controller is further for turning off the light source; the signal processor is further configured to superimpose a non-living body mode image acquired by the image sensor when the light source is turned off with the living body information to obtain a superimposed image.
Illustratively, the image acquisition device further comprises an output device for outputting the overlay image.
Illustratively, the light source is a near-infrared light source.
Illustratively, the number of image sensors is one, the image acquisition apparatus further includes a natural light cut filter disposed in front of the image sensor, and the controller is further configured to switch the natural light cut filter to an enabled state when the light source is turned on and to a disabled state when the light source is turned off.
Illustratively, the number of the image sensors is two, for receiving natural light and infrared light, respectively.
Illustratively, the controller is further configured to control the light source such that a duration of the plurality of lights having each set of luminance values is less than or equal to a time difference between two adjacent exposure start times of the image sensor.
Illustratively, the controller is further configured to control the light source to emit the plurality of lights with a current set of brightness values upon receiving an exposure signal transmitted by the image sensor regarding a start of a current exposure.
Illustratively, the controller is further configured to transmit a control signal regarding a start of a current exposure to the image sensor to control the image sensor to start exposure when the light source is controlled to emit the plurality of lights having a current set of brightness values.
Illustratively, the controller is further configured to transmit a pulse width modulation signal to the light source to control the light source to emit the plurality of lights having brightness values indicated by the pulse width modulation signal.
Illustratively, the signal processor processes the two or more live mode images by:
for each of the remaining living body mode images other than the first living body mode image of the two or more living body mode images, calculating a sub-living body rate of the living body mode image according to the following formula:
M_n(x, y) = exp(-a_n * (L_n(x, y) / L_1(x, y) - b_n)^2);
where n ≥ 2, M_n(x, y) is the sub-living body rate of the pixel at coordinates (x, y) in the n-th living body mode image, a_n is a first preset parameter corresponding to the n-th living body mode image, b_n is a second preset parameter corresponding to the n-th living body mode image, L_n(x, y) is the pixel value of the pixel at coordinates (x, y) in the n-th living body mode image, and L_1(x, y) is the pixel value of the pixel at coordinates (x, y) in the 1st living body mode image;
multiplying all the calculated sub-living body rates to obtain the living body information.
Illustratively, the controller is further configured to periodically turn on the light source and control the light source to emit the plurality of lights having two or more sets of luminance values for two or more periods, respectively.
Illustratively, the image acquisition device further comprises an output device, and the controller is further configured to turn off the light source; the signal processor is also used for encoding a non-living body mode image acquired by the image sensor when the light source is turned off to obtain an encoded image; the output device is for outputting the encoded image.
According to another aspect of the present invention, there is provided an image processing method, including: emitting, in two or more respective periods, a plurality of lights with different wavelengths having two or more sets of brightness values toward an image acquisition area, wherein the plurality of lights are emitted simultaneously in each period, and each set of brightness values includes a plurality of brightness values in one-to-one correspondence with the plurality of lights; acquiring images of the image acquisition area; and processing two or more living body mode images respectively acquired under illumination by the plurality of lights with the two or more sets of brightness values, to obtain living body information indicating whether a living body is present in the image acquisition area.
Illustratively, the image processing method further includes: stopping emitting the plurality of lights to the image acquisition area; and superimposing the non-living body mode image acquired when the plurality of kinds of light are not emitted with living body information to obtain a superimposed image.
Illustratively, the image processing method further includes: and outputting the superposed image.
Illustratively, processing two or more live mode images respectively acquired under illumination by a plurality of lights having two or more sets of luminance values includes:
for each of the remaining living body mode images other than the first living body mode image of the two or more living body mode images, a sub-living body rate of the living body mode image is calculated according to the following formula:
M_n(x, y) = exp(-a_n * (L_n(x, y) / L_1(x, y) - b_n)^2);
where n ≥ 2, M_n(x, y) is the sub-living body rate of the pixel at coordinates (x, y) in the n-th living body mode image, a_n is a first preset parameter corresponding to the n-th living body mode image, b_n is a second preset parameter corresponding to the n-th living body mode image, L_n(x, y) is the pixel value of the pixel at coordinates (x, y) in the n-th living body mode image, and L_1(x, y) is the pixel value of the pixel at coordinates (x, y) in the 1st living body mode image;
all the calculated sub living body rates are multiplied to obtain living body information.
According to the image acquisition device and the image processing method provided by the embodiment of the invention, the light source emits multiple lights with different wavelengths at the same time, and the image sensor images under the simultaneous irradiation of the multiple lights, so that the image acquisition device and the image processing method provided by the embodiment of the invention can avoid time consumption caused by the existing staggered light emission and imaging, and have the advantage of high speed. In addition, the hardware structure of the image acquisition device is simple, and the cost is low.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in more detail embodiments of the present invention with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings, like reference numbers generally represent like parts or steps.
FIG. 1 shows a schematic block diagram of an image acquisition apparatus according to an embodiment of the present invention; and
fig. 2 shows a schematic flow diagram of an image processing method according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, exemplary embodiments according to the present invention will be described in detail below with reference to the accompanying drawings. It is to be understood that the described embodiments are merely a subset of embodiments of the invention and not all embodiments of the invention, with the understanding that the invention is not limited to the example embodiments described herein. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the invention described herein without inventive step, shall fall within the scope of protection of the invention.
In order to solve the above-mentioned problems, embodiments of the present invention provide an image acquisition device and an image processing method that emit multiple lights with different wavelengths at the same time and image under the simultaneous illumination of those lights, thereby avoiding the hardware complexity and expense, or the slowness, caused by the existing interleaved light emission and imaging.
Next, an image pickup apparatus according to an embodiment of the present invention will be described with reference to fig. 1. Fig. 1 shows a schematic block diagram of an image acquisition apparatus 100 according to an embodiment of the present invention. As shown in fig. 1, the image capturing apparatus 100 includes a light source 110, a controller 120, an image sensor 130, and a signal processor 140.
The light source 110 is used to simultaneously emit a plurality of lights having different wavelengths toward the image pickup area of the image sensor 130.
Illustratively, the light source 110 may be a near-infrared light source, that is, the plurality of lights with different wavelengths may be near-infrared lights. For example, the light source 110 may emit one or more lights with wavelengths of 750 nm, 850 nm, 940 nm, and 1020 nm. Near-infrared light has the following advantages: it causes little irritation to human eyes and is inexpensive. In one example, the light source 110 may include a plurality of laser light sources or a plurality of light-emitting diodes (LEDs). In another example, the light source 110 may be a multi-wavelength light source.
The controller 120 is configured to turn on the light source 110 and control the light source 110 to emit the plurality of lights having two or more sets of brightness values respectively in two or more periods (i.e., the number of sets of brightness values is consistent with the number of periods, the plurality of lights having two sets of brightness values are emitted respectively in two periods, the plurality of lights having three sets of brightness values are emitted respectively in three periods, and so on), wherein each set of brightness values includes a plurality of brightness values corresponding to the plurality of lights one to one.
As shown in fig. 1, the controller 120 is connected to the light source 110, and the connection may be a direct connection or an indirect connection. The controller 120 may transmit a control signal to the light source 110 to control the turning on and off of the light source 110, and may control the brightness of each light emitted by the light source 110 by the control signal.
The light source 110 may emit a plurality of lights having a set of brightness values during the same period of time when turned on. It will be appreciated that if the light source emits three lights, each set of brightness values comprises three brightness values, corresponding to the three lights respectively. The brightness value of each light emitted by the light source 110 at each period may be predefined, and the brightness value of each light emitted by the light source 110 at each period may be controlled by the controller 120. Illustratively, the controller 120 may control the brightness value of each light by a Pulse Width Modulation (PWM) method. The controller 120 may be further configured to transmit the PWM signal to the light source 110 to control the light source 110 to emit a plurality of lights having brightness values indicated by the PWM signal. Alternatively, the same PWM signal may be used to control the brightness values of a plurality of lights (such that the brightness value of each light at the same time is the same), or a plurality of PWM signals may be used to control the brightness values of a plurality of lights in a one-to-one correspondence. Of course, there may be other suitable implementation manners for controlling the brightness values of the plurality of lights by using the PWM signal, which is not described herein.
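As an illustrative sketch (not part of the claimed apparatus), mapping one set of per-light brightness values to PWM duty cycles might look as follows; the 0-255 brightness scale and the wavelength names are assumptions:

```python
def pwm_duty_cycles(brightness_values, max_brightness=255):
    """Map per-wavelength brightness values to PWM duty cycles.

    Each duty cycle is the fraction of the PWM period during which the
    corresponding LED is driven; time-averaged brightness scales roughly
    linearly with duty cycle. The 0-255 scale is an illustrative assumption.
    """
    duties = {}
    for name, value in brightness_values.items():
        if not 0 <= value <= max_brightness:
            raise ValueError(f"brightness {value} out of range for {name}")
        duties[name] = value / max_brightness  # duty cycle in [0, 1]
    return duties
```

A single shared PWM signal would correspond to every entry in the set having the same value, while per-light PWM signals allow the one-to-one brightness values the controller 120 uses here.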
The image sensor 130 is used to acquire an image for an image acquisition area.
Illustratively, in case the light source 110 is a near infrared light source, the image sensor 130 may be a Complementary Metal Oxide Semiconductor (CMOS)/Charge Coupled Device (CCD) image array having a near infrared light receiving capability.
In one example, the number of image sensors 130 may be one, and the image acquisition apparatus 100 may further include a natural light cut filter disposed in front of the image sensor 130. When enabled, the natural light cut filter blocks natural light and transmits infrared light, so that only infrared light reaches the image sensor 130. Optionally, the enabling and disabling of the natural light cut filter may be controlled by the controller 120. For example, the controller 120 may also be configured to switch the natural light cut filter to the enabled state when the light source 110 is turned on, and to the disabled state when the light source 110 is turned off; that is, the light source 110 and the natural light cut filter may be switched synchronously. When the image acquisition apparatus 100 operates in the non-living body mode, the controller 120 controls the light source 110 to be turned off; at this time the natural light cut filter may be disabled, and the image sensor 130 may directly acquire a non-living body mode image under natural light. When the image acquisition apparatus 100 operates in the living body mode, the controller 120 controls the light source 110 to be turned on; at this time the natural light cut filter may be enabled to filter out natural light, and the image sensor 130 acquires living body mode images under near-infrared illumination.
In another example, the number of the image sensors 130 may be two, for receiving natural light and infrared light, respectively.
The signal processor 140 is configured to process two or more living body mode images respectively acquired by the image sensor 130 under irradiation of a plurality of lights having two or more sets of brightness values to obtain living body information indicating whether a living body is present in the image acquisition region.
Illustratively, the image acquisition device 100 may operate in a living body mode and a non-living body mode. When the image capturing device 100 operates in the living body mode, the light source 110 is turned on, and the image sensor 130 can capture images of the living body mode under various light irradiations. When the image capturing device 100 operates in the non-living body mode, the light source 110 is turned off, and the image sensor 130 captures a non-living body mode image under natural light, and the image sensor 130 functions as a conventional image sensor. Alternatively, if the light source 110 is a near-infrared light source and the image pickup device 100 includes the above-described natural light cut filter, the natural light cut filter may be switched to an activated state when the image pickup device 100 operates in the living body mode.
As shown in fig. 1, the signal processor 140 is connected to the image sensor 130, and the connection may be a direct connection or an indirect connection. The signal processor 140 may receive the image captured by the image sensor 130. The signal processor 140 is mainly used for processing the living body mode image acquired by the image sensor 130 to obtain the required living body information. By way of example and not limitation, the living body information may be a living body rate. The way of calculating the living body rate and its meaning will be described below, and will not be described herein.
Illustratively, the signal processor 140 may be, for example, an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), an embedded processor (ARM), a Digital Signal Processor (DSP), an Image Signal Processor (ISP), or the like.
It is to be understood that although the controller 120 is shown in fig. 1 as being connected to the signal processor 140, this is not a limitation of the present invention. The controller 120 may communicate with the signal processor 140 to communicate certain information. Alternatively, the controller 120 and the signal processor 140 may also each operate independently.
In the image acquisition apparatus 100 according to an embodiment of the present invention, the light source 110 simultaneously emits a plurality of lights with different wavelengths, and the image sensor 130 can acquire a living body mode image under the simultaneous illumination of those lights, so that the reflections of the plurality of lights by the object undergoing living body recognition are captured in a single living body mode image. The signal processor 140 can then perform image processing to determine from the living body mode images whether the object is a living body (i.e., whether a living body is present in the image acquisition area). The image acquisition apparatus 100 according to the embodiment of the invention avoids the time cost of the existing interleaved light emission and imaging, and therefore operates quickly. Moreover, the hardware structure of the image acquisition apparatus 100 is simple and inexpensive.
In addition, the image capturing apparatus 100 may be regarded as a living human face camera, which may be integrated with an existing human face recognition system. The existing multispectral-based living body recognition system generally needs a special Software Development Kit (SDK), and the integration process of the existing multispectral-based living body recognition system and the existing face recognition system is complicated. The image capturing device 100 according to the embodiment of the present invention has a simple programming because the light emitting and imaging processes are simplified, and the image capturing device 100 can be conveniently integrated with an existing face recognition system.
According to an embodiment of the present invention, the signal processor 140 may process two or more living body mode images by:
for each of the remaining living body mode images other than the first living body mode image of the two or more living body mode images, a sub-living body rate of the living body mode image is calculated according to the following formula:
M_n(x, y) = exp(-a_n * (L_n(x, y) / L_1(x, y) - b_n)^2);
where n ≥ 2, M_n(x, y) is the sub-living body rate of the pixel at coordinates (x, y) in the n-th living body mode image, a_n is a first preset parameter corresponding to the n-th living body mode image, b_n is a second preset parameter corresponding to the n-th living body mode image, L_n(x, y) is the pixel value of the pixel at coordinates (x, y) in the n-th living body mode image, and L_1(x, y) is the pixel value of the pixel at coordinates (x, y) in the 1st living body mode image;
all the calculated sub living body rates are multiplied to obtain living body information.
In the living body mode, the controller 120 may set the brightness values of the various lights emitted by the light source 110 to a predefined first set of brightness values during a first period, and the image sensor 130 acquires one frame at this time, referred to as the "first living body mode image" (L_1). During a second period, the controller 120 may set the brightness values of the various lights emitted by the light source 110 to a predefined second set of brightness values, and the image sensor 130 acquires another frame, referred to as the "second living body mode image" (L_2). Optionally, n images may be acquired, with n ≥ 2.
For each pair of corresponding pixels in L_1 and L_2, the ratio S_2(x, y) is calculated:
S_2(x, y) = L_2(x, y) / L_1(x, y)
Then, the sub-living body rate M_2(x, y) of L_2 is calculated from this ratio:
M_2(x, y) = exp(-a_2 * (S_2(x, y) - b_2)^2)
where a_2 and b_2 are the first preset parameter and the second preset parameter corresponding to L_2.
If the controller 120 sets two sets of brightness values in total, two living body mode images L_1 and L_2 are obtained in total; in this case the calculated sub-living body rate M_2(x, y) is the final living body rate M(x, y), i.e. the required living body information. If the controller 120 sets more than two sets of brightness values, the ratio of each living body mode image L_n to the first living body mode image L_1 can be calculated, and the sub-living body rate M_n(x, y) of that image computed from the ratio. All the sub-living body rates may then be multiplied to obtain the final living body rate M(x, y), i.e. the desired living body information.
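A minimal NumPy sketch of this computation follows; the function name and the small epsilon guarding against division by zero are illustrative assumptions, while the formula itself is the one given above:

```python
import numpy as np

def liveness_rate(images, a, b, eps=1e-6):
    """Compute the per-pixel living body rate M(x, y) from n >= 2 living
    body mode images, following M_n = exp(-a_n * (L_n/L_1 - b_n)^2) and
    multiplying over n = 2..N. `a` and `b` hold a_n, b_n for n >= 2.
    """
    L1 = images[0].astype(np.float64)
    M = np.ones_like(L1)
    for Ln, an, bn in zip(images[1:], a, b):
        ratio = Ln.astype(np.float64) / (L1 + eps)  # S_n(x, y), guarded
        M *= np.exp(-an * (ratio - bn) ** 2)        # accumulate sub-rates
    return M
```

When only two brightness sets are used, the product degenerates to the single sub-living body rate M_2(x, y), matching the two-image case described above.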
For example, images of normal human faces and of prostheses such as photographs, screens, or masks may be acquired in advance under the plurality of lights with different wavelengths. The images of normal faces are positive sample images, and the images of prostheses are negative sample images. The above formula can be trained on the positive and negative sample images to determine suitable brightness distribution coefficients (i.e., the brightness value set for each light in each period) and parameters (i.e., the first preset parameters a_n and the second preset parameters b_n).
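The patent does not specify a fitting procedure; one simple possibility, sketched here under the assumption that per-pixel ratios have already been reduced to scalar means per sample, is a coarse grid search that maximizes the separation between live and prosthesis scores:

```python
import numpy as np

def fit_liveness_params(live_ratios, fake_ratios,
                        a_grid=np.linspace(0.5, 50, 100),
                        b_grid=np.linspace(0.5, 2.0, 100)):
    """Grid-search a_n, b_n so that exp(-a*(r-b)^2) is high for live-face
    ratios and low for prosthesis ratios. Inputs are 1-D arrays of mean
    L_n/L_1 ratios; the grids and the score are illustrative choices,
    not the patent's training method.
    """
    live = np.asarray(live_ratios, dtype=np.float64)
    fake = np.asarray(fake_ratios, dtype=np.float64)
    best = (None, None, -np.inf)
    for a in a_grid:
        for b in b_grid:
            live_score = np.exp(-a * (live - b) ** 2).mean()
            fake_score = np.exp(-a * (fake - b) ** 2).mean()
            score = live_score - fake_score  # separation between classes
            if score > best[2]:
                best = (a, b, score)
    return best[0], best[1]
```

In this toy setup, b lands near the typical live-face reflectance ratio and a controls how sharply deviations from it are penalized.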
Illustratively, the living body rate M (x, y) may also be calculated by means of a convolutional neural network. For example, all of the living body pattern images acquired by the image sensor 130 may be input to a trained convolutional neural network, which may output the desired living body rate M (x, y). The convolutional neural network can also be trained using positive sample images acquired for normal faces and negative sample images acquired for prostheses such as photographs, screens or masks. The training mode and the application mode of the convolutional neural network can be understood by those skilled in the art, and are not described in detail herein.
The form of the living body information (expressed here as a "living body rate") and the manner of calculating it are merely examples and not limiting; the image acquisition apparatus 100 according to embodiments of the present invention may express and determine the living body information in any other suitable manner. For example, the living body information may also be indication information such as "yes"/"no", "1"/"0", or "living body check passed"/"living body check failed". Assuming the living body information is "1"/"0", "1" may represent that a living body is present in the image acquisition area, and "0" that no living body is present.
According to an embodiment of the present invention, the controller 120 may be further configured to turn off the light source 110, and the signal processor 140 may be further configured to superimpose the non-living body mode image acquired by the image sensor 130 when the light source 110 is turned off with the living body information to obtain a superimposed image.
As described above, when the image capturing apparatus 100 operates in the non-living body mode, the light source 110 is turned off, and the image sensor 130 captures a non-living body mode image under natural light. The living body information is fused on the non-living body mode image acquired under natural light, so that the information of whether the living body exists in the image acquisition area and the position of the living body can be more clearly reflected in the non-living body mode image. Moreover, such fusion facilitates integration with existing face recognition systems, i.e., the image acquisition apparatus 100 may be seamlessly integrated with existing face recognition systems as a replacement for common image acquisition systems (e.g., cameras).
For example, the living body rate M (x, y) may approximately represent the confidence that the pixel with the coordinate (x, y) in the image (including the living body mode image and the non-living body mode image) acquired by the image sensor 130 belongs to the living body. For example, for a location in the image where a living body is present, the corresponding value of M (x, y) may be close to 1, and for a location in the image where a living body is not present, the corresponding value of M (x, y) may be close to 0. If the living body rate M (x, y) is directly expressed in the form of an image, the resulting image is approximately a black-and-white image, a white portion indicates the presence of a living body, and a black portion indicates the absence of a living body. For example, the pixel value of the corresponding pixel of the non-living body mode image may be multiplied by M (x, y), and the obtained result is the superimposed image. It is understood that in the superimposed image, the pixel value of the position where the living body is present in the non-living body mode image is substantially constant, and the pixel values of the remaining positions are close to 0. That is, in the superimposed image, the living body in the non-living body mode image can be left, and the remaining positions appear black. It should be understood that the above-mentioned manner of taking the value of M (x, y) and the manner of superimposing the non-living mode image are only examples and are not limitations of the present invention.
Illustratively, the number of non-living body mode images may be one or more. In the case where the image sensor 130 acquires a plurality of non-living body mode images, each non-living body mode image may be multiplied by the living body rate M (x, y), respectively, to obtain a plurality of superimposed images.
Illustratively, the non-living body mode image may be acquired immediately before or after the living body mode images, with the living body mode images acquired continuously, so that image acquisition conditions such as the object undergoing living body recognition and the surrounding environment change little; the image processing result is thereby more accurate. Of course, it is also possible to acquire non-living body mode images intermittently during the acquisition of living body mode images.
According to an embodiment of the present invention, the image capturing apparatus 100 may further include an output device for outputting the superimposed image. Illustratively, the output device may be an output interface for outputting the superimposed image to an external device through a wired or wireless network. The output device may also be any suitable display device, such as a cathode ray tube (CRT) display, a plasma display panel (PDP), or a liquid crystal display (LCD). Outputting the superimposed image allows a user to conveniently view the living body recognition result, seeing at a glance whether a living body exists, where it is located, what it looks like, and so on.
According to an embodiment of the present invention, the controller 120 may be further configured to control the light source 110 to emit a plurality of lights having a current set of brightness values upon receiving an exposure signal transmitted by the image sensor 130 regarding the start of the current exposure.
For example, rather than the controller 120 triggering exposure of the image sensor 130, the controller 120 may receive an exposure signal from the image sensor 130 and thereby learn that the image sensor 130 has started exposing. Upon receiving the exposure signal, the controller 120 may transmit a control signal to the light source 110 to control the light source 110 to emit the plurality of lights with the luminance values indicated by the control signal. When the controller 120 receives the next exposure signal transmitted by the image sensor 130, the controller 120 may control the light source 110 to change the brightness value of each light it emits.
According to an embodiment of the present invention, the controller 120 may be further configured to transmit a control signal regarding the start of the current exposure to the image sensor 130 to control the image sensor 130 to start the exposure when the light source 110 is controlled to emit a plurality of lights having the current set of brightness values.
Illustratively, the controller 120 may control the switching of the light source 110 and the exposure of the image sensor 130 at the same time. For example, the controller 120 may simultaneously send respective control signals to the light source 110 and the image sensor 130, so that the image sensor 130 starts exposure when the light source 110 starts emitting light. It will be appreciated that it is easier to coordinate the operation of the light source 110 and the image sensor 130 in this manner.
According to an embodiment of the present invention, the controller 120 may be further configured to control the light source 110 such that the durations of the plurality of lights having each set of luminance values are less than or equal to a time difference between two adjacent exposure start times of the image sensor 130.
The controller 120 may control the turning on and off of the light source 110, that is, it may control the duration of the various lights emitted by the light source 110 having each set of brightness values. For example, the controller 120 may control the light source 110 to be turned on when the image sensor 130 starts the current exposure, and the controller 120 may control the light source 110 to be turned off before or when the image sensor 130 starts the next exposure. It will be appreciated by those skilled in the art that the image sensor will generally start the next exposure after a certain period of time has elapsed after the current exposure is finished, so that turning off the light source 110 before the next exposure does not affect the image capturing effect, and it is possible to save energy.
Illustratively, the controller 120 may be further configured to periodically turn on the light source 110 and control the light source 110 to emit a plurality of lights having two or more sets of brightness values for two or more periods, respectively.
A plurality of lights having two or more sets of luminance values may be emitted periodically. According to an example, during the first to third seconds the controller 120 controls the light source 110 to emit the plurality of lights with three sets of brightness values in turn, each set lasting one second, and during the fourth and fifth seconds the controller 120 turns the light source 110 off. During the sixth to eighth seconds the controller 120 again controls the light source 110 to emit the plurality of lights with the same three sets of luminance values, each lasting one second (the set used in the sixth second coincides with that of the first second, the seventh with that of the second second, and the eighth with that of the third second), and during the ninth and tenth seconds the controller 120 turns the light source 110 off; and so on. Since the environment and the objects in the image acquisition region may vary in real time, the living body information may be updated periodically to realize real-time living body detection.
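The periodic timing in this example can be sketched as a simple generator (an illustrative sketch only; the brightness tuples and the one-second granularity are assumptions matching the example above, not part of the disclosure):

```python
import itertools

def brightness_schedule(brightness_sets, on_seconds=1, off_seconds=2):
    # Yield (second, brightness_set_or_None) pairs: each set of brightness
    # values is held for on_seconds, then the light source is off (None)
    # for off_seconds, and the whole pattern repeats indefinitely.
    t = 0
    while True:
        for values in brightness_sets:
            for _ in range(on_seconds):
                yield t, values
                t += 1
        for _ in range(off_seconds):
            yield t, None  # light source turned off
            t += 1

# Three sets of brightness values, one second each, then two seconds off:
sets = [(10, 20), (30, 40), (50, 60)]
timeline = list(itertools.islice(brightness_schedule(sets), 10))
```

Seconds 0-2 carry the three sets in turn, seconds 3-4 are off, and seconds 5-9 repeat the same pattern, mirroring the example in the text.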
Illustratively, the image capturing apparatus 100 may further include an output device, and the controller 120 may be further configured to turn off the light source 110; the signal processor 140 may also be used to encode a non-living mode image acquired by the image sensor when the light source is off to obtain an encoded image; the output device may be for outputting the encoded image.
As described above, the image capturing apparatus 100 may operate in the non-living body mode. In this mode the light source 110 is turned off and the image capturing apparatus 100 can be used as a general camera. The image capturing apparatus 100 can thus switch freely between a live face capture camera and a general camera, providing dual functionality. The image capturing apparatus 100 may encode and output the captured non-living body mode image.
Illustratively, the output device may be an output interface for outputting the encoded image to an external device through a wired or wireless network. The output device may also be any suitable display device, such as a cathode ray tube (CRT) display, a plasma display panel (PDP), or a liquid crystal display (LCD).
By way of example and not limitation, the output means for outputting the encoded image may be the same means as the output means for outputting the superimposed image described above.
According to another aspect of the present invention, there is provided an image processing method. FIG. 2 shows a schematic flow diagram of an image processing method 200 according to one embodiment of the invention. As shown in fig. 2, the image processing method 200 includes the following steps.
In step S210, a plurality of lights with different wavelengths, having two or more sets of luminance values, are emitted to the image capturing area over two or more periods, respectively, wherein the plurality of lights are emitted simultaneously in each period and each set of luminance values includes a plurality of luminance values corresponding one to one to the plurality of lights.
In step S220, an image is acquired for the image acquisition region.
In step S230, two or more living body mode images respectively acquired under irradiation of a plurality of kinds of light having two or more sets of luminance values are processed to obtain living body information indicating whether or not a living body exists in an image acquisition region.
As described above, the controller 120 may turn on the light source 110 and control the light source 110 to emit the plurality of lights having two or more sets of luminance values to the image collection area of the image sensor 130 for two or more periods, respectively. Meanwhile, an image may be acquired for the image acquisition area using the image sensor 130. After the two or more living body mode images are acquired, the acquired two or more living body mode images may be processed by the signal processor 140 described above to obtain desired living body information.
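The cooperation of the three steps can be sketched as follows (a schematic only; the stub classes and the `process` callback stand in for the light source 110, image sensor 130, and signal processor 140, and are hypothetical rather than part of the disclosure):

```python
class StubLightSource:
    # Records emit/off commands in place of a real light source.
    def __init__(self):
        self.log = []
    def emit(self, brightness_values):
        self.log.append(brightness_values)
    def off(self):
        self.log.append(None)

class StubSensor:
    # Returns pre-canned frames in place of a real image sensor.
    def __init__(self, frames):
        self._frames = iter(frames)
    def capture(self):
        return next(self._frames)

def capture_and_detect(light_source, sensor, process, brightness_sets):
    # S210: emit the plural lights with one set of brightness values per period;
    # S220: acquire one living body mode image under each set;
    # S230: process all acquired images into living body information.
    images = []
    for values in brightness_sets:
        light_source.emit(values)
        images.append(sensor.capture())
    light_source.off()
    return process(images)

source = StubLightSource()
sensor = StubSensor(["frame1", "frame2"])
info = capture_and_detect(source, sensor, process=len, brightness_sets=[(1, 2), (3, 4)])
```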
Illustratively, the image processing method 200 may further include: stopping emitting the plurality of lights to the image acquisition area; and superimposing the non-living body mode image acquired when the plurality of kinds of light are not emitted with living body information to obtain a superimposed image.
For example, the controller 120 may turn off the light source 110, thereby stopping the emission of the plurality of lights to the image collection area. While the light source 110 is off, a non-living body mode image is acquired under natural light. Fusing the living body information onto this non-living body mode image allows it to reflect more clearly whether a living body exists in the image acquisition area and where it is located.
Illustratively, the image processing method 200 may further include: outputting the superimposed image.
Alternatively, the superimposed image may be output through a wired or wireless network, or through a display device. Outputting the superimposed image makes it convenient for the user to view the living body recognition result.
Illustratively, the plurality of lights are near-infrared lights. The plurality of lights may be emitted by a light source, which may be a near-infrared light source.
Illustratively, step S220 is implemented using one image sensor of the image capturing device, and the image capturing device further includes a natural light cut-off filter disposed in front of the image sensor. The image processing method 200 then further includes: switching the natural light cut-off filter to an activated state when the light source is turned on, and to a deactivated state when the light source is turned off.
Illustratively, step S220 is implemented by using two image sensors of the image capturing device, which are respectively used for receiving natural light and infrared light. Step S220 may include: two or more living body mode images are acquired using an image sensor for receiving infrared light. Step S220 may further include: acquiring a non-living body mode image by using an image sensor for receiving natural light.
Illustratively, the plurality of lights may be emitted by a light source, step S220 is implemented by an image sensor of the image capturing device, and the image processing method 200 may further include: the light source is controlled such that the duration of the plurality of lights having each set of luminance values is less than or equal to a time difference between two adjacent exposure start times of the image sensor.
Illustratively, the plurality of lights may be emitted by a light source, step S220 is implemented by an image sensor of the image capturing device, and the image processing method 200 may further include: upon receiving an exposure signal transmitted by the image sensor regarding the start of the current exposure, the light source is controlled to emit a plurality of lights having a current set of brightness values.
Illustratively, the plurality of lights may be emitted by a light source, step S220 is implemented by an image sensor of the image capturing device, and the image processing method 200 may further include: when the light source is controlled to emit a plurality of lights having the current set of brightness values, a control signal regarding the start of the current exposure is transmitted to the image sensor to control the image sensor to start the exposure.
Illustratively, the plurality of lights may be emitted by a light source, and step S210 may include: the pulse width modulation signal is transmitted to the light source to control the light source to emit a plurality of lights having luminance values indicated by the pulse width modulation signal. Exemplarily, step S230 may include:
for each of the remaining living body mode images other than the first of the two or more living body mode images, a sub-living body rate of that living body mode image is calculated according to the following formula:

M_n(x, y) = exp(-a_n * (L_n(x, y) / L_1(x, y) - b_n)^2);

wherein n ≥ 2, M_n(x, y) is the sub-living body rate of the pixel with coordinates (x, y) in the nth living body mode image, a_n is a first preset parameter corresponding to the nth living body mode image, b_n is a second preset parameter corresponding to the nth living body mode image, L_n(x, y) is the pixel value of the pixel with coordinates (x, y) in the nth living body mode image, and L_1(x, y) is the pixel value of the pixel with coordinates (x, y) in the 1st living body mode image;

all the calculated sub-living body rates are multiplied together to obtain the living body information.
The manner of calculating the living body information from the sub-living body rate has been described above, and will not be described herein again. It is to be understood that the manner of calculating the living body information according to the sub living body rate is merely an example and not a limitation, and the living body information may be determined in other suitable manners.
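As a concrete illustration of this computation, a sketch with numpy (the parameter values and the epsilon guard against division by zero are assumptions for the example, not part of the disclosure):

```python
import numpy as np

def living_body_rate(images, a, b):
    # images: list of H x W arrays [L_1, ..., L_N], one living body mode
    #         image per set of brightness values (N >= 2);
    # a, b:   preset parameters a_n and b_n for n = 2 .. N.
    L1 = images[0].astype(np.float64)
    M = np.ones_like(L1)
    for n, Ln in enumerate(images[1:], start=2):
        ratio = Ln.astype(np.float64) / np.maximum(L1, 1e-6)  # guard L_1 = 0
        Mn = np.exp(-a[n - 2] * (ratio - b[n - 2]) ** 2)  # sub-living body rate
        M *= Mn  # multiply all sub-living body rates together
    return M  # living body information M(x, y)

# Where L_n / L_1 equals b_n exactly, the sub rate is exp(0) = 1.
imgs = [np.full((2, 2), 100.0), np.full((2, 2), 50.0)]
M = living_body_rate(imgs, a=[4.0], b=[0.5])
```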
Illustratively, the plurality of lights may be emitted by a light source, and step S210 may include: the light source is periodically turned on and controlled to emit a plurality of lights having two or more sets of brightness values for two or more periods, respectively.
Illustratively, the image processing method 200 further includes: stopping emitting the plurality of lights to the image acquisition area; encoding a non-living body mode image acquired when the plurality of lights are not emitted to obtain an encoded image; and outputting the encoded image.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the foregoing illustrative embodiments are merely exemplary and are not intended to limit the scope of the invention thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the method of the present invention should not be construed to reflect the intent: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functionality of some of the modules in an image acquisition apparatus according to embodiments of the present invention. The present invention may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The above description is only for the specific embodiment of the present invention or the description thereof, and the protection scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (17)

1. An image acquisition device comprises a light source, a controller, an image sensor and a signal processor, wherein,
the light source is used for simultaneously emitting a plurality of lights with different wavelengths to the image acquisition area of the image sensor;
the controller is configured to turn on the light source and control the light source to emit the plurality of lights having two or more sets of brightness values, respectively, in two or more periods, wherein the light source emits the plurality of lights having one set of brightness values in the same period when turned on, and each set of brightness values includes a plurality of brightness values corresponding to the plurality of lights one to one;
the image sensor is used for acquiring an image aiming at the image acquisition area;
the signal processor is used for processing two or more living body mode images respectively acquired by the image sensor under the irradiation of the various lights with two or more groups of brightness values to obtain living body information for indicating whether a living body exists in the image acquisition area.
2. The image capturing apparatus of claim 1,
the controller is further configured to turn off the light source;
the signal processor is further configured to superimpose a non-living body mode image acquired by the image sensor when the light source is turned off with the living body information to obtain a superimposed image.
3. The image capturing apparatus of claim 2, wherein the image capturing apparatus further comprises an output device for outputting the overlay image.
4. The image capture device of claim 1, wherein the light source is a near infrared light source.
5. The image pickup device according to claim 4, wherein the number of the image sensors is one, the image pickup device further comprising a natural light cut filter disposed in front of the image sensors,
the controller is further configured to switch the natural light cut-off filter to an enabled state when the light source is turned on, and to switch the natural light cut-off filter to a disabled state when the light source is turned off.
6. The image pickup device according to claim 4, wherein the number of the image sensors is two for receiving natural light and infrared light, respectively.
7. The image capturing apparatus of claim 1, wherein the controller is further configured to control the light source such that a duration of the plurality of lights having each set of brightness values is less than or equal to a time difference between two adjacent exposure start times of the image sensor.
8. The image capturing apparatus as claimed in claim 1, wherein the controller is further configured to control the light source to emit the plurality of lights having a current set of brightness values upon receiving an exposure signal transmitted by the image sensor with respect to a start of a current exposure.
9. The image capturing apparatus of claim 1, wherein the controller is further configured to transmit a control signal regarding a start of a current exposure to the image sensor to control the image sensor to start exposure when the light source is controlled to emit the plurality of lights having a current set of brightness values.
10. The image capturing apparatus of claim 1, wherein the controller is further configured to transmit a pulse width modulation signal to the light source to control the light source to emit the plurality of lights having brightness values indicated by the pulse width modulation signal.
11. The image acquisition device of claim 1, wherein the signal processor processes the two or more live mode images by:
for each of the remaining living body mode images other than the first living body mode image of the two or more living body mode images, calculating a sub-living body rate of the living body mode image according to the following formula:
M_n(x, y) = exp(-a_n * (L_n(x, y) / L_1(x, y) - b_n)^2);

wherein n ≥ 2, M_n(x, y) is the sub-living body rate of the pixel with coordinates (x, y) in the nth living body mode image, a_n is a first preset parameter corresponding to the nth living body mode image, b_n is a second preset parameter corresponding to the nth living body mode image, L_n(x, y) is the pixel value of the pixel with coordinates (x, y) in the nth living body mode image, and L_1(x, y) is the pixel value of the pixel with coordinates (x, y) in the 1st living body mode image;
multiplying all the calculated sub-living body rates to obtain the living body information.
12. The image capturing apparatus of claim 1, wherein the controller is further configured to periodically turn on the light source and control the light source to emit the plurality of lights having two or more sets of brightness values for two or more periods, respectively.
13. The image capturing apparatus of claim 1, wherein the image capturing apparatus further comprises an output device,
the controller is further configured to turn off the light source;
the signal processor is further configured to encode a non-living mode image acquired by the image sensor when the light source is turned off to obtain an encoded image;
the output means is for outputting the encoded image.
14. An image processing method comprising:
emitting, to the image acquisition region, a plurality of lights having two or more sets of brightness values with different wavelengths for two or more periods, respectively, wherein emitting, to the image acquisition region, a plurality of lights having two or more sets of brightness values with different wavelengths for two or more periods, respectively, includes: emitting the plurality of lights having a set of luminance values in the same period, the plurality of lights being emitted simultaneously in each period, and each set of luminance values including a plurality of luminance values in one-to-one correspondence with the plurality of lights;
acquiring an image for the image acquisition area; and
processing two or more living body mode images respectively acquired under the irradiation of the plurality of kinds of light having two or more sets of luminance values to obtain living body information indicating whether or not a living body exists in the image acquisition region.
15. The image processing method of claim 14, wherein the image processing method further comprises:
stopping emitting the plurality of lights to the image acquisition area; and
superimposing a non-living body mode image acquired when the plurality of kinds of light are not emitted with the living body information to obtain a superimposed image.
16. The image processing method of claim 15, wherein the image processing method further comprises: outputting the superimposed image.
17. The image processing method according to claim 14, wherein said processing two or more live mode images respectively acquired under the irradiation of the plurality of kinds of light having two or more sets of luminance values includes:
for each of the remaining living body mode images other than the first living body mode image of the two or more living body mode images, calculating a sub-living body rate of the living body mode image according to the following formula:
M_n(x, y) = exp(-a_n * (L_n(x, y) / L_1(x, y) - b_n)^2);

wherein n ≥ 2, M_n(x, y) is the sub-living body rate of the pixel with coordinates (x, y) in the nth living body mode image, a_n is a first preset parameter corresponding to the nth living body mode image, b_n is a second preset parameter corresponding to the nth living body mode image, L_n(x, y) is the pixel value of the pixel with coordinates (x, y) in the nth living body mode image, and L_1(x, y) is the pixel value of the pixel with coordinates (x, y) in the 1st living body mode image;
multiplying all the calculated sub-living body rates to obtain the living body information.
CN201710412898.7A 2017-06-05 2017-06-05 Image acquisition device and image processing method Active CN108881674B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710412898.7A CN108881674B (en) 2017-06-05 2017-06-05 Image acquisition device and image processing method


Publications (2)

Publication Number Publication Date
CN108881674A CN108881674A (en) 2018-11-23
CN108881674B true CN108881674B (en) 2020-09-18

Family

ID=64320799

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710412898.7A Active CN108881674B (en) 2017-06-05 2017-06-05 Image acquisition device and image processing method

Country Status (1)

Country Link
CN (1) CN108881674B (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100576231C (en) * 2007-01-15 2009-12-30 中国科学院自动化研究所 Image collecting device and use the face identification system and the method for this device
CN101493884B (en) * 2008-01-24 2012-05-23 中国科学院自动化研究所 Multi-optical spectrum image collecting device and method
JP5541914B2 (en) * 2009-12-28 2014-07-09 オリンパス株式会社 Image processing apparatus, electronic apparatus, program, and operation method of endoscope apparatus
US10621454B2 (en) * 2015-06-29 2020-04-14 Beijing Kuangshi Technology Co., Ltd. Living body detection method, living body detection system, and computer program product
CN105260731A (en) * 2015-11-25 2016-01-20 商汤集团有限公司 Human face living body detection system and method based on optical pulses
CN105447483B (en) * 2015-12-31 2019-03-22 徐州旷视数据科技有限公司 Biopsy method and device
CN106529512B (en) * 2016-12-15 2019-09-10 北京旷视科技有限公司 Living body faces verification method and device


Similar Documents

Publication Publication Date Title
JP7354133B2 (en) Camera exposure adjustment for 3D depth sensing and 2D imaging
US20200082160A1 (en) Face recognition module with artificial intelligence models
WO2020010848A1 (en) Control method, microprocessor, computer readable storage medium, and computer apparatus
JP4297876B2 (en) Imaging apparatus, light source control method, and computer program
CN102472950B (en) Image pickup apparatus and mobile phone equipped therewith
US20160206204A1 (en) Organ Imaging Device
US9898658B2 (en) Pupil detection light source device, pupil detection device and pupil detection method
JP2014017114A (en) Illumination system
CN105850112A (en) Imaging control device
US11904869B2 (en) Monitoring system and non-transitory storage medium
WO2016031666A1 (en) Line-of-sight detection device
CN109697422B (en) Optical motion capture method and optical motion capture camera
CN108881674B (en) Image acquisition device and image processing method
JP7013730B2 (en) Image generation control device, image generation control method, and image generation control program
CN103685964B (en) Control method and device and electronic equipment
CN112752003A (en) Light supplementing method and device, light supplementing equipment and monitoring equipment
KR101872165B1 (en) Dental camera
CN103888674B (en) Image capture unit and image acquisition method
CN108886608B (en) White balance adjustment device, working method thereof and computer readable medium
TWI464527B (en) Electronic device and method for automatically adjusting light value of shooting
CN109426762B (en) Biological recognition system, method and biological recognition terminal
KR102460762B1 (en) Camera apparatus with integrated heterogeneous video
JP7080724B2 (en) Light distribution control device, light projection system and light distribution control method
CN113128259B (en) Face recognition device and face recognition method
JP6512806B2 (en) Imaging device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant