WO2019244496A1 - Information processing device, portable device, information processing method, and program - Google Patents

Information processing device, portable device, information processing method, and program

Info

Publication number
WO2019244496A1
Authority
WO
WIPO (PCT)
Prior art keywords
mode
image
unit
control unit
information processing
Application number
PCT/JP2019/018523
Other languages
English (en)
Japanese (ja)
Inventor
Koichi Sakumoto
Original Assignee
Sony Corporation
Application filed by Sony Corporation
Publication of WO2019244496A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/117 Identification of persons
    • A61B 5/1171 Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B 5/1172 Identification of persons based on the shapes or appearances of their bodies or parts thereof using fingerprinting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis

Definitions

  • The present disclosure relates to an information processing device, a wearable device, an information processing method, and a program.
  • One object of the present disclosure is to provide an information processing device, a wearable device, an information processing method, and a program that can suppress unnecessary power consumption.
  • The present disclosure is, for example, an information processing device including at least a control unit that selectively sets a first mode and a second mode in which processing that consumes more power than in the first mode is performed, wherein the control unit, in the first mode, determines whether or not an image obtained via a sensor unit includes biological information; triggered by the biological information being included in the image, the operation mode is changed from the first mode to the second mode; and, in the second mode, the information processing device performs at least a matching process using the biological information.
  • The present disclosure is also, for example, a wearable device including a control unit that selectively sets at least a first mode and a second mode in which processing that consumes more power than in the first mode is performed, and a sensor unit that acquires an image, wherein the control unit, in the first mode, determines whether or not the image obtained via the sensor unit includes biological information; triggered by the biological information being included in the image, the operation mode is changed from the first mode to the second mode; and, in the second mode, the wearable device performs at least the matching process using the biological information.
  • The present disclosure is also, for example, an information processing method in which a control unit selectively sets at least a first mode and a second mode in which processing that consumes more power than in the first mode is performed, determines in the first mode whether or not an image obtained via a sensor unit includes biological information, changes the operation mode from the first mode to the second mode triggered by the biological information being included in the image, and performs at least a matching process using the biological information in the second mode.
  • The present disclosure is also, for example, a program that causes a computer to execute an information processing method in which a control unit selectively sets at least a first mode and a second mode in which processing that consumes more power than in the first mode is performed, determines in the first mode whether or not an image obtained via a sensor unit includes biological information, changes the operation mode from the first mode to the second mode triggered by the biological information being included in the image, and performs at least a matching process using the biological information in the second mode.
  • FIG. 1 is a diagram illustrating an example of an external appearance of a wristband electronic device according to an embodiment.
  • FIG. 2 is a diagram illustrating an example of an internal structure of the wristband type electronic device according to the embodiment.
  • FIG. 3 is a diagram illustrating a more specific example of the internal structure of the wristband type electronic device.
  • FIG. 4 is a block diagram illustrating a circuit configuration example of the wristband type electronic device according to the embodiment.
  • FIG. 5 is a functional block diagram for explaining a function example of the control unit according to the embodiment.
  • FIG. 6 is a diagram for explaining feature points of a fingerprint.
  • FIG. 7 is a functional block diagram for explaining a function example of the preprocessing unit according to the embodiment.
  • FIGS. 8A to 8D are diagrams for explaining processing by the noise removal unit according to the embodiment.
  • FIGS. 9A and 9B are diagrams referred to when describing an example of processing for detecting the flow direction and the main frequency of the fingerprint.
  • FIGS. 10A and 10B are diagrams for explaining a process of estimating a fingerprint line to the outside of the imaging range.
  • FIG. 11 is a diagram illustrating an example of the certainty factor map.
  • FIG. 12 is a diagram illustrating an example of the certainty factor.
  • FIG. 13A and FIG. 13B are diagrams for explaining a process of generating a ridge estimation image with a certainty factor map.
  • FIG. 14A and FIG. 14B are diagrams for explaining a registration process according to the embodiment.
  • FIGS. 15A and 15B are diagrams for explaining the matching process according to the embodiment.
  • FIG. 16 is a state transition diagram for explaining an example of the transition of the operation mode.
  • FIGS. 17A and 17B are diagrams for explaining an example of the trigger P.
  • FIG. 18 is a diagram for explaining another example of the trigger P.
  • FIGS. 19A and 19B are diagrams illustrating an example of an axial direction defined by a wristband type electronic device.
  • FIGS. 20A to 20C are diagrams for explaining another example of the trigger P.
  • FIGS. 21A and 21B are diagrams referred to when explaining another example of the trigger P.
  • FIGS. 22A to 22D are diagrams illustrating an example of the trigger Q.
  • FIG. 23 is a diagram for explaining an example of the trigger Q.
  • FIGS. 24A to 24D are diagrams referred to when explaining another example of the trigger Q.
  • FIG. 25 is a flowchart illustrating a flow of a process according to the second embodiment.
  • FIG. 26 is a flowchart illustrating a flow of a process according to the second embodiment.
  • FIG. 27 is a diagram for describing a modification.
  • FIG. 1 shows an example of the external appearance of a wristband-type electronic device (wristband-type electronic device 1) according to the first embodiment.
  • The wristband-type electronic device 1 is used, for example, like a wristwatch. More specifically, the wristband-type electronic device 1 has a band portion 2 wound around the user's wrist WR and a main body portion 3. The main body 3 has a display 4. Although details will be described later, in the wristband-type electronic device 1 according to the embodiment, touching the display 4 with a fingertip makes it possible to perform biometric authentication using the fingerprint information of the fingertip.
  • FIG. 2 is a partial cross-sectional view illustrating an example of the structure inside the main body 3 of the wristband-type electronic device 1.
  • The main body 3 of the wristband-type electronic device 1 includes, for example, the display 4 described above, a light guide plate 5, a light emitting unit 6, a touch sensor unit 7, an imaging element 8 as an example of the sensor unit, and a lens unit 9.
  • A touch operation with the fingertip F is performed on the display 4, and the presence or absence of the touch is detected by the touch sensor unit 7.
  • The main body 3 of the wristband-type electronic device 1 has a structure in which the light guide plate 5, the display 4, the lens unit 9, and the imaging element 8 are stacked in this order from the near side to the far side as viewed from the operation direction.
  • The contact with the display 4 may include not only direct contact with the display 4 but also indirect contact via another member (for example, the light guide plate 5).
  • The contact with the display 4 may also include, for example, not only the fingertip F touching the display 4 but also bringing the fingertip F close enough to the display 4 that a fingerprint image can be obtained.
  • The display 4 includes an LCD (Liquid Crystal Display), an OLED (Organic Light Emitting Diode) display, or the like.
  • The light guide plate 5 is, for example, a light-transmissive member that guides light from the light emitting unit 6 to an area AR with which the fingertip F is in contact.
  • The light guide plate 5 is not limited to a transparent one, and may be any plate that transmits enough light for the fingerprint of the fingertip F to be photographed by the imaging element 8.
  • The light emitting unit 6 is configured by an LED (Light Emitting Diode) or the like, and is provided at least partially around the light guide plate 5.
  • The area AR is an area including a position corresponding to the imaging element 8, specifically, at least a position corresponding to the imaging range of the imaging element 8.
  • The light emitting unit 6 provides the light required for photographing, for example, by being turned on when a fingerprint is photographed.
  • The touch sensor unit 7 is a sensor that detects contact of the fingertip F with the display 4.
  • As the touch sensor unit 7, for example, a capacitive touch sensor is applied.
  • A touch sensor of another type, such as a resistive film type, may also be applied as the touch sensor unit 7.
  • In this example, the touch sensor unit 7 is locally provided at a position near the lower part of the area AR.
  • The touch sensor unit 7 may instead be provided over substantially the entire lower side of the display 4.
  • The imaging element 8 is configured by a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor) sensor, or the like.
  • The imaging element 8 photoelectrically converts subject light incident through the lens unit 9 (reflected light from the object that has come into contact with the display 4) into an electric charge.
  • Various subsequent processes are performed on the image signal obtained via the imaging element 8.
  • The lens unit 9 is configured by lenses (microlenses) provided at intervals of several tens to several hundreds of pixels of the imaging element 8.
  • FIG. 3 is a diagram illustrating a more specific example of the internal structure of the wristband-type electronic device 1.
  • In the example of FIG. 3, the display 4 is described as a transparent panel unit having a plurality of transparent light emitting elements such as transparent organic EL elements and quantum dot light emitting elements.
  • The display 4 has an effective area 4A and an outer frame 4B.
  • The display 4 functions as a display panel that displays an image in the effective area 4A by light emission of the plurality of transparent light emitting elements.
  • The transparent light emitting elements are arranged in a matrix in the effective area 4A, for example.
  • The display 4 also functions as a touch sensor that detects the touch state of an object such as a finger based on, for example, the value of the capacitance between a plurality of wirings for the light emitting elements.
  • A cover glass 50 is provided on the upper surface (operation side) of the display 4, and an imaging unit 60 including the imaging element 8 is arranged below a partial area of the display 4.
  • The imaging unit 60 has a function of imaging, via the display 4, an object that is in contact with or in proximity to that partial area of the display 4.
  • The object imaged by the imaging unit 60 may be, for example, a part of a living body.
  • The imaging unit 60 may have the function of a biometric authentication device that performs biometric authentication on a part of a living body based on a captured image of that part obtained by the imaging.
  • The function of the imaging unit 60 as a biometric authentication device can constitute, for example, a fingerprint sensor.
  • The imaging unit 60 includes a microlens array module 61, an imaging unit outer frame 62, the above-described imaging element 8, and a substrate 63.
  • The microlens array module 61 is arranged within the effective area 4A of the display 4 when viewed from above.
  • The imaging element 8 is arranged on the substrate 63.
  • The microlens array module 61 is disposed between the imaging element 8 and the effective area 4A of the display 4.
  • The microlens array module 61 includes a cover glass and light guide plate 65, a microlens array 66, and a light guide plate 67, in this order from the top.
  • The microlens array 66 has a plurality of microlenses arranged in a matrix.
  • The microlens array 66 condenses object light from an object such as a finger toward the imaging element 8 with each of the plurality of microlenses.
  • The cover glass and light guide plate 65 protects the surface of the microlens array 66. The cover glass and light guide plate 65 also guides the object light transmitted through the effective area 4A of the display 4 to each of the plurality of microlenses.
  • The cover glass and light guide plate 65 has a plurality of light guide paths provided at positions corresponding to the plurality of microlenses.
  • The light guide plate 67 has a plurality of light guide paths 68, as shown in FIG. 3.
  • The plurality of light guide paths 68 are provided at positions corresponding to the plurality of microlenses, respectively, and guide the light collected by each of the plurality of microlenses to the imaging element 8.
  • FIG. 4 is a block diagram illustrating a circuit configuration example of the wristband-type electronic device 1 and the like.
  • The wristband-type electronic device 1 includes, in addition to the display 4, the touch sensor unit 7, the imaging element 8, and the like, for example, a control unit 11, a wireless communication unit 12, an antenna 13 connected to the wireless communication unit 12, an NFC (Near Field Communication) communication unit 14, an antenna 15 connected to the NFC communication unit 14, a position sensor unit 16, an antenna 17 connected to the position sensor unit 16, a memory unit 18, a vibrator 19, a motion sensor 20, a voice processing unit 21, a microphone 22, and a speaker 23.
  • The control unit 11 includes, for example, a CPU (Central Processing Unit) and controls each unit of the wristband-type electronic device 1. For example, the control unit 11 performs various types of image processing on the fingerprint image of the fingertip F captured by the imaging element 8 and performs fingerprint authentication based on the fingerprint image, which is one type of biological information.
  • The wireless communication unit 12 performs short-range wireless communication with other terminals based on, for example, the Bluetooth (registered trademark) standard.
  • The wireless communication unit 12 performs modulation/demodulation processing, error correction processing, and the like in accordance with, for example, the Bluetooth (registered trademark) standard.
  • The NFC communication unit 14 performs wireless communication with a nearby reader/writer based on the NFC standard. Although not illustrated, power is supplied from a battery such as a lithium-ion secondary battery to each unit of the wristband-type electronic device 1. The battery may be charged wirelessly based on the NFC standard.
  • The position sensor unit 16 is a positioning unit that measures the current position using, for example, a system called GNSS (Global Navigation Satellite System). Data obtained by the wireless communication unit 12, the NFC communication unit 14, and the position sensor unit 16 are supplied to the control unit 11, and the control unit 11 performs control based on the supplied data.
  • The memory unit 18 is a general term for a ROM (Read Only Memory) in which a program executed by the control unit 11 is stored, a RAM (Random Access Memory) used as a work memory when the control unit 11 executes the program, a non-volatile memory for data storage, and the like.
  • The memory unit 18 stores a feature amount of the fingerprint of an authorized user used for fingerprint authentication (hereinafter referred to as a registered feature amount as appropriate). This registered feature amount is initially registered, for example, when the wristband-type electronic device 1 is used for the first time.
  • The vibrator 19 is, for example, a member that vibrates the main body 3 of the wristband-type electronic device 1. Vibrating the main body 3 with the vibrator 19 notifies the user of an incoming call, reception of an e-mail, or the like.
  • The motion sensor 20 detects the movement of the user wearing the wristband-type electronic device 1.
  • As the motion sensor 20, an acceleration sensor, a gyro sensor, an electronic compass, a barometric pressure sensor, a biosensor for detecting blood pressure, pulse, and the like are used.
  • A pressure sensor or the like for detecting whether or not the user is wearing the wristband-type electronic device 1 may be provided on the back side (the side facing the wrist) of the band portion 2 or the main body portion 3.
  • The microphone 22 and the speaker 23 are connected to the voice processing unit 21, and the voice processing unit 21 performs call processing with the other party connected by wireless communication via the wireless communication unit 12.
  • The voice processing unit 21 can also perform processing for voice input operations.
  • The wristband-type electronic device 1 is not limited to the above-described configuration example, and may have a configuration in which part of the above configuration is omitted or another configuration is added.
  • FIG. 5 is a functional block diagram for explaining an example of a function of the control unit 11.
  • The control unit 11 includes a preprocessing unit 11a, a feature point detection unit 11b, a feature amount extraction unit 11c, and a matching processing unit 11d.
  • The preprocessing unit 11a performs various correction processes on the input fingerprint image. Details of the processing performed by the preprocessing unit 11a will be described later.
  • The feature point detection unit 11b detects feature points of a fingerprint from an image including the fingerprint by applying a known method.
  • The feature points of the fingerprint are characteristic portions necessary for identifying the fingerprint, for example, end points and branch points in the pattern drawn by the fingerprint lines as shown in FIG. 6, as well as intersections and isolated points of the fingerprint lines described later.
  • In the embodiment, the fingerprint line is described as a ridge of the fingerprint, but it may be at least one of a ridge and a valley of the fingerprint.
  • The feature amount extraction unit 11c extracts a feature amount characterizing each feature point detected by the feature point detection unit 11b.
  • The feature amount includes the position of a feature point, the direction of a feature line (for example, a relative direction (vector) with respect to a predetermined direction) defined by a ridge, and the like.
  • The feature amount extraction unit 11c extracts the feature amount of a feature point based on a peripheral image including the feature point; for example, an image obtained by cutting out a 3 mm × 3 mm region around the feature point and normalizing its angle is used.
  • Extracting the feature amount after normalizing the angle has the effect that the extracted feature amount hardly changes even if the orientation of the finger differs between registration and verification, that is, the effect of improving robustness with respect to the angle at which the finger is placed.
  • The relative position of a sweat gland with respect to the feature point may also be included in the feature amount of the feature point.
  • The embodiment according to the present disclosure does not necessarily need to capture a fingerprint over a wide area of the finger, and can therefore be said to be a method suitable for fingerprint matching in a small area.
  • The matching processing unit 11d performs a matching process of comparing the feature amount extracted by the feature amount extraction unit 11c with the registered feature amount registered in advance, and outputs a collation score as the result of the matching process. If the collation score is equal to or greater than a threshold, the fingerprint authentication is established, that is, the user is determined to be an authorized user. Conversely, if the collation score is smaller than the threshold, the fingerprint authentication is not established.
  • The result of the matching process may be notified to the user by display, sound, vibration, or the like. When the authentication is established as a result of the matching process, use according to the application becomes possible, for example, use of a predetermined function of the wristband-type electronic device 1 is permitted.
  • In the embodiment, the registered feature amount is described as being stored in the memory unit 18.
  • The registered feature amount may instead be stored in an external device such as a server device on a cloud.
  • In that case, the registered feature amount may be downloaded from the external device.
  • The registered feature amount may also be automatically deleted from the wristband-type electronic device 1 after the matching process is completed.
  • FIG. 7 is a functional block diagram illustrating an example of a function of the preprocessing unit 11a.
  • The preprocessing unit 11a includes, for example, a noise removal unit 101, a ridge estimation image generation unit 102 as an image generation unit, and a certainty factor map generation unit 103, as a configuration that executes the functions included in the correction processing.
  • The noise removal unit 101 removes noise included in the fingerprint image.
  • FIGS. 8A to 8D are diagrams for explaining the noise removal processing performed by the noise removal unit 101.
  • The image on the left side of FIG. 8A shows a fingerprint image IM1A in which dust NA is reflected.
  • The noise removal unit 101 determines, for example, a region in which the change in luminance value between adjacent pixels is equal to or greater than a predetermined value to be dust, and removes the dust NA by performing interpolation processing using the pixels surrounding the dust NA.
  • From the image after the dust removal, the ridge estimation image IM2A shown on the right side of FIG. 8A is generated by the ridge estimation image generation unit 102.
  • As the process for removing noise such as dust, other known processes can also be applied. The same applies to the processes for removing noise other than dust described below.
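  • A minimal sketch of the dust-removal step described above, in Python with NumPy/SciPy: pixels whose luminance changes sharply against adjacent pixels are flagged as dust and filled in from the surrounding pixels. The gradient threshold and the nearest-neighbor interpolation are illustrative assumptions, not the patent's exact method.

```python
import numpy as np
from scipy import ndimage

def remove_dust(img: np.ndarray, grad_thresh: float = 60.0) -> np.ndarray:
    """Flag pixels whose luminance changes sharply against adjacent pixels
    as dust, then fill them in from the surrounding non-dust pixels.

    img: 2-D grayscale fingerprint image (0-255).
    grad_thresh: illustrative threshold on the adjacent-pixel change.
    """
    gy, gx = np.gradient(img.astype(float))              # adjacent-pixel changes
    dust = np.maximum(np.abs(gx), np.abs(gy)) >= grad_thresh
    dust = ndimage.binary_dilation(dust, iterations=1)   # cover the whole speck

    # Replace each dust pixel with the value of the nearest non-dust pixel.
    idx = ndimage.distance_transform_edt(
        dust, return_distances=False, return_indices=True)
    return img[tuple(idx)]
```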
  • The noise removal unit 101 also removes fixed pattern noise, which is noise other than dust.
  • The image on the left side of FIG. 8B shows a fingerprint image IM1B that includes, for example, fixed pattern noise NB with vertical stripes.
  • Causes of the fixed pattern noise NB include, for example, the structure of the display 4, more specifically, the pattern of the display 4 itself.
  • The imaging element 8 is disposed on the back side of the display 4 with respect to the operation direction. For this reason, a pattern included in the display 4 may be reflected in an image obtained via the imaging element 8 as fixed pattern noise NB in the fingerprint image.
  • Since the noise removal unit 101 removes such fixed pattern noise NB and interpolates the location of the noise NB, a decrease in the accuracy of fingerprint authentication can be prevented even with the structure of the wristband-type electronic device 1 according to the present embodiment.
  • From the image after the noise removal, the ridge estimation image IM2B is generated by the ridge estimation image generation unit 102.
  • The noise removal unit 101 also removes the boundaries of the imaging element, which are noise other than dust.
  • In the embodiment, the imaging element 8 has four image sensors as a plurality of sub-sensor units and has a configuration in which the four image sensors are combined.
  • When an imaging element 8 of a certain size is required by the specifications, forming the imaging element 8 of the required size by combining image sensors of an existing size is more advantageous in terms of manufacturing cost and the like than separately manufacturing an image sensor of a new size.
  • On the other hand, when the imaging element 8 has a structure in which a plurality of image sensors are combined, the boundaries between the plurality of image sensors appear in the fingerprint image IM1C as noise NC, as shown on the left side of FIG. 8C. Since the noise removal unit 101 removes such noise NC and interpolates the location of the noise NC, a decrease in the accuracy of fingerprint authentication can be prevented even with this structure of the wristband-type electronic device 1 according to the present embodiment.
  • From the image after the noise removal, a ridge estimation image IM2C is generated by the ridge estimation image generation unit 102.
  • The noise removal unit 101 also determines that a pattern lacking the curved pattern corresponding to ridges is not a fingerprint, and removes that pattern.
  • The image IM2D after the removal is shown on the right side of FIG. 8D.
  • Such a process is useful, for example, when the user's clothes or the like touch the display 4 during fingerprint authentication.
  • In the case of an image such as the image IM2D, the processing related to fingerprint authentication may simply not be performed.
  • By the correction processing performed by the noise removal unit 101, it is possible to prevent the accuracy of fingerprint authentication from being reduced by the influence of noise. It is also possible to avoid the user having to be notified of an authentication failure caused by a decrease in the accuracy of fingerprint authentication.
  • The ridge estimation image generation unit 102 generates a ridge estimation image, in which the pattern drawn by the fingerprint lines is estimated, based on the image processed by the noise removal unit 101.
  • A known method can be applied to generate the ridge estimation image. Two examples of methods for generating the ridge estimation image according to the present embodiment are described below.
  • (Example 1) The ridge estimation image generation unit 102 applies an FFT (Fast Fourier Transform) to the image processed by the noise removal unit 101 and calculates the average period of the fingerprint lines (for example, 0.4 mm).
  • The ridge estimation image is then generated by applying a bandpass filter centered around the frequency corresponding to that period.
  • (Example 2) The ridge estimation image generation unit 102 applies the FFT to each area of about 1 mm square, extracts the dominant frequency in the area (hereinafter referred to as the main frequency as appropriate) and the dominant angle (the flow direction of the fingerprint), and applies a Gabor filter adapted to that frequency and angle to generate the ridge estimation image. According to the above two examples, the main ridges/valleys are emphasized, and the influence of small noise can be reduced.
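  • A sketch, in Python, of Example 2's block-wise estimation: each block's dominant frequency and angle are read off the peak of its FFT magnitude, and a matched Gabor filter is applied. The block size, kernel size, and sigma are illustrative assumptions; the patent only says the FFT is applied per area of about 1 mm square.

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(freq: float, theta: float, sigma: float, size: int = 21):
    """Real Gabor kernel tuned to spatial frequency `freq` (cycles/pixel)
    whose wave vector points in direction `theta` (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) \
        * np.cos(2 * np.pi * freq * xr)

def estimate_ridges(img: np.ndarray, block: int = 32) -> np.ndarray:
    """Per-block ridge enhancement: read the main frequency and angle of
    each block off its FFT peak, then filter the block with a Gabor kernel
    matched to that frequency and angle."""
    out = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            tile = img[by:by + block, bx:bx + block].astype(float)
            spec = np.abs(np.fft.fftshift(np.fft.fft2(tile - tile.mean())))
            cy, cx = np.unravel_index(np.argmax(spec), spec.shape)
            fy, fx = (cy - block // 2) / block, (cx - block // 2) / block
            freq = max(np.hypot(fx, fy), 1e-3)   # main frequency
            theta = np.arctan2(fy, fx)           # flow-related angle
            k = gabor_kernel(freq, theta, sigma=4.0)
            out[by:by + block, bx:bx + block] = convolve2d(
                tile, k, mode='same', boundary='symm')
    return out
```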
  • Regarding Example 2, an example of a method for detecting the flow direction and the main frequency of the fingerprint will be described with reference to FIGS. 9A and 9B.
  • The image shown on the left side of FIG. 9A is a certain fingerprint image IM8.
  • On the right side of FIG. 9A, the frequency spectrum obtained by applying the FFT to the image IM8 is shown.
  • One of the radial lines superimposed on the frequency spectrum indicates the component having the largest integral value described below.
  • FIG. 9B shows the frequency profile in the direction (main direction) of the component whose integral value is the largest.
  • As a first step, profiles are extracted for 16 directions of the frequency spectrum, and the direction with the largest integral value is determined. This is the main direction component of the wave. Subsequently, as a second step, a peak is detected in the frequency profile of the main direction, and the frequency corresponding to the peak is set as the main frequency. In this way, the flow direction and the main frequency of the fingerprint can be detected.
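  • A sketch of the two-step detection just described, under the assumption that the 16 radial profiles are sampled along straight lines through the center of the shifted spectrum; the bin count and the frequency normalization are illustrative.

```python
import numpy as np

def main_direction_and_frequency(img: np.ndarray, n_dirs: int = 16,
                                 n_bins: int = 64):
    """First step: extract radial profiles of the frequency spectrum for 16
    directions and pick the direction with the largest integral value.
    Second step: take the peak of that direction's profile as the main
    frequency."""
    h, w = img.shape
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean())))
    cy, cx = h // 2, w // 2
    radii = np.linspace(1, min(cy, cx) - 1, n_bins)

    best = (-1.0, 0.0, 0.0)                    # (integral, angle, frequency)
    for k in range(n_dirs):
        ang = np.pi * k / n_dirs               # 16 directions over 180 degrees
        ys = (cy + radii * np.sin(ang)).astype(int)
        xs = (cx + radii * np.cos(ang)).astype(int)
        profile = spec[ys, xs]
        if profile.sum() > best[0]:
            peak = int(np.argmax(profile))
            freq = radii[peak] / max(h, w)     # roughly cycles per pixel
            best = (profile.sum(), ang, freq)
    _, main_direction, main_frequency = best
    return main_direction, main_frequency
```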
  • The ridge estimation image generation unit 102 may also estimate the fingerprint pattern beyond the captured area by extending it over a predetermined range outside that area. For example, based on the fingerprint image IM9A shown in FIG. 10A, a ridge estimation image IM9B enlarged to a range larger than the fingerprint image IM9A by a predetermined size is generated, as shown in FIG. 10B.
  • Specifically, the fingerprint lines obtained at the original size (the size of the fingerprint image IM9A) are extended along the flow (direction) of the fingerprint lines.
  • A branch point or an intersection of the fingerprint lines, which is one of the feature points of the fingerprint, may be newly obtained by such processing.
  • By the above processing, even when the size of the imaging element 8 is small and the area of the image obtained by the imaging element 8 is limited, more feature points can be obtained, and the accuracy of fingerprint authentication can be improved.
  • The certainty factor map generation unit 103 generates a certainty factor map indicating the certainty of the estimation result in each area of the ridge estimation image, which is the image obtained by estimating the pattern corresponding to the fingerprint.
  • FIG. 11 shows a certainty factor map MA10, which is an example of the certainty factor map.
  • In the certainty factor map MA10, the image area is divided into white and black areas.
  • A white area is an area with a high certainty factor, that is, an area where the fingerprint line pattern has been obtained accurately.
  • A black area is an area with a low certainty factor.
  • A predetermined threshold is set for the certainty factor, and an area whose certainty factor is equal to or greater than the threshold is set as a high-confidence area.
  • To calculate the certainty factor, an image of a predetermined size (for example, a rectangular image of 1 mm × 1 mm) is cut out from the ridge estimation image. Then, a luminance distribution indicating the distribution of the luminance values of the pixels of the cut-out image is created.
  • FIG. 12 shows an example of the luminance distribution.
  • For example, the difference value D between the luminance value BV1 and the luminance value BV2 in that distribution is set as the certainty factor. Note that the variance of the luminance values of the cut-out image may be used as the certainty factor instead.
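  • A sketch of the certainty-factor computation. BV1 and BV2 are read off the luminance distribution of FIG. 12, which is not reproduced here; this sketch approximates their separation by percentiles, and the tile size and threshold are assumptions.

```python
import numpy as np

def certainty_map(ridge_img: np.ndarray, tile: int = 32,
                  thresh: float = 40.0) -> np.ndarray:
    """For each tile of the ridge estimation image, compute a certainty
    factor and threshold it into a binary high/low-confidence map."""
    h, w = ridge_img.shape
    conf = np.zeros((h // tile, w // tile), dtype=bool)
    for ty in range(conf.shape[0]):
        for tx in range(conf.shape[1]):
            patch = ridge_img[ty * tile:(ty + 1) * tile,
                              tx * tile:(tx + 1) * tile].astype(float)
            # Separation of the bright and dark luminance peaks (BV1 - BV2),
            # approximated here by the 90th/10th percentiles. The variance
            # patch.var() could be used instead, as the text notes.
            d = np.percentile(patch, 90) - np.percentile(patch, 10)
            conf[ty, tx] = d >= thresh
    return conf
```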
  • The function of the ridge estimation image generation unit 102 and the function of the certainty factor map generation unit 103 described above may be configured as one functional block, and that functional block may generate a ridge estimation image with a certainty factor map.
  • As an example, consider learning the estimation so that a white area is an area that can be recognized with an error of Δ or less, and a black area is an area whose error cannot be suppressed to Δ or less.
  • The accurate ridge image for an input fingerprint image x is defined as the correct ridge image y.
  • The estimation error between the correct ridge image y and the ridge estimation image f(x) is defined as the estimation error dy.
  • One purpose of the process is to estimate from x an image f(x) that is close to y.
  • Another purpose is to recognize the regions that are likely to have been estimated correctly, in other words, to determine whether or not a region is one where the estimation error can be suppressed to Δ or less.
  • To this end, the control unit 11 simultaneously learns functions f and g that minimize the loss function shown in FIG. 13B (where 0 ≤ g(x) ≤ 1).
  • The portion shown in parentheses in the loss function shown in FIG. 13B is the estimation error dyi.
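  • The exact loss function is given only in FIG. 13B, which is not reproduced here. A plausible form consistent with the description above (the parenthesized portion being the per-sample estimation error dyi, g bounded to [0, 1], and λ an assumed weight tied to the tolerance Δ) would be:

```latex
\min_{f,\,g}\ \sum_i \Big[\, g(x_i)\,\big(y_i - f(x_i)\big)^2 \;+\; \lambda\,\big(1 - g(x_i)\big) \Big],
\qquad 0 \le g(x_i) \le 1 .
```

  • With this form, g(x) is driven toward 1 where the squared error (the dyi term) stays below the assumed weight λ (on the order of Δ²) and toward 0 elsewhere, which matches the white/black areas of the certainty factor map.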
  • FIG. 14A is a diagram showing the flow of the registration process.
  • FIG. 14B is a diagram showing an image and the like obtained in each process in association with each process.
  • In step ST11, an image input process is performed. For example, a fingertip is brought into contact with the display 4, and a fingerprint image is obtained via the imaging element 8. When the fingerprint image is acquired, the light emitting unit 6 emits light. Then, the process proceeds to step ST12.
  • In step ST12, preprocessing is performed by the preprocessing unit 11a. Specifically, noise is removed from the fingerprint image by the noise removal unit 101.
  • The ridge estimation image generation unit 102 then generates a ridge estimation image based on the fingerprint image from which the noise has been removed. Further, the certainty factor map generation unit 103 generates a certainty factor map. In FIG. 14B, illustration of the certainty factor map is omitted. Then, the process proceeds to step ST13.
  • In step ST13, the feature point detection unit 11b detects feature points of the fingerprint based on the ridge estimation image.
  • At this time, the feature point detection unit 11b refers to the certainty factor map and detects feature points from the areas determined to have a certainty factor equal to or greater than the threshold.
  • FIG. 14B shows an example in which three feature points (the centers of the circles) are detected. Then, the process proceeds to step ST14.
  • In step ST14, the feature amount extraction unit 11c extracts a feature amount characterizing each feature point.
  • Specifically, the feature amount extraction unit 11c cuts out an image of a predetermined size centered on each feature point and extracts the feature amount based on the cut-out image. Then, the process proceeds to step ST15.
  • In step ST15, the control unit 11 performs a template registration process of registering the feature amount of each feature point extracted in step ST14.
  • The feature amount of each feature point is stored in, for example, the memory unit 18.
  • The feature amount stored in the memory unit 18 is used as the registered feature amount in the matching process described below.
  • FIG. 15A is a diagram illustrating a flow of the matching process.
  • FIG. 15B is a diagram illustrating an example of a feature amount acquired in each process and a diagram referred to when describing the process content, in association with each process.
  • In step ST21, a fingertip is placed on the display 4, a fingerprint image is obtained, and a feature amount extraction process for extracting the feature amount is performed.
  • The feature amount extraction process in step ST21 includes the above-described steps ST11 to ST14. Through the processing in step ST21, the feature amount for collation used for fingerprint authentication is obtained.
  • FIG. 15B shows feature amounts corresponding to five feature points. Then, the process proceeds to step ST22.
  • In step ST22, the control unit 11 reads out the registered feature amounts from the memory unit 18.
  • FIG. 15B shows an example of the registered feature amounts. Then, the process proceeds to step ST23.
  • In step ST23, the matching processing unit 11d performs a matching process of comparing the feature amounts acquired in step ST21 with the registered feature amounts read out in step ST22.
  • For example, the matching processing unit 11d obtains a similarity score between each feature amount for collation and each registered feature amount by an inner product operation, and generates the similarity score matrix shown in FIG. 15B based on the results. "A" in the similarity score matrix indicates a registered feature point, and "B" indicates a feature point for collation.
  • The (i, j) component of the matrix is the similarity score between Ai and Bj.
  • The matching processing unit 11d calculates a collation score based on the similarity score matrix. If the collation score is equal to or greater than the threshold, fingerprint authentication is established; if the collation score is smaller than the threshold, fingerprint authentication is not established. For example, the maximum value in the similarity score matrix is used as the collation score. Alternatively, the average value of the similarity score matrix, or the average of the maximum values of each column of the similarity score matrix, may be used as the collation score.
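  • A sketch of the similarity score matrix and the collation-score reductions described above. The L2 normalization before the inner product is an added assumption; the patent only states that the similarity score is obtained by an inner product operation.

```python
import numpy as np

def collation_score(registered: np.ndarray, probe: np.ndarray,
                    method: str = "max") -> float:
    """Build the similarity score matrix by inner products and reduce it
    to a single collation score.

    registered: (m, d) registered feature amounts (rows = feature points A).
    probe:      (n, d) feature amounts for collation (rows = feature points B).
    """
    # Normalizing first makes the inner product a cosine similarity.
    a = registered / (np.linalg.norm(registered, axis=1, keepdims=True) + 1e-12)
    b = probe / (np.linalg.norm(probe, axis=1, keepdims=True) + 1e-12)
    scores = a @ b.T                    # (i, j) = similarity of Ai and Bj

    if method == "max":                 # maximum value in the matrix
        return float(scores.max())
    if method == "mean":                # average value of the matrix
        return float(scores.mean())
    # average of the maximum of each column
    return float(scores.max(axis=0).mean())

# Authentication is established when the score clears a threshold:
# authenticated = collation_score(registered, extracted) >= THRESHOLD
```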
  • Since the feature amount is extracted based on the peripheral image of a feature point, information other than that of the feature point itself can be used as the feature amount of the feature point.
  • The matching process can therefore be based on a variety of information, so the accuracy of fingerprint authentication can be improved.
  • In the embodiment, fingerprint authentication is performed using the imaging element 8, that is, an image sensor (more specifically, a CMOS sensor), but fingerprint sensing by another method, for example, a capacitance method, may also be used.
  • While a battery having a capacity corresponding to the required power could simply be used, in the case of a wearable device the size of the battery that can be mounted is restricted, and the capacity of the battery is accordingly limited. It is therefore desirable to minimize unnecessary power consumption.
  • The number and size of input devices such as buttons that can be provided are also restricted. It is therefore desirable that the control for minimizing unnecessary power consumption be performed without using an operation on a physical device such as a button as a trigger.
  • The second embodiment will be described in detail with these viewpoints in mind.
  • FIG. 16 is a diagram illustrating the state transitions of the wristband-type electronic device 1.
  • The wristband-type electronic device 1 can transition between, for example, three modes as the operation mode related to fingerprint authentication.
  • The three modes are mode 0, mode 1, and mode 2. From the viewpoint of power consumption, mode 0 has the lowest power consumption and mode 2 has the highest.
  • The power consumption in mode 1 is larger than that in mode 0 and smaller than that in mode 2.
  • Mode 0 corresponds to an example of the third mode, and modes 1 and 2 correspond to examples of the first and second modes, respectively.
  • Mode 0 is a pause mode, in which the light emitting unit 6 is turned off and the imaging element 8 is not operated, that is, fingerprint sensing using the imaging element 8 is not performed.
  • Mode 1 is a standby state, in which the light emitting unit 6 is turned on and fingerprint sensing using the imaging element 8 is performed. The sensing in mode 1 only needs to be sufficient to determine whether or not the object in contact with the display 4 is a fingerprint; more specifically, sensing that acquires an image from which it can be determined whether a fingerprint (for example, a feature point of a fingerprint) is included may be used.
  • Mode 2 is an authentication state, in which the light emitting unit 6 is turned on, the feature amount of the fingerprint is acquired, and the matching process of comparing the acquired feature amount with the registered feature amount is performed.
  • In mode 2, an image is acquired via the imaging element 8 based on settings different from those in mode 1.
  • In mode 1, for example, when a feature point of a fingerprint is detected in the image and it is determined that the object touching the display 4 is a fingertip, the operation mode transitions to mode 2, in which power consumption is higher. This mode transition prevents the matching process and other power-hungry processing from being executed unnecessarily when something other than a fingertip, such as clothing, touches the display 4. Therefore, for example, drain on the battery can be suppressed.
  • Mode 0 is a mode in which no processing related to fingerprint authentication is performed, so the following description gives specific examples of the operation in mode 1 and mode 2.
  • As a first example, illumination control, in which the control unit 11 controls the brightness of the light emitting unit 6, is performed.
  • The operation in each mode follows this illumination control.
  • In mode 1, the brightness (luminance) of the light emitting unit 6 is set low.
  • In mode 2, the brightness of the light emitting unit 6 is set higher than in mode 1 so that a high-definition image is obtained. Since the amount of light reflected from the fingertip changes depending on the state of the finger and how strongly the finger is pressed, the light emission intensity of the light emitting unit 6 may be adjusted adaptively based on the luminance of the image.
  • As a second example, resolution control, in which the control unit 11 changes the resolution by controlling the active pixels of the imaging element 8, is performed.
  • The operation in each mode follows this resolution control.
  • In mode 1, a low resolution is set, and sensing at the low resolution is performed.
  • Here, low resolution means, for example, a resolution of about 300 to 500 ppi (pixels per inch), at which the feature points of a fingerprint can be detected.
  • In mode 2, a high resolution is set, and sensing at the high resolution is performed.
  • Here, high resolution means, for example, a resolution of about 1000 ppi or more, at which features finer than the fingerprint lines, such as sweat glands, can be photographed.
  • As a third example, the control unit 11 controls the region of active pixels in the imaging element 8 to perform sensing region control, which controls the sensing region, that is, the imaging range.
  • In mode 1, sensing using only part of the imaging element 8 (for example, only the vicinity of the center) is performed.
  • In mode 2, sensing using the entire imaging element 8 is performed.
  • Control combining the above examples may also be performed. For example, in mode 1, sensing at low resolution may be performed with the entire imaging element 8 to detect the feature points of a fingerprint, and in mode 2, only the area near the detected feature points may be sensed at high resolution.
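  • A sketch of per-mode sensing settings combining the three controls above. The concrete luminance values and the region encoding are assumptions; only the ppi ranges come from the description.

```python
from dataclasses import dataclass

@dataclass
class SensingConfig:
    led_luminance: float   # relative brightness of the light emitting unit 6
    resolution_ppi: int    # active-pixel resolution of the imaging element 8
    region: str            # which part of the imaging element is active

# Illustrative values; only the ppi ranges come from the description
# (300-500 ppi for feature points, about 1000 ppi for sweat glands).
MODE1 = SensingConfig(led_luminance=0.2, resolution_ppi=400, region="center")
MODE2 = SensingConfig(led_luminance=1.0, resolution_ppi=1000, region="full")

def configure_sensing(mode: int) -> SensingConfig:
    """Return the sensing settings for the current operation mode."""
    return MODE1 if mode == 1 else MODE2
```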
  • The transitions between the modes occur according to a predetermined trigger, the lapse of time, the result of processing, and the like. As shown in FIG. 16, a transition is made from mode 0 to mode 1 based on a trigger P, and a transition is made from mode 1 to mode 2 based on a trigger Q.
  • When the processing in mode 1 is completed, the operation mode changes from mode 1 to mode 0.
  • If a predetermined time elapses in mode 1 without the trigger Q being established, the operation mode transitions from mode 1 to mode 0 (timeout).
  • If a predetermined time elapses in mode 2, the operation mode transitions from mode 2 to mode 1 (timeout).
  • When the processing in mode 2 is completed, the operation mode changes from mode 2 to mode 0.
  • The operation mode may also be directly transitable from mode 0 to mode 2.
  • For example, the operation mode may be allowed to transition from mode 0 to mode 2 based on a trigger R.
  • An example of the trigger R is an operation input instructing that fingerprint authentication be performed. In this case, since it is clear in advance that fingerprint authentication will be performed, the operation mode may transition directly from mode 0 to mode 2.
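  • The transitions of FIG. 16 can be summarized as a small state machine. This sketch assumes that timeout and completion are reported as booleans, which the flowcharts of FIGS. 25 and 26 express as elapsed-time checks.

```python
from enum import Enum

class Mode(Enum):
    MODE0 = 0   # pause: light off, no fingerprint sensing
    MODE1 = 1   # standby: low-power sensing, look for a fingerprint
    MODE2 = 2   # authentication: full sensing and matching

def next_mode(mode: Mode, trigger_p: bool, trigger_q: bool, trigger_r: bool,
              timeout: bool, done: bool) -> Mode:
    """Transitions of FIG. 16: P raises mode 0 to mode 1, Q raises mode 1
    to mode 2, R may jump mode 0 directly to mode 2, and timeout or
    completion steps back down."""
    if mode is Mode.MODE0:
        if trigger_r:                    # e.g. an explicit "authenticate" input
            return Mode.MODE2
        return Mode.MODE1 if trigger_p else Mode.MODE0
    if mode is Mode.MODE1:
        if trigger_q:
            return Mode.MODE2
        return Mode.MODE0 if (timeout or done) else Mode.MODE1
    # Mode 2: back to mode 1 on timeout, to mode 0 when processing ends.
    if done:
        return Mode.MODE0
    return Mode.MODE1 if timeout else Mode.MODE2
```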
  • Examples of the trigger P include the timing at which the start of use of the wristband-type electronic device 1 is detected. At the timing when use of the wristband-type electronic device 1 starts, there is assumed to be a high possibility that fingerprint authentication will be performed to execute a predetermined application. Therefore, the operation mode changes from mode 0 to mode 1.
  • As a specific example of the trigger P, there is the case where the waveform of the acceleration sensor (the output of the acceleration sensor) or the change in that output becomes equal to or greater than a threshold, or equal to or less than a threshold.
  • In such a case, there is a high possibility that the wristband-type electronic device 1 will be used, so the operation mode transitions from mode 0 to mode 1.
  • The acceleration sensor can be applied as one of the motion sensors 20.
  • As another specific example of the trigger P, as shown in FIG. 18, there is the case where the direction (gravity direction) of the composite vector of the three-axis acceleration changes by a threshold amount or more.
  • For the three-axis acceleration, a sensor output corresponding to each axis is defined. Examples of the axes corresponding to the wristband-type electronic device 1 are shown in FIGS. 19A and 19B.
  • The three-axis acceleration is represented by a three-dimensional vector, and if its direction changes, it is determined that the direction of the hand has changed. In such a case as well, there is a high possibility that some action including fingerprint authentication will be performed on the wristband-type electronic device 1, so the operation mode transitions from mode 0 to mode 1.
  • As another example of the trigger P, a predetermined gesture may be recognized from the acceleration output. As shown in FIG. 20A, a predetermined section is set so that the output of the acceleration sensor includes a portion where a change equal to or greater than the threshold occurs.
  • The output of the acceleration sensor corresponding to the set section is input to the recognizer shown schematically in FIG. 20B.
  • The recognizer determines whether a predetermined gesture has occurred by applying a function f to the output of the acceleration sensor.
  • A determination result of the recognizer is then obtained, as shown in FIG. 20C.
  • The case where the score f(x) indicating the likeness to the defined gesture, which is the determination result, is equal to or greater than a threshold is defined as the trigger P.
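  • A sketch of the recognizer-based trigger P. The patent leaves the recognizer f unspecified; normalized template correlation is used here purely as a stand-in, and the window, template, and threshold are assumptions.

```python
import numpy as np

def trigger_p_gesture(accel_window: np.ndarray, template: np.ndarray,
                      score_thresh: float = 0.8) -> bool:
    """Score a section of accelerometer output with a recognizer f and fire
    the trigger when the gesture-likeness score f(x) reaches the threshold.

    accel_window: (T, 3) accelerometer samples in the selected section.
    template:     (T, 3) reference motion for the defined gesture.
    """
    x = accel_window.ravel().astype(float)
    t = template.ravel().astype(float)
    # A deliberately simple recognizer f: normalized correlation against a
    # stored template. A trained classifier could equally play this role.
    x = (x - x.mean()) / (x.std() + 1e-9)
    t = (t - t.mean()) / (t.std() + 1e-9)
    score = float(x @ t) / len(x)       # in [-1, 1]
    return score >= score_thresh
```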
  • The trigger P may also be the detection of contact of the fingertip with the display 4 or movement of the fingertip on the display 4.
  • The trigger P may likewise be set when contact or movement of an object other than a fingertip is detected.
  • FIGS. 21A and 21B are diagrams schematically showing the respective positions of the imaging element 8 and the touch sensor unit 7 with respect to the display 4.
  • The touch sensor unit 7 that detects the contact or movement of an object is arranged, for example, in the vicinity of the imaging element 8, as shown in FIGS. 21A and 21B.
  • The present invention is not limited to these examples, and various conditions can be set as the trigger P.
  • A combination of the above-described examples may also be used as the trigger P.
  • The trigger Q is, for example, a trigger conditioned on a fingerprint being included in the image acquired via the imaging element 8.
  • As a specific example, cycles that can be regarded as the cycle of the fingerprint lines (here, ridges and valleys) of a fingerprint are set.
  • FIG. 22A shows an example of a 0.6 mm cycle,
  • FIG. 22B shows an example of a 0.3 mm cycle,
  • FIG. 22C shows an example of a 0.15 mm cycle, and
  • FIG. 22D shows an example of a 0.075 mm cycle.
  • The frequency component corresponding to each cycle is extracted from the image obtained via the imaging element 8. Then, for each frequency component, for example, 32 kinds of responses (in increments of 11.25 degrees), as shown in FIG. 23, are calculated, and their average value is obtained.
  • If the average value corresponding to at least one of the four frequency components described above is equal to or greater than a threshold, it is highly likely that the object shown in the image is a fingerprint. Therefore, the condition that the average value corresponding to at least one of the four frequency components be equal to or greater than the threshold is set as the trigger Q.
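  • A sketch of the trigger Q decision: average the spectral responses at 32 orientations for each of the four candidate periods, with the threshold expressed relative to the mean spectral energy. The ppi value and the threshold are assumptions, and sampling the FFT magnitude on a ring is one way to realize the 32 kinds of responses of FIG. 23.

```python
import numpy as np

def trigger_q(img: np.ndarray, periods_mm=(0.6, 0.3, 0.15, 0.075),
              ppi: int = 400, n_angles: int = 32,
              resp_thresh: float = 4.0) -> bool:
    """Fire the trigger Q when the image is likely to contain a fingerprint:
    for each candidate fingerprint-line period, sample the FFT magnitude at
    32 orientations (11.25-degree increments) on the ring of that period's
    frequency, average the responses, and compare against a threshold
    expressed relative to the mean spectral energy (an assumed tunable)."""
    h, w = img.shape
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean())))
    cy, cx = h // 2, w // 2
    base = spec.mean() + 1e-12
    px_per_mm = ppi / 25.4

    for period in periods_mm:
        # Ring radius (in FFT bins) for this spatial period.
        r = min(h, w) / (period * px_per_mm)
        if r >= min(cy, cx):
            continue  # period too fine for this image size/resolution
        responses = []
        for k in range(n_angles):
            ang = 2 * np.pi * k / n_angles      # 11.25-degree steps
            y = int(round(cy + r * np.sin(ang)))
            x = int(round(cx + r * np.cos(ang)))
            responses.append(spec[y, x])
        if np.mean(responses) / base >= resp_thresh:
            return True
    return False
```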
  • The condition that a number of fingerprint feature points equal to or greater than a threshold is detected may also be set as the trigger Q.
  • In this case, the feature points of the fingerprint may include, in addition to the end points of the fingerprint lines shown in FIG. 24A, the branch points of the fingerprint lines shown in FIG. 24B, the intersections of the fingerprint lines shown in FIG. 24C, and the isolated points of the fingerprint lines shown in FIG. 24D.
  • The trigger Q is not limited to these examples, and various conditions can be set as the trigger Q.
  • In the flowcharts of FIGS. 25 and 26, circles A, B, and C indicate the continuity of the processing. The description below assumes that the operation mode at the start of the process is mode 0.
  • In step ST31 in FIG. 25, for example, acceleration data is obtained based on the output of the motion sensor 20. Then, the process proceeds to step ST32.
  • In step ST32, using the acceleration data obtained in step ST31, the control unit 11 performs a process of recognizing whether or not the trigger P is established. As described above, whether or not the trigger P is established may be determined using data other than acceleration data. Then, the process proceeds to step ST33.
  • In step ST33, it is determined whether or not the trigger P is established based on the result of the process in step ST32. If the trigger P is not established, the process returns to step ST31. If the trigger P is established, the process proceeds to step ST34.
  • In step ST34, the operation mode changes from mode 0 to mode 1, and the first elapsed time is set to 0 (initialized).
  • The first elapsed time is a time for determining whether or not the entire processing has been completed within a predetermined time, in other words, whether or not the processing has timed out. Then, the process proceeds to step ST35.
  • In step ST35, the second elapsed time is set to 0 (initialized).
  • The second elapsed time is a time for determining whether or not the processing in mode 1 has been completed within a predetermined time, in other words, whether or not that processing has timed out. Then, the process proceeds to step ST36.
  • In step ST36, the light emitting unit 6 is turned on with the brightness corresponding to mode 1. Then, the process proceeds to step ST37.
  • In step ST37, sensing according to the settings corresponding to mode 1 is started. Then, the process proceeds to step ST38.
  • In step ST38, an image is obtained via the imaging element 8 as a result of the sensing in step ST37. Then, the process proceeds to step ST39.
  • In step ST39, a process of recognizing the trigger Q is performed. Then, the process proceeds to step ST40.
  • In step ST40 in FIG. 26, the control unit 11 determines whether or not the trigger Q is established as a result of the process in step ST39. If the trigger Q is not established, the process proceeds to step ST41.
  • In step ST41, it is determined whether or not the second elapsed time is greater than a predetermined threshold th1.
  • The threshold th1 is set to, for example, about 10 seconds.
  • If the second elapsed time is greater than the threshold th1, the processing in mode 1 times out, and the process returns to step ST31.
  • If the second elapsed time is equal to or less than the threshold th1, the processing in mode 1 is repeated. That is, the process returns to step ST38, an image is acquired again, and the processes from step ST38 onward are performed.
  • If the trigger Q is established in step ST40, the operation mode transitions from mode 1 to mode 2, and the process proceeds to step ST42.
  • In step ST42, the third elapsed time is set to 0 (initialized).
  • The third elapsed time is a time for determining whether or not the processing in mode 2 has been completed within a predetermined time, in other words, whether or not that processing has timed out. Then, the process proceeds to step ST43.
  • In step ST43, settings relating to at least one of the shooting area, the lighting (light emitting unit 6), and the resolution according to mode 2 are made, an image is shot based on those settings, and a fingerprint image is obtained. Further, the feature amounts characterizing the feature points of the fingerprint image are extracted. Then, the process proceeds to step ST44.
  • In step ST44, a matching process is performed to compare the obtained feature amounts with the registered feature amounts. Then, the process proceeds to step ST45.
  • In step ST45, it is determined whether or not the quality is sufficient. For example, if the number of detected feature points is equal to or greater than a threshold, it is determined that the quality is sufficient. Alternatively, as a result of the matching process, if the number of feature points judged similar by the comparison of feature amounts is between a certain threshold thA and a certain threshold thB (threshold thA > thB), it may be determined that the quality is not sufficient.
  • If the number of feature points judged similar by the comparison of feature amounts is equal to or greater than the threshold thA (in this case, fingerprint authentication is established), or equal to or less than the threshold thB (in this case, fingerprint authentication is not established), it is determined that the quality is sufficient for deciding the result of fingerprint authentication. If it is determined in step ST45 that the quality is not sufficient, the process proceeds to step ST46.
  • In step ST46, it is determined whether or not the third elapsed time is greater than a threshold th2.
  • The threshold th2 is set to, for example, about 10 seconds. If the third elapsed time is equal to or less than the threshold th2, the process proceeds to step ST47.
  • In step ST47, since the third elapsed time is equal to or less than the threshold th2 and the timeout period has not elapsed, the processing in mode 2 is continued. That is, in step ST47, an image is acquired again via the imaging element 8, and the processes from step ST44 onward are performed.
  • If the third elapsed time is greater than the threshold th2, the process proceeds to step ST48. In step ST48, it is determined whether or not the first elapsed time is greater than a threshold th3. If the first elapsed time is equal to or less than the threshold th3, the timeout for the entire process has not elapsed, so the process returns to step ST38 and the processing related to mode 1 is performed again. If the first elapsed time is greater than the threshold th3, the timeout for the entire process has elapsed, so the process returns to step ST31, the first process.
  • As described above, by appropriately setting the operation mode of the wristband-type electronic device 1, the power consumed by the control unit 11 and the imaging element 8 can be suppressed. Further, the mode transitions can be performed without any operation on an input device.
  • Depending on the application, a matching process using a low-resolution image may be sufficient.
  • For example, when the settlement amount is small, the processing according to mode 1 is performed, and the matching process uses a low-resolution image.
  • When the settlement amount is large, for example exceeding 1,000 yen, high security is required. Therefore, the processing according to mode 2 is performed, and the matching process uses a high-resolution image.
  • In this way, the trigger Q, which is the condition for switching from mode 1 to mode 2, may be a condition according to the content of the application.
  • The content of the trigger Q, the condition for switching from mode 1 to mode 2, may also be switched dynamically.
  • For example, the control unit 11 acquires the remaining battery capacity (SoC: State of Charge) of the wristband type electronic device 1.
  • Depending on the acquired remaining capacity, the content of the trigger Q is switched and made stricter, which makes the transition of the operation mode from mode 1 to mode 2 more difficult.
  • The content of the trigger Q may also be set to a combination of the individual examples of the trigger Q described above; one possible realization of the dynamic switching is sketched below.
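  • One way to realize such dynamic switching is to map the remaining battery capacity to the strictness of the trigger Q. The sketch below assumes the trigger is a judgement on how much of the image is covered by a fingerprint-like pattern; the parameter names and all threshold values are illustrative.

    def trigger_q_established(fingerprint_area: float, soc_percent: float) -> bool:
        """Decide whether to move from mode 1 to mode 2.

        fingerprint_area: fraction of the image judged to contain ridge lines (0.0 to 1.0)
        soc_percent: remaining battery capacity (State of Charge) in percent
        """
        if soc_percent < 20.0:
            # Low battery: use a strict condition so that the power-hungry
            # mode 2 is entered less easily.
            return fingerprint_area >= 0.6
        # Otherwise use a lenient condition.
        return fingerprint_area >= 0.3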
  • A configuration may be adopted in which a separate control unit (a second control unit 11A) executes the processes related to modes 0 and 1. When the trigger Q is established, the second control unit 11A performs a notification process to the control unit 11, which is the higher-level host, and the control unit 11 then performs the processes related to mode 2, such as the matching process.
  • The control unit 11, being the higher-level host, controls the various processes of the wristband type electronic device 1 and thus consumes a large amount of power. Activating the control unit 11 every time an image is obtained via the image sensor 8 (in other words, every time something touches the display 4) could therefore increase the overall power consumption. Accordingly, it is preferable to provide the second control unit 11A as a lower-level control unit that executes mode 0 and mode 1; this division of labor is illustrated below.
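  • The division of labor might then look like the following event flow, with Python threading primitives standing in for the actual inter-processor notification; every name here is illustrative.

    import queue

    wake_events: "queue.Queue[object]" = queue.Queue()

    def second_control_unit_11a(frames, looks_like_biological_info):
        """Low-power controller: runs modes 0 and 1 and only notifies the host."""
        for frame in frames:                       # low-rate images from the sensor path
            if looks_like_biological_info(frame):  # trigger Q established in mode 1
                wake_events.put(frame)             # notification process to the host

    def control_unit_11(run_mode2_matching):
        """Higher-level host: sleeps until notified, then runs the mode 2 matching."""
        while True:
            frame = wake_events.get()              # blocks without busy-waiting
            run_mode2_matching(frame)              # power-hungry work happens only now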
  • The threshold for establishing fingerprint authentication may be changed according to the content of the application. For example, when fingerprint authentication is performed to enable a high-value payment, the standard for image quality may be raised, or the threshold for the matching score may be made stricter, as in the sketch below.
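  • Tightening the criteria per application could be as simple as selecting a stricter matching-score threshold; the values below are illustrative, not taken from the embodiment.

    BASE_SCORE_THRESHOLD = 0.80   # everyday unlocking (assumed value)

    def matching_score_threshold(application: str) -> float:
        """Return the score above which fingerprint authentication is established."""
        if application == "high_value_payment":
            # Raise the bar: fewer false accepts at the cost of more retries.
            return 0.95
        return BASE_SCORE_THRESHOLD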
  • The configuration of the wristband type electronic device 1 according to the above-described embodiment can be changed as appropriate.
  • For example, a configuration without the light guide plate 5 and the light emitting unit 6 may be employed.
  • In that case, imaging is performed using light from the display 4 (specifically, an OLED).
  • The biological information is not limited to a fingerprint; it may be the blood vessels of a palm, the capillary blood vessels of a retina, or a combination thereof.
  • The fingerprint need not be the pattern formed by the fingerprint ridge lines of the entire fingertip; a part of it suffices. The same applies to the other kinds of biological information.
  • The present disclosure can also be realized by an apparatus, a method, a program, a system, and the like.
  • For example, a program that implements the functions described in the above embodiment can be made downloadable, and a device that lacks those functions can download and install the program, thereby becoming able to perform the control described in the embodiment.
  • The present disclosure can also be realized by a server that distributes such a program.
  • The matters described in each of the embodiments and the modified examples can be combined as appropriate.
  • The present disclosure can also adopt the following configurations.
  • An information processing device including at least a control unit that selectively sets a first mode and a second mode in which a process that consumes more power than the first mode is performed, wherein the control unit, in the first mode, determines whether biological information is included in an image obtained through the sensor unit, changes the operation mode from the first mode to the second mode with the inclusion of the biological information in the image as a trigger, and, in the second mode, performs a matching process using at least the biological information.
  • The information processing device according to (2), wherein the control unit causes a light emitting unit, which emits light at the timing at which an image is captured by the sensor unit, to emit light at a first luminance in the first mode, and causes the light emitting unit to emit light at a second luminance higher than the first luminance in the second mode.
  • The information processing device wherein, in the first mode, the control unit performs control to acquire the image at a first resolution, and, in the second mode, performs control to acquire the image at a second resolution higher than the first resolution.
  • The information processing apparatus according to any one of (2) to (4), wherein, in the first mode, the control unit performs control to acquire the image using a part of the sensor unit, and, in the second mode, performs control to acquire the image using the entire sensor unit.
  • The information processing device wherein the operation mode can be shifted from a third mode, whose power consumption is lower than that of the first mode, to the first mode, and the control unit changes the operation mode from the third mode to the first mode using, as a trigger, at least one of a case where a movement of the information processing apparatus is detected and a case where a predetermined operation is detected.
  • (7) The information processing device according to (6), wherein, in the third mode, the control unit turns off the light emitting unit and the sensor unit.
  • The information processing device according to (6) or (7), wherein a touch sensor unit that detects the predetermined operation is provided near the sensor unit.
  • The information processing apparatus according to any one of (3) and (6) to (8), including the light emitting unit.
  • The information processing apparatus according to any one of (1) to (9), wherein the content of the trigger is changed so that the transition from the first mode to the second mode becomes more difficult.
  • The information processing apparatus according to any one of (1) to (10), wherein the control unit includes a feature point detection unit that detects feature points from an image including biological information obtained via the sensor unit, and a feature amount extraction unit that extracts a feature amount characterizing each feature point based on a peripheral image including that feature point.
  • The information processing apparatus according to any one of (1) to (11), wherein the biological information is at least one of a fingerprint and a blood vessel.
  • The information processing apparatus wherein the processing according to the first mode is performed by another control unit different from the control unit.
  • A wearable device including: a control unit that selectively sets at least a first mode and a second mode in which a process that consumes more power than the first mode is performed; and a sensor unit that acquires an image, wherein the control unit, in the first mode, determines whether or not biological information is included in an image obtained through the sensor unit, changes the operation mode from the first mode to the second mode with the inclusion of the biological information in the image as a trigger, and, in the second mode, performs a matching process using at least the biological information.
  • An information processing method in which a control unit selectively sets at least a first mode and a second mode in which a process that consumes more power than the first mode is performed, and the control unit, in the first mode, determines whether biological information is included in an image obtained through the sensor unit, changes the operation mode from the first mode to the second mode with the inclusion of the biological information in the image as a trigger, and, in the second mode, performs a matching process using at least the biological information.
  • A program for causing a computer to execute an information processing method in which a control unit selectively sets at least a first mode and a second mode in which a process that consumes more power than the first mode is performed, and the control unit, in the first mode, determines whether biological information is included in an image obtained through the sensor unit, changes the operation mode from the first mode to the second mode, triggered by the biological information being included in the image, and, in the second mode, performs a matching process using at least the biological information.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Input (AREA)

Abstract

The invention concerns an information processing device comprising a control unit for selectively setting at least a first mode and a second mode in which a process consuming a larger amount of power than the first mode is performed. In the first mode, the control unit determines whether biological information is included in an image obtained via a sensor unit. The inclusion of biological information in the image serves as a trigger to change the operation mode from the first mode to the second mode. In the second mode, the control unit performs a matching process that uses at least the biological information.
PCT/JP2019/018523 2018-06-19 2019-05-09 Information processing device, wearable device, information processing method, and program WO2019244496A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-115759 2018-06-19
JP2018115759A JP2019219833A (ja) Information processing device, wearable device, information processing method, and program

Publications (1)

Publication Number Publication Date
WO2019244496A1 true WO2019244496A1 (fr) 2019-12-26

Family

ID=68983179

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/018523 WO2019244496A1 (fr) 2018-06-19 2019-05-09 Information processing device, wearable device, information processing method, and program

Country Status (2)

Country Link
JP (1) JP2019219833A (fr)
WO (1) WO2019244496A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021152925A1 * 2020-01-30 2021-08-05 Murata Manufacturing Co., Ltd. Biometric information measurement device and biometric information measurement system
WO2022178431A1 * 2021-02-22 2022-08-25 Hoffmann Christopher J Motion tracking light controller

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004318892A * 2003-04-18 2004-11-11 Agilent Technol Inc Time-space multiplexing system and method for finger image input applications
JP2012248047A * 2011-05-30 2012-12-13 Seiko Epson Corp Biometric identification device and biometric identification method
JP2017509062A * 2014-02-21 2017-03-30 Fingerprint Cards AB Method of controlling an electronic device
JP2017084045A * 2015-10-27 2017-05-18 Kyocera Corporation Electronic device, authentication method for electronic device, and authentication program
WO2017132258A1 * 2016-01-29 2017-08-03 Synaptics Incorporated Initiating fingerprint capture with a touch screen

Also Published As

Publication number Publication date
JP2019219833A (ja) 2019-12-26

Similar Documents

Publication Publication Date Title
EP3523754B1 (fr) Face liveness detection method and apparatus, and electronic device
US10860850B2 (en) Method of recognition based on iris recognition and electronic device supporting the same
KR100947990B1 (ko) Gaze tracking apparatus and method using difference image entropy
US9750420B1 (en) Facial feature selection for heart rate detection
US11163995B2 (en) User recognition and gaze tracking in a video system
US10928904B1 (en) User recognition and gaze tracking in a video system
KR102544608B1 (ko) Method for performing authentication related to biometric information, based on the state of an image including the biometric information acquired using a biometric sensor, and electronic device implementing the same
US11275458B2 (en) Method, electronic device, and storage medium for fingerprint recognition
US9785863B2 (en) Fingerprint authentication
US11335090B2 (en) Electronic device and method for providing function by using corneal image in electronic device
KR102544320B1 (ko) Electronic device and control method of electronic device
WO2019173011A1 (fr) Electronic device comprising a contactless palm biometric sensor and related methods
WO2019244496A1 (fr) Information processing device, wearable device, information processing method, and program
US20190278970A1 (en) Detection device, information processing device, and information processing method
KR20180137830A (ko) Finger pressure recognition device and electronic apparatus including the same
WO2019244497A1 (fr) Information processing device, wearable device, information processing method, and program
KR20190088679A (ko) Electronic device and method for determining a fingerprint processing method based on the pressure level of a fingerprint input
EP4398136A1 (fr) Electronic device for controlling an operation on the basis of biometric signals, and operating method of the electronic device
US20230074386A1 (en) Method and apparatus for performing identity recognition on to-be-recognized object, device and medium
CN115829575A (zh) Payment verification method and apparatus, terminal, server, and storage medium
JP7228509B2 (ja) Identification device and electronic apparatus
EP4362481A1 (fr) Method for displaying a guide for the position of a camera, and electronic device
US11460928B2 (en) Electronic device for recognizing gesture of user from sensor signal of user and method for recognizing gesture using the same
KR20190027704A (ko) Electronic device and fingerprint recognition method of electronic device
US11899884B2 (en) Electronic device and method of recognizing a force touch, by electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19822335

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19822335

Country of ref document: EP

Kind code of ref document: A1