EP4398136A1 - Electronic device for controlling an operation based on biometric signals and operating method therefor - Google Patents

Electronic device for controlling an operation based on biometric signals and operating method therefor

Info

Publication number
EP4398136A1
EP4398136A1 (application EP23771004.1A)
Authority
EP
European Patent Office
Prior art keywords
region
electronic device
user
face
biosignal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP23771004.1A
Other languages
English (en)
French (fr)
Other versions
EP4398136A4 (de)
Inventor
Yevhenii YAKISHYN
Mykhailo Zlotnyk
Oleksandr SHCHUR
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020220031587A (published as KR20230134348A)
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of EP4398136A1
Publication of EP4398136A4
Legal status: Pending

Classifications

    • A61B 5/004 Features or image-related aspects of imaging apparatus adapted for image acquisition of a particular organ or body part
    • A61B 5/02416 Measuring pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A61B 5/1455 Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters
    • A61B 5/14551 Measuring characteristics of blood in vivo using optical sensors for measuring blood gases
    • A61B 5/165 Evaluating the state of mind, e.g. depression, anxiety
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/013 Eye tracking input arrangements
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06V 40/70 Multimodal biometrics, e.g. combining information from different biometric modalities
    • G06T 2207/10016 Video; Image sequence (indexing scheme for image analysis)
    • G06T 2207/30196 Human being; Person (subject of image)

Definitions

  • the disclosure relates to an electronic device for controlling a biosignal-based operation and an operation method thereof. More particularly, the disclosure relates to an electronic device for controlling a biosignal-based operation and an operation method thereof, which enable management of an access right for controlling the operation of the electronic device by a user using a multimodal interface.
  • various input methods such as voice input may be supported.
  • the electronic device may recognize the user's voice while a voice recognition service is executed, and may execute an operation corresponding to the voice input or provide a search result.
  • artificial intelligence devices equipped with displays and cameras evolve into multimodal devices capable of performing various input/output operations as well as voice input/output, services that provide improved user experiences and new user experiences are emerging.
  • Biometrics is a form of security authentication that performs user authentication using unique physical characteristics, such as a user's fingerprint, face, or blood vessels. Biometric features carry a low risk of theft or imitation, and their ease of use is high.
  • the electronic device may use a method of fusing and analyzing the information input in each modality. In such a multimodal setting, there is a need to identify users to provide secure access to the electronic device.
  • FIG. 1 is a block diagram illustrating an electronic device in a network environment according to an embodiment of the disclosure.
  • an electronic device 101 in a network environment 100 may communicate with an external electronic device 102 via a first network 198 (e.g., a short-range wireless communication network), or at least one of an external electronic device 104 or a server 108 via a second network 199 (e.g., a long-range wireless communication network).
  • the electronic device 101 may communicate with the external electronic device 104 via the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connecting terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module (SIM) 196, or an antenna module 197.
  • at least one of the components (e.g., the connecting terminal 178) may be omitted from the electronic device 101, or one or more other components may be added in the electronic device 101.
  • some of the components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be implemented as a single component (e.g., the display module 160).
  • the processor 120 may include a main processor 121 (e.g., a central processing unit (CPU) or an application processor (AP)), or an auxiliary processor 123 (e.g., a graphics processing unit (GPU), a neural processing unit (NPU), an image signal processor (ISP), a sensor hub processor, or a communication processor (CP)) that is operable independently from, or in conjunction with, the main processor 121.
  • the auxiliary processor 123 may be adapted to consume less power than the main processor 121, or to be specific to a specified function.
  • the auxiliary processor 123 may be implemented as separate from, or as part of the main processor 121.
  • the auxiliary processor 123 may include a hardware structure specified for artificial intelligence model processing.
  • An artificial intelligence model may be generated by machine learning. Such learning may be performed, e.g., by the electronic device 101 where the artificial intelligence model is performed or via a separate server (e.g., the server 108). Learning algorithms may include, but are not limited to, e.g., supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • the artificial neural network may be a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), deep Q-network or a combination of two or more thereof but is not limited thereto.
  • the artificial intelligence model may, additionally or alternatively, include a software structure other than the hardware structure.
  • the communication module 190 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and the external electronic device (e.g., the external electronic device 102, the external electronic device 104, or the server 108) and performing communication via the established communication channel.
  • the communication module 190 may include one or more communication processors that are operable independently from the processor 120 (e.g., the application processor (AP)) and support a direct (e.g., wired) communication or a wireless communication.
  • the wireless communication module 192 may support various technologies for securing performance on a high-frequency band, such as, e.g., beamforming, massive multiple-input and multiple-output (massive MIMO), full dimensional MIMO (FD-MIMO), array antenna, analog beam-forming, or large scale antenna.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101, an external electronic device (e.g., the external electronic device 104), or a network system (e.g., the second network 199).
  • the wireless communication module 192 may support a peak data rate (e.g., 20 gigabits per second (Gbps) or more) for implementing eMBB, loss coverage (e.g., 164 decibels (dB) or less) for implementing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for implementing URLLC.
  • the antenna module 197 may transmit or receive a signal or power to or from the outside (e.g., the external electronic device) of the electronic device 101.
  • the antenna module 197 may include an antenna including a radiating element including a conductive material or a conductive pattern formed in or on a substrate (e.g., a printed circuit board (PCB)).
  • the antenna module 197 may include a plurality of antennas (e.g., array antennas). In such a case, at least one antenna appropriate for a communication scheme used in the communication network, such as the first network 198 or the second network 199, may be selected, for example, by the communication module 190 from the plurality of antennas.
  • FIG. 2 is a perspective view illustrating an electronic device according to an embodiment of the disclosure.
  • the display 212 may be disposed inside the front cover 211 to correspond to the front cover 211.
  • the display 212 may include a touch screen, and may receive a touch, gesture, proximity, or hovering input using, for example, an electronic pen or a portion of the user's body.
  • the implementation of the electronic device 200 in the form of a robot is merely exemplary, and the form of implementation is not limited thereto.
  • the electronic device 200 may be implemented as a standalone type that is formed as a single robot object.
  • the electronic device 200 may be implemented as a docking station type for fixing a tablet PC or a smart phone.
  • the electronic device 200 may be classified into a fixed/movable type according to whether the electronic device 200 is mobile.
  • an electronic device 300 may include a processor 320 (e.g., the processor 120 of FIG. 1 ), a memory 330 (e.g., the memory 130 of FIG. 1 ), a display 360 (e.g., the display module 160 of FIG. 1 ), a camera 380 (e.g., the camera module 180 of FIG. 1 ), and a communication interface 390 (e.g., the communication module 190 of FIG. 1 ).
  • the electronic device 300 may be implemented with more or fewer components than those shown in FIG. 3 .
  • the user may perform a specific operation (or gesture) other than the voice to execute the predefined function of the electronic device 300, and a reference signal pattern of a biosignal generated by the user's gesture may be registered through a separate registration process.
  • the registration process of the reference signal pattern may include a process of determining a function type of the electronic device 300 that the user wants to control and registering the user's signal pattern corresponding to the determined function type.
  • the processor 320 may acquire a captured image (e.g., a video) through the camera 380. For example, the processor 320 may sequentially acquire a plurality of image frames through the camera 380.
  • the processor 320 may obtain a result of matching a pair of the face and the hand by comparing a correlation relationship between the biosignal for the face region and the biosignal for the hand region. For example, the processor 320 may compare a pattern of the biosignal in the face region with a pattern of the biosignal in the hand region, and may match a pair of the face region and the hand region based on the comparison result. In the case of the same user, the outputs of the biosignal in the face region and the biosignal in the hand region may have similar characteristics. Accordingly, when the pattern of the biosignal in the face region and the pattern of the biosignal in the hand region are similar within a threshold range, the processor 320 may match the face region and the hand region as a pair.
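The pairing step described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the function name, the dictionary-based inputs, the use of Pearson correlation as the similarity measure, and the threshold value are all assumptions.

```python
import numpy as np

def match_face_hand_pairs(face_signals, hand_signals, threshold=0.7):
    """Pair each face-region biosignal with the most similar hand-region
    biosignal, using Pearson correlation as the pattern-similarity measure.

    face_signals, hand_signals: dicts mapping region id -> 1-D signal array.
    Returns a list of (face_id, hand_id) pairs whose correlation exceeds
    the threshold; each hand region is matched at most once.
    """
    pairs = []
    used_hands = set()
    for face_id, f in face_signals.items():
        best_hand, best_corr = None, threshold
        for hand_id, h in hand_signals.items():
            if hand_id in used_hands:
                continue
            # Pearson correlation between the two biosignal patterns
            corr = np.corrcoef(f, h)[0, 1]
            if corr > best_corr:
                best_hand, best_corr = hand_id, corr
        if best_hand is not None:
            pairs.append((face_id, best_hand))
            used_hands.add(best_hand)
    return pairs
```

Signals from the same user should correlate strongly, so the face region pairs with the hand whose rPPG trace it best matches, while an unrelated hand region falls below the threshold.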
  • the processor 320 may identify a lip region within the face region, and may identify a speaking user in the captured image based on the biosignal in the lip region.
  • the processor 320 may identify a gesture of the speaking user and perform an operation designated in association with the speaking user's gesture among operations designated for each gesture.
  • the user's gesture may be, for example, a hand gesture, such as raising a palm or pointing a finger.
  • a change in the biosignal that appears when the user performs a specific action may be output from the hand region.
  • the change in the biosignal may be used to recognize a user's motion.
  • the operation of the electronic device 300 may be controlled by the identified user's hand.
  • the processor 320 may identify the user's motion (or gesture) in the captured image based on the biosignal in the hand region, and may identify a command for controlling the operation of the electronic device 300 corresponding to the user's motion. Accordingly, the electronic device 300 may be accessed by the authenticated user based on the biosignal for the pair of face and hand.
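A gesture-to-command lookup of the kind described above might be organized as below. This is a hypothetical sketch: the gesture names, command strings, and the authentication gate are illustrative assumptions, not the patent's defined mappings.

```python
# Hypothetical mapping from recognized gestures of the matched user to
# device-control commands; names on both sides are illustrative only.
GESTURE_COMMANDS = {
    "raise_palm": "pause_playback",
    "point_finger": "select_item",
    "thumbs_up": "confirm",
}

def command_for_gesture(gesture, user_authenticated):
    """Return the command designated for a gesture, but only when the
    gesture comes from a user already authenticated via the matched
    face-hand biosignal pair."""
    if not user_authenticated:
        return None  # unauthenticated users cannot control the device
    return GESTURE_COMMANDS.get(gesture)
```

Gating the lookup on the authentication result mirrors the idea that only the user identified through the face-hand biosignal pair may control the device.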
  • the processor 320 may detect the user's face from the captured image (or an image frame) based on a face recognition method, and may recognize the detected face. According to an embodiment of the disclosure, the processor 320 may detect the face based on at least one image acquired through the camera 380 and perform face tracking, and may perform face recognition while performing face tracking. For example, face recognition may identify the user of the detected face. For example, the processor 320 may identify the user of the detected face using at least one piece of user face data stored in the memory 330.
  • the biosignal may include a photo-plethysmography (PPG) signal obtained through a non-contact photoplethysmography method.
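One common way a non-contact PPG (rPPG) trace and a heart-rate estimate can be derived from camera frames is sketched below, assuming a simple mean-green-channel method with a dominant-frequency search. The function name, the 0.7-4 Hz band limits, and the FFT-based peak search are illustrative assumptions, not the patent's method.

```python
import numpy as np

def estimate_heart_rate(frames, fps=30.0):
    """Estimate heart rate (bpm) from a stack of ROI frames via a simple
    remote-PPG sketch: spatially averaged green-channel intensity per
    frame, mean-removed, then a dominant-frequency search restricted to
    the physiologically plausible 0.7-4 Hz band (42-240 bpm).

    frames: array of shape (n_frames, H, W, 3), RGB.
    """
    # Spatial mean of the green channel forms the raw rPPG trace
    trace = frames[:, :, :, 1].mean(axis=(1, 2))
    trace = trace - trace.mean()  # remove the DC component
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0
```

The green channel is commonly used in rPPG work because hemoglobin absorption makes the pulse modulation strongest there.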
  • the at least one processor may be configured to match the face region and the hand region as a pair.
  • the at least one processor when capturing a plurality of users through the camera, may be configured to identify a lip region within the at least one face region, and to identify a speaking user in the image based on a biosignal in the lip region.
  • an operating method may include operations 405 to 425.
  • Each step/operation of the operating method of FIG. 4 may be performed by an electronic device (e.g., the electronic device 101 of FIG. 1 , the electronic device 200 of FIG. 2 , or the electronic device 300 of FIG. 3 ) or at least one processor (e.g., the processor 120 of FIG. 1 or the processor 320 of FIG. 3 ) of the electronic device.
  • at least one of operations 405 to 425 may be omitted, the order of some operations may be changed, or another operation may be added.
  • the electronic device 300 may extract at least one face region and at least one hand region from an image captured by the camera 380.
  • the extracting of the at least one face region and the at least one hand region may include configuring a region of interest (ROI) in the image captured by the camera 380, and extracting the at least one face region and the at least one hand region within the ROI.
  • the electronic device 300 may acquire biosignals from each of the extracted at least one face region and hand region.
  • the biosignal may include a photo-plethysmography (PPG) signal obtained through a non-contact photoplethysmography method.
  • the method may further include acquiring at least one of the heart rate and the oxygen saturation based on the PPG signal.
  • at least one of the heart rate and the oxygen saturation may be acquired based on the user's biosignals, and the acquired biometric data, such as the heart rate and the oxygen saturation, may be used to determine the service support corresponding to the user's current state.
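An oxygen-saturation estimate of the kind mentioned above is often derived from the classic ratio-of-ratios of red and infrared PPG traces. The sketch below is illustrative only: the linear calibration SpO2 = a - b*R with generic textbook coefficients (a=110, b=25) is an assumption, and real devices use empirically calibrated curves.

```python
import numpy as np

def estimate_spo2(red, infrared, a=110.0, b=25.0):
    """Sketch of the ratio-of-ratios SpO2 estimate from red and infrared
    PPG traces. R is the ratio of each trace's AC/DC component; the
    linear calibration coefficients are generic textbook assumptions."""
    def ac_dc_ratio(sig):
        sig = np.asarray(sig, dtype=float)
        # AC amplitude (peak-to-peak) over DC level (mean)
        return (sig.max() - sig.min()) / sig.mean()

    r = ac_dc_ratio(red) / ac_dc_ratio(infrared)
    return a - b * r
```

With a contactless camera, the "red" and "infrared" traces would come from different color channels or illumination bands; only the ratio computation is shown here.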
  • the electronic device 300 may match the face region and the hand region in the image using the acquired biosignals.
  • the matching of the face region and the hand region in the image may include comparing a pattern of the biosignal in the at least one face region with a pattern of the biosignal in the at least one hand region, and matching a pair of face region and hand region among the at least one face region and the at least one hand region based on the comparison result.
  • the matching of the pair of face region and hand region may include matching the face region and the hand region as a pair.
  • the electronic device 300 may identify a user (or authenticate a user) by performing face recognition on the image while acquiring the biosignal.
  • the method may further include, after the identifying of the user by performing face recognition, associating the user identification result with the biosignal in the at least one face region.
  • the electronic device 300 may identify the user in the image by performing face recognition using pre-stored face data.
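A face-identification lookup against pre-stored face data could be sketched as a nearest-embedding search. This is an assumption-laden illustration: how the embeddings are produced (e.g., by a face-recognition network) is outside the sketch, and the function name, the cosine-similarity measure, and the threshold are all hypothetical.

```python
import numpy as np

def identify_user(face_embedding, enrolled, threshold=0.6):
    """Identify a user by comparing a query face embedding against
    pre-stored embeddings using cosine similarity.

    enrolled: dict mapping user id -> stored 1-D embedding array.
    Returns the best-matching user id, or None if no stored face clears
    the similarity threshold.
    """
    q = np.asarray(face_embedding, dtype=float)
    q = q / np.linalg.norm(q)
    best_user, best_sim = None, threshold
    for user_id, emb in enrolled.items():
        e = np.asarray(emb, dtype=float)
        sim = float(q @ (e / np.linalg.norm(e)))  # cosine similarity
        if sim > best_sim:
            best_user, best_sim = user_id, sim
    return best_user
```

Returning None for a below-threshold query models the rejection of unenrolled faces, so that only recognized users can be associated with a biosignal.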
  • FIG. 8 is a diagram illustrating a method of extracting an rPPG signal for a hand according to an embodiment of the disclosure.
  • The term "module" may include a unit implemented in hardware, software, or firmware, and may interchangeably be used with other terms, for example, "logic", "logic block", "part", or "circuitry".
  • a module may be a single integral component, or a minimum unit or part thereof, adapted to perform one or more functions.
  • the module may be implemented in a form of an application-specific integrated circuit (ASIC).

EP23771004.1A 2022-03-14 2023-03-02 Electronic device for controlling an operation based on biometric signals and operating method therefor Pending EP4398136A4 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020220031587A KR20230134348A (ko) 2022-03-14 2022-03-14 Electronic device for bio-signal-based operation control and operating method therefor
PCT/KR2023/002881 WO2023177125A1 (ko) 2022-03-14 2023-03-02 Electronic device for bio-signal-based operation control and operating method therefor

Publications (2)

Publication Number Publication Date
EP4398136A1 true EP4398136A1 (de) 2024-07-10
EP4398136A4 EP4398136A4 (de) 2024-07-24

Family

ID=87932827

Family Applications (1)

Application Number Title Priority Date Filing Date
EP23771004.1A 2022-03-14 2023-03-02 Electronic device for controlling an operation based on biometric signals and operating method therefor Pending EP4398136A4 (de)

Country Status (3)

Country Link
US (1) US12495982B2 (de)
EP (1) EP4398136A4 (de)
CN (1) CN118786427A (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230320667A1 (en) * 2022-04-07 2023-10-12 Faceheart Inc Corporation Contactless physiological measurement device and method
US12451220B2 (en) * 2022-11-14 2025-10-21 Qualcomm Incorporated Establishing individual root of trust using biomarkers

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101017936B1 (ko) * 2008-09-18 2011-03-04 Tongmyong University Industry-Academy Cooperation Foundation System for controlling the operation of a display device based on recognition of a user's gesture information
TW201315438A (zh) 2011-10-14 Industrial Technology Research Institute Non-contact cardiac pulse measurement method and system thereof
JP5822651B2 (ja) * 2011-10-26 2015-11-24 Sony Computer Entertainment Inc. Individual discrimination device and individual discrimination method
US9294475B2 (en) 2013-05-13 2016-03-22 Hoyos Labs Ip, Ltd. System and method for generating a biometric identifier
US9406295B2 (en) * 2013-11-22 2016-08-02 Intel Corporation Apparatus and method for voice based user enrollment with video assistance
KR101561817B1 (ko) 2013-12-05 2015-11-11 Suprema Inc. Biometric authentication apparatus and method using face and hand recognition
JP6683367B2 (ja) * 2015-03-30 2020-04-22 Tohoku University Biological information measurement device, biological information measurement method, and biological information measurement program
JP6308161B2 (ja) 2015-03-31 2018-04-11 Equos Research Co., Ltd. Pulse wave detection device and pulse wave detection program
KR102407564B1 (ko) * 2017-08-01 2022-06-13 Samsung Electronics Co., Ltd. Electronic device for determining biometric information and operation method thereof
KR101882281B1 (ko) 2017-09-15 2018-08-24 LG Electronics Inc. Digital device and biometric authentication method therefor
US10322728B1 (en) * 2018-02-22 2019-06-18 Futurewei Technologies, Inc. Method for distress and road rage detection
KR102823061B1 (ko) 2019-11-01 2025-06-23 Samsung Electronics Co., Ltd. Electronic device for recognizing a user's gesture using a plurality of sensor signals
KR102273903B1 (ko) 2019-11-21 2021-07-06 GBSoft Inc. Non-contact biometric index measurement method
US11809536B2 (en) 2020-03-06 2023-11-07 Kyndryl, Inc. Headphone biometric authentication
CN111797735A (zh) * 2020-06-22 2020-10-20 Shenzhen OneConnect Smart Technology Co., Ltd. Face video recognition method, apparatus, device, and storage medium
CN115210781A (zh) * 2021-01-26 2022-10-18 BOE Technology Group Co., Ltd. Control method, electronic device, and storage medium

Also Published As

Publication number Publication date
US20230284920A1 (en) 2023-09-14
US12495982B2 (en) 2025-12-16
CN118786427A (zh) 2024-10-15
EP4398136A4 (de) 2024-07-24

Similar Documents

Publication Publication Date Title
US10942995B2 (en) Method for obtaining biometric information using light source corresponding to biometric information and electronic device thereof
US10936709B2 (en) Electronic device and method for controlling the same
US10860850B2 (en) Method of recognition based on iris recognition and electronic device supporting the same
CN113515987B (zh) Palm print recognition method, apparatus, computer device, and storage medium
US9607138B1 (en) User authentication and verification through video analysis
CN105654952A (zh) Electronic device, server, and method for outputting speech
CN110287918B (zh) Living-body recognition method and related product
US20210132699A1 (en) Electronic device for recognizing gesture of user using a plurality of sensor signals
US12495982B2 (en) Electronic device for controlling operation based on a bio-signal and operating method thereof
US11275458B2 (en) Method, electronic device, and storage medium for fingerprint recognition
US12482469B2 (en) Electronic device and method for processing speech by classifying speech target
US20160274670A1 (en) Gesture input apparatus, gesture input method, and program for wearable terminal
US12123768B2 (en) Method for recognizing object by using millimeter wave and electronic device supporting same method
KR20210048930A (ko) Apparatus and method for verifying a user using electrocardiogram data
KR20230134348A (ko) Electronic device for bio-signal-based operation control and operating method therefor
KR20220028559A (ko) Method and apparatus for determining a fake fingerprint based on photoacoustic characteristics
EP4546294A1 (de) Elektronische vorrichtung mit fingerabdrucksensor und betriebsverfahren dafür
US20250218174A1 (en) Electronic device and method for providing notification information
US20250139214A1 (en) Method for performing authentication using fingerprint sensor and electronic device supporting the same
EP4372521A1 (de) Tragbare elektronische vorrichtung und verfahren, durch das eine tragbare elektronische vorrichtung bürstenzahninformationen liefert
EP4645140A1 (de) Elektronische vorrichtung zur verschlüsselung biometrischer informationen und betriebsverfahren dafür
US20230172483A1 (en) Wearable electronic device and method of operating the same
KR20250170901A (ko) Method for performing authentication using biometric information and electronic device supporting the same
US20240197195A1 (en) Electronic device for measuring biometrics, and operation method therefor
US20250130638A1 (en) Method for determining user's gaze and electronic device therefor

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20240404

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR

A4 Supplementary search report drawn up and despatched

Effective date: 20240620

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 5/1455 20060101ALI20240614BHEP

Ipc: A61B 5/024 20060101ALI20240614BHEP

Ipc: G06F 3/01 20060101ALI20240614BHEP

Ipc: G06V 40/70 20220101ALI20240614BHEP

Ipc: G06V 40/16 20220101ALI20240614BHEP

Ipc: G06F 21/32 20130101AFI20240614BHEP

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20260206