US10051177B2 - Method for control of camera module based on physiological signal - Google Patents


Info

Publication number
US10051177B2
Authority
US
United States
Prior art keywords
image
sensor
module
communication device
user input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/718,937
Other versions
US20180020155A1 (en)
Inventor
Namjin Kim
Sora HYUN
Hyunho SEO
Changhyun CHUN
Yunemo KOO
Gaeyoun Kim
Geonsoo KIM
Heedeog Kim
Sunghyuk SHIN
Kihuk LEE
Chulhwan Lee
Cheolho CHEONG
Seungmin CHOI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US15/718,937 (US10051177B2)
Publication of US20180020155A1
Priority to US16/033,906 (US10341554B2)
Application granted
Publication of US10051177B2
Status: Active

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6898 Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • H04N5/23216
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02416 Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02438 Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/117 Identification of persons
    • A61B5/1171 Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B5/1172 Identification of persons based on the shapes or appearances of their bodies or parts thereof using fingerprinting
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/145 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B5/1455 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
    • A61B5/14551 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/169 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06K9/00013
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N5/232
    • H04N5/23212
    • H04N5/23293
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/64 Circuits for processing colour signals
    • H04N9/73 Colour balance circuits, e.g. white balance circuits or colour temperature control
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/02405 Determining heart rate variability

Definitions

  • the present invention relates generally to a method for controlling a camera module, and more particularly to a method for controlling a camera module based on a physiological signal.
  • Various electronic devices include a camera module, and conventionally acquire an image either by pressing a photography button mounted on a body of the electronic device or by touching a photography button displayed on a display.
  • an operation of manipulating a mechanical button or a button image on a display may influence the quality of the acquired image.
  • the present invention has been made to address at least the problems and disadvantages described above and to provide at least the advantages described below.
  • an aspect of the present invention is to provide a method of image acquisition through a camera module of an electronic device without manipulating a mechanical button or a virtual button displayed on a display of the electronic device.
  • another aspect of the present invention is to provide a method of controlling a camera module of an electronic device by using physiological information.
  • another aspect of the present invention is to allow a user to control a camera module of an electronic device by using a sensor module.
  • a user of an electronic device can change a setting of the camera module or instruct the camera module to acquire an image based on a motion of a finger on the sensor module.
  • another aspect of the present invention is to acquire, through a camera module of the electronic device, an image which can be processed based on physiological information or emotional information received by the sensor module.
  • a method includes executing a camera application at an electronic device; activating a heart rate monitor (HRM) sensor operatively coupled with a first surface of the electronic device in response to the execution of the camera application; receiving an input signal via the HRM sensor; and capturing, using a processor operatively coupled with the electronic device, an image via an image sensor operatively coupled with a second surface of the electronic device based at least in part on the input signal.
  • In accordance with another aspect of the present invention, an apparatus includes an image sensor operatively coupled with a first surface of the apparatus; a heart rate monitor (HRM) sensor coupled with a second surface of the apparatus; and a processor adapted to execute a camera application; activate the HRM sensor in response to the execution of the camera application; receive an input signal via the HRM sensor; and capture an image using the image sensor based at least in part on the input signal.
  • In accordance with another aspect of the present invention, an apparatus includes an image sensor operatively coupled with a first surface of the apparatus; a heart rate monitor (HRM) sensor coupled with a second surface of the apparatus; and a processor adapted to activate the HRM sensor; receive an input signal using the activated HRM sensor; and capture an image using the image sensor based at least in part on the input signal.
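The claimed control flow (execute the camera application, activate the HRM sensor in response, receive an input signal via that sensor, and capture an image based at least in part on the signal) can be sketched as a small state machine. This is an illustrative sketch only, not Samsung's implementation; all class and method names below are hypothetical and do not come from the patent.

```python
# Illustrative sketch of the claimed flow. Launching the camera application
# activates the HRM sensor on one surface of the device; an input signal
# received via that sensor triggers capture on the image sensor on the
# other surface. All names here are hypothetical.

class HrmSensor:
    def __init__(self):
        self.active = False
        self._signal = None

    def activate(self):
        self.active = True

    def inject(self, signal):
        # Stands in for a finger contacting the sensor window
        self._signal = signal

    def read(self):
        return self._signal if self.active else None


class ImageSensor:
    def capture(self):
        return "image_data"


class CameraApplication:
    def __init__(self, hrm, image_sensor):
        self.hrm = hrm
        self.image_sensor = image_sensor

    def launch(self):
        # Activate the HRM sensor in response to executing the camera app
        self.hrm.activate()

    def poll(self):
        # Capture an image based at least in part on the HRM input signal
        signal = self.hrm.read()
        if signal is not None:
            return self.image_sensor.capture()
        return None


hrm = HrmSensor()
app = CameraApplication(hrm, ImageSensor())
app.launch()
assert app.poll() is None          # no input signal yet, so no capture
hrm.inject("finger_contact")
assert app.poll() == "image_data"  # input signal received, image captured
```

The point of the sketch is the ordering of the claimed steps: activation is tied to application launch, and capture is gated on the sensor input rather than on a button press.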
  • FIG. 1 is a block diagram of an electronic device within a network environment, according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of a configuration of an electronic device, according to an embodiment of the present invention.
  • FIG. 3 is a block diagram of a programming module of an electronic device, according to an embodiment of the present invention.
  • FIG. 4 is a block diagram of an electronic device, according to an embodiment of the present invention.
  • FIGS. 5A and 5B are perspective views of an electronic device, according to an embodiment of the present invention.
  • FIGS. 6A to 6C illustrate an operational state of an electronic device, according to an embodiment of the present invention.
  • FIGS. 7A and 7B illustrate an operation of a sensor module of an electronic device, according to an embodiment of the present invention.
  • FIGS. 8A and 8B are graphs illustrating physiological signals detected by an electronic device according to time lapse, according to an embodiment of the present invention.
  • FIG. 9 is a flowchart of an operation for acquiring an image based on an input signal by the electronic device, according to an embodiment of the present invention.
  • FIG. 10 is a flowchart of an operation for acquiring an image according to a contact operation by the electronic device, according to an embodiment of the present invention.
  • FIG. 11 is a flowchart of an operation for acquiring an image according to a separation operation by the electronic device, according to an embodiment of the present invention.
  • FIG. 12 is a flowchart of an operation for controlling the electronic device, according to an embodiment of the present invention.
  • FIGS. 13A and 13B illustrate an elapsing of a particular time, according to an embodiment of the present invention.
  • FIG. 14 illustrates controlling an electronic device based on a contact sequence, according to an embodiment of the present invention.
  • FIGS. 15A and 15B illustrate controlling an electronic device based on a movement direction, according to an embodiment of the present invention.
  • FIG. 16 illustrates controlling an electronic device based on a contact sequence, according to an embodiment of the present invention.
  • FIG. 17 illustrates a biometric sensor coupled to a button of an electronic device, according to an embodiment of the present invention.
  • FIG. 18 illustrates a method of using a biometric sensor coupled to a button of an electronic device, according to an embodiment of the present invention.
  • FIG. 19 is a flowchart of an operation for processing an image by using fingerprint information, according to an embodiment of the present invention.
  • FIG. 20 is a flowchart of an operation for providing a preview image by using heart rate information, according to an embodiment of the present invention.
  • FIG. 21 illustrates a screen displaying a preview image provided using heart rate information, according to an embodiment of the present invention.
  • FIG. 22 is a flowchart of an operation for displaying a user image and heart rate information together, according to an embodiment of the present invention.
  • FIG. 23 illustrates a screen displaying an image together with heart rate information, according to an embodiment of the present invention.
  • FIG. 24 is a flowchart of an operation for displaying an acquired image and emotional information together, according to an embodiment of the present invention.
  • FIG. 25 illustrates a screen displaying an image together with emotional information, according to an embodiment of the present invention.
  • FIG. 26 illustrates a screen displaying an image in which physiological information is correlated with the user, according to an embodiment of the present invention.
  • FIGS. 27A to 27D illustrate a method of controlling an electronic device to photograph a panorama image, according to an embodiment of the present invention.
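FIGS. 8A-8B describe physiological signals plotted against time, and FIGS. 10-11 describe acquiring an image on a contact operation or a separation operation. One plausible way to recover such contact and separation events from a sampled HRM/PPG amplitude stream is simple threshold-based edge detection. This is only an illustrative approach; the threshold value, function name, and sample data below are made up and are not taken from the patent.

```python
def detect_events(samples, threshold=0.5):
    """Detect contact (rising edge) and separation (falling edge) events
    from a sampled HRM/PPG amplitude stream using a fixed threshold.
    Returns a list of (sample_index, event_name) tuples."""
    events = []
    touching = False
    for i, amplitude in enumerate(samples):
        if not touching and amplitude >= threshold:
            touching = True
            events.append((i, "contact"))       # finger placed on the sensor
        elif touching and amplitude < threshold:
            touching = False
            events.append((i, "separation"))    # finger lifted off the sensor
    return events

# Made-up stream: low while the sensor is uncovered, high while touched.
samples = [0.1, 0.2, 0.8, 0.9, 0.7, 0.2, 0.1]
print(detect_events(samples))  # [(2, 'contact'), (5, 'separation')]
```

In the patent's terms, the "contact" event could trigger a setting change or timer, and the "separation" event could trigger the actual capture, so the user never touches a mechanical or on-screen button.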
  • An electronic device may be an apparatus having a communication function.
  • the electronic device according to the present invention may be at least one of, and/or combinations of, a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an electronic book (e-book) reader, a desktop PC, a laptop PC, a netbook computer, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an MP3 player, a mobile medical appliance, an electronic bracelet, an electronic necklace, an electronic accessory, a camera, a wearable device, an electronic clock, a wrist watch, home appliances (e.g., a refrigerator, an air conditioner, a cleaner, an oven, a microwave oven, a washing machine, an air cleaner, etc.), an artificial intelligence robot, a television (TV), a Digital Versatile Disk (DVD) player, an audio player, and various medical appliances (e.g., a Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, etc.).
  • FIG. 1 is a block diagram of an electronic device within a network environment, according to an embodiment of the present invention.
  • the electronic device 101 includes a bus 110 , a processor 120 , a memory 130 , an input/output interface 150 , a display 160 , and a communication interface 170 .
  • the bus 110 is a circuit for interconnecting the elements described above, e.g., the processor 120 , the memory 130 , the input/output interface 150 , the display 160 , and the communication interface 170 , and for allowing a communication, e.g., by transferring a control message, between the elements described above.
  • the processor 120 receives commands from the above-mentioned other elements, e.g. the memory 130 , the input/output interface 150 , the display 160 , and the communication interface 170 , through, for example, the bus 110 , deciphers the received commands, and performs operations and/or data processing according to the deciphered commands.
  • the memory 130 stores commands received from the processor 120 and/or other elements, e.g. the input/output interface 150 , the display 160 , and the communication interface 170 , and/or commands and/or data generated by the processor 120 and/or other elements.
  • the memory 130 includes software and/or programs 140 , such as a kernel 141 , middleware 143 , an Application Programming Interface (API) 145 , and an application 147 .
  • Each of the programming modules described above may be configured by software, firmware, hardware, and/or combinations of two or more thereof.
  • the kernel 141 controls and/or manages system resources, e.g. the bus 110 , the processor 120 or the memory 130 , used for execution of operations and/or functions implemented in other programming modules, such as the middleware 143 , the API 145 , and/or the application 147 . Further, the kernel 141 provides an interface through which the middleware 143 , the API 145 , and/or the application 147 can access and then control and/or manage an individual element of the electronic device 101 .
  • the middleware 143 performs a relay function which allows the API 145 and/or the application 147 to communicate with and exchange data with the kernel 141 . Further, in relation to operation requests received from the application 147 , the middleware 143 performs load balancing by, for example, giving a priority in using a system resource, e.g., the bus 110 , the processor 120 , and/or the memory 130 , of the electronic device 101 to at least one application 147 .
  • the API 145 is an interface through which the application 147 controls a function provided by the kernel 141 and/or the middleware 143 , and includes, for example, at least one interface or function for file control, window control, image processing, and/or character control.
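The relay and load-balancing behavior described for the middleware 143 (ordering requests from applications by an assigned priority before a kernel-managed resource serves them) can be sketched as a priority queue. This is an illustrative sketch of the described behavior only; the class, method names, and request data are hypothetical.

```python
# Illustrative sketch (not from the patent) of the middleware relay:
# application requests are ordered by an assigned priority before being
# relayed toward the kernel-managed system resource.
import heapq

class Middleware:
    def __init__(self):
        self._queue = []
        self._counter = 0  # tie-breaker preserving FIFO order per priority

    def submit(self, app_name, request, priority):
        # Lower number = higher priority in using the system resource
        heapq.heappush(self._queue, (priority, self._counter, app_name, request))
        self._counter += 1

    def relay_next(self):
        # Relay the highest-priority pending request, or None if idle
        if not self._queue:
            return None
        _, _, app_name, request = heapq.heappop(self._queue)
        return (app_name, request)

mw = Middleware()
mw.submit("gallery", "read_file", priority=2)
mw.submit("camera", "capture", priority=1)
assert mw.relay_next() == ("camera", "capture")    # higher priority served first
assert mw.relay_next() == ("gallery", "read_file")
```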
  • the input/output interface 150 receives a command and/or data from a user, and transfers the received command and/or data to the processor 120 and/or the memory 130 through the bus 110 .
  • the display 160 displays an image, a video, and/or data to a user.
  • the communication interface 170 establishes a communication between the electronic device 101 and a first external device 102 , a second external device 104 , and/or a server 106 .
  • the communication interface 170 supports short range communication protocols, e.g., a WiFi protocol, a BlueTooth (BT) protocol, and a Near Field Communication (NFC) protocol, communication networks (e.g., Internet, Internet of Things (IoT), Local Area Network (LAN), Wide Area Network (WAN), a telecommunication network, a cellular network, and a satellite network), a Plain Old Telephone Service (POTS), or any other similar and/or suitable communication networks, such as network 162 .
  • Each of the first and second external devices 102 and 104 may be a same type and/or different types of electronic devices.
  • FIG. 2 is a block diagram of a configuration of an electronic device, according to an embodiment of the present invention.
  • the electronic device 201 may configure a whole or a part of the electronic device 101 illustrated in FIG. 1 .
  • the electronic device 201 includes one or more Application Processors (APs) 210 , a communication module 220 , a Subscriber Identification Module (SIM) card 224 , a memory 230 , a sensor module 240 , an input device 250 , a display module 260 , an interface 270 , an audio module 280 , a camera module 291 , a power management module 295 , a battery 296 , an indicator 297 , and a motor 298 .
  • the AP 210 operates an Operating System (OS) or an application program so as to control a plurality of hardware or software component elements connected to the AP 210 and execute various data processing and calculations including multimedia data.
  • the AP 210 may be implemented by, for example, a System on Chip (SoC).
  • the processor 210 may further include a Graphic Processing Unit (GPU).
  • the communication module 220 transmits/receives data in communication between different electronic devices, e.g., the second external device 104 and the server 106 , connected to the electronic device 201 through the network 162 .
  • the communication module 220 includes a cellular module 221 , a WiFi module 223 , a BlueTooth (BT) module 225 , a Global Positioning System (GPS) module 227 , a Near Field Communication (NFC) module 228 , and a Radio Frequency (RF) module 229 .
  • the cellular module 221 provides a voice, a call, a video call, a Short Message Service (SMS), or an Internet service through a communication network (for example, Long Term Evolution (LTE), LTE-Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Universal Mobile Telecommunication System (UMTS), Wireless Broadband (WiBro), Global System for Mobile communication (GSM), etc.).
  • the cellular module 221 authenticates electronic devices within a communication network by using the SIM card 224 .
  • the cellular module 221 performs at least some of the functions which can be provided by the AP 210 .
  • the cellular module 221 may include a Communication Processor (CP). Further, the cellular module 221 may be implemented by, for example, an SoC.
  • the AP 210 may include at least some (for example, cellular module 221 ) of the aforementioned components in an embodiment.
  • the AP 210 or the cellular module 221 (for example, communication processor) loads a command or data received from at least one of a non-volatile memory and other components connected to each of the AP 210 and the cellular module 221 to a volatile memory and processes the loaded command or data. Further, the AP 210 or the cellular module 221 stores data received from at least one of other components or generated by at least one of other components in a non-volatile memory.
  • Each of the WiFi module 223 , the BT module 225 , the GPS module 227 , and the NFC module 228 may include, for example, a processor for processing data transmitted/received through the corresponding module.
  • Although the cellular module 221 , the WiFi module 223 , the BT module 225 , the GPS module 227 , and the NFC module 228 are illustrated as blocks separate from each other in FIG. 2 , at least some of the cellular module 221 , the WiFi module 223 , the BT module 225 , the GPS module 227 , and the NFC module 228 may be included in one Integrated Chip (IC) or one IC package according to one embodiment.
  • At least some (for example, the communication processor corresponding to the cellular module 221 and the WiFi processor corresponding to the WiFi module 223 ) of the processors corresponding to the cellular module 221 , the WiFi module 223 , the BT module 225 , the GPS module 227 , and the NFC module 228 may be implemented by one SoC.
  • the RF module 229 transmits/receives data, for example, an RF signal.
  • the RF module 229 may include, for example, a transceiver, a Power Amp Module (PAM), a frequency filter, a Low Noise Amplifier (LNA), etc. Further, the RF module 229 may include a component for transmitting/receiving electronic waves over a free air space in wireless communication, for example, a conductor, a conducting wire, etc.
  • Although the cellular module 221 , the WiFi module 223 , the BT module 225 , the GPS module 227 , and the NFC module 228 share one RF module 229 in FIG. 2 , at least one of the cellular module 221 , the WiFi module 223 , the BT module 225 , the GPS module 227 , and the NFC module 228 may transmit/receive an RF signal through a separate RF module.
  • the SIM card 224 is a card that may be inserted into a slot formed in a particular portion of the electronic device 201 .
  • the SIM card 224 includes unique identification information (for example, Integrated Circuit Card IDentifier (ICCID)) or subscriber information (for example, International Mobile Subscriber Identity (IMSI)).
  • the memory 230 includes an internal memory 232 or an external memory 234 .
  • the internal memory 232 may include at least one of a volatile memory (for example, a Random Access Memory (RAM), a Dynamic RAM (DRAM), a Static RAM (SRAM), Synchronous Dynamic RAM (SDRAM), etc.) and a non-volatile memory (for example, a Read Only Memory (ROM), a One Time Programmable ROM (OTPROM), a Programmable ROM (PROM), an Erasable and Programmable ROM (EPROM), an Electrically Erasable and Programmable ROM (EEPROM), a mask ROM, a flash ROM, a NAND flash memory, a NOR flash memory, etc.).
  • the internal memory 232 may be a Solid State Drive (SSD).
  • the external memory 234 may include a flash drive, for example, a Compact Flash (CF), a Secure Digital (SD), a Micro Secure Digital (Micro-SD), a Mini Secure Digital (Mini-SD), an extreme Digital (xD), or a memory stick.
  • the external memory 234 may be functionally connected to the electronic device 201 through various interfaces.
  • the electronic device 201 may further include a storage device (or storage medium) such as a hard drive.
  • the sensor module 240 measures a physical quantity or detects an operation state of the electronic device 201 , and converts the measured or detected information to an electronic signal.
  • the sensor module 240 may include at least one of a gesture sensor 240 A, a gyro sensor 240 B, an atmospheric pressure (barometric) sensor 240 C, a magnetic sensor 240 D, an acceleration sensor 240 E, a grip sensor 240 F, a proximity sensor 240 G, a color sensor 240 H (for example, a Red, Green, and Blue (RGB) sensor), a biometric sensor 240 I, a temperature/humidity sensor 240 J, an illuminance (light) sensor 240 K, and an Ultra Violet (UV) sensor 240 M.
  • the sensor module 240 may include, for example, an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an InfraRed (IR) sensor, an iris sensor, a fingerprint sensor, etc.
  • the sensor module 240 may further include a control circuit for controlling one or more sensors included in the sensor module 240 .
  • the input device 250 includes a touch panel 252 , a (digital) pen sensor 254 , a key 256 , and an ultrasonic input device 258 .
  • the touch panel 252 recognizes a touch input using at least one of a capacitive type, a resistive type, an infrared type, and an acoustic wave type.
  • the touch panel 252 may further include a control circuit. In the capacitive type, the touch panel 252 can recognize proximity as well as a direct touch.
  • the touch panel 252 may further include a tactile layer. In this event, the touch panel 252 provides a tactile reaction to the user.
  • the (digital) pen sensor 254 may be implemented, for example, using a method identical or similar to a method of receiving a touch input of the user or using a separate recognition sheet.
  • the key 256 may include a physical button, an optical key, or a key pad.
  • the ultrasonic input device 258 detects, through the microphone 288 of the electronic device 201, an acoustic wave produced by an input means that generates an ultrasonic signal, identifies the corresponding data, and thereby performs wireless recognition.
  • the electronic device 201 may receive a user input from the first or second external device 102 and 104 or the server 106 connected to the electronic device 201 by using the communication module 220 .
  • the display module 260 includes a panel 262 , a hologram device 264 , and a projector 266 .
  • the panel 262 may be, for example, a Liquid Crystal Display (LCD) or an Active Matrix Organic Light Emitting Diode (AM-OLED).
  • the panel 262 may be implemented to be, for example, flexible, transparent, or wearable.
  • the panel 262 may be configured as one module together with the touch panel 252.
  • the hologram device 264 displays a stereoscopic image in the air by using interference of light.
  • the projector 266 projects light on a screen to display an image.
  • the screen may be located inside or outside the electronic device 201 .
  • the display module 260 may further include a control circuit for controlling the panel 262 , the hologram device 264 , and the projector 266 .
  • the interface 270 includes a High-Definition Multimedia Interface (HDMI) 272, a Universal Serial Bus (USB) 274, an optical interface 276, and a D-subminiature (D-sub) 278.
  • the interface 270 may be included in the communication interface 160 illustrated in FIG. 1 .
  • the interface 270 may include, for example, a Mobile High-definition Link (MHL) interface, a Secure Digital (SD) card/Multi-Media Card (MMC) interface, or an Infrared Data Association (IrDA) standard interface.
  • the audio module 280 bi-directionally converts a sound and an electronic signal. At least some components of the audio module 280 may be included in the input/output interface 140 illustrated in FIG. 1 .
  • the audio module 280 processes sound information input or output through a speaker 282 , a receiver 284 , an earphone 286 , and the microphone 288 .
  • the camera module 291 is a device which can photograph a still image and a video.
  • the camera module 291 may include one or more image sensors (for example, a front sensor or a back sensor), an Image Signal Processor (ISP) or a flash (for example, an LED or xenon lamp).
  • the power management module 295 manages power of the electronic device 201 .
  • the power management module 295 may include, for example, a Power Management Integrated Circuit (PMIC), a charger Integrated Circuit (IC), or a battery gauge.
  • the PMIC may be mounted to an integrated circuit or an SoC semiconductor.
  • the charging method may be divided into a wired method and a wireless method.
  • the charger IC charges a battery and prevents over voltage or over current from flowing from a charger.
  • the charger IC includes a charger IC for at least one of the wired charging method and the wireless charging method.
  • the wireless charging method may include a magnetic resonance method, a magnetic induction method and an electromagnetic wave method, and additional circuits for wireless charging, for example, circuits such as a coil loop, a resonant circuit, a rectifier, etc. may be added.
  • the battery gauge measures, for example, a remaining quantity of the battery 296 , or a voltage, a current, or a temperature during charging.
  • the battery 296 stores or generates electricity and supplies power to the electronic device 201 by using the stored or generated electricity.
  • the battery 296 may include a rechargeable battery or a solar battery.
  • the indicator 297 shows particular statuses of the electronic device 201 or a part (for example, AP 210 ) of the electronic device 201 , for example, a booting status, a message status, a charging status, etc.
  • the motor 298 converts an electrical signal to a mechanical vibration.
  • the electronic device 201 may include a processing unit (for example, a CPU) for supporting a mobile TV.
  • the processing unit for supporting the mobile TV may process media data according to a standard such as Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), media flow, etc.
  • Each of the components of the electronic device according to various embodiments of the present invention may be implemented by one or more components and the name of the corresponding component may vary depending on a type of the electronic device.
  • the electronic device according to various embodiments of the present invention may include at least one of the above described components, a few of the components may be omitted, or additional components may be further included. Also, some of the components of the electronic device according to various embodiments of the present invention may be combined to form a single entity, and thus may equivalently execute functions of the corresponding components before being combined.
  • FIG. 3 is a block diagram of a programming module of an electronic device, according to an embodiment of the present invention.
  • the programming module 300 may be included (stored) in the electronic device 101 (for example, memory 130 ) illustrated in FIG. 1 . At least some of the programming module 300 may be formed of software, firmware, hardware, or a combination of software, firmware, and hardware.
  • the programming module 300 may be executed in the hardware (for example, the electronic device 201) and may include an Operating System (OS) controlling resources related to the electronic device 101 and various applications 370 running on the OS.
  • the OS may be Android™, iOS™, Windows™, Symbian™, Tizen™, Bada™, etc.
  • the programming module 300 includes a kernel 320 , a middleware 330 , an Application Programming Interface (API) 360 , and the applications 370 .
  • the kernel 320 includes a system resource manager 321 and a device driver 323 .
  • the system resource manager 321 may include, for example, a process manager, a memory manager, and a file system manager.
  • the system resource manager 321 performs system resource control, allocation, and recall.
  • the device driver 323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, and an audio driver. Further, according to an embodiment, the device driver 323 may include an Inter-Process Communication (IPC) driver.
  • the middleware 330 includes a plurality of modules prepared in advance to provide a function required in common by the applications 370 . Further, the middleware 330 provides a function through the API 360 to allow the applications 370 to efficiently use limited system resources within the electronic device 101 .
  • the middleware 300 includes at least one of a runtime library 335 , an application manager 341 , a window manager 342 , a multimedia manager 343 , a resource manager 344 , a power manager 345 , a database manager 346 , a package manager 347 , a connection manager 348 , a notification manager 349 , a location manager 350 , a graphic manager 351 , and a security manager 352 .
  • the runtime library 335 includes a library module used by a compiler to add a new function through a programming language while the applications 370 are executed.
  • the runtime library 335 executes input and output, management of a memory, a function associated with an arithmetic function, etc.
  • the application manager 341 manages a life cycle of at least one of the applications 370 .
  • the window manager 342 manages GUI resources used on the screen.
  • the multimedia manager 343 detects a format required for reproducing various media files and performs an encoding or a decoding of a media file by using a codec suitable for the corresponding format.
  • the resource manager 344 manages resources such as a source code, a memory, or a storage space of at least one of the applications 370 .
  • the power manager 345 operates together with a Basic Input/Output System (BIOS) to manage a battery or power and provides power information required for the operation.
  • the database manager 346 manages generation, search, and change of a database to be used by at least one of the applications 370 .
  • the package manager 347 manages an installation or an update of an application distributed in a form of a package file.
  • the connection manager 348 manages a wireless connection, such as WiFi or Bluetooth.
  • the notification manager 349 displays or notifies a user of an event, such as an arrival message, an appointment, a proximity alarm, etc., in a manner that does not disturb the user.
  • the location manager 350 manages location information of the electronic device 101 .
  • the graphic manager 351 manages a graphic effect provided to the user or a user interface related to the graphic effect.
  • the security manager 352 provides a general security function required for a system security or a user authentication.
  • the middleware 330 may further include a telephony manager for managing a voice or a video call function of the electronic device 101 .
  • the middleware 330 may generate a new middleware module through a combination of various functions of the aforementioned internal component modules and use the generated new middleware module.
  • the middleware 330 may provide a module specific to each type of operating system to provide a differentiated function. Further, the middleware 330 may dynamically delete some of the conventional components or add new components. Accordingly, some of the components described in the embodiment of the present invention may be omitted, replaced with other components having different names but performing similar functions, or other components may be further included.
  • the API 360 is a set of API programming functions, and may be provided with a different configuration according to the operating system. For example, in Android™ or iOS™, a single API set may be provided for each platform. In Tizen™, two or more API sets may be provided.
  • the applications 370 may include a preloaded application and/or a third party application.
  • the applications 370 may include a home application 371 , a dialer application 372 , a Short Messaging Service (SMS)/Multimedia Messaging Service (MMS) application 373 , an Instant Messaging (IM) application 374 , a browser application 375 , a camera application 376 , an alarm application 377 , a contact application 378 , a voice dial application 379 , an email application 380 , a calendar application 381 , a media player application 382 , an album application 383 , and a clock application 384 .
  • the present embodiment is not limited thereto, and the applications 370 may include any other similar and/or suitable application.
  • At least some of the programming module 300 is implemented by a command stored in a computer-readable storage medium.
  • the command is executed by one or more processors (for example, processor 210 )
  • the one or more processors perform a function corresponding to the command.
  • the computer-readable storage medium may be, for example, the memory 230.
  • At least some of the programming module 300 may be implemented (for example, executed) by, for example, the processor 210 .
  • At least some of the programming module 300 may include, for example, a module, a program, a routine, sets of instructions, or a process for performing one or more functions.
  • the names of the components of the programming module 300 may vary depending on a type of operating system. Further, the programming module 300 may include one or more of the aforementioned components, omit some of the components, or further include other additional components.
  • FIG. 4 is a block diagram of an electronic device, according to an embodiment of the present invention.
  • an electronic device 400 is provided.
  • the electronic device 400 includes a camera module 410 , a sensor module 420 , and a control module 430 .
  • the electronic device 400 may additionally include a display 440.
  • the electronic device 400 may be the electronic device 101 described with reference to FIG. 1 or the electronic device 201 described with reference to FIG. 2 .
  • the camera module 410 may include one or more image sensors for photographing a still image or a moving image.
  • the camera module 410 changes the settings of the one or more image sensors according to a control command or a signal. For example, the camera module 410 controls the focus, white balance, a shutter speed, or brightness according to the control command. Further, the camera module 410 controls the one or more image sensors to acquire images according to the control command or the signal.
  • the sensor module 420 measures or detects a physical quantity related to the electronic device 400 or a physiological signal of a subject to be examined and converts the detected physical quantity or physiological signal into an electrical signal.
  • the sensor module 420 includes a biometric sensor for measuring the physiological signal of the subject to be examined.
  • the biometric sensor may be a Photoplethysmography (PPG) sensor for the subject to be examined.
  • the biometric sensor measures a physiological signal, including at least one of iris information, retina information, vein information, fingerprint information, and Saturation of peripheral Oxygen (SpO 2 ).
  • the biometric sensor may, alternatively, be a Heart Rate Monitor (HRM), a Heart Rate Variability (HRV) sensor, an electromyogram sensor, or an electroencephalogram sensor.
  • the sensor module 420 may include at least one of an illumination sensor, a gesture sensor, an acceleration sensor, a location sensor, a gyroscope sensor, and a magnetic sensor, as well as the biometric sensor.
  • the sensor module 420 is configured to receive an input signal, and the input signal includes a physiological signal of the subject to be examined and information on a physical quantity related to the electronic device 400 .
  • the subject to be examined may be a user of the electronic device 400 .
  • the control module 430 is functionally connected to the camera module 410 and the sensor module 420 , and controls the camera module 410 to acquire an image at least based on the input signal received through the sensor module 420 .
  • the physiological signal is included in the input signal for controlling the camera module 410 .
  • a control command or a signal for the camera module 410 is an input operation detected based on the input signal received through the sensor module 420 .
  • the input operation includes an operation by the subject to be examined with respect to at least a part of the surface of the sensor module 420 , for example, an approach operation, a contact operation, or a separation operation.
  • the display 440 displays a dynamic image acquired through the camera module 410 , for example, a preview image.
  • the display 440 displays a still image acquired through the camera module 410 , for example, a photographed image.
  • the display 440 may display various User Interfaces (UIs) for controlling the camera module 410 by the control module 430.
  • the electronic device 400 may further include the audio module 280 .
  • the control module 430 is functionally connected to the audio module 280 and the sensor module 420 , and controls the audio module 280 to acquire audio data at least based on the input signal received through the sensor module 420 .
  • the physiological signal is included in the input signal for controlling the audio module.
  • a control command or a signal for the audio module is an input operation detected based on the input signal received through the sensor module 420 .
  • the input operation includes an operation by the subject to be examined with respect to at least a part of the surface of the sensor module 420 , for example, an approach operation, a contact operation, or a separation operation.
  • the electronic device 400 may additionally include the communication module 220 .
  • the control module 430 is connected to the communication module 220 and the sensor module 420, is also functionally connected to at least one of the display 440 and the audio module 280, and controls at least one of the display 440 and the audio module 280 to acquire an image or audio data at least based on the input signal received through the sensor module 420.
  • the physiological signal is included in the input signal for controlling the display 440 or the audio module.
  • a control command or a signal for the display 440 or the audio module is an input operation detected based on the input signal received through the sensor module 420 .
  • the input operation includes an operation by the subject to be examined with respect to at least a part of the surface of the sensor module 420 , for example, an approach operation, a contact operation, or a separation operation.
  • the control module 430 acquires an image (for example, a call application screen or a conference application screen) or a video (for example, a video including one or more of a recipient image and a sender image) displayed on the display 440 at least based on the input signal received through the sensor module 420 during a voice call, a video call, an audio conference, or a video conference.
  • the control module 430 acquires one or more of the audio signals output or input through the audio module 280 .
  • FIGS. 5A and 5B are perspective views of an electronic device, according to an embodiment of the present invention.
  • The electronic device 400 includes a bar type terminal body; however, the electronic device of the present invention is not limited thereto and may have various structures, such as a scroll type, a curvedly bent type, a slide type in which two or more bodies are coupled to reciprocally move, and a folder type.
  • the electronic device 400 includes a front surface, a side surface, and a rear surface, and includes both ends formed along a length direction.
  • the body includes a case forming the exterior of the electronic device 400 , and the case may be classified into a front case and a rear case.
  • Various types of components included in the electronic device 400 may be arranged in the space formed between the front case and the rear case.
  • the components included in the electronic device 201 may be located in the case or may be located in the space formed between the front case and the rear case.
  • the camera module 410 , the sensor module 420 , a button 425 , and the display 440 may be located in the case included in the body of the electronic device 400 .
  • the display 440 occupies a main part of the front case.
  • the camera module 410 may be located in at least one of the front case and the rear case included in the body of the electronic device 400 .
  • the camera module 411 is arranged on the front case of the electronic device 400 and is located in an area close to one of the end parts of the display 440 .
  • In FIG. 5B, a rear perspective view of the electronic device 400 is provided.
  • the camera module 413 is arranged on the rear surface of the body of the electronic device 400 and is located in the rear case.
  • the front camera module 411 and the rear camera module 413 may have different photographing directions.
  • the rear camera module 413 may be configured to be capable of performing photography in higher definition than the front camera module 411 .
  • a flash 415 is disposed in an area adjacent to the rear camera module 413 . When the image is acquired through the rear camera module 413 , the flash 415 may shine a light toward a subject for photography.
  • the sensor module 420 is disposed on one surface of the body of the electronic device 400 . As shown in FIG. 5B , the sensor module 420 is arranged on the rear surface of the body of the electronic device 400 and is located in the rear case. When the flash 415 is disposed on the rear case, the sensor module 420 is located in an area adjacent to the flash 415 . Alternatively, the sensor module 420 may be located in an area adjacent to the rear camera module 413 .
  • the sensor module 420 may be disposed on a position of the rear case which user's fingers can reach. In this case, at least some of the user's fingers, which grasp the electronic device 400 , may be objects from which the physiological signal is measured by the sensor module 420 .
  • the sensor module 420 may be located in an area adjacent to the button 425 or combined with the button 425 .
  • the biometric sensor included in the sensor module 420 may be an HRM sensor, an HRV sensor, an ECG sensor, or an SpO 2 sensor.
  • the biometric sensor included in the sensor module 420 includes a light emitting unit 421 for generating an incident light and irradiating the light to the subject to be examined and a light receiving unit 423 for receiving a reflected light from the subject to be examined.
  • the sensor module 420 may be a PPG sensor including the light emitting unit 421 and the light receiving unit 423 .
  • the light emitting unit 421 may be implemented as a Light Emitting Diode (LED). Further, the light emitting unit 421 may be implemented as one or more LEDs having different wavelengths, which may include visible rays or infrared rays.
  • the physiological signal, for example, a PPG signal, is detected based on the reflected light received through the light receiving unit 423.
  • FIGS. 6A to 6C illustrate an operational state of an electronic device, according to an embodiment of the present invention.
  • a user 500 may place the front camera module 411 located on the front surface of the body of the electronic device 400 toward the user 500 and may control an operation of the front camera module 411 by a subject 510 (such as the user's hand) grasping the body of the electronic device 400 .
  • physiological information on the user 500 is received through the sensor module 420 located in the rear case of the body of the electronic device 400 .
  • the electronic device 400 receives, through the sensor module 420 configured to include the biometric sensor, an input signal including physiological information on the subject to be examined, which corresponds to at least a part of the subject 510 (for example, fingers) grasping the body of the electronic device 400 .
  • the control module 430 detects an input operation corresponding to a movement of the part of the subject 510 , which is the subject to be examined, based on the input signal.
  • the control module 430 then controls the front camera module 411 to acquire an image at least based on the input operation.
  • the electronic device 400 acquires the image through the camera module 411 based on the input operation on the surface of the sensor module 420 by the subject to be examined and displays the image on the display 440 .
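The control flow described above, in which an input operation detected on the biometric sensor triggers image acquisition, can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the thresholds, the normalized signal levels, and the `detect_capture_events` and `on_capture` names are assumptions introduced for the example.

```python
# Hedged sketch: mapping biometric-sensor input operations to a camera
# trigger, in the spirit of the control module 430 described above.
# The threshold values are illustrative assumptions, not patent values.

CONTACT_THRESHOLD = 0.6   # normalized reflected-light level counted as contact
RELEASE_THRESHOLD = 0.2   # level below which the finger is considered separated

def detect_capture_events(samples, on_capture):
    """Scan normalized sensor samples; invoke on_capture whenever a
    contact followed by a separation (a 'tap' on the sensor) is seen."""
    in_contact = False
    events = 0
    for level in samples:
        if not in_contact and level >= CONTACT_THRESHOLD:
            in_contact = True            # finger placed on the sensor surface
        elif in_contact and level <= RELEASE_THRESHOLD:
            in_contact = False           # finger lifted: treat as a shutter press
            on_capture()
            events += 1
    return events
```

Here a contact followed by a separation is treated as a single tap that fires the shutter callback once; a real device would debounce and time these transitions.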
  • FIGS. 7A and 7B illustrate an operation of a sensor module of an electronic device, according to an embodiment of the present invention.
  • the sensor module 420 includes the light emitting unit 421 and the light receiving unit 423 .
  • the light emitting unit 421 generates an incident light and irradiates the light toward the subject 510 to be examined, corresponding to a part of the human tissues, and the light receiving unit 423 receives a reflected light that is generated when the incident light penetrating the subject 510 to be examined is reflected and returned.
  • the incident light generated from the light emitting unit 421 may be implemented to have a particular wavelength.
  • the incident light may be, for example, a ray indicating a green light. Since the green incident light has a relatively low skin transmittance and high absorption compared to visible lights of other colors, the green incident light may be used for a wearable device worn on the wrist. Further, the incident light may be, for example, a ray indicating a red light.
  • the light emitting unit 421 may be implemented as one or more LEDs to generate different wavelengths.
  • the one or more LEDs may generate, for example, visible lights of green, red, or other colors, or generate infrared (IR) light.
  • the light emitting unit 421 includes a first light emitting unit 4211 and a second light emitting unit 4213 having different wavelengths.
  • the first light emitting unit 4211 and the second light emitting unit 4213 irradiate incident lights of different wavelengths to the subject 510 to be examined and each of the reflected incident lights is received by the light receiving unit 423 .
  • the sensor module 420 may be configured to include a PPG sensor including the light emitting unit 421 and the light receiving unit 423 .
  • as the heart contracts and relaxes, the blood flow rate of the peripheral blood vessels changes and, accordingly, the volumes of the peripheral blood vessels also change.
  • the PPG sensor measures a change in the volume of the peripheral blood vessel by detecting a penetration amount of the light irradiated to the subject to be examined, and measures one or more of a change in a blood amount, and blood oxygen saturation within the vessel based on the change in the volume of the peripheral blood vessel.
  • the PPG sensor measures a variation in a time interval between heart rates or heartbeats per unit time based on the measured change in the blood amount within the vessel. Accordingly, the PPG sensor may operate as a Heart Rate Monitor (HRM) which can measure a heart rate based on the measured blood amount information.
  • the human tissue corresponding to the subject 510 to be examined may be, for example, a finger. While the contact state of the subject 510 on the surface of the sensor module 420 is maintained for a predetermined time after the subject 510 to be examined contacts the surface of the sensor module 420, the sensor module 420 detects a change in the blood amount within the human tissue corresponding to the subject 510 to be examined according to a contraction period and a relaxation period. For example, the sensor module 420 detects a change in brightness: the reflected light is relatively dark in the contraction period, since the blood amount increases, and relatively bright in the relaxation period, since the blood amount decreases.
  • the measurement module included in the sensor module 420 detects the light reflected from a vessel within the human tissue through the light receiving unit 423 and converts the detected light into an electrical signal, so as to acquire the physiological signal of the subject 510 to be examined. For example, the sensor module 420 converts an amount of the light detected by the light receiving unit 423 into voltage and receives the voltage as an input, and calculates an elapsed time between heart rates or heartbeats based on the measurement of a voltage change period.
  • the control module 430 analyzes the HRV based on the physiological signal received by the sensor module 420 and acquires physiological information including autonomic nervous system information of the subject 510 to be examined based on a result of the analysis.
  • the analysis may be a frequency-domain analysis of the HRV, for example, an analysis of power peak information generated in a particular frequency band based on Power Spectrum Density (PSD).
  • the PSD may be computed using a correlation function method, a fast Fourier transform, or an autoregressive technique.
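One common way to realize the fast-Fourier-transform option is an FFT periodogram over an evenly resampled RR-interval series, summing power in the conventional LF (0.04 to 0.15 Hz) and HF (0.15 to 0.4 Hz) HRV bands. The band limits follow common HRV practice; the resampling rate and the `hrv_band_power` name are assumptions for this sketch.

```python
import numpy as np

# Hedged sketch: frequency-domain HRV analysis via a one-sided FFT
# periodogram, returning power in the LF and HF bands.

def hrv_band_power(rr_series, fs):
    """Return (lf_power, hf_power) from an evenly resampled RR-interval
    series sampled at fs Hz."""
    x = np.asarray(rr_series, dtype=float)
    x = x - x.mean()                              # remove DC before the FFT
    spectrum = np.fft.rfft(x)
    psd = (np.abs(spectrum) ** 2) / (fs * len(x))  # one-sided periodogram
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf, hf
```

The LF/HF power ratio is often taken as a crude index of autonomic (sympathetic versus parasympathetic) balance, which matches the autonomic nervous system information mentioned above.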
  • the physiological information acquired based on a result of the analysis of the HRV may be information related to immune deficiency, physical stress, physical fatigue, lack of sleep, chronic stress, depression, and emotion (for example, preference, fright, arousal state, etc.).
  • the control module 430 measures oxygen saturation based on the physiological signal received by the sensor module 420 .
  • the sensor module 420 may include an oxygen saturation sensor, which measures the ratio of oxygen-saturated hemoglobin to total hemoglobin.
  • the sensor module 420 for measuring the oxygen saturation may include the light emitting unit 421 including a red LED and an IR LED. Since a red wavelength and an IR wavelength have different reaction sensitivities to a change in oxygen saturation of arterial blood, SpO 2 is measured through a difference between the sensitivities.
  • Physiological information acquired based on a result of the measurement of SpO 2 may be information on burned calories, a difficulty in breathing, clouded consciousness, or a body state during an exercise.
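The red/IR sensitivity difference described above is commonly exploited through the "ratio of ratios". The sketch below uses a textbook-style linear calibration (110 − 25·R); that curve and the function name are assumptions for illustration, not values from this patent.

```python
# Hedged SpO2 sketch: ratio of AC/DC amplitudes of the red and IR channels,
# mapped to a percentage with an assumed empirical calibration curve.

def spo2_from_ratio(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate SpO2 (%) from AC/DC amplitudes of the red and IR channels."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    est = 110.0 - 25.0 * r            # assumed calibration curve
    return max(0.0, min(100.0, est))  # clamp to a physical percentage

print(spo2_from_ratio(ac_red=0.012, dc_red=1.0, ac_ir=0.024, dc_ir=1.2))
```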
  • the control module 430 detects an input operation corresponding to a particular movement of the subject 510 to be examined with respect to the surface of the sensor module 420 at least based on the input signal including the physiological signal.
  • the input operation may be distinguished based on a particular area within a particular distance 429 from the surface of the sensor module 420 .
  • the particular area may be a detection area in which the sensor module 420 can detect the subject 510 to be examined. Further, the detection area may be an area within a particular range distance from the sensor module 420 so that the physiological signal received by the sensor module 420 can indicate meaningful physiological information.
  • the detection area may be an area within a distance in which a measurable signal for the subject 510 to be examined can be received or an area in which a signal corresponding to a particular percentage of the intensity of the maximum measurable signal can be received.
  • the control module 430 determines the proximity of the subject 510 to be examined or whether the subject 510 to be examined makes contact based on the input signal including the physiological signal of the subject 510 to be examined.
  • the input operation may include, for example, an approach operation, a contact operation, or a separation operation.
  • the approach operation is an operation by which the subject 510 to be examined enters the detection area; that is, an operation by which the subject 510 a to be examined enters a location of the subject 510 b to be examined within the detection area.
  • the contact operation may be an operation by which the subject 510 to be examined contacts at least a part of the surface of the sensor module 420 or an operation by which the subject 510 b to be examined moves to a location of the subject 510 c to be examined to contact the surface of the sensor module 420 .
  • the separation operation may be an operation by which the subject 510 moves away from the surface of the sensor module 420 or an operation by which the subject 510 c to be examined separates from the surface of the sensor module 420 and moves to the location of the subject 510 b to be examined.
  • the control module 430 may detect an input operation of the subject 510 to be examined by determining, based on the input signal including the physiological signal, whether the subject 510 to be examined enters the detection area, contacts at least a part of the surface of the sensor module 420, or separates from the surface of the sensor module 420.
  • the PPG sensor receives the physiological signal including at least one of a DC component and an AC component.
  • the AC component is a component of the physiological signal that varies relatively quickly with the heartbeat, due to contraction and relaxation of the heart.
  • the DC component is a component of the physiological signal that varies relatively slowly depending on the blood volume and on the absorption or reflection degree of the tissues surrounding the vessel, regardless of the heartbeat.
  • FIGS. 8A and 8B are graphs illustrating physiological signals detected by an electronic device according to time lapse, according to an embodiment of the present invention.
  • the control module 430 determines the proximity of the subject 510 to be examined, or whether the subject 510 to be examined makes contact, based on at least one of a signal level and a signal pattern of the DC component or the AC component of the physiological signal received by the PPG sensor.
  • In FIG. 8A, an example of a waveform of the physiological signal including only the DC component is provided.
  • In FIG. 8B, an example of a waveform of the physiological signal including both the DC component and the AC component is provided.
  • the control module 430 may determine whether the subject 510 to be examined makes contact, based on whether a signal level of the physiological signal is larger than a particular value.
  • the signal level may be a physiological signal value determined based on the DC component or the AC component.
  • the control module 430 may determine whether the subject 510 to be examined makes contact, based on whether a level value of the DC component of the physiological signal is larger than a particular value.
  • the control module 430 may determine whether the subject 510 to be examined makes contact, based on whether a level value of the AC component of the physiological signal is within a particular value.
  • the control module 430 determines that the subject 510 to be examined is located outside the detection area when the level value of the DC component corresponding to the physiological signal is in the first range, determines that the subject 510 to be examined is located inside the detection area when the level value of the DC component is in the second range, and determines that the subject 510 to be examined contacts the surface of the sensor module 420 when the level value of the DC component is in the third range.
  • the control module 430 detects an approach operation when a change from the first range to the second range is detected, a contact operation when a change from the second range to the third range is detected, and a separation operation when a change from the third range to the second range is detected.
  • the first range is less than or equal to 60,000
  • the second range is greater than 60,000 and less than 100,000
  • the third range is greater than or equal to 100,000.
  • the first range may be less than or equal to 30%, the second range may be greater than 30% and less than 50%, and the third range may be greater than or equal to 50%.
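The range-based classification and transition detection described above can be sketched as follows, using the example thresholds of 60,000 and 100,000 from the passage. The constant names and the transition table are illustrative, not the patent's implementation.

```python
# Minimal sketch: classify the DC level into outside / inside / contact,
# then derive the input operation from the transition between two readings.

OUTSIDE, INSIDE, CONTACT = "outside", "inside", "contact"

def classify_dc(level):
    if level <= 60_000:
        return OUTSIDE   # first range: beyond the detection area
    if level < 100_000:
        return INSIDE    # second range: inside the detection area
    return CONTACT       # third range: touching the sensor surface

TRANSITIONS = {
    (OUTSIDE, INSIDE): "approach",
    (INSIDE, CONTACT): "contact",
    (CONTACT, INSIDE): "separation",
}

def detect_operation(prev_level, curr_level):
    """Return the input operation implied by two successive DC levels."""
    return TRANSITIONS.get((classify_dc(prev_level), classify_dc(curr_level)))

print(detect_operation(50_000, 80_000))   # approach
print(detect_operation(80_000, 120_000))  # contact
print(detect_operation(120_000, 80_000))  # separation
```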
  • the level value of the DC component corresponding to the physiological signal may become different through a process of amplifying the received signal.
  • the level of the DC component of the physiological signal may vary due to ambient light around the electronic device 400, so an operation for removing the ambient light component is performed first.
  • the control module 430 may determine the proximity of the subject 510 to be examined or whether the subject 510 to be examined makes contact, based on the signal pattern of the physiological signal. For example, the control module 430 may determine the proximity of the subject 510 to be examined or whether the subject 510 to be examined makes contact, based on an amplitude of the AC component included in the physiological signal. As the amplitude of the AC component of the physiological signal is small, a difference in an increase and a decrease of the light received by the light receiving unit 423 according to vasodilation in the contraction period and vasoconstriction in the relaxation period decreases. In this case, it is determined as a non-contact state or a proximity state rather than the contact state in which the physiological signal of the subject 510 to be examined is sufficiently received.
  • the control module 430 determines that the subject 510 to be examined is located outside the detection area if a maximum or average amplitude change measured using peaks of the AC component included in the physiological signal is in the first range, and determines that the subject 510 to be examined is located inside the detection area if the maximum or average amplitude change is in the second range.
  • the control module 430 detects that the input operation corresponds to the approach operation if a change from the first range to the second range is detected.
  • the first range may be below 20% and the second range may be from 20% to 60%.
  • when detecting the input operation corresponding to the movement of the subject 510 to be examined, the control module 430 detects the contact operation if the amplitude (for example, a peak value) of the AC component included in the physiological signal is larger than or equal to a particular amplitude, and detects the separation operation if the amplitude of the AC component is smaller than the particular amplitude.
  • the control module 430 detects the contact operation if the peak amplitude of the AC component included in the physiological signal remains larger than or equal to the particular amplitude for a predetermined time, or if a predetermined number of successive peak values remain larger than or equal to the particular amplitude; it detects the separation operation if the peak amplitude remains smaller than the particular amplitude for the predetermined time, or if the predetermined number of successive peak values remain smaller than the particular amplitude.
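The successive-peak debouncing rule above can be sketched as follows. The threshold value, the required number of peaks, and the function name are illustrative assumptions.

```python
# Hypothetical debouncing sketch: report contact only after N successive AC
# peak amplitudes stay at or above the threshold, and separation only after
# N successive peaks fall below it; otherwise remain undecided.

def classify_state(peak_amplitudes, threshold=0.3, required_peaks=3):
    """Return 'contact' or 'separation' once the condition holds for
    `required_peaks` successive peak values, else None (undecided)."""
    if len(peak_amplitudes) < required_peaks:
        return None
    recent = peak_amplitudes[-required_peaks:]
    if all(a >= threshold for a in recent):
        return "contact"
    if all(a < threshold for a in recent):
        return "separation"
    return None  # mixed peaks: keep the previous state

print(classify_state([0.1, 0.4, 0.5, 0.6]))   # contact
print(classify_state([0.6, 0.2, 0.1, 0.05]))  # separation
print(classify_state([0.6, 0.2, 0.5, 0.1]))   # None
```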
  • when detecting the input operation corresponding to the movement of the subject 510 to be examined, if the DC component included in the physiological signal is close to the maximum reception level value and the AC component included in the physiological signal is very weak (smaller than or equal to a particular level), the control module 430 detects that the subject 510 to be examined is located inside the detection area in a non-contact or proximity state, rather than the contact state in which the subject 510 to be examined contacts the surface of the sensor module 420. In this case, the control module 430 determines that the amount of light received by the light receiving unit 423 included in the sensor module 420 is large but the reflected or transmitted light of the subject 510 to be examined is not measured.
  • the control module 430 determines the proximity of the subject 510 to be examined or whether the subject 510 to be examined makes contact, based on a level of an amount of the light received through the illumination sensor and the physiological signal.
  • the control module 430 detects the input operation corresponding to the movement of the subject 510 to be examined, based on the input signal including the physiological signal.
  • the control module 430 activates the sensor module 420 to include the physiological signal in the input signal. For example, when the camera module 410 is activated, if the face of the subject for photography is detected in a preview image received through the camera module 410 , if the proximity of the subject to be examined is detected by the proximity sensor included in the sensor module 420 , or if the existence of the subject to be examined is detected through a periodical monitoring by the PPG sensor included in the sensor module 420 , the control module 430 considers that the condition for activation is met, and thus activates the biometric sensor. Thereafter, the sensor module 420 receives the input signal including the physiological signal acquired through the activated biometric sensor. While the sensor module 420 receives the input signal including the physiological signal acquired through the activated biometric sensor, the control module 430 may control the audio module 280 to output audio data or control the motor 298 or a haptic module to output tactile feedback or force feedback.
  • the electronic device performs an operation of receiving the physiological signal of the user through the biometric sensor and an operation of acquiring an image through the camera module of the electronic device at least based on the change in the physiological signal.
  • the change in the physiological signal is detected based on at least one of a level and a pattern of the physiological signal.
  • FIG. 9 is a flowchart of an operation for acquiring an image based on an input signal by the electronic device, according to an embodiment of the present invention.
  • the electronic device 400 receives an input signal including a physiological signal, detects an input operation based on the input signal, and controls the camera module 410 according to the input operation.
  • the sensor module 420 receives an input signal including a physiological signal in step 1110 .
  • the sensor module 420 acquires the physiological signal from the subject 510 to be examined, which corresponds to a part of the user's body.
  • the sensor module 420 may be a PPG sensor or an ECG sensor, and the physiological signal may be a PPG signal or an ECG signal.
  • the sensor module 420 may further include sensors for measuring a physical quantity related to the electronic device 400 . In this case, the sensor module 420 acquires a measurement signal for the physical quantity related to the electronic device 400 .
  • the input signal includes the physiological signal and may further include the measurement signal.
  • the measurement signal is a measurement value of another physical quantity (for example, proximity by the illumination sensor) for the part of the user's body.
  • the control module 430 detects an input operation at least based on the input signal in step 1120 .
  • the control module 430 detects an input operation of the subject 510 to be examined based on the physiological signal included in the input signal. Further, the control module 430 detects the input operation of the subject 510 to be examined based on at least one of the physiological signal included in the input signal and another measurement signal. The input operation is detected based on at least one of a signal level or a signal pattern of the input signal.
  • the input operation detected by the control module 430 may be an approach operation, and the approach operation is an operation by which the subject 510 to be examined enters the detection area from the outside of the detection area.
  • the input operation detected by the control module 430 may be a contact operation, and the contact operation is an operation by which the subject 510 to be examined contacts at least a part of the surface of the sensor module 420 .
  • the input operation detected by the control module 430 may be a separation operation, and the separation operation is an operation by which the subject 510 to be examined is separated from the surface of the sensor module 420 .
  • the control module 430 acquires an image through the camera module 410 according to the input operation in step 1130 .
  • the control module 430 controls the camera module 410 to acquire the image.
  • the control module 430 controls to change a setting (for example, a focus control or a white balance control) of the camera module 410 .
  • the control module 430 controls the camera module 410 to start acquiring the image when the input operation is a first input operation and to end acquiring the image when the input operation is a second input operation. For example, photographing is started by a first contact operation, and the photographing is terminated when a second contact operation is made after the generation of the separation operation. Alternatively, when a multi-touch, that is, two contact operations, are generated by different fingers, the image acquisition is started by the contact operation of the first finger and terminated by the contact operation of the second finger. By the plurality of input operations, a panorama image or a video including a plurality of images may be photographed.
  • the control module 430 controls the camera module 410 based on one or more input operations detected at least based on the input signal, a time for which each of the input operations is maintained, or sequences of the one or more input operations. Further, the control module 430 changes a setting of the camera module 410 to acquire the image according to the one or more input operations.
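The start/stop-by-operation behavior above can be sketched as a simple toggle driven by the detected operation sequence. The `Camera` class is a stand-in, not the patent's camera module API, and the toggle policy is one illustrative choice.

```python
# Sketch: a first contact operation starts image acquisition, a later
# contact operation ends it; separation operations in between are ignored.

class Camera:
    def __init__(self):
        self.recording = False
        self.events = []

    def start(self):
        self.recording = True
        self.events.append("start")

    def stop(self):
        self.recording = False
        self.events.append("stop")

def drive(camera, operations):
    """Toggle acquisition on each contact operation in the sequence."""
    for op in operations:
        if op != "contact":
            continue
        if camera.recording:
            camera.stop()
        else:
            camera.start()

cam = Camera()
drive(cam, ["contact", "separation", "contact"])
print(cam.events)  # ['start', 'stop']
```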
  • various embodiments of this document will describe that the control module 430 controls the camera module 410 at least based on one or more detected input operations.
  • the control module 430 may control the camera module 410 or the audio module 280 based on one or more detected input operations and at least based on the time when each input operation is generated.
  • the control module 430 performs an operation for continuously acquiring one or more of images or audio data through at least one of the camera module 410 and the audio module 280 and temporarily stores the images or audio data in the memory 230 of the electronic device 400 for a predetermined maintenance time tp (for example, 10 seconds) from the acquisition time.
  • the control module 430 may acquire and store one or more images or audio data during a predetermined maintenance time (tn) from the time (t) when the input operation is generated.
  • the control module 430 may store one or more images or audio data for a period from a time earlier than the time (t) when the input operation is generated by a first maintenance time, to a time later than the time (t) by a second maintenance time.
  • this enables a sound shot, for example, an operation for recording audio data for a predetermined time including the time before or after the photographing time and correlating the recorded audio data with one or more images.
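The temporary-storage behavior above amounts to a rolling buffer: data is continuously kept for the last tp seconds so that, when an input operation arrives at time t, samples from before t are already available. The class name, the one-sample-per-second timing, and the string samples below are illustrative assumptions.

```python
# Minimal pre-roll sketch: keep the last `keep_seconds` of samples, then
# extract the window from t - before to t + after around an input operation.
from collections import deque

class RollingBuffer:
    def __init__(self, keep_seconds=10.0):
        self.keep_seconds = keep_seconds
        self.items = deque()  # (timestamp, sample) pairs

    def push(self, timestamp, sample):
        self.items.append((timestamp, sample))
        # drop everything older than the maintenance time tp
        while self.items and timestamp - self.items[0][0] > self.keep_seconds:
            self.items.popleft()

    def snapshot(self, t, before, after):
        """Samples from t - before up to t + after that are currently held."""
        return [s for ts, s in self.items if t - before <= ts <= t + after]

buf = RollingBuffer(keep_seconds=10.0)
for second in range(20):  # 20 seconds of one sample per second
    buf.push(float(second), f"frame{second}")
print(buf.snapshot(t=18.0, before=2.0, after=1.0))  # frames 16..19
```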
  • FIG. 10 is a flowchart of an operation for acquiring an image according to a contact operation by the electronic device, according to an embodiment of the present invention.
  • the electronic device 400 receives an input signal including a physiological signal when the user brings a part of the body, such as a finger, into contact with the sensor module 420, and, when it is detected based on the input signal that the contact state is maintained for a predetermined time, acquires an image through the camera module after a predetermined time elapses.
  • the control module 430 recognizes the contact operation or the contact maintenance state according to the contact operation as a control command that instructs to perform a preparation operation to acquire the image.
  • the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects the contact operation at least based on the input signal in step 1210 .
  • the control module 430 changes a setting of the camera module 410 in preparation for acquiring the image in step 1220 .
  • An operation for changing the setting may be an operation for controlling the focus of the camera module 410 .
  • the operation for changing the setting may be an operation for controlling white balance of the camera module.
  • the operation for changing the setting may be an operation for controlling at least one of ISO, exposure, aperture, and shutter speed.
  • the control module 430 detects whether the contact state between the subject 510 to be examined and the sensor module 420 is maintained for a predetermined time after the contact operation at least based on the input signal including the physiological signal in step 1230 .
  • when the contact state is maintained for the predetermined time, the control module 430 acquires the image through the camera module 410 in step 1240.
  • the control module 430 may acquire the image through the camera module 410 at every contact time. For example, when, after detecting the contact operation at least based on the input signal and detecting that the contact state is maintained for a predetermined time after the contact operation, the control module 430 repeatedly detects the separation operation and the contact operation, the control module 430 may repeatedly acquire the image through the camera module 410 according to the separation operation and the contact operation.
  • the control module 430 may measure the number of times by which the separation and the contact are repeated and switch to a predetermined setting or mode. For example, when, after detecting the contact operation at least based on the input signal and detecting that the contact state is maintained for a predetermined time after the contact operation, the control module 430 repeatedly detects the separation operation and the contact operation, the control module 430 may switch a camera photographing mode according to the separation operation and the contact operation. For example, when the control module 430 detects the separation after two contacts for a predetermined time, the control module 430 may switch the photographing mode to a night photographing mode.
  • when the control module 430 detects the separation after three contacts within a predetermined time, the control module 430 switches the photographing mode to a sports photographing mode.
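The repetition-count mode switch above (two contacts for a night mode, three for a sports mode) can be sketched as counting contact operations inside a trailing time window. The window length, event format, and mode names are assumptions for illustration.

```python
# Illustrative sketch: count the contact operations observed within a
# predetermined window and map the count to a photographing mode.

MODE_BY_CONTACT_COUNT = {2: "night", 3: "sports"}

def select_mode(events, window_s=1.5):
    """`events` is a list of (timestamp, kind) pairs. Count the contact
    operations inside the trailing window and look up the mode."""
    if not events:
        return "default"
    end = events[-1][0]
    contacts = sum(1 for ts, kind in events
                   if kind == "contact" and end - ts <= window_s)
    return MODE_BY_CONTACT_COUNT.get(contacts, "default")

taps = [(0.0, "contact"), (0.3, "separation"),
        (0.6, "contact"), (0.9, "separation")]
print(select_mode(taps))  # night
```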
  • the control module 430 may execute the application program as the control module 430 detects the contact of the part of the user's body to the sensor module. For example, after detecting the contact operation and executing the application program for acquiring the image, the control module 430 changes a setting of the camera module 410 . Thereafter, when the contact state between the subject 510 to be examined and the sensor module 420 remains for the predetermined time, the control module 430 acquires the image through the camera module 410 .
  • FIG. 11 is a flowchart of an operation for acquiring an image according to a separation operation by the electronic device, according to an embodiment of the present invention.
  • the electronic device 400 acquires an image through the camera module 410 when the user separates the body part from the sensor module 420 .
  • the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects the contact operation at least based on the input signal in step 1310 .
  • the control module 430 changes a setting of the camera module 410 as the preparation for acquiring the image in step 1320 .
  • An operation for changing the setting may be an operation for controlling the focus of the camera module 410 .
  • the operation for changing the setting may be an operation for controlling white balance of the camera module.
  • the operation for changing the setting may be an operation for controlling at least one of ISO, exposure, aperture, and shutter speed.
  • the control module 430 detects a separation operation by which the subject 510 to be examined is separated from the surface of the sensor module 420 after the contact operation at least based on the input signal including the physiological signal in step 1330 .
  • the control module 430 acquires an image through the camera module 410 according to the detected separation operation in step 1340 .
  • the control module 430 may ignore other photographing commands for acquiring the image except for the detection of the separation operation after the contact operation is detected.
  • the control module 430 may ignore automatic photographing conditions which can be implemented by the electronic device 400 , for example, conditions for acquiring the image when a particular pose of the subject for photography included in a preview image is detected, when a smiling face is detected, and when a particular movement pattern is detected, or conditions for acquiring the image when another input signal is received, and may acquire the image only when the separation operation is detected.
  • FIG. 12 is a flowchart of an operation for controlling the electronic device, according to an embodiment of the present invention.
  • the electronic device 400 may acquire an image through a camera after a waiting time from the separation of the body part from the sensor module by the user.
  • the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects the contact operation at least based on the input signal in step 1310 .
  • control module 430 changes a setting of the camera module 410 in preparation for acquiring the image in step 1320 .
  • the control module 430 determines whether the contact state between the subject 510 to be examined and the sensor module 420 according to the contact operation is maintained for at least a predetermined time in step 1325 .
  • when the control module 430 detects the separation operation by which the subject 510 to be examined is separated from the surface of the sensor module 420 in step 1330, the control module 430 waits until a particular waiting time elapses in step 1335 before acquiring the image in step 1340.
  • the particular waiting time may be determined based on a time for which the contact state is maintained.
  • the control module 430 may display a waiting state on the display 440 for the particular waiting time, for example, in the form of a graphic object.
  • FIGS. 13A and 13B illustrate an elapsing of a particular time, according to an embodiment of the present invention.
  • the control module 430 displays a graphic object 441 a indicating the start of the particular waiting time on the display 440 and then changes the graphic object 441 a to a graphic object 441 b after the particular waiting time elapses.
  • the control module 430 acquires an image through the camera module 410 after the particular waiting time elapses.
  • the control module 430 may record a voice for a particular recording time through a microphone included in the electronic device 400 .
  • the control module 430 may determine the particular recording time based on the time for which the contact state is maintained.
  • the control module 430 stores the voice together with the image.
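Both the particular waiting time and the recording time above are derived from how long the contact state was maintained. A minimal sketch of one such mapping follows; the linear scale and its clamping bounds are assumptions, not values from this patent.

```python
# Hedged sketch: derive the waiting (or recording) time from the contact
# duration, with an assumed linear mapping clamped to sane bounds.

def waiting_time_s(contact_held_s, scale=0.5, min_s=1.0, max_s=5.0):
    """Longer contact -> longer countdown, clamped to [min_s, max_s]."""
    return max(min_s, min(max_s, scale * contact_held_s))

print(waiting_time_s(2.0))   # 1.0
print(waiting_time_s(6.0))   # 3.0
print(waiting_time_s(20.0))  # 5.0
```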
  • FIG. 14 illustrates controlling an electronic device based on a contact sequence, according to an embodiment of the present invention.
  • the sensor module 420 includes the light emitting unit 421 and the light receiving unit 423 , and the light emitting unit 421 includes a first light emitting unit 421 a and a second light emitting unit 421 b having light emitting diodes of different wavelengths.
  • the first light emitting unit 421 a may include, for example, LEDs emitting an infrared light
  • the second light emitting unit 421 b may include, for example, LEDs emitting a red visible light.
  • the light receiving unit 423 receives reflected lights generated when incident lights of different wavelengths penetrate the subject 510 to be examined and are then reflected and returned.
  • the sensor module 420 detects a movement direction of the subject 510 to be examined on the sensor module 420 by detecting contact sequences between the one or more LEDs and the subject 510 to be examined based on the received reflected lights.
  • the sensor module 420 may generate incident lights from the one or more LEDs simultaneously or at different times. Accordingly, the sensor module 420 detects the activated LED at a particular time among the one or more LEDs and, thus, detects the movement direction of the subject 510 to be examined on the sensor module 420 .
  • the control module 430 detects the movement direction of the subject 510 to be examined at least based on the input signal received through the sensor module 420 including the biometric sensor configured to include one or more LEDs. In this case, the control module 430 controls the electronic device 400 based on the movement direction.
  • FIGS. 15A and 15B illustrate controlling an electronic device based on a movement direction, according to an embodiment of the present invention.
  • the control module 430 detects a movement direction of the subject 510 to be examined from a first location 510 a to a second location 510 b on the surface of the sensor module 420 disposed on the rear surface of the electronic device 400 , moves the image 610 in the movement direction, and displays another image 620 according to the movement of the image.
  • the movement direction of the subject 510 to be examined is opposite to the movement direction of the images on the display 440 .
  • for example, when the movement direction of the subject 510 to be examined, which contacts the sensor module 420 disposed on the rear surface of the electronic device 400 or closely crosses over the sensor module 420, corresponds to a direction from right to left, the movement direction of the image 620 on the display 440 disposed on the front surface of the electronic device 400 corresponds to a direction from left to right.
  • the speed of contents moving on the display 440 may vary depending on a movement speed of the subject 510 to be examined.
  • the control module 430 moves the content while displaying the content on the display 440 by controlling the electronic device 400 based on the movement direction.
  • the contents may be a webpage, and the movement operation may be a scroll operation.
  • the contents may be one or more pages of an electronic book, and the movement operation may be a page turn effect operation of the electronic book.
  • the contents may be a menu focus (for example, highlight, cursor, or selection indication) displayed on a menu, and the movement operation may be a movement operation of the menu focus.
  • FIG. 16 illustrates controlling an electronic device based on a contact sequence, according to an embodiment of the present invention.
  • the sensor module 420 includes the light emitting unit 421 and the light receiving unit 423 , and the light receiving unit 423 may include one or more light receiving diodes, such as a first light receiving diode 423 a, a second light receiving diode 423 b, a third light receiving diode 423 c, and a fourth light receiving diode 423 d.
  • an incident light generated by the light emitting unit 421 is reflected after penetrating the subject 510 to be examined and is received through each light receiving diode.
  • the control module 430 detects a direction of the movement of the subject to be examined on the sensor module 420 based on positions of the one or more light receiving diodes that receive the reflected light.
  • the control module 430 detects left and right movement directions of the subject 510 to be examined on the sensor module 420 .
  • the light receiving unit 423 includes four light receiving diodes, for example, the first light receiving diode 423 a, the second light receiving diode 423 b, the third light receiving diode 423 c, and the fourth light receiving diode 423 d
  • the four light receiving diodes are arranged on the top, bottom, left, and right sides from the center of the light emitting unit 421 , respectively.
  • the control module 430 detects left and right movement directions or top, bottom, left, and right movement directions of the subject 510 to be examined on the sensor module 420 .
  • the number of light receiving diodes according to the above described embodiment is only an example, and the light receiving unit 423 may include a different number of light receiving diodes.
  • the control module 430 may detect the movement direction of the subject 510 to be examined at least based on the input signal received through the sensor module 420 including the biometric sensor configured to include one or more light receiving diodes. In this case, the control module 430 controls the electronic device 400 based on the movement direction as described above.
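The direction detection with multiple photodiodes described above can be sketched as ordering the diodes by when each first sees the reflected light: the finger moves from the earliest-triggered diode toward the latest. The diode names, the event format, and the function name are assumptions for illustration.

```python
# Hypothetical sketch: infer the movement direction across the sensor from
# the order in which each light-receiving diode first detects reflection.

def movement_direction(first_seen):
    """`first_seen` maps diode position -> timestamp of first reflection.
    The finger moves from the earliest-triggered diode toward the latest."""
    ordered = sorted(first_seen, key=first_seen.get)
    return f"{ordered[0]}-to-{ordered[-1]}"

# Finger sweeping across the sensor from left to right:
print(movement_direction({"left": 0.00, "right": 0.12}))  # left-to-right
# Sweeping top to bottom past four diodes:
print(movement_direction({"top": 0.00, "left": 0.05,
                          "right": 0.05, "bottom": 0.10}))  # top-to-bottom
```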
  • FIG. 17 illustrates a biometric sensor coupled to a button of an electronic device, according to an embodiment of the present invention.
  • the sensor module 420 may be formed to cover a switch disposed on the body of the electronic device 400 .
  • the sensor module 420 includes a biometric sensor arranged above a button 425 that covers the switch, in a stacked form.
  • the button 425 and the sensor module 420 may be functionally connected to control image acquisition through the camera module 410 .
  • when the sensor module 420 is formed to be disposed above the button 425, the user contacts the sensor module 420 before pressing the button 425, so that the sensor module 420 receives a physiological signal of a user's body part (for example, a finger) corresponding to the subject 510 to be examined.
  • the control module 430 detects an input operation at least based on an input signal including the physiological signal and, accordingly, controls the electronic device 400 based on the input operation, a result of the determination on whether the switch opens or shuts, or both the input operation and the result of the determination on whether the switch opens or shuts.
  • the electronic device 400 may include sensors for measuring other physical quantities between the sensor module 420 and the button 425 , above the sensor module 420 , or below the button 425 . Although it has been described that the button 425 is formed on the front case of the electronic device 400 , the button 425 may be disposed on another surface of the electronic device 400 .
  • FIG. 18 illustrates a method of using a biometric sensor coupled to a button of an electronic device, according to an embodiment of the present invention.
  • an electronic device 400 including the sensor module 420 formed to cover the button 425 is provided.
  • When the biometric sensor included in the sensor module 420 is deactivated, the control module 430 activates the biometric sensor after detecting the press of the button 425, as shown in step [a]. Further, the control module 430 may execute an application program for acquiring an image through the camera module 410 according to the activation of the biometric sensor.
  • When the control module 430 detects a contact operation or a separation operation of the subject 510 to be examined at least based on the input signal received through the sensor module 420 and detects the press of the button 425 after the contact operation, the control module 430 changes a setting of the camera module 410, as shown in step [b]. For example, the control module 430 may control the focus or white balance according to the contact operation even before the press of the button 425 is detected.
  • the contact operation may correspond to a half-press to focus for controlling the focus of the camera module 410 .
  • the control module 430 acquires the image through the camera module 410 according to the press of the switch 452.
  • the control module 430 deactivates the biometric sensor according to the separation operation, as shown in step [d].
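The flow of steps [a] through [d] above can be sketched as a small state machine. The event names and recorded actions below are illustrative assumptions, not the actual device firmware.

```python
# Illustrative state machine for steps [a]-[d] of FIG. 18: button press
# activates the sensor, contact adjusts focus, press captures, separation
# deactivates. Event names and logged actions are assumptions.

class SensorButtonController:
    def __init__(self):
        self.sensor_active = False
        self.in_contact = False
        self.log = []

    def handle(self, event):
        if event == 'button_press' and not self.sensor_active:
            # step [a]: first press activates the biometric sensor and may
            # launch the camera application
            self.sensor_active = True
            self.log.append('camera_app_launched')
        elif event == 'contact' and self.sensor_active:
            # step [b]: contact acts like a half-press (focus / white balance)
            self.in_contact = True
            self.log.append('focus_adjusted')
        elif event == 'button_press' and self.in_contact:
            # step [c]: press while in contact captures the image
            self.log.append('image_captured')
        elif event == 'separation' and self.sensor_active:
            # step [d]: separation deactivates the biometric sensor
            self.in_contact = False
            self.sensor_active = False
            self.log.append('sensor_deactivated')

ctrl = SensorButtonController()
for e in ['button_press', 'contact', 'button_press', 'separation']:
    ctrl.handle(e)
print(ctrl.log)
```

Running the four events in order yields the [a]-[d] sequence: launch, focus, capture, deactivate.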
  • FIG. 19 is a flowchart of an operation for processing an image by using fingerprint information, according to an embodiment of the present invention.
  • the sensor module 420 may include a fingerprint recognition sensor.
  • the sensor module 420 receives an input signal including information on the fingerprint of the subject 510 to be examined, through the fingerprint recognition sensor.
  • the control module 430 controls the electronic device 400 based on the fingerprint information.
  • the sensor module 420 receives an input signal including a physiological signal and fingerprint information, and the control module 430 detects an input operation at least based on the input signal in step 1410 .
  • the control module 430 identifies the user based on the fingerprint information included in the input signal in step 1420 .
  • the control module 430 acquires an image through the camera module 410 according to the input operation in step 1430 .
  • the control module 430 detects whether the user is included in the image in step 1440 , and, when the user is included in the image, stores the image in a protected storage space in step 1450 .
  • the protected storage space may be a secure area existing within the electronic device 400 , a personal storage space for the user, a personal storage area based on an external device of the electronic device 400 , or a service.
  • the control module 430 may acquire an image through the camera module 410 according to the input operation, and store the image in a guest storage space.
  • the guest storage space may be a public area existing within the electronic device 400 (for example, a memory which can be accessed without user authentication), or a personal storage area based on a service or an internal or external device of the electronic device 400, storing at least one of the fingerprint information or the image to record the user when the electronic device 400 is stolen.
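The storage routing of FIG. 19 (steps 1410 through 1450) can be sketched as follows. The database shape, fingerprint identifiers, and path names are assumptions for illustration only.

```python
# Illustrative routing per FIG. 19: store the image in a protected space when
# the fingerprint identifies a registered user who also appears in the image,
# and in a guest space otherwise. Names and paths are assumptions.

def route_image(image, fingerprint, fingerprint_db, user_in_image):
    user = fingerprint_db.get(fingerprint)   # step 1420: identify the user
    if user is not None and user_in_image:   # steps 1440-1450
        return f'protected/{user}', image
    return 'guest', image                    # unregistered user, or user absent

db = {'fp-001': 'alice'}
print(route_image(b'jpeg-bytes', 'fp-001', db, user_in_image=True)[0])   # protected/alice
print(route_image(b'jpeg-bytes', 'fp-999', db, user_in_image=True)[0])   # guest
```

An unrecognized fingerprint falls through to the guest space, matching the described behavior of recording unidentified users, for example when the device is stolen.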
  • the control module 430 may control the electronic device 400 to process the image by using the heart rate information.
  • the electronic device 400 may acquire heart rate information of the subject 510 to be examined at least based on the physiological signal acquired through the sensor module 420 .
  • the control module 430 may display the heart rate information on the display 440 together with the image of the user corresponding to the subject 510 to be examined.
  • FIG. 20 is a flowchart of an operation for providing a preview image using heart rate information, according to an embodiment of the present invention.
  • FIG. 21 illustrates a screen displaying a preview image provided using heart rate information, according to an embodiment of the present invention.
  • the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects a contact operation as an input operation at least based on the input signal in step 1510 .
  • the control module 430 acquires the heart rate information of the subject 510 to be examined at least based on the physiological signal while the contact operation is maintained in step 1520 .
  • the control module 430 displays a preview image 631 acquired through the camera module 410 on the display 440 and also displays heart rate information 633 together with the preview image in step 1530 .
  • the control module 430 detects a separation operation during a contact state in step 1540, and acquires the image by controlling the camera module 410 in step 1550.
  • the control module 430 may store the image such that the heart rate information is included in the image.
  • the control module 430 may detect the image of the user corresponding to the subject 510 to be examined included in the preview image and display the image of the user and the heart rate information such that the image of the user and the heart rate information are correlated to each other.
  • An operation for designating one or more display attributes (for example, a location, size, color, shape, icon, avatar, and predetermined template) of the heart rate information 633 may be provided.
  • the heart rate information may be information related to stress or emotion determined based on the physiological signal collected through the HRM sensor or the HRV sensor, and display attributes of the heart rate information 633 may be designated or changed based on information on the corresponding stress or emotion.
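Heart rate information such as that overlaid on the preview image is typically derived from peak intervals in the photoplethysmographic signal received through the sensor module. The sketch below assumes a simple peak-picking rule and a fixed sampling rate; both are illustrative, not the patented method.

```python
# Illustrative heart-rate estimate from inter-peak intervals in a PPG-like
# physiological signal sampled while contact is maintained. The peak rule
# (local maxima above the mean) and sampling rate are simplifying assumptions.

def estimate_bpm(signal, sample_rate_hz):
    mean = sum(signal) / len(signal)
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > mean
             and signal[i] > signal[i - 1]
             and signal[i] >= signal[i + 1]]
    if len(peaks) < 2:
        return None  # not enough beats to estimate a rate
    # Average inter-beat interval in seconds, converted to beats per minute.
    intervals = [(b - a) / sample_rate_hz for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# One synthetic beat every 5 samples at 5 Hz -> one beat per second -> 60 BPM.
signal = [0, 1, 0, 0, 0] * 10
print(estimate_bpm(signal, sample_rate_hz=5))  # 60.0
```

The estimate updates as long as the contact operation is maintained, which is what allows the heart rate information 633 to be displayed alongside the live preview image 631.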
  • FIG. 22 is a flowchart of an operation for displaying a user image and heart rate information together, according to an embodiment of the present invention.
  • FIG. 23 illustrates a screen displaying an image together with heart rate information, according to an embodiment of the present invention.
  • the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects an input operation at least based on the input signal in step 1610 .
  • the control module 430 acquires heart rate information of the subject 510 to be examined at least based on the physiological signal in step 1620 .
  • the control module 430 acquires an image through the camera module 410 according to the input operation in step 1630 .
  • the control module 430 detects, within the image, a user image corresponding to the subject 510 to be examined in step 1640 .
  • the control module 430 detects the user image corresponding to the subject 510 to be examined according to the heart rate information, physiological information, or a user setting.
  • the electronic device 400 may include a database including the user image corresponding to the heart rate information, and the user image may be detected based on the database.
  • the control module 430 displays the acquired image 641 on the display 440 and displays the heart rate information such that the heart rate information is correlated to the user image 645 in a graphic object or text form 646 in step 1650 .
  • the control module 430 may display emotional information based on the physiological information together with the image.
  • FIG. 24 is a flowchart of an operation for displaying an acquired image and emotional information together, according to an embodiment of the present invention.
  • FIG. 25 illustrates a screen displaying an image together with emotional information, according to an embodiment of the present invention.
  • the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects an input operation at least based on the input signal in step 1710 .
  • the control module 430 acquires emotional information of the subject 510 to be examined at least based on the physiological signal in step 1720 .
  • the emotional information may include, for example, joy, shyness, grief, or excitement.
  • the emotional information may be acquired based on heart rate information.
  • the control module 430 determines that the emotional information corresponds to joy by detecting an increase in the heart rate for a predetermined time or a particular pattern of the heart rate signal.
  • the control module 430 may acquire the emotional information by identifying an emotional state shown in the preview image acquired through the camera module 410.
  • the control module 430 may acquire the emotional information based on at least one of a movement pattern of the user's body within the preview image, a facial expression in a face image, a face shape, a movement pattern of the face (for example, a change in shape of the eyes or mouth), existence or nonexistence of laughter, and existence or nonexistence of blinking.
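A toy version of mapping heart rate behavior to emotional information, echoing the joy-detection rule above, might look like the following. The thresholds and labels are illustrative assumptions, not values from the patent.

```python
# Toy rule: a sustained heart-rate rise over a predetermined window maps to
# an emotional label. Thresholds and labels are illustrative assumptions.

def infer_emotion(bpm_series):
    """bpm_series: heart-rate samples over a predetermined time window."""
    if len(bpm_series) < 2:
        return 'neutral'
    rise = bpm_series[-1] - bpm_series[0]
    if rise > 10:
        return 'excitement'   # strong sustained increase
    if rise > 5:
        return 'joy'          # moderate increase over the window
    return 'neutral'

print(infer_emotion([70, 74, 85]))  # excitement
print(infer_emotion([70, 72, 76]))  # joy
```

In practice, such a heart-rate rule would be combined with the preview-image cues listed above (facial expression, movement pattern, laughter, blinking) rather than used alone.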
  • the control module 430 acquires an image through the camera module 410 according to the input operation in step 1730.
  • the control module 430 displays the emotional information on the display 440 together with the image in step 1740.
  • the control module 430 may change the user's facial expression shown in the image according to the emotional information. More specifically, the control module 430 may analyze feature points of the user's face and change a location of a feature point associated with the emotional information among the feature points. When the emotional information corresponds to joy, the control module 430 may change the user's facial expression by changing locations of the feature points corresponding to the mouth such that the mouth corner lifts or lips part, so as to represent the user's smiling face in the image.
  • the control module 430 may change a part or all of the colors of the user's face shown in the image according to the emotional information. For example, when the emotional information corresponds to excitement or shyness, the control module 430 may make a red color 653 appear on the cheek of the face of the user 651 shown in the image.
  • the control module 430 may identify people included in the image, acquire physiological information or emotional information of the user corresponding to the subject 510 to be examined and other people, and display the physiological information and emotional information together. To this end, the electronic device 400 can communicate with another device including physiological information of the identified people. The control module 430 may acquire the physiological information of the user based on the physiological signal included in the input signal received through the sensor module 420 and acquire the physiological information of other people through communication with another electronic device corresponding to face recognition information.
  • FIG. 26 illustrates a screen displaying an image in which physiological information is correlated with the user, according to an embodiment of the present invention.
  • physiological information 662 of the user 661 corresponding to the subject 510 to be examined is acquired through the sensor module of the electronic device 400 and correlated to the user 661 .
  • physiological information 664 and 668 of other users 663 and 667 may be acquired through communication or wearable devices of the other users 663 and 667 and correlated to the other users 663 and 667 .
  • the control module 430 may store physiological information or emotional information together with the image acquired through the camera module 410 .
  • the physiological information or the emotional information may be stored in a meta information area of a file format for the image.
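Storing the physiological or emotional information in the meta information area of the image file format can be sketched as follows. A real implementation might write an EXIF or XMP field; the JSON blob here is an illustrative stand-in, and the byte string is a placeholder, not real image data.

```python
# Illustrative sketch of attaching physiological/emotional information to an
# image record. A real implementation might use an EXIF UserComment or XMP
# field in the file's meta information area; a JSON blob stands in here.

import json

def attach_meta(image_bytes, heart_rate_bpm, emotion):
    meta = {'heart_rate_bpm': heart_rate_bpm, 'emotion': emotion}
    return {'image': image_bytes, 'meta': json.dumps(meta)}

record = attach_meta(b'placeholder-jpeg-bytes', 72, 'joy')
print(json.loads(record['meta']))  # {'heart_rate_bpm': 72, 'emotion': 'joy'}
```

Keeping the information in the meta area rather than burned into the pixels lets the display step correlate it with the user image dynamically, as in FIGS. 23 and 26.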
  • FIGS. 27A to 27D illustrate a method of controlling an electronic device to photograph a panorama image, according to an embodiment of the present invention.
  • the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects the input operation at least based on the input signal.
  • the control module 430 combines one or more images acquired according to the detected input operation to generate connected images, for example, panorama images.
  • the control module 430 of the electronic device 400 acquires a reference image through the camera module 410 and acquires one or more images connected to the reference image according to top and bottom or left and right rotation of the body of the electronic device 400.
  • the control module 430 displays a guide for indicating locations of the one or more images connected to the reference image according to the rotation of the body on the display of the electronic device 400 .
  • the guide displays a photographing direction or a photographing location.
  • the control module 430 combines the reference image and the one or more images.
  • the combination may correspond to a combination of the one or more images corresponding to images above or below the reference image according to the top and bottom rotation of the body or a combination of the one or more images corresponding to images on the left or right side of the reference image according to the left and right rotation of the body.
  • the other input operation may be another contact operation or separation operation.
  • the control module 430 acquires a reference image 670 a through the camera module 411 arranged on the front surface of the body of the electronic device 400 .
  • the reference image 670 a may include an image 671 corresponding to the user of the electronic device 400 .
  • the control module 430 acquires at least one image 670 b or 670 c corresponding to an image on the right side of the reference image 670 a through the camera module 411 as the body of the electronic device 400 rotates counterclockwise.
  • At least one image 670 b or 670 c may include at least some areas of the reference image 670 a.
  • the control module 430 acquires at least one image 670 d or 670 e corresponding to an image on the left side of the reference image 670 a through the camera module 411 as the body of the electronic device 400 rotates clockwise.
  • At least one image 670 d or 670 e may include at least some areas of the reference image 670 a.
  • when an additional contact operation is detected or when a separation operation is detected after a contact state according to the contact operation is maintained for a particular time, the control module 430 generates one image 670 f by connecting the reference image 670 a and at least one image 670 b, 670 c, 670 d, or 670 e.
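The combination of the reference image with images captured to its left and right, per FIGS. 27A to 27D, can be sketched as an ordering step before stitching. The rotation-offset bookkeeping field below is an assumption for illustration, not part of the patent.

```python
# Illustrative ordering of captured frames before stitching into one panorama
# image (670 f). 'offset' is an assumed bookkeeping value: degrees of body
# rotation relative to the reference image, negative toward the left.

def order_for_panorama(reference, frames):
    """frames: list of (offset, image) pairs captured as the body rotates;
    the reference image is taken to have offset 0."""
    all_frames = frames + [(0, reference)]
    return [img for _, img in sorted(all_frames)]

pano = order_for_panorama('ref', [(30, 'right1'), (-30, 'left1'), (60, 'right2')])
print(pano)  # ['left1', 'ref', 'right1', 'right2']
```

Since each captured image shares at least some areas with its neighbor, a stitcher can then blend the ordered frames into the single connected image 670 f.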

Abstract

A method is provided. The method includes executing a camera application at an electronic device; activating a heart rate monitor (HRM) sensor operatively coupled with a first surface of the electronic device in response to the execution of the camera application; receiving an input signal via the HRM sensor; and capturing, using a processor operatively coupled with the electronic device, an image via an image sensor operatively coupled with a second surface of the electronic device based at least in part on the input signal.

Description

PRIORITY
This application is a Continuation Application of, and claims priority under 35 U.S.C. § 120 to, U.S. application Ser. No. 15/239,307, which was filed in the U.S. Patent & Trademark Office on Aug. 17, 2016 and was a Continuation Application of, and claimed priority under 35 U.S.C. § 120 to, U.S. application Ser. No. 14/843,593, which was filed in the U.S. Patent & Trademark Office on Sep. 2, 2015, issued as U.S. Pat. No. 9,444,998 on Sep. 13, 2016, and claimed priority under 35 U.S.C. § 119(a) to Korean Patent Application No. 10-2014-0116510, filed in the Korean Intellectual Property Office on Sep. 2, 2014, the entire contents of each of which are incorporated herein by reference.
BACKGROUND 1. Field of the Invention
The present invention relates generally to a method for controlling a camera module, and more particularly, to a method for controlling a camera module based on a physiological signal.
2. Description of the Related Art
Various electronic devices include a camera module, and conventionally acquire an image by either pressing a photography button mounted to a body of the electronic device or by touching a photography button displayed on a display.
However, an operation of manipulating a mechanical button or a button image on a display may influence the quality of the acquired image.
SUMMARY
The present invention has been made to address at least the problems and disadvantages described above and to provide at least the advantages described below.
Accordingly, an aspect of the present invention is to provide a method of image acquisition through a camera module of an electronic device without manipulating a mechanical button or a virtual button displayed on a display of the electronic device.
Accordingly, another aspect of the present invention is to provide a method of controlling a camera module of an electronic device by using physiological information.
Accordingly, another aspect of the present invention is to allow a user to control a camera module of an electronic device by using a sensor module. For example, a user of an electronic device can change a setting of the camera module or instruct the electronic device to acquire an image based on a motion of a finger on the sensor module.
Accordingly, another aspect of the present invention is to acquire an image through a camera module of the electronic device which can be processed based on physiological information or emotional information received by the sensor module.
In accordance with an aspect of the present invention, a method is provided. The method includes executing a camera application at an electronic device; activating a heart rate monitor (HRM) sensor operatively coupled with a first surface of the electronic device in response to the execution of the camera application; receiving an input signal via the HRM sensor; and capturing, using a processor operatively coupled with the electronic device, an image via an image sensor operatively coupled with a second surface of the electronic device based at least in part on the input signal.
In accordance with another aspect of the present invention, an apparatus is provided. The apparatus includes an image sensor operatively coupled with a first surface of the apparatus; a heart rate monitor (HRM) sensor coupled with a second surface of the apparatus; and a processor adapted to execute a camera application; activate the HRM sensor in response to the execution of the camera application; receive an input signal via the HRM sensor; and capture an image using the image sensor based at least in part on the input signal.
In accordance with another aspect of the present invention, an apparatus is provided. The apparatus includes an image sensor operatively coupled with a first surface of the apparatus; a heart rate monitor (HRM) sensor coupled with a second surface of the apparatus; and a processor adapted to activate the HRM sensor; receive an input signal using the activated HRM sensor; and capture an image using the image sensor based at least in part on the input signal.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other aspects, features and advantages of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram of an electronic device within a network environment, according to an embodiment of the present invention;
FIG. 2 is a block diagram of a configuration of an electronic device, according to an embodiment of the present invention;
FIG. 3 is a block diagram of a programming module of an electronic device, according to an embodiment of the present invention;
FIG. 4 is a block diagram of an electronic device, according to an embodiment of the present invention;
FIGS. 5A and 5B are perspective views of an electronic device, according to an embodiment of the present invention;
FIGS. 6A to 6C illustrate an operational state of an electronic device, according to an embodiment of the present invention;
FIGS. 7A and 7B illustrate an operation of a sensor module of an electronic device, according to an embodiment of the present invention;
FIGS. 8A and 8B are graphs illustrating physiological signals detected by an electronic device according to time lapse, according to an embodiment of the present invention;
FIG. 9 is a flowchart of an operation for acquiring an image based on an input signal by the electronic device, according to an embodiment of the present invention;
FIG. 10 is a flowchart of an operation for acquiring an image according to a contact operation by the electronic device, according to an embodiment of the present invention;
FIG. 11 is a flowchart of an operation for acquiring an image according to a separation operation by the electronic device, according to an embodiment of the present invention;
FIG. 12 is a flowchart of an operation for controlling the electronic device, according to an embodiment of the present invention;
FIGS. 13A and 13B illustrate an elapsing of a particular time, according to an embodiment of the present invention;
FIG. 14 illustrates controlling an electronic device based on a contact sequence, according to an embodiment of the present invention;
FIGS. 15A and 15B illustrate controlling an electronic device based on a movement direction, according to an embodiment of the present invention;
FIG. 16 illustrates controlling an electronic device based on a contact sequence, according to an embodiment of the present invention;
FIG. 17 illustrates a biometric sensor coupled to a button of an electronic device, according to an embodiment of the present invention;
FIG. 18 illustrates a method of using a biometric sensor coupled to a button of an electronic device, according to an embodiment of the present invention;
FIG. 19 is a flowchart of an operation for processing an image by using fingerprint information, according to an embodiment of the present invention;
FIG. 20 is a flowchart of an operation for providing a preview image by using heart rate information, according to an embodiment of the present invention;
FIG. 21 illustrates a screen displaying a preview image provided using heart rate information, according to an embodiment of the present invention;
FIG. 22 is a flowchart of an operation for displaying a user image and heart rate information together, according to an embodiment of the present invention;
FIG. 23 illustrates a screen displaying an image together with heart rate information, according to an embodiment of the present invention;
FIG. 24 is a flowchart of an operation for displaying an acquired image and emotional information together, according to an embodiment of the present invention;
FIG. 25 illustrates a screen displaying an image together with emotional information, according to an embodiment of the present invention;
FIG. 26 illustrates a screen displaying an image in which physiological information is correlated with the user, according to an embodiment of the present invention; and
FIGS. 27A to 27D illustrate a method of controlling an electronic device to photograph a panorama image, according to an embodiment of the present invention.
DETAILED DESCRIPTION OF EMBODIMENTS OF THE PRESENT INVENTION
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding, but these are to be regarded merely as examples. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present invention. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to their dictionary meanings, but, are merely used to enable a clear and consistent understanding of the present invention. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present invention is provided for illustration purposes only and not for the purpose of limiting the present invention as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
An electronic device according to an embodiment of the present invention may be an apparatus having a communication function. For example, the electronic device according to the present invention may be at least one of, and/or a combination of, a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an electronic-book (e-book) reader, a desktop PC, a laptop PC, a netbook computer, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an MP3 player, a mobile medical appliance, an electronic bracelet, an electronic necklace, an electronic accessory, a camera, a wearable device, an electronic clock, a wrist watch, home appliances (e.g., a refrigerator, an air conditioner, a cleaner, an oven, a microwave oven, a washing machine, an air cleaner, etc.), an artificial intelligence robot, a television (TV), a Digital Versatile Disk (DVD) player, an audio player, various medical appliances (e.g., a Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, a Computerized Tomography (CT) device, an ultrasonography device, etc.), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), a set-top box, a TV box (e.g., HomeSync™ of SAMSUNG Electronics, Co., Apple TV™ of APPLE, Co., and Google TV™ of Google, Co.), an electronic dictionary, an infotainment device for a vehicle, electronic equipment for a ship (e.g., a navigation device, a gyrocompass, etc.), an avionic device, a security device, electronic clothing, an electronic key, a camcorder, a game console, a Head-Mounted Display (HMD) unit, a flat panel display device, an electronic frame, an electronic album, a piece of furniture having a communication function and/or a part of a building/structure, an electronic board, an electronic signature receiving device, and a projector.
It should be obvious to those skilled in the art that the electronic device according to the present invention is not limited to the aforementioned devices.
FIG. 1 is a block diagram of an electronic device within a network environment, according to an embodiment of the present invention.
Referring to FIG. 1, the electronic device 101 includes a bus 110, a processor 120, a memory 130, an input/output interface 150, a display 160, and a communication interface 170.
The bus 110 is a circuit for interconnecting the elements described above, e.g., the processor 120, the memory 130, the input/output interface 150, the display 160, and the communication interface 170, and for allowing a communication, e.g., by transferring a control message, between the elements described above.
The processor 120 receives commands from the above-mentioned other elements, e.g. the memory 130, the input/output interface 150, the display 160, and the communication interface 170, through, for example, the bus 110, deciphers the received commands, and performs operations and/or data processing according to the deciphered commands.
The memory 130 stores commands received from the processor 120 and/or other elements, e.g. the input/output interface 150, the display 160, and the communication interface 170, and/or commands and/or data generated by the processor 120 and/or other elements. The memory 130 includes software and/or programs 140, such as a kernel 141, middleware 143, an Application Programming Interface (API) 145, and an application 147. Each of the programming modules described above may be configured by software, firmware, hardware, and/or combinations of two or more thereof.
The kernel 141 controls and/or manages system resources, e.g. the bus 110, the processor 120 or the memory 130, used for execution of operations and/or functions implemented in other programming modules, such as the middleware 143, the API 145, and/or the application 147. Further, the kernel 141 provides an interface through which the middleware 143, the API 145, and/or the application 147 can access and then control and/or manage an individual element of the electronic device 101.
The middleware 143 performs a relay function which allows the API 145 and/or the application 147 to communicate with and exchange data with the kernel 141. Further, in relation to operation requests received from the application 147, the middleware 143 performs load balancing on the operation requests by, for example, giving priority in using a system resource, e.g., the bus 110, the processor 120, and/or the memory 130, of the electronic device 101 to at least one application among the applications 147.
The API 145 is an interface through which the application 147 controls a function provided by the kernel 141 and/or the middleware 143, and includes, for example, at least one interface or function for file control, window control, image processing, and/or character control.
The input/output interface 150 receives a command and/or data from a user, and transfers the received command and/or data to the processor 120 and/or the memory 130 through the bus 110. The display 160 displays an image, a video, and/or data to a user.
The communication interface 170 establishes a communication between the electronic device 101 and a first external device 102, a second external device 104, and/or a server 106. The communication interface 170 supports short range communication protocols, e.g., a WiFi protocol, a BlueTooth (BT) protocol, and a Near Field Communication (NFC) protocol, communication networks (e.g., Internet, Internet of Things (IoT), Local Area Network (LAN), Wide Area Network (WAN), a telecommunication network, a cellular network, and a satellite network), a Plain Old Telephone Service (POTS), or any other similar and/or suitable communication network, such as the network 162. Each of the first and second external devices 102 and 104 may be a same type and/or different types of electronic devices.
FIG. 2 is a block diagram of a configuration of an electronic device, according to an embodiment of the present invention.
Referring to FIG. 2, a block diagram of an electronic device 201 is provided. The electronic device 201 may configure a whole or a part of the electronic device 101 illustrated in FIG. 1. The electronic device 201 includes one or more Application Processors (APs) 210, a communication module 220, a Subscriber Identification Module (SIM) card 224, a memory 230, a sensor module 240, an input device 250, a display module 260, an interface 270, an audio module 280, a camera module 291, a power management module 295, a battery 296, an indicator 297, and a motor 298.
The AP 210 operates an Operating System (OS) or an application program so as to control a plurality of hardware or software components connected to the AP 210 and to execute various data processing and calculations, including multimedia data. The AP 210 may be implemented by, for example, a System on Chip (SoC). The AP 210 may further include a Graphics Processing Unit (GPU).
The communication module 220 transmits/receives data in communication between different electronic devices, e.g., the second external device 104 and the server 106, connected to the electronic device 201 through the network 162. According to an embodiment, the communication module 220 includes a cellular module 221, a WiFi module 223, a BlueTooth (BT) module 225, a Global Positioning System (GPS) module 227, a Near Field Communication (NFC) module 228, and a Radio Frequency (RF) module 229.
The cellular module 221 provides a voice call, a video call, a Short Message Service (SMS), or an Internet service through a communication network (for example, Long Term Evolution (LTE), LTE-Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Universal Mobile Telecommunication System (UMTS), Wireless Broadband (WiBro), Global System for Mobile communication (GSM), etc.). Further, the cellular module 221 authenticates electronic devices within a communication network by using the SIM card 224. According to an embodiment, the cellular module 221 performs at least some of the functions which can be provided by the AP 210. For example, the cellular module 221 may perform at least some of the multimedia control functions.
The cellular module 221 may include a Communication Processor (CP). Further, the cellular module 221 may be implemented by, for example, an SoC.
Although the components such as the cellular module 221 (for example, communication processor), the memory 230, and the power management module 295 are illustrated as components separate from the AP 210 in FIG. 2, the AP 210 may include at least some (for example, cellular module 221) of the aforementioned components in an embodiment.
The AP 210 or the cellular module 221 (for example, communication processor) loads a command or data received from at least one of a non-volatile memory and other components connected to each of the AP 210 and the cellular module 221 to a volatile memory and processes the loaded command or data. Further, the AP 210 or the cellular module 221 stores data received from at least one of other components or generated by at least one of other components in a non-volatile memory.
Each of the WiFi module 223, the BT module 225, the GPS module 227, and the NFC module 228 may include, for example, a processor for processing data transmitted/received through the corresponding module. Although the cellular module 221, the WiFi module 223, the BT module 225, the GPS module 227, and the NFC module 228 are illustrated as blocks separate from each other in FIG. 2, at least some of the cellular module 221, the WiFi module 223, the BT module 225, the GPS module 227, and the NFC module 228 may be included in one Integrated Chip (IC) or one IC package according to one embodiment. For example, at least some (for example, the communication processor corresponding to the cellular module 221 and the WiFi processor corresponding to the WiFi module 223) of the processors corresponding to the cellular module 221, the WiFi module 223, the BT module 225, the GPS module 227, and the NFC module 228 may be implemented by one SoC.
The RF module 229 transmits/receives data, for example, an RF signal. The RF module 229 may include, for example, a transceiver, a Power Amp Module (PAM), a frequency filter, a Low Noise Amplifier (LNA), etc. Further, the RF module 229 may include a component for transmitting/receiving electromagnetic waves through free space in wireless communication, for example, a conductor, a conducting wire, etc. Although the cellular module 221, the WiFi module 223, the BT module 225, the GPS module 227, and the NFC module 228 share one RF module 229 in FIG. 2, at least one of the cellular module 221, the WiFi module 223, the BT module 225, the GPS module 227, and the NFC module 228 may transmit/receive an RF signal through a separate RF module.
The SIM card 224 is a card that may be inserted into a slot formed in a particular portion of the electronic device 201. The SIM card 224 includes unique identification information (for example, Integrated Circuit Card IDentifier (ICCID)) or subscriber information (for example, International Mobile Subscriber Identity (IMSI)).
The memory 230 includes an internal memory 232 or an external memory 234.
The internal memory 232 may include at least one of a volatile memory (for example, a Random Access Memory (RAM), a Dynamic RAM (DRAM), a Static RAM (SRAM), Synchronous Dynamic RAM (SDRAM), etc.), and a non-volatile memory (for example, a Read Only Memory (ROM), a One Time Programmable ROM (OTPROM), a Programmable ROM (PROM), an Erasable and Programmable ROM (EPROM), an Electrically Erasable and Programmable ROM (EEPROM), a mask ROM, a flash ROM, a NAND flash memory, a NOR flash memory, etc.).
The internal memory 232 may be a Solid State Drive (SSD).
The external memory 234 may include a flash drive, for example, a Compact Flash (CF), a Secure Digital (SD), a Micro Secure Digital (Micro-SD), a Mini Secure Digital (Mini-SD), an extreme Digital (xD), or a memory stick. The external memory 234 may be functionally connected to the electronic device 201 through various interfaces. The electronic device 201 may further include a storage device (or storage medium) such as a hard drive.
The sensor module 240 measures a physical quantity or detects an operation state of the electronic device 201, and converts the measured or detected information to an electronic signal. The sensor module 240 may include at least one of a gesture sensor 240A, a gyro sensor 240B, an atmospheric pressure (barometric) sensor 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a proximity sensor 240G, a color sensor 240H (for example, a Red, Green, and Blue (RGB) sensor), a biometric sensor 240I, a temperature/humidity sensor 240J, an illuminance (light) sensor 240K, and an Ultra Violet (UV) sensor 240M. Additionally or alternatively, the sensor module 240 may include, for example, an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an InfraRed (IR) sensor, an iris sensor, a fingerprint sensor, etc. The sensor module 240 may further include a control circuit for controlling one or more sensors included in the sensor module 240.
The input device 250 includes a touch panel 252, a (digital) pen sensor 254, a key 256, and an ultrasonic input device 258. For example, the touch panel 252 recognizes a touch input in at least one of a capacitive type, a resistive type, an infrared type, and an acoustic wave type. The touch panel 252 may further include a control circuit. In the capacitive type, the touch panel 252 can recognize proximity as well as a direct touch. The touch panel 252 may further include a tactile layer, in which case the touch panel 252 provides a tactile reaction to the user.
The (digital) pen sensor 254 may be implemented, for example, using a method identical or similar to a method of receiving a touch input of the user or using a separate recognition sheet.
The key 256 may include a physical button, an optical key, or a key pad.
The ultrasonic input device 258 detects, through the microphone 288 of the electronic device 201, an acoustic wave produced by an input means that generates an ultrasonic signal, identifies the corresponding data, and thereby performs wireless recognition.
The electronic device 201 may receive a user input from the first or second external device 102 and 104 or the server 106 connected to the electronic device 201 by using the communication module 220.
The display module 260 includes a panel 262, a hologram device 264, and a projector 266.
The panel 262 may be, for example, a Liquid Crystal Display (LCD) or an Active Matrix Organic Light Emitting Diode (AM-OLED). The panel 262 may be implemented to be, for example, flexible, transparent, or wearable. The panel 262 and the touch panel 252 may be configured as one module.
The hologram device 264 displays a stereoscopic image in the air by using interference of light.
The projector 266 projects light on a screen to display an image. The screen may be located inside or outside the electronic device 201.
The display module 260 may further include a control circuit for controlling the panel 262, the hologram device 264, and the projector 266.
The interface 270 includes a High-Definition Multimedia Interface (HDMI) 272, a Universal Serial Bus (USB) 274, an optical interface 276, and a D-subminiature (D-sub) 278. The interface 270 may be included in the communication interface 170 illustrated in FIG. 1. Additionally or alternatively, the interface 270 may include, for example, a Mobile High-definition Link (MHL) interface, a Secure Digital (SD) card/Multi-Media Card (MMC) interface, or an Infrared Data Association (IrDA) standard interface.
The audio module 280 bi-directionally converts a sound and an electronic signal. At least some components of the audio module 280 may be included in the input/output interface 150 illustrated in FIG. 1. The audio module 280 processes sound information input or output through a speaker 282, a receiver 284, an earphone 286, and the microphone 288.
The camera module 291 is a device which can photograph a still image and a video. The camera module 291 may include one or more image sensors (for example, a front sensor or a back sensor), an Image Signal Processor (ISP), or a flash (for example, an LED or a xenon lamp).
The power management module 295 manages power of the electronic device 201. The power management module 295 may include, for example, a Power Management Integrated Circuit (PMIC), a charger Integrated Circuit (IC), or a battery gauge.
The PMIC may be mounted in an integrated circuit or an SoC semiconductor. Charging methods are divided into wired and wireless methods. The charger IC charges a battery and prevents over-voltage or over-current from flowing in from a charger. The charger IC may support at least one of the wired charging method and the wireless charging method.
The wireless charging method may include a magnetic resonance method, a magnetic induction method and an electromagnetic wave method, and additional circuits for wireless charging, for example, circuits such as a coil loop, a resonant circuit, a rectifier, etc. may be added.
The battery gauge measures, for example, a remaining quantity of the battery 296, or a voltage, a current, or a temperature during charging.
The battery 296 stores or generates electricity and supplies power to the electronic device 201 by using the stored or generated electricity. The battery 296 may include a rechargeable battery or a solar battery.
The indicator 297 shows particular statuses of the electronic device 201 or a part (for example, AP 210) of the electronic device 201, for example, a booting status, a message status, a charging status, etc.
The motor 298 converts an electrical signal to a mechanical vibration.
The electronic device 201 may include a processing unit (for example, a CPU) for supporting a mobile TV. The processing unit for supporting the mobile TV may process media data according to a standard of Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), media flow, etc.
Each of the components of the electronic device according to various embodiments of the present invention may be implemented by one or more components and the name of the corresponding component may vary depending on a type of the electronic device. The electronic device according to various embodiments of the present invention may include at least one of the above described components, a few of the components may be omitted, or additional components may be further included. Also, some of the components of the electronic device according to various embodiments of the present invention may be combined to form a single entity, and thus may equivalently execute functions of the corresponding components before being combined.
FIG. 3 is a block diagram of a programming module of an electronic device, according to an embodiment of the present invention.
Referring to FIG. 3, a block diagram of a programming module 300 is provided. The programming module 300 may be included (stored) in the electronic device 101 (for example, memory 130) illustrated in FIG. 1. At least some of the programming module 300 may be formed of software, firmware, hardware, or a combination of software, firmware, and hardware. The programming module 300 may be executed on the hardware (for example, the electronic device 201) and may include an Operating System (OS) controlling resources related to the electronic device 101 or various applications 370 running on the OS. For example, the OS may be Android™, iOS™, Windows™, Symbian™, Tizen™, Bada™, etc. The programming module 300 includes a kernel 320, a middleware 330, an Application Programming Interface (API) 360, and the applications 370.
The kernel 320 includes a system resource manager 321 and a device driver 323.
The system resource manager 321 may include, for example, a process manager, a memory manager, and a file system manager. The system resource manager 321 performs system resource control, allocation, and recall.
The device driver 323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, and an audio driver. Further, according to an embodiment, the device driver 323 may include an Inter-Process Communication (IPC) driver.
The middleware 330 includes a plurality of modules prepared in advance to provide a function required in common by the applications 370. Further, the middleware 330 provides a function through the API 360 to allow the applications 370 to efficiently use limited system resources within the electronic device 101. The middleware 330 includes at least one of a runtime library 335, an application manager 341, a window manager 342, a multimedia manager 343, a resource manager 344, a power manager 345, a database manager 346, a package manager 347, a connection manager 348, a notification manager 349, a location manager 350, a graphic manager 351, and a security manager 352.
The runtime library 335 includes a library module used by a compiler to add a new function through a programming language while the application 370 is executed. The runtime library 335 executes input and output, management of a memory, a function associated with an arithmetic function, etc.
The application manager 341 manages a life cycle of at least one of the applications 370.
The window manager 342 manages GUI resources used on the screen.
The multimedia manager 343 detects a format required for reproducing various media files and performs an encoding or a decoding of a media file by using a codec suitable for the corresponding format.
The resource manager 344 manages resources such as a source code, a memory, or a storage space of at least one of the applications 370.
The power manager 345 operates together with a Basic Input/Output System (BIOS) to manage a battery or power and provides power information required for the operation.
The database manager 346 manages generation, search, and change of a database to be used by at least one of the applications 370.
The package manager 347 manages an installation or an update of an application distributed in a form of a package file.
The connection manager 348 manages a wireless connection, such as WiFi or Bluetooth.
The notification manager 349 displays or notifies a user of an event, such as an arrival message, an appointment, a proximity alarm, etc., in a manner that does not disturb the user.
The location manager 350 manages location information of the electronic device 101.
The graphic manager 351 manages a graphic effect provided to the user or a user interface related to the graphic effect.
The security manager 352 provides a general security function required for a system security or a user authentication.
When the electronic device 101 has a call function, the middleware 330 may further include a telephony manager for managing a voice or a video call function of the electronic device 101.
The middleware 330 may generate a new middleware module through a combination of various functions of the aforementioned internal component modules and use the generated new middleware module. The middleware 330 may provide a module specific to each type of operating system to provide a differentiated function. Further, the middleware 330 may dynamically delete some of the conventional components or add new components. Accordingly, some of the components described in the embodiment of the present invention may be omitted, replaced with other components having different names but performing similar functions, or other components may be further included.
The API 360 is a set of API programming functions, and may be provided with a different configuration according to an operating system. For example, in Android™ or iOS™, a single API set may be provided for each platform. In Tizen™, two or more API sets may be provided.
The applications 370 may include a preloaded application and/or a third party application. The applications 370 may include a home application 371, a dialer application 372, a Short Messaging Service (SMS)/Multimedia Messaging Service (MMS) application 373, an Instant Messaging (IM) application 374, a browser application 375, a camera application 376, an alarm application 377, a contact application 378, a voice dial application 379, an email application 380, a calendar application 381, a media player application 382, an album application 383, and a clock application 384. However, the present embodiment is not limited thereto, and the applications 370 may include any other similar and/or suitable application.
At least some of the programming module 300 is implemented by a command stored in a computer-readable storage medium. When the command is executed by one or more processors (for example, processor 210), the one or more processors perform a function corresponding to the command. The computer-readable storage medium may be, for example, the memory 230. At least some of the programming module 300 may be implemented (for example, executed) by, for example, the processor 210. At least some of the programming module 300 may include, for example, a module, a program, a routine, sets of instructions, or a process for performing one or more functions.
According to an embodiment of the present invention, the names of the components of the programming module 300 may vary depending on a type of operating system. Further, the programming module 300 may include one or more of the aforementioned components, omit some of the components, or further include other additional components.
FIG. 4 is a block diagram of an electronic device, according to an embodiment of the present invention.
Referring to FIG. 4, an electronic device 400 is provided. The electronic device 400 includes a camera module 410, a sensor module 420, and a control module 430. The electronic device 400 may additionally include a display 440. The electronic device 400 may be the electronic device 101 described with reference to FIG. 1 or the electronic device 201 described with reference to FIG. 2.
The camera module 410 may include one or more image sensors for photographing a still image or a moving image. The camera module 410 changes the settings of the one or more image sensors according to a control command or a signal. For example, the camera module 410 controls the focus, the white balance, the shutter speed, or the brightness according to the control command. Further, the camera module 410 controls the one or more image sensors to acquire images according to the control command or the signal.
The sensor module 420 measures or detects a physical quantity related to the electronic device 400 or a physiological signal of a subject to be examined and converts the detected physical quantity or physiological signal into an electrical signal. The sensor module 420 includes a biometric sensor for measuring the physiological signal of the subject to be examined.
The biometric sensor may be a Photoplethysmography (PPG) sensor for the subject to be examined. The biometric sensor measures a physiological signal, including at least one of iris information, retina information, vein information, fingerprint information, and Saturation of peripheral Oxygen (SpO2).
The biometric sensor may, alternatively, be a Heart Rate Monitor (HRM), a Heart Rate Variability (HRV) sensor, an electromyogram sensor, or an electroencephalogram sensor.
Alternatively, the sensor module 420 may include at least one of an illumination sensor, a gesture sensor, an acceleration sensor, a location sensor, a gyroscope sensor, and a magnetic sensor, as well as the biometric sensor.
The sensor module 420 is configured to receive an input signal, and the input signal includes a physiological signal of the subject to be examined and information on a physical quantity related to the electronic device 400. The subject to be examined may be a user of the electronic device 400.
The control module 430 is functionally connected to the camera module 410 and the sensor module 420, and controls the camera module 410 to acquire an image at least based on the input signal received through the sensor module 420. The physiological signal is included in the input signal for controlling the camera module 410. A control command or a signal for the camera module 410 is an input operation detected based on the input signal received through the sensor module 420. The input operation includes an operation by the subject to be examined with respect to at least a part of the surface of the sensor module 420, for example, an approach operation, a contact operation, or a separation operation.
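The approach, contact, and separation operations described above could, for instance, be detected from the strength of the signal received through the sensor module. The following is an illustrative sketch only, not part of any claimed embodiment; the function name, the thresholds, and the use of a normalized reflected-light intensity are all hypothetical.

```python
def classify_input_operation(samples, contact_level=0.8, approach_level=0.3):
    """Hypothetical sketch: classify a short window of normalized
    reflected-light intensity (0.0 to 1.0) from the sensor module into
    an approach, contact, or separation operation."""
    start, end = samples[0], samples[-1]
    if end >= contact_level:
        # Signal ends strong: the subject is touching the sensor surface.
        return "contact"
    if start >= contact_level and end < approach_level:
        # Signal drops from strong to weak: the subject has moved away.
        return "separation"
    if end > start and end >= approach_level:
        # Signal is rising: the subject is approaching the sensor surface.
        return "approach"
    return "none"
```

A rising window such as `[0.1, 0.5, 0.9]` would classify as a contact operation, while a falling window such as `[0.9, 0.4, 0.1]` would classify as a separation operation.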
The display 440 displays a dynamic image acquired through the camera module 410, for example, a preview image. The display 440 displays a still image acquired through the camera module 410, for example, a photographed image. Further, the display 440 may display various User Interfaces (UIs) for controlling the camera module 410 by the control module 430.
The electronic device 400 may further include the audio module 280. In this case, the control module 430 is functionally connected to the audio module 280 and the sensor module 420, and controls the audio module 280 to acquire audio data at least based on the input signal received through the sensor module 420. The physiological signal is included in the input signal for controlling the audio module. A control command or a signal for the audio module is an input operation detected based on the input signal received through the sensor module 420. The input operation includes an operation by the subject to be examined with respect to at least a part of the surface of the sensor module 420, for example, an approach operation, a contact operation, or a separation operation.
The electronic device 400 may additionally include the communication module 220. In this case, the control module 430 is connected to the communication module and the sensor module 420 and also functionally connected to at least one of the display 440 and the audio module, and controls at least one of the display 440 and the audio module 280 to acquire an image or audio data at least based on the input signal received through the sensor module 420. The physiological signal is included in the input signal for controlling the display 440 or the audio module. A control command or a signal for the display 440 or the audio module is an input operation detected based on the input signal received through the sensor module 420. The input operation includes an operation by the subject to be examined with respect to at least a part of the surface of the sensor module 420, for example, an approach operation, a contact operation, or a separation operation. For example, the control module 430 acquires an image (for example, a call application screen or a conference application screen) or a video (for example, a video including one or more of a recipient image and a sender image) displayed on the display 440 at least based on the input signal received through the sensor module 420 during a voice call, a video call, an audio conference, or a video conference. Alternatively, the control module 430 acquires one or more of the audio signals output or input through the audio module 280.
FIGS. 5A and 5B are perspective views of an electronic device, according to an embodiment of the present invention.
Referring to FIG. 5A, a front perspective view of an electronic device 400 is provided. The electronic device 400 includes a bar type terminal body; however, the electronic device of the present invention is not limited thereto and may have various structures, such as a scroll type, a curved type, a slide type in which two or more bodies are coupled to reciprocally move, and a folder type.
As illustrated in FIG. 5A, the electronic device 400 includes a front surface, a side surface, and a rear surface, and includes both ends formed along a length direction. The body includes a case forming the exterior of the electronic device 400, and the case may be classified into a front case and a rear case. Various types of components included in the electronic device 400 may be arranged in the space formed between the front case and the rear case. When the electronic device 400 corresponds to the electronic device 201 disclosed with reference to FIG. 2, the components included in the electronic device 201 may be located in the case or may be located in the space formed between the front case and the rear case. For example, the camera module 410, the sensor module 420, a button 425, and the display 440 may be located in the case included in the body of the electronic device 400. The display 440 occupies a main part of the front case. The camera module 410 may be located in at least one of the front case and the rear case included in the body of the electronic device 400. As shown in FIG. 5A, the camera module 411 is arranged on the front case of the electronic device 400 and is located in an area close to one of the end parts of the display 440.
Referring to FIG. 5B, a rear perspective view of an electronic device 400 is provided. As shown in FIG. 5B, the camera module 413 is arranged on the rear surface of the body of the electronic device 400 and is located in the rear case.
The front camera module 411 and the rear camera module 413 may have different photographing directions. The rear camera module 413 may be configured to be capable of performing photography in higher definition than the front camera module 411. A flash 415 is disposed in an area adjacent to the rear camera module 413. When the image is acquired through the rear camera module 413, the flash 415 may shine a light toward a subject for photography.
The sensor module 420 is disposed on one surface of the body of the electronic device 400. As shown in FIG. 5B, the sensor module 420 is arranged on the rear surface of the body of the electronic device 400 and is located in the rear case. When the flash 415 is disposed on the rear case, the sensor module 420 is located in an area adjacent to the flash 415. Alternatively, the sensor module 420 may be located in an area adjacent to the rear camera module 413.
Additionally, when the user grasps the electronic device 400 by the hand to acquire an image through the front camera module 411, the sensor module 420 may be disposed at a position on the rear case that the user's fingers can reach. In this case, at least some of the user's fingers, which grasp the electronic device 400, may be objects from which the physiological signal is measured by the sensor module 420.
When the button 425 is disposed on the front case, the sensor module 420 may be located in an area adjacent to the button 425 or combined with the button 425.
The biometric sensor included in the sensor module 420 may be an HRM sensor, an HRV sensor, an ECG sensor, or an SpO2 sensor.
The biometric sensor included in the sensor module 420 includes a light emitting unit 421 for generating an incident light and irradiating the light to the subject to be examined and a light receiving unit 423 for receiving a reflected light from the subject to be examined.
The sensor module 420 may be a PPG sensor including the light emitting unit 421 and the light receiving unit 423. The light emitting unit 421 may be implemented as a Light Emitting Diode (LED). Further, the light emitting unit 421 may be implemented as one or more LEDs having different wavelengths. The different wavelengths may include visible rays or infrared rays. The physiological signal, for example, a PPG signal, is detected based on the reflected light received through the light receiving unit 423.
FIGS. 6A to 6C illustrate an operational state of an electronic device, according to an embodiment of the present invention.
Referring to FIG. 6A, a user 500 may place the front camera module 411 located on the front surface of the body of the electronic device 400 toward the user 500 and may control an operation of the front camera module 411 by a subject 510 (such as the user's hand) grasping the body of the electronic device 400.
Referring to FIG. 6B, physiological information on the user 500 is received through the sensor module 420 located in the rear case of the body of the electronic device 400. For example, the electronic device 400 receives, through the sensor module 420 configured to include the biometric sensor, an input signal including physiological information on the subject to be examined, which corresponds to at least a part of the subject 510 (for example, fingers) grasping the body of the electronic device 400. The control module 430 detects an input operation corresponding to a movement of the part of the subject 510, which is the subject to be examined, based on the input signal. The control module 430 then controls the front camera module 411 to acquire an image at least based on the input operation.
Referring to FIG. 6C, the electronic device 400 acquires the image through the camera module 411 based on the input operation on the surface of the sensor module 420 by the subject to be examined and displays the image on the display 440.
FIGS. 7A and 7B illustrate an operation of a sensor module of an electronic device, according to an embodiment of the present invention.
Referring to FIG. 7A, the sensor module 420 includes the light emitting unit 421 and the light receiving unit 423. The light emitting unit 421 generates an incident light and irradiates the light toward the subject 510 to be examined, corresponding to a part of the human tissues, and the light receiving unit 423 receives a reflected light that is generated when the incident light penetrating the subject 510 to be examined is reflected and returned.
The incident light generated from the light emitting unit 421 may be implemented to have a particular wavelength. The incident light may be, for example, a ray indicating a green light. Since the green incident light has a relatively low skin transmittance and high absorption compared to visible lights of other colors, the green incident light may be used for a wearable device worn on the wrist. Further, the incident light may be, for example, a ray indicating a red light.
Alternatively, the light emitting unit 421 may be implemented as one or more LEDs to generate different wavelengths. The one or more LEDs may generate, for example, visible lights of green, red, or other colors, or generate infrared (IR) light.
As shown in FIG. 7A, for example, the light emitting unit 421 includes a first light emitting unit 4211 and a second light emitting unit 4213 having different wavelengths. The first light emitting unit 4211 and the second light emitting unit 4213 irradiate incident lights of different wavelengths to the subject 510 to be examined and each of the reflected incident lights is received by the light receiving unit 423.
The sensor module 420 may be configured to include a PPG sensor including the light emitting unit 421 and the light receiving unit 423. As the heart within the human body contracts and relaxes, the blood flow rate of the peripheral blood vessels changes and, accordingly, the volumes of the peripheral blood vessels also change. The PPG sensor measures a change in the volume of the peripheral blood vessels by detecting a penetration amount of the light irradiated to the subject to be examined, and measures one or more of a change in a blood amount and blood oxygen saturation within the vessel based on the change in the volume of the peripheral blood vessels. The PPG sensor measures a variation in a time interval between heart rates, or heartbeats per unit time, based on the measured change in the blood amount within the vessel. Accordingly, the PPG sensor may operate as a Heart Rate Monitor (HRM) which can measure a heart rate based on the measured blood amount information.
The human tissue corresponding to the subject 510 to be examined may be, for example, a finger. While the contact state of the subject 510 on the surface of the sensor module 420 is maintained for a predetermined time after the subject 510 to be examined contacts the surface of the sensor module 420, the sensor module 420 detects a change in the blood amount within the human tissue corresponding to the subject 510 to be examined according to a contraction period and a relaxation period. For example, the sensor module 420 detects a change in brightness: the received light is relatively dark in the contraction period because the blood amount increases, and relatively bright in the relaxation period because the blood amount decreases. The measurement module included in the sensor module 420 detects the light reflected from a vessel within the human tissue through the light receiving unit 423 and converts the detected light into an electrical signal, so as to acquire the physiological signal of the subject 510 to be examined. For example, the sensor module 420 converts an amount of the light detected by the light receiving unit 423 into a voltage, receives the voltage as an input, and calculates an elapsed time between heart rates or heartbeats based on the measurement of a voltage change period.
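The conversion described above — received light to voltage samples, then the elapsed time between heartbeats from the period of the voltage change — can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the function names, peak threshold, and sample rate are assumptions.

```python
def detect_peaks(samples, threshold):
    """Return indices of local maxima above a threshold (candidate heartbeats)."""
    peaks = []
    for i in range(1, len(samples) - 1):
        if samples[i] > threshold and samples[i - 1] < samples[i] >= samples[i + 1]:
            peaks.append(i)
    return peaks

def beat_intervals_ms(samples, sample_rate_hz, threshold):
    """Elapsed time between successive heartbeats, in milliseconds,
    derived from the voltage-change period of the PPG samples."""
    peaks = detect_peaks(samples, threshold)
    period_ms = 1000.0 / sample_rate_hz
    return [(b - a) * period_ms for a, b in zip(peaks, peaks[1:])]
```

For example, peaks spaced 80 samples apart at a 100 Hz sample rate yield 800 ms inter-beat intervals, i.e. a heart rate of 75 beats per minute.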
The control module 430 analyzes the HRV based on the physiological signal received by the sensor module 420 and acquires physiological information, including autonomic nervous system information of the subject 510 to be examined, based on a result of the analysis. The analysis is an analysis of a frequency area of the HRV, for example, an analysis of power peak information generated in a particular frequency band based on Power Spectrum Density (PSD). The PSD may be computed using a correlation function method, a fast Fourier transform, or an autoregressive technique. The physiological information acquired based on a result of the analysis of the HRV may be information related to immune deficiency, physical stress, physical fatigue, lack of sleep, chronic stress, depression, and emotion (for example, preference, fright, arousal state, etc.).
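The frequency-domain HRV analysis above can be sketched with a naive discrete Fourier transform PSD and band-power extraction. The 0.04–0.15 Hz (LF) and 0.15–0.40 Hz (HF) band limits are the conventional HRV bands, not values stated in the patent, and a real implementation would use an FFT or autoregressive estimator rather than this O(n²) DFT.

```python
import math

def psd(signal, sample_rate_hz):
    """Naive one-sided DFT-based power spectral density estimate."""
    n = len(signal)
    freqs, power = [], []
    for k in range(n // 2 + 1):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        freqs.append(k * sample_rate_hz / n)
        power.append((re * re + im * im) / n)
    return freqs, power

def band_power(freqs, power, lo_hz, hi_hz):
    """Total power in [lo_hz, hi_hz), e.g. the LF band 0.04-0.15 Hz."""
    return sum(p for f, p in zip(freqs, power) if lo_hz <= f < hi_hz)
```

A power peak in the LF band relative to the HF band is one example of the "power peak information generated in a particular frequency band" that the autonomic-nervous-system inference could be based on.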
The control module 430 measures oxygen saturation based on the physiological signal received by the sensor module 420. To this end, the sensor module 420 may include an oxygen saturation sensor, and the oxygen saturation sensor measures a ratio of the hemoglobin saturated with oxygen of the total hemoglobin. The sensor module 420 for measuring the oxygen saturation may include the light emitting unit 421 including a red LED and an IR LED. Since a red wavelength and an IR wavelength have different reaction sensitivities to a change in oxygen saturation of arterial blood, SpO2 is measured through a difference between the sensitivities. Physiological information acquired based on a result of the measurement of SpO2 may be information on burned calories, a difficulty in breathing, clouded consciousness, or a body state during an exercise.
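The red/IR sensitivity difference described above is commonly exploited with a "ratio of ratios" calculation. The sketch below uses the textbook linear approximation SpO2 ≈ 110 − 25R; the calibration constants are illustrative placeholders (real devices calibrate them empirically per sensor), and none of the numbers come from the patent.

```python
def spo2_estimate(red_ac, red_dc, ir_ac, ir_dc):
    """Estimate SpO2 (%) from the AC/DC components of the red and IR
    channels using the empirical ratio-of-ratios method. The constants
    110 and 25 are a common textbook calibration, used here only as a
    placeholder."""
    r = (red_ac / red_dc) / (ir_ac / ir_dc)
    return 110.0 - 25.0 * r
```

For example, a perfusion ratio R of 0.4 maps to roughly 100% saturation under this placeholder calibration.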
Referring to FIG. 7B, the control module 430 detects an input operation corresponding to a particular movement of the subject 510 to be examined with respect to the surface of the sensor module 420 at least based on the input signal including the physiological signal. The input operation may be distinguished based on a particular area within a particular distance 429 from the surface of the sensor module 420. The particular area may be a detection area in which the sensor module 420 can detect the subject 510 to be examined. Further, the detection area may be an area within a particular range distance from the sensor module 420 so that the physiological signal received by the sensor module 420 can indicate meaningful physiological information.
For example, the detection area may be an area within a distance in which a measurable signal for the subject 510 to be examined can be received or an area in which a signal corresponding to a particular percentage of the intensity of the maximum measurable signal can be received. The control module 430 determines the proximity of the subject 510 to be examined or whether the subject 510 to be examined makes contact based on the input signal including the physiological signal of the subject 510 to be examined. The input operation may include, for example, an approach operation, a contact operation, or a separation operation.
The approach operation is an operation by which the subject 510 to be examined enters the detection area; that is, an operation by which the subject 510a to be examined enters a location of the subject 510b to be examined within the detection area. The contact operation may be an operation by which the subject 510 to be examined contacts at least a part of the surface of the sensor module 420 or an operation by which the subject 510b to be examined moves to a location of the subject 510c to be examined to contact the surface of the sensor module 420. The separation operation may be an operation by which the subject 510 moves away from the surface of the sensor module 420 or an operation by which the subject 510c to be examined separates from the surface of the sensor module 420 and moves to the location of the subject 510b to be examined.
Hereinafter, a method of determining the proximity of the subject to be examined will be described. According to various embodiments described below, the control module 430 may detect an input operation of the subject 510 to be examined by determining, based on the input signal including the physiological signal, whether the subject 510 to be examined enters the detection area, whether the subject 510 to be examined contacts at least a part of the surface of the sensor module 420, or whether the subject 510 to be examined separates from the surface of the sensor module 420.
When the subject 510 to be examined contacts at least a part of the PPG sensor, the PPG sensor receives the physiological signal including at least one of a DC component and an AC component. The AC component is a signal of a component, which varies relatively quickly depending on the heartbeat due to contraction and relaxation of the heart, and the DC component is a signal of a component, which varies relatively slowly depending on blood volume, and an absorption degree or a reflection degree of the tissues surrounding the vessel regardless of the heartbeat among the physiological signal.
FIGS. 8A and 8B are graphs illustrating physiological signals detected by an electronic device according to time lapse, according to an embodiment of the present invention.
Referring to FIGS. 8A and 8B, the control module 430 determines the proximity of the subject 510 to be examined, or whether the subject 510 to be examined makes contact, based on at least one of a signal level and a signal pattern of the DC component or the AC component of the physiological signal received by the PPG sensor.
Referring to FIG. 8A, an example of a waveform of the physiological signal including only the DC component is provided.
Referring to FIG. 8B, an example of a waveform of the physiological signal including both the DC component and the AC component is provided.
The control module 430 may determine whether the subject 510 to be examined makes contact, based on whether a signal level of the physiological signal is larger than a particular value. The signal level may be a physiological signal value determined based on the DC component or the AC component.
The control module 430 may determine whether the subject 510 to be examined makes contact, based on whether a level value of the DC component of the physiological signal is larger than a particular value.
The control module 430 may determine whether the subject 510 to be examined makes contact, based on whether a level value of the AC component of the physiological signal is within a particular value.
For example, when the level values of the DC component are classified into a first range, a second range, and a third range (first range<second range<third range), the control module 430 determines that the subject 510 to be examined is located outside the detection area when the level value of the DC component corresponding to the physiological signal is in the first range, determines that the subject 510 to be examined is located inside the detection area when the level value of the DC component is in the second range, and determines that the subject 510 to be examined contacts the surface of the sensor module 420 when the level value of the DC component is in the third range.
When detecting the input operation corresponding to the movement based on the physiological signal of the subject 510 to be examined, the control module 430 detects an approach operation when a change from the first range to the second range is detected, a contact operation when a change from the second range to the third range is detected, and a separation operation when a change from the third range to the second range is detected.
In another example, when the level value of the DC component which can be measured through the PPG sensor ranges from 0 to 200,000, the first range is less than or equal to 60,000, the second range is greater than 60,000 and less than 100,000, and the third range is greater than or equal to 100,000.
In another example, compared to the maximum value of the level value of the DC component, the first range may be less than or equal to 30%, the second range may be less than 50%, and the third range may be greater than or equal to 50%.
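The DC-level classification and the range transitions described above can be sketched as follows, using the example thresholds from this passage (60,000 and 100,000 on a 0–200,000 scale). The function names are illustrative.

```python
OUTSIDE, INSIDE, CONTACT = "outside", "inside", "contact"

def classify_dc(level):
    """Map a DC level value to one of the three example ranges."""
    if level <= 60_000:
        return OUTSIDE   # first range: outside the detection area
    if level < 100_000:
        return INSIDE    # second range: inside the detection area
    return CONTACT       # third range: contacting the sensor surface

def detect_operation(prev_level, curr_level):
    """Map a range transition to an input operation; returns None when
    the transition does not correspond to a defined operation."""
    transition = (classify_dc(prev_level), classify_dc(curr_level))
    return {
        (OUTSIDE, INSIDE): "approach",
        (INSIDE, CONTACT): "contact",
        (CONTACT, INSIDE): "separation",
    }.get(transition)
```

For example, a DC level rising from 50,000 to 80,000 is classified as an approach operation, and falling from 150,000 to 80,000 as a separation operation.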
The level value of the DC component corresponding to the physiological signal may vary through a process of amplifying the received signal. When the subject 510 to be examined is located outside the detection area, the level of the DC component of the physiological signal may be altered by an ambient light around the electronic device 400, so an operation for removing the ambient-light component is performed first.
The control module 430 may determine the proximity of the subject 510 to be examined, or whether the subject 510 to be examined makes contact, based on the signal pattern of the physiological signal. For example, the control module 430 may determine the proximity of the subject 510 to be examined, or whether the subject 510 to be examined makes contact, based on an amplitude of the AC component included in the physiological signal. When the amplitude of the AC component of the physiological signal is small, the difference between the increase and the decrease of the light received by the light receiving unit 423 according to vasodilation in the contraction period and vasoconstriction in the relaxation period also decreases. In this case, a non-contact state or a proximity state is determined, rather than the contact state in which the physiological signal of the subject 510 to be examined is sufficiently received.
For example, when the amplitude of the AC component is classified into a first range and a second range, the control module 430 determines that the subject 510 to be examined is located outside the detection area if a maximum or average amplitude change, measured using peaks of the AC component included in the physiological signal, is in the first range, and determines that the subject 510 to be examined is located inside the detection area if the maximum or average amplitude change is in the second range. In this case, when detecting the input operation corresponding to the movement of the subject 510 to be examined, the control module 430 detects that the input operation corresponds to the approach operation if a change from the first range to the second range is detected. For example, compared to the maximum amplitude of the AC component included in the physiological signal, the first range may be below 20% and the second range may be from 20% to 60%.
According to another embodiment, when detecting the input operation corresponding to the movement of the subject 510 to be examined, the control module 430 detects the contact operation if the amplitude (for example, a peak value) of the AC component included in the physiological signal is larger than or equal to a particular amplitude, and detects the separation operation if the amplitude of the AC component is smaller than the particular amplitude.
When detecting the input operation corresponding to the movement of the subject 510 to be examined, the control module 430 detects the contact operation if a state in which the amplitude of the peak value of the AC component included in the physiological signal is larger than or equal to the particular amplitude is maintained for a predetermined time, or if a predetermined number of successive peak values remain at values larger than or equal to the particular amplitude, and detects the separation operation if a state in which the amplitude of the peak value is smaller than the particular amplitude is maintained for the predetermined time, or if the predetermined number of successive peak values remain at values smaller than the particular amplitude.
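The "predetermined number of successive peak values" rule above is essentially a debounce over the AC peak amplitudes. A minimal sketch, with illustrative parameter names and thresholds:

```python
def contact_state(peak_amplitudes, threshold, required_successive):
    """Return "contact" if the last `required_successive` AC peak
    amplitudes all meet the threshold, "separation" if they all fall
    below it, and None when the evidence is mixed or insufficient
    (i.e. the previously detected state is kept)."""
    recent = peak_amplitudes[-required_successive:]
    if len(recent) < required_successive:
        return None
    if all(a >= threshold for a in recent):
        return "contact"
    if all(a < threshold for a in recent):
        return "separation"
    return None
```

Requiring several successive peaks, rather than reacting to a single one, keeps a momentary noise spike or dropout from being misread as a contact or separation operation.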
When detecting the input operation corresponding to the movement of the subject 510 to be examined, if a DC component included in the physiological signal is close to a maximum reception level value and an AC component included in the physiological signal is very weak, that is, equal to or smaller than a particular level, the control module 430 detects the state in which the subject 510 to be examined is located inside the detection area, or the non-contact or proximity state rather than the contact state in which the subject 510 to be examined contacts the surface of the sensor module 420. In this case, the control module 430 determines that an amount of the light received by the light receiving unit 423 included in the sensor module 420 is large but the reflected light or transmitted light of the subject 510 to be examined is not measured.
When the sensor module 420 includes an illumination sensor, the control module 430 determines the proximity of the subject 510 to be examined or whether the subject 510 to be examined makes contact, based on a level of an amount of the light received through the illumination sensor and the physiological signal.
As described in the aforementioned various embodiments, the control module 430 detects the input operation of the subject 510 to be examined corresponding to the movement of the subject 510 to be examined, based on the input signal including the physiological signal.
When a condition for activating the biometric sensor is met, the control module 430 activates the sensor module 420 to include the physiological signal in the input signal. For example, when the camera module 410 is activated, if the face of the subject for photography is detected in a preview image received through the camera module 410, if the proximity of the subject to be examined is detected by the proximity sensor included in the sensor module 420, or if the existence of the subject to be examined is detected through a periodical monitoring by the PPG sensor included in the sensor module 420, the control module 430 considers that the condition for activation is met, and thus activates the biometric sensor. Thereafter, the sensor module 420 receives the input signal including the physiological signal acquired through the activated biometric sensor. While the sensor module 420 receives the input signal including the physiological signal acquired through the activated biometric sensor, the control module 430 may control the audio module 280 to output audio data or control the motor 298 or a haptic module to output tactile feedback or force feedback.
Accordingly, the electronic device performs an operation of receiving the physiological signal of the user through the biometric sensor and an operation of acquiring an image through the camera module of the electronic device at least based on the change in the physiological signal. The change in the physiological signal is detected based on at least one of a level and a pattern of the physiological signal.
FIG. 9 is a flowchart of an operation for acquiring an image based on an input signal by the electronic device, according to an embodiment of the present invention.
Referring to FIG. 9, the electronic device 400 receives an input signal including a physiological signal, detects an input operation based on the input signal, and controls the camera module 410 according to the input operation.
The sensor module 420 receives an input signal including a physiological signal in step 1110. For example, when the user grasping the electronic device 400 approaches or contacts the sensor module 420, the sensor module 420 acquires the physiological signal from the subject 510 to be examined, which corresponds to a part of the user's body. The sensor module 420 may be a PPG sensor or an ECG sensor, and the physiological signal may be a PPG signal or an ECG signal. The sensor module 420 may further include sensors for measuring a physical quantity related to the electronic device 400. In this case, the sensor module 420 acquires a measurement signal for the physical quantity related to the electronic device 400. The input signal includes the physiological signal and may further include the measurement signal. The measurement signal is a measurement value of another physical quantity (for example, proximity by the illumination sensor) for the part of the user's body.
The control module 430 detects an input operation at least based on the input signal in step 1120. The control module 430 detects an input operation of the subject 510 to be examined based on the physiological signal included in the input signal. Further, the control module 430 detects the input operation of the subject 510 to be examined based on at least one of the physiological signal included in the input signal and another measurement signal. The input operation is detected based on at least one of a signal level or a signal pattern of the input signal.
The input operation detected by the control module 430 may be an approach operation, and the approach operation is an operation by which the subject 510 to be examined enters the detection area from the outside of the detection area.
The input operation detected by the control module 430 may be a contact operation, and the contact operation is an operation by which the subject 510 to be examined contacts at least a part of the surface of the sensor module 420.
The input operation detected by the control module 430 may be a separation operation, and the separation operation is an operation by which the subject 510 to be examined is separated from the surface of the sensor module 420.
The control module 430 acquires an image through the camera module 410 according to the input operation in step 1130. When the input operation is the contact operation, the control module 430 controls the camera module 410 to acquire the image. When the input operation is the separation operation, the control module 430 controls the camera module 410 to acquire the image. When the input operation is the approach operation, the control module 430 controls to change a setting (for example, a focus control or a white balance control) of the camera module 410.
The control module 430 controls the camera module 410 to start acquiring the image when the input operation is a first input operation and to end acquiring the image when the input operation is a second input operation. For example, photographing is started by a first contact operation, and the photographing is terminated when a second contact operation is made after the generation of the separation operation. Alternatively, when a multi-touch, that is, two contact operations, is generated by different fingers, the image acquisition is started by the contact operation of the first finger and the image acquisition is terminated by the contact operation of the second finger. Through the plurality of input operations, a panorama image or a video including a plurality of images may be photographed.
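The dispatch of detected input operations to camera actions in steps 1120–1130 — approach adjusts settings, while alternating contact operations start and end acquisition — can be sketched as follows. The class and its `actions` log are illustrative stand-ins for the control module 430 driving the camera module 410.

```python
class CameraController:
    """Illustrative dispatcher from input operations to camera actions;
    `actions` logs what a real camera module would be told to do."""

    def __init__(self):
        self.recording = False
        self.actions = []

    def on_input_operation(self, op):
        if op == "approach":
            # Preparation: e.g. focus control or white balance control.
            self.actions.append("adjust_settings")
        elif op == "contact":
            if not self.recording:
                self.recording = True
                self.actions.append("start_acquiring")  # first input operation
            else:
                self.recording = False
                self.actions.append("stop_acquiring")   # second input operation
```

An approach followed by two contact operations thus produces a settings adjustment, then the start and end of image acquisition, matching the first-/second-input-operation behavior described above.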
The control module 430 controls the camera module 410 based on one or more input operations detected at least based on the input signal, a time for which each of the input operations is maintained, or sequences of the one or more input operations. Further, the control module 430 changes a setting of the camera module 410 to acquire the image according to the one or more input operations. Hereinafter, various embodiments of this document will describe that the control module 430 controls the camera module 410 at least based on one or more detected input operations.
Alternatively, the control module 430 may control the camera module 410 or the audio module 280 based on one or more detected input operations at least based on a time when the input operation is generated. In this case, the control module 430 performs an operation for continuously acquiring one or more of images or audio data through at least one of the camera module 410 and the audio module 280 and temporarily stores the images or audio data in the memory 230 of the electronic device 400 for a predetermined maintenance time tp (for example, 10 seconds) from the acquisition time. Accordingly, when the input operation is generated, the control module 430 stores one or more of images or audio data from the time (t-tp), which is earlier than the time (t) when the input operation is generated by the predetermined maintenance time, to the time when the input operation is generated.
When the input operation is generated, the control module 430 may acquire and store one or more of images or audio data during a predetermined maintenance time (tn) from the time (t) when the input operation is generated.
Alternatively, the control module 430 stores one or more of images or audio data for a time from a time earlier than the time (t) when the input operation is generated by a first maintenance time to a time later than the time (t) when the input operation is generated by a second maintenance time. For example, when the input operation is generated, a sound shot function (for example, an operation for recording audio data for a predetermined time including the time before or after the photographing time and correlating the recorded audio data with one or more images) may be performed.
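The temporary storage from time (t−tp) to (t) described above is naturally implemented as a bounded ring buffer that continuously receives frames or audio chunks. A minimal sketch, with illustrative names; the 10-second maintenance time follows the example in the text:

```python
from collections import deque

class PreEventBuffer:
    """Keeps only the most recent keep_seconds of data so that, when an
    input operation fires at time t, the data from t - keep_seconds up
    to t can be persisted from the temporary store."""

    def __init__(self, keep_seconds, frames_per_second):
        self.buffer = deque(maxlen=int(keep_seconds * frames_per_second))

    def add_frame(self, frame):
        # Called continuously; old frames fall out automatically.
        self.buffer.append(frame)

    def snapshot(self):
        # Called at the moment the input operation is generated.
        return list(self.buffer)
```

Because `deque(maxlen=...)` discards the oldest entries on overflow, memory stays bounded at the maintenance time tp regardless of how long acquisition runs before the input operation occurs.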
FIG. 10 is a flowchart of an operation for acquiring an image according to a contact operation by the electronic device, according to an embodiment of the present invention.
Referring to FIG. 10, the electronic device 400 receives an input signal including a physiological signal, such as when a user brings a part of the body including a finger into contact with the sensor module 420, and, when it is detected based on the input signal that the contact state is maintained for a predetermined time, acquires an image through the camera module after the predetermined time elapses. The control module 430 recognizes the contact operation, or the contact maintenance state according to the contact operation, as a control command that instructs to perform a preparation operation to acquire the image.
For example, the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects the contact operation at least based on the input signal in step 1210.
When the contact operation is detected, the control module 430 changes a setting of the camera module 410 in preparation for acquiring the image in step 1220. An operation for changing the setting may be an operation for controlling the focus of the camera module 410. Further, the operation for changing the setting may be an operation for controlling white balance of the camera module. The operation for changing the setting may be an operation for controlling at least one of ISO, exposure, aperture, and shutter speed.
The control module 430 detects whether the contact state between the subject 510 to be examined and the sensor module 420 is maintained for a predetermined time after the contact operation at least based on the input signal including the physiological signal in step 1230.
When the contact state between the subject 510 to be examined and the sensor module 420 remains for the predetermined time, the control module 430 acquires the image through the camera module 410 in step 1240.
According to another embodiment, when, after bringing the body part into contact with the sensor module 420 continuously for a predetermined time, the user repeats separation and contact of the body part with respect to the sensor module 420, the control module 430 may acquire the image through the camera module 410 at every contact time. For example, when, after detecting the contact operation at least based on the input signal and detecting that the contact state is maintained for a predetermined time after the contact operation, the control module 430 repeatedly detects the separation operation and the contact operation, the control module 430 may repeatedly acquire the image through the camera module 410 according to the separation operation and the contact operation.
According to another embodiment, when, after maintaining the contact state between the body part and the sensor module 420 for a predetermined time, the user repeats separation and contact of the body part with respect to the sensor module 420, the control module 430 may measure the number of times by which the separation and the contact are repeated and switch to a predetermined setting or mode. For example, when, after detecting the contact operation at least based on the input signal and detecting that the contact state is maintained for a predetermined time after the contact operation, the control module 430 repeatedly detects the separation operation and the contact operation, the control module 430 may switch a camera photographing mode according to the separation operation and the contact operation. For example, when the control module 430 detects the separation after two contacts for a predetermined time, the control module 430 may switch the photographing mode to a night photographing mode.
As another example, when the control module 430 detects the separation after three contacts for a predetermined time, the control module 430 switches the photographing mode to a sports photographing mode.
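The mode switching in the two examples above reduces to counting the contacts preceding the separation and looking the count up in a mode table. A minimal sketch; the mapping of counts to modes follows the examples, and the default mode name is an assumption.

```python
# Two contacts then separation -> night mode; three -> sports mode,
# per the examples above. Other counts leave the mode at a default.
MODE_BY_CONTACT_COUNT = {
    2: "night",
    3: "sports",
}

def select_mode(contact_count, default="normal"):
    """Pick the photographing mode from the number of contacts detected
    within the predetermined time before the separation."""
    return MODE_BY_CONTACT_COUNT.get(contact_count, default)
```

Keeping the count-to-mode mapping in a table makes it easy to add further gestures (say, four contacts for a panorama mode) without touching the detection logic.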
When an application program for acquiring an image is not executed in the electronic device 400, the control module 430 may execute the application program as the control module 430 detects the contact of the part of the user's body to the sensor module. For example, after detecting the contact operation and executing the application program for acquiring the image, the control module 430 changes a setting of the camera module 410. Thereafter, when the contact state between the subject 510 to be examined and the sensor module 420 remains for the predetermined time, the control module 430 acquires the image through the camera module 410.
FIG. 11 is a flowchart of an operation for acquiring an image according to a separation operation by the electronic device, according to an embodiment of the present invention.
Referring to FIG. 11, after the user brings the body part such as a finger into contact with the sensor module based on an input signal including a physiological signal, the electronic device 400 acquires an image through the camera module 410 when the user separates the body part from the sensor module 420.
The sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects the contact operation at least based on the input signal in step 1310.
When the contact operation is detected, the control module 430 changes a setting of the camera module 410 as the preparation for acquiring the image in step 1320. An operation for changing the setting may be an operation for controlling the focus of the camera module 410. Further, the operation for changing the setting may be an operation for controlling white balance of the camera module. The operation for changing the setting may be an operation for controlling at least one of ISO, exposure, aperture, and shutter speed.
The control module 430 detects a separation operation by which the subject 510 to be examined is separated from the surface of the sensor module 420 after the contact operation at least based on the input signal including the physiological signal in step 1330.
The control module 430 acquires an image through the camera module 410 according to the detected separation operation in step 1340.
The control module 430 may ignore other photographing commands for acquiring the image except for the detection of the separation operation after the contact operation is detected. For example, the control module 430 may ignore automatic photographing conditions which can be implemented by the electronic device 400, for example, conditions for acquiring the image when a particular pose of the subject for photography included in a preview image is detected, when a smiling face is detected, and when a particular movement pattern is detected, or conditions for acquiring the image when another input signal is received, and may acquire the image only when the separation operation is detected.
FIG. 12 is a flowchart of an operation for controlling the electronic device, according to an embodiment of the present invention.
Referring to FIG. 12, when the user brings a body part, such as a finger, into contact with the sensor module 420, which generates the input signal including the physiological signal, and maintains the contact state for a predetermined time, the electronic device 400 may acquire an image through the camera module 410 after a waiting time elapses from the separation of the body part from the sensor module 420.
The sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects the contact operation at least based on the input signal in step 1310.
When the contact operation is detected, the control module 430 changes a setting of the camera module 410 in preparation for acquiring the image in step 1320.
The control module 430 determines whether the contact state between the subject 510 to be examined and the sensor module 420 according to the contact operation is maintained for at least a predetermined time in step 1325.
When the control module 430 detects the separation operation by which the subject 510 to be examined is separated from the surface of the sensor module 420 in step 1330, the control module 430 waits until a particular waiting time elapses in step 1335 before acquiring the image in step 1340. The particular waiting time may be determined based on a time for which the contact state is maintained. The control module 430 may display a waiting state on the display 440 for the particular waiting time, for example, in the form of a graphic object.
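The mapping from contact-hold time to the waiting time can be illustrated as follows (a minimal sketch; the linear scaling and the cap are assumptions, since the patent only states that the waiting time is determined based on the contact duration):

```python
def waiting_time_s(contact_duration_s, scale=1.0, max_wait_s=10.0):
    """Hypothetical self-timer mapping: the longer the finger rests on the
    sensor module, the longer the countdown before the image is acquired,
    capped at max_wait_s seconds."""
    return min(contact_duration_s * scale, max_wait_s)
```

With these assumed parameters, holding contact for 3 seconds yields a 3-second countdown, while very long holds are capped at 10 seconds.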
FIGS. 13A and 13B illustrate an elapsing of a particular time, according to an embodiment of the present invention.
Referring to FIGS. 13A and 13B, after the detection of the separation operation, the control module 430 displays a graphic object 441 a indicating the start of the particular waiting time on the display 440 and then changes the graphic object 441 a to a graphic object 441 b after the particular waiting time elapses. The control module 430 acquires an image through the camera module 410 after the particular waiting time elapses.
When the image is acquired through the camera module 410, the control module 430 may record a voice for a particular recording time through a microphone included in the electronic device 400. The control module 430 may determine the particular recording time based on the time for which the contact state is maintained. The control module 430 stores the voice together with the image.
FIG. 14 illustrates controlling an electronic device based on a contact sequence, according to an embodiment of the present invention.
Referring to FIG. 14, the sensor module 420 includes the light emitting unit 421 and the light receiving unit 423, and the light emitting unit 421 includes a first light emitting unit 421 a and a second light emitting unit 421 b having light emitting diodes of different wavelengths. The first light emitting unit 421 a may include, for example, LEDs emitting an infrared light, and the second light emitting unit 421 b may include, for example, LEDs emitting a red visible light.
When the light emitting unit 421 is implemented as one or more LEDs, the light receiving unit 423 receives the reflected lights generated when incident lights of different wavelengths penetrate the subject 510 to be examined and are then reflected and returned. The sensor module 420 detects a movement direction of the subject 510 to be examined on the sensor module 420 by detecting contact sequences between the one or more LEDs and the subject 510 to be examined based on the received reflected lights.
The sensor module 420 may generate incident lights from the one or more LEDs simultaneously or at different times. Accordingly, the sensor module 420 detects the activated LED at a particular time among the one or more LEDs and, thus, detects the movement direction of the subject 510 to be examined on the sensor module 420.
The control module 430 detects the movement direction of the subject 510 to be examined at least based on the input signal received through the sensor module 420 including the biometric sensor configured to include one or more LEDs. In this case, the control module 430 controls the electronic device 400 based on the movement direction.
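The contact-sequence idea can be sketched as follows: with the LEDs activated at different times, the side contacted earlier indicates where the finger started (an illustrative sketch; the function name, the two-LED layout, and the direction labels are assumptions for this example):

```python
def movement_direction(contact_times):
    """contact_times maps an LED position ('left' or 'right') to the time at
    which the subject was first detected over that LED, based on its reflected
    light. The movement direction follows from the contact sequence: the LED
    contacted first marks the starting side."""
    first = min(contact_times, key=contact_times.get)
    return "left_to_right" if first == "left" else "right_to_left"
```

For example, a finger detected over the left LED at 0.1 s and the right LED at 0.3 s is moving left to right.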
FIGS. 15A and 15B illustrate controlling an electronic device based on a movement direction, according to an embodiment of the present invention.
Referring to FIGS. 15A and 15B, after displaying an acquired image 610 on the display 440, the control module 430 detects a movement direction of the subject 510 to be examined from a first location 510 a to a second location 510 b on the surface of the sensor module 420 disposed on the rear surface of the electronic device 400, moves the image 610 in the movement direction, and displays another image 620 according to the movement of the image. In FIGS. 15A and 15B, the movement direction of the subject 510 to be examined is opposite to the movement direction of the images on the display 440. That is, the movement direction of the subject 510 to be examined, which contacts the sensor module 420 disposed on the rear surface of the electronic device 400 or closely crosses over the sensor module 420, corresponds to a direction from right to left, and the movement direction of the image 620 on the display 440 disposed on the front surface of the electronic device 400 corresponds to a direction from left to right. Further, the speed at which content moves on the display 440 may vary depending on a movement speed of the subject 510 to be examined.
The control module 430 moves content while displaying the content on the display 440 by controlling the electronic device 400 based on the movement direction. For example, the content may be a webpage, and the movement operation may be a scroll operation. As another example, the content may be one or more pages of an electronic book, and the movement operation may be a page turn effect of the electronic book. As yet another example, the content may be a menu focus (for example, a highlight, cursor, or selection indication) displayed on a menu, and the movement operation may be a movement of the menu focus.
FIG. 16 illustrates controlling an electronic device based on a contact sequence, according to an embodiment of the present invention.
Referring to FIG. 16, the sensor module 420 includes the light emitting unit 421 and the light receiving unit 423, and the light receiving unit 423 may include one or more light receiving diodes, such as a first light receiving diode 423 a, a second light receiving diode 423 b, a third light receiving diode 423 c, and a fourth light receiving diode 423 d. In this case, an incident light generated by the light emitting unit 421 is reflected after penetrating the subject 510 to be examined and is received through each light receiving diode. The control module 430 detects a direction of the movement of the subject to be examined on the sensor module 420 based on positions of the one or more light receiving diodes and a difference or a change in the physiological signal received by each light receiving diode.
When the light receiving unit 423 includes two or more light receiving diodes arranged on the left and right sides of the center of the light emitting unit 421, the control module 430 detects left and right movement directions of the subject 510 to be examined on the sensor module 420.
When the light receiving unit 423 includes four light receiving diodes, for example, the first light receiving diode 423 a, the second light receiving diode 423 b, the third light receiving diode 423 c, and the fourth light receiving diode 423 d, the four light receiving diodes are arranged on the top, bottom, left, and right sides of the center of the light emitting unit 421, respectively. In this case, the control module 430 detects left and right movement directions or top, bottom, left, and right movement directions of the subject 510 to be examined on the sensor module 420.
The number of light receiving diodes according to the above described embodiment is only an example, and the light receiving unit 423 may include a different number of light receiving diodes.
The control module 430 may detect the movement direction of the subject 510 to be examined at least based on the input signal received through the sensor module 420 including the biometric sensor configured to include one or more light receiving diodes. In this case, the control module 430 controls the electronic device 400 based on the movement direction as described above.
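The per-diode comparison for FIG. 16 can be sketched by asking which diode's received signal rises the most as the subject moves toward it (an illustrative simplification; the function name and the rise heuristic are assumptions, as the patent only states that direction is inferred from positions and signal differences or changes):

```python
def direction_from_photodiodes(samples):
    """samples maps a diode position ('left', 'right', 'top', 'bottom') to a
    list of received-light intensities over time. The subject is assumed to
    move toward the diode whose signal rises the most over the window."""
    def rise(signal):
        return signal[-1] - signal[0]
    return max(samples, key=lambda pos: rise(samples[pos]))
```

For example, if only the left diode's signal increases over the sampling window, the inferred movement is toward the left side of the sensor module.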
FIG. 17 illustrates a biometric sensor coupled to a button of an electronic device, according to an embodiment of the present invention.
Referring to FIG. 17, the sensor module 420 may be formed to cover a switch disposed on the body of the electronic device 400. The sensor module 420 includes a biometric sensor stacked above a button 425 that covers the switch. The button 425 and the sensor module 420 may be functionally connected to control image acquisition through the camera module 410. For example, when the sensor module 420 is disposed above the button 425, the user contacts the sensor module 420 before pressing the button 425, so that the sensor module 420 receives a physiological signal of a user's body part (for example, a finger) corresponding to the subject 510 to be examined. In this case, the control module 430 detects an input operation at least based on an input signal including the physiological signal and, accordingly, controls the electronic device 400 based on the input operation, a determination of whether the switch is open or closed, or both.
The electronic device 400 may include sensors for measuring other physical quantities between the sensor module 420 and the button 425, above the sensor module 420, or below the button 425. Although it has been described that the button 425 is formed on the front case of the electronic device 400, the button 425 may be disposed on another surface of the electronic device 400.
FIG. 18 illustrates a method of using a biometric sensor coupled to a button of an electronic device, according to an embodiment of the present invention.
Referring to FIG. 18, an electronic device 400 including the sensor module 420 formed to cover the button 425 is provided.
When the biometric sensor included in the sensor module 420 is deactivated, the control module 430 activates the biometric sensor after detecting the press of the button 425, as shown in step [a]. Further, the control module 430 may execute an application program for acquiring an image through the camera module 410 upon the activation of the biometric sensor.
When the control module 430 detects a contact operation or a separation operation of the subject 510 to be examined at least based on the input signal received through the sensor module 420 and detects the press of the button 425 after the contact operation, the control module 430 changes a setting of the camera module 410, as shown in step [b]. For example, the control module 430 may control the focus or white balance according to the contact operation even before the press of the button 425 is detected. The contact operation may correspond to a half-press to focus for controlling the focus of the camera module 410.
Thereafter, as shown in step [c], the control module 430 acquires the image through the camera module 410 according to the press of the switch 452. The control module 430 deactivates the biometric sensor according to the separation operation, as shown in step [d].
FIG. 19 is a flowchart of an operation for processing an image by using fingerprint information, according to an embodiment of the present invention.
Referring to FIG. 19, the sensor module 420 may include a fingerprint recognition sensor. In this case, the sensor module 420 receives an input signal including information on the fingerprint of the subject 510 to be examined, through the fingerprint recognition sensor. The control module 430 controls the electronic device 400 based on the fingerprint information.
The sensor module 420 receives an input signal including a physiological signal and fingerprint information, and the control module 430 detects an input operation at least based on the input signal in step 1410.
The control module 430 identifies the user based on the fingerprint information included in the input signal in step 1420.
The control module 430 acquires an image through the camera module 410 according to the input operation in step 1430.
The control module 430 detects whether the user is included in the image in step 1440, and, when the user is included in the image, stores the image in a protected storage space in step 1450. The protected storage space may be a secure area existing within the electronic device 400, a personal storage space for the user, a personal storage area based on an external device of the electronic device 400, or a service.
When the fingerprint is determined to be the fingerprint of a user who is not identified, the control module 430 may acquire an image through the camera module 410 according to the input operation and store the image in a guest storage space. The guest storage space may be an open area existing within the electronic device 400 (for example, a memory which can be accessed without user authentication), or a personal storage area based on a service or an internal or external device of the electronic device 400, storing at least one of the fingerprint information or the image in order to record the user in case the electronic device 400 is stolen.
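The routing of a captured image based on fingerprint identification can be sketched in a few lines (an illustrative sketch; the function name and the set-based enrollment check are assumptions for this example):

```python
def storage_area(fingerprint_id, enrolled_ids):
    """Route the captured image by fingerprint: identified (enrolled) users'
    images go to the protected personal storage space, while unrecognized
    fingerprints route the image to the guest storage space."""
    return "protected" if fingerprint_id in enrolled_ids else "guest"
```

A device would call this after matching the fingerprint received through the sensor module, before writing the image to storage.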
When the physiological information acquired through the sensor module 420 is heart rate information, the control module 430 may control the electronic device 400 to process the image by using the heart rate information. The electronic device 400 may acquire heart rate information of the subject 510 to be examined at least based on the physiological signal acquired through the sensor module 420. The control module 430 may display the heart rate information on the display 440 together with the image of the user corresponding to the subject 510 to be examined.
FIG. 20 is a flowchart of an operation for providing a preview image using heart rate information, according to an embodiment of the present invention.
FIG. 21 illustrates a screen displaying a preview image provided using heart rate information, according to an embodiment of the present invention.
Referring to FIGS. 20 and 21, the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects a contact operation as an input operation at least based on the input signal in step 1510.
The control module 430 acquires the heart rate information of the subject 510 to be examined at least based on the physiological signal while the contact operation is maintained in step 1520.
The control module 430 displays a preview image 631 acquired through the camera module 410 on the display 440 and also displays heart rate information 633 together with the preview image in step 1530.
The control module 430 detects a separation operation during a contact state in step 1540 and acquires the image by controlling the camera module 410 in step 1550.
The control module 430 may store the image such that the heart rate information is included in the image.
The control module 430 may detect the image of the user corresponding to the subject 510 to be examined included in the preview image and display the image of the user and the heart rate information such that the image of the user and the heart rate information are correlated to each other.
An operation for designating one or more display attributes (for example, a location, size, color, shape, icon, avatar, and predetermined template) of the heart rate information 633 while the preview image 631 is displayed may be further included.
The heart rate information may be information related to stress or emotion determined based on the physiological signal collected through the HRM sensor or the HRV sensor, and display attributes of the heart rate information 633 may be designated or changed based on information on the corresponding stress or emotion.
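The designation of display attributes from heart rate can be sketched as a simple threshold mapping (an illustrative sketch; the bpm thresholds, colors, and labels are assumptions, since the patent only states that attributes such as color and shape may be designated or changed based on the stress or emotion information):

```python
def heart_rate_display(bpm):
    """Choose display attributes for the heart-rate overlay (633) shown with
    the preview image. Thresholds are illustrative only."""
    if bpm < 60:
        return {"color": "blue", "label": f"{bpm} bpm (calm)"}
    if bpm <= 100:
        return {"color": "green", "label": f"{bpm} bpm"}
    return {"color": "red", "label": f"{bpm} bpm (elevated)"}
```

The returned attributes would then style the graphic object drawn next to the preview image.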
FIG. 22 is a flowchart of an operation for displaying a user image and heart rate information together, according to an embodiment of the present invention.
FIG. 23 illustrates a screen displaying an image together with heart rate information, according to an embodiment of the present invention.
Referring to FIGS. 22 and 23, for example, the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects an input operation at least based on the input signal in step 1610.
The control module 430 acquires heart rate information of the subject 510 to be examined at least based on the physiological signal in step 1620.
The control module 430 acquires an image through the camera module 410 according to the input operation in step 1630.
The control module 430 detects, within the image, a user image corresponding to the subject 510 to be examined in step 1640. When images of a plurality of people 643, 645, and 647 are included in the image, the control module 430 detects the user image corresponding to the subject 510 to be examined according to the heart rate information, physiological information, or a user setting. The electronic device 400 may include a database including the user image corresponding to the heart rate information, and the user image may be detected based on the database.
The control module 430 displays the acquired image 641 on the display 440 and displays the heart rate information such that the heart rate information is correlated to the user image 645 in a graphic object or text form 646 in step 1650.
The control module 430 may display emotional information based on the physiological information together with the image.
FIG. 24 is a flowchart of an operation for displaying an acquired image and emotional information together, according to an embodiment of the present invention.
FIG. 25 illustrates a screen displaying an image together with emotional information, according to an embodiment of the present invention.
Referring to FIGS. 24 and 25, the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects an input operation at least based on the input signal in step 1710.
The control module 430 acquires emotional information of the subject 510 to be examined at least based on the physiological signal in step 1720. The emotional information may include, for example, joy, shyness, sorrow, or excitement. The emotional information may be acquired based on heart rate information.
The control module 430 determines that the emotional information corresponds to joy by detecting an increase in the heart rate for a predetermined time or a particular pattern of the heart rate signal.
Alternatively, the control module 430 may acquire the emotional information by identifying an emotional state shown in the preview image acquired through the camera module 410. For example, the control module 430 may acquire the emotional information based on at least one of a movement pattern of the user's body within the preview image, a facial expression in a face image, a face shape, a movement pattern of the face (for example, a change in shape of the eyes or mouth), existence or nonexistence of laughter, and existence or nonexistence of blinking.
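The heart-rate-based branch of this determination can be sketched as a toy classifier keyed on a sustained rise in the signal (an illustrative sketch; the rise threshold and the two-label output are assumptions, since the patent describes detecting "an increase in the heart rate for a predetermined time or a particular pattern" without fixing values):

```python
def emotion_from_heart_rate(samples, rise_threshold=10):
    """samples is a time-ordered list of heart-rate readings (bpm). A rise of
    at least rise_threshold over the window is read as joy/excitement;
    otherwise the state is treated as neutral."""
    if samples[-1] - samples[0] >= rise_threshold:
        return "joy"
    return "neutral"
```

A fuller implementation could combine this with the preview-image cues (facial expression, blinking, movement patterns) mentioned above.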
Further, the control module 430 acquires an image through the camera module 410 according to the input operation in step 1730.
The control module 430 displays the emotional information on the display 440 together with the image in step 1740.
The control module 430 may change the user's facial expression shown in the image according to the emotional information. More specifically, the control module 430 may analyze feature points of the user's face and change a location of a feature point associated with the emotional information among the feature points. When the emotional information corresponds to joy, the control module 430 may change the user's facial expression by changing locations of the feature points corresponding to the mouth such that the mouth corner lifts or lips part, so as to represent the user's smiling face in the image.
The control module 430 may change part or all of the color of the user's face shown in the image according to the emotional information. For example, when the emotional information corresponds to excitement or shyness, the control module 430 may make a red color 653 appear on the cheek of the face of the user 651 shown in the image.
The control module 430 may identify people included in the image, acquire physiological information or emotional information of the user corresponding to the subject 510 to be examined and other people, and display the physiological information and emotional information together. To this end, the electronic device 400 can communicate with another device including physiological information of the identified people. The control module 430 may acquire the physiological information of the user based on the physiological signal included in the input signal received through the sensor module 420 and acquire the physiological information of other people through communication with another electronic device corresponding to face recognition information.
FIG. 26 illustrates a screen displaying an image in which physiological information is correlated with the user, according to an embodiment of the present invention.
Referring to FIG. 26, for example, physiological information 662 of the user 661 corresponding to the subject 510 to be examined is acquired through the sensor module of the electronic device 400 and correlated to the user 661. Further, physiological information 664 and 668 of other users 663 and 667 may be acquired through communication or wearable devices of the other users 663 and 667 and correlated to the other users 663 and 667.
The control module 430 may store physiological information or emotional information together with the image acquired through the camera module 410. The physiological information or the emotional information may be stored in a meta information area of a file format for the image.
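Storing the physiological or emotional information in the image file's meta information area can be sketched as follows (an illustrative sketch; the container layout and JSON serialization are assumptions, as the patent does not specify the file format of the meta area):

```python
import json

def attach_biometric_meta(image_bytes, heart_rate_bpm, emotion):
    """Bundle the pixel data with a serialized meta record, standing in for
    writing physiological/emotional information into the meta information
    area of the image file format."""
    return {
        "image": image_bytes,
        "meta": json.dumps({"heart_rate_bpm": heart_rate_bpm,
                            "emotion": emotion}),
    }
```

In practice this record could be carried in a standard metadata field of the image format rather than a separate dictionary.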
FIGS. 27A to 27D illustrate a method of controlling an electronic device to photograph a panorama image, according to an embodiment of the present invention.
Referring to FIGS. 27A to 27D, the sensor module 420 receives an input signal including a physiological signal of the subject 510 to be examined, and the control module 430 detects the input operation at least based on the input signal. The control module 430 combines one or more images acquired according to the detected input operation to generate connected images, for example, panorama images.
When the contact operation is detected, the control module 430 of the electronic device 400 acquires a reference image through the camera module 410 and acquires one or more images connected to the reference image according to up and down or left and right rotation of the body of the electronic device 400. The control module 430 displays, on the display of the electronic device 400, a guide indicating locations of the one or more images connected to the reference image according to the rotation of the body. The guide displays a photographing direction or a photographing location. When another input operation is detected, the control module 430 combines the reference image and the one or more images. For example, the combination may correspond to a combination of the one or more images corresponding to images above or below the reference image according to the up and down rotation of the body, or a combination of the one or more images corresponding to images on the left or right side of the reference image according to the left and right rotation of the body. The other input operation may be another contact operation or a separation operation.
Referring to FIG. 27A, when the contact operation with respect to the sensor module 420 arranged on the rear surface of the body of the electronic device 400 is detected, the control module 430 acquires a reference image 670 a through the camera module 411 arranged on the front surface of the body of the electronic device 400. The reference image 670 a may include an image 671 corresponding to the user of the electronic device 400.
Referring to FIG. 27B, the control module 430 acquires at least one image 670 b or 670 c corresponding to an image on the right side of the reference image 670 a through the camera module 411 as the body of the electronic device 400 rotates counterclockwise. At least one image 670 b or 670 c may include at least some areas of the reference image 670 a.
Referring to FIG. 27C, the control module 430 acquires at least one image 670 d or 670 e corresponding to an image on the left side of the reference image 670 a through the camera module 411 as the body of the electronic device 400 rotates clockwise. At least one image 670 d or 670 e may include at least some areas of the reference image 670 a.
Referring to FIG. 27D, when an additional contact operation is detected or when a separation operation is detected after a contact state according to the contact operation is maintained for a particular time, the control module 430 generates one image 670 f by connecting the reference image 670 a and at least one image 670 b, 670 c, 670 d, or 670 e.
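The final combination step in FIG. 27D can be sketched as ordered concatenation of the left-side images, the reference image, and the right-side images (an illustrative sketch; each image is modeled as a list of pixel columns, and overlap blending between adjacent images, which a real panorama pipeline would perform, is omitted):

```python
def combine_panorama(reference, left_images, right_images):
    """Connect the reference image with images acquired during rotation:
    clockwise rotation yields images to the left of the reference, and
    counterclockwise rotation yields images to the right. left_images is
    ordered leftmost-first; right_images is ordered nearest-first."""
    combined = []
    for image in left_images:
        combined.extend(image)
    combined.extend(reference)
    for image in right_images:
        combined.extend(image)
    return combined
```

Real stitching would additionally align and blend the overlapping areas that each neighboring image shares with the reference image.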
Further, the embodiments disclosed in this document are only for the description and understanding of technical contents and do not limit the scope of the present invention. Accordingly, it should be understood by those skilled in the art to which the present invention pertains that various changes in form and details may be made therein without departing from the spirit and scope of the present invention. Therefore, the scope of the present invention is defined not by the detailed description of the present invention, but by the appended claims and their equivalents, and thus, all differences within the scope will be construed as being included in the present invention.

Claims (20)

What is claimed is:
1. A portable communication device comprising:
a memory;
a touchscreen display disposed in a front surface of the portable communication device;
a biometric sensor disposed in a rear surface of the portable communication device; and
a processor adapted to:
present a first image of a plurality of images stored in the memory via the touch screen display;
receive a user input via the biometric sensor;
determine that the user input corresponds to a specified direction; and
present, via the touchscreen display, a second image of the plurality of images based at least in part on determining that the user input corresponds to the specified direction.
2. The portable communication device of claim 1, wherein the memory comprises a specified application to display the plurality of images and
wherein the processor is adapted to:
perform presenting the first image and presenting the second image via a user interface corresponding to the specified application.
3. The portable communication device of claim 1, wherein the processor is adapted to:
identify the second image based at least in part on the determined specified direction.
4. The portable communication device of claim 1, wherein the biometric sensor is adapted to detect a moving direction of a finger as at least part of the user input, and
wherein the processor is adapted to:
as at least part of determining that the user input corresponds to a specified direction, determine that the moving direction is opposite to the specified direction.
5. The portable communication device of claim 1, wherein the processor is adapted to:
as at least part of presenting the second image, replace the first image with the second image.
6. The portable communication device of claim 5, wherein the processor is adapted to:
as at least part of replacing the first image with the second image, slide the first image in another direction opposite to the specified direction from a specified area of the touchscreen and slide the second image in the other direction into the specified area simultaneously, such that the first image is hidden from the touchscreen display and the second image is viewed at the touchscreen display.
7. The portable communication device of claim 1, wherein the biometric sensor comprises a fingerprint sensor to obtain fingerprint information and movement information related to a finger.
8. The portable communication device of claim 1, further comprising:
an image sensor,
wherein the first image is captured using the image sensor in response to another user input, the other user input received via the biometric sensor while a preview image obtained via the image sensor is presented via the touchscreen.
9. The portable communication device of claim 8, wherein the image sensor is disposed adjacent to the biometric sensor.
10. The portable communication device of claim 8, wherein the first image is captured based at least in part on a determination that the other user input satisfies a specified condition.
11. The portable communication device of claim 1, wherein the processor is adapted to, as part of determining that the user input corresponds to a specified direction:
receive a touch input moving from a first position to a second position with respect to the biometric sensor, as the user input; and
perform the determining based at least in part on the first and second positions.
12. A method for displaying an image using a portable communication device, comprising:
presenting a first image of a plurality of images via a touchscreen display of the portable communication device;
receiving a user input via a biometric sensor of the portable communication device, while the first image is presented;
determining that the user input corresponds to a specified direction;
replacing, via the touchscreen display, the first image with a second image of the plurality of images based at least in part on determining that the user input corresponds to the specified direction.
13. The method of claim 12, wherein replacing the first image with the second image comprises:
identifying the second image based at least in part on the specified direction.
14. The method of claim 12, wherein replacing the first image with the second image comprises:
sliding the first image in another direction opposite to the specified direction from a specified area of the touchscreen and sliding the second image in the other direction into the specified area simultaneously, such that the first image is hidden from the touchscreen display and the second image is viewed at the touchscreen display.
15. The method of claim 14, wherein determining that the user input corresponds to a specified direction comprises:
identifying a movement speed of the user input,
wherein the first image or the second image is slid based at least in part on the movement speed.
16. The method of claim 12, wherein the first image is captured using an image sensor of the portable communication device in response to another user input, the other user input received via the biometric sensor while a preview image obtained via the image sensor is presented via the touchscreen.
17. The method of claim 16, wherein the first image is captured based at least in part on a determination that the other user input satisfies a specified condition.
18. A portable communication device comprising:
a touchscreen display disposed in a front surface of the portable communication device;
a fingerprint sensor disposed in a rear surface of the portable communication device;
an image sensor; and
a processor adapted to:
display, via the touchscreen, a preview image obtained via the image sensor;
receive a user input via the fingerprint sensor while the preview image is presented;
identify a movement of a finger based at least in part on the user input; and
present, via the touchscreen display, an image based at least in part on a determination that the movement satisfies a specified condition, wherein presenting the image includes:
capturing the image using the image sensor, based at least in part on the determination; and
displaying the captured image via the touchscreen.
19. The portable communication device of claim 18, wherein the processor is adapted to:
receive another user input via the fingerprint sensor while the image is displayed;
determine that the other user input corresponds to a specified direction; and
present, via the touchscreen display, another image based at least in part on determining that the other user input corresponds to the specified direction.
20. The portable communication device of claim 19, further comprising:
a memory,
wherein the processor is adapted to:
prior to displaying the preview image, execute a camera application stored in the memory at the portable communication device; and
activate the fingerprint sensor based at least in part on the execution of the camera application.
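Claims 18 through 20 recite the device-side flow: executing the camera application activates the fingerprint sensor, and a finger movement that satisfies a specified condition while the preview is shown triggers capture and display of the image. A minimal state sketch follows; all names (`CameraController`, the distance threshold standing in for the "specified condition") are invented for illustration, not taken from the patent.

```python
# Hypothetical sketch of the capture flow in claims 18-20.
# Class, method names, and the threshold are illustrative assumptions.

class CameraController:
    def __init__(self, min_swipe_distance=20):
        self.min_swipe_distance = min_swipe_distance
        self.fingerprint_sensor_active = False
        self.displayed = None                    # what the touchscreen shows

    def launch_camera_app(self):
        # Claim 20: the sensor is activated based on executing the camera app,
        # and the preview is displayed.
        self.fingerprint_sensor_active = True
        self.displayed = 'preview'

    def on_finger_movement(self, distance):
        """Claim 18: capture when the movement satisfies a specified condition."""
        if not self.fingerprint_sensor_active or self.displayed != 'preview':
            return None
        if distance >= self.min_swipe_distance:  # the "specified condition"
            image = self.capture()
            self.displayed = image               # show the captured image
            return image
        return None

    def capture(self):
        return 'captured_image'                  # stands in for the image sensor
```

Note the ordering the claims impose: no movement is acted on before the camera application has activated the sensor, and once the captured image replaces the preview, further sensor input no longer triggers capture.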
US15/718,937 2014-09-02 2017-09-28 Method for control of camera module based on physiological signal Active US10051177B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/718,937 US10051177B2 (en) 2014-09-02 2017-09-28 Method for control of camera module based on physiological signal
US16/033,906 US10341554B2 (en) 2014-09-02 2018-07-12 Method for control of camera module based on physiological signal

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR10-2014-0116510 2014-09-02
KR1020140116510A KR102367550B1 (en) 2014-09-02 2014-09-02 Controlling a camera module based on physiological signals
US14/843,593 US9444998B2 (en) 2014-09-02 2015-09-02 Method for control of camera module based on physiological signal
US15/239,307 US9794474B2 (en) 2014-09-02 2016-08-17 Method for control of camera module based on physiological signal
US15/718,937 US10051177B2 (en) 2014-09-02 2017-09-28 Method for control of camera module based on physiological signal

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/239,307 Continuation US9794474B2 (en) 2014-09-02 2016-08-17 Method for control of camera module based on physiological signal

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/033,906 Continuation US10341554B2 (en) 2014-09-02 2018-07-12 Method for control of camera module based on physiological signal

Publications (2)

Publication Number Publication Date
US20180020155A1 US20180020155A1 (en) 2018-01-18
US10051177B2 true US10051177B2 (en) 2018-08-14

Family

ID=54105617

Family Applications (4)

Application Number Title Priority Date Filing Date
US14/843,593 Active US9444998B2 (en) 2014-09-02 2015-09-02 Method for control of camera module based on physiological signal
US15/239,307 Active US9794474B2 (en) 2014-09-02 2016-08-17 Method for control of camera module based on physiological signal
US15/718,937 Active US10051177B2 (en) 2014-09-02 2017-09-28 Method for control of camera module based on physiological signal
US16/033,906 Active US10341554B2 (en) 2014-09-02 2018-07-12 Method for control of camera module based on physiological signal

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US14/843,593 Active US9444998B2 (en) 2014-09-02 2015-09-02 Method for control of camera module based on physiological signal
US15/239,307 Active US9794474B2 (en) 2014-09-02 2016-08-17 Method for control of camera module based on physiological signal

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/033,906 Active US10341554B2 (en) 2014-09-02 2018-07-12 Method for control of camera module based on physiological signal

Country Status (4)

Country Link
US (4) US9444998B2 (en)
EP (1) EP2993892B1 (en)
KR (1) KR102367550B1 (en)
CN (1) CN105391937B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180011590A1 (en) * 2016-07-06 2018-01-11 Samsung Electronics Co., Ltd. Electronic apparatus and operating method thereof
US10341554B2 (en) * 2014-09-02 2019-07-02 Samsung Electronics Co., Ltd Method for control of camera module based on physiological signal

Families Citing this family (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI439960B (en) 2010-04-07 2014-06-01 Apple Inc Avatar editing environment
KR102268196B1 (en) * 2014-06-13 2021-06-22 닛토덴코 가부시키가이샤 Device and method for removing artifacts in physiological measurements
US10420515B2 (en) 2015-06-15 2019-09-24 Vital Labs, Inc. Method and system for acquiring data for assessment of cardiovascular disease
CN107847156A (en) 2015-06-15 2018-03-27 维塔尔实验室公司 Method and system for assessment and management of cardiovascular disease
US10335045B2 (en) 2016-06-24 2019-07-02 Universita Degli Studi Di Trento Self-adaptive matrix completion for heart rate estimation from face videos under realistic conditions
KR20180006087A (en) * 2016-07-08 2018-01-17 삼성전자주식회사 Method for recognizing iris based on user intention and electronic device for the same
CN107040712B (en) * 2016-11-21 2019-11-26 英华达(上海)科技有限公司 Intelligent self-timer method and system
CN108236454B (en) * 2016-12-26 2021-05-07 阿里巴巴集团控股有限公司 Health measurement data acquisition method and electronic equipment
US10845955B2 (en) 2017-05-15 2020-11-24 Apple Inc. Displaying a scrollable list of affordances associated with physical activities
KR102585858B1 (en) 2017-05-16 2023-10-11 애플 인크. Emoji recording and sending
US11723579B2 (en) 2017-09-19 2023-08-15 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement
US11717686B2 (en) 2017-12-04 2023-08-08 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to facilitate learning and performance
US11478603B2 (en) 2017-12-31 2022-10-25 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to enhance emotional response
USD920137S1 (en) * 2018-03-07 2021-05-25 Intel Corporation Acoustic imaging device
DK179980B1 (en) 2018-03-12 2019-11-27 Apple Inc. User interfaces for health monitoring
US10466783B2 (en) * 2018-03-15 2019-11-05 Sanmina Corporation System and method for motion detection using a PPG sensor
KR102526951B1 (en) * 2018-04-06 2023-04-28 삼성전자 주식회사 Method and apparatus for measuring biometric information in electronic device
CN110353699B (en) * 2018-04-10 2023-03-31 深圳市理邦精密仪器股份有限公司 Sensor falling detection method and device and storage medium
US11364361B2 (en) 2018-04-20 2022-06-21 Neuroenhancement Lab, LLC System and method for inducing sleep by transplanting mental states
DK179992B1 (en) 2018-05-07 2020-01-14 Apple Inc. Displaying user interfaces associated with physical activities
US11317833B2 (en) 2018-05-07 2022-05-03 Apple Inc. Displaying user interfaces associated with physical activities
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
CN110457092A (en) * 2018-05-07 2019-11-15 苹果公司 Avatar creation user interface
DK201870374A1 (en) 2018-05-07 2019-12-04 Apple Inc. Avatar creation user interface
US10375313B1 (en) 2018-05-07 2019-08-06 Apple Inc. Creative camera
EP3849410A4 (en) 2018-09-14 2022-11-02 Neuroenhancement Lab, LLC System and method of improving sleep
TWI678660B (en) * 2018-10-18 2019-12-01 宏碁股份有限公司 Electronic system and image processing method
KR20200074571A (en) * 2018-12-17 2020-06-25 삼성전자주식회사 Photoplethysmography sensor, electronic apparatus including the same and method for controlling the electronic apparatus
US11107261B2 (en) 2019-01-18 2021-08-31 Apple Inc. Virtual avatar animation based on facial feature movement
CN109886178A (en) * 2019-02-14 2019-06-14 Oppo广东移动通信有限公司 Fingerprint input method and Related product
US11204668B2 (en) 2019-05-14 2021-12-21 Samsung Electronics Co., Ltd. Electronic device and method for acquiring biometric information using light of display
US11786694B2 (en) 2019-05-24 2023-10-17 NeuroLight, Inc. Device, method, and app for facilitating sleep
US11234077B2 (en) 2019-06-01 2022-01-25 Apple Inc. User interfaces for managing audio exposure
US11152100B2 (en) 2019-06-01 2021-10-19 Apple Inc. Health application user interfaces
US11209957B2 (en) 2019-06-01 2021-12-28 Apple Inc. User interfaces for cycle tracking
US11228835B2 (en) 2019-06-01 2022-01-18 Apple Inc. User interfaces for managing audio exposure
KR102632479B1 (en) 2019-06-17 2024-01-31 삼성전자주식회사 Ppg sensor and method for operating ppg sensor
CN114706505A (en) 2019-09-09 2022-07-05 苹果公司 Research user interface
WO2021067589A1 (en) 2019-10-01 2021-04-08 Vital Labs, Inc. Method and system for determining cardiovascular parameters
CN110742596A (en) * 2019-10-17 2020-02-04 Oppo广东移动通信有限公司 Electronic equipment for photographing and biological information measurement
CN111277755B (en) * 2020-02-12 2021-12-07 广州小鹏汽车科技有限公司 Photographing control method and system and vehicle
DK202070625A1 (en) 2020-05-11 2022-01-04 Apple Inc User interfaces related to time
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
DK181037B1 (en) 2020-06-02 2022-10-10 Apple Inc User interfaces for health applications
US11698710B2 (en) 2020-08-31 2023-07-11 Apple Inc. User interfaces for logging user activities
US11744523B2 (en) 2021-03-05 2023-09-05 Riva Health, Inc. System and method for validating cardiovascular parameter monitors
US11714536B2 (en) 2021-05-21 2023-08-01 Apple Inc. Avatar sticker editor user interfaces
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen
CN113727018B (en) * 2021-06-24 2022-12-02 荣耀终端有限公司 Shooting method and equipment
WO2023038992A1 (en) 2021-09-07 2023-03-16 Riva Health, Inc. System and method for determining data quality for cardiovascular parameter determination

Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5072248A (en) 1989-12-01 1991-12-10 Asahi Kogaku Kogyo Kabushiki Kaisha Release switch apparatus
US20030021601A1 (en) 2001-07-30 2003-01-30 Tim Goldstein System and method for controlling electronic devices
KR20040000954A (en) 2002-06-26 2004-01-07 삼성전자주식회사 Method for navigation key using sensor of fingerprint identification in mobile phone
JP2004361708A (en) 2003-06-05 2004-12-24 Fuji Photo Film Co Ltd Wrist camera
US6930707B2 (en) 2000-12-22 2005-08-16 International Business Machines Corporation Digital camera apparatus with biometric capability
KR100591239B1 (en) 2004-04-26 2006-06-19 한국전자통신연구원 Apparatus for measuring health conditions of a mobile phone and a method for controlling information of the same
US7176973B2 (en) 2000-07-10 2007-02-13 Matsushita Electric Industrial Co., Ltd. Iris camera module
US20080253695A1 (en) 2007-04-10 2008-10-16 Sony Corporation Image storage processing apparatus, image search apparatus, image storage processing method, image search method and program
US20090203998A1 (en) 2008-02-13 2009-08-13 Gunnar Klinghult Heart rate counter, portable apparatus, method, and computer program for heart rate counting
US20100283609A1 (en) 2009-05-07 2010-11-11 Perpcast, Inc. Personal safety system, method, and apparatus
US20110221948A1 (en) 2010-03-15 2011-09-15 Canon Kabushiki Kaisha Image pickup apparatus and its control method
US8055032B2 (en) 2007-10-16 2011-11-08 Premier Image Technology (China) Ltd. Digital camera with fingerprint identification function
KR20120002258A (en) 2010-06-30 2012-01-05 엘지전자 주식회사 Mobile terminal and method for capturing images using facial recognition thereof
EP2617354A1 (en) 2012-01-17 2013-07-24 Sony Mobile Communications AB Camera button with integrated sensors
KR20130088613A (en) 2012-01-31 2013-08-08 엘지전자 주식회사 Mobile terminal
CN103353826A (en) 2013-04-16 2013-10-16 深圳市中兴移动通信有限公司 Display equipment and information processing method thereof
US20140063317A1 (en) 2012-08-31 2014-03-06 Lg Electronics Inc. Mobile terminal
US20140121471A1 (en) 2012-10-26 2014-05-01 Nike, Inc. Athletic Performance Monitoring System Utilizing Heart Rate Information
US20140126044A1 (en) 2010-06-10 2014-05-08 Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg Camera objective and camera system
US20140160432A1 (en) 2012-12-11 2014-06-12 Elwha Llc Self-Aligning Unobtrusive Active Eye Interrogation
US20140168406A1 (en) 2004-06-01 2014-06-19 Mark S. Olsson Self-leveling camera heads
US20140171758A1 (en) 2011-07-20 2014-06-19 Stephen Teni Ayanruoh Integrated Portable Medical Diagnostic System
US20140232885A1 (en) 2013-02-20 2014-08-21 Kristin Elizabeth Slater Method and system for generation of images based on biorhythms
US20150116249A1 (en) * 2013-10-31 2015-04-30 Samsung Electronics Co., Ltd. Method and electronic device for processing finger motion
US20150205993A1 (en) * 2014-01-22 2015-07-23 Samsung Electronics Co., Ltd. Method for providing control function using fingerprint sensor and electronic device thereof
US20150205358A1 (en) 2014-01-20 2015-07-23 Philip Scott Lyren Electronic Device with Touchless User Interface
US20150245514A1 (en) * 2014-02-21 2015-08-27 Samsung Electronics Co., Ltd. Electronic device including physical key
US20150365575A1 (en) 2014-06-13 2015-12-17 Sony Corporation Lifelog camera and method of controlling same according to transitions in activity
US9444998B2 (en) 2014-09-02 2016-09-13 Samsung Electronics Co., Ltd Method for control of camera module based on physiological signal
US20160367193A1 (en) 2015-02-28 2016-12-22 Boe Technology Group Co., Ltd. Remote controller and health detection system
US9733740B2 (en) * 2014-06-23 2017-08-15 Samsung Electronics Co., Ltd. Method of processing fingerprint and electronic device thereof
US20180011590A1 (en) * 2016-07-06 2018-01-11 Samsung Electronics Co., Ltd. Electronic apparatus and operating method thereof
US9894275B2 (en) * 2014-02-11 2018-02-13 Samsung Electronics Co., Ltd. Photographing method of an electronic device and the electronic device thereof

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7583293B2 (en) * 2001-12-06 2009-09-01 Aptina Imaging Corporation Apparatus and method for generating multi-image scenes with a camera
US20040189849A1 (en) * 2003-03-31 2004-09-30 Hofer Gregory V. Panoramic sequence guide
US7515054B2 (en) * 2004-04-01 2009-04-07 Torch William C Biosensors, communicators, and controllers monitoring eye movement and methods for using them
JP4284538B2 (en) * 2004-10-19 2009-06-24 ソニー株式会社 Playback apparatus and playback method
US7839429B2 (en) * 2005-05-26 2010-11-23 Hewlett-Packard Development Company, L.P. In-camera panorama stitching method and apparatus
KR100763236B1 (en) * 2006-05-09 2007-10-04 삼성전자주식회사 Apparatus and method for editing moving picture using physiological signal
KR100800804B1 (en) * 2006-12-27 2008-02-04 삼성전자주식회사 Method for photographing panorama picture
KR100866230B1 (en) * 2007-04-12 2008-10-30 삼성전자주식회사 Method for photographing panorama picture
JP5338174B2 (en) * 2008-07-28 2013-11-13 富士通株式会社 Panorama photographing apparatus and method, camera unit equipped with panoramic photographing apparatus
JP5352406B2 (en) * 2009-09-30 2013-11-27 富士フイルム株式会社 Composite image creation method, program therefor, and information processing apparatus
US20140152801A1 (en) * 2009-10-28 2014-06-05 Alentic Microscience Inc. Detecting and Using Light Representative of a Sample
JP2012099917A (en) * 2010-10-29 2012-05-24 Sanyo Electric Co Ltd Imaging device
JP5853359B2 (en) * 2010-11-11 2016-02-09 ソニー株式会社 IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM
US20160286119A1 (en) * 2011-04-18 2016-09-29 360fly, Inc. Mobile Device-Mountable Panoramic Camera System and Method of Displaying Images Captured Therefrom
KR101784176B1 (en) * 2011-05-25 2017-10-12 삼성전자주식회사 Image photographing device and control method thereof
KR101777354B1 (en) * 2011-06-20 2017-09-11 삼성전자주식회사 Digital photographing apparatus, method for controlling the same, and computer-readable storage medium
KR20130014983A (en) * 2011-08-01 2013-02-12 엘지이노텍 주식회사 The remote controller and display apparatus and information processing method thereof

Patent Citations (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5072248A (en) 1989-12-01 1991-12-10 Asahi Kogaku Kogyo Kabushiki Kaisha Release switch apparatus
US7176973B2 (en) 2000-07-10 2007-02-13 Matsushita Electric Industrial Co., Ltd. Iris camera module
US6930707B2 (en) 2000-12-22 2005-08-16 International Business Machines Corporation Digital camera apparatus with biometric capability
US20030021601A1 (en) 2001-07-30 2003-01-30 Tim Goldstein System and method for controlling electronic devices
US6885818B2 (en) 2001-07-30 2005-04-26 Hewlett-Packard Development Company, L.P. System and method for controlling electronic devices
KR20040000954A (en) 2002-06-26 2004-01-07 삼성전자주식회사 Method for navigation key using sensor of fingerprint identification in mobile phone
US7162059B2 (en) 2002-06-26 2007-01-09 Samsung Electronics Co., Ltd. Method for implementing a navigation key function in a mobile communication terminal based on fingerprint recognition
JP2004361708A (en) 2003-06-05 2004-12-24 Fuji Photo Film Co Ltd Wrist camera
KR100591239B1 (en) 2004-04-26 2006-06-19 한국전자통신연구원 Apparatus for measuring health conditions of a mobile phone and a method for controlling information of the same
US20140168406A1 (en) 2004-06-01 2014-06-19 Mark S. Olsson Self-leveling camera heads
US20080253695A1 (en) 2007-04-10 2008-10-16 Sony Corporation Image storage processing apparatus, image search apparatus, image storage processing method, image search method and program
US8055032B2 (en) 2007-10-16 2011-11-08 Premier Image Technology (China) Ltd. Digital camera with fingerprint identification function
US20090203998A1 (en) 2008-02-13 2009-08-13 Gunnar Klinghult Heart rate counter, portable apparatus, method, and computer program for heart rate counting
US20100283609A1 (en) 2009-05-07 2010-11-11 Perpcast, Inc. Personal safety system, method, and apparatus
US20110221948A1 (en) 2010-03-15 2011-09-15 Canon Kabushiki Kaisha Image pickup apparatus and its control method
US20140126044A1 (en) 2010-06-10 2014-05-08 Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg Camera objective and camera system
KR20120002258A (en) 2010-06-30 2012-01-05 엘지전자 주식회사 Mobile terminal and method for capturing images using facial recognition thereof
US20140171758A1 (en) 2011-07-20 2014-06-19 Stephen Teni Ayanruoh Integrated Portable Medical Diagnostic System
EP2617354A1 (en) 2012-01-17 2013-07-24 Sony Mobile Communications AB Camera button with integrated sensors
KR20130088613A (en) 2012-01-31 2013-08-08 엘지전자 주식회사 Mobile terminal
US20140063317A1 (en) 2012-08-31 2014-03-06 Lg Electronics Inc. Mobile terminal
US20140121471A1 (en) 2012-10-26 2014-05-01 Nike, Inc. Athletic Performance Monitoring System Utilizing Heart Rate Information
US20140160432A1 (en) 2012-12-11 2014-06-12 Elwha Llc Self-Aligning Unobtrusive Active Eye Interrogation
US20140232885A1 (en) 2013-02-20 2014-08-21 Kristin Elizabeth Slater Method and system for generation of images based on biorhythms
CN103353826A (en) 2013-04-16 2013-10-16 深圳市中兴移动通信有限公司 Display equipment and information processing method thereof
US20150116249A1 (en) * 2013-10-31 2015-04-30 Samsung Electronics Co., Ltd. Method and electronic device for processing finger motion
US20150205358A1 (en) 2014-01-20 2015-07-23 Philip Scott Lyren Electronic Device with Touchless User Interface
US20150205993A1 (en) * 2014-01-22 2015-07-23 Samsung Electronics Co., Ltd. Method for providing control function using fingerprint sensor and electronic device thereof
US9894275B2 (en) * 2014-02-11 2018-02-13 Samsung Electronics Co., Ltd. Photographing method of an electronic device and the electronic device thereof
US20150245514A1 (en) * 2014-02-21 2015-08-27 Samsung Electronics Co., Ltd. Electronic device including physical key
US20150365575A1 (en) 2014-06-13 2015-12-17 Sony Corporation Lifelog camera and method of controlling same according to transitions in activity
US9733740B2 (en) * 2014-06-23 2017-08-15 Samsung Electronics Co., Ltd. Method of processing fingerprint and electronic device thereof
US9794474B2 (en) * 2014-09-02 2017-10-17 Samsung Electronics Co., Ltd Method for control of camera module based on physiological signal
US20180020155A1 (en) * 2014-09-02 2018-01-18 Samsung Electronics Co., Ltd. Method for control of camera module based on physiological signal
US9444998B2 (en) 2014-09-02 2016-09-13 Samsung Electronics Co., Ltd Method for control of camera module based on physiological signal
US20160367193A1 (en) 2015-02-28 2016-12-22 Boe Technology Group Co., Ltd. Remote controller and health detection system
US20180011590A1 (en) * 2016-07-06 2018-01-11 Samsung Electronics Co., Ltd. Electronic apparatus and operating method thereof

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
European Search Report dated Feb. 2, 2016 issued in counterpart application No. 15183425.6-1903, 13 pages.
Toshiyo Tamura et al., "Wearable Photoplethysmographic Sensors-Past and Present", Electronics, ISSN 2079-9292, Apr. 23, 2014, 21 pages.

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10341554B2 (en) * 2014-09-02 2019-07-02 Samsung Electronics Co., Ltd Method for control of camera module based on physiological signal
US20180011590A1 (en) * 2016-07-06 2018-01-11 Samsung Electronics Co., Ltd. Electronic apparatus and operating method thereof
US10635223B2 (en) * 2016-07-06 2020-04-28 Samsung Electronics Co., Ltd Electronic apparatus and operating method thereof

Also Published As

Publication number Publication date
KR102367550B1 (en) 2022-02-28
CN105391937A (en) 2016-03-09
US20180020155A1 (en) 2018-01-18
US20180324353A1 (en) 2018-11-08
EP2993892A1 (en) 2016-03-09
CN105391937B (en) 2020-03-17
US9794474B2 (en) 2017-10-17
EP2993892B1 (en) 2020-07-29
US9444998B2 (en) 2016-09-13
US20160065840A1 (en) 2016-03-03
KR20160028093A (en) 2016-03-11
US10341554B2 (en) 2019-07-02
US20160360100A1 (en) 2016-12-08

Similar Documents

Publication Publication Date Title
US10341554B2 (en) Method for control of camera module based on physiological signal
US10192044B2 (en) Electronic apparatus and method for controlling functions in the electronic apparatus using a bio-metric sensor
EP3357421B1 (en) Electronic device for authenticating biometric data and system
CN107408168B (en) Iris authentication method and apparatus using display information
US10599904B2 (en) Electronic device for measuring biometric information and method of operating same
US10165978B2 (en) Method for measuring human body information, and electronic device thereof
US20160142407A1 (en) Method and apparatus for displaying user interface in electronic device
US10078441B2 (en) Electronic apparatus and method for controlling display displaying content to which effects is applied
US20150190077A1 (en) Electronic device and photoplethysmography method
KR102354351B1 (en) Electronic device for determining sleeping state and method for controlling thereof
KR102401659B1 (en) Electronic device and method for processing video according to camera photography environment and scene using the same
EP3101578A1 (en) Electronic device for performing personal authentication and method thereof
US10835782B2 (en) Electronic device, system, and method for determining suitable workout in consideration of context
US10504560B2 (en) Electronic device and operation method thereof
CN109565548B (en) Method of controlling multi-view image and electronic device supporting the same
KR102358849B1 (en) Electronic device for providing information related to a smart watch and method for operating the same
KR20150082045A (en) Electronic device and photoplethysmography method
KR102477580B1 (en) Electronic apparatus and operating method thereof
KR20180023555A (en) Electronic device and a method for measuring heart rate based on an infrared rays sensor using the same
KR102418360B1 (en) A method for executing a function of an electronic device using a bio-signal and the electronic device therefor
KR102526951B1 (en) Method and apparatus for measuring biometric information in electronic device
US11191439B2 (en) Electronic device and method for capturing contents
KR20150082038A (en) Electronic device and photoplethysmography method

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4