WO2022196984A1 - Electronic device comprising a camera and operating method thereof - Google Patents

Electronic device comprising a camera and operating method thereof

Info

Publication number
WO2022196984A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
face
user
wearable electronic
display
Prior art date
Application number
PCT/KR2022/003005
Other languages
English (en)
Korean (ko)
Inventor
김보성
김성오
프루신스키발레리
여형석
이기혁
최재성
홍준의
Original Assignee
삼성전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자 주식회사
Publication of WO2022196984A1

Classifications

    • G06F3/013 Eye tracking input arrangements
    • G06F1/16 Constructional details or arrangements
    • G06F1/163 Wearable computers, e.g. on a belt
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/005 Input arrangements through a video camera
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/012 Head tracking input arrangements
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • Various embodiments of the present disclosure relate to an electronic device including a camera and an operating method thereof.
  • The electronic device may be implemented with an under-display camera (UDC), in which the front camera is mounted under the display panel, instead of in the form of a notch or hole display.
  • A smart watch is a type of wearable electronic device that combines a watch with an electronic device.
  • A smart watch not only functions as a watch, but also performs various other functions through mounted components (e.g., a camera or sensors) capable of performing those functions.
  • The smart watch may provide a list-up function.
  • The list-up function refers to a 'look at the watch by raising the wrist' function.
  • With the list-up function, the smart watch may turn on the screen with only a gesture of raising the wrist, without a separate input, while the user wears the smart watch.
  • When a smart watch is equipped with a camera, usability may deteriorate due to the arrangement of the camera. For example, if the camera is not located on the same surface as the watch face, or even if it is located on the same surface, the size of the smart watch may increase and usability may decrease. In addition, since it is inconvenient for the user to look at the camera instead of the watch face in order to use the camera, usability may be reduced.
  • Since the list-up function provided by the smart watch detects a gesture through a sensor, misrecognition may occur. For example, when the user drives or exercises, the list-up function may be performed unintentionally. To reduce such misrecognition, a large number of sensors may be mounted on the smart watch, but this may increase the size of the smart watch or increase the manufacturing cost.
  • Various embodiments may provide a wearable electronic device, and an operating method thereof, in which a gesture for performing the list-up function is checked through a sensor, the user's face and gaze are checked through a UDC mounted on the wearable electronic device, whether to turn on a display included in the wearable electronic device is determined based on the gesture, the face, and the gaze, and a screen displayed on the display is adjusted.
  • According to various embodiments, a wearable electronic device includes a first sensor, a display, a camera disposed under a rear surface of the display that is not exposed to the outside, and a processor. The processor may be set to check a first gesture with respect to the wearable electronic device through the first sensor, check the user's face and gaze based on an image including the user's face captured through the camera, determine whether to turn on the display based on the first gesture, the face, and the gaze, and, when the display is turned on, adjust the screen displayed on the display based on the distance between the face and the wearable electronic device, the direction of the face, and the angle of the gaze.
  • According to various embodiments, a method of operating the wearable electronic device may include checking a first gesture with respect to the wearable electronic device through a first sensor included in the wearable electronic device, checking the user's face and gaze based on an image including the user's face captured through a camera disposed under a rear surface of the display of the wearable electronic device that is not exposed to the outside, determining whether to turn on the display based on the first gesture, the face, and the gaze, and, when the display is turned on, adjusting the screen displayed on the display based on the distance between the face and the wearable electronic device, the direction of the face, and the angle of the gaze.
  • According to various embodiments, a non-transitory recording medium may store instructions executable by the processor to perform: checking a first gesture with respect to the wearable electronic device through a first sensor included in the wearable electronic device; checking the user's face and gaze based on an image including the user's face captured through a camera disposed under a rear surface of the display of the wearable electronic device that is not exposed to the outside; determining whether to turn on the display based on the first gesture, the face, and the gaze; and, when the display is turned on, adjusting the screen displayed on the display based on the distance between the face and the wearable electronic device, the direction of the face, and the angle of the gaze.
  • According to various embodiments, a wearable electronic device may reduce misrecognition of the list-up function and increase usability by using a UDC.
  • FIG. 1 is a diagram illustrating a network environment according to various embodiments of the present disclosure
  • FIG. 2 is a diagram illustrating a configuration example of a wearable electronic device according to various embodiments of the present disclosure.
  • FIG. 3 is a diagram illustrating a configuration example of an electronic device according to various embodiments of the present disclosure.
  • FIG. 4 is a block diagram illustrating a schematic configuration of a wearable electronic device according to various embodiments of the present disclosure.
  • FIG. 5 is a flowchart illustrating a method of operating a wearable electronic device according to various embodiments of the present disclosure.
  • FIG. 6 is a diagram for describing an operation of a wearable electronic device checking a user's face and gaze, according to various embodiments of the present disclosure
  • FIG. 7 is a flowchart illustrating an operation of adjusting a screen of a wearable electronic device based on a distance from a user's face, according to various embodiments of the present disclosure.
  • FIG. 8 is a diagram for describing an operation of adjusting a screen of a wearable electronic device based on a distance from a user's face according to various embodiments of the present disclosure
  • FIG. 9A is a flowchart illustrating an operation of adjusting a screen of a wearable electronic device when a person other than the user is identified, according to various embodiments of the present disclosure.
  • FIG. 9B is a diagram for describing an operation of adjusting a screen of a wearable electronic device when a person other than the user is identified, according to various embodiments of the present disclosure.
  • FIG. 10 is a diagram for explaining an operation of adjusting a screen based on a user's gaze by a wearable electronic device according to various embodiments of the present disclosure
  • FIG. 11 is a flowchart illustrating an operation in which a wearable electronic device adjusts a screen based on user's biometric information according to various embodiments of the present disclosure.
  • FIG. 12 is a diagram for explaining an operation of adjusting a screen of a wearable electronic device based on user's biometric information according to various embodiments of the present disclosure
  • FIG. 13 is a flowchart illustrating an operation in which a wearable electronic device adjusts a screen based on a user's gesture according to various embodiments of the present disclosure
  • FIG. 14 is a diagram for explaining an operation of adjusting a screen of a wearable electronic device based on a user's gesture according to various embodiments of the present disclosure
  • FIG. 15 is a flowchart illustrating an operation in which a wearable electronic device transmits an unlock signal to another electronic device according to various embodiments of the present disclosure
  • The term 'user' as used in various embodiments may refer to a person who uses the electronic device or a device (e.g., an artificial intelligence electronic device) that uses the electronic device.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to various embodiments.
  • Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • According to some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted from the electronic device 101, or one or more other components may be added. According to some embodiments, some of these components may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from other components (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the result data in the non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can be operated independently of, or together with, the main processor.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state.
  • According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which artificial intelligence is performed, or may be performed through a separate server (eg, the server 108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • An artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or a deep Q-network, or a combination of two or more of the above, but is not limited to these examples.
  • Additionally or alternatively, the artificial intelligence model may include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176 ) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used by a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • the display module 160 may include, for example, a control circuit for controlling a display, a hologram device, or a projector and a corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 170 may convert a sound into an electrical signal or, conversely, convert an electrical signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input module 150, or may output a sound through the sound output module 155 or through an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more specified protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding communication module among these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or a WAN)).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network such as the first network 198 or the second network 199 by using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network following a 4G network, and a next-generation communication technology, for example, a new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • The wireless communication module 192 may support a high-frequency band (e.g., the mmWave band), for example, to achieve a high data rate.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • The wireless communication module 192 may support various requirements defined for the electronic device 101, an external electronic device (e.g., the electronic device 104), or a network system (e.g., the second network 199). According to an embodiment, the wireless communication module 192 may support a peak data rate for realizing eMBB (e.g., 20 Gbps or more), loss coverage for realizing mMTC (e.g., 164 dB or less), or U-plane latency for realizing URLLC (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • According to an embodiment, the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) in addition to the radiator may be additionally formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • According to an embodiment, the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., the top or side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • At least some of the above-described components may be connected to each other through an inter-peripheral communication method (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or a part of operations executed in the electronic device 101 may be executed in one or more external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a function or service automatically, or in response to a request from a user or another device, the electronic device 101 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • the server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • the electronic device may have various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • Terms such as 'first' and 'second' may simply be used to distinguish one element from another, and do not limit the elements in other aspects (e.g., importance or order). When one (e.g., first) component is referred to as being 'coupled' or 'connected' to another (e.g., second) component, with or without the terms 'functionally' or 'communicatively', it means that the one component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • The term 'module' used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit. A module may be an integrally formed part, or a minimum unit of the part or a portion thereof, that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium (e.g., internal memory 136 or external memory 138) readable by a machine (e.g., the electronic device 101). For example, a processor (e.g., the processor 120) of the machine (e.g., the electronic device 101) may call at least one of the one or more stored instructions from the storage medium and execute it. This allows the machine to be operated to perform at least one function according to the called instruction.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • Here, 'non-transitory' only means that the storage medium is a tangible device and does not contain a signal (e.g., an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored.
  • According to an embodiment, the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • The computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • In the case of online distribution, at least a part of the computer program product may be temporarily stored in, or temporarily created in, a machine-readable storage medium such as a memory of a manufacturer's server, a server of an application store, or a relay server.
  • According to various embodiments, each component (e.g., a module or a program) of the above-described components may include a single entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, a plurality of components (e.g., modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to how they were performed by the corresponding component among the plurality of components prior to the integration.
  • According to various embodiments, operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • FIG. 2 is a diagram illustrating a configuration example of an electronic device according to various embodiments
  • FIG. 3 is a diagram illustrating a configuration example of an electronic device according to various embodiments.
  • Referring to FIG. 2, an electronic device 101 (e.g., the electronic device 101 of FIG. 1) according to an embodiment may include a display module 160 (e.g., the display module 160 of FIG. 1) disposed on a front surface 201 of a housing 200, and a camera 210 (e.g., the camera module 180 of FIG. 1).
  • the electronic device 101 may include a memory 130 , a display module 160 , a camera 210 , and at least one processor 120 electrically connected to the memory 130 .
  • the electronic device 101 according to an embodiment may further include other components described with reference to FIG. 1 .
  • the electronic device 101 may be implemented as a wearable electronic device.
  • the electronic device 101 may be implemented as a watch-type wearable electronic device (eg, a smart watch).
  • The display module 160 may be disposed to be exposed through the front surface 201 of the housing 200: a window 310 may be disposed such that its first surface is exposed to the outside, and a display panel 320 may be disposed under the second surface (rear surface) of the window 310.
  • The display panel 320 may include a substrate (e.g., a flexible printed circuit board (FPCB)) 340 and a display element layer 311 disposed on the substrate 340.
  • the display panel 320 may be configured in the form of a touch sensitive panel (TSP).
  • The display element layer 311 may include a circuit layer including a thin film transistor (TFT) (not shown), an organic light emitting diode (OLED) (not shown) as a display element, and an insulating layer (IL) (not shown) therebetween.
  • the display panel 320 may include a display driver integrated circuit (not shown).
  • The transparent glass layer (window) 310 and the display panel 320 may be configured such that at least a part thereof has a curved shape.
  • the display panel 320 may be formed of a flexible polymer film, and may include, for example, polyimide, polyethylene terephthalate, or other polymer material.
  • For example, the display panel 320 may include a first polymer layer 313 (e.g., polyimide) and a second polymer layer 315 (e.g., polyethylene terephthalate) disposed below the display element layer 311.
  • The camera 210 may be disposed under a rear surface (e.g., a second surface) of the display module 160 that is not exposed to the outside.
  • The camera 210 may be an under-display camera (UDC) 210, at least a portion of which (e.g., at least a portion of the camera module 180) is disposed under the display panel 320.
  • the camera 210 may be disposed between the display panel 320 and the substrate 340 .
  • the camera 210 may be at least a part of the camera module 180 of FIG. 1 , and may not be exposed because it is included in the housing 200 .
  • According to an embodiment, the camera 210 may include at least one of a camera sensor (e.g., an image sensor) (not shown) that detects light incident through the window 310 via a lens and converts the light into a digital signal to obtain an image, an image processing module (not shown) that processes images, or a memory (not shown) that stores images.
  • FIG. 4 is a block diagram illustrating a schematic configuration of a wearable electronic device according to various embodiments of the present disclosure.
  • Referring to FIG. 4, the wearable electronic device 401 may include a camera 410, a processor 420, a memory 430, a first sensor 440, a second sensor 450, a display 460, and a communication module 480.
  • the wearable electronic device 401 may be implemented in the same or similar manner to the electronic device 101 of FIG. 1 .
  • the wearable electronic device 401 may be implemented as a watch-type wearable electronic device (eg, a smart watch).
  • the camera 410 may be implemented as an under display camera (UDC).
  • the camera 410 may be implemented in the same or similar manner to the camera 210 of FIG. 2 .
  • The camera 410 may be located in a central region of the display 460 (e.g., the display module 160 of FIG. 2).
  • The processor 420 may control the overall operation of the wearable electronic device 401.
  • the processor 420 may be implemented in the same or similar manner to the processor 120 of FIG. 1 .
  • the processor 420 may identify a first gesture with respect to the wearable electronic device 401 through the first sensor 440 .
  • the first sensor 440 may sense a gesture with respect to the wearable electronic device 401 .
  • the first sensor 440 may include at least one of an acceleration sensor, a speed sensor, and a motion sensor.
  • the first gesture may include a gesture of raising the wrist while the user wears the wearable electronic device 401 (eg, a smart watch).
  • the processor 420 may capture and acquire an image including the user's face through the camera 410 .
  • the processor 420 may check the user's face and gaze based on the image including the user's face captured through the camera 410 .
  • the processor 420 may check the size of the user's face, the direction of the face, and the direction of the gaze, based on the image.
  • the processor 420 may determine the direction of the face based on the nose and/or both eyes included in the face.
  • the processor 420 may determine the direction of the gaze by checking the two eyes included in the face.
  • the processor 420 may check whether the direction of the face and the direction of the gaze coincide with the camera 410 .
  • According to an embodiment, the processor 420 may determine whether to turn on the display 460 based on the first gesture, the user's face, and the user's gaze. For example, the processor 420 may check whether the first gesture matches a gesture pre-stored in the memory 430 (e.g., the memory 130 of FIG. 1), such as a gesture of raising the wrist while the user wears the wearable electronic device 401. Also, the processor 420 may check whether the direction of the user's face and/or the direction of the gaze coincides with the camera 410. The processor 420 may turn on the display when the first gesture matches the pre-stored gesture and the direction of the user's face and the direction of the gaze coincide with the camera 410.
  • the processor 420 may turn on the display when the first gesture matches a pre-stored gesture and the user's gaze direction matches the camera 410 .
  • For example, the processor 420 may turn on the display 460 if the first gesture matches the pre-stored gesture and the direction of the gaze matches the camera 410, even if the direction of the face does not match the camera 410.
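  • As an illustrative aside (not part of the disclosure), the turn-on decision described above can be sketched as a small predicate. This is a minimal sketch in Python assuming the gesture matching and the face/gaze direction checks are already available as boolean results; all names are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class FaceObservation:
        face_detected: bool        # a (frontal) face was found in the UDC image
        face_toward_camera: bool   # face direction coincides with the camera
        gaze_toward_camera: bool   # gaze direction coincides with the camera

    def should_turn_on_display(first_gesture_matches: bool,
                               obs: FaceObservation) -> bool:
        # No match with the pre-stored wrist-raise gesture: keep the display off.
        if not first_gesture_matches:
            return False
        # No face (or no frontal face) in the image: keep the display off.
        if not obs.face_detected:
            return False
        # Per the embodiments above, a gaze directed at the camera suffices
        # even when the face direction does not coincide with the camera.
        return obs.gaze_toward_camera

    # Gesture matched, face turned away, but gaze on the watch -> turn on.
    print(should_turn_on_display(True, FaceObservation(True, False, True)))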
  • the processor 420 may display a screen on the display 460 .
  • the screen may include at least one of a watch face, a message, a user interface, an image, an application execution screen, and a home screen.
  • According to an embodiment, the processor 420 may adjust the screen displayed on the display based on the distance between the user's face and the wearable electronic device 401 (or the camera 410), the direction of the user's face, and the direction (or angle) of the user's gaze.
  • The adjustment of the screen may include changing at least one object included in the screen (e.g., changing its color, position, and/or size) or changing from the existing screen to another screen.
  • According to an embodiment, the processor 420 may enlarge or reduce the size of the screen displayed on the display 460 based on the distance between the user's face and the wearable electronic device 401. For example, the processor 420 may enlarge the size of the screen as the distance between the user's face and the wearable electronic device 401 increases. Alternatively, the processor 420 may reduce the size of the screen as the distance increases.
  • According to an embodiment, the processor 420 may check the size of the user's face and the distance between the user's eyes based on the image including the user's face, and may check the distance between the user's face and the wearable electronic device based on the size of the user's face and the distance between the user's eyes.
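  • A minimal sketch of one way to compute such a distance, assuming a pinhole-camera approximation and an average interpupillary distance; the focal length and the 63 mm constant are illustrative assumptions, not values from the disclosure:

    AVG_IPD_MM = 63.0        # assumed average adult interpupillary distance
    FOCAL_LENGTH_PX = 500.0  # assumed effective UDC focal length, in pixels

    def estimate_face_distance_mm(eye_distance_px: float) -> float:
        """Pinhole model: distance = focal_length * real_size / image_size."""
        if eye_distance_px <= 0:
            raise ValueError("eye distance must be positive")
        return FOCAL_LENGTH_PX * AVG_IPD_MM / eye_distance_px

    # Example: eyes 90 px apart in the captured image -> about 350 mm away.
    print(round(estimate_face_distance_mm(90.0)))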
  • the processor 420 may display only some information on a screen displayed on the display 460 .
  • the processor 420 may check the privacy level of content displayed on the screen and determine information displayed on the screen according to the privacy level.
  • According to an embodiment, when a person other than the previously registered user is identified in the image captured through the camera 410, the processor 420 may change the display position of the information displayed on the display 460. For example, the processor 420 may display the information in an area that persons other than the pre-registered user cannot see in detail (e.g., an area away from those persons).
  • the processor 420 may adjust the position of the notification window displayed on the display 460 based on the direction of the face and the direction (or angle) of the gaze. For example, the processor 420 may display a notification window at a position toward which the user's face and gaze are directed. According to another embodiment, the processor 420 may adjust the position of the watch face displayed on the display 460 based on the direction of the face and the direction (or angle) of the gaze. For example, the processor 420 may display the watch face at a position toward which the user's face and gaze are directed.
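  • A minimal sketch of placing a notification window at the gazed-at position, assuming the gaze estimate arrives as normalized (x, y) display coordinates; the display and window dimensions are illustrative assumptions:

    DISPLAY_W, DISPLAY_H = 450, 450  # assumed display resolution, in pixels
    WINDOW_W, WINDOW_H = 200, 100    # assumed notification-window size

    def notification_window_origin(gaze_x: float, gaze_y: float):
        """Center the window on the gazed-at point, clamped to the display."""
        x = int(gaze_x * DISPLAY_W - WINDOW_W / 2)
        y = int(gaze_y * DISPLAY_H - WINDOW_H / 2)
        x = max(0, min(x, DISPLAY_W - WINDOW_W))
        y = max(0, min(y, DISPLAY_H - WINDOW_H))
        return (x, y)

    print(notification_window_origin(0.5, 0.5))  # gaze at the central region
    print(notification_window_origin(0.1, 0.5))  # gaze at the left region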
  • According to an embodiment, the processor 420 may check a second gesture regarding the movement of the user's hand (e.g., a gesture of spreading the hand) through the camera 410, and may adjust the screen displayed on the display 460 (e.g., enlarge or reduce the screen size) based on the second gesture.
  • For example, the processor 420 may check the second gesture regarding the user's movement based on images continuously captured through the camera 410 while the screen is displayed on the display 460, and may check whether the second gesture matches a specified gesture.
  • the processor 420 may adjust the screen of the display 460 according to a function corresponding to the second gesture.
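  • A minimal sketch of dispatching a recognized second gesture to its corresponding screen function; the gesture labels and zoom factors are illustrative assumptions:

    def apply_second_gesture(gesture: str, current_zoom: float) -> float:
        """Adjust the screen according to the function mapped to the gesture."""
        actions = {
            "hand_spread": 1.25,  # e.g., spreading the hand -> enlarge screen
            "hand_pinch": 0.80,   # e.g., pinching -> reduce screen
        }
        # Gestures that match no specified gesture leave the screen unchanged.
        return current_zoom * actions.get(gesture, 1.0)

    zoom = apply_second_gesture("hand_spread", 1.0)
    print(zoom)  # 1.25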
  • the processor 420 may acquire the user's biometric information through the second sensor 450 .
  • the second sensor 450 may include a PPG sensor, an EKG sensor, and/or an ECG sensor capable of measuring a user's biosignal.
  • the processor 420 may update the screen displayed on the display 460 based on the biometric information obtained through the second sensor 450 .
  • the processor 420 may check the user's mood state and/or health state through the biometric information, and update the screen according to the confirmed mood state and/or health state.
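  • A minimal sketch of choosing a watch-face update from a biosignal, assuming a heart-rate reading from the second sensor; the thresholds and face names are illustrative assumptions, not values from the disclosure:

    def watch_face_for_heart_rate(bpm: int) -> str:
        """Map a coarse state estimate from the heart rate to a watch face."""
        if bpm >= 120:
            return "exercise_face"  # elevated rate -> workout-oriented face
        if bpm <= 55:
            return "rest_face"      # low rate -> calm, rest-oriented face
        return "default_face"

    print(watch_face_for_heart_rate(130))  # exercise_face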
  • the processor 420 may transmit/receive data (or signals) to and from the external electronic device 402 through the communication module 480 .
  • the external electronic device 402 may include a terminal, a smartphone, and an AI speaker.
  • the processor 420 may determine whether the user is a pre-registered user based on the user's face and gaze included in the image captured by the camera 410 . When the user is identified as a pre-registered user, the processor 420 may transmit a signal for requesting unlocking to the external electronic device 402 .
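  • A minimal sketch of this unlock handshake, assuming face recognition produces embedding vectors and that a send function stands in for the communication module 480; the similarity threshold is an illustrative assumption:

    def cosine_similarity(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = sum(x * x for x in a) ** 0.5
        norm_b = sum(y * y for y in b) ** 0.5
        return dot / (norm_a * norm_b)

    def maybe_request_unlock(face_embedding, registered_embedding,
                             send_to_external, threshold=0.8) -> bool:
        """On a match with the pre-registered user, request unlocking."""
        if cosine_similarity(face_embedding, registered_embedding) >= threshold:
            send_to_external({"type": "UNLOCK_REQUEST"})
            return True
        return False

    # Example with print() standing in for the transmit call.
    print(maybe_request_unlock([1.0, 0.0], [0.9, 0.1], print))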
  • the processor 420 may display data received from the external electronic device 402 on the display 460 .
  • According to an embodiment, the processor 420 may display the privacy-related data received from the external electronic device 402 on the display 460 when the user's face and gaze are directed toward the wearable electronic device 401.
  • At least some of the operations of the electronic device described below may be performed by the processor 420 of FIG. 4. For convenience of description, the processor 420 is described as performing the operations.
  • FIG. 5 is a flowchart illustrating a method of operating a wearable electronic device according to various embodiments of the present disclosure.
  • According to various embodiments, the wearable electronic device (e.g., the wearable electronic device 401 of FIG. 4) may check a first gesture with respect to the wearable electronic device 401 through a first sensor (e.g., the first sensor 440 of FIG. 4).
  • the processor 420 may check whether the first gesture matches a pre-registered gesture.
  • According to various embodiments, the wearable electronic device 401 may check the user's face and gaze based on an image including the user's face captured through the camera (UDC) (e.g., the camera 410 of FIG. 4).
  • the processor 420 may determine whether the user is using or looking at the wearable electronic device 401 based on the user's face direction and the angle of the gaze.
  • According to various embodiments, the wearable electronic device 401 may determine whether to turn on a display (e.g., the display 460 of FIG. 4) based on the first gesture, the face, and the gaze.
  • For example, the processor 420 may turn on the display 460 when the first gesture matches the pre-registered gesture and it is confirmed, based on the face direction and the gaze angle, that the user is looking at the wearable electronic device 401.
  • According to various embodiments, when the display 460 is turned on, the wearable electronic device 401 may adjust the screen displayed on the display 460 based on the distance between the user's face and the wearable electronic device, the direction of the face, and the angle of the gaze. For example, the processor 420 may change the size of the screen or change its display position based on these values.
  • FIG. 6 is a diagram for describing an operation of a wearable electronic device checking a user's face and gaze, according to various embodiments of the present disclosure
  • According to various embodiments, a wearable electronic device (e.g., the wearable electronic device 401 of FIG. 4) may determine whether the user is using the wearable electronic device 401 by means of the camera (UDC) (e.g., the camera 410 of FIG. 4). For example, the wearable electronic device 401 may check the direction of the user's face and the angle of the gaze (or the direction of the gaze) in the image captured through the camera (UDC) 410, and may determine whether the user is using the wearable electronic device 401 based on the direction of the face and the angle of the gaze.
  • According to an embodiment, the wearable electronic device 401 may determine whether the user is using the wearable electronic device 401 based on the direction of the gaze while the front of the user's face is identified in the image captured through the camera 410.
  • According to various embodiments, the wearable electronic device 401 may acquire an image 610, 620, 630, or 640 including the user's face. For example, as shown in (a) of FIG. 6, when the wearable electronic device 401 confirms in the image 610 that the direction of the face and the direction of the gaze coincide with the camera 410, it may determine that the user is using, or intends to use, the wearable electronic device 401. As shown in (b) of FIG. 6, when the direction of the face does not coincide with the camera 410 in the image 620 but the direction of the gaze does, the wearable electronic device 401 may likewise determine that the user is using, or intends to use, the wearable electronic device 401. As shown in (c) of FIG. 6, when the direction of the face coincides with the camera 410 in the image 630 but the direction of the gaze does not, the wearable electronic device 401 may determine that the user is not using, and does not intend to use, the wearable electronic device 401. Also, as shown in (d) of FIG. 6, when neither the direction of the face nor the direction of the gaze coincides with the camera 410 in the image 640, the wearable electronic device 401 may determine that the user is not using, and does not intend to use, the wearable electronic device 401.
  • According to an embodiment, when the user's face is not identified, or the front of the face is not identified, in the image captured through the camera 410, the wearable electronic device 401 may determine that the user is not using, and does not intend to use, the wearable electronic device 401.
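  • The four cases of FIG. 6 reduce to a small truth table; a minimal sketch, assuming the face and gaze checks are available as booleans:

    def user_intends_to_use(face_found: bool,
                            gaze_toward_camera: bool) -> bool:
        """Gaze at the camera decides intent; face direction alone does not."""
        if not face_found:
            return False          # no face, or no frontal face, in the image
        return gaze_toward_camera

    assert user_intends_to_use(True, True)        # FIG. 6(a) and 6(b)
    assert not user_intends_to_use(True, False)   # FIG. 6(c) and 6(d)
    assert not user_intends_to_use(False, False)  # no face identified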
  • According to various embodiments, the wearable electronic device 401 may determine whether to turn on the display (e.g., the display 460 of FIG. 4) based on the first gesture, the direction of the face, and the angle of the gaze (or the direction of the gaze).
  • FIG. 7 is a flowchart illustrating an operation of adjusting a screen of a wearable electronic device based on a distance from a user's face, according to various embodiments of the present disclosure.
  • According to various embodiments, the wearable electronic device 401 may determine whether to turn on the display (e.g., the display 460 of FIG. 4) based on the first gesture, the direction of the face, and the angle of the gaze (or the direction of the gaze).
  • According to various embodiments, the wearable electronic device 401 may adjust the screen displayed on the display 460 based on the user's face and gaze in the image captured through the camera 410.
  • According to various embodiments, the wearable electronic device 401 may capture a plurality of images (continuously) through the camera 410 and check whether the user's gaze is maintained for a specified time. For example, the wearable electronic device 401 may check whether the user's gaze is directed at the display 460 (or the camera 410) for the specified time.
  • the wearable electronic device 401 may check the size of the user's face and the distance between the eyes included in the face.
  • the wearable electronic device 401 may identify a distance between the user's face and the wearable electronic device 401 based on the size of the user's face and the distance between the eyes.
  • According to various embodiments, the wearable electronic device 401 may enlarge or reduce the size of the screen displayed on the display 460 based on the distance between the user's face and the wearable electronic device 401. For example, the wearable electronic device 401 may enlarge the size of the screen as the distance between the user's face and the wearable electronic device 401 increases. Alternatively, the wearable electronic device 401 may reduce the size of the screen as the distance increases.
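  • A minimal sketch of one of the two directions allowed above (enlarging with distance so content stays legible); the reference distance and the scale clamp are illustrative assumptions:

    REFERENCE_DISTANCE_MM = 300.0  # assumed distance at which scale == 1.0

    def screen_scale(distance_mm: float,
                     min_scale: float = 0.75,
                     max_scale: float = 2.0) -> float:
        """Scale the screen proportionally to distance, within a clamp."""
        scale = distance_mm / REFERENCE_DISTANCE_MM
        return max(min_scale, min(scale, max_scale))

    print(screen_scale(450.0))  # farther face -> larger on-screen content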
  • FIG. 8 is a diagram for describing an operation of adjusting a screen of a wearable electronic device based on a distance from a user's face according to various embodiments of the present disclosure
  • According to various embodiments, the wearable electronic device (e.g., the wearable electronic device 401 of FIG. 4) may check the user's face included in the image captured through the camera 410, and may check the distance between the user's face and the wearable electronic device 401 based on the size of the user's face and the distance between the eyes.
  • According to an embodiment, the wearable electronic device 401 may identify the user's face 810 of a first size.
  • the wearable electronic device 401 may determine that the distance between the user's face and the wearable electronic device 401 is the first distance d1 based on the user's face 810 of the first size.
  • the wearable electronic device 401 may display the first screen 820 on the display (eg, the display 460 of FIG. 4 ) based on the first distance d1 .
  • According to an embodiment, the wearable electronic device 401 may identify the user's face 830 of a second size. For example, the second size may be smaller than the first size.
  • the wearable electronic device 401 may determine that the distance between the user's face and the wearable electronic device 401 is the second distance d2 based on the user's face 830 of the second size.
  • the wearable electronic device 401 may display the second screen 840 on the display 460 based on the second distance d2 .
  • the size of the second screen 840 may be larger than the size of the first screen 820 .
  • For example, the second distance d2 may be longer than the first distance d1.
  • Although FIG. 8 describes only a method of determining the size of the screen based on the size of the face, the technical spirit of the present disclosure is not limited thereto.
  • the wearable electronic device 401 may check the distance between the user's face and the wearable electronic device 401 based on a change in the distance between the user's eyes in the same or similar manner as described above.
  • FIG. 9A is a flowchart illustrating an operation of adjusting a screen of a wearable electronic device when a person other than the user is identified, according to various embodiments of the present disclosure.
  • the wearable electronic device (the wearable electronic device 401 of FIG. 4 ) takes a picture through a camera UDC (eg, the camera 410 of FIG. 4 ). It can be checked whether a user other than the user is included in the image including the user's face. For example, the wearable electronic device 401 may check whether a user other than a pre-registered user (eg, a user of the wearable electronic device 401) is included in the image.
  • when another user is identified, the wearable electronic device 401 may display only some information of a message on the screen displayed on the display (eg, the display 460 of FIG. 4). For example, which information of the message is displayed may be determined by the privacy level of the corresponding message. For example, for a message with the highest privacy level (eg, a security authentication message), only the message subject may be displayed. Alternatively, for a message with an intermediate privacy level (eg, a personal message), the message subject and a part of the message content may be displayed.
  • when no other user is identified, the wearable electronic device 401 may display all information of the message on the screen displayed on the display 460.
  • the wearable electronic device 401 may be set, according to the user's settings, to display only some information of the message depending on its privacy level.
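  • As a non-limiting illustration, the privacy-level filtering described above could be expressed in Kotlin as below. The level names, the Message fields, and the 20-character snippet length are assumptions of this sketch rather than requirements of the disclosure:

      enum class PrivacyLevel { HIGH, MEDIUM, LOW }

      data class Message(val subject: String, val body: String, val level: PrivacyLevel)

      // Text that may be shown while a person other than the user is in view.
      fun redactForBystander(msg: Message): String = when (msg.level) {
          // Highest level (eg, a security authentication message): subject only.
          PrivacyLevel.HIGH -> msg.subject
          // Intermediate level (eg, a personal message): subject plus a snippet.
          PrivacyLevel.MEDIUM -> "${msg.subject}: ${msg.body.take(20)}…"
          // Low level: the full message may remain visible.
          PrivacyLevel.LOW -> "${msg.subject}: ${msg.body}"
      }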
  • FIG. 9B is a diagram for describing an operation of adjusting a screen of a wearable electronic device when a user other than the user is identified, according to various embodiments of the present disclosure.
  • referring to FIG. 9B, a wearable electronic device (eg, the wearable electronic device 401 of FIG. 4) may display a message 950 on a display (eg, the display 460 of FIG. 4).
  • the wearable electronic device 401 may check whether a user other than the user is included in the image including the user's face captured through the camera (UDC) (eg, the camera 410 of FIG. 4). When it is confirmed that another user is included, the wearable electronic device 401 may display only partial information of a message displayed on the display 460. For example, referring to (b) of FIG. 9B, the wearable electronic device 401 may display only the title 960 of the message on the display 460. The wearable electronic device 401 may change the display position of the title 960 of the message to the lower area of the display 460. Also, the size at which the title 960 of the message is displayed may be reduced compared to the size of the existing message 950.
  • FIG. 10 is a diagram for explaining an operation of adjusting a screen based on a user's gaze by a wearable electronic device, according to various embodiments of the present disclosure.
  • the wearable electronic device (eg, the wearable electronic device 401 of FIG. 4) may check the user's face included in an image captured through the camera (UDC) (eg, the camera 410 of FIG. 4).
  • the wearable electronic device 401 may adjust the position of the notification window 1010 , 1020 , 1030 , 1040 , or 1050 displayed on the display 460 based on the face direction and the gaze direction (or angle).
  • the wearable electronic device 401 may display a notification window at the position toward which the user's face and gaze are directed. For example, when the position toward which the user's face and gaze are directed is identified as a first position (eg, the central region of the display 460), the wearable electronic device 401 may display a notification window 1010 in the area corresponding to the first position (eg, the central area of the display 460). When the position is identified as a second position (eg, the left region of the display 460), the wearable electronic device 401 may display a notification window 1020 in the area corresponding to the second position (eg, the left area of the display 460). When the position is identified as a third position, the wearable electronic device 401 may display a notification window 1030 in the area corresponding to the third position (eg, the upper area of the display 460). Likewise, when the position is identified as a fourth or a fifth position, the wearable electronic device 401 may display a notification window 1040 or 1050 in the corresponding area of the display 460.
  • the wearable electronic device 401 may adjust the shape of the notification window displayed on the display 460 based on the position where the notification window is displayed.
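  • For illustration, a simple way to map an estimated gaze point to a display region, and hence to a notification-window position, is sketched below in Kotlin; the thirds-based partition and the region names are assumptions of this sketch:

      enum class Region { CENTER, LEFT, RIGHT, UPPER, LOWER }

      // gazeX and gazeY are assumed to be normalized to [0, 1] across the display.
      fun regionForGaze(gazeX: Float, gazeY: Float): Region = when {
          gazeY < 1f / 3f -> Region.UPPER
          gazeY > 2f / 3f -> Region.LOWER
          gazeX < 1f / 3f -> Region.LEFT
          gazeX > 2f / 3f -> Region.RIGHT
          else -> Region.CENTER
      }

      // The window shape could then be adapted to the chosen region, eg by
      // selecting a different layout per Region value (details device-specific).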
  • FIG. 11 is a flowchart illustrating an operation in which a wearable electronic device adjusts a screen based on user's biometric information according to various embodiments of the present disclosure.
  • the wearable electronic device may acquire the user's biometric information through the second sensor.
  • the wearable electronic device 401 may update a screen displayed on a display (eg, the display 460 of FIG. 4 ) in consideration of the user's biometric information. For example, the wearable electronic device 401 may check the user's mood or health status based on the user's biometric information. The wearable electronic device 401 may update the watch face in consideration of the user's mood or health condition.
  • FIG. 12 is a diagram for explaining an operation of adjusting a screen of a wearable electronic device based on user's biometric information, according to various embodiments of the present disclosure.
  • the wearable electronic device may display the first watch face 1210 on a display (eg, the display 460 of FIG. 4 ).
  • the first watch face 1210 may be a basic watch face.
  • the wearable electronic device 401 may acquire the user's biometric information through a second sensor (eg, the second sensor 450 of FIG. 4 ). For example, the wearable electronic device 401 may check the user's mood state (or health state) based on the user's biometric information.
  • the wearable electronic device 401 may update the watch face 1210 in consideration of the user's mood state (or health state). For example, when the user is "tired" as a result of checking the biometric information, the wearable electronic device 401 may update the first watch face 1210 to the second watch face 1220 .
  • the second watch face 1220 may include a color indicating the degree of “tiredness”.
  • when the user is in a "depressed" state as a result of checking the biometric information, the wearable electronic device 401 may update the first watch face 1210 to the third watch face 1230.
  • the third watch face 1230 may include a color indicating the degree of “depression”.
  • the wearable electronic device 401 may update the first watch face 1210 to the fourth watch face 1240 when the user is in a "nervous" state as a result of checking the biometric information.
  • the fourth watch face 1240 may include a color indicating the degree of “tension”.
  • the wearable electronic device 401 may provide a notification or feedback indicating the user's mood state (or health state) using sound or vibration while updating the watch face.
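  • By way of illustration, the mood-dependent watch-face update could take the form below in Kotlin. The watch-face identifiers follow FIG. 12, but the mood states, color values, and names (Mood, WatchFace, watchFaceFor) are assumptions of this sketch:

      enum class Mood { NEUTRAL, TIRED, DEPRESSED, NERVOUS }

      data class WatchFace(val id: Int, val accentColor: Long)

      fun watchFaceFor(mood: Mood): WatchFace = when (mood) {
          Mood.NEUTRAL -> WatchFace(1210, 0xFFFFFFFF)   // basic watch face
          Mood.TIRED -> WatchFace(1220, 0xFF6677CC)     // color suggesting tiredness (assumed)
          Mood.DEPRESSED -> WatchFace(1230, 0xFF555566) // color suggesting depression (assumed)
          Mood.NERVOUS -> WatchFace(1240, 0xFFCC5533)   // color suggesting tension (assumed)
      }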
  • FIG. 13 is a flowchart illustrating an operation in which a wearable electronic device adjusts a screen based on a user's gesture, according to various embodiments of the present disclosure.
  • the wearable electronic device may check the user's second gesture through the camera (UDC) (eg, the camera 410 of FIG. 4).
  • the second gesture may include a specific movement of the user's hand (or finger) and/or a specific shape of the hand.
  • the wearable electronic device 401 may set a face detection area and a hand detection area for the camera (UDC) 410, and may check the user's face and hand within these areas.
  • the wearable electronic device 401 may adjust the screen displayed on the display (eg, the display 460 of FIG. 4) by considering the second gesture in addition to the face and gaze of the user identified in the image captured through the camera (UDC) 410.
  • the wearable electronic device 401 may adjust the screen displayed on the display 460 in consideration of only the second gesture identified by the camera (UDC) 410 .
  • FIG. 14 is a diagram for explaining an operation of adjusting a screen of a wearable electronic device based on a user's gesture, according to various embodiments of the present disclosure.
  • the wearable electronic device may check a movement of the user's hand (or a shape of the hand) 1405 through the camera (UDC) (eg, the camera 410 of FIG. 4).
  • the wearable electronic device 401 may determine whether the movement of the user's hand matches the specified gesture.
  • the wearable electronic device 401 may perform a function corresponding to the specified gesture. For example, the wearable electronic device 401 may change the first screen 1410 having the first size displayed on the display (eg, the display 460 of FIG. 4) to the second screen 1420 having the second size. That is, the wearable electronic device 401 may enlarge the size of the screen previously displayed on the display 460 based on the movement of the user's hand. According to another embodiment, the wearable electronic device 401 may change the second screen 1420 having the second size displayed on the display 460 to the first screen 1410 having the first size. That is, the wearable electronic device 401 may reduce the size of the screen previously displayed on the display 460 based on the movement of the user's hand.
  • although FIG. 14 describes only the operation of enlarging or reducing the size of the screen, the technical spirit of the present disclosure is not limited thereto.
  • the wearable electronic device 401 may adjust the screen in various forms based on a gesture including the movement of the user's hand (or the shape of the hand) confirmed through the camera (UDC) 410 .
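  • As one hypothetical realization, matching a hand movement to a specified gesture and adjusting the screen size could be sketched in Kotlin as follows; the gesture vocabulary, the 1.25 scale step, and the clamp bounds are assumptions of this sketch:

      enum class HandGesture { SPREAD, PINCH, UNKNOWN }

      // Returns the new scale factor of the displayed screen.
      fun applyGesture(gesture: HandGesture, currentScale: Float): Float = when (gesture) {
          HandGesture.SPREAD -> (currentScale * 1.25f).coerceAtMost(2.0f)  // enlarge
          HandGesture.PINCH -> (currentScale / 1.25f).coerceAtLeast(0.5f)  // reduce
          HandGesture.UNKNOWN -> currentScale  // no specified gesture matched
      }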
  • FIG. 15 is a flowchart illustrating an operation in which a wearable electronic device transmits an unlock signal to another electronic device, according to various embodiments of the present disclosure.
  • the wearable electronic device (eg, the wearable electronic device 401 of FIG. 4) may check the user's face and gaze based on an image captured through a camera (UDC) (eg, the camera 410 of FIG. 4).
  • the wearable electronic device 401 may determine whether the user matches the registered user based on the user's face and gaze.
  • the wearable electronic device 401 may check whether the user's face and gaze match user information (eg, face and gaze information) stored in a memory (eg, the memory 430 of FIG. 4).
  • when the stored user information matches, the wearable electronic device 401 may determine that the user identified through the camera 410 is a registered user.
  • when the user identified through the camera 410 is not a registered user, the wearable electronic device 401 may not perform a separate additional operation. In this case, the wearable electronic device 401 may monitor whether a new user is identified through the camera 410.
  • when the user is identified as a registered user, the wearable electronic device 401 may transmit a signal requesting unlocking to an external electronic device (eg, the external electronic device 402 of FIG. 4).
  • the external electronic device 402 may unlock or maintain the unlocked state of the external electronic device 402 based on reception of a signal for requesting unlocking.
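  • For illustration only, the registered-user check and the unlock request could be sketched in Kotlin as below. The feature-vector comparison, the similarity threshold, and the names (UnlockTransport, UserVerifier) are assumptions of this sketch; the disclosure does not specify how faces and gazes are matched:

      import kotlin.math.sqrt

      interface UnlockTransport {
          fun sendUnlockRequest(deviceId: String)
      }

      class UserVerifier(
          private val registered: FloatArray,  // stored face/gaze features
          private val transport: UnlockTransport,
          private val threshold: Float = 0.8f  // assumed similarity cutoff
      ) {
          // Cosine similarity between stored and observed feature vectors.
          private fun similarity(a: FloatArray, b: FloatArray): Float {
              var dot = 0f; var na = 0f; var nb = 0f
              for (i in a.indices) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i] }
              return dot / (sqrt(na) * sqrt(nb))
          }

          // If the observed user matches the registered user, ask the external
          // device to unlock (or stay unlocked); otherwise do nothing and keep
          // monitoring for a new user.
          fun onUserObserved(observed: FloatArray, externalDeviceId: String) {
              if (similarity(observed, registered) >= threshold) {
                  transport.sendUnlockRequest(externalDeviceId)
              }
          }
      }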
  • a wearable electronic device according to various embodiments may include a first sensor, a display, a camera disposed under a rear surface of the display that is not exposed to the outside, and a processor, wherein the processor may be set to check a first gesture for the wearable electronic device through the first sensor, check the user's face and gaze based on an image including the user's face captured through the camera, determine whether to turn on the display based on the first gesture, the face, and the gaze, and, when the display is determined to be turned on, adjust a screen displayed on the display based on a distance between the face and the wearable electronic device, a direction of the face, and an angle of the gaze.
  • the processor may be configured to enlarge or reduce the size of the screen based on the distance between the face and the wearable electronic device when the gaze remains the same for a specified time.
  • the processor may be configured to identify the distance between the face and the wearable electronic device based on the size of the face and the distance between the eyes of the face.
  • the processor may be configured to display only some information on the screen when a user other than the user is identified through the camera.
  • the processor may be configured to change a display position of the partial information included in the screen when the other user other than the user is identified through the camera.
  • the processor may be set to adjust the position of the notification window displayed on the display based on the direction of the face and the angle of the gaze.
  • the processor may be configured to check a second gesture for the movement of the user's hand through the camera, and to adjust the configuration of the screen based on the second gesture.
  • the electronic device may further include a second sensor for acquiring biometric information, and the processor may be configured to acquire the user's biometric information through the second sensor and update the screen based on the biometric information.
  • the processor may be configured to check whether the user is a pre-registered user based on the face and the gaze, and to transmit a signal for requesting unlocking to an external electronic device when the user is identified as the pre-registered user.
  • the wearable electronic device may be implemented as a smart watch.
  • the method of operating the wearable electronic device may include an operation of identifying a first gesture for the wearable electronic device through a first sensor included in the wearable electronic device, an operation of confirming the user's face and gaze based on an image including the user's face photographed through a camera disposed under a rear surface of a display of the wearable electronic device that is not exposed to the outside, an operation of determining whether to turn on the display of the electronic device based on the first gesture, the face, and the gaze, and, when the display is turned on, an operation of adjusting the screen displayed on the display based on the distance between the face and the wearable electronic device, the direction of the face, and the angle of the gaze.
  • the operation of adjusting the screen may include an operation of enlarging or reducing the size of the screen based on the distance between the face and the wearable electronic device when the gaze remains the same for a specified time.
  • the operation of adjusting the screen may include an operation of checking the distance between the face and the wearable electronic device based on the size of the face and the distance between the eyes of the face.
  • the operation of adjusting the screen may include an operation of displaying only partial information on the screen when a user other than the user is identified through the camera.
  • the operation of adjusting the screen may include an operation of changing a display position of the partial information included in the screen when the other user other than the user is identified through the camera.
  • the operation of adjusting the screen may include adjusting the position of the notification window displayed on the display based on the direction of the face and the angle of the gaze.
  • the method of operating the electronic device may further include checking a second gesture for the movement of the user's hand through the camera and adjusting the configuration of the screen based on the second gesture.
  • the method of operating the electronic device may further include acquiring the user's biometric information through a second sensor included in the wearable electronic device and updating the screen based on the biometric information.
  • the method of operating the electronic device may further include an operation of confirming whether the user is a pre-registered user based on the face and the gaze, and an operation of transmitting, to an external electronic device, a signal for requesting unlocking when the user is identified as the pre-registered user.
  • according to various embodiments, instructions may be provided that are executable by the processor to perform an operation of checking a first gesture for the wearable electronic device through a first sensor included in the wearable electronic device, an operation of confirming the user's face and gaze based on an image including the user's face photographed through a camera disposed under the rear surface of the display of the wearable electronic device that is not exposed to the outside, an operation of determining whether to turn on the display of the electronic device based on the first gesture, the face, and the gaze, and an operation of adjusting the screen displayed on the display based on the distance between the face and the wearable electronic device, the direction of the face, and the angle of the gaze when the display is determined to be turned on.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Signal Processing (AREA)
  • Exposure Control For Cameras (AREA)
  • Details Of Cameras Including Film Mechanisms (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A wearable electronic device according to various embodiments comprises: a first sensor; a display; a camera disposed under the rear surface of the display that is not exposed to the outside; and a processor. The processor may be configured to: identify, through the first sensor, a first gesture for the wearable electronic device; identify a user's face and gaze based on an image captured by the camera, the image including the user's face; determine whether to turn on the display based on the first gesture, the face, and the gaze; and, when it is determined to turn on the display, adjust a screen displayed on the display based on the distance between the face and the wearable electronic device, the direction of the face, and the angle of the gaze.
PCT/KR2022/003005 2021-03-15 2022-03-03 Dispositif électronique comprenant une caméra et son procédé de fonctionnement WO2022196984A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020210033224A KR20220128735A (ko) 2021-03-15 2021-03-15 카메라를 포함하는 전자 장치 및 이의 동작 방법
KR10-2021-0033224 2021-03-15

Publications (1)

Publication Number Publication Date
WO2022196984A1 true WO2022196984A1 (fr) 2022-09-22

Family

ID=83320756

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/003005 WO2022196984A1 (fr) 2021-03-15 2022-03-03 Dispositif électronique comprenant une caméra et son procédé de fonctionnement

Country Status (2)

Country Link
KR (1) KR20220128735A (fr)
WO (1) WO2022196984A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160125846A1 (en) * 2013-06-19 2016-05-05 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Smart watch and display method for smart watch
KR101398946B1 * 2013-06-25 2014-05-27 서울시립대학교 산학협력단 Smart watch control apparatus, method, and recording medium
KR20150069856A * 2013-12-16 2015-06-24 삼성전자주식회사 Smart watch control method and system
KR20200015291A * 2018-08-03 2020-02-12 서정호 Watch-type smart wearable device and monitoring system including same
JP2021012462A * 2019-07-04 2021-02-04 学校法人同志社 Wearable device and input method of wearable device

Also Published As

Publication number Publication date
KR20220128735A (ko) 2022-09-22

Similar Documents

Publication Publication Date Title
EP4128729A1 (fr) Dispositif électronique comprenant une structure réduisant les frottements
WO2022014988A1 (fr) Dispositif électronique et procédé de commande de puissance
WO2022025494A1 (fr) Dispositif électronique de commande de luminance de dispositif d'affichage et procédé de fonctionnement associé
WO2022177299A1 (fr) Procédé de commande de fonction d'appel et dispositif électronique le prenant en charge
WO2022103021A1 (fr) Dispositif électronique à affichage flexible et procédé de commande dudit dispositif
WO2022025444A1 (fr) Procédé et appareil d'affichage d'écran
WO2022075632A1 (fr) Dispositif électronique comprenant une antenne
WO2022164055A1 (fr) Appareil électronique pliable et procédé pour fournir de l'énergie électrique à un appareil électronique externe à l'aide d'un appareil électronique pliable
WO2022050620A1 (fr) Circuit de détection et dispositif électronique le comprenant
WO2022030910A1 (fr) Dispositif électronique permettant de commander un mode d'entrée selon un angle de pliage et son procédé
WO2022196984A1 (fr) Dispositif électronique comprenant une caméra et son procédé de fonctionnement
WO2022030800A1 (fr) Dispositif électronique pour détection d'entrée d'utilisateur et son procédé de fonctionnement
WO2024101704A1 (fr) Dispositif pouvant être porté et procédé d'identification d'entrée tactile et support de stockage lisible par ordinateur non transitoire
WO2022119412A1 (fr) Dispositif électronique comprenant un afficheur flexible et une caméra
WO2023043118A1 (fr) Dispositif électronique et procédé de reconnaissance tactile de dispositif électronique
WO2023068549A1 (fr) Dispositif électronique utilisant un dispositif externe, et son procédé de fonctionnement
WO2022197155A1 (fr) Dispositif électronique comprenant une carte de circuit imprimé souple
WO2022177206A2 (fr) Appareil électronique et procédé de fonctionnement d'appareil électronique
WO2024090787A1 (fr) Dispositif électronique et procédé de fonctionnement d'ui d'authentification biométrique correspondant
WO2023058873A1 (fr) Dispositif électronique comprenant une antenne
WO2022203364A1 (fr) Dispositif électronique comprenant un boîtier comprenant une marque de repère et son procédé de fabrication
WO2023214675A1 (fr) Dispositif électronique et procédé de traitement d'entrée tactile
WO2022025698A1 (fr) Dispositif électronique effectuant une opération en fonction d'une phase de sommeil et procédé de fonctionnement pour dispositif électronique
WO2022108402A1 (fr) Procédé de fonctionnement d'écran souple, et dispositif électronique
WO2023018095A1 (fr) Dispositif électronique et procédé de connexion de dispositif externe par dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22771650

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22771650

Country of ref document: EP

Kind code of ref document: A1