WO2022030800A1 - Electronic device for detecting user input and operating method thereof - Google Patents

Electronic device for detecting user input and operating method thereof

Info

Publication number
WO2022030800A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
touch information
electronic device
sensor
user
Prior art date
Application number
PCT/KR2021/009163
Other languages
English (en)
Korean (ko)
Inventor
이원희
최재혁
진서영
김재원
이영욱
조하연
최재원
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2022030800A1

Classifications

    • G: Physics
    • G06: Computing; Calculating or Counting
    • G06F: Electric digital data processing
    • G06F 3/038: Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/163: Wearable computers, e.g. on a belt
    • G06F 3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/0354: Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03547: Touch pads, in which fingers can move on a surface

Definitions

  • Various embodiments of the present invention relate to an apparatus and method for detecting a user input in an electronic device.
  • Electronic devices are being provided in various forms, such as smartphones, tablet personal computers (PCs), or personal digital assistants (PDA).
  • Electronic devices are being developed in a form that can be worn by a user so as to improve mobility and user accessibility.
  • the electronic device may include a wearable device worn on the user's ear or worn on the user's face.
  • The electronic device may control an electronic device (e.g., a wearable device) or an external device (e.g., a smartphone) connected via wireless communication, based on a user input.
  • An electronic device (e.g., a wearable device) may detect a user input using a touch sensing sensor (e.g., a touch screen panel (TSP)).
  • a touch area for detecting a change in capacitance through a touch sensing sensor may be limited.
  • Since wearable devices are configured in various shapes, a user may not clearly recognize the touch area, so contact with an incorrect area or touch misrecognition may occur.
  • Various embodiments of the present invention disclose an apparatus and method for improving recognition performance of a user input in an electronic device.
  • According to various embodiments, an electronic device includes a housing, an acceleration sensor disposed in an inner space of the housing, a touch sensing sensor disposed in the inner space, and a processor operatively connected to the acceleration sensor and the touch sensing sensor. The processor may identify first touch information related to a user's contact with the housing using the acceleration sensor, identify second touch information related to the user's contact with the housing using the touch sensing sensor, and, when the number of touches included in the first touch information is two or more and the number of touches included in the second touch information is one or more, detect a user input corresponding to the user's contact based on the number of touches included in the first touch information.
  • According to various embodiments, a method of operating an electronic device includes identifying first touch information related to a user's contact with a housing of the electronic device using an acceleration sensor, identifying second touch information related to the user's contact with the housing using a touch sensing sensor, and, when the number of touches included in the first touch information is two or more and the number of touches included in the second touch information is one or more, detecting a user input corresponding to the user's contact based on the number of touches included in the first touch information.
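As a rough illustration of the first operation above, a tap on the housing appears as a brief spike in acceleration magnitude, so a tap count can be derived by counting such spikes. The sketch below is a hypothetical illustration only: the threshold, minimum spacing, and function names are assumptions for illustration, not values taken from the disclosure.

```python
# Hypothetical sketch: derive "first touch information" (a tap count) from a
# sequence of acceleration magnitudes. Threshold and spacing are illustrative.

def count_taps(magnitudes, threshold=2.5, min_gap=5):
    """Count upward threshold crossings in acceleration magnitudes (in g).

    A tap is counted when the magnitude crosses `threshold` upward, and
    consecutive taps must be at least `min_gap` samples apart.
    """
    taps = 0
    last_tap = -min_gap  # allow a tap at index 0
    for i, m in enumerate(magnitudes):
        above = m >= threshold
        prev_above = i > 0 and magnitudes[i - 1] >= threshold
        if above and not prev_above and i - last_tap >= min_gap:
            taps += 1
            last_tap = i
    return taps
```

With two well-separated spikes in the input, the function reports two taps, which would then be compared against the touch sensing sensor's report.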
  • According to various embodiments, by detecting a user input related to a user's contact with the electronic device using both an acceleration sensor and a touch sensing sensor (e.g., a touch screen panel (TSP)), the electronic device can reduce recognition errors caused by an overly sensitive touch sensing sensor, prevent ghost touches, and mitigate touch misrecognition, thereby improving the usability of the electronic device.
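The decision rule described above can be sketched as follows. This is a hypothetical illustration of the claimed logic, not the actual implementation: the function name and the fallback behavior for count combinations the text does not spell out are assumptions.

```python
# Hypothetical sketch of the claimed decision rule: when the acceleration
# sensor (first touch information) reports two or more touches and the touch
# sensing sensor (second touch information) confirms at least one touch, the
# user input is detected based on the acceleration-derived touch count.
# Behavior outside that condition is an illustrative assumption.

def detect_user_input(first_touch_count: int, second_touch_count: int):
    """Return the detected tap count, or None if no input is recognized."""
    if first_touch_count >= 2 and second_touch_count >= 1:
        # Multi-tap confirmed by the touch panel: trust the acceleration count.
        return first_touch_count
    if first_touch_count >= 1 and second_touch_count >= 1:
        # Single contact seen by both sensors.
        return second_touch_count
    # A touch reported by only one sensor is treated as misrecognition
    # (e.g., a ghost touch on an over-sensitive panel) and ignored.
    return None
```

For example, a rapid double tap that the panel registers as a single touch would still be detected as two taps, while a touch reported by the panel alone would be rejected as a ghost touch.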
  • FIG. 1 is a block diagram of an electronic device according to various embodiments of the present disclosure.
  • FIG. 2A is a perspective view illustrating an electronic device that can be worn on a user's ear according to various embodiments of the present disclosure.
  • FIG. 2B is a perspective view illustrating the electronic device of FIG. 2A as viewed from another direction, according to various embodiments of the present disclosure.
  • FIG. 3 is a perspective view illustrating an electronic device that can be worn on a user's face according to various embodiments of the present disclosure.
  • FIG. 4 is a block diagram of an electronic device for detecting a user input according to various embodiments of the present disclosure.
  • FIG. 5 is a flowchart for detecting a user input in an electronic device according to various embodiments of the present disclosure.
  • FIG. 6 illustrates a state in which an electronic device is coupled to a user's ear according to various embodiments of the present disclosure.
  • FIGS. 7A and 7B are examples of first touch information acquired through a first sensor in an electronic device according to various embodiments of the present disclosure.
  • FIG. 8 is a flowchart for detecting a user input based on first touch information and second touch information in an electronic device according to various embodiments of the present disclosure.
  • FIG. 9 is a flowchart for detecting a long press touch input in an electronic device according to various embodiments of the present disclosure.
  • FIG. 10 is a flowchart for preventing touch misrecognition of a first sensor in an electronic device according to various embodiments of the present disclosure.
  • FIG. 11 is an example of touch information acquired through a first sensor in an electronic device according to various embodiments of the present disclosure.
  • FIG. 12 is a flowchart for preventing touch misrecognition by a first sensor and a second sensor in an electronic device according to various embodiments of the present disclosure.
  • FIG. 13 is an example of touch information acquired through a first sensor and a second sensor in an electronic device according to various embodiments of the present disclosure.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments of the present disclosure.
  • According to various embodiments, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101. In some embodiments, some of these components may be integrated into one component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the result data in the non-volatile memory 134.
  • According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor 121.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application-executing) state. According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 101 itself on which the artificial intelligence is performed, or through a separate server (e.g., the server 108). The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or a deep Q-network, or a combination of two or more thereof, but is not limited to the above examples.
  • According to an embodiment, the artificial intelligence model may, additionally or alternatively, include a software structure in addition to the hardware structure.
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input module 150 may receive a command or data to be used in a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output module 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver may be used to receive an incoming call. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display module 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • the display module 160 may include, for example, a control circuit for controlling a display, a hologram device, or a projector and a corresponding device.
  • the display module 160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of force generated by the touch.
  • The audio module 170 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input module 150, or may output a sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) directly or wirelessly connected to the electronic device 101.
  • The sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and generate an electrical signal or data value corresponding to the detected state.
  • According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • The battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and may support communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • A corresponding communication module among these communication modules may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a telecommunication network such as a LAN or a WAN)).
  • The wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • The wireless communication module 192 may support a 5G network after a 4G network, and a next-generation communication technology, for example, new radio (NR) access technology.
  • The NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • The wireless communication module 192 may support a high-frequency band (e.g., a mmWave band), for example, to achieve a high data rate.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101 , an external electronic device (eg, the electronic device 104 ), or a network system (eg, the second network 199 ).
  • The wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for realizing URLLC.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • According to an embodiment, the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the at least one selected antenna. According to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) in addition to the radiator may be additionally formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • According to an embodiment, the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (e.g., a bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., a mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second surface (e.g., a top or side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • At least some of the above-described components may be interconnected and exchange signals (e.g., commands or data) with each other through an inter-peripheral communication method (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)).
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 or 104 may be the same as or different from the electronic device 101 .
  • all or a part of operations executed in the electronic device 101 may be executed in one or more external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a certain function or service, the electronic device 101 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least a part of the function or service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • The electronic device 101 may process the result as is or additionally, and provide the processed result as at least a part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or the server 108 may be included in the second network 199 .
  • the electronic device 101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • The electronic device according to various embodiments of this document may be one of various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • Terms such as "first" and "second" may be used simply to distinguish a component from other components, and do not limit the components in other aspects (e.g., importance or order). When one (e.g., first) component is referred to, with or without the term "functionally" or "communicatively", as "coupled" or "connected" to another (e.g., second) component, it means that the one component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • The term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrally formed part or a minimum unit or a part of the part that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium readable by a machine (e.g., the electronic device 101). For example, a processor (e.g., the processor 120) of the device (e.g., the electronic device 101) may call at least one of the one or more stored instructions from the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • Here, 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (e.g., an electromagnetic wave); this term does not distinguish between a case in which data is semi-permanently stored in the storage medium and a case in which data is temporarily stored.
  • the method according to various embodiments disclosed in this document may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (eg, a compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online via an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • a part of the computer program product may be temporarily stored or temporarily generated in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
  • each component (eg, module or program) of the above-described components may include a singular entity or a plurality of entities, and some of the plurality of entities may be separately disposed in other components.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • a plurality of components (eg, modules or programs) may be integrated into a single component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to how the corresponding component performed them prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • FIG. 2A is a perspective view illustrating an electronic device 200 according to various embodiments.
  • FIG. 2B is a perspective view illustrating the electronic device 200 of FIG. 2A as viewed from another direction according to various embodiments of the present disclosure.
  • the electronic device 200 of FIGS. 2A and 2B may be at least partially similar to the electronic device 101 of FIG. 1 , or may further include other embodiments of the electronic device.
  • the electronic device 200 may include a housing 201 including a first case 201a and a second case 201b.
  • the electronic device 200 may include various structures and/or electric components accommodated in a space (eg, an internal space) between the first case 201a and the second case 201b.
  • the housing 201 may be formed in a form that can be worn on the user's ear (eg, 600 in FIG. 6 ).
  • the first case 201a and/or the second case 201b may be formed of ceramic, polymer, metal, or a combination of at least two of these, and may include at least one coating layer formed on the outer surface or the inner surface.
  • when the electronic device 200 is worn on the user's body, the first case 201a may include a first outer surface 203a that comes into contact with the user's body.
  • when the electronic device 200 is worn on the user's body, the second case 201b may face the external space on the side opposite to the user's body.
  • depending on the shape of the user's body or the state in which the electronic device 200 is worn, a portion of the second outer surface 203b of the second case 201b may be exposed to the external space, and the remaining portion may be concealed by the user's body.
  • the remaining portion of the second outer surface 203b that is hidden by the user's body may be in contact with the user's body.
  • the electronic device 200 may include a plurality of acoustic holes 211a, 211b, and 211c, an acoustic groove 213, and/or a plurality of electrodes 241a.
  • the first acoustic hole 211a and the second acoustic hole 211b among the plurality of acoustic holes 211a, 211b, and 211c may be formed at different positions of the second case 201b.
  • the electronic device 200 may include at least one microphone disposed in the inner space of the housing 201 to receive external sound (eg, a user's voice or the sound of the surrounding environment).
  • the acoustic groove 213 may include a recess structure in which a partial area of the second outer surface 203b of the second case 201b is formed lower than the outer surface.
  • the acoustic groove 213 may have an outer diameter wider than the outer diameter of the second acoustic hole 211b.
  • the acoustic groove 213 may have a shape extending in one direction from the position where the second acoustic hole 211b is formed.
  • the acoustic groove 213 may have a rectangular shape with rounded corners including the circular second acoustic hole 211b.
  • the shapes of the second acoustic hole 211b and the acoustic groove 213 are not limited thereto, and may be implemented in various ways.
  • the second acoustic hole 211b may be formed in an elliptical shape, and the acoustic groove 213 may have an elongated oval shape including the second acoustic hole 211b.
  • the second acoustic hole 211b may extend from the acoustic groove 213 to the inside of the second case 201b.
  • the second acoustic hole 211b, together with the acoustic groove 213, may form a sound path through which an external sound is received by a microphone disposed in the inner space of the housing 201.
  • the third acoustic hole 211c among the plurality of acoustic holes 211a, 211b, and 211c may be formed in the first case 201a.
  • the electronic device 200 may include at least one speaker that is disposed in the inner space of the housing 201 and radiates sound to the outside through the third sound hole 211c.
  • the third sound hole 211c may be commonly used for a speaker and a microphone disposed in the inner space of the housing 201 .
  • the third sound hole 211c may form a path through which the microphone receives external sound while radiating the sound (or sound) output from the speaker to the outside.
  • the electronic device 200 may include a dummy hole 211d formed in a position different from the third sound hole 211c in the first case 201a.
  • the dummy hole 211d may form a path through which a microphone disposed in the inner space of the housing 201 receives external sound.
  • the dummy hole 211d may adjust the pressure inside the ear canal to correspond to the pressure of the external environment, together with the vent hole 215 while the electronic device 200 is worn by the user.
  • the plurality of electrodes 241a are electrodes for receiving charging power and may be exposed to the outside through the first outer surface 203a of the first case 201a. According to an embodiment, since the first outer surface 203a of the first case 201a substantially contacts the user's body while the electronic device 200 is worn, the plurality of electrodes 241a may be visually concealed.
  • the electronic device 200 may include an optical window 219 and/or a vent hole 215 .
  • the optical window 219 may be exposed to the outside through the first outer surface 203a of the first case 201a.
  • the electronic device 200 may include a sensor (eg, a proximity sensor) disposed in the inner space of the housing 201 to correspond to the optical window 219, and may thereby detect whether the electronic device 200 is worn on the user's body.
  • the vent hole 215 is formed in the second case 201b and may be exposed to the external space while the electronic device 200 is worn by the user. For example, when heat is generated by electrical components disposed in the inner space of the housing 201 , the vent hole 215 may induce or radiate heat inside the housing 201 to the outside.
  • the electronic device 200 may include a touch sensing sensor 220 and an acceleration sensor 230 disposed in the inner space of the housing 201 .
  • the touch sensing sensor 220 may detect a user's touch input based on a change in capacitance that occurs when the user touches at least a part of the second case 201b.
  • the acceleration sensor 230 may collect data related to the movement of the electronic device 200 .
  • the acceleration sensor 230 may collect data related to the movement of the electronic device 200 that occurs when the user touches at least a portion of the second case 201b.
  • the touch sensing sensor 220 and/or the acceleration sensor 230 are not limited thereto, and may be disposed in various positions.
  • FIG. 3 is a perspective view illustrating an electronic device 300 according to various embodiments.
  • the electronic device 300 of FIG. 3 may be at least partially similar to the electronic device 101 of FIG. 1 , or may further include other embodiments of the electronic device.
  • the electronic device 300 may include a frame 310, a first transparent display 330, and a second transparent display 340.
  • the first transparent display 330 and the second transparent display 340 may be positioned in front of the user's eyes.
  • the frame 310 may be implemented in various other forms that can be mounted on the user's head while positioning the first transparent display 330 and the second transparent display 340 in front of the user's eyes.
  • the frame 310 may be referred to as a 'housing'.
  • the frame 310 may include a first rim 311, a second rim 312, a bridge 313, a first end piece 314, a second end piece 315, a first leg 316, a second leg 317, or a nose pad 318.
  • the first rim 311 may surround and support at least a part of the first transparent display 330 .
  • the second rim 312 may surround and support at least a portion of the second transparent display 340 .
  • the bridge 313 may connect the first rim 311 and the second rim 312 and may be placed on the user's nose.
  • the first end piece 314 may connect the first rim 311 and the first leg 316.
  • the second end piece 315 may connect the second rim 312 and the second leg 317 .
  • the first leg 316 is a foldable part connected to the first end piece 314 by a hinge, and its end portion may be formed to be bent behind the left ear so that it can be placed on the left ear.
  • the second leg 317 is a foldable part connected to the second end piece 315 by a hinge, and its end portion may be formed to be bent behind the right ear so that it can be placed on the right ear.
  • the nose pad 318 may be a part that supports the frame 310 on the user's nose.
  • the frame 310 may be formed of a material such as plastic for wearability of the electronic device 300, but is not limited thereto and may be formed of various other materials, such as metal, in consideration of strength or aesthetics.
  • when the frame 310 is mounted on the user's head, the first transparent display 330 may be positioned in front of the user's left eye, and the second transparent display 340 may be positioned in front of the user's right eye.
  • the user may recognize a foreground (eg, an actual image) through the first transparent display 330 and the second transparent display 340 .
  • the electronic device 300 may output at least one virtual image to the first transparent display 330 and/or the second transparent display 340 so that a user wearing the electronic device 300 can see the virtual image superimposed on the foreground of the real space.
  • the first transparent display 330 and/or the second transparent display 340 may be a projection type transparent display.
  • the first transparent display 330 may form a reflective surface on a transparent plate (or transparent screen). In this case, the first transparent display 330 may reflect the virtual image on the reflective surface so that it enters the user's eyes.
  • the second transparent display 340 may be implemented in substantially the same manner as the first transparent display 330 .
  • the electronic device 300 may include a first projector (not shown) that projects light related to an image to the first transparent display 330 .
  • the first transparent display 330 may be implemented as a transparent plate (or transparent screen) that displays an image projected from the first projector while the user sees the foreground in front of the user.
  • the first projector may be positioned on the first rim 311 or the first endpiece 314 .
  • the second transparent display 340 may be implemented in substantially the same manner as the first transparent display 330 .
  • the first transparent display 330 and/or the second transparent display 340 may be a direct-view type transparent display.
  • the direct-view type transparent display may include a transparent organic light emitting diode (OLED) display or a transparent liquid crystal display (LCD).
  • the first transparent display 330 and/or the second transparent display 340 may be a transparent near-eye display (NED).
  • the first transparent display 330 and/or the second transparent display 340 may be implemented as an optical screen that allows a projected image to be clearly seen even in a bright environment.
  • the electronic device 300 may include a touch sensing sensor 350 and an acceleration sensor 360 disposed in a partial region of the frame 310 (eg, the second end piece 315 and/or the second leg 317).
  • the touch sensing sensor 350 may detect a user's touch input based on a change in capacitance generated when the user touches at least a part of the second leg 317.
  • the acceleration sensor 360 may collect data related to the movement of the electronic device 300 .
  • the acceleration sensor 360 may collect data related to the movement of the electronic device 300 that occurs when the user touches at least a part of the second leg 317.
  • the touch sensing sensor 350 and/or the acceleration sensor 360 are not limited thereto, and may be disposed in various positions.
  • the touch sensing sensor 350 and/or the acceleration sensor 360 may be disposed in the first end piece 314, the second end piece 315, the first leg 316, and/or the second leg 317.
  • the electronic device 400 of FIG. 4 may be at least partially similar to the electronic device 101 of FIG. 1, the electronic device 200 of FIG. 2A, or the electronic device 300 of FIG. 3, or may further include other embodiments of the electronic device.
  • the electronic device 400 may include a processor 410 , a first sensor 420 , a second sensor 430 , and/or a memory 440 .
  • the processor 410 may be substantially the same as the processor 120 of FIG. 1 or may be included in the processor 120 .
  • the first sensor 420 and/or the second sensor 430 may be substantially the same as the sensor module 176 of FIG. 1 , or may be included in the sensor module 176 .
  • the memory 440 may be substantially the same as the memory 130 of FIG. 1 or may be included in the memory 130 .
  • the first sensor 420 may collect data related to the movement of the electronic device 400 .
  • the first sensor 420 may detect an acceleration generated due to the movement of the electronic device 400 .
  • the acceleration generated due to the movement of the electronic device 400 may be obtained based on the amount of change in three axes (eg, X-axis, Y-axis, and Z-axis) of the first sensor 420 .
  • the movement of the electronic device 400 may be generated when an external object (eg, a finger) contacts the electronic device 400 .
  • the first sensor 420 may include the acceleration sensor 230 of FIG. 2B or the acceleration sensor 360 of FIG. 3 .
  • the first sensor 420 may generate first touch information based on data related to the movement of the electronic device 400 .
  • the first sensor 420 may detect the amount of movement of the electronic device 400 based on the amount of change in three axes (eg, the X-axis, the Y-axis, and the Z-axis).
  • the first sensor 420 may generate first touch information including the number of touches by an external object detected based on the magnitude of the movement of the electronic device 400. For example, when the magnitude of the movement of the electronic device 400 exceeds the specified first size, the first sensor 420 may determine that a touch (or touch input) by an external object has occurred.
  • when the first sensor 420 detects a first touch and a second touch is not detected within the specified first maximum time (eg, about 300 ms), it may determine that the number of touches by the external object is one. For example, when the first sensor 420 detects the second touch within the specified first maximum time (eg, about 300 ms) after detecting the first touch, it may check whether a third touch is detected within the specified first maximum time based on the detection time of the second touch. After detecting the second touch, if the third touch is not detected within the specified first maximum time (eg, about 300 ms), the first sensor 420 may determine that the number of touches by the external object is two.
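The tap-counting rule above can be sketched in code. This is an illustrative sketch only, not the patented implementation; the names and values (`count_taps`, `FIRST_MAX_GAP_S`, `MOVEMENT_THRESHOLD`) are assumptions chosen for the example.

```python
# Sketch of the tap-counting rule: a touch is registered when the movement
# magnitude exceeds a threshold, and counting stops as soon as no further
# touch arrives within the specified first maximum time (about 300 ms).

FIRST_MAX_GAP_S = 0.3      # hypothetical "specified first maximum time"
MOVEMENT_THRESHOLD = 1.5   # hypothetical "specified first size"

def count_taps(events):
    """events: list of (timestamp_s, movement_magnitude), sorted by time.

    Returns the number of touches in the first completed tap sequence.
    """
    taps = [t for t, mag in events if mag > MOVEMENT_THRESHOLD]
    if not taps:
        return 0
    count = 1
    for prev, cur in zip(taps, taps[1:]):
        if cur - prev <= FIRST_MAX_GAP_S:
            count += 1        # next touch arrived inside the window
        else:
            break             # window expired: the sequence is complete
    return count
```

For example, two above-threshold events 200 ms apart count as a double tap, while events 500 ms apart count as two separate single taps.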
  • the designated first size may include a reference size for determining whether the movement size of the electronic device 400 obtained through the first sensor 420 is related to a touch (or a touch input).
  • the designated first size may be changed based on the touch sensitivity of the electronic device 400 .
  • the specified first maximum time may include a maximum reference time (eg, about 300 ms) set to check whether a next touch exists using the first sensor 420 .
  • a touch is a contact of an external object (eg, a finger) with the electronic device 400 that is determined to be related to a user input, and may include a gesture in which the external object contacts a partial area of the electronic device 400 and then releases the contact in the corresponding area without substantially moving.
  • a touch may be referred to as a tap.
  • a contact of the external object determined by the first sensor 420 to be related to a user input may indicate a contact for which the magnitude of movement of the electronic device 400 caused by the contact exceeds the specified first size.
  • to reduce touch misrecognition, the first sensor 420 may exclude one of its three axes (eg, the Y-axis among the X-, Y-, and Z-axes) and detect the magnitude of the movement of the electronic device 400 based on the amount of change in the other two axes (eg, the X-axis and the Z-axis).
  • the axis excluded from detecting the movement magnitude of the electronic device 400 may be the axis, among the three axes of the first sensor 420, that is parallel to the direction of gravity when the electronic device 400 is worn by the user.
  • the axis in the direction parallel to the direction of gravity may be identified among the three axes of the first sensor 420 after the axes are corrected based on the posture in which the device is worn by the user.
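A minimal sketch of the two-axis magnitude computation described above; the axis assignment (Y as the gravity-parallel axis) and the function name are illustrative assumptions, not the patented implementation.

```python
import math

def movement_magnitude(ax, ay, az, excluded_axis="y"):
    """Magnitude of device movement, ignoring the gravity-parallel axis.

    Dropping the axis assumed parallel to gravity while worn (Y here)
    reduces touch misrecognition caused by ordinary head movement.
    """
    components = {"x": ax, "y": ay, "z": az}
    components.pop(excluded_axis)  # exclude the gravity-parallel axis
    return math.sqrt(sum(v * v for v in components.values()))
```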
  • the first sensor 420 may determine whether a detected touch is valid based on data related to the movement of the electronic device 400. For example, when the first sensor 420 detects a second touch within the specified first maximum time after detecting a first touch determined to be valid, it may check whether the difference between the detection time of the first touch and the detection time of the second touch (eg, the time interval of touch detection) satisfies the specified second time. For example, the first sensor 420 may determine that the second touch is valid when the time interval of touch detection satisfies the specified second time. For example, the state satisfying the specified second time may include a state in which the time interval of touch detection exceeds the specified second time.
  • the first sensor 420 may determine that the second touch is invalid when the time interval of touch detection does not satisfy the specified second time.
  • the state in which the specified second time period is not satisfied may include a state in which the time interval of touch detection is equal to or less than the specified second time period.
  • the first touch information may include information related to the number of touches determined to be valid.
  • the designated second time period may include a minimum time interval of touch detection for determining an erroneous touch sensed through the first sensor 420 .
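The validity check above is essentially a debounce filter: a touch that follows the previous valid touch within the minimum interval is dropped as erroneous. A minimal sketch, with a hypothetical value for the "specified second time":

```python
MIN_INTERVAL_S = 0.05  # hypothetical "specified second time" (minimum valid gap)

def filter_valid_touches(timestamps):
    """Keep only touches whose gap from the last valid touch exceeds
    the minimum interval; closer touches are treated as erroneous."""
    valid = []
    for t in timestamps:
        if not valid or t - valid[-1] > MIN_INTERVAL_S:
            valid.append(t)
    return valid
```

With these values, touches at 0 ms, 20 ms, and 200 ms reduce to two valid touches: the 20 ms event is discarded as bounce.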
  • the second sensor 430 may collect data related to a contact of an external object (eg, a finger) with respect to the electronic device 400 .
  • the second sensor 430 may detect a change in capacitance caused by the contact of an external object (eg, a finger) with a partial area of the outer surface of the electronic device 400 (eg, the second outer surface 203b of the second case 201b of FIG. 2B or the second leg 317 of FIG. 3).
  • the contact of the external object may include a state in which the external object approaches the outer surface of the electronic device 400 within a threshold distance.
  • the second sensor 430 may include the touch detection sensor 220 of FIG. 2B or the touch detection sensor 350 of FIG. 3 .
  • the second sensor 430 may generate second touch information based on data related to a contact of an external object (eg, a finger) with respect to the electronic device 400 .
  • when the change in capacitance exceeds the specified second size, the second sensor 430 may determine that the contact of the external object is a touch (or touch input) related to a user input.
  • the second sensor 430 may generate second touch information including information related to the number of touches detected based on the change in capacitance. For example, when the second sensor 430 detects a first touch and then does not detect a second touch within the specified second maximum time (eg, about 300 ms), it may determine that the number of touches by the external object is one.
  • when the second sensor 430 detects the second touch within the specified second maximum time after detecting the first touch, it may check whether a third touch is detected within the specified second maximum time based on the detection time of the second touch. After detecting the second touch, if the third touch is not detected within the specified second maximum time (eg, about 300 ms), the second sensor 430 may determine that the number of touches by the external object is two.
  • the designated second size may include a reference size for determining whether the change in capacitance obtained through the second sensor 430 is related to the touch input.
  • the designated second size may be changed based on the touch sensitivity of the electronic device 400 .
  • the designated second maximum time may include a maximum reference time (eg, about 300 ms) set to check whether a next touch exists using the second sensor 430 .
  • the second sensor 430 may detect a long press touch input based on data related to a contact of an external object (eg, a finger) with respect to the electronic device 400 .
  • the second sensor 430 may determine that a long press touch input is detected when a change in capacitance exceeding a specified second size is continuously detected for a specified first time period.
  • the designated first time may include a reference touch holding time for determining a long press touch input using the second sensor 430 .
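The long-press rule above can be sketched as a small state machine over capacitance samples. The threshold and hold-time values are illustrative assumptions (the document gives no numbers for the "specified second size" or "specified first time").

```python
LONG_PRESS_HOLD_S = 0.5       # hypothetical "specified first time"
CAPACITANCE_THRESHOLD = 10.0  # hypothetical "specified second size"

def is_long_press(samples):
    """samples: list of (timestamp_s, capacitance_change), sorted by time.

    Returns True when the capacitance change stays above the threshold
    continuously for at least the hold time.
    """
    start = None
    for t, delta in samples:
        if delta > CAPACITANCE_THRESHOLD:
            if start is None:
                start = t            # contact began: start the timer
            if t - start >= LONG_PRESS_HOLD_S:
                return True
        else:
            start = None             # contact released: restart the timer
    return False
```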
  • the processor 410 may control the operatively connected first sensor 420 and/or the second sensor 430 . According to an embodiment, when the electronic device 400 is activated, the processor 410 may control the first sensor 420 and/or the second sensor 430 to be activated.
  • the processor 410 may detect a user input related to the contact of an external object with respect to the electronic device 400 based on the first touch information received from the first sensor 420 and the second touch information received from the second sensor 430.
  • when the first touch information and the second touch information satisfy a specified second condition, the processor 410 may detect a user input related to the contact of an external object based on the first touch information (eg, the number of touches included in the first touch information). For example, when the number of touches included in the first touch information is one, the user input may be determined to be a single tap input. For example, when the number of touches is two, the user input may be determined to be a double tap input. For example, when the number of touches is three, the user input may be determined to be a triple tap input.
  • the specified second condition may include a state in which the number of touches included in the first touch information (T_acc) exceeds 1 (T_acc > 1) and the number of touches included in the second touch information (T_TSP) exceeds 0 (T_TSP > 0).
  • a state in which the number of touches exceeds 1 may include a state in which two or more touches are detected using the first sensor 420 .
  • a state in which the number of touches exceeds 0 may include a state in which one or more touches are detected using the second sensor 430 .
  • when the first touch information and the second touch information satisfy a specified third condition, the processor 410 may detect a user input related to the contact of an external object based on the second touch information (eg, the number of touches included in the second touch information). For example, when the number of touches included in the second touch information is one, the user input may be determined to be a single tap input. For example, when the number of touches is two, the user input may be determined to be a double tap input. For example, when the number of touches is three, the user input may be determined to be a triple tap input.
  • when the first touch information and the second touch information satisfy neither the specified second condition nor the specified third condition, the processor 410 may determine that the touch is misrecognized. In this case, the processor 410 may discard the touch detected through the first sensor 420 and the second sensor 430.
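The fusion rules described above (the second condition, the third condition, and the misrecognition cases) can be sketched as a small classifier. The rule boundaries here are assembled from the examples in this document and are illustrative, not a definitive statement of the claims.

```python
TAP_NAMES = {1: "single tap", 2: "double tap", 3: "triple tap"}

def classify_user_input(t_acc, t_tsp):
    """Fuse the accelerometer tap count (t_acc) and the capacitive
    touch-sensor tap count (t_tsp) into a user-input label.

    Returns None when the contact is treated as a misrecognized touch.
    """
    if t_tsp == 0:
        return None                  # no capacitive confirmation: discard
    if t_acc > 1:
        return TAP_NAMES.get(t_acc)  # second condition: trust the accel count
    if t_acc == 0:
        # accel saw nothing: accept a lone capacitive touch, discard bursts
        return "single tap" if t_tsp == 1 else None
    return TAP_NAMES.get(t_tsp)      # t_acc == 1: trust the touch-sensor count
```

For example, `classify_user_input(3, 1)` yields a triple tap from the accelerometer count, while `classify_user_input(0, 2)` is discarded as a misrecognition.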
  • the processor 410 may update the first touch information based on the generation times of the first touch information and the second touch information. For example, since the first sensor 420 has a relatively higher sampling rate than the second sensor 430, the first touch information is ordinarily generated earlier than the second touch information. Accordingly, when the first touch information is generated later than the second touch information, the processor 410 may determine that an erroneous touch was recognized through the first sensor 420 and update the number of touches included in the first touch information. For example, the processor 410 may decrease the number of touches included in the first touch information by one. For example, when the first touch information is updated, the processor 410 may detect a user input related to a contact of an external object with respect to the electronic device 400 based on the updated first touch information and the second touch information.
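A sketch of this update step, under the assumption that a generation timestamp accompanies each piece of touch information (the function and parameter names are illustrative):

```python
def update_first_touch_count(t_acc, acc_time_s, t_tsp, tsp_time_s):
    """Decrement the accelerometer tap count when its touch information
    was generated later than the touch sensor's.

    The accelerometer samples faster, so its information should be
    generated first; arriving later suggests one spurious touch.
    """
    if acc_time_s > tsp_time_s and t_acc > 0:
        t_acc -= 1
    return t_acc
```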
  • when the processor 410 receives information related to a long press touch input from the second sensor 430, it may determine that the user input related to the contact of the external object with respect to the electronic device 400 is a long press touch input, irrespective of the first touch information received from the first sensor 420.
  • the memory 440 may store various data used by at least one component of the electronic device 400 (eg, the processor 410, the first sensor 420, or the second sensor 430).
  • the data may include reference information for determining a touch by the first sensor 420 and/or the second sensor 430, a reference time for determining the number of touches, a reference time for determining an erroneous touch by the first sensor 420, and/or a reference time for determining a long press touch input by the second sensor 430.
  • the data may include information related to a criterion by which the processor 410 determines a user input.
  • the electronic device 400 may further include a communication circuit (not shown).
  • the communication circuit may be substantially the same as the communication module 190 of FIG. 1 or may be included in the communication module 190 .
  • the communication circuit may establish communication with an external device (eg, the electronic device 102 of FIG. 1 ) using wireless and/or wired resources.
  • the communication circuit may transmit information related to a user input determined by the processor 410 to an external device.
  • the communication circuit may establish communication with an external device through Bluetooth, bluetooth low energy (BLE), wireless LAN, and/or a cellular communication method (eg, long term evolution (LTE) or new radio (NR)).
  • the processor 410 may generate the first touch information and/or the second touch information based on sensor data received from the first sensor 420 and/or the second sensor 430. According to an embodiment, the processor 410 may generate the first touch information based on data related to the movement of the electronic device 400 received from the first sensor 420. According to an embodiment, the processor 410 may generate the second touch information based on data related to a contact of an external object (eg, a finger) with respect to the electronic device 400 received from the second sensor 430.
  • when the electronic device 400 can be worn on the user's ears, the electronic device 400 may be a main device among a first device and a second device worn on both of the user's ears.
  • the electronic device 400 (eg, the main device) may determine a user input related to a contact of an external object with respect to the secondary device based on the second touch information.
  • according to various embodiments, an electronic device (eg, the electronic device 101 of FIG. 1, the electronic device 200 of FIG. 2A, the electronic device 300 of FIG. 3, or the electronic device 400 of FIG. 4) may include a housing (eg, the housing 201 of FIG. 2A or the frame 310 of FIG. 3), an acceleration sensor disposed in the inner space of the housing (eg, the sensor module 176 of FIG. 1, the acceleration sensor 230 of FIG. 2B, the acceleration sensor 360 of FIG. 3, or the first sensor 420 of FIG. 4), a touch sensing sensor disposed in the inner space (eg, the sensor module 176 of FIG. 1, the touch sensing sensor 220 of FIG. 2B, the touch sensing sensor 350 of FIG. 3, or the second sensor 430 of FIG. 4), and a processor operatively connected to the acceleration sensor and the touch sensing sensor.
  • when the number of touches included in the first touch information is two and the number of touches included in the second touch information is one, the processor may determine, based on the number of touches included in the first touch information, that the user input corresponding to the user's contact is a double tap input; when the number of touches included in the first touch information is three and the number of touches included in the second touch information is one, the processor may determine, based on the number of touches included in the first touch information, that the user input corresponding to the user's contact is a triple tap input.
  • when the processor fails to detect a touch through the acceleration sensor, or when the number of touches included in the first touch information is one and the number of touches included in the second touch information is one, the processor may determine that the user input is a single tap input.
  • when the number of touches included in the first touch information is one and the number of touches included in the second touch information is one or more, the processor may determine the user input based on the number of touches included in the second touch information.
  • when the processor fails to detect a touch through the acceleration sensor and the number of touches included in the second touch information is two or more, the processor may determine that a touch misrecognition has occurred.
  • when it is determined, based on the second touch information, that the user's contact time with the housing satisfies a specified first time, the processor may determine the user input based on the second touch information.
  • the acceleration sensor may generate the first touch information based on changes in axes other than a first axis parallel to the direction of gravity among a plurality of axes included in the acceleration sensor.
  • the acceleration sensor may check a difference between detection times of the first touch and the second touch, discard the second touch when the difference does not satisfy the specified second time, and generate the first touch information based on the first touch excluding the second touch.
  • the processor may identify an acquisition time of the first touch information and an acquisition time of the second touch information, and update the first touch information when the acquisition time of the second touch information is earlier than the acquisition time of the first touch information.
  • a communication circuit may be further included, wherein the communication circuit may transmit information related to the user input to an external device.
  • FIG. 5 is a flowchart 500 for detecting a user input in an electronic device according to various embodiments of the present disclosure.
  • Operations in the following embodiments may be sequentially performed, but are not necessarily sequentially performed. For example, the order of the operations may be changed, and at least two operations may be performed in parallel.
  • the electronic device of FIG. 5 may be the electronic device 101 of FIG. 1 , the electronic device 200 of FIG. 2A , the electronic device 300 of FIG. 3 , or the electronic device 400 of FIG. 4 .
  • FIG. 6 illustrates a state in which an electronic device is coupled to a user's ear according to various embodiments of the present disclosure.
  • FIGS. 7A and 7B are examples of first touch information acquired through a first sensor in an electronic device according to various embodiments of the present disclosure.
  • an electronic device may acquire first touch information related to a contact of an external object using a first sensor (eg, an acceleration sensor) in operation 501 .
  • when the movement size of the electronic device 400 obtained through the first sensor 420 (eg, an acceleration sensor) exceeds a specified first size, the first sensor 420 may determine that a touch (or touch input) by an external object has occurred.
  • the first sensor 420 may generate first touch information including the detected number of touches based on the amount of movement of the electronic device 400 and transmit it to the processor 410 .
  • the number of touches may be determined based on whether a next touch is detected within a specified first maximum time from the time at which the touch is sensed.
  • the specified first maximum time may include a maximum reference time (eg, about 300 ms) set to check whether a next touch exists using the first sensor 420 .
  • the designated first size may include a reference size for determining whether the movement size of the electronic device 400 obtained through the first sensor 420 is related to the touch input.
  • to reduce touch misrecognition, the first sensor 420 may exclude one of its three axes (eg, the X-axis, Y-axis, and Z-axis) (eg, the Y-axis) and detect the magnitude of the movement of the electronic device 400 based on the amount of change in the other two axes (eg, the X-axis and the Z-axis).
  • as shown in FIG. 7A, when the amount of change in all three axes is used, the first sensor 420 may detect the magnitude of movement of the electronic device 400 for the single tap 700 , the double tap 702 , and the triple tap 704 .
  • misrecognition information 710 may be generated by the movement of the user wearing the electronic device 400 .
  • as shown in FIG. 7B , when one axis (eg, the Y-axis) is excluded from the amount of change of the three axes, the first sensor 420 may detect the single tap 720, the double tap 722, and the triple tap 724 while the generation of the misrecognition information 710 of FIG. 7A is reduced.
  • the motion magnitude (AccFeature) of the electronic device 400 may be detected using Equation 1 using two axes (eg, an X-axis and a Z-axis).
  • in Equation 1, AccFeature represents the motion magnitude of the electronic device 400 estimated based on sensor data sensed through the first sensor 420 (eg, an acceleration sensor), the first term represents the magnitude of the amount of change in the X-axis, the second term represents the magnitude of the amount of change in the Z-axis, and k is a sampling index of the first sensor 420.
  • the axis excluded from detecting the motion magnitude of the electronic device 400 may include, among the three axes of the first sensor 420 corrected based on the shape in which the device is worn by the user, the axis in a direction parallel to the direction of gravity in the state 600 in which the electronic device 200 is worn by the user.
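The motion-magnitude computation described around Equation 1 can be sketched as follows. This is a minimal illustration, assuming AccFeature sums the absolute sample-to-sample changes of the X-axis and Z-axis accelerometer samples while the gravity-parallel axis (here assumed to be the Y-axis) is excluded; the function name and sample values are illustrative, not from the patent.

```python
def acc_feature(ax, az, k):
    """Estimate the motion magnitude (AccFeature) at sampling index k
    from X-axis and Z-axis accelerometer samples, ignoring the axis
    parallel to gravity to reduce touch misrecognition."""
    # |change in X| + |change in Z| between consecutive samples
    return abs(ax[k] - ax[k - 1]) + abs(az[k] - az[k - 1])

# Illustrative samples: a sharp tap shows up as a large change on X and Z,
# while slow wearer movement produces only small changes.
ax = [0.0, 0.1, 2.5, 0.2]   # X-axis samples (illustrative)
az = [0.0, 0.0, 1.5, 0.1]   # Z-axis samples (illustrative)
features = [acc_feature(ax, az, k) for k in range(1, len(ax))]
```

A touch would then be declared when `acc_feature` exceeds the specified first size (the reference size mentioned above).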
  • the single tap 700 or 720 may include a state in which the number of touches is one.
  • the double tap 702 or 722 may include a state in which the number of touches is two.
  • the triple tap 704 or 724 may include a state in which the number of touches is three.
  • the electronic device may acquire second touch information related to the contact of the external object using a second sensor (eg, a touch sensing sensor).
  • the second sensor 430 may determine that a touch (or a touch input) by an external object has been detected when the change in capacitance exceeds a specified second size.
  • the second sensor 430 may generate second touch information including the detected number of touches based on the change in capacitance and transmit the generated second touch information to the processor 410 .
  • the number of touches may be determined based on whether a next touch is detected within a specified second maximum time from the time at which the touch is sensed.
  • the designated second maximum time may include a maximum reference time (eg, about 300 ms) set to check whether a next touch exists using the second sensor 430 .
  • the designated second size may include a reference size for determining whether a change in capacitance obtained through the second sensor 430 is related to a touch input.
  • the electronic device may determine a user input related to the contact of the external object based on the first touch information obtained using the first sensor (eg, an acceleration sensor) and the second touch information obtained using the second sensor (eg, a touch sensing sensor).
  • the processor 410 may determine a user input based on the number of touches included in the first touch information and the second touch information.
  • when the number of touches included in the first touch information is different from the number of touches included in the second touch information, the processor 410 may determine a user input based on the number of touches included in the first touch information or the second touch information.
  • when the electronic device 400 detects a user input related to a contact of an external object (eg, a finger) with respect to the electronic device 400 , the electronic device 400 may transmit information related to the user input to an external device (eg, a smartphone).
  • the information related to the user input may include information related to a command for controlling an external device based on the user input.
  • FIG. 8 is a flowchart 800 for detecting a user input based on first touch information and second touch information in an electronic device according to various embodiments of the present disclosure.
  • the operations of FIG. 8 may be detailed operations of operation 505 of FIG. 5 .
  • Operations in the following embodiments may be sequentially performed, but are not necessarily sequentially performed.
  • the order of the operations may be changed, and at least two operations may be performed in parallel.
  • the electronic device of FIG. 8 may be the electronic device 101 of FIG. 1 , the electronic device 200 of FIG. 2A , the electronic device 300 of FIG. 3 , or the electronic device 400 of FIG. 4 .
  • when the first touch information and the second touch information satisfy a specified first condition, it may be determined that the user input related to the contact of the external object is a single tap input.
  • the single tap input may include an input in which a gesture of touching a certain area of the electronic device 400 with an external object and then releasing the contact without movement occurs once.
  • the electronic device may check whether the first touch information and the second touch information satisfy a specified second condition. For example, the specified second condition may include a state in which the number of touches included in the first touch information (T_acc) is two or more (T_acc > 1) and the number of touches included in the second touch information (T_TSP) is one or more (T_TSP > 0).
  • when the specified second condition is satisfied, the processor 410 may detect a user input related to the contact of the external object based on the number of touches included in the first touch information. For example, when the number of touches included in the first touch information is one, the user input may be determined to be a single tap input. For example, when the number of touches included in the first touch information is two, the user input may be determined to be a double tap input. For example, when the number of touches included in the first touch information is three, the user input may be determined to be a triple tap input.
  • when the first touch information and the second touch information satisfy a specified third condition (eg, 'Yes' in operation 809), the electronic device (eg, the processor 120 or 410 ) may detect a user input related to the contact of the external object based on the second touch information.
  • for example, the processor 410 may detect a user input related to the contact of the external object based on the number of touches included in the second touch information.
  • when the specified third condition is not satisfied (eg, 'No' in operation 809), the electronic device (eg, the processor 120 or 410 ) may determine that the touch is misrecognized.
  • when the first touch information and the second touch information do not satisfy the specified first condition, the specified second condition, or the specified third condition, the processor 410 may determine that the first sensor 420 and/or the second sensor 430 is in a touch misrecognition state. In this case, the processor 410 may discard the touch information detected through the first sensor 420 and the second sensor 430 .
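The decision flow of operations 801 to 811 described above can be sketched as a small classification function. The exact condition boundaries are inferred from the surrounding description (first condition: single tap; second condition: two or more acceleration-sensor touches with at least one touch-sensor touch; third condition: fall back to the touch sensor's count); the function name, `None` convention for "no touch detected", and return labels are illustrative.

```python
def classify_tap(t_acc, t_tsp):
    """Classify a user input from the touch counts reported by the
    acceleration sensor (t_acc) and the touch sensing sensor (t_tsp).
    t_acc is None when no touch was detected through the acceleration
    sensor. Returns a gesture label, or None for touch misrecognition."""
    labels = {1: "single", 2: "double", 3: "triple"}
    # First condition: single tap input
    if (t_acc is None and t_tsp == 1) or (t_acc == 1 and t_tsp == 1):
        return "single"
    # Second condition: trust the acceleration sensor's count
    if t_acc is not None and t_acc >= 2 and t_tsp >= 1:
        return labels.get(t_acc, "multi")
    # Third condition: fall back to the touch sensor's count
    if t_acc == 1 and t_tsp >= 1:
        return labels.get(t_tsp, "multi")
    # Otherwise: touch misrecognition, discard the touch information
    return None
```

For example, two touches seen by the acceleration sensor with one seen by the touch sensor would be classified from the acceleration count, matching the double-tap case above.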
  • FIG. 9 is a flowchart 900 for detecting a long press touch input in an electronic device according to various embodiments of the present disclosure.
  • the operations of FIG. 9 may be detailed operations of operation 503 of FIG. 5 .
  • Operations in the following embodiments may be sequentially performed, but are not necessarily sequentially performed.
  • the order of the operations may be changed, and at least two operations may be performed in parallel.
  • the electronic device of FIG. 9 may be the electronic device 101 of FIG. 1 , the electronic device 200 of FIG. 2A , the electronic device 300 of FIG. 3 , or the electronic device 400 of FIG. 4 .
  • an electronic device (eg, the processor 120 of FIG. 1 , the sensor module 176 of FIG. 1 , the processor 410 of FIG. 4 , or the second sensor 430 of FIG. 4 ) may detect a touch by an external object using the second sensor 430 . For example, the second sensor 430 may determine that a touch by the external object is detected when the change in capacitance exceeds the specified second size.
  • when a touch is not detected, the operations for touch detection may be terminated.
  • the electronic device (eg, the processor 120 or 410 , the sensor module 176 , or the second sensor 430 ) may check whether the holding time of the touch acquired through the second sensor (eg, a touch sensing sensor) satisfies a specified first time.
  • the state that satisfies the specified first time may include a state in which a change in capacitance exceeding the specified second size acquired through the second sensor 430 is continuously detected for the specified first time.
  • the state that does not satisfy the specified first time may include a state in which the time for which a change in capacitance exceeding the specified second size is detected through the second sensor 430 (eg, the touch holding time) is less than the specified first time.
  • the designated first time may include a reference touch holding time for determining a long press touch input using the second sensor 430 .
  • when the holding time of the touch acquired through the second sensor (eg, a touch sensing sensor) satisfies the specified first time (eg, 'Yes' in operation 903 ), the electronic device may determine that a long press touch input is detected.
  • the second sensor 430 may determine that a long press touch input is detected when a change in capacitance exceeding a specified second size is continuously detected for a specified first time period.
  • the second sensor 430 may transmit information related to a long press touch input to the processor 410 .
  • when the processor 410 receives information related to a long press touch input from the second sensor 430 , the processor 410 may determine that the user input related to the contact of the external object with respect to the electronic device 400 is a long press touch input, irrespective of the first touch information obtained using the first sensor 420 .
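The long-press determination above reduces to comparing the touch holding time against the specified first time. A minimal sketch, assuming a millisecond threshold value; the constant, its value, and the function name are illustrative, not from the patent.

```python
LONG_PRESS_MS = 500  # specified first time: reference touch holding time (assumed value)

def is_long_press(touch_start_ms, touch_end_ms):
    """A long press touch input is detected when the capacitance change
    stays above the touch threshold for at least LONG_PRESS_MS,
    irrespective of the acceleration sensor's touch information."""
    return (touch_end_ms - touch_start_ms) >= LONG_PRESS_MS
```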
  • when the holding time of the touch acquired through the second sensor (eg, a touch sensing sensor) does not satisfy the specified first time (eg, 'No' in operation 903 ), the electronic device (eg, the processor 120 or 410 , the sensor module 176 , or the second sensor 430 ) may obtain second touch information including the number of touches detected by the contact of the external object.
  • when a second touch is not detected within a specified second maximum time (eg, about 300 ms) after the first touch is detected, the second sensor 430 may determine that the number of touches of the external object is one. For example, the second sensor 430 may generate second touch information including the number of touches of one and transmit it to the processor 410 .
  • the designated second maximum time may include a maximum reference time (eg, about 300 ms) set to check whether a next touch exists using the second sensor 430 .
  • when the second sensor 430 detects the first touch and then detects the second touch within the specified second maximum time (eg, about 300 ms), the second sensor 430 may check whether a third touch is detected within the specified second maximum time based on the detection time of the second touch. After detecting the second touch, if the third touch is not detected within the specified second maximum time (eg, about 300 ms), the second sensor 430 may determine that the number of touches by the external object is two. For example, the second sensor 430 may generate second touch information including the number of touches of two and transmit it to the processor 410 .
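The touch-counting scheme above, where each following touch must arrive within the specified maximum time (eg, about 300 ms) of the previous one, can be sketched as follows. The timestamp representation and names are illustrative assumptions.

```python
MAX_INTERVAL_MS = 300  # specified maximum time to wait for a next touch (assumed)

def count_taps(touch_times_ms):
    """Count touches that belong to one tap gesture: each following
    touch must occur within MAX_INTERVAL_MS of the previous touch,
    otherwise the gesture is considered ended."""
    if not touch_times_ms:
        return 0
    count = 1
    for prev, cur in zip(touch_times_ms, touch_times_ms[1:]):
        if cur - prev <= MAX_INTERVAL_MS:
            count += 1
        else:
            break  # the next touch came too late; the gesture ended
    return count
```

With this sketch, three touches 200 ms apart count as a triple tap, while a touch arriving 400 ms after the first leaves the count at one.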
  • FIG. 10 is a flowchart 1000 for preventing a touch misrecognition of a first sensor in an electronic device according to various embodiments of the present disclosure.
  • the operations of FIG. 10 may be detailed operations of operation 501 of FIG. 5 .
  • Operations in the following embodiments may be sequentially performed, but are not necessarily sequentially performed.
  • the order of the operations may be changed, and at least two operations may be performed in parallel.
  • the electronic device of FIG. 10 may be the electronic device 101 of FIG. 1 , the electronic device 200 of FIG. 2A , the electronic device 300 of FIG. 3 , or the electronic device 400 of FIG. 4 .
  • FIG. 11 is an example of touch information acquired through a first sensor in an electronic device according to various embodiments of the present disclosure;
  • an electronic device may detect a touch related to a contact of an external object using a first sensor (eg, an acceleration sensor) in operation 1001 .
  • for example, when the movement size of the electronic device 400 exceeds a specified first size, the first sensor 420 (eg, an acceleration sensor) may determine that the first touch 1100 by the external object is detected, as shown in FIG. 11 .
  • the electronic device determines whether the time interval of touch detection satisfies the specified second time period.
  • the time interval of touch detection may be detected based on a difference between the detection time of the nth touch and the detection time of the (n ⁇ 1)th touch when the nth touch is detected.
  • n is the number of touch detections related to the contact of the external object, and may include a positive integer.
  • the state satisfying the specified second time may include a state in which the time interval of touch detection exceeds the specified second time.
  • the state in which the specified second time period is not satisfied may include a state in which the time interval of touch detection is equal to or less than the specified second time period.
  • in operation 1005, the electronic device may generate (or update) raw data related to the first touch information based on a touch detected using the first sensor (eg, an acceleration sensor).
  • the first sensor 420 may generate raw data related to the number of times of one touch.
  • when the time interval of touch detection does not satisfy the specified second time (eg, 'No' in operation 1003), or when the raw data related to the first touch information has been generated (or updated) (eg, operation 1005), the electronic device (eg, the processor 120 or 410 , the sensor module 176 , or the first sensor 420 ) may check, in operation 1007, whether the touch elapsed time satisfies a specified third time.
  • the first sensor 420 may determine whether an elapsed time (eg, a touch elapsed time) from the time when the first touch 1100 is detected exceeds a specified third time. .
  • the specified third time (eg, the specified first maximum time) may include a maximum reference time (eg, about 300 ms) set to check whether a next touch exists using the first sensor 420 .
  • the state satisfying the specified third time may include a state in which the touch elapsed time exceeds the specified third time.
  • the state in which the specified third time is not satisfied may include a state in which the touch elapsed time is equal to or less than the specified third time.
  • when the touch elapsed time does not satisfy the specified third time (eg, 'No' in operation 1007), the electronic device (eg, the processor 120 or 410 , the sensor module 176 , or the first sensor 420 ) may detect an additional touch related to the contact of the external object using the first sensor (eg, an acceleration sensor). For example, when the first sensor 420 detects the second touch 1110 as shown in FIG. 11 , the first sensor 420 may check whether the difference 1120 between the detection time of the second touch 1110 and the detection time of the first touch 1100 satisfies the specified second time.
  • when the time interval 1120 of touch detection of FIG. 11 exceeds the specified second time (eg, a state that satisfies the specified second time), the first sensor 420 may update the number of touches in the raw data related to the first touch information to two (eg, in operation 1005 ). When the time interval 1120 of touch detection of FIG. 11 is less than or equal to the specified second time (eg, a state that does not satisfy the specified second time), the first sensor 420 may consider the second touch 1110 invalid. That is, the first sensor 420 may determine that the second touch 1110 is information not related to the contact of the external object and discard the second touch 1110 .
  • in operation 1009, when the touch elapsed time satisfies the specified third time (eg, 'Yes' in operation 1007), first touch information including the number of touches sensed using the first sensor may be generated based on the raw data related to the first touch information.
  • the first sensor 420 may determine that the user input due to the contact of the external object is terminated. Accordingly, the first sensor 420 may generate first touch information including the number of touches by the contact of the external object based on the raw data. The first sensor 420 may transmit first touch information to the processor 410 .
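The filtering of FIG. 10 and FIG. 11 can be sketched as a debounce over timestamped touches: a touch is kept only when it follows the previous valid touch by more than the specified second time; closer touches are discarded as not related to the external object's contact. The threshold value, timestamp representation, and names are illustrative assumptions.

```python
DEBOUNCE_MS = 50  # specified second time: minimum gap between valid touches (assumed value)

def filter_touches(touch_times_ms):
    """Keep a touch only when it follows the previous valid touch by
    more than DEBOUNCE_MS; closer touches are considered invalid and
    discarded, as with the second touch 1110 in FIG. 11."""
    valid = []
    for t in touch_times_ms:
        if not valid or t - valid[-1] > DEBOUNCE_MS:
            valid.append(t)  # update the raw data for the first touch information
        # else: discard the touch as unrelated to the external object's contact
    return valid
```

The number of valid touches would then form the first touch information once the specified third time elapses without a further touch.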
  • FIG. 12 is a flowchart 1200 for preventing touch misrecognition by a first sensor and a second sensor in an electronic device according to various embodiments of the present disclosure.
  • the operations of FIG. 12 may be detailed operations of operation 505 of FIG. 5 .
  • Operations in the following embodiments may be sequentially performed, but are not necessarily sequentially performed.
  • the order of the operations may be changed, and at least two operations may be performed in parallel.
  • the electronic device of FIG. 12 may be the electronic device 101 of FIG. 1 , the electronic device 200 of FIG. 2A , the electronic device 300 of FIG. 3 , or the electronic device 400 of FIG. 4 .
  • FIG. 13 is an example of touch information acquired through a first sensor and a second sensor in an electronic device according to various embodiments of the present disclosure;
  • in operation 1201, an electronic device may check a first acquisition time of the first touch information acquired using the first sensor (eg, an acceleration sensor) and a second acquisition time of the second touch information acquired using the second sensor (eg, a touch sensing sensor).
  • the electronic device (eg, the processor 120 or 410 ) may compare the first acquisition time of the first touch information with the second acquisition time of the second touch information to check whether the first touch information is acquired later than the second touch information. According to an embodiment, when the first acquisition time is later than the second acquisition time, the processor 410 may determine that the first touch information is acquired later than the second touch information.
  • when the first touch information is acquired later than the second touch information (eg, 'Yes' in operation 1203), the electronic device (eg, the processor 120 or 410 ) may update the first touch information in operation 1205.
  • the first touch information since the first sensor 420 has a relatively higher sampling rate than the second sensor 430 , the first touch information may be generated relatively faster than the second touch information.
  • for example, when the first sensor 420 detects the first touch 1312 exceeding the first reference size 1350 based on the sensor data 1300 collected through the first sensor 420 , the first sensor 420 may generate first touch information including the number of touches of two.
  • for example, when the second sensor 430 detects the third touch 1332 exceeding the second reference size 1352 based on the sensor data 1320 collected through the second sensor 430 , the second sensor 430 may generate second touch information including the number of touches of two.
  • in this case, the first touch information may be acquired earlier than the second touch information by a time corresponding to the difference 1340 between the acquisition time 1318 of the first touch information and the acquisition time 1338 of the second touch information.
  • when the first touch information is acquired later than the second touch information, the processor 410 may determine that an erroneous touch is recognized through the first sensor 420 and update the number of touches included in the first touch information.
  • the processor 410 may decrease the number of touches included in the first touch information by one.
  • in operation 1207, the electronic device may detect a user input related to the contact of the external object with respect to the electronic device 400 based on the updated first touch information and the second touch information acquired using the second sensor (eg, a touch sensing sensor).
  • the processor 410 may detect a user input related to a contact of an external object based on the updated first and second touch information as in operations 801 to 811 of FIG. 8 .
  • the processor 410 may detect a user input related to a contact of an external object based on the first touch information and the second touch information.
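The acquisition-time correction of operations 1203 to 1205 can be sketched as follows: since the acceleration sensor samples faster, its touch information normally arrives first; if it instead arrives later than the touch sensor's information, the last counted touch is treated as a misrecognition and the count is decremented by one. The function name and argument shapes are illustrative assumptions.

```python
def reconcile_counts(acc_count, acc_time_ms, tsp_count, tsp_time_ms):
    """If the acceleration sensor's touch information was acquired later
    than the touch sensor's (the reverse of the expected order), treat
    the last counted acceleration touch as erroneous and decrement the
    acceleration count by one."""
    if acc_time_ms > tsp_time_ms and acc_count > 0:
        acc_count -= 1  # drop the erroneously recognized extra touch
    return acc_count
```

The corrected count would then feed the same classification flow as operations 801 to 811.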
  • a method of operating an electronic device (eg, the electronic device 101 of FIG. 1 , the electronic device 200 of FIG. 2A , the electronic device 300 of FIG. 3 , or the electronic device 400 of FIG. 4 ) may include: an operation of identifying, using an acceleration sensor (eg, the sensor module 176 of FIG. 1 , the acceleration sensor 230 of FIG. 2B , the acceleration sensor 360 of FIG. 3 , or the first sensor 420 of FIG. 4 ), first touch information related to a user's contact with the housing of the electronic device; an operation of identifying, using a touch detection sensor (eg, the sensor module 176 of FIG. 1 , the touch detection sensor 220 of FIG. 2B , the touch detection sensor 350 of FIG. 3 , or the second sensor 430 of FIG. 4 ), second touch information related to the user's contact with the housing; and an operation of detecting a user input corresponding to the user's contact based on the number of touches included in the first touch information when the number of touches included in the first touch information is two or more and the number of touches included in the second touch information is one or more.
  • the detecting may include determining that the user input corresponding to the user's contact is a double tap input based on the number of touches included in the first touch information when the number of touches included in the first touch information is two and the number of touches included in the second touch information is one or more, or determining that the user input corresponding to the user's contact is a triple tap input based on the number of touches included in the first touch information when the number of touches included in the first touch information is three and the number of touches included in the second touch information is one or more.
  • the method may further include determining that the user input corresponding to the user's contact is a single tap input when a touch is not detected through the acceleration sensor, or when the number of touches included in the first touch information is one and the number of touches included in the second touch information is one.
  • the method may further include determining the user input corresponding to the user's contact based on the number of touches included in the second touch information when the number of touches included in the first touch information is one and the number of touches included in the second touch information is one or more.
  • the method may further include determining that a touch misrecognition has occurred when a touch is not detected through the acceleration sensor and the number of touches included in the second touch information is two or more.
  • the method may further include determining the user input based on the second touch information when it is determined, based on the second touch information, that the user's contact time with the housing satisfies a specified first time.
  • the first touch information may be generated based on changes in axes other than a first axis parallel to the direction of gravity among a plurality of axes included in the acceleration sensor.
  • the generating of the first touch information may include checking a difference between detection times of the first touch and the second touch, discarding the second touch when the detection time difference does not satisfy the specified second time, and generating the first touch information based on the first touch excluding the second touch.
  • the method may include an operation of identifying an acquisition time of the first touch information and an acquisition time of the second touch information, and an operation of updating the first touch information when the acquisition time of the second touch information precedes the acquisition time of the first touch information.
  • the method may further include transmitting information related to the user input to an external device.

Abstract

Various embodiments of the present disclosure relate to a device and a method for detecting a user input in an electronic device. The electronic device comprises: a housing; an acceleration sensor disposed in an inner space of the housing; a touch sensing sensor disposed in the inner space; and a processor operatively connected to the acceleration sensor and the touch sensing sensor, wherein the processor may identify, using the acceleration sensor, first touch information related to a user's contact with the housing, identify, using the touch sensing sensor, second touch information related to the user's contact with the housing, and detect a user input corresponding to the user's contact on the basis of the first touch information and the second touch information. Various other embodiments are also possible.
PCT/KR2021/009163 2020-08-03 2021-07-16 Dispositif électronique pour détection d'entrée d'utilisateur et son procédé de fonctionnement WO2022030800A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200096715A KR20220016603A (ko) 2020-08-03 2020-08-03 사용자 입력을 검출하기 위한 전자 장치 및 그의 동작 방법
KR10-2020-0096715 2020-08-03

Publications (1)

Publication Number Publication Date
WO2022030800A1 true WO2022030800A1 (fr) 2022-02-10

Family

ID=80117322

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/009163 WO2022030800A1 (fr) 2020-08-03 2021-07-16 Dispositif électronique pour détection d'entrée d'utilisateur et son procédé de fonctionnement

Country Status (2)

Country Link
KR (1) KR20220016603A (fr)
WO (1) WO2022030800A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160023212A (ko) * 2014-08-21 2016-03-03 엘지전자 주식회사 글래스 타입의 이동 단말기 및 그 제어방법
US20170124308A1 (en) * 2015-11-04 2017-05-04 Acer Incorporated Smart Wearable Device and Unlocking Method Thereof
US10152162B1 (en) * 2015-05-15 2018-12-11 Apple Inc. Method of optimizing touch detection
JP2019067214A (ja) * 2017-10-02 2019-04-25 ヤフー株式会社 判定プログラム、判定方法、端末装置、学習データ、及びモデル
US20200104039A1 (en) * 2018-09-28 2020-04-02 Snap Inc. Neural network system for gesture, wear, activity, or carry detection on a wearable or mobile device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160023212A (ko) * 2014-08-21 2016-03-03 엘지전자 주식회사 글래스 타입의 이동 단말기 및 그 제어방법
US10152162B1 (en) * 2015-05-15 2018-12-11 Apple Inc. Method of optimizing touch detection
US20170124308A1 (en) * 2015-11-04 2017-05-04 Acer Incorporated Smart Wearable Device and Unlocking Method Thereof
JP2019067214A (ja) * 2017-10-02 2019-04-25 ヤフー株式会社 判定プログラム、判定方法、端末装置、学習データ、及びモデル
US20200104039A1 (en) * 2018-09-28 2020-04-02 Snap Inc. Neural network system for gesture, wear, activity, or carry detection on a wearable or mobile device

Also Published As

Publication number Publication date
KR20220016603A (ko) 2022-02-10

Similar Documents

Publication Publication Date Title
WO2022014988A1 (fr) Electronic device and power control method
WO2022025494A1 (fr) Electronic device for controlling display luminance and operating method thereof
WO2022030804A1 (fr) Foldable electronic device for controlling screen rotation, and operating method therefor
WO2022103021A1 (fr) Electronic device having a flexible display and method for controlling same
WO2022220659A1 (fr) Electronic device and method by which an electronic device inputs information using an external electronic device
WO2022177299A1 (fr) Method for controlling call function and electronic device supporting same
WO2022119164A1 (fr) Electronic apparatus comprising a vent
WO2022065845A1 (fr) Input data processing method and electronic device supporting same
WO2022014836A1 (fr) Method and apparatus for displaying virtual objects at different brightnesses
WO2022030800A1 (fr) Electronic device for detecting user input and operating method thereof
WO2022196984A1 (fr) Electronic device comprising a camera and operating method thereof
WO2024039136A1 (fr) Method for setting configuration data of a fingerprint sensor, and electronic device
WO2022103108A1 (fr) Electronic device and method for detecting touch input on the electronic device
WO2024101704A1 (fr) Wearable device and method for identifying touch input, and non-transitory computer-readable storage medium
WO2022097992A1 (fr) Electronic device comprising a variable display and operating method thereof
WO2022097860A1 (fr) Electronic device comprising a flexible display and operating method thereof
WO2023063584A1 (fr) Electronic device for identifying a state using a sensor
WO2022203364A1 (fr) Electronic device comprising a housing including a fiducial mark, and manufacturing method thereof
WO2023068549A1 (fr) Electronic device using an external device, and operating method thereof
WO2022119147A1 (fr) Method and apparatus for multi-touch-based slide control
WO2022065807A1 (fr) Camera module contact structure and electronic device comprising same
WO2022050627A1 (fr) Electronic device comprising a flexible display and operating method thereof
WO2022191468A1 (fr) Electronic device and method for using a card-type barometric pressure sensor in an electronic device
WO2022108402A1 (fr) Flexible screen operating method, and electronic device
WO2022154317A1 (fr) Camera module contact structure and electronic apparatus comprising same

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 21853841
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 EP: PCT application non-entry in European phase
Ref document number: 21853841
Country of ref document: EP
Kind code of ref document: A1