WO2024063330A1 - Wearable electronic device and method for identifying a controller using the wearable electronic device


Info

Publication number
WO2024063330A1
WO2024063330A1 (PCT application no. PCT/KR2023/011610)
Authority
WO
WIPO (PCT)
Prior art keywords
controller
recognition camera
electronic device
wearable electronic
led
Prior art date
Application number
PCT/KR2023/011610
Other languages
English (en)
French (fr)
Korean (ko)
Inventor
조남민
이진철
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Priority claimed from KR1020220136744A (published as KR20240041772A)
Application filed by Samsung Electronics Co., Ltd.
Priority to EP23762127.1A (published as EP4369155A4)
Publication of WO2024063330A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F 2300/8082 Virtual reality

Definitions

  • Various embodiments of the present invention relate to a wearable electronic device and a method of identifying at least one controller using the wearable electronic device.
  • Wearable electronic devices are evolving into various forms, such as glasses-type augmented reality (AR) glasses, virtual reality (VR) glasses, video see-through (VST) mixed reality (MR) glasses, or head mounted displays (HMDs).
  • The wearable electronic device can visually provide the user, through glasses (e.g., a display), with information that combines the actually existing environment and/or virtual objects composed of graphics.
  • Wearable electronic devices can display virtual objects on the glasses (e.g., displays) to implement augmented reality and/or virtual reality.
  • Wearable electronic devices can run games using virtual objects displayed on the glasses.
  • The wearable electronic device can be linked to a controller (or handler) including at least one light emitting diode (LED), and a virtual game displayed on the glasses can be played according to the operation of the controller.
  • When a plurality of wearable electronic devices and controllers are used together, a wearable electronic device may recognize another user's controller and perform an incorrect operation.
  • For example, while a first wearable electronic device is playing a game using a first controller, if a second controller associated with a second wearable electronic device approaches the first wearable electronic device, the first wearable electronic device may recognize the operation signal of the second controller in addition to the first controller associated with it, and an incorrect operation may be performed.
  • Various embodiments of the present invention can provide a wearable electronic device that allows each wearable electronic device to identify a controller associated with it among a plurality of wearable electronic devices and controllers.
  • According to various embodiments, a wearable electronic device may include a wireless communication module, a sensor module, a first recognition camera and/or a second recognition camera, and a processor operatively connected to the wireless communication module, the sensor module, the first recognition camera, and/or the second recognition camera.
  • The processor may confirm, through at least one of the first recognition camera and the second recognition camera, that a first controller and a second controller, in each of which LEDs are arranged, are detected within a designated space.
  • the processor may transmit a control signal to the first controller to cause the LED disposed in the first controller to operate at a first turn-on time.
  • the processor may transmit a control signal to the second controller to cause the LED disposed in the second controller to operate at a second turn-on time.
  • According to various embodiments, a method for a wearable electronic device to identify a first controller and a second controller may include an operation in which a processor confirms, through at least one of a first recognition camera and a second recognition camera, that the first controller and the second controller, in each of which LEDs are arranged, are detected within a designated space. According to one embodiment, the method may include an operation in which the processor transmits a control signal to the first controller to cause the LED disposed in the first controller to operate at a first turn-on time. According to one embodiment, the method may include an operation in which the processor transmits a control signal to the second controller to cause the LED disposed in the second controller to operate at a second turn-on time.
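  • To make the control-signal operation above concrete, the following is a minimal Python sketch, not part of the disclosure: the Controller class, the send_control_signal method, and the 1 ms slot spacing are assumptions made only for illustration, while the approximately 33 ms emission cycle and approximately 20 μs turn-on duration are taken from the description of FIG. 5 below.

```python
from dataclasses import dataclass, field

# Values taken from the description of FIG. 5 below; everything else
# (class names, method names, the 1 ms slot spacing) is an illustrative
# assumption, not part of this disclosure.
EMISSION_CYCLE_MS = 33.0     # shared LED emission cycle (a1)
TURN_ON_DURATION_US = 20.0   # LED on-time per cycle (t1)

@dataclass
class Controller:
    """Stand-in for a wirelessly linked controller (e.g., 300, 361, 362)."""
    controller_id: str
    last_control_signal: dict = field(default_factory=dict)

    def send_control_signal(self, signal: dict) -> None:
        # A real device would send this over the wireless communication
        # module (e.g., BLE or WiFi); here it is simply recorded.
        self.last_control_signal = signal

def assign_turn_on_times(detected_controllers, slot_offset_ms=1.0):
    """Give each controller detected in the designated space a distinct
    turn-on offset within the common emission cycle."""
    assignments = {}
    for slot, controller in enumerate(detected_controllers):
        offset_ms = (slot * slot_offset_ms) % EMISSION_CYCLE_MS
        controller.send_control_signal({
            "emission_cycle_ms": EMISSION_CYCLE_MS,
            "turn_on_offset_ms": offset_ms,       # first vs. second turn-on time
            "turn_on_duration_us": TURN_ON_DURATION_US,
        })
        assignments[controller.controller_id] = offset_ms
    return assignments

if __name__ == "__main__":
    first, second = Controller("first"), Controller("second")
    print(assign_turn_on_times([first, second]))
    # {'first': 0.0, 'second': 1.0} -> the two LEDs blink at different times
```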
  • According to various embodiments, each wearable electronic device is able to identify the controller associated with it, which can prevent the wearable electronic device from mistakenly recognizing a controller other than the controller associated with it.
  • According to various embodiments, the first controller of the first wearable electronic device and the second controller of the second wearable electronic device may have the same LED emission cycle, while the timings of their turn-on times are set differently.
  • Accordingly, the first wearable electronic device can recognize the first controller based on the emission cycle of the LED and the timing of the first turn-on time, and the second wearable electronic device can recognize the second controller based on the emission cycle of the LED and the timing of the second turn-on time.
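  • On the recognition side, distinguishing the controllers amounts to comparing the phase of an observed LED blink against the assigned turn-on offsets within the shared emission cycle. The sketch below is an illustrative assumption (the function name, tolerance value, and example timestamps are not from the disclosure), not the actual recognition algorithm of the wearable electronic device.

```python
EMISSION_CYCLE_MS = 33.0  # shared LED emission cycle (a1)

def identify_controller(blink_time_ms, assigned_offsets_ms, tolerance_ms=0.5):
    """Attribute an observed LED blink to the controller whose assigned
    turn-on offset is closest to the blink's phase within the cycle."""
    phase = blink_time_ms % EMISSION_CYCLE_MS
    best_id, best_err = None, None
    for controller_id, offset in assigned_offsets_ms.items():
        # circular distance between observed phase and assigned offset
        err = min(abs(phase - offset), EMISSION_CYCLE_MS - abs(phase - offset))
        if best_err is None or err < best_err:
            best_id, best_err = controller_id, err
    return best_id if best_err is not None and best_err <= tolerance_ms else None

# The first controller was assigned offset 0.0 ms, the second 1.0 ms.
offsets = {"first": 0.0, "second": 1.0}
print(identify_controller(67.02, offsets))  # 67.02 % 33 = 1.02 -> "second"
print(identify_controller(99.01, offsets))  # 99.01 % 33 = 0.01 -> "first"
```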
  • FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments of the present invention.
  • Figure 2 is a perspective view schematically showing the configuration of a wearable electronic device according to an embodiment of the present invention.
  • Figure 3 is a perspective view schematically showing a controller associated with a wearable electronic device according to an embodiment of the present invention.
  • Figure 4 is a perspective view schematically showing a pair of controllers linked to a wearable electronic device according to an embodiment of the present invention.
  • FIG. 5 is a diagram schematically showing the emission cycle and turn-on time of an LED for one controller associated with a wearable electronic device according to an embodiment of the present invention.
  • Figure 6 is a diagram schematically showing an embodiment in which a wearable electronic device can recognize a plurality of controllers according to an embodiment of the present invention.
  • FIG. 7(a) to 7(d) are diagrams illustrating an example in which a wearable electronic device identifies at least one controller according to an embodiment of the present invention.
  • FIGS. 8(a) to 8(d) are diagrams illustrating an example of setting the operation time at which at least one of the first recognition camera or the second recognition camera of the wearable electronic device according to an embodiment of the present invention detects the LED disposed in at least one controller.
  • FIG. 9 is a diagram illustrating an embodiment in which at least one of the first recognition camera or the second recognition camera of the wearable electronic device according to an embodiment of the present invention can detect the light emission pattern of an LED disposed in at least one controller.
  • FIG. 10 is a diagram illustrating various embodiments in which at least one of the first recognition camera or the second recognition camera of the wearable electronic device according to an embodiment of the present invention can detect the light emission pattern of an LED disposed in at least one controller.
  • Figure 11a is a diagram showing an embodiment in which an accessory according to an embodiment of the present invention includes a first pattern.
  • FIG. 11B is a diagram illustrating an example in which an accessory according to an embodiment of the present invention includes a second pattern.
  • Figure 11C is a diagram showing various examples of patterns that can be arranged on an accessory according to an embodiment of the present invention.
  • FIG. 12 is a flowchart illustrating a method by which a wearable electronic device identifies at least one controller using at least one of a first recognition camera or a second recognition camera according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to various embodiments of the present invention.
  • The electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with at least one of the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
  • In some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into a single component (e.g., the display module 160).
  • The processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store instructions or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the instructions or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • According to one embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor.
  • For example, when the electronic device 101 includes the main processor 121 and the auxiliary processor 123, the auxiliary processor 123 may be set to use less power than the main processor 121 or to be specialized for a designated function.
  • the auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • The auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • According to one embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing artificial intelligence models.
  • An artificial intelligence model may be created through machine learning. Such learning may be performed, for example, on the electronic device 101 on which the artificial intelligence model is executed, or through a separate server (e.g., the server 108).
  • Learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited thereto.
  • An artificial intelligence model may include multiple artificial neural network layers.
  • An artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to the examples described above.
  • artificial intelligence models may additionally or alternatively include software structures.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the electronic device 101. Data may include, for example, input data or output data for software (e.g., program 140) and instructions related thereto.
  • Memory 130 may include volatile memory 132 or non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142, middleware 144, or application 146.
  • the input module 150 may receive commands or data to be used in a component of the electronic device 101 (e.g., the processor 120) from outside the electronic device 101 (e.g., a user).
  • the input module 150 may include, for example, a microphone, mouse, keyboard, keys (eg, buttons), or digital pen (eg, stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101.
  • the sound output module 155 may include, for example, a speaker or a receiver. Speakers can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 can visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
  • The audio module 170 can convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 170 may acquire sound through the input module 150, or may output sound through the sound output module 155 or through an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
  • The sensor module 176 may detect the operating state (e.g., power or temperature) of the electronic device 101 or the external environmental state (e.g., the state of a user) and generate an electrical signal or data value corresponding to the detected state.
  • According to one embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more designated protocols that can be used to connect the electronic device 101 directly or wirelessly with an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 can capture still images and moving images.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 can manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • The communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established channel. The communication module 190 may operate independently of the processor 120 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
  • According to one embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • The corresponding communication module may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or WAN)).
  • The wireless communication module 192 may use subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196 to identify or authenticate the electronic device 101 within a communication network such as the first network 198 or the second network 199.
  • The wireless communication module 192 may support a 5G network, after a 4G network, and next-generation communication technology, for example, new radio (NR) access technology.
  • NR access technology can support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 192 may support high frequency bands (eg, mmWave bands), for example, to achieve high data rates.
  • The wireless communication module 192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas.
  • the wireless communication module 192 may support various requirements specified in the electronic device 101, an external electronic device (e.g., electronic device 104), or a network system (e.g., second network 199).
  • According to one embodiment, the wireless communication module 192 may support a peak data rate for realizing eMBB (e.g., 20 Gbps or more), loss coverage for realizing mMTC (e.g., 164 dB or less), or U-plane latency for realizing URLLC (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 197 may transmit signals or power to or receive signals or power from the outside (e.g., an external electronic device).
  • the antenna module 197 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (eg, PCB).
  • The antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. Signals or power may be transmitted or received between the communication module 190 and an external electronic device through the at least one selected antenna.
  • According to some embodiments, other components (e.g., a radio frequency integrated circuit (RFIC)) may be additionally formed as part of the antenna module 197.
  • According to one embodiment, a mmWave antenna module may include a printed circuit board; an RFIC disposed on or adjacent to a first side (e.g., the bottom side) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band); and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second side (e.g., the top or side) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the external electronic devices 102 or 104 may be of the same or different type as the electronic device 101.
  • all or part of the operations performed in the electronic device 101 may be executed in one or more of the external electronic devices 102, 104, or 108.
  • For example, when the electronic device 101 needs to perform a certain function or service, the electronic device 101 may, instead of executing the function or service on its own, or in addition to doing so, request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result as is or additionally and provide it as at least part of a response to the request.
  • For this purpose, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of Things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or server 108 may be included in the second network 199.
  • the electronic device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • Electronic devices may be of various types.
  • Electronic devices may include, for example, portable communication devices (e.g., smart phones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliances.
  • Terms such as "first", "second", or "first or second" may be used simply to distinguish one component from another, and do not limit the components in other respects (e.g., importance or order).
  • When one (e.g., a first) component is referred to as being "coupled" or "connected" to another (e.g., a second) component, with or without the terms "functionally" or "communicatively", it means that the component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • The term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit. A module may be an integrally formed part, or a minimum unit of that part or a portion thereof, that performs one or more functions. For example, according to one embodiment, a module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium (e.g., the internal memory 136 or the external memory 138) readable by a machine (e.g., the electronic device 101).
  • For example, a processor (e.g., the processor 120) of the machine may call at least one of the one or more stored instructions from the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
  • The machine-readable storage medium may be provided in the form of a non-transitory storage medium.
  • Here, 'non-transitory' only means that the storage medium is a tangible device and does not contain signals (e.g., electromagnetic waves); this term does not distinguish between cases where data is stored semi-permanently in the storage medium and cases where it is stored temporarily.
  • Computer program products are commodities and can be traded between sellers and buyers.
  • The computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • at least a portion of the computer program product may be at least temporarily stored or temporarily created in a machine-readable storage medium, such as the memory of a manufacturer's server, an application store's server, or a relay server.
  • Each of the above-described components (e.g., a module or a program) may include a single entity or a plurality of entities, and some of the plurality of entities may be separately disposed in other components.
  • one or more of the components or operations described above may be omitted, or one or more other components or operations may be added.
  • According to various embodiments, multiple components (e.g., modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components in the same or a similar manner as they were performed by the corresponding component prior to the integration.
  • Operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically, one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • Figure 2 is a perspective view schematically showing the configuration of a wearable electronic device according to various embodiments of the present invention.
  • The wearable electronic device 200 of FIG. 2 may include the embodiments described for the electronic device 101 of FIG. 1.
  • The wearable electronic device 200 may include any one of augmented reality (AR) glasses (e.g., an AR device), virtual reality (VR) glasses (e.g., a VR device), video see-through (VST) mixed reality (MR) glasses, or a head mounted display (HMD).
  • According to various embodiments, the wearable electronic device 200 may include a bridge 201, a first rim 210, a second rim 220, a first end piece 230, a second end piece 240, a first temple 250, and/or a second temple 260.
  • The bridge 201 may connect the first rim 210 and the second rim 220.
  • the bridge 201 may be formed of a non-conductive material (eg, polymer) and/or a conductive material (eg, metal).
  • the first rim 210 and the second rim 220 may be formed of a non-conductive material (eg, polymer) and/or a conductive material (eg, metal).
  • the bridge 201 may be positioned above the user's nose when the user wears the wearable electronic device 200.
  • The bridge 201 may separate the first rim 210 and the second rim 220 with respect to the user's nose.
  • the bridge 201 may include a camera module 203, a first eye tracking camera 205, a second eye tracking camera 207, and/or an audio module 209.
  • The camera module 203 (e.g., the camera module 180 of FIG. 1) may photograph the front (e.g., the -y-axis direction) of the user (e.g., the user of the wearable electronic device 200) and obtain image data.
  • the camera module 203 may capture an image corresponding to the user's field of view (FoV) or measure the distance to a subject (eg, an object).
  • the camera module 203 may include an RGB camera, a high resolution (HR) camera, and/or a photo video (PV) camera.
  • the camera module 203 may include a color camera with an auto focus (AF) function and an optical image stabilization (OIS) function to acquire high-definition images.
  • the first gaze tracking camera 205 and the second gaze tracking camera 207 may check the user's gaze.
  • the first gaze tracking camera 205 and the second gaze tracking camera 207 may photograph the user's eyes in a direction opposite to the photographing direction of the camera module 203.
  • the first eye tracking camera 205 may partially photograph the user's left eye
  • the second eye tracking camera 207 may partially photograph the user's right eye.
  • the first gaze tracking camera 205 and the second gaze tracking camera 207 may detect the user's pupils (eg, left eye and right eye) and track the gaze direction.
  • the tracked gaze direction can be used to move the center of a virtual image including a virtual object in response to the gaze direction.
  • The first eye tracking camera 205 and/or the second eye tracking camera 207 may track the user's gaze using at least one of, for example, an electro-oculography (EOG) sensor, a coil system, a dual Purkinje system, a bright pupil system, or a dark pupil system.
  • the audio module 209 (eg, the audio module 170 in FIG. 1) may be disposed between the first eye tracking camera 205 and the second eye tracking camera 207.
  • the audio module 209 can convert the user's voice into an electrical signal or convert an electrical signal into sound.
  • Audio module 209 may include a microphone.
  • the first rim 210 and the second rim 220 may form a frame (eg, an eyeglasses frame) of the wearable electronic device 200 (eg, AR glasses).
  • the first rim 210 may be disposed in a first direction (eg, x-axis direction) of the bridge 201.
  • the first rim 210 may be placed in a position corresponding to the user's left eye.
  • the second rim 220 may be disposed in a second direction (eg, -x-axis direction) of the bridge 201, which is opposite to the first direction (eg, x-axis direction).
  • the second rim 220 may be placed in a position corresponding to the user's right eye.
  • the first rim 210 may surround and support at least a portion of the first glass 215 (eg, a first display) disposed on the inner peripheral surface.
  • the first glass 215 may be positioned in front of the user's left eye.
  • the second rim 220 may surround and support at least a portion of the second glass 225 (eg, a second display) disposed on the inner peripheral surface.
  • the second glass 225 may be positioned in front of the user's right eye.
  • a user of the wearable electronic device 200 can view the foreground (eg, actual image) of an external object (eg, subject) through the first glass 215 and the second glass 225.
  • the wearable electronic device 200 can implement augmented reality by displaying a virtual image overlaid on the foreground (eg, real image) of an external object.
  • the first glass 215 and the second glass 225 may include a projection type transparent display.
  • The first glass 215 and the second glass 225 may each be a transparent plate (or a transparent screen) forming a reflective surface, and an image generated by the wearable electronic device 200 may be reflected (e.g., totally internally reflected) by the reflective surface and enter the user's left and right eyes.
  • the first glass 215 may include an optical waveguide that transmits light generated from a light source of the wearable electronic device 200 to the user's left eye.
  • The optical waveguide may be formed of glass, plastic, or a polymer material, and may include a nanopattern (e.g., a polygonal or curved grating structure, or a mesh structure) formed on the inside or on a surface of the first glass 215.
  • the optical waveguide may include at least one of at least one diffractive element (eg, a diffractive optical element (DOE), a holographic optical element (HOE)) or a reflective element (eg, a reflective mirror).
  • the optical waveguide may guide display light emitted from the light source to the user's eyes using at least one diffractive element or reflective element included in the optical waveguide.
  • For example, the diffractive element may include an input/output optical member, and the reflective element may include a member for total internal reflection (TIR).
  • light emitted from a light source may be guided along an optical path to an optical waveguide through an input optical member, and light traveling inside the optical waveguide may be guided toward the user's eyes through an output optical member.
  • the second glass 225 may be implemented in substantially the same way as the first glass 215 .
  • the first glass 215 and the second glass 225 may each include the display module 160 shown in FIG. 1 .
  • In one embodiment, the first glass 215 and the second glass 225 may include, for example, a liquid crystal display (LCD), a digital mirror device (DMD), a liquid crystal on silicon (LCoS) display, an organic light emitting diode (OLED), or a micro light emitting diode (micro LED).
  • In one embodiment, depending on the type of display forming the first glass 215 and the second glass 225, the wearable electronic device 200 may include a light source that radiates light to the screen output area of the first glass 215 and/or the second glass 225.
  • In another embodiment, when the first glass 215 and the second glass 225 can generate light on their own, the wearable electronic device 200 can provide a virtual image of good quality to the user even without including a separate light source.
  • the first glass 215 and the second glass 225 may be formed of a glass plate, a plastic plate, or a polymer.
  • the first glass 215 and the second glass 225 may be transparent or opaque.
  • In one embodiment, the first rim 210 may include a first microphone 211, a first recognition camera 213, a first light-emitting device 217, and/or a first display module 219.
  • the second rim 220 may include a second microphone 221, a second recognition camera 223, a second light emitting device 227, and/or a second display module 229.
  • In one embodiment, the first light-emitting device 217 and the first display module 219 may be included in the first end piece 230, and the second light-emitting device 227 and the second display module 229 may be included in the second end piece 240.
  • the first microphone 211 and/or the second microphone 221 may receive the voice of the user of the wearable electronic device 200 and convert it into an electrical signal.
  • the first recognition camera 213 and/or the second recognition camera 223 may recognize the surrounding space of the wearable electronic device 200.
  • The first recognition camera 213 and/or the second recognition camera 223 may detect a user's gesture (and/or a controller) within a certain distance (e.g., a designated space) from the wearable electronic device 200.
  • The first recognition camera 213 and/or the second recognition camera 223 may include a global shutter (GS) camera with a reduced rolling shutter (RS) phenomenon in order to detect and track quick hand movements of the user (or of the controller) and/or fine movements of the fingers.
  • The wearable electronic device 200 may use the first eye tracking camera 205, the second eye tracking camera 207, the first recognition camera 213, and/or the second recognition camera 223 to detect which of the user's left eye and/or right eye corresponds to the dominant eye and/or the auxiliary eye.
  • For example, the wearable electronic device 200 may detect the eye corresponding to the dominant eye and/or the auxiliary eye based on the user's gaze direction with respect to an external object or a virtual object.
  • In one embodiment, the first light-emitting device 217 and/or the second light-emitting device 227 may emit light to increase the accuracy of the camera module 203, the first eye tracking camera 205, the second eye tracking camera 207, the first recognition camera 213, and/or the second recognition camera 223.
  • The first light-emitting device 217 and/or the second light-emitting device 227 may be used as an auxiliary means of increasing accuracy when photographing the user's eyes with the first eye tracking camera 205 and/or the second eye tracking camera 207.
  • The first light-emitting device 217 and/or the second light-emitting device 227 may also be used as an auxiliary means when it is not easy to detect an object (e.g., a subject) to be photographed because of a dark environment or a mixture of various light sources and reflected light.
  • the first light emitting device 217 and/or the second light emitting device 227 may include, for example, an LED, an IR LED, or a xenon lamp.
  • The first display module 219 and/or the second display module 229 may emit light, and the emitted light may be delivered to the user's left eye and/or right eye through the first glass 215 and/or the second glass 225.
  • the first glass 215 and/or the second glass 225 may display various image information using light emitted through the first display module 219 and/or the second display module 229.
  • the first display module 219 and/or the second display module 229 may include the display module 160 of FIG. 1 .
  • The wearable electronic device 200 may display the foreground of an external object and the image emitted through the first display module 219 and/or the second display module 229 by overlapping them through the first glass 215 and/or the second glass 225.
  • the first end piece 230 may be coupled to a portion (eg, x-axis direction) of the first rim 210.
  • the second end piece 240 may be coupled to a portion (eg, -x-axis direction) of the second rim 220.
  • the first light emitting device 217 and the first display module 219 may be included in the first end piece 230.
  • the second light emitting device 227 and the second display module 229 may be included in the second end piece 240 .
  • the first end piece 230 may connect the first rim 210 and the first temple 250.
  • the second end piece 240 may connect the second rim 220 and the second temple 260.
  • the first temple 250 may be operatively connected to the first end piece 230 using the first hinge portion 255.
  • the first hinge portion 255 may be rotatable so that the first temple 250 is folded or unfolded with respect to the first rim 210 .
  • the first temple 250 may extend, for example, along the left side of the user's head.
  • the distal portion (e.g., in the y-axis direction) of the first temple 250 may be bent to be supported by, for example, the user's left ear when the user wears the wearable electronic device 200.
  • the second temple 260 may be operatively connected to the second end piece 240 using the second hinge portion 265.
  • the second hinge portion 265 may be rotatable so that the second temple 260 is folded or unfolded with respect to the second rim 220 .
  • the second temple 260 may extend, for example, along the right side of the user's head.
  • the distal portion (e.g., in the y-axis direction) of the second temple 260 may be bent to be supported by, for example, the user's right ear when the user wears the wearable electronic device 200.
  • In one embodiment, the first temple 250 may include a first printed circuit board 251, a first sound output module 253 (e.g., the sound output module 155 of FIG. 1), and/or a first battery 257 (e.g., the battery 189 of FIG. 1).
  • The second temple 260 may include a second printed circuit board 261, a second sound output module 263 (e.g., the sound output module 155 of FIG. 1), and/or a second battery 267 (e.g., the battery 189 of FIG. 1).
  • Various electronic components, such as the processor 120, the memory 130, the sensor module 176, the interface 177, and/or the wireless communication module 192 shown in FIG. 1 (e.g., at least some of the components included in the electronic device 101 of FIG. 1), may be disposed on the first printed circuit board 251 and/or the second printed circuit board 261.
  • the processor may include, for example, one or more of a central processing unit, an application processor, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor.
  • The first printed circuit board 251 and/or the second printed circuit board 261 may include, for example, a printed circuit board (PCB), a flexible PCB (FPCB), or a rigid-flexible PCB (RFPCB).
  • In one embodiment, the first printed circuit board 251 and/or the second printed circuit board 261 may include a main PCB, a slave PCB partially overlapping the main PCB, and/or an interposer substrate between the main PCB and the slave PCB.
  • In one embodiment, the first printed circuit board 251 and/or the second printed circuit board 261 may be connected to other components (e.g., the camera module 203 or the first eye tracking camera 205) using an electrical path such as an FPCB and/or a cable.
  • the wearable electronic device 200 may include only one of the first printed circuit board 251 or the second printed circuit board 261.
  • the first audio output module 253 and/or the second audio output module 263 may transmit audio signals to the user's left and/or right ears.
  • the first sound output module 253 and/or the second sound output module 263 may include, for example, a piezo speaker (eg, bone conduction speaker) that transmits an audio signal without a speaker hole.
  • the wearable electronic device 200 may include only one of the first audio output module 253 or the second audio output module 263.
  • The first battery 257 and/or the second battery 267 may supply power to the first printed circuit board 251 and/or the second printed circuit board 261 using a power management module (e.g., the power management module 188 of FIG. 1).
  • the first battery 257 and/or the second battery 267 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • the wearable electronic device 200 may include only one of the first battery 257 or the second battery 267.
  • the wearable electronic device 200 may include a sensor module (eg, sensor module 176 in FIG. 1).
  • the sensor module may generate an electrical signal or data value corresponding to the internal operating state of the wearable electronic device 200 or the external environmental state.
  • The sensor module may include, for example, at least one of a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a color sensor, an infrared (IR) sensor, a biometric sensor (e.g., an HRM sensor), a temperature sensor, a humidity sensor, an inertial measurement unit (IMU) sensor, or an illuminance sensor.
  • The sensor module may be configured to include various biometric sensors, such as an olfactory sensor (e-nose sensor), an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, or an iris sensor.
  • the wearable electronic device 200 may display virtual objects through the first glass 215 and the second glass 225 to implement augmented reality and/or virtual reality.
  • Figure 3 is a perspective view schematically showing a controller associated with a wearable electronic device according to an embodiment of the present invention.
  • Figure 4 is a perspective view schematically showing a pair of controllers linked to a wearable electronic device according to an embodiment of the present invention.
  • The controller 300 may include the processor 120, the memory 130, the program 140, the input module 150, the sound output module 155, the sensor module 176, the battery 189, and/or the wireless communication module 192 of the electronic device 101 shown in FIG. 1.
  • the controller 300 may operate in conjunction with the wearable electronic device 200.
  • the controller 300 can transmit and receive signals with the wearable electronic device 200 through wireless communication.
  • Wireless communication may include any one of WiFi, Bluetooth, or mmWave.
  • the controller 300 may transmit key input signals and movement data to the wearable electronic device 200.
  • the controller 300 may receive a control signal from the wearable electronic device 200.
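  • The two directions of this link can be pictured with simple message shapes. The Python dataclasses below are an illustrative assumption only; the field names do not define an actual protocol of the controller 300 or the wearable electronic device 200.

```python
from dataclasses import dataclass

# Illustrative message shapes only; the field names are assumptions and do
# not define an actual protocol of the controller 300 or the device 200.

@dataclass
class ControllerReport:            # controller -> wearable electronic device
    pressed_keys: tuple            # key input signals, e.g. ("menu",)
    accel: tuple                   # movement data from inertial sensors
    gyro: tuple

@dataclass
class DeviceControlSignal:         # wearable electronic device -> controller
    emission_cycle_ms: float       # shared LED emission cycle
    turn_on_offset_ms: float       # controller-specific turn-on time
    turn_on_duration_us: float     # how long the LED stays on per cycle

report = ControllerReport(pressed_keys=("menu",), accel=(0.0, 0.1, 9.8), gyro=(0.0, 0.0, 0.2))
signal = DeviceControlSignal(emission_cycle_ms=33.0, turn_on_offset_ms=1.0, turn_on_duration_us=20.0)
print(report, signal, sep="\n")
```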
  • The controller 300 may play (or control) a game through a game screen displayed on the first glass 215 (e.g., the first display) and the second glass 225 (e.g., the second display) of the wearable electronic device 200.
  • The wearable electronic device 200 may change the game screen displayed using the first glass 215 and the second glass 225 based on the control operation of the controller 300.
  • the controller 300 may be a handler capable of controlling the game screen of the wearable electronic device 200.
  • In one embodiment, the controller 300 may use a vision (camera-recognition) method.
  • the controller 300 may include a first case 301 and a second case 302.
  • the first case 301 and the second case 302 may be integrally connected.
  • the first case 301 may include an LED 310. At least one LED 310 may be disposed in the first case 301. A plurality of LEDs 310 may be disposed in the first case 301. The LED 310 may be disposed in the first case 301 so that it can be detected through the first recognition camera 213 and/or the second recognition camera 223 of the wearable electronic device 200. For example, part of the LED 310 may be placed on the outer surface of the first case 301, and another part may be placed on the inner surface of the first case 301. LED 310 can be turned on or off in a designated pattern. The LED 310 may repeatedly turn on and off at a set cycle to reduce current consumption. The turn-on and turn-off times of LED 310 can be changed.
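  • A hedged sketch of this behavior (an LED that is on only briefly at a settable point in each emission cycle) is shown below; led_on/led_off and the sleep-based timing are placeholders assumed for illustration, not firmware of the controller 300.

```python
import time

# Hedged controller-side sketch: led_on/led_off are placeholders for an LED
# driver, and real firmware would use hardware timers instead of sleep();
# only the ~33 ms cycle and ~20 us on-time follow the description of FIG. 5.

EMISSION_CYCLE_S = 0.033      # ~33 ms emission cycle
TURN_ON_DURATION_S = 20e-6    # ~20 us turn-on time per cycle

def led_on():  pass
def led_off(): pass

def blink_loop(turn_on_offset_s, cycles=3):
    """Turn the LED on briefly at the assigned offset inside each cycle."""
    cycle_start = time.monotonic()
    for _ in range(cycles):
        # wait until this cycle's assigned turn-on time
        time.sleep(max(0.0, cycle_start + turn_on_offset_s - time.monotonic()))
        led_on()
        time.sleep(TURN_ON_DURATION_S)
        led_off()
        # wait for the start of the next emission cycle
        cycle_start += EMISSION_CYCLE_S
        time.sleep(max(0.0, cycle_start - time.monotonic()))

blink_loop(turn_on_offset_s=0.001)  # e.g., the offset received in a control signal
```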
  • the second case 302 may include an on/off button 320, a menu button 330, a joystick 340, and/or a touch pad 350.
  • the on-off button 320 can turn the controller 300 on or off.
  • the menu button 330 may be used to call up a menu displayed through the first glass 215 (e.g., first display) and the second glass 225 (e.g., second display) of the wearable electronic device 200.
  • the joystick 340 may be used to perform game operations (e.g., movement and rotation) displayed through the first glass 215 and the second glass 225 of the wearable electronic device 200.
  • the touch pad 350 may be used to input operation signals required to play a game displayed through the first glass 215 and the second glass 225 of the wearable electronic device 200.
  • the configuration included in the second case 302 is not limited to the above-described example, and various other configurations may be included.
  • the controller 300 may have a printed circuit board (not shown) disposed inside the second case 302.
  • electronic components that perform substantially the same functions as the processor 120, memory 130, input module 150, audio output module 155, sensor module 176, battery 189, or wireless communication module 192 disclosed in FIG. 1 may be disposed on the printed circuit board of the controller 300.
  • the controller 300 associated with the wearable electronic device 200 may include a controller 361 that the user can operate using the right hand and a controller 362 that the user can operate using the left hand.
  • the controller 361 operated using the right hand and the controller 362 operated using the left hand may each have substantially the same configuration as the controller 300 disclosed in FIG. 3 and the embodiments described above.
  • the wearable electronic device 200 may perform operations in conjunction with at least one of the controllers 361 and 362.
  • FIG. 5 is a diagram schematically showing the emission cycle and turn-on time of an LED for one controller associated with a wearable electronic device according to an embodiment of the present invention.
  • the LED 310 disposed in the controller 300 may repeatedly turn on and off at a set cycle to reduce current consumption.
  • the LEDs 310 may operate with substantially the same emission period (e.g., a1).
  • the LED 310 may operate with an emission cycle of approximately 33 ms.
  • the LED 310 may repeatedly turn on and turn off at substantially the same time interval (e.g., t1).
  • the LED 310 may repeatedly turn on and off at a time interval of about 20 µs.
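The timing values above (an emission cycle of roughly 33 ms and a turn-on interval of roughly 20 µs) imply a very low duty cycle, which is how the repeated on/off operation reduces current consumption. A minimal sketch of such an LED schedule follows, with illustrative class and field names that are not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class LedSchedule:
    """Illustrative model of a controller LED's blink timing (assumed names)."""
    period_s: float        # emission cycle a1, e.g. ~33 ms
    on_time_s: float       # turn-on interval t1, e.g. ~20 us
    offset_s: float = 0.0  # per-controller turn-on offset within the cycle

    def duty_cycle(self) -> float:
        # Fraction of each cycle during which the LED is emitting.
        return self.on_time_s / self.period_s

    def is_on(self, t: float) -> bool:
        """True if the LED is emitting at absolute time t (seconds)."""
        phase = (t - self.offset_s) % self.period_s
        return 0.0 <= phase < self.on_time_s

led = LedSchedule(period_s=33e-3, on_time_s=20e-6)
print(f"duty cycle ~= {led.duty_cycle():.6f}")   # ~0.0006, i.e. ~0.06% on-time
print(led.is_on(10e-6), led.is_on(10e-3))        # True just after turn-on, False mid-cycle
```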
  • Figure 6 is a diagram schematically showing an embodiment in which a wearable electronic device can recognize a plurality of controllers according to an embodiment of the present invention.
  • user 1 can use the first wearable electronic device 200a linked to the first controller 300a.
  • User 2 can use the second wearable electronic device 200b connected to the second controller 300b.
  • User 3 can use the third wearable electronic device 200c linked to the third controller 300c.
  • User 4 can use the fourth wearable electronic device 200d connected to the fourth controller 300d.
  • when user 5 exists, user 5 may use, for example, a fifth wearable electronic device (not shown) associated with a fifth controller (not shown).
  • Users 1 to 5 are examples, and there may be more or fewer users.
  • the first wearable electronic device 200a, the second wearable electronic device 200b, the third wearable electronic device 200c, or the fourth wearable electronic device 200d may include substantially the same configurations as at least some of the embodiments of the electronic device 101 described in FIG. 1.
  • the first wearable electronic device 200a, the second wearable electronic device 200b, the third wearable electronic device 200c, or the fourth wearable electronic device 200d may include substantially the same configurations as at least some of the embodiments of the wearable electronic device 200 described in FIG. 2.
  • the first wearable electronic device 200a may be the electronic device 101 shown in FIG. 1 .
  • the second wearable electronic device 200b, the third wearable electronic device 200c, or the fourth wearable electronic device 200d may be the external electronic devices 102 and 104 shown in FIG. 1 .
  • components substantially the same as those in the embodiment of the electronic device 101 of FIG. 1 or the embodiment of the wearable electronic device 200 of FIG. 2 are given similar reference numerals, and redundant descriptions may be omitted.
  • the first wearable electronic device 200a, the second wearable electronic device 200b, the third wearable electronic device 200c, and/or the fourth wearable electronic device 200d disclosed below differ only in notation, and each can perform substantially the same function.
  • the first wearable electronic device 200a used by User 1 may function as a server.
  • the first wearable electronic device 200a may serve as a host for the second wearable electronic device 200b, the third wearable electronic device 200c, and/or the fourth wearable electronic device 200d.
  • the first wearable electronic device 200a may transmit a control signal to the first controller 300a, the second wearable electronic device 200b, the second controller 300b, the third wearable electronic device 200c, the third controller 300c, the fourth wearable electronic device 200d, and/or the fourth controller 300d.
  • the control signal transmitted from the first wearable electronic device 200a may include a signal that controls the light emission cycle and turn-on time of the LEDs 310 respectively disposed in the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d.
  • the first wearable electronic device 200a may control the operation cycle and turn-on time of the first recognition camera 213 and the second recognition camera 223 based on the emission period and turn-on time of the LEDs 310 respectively disposed in the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d.
  • the timing at which the recognition cameras of the other wearable electronic devices, up to the eighth recognition camera 223f, operate may be controlled in a similar manner.
  • the first wearable electronic device 200a may receive information transmitted from the second wearable electronic device 200b, the third wearable electronic device 200c, and/or the fourth wearable electronic device 200d.
  • the LEDs 310 respectively disposed in the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d may perform operations based on control signals transmitted from the first wearable electronic device 200a.
  • the seventh recognition camera 213e and/or the eighth recognition camera 223f of the fourth wearable electronic device 200d may perform operations based on the control signal transmitted from the first wearable electronic device 200a.
  • the first wearable electronic device 200a may detect the movement and position of the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d through at least one of the first recognition camera 213 and the second recognition camera 223.
  • the first wearable electronic device 200a may detect, through the first recognition camera 213 and/or the second recognition camera 223, the on and/or off states of the LEDs 310 respectively disposed in the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d, and may confirm the positions of the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d.
  • the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d may each include an inertial measurement sensor, and the angular velocity and acceleration of the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d may be transmitted to the first wearable electronic device 200a.
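Since each controller can report angular velocity and acceleration from its inertial measurement sensor to the first wearable electronic device 200a, such a report might be modeled as below. The message layout, field names, and the small integration helper are assumptions for illustration only, not the disclosed format.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ImuReport:
    """Hypothetical IMU sample sent from a controller to the host wearable device."""
    controller_id: int                                  # e.g. 1 for the first controller 300a
    timestamp_us: int                                   # sample time in microseconds
    angular_velocity_dps: Tuple[float, float, float]    # gyro reading, degrees per second
    acceleration_mps2: Tuple[float, float, float]       # accelerometer reading, m/s^2

def update_velocity(report: ImuReport, dt_s: float,
                    velocity: Tuple[float, float, float]) -> Tuple[float, float, float]:
    """Very rough velocity update by integrating acceleration over dt (illustration only)."""
    return tuple(v + a * dt_s for v, a in zip(velocity, report.acceleration_mps2))

sample = ImuReport(1, 123_456, (0.5, -1.2, 3.0), (0.01, 9.81, 0.02))
print(update_velocity(sample, 0.01, (0.0, 0.0, 0.0)))
```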
  • the first controller 300a, second controller 300b, third controller 300c, and/or fourth controller 300d may include the sensor module 176 disclosed in FIG. 1.
  • the first wearable electronic device 200a may check the movement and position of the first controller 300a through the first recognition camera 213 and/or the second recognition camera 223.
  • the first wearable electronic device 200a may receive information related to the movement and location of the second controller 300b through the second wearable electronic device 200b.
  • the first wearable electronic device 200a may receive information related to the movement and location of the third controller 300c through the third wearable electronic device 200c.
  • the first wearable electronic device 200a may receive information related to the movement and location of the fourth controller 300d through the fourth wearable electronic device 200d.
  • the first wearable electronic device 200a may detect the movement and position of the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d through a sensor module (e.g., the sensor module 176 in FIG. 1).
  • the sensor module 176 of the first wearable electronic device 200a may include an inertial measurement sensor that measures the angular velocity and acceleration of the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d.
  • the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d may each include an inertial measurement sensor and transmit the measured information to at least one of the first wearable electronic device 200a to the fourth wearable electronic device 200d.
  • the sensor module 176 of the first wearable electronic device 200a may also include a global positioning system (GPS) sensor that detects the locations of the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d.
  • the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d may each include a GPS sensor and transmit the measured information to at least one of the first wearable electronic device 200a to the fourth wearable electronic device 200d.
  • the first wearable electronic device 200a may detect, through the first recognition camera 213 and/or the second recognition camera 223, at least one of the light emission cycle, turn-on time (e.g., light emission timing), and light emission pattern of the LEDs 310 disposed in the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d.
  • the first wearable electronic device 200a may detect, through the first recognition camera 213 and/or the second recognition camera 223, the light emission cycle, turn-on time, and/or light emission pattern of the LEDs 310 respectively disposed in the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d, and may identify the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d based on the detected light emission cycle, turn-on time, and/or light emission pattern.
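One reading of the identification step above is that, once each controller's LED has been assigned a distinct turn-on time within a shared emission cycle, each detected flash can be attributed to the controller whose assigned slot is closest to the flash's phase. A minimal sketch under that assumption follows; the identifiers, offsets, and tolerance are illustrative, not values from the disclosure.

```python
from typing import Optional

# Shared emission cycle a1 (assumed ~33 ms) and illustrative per-controller offsets.
PERIOD_S = 33e-3
ASSIGNED_OFFSETS = {
    "controller_1": 0.0e-3,
    "controller_2": 8.0e-3,
    "controller_3": 16.0e-3,
    "controller_4": 24.0e-3,
}

def identify_flash(flash_time_s: float, tolerance_s: float = 1e-3) -> Optional[str]:
    """Return the controller whose assigned slot best matches the flash, or None."""
    phase = flash_time_s % PERIOD_S
    best_id, best_err = None, tolerance_s
    for cid, offset in ASSIGNED_OFFSETS.items():
        # Wrap-around distance between the observed phase and the assigned offset.
        err = min(abs(phase - offset), PERIOD_S - abs(phase - offset))
        if err < best_err:
            best_id, best_err = cid, err
    return best_id

print(identify_flash(0.0241))   # near controller_4's slot -> "controller_4"
print(identify_flash(0.0123))   # not within 1 ms of any slot -> None
```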
  • the first wearable electronic device 200a may receive specified information (e.g., movement and location signals) from the second wearable electronic device 200b, the third wearable electronic device 200c, and/or the fourth wearable electronic device 200d.
  • the first wearable electronic device 200a may render the information received from the second wearable electronic device 200b, the third wearable electronic device 200c, and/or the fourth wearable electronic device 200d, and then transmit vision data, audio data, and/or control signals to the second wearable electronic device 200b, the third wearable electronic device 200c, and/or the fourth wearable electronic device 200d.
  • the first recognition camera 213 and/or the second recognition camera 223 may recognize the space surrounding the first wearable electronic device 200a.
  • the first recognition camera 213 and/or the second recognition camera 223 may detect the movement and position of the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d within a certain distance (e.g., a certain space) of the first wearable electronic device 200a.
  • the first recognition camera 213 and/or the second recognition camera 223 may include a global shutter (GS) camera capable of reducing the rolling shutter (RS) phenomenon in order to detect the emission cycle, emission timing, and emission pattern of the LEDs 310 respectively disposed in the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d.
  • the LEDs 310 disposed in the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d may be turned on or off to have different patterns.
  • at least one of the first recognition camera 213 and the second recognition camera 223 of the first wearable electronic device 200a may detect the light emission period, turn-on time (e.g., light emission timing), and/or light emission pattern of the LEDs 310 respectively disposed in the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d, so that the first controller 300a, the second controller 300b, the third controller 300c, and/or the fourth controller 300d may be identified.
  • while the first wearable electronic device 200a is playing a game using the first controller 300a, the second controller 300b associated with the second wearable electronic device 200b, the third controller 300c associated with the third wearable electronic device 200c, and/or the fourth controller 300d associated with the fourth wearable electronic device 200d may be located adjacent to the first wearable electronic device 200a.
  • in this case, the first wearable electronic device 200a may detect the second controller 300b, the third controller 300c, and/or the fourth controller 300d, and may transmit a control signal to the second controller 300b, the third controller 300c, and/or the fourth controller 300d so that their LEDs 310 operate with a turn-on time and a light emission pattern different from those of the LED 310 disposed in the first controller 300a.
  • although the first wearable electronic device 200a has been described as operating as a host, the second wearable electronic device 200b, the third wearable electronic device 200c, or the fourth wearable electronic device 200d may also operate as a host and perform substantially the same functions as the first wearable electronic device 200a described above.
  • the electronic device 101 (e.g., a mobile terminal) disclosed in FIG. 1 may also transmit control signals (e.g., operation information) to the first wearable electronic device 200a, the second wearable electronic device 200b, the third wearable electronic device 200c, and/or the fourth wearable electronic device 200d.
  • FIG. 7(a) to 7(d) are diagrams illustrating an example in which a wearable electronic device identifies at least one controller according to an embodiment of the present invention.
  • (a) of FIG. 7 may be a diagram showing the light emission cycle and turn-on time of the LED 310 of the first controller 300a associated with the first wearable electronic device 200a.
  • (b) of FIG. 7 may be a diagram showing the light emission cycle and turn-on time of the LED 310 of the first controller 300a associated with the first wearable electronic device 200a and the light emission cycle and turn-on time of the LED 310 of the second controller 300b associated with the second wearable electronic device 200b.
  • (c) of FIG. 7 may be a diagram showing the light emission cycle and turn-on time of the LED 310 of the first controller 300a associated with the first wearable electronic device 200a, the light emission cycle and turn-on time of the LED 310 of the second controller 300b associated with the second wearable electronic device 200b, and the light emission cycle and turn-on time of the LED 310 of the third controller 300c associated with the third wearable electronic device 200c.
  • (d) of FIG. 7 may be a diagram showing the light emission cycle and turn-on time of the LED 310 of the first controller 300a associated with the first wearable electronic device 200a, the light emission cycle and turn-on time of the LED 310 of the second controller 300b associated with the second wearable electronic device 200b, the light emission cycle and turn-on time of the LED 310 of the third controller 300c associated with the third wearable electronic device 200c, and the light emission cycle and turn-on time of the LED 310 of the fourth controller 300d associated with the fourth wearable electronic device 200d.
  • the first wearable electronic device 200a may set the emission period (a1) and the first turn-on time (1) of the LED 310 disposed in the first controller 300a and transmit them to the first controller 300a.
  • the first wearable electronic device 200a may identify the first controller 300a, whose LED 310 operates at the first turn-on time (1), using at least one of the first recognition camera 213 or the second recognition camera 223.
  • the first wearable electronic device 200a may set the emission period (a1) and the first turn-on time (1) of the LED 310 disposed in the first controller 300a and transmit them to the first controller 300a.
  • the first wearable electronic device 200a may set the light emission period and the second turn-on time (2) of the LED 310 disposed in the second controller 300b and transmit them to the second controller 300b through the second wearable electronic device 200b.
  • the first wearable electronic device 200a may identify the first controller 300a, whose LED 310 operates at the first turn-on time (1), and/or the second controller 300b, whose LED 310 operates at the second turn-on time (2), using at least one of the first recognition camera 213 or the second recognition camera 223.
  • the emission cycle of the LED 310 disposed in the first controller 300a and the emission cycle of the LED 310 disposed in the second controller 300b may be substantially the same, while the first turn-on time (1) of the LED 310 disposed in the first controller 300a and the second turn-on time (2) of the LED 310 disposed in the second controller 300b may be different.
  • the timing of the first turn-on time (1) and the second turn-on time (2) may be different.
  • the first wearable electronic device 200a may set the emission period (a1) and the first turn-on time (1) of the LED 310 disposed in the first controller 300a and transmit them to the first controller 300a.
  • the first wearable electronic device 200a may set the light emission period and the second turn-on time (2) of the LED 310 disposed in the second controller 300b and transmit them to the second controller 300b through the second wearable electronic device 200b.
  • the first wearable electronic device 200a may set the light emission period and the third turn-on time (3) of the LED 310 disposed in the third controller 300c and transmit them to the third controller 300c through the third wearable electronic device 200c.
  • the first wearable electronic device 200a may identify the first controller 300a, whose LED 310 operates at the first turn-on time (1), the second controller 300b, whose LED 310 operates at the second turn-on time (2), and/or the third controller 300c, whose LED 310 operates at the third turn-on time (3), using at least one of the first recognition camera 213 or the second recognition camera 223.
  • the emission cycle of the LED 310 disposed in the first controller 300a, the emission cycle of the LED 310 disposed in the second controller 300b, and the emission cycle of the LED 310 disposed in the third controller 300c may be substantially the same, while the first turn-on time (1) of the LED 310 disposed in the first controller 300a, the second turn-on time (2) of the LED 310 disposed in the second controller 300b, and the third turn-on time (3) of the LED 310 disposed in the third controller 300c may be different.
  • the timing of the first turn-on time (1), the second turn-on time (2), and the third turn-on time (3) may be different.
  • the first wearable electronic device 200a may set the emission period (a1) and the first turn-on time (1) of the LED 310 disposed in the first controller 300a and transmit them to the first controller 300a.
  • the first wearable electronic device 200a may set the light emission period and the second turn-on time (2) of the LED 310 disposed in the second controller 300b and transmit them to the second controller 300b through the second wearable electronic device 200b.
  • the first wearable electronic device 200a may set the light emission period and the third turn-on time (3) of the LED 310 disposed in the third controller 300c and transmit them to the third controller 300c through the third wearable electronic device 200c.
  • the first wearable electronic device 200a may set the light emission period and the fourth turn-on time (4) of the LED 310 disposed in the fourth controller 300d and transmit them to the fourth controller 300d through the fourth wearable electronic device 200d.
  • the first wearable electronic device 200a may identify the first controller 300a, whose LED 310 operates at the first turn-on time (1), the second controller 300b, whose LED 310 operates at the second turn-on time (2), the third controller 300c, whose LED 310 operates at the third turn-on time (3), and/or the fourth controller 300d, whose LED 310 operates at the fourth turn-on time (4), using at least one of the first recognition camera 213 or the second recognition camera 223.
  • the emission cycle of the LED 310 disposed in the first controller 300a, the emission cycle of the LED 310 disposed in the second controller 300b, the emission cycle of the LED 310 disposed in the third controller 300c, and the emission cycle of the LED 310 disposed in the fourth controller 300d may be substantially the same, while the first turn-on time (1) of the LED 310 disposed in the first controller 300a, the second turn-on time (2) of the LED 310 disposed in the second controller 300b, the third turn-on time (3) of the LED 310 disposed in the third controller 300c, and the fourth turn-on time (4) of the LED 310 disposed in the fourth controller 300d may be different.
  • the timing of the first turn-on time (1), the second turn-on time (2), the third turn-on time (3), and the fourth turn-on time (4) may be different.
  • the LEDs 310 arranged in the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d may be turned on with time differences.
  • the first wearable electronic device 200a may transmit a control signal to the first controller 300a under the control of the processor 120 so that the LED 310 disposed on the first controller 300a turns on at a specified time.
  • the first wearable electronic device 200a may, under the control of the processor 120, transmit a control signal to the second controller 300b through the second wearable electronic device 200b so that the LED 310 disposed in the second controller 300b turns on at a specified time.
  • the first wearable electronic device 200a may, under the control of the processor 120, transmit a control signal to the third controller 300c through the third wearable electronic device 200c so that the LED 310 disposed in the third controller 300c turns on at a specified time.
  • the first wearable electronic device 200a may, under the control of the processor 120, transmit a control signal to the fourth controller 300d through the fourth wearable electronic device 200d so that the LED 310 disposed in the fourth controller 300d turns on at a specified time.
  • the LEDs 310 disposed in the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d may operate with substantially the same emission period a1 (e.g., about 33 ms), while the timings of the first turn-on time (1) to the fourth turn-on time (4) may be different.
  • the first wearable electronic device 200a may, under the control of the processor 120 and through the wireless communication module 192, transmit control signals to the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d, respectively, so that the light emission cycles of the LEDs 310 are substantially the same while the turn-on times (e.g., the first turn-on time (1) to the fourth turn-on time (4)) are different.
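The control signals described above keep every LED on the same emission cycle while staggering the turn-on times. A minimal sketch of how such offsets could be derived and packaged for transmission follows; the even slot spacing, helper names, and payload fields are assumptions rather than the disclosed protocol.

```python
def assign_turn_on_offsets(controller_ids, period_s=33e-3):
    """Spread the controllers' turn-on times evenly across one shared emission cycle."""
    slot = period_s / len(controller_ids)
    return {cid: i * slot for i, cid in enumerate(controller_ids)}

def build_control_signal(controller_id, offset_s, period_s=33e-3, on_time_s=20e-6):
    """Hypothetical payload the host would relay, directly or via another wearable device."""
    return {
        "target": controller_id,
        "emission_period_s": period_s,
        "turn_on_offset_s": offset_s,
        "turn_on_duration_s": on_time_s,
    }

offsets = assign_turn_on_offsets(["controller_1", "controller_2", "controller_3", "controller_4"])
for cid, off in offsets.items():
    print(build_control_signal(cid, off))
```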
  • the first wearable electronic device 200a may identify the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d, whose LEDs 310 each have a different turn-on timing, using at least one of the first recognition camera 213 and the second recognition camera 223.
  • the first wearable electronic device 200a may check the light emission intensity and/or color information of the LEDs 310 respectively disposed in the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d through at least one of the first recognition camera 213 and the second recognition camera 223, and may identify the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d, respectively.
  • FIGS. 8(a) to 8(d) are diagrams showing an example in which at least one of the first recognition camera or the second recognition camera of the wearable electronic device according to an embodiment of the present invention sets its operation time to detect the LED disposed in at least one controller.
  • (a) of FIG. 8 may be a diagram showing the first turn-on time (1) of the LED 310 of the first controller 300a associated with the first wearable electronic device 200a and the operation time of at least one of the first recognition camera 213 or the second recognition camera 223.
  • (b) of FIG. 8 may be a diagram showing the first turn-on time (1) of the LED 310 of the first controller 300a associated with the first wearable electronic device 200a, the second turn-on time (2) of the LED 310 of the second controller 300b associated with the second wearable electronic device 200b, the operation time of at least one of the first recognition camera 213 or the second recognition camera 223 of the first wearable electronic device 200a, and the operation time of at least one of the third recognition camera 213a or the fourth recognition camera 223b of the second wearable electronic device 200b.
  • (c) of FIG. 8 may be a diagram showing the turn-on times of the LEDs 310 of the first controller 300a, the second controller 300b, and the third controller 300c associated with the first to third wearable electronic devices, together with the operation times of the corresponding recognition cameras.
  • the first wearable electronic device 200a may obtain the timing of the emission cycle and turn-on time and/or the light emission pattern of the LED 310 disposed on the first controller 300a through at least one of the first recognition camera 213 or the second recognition camera 223.
  • the second wearable electronic device 200b may obtain the timing of the emission cycle and turn-on time and/or the light emission pattern of the LED 310 disposed on the second controller 300b through at least one of the third recognition camera 213a or the fourth recognition camera 223b.
  • the third wearable electronic device 200c may obtain the timing of the emission cycle and turn-on time and/or the light emission pattern of the LED 310 disposed on the third controller 300c through at least one of the fifth recognition camera 213c or the sixth recognition camera 223d.
  • the fourth wearable electronic device 200d may obtain the timing of the emission cycle and turn-on time and/or the light emission pattern of the LED 310 disposed on the fourth controller 300d through at least one of the seventh recognition camera 213e or the eighth recognition camera 223f.
  • information related to the light emission cycle, turn-on time timing, and/or light emission pattern of the LEDs 310 obtained from the second controller 300b to the fourth controller 300d, respectively, may be shared with the first wearable electronic device 200a through the second wearable electronic device 200b to the fourth wearable electronic device 200d.
  • when the first wearable electronic device 200a detects the first turn-on time (1) of the LED 310 disposed in the first controller 300a through at least one of the first recognition camera 213 or the second recognition camera 223, the first recognition camera 213 or the second recognition camera 223 may be set to operate at the fifth turn-on time (5).
  • the fifth turn-on time (5) of at least one of the first recognition camera 213 or the second recognition camera 223 may be set to be longer than the first turn-on time (1) of the LED 310 disposed in the first controller 300a.
  • when the first wearable electronic device 200a detects the second turn-on time (2) of the LED 310 disposed in the second controller 300b through at least one of the first recognition camera 213 or the second recognition camera 223, the first recognition camera 213 or the second recognition camera 223 may be set to operate at the sixth turn-on time (6).
  • the sixth turn-on time (6) of at least one of the first recognition camera 213 or the second recognition camera 223 may be set to be longer than the second turn-on time (2) of the LED 310 disposed in the second controller 300b.
  • when the first wearable electronic device 200a detects the third turn-on time (3) of the LED 310 disposed in the third controller 300c through at least one of the first recognition camera 213 or the second recognition camera 223, the first recognition camera 213 or the second recognition camera 223 may be set to operate at the seventh turn-on time (7).
  • the seventh turn-on time (7) of at least one of the first recognition camera 213 or the second recognition camera 223 may be set to be longer than the third turn-on time (3) of the LED 310 disposed in the third controller 300c.
  • when the first wearable electronic device 200a detects the fourth turn-on time (4) of the LED 310 disposed in the fourth controller 300d through at least one of the first recognition camera 213 or the second recognition camera 223, the first recognition camera 213 or the second recognition camera 223 may be set to operate at the eighth turn-on time (8).
  • the eighth turn-on time (8) of at least one of the first recognition camera 213 or the second recognition camera 223 may be set to be longer than the fourth turn-on time (4) of the LED 310 disposed in the fourth controller 300d.
  • the first wearable electronic device 200a may detect, through at least one of the first recognition camera 213 or the second recognition camera 223, the light emission cycle, turn-on time timing, and/or light emission pattern of the LEDs 310 respectively disposed in the first controller 300a, the second controller 300b, the third controller 300c, and the fourth controller 300d.
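The fifth to eighth turn-on times above make each recognition-camera exposure slightly longer than the corresponding LED pulse, so the brief flash reliably falls inside the exposure window. A minimal sketch of that scheduling idea follows, with an illustrative margin value that is not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ExposureWindow:
    start_s: float
    duration_s: float

def camera_exposure_for_led(led_offset_s: float, led_on_time_s: float,
                            margin_s: float = 200e-6) -> ExposureWindow:
    """Open the camera a little before the LED's expected turn-on and keep it open a
    little after it turns off, so the exposure is longer than the LED pulse."""
    return ExposureWindow(start_s=led_offset_s - margin_s,
                          duration_s=led_on_time_s + 2 * margin_s)

def covers(window: ExposureWindow, led_offset_s: float, led_on_time_s: float) -> bool:
    # True if the LED pulse falls entirely within the camera exposure.
    return (window.start_s <= led_offset_s and
            led_offset_s + led_on_time_s <= window.start_s + window.duration_s)

win = camera_exposure_for_led(led_offset_s=8.0e-3, led_on_time_s=20e-6)
print(win, covers(win, 8.0e-3, 20e-6))  # exposure fully contains the 20 us flash
```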
  • FIG. 9 is a diagram illustrating an embodiment in which at least one of the first recognition camera or the second recognition camera of a wearable electronic device according to an embodiment of the present invention can detect the light emission pattern of an LED disposed in at least one controller.
  • FIG. 10 is a diagram illustrating various embodiments in which at least one of the first recognition camera or the second recognition camera of a wearable electronic device according to an embodiment of the present invention can detect the light emission pattern of an LED disposed in at least one controller.
  • FIGS. 9 and 10 may be diagrams showing various embodiments in which the wearable electronic device 200 uses at least one of the first recognition camera 213 and the second recognition camera 223 to detect the first pattern (pattern 1) of the LED 310 disposed in the first controller to the fifth pattern (pattern 5) of the LED 310 disposed in the fifth controller.
  • the first to fifth controllers described in FIGS. 9 and 10 may each be configured substantially the same as the configuration of the controller 300 shown in FIG. 3 .
  • the first column 910 of FIG. 9 may represent cases where the LEDs 310 arranged in the first to fifth controllers are in the on and off states.
  • the first column 910 may be a diagram showing all of the LEDs 310 arranged in the first to fifth controllers either in a light-emitting state or in a non-light-emitting state.
  • the second column 920 of FIG. 9 may indicate a case where the LEDs 310 arranged in the first to fifth controllers are in an on state.
  • the second column 920 may be a diagram showing only the case where the LEDs 310 arranged in the first to fifth controllers are in a light-emitting state.
  • the LEDs 310 arranged in the first to fifth controllers may each include, for example, a first LED (A) and a second LED (B).
  • the first pattern (pattern 1) of the LED 310 disposed in the first controller may be such that the first LED (A) is on and the second LED (B) is off, and then the first LED (A) is on and the second LED (B) is off.
  • the second pattern (pattern 2) of the LED 310 disposed in the second controller may be such that the first LED (A) is off and the second LED (B) is on, and then the first LED (A) is off and the second LED (B) is on.
  • the third pattern (pattern 3) of the LED 310 disposed in the third controller may be such that the first LED (A) is on and the second LED (B) is on, and then the first LED (A) is off and the second LED (B) is on.
  • the fourth pattern (pattern 4) of the LED 310 disposed in the fourth controller may be such that the first LED (A) is off and the second LED (B) is on, and then the first LED (A) is on and the second LED (B) is off.
  • the fifth pattern (pattern 5) of the LED 310 disposed in the fifth controller may be such that the first LED (A) is on and the second LED (B) is off, and then the first LED (A) is on and the second LED (B) is on.
  • the wearable electronic device 200 may use at least one of the first recognition camera 213 or the second recognition camera 223 to recognize the first pattern (pattern 1) of the first controller, the second pattern (pattern 2) of the second controller, the third pattern (pattern 3) of the third controller, the fourth pattern (pattern 4) of the fourth controller, and/or the fifth pattern (pattern 5) of the fifth controller, and thereby identify the first to fifth controllers.
  • the wearable electronic device 200 may recognize the first pattern (pattern 1) of the first controller to the fifth pattern (pattern 5) of the fifth controller using at least one of the first recognition camera 213 or the second recognition camera 223, and identify the corresponding controllers.
  • the first controller may include a first pattern (pattern 1) having an arrangement in which only the first LED (A) is on and the second LED (B) is off.
  • the second controller may include a second pattern (pattern 2) having an arrangement in which only the second LED (B) is on and the first LED (A) is off.
  • the second controller may transmit information related to the second pattern (pattern 2) to the wearable electronic device 200 (e.g., the first wearable electronic device 200a of FIG. 6) through the second wearable electronic device (e.g., the second wearable electronic device 200b of FIG. 6).
  • the third pattern (pattern 3) of the third controller, the fourth pattern (pattern 4) of the fourth controller, and the fifth pattern (pattern 5) of the fifth controller may each include a pattern in which some of the first LEDs (A) and some of the second LEDs (B) are turned on and/or turned off.
  • the third controller, the fourth controller, and the fifth controller may each transmit information on the third pattern (pattern 3), the fourth pattern (pattern 4), and the fifth pattern (pattern 5) to the wearable electronic device 200 (e.g., the first wearable electronic device 200a of FIG. 6) through the corresponding wearable electronic device.
  • the arrangement of the first pattern (pattern 1) of the first controller to the fifth pattern (pattern 5) of the fifth controller is not limited to the above-described example and may be changed to various other arrangements.
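FIGS. 9 and 10 distinguish controllers by which of the first LED (A) and the second LED (B) are lit across successive observations. A minimal sketch that encodes each observation as an (A, B) on/off pair and looks the sequence up in a table is shown below; the table mirrors the example arrangement above, and the identifiers are illustrative only.

```python
# Each registered pattern is a sequence of (first LED A, second LED B) on/off observations.
PATTERNS = {
    "controller_1": ((True, False), (True, False)),   # pattern 1
    "controller_2": ((False, True), (False, True)),   # pattern 2
    "controller_3": ((True, True), (False, True)),    # pattern 3
    "controller_4": ((False, True), (True, False)),   # pattern 4
    "controller_5": ((True, False), (True, True)),    # pattern 5
}

def identify_by_pattern(observed):
    """Return the controller whose registered pattern matches the observed LED states."""
    observed = tuple(tuple(frame) for frame in observed)
    for cid, pattern in PATTERNS.items():
        if pattern == observed:
            return cid
    return None

# Two camera observations in which only LED A, then both LEDs, were seen lit -> pattern 5.
print(identify_by_pattern([(True, False), (True, True)]))  # "controller_5"
```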
  • Figure 11a is a diagram showing an embodiment in which an accessory according to an embodiment of the present invention includes a first pattern.
  • Figure 11b is a diagram showing an example in which an accessory according to an embodiment of the present invention includes a second pattern.
  • Figure 11C is a diagram showing various examples of patterns that can be arranged on an accessory according to an embodiment of the present invention.
  • the wearable electronic device 200 may recognize the first pattern 1110 of the LEDs 310 arranged on the accessory 1100 (e.g., a movable accessory device) using at least one of the first recognition camera 213 or the second recognition camera 223.
  • the wearable electronic device 200 may recognize the second pattern 1120 of the LEDs 310 arranged on the accessory 1100 (e.g., a movable accessory device) using at least one of the first recognition camera 213 or the second recognition camera 223.
  • an accessory 1100 may form various patterns by turning on and/or off a plurality of first LEDs (A) and a plurality of second LEDs (B).
  • FIG. 12 is a flowchart illustrating a method by which a wearable electronic device identifies at least one controller using at least one of a first recognition camera or a second recognition camera according to an embodiment of the present invention.
  • the processor 120 of the wearable electronic device 200 may confirm that the first controller 300a and the second controller 300b are detected within the designated space.
  • the processor 120 of the wearable electronic device 200 may also confirm, through a sensor module (e.g., a proximity sensor) or wireless communication, that the first controller 300a and the second controller 300b, in each of which an LED 310 is disposed, are detected within the designated space.
  • the processor 120 may transmit a control signal to the first controller 300a to cause the LED 310 disposed in the first controller 300a to operate at the first turn-on time (1).
  • the processor 120 may transmit a control signal to the second controller 300b to cause the LED 310 disposed in the second controller 300b to operate at the second turn-on time 2.
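Taken together, the flow of FIG. 12 can be read as: confirm the controllers are within the designated space, assign each LED a distinct turn-on time, then identify each controller by the timing observed through the recognition cameras. A minimal end-to-end sketch of that sequence follows, using hypothetical callbacks, identifiers, and values not found in the original text.

```python
def identify_two_controllers(detect_in_space, send_control_signal, observe_flash_phase,
                             period_s=33e-3):
    """Illustrative FIG. 12-style flow: detect, assign turn-on times, identify by timing."""
    controllers = ["controller_300a", "controller_300b"]

    # Step 1: confirm both controllers are detected within the designated space.
    if not all(detect_in_space(c) for c in controllers):
        return {}

    # Step 2: assign distinct turn-on times (first and second turn-on times) and send them.
    offsets = {"controller_300a": 0.0, "controller_300b": period_s / 2}
    for cid, offset in offsets.items():
        send_control_signal(cid, {"period_s": period_s, "turn_on_offset_s": offset})

    # Step 3: check that each observed flash phase matches the assigned turn-on offset.
    identified = {}
    for cid, offset in offsets.items():
        phase = observe_flash_phase(cid) % period_s
        identified[cid] = abs(phase - offset) < 1e-3
    return identified

# Dummy callbacks standing in for the recognition cameras and the wireless link.
print(identify_two_controllers(lambda c: True,
                               lambda c, cfg: None,
                               lambda c: 0.0 if c == "controller_300a" else 16.5e-3))
```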
  • the wearable electronic device 200 or 200a includes a wireless communication module 192, a sensor module 176, a first recognition camera 213 and/or a second recognition camera 223, And it may include a processor 120 operatively connected to the wireless communication module 192, the sensor module 176, the first recognition camera 213, and/or the second recognition camera 223.
  • the processor 120 may confirm, through at least one of the first recognition camera 213 and the second recognition camera 223, that the first controller 300a and the second controller 300b, in each of which an LED 310 is disposed, are detected within the designated space.
  • the processor 120 may transmit a control signal to the first controller 300a to cause the LED 310 disposed in the first controller 300a to operate at the first turn-on time (1). According to one embodiment, the processor 120 may transmit a control signal to the second controller 300b to cause the LED 310 disposed in the second controller 300b to operate at the second turn-on time (2).
  • the processor 120 may be configured to identify the first controller 300a and the second controller 300b based on recognizing the timing of the first turn-on time (1) and the second turn-on time (2) through at least one of the first recognition camera 213 and the second recognition camera 223.
  • when the processor 120 detects the first turn-on time (1) through at least one of the first recognition camera 213 and the second recognition camera 223, at least one of the first recognition camera 213 and the second recognition camera 223 may operate with a fifth turn-on time (5) that is longer than the first turn-on time (1); and when the processor 120 detects the second turn-on time (2) through at least one of the first recognition camera 213 and the second recognition camera 223, at least one of the first recognition camera 213 and the second recognition camera 223 may be configured to operate with a sixth turn-on time (6) that is longer than the second turn-on time (2).
  • the processor 120 may be configured to control the LED 310 disposed in the first controller 300a and the LED 310 disposed in the second controller 300b to operate with substantially the same light emission cycle.
  • the processor 120 may be configured to detect the movement of the first controller 300a and the second controller 300b through at least one of the first recognition camera 213, the second recognition camera 223, and the sensor module 176.
  • the processor 120 may be configured to detect the positions of the first controller 300a and the second controller 300b through at least one of the first recognition camera 213, the second recognition camera 223, and the sensor module 176.
  • the sensor module 176 may include an inertial measurement sensor that measures the angular velocity and acceleration of the first controller 300a and the second controller 300b, and/or a global positioning system (GPS) sensor that detects the locations of the first controller 300a and the second controller 300b.
  • in the wearable electronic devices 200 and 200a, the LED 310 disposed in the first controller 300a may be turned on and/or off to operate in a first pattern, and the LED 310 disposed in the second controller 300b may be turned on and/or off to operate in a second pattern different from the first pattern.
  • the processor 120 may be configured to identify the first controller 300a and the second controller 300b based on recognizing, through at least one of the first recognition camera 213 and the second recognition camera 223, the first pattern of the LED 310 disposed in the first controller 300a and the second pattern of the LED 310 disposed in the second controller 300b.
  • At least one of the first controller 300a and the second controller 300b may include a movable accessory device.
  • the method for the wearable electronic device (200, 200a) to identify the first controller 300a and the second controller 300b may include an operation in which the processor 120 confirms, through at least one of the first recognition camera 213 and the second recognition camera 223, that the first controller 300a and the second controller 300b, in each of which an LED 310 is disposed, are detected within a designated space.
  • the method may include an operation in which the processor 120 transmits a control signal to the first controller 300a to cause the LED 310 disposed in the first controller 300a to operate at a first turn-on time (1).
  • the method may include an operation in which the processor 120 transmits a control signal to the second controller 300b to cause the LED 310 disposed in the second controller 300b to operate at a second turn-on time (2).
  • the method may include an operation in which the processor 120 identifies the first controller 300a and the second controller 300b based on recognizing the timing of the first turn-on time (1) and the second turn-on time (2) through at least one of the first recognition camera 213 and the second recognition camera 223.
  • the method may include an operation in which the processor 120 controls at least one of the first recognition camera 213 and the second recognition camera 223 to operate with a fifth turn-on time (5) that is longer than the first turn-on time (1) when detecting the first turn-on time (1) through at least one of the first recognition camera 213 and the second recognition camera 223, and controls at least one of the first recognition camera 213 and the second recognition camera 223 to operate with a sixth turn-on time (6) that is longer than the second turn-on time (2) when detecting the second turn-on time (2).
  • the method may include an operation in which the processor 120 controls the LED 310 disposed in the first controller 300a and the LED 310 disposed in the second controller 300b to operate with substantially the same light emission cycle.
  • the method may include an operation in which the processor 120 detects the movement of the first controller 300a and the second controller 300b through at least one of the first recognition camera 213, the second recognition camera 223, and the sensor module 176.
  • the method may include an operation in which the processor 120 detects the positions of the first controller 300a and the second controller 300b through at least one of the first recognition camera 213, the second recognition camera 223, and the sensor module 176.
  • the sensor module 176 may include an inertial measurement sensor that measures the angular velocity and acceleration of the first controller 300a and the second controller 300b, and/or a global positioning system (GPS) sensor that detects the locations of the first controller 300a and the second controller 300b.
  • the method may include an operation in which the processor 120 controls the LED 310 disposed in the first controller 300a to be turned on and/or off to operate in a first pattern, and controls the LED 310 disposed in the second controller 300b to be turned on and/or off to operate in a second pattern different from the first pattern.
  • the method may include an operation in which the processor 120 identifies the first controller 300a and the second controller 300b based on recognizing, through at least one of the first recognition camera 213 and the second recognition camera 223, the first pattern of the LED 310 disposed in the first controller 300a and the second pattern of the LED 310 disposed in the second controller 300b.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/KR2023/011610 2022-09-23 2023-08-07 웨어러블 전자 장치 및 상기 웨어러블 전자 장치를 이용하여 컨트롤러를 식별하는 방법 WO2024063330A1 (ko)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP23762127.1A EP4369155A4 (de) 2022-09-23 2023-08-07 Tragbare elektronische vorrichtung und verfahren zur identifizierung einer steuerung unter verwendung einer tragbaren elektronischen vorrichtung

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2022-0120855 2022-09-23
KR20220120855 2022-09-23
KR10-2022-0136744 2022-10-21
KR1020220136744A KR20240041772A (ko) 2022-09-23 2022-10-21 웨어러블 전자 장치 및 상기 웨어러블 전자 장치를 이용하여 컨트롤러를 식별하는 방법

Publications (1)

Publication Number Publication Date
WO2024063330A1 true WO2024063330A1 (ko) 2024-03-28

Family

ID=88021049

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/011610 WO2024063330A1 (ko) 2022-09-23 2023-08-07 웨어러블 전자 장치 및 상기 웨어러블 전자 장치를 이용하여 컨트롤러를 식별하는 방법

Country Status (2)

Country Link
EP (1) EP4369155A4 (de)
WO (1) WO2024063330A1 (de)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150258431A1 (en) * 2014-03-14 2015-09-17 Sony Computer Entertainment Inc. Gaming device with rotatably placed cameras
US20160357249A1 (en) * 2015-06-03 2016-12-08 Oculus Vr, Llc Hand-Held Controllers For Virtual Reality System
US20170131767A1 (en) * 2015-11-05 2017-05-11 Oculus Vr, Llc Controllers with asymmetric tracking patterns
KR20170081727A (ko) * 2014-12-31 2017-07-12 주식회사 소니 인터랙티브 엔터테인먼트 사용자의 손가락의 위치를 결정하기 위한 신호 발생 및 검출기 시스템 및 방법
KR20190135870A (ko) * 2018-05-29 2019-12-09 삼성전자주식회사 외부 전자 장치의 위치 및 움직임에 기반하여 외부 전자 장치와 관련된 객체를 표시하는 전자 장치 및 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150258431A1 (en) * 2014-03-14 2015-09-17 Sony Computer Entertainment Inc. Gaming device with rotatably placed cameras
KR20170081727A (ko) * 2014-12-31 2017-07-12 주식회사 소니 인터랙티브 엔터테인먼트 사용자의 손가락의 위치를 결정하기 위한 신호 발생 및 검출기 시스템 및 방법
US20160357249A1 (en) * 2015-06-03 2016-12-08 Oculus Vr, Llc Hand-Held Controllers For Virtual Reality System
US20170131767A1 (en) * 2015-11-05 2017-05-11 Oculus Vr, Llc Controllers with asymmetric tracking patterns
KR20190135870A (ko) * 2018-05-29 2019-12-09 삼성전자주식회사 외부 전자 장치의 위치 및 움직임에 기반하여 외부 전자 장치와 관련된 객체를 표시하는 전자 장치 및 방법

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4369155A4 *

Also Published As

Publication number Publication date
EP4369155A4 (de) 2024-06-05
EP4369155A1 (de) 2024-05-15

Similar Documents

Publication Publication Date Title
WO2022131549A1 (ko) 전자 장치 및 전자 장치의 동작 방법
WO2022119105A1 (ko) 발광부를 포함하는 웨어러블 전자 장치
WO2023106895A1 (ko) 가상 입력 장치를 이용하기 위한 전자 장치 및 그 전자 장치에서의 동작 방법
WO2022169255A1 (ko) 전자 장치 및 그의 사용자 시선을 추적하고 증강 현실 서비스를 제공하는 방법
WO2022186454A1 (ko) 가요성 인쇄 회로 기판을 포함하는 전자 장치
WO2024063330A1 (ko) 웨어러블 전자 장치 및 상기 웨어러블 전자 장치를 이용하여 컨트롤러를 식별하는 방법
WO2024043546A1 (ko) 사용자의 움직임을 트래킹 하기 위한 전자 장치 및 방법
WO2023048466A1 (ko) 전자 장치 및 컨텐츠 표시 방법
WO2022231160A1 (ko) 손 제스처에 기반하여 기능을 실행하는 전자 장치 및 그 작동 방법
WO2023080420A1 (ko) 가변형 그라운드를 포함하는 웨어러블 전자 장치
WO2024019293A1 (ko) 렌즈리스 카메라를 포함하는 웨어러블 전자 장치 및 이를 이용한 이미지 처리 방법
WO2023121120A1 (ko) 간섭 제거 방법 및 상기 방법을 수행하는 전자 장치
WO2024043438A1 (ko) 카메라 모듈을 제어하는 웨어러블 전자 장치 및 그 동작 방법
WO2022050638A1 (ko) 디스플레이의 설정 변경 방법 및 전자 장치
WO2024128668A1 (ko) 광 출력 모듈을 포함하는 웨어러블 전자 장치
WO2023136533A1 (ko) 간섭 제거 방법 및 상기 방법을 수행하는 전자 장치
WO2023068588A1 (ko) 안테나를 포함하는 웨어러블 전자 장치
WO2023080419A1 (ko) 비전 정보를 이용하여 전자기기를 제어하는 웨어러블 전자 장치 및 방법
WO2023149671A1 (ko) 입력 모드를 전환하는 증강 현실 장치 및 그 방법
WO2024058434A1 (ko) 사용자의 외부 환경을 촬영하는 컨트롤 장치 및 그 동작 방법 및 컨트롤 장치와 연결된 머리 착용형 전자 장치
WO2023027276A1 (ko) 스타일러스 펜을 이용하여 복수의 기능들을 실행하기 위한 전자 장치 및 그 작동 방법
WO2023120892A1 (ko) 글린트를 이용한 시선 추적에서 광원을 제어하는 장치 및 방법
WO2024080579A1 (ko) 사용자의 자세를 가이드하기 위한 웨어러블 장치 및 그 방법
WO2022255625A1 (ko) 영상 통화 중 다양한 커뮤니케이션을 지원하는 전자 장치 및 그의 동작 방법
WO2023153607A1 (ko) 주변 조도에 기초한 ar 컨텐츠 표시 방법 및 전자 장치

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2023762127

Country of ref document: EP

Effective date: 20230907