WO2024043468A1 - Device and method for registering and deregistering an authentication pattern using objects - Google Patents


Info

Publication number
WO2024043468A1
Authority
WO
WIPO (PCT)
Prior art keywords
objects
authentication pattern
spatial
space
electronic device
Prior art date
Application number
PCT/KR2023/008334
Other languages
English (en)
Korean (ko)
Inventor
김 안스테파니
김현우
송미정
정신재
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020220113299A (published as KR20240029475A)
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2024043468A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/36 User authentication by graphic or iconic representation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/45 Structures or tools for the administration of authentication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Definitions

  • the disclosure below relates to technology for registering and deregistering authentication patterns using objects in augmented reality (AR).
  • Augmented reality is a technology that overlays a 3D (or 2D) virtual image on a real image or background so that they appear as a single image. By mixing the real environment with virtual objects, augmented reality technology allows users to see the real environment while providing a better sense of reality and additional information.
  • in order to perform user authentication, an electronic device may receive from the user a number displayed on the screen or a pattern connecting a plurality of points.
  • however, the process of entering user authentication information (for example, numeric information or pattern information) is exposed externally, so there is a problem that other users can easily observe the user's authentication information.
  • An electronic device may include a display module, a memory storing computer-executable instructions, and a processor that accesses the memory and executes the instructions.
  • the processor may scan the target space, identify a plurality of objects that can be registered in relation to the target space, select two or more objects among the identified plurality of objects in an order while specifying characteristic information describing each object, and register a space authentication pattern for the target space based on the selection order of the two or more selected objects and the characteristic information of the two or more selected objects.
  • a method implemented with a processor may include scanning a target space and identifying a plurality of objects that can be registered in relation to the target space.
  • the method may include selecting two or more objects among the identified plurality of objects in an order while specifying characteristic information describing each object.
  • the method may include registering a space authentication pattern for the target space based on a selection order of the two or more selected objects and characteristic information of the two or more selected objects.
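The registration and release flow described in the operations above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function names, the encoding of selections as ordered (object, characteristic) pairs, and the use of a hash digest are all assumptions.

```python
import hashlib

def register_space_pattern(selections):
    """Register a space authentication pattern from an ordered list of
    (object, characteristic) pairs, e.g. [("chair", "red"), ("lamp", "tall")].
    Only a digest of the canonical selection sequence is stored, so the raw
    pattern need not be kept."""
    canonical = "|".join(f"{i}:{obj}:{feat}"
                         for i, (obj, feat) in enumerate(selections))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def release_space_pattern(registered_digest, attempt):
    """Releasing (deregistering) succeeds only if the attempted selections
    match the registered objects, their characteristic information, and
    their selection order."""
    return register_space_pattern(attempt) == registered_digest
```

Because the selection index is folded into the digest, swapping the order of two otherwise-correct objects fails the check, matching the requirement that the pattern depends on both the objects' characteristics and their selection order.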
  • FIG. 1 is a block diagram of a terminal device in a network environment according to an embodiment.
  • FIG. 2 is a diagram illustrating the structure of a wearable augmented reality (AR) device according to an embodiment.
  • FIG. 3 is a diagram illustrating a camera and a gaze tracking sensor of an electronic device according to an embodiment.
  • FIG. 4 is a diagram illustrating a process in which an electronic device registers a space authentication pattern for a target space, according to an embodiment.
  • FIG. 5 is a diagram illustrating a process by which an electronic device sets the number of objects to be used to register a space authentication pattern for a target space, according to an embodiment.
  • FIG. 6A is a diagram illustrating a process in which an electronic device identifies a plurality of objects that can be registered in relation to a target space, according to an embodiment.
  • FIGS. 6B to 6C are diagrams illustrating a process in which an electronic device selects two or more objects from among a plurality of objects identified in the process of registering a spatial authentication pattern, according to an embodiment.
  • FIG. 7 is a diagram illustrating a process by which an electronic device specifies characteristic information describing an object, according to an embodiment.
  • FIG. 8 is a diagram illustrating a process in which an electronic device selects one object in the registration process of a spatial authentication pattern and then selects the next object in the order, according to an embodiment.
  • FIG. 9 is a diagram illustrating an example in which an electronic device completes registration of a space authentication pattern for a target space, according to an embodiment.
  • FIG. 10 is a diagram illustrating a process in which an electronic device transmits information about a space authentication pattern registered for a target space to another electronic device, according to an embodiment.
  • FIG. 11 is a diagram illustrating an example in which an electronic device releases a space authentication pattern registered for a target space, according to an embodiment.
  • FIG. 12 is a diagram illustrating a process in which an electronic device releases a space authentication pattern registered for a target space, according to an embodiment.
  • FIG. 13 is a diagram illustrating a process in which an electronic device selects an object in the process of releasing a spatial authentication pattern, according to an embodiment.
  • FIG. 14 is a diagram illustrating a process in which an electronic device selects one object and then selects the next object in the process of releasing a spatial authentication pattern, according to an embodiment.
  • FIG. 15 is a diagram illustrating a process by which an electronic device releases a spatial authentication pattern based on two or more objects selected in the process of releasing a spatial authentication pattern, according to an embodiment.
  • FIG. 16 is a diagram illustrating a process in which an electronic device stops releasing a spatial authentication pattern during the process of releasing the spatial authentication pattern, according to an embodiment.
  • FIGS. 17 and 18 are diagrams illustrating objects that an electronic device identifies according to the level of difficulty set for a space authentication pattern for a target space, according to an embodiment.
  • FIGS. 19A to 19C are diagrams illustrating objects that an electronic device identifies according to the level of difficulty set for a space authentication pattern, according to an embodiment.
  • FIG. 20 is a diagram illustrating a process in which an electronic device registers a space authentication pattern for each user in the process of registering a space authentication pattern for a target space, according to an embodiment.
  • FIG. 21 is a diagram illustrating a process in which an electronic device sets a usage authority level corresponding to a spatial authentication pattern during the registration process of a spatial authentication pattern, according to an embodiment.
  • FIG. 1 is a block diagram of a terminal device 101 in a network environment 100, according to various embodiments.
  • the terminal device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with at least one of the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network).
  • the terminal device 101 may communicate with the electronic device 104 through the server 108.
  • the terminal device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the terminal device 101.
  • some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into one component (e.g., the display module 160).
  • the processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the terminal device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) and an auxiliary processor 123 that can operate independently of, or together with, the main processor (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor).
  • when the terminal device 101 includes the main processor 121 and the auxiliary processor 123, the auxiliary processor 123 may be set to use less power than the main processor 121 or to be specialized for a designated function.
  • the auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • the auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the terminal device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), either on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • according to one embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing artificial intelligence models.
  • Artificial intelligence models can be created through machine learning. For example, this learning may be performed in the terminal device 101 itself, where the artificial intelligence model is executed, or through a separate server (e.g., the server 108).
  • Learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited thereto.
  • An artificial intelligence model may include multiple artificial neural network layers.
  • Artificial neural networks may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or a deep Q-network, or a combination of two or more of the above, but are not limited to the examples described above.
  • artificial intelligence models may additionally or alternatively include software structures.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the terminal device 101. Data may include, for example, input data or output data for software (e.g., program 140) and instructions related thereto.
  • Memory 130 may include volatile memory 132 or non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142, middleware 144, or application 146.
  • the input module 150 may receive commands or data to be used in a component of the terminal device 101 (e.g., the processor 120) from outside the terminal device 101 (e.g., a user).
  • the input module 150 may include, for example, a microphone, mouse, keyboard, keys (eg, buttons), or digital pen (eg, stylus pen).
  • the sound output module 155 may output sound signals to the outside of the terminal device 101.
  • the sound output module 155 may include, for example, a speaker or a receiver. Speakers can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 may visually provide information to the outside of the terminal device 101 (eg, a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
  • the audio module 170 can convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 170 may acquire sound through the input module 150, and may output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the terminal device 101.
  • the sensor module 176 may detect the operating state (e.g., power or temperature) of the terminal device 101 or the external environmental state (e.g., a user state) and generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or a light sensor.
  • the interface 177 may support one or more designated protocols that can be used to directly or wirelessly connect the terminal device 101 to an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the terminal device 101 can be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 can capture still images and moving images.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 can manage power supplied to the terminal device 101.
  • the power management module 188 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the terminal device 101.
  • the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • the communication module 190 may support establishing a direct (e.g., wired) or wireless communication channel between the terminal device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communicating through the established channel. The communication module 190 may operate independently of the processor 120 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
  • the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • among these communication modules, the corresponding module may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) Direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or WAN)).
  • the wireless communication module 192 may use subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196 to identify or authenticate the terminal device 101 within a communication network such as the first network 198 or the second network 199.
  • the wireless communication module 192 may support a 5G network after a 4G network, and next-generation communication technology, for example, new radio (NR) access technology.
  • NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine type communications, mMTC), or high reliability and low latency (ultra-reliable and low-latency communications, URLLC).
  • the wireless communication module 192 may support high frequency bands (eg, mmWave bands), for example, to achieve high data rates.
  • the wireless communication module 192 may support various technologies for securing performance in high-frequency bands, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas.
  • the wireless communication module 192 may support various requirements specified in the terminal device 101, an external electronic device (e.g., electronic device 104), or a network system (e.g., second network 199).
  • the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for realizing URLLC.
  • the antenna module 197 may transmit or receive signals or power to or from the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (eg, PCB).
  • the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190. Signals or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • other components (e.g., a radio frequency integrated circuit (RFIC)) may additionally be formed as part of the antenna module 197.
  • a mmWave antenna module may include: a printed circuit board; an RFIC disposed on or adjacent to a first side (e.g., the bottom side) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., the mmWave band); and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second side (e.g., the top side or a lateral side) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • at least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the terminal device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the external electronic devices 102 or 104 may be of the same or different type as the terminal device 101.
  • all or part of the operations performed in the terminal device 101 may be executed in one or more of the external electronic devices 102, 104, or 108. For example, when the terminal device 101 needs to perform a function or service automatically, or in response to a request from a user or another device, the terminal device 101 may, instead of or in addition to executing the function or service on its own, request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the terminal device 101.
  • the terminal device 101 may process the result as is or additionally and provide it as at least part of a response to the request.
  • for this purpose, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used, for example.
  • the terminal device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
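The delegation described above (executing a function or service locally when possible, otherwise requesting an external electronic device to perform at least part of it and processing the returned result) might be sketched as follows. The function names and the simple capability check are illustrative assumptions, not the patent's implementation:

```python
def perform_function(task, can_run_locally, request_external):
    """Hypothetical sketch of the offloading flow: the device runs the task
    itself when it can; otherwise it asks an external electronic device (or
    server) to execute at least part of it, then processes the returned
    result before providing it as part of the response."""
    if can_run_locally(task):
        return {"source": "local", "result": f"done:{task}"}
    remote_result = request_external(task)  # executed on an external device
    return {"source": "remote", "result": remote_result}
```

In the sketch, `can_run_locally` and `request_external` stand in for the device's capability check and its communication with the electronic devices 102/104 or the server 108.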
  • the external electronic device 104 may include an Internet of Things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or server 108 may be included in the second network 199.
  • the terminal device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • Figure 2 is a diagram illustrating the structure of a wearable AR device according to an embodiment.
  • the wearable AR device 200 may be worn on the user's face and provide images related to augmented reality services and/or virtual reality services to the user.
  • the wearable AR device 200 may include a first display 205, a second display 210, a screen display unit 215, an input optical member 220, a first transparent member 225a, and a second transparent member 225b.
  • the displays (e.g., the first display 205 and the second display 210) may include, for example, a liquid crystal display (LCD), a digital mirror device (DMD), a liquid crystal on silicon (LCoS) device, an organic light emitting diode (OLED), or a micro light emitting diode (micro LED).
  • when the display is made of one of a liquid crystal display, a digital mirror device, or a liquid crystal on silicon device, the wearable AR device 200 may include a light source that emits light to the screen output area of the display.
  • in another embodiment, when the display can generate light on its own, for example, when it is made of organic light emitting diodes or micro LEDs, the wearable AR device 200 can provide a good-quality virtual image to the user even without a separate light source. In one embodiment, if the display is implemented with organic light emitting diodes or micro LEDs, a light source is unnecessary, so the wearable AR device 200 can be lightweight.
  • hereinafter, a display that can generate light on its own is referred to as a self-luminous display, and the description below assumes a self-luminous display.
  • a display (eg, the first display 205 and the second display 210) according to various embodiments of the present invention may be composed of at least one micro light emitting diode (micro LED).
  • a micro LED can express red (R), green (G), and blue (B) through self-luminescence, and because it is small (e.g., 100 μm or less), each chip can implement one pixel (e.g., one of R, G, and B). Accordingly, when the display is composed of micro LEDs, high resolution can be provided without a backlight unit (BLU).
  • one pixel may include R, G, and B, and one chip may be implemented with a plurality of pixels including R, G, and B.
  • the display (e.g., the first display 205 and the second display 210) may consist of a display area composed of pixels for displaying a virtual image, and light-receiving pixels (e.g., photo sensor pixels) disposed between those pixels, which receive the light reflected from the eyes, convert it into electrical energy, and output it.
  • the wearable AR device 200 may detect the user's gaze direction (e.g., eye movement) through the light-receiving pixels. For example, the wearable AR device 200 may detect and track the gaze directions of the user's right eye and left eye through one or more light-receiving pixels constituting the first display 205 and one or more light-receiving pixels constituting the second display 210. The wearable AR device 200 may determine the position of the center of the virtual image according to the gaze directions of the user's right eye and left eye (e.g., the directions in which the pupils of the user's right eye and left eye are gazing) detected through the one or more light-receiving pixels.
  • the light emitted from the display passes through a lens (not shown) and a waveguide, and may reach the screen display unit 215 formed on the first transparent member 225a arranged to face the user's right eye and the screen display unit 215 formed on the second transparent member 225b arranged to face the user's left eye.
  • for example, the light emitted from the display passes through the waveguide, reaches the grating area formed on the input optical member 220 and the screen display unit 215, and may be reflected and transmitted to the user's eyes.
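As one way to realize the determination of the virtual image's center from the detected gaze directions described above, the convergence point of the two gaze rays could be estimated as the midpoint of the closest points between them. This is a hypothetical sketch; the patent does not specify the computation, and all names here are illustrative.

```python
def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def _along(p, d, t):
    # point p + t * d on a ray
    return tuple(pi + t * di for pi, di in zip(p, d))

def virtual_image_center(right_eye, right_dir, left_eye, left_dir):
    """Estimate where the two gaze rays converge, as the midpoint of the
    closest points between the rays (a standard closest-point-of-two-lines
    computation, used here purely as an illustration)."""
    a = _dot(right_dir, right_dir)
    b = _dot(right_dir, left_dir)
    c = _dot(left_dir, left_dir)
    w = tuple(p - q for p, q in zip(right_eye, left_eye))
    denom = a * c - b * b
    if abs(denom) < 1e-9:  # parallel gaze rays: fall back to unit depth
        return _along(right_eye, right_dir, 1.0)
    t = (b * _dot(left_dir, w) - c * _dot(right_dir, w)) / denom
    s = (a * _dot(left_dir, w) - b * _dot(right_dir, w)) / denom
    p = _along(right_eye, right_dir, t)
    q = _along(left_eye, left_dir, s)
    return tuple((pi + qi) / 2 for pi, qi in zip(p, q))
```

For two eyes 6 cm apart both gazing at a point 1 m straight ahead, the estimate recovers that point.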
  • the first transparent member 225a and/or the second transparent member 225b may be formed of a glass plate, a plastic plate, or a polymer, and may be made transparent or translucent.
  • a lens may be disposed on the front of a display (eg, the first display 205 and the second display 210).
  • Lenses may include concave lenses and/or convex lenses.
  • the lens may include a projection lens or a collimation lens.
• the screen display unit 215 or the transparent member may include a lens including a waveguide, or a reflective lens.
• the waveguide may be made of glass, plastic, or polymer, and may include a nano-pattern, for example a polygonal or curved grating structure, formed on one inner or outer surface.
  • light incident on one end of the waveguide may be propagated inside the display waveguide by a nano-pattern and provided to the user.
  • a waveguide composed of a free-form prism may provide incident light to the user through a reflection mirror.
• the waveguide may include at least one of at least one diffractive element (e.g., a diffractive optical element (DOE) or a holographic optical element (HOE)) or a reflective element (e.g., a reflective mirror).
  • the waveguide may guide light emitted from the displays 205 and 210 to the user's eyes using at least one diffractive element or reflective element included in the waveguide.
• the diffractive element may include an input optical member 220 and an output optical member (not shown).
  • the input optical member 220 may refer to an input grating area
  • the output optical member (not shown) may refer to an output grating area.
• the input grating area may serve as an input terminal that diffracts (or reflects) light output from the display (e.g., the micro LEDs of the first display 205 and the second display 210) in order to transmit the light to the transparent members (e.g., the first transparent member 250a and the second transparent member 250b) of the screen display unit 215.
  • the output grating area may serve as an outlet that diffracts (or reflects) light transmitted to the transparent members of the waveguide (e.g., the first transparent member 250a and the second transparent member 250b) to the user's eyes.
  • the reflective element may include a total internal reflection (TIR) optical element or a total internal reflection waveguide.
• total internal reflection is one method of guiding light: an angle of incidence is created so that light (e.g., a virtual image) input through the input grating area is 100% reflected from one surface (e.g., a specific surface) of the waveguide, so that the light is 100% delivered to the output grating area.
  • light emitted from the displays 205 and 210 may be guided through an optical input member 220 to a waveguide.
  • Light moving inside the waveguide may be guided toward the user's eyes through the output optical member.
  • the screen display unit 215 may be determined based on light emitted in the eye direction.
• the first cameras 245a and 245b may include cameras used for 3 degrees of freedom (3DoF) or 6DoF head tracking, hand detection and tracking, and gesture and/or space recognition.
  • the first cameras 245a and 245b may include a global shutter (GS) camera to detect and track the movement of the head and hand.
• the first cameras 245a and 245b may be stereo cameras used for head tracking and spatial recognition, and cameras of the same standard and performance may be used.
• the first cameras 245a and 245b may be GS cameras with excellent performance (e.g., with respect to image drag) in order to detect and track fine movements such as fast hand and finger gestures.
  • the first cameras 245a and 245b may be RS (rolling shutter) cameras.
• the first cameras 245a and 245b can perform SLAM functions through spatial recognition and depth shooting for 6DoF.
  • the first cameras 245a and 245b may perform a user gesture recognition function.
  • the second cameras 275a and 275b may be used to detect and track the pupil.
  • the second cameras 275a and 275b may be referred to as cameras for eye tracking (ET).
  • the second camera 265a can track the user's gaze direction.
  • the wearable AR device 200 may consider the direction of the user's gaze and position the center of the virtual image projected on the screen display unit 215 according to the direction in which the user's eyes are gazing.
  • the second cameras 275a and 275b for tracking the gaze direction may be GS cameras to detect the pupil and track fast pupil movement.
  • the second camera 265a may be installed for the left eye and the right eye, respectively, and cameras with the same performance and specifications may be used as the second camera 265a for the left eye and the right eye.
  • the third camera 265 may be referred to as high resolution (HR) or photo video (PV) and may include a high resolution camera.
  • the third camera 265 may include a color camera equipped with functions for obtaining high-definition images, such as an auto focus (AF) function and an optical image stabilizer (OIS). It is not limited to this, and the third camera 265 may include a global shutter (GS) camera or a rolling shutter (RS) camera.
• at least one sensor (e.g., a gyro sensor, an acceleration sensor, a geomagnetic sensor, a touch sensor, an illumination sensor, and/or a gesture sensor) and the first cameras 245a and 245b may perform at least one of head tracking for 6DoF, motion detection and prediction (pose estimation & prediction), and gesture and/or space recognition.
  • the first cameras 245a and 245b may be used separately as a camera for head tracking and a camera for hand tracking.
  • the lighting units 230a and 230b may have different purposes depending on where they are attached.
• the lighting units 230a and 230b may be attached together with the first cameras 245a and 245b mounted around a hinge (e.g., the first hinge 240a and the second hinge 240b) connecting the frame and the temple, or around a bridge connecting the frames.
  • the lighting units 230a and 230b can be used as a means of supplementing surrounding brightness.
  • the lighting units 230a and 230b may be used in a dark environment or when it is difficult to detect a subject to be photographed due to mixing of various light sources and reflected light.
• the lighting units 230a and 230b attached around the frame of the wearable AR device 200 can be used as an auxiliary means to facilitate gaze detection when photographing the pupil with the second cameras 275a and 275b.
  • the lighting units 230a and 230b may include an IR (infrared) LED with an infrared wavelength.
• the PCB (e.g., the first PCB 235a and the second PCB 235b) may include a processor (not shown), a memory (not shown), and a communication module (not shown) that control the components of the wearable AR device 200.
  • the communication module may be configured the same as the communication module 190 of FIG. 1, and the description of the communication module 190 may be applied in the same way.
  • the communication module may support establishment of a direct (e.g., wired) communication channel or wireless communication channel between the wearable AR device 200 and an external electronic device, and performance of communication through the established communication channel.
  • the PCB can transmit electrical signals to the components that make up the wearable AR device 200.
  • a communication module operates independently of the processor and may include one or more communication processors supporting direct (e.g., wired) communication or wireless communication.
• the communication module may include a wireless communication module (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module (e.g., a local area network (LAN) communication module or a power line communication module).
• the corresponding communication module (not shown) may communicate with external electronic devices through a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA), or through a long-distance communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or WAN).
  • These various types of communication modules may be integrated into one component (e.g., a single chip) or may be implemented as a plurality of separate components (e.g., multiple chips).
• the wireless communication module may support a 5G network following the 4G network and next-generation communication technology, for example, new radio (NR) access technology. NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine type communications, mMTC), or ultra-reliable and low-latency communications (URLLC).
  • the wireless communication module may support high frequency bands (e.g., mmWave bands), for example, to achieve high data rates.
• the wireless communication module may support various technologies for securing performance in high frequency bands, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beam-forming, or a large scale antenna.
  • the wearable AR device 200 may further include an antenna module (not shown).
  • the antenna module can transmit signals or power to or receive signals or power from the outside (e.g., an external electronic device).
• the antenna module may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (e.g., the first PCB 235a and the second PCB 235b).
• the antenna module may include a plurality of antennas (e.g., an array antenna).
  • a plurality of microphones may process external acoustic signals into electrical voice data.
  • Processed voice data can be used in various ways depending on the function (or application being executed) being performed by the wearable AR device 200.
  • a plurality of speakers may output audio data received from a communication module or stored in a memory.
  • the battery 260 may include one or more batteries and may supply power to components constituting the wearable AR device 200.
  • the visors 270a and 270b may adjust the amount of external light incident on the user's eyes according to the transmittance.
  • the visors 270a and 270b may be located in front or behind the screen display unit 215.
• the front of the screen display unit 215 may indicate the direction away from the user wearing the wearable AR device 200, and the rear may indicate the direction toward the user wearing the wearable AR device 200.
  • the visors 270a and 270b can protect the screen display unit 215 and adjust the amount of external light transmitted.
  • the visors 270a and 270b may include an electrochromic element that changes color and adjusts transmittance according to the applied power.
  • Electrochromism is a phenomenon in which color changes due to an oxidation-reduction reaction caused by applied power.
  • the visors 270a and 270b can adjust the transmittance of external light by using the color change of the electrochromic element.
  • the visors 270a and 270b may include a control module and an electrochromic element.
  • the control module can control the electrochromic element and adjust the transmittance of the electrochromic element.
  • FIG. 4 is a diagram illustrating a process in which an electronic device registers a space authentication pattern for a target space, according to an embodiment.
• an electronic device (e.g., the terminal device 101 of FIG. 1 or the wearable AR device 200 of FIG. 2) according to an embodiment may have a see-through function that provides augmented reality (AR).
  • the see-through function may refer to a function that delivers additional information or images in real time as a single image while transmitting actual external images to the user's eyes through the display.
  • the electronic device may scan a target space and identify a plurality of objects that can be registered in relation to the target space.
• the electronic device can enter the target space.
  • a plurality of objects that the electronic device identifies in relation to the target space may be objects placed in the target space or virtual objects. For example, when an electronic device enters a target space, it may place a preset virtual object at a predetermined location within the target space.
  • the electronic device may select two or more objects from among a plurality of identified objects in an order while specifying characteristic information describing the objects.
  • the electronic device may select two or more objects from among a plurality of identified objects in order and specify at least one feature information for each object.
  • Feature information may include, for example, at least one of information indicating the properties of the object, information indicating the state of the object, and information indicating the relative position of the object.
  • the electronic device may register a space authentication pattern for the target space based on the selection order of the two or more selected objects and the characteristic information of the two or more selected objects.
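• The registration flow above (ordered object selection plus per-object feature information) can be sketched in code. The following is a minimal illustration only; all names (`FeatureInfo`, `SpaceAuthPattern`, the example objects) are hypothetical and not part of the embodiment:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class FeatureInfo:
    """One piece of feature information describing an object
    (hypothetical model: property, state, or relative position)."""
    kind: str
    value: str

@dataclass
class SpaceAuthPattern:
    """A space authentication pattern: objects in selection order,
    each paired with its designated feature information."""
    entries: list = field(default_factory=list)

    def add_object(self, object_id: str, features: set) -> None:
        # Selection order is preserved by list position.
        self.entries.append((object_id, frozenset(features)))

# Register a pattern by selecting two or more objects in order,
# specifying at least one piece of feature information for each.
pattern = SpaceAuthPattern()
pattern.add_object("lamp", {FeatureInfo("property", "color:red")})
pattern.add_object("clock", {FeatureInfo("state", "power:on"),
                             FeatureInfo("relative_position", "on the desk")})
```

The selection order and the per-object feature sets together constitute the registered pattern; both must match later when the pattern is released.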
  • FIG. 5 is a diagram illustrating a process in which an electronic device sets the number of objects to be used to register a space authentication pattern for a target space, according to an embodiment.
• when an electronic device (e.g., the terminal device 101 of FIG. 1 or the wearable AR device 200 of FIG. 2) enters the target space 510, it may determine whether a space authentication pattern for the target space 510 is registered. If a space authentication pattern for the target space 510 is registered, the electronic device may determine whether to initiate a process of releasing the space authentication pattern registered for the target space 510. The release of the space authentication pattern registered for the target space 510 will be described in detail with reference to FIGS. 12 to 16.
  • the electronic device may determine whether to initiate a process of registering the space authentication pattern for the target space 510. For example, the electronic device may determine whether to initiate a registration process for the space authentication pattern for the target space 510 based on user input.
  • the electronic device may set the number of objects to be used for registration of the space authentication pattern based on user input.
  • the electronic device may display a user interface 521 for selecting the number of objects to be used for registering a spatial authentication pattern on the screen 520 of the electronic device. For example, the electronic device may set the number of objects to be used to register a space authentication pattern for the target space 510 to three.
  • FIG. 6A is a diagram illustrating a process in which an electronic device identifies a plurality of objects that can be registered in relation to a target space, according to an embodiment.
• when the number of objects to be used for registration of a space authentication pattern is set, the electronic device (e.g., the terminal device 101 in FIG. 1 or the wearable AR device 200 in FIG. 2) can identify a plurality of registerable objects 631, 632, 633, and 634 related to the target space.
  • the electronic device may identify an object or virtual object placed in the target space 610 as an object that can be registered in relation to the target space 610.
• the electronic device may display boundary areas 641, 642, 643, and 644 corresponding to each of the identified plurality of objects 631, 632, 633, and 634 on the screen 620. For example, referring to FIG. 6A, the electronic device may display a border area (e.g., border area 641) on the screen 620 so as to include the area in which an identified object (e.g., object 631) is visible.
  • the border area may have the shape of a cube with the minimum size including the area where the object is visible, but is not limited thereto.
  • the electronic device may display the boundary area on the screen to surround the area where the identified object is visible.
  • the electronic device may display the total number of objects that can be registered in relation to the target space 610 on the screen or announce them through voice.
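• The border area described above — a cube of minimum size enclosing the area where an object is visible — can be sketched as follows, assuming the object's visible extent is given as 3D points (a hypothetical simplification; the embodiment does not prescribe this computation):

```python
def bounding_cube(points):
    """Axis-aligned cube of minimum size enclosing the given 3D points:
    the largest per-axis extent becomes the cube's edge length, and the
    cube is centered on the points' extents on every axis."""
    xs, ys, zs = zip(*points)
    mins = (min(xs), min(ys), min(zs))
    maxs = (max(xs), max(ys), max(zs))
    edge = max(hi - lo for lo, hi in zip(mins, maxs))
    half = edge / 2
    center = tuple((lo + hi) / 2 for lo, hi in zip(mins, maxs))
    return (tuple(c - half for c in center),  # min corner
            tuple(c + half for c in center))  # max corner

lo, hi = bounding_cube([(0, 0, 0), (2, 1, 1)])
# edge = 2 (the x extent), so the cube pads the y and z axes
```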
  • FIGS. 6B to 6C are diagrams illustrating a process in which an electronic device selects two or more objects from among a plurality of objects identified in the process of registering a spatial authentication pattern, according to an embodiment.
  • an electronic device may track the center point 670 of the screen 620.
• the electronic device may select the object 631 as an object to be used for registration of the spatial authentication pattern of the target space 610 based on the tracked center point 670 of the screen 620 being continuously positioned for more than a threshold time within the area where the object 631 is visible.
• in this case, the electronic device may display on the screen a user interface 621 inquiring whether to use the object 631 for the space authentication pattern of the target space 610.
  • the threshold time may be, but is not limited to, 3 seconds.
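• The dwell-based selection above (the screen center point staying within the object's visible area for more than a threshold time, e.g., 3 seconds) can be sketched as a small state machine; the class name and sampling interface are hypothetical:

```python
class DwellSelector:
    """Selects an object once the tracked screen center point has stayed
    inside the object's visible area continuously for a threshold time."""
    def __init__(self, threshold_s: float = 3.0):  # 3 s, per the example
        self.threshold_s = threshold_s
        self._inside_since = None

    def update(self, t: float, center_inside: bool) -> bool:
        """Feed one tracking sample; returns True once dwell completes."""
        if not center_inside:
            self._inside_since = None  # leaving the area resets the dwell
            return False
        if self._inside_since is None:
            self._inside_since = t
        return (t - self._inside_since) >= self.threshold_s

sel = DwellSelector()
sel.update(0.0, True)   # dwell starts
sel.update(2.0, True)   # still below the 3 s threshold
```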
• when the electronic device determines to use the object 631 for registration of the space authentication pattern of the target space 610, it may select the object 631 as an object to be used for registration of the spatial authentication pattern of the target space 610 based on capturing the object 631.
  • the appearance of the object 631 may vary depending on the direction from which the object 631 is viewed. Accordingly, the electronic device can complete the capture of the object 631 by capturing the form in which the object 631 is viewed from all directions. To display the progress of capturing the object 631, the electronic device may display a closed curve 650 surrounding the area where the object 631 is visible on the screen 620.
• when the electronic device captures the shape of the object 631 corresponding to the first direction 681, the part of the closed curve 650 corresponding to the first direction 681 may be displayed in a second color different from the existing first color.
  • the electronic device may display the portion of the closed curve 650 corresponding to the second direction 682 in a color different from the existing color.
• when the electronic device captures all visible shapes of the object 631 in each direction, the entire area of the closed curve 650 may be displayed in the second color.
  • the electronic device may select the object 631 as an object to be used for registering a space authentication pattern for the target space 610.
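• The capture-progress display above can be modeled as a coverage check over viewing directions: each captured direction recolors its portion of the closed curve, and capture completes when every direction is covered. A sketch (the direction names and class are hypothetical):

```python
class CaptureProgress:
    """Tracks which viewing directions of an object have been captured.
    Each captured direction corresponds to recoloring part of the closed
    curve; capture is complete when no direction remains."""
    def __init__(self, directions):
        self.remaining = set(directions)
        self.captured = set()

    def capture(self, direction) -> bool:
        """Record one captured direction; True once all are covered."""
        if direction in self.remaining:
            self.remaining.discard(direction)
            self.captured.add(direction)
        return not self.remaining

progress = CaptureProgress({"front", "back", "left", "right"})
progress.capture("front")   # part of the closed curve changes color
```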
  • FIG. 7 is a diagram illustrating a process in which an electronic device specifies characteristic information describing an object, according to an embodiment.
• when an electronic device selects the object 731 as an object to be used in the space authentication pattern for the target space 710, it can specify feature information describing the object 731.
  • the electronic device may generate a plurality of characteristic information corresponding to the object 731 based on the captured form of the object 731.
  • the electronic device may generate characteristic information corresponding to the object 731 through vision recognition.
• feature information corresponding to the object 731 may include, for example, feature information 751, 752, and 753 indicating the properties of the object 731, feature information 754 indicating the current state of the object 731, and feature information 755 indicating the relative position of the object 731.
  • the electronic device may display a plurality of feature information 751 to 755 generated for the object 731 on the screen.
  • the electronic device may designate at least one feature information 752 and 753 among the plurality of feature information 751 to 755 generated for the object 731 as feature information describing the object 731.
• the electronic device may always designate essential feature information (e.g., feature information 753) among the plurality of feature information 751 to 755 generated for the object 731 as feature information describing the object 731, regardless of user input.
• the electronic device may designate the remaining feature information (e.g., feature information 751, 752, 754, and 755), excluding the essential feature information, among the plurality of feature information 751 to 755 corresponding to the object 731 as feature information describing the object 731 based on user input.
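• The designation rule above — essential feature information is always included, the rest only on user selection — can be sketched as a set operation (the function name and feature labels are hypothetical):

```python
def designate_features(generated: set, essential: set, user_selected: set) -> set:
    """Feature information describing an object: essential features are
    designated regardless of user input; remaining generated features are
    designated only when the user selects them."""
    return set(essential) | (user_selected & (generated - essential))

generated = {"f751", "f752", "f753", "f754", "f755"}
essential = {"f753"}
# The user picks f752; f753 is included automatically.
designated = designate_features(generated, essential, {"f752"})
```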
  • FIG. 8 is a diagram illustrating a process in which an electronic device selects one object in the registration process of a spatial authentication pattern and then selects the next object in the order, according to an embodiment.
  • an electronic device may select two or more objects among a plurality of identified objects in an order.
• when the electronic device selects one object as the n-th order object to be used for the spatial authentication pattern of the target space 810, it may display a graphic object indicating the n-th order in the area where the object is visible.
  • n may be a natural number of 1 or more.
• the electronic device may select the object 831 as the first-order object to be used for registration of the spatial authentication pattern, and may display the graphic object 871 representing the first order in the area where the object 831 is visible.
• the electronic device may select the object 832 as the second-order object to be used for registration of the spatial authentication pattern, and may display the graphic object 872 representing the second order in the area where the object 832 is visible.
• the electronic device may display a graphic object connecting the area where one object (e.g., object 831) selected as an object to be used for registration of the spatial authentication pattern is shown and the area where the object (e.g., object 832) selected in the following order is shown. For example, the electronic device may connect the point 871 within the area where the object 831 is shown and the point 872 within the area where the object 832, selected next after the object 831, is shown with a solid line 891.
• the electronic device may display a graphic object connecting the area where the object (e.g., object 832) last selected at the current time as an object to be used for registration of the spatial authentication pattern is shown and the center point 870 of the screen 820. For example, the electronic device may connect the point 872 within the area where the object 832 is visible and the center point 870 of the screen 820 with a dotted line 899.
• since the electronic device does not need to select any more objects once the object in the last order is selected, the area where the last-order object is shown and the center point of the screen may not be connected when the last-order object is selected.
• in the process of registering the space authentication pattern of the target space 810, the electronic device may display a user interface 880 that indicates the total number of objects to be used in the space authentication pattern of the target space 810 and the number of objects selected up to the current point.
• for example, the electronic device may display, in the user interface 880, as many graphic objects 881, 882, and 883 as the total number of objects (e.g., 3) to be used for registering the space authentication pattern of the target space 810, and may display, among the displayed graphic objects 881, 882, and 883, as many graphic objects (e.g., 881 and 882) as the number of objects selected up to the current point (e.g., 2) in a color different from the remaining graphic object (e.g., 883).
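• The progress indicator above (graphic objects 881 to 883, recolored as objects are selected) reduces to marking selected slots differently from unselected ones; a sketch with hypothetical marker values:

```python
def progress_indicator(total: int, selected: int) -> list:
    """One marker per object slot: 'filled' for objects already selected,
    'empty' for slots still to be chosen (mirrors the differently colored
    graphic objects in the user interface)."""
    return ["filled" if i < selected else "empty" for i in range(total)]

# Three objects total, two selected so far (as in the example)
progress_indicator(3, 2)
```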
  • FIG. 9 is a diagram illustrating an example in which an electronic device completes registration of a space authentication pattern for a target space, according to an embodiment.
• an electronic device (e.g., the terminal device 101 of FIG. 1 or the wearable AR device 200 of FIG. 2) according to an embodiment may select two or more objects 931, 932, and 933 to register a space authentication pattern for the target space 910, and may specify feature information for each of the two or more objects 931, 932, and 933.
  • the electronic device may set the number of objects to be used to register a space authentication pattern for the target space 910 to three.
  • the electronic device may register the space authentication pattern of the target space 910 by selecting the object 931, the object 932, and the object 933 in that order.
• the electronic device may display a graphic object 961 representing the first order on the area where the object 931 selected first is shown, a graphic object 962 representing the second order on the area where the object 932 selected second is shown, and a graphic object 963 representing the third order on the area where the object 933 selected third is shown.
• the electronic device may connect the first-order object 931 and the second-order object 932 with a solid line 991, and may connect the second-order object 932 and the third-order object 933 with a solid line 992.
• the electronic device may display a user interface 921 inquiring whether to complete registration of the space authentication pattern for the target space 910.
• when the electronic device determines to complete registration of the spatial authentication pattern, it may complete registration of the space authentication pattern for the target space 910 based on the selection order of the selected objects 931, 932, and 933 and the feature information specified for each of the selected objects 931, 932, and 933.
• the electronic device may complete registration of the spatial authentication pattern by sequentially selecting, at a certain first location in the target space 910, two or more objects to be used for the registration and specifying feature information of the two or more selected objects.
• alternatively, after sequentially selecting, at a certain first location in the target space 910, two or more objects to be used for registration of the space authentication pattern and specifying feature information of the two or more selected objects, the electronic device may complete registration of the spatial authentication pattern in response to selecting the same two or more objects in the same selection order at a second location in the target space 910 that is different from the first location.
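• The two-location confirmation above can be sketched as a comparison of the two selection passes; registration completes only when the second location repeats the same objects in the same order (all names hypothetical):

```python
def confirm_registration(first_pass, second_pass_order) -> bool:
    """Registration completes only when the selection made at the second
    location repeats the same objects in the same selection order as the
    first pass (which also carries the specified feature information)."""
    return [obj for obj, _features in first_pass] == list(second_pass_order)

first = [("lamp", {"color:red"}),
         ("clock", {"power:on"}),
         ("vase", {"on the shelf"})]
confirm_registration(first, ["lamp", "clock", "vase"])   # same order
```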
  • FIG. 10 is a diagram illustrating a process in which an electronic device transmits information about a space authentication pattern registered for a target space to another electronic device, according to an embodiment.
• an electronic device (e.g., the terminal device 101 in FIG. 1 or the wearable AR device 200 in FIG. 2) according to an embodiment may register a space authentication pattern for the target space and then transmit information about the registered space authentication pattern to another electronic device.
• for each of the objects (e.g., objects 931, 932, and 933 in FIG. 9) used to register the spatial authentication pattern of the target space, the electronic device may generate contents (e.g., contents 1010, 1020, and 1030) in which a graphic object (e.g., graphic object 1011) representing the order in which one object (e.g., object 931 in FIG. 9) was selected when registering the space authentication pattern, an image (e.g., image 1012) representing the shape of the object, and the feature information (e.g., feature information 1013) specified for the object are mapped.
  • the electronic device may transmit a plurality of contents 1010, 1020, and 1030 generated in response to the space authentication pattern registered for the target space to another electronic device.
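• The contents transmitted to the other electronic device bundle, per object, the selection order, an image of the object's shape, and the specified feature information. A sketch of that mapping (the field names and the image placeholder are hypothetical):

```python
def build_contents(pattern_entries):
    """For each object used in the registered pattern, bundle its selection
    order, an image of its captured shape, and its specified feature
    information into one content item to transmit."""
    return [{"order": i + 1,
             "image": f"{obj}.png",   # placeholder for the captured shape
             "features": sorted(features)}
            for i, (obj, features) in enumerate(pattern_entries)]

contents = build_contents([("lamp", {"color:red"}),
                           ("clock", {"power:on"})])
```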
  • FIG. 11 is a diagram illustrating an example in which an electronic device releases a space authentication pattern registered for a target space, according to an embodiment.
• an electronic device (e.g., the terminal device 101 in FIG. 1 or the wearable AR device 200 in FIG. 2) according to an embodiment may initiate a process of releasing the space authentication pattern registered for the target space.
  • the electronic device may release a space authentication pattern registered for the target space based on selecting two or more objects from among a plurality of identified objects.
• the electronic device may release the space authentication pattern registered for the target space based on selecting, in a preset selection order, two or more objects identical to the two or more objects (e.g., objects 931, 932, and 933 of FIG. 9) used to register the space authentication pattern.
• the preset selection order may represent the selection order of the two or more objects selected during the registration process of the spatial authentication pattern. For example, the electronic device may release the space authentication pattern based on selecting two or more objects identical to the objects (e.g., objects 931, 932, and 933 in FIG. 9) used for registration in the same order in which those objects were selected during registration.
  • the preset feature information may indicate feature information specified for each of two or more objects selected during the registration process of a spatial authentication pattern.
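• The release check above — same objects, preset selection order, matching feature information — can be sketched as follows (a simplified illustration; the function and data shapes are hypothetical):

```python
def try_release(registered, attempt) -> bool:
    """The pattern is released only when the attempt selects the same
    objects in the registered selection order and the feature information
    observed for each object matches what was specified at registration."""
    if len(attempt) != len(registered):
        return False
    return all(r_obj == a_obj and r_feat == a_feat
               for (r_obj, r_feat), (a_obj, a_feat) in zip(registered, attempt))

registered = [("lamp", {"color:red"}), ("clock", {"power:on"})]
try_release(registered, [("lamp", {"color:red"}), ("clock", {"power:on"})])
```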
• instead of selecting objects, the electronic device may release the spatial authentication pattern registered for the target space using images and feature information corresponding to each of the two or more objects used for registration of the spatial authentication pattern. According to one embodiment, when releasing the spatial authentication pattern based on selecting objects fails consecutively a preset number of times, the electronic device may prohibit releasing the spatial authentication pattern based on selecting objects. For example, the preset number of times may be 5, but is not limited thereto. In this case, the electronic device may release the space authentication pattern registered for the target space using the images and feature information corresponding to each of the two or more objects used to register the space authentication pattern.
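• The fallback above — prohibiting object-based release after a preset number of consecutive failures (e.g., 5) and switching to content-based release — can be sketched as a failure counter (the class name is hypothetical):

```python
class ReleaseGuard:
    """Counts consecutive failed object-selection release attempts; after
    a preset number of failures, object-based release is prohibited and
    the device falls back to image/feature-content-based release."""
    def __init__(self, max_failures: int = 5):  # 5, per the example
        self.max_failures = max_failures
        self.failures = 0

    def record(self, success: bool) -> None:
        self.failures = 0 if success else self.failures + 1

    @property
    def object_release_allowed(self) -> bool:
        return self.failures < self.max_failures

guard = ReleaseGuard()
for _ in range(5):
    guard.record(success=False)
# guard.object_release_allowed is now False: use content-based release
```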
  • the electronic device may sequentially display content corresponding to each of two or more objects (e.g., objects 931, 932, and 933 in FIG. 9) used to register a spatial authentication pattern on the screen.
  • based on receiving a user input for sequentially selecting the content corresponding to each of the two or more objects (e.g., objects 931, 932, and 933 in FIG. 9) used to register the spatial authentication pattern, the electronic device may release the registered spatial authentication pattern.
  • the electronic device may display on the screen content 1110 to which an image representing the shape of the object selected in the first order during the registration process of the spatial authentication pattern (e.g., object 931 in FIG. 9) and the feature information specified for that object are mapped.
  • the electronic device may display on the screen, together with the content 1110, content (e.g., contents 1120 and 1130) to which an image representing the shape of another object different from the object selected in the first order (e.g., object 931 in FIG. 9) and feature information describing that other object are mapped.
  • based on receiving a user input for selecting the content 1110 corresponding to the object selected in the first order (e.g., object 931 in FIG. 9), the electronic device may display content (not shown) corresponding to the object selected in the second order (e.g., object 932 in FIG. 9) together with other contents.
  • in this way, the electronic device can release the registered spatial authentication pattern by sequentially selecting the content corresponding to each of the two or more objects.
  • based on receiving a user input for selecting content (e.g., contents 1120 and 1130) different from the content corresponding to the objects used to register the spatial authentication pattern (e.g., content 1110), the electronic device may determine that the release of the spatial authentication pattern has failed.
  • FIG. 12 is a diagram illustrating a process in which an electronic device releases a space authentication pattern registered for a target space, according to an embodiment.
  • the electronic device (e.g., the terminal device 101 in FIG. 1 or the wearable AR device 200 in FIG. 2) may determine whether to initiate the process of releasing the spatial authentication pattern registered for the target space 1210. For example, the electronic device may determine whether to initiate the release process for the space authentication pattern registered for the target space 1210 based on a user input. When the electronic device decides to release the space authentication pattern registered for the target space 1210, it can identify a plurality of selectable objects.
  • the electronic device can select an object using the center point 1270 of the screen 1220, which will be described later with reference to FIG. 13.
  • the electronic device may display a user interface 1280 on the screen 1220 that indicates the number of objects used to register the space authentication pattern for the target space 1210. For example, referring to FIG. 12, the electronic device may display, in the user interface 1280, as many graphic objects (1281, 1282, ..., 1286) as the total number of objects (e.g., 6) used to register the space authentication pattern for the target space 1210.
  • in the process of releasing the spatial authentication pattern, the electronic device may display the graphic objects corresponding to the number of objects selected up to the current point in a color different from that of the remaining graphic objects, thereby notifying the user of the electronic device of the number of objects selected so far to release the spatial authentication pattern.
  • FIG. 13 is a diagram illustrating a process in which an electronic device selects an object in the process of releasing a spatial authentication pattern, according to an embodiment.
  • the electronic device may identify a plurality of selectable objects (1331, 1332, 1333, and 1334) in the process of releasing the spatial authentication pattern.
  • the plurality of objects 1331, 1332, 1333, and 1334 that the electronic device identifies during the process of releasing the spatial authentication pattern may include two or more objects selected during the registration process of the spatial authentication pattern.
  • the electronic device may display a boundary area corresponding to each of the identified plurality of objects 1331, 1332, 1333, and 1334 on the screen.
  • the electronic device may select an object through the center point 1370 of the screen 1320.
  • based on the center point 1370 of the screen 1320 being continuously positioned within the area where one identified object 1331 is visible for more than a threshold time, the electronic device may select the identified object 1331 as an object to be used for releasing the spatial authentication pattern.
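  • The dwell-based selection described above can be sketched as follows. This is an illustrative sketch only, not the claimed implementation; the function name, sample format, and threshold value are assumptions.

```python
def dwell_select(samples, threshold_time):
    """Return the object selected by center-point dwell, or None.

    samples: list of (timestamp, object_id or None) pairs describing which
    object's visible area contained the screen center point at each instant.
    An object is selected once the center point stays continuously inside
    its area for at least threshold_time (hypothetical sketch of the
    dwell-based selection described above).
    """
    dwell_start = None
    current = None
    for t, obj in samples:
        if obj is not None and obj == current:
            if t - dwell_start >= threshold_time:
                return obj
        else:
            # The center point moved to a different object (or off all
            # objects), so the dwell timer restarts.
            current = obj
            dwell_start = t
    return None


samples = [(0.0, "plant"), (0.4, "plant"), (0.9, None), (1.0, "lamp"),
           (1.6, "lamp"), (2.1, "lamp")]
print(dwell_select(samples, threshold_time=1.0))  # lamp
```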
  • FIG. 14 is a diagram illustrating a process in which an electronic device selects one object and then selects the next object in the process of releasing a spatial authentication pattern, according to an embodiment.
  • an electronic device (e.g., the terminal device 101 of FIG. 1 or the wearable AR device 200 of FIG. 2) may sequentially select a plurality of objects from among the plurality of objects 1431, 1432, and 1433 identified in the process of releasing the spatial authentication pattern.
  • the electronic device may sequentially select a plurality of objects equal to the number of objects used to register a spatial authentication pattern among the identified plurality of objects.
  • the electronic device may connect one selected object (e.g., object 1431) and the object selected in the next order (e.g., object 1432) to each other. For example, the electronic device may connect, with a solid line 1491, the point 1471 within the area where the object 1431 is shown and the point 1472 within the area where the object 1432, selected after the object 1431, is shown.
  • the electronic device may connect the object last selected at the current time (e.g., the object 1432) and the center point 1470 of the screen 1420 to each other.
  • the electronic device may connect the point 1472 within the area where the object 1432 is visible and the center point 1470 of the screen 1420 with a dotted line 1499.
  • FIG. 15 is a diagram illustrating a process in which an electronic device releases a spatial authentication pattern based on a plurality of objects selected in the process of releasing the spatial authentication pattern, according to an embodiment.
  • in the process of releasing the spatial authentication pattern, the electronic device may select as many objects as the number of objects used to register the spatial authentication pattern. For example, referring to FIG. 15, when the number of objects used to register the spatial authentication pattern for the target space 1510 is 6, the electronic device may sequentially select six objects from among the plurality of objects identified in the process of releasing the spatial authentication pattern.
  • the electronic device may determine whether to cancel the spatial authentication pattern by comparing a plurality of objects selected in the process of canceling the spatial authentication pattern with two or more objects selected in the process of registering the spatial authentication pattern.
  • the electronic device may determine whether to release the spatial authentication pattern by comparing the plurality of objects selected in the process of releasing the spatial authentication pattern with the selection order of the two or more objects selected in the registration process and the feature information specified for those two or more objects.
  • in response to the plurality of objects selected in the process of releasing the spatial authentication pattern matching the two or more objects selected in the registration process, being selected according to the selection order of the two or more selected objects, and satisfying the feature information specified for the two or more selected objects, the electronic device may release the spatial authentication pattern.
  • the electronic device may compare the kth order object selected in the process of releasing the spatial authentication pattern with the kth order object and characteristic information selected in the registration process of the spatial authentication pattern.
  • k may be a natural number less than or equal to the number of objects used to register the spatial authentication pattern.
  • the electronic device may compare the shape of object A in the kth order selected during the registration process of the spatial authentication pattern with the shape of object B in the kth order selected during the release process. When the electronic device determines through the shape comparison that object A and object B are the same object, it may determine whether the kth object B selected in the release process satisfies the feature information specified for object A.
  • when object B satisfies the feature information specified for object A, the electronic device may determine that selection of the kth object was successful. For example, the electronic device may select the object 1531 in the first order during the registration process of the space authentication pattern and specify the feature information 'located next to the sofa' for the object 1531. In this case, in response to the object 1531 being selected in the first order in the process of releasing the space authentication pattern and satisfying the feature information 'located next to the sofa', the electronic device may determine that the object selection in the first order was successful.
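  • The kth-order comparison described above can be sketched as follows. This is an illustrative sketch, not the claimed method; the function names and the callback for feature checking are assumptions.

```python
def verify_release(registered, attempt, satisfies):
    """Check a release attempt against a registered spatial authentication pattern.

    registered: ordered list of (object_id, feature_info) pairs saved at
    registration time. attempt: ordered list of object ids selected during
    release. satisfies(object_id, feature_info) reports whether the selected
    object still matches the specified feature information (e.g., 'located
    next to the sofa'). Hypothetical sketch of the kth-order comparison
    described above.
    """
    if len(attempt) != len(registered):
        return False
    for (reg_obj, feature), sel_obj in zip(registered, attempt):
        # The kth selected object must be the same object as the kth
        # registered object and must satisfy its specified feature information.
        if sel_obj != reg_obj or not satisfies(sel_obj, feature):
            return False
    return True


registered = [("1531", "located next to the sofa"), ("1532", "on the shelf")]
satisfies = lambda obj, feature: True  # assume all feature checks pass
print(verify_release(registered, ["1531", "1532"], satisfies))  # True
print(verify_release(registered, ["1532", "1531"], satisfies))  # False: wrong order
```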
  • the electronic device may release the space authentication pattern registered for the target space 1510. In this case, the electronic device may display a user interface 1521 on the screen 1520 indicating that the space authentication pattern has been released.
  • the electronic device may not release the space authentication pattern registered for the target space 1510. In this case, the electronic device may display a user interface on the screen indicating that the spatial authentication pattern has not been released. The electronic device may determine whether to proceed with the release process of the space authentication pattern registered for the target space 1510 again, and if the release process is initiated again, the electronic device may initialize the objects selected up to the current point and select objects again starting from the first object.
  • whether to release the spatial authentication pattern may be determined by the electronic device, or may instead be determined by a server.
  • the electronic device may transmit information about the spatial authentication pattern registered for the target space 1510 to a server (eg, IoT server). Additionally, the electronic device may transmit information about a plurality of objects selected during the process of releasing the spatial authentication pattern to the server.
  • the server may determine whether to cancel the spatial authentication pattern by comparing the kth order object selected in the process of canceling the spatial authentication pattern with the kth order object and feature information selected in the registration process of the spatial authentication pattern. The server may transmit whether the spatial authentication pattern is released to the electronic device.
  • FIG. 16 is a diagram illustrating a process in which an electronic device stops releasing a spatial authentication pattern, according to an embodiment.
  • the electronic device may stop releasing the spatial authentication pattern while in the process of releasing the spatial authentication pattern.
  • the electronic device can detect the gaze direction of the user wearing the electronic device.
  • the electronic device may stop the release process of the spatial authentication pattern based on the fact that the detected user's gaze direction is downward relative to the horizontal plane and that the angle between the user's gaze direction and the horizontal plane is greater than or equal to a preset angle.
  • the electronic device may display on the screen 1620 a user interface 1621 asking whether to stop the release process of the spatial authentication pattern.
  • the preset angle may be 45 degrees, but is not limited thereto.
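  • The gaze-angle condition described above can be sketched as follows. This is an illustrative sketch only, not the claimed detection method; the gaze-vector convention (y as the vertical axis) and the function name are assumptions.

```python
import math

def should_stop_release(gaze_vector, preset_angle_deg=45.0):
    """Decide whether to pause the release process from the gaze direction.

    gaze_vector: (x, y, z) with y as the vertical axis. The process stops
    when the gaze points downward relative to the horizontal plane and the
    angle between the gaze direction and the horizontal plane is at least
    the preset angle (45 degrees here, as in the example, but not limited
    thereto). Hypothetical sketch.
    """
    x, y, z = gaze_vector
    horizontal = math.hypot(x, z)
    # Angle between the gaze direction and the horizontal (x-z) plane.
    angle = math.degrees(math.atan2(abs(y), horizontal))
    looking_down = y < 0
    return looking_down and angle >= preset_angle_deg


print(should_stop_release((0.0, -1.0, 1.0)))  # 45 degrees downward -> True
print(should_stop_release((0.0, -0.3, 1.0)))  # shallow downward glance -> False
```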
  • the electronic device may display a user interface 1680 indicating the number of objects used to register the spatial authentication pattern in a fixed area of the screen 1620.
  • when the distance between the fixed area of the screen 1620 and the floor of the target space 1610 is less than a preset distance (for example, 80 cm), the electronic device may stop displaying the user interface 1680 on the screen 1620. When the distance between the fixed area of the screen 1620 and the floor of the target space 1610 exceeds the preset distance, the electronic device may display the user interface 1680 in the fixed area of the screen 1620 again.
  • the electronic device may cancel the process of releasing the spatial authentication pattern. For example, when the electronic device cancels the release process, it may display a user interface (not shown) on the screen inquiring whether to restart the process of releasing the spatial authentication pattern.
  • the electronic device may restart the process of releasing the spatial authentication pattern. For example, when restarting the process of releasing the spatial authentication pattern, the electronic device may initialize the objects selected up to the current point and select the first object in order to release the spatial authentication pattern.
  • FIGS. 17 and 18 are diagrams illustrating objects that an electronic device identifies according to the level of difficulty set for a space authentication pattern for a target space, according to an embodiment.
  • An electronic device (e.g., the terminal device 101 of FIG. 1 or the wearable AR device 200 of FIG. 2) according to an embodiment may set a difficulty level for the space authentication pattern during the registration process of the space authentication pattern for the target space. Depending on the difficulty level set for the space authentication pattern, the plurality of registerable objects that the electronic device identifies within the target space may differ. According to one embodiment, the electronic device may determine a plurality of registerable objects to be identified in relation to the target space based on the difficulty level set for the space authentication pattern.
  • the difficulty levels that can be set for a space authentication pattern may include a first difficulty, a second difficulty, and a third difficulty, and the difficulty may increase in the order of the first difficulty, the second difficulty, and the third difficulty.
  • when the difficulty of the spatial authentication pattern is set to the first difficulty, the second difficulty, and the third difficulty, respectively, the registerable objects identified by the electronic device are referred to as first objects, second objects, and third objects.
  • the electronic device may determine a plurality of objects that can be registered to be identified according to the level of difficulty of the spatial authentication pattern, based on the size of the object or the size of the area where the plurality of objects are displayed together on the screen.
  • when the electronic device sets the difficulty level of the space authentication pattern to the first difficulty, the electronic device may identify objects 1711, 1712, 1713, and 1714 having sizes greater than or equal to a preset size as registerable first objects. For example, the electronic device may determine the size of an object based on the size of the area where the object is visible on the screen. According to one embodiment, when the difficulty of the spatial authentication pattern is set to the second difficulty, the electronic device may identify objects 1721, 1722, and 1723 with sizes less than the preset size as registerable second objects. As described above, since the electronic device registers the space authentication pattern while located at a certain location in the target space, the size of each object can be determined.
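  • The size-based identification for the first and second difficulties can be sketched as follows. This is an illustrative sketch only; the function name, the use of on-screen area as the size measure, and the preset size value are assumptions.

```python
def classify_by_difficulty(objects, difficulty, preset_size=1.0):
    """Pick registerable objects according to the difficulty set for the pattern.

    objects: list of (object_id, on_screen_area) pairs, where on_screen_area
    stands in for the size of the region where the object is visible on the
    screen. First difficulty -> objects at or above the preset size; second
    difficulty -> objects below it. Hypothetical sketch of the size-based
    identification described above.
    """
    if difficulty == 1:
        return [obj for obj, area in objects if area >= preset_size]
    if difficulty == 2:
        return [obj for obj, area in objects if area < preset_size]
    # The third difficulty groups small objects shown together and uses
    # similarity to a reference object instead of a plain size cutoff.
    raise ValueError("third difficulty uses grouped-object similarity instead")


objects = [("sofa", 3.2), ("frame", 0.4), ("bookshelf", 1.8), ("cup", 0.1)]
print(classify_by_difficulty(objects, 1))  # ['sofa', 'bookshelf']
print(classify_by_difficulty(objects, 2))  # ['frame', 'cup']
```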
  • when the difficulty of the spatial authentication pattern is set to the third difficulty, the electronic device may identify each of a plurality of objects 1831, 1832, ..., 1839 displayed together within an area 1850 having a size less than a preset size as registerable third objects.
  • the electronic device may select one reference object (e.g., object 1831) disposed in the area 1850, and identify registerable third objects by determining the degree of similarity between the selected reference object (e.g., object 1831) and each of the other objects (e.g., objects 1832 and 1833) arranged within the area 1850.
  • the electronic device may identify other objects (e.g., objects 1832 and 1833) whose similarity to the reference object (e.g., object 1831) is greater than or equal to a threshold similarity as third objects that can be registered. For example, an electronic device may determine similarity between objects based on shape and size.
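  • The similarity-based identification of third objects can be sketched as follows. This is an illustrative sketch only; the function name, the similarity callback, and the threshold value are assumptions.

```python
def third_objects(reference, candidates, similarity, threshold=0.8):
    """Identify registerable third objects by similarity to a reference object.

    similarity(a, b) returns a score in [0, 1] based on, e.g., shape and
    size. Objects whose similarity to the reference object meets or exceeds
    the threshold similarity are identified as registerable third objects,
    as described above. Hypothetical sketch; the threshold is an assumption.
    """
    return [c for c in candidates if similarity(reference, c) >= threshold]


# Toy similarity: objects sharing a category prefix count as fully similar.
sim = lambda a, b: 1.0 if a.split("-")[0] == b.split("-")[0] else 0.2
print(third_objects("leaf-1831", ["leaf-1832", "leaf-1833", "vase-1840"], sim))
# ['leaf-1832', 'leaf-1833']
```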
  • FIGS. 19A to 19C are diagrams illustrating objects that an electronic device identifies according to the level of difficulty set for a space authentication pattern, according to an embodiment.
  • an electronic device may determine the registerable objects to be identified for each difficulty level of the spatial authentication pattern based on classifying a higher-level object into at least one lower-level object.
  • the lower object may be an object shown in a portion of the area where the upper object is shown.
  • a parent object may include multiple child objects.
  • when the difficulty of the spatial authentication pattern is set to the first difficulty, the electronic device may identify one first object as a registerable object.
  • when the difficulty of the spatial authentication pattern is set to the second difficulty, the electronic device may classify the one first object, which is a higher-level object, into a plurality of second objects, which are lower-level objects, and identify each of the plurality of second objects as a registerable object.
  • when the electronic device sets the difficulty of the spatial authentication pattern to the third difficulty, it may classify each of the plurality of second objects, which are higher-level objects, into a plurality of third objects, which are lower-level objects, and identify each of the third objects as a registerable object.
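  • The higher-level/lower-level classification described above can be sketched as a tree walk. This is an illustrative sketch only; the nested-dict representation and function name are assumptions, and the reference numerals in the sample data are borrowed from FIGS. 19A to 19C.

```python
def registerable_objects(root, difficulty):
    """Collect registerable objects for a difficulty level from an object tree.

    root: nested dict {object_id: children-dict}, where a higher-level object
    (e.g., a group of three picture frames) contains lower-level objects
    (each frame), which in turn contain their own sub-objects (e.g., leaves
    drawn in a frame). Difficulty n identifies the objects at depth n.
    Hypothetical sketch of the classification described above.
    """
    level = {"root": root}  # depth-0 wrapper around the top-level objects
    for _ in range(difficulty):
        next_level = {}
        for children in level.values():
            next_level.update(children)
        level = next_level
    return sorted(level.keys())


tree = {"frames-1911": {"frame-1921": {"leaf-1931": {}, "leaf-1932": {}},
                        "frame-1922": {"leaf-1933": {}},
                        "frame-1923": {"leaf-1934": {}}}}
print(registerable_objects(tree, 1))  # ['frames-1911']
print(registerable_objects(tree, 2))  # ['frame-1921', 'frame-1922', 'frame-1923']
print(registerable_objects(tree, 3))  # the individual leaves
```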
  • a plurality of feature information generated corresponding to the first object, the second object, and the third object may be different from each other.
  • the electronic device may identify the first object 1911 as a registerable object. For example, the electronic device may identify three picture frames hanging on the wall as one first object 1911.
  • the electronic device may designate at least one feature information 1952 among the plurality of feature information 1951 and 1952 generated corresponding to the first object 1911 as feature information describing the first object 1911.
  • when the electronic device sets the difficulty of the space authentication pattern to the second difficulty, the electronic device may divide the first object (e.g., the first object 1911 of FIG. 19A) into a plurality of second objects 1921, 1922, and 1923, and identify each of the second objects 1921, 1922, and 1923 as a registerable object. For example, the electronic device may identify each of three picture frames hanging on the wall as the second objects 1921, 1922, and 1923.
  • when the difficulty of the space authentication pattern is set to the third difficulty, the electronic device may divide each of the second objects (e.g., the second objects 1921, 1922, and 1923 in FIG. 19B) into a plurality of third objects, and identify each of the third objects (1931, 1932, 1933, 1934, ...) as a registerable object. For example, the electronic device may identify each of the leaves drawn in each picture frame as the third objects (1931, 1932, 1933, 1934, ...).
  • depending on the difficulty level set for the space authentication pattern, the plurality of selectable objects that the electronic device identifies during the release process of the spatial authentication pattern may differ.
  • a plurality of selectable objects to be identified in relation to the target space may be determined based on the difficulty level set for the space authentication pattern. For example, when canceling a spatial authentication pattern set at the first level of difficulty, the electronic device may identify first objects as objects that can be selected for canceling the spatial authentication pattern. When canceling a spatial authentication pattern set to the second level of difficulty, the electronic device may identify second objects as objects that can be selected for canceling the spatial authentication pattern. When canceling a spatial authentication pattern set to the third level of difficulty, the electronic device may identify third objects as objects that can be selected for canceling the spatial authentication pattern.
  • FIG. 20 is a diagram illustrating a process in which an electronic device registers a space authentication pattern for each user account in the process of registering a space authentication pattern for a target space, according to an embodiment.
  • an electronic device may register a space authentication pattern for the target space 2010 for each user account.
  • the electronic device may register a first space authentication pattern 2011 for the target space 2010 in response to a first user account, and may register a second space authentication pattern 2012 for the target space 2010 in response to a second user account different from the first user account.
  • the combination of two or more objects used for registration of the first space authentication pattern 2011 may be different from the combination of two or more objects used for registration of the second space authentication pattern 2012.
  • the electronic device may select one user account from among a plurality of user accounts corresponding to the plurality of space authentication patterns registered for the target space 2010, and initiate the process of releasing the space authentication pattern registered corresponding to the selected user account.
  • FIG. 21 is a diagram illustrating a process in which an electronic device sets a usage authority level corresponding to a spatial authentication pattern during the registration process of a spatial authentication pattern, according to an embodiment.
  • an electronic device may set a usage authority level in the process of registering a spatial authentication pattern for the target space 2110.
  • the usage authority levels may include a first level, a second level, and a third level, and the usage authority levels may increase in the order of the first level, the second level, and the third level.
  • the higher the usage authority level set for the spatial authentication pattern, the more authority the electronic device can receive from an external device when the spatial authentication pattern is released. For example, when a spatial authentication pattern with a high usage authority level is released, the electronic device can receive control authority for more IoT devices. As another example, when a space authentication pattern with a high usage authority level is released, the electronic device can receive access rights to a wider variety of contents.
  • the electronic device may transmit information about the spatial authentication pattern registered for the target space 2110 to the IoT server, and transmit information about a plurality of objects selected in the process of releasing the spatial authentication pattern to the IoT server.
  • the IoT server can determine whether to cancel the spatial authentication pattern by comparing information about the registered spatial authentication pattern with a plurality of objects selected in the process of canceling the spatial authentication pattern.
  • when the IoT server decides to release the spatial authentication pattern, it may transmit the information of the electronic device and the usage authority level set for the spatial authentication pattern to the IoT devices and to each content manager that manages contents.
  • IoT devices can grant control authority to electronic devices based on the usage authority level of the spatial authentication pattern, and content managers can also grant access authority to content to electronic devices based on the usage authority level of the spatial authentication pattern.
  • the electronic device may be granted control authority over the IoT devices 2181, 2182, and 2183 from each of the IoT devices 2181, 2182, and 2183.
  • IoT device 2181 may be an electronic light
  • IoT device 2182 may be an electronic picture frame
  • IoT device 2183 may be an electronic chair.
  • the electronic device may increase the minimum number of objects to be used for registration of the spatial authentication pattern as the usage authority level set for the spatial authentication pattern increases. For example, when the electronic device sets the minimum number of objects to be used for registration of a spatial authentication pattern to 4 at the first level, it may set the minimum number to 5 at a second level higher than the first level.
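  • The relationship between the usage authority level and the minimum object count can be sketched as follows. The linear rule below is a hypothetical sketch matching only the 4-at-level-1, 5-at-level-2 example above, not a claimed formula.

```python
def min_objects_for_level(level, base=4):
    """Minimum number of objects required at a given usage authority level.

    The minimum count increases with the level, as described above: e.g.,
    4 objects at the first level and 5 at the second. The linear increment
    used here is an assumption for illustration.
    """
    if level < 1:
        raise ValueError("usage authority level starts at 1")
    return base + (level - 1)


print(min_objects_for_level(1))  # 4
print(min_objects_for_level(2))  # 5
```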
  • the electronic device can set the effective period of the spatial authentication pattern.
  • the electronic device may receive authority corresponding to the usage authority level of the space authentication pattern during the validity period set for the space authentication pattern, starting from the point of release.
  • the permission received by the electronic device may be released when the effective period of time elapses from the time the spatial authentication pattern is released.
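  • The validity-period behavior described above can be sketched as follows. This is an illustrative sketch only; the function name and the use of plain second timestamps are assumptions.

```python
def authority_active(release_time, now, validity_period):
    """Whether authority granted at pattern release is still in effect.

    Authority is received at release_time and is released again once the
    validity period set for the spatial authentication pattern elapses,
    as described above; releasing the pattern again would restart the
    period. Times are in seconds. Hypothetical sketch.
    """
    return 0 <= now - release_time < validity_period


print(authority_active(release_time=0, now=3599, validity_period=3600))  # True
print(authority_active(release_time=0, now=3600, validity_period=3600))  # False
```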
  • the electronic device can receive authority corresponding to the usage authority level set for the space authentication pattern by canceling the space authentication pattern registered for the target space 2110 again.
  • An electronic device according to an embodiment includes a display module, a memory in which computer-executable instructions are stored, and a processor that accesses the memory and executes the instructions, wherein the processor scans a target space to identify a plurality of objects that can be registered in relation to the target space, selects two or more objects among the identified plurality of objects in an order while specifying feature information describing each object, and registers a space authentication pattern for the target space based on the selection order of the two or more selected objects and the feature information of the two or more selected objects.
  • the processor may set the number of objects to be used for registration of the spatial authentication pattern based on user input.
  • the processor may track the center point of the screen and, based on the tracked center point of the screen being continuously located within the area where a target object is visible for more than a threshold time, select the target object as an object to be used for registration of the spatial authentication pattern.
  • the processor may display a graphic object connecting the area where one selected object is shown and the area where the object selected in the next order is shown, and may display a graphic object connecting the center point of the screen and the area where the object last selected at the current time is visible.
  • based on contents, to which the image and feature information corresponding to each of the two or more selected objects are mapped, being sequentially selected according to the selection order of the two or more selected objects, the processor may release the spatial authentication pattern registered for the target space.
  • in the process of releasing the spatial authentication pattern, the processor may select as many objects as the number of objects used to register the spatial authentication pattern, and determine whether to release the spatial authentication pattern by comparing the plurality of objects selected in the release process with the selection order of the two or more selected objects and the feature information of the two or more selected objects.
  • in response to the plurality of objects selected in the process of releasing the spatial authentication pattern matching the two or more objects selected in the registration process, being selected according to the selection order of the two or more selected objects, and satisfying the feature information specified for the two or more selected objects, the processor may release the spatial authentication pattern.
  • the processor may detect the gaze direction of the user wearing the electronic device, and stop the release process of the spatial authentication pattern based on the detected gaze direction being downward relative to the horizontal plane and the angle formed between the gaze direction and the horizontal plane being greater than or equal to a preset angle.
  • the processor may set a difficulty level for the spatial authentication pattern during the registration process, determine a plurality of registerable objects to be identified in relation to the target space based on the difficulty level set for the spatial authentication pattern, and, during the release process, determine a plurality of selectable objects to be identified in relation to the target space based on the difficulty level set for the spatial authentication pattern.
  • when setting the difficulty of the spatial authentication pattern to the first difficulty, the processor identifies an object having a size greater than or equal to a preset size as a registerable object; when setting the difficulty to the second difficulty, the processor identifies an object having a size less than the preset size as a registerable object; and when setting the difficulty to the third difficulty, the processor may identify each of a plurality of objects shown together within an area having a size less than the preset size as a registerable object.
  • when the difficulty of the spatial authentication pattern is set to a first difficulty level, the processor identifies a first object as a registerable object; when the difficulty is set to a second difficulty level, the processor classifies the first object into a plurality of second objects that are sub-objects and identifies each of the plurality of second objects as a registerable object; and when the difficulty is set to a third difficulty level, the processor may classify the plurality of second objects into a plurality of third objects that are sub-objects and identify each of the plurality of third objects as a registerable object.
  • the processor registers a first spatial authentication pattern for the target space corresponding to a first user account, registers a second spatial authentication pattern for the target space corresponding to a second user account different from the first user account, and, based on one user account being selected from among the plurality of user accounts corresponding to the plurality of spatial authentication patterns registered for the target space, may initiate a process of releasing the spatial authentication pattern registered corresponding to the selected user account.
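The per-account pattern registration above can be sketched as a small registry keyed by user account. This is a hypothetical illustration only; the class and method names are assumed, and the pattern payload is left opaque.

```python
class PatternRegistry:
    """Per-account spatial authentication patterns for one target space
    (hypothetical sketch, not the patent's implementation)."""

    def __init__(self):
        self._patterns = {}  # account -> registered pattern

    def register(self, account, pattern):
        """Register (or replace) the pattern for one user account."""
        self._patterns[account] = pattern

    def accounts(self):
        """Accounts with a pattern registered for this space."""
        return sorted(self._patterns)

    def start_release(self, account):
        """Begin the release process for the pattern registered to the
        selected account."""
        if account not in self._patterns:
            raise KeyError(account)
        return self._patterns[account]
```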
  • a processor-implemented method may include an operation of scanning a target space to identify a plurality of objects registerable in relation to the target space, an operation of sequentially selecting two or more objects from among the plurality of identified objects while specifying characteristic information describing each object, and an operation of registering a spatial authentication pattern for the target space based on the selection order of the two or more selected objects and the characteristic information of the two or more selected objects.
  • the processor-implemented method may further include an operation of setting, based on a user input, the number of objects to be used for registration of the spatial authentication pattern.
  • the operation of selecting the two or more objects may include an operation of tracking the center point of the screen and, based on the tracked center point of the screen being located continuously for more than a threshold time within the area where a target object is shown, selecting the target object as an object to be used for registration of the spatial authentication pattern.
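The dwell-based selection above can be sketched over a stream of time-stamped samples. This is a minimal sketch under stated assumptions, not the patent's method: each sample is assumed to report which object's on-screen area (if any) contains the screen's center point at that instant, and the 1.5 s threshold is an arbitrary placeholder.

```python
def dwell_select(samples, threshold_s=1.5):
    """Select an object once the screen's center point has stayed inside that
    object's on-screen area continuously for at least `threshold_s` seconds.

    samples: time-ordered (timestamp, object_id_under_center or None) pairs.
    Returns the first object selected this way, or None if no dwell completed.
    """
    dwell_start = None
    current = None
    for t, obj in samples:
        if obj is not None and obj == current:
            if t - dwell_start >= threshold_s:
                return obj       # dwell held long enough: select this object
        else:
            current = obj        # center point moved: restart the dwell timer
            dwell_start = t
    return None
```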
  • the processor-implemented method may further include, in the process of sequentially selecting two or more objects among the plurality of identified objects, an operation of displaying a graphic object connecting the area where one selected object is shown and the area where the object selected in the order following the one object is shown, and an operation of displaying a graphic object connecting the center point of the screen at the current time and the area where the last selected object is shown.
  • the processor-implemented method may further include an operation of releasing the spatial authentication pattern registered for the target space based on content items, to each of which an image and the characteristic information corresponding to one of the two or more selected objects are mapped, being sequentially selected according to the selection order of the two or more selected objects.
  • the processor-implemented method may further include an operation of selecting, in the process of releasing the spatial authentication pattern, a plurality of objects equal in number to the objects used to register the spatial authentication pattern, and an operation of determining whether to release the spatial authentication pattern by comparing the plurality of objects selected in the release process against the selection order of the two or more selected objects and the characteristic information of the two or more selected objects.
  • the processor-implemented method may further include an operation of releasing the spatial authentication pattern in response to the plurality of objects selected in the release process matching the two or more selected objects, being selected according to the selection order of the two or more selected objects, and satisfying the characteristic information specified for the two or more selected objects.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An electronic device according to an embodiment includes: a display module; a memory storing computer-executable instructions; and a processor that accesses the memory and executes the instructions. The processor may: scan a target space to identify a plurality of objects that can be registered in relation to the target space; select two or more objects from among the plurality of identified objects in an order while specifying characteristic information describing each of the objects; and register a spatial authentication pattern for the target space based on the selection order of the two or more selected objects and the characteristic information of the two or more selected objects.
PCT/KR2023/008334 2022-08-26 2023-06-16 Dispositif et procédé d'enregistrement et de désenregistrement de motif d'authentification à l'aide d'objets WO2024043468A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20220107711 2022-08-26
KR10-2022-0107711 2022-08-26
KR1020220113299A KR20240029475A (ko) 2022-08-26 2022-09-07 Apparatus and method for registering and releasing an authentication pattern using objects
KR10-2022-0113299 2022-09-07

Publications (1)

Publication Number Publication Date
WO2024043468A1 (fr) 2024-02-29

Family

ID=90013380

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/008334 WO2024043468A1 (fr) 2022-08-26 2023-06-16 Dispositif et procédé d'enregistrement et de désenregistrement de motif d'authentification à l'aide d'objets

Country Status (1)

Country Link
WO (1) WO2024043468A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030076300A1 (en) * 2000-05-16 2003-04-24 Eric Lauper Method and terminal for entering instructions
JP2013214863A (ja) * 2012-04-02 2013-10-17 Sharp Corp 携帯情報装置、携帯情報装置の制御方法、制御プログラムおよびそれを記録したコンピュータ読み取り可能な記録媒体
KR20180094186A (ko) * 2017-02-13 2018-08-23 한국전자통신연구원 사용자 인증 방법 및 장치
KR20190044838A (ko) * 2017-10-23 2019-05-02 동서대학교 산학협력단 융합현실, 가상현실 및 증강현실을 이용한 사용자 인증시스템
KR20200143317A (ko) * 2013-12-18 2020-12-23 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 디스플레이 장치 상의 사용자 인증


Similar Documents

Publication Publication Date Title
WO2022102954A1 (fr) Dispositif électronique à porter sur soi comprenant un écran
WO2022108076A1 (fr) Procédé de connexion sans fil d'un environnement de réalité augmentée et dispositif électronique associé
WO2022215895A1 (fr) Dispositif électronique à porter sur soi comprenant une pluralité d'appareils photo
WO2022092517A1 (fr) Dispositif électronique pouvant être porté comprenant une unité d'affichage, procédé de commande d'affichage, système comprenant un dispositif électronique pouvant être porté et boîtier
WO2023017986A1 (fr) Procédé et système électronique pour délivrer en sortie des données vidéo et des données audio
WO2023008711A1 (fr) Dispositif de réglage et procédé de fonctionnement de dispositif de réglage
WO2022191497A1 (fr) Dispositif électronique pouvant être porté et comprenant un affichage
WO2024043468A1 (fr) Dispositif et procédé d'enregistrement et de désenregistrement de motif d'authentification à l'aide d'objets
WO2024029858A1 (fr) Procédé de commande de module d'affichage et dispositif électronique exécutant le procédé
WO2023085569A1 (fr) Procédé et dispositif de commande de luminosité d'image ar
WO2023153607A1 (fr) Procédé d'affichage de contenu de réalite augmentée (ra) basé sur l'éclairement ambiant et dispositif électronique
WO2023080419A1 (fr) Dispositif électronique habitronique et procédé de commande de dispositif électronique à l'aide d'informations de vision
WO2024043611A1 (fr) Procédé de commande de module d'affichage et dispositif électronique pour sa mise en œuvre
WO2024048986A1 (fr) Dispositif et procédé de commande d'appareil proche en tenant compte de l'intention d'un utilisateur
WO2023120892A1 (fr) Dispositif et procédé de commande de sources lumineuses dans un suivi de regard à l'aide d'un reflet
WO2024025179A1 (fr) Procédé de connexion à un dispositif d'affichage externe sur la base d'informations biométriques, et dispositif électronique
WO2024048937A1 (fr) Dispositif électronique pouvant être porté et système de charge le comprenant
WO2023048466A1 (fr) Dispositif électronique et procédé d'affichage de contenu
WO2024029720A1 (fr) Dispositif et procédé d'authentification d'un utilisateur dans la réalité augmentée
WO2023153611A1 (fr) Procédé et dispositif d'obtention d'image d'objet
WO2023136533A1 (fr) Procédé d'annulation d'interférence et dispositif électronique pour sa mise en œuvre
WO2023080420A1 (fr) Dispositif électronique habitronique comprenant une terre variable
WO2024101747A1 (fr) Dispositif électronique à porter sur soi comprenant une caméra et procédé de fonctionnement du dispositif
WO2023063572A1 (fr) Dispositif électronique habitronique ayant une transmittance de visière réglable et une luminosité d'affichage réglable
WO2023121120A1 (fr) Procédé d'annulation d'interférences et dispositif électronique pour la mise en œuvre du procédé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23857506

Country of ref document: EP

Kind code of ref document: A1