WO2016182090A1 - Glasses-type terminal and control method thereof - Google Patents
Glasses-type terminal and control method thereof
- Publication number
- WO2016182090A1 (PCT/KR2015/004579; KR 2015004579 W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- specified object
- information
- function
- external device
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 41
- 230000033001 locomotion Effects 0.000 claims description 39
- 238000004891 communication Methods 0.000 claims description 30
- 239000011521 glass Substances 0.000 claims description 11
- 230000005540 biological transmission Effects 0.000 claims description 5
- 238000010191 image analysis Methods 0.000 claims description 4
- 230000006870 function Effects 0.000 description 187
- 230000003287 optical effect Effects 0.000 description 15
- 230000000007 visual effect Effects 0.000 description 15
- 238000004458 analytical method Methods 0.000 description 12
- 238000010586 diagram Methods 0.000 description 12
- 238000001514 detection method Methods 0.000 description 5
- 210000003128 head Anatomy 0.000 description 5
- 230000004397 blinking Effects 0.000 description 4
- 238000005516 engineering process Methods 0.000 description 3
- 210000001747 pupil Anatomy 0.000 description 3
- 230000003190 augmentative effect Effects 0.000 description 2
- 230000015572 biosynthetic process Effects 0.000 description 2
- 230000014509 gene expression Effects 0.000 description 2
- 238000005286 illumination Methods 0.000 description 2
- 239000000463 material Substances 0.000 description 2
- 230000002093 peripheral effect Effects 0.000 description 2
- XUIMIQQOPSSXEZ-UHFFFAOYSA-N Silicon Chemical compound [Si] XUIMIQQOPSSXEZ-UHFFFAOYSA-N 0.000 description 1
- 230000001133 acceleration Effects 0.000 description 1
- 238000013500 data storage Methods 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 230000000193 eyeblink Effects 0.000 description 1
- 230000005484 gravity Effects 0.000 description 1
- 238000010295 mobile communication Methods 0.000 description 1
- 210000001331 nose Anatomy 0.000 description 1
- 230000005855 radiation Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000015541 sensory perception of touch Effects 0.000 description 1
- 229910052710 silicon Inorganic materials 0.000 description 1
- 239000010703 silicon Substances 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 230000005236 sound signal Effects 0.000 description 1
- 239000000126 substance Substances 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
Definitions
- The present invention relates to a glasses-type terminal capable of executing various functions based on analysis of the user's gaze.
- Terminals may be divided into mobile / portable terminals and stationary terminals according to their mobility.
- Mobile terminals may be further classified into handheld terminals and vehicle-mounted terminals according to whether a user can directly carry them.
- Such terminals are implemented in the form of multimedia players having composite functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts. Further, to support and expand these functions, improvements to both the structural and software parts of the terminal may be considered.
- A glasses-type terminal worn on the user's head may correspond to a head-mounted display (HMD).
- The display unit provided in a glasses-type terminal such as an HMD can go beyond a simple image-output function and, combined with augmented reality technology, N-screen technology, and the like, provide various conveniences to the user.
- A technical object of the present invention is to provide a control method capable of executing various functions related to a specific object existing in reality by using analysis of the user's gaze.
- A glasses-type terminal according to the present invention may include a display unit provided in the main body, a camera for photographing the external environment, a sensing unit for sensing the user's gaze toward the external environment, and a controller that analyzes the user's gaze toward the external environment so as to specify, from an image of the external environment captured by the camera, the object to which the user's gaze is directed, and that, based on an event related to the specified object being captured through the camera, controls the display unit so that a function icon associated with a preset function is positioned in an area corresponding to the area where the specified object is located.
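The object-specifying behavior described above can be sketched in Python. This is an illustrative sketch only, not the patented implementation: it assumes the sensing unit yields a gaze point in camera-image coordinates and that image analysis yields objects with bounding boxes (the dictionary layout is made up).

```python
def specify_object(gaze_point, detected_objects):
    """Return the object whose bounding box contains the gaze point.

    gaze_point: (x, y) in image coordinates, from the sensing unit.
    detected_objects: list of dicts, each with an "id" and a "bbox"
    (x1, y1, x2, y2), from image analysis of the camera frame.
    Returns None when the gaze falls on no detected object.
    """
    gx, gy = gaze_point
    for obj in detected_objects:
        x1, y1, x2, y2 = obj["bbox"]
        if x1 <= gx <= x2 and y1 <= gy <= y2:
            return obj
    return None
```

A terminal would call this once per frame with the latest gaze sample and detections, keeping the returned object as the "specified object" for subsequent event handling.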
- the controller may execute a function associated with the function icon based on the selection input related to the function icon being received.
- The selection input associated with the function icon may be received when a preset gesture of the user is detected while the user's gaze toward the function icon is detected by the sensing unit.
- the event related to the specified object may be a predetermined movement of the specified object captured by the camera.
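A "predetermined movement of the specified object" event can be approximated by tracking the object's center across camera frames; the pixel threshold and coordinate convention below are assumptions for illustration, not taken from the patent.

```python
def movement_event(prev_center, cur_center, threshold=20.0):
    """Report an event when the specified object's center, tracked
    between two camera frames, moves farther than `threshold` pixels.
    Centers are (x, y) tuples in image coordinates."""
    dx = cur_center[0] - prev_center[0]
    dy = cur_center[1] - prev_center[1]
    return (dx * dx + dy * dy) ** 0.5 > threshold
```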
- The apparatus may further include a wireless communication unit; the function icon may include a control icon associated with a function for controlling an external device corresponding to the specified object, and the controller may control the wireless communication unit to transmit a preset control command to the external device corresponding to the specified object.
- The controller may analyze the image of the specified object included in the image of the external environment to obtain identification information of the external device corresponding to the specified object, and may control the wireless communication unit to perform wireless communication with the external device using the identification information.
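The lookup from a recognized object to the identification information of a corresponding external device might be sketched as follows; the registry, labels, and addresses are entirely hypothetical.

```python
# Hypothetical registry: appearance labels produced by image analysis,
# mapped to device identification info (IDs and addresses are made up).
DEVICE_REGISTRY = {
    "television": {"device_id": "tv-01", "address": "AA:BB:CC:00:11:22"},
    "speaker": {"device_id": "spk-01", "address": "AA:BB:CC:00:11:33"},
}

def identify_external_device(object_label):
    """Return identification info for the external device corresponding
    to the specified object's label, or None if it is not a known device."""
    return DEVICE_REGISTRY.get(object_label)
```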
- When a first gesture of the user is detected as a selection input related to the control icon, the controller may control the wireless communication unit to transmit to the external device a first control command requesting transmission of preset information; when a different preset gesture is detected, the controller may control the wireless communication unit to transmit to the external device a second control command for causing preset information to be output from the external device.
- The preset information received from the external device in response to the first control command may be screen information being output on the display unit of the external device, and the controller may control the display unit to output the screen information received through the wireless communication unit.
- the control unit may control the wireless communication unit to transmit the screen information to a device different from the external device based on a specific input being applied while the screen information is output to the display unit.
- the other device may be selected based on the gaze of the user sensed by the sensing unit.
- the controller may select preset information to be output from the external device by the second control command based on a type of an event related to the specified object captured by the camera.
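Selecting the preset information by event type amounts to a simple dispatch; the event names and information keys below are invented for illustration.

```python
# Hypothetical mapping from the type of event captured for the
# specified object to the information the external device is asked
# to output via the second control command.
EVENT_TO_PRESET_INFO = {
    "object_turns_toward_user": "current_screen_info",
    "object_waves": "notification_message",
}

def select_preset_info(event_type, default=None):
    """Pick which preset information the second control command should
    cause the external device to output, based on the event type."""
    return EVENT_TO_PRESET_INFO.get(event_type, default)
```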
- The function icon may include an information icon associated with a function for checking information related to the specified object; when executing that function, the controller may analyze the image of the specified object included in the image of the external environment to obtain information related to the specified object, and may control the display unit to output at least part of the obtained information.
- the controller may adjust an output amount of information related to the specified object according to the number of objects included in the image corresponding to the external environment.
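One plausible way to scale the amount of displayed information with the number of objects in the frame; the budget-based policy is an assumption for illustration, not taken from the patent.

```python
def info_lines_to_show(all_lines, object_count, budget=6):
    """Shrink the information shown per object as the number of
    objects in the captured image grows, so the view stays readable.
    `budget` is the total number of lines the display can hold."""
    per_object = max(1, budget // max(1, object_count))
    return all_lines[:per_object]
```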
- The function icon may include a data-forming icon associated with a function of forming image data of the specified object; when a selection input associated with the data-forming icon is received, the controller may control the camera to photograph the specified object and control the display unit to output a preview image of the specified object photographed by the camera.
- When a request is made to output an associated data image including motion information related to movement of the specified object included in the preview image, the controller may control the display unit so that the associated data image is output overlapping the preview image.
- When a specific movement is captured from an object different from the specified object in the image of the external environment, the controller may control the display unit to output information notifying the specific movement of the other object.
- A control method of a glasses-type terminal according to the present invention may include photographing an external environment with a camera, sensing the user's gaze toward the external environment with a sensing unit, analyzing the user's gaze toward the external environment so as to specify the object to which the gaze is directed in an image corresponding to the external environment, and, based on an event related to the specified object being captured through the camera, positioning a function icon associated with a preset function in an area corresponding to the area where the specified object is located.
- the method may further include receiving a selection input associated with the function icon and executing a function associated with the function icon based on the selection input being received.
- the selection input related to the function icon may be received by detecting a preset gesture of the user in a state in which the user's gaze toward the function icon is detected by the sensing unit.
- Executing the function associated with the function icon may include analyzing the image of the specified object included in the image corresponding to the external environment, acquiring information related to the specified object based on the image analysis, and executing the function associated with the function icon using the acquired information.
- the event related to the specified object may be a predetermined movement of the specified object captured by the camera.
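Putting the claimed steps together, one pass of the control method could look like the sketch below; `event_for` and `icon_for` stand in for the terminal's event-detection and icon-selection logic and are hypothetical, as is the icon anchor placement.

```python
def control_step(frame_objects, gaze_point, event_for, icon_for):
    """One pass of the method: specify the gazed-at object, check for
    an event related to it, and position a function icon next to it.

    frame_objects: detections with an "id" and "bbox" (x1, y1, x2, y2).
    gaze_point: (x, y) gaze sample from the sensing unit.
    event_for(obj) -> bool: whether an event for obj was captured.
    icon_for(obj) -> str: which function icon to show for obj.
    Returns the icon and its anchor position, or None.
    """
    target = None
    gx, gy = gaze_point
    for obj in frame_objects:
        x1, y1, x2, y2 = obj["bbox"]
        if x1 <= gx <= x2 and y1 <= gy <= y2:
            target = obj
            break
    if target is None or not event_for(target):
        return None
    # Place the icon in the area corresponding to the object
    # (here: just outside its top-right corner).
    x1, y1, x2, y2 = target["bbox"]
    return {"icon": icon_for(target), "anchor": (x2, y1)}
```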
- As described above, the glasses-type terminal according to the present invention may specify one object in the external environment through analysis of the user's gaze and provide icons of functions executable in relation to the specified object. Accordingly, the user can be intuitively informed of the functions executable in relation to the specified object without directly searching for related information.
- In addition, the user can conveniently execute the various functions associated with the function icons using non-contact inputs such as a change of gaze or a preset gesture.
- FIG. 1 is a block diagram illustrating a mobile terminal related to the present invention.
- FIG. 2A is a view of an eyeglass terminal according to an embodiment of the present disclosure, viewed from one direction
- FIG. 2B is a conceptual view illustrating an optical unit included in the eyeglass terminal of FIG. 2A.
- FIG. 3A is a flowchart illustrating a control method of providing a function icon associated with a specified object by a spectacle type terminal according to the present invention
- FIG. 3B is a representative diagram for describing a control method of providing the function icon.
- FIGS. 4A and 4B are diagrams illustrating an exemplary embodiment when a control icon linked to a function linked with an external device corresponding to the specified object is selected.
- FIGS. 5A and 5B illustrate an example of transferring information received from a first external device to a second external device.
- FIGS. 6A and 6B illustrate an exemplary embodiment in which a glasses terminal controls the external device to output preset information from the external device.
- FIG. 7A and 7B illustrate embodiments in which a function of forming data related to a specified object is executed.
- FIGS. 8A and 8B illustrate an example in which a function of confirming information related to a specified object is executed.
- FIGS. 9A and 9B are diagrams illustrating an embodiment in which information is transmitted to each of a plurality of external devices from a spectacle type terminal.
- FIG. 10 is a diagram illustrating an embodiment in which an eyeglass type terminal simultaneously receives information from a plurality of external devices.
- FIG. 11 is a diagram illustrating an embodiment of providing a notification for an object different from the specified object, based on the user's gaze, within the angle of view of the camera.
- The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, a power supply unit 190, and the like.
- the components shown in FIG. 1 are not essential to implementing the spectacle terminal according to the present invention, so the spectacle terminal described herein may have more or fewer components than those listed above.
- Among these components, the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server.
- the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.
- The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
- The input unit 120 may include a camera 121 or image input unit for inputting an image signal, a microphone 122 or audio input unit for inputting an audio signal, and a user input unit 123 (e.g., touch keys, mechanical keys) for receiving information from the user.
- the voice data or the image data collected by the input unit 120 may be analyzed and processed as a control command of the user.
- The sensing unit 140 may include one or more sensors for sensing at least one of information within the terminal, surrounding environment information of the terminal, and user information.
- For example, the sensing unit 140 may include at least one of a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, an optical sensor (e.g., the camera 121), a microphone (see 122), a battery gauge, and an environmental sensor.
- the spectacles-type terminal disclosed in the present specification may use a combination of information sensed by at least two or more of these sensors.
- The output unit 150 generates output related to sight, hearing, or touch, and may include at least one of a display unit 151, an audio output unit 152, a haptic module 153, and an optical output unit 154.
- the display unit 151 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
- the touch screen may function as a user input unit 123 that provides an input interface between the terminal 100 and the user, and may provide an output interface between the terminal 100 and the user.
- the interface unit 160 serves as a path to various types of external devices connected to the terminal 100.
- The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port.
- the terminal 100 may perform appropriate control related to the connected external device.
- the memory 170 stores data supporting various functions of the terminal 100.
- the memory 170 may store a plurality of application programs or applications that are driven by the terminal 100, data for operating the terminal 100, and instructions. At least some of these applications may be downloaded from an external server via wireless communication.
- at least some of these application programs may exist on the terminal 100 from the time of shipment for basic functions (for example, call incoming, outgoing, message receiving, and outgoing functions) of the terminal 100.
- the application program may be stored in the memory 170 and installed on the terminal 100 to be driven by the controller 180 to perform an operation (or function) of the terminal.
- In addition to operations related to application programs, the controller 180 typically controls the overall operation of the terminal 100.
- the controller 180 may provide or process information or a function appropriate to a user by processing signals, data, information, and the like, which are input or output through the above-described components, or by driving an application program stored in the memory 170.
- controller 180 may control at least some of the components described with reference to FIG. 1 to drive an application program stored in the memory 170. In addition, the controller 180 may operate by combining at least two or more of the components included in the mobile terminal 100 to drive the application program.
- the power supply unit 190 receives power from an external power source and an internal power source under the control of the controller 180 to supply power to each component included in the terminal 100.
- the power supply unit 190 includes a battery, which may be a built-in battery or a replaceable battery.
- At least some of the components may operate in cooperation with each other to implement an operation, control, or control method of the terminal according to various embodiments described below.
- the operation, control, or control method of the terminal may be implemented on the terminal by driving at least one application program stored in the memory 170.
- FIG. 2A is a view of an eyeglass terminal according to an embodiment of the present disclosure, viewed from one direction
- FIG. 2B is a conceptual view illustrating an optical unit included in the eyeglass terminal of FIG. 2A.
- the spectacles-type terminal 200 may be configured to be worn on a head of a human body, and may include a frame part (case, housing, etc.) therefor.
- the frame portion may be formed of a flexible material to facilitate wearing.
- the frame part includes a first frame 201 and a second frame 202 of different materials.
- the eyeglass terminal 200 may include a feature of the mobile terminal 100 of FIG. 1 or a similar feature thereof.
- the frame part is supported by the head, and provides a space for mounting various components.
- electronic components such as a controller 280, a sound output module 252, and the like may be mounted in the frame unit.
- the frame portion may be mounted with a prism 251 ′ disposed adjacent to at least one of the left eye and the right eye.
- the prism 251 ′ may be detachable.
- the controller 280 is configured to control various electronic components provided in the eyeglass terminal 200.
- the controller 280 may be understood as a configuration corresponding to the controller 180 described above.
- In the drawing, the controller 280 is illustrated as being installed in the frame part on one side of the head.
- the position of the controller 280 is not limited thereto.
- the camera 221 is disposed adjacent to at least one of the left eye and the right eye, and is formed to capture an image of the front. Since the camera 221 is located adjacent to the eye, the camera 221 may acquire a scene viewed by the user as an image.
- The glasses-type terminal 200 may not output a separate preview image to the display unit when the image capturing function is executed.
- the spectacles-type terminal 200 may include user input units 223a and 223b operated to receive a control command.
- The user input units 223a and 223b may employ any method, as long as the user operates them in a tactile manner such as touch or push.
- In the drawing, the frame part and the controller 280 are provided with user input units 223a and 223b of the push and touch input methods, respectively.
- the user input unit 223b of the touch input method will be mainly described, and reference numeral 223 will be used.
- the spectacles-type terminal 200 may include a microphone (not shown) that receives sound and processes it as electrical voice data, and a sound output module 252 that outputs sound.
- The visual information 300 output through the display unit may be seen as the virtual image 400 overlapping the user's general field of view, via the optical unit 130 and the prism 251′.
- Using this display characteristic, the glasses-type terminal 200 may provide augmented reality (AR), which superimposes the virtual image 400 on an image of reality (or a scene, a background, the user's general field of view) and shows them as a single image.
- the optical unit 130 may be embedded in the frame 101. That is, the optical unit 130 is not located in the user's field of view.
- the optical unit 130 includes first and second lenses 131 and 132 and a zoom lens unit 133.
- the virtual image 400 ′ formed by the optical unit 130 may be reflected by the prism 251 ′ and provided to the user's eyes.
- The controller 280 may control the optical unit 130 so that the visual information 300 output from the display unit is formed as a virtual image 400 of various sizes and is output at various positions, based on the user's control command.
- The display unit may output the visual information 300 between the first lens 131 of the optical unit 130 and the focal point f of the first lens 131. Accordingly, the virtual image 400′ of the visual information 300 may be formed, enlarged, at a position on the opposite side of the focal point f of the first lens 131 from the position where the visual information 300 is output.
- the virtual image 400 ′ may be visible to the user through the optical unit 130 and the prism 251 ′.
- the virtual image 400 may be viewed in the virtual area 450 (or, the space or the virtual space) located in front of the user's eyes.
- This may mean that the virtual image 400 of the visual information 300 is formed in the virtual region 450 (for example, the virtual region located in front of the eyes of the user) away from the main body.
- That is, the optical unit 130 may form the image 400 (the virtual image) of the visual information 300 output from the display unit in the virtual area 450 (or space) located in front of the user's eyes, away from the main body. Data related to the virtual area 450 in which the virtual image 400 is formed may be stored in advance.
- The sensing unit may detect a user gesture performed on the glasses-type terminal 200 or in a space within a predetermined distance from it (that is, around the glasses-type terminal 200).
- the user gesture may mean an operation formed by a part of the user's body (for example, a finger, a hand, an arm, a leg, etc.) or an object (object, a stick, etc.).
- the spectacles-type terminal may provide a control method for specifying a target based on a user's gaze to the external environment in a captured image of an external environment and executing a function related to the specified target.
- For example, the spectacles-type terminal according to the present invention may be applied when a teacher wearing the spectacles-type terminal looks at a plurality of students in a classroom environment including the plurality of students. That is, a control method may be provided in which the eyeglass terminal analyzes the teacher's gaze toward the plurality of students, specifies the student corresponding to the teacher's gaze among the plurality of students as one object, and executes a function related to the specified object.
- the glasses-type terminal may output various function icons associated with a preset function to execute a function related to the specified object. Therefore, the user is provided with the convenience of executing various functions related to one object of the external environment only with a simple operation of changing the line of sight.
- FIG. 3A is a flowchart illustrating a control method of providing a function icon associated with a specified object by a spectacle type terminal according to the present invention
- FIG. 3B is a representative diagram for describing a control method of providing the function icon.
- the external environment may be photographed by the camera of the spectacle terminal according to the present invention (S301).
- the camera 221 of the spectacles-type terminal 200 may photograph the front of the eye of the user while the user is wearing the spectacles-type terminal 200.
- the camera 221 may capture the external environment in real time according to a preset field of view.
- the camera 221 may have a field of view corresponding to the viewing range of the user wearing the spectacles-type terminal 200.
- the camera 221 mounted on the spectacles-type terminal 200 may photograph an external environment according to a preset angle of view.
- the external environment may include a plurality of objects (1, 2, 3, 4).
- the eyeglass terminal 200 may detect a user's gaze with respect to the external environment by using a sensing unit (S302).
- the sensing unit may detect a gaze of a user wearing the spectacles-type terminal 200.
- the sensing unit may include a plurality of sensors capable of sensing the gaze of the user.
- the sensing unit may include a camera and may detect movement of the pupil of the user from the camera.
- the camera 221 and the camera included in the sensing unit may be disposed at different positions of the spectacle terminal 200. That is, the camera included in the sensing unit may be disposed at a position capable of detecting the gaze of the user.
- the sensing unit may include an infrared sensor.
- the infrared sensor may detect the movement of the user's eyes with infrared rays reflected from the user's eyes.
- the gaze of the user with respect to the external environment may be analyzed, and an object to which the user's gaze is directed may be specified from the captured image corresponding to the external environment (S303).
- the controller 280 may analyze the gaze of the user detected by the sensing unit. For example, the gaze of the user may be analyzed by applying an image of the movement of the user's pupil captured by the camera included in the sensing unit to a preset gaze tracking algorithm. As another example, the direction of the user's gaze may be determined by analyzing an image of the infrared light reflected from the user's eyes.
- the controller 280 may calculate the angle of the gaze of each of the two eyes of the user based on the position of the pupil of the user, and determine the direction and the distance of the gaze corresponding to the angle of the gaze.
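The gaze-analysis step above (calculating the gaze angle of each eye and deriving a direction and distance) can be sketched as a simple vergence model. This is an illustrative assumption, not the patent's actual algorithm; the function name and the interpupillary distance constant are hypothetical.

```python
import math

IPD_MM = 64.0  # assumed interpupillary distance in millimeters

def gaze_point(left_angle_deg, right_angle_deg):
    """Estimate gaze direction and distance from the angles of both eyes.

    Angles are measured from straight ahead; positive = rotated toward
    the nose. Returns (direction_deg, distance_mm) of the fixation point,
    or None when the two gaze rays do not converge.
    """
    l = math.radians(left_angle_deg)
    r = math.radians(right_angle_deg)
    vergence = l + r           # how strongly the two gaze rays converge
    if vergence <= 0:
        return None            # parallel or diverging: no fixation point
    # Distance from the triangle formed by the two eyes and the fixation point.
    distance = (IPD_MM / 2) / math.tan(vergence / 2)
    # Horizontal direction: half the difference between the eye angles.
    direction = math.degrees((r - l) / 2)
    return direction, distance
```

For a point 640 mm straight ahead, each eye rotates inward by about 2.86°, and the model recovers that distance.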
- the controller 280 may specify an object to which the gaze of the user is directed from an image corresponding to the external environment (S303).
- the object may be a subject included in the external environment, and may include a person or a thing.
- For example, a plurality of students 1, 2, 3, and 4 may be included in the external environment (e.g., a classroom).
- the controller 280 may analyze the gaze of the user to acquire information on the direction or distance to which the gaze is directed, and may thereby specify the object to which the gaze is directed. That is, as shown in the second drawing of FIG. 3B, when the user's gaze toward the external environment is analyzed as being directed to one object 1, the controller 280 may specify the one object 1 to which the user's gaze is directed.
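The object-specifying step (S303) above can be sketched as testing the estimated gaze point against the bounding boxes of objects detected in the captured image. All names, coordinates, and the box layout are illustrative assumptions.

```python
def specify_object(gaze_xy, objects):
    """Return the id of the object whose bounding box contains the gaze point.

    `objects` is a list of (object_id, (x, y, w, h)) bounding boxes in
    captured-image coordinates; returns None when the gaze hits no object.
    """
    gx, gy = gaze_xy
    for object_id, (x, y, w, h) in objects:
        if x <= gx <= x + w and y <= gy <= y + h:
            return object_id
    return None

# Example: four objects (e.g. students 1-4) laid out left to right.
students = [(i, (100 * i, 50, 80, 120)) for i in (1, 2, 3, 4)]
```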
- the controller 280 may determine whether an event related to the specified object is photographed through the camera 221.
- the event may mean a predetermined movement of the specified target.
- For example, when the gaze of the specified object meets the gaze of the user, the controller 280 may determine that an event related to the specified object has occurred. In this case, the controller 280 may determine whether the eyes of the specified object meet those of the user by using the direction in which the face of the specified object is oriented in the captured image corresponding to the external environment.
- the controller 280 may determine the occurrence of an event related to the specified object based on the photographing of the predetermined movement of the specified object. That is, as shown in the third drawing of FIG. 3B, when the movement of raising the hand of the specified object 1 is photographed, the controller 280 may determine that an event related to the specified object 1 has occurred.
- When it is photographed that a predetermined operation (e.g., bowing down) of the specified object is maintained for a predetermined time, the controller 280 may determine that an event related to the specified object has occurred.
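The event rule described above, a predetermined movement held for a predetermined time, can be sketched over a stream of per-frame labels. The frame interval, hold threshold, and label names are illustrative assumptions, not values given in the patent.

```python
HOLD_SECONDS = 2.0   # assumed predetermined time
FRAME_DT = 0.5       # assumed seconds between analyzed frames

def event_occurred(frame_labels, target="bowing"):
    """Return True once the target movement is held for HOLD_SECONDS.

    `frame_labels` is the movement classified in each successive frame;
    any interruption resets the held duration.
    """
    held = 0.0
    for label in frame_labels:
        held = held + FRAME_DT if label == target else 0.0
        if held >= HOLD_SECONDS:
            return True
    return False
```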
- the controller 280 may control the display unit 251 such that a function icon associated with a preset function is located in a corresponding area corresponding to the area where the specified object is located.
- the controller 280 may output a function icon related to a function executable in relation to the specified object to the display unit 251.
- the function icon is a graphic object used to execute a function associated with the function icon, and a function executable in association with the function icon may be preset.
- the function associated with the function icon may include all kinds of functions executable in the terminal.
- the function icon may be an icon associated with a function for checking specific information about the specified object. That is, the function associated with the function icon may be a function for checking personal information, identification information, grade information, etc. of the specified object.
- the function icon may be an icon associated with a function for generating information related to the specified object. That is, the function associated with the function icon may be a function for recording or acquiring a voice, an action or movement, or a current state of the specified object (for example, a photographing or audio recording function).
- When there is an external device corresponding to the specified object, the function icon may be an icon associated with a function of controlling the external device through a wireless connection with the external device. That is, the function icon may be an icon associated with a function of transmitting, to the external device corresponding to the specified object, a control command for outputting specific information or a control command requesting transmission of specific information.
- controller 280 may determine an output position of the function icon on the display unit 251 such that the function icon is displayed to the user in an area adjacent to the region where the specified object is located.
- the controller 280 may use the optical unit 130 to form a virtual image of the visual information output to the display unit 251 at a position away from the user within the viewing range of the user.
- the virtual image forming area corresponding to the output area of the display unit 251 may be set within the viewing range of the user.
- the virtual image of the visual information may be output in the virtual image forming area.
- the virtual image of the visual information output in the virtual image forming area may be displayed to the user by overlapping with the external environment.
- the controller 280 may control the display unit 251 to output the virtual image of the function icon to a corresponding area corresponding to the area where the specified object is located within the virtual image forming area.
- the controller 280 may determine the location of the specified object within the viewing range of the user. That is, the distance and direction from the user to the point where the specified object is located may be determined based on the gaze analysis of the user.
- the controller 280 may adjust at least one of an output position, a size, and an output interval of the function icon on the display unit 251 so that the virtual image of the function icon is located in a corresponding area related to the determined position and direction of the specified object.
- an area 401 for outputting a virtual image corresponding to an output area of the display unit 251 may be set within a viewing range of the user.
- the controller 280 may control the display unit 251 to position the virtual image of a function icon in a corresponding area corresponding to the area in which the specified object 1 is located within the virtual image forming area 401. Accordingly, the user may be provided with the virtual images of the function icons 10, 11, and 12 in the peripheral area of the specified object, as shown in the fourth drawing of FIG. 3B.
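The placement rule above, icons positioned beside the specified object's region, with size adjusted to its distance, can be sketched as follows. The function name, base icon size, and scaling rule are illustrative assumptions.

```python
def layout_icons(object_box, num_icons, distance_m, base_size=40):
    """Return (x, y, size) for each icon, stacked beside the object region.

    `object_box` is (x, y, w, h) of the specified object in the virtual
    image forming area; icons shrink as the object gets farther away.
    """
    x, y, w, h = object_box
    # Closer objects appear larger, so icons are scaled the same way,
    # with a minimum legible size.
    size = max(16, int(base_size / max(distance_m, 1.0)))
    gap = size // 4
    return [(x + w + gap, y + i * (size + gap), size)
            for i in range(num_icons)]
```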
- the spectacles-type terminal may include a display unit formed to be light-transmissive at a position corresponding to at least one of both eyes of the user and configured to output visual information.
- the user may look at the external environment through the light transmitting display unit.
- the controller 280 may control the display unit such that the function icon is output to an area of the display unit corresponding to the gaze of the user.
- the display unit may output a real-time captured image of an external environment at a position corresponding to each of both eyes of the user.
- the controller may control the display unit to synthesize and output the function icon in one region corresponding to the specified object in the captured image of the external environment.
- the eyeglass terminal according to the present invention may specify an object of interest to the user even without a separate input from the user by using the captured image of the external environment and the user's gaze information.
- the user by providing a function icon related to the specified object, the user can be provided with information about a function that can be executed in relation to the object of interest only by a simple operation such as changing the line of sight.
- an eyeglass type terminal having a structure according to FIGS. 2A and 2B will be described as a main embodiment.
- the eyeglass terminal according to the embodiment of the present invention is not limited to the structure of the eyeglass terminal described with reference to FIGS. 2A and 2B.
- Hereinafter, the virtual image of the function icon will be referred to in the same way as the function icon.
- the user may execute a desired function by using the function icon. More specifically, in a state in which the function icon is output, the controller 280 may execute a function associated with the function icon based on a selection input related to the function icon from a user.
- the selection input may be received in various ways. For example, when a preset gesture of the user is detected by the sensing unit while the function icon is output, the controller 280 may determine that the selection input has been received. That is, the controller 280 may determine whether the selection input has been received based on whether the user's eye blinking (blinking of at least one of the left eye and the right eye, a preset number of blinks, etc.) is detected, or whether the user's preset hand gesture is detected.
- the sensing unit may include a plurality of sensors capable of sensing various gestures of the user.
- the selection input may be received by a touch or push applied to a user input unit (for example, a touch key or a push key) provided in the spectacles-type terminal.
- the function icons may be linked to all kinds of functions executable in the terminal, and a plurality of function icons may be output.
- the controller 280 may select a function icon associated with a function to be executed by the user from among the plurality of function icons based on the gaze analysis of the user.
- the eyes of the user may be analyzed to specify a function icon to which the eyes of the user are directed among the plurality of function icons.
- the controller 280 may execute a function associated with the specified function icon based on the detection of the preset gesture of the user in the state where the function icon is specified.
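The selection flow above, the gaze specifies one of several function icons and a preset gesture then executes the function linked to it, can be sketched as below. The icon names, gesture label, and bound actions are illustrative assumptions.

```python
def select_and_execute(gazed_icon, gesture, bindings, trigger="wink"):
    """Run the function bound to the gazed-at icon when the gesture matches."""
    if gazed_icon in bindings and gesture == trigger:
        return bindings[gazed_icon]()
    return None

# Hypothetical bindings for the three icons 10, 11, 12 described above.
bindings = {
    "info": lambda: "show info",
    "record": lambda: "start recording",
    "control": lambda: "connect to device",
}
```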
- FIGS. 4A and 4B are diagrams illustrating an embodiment in which a control icon associated with a function of controlling an external device corresponding to the specified object is selected.
- the controller 280 may specify an object in the external environment based on the gaze analysis of the user. In this case, the controller 280 may give a specific visual effect to the specified object so that the user can recognize the specified object.
- the controller 280 may control the display unit 251 such that a specific visual effect 1a overlaps the position of the specified object in the virtual image forming region 401 corresponding to the output area of the display unit 251.
- the controller 280 may output a function icon associated with a function executable in relation to the specified object 1 to the display unit 251.
- the function icons 10, 11, and 12 may be output in the corresponding region corresponding to the specified object in the virtual image forming region 401.
- the controller 280 may analyze the eyes of the user and specify the function icon to which the eyes of the user are directed among the function icons 10, 11, and 12.
- the controller 280 may control the display unit 251 so that the specified function icon 12 is distinctly displayed (12a) based on the gaze of the user.
- the controller 280 may determine that a user's selection input for the function icon has been received.
- When a preset gesture of the user is detected by the sensing unit in a state in which the function icon 12 is specified, the controller 280 may execute a function associated with the specified function icon 12.
- the controller 280 may obtain information about the specified object for use in executing a function associated with the function icon. More specifically, the controller 280 may extract an image of the specified object from an image corresponding to the external environment, and analyze the image of the specified object. And, based on the image analysis of the specified object, it is possible to obtain information about the specified object.
- the information about the specified object may be stored in the memory of the spectacles type terminal together with the image information of the specified object.
- the controller 280 may control the wireless communication unit to obtain information about the specified object corresponding to the image of the specified object from an external server.
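The information-acquisition step above, look up the specified object's information in the terminal's memory by an image key, and request it from an external server only on a miss, can be sketched as follows. The key format and the fetch callback are illustrative assumptions.

```python
def get_object_info(image_key, memory, fetch_from_server):
    """Return stored info for the object, falling back to an external server.

    `memory` maps an image-derived key to stored information; on a miss
    the info is fetched (e.g. via the wireless communication unit) and
    cached alongside the image key, mirroring the storage described above.
    """
    if image_key in memory:
        return memory[image_key]
    info = fetch_from_server(image_key)
    memory[image_key] = info
    return info
```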
- When the function icon is associated with a function of controlling an external device corresponding to the specified object, the information on the specified object may correspond to identification information of the external device for wirelessly connecting with the external device.
- the controller 280 may control the wireless communication unit to wirelessly connect with the external device corresponding to the specified object 1 by using the acquired identification information of the external device.
- the controller 280 may transmit a control command to the external device to transmit / receive information with the external device.
- the type of control command for controlling the external device corresponding to the specified object may be determined based on an input for selecting the function icon.
- For example, when a first gesture by the user is detected while the function icon is specified, the controller 280 may transmit, to the external device, a control command for receiving preset information from the external device; when a second gesture is detected, a control command for outputting preset information may be transmitted to the external device.
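The rule above, the type of control command depends on which preset gesture is detected while the function icon is specified, can be sketched as a simple mapping. The gesture labels and command payloads are illustrative assumptions, not a real device protocol.

```python
def build_control_command(gesture):
    """Map a detected gesture to a transmit/receive control command."""
    if gesture == "first":   # e.g. blinking one eye
        return {"type": "REQUEST_INFO"}  # ask the device to send its info
    if gesture == "second":  # e.g. a different preset gesture
        return {"type": "OUTPUT_INFO"}   # ask the device to output info
    return None              # unrecognized gesture: send nothing
```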
- For example, a gesture of the user blinking one eye may be detected by the sensing unit.
- the controller 280 may transmit a control command for receiving preset information from the external device 100.
- the preset information may be screen information currently being output to the display unit 151 of the external device 100.
- the controller 280 may output the received screen information to the display unit 251.
- the virtual image 20 of the screen information received from the external device 100 may be shown to the user in an overlapping state with the plurality of objects 1, 2, 3, and 4 located in the external environment.
- the controller 280 may control the wireless communication unit to transmit predetermined information to the external device 100.
- the preset information may be time information output to the display unit 151 of the external device 100.
- the user can easily select a function icon associated with a function that he / she wants to execute among the plurality of function icons by using the gaze.
- the user can selectively execute a detailed function related to one function.
- the controller 280 may temporarily store the received information in a memory. At this time, the user may have a need to deliver the received information to another external device, to confirm the received information through a larger display unit, or to share it with others.
- When the user applies an additional input while the information received from the first external device is output to the display unit, the spectacles-type terminal may provide a function of sharing the information received from the first external device with a second external device.
- an external device corresponding to the specified object will be referred to as a first external device and another external device as a second external device.
- FIGS. 5A and 5B illustrate an example of transferring information received from a first external device to a second external device.
- information is received from a first external device, and the controller 280 may output the received information to the display unit 251.
- the controller 280 may temporarily store the received information in a memory.
- a specific input may be applied from the user.
- the specific input may be a touch input applied to the main body of the spectacles-type terminal 200.
- the controller 280 may terminate the output of the received information from the display unit 251. Accordingly, in the field of view of the user, the virtual image 21 of the received information may disappear. In addition, based on the touch input applied to the spectacles-type terminal, the controller 280 may activate the camera 221 to photograph the front (external environment).
- the controller 280 may control the sensing unit to detect the gaze of the user when the touch input is released.
- the controller 280 may specify an object to which the gaze of the user is directed, based on the gaze analysis of the user, in a captured image corresponding to the external environment captured by the camera 221.
- the object may be a subject included in the external environment like the target.
- the object may be a second external device.
- the controller 280 may analyze an image of a second external device corresponding to the object, and obtain identification information corresponding to the second external device.
- the identification information corresponding to the second external device may be pre-registered in the memory together with the image of the second external device.
- controller 280 may control the wireless communication unit to transmit the information received from the first external device to the second external device.
- the information received from the first external device is transmitted to the second external device 500 as shown in the third drawing of FIG. 5A, and the second external device 500 may output the received information 21a to its display unit.
- the user may transfer the received information to the second external device only by applying a touch to the main body of the eyeglass terminal and changing the line of sight while the information received from the first external device is output to the display unit.
- That is, the user may transfer the information received from the first external device to the second external device 300 by maintaining a gaze toward the second external device 300 while maintaining a touch on the spectacles-type terminal and then releasing the touch.
- the information received from the first external device may be output together with the previously output information. That is, as shown in the third drawing of FIG. 5B, the display unit 351 of the second external device 300 may be divided into a plurality of areas, and the information 22 previously output on the display unit 351 and the information 21a received from the first external device may be output together.
- Accordingly, the user may check the information received from the first external device by comparing it with the information previously output from the second external device.
- the method of outputting information received from the first external device to the display unit of the second external device may be changed by setting of the eyeglass terminal or the second external device.
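The touch-and-gaze sharing flow described above (FIGS. 5A/5B) can be sketched as a small state machine: touching the terminal body starts the transfer mode, and releasing the touch while gazing at a second device forwards the information. Class and method names are illustrative assumptions.

```python
class TransferFlow:
    """Sketch of forwarding received info to a gazed-at second device."""

    def __init__(self, received_info):
        self.received_info = received_info
        self.touching = False
        self.sent_to = None

    def touch_down(self):
        # Output of the received info ends and the camera is activated.
        self.touching = True

    def touch_up(self, gazed_device):
        # On release, transmit to the device the user is gazing at, if any.
        if self.touching and gazed_device is not None:
            self.sent_to = gazed_device
        self.touching = False
        return self.sent_to
```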
- the controller 280 may form a control command to output preset information from an output unit of the external device.
- the preset information may include all kinds of information that can be output to the output unit of the external device.
- the preset information may include visual information that can be output through a display unit of the external device, auditory information that can be output through an audio output unit, and haptic information such as vibration of a preset pattern that can be output through a haptic module of the external device.
- the controller 280 may select information to be output from the external device among various information output from the external device based on various conditions. This will be described with reference to FIGS. 6A and 6B.
- FIGS. 6A and 6B illustrate an exemplary embodiment in which the glasses-type terminal controls the external device to output preset information from the external device.
- the controller 280 may specify an object to which the user's gaze is directed in an image corresponding to the external environment captured by the camera 221.
- controller 280 may detect that an event related to the specified object has occurred in the captured image corresponding to the external environment by the camera 221.
- the event related to the specified object may be defined as various movements related to the specified object, such as detecting a predetermined movement of the specified object or detecting that the predetermined movement is maintained for a predetermined time.
- the controller 280 may detect, through the camera 221, that the bowing operation of the specified object 1 is maintained for a predetermined time.
- the user may view the virtual image of the function icon output on the display unit 251 in the corresponding area corresponding to the specified object as shown in the second drawing of FIG. 6A.
- the user may apply a preset gesture (nod gesture) while keeping an eye on the function icon for controlling the external device.
- the controller 280 may transmit a control command for outputting preset information from the external device 100 according to the preset gesture.
- the preset information may be determined by the type of event related to the specified target. That is, the controller 280 may select the type of information to be output from the external device based on the type of event related to the specified object.
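The selection rule above, the type of information sent to the external device is chosen from the type of event detected for the specified object, can be sketched as a lookup table. The event names and information types are illustrative assumptions (e.g. a held bowing motion suggesting a dozing student).

```python
# Hypothetical event-to-information mapping.
EVENT_INFO = {
    "bowing_held": "wake-up alert",    # e.g. dozing student -> alert output
    "hand_raised": "question prompt",
    "eye_contact": "attention notice",
}

def info_for_event(event_type):
    """Return the information type to output for a detected event."""
    return EVENT_INFO.get(event_type)
```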
- the user can directly specify the type of information to be transmitted to the external device using voice.
- the controller 280 may analyze the voice input together with the selection gesture and select the type of information to be transmitted to the external device 100 by the selection gesture.
- the information 23 transmitted to the external device 100 may be output to the display unit 151 of the external device 100. Accordingly, the user is provided with the convenience of transmitting desired information to the external device according to the selection gesture by inputting a voice together with the selection gesture for executing the function of controlling the external device.
- the function icon may be a data forming icon associated with a function of forming data related to the specified object.
- exemplary embodiments related to this will be described with reference to FIGS. 7A and 7B.
- FIGS. 7A and 7B illustrate embodiments in which a function of generating information related to a specified object is executed.
- the spectacles-type terminal 200 may execute a function associated with a specific function icon 10 based on a gaze analysis of a user.
- the function associated with the function icon 10 may be a data forming function for forming image data for the specified object 5.
- As shown in the second drawing of FIG. 7A, when a one-eye blinking gesture is detected from the user in a state where the user's gaze toward the function icon 10 is detected, the controller 280 may execute the data forming function.
- the controller 280 may control the camera 221 to photograph the specified object based on the detection of the preset gesture of the user.
- the controller 280 may output the preview image 25 photographed with respect to the specified object 5 to the display unit 251 in real time.
- the preview image 25 may be shown to the user through the virtual image forming area 401 corresponding to the output area of the display unit.
- the controller 280 may control the camera 221 such that the photographing of the specified object 5 is performed for a preset time.
- As shown in the second drawing of FIG. 7B, the controller 280 may stop the photographing of the specified object 5 when the preset gesture of the user is detected while the photographing of the specified object 5 is being performed.
- the image data for the specified object 5 may be formed by the image photographed from when the photographing is performed to when it is stopped.
- the controller 280 may search for related data including motion information associated with the movement of the specified object 5 included in the preview image.
- the associated data may include pre-formed image data of the specified object including the associated motion information or image data of another object including the related motion information.
- the related data may be previously stored in a memory of the spectacles-type terminal 200 or may be received from an external server based on a specific input of the user.
- the controller 280 may overlap the association data with the preview image so as to compare the association data with the preview image.
- the controller 280 may select a section including a movement related to the movement of the specified object in the associated data section.
- the controller 280 may control the display unit 251 such that the section selected from the related data and the movement section of the specified target correspond to each other in the preview image.
- the user may form image data of one object located in an external environment by using the gaze, or compare related data related to the one object with the formed image data.
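The comparison step above, finding the section of the related data whose motion corresponds to the movement of the specified object in the preview image so the two can be displayed in corresponding sections, can be sketched over sequences of motion labels. The label sequences and the match score are illustrative assumptions.

```python
def best_matching_section(related_motion, preview_motion):
    """Return the start index in related_motion best matching the preview.

    Both arguments are sequences of per-frame motion labels; the window
    of related data with the most label agreements is selected.
    """
    n, m = len(related_motion), len(preview_motion)
    best_start, best_score = 0, -1
    for start in range(n - m + 1):
        score = sum(related_motion[start + i] == preview_motion[i]
                    for i in range(m))
        if score > best_score:
            best_start, best_score = start, score
    return best_start
```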
- sub-menu icons for various functions that can be executed in relation to the function icon may be output.
- the submenu icon may be secondarily selected based on the detection of the user's gaze and a specific operation, and thus a function associated with the selected submenu icon may be executed.
- the function icon may be an information icon associated with a function for checking information related to the specified object. This will be described with reference to FIGS. 8A and 8B.
- FIGS. 8A and 8B illustrate an example in which a function of confirming information related to a specified object is executed.
- The object 2 is specified, and as shown in the second drawing of FIG. 8A, the function icons 10, 11, and 12 may be located in the corresponding area corresponding to the area in which the specified object 2 is located.
- a function icon 10 associated with a function for confirming information related to the specified object may be specified based on a user's gaze.
- the controller 280 may output information related to the specified object to the display unit 251.
- the information related to the specified object may include all kinds of information for identifying the specified object, such as personal information and grade information of the specified object.
- the controller 280 may control the display unit 251 to display the information 27a related to the specified object in one area corresponding to the area where the specified object is located, similarly to the function icon.
- the controller 280 may determine an output amount of information related to the specified object according to the number of objects included in a field of view range of the camera 221.
- Since the camera 221 photographs the external environment using a preset field of view, the number of objects photographed by the camera 221 may vary as the distance between the user and the objects included in the external environment changes.
- As shown in FIG. 8A, when the user is relatively far from the objects included in the external environment, four objects may be included within the field of view range of the camera.
- As shown in FIG. 8B, when the user is relatively close to the objects included in the external environment, two objects may be included in the field of view range of the camera 221.
- the controller 280 may control the display unit 251 such that the output amount of information 27b when a small number of objects are included in the field of view range of the camera 221 is larger than the output amount of information 27a when a large number of objects are included in the field of view range of the camera 221.
- Accordingly, when the user looks at the specified object through the virtual image forming area 401 corresponding to the output area of the display unit 251, the user may be provided with an appropriate amount of information within his or her viewing range.
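The rule above (FIGS. 8A/8B), show more information per object when fewer objects fall within the camera's field of view, can be sketched as below. The profile fields and count thresholds are illustrative assumptions.

```python
# Hypothetical stored information for one specified object.
PROFILE = {"name": "Kim", "id": "S-07", "grade": "A", "notes": "team lead"}

def info_to_display(profile, objects_in_view):
    """Return more fields when few objects are in view, fewer when many."""
    if objects_in_view > 3:
        fields = ["name"]                       # crowded view: name only
    elif objects_in_view > 1:
        fields = ["name", "id"]                 # a few objects: brief info
    else:
        fields = ["name", "id", "grade", "notes"]  # single object: full info
    return {k: profile[k] for k in fields}
```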
- the spectacles-type terminal may perform wireless communication with a plurality of external devices included within a predetermined distance from the spectacles-type terminal.
- the eyeglass type terminal may simultaneously transmit information to each of the plurality of external devices or receive information from each of the plurality of external devices based on the detection of the user's gaze and a predetermined movement.
- the spectacles-type terminal may enter a specific mode and be wirelessly connected to each of a plurality of external devices that are registered in the spectacles-type terminal.
- FIGS. 9A and 9B are diagrams illustrating an embodiment in which information is transmitted to each of a plurality of external devices from a spectacle type terminal.
- the glasses-type terminal may set an operation mode through its settings. For example, the terminal may operate in a first mode capable of transmitting and receiving specific information with a single external device, or in a second mode capable of simultaneously transmitting and receiving specific information with each of a plurality of external devices.
- the controller 280 may control the wireless communication unit to wirelessly connect with a plurality of external devices included within a predetermined distance from the spectacle type terminal 200.
- identification information corresponding to each of the plurality of external devices may be registered in advance in the eyeglass terminal.
- the controller 280 may control the camera 221 to capture the external environment and control the sensing unit to detect the user's gaze on the external environment.
- the controller 280 may specify the object to which the user's gaze is directed from the captured image corresponding to the external environment captured by the camera 221.
- the controller 280 may acquire the screen information 30 output to the display unit of the external device 500 through image analysis of the external device 500 corresponding to the specified object. For example, the controller 280 may obtain identification information corresponding to the external device 500 and receive the screen information 30 from the external device 500. As another example, the controller 280 may acquire a captured image of the screen information 30 using the camera 221.
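The two acquisition paths just described — identifying the device and receiving its screen information wirelessly, or falling back to a camera capture — can be sketched as below. The function and its injected callables are hypothetical stand-ins for the controller's wireless communication unit and camera, not APIs from the patent.

```python
def acquire_screen_info(specified_object, identify_device,
                        request_screen, capture_region):
    """Obtain the screen information (30) shown on the external device the
    user is gazing at. First try image analysis to get identification
    information and receive the screen wirelessly; otherwise capture the
    screen region with the camera. All names are illustrative."""
    device_id = identify_device(specified_object["image"])  # image analysis
    if device_id is not None:
        return request_screen(device_id)        # receive from device 500
    return capture_region(specified_object["bbox"])  # camera-capture fallback
```

In use, `identify_device` would match the captured image of the device against pre-registered identification information.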
- based on the detection of the user's preset movement (for example, a gesture of moving a hand in one direction) while the user's gaze is directed toward the external device 500, the controller 280 may transmit the screen information 30 to a plurality of external devices.
- for example, the screen information 30 may be transmitted to a plurality of external devices 100 and 100 ′ pre-registered in the glasses-type terminal 200.
- the terminal 300 of the user may be specified as the one external device to which the user's eyes are directed.
- in this case, the controller 280 may transmit the information 28 output on the display unit 351 of the terminal 300 of the user to the external devices 100 and 100 '.
- in addition, the controller 280 may control the display of additional information, formed by the user, on the screen information 28 transmitted to each of the plurality of external devices 100 and 100 '.
- the additional information formed by the user may be information corresponding to content highlighted based on the user's gaze, voice, etc. among contents included in the screen information 28.
- for example, the controller 280 may transmit the information included in the area 28a, specified by the user's gaze in the screen information 28, to the plurality of external devices 100 and 100 ' as the additional information.
- the additional information may be highlighted and displayed on the screen information 28 output to the display unit of each of the plurality of external devices 100 and 100 ′.
- as another example, the controller 280 may transmit at least a portion of the screen information 28 corresponding to the user's voice to the plurality of external devices 100 and 100 ' as the additional information.
- as such, the controller 280 may simultaneously transmit information to a plurality of external devices included within a predetermined distance from the glasses-type terminal, based on the user's gaze or a predetermined movement. In addition, by transmitting the additional information highlighted by the user together with the transmitted information, convenience may be provided to the users corresponding to each of the plurality of external devices.
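The simultaneous transmission with an attached highlight can be sketched as a function that builds one payload per pre-registered device. This is an illustrative sketch; the payload shape and device identifiers are assumptions, not part of the patent.

```python
def build_broadcast(registered_devices, screen_info, highlighted_area=None):
    """Prepare the same screen information (28) for every pre-registered
    external device at once. If the user highlighted an area (28a) by gaze
    or voice, attach it as additional information so each receiving device
    can emphasize it. All field names are illustrative."""
    payload = {"screen": screen_info}
    if highlighted_area is not None:
        payload["highlight"] = highlighted_area  # additional information
    # one independent payload copy per device, e.g. devices 100 and 100'
    return [(device, dict(payload)) for device in registered_devices]
```

A wireless communication unit would then send each `(device, payload)` pair in turn.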
- meanwhile, the glasses-type terminal may receive information from each of a plurality of wirelessly connected external devices. This will be described with reference to FIG. 10.
- FIG. 10 is a diagram illustrating an embodiment in which an eyeglass type terminal simultaneously receives information from a plurality of external devices.
- in a state where the user's gaze toward the external environment is detected, the user's preset gesture (for example, a gesture of blinking both eyes) may be detected.
- the controller 280 may receive information output to the display units of the plurality of external devices 100a, 100b, 100c, and 100d at one time by using the wireless communication unit.
- the controller 280 may control the display unit 251 such that summary information about the information received from each of the plurality of external devices 100a, 100b, 100c, and 100d is output to a peripheral area of the object corresponding to each of the plurality of external devices 100a, 100b, 100c, and 100d.
- the information received from each of the plurality of external devices 100a, 100b, 100c, and 100d may be information input by a user corresponding to each of the plurality of external devices 100a, 100b, 100c, and 100d.
- for example, the received information may be problem-solving information input for a mathematical problem. In this case, the controller 280 may analyze the information received from each of the plurality of external devices 100a, 100b, 100c, and 100d and determine the summary information (for example, by grading the problem-solving information, information indicating whether the problem-solving information corresponds to the correct answer).
- in addition, the summary information 29a, 29b, 29c, and 29d may each be located in an area corresponding to the area where the object corresponding to each of the plurality of external devices 100a, 100b, 100c, and 100d is located.
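The grading step in the FIG. 10 classroom example can be sketched as below. The function compares each device's received answer against an answer key and produces per-device summary information; the names and the string labels are illustrative assumptions.

```python
def summarize_answers(received_answers, answer_key):
    """Grade the problem-solving information received from each external
    device (e.g. 100a-100d) against an answer key, producing the summary
    information (29a-29d) shown near each device's corresponding object.
    Labels and structure are illustrative only."""
    return {device_id: ("correct" if answer == answer_key else "incorrect")
            for device_id, answer in received_answers.items()}
```

The display unit would then render each summary string in the area adjacent to that device's object.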
- as described above, one external device or a plurality of external devices may be controlled according to the mode set in the glasses-type terminal.
- accordingly, a user wearing the glasses-type terminal can easily control one external device or a plurality of external devices by using a simple input, such as changing the gaze or making a gesture.
- meanwhile, the glasses-type terminal according to the present invention may detect a specific movement of an object other than the specified one object within the angle of view of the camera 221.
- an embodiment related to this will be described with reference to FIG. 11.
- FIG. 11 is a diagram illustrating an embodiment of providing a notification for an object different from the object specified based on the line of sight of the user within the angle of view of the camera.
- the controller 280 may output the function icon to the corresponding area that corresponds to the region where the one object 1 is located.
- in this state, a specific movement of another object 4, different from the one object 1 to which the user's eyes are directed, may be detected by the camera 221.
- for example, the other object 4 may make a specific motion, such as raising a hand or getting up from a seat.
- in this case, the controller 280 may control the display unit 251 to output a notification 4a, notifying the user of the specific movement, to a region related to the region where the other object 4 is located. Therefore, the user can easily grasp the movement of the other object 4, which would otherwise not have been noticed, even in a state where the gaze is fixed on the one object 1.
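The notification logic of FIG. 11 can be sketched as a scan over all objects in the camera's angle of view, flagging any object other than the gazed-at one whose motion exceeds a threshold. The object fields and the motion-score threshold are illustrative assumptions, not details from the patent.

```python
def movement_notifications(objects, gazed_object_id, motion_threshold=0.5):
    """Return a notification (cf. 4a) for every object other than the one
    the user is gazing at (object 1) whose motion score exceeds the
    threshold (e.g. the other object 4 raising a hand). Each notification
    carries the region where it should be rendered. Illustrative only."""
    notes = []
    for obj in objects:
        if obj["id"] != gazed_object_id and obj["motion"] > motion_threshold:
            notes.append({"object_id": obj["id"], "anchor": obj["region"]})
    return notes
```

The display unit would render each notification in the region anchored to the moving object, leaving the gazed-at object's area undisturbed.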
- as described above, the glasses-type terminal may specify one object in the external environment through analysis of the user's gaze and provide an icon of a function that can be executed in relation to the specified object. Accordingly, the user may be intuitively provided with information about functions executable in relation to the specified object, without directly searching for information related to it.
- in addition, the user can conveniently execute the various functions linked to the function icon by using a non-contact input, such as changing the line of sight or performing a preset gesture.
- the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
- the computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and the like. This also includes implementations in the form of carrier waves (eg, transmission over the Internet).
- the computer may include the controller 180 of the terminal. Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Optics & Photonics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- User Interface Of Digital Computer (AREA)
Claims (20)
- 1. A wearable glasses-type terminal comprising: a display unit provided in a main body of the glasses-type terminal; a camera configured to photograph an external environment; a sensing unit configured to sense a user's gaze on the external environment; and a controller configured to analyze the user's gaze on the external environment to specify, from an image corresponding to the external environment photographed by the camera, an object to which the user's gaze is directed, and, based on an event related to the specified object being photographed through the camera, to control the display unit such that a function icon linked to a preset function is located in a corresponding area that corresponds to a region where the specified object is located.
- 2. The glasses-type terminal of claim 1, wherein the controller executes the function linked to the function icon based on receiving a selection input related to the function icon.
- 3. The glasses-type terminal of claim 2, wherein the selection input related to the function icon is received when a preset gesture of the user is sensed by the sensing unit while the user's gaze toward the function icon is sensed.
- 4. The glasses-type terminal of claim 1, wherein the event related to the specified object is a preset movement of the specified object photographed through the camera.
- 5. The glasses-type terminal of claim 2, further comprising a wireless communication unit, wherein the function icon includes a control icon linked to a function of controlling an external device corresponding to the specified object, and wherein, when the function of controlling the external device corresponding to the specified object is executed, the controller controls the wireless communication unit to transmit a preset control command to the external device.
- 6. The glasses-type terminal of claim 5, wherein the controller analyzes an image of the specified object included in the image corresponding to the external environment to acquire identification information of the external device corresponding to the specified object, and controls the wireless communication unit to perform wireless communication with the external device by using the identification information of the external device.
- 7. The glasses-type terminal of claim 5, wherein, when a first gesture of the user is sensed as the selection input related to the control icon, the controller controls the wireless communication unit to transmit, to the external device, a first control command requesting transmission of preset information, and, when a second gesture of the user is sensed as the selection input related to the control icon, the controller controls the wireless communication unit to transmit, to the external device, a second control command causing the external device to output preset information.
- 8. The glasses-type terminal of claim 7, wherein the preset information received from the external device according to the first control command is screen information being output on a display unit of the external device, and wherein, when the screen information being output on the display unit of the external device is received by the wireless communication unit, the controller outputs the screen information on the display unit.
- 9. The glasses-type terminal of claim 8, wherein the controller controls the wireless communication unit to transmit the screen information to a device other than the external device based on a specific input being applied while the screen information is output on the display unit, and wherein the other device is selected based on the user's gaze sensed by the sensing unit.
- 10. The glasses-type terminal of claim 7, wherein the controller selects the preset information to be output by the external device according to the second control command, based on a type of the event related to the specified object photographed through the camera.
- 11. The glasses-type terminal of claim 2, wherein the function icon includes an information icon linked to a function of checking information related to the specified object, and wherein, when the function of checking information related to the specified object is executed, the controller analyzes an image of the specified object included in the image corresponding to the external environment to acquire the information related to the specified object, and controls the display unit to output at least a portion of the acquired information related to the specified object.
- 12. The glasses-type terminal of claim 11, wherein the controller adjusts an output amount of the information related to the specified object according to the number of objects included in the image corresponding to the external environment.
- 13. The glasses-type terminal of claim 2, wherein the function icon includes a data formation icon linked to a function of forming image data of the specified object, and wherein the controller controls the camera to photograph the specified object when a selection input related to the data formation icon is received, and controls the display unit to output a preview image of the specified object photographed by the camera.
- 14. The glasses-type terminal of claim 13, wherein, when there is a request to output an associated data image including movement information associated with a movement of the specified object included in the preview image while the preview image is output on the display unit, the controller controls the display unit to output the associated data image overlapping the preview image.
- 15. The glasses-type terminal of claim 1, wherein, when a specific movement of an object other than the specified object is sensed in the image corresponding to the external environment, the controller controls the display unit to output information notifying the specific movement of the other object.
- 16. A method of controlling a glasses-type terminal, the method comprising: photographing an external environment by a camera; sensing, by a sensing unit, a user's gaze on the external environment; analyzing the user's gaze on the external environment to specify, from an image corresponding to the external environment, an object to which the user's gaze is directed; and locating, based on an event related to the specified object being photographed through the camera, a function icon linked to a preset function in a corresponding area that corresponds to a region where the specified object is located.
- 17. The method of claim 16, further comprising: receiving a selection input related to the function icon; and executing the function linked to the function icon based on the selection input being received.
- 18. The method of claim 17, wherein the selection input related to the function icon is received when a preset gesture of the user is sensed while the user's gaze toward the function icon is sensed by the sensing unit.
- 19. The method of claim 17, wherein executing the function linked to the function icon comprises: analyzing an image of the specified object included in the image corresponding to the external environment; acquiring information related to the specified object based on the image analysis of the specified object; and executing the function linked to the function icon by using the acquired information related to the specified object.
- 20. The method of claim 16, wherein the event related to the specified object is a preset movement of the specified object photographed through the camera.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/569,922 US10514755B2 (en) | 2015-05-08 | 2015-05-08 | Glasses-type terminal and control method therefor |
PCT/KR2015/004579 WO2016182090A1 (ko) | 2015-05-08 | 2015-05-08 | 안경형 단말기 및 이의 제어방법 |
KR1020177029126A KR102110208B1 (ko) | 2015-05-08 | 2015-05-08 | 안경형 단말기 및 이의 제어방법 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2015/004579 WO2016182090A1 (ko) | 2015-05-08 | 2015-05-08 | 안경형 단말기 및 이의 제어방법 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016182090A1 true WO2016182090A1 (ko) | 2016-11-17 |
Family
ID=57249069
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2015/004579 WO2016182090A1 (ko) | 2015-05-08 | 2015-05-08 | 안경형 단말기 및 이의 제어방법 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10514755B2 (ko) |
KR (1) | KR102110208B1 (ko) |
WO (1) | WO2016182090A1 (ko) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI656359B (zh) * | 2017-05-09 | 2019-04-11 | 瑞軒科技股份有限公司 | 用於混合實境之裝置 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109804357A (zh) | 2016-10-07 | 2019-05-24 | 索尼公司 | 服务器、客户端、控制方法和存储介质 |
US10732826B2 (en) * | 2017-11-22 | 2020-08-04 | Microsoft Technology Licensing, Llc | Dynamic device interaction adaptation based on user engagement |
WO2019193885A1 (ja) * | 2018-04-06 | 2019-10-10 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
WO2019212798A1 (en) * | 2018-05-01 | 2019-11-07 | Snap Inc. | Image capture eyewear with auto-send |
KR20210019826A (ko) * | 2019-08-13 | 2021-02-23 | 삼성전자주식회사 | Ar 글래스 장치 및 그 동작 방법 |
US11481177B2 (en) * | 2020-06-30 | 2022-10-25 | Snap Inc. | Eyewear including multi-user, shared interactive experiences |
KR20220138933A (ko) * | 2021-04-07 | 2022-10-14 | 삼성전자주식회사 | 메타 렌즈를 포함하는 카메라 및 그 카메라를 포함하는 웨어러블 전자 장치 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20130000401A (ko) * | 2010-02-28 | 2013-01-02 | 오스터하우트 그룹 인코포레이티드 | 대화형 머리장착식 아이피스 상의 지역 광고 컨텐츠 |
US20130114850A1 (en) * | 2011-11-07 | 2013-05-09 | Eye-Com Corporation | Systems and methods for high-resolution gaze tracking |
KR20130059827A (ko) * | 2011-11-29 | 2013-06-07 | 학교법인 한국산업기술대학 | 동공인식을 이용한 안경 카메라 |
KR20130067902A (ko) * | 2011-12-14 | 2013-06-25 | 한국전자통신연구원 | 안경형 이동 통신 단말 장치 |
KR20140128489A (ko) * | 2013-04-25 | 2014-11-06 | (주)세이엔 | 영상 인식과 터치 인터페이스를 이용한 스마트 안경 및 그의 제어 방법 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10180572B2 (en) * | 2010-02-28 | 2019-01-15 | Microsoft Technology Licensing, Llc | AR glasses with event and user action control of external applications |
KR20140072651A (ko) * | 2012-12-05 | 2014-06-13 | 엘지전자 주식회사 | 글래스타입 휴대용 단말기 |
US9791921B2 (en) | 2013-02-19 | 2017-10-17 | Microsoft Technology Licensing, Llc | Context-aware augmented reality object commands |
JP5962547B2 (ja) * | 2013-03-08 | 2016-08-03 | ソニー株式会社 | 情報処理装置、情報処理方法およびプログラム |
KR20140139883A (ko) * | 2013-05-28 | 2014-12-08 | 엘지전자 주식회사 | 디스플레이 장치 |
KR20150018264A (ko) * | 2013-08-09 | 2015-02-23 | 엘지전자 주식회사 | 안경형 단말기의 정보 제공 장치 및 그 방법 |
KR20150033431A (ko) * | 2013-09-24 | 2015-04-01 | 엘지전자 주식회사 | 안경형 단말기 및 그 제어 방법 |
US10248192B2 (en) * | 2014-12-03 | 2019-04-02 | Microsoft Technology Licensing, Llc | Gaze target application launcher |
US10409443B2 (en) * | 2015-06-24 | 2019-09-10 | Microsoft Technology Licensing, Llc | Contextual cursor display based on hand tracking |
CN107787472A (zh) * | 2015-08-04 | 2018-03-09 | 谷歌有限责任公司 | 用于虚拟现实中的凝视交互的悬停行为 |
US10101803B2 (en) * | 2015-08-26 | 2018-10-16 | Google Llc | Dynamic switching and merging of head, gesture and touch input in virtual reality |
US10373381B2 (en) * | 2016-03-30 | 2019-08-06 | Microsoft Technology Licensing, Llc | Virtual object manipulation within physical environment |
US10395428B2 (en) * | 2016-06-13 | 2019-08-27 | Sony Interactive Entertainment Inc. | HMD transitions for focusing on specific content in virtual-reality environments |
US10663729B2 (en) * | 2016-09-15 | 2020-05-26 | Daqri, Llc | Peripheral device for head-mounted display |
JP2018097160A (ja) * | 2016-12-14 | 2018-06-21 | セイコーエプソン株式会社 | 表示システム、表示装置、及び、表示装置の制御方法 |
US10564720B2 (en) * | 2016-12-31 | 2020-02-18 | Daqri, Llc | User input validation and verification for augmented and mixed reality experiences |
US10290152B2 (en) * | 2017-04-03 | 2019-05-14 | Microsoft Technology Licensing, Llc | Virtual object user interface display |
US10871934B2 (en) * | 2017-05-04 | 2020-12-22 | Microsoft Technology Licensing, Llc | Virtual content displayed with shared anchor |
-
2015
- 2015-05-08 KR KR1020177029126A patent/KR102110208B1/ko active IP Right Grant
- 2015-05-08 US US15/569,922 patent/US10514755B2/en not_active Expired - Fee Related
- 2015-05-08 WO PCT/KR2015/004579 patent/WO2016182090A1/ko active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20130000401A (ko) * | 2010-02-28 | 2013-01-02 | 오스터하우트 그룹 인코포레이티드 | 대화형 머리장착식 아이피스 상의 지역 광고 컨텐츠 |
US20130114850A1 (en) * | 2011-11-07 | 2013-05-09 | Eye-Com Corporation | Systems and methods for high-resolution gaze tracking |
KR20130059827A (ko) * | 2011-11-29 | 2013-06-07 | 학교법인 한국산업기술대학 | 동공인식을 이용한 안경 카메라 |
KR20130067902A (ko) * | 2011-12-14 | 2013-06-25 | 한국전자통신연구원 | 안경형 이동 통신 단말 장치 |
KR20140128489A (ko) * | 2013-04-25 | 2014-11-06 | (주)세이엔 | 영상 인식과 터치 인터페이스를 이용한 스마트 안경 및 그의 제어 방법 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI656359B (zh) * | 2017-05-09 | 2019-04-11 | 瑞軒科技股份有限公司 | 用於混合實境之裝置 |
US10606098B2 (en) | 2017-05-09 | 2020-03-31 | Amtran Technology Co., Ltd. | Device for mixed reality |
US10613345B2 (en) | 2017-05-09 | 2020-04-07 | Amtran Technology Co., Ltd. | Mixed reality assembly and method of generating mixed reality |
US10795178B2 (en) | 2017-05-09 | 2020-10-06 | Amtran Technology Co., Ltd. | Device for mixed reality |
Also Published As
Publication number | Publication date |
---|---|
KR102110208B1 (ko) | 2020-05-13 |
KR20180004112A (ko) | 2018-01-10 |
US10514755B2 (en) | 2019-12-24 |
US20180150133A1 (en) | 2018-05-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016190505A1 (ko) | 글래스타입 단말기 및 이의 제어방법 | |
WO2016182090A1 (ko) | 안경형 단말기 및 이의 제어방법 | |
WO2015053449A1 (ko) | 안경형 영상표시장치 및 그것의 제어방법 | |
WO2019147021A1 (en) | Device for providing augmented reality service, and method of operating the same | |
WO2017051975A1 (ko) | 이동 단말기 및 그 제어방법 | |
WO2016024746A1 (en) | Mobile terminal | |
WO2017018603A1 (ko) | 이동 단말기 및 이의 제어방법 | |
WO2016195147A1 (ko) | 헤드 마운티드 디스플레이 | |
WO2018056473A1 (ko) | 헤드 마운티드 디스플레이 장치 | |
WO2018124334A1 (ko) | 전자장치 | |
WO2018070624A2 (en) | Mobile terminal and control method thereof | |
WO2015194723A1 (ko) | 이동단말기 및 그 제어방법 | |
WO2015174611A1 (ko) | 이동 단말기 및 그것의 제어 방법 | |
WO2019160194A1 (ko) | 이동 단말기 및 그 제어방법 | |
WO2016195207A1 (ko) | 이동 단말기 | |
WO2017026554A1 (ko) | 이동 단말기 | |
WO2017204498A1 (ko) | 이동 단말기 | |
WO2016039496A1 (ko) | 이동단말기 및 그 제어방법 | |
WO2017022872A1 (ko) | 헤드 마운티드 디스플레이 및 그 제어방법 | |
WO2016039509A1 (ko) | 단말기 및 그 동작 방법 | |
WO2018135675A1 (ko) | 전자장치 | |
WO2016027932A1 (en) | Glass-type mobile terminal and control method thereof | |
WO2016013768A1 (ko) | 이동단말기 및 그 제어방법 | |
WO2016035920A1 (ko) | 이동 단말기 및 이의 제어 방법 | |
WO2016024707A1 (ko) | 이동 단말기 및 그 제어 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15891906 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20177029126 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15569922 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15891906 Country of ref document: EP Kind code of ref document: A1 |