WO2016060293A1 - Image information display device and control method therefor - Google Patents

Image information display device and control method therefor

Info

Publication number
WO2016060293A1
WO2016060293A1 PCT/KR2014/009687 KR2014009687W WO2016060293A1 WO 2016060293 A1 WO2016060293 A1 WO 2016060293A1 KR 2014009687 W KR2014009687 W KR 2014009687W WO 2016060293 A1 WO2016060293 A1 WO 2016060293A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
image information
function
display
control unit
Prior art date
Application number
PCT/KR2014/009687
Other languages
English (en)
Korean (ko)
Inventor
함준석
신윤호
김상래
정성재
박정아
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to PCT/KR2014/009687
Publication of WO2016060293A1 publication Critical patent/WO2016060293A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02Viewing or reading apparatus

Definitions

  • the present invention relates to a glass type image information display apparatus capable of displaying image information and a method of controlling the apparatus.
  • Image information display devices continue to develop in newer forms. For example, whereas image information was conventionally displayed on fixed devices such as TVs or monitors, such devices have become light and small enough to be carried by a user, and have thus developed into various forms such as mobile terminals and notebook computers.
  • the image information display device has become smaller and has evolved into a wearable form that a user can wear like a watch or glasses. Accordingly, various devices such as smart watches are emerging, and accordingly, technology for displaying image information has been developed day by day.
  • displays capable of displaying image information are also being developed in new forms thanks to the development of these technologies.
  • Current displays may be manufactured in a transparent form through which objects located behind the display can be seen, and thus may be implemented in the form of glasses.
  • Such a glass-type image information display device can not only display image information according to a user's selection, but also allow the user to visually check the external environment through the display.
  • Since such a glass-type image information display device can recognize the external environment through the display, it can provide a variety of functions using it. Accordingly, research into methods of displaying image information more effectively in the glass-type image information display apparatus is active.
  • An object of the present invention is to provide an image information display apparatus and a method of controlling the apparatus that can display image information more effectively based on an external environment in a glass type image information display apparatus.
  • According to the present invention, the image information display device includes a main body including a plurality of glass display units which are formed to be transparent so that the user can see objects located behind them, and which correspond respectively to the user's two eyes; a sensing unit that detects the situation around the main body; a memory containing information about the user's dominant and non-dominant eyes; and a controller configured to display, on at least one of the plurality of display units, image information related to multimedia content or to at least one function executable on the image information display apparatus. The controller determines the user's current situation based on the result of sensing the surroundings and, depending on the determination result, displays the image information related to the executed function on the display unit corresponding to the user's dominant eye.
  • According to an embodiment, the controller determines whether the user's position is moving based on the result of sensing the surroundings of the main body and, according to the determination result, displays the image information related to a preset function on the display unit corresponding to the user's dominant eye, wherein the preset function is a navigation function or a function related to the navigation function.
  • According to an embodiment, the sensing unit includes a camera that receives images of the surroundings of the main body, and the controller determines the user's situation based on the degree to which the image received from the camera changes within a predetermined time.
  • According to an embodiment, the sensing unit further comprises at least one sensor for detecting the user's movement, and while the user's position is moving, the controller determines the user's situation based on the user's movement detected by the sensing unit.
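  • As a rough illustrative sketch only (not part of the original disclosure), the camera-based and movement-sensor-based situation determination described above might look as follows. The thresholds, function names, and the use of simple frame differencing are all assumptions:

```python
import numpy as np

def is_scene_changing(frames, diff_threshold=12.0):
    # Mean absolute difference between consecutive camera frames; a large
    # value means the received image changed noticeably within the window.
    diffs = [np.abs(b.astype(float) - a.astype(float)).mean()
             for a, b in zip(frames, frames[1:])]
    return float(np.mean(diffs)) > diff_threshold

def classify_situation(frames, accel_magnitude, walk_threshold=1.5):
    # A changing scene combined with little body movement suggests the
    # user is riding a vehicle rather than walking, matching the
    # embodiment that combines the camera with a movement sensor.
    if not is_scene_changing(frames):
        return "stationary"
    return "walking" if accel_magnitude > walk_threshold else "in_vehicle"
```

  • In this sketch, a scene that changes while the body barely moves is classified as riding a vehicle, which mirrors the later example in which the user's position moves while the user's own movement is slight.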
  • According to an embodiment, the controller determines whether the user is moving by means of a vehicle and, based on whether the user is the driver of the vehicle, displays the image information related to the function on the display unit corresponding to the user's dominant eye.
  • According to an embodiment, the controller determines whether the user is the driver of the vehicle based on whether the vehicle the user has boarded is a preset vehicle.
  • According to an embodiment, the controller detects the position of the driver's seat of the vehicle through a vehicle control unit that controls the overall driving of the vehicle, and determines whether the user is the driver based on the position of the driver's seat and the position of the main body.
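  • A minimal sketch of the driver determination described above, assuming 2-D positions and an illustrative 0.5 m radius (the patent does not specify a distance criterion):

```python
import math

def is_driver(driver_seat_pos, main_body_pos, threshold_m=0.5):
    # The vehicle control unit reports the driver's-seat position; if the
    # worn main body is within a small radius of that seat, the wearer is
    # judged to be the driver. The 0.5 m radius is an assumed value.
    dx = driver_seat_pos[0] - main_body_pos[0]
    dy = driver_seat_pos[1] - main_body_pos[1]
    return math.hypot(dx, dy) <= threshold_m
```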
  • According to an embodiment, the function related to the navigation function includes a function of displaying information about features around the user's current location.
  • According to an embodiment, the preset function includes a function executed by a preset event, wherein the preset event occurs when a message or a call is received.
  • According to an embodiment, when the event occurs, the controller evaluates the importance of the generated event and, based on the evaluated importance, displays the image information related to the event on the display unit corresponding to the user's dominant eye.
  • the controller may evaluate the importance based on text included in the message.
  • According to an embodiment, the control unit searches whether the text included in the message contains preset words and, when at least one preset word is found, evaluates the importance of the message by summing the importance scores matched differently to each word.
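  • The keyword-scoring idea above can be sketched as follows. The keyword table and the threshold are purely illustrative; the text only states that each preset word carries a differently matched score:

```python
# Hypothetical keyword table; the scores and words are assumptions.
KEYWORD_SCORES = {"urgent": 5, "accident": 5, "meeting": 3, "lunch": 1}

def evaluate_importance(message_text, threshold=4):
    # Sum the scores of every preset word found in the message; events at
    # or above the (assumed) threshold would be routed to the display
    # unit corresponding to the dominant eye.
    text = message_text.lower()
    score = sum(pts for word, pts in KEYWORD_SCORES.items() if word in text)
    return score, score >= threshold
```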
  • the controller may evaluate the importance based on the telephone number of the received call.
  • According to an embodiment, the controller detects different aspects of the surroundings of the main body according to the executed function.
  • According to an embodiment, the preset function includes a function of displaying notification information for notifying the user when an object approaching the user is detected within a predetermined distance, and the controller monitors the surroundings of the main body and displays the notification information notifying the approach of the object on the display unit corresponding to the user's dominant eye.
  • According to an embodiment, the controller classifies the type of the multimedia content and, according to that type, displays the image information related to the multimedia content on a display unit corresponding to the user's dominant eye.
  • According to an embodiment, when a preset event occurs, the control unit displays the notification information of the generated event on the display unit corresponding to the user's dominant eye.
  • According to an embodiment, the control unit determines the position of the image information displayed on the display unit corresponding to the dominant eye, based on the result of sensing the situation around the user.
  • According to an embodiment, the control unit displays the image information in an area other than the display area in which preset objects, among the objects seen through the display unit corresponding to the dominant eye, are identifiably displayed.
  • According to the present invention, the control method of the image information display apparatus includes determining whether a preset function is executed; when the preset function is executed, determining the user's situation by sensing the surroundings of the main body of the image information display device; and, based on the determined situation of the user, displaying image information related to the execution of the preset function on the display unit corresponding to the user's dominant eye among the plurality of glass display units.
  • The present invention determines whether image information to be displayed on the display unit should be recognized by the user promptly, and outputs that image information to a specific display unit, so that the user can recognize the image information more quickly and accurately.
  • The present invention also senses the surrounding situation and, based on the sensed situation, outputs image information related to a preset function or event on a specific display unit, so that the user can recognize the image information quickly and accurately.
  • FIG. 1 is a block diagram showing a block configuration of an image information display apparatus according to an embodiment of the present invention.
  • FIG. 2 is an exemplary diagram illustrating an example in which an image information display apparatus according to an exemplary embodiment of the present invention is implemented in a glass type.
  • FIG. 3 is a flowchart illustrating an operation of displaying image information based on a currently performed function and a sensed surrounding situation by an image information display apparatus according to an exemplary embodiment.
  • FIG. 4 is a flowchart illustrating an operation of evaluating the importance of an event generated and displaying image information according to the image information display apparatus according to an exemplary embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating an operation process of detecting an ambient situation and displaying image information according to the image information display apparatus according to an exemplary embodiment of the present invention.
  • FIG. 6 is an exemplary view illustrating an example in which image information is displayed in the image information display apparatus according to an exemplary embodiment of the present invention.
  • FIG. 7 is an exemplary view showing another example in which image information is displayed in the image information display apparatus according to the embodiment of the present invention.
  • The image information display device described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and a slate PC.
  • FIG. 1 is a block diagram showing a block configuration of an image information display device according to an exemplary embodiment of the present invention.
  • An image information display apparatus 100 may include a display unit 130, a sensing unit 120, a memory 140, a controller 110, and the like.
  • The components shown in FIG. 1 are not essential to implementing an image information display apparatus according to an exemplary embodiment of the present invention, and thus the image information display apparatus described herein may have more components than those listed above.
  • the image information display apparatus according to an embodiment of the present invention may further include a sound output unit, a wireless communication unit, and the like. Alternatively, on the contrary, it may have fewer components than described above.
  • the display unit 130 among the components displays (outputs) information processed by the image information display apparatus 100.
  • The display unit 130 may display execution screen information of an application program driven by the image information display apparatus 100, or user interface (UI) and graphical user interface (GUI) information according to the execution screen information.
  • The display unit 130 may be configured as a stereoscopic display unit for displaying a stereoscopic image.
  • The stereoscopic display unit may employ a three-dimensional display scheme such as a stereoscopic scheme (glasses type), an autostereoscopic scheme (glasses-free type), or a projection scheme (holographic type).
  • The display unit 130 may be provided in plural. For example, first and second display units 132 and 134 corresponding to the user's two eyes may be implemented.
  • Under the control of the controller 110, the display unit 130 may display the same image information on both the first and second display units 132 and 134, or different image information on each. Alternatively, the controller 110 may display the image information on only one of the two display units.
  • The sensing unit 120 may include at least one sensor for sensing environment information around the image information display apparatus 100 or a user input.
  • The sensing unit 120 may include a position sensor, and the position sensor may detect the current position of the image information display apparatus 100.
  • Representative examples of such position sensing sensors include a Global Positioning System (GPS) module or a Wireless Fidelity (WiFi) module.
  • For example, the image information display apparatus 100 may acquire its position using a signal transmitted from a GPS satellite.
  • As another example, the image information display apparatus 100 may obtain its position based on information from the Wi-Fi module and the wireless access point (AP) that transmits wireless signals to or receives them from the Wi-Fi module.
  • the location information sensor is not limited to a component that directly calculates or acquires the location of the image information display apparatus 100.
  • The sensing unit 120 may include an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, and an optical sensor (for example, a camera).
  • the image information display apparatus 100 disclosed in the present disclosure may utilize a combination of information detected by at least two or more of these sensors.
  • the memory 140 may store a program for the operation of the controller 110, and may temporarily store input / output data (eg, a still image, a video, etc.).
  • the memory 140 may store data relating to sound of various patterns output from the image information display apparatus 100.
  • the memory 140 may store a plurality of application programs or applications that are driven by the image information display apparatus 100, data for operating the image information display apparatus 100, and instructions. At least some of these applications may be downloaded from an external server via wireless communication.
  • At least some of these application programs may exist on the image information display apparatus 100 from the time of shipment for its basic functions (for example, functions of adjusting the size, resolution, or focal length of image information such as a moving image or a still image).
  • the application program is stored in the memory 140, installed on the image information display apparatus 100, and driven to perform an operation (or function) of the image information display apparatus 100 by the controller 110. Can be.
  • the memory 140 may include a storage medium of a type such as a flash memory type, a hard disk type, a card type memory (eg, SD or XD memory, etc.).
  • The image information display apparatus 100 may also operate in connection with a web storage that performs the storage function of the memory 140 over the Internet.
  • In addition to operations related to application programs, the controller 110 typically controls the overall operation of the image information display apparatus 100.
  • The controller 110 may process signals, data, information, and the like that are input to or output from the above-described components, or may drive an application program stored in the memory 140, so as to display image information on at least one of the display units 132 and 134 and to provide the user with functions appropriate to the displayed image information.
  • the controller 110 may display image information corresponding to a function selected by a user on at least one of the display units 132 and 134.
  • The controller 110 may display the image information on the selected function on both of the display units 132 and 134 or on only one of them, based on the function selected by the user.
  • The controller 110 may select the display unit on which to display the image information according to the type of the function selected by the user. That is, when the selected function is preset to be displayed on both display units 132 and 134, the controller 110 may output the image information for that function to both display units. For example, when the selected function plays multimedia content such as a movie or a drama, the controller 110 may output the image information for the multimedia content to both display units 132 and 134. However, if the selected function does not play multimedia content, the controller 110 may display the image information corresponding to it on only one of the display units 132 and 134.
  • the multimedia content may be preset by the user or may be preset when the image information display apparatus 100 according to the embodiment of the present invention is shipped.
  • content that a user can comfortably enjoy for interest such as a movie or a game, may be preset as the multimedia content.
  • The controller 110 may evaluate the importance of the image information to be displayed so that it is displayed on a specific one of the display units 132 and 134. For example, if the function to be displayed is not multimedia playback, the controller 110 may determine whether the user must confirm the image information and, according to the result, display the image information related to the selected function on a particular display unit.
  • To determine whether the image information should be checked by the user, the controller 110 may use the currently selected function and the result of sensing the surroundings of the image information display apparatus 100. That is, if the sensing result indicates that the user is moving, the controller 110 may classify a function that displays the user's current location, or a function related to it, as information the user must recognize. In that case, the controller 110 may cause such image information to be displayed on a specific one of the display units 132 and 134.
  • In this case, the controller 110 may display the image information determined to require the user's recognition on the display unit corresponding to the user's dominant eye. This is because, when the image information display apparatus 100 according to the exemplary embodiment of the present invention has a glass shape, the user recognizes information displayed on the display unit at the position corresponding to the dominant eye more quickly and accurately.
  • Here, the term "dominant eye" refers to the one of the user's two eyes that he or she uses more, consciously or unconsciously, than the other. Accordingly, visual information recognized through the dominant eye can be recognized faster and more accurately than visual information recognized through the non-dominant eye.
  • Eye dominance may differ from person to person, and the controller 110 may receive a test result obtained according to a preset method, or use the result of detecting the user's dominant eye with a test application stored in the memory 140.
  • The detected dominant-eye information may be stored in the memory 140 and provided to the controller 110 when the controller 110 requests it.
  • The controller 110 may select the display unit corresponding to the user's dominant eye using the dominant-eye information stored in the memory 140. Accordingly, the controller 110 may display the image information determined to require the user's recognition on the display unit corresponding to the user's dominant eye.
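  • Combining this with the earlier multimedia rule, the display selection might be sketched as follows. The function names and the single-display fallback for other functions are assumptions, since the text leaves that case unspecified:

```python
def select_displays(function_name, must_recognize, dominant_eye="right"):
    # Multimedia playback is shown on both glass display units; image
    # information the user must recognize is routed only to the display
    # unit on the dominant-eye side. The fallback single display for
    # other functions is an assumption.
    if function_name == "multimedia_playback":
        return ("left", "right")
    if must_recognize:
        return (dominant_eye,)
    return ("left",)
```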
  • The controller 110 may also determine the image information the user must recognize based on the function currently executed in the image information display apparatus 100. For example, if a function executed by the user's selection, or as a result of sensing the surroundings, is one of the preset functions, the controller 110 may treat the image information related to that function as information the user must recognize and display it on the display unit corresponding to the user's dominant eye.
  • the preset functions may be, for example, a function that outputs information that a user must recognize.
  • Such functions provide information about the user's current location or other information related to it (for example, navigation-related functions such as displaying a route or features around the user's location).
  • Alternatively, such a function may recognize an object approaching within a certain distance and inform the user of it. These functions may be selected in advance by the user, or preset at the factory when the image information display apparatus 100 according to an exemplary embodiment of the present invention is shipped.
  • Even when a preset function is executed, the controller 110 may instead display the image information on the display unit corresponding to the user's non-dominant eye, based on the result of sensing the user's surroundings. For example, when the navigation function is executed and the sensing result shows that the user's position moves while the user's own movement is slight, the controller 110 may determine that the user is moving by means of a vehicle. In this case, although the navigation function is among the preset functions, the controller 110 may not display the image information related to it on the display unit corresponding to the user's dominant eye.
  • As another example, when the controller 110 senses the user's surroundings and the illuminance is lower than a predetermined level (that is, in a dark area), the notification information may be displayed on the display unit corresponding to the user's non-dominant eye.
  • the sound output unit may output audio data related to image information currently displayed on the display units 132 and 134 under the control of the controller 110.
  • The sound output unit may include a connection port to which a sound output device, for example an earphone, can be connected, and sound data transmitted through the connection port may be output through the connected device at a volume set under the control of the controller 110.
  • the image information display apparatus 100 may further include a communication module capable of performing a communication function with another external device or a preset external server.
  • The communication module may use technologies such as Bluetooth, Wi-Fi (Wireless Fidelity), infrared communication, and near field communication (NFC) to connect the image information display apparatus 100 with a wireless communication system or another device.
  • Here, the short-range wireless communication network may be a short-range wireless personal area network.
  • The other device may be a wearable device capable of exchanging (or interworking) data with the image information display apparatus 100 according to the present invention, for example, a smartwatch, smart glasses, or a head mounted display (HMD).
  • At least some of the components may operate in cooperation with each other to implement an operation, control, or control method of the image information display apparatus 100 according to various embodiments described below.
  • the operation, control, or control method of the image information display apparatus 100 may be implemented by driving at least one application program stored in the memory 140.
  • the following various embodiments may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.
  • FIG. 2 is an exemplary diagram illustrating an example in which an image information display apparatus according to an exemplary embodiment of the present invention is implemented in a glass type.
  • The image information providing apparatus 200 may be manufactured in the form of glasses as shown in FIG. 2 and configured to be worn on the head of a human body, and may be provided with a frame portion (case, housing, etc.) for this purpose. The frame portion 201 may be formed of a flexible material to facilitate wearing.
  • the frame part 201 is supported by the head and provides a space in which various components are mounted. As illustrated, electronic components such as the control module 280, the sound output module 252, and the like may be mounted in the frame unit.
  • the frame unit 201 may include a plurality of display units 251a and 251b disposed at positions corresponding to the left eye and the right eye, respectively.
  • the display units 251a and 251b have light transmittance and may output image information.
  • the image information refers to a virtual graphic object generated by the glass type image information providing apparatus 200 or received from an external device.
  • Here, the virtual object may mean, for example, an icon corresponding to an application, content, or a UI in a call mode, and may be generated by the controller 110 or received from an external terminal such as a smart phone.
  • Since the display units 251a and 251b have light transmittance, the user may see the external environment through them.
  • the display units 251a and 251b may display an external environment and output information on any external object constituting the external environment.
  • The external object may be a business card, a person, an external device capable of communicating with the apparatus, or a building that is a landmark of a specific region.
  • the control module 280 is configured to control various electronic components included in the glass type image information providing apparatus 200.
  • the control module 280 may be understood as a configuration corresponding to the controller 110 described above.
  • the control module 280 is illustrated to be installed in the frame portion 201 on the head.
  • the position of the control module 280 is not limited thereto.
  • the display units 251a and 251b may be implemented in the form of a head mounted display (HMD).
  • The HMD type is a display method in which the display is mounted on the head and shows an image directly in front of the user's eyes. Accordingly, when the user wears the glass-type image information providing apparatus 200, an image may be provided directly in front of the user's eyes through at least one of the display units 251a and 251b.
  • the display units 251a and 251b may project an image to the eyes of a user using a prism.
  • The prism may be formed translucent so that the user can see both the projected image and the general field of view (the range the user sees through the eyes).
  • the glass-type image information providing apparatus 200 may provide an augmented reality (AR) that displays a virtual image by superimposing a virtual image on a real image or a background using the characteristics of the display.
  • the camera 221 is disposed adjacent to at least one of the left eye and the right eye, and is formed to capture an image of the front. Since the camera 221 is located adjacent to the eye, the camera 221 may acquire a scene viewed by the user as an image.
  • the camera 221 may be installed in the frame unit 201, and may be provided in plural to acquire a stereoscopic image.
  • the glass type image information providing apparatus 200 may include a user input unit 223 manipulated to receive a control command.
  • The user input unit 223 may employ any tactile manner in which the user operates it with a tactile feeling, such as a touch or a push.
  • the glass type image information providing apparatus 200 may include a microphone (not shown) that receives sound and processes the sound into electrical voice data, and a sound output module 252 that outputs sound.
  • the sound output module 252 may be configured to transmit sound in a general sound output method or a bone conduction method.
  • When the sound output module 252 is implemented in a bone conduction manner, wearing the glass-type image information providing apparatus 200 brings the sound output module 252 into close contact with the head, so that it vibrates the skull to transmit sound.
  • the glass-type image information providing apparatus 200 may receive a voice signal of the user through the microphone and analyze the same, thereby performing a function according to the voice signal of the user.
  • the image information providing apparatus 200 has a glass type, that is, a form of glasses as described above.
  • FIG. 3 is a flowchart illustrating an operation of displaying image information based on a currently performed function and a sensed surrounding situation by an image information display apparatus according to an exemplary embodiment.
  • the controller 110 of the image information display apparatus 100 may display image information according to a function selected by a user on at least one of the display units 132 and 134.
  • the controller 110 may display image information related to the multimedia content selected by the user on at least one of the display units 132 and 134 in operation S300. .
  • The controller 110 may detect whether any one of the preset functions is executed in the image information display apparatus 100 (S302). For example, in step S302, the controller 110 may detect that the user executes a function different from the one currently performed, or that a specific event (for example, reception of a message or a preset change in the surrounding environment) occurs and a function corresponding to that event is executed.
  • In this case, the controller 110 may determine whether the executed function is one of functions preset as functions that display image information the user must recognize. Based on the determination result of step S302, image information related to the executed function may be displayed on the display unit corresponding to the user's dominant eye among the display units 132 and 134 (S304).
  • Further, the controller 110 may determine the position of the image information displayed on the display unit corresponding to the dominant eye based on the result of detecting the situation around the user. That is, the controller 110 may cause the image information to be displayed in an area other than the display area in which preset objects, among the objects visible through the display unit corresponding to the dominant eye, are identifiably shown.
  • The preset object may be determined in various ways, according to a user's selection or as set in advance.
  • For example, the preset object may be an object such as a traffic light or a sign, among the objects visible through the display unit corresponding to the dominant eye.
  • the controller 110 may recognize these objects by using the result detected by the detector 120.
  • the detector 120 may recognize an object corresponding to a traffic light or a sign from an image input through a camera.
  • the controller 110 may recognize this as a sign or the like.
  • For example, the controller 110 may recognize an object displaying a specific symbol, such as a pedestrian marking or a stop marking, as a sign, or recognize an object in which red, green, or yellow lights are arranged in a line as a traffic light.
  • In this case, the controller 110 may identify the positions at which objects corresponding to a traffic light or a sign appear on the display unit corresponding to the dominant eye, and may display the image information in an area other than those positions. Accordingly, in the present invention, it is possible to prevent a situation in which the displayed image information keeps the user from seeing important information such as a sign or a traffic light.
  • Meanwhile, the controller 110 may recognize the various objects detected by the detection unit 120 using a database for recognizing the shapes of objects.
  • The objects may be preset by the user, or may be set in advance by the manufacturer of the image information display apparatus 100 according to an exemplary embodiment of the present disclosure.
  • the controller 110 may cause the image information to be output to an area other than the area where the objects are displayed.
  • In addition, if the area available for displaying the image information is insufficient because of the area occupied by the objects, the controller 110 may reduce the size of the region in which the image information is displayed, so that the image information is displayed at a smaller size.
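  • The placement behavior described above can be sketched as a small search over the see-through display area: candidate positions that overlap any recognized object (such as a traffic light or sign) are rejected, and the overlay is shrunk when no free position remains. This is an illustrative sketch only, not the patented implementation; all names, grid steps, and sizes are assumptions.

```python
def place_overlay(display_w, display_h, overlay_w, overlay_h, obstacles):
    """Find (x, y, w, h) for an overlay that avoids obstacle rectangles
    (x, y, w, h); shrink the overlay when no free position exists.
    Purely illustrative of the placement idea described above."""
    def overlaps(x, y, w, h):
        return any(x < ox + ow and ox < x + w and y < oy + oh and oy < y + h
                   for ox, oy, ow, oh in obstacles)

    w, h = overlay_w, overlay_h
    while w >= 8 and h >= 8:                 # assumed minimum legible size
        step = 16                            # coarse scan grid (assumption)
        for y in range(0, display_h - h + 1, step):
            for x in range(0, display_w - w + 1, step):
                if not overlaps(x, y, w, h):
                    return (x, y, w, h)
        w, h = w * 3 // 4, h * 3 // 4        # shrink and retry
    return None                              # no safe placement found

# Example: a traffic light occupies the top-left of a 640x480 view.
slot = place_overlay(640, 480, 200, 100, [(0, 0, 300, 200)])
```

With an empty obstacle list the overlay simply lands at the origin at full size; with the traffic-light rectangle above, the returned slot lies entirely outside it.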
  • In step S304, the controller 110 may evaluate, based on a preset criterion, the importance of the image information related to the executed function or to the currently generated event, and the related image information may be displayed on the display unit corresponding to the dominant eye based on the evaluation result. The operation of the controller 110 evaluating the importance of image information related to the currently generated event will be described in detail with reference to FIG. 4.
  • Further, in step S304, even if any one of the preset functions is executed, the controller 110 of the image information display apparatus 100 according to an embodiment of the present invention may determine the user's situation based on the surroundings of the image information display apparatus 100, and display the image information on a specific display unit according to the determined situation.
  • For example, the controller 110 may further detect the surrounding situation and determine, based on it, whether the user is moving or stopped. If the user is moving, the controller 110 may further determine whether the user is moving in a vehicle, or is running or walking.
  • In this case, image information related to the currently executed function may be displayed on the display unit corresponding to the user's dominant eye.
  • an operation process of detecting the surrounding situation and displaying image information on a specific display unit will be described in more detail with reference to FIG. 5.
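  • The S300-S304 decision described above can be condensed into a small routing rule: a function that belongs to the preset set, and whose surrounding-situation condition is met, is routed to the dominant-eye display; anything else may appear on either display unit. The sketch below is illustrative only; all names and return values are assumptions, not terms from the patent.

```python
def select_display(executed_function, preset_functions, dominant_eye,
                   condition_met=True):
    """Route a function to a display: preset functions whose condition is
    satisfied go to the dominant-eye unit; others go to either unit.
    Illustrative sketch of the S300-S304 flow (names are invented)."""
    if executed_function in preset_functions and condition_met:
        return dominant_eye      # e.g. 'left' → display unit 132
    return 'either'              # user-selected display unit(s)
```

For example, `select_display('navigation', {'navigation', 'message'}, 'left')` returns `'left'`, while a non-preset function, or a preset function whose condition fails, returns `'either'`.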
  • FIG. 4 is a flowchart illustrating an operation of evaluating the importance of a generated event and displaying image information accordingly, by the image information display apparatus according to an exemplary embodiment of the present invention.
  • In the description of FIG. 3 above, it was explained that the controller 110 of the image information display apparatus 100 may display image information on the display unit corresponding to the user's dominant eye not only when any one of the functions preset in step S302 is performed, but also, when an event occurs, based on the generated event.
  • FIG. 4 illustrates an operation process of evaluating the importance of an event generated by the controller 110 of the image information display apparatus 100 according to an exemplary embodiment of the present disclosure.
  • When such an event occurs, the controller 110 may detect this (S400).
  • The event detected in step S400 may be of various kinds. For example, such an event may include reception of a call, reception of a message related to a short messaging service (SMS) or a social network service (SNS), generation of alarm information indicating that a predetermined time or a specific location has been reached, a low battery level, or generation of notification information for a predetermined schedule.
  • The controller 110 may detect the occurrence of such an event in step S400, and may then determine whether the function executed by the currently generated event is a preset function (S402). For example, when an event such as an incoming call or a received message occurs, the controller 110 may determine that a call function, or an SMS or SNS function, corresponds to the generated event. The controller 110 may determine whether the function corresponding to the generated event is a preset function; if it is not, the controller 110 may display image information related to the event on at least one of the display units 132 and 134 (S410).
  • On the other hand, if it is determined in step S402 that the function executed by the currently generated event corresponds to one of the preset functions, the controller 110 proceeds to step S304 of FIG. 3, and the image information related to the currently generated event may be displayed on the display unit corresponding to the user's dominant eye among the display units 132 and 134.
  • However, the controller 110 may further evaluate the importance of the currently occurring event, and only when the evaluated importance is equal to or greater than a preset level may the image information related to the event be displayed on the specific display unit, that is, the display unit corresponding to the user's dominant eye (S406).
  • These criteria may be set differently for each type of event. For example, when the currently occurring event is reception of a call, the telephone number of the incoming call may be the criterion for evaluating the event's importance; when the event is reception of a message, the content of the message may be the criterion.
  • If the evaluated importance is equal to or greater than the preset level, the process may proceed to step S304 of FIG. 3, and the image information may be displayed on the display unit corresponding to the user's dominant eye.
  • For example, the controller 110 may evaluate the importance of an incoming call as higher than the preset level when the call is from a predetermined telephone number (for example, a telephone number stored in advance). In this case, in step S304, the controller 110 may display the image information related to the incoming call on the display unit corresponding to the user's dominant eye only for calls from telephone numbers stored in advance.
  • Alternatively, the controller 110 may evaluate the importance of a received message according to whether preset words are included in its content. That is, for example, when a message includes words such as 'important', 'urgent', 'must', or 'contact immediately', the controller 110 may determine that it is a message of high importance, and may display it on the display unit corresponding to the user's dominant eye in step S304.
  • Here, the words for evaluating the importance of the message may be set by the user in advance, and different importance scores may be matched to different words.
  • In this case, in step S406, the controller 110 searches the message for the preset words, sums the importance scores corresponding to the words found, and determines, based on the summed score, whether the importance of the message is equal to or greater than the predetermined level.
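  • As a rough sketch of this word-score summation: each preset word carries a score, the scores of words found in the message are summed, and the total is compared against the preset level. The word table, scores, and threshold below are invented for illustration; in the apparatus they would be configured by the user.

```python
# Hypothetical per-word importance scores (invented for this sketch);
# in the apparatus these would be preset by the user.
WORD_SCORES = {'urgent': 3, 'important': 2, 'immediately': 2, 'contact': 1}

def message_importance(text, word_scores=WORD_SCORES):
    """Sum the scores of preset words found in the message text."""
    lowered = text.lower()
    return sum(score for word, score in word_scores.items() if word in lowered)

def goes_to_dominant_eye(text, threshold=3):
    """True when the summed score reaches the preset level (step S406)."""
    return message_importance(text) >= threshold
```

For example, "Urgent: contact me immediately" scores 3 + 1 + 2 = 6 and would be routed to the dominant-eye display, while a casual message scoring 0 would not.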
  • If so, the controller 110 proceeds to step S304 of FIG. 3 and may display image information indicating the content of the received message on the display unit corresponding to the user's dominant eye among the display units 132 and 134.
  • Meanwhile, it was mentioned that in step S304 the user's state may be determined by further considering the result of sensing the surroundings of the image information display apparatus 100, and that image information related to the currently executed function or event may be displayed on a specific display unit.
  • This surrounding situation can be various.
  • the current position and movement state may be such a surrounding situation.
  • the degree of movement of the user's body as a result of detecting the movement of the user may be the surrounding situation of the image information display apparatus 100.
  • the surrounding illuminance may be the surrounding situation of the image information display apparatus 100.
  • the controller 110 may detect whether a specific condition is satisfied according to the function.
  • Here, the specific condition may be a specific situation around the image information display apparatus 100 that corresponds to the function (for example, the user's position movement or the surrounding illuminance), or an importance determined according to a preset criterion (for example, the importance evaluated from the generated event).
  • image information related to the executed function may be displayed on a specific display unit.
  • For example, when the currently executed function is a navigation function, the controller 110 may determine, based on whether the current position is moving and whether the user is in a vehicle, whether the user is moving while riding in a vehicle, or is walking or running.
  • In this case, the controller 110 may display image information related to the navigation function on the display unit corresponding to the user's dominant eye based on the determined result.
  • Alternatively, when the currently executed function is a function of displaying a notification to the user when there is an object approaching within a preset distance, the controller 110 may display the notification information on the display unit corresponding to the user's dominant eye, based on the brightness (illuminance) around the image information display apparatus 100 and the current location of the user.
  • FIG. 5 illustrates an operation process in which the image information display device according to an exemplary embodiment of the present invention detects a surrounding situation and displays image information accordingly.
  • In the following description, it is assumed that the currently executed function is a navigation function, and that the navigation function is preset in advance by the user as a function that outputs information the user must recognize.
  • When the navigation function is executed, the controller 110 may detect whether the user's location is moving (S500). This is because, unlike the case where the user simply searches for navigation information of a specific region, when the user's location is moving it may be determined that the user is actually moving toward a specific destination, so the navigation information should be provided more quickly and accurately. Accordingly, when the navigation function is executed, the controller 110 may determine the user's situation according to the detection result of the detection unit 120, and may determine the display unit on which the image information related to the navigation function is displayed according to the determined result. For example, according to the determination result, the controller 110 may display the information on at least one of the first display unit 132 and the second display unit 134, or designate the one display unit corresponding to the user's dominant eye.
  • Such a position movement may be detected through a location information module such as a GPS, or may be detected by a sensor provided in the image information display apparatus 100.
  • For example, the controller 110 may determine that the user is moving when the image received from the optical sensor, that is, the camera 221, changes by more than a predetermined level within a predetermined time.
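  • One simple way to decide that the camera image "changes by more than a predetermined level" is to count the pixels whose intensity differs between two frames and compare the changed fraction against a threshold. The sketch below assumes grayscale frames given as flat lists; the thresholds are invented for illustration, not values from the patent.

```python
def view_changed(prev_frame, curr_frame, pixel_delta=30, changed_ratio=0.2):
    """Return True when enough pixels differ between two same-size
    grayscale frames (flat lists of 0-255 intensities).
    Thresholds are illustrative assumptions."""
    if len(prev_frame) != len(curr_frame) or not prev_frame:
        raise ValueError("frames must be the same non-zero size")
    changed = sum(1 for a, b in zip(prev_frame, curr_frame)
                  if abs(a - b) > pixel_delta)
    return changed / len(prev_frame) >= changed_ratio
```

A static scene yields a changed fraction near zero, while a scene where half the pixels shift intensity crosses the 20% ratio and is treated as movement.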
  • As a result of the detection in step S500, if the user's location is not moving, the controller 110 may determine that the navigation information does not need to be displayed on the display unit corresponding to the user's dominant eye. Accordingly, when the user's position is in the stopped state, the controller 110 may display image information related to the navigation function on at least one of the display units 132 and 134 (S508).
  • For example, the controller 110 may display image information related to the navigation function on either the display unit corresponding to the dominant eye or the display unit corresponding to the non-dominant eye, according to the user's selection. This is because, in this case, the user does not need to be provided with the navigation information more quickly and accurately as described above, so the information is displayed on whichever display unit the user prefers.
  • If the user's location is moving, the controller 110 may display image information related to the navigation function on the display unit corresponding to the user's dominant eye. Alternatively, even if the user's location is moving, the controller 110 may further detect whether the user is moving by means of a vehicle.
  • In other words, the controller 110 may determine whether the user is using a vehicle based on the user's movement. To this end, when the user's location is moving, the controller 110 may detect whether there is movement of a predetermined level or more from the user (S502). That is, when the user walks or runs, the controller 110 may detect the user's movement based on the detection result of an optical sensor (for example, an internal camera), an inertial sensor, or a gravity sensor provided in the detection unit 120, and based on the detected result it may be determined whether the user is actually walking or running directly toward a specific destination.
  • If movement of the predetermined level or more is detected from the user, the controller 110 may output the navigation information on the display unit corresponding to the user's dominant eye, so that the user can recognize the navigation information more quickly and accurately.
  • Here, the navigation information may mean image information of the navigation function itself, or of a function related to the navigation function, for example a function of displaying information about features around the user's location.
  • However, if no movement of the predetermined level or more is detected even though the user's location is moving, the controller 110 may determine that the user is moving to a specific destination using a vehicle. In this case, the controller 110 may determine that the user is using public transportation such as a bus or subway, and thus does not need to receive the navigation information more quickly and accurately. The controller 110 may then proceed to step S508 and display the navigation information on at least one of the display units 132 and 134 according to the user's selection.
  • Meanwhile, the controller 110 may of course display the image information related to the navigation function on the display unit corresponding to the user's dominant eye using only one of the user's location movement detection result (step S500) or the user's motion detection result (step S502).
  • In this case, the image information related to the navigation function may be displayed on the display unit corresponding to the user's dominant eye.
  • Alternatively, the controller 110 may determine whether the user is riding in a preset vehicle. When it is determined that the user is riding in a vehicle preset by the user, for example a vehicle that the user directly drives, the controller 110 may display the navigation information on the display unit corresponding to the user's dominant eye even if the user's movement is less than the predetermined level.
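  • The S500-S508 branch structure can be summarized as a three-way decision. In the sketch below, 'dominant' stands for the display unit corresponding to the dominant eye, and 'user_choice' for the user-selected display of step S508; the function and flag names are invented for illustration.

```python
def navigation_target(location_moving, strong_body_motion, preset_vehicle=False):
    """Decide where navigation information is displayed, following the
    FIG. 5 flow (illustrative sketch; names are assumptions)."""
    if not location_moving:
        return 'user_choice'          # stationary user → S508
    if strong_body_motion:
        return 'dominant'             # walking/running toward a destination
    # location moves while the body is calm → riding a vehicle;
    # route to the dominant eye only for a preset (user-driven) vehicle
    return 'dominant' if preset_vehicle else 'user_choice'
```

A pedestrian heading somewhere gets the dominant-eye display; a bus passenger gets the user-selected display; a preset (driver's) vehicle overrides the low-motion case.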
  • FIG. 6 is an exemplary view illustrating an example in which image information is displayed in the image information display apparatus according to an exemplary embodiment of the present invention.
  • In the following description, it is assumed that the preset functions include a function of displaying the content of a received message and a navigation function, and that the left eye of the user is the user's dominant eye.
  • FIG. 6 is an exemplary diagram illustrating a case in which image information related to a function selected by the user, or to an event currently occurring, is displayed in the image information display apparatus 100 according to an exemplary embodiment of the present invention.
  • the controller 110 may display image information related to the function selected by the user and the generated event on at least one of the display units 132 and 134.
  • FIG. 6A illustrates image information related to a function selected by a user
  • FIG. 6B illustrates image information related to a generated event.
  • When a function preset in advance, such as the navigation function, is executed by the user's selection, the controller 110 of the image information display apparatus 100 may further detect the surrounding situation.
  • In this case, as shown in FIG. 6A, the controller 110 may display the image information 600 related to the navigation function on the display unit 132 corresponding to the user's dominant eye, that is, the left eye.
  • Meanwhile, when an event occurs, the controller 110 may require that a specific condition related to the corresponding function be met in order to display the image information related to the event on the display unit 132 corresponding to the user's dominant eye, that is, the left eye.
  • For example, even though the function of displaying the content of a received message is one of the preset functions, the controller 110 may evaluate the importance of the message based on the content of the received message. Then, based on the evaluated importance, it may be determined whether to display the message on the display unit 132 corresponding to the user's dominant eye.
  • That is, if the text of the received message does not include words preset as being of high importance, such as 'immediately', 'contact', 'urgent', or 'important', the controller 110 may not display it on the display unit 132 corresponding to the user's dominant eye, so that the content of the received message 610 is displayed on the display unit 134 corresponding to the user's right eye, as shown in FIG. 6B.
  • FIG. 7 illustrates an example in which image information is displayed based on a result of sensing a user's state even when the same function is performed in the image information display apparatus according to an exemplary embodiment of the present invention.
  • In the following description, it is assumed that a navigation function is included as a preset function, and that the left eye of the user is the user's dominant eye.
  • For example, the controller 110 of the image information display apparatus 100 may display the image information related to the navigation function on the display unit 132 corresponding to the user's dominant eye, based on the user's position movement and the user's movement state (for example, a state of moving on foot).
  • On the other hand, the controller 110 may determine whether to display the image information related to the navigation function on the display unit 132 corresponding to the user's dominant eye based on the result of detecting the user's movement. That is, even when the navigation function is executed by the user and the user's position is moving, if the user is riding in a vehicle as shown in FIG. 7A, the image information related to the navigation function may be displayed on either one of the display units 132 and 134 according to the user's selection.
  • For example, the image information related to the navigation function may be displayed on the display unit 134 corresponding to the user's non-dominant eye, that is, the right eye, as shown in FIG. 7B.
  • This is because, when the user is riding in a vehicle, the controller 110 may determine that the user does not need to receive the navigation information more quickly and accurately.
  • However, even when the user is riding in a vehicle, if the user is driving the vehicle, the controller 110 may display the image information related to the navigation function on the display unit 132 corresponding to the user's dominant eye, that is, the left eye.
  • For example, the controller 110 may determine that the user is driving the vehicle when the vehicle is one in which the user is preset as the driver, or when it is determined that the user is seated in the driver's seat based on a camera inside the vehicle.
  • In this case, the controller 110 may cooperate with a control unit (not shown) of the vehicle to detect the position of the driver's seat, and may determine whether the user is seated in the driver's seat from the position of the main body of the image information display apparatus 100 and the position of the driver's seat. Alternatively, the controller 110 may detect the user's movement from images input at predetermined time intervals through a camera inside the vehicle, and determine whether the user is the driver of the vehicle based on the detected movement. If the user wearing the image information display apparatus 100 is the driver of the vehicle, the controller 110 may display image information related to the navigation function on the display unit 132 corresponding to the user's dominant eye, that is, the left eye. FIG. 7C illustrates an example in which, in this case, the image information 700 related to the navigation function is displayed on the display unit 132 corresponding to the user's dominant eye, that is, the left eye.
  • In the above description, the navigation function and the event of receiving a message have been mainly described as examples, but the present invention is not limited thereto. That is, the present invention can also be used to notify the user when there is an object close to the user.
  • For example, when the ambient illuminance is below a certain level (for example, at a late hour), or when the user passes through a predetermined area (for example, an area the user passes on foot when returning home), if there is an object approaching the user, the controller 110 may display notification information on the display unit corresponding to the user's dominant eye so that the user can recognize the object more quickly.
  • In the above description, specific functions capable of displaying image information on the display unit corresponding to the user's dominant eye have been described as examples.
  • Further, these functions may be added or excluded according to the user's selection.
  • In addition, the location movement state and the user's movement state have been described as examples of the detected surrounding situation; however, these are surrounding situations related only to the navigation function, and not all functions detect them. That is, different surrounding situations may be detected depending on the set functions, and accordingly it may be determined whether image information related to the function should be displayed on the display unit corresponding to the user's dominant eye.
  • In the above description, whether image information is displayed on the display unit corresponding to the user's dominant eye is determined based on the function selected by the user or the function executed by a generated event; however, it may of course also be determined according to the displayed content. That is, for example, if the multimedia content selected by the user is content the user can become immersed in, such as a movie or a game (hereinafter referred to as immersive content), the controller 110 may display the immersive content on all or at least one of the display units 132 and 134. However, if the content selected by the user is not immersive content, for example a schedule notification or local information (hereinafter referred to as non-immersive content), the controller 110 may display it on the display unit corresponding to the user's dominant eye.
  • the immersive content or the non-immersive content may be preset by the user or may be preset when the image information display apparatus 100 according to the embodiment of the present invention is shipped.
  • For example, multimedia content such as a movie or a game may be set as immersive content, and image information related to other content may be set as non-immersive content.
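  • The content-based routing described above amounts to a small lookup: immersive content may occupy both display units, while non-immersive content is routed to the dominant eye. The category set and names below are illustrative assumptions, not terms from the patent.

```python
IMMERSIVE_TYPES = {'movie', 'game'}   # example presets; user-configurable

def displays_for_content(content_type, dominant='left'):
    """Immersive content may occupy both display units; non-immersive
    content (schedule notices, local information) uses the dominant eye.
    Illustrative sketch of the content-based routing described above."""
    if content_type in IMMERSIVE_TYPES:
        return ('left', 'right')
    return (dominant,)
```

For instance, a movie is shown on both units, while a schedule notification appears only on whichever unit corresponds to the dominant eye.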
  • Meanwhile, when a preset event occurs, the controller 110 of the image information display apparatus 100 according to an embodiment of the present invention may display notification information related to the event on the display unit corresponding to the user's dominant eye. For example, as described above, the controller 110 may display notification information when a message is received, when a call is received, or when an event such as a specific object approaching within a predetermined distance occurs.
  • In this case, the notification information may be displayed on the display unit corresponding to the user's dominant eye, and a sound signal may further be output together with, or prior to, the notification information.
  • the preset event may be preset by the user, or may be preset from the time of shipment of the image information display apparatus 100 according to an exemplary embodiment of the present invention.
  • In the above description, the case in which the controller 110 displays image information related to a function on the display unit corresponding to the user's dominant eye, based on the result of determining the user's situation, has been described as an example.
  • the present invention is not limited thereto. That is, in addition to the above-described navigation function or a function related thereto, various other functions may be selected depending on the user's selection.
  • a message function or the like may be selected as the function.
  • In this case, the controller 110 may display image information associated with an application for creating a message, or the content of a received message, on the display unit corresponding to the user's dominant eye, based on the result of determining the user's current situation at the time the message is received or sent.
  • For example, the content 610 of the received message may be displayed on the display unit 132 corresponding to the user's dominant eye.
  • the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
  • the computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and the like. This also includes implementations in the form of carrier waves (eg, transmission over the Internet).
  • the computer may include the controller 180 of the terminal.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention relates to a glass-type image information display device capable of displaying image information, and a control method therefor, the device comprising: a main body including a plurality of glass display units respectively corresponding to a user's two eyes; a detection unit for detecting a situation surrounding the main body; a memory containing information on the user's dominant eye and non-dominant eye; and a control unit for displaying image information related to multimedia content, and image information related to at least one of the functions executable in the image information display device, on at least one of the plurality of display units. If a predetermined function is executed, the control unit displays image information related to the function on the display unit corresponding to the user's dominant eye according to the result of detecting the situation surrounding the main body, the predetermined function including a navigation function and a function related to the navigation function.
PCT/KR2014/009687 2014-10-15 2014-10-15 Dispositif d'affichage d'informations d'image et son procédé de commande WO2016060293A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2014/009687 WO2016060293A1 (fr) 2014-10-15 2014-10-15 Dispositif d'affichage d'informations d'image et son procédé de commande

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2014/009687 WO2016060293A1 (fr) 2014-10-15 2014-10-15 Dispositif d'affichage d'informations d'image et son procédé de commande

Publications (1)

Publication Number Publication Date
WO2016060293A1 true WO2016060293A1 (fr) 2016-04-21

Family

ID=55746822

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/009687 WO2016060293A1 (fr) 2014-10-15 2014-10-15 Dispositif d'affichage d'informations d'image et son procédé de commande

Country Status (1)

Country Link
WO (1) WO2016060293A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100076039 * 2007-11-22 2010-07-05 Kabushiki Kaisha Toshiba Display device, display method, and head-up display
US20120289217A1 * 2005-09-26 2012-11-15 Zoomsafer Inc. Safety features for portable electronic device
KR20130034125 * 2011-09-28 2013-04-05 Song Young-il Glasses-type monitor with an augmented reality function
US20130336629A1 * 2012-06-19 2013-12-19 Qualcomm Incorporated Reactive user interface for head-mounted display
KR20140066258 * 2011-09-26 2014-05-30 Microsoft Corporation Video display modification based on sensor input for a see-through near-eye display


Similar Documents

Publication Publication Date Title
RU2670784C2 (ru) Ориентация и визуализация виртуального объекта
WO2016021747A1 (fr) Visiocasque et son procédé de commande
WO2017119737A1 (fr) Procédé et dispositif de partage d'informations d'image dans un système de communications
EP3019964A1 (fr) Dispositif mobile, afficheur de tête et procédé de commande associé
WO2015167080A1 (fr) Procédé et appareil de commande de véhicule aérien sans pilote
WO2014025108A1 (fr) Visiocasque pour ajuster une sortie audio et une sortie vidéo l'une par rapport à l'autre et son procédé de commande
US20120242510A1 (en) Communication connecting apparatus and method thereof
WO2020171513A1 (fr) Procédé et appareil permettant d'afficher des informations d'environnement à l'aide d'une réalité augmentée
WO2015046676A1 (fr) Visiocasque et procédé de commande de ce dernier
WO2015163536A1 (fr) Dispositif d'affichage et son procédé de commande
WO2020185029A1 (fr) Dispositif électronique et procédé d'affichage des informations de partage sur la base de la réalité augmentée
WO2017010632A1 (fr) Dispositif mobile et procédé de commande dudit dispositif
WO2015105236A1 (fr) Visiocasque et son procédé de commande
WO2017217713A1 (fr) Procédé et appareil pour fournir des services de réalité augmentée
WO2018056617A1 (fr) Dispositif vestimentaire et procédé de fourniture de gadget logiciel correspondant
WO2018131928A1 (fr) Appareil et procédé de fourniture d'interface utilisateur adaptative
KR20160097655A (ko) 이동 단말기 및 그 제어 방법
WO2015111805A1 (fr) Terminal vestimentaire et système le comprenant
WO2021230568A1 (fr) Dispositif électronique permettant de fournir un service de réalité augmentée et son procédé de fonctionnement
WO2018097483A1 (fr) Procédé de génération d'informations de mouvement et dispositif électronique le prenant en charge
US9747871B2 (en) Portable terminal device, program, device shake compensation method, and condition detection method
WO2012081787A1 (fr) Appareil de traitement d'images de terminal mobile et procédé associé
WO2020145653A1 (fr) Dispositif électronique et procédé pour recommander un emplacement de capture d'images
WO2016060293A1 (fr) Dispositif d'affichage d'informations d'image et son procédé de commande
WO2018186642A1 (fr) Dispositif électronique et procédé d'affichage d'images à l'écran pour dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14903976

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14903976

Country of ref document: EP

Kind code of ref document: A1