EP3370102B1 - HMD device and method for controlling the same - Google Patents

HMD device and method for controlling the same

Info

Publication number
EP3370102B1
EP3370102B1 (application EP16886584.8A)
Authority
EP
European Patent Office
Prior art keywords
display
user
processor
hmd device
movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP16886584.8A
Other languages
English (en)
French (fr)
Other versions
EP3370102A4 (de)
EP3370102A1 (de)
Inventor
Seong-won HAN
Woo-Jin Park
Dae-hyun Ban
Sangsoon LIM
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Publication of EP3370102A1
Publication of EP3370102A4
Application granted
Publication of EP3370102B1
Current legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/0179: Display position adjusting means not related to the information to be displayed
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37: Details of the operation on graphic patterns
    • G09G5/377: Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013: Eye tracking input arrangements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/60: Editing figures and text; Combining figures or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/0101: Head-up displays characterised by optical features
    • G02B2027/0138: Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/0101: Head-up displays characterised by optical features
    • G02B2027/014: Head-up displays characterised by optical features comprising information/image processing systems
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00: Aspects of display data processing
    • G09G2340/12: Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00: Aspects of interface with display user

Definitions

  • the disclosure relates to a head mounted display (HMD) device and a method for controlling the same, and more particularly, to an HMD device configured to provide virtual reality service and a method for controlling the same.
  • the virtual reality technology collectively refers to technologies that recreate a three-dimensional virtual space, created through computer graphics (CG) technology to resemble a real environment, in which participants interact through all human senses (vision, hearing, smell, taste, touch), so that the user is immersed in the virtually-created world and the utilization of information is maximized.
  • a user may have difficulty managing an emergency situation when the user is immersed in the virtual space and unable to recognize a situation occurring outside. Further, the user may have difficulty understanding the external environment or using an external interface (e.g., a keyboard, a mouse, or the like) without taking off the HMD device when such use is required.
  • when HMD devices recognize an external object and express it in augmented reality, there are inconveniences in that only registered external objects are expressed, the expression is not satisfactory, or a delay occurs.
  • US 2015/0062163 discloses a portable device and method of controlling therefor.
  • US 2015/0002394 describes a head mounted display providing eye gaze calibration and control method thereof.
  • an object of the present disclosure is to provide an HMD device configured to change a screen state of a certain area of a display included in the HMD device, and a method for controlling the same.
  • the present invention provides a head mounted display (HMD) device according to claim 1, and a method for controlling a head mounted display (HMD) device according to claim 2.
  • a screen state of a certain area of the display provided on the HMD device is changed according to movement of an object in front of the HMD device, and therefore, the user is able to view the area in front of the HMD device.
  • FIG. 1 is a diagram illustrating an HMD device 100 according to an embodiment of the disclosure.
  • the head mounted display (HMD) device 100 may be worn on the user's head to provide virtual reality service.
  • the HMD device 100 may have the shape of glasses, a headset, a helmet, or the like, and provide images to both eyes of the user.
  • the HMD device 100 may provide images in front of the eyes of the user by displaying the images through a display.
  • temples or a band may be formed on a rear side of the HMD device 100, allowing the user to wear it on his/her head.
  • the HMD device 100 may be equipped with a track pad for manipulation, a return button, a volume adjustment key, or the like.
  • the HMD device 100 may be implemented as a device requiring a separate display.
  • the HMD device 100 may include a main body (e.g., a housing) in the form of glasses, a headset, a helmet, or the like, and provide images to both eyes of the user when a smart phone, a tablet, or the like is mounted in front of the main body.
  • the HMD device 100 may display different images on the area of the display viewed by the left eye and the area viewed by the right eye, such that different images enter the left eye and the right eye.
  • the HMD device 100 may be configured to track head movement of the user and immediately update the visual images, and may provide 3D images as well as 2D images. For example, when the user wears the HMD device 100 on the head, it takes full control of the user's visual field and provides 360-degree stereoscopic images and audio, and a gyro detection unit or an acceleration detection unit mounted on the HMD device 100 provides appropriate visual effects for each direction by sensing the user moving his/her head up and down or left and right.
  • accordingly, the user may experience virtual reality (VR), as he/she is provided with the 3D images lying in the direction of his/her gaze among the panoramic 3D images.
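As a rough illustration of the head-tracking update described above, the sketch below integrates gyroscope rates into yaw and pitch angles that select the matching slice of a 360-degree panorama. It is a simplification with invented names (real HMDs typically fuse gyro and accelerometer readings rather than integrating the gyro alone):

```python
class HeadTracker:
    """Integrate gyro rates (degrees/s) into a view direction."""

    def __init__(self):
        self.yaw = 0.0    # left/right rotation, degrees
        self.pitch = 0.0  # up/down rotation, degrees

    def update(self, yaw_rate, pitch_rate, dt):
        # Yaw wraps around the full panorama; pitch is clamped so the
        # user cannot look "past" straight up or straight down.
        self.yaw = (self.yaw + yaw_rate * dt) % 360
        self.pitch = max(-90.0, min(90.0, self.pitch + pitch_rate * dt))
        return self.yaw, self.pitch

# The user turns the head 45 deg/s to the left for one second: the HMD
# would then render the panorama slice centered at yaw 315 degrees.
tracker = HeadTracker()
print(tracker.update(-45.0, 0.0, 1.0))  # -> (315.0, 0.0)
```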
  • FIGS. 2A and 2B are block diagrams illustrating constitution of an HMD device 100 according to an embodiment of the disclosure.
  • the HMD device 100 includes a display 110, a detection unit 120 and a processor 130.
  • the display 110 may display content under the control of the processor 130.
  • the display 110 may display content stored in the HMD device 100 or received from another device. Further, the display 110 may overlay and display GUI or the like during playback of content.
  • the display 110 may change a screen state of a certain area of the display 110 under the control of the processor 130. For example, the display 110 may overlay a photographed external image on the certain area of the display 110. Alternatively, the display 110 may increase the transparency of the certain area of the display 110.
  • displaying the external image in an overlay form may involve adjusting a flickering time of the overlaid external image and displaying the result.
  • changing the transparency may involve changing the transparency of the display 110 itself.
  • the display 110 may be implemented as a liquid crystal display (LCD) panel, organic light emitting diodes (OLED), or the like, although not limited hereto.
  • the display 110 may be implemented as a flexible display, a transparent display or the like.
  • the detection unit 120 may detect movement of an object in front of the HMD device 100.
  • the detection unit 120 may mainly include a camera or an infrared detection unit.
  • the camera is configured to photograph still images or videos. Specifically, the camera may be used to photograph an object positioned in front of the HMD device 100.
  • the detection unit 120 may include a plurality of infrared detection units.
  • a plurality of infrared detection units may be arranged in a row on an edge of the HMD device 100.
  • a plurality of infrared detection units may determine approximate movement of an object positioned in front of the HMD device 100 according to the presence or absence of a reflected wave.
  • although the detection unit 120 is described as being provided with the camera or the infrared detection unit, embodiments may not be limited hereto.
  • the detection unit 120 may be provided with an ultrasound detection unit, a depth sensor, or the like.
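To make the infrared approach above concrete, the following sketch infers a coarse left/right drift by comparing which sensors in a row report a reflected wave across two successive scans. The function name and sensor layout are hypothetical; the patent does not prescribe a particular algorithm:

```python
def estimate_direction(prev_hits, curr_hits):
    """Infer coarse horizontal motion from two scans of a sensor row.

    prev_hits/curr_hits: lists of booleans, one per infrared detection
    unit arranged left to right; True means a reflected wave was seen.
    Returns 'left', 'right', or 'none' depending on how the centroid
    of the reflecting units shifted between the two scans.
    """
    def centroid(hits):
        idx = [i for i, hit in enumerate(hits) if hit]
        return sum(idx) / len(idx) if idx else None

    a, b = centroid(prev_hits), centroid(curr_hits)
    if a is None or b is None or a == b:
        return 'none'
    return 'left' if b < a else 'right'

# An object covering units 3-4 moves in front of units 1-2.
print(estimate_direction(
    [False, False, False, True, True, False],
    [False, True, True, False, False, False]))  # -> 'left'
```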
  • the processor 130 may change a screen state of the display 110 to provide an image of the area in front of the HMD device 100 based on a location of the object. Accordingly, the processor 130 may naturally express any intended combination of the content screen and the external scene.
  • the detection unit 120 may include the camera, and the processor 130 may overlay and display, on the corresponding area of the display 110, the portion of the image photographed by the camera that corresponds to the location where the movement of the object is detected.
  • the processor 130 may determine relative movement of an object with respect to movement of the HMD device 100, and change a screen state to provide front image of the HMD device 100 based on the relative movement of the object.
  • the processor 130 may change a screen state to provide front image of the HMD device 100 based on a location of the object.
  • the processor 130 may change a screen state to provide front image of the HMD device 100 based on the user gesture.
  • the processor 130 may change a screen state to provide front image of the HMD device 100.
  • the processor 130 may change a screen state so as not to provide front image of the HMD device 100.
  • the detection unit 120 may detect the user's gaze, and the processor 130 may maintain the changed screen state when the user's gaze is placed on the certain area of the display 110 in which the screen state has been changed, or change the screen state so as not to provide the front image of the HMD device 100 when the user's gaze is outside the certain area of the display for which the transparency has been changed.
  • the processor 130 may change transparency of the certain area of the display 110 based on a location of the object when movement of the object is detected.
  • the processor 130 may increase transparency up to a maximum value in which case the user can only view outside of the HMD device 100.
  • the processor 130 may change transparency to a medium value in which case the user can simultaneously view the outside of the HMD device 100 as well as the content.
  • the processor 130 may decrease transparency to have a minimum value in which case the user can only view content.
  • the processor 130 may change transparency of the certain area of the display 110 based on current transparency of the certain area of the display 110 when movement of the object is detected.
  • the HMD device 100 may further include a storage to store mapping information of an object in a preset form and corresponding movement, and the processor 130 may determine the certain area of the display 110 for which transparency is to be changed and degree of change in the transparency based on the mapping information.
  • FIG. 2B is a block diagram illustrating a detailed constitution of an HMD device 100 according to an embodiment.
  • the HMD device 100 includes a display 110, a detection unit 120, a processor 130, a storage 140, a communicator 150, a user interface 155, an audio processor 160, a video processor 170, a speaker 180, a button 181, and a microphone 182.
  • the constituent elements illustrated in FIG. 2B overlapping with those illustrated in FIG. 2A will not be redundantly described below.
  • the processor 130 may control the overall operation of the HMD device 100 using various programs stored in the storage 140.
  • the processor 130 includes RAM 131, ROM 132, a main CPU 133, a graphic processor 134, first to (n)th interfaces (135-1 to 135-n), and a bus 136.
  • RAM 131, ROM 132, the main CPU 133, the graphic processor 134, and the first to (n)th interfaces 135-1 to 135-n may be connected to each other via the bus 136.
  • the first to (n)th interfaces 135-1 to 135-n may be connected to the various constituent elements described above.
  • One of the interfaces may be a network interface connected to an external device through network.
  • the main CPU 133 may access the storage 140 and perform booting using the O/S stored in the storage 140. Further, the main CPU 133 may perform various operations using various programs stored in the storage 140.
  • ROM 132 may store a set of instructions for system booting.
  • the main CPU 133 may copy the O/S stored in the storage 140 to RAM 131 according to the instructions stored in ROM 132, and boot the system by executing the O/S.
  • the main CPU 133 may copy various application programs stored in the storage 140 to RAM 131 and perform various operations by executing the application programs copied to RAM 131.
  • the graphic processor 134 may generate a screen including various objects such as icons, images, and text using a computing unit (not illustrated) and a renderer (not illustrated).
  • the computing unit (not illustrated) may compute attribute values, such as the coordinate value, shape, size, and color with which each object is to be displayed according to the layout of the screen, based on the received control command.
  • the renderer (not illustrated) may generate screens of various layouts including the objects, based on the attribute values computed by the computing unit (not illustrated).
  • the screen generated at the renderer (not illustrated) may be displayed in a display area of the display 110.
  • the above described operation of the processor 130 may be performed with the programs stored in the storage 140.
  • the storage 140 may store various data such as O/S (operating system) software module for driving the HMD device 100, various contents, display control module, object detect module, or the like.
  • the processor 130 may display content based on the information stored in the storage 140, and change transparency of the certain area of the display 110.
  • the communicator 150 is configured to perform communication with various types of external devices according to various types of communication methods.
  • the communicator 150 includes WiFi chip 151, Bluetooth chip 152, wireless communication chip 153, NFC chip 154 or the like.
  • the processor 130 may perform communication with various external devices using the communicator 150.
  • the WiFi chip 151 and the Bluetooth chip 152 may perform communication according to the WiFi method and the Bluetooth method, respectively.
  • when the WiFi chip 151 or the Bluetooth chip 152 is used, various pieces of connection information such as an SSID and a session key may first be transmitted and received, a communication connection may be established using the connection information, and various pieces of information may then be transmitted and received.
  • the wireless communication chip 153 refers to a chip performing communication according to various communication standards such as IEEE, Zigbee, 3rd Generation (3G), 3rd Generation Partnership Project (3GPP), Long Term Evolution (LTE), or the like.
  • the NFC chip 154 refers to a chip operating in the Near Field Communication (NFC) method using the 13.56 MHz band among various RF-ID frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860-960 MHz, and 2.45 GHz.
  • the communicator 150 may perform one-way communication or two-way communication with an electronic device.
  • the communicator 150 may receive signals from the electronic device.
  • the communicator 150 may receive signals from the electronic device, or transmit signals to the electronic device.
  • the user interface 155 may receive various user interactions.
  • the user interface 155 may receive input from a keyboard, a mouse, or the like.
  • the user interface 155 may be implemented as a remote controller receiver receiving remote controller signals from a remote controlling device, a camera detecting user movement, a microphone receiving user voices, or the like.
  • the user interface 155 may be implemented as a touch screen forming an inter-layer structure with a touch pad.
  • the user interface 155 may be used as the display 110 described above. Specifically, the user may control the HMD device 100 by touching an exterior of the display 110.
  • the detection unit 120 includes a geomagnetic detection unit, a gyro detection unit, an acceleration detection unit, a proximity detection unit or the like.
  • the detection unit 120 may detect various manipulations such as rotating, tilting, pressing, approaching, or the like.
  • the geomagnetic detection unit is a detection unit to detect rotating state, moving direction or the like of the HMD device 100.
  • the gyro detection unit is a detection unit to detect a rotating angle of the HMD device 100.
  • the geomagnetic detection unit and the gyro detection unit may be both provided; however, the HMD device 100 may detect rotating state even if only one of them is provided.
  • the acceleration detection unit is a detection unit to detect the tilting degree of the HMD device 100.
  • the proximity detection unit is a detection unit to detect approaching movement without directly contacting a surface of the display.
  • the proximity detection unit may be implemented as various types of detection units, such as a high-frequency oscillation type, which forms a high-frequency magnetic field and detects the current induced by the change in the magnetic field as an object approaches; a magnetic type, which uses a magnet; and a capacitive type, which detects the change in electrostatic capacity caused by an approaching object.
  • the audio processor 160 is configured to perform processing with respect to audio data.
  • the audio processor 160 may perform various processes such as decoding, amplifying, noise filtering or the like of the audio data.
  • the video processor 170 is configured to perform processing with respect to video data.
  • the video processor 170 may perform various image processes such as decoding, scaling, noise filtering, frame rate converting, resolution converting, or the like of the video data.
  • the speaker 180 is configured to output various alarm sounds or voice messages as well as the audio data processed in the audio processor 160.
  • the button 181 may be any of various forms of buttons, such as a mechanical button, a touch pad, or a wheel, formed on an arbitrary area of the front section, a side section, or the rear side of the exterior main body of the HMD device 100.
  • the microphone 182 is configured to receive user voices or other sounds and convert them into audio data.
  • FIG. 3 is a diagram provided to describe a method for detecting movement of the object according to an embodiment of the disclosure.
  • the user may wear the HMD device 100 to experience virtual reality.
  • the processor 130 may display content for the user to experience virtual reality, and may detect the area in front of the HMD device 100 through the detection unit 120.
  • the processor 130 may detect a road, a car 310, a bus 320 or the like in front of the HMD device 100, but the scene illustrated in FIG. 3 may not be provided to the user.
  • the scene of FIG. 3 is a scene detected by the detection unit 120, which is not viewed by the user unless transparency of the certain area of the display is changed according to movement of the object to be described below.
  • the processor 130 may change a screen state of the display 110 to provide a front image of the HMD device 100 based on a location of an object. For example, as illustrated in the upper drawing of FIG. 3, when detecting that the car 310 moves toward the left, the processor 130 may show the user that the car 310 is moving by changing a screen state of a certain area of the display 110 based on the detected location of the car 310.
  • the detection unit 120 may include the camera. Further, the processor 130 may overlay and display, on the corresponding area of the display 110, the portion of the image photographed by the camera that corresponds to the position where the movement of the object is detected.
  • for example, the processor 130 may overlay and display, on the corresponding area of the display 110, the area of the photographed image in which the car 310 moves toward the left.
  • the processor 130 may compare the entire area of the display 110 with the photographed image and determine a corresponding area of the display 110.
  • for example, a lower-left end of the photographed image may correspond to a lower-left end of the display 110.
  • alternatively, the processor 130 may compare the size of the entire area of the display 110 with the size of the photographed image and determine a corresponding area of the display 110 according to a proportional relation, as sketched below.
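A minimal sketch of the proportional mapping just described, assuming the camera frame and the display are both addressed as pixel rectangles; the function name and the rectangle convention are illustrative rather than taken from the patent:

```python
def map_region_to_display(region, cam_size, disp_size):
    """Scale a motion region from camera coordinates to display coordinates.

    region: (x, y, w, h) in camera pixels; cam_size/disp_size: (width, height).
    Corners of the camera frame map to the corresponding corners of the
    display, and all coordinates scale proportionally in between.
    """
    sx = disp_size[0] / cam_size[0]
    sy = disp_size[1] / cam_size[1]
    x, y, w, h = region
    return (round(x * sx), round(y * sy), round(w * sx), round(h * sy))

# A region detected in a 1280x720 camera frame, shown on a 2560x1440 panel.
print(map_region_to_display((100, 500, 300, 200), (1280, 720), (2560, 1440)))
# -> (200, 1000, 600, 400)
```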
  • the processor 130 may change the transparency of a certain area of the display 110 based on a location of an object. For example, when detecting that the car 310 moves toward the left, the processor 130 may change the certain area of the display 110 to be transparent based on the detected location of the car 310 and show the user that the car 310 is moving.
  • the location of the car 310 may indicate a position relative to the user rather than an absolute position. That is, the processor 130 may change a screen state of the display 110 according to the distance between the user and the car 310.
  • the processor 130 may determine relative movement of the object with respect to movement of the HMD device 100 when movement of the HMD device 100 is detected. For example, as illustrated in the lower diagram of FIG. 3, when the user moves his head to the left, the detection unit 120 may detect that the car 310, the bus 320, and the like in front of the HMD device 100 move to the right. In this case, the processor 130 may determine that the HMD device 100 has moved and determine the actual movement of the object. That is, the processor 130 may ignore movement of the object caused by movement of the HMD device 100 and treat only the actual movement of the object as effective movement.
  • the processor 130 may determine movement of the HMD device 100 using the geomagnetic detection unit, the gyro detection unit, the acceleration detection unit, the proximity detection unit or the like which are described above.
  • the processor 130 may also detect the actual movement of the object purely in software. For example, the processor 130 may determine that the HMD device 100 has moved when the entire detected area moves, and that the object has moved when only a portion of the detected area moves.
  • the processor 130 may determine that the HMD device 100 has moved and that some object has moved relative to the movement of the HMD device 100.
  • the lower diagram of FIG. 3 illustrates a case where only the car 310 is actually moving; as described above, the processor 130 may nevertheless determine that the HMD device 100 has moved and that the car 310 has moved relative to the movement of the HMD device 100.
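One software-only way to separate head motion from object motion, in line with the passage above, is to treat a displacement common to the whole detection area as movement of the HMD itself and keep only each object's residual displacement. The sketch assumes objects are tracked as centroids between frames; all names are hypothetical:

```python
from statistics import median

def effective_motions(prev, curr):
    """Subtract the global (head) shift from each object's shift.

    prev/curr: dicts mapping an object id to its (x, y) centroid in two
    consecutive frames. The median shift over all objects approximates
    movement of the HMD device; whatever remains per object is treated
    as that object's actual, effective movement.
    """
    ids = prev.keys() & curr.keys()
    dxs = [curr[i][0] - prev[i][0] for i in ids]
    dys = [curr[i][1] - prev[i][1] for i in ids]
    gx, gy = median(dxs), median(dys)  # estimated ego-motion of the HMD
    return {i: (curr[i][0] - prev[i][0] - gx,
                curr[i][1] - prev[i][1] - gy) for i in ids}

# The head turns (everything shifts +40 px) while only the car also moves
# on its own; the bus and tree end up with ~zero effective movement.
prev = {'car': (400, 300), 'bus': (800, 300), 'tree': (200, 310)}
curr = {'car': (380, 300), 'bus': (840, 300), 'tree': (240, 310)}
print(effective_motions(prev, curr))  # car: (-60, 0), bus/tree: (0, 0)
```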
  • the processor 130 may change a screen state to provide a front image of the HMD device 100 based on the relative movement of an object. For example, as illustrated in FIG. 3, the processor 130 may change a screen state of a lower area of the display 110 based on the detected location of the car 310 when determining that the car 310 has moved to the left.
  • although FIG. 3 illustrates the car 310 moving to the left, the processor 130 may also detect that the car 310 moves in another direction. Specifically, the processor 130 may detect that the car 310 is approaching the user. For example, the processor 130 may change a screen state of a certain area of the display 110 upon detecting that the car 310 increases in size. That is, the processor 130 may determine the size of the area occupied by the detected car 310 within the entire detection area and thereby determine the movement of the car 310.
  • FIGS. 4A to 4C are diagrams provided to describe an operation according to size of an object, according to an embodiment of the disclosure.
  • the processor 130 may change a screen state to provide a front image of the HMD device 100 based on a location of an object. For example, the processor 130 may detect that the bus 410 is moving away while detecting the area in front of the HMD device 100. However, when determining that the bus 410 is smaller than a preset size because of the remote distance to the bus 410, the processor 130 may not change a screen state of a certain area of the display 110.
  • the processor 130 may change a screen state of a certain area of the display 110 based on a location of the smart phone 420. That is, when determining that the detected smart phone 420 is greater than a preset size, the processor 130 may change a screen state of the certain area of the display 110 corresponding to the location of the smart phone 420, in which case the user can view the smart phone 420.
  • the processor 130 may compare the size of the detected object with a preset size. For example, as illustrated in FIG. 4C, the processor 130 may divide the entire detection area into a grid and determine whether the size of the detected object is greater than a preset size based on the number of grid cells occupied by the moving object (a code sketch follows this discussion).
  • embodiments may not be limited hereto, and accordingly, the number of grid cells may be varied.
  • the grid cells may be denser than those illustrated in FIG. 4C.
  • a reference number of grid cells may be established by the user.
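The grid test of FIG. 4C could be realized as below: divide the detection area into cells and count how many cells the moving object's bounding box covers. The 8x8 grid and the four-cell threshold are arbitrary placeholders, consistent with the note above that the cell density and the reference number of cells may vary and may be set by the user:

```python
import math

def exceeds_preset_size(bbox, frame_size, grid=(8, 8), min_cells=4):
    """Return True if the object's bounding box spans at least min_cells
    cells of a grid laid over the entire detection area.

    bbox: (x, y, w, h) in frame pixels; frame_size: (width, height).
    """
    cell_w = frame_size[0] / grid[0]
    cell_h = frame_size[1] / grid[1]
    x, y, w, h = bbox
    cols = math.floor((x + w) / cell_w) - math.floor(x / cell_w) + 1
    rows = math.floor((y + h) / cell_h) - math.floor(y / cell_h) + 1
    return cols * rows >= min_cells

# A nearby smartphone covers many cells; a distant bus covers one.
print(exceeds_preset_size((300, 200, 400, 500), (1280, 720)))  # True
print(exceeds_preset_size((40, 40, 30, 20), (1280, 720)))      # False
```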
  • FIGS. 5A to 5C are diagrams provided to describe an operation according to form of user gesture, according to an embodiment of the disclosure.
  • the processor 130 may change a screen state to provide a front image of the HMD device 100 based on a user gesture. For example, according to a gesture of moving the user's hand from left to right, the processor 130 may change a screen state of the certain area of the display 110 corresponding to the detected positions through which the user's hand moves.
  • FIG. 5A illustrates the images actually provided to the user who is experiencing virtual reality. That is, the processor 130 may change a screen state of a certain area of the display 110 according to a user gesture while providing a shooting game screen to the user, such that the user can view his hand and a keyboard in front of the HMD device 100. As described above, the processor 130 may provide the front image by overlaying and displaying the photographed image or by changing the transparency of the display 110, thus allowing the user to control the keyboard.
  • the processor 130 may control preset functions when the movement of the object is a user gesture in a preset form. For example, according to a gesture of moving the user's hand from the lower side to the upper side of the entire detection area, the processor 130 may switch the display from the shooting game screen to movie content.
  • the processor 130 may perform different functions based on the area of the entire detection area in which the movement of the object is performed. For example, as illustrated in FIG. 5B, according to a gesture of moving the user's hand from the lower side to the upper side on the right side of the entire detection area, the processor 130 may switch the display from the shooting game screen to movie content. Alternatively, according to the same gesture on the left side of the entire detection area, the processor 130 may increase the sound volume of the shooting game screen.
  • the processor 130 may also perform different functions based on the moving direction of an object. For example, according to a gesture of moving the user's hand from the lower side to the upper side, the processor 130 may switch the display from the shooting game screen to movie content. Alternatively, according to a gesture of moving the user's hand from the upper side to the lower side, the processor 130 may change the entire area of the display 110 to be transparent.
  • the processor 130 may change a screen state of a certain area of the display 110 by considering the shape of an object. For example, the processor 130 may change a screen state of a certain area of the display 110 according to a gesture of clenching a fist and moving it from left to right. In this case, the processor 130 may not perform any operation when the user does not clench a fist.
  • the processor 130 may change a screen state of a certain area of the display 110 by considering at least one of: the form of an object; the area of the entire detection area in which the movement of the object is performed; and the moving direction of an object.
  • the HMD device 100 may further include the storage 140 to store mapping information between objects in preset forms and corresponding movements. Such mapping information may be established initially when the HMD device 100 is manufactured, and it may also be input by the user.
  • the processor 130 may determine the certain area of the display 110 for which the transparency is to be changed, and the degree of change in the transparency, based on the mapping information. For example, the processor 130 may determine the degree of change in the transparency based on the number of detected user fingers.
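A plausible shape for such stored mapping information is a lookup keyed by gesture form that yields the display region to change and the transparency step, with the step optionally scaled by the detected finger count. Every key, value, and the 20%-per-finger rule below is an invented illustration of the idea, not the patent's own table:

```python
# Hypothetical mapping information as it might be kept in the storage 140:
# gesture form -> (display region to change, base transparency change in %).
GESTURE_MAP = {
    'palm_swipe_right': ('lower_half', +50),    # reveal the front image
    'fist_swipe_left':  ('lower_half', -50),    # hide the front image
    'open_hand_hold':   ('full_screen', +100),  # fully transparent
}

def resolve_gesture(form, fingers=None):
    """Look up the area and transparency delta for a detected gesture.

    If a finger count was detected, scale the delta to 20% per finger
    while keeping its sign (an illustrative rule only).
    """
    if form not in GESTURE_MAP:
        return None
    region, delta = GESTURE_MAP[form]
    if fingers is not None:
        delta = (1 if delta >= 0 else -1) * 20 * fingers
    return region, delta

print(resolve_gesture('palm_swipe_right', fingers=3))  # ('lower_half', 60)
```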
  • FIGS. 6A and 6B are diagrams provided to describe a method for changing a screen state according to an embodiment.
  • the processor 130 may change a screen state to provide front images of the HMD device 100 when a user gesture toward a preset direction is detected. As described above, according to a gesture of moving the user's hand from left to right, the processor 130 may change a screen state of the certain area of the display 110 corresponding to the locations where the movement of the user's hand is detected.
  • FIG. 6B illustrates a method for changing a screen state again after a screen state has been changed as in FIG. 6A .
  • when a user gesture toward the opposite direction is detected in the changed screen state, the processor 130 may change the screen state so as not to provide front images of the HMD device 100.
  • here, the 'opposite direction' refers to the direction opposite to the preset direction of the gesture that changed the initial screen state.
  • FIGS. 6A and 6B are merely one of embodiments, and the present disclosure is not limited hereto.
  • the processor 130 may change a screen state of a certain area of the display 110 to provide an image of the area the palm passes through, according to the movement of the palm.
  • the processor 130 may change a screen state of a certain area of the display 110 so as not to provide an image of the area the fist has passed through, according to the movement of the fist.
  • the processor 130 may distinguish between the right hand and the left hand and determine accordingly whether to provide the front image.
  • FIGS. 7A and 7B are diagrams provided to describe an operation according to gaze of the user, according to an embodiment.
  • the detection unit 120 of the HMD device 100 may detect gaze of the user.
  • the detection unit 120 may include a camera photographing the user's eyes as well as a camera photographing the area in front of the HMD device 100.
  • embodiments may not be limited hereto, and accordingly, any device that can detect gaze of the user may be used as the detection unit 120.
  • the processor 130 may maintain the changed screen state when the user's gaze is placed on the certain area of the display 110 in which the screen state has been changed. Accordingly, the user may continuously view the keyboard in front of the HMD device 100.
  • the processor 130 may change the screen state so as not to provide the front image of the HMD device 100 when the user's gaze moves outside the certain area of the display 110 in which the screen state has been changed. That is, the user may shift his gaze to view the shooting game screen again.
  • the processor 130 may change the screen state so as not to provide the front image of the HMD device 100 only when the user's gaze stays outside the certain area of the display 110, in which the screen state has been changed, for longer than a preset time.
  • the processor 130 may gradually change the screen state when the user's gaze moves outside the certain area of the display 110 in which the screen state has been changed. For example, the processor 130 may gradually decrease the transparency of the overlaid front image or gradually decrease the transparency of the display 110.
  • while gradually changing the screen state, the processor 130 may hold the screen state as of the moment the user's gaze returns to the certain area of the display 110 in which the screen state has been changed. Further, when the user's gaze is maintained for longer than a preset time while the screen state is held, the processor 130 may change the screen state to provide the front image of the HMD device 100 again.
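This gaze behaviour amounts to a small state machine: hold the revealed state while the gaze dwells in the changed area, start fading once the gaze has been outside it longer than a preset time, and freeze the fade if the gaze returns. A compact sketch, with the grace period and fade rate as assumed placeholders (the restore-after-renewed-gaze step is omitted for brevity):

```python
class GazeFade:
    """Track the transparency of a revealed display area against user gaze."""

    def __init__(self, hold_s=1.0, fade_per_s=50.0):
        self.hold_s = hold_s          # grace period after the gaze leaves
        self.fade_per_s = fade_per_s  # transparency % lost per second
        self.transparency = 100.0     # 100 = front image fully visible
        self.outside_s = 0.0          # time the gaze has been outside

    def update(self, gaze_in_area, dt):
        if gaze_in_area:
            self.outside_s = 0.0      # gaze returned: freeze the fade here
        else:
            self.outside_s += dt
            if self.outside_s > self.hold_s:
                # gradually stop providing the front image
                self.transparency = max(
                    0.0, self.transparency - self.fade_per_s * dt)
        return self.transparency

fade = GazeFade()
for gaze in [True, True, False, False, False, False]:  # 0.5 s per step
    print(round(fade.update(gaze, 0.5), 1))
# 100.0 while gazed at and during the grace period, then 75.0, 50.0
```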
  • FIGS. 8A to 8C are diagrams provided to describe a method for establishing the entire detection area, according to an embodiment.
  • the detection unit 120 of the HMD device 100 may include a camera 810. Further, a viewing angle 820 of the user may be different from a viewing angle 830 of the camera 810.
  • the viewing angle 820 of the user refers to an angle in which the user can view a front direction of the HMD device 100 through the display 110 under the condition in which the display 110 is transparent.
  • the viewing angle 830 of the camera 810 indicates the extent of the scene that can be captured by the camera 810, and may be implemented as any of various viewing angles 830.
  • the viewing angle 830 of the camera 810 is broader than or equal to the viewing angle 820 of the user.
  • FIG. 8B illustrates a scene viewed by the user based on the viewing angle 820 of the user
  • FIG. 8C illustrates a scene photographed based on the viewing angle 830 of the camera 810.
  • the processor 130 may detect movement of the object within the area corresponding to the gaze of the user among the areas photographed by the camera 810, and change a screen state of a certain area of the display based on a location of the object. That is, because the scene photographed based on the viewing angle 830 of the camera 810 is broader, the processor 130 may treat as meaningful only the movement of the object within the area 840 that corresponds to the scene viewed by the user.
  • the processor 130 may detect movement of the object within an area corresponding to gaze of the user among the areas photographed by the camera 810. The above operation may reduce errors between the area actually viewed by the user and the area photographed by the camera 810.
  • the processor 130 may determine the certain area of the display 110 in which a screen state is to be changed based on at least one of the location of the object and the viewing angle of the user determined by the display 110.
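Because the camera's field of view is at least as wide as the user's, motion detection can be restricted to the sub-rectangle of each frame that the user can actually see, i.e., the area 840 of FIG. 8C. A sketch of that crop under a simple pinhole-camera assumption, with the angles chosen arbitrarily:

```python
import math

def visible_region(frame_size, cam_fov_deg, user_fov_deg):
    """Return the centered sub-rectangle of the camera frame that falls
    within the user's (narrower) viewing angle.

    Assumes a pinhole model: the half-extent on the image plane scales
    with tan(half-FOV), applied independently per axis.
    frame_size: (width, height); the FOVs: (horizontal, vertical) degrees.
    """
    result = []
    for full, cam, usr in zip(frame_size, cam_fov_deg, user_fov_deg):
        ratio = math.tan(math.radians(usr / 2)) / math.tan(math.radians(cam / 2))
        keep = full * ratio
        result.append((round((full - keep) / 2), round(keep)))
    (x, w), (y, h) = result
    return x, y, w, h  # offset and size of the user-visible area

# Camera FOV of 100x70 degrees, user's view through the display 80x55.
print(visible_region((1280, 720), (100, 70), (80, 55)))
```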
  • FIG. 9 is a diagram provided to describe operation at current transparency of a display according to an embodiment of the disclosure.
  • the processor 130 may change the transparency of a certain area of the display 110 based on the current transparency of the certain area of the display 110. For example, when the transparency of the certain area of the display 110 is 60% and movement of the object is detected, the processor 130 may increase the transparency of the certain area of the display 110. Alternatively, when the transparency of the certain area of the display 110 is 40% and movement of the object is detected, the processor 130 may decrease the transparency of the certain area of the display 110. That is, the processor 130 may change the transparency of the certain area of the display 110 to be completely transparent or completely opaque, and determine how to change it based on the current transparency.
  • the processor 130 may change the transparency of the certain area of the display 110 by considering at least one of: the current transparency of the certain area of the display 110; the shape of the object; the area of the entire detection area in which the movement of the object is performed; and the direction of the movement of the object.
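Read this way, the FIG. 9 behaviour snaps the area toward the nearer extreme: an already mostly transparent area becomes fully transparent on further detected movement, and a mostly opaque one becomes fully opaque. A sketch with a 50% pivot, which is an assumption rather than a value given in the text:

```python
def snap_transparency(current_pct, pivot=50):
    """Drive the area fully transparent or fully opaque on detected motion.

    current_pct: present transparency of the area (0 = opaque, 100 = clear).
    Values at or above the pivot snap to 100; values below snap to 0.
    """
    return 100 if current_pct >= pivot else 0

print(snap_transparency(60))  # -> 100 (was mostly transparent)
print(snap_transparency(40))  # -> 0   (was mostly opaque)
```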
  • FIG. 10 is a flowchart provided to describe a method for controlling an HMD device according to an embodiment of the disclosure.
  • content may be displayed on the display at S1010. Further, movement of the object in front of the HMD device may be detected at S1020. When movement of the object is detected, a screen state of the display may be changed to provide front image of the HMD device based on a location of the object at S1030.
  • in the changing at S1030, the portion of the image photographed by the camera that corresponds to the position where the movement of the object is detected may be overlaid and displayed on the corresponding area of the display.
  • the changing at S1030 may include determining relative movement of the object with respect to movement of the HMD device, and changing a screen state to provide the front image of the HMD device based on the relative movement of the object.
  • the changing at S1030 may change a screen state to provide front image of the HMD device based on a location of the object when the detected size of the object is greater than a preset size.
  • the changing at S1030 may change a screen state to provide front image of the HMD device based on the user gesture when movement of the object is a user gesture in a preset form.
  • the changing at S1030 may include changing a screen state to provide the front image of the HMD device when a user gesture toward a preset direction is detected, and changing the screen state so as not to provide the front image of the HMD device when a user gesture toward the opposite direction is detected in the changed screen state.
  • a screen state may be maintained when gaze of the user is detected to be placed on the certain area of the display in which a screen state has been changed, and a screen state may be changed so as not to provide front image of the HMD device when gaze of the user is outside the certain area of the display.
  • the changing at S1030 may change transparency of the certain area of the display based on a location of the object when movement of the object is detected.
  • the changing at S1030 may change transparency of the certain area of the display based on current transparency of the certain area of the display when movement of the object is detected.
  • the changing at S1030 may determine the certain area of the display for which transparency is to be changed and degree of change in the transparency based on mapping information of an object in a preset form and corresponding movement.
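Putting the S1010 to S1030 steps together, the control method reduces to a loop that displays content, watches for motion, and changes the screen state where motion is found. The sketch below stubs out the device-specific pieces (detection and rendering) and reuses the proportional camera-to-display mapping idea from earlier; all names are hypothetical:

```python
class StubDisplay:
    """Stand-in for the HMD display; a real device would render frames."""
    def show_content(self):
        print("displaying VR content")
    def overlay(self, area, frame):
        print(f"overlaying front image on display area {area}")

def map_region(region, cam, disp):
    # Proportional camera-to-display coordinate mapping (see earlier sketch).
    sx, sy = disp[0] / cam[0], disp[1] / cam[1]
    x, y, w, h = region
    return (round(x * sx), round(y * sy), round(w * sx), round(h * sy))

def control_step(display, detect_motion, get_frame,
                 cam=(1280, 720), disp=(2560, 1440)):
    """One pass of the method of FIG. 10: S1010 display the content,
    S1020 detect movement, S1030 change the screen state."""
    display.show_content()                  # S1010
    region = detect_motion()                # S1020: None if nothing moved
    if region is not None:                  # S1030
        display.overlay(map_region(region, cam, disp), get_frame())

control_step(StubDisplay(),
             detect_motion=lambda: (100, 500, 300, 200),
             get_frame=lambda: "camera-frame")
```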
  • the user may view the area in front of the HMD device, as the transparency of the certain area of the display provided in the HMD device is changed according to movement of the object in front of the HMD device.
  • although transparency is described as being changed based on movement of the object in front of the HMD device, embodiments may not be limited hereto.
  • the transparency of the display may be changed on the basis of user voice recognition.
  • the control method of the HMD device may be implemented as program code executable by a computer, stored in various non-transitory computer readable recording media, and provided to each server or device to be executed by a processor.
  • there may be provided a non-transitory computer readable recording medium storing a program that sequentially performs displaying content on the display, detecting movement of the object in front of the HMD device, and changing a screen state of the display to provide the front image of the HMD device based on a location of the object when movement of the object is detected.
  • the non-transitory computer readable recording medium refers to a medium that stores data semi-permanently and is readable by a machine, rather than a medium that stores data temporarily, such as a register, a cache, or a memory.
  • the various applications or programs described above may be stored and provided in a non-transitory computer readable recording medium such as a CD, a DVD, a hard disk, a Blu-ray disk, a USB memory, a memory card, a ROM, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Claims (2)

  1. Head mounted display (HMD) device (100), comprising:
    a display (110) configured to display content;
    a detection unit (120) configured to detect a movement of an object in front of the HMD device; and
    a processor (130) configured to change a screen state of the display to provide a front image of the HMD device based on a location of the object when a movement of the object is detected,
    wherein the detection unit further comprises a camera (810),
    the device being characterized in that:
    the display is configured to display the content to provide a virtual reality service, and
    when the movement of the object is a user gesture in a preset form, the processor is configured to change the screen state to provide the front image, based on the user gesture, in a certain area of the display (110) corresponding to the detected position through which the user's hand moves according to the gesture, such that the user experiencing virtual reality can view his hand and a keyboard in front of the HMD device through overlaying and displaying of the photographed image, thereby allowing the user to control the keyboard.
  2. Method for controlling a head mounted display (HMD) device, comprising:
    displaying (S1010) content on a display;
    detecting (S1020) a movement of an object in front of the HMD device; and
    changing (S1030) a screen state of the display to provide a front image of the HMD device based on a location of the object when a movement of the object is detected,
    the method being characterized by:
    displaying the content to provide a virtual reality service, and, when the movement of the object is a user gesture in a preset form, the changing (S1030) of the screen state comprising providing the front image, based on the user gesture, in a certain area of the display corresponding to the detected position through which the user's hand moves according to the gesture, such that the user experiencing virtual reality can view his hand and a keyboard in front of the HMD device through overlaying and displaying of the photographed image, thereby allowing the user to control the keyboard.
EP16886584.8A 2016-01-20 2016-03-17 HMD device and method for controlling the same Active EP3370102B1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160007086A KR102610120B1 (ko) 2016-01-20 2016-01-20 HMD device and control method thereof
PCT/KR2016/002724 WO2017126741A1 (ko) 2016-01-20 2016-03-17 HMD device and control method thereof

Publications (3)

Publication Number Publication Date
EP3370102A1 (de) 2018-09-05
EP3370102A4 (de) 2018-12-05
EP3370102B1 (de) 2021-10-13

Family

ID=59362430

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16886584.8A Active EP3370102B1 (de) 2016-01-20 2016-03-17 Hmd-vorrichtung und verfahren zur steuerung davon

Country Status (5)

Country Link
US (2) US10643579B2 (de)
EP (1) EP3370102B1 (de)
KR (1) KR102610120B1 (de)
CN (1) CN108474950B (de)
WO (1) WO2017126741A1 (de)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102029906B1 (ko) * 2017-11-10 2019-11-08 Korea Electronics Technology Institute Apparatus and method for providing virtual reality content in a means of transportation
US11417296B2 (en) * 2018-03-13 2022-08-16 Sony Corporation Information processing device, information processing method, and recording medium
KR102220091B1 (ko) * 2018-11-26 2021-02-24 Gachon University Industry-Academic Cooperation Foundation Gaze-based 360-degree video streaming
JP7238456B2 (ja) * 2019-02-21 2023-03-14 Seiko Epson Corporation Display system, control program for information processing device, and control method for information processing device
WO2020209624A1 (en) 2019-04-11 2020-10-15 Samsung Electronics Co., Ltd. Head mounted display device and operating method thereof
US10992926B2 (en) 2019-04-15 2021-04-27 XRSpace CO., LTD. Head mounted display system capable of displaying a virtual scene and a real scene in a picture-in-picture mode, related method and related non-transitory computer readable storage medium
EP3734417A1 (de) * 2019-05-02 2020-11-04 XRSpace CO., LTD. Head mounted display system capable of displaying a virtual scene and a real scene in a picture-in-picture mode, related method and related non-transitory computer readable storage medium
TWI719483B (zh) 2019-05-17 2021-02-21 雅得近顯股份有限公司 Convenient memo operation system
KR102048354B1 (ko) * 2019-07-10 2020-01-23 Hanwha Systems Co., Ltd. Apparatus and method for displaying images in a wireless helmet
JP2021125859A (ja) * 2020-02-10 2021-08-30 Casio Computer Co., Ltd. Display output control device, display output control system, display output control method, and program
KR102616729B1 (ko) 2022-11-09 2023-12-21 Hanwha Systems Co., Ltd. Flight device for urban air mobility and method for providing content in an urban air mobility vehicle

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1754201A1 (de) * 2004-05-27 2007-02-21 Canon Kabushiki Kaisha Information processing method, information processing device and image capturing device
MX2012010238A (es) * 2010-03-05 2013-01-18 Sony Comp Entertainment Us Maintaining multiple views on a shared stable virtual space
US9111498B2 (en) * 2010-08-25 2015-08-18 Eastman Kodak Company Head-mounted display with environmental state detection
US8941559B2 (en) * 2010-09-21 2015-01-27 Microsoft Corporation Opacity filter for display device
US20120249587A1 (en) 2011-04-04 2012-10-04 Anderson Glen J Keyboard avatar for heads up display (hud)
KR101824501B1 (ko) * 2011-05-19 2018-02-01 Samsung Electronics Co., Ltd. Apparatus and method for controlling image display of a head mounted display device
US20120327116A1 (en) 2011-06-23 2012-12-27 Microsoft Corporation Total field of view classification for head-mounted display
US8558759B1 (en) 2011-07-08 2013-10-15 Google Inc. Hand gestures to signify what is important
US8228315B1 (en) 2011-07-12 2012-07-24 Google Inc. Methods and systems for a virtual input device
EP2761362A4 (de) * 2011-09-26 2014-08-06 Microsoft Corp Video display modification based on sensor input for a see-through near-to-eye display
JP2013125247A (ja) * 2011-12-16 2013-06-24 Sony Corp Head mounted display and information display device
US9213185B1 (en) * 2012-01-06 2015-12-15 Google Inc. Display scaling based on movement of a head-mounted display
JP5580855B2 (ja) 2012-06-12 2014-08-27 Sony Computer Entertainment Inc. Obstacle avoidance device and obstacle avoidance method
US9568735B2 (en) 2012-08-07 2017-02-14 Industry-University Cooperation Foundation Hanyang University Wearable display device having a detection function
US8482527B1 (en) * 2012-09-14 2013-07-09 Lg Electronics Inc. Apparatus and method of providing user interface on head mounted display and head mounted display thereof
JP2013077013A (ja) * 2012-11-20 2013-04-25 Sony Corp Display device and display method
JP6229260B2 (ja) * 2012-11-20 2017-11-15 Seiko Epson Corporation Virtual image display device
KR20140090552A (ko) 2013-01-09 2014-07-17 LG Electronics Inc. Head mounted display providing eye gaze calibration and control method thereof
US9619021B2 (en) * 2013-01-09 2017-04-11 Lg Electronics Inc. Head mounted display providing eye gaze calibration and control method thereof
US9411160B2 (en) 2013-02-12 2016-08-09 Seiko Epson Corporation Head mounted display, control method for head mounted display, and image display system
KR20140129936A (ko) * 2013-04-30 2014-11-07 Intellectual Discovery Co., Ltd. Head mounted display and content providing method using the same
KR20140130321A (ko) 2013-04-30 2014-11-10 (주)세이엔 Wearable electronic device and control method thereof
TW201447375A (zh) 2013-06-13 2014-12-16 Hsiu-Chi Yeh Augmented reality head-mounted electronic device and method
KR20150006128A (ko) 2013-07-08 2015-01-16 LG Electronics Inc. Head mounted display device and operating method thereof
US9361733B2 (en) * 2013-09-02 2016-06-07 Lg Electronics Inc. Portable device and method of controlling therefor
JP5825328B2 (ja) * 2013-11-07 2015-12-02 Konica Minolta Inc. Information display system having a transmissive HMD, and display control program
JP6294054B2 (ja) 2013-11-19 2018-03-14 NTT Docomo Inc. Video display device, video presentation method, and program
KR20150101612A (ko) 2014-02-27 2015-09-04 LG Electronics Inc. Head mounted display providing a closed view and control method thereof
KR101665027B1 (ko) * 2014-03-05 2016-10-11 Skonec Entertainment Co., Ltd. Head tracking bar system for a head mounted display
JP6252316B2 (ja) 2014-03-31 2017-12-27 Denso Corporation Display control device for vehicle
EP3403130A4 (de) * 2016-01-12 2020-01-01 eSIGHT CORP. Methods and devices for vision augmentation for language elements

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
EP3370102A4 (de) 2018-12-05
US20180366091A1 (en) 2018-12-20
KR20170087292A (ko) 2017-07-28
US10643579B2 (en) 2020-05-05
CN108474950A (zh) 2018-08-31
WO2017126741A1 (ko) 2017-07-27
US20200227009A1 (en) 2020-07-16
US11164546B2 (en) 2021-11-02
EP3370102A1 (de) 2018-09-05
KR102610120B1 (ko) 2023-12-06
CN108474950B (zh) 2021-09-28

Similar Documents

Publication Publication Date Title
US11164546B2 (en) HMD device and method for controlling same
US11366516B2 (en) Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device
US20210405761A1 (en) Augmented reality experiences with object manipulation
US11160688B2 (en) Visual aid display device and method of operating the same
KR102349716B1 (ko) Image sharing method and electronic device performing the same
TWI471820B (zh) Mobile terminal and operation control method thereof
CN107765429A (zh) Image display device and operating method thereof
JP7382972B2 (ja) Method and apparatus for providing input for a head-mounted image display device
US20160132189A1 (en) Method of controlling the display of images and electronic device adapted to the same
JP7005161B2 (ja) Electronic device and control method thereof
KR20150041453A (ko) Glasses-type image display device and control method thereof
KR20140070326A (ko) Mobile device providing a three-dimensional interface and gesture control method thereof
CN111937045B (zh) Information processing device, information processing method, and recording medium
US20160021353A1 (en) I/o device, i/o program, and i/o method
CN111161396B (zh) Control method and apparatus for virtual content, terminal device, and storage medium
KR20150146295A (ko) Head mounted display and control method thereof
US11675198B2 (en) Eyewear including virtual scene with 3D frames
JP2012108723A (ja) Instruction receiving device
KR20150024199A (ko) Head mounted display device and control method thereof
CN103914128B (zh) Head-mounted electronic device and input method
CN112136096A (zh) Displaying a physical input device as a virtual object
CN107408186A (zh) Display of private content
US20240103680A1 (en) Devices, Methods, and Graphical User Interfaces For Interacting with Three-Dimensional Environments
KR20180052501A (ko) Display device and operating method thereof
EP3702008A1 (de) Displaying a viewport of a virtual space

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180530

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

A4 Supplementary search report drawn up and despatched

Effective date: 20181030

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/01 20060101ALI20181024BHEP

Ipc: G02B 27/01 20060101AFI20181024BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20190326

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20210517

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602016065005

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1438594

Country of ref document: AT

Kind code of ref document: T

Effective date: 20211115

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20211013

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1438594

Country of ref document: AT

Kind code of ref document: T

Effective date: 20211013

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220113

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220213

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220214

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220113

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220114

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602016065005

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20220714

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20220331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220317

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220331

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220317

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220331

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20160317

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240220

Year of fee payment: 9

Ref country code: GB

Payment date: 20240220

Year of fee payment: 9

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211013