EP3649539A1 - Visual output device with a camera and presentation method - Google Patents
Visual output device with a camera and presentation method
- Publication number
- EP3649539A1 (application EP18736893.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- output device
- visual output
- image
- camera
- virtual object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/56—Display arrangements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/56—Display arrangements
- G01S7/62—Cathode-ray tube displays
- G01S7/6245—Stereoscopic displays; Three-dimensional displays; Pseudo-three dimensional displays
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/56—Display arrangements
- G01S7/62—Cathode-ray tube displays
- G01S7/6281—Composite displays, e.g. split-screen, multiple images
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
- H04N7/157—Conference systems defining a virtual conference space and using avatars or agents
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/02—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
- G01S15/06—Systems determining the position data of a target
- G01S15/46—Indirect determination of position data
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0141—Head-up displays characterised by optical features characterised by the informative content of the display
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
Definitions
- the invention relates to a visual output device with a camera, several uses of such a visual output device and a method for presenting information by means of such a visual output device.
- DE 102014018056 A1 shows a visual output device in the form of virtual reality glasses (VR glasses) 12 with two displays 16. If a person 10 wears these VR glasses 12, then each display 16 is arranged in front of one eye.
- a display device of the VR glasses 12 is capable of displaying a virtual environment.
- a camera system 14 is mounted with one or more cameras. Each camera produces a camera image of the real environment of the person 10.
- the VR glasses 12 normally display the virtual environment.
- a detection device monitors whether the person 10 performs a predetermined movement. When this movement is detected, the VR glasses 12 show on the displays 16 a camera image of the real environment of the person 10.
- DE 102015006612 A1 shows data glasses 12 which display images from a night vision camera to a motorist.
- these data glasses 12 can be considered a visual output device.
- a controller 12b is capable of displaying images or objects on the display surface of the data glasses 12.
- a night vision camera 16a is disposed on the data glasses 12. Images from this night vision camera are displayed in the form of a night vision image on the display surface of the data glasses 12, for example superimposed on the real image.
- a night vision camera 16b is mounted on the motor vehicle.
- a communication module 14 transmits signals from this night vision camera 16b to the data glasses 12. Thanks to these night vision cameras 16a, 16b, it is possible to realize a visual perception of the real environment even in dim light or darkness.
- the stereo camera 105 detects movements of a user's hand 155.
- the stereo camera 105 captures images of the hand 155.
- various icons are shown, including a virtual joystick 117 and a virtual hand 115.
- the representation of the virtual hand 115 is changed in accordance with the movements of the real hand 155 detected by the stereo camera 105. In this way, the user can change the arrangement of three-dimensional sonar data 113 on the display area 109 of the screen 103.
- the representation of the virtual hand 115 on the screen is changed depending on the measured values of the camera 105.
- the object of the invention is to provide a visual output device having the features of the preamble of claim 1 and a presentation method having the features of the preamble of claim 21 which facilitate communication between two people.
- the visual output device comprises a carrier, a display device, a first camera and a receiving device.
- the display device is attached to the carrier.
- the visual output device is designed to be carried by a human using the carrier.
- the visual output device provides an optically dense space in front of the eyes of a human wearing the visual output device.
- the visual output device is designed to prevent light from penetrating from the real environment into the optically dense space provided.
- the first camera is capable of producing an image of the environment of the visual output device.
- the receiving device is capable of receiving signals from a remote transmitting device.
- the display device is capable of generating at least one virtual object, depending on a signal received by the receiving device.
- This virtual object represents information in a form that a human can visually perceive. The information depends on the signal.
- the presentation device further presents a common representation in the optically dense space.
- This common representation includes the virtual object representing the information and an image generated by the first camera.
- the presentation method according to the invention comprises the corresponding steps which are carried out while a person is wearing the visual output device according to the invention.
- the first camera produces an image of the real environment of the visual output device.
- the presentation device represents this image. Because the visual output device is worn on the head of the person to whom this image is shown, the output device, together with the carrier and thus the presentation device, moves when the human moves his head. This prevents the different sensory organs of this person from providing contradictory information: on the one hand the eyes, which see the represented image, and on the other hand the other sensory organs, which perceive the spatial position, orientation and movement of the person in space.
- the presentation device represents an image of the real environment. This image follows the head movements of the human who carries the visual output device. This reduces the risk that the human suffers from travel sickness or so-called VR sickness (simulator sickness, virtual reality sickness), which can occur in particular when the person is located in a moving real environment. This may occur in particular aboard a watercraft and take a form similar to seasickness.
- a first person can communicate with another person while the first person is carrying the output device.
- the other person can address the first person. Thanks to the output device according to the invention, the first person carrying the output device can not only hear words spoken by the other person, but also perceive the gestures and facial expressions of the other person.
- the visual output device is able to output information in a form visually perceptible by a human and thus to represent a virtual reality.
- a variety of information can be displayed.
- the presentation device presents this virtual reality on a screen or on the human retina.
- the visual output device according to the invention avoids the need to switch between a representation of the real environment and a representation of a virtual reality. Such switching may confuse a person carrying the visual output device, particularly when the switching is abrupt or when the human is moving relative to the depicted real environment but not relative to the depicted virtual reality.
- the visual output device according to the invention allows the person who carries the output device to perceive the real environment and the virtual environment simultaneously, without having to switch between the two representations.
- the visual output device is able to present additional information as if this information were present in the environment that the person carrying the visual output device would see if he were not wearing the visual output device.
- the visual output device of the present invention allows a person carrying the visual output device to make user or operator interventions on a machine or system, the first camera producing an image of that machine and the presentation device displaying that image of the machine in the common representation. At the same time, a measured value from a sensor is displayed as a virtual object in the common representation.
- the measured value may originate from a variable quantity of the machine itself or from another variable quantity.
- the human being is enabled to use the measured value in the user input without having to take his eyes off the machine and in particular without having to look at a physical display device for the measured value.
- the human can choose for himself what distance from the machine he wants to keep. It is possible that the person wearing the output device uses a remote control to control the machine. As a result, the human can remain at a safe distance from the machine.
- the visual output device encloses an optically dense space in front of the eyes of a person who is carrying the output device.
- the visual output device prevents light from entering the optically dense space from the real environment. Because an optically dense space is provided, the common representation is the only optical information for a human carrying the output device.
- the output device according to the invention prevents light impressions from outside from overlapping the common representation.
- these light impressions can cause the person to become confused, to recognize certain segments of the common representation, in particular virtual objects, not at all or only poorly, or to suffer from eyes that become overloaded and fatigue quickly. This unwanted effect can occur especially when the light impressions from outside differ greatly in brightness from the common representation.
- a presentation device can be configured such that it reduces the brightness differences between camera images of differing brightness before displaying them.
- at least one virtual object and thus a virtual reality are superimposed with the image of the real environment.
- the real environment is rendered weaker or stronger than the virtual reality information. It is possible that a person who carries the output device perceives a semi-transparent overlay of the virtual reality with the depicted real environment, as in the sketch below.
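- a minimal sketch of such a semi-transparent overlay; OpenCV, NumPy and the weighting factor alpha are illustrative assumptions, not prescribed by the patent:

```python
import cv2
import numpy as np

def blend_overlay(camera_image: np.ndarray,
                  virtual_layer: np.ndarray,
                  alpha: float = 0.35) -> np.ndarray:
    """Superimpose a rendered virtual layer on a camera image.

    A small alpha renders the virtual reality weaker than the real
    environment; a large alpha renders it stronger (hypothetical values).
    """
    # Both layers must share resolution and colour format.
    assert camera_image.shape == virtual_layer.shape
    return cv2.addWeighted(virtual_layer, alpha, camera_image, 1.0 - alpha, 0.0)
```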
- the receiving device receives a first measured value.
- this first measured value comes from a first sensor that has measured a quantity which varies over time.
- the presentation device uses this received first measurement value as a first signal and generates a first virtual object.
- This first virtual object represents the received first measured value visually.
- the presentation device presents this first virtual object, which represents the first measured value, together with an image from the first camera.
- this embodiment enables a first person carrying the output device to communicate with another person and to perceive the first measured value during the communication. It is not necessary for the first person to turn his head during communication in order to view alternately the other person and a remote display for the first measured value.
- the design also allows the first human to perceive the first measured value while looking at an output device for a second measured value. This output device, which represents the second measured value, is shown in the image which the first camera produces and which the presentation device displays in the common representation.
- the receiving device additionally receives a second measured value originating from a second sensor.
- the presentation device generates a second virtual object, which visually represents the received second measured value. The presentation device presents in the common representation the first and the second virtual object and the image from the first camera.
- the presentation device presents the common representation to a human wearing the visual output device.
- the presentation device comprises a first screen. This first screen adjoins the optically dense space and is located in front of an eye or both eyes of a person wearing the output device. The presentation device is able to present the common presentation on the first screen.
- the common representation is displayed on a first screen, which adjoins the optically dense space. This type of presentation is often perceived by a human as less disturbing or threatening than other types of presentation, such as a projection on the retina.
- it is possible for one area of the first screen to be used to represent the image of the real environment, and another area to represent the virtual object or objects.
- the visual output device includes a single screen adjacent to the optically dense space and simultaneously positioned in front of both eyes of a human carrying the output device.
- the output device comprises two screens. Each screen adjoins the optically dense space and is positioned in front of one eye of the human.
- the presentation device comprises a so-called retina projector.
- this retina projector projects the common representation onto the retina of at least one eye of a person carrying the output device.
- This embodiment saves a screen.
- the first camera, which is attached to the carrier of the visual output device, has a viewing direction. In one embodiment, this viewing direction coincides with the standard viewing direction of a person wearing the output device.
- the standard viewing direction is the direction in which a person looks straight ahead.
- the viewing direction of the first camera coincides with the standard viewing direction of a person wearing the visual output device.
- the first camera is mounted on the carrier so that it faces away from the human face and is oriented outwards in the standard viewing direction into the real environment.
- the images provided by the thus-arranged first camera show the real environment from the same viewpoint from which the human being would perceive the real environment if he did not wear the visual output device.
- the depicted images of the first camera thereby match the spatial position and movement of the human head even better. Contradictory information from different human sensory organs is prevented with even greater certainty.
- this avoids the often unpleasant impression that a person wearing the visual output device cannot see what is in front of him. In particular, it is ensured that the human has the certainty of recognizing an obstacle when moving in the standard viewing direction.
- the first camera and additionally a second camera are mounted on the carrier of the visual output device.
- the presentation device is able to produce a stereoscopic representation of the real environment from the images of these two cameras.
- the presentation device is able to present this stereoscopic representation together with the or a virtual object in the common representation.
- the first camera has a smaller horizontal viewing angle than the second camera.
- the presentation device is able to present an image of the first camera and a virtual object in the common representation and additionally to insert, at least temporarily, an image of the second camera into this common representation.
- this embodiment can further reduce the risk that the person carrying the output device suffers from motion sickness or VR sickness. It is possible that the first camera produces an image of the near range, for example of another person who communicates with the first person. The second camera creates an image of the background, which is farther away. This second image is at least temporarily inserted into the common representation with the virtual object. This second image reduces the risk of VR sickness. One possible insertion scheme is sketched below.
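- a minimal picture-in-picture sketch for inserting the wide-angle background image of the second camera into the common representation; OpenCV, NumPy, the scale factor and the corner placement are illustrative assumptions:

```python
import cv2
import numpy as np

def insert_background_view(common: np.ndarray,
                           wide_image: np.ndarray,
                           scale: float = 0.25) -> np.ndarray:
    """Insert a downscaled wide-angle image into the top-right corner
    of the common representation (picture-in-picture)."""
    h, w = common.shape[:2]
    small = cv2.resize(wide_image, (int(w * scale), int(h * scale)))
    out = common.copy()
    out[:small.shape[0], w - small.shape[1]:w] = small
    return out
```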
- the visual output device comprises an image intensifier.
- the image intensifier is capable of amplifying an image which the first and/or the second camera has produced.
- the presentation device is able to present an image amplified by the image intensifier in the common representation.
- an image intensifier can amplify the image from the first camera.
- the presentation device presents the amplified image. This design makes it easier to use the visual output device in situations where there is little or no natural illumination, as well as little or no artificial illumination. This is the case, for example, in closed rooms without windows and with only poor lighting, or when used outdoors at dusk or at night, and especially when strong lighting is not desired.
- the image intensifier can be, for example, a residual light amplifier.
- the output device comprises a conversion device.
- This conversion device is capable of converting an image in the infrared light region into an image in the visible light region.
- the presentation device is able to present in the common representation an image converted by the conversion device.
- the first camera receives and / or generates images in the infrared light range.
- an infrared light source illuminates the real environment.
- the design with the infrared image makes it easier to use the visual output device in low light conditions.
- the conversion device converts an infrared image into an image in the visible light range. In the common representation, this image is then displayed in the visible range. This representation is easier for a human to grasp than if infrared images were displayed directly. A false-colour mapping as sketched below is one conceivable conversion.
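- an illustrative false-colour conversion of a single-channel infrared image into the visible range; OpenCV, NumPy and the chosen colour map are assumptions, not taken from the patent:

```python
import cv2
import numpy as np

def infrared_to_visible(ir_image: np.ndarray) -> np.ndarray:
    """Map an infrared intensity image to a visible false-colour image."""
    # Stretch the raw intensities to the full 8-bit range first.
    normalized = cv2.normalize(ir_image, None, 0, 255, cv2.NORM_MINMAX)
    return cv2.applyColorMap(normalized.astype(np.uint8), cv2.COLORMAP_JET)
```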
- the output device comprises a compensation device.
- This compensation device is capable of calculating the respective average brightness of at least two images produced one after the other by the first camera.
- the compensation device may lighten the temporally subsequent image if its average brightness is smaller than that of the temporally preceding image, and darken the temporally subsequent image if its average brightness is greater than that of the temporally preceding image.
- the presentation device is able to present an image lightened or darkened by the compensation device. A sketch of this smoothing follows below.
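- a minimal sketch of such a frame-to-frame brightness compensation, assuming NumPy and 8-bit images (illustrative assumptions):

```python
import numpy as np

def compensate_brightness(previous: np.ndarray,
                          current: np.ndarray) -> np.ndarray:
    """Pull the mean brightness of the current frame toward that of the
    preceding frame: lighten the frame if it is darker on average,
    darken it if it is brighter."""
    difference = float(previous.mean()) - float(current.mean())
    adjusted = current.astype(np.float32) + difference
    return np.clip(adjusted, 0, 255).astype(np.uint8)
```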
- the visual output device is in data communication with an input device.
- This input device may be part of the visual output device and, for example, mounted on the carrier of the output device, or be spatially separate from the output device.
- the input device is capable of capturing at least one input of a human, in particular of the human who carries the visual output device.
- the presentation device can present a joint presentation.
- the presentation device is able to change this common representation in response to a corresponding input which the user has made with the input device and which the input device has detected. For example, the presentation device initially presents a representation of an image which the first camera has generated.
- after detecting a corresponding user input, the presentation device adds a virtual object and then presents a common representation. Or the presentation device removes the or at least one virtual object from the common representation after a corresponding user input. Or the presentation device alters a virtual object, for example enlarges or reduces it, presents it brighter or darker, or presents it at another point in the common representation. It is also possible that, in response to a detected user input, the presentation device alters the displayed image from the first camera, for example enlarges, reduces or rotates it, or presents it lighter or darker.
- the human who carries the visual output device and uses the input device can change the common representation of the image and the virtual object without having to take off the visual output device.
- the human can, for example, increase or decrease the scale of the image or change the brightness of the image.
- the input device may comprise, for example, a mouse or a joystick or a switch or a touchpad.
- the input device may be configured to detect a head movement of the human, for example by means of a localization sensor, motion sensor or acceleration sensor.
- the input unit can also include a visual evaluation unit, which detects a human gesture and derives therefrom a user input of this person, for example by pattern recognition.
- the input device is capable of locating and / or determining and / or classifying a head movement of a person who carries the output device.
- the visual output device is capable of detecting an identifier of a person carrying the output device. For example, the human identifies himself biometrically (fingerprint) or by using a password.
- the presentation device presents in the common representation the or at least one virtual object depending on the detected identifier. It is possible that a virtual object which represents confidential information visually is presented only to a certain group of people.
- a virtual object may represent information, such as a measured value, in a manner visually perceptible by a human. It is possible that which information is visually displayed as virtual objects, and/or in which way it is displayed, depends on a security level and/or the position and/or the function of the person.
- the configuration that an identifier of the person wearing the visual output device is detected makes it possible to generate the virtual objects depending on this security level or function or position of the person and, in particular, to display certain information only for specific people.
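- a minimal sketch of such identifier-dependent filtering; the data structure, the numeric clearance levels and the function names are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    label: str
    security_level: int  # minimum clearance required to see this object

def visible_objects(objects: list[VirtualObject],
                    clearance: int) -> list[VirtualObject]:
    """Return only those virtual objects whose security level is covered
    by the clearance associated with the detected identifier."""
    return [obj for obj in objects if clearance >= obj.security_level]
```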
- the visual output device is a pair of virtual reality glasses (VR glasses).
- VR glasses are also called video glasses, helmet displays or VR helmets.
- the visual output device can also be designed as augmented reality glasses (AR glasses).
- the visual output device is configured such that at least the carrier, the first camera and at least one component of the presentation device are located on the head of a person who carries the visual output device.
- in one embodiment, the receiving device and/or the presentation device are also attached to the carrier. The human then carries these components on his head.
- in another embodiment, the receiving device is spatially separated from the carrier and communicates with the presentation device via a data connection, for example via a cable.
- the person also carries the receiving device or a component of the display device on the body, for example, in a bag or strapped to a body part.
- the first camera and the optional second camera are capable of generating static or moving optical images of the real environment, in particular a sequence of images.
- each camera repeatedly generates images, for example at a predetermined sampling rate or sampling frequency.
- the or each camera is configured as a digital camera with a CCD chip.
- a lens system guides light onto this CCD chip.
- the presentation device uses data from this CCD chip to represent the image of the real environment in the common representation.
- the first camera may be configured as a 3D camera comprising spaced-apart lenses or similar optical imaging units.
- the receiving device of the visual output device is capable of receiving signals from a spatially remote transmitting device.
- the receiving device may be configured to receive the signals via a wired transmission channel, for example via an electrical line or an optical waveguide.
- the receiving device may alternatively also be configured to receive the signals via a wireless transmission channel, for example by electromagnetic waves.
- as a wireless transmission channel, for example, electromagnetic waves, mobile radio, Bluetooth, WLAN, near-field communication and/or optical directional radio can be used. If the output device is used aboard a vessel, the vessel's on-board network may provide a portion of the transmission channel. A receiver sketch follows below.
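- an illustrative receiver loop for sensor signals arriving over a wireless channel such as WLAN; the UDP transport, the port number and the datagram layout (sensor id as uint16, reading as float32, network byte order) are assumptions made for the sake of the example:

```python
import socket
import struct

def receive_measurements(port: int = 5005):
    """Yield (sensor_id, value) pairs from incoming UDP datagrams."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))  # listen on all interfaces
    while True:
        payload, _sender = sock.recvfrom(64)
        sensor_id, value = struct.unpack("!Hf", payload[:6])
        yield sensor_id, value
```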
- the invention can be implemented in an arrangement which includes a visual output device according to the invention, a first sensor and a first transmission device.
- the first sensor is capable of measuring a value of a first variable quantity and generating a first signal. This first signal depends on the measured value.
- the first transmission device is capable of transmitting the generated first signal to the receiving device of the visual output device.
- the presentation device is capable of generating a first virtual object. This virtual object represents the received first measured value visually.
- the presentation device is able to present the first virtual object together with an image generated by the first camera in a common representation.
- a human wearing the visual output device according to the invention is able to perceive the value of the first variable quantity while remaining at a spatial distance from the first sensor.
- the first sensor may be in a location that is dangerous to humans. Thanks to the worn visual output device, it is not necessary for the first measured value to be displayed on a stationary physical output device. As a result, a person does not need to move to a stationary physical output device, does not need to carry a portable physical output device, and is still informed of the measured value of the first quantity.
- at the same time, or overlapping in time, the first person can communicate with another person.
- the first sensor may be arranged aboard a watercraft and may include, for example, an active sonar system, a passive sonar system, a towed antenna, a radar system, a geoposition receiver, a measuring device for a vehicle speed and/or a device for measuring the wind direction or the wind speed.
- the arrangement comprises a second sensor and a second transmission device.
- the second sensor generates a second signal, which depends on a second measured value.
- the second transmitter transmits the second signal to the visual output device.
- the presentation device generates a second virtual object and presents in the common representation the first virtual object, the second virtual object and the image.
- FIG. 1 shows a person wearing a visual output device according to the invention in the form of VR glasses, and an input device.
- FIG. 2 is a schematic representation of what the VR glasses present on the screen to a person wearing them.
- the invention is used on board a manned watercraft, wherein the watercraft may be an overwater vehicle or an underwater vehicle.
- a crew member uses a visual output device according to the invention, which in the exemplary embodiment has the form of virtual reality glasses (VR glasses).
- an Oculus Rift® is used as the VR glasses, extended in accordance with the invention.
- a human M.1 wears VR glasses 101.
- these VR glasses 101 comprise a carrier which comprises a preferably elastic, length-adjustable strap 107 and a frame 106.
- the strap 107 is guided around the head K.1 of the person M.1 and carries the frame 106.
- the strap 107 ensures a secure fit of the VR glasses 101.
- the frame 106 carries a plate-shaped and preferably flexible holding element 105.
- in the holding element 105, two camera lenses 103 are embedded, which belong to two digital cameras of the VR glasses 101.
- Each digital camera is capable of producing an image of the real environment of VR glasses 101.
- the viewing direction of each digital camera preferably coincides with the standard viewing direction of the person M.1 who wears the VR glasses 101.
- the two digital cameras with the lenses 103 effectively form the human's "eyes" onto the real environment.
- on the inside of the holding element 105, that is to say on the surface of the holding element 105 facing the human M.1, a screen 211 is provided.
- Fig. 2 schematically shows this screen 211 in the viewing direction in which the human M.1, who wears the VR glasses 101, looks at the holding element 105 and thus at the screen 211.
- in one embodiment, two screens 211 are provided, namely one screen in front of each eye of the human M.1.
- a presentation device described below is capable of generating images and presenting them to the human M.1 on these two screens 211.
- the screens 211 belong to the presentation device.
- a computer 115 of the presentation device is mounted on the frame 106; it evaluates the signals from the cameras with the lenses 103 and generates an image of the real environment.
- the computer 115 automatically causes this image to be presented on the or each screen 211.
- the computer 115 generates a stereoscopic representation of the real environment and uses the signals from both digital cameras 103 for this purpose.
- this stereoscopic representation is presented on the screen 211.
- FIG. 2 shows by way of example an image 215 presented on a screen 211.
- the person M.1 who wears the VR glasses 101 looks at another human M.2, for example another crew member of the watercraft.
- the image 215 shows the head K.2 and the upper body of this other human.
- the person M.1 who wears the VR glasses 101 can perceive the gestures and the facial expressions of the other person without having to take off the VR glasses 101. This allows the two people M.1 and M.2 to communicate visually with each other. This visual communication can complement or even replace acoustic communication when acoustic communication is not possible.
- the VR glasses 101 are opaque, i.e. they enclose an optically dense space in front of the eyes of the person wearing the VR glasses 101. This optically dense space is bounded by the holding element 105, the frame 106 and the head K.1 of the human M.1. In particular, the holding element 105 with the two camera lenses 103 is configured completely opaque. The VR glasses 101 prevent light from the real environment from entering the optically dense space in front of the head K.1 of the first human M.1.
- on board the vessel is a sonar system with an underwater antenna.
- this sonar system locates a sound source that emits sound waves under water and in particular determines the direction and/or the distance from the vessel to this sound source. Depending on the measured values (e.g. direction and distance and sound intensity as a function of time and/or frequency), this sonar system generates signals. These signals are transmitted via the wired electrical system of the vessel to a central computer 220.
- the central computer 220 is connected to a transmitter 127.
- this transmitter 127 wirelessly transmits these signals to a receiving device 117, which is also arranged on the frame 106 or on the holding element 105 of the VR glasses 101.
- the receiving device 117 receives the signals at a fixed frequency.
- the computer 115 of the presentation device evaluates the received signals and generates virtual objects in the form of virtual instruments 213.
- the computer 115 of the presentation device generates a common representation which simultaneously presents the image 215 of the real environment and a plurality of virtual objects in the form of virtual instruments 213.
- the computer 115 updates the virtual instruments 213 at the frequency at which the receiving device 117 receives the signals.
- the or at least some signals contain presentation information. If a measured value is outside a predetermined range, for example if the distance to a sound source falls below a predetermined threshold, then the corresponding virtual instrument 213 is highlighted. As a result, the attention of the human M.1 is directed to this virtual instrument 213 showing a relevant measured value. One conceivable highlighting rule is sketched below.
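- a minimal sketch of such a threshold-based highlighting rule for a virtual instrument 213; the range limits, the colours and the returned style dictionary are illustrative assumptions:

```python
def instrument_style(value: float, lower: float, upper: float) -> dict:
    """Decide how a virtual instrument is drawn: highlight it when the
    measured value leaves the predetermined range, e.g. when the
    distance to a sound source falls below a threshold."""
    out_of_range = value < lower or value > upper
    return {
        "colour": "red" if out_of_range else "white",
        "blinking": out_of_range,  # draws the wearer's attention
    }
```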
- the person M.1 looks at the human M.2, for example because the crew member M.2 has previously addressed the crew member M.1.
- the human M.1 sees on the screen 211 on the one hand an image 215 which shows the human M.2, and on the other hand the virtual instruments 213, cf. Fig. 2.
- the human M.1 can thereby perceive the measured values from the sonar system, which are displayed with the aid of the virtual instruments 213, and at the same time communicate visually with the human M.2.
- the two people M.1, M.2 additionally each carry a voice input unit, for example a microphone, and a voice output unit (not shown).
- the voice output unit used by the first human M.1 may be integrated into the frame 106 or the strap 107 of the VR glasses 101.
- the voice output unit used by the second human M.2 may, for example, belong to a headset. Thanks to the speech input units and the speech output units, the two people M.1 and M.2 can additionally communicate acoustically with one another, even if considerable ambient noise makes acoustic communication without aids difficult.
- the human M.1 carries an input device 109 in his left hand. With the aid of the input device 109, the human M.1 can make user input.
- the human M.1 operates, for example, a button or a knob or a touch-sensitive panel on the input device 109.
- the input device 109 has a motion sensor or an acceleration sensor that registers a certain left-hand movement.
- the presentation device is able to change the common representation of the image 215 and of the virtual instruments 213 depending on user input, for example to increase or decrease the magnification of the common representation (zoom) or to lighten or darken it.
- the VR glasses 101, which act as the visual output device, include the strap 107, the frame 106, the holding element 105, the camera lenses 103, the presentation device with the computer 115 and the screen 211, and the receiving device 117, and are worn by the first person M.1.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102017114914.7A DE102017114914A1 (de) | 2017-07-04 | 2017-07-04 | Visuelles Ausgabegerät und Bediensystem |
DE102017114905.8A DE102017114905A1 (de) | 2017-07-04 | 2017-07-04 | Kommunikationsvorrichtung und Bediensystem |
PCT/EP2018/067894 WO2019007936A1 (fr) | 2017-07-04 | 2018-07-03 | Appareil de sortie visuelle doté d'une caméra et procédé de présentation |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3649539A1 (fr) | 2020-05-13
Family
ID=64950631
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18736893.1A Withdrawn EP3649539A1 (fr) | 2017-07-04 | 2018-07-03 | Appareil de sortie visuelle doté d'une caméra et procédé de présentation |
EP18736891.5A Withdrawn EP3649538A1 (fr) | 2017-07-04 | 2018-07-03 | Ensemble et procédé de communication faisant intervenir deux appareils de sortie visuelle |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18736891.5A Withdrawn EP3649538A1 (fr) | 2017-07-04 | 2018-07-03 | Ensemble et procédé de communication faisant intervenir deux appareils de sortie visuelle |
Country Status (2)
Country | Link |
---|---|
EP (2) | EP3649539A1 (fr) |
WO (2) | WO2019007936A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111031292A (zh) * | 2019-12-26 | 2020-04-17 | 北京中煤矿山工程有限公司 | 一种基于vr技术的煤矿安全生产实时监测系统 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8467133B2 (en) * | 2010-02-28 | 2013-06-18 | Osterhout Group, Inc. | See-through display with an optical assembly including a wedge-shaped illumination system |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6774869B2 (en) * | 2000-12-22 | 2004-08-10 | Board Of Trustees Operating Michigan State University | Teleportal face-to-face system |
DE102004019989B3 (de) * | 2004-04-23 | 2005-12-15 | Siemens Ag | Anordnung sowie Verfahren zur Durchführung von Videokonferenzen |
US20070030211A1 (en) * | 2005-06-02 | 2007-02-08 | Honeywell International Inc. | Wearable marine heads-up display system |
DE202009010719U1 (de) | 2009-08-07 | 2009-10-15 | Eckardt, Manuel | Kommunikationssystem |
EP2611152A3 (fr) * | 2011-12-28 | 2014-10-15 | Samsung Electronics Co., Ltd. | Appareil d'affichage, système de traitement d'image, procédé d'affichage et son traitement d'imagerie |
US9390561B2 (en) * | 2013-04-12 | 2016-07-12 | Microsoft Technology Licensing, Llc | Personal holographic billboard |
US10262462B2 (en) * | 2014-04-18 | 2019-04-16 | Magic Leap, Inc. | Systems and methods for augmented and virtual reality |
DE102014107211A1 (de) | 2014-05-22 | 2015-11-26 | Atlas Elektronik Gmbh | Vorrichtung zum Anzeigen einer virtuellen Realität sowie Messgerät |
US20160054791A1 (en) * | 2014-08-25 | 2016-02-25 | Daqri, Llc | Navigating augmented reality content with a watch |
DE102014018056B4 (de) | 2014-12-05 | 2024-08-14 | Audi Ag | Verfahren zum Betreiben einer Virtual-Reality-Brille und Virtual-Reality-Brille |
US10032388B2 (en) * | 2014-12-05 | 2018-07-24 | Illinois Tool Works Inc. | Augmented and mediated reality welding helmet systems |
US9904054B2 (en) | 2015-01-23 | 2018-02-27 | Oculus Vr, Llc | Headset with strain gauge expression recognition system |
US9910275B2 (en) * | 2015-05-18 | 2018-03-06 | Samsung Electronics Co., Ltd. | Image processing for head mounted display devices |
DE102015006612B4 (de) | 2015-05-21 | 2020-01-23 | Audi Ag | Verfahren zum Betreiben einer Datenbrille in einem Kraftfahrzeug und System mit einer Datenbrille |
DE202016000449U1 (de) | 2016-01-26 | 2016-03-08 | Johannes Schlemmer | Kommunikationssystem zur entfernten Unterstützung von Senioren |
US10019131B2 (en) | 2016-05-10 | 2018-07-10 | Google Llc | Two-handed object manipulations in virtual reality |
2018
- 2018-07-03 EP EP18736893.1A patent/EP3649539A1/fr not_active Withdrawn
- 2018-07-03 WO PCT/EP2018/067894 patent/WO2019007936A1/fr unknown
- 2018-07-03 EP EP18736891.5A patent/EP3649538A1/fr not_active Withdrawn
- 2018-07-03 WO PCT/EP2018/067892 patent/WO2019007934A1/fr unknown
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8467133B2 (en) * | 2010-02-28 | 2013-06-18 | Osterhout Group, Inc. | See-through display with an optical assembly including a wedge-shaped illumination system |
Also Published As
Publication number | Publication date |
---|---|
WO2019007934A1 (fr) | 2019-01-10 |
WO2019007936A1 (fr) | 2019-01-10 |
EP3649538A1 (fr) | 2020-05-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3426366B1 (fr) | Détermination de position et orientation d'un casque de réalité virtuelle, et manège d'attraction doté d'un casque de réalité virtuelle | |
EP2901225B1 (fr) | Système de commande pour une machine de traitement de boissons | |
EP3394708B1 (fr) | Procédé de fonctionnement d'un système de réalité virtuelle et système de réalité virtuelle associé | |
DE102010038341A1 (de) | Videoüberwachungssystem sowie Verfahren zur Konfiguration eines Videoüberwachungssystems | |
EP3286532B1 (fr) | Procédé de détection de vibrations d'un dispositif et système de détection de vibrations | |
DE102016207009A1 (de) | Bediensystem für eine Maschine der Lebensmittelindustrie, insbesondere der Getränkemittelindustrie | |
DE102014213021A1 (de) | Lokalisierung eines HMD im Fahrzeug | |
DE102016206154A1 (de) | Verfahren und Vorrichtung zum Generieren eines Bildsignals und Anzeigesystem für ein Fahrzeug | |
DE102014006732A1 (de) | Bildüberlagerung von virtuellen Objekten in ein Kamerabild | |
DE102017208806B3 (de) | Externe Darstellung von Bildaufnahmen eines Fahrzeuginnenraums in einer VR-Brille | |
DE102014226185A1 (de) | Verfahren zum Bestimmen einer Blickrichtung einer Person | |
WO2018202614A1 (fr) | Dispositif de capteur mobile pour un appareil de sortie visuel pouvant être porté sur la tête, pouvant être utilisé dans un véhicule, et procédé servant à faire fonctionner un système d'affichage | |
EP3649539A1 (fr) | Appareil de sortie visuelle doté d'une caméra et procédé de présentation | |
DE102018218728A1 (de) | System zur Steuerung einer Maschine mit "Wearable Display" | |
DE102015006610B4 (de) | Verfahren zum Betreiben einer Datenbrille in einem Kraftfahrzeug und System mit einer Datenbrille | |
DE102014009699B4 (de) | Verfahren zum Betreiben einer Anzeigeeinrichtung und System mit einer Anzeigeeinrichtung | |
DE102014206625A1 (de) | Positionierung eines HMD im Fahrzeug | |
DE112021003465T5 (de) | Informationsprozessor, informationsverarbeitungsverfahren und speichermedium | |
DE102016102808A1 (de) | Verfahren zur Steuerung eines an einem Fahrzeug richtbar angeordnetes Sichtgeräts | |
DE102014019357B4 (de) | Anzeigesystem für einen Schutzhelm | |
WO2018130383A1 (fr) | Procédé de fonctionnement d'un système d'affichage comportant un visiocasque dans un véhicule automobile | |
DE102016125459B3 (de) | Bilderfassungsverfahren auf einem Bilderfassungssystem | |
WO2020161106A1 (fr) | Système de sécurité pourvu d'un dispositif de détection ainsi que procédé pour le contrôle de sécurité | |
WO2024061654A1 (fr) | Procédé et dispositif de détermination d'une pose de lunettes de paire de lunettes intelligentes au moyen d'un suivi extérieur | |
EP2907551A1 (fr) | Détecteur et procédé de fonctionnement d'un détecteur |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20200204 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20210729 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20211209 |