EP3458935A1 - Verfahren zum einstellen einer blickrichtung in einer darstellung einer virtuellen umgebung - Google Patents

Verfahren zum einstellen einer blickrichtung in einer darstellung einer virtuellen umgebung

Info

Publication number
EP3458935A1
Authority
EP
European Patent Office
Prior art keywords
representation
virtual environment
user
orientation
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP17722406.0A
Other languages
German (de)
English (en)
French (fr)
Inventor
Christopher Mutschler
Tobias FEIGL
Christian DAXER
Stephan Otto
Bercea COSMIN-IONUT
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Original Assignee
Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV filed Critical Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Publication of EP3458935A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/02Affine transformations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Definitions

  • Embodiments deal with the representation of a virtual environment.
  • Embodiments are concerned with a method for adjusting a line of sight in a representation of a virtual environment.

Background
  • Virtual reality refers to the representation and simultaneous perception of a computer-generated, interactive virtual environment and its physical properties.
  • A representation of the virtual environment can be output to a user, e.g. via a display device attached to the user's head.
  • Such devices are known as head-mounted display (HMD), head-mounted display unit or head-mounted unit (HMU).
  • The display device presents the virtual environment, e.g., on a near-eye screen or by projecting it directly onto the user's retina.
  • The orientation, i.e. the viewing direction, in the representation of the virtual environment is set by rotating about a transverse axis (pitch axis) of the representation of the virtual environment, rotating about a longitudinal axis (roll axis) of the representation of the virtual environment and/or rotating about a yaw axis (vertical axis) of the representation of the virtual environment.
  • the transverse axis, the longitudinal axis and the yaw axis are perpendicular to each other.
  • the position of the head of the user can be detected. For example, a position and an orientation of the user's head in the real environment, ie in the real world, can be determined in order to adapt the representation of the virtual environment. Accordingly, the perception of the own person in the real environment can be reduced and the identification with the virtual environment can be increased.
  • For example, the position can be determined from the propagation times of a radio signal emitted by a single transmitter at the user's head to multiple remote receivers.
  • Using a time-difference-of-arrival (TDoA) method, a position of the user's head can be determined with an accuracy in the single-digit centimeter range from the different time differences between the transmission of the radio signal by the transmitter and the reception of the radio signal by the respective receivers.
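As an aside added to this write-up (not part of the patent text), the TDoA idea above can be sketched in a few lines: given fixed receiver positions and measured arrival-time differences, the transmitter position can be estimated by nonlinear least squares. All coordinates, constants and function names below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0  # propagation speed of the radio signal (speed of light, m/s)

def tdoa_residuals(pos, receivers, tdoas, ref_idx=0):
    """Difference between predicted and measured arrival-time differences."""
    dists = np.linalg.norm(receivers - pos, axis=1)   # transmitter-receiver distances
    predicted = (dists - dists[ref_idx]) / C          # time differences w.r.t. reference receiver
    return predicted - tdoas

# Illustrative receiver layout (metres) and synthetic TDoA measurements (seconds).
receivers = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0]])
true_pos = np.array([3.0, 4.0])
tdoas = (np.linalg.norm(receivers - true_pos, axis=1)
         - np.linalg.norm(receivers[0] - true_pos)) / C

estimate = least_squares(tdoa_residuals, x0=np.array([5.0, 5.0]), args=(receivers, tdoas))
print("estimated transmitter position:", estimate.x)  # close to (3.0, 4.0)
```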
  • the transmitter can be integrated, for example, in a display device fastened to the head of the user or attached to the head of the user independently of the display device fastened to the head of the user.
  • a user can thus change the displayed position in the virtual environment, for example, by moving freely in the real environment.
  • the position of the user's head can be detected via a camera-based method, a Time-of-Flight (ToF) method, a Round Trip Time (RTT) method or an inertial measurement unit (Inertial Measurement Unit, IMU).
  • the orientation of the user's head can be determined, for example, by means of corresponding sensors (eg gyroscope, magnetometer, accelerometer) of the display device fastened to the head of the user.
  • sensors already provided in the mobile communication device can be used to determine the orientation of the user's head.
  • a user can change the viewing direction in the virtual environment by turning or tilting the head in the real environment.
  • the viewing direction in the virtual environment is changed by rotating around the yaw axis of the representation of the virtual environment.
  • For example, the magnetometer can be used to determine the orientation in the real environment in a sufficiently stable manner.
  • a magnetic field map can be created so that the orientation of the user's head in the real environment can be determined with a suitably calibrated magnetometer.
  • the determination of the orientation of the head of the user by means of the above-mentioned sensors can lead to orientation errors.
  • magnetometers sometimes provide incorrect readings so that the measured orientation of the head does not match the true orientation of the head in the real world.
  • Also an approximate determination of the head orientation by coupling the measured values of a gyroscope and an accelerometer can lead to a discrepancy between the measured or determined orientation of the head and the true orientation of the head in the real environment due to measurement errors of the individual sensor elements.
  • the combination and integration of erroneous measured values over a longer period of time can lead to deviations between the specific orientation of the head and the true orientation of the head in the real environment.
  • Frequent and intense changes in the rate of rotation of the sensor (e.g. changes between slow and fast rotation of the head) can be particularly problematic, since the error increases with the magnitude of the rate-of-rotation change. Accordingly, the orientation of the representation of the virtual environment, which is based on the measured values, is falsified.
  • A rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment of up to about ±15° is usually not perceived by a user. If a user is, e.g., walking straight ahead in the real environment, he sometimes fails to notice when the viewing direction in the representation of the virtual environment deviates from this by up to about ±15° (i.e. the viewing direction is rotated up to about 15° to the left or right about the yaw axis). In other words, to some degree the user does not realize that, in contrast to the real environment, in the virtual environment he does not move straight ahead but obliquely. Larger deviations, however, are perceived by the user and diminish the feeling of immersion.
  • The method comprises recording a known object in a real environment with a recording device (e.g. as an image, a video or a sound recording). Furthermore, the method comprises determining a rotational offset of the viewing direction in the representation of the virtual environment about a yaw axis of the representation of the virtual environment based on the recording of the object, a known position of the recording device in the real environment and a current viewing direction in the representation of the virtual environment. The method further comprises rotating the viewing direction in the representation of the virtual environment by the rotational offset.
  • the recording device can be arranged spatially in the immediate vicinity of the user.
  • For example, the recording device can be attached to the body of the user (e.g. to the head).
  • the orientation of the recording device in the real environment can be determined, which can therefore be assumed approximately as the orientation of the user's head in the real environment.
  • the rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment can be determined and the representation of the virtual environment can be corrected accordingly.
  • the representation of the virtual environment can thus be adapted to the actual position and orientation of the user's head in the real environment.
  • embodiments of the proposed method thus allow a calibration of the viewing direction in the representation of the virtual environment.
  • an improved sense of immersion can be created for a user.
  • Further exemplary embodiments relate to a second method for setting a viewing direction in a representation of a virtual environment.
  • The method comprises recording a known object in a real environment with a recording device (e.g. as an image, a video or a sound recording). Furthermore, the method comprises determining a rotational offset of the viewing direction in the representation of the virtual environment about a yaw axis of the representation of the virtual environment based on the recording of the object and a current viewing direction in the representation of the virtual environment. In addition, the method comprises rotating the viewing direction in the representation of the virtual environment by the rotational offset.
  • the orientation of the recording device in the real environment can be determined.
  • The orientation of the recording device in the real environment may be taken approximately as the orientation of the user's head in the real environment.
  • the rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment can be determined and the representation of the virtual environment can be corrected accordingly.
  • the representation of the virtual environment can thus be adapted to the actual position and orientation of the user's head in the real environment.
  • embodiments of the proposed method thus allow a calibration of the viewing direction in the representation of the virtual environment.
  • an improved sense of immersion can be created for a user.
  • Further exemplary embodiments relate to a third method for setting a viewing direction in a representation of a virtual environment. The method comprises recording a known object in a real environment with a recording device at a first time and at a second time (e.g. as an image, a video or a sound recording in each case).
  • Furthermore, the method comprises determining a rotational offset of the viewing direction in the representation of the virtual environment about a yaw axis of the representation of the virtual environment based on the recordings of the object at the first time and at the second time and on measured values of at least one further sensor attached to the head of the user.
  • the method further comprises rotating the line of sight in the representation of the virtual environment around the rotation offset.
  • From the recordings of the object at the first time and at the second time, an effective rotation of the recording device about the yaw axis of the user's head between the first time and the second time can be determined.
  • The difference between the two determined values for the rotation of the recording device about the yaw axis of the user's head can be taken approximately as the rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment.
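Read together with the preceding items (an interpretive note of this write-up, not wording from the patent), the rotational offset of the third method can be written as the difference between the image-based and the sensor-based yaw change between the two times:

$$\Delta\varphi_{\text{offset}} = \bigl(\alpha_{\text{cam}}(t_1) - \alpha_{\text{cam}}(t_0)\bigr) - \bigl(\alpha_{\text{sensor}}(t_1) - \alpha_{\text{sensor}}(t_0)\bigr)$$

where $\alpha_{\text{cam}}$ denotes the yaw of the recording device determined from the recordings of the known object and $\alpha_{\text{sensor}}$ the yaw determined from the at least one further sensor attached to the head; the symbols are chosen here only for illustration.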
  • the representation of the virtual environment can thus be corrected accordingly.
  • the representation of the virtual environment can thus be adapted to the actual position and orientation of the user's head in the real environment.
  • Embodiments in another aspect include a program having program code for performing one of the proposed methods when the program code is executed on a computer, a processor, or a programmable hardware component.
  • FIG. 1 shows an example of a method for setting a viewing direction in a representation of a virtual environment
  • FIG. 2 shows an example of a relationship between an object in the real environment and a photograph of the object
  • Fig. 3a shows a first example of an object
  • Fig. 3b shows a second example of an object
  • Fig. 3c shows a third example of an object
  • Fig. 3d shows a fourth example of an object
  • Fig. 4 shows exemplary features in an object
  • Fig. 5 shows an example of an assignment of the positions of features of an object in the real environment to the position of the respective feature in a photograph of the object
  • FIG. 8 shows an example of a further method for setting a viewing direction in a representation of a virtual environment
  • Fig. 9a is a photograph of a fifth example of an object
  • FIG. 9b shows a corresponding binary recording to the recording shown in FIG. 9a
  • FIG. Fig. 9c is a photograph of a sixth example of an object
  • Fig. 9d shows a corresponding binary recording to the recording shown in Fig. 9c
  • FIG. 10 shows an example of another method for setting a viewing direction in a representation of a virtual environment
  • Fig. 11a shows an example of a relationship between a motion vector of a user, an actual viewing direction of the user and a viewing direction in the representation of the virtual environment determined from measured values of at least one further sensor arranged at the user's head, at a first time
  • Fig. 11b shows an example of a relationship between a motion vector of a user, an actual viewing direction of the user and a viewing direction in the representation of the virtual environment determined from measured values of at least one further sensor arranged at the user's head, at a second time.
  • the virtual environment is a computer-generated, interactive world with predetermined physical properties that can be output to a user, for example.
  • the viewing direction in the representation of the virtual environment corresponds to the orientation of the virtual environment shown in the representation of the virtual environment in the virtual environment.
  • the representation of the virtual environment may render the virtual environment from an ego perspective or first-person perspective, ie the virtual environment is rendered as a user would see if it were actually moving in the virtual environment.
  • The viewing direction in the representation of the virtual environment would thus correspond to the user's viewing direction if he were actually moving in the virtual environment.
  • the viewing direction in the representation of the virtual environment is thereby set by rotating about the transverse axis of the representation of the virtual environment, rotating about the longitudinal axis of the representation of the virtual environment and / or rotating about the yaw axis of the representation of the virtual environment.
  • the method 100 comprises recording 102 a known object in a real environment (i.e., the real world) with a capture device.
  • the known object may be both an object placed specifically for the proposed method in the real environment and an object already present in the real environment.
  • the object may be located in an area in the real environment in which the user is moving, or may be an already existing element of this area.
  • The object may be either a substantially two-dimensional (planar) object, i.e. an object that extends substantially in only two spatial directions, or a three-dimensional object, i.e. an object that extends to a similar degree in all three spatial directions. If the user is moving in the real environment, e.g. within a room or hall, the known object may, for example, be an element already present in that room or hall.
  • the object may be e.g. a poster, a projection, a sound source or other element placed in the room or hall specifically for the proposed method.
  • The recording can be, for example, a still image (i.e. a single shot), a video (i.e. a sequence of images) or a sound recording, i.e. a recording of sound (e.g. tones, sounds, music or speech).
  • the recording device may comprise a still camera, a video camera, a (stereo) sound recording device or a combination thereof.
  • The method 100 comprises determining 104 a rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment based on the recording of the object, a known position of the recording device in the real environment and a current viewing direction in the representation of the virtual environment.
  • The recording device can be arranged spatially in the immediate vicinity of the user.
  • the transmitter can be arranged, for example, at the head of the user and the recording device spatially close to it, in order to know the position of the recording device in the real environment as precisely as possible.
  • The instantaneous viewing direction in the representation of the virtual environment can be provided, for example, by a display device (e.g. HMD, HMU) which outputs (and optionally also calculates) the representation of the virtual environment to the user, or by a computer which calculates the virtual environment (e.g. the back-end of a VR system).
  • the orientation of the recording device in the real environment can be determined, which can be approximated as the orientation of the user or his head in the real world. From this, by using the information about the instantaneous viewing direction in the representation of the virtual environment, the rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment can be determined.
  • the method 100 further comprises rotating 106 the viewing direction in the representation of the virtual environment around the rotational offset.
  • the viewing direction in the representation of the virtual environment is corrected by a rotation about the yaw axis of the representation of the virtual environment, the direction and the magnitude of the rotation being determined by the rotational offset.
  • the representation of the virtual environment is thus corrected by the rotational offset.
  • the representation of the virtual environment can thus be adapted to the actual position and orientation of the user's head in the real environment.
  • the method 100 thus allows a calibration of the viewing direction in the representation of the virtual environment.
  • Thus, an incorrectly determined orientation in the real environment, or a drift of the viewing direction in the representation of the virtual environment caused by measurement errors of the sensors commonly used to determine the position and orientation of (the head of) a user, can be corrected.
  • the method 100 may further include outputting the representation of the virtual environment to a user.
  • Outputting the representation of the virtual environment to the user may take place, e.g., via a display device attached to the head of the user, which further comprises the recording device.
  • the orientation of the recording device in the real environment may be approximated as orienting the user's head in the real environment.
  • the display device attached to the user's head includes a mobile communication device (eg, a smartphone).
  • Sensors already present in the mobile communication device (e.g. gyroscope, magnetometer, accelerometer) can then be used to determine the orientation of the user's head in the real environment.
  • By using the camera of the mobile communication device as the recording device, a rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment that is due to measurement errors of the sensors of the mobile communication device can be corrected.
  • For the calibration of the representation of the virtual environment, use can thus be made of resources already provided by the mobile communication device.
  • the method 100 may be executed immediately (ie online) on the mobile communication device.
  • the method 100 can thus enable a calibration of the representation of the virtual environment without additional hardware components.
  • A portion of the method 100 may be performed by the mobile communication device, and another portion of the method 100, such as determining the rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment, may be performed by an already existing back-end of the VR system that the user uses (i.e. offline).
  • the determined rotational offset can then be sent, for example, from the back-end to the mobile communication device so that it can rotate the viewing direction in the current representation of the virtual environment by the rotational offset.
  • The functionality described above can be implemented, e.g., by an update for one or more existing software components of the VR system (e.g. software for the mobile communication device or software for the back-end).
  • determining the rotational offset of the viewing direction in the representation of the virtual environment about the yaw axis of the representation of the virtual environment may include determining an orientation of the recording device in the real environment based on the image of the object and the known position of the recording device the real environment (example methods are explained below). Furthermore, the determining 104 of the rotational offset may include determining a desired viewing direction in the representation of the virtual environment based on the orientation of the recording device in the real environment. For example, the specific orientation of the recording device in the real environment can be provided to an algorithm for calculating the representation of the virtual environment, which calculates a representation of the virtual environment based thereon.
  • the desired viewing direction in the virtual environment can be that viewing direction in the virtual environment that corresponds to the actual position and orientation of the user's head in the real environment.
  • For example, the recording device can be aligned with the user's straight-ahead viewing direction in the real environment, or perpendicular to this viewing direction.
  • the viewing direction in the calculated representation of the virtual environment can consequently be regarded as a desired viewing direction.
  • the rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment is now determined according to exemplary embodiments. This can be done, for example, by comparing the desired viewing direction and the current viewing direction in the representation of the virtual environment. In other words, it is determined how much the instantaneous viewing direction in the representation of the virtual environment relative to the desired viewing direction in the representation of the virtual environment is rotated about the yaw axis of the representation of the virtual environment.
  • the representation of the virtual environment can be rendered, for example, by the display device attached to the head of the user (eg comprising a mobile communication device).
  • The determination of the desired viewing direction for a time t0 can be carried out, for example, by a back-end of the VR system and then sent to the mobile communication device attached to the head of the user. From the desired viewing direction in the representation of the virtual environment for the time t0 and the current viewing direction in the representation of the virtual environment at the time t0, the mobile communication device can then determine the rotational offset of the viewing direction about the yaw axis of the representation of the virtual environment at the time t0.
  • The mobile communication device can then, e.g., rotate, i.e. correct, the viewing direction in the representation of the virtual environment for a later time t1 by the rotational offset of the viewing direction determined at the time t0. Accordingly, the representation of the virtual environment can be output to the user with a correct viewing direction.
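As an illustration of this correction step (added here; the variable names and the degree representation are assumptions, not taken from the patent), the rotational offset can be computed as a signed yaw difference and then applied to a later frame:

```python
def signed_yaw_offset(desired_yaw_deg, current_yaw_deg):
    """Smallest signed difference desired - current, wrapped to [-180, 180) degrees."""
    return (desired_yaw_deg - current_yaw_deg + 180.0) % 360.0 - 180.0

# Offset determined at time t0 from desired and rendered viewing direction ...
offset_t0 = signed_yaw_offset(desired_yaw_deg=87.0, current_yaw_deg=95.0)  # -> -8.0 deg

# ... and applied to the rendered viewing direction of a later frame at time t1.
yaw_t1_uncorrected = 102.0
yaw_t1_corrected = (yaw_t1_uncorrected + offset_t0) % 360.0                # -> 94.0 deg
```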
  • the method described above may be repeatedly executed during use of the VR system.
  • A further drift of the viewing direction in the representation of the virtual environment between the time t0 and the later time t1 can be corrected.
  • the method can be at least partially executed again in order to verify the previous correction.
  • FIG. 2 shows by way of example how the orientation of a recording device in the real environment can be determined based on a picture 210 of a known object 220 and a known position C of the recording device in the real environment.
  • the object 220 can be thought of as a set of world points M.
  • The recording 210 can be understood as a set of pixels m.
  • The orientation of the recording device, and thus the viewing angle of the recording 210, can generally be determined from a transformation which converts the world points M of the object 220 into corresponding pixels m of the recording 210.
  • the transformation can be represented as follows:
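The equation itself did not survive this extraction. In the standard pinhole-camera form (used here only as a stand-in for the patent's own notation, with a scale factor s and an intrinsic matrix K assumed), such a transformation typically reads

$$s \, m = K \, [\, R \mid t \,] \, M, \qquad t = -R\,C,$$

where M is a world point in homogeneous coordinates, m the corresponding pixel, R the rotation matrix describing the orientation of the recording device and C its known position.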
  • R represents a general rotation matrix.
  • R can be represented as a product of three rotation matrices R_x, R_y and R_z about mutually orthogonal unit directions X, Y and Z. For example, from a defined point of origin in the real world, X may point to the right, Y upward (i.e., skyward), and Z into the depth (i.e., forward).
  • the unit direction Y thus corresponds to the yaw axis (vertical axis), ie a rotation about this axis shifts a recording horizontally.
  • the axes of the virtual environment coordinate system may be selected differently from the mutually orthogonal unit directions X, Y and Z.
  • a position in the real environment can then be translated via a coordinate transformation into a position in the virtual environment. Accordingly, equation (1) can be reshaped as follows:
  • The three rotation matrices R_x, R_y and R_z can thereby be expressed in dependence on an angle α, which indicates the sought horizontal orientation of the recording device in the real environment.
  • The angle α defines an orientation of the recording device in the plane spanned by X and Z.
  • The rotation matrices R_x, R_y and R_z are defined as usual:
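The definitions are missing from this extraction; the usual forms (reproduced here from standard convention, with β and γ as the pitch and roll angles and α as the sought yaw) are

$$R_x(\beta)=\begin{pmatrix}1&0&0\\ 0&\cos\beta&-\sin\beta\\ 0&\sin\beta&\cos\beta\end{pmatrix},\quad R_y(\alpha)=\begin{pmatrix}\cos\alpha&0&\sin\alpha\\ 0&1&0\\ -\sin\alpha&0&\cos\alpha\end{pmatrix},\quad R_z(\gamma)=\begin{pmatrix}\cos\gamma&-\sin\gamma&0\\ \sin\gamma&\cos\gamma&0\\ 0&0&1\end{pmatrix}.$$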
  • R_y(α) can be determined from a corresponding M ↔ m correspondence pair.
  • R_x and R_z can be determined, for example, by means of the sensors already present in the mobile communication device (e.g. via the gravitation vector).
  • the coefficients from equation (3) can be summarized as follows:
  • equation (3) can be represented as follows:
  • equation (10) can be reworded as follows:
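Since the intermediate equations are not reproduced in this extraction, the following sketch (added here; the intrinsic matrix, coordinates and function name are illustrative assumptions) shows one simplified way the yaw α could be recovered from a single M ↔ m pair once pitch and roll have been compensated, e.g. from the gravitation vector:

```python
import numpy as np

def estimate_yaw(K, C, M, m):
    """Yaw of a (pitch/roll-compensated) recording device from one M <-> m pair.

    K -- 3x3 intrinsic camera matrix
    C -- known position of the recording device in world coordinates, shape (3,)
    M -- world point of the object, shape (3,); axes: X right, Y up (yaw axis), Z forward
    m -- pixel of M in the recording, homogeneous coordinates (u, v, 1)
    """
    ray_cam = np.linalg.inv(K) @ m                 # viewing ray in camera coordinates
    az_cam = np.arctan2(ray_cam[0], ray_cam[2])    # azimuth of the ray in the X/Z plane
    d_world = M - C                                # direction from device to world point
    az_world = np.arctan2(d_world[0], d_world[2])  # azimuth of that direction in the world
    return np.degrees(az_world - az_cam)           # horizontal orientation alpha of the device

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
alpha = estimate_yaw(K,
                     C=np.array([0.0, 1.7, 0.0]),
                     M=np.array([2.0, 1.7, 5.0]),
                     m=np.array([480.0, 240.0, 1.0]))
```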
  • In FIGS. 3a to 3d, some examples of possible objects are shown.
  • Fig. 3a shows an amorphous pattern 310 with different gray levels
  • Fig. 3b shows a pattern 320 with circles of different sizes and gray levels
  • Fig. 3c shows a picture 330 of trees
  • Fig. 3d shows a collage 340 of character strings (e.g. words or numbers)
  • the object can be varied.
  • the patterns shown in FIGS. 3 a to 3 d may, for example, be mounted on a vertical plane in the real environment (eg by means of a poster or by means of projection).
  • the patterns shown in Figures 3a to 3d may be displayed in the form of a poster or as a projection on a side wall of a room or hall.
  • determining the orientation of the capture device in the real environment includes determining a transformation that matches at least a portion of the image of the known object with at least a portion of a comparison capture.
  • the comparison photograph can provide information about the position of the known object in the real environment.
  • a database can be maintained with comparison recordings that show different objects or an object from different perspectives.
  • Information about the position of the object shown in it in the real world is stored for each comparison photograph. This information corresponds to the world points M in the example of FIG. 2.
  • FIG. 4 shows a pattern with circles of different sizes and gray levels in the form of a poster 400 attached to a wall of the real environment as an example of an object.
  • the pattern includes a plurality of features 410-1, 410-2, ..., 410-n.
  • the features 410-1, 410-2, ..., 410-n may be determined via feature extraction methods.
  • feature extraction methods include the Scale-Invariant Feature Transform (SIFT) algorithm, the Speeded Up Robust Features (SURF) algorithm or the Binary Robust Independent Elementary Features (BRIEF) algorithm.
  • The recording device is part of a display device attached to the head of the user for outputting the representation of the virtual environment, so that by measuring the position of the head of the user during operation of the VR system, the position of the recording device is approximately (substantially) known.
  • At least one feature of the object is detected.
  • a feature extraction method is applied to the image (for example, one of the above-mentioned algorithms).
  • Furthermore, a position of the feature in the recording is determined.
  • the coordinates of the feature in the coordinate system of the recording are determined.
  • a comparison feature of the plurality of comparison features of the database is identified, which corresponds to the feature of the object in the recording.
  • the plurality of comparison features is in each case assigned a position in the real environment.
  • For the identification, e.g., known image registration methods or a nearest-neighbour search can be used.
  • From the known position of the recording device in the real environment, the position of the feature in the recording and the position in the real environment associated with the identified comparison feature, the orientation of the recording device in the real environment can be determined according to the principles shown in FIG. 2.
  • The known position from which the poster 400 is recorded corresponds to the known position C of the recording device in the real environment.
  • The position of the feature in the recording corresponds to a pixel m, and the position in the real environment associated with the identified comparison feature corresponds to a world point M.
  • Thus, a transformation can be determined which relates the position of the feature in the recording to the position in the real environment associated with the identified comparison feature.
  • From this, the orientation of the recording device in the real environment can be determined.
  • A plurality of features can be detected in a recording of an object.
  • multiple features of the object can be detected.
  • a plurality of comparison features can be identified from the database for the plurality of recognized features of the object.
  • an orientation of the recording device in the real environment can be determined for the several recognized features of the object.
  • An exemplary assignment 500 of features of the object in the recording to comparison features is shown in FIG. 5.
  • FIG. 5 shows in each case the position of a feature of the object (pixel) detected in the image of the object and the position in the real environment (world point) associated with the respective identified comparison feature.
  • the positions are given in arbitrary units.
  • Corresponding pixels and world points are connected in FIG. 5 with a straight line.
  • The slope of the respective straight lines (with the exception of straight lines 501 to 507) is approximately similar.
  • For assigning the pixels to the world points, e.g. brute-force or Fast Library for Approximate Nearest Neighbors (FLANN) based algorithms can be used.
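By way of illustration (a sketch added to this write-up, not code from the patent; it assumes OpenCV with SIFT available and treats the reference image as one comparison recording of the database whose keypoints carry stored world positions), feature detection and FLANN-based matching could look as follows:

```python
import cv2
import numpy as np

def match_features(capture_gray, reference_gray, ratio=0.75):
    """Detect features in a capture and match them against one comparison recording."""
    sift = cv2.SIFT_create()
    kp_cap, des_cap = sift.detectAndCompute(capture_gray, None)
    kp_ref, des_ref = sift.detectAndCompute(reference_gray, None)

    # FLANN-based nearest-neighbour search (KD-tree index for float descriptors).
    flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
    knn = flann.knnMatch(des_cap, des_ref, k=2)

    # Lowe's ratio test keeps only unambiguous pixel <-> comparison-feature pairs.
    good = [pair[0] for pair in knn
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]
    pixels = np.float32([kp_cap[m.queryIdx].pt for m in good])      # m in the capture
    ref_pixels = np.float32([kp_ref[m.trainIdx].pt for m in good])  # matched comparison features
    return pixels, ref_pixels
```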
  • In FIG. 6, the frequencies of the orientations of the recording device in the real environment determined for the several recognized features of the object are plotted.
  • The orientation is plotted in the form of the angle α, which indicates the rotation about the vertical axis.
  • The angle α thus corresponds to an orientation of the recording device in the plane spanned by X and Z, i.e. a rotation around Y.
  • the frequency is plotted logarithmically.
  • In the example shown, the angle α was determined to be about -65° for some recognized features of the object, about 60° for some, about 64° for some, about 90° for considerably more recognized features and about 91° for even more recognized features of the object.
  • From the orientations of the recording device in the real environment determined for the several recognized features of the object, that orientation which fulfils a quality criterion is determined as the orientation of the recording device in the real environment. Referring to the example of Fig. 6, for instance, the one-degree-wide interval (bin) of the histogram having the largest number of entries may be selected.
  • the quality criterion can therefore be, for example, that the orientation of the recording device in the real environment is the most frequently determined orientation.
  • Other quality criteria can be used as well. For example, it may be required that the selected interval has a minimum number of entries, or that the selected bin represents at least a predetermined proportion of the plurality of detected features of the object (i.e. at least the predetermined proportion of the plurality of detected features must have the respective determined orientation).
  • In the example of Fig. 6, the frequencies for 90° and 91° are dominant and of a similar absolute magnitude, so that both orientations can fulfil a selected quality criterion. Accordingly, for adjacent or similar orientations (i.e. for adjacent bins, or bins that are separated by only a small number of intervening bins), the average of the two orientations may also be determined as the orientation of the recording device in the real environment. Optionally, a weighting of the adjacent or similar orientations may also be applied (e.g. according to their frequency).
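A minimal sketch of this selection step (added here; the bin width, the minimum-share criterion and the neighbour averaging are illustrative choices, not values from the patent):

```python
import numpy as np

def select_orientation(angles_deg, min_share=0.2):
    """Pick the device orientation from per-feature yaw estimates via a 1-degree histogram."""
    angles = np.asarray(angles_deg) % 360.0
    hist, edges = np.histogram(angles, bins=360, range=(0.0, 360.0))
    best = int(np.argmax(hist))

    # Consider the dominant bin together with its direct neighbours (adjacent orientations)
    # and form a frequency-weighted circular mean of the corresponding bin centres.
    idx = [(best - 1) % 360, best, (best + 1) % 360]
    weights = hist[idx]
    if len(angles) == 0 or weights.sum() < min_share * len(angles):
        return None                       # quality criterion not fulfilled, no correction
    centers = np.radians(edges[idx] + 0.5)
    mean = np.arctan2(np.average(np.sin(centers), weights=weights),
                      np.average(np.cos(centers), weights=weights))
    return float(np.degrees(mean) % 360.0)
```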
  • Alternatively, determining the orientation of the recording device in the real environment may include determining a comparison recording from a plurality of comparison recordings of a database.
  • The determination of the comparison recording from the plurality of comparison recordings of the database is based on the known position of the recording device in the real environment. In other words, a comparison recording is selected from the database for which, due to the position of the recording device, there is a high probability that it shows the object at all, or that it shows the object from a similar or identical perspective.
  • An orientation of at least the selected comparison recording in the real environment is known.
  • the orientation in the real environment can also be stored in the database for each of the plurality of comparison images.
  • determining the orientation of the recording device in the real environment comprises determining a rotation of the recording of the object relative to the comparison recording. That is, image registration of the image of the object against the comparative image is performed.
  • image registration methods such as the Enhanced Correlation Coefficient (ECC) algorithm can be used.
  • For this purpose, the recording of the object can, e.g., be gradually rotated relative to the comparison recording, as indicated by the sequence of recordings 701 to 710 in Fig. 7.
  • a skylight is depicted as an example of an object in the real environment in which the user is moving.
  • From left to right, the recordings are each rotated by 1° counterclockwise relative to one another.
  • The ECC algorithm determines a correlation with the comparison recording for each rotation. Subsequently, the best correlation is chosen and a corresponding transformation matrix is determined. From the transformation matrix, in turn, the orientation, i.e. the rotation, of the recording of the object relative to the comparison recording can be determined.
  • Since the orientation of the comparison recording in the real environment is known, the orientation of the recording device in the real environment can then be determined by combining the two pieces of information.
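As an illustration of this step (a sketch added here, not the patent's implementation; it assumes OpenCV's findTransformECC, grayscale inputs of equal size, and an exhaustive 1° search, all of which are choices of this write-up):

```python
import cv2
import numpy as np

def rotation_to_reference(capture_gray, reference_gray, step_deg=1.0):
    """Rotation of the capture relative to a comparison recording via the ECC criterion."""
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-4)
    h, w = capture_gray.shape[:2]
    best_angle, best_score = 0.0, -np.inf

    for angle in np.arange(0.0, 360.0, step_deg):
        # Gradually rotate the capture, as in the sequence of recordings 701 to 710.
        rot = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, 1.0)
        candidate = cv2.warpAffine(capture_gray, rot, (w, h))
        warp = np.eye(2, 3, dtype=np.float32)  # residual Euclidean warp refined by ECC
        try:
            score, _ = cv2.findTransformECC(reference_gray, candidate, warp,
                                            cv2.MOTION_EUCLIDEAN, criteria, None, 5)
        except cv2.error:
            continue                           # ECC did not converge for this candidate
        if score > best_score:                 # keep the rotation with the best correlation
            best_angle, best_score = angle, score
    return best_angle
```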
  • The second approach may be used, for example, for objects located vertically above the user.
  • For this, the recording device may be oriented perpendicular to the user's straight-ahead viewing direction in the real environment.
  • For example, the recording device can be aligned toward the sky or toward the ceiling of a room or hall in which the user moves.
  • the object may be, for example, a lighting device, a (roof) window, a carrier, a beam on the ceiling of the room or the hall.
  • the plurality of comparison recordings in the database may include, for example, various recordings of the ceiling of the room or the hall.
  • The recording device may also be oriented toward the floor of a room or hall in which the user is moving.
  • the object may then be, for example, an in-ground light source (laser, LED) or a marker (eg an emergency marker), such as an arrow (eg, a glowing arrow).
  • an object according to the present disclosure may also be a specially designated object, such as a chroma keying object.
  • the display device may further comprise the recording device.
  • the display device may in turn comprise a mobile communication device together with a fixing device for attaching the mobile communication device to the user's head.
  • a camera of the mobile communication device can be used as a recording device.
  • a calibration of the representation of the virtual environment can be made possible.
  • a periscope-type device is used, one opening of which is oriented toward the ceiling or the floor and the other opening is oriented towards the lens of the camera of the mobile communication device.
  • the incident light beams can be deflected from the original direction of incidence (perpendicular to the first opening) to the desired outflow direction (perpendicular to the second opening).
  • Furthermore, the plurality of comparison recordings can be binary recordings. Accordingly, determining the rotation of the recording of the object relative to the comparison recording comprises converting the recording of the object into a binary recording of the object and determining the rotation of the binary recording of the object relative to the comparison recording. The rotation of the binary recording of the object relative to the comparison recording can in turn be determined using the image registration methods described above.
  • the resolution of the plurality of comparison exposures may be limited (e.g., 320x240 pixels) to save computational power.
  • the method may include scaling the image of the object, i. the original resolution is scaled to a target resolution (for example, from 1920x1080 pixels to 320x240 pixels). As indicated, the target resolution may be lower than the original resolution. Due to the reduced number of pixels in the image of the object, computing time can be saved.
  • Alternatively, the orientation of the object in the recording can also be determined and compared with a reference direction (e.g. according to the methods described in connection with FIGS. 9a to 9d) in order to determine the orientation of the recording device in the real environment.
  • A method 800 for adjusting a viewing direction in a representation of a virtual environment according to a second aspect of the present disclosure is shown in FIG. 8.
  • The method 800 comprises recording 802 a known object in a real environment with a recording device, as described above. That is, the recording may be, e.g., a still image, a video or a sound recording. Accordingly, the recording device may comprise a still camera, a video camera, a sound recording device or a combination thereof.
  • the method 800 comprises determining 804 a rotational offset of the viewing direction in the representation of the virtual environment about the yaw axis of the representation of the virtual environment based on the image of the object and a current viewing direction in the representation of the virtual environment.
  • The current viewing direction in the representation of the virtual environment can be provided, for example, by a display device (e.g. HMD, HMU) which outputs the representation of the virtual environment to the user (and optionally also calculates it), or by a computer that calculates the virtual environment (e.g. the back-end of a VR system).
  • the recording device can - as described above - be arranged spatially in the immediate vicinity of the user (eg on the head of the user).
  • the orientation of the recording device in the real environment can be determined, which can be approximated as the orientation of the user's head in the real environment. From this, by using the information about the instantaneous viewing direction in the representation of the virtual environment, the rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment can be determined.
  • the method 800 further includes rotating 806 the viewing direction in the representation of the virtual environment about the rotational offset.
  • the viewing direction in the representation of the virtual environment is corrected by a rotation about the yaw axis of the representation of the virtual environment, the direction and the magnitude of the rotation being determined by the rotational offset.
  • the representation of the virtual environment is thus corrected by the rotational offset.
  • the representation of the virtual environment can thus be adapted to the actual position and orientation of the user's head in the real environment. Consequently, the method 800 thus also permits a calibration of the viewing direction in the representation of the virtual environment.
  • With the method 800, too, an incorrectly determined orientation in the real environment, or a drift of the viewing direction in the representation of the virtual environment resulting from measurement errors of the sensors commonly used to determine the position and orientation of (the head of) a user, can be corrected.
  • the method 800 may further include outputting the representation of the virtual environment to a user.
  • The outputting of the representation of the virtual environment to the user can take place, for example, via a display device attached to the head of the user, which further comprises the recording device.
  • the orientation of the recording device in the real environment may be approximately assumed to be the orientation of the user's head in the real environment.
  • the display device attached to the user's head includes a mobile communication device (eg, a smartphone).
  • sensors already present in the mobile communication device can be used to determine the orientation of the user's head in the real environment.
  • By using the camera of the mobile communication device as the recording device, a rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment due to measurement errors of the sensors of the mobile communication device can be corrected.
  • the method 800 may be performed immediately (ie, online) on the mobile communication device. The method 800 can thus also enable a calibration of the representation of the virtual environment without additional hardware components.
  • A portion of the method 800 may be performed by the mobile communication device, and another portion of the method 800, such as determining 804 the rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment, may be performed by an already existing back-end of the VR system that the user uses (i.e. offline).
  • the particular rotation offset can then be sent, for example, from the back-end to the mobile communication device, so that it can rotate the viewing direction in the current representation of the virtual environment by the rotational offset.
  • the above-described functionality can be implemented, for example, by updating one or more existing software components of the VR system (eg software for the mobile communication device or software for the back-end).
  • Determining 804 the rotational offset of the viewing direction in the representation of the virtual environment about the yaw axis of the representation of the virtual environment may include determining an orientation of the recording device in the real environment based on the recording of the object and a reference direction (example methods for this are explained below).
  • the reference direction is a direction in the real environment whose orientation relative to that of the object is known. In other words, one uses knowledge about the orientation of the object relative to the reference direction for determining the orientation of the recording device in the real environment.
  • the determining 804 of the rotational offset may include determining a desired viewing direction in the representation of the virtual environment based on the orientation of the recording device in the real environment.
  • the specific orientation of the recording device in the real environment can be provided to an algorithm for calculating the representation of the virtual environment, which calculates a representation of the virtual environment based thereon.
  • the target viewing direction in the virtual environment may be that viewing direction in the virtual environment which corresponds to the actual position and orientation of the user's head in the real environment.
  • For example, the recording device can be aligned perpendicular to the viewing direction of the user in the real environment.
  • the viewing direction in the calculated representation of the virtual environment can consequently be regarded as a desired viewing direction.
  • the rotational offset of the viewing direction in the depiction of the virtual environment around the yaw axis of the representation of the virtual environment is now determined according to exemplary embodiments. This can be done, for example, by comparing the desired viewing direction and the current viewing direction in the representation of the virtual environment. In other words, it is determined how much the instantaneous viewing direction in the representation of the virtual environment relative to the desired viewing direction in the representation of the virtual environment is rotated about the yaw axis of the representation of the virtual environment.
  • With reference to FIGS. 9a to 9d, two different approaches for determining the orientation of the recording device in the real environment on the basis of an orientation of the object in the recording and a reference direction are explained below by way of example.
  • A recording 900 of the roof of a hall in which the user moves in the real environment is shown in FIG. 9a.
  • The recording 900 shows part of an elongated illumination device 910, which represents an exemplary object.
  • the object is not limited to elongated lighting devices.
  • The object can also be a window, a support or a pattern on the ceiling of the hall or, more generally, of a room in the real environment in which the user moves.
  • In general, the object can be an object located vertically above the user.
  • For this, the recording device can be aligned perpendicular to the user's straight-ahead viewing direction in the real environment, i.e. the recording device can be oriented toward the sky or toward the ceiling.
  • The recording device may also be oriented toward the floor of a room or hall in which the user is moving.
  • the object may then be, for example, an in-ground light source (laser, LED) or a marker (eg an emergency marker), such as an arrow (eg, a glowing arrow).
  • For example, a camera of the mobile communication device can be used as the recording device.
  • For this purpose, a periscope-like device can be used, one opening of which is oriented toward the ceiling or the floor and the other opening of which is aligned with the lens of the camera of the mobile communication device.
  • Determining the orientation of the recording device in the real environment comprises, according to the first approach, converting the recording of the object into a binary recording of the object.
  • the binary recording 900 'corresponding to the recording 900 is shown in FIG. 9b.
  • an environment-dependent threshold value for the separation between the two possible states in the binary recording can be optionally determined or defined, for example.
  • Furthermore, the method comprises recognizing candidates for the object in the binary recording of the object. In the binary recording 900', this is the area 910' that corresponds to the elongated lighting device 910. Although only one candidate 910' for the object is contained in the binary recording 900' shown in FIG. 9b, a plurality of candidates may in general be recognized.
  • the method further comprises determining a respective (linear) eccentricity e of the candidates for the object. That is, an eccentricity is determined for each of the recognized candidates.
  • The determined linear eccentricity allows one to estimate whether the respective candidate is a more circular (e ≈ 0) or a more elongated (e ≈ 1) object.
  • In the example, the eccentricity of the area 910', which is the only candidate in the recording, is determined. Since the region 910' is elongated, a value of the eccentricity of approximately one is determined for it.
  • Furthermore, the method comprises determining, as the orientation of the object in the recording, the orientation of the main axis of that candidate whose eccentricity is above a threshold value and whose main axis is longer than the main axes of the other candidates for the object with an eccentricity above the threshold value.
  • The determined eccentricity of each candidate is thus compared to a threshold in order to determine those candidates that represent an elongated object.
  • the threshold value can therefore be 0.5, 0.55, 0.6, 0.65, 0.7, 0.75, 0.8, 0.85 or 0.9.
  • the one with the longest major axis is selected.
  • the orientation of this candidate in the shot is determined as the orientation of the object in the shot.
  • The orientation of the candidate in the recording may be determined, for example, relative to an auxiliary vector 920, where the auxiliary vector 920 indicates the user's straight-ahead viewing direction. Defining the auxiliary vector 920 along the straight-ahead direction of the user makes it possible to regard the orientation of the recording device as substantially identical to the orientation of the user in the real environment. Correspondingly, from the determined orientation of the recording device, that viewing direction in the virtual environment which corresponds to the actual position and orientation of the user's head in the real environment can be determined as the desired viewing direction in the representation of the virtual environment. For the region 910', it can therefore be determined that the orientation 930 of its main axis subtends an angle of 89° with respect to the auxiliary vector 920.
  • The orientation 930 of the main axis of the region 910' is thus determined to be the orientation of the object in the recording.
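A minimal sketch of this first approach (added for illustration; Otsu thresholding stands in for the environment-dependent threshold, and image moments are used for eccentricity and main-axis orientation, both choices of this write-up rather than the patent):

```python
import cv2
import numpy as np

def object_orientation(capture_gray, ecc_threshold=0.8):
    """Main-axis orientation (deg) of the most elongated bright candidate in a ceiling capture."""
    # Convert the capture into a binary recording; Otsu stands in for an environment-dependent threshold.
    _, binary = cv2.threshold(capture_gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)

    best = None  # (major-axis eigenvalue, orientation in degrees)
    for contour in contours:                        # each contour is a candidate for the object
        mom = cv2.moments(contour)
        if mom["m00"] == 0:
            continue
        mu20, mu02, mu11 = mom["mu20"], mom["mu02"], mom["mu11"]
        common = np.sqrt(4.0 * mu11 ** 2 + (mu20 - mu02) ** 2)
        lam1 = (mu20 + mu02 + common) / 2.0         # eigenvalue along the main axis
        lam2 = (mu20 + mu02 - common) / 2.0         # eigenvalue along the minor axis
        if lam1 <= 0:
            continue
        ecc = np.sqrt(max(0.0, 1.0 - lam2 / lam1))  # ~0 for circular, ~1 for elongated candidates
        if ecc < ecc_threshold:
            continue
        angle = 0.5 * np.degrees(np.arctan2(2.0 * mu11, mu20 - mu02))
        if best is None or lam1 > best[0]:          # elongated candidate with the longest main axis
            best = (lam1, angle)
    return None if best is None else best[1]        # compared against the auxiliary vector / reference direction
```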
  • The orientation of the recording device can thus be determined from the orientation of the object in the recording.
  • the reference direction is a direction in the real environment whose orientation relative to that of the object is known. In other words, the orientation (orientation) of the object relative to the reference direction is known.
  • the reference direction can be determined for example for a known environment or determined from reference images.
  • a corner of the base area can be defined as the origin.
  • three orthogonal spatial axes X, Y and Z can be defined (analogous to the example of FIG. 2).
  • X may point to the right from the defined origin in the corner of the base area (ie extend substantially along a first boundary of the base area), Y may point upward (ie towards the sky, standing substantially perpendicular to the base area), and Z may point into the depth (ie forward, extending substantially along a second boundary of the base area that is orthogonal to the first boundary of the base area).
  • the spatial direction Y thus corresponds to the yaw axis, ie a rotation about this axis shifts a recording horizontally.
  • the spatial axis Z, which runs essentially along the second boundary of the base area, can then be selected as the reference direction.
  • the orientation of the object - in the example of FIG. 9a, thus of the elongate illumination device 910 - relative to the reference direction is known.
  • the elongate illumination device 910 can extend orthogonally to the reference direction Z, ie parallel to the spatial direction X.
  • the orientation of the recording device can now be determined.
  • the orientation of the auxiliary vector 920 in the real environment is thus determined.
  • an orientation of the recording device can thus be determined in the plane defined by the spatial directions X and Z.
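  • as a sketch of this last step, the yaw of the recording device in the X–Z plane can be combined from the angle determined in the recording and the known angle of the object relative to the reference direction; the sign convention and the default 90° angle between the lighting device and the reference direction Z are assumptions of the example:

```python
def device_yaw_from_recording(object_angle_in_image_deg: float,
                              object_angle_to_reference_deg: float = 90.0) -> float:
    """Yaw of the recording device in the X-Z ground plane, in degrees relative to
    the reference direction Z.

    object_angle_in_image_deg:     angle between the object's main axis and the helper
                                   vector (device straight-ahead) in the recording.
    object_angle_to_reference_deg: known angle between the object and the reference
                                   direction Z in the real environment (90 deg if the
                                   lighting device runs along X).
    """
    yaw = object_angle_to_reference_deg - object_angle_in_image_deg
    # Wrap to [-180, 180) so comparisons with other yaw estimates are well defined.
    return (yaw + 180.0) % 360.0 - 180.0
```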
  • the resolution of the recording to be evaluated may be limited (eg to 320x240 pixels).
  • the method may include scaling the image of the object, ie, the original resolution is scaled to a target resolution (eg from 1920x1080 pixels to 320x240 pixels). As indicated, the target resolution may be lower than the original resolution. Due to the reduced number of pixels in the image of the object, computing time can be saved.
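  • a correspondingly simple downscaling step might look as follows; OpenCV is assumed, and INTER_AREA is merely a sensible interpolation choice for shrinking, not mandated by the description:

```python
import cv2

def scale_to_target(recording, target_size=(320, 240)):
    """Scale the recording from its original resolution (eg 1920x1080) down to the
    lower evaluation resolution to save computing time."""
    return cv2.resize(recording, target_size, interpolation=cv2.INTER_AREA)
```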
  • with reference to FIGS. 9c and 9d, the second approach for determining the orientation of the recording device in the real environment on the basis of an orientation of the object in the recording and a reference direction is explained below.
  • FIG. 9c shows a recording 940 of the roof of a hall in which the user moves in the real environment.
  • the recording 940 shows a (linear) arrangement of circular illumination devices 951, 952, 953, which represents an exemplary object.
  • the object is not limited to an arrangement of circular lighting devices.
  • the object may generally be any arrangement of circular objects on the ceiling of the hall or, more generally, a room in the real environment in which the user is moving.
  • more generally, the object can be any object that extends vertically above the user.
  • the recording device may also be oriented toward the floor of a room or hall in which the user is moving.
  • the object may then be, for example, an in-ground light source (laser, LED) or a marker, such as an arrow.
  • the recording device can be aligned vertically to a straight-ahead line of sight of the user in the real environment, ie the recording device can be directed towards the sky (ie towards the ceiling) or towards the ground.
  • a display device attached to the head of the user may be used which in turn comprises a mobile communication device together with a fixing device for attaching the mobile communication device to the user's head.
  • a camera of the mobile communication device can be used as the recording device.
  • a calibration of the representation of the virtual environment can be made possible.
  • a periscope-like device can be used, one opening of which is oriented towards the ceiling or the floor, while the other opening is aligned with the lens of the camera of the mobile communication device.
  • the determination of the orientation of the recording device in the real environment again involves converting the recording of the object into a binary recording of the object.
  • the binary recording 940' corresponding to the recording 940 is shown in FIG. 9d.
  • an environment-dependent threshold value for the separation between the two possible states in the binary recording can be optionally determined or defined, for example.
  • the method comprises recognizing circular objects in the binary recording of the object whose radii lie within a predetermined value range. In other words, only circular objects are detected whose radius is greater than a first threshold value and less than a second threshold value.
  • the thresholds may be selected based on information about the real environment in which the user is moving (e.g., height of the roof or spacing of the lighting devices from the ground, dimensions of the lighting devices).
  • for recognizing circular objects in the binary recording, e.g. a Circular Hough Transform (CHT) based algorithm can be used.
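  • a minimal sketch of such a CHT-based detection with a bounded radius range, using OpenCV's Hough-gradient implementation; all parameter values are illustrative assumptions that would have to be tuned to the hall and the camera:

```python
import cv2
import numpy as np

def detect_circular_objects(binary: np.ndarray, min_radius: int, max_radius: int):
    """Detect circular objects whose radius lies inside [min_radius, max_radius].

    Returns a list of (x, y, r) tuples in pixel coordinates; candidates outside the
    radius range (eg optical interference effects) are rejected by the transform itself.
    """
    circles = cv2.HoughCircles(binary, cv2.HOUGH_GRADIENT,
                               dp=1.5, minDist=2 * min_radius,
                               param1=100, param2=20,
                               minRadius=min_radius, maxRadius=max_radius)
    if circles is None:
        return []
    return [(float(x), float(y), float(r)) for x, y, r in circles[0]]
```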
  • the circular objects 951', 952' and 953' corresponding to the arrangement of circular illumination devices 951, 952, 953 in the recording 940 are recognized.
  • the bright regions 952" and 953" in the binary recording 940', which are adjacent to the circular objects 952' and 953', are not recognized as circular objects since they do not fulfill the radius criterion.
  • optical interference effects are excluded from the further process for determining the orientation of the recording device.
  • the method comprises determining the distances of the circular objects from each other. For example, the center points of the circular objects 951', 952' and 953' can be determined and the distances between these center points can be calculated. The radii of the respective circular objects can also be included in the distance determination.
  • determining the orientation of the recording device in the real environment according to the second approach comprises determining the orientation of the object in the recording based on the distances of the circular objects from each other. From these distances, a relation between the individual circular objects can be determined. For this, information about the real environment in which the user is moving can in turn be used.
  • for example, the distances between the individual illumination devices of the linear arrangement of circular illumination devices 951, 952, 953 in the real environment, as well as the distance of this linear arrangement to a further linear arrangement of circular illumination devices (not shown in Fig. 9c), may be used.
  • information about the geometry and nature of the area detectable by the recording device in the real environment (eg the ceiling of a room or hall) can also be used.
  • the distances of the circular objects 951', 952' and 953' from each other are determined to correspond to the distances of a linear arrangement of lighting devices in the real environment; the circular objects 951', 952' and 953' thus represent a known object in the real environment. Accordingly, from the respective positions of the circular objects 951', 952' and 953' in the binary recording, a direction vector 970 of the object represented by these circular objects is determined in the recording. For this purpose, e.g. a straight line is fitted to the centers of the circular objects 951', 952' and 953'.
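  • a small sketch of such a line fit through the circle centers, assuming NumPy; the auxiliary vector is assumed to lie along the image x axis, and the principal direction is obtained from an SVD of the centered points:

```python
import math
import numpy as np

def direction_vector_from_centres(centres):
    """Fit a straight line through the centres of the detected circular objects and
    return its unit direction vector together with its angle (degrees) against an
    auxiliary vector assumed to point along the image x axis."""
    pts = np.asarray(centres, dtype=float)      # shape (N, 2): one (x, y) per circle
    mean = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - mean)        # principal direction of the point set
    direction = vt[0]                           # unit vector along the fitted line
    angle = math.degrees(math.atan2(direction[1], direction[0])) % 180.0
    return direction, angle

# Example: three circle centres lying roughly on a line.
# direction, angle = direction_vector_from_centres([(120, 80), (200, 136), (280, 192)])
```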
  • the orientation of the direction vector 970 (ie of the object) in the recording can again be determined, for example, on the basis of an auxiliary vector 960 which, for example, indicates the user's straight-ahead line of sight.
  • the orientation of the object subtends an angle of 35° with respect to the auxiliary vector 960. That is, the linear arrangement of circular illumination devices 951, 952, 953 represented by the direction vector 970 is rotated by 35° relative to the auxiliary vector 960.
  • the orientation of the direction vector 970 is determined as the orientation of the object in the recording.
  • the orientation of the recording device can be determined from the orientation of the object in the recording.
  • the reference direction is a direction in the real environment whose orientation relative to that of the object is known.
  • the spatial axis Z, for example, which in turn extends along the second boundary of the base area, can again be selected as the reference direction.
  • the orientation of the object - in the example of Fig. 9c thus the linear arrangement of circular lighting devices 951, 952, 953 - relative to the reference direction is known.
  • the linear arrangement of circular illumination devices 951, 952, 953 may extend orthogonally to the reference direction Z, ie parallel to the spatial direction X.
  • the orientation of the recording device can now be determined.
  • the orientation of the auxiliary vector 960 in the real environment is thus determined.
  • an orientation of the recording device can thus be determined in the plane defined by the spatial directions X and Z.
  • the resolution of the recording to be evaluated may be limited (for example to 320x240 pixels). Accordingly, the method may include scaling the recording of the object, ie the original resolution is scaled to a target resolution (for example from 1920x1080 pixels to 320x240 pixels). As indicated, the target resolution may be lower than the original resolution. Due to the reduced number of pixels in the recording of the object, computing time can be saved.
  • the recording device makes recordings in a plane which is substantially orthogonal to the plane in which the user moves. For example, the user moves in a hall and the recording device takes pictures of the ceiling of the hall at an angle of substantially 90°.
  • the recording device may also be inclined relative to the ceiling (eg when the recording device is attached to the user's head and pitches or tilts with the head).
  • instantaneous measurements of a gyroscope and/or accelerometer of a mobile communication device used as display device for the virtual environment as well as recording device may be used quite generally (ie in all embodiments of the present disclosure) to determine whether there is a suitable moment for taking a recording. For example, it can be specified that the recording device only takes recordings within a certain value range of the measured values. This can prevent recordings from being evaluated that show motion blur or other image distortions of the real environment due to the orientation of the recording device. It can thus be avoided that an erroneous orientation of the recording device in the real environment is determined and the viewing direction in the virtual environment is consequently rotated by a faulty rotational offset.
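  • a sketch of such a value-range check; the sensor quantities, units and limit values are assumptions of the example and not values from the description:

```python
def is_good_capture_moment(gyro_rates_dps, pitch_deg, roll_deg,
                           max_rate_dps=20.0, max_tilt_deg=10.0):
    """Decide whether the current gyroscope/accelerometer readings lie inside a value
    range in which a sharp, roughly ceiling-orthogonal recording can be expected."""
    if any(abs(rate) > max_rate_dps for rate in gyro_rates_dps):
        return False        # head is turning too fast, motion blur is likely
    if abs(pitch_deg) > max_tilt_deg or abs(roll_deg) > max_tilt_deg:
        return False        # camera is not pointing (almost) straight up or down
    return True
```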
  • a recording is made of a light source attached to the body of the user (eg abdomen, waistband).
  • a laser may be placed on the user's torso that emits a laser beam in the direction of straight-line movement of the user (ie, substantially straight ahead of the user).
  • the light source or the laser beam is the known object which is captured by the recording device. From the current position of the recording device and the known position of the recording device at at least one previous time, an orientation of the body in the real environment is determined (ie the motion vector at the time of recording is assumed to be the orientation of the body).
  • the orientation of the body in the real environment serves as a reference direction. From the orientation of the laser beam in the recording, the orientation of the recording device is now determined relative to the laser beam.
  • the orientation of the laser beam in the recording can be determined, for example, according to the method described in connection with FIGS. 9a and 9b. Because the direction of the laser beam corresponds to the reference direction, the absolute orientation of the recording device in the real environment can be determined from the recording of the laser beam.
  • a desired viewing direction in the representation of the virtual environment can be determined, so that by comparison with the current viewing direction in the representation of the virtual environment again the rotational offset of the viewing direction in the representation of the virtual environment around the yaw axis of the representation of the virtual environment can be determined.
  • the viewing direction in the representation of the virtual environment can then be rotated by the rotational offset, ie corrected.
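  • expressed as a small sketch (yaw angles in degrees; the wrapping convention is an assumption of the example), the correction amounts to determining the rotational offset between desired and current viewing direction and applying it as an additional yaw rotation:

```python
def wrap_angle_deg(angle: float) -> float:
    """Wrap an angle to the range [-180, 180) degrees."""
    return (angle + 180.0) % 360.0 - 180.0

def rotational_offset_deg(target_view_yaw_deg: float, current_view_yaw_deg: float) -> float:
    """Rotational offset about the yaw axis of the representation: how far, and in
    which direction, the current viewing direction has to be rotated to match the
    desired viewing direction."""
    return wrap_angle_deg(target_view_yaw_deg - current_view_yaw_deg)

# Example: applying the correction as an additional yaw added to the rendered view.
view_yaw = 250.0      # current viewing direction in the virtual environment (degrees)
target_yaw = 241.0    # desired viewing direction derived from the recording
view_yaw = wrap_angle_deg(view_yaw + rotational_offset_deg(target_yaw, view_yaw))
```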
  • FIG. 10 shows another method 1000 for setting a viewing direction in a representation of a virtual environment.
  • the method 1000 comprises recording 1002 an object arranged on the user's body in the real environment with a recording device arranged on the user's head at a first time t0 and at a later second time t1.
  • the object may be, eg, a light source attached to the user's body (eg abdomen or waistband).
  • a laser may be placed on the user's torso that emits a laser beam in the direction of straight-line motion of the user (i.e., substantially straight ahead of the user).
  • a user may move his head by rotating about a transverse axis of the head, rotating about a longitudinal axis of the head, and / or rotating about a yaw axis of the head.
  • the transverse axis, the longitudinal axis and the yaw axis of the head are perpendicular to each other. Since the recording device is arranged on the head of the user, it is likewise movable about the transverse axis, the longitudinal axis and the yaw axis of the head.
  • the representation of the virtual environment may be output to the user via a display device attached to the user's head.
  • the display device may, for example, be a head-mounted display (HMD).
  • existing sensors of the display device (eg gyroscope, magnetometer, accelerometer) allow a rotational position of the head about its yaw axis to be determined.
  • the determination of the orientation of the head with the existing sensors - as shown above - is faulty.
  • the display device may further comprise the recording device.
  • the method 1000 further includes determining 1004 a rotational offset of the viewing direction in the representation of the virtual environment about a yaw axis of the representation of the virtual environment based on the recordings of the object at the first time t0 and at the second time t1 and on measured values of at least one further sensor attached to the head of the user.
  • the rotational offset of the viewing direction in the representation of the virtual environment about a yaw axis of the representation of the virtual environment corresponds to a rotational offset of the recording device about the yaw axis of the head of the user. This is determined by comparing the rotation of the recording device about the yaw axis of the user's head determined from the recordings of the object at times t0 and t1 with the rotation of the recording device about the yaw axis of the user's head determined from the measured values of the at least one further sensor attached to the user's head between times t0 and t1.
  • the rotational offset of the viewing direction in the representation of the virtual environment about the yaw axis of the representation of the virtual environment can thus be determined without having to determine the absolute orientation of the recording device or of the user's head in the real environment. Rather, it is sufficient to determine the relative rotational offset of the recording device or of the HMD about the yaw axis of the head.
  • the method 1000 therefore also includes rotating 1006 the viewing direction in the representation of the virtual environment around the rotational offset.
  • the viewing direction in the representation of the virtual environment is corrected by a rotation about the yaw axis of the representation of the virtual environment, the direction and the magnitude of the rotation being determined by the rotational offset.
  • the representation of the virtual environment is thus corrected by the rotational offset.
  • the representation of the virtual environment can thus be adapted to the actual position and orientation of the user's head in the real environment. Consequently, the method 1000 also permits a calibration of the viewing direction in the representation of the virtual environment. In particular, the method 1000 also allows an erroneously determined orientation in the real environment or a drift of the viewing direction in the representation of the virtual environment, which results from measurement errors of commonly used sensors for determining the position and orientation of (the head of) a user, to be corrected.
  • determining 1004 the rotational offset of the viewing direction in the representation of the virtual environment about the yaw axis of the representation of the virtual environment comprises determining a first rotation of the recording device about the yaw axis of the user's head between the first time t0 and the second time t1 based on the recordings of the object at the first time t0 and at the second time t1.
  • for the first time t0, the orientation of the recording device relative to the laser beam is determined from the recording of the laser beam at the first time t0.
  • the laser beam may be directed in the direction of the user's straight-ahead movement and the user may be looking straight ahead, so that a rotation of the recording device about the yaw axis of the head of 0° relative to the laser beam is determined as the first orientation.
  • the orientation of the recording device relative to the laser beam is likewise determined from the recording of the laser beam at the second time t1. If, for example, the user has turned the head sideways at the time t1, a second orientation of the laser beam in the recording different from the first orientation is determined, ie a rotation of the recording device about the yaw axis of the head relative to the laser beam that differs from 0°.
  • the orientation of the laser beam in the recording can be determined, for example, according to the method described in connection with FIGS. 9a and 9b.
  • the relative angle of rotation about the yaw axis of the head between the orientation of the head at the first time t0 and the orientation of the head at the second time t1 is determined from the two recordings.
  • determining 1004 the rotational offset of the viewing direction in the representation of the virtual environment about the yaw axis of the representation of the virtual environment comprises determining a second rotation of the recording device about the yaw axis of the user's head between the first time t0 and the second time t1 based on the measured values of the at least one further sensor attached to the head of the user.
  • the effective (ie total) rotation about the yaw axis of the head is in turn determined from the measured values of the at least one further sensor attached to the head of the user between the times t0 and t1.
  • the relative angle of rotation about the yaw axis of the head between the orientation of the head at the first time t0 and the orientation of the head at the second time t1 is thus determined from the measured values.
  • the rotation about the yaw axis of the head determined from the measurement values may be erroneous. Since this is used in the VR system for determining the viewing direction in the representation of the virtual environment, the viewing direction in the representation of the virtual environment can also be erroneous, ie rotated about the yaw axis of the representation of the virtual environment.
  • determining 1004 the rotational offset of the viewing direction in the representation of the virtual environment about the yaw axis of the representation of the virtual environment further comprises, in these embodiments, determining the rotational offset about the yaw axis of the head between the first rotation and the second rotation as the rotational offset of the viewing direction in the representation of the virtual environment about the yaw axis of the representation of the virtual environment.
  • the rotational offset about the yaw axis of the head between the first rotation and the second rotation represents the difference between the first rotation determined from the recordings of the object and the second rotation determined from the measured values of the at least one further sensor.
  • the rotational offset about the yaw axis of the user's head resulting from the measurement errors of the sensors of the display device (ie of the HMD) is thus determined by means of the accurate determination of the rotation of the display device relative to the object on the user's body (eg the laser beam).
  • the relative rotational offset of the recording device or of the HMD about the yaw axis of the head can therefore be assumed as the rotational offset of the viewing direction in the representation of the virtual environment about the yaw axis of the representation of the virtual environment, so that the latter can be corrected according to the rotational offset of the recording device or of the HMD about the yaw axis of the head.
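  • the determination of this difference can be sketched as follows; angles are in degrees, and the sign convention (which rotation is subtracted from which) is an assumption of the example:

```python
def wrap_angle_deg(angle: float) -> float:
    """Wrap an angle to the range [-180, 180) degrees."""
    return (angle + 180.0) % 360.0 - 180.0

def yaw_offset_deg(rotation_from_recordings_deg: float,
                   rotation_from_sensor_deg: float) -> float:
    """Rotational offset about the head's yaw axis between t0 and t1: the difference
    between the rotation derived from the two recordings of the body-mounted object
    and the rotation obtained from the head-mounted sensor over the same interval."""
    return wrap_angle_deg(rotation_from_recordings_deg - rotation_from_sensor_deg)

# Example: the recordings show a head rotation of 30 deg between t0 and t1, while the
# head-mounted sensor reports 36 deg; the view would then be corrected by -6 deg.
offset = yaw_offset_deg(30.0, 36.0)
```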
  • FIGS. 11a and 11b show relationships between a motion vector of a user 1100 in the real environment, an actual viewing direction v_real of the user 1100 in the real environment, and a viewing direction v_vr in the representation of the virtual environment determined from the measured values of the at least one further sensor arranged at the head 1102 of the user 1100.
  • the user 1100 moves between the first time t0 and the later second time t1 along the motion vector p, which can be determined, for example, from position measurements m(t0) and m(t1) at the two times t0 and t1.
  • Fig. 11a shows the situation at the first time t0.
  • the user 1100 is looking straight ahead, ie in the direction of his motion vector p.
  • the actual orientation of the user's head is determined for the first time t0 by means of a recording device comprised by an HMD 1104 arranged at the head 1102 of the user 1100.
  • a first recording of an object (eg a light source, or a laser light source emitting a laser beam) is made with the recording device.
  • a relative orientation of the head to the object is determined for the first time t0.
  • the orientation of the object relative to the body 1106 (e.g., trunk) of the user 1100 is known.
  • a laser beam may be directed in the direction of a straight-ahead movement of the user 1100, so that a rotation of the recording device about the yaw axis of the head of 0° relative to the laser beam is determined as the first orientation. Since the laser beam is directed in the direction of the straight-ahead movement of the user 1100, the direction of the laser beam essentially corresponds to the motion vector p of the user 1100, so that the rotation of the recording device about the yaw axis of the head relative to the motion vector p of the user 1100 is also known, here a rotation of 0°. Since the motion vector is known, the absolute orientation of the recording device and therefore also of the head 1102 of the user 1100 in the real environment is thus known.
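  • a sketch of this determination, assuming ground-plane positions given as (x, z) pairs and angles in degrees; the coordinate and sign conventions are assumptions of the example:

```python
import math

def absolute_head_yaw_deg(m_t0, m_t1, rotation_to_laser_deg: float) -> float:
    """Absolute yaw of the head in the real environment, assuming the laser beam on
    the user's torso points along the motion vector p = m(t1) - m(t0).

    m_t0, m_t1            -- measured ground-plane positions (x, z) at t0 and t1
    rotation_to_laser_deg -- rotation of the recording device about the head's yaw
                             axis relative to the laser beam, taken from the recording
                             (0 deg if the user looks straight ahead)
    """
    px, pz = m_t1[0] - m_t0[0], m_t1[1] - m_t0[1]
    motion_yaw = math.degrees(math.atan2(px, pz))   # yaw of p, measured from the Z axis
    yaw = motion_yaw + rotation_to_laser_deg
    return (yaw + 180.0) % 360.0 - 180.0            # wrap to [-180, 180)

# Example: the user walked 2 m along Z and 0 m along X, looking straight ahead:
# absolute_head_yaw_deg((0.0, 0.0), (0.0, 2.0), 0.0) -> 0.0
```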
  • the orientation of the head 1102 of the user 1100 in the real environment is also determined by means of at least one further sensor (eg gyroscope, magnetometer, accelerometer) of the HMD 1104, and from it the viewing direction v_vr in the representation of the virtual environment is determined.
  • the viewing direction of the user 1100 in the real environment determined from the measured values of the at least one further sensor arranged at the head 1102 of the user 1100 is identical to the actual viewing direction v_real of the user 1100 in the real environment determined from the recording.
  • the viewing direction v_vr in the representation of the virtual environment would thus also essentially correspond to the motion vector p of the user 1100, so that the rotation between the actual viewing direction v_real of the user 1100 in the real environment and the viewing direction v_vr in the representation of the virtual environment can be assumed to be substantially 0°.
  • the rotational offset about the yaw axis of the representation of the virtual environment is substantially 0°.
  • in Fig. 11b, the situation at the second time t1 is now shown.
  • the user 1100 is looking straight ahead, ie essentially in the direction of his motion vector p.
  • a rotation of the recording device about the yaw axis of the head of 0° relative to, eg, the laser beam is again determined as the second orientation, ie the rotation of the recording device about the yaw axis of the head relative to the motion vector p of the user 1100 is again 0°.
  • the viewing direction of the user 1100 in the real environment determined from the measured values of the at least one further sensor is, however, not identical to the actual viewing direction v_real of the user 1100 in the real environment determined from the second recording. Therefore, the viewing direction v_vr in the representation of the virtual environment also does not substantially correspond to the motion vector p of the user 1100, so that between the actual viewing direction v_real of the user 1100 in the real environment and the viewing direction v_vr in the representation of the virtual environment there is a rotation other than 0°.
  • the rotational offset about the yaw axis of the representation of the virtual environment is thus different from 0°.
  • the consequence of a rotational offset different from 0° about the yaw axis of the representation of the virtual environment is that the user does not move in the virtual environment substantially along the motion vector p, but obliquely to it. If, for example, the user walks straight ahead in the real environment, he would move obliquely forward in the virtual environment.
  • to correct this, the relative rotational offset about the yaw axis of the head 1102 of the user 1100 between the rotation determined from the two recordings and the rotation determined from the measured values of the at least one further sensor can be used.
  • alternatively, the absolute orientation of the recording device (and thus of the head 1102) in the real environment, which can be determined based on the knowledge of the relative orientation of the recording device to the known motion vector, can be used to determine a target viewing direction in the representation of the virtual environment and to correct the viewing direction in the representation of the virtual environment by the rotational offset between the target viewing direction and the current viewing direction (ie to rotate it about the yaw axis of the representation of the virtual environment).
  • a calibration of the viewing direction in the representation of the virtual environment can be made possible.
  • although some aspects have been described in the context of a device, it will be understood that these aspects also constitute a description of the corresponding method, so that a block or a component of a device is also to be understood as a corresponding method step or as a feature of a method step. Similarly, aspects described in connection with or as a method step also represent a description of a corresponding block or detail or feature of a corresponding device. Depending on particular implementation requirements, embodiments of the invention may be implemented in hardware or in software.
  • the implementation may be performed using a digital storage medium, such as a floppy disk, a DVD, a Blu-Ray Disc, a CD, a ROM, a PROM, an EPROM, an EEPROM or FLASH memory, a hard disk, or another magnetic or optical memory, on which electronically readable control signals are stored which can cooperate with a programmable hardware component in such a way that the respective method is performed.
  • the digital storage medium may therefore be machine or computer readable.
  • some embodiments include a data carrier having electronically readable control signals capable of interacting with a programmable computer system or programmable hardware component such that one of the methods described herein is performed.
  • One embodiment is thus a data carrier (or a digital storage medium or a computer readable medium) on which the program is recorded for performing any of the methods described herein.
  • embodiments of the present invention may be implemented as a program, firmware, computer program, or computer program product having program code or as data, the program code or data being operative to perform one of the methods when the program runs on a processor or a programmable hardware component.
  • the program code or the data can also be stored, for example, on a machine-readable carrier or data carrier.
  • the program code or the data may be available, inter alia, as source code, machine code or byte code as well as other intermediate code.
  • Another embodiment is further a data stream, a signal sequence, or a sequence of signals that represents the program for performing any of the methods described herein.
  • the data stream, the signal sequence or the sequence of signals can be configured, for example, to be transferred via a data communication connection, for example via the Internet or another network.
  • Embodiments are also data representing signal sequences that are suitable for transmission over a network or a data communication connection, the data representing the program.
  • a program may implement one of the methods during its execution, for example by reading out one or more memory locations or writing one or more data into them, whereby switching operations or other processes in transistor structures, in amplifier structures or in other electrical, optical, magnetic components or components operating according to another functional principle may be caused. Accordingly, by reading a memory location, data, values, sensor values or other information can be acquired, determined or measured by a program. A program can therefore acquire, determine or measure quantities, values, measured variables and other information by reading from one or more memory locations, as well as cause, initiate or execute an action and control other devices, machines and components by writing to one or more memory locations.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Processing Or Creating Images (AREA)
EP17722406.0A 2016-05-18 2017-04-27 Verfahren zum einstellen einer blickrichtung in einer darstellung einer virtuellen umgebung Withdrawn EP3458935A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102016109153.7A DE102016109153A1 (de) 2016-05-18 2016-05-18 Verfahren zum einstellen einer blickrichtung in einer darstellung einer virtuellen umgebung
PCT/EP2017/060088 WO2017198441A1 (de) 2016-05-18 2017-04-27 Verfahren zum einstellen einer blickrichtung in einer darstellung einer virtuellen umgebung

Publications (1)

Publication Number Publication Date
EP3458935A1 true EP3458935A1 (de) 2019-03-27

Family

ID=58692475

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17722406.0A Withdrawn EP3458935A1 (de) 2016-05-18 2017-04-27 Verfahren zum einstellen einer blickrichtung in einer darstellung einer virtuellen umgebung

Country Status (8)

Country Link
US (1) US10885663B2 (ko)
EP (1) EP3458935A1 (ko)
JP (1) JP6676785B2 (ko)
KR (1) KR102184619B1 (ko)
CN (1) CN109313488A (ko)
CA (1) CA3022914A1 (ko)
DE (1) DE102016109153A1 (ko)
WO (1) WO2017198441A1 (ko)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6944863B2 (ja) * 2017-12-12 2021-10-06 株式会社ソニー・インタラクティブエンタテインメント 画像補正装置、画像補正方法およびプログラム
US11054638B2 (en) * 2018-06-13 2021-07-06 Reavire, Inc. Tracking pointing direction of device
IL265818A (en) * 2019-04-02 2020-10-28 Ception Tech Ltd System and method for determining the position and orientation of an object in space

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4743818B2 (ja) 2003-04-04 2011-08-10 キヤノン株式会社 画像処理装置、画像処理方法、コンピュータプログラム
JP2005050189A (ja) * 2003-07-30 2005-02-24 Canon Inc 画像処理装置およびその方法並びにプログラムコード、記憶媒体
CN100410622C (zh) * 2004-05-14 2008-08-13 佳能株式会社 用于获得目标物体的位置和方位的信息处理方法和设备
JP2008033837A (ja) * 2006-07-31 2008-02-14 Sanyo Electric Co Ltd 点検システム及び誤差補正プログラム
FR2915568B1 (fr) * 2007-04-25 2009-07-31 Commissariat Energie Atomique Procede et dispositif de detection d'un axe de rotation sensiblement invariant
US9013617B2 (en) * 2012-10-12 2015-04-21 Qualcomm Incorporated Gyroscope conditioning and gyro-camera alignment
US9677840B2 (en) * 2014-03-14 2017-06-13 Lineweight Llc Augmented reality simulator
CN105023294B (zh) * 2015-07-13 2018-01-19 中国传媒大学 结合传感器与Unity3D的定点移动增强现实方法
CN105044915A (zh) * 2015-09-02 2015-11-11 大连麒美数字科技有限公司 一种利用头戴显示器实现影片互动的控制方法
US10723022B2 (en) * 2016-09-16 2020-07-28 Carbon Robotics, Inc. System and calibration, registration, and training methods

Also Published As

Publication number Publication date
WO2017198441A1 (de) 2017-11-23
US10885663B2 (en) 2021-01-05
JP2019519842A (ja) 2019-07-11
CN109313488A (zh) 2019-02-05
CA3022914A1 (en) 2017-11-23
JP6676785B2 (ja) 2020-04-08
KR20190005222A (ko) 2019-01-15
US20190180471A1 (en) 2019-06-13
DE102016109153A1 (de) 2017-11-23
KR102184619B1 (ko) 2020-11-30

Similar Documents

Publication Publication Date Title
DE102018200154A1 (de) Kalibrationsvorrichtung, Kalibrationsverfahren und Programm für einen visuellen Sensor
DE102015005267B4 (de) Informationsverarbeitungsvorrichtung, Verfahren dafür und Messvorrichtung
DE102008024462B4 (de) Interaktives Bildsystem, interaktive Vorrichtung und Betriebsverfahren derselben
DE102015015194A1 (de) Bildverarbeitungsvorrichtung und -verfahren und Programm
DE102016224095A1 (de) Verfahren zum Kalibrieren einer Kamera und Kalibriersystem
DE112016005865T5 (de) Automatische Bereichssteuerung für Tiefenkamera mit aktiver Beleuchtung
DE112010004767T5 (de) Punktwolkedaten-Verarbeitungsvorrichtung, Punktwolkedaten-Verarbeitungsverfahren und Punktwolkedaten-Verarbeitungsprogramm
DE102016013274A1 (de) Bildverarbeitungsvorrichtung und verfahren zur erkennung eines bilds eines zu erkennenden objekts aus eingabedaten
EP3182065A1 (de) Handhaltbares entfernungsmessgerät und verfahren zum erfassen relativer positionen
DE102018108027A1 (de) Objekterfassungsvorrichtung
EP2886043A1 (de) Verfahren zum Fortsetzen von Aufnahmen zum Erfassen von dreidimensionalen Geometrien von Objekten
WO2018087084A1 (de) Verfahren und vorrichtung zum überlagern eines abbilds einer realen szenerie mit virtuellen bild- und audiodaten und ein mobiles gerät
DE102014201271A1 (de) Verfahren und Steuergerät zum Erkennen einer Veränderung eines relativen Gierwinkels innerhalb eines Stereo-Video-Systems für ein Fahrzeug
DE102008011596B4 (de) Kombiniertes Baken- und Szenen-Navigationssystem
EP2423640A1 (de) Neigungssensor für ein Gerät und Verfahren zur Bestimmung der Neigung eines Gerätes
DE102014019671A1 (de) Verfahren zum optischen Abtasten und Vermessen einer Umgebung mit einer 3D-Messvorrichtung und Autokalibrierung mittels 2D-Kamera
WO2017198441A1 (de) Verfahren zum einstellen einer blickrichtung in einer darstellung einer virtuellen umgebung
EP3104330A1 (de) Verfahren zum nachverfolgen zumindest eines objektes und verfahren zum ersetzen zumindest eines objektes durch ein virtuelles objekt in einem von einer kamera aufgenommenen bewegtbildsignal
DE112017003426T5 (de) Verfahren und System zum Rekonstruieren einer dreidimensionalen Darstellung
EP3347878A2 (de) Verfahren und vorrichtung zum überlagern eines abbilds einer realen szenerie mit einem virtuellen bild und mobiles gerät
DE112014006493T5 (de) Bestimmen eines Massstabs dreidimensonaler Informationen
EP3539085B1 (de) 3d-referenzierung
DE102014106718A1 (de) Verfahren und System zur Bestimmung einer gegenständlichen Lage
DE102015106836A1 (de) Verfahren zur Steuerung einer 3D-Messvorrichtung mittels momentaner Bewegung und Vorrichtung hierzu
DE102010042821B4 (de) Verfahren und Vorrichtung zur Bestimmung einer Basisbreite eines Stereo-Erfassungssystems

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20181218

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200803

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20211103