WO2013034133A1 - Interaction with a three-dimensional virtual scenario - Google Patents
Interaction with a three-dimensional virtual scenario
- Publication number
- WO2013034133A1 (PCT/DE2012/000892)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- selection
- virtual
- scenario
- dimensional
- area
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/04—Display arrangements
- G01S7/06—Cathode-ray tube displays or other two dimensional or three-dimensional displays
- G01S7/20—Stereoscopic displays; Three-dimensional displays; Pseudo-three-dimensional displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/50—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
- G02B30/56—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
Definitions
- the invention relates to display devices for a three-dimensional virtual scenario.
- in particular, the invention relates to a display device for a three-dimensional virtual scenario for selecting objects in the virtual scenario with feedback upon successful selection of one of the objects, a workstation device for monitoring a three-dimensional virtual scenario, the use of such a workstation device for monitoring airspaces, and a method for selecting objects in a three-dimensional scenario.
- a display device for a three-dimensional virtual scenario for selecting objects in the virtual scenario with feedback upon successful selection of an object is provided, comprising a presentation unit for displaying the virtual scenario and a touch unit for touch-controlled selection of an object in the virtual scenario.
- according to one embodiment, the touch unit is arranged in a display space of the virtual scenario.
- the presentation unit can be based on stereoscopic visualization techniques, which are particularly suitable for the evaluation of three-dimensional data. Such visualization techniques allow an observer of a three-dimensional virtual scenario to intuitively understand spatial data.
- viewing a three-dimensional virtual scenario requires convergence, i.e. the alignment of the axes of the observer's eyes, and accommodation, i.e. the adjustment of the refractive power of the lenses of the observer's eyes.
- in natural vision, convergence and accommodation are coupled together, and this coupling must be overridden when viewing a three-dimensional virtual scenario: the eyes remain focused on an imaging unit, but the eye axes must align with the virtual objects of the three-dimensional scenario, which lie in front of or behind the imaging unit.
- this decoupling of convergence and accommodation can strain the human eye, leading to fatigue and even headache and nausea in a viewer of a three-dimensional virtual scene.
- if an operator interacts directly with objects of the virtual scenario, for example with his hand, so that the real position of the hand overlaps with the virtual objects, the conflict between accommodation and convergence can be intensified.
- interacting with a three-dimensional virtual scenario may require wearing special gloves.
- these gloves make it possible, on the one hand, to determine the position of the user's hands and, on the other hand, to trigger a corresponding vibration when virtual objects are touched.
- the position of the hand is usually determined in this case with an optical detection system.
- a user typically moves his hands in the space in front of him during interaction. The weight of the arms and the additional weight of the gloves can limit the time of use, since the user may show signs of fatigue early on.
- particularly in the field of airspace surveillance or aviation, there are situations in which two types of information are needed to build a good understanding of the current airspace situation and its future evolution: on the one hand, a global view of the overall situation and, on the other hand, a detailed view of the elements relevant to a potential conflict situation.
- an air traffic controller, for example, who has to resolve a conflict situation between two aircraft, must have both views available at the same time.
- perspective displays allow a graphical representation of a spatially acting three-dimensional scenario.
- according to the invention, a representation of three-dimensional scenarios is provided that simultaneously allows both an overview and a detail display as well as a simple and direct means of interaction.
- the presentation unit is designed to evoke the impression of a three-dimensional scenario in a viewer.
- the presentation unit can have at least two projection devices which project a different image for each individual eye of the observer, so that a three-dimensional impression is created in the viewer.
- the presentation unit can also be designed to display differently polarized images, with the observer wearing spectacles with correspondingly polarized lenses, so that each eye perceives one of the images, creating a three-dimensional impression in the viewer.
- the touch unit is an input element for touch-controlled selection of objects in the virtual scenario.
- the touch unit may, for example, be transparent and arranged in the three-dimensional presentation space of the virtual scenario, so that an object of the virtual scenario is selected by the user reaching into the three-dimensional presentation space with one or both hands and touching the touch unit.
- the touch unit may be located anywhere in the three-dimensional presentation space or outside the three-dimensional presentation space.
- the touch unit can be designed as a plane or as an arbitrarily geometrically shaped surface.
- the touch unit can be designed as a flexibly formable element in order to be able to adapt the touch unit to the three-dimensional virtual scenario.
- the touch unit may comprise, for example, capacitive or resistive measurement systems or infrared-based gratings to determine the coordinates of one or more touch points at which the user touches the touch unit. Depending on the coordinates of a point of contact, for example, that object in the three-dimensional virtual scenario which comes closest to the point of contact is selected. According to an embodiment of the invention, the touch unit is adapted to represent a selection area for the object. The object is selected by touching the selection area.
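The nearest-object rule just described can be sketched in a few lines of code. The following Python fragment is illustrative only and not taken from the patent; the names `VirtualObject` and `select_nearest` and the distance threshold are assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    x: float  # position of the object's footprint on the touch surface
    y: float

def select_nearest(objects, touch_x, touch_y, max_distance=20.0):
    """Return the object closest to the touch point, or None if no object
    lies within max_distance (in touch-surface units)."""
    best, best_dist = None, max_distance
    for obj in objects:
        dist = math.hypot(obj.x - touch_x, obj.y - touch_y)
        if dist < best_dist:
            best, best_dist = obj, dist
    return best

# Example: two aircraft symbols; a touch at (11, 10) selects the closer one.
fleet = [VirtualObject("AC-101", 10, 10), VirtualObject("AC-202", 50, 40)]
print(select_nearest(fleet, 11, 10).name)  # -> AC-101
```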
- a computing device may, for example, determine the position of the selection area assigned to an object in the virtual scenario.
- the touch unit may be configured to display a plurality of selection areas for a plurality of objects, each selection area being assigned to one object in the virtual scenario.
- the feedback upon successful selection of one of the objects from the virtual scenario is given at least partially by a vibration of the touch unit or by the activation of individual vibration elements.
- the touch unit may, for example, be vibrated as a whole, for example with the aid of a motor, in particular a vibration motor, or individual areas of the touch unit may be vibrated.
- piezoelectric actuators can also be used as vibration elements, each being made to oscillate at the point of contact when an object has been selected in the virtual scenario, thus signaling the selection to the user.
- for this purpose, the touch unit has a plurality of areas that can be selectively activated for tactile feedback on the selection of an object in the virtual scenario.
- the touch unit may be configured to allow selection of multiple objects at the same time. For example, an object with a first hand and another object with a second hand of the user can be selected.
- the touch unit can output tactile feedback in the area of a selection area for an object, i.e., for example, perform a vibration there. This allows the user, in particular when selecting multiple objects, to recognize which of the objects has been selected and which has not.
- the touch unit can be designed to allow a change of a map scale and a shifting of the illustrated map area.
- tactile feedback is understood as meaning, for example, a vibration or the oscillation of a piezoelectric actuator.
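How selectively activated vibration areas might work can be illustrated as follows; this is a minimal sketch under the assumption of a grid of individually addressable piezo elements, with `PiezoElement` and `tactile_feedback` as invented names and `pulse` standing in for the actual hardware call.

```python
import math
from dataclasses import dataclass

@dataclass
class PiezoElement:
    x: float  # position of the vibration element on the touch surface
    y: float

    def pulse(self, duration_ms=50):
        # stand-in for the hardware call that makes this element oscillate
        print(f"piezo at ({self.x}, {self.y}) oscillating for {duration_ms} ms")

def tactile_feedback(elements, touch_x, touch_y):
    """Activate only the vibration element closest to the detected touch
    coordinates, so the feedback is felt directly at the selecting finger."""
    nearest = min(elements, key=lambda e: math.hypot(e.x - touch_x, e.y - touch_y))
    nearest.pulse()

# Example: a 3x3 grid of elements; a touch at (3, 9) pulses the element at (0, 10).
grid = [PiezoElement(x, y) for x in (0, 10, 20) for y in (0, 10, 20)]
tactile_feedback(grid, 3, 9)
```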
- the feedback on the successful selection of an object in the three-dimensional scenario is at least partially carried out by outputting an optical signal.
- the optical signal can be output as an alternative or in addition to the tactile feedback.
- feedback by means of an optical signal is understood to mean, for example, highlighting the selected object or displaying a selection pointer.
- for example, the brightness of the selected object may be changed, the selected object may be provided with a border, or a pointing element pointing to the selected object may be displayed next to it in the virtual scenario.
- the feedback about the selection of an object in the virtual scenario takes place at least partially by the output of an acoustic signal.
- the acoustic signal can be output as an alternative to the tactile feedback and/or the optical signal, or in addition to them.
- an acoustic signal is understood as meaning, for example, the output of a short tone via an output unit, for example a loudspeaker.
- this structure allows the user to view the overall scenario in the overview area and to examine a user-selectable smaller area more closely in the detail area.
- the overview area can be reproduced, for example, as a two-dimensional display and the detail area as a spatial representation.
- the detail section of the virtual scenario can be moved, rotated or resized.
- this representation of the airspace makes it possible, in a simple and manageable way, to keep both the overall airspace situation in view in the overview area and potential conflict situations in view in the detail area.
- the invention allows the operator to change the detail area depending on the particular needs, i.e. any area of the overview display can be selected for the detailed display. Of course, this selection can also be made such that a selected area of the detailed representation is displayed in the overview representation.
- according to a further aspect, a workstation device for monitoring a three-dimensional virtual scenario is provided, with a display device for a three-dimensional virtual scenario for selecting objects in the virtual scenario with feedback upon successful selection of one of the objects, as described above and below.
- the workstation device can also be used, for example, to control unmanned aerial vehicles or to monitor arbitrary scenarios, and can be used by one or more users.
- the workstation device as described above and below may, of course, have a plurality of display devices, and may also have one or more conventional displays for displaying additional two-dimensional information. These displays can be coupled with the display device, so that mutual influence of the displayed information is possible.
- for example, a flight plan can be displayed on one display and, if an entry from the flight plan is selected, the corresponding aircraft can be highlighted in the three-dimensional virtual scenario.
- the displays can also be arranged so that the display areas of all the displays merge into one another, or so that a plurality of display areas are shown on one physical display.
- the workstation device may include input elements that may be used alternatively or in addition to interacting directly with the three-dimensional virtual scenario.
- the workstation device may have a so-called computer mouse, a keyboard or use-typical interaction devices such as those found at an air traffic controller's workplace. Likewise, all displays or presentation units may be conventional or touch-sensitive displays or presentation units.
- according to a further aspect, the use of a workstation device as described above and below for monitoring airspaces is provided.
- the workstation device can also be used to monitor and control unmanned aerial vehicles and to analyze a scenario.
- the workstation device may also be used to control components such as a camera or other sensors that are part of an unmanned aerial vehicle.
- the workstation device may be configured to represent, for example, a restricted zone or hazardous area in the three-dimensional scenario.
- the three-dimensional representation of the airspace makes it easy and quick to detect whether an aircraft threatens, for example, to fly through a restricted zone or a hazardous area.
- An exclusion zone or a danger zone can be represented, for example, as a virtual body in the size of the exclusion zone or the hazardous area.
- according to a further aspect, a method for selecting objects in a three-dimensional scenario is provided. In a first step, a selection area of a virtual object is touched on a display surface of a three-dimensional virtual scenario. In a subsequent step, feedback is output to an operator upon the selection of the virtual object.
- the method further comprises the steps of: visualizing a selection element in the three-dimensional virtual scenario, moving the selection element according to a finger movement of the operator on the presentation surface, and selecting an object in the three-dimensional scenario by bringing the selection element into coincidence with the object to be selected.
- the visualization of the selection element, the movement of the selection element and the selection of the object take place after touching the selection surface.
- the selection element may be displayed in the virtual scenario when the operator touches the touch unit.
- the selection element is represented in the virtual scenario, for example, as a vertically extending beam of light or light cylinder and moves through the three-dimensional virtual scenario in accordance with a movement of the operator's finger on the touch unit.
- if the selection element encounters an object in the three-dimensional virtual scenario, this object is selected for further operations, provided the selection element remains essentially immobile on the object for a certain time.
- for example, the selection of the object in the virtual scenario may occur after the selection element has rested unmoving on an object for one second. This waiting time prevents objects from being selected merely because the selection element has passed over them.
- the presentation of a selection element in the virtual scenario simplifies the selection of an object and allows the operator to select an object without having to pay attention to the position of his hand relative to the virtual scenario.
- the selection of an object thus takes place in that the selection element is brought into coincidence with the object to be selected by a movement of the hand, which is made possible in that the selection element, for example in the form of a light cylinder, runs vertically through the virtual scenario. Coincidence here means that the selection element overlaps in at least one point with the coordinates of the virtual object to be selected.
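The combination of a vertically running selection element and a dwell time can be expressed as a small state machine. The sketch below is an assumption about one possible realization, not the patent's implementation; the one-second dwell and the five-unit hit radius mirror the examples given above.

```python
import math
import time
from dataclasses import dataclass

@dataclass
class Object3D:
    name: str
    x: float
    y: float
    z: float  # ignored by the vertical beam, which spans the whole height

DWELL_SECONDS = 1.0  # waiting time before a hovered object counts as selected
HIT_RADIUS = 5.0     # horizontal tolerance between beam axis and object

class BeamSelector:
    def __init__(self, objects):
        self.objects = objects
        self.hovered = None      # object the beam currently rests on
        self.hover_since = 0.0   # when the beam arrived over that object

    def update(self, beam_x, beam_y, now=None):
        """Move the vertical selection beam to (beam_x, beam_y); return the
        hovered object once the beam has rested on it for DWELL_SECONDS,
        otherwise None."""
        now = time.monotonic() if now is None else now
        hit = next((o for o in self.objects
                    if math.hypot(o.x - beam_x, o.y - beam_y) <= HIT_RADIUS), None)
        if hit is not self.hovered:        # beam moved onto a different object
            self.hovered, self.hover_since = hit, now
            return None
        if hit is not None and now - self.hover_since >= DWELL_SECONDS:
            return hit                     # dwell time elapsed: object selected
        return None

# Example: the beam reaches the object at t=0 and is still on it at t=1.2 s.
sel = BeamSelector([Object3D("AC-101", 0, 0, 5)])
sel.update(0, 0, now=0.0)                  # arrives over the object -> None
print(sel.update(1, 1, now=1.2).name)      # still within HIT_RADIUS -> AC-101
```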
- according to a further aspect, a computer program element for controlling a display device for a three-dimensional virtual scenario for selecting objects in the virtual scenario with feedback upon successful selection of one of the objects is provided, which is designed to carry out the method for selecting objects in a three-dimensional virtual scenario as described above and below when the computer program element is executed on a processor of a computing unit.
- the computer program element can serve to instruct a processor of the computing unit to carry out the method.
- a computer-readable medium is indicated with the computer program element as described above and below.
- a computer-readable medium may be any volatile or non-volatile storage medium, such as a hard disk, a CD, a DVD, a floppy disk, a memory card, or any other computer-readable medium
- FIG. 1 is a side view of a workstation device according to an embodiment of the invention.
- Fig. 2 shows a perspective view of a workstation device according to another embodiment of the invention.
- Fig. 3 shows a schematic view of a display device according to an embodiment of the invention.
- Fig. 4 shows a schematic view of a display device according to another embodiment of the invention.
- Fig. 5 shows a side view of a workstation device according to a further embodiment of the invention.
- FIG. 6 shows a schematic view of a display device according to an embodiment of the invention.
- FIG. 7 shows a schematic view of a method for selecting objects in a three-dimensional virtual scenario according to an embodiment of the invention.
- Fig. 1 shows a workstation device 200 for an operator of a three-dimensional virtual scenario.
- the workstation device 200 has a display device 100 with a display unit 110 and a touch unit 120.
- the touch unit 120 may, in particular, overlay part of the display unit 110, but it can also overlay the entire display unit 110.
- in such a case the touch unit is transparent, so that the operator of the workstation or the viewer of the display device retains an unobstructed view of the display unit.
- the display unit 110 and the touch unit 120 together constitute the display device 100.
- statements made about the touch unit 120 and the display unit 110 apply mutatis mutandis to the display device 100, and vice versa.
- the touch unit may be configured to cover the presentation unit, i.e. the entire presentation unit may be provided with a touch-sensitive surface.
- the display unit 110 has a first display area 111 and a second display area 112, wherein the second display area is angled relative to the first display area in the direction of the user, so that the two display areas enclose an angle α 115.
- the first display area 111 and the second display area 112 of the display unit 110, by their mutually angled position, span together with a viewer position 195, i.e. the position of the operator's eyes, the presentation space 130 for the three-dimensional virtual scenario.
- the presentation space 130 is thus the spatial volume in which the visible three-dimensional virtual scenario lies.
- an operator who uses the seat 190 during use of the workstation device 200 can, in addition to the presentation space 130 for the three-dimensional virtual scenario, also use the working surface 140, on which further touch-sensitive or conventional displays may be located.
- the inclusion angle α 115 can be dimensioned such that all virtual objects in the presentation space 130 are within arm's reach of the user of the workstation device 200.
- an inclusion angle α of between 90 degrees and 150 degrees has proven suitable.
- the inclusion angle α can also be adapted to the individual needs of a single user, falling below or exceeding the range of 90 to 150 degrees.
- in one embodiment, the inclusion angle α is 120 degrees.
- the angled geometry of the display unit 110 can reduce the conflict between convergence and accommodation in a viewer of a virtual three-dimensional scenario.
- the three-dimensional virtual scenario can be represented, for example, such that the second display area 112 of the display unit 110 corresponds to the virtually represented earth's surface or a reference surface of the scenario.
- the workstation device according to the invention is particularly suitable for longer-term, low-fatigue processing of three-dimensional virtual scenarios with integrated spatial representation of geographically referenced data, such as aircraft, waypoints, control zones, threat areas, terrain topographies and weather events, with simple, intuitive possibilities of interaction.
- the presentation unit 110 may also have a rounded transition from the first display area 111 to the second display area 112. This avoids or reduces an interfering influence of a real visible edge between the first display area and the second display area on the three-dimensional impression of the virtual scenario.
- the display unit 110 can also be formed in the shape of a circular arc.
- the workstation device as described above and below thus enables a large stereoscopic display volume or presentation space.
- furthermore, the workstation device enables a virtual reference surface in the virtual three-dimensional scenario, for example a terrain surface, to be positioned in the same plane as the actual presentation unit or touch unit.
- in this way, the distance of the virtual objects from the surface of the presentation unit can be reduced, thus reducing the conflict between convergence and accommodation in the viewer. Disturbing influences on the three-dimensional impression are also reduced, which otherwise arise when the operator reaches into the presentation space with a hand and the observer's eye perceives a real object, i.e. the operator's hand, and virtual objects at the same time.
- the touch unit 120 is designed to give feedback to the operator when it is touched by the operator's hand.
- this can be done by a detection unit (not shown) detecting the touch coordinates on the touch unit, whereupon, for example, the display unit outputs optical feedback or a sound output unit (not shown) outputs acoustic feedback.
- in addition, the touch unit can output tactile feedback by means of vibration or oscillations of piezo actuators.
- Fig. 2 shows a workstation device 200 with a display device 100, which is designed to display a three-dimensional virtual scenario, furthermore with three conventional display elements 210, 211, 212 for two-dimensional display of graphics and information, furthermore with two conventional input/interaction devices, such as a computer mouse 171 and a so-called space mouse 170, which is an interaction device with six degrees of freedom and with which elements in space, for example in a three-dimensional scenario, can be controlled.
- the three-dimensional impression of the presentation device 100 arises for the viewer with the aid of a pair of glasses 160.
- the glasses are designed to provide the eyes of a viewer with different images, giving the viewer the impression of a three-dimensional scenario.
- the spectacles 160 have a multiplicity of so-called reflectors 161, which serve to determine the eye position of a viewer in front of the presentation device 100 and thus, if necessary, to adapt the reproduction of the three-dimensional virtual scene to the position of the observer.
- the workstation device 200 can, for example, have a position detection unit (not shown) for this purpose, which detects the eye position by means of a camera system with a plurality of cameras, based on the position of the reflectors 161.
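One conceivable way to derive an eye position from camera-detected reflector coordinates is to average the reflector points and apply a fixed, glasses-specific offset; the following sketch is an assumption for illustration, and `estimate_eye_position` and the offset values are not from the patent.

```python
import numpy as np

def estimate_eye_position(reflectors, eye_offset=(0.0, -2.0, -3.0)):
    """Estimate an eye position from the detected reflector coordinates on the
    glasses: average the reflector points and add a fixed offset from the
    reflector centroid to the eye (calibrated once per pair of glasses)."""
    centroid = np.mean(np.asarray(reflectors, dtype=float), axis=0)
    return centroid + np.asarray(eye_offset, dtype=float)

# Example: three reflectors detected by the camera system.
pts = [(10, 30, 50), (14, 30, 50), (12, 33, 50)]
print(estimate_eye_position(pts))  # -> [12. 29. 47.]
```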
- FIG. 3 shows a perspective view of a display device 100 with a display unit 110 and a touch unit 120, wherein the display unit 110 has a first display area 111 and a second display area 112.
- on a display surface 310, a selection area 302 is indicated for each virtual object 301 in the presentation space 130.
- each selection area 302 may be connected via a selection element 303 to the virtual object 301 assigned to it.
- the selection element 303 makes it easier for a user to assign a selection area 302 to a virtual object 301.
- the display surface 310 may be spatially arranged in the three-dimensional virtual scenario so that the display surface 310 overlaps the touch unit 120. As a result, the selection areas 302 are also on the touch unit 120.
- the selection of a virtual object 301 in the three-dimensional virtual scene thus takes place in that the operator touches the touch unit 120 with a finger at the location where the selection area 302 of the virtual object to be selected lies.
- the touch unit 120 is designed to transmit the touch coordinates of the operator's finger to an evaluation unit, which matches the touch coordinates with the display coordinates of the selection areas 302 and can thus determine the selected virtual object.
- the touch unit 120 may be configured to respond to the operator's touch only at the locations where a selection area is displayed. This allows the operator to rest his hands on the touch unit in such a way that no selection area is touched; resting the hands can prevent operator fatigue and facilitates easy interaction with the virtual scenario.
- the described construction of the presentation device 100 thus makes it possible for an operator to interact with a virtual three-dimensional scenario and to receive real, physically perceptible feedback on this interaction.
- the selection of a virtual object 301 that has taken place can be signaled to the operator by vibration of the touch unit 120, for example.
- alternatively, the touch unit 120 may be vibrated only over the extent of the selected selection area 302. This can be achieved, for example, by using oscillating piezo actuators in the touch unit, which, after detection of the contact coordinates on the touch unit, are made to oscillate at the appropriate position.
- the virtual objects can also be selected by means of a selection element in the form of a vertical plane which, when the touch unit 120 is touched, runs through the virtual three-dimensional scene at the touch position.
- a virtual object 301 is then selected by bringing the selection element into coincidence with the virtual object to be selected. In order to avoid an accidental selection of a virtual object, the selection can take place with a delay, such that a virtual object is not selected until the selection element has remained in coincidence with it for a certain time.
- FIG. 4 shows a display device 100 with a display unit 110 and a touch unit 120.
- in a first display area 111, an overview area is displayed in a two-dimensional representation, and a partial area 401 of the overview area is displayed in detail in the presentation space 130 as a three-dimensional scenario 402.
- the objects located in the partial section of the overview area are displayed as virtual three-dimensional objects 301.
- the display device 100 as described above and below enables the operator to change the detail area 402 by moving the partial area 401 in the overview area or by shifting the excerpt of the overview area shown in the three-dimensional detail view 402 in the direction of at least one of the three indicated coordinates x, y or z.
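One way to model the coupling between the overview and the detail area is a region record that can be shifted along the indicated coordinates; the following sketch is hypothetical, and `DetailRegion` is an invented name rather than a structure from the patent.

```python
from dataclasses import dataclass

@dataclass
class DetailRegion:
    """Sub-area of the 2D overview that is rendered as the 3D detail view."""
    x: float            # lower-left corner in overview coordinates
    y: float
    width: float
    height: float
    z_offset: float = 0.0  # vertical shift of the displayed 3D excerpt

    def shift(self, dx=0.0, dy=0.0, dz=0.0):
        """Move the detail excerpt along x, y or z as the operator drags it."""
        self.x += dx
        self.y += dy
        self.z_offset += dz

    def contains(self, px, py):
        """True if an overview object at (px, py) belongs in the detail view."""
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

# Example: drag the excerpt 10 units east; an object at (130, 220) now falls inside.
region = DetailRegion(x=100, y=200, width=50, height=40)
region.shift(dx=10)
print(region.contains(130, 220))  # -> True
```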
- FIG. 5 shows a workstation device 200 with a display device 100 and a user 501 viewing the three-dimensional virtual scenario.
- the display device 100 has a presentation unit 110 and a touch unit 120, which together with the eyes of the user 501 span the presentation space 130 in which the virtual objects 301 of the three-dimensional virtual scenario are displayed.
- the distance of the user 501 from the display device 100 can be dimensioned such that the user can reach most or all of the presentation space 130 with at least one of his arms.
- this keeps the distances between the real position of the user's hand 502, the real position of the display device 100 and the virtual position of the virtual objects 301 in the virtual three-dimensional scenario as small as possible, so that the conflict between convergence and accommodation in the user's visual system is reduced.
- this structure can thus enable low-fatigue interaction with the three-dimensional virtual scenario.
- FIG. 6 shows a three-dimensional virtual scenario display device 100 with a display unit 110 and a touch unit 120. In the presentation space 130, virtual three-dimensional objects 301 are imaged.
- in the presentation space 130, a virtual surface 601 is arranged, on which a marking element 602 can be moved.
- the marking element 602 moves only on the virtual surface 601 and therefore has two degrees of freedom in its movement; in other words, the marking element 602 is designed to perform a two-dimensional movement.
- the marking element can be controlled for example by means of a conventional computer mouse.
- the selection of a virtual object in the three-dimensional scenario takes place in that the position of at least one eye 503 of the user is detected by means of the reflectors 161 on a pair of glasses worn by the user, and a connecting line 504 is determined from the detected position of the eye 503 through the marking element 602.
- the connecting line can also be calculated on the basis of an averaged position of both eyes of the observer.
- the position of the eyes of the user can be determined with or without glasses with corresponding reflectors. It should be noted that any mechanisms and methods for determining the position of the eyes can be used in the context of the invention.
- the selection of a virtual object 301 in the three-dimensional scenario is carried out by extending the connecting line 504 into the presentation space 130 and selecting that virtual object whose virtual coordinates are crossed by the connecting line 504.
- the selection of a virtual object 301 is then identified, for example, by means of a selection indicator 603.
- the virtual surface 601, on which the marking element 602 moves, may be arranged in the virtual scenario in the presentation space 130 so that, from the user's perspective, virtual objects 301 are located in front of and/or behind the virtual surface 601.
- if the marking element 602 is moved on the virtual surface 601 so that the connecting line 504 crosses the coordinates of a virtual object 301, the marking element 602 may be represented in the three-dimensional scenario such that it occupies the three-dimensional coordinates of the selected object. From the user's point of view, this change appears such that, as soon as a virtual object 301 is selected, the marking element 602 makes a spatial movement toward or away from the user.
- this enables the selection of objects in three-dimensional space by means of two-dimensional interaction devices such as a computer mouse. Compared with an input device with more degrees of freedom, this can represent a simpler and faster-to-learn interaction with a three-dimensional scenario, since an input device with fewer degrees of freedom is used.
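The eye-to-marker line selection can be approximated with a simple ray test: extend the line from the detected eye position through the marking element and pick the nearest object the line crosses. This sketch is an assumption about one possible implementation; `select_along_gaze` and the tolerance value are not from the patent.

```python
import numpy as np

def select_along_gaze(eye, marker, objects, tolerance=2.0):
    """Extend the line from the eye position through the marking element and
    return the nearest object whose coordinates the line crosses within
    `tolerance`, or None if the line misses everything."""
    eye = np.asarray(eye, dtype=float)
    direction = np.asarray(marker, dtype=float) - eye
    direction /= np.linalg.norm(direction)
    hits = []
    for obj in objects:
        p = np.asarray(obj["pos"], dtype=float)
        t = float(np.dot(p - eye, direction))  # distance along the ray
        if t <= 0:
            continue                           # object lies behind the viewer
        closest = eye + t * direction          # foot of perpendicular on the ray
        if np.linalg.norm(p - closest) <= tolerance:
            hits.append((t, obj))
    return min(hits, key=lambda h: h[0])[1] if hits else None

# Example: the line from the eye through the marker crosses the first object.
objs = [{"name": "AC-101", "pos": (0.0, 10.0, 0.0)},
        {"name": "AC-202", "pos": (0.0, 20.0, 5.0)}]
print(select_along_gaze((0, 0, 0), (0, 5, 0), objs)["name"])  # -> AC-101
```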
- FIG. 7 shows a schematic view of a method for selecting objects in a three-dimensional virtual scenario according to an embodiment of the invention.
- in a first step 701, a selection area of a virtual object is touched on a display surface of a three-dimensional virtual scenario.
- the selection area is coupled to the virtual object such that touching the selection area allows a unique determination of the correspondingly selected virtual object.
- in a second step 702, the visualization of a selection element in the three-dimensional virtual scenario takes place.
- the selection element can be, for example, a light cylinder running vertically in the three-dimensional virtual scenario.
- the selection element can be visualized as a function of the duration of contact with the selection area, i.e. the selection element is visualized as soon as a user touches the selection area and can be hidden again as soon as the user removes his finger from the selection area. This allows the user to pause or abort a selection operation of a virtual object, for example because the user determines that he wishes to select another virtual object.
- in a third step 703, the selection element is moved: if the finger is not removed from the touch unit, the visualized selection element persists in the virtual scenario and can be moved in the virtual scenario by a movement of the finger on the display surface or the touch unit. This allows a user to select a virtual object by gradually bringing the selection element closer to precisely the virtual object to be selected.
- in a fourth step 704, an object is selected in the three-dimensional scenario by bringing the selection element into coincidence with the object to be selected.
- the selection of the object can take place, for example, by keeping the selection element in coincidence with the object to be selected for a certain time, for example one second.
- the period of time after which a virtual object is displayed as selected can be set arbitrarily.
- in a fifth step 705, feedback is output to the operator after the virtual object has been selected.
- the feedback can, as described above, be haptic/tactile, optical or acoustic.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Optics & Photonics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
- Processing Or Creating Images (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020147006702A KR20140071365A (ko) | 2011-09-08 | 2012-09-06 | 3차원 가상 시나리오와의 상호작용 |
CA2847425A CA2847425C (en) | 2011-09-08 | 2012-09-06 | Interaction with a three-dimensional virtual scenario |
RU2014113395/08A RU2604430C2 (ru) | 2011-09-08 | 2012-09-06 | Взаимодействие с трехмерным виртуальным динамическим отображением |
US14/343,440 US20140282267A1 (en) | 2011-09-08 | 2012-09-06 | Interaction with a Three-Dimensional Virtual Scenario |
EP12780399.7A EP2753951A1 (de) | 2011-09-08 | 2012-09-06 | Interaktion mit einem dreidimensionalen virtuellen szenario |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102011112618.3 | 2011-09-08 | ||
DE102011112618A DE102011112618A1 (de) | 2011-09-08 | 2011-09-08 | Interaktion mit einem dreidimensionalen virtuellen Szenario |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013034133A1 true WO2013034133A1 (de) | 2013-03-14 |
Family
ID=47115084
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DE2012/000892 WO2013034133A1 (de) | 2011-09-08 | 2012-09-06 | Interaktion mit einem dreidimensionalen virtuellen szenario |
Country Status (7)
Country | Link |
---|---|
US (1) | US20140282267A1 (de) |
EP (1) | EP2753951A1 (de) |
KR (1) | KR20140071365A (de) |
CA (1) | CA2847425C (de) |
DE (1) | DE102011112618A1 (de) |
RU (1) | RU2604430C2 (de) |
WO (1) | WO2013034133A1 (de) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2976681B1 (fr) * | 2011-06-17 | 2013-07-12 | Inst Nat Rech Inf Automat | Systeme de colocalisation d'un ecran tactile et d'un objet virtuel et dispostif pour la manipulation d'objets virtuels mettant en oeuvre un tel systeme |
JP2015132888A (ja) * | 2014-01-09 | 2015-07-23 | キヤノン株式会社 | 表示制御装置及び表示制御方法、プログラム、並びに記憶媒体 |
DE102014107220A1 (de) * | 2014-05-22 | 2015-11-26 | Atlas Elektronik Gmbh | Eingabevorrichtung, Rechner oder Bedienanlage sowie Fahrzeug |
US10140776B2 (en) | 2016-06-13 | 2018-11-27 | Microsoft Technology Licensing, Llc | Altering properties of rendered objects via control points |
DE102017117223A1 (de) * | 2017-07-31 | 2019-01-31 | Hamm Ag | Arbeitsmaschine, insbesondere Nutzfahrzeug |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6377229B1 (en) * | 1998-04-20 | 2002-04-23 | Dimensional Media Associates, Inc. | Multi-planar volumetric display system and method of operation using three-dimensional anti-aliasing |
EP1667088A1 (de) * | 2004-11-30 | 2006-06-07 | Oculus Info Inc. | System und Verfahren für interaktive 3D-Luftregionen |
US20070035511A1 (en) * | 2005-01-25 | 2007-02-15 | The Board Of Trustees Of The University Of Illinois. | Compact haptic and augmented virtual reality system |
Family Cites Families (55)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5320538A (en) * | 1992-09-23 | 1994-06-14 | Hughes Training, Inc. | Interactive aircraft training system and method |
US5394202A (en) * | 1993-01-14 | 1995-02-28 | Sun Microsystems, Inc. | Method and apparatus for generating high resolution 3D images in a head tracked stereo display system |
US5594469A (en) * | 1995-02-21 | 1997-01-14 | Mitsubishi Electric Information Technology Center America Inc. | Hand gesture machine control system |
US7225404B1 (en) * | 1996-04-04 | 2007-05-29 | Massachusetts Institute Of Technology | Method and apparatus for determining forces to be applied to a user through a haptic interface |
US6302542B1 (en) * | 1996-08-23 | 2001-10-16 | Che-Chih Tsao | Moving screen projection technique for volumetric three-dimensional display |
JP2985847B2 (ja) * | 1997-10-17 | 1999-12-06 | 日本電気株式会社 | 入力装置 |
US6031519A (en) * | 1997-12-30 | 2000-02-29 | O'brien; Wayne P. | Holographic direct manipulation interface |
US6429846B2 (en) * | 1998-06-23 | 2002-08-06 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
US6064354A (en) * | 1998-07-01 | 2000-05-16 | Deluca; Michael Joseph | Stereoscopic user interface method and apparatus |
US6373463B1 (en) * | 1998-10-14 | 2002-04-16 | Honeywell International Inc. | Cursor control system with tactile feedback |
US6842175B1 (en) * | 1999-04-22 | 2005-01-11 | Fraunhofer Usa, Inc. | Tools for interacting with virtual environments |
US6727924B1 (en) * | 2000-10-17 | 2004-04-27 | Novint Technologies, Inc. | Human-computer interface including efficient three-dimensional controls |
US20020175911A1 (en) * | 2001-05-22 | 2002-11-28 | Light John J. | Selecting a target object in three-dimensional space |
US7190365B2 (en) * | 2001-09-06 | 2007-03-13 | Schlumberger Technology Corporation | Method for navigating in a multi-scale three-dimensional scene |
US7324085B2 (en) * | 2002-01-25 | 2008-01-29 | Autodesk, Inc. | Techniques for pointing to locations within a volumetric display |
US6753847B2 (en) * | 2002-01-25 | 2004-06-22 | Silicon Graphics, Inc. | Three dimensional volumetric display input and output configurations |
GB0204652D0 (en) * | 2002-02-28 | 2002-04-10 | Koninkl Philips Electronics Nv | A method of providing a display gor a gui |
US6968511B1 (en) * | 2002-03-07 | 2005-11-22 | Microsoft Corporation | Graphical user interface, data structure and associated method for cluster-based document management |
JP2004199496A (ja) * | 2002-12-19 | 2004-07-15 | Sony Corp | 情報処理装置および方法、並びにプログラム |
JP2004334590A (ja) * | 2003-05-08 | 2004-11-25 | Denso Corp | 操作入力装置 |
JP4576131B2 (ja) * | 2004-02-19 | 2010-11-04 | パイオニア株式会社 | 立体的二次元画像表示装置及び立体的二次元画像表示方法 |
KR20050102803A (ko) * | 2004-04-23 | 2005-10-27 | 삼성전자주식회사 | 가상입력장치, 시스템 및 방법 |
JP2008506140A (ja) * | 2004-06-01 | 2008-02-28 | マイケル エー. ベセリー | 水平透視ディスプレイ |
US7348997B1 (en) * | 2004-07-21 | 2008-03-25 | United States Of America As Represented By The Secretary Of The Navy | Object selection in a computer-generated 3D environment |
JP2006053678A (ja) * | 2004-08-10 | 2006-02-23 | Toshiba Corp | ユニバーサルヒューマンインタフェースを有する電子機器 |
US20060267927A1 (en) * | 2005-05-27 | 2006-11-30 | Crenshaw James E | User interface controller method and apparatus for a handheld electronic device |
US20070064199A1 (en) * | 2005-09-19 | 2007-03-22 | Schindler Jon L | Projection display device |
US7834850B2 (en) * | 2005-11-29 | 2010-11-16 | Navisense | Method and system for object control |
JP4111231B2 (ja) * | 2006-07-14 | 2008-07-02 | 富士ゼロックス株式会社 | 立体表示システム |
US8384665B1 (en) * | 2006-07-14 | 2013-02-26 | Ailive, Inc. | Method and system for making a selection in 3D virtual environment |
JP4880693B2 (ja) * | 2006-10-02 | 2012-02-22 | パイオニア株式会社 | 画像表示装置 |
KR100851977B1 (ko) * | 2006-11-20 | 2008-08-12 | 삼성전자주식회사 | 가상 평면을 이용하여 전자 기기의 사용자 인터페이스를제어하는 방법 및 장치. |
US8726194B2 (en) * | 2007-07-27 | 2014-05-13 | Qualcomm Incorporated | Item selection using enhanced control |
CN101765798B (zh) * | 2007-07-30 | 2011-12-28 | 独立行政法人情报通信研究机构 | 多视点空中影像显示装置 |
RU71008U1 (ru) * | 2007-08-23 | 2008-02-20 | Дмитрий Анатольевич Орешин | Оптическая система объемного изображения |
JP5087632B2 (ja) * | 2007-10-01 | 2012-12-05 | パイオニア株式会社 | 画像表示装置 |
US20090112387A1 (en) * | 2007-10-30 | 2009-04-30 | Kabalkin Darin G | Unmanned Vehicle Control Station |
US8233206B2 (en) * | 2008-03-18 | 2012-07-31 | Zebra Imaging, Inc. | User interaction with holographic images |
JP4719929B2 (ja) * | 2009-03-31 | 2011-07-06 | Necカシオモバイルコミュニケーションズ株式会社 | 表示装置、および、プログラム |
US8896527B2 (en) * | 2009-04-07 | 2014-11-25 | Samsung Electronics Co., Ltd. | Multi-resolution pointing system |
US8760391B2 (en) * | 2009-05-22 | 2014-06-24 | Robert W. Hawkins | Input cueing emersion system and method |
JP5614014B2 (ja) * | 2009-09-04 | 2014-10-29 | ソニー株式会社 | 情報処理装置、表示制御方法及び表示制御プログラム |
US8970478B2 (en) * | 2009-10-14 | 2015-03-03 | Nokia Corporation | Autostereoscopic rendering and display apparatus |
US20110205185A1 (en) * | 2009-12-04 | 2011-08-25 | John David Newton | Sensor Methods and Systems for Position Detection |
KR101114750B1 (ko) * | 2010-01-29 | 2012-03-05 | 주식회사 팬택 | 다차원 영상을 이용한 사용자 인터페이스 장치 |
US9693039B2 (en) * | 2010-05-27 | 2017-06-27 | Nintendo Co., Ltd. | Hand-held electronic device |
US20120005624A1 (en) * | 2010-07-02 | 2012-01-05 | Vesely Michael A | User Interface Elements for Use within a Three Dimensional Scene |
US8643569B2 (en) * | 2010-07-14 | 2014-02-04 | Zspace, Inc. | Tools for use within a three dimensional scene |
US8970484B2 (en) * | 2010-07-23 | 2015-03-03 | Nec Corporation | Three dimensional display device and three dimensional display method |
US20120069143A1 (en) * | 2010-09-20 | 2012-03-22 | Joseph Yao Hua Chu | Object tracking and highlighting in stereoscopic images |
US8836755B2 (en) * | 2010-10-04 | 2014-09-16 | Disney Enterprises, Inc. | Two dimensional media combiner for creating three dimensional displays |
US9001053B2 (en) * | 2010-10-28 | 2015-04-07 | Honeywell International Inc. | Display system for controlling a selector symbol within an image |
US20120113223A1 (en) * | 2010-11-05 | 2012-05-10 | Microsoft Corporation | User Interaction in Augmented Reality |
JP5671349B2 (ja) * | 2011-01-06 | 2015-02-18 | 任天堂株式会社 | 画像処理プログラム、画像処理装置、画像処理システム、および画像処理方法 |
US8319746B1 (en) * | 2011-07-22 | 2012-11-27 | Google Inc. | Systems and methods for removing electrical noise from a touchpad signal |
- 2011
- 2011-09-08 DE DE102011112618A patent/DE102011112618A1/de active Pending
- 2012
- 2012-09-06 KR KR1020147006702A patent/KR20140071365A/ko not_active Application Discontinuation
- 2012-09-06 RU RU2014113395/08A patent/RU2604430C2/ru active
- 2012-09-06 WO PCT/DE2012/000892 patent/WO2013034133A1/de active Application Filing
- 2012-09-06 US US14/343,440 patent/US20140282267A1/en not_active Abandoned
- 2012-09-06 EP EP12780399.7A patent/EP2753951A1/de not_active Ceased
- 2012-09-06 CA CA2847425A patent/CA2847425C/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6377229B1 (en) * | 1998-04-20 | 2002-04-23 | Dimensional Media Associates, Inc. | Multi-planar volumetric display system and method of operation using three-dimensional anti-aliasing |
EP1667088A1 (de) * | 2004-11-30 | 2006-06-07 | Oculus Info Inc. | System und Verfahren für interaktive 3D-Luftregionen |
US20070035511A1 (en) * | 2005-01-25 | 2007-02-15 | The Board Of Trustees Of The University Of Illinois. | Compact haptic and augmented virtual reality system |
Non-Patent Citations (9)
Title |
---|
ANTONIO MONTELEONE ET AL: "THE AD4 SYSTEM INFRASTRUCTURE: INTEGRATING SIMULATION PLATFORMS AND 3D TECHNOLOGIES", EUROCONTROL 5TH INNOVATIVE RESEARCH WORKSHOP, 7 December 2006 (2006-12-07), pages 195 - 196, XP055046814, Retrieved from the Internet <URL:http://www.eurocontrol.int/eec/gallery/content/public/documents/newsletter/2007/issue_1/Workshop%20EEC-INO%202006%5B1%5D.pdf> [retrieved on 20121205] * |
DANG N T ET AL: "A comparison of different input devices for a 3D environment", INTERNATIONAL JOURNAL OF INDUSTRIAL ERGONOMICS, ELSEVIER, AMSTERDAM, NL, vol. 39, no. 3, 1 May 2009 (2009-05-01), pages 554 - 563, XP026035041, ISSN: 0169-8141, [retrieved on 20081216], DOI: 10.1016/J.ERGON.2008.10.016 * |
F PERSIANI ET AL: "A SEMI-IMMERSIVE SYNTHETIC ENVIRONMENT FOR COOPERATIVE AIR TRAFFIC CONTROL", 22ND INTERNATIONAL CONGRESS OF AERONAUTICAL SCIENCES, 1 September 2000 (2000-09-01), XP055046833, Retrieved from the Internet <URL:http://www.diem.ing.unibo.it/personale/liverani/Research/Papers/19.pdf> [retrieved on 20121205] * |
F PERSIANI ET AL: "A Virtual Reality based visualization and interaction tool for Air Traffic Control", INTERNATIONAL CONFERENCE ON DERSIGN TOOLS AND METHODS IN INDUSTRIAL ENGINEERING, 12 December 1999 (1999-12-12), XP055046832, Retrieved from the Internet <URL:http://diem1.ing.unibo.it/personale/liverani/Research/Papers../14.pdf> [retrieved on 20121205] * |
MARC BOURGOIS ET AL: "Interactive And Immersive 3D Visualization For ATC", 6TH USA/EUROPE AIR TRAFFIC MANAGEMENT RESEARCH AND DEVELOPMENT SEMINAR, BALTIMORE, MARYLAND, USA 27-30 JUNE 2005., 30 June 2005 (2005-06-30), XP055046764, Retrieved from the Internet <URL:http://www.atmseminar.org/seminarContent/seminar6/papers/p_040_IAC.pdf> [retrieved on 20121205] * |
MARCUS LANGE ET AL: "3D Visualization and 3D and Voice Interaction in Air Traffic Management", THE ANNUAL SIGRAD CONFERENCE SPECIAL THEME - REAL-TIME SIMULATIONS NOVEMBER 20-21, 2003, 20 November 2003 (2003-11-20), pages 17 - 22, XP055046754, ISBN: 978-9-17-373797-5, Retrieved from the Internet <URL:http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.146.4576&rep=rep1&type=pdf#page=25> [retrieved on 20121205] * |
MATT COOPER: "Interactive and Immersive 3D Visualization for ATC", 6TH USA/EUROPE AIR TRAFFIC MANAGEMENT RESEARCH AND DEVELOPMENT SEMINAR, BALTIMORE, MARYLAND, USA 27-30 JUNE 2005., 30 June 2005 (2005-06-30), XP055046765, Retrieved from the Internet <URL:http://www.atmseminarus.org/seminarContent/seminar6/presentations/pr_040_IAC.pdf> [retrieved on 20121205] * |
NGUYEN THONG DANG: "A STEREOSCOPIC 3D VISUALIZATION ENVIRONMENT FOR AIR TRAFFIC CONTROL AN ANALYSIS OF INTERACTION AND A PROPOSAL OF NEW INTERACTION TECHNIQUES", PH.D. THESIS, 31 December 2005 (2005-12-31), XP055046789, Retrieved from the Internet <URL:http://www.eurocontrol.int/eec/gallery/content/public/documents/PhD_theses/2005/Ph.D_Thesis_2005_Dang_T.pdf> [retrieved on 20121205] * |
RONALD AZUMA ET AL: "Advanced Human-Computer Interfaces for Air Traffic Management and Simulation", PROC. OF 1996 AIAA FLIGHT SIMULATION TECHNOLOGIES CONFERENCE (SAN DIEGO, CA, 29-31 JULY 1996), 31 July 1996 (1996-07-31), pages 656 - 666, XP055046768, Retrieved from the Internet <URL:http://www.ronaldazuma.com/papers/AIAA.pdf> [retrieved on 20121205] * |
Also Published As
Publication number | Publication date |
---|---|
EP2753951A1 (de) | 2014-07-16 |
CA2847425C (en) | 2020-04-14 |
DE102011112618A1 (de) | 2013-03-14 |
RU2014113395A (ru) | 2015-10-20 |
CA2847425A1 (en) | 2013-03-14 |
RU2604430C2 (ru) | 2016-12-10 |
KR20140071365A (ko) | 2014-06-11 |
US20140282267A1 (en) | 2014-09-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2754298B1 (de) | Auswahl von objekten in einem dreidimensionalen virtuellen szenario | |
DE102019002898A1 (de) | Robotorsimulationsvorrichtung | |
EP3067874A1 (de) | Verfahren und vorrichtung zum testen eines in einem luftfahrzeug zu bedienenden geräts | |
EP3458939B1 (de) | Interaktionssystem und -verfahren | |
DE60302063T2 (de) | Graphische benutzeroberfläche für einen flugsimulator basierend auf einer client-server-architektur | |
DE69631947T2 (de) | Positionierung eines Eingabezeigers | |
DE112017005059T5 (de) | System und verfahren zum projizieren graphischer objekte | |
EP3709133B1 (de) | System zur haptischen interaktion mit virtuellen objekten für anwendungen in der virtuellen realität | |
WO2013034133A1 (de) | Interaktion mit einem dreidimensionalen virtuellen szenario | |
DE102014006776A1 (de) | Bedienvorrichtung für ein elektronisches Gerät | |
EP3507681A1 (de) | Verfahren zur interaktion mit bildinhalten, die auf einer anzeigevorrichtung in einem fahrzeug dargestellt werden | |
WO2020126240A1 (de) | Verfahren zum betreiben eines feldgeräts der automatisierungstechnik in einer augmented-reality/mixed-reality-umgebung | |
DE102021122362A1 (de) | Wechsel zwischen zuständen in einer hybriden virtual-reality-desktop-rechenumgebung | |
DE19704677A1 (de) | Verfahren und Einrichtung zum automatischen Erzeugen und Manipulieren eines dynamischen Kompaß-Cursors | |
EP1665023B1 (de) | Verfahren und vorrichtung zur steuerung eines graphiksystems der virtuellen realität mittels interaktionen | |
WO2005029302A2 (de) | Räumliche benutzungsschnittstelle zur steuerung eines graphiksystems der virtuellen realität mittels einer funktionsauswahl | |
DE112019002798T5 (de) | Informationsverarbeitungsvorrichtung, informationsverabeitungsverfahren und programm | |
EP2764698A2 (de) | Kooperativer 3d-arbeitsplatz | |
DE102019125075A1 (de) | Verfahren zur computerimplementierten Simulation eines LIDAR-Sensors in einer virtuellen Umgebung | |
EP3534240A1 (de) | Verfahren und vorrichtung zur daten-annotation | |
DE102018212944A1 (de) | Verfahren zur Unterstützung der Kollaboration zwischen Mensch und Roboter mittels einer Datenbrille | |
DE102011112620B3 (de) | Abgewinkeltes Display zur dreidimensionalen Darstellung eines Szenarios | |
DE112019003579T5 (de) | Informationsverarbeitungseinrichtung, programm undinformationsverarbeitungsverfahren | |
DE102019131740A1 (de) | Verfahren und Anzeigevorrichtung zur Erzeugung eines Tiefeneffekts in der Perspektive eines Beobachters auf einem flachen Anzeigemedium sowie Kraftfahrzeug | |
DE102019125612A1 (de) | Verfahren zur computerimplementierten Simulation eines optischen Sensors in einer virtuellen Umgebung |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12780399; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2847425; Country of ref document: CA
| ENP | Entry into the national phase | Ref document number: 20147006702; Country of ref document: KR; Kind code of ref document: A
| ENP | Entry into the national phase | Ref document number: 2014113395; Country of ref document: RU; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 14343440; Country of ref document: US