WO2013021458A1 - Mixed reality apparatus
Mixed reality apparatus
- Publication number
- WO2013021458A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- calibration
- pressure distribution
- calibration data
- data
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the present invention relates to the technical field of an optical transmission type mixed reality apparatus, for example.
- The following abbreviations are used: MR (mixed reality), CG (computer graphics), and AR (augmented reality).
- A video transmission type (video see-through) mixed reality apparatus synthesizes CG with an image of the real environment photographed by a camera attached to a head mounted display (HMD), and then displays the CG-synthesized image on the HMD.
- An optical transmission type (optical see-through) mixed reality apparatus detects a specific position of the real environment (for example, a marker position) based on an image photographed by a camera attached to the HMD, and displays CG on the transmissive display so that it appears at the detected specific position.
- Non-Patent Document 1 discloses a technique related to calibration of the display position of information in the HMD for an optical transmission type mixed reality apparatus.
- Patent Document 1 discloses a technique for detecting whether or not an HMD is worn and switching the power supply of the HMD on and off accordingly.
- The present invention has been made in view of, for example, the conventional problems described above, and an object of the present invention is to provide a mixed reality apparatus that can suitably realize mixed reality.
- To solve the above problems, the mixed reality apparatus of the present invention comprises: a head mounted display having a mounting part to be mounted on the user's head, specific position detecting means for detecting a specific position of the real environment, and a display unit for displaying additional information to be added to the real environment; mounting state detecting means for detecting a mounting state of the mounting part; and update processing means for performing an update process of calibration data, used for conversion from the coordinate system of the specific position detecting means to the coordinate system of the display unit, in accordance with the mounting state detected by the mounting state detecting means.
- FIG. 1 is a first external view showing the schematic configuration of the mixed reality apparatus according to the first embodiment.
- FIG. 2 is a second external view showing the schematic configuration of the mixed reality apparatus according to the first embodiment.
- FIG. 3 is a block diagram showing the configuration of the mixed reality apparatus according to the first embodiment.
- FIG. 4 is a block diagram showing the configuration of the DB control unit according to the first embodiment.
- FIG. 5 is a block diagram showing the configuration of the calibration unit according to the first embodiment.
- FIG. 6 is a block diagram showing the configuration of the transformation matrix calculation unit according to the first embodiment.
- FIG. 7 is a block diagram showing the configuration of the rendering unit according to the first embodiment.
- As described above, the mixed reality apparatus of the present invention comprises: a head mounted display having a mounting part to be mounted on the user's head, specific position detecting means for detecting a specific position of the real environment, and a display unit for displaying additional information to be added to the real environment; mounting state detecting means for detecting a mounting state of the mounting part; and update processing means for performing an update process of calibration data, used for conversion from the coordinate system of the specific position detecting means to the coordinate system of the display unit, in accordance with the mounting state detected by the mounting state detecting means.
- The mixed reality apparatus of the present invention is an optical transmission type mixed reality apparatus: it is used with the head mounted display (more specifically, its mounting part) worn on the user's head, and it realizes mixed reality by displaying additional information on a transparent display unit.
- That is, in operation, the specific position detecting means detects a specific position and posture (orientation) of the real environment, for example the position and orientation at which a marker is arranged, or the position at which a part having a specific shape exists.
- The specific position detecting means includes, for example, an imaging unit such as a camera, and detects the specific position based on an image of the real environment photographed by the imaging unit.
- The specific position detecting means may include, for example, a magnetic sensor, an ultrasonic sensor, a gyroscope, an acceleration sensor, an angular velocity sensor, a GPS (Global Positioning System) receiver, a wireless communication device, or the like, instead of or in addition to the imaging unit.
- Additional information such as CG or text is then displayed on the display unit at a position corresponding to the detected specific position. As a result, mixed reality can be realized as if additional information that does not actually exist in the real environment were present in it.
- In particular, the present invention includes update processing means that performs an update process of the calibration data used for the conversion from the coordinate system of the specific position detecting means to the coordinate system of the display unit, in accordance with the mounting state detected by the mounting state detecting means.
- When the mounting state of the mounting part changes, the positional relationship between the user's eyes and the display unit changes, so the display position that should correspond to the specific position also changes. For this reason, if no measures were taken, it could become difficult to realize mixed reality once the mounting state of the mounting part changes.
- To address this, the update processing means updates the calibration data in accordance with the mounting state detected by the mounting state detecting means.
- The mounting state detecting means includes, for example, a pressure distribution sensor that is provided in the mounting part of the head mounted display and detects the distribution of pressure applied from the user's head, and detects the mounting state based on the pressure distribution detected by the pressure distribution sensor.
- The "calibration data update process" is a process related to updating the calibration data; it includes, for example, a process of updating the calibration data based on calibration data stored in a database (that is, an automatic update process) and a process of notifying the user that the calibration data should be updated (that is, a process of prompting recalibration).
- By performing the calibration data update process (the automatic update process or the recalibration prompting process) in accordance with the mounting state, mixed reality can be suitably realized, as outlined in the sketch below.
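The two update-process variants described above can be summarized in a short sketch. This is an illustrative outline only, not the patent's implementation; the threshold value and the helper names are assumptions.

```python
def find_matching_state(current, pressure_db, threshold=1.0):
    """Return the state No. whose stored pressure distribution matches `current`.

    pressure_db maps state No. -> stored pressure distribution (sequence of
    floats); `threshold` is an illustrative value, not taken from the patent."""
    for state_no, stored in pressure_db.items():
        q = sum((x - y) ** 2 for x, y in zip(current, stored))  # distance measure
        if q <= threshold:
            return state_no
    return None

def update_calibration(current, pressure_db, calibration_db):
    """Dispatch between the two update-process variants described above."""
    state_no = find_matching_state(current, pressure_db)
    if state_no is not None:
        # Automatic update: reuse the calibration data stored for this state.
        return calibration_db[state_no]
    # Otherwise prompt the user to recalibrate (the notification variant).
    print("Calibration data should be updated; please recalibrate.")
    return None
```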
- In one aspect of the mixed reality apparatus of the present invention, the mounting state detecting means includes a pressure distribution sensor that is provided in the mounting part and detects the distribution of pressure applied from the head, and detects the mounting state based on the pressure distribution detected by the pressure distribution sensor.
- Alternatively, the mounting state detecting means may have, for example, a camera or a distance sensor attached to the head mounted display facing inward (that is, toward the user), and may detect the mounting state based on an image photographed by this camera or a distance measured by the distance sensor.
- In another aspect, the mounting state detecting means further includes motion detecting means for detecting a motion of the mounting part, and detects the mounting state based on the pressure distribution detected by the pressure distribution sensor and the motion detected by the motion detecting means.
- In this aspect, the motion detecting means detects, for example, the motion of the mounting part (for example, the speed or acceleration at which the mounting part moves, or the distance it has moved).
- When the head mounted display is accelerating, an inertial force is applied to it, so the pressure distribution differs from that at rest. For this reason, while the head mounted display is accelerating, it could be erroneously detected that the mounting state has changed.
- However, since the mounting state is detected based on both the pressure distribution detected by the pressure distribution sensor and the motion detected by the motion detecting means, such erroneous detection of the mounting state can be prevented.
- For example, when the motion detected by the motion detecting means is large, the mounting state detecting means increases the threshold value that serves as the criterion for determining that the mounting state has changed.
- Further, when the acceleration detected by the motion detecting means is equal to or higher than a predetermined acceleration, the mounting state detection by the mounting state detecting means may not be performed correctly; in such a case, the detection may be stopped and the calibration data update process may not be performed. A sketch of this motion-aware gating follows.
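As a concrete illustration of the motion-aware detection described above, the following sketch raises the match threshold under acceleration and suspends detection above a limit. All numeric values are hypothetical, since the patent describes this behavior only qualitatively.

```python
def wearing_state_changed(q, accel, base_threshold=1.0,
                          raised_threshold=3.0, accel_limit=2.0, suspend_limit=8.0):
    """Decide whether the mounting state changed, given the pressure distance Q
    and the acceleration of the mounting part (units and limits illustrative).

    Returns None when detection is suspended (acceleration too high); otherwise
    compares Q against a threshold that is raised while accelerating."""
    if accel >= suspend_limit:
        return None  # detection stopped; no calibration data update is performed
    threshold = raised_threshold if accel >= accel_limit else base_threshold
    return q > threshold
```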
- In another aspect, the apparatus further comprises calibration data storage means for storing the calibration data, and the update processing means performs, as the update process, a process of updating the calibration data based on the calibration data stored in the calibration data storage means.
- Since the calibration data is updated based on the calibration data stored in the calibration data storage means, the work the user must perform to update the calibration data is reduced, which is very convenient in practice.
- In another aspect, the update processing means performs, as the update process, a process of notifying the user that the calibration data should be updated.
- In this way the user can know that the calibration data should be updated, and mixed reality can be more suitably realized by updating the calibration data in accordance with the user's instruction.
- FIGS. 1 and 2 are external views showing the schematic configuration of the mixed reality apparatus 1 according to the present embodiment.
- The mixed reality apparatus 1 is an optical transmission type mixed reality apparatus, and includes a head mounted display 100 (hereinafter referred to as the "HMD 100" as appropriate) including a mounting unit 110, an imaging unit 120, and a display unit 130.
- The user uses the mixed reality apparatus 1 with the HMD 100 worn on the head.
- The mixed reality apparatus 1 realizes mixed reality by causing the display unit 130 to display CG, as an example of the "additional information" according to the present invention, so as to correspond to the position of a marker provided in the real environment.
- The HMD 100 is an example of the "head mounted display" according to the present invention.
- The mounting unit 110 is a glasses-frame-shaped member configured to be mounted on the user's head and to hold the head from both the left and right sides.
- The mounting unit 110 is an example of the "mounting part" according to the present invention.
- The imaging unit 120 includes a camera and photographs the real environment in front of the user while the user wears the HMD 100.
- The imaging unit 120 is provided between the two display units 130 arranged side by side.
- The imaging unit 120, together with a marker detection unit 231 described later, constitutes an example of the "specific position detecting means" according to the present invention.
- The position of the marker is detected based on the image captured by the imaging unit 120.
- Instead of or in addition to the imaging unit 120 including the camera, a magnetic sensor, an ultrasonic sensor, a gyroscope, an acceleration sensor, an angular velocity sensor, a GPS receiver, a wireless communication device, or the like may be used to detect the marker position.
- The display unit 130 is a display device having optical transparency, and one display unit 130 is provided for each of the user's left and right eyes. By looking at the CG displayed on the display unit 130 while viewing the real environment through the display unit 130, the user can feel as if CG that does not exist in the real environment were present in the real environment.
- The display unit 130 is an example of the "display unit" according to the present invention. Since the display unit 130 is provided integrally with the mounting unit 110, the positional relationship between the display unit 130 and the mounting unit 110 does not change even if the mounting state of the mounting unit 110 changes.
- In the present embodiment, the pressure distribution sensor 140 is provided in the part of the mounting unit 110 that comes into contact with the user.
- The pressure distribution sensor 140 is a sensor that detects the distribution of pressure applied to the mounting unit 110 from the user's head, and outputs its detection value to the DB control unit 210 described later with reference to FIG. 3.
- The pressure distribution sensor 140 constitutes an example of the "mounting state detecting means" according to the present invention.
- The distribution of pressure applied to the mounting unit 110 from the user's head varies with the mounting state of the mounting unit 110; the detection value of the pressure distribution sensor 140 therefore corresponds to the mounting state of the mounting unit 110.
- FIG. 3 is a block diagram showing the configuration of the mixed reality apparatus 1.
- As shown in FIG. 3, the mixed reality apparatus 1 includes, in addition to the imaging unit 120, the display unit 130, and the pressure distribution sensor 140 described above with reference to FIGS. 1 and 2, a button 150, a DB (database) control unit 210, a calibration unit 220, a transformation matrix calculation unit 230, a rendering unit 240, and a selector (SEL) 250.
- The button 150 is a user interface (UI) button for calibration. When calibration processing for calibrating the display position of CG on the display unit 130 is performed, the user presses the button 150 to output a coincidence signal indicating that the calibration image (for example, a cross-shaped image) displayed on the display unit 130 and the marker in the real environment appear to coincide.
- The coincidence signal output from the button 150 is input to the calibration unit 220 described later.
- That is, when the calibration image and the marker appear to coincide, the user uses the button 150 to inform the calibration unit 220 of that fact.
- FIG. 4 is a block diagram showing the configuration of the DB control unit 210.
- The DB control unit 210 includes a pressure distribution database 211, a calibration data database 212, a pressure distribution comparison unit 213, and a DB write control unit 214.
- The pressure distribution database 211 is a database that stores detection values (detected pressure distributions) of the pressure distribution sensor 140 in association with state numbers (state No.). A detection value of the pressure distribution sensor 140 and its state No. are written into the pressure distribution database 211 by the DB write control unit 214 described later.
- The pressure distribution database 211 stores the detection values and state numbers separately for each user; that is, the data stored in the pressure distribution database 211 is managed per user. The same applies to the calibration data database 212 described later.
- Hereinafter, the current detection value of the pressure distribution sensor 140 is referred to as the detection value Pa as appropriate.
- The calibration data database 212 is an example of the "calibration data storage means" according to the present invention, and is a database in which calibration data calculated by the calibration unit 220 is stored in association with state numbers.
- Like the pressure distribution database 211, the calibration data database 212 stores the calibration data and state numbers separately for each user.
- Hereinafter, calibration data newly calculated by the calibration unit 220 is referred to as calibration data Ma as appropriate.
- The pressure distribution comparison unit 213 compares the current detection value Pa of the pressure distribution sensor 140 with the detection values stored in the pressure distribution database 211 and determines whether or not they match. When a detection value that matches the current detection value Pa exists among the detection values stored in the pressure distribution database 211, the pressure distribution comparison unit 213 identifies the state No. corresponding to the matching detection value, so that the calibration data stored under that state No. can be used.
- When no detection value stored in the pressure distribution database 211 matches the current detection value Pa of the pressure distribution sensor 140, the pressure distribution comparison unit 213 outputs a calibration start trigger, indicating that the calibration process should be started, to the calibration unit 220. The pressure distribution comparison unit 213 also outputs the current detection value Pa of the pressure distribution sensor 140 to the DB write control unit 214.
- The pressure distribution comparison unit 213 calculates a value Q using the following equation (1) and determines, based on this value Q, whether or not the current detection value of the pressure distribution sensor 140 matches a detection value stored in the pressure distribution database 211.
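Equation (1) itself did not survive extraction. Given that the value Q is described below as corresponding to the distance between the two detection values, a plausible reconstruction (an assumption, not the verbatim formula of the publication) is a sum of squared differences:

$$Q = \sum_{i} \left( x_i - y_i \right)^2 \qquad \text{(1)}$$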
- Here, x_i denotes the current detection value of the pressure distribution sensor 140 and y_i denotes a detection value stored in the pressure distribution database 211, with i indexing the elements of the pressure distribution.
- The pressure distribution comparison unit 213 determines that the current detection value of the pressure distribution sensor 140 matches a detection value stored in the pressure distribution database 211 when the value Q is equal to or less than a predetermined threshold value. Since the value Q corresponds to the distance between the current detection value and the stored detection value, a sufficiently small Q indicates that the two pressure distributions are substantially the same.
- Alternatively, the match may be determined based on a correlation coefficient indicating the similarity between the current detection value of the pressure distribution sensor 140 and a detection value stored in the pressure distribution database 211 (that is, the similarity of the pressure distributions). In this case, even when the two detection values differ to some extent, they can be determined to match when the mounting state of the mounting unit 110 is substantially the same.
- Further, the detection value of the pressure distribution sensor 140 may be encoded (or quantized). In this case, by determining whether the current detection value matches a stored detection value on the basis of the encoded detection values, the determination process can be sped up, as in the sketch below.
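The three matching strategies mentioned above — the distance test of equation (1), the correlation-coefficient test, and matching on quantized values — can be sketched as follows. The thresholds and the quantization step are illustrative assumptions, not values from the patent.

```python
import math

def distance_match(x, y, threshold=1.0):
    # Equation (1)-style test: match when the (reconstructed) distance Q is small.
    q = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    return q <= threshold

def correlation_match(x, y, min_corr=0.95):
    # Alternative test: Pearson correlation of the two pressure distributions,
    # tolerant of differences in overall pressure magnitude.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sx = math.sqrt(sum((xi - mx) ** 2 for xi in x))
    sy = math.sqrt(sum((yi - my) ** 2 for yi in y))
    return (cov / (sx * sy)) >= min_corr if sx > 0 and sy > 0 else False

def quantize(x, step=0.5):
    # Coarse quantization of the sensor values; quantized vectors can be
    # compared by simple equality (or hashed), speeding up the determination.
    return tuple(round(xi / step) for xi in x)
```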
- When a calculation end signal is input from the calibration unit 220, the DB write control unit 214 writes the current detection value Pa of the pressure distribution sensor 140 into the pressure distribution database 211 and writes the calibration data Ma calculated by the calibration unit 220 into the calibration data database 212. At this time, the DB write control unit 214 writes the detection value Pa and the calibration data Ma into the pressure distribution database 211 and the calibration data database 212, respectively, under the same state No.
- That is, the detection value Pa is added to the pressure distribution database 211 with a new state No., and the calibration data Ma calculated by the calibration unit 220 when the detection value of the pressure distribution sensor 140 is the detection value Pa (in other words, the calibration data determined by performing the calibration process in that state) is added to the calibration data database 212 with the same state No. (in other words, in association with the detection value Pa).
- FIG. 5 is a block diagram showing the configuration of the calibration unit 220.
- The calibration unit 220 calculates calibration data by performing the calibration process when a calibration start trigger is input from the DB control unit 210 (more specifically, from the pressure distribution comparison unit 213).
- The calibration unit 220 includes a calibration control unit 221, a calibration coordinate generation unit 222, a calibration display generation unit 223, a calibration marker position detection unit 224, a data storage unit 225, and a calibration data calculation unit 226.
- The calibration control unit 221 controls the calibration process; specifically, it controls the operations of the calibration coordinate generation unit 222, the calibration marker position detection unit 224, and the calibration data calculation unit 226.
- The calibration control unit 221 starts the calibration process when a calibration start trigger is input from the DB control unit 210. For example, when a calibration start trigger has been input from the DB control unit 210, the calibration control unit 221 outputs a display update signal to the calibration coordinate generation unit 222 in response to the coincidence signal from the button 150, and also outputs a data addition trigger to the data storage unit 225.
- Further, when the coincidence signal from the button 150 has been input a predetermined number of times, the calibration control unit 221 outputs a calculation trigger to the calibration data calculation unit 226 and outputs a mode switching signal to the selector 250.
- The calibration data calculation unit 226 calculates the calibration data Ma when the calculation trigger is input.
- The selector 250, in response to the mode switching signal, switches the data output to the display unit 130 between the calibration image data and the display data.
- In the calibration process, the user positions the calibration plate provided with the calibration marker so that the calibration marker appears to coincide with the calibration image (for example, a cross-shaped image) displayed on the display unit 130, and then outputs the coincidence signal with the button 150.
- To achieve this alignment, the calibration plate may be moved, or the HMD 100 itself may be moved.
- The calibration process is not particularly limited; for example, calibration may be performed so that a two-dimensional object such as a quadrangle in the real environment matches a two-dimensional figure such as a quadrangle on the display unit 130, or between a three-dimensional object in the real environment and a two-dimensional display on the display unit 130.
- Alternatively, the calibration plate provided with the calibration marker may be kept fixed while the position, size, orientation, and the like of the calibration image displayed on the display unit 130 are changed, with coincidence between the calibration marker and the calibration image then being detected.
- When the display update signal is input from the calibration control unit 221, the calibration coordinate generation unit 222 generates coordinates (Xd, Yd) at which the calibration image is to be displayed on the display unit 130, and outputs the generated coordinates (Xd, Yd) to the calibration display generation unit 223 and the data storage unit 225.
- The calibration display generation unit 223 generates image data (hereinafter referred to as "calibration image data" as appropriate) of the calibration image (for example, a cross-shaped image) to be displayed at the coordinates (Xd, Yd) generated by the calibration coordinate generation unit 222, and outputs the calibration image data to the selector 250 (see FIG. 3).
- The calibration marker position detection unit 224 detects the position of the calibration marker from the image captured by the imaging unit 120. Specifically, it identifies coordinates (Xc, Yc, Zc) indicating the position of the calibration marker based on the image data input from the imaging unit 120, and outputs the identified coordinates (Xc, Yc, Zc) to the data storage unit 225.
- When a data addition trigger is input from the calibration control unit 221, the data storage unit 225 stores the coordinates (Xd, Yd) input from the calibration coordinate generation unit 222 and the coordinates (Xc, Yc, Zc) input from the calibration marker position detection unit 224 in association with each other.
- That is, the data storage unit 225 generates and holds a data list in which the coordinates (Xd, Yd), the marker position coordinates in the coordinate system of the display unit 130, are associated with the coordinates (Xc, Yc, Zc), the marker position coordinates in the coordinate system of the imaging unit 120.
- The calibration data calculation unit 226 calculates the calibration data Ma based on the coordinate pairs (Xd, Yd) and (Xc, Yc, Zc) stored in the data storage unit 225 (one possible computation is sketched below).
- The calibration data Ma is data for calibrating the relationship between the coordinate system of the imaging unit 120 and the coordinate system of the display unit 130.
- The rendering unit 240 (more specifically, the imaging-to-display conversion unit 243) described later with reference to FIG. 7 converts display data (CG data) from the coordinate system of the imaging unit 120 into the coordinate system of the display unit 130 (coordinate conversion and projection conversion) based on the calibration data.
- When the calibration data calculation unit 226 finishes calculating the calibration data Ma, it outputs a calculation end signal indicating this to the calibration control unit 221.
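The patent does not disclose how the calibration data Ma is computed from the accumulated (Xd, Yd)–(Xc, Yc, Zc) pairs. One standard approach for such 3D-to-2D calibration is to estimate a 3×4 projection matrix by direct linear transformation (DLT); the sketch below is an assumption about what the calculation could look like, not the document's own method.

```python
import numpy as np

def estimate_calibration(camera_pts, display_pts):
    """Estimate a 3x4 matrix M with [Xd, Yd, 1]^T ~ M [Xc, Yc, Zc, 1]^T.

    camera_pts:  (N, 3) marker positions in the imaging unit's coordinate system
    display_pts: (N, 2) corresponding calibration-image positions on the display
    Each correspondence gives two linear equations in the 12 entries of M; the
    homogeneous system is solved by SVD (at least 6 correspondences needed)."""
    rows = []
    for (xc, yc, zc), (xd, yd) in zip(camera_pts, display_pts):
        X = [xc, yc, zc, 1.0]
        rows.append(X + [0.0] * 4 + [-xd * v for v in X])
        rows.append([0.0] * 4 + X + [-yd * v for v in X])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    return vt[-1].reshape(3, 4)  # singular vector of the smallest singular value
```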
- FIG. 6 is a block diagram illustrating a configuration of the transformation matrix calculation unit 230.
- The transformation matrix calculation unit 230 includes a marker detection unit 231 and an Rmc calculation unit 232.
- The marker detection unit 231 detects the position and size of the marker in the image captured by the imaging unit 120.
- The Rmc calculation unit 232 calculates a transformation matrix Rmc for conversion from the marker coordinate system to the coordinate system of the imaging unit 120, based on the position and size of the marker detected by the marker detection unit 231.
- The Rmc calculation unit 232 outputs the calculated transformation matrix Rmc to the rendering unit 240.
- By using the transformation matrix Rmc, the CG can be displayed on the display unit 130 so as to follow the marker. A sketch of one possible pose computation follows.
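The document does not specify the marker-pose algorithm behind Rmc. A common way to obtain a marker-to-camera transform from detected marker corners is perspective-n-point estimation, for example with OpenCV, as in this hedged sketch; the marker size, corner ordering, and camera intrinsics are assumptions of the example.

```python
import numpy as np
import cv2

def compute_rmc(corners_px, marker_size, camera_matrix, dist_coeffs):
    """Sketch: 4x4 transform from the marker coordinate system to the imaging
    unit's coordinate system, from the four detected corner pixels.

    corners_px: (4, 2) detected corners, ordered to match the model points."""
    s = marker_size / 2.0
    object_pts = np.array([[-s,  s, 0.0], [ s,  s, 0.0],
                           [ s, -s, 0.0], [-s, -s, 0.0]])
    ok, rvec, tvec = cv2.solvePnP(object_pts,
                                  np.asarray(corners_px, dtype=np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("marker pose estimation failed")
    rot, _ = cv2.Rodrigues(rvec)        # rotation vector -> 3x3 rotation matrix
    rmc = np.eye(4)
    rmc[:3, :3], rmc[:3, 3] = rot, tvec.ravel()
    return rmc
```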
- FIG. 7 is a block diagram showing a configuration of the rendering unit 240.
- The rendering unit 240 performs rendering of the CG to be displayed on the display unit 130.
- The rendering unit 240 includes a CG data storage unit 241, a marker-to-imaging coordinate conversion unit 242, and an imaging-to-display conversion unit 243.
- The CG data storage unit 241 is a storage unit in which the CG data to be displayed on the display unit 130 is stored.
- The CG data storage unit 241 stores the CG data in the marker coordinate system.
- The CG data stored in the CG data storage unit 241 is three-dimensional (3D) data.
- Hereinafter, the CG data stored in the CG data storage unit 241 is referred to as "marker coordinate system data" as appropriate.
- The marker-to-imaging coordinate conversion unit 242 converts the CG data stored in the CG data storage unit 241 from the marker coordinate system into the coordinate system of the imaging unit 120, based on the transformation matrix Rmc input from the transformation matrix calculation unit 230.
- Hereinafter, the CG data in the coordinate system of the imaging unit 120 after conversion by the marker-to-imaging coordinate conversion unit 242 is referred to as "imaging coordinate system data" as appropriate.
- The imaging-to-display conversion unit 243 converts the imaging coordinate system data input from the marker-to-imaging coordinate conversion unit 242 into display data (coordinate conversion and projection conversion) based on the calibration data Mx input from the calibration data database 212.
- The display data is two-dimensional (2D) data in the coordinate system of the display unit 130.
- The imaging-to-display conversion unit 243 outputs the display data to the selector 250 (see FIG. 3).
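Chaining the two conversions, the rendering data flow (marker coordinate system → imaging coordinate system via Rmc → display coordinates via the calibration data) might look like the following sketch, assuming homogeneous coordinates, a 4×4 Rmc, and a 3×4 calibration matrix as in the earlier sketches.

```python
import numpy as np

def render_points(marker_pts, rmc, calib):
    """marker_pts: (N, 3) CG vertices in the marker coordinate system.
    rmc: 4x4 marker-to-imaging transform.  calib: 3x4 calibration data.
    Returns (N, 2) display coordinates (the 2D display data)."""
    pts_h = np.hstack([marker_pts, np.ones((len(marker_pts), 1))])  # homogeneous
    imaging = rmc @ pts_h.T            # "imaging coordinate system data"
    display_h = calib @ imaging        # coordinate conversion and projection
    return (display_h[:2] / display_h[2]).T  # perspective divide -> 2D
```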
- The selector 250 selectively outputs either the calibration image data input from the calibration unit 220 or the display data input from the rendering unit 240 to the display unit 130.
- The selector 250 outputs the calibration image data to the display unit 130 when the calibration process is performed, and outputs the display data to the display unit 130 when CG is to be displayed on the display unit 130.
- The display unit 130 displays the calibration image (for example, a cross-shaped image) based on the calibration image data, and displays the CG based on the display data.
- FIG. 8 is a flowchart showing the operation flow of the mixed reality apparatus 1.
- First, an image of the real environment is acquired by the imaging unit 120 (step S10). That is, the mixed reality apparatus 1 acquires an image of the real environment by photographing it with the imaging unit 120.
- Next, the marker is detected and the transformation matrix Rmc is calculated by the transformation matrix calculation unit 230 (step S20). That is, the marker detection unit 231 of the transformation matrix calculation unit 230 detects the position, posture (orientation), and size of the marker provided in the real environment based on the real-environment image acquired by the imaging unit 120, and the Rmc calculation unit 232 of the transformation matrix calculation unit 230 calculates the transformation matrix Rmc based on the detected position, posture (orientation), and size of the marker.
- Next, a pressure-distribution-linked calibration process is performed (step S30).
- FIG. 9 is a flowchart showing the flow of the pressure-distribution-linked calibration process.
- First, the current detection value Pa of the pressure distribution sensor 140 is acquired by the pressure distribution comparison unit 213 (see FIG. 4) (step S310).
- The pressure distribution comparison unit 213 then compares the acquired detection value Pa with the detection value Px of the pressure distribution sensor 140 at the time the calibration data Mx currently in use was calculated (step S320).
- The pressure distribution comparison unit 213 determines whether or not the current detection value Pa matches the detection value Px (step S330).
- If it is determined that the detection value Pa and the detection value Px match (step S330: Yes), the calibration data Mx and the detection value Px are held as they are (step S375).
- In this case, the mounting state of the HMD 100 (more specifically, of the mounting unit 110) has changed little or not at all between the time the calibration data Mx was calculated and the present, so the positional relationship between the user's eyes and the display unit 130 has also changed little or not at all. Accordingly, mixed reality can be suitably realized by having the imaging-to-display conversion unit 243 (see FIG. 7) convert the imaging coordinate system data into display data based on the calibration data Mx currently in use.
- If it is determined that the detection value Pa and the detection value Px do not match (step S330: No), the pressure distribution comparison unit 213 compares the current detection value Pa of the pressure distribution sensor 140 with the detection values Pni stored in the pressure distribution database 211 (step S340).
- The pressure distribution comparison unit 213 then determines whether or not the current detection value Pa matches any detection value Pni stored in the pressure distribution database 211 (step S350); in other words, it determines whether there is a detection value Pni that matches the detection value Pa among the detection values stored in the pressure distribution database 211.
- If it is determined that the detection value Pa matches a detection value Pni (step S350: Yes), the calibration data Mx currently in use is changed to the calibration data Mni corresponding to the detection value Pni, and the detection value Px is changed to the detection value Pni (step S385).
- Here, the calibration data Mni is calibration data that was calculated by the calibration unit 220 when the detection value of the pressure distribution sensor 140 was the detection value Pni, and that is stored in the calibration data database 212 under the same state No. as the detection value Pni.
- In this case, the mounting state of the HMD 100 (more specifically, of the mounting unit 110) is almost or completely the same now as it was when the calibration data Mni was calculated.
- Accordingly, mixed reality can be suitably realized by having the imaging-to-display conversion unit 243 (see FIG. 7) convert the imaging coordinate system data into display data based on the calibration data Mni.
- When it is determined that the detection value Pa does not match any detection value Pni (step S350: No), the calibration process is performed and calibration data Ma is acquired (step S360). That is, in this case, the calibration process is performed by the calibration unit 220 and new calibration data Ma is calculated.
- Next, the calibration data Ma is added to the calibration data database 212, and the detection value Pa is added to the pressure distribution database 211 (step S370). That is, the calibration data Ma newly calculated by the calibration data calculation unit 226 (see FIG. 5) of the calibration unit 220 is input to the DB write control unit 214 of the DB control unit 210 and written into the calibration data database 212 by the DB write control unit 214.
- Similarly, the current detection value Pa of the pressure distribution sensor 140 is written into the pressure distribution database 211 by the DB write control unit 214. That is, in this embodiment, when no detection value matching the current detection value Pa is stored in the pressure distribution database 211, new calibration data Ma is obtained by performing a new calibration process, and the detection value Pa and the calibration data Ma are added to the pressure distribution database 211 and the calibration data database 212, respectively.
- Then, the calibration data Mx currently in use is changed to the new calibration data Ma corresponding to the current detection value Pa, and the detection value Px is changed to the detection value Pa (step S380).
- In summary: when the current detection value Pa matches the detection value Px corresponding to the calibration data Mx in use, the calibration data is maintained as it is; when Pa does not match Px but matches a detection value Pni stored in the pressure distribution database 211, the calibration data is changed to the calibration data Mni corresponding to the detection value Pni; and when Pa matches neither Px nor any stored detection value, new calibration data Ma is calculated by performing a new calibration process, and the detection value Pa and the calibration data Ma are added to the pressure distribution database 211 and the calibration data database 212, respectively. A condensed sketch of this flow follows.
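The branching of steps S310 through S385 can be condensed into the following sketch. The matching test reuses the distance-style comparison from the earlier sketch, and `run_calibration` stands in for the calibration process of FIG. 5; the names and threshold are assumptions.

```python
def matches(x, y, threshold=1.0):
    # Same distance-style test as in the comparison sketch; threshold illustrative.
    return sum((xi - yi) ** 2 for xi, yi in zip(x, y)) <= threshold

def pressure_linked_update(pa, px, mx, pressure_db, calib_db, run_calibration):
    """One pass of the FIG. 9 flow.  pa: current detection value (S310);
    (px, mx): detection value and calibration data currently in use;
    run_calibration: callable performing the calibration process of FIG. 5.
    Returns the (possibly updated) pair (px, mx)."""
    if matches(pa, px):                                   # S320/S330
        return px, mx                                     # hold Mx and Px (S375)
    for state_no, pni in pressure_db.items():             # S340/S350
        if matches(pa, pni):
            return pni, calib_db[state_no]                # switch to Mni (S385)
    ma = run_calibration()                                # S360: new calibration
    state_no = max(pressure_db, default=0) + 1            # new state No.
    pressure_db[state_no], calib_db[state_no] = pa, ma    # S370: write to DBs
    return pa, ma                                         # S380: use Ma and Pa
```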
- Next, a drawing process for generating display data of the CG to be displayed on the display unit 130 is performed (step S40).
- Specifically, the marker coordinate system data stored in the CG data storage unit 241 is converted into imaging coordinate system data by the marker-to-imaging coordinate conversion unit 242 based on the transformation matrix Rmc.
- Then, the imaging coordinate system data is converted into display data by the imaging-to-display conversion unit 243 based on the calibration data Mx.
- The display data generated in this way is input to the display unit 130 via the selector 250 (see FIG. 3).
- Then, CG based on the display data is displayed on the display unit 130 (step S50).
- Next, it is determined whether or not to end the display of the CG on the display unit 130 (step S60).
- If it is determined to end the display (step S60: Yes), the CG display ends.
- If it is determined not to end the display (step S60: No), the process returns to step S10.
- As described above, in the pressure-distribution-linked calibration process (step S30) of this embodiment, when the current detection value Pa matches a detection value Pni stored in the pressure distribution database 211, the calibration data Mx is changed to the calibration data Mni corresponding to the detection value Pni (step S385). Therefore, the imaging-to-display conversion unit 243 can convert the imaging coordinate system data into display data based on the calibration data Mni suited to the current mounting state of the HMD 100 without performing a new calibration process. This eliminates the time and user effort that the calibration process would otherwise require, so mixed reality can be suitably realized.
- Further, when the current detection value Pa neither coincides with the detection value Px corresponding to the calibration data Mx currently in use nor matches any detection value stored in the pressure distribution database 211, new calibration data Ma is calculated by performing a new calibration process, and the detection value Pa and the calibration data Ma are added to the pressure distribution database 211 and the calibration data database 212, respectively (steps S360, S370, and S380). Therefore, it can be detected that the current mounting state of the HMD 100 differs from the mounting state at the time the calibration data Mx in use was calculated, and a new calibration process can be performed reliably.
- Instead of automatically performing the calibration process of step S360, a process for notifying the user that the calibration data should be updated may be performed.
- In this way the user can know that the calibration data should be updated, and mixed reality can still be suitably realized by performing the calibration process and updating the calibration data in accordance with the user's instruction.
- In addition, a motion detection unit that includes an acceleration sensor or a gyro sensor and detects the motion of the mounting unit 110 may be provided.
- In this case, when the detected acceleration is large, the pressure threshold value used as a criterion for determining that the mounting state of the mounting unit 110 has changed may be increased. This prevents a change in the detection value of the pressure distribution sensor 140 (that is, in the detected pressure distribution) caused by accelerated motion from being erroneously detected as a change in the mounting state of the mounting unit 110.
- Further, when the detected acceleration is equal to or higher than a predetermined value, the mounting state of the mounting unit 110 may not be detected correctly; in that case, the detection may be stopped and the calibration data update process may not be performed.
- FIG. 10 is a diagram for explaining calibration in the mixed reality apparatus 1.
- In the mixed reality apparatus 1, the position of the user's eye 910 and the position of the imaging unit 120 differ from each other.
- Accordingly, the image captured by the imaging unit 120 and the image seen by the user's eye 910 also differ from each other.
- For example, in the image P1 (see FIG. 10(b)) the marker 700 is located on the left side of the image, whereas in the image P3 (see FIG. 10(c)) the marker 700 is located on the right side of the image.
- Here, the marker 700 is provided on an object 1100 in the real environment.
- In the image P1, the position of the marker 700 is detected and the CG 600 is synthesized at the detected position, whereby an image P2 in which the positions of the CG 600 and the marker 700 coincide can be generated.
- For this reason, an optical transmission type mixed reality apparatus needs to be calibrated in accordance with the difference between the position of the user's eye 910 and the position of the imaging unit 120. If calibration is not performed, the position and posture (orientation) of the marker 700 and the CG 600 may be shifted from each other when the CG 600 is displayed on the display unit 130, as in the image P4 in FIG. 10.
- By performing calibration, the positions and postures (orientations) of the CG 600 and the marker 700 can be matched, as shown in the image P5 in FIG. 10.
- As described above, in the mixed reality apparatus 1 according to this embodiment, the mounting state of the HMD 100 is detected based on the detection value of the pressure distribution sensor 140, and the calibration data is updated in accordance with the detected mounting state, so mixed reality can be suitably realized.
- The present invention is not limited to the above-described embodiment and can be changed as appropriate without departing from the gist or concept of the invention that can be read from the claims and the entire specification; a mixed reality apparatus with such changes is also included in the technical scope of the present invention.
Description
A mixed reality apparatus according to a first embodiment will be described with reference to FIGS. 1 to 9.
100 Head mounted display (HMD)
110 Mounting unit
120 Imaging unit
130 Display unit
140 Pressure distribution sensor
150 Button
210 DB control unit
211 Pressure distribution database
212 Calibration data database
213 Pressure distribution comparison unit
214 DB write control unit
220 Calibration unit
221 Calibration control unit
222 Calibration coordinate generation unit
223 Calibration display generation unit
224 Calibration marker position detection unit
225 Data storage unit
226 Calibration data calculation unit
230 Transformation matrix calculation unit
240 Rendering unit
250 Selector
Claims (5)
- 1. A mixed reality apparatus comprising: a head mounted display having a mounting part to be mounted on a user's head, specific position detecting means for detecting a specific position of a real environment, and a display unit for displaying additional information to be added to the real environment; mounting state detecting means for detecting a mounting state of the mounting part; and update processing means for performing an update process of calibration data, used for conversion from a coordinate system of the specific position detecting means to a coordinate system of the display unit, in accordance with the mounting state detected by the mounting state detecting means.
- 2. The mixed reality apparatus according to claim 1, wherein the mounting state detecting means includes a pressure distribution sensor that is provided in the mounting part and detects a distribution of pressure applied from the head, and detects the mounting state based on the pressure distribution detected by the pressure distribution sensor.
- 3. The mixed reality apparatus according to claim 2, wherein the mounting state detecting means further includes motion detecting means for detecting a motion of the mounting part, and detects the mounting state based on the pressure distribution detected by the pressure distribution sensor and the motion detected by the motion detecting means.
- 4. The mixed reality apparatus according to claim 1, further comprising calibration data storage means for storing the calibration data, wherein the update processing means performs, as the update process, a process of updating the calibration data based on the calibration data stored in the calibration data storage means.
- 5. The mixed reality apparatus according to claim 1, wherein the update processing means performs, as the update process, a process of notifying the user that the calibration data should be updated.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013527778A JP5707497B2 (ja) | 2011-08-09 | 2011-08-09 | Mixed reality apparatus |
PCT/JP2011/068136 WO2013021458A1 (ja) | 2011-08-09 | 2011-08-09 | Mixed reality apparatus |
US14/236,767 US20140176609A1 (en) | 2011-08-09 | 2011-08-09 | Mixed reality apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/068136 WO2013021458A1 (ja) | 2011-08-09 | 2011-08-09 | Mixed reality apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013021458A1 true WO2013021458A1 (ja) | 2013-02-14 |
Family
ID=47668010
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/068136 WO2013021458A1 (ja) | 2011-08-09 | 2011-08-09 | Mixed reality apparatus |
Country Status (3)
Country | Link |
---|---|
US (1) | US20140176609A1 (ja) |
JP (1) | JP5707497B2 (ja) |
WO (1) | WO2013021458A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014183554A (ja) * | 2013-03-21 | 2014-09-29 | Ntt Docomo Inc | Display device |
GB2512404A (en) * | 2013-03-25 | 2014-10-01 | Sony Comp Entertainment Europe | Display |
JP2016009912A (ja) * | 2014-06-23 | 2016-01-18 | Fujitsu Ltd | Calibration device, calibration method, display control device, and display control method |
JP2016110489A (ja) * | 2014-12-09 | 2016-06-20 | Konica Minolta Inc | Display device, display device calibration method, and calibration program |
JP2021508109A (ja) | 2017-12-19 | 2021-02-25 | Telefonaktiebolaget LM Ericsson (Publ) | Head-mounted display device and method thereof |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5821464B2 (ja) * | 2011-09-22 | 2015-11-24 | Seiko Epson Corp | Head-mounted display device |
TWI571827B (zh) * | 2012-11-13 | 2017-02-21 | Institute For Information Industry | Electronic device and method for determining the depth of a 3D object image in a 3D environment image |
US9904053B2 (en) * | 2014-12-25 | 2018-02-27 | Seiko Epson Corporation | Display device, and method of controlling display device |
JP6565310B2 (ja) * | 2015-05-08 | 2019-08-28 | Seiko Epson Corp | Display device, display device control method, and program |
US10902680B2 (en) * | 2018-04-03 | 2021-01-26 | Saeed Eslami | Augmented reality application system and method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002228442A (ja) * | 2000-11-30 | 2002-08-14 | Mixed Reality Systems Laboratory Inc | Position and orientation determination method, apparatus, and storage medium |
JP2008146109A (ja) * | 2006-12-05 | 2008-06-26 | Canon Inc | Image processing method and image processing apparatus |
JP2011002753A (ja) * | 2009-06-22 | 2011-01-06 | Sony Corp | Head-mounted display and image display method for head-mounted display |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0651901A (ja) * | 1992-06-29 | 1994-02-25 | Nri & Ncc Co Ltd | Communication device using gaze recognition |
US20040219978A1 (en) * | 2002-10-09 | 2004-11-04 | Namco Ltd. | Image generation method, program, and information storage medium |
JP4532856B2 (ja) * | 2003-07-08 | 2010-08-25 | Canon Inc | Position and orientation measurement method and apparatus |
US7778118B2 (en) * | 2007-08-28 | 2010-08-17 | Garmin Ltd. | Watch device having touch-bezel user interface |
KR100974900B1 (ko) * | 2008-11-04 | 2010-08-09 | Electronics and Telecommunications Research Institute | Marker recognition apparatus and method using dynamic threshold |
US20110012896A1 (en) * | 2009-06-22 | 2011-01-20 | Ji Maengsob | Image display apparatus, 3d glasses, and method for operating the image display apparatus |
WO2011097564A1 (en) * | 2010-02-05 | 2011-08-11 | Kopin Corporation | Touch sensor for controlling eyewear |
US8982156B2 (en) * | 2010-06-10 | 2015-03-17 | Sartorius Stedim Biotech Gmbh | Assembling method, operating method, augmented reality system and computer program product |
-
2011
- 2011-08-09 WO PCT/JP2011/068136 patent/WO2013021458A1/ja active Application Filing
- 2011-08-09 US US14/236,767 patent/US20140176609A1/en not_active Abandoned
- 2011-08-09 JP JP2013527778A patent/JP5707497B2/ja active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002228442A (ja) * | 2000-11-30 | 2002-08-14 | Mixed Reality Systems Laboratory Inc | Position and orientation determination method, apparatus, and storage medium |
JP2008146109A (ja) * | 2006-12-05 | 2008-06-26 | Canon Inc | Image processing method and image processing apparatus |
JP2011002753A (ja) * | 2009-06-22 | 2011-01-06 | Sony Corp | Head-mounted display and image display method for head-mounted display |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014183554A (ja) * | 2013-03-21 | 2014-09-29 | Ntt Docomo Inc | Display device |
GB2512404A (en) * | 2013-03-25 | 2014-10-01 | Sony Comp Entertainment Europe | Display |
GB2514466A (en) * | 2013-03-25 | 2014-11-26 | Sony Comp Entertainment Europe | Display |
GB2514466B (en) * | 2013-03-25 | 2017-11-29 | Sony Interactive Entertainment Europe Ltd | Display |
US10054796B2 (en) | 2013-03-25 | 2018-08-21 | Sony Interactive Entertainment Europe Limited | Display |
JP2016009912A (ja) * | 2014-06-23 | 2016-01-18 | Fujitsu Ltd | Calibration device, calibration method, display control device, and display control method |
JP2016110489A (ja) * | 2014-12-09 | 2016-06-20 | Konica Minolta Inc | Display device, display device calibration method, and calibration program |
JP2021508109A (ja) | 2017-12-19 | 2021-02-25 | Telefonaktiebolaget LM Ericsson (Publ) | Head-mounted display device and method thereof |
JP7012163B2 (ja) | 2017-12-19 | 2022-01-27 | Telefonaktiebolaget LM Ericsson (Publ) | Head-mounted display device and method thereof |
US11380018B2 (en) | 2017-12-19 | 2022-07-05 | Telefonaktiebolaget Lm Ericsson (Publ) | Head-mounted display device and method thereof |
US11935267B2 (en) | 2017-12-19 | 2024-03-19 | Telefonaktiebolaget Lm Ericsson (Publ) | Head-mounted display device and method thereof |
Also Published As
Publication number | Publication date |
---|---|
JP5707497B2 (ja) | 2015-04-30 |
US20140176609A1 (en) | 2014-06-26 |
JPWO2013021458A1 (ja) | 2015-03-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11870787; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2013527778; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 14236767; Country of ref document: US |
122 | Ep: pct application non-entry in european phase | Ref document number: 11870787; Country of ref document: EP; Kind code of ref document: A1 |