WO2016098591A1 - Moving image processing device for a head-mounted display, moving image processing method for a head-mounted display, and head-mounted display system - Google Patents
- Publication number
- WO2016098591A1 (PCT/JP2015/083849)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- shift amount
- past
- current frame
- amount
- Prior art date
Classifications
- G02B27/0179: Display position adjusting means not related to the information to be displayed
- G02B27/0172: Head mounted, characterised by optical features
- G06T5/80: Image enhancement or restoration; geometric correction
- G06T7/248: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving reference images or patches
- G09G3/001: Control arrangements using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. projection systems
- G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/377: Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
- G02B2027/0181: Adaptation to the pilot/driver
- G02B2027/0187: Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
- G02B2027/0198: System for aligning or maintaining alignment of an image in a predetermined direction
Definitions
- The present invention relates to a moving image processing device for a head-mounted display, a moving image processing method for a head-mounted display, and a head-mounted display system.
- In particular, it relates to a moving image processing device for displaying an image on a head-mounted display having a photographing device and a display screen.
- Such a moving image processing apparatus is disclosed in, for example, Japanese Unexamined Patent Application Publication No. 2013-120988.
- Japanese Unexamined Patent Application Publication No. 2013-120988 discloses a head-mounted display (HMD) provided with a photographing device and a display screen.
- This head-mounted display is mounted on the user's head and displays a moving image captured using the imaging device to the user on a display screen. The user can visually recognize the moving image captured by the image capturing device overlaid on the external scene that can be seen through the display screen.
- In such a head-mounted display, it is known to reduce noise and improve visibility by image processing using a time-direction filter that synthesizes the moving image along the time axis.
- With a time-direction filter, if the photographing apparatus moves or an object moves within the image, an afterimage appears after composition, so the positional deviation between the frame images to be composited must be corrected.
- Conventionally, this misregistration correction is performed by comparing the frame images to be synthesized by image analysis.
- The present invention has been made to solve the above-described problems, and one object of the present invention is to provide a moving image processing device for a head-mounted display, a moving image processing method for a head-mounted display, and a head-mounted display system that can improve the accuracy and robustness of misregistration detection while suppressing an increase in the calculation cost of image processing.
- A moving image processing device for a head-mounted display according to a first aspect of the present invention is used for a head-mounted display that is mounted on a user's head and includes a photographing device for capturing images at predetermined frame intervals and a display screen.
- The moving image processing device comprises: posture detection means capable of detecting the posture of the photographing device mounted on the user's head; first image shift amount calculation means for calculating, based on the detection result of the posture detection means, a first image shift amount in the yawing and pitching directions of the photographing device between frames captured by the photographing device; second image shift amount calculation means for calculating, based on the first image shift amount, a second image shift amount between the current frame image captured by the photographing device and a past image based on one or more frame images previously captured by the photographing device; and image synthesizing means for correcting the past image based on the second image shift amount and synthesizing the corrected past image with the current frame image.
- As described above, the moving image processing device for a head-mounted display according to the first aspect is provided with first image shift amount calculation means for calculating the first image shift amount in the yawing and pitching directions of the photographing device between captured frames based on the detection result of the posture detection means, and with second image shift amount calculation means for calculating the second image shift amount between the past image and the current frame image based on the first image shift amount, the current frame image, and the past image.
- With this configuration, the first image shift amount based on the posture change of the photographing apparatus detected by the posture detection means can be used, so a decrease in misregistration detection accuracy can be suppressed even when image analysis alone cannot provide sufficient detection accuracy.
- Conversely, even when the detection result of the posture detection means contains some error, the detection accuracy of the displacement is ensured by calculating the second image shift amount, so robustness can be improved.
- Furthermore, by limiting the first image shift amount to the yawing and pitching directions of the photographing apparatus, the calculation cost of shift detection can be reduced compared to a case that also considers the rolling direction. As a result, the accuracy and robustness of misregistration detection can be improved while an increase in the calculation cost of image processing is suppressed.
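The conversion from a yaw/pitch change of the photographing device into a shift on the image can be sketched with a simple pinhole-camera model. This is a minimal illustration under assumed parameters (the focal length in pixels is hypothetical), not the patent's actual formula:

```python
import math

def first_image_shift(d_yaw_rad: float, d_pitch_rad: float,
                      focal_px: float) -> tuple:
    """Approximate on-image shift (dx, dy) in pixels caused by a small
    camera rotation: yaw moves the image horizontally, pitch vertically.
    Pinhole model; the rolling direction is deliberately ignored, as in
    the first image shift amount described above."""
    dx = focal_px * math.tan(d_yaw_rad)
    dy = focal_px * math.tan(d_pitch_rad)
    return dx, dy

# Example: a 0.5 degree head turn with an assumed 800 px focal length
dx, dy = first_image_shift(math.radians(0.5), 0.0, 800.0)
```

For small angles the shift is nearly linear in the rotation, which is why yaw and pitch map cheaply to a single translation of the whole image, while roll would require a per-pixel rotation.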
- Preferably, the second image shift amount calculation means is configured to calculate the second image shift amount against the current frame image within a position shift search region set over a part of the past image, and the position and size of the position shift search region within the past image are set based on the first image shift amount.
- With this configuration, the position shift search region can be narrowed, for example, to a predetermined range centered on the position obtained by moving the region by the first image shift amount.
- As a result, compared to searching over the entire past image, the calculation cost can be effectively reduced by narrowing the misregistration search region while still ensuring the accuracy of misregistration detection.
- Preferably, the misregistration search region is set to a region expanded, by a range corresponding to the error range of the first image shift amount, around the part of the past image after movement by the first image shift amount.
- With this configuration, the misregistration search region can be limited to the minimum range that still accounts for the error range of the first image shift amount, so the calculation cost can be reduced even more effectively.
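The narrowing of the search region described above can be sketched as follows: the window is centered on the block position moved by the first image shift amount and expanded by the assumed error margin. The margin value and the clamping to image bounds are illustrative assumptions:

```python
def search_region(block_x, block_y, block_w, block_h,
                  shift_x, shift_y, err_margin,
                  img_w, img_h):
    """Return (x0, y0, x1, y1) of the misregistration search window:
    the block moved by the first image shift amount, expanded by the
    error range of that shift, clamped to the image bounds."""
    x0 = max(0, int(block_x + shift_x - err_margin))
    y0 = max(0, int(block_y + shift_y - err_margin))
    x1 = min(img_w, int(block_x + shift_x + block_w + err_margin))
    y1 = min(img_h, int(block_y + shift_y + block_h + err_margin))
    return x0, y0, x1, y1

# A 16x16 block at (100, 80), predicted shift (+7, -2), +/-4 px margin
region = search_region(100, 80, 16, 16, 7, -2, 4, 640, 480)
```

Instead of correlating the block against the full 640x480 past image, the search is confined to a 24x24 window, which is where the calculation-cost reduction comes from.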
- Preferably, the second image shift amount calculation means analyzes the past image and the current frame image to obtain an image analysis shift amount between them, and calculates the second image shift amount based on the first image shift amount and the image analysis shift amount.
- With this configuration, the second image shift amount can be calculated in consideration of both the first image shift amount based on the detection result of the posture detection means and the image analysis shift amount obtained by image analysis.
- Since the two amounts can compensate for each other's detection accuracy, the accuracy and robustness of position shift detection can be effectively improved.
- Preferably, the second image shift amount calculation means is configured to calculate the second image shift amount by weighted addition of the first image shift amount and the image analysis shift amount according to the noise amount of the current frame image.
- With this configuration, when the image is noisy and the image analysis shift amount is unreliable, the accuracy of misalignment detection can be ensured by increasing the weight of the first image shift amount.
- Conversely, when the noise amount is small and the image analysis shift amount is accurate, the position shift detection accuracy can be improved beyond that of the first image shift amount alone by increasing the weight of the image analysis shift amount.
- Preferably, the second image shift amount calculation means increases the weight of the first image shift amount as the noise amount of the current frame image increases.
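The noise-dependent weighted addition can be sketched like this. The mapping from noise level to weight is an illustrative assumption; the text only states that the weight of the first image shift amount grows with the noise amount:

```python
def fuse_shift(first_shift, analysis_shift, noise_level):
    """Weighted addition of the posture-based (first) shift and the
    image-analysis shift. noise_level in [0, 1]: 0 = clean frame
    (trust image analysis), 1 = very noisy (trust the posture sensor)."""
    w = min(max(noise_level, 0.0), 1.0)  # weight of the first shift
    return tuple(w * f + (1.0 - w) * a
                 for f, a in zip(first_shift, analysis_shift))

# On a noisy frame the result moves toward the posture-based estimate
shift = fuse_shift((7.0, -2.0), (6.0, -1.0), noise_level=0.8)
```

In practice the noise level might be estimated from sensor gain or scene brightness; any monotonically increasing weight function satisfies the stated preference.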
- Preferably, the second image shift amount calculation means corrects the position of the past image based on the first image shift amount and calculates the second image shift amount between the position-corrected past image and the current frame image for each local region in the image, and the image composition means corrects the position of the position-corrected past image for each local region based on the second image shift amount and synthesizes the corrected past image with the current frame image. With this configuration, the position can be corrected for each local region, so positional shifts that differ from region to region, such as those caused by an object moving in the image, can also be handled.
- Preferably, the first image shift amount calculation means calculates the first image shift amount based on the detection result of the posture detection means and on an estimated image shift amount derived from the history of second image shift amounts calculated in the past.
- The range of the positional shift in the current frame image can be estimated from this history. Therefore, with this configuration, even when noise is added to the detection result of the posture detection means and an outlier deviating from the other detection values appears, the detection accuracy of the first image shift amount can be ensured by taking the estimated image shift amount into account.
- In addition, since the estimated image shift amount is calculated in advance, the first image shift amount can be computed quickly, so processing delay is suppressed even when the estimated image shift amount is used.
- Preferably, the image composition means calculates a similarity between the corrected past image and the current frame image for each local region in the image, and synthesizes the corrected past image and the current frame image with weights determined by the similarity.
- Preferably, the image composition means increases the weight of the corrected past image relative to the current frame image as the similarity between the two increases.
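A per-region blend whose weight rises with similarity might look like the following sketch. The similarity measure (mean absolute difference) and the linear weight ramp are assumptions; the text only specifies that the weight of the corrected past image increases with similarity:

```python
def region_similarity(past_block, cur_block):
    """Similarity in [0, 1] from the mean absolute difference of two
    equally sized pixel blocks (flat lists of 0-255 values)."""
    mad = sum(abs(p - c) for p, c in zip(past_block, cur_block)) / len(cur_block)
    return max(0.0, 1.0 - mad / 255.0)

def blend(past_block, cur_block, max_past_weight=0.8):
    """Blend one local region: the more similar the blocks are, the
    more the (noise-reduced) past image contributes; dissimilar regions
    fall back to the current frame, avoiding afterimages."""
    w = max_past_weight * region_similarity(past_block, cur_block)
    return [w * p + (1.0 - w) * c for p, c in zip(past_block, cur_block)]

out = blend([100, 100, 100, 100], [100, 104, 96, 100])
```

Capping the past-image weight below 1.0 keeps the filter responsive to genuine scene changes even in perfectly matching regions.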
- Preferably, the image composition means synthesizes the corrected past image and the current frame image using a recursive filter.
- Here, the recursive filter is a process that combines the current frame image and the immediately preceding output image by weighted addition.
- The image composition means outputs the composite image obtained by this synthesis to the display screen as the output image of the current frame, and stores it as the past image for the next frame. With this configuration, composition uses the composite image obtained in the immediately preceding frame, which has the smallest deviation from the current frame image, so the calculation cost can be reduced.
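The recursive filter described above can be sketched as follows: each output frame is a weighted addition of the current frame and the previous output, which is then stored as the past image for the next frame. The fixed feedback weight is an illustrative assumption, and the misregistration correction of the past image is omitted here:

```python
def recursive_filter(frames, past_weight=0.6):
    """Temporal noise reduction: out[t] = w*out[t-1] + (1-w)*frame[t].
    Only one stored image (the previous composite) is needed, which is
    what keeps the calculation cost of this scheme low."""
    outputs = []
    past = None
    for frame in frames:
        if past is None:
            out = list(frame)  # first frame passes through unchanged
        else:
            out = [past_weight * p + (1.0 - past_weight) * f
                   for p, f in zip(past, frame)]
        outputs.append(out)
        past = out  # stored as the past image for the next frame
    return outputs

# A constant scene with alternating noise converges toward the mean
outs = recursive_filter([[110.0], [90.0], [110.0], [90.0]])
```

Because the feedback taps the already-filtered composite rather than a raw earlier frame, noise is attenuated over many frames while only one frame of memory is consumed.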
- Preferably, the first image shift amount calculation means excludes image shift due to parallel (translational) movement of the imaging apparatus and calculates the first image shift amount due to rotation of the imaging apparatus only.
- Image shift due to parallel movement of the photographing device is difficult to correct as a position shift of the entire image, because the positional relationship between near and far subjects changes.
- For distant subjects, however, the influence of the position shift due to parallel movement becomes relatively small. Therefore, by excluding image shift due to parallel movement and limiting the first image shift amount to rotational position shift, the calculation cost can be reduced while the influence on position shift detection accuracy is effectively suppressed.
- Preferably, the first image shift amount calculation means excludes rotational position shift in the rolling direction of the imaging device and calculates the first image shift amount in the yawing and pitching directions only.
- Rotation in the rolling direction corresponds to the user tilting the head to the left or right, a movement that occurs far less often than movements in the yawing direction (left-right swing) and the pitching direction (up-down swing).
- Therefore, by excluding the rolling direction and limiting the calculation of the first image shift amount to the yawing and pitching directions, the calculation cost can be reduced while the influence on position shift detection accuracy is effectively suppressed.
- A moving image processing method for a head-mounted display according to a second aspect of the present invention is used for a head-mounted display that is mounted on a user's head and includes a photographing device for capturing images at predetermined frame intervals and a display screen.
- The method comprises the steps of: detecting the posture of the photographing device mounted on the user's head; calculating, based on the detection result, a first image shift amount in the yawing and pitching directions of the photographing device between frames captured by the photographing device; calculating, based on the first image shift amount, a second image shift amount between the current frame image captured by the photographing device and a past image based on one or more previously captured frame images; correcting the past image based on the second image shift amount; and synthesizing the corrected past image with the current frame image.
- In the moving image processing method according to the second aspect, as described above, the first image shift amount in the yawing and pitching directions of the photographing device between captured frames is calculated based on the detection result of the posture of the photographing device, so effects similar to those of the first aspect can be obtained.
- A head-mounted display system according to a third aspect of the present invention comprises: a head-mounted display that is mounted on a user's head and includes a photographing device for capturing images at predetermined frame intervals and a display screen; a posture detection device capable of detecting the posture of the photographing device mounted on the user's head; and a moving image processing device for displaying the image captured by the photographing device on the display screen.
- The moving image processing device includes: first image shift amount calculation means for calculating, based on the detection result of the posture detection device, a first image shift amount in the yawing and pitching directions of the photographing device between frames captured by the photographing device; second image shift amount calculation means for calculating, based on the first image shift amount, a second image shift amount between the current frame image captured by the photographing device and a past image based on one or more previously captured frame images; and image synthesizing means for correcting the past image based on the second image shift amount and synthesizing the corrected past image with the current frame image.
- In the head-mounted display system according to the third aspect, as described above, the moving image processing device is provided with first image shift amount calculation means based on the detection result of the posture detection device and with second image shift amount calculation means based on the first image shift amount, the current frame image, and the past image.
- Thus, even when the detection result of the posture detection device contains some error, the detection accuracy of the positional deviation is ensured by calculating the second image shift amount, so robustness can be improved. Further, by calculating the first image shift amount only in the yawing and pitching directions, the calculation cost of shift detection can be reduced compared to a case that also considers the rolling direction. As a result, the accuracy and robustness of misregistration detection can be improved while an increase in the calculation cost of image processing is suppressed.
- FIG. 1 is a schematic diagram illustrating the overall configuration of a head-mounted display system according to a first embodiment of the present invention.
- Further figures provide a block diagram illustrating the head-mounted display and the moving image processing device according to the first embodiment, a block diagram showing the control structure of the moving image processing device, and a conceptual diagram for explaining its image processing.
- Another figure shows, at (A), the coordinate system for representing the posture of the photographing device and, at (B), the coordinate system in the image; a further figure shows an example of weight setting based on the similarity between the current frame image and the past image.
- In the first embodiment, an example will be described in which the HMD system 100 is mounted on a mobile body 1 that moves in the sky (in the air), such as an aircraft (an airplane or a helicopter), among mobile bodies such as automobiles, airplanes, and ships on which people board and move.
- The HMD system 100 includes a head-mounted display (hereinafter "HMD") 10 having a photographing device 11 and a display screen 12, a moving image processing device 20 for displaying images photographed by the photographing device 11 on the display screen 12, and a posture detection device 30 capable of detecting the posture of the photographing device 11.
- the posture detection device 30 is an example of the “posture detection means” in the present invention.
- The HMD system 100 displays the moving image photographed by the photographing device 11 on the display screen 12 of the HMD 10, so the user can view the moving image overlaid on the scene seen through the display screen 12. As a result, in a low-light environment such as evening or night where it is difficult to grasp the outside world with the naked eye, the HMD system 100 can supplement the field of view with the moving image of the photographing device 11, which has higher sensitivity than the naked eye, and can display various information on objects viewed through the display screen 12. In the first embodiment, the user is an occupant (pilot) of the moving body 1 (aircraft).
- the HMD 10 is a head-mounted display device that includes a photographing device 11 and a display screen 12 that are mounted on the user's head 2.
- FIG. 1 shows an example of a helmet-shaped HMD 10.
- the HMD 10 includes a helmet 13, a visor-like display screen 12, and a photographing device 11, and the display screen 12 and the photographing device 11 are attached to the helmet 13.
- the HMD 10 also includes a display element 14 and an optical system 15 for projecting an image on the display screen 12.
- the display screen 12 has translucency so that the user can see the scene in front of the line of sight through the display screen 12.
- the imaging device 11 includes a lens and an image sensor (not shown), and a pair is provided on both the left and right sides of the helmet 13 (see FIG. 1) for left and right images.
- Each photographing device 11 is fixed so that the photographing optical axis direction coincides with the front of the user's line of sight (the forward direction perpendicular to the left-right direction connecting the pair of cameras).
- a pair of display elements 14 is provided, and the left-eye image and the right-eye image are collimated by the pair of optical systems 15 and projected onto the display screen 12.
- the image for the left eye and the image for the right eye reflected on the display screen 12 overlap with external light transmitted through the display screen 12 from the outside, and are visually recognized by the user's left eye and right eye, respectively.
- The marker portion of the posture detection device 30 is attached to the helmet 13 shown in FIG. 1.
- the moving image processing device 20 is incorporated in the helmet 13 as a video signal processing device of the HMD 10.
- the moving image processing device 20 has a function of displaying a moving image on the display screen 12 by receiving a moving image taken by the photographing device 11, performing image processing, and outputting the processed image to the display element 14.
- The moving image processing apparatus 20 includes a CPU 21 as its control unit, a memory 22, a first interface unit 23, a second interface unit 24, and a drive circuit 25.
- the CPU 21 and the memory 22, the first interface unit 23 and the second interface unit 24, and the drive circuit 25 are provided in pairs for the left-eye image and the right-eye image, respectively.
- the CPU 21 is configured to control the photographing device 11 and the display element 14 by executing a program stored in the memory 22 and to perform predetermined image processing on the photographed moving image. That is, the CPU 21 receives a moving image captured by the imaging device 11 via the first interface unit 23, generates a video signal, and outputs the video signal to the drive circuit 25. Further, the CPU 21 acquires a detection signal (detection result) of the attitude detection device 30 via the second interface unit 24, and performs image processing of the moving image using the detection signal. Details of the image processing by the CPU 21 will be described later.
- the first interface unit 23 is connected to the photographing apparatus 11 and sends a video signal of a moving image photographed by the photographing apparatus 11 to the CPU 21.
- the second interface unit 24 is connected to the posture detection device 30 by wire or wirelessly and sends a detection signal of the posture detection device 30 to the CPU 21.
- the drive circuit 25 is a circuit for driving the display element 14 based on the video signal sent from the CPU 21.
- the posture detection device 30 is configured to detect the posture of the photographing device 11 fixed to the helmet 13 by detecting the posture of the helmet 13 attached to the head 2 of the user.
- A part of the posture detection device 30 is provided outside the HMD 10 (helmet 13) and is configured to detect the posture of the photographing device 11 from outside.
- the posture detection device 30 includes a pair of cameras 31 (stereo camera), a marker 32 attached to the user's head, and a detection control unit 33.
- the pair of cameras 31 is fixedly attached to the moving body 1 on which the user is boarded.
- the pair of cameras 31 are arranged at predetermined positions separated from each other so as to capture the marker 32 from different directions.
- the marker 32 is, for example, a light emitting element such as an LED, and three or more markers 32 are provided so that the three-dimensional posture of the photographing apparatus 11 (helmet 13) can be determined. Each marker 32 is fixedly attached to a position on the surface of the helmet 13 that can be photographed by the camera 31.
- the detection control unit 33 is a computer including a CPU 34 and a memory 35.
- the CPU 34 is configured to execute posture detection processing by executing a program stored in the memory 35.
- the memory 35 stores information on the positional relationship between the pair of cameras 31 (distance d between the cameras), information on the relative positional relationship between the three or more markers 32 (relative position coordinates), and the like.
- The CPU 34 extracts each marker 32 from the images captured by the cameras 31 and calculates the direction angles α1 and α2 of each marker 32 with respect to each camera 31. Then, using the direction angles α1 and α2 of each marker 32 with respect to the pair of cameras 31 and the distance d between the cameras, the CPU 34 calculates the spatial position coordinates of each marker 32 based on the principle of triangulation. Since each marker 32 is fixed to the helmet 13 at predetermined relative position coordinates, the CPU 34 calculates the current angle (posture) of the helmet 13 based on the spatial position coordinates of the three or more markers 32 and the relative position coordinates between the markers. Since the relative positional relationship of the photographing device 11 with respect to the helmet 13 is also known, the current angle (posture) of the photographing device 11 can be calculated from that of the helmet 13.
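The triangulation step can be sketched in two dimensions (top view): with the baseline d between the two cameras along the x-axis and the direction angles α1 and α2 measured from the baseline toward the marker, the marker position follows from intersecting the two rays. This is a simplified planar illustration of the 3-D computation described above:

```python
import math

def triangulate(alpha1_deg, alpha2_deg, d):
    """Planar triangulation of one marker. Cameras sit at (0, 0) and
    (d, 0); alpha1 and alpha2 are the direction angles of the marker
    measured from the baseline at each camera. Returns (x, y)."""
    t1 = math.tan(math.radians(alpha1_deg))
    t2 = math.tan(math.radians(alpha2_deg))
    # Intersect y = x*t1 (ray from camera 1) with y = (d - x)*t2
    x = d * t2 / (t1 + t2)
    y = x * t1
    return x, y

# Symmetric 45 degree rays with a 2 m baseline meet at (1, 1)
x, y = triangulate(45.0, 45.0, 2.0)
```

Repeating this for three or more markers with known relative positions constrains the full 3-D orientation of the helmet, as the text explains.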
- the posture detection device 30 detects the posture of the photographing device 11 attached to the head 2 of the user of the photographing device 11.
- the posture detection device 30 is configured to send the detected posture detection signal of the photographing device 11 to the moving image processing device 20 via the second interface unit 24.
- the attitude detection device 30 can employ, for example, the configuration disclosed in Japanese Patent Application Laid-Open No. 2008-59204, and thus a detailed description thereof is omitted.
- the CPU 21 includes a first image shift amount calculation unit 41, a second image shift amount calculation unit 42, and an image synthesis unit 43.
- the first image shift amount calculation unit 41, the second image shift amount calculation unit 42, and the image synthesis unit 43 correspond respectively to the "first image shift amount calculation means", the "second image shift amount calculation means", and the "image synthesis means" of the present invention.
- the first image shift amount calculation unit 41 calculates the first image shift amount G1 between frames shot by the shooting device 11 based on the detection result of the posture detection device 30.
- the first image shift amount G1 is the amount of positional shift on the image between the latest current frame image 51 photographed by the photographing device 11 and the past image 52 displayed on the display screen 12 one frame before (the output image output in the immediately preceding frame).
- the first image shift amount calculation unit 41 calculates, from the detection result of the posture detection device 30, the amount by which the posture (head angle) of the photographing device 11 changes while the photographing device 11 photographs one frame. For example, when the frame rate of the photographing apparatus 11 is 60 fps (frames per second), the angle change amount of the posture (head angle) over 1/60 second is calculated.
- the first image shift amount calculation unit 41 converts the calculated angle change amount into an image shift amount between one frame.
- the first image shift amount calculation unit 41 excludes the image shift due to parallel movement of the image capturing device 11 and calculates only the image shift amount due to rotation of the image capturing device 11. This is because the image shift caused by rotation of the photographing device 11 only changes the screen (angle of view) without changing the positional relationship between near and far subjects, whereas the image shift caused by parallel movement of the photographing device 11 changes the positional relationship between near and far subjects (the closer a subject is to the photographing apparatus 11, the larger its positional shift), making it difficult to treat as a positional shift of the entire image.
- the first image deviation amount calculation unit 41 is configured to calculate the first image deviation amount G1 in the yawing direction and the pitching direction of the photographing apparatus 11.
- the rotational position deviation in the rolling direction is excluded.
- the rotation in the rolling direction corresponds to a movement in which the user tilts the neck left and right.
- the movement of the user's head 2 at the time of boarding the moving body 1 is dominated by the yawing direction (left-right swing) and the pitching direction (up-down swing), and has little motion in the rolling direction. For this reason, even if the rotational positional deviation in the rolling direction is excluded, the positional deviation of the image is hardly affected.
- in equation (1), AOVx [°] and AOVy [°] are the horizontal and vertical viewing angles of the photographing apparatus 11, PIXx [pixel] and PIXy [pixel] are the corresponding numbers of pixels, and Δθx [°] and Δθy [°] are the angle change amounts of the photographing apparatus 11.
- a predetermined center position between the cameras 31 of the posture detection device 30 is set as the origin of the rotation coordinate system.
- the second term on the right side corresponds to the first image shift amount G1.
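A minimal sketch of the angle-to-pixel conversion implied by these symbols, assuming the simple linear small-angle relation G1x = Δθx × PIXx / AOVx (an assumption; the patent's full equation (1) may contain additional terms, as the "second term on the right side" wording suggests):

```python
def angle_to_pixel_shift(d_theta_x, d_theta_y, aov_x, aov_y, pix_x, pix_y):
    """Convert the per-frame rotation of the photographing device into an
    image shift in pixels, using a linear small-angle mapping (assumed
    form; a rigorous version would use the lens projection model)."""
    g1_x = d_theta_x / aov_x * pix_x   # horizontal shift [pixel]
    g1_y = d_theta_y / aov_y * pix_y   # vertical shift [pixel]
    return g1_x, g1_y
```

For example, a 0.5° yaw over one frame with a 90° horizontal angle of view and 1920 horizontal pixels maps to roughly 10.7 pixels of horizontal shift.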
- the second image shift amount calculation unit 42 calculates a second image shift amount G2 between the past image 52 and the current frame image 51, based on the first image shift amount G1, the current frame image 51 shot by the photographing device 11, and the past image 52 (the output image output in the immediately preceding frame) based on one or more frame images shot in the past by the photographing device 11.
- a known method for performing image analysis can be adopted as a method for calculating the positional deviation amount between the past image 52 and the current frame image 51.
- as the calculation method of the positional shift amount, for example, pattern matching, which calculates the shift amount using the cross-correlation between the past image 52 and the current frame image 51, or the phase-only correlation method, which transforms the past image 52 and the current frame image 51 into the frequency domain and calculates the shift amount based on the phase correlation in frequency space, can be used.
- the shift amount can be calculated for each local region of the image by performing the above processing for each local region of the image.
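The matching step can be sketched as a brute-force search for the integer shift that maximizes the cross-correlation; the image representation (2-D lists of gray values) and search range are illustrative, and the phase-only correlation method mentioned above would perform the equivalent comparison in the frequency domain:

```python
def estimate_shift(past, cur, max_shift):
    """Find the integer (dy, dx) that best aligns `past` onto `cur` by
    maximizing the normalized cross-correlation over the overlapping
    region. A minimal sketch of pattern matching, not the patent's exact
    procedure."""
    h, w = len(past), len(past[0])
    best, best_dydx = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score, n = 0.0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        score += past[y][x] * cur[yy][xx]
                        n += 1
            score /= n  # normalize by the size of the overlap
            if best is None or score > best:
                best, best_dydx = score, (dy, dx)
    return best_dydx
```

Running the same search restricted to a sub-window gives the per-local-region variant mentioned above.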
- the second image displacement amount calculation unit 42 is configured to calculate the second image shift amount G2 against the current frame image 51 within the position shift search area 61 set for a part of the past image 52.
- the position and size of the position shift search area 61 in the past image 52 are set based on the first image shift amount G1.
- in the example of FIG. 4, the first image shift amount G1 in the A direction is calculated with respect to the past image 52.
- the first image shift amount G1 has a predetermined error range, since it is based on the detection result of the posture detection device 30.
- the position shift search area 61 is therefore set as an area centered on the image portion 62 at the moved position corresponding to the first image shift amount G1 (the position of the image 52a) and expanded by a range E corresponding to the error range of the first image shift amount G1.
- the second image shift amount calculation unit 42 calculates a second image shift amount G2 between the past image 52 and the current frame image 51 as a result of pattern matching in the position shift search region 61.
- the image composition unit 43 (see FIG. 3) corrects the past image 52 based on the second image displacement amount G2, and composes the corrected past image 52 and the current frame image 51.
- the image composition unit 43 synthesizes the previous past image 52 with the current frame image 51 after correcting the position displacement by the second image displacement amount G2.
- the image synthesis unit 43 is configured to synthesize the corrected past image 52 and the current frame image 51 using a recursive filter.
- the recursive filter is a process that combines the current frame image 51 and the past image 52 by weighted addition.
- the generated composite image Qn is expressed by the following equation (2).
- Qn(i, j) = k × Pn(i, j) + (1 − k) × Qn−1(i, j) … (2)
- i and j are the pixel coordinates in the X direction and the Y direction in the image, Pn is the current frame image 51, Qn−1 is the past image 52 (the output image of the immediately preceding frame), and k is the weighting coefficient.
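Equation (2) applied per pixel can be sketched as follows; representing images as 2-D lists of gray values is an illustrative simplification:

```python
def recursive_filter(cur, past, k):
    """Weighted addition of equation (2): Qn = k*Pn + (1-k)*Qn-1,
    applied independently to every pixel. `cur` is the current frame
    image Pn, `past` is the previous output Qn-1, and k is the
    weighting coefficient."""
    return [[k * p + (1.0 - k) * q for p, q in zip(prow, qrow)]
            for prow, qrow in zip(cur, past)]
```

Because each output frame becomes the next frame's `past`, noise that is uncorrelated between frames is averaged down while the (position-corrected) scene content is preserved.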
- the moving image processing apparatus 20 outputs the synthesized image 53 synthesized by the image synthesizing unit 43 to the display element 14 of the HMD 10 as an output image of the current frame corresponding to the current frame image 51 that is a captured image.
- the composite image 53 with the noise NC reduced is displayed.
- the composite image 53 is stored in the memory 22, and in the next frame, the composite image 53 is used as the past image 52 of the immediately preceding frame. That is, the past image 52 shown in FIG. 4 is a synthesized image 53 synthesized and output one frame before.
- the image composition unit 43 calculates the similarity between the corrected past image 52 and the current frame image 51 for each local region in the image, and synthesizes the corrected past image 52 and the current frame image 51 with weighting according to the calculated similarity. That is, as illustrated in FIG. 6, the image composition unit 43 increases the weight of the past image 52 relative to the current frame image 51 as the similarity between the past image 52 after positional shift correction and the current frame image 51 increases. Therefore, in equation (2) above, the weighting coefficient k changes for each pixel group belonging to a local region.
- the similarity is determined, for example, by calculating a correlation function. The similarity increases when the correlation between the corrected past image 52 and the current frame image 51 is high.
- the image composition unit 43 makes the weight of the past image 52 smaller than that of the current frame image 51 when the similarity is low. As a result, the composition ratio of the current frame image 51 becomes larger than that of the past image 52 in regions where there is a change from the current frame (where the similarity is low), so a composite image 53 that accurately reflects the current situation is obtained.
- for example, when a flashing light source appears in a local area, with the past image 52 capturing the extinguished state and the current frame image 51 capturing the lit state, the lit state of the current frame image 51 is accurately reflected in the composite image 53, and a composite image 53 halfway between the lit state and the extinguished state is avoided.
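A toy sketch of this similarity-dependent weighting for a single local region; the similarity function (the patent suggests a correlation function, a mean-absolute-difference stand-in is used here) and the bounds k_min/k_max are assumptions, not values from the patent:

```python
def similarity(a, b):
    """Toy similarity for one local region: 1 / (1 + mean absolute
    difference). Rises toward 1.0 as the two regions agree."""
    diff = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return 1.0 / (1.0 + diff)

def blend_region(cur, past, k_min=0.1, k_max=0.9):
    """Per-region weighted addition: high similarity -> small k (trust
    the denoised past image); low similarity -> large k (trust the
    current frame, e.g. a light that has just switched on)."""
    s = similarity(cur, past)          # s in (0, 1]
    k = k_max - (k_max - k_min) * s    # weight of the current frame
    return [k * c + (1.0 - k) * p for c, p in zip(cur, past)]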
- the first image shift amount calculation unit 41 calculates the first image shift amount G1 based on the detection result of the posture detection device 30 and an estimated image shift amount G3 estimated from the history of second image shift amounts G2 calculated in the past.
- since the second image shift amount G2 is a positional shift amount between frames, it has continuity corresponding to the time-series posture change of the photographing apparatus 11 and the time-series change of the field of view (scenery) accompanying the movement of the moving body 1. Therefore, as shown in FIG. 7, when n is the current frame, the estimated image shift amount G3 of the next frame can be calculated by approximation based on the change in the second image shift amounts G2(n−2), G2(n−1), and G2(n) from a past time point to the present.
- the first image shift amount calculation unit 41 calculates the first image shift amount G1 by weighted addition of the shift amount calculated from the detection result of the posture detection device 30 and the estimated image shift amount G3. As a result, even when noise or the like is included in the detection result of the posture detection device 30, calculation of a time-series outlier as the first image shift amount G1 is suppressed.
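One simple version of this estimation and weighted addition, assuming linear extrapolation as the approximation calculation and a fixed blending weight w (both assumptions; the patent fixes neither):

```python
def estimate_g3(history):
    """Linear extrapolation of the next shift from the last two entries
    of the G2 history -- one simple choice of 'approximation
    calculation'."""
    if len(history) < 2:
        return history[-1] if history else 0.0
    return history[-1] + (history[-1] - history[-2])

def first_shift(sensor_shift, g3, w):
    """Weighted addition of the sensor-derived shift and the estimate
    G3; w is an assumed tuning parameter."""
    return w * sensor_shift + (1.0 - w) * g3
```

If the sensor reading spikes while the G2 history continues smoothly, the blended G1 is pulled back toward the extrapolated value, suppressing the outlier.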
- the CPU 21 of the moving image processing apparatus 20 acquires the current frame image 51 from the photographing apparatus 11.
- the current frame is the nth frame.
- the immediately preceding frame is the (n-1) th frame.
- in step S2, the CPU 21 obtains the current angle information of the imaging device 11 for the current frame (n) from the attitude detection device 30 as a detection result.
- in step S3, the CPU 21 calculates the angle change amount over one frame based on the angle information of the immediately preceding frame (n−1) stored in the memory 22 and the angle information of the current frame (n).
- in step S4, the CPU 21 calculates the first image shift amount G1 from the calculated angle change amount using the above equation (1).
- in step S5, the CPU 21 sets the position shift search region 61 based on the first image shift amount G1, and calculates the second image shift amount G2 by image analysis (pattern matching).
- in step S6, the CPU 21 corrects the positional shift of the past image 52 displayed in the immediately preceding frame (n−1) based on the calculated second image shift amount G2.
- in step S7, the CPU 21 synthesizes the corrected past image 52 and the current frame image 51 with a recursive filter.
- the CPU 21 outputs the obtained composite image 53 to the display element 14 of the HMD 10 as an output image of the current frame (n) corresponding to the current frame image 51 and stores it in the memory 22.
- the image processing of the moving image processing apparatus 20 is performed by repeating the above control processing for each frame.
- the HMD system 100 performs an operation of displaying the moving image photographed by the photographing device 11 on the HMD 10 (display screen 12) as an output image after image processing by the moving image processing device 20.
- as described above, the first image shift amount G1 in the yawing direction and the pitching direction of the imaging device 11 between frames captured by the imaging device 11 is calculated based on the detection result of the attitude detection device 30.
- a second image shift amount calculation unit 42 is provided that calculates the second image shift amount G2 between the past image 52 (the output image of the immediately preceding frame) and the current frame image 51.
- since the first image shift amount G1 based on the detection result of the attitude detection device 30 can be used, a reduction in positional shift detection accuracy can be suppressed even when noise NC prevents sufficient detection accuracy from being obtained by image analysis alone.
- even when the detection result of the posture detection device 30 includes some error, the error can be absorbed by calculating the second image shift amount G2 by image analysis.
- the calculation cost can be reduced as compared with the case where the position shift including the rolling direction is taken into consideration. As a result, the accuracy and robustness of misregistration detection can be improved, and an increase in calculation cost of image processing can be suppressed.
- the second image displacement amount calculation unit 42 is configured to calculate the second image shift amount G2 against the current frame image 51 within the position shift search area 61 set for a part of the past image 52. The position and size of the position shift search area 61 in the past image 52 are set based on the first image shift amount G1. By using the first image shift amount G1 based on the detection result of the posture detection device 30, the position shift search region 61 can be narrowed to a predetermined range centered on the position moved by the first image shift amount G1. As a result, the calculation cost can be effectively reduced in proportion to the narrowing of the position shift search area 61 while ensuring the accuracy of positional shift detection.
- the position shift search region 61 is set as an area centered on the part of the past image 52 (image portion 62) after movement by the first image shift amount G1 and expanded by a range E corresponding to the error range.
- the position shift search area 61 can thus be limited to a minimum range that takes into account the error range of the first image shift amount G1 (the error range of the posture detection device 30), thereby reducing the calculation cost more effectively.
- the first image shift amount calculation unit 41 calculates the first image shift amount G1 based on the detection result of the posture detection device 30 and the estimated image shift amount G3 estimated from the history of second image shift amounts G2 calculated in the past. Thereby, for example, even when noise is added to the detection result of the posture detection device 30, the detection accuracy of the first image shift amount G1 can be ensured by considering the estimated image shift amount G3.
- the image composition unit 43 calculates the similarity between the corrected past image 52 and the current frame image 51 for each local region in the image, and synthesizes the corrected past image 52 and the current frame image 51 with weighting according to the calculated similarity. As a result, in regions where there is a change from the current frame, an output image reflecting the current state can be generated by weighting the current frame image 51 more heavily.
- the image composition unit 43 is configured to make the weight of the corrected past image 52 larger than that of the current frame image 51 as the similarity between the corrected past image 52 and the current frame image 51 increases. As a result, when there is no change in the external environment between the frames of the past image 52 and the current frame image 51 (when the similarity is high), the weight of the noise-containing current frame image 51 can be reduced in the synthesis, so noise can be reduced more effectively.
- the image synthesis unit 43 is configured to synthesize the corrected past image 52 and the current frame image 51 using a recursive filter. As a result, only the weighted addition of the two images needs to be performed, so that the calculation cost can be effectively reduced.
- the image composition unit 43 is configured so that the combined image 53 obtained by the synthesis is output to the display screen 12 as the output image of the current frame corresponding to the current frame image 51 and is stored in the memory 22 as the past image 52 for the next frame. As a result, the synthesis can be performed using the composite image 53 obtained in the immediately preceding frame (n−1), which has the smallest shift from the current frame image 51 (frame n), so the calculation cost can be reduced.
- the first image shift amount calculation unit 41 excludes the image shift caused by parallel movement of the imaging device 11 and calculates only the first image shift amount G1 caused by rotation of the imaging device 11.
- the first image shift amount calculation unit 41 is configured to exclude the rotational positional shift in the rolling direction of the imaging apparatus 11 and to calculate the first image shift amount G1 only in the yawing direction and the pitching direction of the imaging apparatus 11.
- by excluding the rolling direction, the calculation of the first image shift amount G1 is limited to the yawing direction and the pitching direction.
- next, an HMD system according to a second embodiment of the present invention will be described with reference to FIGS. 1, 3, and 9 to 11.
- in the first embodiment, an example was shown in which the position shift search region 61 is set based on the first image shift amount G1, and the second image shift amount G2 is calculated by image analysis (pattern matching) of the position shift search region 61.
- in the second embodiment, an example in which the second image shift amount G2 is calculated from both the first image shift amount G1 and the image analysis shift amount obtained by image analysis will be described.
- the device configuration (hardware configuration) of the HMD system and the moving image processing device is the same as in the first embodiment, but the control processing by the CPU 121 of the moving image processing device 120 differs from that of the first embodiment. For this reason, the same components as those in the first embodiment are denoted by the same reference numerals and description thereof is omitted.
- the second image shift amount calculation unit 42 of the CPU 121 of the moving image processing apparatus 120 (see FIG. 3) is configured to analyze the past image 52 and the current frame image 51 to calculate an image analysis shift amount G11 between the past image 52 and the current frame image 51, and to calculate the second image shift amount G2 based on the first image shift amount G1 and the image analysis shift amount G11.
- the calculation of the image analysis deviation amount G11 can employ a known method such as pattern matching.
- the second image shift amount calculation unit 42 performs image analysis such as pattern matching on the past image 52 and the current frame image 51 over the entire image, thereby calculating the image analysis shift amount G11 between the past image 52 and the current frame image 51.
- the second image shift amount calculation unit 42 calculates the second image shift amount G2 by weighting and adding the obtained first image shift amount G1 and image analysis shift amount G11 using a predetermined weight coefficient r.
- the second image shift amount G2 is calculated by the following equation (3).
- G2 = r × G1 + (1 − r) × G11 … (3)
- the second image shift amount G2 is calculated as an intermediate value of the ratio corresponding to the weight coefficient r with respect to the first image shift amount G1 and the image analysis shift amount G11.
- the second image shift amount calculation unit 42 performs this weighted addition with a weight set according to the amount of noise NC (noise amount) in the current frame image 51, and thereby calculates the second image shift amount G2.
- the amount of noise included in the current frame image 51 changes due to external factors such as the brightness (illuminance) of the outside world.
- when the outside world is bright, the amount of noise is small; the amount of noise increases as the brightness of the outside world decreases.
- the error range of the image analysis deviation amount G11 by image analysis increases as the noise amount increases.
- the error range of the first image shift amount G1 calculated based on the detection result of the posture detection device 30 can be considered to be substantially constant regardless of the brightness of the external environment.
- when the noise amount is small, the image analysis shift amount G11 can achieve higher accuracy than the first image shift amount G1.
- the second image shift amount calculation unit 42 sets the weighting factor r so that the weight of the first image shift amount G1 increases (and the weight of the image analysis shift amount G11 decreases) as the noise amount of the current frame image 51 increases.
- the weighting factor r can be calculated by storing a plurality of representative values in the memory 22 as shown in FIG. 10 and interpolating an intermediate value corresponding to the amount of noise based on the representative values. Alternatively, values of the weighting factor and the ranges of noise amount to which they apply may be stored in the memory 22 in association with each other, and a stepwise value corresponding to the noise amount may be read out.
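A sketch of this interpolation together with equation (3); the representative (noise amount, r) pairs in the table are illustrative only, not values from the patent:

```python
def weight_r(noise, table=((0.0, 0.2), (0.5, 0.5), (1.0, 0.9))):
    """Piecewise-linear interpolation of the weighting factor r from
    stored representative (noise, r) pairs. r rises with the noise
    amount, so the sensor-based G1 dominates when the image is noisy."""
    if noise <= table[0][0]:
        return table[0][1]
    for (n0, r0), (n1, r1) in zip(table, table[1:]):
        if noise <= n1:
            return r0 + (r1 - r0) * (noise - n0) / (n1 - n0)
    return table[-1][1]  # clamp above the last representative value

def second_shift(g1, g11, r):
    """Equation (3): G2 = r*G1 + (1-r)*G11."""
    return r * g1 + (1.0 - r) * g11
```

The table lookup corresponds to the stored representative values in the memory 22; replacing the interpolation with a range lookup gives the stepwise variant also described above.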
- the noise amount of the current frame image 51 may be evaluated by image analysis, or may be evaluated by the brightness of the outside world.
- the past image 52 is corrected based on the obtained second image shift amount G2, and the corrected past image 52 and the current frame image 51 are synthesized.
- Steps S1 to S5 in FIG. 11 are the same as those in the first embodiment (see FIG. 8).
- in step S11, the CPU 121 performs image analysis on the current frame image 51 of the current frame (n) obtained in step S1 and the past image 52 of the immediately preceding frame (n−1) stored in the memory 22, and calculates the image analysis shift amount G11.
- in step S12, the CPU 121 determines the weighting factor r according to the amount of noise included in the current frame image 51 obtained in step S1.
- in step S13, the CPU 121 weights and adds the first image shift amount G1 obtained in step S4 and the image analysis shift amount G11 obtained in step S11 using the weighting coefficient r determined in step S12, and calculates the second image shift amount G2.
- the CPU 121 corrects the past image 52 of the immediately preceding frame (n−1) based on the second image shift amount G2 in step S6, and in step S7 synthesizes the corrected past image 52 and the current frame image 51.
- even when the amount of noise is large, accuracy is maintained by using the first image shift amount G1 when calculating the second image shift amount G2.
- the calculation cost can be reduced by calculating the first image shift amount G1 only in the yawing direction and the pitching direction. As a result, the accuracy and robustness of misregistration detection can be improved, and an increase in the calculation cost of image processing can be suppressed.
- the second image shift amount calculation unit 42 is configured to calculate the second image shift amount G2 based on the first image shift amount G1 and the image analysis shift amount G11. Accordingly, the second image shift amount G2 can be calculated in consideration of both the first image shift amount G1 and the image analysis shift amount G11. For this reason, when the noise NC of the current frame image 51 is large and it is difficult to obtain accuracy in the image analysis shift amount G11, or conversely when noise occurs in the detection result of the posture detection device 30 and it is difficult to obtain accuracy in the first image shift amount G1, the first image shift amount G1 and the image analysis shift amount G11 complement each other's detection accuracy, so the accuracy and robustness of positional shift detection can be effectively improved.
- the second image shift amount calculation unit 42 weights and adds the first image shift amount G1 and the image analysis shift amount G11 according to the noise amount of the current frame image 51 to calculate the second image shift amount G2.
- the second image shift amount calculation unit 42 is configured to increase the weight of the first image shift amount G1 as the noise amount of the current frame image 51 increases.
- the first image shift amount G1 and the image analysis shift amount G11 are positional shift amounts between frames, and thus have chronological continuity. For this reason, as shown in FIG. 12, when for example the current (current frame) first image shift amount G1 changes greatly with respect to the history of the first image shift amount G1 from a past point to the present, the possibility that it is an outlier resulting from noise in the detection result of the posture detection device 30 increases, and the reliability of the first image shift amount G1 can be evaluated as low.
- the second image shift amount calculation unit 42 determines the first image shift amount G1 and the image of the current (current frame) from the history of the first image shift amount G1 and the image analysis shift amount G11. By calculating the change amount V of each analysis deviation amount G11, the reliability of each of the first image deviation amount G1 and the image analysis deviation amount G11 is calculated. The second image shift amount calculation unit 42 calculates a lower reliability as the change amount V is more likely to be an outlier. Then, the second image shift amount calculation unit 42 determines the weighting factor r according to the reliability ratio of the first image shift amount G1 and the image analysis shift amount G11.
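A sketch of this reliability evaluation, treating the deviation from a linear extrapolation of each history as the change amount V; the mapping from V to a reliability score and the ratio-based weighting are assumed forms, not specified by the patent:

```python
def reliability(history, current, scale=1.0):
    """Score in (0, 1]: how well `current` continues its own history.
    The deviation from a linear extrapolation of the last two samples
    is used as the change amount V; larger V (a likely outlier) gives
    lower reliability. `scale` is an assumed tuning constant."""
    predicted = history[-1] + (history[-1] - history[-2])
    v = abs(current - predicted)
    return 1.0 / (1.0 + v / scale)

def weight_from_reliability(rel_g1, rel_g11):
    """Weighting factor r set according to the reliability ratio of G1
    and G11: the more trustworthy amount receives the larger weight."""
    return rel_g1 / (rel_g1 + rel_g11)
```

If the sensor-derived G1 suddenly jumps while G11 stays consistent with its history, rel_g1 drops, r shrinks, and the image analysis shift amount dominates the weighted addition.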
- by setting a relatively large weighting coefficient r for whichever of the first image shift amount G1 and the image analysis shift amount G11 has the higher reliability and performing the weighted addition, the influence of noise on the calculated second image shift amount G2 can be reduced.
- next, in the third embodiment, unlike the first embodiment and the second embodiment, the past image is corrected by the first image shift amount G1, and the second image shift amount G2 is calculated for each local region.
- the device configuration (hardware configuration) of the HMD system and the moving image processing device is the same as in the first embodiment, but the control processing by the CPU 221 of the moving image processing device 220 differs from that of the first embodiment. For this reason, the same components as those in the first embodiment are denoted by the same reference numerals and description thereof is omitted.
- the CPU 221 (see FIG. 3) of the moving image processing apparatus 220 corrects the past image 52 with the first image shift amount G1 and calculates the second image shift amount G2 for each local region 261.
- the second image shift amount calculation unit 42 corrects the position of the past image 52 of the immediately preceding frame (n−1) based on the first image shift amount G1, and calculates the second image shift amount G2 between the position-corrected past image 252 and the current frame image 51 for each local region 261 in the image.
- the image composition unit 43 corrects the position of each local region 261 of the position-corrected past image 252 based on the second image shift amount G2, and synthesizes the corrected past image 253 and the current frame image 51 of the current frame (n).
- the second image shift amount calculation unit 42 first corrects the position of the past image 52 by the first image shift amount G1 so that the position of the entire image matches the current frame image 51. As a result, a corrected past image 252 is obtained.
- the second image shift amount calculation unit 42 calculates the position shift for each local region 261 of the past image 252 whose position has been corrected as the local second image shift amount G2.
- the positional shift for each local region 261 is detected by, for example, pattern matching limited to that local region 261. That is, as shown in FIG. 14, pattern matching is performed over the entire current frame image 51 for each local region 261 set in the position-corrected past image 252. The local regions 261 with the highest correlation are associated with each other, and the positional shift between the associated local regions 261 is calculated as the second image shift amount G2. By repeating the setting of the local region 261 and the pattern matching with the current frame image 51 over the entire region of the position-corrected past image 252, the second image shift amount G2 for each local region 261 is calculated.
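A compact sketch of this per-region matching, using block-wise search with a sum-of-absolute-differences criterion in place of correlation (an equivalent matching measure); the block and search sizes are illustrative:

```python
def block_shifts(past, cur, block=4, search=2):
    """For each block of the position-corrected past image, find the
    integer shift into the current frame with the best match (minimal
    sum of absolute differences). Returns {(by, bx): (dy, dx)} --
    one local second image shift per block."""
    h, w = len(past), len(past[0])
    out = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            best, best_d = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    if not (0 <= by + dy and by + dy + block <= h
                            and 0 <= bx + dx and bx + dx + block <= w):
                        continue  # shifted block leaves the image
                    sad = sum(abs(past[by + y][bx + x]
                                  - cur[by + dy + y][bx + dx + x])
                              for y in range(block) for x in range(block))
                    if best is None or sad < best:
                        best, best_d = sad, (dy, dx)
            out[(by, bx)] = best_d
    return out
```

Blocks containing only static background report shifts within the error range of the first image shift amount G1, while blocks covering a moving object report its local motion, which is then used to correct each local region before synthesis.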
- the image composition unit 43 corrects the position of each local region 261 in the position-corrected past image 252 based on the second image shift amount G2, and obtains a past image 253 corrected for each local region 261. Then, the image composition unit 43 synthesizes the corrected past image 253 and the current frame image 51. This makes it possible to align the moving object 262 with the current frame image 51 and reduce the afterimage that would otherwise remain after correcting only the entire-image positional shift between the past image 52 and the current frame image 51.
- for a local region 261 of the past image 52 in which the moving object 262 is not reflected, the correlation is usually highest with the local region 261 at the same position in the current frame image 51. If an error is included in the first image shift amount G1, the correlation becomes highest at a position shifted from the same position in the current frame image 51 by the error range, and this positional shift is calculated as the second image shift amount G2 for each local region 261.
- Steps S1 to S5 in FIG. 15 are the same as those in the first embodiment (see FIG. 8).
- step S21 the CPU 221 corrects the displacement of the entire image of the past image 52 of the immediately preceding frame (n ⁇ 1) based on the first image displacement amount G1 obtained in step S4.
- In step S22, the CPU 221 calculates the second image shift amount G2 for each local region 261 between the local regions 261 of the past image 252 position-corrected in step S21 and the current frame image 51 acquired in step S1.
- In step S23, the CPU 221 corrects the positional deviation of each local region 261 based on the obtained second image shift amount G2, and in step S7 the position-corrected past image 253 of the immediately preceding frame (n−1) is synthesized with the current frame image 51 of the current frame (n).
- Because the positional deviation is corrected in advance using the first image shift amount G1, accuracy can be maintained even when the noise amount is somewhat large, and the calculation cost can be reduced by calculating the first image shift amount G1 only in the yawing direction and the pitching direction. As a result, the accuracy and robustness of positional deviation detection can be improved while an increase in the calculation cost of image processing is suppressed.
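The restriction of the sensor-based first image shift amount to the yawing and pitching directions can be sketched under a pinhole-camera assumption: a small rotation maps to a pixel shift via the focal length expressed in pixels. The tangent mapping, the focal-length parameter and the function name are illustrative assumptions, not taken from the patent text.

```python
import math

def first_image_shift(d_yaw_rad, d_pitch_rad, focal_px):
    """Sketch of a sensor-based first image shift G1 (in pixels) from the
    yaw/pitch change between frames, for a pinhole camera with focal length
    in pixels. Rolling and translation are excluded, as in the embodiment."""
    dx = focal_px * math.tan(d_yaw_rad)    # horizontal shift from yawing
    dy = focal_px * math.tan(d_pitch_rad)  # vertical shift from pitching
    return dx, dy
```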
- The second image shift amount calculation unit 42 corrects the position of the past image 52 based on the first image shift amount G1, and calculates the second image shift amount G2 between the position-corrected past image 252 and the current frame image 51 for each local region 261 in the image.
- The image synthesis unit 43 corrects the position of each local region 261 of the position-corrected past image 252 based on the second image shift amount G2, and synthesizes the corrected past image 253 with the current frame image 51.
- As a result, the position of each local region 261 can be corrected locally based on the second image shift amount G2. Therefore, even when a moving object 262 appears locally within the image, the noise of the entire image can be reduced while a local afterimage in the image portion where the moving object 262 appears is suppressed.
- the present invention is not limited to this.
- The present invention may also be applied to an HMD system mounted on a land vehicle such as an automobile or a watercraft such as a ship.
- the present invention may be applied to, for example, an HMD system used for a game application or an HMD system used by being worn by a user in daily life.
- The posture detection device 30 may be a sensor such as an acceleration sensor or a gyro sensor built into the HMD 10, or a combination of a sensor-based method and an optical method; any method by which the posture detection device detects the posture of the imaging device may be used.
- The moving image processing apparatuses 20, 120 and 220 are not limited to the configurations described above; for example, the moving image processing apparatus may be a separate processing unit externally connected to the HMD.
- Although a pair each of the imaging devices 11 and display elements 14 of the HMD 10 and the CPUs 21 of the moving image processing apparatus 20 are provided to display the left-eye image and the right-eye image, the present invention is not limited to this. For example, an HMD that projects an image onto only one eye may be used, in which case the imaging device, the display element, the CPU of the moving image processing apparatus and the like may each be provided singly. As for the CPU of the moving image processing apparatus, both the left-eye image and the right-eye image may also be generated by a single CPU.
- Although the current frame image 51 of the current frame (n) and the past image 52 of the immediately preceding frame (n−1) are synthesized by a recursive filter, the present invention is not limited to this.
- the current frame image and the past image may be synthesized using a method other than the recursive filter.
- the synthesis may be performed by a moving average filter that calculates a moving average of a past image and a current frame image for a plurality of frames.
- the moving average filter increases the calculation cost compared to the recursive filter.
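To make the cost trade-off mentioned above concrete, the following sketch contrasts the two filters on per-frame values: the moving-average filter must buffer the last n frames, while the recursive filter keeps only its previous output. The function names and parameters are illustrative assumptions.

```python
from collections import deque

def recursive_filter(frames, k=0.5):
    """Recursive (IIR) temporal filter: only the previous output is stored."""
    out = None
    for f in frames:
        out = f if out is None else k * f + (1 - k) * out
        yield out

def moving_average_filter(frames, n=4):
    """Moving-average (FIR) filter: the last n frames must all be stored,
    so memory and summation cost grow with n compared to the recursive filter."""
    buf = deque(maxlen=n)
    for f in frames:
        buf.append(f)
        yield sum(buf) / len(buf)
```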
Abstract
Description
First, the overall configuration of a head-mounted display system (hereinafter, "HMD system") 100 according to a first embodiment of the present invention will be described with reference to FIGS. 1 to 8. In the first embodiment, an example will be described in which the HMD system 100 is mounted on a moving body 1 that moves in the air, in particular an aircraft (airplane or helicopter), among moving bodies that carry people, such as automobiles, aircraft and ships.
As shown in FIG. 1, the HMD system 100 according to the first embodiment includes a head-mounted display (hereinafter, "HMD") 10 provided with an imaging device 11 and a display screen 12, a moving image processing apparatus 20 for displaying images captured by the imaging device 11 on the display screen 12, and a posture detection device 30 capable of detecting the posture of the imaging device 11. The posture detection device 30 is an example of the "posture detection means" of the present invention.
Next, a detailed configuration relating to the image processing of the moving image processing apparatus 20 will be described. As shown in FIG. 3, when the control processing executed by the CPU 21 of the moving image processing apparatus 20 is described in terms of functional blocks, the CPU 21 includes a first image shift amount calculation unit 41, a second image shift amount calculation unit 42 and an image synthesis unit 43. These are examples of the "first image shift amount calculation means", the "second image shift amount calculation means" and the "image synthesis means" of the present invention, respectively.
Qn(i,j)=k×Pn(i,j)+(1-k)×Qn-1(i,j)…(2)
Here, i and j are the pixel indices in the X direction and the Y direction of the image.
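Because equation (2) is applied independently at every pixel (i, j), it can be computed for whole frames at once; a minimal NumPy sketch (the function name and the float conversion are assumptions for illustration):

```python
import numpy as np

def synthesize(current, past, k):
    """Equation (2): Q_n(i, j) = k * P_n(i, j) + (1 - k) * Q_{n-1}(i, j),
    applied to entire images. `current` is P_n, `past` is the already
    position-corrected Q_{n-1}; k in (0, 1] sets the newest frame's weight."""
    return k * current.astype(np.float64) + (1.0 - k) * past.astype(np.float64)
```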
Next, the control processing of the moving image processing apparatus 20 will be described with reference to FIG. 8. In the following, the processing by the first image shift amount calculation unit 41, the second image shift amount calculation unit 42 and the image synthesis unit 43 is described as processing by the CPU 21 without distinction.
The first embodiment provides the following effects.
Next, an HMD system according to a second embodiment of the present invention will be described with reference to FIGS. 1, 3 and 9 to 11. Unlike the first embodiment, in which the positional deviation search region 61 is set based on the first image shift amount G1 and the second image shift amount G2 is calculated by image analysis (pattern matching) of the positional deviation search region 61, the second embodiment calculates the second image shift amount G2 from both the first image shift amount G1 and an image analysis shift amount obtained by image analysis. In the second embodiment, the device configuration (hardware configuration) of the HMD system and the moving image processing apparatus is the same as in the first embodiment, while the control processing by the CPU 121 of the moving image processing apparatus 120 differs from the first embodiment. Therefore, components identical to those in the first embodiment are denoted by the same reference numerals, and their description is omitted.
G2=r×G1+(1-r)×G11…(3)
As a result, the second image shift amount G2 is calculated as an intermediate value between the first image shift amount G1 and the image analysis shift amount G11, in a ratio determined by the weight coefficient r.
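A minimal sketch of equation (3) with a noise-dependent weight r, as described for the second embodiment. The linear mapping from noise level to r, its clipping to [0, 1], and the parameter names are illustrative assumptions; the embodiment only specifies that the weight of G1 grows with the noise amount.

```python
def blend_shift(g1, g11, noise_level, noise_max=1.0):
    """Equation (3): G2 = r * G1 + (1 - r) * G11. Here r grows with the
    noise level of the current frame, so the sensor-based shift G1
    dominates when image analysis becomes unreliable."""
    r = max(0.0, min(1.0, noise_level / noise_max))  # illustrative mapping
    return r * g1 + (1.0 - r) * g11
```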
Next, the control processing of the moving image processing apparatus 120 according to the second embodiment will be described with reference to FIGS. 8 and 11.
The second embodiment provides the following effects.
While the second embodiment shows an example in which the weight coefficient is determined according to the noise amount of the current frame image 51, a modification of the second embodiment determines the weight coefficient r based on the respective amounts of change of the first image shift amount G1 and the image analysis shift amount G11.
Next, an HMD system according to a third embodiment of the present invention will be described with reference to FIGS. 1, 3 and 13 to 15. Unlike the first and second embodiments, the third embodiment corrects the past image using the first image shift amount G1 and then calculates the second image shift amount G2 for each local region. In the third embodiment, the device configuration (hardware configuration) of the HMD system and the moving image processing apparatus is the same as in the first embodiment, while the control processing by the CPU 221 of the moving image processing apparatus 220 differs from the first embodiment. Therefore, components identical to those in the first embodiment are denoted by the same reference numerals, and their description is omitted.
Next, the control processing of the moving image processing apparatus 220 according to the third embodiment will be described with reference to FIGS. 8 and 15.
The third embodiment provides the following effects.
10 HMD (head-mounted display)
11 Imaging device
12 Display screen
20, 120, 220 Moving image processing apparatus
30 Posture detection device (posture detection means)
41 First image shift amount calculation unit (first image shift amount calculation means)
42 Second image shift amount calculation unit (second image shift amount calculation means)
43 Image synthesis unit (image synthesis means)
51 Current frame image
52, 252, 253 Past image
61 Positional deviation search region
100, 200, 300 HMD system (head-mounted display system)
G1 First image shift amount
G2 Second image shift amount
G11 Image analysis shift amount
Claims (16)
- A moving image processing apparatus for a head-mounted display that is worn on a user's head and includes an imaging device for capturing images at predetermined frame intervals and a display screen, the apparatus comprising:
posture detection means capable of detecting the posture of the imaging device worn on the user's head;
first image shift amount calculation means for calculating, based on a detection result of the posture detection means, a first image shift amount in the yawing direction and the pitching direction of the imaging device between frames captured by the imaging device;
second image shift amount calculation means for calculating a second image shift amount between a past image and a current frame image based on the first image shift amount, the current frame image captured by the imaging device, and the past image based on one or more frame images previously captured by the imaging device; and
image synthesis means for correcting the past image based on the second image shift amount and synthesizing the corrected past image with the current frame image. - The moving image processing apparatus for a head-mounted display according to claim 1, wherein the second image shift amount calculation means is configured to calculate the second image shift amount with respect to the current frame image in a positional deviation search region set for a part of the past image,
and the position and size of the positional deviation search region in the past image are set based on the first image shift amount. - The moving image processing apparatus for a head-mounted display according to claim 2, wherein the positional deviation search region is set to a region extended, around the part of the past image after movement by the first image shift amount, by a range corresponding to an error range of the first image shift amount.
- The moving image processing apparatus for a head-mounted display according to claim 1, wherein the second image shift amount calculation means is configured to analyze the past image and the current frame image to calculate an image analysis shift amount between the past image and the current frame image, and to calculate the second image shift amount based on the first image shift amount and the image analysis shift amount.
- The moving image processing apparatus for a head-mounted display according to claim 4, wherein the second image shift amount calculation means is configured to calculate the second image shift amount by weighted addition of the first image shift amount and the image analysis shift amount according to a noise amount of the current frame image.
- The moving image processing apparatus for a head-mounted display according to claim 5, wherein the second image shift amount calculation means increases the weight of the first image shift amount as the noise amount of the current frame image increases.
- The moving image processing apparatus for a head-mounted display according to claim 1, wherein the second image shift amount calculation means is configured to perform position correction of the past image based on the first image shift amount and to calculate the second image shift amount between the position-corrected past image and the current frame image for each local region in the image,
and the image synthesis means is configured to correct the position of each local region of the position-corrected past image based on the second image shift amount and to synthesize the corrected past image with the current frame image. - The moving image processing apparatus for a head-mounted display according to claim 1, wherein the first image shift amount calculation means is configured to calculate the first image shift amount based on the detection result of the posture detection means and an estimated image shift amount estimated from a history of previously calculated second image shift amounts.
- The moving image processing apparatus for a head-mounted display according to claim 1, wherein the image synthesis means is configured to calculate a similarity between the corrected past image and the current frame image for each local region in the image, and to synthesize the corrected past image and the current frame image with weighting according to the calculated similarity.
- The moving image processing apparatus for a head-mounted display according to claim 1, wherein the image synthesis means increases the weight of the corrected past image relative to the current frame image as the similarity between the corrected past image and the current frame image increases.
- The moving image processing apparatus for a head-mounted display according to claim 1, wherein the image synthesis means is configured to synthesize the corrected past image and the current frame image using a recursive filter.
- The moving image processing apparatus for a head-mounted display according to claim 1, wherein the image synthesis means is configured to output a synthesized image obtained by the synthesis to the display screen as an output image of the current frame corresponding to the current frame image, and to store the synthesized image as the past image for the next frame.
- The moving image processing apparatus for a head-mounted display according to claim 1, wherein the first image shift amount calculation means excludes image shift caused by parallel translation of the imaging device and calculates the first image shift amount caused by rotation of the imaging device.
- The moving image processing apparatus for a head-mounted display according to claim 1, wherein the first image shift amount calculation means excludes rotational positional deviation in the rolling direction of the imaging device and calculates the first image shift amount in the yawing direction and the pitching direction of the imaging device.
- A moving image processing method for a head-mounted display that is worn on a user's head and includes an imaging device for capturing images at predetermined frame intervals and a display screen, the method comprising:
detecting the posture of the imaging device worn on the user's head;
calculating, based on a detection result of the posture of the imaging device, a first image shift amount in the yawing direction and the pitching direction of the imaging device between frames captured by the imaging device;
calculating a second image shift amount between a past image and a current frame image based on the first image shift amount, the current frame image captured by the imaging device, and the past image based on one or more frame images previously captured by the imaging device; and
correcting the past image based on the second image shift amount and synthesizing the corrected past image with the current frame image. - A head-mounted display system comprising: a head-mounted display that is worn on a user's head and includes an imaging device for capturing images at predetermined frame intervals and a display screen;
a posture detection device capable of detecting the posture of the imaging device worn on the user's head; and
a moving image processing apparatus for displaying images captured by the imaging device on the display screen,
wherein the moving image processing apparatus includes:
first image shift amount calculation means for calculating, based on a detection result of the posture detection device, a first image shift amount in the yawing direction and the pitching direction of the imaging device between frames captured by the imaging device;
second image shift amount calculation means for calculating a second image shift amount between a past image and a current frame image based on the first image shift amount, the current frame image captured by the imaging device, and the past image based on one or more frame images previously captured by the imaging device; and
image synthesis means for correcting the past image based on the second image shift amount and synthesizing the corrected past image with the current frame image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/534,560 US10948728B2 (en) | 2014-12-15 | 2015-12-02 | Dynamic image processing device for head mounted display, dynamic image processing method for head mounted display and head mounted display system |
JP2016564776A JP6555279B2 (ja) | 2014-12-15 | 2015-12-02 | ヘッドマウントディスプレイ用の動画像処理装置、ヘッドマウントディスプレイ用の動画像処理方法およびヘッドマウントディスプレイシステム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-252871 | 2014-12-15 | ||
JP2014252871 | 2014-12-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016098591A1 true WO2016098591A1 (ja) | 2016-06-23 |
Family
ID=56126485
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/083849 WO2016098591A1 (ja) | 2014-12-15 | 2015-12-02 | ヘッドマウントディスプレイ用の動画像処理装置、ヘッドマウントディスプレイ用の動画像処理方法およびヘッドマウントディスプレイシステム |
Country Status (3)
Country | Link |
---|---|
US (1) | US10948728B2 (ja) |
JP (1) | JP6555279B2 (ja) |
WO (1) | WO2016098591A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108241213A (zh) * | 2016-12-26 | 2018-07-03 | 乐金显示有限公司 | 头戴式显示器及其控制方法 |
JP2019184830A (ja) * | 2018-04-10 | 2019-10-24 | キヤノン株式会社 | 画像表示装置、画像表示方法 |
JP2020135337A (ja) * | 2019-02-19 | 2020-08-31 | コベルコ建機株式会社 | 作業機械、作業機械における覚醒判定方法、および、作業機械における覚醒判定プログラム |
US11256934B2 (en) | 2017-12-07 | 2022-02-22 | Samsung Electronics Co., Ltd. | Vehicle and method for controlling same |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5924020B2 (ja) * | 2012-02-16 | 2016-05-25 | セイコーエプソン株式会社 | プロジェクター、及び、プロジェクターの制御方法 |
US10324290B2 (en) * | 2015-12-17 | 2019-06-18 | New Skully, Inc. | Situational awareness systems and methods |
KR102360412B1 (ko) * | 2017-08-25 | 2022-02-09 | 엘지디스플레이 주식회사 | 영상 생성 방법과 이를 이용한 표시장치 |
DE102020126953B3 (de) * | 2020-10-14 | 2021-12-30 | Bayerische Motoren Werke Aktiengesellschaft | System und Verfahren zum Erfassen einer räumlichen Orientierung einer tragbaren Vorrichtung |
US20220353489A1 (en) * | 2021-04-30 | 2022-11-03 | Microsoft Technology Licensing, Llc | Systems and methods for efficient generation of single photon avalanche diode imagery with persistence |
US11881129B2 (en) * | 2021-04-30 | 2024-01-23 | Microsoft Technology Licensing, Llc | Systems and methods for adding persistence to single photon avalanche diode imagery |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0223514U (ja) * | 1988-07-30 | 1990-02-16 | ||
JP2002272724A (ja) * | 2001-03-21 | 2002-09-24 | Toshiba Corp | ノイズ低減フィルタ及びx線診断システム |
WO2011019461A1 (en) * | 2009-08-10 | 2011-02-17 | Wisconsin Alumni Research Foundation | Vision system and method for motion adaptive integration of image frames |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6522312B2 (en) * | 1997-09-01 | 2003-02-18 | Canon Kabushiki Kaisha | Apparatus for presenting mixed reality shared among operators |
WO2007032082A1 (ja) | 2005-09-16 | 2007-03-22 | Fujitsu Limited | 画像処理方法及び画像処理装置 |
JP4665872B2 (ja) * | 2006-08-30 | 2011-04-06 | 株式会社島津製作所 | ヘッドモーショントラッカ装置 |
JP5113426B2 (ja) * | 2007-05-29 | 2013-01-09 | キヤノン株式会社 | 頭部装着型表示装置、及びその制御方法 |
JP4924342B2 (ja) * | 2007-10-01 | 2012-04-25 | 株式会社島津製作所 | モーショントラッカ装置 |
JP5047090B2 (ja) * | 2008-07-31 | 2012-10-10 | キヤノン株式会社 | システム |
JP2011203823A (ja) * | 2010-03-24 | 2011-10-13 | Sony Corp | 画像処理装置、画像処理方法及びプログラム |
WO2012063542A1 (ja) * | 2010-11-09 | 2012-05-18 | 富士フイルム株式会社 | 拡張現実感提供装置 |
US8976086B2 (en) * | 2010-12-03 | 2015-03-10 | Esight Corp. | Apparatus and method for a bioptic real time video system |
JP2013120988A (ja) | 2011-12-06 | 2013-06-17 | Shimadzu Corp | 頭部装着型表示装置 |
US9597574B2 (en) * | 2011-12-30 | 2017-03-21 | Nike, Inc. | Golf aid including heads up display |
CN105229719B (zh) * | 2013-03-15 | 2018-04-27 | 奇跃公司 | 显示系统和方法 |
US9335547B2 (en) * | 2013-03-25 | 2016-05-10 | Seiko Epson Corporation | Head-mounted display device and method of controlling head-mounted display device |
GB201310359D0 (en) * | 2013-06-11 | 2013-07-24 | Sony Comp Entertainment Europe | Head-Mountable apparatus and systems |
JP5910613B2 (ja) | 2013-11-01 | 2016-04-27 | カシオ計算機株式会社 | 画像合成装置及びプログラム |
- 2015-12-02 WO PCT/JP2015/083849 patent/WO2016098591A1/ja active Application Filing
- 2015-12-02 JP JP2016564776A patent/JP6555279B2/ja active Active
- 2015-12-02 US US15/534,560 patent/US10948728B2/en active Active
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108241213A (zh) * | 2016-12-26 | 2018-07-03 | 乐金显示有限公司 | 头戴式显示器及其控制方法 |
CN108241213B (zh) * | 2016-12-26 | 2020-08-11 | 乐金显示有限公司 | 头戴式显示器及其控制方法 |
US11256934B2 (en) | 2017-12-07 | 2022-02-22 | Samsung Electronics Co., Ltd. | Vehicle and method for controlling same |
JP2019184830A (ja) * | 2018-04-10 | 2019-10-24 | キヤノン株式会社 | 画像表示装置、画像表示方法 |
JP7121523B2 (ja) | 2018-04-10 | 2022-08-18 | キヤノン株式会社 | 画像表示装置、画像表示方法 |
JP2020135337A (ja) * | 2019-02-19 | 2020-08-31 | コベルコ建機株式会社 | 作業機械、作業機械における覚醒判定方法、および、作業機械における覚醒判定プログラム |
Also Published As
Publication number | Publication date |
---|---|
US10948728B2 (en) | 2021-03-16 |
JP6555279B2 (ja) | 2019-08-07 |
US20170343823A1 (en) | 2017-11-30 |
JPWO2016098591A1 (ja) | 2017-07-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6555279B2 (ja) | ヘッドマウントディスプレイ用の動画像処理装置、ヘッドマウントディスプレイ用の動画像処理方法およびヘッドマウントディスプレイシステム | |
CN108139204B (zh) | 信息处理装置、位置和/或姿态的估计方法及记录介质 | |
JP7026214B2 (ja) | ヘッドマウントディスプレイ追跡システム | |
US9747725B2 (en) | Video system for piloting a drone in immersive mode | |
US10491819B2 (en) | Portable system providing augmented vision of surroundings | |
US10365710B2 (en) | Head-mounted display device configured to display a visual element at a location derived from sensor data and perform calibration | |
US8587659B1 (en) | Method and apparatus for dynamic image registration | |
JP2015231106A (ja) | ヘッドマウントディスプレイ装置及びヘッドマウントディスプレイシステム | |
CN104781873A (zh) | 图像显示装置、图像显示方法、移动装置、图像显示系统、以及计算机程序 | |
US11443540B2 (en) | Information processing apparatus and information processing method | |
KR20180049082A (ko) | 관성 센서 데이터 정정 | |
US9626783B2 (en) | Helmet-used device capable of automatically adjusting positions of displayed information and helmet thereof | |
JP2001346200A (ja) | 画像切出し/表示システム | |
US20230232103A1 (en) | Image processing device, image display system, method, and program | |
WO2021153577A1 (ja) | 視線検出装置のキャリブレーション | |
US11112860B2 (en) | Helmet tracker buffeting compensation | |
TWI436270B (zh) | 虛擬望遠方法及其裝置 | |
JP2009006968A (ja) | 車両用表示装置 | |
JP2018056845A (ja) | 作業支援装置、システム、方法及びプログラム | |
US9970766B2 (en) | Platform-mounted artificial vision system | |
US20230251709A1 (en) | Electronic device, control method of electronic device, and non-transitory computer readable medium | |
JP5332127B2 (ja) | 頭部装着型表示装置 | |
JP2023132157A (ja) | 画像表示装置、画像表示方法、及びプログラム | |
JP2015133668A (ja) | 撮像装置及び画像処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15869797 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016564776 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15534560 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15869797 Country of ref document: EP Kind code of ref document: A1 |