WO2014077046A1 - Image display device and image display method, mobile device, image display system, and computer program
- Publication number
- WO2014077046A1 · PCT/JP2013/076830 · JP2013076830W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- mobile device
- head
- user
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 55
- 238000004590 computer program Methods 0.000 title claims description 12
- 238000001514 detection method Methods 0.000 claims abstract description 50
- 230000033001 locomotion Effects 0.000 claims abstract description 47
- 238000004891 communication Methods 0.000 claims description 34
- 230000001133 acceleration Effects 0.000 claims description 11
- 230000006870 function Effects 0.000 claims description 7
- 230000035807 sensation Effects 0.000 claims description 4
- 239000002131 composite material Substances 0.000 abstract description 4
- 210000003128 head Anatomy 0.000 description 86
- 238000010586 diagram Methods 0.000 description 31
- 238000012545 processing Methods 0.000 description 25
- 238000005516 engineering process Methods 0.000 description 21
- 230000008569 process Effects 0.000 description 12
- 230000007246 mechanism Effects 0.000 description 9
- 239000000203 mixture Substances 0.000 description 9
- 230000003287 optical effect Effects 0.000 description 8
- 238000012937 correction Methods 0.000 description 6
- 230000000694 effects Effects 0.000 description 5
- 230000005540 biological transmission Effects 0.000 description 4
- 238000005401 electroluminescence Methods 0.000 description 4
- 239000011521 glass Substances 0.000 description 4
- 238000003384 imaging method Methods 0.000 description 4
- 230000004044 response Effects 0.000 description 4
- 239000004973 liquid crystal related substance Substances 0.000 description 3
- 241001465754 Metazoa Species 0.000 description 2
- 230000008901 benefit Effects 0.000 description 2
- 230000008859 change Effects 0.000 description 2
- 239000003638 chemical reducing agent Substances 0.000 description 2
- 230000007423 decrease Effects 0.000 description 2
- 230000008451 emotion Effects 0.000 description 2
- 230000004886 head movement Effects 0.000 description 2
- 230000001788 irregular Effects 0.000 description 2
- 230000004807 localization Effects 0.000 description 2
- 230000000007 visual effect Effects 0.000 description 2
- 239000011165 3D composite Substances 0.000 description 1
- 241000251468 Actinopterygii Species 0.000 description 1
- 241000271566 Aves Species 0.000 description 1
- 241000938605 Crocodylia Species 0.000 description 1
- 241000238631 Hexapoda Species 0.000 description 1
- 241000124008 Mammalia Species 0.000 description 1
- 208000012886 Vertigo Diseases 0.000 description 1
- 230000009471 action Effects 0.000 description 1
- 230000037007 arousal Effects 0.000 description 1
- 230000003190 augmentative effect Effects 0.000 description 1
- 230000036760 body temperature Effects 0.000 description 1
- 239000000284 extract Substances 0.000 description 1
- 238000013213 extrapolation Methods 0.000 description 1
- 210000000744 eyelid Anatomy 0.000 description 1
- 238000003702 image correction Methods 0.000 description 1
- 238000007654 immersion Methods 0.000 description 1
- 238000009434 installation Methods 0.000 description 1
- 230000007257 malfunction Effects 0.000 description 1
- 230000006996 mental state Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 201000003152 motion sickness Diseases 0.000 description 1
- 230000003183 myoelectrical effect Effects 0.000 description 1
- 238000004091 panning Methods 0.000 description 1
- 230000035790 physiological processes and functions Effects 0.000 description 1
- 210000001747 pupil Anatomy 0.000 description 1
- 210000001525 retina Anatomy 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 238000009987 spinning Methods 0.000 description 1
- 230000006641 stabilisation Effects 0.000 description 1
- 238000011105 stabilization Methods 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 210000004243 sweat Anatomy 0.000 description 1
- 208000024891 symptom Diseases 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
- 230000007704 transition Effects 0.000 description 1
Classifications
- G02B27/02—Viewing or reading apparatus
- G02B27/017—Head-up displays; head mounted
- B64U20/87—Constructional aspects of UAVs; mounting of imaging devices, e.g. mounting of gimbals
- G06F3/147—Digital output to display device using display panels
- G09G5/00—Control arrangements or circuits for visual indicators
- G09G5/36—Display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/38—Graphic pattern display with means for controlling the display position
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
- H04N13/296—Image signal generators; synchronisation or control thereof
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
- H04N13/366—Image reproducers using viewer tracking
- H04N23/40—Circuit details for pick-up tubes
- B64U2101/30—UAVs specially adapted for imaging, photography or videography
- G02B2027/0138—Head-up displays comprising image capture systems, e.g. camera
- G02B2027/0141—Head-up displays characterised by the informative content of the display
- G09G2340/045—Zooming at least part of an image, i.e. enlarging it or shrinking it
- G09G2340/0464—Positioning
- G09G2340/12—Overlay of images
- G09G2340/14—Solving problems related to the presentation of information to be displayed
- G09G2354/00—Aspects of interface with display user
- G09G3/003—Control arrangements using specific devices to produce spatial visual effects
- G09G5/14—Display of multiple viewports
Definitions
- The technology disclosed in this specification relates to an image display device worn on the user's head or face for viewing images, an image display method, a mobile device that captures images while moving, an image display system, and a computer program. In particular, it relates to viewing images captured by an external camera, such as whole-sky images, fisheye images, and panoramic images.
- An image display device that is worn on a head or a face and is used for viewing an image, that is, a head-mounted display is known.
- A display unit is disposed for each of the left and right eyes, and a virtual-image optical system forms an enlarged virtual image of the display image, allowing the user to observe a realistic image.
- When the head-mounted display is configured to completely block out the outside world while worn on the head, the sense of immersion during viewing increases.
- the head-mounted display can also display different images for the left and right eyes, and can display a 3D image by displaying an image with parallax for the left and right eyes.
- a wide-angle image can be viewed using a head-mounted display.
- A head-mounted display has been proposed in which a gyro sensor is attached to the head so that a 360-degree image of the entire space can be presented following the movement of the user's head (see, for example, Patent Documents 1 and 2).
- By moving the display area so as to cancel the head movement detected by the gyro sensor, an image that follows the movement of the user's head can be presented.
- Another possible application is to view live images taken with an external camera on a head-mounted display.
- An image display system has also been proposed in which an image actually captured by an imaging device mounted on a moving body other than a person, such as a radio-controlled model, is displayed on a display device worn by the user (see, for example, Patent Document 3).
- In FPV (First-Person View), maneuvering is performed while viewing a first-person (pilot's-eye) image captured by a wireless camera mounted on a radio-controlled model such as a helicopter.
- An aerial imaging system has also been proposed in which a small helicopter, whose flight altitude and flight speed are controlled by a flight control device, carries an omnidirectional camera that captures the surroundings and a laser rangefinder that measures the camera's altitude above the ground, and performs aerial imaging with the omnidirectional camera at a predetermined altitude (see, for example, Patent Document 4). Images taken by the omnidirectional camera can be transmitted to an external computer via a communication network.
- An object of the technology disclosed in this specification is to provide an excellent image display device, image display method, mobile device, image display system, and computer program suited to viewing images captured by an external camera, such as whole-sky images, fisheye images, and panoramic images.
- A further object of the technology disclosed in this specification is to provide an excellent image display device, image display method, mobile device, image display system, and computer program capable of suitably displaying a first-person-view image captured by a camera mounted on a mobile device such as a radio-controlled model.
- The display control unit of the image display device is configured to cut out an area corresponding to the posture of the head from a wide-angle image captured by the mobile device and display it on the display unit.
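The cutting-out step above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation; it assumes an equirectangular whole-sky image and hypothetical field-of-view parameters:

```python
import numpy as np

def crop_for_head_pose(panorama, yaw_deg, pitch_deg, fov_h=90.0, fov_v=60.0):
    """Cut out the display region of an equirectangular whole-sky image.

    panorama : H x W x 3 array covering 360 deg horizontally, 180 deg vertically.
    yaw_deg / pitch_deg : head posture reported by the attitude sensor.
    fov_h / fov_v : display field of view in degrees (assumed values).
    """
    h, w = panorama.shape[:2]
    crop_w = max(1, int(fov_h / 360.0 * w))
    crop_h = max(1, int(fov_v / 180.0 * h))
    cx = int((yaw_deg % 360.0) / 360.0 * w)    # view-centre column
    cy = int((90.0 - pitch_deg) / 180.0 * h)   # view-centre row
    # Horizontal coordinates wrap around at 360 degrees; vertical ones clamp.
    cols = np.arange(cx - crop_w // 2, cx + (crop_w + 1) // 2) % w
    top = int(np.clip(cy - crop_h // 2, 0, h - crop_h))
    return panorama[top:top + crop_h][:, cols]
```

For example, with a panorama stored at one pixel per degree (360 x 180), a 90 x 60 degree field of view yields a 90 x 60 pixel crop centred on the head direction.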
- The display control unit of the image display device displays a plurality of viewpoint images captured from a plurality of viewpoints by a pan-focus parallel method, and adjusts the convergence position of the viewpoint images based on the moving speed of the moving body.
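A possible speed-to-convergence mapping is sketched below. The linear decay and all parameter names are illustrative assumptions, the premise being that a faster-moving body makes the operator gaze farther ahead, so the convergence plane recedes toward infinity:

```python
def convergence_shift_px(speed_mps, max_shift_px=20.0, speed_at_infinity=20.0):
    """Horizontal shift (in pixels) applied in opposite directions to the
    left- and right-eye images to move the convergence plane.

    At standstill the convergence plane is pulled near (maximum shift);
    as the moving body speeds up, the operator is assumed to gaze farther
    ahead, so the shift decays toward zero (convergence at infinity).
    All parameter values here are illustrative assumptions.
    """
    s = min(max(speed_mps, 0.0), speed_at_infinity)
    return max_shift_px * (1.0 - s / speed_at_infinity)
```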
- The display control unit of the image display device according to claim 1 is configured to display, on the display unit, images captured at an inter-viewpoint distance corresponding to a zoom operation.
- The image display device is configured so that the line-of-sight direction of the camera unit of the mobile device can be offset in at least one of the pan, tilt, and roll directions with respect to the posture of the head.
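The pan/tilt/roll offset can be illustrated by a simple Euler-angle addition; the tuple layout and the wrap-around normalization are illustrative assumptions, not the patent's method:

```python
def offset_line_of_sight(head_pose, offset):
    """Apply a fixed (pan, tilt, roll) offset, in degrees, to the head
    posture to obtain the commanded camera line of sight.
    """
    def wrap(a):
        return (a + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)
    return tuple(wrap(h + o) for h, o in zip(head_pose, offset))
```

For instance, a fixed tilt offset of +30 degrees keeps the camera looking above the head direction, matching the situation of FIG. 15 where the camera's tilt axis is rotated upward relative to the user's.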
- The image display device further includes an operation-feeling feedback unit that feeds back a feeling of operation to the user through tactile sensation or vibration, and is configured to provide feedback to the user based on the acceleration the mobile device experiences while moving.
- The display control unit of the image display apparatus according to claim 1 is configured to display an AR image superimposed on the real-world image captured by the mobile device.
- The display control unit of the image display device according to claim 7 is configured to display an AR image corresponding to at least one of the current position of the mobile device, an object included in the captured image, and the state of the mobile device.
- The display control unit of the image display device according to claim 1 is configured to display position information of the mobile device and of the user.
- The display control unit of the image display apparatus is further configured to display an image captured by an auto-tracker that shoots while tracking the mobile device.
- The image display device further includes a self-view image acquisition unit that acquires a self-view image as seen from the user's viewpoint.
- The display control unit is configured to switch between displaying the moving-body line-of-sight image captured by the mobile device and the self-view image.
- The display control unit of the image display device is configured to switch among images captured from a plurality of viewpoint positions on the mobile device according to the posture of the head.
- The display control unit of the image display device according to claim 1 is configured to correct shake contained in the moving image captured by the mobile device before displaying it.
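One common electronic approach to such shake correction (a sketch under stated assumptions, not necessarily the method used here) is to smooth the measured camera trajectory and treat the residual as jitter to cancel:

```python
import numpy as np

def residual_jitter(shifts, smooth_win=5):
    """Split a measured camera trajectory into intended motion and jitter.

    shifts : per-frame (x, y) displacements, e.g. from a motion sensor or
    frame matching. The moving-average trajectory is taken as the intended
    motion; the residual is the shake, which is cancelled by shifting each
    frame in the opposite direction before display.
    """
    shifts = np.asarray(shifts, dtype=float)
    kernel = np.ones(smooth_win) / smooth_win
    smooth = np.column_stack([
        np.convolve(shifts[:, i], kernel, mode="same")
        for i in range(shifts.shape[1])
    ])
    return shifts - smooth  # apply the negative of this to each frame
```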
- a posture detection step for detecting the posture of the user's head
- a display control step for controlling display of an image captured by the mobile device based on the posture of the head
- The technique described in claim 15 of the present application is an image display system comprising: a mobile device that captures images while moving; and an image display device that displays the captured image of the mobile device in accordance with the posture of the user's head.
- The term "system" here refers to a logical collection of a plurality of devices (or functional modules that realize specific functions), regardless of whether the devices or functional modules reside in a single housing.
- The mobile device according to claim 16 is configured to capture the whole-sky image while changing the inter-viewpoint distance between the cameras at the plurality of viewpoints.
- The mobile device is configured to extrapolate images outside the cameras' viewpoints from the images captured by the cameras at the plurality of viewpoints with a fixed inter-viewpoint distance.
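As a crude stand-in for such extrapolation, one might blend the two captured views linearly along the baseline. Real view synthesis would warp pixels by per-pixel disparity, so this is only an illustrative assumption:

```python
import numpy as np

def extrapolate_view(img_a, img_b, alpha):
    """Hypothetical linear extrapolation of a virtual viewpoint.

    img_a, img_b : float image arrays (values in [0, 1]) from two cameras
    a fixed baseline apart. alpha = 0 gives img_a, alpha = 1 gives img_b,
    and alpha > 1 extrapolates beyond img_b along the baseline. This is a
    crude stand-in for true view synthesis, which would warp pixels by
    their per-pixel disparity rather than blending intensities.
    """
    return np.clip((1.0 - alpha) * img_a + alpha * img_b, 0.0, 1.0)
```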
- The technology described in claim 19 of the present application is a computer program written in a computer-readable format that causes a computer to function as: a posture detection unit that detects the posture of the user's head; and a display control unit that controls display of an image captured by the mobile device based on the posture of the head.
- The computer program according to claim 19 of the present application defines a computer program written in a computer-readable format so as to realize predetermined processing on a computer. By installing it on a computer, cooperative operations are exhibited on the computer, and the same effects as those of the image display device according to claim 1 can be obtained.
- FIG. 1 is a diagram schematically illustrating a configuration of an image display system 100 according to an embodiment of the technology disclosed in this specification.
- FIG. 2 is a view showing a user wearing the head mounted display 110 viewed from the front.
- FIG. 3 is a diagram illustrating a state in which a user wearing the head mounted display 110 illustrated in FIG. 2 is viewed from above.
- FIG. 4 is a diagram illustrating an internal configuration example of the head mounted display 110.
- FIG. 5 is a diagram illustrating a coordinate system of the posture detected by the posture / position detection unit 404.
- FIG. 6 is a diagram illustrating an internal configuration example of the mobile device 120.
- FIG. 7 is a diagram schematically illustrating a functional configuration of the control unit 401 for displaying an image photographed on the mobile device 120 side on the head-mounted display 110.
- FIG. 8A is a diagram illustrating a state in which the camera unit 605 is mounted so that the camera base 606 can control the line of sight in each of the roll, tilt, and pan directions.
- FIG. 8B is a diagram illustrating a state in which the distance between the left-eye camera and the right-eye camera constituting the stereo camera is variable.
- FIG. 8C is a diagram showing a state in which the main camera 801 is mounted on the main body of the mobile device 120-2 of the automobile, and the sub camera 802 is mounted on the rear side thereof.
- FIG. 9 is a diagram illustrating a state in which the position and orientation of the display angle of view displayed on the display panel 409 are moved according to the movement of the user's head.
- FIG. 10 is a diagram illustrating an example of an operation sequence in which a wide-angle image, such as a whole-sky image captured on the mobile device 120 side, is displayed on the head-mounted display 110 side in accordance with the posture of the user's head.
- FIG. 11 is a diagram showing a modified example of an operation sequence in which a wide-angle image, such as a whole-sky image captured on the mobile device 120 side, is displayed on the head-mounted display 110 side so as to follow the posture of the user's head.
- FIG. 12A is a diagram for explaining a method for controlling the convergence position when displaying a three-dimensional captured image on the head-mounted display 110 side based on the speed information of the mobile device 120.
- FIG. 12B is a diagram for explaining a method for controlling the convergence position when displaying a three-dimensional captured image on the head-mounted display 110 side based on the speed information of the mobile device 120.
- FIG. 12C is a diagram for describing a method for controlling the convergence position when displaying a three-dimensional captured image on the head-mounted display 110 side based on the speed information of the mobile device 120.
- FIG. 13A is a diagram for explaining a method of capturing a three-dimensional image by changing the inter-viewpoint distance d in accordance with the zoom.
- FIG. 13B is a diagram for explaining a method of capturing a three-dimensional image by changing the inter-viewpoint distance d in accordance with the zoom.
- FIG. 14 is a diagram showing a state where the head is tilted upward and the camera unit 605 mounted on the mobile device 120-2 is also tilted upward.
- FIG. 15 is a diagram showing a state in which the tilt axis of the camera unit 605 mounted on the mobile device 120-2 is fixed at a position rotated by Δy above the tilt axis of the user's head.
- FIG. 16 is a diagram illustrating a state in which the coordinate system of the camera unit 605 is offset in each of pan, tilt, and roll directions with respect to the coordinate system of the user's head.
- FIG. 17 is a diagram illustrating a state in which user position information is displayed as a small screen of the captured image display screen of mobile device 120.
- FIG. 18 is a diagram illustrating a state in which user position information is displayed on a large screen and a captured image is displayed on a small screen.
- FIG. 19 is a diagram showing a display example of user position information.
- FIG. 20 is a diagram illustrating a display example of user position information.
- FIG. 21 is a diagram showing a state in which the back of the mobile device 120-1 flying over the sky is being tracked by the auto tracker 2100 equipped with the camera 2101.
- FIG. 22 is a diagram exemplifying an image obtained by photographing the mobile device 120-1 from the rear with the camera 2101 of the auto tracker 2100.
- FIG. 23 is a diagram exemplifying a vehicle line-of-sight image captured by the mobile device 120-2 of the automobile.
- FIG. 24 is a diagram illustrating a state in which the user wearing the head-mounted display 110 follows the moving mobile device 120-3 while traveling.
- FIG. 25 is a diagram exemplifying a self-view image, captured by the outer camera 413, of a user following the moving mobile device 120-3.
- FIG. 26 is a diagram illustrating an arrangement example of images taken at a plurality of viewpoint positions by the mobile device 120.
- FIG. 27 is a diagram illustrating a state in which the observed image causes trapezoidal distortion according to the vertical tilt of the user's head.
- FIG. 28 is a diagram showing a state in which a virtual obstacle 2801 and a prohibited area 2802 are AR-displayed with respect to a vehicle line-of-sight image (real world) photographed by the mobile device 120-2.
- FIG. 1 schematically shows a configuration of an image display system 100 according to an embodiment of the technology disclosed in this specification.
- The illustrated image display system 100 includes an image display device (head-mounted display) 110 worn by the user on the head or face, mobile devices 120-1, 120-2, 120-3, and so on, which are models of moving bodies such as an airplane (or a helicopter or other flying object), an automobile, or a ship, and a controller 130 for wirelessly maneuvering the mobile devices 120.
- Each of the mobile devices 120-1, 120-2, 120-3,... Is equipped with a wireless camera (not shown), and photographs a landscape while moving.
- the controller 130 may be a multi-function information terminal such as a smartphone, for example, and activates a steering application for the mobile device 120.
- the head-mounted display 110 and the mobile device 120 and the controller 130 and the mobile device 120 are wirelessly connected by, for example, a wireless network or infrared communication.
- The mobile device 120 is equipped with a wireless camera (not shown) mounted on a camera base (not shown) whose posture can be changed about the pan, tilt, and roll axes.
- This wireless camera can capture a whole-sky image, a half-sky image, or a panoramic image covering less than the full circumference.
- the wireless camera can perform wide-angle imaging using a fisheye lens.
- the mobile device 120 transmits an image captured by the wireless camera to the controller 130 or the head mounted display 110.
- Since captured images can be transferred directly between the mobile device 120 and the head-mounted display 110, it is not essential for the mobile device 120 to transfer captured images to the controller 130, nor is a wireless connection between the controller 130 and the head-mounted display 110 essential.
- In the following, a detailed description of the controller 130 is omitted on the assumption that data communication can be performed directly between the mobile device 120 and the head-mounted display 110.
- In the following description, it is assumed that the user wearing the head-mounted display 110 maneuvers the mobile device 120 with the controller 130 while viewing the captured image from the mobile device 120 on the head-mounted display 110; however, a person other than the user who enjoys the captured image may maneuver the mobile device 120 with the controller 130.
- FIG. 2 shows the user wearing the head mounted display 110 viewed from the front.
- The illustrated head-mounted display 110 has a structure similar in shape to glasses and is configured to directly cover the left and right eyes of the user wearing it.
- a display panel (not shown in FIG. 2) to be observed by the user is disposed at a position facing the left and right eyes inside the head mounted display 110 main body.
- the display panel is configured by a micro display such as an organic EL element or a liquid crystal display.
- An outer camera 413 for inputting a surrounding image is installed in the center of the front surface of the head-mounted display 110 in the shape of glasses.
- microphones 201 and 202 are installed near the left and right ends of the support, respectively. Because there are two microphones, only the voice localized at the center (the user's own voice) is recognized, separating it from ambient noise and other people's voices, and malfunctions caused by erroneous voice input can be prevented.
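- As an illustrative sketch only (not part of the original disclosure), selecting the center-localized voice from the two microphone signals could be approximated by a cross-correlation lag test: a source at the center arrives at both microphones nearly simultaneously, while off-center sources show a measurable inter-channel delay. The function name, sample rate, and the 0.2 ms tolerance are assumptions.

```python
import numpy as np

def is_center_localized(left, right, sample_rate, max_offset_s=0.0002):
    """Return True when the dominant sound source arrives at both
    microphones almost simultaneously, i.e. is localized at the center
    (the wearer's own voice); off-center sources show a larger lag."""
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)  # 0 samples = centered
    return abs(lag) / sample_rate <= max_offset_s
```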
- FIG. 3 shows a state in which the user wearing the head mounted display 110 shown in FIG. 2 is viewed from above.
- the illustrated head mounted display 110 has display panels for left and right eyes on the side facing the user's face.
- the display panel is configured by a micro display such as an organic EL element or a liquid crystal display. Display images on the left and right display panels are observed by the left and right eyes of the user as enlarged virtual images by passing through the respective virtual image optical units. Further, since there are individual differences in eye height and eye width for each user, it is necessary to align the left and right display systems with the eyes of the user who wears them. In the example shown in FIG. 3, an eye width adjustment mechanism is provided between the display panel for the right eye and the display panel for the left eye.
- FIG. 4 shows an internal configuration example of the head mounted display 110. Hereinafter, each part will be described.
- the control unit 401 includes a ROM (Read Only Memory) 401A and a RAM (Random Access Memory) 401B.
- the ROM 401A stores program codes executed by the control unit 401 and various data.
- the control unit 401 executes a program loaded into the RAM 401B, thereby starting image display control and controlling the overall operation of the head-mounted display 110. Programs and data stored in the ROM 401A include an image display control program, communication processing programs for external devices such as the mobile device 120 and the controller 130, and identification information unique to the display 110.
- the image display control program for example, performs display control of a captured image received from the mobile device 120. Details of this point will be described later.
- the input operation unit 402 includes one or more operation elements, such as keys, buttons, and switches, with which the user performs input operations; it receives user instructions via these elements and outputs them to the control unit 401.
- the input operation unit 402 similarly accepts a user instruction including a remote control command received by the remote control reception unit 403 and outputs it to the control unit 401.
- the posture / position detection unit 404 is a unit that detects the posture of the head of the user wearing the head mount display 110.
- the posture / position detection unit 404 is composed of a gyro sensor, an acceleration sensor, a GPS (Global Positioning System) sensor, a geomagnetic sensor, or a combination of two or more of these sensors, selected in consideration of the advantages and disadvantages of each.
- FIG. 5 shows the coordinate system of the posture detected by the posture / position detection unit 404.
- the depth direction of the display image is the z axis
- the horizontal direction is the y axis
- the vertical direction is the x axis
- the origin position of the xyz axes is the viewpoint position. Therefore, the roll θz corresponds to the movement of the user's head around the z axis,
- the tilt θy corresponds to the movement of the user's head around the y axis,
- the pan θx corresponds to the movement of the user's head around the x axis.
- When the posture / position detection unit 404 detects the movement (θz, θy, θx) of the user's head in each of the roll, tilt, and pan directions, as well as the parallel movement of the head, it outputs them to the control unit 401. As will be described later, when the control unit 401 displays a captured image or the like from the mobile device 120 on the screen of the display panel 409, it moves the display angle of view so as to cancel the head movement detected by the posture / position detection unit 404, thereby presenting an image that follows the movement of the user's head.
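- The head-motion-canceling movement of the display angle of view described above can be sketched as follows. This is an illustrative approximation, not part of the disclosure; the dictionary layout, the full-circle wrap-around for pan, and the ±90-degree clamp for tilt are assumptions.

```python
def update_view_angle(view, head_delta):
    """Move the display angle of view by the detected head motion so
    that the head movement is cancelled on screen. Angles are in
    degrees: pan wraps around the full sky, tilt is clamped at the
    poles, and roll is accumulated as-is."""
    d_pan, d_tilt, d_roll = head_delta
    return {
        "pan": (view["pan"] + d_pan) % 360.0,
        "tilt": max(-90.0, min(90.0, view["tilt"] + d_tilt)),
        "roll": view["roll"] + d_roll,
    }
```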
- the state detection unit 411 acquires state information regarding the state of the user wearing the head-mounted display 110 and outputs the state information to the control unit 401.
- the state information includes, for example, the user's work state (whether the head-mounted display 110 is worn), the user's action state (movement state such as standing still, walking, or running; opening/closing state of the eyelids; line-of-sight direction), mental state (degree of excitement, such as whether the user is immersed in or concentrating on the display image; degree of arousal; feelings and emotions), and physiological state.
- to obtain such state information from the user, the state detection unit 411 may include a wearing sensor such as a mechanical switch, as well as various state sensors such as a gyro sensor, an acceleration sensor, a speed sensor, a pressure sensor, a body-temperature sensor, a perspiration sensor, a myoelectric potential sensor, an electrooculogram sensor, and an electroencephalogram sensor.
- the operational feeling feedback (FB) unit 412 includes a vibration generator and the like, and gives operational feeling feedback to the user wearing the head mounted display 110 by tactile sensation or vibration.
- the outer camera 413 is disposed in the approximate center of the front surface of the head-mounted display 110 having a shape of glasses or a hat, for example (see FIG. 2), and can capture a surrounding image.
- the outer camera 413 can also capture an image along the user's own line of sight.
- the relative speed of the mobile device 120 can be measured using the outer camera 413.
- the communication unit 405 performs communication processing with external devices such as the mobile device 120 and the controller 130, and modulation / demodulation of communication signals and encoding / decoding processing.
- the communication unit 405 receives an image captured by the wireless camera from the mobile device 120.
- the image or other received data received and demodulated and decoded by the communication unit 405 is supplied to the control unit 401.
- the control unit 401 sends transmission data to an external device from the communication unit 405.
- the configuration of the communication unit 405 is arbitrary.
- the communication unit 405 can be configured according to a communication standard used for transmission / reception operations with an external device serving as a communication partner.
- the communication standard may be either wired or wireless.
- Communication standards mentioned here include MHL (Mobile High-definition Link), USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), Wi-Fi (registered trademark), Bluetooth (registered trademark) communication, and infrared communication.
- the storage unit 406 is a large-capacity storage device such as an SSD (Solid State Drive).
- the storage unit 406 stores data such as application programs executed by the control unit 401 and the all-sky images, fisheye images, and panoramic images captured by the mobile device 120.
- the image processing unit 407 further performs signal processing such as image quality correction on the image signal output from the control unit 401 and converts the image signal to a resolution that matches the screen of the display panel 409.
- the display driving unit 408 sequentially selects the pixels of the display panel 409 for each row and performs line sequential scanning, and supplies a pixel signal based on the image signal that has been subjected to signal processing.
- the display panel 409 is configured by a micro display such as an organic EL (Electro-Luminescence) element or a liquid crystal display.
- the virtual image optical unit 410 enlarges and projects the display image on the display panel 409, and allows the user to observe it as an enlarged virtual image.
- for example, the virtual image optical unit 410 magnifies the display image of the display panel 409 by 1000 times or more and forms on the user's retina a virtual image of about 750 inches at 20 meters ahead of the pupil, which corresponds to a viewing angle of 45.09 degrees.
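- The 45.09-degree figure above can be reproduced by a simple calculation, assuming the 750-inch diagonal refers to a 16:9 screen and the angle is measured horizontally; this is an illustrative check, not part of the disclosure.

```python
import math

def virtual_image_angle_deg(diagonal_inches, distance_m, aspect=(16, 9)):
    """Horizontal viewing angle of a virtual screen with the given
    diagonal placed at the given distance (16:9 aspect assumed)."""
    diag_m = diagonal_inches * 0.0254          # inches -> metres
    w, h = aspect
    width_m = diag_m * w / math.hypot(w, h)    # horizontal extent
    return math.degrees(2.0 * math.atan(width_m / 2.0 / distance_m))
```

For a 750-inch 16:9 virtual screen at 20 m this yields approximately 45.09 degrees, matching the value stated in the text.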
- FIG. 6 shows an internal configuration example of the mobile device 120.
- the mobile device 120 may be any of various types of moving-body models, such as an airplane, helicopter, automobile, or ship. Basically, only the moving mechanism unit 603 differs by type; the other functional components are largely the same.
- the storage unit 602 is realized by a memory device such as a RAM or a ROM, and a large capacity storage device such as a hard disk drive or an SSD.
- the storage unit 602 is used to store a program executed by a CPU (Central Processing Unit) 601 and a photographed image by a camera.
- the CPU 601 controls each unit in the mobile device 120 by executing a program stored in the storage unit 602.
- the moving mechanism unit 603 includes a mechanism corresponding to the type of moving body, such as an airplane, helicopter, automobile, or ship; the mechanism operates according to a movement instruction from the CPU 601 and moves the main body of the mobile device 120.
- the position / orientation / velocity detection unit 604 includes a gyro sensor, an acceleration sensor, a GPS sensor, a geomagnetic sensor, and the like, and obtains information on the current position and posture of the main body of the mobile device 120 and on the movement speed produced by the moving mechanism unit 603. For example, in the case of the automobile mobile device 120-2, the position / orientation / velocity detection unit 604 can calculate the moving speed from the number of rotations of the motor that rotates the wheels, the gear ratio of the speed reducer, and the diameter of the tires. If these are measured before shipment and the data is stored in the storage unit 602, the speed can be derived at the time of use simply by measuring the rotation speed of the motor.
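- The speed derivation from the motor rotation count, gear ratio, and tire diameter mentioned above can be sketched as follows (illustrative only; the function name and SI units are assumptions):

```python
import math

def vehicle_speed_mps(motor_rpm, gear_ratio, tire_diameter_m):
    """Moving speed derived from the motor rotation count: the wheel
    turns motor_rpm / gear_ratio times per minute, and each wheel turn
    covers one tire circumference (pi * diameter)."""
    wheel_rpm = motor_rpm / gear_ratio
    return wheel_rpm * math.pi * tire_diameter_m / 60.0
```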
- the camera unit 605 is composed of, for example, a stereo camera and can take a three-dimensional image.
- the camera unit 605 is mounted on the mobile device 120 via the camera base 606.
- the camera unit 605 takes as its basic posture a posture in which its line of sight is directed in the front direction of the mobile device 120 (or in the traveling direction when moved by the moving mechanism unit 603), and mainly shoots first-person viewpoint (FPV) images.
- the camera base 606 can operate in each of the roll, tilt, and pan directions and changes the line of sight of the camera unit 605 in accordance with a line-of-sight change instruction from the CPU 601, whereby the camera unit 605 can shoot panoramic images, wide-angle images, and all-sky images.
- FIG. 8A shows a state in which the camera unit 605 is mounted so that the camera base 606 can control the line of sight in each of the roll, tilt, and pan directions.
- the camera unit 605 can adjust the distance d between the viewpoints of the two cameras constituting the stereo camera.
- FIG. 8B shows how the distance d between the viewpoints of the left-eye camera and the right-eye camera constituting the stereo camera is variable.
- the left-eye camera and the right-eye camera are not converged; their lines of sight are substantially parallel. More preferably, the left-eye camera and the right-eye camera shoot with pan-focus, that is, with a deep depth of field.
- the camera unit 605 may also be composed of a plurality of cameras, such as a sub camera that shoots backward from the main body of the mobile device 120 in addition to the main camera that shoots first-person viewpoint images.
- FIG. 8C shows a state in which the main camera 801 is mounted on the main body of the mobile device 120-2 of the automobile, and the sub camera 802 is mounted behind the main camera 801.
- the communication unit 607 performs communication processing with external devices such as the head-mounted display 110 and the controller 130, as well as modulation/demodulation and encoding/decoding of communication signals. For example, when the communication unit 607 receives a movement command from the controller 130, the CPU 601 instructs the moving mechanism unit 603 to move. The communication unit 607 also transmits the image captured by the camera unit 605 and the position and posture information of the main body of the mobile device 120 detected by the position / orientation / velocity detection unit 604 to the head-mounted display 110 and the controller 130.
- an image captured by the camera unit 605 mounted on the mobile device 120 is transferred to the head-mounted display 110 via the controller 130 or directly, so that the user can enjoy the captured image from the mobile device 120 on the head-mounted display 110.
- a wide-angle image such as a panoramic image or a whole sky image captured by the camera unit 605 mounted on the mobile device 120 is reproduced and displayed on the head-mounted display 110.
- on the head-mounted display 110, the control unit 401 presents an image that follows the movement of the user's head by moving the display angle of view so as to cancel the head movement detected by the posture / position detection unit 404.
- a composite image in which an AR (Augmented Reality) image, which is virtual image information, is superimposed on a real image within the display angle of view is displayed as necessary.
- FIG. 7 schematically shows a functional configuration of the control unit 401 for displaying an image taken on the mobile device 120 side on the head mounted display 110.
- the illustrated functional configuration is realized, for example, in a form in which a predetermined application program is executed in the control unit 401.
- the display angle-of-view control unit 701 moves the position and posture of the display angle of view displayed on the display panel 409 according to the movement of the user's head obtained through the posture / position detection unit 404 (see FIG. 5).
- the determined display angle of view is output to the image cutout unit 702.
- the image cutout unit 702 cuts out the image at the display angle of view determined by the display angle-of-view control unit 701 from the captured image of the mobile device 120 received by the communication unit 405, and outputs it to the image composition unit 703.
- the image composition unit 703 generates, as necessary, a composite image in which an AR image is superimposed on the real image within the display angle of view, and outputs it to the subsequent image processing unit 407. For example, when the current position of the user is acquired from the posture / position detection unit 404, or when the current position of the mobile device 120 is acquired, AR images such as guidance or obstacles matching the position information are composited (described later). When operational-feeling feedback corresponding to the AR image, such as a collision with an obstacle, is to be given to the user, the image composition unit 703 instructs the operational-feeling feedback unit 412 to output the feedback.
- the image composition unit 703 corrects a distortion or the like that occurs when an image cut out at the display angle of view is displayed on the display panel 409.
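- The cutout and composition steps above can be sketched as follows for an equirectangular wide-angle image; this is an illustrative approximation, not the disclosed implementation, and the array layout, horizontal wrap-around, and alpha-blending choices are assumptions.

```python
import numpy as np

def cut_out_view(wide_image, center_xy, view_hw):
    """Cut the display-angle-of-view region out of a wide-angle
    (equirectangular) image: wrap horizontally around the full circle,
    clamp vertically at the top and bottom of the image."""
    h, w = view_hw
    cx, cy = center_xy
    H, W = wide_image.shape[:2]
    ys = np.clip(np.arange(cy - h // 2, cy + h // 2), 0, H - 1)
    xs = np.arange(cx - w // 2, cx + w // 2) % W  # wrap around the sphere
    return wide_image[np.ix_(ys, xs)]

def compose_ar(view, ar_overlay, alpha=0.5):
    """Superimpose an AR overlay onto the real image within the view."""
    return (1.0 - alpha) * view + alpha * ar_overlay
```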
- FIG. 9 illustrates a state in which the position and posture of the display field angle displayed on the display panel 409 are moved according to the movement of the user's head.
- as the user moves his or her head in each direction, the display angle-of-view control unit 701 moves the display angle of view 910 in the corresponding arrow direction indicated by reference numerals 901 to 906 in the figure.
- each time the display angle of view 910 moves, the image cutout unit 702 cuts out the image at the new display angle of view from the wide-angle image, and the image displayed on the display panel 409 transitions accordingly.
- FIG. 10 shows an example of an operation sequence in which a wide-angle image, such as an all-sky image captured on the mobile device 120 side, is displayed on the head-mounted display 110 side following the posture of the user's head.
- direct data communication is performed between the head mounted display 110 and the mobile device 120, but other devices such as the controller 130 may be interposed.
- the head-mounted display 110 transmits a captured-image request to the mobile device 120, which is moving under remote operation from the controller 130 (SEQ1001).
- the mobile device 120 executes the image capturing process with the camera unit 605 while driving the camera base 606 (SEQ1002). When the captured images have been processed to generate a wide-angle image such as an all-sky image, it is transmitted to the head-mounted display 110 (SEQ1003). Alternatively, rather than shooting in response to each request from the head-mounted display 110, the mobile device 120 may perform the shooting process at all times and transmit wide-angle images such as all-sky images to the head-mounted display 110.
- the mobile device 120 may also transmit information on the position, posture, and speed of its main body measured by the position / orientation / velocity detection unit 604, either at the time of image transmission or at another timing.
- on the head-mounted display 110 side, the posture / position detection unit 404 detects the movement of the user's head, that is, the line-of-sight direction (SEQ1004). The position of the display angle of view cut out from the received captured image is then controlled according to the detected line-of-sight direction, and the cut-out image is displayed on the display panel 409 (SEQ1005).
- FIG. 11 shows a modified example of the operation sequence in which a wide-angle image, such as an all-sky image captured on the mobile device 120 side, is displayed on the head-mounted display 110 side following the posture of the user's head.
- direct data communication is performed between the head mounted display 110 and the mobile device 120, but other devices such as the controller 130 may be interposed.
- the posture / position detection unit 404 monitors the movement of the user's head, that is, the line-of-sight direction (SEQ1101).
- the head-mounted display 110 transmits a captured-image request to the mobile device 120, which is moving under remote operation from the controller 130 (SEQ1102). At that time, the user's line-of-sight information is also transmitted.
- the mobile device 120 executes the image capturing process with the camera unit 605 while driving the camera base 606 (SEQ1103). When the captured images have been processed to generate a wide-angle image such as an all-sky image, an image is cut out at a display angle of view corresponding to the user's line of sight (SEQ1104) and transmitted to the head-mounted display 110 (SEQ1105).
- the mobile device 120 may also transmit information on the position, posture, and speed of the main body of the mobile device 120 measured by the position / orientation / velocity detection unit 604 at the time of image transmission.
- the moving speed can be calculated from the number of rotations of the motor, the gear ratio of the speed reducer, and the diameter of the tire.
- the relative speed of the mobile device 120 can be measured using the outer camera 413 even from the head mounted display 110 side.
- the head mounted display 110 displays the received image on the display panel 409 (SEQ1106).
- the camera unit 605 includes a stereo camera. The left-eye camera and the right-eye camera are not converged; their lines of sight are substantially parallel, and shooting is performed with pan-focus.
- the head-mounted display 110 may adjust the convergence position based on the speed information of the mobile device 120, for example, as part of the processing of the image composition unit 703.
- as the speed of the mobile device 120 increases, the overlap between the left and right images I_L and I_R is reduced, and the convergence position, that is, the point where the lines of sight of the left and right eyes intersect, moves farther away, so that the user can comfortably observe scenery seen far in the distance (see FIG. 12B).
- in addition, focus adjustment that blurs regions other than the distance at which the screen is localized enhances the visual effect of looking far away and increases the sense of presence of riding on a fast-moving object.
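- A minimal sketch of linking the left/right image separation to the speed of the mobile device might look like this. It is illustrative only; the maximum speed, pixel-shift range, and linear mapping are assumptions not stated in the text.

```python
def convergence_shift_px(speed_mps, max_speed_mps=20.0, max_shift_px=30):
    """Horizontal separation applied between the left and right images:
    as the mobile device speeds up, the overlap is reduced so that the
    convergence point moves farther away. Returns a pixel offset."""
    ratio = min(max(speed_mps / max_speed_mps, 0.0), 1.0)
    return int(round(ratio * max_shift_px))
```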
- the image processing for widening the viewpoint may be performed on either the mobile device 120 or the head-mounted display 110.
- the stereo camera is configured so that the distance d between the viewpoints of the left-eye camera and the right-eye camera can be varied within a range of, for example, 50 mm to 100 mm (see FIG. 13A). The distance between viewpoints is lengthened in accordance with zooming in and, conversely, shortened when zooming out. This can be achieved by installing the stereo camera so that the distance between the viewpoints changes in conjunction with the zoom.
- FIG. 13B shows a state in which the distance d between viewpoints of the left-eye camera and the right-eye camera is shortened to 50 mm when zoomed out.
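- The zoom-linked inter-viewpoint distance could be sketched as a linear mapping over the 50-100 mm range mentioned above; the zoom range itself (1x to 5x) is an assumption for illustration.

```python
def stereo_baseline_mm(zoom, zoom_min=1.0, zoom_max=5.0,
                       d_min=50.0, d_max=100.0):
    """Inter-viewpoint distance d of the stereo camera: lengthened
    linearly when zooming in, shortened back to d_min (50 mm) when
    zoomed out, within the 50-100 mm range given in the text."""
    z = min(max(zoom, zoom_min), zoom_max)
    t = (z - zoom_min) / (zoom_max - zoom_min)
    return d_min + t * (d_max - d_min)
```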
- the camera unit 605 may be directly controlled from the head mounted display 110 or the controller 130 to browse the 3D image live.
- a plurality of captured images with different viewpoint intervals are recorded, and the viewpoint intervals of the images to be reproduced are switched and controlled in conjunction with a zoom operation during browsing.
- an image between viewpoints may be generated by image processing in conjunction with zooming during browsing.
- alternatively, viewpoint images with switched viewpoint intervals may be synthesized by image processing, and the display may be switched to the synthesized images in conjunction with zooming, so that smooth zooming can be performed.
- by switching the distance between the viewpoints of the images displayed to the left and right eyes, an effect similar to moving the viewpoint can be obtained. However, the three-dimensional motion parallax that would result from an actual viewpoint movement cannot be obtained.
- the tilt axis of the camera unit 605 mounted on the mobile device 120-2 can be fixed at a position offset upward by Δy relative to the tilt axis of the user's head.
- the user may teach the Δy offset of the tilt axis of the camera unit 605 through the input operation unit 402 while wearing the head-mounted display 110 and looking up at the desired angle Δy.
- thereafter, the posture of the camera unit 605 interlocked with the user's head remains fixed at a position offset by Δy in the upward tilt direction.
- the Δy offset instruction for the tilt axis of the camera unit 605 may be transmitted to the mobile device 120 directly from the head-mounted display 110 or via the controller 130.
- a coordinate system offset process is performed on the camera base 606. Then, in a state where the offset is set on the tilt axis, a captured image that follows the movement of the user's head is cut out and displayed.
- the same processing can be realized by internal processing of the head-mounted display 110.
- in response to the user's teaching of the Δy offset of the tilt axis, the display angle-of-view control unit 701 may set the display angle of view offset upward by Δy from the tilt axis of the user's line of sight detected by the posture / position detection unit 404.
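- On the head-mounted display 110 side, applying the taught offset can be sketched as a simple addition clamped to the valid tilt range; the function name and the ±90-degree clamp are illustrative assumptions.

```python
def display_tilt_deg(detected_head_tilt, taught_offset):
    """Display angle-of-view tilt: the tilt detected by the posture /
    position detection unit plus the taught offset, clamped to the
    +/-90 degree range of valid tilt angles."""
    return max(-90.0, min(90.0, detected_head_tilt + taught_offset))
```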
- if the image captured by the camera unit 605 while the mobile device is spinning were displayed on the head-mounted display 110 as it is, the displayed image would change rapidly regardless of the movement of the user's head.
- to avoid this, the coordinate system of the camera unit 605 at a certain time, for example immediately after the start of spinning, may be taken as a reference, and the display switching processing that follows the movement of the user's line of sight may be performed with respect to that reference coordinate system.
- as described above, the head-mounted display 110 includes an operational-feeling feedback (FB) unit 412 that gives the user feedback by tactile sensation or vibration.
- since the head-mounted display 110 is a device worn by the user, tactile sensation and vibration can be delivered to the user directly, which makes this form of feedback effective.
- the mobile device 120 is equipped with an acceleration sensor as the position / orientation / velocity detection unit 604, and can detect a shake or an impact received by the airframe during flight, travel, or navigation.
- the head-mounted display 110 constantly monitors the detection value of the acceleration sensor of the mobile device 120 while displaying the image captured on the mobile device 120 side, and when a shake or impact is detected, instructs the operational-feeling feedback unit 412 to output feedback corresponding to the impact.
- likewise, when the mobile device 120 encounters a virtual impact, such as a collision with a virtual obstacle shown as an AR image, the operational-feeling feedback unit 412 is instructed to output feedback corresponding to that virtual impact.
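- Mapping the acceleration-sensor reading to a feedback strength might be sketched as follows; the threshold and full-scale values are illustrative assumptions, not values from the disclosure.

```python
def feedback_strength(accel_mps2, threshold=2.0, full_scale=20.0):
    """Vibration strength in [0, 1] derived from the accelerometer
    value reported by the mobile device: readings below the threshold
    (small shakes) produce no feedback, and larger impacts scale
    linearly up to full strength."""
    a = abs(accel_mps2)
    if a <= threshold:
        return 0.0
    return min((a - threshold) / (full_scale - threshold), 1.0)
```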
- AR image display: in conventional FPV, the image captured by a camera mounted on a radio-controlled model is displayed as it is.
- the head-mounted display 110 superimposes and displays an AR image that does not exist when displaying an image captured on the mobile device 120 side.
- for example, an AR image such as a pylon is superimposed on the image captured by the flying mobile device 120-1,
- an AR image such as a circuit is superimposed for the land-moving mobile device 120-2 such as an automobile (racing car), and
- an AR image such as a buoy is superimposed on the image captured by a watercraft mobile device.
- when the location of such an AR object is registered in advance, the head-mounted display 110 constantly monitors the current location of the mobile device 120 while displaying the captured image, and activates the AR image display process when the mobile device reaches the registered location (that is, when that location enters the captured image).
- objects displayed as AR images include indicators for the movement of the mobile device 120 such as pylons and buoys, virtual obstacles that hinder the movement of the mobile device 120, and areas that are dangerous for the mobile device 120 to pass through, among others.
- FIG. 28 shows a state in which a virtual obstacle 2801 and a prohibited area 2802 are AR-displayed with respect to a vehicle line-of-sight image (real world) photographed by the mobile device 120-2.
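- The check for whether the mobile device has reached a registered location could be sketched with a great-circle (haversine) distance test against the registered coordinates; the function name and trigger radius are assumptions.

```python
import math

def reached_registered_spot(cur_lat, cur_lon, spot_lat, spot_lon,
                            radius_m=100.0):
    """Haversine distance check used to decide whether the mobile
    device's current GPS location has entered the registered spot."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(cur_lat), math.radians(spot_lat)
    dp = math.radians(spot_lat - cur_lat)
    dl = math.radians(spot_lon - cur_lon)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2.0 * R * math.asin(math.sqrt(a)) <= radius_m
```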
- the head-mounted display 110 recognizes the captured image sent from the mobile device 120, and when it finds the corresponding object, starts the AR image display process.
- target objects include products and people that users are looking for, animals such as pets, plants, buildings, and the like.
- the head-mounted display 110 monitors the state in which the mobile device 120 is currently placed, and activates the AR image display process when the mobile device enters a state registered in advance.
- an area where radio waves do not reach is set as a “prohibited area”, and guidance for preventing the mobile device 120 from entering the prohibited area is displayed as an AR image.
- an actual guide such as a signboard indicating the prohibited area may be installed.
- the AR image display may also be changed according to each user's characteristics (age, nationality, gender, personality, etc.) and skill in operating the mobile device 120. For example, the AR image required is assumed to differ depending on whether the user is an adult or a child. In addition, it is preferable to display an AR image warning of danger earlier for a beginner than for a highly skilled user.
- the user can observe the captured image on the head mounted display 110 while operating the controller 130 to move the mobile device 120. Even if the user himself / herself does not move, the user can enjoy the scenery of the remote place where the mobile device 120 has moved.
- the head mounted display 110 may display an AR image that displays the relative positional relationship between the user and the mobile device 120 in the captured image sent from the mobile device 120.
- the head mounted display 110 can obtain a navigation function in addition to enjoying the FPV by displaying the position information of the user.
- the image showing the relative position can be generated based on, for example, the user's position information detected by the posture / position detection unit 404 in the head-mounted display 110 and the position information of the mobile device 120 detected by the position / orientation / velocity detection unit 604 of the mobile device 120.
- as one form of displaying the user position information, it is conceivable, for example, to display it as a small sub-screen within the display screen of the image captured by the mobile device 120, as shown in FIG. 17.
- the small screen is, for example, a map image, and icons representing the current position of the mobile device 120 and the current position of the user are displayed in the image.
- as shown in FIG. 18, in accordance with a user operation through the input operation unit 402, the display can be switched so that the user position information appears on the large screen and the captured image on the small screen.
- the moving direction of the mobile device 120 and the user's line-of-sight direction may also be displayed.
- the moving direction of the mobile device 120 can be detected by the position / orientation / velocity detection unit 604.
- the user's line-of-sight direction can be detected by the posture / position detection unit 404 or the state detection unit 411.
- by arranging the user and the mobile device 120 on a map screen, their absolute positions can be indicated; alternatively, without using a map screen (for example, on a plain background), only the relative position of the user and the mobile device 120 may be shown.
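- For the relative-position display, the bearing of the mobile device as seen from the user could be derived, for example, from planar coordinates; this sketch and its conventions (east/north metres, 0 degrees meaning straight ahead of the user) are assumptions for illustration.

```python
import math

def relative_bearing_deg(user_en, device_en, user_heading_deg):
    """Bearing of the mobile device relative to the user's facing
    direction, from planar positions given in metres (east, north):
    0 means straight ahead, 90 means to the user's right."""
    de = device_en[0] - user_en[0]
    dn = device_en[1] - user_en[1]
    absolute = math.degrees(math.atan2(de, dn)) % 360.0  # 0 deg = north
    return (absolute - user_heading_deg) % 360.0
```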
- the absolute position information may be acquired by alternative means such as underwater acoustic communication.
- an auto tracker that tracks the rear (or side) of the mobile device 120 may be provided, and an image captured by the auto tracker may be displayed on the head mounted display 110.
- FIG. 21 shows a state in which an auto tracker 2100 equipped with a camera 2101 is tracking the rear of the mobile device 120-1 and the mobile device 120-2 flying over the sky. Further, FIG. 22 illustrates an image obtained by photographing the mobile device 120-1 from the rear with the camera 2101 of the auto tracker 2100.
- The outer camera 413 is arranged at approximately the center of the front surface of the main body of the head mounted display 110, which has, for example, a glasses-like or hat-like shape (see FIG. 2), and can capture an image of the surroundings. The outer camera 413 can thus capture a self-view image, that is, an image seen from the user's own line of sight.
- The screen of the display panel 409 may be switched between the captured image from the mobile device 120 and the captured image from the outer camera 413 in accordance with a user operation via the input operation unit 402.
- FIG. 23 exemplifies a vehicle line-of-sight image captured by the automobile mobile device 120-2.
- FIG. 24 illustrates an example in which the user wearing the head-mounted display 110 follows the moving mobile device 120-3 while traveling.
- FIG. 25 illustrates the self-view image, captured by the outer camera 413, of the user following the mobile device 120-3.
- The camera unit 605 mounted on the mobile device 120 basically consists of at least one camera that photographs from the operator's first-person viewpoint, but it may further be equipped with a plurality of cameras that photograph from other viewpoint positions.
- Multiple cameras may thus be installed: a camera that shoots the forward line of sight in the direction of travel, cameras that shoot the landscape reflected in the left and right side mirrors, a camera that shoots the landscape reflected in the room mirror, and a camera that shoots the view obtained when looking into the instruments.
- Alternatively, a camera base 606 that moves the viewpoint position of the camera may be provided.
- The images captured by the cameras at the respective viewpoint positions are arranged as shown in the figure: the vehicle line-of-sight image in the center, the left and right side-mirror images on the left and right, the room-mirror image above, and the instrument image below.
- In this case, the display angle of view does not simply follow the movement of the user's head; instead, the image on the display panel 409 is switched to the left side-mirror image when the head turns to the left, the right side-mirror image when the head turns to the right, the room-mirror image when the head faces up, and the instrument image when the head faces down. When the user turns the head back to the front, the original vehicle line-of-sight image is restored.
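The switching rule above can be sketched as follows; the threshold value and the view names are illustrative assumptions only.

```python
def select_view(pan_deg, tilt_deg, threshold_deg=30.0):
    # Map the detected head posture to one of the camera images:
    # side mirrors for left/right turns, room mirror for looking up,
    # instruments for looking down, and the forward vehicle
    # line-of-sight image otherwise.
    if pan_deg <= -threshold_deg:
        return "left_side_mirror"
    if pan_deg >= threshold_deg:
        return "right_side_mirror"
    if tilt_deg >= threshold_deg:
        return "room_mirror"
    if tilt_deg <= -threshold_deg:
        return "instruments"
    return "front"
```

Turning the head back within the threshold restores the forward view, matching the behavior described above.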
- When the user tilts the head while observing a wide-angle captured image, such as an all-sky image, at a narrower display angle of view, the display field angle control unit 701 moves the display angle of view so as to follow the head, and the image cutout unit 702 cuts out the display image at the display angle of view after the movement (see FIG. 9).
- When the user tilts the head upward and the display angle of view moves upward, the distance from the projection plane increases toward the upper end of the display angle of view, so the display image becomes a trapezoid whose upper edge is shortened with respect to the display frame of the display panel 409. Similarly, when the user tilts the head downward, the distance from the projection plane increases toward the lower end, so the display image becomes a trapezoid whose lower edge is shortened (see FIG. 27). A trapezoidal image gives the user a feeling of unnaturalness. The image composition unit 703 therefore performs trapezoidal (keystone) distortion correction so that the image observed by the user is always rectangular.
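One way to picture the trapezoidal correction is as a per-scanline horizontal rescaling that stretches the shortened edge back out to the display frame. The following is a sketch only; the linear shrink model, the `strength` constant, and the 45-degree normalization are assumptions, not the specification's method.

```python
def keystone_scale(row, rows, tilt_deg, strength=0.25):
    # t runs from 0.0 at the top scanline to 1.0 at the bottom one.
    t = row / (rows - 1)
    # Fraction by which the far edge of the trapezoid has shrunk,
    # growing with the head-tilt angle (normalized here by 45 deg).
    shrink = strength * abs(tilt_deg) / 45.0
    if tilt_deg >= 0:     # head tilted up: upper edge is shortened
        return 1.0 / (1.0 - shrink * (1.0 - t))
    else:                 # head tilted down: lower edge is shortened
        return 1.0 / (1.0 - shrink * t)
```

With no tilt the scale is 1.0 everywhere; with an upward tilt the top scanlines are stretched the most, restoring a rectangular image.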
- In the case of the automobile mobile device 120-2, the captured image pitches or rolls owing to unevenness of the road surface on which the vehicle travels. In the case of a flying mobile device 120-1 such as an airplane or helicopter, the captured image shakes when the craft is buffeted by air currents. In the case of a watercraft mobile device 120-3 such as a yacht, the captured image shakes owing to currents and waves during navigation.
- Such shaking may be canceled on the head mounted display 110 side before the image is displayed; in that case, the head mounted display 110 side starts the image shake correction process.
- The display field angle control unit 701 receives the detection value of the acceleration sensor on the mobile device 120 side and, in addition to using the posture information of the user's head, determines the position and orientation of the display angle of view so as to cancel the shaking amount of the mobile device 120, and outputs them to the image cutout unit 702.
- Alternatively, the image cutout unit 702 receives the detection value of the acceleration sensor on the mobile device 120 side, corrects the position and orientation of the display angle of view, which was determined by the display field angle control unit 701 based on the posture information of the user's head, so as to cancel the shaking amount of the mobile device 120, and then cuts out the image of that display angle of view from the captured image.
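The combination of head posture and shake cancellation might be sketched as below; representing both quantities as pan/tilt angles and directly subtracting an already-integrated shake angle are simplifying assumptions.

```python
def corrected_display_angle(head_pan_deg, head_tilt_deg,
                            shake_pan_deg, shake_tilt_deg):
    # The display angle of view first follows the user's head posture;
    # the shake of the mobile device, estimated from its acceleration
    # sensor, is then subtracted so that the cut-out region stays
    # steady in the scene despite the device's pitching and rolling.
    return (head_pan_deg - shake_pan_deg,
            head_tilt_deg - shake_tilt_deg)
```

For example, a 2-degree pan shake and a -1-degree tilt shake shift the cut-out window in the opposite direction, holding the displayed scene still.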
- The shake correction process can also be performed by playing the moving image in slow motion or by thinning out frames.
- When the moving image is played in slow motion, slow motion may be applied only where there is movement within the frame.
- A method of suppressing perceived shaking by reducing the screen size only during the period in which shaking of the mobile device 120 is detected is also conceivable.
- As described above, the virtual image optical unit 410 forms an enlarged virtual image of the image displayed on the display panel 409; the screen size can therefore be reduced either by adjusting the enlargement ratio in the virtual image optical unit 410 or by reducing the display size on the display panel 409.
- Of course, the mobile device 120 side may instead be provided with a shake correction function in the optical system or the image processing of the camera unit 605.
- The technology disclosed in the present specification can also be configured as follows.
- (1) An image display device comprising: a display unit attached to the user's head; a posture detection unit that detects the posture of the head; and a display control unit that controls, based on the posture of the head, display on the display unit of an image captured by a mobile device.
- (2) The image display device according to (1), wherein the display control unit cuts out, from the wide-angle image captured by the mobile device, a region corresponding to the posture of the head and displays it on the display unit.
- (3) The image display device according to (1), wherein the display control unit, when displaying a plurality of viewpoint images photographed at a plurality of viewpoints by a pan-focus parallel method, adjusts the convergence position of the viewpoint images based on the moving speed of the moving body.
- (4) The image display device according to (1), wherein the display control unit causes the display unit to display an image at a viewpoint interval corresponding to a zoom operation.
- (5) The image display device according to (1), wherein the line-of-sight direction of the camera unit of the mobile device is offset, with respect to the posture of the head, in at least one of the pan, tilt, and roll directions.
- (6) The image display device according to (1), further comprising an operation feeling feedback unit that feeds back an operation feeling to the user by tactile sensation or vibration, wherein feedback based on the acceleration that the mobile device receives while moving is given to the user.
- (7) The image display device according to (1), wherein the display control unit superimposes an AR image on the real-world image captured by the mobile device and displays it.
- (8) The image display device according to (7), wherein the display control unit displays the AR image corresponding to at least one of the current position of the mobile device, an object included in the captured image, and the state of the mobile device.
- (9) The image display device according to (1), wherein the display control unit displays position information of the mobile device and the user.
- (10) The image display device according to (1), wherein the display control unit further displays an image captured by an auto tracker that photographs while tracking the mobile device.
- (11) The image display device according to (1), further comprising a self-view image acquisition unit that acquires a self-view image seen from the user's line of sight, wherein the display control unit switches between, and displays, a mobile-device line-of-sight image captured by the mobile device and the self-view image.
- (12) The image display device according to (1), wherein the display control unit switches between, and displays, images captured from a plurality of viewpoint positions of the mobile device according to the posture of the head.
- (13) The image display device according to (1), wherein the display control unit corrects shaking contained in the moving image captured by the mobile device before displaying it.
- (14) An image display method comprising: a posture detection step of detecting the posture of the user's head; and a display control step of controlling, based on the posture of the head, display of an image captured by a mobile device.
- (15) An image display system comprising: a mobile device that photographs while moving; and an image display device that displays a captured image of the mobile device according to the posture of the user's head.
- (16) A mobile device comprising: a camera unit; a camera base that controls the line-of-sight direction of the camera unit; a moving unit that moves the device; and a communication unit that communicates data including an image captured by the camera unit, wherein the camera unit includes cameras at a plurality of viewpoints that perform photography by a pan-focus parallel method.
- The mobile device 120 may be a radio-controlled model of an airplane, helicopter, automobile, yacht, or the like, or may be an actual mobile apparatus such as a real airplane, helicopter, automobile, or yacht.
- The mobile device itself need not be capable of remote operation or remote guidance.
- The moving body may also be a living body such as a person or an animal, rather than a mechanical device.
- DESCRIPTION OF SYMBOLS
100…Image display system, 110…Head mounted display, 120…Mobile device, 130…Controller
201, 202…Microphone
401…Control unit, 401A…ROM, 401B…RAM
402…Input operation unit, 403…Remote control reception unit
404…Posture/position detection unit, 405…Communication unit, 406…Storage unit
407…Image processing unit, 408…Display drive unit
409…Display panel, 410…Virtual image optical unit
411…State detection unit, 412…Operation feeling feedback unit
413…Outer camera
601…Control unit, 602…Storage unit
603…Movement mechanism unit, 604…Position/orientation/velocity detection unit
605…Camera unit, 606…Camera base
607…Communication unit
701…Display field angle control unit, 702…Image cutout unit
703…Image composition unit
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Mechanical Engineering (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Controls And Circuits For Display Device (AREA)
- Processing Or Creating Images (AREA)
- Studio Devices (AREA)
Abstract
Description
An image display device comprising:
a display unit attached to the user's head;
a posture detection unit that detects the posture of the head; and
a display control unit that controls, based on the posture of the head, display on the display unit of an image captured by a mobile device.
An image display method comprising:
a posture detection step of detecting the posture of the user's head; and
a display control step of controlling, based on the posture of the head, display of an image captured by a mobile device.
An image display system comprising:
a mobile device that photographs while moving; and
an image display device that displays a captured image of the mobile device according to the posture of the user's head.
A mobile device comprising:
a camera unit;
a camera base that controls the line-of-sight direction of the camera unit;
a moving unit that moves the device; and
a communication unit that communicates data including an image captured by the camera unit,
wherein the camera unit includes cameras at a plurality of viewpoints that perform photography by a pan-focus parallel method.
A computer program written in a computer-readable format so as to cause a computer to function as:
a posture detection unit that detects the posture of the user's head; and
a display control unit that controls, based on the posture of the head, display of an image captured by a mobile device.
FIG. 1 schematically shows the configuration of an image display system 100 according to an embodiment of the technology disclosed in the present specification. The illustrated image display system 100 comprises an image display device (head mounted display) 110 that the user wears on the head or face, mobile devices 120-1, 120-2, 120-3, and so on, which are moving models of an airplane (or a helicopter or other flying object), an automobile, a ship, and the like, and a controller 130 that wirelessly pilots the mobile devices 120. Each of the mobile devices 120-1, 120-2, 120-3, … carries a wireless camera (not shown) and photographs the scenery while moving. The controller 130 may be a multi-function information terminal such as a smartphone, on which a piloting application for the mobile devices 120 is running.
Details of the processing for displaying, on the head mounted display 110, an image captured on the mobile device 120 side are described below.
As shown in FIG. 8B, the camera unit 605 is configured as a stereo camera. The left-eye camera and the right-eye camera have no convergence; their lines of sight are almost parallel, and they photograph with pan-focus.
When photographing a three-dimensional image for all-sky display, zooming in on part of the image produces a cut-away view, as if peering through a telescope, because the inter-viewpoint distance is constant. On the mobile device 120 side, therefore, a stereo camera whose inter-viewpoint distance between the left and right cameras can be moved is used, and photography is performed while moving the inter-viewpoint distance according to the zoom magnification, so that a natural three-dimensional all-sky image is obtained. Alternatively, when the inter-viewpoint distance between the left-eye camera and the right-eye camera is fixed, image processing that widens the viewpoints, such as extrapolating outside the camera viewpoints, may be performed on either the mobile device 120 or the head mounted display 110.
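A minimal sketch of the zoom-to-baseline coupling follows; the linear proportionality is an assumption, since the specification only states that the inter-viewpoint distance is moved according to the zoom magnification.

```python
def baseline_for_zoom(base_baseline_mm, zoom_magnification):
    # Widen the distance between the left and right camera viewpoints
    # in proportion to the zoom magnification, so that the magnified
    # all-sky image keeps a natural stereoscopic depth instead of the
    # "peering through a telescope" look of a fixed baseline.
    return base_baseline_mm * zoom_magnification
```

For example, doubling the zoom would double a 65 mm baseline to 130 mm under this model.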
If the pan, tilt, and roll coordinate system of the user's head (see FIG. 5) is made to coincide with the pan, tilt, and roll coordinate system of the camera unit 605 (see FIG. 8A), then, when an image at the display angle of view following the head movement of a forward-facing user is shown, mostly the lower side of the scene is visible while the mobile device 120 is moving. In the case of the automobile mobile device 120-2, for example, only the ground would be seen. To display the image visible ahead, the user would have to tilt the head upward so as to tilt the camera unit 605 mounted on the mobile device 120-2 upward as well (see FIG. 14). However, the user's neck becomes tired if the user must constantly look up while the mobile device 120-2 is moving.
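Offsetting the camera line of sight relative to the head posture, as in configuration (5), addresses this. A sketch, in which the fixed 20-degree offset value is purely illustrative:

```python
def camera_attitude(head_pan_deg, head_tilt_deg, head_roll_deg,
                    tilt_offset_deg=20.0):
    # Map the user's head posture onto the camera platform, adding a
    # fixed upward tilt offset so that a forward-looking user sees the
    # scene ahead of the moving device rather than the ground, without
    # having to keep the neck craned upward.
    return (head_pan_deg,
            head_tilt_deg + tilt_offset_deg,
            head_roll_deg)
```

With the head level, the camera already looks 20 degrees above the device's forward axis.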
As already described, the head mounted display 110 includes the operation feeling feedback unit 412, which gives the user operation feeling feedback through tactile sensation or vibration. Since the head mounted display 110 is a device worn by the user, tactile sensation and vibration can be applied directly, making this an effective form of feedback.
In conventional FPV, the image captured by the camera mounted on the radio-controlled model is displayed as it is. In the present embodiment, by contrast, when displaying the image captured on the mobile device 120 side, the head mounted display 110 superimposes an AR image that does not actually exist.
In the image display system 100 according to the present embodiment, the user can observe the captured image on the head mounted display 110 while operating the controller 130 to move the mobile device 120. The user can thus enjoy the scenery of a remote place to which the mobile device 120 has moved, without moving themselves.
So far, embodiments have been described in which so-called FPV (first-person view) images captured by the mobile device 120 are displayed and enjoyed on the head mounted display 110. Some users, however, wish to enjoy scenery that includes the mobile device 120 itself.
So far, embodiments have been described in which remote scenery images captured by the mobile device 120 (including images captured by the auto tracker 2100) are displayed and enjoyed on the head mounted display 110.
While a wide-angle captured image such as an all-sky image is being observed at a narrower display angle of view, when the user tilts the head, the display field angle control unit 701 moves the display angle of view up or down so as to follow, and the image cutout unit 702 cuts out the display image at the display angle of view after the movement (see FIG. 9). Here, when the user tilts the head upward and moves the display angle of view upward, the distance from the projection plane increases toward the upper end of the display angle of view, so the display image becomes a trapezoid whose upper edge is shortened with respect to the display frame of the display panel 409. Similarly, when the user tilts the head downward, the distance from the projection plane increases toward the lower end of the display angle of view, so the display image becomes a trapezoid whose lower edge is shortened (see FIG. 27). A trapezoidal image gives the user a feeling of unnaturalness.
In the case of the automobile mobile device 120-2, for example, the captured image pitches or rolls owing to unevenness of the road surface on which the vehicle travels. In the case of a flying mobile device 120-1 such as an airplane or helicopter, the captured image shakes when the craft is buffeted by air currents. In the case of a watercraft mobile device 120-3 such as a yacht, too, the captured image shakes owing to currents and waves during navigation.
The technology disclosed in the present specification can also be configured as follows.
(1) An image display device comprising: a display unit attached to the user's head; a posture detection unit that detects the posture of the head; and a display control unit that controls, based on the posture of the head, display on the display unit of an image captured by a mobile device.
(2) The image display device according to (1), wherein the display control unit cuts out, from the wide-angle image captured by the mobile device, a region corresponding to the posture of the head and displays it on the display unit.
(3) The image display device according to (1), wherein the display control unit, when displaying a plurality of viewpoint images photographed at a plurality of viewpoints by a pan-focus parallel method, adjusts the convergence position of the viewpoint images based on the moving speed of the moving body.
(4) The image display device according to (1), wherein the display control unit causes the display unit to display an image at a viewpoint interval corresponding to a zoom operation.
(5) The image display device according to (1), wherein the line-of-sight direction of the camera unit of the mobile device is offset, with respect to the posture of the head, in at least one of the pan, tilt, and roll directions.
(6) The image display device according to (1), further comprising an operation feeling feedback unit that feeds back an operation feeling to the user by tactile sensation or vibration, wherein feedback based on the acceleration that the mobile device receives while moving is given to the user.
(7) The image display device according to (1), wherein the display control unit superimposes an AR image on the real-world image captured by the mobile device and displays it.
(8) The image display device according to (7), wherein the display control unit displays the AR image corresponding to at least one of the current position of the mobile device, an object included in the captured image, and the state of the mobile device.
(9) The image display device according to (1), wherein the display control unit displays position information of the mobile device and the user.
(10) The image display device according to (1), wherein the display control unit further displays an image captured by an auto tracker that photographs while tracking the mobile device.
(11) The image display device according to (1), further comprising a self-view image acquisition unit that acquires a self-view image seen from the user's line of sight, wherein the display control unit switches between, and displays, a mobile-device line-of-sight image captured by the mobile device and the self-view image.
(12) The image display device according to (1), wherein the display control unit switches between, and displays, images captured from a plurality of viewpoint positions of the mobile device according to the posture of the head.
(13) The image display device according to (1), wherein the display control unit corrects shaking contained in the moving image captured by the mobile device before displaying it.
(14) An image display method comprising: a posture detection step of detecting the posture of the user's head; and a display control step of controlling, based on the posture of the head, display of an image captured by a mobile device.
(15) An image display system comprising: a mobile device that photographs while moving; and an image display device that displays a captured image of the mobile device according to the posture of the user's head.
(16) A mobile device comprising: a camera unit; a camera base that controls the line-of-sight direction of the camera unit; a moving unit that moves the device; and a communication unit that communicates data including an image captured by the camera unit, wherein the camera unit includes cameras at a plurality of viewpoints that perform photography by a pan-focus parallel method.
(17) The mobile device according to (16), which photographs an all-sky image while changing the inter-viewpoint distance between the cameras of the plurality of viewpoints.
(18) The mobile device according to (16), wherein an image outside the camera viewpoints is extrapolated from the images captured by the cameras of the plurality of viewpoints whose inter-viewpoint distance is fixed.
(19) A computer program written in a computer-readable format so as to cause a computer to function as: a posture detection unit that detects the posture of the user's head; and a display control unit that controls, based on the posture of the head, display of an image captured by a mobile device.
110…Head mounted display, 120…Mobile device
130…Controller
201, 202…Microphone
401…Control unit, 401A…ROM, 401B…RAM
402…Input operation unit, 403…Remote control reception unit
404…Posture/position detection unit, 405…Communication unit, 406…Storage unit
407…Image processing unit, 408…Display drive unit
409…Display panel, 410…Virtual image optical unit
411…State detection unit, 412…Operation feeling feedback unit
413…Outer camera
601…Control unit, 602…Storage unit
603…Movement mechanism unit, 604…Position/orientation/velocity detection unit
605…Camera unit, 606…Camera base
607…Communication unit
701…Display field angle control unit, 702…Image cutout unit
703…Image composition unit
Claims (19)
- An image display device comprising: a display unit attached to the user's head; a posture detection unit that detects the posture of the head; and a display control unit that controls, based on the posture of the head, display on the display unit of an image captured by a mobile device.
- The image display device according to claim 1, wherein the display control unit cuts out, from the wide-angle image captured by the mobile device, a region corresponding to the posture of the head and displays it on the display unit.
- The image display device according to claim 1, wherein the display control unit, when displaying a plurality of viewpoint images photographed at a plurality of viewpoints by a pan-focus parallel method, adjusts the convergence position of the viewpoint images based on the moving speed of the moving body.
- The image display device according to claim 1, wherein the display control unit causes the display unit to display an image at a viewpoint interval corresponding to a zoom operation.
- The image display device according to claim 1, wherein the line-of-sight direction of the camera unit of the mobile device is offset, with respect to the posture of the head, in at least one of the pan, tilt, and roll directions.
- The image display device according to claim 1, further comprising an operation feeling feedback unit that feeds back an operation feeling to the user by tactile sensation or vibration, wherein feedback based on the acceleration that the mobile device receives while moving is given to the user.
- The image display device according to claim 1, wherein the display control unit superimposes an AR image on the real-world image captured by the mobile device and displays it.
- The image display device according to claim 7, wherein the display control unit displays the AR image corresponding to at least one of the current position of the mobile device, an object included in the captured image, and the state of the mobile device.
- The image display device according to claim 1, wherein the display control unit displays position information of the mobile device and the user.
- The image display device according to claim 1, wherein the display control unit further displays an image captured by an auto tracker that photographs while tracking the mobile device.
- The image display device according to claim 1, further comprising a self-view image acquisition unit that acquires a self-view image seen from the user's line of sight, wherein the display control unit switches between, and displays, a mobile-device line-of-sight image captured by the mobile device and the self-view image.
- The image display device according to claim 1, wherein the display control unit switches between, and displays, images captured from a plurality of viewpoint positions of the mobile device according to the posture of the head.
- The image display device according to claim 1, wherein the display control unit corrects shaking contained in the moving image captured by the mobile device before displaying it.
- An image display method comprising: a posture detection step of detecting the posture of the user's head; and a display control step of controlling, based on the posture of the head, display of an image captured by a mobile device.
- An image display system comprising: a mobile device that photographs while moving; and an image display device that displays a captured image of the mobile device according to the posture of the user's head.
- A mobile device comprising: a camera unit; a camera base that controls the line-of-sight direction of the camera unit; a moving unit that moves the device; and a communication unit that communicates data including an image captured by the camera unit, wherein the camera unit includes cameras at a plurality of viewpoints that perform photography by a pan-focus parallel method.
- The mobile device according to claim 16, which photographs an all-sky image while changing the inter-viewpoint distance between the cameras of the plurality of viewpoints.
- The mobile device according to claim 16, wherein an image outside the camera viewpoints is extrapolated from the images captured by the cameras of the plurality of viewpoints whose inter-viewpoint distance is fixed.
- A computer program written in a computer-readable format so as to cause a computer to function as: a posture detection unit that detects the posture of the user's head; and a display control unit that controls, based on the posture of the head, display of an image captured by a mobile device.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
RU2015116420A RU2646360C2 (ru) | 2012-11-13 | 2013-10-02 | Устройство и способ отображения изображения, мобильное устройство, система отображения изображения и компьютерная программа |
BR112015010280-8A BR112015010280B1 (pt) | 2012-11-13 | 2013-10-02 | Sistema e método de exibição de imagem |
CN201380058073.XA CN104781873B (zh) | 2012-11-13 | 2013-10-02 | 图像显示装置、图像显示方法、移动装置、图像显示系统 |
US14/441,138 US10108018B2 (en) | 2012-11-13 | 2013-10-02 | Image display apparatus for displaying an image captured by a mobile apparatus |
EP13855496.9A EP2922049A4 (en) | 2012-11-13 | 2013-10-02 | PICTURE DISPLAY DEVICE AND PICTURE DISPLAY METHOD, MOBILE BODY DEVICE, PICTURE DISPLAY SYSTEM AND COMPUTER PROGRAM |
JP2014546903A JPWO2014077046A1 (ja) | 2012-11-13 | 2013-10-02 | 画像表示装置及び画像表示方法、移動体装置、画像表示システム、並びにコンピューター・プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012249614 | 2012-11-13 | ||
JP2012-249614 | 2012-11-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014077046A1 true WO2014077046A1 (ja) | 2014-05-22 |
Family
ID=50730967
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/076830 WO2014077046A1 (ja) | 2012-11-13 | 2013-10-02 | 画像表示装置及び画像表示方法、移動体装置、画像表示システム、並びにコンピューター・プログラム |
Country Status (7)
Country | Link |
---|---|
US (1) | US10108018B2 (ja) |
EP (1) | EP2922049A4 (ja) |
JP (1) | JPWO2014077046A1 (ja) |
CN (1) | CN104781873B (ja) |
BR (1) | BR112015010280B1 (ja) |
RU (1) | RU2646360C2 (ja) |
WO (1) | WO2014077046A1 (ja) |
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104065952A (zh) * | 2014-07-02 | 2014-09-24 | 联想(北京)有限公司 | 穿戴式电子设备和图像处理方法 |
JP5831893B1 (ja) * | 2015-05-07 | 2015-12-09 | 立 寺坂 | 航空機の遠隔操縦システム |
WO2016017245A1 (ja) * | 2014-07-31 | 2016-02-04 | ソニー株式会社 | 情報処理装置及び情報処理方法、並びに画像表示システム |
JP2016034091A (ja) * | 2014-07-31 | 2016-03-10 | セイコーエプソン株式会社 | 表示装置、表示装置の制御方法、および、プログラム |
JP2016080971A (ja) * | 2014-10-21 | 2016-05-16 | ローム株式会社 | 視覚補助システム |
JP2017502596A (ja) * | 2014-09-17 | 2017-01-19 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | ホワイトバランス調整方法及び撮像システム |
JP2017022433A (ja) * | 2015-07-07 | 2017-01-26 | 日立建機株式会社 | 遠隔操縦システム |
WO2017022296A1 (ja) * | 2015-08-03 | 2017-02-09 | ソニー株式会社 | 情報管理装置及び情報管理方法、並びに映像再生装置及び映像再生方法 |
CN106416240A (zh) * | 2014-05-27 | 2017-02-15 | 瑞典爱立信有限公司 | 对可旋转立体相机的远程控制 |
JP2017050718A (ja) * | 2015-09-02 | 2017-03-09 | 株式会社東芝 | ウェアラブル端末、方法及びシステム |
JP2017079385A (ja) * | 2015-10-20 | 2017-04-27 | 株式会社エヌアイデイ | 映像及び音声を送信可能な装置を用いた相互コミュニケーションシステム及び相互コミュニケーション方法、並びに相互コミュニケーションプログラム |
JP2017111619A (ja) * | 2015-12-16 | 2017-06-22 | 日本電信電話株式会社 | 画像処理装置、画像処理方法及び画像処理プログラム |
CN107111928A (zh) * | 2014-11-17 | 2017-08-29 | 洋马株式会社 | 远程控制作业机用显示系统 |
WO2017159063A1 (ja) * | 2016-03-14 | 2017-09-21 | ソニー株式会社 | 表示装置並びに情報処理端末装置 |
JP2017183852A (ja) * | 2016-03-29 | 2017-10-05 | ブラザー工業株式会社 | 表示装置および表示制御方法 |
EP3158424A4 (en) * | 2014-07-14 | 2017-11-08 | Huawei Technologies Co. Ltd. | System and method for display enhancement |
CN107615759A (zh) * | 2015-06-10 | 2018-01-19 | 索尼互动娱乐股份有限公司 | 头戴式显示器、显示控制方法以及程序 |
WO2018056155A1 (ja) * | 2016-09-21 | 2018-03-29 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、画像生成方法およびヘッドマウントディスプレイ |
WO2018074893A1 (ko) * | 2016-10-20 | 2018-04-26 | 삼성전자 주식회사 | 디스플레이 장치 및 디스플레이 장치의 영상 처리 방법 |
JP2018073366A (ja) * | 2016-11-04 | 2018-05-10 | キヤノン株式会社 | 画像処理装置、画像処理方法、およびプログラム |
JP2018112809A (ja) * | 2017-01-10 | 2018-07-19 | セイコーエプソン株式会社 | 頭部装着型表示装置およびその制御方法、並びにコンピュータープログラム |
JP2018121267A (ja) * | 2017-01-27 | 2018-08-02 | セイコーエプソン株式会社 | 表示装置、及び、表示装置の制御方法 |
KR101895482B1 (ko) * | 2017-04-13 | 2018-09-05 | 박대건 | 증강현실 원격제어 방법, 프로그램 및 시스템 |
JP2018157578A (ja) * | 2018-05-10 | 2018-10-04 | パナソニックIpマネジメント株式会社 | 映像表示システム、映像表示装置及び映像表示方法 |
JP2018164223A (ja) * | 2017-03-27 | 2018-10-18 | 東芝情報システム株式会社 | 表示システム |
US20180359462A1 (en) * | 2015-08-06 | 2018-12-13 | Sony Interactive Entertainment Inc. | Information processing apparatus |
JP2019092140A (ja) * | 2017-11-14 | 2019-06-13 | 株式会社B.b.designLab | 瞳孔間のキャリブレーションプログラム、前記プログラムを用いたWebサーバ、及び、瞳孔間のキャリブレーション方法 |
JP2019213207A (ja) * | 2019-07-24 | 2019-12-12 | 株式会社Dapリアライズ | 無人移動体用端末装置、該無人移動体用端末装置によって制御される無人移動体、及び、無人移動体用端末装置用通信ユニット |
JP2020031435A (ja) * | 2019-10-31 | 2020-02-27 | ヤンマー株式会社 | 作業機、表示装置および作業機の遠隔操作のための表示システム |
US10587819B2 (en) | 2015-07-14 | 2020-03-10 | Panasonic Intellectual Property Management Co., Ltd. | Video display system, video display device, and video display method |
JP2020145612A (ja) * | 2019-03-07 | 2020-09-10 | 株式会社Jvcケンウッド | 画像処理装置、画像処理システム、画像処理方法、及びプログラム |
JP2020177066A (ja) * | 2019-04-16 | 2020-10-29 | 凸版印刷株式会社 | 観察状態表示システム、観察状態表示方法及びプログラム |
JP2021086229A (ja) * | 2019-11-25 | 2021-06-03 | パイオニア株式会社 | 表示制御装置、表示制御方法及び表示制御用プログラム |
WO2023203853A1 (ja) * | 2022-04-20 | 2023-10-26 | 株式会社サンタ・プレゼンツ | 遠隔体験システム |
US12063344B2 (en) | 2019-09-13 | 2024-08-13 | Sony Group Corporation | Reproduction device, reproduction method, and recording medium |
Families Citing this family (55)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015115848A (ja) * | 2013-12-13 | 2015-06-22 | セイコーエプソン株式会社 | 頭部装着型表示装置および頭部装着型表示装置の制御方法 |
US10198865B2 (en) | 2014-07-10 | 2019-02-05 | Seiko Epson Corporation | HMD calibration with direct geometric modeling |
US10426668B2 (en) | 2014-09-18 | 2019-10-01 | Rohm Co., Ltd. | Binocular display apparatus |
CN105607253B (zh) * | 2014-11-17 | 2020-05-12 | 精工爱普生株式会社 | 头部佩戴型显示装置以及控制方法、显示系统 |
EP3059658A1 (en) * | 2015-02-18 | 2016-08-24 | Nokia Technologies OY | Apparatus, methods and computer programs for providing images |
US10409464B2 (en) * | 2015-03-18 | 2019-09-10 | Microsoft Technology Licensing, Llc | Providing a context related view with a wearable apparatus |
US10192133B2 (en) | 2015-06-22 | 2019-01-29 | Seiko Epson Corporation | Marker, method of detecting position and pose of marker, and computer program |
US10192361B2 (en) | 2015-07-06 | 2019-01-29 | Seiko Epson Corporation | Head-mounted display device and computer program |
JP6860485B2 (ja) * | 2015-08-05 | 2021-04-14 | ソニー株式会社 | 情報処理装置、および情報処理方法、並びにプログラム |
WO2017086355A1 (ja) * | 2015-11-17 | 2017-05-26 | ソニー株式会社 | 送信装置、送信方法、受信装置、受信方法および送受信システム |
US10424117B2 (en) | 2015-12-02 | 2019-09-24 | Seiko Epson Corporation | Controlling a display of a head-mounted display device |
JP6532393B2 (ja) * | 2015-12-02 | 2019-06-19 | 株式会社ソニー・インタラクティブエンタテインメント | 表示制御装置及び表示制御方法 |
CN105898285A (zh) * | 2015-12-21 | 2016-08-24 | 乐视致新电子科技(天津)有限公司 | 虚拟显示设备的影像播放方法和装置 |
WO2017110731A1 (ja) | 2015-12-24 | 2017-06-29 | 株式会社ソニー・インタラクティブエンタテインメント | 周波数帯決定装置、ヘッドマウントディスプレイ、周波数帯決定方法及びプログラム |
CN105653032B (zh) * | 2015-12-29 | 2019-02-19 | 小米科技有限责任公司 | 显示调整方法及装置 |
CN107250895A (zh) * | 2015-12-31 | 2017-10-13 | 深圳市柔宇科技有限公司 | 头戴式显示设备及其摄像头的调节方法 |
KR20180102591A (ko) * | 2016-01-13 | 2018-09-17 | 포브, 아이엔씨. | 표정 인식 시스템, 표정 인식 방법 및 표정 인식 프로그램 |
JP6668811B2 (ja) * | 2016-02-23 | 2020-03-18 | セイコーエプソン株式会社 | 訓練装置、訓練方法、プログラム |
US11052547B2 (en) * | 2016-04-20 | 2021-07-06 | Sony Interactive Entertainment Inc. | Robot and housing |
CN113359807A (zh) * | 2016-04-29 | 2021-09-07 | 深圳市大疆创新科技有限公司 | 一种无人机第一视角飞行的控制方法及系统、智能眼镜 |
CN109219789A (zh) * | 2016-05-04 | 2019-01-15 | 深圳脑穿越科技有限公司 | 虚拟现实的显示方法、装置及终端 |
CN107462989A (zh) * | 2016-06-06 | 2017-12-12 | 华硕电脑股份有限公司 | 影像稳定方法与电子装置 |
WO2017219288A1 (zh) * | 2016-06-22 | 2017-12-28 | 华为技术有限公司 | 头戴式显示设备及其处理方法 |
JPWO2018021067A1 (ja) * | 2016-07-29 | 2019-05-09 | ソニー株式会社 | 画像処理装置および画像処理方法 |
CN108351651B (zh) * | 2016-09-27 | 2024-06-21 | 深圳市大疆创新科技有限公司 | 一种基于影像的控制方法、装置及飞行器 |
WO2018058693A1 (zh) * | 2016-10-01 | 2018-04-05 | 北京蚁视科技有限公司 | 一种防止用户眩晕的视频图像显示方法 |
KR20180040451A (ko) * | 2016-10-12 | 2018-04-20 | 엘지전자 주식회사 | 이동 단말기 및 그의 동작 방법 |
CN106780757B (zh) * | 2016-12-02 | 2020-05-12 | 西北大学 | 一种增强现实的方法 |
CN106814735B (zh) * | 2016-12-06 | 2020-08-14 | 北京臻迪科技股份有限公司 | 一种无人船的控制系统 |
CN106598390B (zh) * | 2016-12-12 | 2021-01-15 | 联想(北京)有限公司 | 一种显示方法、电子设备及显示装置 |
KR20180068411A (ko) * | 2016-12-14 | 2018-06-22 | 삼성전자주식회사 | 무인 비행 전자 장치의 운행 제어 방법 및 이를 지원하는 전자 장치 |
TWI632395B (zh) | 2016-12-27 | 2018-08-11 | 廣達電腦股份有限公司 | 虛擬實境裝置以及其畫面產生方法 |
CN106791691B (zh) * | 2017-01-04 | 2021-09-14 | 北京臻迪科技股份有限公司 | 一种无人船的控制系统 |
KR102281400B1 (ko) * | 2017-03-23 | 2021-07-23 | 도시바 미쓰비시덴키 산교시스템 가부시키가이샤 | 철강 플랜트의 분석 지원 장치 |
CN107123158B (zh) * | 2017-04-06 | 2020-12-29 | 北京臻迪科技股份有限公司 | 一种无人船的成像系统 |
CN106990848B (zh) * | 2017-04-10 | 2023-06-20 | 河南中医药大学 | 一种虚拟现实防眩晕方法和系统 |
US11202051B2 (en) | 2017-05-18 | 2021-12-14 | Pcms Holdings, Inc. | System and method for distributing and rendering content as spherical video and 3D asset combination |
US10139631B1 (en) * | 2017-06-05 | 2018-11-27 | Microsoft Technology Licensing, Llc | Apparatus and method of 1:1 matching head mounted display view to head movement that controls articulated camera |
EP3637412A4 (en) | 2017-06-08 | 2021-04-14 | Canon Kabushiki Kaisha | IMAGE PROCESSING DEVICE AND CONTROL METHOD FOR IT |
US10970943B2 (en) * | 2017-06-09 | 2021-04-06 | II Timothy Robert Hay | Method and apparatus for a vehicle force indicator |
CN107368192B (zh) * | 2017-07-18 | 2021-03-02 | 歌尔光学科技有限公司 | Vr眼镜的实景观测方法及vr眼镜 |
US20190124251A1 (en) * | 2017-10-23 | 2019-04-25 | Sony Corporation | Remotely controllable camera on eyeglass-type mount for the blind |
CN108055523A (zh) * | 2017-12-11 | 2018-05-18 | 大连高马艺术设计工程有限公司 | 一种基于图像传输和姿态感应的模拟主视角控制游乐系统 |
JP7059662B2 (ja) * | 2018-02-02 | 2022-04-26 | トヨタ自動車株式会社 | 遠隔操作システム、及びその通信方法 |
JP7128409B2 (ja) * | 2018-06-27 | 2022-08-31 | 学校法人東海大学 | 遠隔制御装置、遠隔制御システム、遠隔制御方法及び遠隔制御プログラム |
CN110696712B (zh) * | 2018-07-10 | 2021-02-09 | 广州汽车集团股份有限公司 | 汽车后视镜自动调节方法、装置、计算机存储介质与汽车 |
CN109040555A (zh) * | 2018-08-22 | 2018-12-18 | 信利光电股份有限公司 | 一种fpv图像的显示系统及方法 |
US10823970B2 (en) | 2018-08-23 | 2020-11-03 | Apple Inc. | Head-mounted electronic display device with lens position sensing |
CN209690628U (zh) | 2018-08-23 | 2019-11-26 | 苹果公司 | 系统和头戴式设备 |
US10762660B2 (en) * | 2018-09-28 | 2020-09-01 | Verizon Patent And Licensing, Inc. | Methods and systems for detecting and assigning attributes to objects of interest in geospatial imagery |
CN109911140A (zh) * | 2019-04-09 | 2019-06-21 | 上海萃钛智能科技有限公司 | 一种水域航行信息增强装置、系统及方法 |
US11890424B2 (en) | 2019-07-15 | 2024-02-06 | International Business Machines Corporation | Augmented reality enabled motion sickness reduction |
JP6655751B1 (ja) * | 2019-07-25 | 2020-02-26 | エヌ・ティ・ティ・コミュニケーションズ株式会社 | 映像表示制御装置、方法およびプログラム |
DE102019124386A1 (de) * | 2019-09-11 | 2021-03-11 | Audi Ag | Verfahren zum Betreiben einer Virtual-Reality-Brille in einem Fahrzeug sowie Virtual-Reality-System mit einer Virtual-Reality-Brille und einem Fahrzeug |
CN113014871B (zh) * | 2021-02-20 | 2023-11-10 | 青岛小鸟看看科技有限公司 | 内窥镜图像显示方法、装置及内窥镜手术辅助系统 |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07287761A (ja) * | 1994-04-19 | 1995-10-31 | Canon Inc | 画像処理装置及び画像処理方法 |
JPH08191419A (ja) * | 1995-01-10 | 1996-07-23 | Yamaha Corp | 頭部装着型表示システム |
JPH0937137A (ja) * | 1995-07-25 | 1997-02-07 | Minolta Co Ltd | 移動型立体カメラ装置 |
JPH09106322A (ja) | 1995-10-09 | 1997-04-22 | Data Tec:Kk | ヘッドマウントディスプレイにおける姿勢角検出装置 |
JPH1070740A (ja) * | 1996-08-28 | 1998-03-10 | Sony Corp | 立体カメラおよびビデオ映像伝送システム |
JP2001209426A (ja) * | 2000-01-26 | 2001-08-03 | Nippon Telegr & Teleph Corp <Ntt> | 移動体制御装置 |
JP2005297168A (ja) * | 2004-04-16 | 2005-10-27 | Fujinon Corp | 遠隔操作ロボット |
JP2008147865A (ja) | 2006-12-07 | 2008-06-26 | Sony Corp | 画像表示システム、表示装置、表示方法 |
JP2010256534A (ja) | 2009-04-23 | 2010-11-11 | Fujifilm Corp | 全方位画像表示用ヘッドマウントディスプレイ装置 |
JP2011128220A (ja) * | 2009-12-15 | 2011-06-30 | Toshiba Corp | 情報提示装置、情報提示方法及びプログラム |
JP2011183824A (ja) | 2010-03-04 | 2011-09-22 | Chugoku Electric Power Co Inc:The | 空中撮影システム |
JP2012143447A (ja) | 2011-01-13 | 2012-08-02 | Sharp Corp | ネットワークシステム、コントロール方法、コントローラ、およびコントロールプログラム |
JP2012151800A (ja) | 2011-01-21 | 2012-08-09 | Sharp Corp | 撮影装置およびネットワークシステム |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3916094A (en) * | 1974-06-21 | 1975-10-28 | Us Navy | Submersible visual simulator for remotely piloted systems |
JPH08336128A (ja) * | 1995-04-05 | 1996-12-17 | Matsushita Electric Ind Co Ltd | 映像視聴装置 |
JPH08307753A (ja) * | 1995-05-09 | 1996-11-22 | Minolta Co Ltd | カメラと表示装置から成る映像システム及び表示装置 |
US5905525A (en) * | 1995-07-13 | 1999-05-18 | Minolta Co., Ltd. | Image display apparatus having a display controlled by user's head movement |
JPH10206789A (ja) * | 1997-01-22 | 1998-08-07 | Honda Motor Co Ltd | 車両用ヘッドマウントディスプレイ装置 |
JP2000013818A (ja) * | 1998-06-23 | 2000-01-14 | Nec Corp | 立体表示装置及び立体表示方法 |
US7180476B1 (en) * | 1999-06-30 | 2007-02-20 | The Boeing Company | Exterior aircraft vision system using a helmet-mounted display |
JP2001344597A (ja) * | 2000-05-30 | 2001-12-14 | Fuji Heavy Ind Ltd | 融合視界装置 |
SE527257C2 (sv) * | 2004-06-21 | 2006-01-31 | Totalfoersvarets Forskningsins | Anordning och metod för att presentera en omvärldsbild |
WO2006038662A1 (ja) * | 2004-10-07 | 2006-04-13 | Japan Science And Technology Agency | 画像表示装置および電子眼鏡 |
JP2006197068A (ja) * | 2005-01-12 | 2006-07-27 | Yokogawa Electric Corp | 画像表示装置および画像表示方法 |
JP5228307B2 (ja) * | 2006-10-16 | 2013-07-03 | ソニー株式会社 | 表示装置、表示方法 |
US7671887B2 (en) * | 2006-11-20 | 2010-03-02 | General Electric Company | System and method of navigating a medical instrument |
RU2334369C1 (ru) * | 2006-11-30 | 2008-09-20 | Борис Иванович Волков | Система стереотелевидения |
JP2009128565A (ja) * | 2007-11-22 | 2009-06-11 | Toshiba Corp | 表示装置、表示方法及びヘッドアップディスプレイ |
JP2009278234A (ja) * | 2008-05-13 | 2009-11-26 | Konica Minolta Holdings Inc | 表示システム |
US8885039B2 (en) * | 2008-07-25 | 2014-11-11 | Lg Electronics Inc. | Providing vehicle information |
JP4863527B2 (ja) * | 2009-12-01 | 2012-01-25 | 稔 稲葉 | 立体映像撮像装置 |
JP2011075956A (ja) * | 2009-09-30 | 2011-04-14 | Brother Industries Ltd | ヘッドマウントディスプレイ |
JP2011135247A (ja) * | 2009-12-24 | 2011-07-07 | Nikon Corp | 撮像装置 |
US20110238079A1 (en) * | 2010-03-18 | 2011-09-29 | SPI Surgical, Inc. | Surgical Cockpit Comprising Multisensory and Multimodal Interfaces for Robotic Surgery and Methods Related Thereto |
JP2011205353A (ja) * | 2010-03-25 | 2011-10-13 | Yamaha Corp | 視聴状況認識装置および視聴状況認識システム |
JP5148652B2 (ja) * | 2010-03-31 | 2013-02-20 | 株式会社バンダイナムコゲームス | プログラム、情報記憶媒体及び画像生成システム |
CN102270035A (zh) * | 2010-06-04 | 2011-12-07 | 三星电子株式会社 | 以非触摸方式来选择和操作对象的设备和方法 |
US9596453B2 (en) * | 2010-06-14 | 2017-03-14 | Lg Electronics Inc. | Electronic device and control method thereof |
JP5499985B2 (ja) * | 2010-08-09 | 2014-05-21 | Sony Corporation | Display device assembly |
US8487787B2 (en) * | 2010-09-30 | 2013-07-16 | Honeywell International Inc. | Near-to-eye head tracking ground obstruction system and method |
CN102456127B (zh) * | 2010-10-21 | 2018-03-20 | Samsung Electronics Co., Ltd. | Head pose estimation apparatus and method |
- 2013-10-02 BR BR112015010280-8A patent/BR112015010280B1/pt active IP Right Grant
- 2013-10-02 WO PCT/JP2013/076830 patent/WO2014077046A1/ja active Application Filing
- 2013-10-02 JP JP2014546903A patent/JPWO2014077046A1/ja active Pending
- 2013-10-02 RU RU2015116420A patent/RU2646360C2/ru active
- 2013-10-02 CN CN201380058073.XA patent/CN104781873B/zh active Active
- 2013-10-02 US US14/441,138 patent/US10108018B2/en active Active
- 2013-10-02 EP EP13855496.9A patent/EP2922049A4/en not_active Withdrawn
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07287761A (ja) * | 1994-04-19 | 1995-10-31 | Canon Inc | Image processing device and image processing method |
JPH08191419A (ja) * | 1995-01-10 | 1996-07-23 | Yamaha Corp | Head-mounted display system |
JPH0937137A (ja) * | 1995-07-25 | 1997-02-07 | Minolta Co Ltd | Mobile stereoscopic camera device |
JPH09106322A (ja) | 1995-10-09 | 1997-04-22 | Data Tec:Kk | Attitude angle detection device for head-mounted display |
JPH1070740A (ja) * | 1996-08-28 | 1998-03-10 | Sony Corp | Stereoscopic camera and video image transmission system |
JP2001209426A (ja) * | 2000-01-26 | 2001-08-03 | Nippon Telegr & Teleph Corp <Ntt> | Mobile body control device |
JP2005297168A (ja) * | 2004-04-16 | 2005-10-27 | Fujinon Corp | Remote-controlled robot |
JP2008147865A (ja) | 2006-12-07 | 2008-06-26 | Sony Corp | Image display system, display device, and display method |
JP2010256534A (ja) | 2009-04-23 | 2010-11-11 | Fujifilm Corp | Head-mounted display device for omnidirectional image display |
JP2011128220A (ja) * | 2009-12-15 | 2011-06-30 | Toshiba Corp | Information presentation device, information presentation method, and program |
JP2011183824A (ja) | 2010-03-04 | 2011-09-22 | Chugoku Electric Power Co Inc:The | Aerial photography system |
JP2012143447A (ja) | 2011-01-13 | 2012-08-02 | Sharp Corp | Network system, control method, controller, and control program |
JP2012151800A (ja) | 2011-01-21 | 2012-08-09 | Sharp Corp | Imaging device and network system |
Non-Patent Citations (1)
Title |
---|
See also references of EP2922049A4 |
Cited By (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3149938A1 (en) * | 2014-05-27 | 2017-04-05 | Telefonaktiebolaget LM Ericsson (publ) | Remote control of a pivotable stereoscopic camera |
CN106416240A (zh) * | 2014-05-27 | 2017-02-15 | Telefonaktiebolaget LM Ericsson (publ) | Remote control of a pivotable stereoscopic camera |
CN104065952A (zh) * | 2014-07-02 | 2014-09-24 | Lenovo (Beijing) Co., Ltd. | Wearable electronic device and image processing method |
EP3158424A4 (en) * | 2014-07-14 | 2017-11-08 | Huawei Technologies Co. Ltd. | System and method for display enhancement |
JP2016034091A (ja) * | 2014-07-31 | 2016-03-10 | Seiko Epson Corporation | Display device, method of controlling display device, and program |
EP3177010A4 (en) * | 2014-07-31 | 2018-04-25 | Sony Corporation | Information processing device, information processing method, and image display system |
WO2016017245A1 (ja) * | 2014-07-31 | 2016-02-04 | Sony Corporation | Information processing device, information processing method, and image display system |
US20170278262A1 (en) | 2014-07-31 | 2017-09-28 | Sony Corporation | Information processing device, method of information processing, and image display system |
US10269132B2 (en) | 2014-07-31 | 2019-04-23 | Sony Corporation | Displaying images according to head posture and camera posture |
JPWO2016017245A1 (ja) * | 2014-07-31 | 2017-04-27 | Sony Corporation | Information processing device, information processing method, and image display system |
CN106664393A (zh) * | 2014-07-31 | 2017-05-10 | Sony Corporation | Information processing device, information processing method, and image display system |
JP2017502596A (ja) * | 2014-09-17 | 2017-01-19 | SZ DJI Technology Co., Ltd. | White balance adjustment method and imaging system |
US9743059B2 (en) | 2014-09-17 | 2017-08-22 | SZ DJI Technology Co., Ltd. | Automatic white balancing system and method |
US9743058B2 (en) | 2014-09-17 | 2017-08-22 | SZ DJI Technology Co., Ltd. | Automatic white balancing system and method |
JP2016080971A (ja) * | 2014-10-21 | 2016-05-16 | Rohm Co., Ltd. | Visual assistance system |
EP3222042A1 (en) * | 2014-11-17 | 2017-09-27 | Yanmar Co., Ltd. | Display system for remote control of working machine |
JP2018501684A (ja) * | 2014-11-17 | 2018-01-18 | Yanmar Co., Ltd. | Display system for remote operation of working machine |
CN107111928A (zh) * | 2014-11-17 | 2017-08-29 | Yanmar Co., Ltd. | Display system for remote-controlled working machine |
CN111754750A (zh) * | 2014-11-17 | 2020-10-09 | Yanmar Power Technology Co., Ltd. | Display device for remote control of working machine, display system, and working machine |
US10474228B2 (en) | 2014-11-17 | 2019-11-12 | Yanmar Co., Ltd. | Display system for remote control of working machine |
CN107111928B (zh) * | 2014-11-17 | 2020-07-31 | Yanmar Power Technology Co., Ltd. | Display system for remote-controlled working machine |
CN111754750B (zh) * | 2014-11-17 | 2022-03-01 | Yanmar Power Technology Co., Ltd. | Display device for remote control of working machine, display system, and working machine |
EP3222042B1 (en) * | 2014-11-17 | 2022-07-27 | Yanmar Power Technology Co., Ltd. | Display system for remote control of working machine |
JP5831893B1 (ja) * | 2015-05-07 | 2015-12-09 | 立 寺坂 | Aircraft remote piloting system |
JP2016210274A (ja) * | 2015-05-07 | 2016-12-15 | 立 寺坂 | Aircraft remote piloting system |
EP3310043A4 (en) * | 2015-06-10 | 2019-01-16 | Sony Interactive Entertainment Inc. | HEAD-MOUNTED DISPLAY, DISPLAY CONTROL METHOD AND PROGRAM |
CN107615759A (zh) * | 2015-06-10 | 2018-01-19 | Sony Interactive Entertainment Inc. | Head-mounted display, display control method, and program |
US10715735B2 (en) | 2015-06-10 | 2020-07-14 | Sony Interactive Entertainment Inc. | Head-mounted display, display control method, and program |
CN107615759B (zh) * | 2015-06-10 | 2020-09-01 | Sony Interactive Entertainment Inc. | Head-mounted display, display control method, and program |
JP2017022433A (ja) * | 2015-07-07 | 2017-01-26 | Hitachi Construction Machinery Co., Ltd. | Remote control system |
US10587819B2 (en) | 2015-07-14 | 2020-03-10 | Panasonic Intellectual Property Management Co., Ltd. | Video display system, video display device, and video display method |
WO2017022296A1 (ja) * | 2015-08-03 | 2017-02-09 | Sony Corporation | Information management device and information management method, and video reproduction device and video reproduction method |
US11089213B2 (en) | 2015-08-03 | 2021-08-10 | Sony Group Corporation | Information management apparatus and information management method, and video reproduction apparatus and video reproduction method |
US20180359462A1 (en) * | 2015-08-06 | 2018-12-13 | Sony Interactive Entertainment Inc. | Information processing apparatus |
JP2017050718A (ja) * | 2015-09-02 | 2017-03-09 | Toshiba Corporation | Wearable terminal, method, and system |
JP2017079385A (ja) * | 2015-10-20 | 2017-04-27 | NID Co., Ltd. | Mutual communication system and mutual communication method using devices capable of transmitting video and audio, and mutual communication program |
JP2017111619A (ja) * | 2015-12-16 | 2017-06-22 | Nippon Telegraph and Telephone Corporation | Image processing device, image processing method, and image processing program |
US10455184B2 (en) | 2016-03-14 | 2019-10-22 | Sony Corporation | Display device and information processing terminal device |
WO2017159063A1 (ja) * | 2016-03-14 | 2017-09-21 | Sony Corporation | Display device and information processing terminal device |
JPWO2017159063A1 (ja) * | 2016-03-14 | 2019-01-17 | Sony Corporation | Display device and information processing terminal device |
JP2017183852A (ja) * | 2016-03-29 | 2017-10-05 | Brother Kogyo Kabushiki Kaisha | Display device and display control method |
US10377487B2 (en) | 2016-03-29 | 2019-08-13 | Brother Kogyo Kabushiki Kaisha | Display device and display control method |
WO2017169841A1 (ja) * | 2016-03-29 | 2017-10-05 | Brother Kogyo Kabushiki Kaisha | Display device and display control method |
US11184597B2 (en) | 2016-09-21 | 2021-11-23 | Sony Interactive Entertainment Inc. | Information processing device, image generation method, and head-mounted display |
WO2018056155A1 (ja) * | 2016-09-21 | 2018-03-29 | Sony Interactive Entertainment Inc. | Information processing device, image generation method, and head-mounted display |
US10915993B2 (en) | 2016-10-20 | 2021-02-09 | Samsung Electronics Co., Ltd. | Display apparatus and image processing method thereof |
WO2018074893A1 (ko) * | 2016-10-20 | 2018-04-26 | Samsung Electronics Co., Ltd. | Display apparatus and image processing method of display apparatus |
JP2018073366A (ja) * | 2016-11-04 | 2018-05-10 | Canon Inc | Image processing device, image processing method, and program |
JP2018112809A (ja) * | 2017-01-10 | 2018-07-19 | Seiko Epson Corporation | Head-mounted display device, control method therefor, and computer program |
JP2018121267A (ja) * | 2017-01-27 | 2018-08-02 | Seiko Epson Corporation | Display device and method of controlling display device |
JP2018164223A (ja) * | 2017-03-27 | 2018-10-18 | Toshiba Information Systems Corporation | Display system |
KR101895482B1 (ko) * | 2017-04-13 | 2018-09-05 | 박대건 | Augmented reality remote control method, program, and system |
JP2019092140A (ja) * | 2017-11-14 | 2019-06-13 | B.b.designLab Co., Ltd. | Interpupillary calibration program, Web server using the program, and interpupillary calibration method |
JP2018157578A (ja) * | 2018-05-10 | 2018-10-04 | Panasonic Intellectual Property Management Co., Ltd. | Video display system, video display device, and video display method |
JP2020145612A (ja) * | 2019-03-07 | 2020-09-10 | JVCKenwood Corporation | Image processing device, image processing system, image processing method, and program |
JP2020177066A (ja) * | 2019-04-16 | 2020-10-29 | Toppan Printing Co., Ltd. | Observation state display system, observation state display method, and program |
JP7342409B2 (ja) | 2019-04-16 | 2023-09-12 | Toppan Printing Co., Ltd. | Observation state display system, observation state display method, and program |
JP2019213207A (ja) * | 2019-07-24 | 2019-12-12 | Dap Realize Co., Ltd. | Terminal device for an unmanned mobile body, unmanned mobile body controlled by the terminal device, and communication unit for the terminal device |
US12063344B2 (en) | 2019-09-13 | 2024-08-13 | Sony Group Corporation | Reproduction device, reproduction method, and recording medium |
JP2020031435A (ja) * | 2019-10-31 | 2020-02-27 | Yanmar Co., Ltd. | Working machine, display device, and display system for remote operation of working machine |
JP2021086229A (ja) * | 2019-11-25 | 2021-06-03 | Pioneer Corporation | Display control device, display control method, and display control program |
JP7496683B2 (ja) | 2019-11-25 | 2024-06-07 | Pioneer Corporation | Display control device, display control method, and display control program |
WO2023203853A1 (ja) * | 2022-04-20 | 2023-10-26 | 株式会社サンタ・プレゼンツ | Remote experience system |
Also Published As
Publication number | Publication date |
---|---|
US20150293362A1 (en) | 2015-10-15 |
BR112015010280A2 (pt) | 2017-07-11 |
CN104781873A (zh) | 2015-07-15 |
JPWO2014077046A1 (ja) | 2017-01-05 |
BR112015010280A8 (pt) | 2019-10-01 |
EP2922049A4 (en) | 2016-07-13 |
EP2922049A1 (en) | 2015-09-23 |
CN104781873B (zh) | 2017-06-06 |
RU2015116420A (ru) | 2016-11-20 |
US10108018B2 (en) | 2018-10-23 |
BR112015010280B1 (pt) | 2022-06-21 |
RU2646360C2 (ru) | 2018-03-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014077046A1 (ja) | Image display device and image display method, mobile device, image display system, and computer program | |
JP6642432B2 (ja) | Information processing device, information processing method, and image display system | |
US9886033B2 (en) | System for piloting a drone in immersion | |
CN108139799B (zh) | System and method for processing image data based on a user's region of interest (ROI) | |
US9747725B2 (en) | Video system for piloting a drone in immersive mode | |
US9738382B2 (en) | Drone immersion-piloting system | |
US9158305B2 (en) | Remote control system | |
CN102917205B (zh) | Vehicle surround view system | |
CN108027700B (zh) | Information processing device | |
JP6899875B2 (ja) | Information processing device, video display system, method of controlling information processing device, and program | |
KR20170114458A (ko) | Drone equipped with a stereo camera to explore its surroundings as virtual-reality 3D video, and multifunction VR device interworking with it | |
WO2015013979A1 (zh) | Remote control method and terminal | |
JP6540572B2 (ja) | Display device and display control method | |
CN110825333B (zh) | Display method, device, terminal equipment, and storage medium | |
KR102062127B1 (ko) | System for providing head-mounted goggle image information from a camera drone | |
JP2003312592A (ja) | Remote control system | |
KR20180025416A (ko) | Drone flight control system and method using motion recognition and virtual reality | |
CN108646776B (zh) | Unmanned aerial vehicle-based imaging system and method | |
WO2019206044A1 (zh) | Information processing device, information presentation instruction method, program, and recording medium | |
WO2020209167A1 (ja) | Information processing device, information processing method, and program | |
TWI436270B (zh) | Virtual telescope method and device thereof | |
JP6699944B2 (ja) | Display system | |
KR101623300B1 (ko) | Head mounted display device and method of operating the same | |
JP6929674B2 (ja) | Environmental image display system and environmental image display method | |
US20240196045A1 (en) | Video display system, observation device, information processing method, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13855496 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014546903 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013855496 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2015116420 Country of ref document: RU Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14441138 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112015010280 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112015010280 Country of ref document: BR Kind code of ref document: A2 Effective date: 20150506 |