WO2012033095A1 - Vehicle System
- Publication number
- WO2012033095A1 (application PCT/JP2011/070271)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- virtual image
- virtual
- display
- Prior art date
Links
- 238000003384 imaging method Methods 0.000 claims abstract description 39
- 239000002131 composite material Substances 0.000 claims abstract description 38
- 238000001514 detection method Methods 0.000 claims description 17
- 239000000203 mixture Substances 0.000 claims description 12
- 239000000284 extract Substances 0.000 claims description 4
- 239000003550 marker Substances 0.000 description 29
- 238000000034 method Methods 0.000 description 12
- 238000012545 processing Methods 0.000 description 10
- 230000003190 augmentative effect Effects 0.000 description 8
- 238000012937 correction Methods 0.000 description 8
- 230000008569 process Effects 0.000 description 8
- 230000002194 synthesizing effect Effects 0.000 description 6
- 230000008859 change Effects 0.000 description 4
- 230000009466 transformation Effects 0.000 description 4
- 238000004891 communication Methods 0.000 description 3
- 238000010586 diagram Methods 0.000 description 3
- 230000000007 visual effect Effects 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 2
- 239000011521 glass Substances 0.000 description 2
- 239000004575 stone Substances 0.000 description 2
- 239000005441 aurora Substances 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000009933 burial Methods 0.000 description 1
- 230000006378 damage Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000007667 floating Methods 0.000 description 1
- 238000001454 recorded image Methods 0.000 description 1
- 230000009467 reduction Effects 0.000 description 1
- 230000008439 repair process Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Definitions
- the present invention relates to a vehicle system, and more particularly to an image processing system for a vehicle.
- a device that presents mixed reality senses a real space with an imaging device such as a camera, generates a composite image by superimposing a virtual image on the image (real image) taken by the imaging device, and outputs the composite image; this can give a user who views the composite image a sense of mixed reality in which the real image and the virtual image are fused.
- an imaging device is attached to a head mounted display (HMD), and a virtual image is superimposed on a real image taken by the imaging device in real time to display a composite image.
- the composite image is generated and displayed on the head-mounted display. Therefore, when the user looks around while wearing the head-mounted display, the virtual image is superimposed on the real image of the landscape ahead of the line of sight, making it feel as if the virtual image exists in the real space.
- the present invention has been made in view of the above circumstances, and an object of the present invention is to show a passenger, as scenery visible from a vehicle, a landscape containing buildings and the like without actually constructing those buildings. Another object of the present invention is to provide a vehicle system that can easily change the scenery the passenger sees.
- control device is preferably mounted on the vehicle.
- the control device preferably includes a storage unit that stores the virtual image, an image composition unit that generates a composite image by superimposing the virtual image corresponding to a position onto a predetermined position of the real image, and a display unit that causes the display device to display the composite image.
- the vehicle system of the present invention preferably further includes a GPS receiver provided in the vehicle for receiving a GPS signal from a GPS satellite. In that case, the storage unit stores a plurality of the virtual images, each associated with a specific position in an absolute coordinate system, and the image composition means preferably has coordinate assignment means for associating position information of the absolute coordinate system with specific locations in the real image, and alignment means for superimposing the virtual image on the real image with reference to the position information of the absolute coordinate system.
- the vehicle system of the present invention further includes direction detection means for detecting a direction in which a passenger faces, the display device is a plurality of head-mounted display devices, the imaging device is an omnidirectional camera, The display means extracts a region corresponding to the direction of the occupant detected by the direction detection means from the composite image, and thereby determines a display image to be displayed on each head-mounted display device. It is preferable that the image display means includes an image display means for displaying the display image on the head-mounted display device.
- the direction detecting means is provided in the head-mounted display device.
- FIG. 1 is a perspective view of a vehicle system according to the first embodiment of the present invention.
- the vehicle 1 of this embodiment is a car (bus) that runs on the road at a low speed (about 10 to 20 km / h), and uses an engine, a motor, or the like as a drive source.
- the vehicle 1 is provided with a driver's seat 11 in its front part, and with a plurality of seats 12, on which a plurality of passengers 9 can ride, behind the driver's seat.
- the vehicle 1 has open upper halves on its sides and rear, and a transparent glass 16 in the upper half of its front, so that the passenger 9 seated on the seat 12 can look out over the surroundings in the horizontal direction.
- the vehicle 1 is provided with an imaging device 13 that is placed on the upper surface of the ceiling and captures the surrounding landscape, a display device 3 disposed inside the vehicle 1, a control device 5 that is mounted on the vehicle 1 and controls display on the display device 3, and a GPS receiver 4 that receives a GPS signal from a GPS (Global Positioning System) satellite and measures the position of the vehicle 1.
- the imaging device 13 is configured by an omnidirectional camera 14 and, for example, captures the landscape over a vertical range of 120° (from −60° to +60° with respect to the horizontal) around the entire horizontal circumference.
- the imaging device 13 is mounted and fixed on a base 15 that is magnetically attached to the upper surface of the ceiling of the vehicle 1.
- the imaging device 13 captures a landscape within a certain range centered on the imaging device 13, thereby generating a real image 20.
- the real image 20 captured by the imaging device 13 is given position coordinate information (absolute coordinates, so-called world coordinates) based on the positioning information from the GPS receiver 4, and the information on the scenery around the vehicle 1 imaged by the imaging device 13 is transmitted to the control device 5.
- the control device 5 generates a composite image by superimposing a specific virtual image 21 corresponding to a position onto a predetermined position in the real image 20 taken by the imaging device 13, and displays the generated composite image on the display device 3. It is an apparatus to which so-called mixed reality (including augmented reality and augmented virtuality) technology is applied.
- the control device 5 according to the present embodiment includes a storage unit 51 that stores the virtual image 21, and image deformation means 52 that deforms the virtual image based on the relative position of the vehicle 1 and the virtual image 21 so that it matches the real image 20.
- the control device 5 is constituted by a computer whose main component is a microprocessor.
- the control device 5 of this embodiment is housed in the vehicle 1; in other words, the control device 5 is mounted on the vehicle 1 and is thus configured to move together with the vehicle 1.
- the storage unit 51 is configured by a virtual image memory, and the virtual image 21 and position coordinate (absolute coordinate) information on a position where the virtual image 21 should be superimposed are stored in advance in association with each other.
- the alignment means superimposes the virtual image 21 on the real image 20 on the basis of the position coordinate (absolute coordinate) information stored in the virtual image memory and the position coordinate (absolute coordinate) information given to the real image 20.
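- as a concrete illustration of this alignment and superimposition (a minimal sketch, not code from the patent), the following overlays a virtual image carrying a per-pixel alpha mask onto the real image at the pixel position derived from the coordinate information; the function name `composite_images` and the use of NumPy arrays are assumptions made for illustration only.

```python
import numpy as np

def composite_images(real_image: np.ndarray, virtual_image: np.ndarray,
                     alpha: np.ndarray, top_left: tuple) -> np.ndarray:
    """Overlay `virtual_image` (h x w x 3) onto `real_image` at pixel position
    `top_left` = (row, col), blending with a per-pixel alpha mask
    (h x w, values in [0, 1])."""
    out = real_image.copy()
    row, col = top_left
    h, w = virtual_image.shape[:2]
    roi = out[row:row + h, col:col + w].astype(np.float32)
    blended = alpha[..., None] * virtual_image.astype(np.float32) + (1.0 - alpha[..., None]) * roi
    out[row:row + h, col:col + w] = blended.astype(out.dtype)
    return out
```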
- the virtual image 21 of the present embodiment is a computer graphics image (hereinafter referred to as a CG image) imitating a historical landscape (for example, a city or a castle).
- the position coordinate information given to the virtual image 21 is preferably a plurality of coordinates.
- the image transformation means 52 changes the size / posture of the virtual image 21 stored in the storage unit 51 according to the position of the vehicle 1.
- the image deforming unit 52 includes a virtual image deforming unit 55 that calculates a relative position and a relative angle of the virtual image 21 with respect to the vehicle 1 and deforms the virtual image 21 based on the calculated value.
- the virtual image deformation means 55 calculates the relative distance and relative angle of the virtual image 21 with respect to the vehicle 1 from the position coordinates (absolute coordinates) of the vehicle 1 obtained by the GPS receiver 4 provided in the vehicle 1 and the position coordinates (absolute coordinates) given to the virtual image 21.
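- the relative-position calculation above can be sketched as follows, assuming the GPS positions are WGS-84 latitude/longitude pairs and using a local flat-earth approximation (reasonable over the distances a sightseeing vehicle covers); the function name and the bearing convention (degrees clockwise from north) are illustrative assumptions, not taken from the patent.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def relative_distance_and_bearing(vehicle_latlon, target_latlon):
    """Return (distance in metres, bearing in degrees clockwise from north)
    of a virtual-image anchor point relative to the vehicle, using a local
    flat-earth approximation of the two GPS (lat, lon) positions."""
    lat1, lon1 = map(math.radians, vehicle_latlon)
    lat2, lon2 = map(math.radians, target_latlon)
    east = (lon2 - lon1) * math.cos((lat1 + lat2) / 2.0) * EARTH_RADIUS_M
    north = (lat2 - lat1) * EARTH_RADIUS_M
    return math.hypot(east, north), math.degrees(math.atan2(east, north)) % 360.0
```

the relative angle used for deforming the virtual image 21 would then be this bearing minus the vehicle's heading.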
- the image deformation means 52 also has correction means 56 that, after the size and orientation of the virtual image 21 have been deformed, adds shading to the virtual image 21 and corrects its brightness according to the current time and the brightness of the landscape. In other words, the correction means 56 allows the virtual image 21 to be superimposed on the real image 20 without causing a sense of incongruity.
- the shading/brightness correction by the correction means 56 uses the technique described in Tetsuya Tsunoda, Takeshi Oishi, Katsushi Ikeuchi, "High-speed shadow expression method in mixed reality using a shadow plane", Journal of the Institute of Image Information and Television Engineers, 62(5), May 1, 2008, pp. 788-795.
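- the shadow-plane technique of the cited paper is not reproduced here, but the brightness part of the correction can be approximated by matching the mean luminance of the virtual image to that of the real-image region it will cover, as in the sketch below (an illustrative assumption; the patent does not specify this particular formula).

```python
import numpy as np

def match_brightness(virtual_rgb: np.ndarray, real_region_rgb: np.ndarray) -> np.ndarray:
    """Apply a simple gain so the mean luminance of the virtual image
    matches that of the real-image region it will be overlaid on."""
    weights = np.array([0.299, 0.587, 0.114])  # Rec. 601 luma weights
    luma_virtual = float((virtual_rgb.astype(np.float32) @ weights).mean())
    luma_real = float((real_region_rgb.astype(np.float32) @ weights).mean())
    gain = luma_real / max(luma_virtual, 1e-6)
    return np.clip(virtual_rgb.astype(np.float32) * gain, 0, 255).astype(virtual_rgb.dtype)
```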
- the virtual image 21 deformed by the virtual image deformation means 55 is sent to the image composition means 53.
- the image display means 58 causes each head-mounted display device 31 to display the composite image determined by the display image determination means 57.
- when the image display means 58 receives the image information from the display image determination means 57, it displays the composite image on each head-mounted display device 31 arranged in the vehicle 1.
- FIG. 4 is a flowchart showing an example of the operation of this vehicle system.
- superimposed drawing of the virtual image 21 onto the real image 20 is then performed (S6).
- the control device 5 acquires the posture information of each display device 3 from the direction detection means 32 (S7), calculates the field-of-view area of the passenger 9 from this information, and determines the display area corresponding to this field-of-view area (S8); a code sketch of this view-area extraction is given after the flowchart description below.
- the control device 5 causes each head-mounted display device 31 to display the image determined by the display image determining means 57 (S9).
- the control device 5 determines the presence / absence of a process end signal (S10). If the process end signal is not received, the control device 5 returns to the process of step S1 and repeats the processes of steps S1 to S10.
- if the process end signal is received, the imaging by the imaging device 13 is ended, and the processing of the control device 5 is also ended (S11).
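- the view-area extraction of steps S7-S8 can be illustrated with the following sketch, which assumes the composite image is stored as an equirectangular (360°) panorama and that the direction detection means 32 reports the passenger's yaw angle in degrees; the horizontal field of view of the head-mounted display device is an assumed parameter and the function name is illustrative only.

```python
import numpy as np

def extract_view(panorama: np.ndarray, yaw_deg: float, fov_deg: float = 90.0) -> np.ndarray:
    """Crop the region of an equirectangular composite image that
    corresponds to the passenger's viewing direction (yaw only)."""
    width = panorama.shape[1]
    view_width = int(width * fov_deg / 360.0)
    # Column at the centre of the passenger's view (wraps around at 360 degrees)
    centre = int((yaw_deg % 360.0) / 360.0 * width)
    cols = np.arange(centre - view_width // 2, centre + view_width // 2) % width
    return panorama[:, cols]
```

a full implementation would also use the pitch reported by the direction detection means 32 to crop vertically, but the horizontal case is enough to show how a display image is determined per passenger.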
- the vehicle system having such a configuration continuously changes the image displayed on the display device 3 in accordance with the movement of the traveling vehicle 1 and the movement of the line of sight of the passenger 9, so that the virtual image 21 can be presented as if it actually existed in the real landscape. This can give the passenger 9 a stronger sense of realism than conventional mixed reality in which only the field of view is moved from a fixed place. Moreover, since the virtual image 21 is updated with respect to a viewpoint motion that combines the movement of the vehicle 1 and the free change of the passenger 9's field of view, an even greater sense of realism can be given.
- since the imaging device 13 is configured by the omnidirectional camera 14 and the display area is determined after a composite image of the entire surroundings of the vehicle 1 has been generated, no CCD camera is required for each head-mounted display device 31 even when head-mounted display devices 31 are used, which allows a significant cost reduction. While the traveling speed of the vehicle 1 is slow and almost constant, the movement speed of the passenger 9's field of view is not constant; because the display area shown on each head-mounted display device 31 is simply extracted from the already generated composite image, burdensome processing can be reduced as much as possible. In other words, attaching a CCD camera to each head-mounted display device 31 would require high-speed superimposition processing for every display device 3, and this high processing burden can be avoided.
- since the image displayed by the display device 3 is the virtual image 21 superimposed on the real image 20 obtained by capturing the actual landscape, the surrounding environment such as the weather and brightness is reflected as it is. Thereby, the sense of presence is further increased compared with a case where a previously recorded image is simply played back according to the movement of the vehicle 1.
- Examples of the contents of the virtual image 21 include the following.
- the display by the display apparatus 3 can also be utilized as a video guide.
- so-called digital signage (electronic advertising) that displays a company name in front of a building, or places a specific company name, product name, or brand name on an advertising balloon floating in the air, can also be used as the virtual image 21.
- Embodiment 2 will be described with reference to FIG.
- since this embodiment is the same as Embodiment 1 for the most part, the same components are denoted by the same reference numerals and their description is omitted.
- the vehicle system of the present embodiment is a vehicle image processing system that is mounted on and used in the vehicle 1, as in the first embodiment.
- the vehicle system of this embodiment includes an imaging device 13, a display device 3, and a control device 5.
- the imaging device 13 may form a projection surface over all directions, or may form a projection surface only in a predetermined region.
- the real image 20 is a projection of the scenery outside the vehicle 1 on the projection plane.
- the real image 20 is configured by a two-dimensional plane on which a three-dimensional object composed of a landscape outside the vehicle 1 is projected.
- the imaging device 13 receives a GPS signal from the GPS receiver 4.
- the imaging device 13 outputs the image data of the real image 20 and the GPS signal to the image composition unit 53 of the control device 5.
- the control device 5 includes a vehicle position recognizing unit 61, an image deforming unit 52, a storage unit 51, an image synthesizing unit 53, and a display unit 54.
- the control device 5 is mounted on the vehicle 1.
- the vehicle position recognition means 61 receives the GPS signal output from the GPS receiver 4.
- the vehicle position recognition means 61 recognizes the current position of the vehicle 1 in the absolute coordinate system (so-called world coordinate system) based on the GPS signal.
- the vehicle position recognizing means 61 outputs the recognition information of the vehicle 1 position to the image deforming means 52.
- the image deforming unit 52 includes a virtual image obtaining unit 63, a correcting unit 56, and a virtual image deforming unit 55. Since the correction means 56 is the same as that of the first embodiment, description thereof is omitted.
- the virtual image deformation means 55 calculates the distance and relative angle of the virtual image 21 with respect to the vehicle 1 from the specific position associated with each virtual image 21 and the position of the vehicle 1.
- the virtual image deformation means 55 converts the virtual image 21 based on the calculated information.
- the relative angle is an angle with respect to a reference, for example, the angle with respect to an axis of a coordinate system having the vehicle 1 as its origin.
- the storage unit 51 and the virtual image deformation means 55 may be as follows.
- the storage unit 51 stores a plurality of three-dimensional virtual objects.
- the virtual image deforming unit 55 rotates the three-dimensional virtual object in the local coordinate system from the distance between the vehicle 1 and the virtual image 21 and the relative angle, thereby generating the virtual image 21.
- that is, the virtual image deformation means 55 converts the virtual image 21 stored in the storage unit 51 into an image to be superimposed on the real image 20, based on the distance and relative angle between the vehicle 1 and the virtual image 21.
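- as an illustration of how a stored three-dimensional virtual object might be converted into a virtual image seen from the vehicle, the sketch below rotates the object's vertices by the relative angle, places them at the relative distance in front of a pinhole camera, and projects them to image-plane coordinates; the focal length, axis conventions, and function name are assumptions made for this example only.

```python
import numpy as np

def project_virtual_object(vertices: np.ndarray, distance_m: float,
                           relative_angle_deg: float, focal_px: float = 800.0) -> np.ndarray:
    """Rotate a 3D virtual object (N x 3 vertices, metres, local coordinates)
    about the vertical axis by the relative angle, translate it `distance_m`
    along the camera's viewing axis (+Z), and project it with a pinhole model.
    Returns N x 2 image-plane coordinates in pixels, origin at the image centre."""
    a = np.radians(relative_angle_deg)
    rot_y = np.array([[ np.cos(a), 0.0, np.sin(a)],
                      [ 0.0,       1.0, 0.0      ],
                      [-np.sin(a), 0.0, np.cos(a)]])
    cam = vertices @ rot_y.T + np.array([0.0, 0.0, distance_m])
    z = np.clip(cam[:, 2], 1e-6, None)  # avoid division by zero for points behind the camera
    return np.stack([focal_px * cam[:, 0] / z, focal_px * cam[:, 1] / z], axis=1)
```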
- the image composition unit 53 includes a coordinate assignment unit 59 and an alignment unit 60.
- the coordinate assigning means 59 associates the coordinate system (so-called screen coordinate system) in the real image 20 with the world coordinate system from the image data of the real image 20 input from the imaging device 13 and the GPS signal.
- the coordinate assigning means 59 converts the screen coordinate system into the world coordinate system.
- the coordinate assignment unit 59 associates position information in the world coordinate system with a specific location in the real image 20.
- the coordinate assigning unit 59 outputs a signal in which position information in the world coordinate system is associated with a specific location in the real image 20 to the alignment unit 60.
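- the association of world-coordinate positions with locations in the real image 20 can be sketched as follows for an equirectangular omnidirectional real image, where image columns are assumed to map linearly to bearing around the vehicle; this linear mapping and the function name are illustrative assumptions rather than the patent's stated method.

```python
def world_bearing_to_panorama_column(bearing_deg: float, vehicle_heading_deg: float,
                                     image_width: int) -> int:
    """Map the absolute bearing of a world-coordinate point (e.g. computed from
    GPS positions as in the earlier sketch) to a horizontal pixel column of an
    equirectangular real image whose centre column faces the vehicle's heading."""
    relative = (bearing_deg - vehicle_heading_deg + 180.0) % 360.0  # 180 = straight ahead
    return int(relative / 360.0 * image_width) % image_width
```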
- the display unit 54 includes a display image determination unit 57 and an image display unit 58.
- the display image determination unit 57 receives the signal output from the direction detection unit 32 of the head-mounted display device 31. Further, the display image determination unit 57 receives the signal output from the alignment unit 60. The display image determination unit 57 calculates the passenger's field of view based on the signal from the direction detection unit 32. The display image determination unit 57 extracts a portion of the composite image corresponding to the view field area, and thereby determines an image (display image) to be output to the head-mounted display device 31.
- the display image determination means 57 performs processing for each head-mounted display device 31, calculates a different field of view, extracts an image corresponding to the field of view, and determines a display image.
- the display image determination unit 57 outputs display image data to the image display unit 58.
- the image display means 58 causes each head-mounted display device 31 to display based on the composite image data output by the display image determination means 57.
- the image display unit 58 outputs data for displaying the composite image on the display device 3.
- Embodiment 3 will be described.
- since this embodiment is the same as Embodiment 1 for the most part, description of the common parts is omitted.
- the vehicle system of the present embodiment is a vehicle image processing system that is mounted on and used in the vehicle 1, as in the first embodiment.
- the vehicle system of this embodiment includes an imaging device 13, a display device 3, and a control device 5.
- the configurations of the imaging device 13 and the display device 3 are the same as those of the first embodiment.
- the control device 5 is a device to which marker recognition type mixed reality (including augmented reality and augmented virtual feeling) technology is applied.
- the control device 5 includes a marker recognizing unit 62, an image deforming unit 52, a storage unit 51, an image synthesizing unit 53, and a display unit 54.
- the control device 5 generates a composite image by superimposing a virtual image 21 corresponding to the position on a predetermined position of the real image 20 and causes the display device 3 to display the composite image.
- the control device 5 is mounted on the vehicle 1.
- the marker includes a first recognition unit formed in a square frame shape in plan view and a second recognition unit formed inside the first recognition unit.
- the first recognition unit is formed with a constant width over the entire circumference, and is formed with a black frame.
- the second recognition unit is composed of different marks for each virtual image 21.
- the second recognition unit is formed inside the black frame that serves as the first recognition unit.
- the marker recognizing means 62 recognizes the presence of the marker in the actual image 20 generated by the imaging device 13.
- the marker recognizing means 62 detects the first recognizing part of the marker and thereby recognizes the presence of the marker.
- the marker recognizing means 62 detects the second recognizing unit, collates it with the mark stored in the marker memory 511 of the storage unit 51, and recognizes the second recognizing unit.
- the marker recognition means 62 recognizes the size and angle of the marker based on the shapes projected on the projection planes of the first recognition unit and the second recognition unit.
- the marker recognizing means 62 outputs the marker position information in the screen coordinate system of the real image 20 to the image synthesizing means 53. Further, the marker recognizing unit 62 outputs the information of the second recognizing unit collated with the storage unit 51 and the size and angle of the marker to the image deforming unit 52.
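- a common way to implement this kind of marker recognition (finding the square black frame, then estimating the marker's apparent size and in-plane angle) is with standard contour operations, as in the OpenCV-based sketch below; this is an illustration of the general approach, not code from the patent, and the matching of the second recognition unit against the marker memory 511 is left out.

```python
import cv2
import numpy as np

def detect_square_markers(real_image_bgr: np.ndarray) -> list:
    """Find candidate square-frame markers in the real image and return, for
    each, its four corners, centre, apparent size (bounding-rectangle area in
    pixels) and in-plane angle estimated from the minimum-area rectangle."""
    gray = cv2.cvtColor(real_image_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    markers = []
    for contour in contours:
        approx = cv2.approxPolyDP(contour, 0.05 * cv2.arcLength(contour, True), True)
        if len(approx) == 4 and cv2.isContourConvex(approx) and cv2.contourArea(approx) > 500:
            (cx, cy), (w, h), angle = cv2.minAreaRect(approx)
            markers.append({"corners": approx.reshape(4, 2), "center": (cx, cy),
                            "size": w * h, "angle": angle})
    return markers
```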
- the image deformation unit 52 includes a virtual image acquisition unit 63, a correction unit 56, and a virtual image deformation unit 55.
- the virtual image acquisition unit 63 acquires the virtual image 21 from the virtual image memory 510 based on the information input by the marker recognition unit 62.
- the display means 54 displays the composite image on the display device 3.
- the display unit 54 includes a display image determination unit 57 and an image display unit 58 as in the second embodiment. Since the display means 54 is the same as the configuration of the second embodiment, description thereof is omitted.
- the marker may be a specific three-dimensional object existing outside the vehicle 1.
- the three-dimensional object includes a stone, a stele, a plant, a building, etc. having a specific shape.
- the marker recognizing means 62 recognizes a plurality of feature points on the marker and thereby recognizes a specific marker. Examples of the feature points on the marker include a corner portion and a straight line portion of the marker.
- the vehicle 1 according to the first to third embodiments is configured by an automobile capable of boarding a plurality of passengers.
- the vehicle according to the present invention is not limited to an automobile; it may be, for example, a train in which a plurality of cars are connected, or a light vehicle such as a bicycle on which a single passenger rides.
- the display device of this invention may communicate wirelessly.
- the control device 5 may be provided outside the vehicle 1.
- the control device 5, the imaging device 13, and the display device 3 are provided with a transmission / reception unit for wireless communication.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
11 Driver's seat
12 Seat
13 Imaging device
14 Omnidirectional camera
15 Base
16 Glass
20 Real image
21 Virtual image
3 Display device
31 Head-mounted display device
32 Direction detection means
4 GPS device
5 Control device
51 Storage unit
52 Image deformation means
53 Image composition means
54 Display means
55 Position/orientation calculation means
56 Correction means
57 Display image determination means
58 Image display means
9 Passenger
Claims (8)
- [Claim 1] A vehicle system used in a vehicle, comprising: an imaging device that captures scenery from the vehicle and thereby generates a real image; a display device arranged inside the vehicle; and a control device that generates a composite image by superimposing, on a predetermined position of the real image, a virtual image corresponding to that position, and causes the display device to display the composite image.
- [Claim 2] The vehicle system according to claim 1, wherein the control device is mounted on the vehicle.
- [Claim 3] The vehicle system according to claim 1 or 2, wherein the control device comprises: a storage unit that stores the virtual image; image composition means that superimposes, on a predetermined position of the real image, the virtual image corresponding to that position and thereby generates the composite image; and display means that causes the display device to display the composite image.
- [Claim 4] The vehicle system according to claim 3, further comprising a GPS receiver provided in the vehicle for receiving a GPS signal from a GPS satellite, wherein the storage unit stores a plurality of the virtual images each associated with a specific position in an absolute coordinate system, and the image composition means has: coordinate assignment means that associates position information of the absolute coordinate system with a specific location in the real image; and alignment means that superimposes the real image and the virtual image with reference to the position information of the absolute coordinate system.
- [Claim 5] The vehicle system according to claim 4, wherein the control device comprises: vehicle position recognition means that recognizes the current position of the vehicle in the absolute coordinate system based on the GPS receiver; and virtual image deformation means that calculates the distance and relative angle of the virtual image with respect to the vehicle from the specific position associated with each virtual image and the position of the vehicle, and converts the virtual image based thereon; and wherein the virtual image superimposed on the real image by the alignment means is the image generated by the virtual image deformation means.
- [Claim 6] The vehicle system according to claim 5, further comprising direction detection means that detects a direction in which a passenger faces, wherein the display device is a plurality of head-mounted display devices, the imaging device is an omnidirectional camera, and the display means has: display image determination means that extracts, from the composite image, a region corresponding to the passenger's direction detected by the direction detection means and thereby determines a display image to be displayed on each head-mounted display device; and image display means that causes the head-mounted display device to display the display image.
- [Claim 7] The vehicle system according to claim 6, wherein the direction detection means is provided in the head-mounted display device.
- [Claim 8] The vehicle system according to any one of claims 1 to 7, wherein the virtual image is a computer graphics image imitating a landscape of an era different from the present.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012532984A JP5804571B2 (ja) | 2010-09-06 | 2011-09-06 | 車両システム |
CN2011800423551A CN103080983A (zh) | 2010-09-06 | 2011-09-06 | 车辆系统 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010199146 | 2010-09-06 | ||
JP2010-199146 | 2010-09-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012033095A1 true WO2012033095A1 (ja) | 2012-03-15 |
Family
ID=45810693
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/070271 WO2012033095A1 (ja) | 2010-09-06 | 2011-09-06 | 車両システム |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP5804571B2 (ja) |
CN (1) | CN103080983A (ja) |
WO (1) | WO2012033095A1 (ja) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013246319A (ja) * | 2012-05-25 | 2013-12-09 | Fuji Television Network Inc | 画像表示装置及び画像表示方法 |
JP2014048864A (ja) * | 2012-08-31 | 2014-03-17 | Konami Digital Entertainment Co Ltd | 表示制御システム、ゲームシステム、表示制御システムの制御方法、表示制御装置、表示制御装置の制御方法、及びプログラム |
CN104102007A (zh) * | 2013-04-12 | 2014-10-15 | 聚晶半导体股份有限公司 | 头戴式显示器及其控制方法 |
JP2016110245A (ja) * | 2014-12-03 | 2016-06-20 | 株式会社T.J.Promotion | 表示システム、表示方法、コンピュータプログラム、コンピュータが読み取り可能な記憶媒体 |
CN105913772A (zh) * | 2016-05-27 | 2016-08-31 | 大连楼兰科技股份有限公司 | 车联网虚拟现实主题公园展示系统及方法 |
CN106057088A (zh) * | 2016-05-27 | 2016-10-26 | 大连楼兰科技股份有限公司 | 车联网虚拟现实主题公园展示方法 |
JP2017532825A (ja) * | 2014-08-18 | 2017-11-02 | ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー | 拡張現実及び仮想現実画像を生成するためのシステム及び方法 |
US10083546B2 (en) | 2016-04-11 | 2018-09-25 | Fujitsu Ten Limited | Augmented reality information displaying device and augmented reality information displaying method |
KR20180128606A (ko) * | 2017-05-24 | 2018-12-04 | (주)루쏘팩토리 | 이동식 가상현실 체험 시스템 |
JP2018195302A (ja) * | 2017-05-18 | 2018-12-06 | 有限会社一級建築士事務所ターボ設計 | 仮想オブジェクトの表示システムを用いた顧客把握システム、顧客把握システムプログラム及び顧客把握方法 |
WO2018230563A1 (ja) * | 2017-06-16 | 2018-12-20 | 本田技研工業株式会社 | 車両の画像提供システム、サーバーシステムおよび車両の画像提供方法 |
JP2019145100A (ja) * | 2018-02-22 | 2019-08-29 | 株式会社ジブンハウス | 不動産情報出力装置、不動産情報出力方法及び不動産情報出力プログラム |
JP2020129356A (ja) * | 2019-02-07 | 2020-08-27 | 株式会社メルカリ | プログラム、情報処理方法、及び情報処理端末 |
JP2021092839A (ja) * | 2019-12-06 | 2021-06-17 | トヨタ自動車株式会社 | 表示システム |
JP2021092802A (ja) * | 2013-02-22 | 2021-06-17 | ソニーグループ株式会社 | 情報処理装置、制御方法及びプログラム |
CN114648740A (zh) * | 2020-12-21 | 2022-06-21 | 丰田自动车株式会社 | 显示系统以及显示装置 |
US11670054B2 (en) | 2016-05-05 | 2023-06-06 | Universal City Studios Llc | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106707504A (zh) * | 2015-07-30 | 2017-05-24 | 比亚迪股份有限公司 | 用于车辆的hud显示装置及具有其的车辆 |
KR101885128B1 (ko) * | 2016-03-11 | 2018-08-03 | 주식회사 상화 | 가상현실 체험장치 |
US10366290B2 (en) * | 2016-05-11 | 2019-07-30 | Baidu Usa Llc | System and method for providing augmented virtual reality content in autonomous vehicles |
CN106096501A (zh) * | 2016-05-27 | 2016-11-09 | 大连楼兰科技股份有限公司 | 车联网虚拟现实全景回放平台 |
CN106096502A (zh) * | 2016-05-27 | 2016-11-09 | 大连楼兰科技股份有限公司 | 车联网虚拟现实全景回放系统及方法 |
CN106067877A (zh) * | 2016-05-27 | 2016-11-02 | 大连楼兰科技股份有限公司 | 车联网虚拟现实全景回放方法 |
KR101813018B1 (ko) * | 2016-12-23 | 2017-12-29 | 재단법인대구경북과학기술원 | 차량과 연계된 3d 콘텐츠 제공 장치 및 그 방법 |
CN111566706A (zh) * | 2017-12-26 | 2020-08-21 | 株式会社音乐馆 | 图像生成系统、图像生成方法及程序 |
EP3663942B1 (en) * | 2018-12-07 | 2023-04-26 | Volvo Car Corporation | Evaluation of a simulated vehicle functionality feature |
CN109739352A (zh) * | 2018-12-27 | 2019-05-10 | 斑马网络技术有限公司 | 景观图像的展示方法及设备 |
CN110337018A (zh) * | 2019-07-05 | 2019-10-15 | 南京恩诺网络科技有限公司 | 信息处理系统 |
CN112905005A (zh) * | 2021-01-22 | 2021-06-04 | 领悦数字信息技术有限公司 | 用于车辆的自适应显示方法、装置及存储介质 |
WO2024065799A1 (en) * | 2022-09-30 | 2024-04-04 | Intel Corporation | Vehicle passenger display modification |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001076168A (ja) * | 1999-09-02 | 2001-03-23 | Ntt Docomo Inc | 情報端末装置、データベースサーバ装置、画像表示システムおよびそれらの制御方法 |
WO2007052458A1 (ja) * | 2005-11-01 | 2007-05-10 | Matsushita Electric Industrial Co., Ltd. | 情報表示装置 |
JP2007226580A (ja) * | 2006-02-24 | 2007-09-06 | Advanced Telecommunication Research Institute International | 画像出力装置、及び画像出力方法 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4689639B2 (ja) * | 2007-04-25 | 2011-05-25 | キヤノン株式会社 | 画像処理システム |
-
2011
- 2011-09-06 JP JP2012532984A patent/JP5804571B2/ja active Active
- 2011-09-06 WO PCT/JP2011/070271 patent/WO2012033095A1/ja active Application Filing
- 2011-09-06 CN CN2011800423551A patent/CN103080983A/zh active Pending
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013246319A (ja) * | 2012-05-25 | 2013-12-09 | Fuji Television Network Inc | 画像表示装置及び画像表示方法 |
JP2014048864A (ja) * | 2012-08-31 | 2014-03-17 | Konami Digital Entertainment Co Ltd | 表示制御システム、ゲームシステム、表示制御システムの制御方法、表示制御装置、表示制御装置の制御方法、及びプログラム |
US12130442B2 (en) | 2013-02-22 | 2024-10-29 | Sony Corporation | Information processing device that displays a virtual object relative to real space |
JP7268692B2 (ja) | 2013-02-22 | 2023-05-08 | ソニーグループ株式会社 | 情報処理装置、制御方法及びプログラム |
JP2021092802A (ja) * | 2013-02-22 | 2021-06-17 | ソニーグループ株式会社 | 情報処理装置、制御方法及びプログラム |
US11513353B2 (en) | 2013-02-22 | 2022-11-29 | Sony Corporation | Information processing device that displays a virtual object relative to real space |
US11885971B2 (en) | 2013-02-22 | 2024-01-30 | Sony Corporation | Information processing device that displays a virtual object relative to real space |
CN104102007A (zh) * | 2013-04-12 | 2014-10-15 | 聚晶半导体股份有限公司 | 头戴式显示器及其控制方法 |
US10606348B2 (en) | 2014-08-18 | 2020-03-31 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
KR102718166B1 (ko) * | 2014-08-18 | 2024-10-15 | 유니버셜 시티 스튜디오스 엘엘씨 | 증강 및 가상 현실 이미지를 생성하는 시스템 및 방법 |
US10241568B2 (en) | 2014-08-18 | 2019-03-26 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
JP7454544B2 (ja) | 2014-08-18 | 2024-03-22 | ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー | 拡張現実及び仮想現実画像を生成するためのシステム及び方法 |
JP2017532825A (ja) * | 2014-08-18 | 2017-11-02 | ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー | 拡張現実及び仮想現実画像を生成するためのシステム及び方法 |
JP2019166405A (ja) * | 2014-08-18 | 2019-10-03 | ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー | 拡張現実及び仮想現実画像を生成するためのシステム及び方法 |
KR20230071200A (ko) * | 2014-08-18 | 2023-05-23 | 유니버셜 시티 스튜디오스 엘엘씨 | 증강 및 가상 현실 이미지를 생성하는 시스템 및 방법 |
US11586277B2 (en) | 2014-08-18 | 2023-02-21 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
JP6995799B2 (ja) | 2014-08-18 | 2022-01-17 | ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー | 拡張現実及び仮想現実画像を生成するためのシステム及び方法 |
JP2022036116A (ja) * | 2014-08-18 | 2022-03-04 | ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー | 拡張現実及び仮想現実画像を生成するためのシステム及び方法 |
JP2016110245A (ja) * | 2014-12-03 | 2016-06-20 | 株式会社T.J.Promotion | 表示システム、表示方法、コンピュータプログラム、コンピュータが読み取り可能な記憶媒体 |
US10083546B2 (en) | 2016-04-11 | 2018-09-25 | Fujitsu Ten Limited | Augmented reality information displaying device and augmented reality information displaying method |
US11670054B2 (en) | 2016-05-05 | 2023-06-06 | Universal City Studios Llc | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
CN106057088A (zh) * | 2016-05-27 | 2016-10-26 | 大连楼兰科技股份有限公司 | 车联网虚拟现实主题公园展示方法 |
CN105913772A (zh) * | 2016-05-27 | 2016-08-31 | 大连楼兰科技股份有限公司 | 车联网虚拟现实主题公园展示系统及方法 |
JP2018195302A (ja) * | 2017-05-18 | 2018-12-06 | 有限会社一級建築士事務所ターボ設計 | 仮想オブジェクトの表示システムを用いた顧客把握システム、顧客把握システムプログラム及び顧客把握方法 |
KR20180128606A (ko) * | 2017-05-24 | 2018-12-04 | (주)루쏘팩토리 | 이동식 가상현실 체험 시스템 |
KR101996008B1 (ko) * | 2017-05-24 | 2019-07-03 | (주)루쏘팩토리 | 이동식 가상현실 체험 시스템 |
US11397322B2 (en) | 2017-06-16 | 2022-07-26 | Honda Motor Co., Ltd. | Image providing system for vehicle, server system, and image providing method for vehicle |
WO2018230563A1 (ja) * | 2017-06-16 | 2018-12-20 | 本田技研工業株式会社 | 車両の画像提供システム、サーバーシステムおよび車両の画像提供方法 |
JP7125063B2 (ja) | 2018-02-22 | 2022-08-24 | Jibun Haus.株式会社 | 不動産情報出力装置、不動産情報出力方法及び不動産情報出力プログラム |
JP2019145100A (ja) * | 2018-02-22 | 2019-08-29 | 株式会社ジブンハウス | 不動産情報出力装置、不動産情報出力方法及び不動産情報出力プログラム |
JP2020129356A (ja) * | 2019-02-07 | 2020-08-27 | 株式会社メルカリ | プログラム、情報処理方法、及び情報処理端末 |
JP7384014B2 (ja) | 2019-12-06 | 2023-11-21 | トヨタ自動車株式会社 | 表示システム |
JP2021092839A (ja) * | 2019-12-06 | 2021-06-17 | トヨタ自動車株式会社 | 表示システム |
US11590902B2 (en) | 2019-12-06 | 2023-02-28 | Toyota Jidosha Kabushiki Kaisha | Vehicle display system for displaying surrounding event information |
JP7372230B2 (ja) | 2020-12-21 | 2023-10-31 | トヨタ自動車株式会社 | 表示システム及び表示装置 |
CN114648740A (zh) * | 2020-12-21 | 2022-06-21 | 丰田自动车株式会社 | 显示系统以及显示装置 |
JP2022097826A (ja) * | 2020-12-21 | 2022-07-01 | トヨタ自動車株式会社 | 表示システム及び表示装置 |
Also Published As
Publication number | Publication date |
---|---|
CN103080983A (zh) | 2013-05-01 |
JPWO2012033095A1 (ja) | 2014-01-20 |
JP5804571B2 (ja) | 2015-11-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5804571B2 (ja) | 車両システム | |
US10582166B2 (en) | Method of tracking a mobile device and method of generating a geometrical model of a real environment using a camera of a mobile device | |
JP7571829B2 (ja) | 情報処理装置、情報処理方法、プログラム、および移動体 | |
US8295644B2 (en) | Birds eye view virtual imaging for real time composited wide field of view | |
US10029700B2 (en) | Infotainment system with head-up display for symbol projection | |
CN104781873B (zh) | 图像显示装置、图像显示方法、移动装置、图像显示系统 | |
WO2006035755A1 (ja) | 移動体ナビゲート情報表示方法および移動体ナビゲート情報表示装置 | |
WO2012169355A1 (ja) | 画像生成装置 | |
JP2009101718A (ja) | 映像表示装置及び映像表示方法 | |
CN104303211A (zh) | 用于在车辆显示器上整合虚拟对象的方法 | |
WO2011136209A1 (ja) | 観覧車 | |
WO2016102304A1 (en) | Method for presenting an image overlay element in an image with 3d information, driver assistance system and motor vehicle | |
CN102291541A (zh) | 一种车辆虚拟合成显示系统 | |
WO2018134897A1 (ja) | 位置姿勢検出装置、ar表示装置、位置姿勢検出方法およびar表示方法 | |
JP3301421B2 (ja) | 車両周囲状況提示装置 | |
JP2016095688A (ja) | 車載用情報表示装置 | |
JP2010128133A (ja) | 移動式情報重畳システム及び情報重畳方法 | |
CN109643468B (zh) | 图像处理装置和图像处理方法 | |
WO2004048895A1 (ja) | 移動体ナビゲート情報表示方法および移動体ナビゲート情報表示装置 | |
WO2021172037A1 (ja) | 画像処理装置、画像処理方法、プログラム、および画像提示システム | |
JP2008033781A (ja) | 路面勾配検出装置及び画像表示装置 | |
JP2013200820A (ja) | 画像送受信システム | |
Sridhar et al. | Generation of virtual display surfaces for in-vehicle contextual augmented reality | |
JP2015184804A (ja) | 画像表示装置、画像表示方法及び画像表示プログラム | |
JP2021148906A (ja) | 車両表示システムおよび表示装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180042355.1 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11823570 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2012532984 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11823570 Country of ref document: EP Kind code of ref document: A1 |