WO2023037954A1 - VR test system and VR test method - Google Patents

VR test system and VR test method

Info

Publication number
WO2023037954A1
Authority
WO
WIPO (PCT)
Prior art keywords
right images
display device
camera
image
stereo camera
Prior art date
Application number
PCT/JP2022/032942
Other languages
French (fr)
Japanese (ja)
Inventor
利道 高橋
正夫 中川
Original Assignee
株式会社明電舎
独立行政法人自動車技術総合機構
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社明電舎, 独立行政法人自動車技術総合機構 filed Critical 株式会社明電舎
Publication of WO2023037954A1 publication Critical patent/WO2023037954A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details

Definitions

  • the present invention relates to a VR test system and a VR test method.
  • Patent Document 1, which is an example of conventional technology, discloses a vehicle inspection system that inspects a vehicle performing travel control based on information on the external environment in a predetermined direction detected by two monocular cameras.
  • Patent Document 2, which is another example of conventional technology, discloses an image processing apparatus for a vehicle-mounted stereo camera image processing device having a pair of image pickup devices; the apparatus performs diagnosis by comparing a diagnostic parallax image with a preset reference parallax image. In Patent Document 2, processing is performed on the measurement side, not on the display side.
  • the stereo camera performs binocular stereoscopic vision
  • the parallax between the left and right images is important.
  • the present invention has been made in view of the above, and an object of the present invention is to provide an arbitrary parallax to the left and right images that are the shooting targets of the stereo camera.
  • a driving environment simulation device that outputs images of a sensor model as left and right images independent of each other; a display device that displays each of the left and right images; and a stereo camera that is installed in a darkroom together with the display device and has left and right cameras, the left camera photographing the left image and the right camera photographing the right image of the left and right images displayed on the display device, independently of each other.
  • the driving environment simulation device is a stereo camera VR test system that includes an image processing unit that generates the left and right images independent of each other so as to have equivalent angles of view.
  • the left and right images have equivalent parallax by using a virtual angle of view for the sensor model.
  • the left and right images are generated so as to have an arbitrary equivalent parallax from the image coordinates on the display device.
  • one aspect of the present invention that solves the above problems and achieves the object is a VR test method for a stereo camera that includes: simulating a driving environment by outputting images of a sensor model as left and right images independent of each other; displaying each of the left and right images; and independently capturing the left image of the displayed left and right images using the left camera of the stereo camera and the right image using the right camera of the stereo camera.
  • in the simulation of the driving environment, the left and right images are generated independently of each other so as to have equivalent angles of view.
  • an arbitrary parallax can be given to the left and right images that are the imaging targets of the stereo camera.
  • FIG. 1 is a diagram showing a VR (virtual reality) test system in an embodiment.
  • FIG. 2 is a diagram showing left and right images displayed on the display device.
  • FIG. 1 is a diagram showing a VR (virtual reality) test system 1 in this embodiment.
  • a VR test system 1 shown in FIG. 1 includes a driving environment simulation device 10 and a sensor simulation system 20 .
  • the driving environment simulation device 10 includes simulation software having a scenario, a vehicle model, a driver model, a sensor model, and the like, and outputs images of the sensor model as left and right images (a left image and a right image) to the display device 201 of the sensor simulation system 20.
  • the simulator used to generate the left and right images may be simulation software having a sensor model (camera model).
  • the driving environment simulation device 10 also includes an image processing unit 101 . As will be described later, the image processing unit 101 performs processing to give an arbitrary parallax to the left and right images that are the targets of imaging by the stereo camera.
  • the left and right images output from the driving environment simulation device 10 to the display device 201 are left and right images given arbitrary parallax by the image processing unit 101 .
  • the sensor simulation system 20 includes a darkroom 200, a display device 201, and a stereo camera 202.
  • the darkroom 200 is a box-shaped structure provided so that surrounding structures and the like, which are disturbances, are not reflected on the display device 201 .
  • assuming that the stereo camera 202 to be evaluated is mounted on a vehicle, it is necessary to consider the feasible size of the display device 201, which is a monitor, and the distance at which the display device 201 is arranged. Since such tests are often conducted indoors, a compact installation is required. Also, in such tests, glare due to disturbances should be eliminated. Therefore, the display device 201 and the stereo camera 202 are installed inside the darkroom 200.
  • the display device 201 displays left and right images, which are images of the sensor model output from the driving environment simulation device 10 and given arbitrary parallax.
  • a 4K display can be exemplified as the display device 201 .
  • the left and right images displayed on the display device 201 include left and right images independent of each other.
  • the stereo camera 202 has a left camera 202L and a right camera 202R, and shoots left and right images independently.
  • the left camera 202L captures the left image of the left and right images
  • the right camera 202R captures the right image of the left and right images.
  • the stereo camera 202 is controlled from the outside of the sensor simulation system 20 and sends an image acquired by photography to the outside of the sensor simulation system 20 .
  • the driving environment simulation device 10 outputs the image processed by the image processing section 101 to the display device 201 .
  • the left and right images output from the driving environment simulation device 10 to the display device 201 are given an arbitrary parallax and are independent left and right images.
  • the stereo camera 202 captures left and right images displayed on the display device 201 .
  • the arrangement and generation of the left and right images on the screen of the display device 201 for providing accurate parallax to the stereo camera 202 will be described below.
  • the range of the left and right angles of view on the display screen of the display device 201 is calculated and specified from the specifications of the stereo camera 202 (baseline length, angle of view, resolution, etc.) and the distance between the stereo camera 202 and the display device 201.
  • the left and right images displayed on the screen of the display device 201 are calculated and generated so as to avoid the intersection range of the left and right angles of view.
  • FIG. 2 is a diagram showing the left and right images displayed on the display device 201.
  • the coordinate calculations required to display independent left and right images on the display device 201, as shown in FIG. 2, are described below. Examples of parameters for the display device 201 and the stereo camera 202 are shown in Table 1.
  • each parameter in the coordinate calculation is shown in FIG.
  • the photographing ranges of the left and right cameras of stereo camera 202 are calculated from the relationship between display device 201 and stereo camera 202 .
  • the position and size of the image to be displayed, and its angle of view are determined as follows.
  • each parameter in the above equations (1) and (2) is as follows:
  • H: width of the shooting range
  • L: distance between the display device 201 and the stereo camera 202
  • θ: angle of view of the field of view
  • V: height of the shooting range
  • PV: number of vertical pixels
  • PH: number of horizontal pixels
  • the width H of the imaging range is calculated from the distance L from the stereo camera 202 to the display device 201 and the angle of view θ of the field of view by the above equation (1). The height V of the imaging range is calculated from the width H and the pixel ratio (PV/PH) by the above equation (2). Next, the width Dwidth and the height Dheight of the generated image are calculated from these values by the following equations (3), (4), and (5).
  • each parameter in the above formulas (3), (4) and (5) is as follows.
  • D1: distance from the origin to the right image
  • D2: distance from the origin to the left image
  • B: distance between the left and right cameras (stereo camera baseline length)
  • the size of the left and right images to be generated is calculated according to equations (4) and (5) by taking into account the range where the left and right shooting ranges intersect according to equation (3) above.
  • since the origins, y-axes, and z-axes of the display device 201 and the stereo camera 202 coincide, once the sizes of the left and right images are determined, the coordinates, the number of pixels, and so on on the display device 201 are determined.
  • arbitrary parallax can be given to the left and right images that are the objects to be captured by the stereo camera.
  • a virtual angle of view θCG, a virtual field of view (FOV), is calculated from the result of the first embodiment, and the virtual angle of view θCG is used for the sensor model (camera model) of the driving environment simulation device 10 to generate left and right images having equivalent parallax.
  • in the driving environment simulation device 10, the virtual angle of view θCG of the sensor model (camera model) to be output to the display device 201, which is a monitor, is calculated using the result of the first embodiment; using this virtual angle of view, left and right images having equivalent parallax are generated and output to the display device 201.
  • the displayed image becomes smaller, as shown in FIG. 2, and the virtual angle of view θCG of the sensor model (camera model) given by the following equation (6) is used for image generation.
  • atan is the arctangent, an inverse trigonometric function.
  • the accuracy of object recognition and distance measurement can be improved more than in the first embodiment.
  • each parameter in the above equation (7) is as follows:
  • BF: product of the baseline length (B) and the focal length (F)
  • D∞: parallax at infinity
  • in equation (7), BF and D∞ are parameters specific to the stereo camera.
  • the parallax D to be captured is calculated from the distance Z to be simulated.
  • the parallax between the left and right images acquired by the stereo camera 202 is calculated by SAD (Sum of Absolute Differences), referenced to the upper-left coordinates (y0, z0) of each angle of view. Since each of the displayed left and right images is at an equal distance from the reference point of its angle of view, the stereo camera 202 can shoot without any adjustment to the parallax D calculated by the above equation (7).
  • the left and right images are displayed on the same display screen, but the present invention is not limited to this.
  • the display screen on which the left image is displayed and the display screen on which the right image is displayed may be different.
  • the present invention is not limited to this.
  • the driving environment is simulated by outputting images of the sensor model as left and right images independent of each other; each of the left and right images is displayed; and the displayed left and right images are captured independently by the left and right cameras of a stereo camera.
  • the VR test method of the stereo camera is also included in the present invention.
  • 1 VR test system; 10 driving environment simulation device; 101 image processing unit; 20 sensor simulation system; 200 darkroom; 201 display device; 202 stereo camera; 202L left camera; 202R right camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

[Problem] To provide an arbitrary parallax to left and right images to be captured by a stereo camera. [Solution] This VR test system for a stereo camera 202 comprises: a driving environment simulation device 10 for outputting sensor model video as mutually independent left and right images; a display device 201 for displaying each of the left and right images; and the stereo camera 202, which is installed in a darkroom 200 together with the display device 201 and includes left and right cameras for capturing the left and right images displayed on the display device 201 independently, i.e., the left image by the left camera and the right image by the right camera. The driving environment simulation device 10 comprises an image processing unit 101, which generates mutually independent left and right images having equivalent angles of view.

Description

VR test system and VR test method
The present invention relates to a VR test system and a VR test method.
Conventionally, self-driving cars have been evaluated on a bench test machine by having the camera of the self-driving car capture display screens of various driving environments presented by a simulation device.
Patent Document 1, which is an example of conventional technology, discloses a vehicle inspection system that inspects a vehicle performing travel control based on information on the external environment in a predetermined direction detected by two monocular cameras.
Patent Document 2, which is another example of conventional technology, discloses an image processing apparatus for a vehicle-mounted stereo camera image processing device having a pair of image pickup devices; for a parallax image generation unit that generates a parallax image based on the difference between a pair of captured images, diagnosis is performed by comparing a diagnostic parallax image with a preset reference parallax image.
In Patent Document 2, processing is performed on the measurement side, not on the display side.
WO 2020/059472; Japanese Patent No. 6742423
Since a stereo camera performs binocular stereoscopic vision, the parallax between the left and right images is important.
However, the conventional techniques described above cannot display left and right images given an arbitrary parallax.
The present invention has been made in view of the above, and an object of the present invention is to provide an arbitrary parallax to the left and right images that are the shooting targets of a stereo camera.
One aspect of the present invention that solves the above problems and achieves the object is a VR test system for a stereo camera, comprising: a driving environment simulation device that outputs images of a sensor model as left and right images independent of each other; a display device that displays each of the left and right images; and a stereo camera that is installed in a darkroom together with the display device and has left and right cameras, the left camera photographing the left image and the right camera photographing the right image of the left and right images displayed on the display device, independently of each other. The driving environment simulation device includes an image processing unit that generates the left and right images independent of each other so as to have equivalent angles of view.
In one aspect of the present invention, the left and right images have equivalent parallax by using a virtual angle of view for the sensor model.
In one aspect of the present invention, the left and right images are generated so as to have an arbitrary equivalent parallax from the image coordinates on the display device.
Alternatively, one aspect of the present invention that solves the above problems and achieves the object is a VR test method for a stereo camera, comprising: simulating a driving environment by outputting images of a sensor model as left and right images independent of each other; displaying each of the left and right images; and independently capturing the left image of the displayed left and right images using the left camera of the stereo camera and the right image using the right camera of the stereo camera, wherein in the simulation of the driving environment, the left and right images are generated independently of each other so as to have equivalent angles of view.
According to the present invention, an arbitrary parallax can be given to the left and right images that are the imaging targets of a stereo camera.
FIG. 1 is a diagram showing a VR (virtual reality) test system in an embodiment. FIG. 2 is a diagram showing left and right images displayed on the display device.
Hereinafter, embodiments for carrying out the present invention will be described with reference to the accompanying drawings.
However, the present invention is not to be construed as limited by the description of the following embodiments.
(Embodiment 1)
FIG. 1 is a diagram showing a VR (virtual reality) test system 1 in this embodiment.
The VR test system 1 shown in FIG. 1 includes a driving environment simulation device 10 and a sensor simulation system 20.
The driving environment simulation device 10 includes simulation software having a scenario, a vehicle model, a driver model, a sensor model, and the like, and outputs images of the sensor model as left and right images (a left image and a right image) to the display device 201 of the sensor simulation system 20.
Here, any simulation software having a sensor model (camera model) may be used as the simulator for generating the left and right images.
The driving environment simulation device 10 also includes an image processing unit 101.
As will be described later, the image processing unit 101 performs processing to give an arbitrary parallax to the left and right images that are the imaging targets of the stereo camera.
The left and right images output from the driving environment simulation device 10 to the display device 201 are left and right images given an arbitrary parallax by the image processing unit 101.
The sensor simulation system 20 includes a darkroom 200, a display device 201, and a stereo camera 202.
The darkroom 200 is a box-shaped structure provided so that surrounding structures and other disturbances are not reflected on the display device 201.
Assuming that the stereo camera 202 to be evaluated is mounted on a vehicle, it is necessary to consider the feasible size of the display device 201, which is a monitor, and the distance at which the display device 201 is arranged.
Since such tests are often conducted indoors, a compact installation is required.
Also, in such tests, glare due to disturbances should be eliminated.
Therefore, the display device 201 and the stereo camera 202 are installed inside the darkroom 200.
The display device 201 displays the left and right images, which are images of the sensor model output from the driving environment simulation device 10 and given an arbitrary parallax.
A 4K display can be given as an example of the display device 201.
Here, the left and right images displayed on the display device 201 include left and right images independent of each other.
The stereo camera 202 has a left camera 202L and a right camera 202R, and captures the left and right images independently.
Here, the left camera 202L captures the left image of the left and right images, and the right camera 202R captures the right image.
The stereo camera 202 is controlled from outside the sensor simulation system 20 and sends the images acquired by shooting to the outside of the sensor simulation system 20.
The driving environment simulation device 10 outputs the images processed by the image processing unit 101 to the display device 201.
The left and right images output from the driving environment simulation device 10 to the display device 201 are given an arbitrary parallax and are independent of each other.
The stereo camera 202 captures the left and right images displayed on the display device 201.
The arrangement and generation of the left and right images on the screen of the display device 201 for providing accurate parallax to the stereo camera 202 will be described below.
The range of the left and right angles of view on the display screen of the display device 201 is calculated and specified from the specifications of the stereo camera 202 (baseline length, angle of view, resolution, etc.) and the distance between the stereo camera 202 and the display device 201.
The left and right images displayed on the screen of the display device 201 are calculated and generated so as to avoid the intersection range of the left and right angles of view.
FIG. 2 is a diagram showing the left and right images displayed on the display device 201.
The coordinate calculations required to display independent left and right images on the display device 201, as shown in FIG. 2, are described below.
Examples of parameters for the display device 201 and the stereo camera 202 are shown in Table 1.
[Table 1: example parameters of the display device 201 and the stereo camera 202 (image not reproduced)]
Here, as shown in FIG. 2, it is assumed that the origins, y-axes, and z-axes of the display device 201 and the stereo camera 202 coincide, and that the stereo camera 202 directly faces the display device 201.
Each parameter in the coordinate calculation is also shown in FIG. 2.
The shooting ranges of the left and right cameras of the stereo camera 202 are calculated from the relationship between the display device 201 and the stereo camera 202.
The position and size of the image to be displayed, and its angle of view, are determined as follows.
[Equation (1): image not reproduced]
[Equation (2): image not reproduced]
Here, each parameter in the above equations (1) and (2) is as follows.
H: width of the shooting range
L: distance between the display device 201 and the stereo camera 202
θ: angle of view of the field of view
V: height of the shooting range
PV: number of vertical pixels
PH: number of horizontal pixels
The width H of the imaging range is calculated from the distance L from the stereo camera 202 to the display device 201 and the angle of view θ of the field of view by the above equation (1).
The height V of the imaging range is calculated from the width H and the pixel ratio (PV/PH) by the above equation (2).
Next, the width Dwidth and the height Dheight of the generated image are calculated from these values by the following equations (3), (4), and (5).
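The equation images themselves are not reproduced in this text. From the parameter definitions and the description above, equations (1) and (2) presumably take the following standard forms (a reconstruction, not the published typesetting):

```latex
H = 2L \tan\!\left(\frac{\theta}{2}\right) \quad (1)
\qquad
V = H \cdot \frac{P_V}{P_H} \quad (2)
```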
[Equation (3): image not reproduced]
[Equation (4): image not reproduced]
[Equation (5): image not reproduced]
Here, each parameter in the above equations (3), (4), and (5) is as follows.
D1: distance from the origin to the right image
D2: distance from the origin to the left image
B: distance between the left and right cameras (stereo camera baseline length)
By taking into account, via equation (3), the range where the left and right shooting ranges intersect, the sizes of the left and right images to be generated are calculated by equations (4) and (5).
At this time, as described above, since the origins, y-axes, and z-axes of the display device 201 and the stereo camera 202 coincide, once the sizes of the left and right images are determined, the coordinates, the number of pixels, and so on on the display device 201 are determined.
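As a concrete illustration of the placement logic, the sketch below encodes one plausible reading of the geometry, since the images of equations (3)-(5) are not reproduced in this text: each camera's range has width H from equation (1); the two ranges overlap in a strip of width H - B centred on the origin; and each generated image must fit in the strip of width B exclusive to its own camera. The parameter values are illustrative, not taken from Table 1.

```python
import math

def image_placement(L, theta_deg, B, p_v, p_h):
    """Sketch of left/right image sizing that avoids the crossing range.

    The published equations (3)-(5) are not reproduced, so this encodes
    an assumed geometry: shooting-range width H = 2*L*tan(theta/2)
    (equation (1)), an overlap strip of width H - B centred on the
    origin, and an image width bounded by the exclusive strip of width B.
    """
    H = 2.0 * L * math.tan(math.radians(theta_deg) / 2.0)  # eq. (1), assumed form
    overlap = H - B            # width of the crossing range of the two views
    d_inner = overlap / 2.0    # distance from the origin to an image's inner edge
    d_width = B                # widest image that stays outside the overlap
    d_height = d_width * (p_v / p_h)  # same pixel-ratio scaling as eq. (2), assumed
    return d_inner, d_width, d_height

# Illustrative values: L = 1 m, theta = 50 deg, B = 0.35 m, 1920x1080 image.
d_inner, d_width, d_height = image_placement(1.0, 50.0, 0.35, 1080, 1920)
```

With these values the overlap strip occupies most of the screen centre, which is consistent with the document's remark that the displayed images become small.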
According to the present embodiment described above, an arbitrary parallax can be given to the left and right images that are the objects to be captured by the stereo camera.
(Embodiment 2)
In this embodiment, a virtual angle of view θCG, a virtual field of view (FOV), is calculated from the result of Embodiment 1, and the virtual angle of view θCG is used for the sensor model (camera model) of the driving environment simulation device 10 to generate left and right images having equivalent parallax.
In this embodiment, the driving environment simulation device 10 calculates, using the result of Embodiment 1, the virtual angle of view of the sensor model (camera model) to be output to the display device 201, which is a monitor; using the virtual angle of view thus derived, the sensor model generates left and right images having equivalent parallax and outputs them to the display device 201.
To allow the left and right cameras of the stereo camera 202 to independently capture the left and right images on the display device 201, the range where the left and right shooting ranges intersect should be avoided.
Therefore, the displayed image becomes smaller, as shown in FIG. 2, and the virtual angle of view θCG of the sensor model (camera model) given by the following equation (6) is used for image generation.
[Equation (6): image not reproduced]
Here, atan is the arctangent, an inverse trigonometric function.
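The image of equation (6) is likewise not reproduced here. Given that the virtual angle of view corresponds to a generated image of width Dwidth viewed from distance L, a plausible form (an assumption, not the published equation) is:

```latex
\theta_{CG} = 2\,\operatorname{atan}\!\left(\frac{D_{width}}{2L}\right) \quad (6)
```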
According to this embodiment, by using the virtual angle of view θCG, the accuracy of object recognition and distance measurement can be improved over Embodiment 1.
(Embodiment 3)
In this embodiment, a configuration will be described in which the sensor model (camera model) of the driving environment simulation device 10 generates left and right images having an arbitrary equivalent parallax from the image coordinates on the display device 201, which is a monitor.
The parallax D to be captured is calculated from the distance Z to be simulated by the following equation (7).
  D = BF / Z + D∞   …(7)
 Here, the parameters in equation (7) above are as follows.
 BF: product of the baseline length (B) and the focal length (F)
 D∞: parallax at infinity
 In equation (7) above, BF and D∞ are parameters specific to the stereo camera.
 The parallax D to be captured is thus calculated from the desired capture distance Z.
 Here, the parallax between the left and right images acquired by the stereo camera 202 is computed by SAD (Sum of Absolute Differences), taking the upper-left corner y0, z0 of each angle of view as the reference.
 Since each of the displayed left and right images lies at an equal distance from the reference point of its angle of view, the stereo camera 202 can capture them without adjusting the parallax D calculated by equation (7) above.
 By adjusting the equivalent parallax, the driving environment simulation device 10 can be matched (fitted) to the actual stereo camera 202, and the accuracy of object recognition and distance measurement can be improved over Embodiment 1.
 Note that while Embodiments 1 to 3 describe modes in which the left and right images are displayed on the same display screen, the present invention is not limited to this.
 The display screen on which the left image is displayed may differ from the display screen on which the right image is displayed.
 Note also that while Embodiments 1 to 3 describe a VR test system, the present invention is not limited to this.
 That is, the present invention also includes a VR test method for the stereo camera that comprises: simulating a driving environment by outputting images of a sensor model as mutually independent left and right images; displaying each of the left and right images; capturing each of the displayed left and right images with the left and right cameras of a stereo camera; and performing object recognition and distance measurement on the displayed left and right images, wherein, in the simulation of the driving environment, the mutually independent left and right images are generated so as to have equivalent angles of view.
 The present invention is not limited to the embodiments described above, and also includes various modifications in which components are added to, deleted from, or substituted in the configurations described above.
1 VR test system
 10 driving environment simulation device
  101 image processing unit
 20 sensor simulation system
  200 darkroom
  201 display device
  202 stereo camera
   202L left camera
   202R right camera

Claims (4)

  1.  A VR test system for a stereo camera, comprising:
     a driving environment simulation device that outputs images of a sensor model as mutually independent left and right images;
     a display device that displays each of the left and right images; and
     a stereo camera installed in a darkroom together with the display device and having left and right cameras that independently capture, of the left and right images displayed on the display device, the left image with the left camera and the right image with the right camera,
     wherein the driving environment simulation device comprises an image processing unit, and
     the image processing unit generates the mutually independent left and right images so as to have equivalent angles of view.
  2.  The VR test system according to claim 1, wherein the left and right images have equivalent parallax through the use of a virtual angle of view in the sensor model.
  3.  The VR test system according to claim 1, wherein the left and right images are generated from image coordinates on the display device so as to have an arbitrary equivalent parallax.
  4.  A VR test method for a stereo camera, comprising:
     simulating a driving environment by outputting images of a sensor model as mutually independent left and right images;
     displaying each of the left and right images; and
     independently capturing, of the displayed left and right images, the left image using a left camera of the stereo camera and the right image using a right camera of the stereo camera,
     wherein, in the simulation of the driving environment, the mutually independent left and right images are generated so as to have equivalent angles of view.
PCT/JP2022/032942 2021-09-07 2022-09-01 Vr test system and vr test method WO2023037954A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-145569 2021-09-07
JP2021145569A JP7357884B2 (en) 2021-09-07 2021-09-07 VR test system and VR test method

Publications (1)

Publication Number Publication Date
WO2023037954A1 true WO2023037954A1 (en) 2023-03-16

Family

ID=85506698

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/032942 WO2023037954A1 (en) 2021-09-07 2022-09-01 Vr test system and vr test method

Country Status (2)

Country Link
JP (1) JP7357884B2 (en)
WO (1) WO2023037954A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10703508B1 (en) * 2016-08-31 2020-07-07 Amazon Technologies, Inc. Stereoscopic flight simulator with data acquisition
CN112954303A (en) * 2021-01-11 2021-06-11 中汽研汽车检验中心(广州)有限公司 Test video black box of automatic driving binocular camera

Also Published As

Publication number Publication date
JP7357884B2 (en) 2023-10-10
JP2023038706A (en) 2023-03-17


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22867276

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE