WO2020162498A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2020162498A1
WO2020162498A1 (PCT/JP2020/004366)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
calibration
information processing
camera
image
Prior art date
Application number
PCT/JP2020/004366
Other languages
French (fr)
Japanese (ja)
Inventor
育子 古村
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー
Publication of WO2020162498A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present disclosure relates to an information processing device mounted on and used in a vehicle.
  • Patent Document 1 proposes a technique for performing calibration during traveling without using a specific marker for calibration.
  • One aspect of the present disclosure is an information processing device mounted on and used in a vehicle, including an execution unit and a determination unit.
  • The execution unit is configured to execute calibration of a camera, which is mounted on the vehicle and captures images of the vehicle's surroundings, based on the images captured by the camera.
  • The determination unit is configured to determine whether a predetermined condition is satisfied.
  • the execution unit is configured to be able to execute the calibration when the determination unit determines that the predetermined condition is satisfied.
  • With this configuration, the execution of calibration can be limited to certain situations. Therefore, for example, by not performing the calibration in situations where there is a high probability that the calibration accuracy would be low, a decrease in calibration accuracy can be suppressed. Also, for example, by not performing the calibration in situations where there is a high probability that the increase in processing load due to calibration will affect other controls, adverse effects on other controls can be suppressed.
  • FIG. 1 is a block diagram showing a configuration of an image processing unit corresponding to the information processing device of the first embodiment. FIG. 2 is a functional block diagram of the image processing unit. FIG. 3 is a diagram explaining the outline of calibration. FIG. 4 is a flowchart of the execution judgment processing of the first embodiment. FIG. 5 is a diagram explaining the relationship between several controls based on image processing and traveling speed. FIG. 6 is a flowchart of the execution judgment processing of the second embodiment. FIG. 7 is a flowchart of the execution judgment processing of the third embodiment. FIG. 8 is a diagram explaining the relationship between memory usage and the decrease in processing speed. FIG. 9 is a flowchart of the execution judgment processing of other embodiments.
  • A display system 1 shown in FIG. 1 is a system mounted on and used in a vehicle, and includes four cameras, namely a front camera (hereinafter, F camera) 11a, a rear camera (hereinafter, R camera) 11b, a left side camera (hereinafter, LS camera) 11c, and a right side camera (hereinafter, RS camera) 11d, as well as a display 13 and an ECU 15.
  • the camera 11 is an imaging device mounted on the vehicle.
  • a known CCD image sensor or CMOS image sensor can be used.
  • the camera 11 photographs the surroundings of the vehicle at a predetermined time interval (1/15s as an example), and outputs the photographed image to the ECU 15.
  • the F camera 11a photographs the front of the vehicle.
  • the R camera 11b photographs the rear of the vehicle.
  • the LS camera 11c photographs the left side of the vehicle.
  • the RS camera 11d photographs the right side of the vehicle.
  • the display 13 is a display device such as a liquid crystal display that displays an image.
  • the display 13 displays the combined image generated by the ECU 15 according to the signal input from the ECU 15.
  • the ECU 15 includes a video signal input unit 21, a communication interface (I/F) 22, an image processing unit 23, a video signal output unit 24, and a power supply unit 25.
  • the video signal input unit 21 inputs a video signal indicating a captured image captured by the camera 11 from the camera 11, and outputs it to the image processing unit 23.
  • the communication I/F 22 acquires a signal output to the vehicle-mounted communication bus 17 from one or more control devices or sensors (not shown) mounted on the vehicle and outputs the signal to the image processing unit 23.
  • the communication I/F 22 acquires information about the vehicle such as the traveling speed of the vehicle (hereinafter, vehicle speed), the steering angle of the tire, and the shift range.
  • the image processing unit 23 includes a microcomputer having a CPU 31 and a semiconductor memory (hereinafter, memory 32) such as a RAM 32a, a ROM 32b, and a flash memory 32c.
  • Various functions of the image processing unit 23 are realized by the CPU 31 executing a program stored in a non-transitory tangible recording medium.
  • In this example, the memory 32 corresponds to the non-transitory tangible recording medium storing the program. By executing this program, the method corresponding to the program is executed.
  • the image processing unit 23 may include one microcomputer or a plurality of microcomputers.
  • the image processing unit 23 corresponds to an information processing device.
  • As shown in FIG. 2, the image processing unit 23 includes, as functions realized by the CPU 31 executing a program, an image generation unit 41, an image recognition unit 42, a drawing unit 43, an execution unit 44, and a determination unit 45.
  • the image generation unit 41 generates one or more composite images based on the captured images captured by two or more cameras of the four cameras 11.
  • A composite image is an image in which a plurality of captured images are combined, and the viewpoint of at least one captured image may be changed. For example, it may be an image from a viewpoint different from that of the images actually captured by the camera 11, or an image covering a wider range than the captured image of a single camera.
  • the image recognition unit 42 recognizes a predetermined target object from the composite image generated by the image generation unit 41.
  • the predetermined object here may include, for example, a lane marking such as a white line, a moving body such as a person, an automobile, or a bicycle, or a fixed object such as a signal or a sign.
  • the method of recognizing the object from the image is not particularly limited, and various known methods can be used.
  • the target object may be recognized by pattern matching or calculation processing using a learning model.
  • the detected object may be used for various controls.
  • The drawing unit 43 corrects the brightness and color of the composite image to generate an image that is easy for vehicle occupants to view when it is output to the display 13.
  • the execution unit 44 executes the calibration of the camera 11 based on the image captured by the camera 11 while the vehicle is traveling.
  • Calibration is to correct the camera posture parameters such as the position and angle of the camera 11 described above.
  • The specific content of the calibration may be, for example, estimating the installation position of the camera 11 with respect to the vehicle and replacing the parameter stored in the storage area of the memory 32 with the estimate, or recalculating a new parameter using the estimated parameter. In the present embodiment, the calibration is executed without using a marker dedicated to calibration. The outline of the calibration method will be briefly described with reference to FIG. 3.
  • A first target image 62 is an image showing a target 73 on the road, photographed at a certain timing t1 by a camera 72 fixed to a vehicle 71.
  • An image in which the target 73 captured at a timing t2 after t1 is schematically superimposed on the captured image 61 is referred to as a second target image 63.
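The geometry outlined above can be illustrated with a minimal sketch. Assuming a flat road and a pinhole camera model (the function names, focal length, and bisection approach below are illustrative assumptions, not taken from the patent), the target's apparent displacement along the ground between timings t1 and t2 should match the distance the vehicle actually traveled; solving for the camera pitch that makes the two agree is one simple form of marker-less calibration:

```python
import math

def ground_distance(v_pixel, f, h, pitch):
    """Road distance to a ground point imaged v_pixel below the image
    center (flat road, pinhole camera with focal length f at height h)."""
    angle_below_horizon = pitch + math.atan2(v_pixel, f)
    return h / math.tan(angle_below_horizon)

def estimate_pitch(v1, v2, f, h, traveled, lo=0.001, hi=0.3):
    """Bisect for the pitch at which the target's apparent ground
    displacement between t1 and t2 equals the distance traveled."""
    for _ in range(60):
        mid = (lo + hi) / 2.0
        apparent = (ground_distance(v1, f, h, mid)
                    - ground_distance(v2, f, h, mid))
        if apparent > traveled:   # apparent motion shrinks as pitch grows
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

A production system would estimate several posture parameters (position and angles) from many tracked points rather than a single pitch from one target, but the mismatch-minimization idea is the same.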
  • The determination unit 45 determines whether or not a predetermined condition is satisfied.
  • the predetermined condition is a condition suitable for executing the calibration.
  • the execution unit 44 is configured to be able to execute the calibration when the determination unit 45 determines that the predetermined condition is satisfied.
  • In the present embodiment, the predetermined condition is that (i) the environment is not a bad environment and (ii) a white line is recognized.
  • A bad environment is an environment in which, because the quality of the captured images is poor, calibration is highly likely to be inaccurate if performed.
  • Examples of a bad environment include the environment around the vehicle being dark, the road on which the vehicle is traveling being uneven, and the camera 11 being dirty.
  • Examples of the uneven road include a road having a large number of irregularities on the road surface and a road having a large number of small obstacles such as pebbles.
  • the specific method for detecting a bad environment is not particularly limited.
  • the dark environment around the vehicle can be determined based on, for example, an image captured by the camera 11, or based on the output of an illuminance sensor (not shown) mounted on the vehicle.
  • The fact that the road on which the vehicle is traveling is uneven may be determined, for example, by detecting the presence of stones or dents through image processing of the captured image, or based on the output of a vibration sensor or acceleration sensor (not shown) mounted on the vehicle.
  • the fact that the camera 11 is dirty can be determined, for example, from the change with time of the captured image acquired by the camera 11.
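The three bad-environment checks described above might be sketched as simple predicates. The thresholds, sensor formats, and detection heuristics below are hypothetical; the patent deliberately leaves the concrete detection methods open:

```python
def is_dark(gray_image, mean_threshold=40):
    """Dark surroundings: low average gray level in the captured image."""
    pixels = [p for row in gray_image for p in row]
    return sum(pixels) / len(pixels) < mean_threshold

def is_uneven_road(accel_samples, variance_limit=2.0):
    """Uneven road: high variance in vertical acceleration samples."""
    mean = sum(accel_samples) / len(accel_samples)
    variance = sum((a - mean) ** 2 for a in accel_samples) / len(accel_samples)
    return variance > variance_limit

def is_dirty_lens(prev_image, cur_image, change_threshold=1.0):
    """Dirty lens: the image barely changes over time while the vehicle moves."""
    diffs = [abs(a - b) for prev_row, cur_row in zip(prev_image, cur_image)
             for a, b in zip(prev_row, cur_row)]
    return sum(diffs) / len(diffs) < change_threshold

def is_bad_environment(gray_image, prev_image, accel_samples):
    # Any single requirement being met makes the environment "bad".
    return (is_dark(gray_image)
            or is_uneven_road(accel_samples)
            or is_dirty_lens(prev_image, gray_image))
```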
  • The white line can be recognized by the image recognition unit 42. Since a white line is suitable as a target for executing calibration, detection of a white line may be made a requirement so that calibration is executed well.
  • the video signal output unit 24 outputs the combined image generated by the image processing unit 23 to the display 13.
  • the power supply unit 25 supplies electric power to the image processing unit 23 as well as each of the elements configuring the ECU 15.
  • The CPU 31 determines whether or not the vehicle is traveling. For example, the CPU 31 determines that the vehicle is traveling when the vehicle speed, acquired from a vehicle speed sensor (not shown), is a predetermined speed (for example, 5 km/h) or more.
  • the CPU 31 determines whether or not a predetermined condition for permitting execution of calibration is satisfied.
  • In S3, the CPU 31 determines whether the environment of the vehicle is a bad environment. Here, it is determined that the environment is bad if any one of the following requirements is satisfied: the environment around the vehicle is dark, the road on which the vehicle is traveling is uneven, or the camera 11 is dirty. When the CPU 31 determines in S3 that the environment is bad, it returns to S1. On the other hand, when the CPU 31 determines in S3 that the environment is not bad, it proceeds to S4.
  • the CPU 31 determines whether or not a white line is detected.
  • the detection of the white line itself is realized as a function of the image recognition unit 42.
  • When the CPU 31 determines in S4 that a white line is detected, it proceeds to S5.
  • When the CPU 31 determines in S4 that no white line is detected, it returns to S1.
  • the CPU 31 functions as the determination unit 45 described above, and if the predetermined condition is not satisfied, the process returns to S1, and if the predetermined condition is satisfied, the process proceeds to S5.
  • the CPU 31 executes calibration.
  • the CPU 31 functions as the execution unit 44 described above, and executes the calibration described above.
  • In S6, the CPU 31 updates the camera parameters stored in the memory 32 to the parameters estimated by the calibration in S5. After S6, the process returns to S1.
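The S1 to S6 flow described above can be condensed into a single judgment step. This is a sketch under assumed inputs; the calibration routine itself is passed in as a stub and all names are illustrative, not taken from the patent:

```python
def execution_judgment_step(vehicle_speed_kmh, captured_image,
                            env_is_bad, white_line_detected,
                            run_calibration, camera_params):
    """One pass of the first embodiment's execution judgment.
    Returns the step label at which the pass ended."""
    if vehicle_speed_kmh < 5:          # S1: act only while traveling
        return "S1"
    if env_is_bad:                     # S3: skip in a bad environment
        return "S1"
    if not white_line_detected:        # S4: need a white line as target
        return "S1"
    # S5: execute calibration; S6: update the stored camera parameters
    camera_params.update(run_calibration(captured_image))
    return "S6"
```

Calibration runs only on the pass where every gate opens; otherwise the stored parameters are left untouched and the loop starts over.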
  • As described above, the image processing unit 23 of the display system 1 executes the calibration only when there is a high probability that it can be suitably executed. By not performing the calibration in situations where there is a high probability that its accuracy would be low, deterioration of calibration accuracy can be suppressed.
  • The image processing unit 23 can determine whether or not the predetermined condition for performing calibration is satisfied based on at least one of the image captured by the camera 11 and the output of a sensor mounted on the vehicle.
  • The determination unit 45 determines that the condition for executing calibration is not satisfied when at least one of the following requirements is met: the environment around the vehicle is dark, the road on which the vehicle is traveling is uneven, or the camera 11 is dirty. When any of these requirements is met, calibration accuracy tends to be low; because calibration is not permitted in such cases, calibration with low accuracy can be suppressed.
  • the requirement for the determination unit 45 to determine that the predetermined condition is satisfied includes at least that the captured image acquired by the camera 11 includes a white line. Therefore, the white line can be used as a calibration target, and the calibration accuracy can be improved.
  • the configuration in which the white line is used as the calibration target is illustrated, but the target may be something other than the white line.
  • For example, lane markings other than white lines, road markings, curbs, and various other objects on the road can be used as targets.
  • When an object other than a white line is used, it may be determined in the process of S4 of FIG. 4 whether or not that target object is detected, instead of the white line.
  • the second embodiment is different from the first embodiment in that the traveling speed of the vehicle is included as a requirement for determining whether or not a predetermined condition is satisfied.
  • various controls using the result of image processing of the captured image are executed according to the vehicle speed.
  • For example, a plurality of controls are executed in the low speed range of less than 30 km/h, those controls are not executed in the medium speed range of 30 km/h to 50 km/h, and lane detection is performed in the high speed range of 50 km/h or more.
  • the CPU 31 proceeds to S13 after acquiring the captured image 61 in S12.
  • the CPU 31 determines whether the vehicle speed is in the medium speed range or higher.
  • the vehicle speed can be detected, for example, based on the output signal of a vehicle speed sensor (not shown). Since the image processing unit 23 executes a plurality of controls in the low speed range, the processing load on the image processing unit 23 is high. If calibration is further performed here, there is a high possibility that a delay will occur in other control. Therefore, the CPU 31 does not permit execution of the calibration while traveling in the low speed range.
  • The CPU 31 permits the execution of calibration during traveling in the medium and high speed ranges because the processing load there is small. Traveling in the medium or high speed range corresponds to the vehicle being in a predetermined state.
  • When the CPU 31 determines in S13 that the vehicle speed is not in the medium speed range or higher, it returns to S11. On the other hand, when the CPU 31 determines in S13 that the vehicle speed is in the medium speed range or higher, it proceeds to S14 and executes the calibration.
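The speed gating of the second embodiment can be sketched using the example thresholds from the text (30 km/h and 50 km/h); the function names are illustrative:

```python
LOW_MAX_KMH = 30   # below this: low speed range, many controls running
MID_MAX_KMH = 50   # 30-50 km/h: medium range; 50 km/h and above: high range

def speed_range(vehicle_speed_kmh):
    """Classify the vehicle speed into the three ranges of the text."""
    if vehicle_speed_kmh < LOW_MAX_KMH:
        return "low"
    if vehicle_speed_kmh < MID_MAX_KMH:
        return "medium"
    return "high"

def calibration_permitted(vehicle_speed_kmh):
    # S13: permit calibration only in the medium speed range or higher,
    # where the image-processing load from other controls is small.
    return speed_range(vehicle_speed_kmh) != "low"
```

As the text notes, the permitted range and the threshold values are design choices; the same structure works for permitting calibration in any one range only.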
  • the image processing unit 23 of the display system 1 can execute the calibration when the processing load due to the calibration is unlikely to affect other controls. Therefore, it is possible to prevent the execution of the calibration from adversely affecting other controls.
  • the second embodiment exemplifies a configuration in which execution of calibration is permitted only when the vehicle speed is in the medium speed range or higher.
  • the speed range in which the execution of calibration is permitted is not limited to the above example.
  • the configuration may be such that the execution of calibration is permitted only in a certain speed range.
  • the execution of calibration may be permitted only in any one of the low speed range, the medium speed range, and the high speed range.
  • the threshold value of the speed for determining whether or not the calibration can be executed is not limited to the speed example disclosed in the second embodiment, and may be various values.
  • The third embodiment differs from the first embodiment in that the requirements for determining whether the predetermined condition is satisfied include the processing load of the image processing unit 23 being small, specifically, the memory usage of the image processing unit 23.
  • “memory usage” refers to the usage of the RAM 32a.
  • the image processing unit 23 includes software that measures the amount of memory used. Further, in the third embodiment, similar to the second embodiment, various controls using the result of image processing of a captured image are executed.
  • the CPU 31 acquires the captured image 61 in S22, and then moves to S23.
  • the CPU 31 determines whether or not the memory usage amount before the calibration is performed is equal to or larger than the reference value.
  • The reference value here is, for example, as shown in FIG. 8, a value of the memory usage before calibration is performed above which a decrease in processing speed is estimated to occur.
  • Pattern A in FIG. 8 is an example of the memory usage amount in a range that does not affect the processing speed. Therefore, in this case, execution of calibration is permitted.
  • Pattern B in FIG. 8 is an example of a memory usage amount at which a decrease in processing speed is estimated. In this case, execution of calibration is not permitted.
  • the memory usage amount being less than the predetermined reference value corresponds to satisfying the predetermined reference.
  • the CPU 31 determines in S23 that the memory usage amount is equal to or greater than the reference value, the CPU 31 returns to S21. On the other hand, when the CPU 31 determines in S23 that the memory usage amount is less than the reference value, the CPU 31 proceeds to S24 and executes calibration.
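The S23 check reduces to comparing current memory usage against a reference value. The figures below are hypothetical illustrations of patterns A and B in FIG. 8, with the reference derived as the slowdown threshold minus calibration's expected extra usage:

```python
SLOWDOWN_THRESHOLD_MB = 256   # usage at which processing slows (hypothetical)
CALIBRATION_COST_MB = 64      # extra memory while calibrating (hypothetical)
REFERENCE_MB = SLOWDOWN_THRESHOLD_MB - CALIBRATION_COST_MB

def may_calibrate(current_usage_mb, reference_mb=REFERENCE_MB):
    """S23: permit calibration only while memory usage is below the
    reference, so the added calibration load cannot cause a slowdown."""
    return current_usage_mb < reference_mb
```

Under these example numbers, pattern A (say, 120 MB in use) leaves headroom and calibration is permitted, while pattern B (say, 220 MB) would push past the slowdown threshold, so calibration is deferred.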
  • the image processing unit 23 of the display system 1 can execute the calibration when the memory usage amount is less than the reference value. Therefore, it is possible to prevent the execution of the calibration from adversely affecting other controls.
  • the reference value is an example of a memory usage amount estimated to cause a decrease in the processing speed of the image processing unit 23.
  • the reference value is not limited to the above configuration.
  • For example, the reference value may be a memory usage value chosen such that, when the increase in memory usage caused by executing calibration is added to it, the threshold at which the processing speed significantly decreases would be exceeded.
  • The requirement may also be another indication that the processing load of the image processing unit 23 is small. For example, it may be determined that the processing load of the image processing unit 23 is small when a specific control with a large processing load is not being executed, or when the number of controls being executed is equal to or less than a predetermined number.
  • In the first embodiment, the predetermined condition is set so that calibration can be executed at a timing at which calibration accuracy does not deteriorate.
  • In the second and third embodiments, the predetermined condition is set so that calibration can be executed while suppressing its influence on other controls.
  • The predetermined condition may also be set so that both of the effects described above can be obtained.
  • For example, the configuration may perform all of the determinations of S3, S4, S13, and S23 described above. With such a configuration, it is possible to simultaneously suppress a reduction in calibration accuracy and an influence on other controls.
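Performing all of the determinations S3, S4, S13, and S23 together, as suggested above, might look like this self-contained sketch (the speed and memory thresholds are hypothetical):

```python
def predetermined_condition_met(env_is_bad, white_line_detected,
                                vehicle_speed_kmh, memory_usage_mb,
                                low_max_kmh=30, memory_ref_mb=192):
    """Permit calibration only when every embodiment's check passes:
    S3 (environment), S4 (white line), S13 (speed), S23 (memory)."""
    return (not env_is_bad
            and white_line_detected
            and vehicle_speed_kmh >= low_max_kmh
            and memory_usage_mb < memory_ref_mb)
```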
  • The second embodiment exemplifies a configuration in which whether the predetermined condition for permitting execution of calibration is satisfied is determined based on the traveling speed of the vehicle, but the determination may instead be based on a state of the vehicle, other than the vehicle speed, acquired from the output of an in-vehicle sensor. For example, the requirements for determining that the predetermined condition is satisfied may include the vehicle being in a state in which lights such as headlights are not lit, or a state in which no moving body is detected within at least a certain range outside the vehicle. The traveling speed of the vehicle being within a predetermined range, the vehicle traveling without its lights lit, and the absence of moving objects within a predetermined range each correspond to an example of the vehicle being in the predetermined state.
  • The image processing unit 23 and its method described in the present disclosure may be realized by a dedicated computer provided by configuring a processor and a memory programmed to execute one or more functions embodied by a computer program. Alternatively, the image processing unit 23 and its method described in the present disclosure may be realized by a dedicated computer provided by configuring a processor with one or more dedicated hardware logic circuits. Alternatively, the image processing unit 23 and its method described in the present disclosure may be realized by one or more dedicated computers configured by a combination of a processor and memory programmed to execute one or more functions and a processor configured by one or more hardware logic circuits.
  • The computer program may be stored in a computer-readable non-transitory tangible recording medium as instructions executed by a computer.
  • The method for realizing the function of each unit included in the image processing unit 23 does not necessarily need to include software, and all of the functions may be realized using one or more hardware components.
  • A plurality of functions of one constituent element in the above embodiments may be realized by a plurality of constituent elements, or one function of one constituent element may be realized by a plurality of constituent elements. A plurality of functions of a plurality of constituent elements may be realized by one constituent element, or one function realized by a plurality of constituent elements may be realized by one constituent element. Part of the configuration of the above embodiments may be omitted. At least part of the configuration of one of the above embodiments may be added to, or substituted for, the configuration of another of the above embodiments.

Abstract

An information processing device (23) which is mounted on a vehicle and used is provided with: an execution unit (44) configured to execute calibration of cameras (11a, 11b, 11c, 11d) which are mounted on the vehicle and which photograph the periphery of the vehicle, on the basis of a photographed image (61) of the cameras; and a determination unit (45) configured to determine whether a prescribed condition is satisfied. The execution unit is configured to be capable of executing the calibration when the determination unit determines that the prescribed condition is satisfied.

Description

Information processing device

Cross-Reference to Related Applications

 This international application claims priority based on Japanese Patent Application No. 2019-19622 filed with the Japan Patent Office on February 6, 2019, and the entire contents of Japanese Patent Application No. 2019-19622 are incorporated into this international application by reference.
 The present disclosure relates to an information processing device mounted on and used in a vehicle.
 When combining a plurality of captured images acquired by a plurality of cameras mounted on a vehicle, the quality of the composite image can be improved by correcting the camera posture (hereinafter also referred to as calibration). Patent Document 1 proposes a technique for performing calibration during traveling without using a specific marker for calibration.
JP 2014-101075 A
 However, as a result of detailed study by the inventor, it was found that with the calibration method of Patent Document 1, which does not use a specific marker, the calibration accuracy may become low depending on the situation in which the vehicle is placed. It was also found that the increase in processing load accompanying execution of calibration may affect other controls.
 In one aspect of the present disclosure, it is preferable to perform calibration at an appropriate timing.
 One aspect of the present disclosure is an information processing device mounted on and used in a vehicle, including an execution unit and a determination unit. The execution unit is configured to execute calibration of a camera, which is mounted on the vehicle and captures images of the vehicle's surroundings, based on the images captured by the camera. The determination unit is configured to determine whether a predetermined condition is satisfied. The execution unit is configured to be able to execute the calibration when the determination unit determines that the predetermined condition is satisfied.
 With this configuration, the execution of calibration can be limited to certain situations. Therefore, for example, by not performing the calibration in situations where there is a high probability that the calibration accuracy would be low, a decrease in calibration accuracy can be suppressed. Also, for example, by not performing the calibration in situations where there is a high probability that the increase in processing load due to calibration will affect other controls, adverse effects on other controls can be suppressed.
 FIG. 1 is a block diagram showing a configuration of an image processing unit corresponding to the information processing device of the first embodiment. FIG. 2 is a functional block diagram of the image processing unit. FIG. 3 is a diagram explaining the outline of calibration. FIG. 4 is a flowchart of the execution judgment processing of the first embodiment. FIG. 5 is a diagram explaining the relationship between several controls based on image processing and traveling speed. FIG. 6 is a flowchart of the execution judgment processing of the second embodiment. FIG. 7 is a flowchart of the execution judgment processing of the third embodiment. FIG. 8 is a diagram explaining the relationship between memory usage and the decrease in processing speed. FIG. 9 is a flowchart of the execution judgment processing of other embodiments.
 Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
 [1. First Embodiment]
 [1-1. Configuration]
 A display system 1 shown in FIG. 1 is a system mounted on and used in a vehicle, and includes four cameras, namely a front camera (hereinafter, F camera) 11a, a rear camera (hereinafter, R camera) 11b, a left side camera (hereinafter, LS camera) 11c, and a right side camera (hereinafter, RS camera) 11d, as well as a display 13 and an ECU 15. Hereinafter, when all of the above cameras are referred to collectively, they may be written simply as the camera 11.
 The camera 11 is an imaging device mounted on the vehicle. For the camera 11, for example, a known CCD image sensor or CMOS image sensor can be used. The camera 11 photographs the surroundings of the vehicle at a predetermined time interval (1/15 s as an example) and outputs the captured images to the ECU 15. The F camera 11a photographs the front of the vehicle. The R camera 11b photographs the rear of the vehicle. The LS camera 11c photographs the left side of the vehicle. The RS camera 11d photographs the right side of the vehicle.
 The display 13 is a display device, such as a liquid crystal display, that displays images. The display 13 displays the composite image generated by the ECU 15 according to signals input from the ECU 15.
 The ECU 15 includes a video signal input unit 21, a communication interface (I/F) 22, an image processing unit 23, a video signal output unit 24, and a power supply unit 25.
 映像信号入力部21は、カメラ11により撮影された撮影画像を示す映像信号をカメラ11から入力して、画像処理部23に出力する。 The video signal input unit 21 inputs a video signal indicating a captured image captured by the camera 11 from the camera 11, and outputs it to the image processing unit 23.
 通信I/F22は、車両に搭載された図示しない1つ以上の制御装置やセンサ等から車載通信バス17に出力された信号を取得して画像処理部23に出力する。例えば通信I/F22は、車両の走行速度(以下、車速)、タイヤの舵角、シフトレンジなどの車両に関する情報を取得する。 The communication I/F 22 acquires a signal output to the vehicle-mounted communication bus 17 from one or more control devices or sensors (not shown) mounted on the vehicle and outputs the signal to the image processing unit 23. For example, the communication I/F 22 acquires information about the vehicle such as the traveling speed of the vehicle (hereinafter, vehicle speed), the steering angle of the tire, and the shift range.
 The image processing unit 23 includes a microcomputer having a CPU 31 and semiconductor memories (hereinafter, memory 32) such as a RAM 32a, a ROM 32b, and a flash memory 32c. The various functions of the image processing unit 23 are realized by the CPU 31 executing a program stored in a non-transitory tangible storage medium. In this example, the memory 32 corresponds to the non-transitory tangible storage medium storing the program. Executing the program carries out the method corresponding to the program. The image processing unit 23 may include one microcomputer or a plurality of microcomputers. The image processing unit 23 corresponds to the information processing device.
 As shown in FIG. 2, the image processing unit 23 includes, as functional blocks realized by the CPU 31 executing the program, an image generation unit 41, an image recognition unit 42, a drawing unit 43, an execution unit 44, and a determination unit 45.
 The image generation unit 41 generates one or more composite images based on images captured by two or more of the four cameras 11. A composite image is an image obtained by combining a plurality of captured images, and the viewpoint of at least one of the captured images may be changed. For example, the composite image may be an image from a viewpoint different from that of any image actually captured by the camera 11, or an image covering a wider range than the image captured by a single camera.
 To generate a composite image, the viewpoint of each captured image must be identified. To identify the viewpoint, camera posture parameters, including the installation position and angle of each camera 11, are used. These parameters for each camera 11 are stored in a storage area formed by the memory 32 and are read out and used by the image generation unit 41 as needed.
 The image recognition unit 42 recognizes predetermined objects in the composite image generated by the image generation unit 41. The predetermined objects may include, for example, lane markings such as white lines, moving bodies such as pedestrians, automobiles, and bicycles, and fixed objects such as traffic signals and signs. The method of recognizing an object in an image is not particularly limited, and various known methods can be used; for example, an object may be recognized by pattern matching or by computation using a trained model. Detected objects may be used for various controls.
 The drawing unit 43 corrects the brightness and color of the composite image to generate an image that is easy for the occupants of the vehicle to view when output to the display 13.
 The execution unit 44 executes calibration of the camera 11 based on images captured by the camera 11 while the vehicle is traveling.
 Calibration means correcting the camera posture parameters, such as the position and angle of the camera 11, described above. Specifically, calibration may, for example, estimate the installation position of the camera 11 relative to the vehicle and replace the parameters stored in the storage area of the memory 32, or it may recalculate new parameters using both the previous parameters and the newly estimated ones. In the present embodiment, calibration is executed without using a marker dedicated to calibration. An outline of the calibration method is briefly described with reference to FIG. 3.
 In a captured image 61, a first target image 62 represents a target 73 on the road photographed at a timing t1 by a camera 72 fixed to a vehicle 71. As the vehicle travels and its position changes, the position of the target 73 in the captured image changes. An image in which the target 73 photographed at a timing t2, later than t1, is schematically superimposed on the captured image 61 is referred to as a second target image 63. The camera posture parameters can be estimated based on the change in position between the first target image 62 and the second target image 63 in the image and the amount of movement of the vehicle 71 during the period from t1 to t2.
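The geometric idea above can be illustrated with a deliberately simplified one-parameter model. This is a hypothetical sketch, not the method of the disclosure (which is deferred to the cited publication): assume a forward-looking pinhole camera at unknown height h above a flat road, so that a ground target at longitudinal distance d projects to image coordinate y = f·h/d. Observing the target at two timings and knowing the vehicle's travel between them is then enough to solve for h.

```python
# Hypothetical one-parameter example: recover camera height from two
# observations of a ground target plus the vehicle's travel between them.
# The pinhole relation y = f * h / d is an illustrative simplification.

def estimate_camera_height(y1: float, y2: float, travel: float, f: float) -> float:
    """Estimate camera height h (metres).

    y1, y2 : image coordinates of the target at timings t1 and t2 (pixels)
    travel : forward movement of the vehicle between t1 and t2 (metres)
    f      : focal length (pixels)
    """
    # From y1 = f*h/d1 and y2 = f*h/(d1 - travel):
    #   h = y1 * y2 * travel / (f * (y2 - y1))
    return y1 * y2 * travel / (f * (y2 - y1))

# Synthetic check: camera 1.2 m high, target initially 10 m ahead,
# vehicle advances 2 m between the two shots.
f, h_true, d1, travel = 1000.0, 1.2, 10.0, 2.0
y1 = f * h_true / d1              # observation at t1
y2 = f * h_true / (d1 - travel)   # observation at t2
h_est = estimate_camera_height(y1, y2, travel, f)
```

The same principle extends to the full camera posture (position and angle) by tracking more targets and solving the resulting system of constraints.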
 The specific calibration method is not particularly limited; for example, the method disclosed in Japanese Patent Application Publication No. 2014-101075 may be used.
 The determination unit 45 determines whether a predetermined condition is satisfied. The predetermined condition is a condition suitable for executing calibration. The execution unit 44 is configured to be able to execute calibration while the determination unit 45 determines that the predetermined condition is satisfied.
 In the present embodiment, the predetermined condition is that (i) the environment is not a bad environment and (ii) a white line is being recognized.
 A bad environment is an environment in which, because of the low quality of the captured images, calibration is highly likely to yield poor accuracy if executed. Examples of a bad environment include the surroundings of the vehicle being dark, the road on which the vehicle is traveling being uneven, and the camera 11 being dirty. Examples of an uneven road include a road whose surface has many bumps and hollows and a road strewn with many small obstacles such as pebbles.
 The specific method of detecting a bad environment is not particularly limited. Darkness around the vehicle can be determined, for example, from the images captured by the camera 11 or from the output of an illuminance sensor (not shown) mounted on the vehicle. Whether the road on which the vehicle is traveling is uneven can be determined, for example, by processing the captured images to detect stones or hollows, or from the output of a vibration sensor or an acceleration sensor (not shown) mounted on the vehicle. Dirt on the camera 11 can be determined, for example, from changes over time in the images acquired by the camera 11.
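As a concrete illustration, the three example checks above can be combined into a single predicate. The sensor quantities and all threshold values here are illustrative assumptions for the sketch, not values taken from the disclosure:

```python
# Illustrative bad-environment check. Assumed inputs:
#   mean_brightness   -- mean pixel value of the captured image
#   vibration_rms     -- RMS output of a vibration/acceleration sensor (m/s^2)
#   stale_pixel_ratio -- fraction of pixels unchanged over time (lens-dirt cue)

def is_bad_environment(mean_brightness: float,
                       vibration_rms: float,
                       stale_pixel_ratio: float) -> bool:
    DARK_THRESHOLD = 40.0       # below this, surroundings are judged dark
    ROUGH_ROAD_THRESHOLD = 2.5  # above this, the road is judged uneven
    DIRT_THRESHOLD = 0.3        # above this, the camera is judged dirty

    too_dark = mean_brightness < DARK_THRESHOLD
    rough_road = vibration_rms > ROUGH_ROAD_THRESHOLD
    dirty_camera = stale_pixel_ratio > DIRT_THRESHOLD
    # Meeting even one requirement marks the environment as bad.
    return too_dark or rough_road or dirty_camera
```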
 A white line can be recognized by the image recognition unit 42. Since a white line is well suited as a target for executing calibration, detection of a white line may be made a requirement for executing calibration with good results.
 Returning to FIG. 1, the video signal output unit 24 outputs the composite image generated by the image processing unit 23 to the display 13. The power supply unit 25 supplies electric power to the image processing unit 23 and to the other elements constituting the ECU 15.
 [1-2. Processing]
 Next, the execution determination process executed by the CPU 31 of the image processing unit 23 will be described with reference to the flowchart of FIG. 4.
 First, in S1, the CPU 31 determines whether the vehicle is traveling. For example, the CPU 31 determines that the vehicle is traveling when the vehicle speed, acquired from a vehicle speed sensor (not shown), is at or above a predetermined speed (for example, 5 km/h).
 When the CPU 31 determines in S1 that the vehicle is traveling, it proceeds to S2 and acquires captured images of the surroundings of the vehicle from the camera 11. When the CPU 31 determines in S1 that the vehicle is not traveling, it ends the process of FIG. 4.
 In the subsequent S3 and S4, the CPU 31 determines whether the predetermined condition for permitting execution of calibration is satisfied.
 Specifically, in S3, the CPU 31 determines whether the environment of the vehicle is a bad environment. Here, the environment is determined to be bad if even one of the following requirements is met: the surroundings of the vehicle are dark, the road on which the vehicle is traveling is uneven, or the camera is dirty. When the CPU 31 determines in S3 that the environment is bad, it returns to S1; when it determines that the environment is not bad, it proceeds to S4.
 In the subsequent S4, the CPU 31 determines whether a white line is being detected. The detection of the white line itself is realized as a function of the image recognition unit 42. When the CPU 31 determines in S4 that a white line is being detected, it proceeds to S5; when it determines that no white line is being detected, it returns to S1.
 In this way, the CPU 31 functions as the determination unit 45 described above: it returns to S1 when the predetermined condition is not satisfied and proceeds to S5 when it is.
 Next, in S5, the CPU 31 executes calibration. Here, the CPU 31 functions as the execution unit 44 described above and executes the calibration described above.
 Next, in S6, the CPU 31 updates the camera parameters stored in the memory 32 to the parameters estimated by the calibration in S5. After S6, the process returns to S1.
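The S1 to S6 flow above can be sketched as a single pass of the loop, with the individual judgments and the calibration itself injected as callables so the control flow stands alone. All names here are illustrative, not identifiers from the disclosure:

```python
# Minimal sketch of one pass through the execution determination process
# of FIG. 4. Returning stored_params unchanged corresponds to the branches
# that go back to S1 (or end) without calibrating.

def execution_determination_step(vehicle_speed_kmh,
                                 is_bad_environment,
                                 white_line_detected,
                                 run_calibration,
                                 stored_params):
    MIN_SPEED_KMH = 5.0
    if vehicle_speed_kmh < MIN_SPEED_KMH:
        return stored_params            # S1: vehicle not traveling
    # S2 (image acquisition) is assumed to happen inside the callables.
    if is_bad_environment():
        return stored_params            # S3: bad environment
    if not white_line_detected():
        return stored_params            # S4: no white line detected
    new_params = run_calibration()      # S5: execute calibration
    return new_params                   # S6: updated parameters are stored
```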
 [1-3. Effects]
 According to the first embodiment described in detail above, the following effects are obtained.
 (1a) The image processing unit 23 of the display system 1 can execute calibration when there is a high probability that the calibration can be performed well. By not performing calibration in situations where the calibration accuracy is likely to be low, a drop in calibration accuracy can be suppressed.
 (1b) The image processing unit 23 can determine whether the predetermined condition for performing calibration is satisfied based on at least one of the images captured by the camera 11 and the output of a sensor mounted on the vehicle.
 (1c) The determination unit 45 determines that the condition for executing calibration is not satisfied when at least one of the following requirements is met: the surroundings of the vehicle are dark, the road on which the vehicle is traveling is uneven, or the camera 11 is dirty. When any of these requirements is met, calibration accuracy tends to be low; since calibration is not permitted in such cases, executing low-accuracy calibration can be suppressed.
 (1d) The requirements for the determination unit 45 to determine that the predetermined condition is satisfied include at least that the image captured by the camera 11 contains a white line. The white line can therefore be used as the calibration target, and calibration accuracy can be improved.
 [1-4. Modification of the First Embodiment]
 In the first embodiment, a configuration in which a white line is used as the calibration target was illustrated, but the target may be something other than a white line. For example, various objects on the road, such as lane markings other than white lines, road markings, and curbs, can be used as targets. When something other than a white line is used as the target, the process of S4 in FIG. 4 may determine whether that object is being detected instead of a white line.
 [2. Second Embodiment]
 [2-1. Differences from the First Embodiment]
 Since the basic configuration of the second embodiment is the same as that of the first embodiment, the differences are described below. Reference numerals identical to those of the first embodiment denote identical configurations; refer to the preceding description.
 In the first embodiment described above, being free of a bad environment and recognizing a white line were illustrated as the requirements for determining whether to perform calibration. The second embodiment differs from the first embodiment in that the traveling speed of the vehicle is included as a requirement for determining whether the predetermined condition is satisfied. In the second embodiment, as shown in FIG. 5, various controls using the results of image processing of the captured images are executed according to the vehicle speed. In the present embodiment, a plurality of controls are executed in the low speed range of less than 30 km/h; those controls are not executed in the medium speed range of 30 km/h to 50 km/h; and lane detection is executed in the high speed range of 50 km/h and above.
 [2-2. Processing]
 The execution determination process that the CPU 31 of the image processing unit 23 of the second embodiment executes in place of the execution determination process of FIG. 4 of the first embodiment will be described with reference to the flowchart of FIG. 6. Since the processing of S11, S12, S14, and S15 in FIG. 6 is the same as that of S1, S2, S5, and S6 in FIG. 4, part of the description is simplified.
 In the execution determination process of FIG. 6, the CPU 31 acquires the captured image 61 in S12 and then proceeds to S13.
 In S13, the CPU 31 determines whether the vehicle speed is in the medium speed range or higher. The vehicle speed can be detected, for example, based on the output signal of a vehicle speed sensor (not shown). Because the image processing unit 23 executes a plurality of controls in the low speed range, its processing load is high there; executing calibration on top of this would raise the likelihood of delaying the other controls. The CPU 31 therefore does not permit execution of calibration while the vehicle is traveling in the low speed range. In contrast, the CPU 31 permits execution of calibration while the vehicle is traveling in the medium and high speed ranges, where the processing load is small. The vehicle traveling in the medium or high speed range corresponds to the state of the vehicle being a predetermined state.
 When the CPU 31 determines in S13 that the vehicle speed is not in the medium speed range or higher, it returns to S11. When the CPU 31 determines in S13 that the vehicle speed is in the medium speed range or higher, it proceeds to S14 and executes calibration.
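The speed gate of S13 reduces to a single comparison. In this sketch, the 30 km/h boundary is the one used in the present embodiment; the function name is illustrative:

```python
# Speed gate of S13: calibration is permitted only in the medium speed
# range (30 km/h to 50 km/h) and the high speed range (50 km/h and above),
# where the image processing load is small.

def calibration_allowed_by_speed(speed_kmh: float) -> bool:
    LOW_MID_BOUNDARY_KMH = 30.0
    return speed_kmh >= LOW_MID_BOUNDARY_KMH
```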
 [2-3. Effects]
 According to the second embodiment described in detail above, the following effects are obtained.
 (2a) The image processing unit 23 of the display system 1 can execute calibration when the processing load of calibration is unlikely to affect other controls. Executing calibration can thus be prevented from adversely affecting other controls.
 [2-4. Modification of the Second Embodiment]
 The second embodiment illustrated a configuration in which execution of calibration is permitted only when the vehicle speed is in the medium speed range or higher. However, the speed range in which execution of calibration is permitted is not limited to this example. For example, execution of calibration may be permitted only within a certain speed range; as one example, it may be permitted only in one of the low, medium, and high speed ranges. The speed thresholds for determining whether calibration may be executed are also not limited to the values disclosed in the second embodiment and may take various values.
 [3. Third Embodiment]
 [3-1. Differences from the First Embodiment]
 Since the basic configuration of the third embodiment is the same as those of the first and second embodiments, the differences are described below. Reference numerals identical to those of the first embodiment denote identical configurations; refer to the preceding description.
 The third embodiment differs from the first embodiment in that the requirements for determining whether the predetermined condition is satisfied include the processing load of the image processing unit 23 being small, specifically the memory usage of the image processing unit 23. In the third embodiment, "memory usage" refers to the amount of the RAM 32a in use. The image processing unit 23 includes software that measures the memory usage. As in the second embodiment, the third embodiment executes various controls using the results of image processing of the captured images.
 [3-2. Processing]
 The execution determination process that the CPU 31 of the image processing unit 23 of the third embodiment executes in place of the execution determination process of FIG. 4 of the first embodiment will be described with reference to the flowchart of FIG. 7. Since the processing of S21, S22, S24, and S25 in FIG. 7 is the same as that of S1, S2, S5, and S6 in FIG. 4, part of the description is simplified.
 In the execution determination process of FIG. 7, the CPU 31 acquires the captured image 61 in S22 and then proceeds to S23.
 In S23, the CPU 31 determines whether the memory usage before calibration is at or above a reference value. The reference value here is, as shown for example in FIG. 8, the pre-calibration memory usage at which a drop in processing speed is estimated to occur. Pattern A in FIG. 8 is an example of memory usage within a range that does not affect processing speed; in this case, execution of calibration is permitted. Pattern B in FIG. 8 is an example of memory usage at which a drop in processing speed is estimated; in this case, execution of calibration is not permitted. The memory usage being below the predetermined reference value corresponds to the predetermined criterion being satisfied.
 When the CPU 31 determines in S23 that the memory usage is at or above the reference value, it returns to S21. When the CPU 31 determines in S23 that the memory usage is below the reference value, it proceeds to S24 and executes calibration.
 [3-3. Effects]
 According to the third embodiment described in detail above, the following effects are obtained.
 (3a) The image processing unit 23 of the display system 1 can execute calibration when the memory usage is below the reference value. Executing calibration can thus be prevented from adversely affecting other controls.
 [3-4. Modification of the Third Embodiment]
 In the third embodiment, the reference value was illustrated as the memory usage at which a drop in the processing speed of the image processing unit 23 is estimated to occur. However, the reference value is not limited to this. For example, considering that executing calibration increases memory usage, the reference value may be set to a memory usage such that adding this expected increase to it would exceed the memory usage threshold at which the drop in processing speed becomes pronounced.
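Both the base check of S23 and the variant above can be expressed in one predicate: with an overhead of zero this is the base comparison against the reference value, while a non-zero overhead also accounts for the memory the calibration itself is expected to consume. The parameter names and units are illustrative assumptions:

```python
# Memory gate of S23, extended with the optional calibration overhead term
# described in the modification of the third embodiment.

def calibration_allowed_by_memory(memory_used: int,
                                  slowdown_threshold: int,
                                  calibration_overhead: int = 0) -> bool:
    # Permit calibration only while the (projected) usage stays below the
    # level at which a drop in processing speed is estimated to occur.
    return memory_used + calibration_overhead < slowdown_threshold
```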
 Alternatively, a small processing load of the image processing unit 23 may be detected when a predetermined criterion other than memory usage is satisfied. For example, the processing load of the image processing unit 23 may be judged to be small when no specific control with a large processing load is being executed, or when the number of controls being executed is at or below a predetermined number.
 [4. Other Embodiments]
 Although embodiments of the present disclosure have been described above, the present disclosure is not limited to the embodiments described above and can be implemented with various modifications.
 (4a) In the first embodiment, the predetermined condition is set so that calibration can be executed at timings at which calibration accuracy does not deteriorate; in the second and third embodiments, the predetermined condition is set so that calibration can be executed while suppressing its influence on other controls. The predetermined condition may, however, be set so as to obtain both of these effects. For example, as in the execution determination process shown in the flowchart of FIG. 9, all of the determinations S3, S4, S13, and S23 described above may be performed. With such a configuration, suppression of a drop in calibration accuracy and suppression of the influence on other controls can be realized at the same time.
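A sketch of combining all four judgments into one predicate, corresponding to performing S3, S4, S13, and S23 in a single process. The boolean inputs are assumed to be precomputed upstream, and the numeric thresholds are the illustrative ones used in the embodiments above:

```python
# Combined gate: calibration is permitted only when the environment is not
# bad (S3), a white line is detected (S4), the vehicle speed is in the
# medium range or higher (S13), and memory usage is below the reference
# value (S23).

def combined_condition_met(bad_environment: bool,
                           white_line_detected: bool,
                           speed_kmh: float,
                           memory_used: int,
                           memory_reference: int) -> bool:
    return (not bad_environment                   # S3
            and white_line_detected               # S4
            and speed_kmh >= 30.0                 # S13
            and memory_used < memory_reference)   # S23
```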
 (4b) The second embodiment described above illustrated a configuration in which whether the predetermined condition for permitting execution of calibration is satisfied is determined based on the traveling speed of the vehicle, but the determination may instead be based on a state of the vehicle, acquired from the output of an in-vehicle sensor, other than the vehicle speed. For example, the requirements for determining that the predetermined condition is satisfied may include the vehicle being in a state in which lights such as the headlights are not lit, or a state in which no moving body is detected within at least a certain range outside the vehicle. The traveling speed of the vehicle being within a predetermined range, traveling without the lights lit, and no moving body being present within a predetermined range correspond to examples of the state of the vehicle being a predetermined state.
 (4c) The image processing unit 23 and the method thereof described in the present disclosure may be realized by a dedicated computer provided by configuring a processor and a memory programmed to execute one or more functions embodied by a computer program. Alternatively, the image processing unit 23 and the method thereof described in the present disclosure may be realized by a dedicated computer provided by configuring a processor with one or more dedicated hardware logic circuits. Alternatively, the image processing unit 23 and the method thereof described in the present disclosure may be realized by one or more dedicated computers configured by combining a processor and a memory programmed to execute one or more functions with a processor configured by one or more hardware logic circuits. The computer program may be stored, as instructions executed by a computer, in a computer-readable non-transitory tangible storage medium. The means for realizing the functions of the units included in the image processing unit 23 need not include software, and all of the functions may be realized using one or more pieces of hardware.
 (4d) A plurality of functions of one constituent element in the above embodiments may be realized by a plurality of constituent elements, and one function of one constituent element may be realized by a plurality of constituent elements. A plurality of functions of a plurality of constituent elements may be realized by one constituent element, and one function realized by a plurality of constituent elements may be realized by one constituent element. Part of the configuration of the above embodiments may be omitted, and at least part of the configuration of one of the above embodiments may be added to or substituted for the configuration of another of the above embodiments.
 (4e) In addition to the image processing unit 23 described above, the present disclosure can also be realized in various forms, such as a system including the image processing unit 23 as a constituent element, a program for causing a computer to function as the image processing unit 23, a non-transitory tangible recording medium such as a semiconductor memory storing the program, and a calibration execution method.

Claims (6)

  1.  An information processing device (23) that is mounted on a vehicle and used therein, the device comprising:
     an execution unit (44) configured to execute calibration of cameras (11a, 11b, 11c, 11d) that are mounted on the vehicle and capture the surroundings of the vehicle, the calibration being based on a captured image (61) of the cameras; and
     a determination unit (45) configured to determine whether or not a predetermined condition is satisfied,
     wherein the execution unit is configured to be able to execute the calibration while the determination unit determines that the predetermined condition is satisfied.
  2.  The information processing device according to claim 1,
     wherein the predetermined condition is a condition determined based on at least one of a captured image acquired by the camera and an output of a sensor mounted on the vehicle.
  3.  The information processing device according to claim 2,
     wherein the determination unit is configured to determine that the predetermined condition is not satisfied when at least one of the following requirements is met: the environment surrounding the vehicle is dark, the road on which the vehicle is traveling is uneven, and the camera is dirty.
  4.  The information processing device according to claim 2 or 3,
     wherein the requirements for the determination unit to determine that the predetermined condition is satisfied include at least that the captured image acquired by the camera contains a target (62, 63) usable as a reference for the calibration.
  5.  The information processing device according to any one of claims 2 to 4,
     wherein the requirements for the determination unit to determine that the predetermined condition is satisfied include at least that the state of the vehicle, acquired based on the output of the sensor, is a predetermined state.
  6.  The information processing device according to any one of claims 1 to 5,
     wherein the requirements for the determination unit to determine that the predetermined condition is satisfied include at least that a predetermined criterion indicating that the processing load of the information processing device is small is satisfied.
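The gating relationship the claims describe — a determination unit (45) that checks a set of requirements, and an execution unit (44) that runs calibration only while the predetermined condition holds — can be sketched as follows. This is an illustrative sketch only; all names (`VehicleStatus`, `condition_satisfied`, `maybe_calibrate`) and the particular set of flags are assumptions for illustration, not the patent's implementation.

```python
# Illustrative sketch (assumed names) of the claimed gating logic:
# the determination unit evaluates the predetermined condition, and the
# execution unit performs calibration only while that condition holds.
from dataclasses import dataclass

@dataclass
class VehicleStatus:
    ambient_is_dark: bool   # e.g. judged from image brightness or an illuminance sensor
    road_is_uneven: bool    # e.g. judged from acceleration/suspension sensor output
    camera_is_dirty: bool   # e.g. judged by lens-contamination detection on the image
    target_in_image: bool   # a calibration target (62, 63) appears in the image (61)
    vehicle_state_ok: bool  # vehicle state from sensor output is the predetermined state
    cpu_load_is_low: bool   # processing load meets the predetermined "small load" criterion

def condition_satisfied(s: VehicleStatus) -> bool:
    """Determination unit (45): True when the predetermined condition is satisfied."""
    # Claim 3: any one of these defeats the condition outright.
    if s.ambient_is_dark or s.road_is_uneven or s.camera_is_dirty:
        return False
    # Claims 4-6: remaining requirements must all hold.
    return s.target_in_image and s.vehicle_state_ok and s.cpu_load_is_low

def maybe_calibrate(s: VehicleStatus, calibrate) -> bool:
    """Execution unit (44): run calibration only when the condition is met."""
    if condition_satisfied(s):
        calibrate()
        return True
    return False
```

In this reading, the requirements of claims 4 to 6 are conjunctive (all must hold), while any single requirement of claim 3 is sufficient to suspend calibration.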
PCT/JP2020/004366 2019-02-06 2020-02-05 Information processing device WO2020162498A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019019622A JP7263807B2 (en) 2019-02-06 2019-02-06 Information processing equipment
JP2019-019622 2019-02-06

Publications (1)

Publication Number Publication Date
WO2020162498A1 true WO2020162498A1 (en) 2020-08-13

Family

ID=71947986

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/004366 WO2020162498A1 (en) 2019-02-06 2020-02-05 Information processing device

Country Status (2)

Country Link
JP (1) JP7263807B2 (en)
WO (1) WO2020162498A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023199465A1 (en) * 2022-04-14 2023-10-19 日立Astemo株式会社 Vehicle-mounted image processing device and calibration method therefor

Citations (4)

Publication number Priority date Publication date Assignee Title
WO2011114624A1 (en) * 2010-03-17 2011-09-22 本田技研工業株式会社 Vehicle surroundings monitoring device
JP2016082258A (en) * 2014-10-09 2016-05-16 株式会社デンソー On-vehicle camera calibration device, image generation apparatus, on-vehicle camera calibration method and image generation method
JP2017143417A (en) * 2016-02-10 2017-08-17 クラリオン株式会社 Calibration system, calibration device
JP2017211222A (en) * 2016-05-24 2017-11-30 三菱電機株式会社 On-vehicle camera calibration auxiliary device

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP4820221B2 (en) * 2006-06-29 2011-11-24 日立オートモティブシステムズ株式会社 Car camera calibration device and program


Also Published As

Publication number Publication date
JP2020127164A (en) 2020-08-20
JP7263807B2 (en) 2023-04-25

Similar Documents

Publication Publication Date Title
US9538144B2 (en) Full speed lane sensing using multiple cameras
US10875452B2 (en) Driving assistance device and driving assistance method
JP2013147112A (en) Vehicle driving environment recognition apparatus
JP6209825B2 (en) Parallax detection device and parallax detection method
US20140055572A1 (en) Image processing apparatus for a vehicle
US11017245B2 (en) Parking assist apparatus
JP2007293672A (en) Photographing apparatus for vehicle and soiling detection method for photographing apparatus for vehicle
JPWO2012066999A1 (en) In-vehicle camera displacement detection device
WO2020162498A1 (en) Information processing device
JP5083254B2 (en) Parking result display system
JP6407596B2 (en) Image processing apparatus and driving support system
JP6424449B2 (en) Rear status display device, rear status display method
JP2007018451A (en) Road boundary line detecting device
JP5716944B2 (en) In-vehicle camera device
JP6412934B2 (en) Object detection device, vehicle installed with object detection device, and program
JP6032141B2 (en) Travel road marking detection device and travel road marking detection method
JP4539400B2 (en) Stereo camera correction method and stereo camera correction device
JP2019133445A (en) Section line detection device, section line detection system, and section line detection method
JP7122394B2 (en) Imaging unit controller
JP7115420B2 (en) Image processing device
JP7005279B2 (en) Vehicle peripheral visibility device
JP2006036048A (en) Vehicle light controlling device
JP6855254B2 (en) Image processing device, image processing system, and image processing method
WO2017188245A1 (en) Image processing device, image processing method, and program
JP2019135620A (en) Traveling support device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20752720; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20752720; Country of ref document: EP; Kind code of ref document: A1)