WO2018150496A1 - Information processing device, information processing method, and information processing program - Google Patents


Info

Publication number
WO2018150496A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information processing
processing apparatus
image
reflected
Prior art date
Application number
PCT/JP2017/005573
Other languages
French (fr)
Japanese (ja)
Inventor
知奈美 木下 (Chinami Kinoshita)
陽介 中畑 (Yosuke Nakahata)
Original Assignee
富士通周辺機株式会社 (Fujitsu Peripherals Limited)
Priority date
Filing date
Publication date
Application filed by 富士通周辺機株式会社 (Fujitsu Peripherals Limited)
Priority to PCT/JP2017/005573 priority Critical patent/WO2018150496A1/en
Publication of WO2018150496A1 publication Critical patent/WO2018150496A1/en

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • the present invention relates to an information processing apparatus, an information processing method, and an information processing program.
  • In a related technique, learning data representing an image of a vehicle is stored, and when a partial image similar to the vehicle image represented by the learning data is found among the images captured outside the vehicle by the in-vehicle camera, it is determined that a vehicle appears in that partial image.
  • Other techniques detect a vehicle by detecting the center point of the saturated luminance values of the vehicle's lights, or by detecting the center point of the vehicle's shadow from a background image of the shooting location and the luminance values of the shadow.
  • Still another technique outputs a signal indicating that the host vehicle has arrived at a target point short of a point where the inclination of the traveling road surface ahead of the host vehicle, relative to the road surface portion on which the host vehicle is traveling, changes to a downward slope.
  • In such techniques, however, the position of a vehicle may be erroneously detected.
  • For example, when the in-vehicle camera captures an object outside the vehicle that has a shape or color similar to that of a vehicle but is not a vehicle, the position of that object may be mistakenly detected as a vehicle position.
  • an object of the present invention is to provide an information processing apparatus, an information processing method, and an information processing program that can prevent erroneous detection of the position of a vehicle.
  • In one aspect, a target image captured by an imaging unit that images the outside of a vehicle is acquired, and, based on sample data indicating features of an image in which a vehicle is captured, the position of a partial image having those features is detected in the acquired target image.
  • An information processing apparatus, an information processing method, and an information processing program are then proposed that determine whether or not the detected position is a position where a vehicle appears, with reference to a storage unit that stores information on a straight line indicating a reference for the positions where a vehicle appears in images captured by the imaging unit, the information being specified based on the position where a vehicle appears in each of a plurality of images captured outside the vehicle.
  • FIG. 1 is an explanatory diagram of an example of the information processing method according to the embodiment.
  • FIG. 2 is a block diagram illustrating a hardware configuration example of the information processing apparatus 100.
  • FIG. 3 is a block diagram illustrating a functional configuration example of the information processing apparatus 100.
  • FIG. 4 is an explanatory diagram illustrating an example of an image 400 captured by the imaging device 206.
  • FIG. 5 is an explanatory diagram showing an example of detecting the position where the vehicle is shown.
  • FIG. 6 is an explanatory diagram illustrating an example of an image 600 captured by the imaging device 206.
  • FIG. 7 is an explanatory diagram illustrating an example of an image 700 captured by the imaging device 206.
  • FIG. 8 is an explanatory diagram illustrating a specific example of creating a function indicating a straight line 801.
  • FIG. 9 is an explanatory diagram (part 1) illustrating a specific example of determining whether or not a position is one where a vehicle appears.
  • FIG. 10 is an explanatory diagram (part 2) illustrating a specific example of determining whether or not a position is one where a vehicle appears.
  • FIG. 11 is an explanatory diagram illustrating an example of determining, with a changed threshold value, whether or not a position is one where a vehicle appears.
  • FIG. 12 is a flowchart illustrating an example of a calculation processing procedure.
  • FIG. 13 is a flowchart illustrating an example of the determination processing procedure.
  • FIG. 14 is a flowchart illustrating an example of the adjustment processing procedure.
  • FIG. 1 is an explanatory diagram of an example of the information processing method according to the embodiment.
  • an information processing apparatus 100 is a computer provided in a vehicle.
  • the information processing apparatus 100 detects a position where the vehicle is reflected in the image captured by the imaging unit.
  • the imaging unit is provided in the vehicle and images the outside of the vehicle.
  • the imaging unit images, for example, the front of the vehicle, the rear of the vehicle, the oblique front of the vehicle, the oblique rear of the vehicle, the left direction of the vehicle, or the right direction of the vehicle.
  • The imaging unit is, for example, an in-vehicle camera attached around the front mirror of the vehicle and facing the front of the vehicle, an in-vehicle camera embedded in a door portion of the vehicle and facing the rear of the vehicle, or an in-vehicle camera embedded in the bumper portion of the vehicle and facing the left direction of the vehicle.
  • Here, by detecting a white line, such as a lane boundary line, appearing in an image captured outside the vehicle and using the position of the white line, it is possible to improve the accuracy of detecting the position where a vehicle appears.
  • In the present embodiment, an information processing method will be described that, after detecting in the target image the position of a partial image having the same features as the sample data, determines whether or not that position is a position where a vehicle appears, thereby preventing erroneous detection of the position where a vehicle appears.
  • the information processing apparatus 100 includes a storage unit 120.
  • the storage unit 120 stores information on the straight line 131.
  • a straight line 131 represents a reference of a position where a vehicle appears in an image 130 newly captured by the imaging unit in the future, and represents a region including a position where the vehicle can be reflected in an image 130 newly captured in the future.
  • the imaging unit is provided in the same vehicle as the information processing apparatus 100, for example.
  • the start point or end point of the straight line 131 may be, for example, a point inside the image 130 instead of a point at the end of the image 130 newly captured by the imaging unit in the future.
  • the information on the straight line 131 is specified based on the position where the vehicle is reflected in each of a plurality of images taken outside the vehicle.
  • the information on the straight line 131 is specified using, for example, the least square method.
  • the information on the straight line 131 is a function indicating the straight line 131, for example.
  • the plurality of images taken outside the vehicle are a plurality of images taken by the imaging unit.
  • The plurality of images captured outside the vehicle may instead be, for example, a plurality of images captured by an imaging unit provided in a vehicle different from the vehicle provided with the information processing apparatus 100.
  • the information processing apparatus 100 acquires the target image 110 captured by the imaging unit.
  • the information processing apparatus 100 acquires, for example, a target image 110 obliquely rearward of the vehicle, which is captured by an in-vehicle camera embedded in a door portion of the vehicle toward the rearward of the vehicle.
  • Based on sample data indicating features of an image in which a vehicle is captured, the information processing apparatus 100 detects, in the acquired target image 110, the position 111 of a partial image having the same features as those indicated by the sample data.
  • the sample data is, for example, a sample image showing a part of the vehicle.
  • The information processing apparatus 100 detects, for example, the position 111 of a partial image similar to a sample image showing a part of a vehicle in the target image 110 by pattern matching using that sample image.
  • The information processing apparatus 100 determines, based on the information on the straight line 131 stored in the storage unit 120, whether or not the detected position 111 is a position where a vehicle appears. For example, the information processing apparatus 100 determines whether or not the distance between the straight line 131 and the detected position 111 is greater than or equal to a threshold value. If the distance is less than the threshold value, the information processing apparatus 100 determines that the detected position 111 is a position where a vehicle appears and is not noise. If the distance is greater than or equal to the threshold value, the information processing apparatus 100 determines that the detected position 111 is not a position where a vehicle appears but noise.
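  • The distance-to-line test above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation; it assumes the straight line 131 is given as y = ax + b, and the function names are introduced here for the example.

```python
def distance_to_line(a, b, x, y):
    """Perpendicular distance from the point (x, y) to the straight line
    y = a*x + b, i.e. to the line a*x - y + b = 0."""
    return abs(a * x - y + b) / (a * a + 1) ** 0.5

def is_vehicle_position(a, b, point, threshold):
    """Judge a detected position: a distance below the threshold is taken
    as a position where a vehicle appears; a larger distance is noise."""
    x, y = point
    return distance_to_line(a, b, x, y) < threshold
```

  • For instance, with a line y = 0.5x + 10 and a threshold of 5 pixels, a detected position (100, 60) lies exactly on the line and is kept, while (100, 200) is far from the line and is rejected as noise.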
  • the information processing apparatus 100 can prevent the position from being erroneously detected as a position where the vehicle is reflected. For this reason, when using the position where the vehicle is reflected for ensuring safety during traveling of the vehicle, the information processing apparatus 100 can accurately ensure safety without using the erroneously detected position.
  • The information processing apparatus 100 can prevent erroneous detection of the position where a vehicle appears even if the accuracy of the pattern matching is poor. Further, since the information processing apparatus 100 determines whether a position detected by pattern matching is a position where a vehicle appears by comparing the detected position with the predetermined straight line, the determination can be performed in a relatively short time. For this reason, the information processing apparatus 100 can improve the accuracy of detecting the position where a vehicle appears without improving the accuracy of the pattern matching, and can suppress an increase in the time required for the detection.
  • the information processing apparatus 100 can detect the position where the vehicle is reflected in a relatively short time in order to ensure safety while the vehicle is running. Further, since the information processing apparatus 100 does not have to improve the accuracy of pattern matching, it is possible to avoid increasing the cost of hardware for performing pattern matching.
  • the information processing apparatus 100 can perform automatic driving or driving assistance based on the position where the vehicle detected with relatively high accuracy is reflected, thereby improving safety.
  • The information processing apparatus 100 can also detect, based on the position where a vehicle is detected with relatively high accuracy, that the distance between vehicles has become closer than a certain distance, issue a warning, and thereby improve safety.
  • FIG. 2 is a block diagram illustrating a hardware configuration example of the information processing apparatus 100.
  • The information processing apparatus 100 includes a CPU (Central Processing Unit) 201, a memory 202, a network I/F (Interface) 203, a recording medium I/F 204, a recording medium 205, and an imaging device 206. The components are connected by a bus 200.
  • the CPU 201 controls the entire information processing apparatus 100.
  • the memory 202 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a flash ROM, and the like. Specifically, for example, a flash ROM or ROM stores various programs, and a RAM is used as a work area of the CPU 201.
  • The program stored in the memory 202 is loaded by the CPU 201 and causes the CPU 201 to execute the coded processing.
  • the memory 202 stores sample data indicating the characteristics of an image obtained by imaging a vehicle, information on a straight line that serves as a reference for a position where the vehicle is reflected, and the like.
  • the network I / F 203 is connected to the network 210 via a communication line, and is connected to another computer via the network 210.
  • the network I / F 203 controls an internal interface with the network 210 and controls data input / output from other computers.
  • a modem or a LAN (Local Area Network) adapter may be employed as the network I / F 203.
  • the recording medium I / F 204 controls reading / writing of data with respect to the recording medium 205 according to the control of the CPU 201.
  • the recording medium I / F 204 is, for example, a disk drive, an SSD (Solid State Drive), a USB (Universal Serial Bus) port, or the like.
  • the recording medium 205 is a non-volatile memory that stores data written under the control of the recording medium I / F 204.
  • the recording medium 205 is, for example, a disk, a semiconductor memory, a USB memory, or the like.
  • the recording medium 205 may be detachable from the information processing apparatus 100.
  • the imaging device 206 captures the outside of the vehicle and acquires image data representing the image captured outside the vehicle.
  • the imaging device 206 images, for example, the front of the vehicle, the rear of the vehicle, the oblique front of the vehicle, the oblique rear of the vehicle, the left direction of the vehicle, or the right direction of the vehicle.
  • The imaging device 206 is, for example, an in-vehicle camera attached around the front mirror of the vehicle and facing the front of the vehicle, an in-vehicle camera embedded in a door portion of the vehicle and facing the rear of the vehicle, or an in-vehicle camera embedded in the bumper portion of the vehicle and facing the left direction of the vehicle.
  • The imaging device 206 may be capable of capturing images while switching the imaging direction, and may capture images in a plurality of directions.
  • the imaging device 206 may not be integrated with the information processing device 100 as long as it can communicate with the information processing device 100.
  • the information processing apparatus 100 may include a plurality of imaging devices 206. In addition to the components described above, the information processing apparatus 100 may include, for example, a keyboard, a mouse, a display, and the like. The information processing apparatus 100 may not include the recording medium I / F 204 and the recording medium 205.
  • the information processing apparatus 100 is an on-vehicle apparatus, for example. Specifically, the information processing apparatus 100 is a drive recorder.
  • FIG. 3 is a block diagram illustrating a functional configuration example of the information processing apparatus 100.
  • the information processing apparatus 100 includes a first storage unit 301, a second storage unit 302, an acquisition unit 303, a detection unit 304, a creation unit 305, a determination unit 306, and an output unit 307.
  • the first storage unit 301 and the second storage unit 302 are realized by storage areas such as the memory 202 and the recording medium 205 shown in FIG.
  • The acquisition unit 303 to the output unit 307 function as a control unit. Specifically, the functions of the acquisition unit 303 to the output unit 307 are realized, for example, by causing the CPU 201 to execute a program stored in a storage area such as the memory 202 or the recording medium 205 illustrated in FIG. 2, or by the network I/F 203.
  • the processing result of each functional unit is stored in a storage area such as the memory 202 and the recording medium 205 shown in FIG.
  • the first storage unit 301 stores, for example, sample data indicating features estimated to be included in a partial image in which a vehicle is shown.
  • the sample data is, for example, a sample image showing a part of the vehicle.
  • the sample data may be, for example, information indicating the number of horizontal or vertical edges included in a sample image showing a part of the vehicle.
  • the imaging unit may be capable of imaging a plurality of directions. The imaging unit is provided in the same vehicle as the information processing apparatus 100.
  • the imaging unit is, for example, the imaging device 206 illustrated in FIG.
  • The first storage unit 301 stores, for example, in accordance with the imaging direction of the imaging unit, a sample image of the part of a vehicle that appears in an image captured in that direction. Specifically, when the imaging unit captures the rear, the first storage unit 301 stores a sample image of the front portion of a vehicle that appears behind in the image captured by the imaging unit. The sample image of the front portion of a vehicle is, for example, a modeled image based on images of the front portions of various vehicles. Thereby, the first storage unit 301 can store the sample data used by the detection unit 304. Further, even when the imaging unit can capture images in a plurality of directions, the first storage unit 301 enables the detection unit 304 to selectively use the sample data corresponding to each direction.
  • the second storage unit 302 stores straight line information.
  • the straight line represents a reference of a position where the vehicle appears in an image newly captured by the imaging unit in the future, and represents a region including a position where the vehicle can be reflected in an image newly captured in the future.
  • the start point or end point of the straight line may be, for example, a point inside the image instead of a point at the end of the image newly captured in the future by the imaging unit.
  • the straight line information is specified based on the position where the vehicle is reflected in each of a plurality of images taken outside the vehicle.
  • the straight line information is specified using, for example, the least square method.
  • the straight line information is, for example, a function indicating a straight line.
  • the plurality of images captured outside the vehicle are, for example, a plurality of images captured in the past by an imaging unit provided in the same vehicle as the information processing apparatus 100.
  • The plurality of images captured outside the vehicle may be, for example, a plurality of images captured in the past by an imaging unit provided in a different vehicle of the same type as the vehicle provided with the information processing apparatus 100. In this case, it is preferable that the imaging unit provided in the same vehicle as the information processing apparatus 100 and the imaging unit provided in the different vehicle have the same imaging directions.
  • The function indicating the straight line is, for example, y = ax + b.
  • x is a coordinate value in the horizontal direction of the image.
  • y is a coordinate value in the vertical direction of the image.
  • a is a coefficient obtained by, for example, the least squares method.
  • b is a coefficient obtained by, for example, the least squares method.
  • the position where the vehicle is reflected is the center coordinates of the partial image where the vehicle is reflected in the image.
  • the center coordinates are expressed by, for example, a combination (x, y) of the coordinate value x in the horizontal direction of the image and the coordinate value y in the vertical direction of the image.
  • The function may have a range or domain set so as to express that the start point or end point of the straight line is a point inside the image.
  • Thereby, the second storage unit 302 enables the determination unit 306 to determine, based on the distance between a given position and the straight line, whether or not that position is a position where a vehicle appears.
  • the second storage unit 302 associates each direction of a plurality of directions that can be imaged by the imaging unit with information on a straight line indicating a reference of a position where the vehicle appears in the image of the direction captured by the imaging unit. You may remember.
  • the second storage unit 302 stores a combination of the coefficient a and the coefficient b for each direction in which the imaging unit can capture an image.
  • Thereby, the second storage unit 302 enables the determination unit 306 to switch among the functions indicating the plurality of straight lines according to the direction in which the imaging unit captures an image. Even when the direction in which the imaging unit captures an image is switched, the second storage unit 302 can suppress a decrease in the determination accuracy of the determination unit 306.
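  • Storing one coefficient pair per imaging direction can be sketched as a simple lookup table. The direction keys and coefficient values below are hypothetical, since the patent does not fix a concrete encoding.

```python
# Hypothetical (a, b) coefficient pairs for y = a*x + b,
# one per direction the imaging unit can capture.
line_by_direction = {
    "front": (0.48, 12.0),
    "rear_right": (-0.35, 240.0),
    "left": (0.10, 180.0),
}

def line_for(direction):
    """Select the reference-line coefficients matching the direction
    in which the imaging unit captured the target image."""
    return line_by_direction[direction]
```
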
  • the acquisition unit 303 acquires a plurality of images taken outside the vehicle.
  • the plurality of images captured outside the vehicle are, for example, a plurality of images that are provided in the same vehicle as the information processing apparatus 100 and are captured by the imaging unit that captures the outside of the vehicle.
  • the plurality of images obtained by imaging the outside of the vehicle may be, for example, a plurality of images that are provided on a vehicle different from the information processing apparatus 100 and captured by an imaging unit that images the outside of the vehicle from the vehicle.
  • the acquisition unit 303 acquires a plurality of images captured by the imaging unit and confirmed to show the vehicle as a plurality of images for specifying a straight line. Thereby, the acquisition unit 303 can acquire a plurality of images on which the creation unit 305 creates straight line information.
  • the detection unit 304 detects a position where the vehicle is reflected in each of the plurality of images acquired by the acquisition unit 303.
  • For example, by pattern matching using the sample image stored in the first storage unit 301, the detection unit 304 detects, in each of the plurality of images, the center coordinates of a partial image similar to the sample image showing a part of a vehicle, as the position of that partial image.
  • the detection unit 304 may detect an edge for each of the plurality of images acquired by the acquisition unit 303. Then, the detection unit 304 may detect the center coordinates of the partial image as the position of the partial image corresponding to the number of edges indicated by the sample data in each image. Accordingly, the detection unit 304 can detect a plurality of positions used when the creation unit 305 creates straight line information.
  • The creation unit 305 creates the straight line information based on the positions of the vehicle detected by the detection unit 304 in each of the plurality of images acquired by the acquisition unit 303, and stores the information in the second storage unit 302. For example, the creation unit 305 creates a function indicating the straight line by the least squares method based on those detected positions, and stores the function in the second storage unit 302.
  • For example, from the number N of coordinates detected by the detection unit 304 and the N detected coordinates (x_i, y_i), the creation unit 305 calculates the combination of the coefficient a and the coefficient b using the following formula (1) and the following formula (2), the standard least-squares solution, and stores the combination in the second storage unit 302.
  • a = (N Σ x_i y_i - Σ x_i Σ y_i) / (N Σ x_i² - (Σ x_i)²) ... (1)
  • b = (Σ y_i - a Σ x_i) / N ... (2)
  • i is 1 to N.
  • Further, the creation unit 305 sets a threshold value based on the distances between the straight line and the positions where the vehicle appears, detected by the detection unit 304 in each of the plurality of images acquired by the acquisition unit 303.
  • the creation unit 305 sets, for example, the maximum value of the distance between the straight line and the N coordinates (x i , y i ) detected by the detection unit 304 as a threshold value.
  • Thereby, the creation unit 305 can set a threshold value such that at least the positions detected from the plurality of images confirmed to show a vehicle are determined to be positions where a vehicle appears.
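  • The creation step can be sketched as fitting y = ax + b to the confirmed vehicle positions by least squares and taking the largest residual distance as the threshold. The function names are introduced here for illustration, and the least-squares formulas are the standard normal-equation solution.

```python
def fit_line(points):
    """Least-squares fit of y = a*x + b to N detected positions
    (x_i, y_i), using the standard normal-equation solution."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def max_distance_threshold(points, a, b):
    """Threshold = the largest perpendicular distance from any confirmed
    vehicle position to the fitted line."""
    return max(abs(a * x - y + b) / (a * a + 1) ** 0.5 for x, y in points)
```

  • For example, positions lying exactly on y = 2x + 1 yield a = 2, b = 1 and a threshold of 0; scattered positions yield a positive threshold equal to the worst residual.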
  • the acquisition unit 303 acquires the target image captured by the imaging unit that images outside the vehicle.
  • the target image is an image that is a target for detecting a position where the vehicle is reflected.
  • For example, in a state where the function indicating the straight line is stored in the second storage unit 302, the acquisition unit 303 acquires the latest image captured by the imaging unit as the target image. Thereby, the acquisition unit 303 can acquire the target from which the position where a vehicle appears is to be detected.
  • the acquisition unit 303 may acquire a target image in any one of a plurality of directions imaged by the imaging unit.
  • For example, in a state where the functions indicating the straight lines are stored in the second storage unit 302, the acquisition unit 303 acquires the latest image captured by the imaging unit as the target image, together with the direction in which the imaging unit captured it.
  • Thereby, the acquisition unit 303 can acquire the target from which the position where a vehicle appears is to be detected, and the determination unit 306 can specify which direction's function indicating a straight line should be used.
  • the detection unit 304 detects the position of the partial image having the same feature as the feature indicated by the sample data in the target image acquired by the acquisition unit 303 based on the sample data indicating the feature of the image obtained by capturing the vehicle.
  • For example, by pattern matching using the sample image stored in the first storage unit 301, the detection unit 304 detects, in the target image, the center coordinates of a partial image similar to the sample image showing a part of a vehicle, as the position of that partial image.
  • the detection unit 304 may detect an edge in the target image. Then, the detection unit 304 may detect the center coordinates of the partial image as the position of the partial image corresponding to the number of edges indicated by the sample data in the target image. As a result, the detection unit 304 can identify a partial image in which the vehicle may be reflected, and can detect a position candidate where the vehicle is reflected.
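  • The detection step can be sketched as a naive sum-of-absolute-differences template search. A real system would use an optimized matcher; the tiny grayscale arrays in the usage below stand in for the target image and the sample image, and the function name is introduced here for the example.

```python
def detect_candidate(image, template):
    """Slide the template over the image and return the center
    coordinates (x, y) of the best-matching window, scored by the sum
    of absolute differences. The result is a candidate position where
    a vehicle may appear, to be checked against the reference line."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = None, None
    for ty in range(ih - th + 1):
        for tx in range(iw - tw + 1):
            score = sum(
                abs(image[ty + j][tx + i] - template[j][i])
                for j in range(th)
                for i in range(tw)
            )
            if best_score is None or score < best_score:
                best_score, best_pos = score, (tx + tw // 2, ty + th // 2)
    return best_pos
```

  • For a 4x4 image whose top-left 2x2 block matches the template exactly, the candidate position is the center of that block, (1, 1).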
  • The determination unit 306 determines, based on the straight line information stored in the second storage unit 302, whether or not the detected position is a position where a vehicle appears. For example, when the distance d between the straight line and the detected position is equal to or less than the threshold value, the determination unit 306 determines that the detected position is a position where a vehicle appears.
  • The determination unit 306 may determine whether or not the detected position is a position where a vehicle appears based on the straight line information stored in the second storage unit 302 in association with one of the directions. For example, the determination unit 306 makes the determination based on the straight line information stored in the second storage unit 302 in association with the direction acquired by the acquisition unit 303. Thereby, the determination unit 306 can use the functions indicating the plurality of straight lines according to the direction in which the imaging unit captured the target image, and can suppress a decrease in detection accuracy.
  • the output unit 307 outputs the determination result of the determination unit 306 or the position determined by the determination unit 306 as the position where the vehicle is reflected.
  • the output format is, for example, display on a display, print output to a printer, transmission to an external device via the network I / F 203, or storage in a storage area such as the memory 202 or the recording medium 205. Accordingly, the output unit 307 can notify the user of the determination result of the determination unit 306 or the position determined by the determination unit 306 as the position where the vehicle is reflected. Further, the output unit 307 can cause the automatic driving device or the like to use the determination result of the determination unit 306 or the position determined by the determination unit 306 as the position where the vehicle is reflected.
  • FIG. 4 is an explanatory diagram illustrating an example of an image 400 captured by the imaging device 206.
  • the imaging device 206 is an in-vehicle camera embedded in the right door portion of the vehicle facing diagonally right rear of the vehicle.
  • the imaging device 206 captures an image of the right rear side of the vehicle and acquires the image 400.
  • Next, a case where the information processing apparatus 100 acquires a target image 500 and detects the position where a vehicle appears in the target image 500 will be described with reference to FIG. 5.
  • FIG. 5 is an explanatory diagram showing an example of detecting the position where the vehicle is shown.
  • It is assumed that the information processing apparatus 100 has created and stored a function indicating the straight line 501 corresponding to the arrow 401 illustrated in FIG. 4.
  • the information processing apparatus 100 acquires the target image 500 obtained by the imaging apparatus 206 imaging the diagonally right rear of the vehicle.
  • the information processing apparatus 100 specifies a partial image 502 similar to the sample image in the target image 500 by pattern matching using the sample image.
  • the information processing apparatus 100 detects the center coordinates 503 of the specified partial image 502 as a candidate for a position where the vehicle is reflected.
  • the information processing apparatus 100 calculates the distance between the straight line 501 and the detected center coordinate 503 based on the stored function of the straight line 501. If the calculated distance is less than the threshold, the information processing apparatus 100 determines that the detected center coordinate 503 is a position where the vehicle is reflected and is not noise. If the calculated distance is equal to or greater than the threshold, the information processing apparatus 100 determines that the detected center coordinate 503 is noise, not a position where the vehicle is reflected.
  • Thereby, even if pattern matching accidentally detects a position where no vehicle actually appears, the information processing apparatus 100 can prevent that position from being erroneously detected as a position where the vehicle is reflected. For this reason, when the position where the vehicle is reflected is used to ensure safety while the vehicle is traveling, the information processing apparatus 100 can ensure safety accurately without relying on an erroneously detected position.
  • In this way, the information processing apparatus 100 can prevent erroneous detection of the position where the vehicle appears even if the accuracy of pattern matching is poor. Further, because the information processing apparatus 100 determines whether or not a position detected by pattern matching is a position where the vehicle is reflected simply by comparing the detected position with the predetermined straight line, the determination can be done in a relatively short time. For this reason, the information processing apparatus 100 can improve the accuracy of detecting the position where the vehicle is reflected without improving the accuracy of the pattern matching, and can suppress an increase in the time required to detect the position where the vehicle is reflected.
  • FIG. 6 is an explanatory diagram illustrating an example of an image 600 captured by the imaging device 206.
  • The imaging device 206 is an in-vehicle camera attached around the front mirror of the vehicle, facing the front of the vehicle. For example, the imaging device 206 captures the front of the vehicle and acquires the image 600.
  • Whether or not a position detected by pattern matching using the sample image from the image 600, captured in front of the vehicle, is likely to be a position where the vehicle is reflected can be evaluated based on its distance from the straight line corresponding to the arrow 601. The information processing apparatus 100 can therefore determine, based on that straight line, whether the position detected by pattern matching is a position where the vehicle is reflected. Next, the description proceeds to FIG. 7.
  • FIG. 7 is an explanatory diagram illustrating an example of an image 700 captured by the imaging device 206.
  • The imaging device 206 is an in-vehicle camera embedded in a bumper portion of the vehicle, facing the left direction of the vehicle.
  • The imaging device 206 captures the area to the left of the bumper portion of the vehicle and acquires the image 700.
  • Also in this case, the information processing apparatus 100 can determine whether a position detected by pattern matching is a position where the vehicle is reflected, based on the corresponding straight line.
  • In this way, the straight line used when determining whether a position detected by pattern matching is a position where the vehicle is reflected differs depending on which direction the imaging device 206 images.
  • Therefore, it is preferable that the information processing apparatus 100 specifies the straight line information used for determining whether a position is one where the vehicle is reflected based on, for example, a plurality of images actually captured in the past by the imaging device 206.
  • When the imaging device 206 can image a plurality of directions by switching its imaging direction, the information processing apparatus 100 may create and store straight line information corresponding to each imaging direction. Thereby, the information processing apparatus 100 can accurately determine whether a position detected by pattern matching is a position where the vehicle is reflected even when the imaging device 206 switches its imaging direction.
  • FIG. 8 is an explanatory diagram illustrating a specific example of creating a function indicating a straight line 801.
  • The information processing apparatus 100 acquires a plurality of images that were captured by the imaging device 206 and in which a vehicle is confirmed to appear, and detects, based on each of those images, the coordinates (x_i, y_i) indicated by the marks in FIG. 8. Here, i is 1 to N, and N is the number of detected coordinates (x_i, y_i).
  • The information processing apparatus 100 calculates the coefficient a and the coefficient b by substituting the number N of coordinates and the detected N coordinates (x_i, y_i) into the above equations (1) and (2), and creates and stores a function indicating the straight line 801. Next, a specific example in which the information processing apparatus 100 determines whether or not a position is one where the vehicle is shown is described with reference to FIGS. 9 and 10.
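Equations (1) and (2) appear earlier in the application and are not reproduced here; under the usual least-squares assumption of fitting y = a*x + b to the N detected coordinates, the coefficient calculation might be sketched as follows (the function name is a hypothetical illustration):

```python
def fit_reference_line(points):
    # Least-squares fit of y = a*x + b through coordinates (x_i, y_i),
    # i = 1..N, detected from images confirmed to show a vehicle.
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    b = (sy - a * sx) / n                          # intercept
    return a, b
```

A least-squares fit can still be computed when some detected coordinates are outliers, which matches the later remark that the function indicating the straight line can be created even in that case.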
  • FIG. 9 and FIG. 10 are explanatory diagrams showing a specific example for determining whether or not the vehicle is in a reflected position.
  • The information processing apparatus 100 acquires a plurality of target images captured by the imaging device 206 and detects, based on each target image, the coordinates (x_i, y_i) indicated by the marks in FIG. 9.
  • Here, i is 1 to N.
  • N is the number of detected coordinates (x_i, y_i).
  • The coordinates (x_i, y_i) indicated by the marks in FIG. 9 may include positions where an object that is not a vehicle was detected by pattern matching.
  • Next, the description proceeds to FIG. 10.
  • The coordinates (x_i, y_i) indicated by the marks in FIG. 10 are the same coordinates as the coordinates (x_i, y_i) indicated by the marks in FIG. 9.
  • The information processing apparatus 100 calculates the length of a perpendicular dropped from each of the coordinates (x_i, y_i) indicated by the marks in FIG. 10 to the straight line 801, thereby calculating the distance between the coordinates (x_i, y_i) and the straight line 801. Then, the information processing apparatus 100 determines whether or not the calculated distance is equal to or greater than the threshold "100".
  • If the calculated distance is less than the threshold, the coordinates (x_i, y_i) are determined to be a position where the vehicle is shown and not to be noise.
  • If the calculated distance is equal to or greater than the threshold, the coordinates (x_i, y_i) are determined not to be a position where the vehicle is reflected but to be noise.
  • Thereby, even if pattern matching accidentally detects a position where no vehicle actually appears, the information processing apparatus 100 can prevent that position from being erroneously detected as a position where the vehicle is reflected. For this reason, when the position where the vehicle is reflected is used to ensure safety while the vehicle is traveling, the information processing apparatus 100 can ensure safety accurately without relying on an erroneously detected position.
  • In this way, the information processing apparatus 100 can prevent erroneous detection of the position where the vehicle appears even if the accuracy of pattern matching is poor. Further, because the information processing apparatus 100 determines whether or not a position detected by pattern matching is a position where the vehicle is reflected simply by comparing the detected position with the predetermined straight line, the determination can be done in a relatively short time. For this reason, the information processing apparatus 100 can improve the accuracy of detecting the position where the vehicle is reflected without improving the accuracy of the pattern matching, and can suppress an increase in the time required to detect the position where the vehicle is reflected.
  • FIG. 11 is an explanatory diagram illustrating an example of determining, with a changed threshold, whether or not a position is one where the vehicle is reflected. The coordinates (x_i, y_i) indicated by the marks in FIG. 11 are the same coordinates as those indicated by the marks in FIG. 10.
  • The information processing apparatus 100 calculates the length of a perpendicular dropped from each of the coordinates (x_i, y_i) indicated by the marks in FIG. 11 to the straight line 801, thereby calculating the distance between the coordinates (x_i, y_i) and the straight line 801. Then, the information processing apparatus 100 determines whether or not the calculated distance is equal to or greater than the threshold "75".
  • In this way, the information processing apparatus 100 can determine whether a position detected by pattern matching is a position where the vehicle is reflected using each of a plurality of threshold values.
  • The information processing apparatus 100 can adjust the range of coordinates determined to be positions where the vehicle is reflected, and not noise, by changing the threshold. For this reason, the information processing apparatus 100 may perform a process of setting the threshold to a preferable value in advance.
  • For example, from among a plurality of threshold values, the information processing apparatus 100 may select and set a threshold at which the ratio of coordinates determined to be noise becomes a preferable ratio. Specifically, the information processing apparatus 100 performs the determination process using the threshold "100", the threshold "75", and so on for a plurality of coordinates detected from a plurality of images in which the vehicle is confirmed to be reflected, and sets a threshold at which all of those coordinates can be determined not to be noise. If all coordinates can be determined not to be noise with either the threshold "100" or the threshold "75", the information processing apparatus 100 sets the smaller threshold.
  • The information processing apparatus 100 may also reset the threshold periodically. For example, the information processing apparatus 100 may periodically perform the determination process using a plurality of threshold values on a plurality of coordinates detected from a plurality of images captured by the imaging device 206 while the vehicle is traveling, and set the threshold accordingly. Specifically, when the threshold "75" is used and the information processing apparatus 100 determines that too many coordinates are classified as noise, the information processing apparatus 100 may set the threshold "100" as the threshold to be used thereafter.
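One way to picture the threshold selection described above (a hypothetical sketch, not taken from this application) is to count, for each candidate threshold, how many confirmed vehicle coordinates would be classified as noise, and keep the smallest candidate that misclassifies none; the function name, candidate set, and ratio parameter are assumptions made for illustration:

```python
def choose_threshold(distances, candidates=(75.0, 100.0), max_noise_ratio=0.0):
    # distances: distances to the reference line for coordinates detected
    # from images in which the vehicle is confirmed to be reflected.
    # Returns the smallest candidate threshold whose noise ratio
    # (fraction of distances >= threshold) is acceptable.
    for t in sorted(candidates):
        noise = sum(1 for d in distances if d >= t)
        if noise / len(distances) <= max_noise_ratio:
            return t
    # Fall back to the loosest candidate when every candidate
    # misclassifies too many coordinates.
    return max(candidates)
```

Rerunning this periodically on distances measured while the vehicle is traveling corresponds to the periodic resetting of the threshold mentioned above.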
  • FIG. 12 is a flowchart showing an example of a calculation processing procedure.
  • First, the information processing apparatus 100 acquires, from a plurality of pieces of image data serving as samples, image data representing an image in which a vehicle is detected to be reflected (step S1201).
  • Next, the information processing apparatus 100 specifies a range in which the vehicle is determined to be reflected on the image represented by the image data (step S1202). Then, the information processing apparatus 100 calculates the center coordinates of the specified range as the coordinates of the vehicle on the image represented by the image data (step S1203).
  • Next, the information processing apparatus 100 determines whether or not to end the acquisition of image data (step S1204).
  • If the acquisition is not to be ended (step S1204: No), the information processing apparatus 100 returns to the process of step S1201.
  • When the acquisition is to be ended (step S1204: Yes), the information processing apparatus 100 obtains a function of an approximate straight line representing a trajectory that the vehicle coordinates can take, based on the calculated vehicle coordinates, and stores the function in the storage unit 120 (step S1205). Then, the information processing apparatus 100 ends the calculation process. Thereby, the information processing apparatus 100 can create a function indicating the approximate straight line.
  • FIG. 13 is a flowchart showing an example of the determination processing procedure.
  • First, the information processing apparatus 100 acquires new image data to be determined (step S1301).
  • Next, the information processing apparatus 100 determines whether or not a vehicle is shown on the image represented by the acquired image data (step S1302).
  • If the vehicle is not shown (step S1302: No), the information processing apparatus 100 returns to the process of step S1301.
  • When the vehicle is shown (step S1302: Yes), the information processing apparatus 100 specifies, based on the acquired image data, a range in which the vehicle is determined to be shown on the image represented by the image data (step S1303). Then, the information processing apparatus 100 calculates the center coordinates of the specified range as a candidate for the coordinates of the vehicle on the image represented by the image data (step S1304).
  • Next, the information processing apparatus 100 determines whether or not the distance between the calculated vehicle coordinate candidate and the approximate straight line represented by the function stored in the storage unit 120 is equal to or greater than the threshold (step S1305). If it is not equal to or greater than the threshold (step S1305: No), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is the vehicle coordinates and is not noise, and outputs the determination result (step S1306). Then, the information processing apparatus 100 returns to the process of step S1301.
  • When the distance is equal to or greater than the threshold (step S1305: Yes), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is actually noise and not the vehicle coordinates, and outputs the determination result (step S1307). Then, the information processing apparatus 100 returns to the process of step S1301. Thereby, the information processing apparatus 100 can prevent erroneous detection of the position where the vehicle is reflected.
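As a rough illustration of one iteration of this determination procedure, the following sketch abstracts the pattern-matching step as a caller-supplied `detect_vehicle_range` callable returning a bounding box or None; that callable, the line representation y = a*x + b, and the return value are all assumptions made for this sketch:

```python
import math

def determination_step(image, detect_vehicle_range, a, b, threshold):
    # detect_vehicle_range(image) is assumed to return the range
    # (left, top, right, bottom) judged by pattern matching to show a
    # vehicle, or None when no vehicle is detected (steps S1302/S1303).
    box = detect_vehicle_range(image)
    if box is None:  # step S1302: No -> wait for the next image
        return None
    left, top, right, bottom = box
    # Step S1304: center coordinates of the range become the candidate.
    cx, cy = (left + right) / 2.0, (top + bottom) / 2.0
    # Step S1305: distance between the candidate and the stored line.
    dist = abs(a * cx - cy + b) / math.sqrt(a * a + 1.0)
    # Steps S1306/S1307: below the threshold -> vehicle, else noise.
    return (cx, cy), dist < threshold
```

In a real loop, the returned result would be output (steps S1306 and S1307) before acquiring the next image data (step S1301).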
  • FIG. 14 is a flowchart showing an example of the adjustment processing procedure.
  • First, the information processing apparatus 100 acquires, from a plurality of pieces of image data serving as samples, image data representing an image in which a vehicle is detected to be reflected (step S1401).
  • Next, the information processing apparatus 100 specifies a range in which the vehicle is determined to be reflected on the image represented by the image data (step S1402). Then, the information processing apparatus 100 calculates the center coordinates of the specified range as a candidate for the coordinates of the vehicle on the image represented by the image data (step S1403).
  • Next, the information processing apparatus 100 determines whether or not the distance between the calculated vehicle coordinate candidate and the approximate straight line represented by the function stored in the storage unit 120 is equal to or greater than the first threshold (step S1404). If it is not equal to or greater than the first threshold (step S1404: No), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is the vehicle coordinates and is not noise, and stores the determination result (step S1405). Then, the information processing apparatus 100 proceeds to the process of step S1407.
  • When the distance is equal to or greater than the first threshold (step S1404: Yes), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is actually noise and not the vehicle coordinates, and stores the determination result (step S1406). Then, the information processing apparatus 100 proceeds to the process of step S1407.
  • In step S1407, the information processing apparatus 100 determines whether or not the distance between the calculated vehicle coordinate candidate and the approximate straight line represented by the function stored in the storage unit 120 is equal to or greater than the second threshold (step S1407). If it is not equal to or greater than the second threshold (step S1407: No), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is the vehicle coordinates and is not noise, and stores the determination result (step S1408). Then, the information processing apparatus 100 proceeds to the process of step S1410.
  • When the distance is equal to or greater than the second threshold (step S1407: Yes), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is actually noise and not the vehicle coordinates, and stores the determination result (step S1409). Then, the information processing apparatus 100 proceeds to the process of step S1410.
  • In step S1410, the information processing apparatus 100 determines whether or not to end the acquisition of image data (step S1410). If the acquisition is not to be ended (step S1410: No), the information processing apparatus 100 returns to the process of step S1401.
  • When the acquisition is to be ended (step S1410: Yes), the information processing apparatus 100 sets, based on the stored determination results, either the first threshold or the second threshold as the threshold to be used in the determination process (step S1411). Then, the information processing apparatus 100 ends the adjustment process. Thereby, the information processing apparatus 100 can set a threshold that accurately determines whether or not a position is one where the vehicle is reflected.
  • As described above, according to the information processing apparatus 100, the target image 110 captured by the imaging device 206 can be acquired, and the position 111 of a partial image having the same features as the sample data can be detected in the target image 110. Further, according to the information processing apparatus 100, whether or not the detected position 111 is a position where the vehicle is reflected can be determined based on the information of the straight line 131 stored in the storage unit 120. Thereby, even if a position 111 is accidentally detected by pattern matching, the information processing apparatus 100 can prevent that position 111 from being erroneously detected as a position where the vehicle is reflected.
  • Further, according to the information processing apparatus 100, when the distance between the straight line 131 and the detected position 111 is determined to be equal to or less than the threshold, the detected position 111 can be determined to be a position where the vehicle is reflected. As a result, the information processing apparatus 100 can evaluate, by the distance from the straight line 131, how likely the detected position 111 is to be a position where the vehicle is reflected. In addition, since the information processing apparatus 100 determines whether or not a position is one where the vehicle is reflected by calculating the distance between the detected position 111 and the straight line 131 and comparing it with the threshold, the determination can be done in a relatively short time.
  • Further, according to the information processing apparatus 100, a plurality of images obtained by imaging the outside of the vehicle can be acquired, and the threshold can be set based on the distance between the straight line 131 and the position where the vehicle appears in each of the acquired images.
  • Thereby, the information processing apparatus 100 can set the threshold used in the process, performed on the target image 110 captured by the imaging device 206, of determining whether or not a position is one where the vehicle is reflected. Since the information processing apparatus 100 sets the threshold based on a plurality of images captured outside the vehicle, it is easy to set a threshold that enables this determination to be made with relatively high accuracy.
  • Further, according to the information processing apparatus 100, a plurality of images captured outside the vehicle can be acquired, information on the straight line 131 can be created based on the position where the vehicle is reflected in each of the acquired images, and the created information on the straight line 131 can be stored in the storage unit 120. Thereby, the information processing apparatus 100 can store the information on the straight line 131 used in the process, performed on the target image 110 captured by the imaging device 206, of determining whether or not a position is one where the vehicle is reflected. Since the information processing apparatus 100 creates the information on the straight line 131 based on a plurality of images captured outside the vehicle, it can create straight line information that enables this determination to be made with relatively high accuracy.
  • Further, according to the information processing apparatus 100, a function indicating the straight line 131 can be created using the least squares method based on the position where the vehicle is reflected in each of a plurality of images captured outside the vehicle, and the created function can be stored in the storage unit 120. Thereby, the information processing apparatus 100 can create the function indicating the straight line 131 even if the positions where the vehicle is reflected in the plurality of images include outliers.
  • Further, according to the information processing apparatus 100, the target image 110 captured in any one of a plurality of directions by the imaging device 206, which is capable of imaging in the plurality of directions, can be acquired, and the position 111 of a partial image having the same features as those indicated by the sample data can be detected in the target image 110.
  • Thereby, the information processing apparatus 100 can determine whether or not the detected position 111 is a position where the vehicle is reflected, even if the imaging direction of the imaging device 206 is variable.
  • The information processing method described in this embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
  • The information processing program described in this embodiment is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer. The information processing program described in this embodiment may also be distributed via a network such as the Internet.

Abstract

An information processing device (100) acquires a target image (110) that has been imaged by an imaging unit. The information processing device (100) detects, on the basis of sample data indicating the features of a captured image of a vehicle, the position (111) of a partial image that has features identical to those indicated by the sample data, in the acquired target image (110). The information processing device (100) determines whether the detected position (111) is a position where the vehicle is pictured on the basis of information about a straight line (131) stored in a storage unit (120).

Description

Information processing apparatus, information processing method, and information processing program
 The present invention relates to an information processing apparatus, an information processing method, and an information processing program.
 Conventionally, there is a technique that stores learning data representing an image of a vehicle and, if an image captured outside the vehicle by an in-vehicle camera contains a partial image similar to the vehicle image represented by the learning data, determines that the partial image shows a vehicle and detects the position where the vehicle is reflected.
 As prior art, for example, there is a technique that detects a vehicle by detecting the center point of the saturated luminance values of the vehicle's lights, and also detects the vehicle by detecting the center point of the vehicle's shadow from a background image of the shooting location and the luminance values of the shadow. There is also, for example, a technique that outputs, based on a captured image, a signal indicating that the host vehicle has reached a target point just before a point where the slope of the road surface ahead of the host vehicle changes to a relatively downward slope with respect to the road surface portion on which the host vehicle is traveling.
JP 2007-94919 A; JP 2014-115980 A
 However, with the conventional technology, there is a possibility that the position of a vehicle is erroneously detected. For example, when an image captured outside the vehicle by an in-vehicle camera shows an object that is similar in shape or color to a vehicle but is not a vehicle, the position of that object may be mistakenly detected as the position of a vehicle.
 In one aspect, an object of the present invention is to provide an information processing apparatus, an information processing method, and an information processing program that can prevent erroneous detection of the position of a vehicle.
 According to one embodiment, an information processing apparatus, an information processing method, and an information processing program are proposed that acquire a target image captured by an imaging unit that images the outside of a vehicle; detect, based on sample data indicating features of an image of a vehicle, the position of a partial image having those features in the acquired target image; and determine whether or not the detected position is a position where a vehicle is reflected, by referring to a storage unit that stores information on a straight line indicating a reference for positions where a vehicle appears in images captured by the imaging unit, the straight line being specified based on the position where a vehicle is reflected in each of a plurality of images captured outside the vehicle.
 According to one aspect of the present invention, it is possible to prevent erroneous detection of the position of a vehicle.
FIG. 1 is an explanatory diagram of an example of the information processing method according to the embodiment.
FIG. 2 is a block diagram illustrating a hardware configuration example of the information processing apparatus 100.
FIG. 3 is a block diagram illustrating a functional configuration example of the information processing apparatus 100.
FIG. 4 is an explanatory diagram illustrating an example of an image 400 captured by the imaging device 206.
FIG. 5 is an explanatory diagram showing an example of detecting the position where the vehicle is shown.
FIG. 6 is an explanatory diagram illustrating an example of an image 600 captured by the imaging device 206.
FIG. 7 is an explanatory diagram illustrating an example of an image 700 captured by the imaging device 206.
FIG. 8 is an explanatory diagram illustrating a specific example of creating a function indicating a straight line 801.
FIG. 9 is an explanatory diagram (part 1) showing a specific example of determining whether or not a position is one where the vehicle is reflected.
FIG. 10 is an explanatory diagram (part 2) showing a specific example of determining whether or not a position is one where the vehicle is reflected.
FIG. 11 is an explanatory diagram illustrating an example of determining, with a changed threshold, whether or not a position is one where the vehicle is reflected.
FIG. 12 is a flowchart illustrating an example of the calculation processing procedure.
FIG. 13 is a flowchart illustrating an example of the determination processing procedure.
FIG. 14 is a flowchart illustrating an example of the adjustment processing procedure.
 Hereinafter, embodiments of an information processing apparatus, an information processing method, and an information processing program according to the present invention will be described in detail with reference to the drawings.
(One Example of the Information Processing Method According to the Embodiment)
 FIG. 1 is an explanatory diagram of an example of the information processing method according to the embodiment. In FIG. 1, an information processing apparatus 100 is a computer provided in a vehicle. The information processing apparatus 100 detects a position where a vehicle is reflected in an image captured by an imaging unit.
 撮像部は、車両に設けられ、車外を撮影する。撮像部は、例えば、車両の前方、車両の後方、車両の斜め前方、車両の斜め後方、車両の左方向、または、車両の右方向などを撮像する。撮像部は、具体的には、車両の前方に向けて車両のフロントミラー周辺に付けられた車載カメラ、車両の斜め後方に向けて車両のドア部分に埋め込まれた車載カメラ、車両の左方向に向けて車両のバンパー部分に埋め込まれた車載カメラなどである。 The imaging unit is provided in the vehicle and images the outside of the vehicle. The imaging unit images, for example, the front of the vehicle, the rear of the vehicle, the oblique front of the vehicle, the oblique rear of the vehicle, the left direction of the vehicle, or the right direction of the vehicle. Specifically, the imaging unit includes an in-vehicle camera attached around the front mirror of the vehicle toward the front of the vehicle, an in-vehicle camera embedded in the door portion of the vehicle toward the rear of the vehicle, and in the left direction of the vehicle. For example, an in-vehicle camera embedded in the bumper part of the vehicle.
 ここで、車両の標本画像を用いたパターンマッチングによって、車載カメラが車外を撮像した画像のうち、車両の標本画像に類似する部分画像を検出することによって、車両が映った位置を検出することが考えられる。しかしながら、車両が映った位置を誤検出する可能性がある。例えば、車載カメラが車外を撮像した画像に、車両と類似した形状の物体であるが、車両ではない物体が映っている場合、その物体が映った位置を車両が映った位置と間違って検出してしまう可能性がある。 Here, it is possible to detect a position where the vehicle is reflected by detecting a partial image similar to the sample image of the vehicle from among the images captured by the vehicle-mounted camera outside the vehicle by pattern matching using the sample image of the vehicle. Conceivable. However, there is a possibility of erroneously detecting the position where the vehicle is reflected. For example, if an in-vehicle camera captures an image of the outside of the vehicle, it is an object that has a shape similar to that of a vehicle, but an object that is not a vehicle is reflected. There is a possibility that.
 これに対し、車両の標本画像を用いたパターンマッチングの精度を向上させることによって、車両が映った位置を誤検出する可能性を低減することが考えられる。しかしながら、パターンマッチングにかかる時間の増大化を招いてしまうことがある。このため、車両走行中の安全確保などのために比較的短時間で車両が映った位置を検出することが望まれる場合には、車両の標本画像を用いたパターンマッチングの精度を向上させることが難しいことがある。また、車両の標本画像を用いたパターンマッチングの精度を向上させるために、パターンマッチングを行うためのハードウェアにかかるコストの増大化を招いてしまうことがある。 On the other hand, it is conceivable to reduce the possibility of erroneously detecting the position where the vehicle is reflected by improving the accuracy of pattern matching using the sample image of the vehicle. However, the time required for pattern matching may increase. For this reason, when it is desired to detect a position where the vehicle is reflected in a relatively short time for ensuring safety during traveling of the vehicle, the accuracy of pattern matching using a sample image of the vehicle can be improved. It can be difficult. In addition, in order to improve the accuracy of pattern matching using a sample image of a vehicle, the cost for hardware for performing pattern matching may increase.
 また、車載カメラが車外を撮像した画像に映っている車線境界線などの白線を検出し、その白線の位置に基づいて、車両が映った位置を検出する精度を向上させることが考えられる。しかしながら、白線がない場合、または、白線が検出しづらい場合などには、車両が映った位置を検出する精度を向上させることができなくなる。 Also, it is conceivable that the in-vehicle camera detects a white line such as a lane boundary line reflected in an image taken outside the vehicle, and based on the position of the white line, it is possible to improve the accuracy of detecting the position where the vehicle is reflected. However, when there is no white line or when it is difficult to detect the white line, the accuracy of detecting the position where the vehicle is reflected cannot be improved.
Accordingly, the present embodiment describes an information processing method that can prevent erroneous detection of the position where a vehicle appears: after detecting, in a target image, the position of a partial image having the same features as sample data, the method determines whether that position is actually a position where a vehicle appears.
In the example of FIG. 1, the information processing apparatus 100 includes a storage unit 120. The storage unit 120 stores information on a straight line 131. The straight line 131 serves as a reference for the position where a vehicle appears in an image 130 to be newly captured by an imaging unit, and represents a region containing the positions where a vehicle can appear in such an image. The imaging unit is provided, for example, in the same vehicle as the information processing apparatus 100.
The start point or end point of the straight line 131 may be, for example, a point inside the image 130 rather than a point on its edge. The information on the straight line 131 is specified based on the positions where vehicles appeared in each of a plurality of images captured outside the vehicle, for example by using the least squares method. The information on the straight line 131 is, for example, a function representing the straight line 131.
The plurality of images captured outside the vehicle are images captured by the above imaging unit. Alternatively, they may be, for example, images captured by an imaging unit provided in a vehicle different from the one in which the information processing apparatus 100 is provided. Specifically, they may be images captured by an imaging unit provided in a vehicle that is of the same model as, but different from, the vehicle in which the information processing apparatus 100 is provided.
(1-1) The information processing apparatus 100 acquires a target image 110 captured by the imaging unit. For example, the information processing apparatus 100 acquires a target image 110 of the area diagonally behind the vehicle, captured by an in-vehicle camera embedded in a door of the vehicle and facing diagonally rearward.
(1-2) Based on sample data indicating features of an image of a vehicle, the information processing apparatus 100 detects, in the acquired target image 110, the position 111 of a partial image having the same features as those indicated by the sample data. The sample data is, for example, a sample image showing part of a vehicle. For example, the information processing apparatus 100 detects the position 111 of a partial image of the target image 110 that is similar to the sample image showing part of a vehicle, by pattern matching using that sample image.
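The pattern matching described in (1-2) can be illustrated with a minimal sketch, assuming grayscale images represented as NumPy arrays and a brute-force sum-of-absolute-differences search; the function name and array shapes are illustrative, not the apparatus's actual implementation.

```python
import numpy as np

def detect_vehicle_position(target, template):
    """Slide the sample (template) image over the target image and
    return the center coordinates (x, y) of the best-matching partial
    image, scored by sum of absolute differences (lower is better)."""
    th, tw = template.shape
    H, W = target.shape
    best_score, best_xy = None, None
    for y0 in range(H - th + 1):
        for x0 in range(W - tw + 1):
            patch = target[y0:y0 + th, x0:x0 + tw]
            score = np.abs(patch - template).sum()
            if best_score is None or score < best_score:
                best_score = score
                # center coordinates of the matched partial image
                best_xy = (x0 + tw // 2, y0 + th // 2)
    return best_xy
```

A practical implementation would typically use an optimized routine rather than this nested loop, but the returned center coordinates play the role of the position 111 in the description above.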
(1-3) Based on the information on the straight line 131 stored in the storage unit 120, the information processing apparatus 100 determines whether the detected position 111 is a position where a vehicle appears. For example, the information processing apparatus 100 determines whether the distance between the straight line 131 and the detected position 111 is greater than or equal to a threshold. If the distance is less than the threshold, the information processing apparatus 100 determines that the detected position 111 is a position where a vehicle appears and is not noise. Conversely, if the distance is greater than or equal to the threshold, the information processing apparatus 100 determines that the detected position 111 is not a position where a vehicle appears but is noise.
In this way, even if pattern matching produces an erroneously detected position, the information processing apparatus 100 can avoid misidentifying it as a position where a vehicle appears. Therefore, when positions where vehicles appear are used to ensure safety while the vehicle is traveling, the information processing apparatus 100 can ensure safety accurately without relying on erroneously detected positions.
Furthermore, the information processing apparatus 100 can avoid erroneously detecting the position where a vehicle appears even when the accuracy of pattern matching is poor. In addition, because the determination of whether a position detected by pattern matching is a position where a vehicle appears is performed by comparing the detected position with a predetermined straight line, it can be completed in a relatively short time. Consequently, the information processing apparatus 100 can improve the accuracy of detecting the position where a vehicle appears without improving the accuracy of pattern matching itself, while also suppressing any increase in the time required for detection.
The information processing apparatus 100 can thus detect the position where a vehicle appears in a relatively short time, for example to ensure safety while the vehicle is traveling. Moreover, since the accuracy of pattern matching itself need not be improved, the cost of the hardware that performs pattern matching need not be increased.
In addition, based on the position where a vehicle appears, detected with relatively high accuracy, the information processing apparatus 100 can perform automatic driving or driving assistance, thereby improving safety. The information processing apparatus 100 can also detect, based on such a position, that the distance between vehicles has become shorter than a certain value and, in that case, warn the driver of the vehicle, again improving safety.
(Hardware configuration example of the information processing apparatus 100)
 Next, a hardware configuration example of the information processing apparatus 100 will be described with reference to FIG. 2.
FIG. 2 is a block diagram illustrating a hardware configuration example of the information processing apparatus 100. In FIG. 2, the information processing apparatus 100 includes a CPU (Central Processing Unit) 201, a memory 202, a network I/F (Interface) 203, a recording medium I/F 204, a recording medium 205, and an imaging device 206. These components are interconnected by a bus 200.
The CPU 201 governs overall control of the information processing apparatus 100. The memory 202 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), and a flash ROM. Specifically, for example, the flash ROM and the ROM store various programs, and the RAM is used as a work area for the CPU 201. A program stored in the memory 202 is loaded into the CPU 201, causing the CPU 201 to execute the coded processing. The memory 202 also stores sample data indicating features of images of vehicles, information on the straight line serving as a reference for the position where a vehicle appears, and the like.
The network I/F 203 is connected to a network 210 through a communication line and is connected to other computers via the network 210. The network I/F 203 serves as the interface between the network 210 and the interior of the apparatus, and controls the input and output of data to and from other computers. For example, a modem or a LAN (Local Area Network) adapter can be adopted as the network I/F 203.
The recording medium I/F 204 controls reading and writing of data to the recording medium 205 under the control of the CPU 201. The recording medium I/F 204 is, for example, a disk drive, an SSD (Solid State Drive), or a USB (Universal Serial Bus) port. The recording medium 205 is a nonvolatile memory that stores data written under the control of the recording medium I/F 204. The recording medium 205 is, for example, a disk, a semiconductor memory, or a USB memory, and may be removable from the information processing apparatus 100.
The imaging device 206 captures images of the outside of the vehicle and acquires image data representing those images. The imaging device 206 captures, for example, the area in front of the vehicle, behind the vehicle, diagonally in front of the vehicle, diagonally behind the vehicle, to the left of the vehicle, or to the right of the vehicle. Specifically, the imaging device 206 is, for example, an in-vehicle camera mounted near the front mirror of the vehicle and facing forward, an in-vehicle camera embedded in a door of the vehicle and facing diagonally rearward, or an in-vehicle camera embedded in a bumper of the vehicle and facing leftward. The imaging device 206 may be capable of switching its imaging direction and capturing images in a plurality of directions. The imaging device 206 need not be integrated with the information processing apparatus 100 as long as it can communicate with the information processing apparatus 100.
The information processing apparatus 100 may include a plurality of imaging devices 206. In addition to the components described above, the information processing apparatus 100 may include, for example, a keyboard, a mouse, and a display. The information processing apparatus 100 need not include the recording medium I/F 204 or the recording medium 205. The information processing apparatus 100 is, for example, an in-vehicle device; specifically, it is a drive recorder.
(Functional configuration example of the information processing apparatus 100)
 Next, a functional configuration example of the information processing apparatus 100 will be described with reference to FIG. 3.
FIG. 3 is a block diagram illustrating a functional configuration example of the information processing apparatus 100. The information processing apparatus 100 includes a first storage unit 301, a second storage unit 302, an acquisition unit 303, a detection unit 304, a creation unit 305, a determination unit 306, and an output unit 307.
The first storage unit 301 and the second storage unit 302 are realized by storage areas such as the memory 202 and the recording medium 205 shown in FIG. 2. The acquisition unit 303 through the output unit 307 function as a control unit. Specifically, their functions are realized by causing the CPU 201 to execute programs stored in storage areas such as the memory 202 and the recording medium 205 shown in FIG. 2, or by the network I/F 203. The processing results of each functional unit are stored in a storage area such as the memory 202 or the recording medium 205 shown in FIG. 2.
The first storage unit 301 stores sample data indicating features of an image of a vehicle, for example sample data indicating features estimated to be possessed by a partial image in which a vehicle appears. The sample data is, for example, a sample image showing part of a vehicle, or information indicating the number of horizontal or vertical edges contained in such a sample image. The imaging unit may be capable of capturing images in a plurality of directions. The imaging unit is provided in the same vehicle as the information processing apparatus 100 and is, for example, the imaging device 206 shown in FIG. 2.
The first storage unit 301 stores, for example, for each direction the imaging unit captures, a sample image of the part of a vehicle that would appear in an image captured in that direction. Specifically, when the imaging unit captures the area behind the vehicle, the captured image will show the front portion of a vehicle behind, so the first storage unit 301 stores a sample image of the front portion of a vehicle. The sample image of the front portion of a vehicle is, for example, an image modeled on images of the front portions of various vehicles. In this way, the first storage unit 301 can hold the sample data used by the detection unit 304, and, even when the imaging unit can capture images in a plurality of directions, the detection unit 304 can selectively use the sample data corresponding to each direction.
The second storage unit 302 stores information on a straight line. The straight line serves as a reference for the position where a vehicle appears in an image to be newly captured by the imaging unit, and represents a region containing the positions where a vehicle can appear in such an image. The start point or end point of the straight line may be, for example, a point inside the image rather than a point on its edge. The information on the straight line is specified based on the positions where vehicles appeared in each of a plurality of images captured outside the vehicle, for example by using the least squares method. The information on the straight line is, for example, a function representing the straight line.
The plurality of images captured outside the vehicle are, for example, images captured in the past by an imaging unit provided in the same vehicle as the information processing apparatus 100. Alternatively, they may be images captured in the past by an imaging unit provided in a vehicle that is of the same model as, but different from, the vehicle in which the information processing apparatus 100 is provided. In that case, the imaging unit provided in the same vehicle as the information processing apparatus 100 and the imaging unit provided in the different vehicle preferably capture the same kind of direction.
The function is expressed, for example, as y = ax + b, where x is the horizontal coordinate in the image, y is the vertical coordinate in the image, and a and b are coefficients obtained, for example, by the least squares method. The position where a vehicle appears is the center coordinates of the partial image in which the vehicle appears, expressed, for example, as a combination (x, y) of the horizontal coordinate x and the vertical coordinate y. The function may have a range or domain set so as to express that the start point or end point of the straight line is a point inside the image. The second storage unit 302 stores, for example, the coefficients a and b of the function y = ax + b representing the straight line. This enables the determination unit 306 to determine, based on the distance between a given position and the straight line, whether that position is a position where a vehicle appears.
The second storage unit 302 may store, in association with each of the plurality of directions the imaging unit can capture, information on the straight line serving as a reference for the position where a vehicle appears in images captured in that direction. For example, for each direction the imaging unit can capture, the second storage unit 302 stores the function y = ax + b representing the straight line used by the determination unit 306 when the imaging unit captures that direction; specifically, it stores a combination of coefficients a and b for each direction. This allows the determination unit 306 to switch among the functions representing the straight lines according to the direction the imaging unit is capturing, and suppresses degradation of the determination accuracy even when the imaging direction is switched.
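The per-direction storage of straight-line information described above can be sketched as a plain mapping from imaging direction to the coefficients (a, b); the direction keys and coefficient values below are hypothetical placeholders, not values from the description.

```python
# Hypothetical direction keys; each maps to the coefficients (a, b) of the
# reference line y = a*x + b used when the imaging unit faces that direction.
line_by_direction = {
    "rear_right": (0.35, 120.0),
    "rear_left": (-0.35, 118.0),
    "front": (0.0, 95.0),
}

def line_for(direction):
    """Return the straight-line coefficients stored for a given direction."""
    return line_by_direction[direction]
```

The determination unit would then look up the coefficients for the direction in which the target image was captured before computing the point-to-line distance.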
The acquisition unit 303 acquires a plurality of images captured outside the vehicle. These are, for example, images captured by an imaging unit that is provided in the same vehicle as the information processing apparatus 100 and captures the outside of the vehicle; alternatively, they may be images captured by an imaging unit that is provided in a different vehicle and captures the outside of that vehicle. The acquisition unit 303 acquires, as the plurality of images for specifying the straight line, images captured by the imaging unit in which a vehicle has been confirmed to appear. In this way, the acquisition unit 303 can obtain the images on which the creation unit 305 bases the information on the straight line.
The detection unit 304 detects the position where a vehicle appears in each of the plurality of images acquired by the acquisition unit 303. For example, by pattern matching using the sample image stored in the first storage unit 301, the detection unit 304 detects, in each image, the center coordinates of the partial image similar to the sample image showing part of a vehicle, as the position of that partial image.
Alternatively, the detection unit 304 may detect edges in each of the plurality of images acquired by the acquisition unit 303, and detect the center coordinates of the partial image whose edge count corresponds to the number of edges indicated by the sample data, as the position of that partial image. In this way, the detection unit 304 can detect the plurality of positions used when the creation unit 305 creates the information on the straight line.
The creation unit 305 creates the information on the straight line based on the positions where vehicles appeared, detected by the detection unit 304 in each of the plurality of images acquired by the acquisition unit 303, and stores that information in the second storage unit 302. For example, the creation unit 305 uses the least squares method to create the function representing the straight line based on those positions, and stores the function in the second storage unit 302.
Specifically, the creation unit 305 calculates the combination of coefficients a and b by substituting the number N of coordinates detected by the detection unit 304 and the N detected coordinates (xi, yi) into the following formulas (1) and (2), where i ranges from 1 to N, and stores the coefficients in the second storage unit 302. In this way, the creation unit 305 can create and store the function y = ax + b representing the straight line used by the determination unit 306.
a = (N Σ xiyi − Σ xi Σ yi) / (N Σ xi² − (Σ xi)²)   (1)

b = (Σ xi² Σ yi − Σ xi Σ xiyi) / (N Σ xi² − (Σ xi)²)   (2)

Here, each Σ denotes the sum over i = 1 to N.
The creation unit 305 sets a threshold based on the distances between the straight line and the positions where vehicles appeared, detected by the detection unit 304 in each of the plurality of images acquired by the acquisition unit 303. For example, the creation unit 305 sets the threshold to the maximum of the distances between the straight line and the N detected coordinates (xi, yi). In this way, the creation unit 305 can set a threshold such that, at least for the positions detected from images in which a vehicle has been confirmed to appear, those positions are determined to be positions where a vehicle appears.
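The fitting of the coefficients a and b by formulas (1) and (2), together with the max-distance threshold described above, can be sketched as follows, assuming the detected vehicle-center coordinates are given as (x, y) pairs; the function names are illustrative.

```python
import math

def fit_line(points):
    """Least-squares fit of y = a*x + b through the detected vehicle
    center coordinates, following formulas (1) and (2)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    denom = n * sxx - sx * sx
    a = (n * sxy - sx * sy) / denom       # formula (1)
    b = (sxx * sy - sx * sxy) / denom     # formula (2)
    return a, b

def point_line_distance(a, b, x, y):
    """Distance from the point (x, y) to the line y = a*x + b."""
    return abs(y - a * x - b) / math.sqrt(1 + a * a)

def set_threshold(points, a, b):
    """Threshold = maximum distance from the fitted line to any of the
    confirmed vehicle positions, as described above."""
    return max(point_line_distance(a, b, x, y) for x, y in points)
```

With this choice of threshold, every position that contributed to the fit lies within the threshold of the line, so confirmed vehicle positions are never rejected.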
The acquisition unit 303 acquires a target image captured by the imaging unit that captures the outside of the vehicle. The target image is the image in which the position where a vehicle appears is to be detected. For example, with the function representing the straight line stored in the second storage unit 302, the acquisition unit 303 acquires the latest image captured by the imaging unit as the target image. In this way, the acquisition unit 303 can obtain the target in which the position where a vehicle appears is to be detected.
The acquisition unit 303 may acquire a target image captured in any one of the plurality of directions the imaging unit can capture. For example, with the function representing the straight line stored in the second storage unit 302, the acquisition unit 303 acquires the latest image captured by the imaging unit as the target image, together with the direction in which the imaging unit captured it. This allows the acquisition unit 303 to obtain the detection target and enables the determination unit 306 to identify which direction's straight-line function should be used.
Based on the sample data indicating features of an image of a vehicle, the detection unit 304 detects, in the target image acquired by the acquisition unit 303, the position of a partial image having the same features as those indicated by the sample data. For example, by pattern matching using the sample image stored in the first storage unit 301, the detection unit 304 detects the center coordinates of the partial image of the target image that is similar to the sample image showing part of a vehicle, as the position of that partial image.
Alternatively, the detection unit 304 may detect edges in the target image and detect the center coordinates of the partial image whose edge count corresponds to the number of edges indicated by the sample data, as the position of that partial image. In this way, the detection unit 304 can identify a partial image in which a vehicle may appear, and can detect a candidate for the position where a vehicle appears.
Based on the information on the straight line stored in the second storage unit 302, the determination unit 306 determines whether the detected position is a position where a vehicle appears. For example, when the determination unit 306 determines that the distance d between the straight line and the detected position is less than or equal to the threshold, it determines that the detected position is a position where a vehicle appears. The distance d is calculated, for example, as d = |y − ax − b| / √(1 + a²). In this way, even if pattern matching produces an erroneously detected position, the determination unit 306 can avoid misidentifying it as a position where a vehicle appears.
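The determination using d = |y − ax − b| / √(1 + a²) can be sketched as follows; the function name is illustrative.

```python
import math

def is_vehicle_position(a, b, x, y, threshold):
    """Judge a detected position (x, y) against the reference line
    y = a*x + b: it is a vehicle position (not noise) when its
    perpendicular distance to the line is at most the threshold."""
    d = abs(y - a * x - b) / math.sqrt(1 + a * a)
    return d <= threshold
```

A position far from the reference line (for example, a vehicle-shaped object detected in an implausible image region) is rejected as noise, while positions near the line are accepted.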
If the imaging unit can capture images in a plurality of directions, the determination unit 306 may determine whether the detected position is a position where a vehicle appears based on the straight-line information stored in the second storage unit 302 in association with one of those directions. For example, the determination unit 306 makes the determination based on the straight-line information stored in association with the direction acquired by the acquisition unit 303. In this way, the determination unit 306 can use the function representing the straight line appropriate to the direction in which the target image was captured, suppressing degradation of the detection accuracy.
The output unit 307 outputs the determination result of the determination unit 306, or the position determined by the determination unit 306 to be a position where a vehicle appears. The output format is, for example, display on a display, print output to a printer, transmission to an external device via the network I/F 203, or storage in a storage area such as the memory 202 or the recording medium 205. The output unit 307 can thereby notify the user of the determination result or of the position determined to be one where a vehicle appears, and can also make the determination result or that position available to an automatic driving device or the like.
(Example of operation of information processing apparatus 100)
 Next, an example of the operation of the information processing apparatus 100 will be described with reference to FIGS. 4 to 11. First, an example of an image 400 captured by the imaging device 206 will be described with reference to FIG. 4.
FIG. 4 is an explanatory diagram illustrating an example of an image 400 captured by the imaging device 206. In the example of FIG. 4, the imaging device 206 is an in-vehicle camera embedded in the right door of the vehicle and facing diagonally to the right rear of the vehicle. The imaging device 206 captures, for example, the area diagonally to the right rear of the vehicle and acquires the image 400.
As shown in FIG. 4, in the image 400 capturing the area diagonally to the right rear of the vehicle, the lane adjacent on the right of the lane in which the vehicle travels tends to appear obliquely, so the lane in which other vehicles travel tends to appear along the direction indicated by the arrow 401. In other words, in the image 400, the region in which a vehicle can exist tends to extend along the direction indicated by the arrow 401. Consequently, in the image 400, the positions at which vehicles appear tend to be distributed around the arrow 401.
It follows that whether a position detected from the image 400 by pattern matching with a sample image is plausible as a position where a vehicle appears can be evaluated based on its distance from the straight line corresponding to the arrow 401. This makes it possible to evaluate whether the detected position falls within the region, extending along the direction indicated by the arrow 401, in which a vehicle can exist.
Next, an example in which the information processing apparatus 100 acquires a target image 500 and detects a position where a vehicle appears in the target image 500 will be described with reference to FIG. 5.
FIG. 5 is an explanatory diagram illustrating an example of detecting a position where a vehicle appears. In the example of FIG. 5, it is assumed that the information processing apparatus 100 has already created and stored the function of the straight line 501 corresponding to the arrow 401 shown in FIG. 4.
The information processing apparatus 100 acquires the target image 500 obtained by the imaging device 206 capturing the area diagonally to the right rear of the vehicle. By pattern matching with a sample image, the information processing apparatus 100 identifies a partial image 502 in the target image 500 that is similar to the sample image, and detects the center coordinates 503 of the identified partial image 502 as a candidate position where a vehicle appears.
Based on the stored function of the straight line 501, the information processing apparatus 100 calculates the distance between the straight line 501 and the detected center coordinates 503. If the calculated distance is less than the threshold, the information processing apparatus 100 determines that the detected center coordinates 503 are a position where a vehicle appears and not noise. If the calculated distance is equal to or greater than the threshold, the information processing apparatus 100 determines that the detected center coordinates 503 are noise rather than a position where a vehicle appears.
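This flow — take the center of the matched partial image, then judge it against the stored line — can be sketched as follows (the bounding box, line coefficients, and threshold are made-up illustrative values, and the pattern-matching step itself is abstracted away):

```python
import math

def center_of(box):
    """Center coordinates of a partial image given as (left, top, right, bottom)."""
    left, top, right, bottom = box
    return ((left + right) / 2.0, (top + bottom) / 2.0)

def classify(box, a, b, threshold):
    """Return 'vehicle' if the center of the matched partial image lies
    within `threshold` of the line y = a*x + b, else 'noise'."""
    x, y = center_of(box)
    d = abs(y - a * x - b) / math.sqrt(1 + a * a)
    return "vehicle" if d < threshold else "noise"
```

For example, with the line y = 0.5x + 10 and threshold 100, a match centered near the line is classified as a vehicle, while a match far from it is rejected as noise.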
In this way, even if pattern matching has erroneously detected a position, the information processing apparatus 100 can avoid erroneously treating that position as a position where a vehicle appears. Consequently, when positions where vehicles appear are used, for example, to ensure safety while the vehicle is traveling, the information processing apparatus 100 can ensure safety accurately without relying on erroneously detected positions.
Moreover, the information processing apparatus 100 can avoid erroneously detecting positions where vehicles appear even when the accuracy of the pattern matching is poor. Since the process of determining whether a position detected by pattern matching is a position where a vehicle appears is a simple comparison between the detected position and a predetermined straight line, it can be performed in a relatively short time. The information processing apparatus 100 can therefore improve the accuracy of detecting positions where vehicles appear, without improving the accuracy of the pattern matching itself, while suppressing any increase in the time required for detection.
Next, examples of images 600 and 700 captured by the imaging device 206 will be described with reference to FIGS. 6 and 7.
FIG. 6 is an explanatory diagram illustrating an example of an image 600 captured by the imaging device 206. In the example of FIG. 6, the imaging device 206 is an in-vehicle camera attached near the front mirror of the vehicle and facing forward. The imaging device 206 captures, for example, the area in front of the vehicle and acquires the image 600.
As shown in FIG. 6, in the image 600 capturing the area in front of the vehicle, the same lane as the one in which the vehicle travels tends to appear ahead, so the lane in which other vehicles travel tends to appear along the direction indicated by the arrow 601. In other words, in the image 600, the region in which a vehicle can exist tends to extend along the direction indicated by the arrow 601. Consequently, in the image 600, the positions at which vehicles appear tend to be distributed around the arrow 601.
It follows that whether a position detected from the image 600 by pattern matching with a sample image is plausible as a position where a vehicle appears can be evaluated based on its distance from the straight line corresponding to the arrow 601. Based on that straight line, the information processing apparatus 100 can then determine whether the position detected by pattern matching is a position where a vehicle appears. The description now turns to FIG. 7.
FIG. 7 is an explanatory diagram illustrating an example of an image 700 captured by the imaging device 206. In the example of FIG. 7, the imaging device 206 is an in-vehicle camera embedded in the bumper of the vehicle and facing to the left. The imaging device 206 captures, for example, the area to the left of the bumper of the vehicle and acquires the image 700.
As shown in FIG. 7, in the image 700 capturing the area to the left of the vehicle's bumper, a lane orthogonal to the traveling direction of the vehicle tends to appear, so the lane in which other vehicles travel tends to appear along the direction indicated by the arrow 701. In other words, in the image 700, the region in which a vehicle can exist tends to extend along the direction indicated by the arrow 701. Consequently, in the image 700, the positions at which vehicles appear tend to be distributed around the arrow 701.
It follows that whether a position detected from the image 700 by pattern matching with a sample image is plausible as a position where a vehicle appears can be evaluated based on its distance from the straight line corresponding to the arrow 701. Based on that straight line, the information processing apparatus 100 can then determine whether the position detected by pattern matching is a position where a vehicle appears.
As these examples show, the straight line used to determine whether a position detected by pattern matching is a position where a vehicle appears differs depending on the direction in which the imaging device 206 captures images. For this reason, it is preferable that the information processing apparatus 100 identify which straight-line information to use for a given imaging device 206, for example from a plurality of images that the imaging device 206 actually captured in the past.
Furthermore, when the imaging device 206 can switch its imaging direction and capture images in a plurality of directions, the information processing apparatus 100 may create and store straight-line information corresponding to each of those directions. The information processing apparatus 100 can then accurately determine whether a position detected by pattern matching is a position where a vehicle appears, even when the imaging device 206 switches its imaging direction.
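A per-direction store of straight-line functions, of the kind the second storage unit 302 is described as holding, might be sketched like this (the direction labels and coefficients are illustrative assumptions, not values from the embodiment):

```python
# Hypothetical mapping from imaging direction to the coefficients (a, b)
# of the line y = a*x + b associated with that direction.
lines_by_direction = {
    "right_rear": (0.45, 120.0),  # cf. arrow 401 in FIG. 4
    "front": (-0.02, 240.0),      # cf. arrow 601 in FIG. 6
    "left": (0.00, 300.0),        # cf. arrow 701 in FIG. 7
}

def line_for(direction):
    """Look up the line to judge against for the current imaging direction."""
    return lines_by_direction[direction]
```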
(Specific example of operation of information processing apparatus 100)
 Next, a specific example of the operation of the information processing apparatus 100 will be described with reference to FIGS. 8 to 10. First, a specific example in which the information processing apparatus 100 creates a function representing a straight line 801 will be described with reference to FIG. 8.
FIG. 8 is an explanatory diagram illustrating a specific example of creating a function representing the straight line 801. In FIG. 8, the information processing apparatus 100 acquires a plurality of images captured by the imaging device 206 in which a vehicle has been confirmed to appear, and detects from each image the coordinates (xi, yi) indicated by the squares (□) in FIG. 8, where i ranges from 1 to N and N is the number of detected coordinates.
The information processing apparatus 100 calculates the combination of the coefficients a and b by substituting the number of coordinates N and the detected N coordinates (xi, yi) into the above equations (1) and (2), thereby creating and storing the function representing the straight line 801. Next, a specific example in which the information processing apparatus 100 determines whether a position is one where a vehicle appears will be described with reference to FIGS. 9 and 10.
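Equations (1) and (2) themselves appear earlier in the description and are not reproduced in this excerpt; assuming they are equivalent to the ordinary least-squares formulas for fitting y = ax + b, the fit can be sketched as:

```python
def fit_line(points):
    """Least-squares fit of y = a*x + b through the detected vehicle
    coordinates. This is the standard closed-form solution; equations
    (1) and (2) in the description are assumed to be equivalent."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

For example, coordinates lying exactly on y = 0.5x + 10 recover a = 0.5 and b = 10.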
FIGS. 9 and 10 are explanatory diagrams illustrating a specific example of determining whether a position is one where a vehicle appears. In FIG. 9, the information processing apparatus 100 acquires a plurality of target images captured by the imaging device 206 and detects from each target image the coordinates (xi, yi) indicated by the squares (□) in FIG. 9, where i ranges from 1 to N and N is the number of detected coordinates. The coordinates indicated by the squares in FIG. 9 may include positions, detected by pattern matching, at which an object other than a vehicle appears. The description now turns to FIG. 10.
In FIG. 10, the coordinates (xi, yi) indicated by the squares (□) are the same as those indicated by the squares in FIG. 9. The information processing apparatus 100 calculates the length of the perpendicular from each of the coordinates (xi, yi) indicated by the squares in FIG. 10 to the straight line 801, thereby obtaining the distance between those coordinates and the straight line 801. The information processing apparatus 100 then determines whether the calculated distance is equal to or greater than the threshold "100".
Here, if the distance is less than the threshold "100", the information processing apparatus 100 determines that the coordinates (xi, yi) are a position where a vehicle appears and not noise. On the other hand, if the distance is equal to or greater than the threshold "100", the information processing apparatus 100 determines that the coordinates (xi, yi) are noise rather than a position where a vehicle appears.
In this way, even if pattern matching has erroneously detected a position, the information processing apparatus 100 can avoid erroneously treating that position as a position where a vehicle appears. Consequently, when positions where vehicles appear are used, for example, to ensure safety while the vehicle is traveling, the information processing apparatus 100 can ensure safety accurately without relying on erroneously detected positions.
Moreover, the information processing apparatus 100 can avoid erroneously detecting positions where vehicles appear even when the accuracy of the pattern matching is poor. Since the process of determining whether a position detected by pattern matching is a position where a vehicle appears is a simple comparison between the detected position and a predetermined straight line, it can be performed in a relatively short time. The information processing apparatus 100 can therefore improve the accuracy of detecting positions where vehicles appear, without improving the accuracy of the pattern matching itself, while suppressing any increase in the time required for detection.
Next, an example in which the information processing apparatus 100 changes the threshold and determines whether a position is one where a vehicle appears will be described with reference to FIG. 11.
FIG. 11 is an explanatory diagram illustrating an example of determining, with a changed threshold, whether a position is one where a vehicle appears. In FIG. 11, the coordinates (xi, yi) indicated by the squares (□) are assumed to be the same as those indicated by the squares in FIG. 10.
The information processing apparatus 100 calculates the length of the perpendicular from each of the coordinates (xi, yi) indicated by the squares in FIG. 11 to the straight line 801, thereby obtaining the distance between those coordinates and the straight line 801. The information processing apparatus 100 then determines whether the calculated distance is equal to or greater than the threshold "75".
Here, if the distance is less than the threshold "75", the information processing apparatus 100 determines that the coordinates (xi, yi) are a position where a vehicle appears and not noise. On the other hand, if the distance is equal to or greater than the threshold "75", the information processing apparatus 100 determines that the coordinates (xi, yi) are noise rather than a position where a vehicle appears. In this way, the information processing apparatus 100 can determine, using each of a plurality of thresholds, whether a position detected by pattern matching is a position where a vehicle appears.
Thus, by changing the threshold, the information processing apparatus 100 can adjust the range of coordinates that are determined to be positions where a vehicle appears rather than noise. The information processing apparatus 100 may therefore perform, in advance, a process of setting the threshold to a preferable value.
For example, the information processing apparatus 100 may select and set, from among a plurality of candidate thresholds, the threshold at which the proportion of coordinates determined to be noise reaches a preferable value. Specifically, the information processing apparatus 100 performs the determination process, using the thresholds "100", "75", and so on, on a plurality of coordinates detected from a plurality of images in which a vehicle has been confirmed to appear, and sets a threshold under which all of those coordinates can be determined not to be noise. If all of the coordinates can be determined not to be noise under both the threshold "100" and the threshold "75", the information processing apparatus 100 sets the smaller threshold.
The information processing apparatus 100 may also reset the threshold periodically. For example, the information processing apparatus 100 may periodically perform the determination process with a plurality of thresholds on a plurality of coordinates detected from a plurality of images captured by the imaging device 206 while the vehicle is traveling, and set the threshold accordingly. Specifically, if the information processing apparatus 100 judges that too many coordinates are determined to be noise when the threshold "75" is used, it may set the threshold "100" as the threshold to be used thereafter.
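The threshold-selection policy just described — prefer the smallest candidate threshold under which no confirmed-vehicle coordinate is judged to be noise — can be sketched as follows (the function name, candidate values, and coordinates are illustrative):

```python
import math

def choose_threshold(confirmed_points, a, b, candidates):
    """Pick the smallest candidate threshold under which every coordinate
    detected from confirmed-vehicle images is judged non-noise.
    Falls back to the largest candidate if none qualifies."""
    def dist(x, y):
        return abs(y - a * x - b) / math.sqrt(1 + a * a)
    for t in sorted(candidates):
        if all(dist(x, y) < t for x, y in confirmed_points):
            return t
    return max(candidates)
```

With the line y = 0.5x + 10, coordinates close to the line allow the smaller threshold 75 to be chosen, while an outlying coordinate forces the larger threshold 100.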
(Example of calculation processing procedure)
 Next, an example of the calculation processing procedure executed by the information processing apparatus 100 will be described with reference to FIG. 12.
FIG. 12 is a flowchart illustrating an example of the calculation processing procedure. In FIG. 12, the information processing apparatus 100 first acquires, from a plurality of pieces of sample image data, image data representing an image in which a vehicle has been detected (step S1201).
Next, based on the acquired image data, the information processing apparatus 100 identifies the range in which a vehicle is judged to appear in the image represented by that image data (step S1202). The information processing apparatus 100 then calculates the center coordinates of the identified range as the coordinates of the vehicle in the image represented by the image data (step S1203).
Next, the information processing apparatus 100 determines whether to end the acquisition of image data (step S1204). If the acquisition is not to be ended (step S1204: No), the information processing apparatus 100 returns to the process of step S1201.
On the other hand, if the acquisition is to be ended (step S1204: Yes), the information processing apparatus 100 obtains, based on the calculated vehicle coordinates, the function of the approximate straight line representing the trajectory that the vehicle coordinates can take, and stores that function in the storage unit 120 (step S1205). The information processing apparatus 100 then ends the calculation process. In this way, the information processing apparatus 100 can create the function representing the approximate straight line.
(Example of determination processing procedure)
 Next, an example of the determination processing procedure executed by the information processing apparatus 100 will be described with reference to FIG. 13.
FIG. 13 is a flowchart illustrating an example of the determination processing procedure. In FIG. 13, the information processing apparatus 100 first acquires new image data to be judged (step S1301). Next, the information processing apparatus 100 determines whether a vehicle appears in the image represented by the acquired image data (step S1302). If no vehicle appears (step S1302: No), the information processing apparatus 100 returns to the process of step S1301.
On the other hand, if a vehicle appears (step S1302: Yes), the information processing apparatus 100 identifies, based on the acquired image data, the range in which a vehicle is judged to appear in the image represented by that image data (step S1303). The information processing apparatus 100 then calculates the center coordinates of the identified range as a candidate for the coordinates of the vehicle in the image represented by the image data (step S1304).
Next, the information processing apparatus 100 determines whether the distance between the calculated candidate vehicle coordinates and the approximate straight line represented by the function stored in the storage unit 120 is equal to or greater than the threshold (step S1305). If the distance is not equal to or greater than the threshold (step S1305: No), the information processing apparatus 100 determines that the calculated candidate is the vehicle's coordinates and not noise, and outputs the determination result (step S1306). The information processing apparatus 100 then returns to the process of step S1301.
On the other hand, if the distance is equal to or greater than the threshold (step S1305: Yes), the information processing apparatus 100 determines that the calculated candidate is actually noise rather than the vehicle's coordinates, and outputs the determination result (step S1307). The information processing apparatus 100 then returns to the process of step S1301. In this way, the information processing apparatus 100 can prevent erroneous detection of positions where vehicles appear.
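The FIG. 13 loop can be sketched end-to-end as follows (an illustrative sketch only: bounding boxes stand in for the detection step, and the coefficients and threshold are made-up values; step numbers are noted in comments):

```python
import math

def judgment_procedure(detections, a, b, threshold):
    """One pass of the FIG. 13-style loop. Each detection is a bounding
    box (left, top, right, bottom) from pattern matching, or None when
    no vehicle-like region was found in that image.
    Returns a list of (center, verdict) pairs."""
    results = []
    for box in detections:
        if box is None:          # S1302: no vehicle appears in this image
            continue
        left, top, right, bottom = box
        cx = (left + right) / 2.0                             # S1304
        cy = (top + bottom) / 2.0
        d = abs(cy - a * cx - b) / math.sqrt(1 + a * a)       # S1305
        verdict = "vehicle" if d < threshold else "noise"     # S1306/S1307
        results.append(((cx, cy), verdict))
    return results
```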
(Example of adjustment processing procedure)
 Next, an example of the adjustment processing procedure executed by the information processing apparatus 100 will be described with reference to FIG. 14.
FIG. 14 is a flowchart illustrating an example of the adjustment processing procedure. In FIG. 14, the information processing apparatus 100 first acquires, from a plurality of pieces of sample image data, image data representing an image in which a vehicle has been detected (step S1401).
Next, based on the acquired image data, the information processing apparatus 100 identifies the range in which a vehicle is judged to appear in the image represented by that image data (step S1402). The information processing apparatus 100 then calculates the center coordinates of the identified range as a candidate for the coordinates of the vehicle in the image represented by the image data (step S1403).
Next, the information processing apparatus 100 determines whether the distance between the calculated candidate vehicle coordinates and the approximate straight line represented by the function stored in the storage unit 120 is equal to or greater than the first threshold (step S1404). If the distance is not equal to or greater than the first threshold (step S1404: No), the information processing apparatus 100 determines that the calculated candidate is the vehicle's coordinates and not noise, and stores the determination result (step S1405). The information processing apparatus 100 then proceeds to the process of step S1407.
On the other hand, if the distance is equal to or greater than the first threshold (step S1404: Yes), the information processing apparatus 100 determines that the calculated candidate is actually noise rather than the vehicle's coordinates, and stores the determination result (step S1406). The information processing apparatus 100 then proceeds to the process of step S1407.
 In step S1407, the information processing apparatus 100 determines whether the distance between the calculated candidate vehicle coordinates and the approximate straight line represented by the function stored in the storage unit 120 is equal to or greater than a second threshold (step S1407). If the distance is less than the second threshold (step S1407: No), the information processing apparatus 100 determines that the calculated candidate represents actual vehicle coordinates rather than noise, stores the determination result (step S1408), and proceeds to step S1410.
 If, on the other hand, the distance is equal to or greater than the second threshold (step S1407: Yes), the information processing apparatus 100 determines that the calculated candidate is noise rather than actual vehicle coordinates, stores the determination result (step S1409), and proceeds to step S1410.
 In step S1410, the information processing apparatus 100 determines whether to end the acquisition of image data (step S1410). If the acquisition is not to be ended (step S1410: No), the information processing apparatus 100 returns to step S1401.
 If, on the other hand, the acquisition is to be ended (step S1410: Yes), the information processing apparatus 100 selects, based on the stored determination results, either the first threshold or the second threshold as the threshold to be used in the judgment processing (step S1411), and then ends the adjustment processing. The information processing apparatus 100 can thereby set a threshold that accurately determines whether a detected position is a position where a vehicle appears.
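 Steps S1401 to S1411 above can be summarized in code. Purely as an illustrative sketch (the embodiment prescribes no implementation; the slope/intercept line form y = ax + b, the function names, and the rule of choosing the threshold whose decisions best match the labelled samples are all assumptions):

```python
import math

def point_line_distance(x, y, a, b):
    # Distance from point (x, y) to the line y = a*x + b,
    # written as a*x - y + b = 0.
    return abs(a * x - y + b) / math.sqrt(a * a + 1.0)

def adjust_threshold(samples, a, b, t1, t2):
    """Pick the threshold (t1 or t2) whose noise/vehicle decisions
    agree with the labelled sample images more often.

    samples: list of ((x, y), is_vehicle) pairs, where (x, y) is the
    centre of a range in which a vehicle was detected (steps S1402
    and S1403) and is_vehicle is the ground-truth label.
    """
    correct = {t1: 0, t2: 0}
    for (x, y), is_vehicle in samples:
        d = point_line_distance(x, y, a, b)   # steps S1404 / S1407
        for t in (t1, t2):
            judged_vehicle = d < t            # below threshold -> vehicle
            if judged_vehicle == is_vehicle:  # steps S1405/S1406, S1408/S1409
                correct[t] += 1
    return t1 if correct[t1] >= correct[t2] else t2   # step S1411
```

Here the stored per-sample determination results are reduced to agreement counts; the embodiment leaves the exact selection criterion of step S1411 open.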
 As described above, the information processing apparatus 100 acquires the target image 110 captured by the imaging device 206 and detects, within the target image 110, the position 111 of a partial image having the same features as those indicated by the sample data. The information processing apparatus 100 also determines, based on the information on the straight line 131 stored in the storage unit 120, whether the detected position 111 is a position where a vehicle appears. Consequently, even if pattern matching erroneously detects a position 111, the information processing apparatus 100 can avoid misidentifying that position 111 as a position where a vehicle appears.
 The information processing apparatus 100 determines that the detected position 111 is a position where a vehicle appears when it determines that the distance between the straight line 131 and the detected position 111 is equal to or less than the threshold. The information processing apparatus 100 can thus use the distance to the straight line 131 to evaluate how plausible it is that the detected position 111 is a position where a vehicle appears. Furthermore, because the determination requires only calculating the distance between the detected position 111 and the straight line 131 and comparing that distance with the threshold, it can be performed in a relatively short time.
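 As a minimal sketch of this distance test (assuming the straight line 131 is stored as a slope/intercept pair y = ax + b; the function name is hypothetical):

```python
def is_vehicle_position(pos, a, b, threshold):
    """Judge whether a detected position lies within `threshold`
    (in pixels) of the reference line y = a*x + b."""
    x, y = pos
    # Perpendicular distance from (x, y) to a*x - y + b = 0.
    d = abs(a * x - y + b) / (a * a + 1.0) ** 0.5
    return d <= threshold
```

Each detection costs only one distance computation and one comparison, consistent with the "relatively short time" remark above.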
 The information processing apparatus 100 also acquires a plurality of images captured of the outside of the vehicle and sets the threshold based on the distances between the straight line 131 and the positions where a vehicle appears in each of the acquired images. The information processing apparatus 100 can thereby set the threshold used when determining whether a position detected in the target image 110 captured by the imaging device 206 is a position where a vehicle appears. Because the threshold is derived from a plurality of images captured of the outside of the vehicle, it tends to discriminate vehicle positions with relatively high accuracy.
 The information processing apparatus 100 also acquires a plurality of images captured of the outside of the vehicle, creates the information on the straight line 131 based on the positions where a vehicle appears in each of the acquired images, and stores the created information in the storage unit 120. The information processing apparatus 100 can thereby store the information on the straight line 131 used when determining whether a position detected in the target image 110 captured by the imaging device 206 is a position where a vehicle appears. Because the information on the straight line 131 is derived from a plurality of images captured of the outside of the vehicle, it tends to discriminate vehicle positions with relatively high accuracy.
 The information processing apparatus 100 creates a function representing the straight line 131 using the least squares method, based on the positions where a vehicle appears in each of the plurality of images captured of the outside of the vehicle, and stores the created function in the storage unit 120. The information processing apparatus 100 can thereby create a function representing the straight line 131 even when some of those positions are outliers.
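 An ordinary least-squares fit of y = ax + b to the detected vehicle positions suffices here. A self-contained sketch (function and variable names are assumptions):

```python
def fit_line(points):
    """Least-squares fit of y = a*x + b through (x, y) points.
    Returns the pair (a, b) representing the straight line."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    # Normal equations of ordinary least squares.
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

The embodiment specifies the least squares method, which yields a fit even when individual positions are outliers; more outlier-resistant fits exist but are not what the text describes.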
 The information processing apparatus 100 acquires a target image 110 of one of a plurality of directions captured by the imaging device 206, which is capable of imaging in a plurality of directions, and detects, within the target image 110, the position 111 of a partial image having the same features as those indicated by the sample data. The information processing apparatus 100 also determines, based on the information on the straight line 131 stored in the storage unit 120 in association with that direction, whether the detected position 111 is a position where a vehicle appears. The information processing apparatus 100 can thus make this determination even when the imaging direction of the imaging device 206 is variable.
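 The per-direction variant only adds a lookup keyed on the imaging direction. A sketch assuming a dictionary of slope/intercept pairs (all names hypothetical):

```python
def judge_in_direction(direction, pos, lines_by_direction, threshold):
    """Fetch the reference line stored in association with the current
    imaging direction and apply the same distance-to-line test."""
    a, b = lines_by_direction[direction]  # line y = a*x + b for this direction
    x, y = pos
    d = abs(a * x - y + b) / (a * a + 1.0) ** 0.5
    return d <= threshold
```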
 The information processing method described in this embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation. The information processing program described in this embodiment is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer. The information processing program may also be distributed via a network such as the Internet.
DESCRIPTION OF SYMBOLS
100 Information processing apparatus
110, 500 Target image
111 Position
120 Storage unit
130, 400, 600, 700 Image
131, 501, 801 Straight line
200 Bus
201 CPU
202 Memory
203 Network I/F
204 Recording medium I/F
205 Recording medium
206 Imaging device
210 Network
301 First storage unit
302 Second storage unit
303 Acquisition unit
304 Detection unit
305 Creation unit
306 Judgment unit
307 Output unit

Claims (8)

  1.  An information processing apparatus comprising:
     a storage unit that stores information on a straight line indicating a reference for positions at which a vehicle appears in images captured by an imaging unit that images the outside of the vehicle, the straight line being specified based on positions at which a vehicle appears in each of a plurality of images captured of the outside of the vehicle; and
     a control unit that acquires a target image captured by the imaging unit, detects, based on sample data indicating a feature of an image of a vehicle, a position of a partial image having the feature in the acquired target image, and determines, based on the straight line information stored in the storage unit, whether the detected position is a position at which a vehicle appears.
  2.  The information processing apparatus according to claim 1, wherein the control unit determines that the detected position is a position at which a vehicle appears when it determines that the distance between the straight line and the detected position is equal to or less than a threshold.
  3.  The information processing apparatus according to claim 2, wherein the control unit further acquires the plurality of images and sets the threshold based on the distances between the straight line and the positions at which a vehicle appears in each of the acquired plurality of images.
  4.  The information processing apparatus according to any one of claims 1 to 3, wherein the control unit further acquires the plurality of images, creates the straight line information based on the positions at which a vehicle appears in each of the acquired plurality of images, and stores the created straight line information in the storage unit.
  5.  The information processing apparatus according to claim 4, wherein the control unit creates a function representing the straight line using the least squares method, based on the positions at which a vehicle appears in each of the acquired plurality of images, and stores the created function in the storage unit.
  6.  The information processing apparatus according to any one of claims 1 to 5, wherein:
     the imaging unit is capable of imaging in a plurality of directions;
     the storage unit stores, in association with each of the plurality of directions, information on a straight line indicating a reference for positions at which a vehicle appears in images of that direction captured by the imaging unit; and
     the control unit acquires a target image of one of the plurality of directions captured by the imaging unit, detects a position of a partial image having the feature in the acquired target image, and determines, based on the straight line information stored in the storage unit in association with that direction, whether the detected position is a position at which a vehicle appears.
  7.  An information processing method in which a computer executes a process comprising:
     acquiring a target image captured by an imaging unit that images the outside of a vehicle;
     detecting, based on sample data indicating a feature of an image of a vehicle, a position of a partial image having the feature in the acquired target image; and
     determining whether the detected position is a position at which a vehicle appears, by referring to a storage unit that stores information on a straight line indicating a reference for positions at which a vehicle appears in images captured by the imaging unit, the straight line being specified based on positions at which a vehicle appears in each of a plurality of images captured of the outside of the vehicle.
  8.  An information processing program for causing a computer to execute a process comprising:
     acquiring a target image captured by an imaging unit that images the outside of a vehicle;
     detecting, based on sample data indicating a feature of an image of a vehicle, a position of a partial image having the feature in the acquired target image; and
     determining whether the detected position is a position at which a vehicle appears, by referring to a storage unit that stores information on a straight line indicating a reference for positions at which a vehicle appears in images captured by the imaging unit, the straight line being specified based on positions at which a vehicle appears in each of a plurality of images captured of the outside of the vehicle.
PCT/JP2017/005573 2017-02-15 2017-02-15 Information processing device, information processing method, and information processing program WO2018150496A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/005573 WO2018150496A1 (en) 2017-02-15 2017-02-15 Information processing device, information processing method, and information processing program


Publications (1)

Publication Number Publication Date
WO2018150496A1 true WO2018150496A1 (en) 2018-08-23

Family

ID=63170545


Country Status (1)

Country Link
WO (1) WO2018150496A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111915638A (en) * 2019-05-09 2020-11-10 东芝泰格有限公司 Tracking device, information processing method, readable storage medium, and electronic apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006053756A (en) * 2004-08-11 2006-02-23 Tokyo Institute Of Technology Object detector
JP2008123462A (en) * 2006-11-16 2008-05-29 Hitachi Ltd Object detector
JP2014021510A (en) * 2012-07-12 2014-02-03 Jvc Kenwood Corp Information processor and information processing method, and, program



Legal Events

Code  Description
121   EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17897023; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
122   EP: PCT application non-entry in European phase (Ref document number: 17897023; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: JP)