WO2018150496A1 - Information processing device, information processing method, and information processing program - Google Patents

Information processing device, information processing method, and information processing program Download PDF

Info

Publication number
WO2018150496A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information processing
processing apparatus
image
reflected
Prior art date
Application number
PCT/JP2017/005573
Other languages
English (en)
Japanese (ja)
Inventor
知奈美 木下
陽介 中畑
Original Assignee
富士通周辺機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通周辺機株式会社
Priority to PCT/JP2017/005573
Publication of WO2018150496A1

Links

Images

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems

Definitions

  • the present invention relates to an information processing apparatus, an information processing method, and an information processing program.
  • In one conventional technique, learning data representing an image of a vehicle is stored, and if an image captured outside the vehicle by an in-vehicle camera contains a partial image similar to the vehicle image represented by the learning data, it is determined that a vehicle appears in that partial image.
  • Other techniques detect a vehicle by detecting the center point at which the luminance of the vehicle's lights is saturated, or by detecting the center point of the vehicle's shadow from a background image of the shooting location and the luminance of the shadow.
  • Another technique outputs a signal indicating that the host vehicle has arrived at a target point just before the point where the inclination of the road surface ahead, relative to the road surface portion on which the host vehicle is traveling, changes to a downward slope.
  • However, with the conventional techniques, the position of a vehicle may be erroneously detected. For example, when an in-vehicle camera captures an object outside the vehicle that has a shape or color similar to that of a vehicle but is not a vehicle, the position of that object is mistakenly detected as the vehicle position.
  • In one aspect, an object of the present invention is to provide an information processing apparatus, an information processing method, and an information processing program that can prevent erroneous detection of the position of a vehicle.
  • According to one aspect, a target image captured by an imaging unit that images the outside of a vehicle is acquired, and the position of a partial image having the features indicated by sample data representing features of an image of a vehicle is detected in the acquired target image. Then, with reference to a storage unit that stores information on a straight line serving as a reference for the position at which a vehicle appears in images captured by the imaging unit, specified based on the positions at which the vehicle appears in each of a plurality of images captured outside the vehicle, it is determined whether the detected position is a position at which a vehicle appears. An information processing apparatus, an information processing method, and an information processing program that perform these operations are proposed.
  • FIG. 1 is an explanatory diagram of an example of the information processing method according to the embodiment.
  • FIG. 2 is a block diagram illustrating a hardware configuration example of the information processing apparatus 100.
  • FIG. 3 is a block diagram illustrating a functional configuration example of the information processing apparatus 100.
  • FIG. 4 is an explanatory diagram illustrating an example of an image 400 captured by the imaging device 206.
  • FIG. 5 is an explanatory diagram showing an example of detecting the position where the vehicle is shown.
  • FIG. 6 is an explanatory diagram illustrating an example of an image 600 captured by the imaging device 206.
  • FIG. 7 is an explanatory diagram illustrating an example of an image 700 captured by the imaging device 206.
  • FIG. 8 is an explanatory diagram illustrating a specific example of creating a function indicating a straight line 801.
  • FIG. 9 is an explanatory diagram (part 1) illustrating a specific example of determining whether or not the vehicle is in a reflected position.
  • FIG. 10 is an explanatory diagram (part 2) illustrating a specific example of determining whether or not the vehicle is in a reflected position.
  • FIG. 11 is an explanatory diagram illustrating an example of determining whether or not a position is one where the vehicle appears when the threshold is changed.
  • FIG. 12 is a flowchart illustrating an example of a calculation processing procedure.
  • FIG. 13 is a flowchart illustrating an example of the determination processing procedure.
  • FIG. 14 is a flowchart illustrating an example of the adjustment processing procedure.
  • FIG. 1 is an explanatory diagram of an example of the information processing method according to the embodiment.
  • an information processing apparatus 100 is a computer provided in a vehicle.
  • the information processing apparatus 100 detects a position where the vehicle is reflected in the image captured by the imaging unit.
  • the imaging unit is provided in the vehicle and images the outside of the vehicle.
  • the imaging unit images, for example, the front of the vehicle, the rear of the vehicle, the oblique front of the vehicle, the oblique rear of the vehicle, the left direction of the vehicle, or the right direction of the vehicle.
  • The imaging unit is, for example, an in-vehicle camera attached around the front mirror of the vehicle and facing the front of the vehicle, an in-vehicle camera embedded in a door portion and facing the rear of the vehicle, or an in-vehicle camera embedded in a bumper portion and facing the left of the vehicle.
  • It is also possible to detect a white line, such as a lane boundary line, appearing in an image taken outside the vehicle and, based on the position of the white line, to improve the accuracy of detecting the position at which a vehicle appears.
  • In the present embodiment, an information processing method is described in which, after the position of a partial image having the same features as the sample data is detected in the target image, it is determined whether that position is a position at which a vehicle appears, thereby preventing erroneous detection of the position at which a vehicle appears.
  • the information processing apparatus 100 includes a storage unit 120.
  • the storage unit 120 stores information on the straight line 131.
  • The straight line 131 serves as a reference for the position at which a vehicle appears in an image 130 newly captured by the imaging unit in the future, and represents a region that includes the positions where a vehicle can appear in such an image 130.
  • the imaging unit is provided in the same vehicle as the information processing apparatus 100, for example.
  • the start point or end point of the straight line 131 may be, for example, a point inside the image 130 instead of a point at the end of the image 130 newly captured by the imaging unit in the future.
  • the information on the straight line 131 is specified based on the position where the vehicle is reflected in each of a plurality of images taken outside the vehicle.
  • the information on the straight line 131 is specified using, for example, the least square method.
  • the information on the straight line 131 is a function indicating the straight line 131, for example.
  • the plurality of images taken outside the vehicle are a plurality of images taken by the imaging unit.
  • The plurality of images captured outside the vehicle may instead be, for example, a plurality of images captured by an imaging unit provided in a vehicle that is different from, but of the same type as, the vehicle provided with the information processing apparatus 100.
  • the information processing apparatus 100 acquires the target image 110 captured by the imaging unit.
  • The information processing apparatus 100 acquires, for example, a target image 110 of the area diagonally behind the vehicle, captured by an in-vehicle camera embedded in a door portion of the vehicle and facing the rear of the vehicle.
  • Based on sample data indicating the features of an image of a vehicle, the information processing apparatus 100 detects, in the acquired target image 110, the position 111 of a partial image having the same features as those indicated by the sample data.
  • the sample data is, for example, a sample image showing a part of the vehicle.
  • For example, by pattern matching using a sample image showing a part of a vehicle, the information processing apparatus 100 detects the position 111 of a partial image in the target image 110 that is similar to the sample image.
  • The information processing apparatus 100 determines, based on the information on the straight line 131 stored in the storage unit 120, whether the detected position 111 is a position at which a vehicle appears. For example, the information processing apparatus 100 determines whether the distance between the straight line 131 and the detected position 111 is greater than or equal to a threshold value. If the distance is less than the threshold value, the information processing apparatus 100 determines that the detected position 111 is a position at which a vehicle appears and is not noise. If the distance is greater than or equal to the threshold value, the information processing apparatus 100 determines that the detected position 111 is not a position at which a vehicle appears but is noise. A minimal sketch of this distance-based check is shown below.
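  • The following Python sketch illustrates the distance check described above. The line form y = a*x + b, the threshold value, and the example numbers are assumptions for illustration and are not taken from the patent text.

```python
import math

def is_vehicle_position(x, y, a, b, threshold):
    """Check whether a detected position (x, y) lies close enough to the
    reference line y = a * x + b to be treated as a vehicle position."""
    # Perpendicular distance from (x, y) to the line a*x - y + b = 0.
    distance = abs(a * x - y + b) / math.sqrt(a * a + 1.0)
    # Below the threshold: treated as a vehicle position; otherwise noise.
    return distance < threshold

# Example usage with made-up values.
print(is_vehicle_position(x=320, y=250, a=-0.4, b=380.0, threshold=100.0))
```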
  • the information processing apparatus 100 can prevent the position from being erroneously detected as a position where the vehicle is reflected. For this reason, when using the position where the vehicle is reflected for ensuring safety during traveling of the vehicle, the information processing apparatus 100 can accurately ensure safety without using the erroneously detected position.
  • The information processing apparatus 100 can also prevent erroneous detection of the position at which a vehicle appears even if the accuracy of pattern matching is poor. Further, since the information processing apparatus 100 determines whether a position detected by pattern matching is a position at which a vehicle appears simply by comparing the detected position with the predetermined straight line, this determination can be performed in a relatively short time. For this reason, the information processing apparatus 100 can improve the accuracy of detecting the position at which a vehicle appears without improving the accuracy of the pattern matching, and can suppress an increase in the time required to detect that position.
  • the information processing apparatus 100 can detect the position where the vehicle is reflected in a relatively short time in order to ensure safety while the vehicle is running. Further, since the information processing apparatus 100 does not have to improve the accuracy of pattern matching, it is possible to avoid increasing the cost of hardware for performing pattern matching.
  • the information processing apparatus 100 can perform automatic driving or driving assistance based on the position where the vehicle detected with relatively high accuracy is reflected, thereby improving safety.
  • the information processing apparatus 100 can detect that the distance between the vehicles has approached a certain distance or more based on the position where the vehicle is detected with relatively high accuracy. A warning can be issued and safety can be improved.
  • FIG. 2 is a block diagram illustrating a hardware configuration example of the information processing apparatus 100.
  • The information processing apparatus 100 includes a CPU (Central Processing Unit) 201, a memory 202, a network I/F (Interface) 203, a recording medium I/F 204, a recording medium 205, and an imaging device 206. The components are connected to one another by a bus 200.
  • the CPU 201 controls the entire information processing apparatus 100.
  • the memory 202 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a flash ROM, and the like. Specifically, for example, a flash ROM or ROM stores various programs, and a RAM is used as a work area of the CPU 201.
  • the program stored in the memory 202 is loaded on the CPU 201 to cause the CPU 201 to execute the coded process.
  • the memory 202 stores sample data indicating the characteristics of an image obtained by imaging a vehicle, information on a straight line that serves as a reference for a position where the vehicle is reflected, and the like.
  • the network I / F 203 is connected to the network 210 via a communication line, and is connected to another computer via the network 210.
  • the network I / F 203 controls an internal interface with the network 210 and controls data input / output from other computers.
  • a modem or a LAN (Local Area Network) adapter may be employed as the network I / F 203.
  • the recording medium I / F 204 controls reading / writing of data with respect to the recording medium 205 according to the control of the CPU 201.
  • the recording medium I / F 204 is, for example, a disk drive, an SSD (Solid State Drive), a USB (Universal Serial Bus) port, or the like.
  • the recording medium 205 is a non-volatile memory that stores data written under the control of the recording medium I / F 204.
  • the recording medium 205 is, for example, a disk, a semiconductor memory, a USB memory, or the like.
  • the recording medium 205 may be detachable from the information processing apparatus 100.
  • the imaging device 206 captures the outside of the vehicle and acquires image data representing the image captured outside the vehicle.
  • the imaging device 206 images, for example, the front of the vehicle, the rear of the vehicle, the oblique front of the vehicle, the oblique rear of the vehicle, the left direction of the vehicle, or the right direction of the vehicle.
  • The imaging device 206 is, for example, an in-vehicle camera attached around the front mirror of the vehicle and facing the front of the vehicle, an in-vehicle camera embedded in a door portion and facing the rear of the vehicle, or an in-vehicle camera embedded in a bumper portion and facing the left of the vehicle. The imaging device 206 may be able to switch its imaging direction and capture images in a plurality of directions.
  • the imaging device 206 may not be integrated with the information processing device 100 as long as it can communicate with the information processing device 100.
  • the information processing apparatus 100 may include a plurality of imaging devices 206. In addition to the components described above, the information processing apparatus 100 may include, for example, a keyboard, a mouse, a display, and the like. The information processing apparatus 100 may not include the recording medium I / F 204 and the recording medium 205.
  • the information processing apparatus 100 is an on-vehicle apparatus, for example. Specifically, the information processing apparatus 100 is a drive recorder.
  • FIG. 3 is a block diagram illustrating a functional configuration example of the information processing apparatus 100.
  • the information processing apparatus 100 includes a first storage unit 301, a second storage unit 302, an acquisition unit 303, a detection unit 304, a creation unit 305, a determination unit 306, and an output unit 307.
  • the first storage unit 301 and the second storage unit 302 are realized by storage areas such as the memory 202 and the recording medium 205 shown in FIG.
  • The acquisition unit 303 to the output unit 307 are functions serving as a control unit. Specifically, the functions of the acquisition unit 303 to the output unit 307 are realized, for example, by the CPU 201 executing a program stored in a storage area such as the memory 202 or the recording medium 205 illustrated in FIG. 2, or by the network I/F 203.
  • the processing result of each functional unit is stored in a storage area such as the memory 202 and the recording medium 205 shown in FIG.
  • the first storage unit 301 stores, for example, sample data indicating features estimated to be included in a partial image in which a vehicle is shown.
  • the sample data is, for example, a sample image showing a part of the vehicle.
  • the sample data may be, for example, information indicating the number of horizontal or vertical edges included in a sample image showing a part of the vehicle.
  • the imaging unit may be capable of imaging a plurality of directions. The imaging unit is provided in the same vehicle as the information processing apparatus 100.
  • the imaging unit is, for example, the imaging device 206 illustrated in FIG.
  • For example, the first storage unit 301 stores, in accordance with the imaging direction of the imaging unit, a sample image of the part of a vehicle that appears in an image captured in that direction. Specifically, when the imaging unit captures the area behind the vehicle, the first storage unit 301 stores a sample image of the front portion of a vehicle appearing behind in the image captured by the imaging unit. The sample image of the front portion of a vehicle is, for example, a modeled image based on images of the front portions of various vehicles. The first storage unit 301 can thereby store the sample data used by the detection unit 304, and can allow the detection unit 304 to selectively use the sample data corresponding to each direction even when the imaging unit can capture images in a plurality of directions.
  • the second storage unit 302 stores straight line information.
  • the straight line represents a reference of a position where the vehicle appears in an image newly captured by the imaging unit in the future, and represents a region including a position where the vehicle can be reflected in an image newly captured in the future.
  • the start point or end point of the straight line may be, for example, a point inside the image instead of a point at the end of the image newly captured in the future by the imaging unit.
  • the straight line information is specified based on the position where the vehicle is reflected in each of a plurality of images taken outside the vehicle.
  • the straight line information is specified using, for example, the least square method.
  • The straight line information is, for example, a function indicating the straight line, such as a linear function y = ax + b with the coefficients a and b described below.
  • the plurality of images captured outside the vehicle are, for example, a plurality of images captured in the past by an imaging unit provided in the same vehicle as the information processing apparatus 100.
  • The plurality of images captured outside the vehicle may also be, for example, a plurality of images captured in the past by an imaging unit provided in a vehicle that is different from, but of the same type as, the vehicle provided with the information processing apparatus 100. In this case, it is preferable that the imaging unit provided in the same vehicle as the information processing apparatus 100 and the imaging unit provided in the different vehicle are of the same type and capture the same direction.
  • x is a coordinate value in the horizontal direction of the image.
  • y is a coordinate value in the vertical direction of the image.
  • a is a coefficient, and is obtained by, for example, the least square method.
  • b is a coefficient, and is obtained by, for example, the least square method.
  • the position where the vehicle is reflected is the center coordinates of the partial image where the vehicle is reflected in the image.
  • the center coordinates are expressed by, for example, a combination (x, y) of the coordinate value x in the horizontal direction of the image and the coordinate value y in the vertical direction of the image.
  • The function may have a restricted domain or range so that the start point or end point of the straight line is a point inside the image.
  • Thus, the second storage unit 302 enables the determination unit 306 to determine, based on the distance between a given position and the straight line, whether that position is a position at which a vehicle appears.
  • the second storage unit 302 associates each direction of a plurality of directions that can be imaged by the imaging unit with information on a straight line indicating a reference of a position where the vehicle appears in the image of the direction captured by the imaging unit. You may remember.
  • the second storage unit 302 stores a combination of the coefficient a and the coefficient b for each direction in which the imaging unit can capture an image.
  • Thereby, the second storage unit 302 allows the determination unit 306 to switch among the functions indicating the plurality of straight lines according to the direction in which the imaging unit captures an image. Even if the direction in which the imaging unit captures an image is switched, the second storage unit 302 can thereby suppress a decrease in the determination accuracy of the determination unit 306. A minimal sketch of such per-direction storage follows.
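  • As an illustration, the per-direction storage described above could be held as a simple mapping from imaging direction to line coefficients. The direction labels and coefficient values below are assumptions for illustration, not taken from the patent.

```python
# Hypothetical second storage unit: imaging direction -> (a, b) of y = a*x + b.
line_by_direction = {
    "front": (-0.30, 420.0),
    "rear_right": (-0.42, 380.0),
    "left": (0.05, 310.0),
}

def get_line(direction):
    """Return the (a, b) coefficients stored for the given imaging direction."""
    return line_by_direction[direction]

a, b = get_line("rear_right")
```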
  • the acquisition unit 303 acquires a plurality of images taken outside the vehicle.
  • the plurality of images captured outside the vehicle are, for example, a plurality of images that are provided in the same vehicle as the information processing apparatus 100 and are captured by the imaging unit that captures the outside of the vehicle.
  • the plurality of images obtained by imaging the outside of the vehicle may be, for example, a plurality of images that are provided on a vehicle different from the information processing apparatus 100 and captured by an imaging unit that images the outside of the vehicle from the vehicle.
  • the acquisition unit 303 acquires a plurality of images captured by the imaging unit and confirmed to show the vehicle as a plurality of images for specifying a straight line. Thereby, the acquisition unit 303 can acquire a plurality of images on which the creation unit 305 creates straight line information.
  • the detection unit 304 detects a position where the vehicle is reflected in each of the plurality of images acquired by the acquisition unit 303.
  • For example, by pattern matching using the sample image stored in the first storage unit 301, the detection unit 304 detects, in each of the plurality of images, the center coordinates of a partial image similar to the sample image showing a part of a vehicle, as the position at which the vehicle appears.
  • the detection unit 304 may detect an edge for each of the plurality of images acquired by the acquisition unit 303. Then, the detection unit 304 may detect the center coordinates of the partial image as the position of the partial image corresponding to the number of edges indicated by the sample data in each image. Accordingly, the detection unit 304 can detect a plurality of positions used when the creation unit 305 creates straight line information.
  • The creation unit 305 creates straight line information based on the positions at which the vehicle appears, detected by the detection unit 304 for each of the plurality of images acquired by the acquisition unit 303, and stores the straight line information in the second storage unit 302. For example, the creation unit 305 creates a function indicating a straight line by the least squares method based on those detected positions and stores the function in the second storage unit 302.
  • Specifically, the creation unit 305 substitutes the number N of coordinates detected by the detection unit 304 and the N detected coordinates (x_i, y_i) into formula (1) and formula (2), calculates the combination of coefficient a and coefficient b, and stores it in the second storage unit 302. A sketch of this least-squares fit is shown below.
  • i is 1 to N.
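  • The following Python sketch illustrates the least-squares fit. The standard simple-linear-regression formulas are used on the assumption that they correspond to formulas (1) and (2); the patent text itself does not reproduce those formulas, so this is an illustration rather than the exact computation, and the example coordinates are made up.

```python
def fit_line(points):
    """Fit y = a * x + b to detected vehicle positions by least squares.

    `points` is a list of (x_i, y_i) center coordinates; the closed-form
    simple-regression formulas are assumed to match formulas (1) and (2).
    """
    n = len(points)
    sum_x = sum(x for x, _ in points)
    sum_y = sum(y for _, y in points)
    sum_xy = sum(x * y for x, y in points)
    sum_xx = sum(x * x for x, _ in points)
    a = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x ** 2)
    b = (sum_y - a * sum_x) / n
    return a, b

a, b = fit_line([(100, 330), (160, 305), (220, 282), (300, 250)])
```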
  • In addition, the creation unit 305 sets a threshold value based on the distances between the straight line and the positions at which the vehicle appears, detected by the detection unit 304 for each of the plurality of images acquired by the acquisition unit 303. For example, the creation unit 305 sets, as the threshold value, the maximum distance between the straight line and the N coordinates (x_i, y_i) detected by the detection unit 304. Thereby, the creation unit 305 can set a threshold value under which at least the positions detected from the plurality of images confirmed to show the vehicle are determined to be positions at which a vehicle appears. A sketch of this threshold setting follows.
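  • The following Python sketch sets the threshold as the maximum distance between the fitted line and the confirmed positions. The line coefficients and coordinates are assumed values for illustration.

```python
import math

def point_line_distance(x, y, a, b):
    """Perpendicular distance from (x, y) to the line y = a * x + b."""
    return abs(a * x - y + b) / math.sqrt(a * a + 1.0)

def set_threshold(points, a, b):
    """Largest distance from the fitted line over the confirmed vehicle positions."""
    return max(point_line_distance(x, y, a, b) for x, y in points)

threshold = set_threshold([(100, 330), (160, 305), (220, 282), (300, 250)],
                          a=-0.27, b=357.0)
```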
  • the acquisition unit 303 acquires the target image captured by the imaging unit that images outside the vehicle.
  • the target image is an image that is a target for detecting a position where the vehicle is reflected.
  • For example, the acquisition unit 303 acquires, as the target image, the latest image captured by the imaging unit in a state where a function indicating a straight line is stored in the second storage unit 302. Thereby, the acquisition unit 303 can acquire the target from which the position at which a vehicle appears is to be detected.
  • the acquisition unit 303 may acquire a target image in any one of a plurality of directions imaged by the imaging unit.
  • For example, in a state where functions indicating straight lines are stored in the second storage unit 302, the acquisition unit 303 acquires the latest image captured by the imaging unit as the target image, together with the direction in which the imaging unit captured it.
  • Thereby, the acquisition unit 303 can acquire the target from which the position at which a vehicle appears is to be detected, and the determination unit 306 can identify which direction's function indicating a straight line should be used.
  • the detection unit 304 detects the position of the partial image having the same feature as the feature indicated by the sample data in the target image acquired by the acquisition unit 303 based on the sample data indicating the feature of the image obtained by capturing the vehicle.
  • For example, by pattern matching using the sample image stored in the first storage unit 301, the detection unit 304 detects the center coordinates of a partial image in the target image that is similar to the sample image showing a part of a vehicle, as the position of that partial image. A sketch of such a matching step is shown below.
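  • The following Python sketch shows one possible pattern-matching step using OpenCV template matching. The patent does not specify a particular matching algorithm, and the file names and score threshold are assumptions for illustration.

```python
import cv2

def detect_candidate_center(target_image, sample_image, min_score=0.7):
    """Return the center coordinates of the region most similar to the sample
    image, or None if no region scores above `min_score`."""
    result = cv2.matchTemplate(target_image, sample_image, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(result)
    if max_score < min_score:
        return None
    h, w = sample_image.shape[:2]
    return (max_loc[0] + w // 2, max_loc[1] + h // 2)

target = cv2.imread("target.png")        # hypothetical file names
sample = cv2.imread("sample_front.png")
center = detect_candidate_center(target, sample)
```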
  • the detection unit 304 may detect an edge in the target image. Then, the detection unit 304 may detect the center coordinates of the partial image as the position of the partial image corresponding to the number of edges indicated by the sample data in the target image. As a result, the detection unit 304 can identify a partial image in which the vehicle may be reflected, and can detect a position candidate where the vehicle is reflected.
  • the determination unit 306 determines whether or not the detected position is a position where the vehicle is reflected. For example, when the determination unit 306 determines that the distance d between the straight line and the detected position is equal to or less than the threshold value, the determination unit 306 determines that the detected position is a position where the vehicle is reflected.
  • The determination unit 306 may determine whether the detected position is a position at which a vehicle appears based on the straight line information stored in the second storage unit 302 in association with one of the directions. For example, the determination unit 306 makes this determination based on the straight line information stored in association with the direction acquired by the acquisition unit 303. Accordingly, the determination unit 306 can use the function indicating the straight line that corresponds to the direction in which the imaging unit captured the target image, and can suppress a decrease in detection accuracy.
  • the output unit 307 outputs the determination result of the determination unit 306 or the position determined by the determination unit 306 as the position where the vehicle is reflected.
  • the output format is, for example, display on a display, print output to a printer, transmission to an external device via the network I / F 203, or storage in a storage area such as the memory 202 or the recording medium 205. Accordingly, the output unit 307 can notify the user of the determination result of the determination unit 306 or the position determined by the determination unit 306 as the position where the vehicle is reflected. Further, the output unit 307 can cause the automatic driving device or the like to use the determination result of the determination unit 306 or the position determined by the determination unit 306 as the position where the vehicle is reflected.
  • FIG. 4 is an explanatory diagram illustrating an example of an image 400 captured by the imaging device 206.
  • the imaging device 206 is an in-vehicle camera embedded in the right door portion of the vehicle facing diagonally right rear of the vehicle.
  • the imaging device 206 captures an image of the right rear side of the vehicle and acquires the image 400.
  • Next, an example in which the information processing apparatus 100 acquires a target image 500 and detects the position at which a vehicle appears in the target image 500 will be described with reference to FIG. 5.
  • FIG. 5 is an explanatory diagram showing an example of detecting the position where the vehicle is shown.
  • the information processing apparatus 100 has created a function of the straight line 501 corresponding to the arrow 401 illustrated in FIG. 4 and stores the function.
  • the information processing apparatus 100 acquires the target image 500 obtained by the imaging apparatus 206 imaging the diagonally right rear of the vehicle.
  • the information processing apparatus 100 specifies a partial image 502 similar to the sample image in the target image 500 by pattern matching using the sample image.
  • the information processing apparatus 100 detects the center coordinates 503 of the specified partial image 502 as a candidate for a position where the vehicle is reflected.
  • the information processing apparatus 100 calculates the distance between the straight line 501 and the detected center coordinate 503 based on the stored function of the straight line 501. If the calculated distance is less than the threshold, the information processing apparatus 100 determines that the detected center coordinate 503 is a position where the vehicle is reflected and is not noise. If the calculated distance is equal to or greater than the threshold, the information processing apparatus 100 determines that the detected center coordinate 503 is noise, not a position where the vehicle is reflected.
  • the information processing apparatus 100 can prevent the position from being erroneously detected as a position where the vehicle is reflected. For this reason, when using the position where the vehicle is reflected for ensuring safety during traveling of the vehicle, the information processing apparatus 100 can accurately ensure safety without using the erroneously detected position.
  • The information processing apparatus 100 can also prevent erroneous detection of the position at which a vehicle appears even if the accuracy of pattern matching is poor. Further, since the information processing apparatus 100 determines whether a position detected by pattern matching is a position at which a vehicle appears simply by comparing the detected position with the predetermined straight line, this determination can be performed in a relatively short time. For this reason, the information processing apparatus 100 can improve the accuracy of detecting the position at which a vehicle appears without improving the accuracy of the pattern matching, and can suppress an increase in the time required to detect that position.
  • FIG. 6 is an explanatory diagram illustrating an example of an image 600 captured by the imaging device 206.
  • the imaging device 206 is an in-vehicle camera attached around the front mirror of the vehicle toward the front of the vehicle. For example, the imaging device 206 captures the front of the vehicle and acquires the image 600.
  • Whether a position detected by pattern matching with the sample image from the image 600 captured in front of the vehicle is likely to be a position at which a vehicle appears can be evaluated based on its distance from the straight line corresponding to the arrow 601. The information processing apparatus 100 can therefore determine, based on that straight line, whether the position detected by pattern matching is a position at which a vehicle appears. Next, the description proceeds to FIG. 7.
  • FIG. 7 is an explanatory diagram illustrating an example of an image 700 captured by the imaging device 206.
  • the imaging device 206 is an in-vehicle camera embedded in a bumper portion of the vehicle toward the left direction of the vehicle.
  • the imaging device 206 captures the left direction from the bumper portion of the vehicle and acquires the image 700.
  • the information processing apparatus 100 can determine whether the position detected by pattern matching is a position where the vehicle is reflected, based on the straight line.
  • The straight line used to determine whether a position detected by pattern matching is a position at which a vehicle appears differs depending on the direction in which the imaging device 206 is imaging.
  • For this reason, it is preferable that the information processing apparatus 100 specifies the straight line information used to determine whether a position is one at which a vehicle appears based on, for example, a plurality of images actually captured in the past by the imaging device 206.
  • When the imaging device 206 can capture a plurality of directions by switching its imaging direction, the information processing apparatus 100 may create and store straight line information corresponding to each direction. Thereby, even when the imaging device 206 can switch its imaging direction, the information processing apparatus 100 can accurately determine whether a position detected by pattern matching is a position at which a vehicle appears.
  • FIG. 8 is an explanatory diagram illustrating a specific example of creating a function indicating a straight line 801.
  • The information processing apparatus 100 acquires a plurality of images captured by the imaging device 206 and confirmed to show a vehicle, and detects, from each image, the coordinates (x_i, y_i) plotted in FIG. 8. Here, i is 1 to N, and N is the number of detected coordinates (x_i, y_i).
  • The information processing apparatus 100 calculates the coefficient a and the coefficient b by substituting the number N of coordinates and the N detected coordinates (x_i, y_i) into the above equations (1) and (2), and creates and stores a function indicating the straight line 801. Next, a specific example in which the information processing apparatus 100 determines whether a position is one at which a vehicle appears is described with reference to FIGS. 9 and 10.
  • FIG. 9 and FIG. 10 are explanatory diagrams showing a specific example for determining whether or not the vehicle is in a reflected position.
  • The information processing apparatus 100 acquires a plurality of target images captured by the imaging device 206 and detects, from each target image, the coordinates (x_i, y_i) plotted in FIG. 9.
  • i is 1 to N.
  • N is the number of detected coordinates (x i , y i ).
  • The coordinates (x_i, y_i) plotted in FIG. 9 may include positions at which an object that is not a vehicle was detected by pattern matching.
  • Next, the description proceeds from FIG. 9 to FIG. 10.
  • The coordinates (x_i, y_i) plotted in FIG. 10 are the same as the coordinates (x_i, y_i) plotted in FIG. 9.
  • The information processing apparatus 100 calculates the length of the perpendicular from each coordinate (x_i, y_i) plotted in FIG. 10 to the straight line 801, that is, the distance between the coordinate (x_i, y_i) and the straight line 801. The information processing apparatus 100 then determines whether the calculated distance is greater than or equal to the threshold "100".
  • If the distance is less than the threshold, the coordinate (x_i, y_i) is determined to be a position at which a vehicle appears and not noise. If the distance is greater than or equal to the threshold, the coordinate (x_i, y_i) is determined not to be a position at which a vehicle appears but to be noise.
  • the information processing apparatus 100 can prevent the position from being erroneously detected as a position where the vehicle is reflected. For this reason, when using the position where the vehicle is reflected for ensuring safety during traveling of the vehicle, the information processing apparatus 100 can accurately ensure safety without using the erroneously detected position.
  • The information processing apparatus 100 can also prevent erroneous detection of the position at which a vehicle appears even if the accuracy of pattern matching is poor. Further, since the information processing apparatus 100 determines whether a position detected by pattern matching is a position at which a vehicle appears simply by comparing the detected position with the predetermined straight line, this determination can be performed in a relatively short time. For this reason, the information processing apparatus 100 can improve the accuracy of detecting the position at which a vehicle appears without improving the accuracy of the pattern matching, and can suppress an increase in the time required to detect that position.
  • FIG. 11 is an explanatory diagram illustrating an example of determining whether a position is one at which the vehicle appears when the threshold is changed. The coordinates (x_i, y_i) plotted in FIG. 11 are the same as the coordinates (x_i, y_i) plotted in FIG. 10.
  • The information processing apparatus 100 calculates the length of the perpendicular from each coordinate (x_i, y_i) plotted in FIG. 11 to the straight line 801, that is, the distance between the coordinate (x_i, y_i) and the straight line 801, and determines whether the calculated distance is greater than or equal to the threshold "75".
  • the information processing apparatus 100 can determine whether the position detected by pattern matching is a position where the vehicle is reflected, using each of the plurality of threshold values.
  • the information processing apparatus 100 can adjust the range of the coordinates determined to be the position where the vehicle is reflected and not noise by changing the threshold value. For this reason, the information processing apparatus 100 may perform processing for setting a threshold value to a preferable value in advance.
  • For example, the information processing apparatus 100 may select, from a plurality of candidate threshold values, the threshold value for which the ratio of coordinates determined to be noise is a preferable ratio. Specifically, the information processing apparatus 100 performs the determination process with candidate thresholds such as "100" and "75" on a plurality of coordinates detected from a plurality of images confirmed to show the vehicle, and sets a threshold value under which all of those coordinates can be determined not to be noise. If all coordinates can be determined not to be noise with either the threshold "100" or the threshold "75", the information processing apparatus 100 sets the smaller threshold value. A sketch of this selection is shown below.
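  • The following Python sketch picks the smallest candidate threshold under which every confirmed vehicle position passes the distance check. The candidate values, coefficients, and coordinates are assumptions for illustration.

```python
import math

def choose_threshold(confirmed_points, a, b, candidates=(75.0, 100.0)):
    """Pick the smallest candidate threshold under which every confirmed
    vehicle position is classified as not noise (distance below threshold)."""
    def dist(x, y):
        return abs(a * x - y + b) / math.sqrt(a * a + 1.0)
    for threshold in sorted(candidates):
        if all(dist(x, y) < threshold for x, y in confirmed_points):
            return threshold
    return max(candidates)  # fall back to the loosest candidate

print(choose_threshold([(100, 330), (160, 305), (220, 282)], a=-0.27, b=357.0))
```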
  • The information processing apparatus 100 may also reset the threshold periodically. For example, the information processing apparatus 100 may periodically perform the determination process with a plurality of threshold values on a plurality of coordinates detected from images captured by the imaging device 206 while the vehicle is traveling, and set the threshold accordingly. Specifically, if too many coordinates are determined to be noise when the threshold "75" is used, the information processing apparatus 100 may set the threshold "100" as the threshold to be used from then on.
  • FIG. 12 is a flowchart showing an example of a calculation processing procedure.
  • the information processing apparatus 100 acquires image data representing an image in which it is detected that a vehicle is reflected from a plurality of image data serving as specimens (step S1201).
  • the information processing apparatus 100 specifies a range in which it is determined that the vehicle is reflected on the image represented by the image data (step S1202). Then, the information processing apparatus 100 calculates the center coordinates of the specified range as the coordinates of the vehicle on the image represented by the image data (step S1203).
  • the information processing apparatus 100 determines whether or not to end the acquisition of the image data (step S1204).
  • If the acquisition is not to be ended (step S1204: No), the information processing apparatus 100 returns to the process of step S1201.
  • On the other hand, when the acquisition ends (step S1204: Yes), the information processing apparatus 100 obtains a function of an approximate straight line representing the trajectory that the vehicle coordinates can take, based on the calculated vehicle coordinates, and stores the function in the storage unit 120 (step S1205). Then, the information processing apparatus 100 ends the calculation process. Thereby, the information processing apparatus 100 can create a function indicating the approximate straight line.
  • FIG. 13 is a flowchart showing an example of the determination processing procedure.
  • the information processing apparatus 100 acquires new image data to be determined (step S1301).
  • the information processing apparatus 100 determines whether or not the vehicle is shown on the image represented by the acquired image data (step S1302).
  • If the vehicle is not shown (step S1302: No), the information processing apparatus 100 returns to the process of step S1301.
  • On the other hand, when the vehicle is shown (step S1302: Yes), the information processing apparatus 100 specifies, based on the acquired image data, the range in which it is determined that the vehicle appears on the image represented by the image data (step S1303). Then, the information processing apparatus 100 calculates the center coordinates of the specified range as a candidate for the coordinates of the vehicle on the image (step S1304).
  • the information processing apparatus 100 determines whether or not the distance between the calculated vehicle coordinate candidate and the approximate straight line represented by the function stored in the storage unit 120 is greater than or equal to a threshold (step S1305). If it is not equal to or greater than the threshold (step S1305: No), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is the vehicle coordinate and is not noise, and outputs the determined result ( Step S1306). Then, the information processing apparatus 100 returns to the process of step S1301.
  • On the other hand, when the distance is equal to or greater than the threshold (step S1305: Yes), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is actually noise rather than the vehicle coordinates, and outputs the determination result (step S1307). Then, the information processing apparatus 100 returns to the process of step S1301. Thereby, the information processing apparatus 100 can prevent erroneous detection of the position at which a vehicle appears. A sketch of this determination loop follows.
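  • As an illustration of the determination flow of FIG. 13, the following Python sketch strings together the hypothetical helpers defined in the earlier sketches (detect_candidate_center and point_line_distance); the frame source and output are placeholders, not part of the patent.

```python
def determination_loop(frames, sample_image, a, b, threshold):
    """For each frame, detect a candidate position and classify it as the
    vehicle position or noise by its distance to the stored line."""
    for frame in frames:                        # step S1301: acquire image data
        center = detect_candidate_center(frame, sample_image)
        if center is None:                      # step S1302: vehicle not shown
            continue
        x, y = center                           # steps S1303-S1304: candidate coordinates
        if point_line_distance(x, y, a, b) < threshold:
            print("vehicle at", center)         # step S1306: not noise
        else:
            print("noise at", center)           # step S1307: noise
```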
  • FIG. 14 is a flowchart showing an example of the adjustment processing procedure.
  • the information processing apparatus 100 acquires image data representing an image in which it is detected that a vehicle is reflected from a plurality of pieces of image data serving as specimens (step S1401).
  • the information processing apparatus 100 specifies a range in which it is determined that the vehicle is reflected on the image represented by the image data (step S1402). Then, the information processing apparatus 100 calculates the center coordinates of the specified range as a candidate for the coordinates of the vehicle on the image represented by the image data (step S1403).
  • Next, the information processing apparatus 100 determines whether the distance between the calculated vehicle coordinate candidate and the approximate straight line represented by the function stored in the storage unit 120 is equal to or greater than the first threshold (step S1404). If it is not equal to or greater than the first threshold (step S1404: No), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is the vehicle coordinates and not noise, and stores the determination result (step S1405). Then, the information processing apparatus 100 proceeds to the process of step S1407.
  • On the other hand, if it is equal to or greater than the first threshold (step S1404: Yes), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is actually noise rather than the vehicle coordinates, and stores the determination result (step S1406). Then, the information processing apparatus 100 proceeds to the process of step S1407.
  • In step S1407, the information processing apparatus 100 determines whether the distance between the calculated vehicle coordinate candidate and the approximate straight line represented by the function stored in the storage unit 120 is equal to or greater than the second threshold (step S1407). If it is not equal to or greater than the second threshold (step S1407: No), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is the vehicle coordinates and not noise, and stores the determination result (step S1408). Then, the information processing apparatus 100 proceeds to the process of step S1410.
  • On the other hand, if it is equal to or greater than the second threshold (step S1407: Yes), the information processing apparatus 100 determines that the calculated vehicle coordinate candidate is actually noise rather than the vehicle coordinates, and stores the determination result (step S1409). Then, the information processing apparatus 100 proceeds to the process of step S1410.
  • In step S1410, the information processing apparatus 100 determines whether to end the acquisition of image data (step S1410). If the acquisition is not to be ended (step S1410: No), the information processing apparatus 100 returns to the process of step S1401.
  • On the other hand, when the acquisition ends (step S1410: Yes), the information processing apparatus 100 sets, based on the stored determination results, either the first threshold or the second threshold as the threshold to be used in the determination process (step S1411). Then, the information processing apparatus 100 ends the adjustment process. Thereby, the information processing apparatus 100 can set a threshold that accurately determines whether a position is one at which a vehicle appears.
  • As described above, according to the information processing apparatus 100, the target image 110 captured by the imaging device 206 can be acquired, and the position 111 of a partial image having the same features as the sample data can be detected in the target image 110. Further, according to the information processing apparatus 100, whether the detected position 111 is a position at which a vehicle appears can be determined based on the information on the straight line 131 stored in the storage unit 120. Thereby, even if a position 111 is accidentally detected by pattern matching, the information processing apparatus 100 can prevent that position 111 from being erroneously detected as a position at which a vehicle appears.
  • According to the information processing apparatus 100, when it is determined that the distance between the straight line 131 and the detected position 111 is equal to or less than the threshold value, the detected position 111 can be determined to be a position at which a vehicle appears. As a result, the information processing apparatus 100 can evaluate, by the distance from the straight line 131, how likely the detected position 111 is to be a position at which a vehicle appears. In addition, since the information processing apparatus 100 determines whether a vehicle appears by calculating the distance between the detected position 111 and the straight line 131 and comparing it with the threshold value, the determination can be performed in a relatively short time.
  • According to the information processing apparatus 100, a plurality of images captured outside the vehicle can be acquired, and the threshold value can be set based on the distances between the straight line 131 and the positions at which the vehicle appears in each of the acquired images. Thereby, the information processing apparatus 100 can set the threshold value used in the process of determining, for a target image 110 captured by the imaging device 206, whether a position is one at which a vehicle appears. Since the threshold is set based on a plurality of images actually captured outside the vehicle, it is easy to set a threshold that makes this determination with relatively high accuracy.
  • According to the information processing apparatus 100, a plurality of images captured outside the vehicle can be acquired, information on the straight line 131 can be created based on the positions at which the vehicle appears in each of the acquired images, and the created information on the straight line 131 can be stored in the storage unit 120. Accordingly, the information processing apparatus 100 can store the information on the straight line 131 used in the process of determining, for a target image 110 captured by the imaging device 206, whether a position is one at which a vehicle appears. Since the information on the straight line 131 is created based on a plurality of images actually captured outside the vehicle, it supports this determination with relatively high accuracy.
  • According to the information processing apparatus 100, a function indicating the straight line 131 can be created by the least squares method based on the positions at which the vehicle appears in each of a plurality of images captured outside the vehicle, and the created function can be stored in the storage unit 120. Thereby, the information processing apparatus 100 can create a function indicating the straight line 131 even if the positions at which the vehicle appears include outliers.
  • According to the information processing apparatus 100, a target image 110 in any of the directions captured by the imaging device 206, which can capture a plurality of directions, can be acquired, and the position 111 of a partial image having the same features as those indicated by the sample data can be detected in the target image 110.
  • the information processing apparatus 100 can determine whether or not the detected position 111 is a position where the vehicle is reflected, even if the imaging direction of the imaging apparatus 206 is variable.
  • the information processing method described in this embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
  • the information processing program described in this embodiment is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, and a DVD, and is executed by being read from the recording medium by the computer. Further, the information processing program described in this embodiment may be distributed via a network such as the Internet.

Abstract

An information processing device (100) acquires a target image (110) captured by an imaging unit. Based on sample data indicating features of a captured image of a vehicle, the information processing device (100) detects, in the acquired target image (110), the position (111) of a partial image having the same features as those indicated by the sample data. The information processing device (100) determines whether the detected position (111) is a position at which the vehicle appears, based on information on a straight line (131) stored in a storage unit (120).
PCT/JP2017/005573 2017-02-15 2017-02-15 Information processing device, information processing method, and information processing program WO2018150496A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/005573 WO2018150496A1 (fr) 2017-02-15 2017-02-15 Information processing device, information processing method, and information processing program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/005573 WO2018150496A1 (fr) 2017-02-15 2017-02-15 Information processing device, information processing method, and information processing program

Publications (1)

Publication Number Publication Date
WO2018150496A1 true WO2018150496A1 (fr) 2018-08-23

Family

ID=63170545

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/005573 WO2018150496A1 (fr) 2017-02-15 2017-02-15 Information processing device, information processing method, and information processing program

Country Status (1)

Country Link
WO (1) WO2018150496A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006053756A (ja) * 2004-08-11 2006-02-23 Tokyo Institute Of Technology Object detection device
JP2008123462A (ja) * 2006-11-16 2008-05-29 Hitachi Ltd Object detection device
JP2014021510A (ja) * 2012-07-12 2014-02-03 Jvc Kenwood Corp Image processing device, image processing method, and program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111915638A (zh) * 2019-05-09 2020-11-10 东芝泰格有限公司 Tracking device, information processing method, readable storage medium, and electronic device

Similar Documents

Publication Publication Date Title
CN110287779B (zh) 车道线的检测方法、装置及设备
US10331961B2 (en) Detecting device, detecting method, and program
US9846823B2 (en) Traffic lane boundary line extraction apparatus and method of extracting traffic lane boundary line
JP2004117078A (ja) 障害物検出装置及び方法
EP2980755B1 (fr) Procédé de partitionnement de zone et dispositif d'inspection
US10339396B2 (en) Vehicle accessibility determination device
JP6237875B2 (ja) 自己位置算出装置及び自己位置算出方法
CN110659547B (zh) 物体识别方法、装置、车辆和计算机可读存储介质
WO2013035612A1 (fr) Dispositif de détection d'obstacle, procédé de détection d'obstacle, et programme de détection d'obstacle
JP4674179B2 (ja) 影認識方法及び影境界抽出方法
JP2020061020A (ja) 横断歩道標示推定装置
US11521337B2 (en) Map generation system, map generation method, and computer readable medium which generates linearization information calculates a reliability degree
CN108629225B (zh) 一种基于多幅子图与图像显著性分析的车辆检测方法
US20160196657A1 (en) Method and system for providing depth mapping using patterned light
WO2023019793A1 (fr) Procédé de détermination, robot de nettoyage et support de stockage informatique
WO2018150496A1 (fr) Information processing device, information processing method, and information processing program
JP2005156199A (ja) 車両検知方法及び車両検知装置
JP2019218022A (ja) 線路検出装置
JP3631095B2 (ja) 照射野領域抽出装置、放射線撮影装置、放射線画像用システム、照射野領域抽出方法、及びコンピュータ可読記憶媒体
JP2010286995A (ja) 車両用画像処理システム
JP3915621B2 (ja) レーンマーク検出装置
CN111340887B (zh) 视觉定位方法、装置、电子设备和存储介质
WO2016152288A1 (fr) Dispositif de détection de matériau, procédé de détection de matériau, et programme
JP6688091B2 (ja) 車両距離導出装置および車両距離導出方法
KR101595368B1 (ko) 적외선 이미지에서 특징추적을 위한 고온 공기 제거방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17897023

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17897023

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP