WO2017009933A1 - 信号機認識装置及び信号機認識方法 - Google Patents
信号機認識装置及び信号機認識方法 Download PDFInfo
- Publication number
- WO2017009933A1 (PCT/JP2015/070041)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- imaging direction
- vehicle
- traffic light
- imaging
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/09623—Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/582—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of traffic signs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/017—Detecting movement of traffic to be counted or controlled identifying vehicles
- G08G1/0175—Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2555/00—Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
- B60W2555/60—Traffic rules, e.g. speed limits or right of way
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the present invention relates to a traffic signal recognition device mounted on a vehicle, and to a traffic signal recognition method, for recognizing traffic signals.
- driving operations such as stopping and proceeding are controlled by recognizing a traffic light provided on the traveling path of the vehicle and detecting its lighting state, such as the lighting color.
- Patent Document 1: Japanese Patent Laid-Open No. 11-306489
- a camera is mounted on a vehicle to image a traffic signal ahead.
- the horizontal angle and the vertical angle of the camera are controlled so that the traffic light is positioned at the center of the image captured by the camera.
- the enlargement ratio is controlled so that the image of the traffic light becomes a desired size.
- in Patent Document 1, while the imaging direction of the camera is being changed, the image captured by the camera is blurred and image recognition becomes difficult. As a result, there is a risk that the detection accuracy of the lighting state of the traffic light is lowered while the imaging direction is being changed.
- the present invention has been made to solve this conventional problem, and its object is to provide a traffic signal recognition device that makes it unnecessary to change the imaging direction of the imaging unit when the vehicle approaches a traffic light, or that reduces the number of times the imaging direction is changed.
- a traffic signal recognition apparatus includes an imaging unit, a map information acquisition unit that acquires map information, a vehicle current position detection unit that detects the vehicle's current position on the map, and a traffic signal position estimation unit that estimates the position of the traffic signal on the image. It further includes an imaging direction setting unit that sets the imaging direction of the imaging unit based on the traffic signal's position on the image and its future moving direction on the image, an imaging direction changing unit that changes the imaging direction of the imaging unit to the set direction, and a traffic signal recognition unit that recognizes the traffic signal from an image captured by the imaging unit in that imaging direction.
- in the traffic signal recognition method, an image around the vehicle is captured by an imaging unit, map information around the vehicle is acquired, the vehicle's current position on the map is detected, and the position of the traffic signal on the image is estimated based on the current position and the map information. The imaging direction of the imaging unit is then set based on the traffic signal's position on the image and its future moving direction on the image, the imaging direction of the imaging unit is changed to the set direction, and the traffic signal is recognized from an image captured by the imaging unit in that direction.
- FIG. 1 is a block diagram showing a configuration of a traffic signal recognition device and its peripheral devices according to the first embodiment of the present invention.
- FIG. 2 is a block diagram showing in detail the traffic signal recognition apparatus 100 shown in FIG.
- the traffic signal recognition device 100 is mounted on a vehicle 51, and map information D02, camera information D03, vehicle current position information D05, and image data D07 are input to it from various devices mounted on the vehicle 51.
- the traffic signal information D04, which is the recognition result produced by the traffic signal recognition device 100, is output to subsequent devices.
- the camera information D03 is information regarding the installation position of the camera 11 (see FIG. 2) with respect to the vehicle 51.
- the imaging area around the vehicle by the camera 11 can be estimated based on the camera information D03.
- the map information D02 is provided from a map database containing map data of the travel path on which the vehicle travels (map information around the vehicle), and includes position information of traffic signals and the like.
- the traffic signal recognition device 100 includes a camera 11 (imaging unit), a vehicle current position detection unit 12, a map information acquisition unit 17, an imaging direction setting unit 13, a traffic signal recognition unit 14, and a landmark information acquisition unit 18.
- the camera 11 is a camera provided with a solid-state image sensor such as a CCD or a CMOS, for example.
- the camera 11 outputs the captured image as image data D07 to the traffic signal recognition unit 14.
- the camera 11 stores information related to the installation position of the camera 11 with respect to the vehicle 51 and outputs the information to the imaging direction setting unit 13 as camera information D03.
- the information regarding the installation position of the camera 11 can be calculated, for example, by placing a calibration mark at a known position relative to the vehicle 51 and measuring its position in an image captured by the camera 11.
- the camera 11 is mounted on the vehicle 51 via a mechanism rotatable in the pan and tilt directions, with a drive mechanism for each rotation angle; by driving the pan and tilt angles, the posture of the camera 11 can be controlled to any desired imaging direction.
- the map information acquisition unit 17 obtains, from a map database, the map information of the travel path on which the vehicle travels (map information around the vehicle), the position information of ground landmarks existing around the travel path, and the position information of traffic signals. This map information is output to the vehicle current position detection unit 12 and the imaging direction setting unit 13 as map information D02.
- the landmark information acquisition unit 18 is, for example, a vehicle-mounted sensing camera or laser radar; it recognizes ground landmarks (road markings such as lane marks, stop lines, and characters, curbs, traffic lights, signs, etc.) and acquires their positions relative to the vehicle 51. The acquired information is output to the vehicle current position detection unit 12 as landmark information D01.
- the vehicle current position detection unit 12 acquires the landmark information D01 and the map information D02, detects the current position on the map of the vehicle 51 based on these information, and outputs this as vehicle current position information D05.
- the landmark information D01 includes information indicating the relative positional relationship of each ground landmark with respect to the vehicle 51. Accordingly, the current position of the vehicle 51 on the map can be detected by comparing this relative position with the position information of the same ground landmark included in the map information D02.
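The comparison of landmark information D01 with map information D02 can be sketched as follows (a minimal 2-D illustration using a single landmark; the function name and the planar simplification are assumptions for this sketch, not the patent's implementation):

```python
import numpy as np

def locate_vehicle(landmark_map_xy, landmark_rel_xy, vehicle_yaw):
    """Estimate the vehicle's map position from one ground landmark.

    landmark_map_xy: landmark coordinates registered in the map (D02)
    landmark_rel_xy: landmark position relative to the vehicle (D01)
    vehicle_yaw:     vehicle heading in the map frame (radians)
    """
    c, s = np.cos(vehicle_yaw), np.sin(vehicle_yaw)
    rot = np.array([[c, -s], [s, c]])          # vehicle frame -> map frame
    # vehicle map position = landmark map position - rotated relative offset
    return np.asarray(landmark_map_xy) - rot @ np.asarray(landmark_rel_xy)

# A landmark 10 m ahead of a vehicle heading along +x:
pos = locate_vehicle([110.0, 50.0], [10.0, 0.0], 0.0)
# pos -> [100.0, 50.0]
```

In practice the full pose (x, y, z, yaw, pitch, roll) described below would be estimated from several landmarks, but the per-landmark arithmetic is the same subtraction in the map frame.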
- the “position” includes coordinates and orientation. Specifically, the position of the ground landmark includes its coordinates and posture, and the position of the vehicle 51 includes its coordinates and posture.
- the vehicle current position detection unit 12 outputs the coordinates (x, y, z) in the reference coordinate system and the posture (yaw, pitch, roll) that is the rotation direction of each coordinate axis as the vehicle current position information D05.
- the imaging direction setting unit 13 controls the imaging direction of the camera 11 so that a traffic light existing on the traveling path of the vehicle 51 falls within the imaging area of the camera 11.
- the posture of the camera 11 can be controlled by driving its pan and tilt rotation angles so that the camera faces the target imaging direction.
- a detection area in which a traffic light is estimated to exist is set within the image captured in this imaging direction, and is output as detection area information D06. In other words, once the posture of the camera 11 is determined and the area to be imaged is fixed, the position at which the traffic light should appear can be specified on the captured image and set as the detection area.
- the detection area information D06 is output to the traffic signal recognition unit 14. At this time, the detection area is set to a size that does not cause each traffic signal to be out of frame from the detection area even when an error occurs in the vehicle behavior or the current vehicle position information.
- the imaging direction setting unit 13 has a function of determining the imaging direction of the camera 11 based on the position of the vehicle 51, the traffic signal position, and the amount of change in the traffic signal position. Details of the imaging direction setting unit 13 will be described later with reference to FIG.
- the traffic light recognition unit 14 recognizes the traffic light from the image data D07 captured by the camera 11 based on the detection area information D06 described above. Specifically, based on the image data D07 output from the camera 11 and the detection area information D06 set by the imaging direction setting unit 13, image processing for recognizing the traffic light for the detection area is performed.
- as the image processing method, for example, the signal lamp of a traffic light can be detected using its blinking, which is synchronized with the AC cycle of the commercial power supply, or by methods that judge features such as a red, green, or yellow hue and a round shape.
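The hue-and-shape approach can be illustrated with a minimal sketch (thresholds and the function name are illustrative assumptions, not values from the patent; a real detector would also check the round shape and the AC-synchronized blinking):

```python
import numpy as np

def detect_lamp_pixels(rgb, target="red"):
    """Return a boolean mask of pixels whose colour matches a signal lamp.

    rgb: HxWx3 float array in [0, 1]. Threshold values are illustrative.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    bright = rgb.max(axis=-1) > 0.5              # lit lamps are bright
    if target == "red":
        return bright & (r > 0.6) & (g < 0.4) & (b < 0.4)
    if target == "green":
        return bright & (g > 0.6) & (r < 0.4)
    if target == "yellow":
        return bright & (r > 0.6) & (g > 0.6) & (b < 0.4)
    raise ValueError(target)

img = np.zeros((4, 4, 3))
img[1, 2] = [0.9, 0.1, 0.1]                      # one red lamp pixel
mask = detect_lamp_pixels(img, "red")
# mask is True only at (1, 2)
```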
- known image processing for detecting a traffic light can be applied.
- the signal recognition process is performed not on the entire image data D07 captured by the camera 11 but on a detection area set as a part of it, which reduces the information processing load of signal detection and allows a traffic light to be detected quickly.
- the traffic signal recognition unit 14 outputs the recognition result of the traffic signal as traffic signal information D04.
- the signal recognition process is not limited to the above method, and other methods may be employed.
- FIG. 3 is a block diagram illustrating a detailed configuration of the imaging direction setting unit 13.
- the imaging direction setting unit 13 includes a traffic signal position estimation unit 21, a position change amount calculation unit 22, an imaging posture setting unit 23, a camera posture control unit 24, and a detection area calculation unit 25.
- the traffic signal position estimation unit 21 receives the map information D02 and the vehicle current position information D05, and outputs detected position information D08. Since the map information D02 includes the coordinates of each traffic signal, the traffic signal position estimation unit 21 can obtain the relative coordinates of a traffic signal with respect to the vehicle 51 from the coordinates of the traffic signal, the coordinates of the vehicle 51, and the attitude of the camera 11. Therefore, once the posture with which the camera 11 captures its surroundings is determined, the position at which the traffic light will appear on the captured image can be specified. For example, as shown in FIG. 5, the estimated position (x2, y2) of the traffic light can be set in the image R1. That is, the traffic signal position estimation unit 21 has the function of estimating the position of the traffic signal based on map information around the vehicle 51.
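Mapping the relative coordinates of a traffic signal to a position on the image amounts to a camera projection. A minimal sketch with a pinhole model (the focal length, principal point, and zero pitch/roll are assumptions for brevity, not parameters from the patent):

```python
import numpy as np

def project_to_image(signal_map_xyz, cam_pos_xyz, cam_yaw, f=1000.0,
                     cx=640.0, cy=360.0):
    """Project a traffic light's map coordinates onto the image plane.

    Pinhole model with focal length f (pixels) and principal point (cx, cy);
    camera pitch and roll are taken as zero to keep the sketch short.
    """
    d = np.asarray(signal_map_xyz) - np.asarray(cam_pos_xyz)
    c, s = np.cos(cam_yaw), np.sin(cam_yaw)
    # rotate into the camera frame: x right, y down, z forward
    zc = c * d[0] + s * d[1]           # depth along the optical axis
    xc = -s * d[0] + c * d[1]
    yc = -d[2]
    if zc <= 0:
        return None                    # signal is behind the camera
    return (cx + f * xc / zc, cy + f * yc / zc)

# Signal 50 m ahead and 5 m above a camera heading along +x:
uv = project_to_image([50.0, 0.0, 5.0], [0.0, 0.0, 0.0], 0.0)
# uv -> (640.0, 260.0): centred horizontally, above the image centre
```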
- the position change amount calculation unit 22 has a function of calculating the change amount of the traffic signal position estimated by the traffic signal position estimation unit 21 with the passage of time.
- the imaging posture setting unit 23 refers to the change amount (dx, dy) and estimates the moving direction of the traffic light in the image R1 from it. Based on the estimated moving direction, it determines the imaging direction of the camera 11 so that the traffic light does not go out of frame from the image R1. Specifically, the direction in which the traffic light is moving in the image R1 is obtained from the change amount (dx, dy), and the imaging direction of the camera 11 is determined so that the traffic light is positioned at an appropriate position on the side of the image R1 opposite to that direction.
- the imaging posture setting unit 23 thus functions as an imaging direction setting unit that determines the imaging direction of the camera 11 based on the position of the vehicle 51, the traffic signal position estimated by the traffic signal position estimation unit 21, and the amount of change in the traffic signal position.
- when the moving direction of the traffic light is the upper right, that is, when dx is positive (dx > 0) and dy is negative (dy < 0), it is estimated that the traffic light in the image R1 moves toward the upper right. Therefore, as shown in FIG. 7, the imaging direction of the camera 11 is determined so that the traffic light is located at the lower-left position q2 in the image R1.
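The change-amount logic above can be sketched as follows (image size, margin, and the one-cycle look-ahead are illustrative assumptions; the patent does not specify these values):

```python
def choose_reposition(est_xy, prev_xy, width, height, margin=50):
    """Decide where to re-aim the camera from the signal's on-image motion.

    est_xy:  current estimated signal position (x2, y2) in the image
    prev_xy: position one calculation cycle earlier (x1, y1)
    Returns a target image position on the side opposite the motion,
    or None if the signal is not about to leave the frame.
    """
    dx, dy = est_xy[0] - prev_xy[0], est_xy[1] - prev_xy[1]
    # predict one more cycle; leave unchanged if still safely inside
    nx, ny = est_xy[0] + dx, est_xy[1] + dy
    if margin < nx < width - margin and margin < ny < height - margin:
        return None
    # place the signal opposite its motion (moving upper-right ->
    # re-aim so it sits at the lower left of the frame, position q2)
    tx = margin if dx > 0 else width - margin
    ty = height - margin if dy < 0 else margin
    return (tx, ty)

# Signal drifting toward the upper right of a 1280x720 image:
target = choose_reposition((1250, 40), (1200, 90), 1280, 720)
# target -> (50, 670): the lower-left position
```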
- when it is determined that the traffic light will remain at the estimated position (x2, y2) within the image R1, the current state is maintained without changing the imaging direction of the camera 11.
- the camera orientation control unit 24 controls the orientation of the imaging direction of the camera 11 based on the imaging orientation information D10 output from the imaging orientation setting unit 23 so that the traffic light does not frame out of the image R1. Specifically, the posture of the camera 11 can be controlled by driving the camera 11 so that the rotation angle in the pan and tilt directions becomes the target imaging direction. Then, the posture information D11 of the camera 11 set by the posture control is output. Further, when the imaging posture information D10 does not change between the current calculation and the calculation one cycle before, the current imaging direction is maintained without changing the imaging direction of the camera 11.
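Converting a desired on-image shift into pan/tilt rotation commands can be sketched as below (the small-angle pinhole relation and the sign conventions are assumptions for illustration; a real drive mechanism would use its own calibrated mapping):

```python
import math

def pan_tilt_command(target_uv, current_uv, f=1000.0):
    """Convert a desired on-image shift into pan/tilt angle changes (deg).

    Moving the signal from current_uv to target_uv requires rotating the
    camera by roughly atan(pixel_offset / focal_length) about each axis.
    """
    du = target_uv[0] - current_uv[0]
    dv = target_uv[1] - current_uv[1]
    pan = -math.degrees(math.atan2(du, f))   # panning right shifts the image left
    tilt = math.degrees(math.atan2(dv, f))   # tilting up shifts the image down
    return pan, tilt

# Move the signal from the upper right toward the lower left of the frame:
pan, tilt = pan_tilt_command((50, 670), (1250, 40), f=1000.0)
# positive pan and tilt: rotate toward the signal's real-world direction
```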
- the camera attitude control unit 24 has a function as an imaging direction changing unit that changes the imaging direction so that the imaging direction by the camera 11 becomes the imaging direction set by the imaging attitude setting unit 23.
- the detection area calculation unit 25 sets a detection area for detecting a traffic light from the image R1 captured by the camera 11, based on the posture information D11 of the camera 11, the map information D02, and the vehicle current position information D05.
- in the map information D02, the position of each traffic light is registered in advance as coordinates on the map, so the relative position of a traffic signal with respect to the vehicle 51 can be obtained. From this, the position of the traffic light on the image R1 captured by the camera 11 is obtained, and a detection region is set in the image R1 based on that position.
- the detection area is set to a size that prevents the traffic signal from being out of frame even when an error occurs in vehicle behavior or vehicle current position information. Then, the set detection area information D06 is output.
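A minimal sketch of such a margin-padded detection area (the base size and error margin are illustrative assumptions; the patent only requires that the margin absorb vehicle-behaviour and position errors):

```python
def detection_area(signal_xy, width, height, base=80, err_px=40):
    """Set a detection region around the estimated signal position.

    base:   half-size of the nominal region in pixels
    err_px: extra margin absorbing vehicle-behaviour / position error
    Returns (x0, y0, x1, y1), clipped to the image bounds.
    """
    half = base + err_px
    x, y = signal_xy
    x0, y0 = max(0, int(x - half)), max(0, int(y - half))
    x1, y1 = min(width, int(x + half)), min(height, int(y + half))
    return (x0, y0, x1, y1)

area = detection_area((640, 260), 1280, 720)
# area -> (520, 140, 760, 380)
```

Running the recognition only inside this rectangle is what reduces the processing load compared with scanning the full image data D07.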
- the detection area information D06 is output to the traffic signal recognition unit 14 as shown in FIG.
- the vehicle current position detection unit 12, the imaging direction setting unit 13, and the traffic signal recognition unit 14 described above can be realized using a microcontroller including a CPU, a memory, and an input / output unit.
- the CPU configures a plurality of information processing units (12, 13, 14) included in the microcontroller by executing a computer program installed in advance.
- Part of the memory included in the microcontroller constitutes a map database that stores map information D02.
- the microcontroller may also be used as an ECU used for other control (for example, automatic driving control) related to the vehicle.
- in step S11, the traffic signal position estimation unit 21 shown in FIG. 3 calculates the traffic signal position in the image R1 captured by the camera 11 based on the map information D02 and the vehicle current position information D05. Specifically, the position of the symbol q1 shown in FIG. 5 is calculated. This process is executed at a predetermined calculation cycle.
- in step S12, the position change amount calculation unit 22 calculates the change amount of the traffic light position in the image R1. As shown in FIG. 6, when the position coordinates of the traffic light move from (x1, y1) to (x3, y3), the change amount (dx, dy) is calculated.
- in step S13, the imaging posture setting unit 23 estimates whether or not the traffic light will go out of frame from the image R1. In this processing, as shown in FIG. 6, whether the traffic signal will go out of frame from the image R1 is estimated based on the estimated position coordinates (x2, y2) of the traffic signal and the change amount (dx, dy).
- in step S14, when a frame-out is estimated (YES in step S13), the camera posture control unit 24 sets the imaging direction of the camera 11 so that the traffic light does not go out of frame from the image R1 and the number of imaging-direction changes is kept to a minimum. For example, as shown in FIG. 6, when the traffic light in the image R1 is estimated to exist at the coordinates (x2, y2) and is moving toward the upper right, it is expected to eventually go out of frame from the image R1. Therefore, as shown in FIG. 7, the imaging direction of the camera 11 is set so that the traffic light is located at the position of the symbol q2 shown at the lower left in the image R1. In step S15, the camera posture control unit 24 controls the posture of the camera 11 so that the set imaging direction is obtained.
- if it is estimated in step S13 that the traffic light will not go out of frame (NO in step S13), the process proceeds to step S16. In step S16, the detection area calculation unit 25 sets a detection area for detecting a traffic light from the image R1 captured by the camera 11. As a result, when the vehicle 51 approaches the intersection where the traffic signal is installed, the traffic signal can be prevented from going out of frame from the image R1.
- FIG. 8 is an explanatory diagram schematically showing a state in which the vehicle 51 is traveling on the straight traveling path X1 and approaching the traffic light P1.
- FIG. 8A shows the positional relationship between the vehicle and the traffic light P1, FIG. 8B shows the image R1 before changing the imaging direction of the camera 11, and FIG. 8C shows the image R1 after changing the imaging direction.
- a point Z1 shown in FIG. 8A is a point sufficiently far from the traffic light P1 at which the traffic light P1 can be confirmed in an image captured by the camera 11. The point Z2 is a point where the vehicle 51 has approached the traffic light P1, and the point Z3 is the point where the stop line is set. The region from Z2 to Z3 is therefore the region where the vehicle 51 decides whether to stop or proceed and, when stopping, decelerates and brakes to a halt. In this region it is necessary to recognize changes in the lighting state of the traffic light P1 with high accuracy.
- when the vehicle 51 is at the point Z1, the traffic light P1 appears at the lower right of the image R1 captured by the camera 11, as indicated by reference numeral b1 in FIG. 8B.
- as the vehicle 51 approaches, the traffic light P1 moves toward the upper right in the image R1, as indicated by reference numeral b2.
- the traffic light P1 is also displayed larger, so if this state continues, the traffic light P1 will go out of frame from the image R1.
- the imaging direction of the camera 11 is changed when the vehicle 51 reaches the point Z2 on the travel path. Specifically, the imaging area of the camera 11 is moved in the upper right direction. By doing so, the traffic light P1 moves to the lower left in the image R1, as indicated by reference numeral c1 in FIG. 8C. Accordingly, when the vehicle 51 further travels and reaches the point Z3, the traffic light P1 is reliably displayed without being out of frame from the image R1, as indicated by reference numeral c2. That is, the traffic light P1 can be retained in the image R1 without changing the imaging direction of the camera 11 in the region of the points Z2 to Z3 where the lighting state of the traffic light P1 needs to be recognized with high accuracy. And by setting a detection area in this image R1, highly accurate traffic signal recognition becomes possible.
- when the traffic light recognition apparatus 100 detects the traffic light P1 in the image R1 captured by the camera 11, it estimates, based on the moving direction of the traffic light P1 in the image R1, whether the traffic light P1 will go out of frame. When a frame-out is predicted, the imaging direction of the camera 11 is changed in advance so that the traffic light P1 is placed at a position in the image R1 from which it will not go out of frame.
- as a result, in the region of the points Z2 to Z3, which is the most important region for detecting the lighting state of the traffic light P1, the traffic light P1 can be kept from going out of frame from the image R1 without controlling the camera posture there; changing the imaging direction of the camera 11 becomes unnecessary, or the number of changes is reduced, so blurring in the image captured by the camera 11 can be avoided. Therefore, the lighting state of the traffic light P1 can be reliably detected and used for automatic driving and the like.
- since the imaging direction setting unit 13 calculates the change amount of the imaging direction from the traffic light's position on the image and its future movement range on the image, and sets the imaging direction based on the imaging range of the camera 11 and that change amount, the traffic light P1 can be reliably prevented from going out of frame from the image R1.
- the position of the traffic light P1 existing in the image R1 is estimated based on the vehicle current position information D05 of the vehicle and the map information D02 (see FIG. 3), and the movement direction of the traffic light P1 is estimated. Based on this, the posture of the camera 11 in the imaging direction is controlled.
- the traffic signal position estimation unit 21 shown in FIG. 3 may instead recognize the position of the traffic signal P1 by actually performing image processing on the image R1. The position change amount calculation unit 22 then detects, by image processing, the past position (x1, y1) of the traffic light P1 and its current position (x2, y2), and obtains detection position change information D09 from this detected position information.
- the traffic signal P1 existing in the image in the image R1 is recognized by the image processing, and the posture control of the camera 11 in the imaging direction is performed based on the moving direction of the traffic signal P1. Therefore, it is possible to control the posture of the camera 11 in the imaging direction with higher accuracy.
- the imaging direction setting unit 13 includes a travel route determination unit 26, a traffic light position estimation unit 21, an imaging posture setting unit 23, a camera posture control unit 24, and a detection area calculation unit 25. It differs from the first embodiment in that a travel route determination unit 26 is provided instead of the position change amount calculation unit 22 illustrated in FIG. 3. The same components as those in FIG. 3 are denoted by the same reference numerals, and their description is omitted.
- the travel route determination unit 26 receives the map information D02 and the vehicle current position information D05, and uses them to determine the route on which the vehicle 51 travels. For example, the travel path on which the vehicle 51 is currently traveling is detected from the map information D02, and the vehicle's position on that travel path is detected from the vehicle current position information D05. The route on which the vehicle 51 will travel is then estimated from these results and output as travel route information D12. For example, when it is estimated that the vehicle 51 is traveling just before a curved road and will then enter it (see the vehicle 51 in FIG. 11A described later), information on the curve direction (left or right) and the radius of curvature of the curved road is output as travel route information D12.
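The use of curve direction and radius can be sketched as a frame-out prediction (the field-of-view value, the arc-length approximation, and the function name are assumptions for illustration, not the patent's computation):

```python
import math

def predict_frameout_on_curve(dist_to_signal, curve_radius, turn_dir,
                              cam_fov_deg=60.0):
    """Estimate whether a curve will push the signal out of the camera FOV.

    dist_to_signal: remaining arc length to the signal (m)
    curve_radius:   radius of the curved road (m); turn_dir is 'left'/'right'
    Approximates the vehicle's heading change over the curve and compares
    it with half the horizontal field of view.
    """
    heading_change = math.degrees(dist_to_signal / curve_radius)
    will_frame_out = heading_change > cam_fov_deg / 2.0
    # in the image, the signal appears to drift opposite the turn direction
    apparent_dir = "right" if turn_dir == "left" else "left"
    return will_frame_out, apparent_dir

# 60 m of left curve at radius 80 m before the signal:
out, side = predict_frameout_on_curve(60.0, 80.0, "left")
# ~43 deg of heading change > 30 deg half-FOV -> frame-out to the right
```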
- the imaging posture setting unit 23 determines the imaging direction of the camera 11 based on the travel route information D12 and the detected position information D08 output from the traffic light position estimation unit 21. Specifically, it predicts how the direction in which the camera 11 captures the traffic signal P1 will change according to the traveling state of the vehicle 51, and determines the imaging direction of the camera 11 so that the traffic signal P1 does not go out of frame from the image R1 even when that direction changes.
- in step S31, the traffic light position estimation unit 21 shown in FIG. 9 calculates the traffic light position in the image R1 captured by the camera 11, based on the map information D02 and the vehicle current position information D05. This process is executed at a predetermined calculation cycle.
- in step S32, the travel route determination unit 26 obtains the route that the vehicle 51 is estimated to travel in the future from the map information D02, and further predicts the movement of the traffic light in the image R1 based on the vehicle current position information D05 of the vehicle 51.
- in step S33, the imaging posture setting unit 23 estimates whether or not the traffic light in the image R1 will go out of frame from the image R1. In this processing, based on the travel route of the vehicle 51, whether the traffic light will go out of frame is estimated from information such as the traveling direction of the vehicle 51 when it approaches the intersection where the traffic light is installed.
- in step S34, the camera posture control unit 24 sets the imaging direction of the camera 11 so that the traffic light does not go out of frame from the image R1 and so that the number of imaging direction changes is kept to a minimum, and controls the orientation of the camera 11 accordingly.
- the posture control in the imaging direction of the camera 11 will be described later with reference to FIGS.
- in step S35, the camera posture control unit 24 controls the posture of the camera 11 so that the set imaging direction is obtained. Thereafter, the process proceeds to step S36.
- when it is determined in step S33 that the traffic light will not go out of frame, the process proceeds to step S36.
- in step S36, the detection area calculation unit 25 sets a detection area for detecting the traffic light from the image R1 captured by the camera 11. With this setting, the traffic light can be prevented from going out of frame from the image R1 when the vehicle 51 approaches the intersection where the traffic light is installed.
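The step sequence S31 to S36 above can be sketched as follows. This is an illustrative sketch only, not part of the disclosure: all function names, the normalized 2-D image coordinates, the frame margin, and the detection-window size are hypothetical.

```python
# Minimal sketch of one calculation cycle (S31-S36); names hypothetical.

def will_frame_out(pos, image_w=1.0, image_h=1.0, margin=0.1):
    """S33: return True if the predicted image position (x, y) would
    leave the frame (with a small safety margin)."""
    x, y = pos
    return not (margin <= x <= image_w - margin and
                margin <= y <= image_h - margin)

def run_cycle(predicted_pos, camera_dir, image_w=1.0, image_h=1.0):
    """One cycle: check the predicted traffic-light position (S33),
    re-aim the camera only when a frame-out is expected (S34/S35),
    then set the detection area around the light (S36)."""
    changed = False
    if will_frame_out(predicted_pos, image_w, image_h):
        # S34/S35: re-centre the aim so the light returns to mid-frame.
        dx = predicted_pos[0] - image_w / 2
        dy = predicted_pos[1] - image_h / 2
        camera_dir = (camera_dir[0] + dx, camera_dir[1] + dy)
        predicted_pos = (image_w / 2, image_h / 2)
        changed = True
    # S36: detection window centred on the expected light position.
    detection_area = (predicted_pos[0] - 0.2, predicted_pos[1] - 0.2,
                      predicted_pos[0] + 0.2, predicted_pos[1] + 0.2)
    return camera_dir, predicted_pos, detection_area, changed
```

In this sketch the camera is re-aimed only when the frame-out check fires, mirroring the idea that imaging-direction changes are kept to a minimum.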
- step S34 details of the processing in step S34 described above will be described with reference to FIGS.
- as shown in FIG. 11A, a case where the vehicle 51 travels on a curved road X2 that curves to the right toward an intersection where the traffic light P1 is installed will be described as an example.
- the vehicle 51 travels toward points Z1, Z2, Z3, and Z4.
- at the point Z1 shown in FIG. 11(a), the traffic light P1 is present at the lower left of the image R1, as indicated by reference numeral b1 in FIG. 11(b).
- This enlarged view is shown in FIG.
- Z1 to Z4 shown in FIG. 12 correspond to the points Z1 to Z4 shown in FIG. Therefore, unless the image R1 is shifted by changing the imaging direction of the camera 11, the traffic light P1 will go out of frame from the image R1.
- the movement of the traffic light P1 along the movement locus indicated by the curve L1 is information that can be acquired in advance from the map information D02 and the vehicle current position information D05. Therefore, the camera posture control unit 24 estimates that the position of the traffic light P1 will change as shown by the curve L1 in FIG. 12, and controls the posture of the camera 11 in the imaging direction so that the traffic light P1 does not go out of frame from the image R1 even when such a change occurs.
- the traffic light P1 is then located at the left end of the image R1, as indicated by reference numeral c2. Further, when the vehicle 51 reaches the points Z3 and Z4, the traffic light P1 remains within the image R1, as indicated by reference numerals c3 and c4. That is, when the vehicle 51 reaches the point Z1, the movement of the traffic light in the image R1 is predicted based on the future traveling route of the vehicle 51 and the vehicle current position information D05, and by performing the posture control of the imaging direction of the camera 11 in advance based on this prediction, the traffic light P1 can be captured in the image R1 after passing the point Z1 without any further posture control of the imaging direction of the camera 11.
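The idea of choosing a single imaging direction at point Z1 that covers the whole predicted motion can be sketched as below. This is an illustrative sketch, not the patent's method: the normalized coordinates, field-of-view size, and function name are hypothetical.

```python
def single_aim_offset(trajectory, fov_w=1.0, fov_h=1.0):
    """Given the predicted on-image trajectory of the traffic light
    (offsets relative to the current aim), return one pan/tilt offset
    that keeps every predicted position in frame, or None when the
    motion spans more than the field of view."""
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]
    if max(xs) - min(xs) > fov_w or max(ys) - min(ys) > fov_h:
        return None  # a single aim cannot cover the whole motion
    # Aim at the centre of the trajectory's bounding box.
    return ((max(xs) + min(xs)) / 2, (max(ys) + min(ys)) / 2)
```

When the offset exists, applying it once at a distant point (such as Z1) keeps the light in frame for the rest of the approach, so no further re-aiming is needed.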
- FIG. 13A shows the position of the vehicle 51 and the curved road X2 that is the travel route of the vehicle 51, and is the same view as FIG.
- as indicated by reference sign b1 in FIG. 13(b), when the traffic light P1 is present in the lower left part of the image R1 and is moving in the left direction, the imaging direction of the camera 11 is changed so that the traffic light P1 is positioned on the right side of the image R1, as shown by reference sign c1 in FIG. 13(c).
- the traffic light P1 then moves in the upper right direction, as indicated by reference numeral c2. Therefore, when the vehicle 51 reaches the point Z3, as shown by reference numeral c3, the possibility that the traffic light P1 goes out of frame from the image R1 increases. Accordingly, when the vehicle 51 reaches the point Z3, the imaging direction of the camera 11 is changed so that the traffic light P1 is positioned on the left side of the image R1, as shown by symbol d1 in FIG. 13D. When the vehicle 51 reaches the point Z4, the traffic light P1 is positioned substantially at the center of the image R1, as shown by symbol d2 in FIG. 13(d).
- in this way, the traffic light P1 can be prevented from going out of frame from the image R1, but the imaging direction has been changed twice between the point Z1 and the point Z4. As a result, the time spent controlling the posture of the camera 11 in the imaging direction increases, and the detection accuracy of the lighting state of the traffic light P1 may decrease.
- in contrast, in the traffic light recognition device according to the second embodiment, since the imaging direction of the camera 11 is controlled based on the curved road X2 that is the travel route of the vehicle 51, as shown in FIG., once the imaging direction of the camera 11 is changed at the point Z1, which is sufficiently far away, the traffic light P1 does not go out of frame from the image R1 thereafter. Therefore, the imaging direction of the camera 11 is not changed after the vehicle 51 approaches the traffic light P1.
- as described above, the imaging direction setting unit 13 includes the travel route determination unit 26, and the travel route determination unit 26 estimates the travel route of the vehicle 51 in advance.
- based on the estimated route, the moving range of the traffic light P1 in the image R1, that is, the future moving range of the traffic light on the image, is predicted, and the imaging direction of the camera 11 is changed based on this moving range so that the traffic light does not go out of frame.
- as a result, frame-out of the traffic light P1 can be avoided with the minimum number of imaging direction changes.
- accordingly, a change in the lighting state of the traffic light P1 is reliably detected, and whether the vehicle stops at the intersection or continues traveling can be reliably determined.
- in the embodiments above, the imaging direction of the camera 11 is changed at the point Z1, which is sufficiently far from the intersection where the traffic light P1 exists, so that the imaging direction of the camera 11 need not be changed thereafter.
- in this embodiment, the vehicle 51 is driven automatically, and an area for restricting changes in the imaging direction of the camera (hereinafter referred to as a "change restriction area") is set before the traffic light P1. The camera is then controlled so that the traffic light P1 does not go out of frame from the image R1 without changing the imaging direction of the camera 11 within this change restriction area.
- FIG. 14A is an explanatory diagram showing the change restriction area Q1 set before the traffic light P1.
- when the vehicle 51 approaches the traffic light P1, the lighting state of the traffic light P1 (red, green, etc.) is detected, and it is determined whether the vehicle 51 stops or continues traveling according to that lighting state.
- the area in which this determination is required is set as the change restriction area Q1. That is, since changing the imaging direction of the camera 11 lowers the detection accuracy of the lighting state, the change restriction area Q1 is set so that the lighting state of the traffic light P1 can be detected with high accuracy in the region where this determination is necessary.
- the change restriction area Q1 can be set based on the stop position provided for the traffic light P1, the traveling speed of the vehicle 51, the vehicle current position information D05, and the map information D02.
- the change restriction area Q1 is set, and the posture of the camera 11 is controlled so that the imaging direction of the camera 11 is changed only outside the change restriction area Q1.
- since the traffic light P1 present in the image R1 at the point Z1 is slightly to the right of center, it is determined at this point that the traffic light P1 will not go out of frame. However, since the vehicle 51 has reached the point Z1 immediately before the change restriction area Q1, the imaging direction of the camera 11 is changed at this point. As a result, the traffic light P1 is controlled to be positioned at the lower left in the image R1, as shown by reference sign c1 in FIG. 14C. Thereafter, until the vehicle 51 passes through the change restriction area Q1 and reaches the point Z2, the traffic light P1 can be captured without going out of frame from the image R1, as indicated by reference numerals c2 and c3 in FIG. 14C.
- by contrast, consider the case where the imaging direction of the camera 11 is changed within the change restriction area Q1. That is, when the vehicle 51 reaches the point Z1 in FIG. 15(a), the traffic light P1 is slightly to the right in the image R1, as indicated by reference numeral b1 in FIG. 15(b), and it is not determined that it will go out of frame from the image R1. Then, as indicated by reference sign b2, frame-out is determined only when the traffic light P1 reaches the right end of the image R1 (the position of the vehicle 51 within the change restriction area Q1 in FIG. 15A), and the imaging direction of the camera 11 is changed at that time.
- as a result, the traffic light P1 is controlled to come to the lower left of the image R1. Further, as the vehicle 51 travels, the traffic light P1 is positioned slightly to the right of the center of the image R1, as indicated by reference numeral c2 in FIG. In this case, the imaging direction of the camera 11 is changed within the change restriction area Q1, where the recognition result of the lighting state of the traffic light P1 is required.
- that is, the imaging direction of the camera 11 is changed within the area where it is necessary to determine whether the vehicle 51 stops or continues traveling according to the lighting state of the traffic light P1, and this change in the imaging direction may reduce the detection accuracy of the lighting state of the traffic light P1.
- in this embodiment, the change restriction area Q1 is set in front of the traffic light P1, and changes in the imaging direction of the camera 11 within the change restriction area Q1 are prohibited. Therefore, the traffic light P1 can be prevented from going out of frame from the image R1, and the lighting state of the traffic light P1 can be detected with high accuracy. As a result, it is possible to appropriately determine whether the vehicle 51 should stop or continue traveling.
- the change restriction area Q1 changes according to the vehicle speed, the deceleration G, and the distance to the stop line.
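Since the change restriction area depends on vehicle speed, deceleration G, and the distance to the stop line, one plausible construction is to start it at the braking distance before the stop line. The sketch below is an illustration under that assumption, not the patent's actual formula; the function name and units are hypothetical.

```python
def change_restriction_length(speed_mps, decel_mps2, stop_line_dist_m):
    """Length (m) of the change restriction area Q1 before the stop line.
    Here Q1 is assumed to start at the comfortable braking distance
    v^2 / (2a) and end at the stop line -- one plausible construction
    consistent with the factors named in the text (speed, deceleration,
    distance to the stop line), not the disclosed formula."""
    braking_dist = speed_mps ** 2 / (2.0 * decel_mps2)
    # Q1 cannot start farther away than the vehicle's current distance.
    return min(braking_dist, stop_line_dist_m)
```

With this construction a faster vehicle or a gentler deceleration yields a longer Q1, matching the statement that Q1 changes with speed, deceleration G, and stop-line distance.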
- in the embodiment described above, the change restriction area Q1 for prohibiting changes in the imaging direction of the camera 11 is set in front of the traffic light P1, and the imaging direction of the camera 11 is changed before the vehicle 51 reaches the change restriction area Q1, so that the traffic light P1 does not go out of frame from the image R1.
- however, the detection of the lighting state of the traffic light P1 is executed every predetermined calculation cycle, and of the images captured by the camera 11, for example, only the image at the start of each calculation cycle may be used. In such a case, changing the imaging direction of the camera 11 during the time period within the calculation cycle in which the captured image is not used does not affect the detection of the lighting state of the traffic light P1.
- therefore, the change restriction area is divided into the time zone in which the image captured by the camera 11 is used (hereinafter referred to as the "image usage time zone") and the time zone in which the image is not used and the imaging direction of the camera 11 can be changed (hereinafter referred to as the "change time zone"), and changing the imaging direction is prohibited only in the image usage time zone.
- a region Qa corresponding to the image usage time zone and a region Qb corresponding to the change time zone are set within the change restriction region Q1. These regions Qa and Qb can be set based on the traveling speed of the vehicle 51 and the calculation cycle of the traffic light recognition unit 14. In the region Qb, the imaging direction of the camera 11 can be changed. Therefore, as shown in FIG. 16(b), when the vehicle 51 is traveling at the point Z1, the traffic light P1 is not out of frame, as shown by reference numeral b1 in FIG. 16(b), so the imaging direction is not changed.
- when the vehicle 51 reaches the point Z2, it is determined that the traffic light P1 will go out of frame from the image R1, as indicated by reference numeral b2, so the imaging direction of the camera 11 is changed.
- as a result, the traffic light P1 is positioned at the lower left in the image R1, and thereafter, when the vehicle 51 reaches the point Z3, the traffic light P1 is not out of frame from the image R1, as shown by reference sign c2 in FIG.
- in this embodiment, the region Qa corresponding to the image usage time zone and the region Qb corresponding to the change time zone are set based on the calculation cycle of the traffic light recognition unit 14, and changing the imaging direction of the camera 11 is permitted in the region Qb. Therefore, even after the vehicle 51 enters the change restriction area Q1, the imaging direction of the camera 11 can be changed while traveling in the region Qb, and frame-out of the traffic light P1 from the image R1 can be avoided more reliably.
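The split of the recognition cycle into an image usage time zone (Qa) and a change time zone (Qb) can be sketched as follows. This is an illustrative sketch only; the assumption that the image usage zone occupies the first part of each cycle, and all names and numbers, are hypothetical.

```python
def can_change_direction(t, cycle_s, image_use_s):
    """Within each recognition cycle of length cycle_s, assume the first
    image_use_s seconds use the captured frame (image usage time zone,
    region Qa); the remainder is the change time zone (region Qb),
    during which re-aiming the camera is permitted."""
    return (t % cycle_s) >= image_use_s

def zone_lengths(speed_mps, cycle_s, image_use_s):
    """Road lengths (m) of the Qa and Qb regions traversed during one
    recognition cycle at constant speed."""
    return speed_mps * image_use_s, speed_mps * (cycle_s - image_use_s)
```

As the text notes, the Qa/Qb regions follow from the vehicle speed and the calculation cycle; here they are simply the distances covered during each time zone.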
- in this embodiment, when two traffic lights are present on the travel path and both can be imaged, the posture of the camera 11 in the imaging direction is controlled so that neither traffic light goes out of frame from the image.
- this will be described in detail with reference to FIG. 17A. When two traffic lights P1 and P2 are present on the travel path X3 of the vehicle 51 and both can be imaged, the posture of the camera 11 in the imaging direction is controlled so that both traffic lights P1 and P2 are prevented from going out of frame from the image R1.
- an image R1 in which two traffic lights P1 and P2 are present is obtained as indicated by reference numeral b1 in FIG.
- the imaging direction of the camera 11 is changed at this point Z2.
- the imaging direction of the camera 11 is set and the posture is controlled so that the traffic lights P1 and P2 are positioned on the left side in the image R1. That is, control is performed so that the left traffic light P1 of the two traffic lights P1 and P2 is positioned on the left side of the image R1, and that both traffic lights P1 and P2 are not out of frame.
- the imaging direction of the camera 11 is changed at this point Z3.
- the imaging direction of the camera 11 is set so that the traffic lights P1 and P2 are positioned on the left side in the image R1, and the posture is controlled.
- thus, the traffic lights P1 and P2 can be continuously captured without going out of frame from the image R1.
- the posture control in the imaging direction of the camera 11 is performed at three points Z2, Z3, and Z4 shown in FIG.
- by performing such posture control of the imaging direction of the camera 11 in the region Qb, in which the image captured by the camera 11 is not used in the signal recognition process, as shown in FIG., it is possible to change the imaging direction of the camera 11 without affecting the detection accuracy of the traffic light state.
- in this embodiment, the imaging direction of the camera 11 is set and its posture is controlled so that the traffic lights P1 and P2 do not go out of frame from the image R1.
- the present invention is not limited to two traffic lights, and can be similarly applied to cases where there are three or more traffic lights.
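Keeping several traffic lights in one frame amounts to checking whether their angular spread fits within the camera's field of view and, if so, centering the group. The sketch below illustrates this under hypothetical names and angle conventions; it is not the disclosed control law.

```python
def aim_for_all_lights(bearings_deg, fov_deg):
    """Return a camera pan angle (degrees, relative to the vehicle
    heading) that keeps every traffic-light bearing inside the
    horizontal field of view, or None when the lights are spread
    wider than the FOV and cannot all share one frame."""
    lo, hi = min(bearings_deg), max(bearings_deg)
    if hi - lo > fov_deg:
        return None
    return (hi + lo) / 2.0  # centre the group in the frame
```

The same check works unchanged for three or more lights, since only the extreme bearings matter.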
- in this embodiment, the imaging direction of the camera 11 is set so that, of the traffic lights, the one with the smaller amount of movement in the image R1 is preferentially prevented from going out of frame, and the orientation of the camera 11 is controlled accordingly.
- as shown in FIG. 18A, when there are two traffic lights P1 and P2 on the traveling path of the vehicle 51, the image R1 captured by the camera 11 at the point Z1 is as indicated by reference numeral b1 in FIG., and the traffic lights P1 and P2 are captured at the center of the image R1.
- the posture control of the camera 11 in the imaging direction is not performed even when it is determined that the traffic light P2, which has the larger amount of movement in the image, will go out of frame.
- instead, the posture of the camera 11 in the imaging direction is controlled so that the traffic light P1 is positioned on the left side of the image R1, as indicated by the reference numeral in FIG. As a result, the traffic light P1 is captured without going out of frame from the image R1.
- since the traffic light P2 operates in synchronization with the traffic light P1, no problem arises even if its lighting state cannot be detected.
- by performing the posture control of the imaging direction of the camera 11 at the point Z3 in the region Qb, in which the image captured by the camera 11 is not used in the signal recognition process, as shown in FIG., it becomes possible to change the imaging direction of the camera 11 without affecting the detection accuracy of the traffic light state.
- as described above, in the traffic light recognition device, when two traffic lights P1 and P2 that operate in synchronization with each other exist in the image R1 captured by the camera 11, one of them (in the above example, the traffic light P1) is controlled so as not to go out of frame from the image R1. Therefore, the number of changes in the imaging direction of the camera 11 can be reduced, and the lighting state of the traffic light can be reliably detected.
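Choosing which of several synchronized lights to keep in frame reduces to picking the one with the smallest predicted on-image movement. The sketch below illustrates this selection; the function name and data shape are hypothetical, not from the disclosure.

```python
def pick_priority_light(movement_by_light):
    """Given predicted on-image movement amounts for traffic lights that
    operate in synchronization, return the light to keep in frame: the
    one moving least on the image, which minimizes camera re-aims while
    any one of the synchronized lights still reveals the shared
    lighting state."""
    return min(movement_by_light, key=movement_by_light.get)
```

Because the lights change state together, tracking only the least-moving one loses no lighting-state information.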
- the traffic light recognition device and the traffic light recognition method of the present invention have been described based on the illustrated embodiments; however, the present invention is not limited thereto, and the configuration of each unit can be replaced with any configuration having the same function.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Life Sciences & Earth Sciences (AREA)
- Mechanical Engineering (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Atmospheric Sciences (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Traffic Control Systems (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
Abstract
Description
[Description of First Embodiment]
FIG. 1 is a block diagram showing the configuration of a traffic light recognition device according to the first embodiment of the present invention and its peripheral equipment. FIG. 2 is a block diagram showing the traffic light recognition device 100 of FIG. 1 in detail. As shown in FIG. 1, the traffic light recognition device 100 is mounted on a vehicle 51 and receives map information D02, camera information D03, vehicle current position information D05, and image data D07 from various devices mounted on the vehicle 51. The traffic light information D04, which is the information recognized by the traffic light recognition device 100, is then output to a subsequent device.
Next, a modification of the traffic light recognition device 100 according to the first embodiment will be described. In the first embodiment described above, the position of the traffic light P1 present in the image R1 is estimated based on the vehicle current position information D05 and the map information D02 (see FIG. 3), and the posture of the camera 11 in the imaging direction is controlled based on the moving direction of the traffic light P1.
Next, the second embodiment of the present invention will be described. The overall configuration of the traffic light recognition device according to the second embodiment is the same as that of FIG. 1 described above, except for the configuration of the imaging direction setting unit 13. The configuration of the imaging direction setting unit 13 according to the second embodiment will be described below with reference to the block diagram shown in FIG. 9.
Next, the third embodiment of the present invention will be described. In the first and second embodiments described above, the imaging direction of the camera 11 is changed at the point Z1, which is sufficiently far from the intersection where the traffic light P1 exists, so that no further change in the imaging direction of the camera 11 is required thereafter.
In the third embodiment described above, the change restriction area Q1 prohibiting changes in the imaging direction of the camera 11 is set in front of the traffic light P1, and the imaging direction of the camera 11 is changed before the vehicle 51 reaches the change restriction area Q1, thereby avoiding frame-out of the traffic light P1 from the image R1.
Next, the fourth embodiment of the present invention will be described. Since the device configuration is the same as that shown in FIGS. 1 to 3 of the first embodiment, its description is omitted.
Next, the fifth embodiment of the present invention will be described. Since the device configuration is the same as that shown in FIGS. 1 to 3 of the first embodiment, its description is omitted.
12 vehicle current position detection unit
13 imaging direction setting unit
14 traffic light recognition unit
21 traffic light position estimation unit
22 position change amount calculation unit
23 imaging posture setting unit
24 camera posture control unit
25 detection area calculation unit
26 travel route determination unit
51 vehicle
100 traffic light recognition device
D01 landmark information
D02 map information
D03 camera information
D04 traffic light information
D05 vehicle current position information
D06 detection area information
D07 image data
D08 detected position information
D09 detected position change information
D10 imaging posture information
D11 posture information
D12 travel route information
P1, P2 traffic light
Q1 change restriction area
R1 image
X1 travel path
X2 curved road
X3 travel path
Claims (9)
- an imaging unit mounted on a vehicle and configured to capture an image of the surroundings of the vehicle;
a map information acquisition unit configured to acquire map information around the vehicle;
a vehicle current position detection unit configured to detect the current position of the vehicle on a map;
a traffic light position estimation unit configured to estimate the position of a traffic light on the image based on the vehicle current position and the map information;
an imaging direction setting unit configured to set an imaging direction of the imaging unit based on the position of the traffic light on the image and a future moving direction of the traffic light on the image;
an imaging direction changing unit configured to change the imaging direction of the imaging unit to the imaging direction set by the imaging direction setting unit; and
a traffic light recognition unit configured to recognize the traffic light from the image captured by the imaging unit in the imaging direction,
a traffic light recognition device characterized by comprising the above. - The imaging direction setting unit sets the imaging direction of the imaging unit based on the position of the traffic light on the image and a future moving range of the traffic light on the image,
the traffic light recognition device according to claim 1. - The imaging direction setting unit calculates a change amount of the imaging direction from the position of the traffic light on the image and the future moving range of the traffic light on the image, and sets the imaging direction based on an imaging range of the imaging unit and the change amount,
the traffic light recognition device according to claim 2. - The imaging direction setting unit predicts the future moving range of the traffic light on the image based on a future travel route of the vehicle,
the traffic light recognition device according to claim 2 or 3. - When a plurality of traffic lights can be imaged by the imaging unit, the imaging direction setting unit sets the imaging direction of the imaging unit so that the plurality of traffic lights are included in the image,
the traffic light recognition device according to any one of claims 1 to 4. - When a plurality of traffic lights operating in synchronization can be imaged by the imaging unit, the imaging direction setting unit sets the imaging direction of the imaging unit based on the moving direction of the traffic light whose amount of movement on the image is smallest among the plurality of synchronized traffic lights,
the traffic light recognition device according to any one of claims 1 to 5. - The imaging direction setting unit sets a change restriction area that restricts changes in the imaging direction of the imaging unit based on the distance from a stop position provided for the traffic light to the vehicle and the traveling speed of the vehicle, and changes the imaging direction of the imaging unit before entering the change restriction area,
the traffic light recognition device according to any one of claims 1 to 6. - The imaging direction setting unit sets a change time zone for the imaging direction of the imaging unit based on the distance from a stop position provided for the traffic light to the vehicle, the traveling speed of the vehicle, and the calculation cycle of traffic light recognition by the traffic light recognition unit, and
the imaging direction changing unit changes the imaging direction of the imaging unit in the change time zone,
the traffic light recognition device according to any one of claims 1 to 6. - capturing an image of the surroundings of a vehicle with an imaging unit;
acquiring map information around the vehicle;
detecting the current position of the vehicle on a map;
estimating the position of the traffic light on the image based on the vehicle current position and the map information;
setting an imaging direction of the imaging unit based on the position of the traffic light on the image and a future moving direction of the traffic light on the image;
changing the imaging direction of the imaging unit to the set imaging direction; and
recognizing the traffic light from the image captured by the imaging unit in the imaging direction,
a traffic light recognition method characterized by comprising the above steps.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
RU2018105099A RU2693419C1 (ru) | 2015-07-13 | 2015-07-13 | Устройство (варианты) и способ распознавания светофоров |
KR1020187001244A KR101988811B1 (ko) | 2015-07-13 | 2015-07-13 | 신호기 인식 장치 및 신호기 인식 방법 |
BR112018000191-0A BR112018000191B1 (pt) | 2015-07-13 | 2015-07-13 | Dispositivo de reconhecimento de semáforo e método de reconhecidmento de semáforo |
CN201580081578.7A CN107851387B (zh) | 2015-07-13 | 2015-07-13 | 信号机识别装置及信号机识别方法 |
US15/743,072 US10339805B2 (en) | 2015-07-13 | 2015-07-13 | Traffic light recognition device and traffic light recognition method |
MX2018000377A MX361911B (es) | 2015-07-13 | 2015-07-13 | Dispositivo de reconocimiento de semaforo y metodo de reconocimiento de semaforo. |
JP2017528037A JP6447729B2 (ja) | 2015-07-13 | 2015-07-13 | 信号機認識装置及び信号機認識方法 |
CA2992080A CA2992080C (en) | 2015-07-13 | 2015-07-13 | Traffic light recognition device and traffic light recognition method |
EP15898246.2A EP3324383B1 (en) | 2015-07-13 | 2015-07-13 | Traffic light recognition device and traffic light recognition method |
PCT/JP2015/070041 WO2017009933A1 (ja) | 2015-07-13 | 2015-07-13 | 信号機認識装置及び信号機認識方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2015/070041 WO2017009933A1 (ja) | 2015-07-13 | 2015-07-13 | 信号機認識装置及び信号機認識方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017009933A1 true WO2017009933A1 (ja) | 2017-01-19 |
Family
ID=57757866
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/070041 WO2017009933A1 (ja) | 2015-07-13 | 2015-07-13 | 信号機認識装置及び信号機認識方法 |
Country Status (10)
Country | Link |
---|---|
US (1) | US10339805B2 (ja) |
EP (1) | EP3324383B1 (ja) |
JP (1) | JP6447729B2 (ja) |
KR (1) | KR101988811B1 (ja) |
CN (1) | CN107851387B (ja) |
BR (1) | BR112018000191B1 (ja) |
CA (1) | CA2992080C (ja) |
MX (1) | MX361911B (ja) |
RU (1) | RU2693419C1 (ja) |
WO (1) | WO2017009933A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019175051A (ja) * | 2018-03-28 | 2019-10-10 | 本田技研工業株式会社 | 車両制御装置 |
CN110831832A (zh) * | 2017-06-30 | 2020-02-21 | 德尔福技术有限公司 | 自动车辆的移动交通灯检测系统 |
EP3605500A4 (en) * | 2017-03-28 | 2021-01-13 | Pioneer Corporation | OUTPUT DEVICE, ORDERING PROCESS, PROGRAM AND STORAGE MEDIA |
JP2021076883A (ja) * | 2019-11-05 | 2021-05-20 | 三菱スペース・ソフトウエア株式会社 | データベース生成システムおよびデータベース生成プログラム |
JP2021076884A (ja) * | 2019-11-05 | 2021-05-20 | 三菱スペース・ソフトウエア株式会社 | 自動検出システムおよび自動検出プログラム |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6552064B2 (ja) * | 2017-03-31 | 2019-07-31 | 株式会社Subaru | 車両の走行制御システム |
EP3635622A4 (en) * | 2017-06-08 | 2020-05-27 | Zhejiang Dahua Technology Co., Ltd | DEVICES AND METHODS FOR PROCESSING TRAFFIC LIGHT IMAGES |
US10567724B2 (en) * | 2018-04-10 | 2020-02-18 | GM Global Technology Operations LLC | Dynamic demosaicing of camera pixels |
CN108897345A (zh) * | 2018-07-18 | 2018-11-27 | 北京小马智行科技有限公司 | 一种控制无人车摄像头旋转的方法及系统 |
US10339400B1 (en) * | 2018-12-18 | 2019-07-02 | Chongqing Jinkang New Energy Automobile Co., Ltd. | Traffic light detection using multiple cameras |
DE102018133441A1 (de) * | 2018-12-21 | 2020-06-25 | Volkswagen Aktiengesellschaft | Verfahren und System zum Bestimmen von Landmarken in einer Umgebung eines Fahrzeugs |
CN110647605B (zh) * | 2018-12-29 | 2022-04-29 | 北京奇虎科技有限公司 | 一种基于轨迹数据挖掘红绿灯数据的方法及装置 |
US10930145B2 (en) * | 2019-03-06 | 2021-02-23 | Avanti R&D, Inc. | Traffic system for predicting and providing traffic signal switching timing |
JP7268497B2 (ja) * | 2019-06-24 | 2023-05-08 | トヨタ自動車株式会社 | 信号認識システム |
JP7088137B2 (ja) * | 2019-07-26 | 2022-06-21 | トヨタ自動車株式会社 | 信号機情報管理システム |
WO2021094799A1 (ja) * | 2019-11-12 | 2021-05-20 | 日産自動車株式会社 | 信号機認識方法及び信号機認識装置 |
JP7459946B2 (ja) * | 2020-07-31 | 2024-04-02 | 日本電気株式会社 | 重要領域設定装置、符号化装置、重要領域の設定方法及びプログラム |
CN112489466B (zh) * | 2020-11-27 | 2022-02-22 | 恒大新能源汽车投资控股集团有限公司 | 交通信号灯识别方法和装置 |
JP7525427B2 (ja) * | 2021-03-23 | 2024-07-30 | トヨタ自動車株式会社 | 車両制御装置 |
US12056937B2 (en) | 2021-11-12 | 2024-08-06 | Toyota Research Institute, Inc. | Probabilistic modular lane transition state estimation |
CN114332815B (zh) * | 2021-12-24 | 2023-08-29 | 广州小鹏自动驾驶科技有限公司 | 交通灯状态检测方法、装置、车辆及存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003151042A (ja) * | 2001-11-13 | 2003-05-23 | Matsushita Electric Ind Co Ltd | 車両用ドライブレコーダ |
JP2007288444A (ja) * | 2006-04-14 | 2007-11-01 | Toyota Motor Corp | 車載カメラ制御装置および車載カメラ制御方法。 |
WO2008038370A1 (en) * | 2006-09-28 | 2008-04-03 | Pioneer Corporation | Traffic information detector, traffic information detecting method, traffic information detecting program, and recording medium |
WO2014162797A1 (ja) * | 2013-04-04 | 2014-10-09 | 日産自動車株式会社 | 信号認識装置 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11306489A (ja) | 1998-04-16 | 1999-11-05 | Matsushita Electric Ind Co Ltd | カメラシステム |
EP1220182A3 (en) * | 2000-12-25 | 2005-08-17 | Matsushita Electric Industrial Co., Ltd. | Image detection apparatus, program, and recording medium |
US20150235094A1 (en) * | 2014-02-17 | 2015-08-20 | General Electric Company | Vehicle imaging system and method |
JP2006163756A (ja) * | 2004-12-07 | 2006-06-22 | Honda Lock Mfg Co Ltd | 車両の視界補助装置 |
JP4631750B2 (ja) * | 2006-03-06 | 2011-02-16 | トヨタ自動車株式会社 | 画像処理システム |
JP5427203B2 (ja) * | 2011-03-30 | 2014-02-26 | 富士重工業株式会社 | 車両用運転支援装置 |
MX2014000649A (es) * | 2011-08-02 | 2014-04-30 | Nissan Motor | Dispositivo de asistencia de manejo y metodo de asistencia de manejo. |
US8989914B1 (en) * | 2011-12-19 | 2015-03-24 | Lytx, Inc. | Driver identification based on driving maneuver signature |
KR101361663B1 (ko) * | 2012-03-21 | 2014-02-12 | 주식회사 코아로직 | 차량용 영상 처리 장치 및 방법 |
US9145140B2 (en) * | 2012-03-26 | 2015-09-29 | Google Inc. | Robust method for detecting traffic signals and their associated states |
JP5761109B2 (ja) * | 2012-04-10 | 2015-08-12 | トヨタ自動車株式会社 | 運転支援装置 |
US9176500B1 (en) * | 2012-05-14 | 2015-11-03 | Google Inc. | Consideration of risks in active sensing for an autonomous vehicle |
US20130335579A1 (en) * | 2012-06-15 | 2013-12-19 | Palo Alto Research Center Incorporated | Detection of camera misalignment |
US8988574B2 (en) * | 2012-12-27 | 2015-03-24 | Panasonic Intellectual Property Corporation Of America | Information communication method for obtaining information using bright line image |
JP5886799B2 (ja) | 2013-08-05 | 2016-03-16 | 富士重工業株式会社 | 車外環境認識装置 |
GB2517788B (en) * | 2013-09-03 | 2016-06-08 | Jaguar Land Rover Ltd | Water depth estimation apparatus and method |
JP6180968B2 (ja) * | 2014-03-10 | 2017-08-16 | 日立オートモティブシステムズ株式会社 | 車両制御装置 |
RU144555U1 (ru) * | 2014-05-05 | 2014-08-27 | Павел Юрьевич Михайлов | Устройство для повышения безопасности движения транспортного средства |
JP2016081359A (ja) * | 2014-10-20 | 2016-05-16 | クラリオン株式会社 | 情報提示装置 |
JP6462328B2 (ja) * | 2014-11-18 | 2019-01-30 | 日立オートモティブシステムズ株式会社 | 走行制御システム |
JP6361567B2 (ja) * | 2015-04-27 | 2018-07-25 | トヨタ自動車株式会社 | 自動運転車両システム |
-
2015
- 2015-07-13 EP EP15898246.2A patent/EP3324383B1/en active Active
- 2015-07-13 BR BR112018000191-0A patent/BR112018000191B1/pt active IP Right Grant
- 2015-07-13 RU RU2018105099A patent/RU2693419C1/ru active
- 2015-07-13 KR KR1020187001244A patent/KR101988811B1/ko active IP Right Grant
- 2015-07-13 US US15/743,072 patent/US10339805B2/en active Active
- 2015-07-13 MX MX2018000377A patent/MX361911B/es active IP Right Grant
- 2015-07-13 WO PCT/JP2015/070041 patent/WO2017009933A1/ja active Application Filing
- 2015-07-13 CN CN201580081578.7A patent/CN107851387B/zh active Active
- 2015-07-13 JP JP2017528037A patent/JP6447729B2/ja active Active
- 2015-07-13 CA CA2992080A patent/CA2992080C/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003151042A (ja) * | 2001-11-13 | 2003-05-23 | Matsushita Electric Ind Co Ltd | 車両用ドライブレコーダ |
JP2007288444A (ja) * | 2006-04-14 | 2007-11-01 | Toyota Motor Corp | 車載カメラ制御装置および車載カメラ制御方法。 |
WO2008038370A1 (en) * | 2006-09-28 | 2008-04-03 | Pioneer Corporation | Traffic information detector, traffic information detecting method, traffic information detecting program, and recording medium |
WO2014162797A1 (ja) * | 2013-04-04 | 2014-10-09 | 日産自動車株式会社 | 信号認識装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3324383A4 * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3605500A4 (en) * | 2017-03-28 | 2021-01-13 | Pioneer Corporation | OUTPUT DEVICE, ORDERING PROCESS, PROGRAM AND STORAGE MEDIA |
US11420632B2 (en) | 2017-03-28 | 2022-08-23 | Pioneer Corporation | Output device, control method, program and storage medium |
CN110831832A (zh) * | 2017-06-30 | 2020-02-21 | 德尔福技术有限公司 | 自动车辆的移动交通灯检测系统 |
JP2019175051A (ja) * | 2018-03-28 | 2019-10-10 | 本田技研工業株式会社 | 車両制御装置 |
JP6990137B2 (ja) | 2018-03-28 | 2022-01-12 | 本田技研工業株式会社 | 車両制御装置 |
JP2021076883A (ja) * | 2019-11-05 | 2021-05-20 | 三菱スペース・ソフトウエア株式会社 | データベース生成システムおよびデータベース生成プログラム |
JP2021076884A (ja) * | 2019-11-05 | 2021-05-20 | 三菱スペース・ソフトウエア株式会社 | 自動検出システムおよび自動検出プログラム |
Also Published As
Publication number | Publication date |
---|---|
RU2693419C1 (ru) | 2019-07-02 |
US10339805B2 (en) | 2019-07-02 |
EP3324383A1 (en) | 2018-05-23 |
MX361911B (es) | 2018-12-19 |
KR101988811B1 (ko) | 2019-06-12 |
CA2992080C (en) | 2020-01-28 |
KR20180018732A (ko) | 2018-02-21 |
CN107851387A (zh) | 2018-03-27 |
EP3324383B1 (en) | 2021-05-19 |
MX2018000377A (es) | 2018-03-14 |
JP6447729B2 (ja) | 2019-01-16 |
BR112018000191B1 (pt) | 2023-02-14 |
EP3324383A4 (en) | 2018-12-05 |
US20180365991A1 (en) | 2018-12-20 |
CN107851387B (zh) | 2021-04-27 |
BR112018000191A2 (ja) | 2018-09-11 |
JPWO2017009933A1 (ja) | 2018-05-31 |
CA2992080A1 (en) | 2017-01-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6447729B2 (ja) | Traffic light recognition device and traffic light recognition method | |
EP3324384B1 (en) | Traffic light recognition device and traffic light recognition method | |
US9235767B2 (en) | Detection region modification for driving assistance apparatus and driving assistance method | |
JP2008021196A (ja) | Vehicle surrounding environment recognition device and system | |
JP2011180982A (ja) | Lane marking detection device | |
US11663834B2 (en) | Traffic signal recognition method and traffic signal recognition device | |
KR101974772B1 (ko) | Traffic light detection device and traffic light detection method | |
JP6115429B2 (ja) | Host vehicle position recognition device | |
JP2007164566A (ja) | Vehicle sensing system and device for traffic-actuated signal control | |
US11679769B2 (en) | Traffic signal recognition method and traffic signal recognition device | |
CN107633213A (zh) | Traffic light recognition method for a driverless vehicle | |
JP6253175B2 (ja) | Vehicle exterior environment recognition device | |
JP5652374B2 (ja) | Vehicle headlamp control device | |
JP7202844B2 (ja) | Traffic light recognition method and traffic light recognition device | |
JP6295868B2 (ja) | Vehicle display device | |
RU2779798C1 (ru) | Traffic light recognition method and traffic light recognition device | |
JP2004343303A (ja) | Exposure control device for in-vehicle camera | |
JP2008199166A (ja) | Temperature drift correction device and method | |
JP2007164565A (ja) | Vehicle sensing system and device for traffic-actuated signal control | |
JP2007164564A (ja) | Vehicle sensing system and device for traffic-actuated signal control |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15898246; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2017528037; Country of ref document: JP; Kind code of ref document: A |
ENP | Entry into the national phase | Ref document number: 2992080; Country of ref document: CA |
WWE | Wipo information: entry into national phase | Ref document number: MX/A/2018/000377; Country of ref document: MX |
ENP | Entry into the national phase | Ref document number: 20187001244; Country of ref document: KR; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 2018105099; Country of ref document: RU; Ref document number: 2015898246; Country of ref document: EP |
REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112018000191; Country of ref document: BR |
ENP | Entry into the national phase | Ref document number: 112018000191; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20180104 |