WO2019177019A1 - Traffic signal recognizing device, traffic signal recognition method, and program - Google Patents

Traffic signal recognizing device, traffic signal recognition method, and program

Info

Publication number
WO2019177019A1
Authority
WO
WIPO (PCT)
Prior art keywords
lamp
area
similarity
color
traffic light
Application number
PCT/JP2019/010260
Other languages
French (fr)
Japanese (ja)
Inventor
良介 後藤
サヒム コルコス
村田 久治
本村 秀人
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Publication of WO2019177019A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/90 - Determination of colour characteristics

Definitions

  • The present disclosure relates to a traffic signal recognition device that recognizes traffic signals.
  • This traffic light recognition device acquires an image generated by photographing a traffic light, and recognizes the light color of the traffic light based on the image.
  • However, the device of Patent Document 1 has a problem: in order to properly recognize the traffic light, it requires an image of the traffic light generated by photographing under severe restrictions.
  • The present disclosure therefore provides a traffic light recognition device that can relax the restrictions on photographing a traffic light.
  • A traffic signal recognition apparatus according to the present disclosure includes a processor and a memory. Using the memory, the processor extracts, from a recognition target image acquired by a sensor, an area in which a traffic light having N lamp parts (N is an integer of 3 or more) is projected as a traffic light area, and calculates, from the traffic light area, similarities between a plurality of images in which different lamp parts are projected.
  • The traffic signal recognition device of the present disclosure can thus ease the restrictions on photographing a traffic light.
  • FIG. 1 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to an embodiment.
  • FIG. 2 is a diagram for explaining processing of the region extraction unit, the region division unit, and the similarity calculation unit in the embodiment.
  • FIG. 3 is a diagram for explaining processing of the light color recognition unit in the embodiment.
  • FIG. 4 is a flowchart showing the processing operation of the traffic signal recognition apparatus in the embodiment.
  • FIG. 5 is a block diagram illustrating a configuration of the traffic signal recognition apparatus according to the first modification of the embodiment.
  • FIG. 6 is a flowchart illustrating the processing operation of the traffic signal recognition apparatus according to the first modification of the embodiment.
  • FIG. 7 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to Modification 2 of the embodiment.
  • FIG. 8 is a diagram for explaining the recognition of the light color by the traffic light recognition apparatus according to the second modification of the embodiment.
  • FIG. 9 is a flowchart illustrating the processing operation of the traffic signal recognition apparatus according to the second modification of the embodiment.
  • FIG. 10 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to the third modification of the embodiment.
  • FIG. 11 is a diagram for explaining the recognition of the light color by the traffic light recognition apparatus according to the third modification of the embodiment.
  • FIG. 12 is a flowchart illustrating the processing operation of the traffic signal recognition apparatus according to the third modification of the embodiment.
  • FIG. 13 is a diagram for explaining recognition of the traveling direction by the traffic light recognition apparatus according to the fourth modification of the embodiment.
  • FIG. 14 is a diagram illustrating an example of a method of calculating the similarity of the first lamp unit region in the embodiment.
  • FIG. 15 is a diagram illustrating an example of a method for calculating the similarity of each lamp area according to the fifth modification of the embodiment.
  • FIG. 16 is a diagram illustrating another example of a method for calculating the similarity of each lamp unit region according to the fifth modification of the embodiment.
  • the traffic signal recognition device of Patent Document 1 acquires a color feature image generated by shooting at a predetermined shutter speed. Then, the traffic signal recognizing device extracts, from the color feature image, a circular area that shows the same color as the lighting color of the traffic light as a color feature candidate area. Further, the traffic signal recognition apparatus acquires a shape feature image generated by photographing at a shutter speed determined based on the average luminance around the color feature candidate region. Then, the traffic signal recognition device extracts a region that matches the shape of a predetermined traffic signal as a shape feature candidate region from the periphery of the color feature candidate region in the shape feature image. The shape feature candidate area extracted in this way is recognized as a traffic light area, and the color of the color feature candidate area is recognized as a lighting color of the traffic light, that is, a light color.
  • That is, in order for this traffic light recognition device to recognize the light color of the traffic light, a color feature image generated by photographing at a predetermined shutter speed is necessary.
  • The predetermined shutter speed is set so that the lighting color of the traffic light displayed in the color feature image appears clearly. That is, the predetermined shutter speed needs to be adjusted to a shutter speed suitable for the situation or environment so that color saturation does not occur in the color feature image.
  • Furthermore, in order to search for the shape feature candidate region, a template that indicates the shape of the traffic signal is required. Since this search is performed based on the position of the color feature candidate area, a template is required for each position of the lighting color in the traffic light. Therefore, multiple types of templates are necessary.
  • A traffic light recognition apparatus according to one aspect of the present disclosure includes a processor and a memory. Using the memory, the processor extracts, from a recognition target image acquired by a sensor, an area in which a traffic light having N lamp parts (N is an integer of 3 or more) is displayed as a traffic light area, and calculates, from the traffic light area, similarities between a plurality of images in which different lamp parts are projected.
  • For example, the processor may generate N lamp part areas as the plurality of images, and calculate the similarity of each of the N lamp part areas.
  • With this, a lamp part that is not similar to any other lamp part among the N lamp parts of the traffic light can be recognized as a lit lamp part.
  • Further, based on the similarity calculated for each of the N lamp part areas, the processor may determine whether or not there exists a dissimilar region, that is, a lamp part area that is not similar to any other lamp part area among the N lamp part areas.
  • Further, when determining that the dissimilar region exists, the processor may recognize the color associated in advance with the position of the dissimilar region in the traffic light region as the lamp color in the recognition target image. Specifically, in recognizing the lamp color, the processor may refer to association information indicating the color associated with each of the positions of the N lamp part areas in the traffic light region, and recognize the color associated in the association information with the position of the dissimilar region as the lamp color in the recognition target image.
  • Thereby, the lamp color is recognized based on the position of the dissimilar area. For example, if one of the N lamp parts of the traffic light is lit and the other N-1 lamp parts are unlit, the lamp part area in which the one lit lamp part is displayed is determined to be the dissimilar region. If that dissimilar area is at the left end of the traffic light area, blue is recognized as the lamp color. As described above, since the lamp color is recognized based on the similarities between the N lamp part areas, that is, the differences between the feature amounts of the N lamp part areas, the lamp color can be appropriately recognized.
  • For example, when color saturation occurs in the recognition target image, the lit lamp part may be reflected in the image not as blue but as a color close to white or yellow.
  • However, since the lamp color is recognized based on the differences between the feature amounts of the N lamp part areas, the light color can be appropriately recognized even for such a recognition target image in which color saturation occurs.
  • That is, the traffic light recognition apparatus can recognize the light color even from an image generated by photographing with a camera that is not highly accurate. Therefore, the light color can be appropriately recognized even for an image generated by photographing with a low-accuracy, vehicle-mounted camera chosen to withstand severe environmental conditions.
  • Moreover, the shape of each of the N lamp parts included in the traffic light may be circular, quadrangular, or any other shape.
  • The color of the lamp part of the traffic light is not limited to blue, yellow, and red; any color can be appropriately recognized.
  • Furthermore, since no template indicating the shape of the traffic light is required, the memory capacity for holding templates can be reduced.
  • Further, for each lamp part area in the recognition target image, the processor may calculate a correlation coefficient between the lamp part area and each of at least two other lamp part areas, and select the largest of those correlation coefficients as the similarity of the lamp part area.
  • For example, the processor may determine that the lamp part area having the smallest similarity among the similarities of the N lamp part areas exists as the dissimilar region.
  • Alternatively, when the smallest similarity among the similarities of the N lamp part areas is equal to or less than a threshold value, the processor may determine that the lamp part area having the smallest similarity exists as the dissimilar region.
  • Thereby, it is determined that the dissimilar region exists when the smallest similarity is equal to or less than the threshold, and that it does not exist when the smallest similarity is larger than the threshold. Therefore, the lamp color is recognized only when there is a significant difference among the similarities of the N lamp part areas, and as a result, recognition of an inappropriate lamp color can be suppressed.
  • Further, when determining in the recognition of the lamp color that the dissimilar region does not exist, the processor may refer to history information indicating the lamp color recognized for an image acquired in the past, and recognize the lamp color indicated by the history information as the lamp color in the recognition target image.
  • Further, the processor may refer to history information indicating the lamp color recognized for each of a plurality of images acquired before the recognition target image, specify the majority lamp color, that is, the most frequent color among the lamp color recognized for the recognition target image and the lamp colors indicated by the history information, and update the lamp color recognized for the recognition target image to the majority lamp color.
  • Here, the plurality of images and the recognition target image are, for example, a series of images generated by photographing at a constant frame rate. The lamp color recognized for the recognition target image is updated to the majority lamp color, that is, the most frequent color among the lamp color of the recognition target image and the lamp colors of the plurality of images. Therefore, even if, due to a factor other than flicker such as noise, it is suddenly determined that no dissimilar area exists in the recognition target image, or an incorrect lamp color is recognized for the recognition target image, the error can be easily corrected.
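The majority-color update described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the function name, window size, and use of a deque are assumptions.

```python
from collections import Counter, deque

def smooth_lamp_color(current_color, history, window=5):
    """Update the color recognized for the current frame to the majority
    color over the last (window - 1) past frames plus the current frame."""
    votes = list(history)[-(window - 1):] + [current_color]
    # most_common(1) yields [(color, count)] for the most frequent color
    majority_color, _ = Counter(votes).most_common(1)[0]
    # record the raw recognition so later frames can vote over it too
    history.append(current_color)
    return majority_color

# A single noisy "yellow" among recent "red" frames is corrected:
history = deque(["red", "red", "red", "red"], maxlen=16)
print(smooth_lamp_color("yellow", history))  # prints "red"
```

With this scheme, an isolated misrecognition caused by noise is outvoted by the surrounding frames, matching the error-correction effect described above.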
  • Further, for each position in the traffic light region, the processor may refer to history information indicating, for each of a plurality of images acquired before the recognition target image, the similarity of the lamp part area at that position, and calculate, for each position in the traffic light area, the average of the similarities of the lamp part area at that position over the recognition target image and the plurality of images.
  • Then, the processor may determine that, among the N lamp part areas in the recognition target image, the lamp part area at the position corresponding to the smallest of the average similarities calculated for the respective positions exists as the dissimilar region.
  • Here, the plurality of images and the recognition target image are, for example, a series of images generated by photographing at a constant frame rate. For each position in the traffic light area, the average of the similarities of the lamp part area at that position over the recognition target image and the plurality of images is calculated as the average similarity. For example, the average similarity of the lamp part area at the left end of the traffic light area, that of the lamp part area at the center, and that of the lamp part area at the right end are calculated.
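The per-position averaging can be sketched as follows. A minimal sketch, assuming the similarity history is stored as one `[k1, k2, k3]` row per frame; the names and data layout are illustrative assumptions, not the patent's.

```python
import numpy as np

def dissimilar_position(similarity_history, current_similarities):
    """Average each lamp-area position's similarity over past frames plus
    the current frame, and return the position with the smallest average
    (the presumed dissimilar, i.e. lit, lamp area) and the averages."""
    rows = np.array(similarity_history + [current_similarities], dtype=float)
    avg = rows.mean(axis=0)          # one average similarity per position
    return int(np.argmin(avg)), avg

# Two past frames say the LEFT area is dissimilar; a noisy current frame
# says the RIGHT one is. Averaging still selects the left position (0):
history = [[0.2, 0.9, 0.8], [0.3, 0.8, 0.9]]
pos, avg = dissimilar_position(history, [0.9, 0.9, 0.2])
print(pos)  # prints 0
```

Averaging over several frames suppresses momentary outliers in a single frame's similarities, which is the robustness benefit this modification describes.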
  • Further, in the generation, the processor may divide the traffic light area to generate the N lamp part areas and M lamp part areas in which direction indicating lamp parts are respectively projected, and may recognize, based on the recognized lamp color and the respective feature amounts of the M lamp part areas, the traveling direction indicated by at least one direction indicating lamp part that is lit among the M direction indicating lamp parts.
  • Alternatively, in the generation, the processor may divide the traffic light area to generate the N lamp part areas and M lamp part areas each displaying a direction indicating lamp part, and may recognize the traveling direction indicated by at least one lit direction indicating lamp part among the M direction indicating lamp parts based on the respective feature amounts of the M lamp part areas.
  • Thereby, the traveling direction indicated by the lit direction indicating lamp part can be appropriately recognized regardless of whether the combination of the lamp color and the direction indicating lamp parts that may be lit is determined in advance.
  • Further, for each of the N lamp part areas, the processor may calculate the similarity of the lamp part area between the image of that lamp part area and the image of a group consisting of at least one other lamp part area.
  • FIG. 1 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to an embodiment.
  • the vehicle 10 includes a camera 210, a map storage unit 220, a position detection system 230, a vehicle control unit 240, and a traffic signal recognition device 100.
  • the camera 210 captures, for example, the front of the vehicle 10 and outputs an image generated by the capturing (hereinafter referred to as a captured image) to the traffic signal recognition apparatus 100. Specifically, the camera 210 captures images at a constant frame rate, and outputs an image as a captured image to the traffic signal recognition apparatus 100 each time an image is generated by the capture.
  • the camera 210 is configured as a sensor that acquires a captured image (that is, a recognition target image described later). Note that the shutter speed used for shooting may be fixed or variable.
  • the map storage unit 220 stores at least map information indicating a map around the vehicle 10. Such map information is wirelessly transmitted from a server via a network such as the Internet and stored in the map storage unit 220, for example.
  • the position detection system 230 detects the position of the vehicle 10 and outputs position information indicating the position to the traffic signal recognition device 100.
  • the position detection system 230 is configured as a GPS (Global Positioning System) receiver.
  • The vehicle control unit 240 includes, for example, one or more ECUs (Electronic Control Units). The vehicle control unit 240 acquires the light color information output from the traffic signal recognition device 100, and controls the driving of the vehicle 10 in accordance with the light color indicated by that information.
  • the traffic signal recognition device 100 acquires a captured image from the camera 210 and recognizes the traffic signal displayed in the captured image. For example, the traffic light recognition device 100 recognizes the light color of the traffic light and outputs the light color information indicating the light color to the vehicle control unit 240. Note that the captured image used for the current traffic signal recognition is also referred to as a recognition target image.
  • the traffic light recognition apparatus 100 includes an area extraction unit 101, an area division unit 102, a similarity calculation unit 103, and a light color recognition unit 104.
  • the area extraction unit 101 acquires a captured image from the camera 210 and extracts an area where the traffic signal is displayed as a traffic signal area from the captured image.
  • the traffic light has N lamp units (N is an integer of 3 or more). For example, N is three, and the traffic light has a blue lamp unit, a yellow lamp unit, and a red lamp unit.
  • the captured image is used for recognition of the current lighting color, the captured image is treated as a recognition target image.
  • the region extraction unit 101 uses, for example, the map information stored in the map storage unit 220 and the position of the vehicle 10 detected by the position detection system 230 when extracting the traffic signal region from the captured image.
  • the map information indicates the position and form of each traffic signal in the three-dimensional space. That is, the map information not only shows the position of the road and the building, but also shows the position and form of the traffic signal arranged on the road.
  • the position of the traffic light is indicated by a three-dimensional coordinate system. That is, the position of the light box of the traffic light is indicated by longitude, latitude, and height. The height may be a height from a road or an altitude.
  • the form of the traffic signal includes the shape and size of the traffic signal when the traffic signal is viewed from the front.
  • For example, the region extraction unit 101 identifies the position of the vehicle 10 on the map by mapping the position indicated by the position information output from the position detection system 230 onto the map indicated by the map information. Furthermore, the area extraction unit 101 geometrically detects the area in which the traffic light is displayed in the captured image, based on the positions of the vehicle 10 and the traffic light on the map, the form of the traffic light, and the mounting position and shooting direction of the camera 210 on the vehicle 10. Then, the area extraction unit 101 extracts the detected area from the captured image as the traffic light area.
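The geometric detection above can be illustrated with a simplified pinhole projection of the light box's 3D position (from the map) into image coordinates. This is a sketch under stated assumptions: the patent only says the detection is geometric, and this ignores lens distortion, vehicle attitude, and pose error; all names and numbers are illustrative.

```python
def project_to_image(point_cam, fx, fy, cx, cy):
    """Project a 3D point in camera coordinates (x right, y down,
    z forward, metres) to pixel coordinates with a pinhole model."""
    x, y, z = point_cam
    u = fx * x / z + cx  # horizontal pixel coordinate
    v = fy * y / z + cy  # vertical pixel coordinate
    return u, v

# A light box 40 m ahead, 2 m to the left, 3 m above the camera
# (e.g. 4.5 m above the road, camera mounted at 1.5 m):
u, v = project_to_image((-2.0, -3.0, 40.0), fx=1200, fy=1200, cx=640, cy=360)
print(u, v)  # prints 580.0 270.0
```

The traffic light's known physical size can then be projected the same way to obtain the extent of the traffic light area around (u, v).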
  • the area dividing unit 102 divides the traffic signal area to generate N lamp part areas each displaying the lamp part.
  • the area dividing unit 102 may equally divide the signal area into N parts when dividing the signal area into N lamp part areas.
  • the area dividing unit 102 divides the traffic signal area according to the number and arrangement of the lamp units.
  • In the present embodiment, the area dividing unit 102 divides the traffic light area in order to generate the N lamp part areas, but the N lamp part areas may also be generated without dividing the traffic light area.
  • the area dividing unit 102 extracts, for each lamp unit, an area in which the lamp unit is projected from the entire signal area. In this extraction, at least two lamp section areas out of the N lamp section areas may include the same image such as a background. In addition, an image such as a background that is not included in any of the N lamp sections may be in the traffic signal area.
  • That is, as long as N lamp part areas in which different lamp parts are projected are generated from the traffic light area, any kind of image processing may be performed on the traffic light area.
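The equal N-division mentioned above can be sketched in a few lines. Equal horizontal thirds are only one option the text allows; a real layout would use the lamp count and arrangement from the map information, and the function name is an assumption.

```python
import numpy as np

def split_lamp_areas(signal_area, n=3, horizontal=True):
    """Divide the extracted traffic-light area equally into n lamp-part
    areas along the lamp axis (columns for a horizontal light)."""
    axis = 1 if horizontal else 0
    return np.array_split(signal_area, n, axis=axis)

area = np.zeros((40, 120, 3), dtype=np.uint8)   # stand-in traffic-light crop
pb1, pb2, pb3 = split_lamp_areas(area)          # left / centre / right areas
print(pb1.shape)  # prints (40, 40, 3)
```

`np.array_split` tolerates widths that are not an exact multiple of n, which matters because the extracted traffic light area's pixel width is arbitrary.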
  • the similarity calculation unit 103 acquires the N lamp unit areas described above from the region dividing unit 102, and calculates the similarity of each of the N lamp unit regions.
  • the lamp color recognition unit 104 acquires the similarity calculated for each of the N lamp unit regions from the similarity calculation unit 103. Then, the lamp color recognition unit 104 recognizes the color associated with the position of the dissimilar area specified based on the similarity as the lamp color in the captured image.
  • The dissimilar area is a lamp part area that is not similar to any other lamp part area.
  • Specifically, by referring to association information described later, the lamp color recognition unit 104 recognizes the color associated in that information with the position of the dissimilar region as the lamp color in the captured image.
  • the association information may be included in the map information.
  • the map information may indicate the colors associated with the positions of the lamp units in addition to the number (N) of the lamp units included in the traffic light and the arrangement of the N lamp units.
  • the lamp color recognition unit 104 outputs information indicating the recognized lamp color to the vehicle control unit 240 as lamp color information.
  • FIG. 2 is a diagram for explaining the processing of the region extracting unit 101, the region dividing unit 102, and the similarity calculating unit 103.
  • the region extraction unit 101 acquires a captured image Pa as a recognition target image from the camera 210 (step S1). Then, the area extraction unit 101 extracts the traffic light area Pb from the captured image Pa (step S2).
  • the three lamp parts are composed of a blue lamp part, a yellow lamp part, and a red lamp part.
  • the area dividing unit 102 divides the traffic signal area Pb into three lamp part areas Pb1, Pb2, and Pb3 (step S3).
  • the lamp area Pb1 is an area located at the left end of the traffic signal area Pb, and is also referred to as a first lamp section area Pb1.
  • the lamp part area Pb2 is an area located at the center of the traffic light area Pb, and is also referred to as a second lamp part area Pb2.
  • the lamp part area Pb3 is an area located at the right end of the traffic signal area Pb, and is also referred to as a third lamp part area Pb3.
  • Next, the similarity calculation unit 103 calculates the respective similarities k1, k2, and k3 of the three lamp part areas Pb1, Pb2, and Pb3 (steps S4 to S6).
  • When calculating the similarity k1 of the first lamp part area Pb1 (step S4), the similarity calculation unit 103 first calculates a correlation coefficient k12 between the first lamp part area Pb1 and the second lamp part area Pb2. That is, the similarity calculation unit 103 compares the feature quantity of the image of the first lamp part area Pb1 with the feature quantity of the image of the second lamp part area Pb2, and calculates the correlation between these feature quantities as the correlation coefficient k12. The correlation coefficient indicates a larger value as the feature amounts of the images in the two regions are more similar.
  • For example, the similarity calculation unit 103 divides the first lamp part area Pb1 into blocks each composed of n × n pixels, and, for each block of the first lamp part area Pb1, searches the second lamp part area Pb2 for the block most similar to the image of that block. Specifically, the similarity calculation unit 103 represents the feature amount of each block in the first lamp part area Pb1 as a vector. Then, the similarity calculation unit 103 finds, in the second lamp part area Pb2, the block whose feature-amount vector has the shortest distance from that vector. For example, the vector representing the feature amount of a block consists of the array of pixel values (specifically, luminance values) of the n × n pixels constituting the block.
  • Then, the similarity calculation unit 103 calculates the correlation coefficient k12 from the shortest inter-vector distances obtained for the respective blocks of the first lamp part area Pb1.
  • For example, the correlation coefficient k12 is the reciprocal of the average of the shortest inter-vector distances obtained for the respective blocks of the first lamp part area Pb1.
  • the calculation method of the correlation coefficient is not limited to this, and any method may be used.
  • the shortest inter-vector distance is calculated for each block of the first lamp area Pb1, but the shortest inter-vector distance may be calculated in finer units.
  • the shortest inter-vector distance for the image in the processing target window may be calculated while shifting the processing target window one pixel at a time in the first lamp area Pb1.
  • The processing target window has a size that surrounds n × n pixels.
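The block-matching correlation coefficient described above can be sketched as follows: each n × n block of one area is exhaustively matched against every n × n window of the other, and the coefficient is the reciprocal of the mean shortest distance. A minimal sketch, assuming equally sized grayscale (luminance) arrays; a real implementation would restrict or accelerate the search.

```python
import numpy as np

def correlation_coefficient(region_a, region_b, n=4):
    """Reciprocal of the mean shortest inter-vector distance between
    n x n blocks of region_a and n x n windows of region_b.
    Assumes region_a and region_b are 2-D arrays of the same shape."""
    h, w = region_a.shape
    a, b = region_a.astype(float), region_b.astype(float)
    shortest = []
    for by in range(0, h - n + 1, n):          # one block at a time from a
        for bx in range(0, w - n + 1, n):
            block = a[by:by + n, bx:bx + n].ravel()
            best = min(                         # exhaustive search over b
                float(np.linalg.norm(block - b[y:y + n, x:x + n].ravel()))
                for y in range(h - n + 1)
                for x in range(w - n + 1)
            )
            shortest.append(best)
    mean_dist = float(np.mean(shortest))
    # identical regions give distance 0; treat as maximal correlation
    return float("inf") if mean_dist == 0.0 else 1.0 / mean_dist
```

As the text notes, any correlation definition with the same ordering (larger means more similar) would serve; the reciprocal form here just mirrors the example given.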
  • Similarly, the similarity calculation unit 103 calculates a correlation coefficient k13 between the first lamp part area Pb1 and the third lamp part area Pb3. Then, the similarity calculation unit 103 selects the larger of the correlation coefficients k12 and k13 as the similarity k1 of the first lamp part area Pb1.
  • When calculating the similarity k2 of the second lamp part area Pb2 (step S5), the similarity calculation unit 103 first calculates a correlation coefficient k21 between the second lamp part area Pb2 and the first lamp part area Pb1. Further, the similarity calculation unit 103 calculates a correlation coefficient k23 between the second lamp part area Pb2 and the third lamp part area Pb3. Then, the similarity calculation unit 103 selects the larger of the correlation coefficients k21 and k23 as the similarity k2 of the second lamp part area Pb2.
  • Likewise, when calculating the similarity k3 of the third lamp part area Pb3 (step S6), the similarity calculation unit 103 first calculates a correlation coefficient k31 between the third lamp part area Pb3 and the first lamp part area Pb1. Further, the similarity calculation unit 103 calculates a correlation coefficient k32 between the third lamp part area Pb3 and the second lamp part area Pb2. Then, the similarity calculation unit 103 selects the larger of the correlation coefficients k31 and k32 as the similarity k3 of the third lamp part area Pb3.
  • In this way, for each lamp part area in the recognition target image, the similarity calculation unit 103 calculates a correlation coefficient between that lamp part area and each of the at least two other lamp part areas, and selects the largest of those correlation coefficients as the similarity of the lamp part area.
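The selection rule above (similarity = largest pairwise correlation) can be written generically. This sketch takes any pairwise correlation function where larger means more similar; the toy correlation used in the example, based only on mean brightness, is an illustrative assumption.

```python
def lamp_similarities(regions, corr):
    """For each lamp-part area, correlate it with every OTHER area and
    keep the largest value as that area's similarity (k1..kN)."""
    similarities = []
    for i, a in enumerate(regions):
        similarities.append(max(corr(a, b) for j, b in enumerate(regions) if j != i))
    return similarities

# Toy correlation: areas with close mean brightness correlate strongly.
toy_corr = lambda a, b: -abs(sum(a) / len(a) - sum(b) / len(b))
# A bright (lit) left area and two dark (unlit) areas:
k1, k2, k3 = lamp_similarities([[200, 210], [20, 25], [22, 27]], toy_corr)
```

Here the two unlit areas resemble each other, so k2 and k3 are large, while the lit area resembles neither, so k1 is the smallest, which is exactly the property the dissimilar-region test relies on.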
  • Here, for the lamp part area in which the lit lamp part is projected, a small value is calculated as the similarity, whereas for each of the lamp part areas in which the other two unlit lamp parts are projected, a large value is calculated as the similarity. Therefore, by using the similarities of these three lamp part areas, the lamp part area that is not similar to any other lamp part area can easily be found among the three lamp part areas.
  • FIG. 3 is a diagram for explaining processing of the light color recognition unit 104.
  • The lamp color recognition unit 104 acquires the similarities k1 to k3 calculated for the first lamp part area Pb1, the second lamp part area Pb2, and the third lamp part area Pb3. Then, based on the similarities k1 to k3, the lamp color recognition unit 104 recognizes the color of the lit lamp part among the three lamp parts of the traffic light as the lamp color in the captured image Pa.
  • Specifically, based on the similarities k1 to k3 calculated for the three lamp part areas Pb1 to Pb3, the lamp color recognition unit 104 determines whether or not there exists a dissimilar region, that is, a lamp part area that is not similar to any of the other lamp part areas among the three lamp part areas Pb1 to Pb3.
  • In the example shown in FIG. 3, the similarity k1 is the smallest of the similarities k1 to k3, so the first lamp part area Pb1 having the similarity k1 is determined to be a dissimilar region. That is, the lamp color recognition unit 104 determines that the first lamp part area Pb1, which has the smallest similarity k1 among the similarities k1 to k3 of the three lamp part areas Pb1 to Pb3, exists as a dissimilar region. As a result, a dissimilar region can be determined to exist in any captured image Pa, and the light color in the captured image Pa can be recognized.
  • Alternatively, when the smallest similarity k1 among the similarities k1 to k3 of the three lamp part areas Pb1 to Pb3 is equal to or less than the threshold Th, the lamp color recognition unit 104 may determine that the first lamp part area Pb1 having the smallest similarity k1 exists as a dissimilar region.
  • Thereby, it is determined that the dissimilar region exists when the smallest similarity is equal to or less than the threshold, and that it does not exist when the smallest similarity is larger than the threshold. Therefore, the lamp color is recognized only when there is a significant difference among the similarities of the N lamp part areas, and as a result, recognition of an inappropriate lamp color can be suppressed.
  • the light color recognition unit 104 recognizes the light color based on the position of the dissimilar region in the traffic signal region Pb. For example, at this time, the light color recognition unit 104 refers to the association information. That is, the lamp color recognition unit 104 refers to the association information indicating the colors associated with the respective positions of the three lamp unit areas Pb1 to Pb3 in the traffic signal area Pb. Then, the lamp color recognition unit 104 recognizes the color associated with the position of the dissimilar region as the lamp color in the captured image Pa in the association information. For example, as shown in FIG. 3, the association information indicates the blue color associated with the position of the first lamp section area Pb1 in the traffic light area Pb, that is, the left end.
  • The association information also indicates yellow associated with the position of the second lamp part area Pb2, that is, the center, and red associated with the position of the third lamp part area Pb3, that is, the right end.
  • the lamp color recognition unit 104 recognizes, for example, the blue color associated with the position of the first lamp unit region Pb1 determined as the dissimilar region as the lamp color in the captured image Pa. .
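The association-information lookup described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the dictionary layout and function name are assumptions, and "blue" follows the document's own naming for the go signal.

```python
# Hypothetical sketch of the association information of FIG. 3: each position
# of a lamp part region within the traffic light area Pb is mapped to a color.
ASSOCIATION_INFO = {"left": "blue", "center": "yellow", "right": "red"}

def recognize_lamp_color(dissimilar_position):
    """Return the color that the association information associates with the
    position of the dissimilar region (the recognized lamp color)."""
    return ASSOCIATION_INFO[dissimilar_position]
```

For instance, when the first lamp part region Pb1 at the left end is determined to be the dissimilar region, looking up `"left"` yields blue.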
  • FIG. 4 is a flowchart showing the processing operation of the traffic signal recognition apparatus 100 in the present embodiment.
• First, the region extraction unit 101 acquires the captured image Pa (step S101), and extracts, from the captured image Pa, the traffic light area Pb in which a traffic light having N lamp parts is displayed (step S102).
  • the area dividing unit 102 divides the traffic signal area Pb to generate three lamp part areas Pb1 to Pb3 in which the lamp parts are respectively projected (step S103).
  • the similarity calculation unit 103 calculates the respective similarities k1 to k3 of the three lamp unit regions Pb1 to Pb3 (step S104).
• Next, based on the similarities k1 to k3 calculated for the three lamp part regions Pb1 to Pb3, the lamp color recognition unit 104 recognizes the color of the lit lamp part among the three lamp parts of the traffic light as the lamp color in the captured image Pa.
• Specifically, based on the similarities k1 to k3, the lamp color recognition unit 104 determines whether or not there exists a dissimilar region, that is, a lamp part region that is not similar to any of the other lamp part regions among the three lamp part regions Pb1 to Pb3 (step S105). If the lamp color recognition unit 104 determines that a dissimilar region exists (Yes in step S105), it recognizes the lamp color based on the position of the dissimilar region in the traffic light area Pb (step S106).
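The determinations of steps S105 and S106 can be sketched as follows. This is a hedged illustration only: the threshold value, the function name, and the left-to-right color order are assumptions (the color order follows the association information of FIG. 3), not values fixed by the patent.

```python
TH = 50  # threshold Th (illustrative value; the patent does not specify one)
COLORS = ["blue", "yellow", "red"]  # colors for the left, center, right positions

def recognize(similarities):
    """similarities: [k1, k2, k3] for the left, center, and right lamp part regions.
    Returns the recognized lamp color, or None when no dissimilar region exists."""
    k_min = min(similarities)
    if k_min > TH:
        # No significant difference between the regions -> no dissimilar region
        return None
    index = similarities.index(k_min)  # position of the dissimilar region
    return COLORS[index]
```

With the example similarities 21/112/105 given later for the nth frame, the left region is the dissimilar region and blue is recognized; with 107/110/114 (an unlit traffic light) no dissimilar region is found.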
  • the region extraction unit 101 acquires a captured image Pa as a recognition target image. Then, the area extraction unit 101 extracts an area where a traffic light having N lamp parts is displayed as a traffic light area from the recognition target image.
  • the area dividing unit 102 divides the traffic signal area to generate N lamp part areas each displaying the lamp part.
  • the similarity calculation unit 103 calculates the similarity of each of the N lamp unit regions.
• Based on the similarity calculated for each of the N lamp part regions, the lamp color recognition unit 104 determines whether or not there exists a dissimilar region, that is, a lamp part region that is not similar to any of the other lamp part regions among the N lamp part regions.
• When the lamp color recognition unit 104 determines that a dissimilar region exists, it recognizes, based on the position of the dissimilar region in the traffic light area, the color of the lit lamp part among the N lamp parts as the lamp color in the recognition target image.
  • the lamp color recognition unit 104 refers to the association information indicating the colors associated with the positions of the N lamp unit areas in the traffic signal area Pb.
• In the association information, the lamp color recognition unit 104 recognizes the color associated with the position of the dissimilar region as the lamp color in the recognition target image.
  • the association information indicating the color associated with each position of the N lamp section regions may be included in the map information as described above, or may not be included in the map information.
  • the association information may be stored in advance in a memory, or may be stored in a cloud server that can communicate with the traffic signal recognition apparatus 100.
  • the light color recognition unit 104 may read the association information from the memory, or may dynamically read the association information from the cloud server. Further, the light color recognition unit 104 may inquire the cloud server about the color associated with the position of the dissimilar region and specify the color.
• For example, when only one lamp part of the traffic light is lit, the lamp part region in which that lit lamp part is displayed is determined to be the dissimilar region. If the dissimilar region is at the left end of the traffic light area, blue is recognized as the lamp color. As described above, since the lamp color is recognized based on the similarities between the N lamp part regions, that is, based on the differences in the feature amounts of the N lamp part regions, the lamp color can be appropriately recognized.
• For example, when color saturation occurs, even if the blue lamp part of the traffic light is lit, the lamp part may appear not blue but a color close to white or yellow in the lamp part region of the recognition target image corresponding to that lamp part.
• Even in such a case, since the lamp color is recognized based on the differences in the feature amounts of the N lamp part regions, the lamp color can be appropriately recognized for the recognition target image.
• In other words, because color saturation in the recognition target image can be tolerated, it is no longer necessary to adjust the camera 210 to a predetermined shutter speed so that color saturation does not occur when the traffic light is captured by the camera 210. As a result, the shutter speed of the camera 210 can be fixed. In addition, the lamp color can be appropriately recognized even for a recognition target image generated by shooting at night or in rainy weather, where color saturation is likely to occur.
  • the camera 210 that is not highly accurate can be used for shooting the traffic light.
• That is, the traffic light recognition apparatus 100 can recognize the lamp color even from a captured image generated by a camera 210 that is not highly accurate. Therefore, the lamp color can be appropriately recognized even for a captured image generated by a camera 210 that is attached to the vehicle and that, while not highly accurate, can withstand severe environmental conditions.
  • the shape of each of the N lamp units included in the traffic light may be circular or quadrangular, or any shape.
  • the color of the lamp portion of the traffic light is not limited to blue, yellow and red, and any color can be appropriately recognized.
  • the memory capacity for holding the template can be reduced.
• (Modification 1) Due to flicker of the traffic light, there may be a case where an unlit traffic light is displayed in the captured image Pa.
• Specifically, the lamp parts of the traffic light are lit periodically; for example, a lamp part blinks at a frequency of 100 Hz or 120 Hz. Therefore, at the timing when the captured image Pa is generated by the camera 210, there are cases where none of the lamp parts of the traffic light is lit. In a captured image Pa generated at such a timing, every lamp part of the traffic light appears dark.
• Therefore, when an unlit traffic light is displayed in the captured image Pa that is the recognition target image, the traffic light recognition device according to this modification recognizes the lamp color recognized for a past captured image as the lamp color in the recognition target image.
  • FIG. 5 is a block diagram showing the configuration of the traffic signal recognition apparatus according to this modification.
  • the traffic signal recognizing device 100a includes the components of the traffic signal recognizing device 100 in the above embodiment, and also includes a history holding unit 105 and a flicker processing unit 106. That is, the traffic signal recognition apparatus 100a includes an area extraction unit 101, an area division unit 102, a similarity calculation unit 103, a light color recognition unit 104, a history holding unit 105, and a flicker processing unit 106.
• When the lamp color recognition unit 104 determines that a dissimilar region exists in the traffic light area Pb, it outputs the lamp color information to the history holding unit 105 and the flicker processing unit 106. That is, the lamp color information is output when the traffic light displayed in the captured image Pa that is the recognition target image is not unlit. On the other hand, when determining that no dissimilar region exists in the traffic light area Pb, the lamp color recognition unit 104 outputs, to the flicker processing unit 106, unlit information indicating that the traffic light is unlit. That is, the unlit information is output when the traffic light shown in the captured image Pa is unlit.
  • the history holding unit 105 is a recording medium for holding the lamp color information output from the lamp color recognition unit 104 as history information.
  • the history holding unit 105 includes a hard disk or a memory.
  • the memory may be nonvolatile or volatile.
  • the memory may be ROM (Read Only Memory) or RAM (Random Access Memory).
• When acquiring the lamp color information from the lamp color recognition unit 104, the history holding unit 105 holds the lamp color information as history information.
• When storing the lamp color information in the history holding unit 105, the lamp color recognition unit 104 may delete old lamp color information that has already been stored. In other words, the lamp color recognition unit 104 may keep only the latest lamp color information in the history holding unit 105.
• When the flicker processing unit 106 acquires the lamp color information from the lamp color recognition unit 104, it outputs the lamp color information to the vehicle control unit 240. On the other hand, when the flicker processing unit 106 acquires the unlit information from the lamp color recognition unit 104, it reads the history information held in the history holding unit 105 and outputs that history information as the lamp color information for the captured image Pa that is the recognition target image.
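The flicker handling just described can be sketched as follows. This is a minimal sketch under stated assumptions: the class and method names are hypothetical, and the unlit case is modeled by passing `None`, which is an illustrative convention rather than the patent's data format.

```python
# Hedged sketch of Modification 1: when the traffic light appears unlit
# (no dissimilar region was found), fall back to the lamp color that was
# recognized for the previous captured image.
class FlickerProcessor:
    def __init__(self):
        self.history = None  # latest lamp color information held as history

    def process(self, lamp_color):
        """lamp_color is None when the captured image shows an unlit traffic light."""
        if lamp_color is None:
            return self.history       # reuse the color from the past frame
        self.history = lamp_color     # keep the latest recognition as history
        return lamp_color
```

For example, if blue was recognized for the previous frame and the current frame catches the traffic light mid-blink, blue is still output for the current frame.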
  • FIG. 6 is a flowchart showing the processing operation of the traffic signal recognition apparatus 100a according to this modification.
• First, the traffic light recognition apparatus 100a executes the processes of steps S101 to S106, similarly to the processing operation of the flowchart shown in FIG. 4.
• When the lamp color recognition unit 104 of the traffic light recognition apparatus 100a recognizes the lamp color in step S106, it stores the lamp color information indicating that lamp color in the history holding unit 105 as history information (step S107). This history information is used when it is determined, in the recognition of the lamp color for a future captured image, that no dissimilar region exists.
• When the flicker processing unit 106 acquires the lamp color information from the lamp color recognition unit 104, it outputs the lamp color information to the outside of the traffic light recognition apparatus 100a (step S108). For example, the flicker processing unit 106 outputs the lamp color information to the vehicle control unit 240.
• If it is determined in step S105 that no dissimilar region exists (No in step S105), the lamp color recognition unit 104 outputs unlit information to the flicker processing unit 106. For example, when an unlit traffic light is displayed in the captured image Pa that is the recognition target image, the smallest similarity is greater than the threshold Th, and it is therefore determined in step S105 that no dissimilar region exists. In such a case, the lamp color recognition unit 104 outputs the unlit information to the flicker processing unit 106. When the flicker processing unit 106 acquires the unlit information, it reads the history information from the history holding unit 105 (step S109), and outputs the read history information as the lamp color information for the recognition target image (step S110).
• As described above, in this modification, when the lamp color recognition unit 104 determines that no dissimilar region exists, the flicker processing unit 106 refers to the history information indicating the lamp color recognized for a captured image acquired in the past, and recognizes the lamp color indicated by that history information as the lamp color in the recognition target image.
• That is, the lamp color recognized for a past captured image is recognized as the lamp color in the recognition target image. Accordingly, it is possible to appropriately prevent the lamp color from becoming unrecognizable due to flicker.
• The past captured image is, for example, the captured image (that is, the frame) immediately preceding the recognition target image among the images generated by the camera 210 shooting at a constant frame rate. A lamp part that was turned off by flicker at the timing of capturing the recognition target image is therefore highly likely to have been lit at the timing of capturing the immediately preceding captured image. Accordingly, by referring to the history information as described above, the lamp color can be appropriately recognized even when flicker occurs.
• (Modification 2) The cause that makes it difficult to recognize the lamp color in the recognition target image is not limited to the flicker of the traffic light.
  • a captured image in which it is difficult to recognize the light color may be generated due to noise in the camera 210 or a shooting environment.
  • the traffic light recognition apparatus recognizes the light color in the recognition target image using the processing results for a plurality of past captured images.
  • FIG. 7 is a block diagram showing the configuration of the traffic signal recognition apparatus according to this modification.
• The traffic light recognition apparatus 100b includes the components of the traffic light recognition apparatus 100 in the above embodiment, and further includes a history holding unit 105 and a time series processing unit 107. That is, the traffic light recognition apparatus 100b includes an area extraction unit 101, an area division unit 102, a similarity calculation unit 103, a lamp color recognition unit 104, a history holding unit 105, and a time series processing unit 107.
  • the lamp color recognition unit 104 determines that there is a dissimilar area in the traffic signal area Pb, the lamp color information is output to the history holding unit 105 as in the first modification. On the other hand, if it is determined that there is no dissimilar area in the traffic signal area Pb, the lamp color recognition unit 104 recognizes the lamp color as black, for example, and outputs lamp color information indicating the black color to the history holding unit 105.
  • the history holding unit 105 when acquiring the lamp color information from the lamp color recognition unit 104, holds the lamp color information as a part of the history information.
• For example, the history holding unit 105 has a storage capacity for holding L pieces of lamp color information (L is an integer of 3 or more). Therefore, when storing new lamp color information in the history holding unit 105 while L pieces of lamp color information are already stored, the lamp color recognition unit 104 deletes the oldest lamp color information among them. Thereby, free capacity is secured in the history holding unit 105. The lamp color recognition unit 104 then stores the latest lamp color information in the history holding unit 105 in which the free capacity has been secured. As a result, the history holding unit 105 always holds the latest L pieces of lamp color information as history information.
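The history holding described above behaves like a fixed-size ring buffer: once L entries are stored, the oldest is discarded before the newest is added. A minimal sketch using `collections.deque` follows; the deque is an implementation choice of this illustration, not something the patent specifies.

```python
from collections import deque

L = 4  # illustrative capacity; the patent only requires L to be 3 or more
history = deque(maxlen=L)  # a full deque drops its oldest entry automatically

# Storing five recognized lamp colors in order; only the latest four remain.
for color in ["blue", "blue", "black", "blue", "blue"]:
    history.append(color)
```

After the fifth insertion the oldest entry (the first "blue") has been discarded, mirroring how the lamp color recognition unit frees capacity before storing the latest lamp color information.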
  • the time series processing unit 107 reads out the history information stored in the history holding unit 105 every time the lighting color information is stored in the history holding unit 105.
  • the history information includes the latest L pieces of lamp color information.
  • the time series processing unit 107 updates the already recognized lamp color for the recognition target image based on the lamp color indicated by each of the L lamp color information.
• In other words, in this modification, the lamp color recognition unit 104 provisionally determines the lamp color for the recognition target image, and the time series processing unit 107 finally determines the lamp color for the recognition target image using the lamp colors recognized for the past (L-1) captured images.
  • FIG. 8 is a diagram for explaining the recognition of the light color by the traffic signal recognition apparatus 100b according to the present modification.
  • the history holding unit 105 holds, as history information, lamp color information indicating the lamp color recognized for each of the past four captured images.
  • These four captured images are composed of the (n-4) th frame, the (n-3) th frame, the (n-2) th frame, and the (n-1) th frame.
• For example, the lamp color information of the (n-1)th frame included in the history information indicates black as the lamp color, and the other three pieces of lamp color information indicate blue as the lamp color.
  • Each frame is a captured image generated by shooting at a constant frame rate by the camera 210.
  • the lamp color recognition unit 104 recognizes the lamp color with respect to the nth frame which is the latest captured image and is the recognition target image. For example, the lamp color recognition unit 104 recognizes a blue lamp color.
  • the lamp color recognition unit 104 stores lamp color information indicating the lamp color recognized for the nth frame in the history holding unit 105.
• At this time, the lamp color recognition unit 104 first deletes the lamp color information of the (n-4)th frame from the history information stored in the history holding unit 105.
  • the lamp color recognition unit 104 stores lamp color information indicating the lamp color recognized for the nth frame in the history holding unit 105 as new lamp color information.
  • the history information of the history holding unit 105 includes the light color information of the nth frame instead of the light color information of the (n-4) th frame.
• Next, the time series processing unit 107 reads out the history information stored in the history holding unit 105. Then, based on the lamp colors indicated by the latest four pieces of lamp color information included in the history information, the time series processing unit 107 updates the lamp color (for example, blue) already recognized for the nth frame. That is, the time series processing unit 107 finally determines the lamp color for the nth frame using the lamp colors obtained by the provisional determination for each of the (n-3)th frame, the (n-2)th frame, the (n-1)th frame, and the nth frame.
• Specifically, the time series processing unit 107 takes a majority vote of the lamp colors indicated by the four pieces of lamp color information, and updates the lamp color recognized for the nth frame to the lamp color determined by the majority vote. For example, the time series processing unit 107 takes a majority vote of the lamp colors recognized for the (n-3)th, (n-2)th, (n-1)th, and nth frames, that is, blue, blue, black, and blue. The time series processing unit 107 identifies the lamp color determined by that majority vote, that is, blue, as the majority lamp color, and updates the lamp color recognized for the nth frame to this majority lamp color.
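The majority vote above can be sketched in a few lines, using the example values from the text (blue, blue, black, blue). `Counter.most_common` is an implementation choice of this illustration; the patent does not prescribe how the vote is taken.

```python
from collections import Counter

def majority_lamp_color(lamp_colors):
    """Return the most frequent lamp color among the recent frames
    (the majority lamp color of the time series processing)."""
    return Counter(lamp_colors).most_common(1)[0][0]
```

With the colors recognized for the (n-3)th through nth frames, the vote yields blue, so the black provisional result caused by a momentarily unlit frame is overridden.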
  • FIG. 9 is a flowchart showing a processing operation by the traffic signal recognition apparatus 100b according to the present modification.
• First, the traffic light recognition apparatus 100b executes steps S101 to S106 in the same manner as the processing operation of the flowchart shown in FIG. 4.
• When the lamp color recognition unit 104 of the traffic light recognition apparatus 100b recognizes the lamp color in step S106, it stores the lamp color information indicating that lamp color in the history holding unit 105 as part of the history information (step S122). If the lamp color recognition unit 104 determines in step S105 that no dissimilar region exists (No in step S105), it recognizes the lamp color for the recognition target image as, for example, black (step S121). Then, the lamp color recognition unit 104 stores the lamp color information indicating black in the history holding unit 105 as part of the history information (step S122).
• Next, the time series processing unit 107 reads the history information stored in the history holding unit 105 (step S123). Then, the time series processing unit 107 updates the lamp color in the recognition target image recognized in step S106 or S121 by a majority vote of the four lamp colors indicated by the history information (step S124).
  • the history holding unit 105 holds history information including four pieces of light color information.
• However, the number of pieces of lamp color information included in the history information is not limited to four, and may be three, or five or more.
  • the time series processing unit 107 reads the history information from the history holding unit 105 to acquire the light color information of the recognition target image included in the history information.
  • the time series processing unit 107 may directly acquire the light color information of the recognition target image from the light color recognition unit 104.
  • the history information read from the history holding unit 105 by the time-series processing unit 107 does not include the light color information of the recognition target image, but includes only the light color information of the past captured image.
• As described above, in this modification, the time series processing unit 107 refers to history information indicating the lamp color recognized for each of a plurality of captured images acquired before the recognition target image. Then, the time series processing unit 107 identifies, as the majority lamp color, the most frequent lamp color among the lamp color recognized for the recognition target image and the lamp colors in the plurality of captured images indicated by the history information. The time series processing unit 107 updates the lamp color recognized for the recognition target image to this majority lamp color.
• (Modification 3) As in the second modification, the traffic light recognition apparatus according to this modification recognizes the lamp color in the recognition target image using the processing results for a plurality of past captured images.
  • the traffic light recognition apparatus according to the present modification uses not the lamp color recognition result but the similarity calculation result as the processing result for a plurality of past captured images.
  • FIG. 10 is a block diagram showing a configuration of a traffic signal recognition apparatus according to this modification.
• The traffic light recognition apparatus 100c includes a lamp color recognition unit 104c instead of the lamp color recognition unit 104 among the components of the traffic light recognition apparatus 100 in the above embodiment. Further, the traffic light recognition apparatus 100c includes a history holding unit 105. That is, the traffic light recognition apparatus 100c includes an area extraction unit 101, an area division unit 102, a similarity calculation unit 103, a lamp color recognition unit 104c, and a history holding unit 105.
  • the history holding unit 105 holds the similarity information output from the similarity calculation unit 103 as part of the history information.
• For example, the history holding unit 105 has a storage capacity for holding L pieces of similarity information (L is an integer of 2 or more). Therefore, when storing new similarity information in the history holding unit 105 while L pieces of similarity information are already stored, the similarity calculation unit 103 deletes the oldest similarity information among them. Thereby, free capacity is secured in the history holding unit 105. The similarity calculation unit 103 then stores the latest similarity information in the history holding unit 105 in which the free capacity has been secured. As a result, the history holding unit 105 always holds the latest L pieces of similarity information as history information.
  • the lighting color recognition unit 104c reads the history information stored in the history holding unit 105 every time the similarity information is stored in the history holding unit 105.
  • the history information includes L pieces of recent similarity information.
• For example, the lamp color recognition unit 104c calculates, for each of the left end, the center, and the right end in the traffic light area Pb, the average of the similarities of the lamp part region at that position as the average similarity of that lamp part region. Then, when the smallest of the three calculated average similarities is, for example, the average similarity of the lamp part region at the left end, the lamp color recognition unit 104c identifies the lamp part region at the left end in the traffic light area Pb of the recognition target image as the dissimilar region.
  • FIG. 11 is a diagram for explaining the recognition of the light color by the traffic signal recognition device 100c according to this modification.
  • the history holding unit 105 holds similarity information indicating similarity calculated for each of the past four captured images as history information.
  • These four captured images are composed of the (n-4) th frame, the (n-3) th frame, the (n-2) th frame, and the (n-1) th frame. Note that these frames are captured images generated by capturing at a constant frame rate by the camera 210.
• First, the similarity calculation unit 103 calculates the similarity of each of the three lamp part regions for the nth frame, which is the latest captured image and is the recognition target image. Then, the similarity calculation unit 103 stores the similarity information indicating the similarities calculated for the nth frame in the history holding unit 105. At this time, the similarity calculation unit 103 first deletes the similarity information of the (n-4)th frame from the history information stored in the history holding unit 105. Next, the similarity calculation unit 103 stores the similarity information indicating the similarities calculated for the nth frame in the history holding unit 105 as new similarity information. As a result, the history information of the history holding unit 105 includes the similarity information of the nth frame instead of the similarity information of the (n-4)th frame.
  • the light color recognition unit 104 c reads the history information stored in the history holding unit 105.
• The history information includes the similarity information of the (n-3)th frame, the similarity information of the (n-2)th frame, the similarity information of the (n-1)th frame, and the similarity information of the nth frame.
  • the similarity information of the (n-3) th frame indicates 30/101/99 as the similarity of the lamp area at each of the left end, the center, and the right end in the traffic light area Pb.
• Further, the similarity information of the (n-2)th frame indicates 70/111/105 as the similarities of the lamp part regions at the left end, the center, and the right end in the traffic light area Pb.
• The similarity information of the (n-1)th frame indicates 107/110/114 as the similarities of the lamp part regions at the left end, the center, and the right end in the traffic light area Pb.
  • the similarity information of the nth frame indicates 21/112/105 as the similarity of the lamp area at each of the left end, the center, and the right end in the traffic light area Pb.
  • the lamp color recognition unit 104c calculates the average of the similarity of the lamp area at the left end in the traffic signal area Pb indicated by the similarity information of each of the four frames. That is, the lamp color recognition unit 104c calculates the average similarity of the leftmost lamp unit region by (30 + 70 + 107 + 21) / 4. Similarly, the lamp color recognition unit 104c calculates the average similarity of the central lamp unit area by (101 + 111 + 110 + 112) / 4, and calculates the average similarity of the rightmost lamp unit region by (99 + 105 + 114 + 105) / 4.
• As a result, the lamp color recognition unit 104c calculates 57/108/106 as the average similarities of the lamp part regions at the left end, the center, and the right end in the traffic light area Pb.
  • the lamp color recognition unit 104c identifies the position corresponding to the smallest average similarity “57” among these average similarities, that is, the lamp region at the left end, as a dissimilar region. That is, the lamp color recognizing unit 104c determines that the leftmost lamp unit region among the three lamp unit regions in the recognition target image exists as a dissimilar region.
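The averages above can be reproduced directly from the similarity values given for the four frames. Note that the text rounds the averages to 57/108/106; the exact values are 57.0, 108.5, and 105.75. Variable names here are illustrative.

```python
# Similarities (left, center, right) per frame, as given in FIG. 11.
frames = [
    (30, 101, 99),    # (n-3)th frame
    (70, 111, 105),   # (n-2)th frame
    (107, 110, 114),  # (n-1)th frame
    (21, 112, 105),   # nth frame (recognition target image)
]

# Average the similarities column-wise, one average per lamp position.
averages = [sum(col) / len(frames) for col in zip(*frames)]

# The position with the smallest average similarity is the dissimilar region:
# index 0 (the left end), so blue is recognized for the nth frame.
dissimilar_index = averages.index(min(averages))
```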
  • the lamp color recognition unit 104c recognizes the lamp color in the recognition target image based on the position of the dissimilar region. For example, if the lamp color recognition unit 104c determines that the leftmost lamp unit region exists as a dissimilar region as described above, it recognizes blue as the lamp color in the recognition target image.
  • FIG. 12 is a flowchart showing a processing operation by the traffic signal recognition apparatus 100c according to the present modification.
• First, the traffic light recognition apparatus 100c executes the processes of steps S101 to S104, similarly to the processing operation of the flowchart shown in FIG. 4.
  • the similarity calculation unit 103 of the traffic signal recognition apparatus 100c stores the similarity information indicating the similarity of each lamp unit area calculated in step S104 in the history holding unit 105 (step S104a).
• Next, based on the average similarity calculated for each of the three lamp part regions, the lamp color recognition unit 104c recognizes the color of the lit lamp part among the three lamp parts of the traffic light as the lamp color in the recognition target image.
• Specifically, based on the average similarities, the lamp color recognition unit 104c determines whether or not there exists a dissimilar region, that is, a lamp part region that is not similar to any of the other lamp part regions among the three lamp part regions (step S105). If the lamp color recognition unit 104c determines that a dissimilar region exists (Yes in step S105), it recognizes the lamp color based on the position of the dissimilar region in the traffic light area Pb (step S106).
• In this modification, the lamp color recognition unit 104c calculates the simple average of the similarities of each lamp part region over the frames, but it may calculate a weighted average as the average similarity. For example, the closer a frame (captured image) is in time to the recognition target image, the greater the weight by which the similarity of the lamp part region in that frame is multiplied. This makes it possible to calculate an average similarity more appropriate for the recognition target image.
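A hedged sketch of that weighted average follows; the specific weight values are assumptions for illustration (the patent only says that more recent frames get larger weights).

```python
def weighted_average_similarity(similarities, weights):
    """similarities: per-frame similarity of one lamp part region, oldest first.
    weights: one weight per frame, larger for frames closer in time to the
    recognition target image."""
    total = sum(s * w for s, w in zip(similarities, weights))
    return total / sum(weights)
```

With uniform weights this reduces to the simple average used in the example (57.0 for the left-end region); with increasing weights such as 1, 2, 3, 4 the recent frames dominate the result.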
  • the light color recognition unit 104c reads the history information from the history holding unit 105, thereby acquiring similarity information of the recognition target image included in the history information.
  • the light color recognition unit 104c may directly acquire the similarity information of the recognition target image from the similarity calculation unit 103.
  • the history information read from the history holding unit 105 by the lamp color recognition unit 104c does not include the similarity information of the recognition target image, and includes only the similarity information of the past captured images.
• As described above, in this modification, the lamp color recognition unit 104c refers to history information indicating the similarities calculated for each of a plurality of captured images acquired before the recognition target image. This history information indicates, for each of the plurality of captured images and for each position in the traffic light area Pb in that captured image, the similarity of the lamp part region at that position. Then, for each position in the traffic light area, the lamp color recognition unit 104c calculates, as the average similarity, the average of the similarities of the lamp part regions at that position in the recognition target image and the plurality of captured images. Further, the lamp color recognition unit 104c determines that, in the recognition target image, the lamp part region at the position corresponding to the smallest of the average similarities calculated for the N lamp part regions exists as the dissimilar region.
	• since the dissimilar region is determined based on the average similarity using past similarities, it can be determined more accurately than by using the lamp colors recognized in the past.
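The per-position averaging and smallest-average selection described above can be sketched as follows; the array layout (one similarity value per lamp area position, per frame) is an assumption of this sketch:

```python
import numpy as np

def find_dissimilar_region(past_similarities, current_similarities):
    """Determine the dissimilar (lit) lamp area from average similarity.

    past_similarities: list of length-N sequences, one per past captured
    image, giving the similarity of the lamp area at each position.
    current_similarities: length-N sequence for the recognition target image.
    Returns the index of the position with the smallest average
    similarity, which is judged to be the dissimilar region.
    """
    stack = np.vstack(past_similarities + [current_similarities])
    average = stack.mean(axis=0)      # average similarity per position
    return int(np.argmin(average))    # smallest average -> dissimilar region
```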
  • the traffic light recognition device recognizes the light color of the traffic light.
  • the traffic signal recognizing device may recognize not only the lamp color but also the traveling direction indicated by the lit arrow light portion for the traffic light having the arrow light portion.
	• the lamp part that indicates a direction with an arrow is hereinafter referred to as a direction indicator lamp part, and the lamp part indicating a color such as blue, yellow, or red is hereinafter also referred to as a color lamp part.
  • FIG. 13 is a diagram for explaining the recognition of the traveling direction by the traffic signal recognition device according to the present modification.
  • the traffic signal includes not only a blue color lamp unit, a yellow color lamp unit, and a red color lamp unit, but also a direction indicator lamp unit that indicates the right direction as the traveling direction.
	• the area extraction unit 101 of the traffic light recognition device extracts, from the captured image Pa, the area in which the three color lamp units and the one direction indicator lamp unit are projected, as the traffic light area Pb. Then, the area dividing unit 102 divides the traffic light area Pb, thereby generating the lamp areas Pb1, Pb2, and Pb3 in which the three color lamp units are respectively displayed, and the lamp area Pb4 in which the direction indicator lamp unit is displayed.
  • the traffic signal recognizing device recognizes the light color of the traffic signal based on the lamp area Pb1, Pb2 and Pb3 as in the above embodiment and the first to third modifications. Then, the traffic light recognition device determines that the direction indicator lamp unit is not lit if the lamp color is a predetermined color. For example, when the lamp color is blue or yellow, the traffic light recognition device determines that the direction indicator lamp unit is not lit. On the other hand, when the lamp color is red, the traffic light recognition device determines whether or not the direction indicator lamp unit is lit using the lamp unit region Pb4. For example, the traffic light recognition apparatus inputs the feature amount of the lamp part region Pb4 to a model such as a neural network generated by machine learning. The traffic signal recognizing device determines whether or not the direction indicator lamp unit is lit according to the output from the model.
	• the traffic light recognition device may calculate a correlation coefficient between a region other than the dissimilar region (hereinafter referred to as a similar region) among the lamp areas Pb1, Pb2, and Pb3 and the lamp area Pb4, and determine whether the correlation coefficient is greater than or equal to a threshold value. That is, since the similar region appears dark in the captured image, if the correlation coefficient is equal to or greater than the threshold, the lamp area Pb4 is also dark and the direction indicator lamp unit is likely to be turned off; if the correlation coefficient is less than the threshold, the lamp area Pb4 is bright and the direction indicator lamp unit is likely to be lit. Therefore, the traffic light recognition device determines that the direction indicator lamp unit is turned off when the correlation coefficient is equal to or greater than the threshold, and conversely determines that it is lit when the correlation coefficient is less than the threshold.
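A hedged sketch of this correlation test follows. The use of the Pearson correlation coefficient and the threshold value are assumptions of the sketch; the disclosure does not fix either:

```python
import numpy as np

def direction_indicator_lit(similar_area, arrow_area, threshold=0.5):
    """Decide whether the direction indicator lamp unit is lit.

    similar_area: grayscale patch of an unlit (dark) color lamp area.
    arrow_area:   grayscale patch of the lamp area Pb4 (same shape).
    If the two patches correlate strongly, Pb4 is probably also dark,
    so the arrow is judged to be off; otherwise it is judged to be lit.
    """
    k = np.corrcoef(similar_area.ravel(), arrow_area.ravel())[0, 1]
    return bool(k < threshold)  # low correlation -> Pb4 bright -> lit
```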
	• when the traffic light recognition device determines that the direction indicator lamp unit is lit as described above, it refers to association information. Like the association information shown in FIG. 3, this association information indicates the colors associated with the positions of the lamp areas Pb1, Pb2, and Pb3, and also indicates the traveling direction associated with the position of the lamp area Pb4. Using the association information, the traffic light recognition device recognizes the traveling direction associated with the position of the lamp area Pb4, for example the right direction, as the traveling direction of the lit direction indicator lamp unit.
  • a traffic signal having a plurality of direction indicator lamps is also arranged on the road.
	• in such a traffic light, only one of the direction indicator lamp units may be lit, or two or more direction indicator lamp units may be lit simultaneously.
	• the traffic light recognition device recognizes the light color of the traffic light based on the respective lamp areas of the N lamp units, as in the above embodiment and the first to third modifications. Then, based on the lamp color, the traffic light recognition device narrows down which of the M (M is an integer of 1 or more) direction indicator lamp units may be lit. For example, if the lamp color is blue, the traffic light recognition device narrows down the direction indicator lamp unit indicating the straight direction as a candidate, and if the lamp color is red, it narrows down the direction indicator lamp unit indicating the right direction as a candidate. Then, similarly to the above, the traffic light recognition device determines whether or not each candidate direction indicator lamp unit is lit using a model or a correlation coefficient.
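A toy illustration of this narrowing step; the color-to-candidate mapping below is a hypothetical example, not prescribed by the disclosure:

```python
# Hypothetical mapping from the recognized lamp color to the direction
# indicator lamp units that may be lit together with that color.
CANDIDATES_BY_COLOR = {
    "blue": ["straight"],
    "yellow": [],
    "red": ["right"],
}

def narrow_candidates(lamp_color):
    """Narrow down which direction indicator lamp units can be lit,
    given the recognized lamp color; unknown colors yield no candidates."""
    return CANDIDATES_BY_COLOR.get(lamp_color, [])
```

Each surviving candidate would then be checked with the model or correlation test described above.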
	• in this way, when the traffic light has M (M is an integer of 1 or more) direction indicator lamp units, each of which indicates a traveling direction of the vehicle, the traffic light recognition device recognizes the traveling direction indicated by each lit direction indicator lamp unit. That is, the area dividing unit 102 divides the traffic light area Pb, thereby generating the N lamp areas and M lamp areas in which the direction indicator lamp units are respectively projected. The traffic light recognition device then recognizes, based on the lamp color recognized for the recognition target image and the feature quantities of the M lamp areas, the traveling direction indicated by at least one direction indicator lamp unit that is lit among the M direction indicator lamp units.
	• conversely, the traffic light recognition device may first recognize the traveling direction indicated by at least one lit direction indicator lamp unit among the M direction indicator lamp units, and then recognize the lamp color of the color lamp units based on that recognition result.
	• the above-described feature quantity may be a vector as in the above embodiment, or a correlation coefficient.
	• for example, in the case where the direction indicator lamp units that may be lit are predetermined for each lamp color of the traffic light, the candidates for the lit direction indicator lamp unit among the M direction indicator lamp units can be narrowed down according to the recognized lamp color. Further, for example, by inputting each feature quantity of the M lamp areas into a model such as a neural network generated by machine learning, the lit direction indicator lamp unit can be properly recognized from among the narrowed-down candidates.
  • FIG. 14 is a diagram illustrating an example of a method of calculating the similarity of the first lamp part region Pb1 in the above embodiment.
	• the similarity calculation unit 103 divides the first lamp area Pb1 into a plurality of blocks each composed of n × n pixels, as described above. Then, for each block, the similarity calculation unit 103 searches the second lamp area Pb2 for the block most similar to the image of that block.
  • a block is also called a patch.
	• for example, the similarity calculation unit 103 searches the second lamp area Pb2 for the block most similar to the image of the block B11 at the upper left corner of the first lamp area Pb1. Then, the similarity calculation unit 103 calculates the inter-vector distance (for example, "5") between the block B11 and the most similar block found in the second lamp area Pb2. A larger inter-vector distance indicates that the two blocks are less similar, and a smaller inter-vector distance indicates that they are more similar. Therefore, the inter-vector distance can be regarded as a dissimilarity.
  • the similarity calculation unit 103 performs the same processing as that of the block B11 on the block B12 next to the block B11. That is, the similarity calculation unit 103 searches the second lamp part area Pb2 for a block most similar to the image of the second block B12 from the upper left end of the first lamp part area Pb1 to the right. Then, the similarity calculation unit 103 calculates the inter-vector distance between the block B12 and the most similar block searched from the second lamp unit region Pb2.
	• the similarity calculation unit 103 repeatedly executes the processes shown in FIGS. 14A and 14B, so that, as shown in FIG. 14C, an inter-vector distance (that is, a dissimilarity) is calculated for each block included in the first lamp area Pb1.
	• then, the similarity calculation unit 103 calculates the average or sum of the dissimilarities of these blocks, and calculates the reciprocal of the result as the correlation coefficient k12 of the first lamp area Pb1 with respect to the second lamp area Pb2. Similarly, the similarity calculation unit 103 calculates the correlation coefficient k13 of the first lamp area Pb1 with respect to the third lamp area Pb3, and calculates the larger of the correlation coefficients k12 and k13 as the similarity k1 of the first lamp area Pb1.
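The block-matching computation above can be sketched as follows. The exhaustive search, the Euclidean inter-vector distance, the block size, and the use of the mean rather than the sum are assumptions of this sketch:

```python
import numpy as np

def block_dissimilarity(area_a, area_b, n=2):
    """Mean, over the n x n blocks of area_a, of the distance to the
    most similar n x n block anywhere in area_b (exhaustive search)."""
    h, w = area_a.shape
    dists = []
    for y in range(0, h - n + 1, n):
        for x in range(0, w - n + 1, n):
            block = area_a[y:y + n, x:x + n].ravel()
            best = min(
                np.linalg.norm(block - area_b[v:v + n, u:u + n].ravel())
                for v in range(area_b.shape[0] - n + 1)
                for u in range(area_b.shape[1] - n + 1)
            )
            dists.append(best)
    return float(np.mean(dists))

def similarity_k1(pb1, pb2, pb3, n=2, eps=1e-6):
    """k12 and k13 are reciprocals of the mean dissimilarity; the
    similarity k1 of the first lamp area is the larger of the two.
    `eps` avoids division by zero for identical areas."""
    k12 = 1.0 / (block_dissimilarity(pb1, pb2, n) + eps)
    k13 = 1.0 / (block_dissimilarity(pb1, pb3, n) + eps)
    return max(k12, k13)
```

A real implementation would likely restrict the search window and vectorize the distance computation; the nested loops here are only for clarity.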
	• in this way, by comparing the first lamp area Pb1 with the second lamp area Pb2, the correlation coefficient k12 of the first lamp area Pb1 with respect to the second lamp area Pb2 is calculated as the first similarity. Likewise, the correlation coefficient k13 of the first lamp area Pb1 with respect to the third lamp area Pb3 is calculated as the second similarity. The larger of the first similarity and the second similarity is then calculated as the similarity of the first lamp area Pb1. That is, in calculating the similarity of the first lamp area Pb1, the first lamp area Pb1 is individually compared with each of the second lamp area Pb2 and the third lamp area Pb3. The similarities of the second lamp area Pb2 and the third lamp area Pb3 are also calculated in the same manner as the similarity of the first lamp area Pb1.
	• in this modification, when calculating the similarity of the first lamp area Pb1, the similarity calculation unit 103 does not individually compare the first lamp area Pb1 with each of the second lamp area Pb2 and the third lamp area Pb3. Instead, the similarity calculation unit 103 calculates the similarity of the first lamp area Pb1 by comparing the first lamp area Pb1 with the group consisting of the second lamp area Pb2 and the third lamp area Pb3.
  • FIG. 15 is a diagram illustrating an example of a method of calculating the similarity of each lamp area according to the present modification.
	• when calculating the similarity of the first lamp area Pb1, the similarity calculation unit 103 compares, as shown in FIG. 15A, the first lamp area Pb1 with the first group G1 composed of the second lamp area Pb2 and the third lamp area Pb3. That is, the similarity calculation unit 103 divides the first lamp area Pb1 into a plurality of blocks and, for each block of the first lamp area Pb1, searches the first group G1 for the block most similar to the image of that block. As in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the first lamp area Pb1 by this search, and calculates the correlation coefficient of the first lamp area Pb1 as its similarity based on those inter-vector distances.
	• similarly, when calculating the similarity of the second lamp area Pb2, the similarity calculation unit 103 compares, as shown in FIG. 15B, the second lamp area Pb2 with the second group G2 composed of the first lamp area Pb1 and the third lamp area Pb3. That is, the similarity calculation unit 103 divides the second lamp area Pb2 into a plurality of blocks and, for each block of the second lamp area Pb2, searches the second group G2 for the block most similar to the image of that block. As in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the second lamp area Pb2 by this search, and calculates the correlation coefficient of the second lamp area Pb2 as its similarity based on those inter-vector distances.
	• likewise, when calculating the similarity of the third lamp area Pb3, the similarity calculation unit 103 compares, as shown in FIG. 15C, the third lamp area Pb3 with the third group G3 composed of the first lamp area Pb1 and the second lamp area Pb2. That is, the similarity calculation unit 103 divides the third lamp area Pb3 into a plurality of blocks and, for each block of the third lamp area Pb3, searches the third group G3 for the block most similar to the image of that block. As in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the third lamp area Pb3 by this search, and calculates the correlation coefficient of the third lamp area Pb3 as its similarity based on those inter-vector distances.
	• in this way, the similarities of the first lamp area Pb1, the second lamp area Pb2, and the third lamp area Pb3 are calculated using the groups. For example, if the lamp unit projected in the first lamp area Pb1 is lit and the other lamp units are off, a low similarity is calculated only for the first lamp area Pb1, and a high similarity is calculated for each of the second lamp area Pb2 and the third lamp area Pb3. Therefore, even with the calculation method shown in FIG. 15, an appropriate similarity can be calculated for each lamp area, as in the above embodiment.
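Under the same assumptions as the sketch for FIG. 14 (block size, Euclidean distance, reciprocal-of-mean correlation coefficient), the group comparison of FIG. 15 might look like this; placing the other areas side by side to form the group is an implementation choice of the sketch:

```python
import numpy as np

def group_similarity(area, other_areas, n=2, eps=1e-6):
    """Similarity of `area` against the group formed by the other lamp
    areas: each n x n block of `area` is matched against the best block
    found anywhere in the group (here the group is simply the other
    areas placed side by side)."""
    group = np.hstack(other_areas)  # e.g. Pb2 and Pb3 side by side
    h, w = area.shape
    dists = []
    for y in range(0, h - n + 1, n):
        for x in range(0, w - n + 1, n):
            block = area[y:y + n, x:x + n].ravel()
            best = min(
                np.linalg.norm(block - group[v:v + n, u:u + n].ravel())
                for v in range(group.shape[0] - n + 1)
                for u in range(group.shape[1] - n + 1)
            )
            dists.append(best)
    return 1.0 / (float(np.mean(dists)) + eps)
```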
  • FIG. 16 is a diagram showing another example of a method for calculating the similarity of each lamp section according to the present modification.
  • the lamp section area that is the target of similarity calculation is divided into a plurality of blocks, but conversely, the group may be divided into a plurality of blocks.
	• when calculating the similarity of the third lamp area Pb3, the similarity calculation unit 103 compares, as shown in FIG. 16A, the third group G3 composed of the first lamp area Pb1 and the second lamp area Pb2 with the third lamp area Pb3. That is, the similarity calculation unit 103 divides the third group G3 into a plurality of blocks and, for each block of the third group G3, searches the third lamp area Pb3 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the third group G3 by this search. Then, the similarity calculation unit 103 calculates the correlation coefficient of the third group G3 as the similarity of the third lamp area Pb3.
	• similarly, when calculating the similarity of the second lamp area Pb2, the similarity calculation unit 103 compares, as shown in FIG. 16B, the second group G2 composed of the third lamp area Pb3 and the first lamp area Pb1 with the second lamp area Pb2. That is, the similarity calculation unit 103 divides the second group G2 into a plurality of blocks and, for each block of the second group G2, searches the second lamp area Pb2 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the second group G2 by this search. Then, the similarity calculation unit 103 calculates the correlation coefficient of the second group G2 as the similarity of the second lamp area Pb2.
	• likewise, when calculating the similarity of the first lamp area Pb1, the similarity calculation unit 103 compares, as shown in FIG. 16C, the first group G1 composed of the third lamp area Pb3 and the second lamp area Pb2 with the first lamp area Pb1. That is, the similarity calculation unit 103 divides the first group G1 into a plurality of blocks and, for each block of the first group G1, searches the first lamp area Pb1 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the first group G1 by this search. Then, the similarity calculation unit 103 calculates the correlation coefficient of the first group G1 as the similarity of the first lamp area Pb1.
	• even in this example, the similarities of the first lamp area Pb1, the second lamp area Pb2, and the third lamp area Pb3 are calculated using groups. For example, if the lamp unit projected in the first lamp area Pb1 is lit and the other lamp units are off, a low similarity is calculated only for the first lamp area Pb1, and a high similarity is calculated for each of the second lamp area Pb2 and the third lamp area Pb3. Therefore, even with the calculation method shown in FIG. 16, an appropriate similarity can be calculated for each lamp area, as in the above embodiment.
	• in this example, the traffic light area of the recognition target image obtained by shooting is divided into four lamp areas: the first lamp area Pb1, the second lamp area Pb2, the third lamp area Pb3, and the fourth lamp area Pb4.
	• when calculating the similarity of the fourth lamp area Pb4, the similarity calculation unit 103 compares, as shown in FIG. 17A, the fourth group Gp4 composed of the first lamp area Pb1, the second lamp area Pb2, and the third lamp area Pb3 with the fourth lamp area Pb4. That is, the similarity calculation unit 103 divides the fourth group Gp4 into a plurality of blocks and, for each block of the fourth group Gp4, searches the fourth lamp area Pb4 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the fourth group Gp4 by this search. Then, the similarity calculation unit 103 calculates the correlation coefficient of the fourth group Gp4 as the similarity of the fourth lamp area Pb4.
	• similarly, when calculating the similarity of the first lamp area Pb1, the similarity calculation unit 103 compares, as shown in FIG. 17B, the first group Gp1 composed of the second lamp area Pb2, the third lamp area Pb3, and the fourth lamp area Pb4 with the first lamp area Pb1. That is, the similarity calculation unit 103 divides the first group Gp1 into a plurality of blocks and, for each block of the first group Gp1, searches the first lamp area Pb1 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the first group Gp1 by this search. Then, the similarity calculation unit 103 calculates the correlation coefficient of the first group Gp1 as the similarity of the first lamp area Pb1.
	• likewise, when calculating the similarity of the second lamp area Pb2, the similarity calculation unit 103 compares, as shown in FIG. 17C, the second group Gp2 composed of the third lamp area Pb3, the fourth lamp area Pb4, and the first lamp area Pb1 with the second lamp area Pb2. That is, the similarity calculation unit 103 divides the second group Gp2 into a plurality of blocks and, for each block of the second group Gp2, searches the second lamp area Pb2 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the second group Gp2 by this search. Then, the similarity calculation unit 103 calculates the correlation coefficient of the second group Gp2 as the similarity of the second lamp area Pb2.
	• finally, when calculating the similarity of the third lamp area Pb3, the similarity calculation unit 103 compares the third group Gp3 composed of the fourth lamp area Pb4, the first lamp area Pb1, and the second lamp area Pb2 with the third lamp area Pb3. That is, the similarity calculation unit 103 divides the third group Gp3 into a plurality of blocks and, for each block of the third group Gp3, searches the third lamp area Pb3 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the third group Gp3 by this search. Then, the similarity calculation unit 103 calculates the correlation coefficient of the third group Gp3 as the similarity of the third lamp area Pb3.
	• even in this example, the similarities of the lamp areas are calculated using groups. For example, if the lamp unit projected in the first lamp area Pb1 is lit and the other lamp units are off, a low similarity is calculated only for the first lamp area Pb1, and a high similarity is calculated for each of the other lamp areas. Therefore, even with the calculation method shown in FIG. 17, an appropriate similarity can be calculated for each lamp area, as in the above embodiment.
	• as described above, for each of the N lamp areas, the similarity calculation unit 103 in this modification calculates the similarity of the lamp area by comparing the image of the lamp area with the image of a group of at least two other lamp areas. Specifically, for each of the N lamp areas, the similarity calculation unit 103 calculates the similarity of the lamp area by searching the group, for each block included in the lamp area, for the block most similar to the image of that block. Alternatively, for each of the N lamp areas, the similarity calculation unit 103 compares the image of the lamp area with the image of the group by searching the lamp area, for each block included in the group, for the block most similar to the image of that block. This saves the time and effort of individually comparing the lamp area with each of the other lamp areas: by comparing the lamp area with the group of (N-1) lamp areas other than that lamp area, the similarity of the lamp area can be calculated easily. Furthermore, in the calculation of the similarity, image regions of different sizes (for example, a lamp area and a group) can be appropriately compared.
	• each component included in the traffic light recognition device may be configured by dedicated hardware, or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • the software that realizes the traffic signal recognition apparatus in the embodiment and each modification causes the computer to execute each step included in the flowcharts shown in FIGS. 4, 6, 9, and 12.
	• as described above, the traffic light recognition device according to one or more aspects has been described based on the embodiment and each modification, but the present disclosure is not limited to this embodiment. Forms obtained by applying various modifications conceived by those skilled in the art to the embodiment or the modifications, and forms constructed by combining components of the embodiment and the modifications, may also be included in the scope of the present disclosure as long as they do not deviate from the gist of the present disclosure.
	• in the above embodiment and each modification, the area extraction unit 101 geometrically determines the traffic light area Pb in the captured image Pa using the position of the vehicle 10, the position and form of the traffic light, and the map information.
  • the area extraction unit 101 may detect the traffic signal area Pb using another detection method instead of such a detection method.
  • the area extraction unit 101 may detect the traffic light area Pb by image recognition by machine learning, or may detect the traffic light area Pb by pattern matching.
  • the number (N) of the lamp units may be any number as long as it is three or more.
  • the colors of the N lamp parts may be colors other than blue, yellow, and red, and two or more of the N lamp parts may be the same color.
  • the traffic signal recognition apparatus shown in FIGS. 1, 5, 7, and 10 may be configured by a processor and a memory. That is, the constituent elements other than the history holding unit 105 among the constituent elements of the traffic signal recognition apparatus are realized by the processor.
  • the processor implements components other than the history holding unit 105 by executing a program stored in the memory.
  • the memory may be configured as the history holding unit 105. That is, the memory may store a program for controlling the processor or may store history information.
  • the present disclosure can be used for a traffic signal recognition device that is mounted on, for example, an automatic traveling vehicle and recognizes a traffic signal on a traveling route of the vehicle.

Abstract

Provided is a traffic signal recognizing device with which it is possible to ease the restriction on imaging a traffic signal. A traffic signal recognizing device (100) is provided with a processor and a memory. The processor uses the memory to extract a region in which a traffic signal having N lights is shown from an image that is acquired by a sensor and is subject to recognition, the region being extracted as a traffic signal region, and calculates, from the traffic signal region, the level of similarity of a plurality of images in which lights different from each other are shown.

Description

Traffic signal recognition device, traffic signal recognition method, and program
 The present disclosure relates to a traffic signal recognition device that recognizes traffic signals.
 Conventionally, a traffic signal recognition device for recognizing a traffic signal has been proposed (for example, see Patent Document 1). This traffic light recognition device acquires an image generated by shooting a traffic light, and recognizes the light color of the traffic light based on the image.
JP 2009-244946 A
 However, the traffic signal recognition device of Patent Document 1 has a problem that, in order to recognize the traffic light properly, an image of the traffic light generated by shooting under severe restrictions is necessary.
 Therefore, the present disclosure provides a traffic light recognition device that can relax restrictions on the shooting of a traffic light.
 A traffic signal recognition apparatus according to an aspect of the present disclosure includes a processor and a memory. Using the memory, the processor extracts, from a recognition target image acquired by a sensor, an area in which a traffic light having N (N is an integer of 3 or more) lamp units is projected, as a traffic light area, and calculates, from the traffic light area, the similarities of a plurality of images in which mutually different lamp units are projected.
 Note that these comprehensive or specific aspects may be realized by a system, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable CD-ROM, or by any combination of a system, a method, an integrated circuit, a computer program, and a recording medium.
 The traffic signal recognition device of the present disclosure can ease restrictions on traffic signal shooting.
FIG. 1 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to an embodiment. FIG. 2 is a diagram for explaining processing of the region extraction unit, the region division unit, and the similarity calculation unit in the embodiment. FIG. 3 is a diagram for explaining processing of the light color recognition unit in the embodiment. FIG. 4 is a flowchart showing the processing operation of the traffic signal recognition apparatus in the embodiment. FIG. 5 is a block diagram illustrating a configuration of the traffic signal recognition apparatus according to the first modification of the embodiment. FIG. 6 is a flowchart illustrating the processing operation of the traffic signal recognition apparatus according to the first modification of the embodiment. FIG. 7 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to the second modification of the embodiment. FIG. 8 is a diagram for explaining the recognition of the light color by the traffic light recognition apparatus according to the second modification of the embodiment. FIG. 9 is a flowchart illustrating the processing operation of the traffic signal recognition apparatus according to the second modification of the embodiment. FIG. 10 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to the third modification of the embodiment. FIG. 11 is a diagram for explaining the recognition of the light color by the traffic light recognition apparatus according to the third modification of the embodiment. FIG. 12 is a flowchart illustrating the processing operation of the traffic signal recognition apparatus according to the third modification of the embodiment. FIG. 13 is a diagram for explaining recognition of the traveling direction by the traffic light recognition apparatus according to the fourth modification of the embodiment. FIG. 14 is a diagram illustrating an example of a method of calculating the similarity of the first lamp unit region in the embodiment. FIG. 15 is a diagram illustrating an example of a method for calculating the similarity of each lamp area according to the fifth modification of the embodiment. FIG. 16 is a diagram illustrating another example of a method for calculating the similarity of each lamp unit region according to the fifth modification of the embodiment. FIG. 17 is a diagram illustrating an example of a method of calculating the similarity of four lamp unit regions (N = 4) according to the fifth modification of the embodiment.
(Underlying Knowledge Forming the Basis of the Present Disclosure)
The inventor has found that the following problems arise with the traffic light recognition device of Patent Document 1 described in the "Background Art" section.
The traffic light recognition device of Patent Document 1 acquires a color feature image generated by shooting at a predetermined shutter speed. From the color feature image, the device extracts, as a color feature candidate region, a circular region showing the same color as the lit color of a traffic light. The device further acquires a shape feature image generated by shooting at a shutter speed determined based on the average luminance around the color feature candidate region, and extracts, from the periphery of the color feature candidate region in the shape feature image, a region that matches a predetermined traffic light shape as a shape feature candidate region. The shape feature candidate region extracted in this way is recognized as the traffic light region, and the color of the color feature candidate region is recognized as the lit color of the traffic light, that is, the lamp color.
That is, in order for this traffic light recognition device to recognize the lamp color of a traffic light, a color feature image generated by shooting at a predetermined shutter speed is required. The predetermined shutter speed is set so that the lit color of the traffic light appears clearly in the color feature image. In other words, the predetermined shutter speed must be adjusted to suit the situation or environment so that color saturation does not occur in the color feature image.
Thus, in order for the traffic light recognition device of Patent Document 1 to properly recognize the lamp color of a traffic light, an image of the traffic light generated by shooting under severe constraints is required.
Furthermore, in order to recognize the traffic light region, the shutter speed must be readjusted based on the average luminance around the color feature candidate region so that the shape of the traffic light appears clearly in the shape feature image. In addition, a template indicating the shape of the traffic light is required in order to search the periphery of the color feature candidate region for a region that matches the predetermined traffic light shape. Since this search is performed with the position of the color feature candidate region as a reference, a template is needed for each possible position of the lit color within the traffic light. Multiple types of templates are therefore necessary.
To solve such problems, a traffic light recognition device according to one aspect of the present disclosure includes a processor and a memory. Using the memory, the processor extracts, from a recognition target image acquired by a sensor, a region in which a traffic light having N lamps (N is an integer of 3 or more) appears as a traffic light region, and calculates, from the traffic light region, the similarity of a plurality of images in which mutually different lamps appear. For example, the processor may generate N lamp regions as the plurality of images and calculate the similarity of each of the N lamp regions.
This makes it possible to recognize, among the N lamps of the traffic light, a lamp that is not similar to any of the other lamps as the lit lamp. As a result, the color of each lamp region need not be recognized from the image of that region, and the constraints on shooting the traffic light can be relaxed.
Further, based on the similarity calculated for each of the N lamp regions, the processor may determine whether there is a dissimilar region, that is, a lamp region that is not similar to any of the other lamp regions.
This makes it possible to more appropriately recognize, among the N lamps of the traffic light, a lamp that is not similar to any of the other lamps as the lit lamp. As a result, the color of each lamp region need not be recognized from the image of that region, and the constraints on shooting the traffic light can be appropriately relaxed.
For example, when the processor determines that the dissimilar region exists, the processor may further recognize the color associated in advance with the position of the dissimilar region in the traffic light region as the lamp color in the recognition target image. Specifically, in recognizing the lamp color, the processor may refer to association information indicating a color associated with the position of each of the N lamp regions in the traffic light region, and recognize the color associated in the association information with the position of the dissimilar region as the lamp color in the recognition target image.
Thus, when a lamp region that is not similar to any of the others exists among the N lamp regions as a dissimilar region, the lamp color is recognized based on the position of that dissimilar region. For example, if one of the N lamps of the traffic light is lit and the other N−1 lamps are off, the lamp region in which the lit lamp appears is determined to be the dissimilar region. If the dissimilar region is at the left end of the traffic light region, blue is recognized as the lamp color. Since the lamp color is recognized from the similarity of the N lamp regions, that is, from differences in the feature amounts of the N lamp regions, the lamp color can be recognized appropriately.
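The association information described above can be sketched as a simple position-to-color lookup. The following is a minimal Python illustration, assuming N = 3 with positions indexed 0 to 2 from the left end of the traffic light region; the names and layout are hypothetical and not part of the disclosed device.

```python
# Assumed association information: for N = 3, positions 0..2 from the
# left end of the traffic light region correspond to blue, yellow, red.
LAMP_COLOR_BY_POSITION = {0: "blue", 1: "yellow", 2: "red"}

def recognize_lamp_color(dissimilar_index):
    """Return the lamp color associated with the position of the
    dissimilar region, or None when no dissimilar region was found."""
    if dissimilar_index is None:
        return None
    return LAMP_COLOR_BY_POSITION.get(dissimilar_index)
```

For example, a dissimilar region at the left end (index 0) yields "blue", matching the example in the text above.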
For example, when color saturation occurs, even if the blue lamp of the traffic light is lit, the lamp may appear not blue but in a color close to white or yellow in the corresponding lamp region of the recognition target image. In the traffic light recognition device according to the above aspect, however, the lamp color is recognized based on differences in the feature amounts of the N lamp regions even when color saturation occurs in the recognition target image, so the lamp color can be appropriately recognized even for such an image.
Accordingly, since color saturation may be present in the recognition target image, there is no need to adjust the camera to a predetermined shutter speed so that color saturation does not occur when shooting a traffic light. As a result, the shutter speed of the camera can be fixed. Moreover, the lamp color can be appropriately recognized even for a recognition target image generated by shooting at night or in rainy weather, when color saturation tends to occur.
Furthermore, since the lamp color of the traffic light need not be rendered correctly in the recognition target image, a camera that is not highly accurate can be used to shoot the traffic light. For example, because environmental conditions in a vehicle are severe, it is difficult to mount and use a high-definition, wide-dynamic-range camera in the vehicle. However, the traffic light recognition device according to the above aspect can recognize the lamp color even from an image generated by a camera that is not highly accurate. The lamp color can therefore be appropriately recognized even from images generated by a less accurate camera that is mounted on the vehicle and can withstand severe environmental conditions.
Furthermore, since no pattern matching is performed, each of the N lamps of the traffic light may have any shape, such as circular or quadrangular. Likewise, the lamp color can be appropriately recognized whatever the colors of the lamps, not only blue, yellow, and red. In addition, since no template for pattern matching is required, the memory capacity for holding templates can be reduced.
In the similarity calculation, for each lamp region in the recognition target image, the processor may calculate the similarity of that lamp region by selecting, as the similarity, the largest of the correlation coefficients between that lamp region and each of at least two other lamp regions.
With this, for example, if one of the N lamps of the traffic light is lit and the other N−1 lamps are off, a small value is calculated as the similarity of the lamp region in which the lit lamp appears, while a large value is calculated as the similarity of each lamp region in which one of the other N−1 lamps appears. Therefore, by using the similarities of these N lamp regions, it is possible to determine more appropriately whether a dissimilar region exists and which lamp region is the dissimilar region.
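The correlation-based similarity above can be sketched as follows. This is a minimal illustration, assuming each lamp region has been flattened into a list of grayscale pixel values; the function names are illustrative, not from the disclosure.

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length pixel lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def region_similarities(regions):
    """Similarity of each lamp region = the largest correlation
    coefficient against any of the other lamp regions."""
    return [
        max(pearson(r, o) for j, o in enumerate(regions) if j != i)
        for i, r in enumerate(regions)
    ]
```

Two unlit lamp regions correlate strongly with each other and receive a large similarity, while the lit region correlates with neither and receives the smallest similarity.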
In determining whether the dissimilar region exists, the processor may determine that, among the similarities of the N lamp regions, the lamp region having the smallest similarity exists as the dissimilar region.
This makes it possible to determine that a dissimilar region exists in any recognition target image and to recognize the lamp color in that recognition target image.
Alternatively, in determining whether the dissimilar region exists, the processor may determine that the lamp region having the smallest similarity exists as the dissimilar region only when the smallest of the similarities of the N lamp regions is equal to or less than a threshold.
With this, a dissimilar region is determined to exist when the smallest similarity is equal to or less than the threshold, and not to exist when the smallest similarity is greater than the threshold. This prevents the lamp color from being recognized when there is no significant difference among the similarities of the N lamp regions, and thus prevents an inappropriate lamp color from being recognized.
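The threshold-based determination can be sketched as follows. The threshold value 0.5 is an assumed example, since the disclosure does not specify a concrete value.

```python
def find_dissimilar_region(similarities, threshold=0.5):
    """Return the index of the dissimilar region, or None when the
    smallest similarity does not fall at or below the threshold
    (i.e., no significant difference among the lamp regions)."""
    idx = min(range(len(similarities)), key=similarities.__getitem__)
    return idx if similarities[idx] <= threshold else None
```

When all lamps are off (e.g., due to flicker), all similarities are large and the function returns None, triggering the history-based fallback described below in the text.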
In recognizing the lamp color, when the processor determines that no dissimilar region exists, the processor may refer to history information indicating the lamp color recognized for an image acquired in the past, and recognize the lamp color indicated by the history information as the lamp color in the recognition target image.
For example, due to flicker, there are moments at which none of the lamps of the traffic light is lit. No dissimilar region exists in a recognition target image generated by shooting at such a moment. However, among a series of images generated by shooting at a constant frame rate, the image generated immediately before the recognition target image, that is, the past image, is unlikely to have been affected by flicker. In other words, the past image is likely to show, in its lit state, the lamp that was momentarily off due to flicker when the recognition target image was captured. Therefore, in the traffic light recognition device according to the above aspect, when no dissimilar region exists in the recognition target image, the lamp color recognized for the past image is recognized as the lamp color in the recognition target image. This appropriately prevents flicker from hindering recognition of the lamp color.
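This fallback can be sketched as a simple check against a color history. The names are hypothetical; the history is assumed to hold the colors of past frames, newest last.

```python
def recognize_with_history(current_color, history):
    """If no dissimilar region was found in the current frame (e.g.
    all lamps momentarily dark due to flicker), fall back on the
    lamp color recognized for the most recent past frame."""
    if current_color is not None:
        return current_color
    return history[-1] if history else None
```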
The processor may further refer to history information indicating the lamp color recognized for each of a plurality of images acquired before the recognition target image, identify, as the majority lamp color, the most frequent color among the lamp color recognized for the recognition target image and the lamp colors in the plurality of images indicated by the history information, and update the lamp color recognized for the recognition target image to the majority lamp color.
With this, the lamp colors recognized for a plurality of images acquired before the recognition target image are referred to. The plurality of images and the recognition target image are, for example, a series of images generated by shooting at a constant frame rate. The lamp color recognized for the recognition target image is then updated to the majority lamp color, that is, the most frequent color among that lamp color and the lamp colors of the plurality of images. Therefore, even if, due to a factor other than flicker such as noise, it is suddenly determined that no dissimilar region exists in the recognition target image, or an incorrect lamp color is recognized for the recognition target image, the error can easily be corrected.
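The majority update can be sketched with a frequency count over the current color and the history; a minimal illustration with hypothetical names.

```python
from collections import Counter

def update_by_majority(current_color, history):
    """Replace the color recognized for the current frame with the
    most frequent color among it and the colors of past frames."""
    votes = Counter(history)
    votes[current_color] += 1
    return votes.most_common(1)[0][0]
```

A spurious "yellow" in a run of "red" frames is thus corrected back to "red".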
In determining whether the dissimilar region exists, the processor may refer, for each of a plurality of images acquired before the recognition target image, to history information indicating, for each position in the traffic light region of that image, the similarity of the lamp region at that position; calculate, for each position in the traffic light region, the average of the similarities of the lamp regions at that position in the recognition target image and the plurality of images as an average similarity; and determine that, among the N lamp regions in the recognition target image, the lamp region at the position corresponding to the smallest of the average similarities calculated for the respective positions exists as the dissimilar region.
With this, the similarities calculated for a plurality of images acquired before the recognition target image are referred to. The plurality of images and the recognition target image are, for example, a series of images generated by shooting at a constant frame rate. For each position in the traffic light region, the average of the similarities of the lamp regions at that position in the recognition target image and the plurality of images is calculated as an average similarity. For example, average similarities are calculated for the lamp region at the left end, the lamp region at the center, and the lamp region at the right end of the traffic light region. Here, for example, when N = 3, if the average similarity of the lamp region at the left end is the smallest, the lamp region at the left end of the traffic light region in the recognition target image is determined to exist as the dissimilar region. Therefore, even if, due to a factor other than flicker such as noise, it is suddenly determined that no dissimilar region exists in the recognition target image, or an incorrect lamp color is recognized for the recognition target image, the error can be corrected. Moreover, since the dissimilar region is determined based on average similarities that use past similarities, the dissimilar region can be determined more accurately than by using a lamp color recognized in the past.
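The average-similarity determination can be sketched as follows, assuming the similarities of past frames are kept as per-frame lists indexed by position; the names are illustrative.

```python
def dissimilar_by_average(current_sims, past_sims):
    """Average, for each position in the traffic light region, the
    similarity over the current frame and the past frames, and pick
    the position with the smallest average as the dissimilar region.
    `past_sims` is a list of per-frame similarity lists (history)."""
    frames = [current_sims] + past_sims
    n = len(current_sims)
    averages = [sum(f[i] for f in frames) / len(frames) for i in range(n)]
    return averages.index(min(averages))
```

A single noisy frame whose smallest similarity lands at the wrong position is outvoted by the history, so the correct position is still selected.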
When the traffic light further has M direction indicator lamps (M is an integer of 1 or more), each of which is a lamp indicating a traveling direction of a vehicle, the processor may, in generating the N lamp regions, divide the traffic light region to generate the N lamp regions and M lamp regions in which the respective direction indicator lamps appear, and may further recognize the traveling direction indicated by at least one direction indicator lamp that is lit among the M direction indicator lamps, based on the lamp color recognized for the recognition target image and the feature amounts of the M lamp regions. Alternatively, when the traffic light further has M direction indicator lamps (M is an integer of 1 or more), each of which is a lamp indicating a traveling direction of a vehicle, the processor may, in generating the N lamp regions, divide the traffic light region to generate the N lamp regions and M lamp regions in which the respective direction indicator lamps appear, and may further recognize the traveling direction indicated by at least one direction indicator lamp that is lit among the M direction indicator lamps, based on the feature amounts of the M lamp regions.
This makes it possible to appropriately recognize the traveling direction of a lit direction indicator lamp among the M direction indicator lamps, which indicate traveling directions by arrows, for example. For example, by inputting the feature amounts of the M lamp regions into a model such as a neural network generated by machine learning, the lit direction indicator lamp can be appropriately identified among the M direction indicator lamps; that is, the traveling direction of the lit direction indicator lamp can be appropriately recognized. Furthermore, when the direction indicator lamps that may be lit are predetermined for each lamp color of the traffic light, the recognized lamp color can be used to narrow down the candidates for the lit direction indicator lamp among the M direction indicator lamps. In that case, the lit direction indicator lamp can be appropriately identified among the narrowed-down candidates. Thus, in the traffic light recognition device according to the above aspect, the traveling direction of a lit direction indicator lamp can be appropriately recognized whether or not the lamp colors and the direction indicator lamps that may be lit are predetermined.
In the similarity calculation, for each of the N lamp regions, the processor may calculate the similarity of that lamp region by comparing the image of that lamp region with the image of a group consisting of at least one other lamp region. For example, when comparing the image of a lamp region with the image of the group, the processor may calculate the similarity of that lamp region by searching the group, for each block included in that lamp region, for the block most similar to the image of that block. Alternatively, when comparing the image of a lamp region with the image of the group, the processor may calculate the similarity of that lamp region by searching that lamp region, for each block included in the group, for the block most similar to the image of that block.
With this, for example, when the group consists of a plurality of lamp regions, the effort of individually comparing a lamp region with each of the other lamp regions when calculating its similarity can be saved. That is, the similarity of a lamp region can be easily calculated by comparing that lamp region with the group consisting of the (N−1) lamp regions other than that lamp region.
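The block-wise comparison against a group can be sketched as follows. Blocks are represented as flat pixel lists, and the SAD-to-similarity mapping is an assumed illustrative choice, since the disclosure does not fix a particular block metric.

```python
def block_sad(a, b):
    """Sum of absolute differences between two equal-sized blocks."""
    return sum(abs(x - y) for x, y in zip(a, b))

def region_similarity_vs_group(region_blocks, group_blocks):
    """For every block of the lamp region, find the most similar
    block anywhere in the group (the other lamp regions) and average
    the match quality, mapped into (0, 1]: SAD 0 -> similarity 1."""
    total = 0.0
    for blk in region_blocks:
        best = min(block_sad(blk, g) for g in group_blocks)
        total += 1.0 / (1.0 + best)
    return total / len(region_blocks)
```

A region whose blocks all reappear in the group scores 1.0; a lit region whose bright blocks match nothing in the group scores near 0, marking it as the dissimilar region.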
Hereinafter, embodiments will be specifically described with reference to the drawings.
Each of the embodiments described below shows a comprehensive or specific example. The numerical values, shapes, materials, components, arrangement positions and connection forms of the components, steps, order of the steps, and the like shown in the following embodiments are merely examples and are not intended to limit the present disclosure. Among the components in the following embodiments, components that are not recited in the independent claims indicating the highest concept are described as optional components.
Each figure is a schematic diagram and is not necessarily illustrated strictly. In each figure, the same reference signs are given to the same structural members.
(Embodiment)
FIG. 1 is a block diagram illustrating a configuration of the traffic light recognition device according to the embodiment.
For example, the vehicle 10 includes a camera 210, a map storage unit 220, a position detection system 230, a vehicle control unit 240, and the traffic light recognition device 100.
The camera 210 shoots, for example, the area in front of the vehicle 10 and outputs an image generated by the shooting (hereinafter referred to as a captured image) to the traffic light recognition device 100. Specifically, the camera 210 shoots at a constant frame rate and, each time an image is generated by the shooting, outputs that image to the traffic light recognition device 100 as a captured image. In the present embodiment, the camera 210 is configured as a sensor that acquires captured images (that is, recognition target images described later). The shutter speed used for shooting may be fixed or variable.
The map storage unit 220 stores map information indicating a map of at least the surroundings of the vehicle 10. Such map information is, for example, transmitted wirelessly from a server via a network such as the Internet and stored in the map storage unit 220.
The position detection system 230 detects the position of the vehicle 10 and outputs position information indicating the position to the traffic light recognition device 100. Specifically, the position detection system 230 is configured as a GPS (Global Positioning System) receiver.
The vehicle control unit 240 includes, for example, one or more ECUs (Electronic Control Units). The vehicle control unit 240 acquires lamp color information output from the traffic light recognition device 100 and controls the travel of the vehicle 10 based on the lamp color of the traffic light indicated by the lamp color information. That is, the vehicle control unit 240 controls the vehicle 10 so that the vehicle 10 automatically travels on a road on which the traffic light is arranged.
The traffic light recognition device 100 acquires a captured image from the camera 210 and recognizes the traffic light shown in the captured image. For example, the traffic light recognition device 100 recognizes the lamp color of the traffic light and outputs lamp color information indicating that color to the vehicle control unit 240. The captured image used for the current recognition of the traffic light is also referred to as the recognition target image.
The traffic light recognition device 100 according to the present embodiment includes a region extraction unit 101, a region division unit 102, a similarity calculation unit 103, and a lamp color recognition unit 104.
The region extraction unit 101 acquires a captured image from the camera 210 and extracts, from the captured image, the region in which a traffic light appears as a traffic light region. Here, the traffic light has N lamps (N is an integer of 3 or more). For example, N = 3, and the traffic light has a blue lamp, a yellow lamp, and a red lamp. When the captured image is used for recognition of the current lamp color, the captured image is treated as the recognition target image.
Here, when extracting the traffic light region from the captured image, the region extraction unit 101 uses, for example, the map information stored in the map storage unit 220 and the position of the vehicle 10 detected by the position detection system 230.
Specifically, the map information indicates the position and form of each traffic light in three-dimensional space. That is, the map information indicates not only the positions of roads and buildings but also the positions and forms of the traffic lights installed on those roads. The position of a traffic light is expressed in a three-dimensional coordinate system; that is, the position of the traffic light's lamp housing is indicated by longitude, latitude, and height. The height may be the height above the road or the altitude. The form of a traffic light includes its shape and size when viewed from the front.
The region extraction unit 101 identifies the position of the vehicle 10 on the map by mapping the position indicated by the position information output from the position detection system 230 onto the map indicated by the map information. Furthermore, the region extraction unit 101 geometrically detects the region of the captured image in which the traffic light appears, based on the positions of the vehicle 10 and the traffic light on the map, the form of the traffic light, and the mounting position and shooting direction of the camera 210 on the vehicle 10. The region extraction unit 101 then extracts the detected region from the captured image as the traffic light region.
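The geometric detection described above can be illustrated with a minimal pinhole-camera sketch. The function name, the flat-road camera model (yaw only, with roll and pitch assumed zero), and the intrinsic parameter values are illustrative assumptions, not part of the disclosure:

```python
import math

def project_point(point_world, cam_pos, cam_yaw, fx, fy, u0, v0):
    """Project a 3-D map point into pixel coordinates with a pinhole model.

    Camera frame: x forward, y to the left, z up. Only the camera's yaw
    (heading, in radians) is modeled; roll and pitch are assumed zero.
    fx, fy are focal lengths in pixels; (u0, v0) is the principal point.
    """
    dx = point_world[0] - cam_pos[0]
    dy = point_world[1] - cam_pos[1]
    dz = point_world[2] - cam_pos[2]
    # Rotate the world offset into the camera frame (rotation by -yaw about z).
    xc = math.cos(cam_yaw) * dx + math.sin(cam_yaw) * dy   # forward distance
    yc = -math.sin(cam_yaw) * dx + math.cos(cam_yaw) * dy  # leftward offset
    zc = dz                                                # upward offset
    # Pinhole projection: u grows rightward, v grows downward in the image.
    u = u0 - fx * yc / xc
    v = v0 - fy * zc / xc
    return u, v
```

Projecting, for example, the corners of the lamp housing given by the map information yields the bounding box of the traffic light region in the captured image.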
The region division unit 102 divides the traffic light region to generate N lamp unit regions, each of which shows one lamp unit. When dividing the traffic light region into N lamp unit regions, the region division unit 102 may divide it into N equal parts. The region division unit 102 may also perform the division using the map information. That is, the map information may indicate not only the position of the traffic light but also the number of lamp units (N) the traffic light has and the arrangement of those N lamp units. For example, when N = 3, the map information indicates that the number of lamp units is 3 and that the three lamp units are arranged horizontally at equal intervals. The region division unit 102 divides the traffic light region according to the number and arrangement of the lamp units. In the present embodiment, the region division unit 102 divides the traffic light region in order to generate the N lamp unit regions, but it may instead generate the N lamp unit regions without dividing the traffic light region. For example, for each lamp unit, the region division unit 102 may extract the region in which that lamp unit appears from the whole traffic light region. In this extraction, at least two of the N lamp unit regions may contain the same image, such as the background. Also, an image such as the background that is not included in any of the N lamp unit regions may remain in the traffic light region. Thus, in the present embodiment, any image processing may be performed on the traffic light region as long as N lamp unit regions, each showing a different lamp unit, are generated from it.
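For horizontally arranged lamp units, the equal N-way division described above can be sketched as follows. The list-of-rows image representation and the function name are assumptions for illustration:

```python
def split_into_lamp_regions(signal_region, n):
    """Split a traffic-light region (a 2-D list of pixel rows) into n
    equal-width vertical strips, one per lamp unit, from left to right."""
    width = len(signal_region[0])
    strip = width // n
    regions = []
    for i in range(n):
        left = i * strip
        # The last strip absorbs any remainder when width % n != 0.
        right = (i + 1) * strip if i < n - 1 else width
        regions.append([row[left:right] for row in signal_region])
    return regions
```

For a vertically arranged traffic light, the same idea applies with rows and columns exchanged.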
The similarity calculation unit 103 acquires the N lamp unit regions described above from the region division unit 102 and calculates a similarity for each of the N lamp unit regions.
The lamp color recognition unit 104 acquires, from the similarity calculation unit 103, the similarity calculated for each of the N lamp unit regions. The lamp color recognition unit 104 then recognizes, as the lamp color in the captured image, the color associated in advance with the position of the dissimilar region identified based on those similarities. The dissimilar region is the lamp unit region, among the N lamp unit regions, that is not similar to any of the other lamp unit regions. Specifically, by referring to association information described later, the lamp color recognition unit 104 recognizes the color associated in that information with the position of the dissimilar region as the lamp color in the captured image. The association information may be included in the map information; that is, in addition to the number of lamp units (N) and the arrangement of the N lamp units, the map information may indicate the colors associated with the positions of those lamp units. Furthermore, the lamp color recognition unit 104 outputs information indicating the recognized lamp color to the vehicle control unit 240 as lamp color information.
FIG. 2 is a diagram for explaining the processing of the region extraction unit 101, the region division unit 102, and the similarity calculation unit 103.
The region extraction unit 101 acquires a captured image Pa from the camera 210 as the recognition target image (step S1). The region extraction unit 101 then extracts the traffic light region Pb from the captured image Pa (step S2). Here, a traffic light having three lamp units appears in the traffic light region Pb; that is, N = 3. The three lamp units are a blue lamp unit, a yellow lamp unit, and a red lamp unit.
Next, the region division unit 102 divides the traffic light region Pb into three lamp unit regions Pb1, Pb2, and Pb3 (step S3). For example, only the blue lamp unit appears in the lamp unit region Pb1, only the yellow lamp unit appears in the lamp unit region Pb2, and only the red lamp unit appears in the lamp unit region Pb3. The lamp unit region Pb1 is the region at the left end of the traffic light region Pb and is also referred to as the first lamp unit region Pb1. The lamp unit region Pb2 is the region at the center of the traffic light region Pb and is also referred to as the second lamp unit region Pb2. The lamp unit region Pb3 is the region at the right end of the traffic light region Pb and is also referred to as the third lamp unit region Pb3.
Next, the similarity calculation unit 103 calculates the similarities k1, k2, and k3 of the three lamp unit regions Pb1, Pb2, and Pb3, respectively (steps S4 to S6).
Specifically, when calculating the similarity k1 of the first lamp unit region Pb1 (step S4), the similarity calculation unit 103 first calculates the correlation coefficient k12 between the first lamp unit region Pb1 and the second lamp unit region Pb2. That is, the similarity calculation unit 103 compares the image feature values of the first lamp unit region Pb1 with those of the second lamp unit region Pb2 and calculates the correlation between those feature values as the correlation coefficient k12. The more similar the image feature values of the two regions are, the larger the correlation coefficient.
For example, the similarity calculation unit 103 divides the first lamp unit region Pb1 into blocks of n × n pixels and, for each block of the first lamp unit region Pb1, searches the second lamp unit region Pb2 for the block most similar to that block's image. Specifically, the similarity calculation unit 103 represents the feature values of a block in the first lamp unit region Pb1 as a vector, and finds, in the second lamp unit region Pb2, the block whose feature vector is at the shortest distance from that vector. For example, the feature vector of a block is the array of pixel values (specifically, luminance values) of the n × n pixels constituting the block. In this way, the shortest inter-vector distance is obtained for each block of the first lamp unit region Pb1. The similarity calculation unit 103 calculates the correlation coefficient k12 from the shortest inter-vector distances obtained for the blocks of the first lamp unit region Pb1. For example, the correlation coefficient k12 is the reciprocal of the average of the shortest inter-vector distances obtained for the blocks of the first lamp unit region Pb1.
The method of calculating the correlation coefficient is not limited to this; any method may be used. For example, in the above example, the shortest inter-vector distance is calculated for each block of the first lamp unit region Pb1, but it may be calculated in finer units. For example, the shortest inter-vector distance for the image inside a processing window may be calculated while shifting that window one pixel at a time over the first lamp unit region Pb1. For example, the processing window has a size enclosing n × n pixels.
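A minimal sketch of the block-matching correlation described above (the reciprocal of the averaged shortest inter-vector distance) follows. The Euclidean distance on raw luminance values and the exhaustive block search are assumptions for illustration; as noted, the disclosure leaves the exact method open:

```python
def correlation(region_a, region_b, n=2):
    """Block-matching correlation between two equal-sized grayscale regions
    (2-D lists of luminance values). For each n x n block of region_a, find
    the closest n x n block of region_b (Euclidean distance between the
    flattened pixel vectors); the correlation coefficient is the reciprocal
    of the average of those shortest distances."""
    h, w = len(region_a), len(region_a[0])

    def block(region, y, x):
        # Flatten the n x n block at (y, x) into a feature vector.
        return [region[y + dy][x + dx] for dy in range(n) for dx in range(n)]

    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

    blocks_b = [block(region_b, y, x)
                for y in range(0, h - n + 1, n)
                for x in range(0, w - n + 1, n)]
    shortest = []
    for y in range(0, h - n + 1, n):
        for x in range(0, w - n + 1, n):
            v = block(region_a, y, x)
            shortest.append(min(dist(v, b) for b in blocks_b))
    avg = sum(shortest) / len(shortest)
    return 1.0 / avg if avg > 0 else float("inf")
```

Identical regions yield an infinite coefficient, and nearly identical regions yield a large one, matching the property that the coefficient grows as the image features become more similar.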
In the same manner, the similarity calculation unit 103 calculates the correlation coefficient k13 between the first lamp unit region Pb1 and the third lamp unit region Pb3. The similarity calculation unit 103 then selects the larger of the correlation coefficients k12 and k13 as the similarity k1 of the first lamp unit region Pb1. In this way, the similarity k1 of the first lamp unit region Pb1 is calculated.
Similarly, when calculating the similarity k2 of the second lamp unit region Pb2 (step S5), the similarity calculation unit 103 first calculates the correlation coefficient k21 between the second lamp unit region Pb2 and the first lamp unit region Pb1. The similarity calculation unit 103 further calculates the correlation coefficient k23 between the second lamp unit region Pb2 and the third lamp unit region Pb3. It then selects the larger of the correlation coefficients k21 and k23 as the similarity k2 of the second lamp unit region Pb2. In this way, the similarity k2 of the second lamp unit region Pb2 is calculated.
Similarly, when calculating the similarity k3 of the third lamp unit region Pb3 (step S6), the similarity calculation unit 103 first calculates the correlation coefficient k31 between the third lamp unit region Pb3 and the first lamp unit region Pb1. The similarity calculation unit 103 further calculates the correlation coefficient k32 between the third lamp unit region Pb3 and the second lamp unit region Pb2. It then selects the larger of the correlation coefficients k31 and k32 as the similarity k3 of the third lamp unit region Pb3. In this way, the similarity k3 of the third lamp unit region Pb3 is calculated.
As described above, for each lamp unit region in the recognition target image, the similarity calculation unit 103 according to the present embodiment selects, as the similarity, the largest of the correlation coefficients between that lamp unit region and each of at least two other lamp unit regions. As a result, the similarity of each lamp unit region is calculated.
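The per-region similarity (largest correlation coefficient with any other region) can be sketched as follows. The pairwise correlation function is passed in as a parameter, and a toy scalar correlation is used in the example below purely for illustration:

```python
def similarities(regions, correlation):
    """Similarity of each region: the largest correlation coefficient
    between that region and each of the other regions."""
    return [max(correlation(r, other)
                for j, other in enumerate(regions) if j != i)
            for i, r in enumerate(regions)]
```

With three regions this reproduces k1 = max(k12, k13), k2 = max(k21, k23), and k3 = max(k31, k32).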
For example, if one of the three lamp units of a traffic light is lit and the other two are unlit, a small similarity value is calculated for the lamp unit region showing the lit lamp unit, while a large similarity value is calculated for each of the lamp unit regions showing the other two lamp units. Therefore, by using the similarities of these three lamp unit regions, the lamp unit region that is not similar to any of the other lamp unit regions can easily be found among the three.
FIG. 3 is a diagram for explaining the processing of the lamp color recognition unit 104.
The lamp color recognition unit 104 acquires the similarities k1 to k3 calculated for the first lamp unit region Pb1, the second lamp unit region Pb2, and the third lamp unit region Pb3, respectively. Based on the similarities k1 to k3, the lamp color recognition unit 104 then recognizes the color of the lit lamp unit among the three lamp units of the traffic light as the lamp color in the captured image Pa.
Specifically, based on the similarities k1 to k3 calculated for the three lamp unit regions Pb1 to Pb3, the lamp color recognition unit 104 determines whether a dissimilar region, that is, a lamp unit region that is not similar to any of the other lamp unit regions, exists among the three lamp unit regions Pb1 to Pb3.
For example, as shown in FIG. 3, since the similarity k1 is the smallest of the similarities k1 to k3, the lamp color recognition unit 104 determines that the first lamp unit region Pb1 having the similarity k1 is the dissimilar region. That is, the lamp color recognition unit 104 determines that the first lamp unit region Pb1, which has the smallest of the similarities k1 to k3 of the three lamp unit regions Pb1 to Pb3, exists as the dissimilar region. This makes it possible to determine that a dissimilar region exists in any captured image Pa and to recognize the lamp color in that captured image Pa. Alternatively, the lamp color recognition unit 104 may determine that the first lamp unit region Pb1 having the smallest similarity k1 exists as the dissimilar region only when that smallest similarity k1 is less than or equal to a threshold Th. In that case, a dissimilar region is determined to exist when the smallest similarity is less than or equal to the threshold, and not to exist when the smallest similarity is greater than the threshold. This prevents lamp color recognition from being carried out when there is no significant difference among the similarities of the N lamp unit regions, and thus prevents an inappropriate lamp color from being recognized.
Furthermore, when the lamp color recognition unit 104 determines that a dissimilar region exists, it recognizes the lamp color based on the position of the dissimilar region in the traffic light region Pb. For example, at this time, the lamp color recognition unit 104 refers to the association information, that is, the association information indicating the color associated with the position of each of the three lamp unit regions Pb1 to Pb3 in the traffic light region Pb. The lamp color recognition unit 104 then recognizes the color associated in that information with the position of the dissimilar region as the lamp color in the captured image Pa. For example, as shown in FIG. 3, the association information indicates blue associated with the position of the first lamp unit region Pb1 in the traffic light region Pb, that is, the left end. Similarly, the association information indicates yellow associated with the position of the second lamp unit region Pb2, that is, the center, and red associated with the position of the third lamp unit region Pb3, that is, the right end. In such association information, the lamp color recognition unit 104 recognizes, for example, blue, which is associated with the position of the first lamp unit region Pb1 determined to be the dissimilar region, as the lamp color in the captured image Pa.
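The selection of the dissimilar region and the lookup in the association information can be sketched as follows. The dictionary layout mirrors the left/center/right example above, and the optional threshold parameter covers the threshold variant; the names themselves are illustrative assumptions:

```python
# Association information: position index in the traffic light region -> color.
ASSOCIATION = {0: "blue", 1: "yellow", 2: "red"}  # left end, center, right end

def recognize_lamp_color(sims, association=ASSOCIATION, threshold=None):
    """Pick the region with the smallest similarity as the dissimilar
    region and return the color associated with its position. When a
    threshold is given, return None if even the smallest similarity
    exceeds it, i.e. no dissimilar region exists."""
    idx = min(range(len(sims)), key=lambda i: sims[i])
    if threshold is not None and sims[idx] > threshold:
        return None
    return association[idx]
```

For example, similarities of (0.1, 0.9, 0.8) make the left-end region the dissimilar region, so blue is recognized.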
FIG. 4 is a flowchart showing the processing operation of the traffic light recognition device 100 according to the present embodiment.
The region extraction unit 101 of the traffic light recognition device 100 acquires a captured image Pa from the camera 210 as the recognition target image (step S101). The region extraction unit 101 then extracts, as the traffic light region Pb, the region in which a traffic light having N (for example, N = 3) lamp units appears from the captured image Pa, which is the recognition target image (step S102).
Next, the region division unit 102 divides the traffic light region Pb to generate three lamp unit regions Pb1 to Pb3, each showing one lamp unit (step S103).
Next, the similarity calculation unit 103 calculates the similarities k1 to k3 of the three lamp unit regions Pb1 to Pb3, respectively (step S104).
Then, based on the similarities k1 to k3 calculated for the three lamp unit regions Pb1 to Pb3, the lamp color recognition unit 104 recognizes the color of the lit lamp unit among the three lamp units of the traffic light as the lamp color in the captured image Pa.
That is, in this lamp color recognition, the lamp color recognition unit 104 determines, based on the similarities k1 to k3, whether a dissimilar region, that is, a lamp unit region that is not similar to any of the other lamp unit regions, exists among the three lamp unit regions Pb1 to Pb3 (step S105). When the lamp color recognition unit 104 determines that the dissimilar region exists (Yes in step S105), it recognizes the lamp color based on the position of the dissimilar region in the traffic light region Pb (step S106).
As described above, in the traffic light recognition device 100 according to the present embodiment, the region extraction unit 101 acquires the captured image Pa as the recognition target image. The region extraction unit 101 then extracts, as the traffic light region, the region in which a traffic light having N lamp units appears from the recognition target image. The region division unit 102 divides the traffic light region to generate N lamp unit regions, each showing one lamp unit. The similarity calculation unit 103 calculates the similarity of each of the N lamp unit regions. Based on the similarity calculated for each of the N lamp unit regions, the lamp color recognition unit 104 determines whether a dissimilar region, that is, a lamp unit region that is not similar to any of the other lamp unit regions, exists among the N lamp unit regions. When the lamp color recognition unit 104 determines that a dissimilar region exists, it recognizes the color of the lit lamp unit among the N lamp units as the lamp color in the recognition target image, based on the position of the dissimilar region in the traffic light region. For example, the lamp color recognition unit 104 refers to the association information indicating the color associated with the position of each of the N lamp unit regions in the traffic light region Pb, and recognizes the color associated in that information with the position of the dissimilar region as the lamp color in the recognition target image. The association information indicating the colors associated with the positions of the N lamp unit regions may or may not be included in the map information as described above. The association information may be stored in advance in a memory, or it may be stored on a cloud server that can communicate with the traffic light recognition device 100. The lamp color recognition unit 104 may read the association information from that memory, or dynamically from that cloud server. Alternatively, the lamp color recognition unit 104 may query the cloud server for the color associated with the position of the dissimilar region and identify the color in that way.
As a result, if one of the N lamp units of the traffic light is lit and the other N-1 lamp units are unlit, the lamp unit region showing the lit lamp unit is determined to be the dissimilar region. If that dissimilar region is at the left end of the traffic light region, blue is recognized as the lamp color. In this way, the lamp color is recognized from the similarities of the N lamp unit regions, that is, from the differences among their image feature values, so the lamp color can be recognized appropriately.
For example, when color saturation occurs, even if the blue lamp unit of the traffic light is lit, that lamp unit may appear not blue but white or close to yellow in the corresponding lamp unit region of the recognition target image. However, in the traffic light recognition device 100 according to the present embodiment, even if color saturation occurs in the recognition target image, the lamp color is recognized based on the differences among the feature values of the N lamp unit regions, so the lamp color can be recognized appropriately even for such a recognition target image.
Accordingly, since color saturation may be present in the recognition target image, there is no need to adjust the camera 210 to a particular shutter speed to avoid color saturation when capturing the traffic light. As a result, the shutter speed of the camera 210 can be kept fixed. In addition, the lamp color can be recognized appropriately even in recognition target images captured at night or in rainy weather, conditions in which color saturation is likely to occur.
Furthermore, since the lamp color of the traffic light's lamp unit need not be reproduced correctly in the recognition target image, a camera 210 that is not highly accurate can be used to capture the traffic light. For example, because environmental conditions in a vehicle are severe, it is difficult to mount and use a high-definition, wide-dynamic-range camera on a vehicle. However, the traffic light recognition device 100 according to the present embodiment can recognize the lamp color even from captured images generated by a camera 210 that is not highly accurate. Therefore, the lamp color can be recognized appropriately even in captured images generated by a vehicle-mounted camera 210 that is not highly accurate but can withstand severe environmental conditions.
Furthermore, since no pattern matching is performed, each of the N lamp units of the traffic light may have any shape, such as circular or rectangular. Likewise, the lamp color can be recognized appropriately whatever the colors of the lamp units are, not only blue, yellow, and red. In addition, since no template for pattern matching is needed, the memory capacity for holding templates can be reduced.
(Modification 1)
Here, an unlit traffic light may appear in the captured image Pa because of the flicker of the traffic light. That is, the lamp units of a traffic light are lit periodically; for example, a lamp unit blinks at a frequency of 100 Hz or 120 Hz. Therefore, at the moment the camera 210 generates the captured image Pa, none of the lamp units of the traffic light may be lit. In a captured image Pa generated at such a moment, all of the lamp units of the traffic light appear dark.
Therefore, when an unlit traffic light appears in the captured image Pa, which is the recognition target image, the traffic light recognition device according to this modification recognizes the lamp color in the recognition target image by using the lamp color recognized in a past captured image.
FIG. 5 is a block diagram showing the configuration of the traffic light recognition device according to this modification.
The traffic light recognition device 100a according to this modification includes the components of the traffic light recognition device 100 of the above embodiment, and additionally includes a history holding unit 105 and a flicker processing unit 106. That is, the traffic light recognition device 100a includes the region extraction unit 101, the region division unit 102, the similarity calculation unit 103, the lamp color recognition unit 104, the history holding unit 105, and the flicker processing unit 106.
When the lamp color recognition unit 104 according to this modification determines that a dissimilar region exists in the traffic light region Pb, it outputs lamp color information to the history holding unit 105 and the flicker processing unit 106. That is, lamp color information is output when the traffic light shown in the captured image Pa, which is the recognition target image, is not unlit. On the other hand, when it determines that no dissimilar region exists in the traffic light region Pb, the lamp color recognition unit 104 outputs unlit information, indicating that the traffic light is unlit, to the flicker processing unit 106. That is, unlit information is output when the traffic light shown in the captured image Pa is unlit.
 The history holding unit 105 is a recording medium for holding, as history information, the lamp color information output from the lamp color recognition unit 104. Specifically, the history holding unit 105 is a hard disk, a memory, or the like. The memory may be nonvolatile or volatile, and may be a ROM (Read Only Memory) or a RAM (Random Access Memory).
 When the history holding unit 105 acquires lamp color information from the lamp color recognition unit 104, it holds that lamp color information as history information. Here, when storing lamp color information in the history holding unit 105, the lamp color recognition unit 104 may delete the old lamp color information that is already stored, so that only the latest lamp color information is held in the history holding unit 105.
 When the flicker processing unit 106 acquires lamp color information from the lamp color recognition unit 104, it outputs that lamp color information to the vehicle control unit 240. On the other hand, when the flicker processing unit 106 acquires unlit information from the lamp color recognition unit 104, it reads the history information held in the history holding unit 105 and outputs that history information as the lamp color information for the captured image Pa serving as the recognition target image.
 FIG. 6 is a flowchart showing the processing operation of the traffic light recognition device 100a according to this modification.
 The traffic light recognition device 100a according to this modification executes the processing of steps S101 to S106, as in the processing operation of the flowchart shown in FIG. 4.
 When the lamp color recognition unit 104 of the traffic light recognition device 100a recognizes the lamp color in step S106, it stores lamp color information indicating that lamp color in the history holding unit 105 as history information (step S107). This history information is used when, in the recognition of the lamp color for a future captured image, it is determined that no dissimilar area exists.
 Then, when the flicker processing unit 106 acquires that lamp color information from the lamp color recognition unit 104, it outputs the lamp color information to the outside of the traffic light recognition device 100a (step S108). For example, the flicker processing unit 106 outputs the lamp color information to the vehicle control unit 240.
 If it is determined in step S105 that no dissimilar area exists (No in step S105), the lamp color recognition unit 104 outputs unlit information to the flicker processing unit 106. For example, when an unlit traffic light appears in the captured image Pa serving as the recognition target image, even the smallest similarity is greater than the threshold Th, and it is therefore determined in step S105 that no dissimilar area exists. In such a case, the lamp color recognition unit 104 outputs the unlit information to the flicker processing unit 106. When the flicker processing unit 106 acquires the unlit information, it reads the history information from the history holding unit 105 (step S109), and outputs the read history information as the lamp color information for the recognition target image (step S110).
 As described above, in the traffic light recognition device 100a according to this modification, when the lamp color recognition unit 104 determines that no dissimilar area exists, the flicker processing unit 106 refers to the history information indicating the lamp color recognized for a captured image acquired in the past, and recognizes the lamp color indicated by that history information as the lamp color in the recognition target image.
 Thus, when no dissimilar area exists in the recognition target image, the lamp color recognized for the past captured image is recognized as the lamp color in the recognition target image. Accordingly, flicker can be appropriately prevented from hindering the recognition of the lamp color.
 The past captured image is, for example, the captured image (that is, the frame) immediately preceding the recognition target image, generated by the camera 210 shooting at a constant frame rate. A lamp that was off due to flicker at the timing when the recognition target image was captured is therefore highly likely to have been lit at the timing when the immediately preceding image was captured. Hence, by referring to the history information as described above, the lamp color can be appropriately recognized even when flicker occurs.
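The fallback behavior of the flicker processing unit 106 described above can be sketched as follows. This is a minimal illustration in Python; the function and variable names are hypothetical and not part of the embodiment.

```python
# Minimal sketch of the flicker handling in Modification 1.
# "history" stands in for the history holding unit 105 and keeps only
# the latest lamp color information; the names are illustrative.

def flicker_process(recognized_color, history):
    """Return the lamp color for the recognition target image.

    recognized_color: color recognized for the current frame, or None
                      when no dissimilar area was found (the traffic
                      light appears unlit, e.g. due to flicker).
    history: dict holding the most recent lamp color information.
    """
    if recognized_color is not None:
        # A dissimilar area exists: store the color as the latest
        # history entry and output it.
        history["latest"] = recognized_color
        return recognized_color
    # No dissimilar area: fall back to the color recognized for the
    # previous captured image, if any.
    return history.get("latest")

history = {}
flicker_process("blue", history)   # lit frame: color is output and stored
flicker_process(None, history)     # unlit frame: history is used instead
```

In a real device the fallback would also need to consider how stale the history entry is; this sketch keeps only the single latest entry, matching the variant in which old lamp color information is deleted.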
 (Modification 2)
 Flicker of the traffic light is not the only cause that makes the lamp color in the recognition target image difficult to recognize. For example, noise in the camera 210 or the shooting environment may produce a captured image in which the lamp color is difficult to recognize.
 Therefore, the traffic light recognition device according to this modification recognizes the lamp color in the recognition target image by also using the processing results for a plurality of past captured images.
 FIG. 7 is a block diagram showing the configuration of the traffic light recognition device according to this modification.
 A traffic light recognition device 100b according to this modification includes the components of the traffic light recognition device 100 in the above embodiment, and further includes a history holding unit 105 and a time-series processing unit 107. That is, the traffic light recognition device 100b includes an area extraction unit 101, an area division unit 102, a similarity calculation unit 103, a lamp color recognition unit 104, the history holding unit 105, and the time-series processing unit 107.
 When the lamp color recognition unit 104 according to this modification determines that a dissimilar area exists in the traffic light area Pb, it outputs lamp color information to the history holding unit 105, as in Modification 1. On the other hand, when the lamp color recognition unit 104 determines that no dissimilar area exists in the traffic light area Pb, it recognizes the lamp color as, for example, black, and outputs lamp color information indicating black to the history holding unit 105.
 When the history holding unit 105 acquires lamp color information from the lamp color recognition unit 104, it holds that lamp color information as part of the history information. For example, the history holding unit 105 has a storage capacity for holding L pieces of lamp color information (L is an integer of 3 or more). Therefore, when storing lamp color information in the history holding unit 105, if L old pieces of lamp color information are already stored, the lamp color recognition unit 104 deletes the oldest of them. This secures free space in the history holding unit 105, into which the lamp color recognition unit 104 then stores the latest lamp color information. As a result, the history holding unit 105 always holds the most recent L pieces of lamp color information as the history information.
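The bounded history described above behaves like a fixed-length queue: once L entries are held, storing a new one discards the oldest. A minimal sketch in Python; `collections.deque` with `maxlen` is used here as one possible realization, not the embodiment's actual storage:

```python
from collections import deque

L = 4  # capacity of the history holding unit (illustrative value)

# A deque with maxlen=L automatically drops the oldest entry when a new
# one is appended beyond the capacity, matching the "delete the oldest
# lamp color information, then store the latest" behavior above.
history = deque(maxlen=L)

for frame_color in ["blue", "blue", "black", "blue", "blue"]:
    history.append(frame_color)

# Only the most recent L entries remain.
print(list(history))
```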
 Each time lamp color information is stored in the history holding unit 105, the time-series processing unit 107 reads the history information stored in the history holding unit 105. This history information contains the most recent L pieces of lamp color information. The time-series processing unit 107 updates the lamp color already recognized for the recognition target image, based on the lamp colors indicated by those L pieces of lamp color information. In other words, the lamp color recognition unit 104 makes a provisional determination of the lamp color for the recognition target image, and the time-series processing unit 107 makes the final determination of the lamp color for that recognition target image using the lamp colors recognized for the past (L-1) captured images.
 FIG. 8 is a diagram for explaining the recognition of the lamp color by the traffic light recognition device 100b according to this modification.
 For example, when L = 4, the history holding unit 105 holds, as the history information, lamp color information indicating the lamp color recognized for each of the past four captured images. These four captured images are the (n-4)th frame, the (n-3)th frame, the (n-2)th frame, and the (n-1)th frame. For example, the lamp color information of the (n-1)th frame included in the history information indicates black as the lamp color, and each of the other three pieces of lamp color information indicates blue. Each of these frames is a captured image generated by the camera 210 shooting at a constant frame rate.
 Here, the lamp color recognition unit 104 recognizes the lamp color for the nth frame, which is the latest captured image and the recognition target image. For example, the lamp color recognition unit 104 recognizes blue as the lamp color. The lamp color recognition unit 104 then stores lamp color information indicating the lamp color recognized for the nth frame in the history holding unit 105. At this time, the lamp color recognition unit 104 first deletes the lamp color information of the (n-4)th frame from the history information stored in the history holding unit 105, and then stores the lamp color information indicating the lamp color recognized for the nth frame in the history holding unit 105 as new lamp color information. As a result, the history information of the history holding unit 105 contains the lamp color information of the nth frame instead of that of the (n-4)th frame.
 Next, the time-series processing unit 107 reads the history information stored in the history holding unit 105, and updates the lamp color recognized for the nth frame (for example, blue) based on the lamp colors indicated by the four most recent pieces of lamp color information included in that history information. That is, the time-series processing unit 107 makes the final determination of the lamp color for the nth frame using the lamp colors obtained by the provisional determinations for the (n-3)th, (n-2)th, (n-1)th, and nth frames.
 Specifically, the time-series processing unit 107 takes a majority vote of the lamp colors indicated by the four pieces of lamp color information, and updates the lamp color recognized for the nth frame to the lamp color decided by that majority vote. For example, the time-series processing unit 107 takes a majority vote of the lamp colors recognized for the (n-3)th, (n-2)th, (n-1)th, and nth frames, that is, blue, blue, black, and blue. The time-series processing unit 107 identifies the lamp color decided by this majority vote, namely blue, as the majority lamp color, and updates the lamp color recognized for the nth frame to that majority lamp color.
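The majority vote over the most recent provisional results can be sketched as follows. This is a hedged illustration in Python, not the embodiment's implementation; `Counter.most_common` is one standard way to take a majority vote, and the color names follow the example above:

```python
from collections import Counter

def majority_lamp_color(colors):
    """Return the most frequent lamp color among the given provisional
    results (the 'majority lamp color'). On a tie, the color that was
    counted first wins, which a real device might handle differently."""
    return Counter(colors).most_common(1)[0][0]

# Provisional results for frames (n-3), (n-2), (n-1), n from the
# example: blue, blue, black, blue -> the final determination is blue.
final_color = majority_lamp_color(["blue", "blue", "black", "blue"])
print(final_color)
```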
 FIG. 9 is a flowchart showing the processing operation of the traffic light recognition device 100b according to this modification.
 The traffic light recognition device 100b according to this modification first executes the processing of steps S101 to S106, as in the processing operation of the flowchart shown in FIG. 4.
 Next, when the lamp color recognition unit 104 of the traffic light recognition device 100b recognizes the lamp color in step S106, it stores lamp color information indicating that lamp color in the history holding unit 105 as part of the history information (step S122). If the lamp color recognition unit 104 determines in step S105 that no dissimilar area exists (No in step S105), it recognizes the lamp color for the recognition target image as, for example, black (step S121), and stores lamp color information indicating black in the history holding unit 105 as part of the history information (step S122).
 Next, the time-series processing unit 107 reads the history information stored in the history holding unit 105 (step S123). The time-series processing unit 107 then updates the lamp color in the recognition target image recognized in step S106 or S121, by a majority vote of the four lamp colors indicated by that history information (step S124).
 In the above example, the history holding unit 105 holds history information including four pieces of lamp color information. However, the number of pieces of lamp color information included in the history information is not limited to four; it may be three, or more than four.
 Also, in the above example, the time-series processing unit 107 acquires the lamp color information of the recognition target image by reading the history information from the history holding unit 105, since that information is included in the history information. However, the time-series processing unit 107 may instead acquire the lamp color information of the recognition target image directly from the lamp color recognition unit 104. In this case, the history information read from the history holding unit 105 by the time-series processing unit 107 does not include the lamp color information of the recognition target image, and includes only the lamp color information of the past captured images.
 As described above, in the traffic light recognition device 100b according to this modification, the time-series processing unit 107 refers to the history information indicating the lamp color recognized for each of a plurality of captured images acquired before the recognition target image. The time-series processing unit 107 then identifies, as the majority lamp color, the most frequent color among the lamp color recognized for the recognition target image and the lamp colors in the plurality of captured images indicated by the history information, and updates the lamp color recognized for the recognition target image to that majority lamp color.
 Thus, even if a factor other than flicker, such as noise, suddenly causes it to be determined that no dissimilar area exists in the recognition target image, or causes an incorrect lamp color to be recognized for the recognition target image, the error can be easily corrected.
 (Modification 3)
 As in Modification 2, the traffic light recognition device according to this modification recognizes the lamp color in the recognition target image by also using the processing results for a plurality of past captured images. However, as the processing results for the plurality of past captured images, the traffic light recognition device according to this modification uses not the lamp color recognition results but the similarity calculation results.
 FIG. 10 is a block diagram showing the configuration of the traffic light recognition device according to this modification.
 A traffic light recognition device 100c according to this modification includes a lamp color recognition unit 104c in place of the lamp color recognition unit 104 among the components of the traffic light recognition device 100 in the above embodiment, and further includes a history holding unit 105. That is, the traffic light recognition device 100c includes an area extraction unit 101, an area division unit 102, a similarity calculation unit 103, the lamp color recognition unit 104c, and the history holding unit 105.
 The similarity calculation unit 103 according to this modification calculates the similarity of each of the N lamp areas in the traffic light area Pb. Then, for each position in the traffic light area Pb, the similarity calculation unit 103 outputs similarity information indicating the similarity of the lamp area at that position to the history holding unit 105. That is, the similarity calculation unit 103 stores the similarity information in the history holding unit 105. For example, when N = 3, the similarity information indicates the similarity of the first lamp area Pb1 at the left end of the traffic light area Pb, the similarity of the second lamp area Pb2 at the center of the traffic light area Pb, and the similarity of the third lamp area Pb3 at the right end of the traffic light area Pb.
 The history holding unit 105 holds the similarity information output from the similarity calculation unit 103 as part of the history information. For example, the history holding unit 105 has a storage capacity for holding L pieces of similarity information (L is an integer of 2 or more). Therefore, when storing similarity information in the history holding unit 105, if L old pieces of similarity information are already stored, the similarity calculation unit 103 deletes the oldest of them. This secures free space in the history holding unit 105, into which the similarity calculation unit 103 then stores the latest similarity information. As a result, the history holding unit 105 always holds the most recent L pieces of similarity information as the history information.
 Each time similarity information is stored in the history holding unit 105, the lamp color recognition unit 104c according to this modification reads the history information stored in the history holding unit 105. This history information contains the most recent L pieces of similarity information. The lamp color recognition unit 104c calculates the average of the similarities indicated by those L pieces of similarity information as an average similarity. For example, when N = 3, the similarity information indicates, as described above, the similarity of the lamp area at each of the left end, the center, and the right end of the traffic light area Pb. The lamp color recognition unit 104c therefore calculates the average of the similarities of the left-end lamp area indicated by the L pieces of similarity information, as the average similarity of that lamp area. Similarly, the lamp color recognition unit 104c calculates the average of the similarities of the center lamp area as its average similarity, and the average of the similarities of the right-end lamp area as its average similarity. Then, if the smallest of the three calculated average similarities is, for example, that of the left-end lamp area, the lamp color recognition unit 104c identifies the left-end lamp area in the traffic light area Pb of the recognition target image as the dissimilar area.
 FIG. 11 is a diagram for explaining the recognition of the lamp color by the traffic light recognition device 100c according to this modification.
 For example, when L = 4, the history holding unit 105 holds, as the history information, similarity information indicating the similarities calculated for each of the past four captured images. These four captured images are the (n-4)th frame, the (n-3)th frame, the (n-2)th frame, and the (n-1)th frame, each of which is a captured image generated by the camera 210 shooting at a constant frame rate.
 Here, when N = 3, the similarity calculation unit 103 calculates the similarity of each of the three lamp areas for the nth frame, which is the latest captured image and the recognition target image. The similarity calculation unit 103 then stores similarity information indicating the similarities calculated for the nth frame in the history holding unit 105. At this time, the similarity calculation unit 103 first deletes the similarity information of the (n-4)th frame from the history information stored in the history holding unit 105, and then stores the similarity information indicating the similarities calculated for the nth frame in the history holding unit 105 as new similarity information. As a result, the history information of the history holding unit 105 contains the similarity information of the nth frame instead of that of the (n-4)th frame.
 Next, the lamp color recognition unit 104c reads the history information stored in the history holding unit 105. This history information contains the similarity information of the (n-3)th frame, the (n-2)th frame, the (n-1)th frame, and the nth frame. For example, the similarity information of the (n-3)th frame indicates 30/101/99 as the similarities of the lamp areas at the left end, the center, and the right end of the traffic light area Pb, respectively. For the same positions, the similarity information of the (n-2)th frame indicates 70/111/105, that of the (n-1)th frame indicates 107/110/114, and that of the nth frame indicates 21/112/105.
 The lamp color recognition unit 104c then calculates the average of the similarities of the left-end lamp area in the traffic light area Pb indicated by the similarity information of the four frames. That is, the lamp color recognition unit 104c calculates the average similarity of the left-end lamp area as (30 + 70 + 107 + 21) / 4. Similarly, the lamp color recognition unit 104c calculates the average similarity of the center lamp area as (101 + 111 + 110 + 112) / 4, and the average similarity of the right-end lamp area as (99 + 105 + 114 + 105) / 4.
 As a result, the lamp color recognition unit 104c obtains 57/108/106 as the average similarities of the lamp areas at the left end, the center, and the right end of the traffic light area Pb, respectively.
 The lamp color recognition unit 104c then identifies the lamp area at the position corresponding to the smallest of these average similarities, "57", that is, the left-end lamp area, as the dissimilar area. In other words, the lamp color recognition unit 104c determines that the left-end lamp area among the three lamp areas in the recognition target image exists as a dissimilar area.
 Next, like the lamp color recognition unit 104 in the above embodiment, the lamp color recognition unit 104c recognizes the lamp color in the recognition target image based on the position of the dissimilar area. For example, when the lamp color recognition unit 104c determines, as described above, that the left-end lamp area exists as a dissimilar area, it recognizes blue as the lamp color in the recognition target image.
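The per-position averaging and selection of the dissimilar area can be sketched as follows. This is an illustrative Python sketch, not the embodiment's implementation; the similarity values are those of the example above, and the mapping from position to lamp color (left = blue, center = yellow, right = red) is an assumption for illustration only, following the left-end/blue correspondence described above:

```python
def find_dissimilar_area(similarity_history):
    """similarity_history: list of per-frame similarity triples
    (left, center, right). Returns the per-position average
    similarities and the index of the lamp area with the smallest
    average (the dissimilar area)."""
    n_frames = len(similarity_history)
    averages = [sum(frame[i] for frame in similarity_history) / n_frames
                for i in range(3)]
    return averages, min(range(3), key=lambda i: averages[i])

# Similarities for frames (n-3) .. n from the example above.
history = [(30, 101, 99), (70, 111, 105), (107, 110, 114), (21, 112, 105)]
averages, dissimilar_index = find_dissimilar_area(history)

# Assumed position-to-color mapping (illustrative only).
POSITION_TO_COLOR = {0: "blue", 1: "yellow", 2: "red"}
print(averages, POSITION_TO_COLOR[dissimilar_index])
```

Note that the averages computed here are exact (57.0, 108.5, 105.75), whereas the example in the text rounds them to 57/108/106; the position with the minimum is the same either way.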
 FIG. 12 is a flowchart showing the processing operation of the traffic light recognition device 100c according to this modification.
 The traffic light recognition device 100c according to this modification first executes the processing of steps S101 to S104, as in the processing operation of the flowchart shown in FIG. 4.
 Next, the similarity calculation unit 103 of the traffic light recognition device 100c stores similarity information indicating the similarity of each lamp region calculated in step S104 in the history holding unit 105 (step S104a).
 Next, the lamp color recognition unit 104c reads the history information stored in the history holding unit 105 (step S104b). That is, the most recent L items of similarity information (for example, L = 4) included in the history information are read out. Then, when N = 3, the lamp color recognition unit 104c calculates the average similarity of the lamp regions at the left end, the center, and the right end of the traffic light region Pb (step S104c).
 Next, based on the average similarity calculated for each of the three lamp regions, the lamp color recognition unit 104c recognizes the color of the lit lamp among the three lamps of the traffic light as the lamp color in the recognition target image.
 That is, based on the average similarities, the lamp color recognition unit 104c determines whether, among the three lamp regions, there is a dissimilar region, that is, a lamp region that is not similar to any of the other lamp regions (step S105). If the lamp color recognition unit 104c determines that such a dissimilar region exists (Yes in step S105), it recognizes the lamp color based on the position of the dissimilar region in the traffic light region Pb (step S106).
 In the above example, the lamp color recognition unit 104c calculates the simple average of the similarities of each lamp region across the frames, but it may instead calculate a weighted average as the average similarity. For example, the closer in time a frame (captured image) is to the recognition target image, the larger the weight by which the similarity of the lamp region in that frame is multiplied. This makes it possible to calculate an average similarity that is more appropriate for the recognition target image.
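The weighted variant described here can be sketched as follows; the linearly increasing weights are an illustrative assumption, not values specified in the text.

```python
def weighted_average_similarity(similarities, weights):
    """Weighted average of one lamp region's per-frame similarities.

    similarities: per-frame similarity of the region, oldest first.
    weights: one weight per frame; frames closer in time to the
             recognition target image get larger weights.
    """
    assert len(similarities) == len(weights)
    return sum(s * w for s, w in zip(similarities, weights)) / sum(weights)

# Illustrative weights: the newest frame counts four times as much
# as the oldest.
weights = [1, 2, 3, 4]
left_avg = weighted_average_similarity([30, 70, 107, 21], weights)
# -> 57.5 for these example values
```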
 Also, in the above example, the lamp color recognition unit 104c acquires the similarity information of the recognition target image by reading the history information, which includes it, from the history holding unit 105. However, the lamp color recognition unit 104c may acquire the similarity information of the recognition target image directly from the similarity calculation unit 103. In this case, the history information read from the history holding unit 105 by the lamp color recognition unit 104c does not include the similarity information of the recognition target image but includes only the similarity information of past captured images.
 As described above, in the traffic light recognition device 100c according to this modification, the lamp color recognition unit 104c refers to history information indicating the similarities calculated for each of a plurality of captured images acquired before the recognition target image. For each of the plurality of captured images, this history information indicates, for each position in the traffic light region Pb of that image, the similarity of the lamp region at that position. Then, for each position in the traffic light region, the lamp color recognition unit 104c calculates, as the average similarity, the average of the similarities of the lamp region at that position in the recognition target image and in the plurality of captured images. Furthermore, the lamp color recognition unit 104c determines that, among the N lamp regions in the recognition target image, the lamp region at the position corresponding to the smallest of the average similarities calculated for the respective positions exists as the dissimilar region.
 As a result, even if, due to a factor other than flicker such as noise, it is sporadically determined that no dissimilar region exists in the recognition target image, or an incorrect lamp color is recognized for the recognition target image, the error can be corrected. Moreover, since the dissimilar region is determined based on an average similarity that uses past similarities, the dissimilar region can be determined more accurately than by using previously recognized lamp colors.
 (Modification 4)
 In the above embodiment and Modifications 1 to 3, the traffic light recognition device recognizes the lamp color of a traffic light. However, for a traffic light that has arrow lamps, the traffic light recognition device may recognize not only the lamp color but also the traveling direction indicated by a lit arrow lamp. An arrow lamp is hereinafter referred to as a direction indicator lamp, and a lamp that shows a color such as blue, yellow, or red is hereinafter also referred to as a color lamp.
 FIG. 13 is a diagram for explaining recognition of the traveling direction by the traffic light recognition device according to this modification.
 For example, as shown in FIG. 13, the traffic light includes not only a blue color lamp, a yellow color lamp, and a red color lamp but also a direction indicator lamp that indicates rightward as the traveling direction.
 In such a case, the region extraction unit 101 of the traffic light recognition device extracts, as the traffic light region Pb, the region of the captured image Pa in which the three color lamps and the one direction indicator lamp appear. The region division unit 102 then divides the traffic light region Pb to generate the lamp regions Pb1, Pb2, and Pb3, in which the three color lamps respectively appear, and the lamp region Pb4, in which the direction indicator lamp appears.
 The traffic light recognition device recognizes the lamp color of the traffic light based on the lamp regions Pb1, Pb2, and Pb3, as in the above embodiment and Modifications 1 to 3. If the lamp color is a predetermined color, the traffic light recognition device determines that the direction indicator lamp is not lit. For example, when the lamp color is blue or yellow, the traffic light recognition device determines that the direction indicator lamp is not lit. On the other hand, when the lamp color is red, the traffic light recognition device determines whether the direction indicator lamp is lit using the lamp region Pb4. For example, the traffic light recognition device inputs the feature amount of the lamp region Pb4 into a model such as a neural network generated by machine learning, and determines whether the direction indicator lamp is lit according to the output of the model.
 Alternatively, the traffic light recognition device may calculate a correlation coefficient between the lamp region Pb4 and a region other than the dissimilar region (hereinafter referred to as a similar region) among the lamp regions Pb1, Pb2, and Pb3, and determine whether the correlation coefficient is equal to or greater than a threshold. That is, since a similar region appears dark in the captured image, if the correlation coefficient is equal to or greater than the threshold, the lamp region Pb4 is also dark and the direction indicator lamp is likely to be off; if the correlation coefficient is less than the threshold, the lamp region Pb4 is bright and the direction indicator lamp is likely to be lit. Therefore, the traffic light recognition device determines that the direction indicator lamp is off when the correlation coefficient is equal to or greater than the threshold, and conversely determines that the direction indicator lamp is lit when the correlation coefficient is less than the threshold.
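The correlation test above can be sketched as follows. This is a minimal Python sketch under assumptions: regions are flattened into pixel-value lists, the Pearson correlation coefficient is used, and the threshold of 0.5 and all pixel values are illustrative.

```python
import statistics

def pearson(xs, ys):
    # Pearson correlation coefficient between two pixel-value lists.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def arrow_is_lit(similar_region, arrow_region, threshold=0.5):
    # The similar region appears dark; a high correlation means the
    # arrow region Pb4 is similarly dark, i.e. the arrow is off.
    return pearson(similar_region, arrow_region) < threshold

# Illustrative pixel values for an unlit (dark) similar region,
# an unlit arrow region, and a lit (bright-patterned) arrow region.
dark_region = [10, 20, 10, 20, 10, 20, 10, 20]
unlit_arrow = [11, 21, 11, 21, 11, 21, 11, 21]
lit_arrow = [200, 50, 210, 40, 205, 45, 215, 55]
```

Here `arrow_is_lit(dark_region, unlit_arrow)` is False (the two dark regions correlate strongly), while `arrow_is_lit(dark_region, lit_arrow)` is True.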
 When the traffic light recognition device determines that the direction indicator lamp is lit as described above, it refers to the association information. Like the association information shown in FIG. 3, this association information indicates the colors associated with the positions of the lamp regions Pb1, Pb2, and Pb3, and also indicates the traveling direction associated with the position of the lamp region Pb4. In the association information, the traffic light recognition device recognizes the traveling direction associated with the position of the lamp region Pb4, for example rightward, as the traveling direction of the lit direction indicator lamp.
 Here, traffic lights having a plurality of direction indicator lamps are also installed on roads. Such a traffic light lights only one of the direction indicator lamps, or lights two or more of them simultaneously.
 For such a traffic light as well, the traffic light recognition device recognizes the lamp color of the traffic light based on the lamp regions of the N lamps, as in the above embodiment and Modifications 1 to 3. Based on that lamp color, the traffic light recognition device narrows down, from the M (M is an integer of 1 or more) direction indicator lamps, the candidates for the lit direction indicator lamp. For example, if the lamp color is blue, the traffic light recognition device narrows the candidates to the direction indicator lamp indicating the straight-ahead direction, and if the lamp color is red, it narrows the candidates to the direction indicator lamp indicating the rightward direction. Then, as described above, the traffic light recognition device determines whether a candidate direction indicator lamp is lit using a model, a correlation coefficient, or the like.
 As described above, in this modification, when the traffic light has M (M is an integer of 1 or more) direction indicator lamps, each of which is a lamp indicating a traveling direction of the vehicle, the traffic light recognition device recognizes the traveling direction indicated by a lit direction indicator lamp. That is, the region division unit 102 divides the traffic light region Pb to generate N lamp regions and M lamp regions in which the direction indicator lamps respectively appear. The traffic light recognition device then further recognizes, based on the lamp color recognized for the recognition target image and the feature amounts of the M lamp regions, the traveling direction indicated by at least one lit direction indicator lamp among the M direction indicator lamps. Alternatively, the traffic light recognition device may first recognize the traveling direction indicated by at least one lit direction indicator lamp among the M direction indicator lamps and then recognize the lamp color of the color lamps based on that recognition result. Note that the above feature amount may be the vector of the above embodiment or may be a correlation coefficient.
 This makes it possible to appropriately recognize which of the M direction indicator lamps, which indicate traveling directions by arrows or the like, is lit. For example, when the direction indicator lamps that may be lit are predetermined for each lamp color of the traffic light, the recognized lamp color can be used to narrow down, from the M direction indicator lamps, the candidates for the lit direction indicator lamp. Furthermore, for example, by inputting the feature amounts of the M lamp regions into a model such as a neural network generated by machine learning, the lit direction indicator lamp can be appropriately recognized from among the narrowed-down candidates.
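The color-based narrowing of arrow candidates can be sketched as follows; the color-to-direction mapping is a hypothetical example of the predetermined correspondence mentioned above (e.g. a red light accompanied by a right arrow), not a rule stated in the text.

```python
# Hypothetical mapping: which arrow directions may be lit for each
# recognized lamp color.
CANDIDATES_BY_COLOR = {
    "blue": ["straight"],
    "yellow": [],
    "red": ["right"],
}

def candidate_arrows(lamp_color, installed_arrows):
    """Narrow the M installed direction indicator lamps down to the
    ones that may currently be lit, given the recognized lamp color."""
    allowed = CANDIDATES_BY_COLOR.get(lamp_color, [])
    return [arrow for arrow in installed_arrows if arrow in allowed]

# A traffic light with M = 2 arrow lamps:
print(candidate_arrows("red", ["straight", "right"]))  # ['right']
```

Only the surviving candidates then need to be checked with the model or the correlation coefficient, as described above.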
 (Modification 5)
 In the above embodiment and its modifications, when the similarity of each of the N lamp regions is calculated, that lamp region is compared individually with each of the remaining (N-1) lamp regions. In this modification, by contrast, the similarity of a lamp region is calculated by comparing it with the group consisting of the remaining (N-1) lamp regions.
 FIG. 14 is a diagram showing an example of the method of calculating the similarity of the first lamp region Pb1 in the above embodiment.
 In the above embodiment, the similarity calculation unit 103 divides the first lamp region Pb1 into a plurality of blocks each consisting of n × n pixels, as described above. Then, for each block of the first lamp region Pb1, the similarity calculation unit 103 searches the second lamp region Pb2 for the block most similar to the image of that block. A block is also called a patch.
 For example, as shown in (a) of FIG. 14, the similarity calculation unit 103 searches the second lamp region Pb2 for the block most similar to the image of the block B11 at the upper left corner of the first lamp region Pb1. The similarity calculation unit 103 then calculates the inter-vector distance (for example, "5") between the block B11 and the most similar block found in the second lamp region Pb2. A larger inter-vector distance indicates that the two blocks are less similar, and a smaller inter-vector distance indicates that they are more similar. The inter-vector distance can therefore be regarded as a dissimilarity.
 Next, as shown in (b) of FIG. 14, the similarity calculation unit 103 performs the same processing on the block B12, which is next to the block B11. That is, the similarity calculation unit 103 searches the second lamp region Pb2 for the block most similar to the image of the block B12, the second block from the upper left corner of the first lamp region Pb1 toward the right. The similarity calculation unit 103 then calculates the inter-vector distance between the block B12 and the most similar block found in the second lamp region Pb2.
 By repeating the processing shown in (a) and (b) of FIG. 14, the similarity calculation unit 103 calculates the inter-vector distance (that is, the dissimilarity) for each block included in the first lamp region Pb1, as shown in (c) of FIG. 14.
 The similarity calculation unit 103 calculates the average or the sum of the dissimilarities of these blocks and then calculates the reciprocal of that result as the correlation coefficient k12 of the first lamp region Pb1 with respect to the second lamp region Pb2. Similarly, the similarity calculation unit 103 calculates the correlation coefficient k13 of the first lamp region Pb1 with respect to the third lamp region Pb3, and calculates the larger of the correlation coefficients k12 and k13 as the similarity k1 of the first lamp region Pb1.
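The block-matching computation of FIG. 14 can be sketched as follows. This is a simplified Python sketch: each region is represented as a list of block (patch) vectors, the Euclidean distance serves as the inter-vector distance, and the correlation coefficient is the reciprocal of the mean distance, as described above; all data values are illustrative.

```python
def vector_distance(a, b):
    # Euclidean distance between two block (patch) vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def correlation(region_a, region_b):
    # For each block of region_a, find the most similar block of
    # region_b, then take the reciprocal of the mean distance
    # (dissimilarity) as the correlation coefficient.
    dists = [min(vector_distance(blk, other) for other in region_b)
             for blk in region_a]
    mean = sum(dists) / len(dists)
    return 1.0 / mean if mean > 0 else float("inf")

def similarity(region, other_regions):
    # Similarity k1 of a region: the largest of its correlation
    # coefficients against each of the other regions (e.g. k12, k13).
    return max(correlation(region, other) for other in other_regions)

# Illustrative block vectors: Pb1 and Pb2 are nearly identical (both
# unlit), while Pb3 differs strongly (lit).
pb1 = [[0, 0], [1, 1]]
pb2 = [[0, 1], [1, 2]]
pb3 = [[10, 10], [20, 20]]
k1 = similarity(pb1, [pb2, pb3])  # dominated by the Pb1-Pb2 match
```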
 Thus, in the above embodiment, in calculating the similarity of the first lamp region Pb1, the correlation coefficient k12 of the first lamp region Pb1 with respect to the second lamp region Pb2 is calculated as a first similarity by comparing the first lamp region Pb1 with the second lamp region Pb2. Furthermore, the correlation coefficient k13 of the first lamp region Pb1 with respect to the third lamp region Pb3 is calculated as a second similarity by comparing the first lamp region Pb1 with the third lamp region Pb3. The larger of the first similarity and the second similarity is then calculated as the similarity of the first lamp region Pb1. Therefore, in calculating the similarity of the first lamp region Pb1, the first lamp region Pb1 is compared individually with each of the second lamp region Pb2 and the third lamp region Pb3. The similarities of the second lamp region Pb2 and the third lamp region Pb3 are calculated in the same manner as the similarity of the first lamp region Pb1.
 In this modification, when calculating the similarity of the first lamp region Pb1, the similarity calculation unit 103 does not compare the first lamp region Pb1 individually with each of the second lamp region Pb2 and the third lamp region Pb3. Instead, the similarity calculation unit 103 calculates the similarity of the first lamp region Pb1 by comparing it with the group consisting of the second lamp region Pb2 and the third lamp region Pb3.
 FIG. 15 is a diagram showing an example of the method of calculating the similarity of each lamp region according to this modification.
 In calculating the similarity of the first lamp region Pb1, the similarity calculation unit 103 compares the first lamp region Pb1 with the first group G1, which consists of the second lamp region Pb2 and the third lamp region Pb3, as shown in (a) of FIG. 15. That is, the similarity calculation unit 103 divides the first lamp region Pb1 into a plurality of blocks and, for each block of the first lamp region Pb1, searches the first group G1 for the block most similar to the image of that block. Then, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains, through this search, the inter-vector distance of each block included in the first lamp region Pb1 and, based on those inter-vector distances, calculates the correlation coefficient of the first lamp region Pb1 as its similarity.
 In calculating the similarity of the second lamp region Pb2, the similarity calculation unit 103 compares the second lamp region Pb2 with the second group G2, which consists of the first lamp region Pb1 and the third lamp region Pb3, as shown in (b) of FIG. 15. That is, the similarity calculation unit 103 divides the second lamp region Pb2 into a plurality of blocks and, for each block of the second lamp region Pb2, searches the second group G2 for the block most similar to the image of that block. Then, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains, through this search, the inter-vector distance of each block included in the second lamp region Pb2 and, based on those inter-vector distances, calculates the correlation coefficient of the second lamp region Pb2 as its similarity.
 In calculating the similarity of the third lamp region Pb3, the similarity calculation unit 103 compares the third lamp region Pb3 with the third group G3, which consists of the first lamp region Pb1 and the second lamp region Pb2, as shown in (c) of FIG. 15. That is, the similarity calculation unit 103 divides the third lamp region Pb3 into a plurality of blocks and, for each block of the third lamp region Pb3, searches the third group G3 for the block most similar to the image of that block. Then, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains, through this search, the inter-vector distance of each block included in the third lamp region Pb3 and, based on those inter-vector distances, calculates the correlation coefficient of the third lamp region Pb3 as its similarity.
 In this way, the similarity of each of the first lamp region Pb1, the second lamp region Pb2, and the third lamp region Pb3 is calculated using a group. For example, if the lamp shown in the first lamp region Pb1 is lit and the other lamps are off, a low similarity is calculated only for the first lamp region Pb1, and high similarities are calculated for each of the second lamp region Pb2 and the third lamp region Pb3. Therefore, even with the calculation method shown in FIG. 15, an appropriate similarity can be calculated for each lamp region, as in the above embodiment.
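The group comparison of FIG. 15 can be sketched as follows, reusing the same block representation as above: the blocks of the remaining (N-1) regions are pooled into one group, and each block of the region under evaluation is matched against that pool. Again, this is a simplified sketch with illustrative values, not the device's actual implementation.

```python
def vector_distance(a, b):
    # Euclidean distance between two block vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def group_similarity(region, other_regions):
    # Pool the blocks of the remaining (N-1) regions into one group,
    # match each block of `region` to its most similar block in the
    # pool, and take the reciprocal of the mean distance as the
    # similarity (correlation coefficient).
    group = [blk for other in other_regions for blk in other]
    dists = [min(vector_distance(blk, g) for g in group) for blk in region]
    mean = sum(dists) / len(dists)
    return 1.0 / mean if mean > 0 else float("inf")

# Illustrative blocks: Pb1 shows a lit (bright) lamp, while Pb2 and
# Pb3 show unlit (dark) lamps.
pb1 = [[200, 210], [190, 205]]
pb2 = [[10, 12], [11, 9]]
pb3 = [[9, 11], [12, 10]]
# Only the lit region receives a low similarity against its group:
low = group_similarity(pb1, [pb2, pb3])
high = group_similarity(pb2, [pb1, pb3])
```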
 FIG. 16 is a diagram showing another example of the method of calculating the similarity of each lamp region according to this modification.
 In the example shown in FIG. 15, the lamp region whose similarity is to be calculated is divided into a plurality of blocks; conversely, the group may be divided into a plurality of blocks instead.
 For example, in calculating the similarity of the third lamp region Pb3, the similarity calculation unit 103 compares the third group G3, which consists of the first lamp region Pb1 and the second lamp region Pb2, with the third lamp region Pb3, as shown in (a) of FIG. 16. That is, the similarity calculation unit 103 divides the third group G3 into a plurality of blocks and, for each block of the third group G3, searches the third lamp region Pb3 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains, through this search, the inter-vector distance of each block included in the third group G3. Then, based on the inter-vector distances of the plurality of blocks of the third group G3, the similarity calculation unit 103 calculates the correlation coefficient of the third group G3 as the similarity of the third lamp region Pb3.
 Also, in calculating the similarity of the second lamp region Pb2, the similarity calculation unit 103 compares the second group G2, which consists of the third lamp region Pb3 and the first lamp region Pb1, with the second lamp region Pb2, as shown in (b) of FIG. 16. That is, the similarity calculation unit 103 divides the second group G2 into a plurality of blocks and, for each block of the second group G2, searches the second lamp region Pb2 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains, through this search, the inter-vector distance of each block included in the second group G2. Then, based on the inter-vector distances of the plurality of blocks of the second group G2, the similarity calculation unit 103 calculates the correlation coefficient of the second group G2 as the similarity of the second lamp region Pb2.
 Also, in calculating the similarity of the first lamp region Pb1, the similarity calculation unit 103 compares the first group G1, which consists of the third lamp region Pb3 and the second lamp region Pb2, with the first lamp region Pb1, as shown in (c) of FIG. 16. That is, the similarity calculation unit 103 divides the first group G1 into a plurality of blocks and, for each block of the first group G1, searches the first lamp region Pb1 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains, through this search, the inter-vector distance of each block included in the first group G1. Then, based on the inter-vector distances of the plurality of blocks of the first group G1, the similarity calculation unit 103 calculates the correlation coefficient of the first group G1 as the similarity of the first lamp region Pb1.
 Thus, in the example shown in FIG. 16, as in the example shown in FIG. 15, the similarity of each of the first lamp region Pb1, the second lamp region Pb2, and the third lamp region Pb3 is calculated using a group. For example, if the lamp shown in the first lamp region Pb1 is lit and the other lamps are off, a low similarity is calculated only for the first lamp region Pb1, and high similarities are calculated for each of the second lamp region Pb2 and the third lamp region Pb3. Therefore, even with the calculation method shown in FIG. 16, an appropriate similarity can be calculated for each lamp region, as in the above embodiment.
 図17は、本変形例に係る4つの灯部領域(N=4)の類似度の算出方法の一例を示す図である。 FIG. 17 is a diagram illustrating an example of a method for calculating the similarity of the four lamp areas (N = 4) according to the present modification.
 For example, the camera 210 photographs a traffic light having four lamps (that is, N = 4). As a result, the traffic light area of the recognition target image obtained by the photographing is divided into four lamp areas, namely a first lamp area Pb1, a second lamp area Pb2, a third lamp area Pb3, and a fourth lamp area Pb4.
 In calculating the similarity of the fourth lamp area Pb4, the similarity calculation unit 103 compares the fourth group Gp4, consisting of the first lamp area Pb1, the second lamp area Pb2, and the third lamp area Pb3, with the fourth lamp area Pb4, as shown in (a) of FIG. 17. That is, the similarity calculation unit 103 divides the fourth group Gp4 into a plurality of blocks and, for each block of the fourth group Gp4, searches the fourth lamp area Pb4 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains, through this search, the inter-vector distance of each block included in the fourth group Gp4. The similarity calculation unit 103 then calculates, based on the inter-vector distances of the fourth group Gp4, the correlation coefficient of the fourth group Gp4 as the similarity of the fourth lamp area Pb4.
 In calculating the similarity of the first lamp area Pb1, the similarity calculation unit 103 compares the first group Gp1, consisting of the second lamp area Pb2, the third lamp area Pb3, and the fourth lamp area Pb4, with the first lamp area Pb1, as shown in (b) of FIG. 17. That is, the similarity calculation unit 103 divides the first group Gp1 into a plurality of blocks and, for each block of the first group Gp1, searches the first lamp area Pb1 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains, through this search, the inter-vector distance of each block included in the first group Gp1. The similarity calculation unit 103 then calculates, based on the inter-vector distances of the first group Gp1, the correlation coefficient of the first group Gp1 as the similarity of the first lamp area Pb1.
 In calculating the similarity of the second lamp area Pb2, the similarity calculation unit 103 compares the second group Gp2, consisting of the third lamp area Pb3, the fourth lamp area Pb4, and the first lamp area Pb1, with the second lamp area Pb2, as shown in (c) of FIG. 17. That is, the similarity calculation unit 103 divides the second group Gp2 into a plurality of blocks and, for each block of the second group Gp2, searches the second lamp area Pb2 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains, through this search, the inter-vector distance of each block included in the second group Gp2. The similarity calculation unit 103 then calculates, based on the inter-vector distances of the second group Gp2, the correlation coefficient of the second group Gp2 as the similarity of the second lamp area Pb2.
 In calculating the similarity of the third lamp area Pb3, the similarity calculation unit 103 compares the third group Gp3, consisting of the fourth lamp area Pb4, the first lamp area Pb1, and the second lamp area Pb2, with the third lamp area Pb3, as shown in (d) of FIG. 17. That is, the similarity calculation unit 103 divides the third group Gp3 into a plurality of blocks and, for each block of the third group Gp3, searches the third lamp area Pb3 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains, through this search, the inter-vector distance of each block included in the third group Gp3. The similarity calculation unit 103 then calculates, based on the inter-vector distances of the third group Gp3, the correlation coefficient of the third group Gp3 as the similarity of the third lamp area Pb3.
 Thus, in the example shown in FIG. 17 as well, as in the examples shown in FIGS. 15 and 16, the similarity of each lamp area (here, the first to fourth lamp areas Pb1 to Pb4) is calculated using a group. For example, if the lamp shown in the first lamp area Pb1 is lit and the other lamps are off, a low similarity is calculated only for the first lamp area Pb1, and a high similarity is calculated for each of the other lamp areas Pb2 to Pb4. Therefore, even with the calculation method shown in FIG. 17, an appropriate similarity can be calculated for each lamp area, as in the above embodiment.
 That is, in the present modification, as shown in FIGS. 15 to 17, the similarity calculation unit 103 calculates, for each of the N lamp areas, the similarity of that lamp area by comparing the image of that lamp area with the image of a group consisting of at least one other lamp area. For example, as shown in FIG. 15, when comparing the image of a lamp area with the image of the group, the similarity calculation unit 103 calculates the similarity of that lamp area by searching the group, for each block included in the lamp area, for the block most similar to the image of that block. Alternatively, as shown in FIGS. 16 and 17, when comparing the image of a lamp area with the image of the group, the similarity calculation unit 103 calculates the similarity of that lamp area by searching the lamp area, for each block included in the group, for the block most similar to the image of that block.
 Thus, even when the group consists of a plurality of lamp areas, the effort of individually comparing a lamp area with each of the other lamp areas can be saved when calculating its similarity. That is, by comparing a lamp area with the group consisting of the other (N−1) lamp areas, the similarity of that lamp area can be calculated easily. In addition, image regions of mutually different sizes (for example, a lamp area and a group) can be compared appropriately in the similarity calculation.
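To make the flow concrete, the remaining pieces, a scalar similarity per lamp area derived from the per-block distances and the selection of the dissimilar (lit) area, can be sketched as below. The specification derives a correlation coefficient from the inter-vector distances; the simple distance-to-similarity mapping and the threshold value used here are illustrative assumptions only.

```python
def similarity_from_distances(distances):
    """Collapse per-block inter-vector distances into one similarity
    in (0, 1]; identical regions (all distances 0) give 1.0.
    Illustrative stand-in for the correlation coefficient."""
    return 1.0 / (1.0 + sum(distances) / len(distances))

def find_dissimilar_area(similarities, threshold=0.5):
    """Return the index of the lamp area with the lowest similarity,
    or None if even that similarity exceeds the threshold
    (cf. claims 7 and 8)."""
    idx = min(range(len(similarities)), key=similarities.__getitem__)
    return idx if similarities[idx] <= threshold else None
```

The returned index would then be mapped, via association information such as that of claim 5, to the lighting color at that position in the traffic light area.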
 (Other variations)
 In the above embodiment and each modification, each component included in the traffic light recognition device may be realized by dedicated hardware or by executing a software program suited to that component. Each component may be realized by a program execution unit, such as a CPU or a processor, reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory. Here, the software that realizes the traffic light recognition device in the above embodiment and each modification causes a computer to execute the steps included in the flowcharts shown in FIGS. 4, 6, 9, and 12.
 The traffic light recognition device according to one or more aspects has been described above based on the embodiment and its modifications, but the present disclosure is not limited to this embodiment. Forms obtained by applying various modifications conceived by those skilled in the art to the present embodiment or the modifications, and forms constructed by combining components of the embodiment and the modifications, may also be included within the scope of the present disclosure, as long as they do not depart from the gist of the present disclosure.
 For example, in the above embodiment and each modification, the area extraction unit 101 geometrically detects the traffic light area Pb in the captured image Pa using the position of the vehicle 10, the position and form of the traffic light, and map information. However, the area extraction unit 101 may detect the traffic light area Pb using another detection method instead. For example, the area extraction unit 101 may detect the traffic light area Pb by image recognition based on machine learning, or by pattern matching.
 Also, in the above embodiment and each modification, the traffic light recognition device recognizes the lighting color of a traffic light having three lamps of mutually different colors, that is, the lighting color in the case of N = 3. However, the number N of lamps may be any number of three or more. The colors of the N lamps may be colors other than blue, yellow, and red, and two or more of the N lamps may be the same color.
 The traffic light recognition devices shown in FIGS. 1, 5, 7, and 10 may each be configured by a processor and a memory. That is, among the components of the traffic light recognition device, the components other than the history holding unit 105 are realized by the processor. The processor realizes those components by executing a program stored in the memory. The memory may also be configured as the history holding unit 105. That is, the memory may store a program for controlling the processor, and may store the history information.
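The history-based smoothing handled by the history holding unit 105 and the time-series processing unit 107 (updating the recognized lighting color to the most frequent color across recent frames, as in claim 10) could be sketched as follows; the function name and the use of `collections.Counter` are choices made for this example, not taken from the specification.

```python
from collections import Counter

def majority_light_color(current_color, history):
    """Return the most frequent lighting color among the color
    recognized for the current frame and the colors recorded in the
    history information for earlier frames."""
    counts = Counter(history)
    counts[current_color] += 1  # include the current frame's color
    return counts.most_common(1)[0][0]
```

A single misrecognized frame (for example, due to flicker of an LED lamp) is thereby outvoted by the surrounding frames.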
 The present disclosure can be used for a traffic light recognition device that is mounted on, for example, an autonomous vehicle and recognizes a traffic light on the travel route of the vehicle.
 DESCRIPTION OF REFERENCE SIGNS
 10 vehicle
 100, 100a, 100b, 100c traffic light recognition device
 101 area extraction unit
 102 area division unit
 103 similarity calculation unit
 104, 104c lighting color recognition unit
 105 history holding unit
 106 flicker processing unit
 107 time-series processing unit
 210 camera
 220 map storage unit
 230 position detection system
 240 vehicle control unit

Claims (18)

  1.  A traffic light recognition device comprising:
     a processor; and
     a memory,
     wherein the processor, using the memory,
     extracts, from a recognition target image acquired by a sensor, an area in which a traffic light having N lamps (N is an integer of 3 or more) is shown, as a traffic light area, and
     calculates, from the traffic light area, similarities of a plurality of images in each of which a mutually different lamp is shown.
  2.  The traffic light recognition device according to claim 1, wherein the processor
     generates N lamp areas as the plurality of images, and
     calculates a similarity of each of the N lamp areas.
  3.  The traffic light recognition device according to claim 2, wherein the processor
     determines, based on the similarity calculated for each of the N lamp areas, whether a dissimilar area exists among the N lamp areas, the dissimilar area being a lamp area that is not similar to any other lamp area.
  4.  The traffic light recognition device according to claim 3, wherein the processor further,
     when determining that the dissimilar area exists, recognizes a color associated in advance with the position of the dissimilar area in the traffic light area as a lighting color in the recognition target image.
  5.  The traffic light recognition device according to claim 4, wherein, in recognizing the lighting color, the processor
     refers to association information indicating a color associated with the position of each of the N lamp areas in the traffic light area, and
     recognizes the color associated with the position of the dissimilar area in the association information as the lighting color in the recognition target image.
  6.  The traffic light recognition device according to any one of claims 3 to 5, wherein, in calculating the similarity, the processor
     calculates, for each lamp area in the recognition target image, the similarity of the lamp area by selecting, as the similarity, the largest correlation coefficient among correlation coefficients between the lamp area and each of at least two other lamp areas.
  7.  The traffic light recognition device according to any one of claims 3 to 6, wherein, in determining whether the dissimilar area exists, the processor
     determines that the lamp area having the smallest similarity among the similarities of the N lamp areas exists as the dissimilar area.
  8.  The traffic light recognition device according to any one of claims 3 to 6, wherein, in determining whether the dissimilar area exists, the processor
     determines that the lamp area having the smallest similarity exists as the dissimilar area when the smallest similarity among the similarities of the N lamp areas is less than or equal to a threshold.
  9.  The traffic light recognition device according to claim 4, wherein, in recognizing the lighting color, when determining that the dissimilar area does not exist, the processor
     refers to history information indicating a lighting color recognized for an image acquired in the past, and
     recognizes the lighting color indicated by the history information as the lighting color in the recognition target image.
  10.  The traffic light recognition device according to claim 4, wherein the processor further
     refers to history information indicating a lighting color recognized for each of a plurality of images acquired before the recognition target image,
     identifies, as a majority lighting color, the most frequent lighting color among the lighting color recognized for the recognition target image and the lighting colors in the plurality of images indicated by the history information, and
     updates the lighting color recognized for the recognition target image to the majority lighting color.
  11.  The traffic light recognition device according to any one of claims 3 to 6, wherein, in determining whether the dissimilar area exists, the processor
     refers, for each of a plurality of images acquired before the recognition target image, to history information indicating, for each position in the traffic light area in the image, the similarity of the lamp area at that position,
     calculates, for each position in the traffic light area, an average similarity that is the average of the similarities of the lamp areas at that position in the recognition target image and the plurality of images, and
     determines that, among the N lamp areas in the recognition target image, the lamp area at the position corresponding to the smallest average similarity among the average similarities calculated for the respective positions exists as the dissimilar area.
  12.  The traffic light recognition device according to claim 4, wherein, when the traffic light further has M direction indicator lamps (M is an integer of 1 or more), each being a lamp indicating a traveling direction of a vehicle, the processor,
     in generating the N lamp areas, generates, by dividing the traffic light area, the N lamp areas and M lamp areas in each of which a direction indicator lamp is shown, and
     further recognizes, based on the lighting color recognized for the recognition target image and a feature amount of each of the M lamp areas, the traveling direction indicated by at least one direction indicator lamp that is lit among the M direction indicator lamps.
  13.  The traffic light recognition device according to any one of claims 3 to 11, wherein, when the traffic light further has M direction indicator lamps (M is an integer of 1 or more), each being a lamp indicating a traveling direction of a vehicle, the processor,
     in generating the N lamp areas, generates, by dividing the traffic light area, the N lamp areas and M lamp areas in each of which a direction indicator lamp is shown, and
     further recognizes, based on a feature amount of each of the M lamp areas, the traveling direction indicated by at least one direction indicator lamp that is lit among the M direction indicator lamps.
  14.  The traffic light recognition device according to any one of claims 3 to 13, wherein, in calculating the similarity, the processor
     calculates, for each of the N lamp areas, the similarity of the lamp area by comparing the image of the lamp area with an image of a group consisting of at least one other lamp area other than the lamp area.
  15.  The traffic light recognition device according to claim 14, wherein, for each of the N lamp areas, when comparing the image of the lamp area with the image of the group, the processor
     calculates the similarity of the lamp area by searching the group, for each block included in the lamp area, for the block most similar to the image of that block.
  16.  The traffic light recognition device according to claim 14, wherein, for each of the N lamp areas, when comparing the image of the lamp area with the image of the group, the processor
     calculates the similarity of the lamp area by searching the lamp area, for each block included in the group, for the block most similar to the image of that block.
  17.  A traffic light recognition method comprising:
     extracting, from a recognition target image acquired by a sensor, an area in which a traffic light having N lamps (N is an integer of 3 or more) is shown, as a traffic light area;
     generating, from the traffic light area, N lamp areas in each of which a mutually different lamp is shown;
     calculating a similarity of each of the N lamp areas; and
     determining, based on the similarity calculated for each of the N lamp areas, whether a dissimilar area exists among the N lamp areas, the dissimilar area being a lamp area that is not similar to any other lamp area.
  18.  A program for causing a computer to execute:
     extracting, from a recognition target image acquired by a sensor, an area in which a traffic light having N lamps (N is an integer of 3 or more) is shown, as a traffic light area;
     generating, from the traffic light area, N lamp areas in each of which a mutually different lamp is shown;
     calculating a similarity of each of the N lamp areas; and
     determining, based on the similarity calculated for each of the N lamp areas, whether a dissimilar area exists among the N lamp areas, the dissimilar area being a lamp area that is not similar to any other lamp area.
PCT/JP2019/010260 2018-03-14 2019-03-13 Traffic signal recognizing device, traffic signal recognition method, and program WO2019177019A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-047116 2018-03-14
JP2018047116 2018-03-14

Publications (1)

Publication Number Publication Date
WO2019177019A1 (en) 2019-09-19

Family

ID=67907774

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/010260 WO2019177019A1 (en) 2018-03-14 2019-03-13 Traffic signal recognizing device, traffic signal recognition method, and program

Country Status (1)

Country Link
WO (1) WO2019177019A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003016429A (en) * 2001-06-28 2003-01-17 Honda Motor Co Ltd Vehicle periphery monitor device
JP2013186507A (en) * 2012-03-05 2013-09-19 Honda Motor Co Ltd Vehicle periphery monitoring device
JP2013242686A (en) * 2012-05-21 2013-12-05 Nissan Motor Co Ltd Traffic signal detection device and traffic signal detection method
JP2017022492A (en) * 2015-07-08 2017-01-26 オムロン株式会社 Image processing apparatus, traffic management system with the same, and image processing method


Legal Events

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 19767081
    Country of ref document: EP
    Kind code of ref document: A1
NENP: Non-entry into the national phase
    Ref country code: DE
122 Ep: PCT application non-entry in European phase
    Ref document number: 19767081
    Country of ref document: EP
    Kind code of ref document: A1
NENP: Non-entry into the national phase
    Ref country code: JP