WO2019177019A1 - Traffic signal recognition device, traffic signal recognition method, and program - Google Patents

Traffic signal recognition device, traffic signal recognition method, and program

Info

Publication number
WO2019177019A1
Authority
WO
WIPO (PCT)
Prior art keywords
lamp
area
similarity
color
traffic light
Prior art date
Application number
PCT/JP2019/010260
Other languages
English (en)
Japanese (ja)
Inventor
良介 後藤
サヒム コルコス
村田 久治
本村 秀人
Original Assignee
Panasonic IP Management Co., Ltd. (パナソニックIpマネジメント株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd. (パナソニックIpマネジメント株式会社)
Publication of WO2019177019A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/90 — Determination of colour characteristics

Definitions

  • The present disclosure relates to a traffic signal recognition device that recognizes traffic lights.
  • This traffic light recognition device acquires an image generated by photographing a traffic light, and recognizes the light color of the traffic light based on the image.
  • The technique of Patent Document 1 has a problem: in order to recognize the traffic light properly, it requires an image of the traffic light generated by photographing under severe restrictions.
  • The present disclosure therefore provides a traffic light recognition device that can relax the restrictions on photographing a traffic light.
  • A traffic signal recognition apparatus according to one aspect includes a processor and a memory. Using the memory, the processor extracts, from a recognition target image acquired by a sensor, an area in which a traffic light having N lamp parts (N is an integer of 3 or more) is displayed as a traffic light area, and calculates, from the traffic light area, similarities between a plurality of images in which different lamp parts are displayed.
  • The traffic signal recognition device of the present disclosure can thus ease the restrictions on photographing a traffic light.
  • FIG. 1 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to an embodiment.
  • FIG. 2 is a diagram for explaining processing of the region extraction unit, the region division unit, and the similarity calculation unit in the embodiment.
  • FIG. 3 is a diagram for explaining processing of the light color recognition unit in the embodiment.
  • FIG. 4 is a flowchart showing the processing operation of the traffic signal recognition apparatus in the embodiment.
  • FIG. 5 is a block diagram illustrating a configuration of the traffic signal recognition apparatus according to the first modification of the embodiment.
  • FIG. 6 is a flowchart illustrating the processing operation of the traffic signal recognition apparatus according to the first modification of the embodiment.
  • FIG. 7 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to Modification 2 of the embodiment.
  • FIG. 8 is a diagram for explaining the recognition of the light color by the traffic light recognition apparatus according to the second modification of the embodiment.
  • FIG. 9 is a flowchart illustrating the processing operation of the traffic signal recognition apparatus according to the second modification of the embodiment.
  • FIG. 10 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to the third modification of the embodiment.
  • FIG. 11 is a diagram for explaining the recognition of the light color by the traffic light recognition apparatus according to the third modification of the embodiment.
  • FIG. 12 is a flowchart illustrating the processing operation of the traffic signal recognition apparatus according to the third modification of the embodiment.
  • FIG. 13 is a diagram for explaining recognition of the traveling direction by the traffic light recognition apparatus according to the fourth modification of the embodiment.
  • FIG. 14 is a diagram illustrating an example of a method of calculating the similarity of the first lamp unit region in the embodiment.
  • FIG. 15 is a diagram illustrating an example of a method for calculating the similarity of each lamp area according to the fifth modification of the embodiment.
  • FIG. 16 is a diagram illustrating another example of a method for calculating the similarity of each lamp unit region according to the fifth modification of the embodiment.
  • The traffic signal recognition device of Patent Document 1 acquires a color feature image generated by photographing at a predetermined shutter speed. From the color feature image, it extracts a circular area showing the same color as a lighting color of the traffic light as a color feature candidate area. It then acquires a shape feature image generated by photographing at a shutter speed determined from the average luminance around the color feature candidate area, and extracts, from the periphery of the color feature candidate area in the shape feature image, an area that matches the shape of a predetermined traffic light as a shape feature candidate area. The shape feature candidate area extracted in this way is recognized as the traffic light area, and the color of the color feature candidate area is recognized as the lighting color of the traffic light, that is, the light color.
  • In this traffic light recognition device, a color feature image generated by photographing at a predetermined shutter speed is necessary in order to recognize the light color of the traffic light.
  • The predetermined shutter speed is set so that the lighting color of the traffic light displayed in the color feature image appears clearly. That is, the predetermined shutter speed must be adjusted to suit the situation or environment so that color saturation does not occur in the color feature image.
  • In addition, a template indicating the shape of the traffic light is required for the shape search. Since this search is performed based on the position of the color feature candidate area, a template is required for each position of the lighting color in the traffic light; therefore, multiple types of templates are necessary.
  • A traffic light recognition apparatus according to one aspect of the present disclosure includes a processor and a memory. Using the memory, the processor extracts, from a recognition target image acquired by a sensor, an area in which a traffic light having N lamp parts (N is an integer of 3 or more) is displayed as a traffic light area, and calculates, from the traffic light area, similarities of a plurality of images in which different lamp parts are displayed.
  • the processor may generate N lamp part areas as the plurality of images, and calculate the similarity of each of the N lamp part areas.
  • a lamp part that is not similar to any other lamp part among the N lamp parts of the traffic light can be recognized as a lit lamp part.
  • Based on the similarity calculated for each of the N lamp part areas, the processor may determine whether there exists a dissimilar area, that is, a lamp part area that is not similar to any other lamp part area among the N lamp part areas.
  • Further, when the processor determines that the dissimilar area exists, the processor may recognize a color associated in advance with the position of the dissimilar area in the traffic light area as the light color in the recognition target image. Specifically, in recognizing the light color, the processor may refer to association information indicating the color associated with each position of the N lamp part areas in the traffic light area, and recognize the color associated in the association information with the position of the dissimilar area as the light color in the recognition target image.
  • With this, the light color is recognized based on the position of the dissimilar area. For example, if one of the N lamp parts of the traffic light is lit and the other N-1 lamp parts are off, the area in which the lit lamp part is displayed is determined to be the dissimilar area. If the dissimilar area is at the left end of the traffic light area, blue is recognized as the light color. Since the light color is recognized based on the similarities between the N lamp part areas, that is, the differences in their feature amounts, the light color can be recognized appropriately.
  • For example, when color saturation occurs in the recognition target image, the lit lamp part may appear in the image not in blue but in a color close to white or yellow.
  • Nevertheless, since the light color is recognized from the differences in the feature amounts of the N lamp part areas, the light color can be recognized appropriately even for such a recognition target image in which color saturation occurs.
  • Moreover, the traffic light recognition apparatus can recognize the light color even from an image generated by a camera that is not highly accurate. Therefore, the light color can be recognized appropriately even for an image generated by a low-accuracy camera that is attached to a vehicle and designed to withstand severe environmental conditions.
  • Furthermore, the shape of each of the N lamp parts included in the traffic light may be circular, quadrangular, or any other shape.
  • Likewise, the colors of the lamp parts of the traffic light are not limited to blue, yellow, and red; any colors can be recognized appropriately.
  • In addition, since no shape template is required, the memory capacity for holding templates can be reduced.
  • For each lamp part area in the recognition target image, the processor may calculate a correlation coefficient between that lamp part area and each of at least two other lamp part areas, and select the largest of the correlation coefficients as the similarity of that lamp part area.
  • In the determination, the processor may determine that the lamp part area having the smallest similarity among the similarities of the N lamp part areas exists as the dissimilar area.
  • Alternatively, the processor may determine that the lamp part area having the smallest similarity exists as the dissimilar area only when that smallest similarity is equal to or less than a threshold.
  • With this, it is determined that the dissimilar area exists when the smallest similarity is equal to or less than the threshold, and that it does not exist when the smallest similarity is larger than the threshold. Therefore, recognition of the light color is withheld when there is no significant difference among the similarities of the N lamp part areas, and as a result, recognition of an inappropriate light color can be suppressed.
  • Further, when the processor determines in the recognition of the light color that the dissimilar area does not exist, the processor may refer to history information indicating the light color recognized for an image acquired in the past, and recognize the light color indicated by the history information as the light color in the recognition target image.
  • The processor may also refer to history information indicating the light color recognized for each of a plurality of images acquired before the recognition target image, identify the majority light color among the light color recognized for the recognition target image and the light colors indicated by the history information, and update the light color recognized for the recognition target image to that majority light color.
  • Here, the plurality of images and the recognition target image are, for example, a series of images generated by photographing at a constant frame rate. The light color recognized for the recognition target image is updated to the majority light color among that light color and the light colors of the plurality of images. Therefore, even if, due to a factor other than flicker such as noise, it is suddenly determined that no dissimilar area exists in the recognition target image, or an incorrect light color is recognized for the recognition target image, the error can be easily corrected.
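The majority-vote update over recent frames described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name, the list-based history representation, and the tie-breaking in favor of the current recognition are assumptions not specified in the text.

```python
from collections import Counter

def update_with_history(current_color, history):
    """Majority vote over the light color recognized for the current frame
    and the colors recognized for recent past frames. `history` is a list of
    past light colors; entries may be None for frames where no dissimilar
    area was found."""
    votes = [c for c in history + [current_color] if c is not None]
    if not votes:
        return None
    counts = Counter(votes)
    best, best_n = counts.most_common(1)[0]
    # Keep the current recognition unless another color strictly dominates
    # (assumed tie-breaking rule).
    if current_color is not None and counts[current_color] == best_n:
        return current_color
    return best
```

For example, a single noisy "yellow" surrounded by "red" frames is corrected back to "red".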
  • Further, in the determination, the processor may refer to history information indicating, for each of a plurality of images acquired before the recognition target image and for each position in the traffic light area in that image, the similarity of the lamp part area at that position, and calculate, for each position in the traffic light area, the average of the similarities of the lamp part area at that position over the recognition target image and the plurality of images.
  • The processor may then determine that, among the N lamp part areas in the recognition target image, the lamp part area at the position corresponding to the smallest of the average similarities calculated for the respective positions exists as the dissimilar area.
  • Here, the plurality of images and the recognition target image are, for example, a series of images generated by photographing at a constant frame rate. For each position in the traffic light area, the average of the similarities of the lamp part area at that position over the recognition target image and the plurality of images is calculated as the average similarity. For example, the average similarity of the lamp part area at the left end of the traffic light area, that of the lamp part area at the center, and that of the lamp part area at the right end are calculated.
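The per-position average-similarity determination can be sketched as follows; the function name and the list-of-lists representation of the history are assumptions made for illustration.

```python
def average_similarities(current_sims, history_sims):
    """Per-position average similarity over the current frame and past frames.
    current_sims: list of N similarities for the recognition target image.
    history_sims: list of past frames, each a list of N similarities.
    Returns (index of the lamp-part area with the smallest average similarity,
    i.e. the dissimilar area, and the list of averages)."""
    frames = history_sims + [current_sims]
    n = len(current_sims)
    averages = [sum(f[i] for f in frames) / len(frames) for i in range(n)]
    return min(range(n), key=lambda i: averages[i]), averages
```

Averaging over several frames keeps the left/center/right positions comparable even when a single frame's similarities are corrupted by noise.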
  • Further, in the generation, the processor may divide the traffic light area to generate the N lamp part areas and M lamp part areas in which direction-indicating lamp parts are respectively displayed. Then, based on the recognized light color and the respective feature amounts of the M lamp part areas, the processor may recognize the traveling direction indicated by at least one direction-indicating lamp part that is lit among the M direction-indicating lamp parts.
  • With this, the traveling direction of the lit direction-indicating lamp part can be recognized appropriately regardless of whether the combination of the lit light color and the direction-indicating lamp part that may be lit with it is determined in advance.
  • Further, in the calculation, the processor may calculate, for each of the N lamp part areas, the similarity of that lamp part area between the image of that lamp part area and an image of a group consisting of at least one lamp part area other than that lamp part area.
  • FIG. 1 is a block diagram illustrating a configuration of a traffic signal recognition apparatus according to an embodiment.
  • the vehicle 10 includes a camera 210, a map storage unit 220, a position detection system 230, a vehicle control unit 240, and a traffic signal recognition device 100.
  • the camera 210 captures, for example, the front of the vehicle 10 and outputs an image generated by the capturing (hereinafter referred to as a captured image) to the traffic signal recognition apparatus 100. Specifically, the camera 210 captures images at a constant frame rate, and outputs an image as a captured image to the traffic signal recognition apparatus 100 each time an image is generated by the capture.
  • the camera 210 is configured as a sensor that acquires a captured image (that is, a recognition target image described later). Note that the shutter speed used for shooting may be fixed or variable.
  • the map storage unit 220 stores at least map information indicating a map around the vehicle 10. Such map information is wirelessly transmitted from a server via a network such as the Internet and stored in the map storage unit 220, for example.
  • the position detection system 230 detects the position of the vehicle 10 and outputs position information indicating the position to the traffic signal recognition device 100.
  • the position detection system 230 is configured as a GPS (Global Positioning System) receiver.
  • The vehicle control unit 240 includes, for example, one or more ECUs (Electronic Control Units). The vehicle control unit 240 acquires the light color information output from the traffic signal recognition device 100, and controls the driving of the vehicle 10 based on the light color information.
  • the traffic signal recognition device 100 acquires a captured image from the camera 210 and recognizes the traffic signal displayed in the captured image. For example, the traffic light recognition device 100 recognizes the light color of the traffic light and outputs the light color information indicating the light color to the vehicle control unit 240. Note that the captured image used for the current traffic signal recognition is also referred to as a recognition target image.
  • the traffic light recognition apparatus 100 includes an area extraction unit 101, an area division unit 102, a similarity calculation unit 103, and a light color recognition unit 104.
  • the area extraction unit 101 acquires a captured image from the camera 210 and extracts an area where the traffic signal is displayed as a traffic signal area from the captured image.
  • the traffic light has N lamp units (N is an integer of 3 or more). For example, N is three, and the traffic light has a blue lamp unit, a yellow lamp unit, and a red lamp unit.
  • Since the captured image is used for recognition of the current lighting color, the captured image is treated as the recognition target image.
  • the region extraction unit 101 uses, for example, the map information stored in the map storage unit 220 and the position of the vehicle 10 detected by the position detection system 230 when extracting the traffic signal region from the captured image.
  • the map information indicates the position and form of each traffic signal in the three-dimensional space. That is, the map information not only shows the position of the road and the building, but also shows the position and form of the traffic signal arranged on the road.
  • the position of the traffic light is indicated by a three-dimensional coordinate system. That is, the position of the light box of the traffic light is indicated by longitude, latitude, and height. The height may be a height from a road or an altitude.
  • the form of the traffic signal includes the shape and size of the traffic signal when the traffic signal is viewed from the front.
  • The region extraction unit 101 identifies the position of the vehicle 10 on the map by mapping the position indicated by the position information output from the position detection system 230 onto the map indicated by the map information. Furthermore, based on the positions of the vehicle 10 and the traffic light on the map, the form of the traffic light, and the mounting position and shooting direction of the camera 210 on the vehicle 10, the area extraction unit 101 geometrically detects the area in which the traffic light is displayed in the captured image. Then, the area extraction unit 101 extracts the detected area from the captured image as the traffic light area.
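The geometric detection step can be illustrated with a standard pinhole-camera projection of the traffic light's known 3-D position into the image. This is a hedged sketch of the general idea only; the patent does not specify a camera model, and the function name and parameters (focal lengths fx, fy and principal point cx, cy) are hypothetical.

```python
import numpy as np

def project_to_image(point_world, cam_pos, cam_rot, fx, fy, cx, cy):
    """Project a 3-D world point into pixel coordinates with a pinhole model.
    cam_rot is a 3x3 world-to-camera rotation matrix; cam_pos is the camera
    position in world coordinates. Returns None if the point lies behind
    the camera."""
    p_cam = cam_rot @ (np.asarray(point_world, float) - np.asarray(cam_pos, float))
    if p_cam[2] <= 0:          # behind the camera: not visible
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return (u, v)
```

Projecting the corners of the traffic light's light box this way yields the bounding region to be extracted from the captured image.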
  • the area dividing unit 102 divides the traffic signal area to generate N lamp part areas each displaying the lamp part.
  • the area dividing unit 102 may equally divide the signal area into N parts when dividing the signal area into N lamp part areas.
  • the area dividing unit 102 divides the traffic signal area according to the number and arrangement of the lamp units.
  • In the present embodiment, the area dividing unit 102 divides the traffic light area in order to generate the N lamp part areas, but the N lamp part areas may also be generated without dividing the traffic light area.
  • In that case, the area dividing unit 102 extracts, for each lamp part, an area in which that lamp part is displayed from the entire traffic light area. In this extraction, at least two of the N lamp part areas may include the same image, such as a background. In addition, an image such as a background that is not included in any of the N lamp part areas may remain in the traffic light area.
  • That is, as long as N lamp part areas in which different lamp parts are displayed are generated from the traffic light area, any kind of image processing may be performed on the traffic light area.
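The equal division of the traffic light area into N lamp part areas can be sketched as follows. This is a minimal illustration under the assumption of a horizontal lamp layout (slicing along the width axis, as in the three-lamp example of FIG. 2); the function name is hypothetical.

```python
import numpy as np

def divide_traffic_light_area(traffic_light_area: np.ndarray, n: int) -> list:
    """Equally divide a horizontal traffic-light region (H x W [x C] array)
    into n lamp-part regions along the width axis."""
    height, width = traffic_light_area.shape[:2]
    step = width // n
    # Each lamp-part region is a vertical slice; the last slice absorbs any
    # remainder pixels so that the whole traffic-light area is covered.
    regions = []
    for i in range(n):
        end = width if i == n - 1 else (i + 1) * step
        regions.append(traffic_light_area[:, i * step:end])
    return regions
```

For a vertically stacked signal, the same slicing would simply be applied along the height axis instead.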
  • the similarity calculation unit 103 acquires the N lamp unit areas described above from the region dividing unit 102, and calculates the similarity of each of the N lamp unit regions.
  • the lamp color recognition unit 104 acquires the similarity calculated for each of the N lamp unit regions from the similarity calculation unit 103. Then, the lamp color recognition unit 104 recognizes the color associated with the position of the dissimilar area specified based on the similarity as the lamp color in the captured image.
  • Here, a dissimilar area is a lamp part area that is not similar to any other lamp part area.
  • Specifically, by referring to association information described later, the lamp color recognition unit 104 recognizes the color associated in the association information with the position of the dissimilar area as the light color in the captured image.
  • the association information may be included in the map information.
  • the map information may indicate the colors associated with the positions of the lamp units in addition to the number (N) of the lamp units included in the traffic light and the arrangement of the N lamp units.
  • the lamp color recognition unit 104 outputs information indicating the recognized lamp color to the vehicle control unit 240 as lamp color information.
  • FIG. 2 is a diagram for explaining the processing of the region extracting unit 101, the region dividing unit 102, and the similarity calculating unit 103.
  • the region extraction unit 101 acquires a captured image Pa as a recognition target image from the camera 210 (step S1). Then, the area extraction unit 101 extracts the traffic light area Pb from the captured image Pa (step S2).
  • the three lamp parts are composed of a blue lamp part, a yellow lamp part, and a red lamp part.
  • the area dividing unit 102 divides the traffic signal area Pb into three lamp part areas Pb1, Pb2, and Pb3 (step S3).
  • the lamp area Pb1 is an area located at the left end of the traffic signal area Pb, and is also referred to as a first lamp section area Pb1.
  • the lamp part area Pb2 is an area located at the center of the traffic light area Pb, and is also referred to as a second lamp part area Pb2.
  • the lamp part area Pb3 is an area located at the right end of the traffic signal area Pb, and is also referred to as a third lamp part area Pb3.
  • Next, the similarity calculation unit 103 calculates the respective similarities k1, k2, and k3 of the three lamp part areas Pb1, Pb2, and Pb3 (steps S4 to S6).
  • When calculating the similarity k1 of the first lamp part area Pb1 (step S4), the similarity calculation unit 103 first calculates the correlation coefficient k12 between the first lamp part area Pb1 and the second lamp part area Pb2. That is, the similarity calculation unit 103 compares the feature amount of the image of the first lamp part area Pb1 with the feature amount of the image of the second lamp part area Pb2, and calculates the correlation between these feature amounts as the correlation coefficient k12. The correlation coefficient takes a larger value as the feature amounts of the images in these areas are more similar.
  • Here, the similarity calculation unit 103 divides the first lamp part area Pb1 into blocks of n × n pixels and, for each block of the first lamp part area Pb1, searches the second lamp part area Pb2 for the block most similar to the image of that block. Specifically, the similarity calculation unit 103 represents the feature amount of a block in the first lamp part area Pb1 as a vector, and finds, from the second lamp part area Pb2, the block whose feature-amount vector has the shortest distance to that vector. For example, the vector representing the feature amount of a block consists of the array of pixel values (specifically, luminance values) of the n × n pixels constituting the block.
  • Then, the similarity calculation unit 103 calculates the correlation coefficient k12 from the shortest inter-vector distances obtained for the respective blocks of the first lamp part area Pb1.
  • For example, the correlation coefficient k12 is the reciprocal of the average of the shortest inter-vector distances obtained for the respective blocks of the first lamp part area Pb1.
  • the calculation method of the correlation coefficient is not limited to this, and any method may be used.
  • the shortest inter-vector distance is calculated for each block of the first lamp area Pb1, but the shortest inter-vector distance may be calculated in finer units.
  • the shortest inter-vector distance for the image in the processing target window may be calculated while shifting the processing target window one pixel at a time in the first lamp area Pb1.
  • the processing target window has a size that surrounds n ⁇ n pixels.
  • The similarity calculation unit 103 also calculates the correlation coefficient k13 between the first lamp part area Pb1 and the third lamp part area Pb3 in the same manner as described above. Then, the similarity calculation unit 103 selects the larger of the correlation coefficients k12 and k13 as the similarity k1 of the first lamp part area Pb1.
  • When calculating the similarity k2 of the second lamp part area Pb2 (step S5), the similarity calculation unit 103 first calculates the correlation coefficient k21 between the second lamp part area Pb2 and the first lamp part area Pb1, and further calculates the correlation coefficient k23 between the second lamp part area Pb2 and the third lamp part area Pb3. Then, it selects the larger of the correlation coefficients k21 and k23 as the similarity k2 of the second lamp part area Pb2.
  • Likewise, when calculating the similarity k3 of the third lamp part area Pb3 (step S6), the similarity calculation unit 103 first calculates the correlation coefficient k31 between the third lamp part area Pb3 and the first lamp part area Pb1, and further calculates the correlation coefficient k32 between the third lamp part area Pb3 and the second lamp part area Pb2. Then, it selects the larger of the correlation coefficients k31 and k32 as the similarity k3 of the third lamp part area Pb3.
  • In this way, for each lamp part area in the recognition target image, the similarity calculation unit 103 calculates a correlation coefficient between that lamp part area and each of at least two other lamp part areas, and selects the largest correlation coefficient among them as the similarity. As a result, the similarity of each lamp part area is calculated.
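The block-matching procedure above can be sketched as follows: the correlation coefficient between two regions is taken as the reciprocal of the mean shortest inter-vector distance over n × n blocks, and each region's similarity is the largest coefficient against the other regions. This is one possible reading of the description, not the definitive implementation; the block size, the Euclidean distance metric, the block-aligned search grid, and the function names are assumptions.

```python
import numpy as np

def block_match_coefficient(region_a, region_b, n=4):
    """Correlation coefficient between two lamp-part regions (grayscale
    arrays of equal shape): for every n x n block of region_a, find the
    closest n x n block of region_b by Euclidean distance between the
    flattened luminance vectors, then take the reciprocal of the mean of
    those shortest distances (larger = more similar)."""
    h, w = region_a.shape
    blocks_b = np.stack([region_b[y:y + n, x:x + n].ravel()
                         for y in range(0, h - n + 1, n)
                         for x in range(0, w - n + 1, n)]).astype(float)
    shortest = []
    for y in range(0, h - n + 1, n):
        for x in range(0, w - n + 1, n):
            vec = region_a[y:y + n, x:x + n].ravel().astype(float)
            shortest.append(np.min(np.linalg.norm(blocks_b - vec, axis=1)))
    return 1.0 / (np.mean(shortest) + 1e-6)  # epsilon avoids division by zero

def region_similarity(regions, i):
    """Similarity k_i of regions[i]: the largest correlation coefficient
    against every other lamp-part region."""
    return max(block_match_coefficient(regions[i], regions[j])
               for j in range(len(regions)) if j != i)
```

With two dark (unlit) regions and one bright (lit) region, the lit region's similarity comes out far smaller than the other two, which is exactly the property the dissimilar-area determination relies on.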
  • For the lamp part area in which the one lit lamp part is displayed, a small value is calculated as the similarity.
  • Conversely, a large value is calculated as the similarity of each of the lamp part areas in which the other two (unlit) lamp parts are displayed. Therefore, by using the similarities of these three lamp part areas, a lamp part area that is not similar to any other lamp part area can easily be found among the three lamp part areas.
  • FIG. 3 is a diagram for explaining processing of the light color recognition unit 104.
  • The lamp color recognition unit 104 acquires the similarities k1 to k3 calculated for the first lamp part area Pb1, the second lamp part area Pb2, and the third lamp part area Pb3, respectively. Based on the similarities k1 to k3, the lamp color recognition unit 104 then recognizes the color of the lit lamp part among the three lamp parts of the traffic light as the light color in the captured image Pa.
  • Specifically, based on the similarities k1 to k3 calculated for the three lamp part areas Pb1 to Pb3, the lamp color recognition unit 104 determines whether there exists a dissimilar area, that is, a lamp part area that is not similar to any of the other lamp part areas.
  • In the example of FIG. 3, the similarity k1 is the smallest of the similarities k1 to k3, so the first lamp part area Pb1 having the similarity k1 is determined to be the dissimilar area. That is, the lamp color recognition unit 104 determines that the first lamp part area Pb1, which has the smallest similarity k1 among the similarities k1 to k3 of the three lamp part areas Pb1 to Pb3, exists as the dissimilar area. As a result, for any captured image Pa, it can be determined that a dissimilar area exists, and the light color in the captured image Pa can be recognized.
  • Alternatively, the lamp color recognition unit 104 may determine that the first lamp part area Pb1 having the smallest similarity k1 exists as the dissimilar area only when the smallest similarity k1 among the similarities k1 to k3 of the three lamp part areas Pb1 to Pb3 is equal to or less than a threshold Th. With this, it is determined that the dissimilar area exists when the smallest similarity is equal to or less than the threshold, and that it does not exist when the smallest similarity is larger than the threshold. Therefore, recognition of the light color is withheld when there is no significant difference among the similarities of the N lamp part areas, and as a result, recognition of an inappropriate light color can be suppressed.
  • the light color recognition unit 104 recognizes the light color based on the position of the dissimilar region in the traffic signal region Pb. For example, at this time, the light color recognition unit 104 refers to the association information. That is, the lamp color recognition unit 104 refers to the association information indicating the colors associated with the respective positions of the three lamp unit areas Pb1 to Pb3 in the traffic signal area Pb. Then, the lamp color recognition unit 104 recognizes the color associated with the position of the dissimilar region as the lamp color in the captured image Pa in the association information. For example, as shown in FIG. 3, the association information indicates the blue color associated with the position of the first lamp section area Pb1 in the traffic light area Pb, that is, the left end.
• Similarly, the association information indicates yellow associated with the position of the second lamp part region Pb2, that is, the center, and red associated with the position of the third lamp part region Pb3, that is, the right end.
• Accordingly, the lamp color recognition unit 104 recognizes, for example, blue, which is associated with the position of the first lamp part region Pb1 determined to be a dissimilar region, as the lamp color in the captured image Pa.
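The association information for the FIG. 3 layout can be pictured as a simple position-to-color mapping; the key names below are illustrative, not taken from the patent:

```python
# Hypothetical association information: each position of a lamp part
# region in the traffic light region is mapped to the color of the
# lamp part displayed at that position (FIG. 3 layout).
ASSOCIATION_INFO = {"left end": "blue", "center": "yellow", "right end": "red"}

def lamp_color_for(dissimilar_position):
    # recognize the color associated with the dissimilar region's position
    return ASSOCIATION_INFO[dissimilar_position]
```

With the first lamp part region Pb1 (left end) determined to be the dissimilar region, the lookup yields blue.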
  • FIG. 4 is a flowchart showing the processing operation of the traffic signal recognition apparatus 100 in the present embodiment.
• First, the region extraction unit 101 extracts, from the captured image Pa, the traffic light region Pb in which a traffic light having three lamp parts is displayed (steps S101 and S102).
  • the area dividing unit 102 divides the traffic signal area Pb to generate three lamp part areas Pb1 to Pb3 in which the lamp parts are respectively projected (step S103).
  • the similarity calculation unit 103 calculates the respective similarities k1 to k3 of the three lamp unit regions Pb1 to Pb3 (step S104).
• the lamp color recognition unit 104 recognizes the color of the lit lamp part among the three lamp parts of the traffic light as the lamp color in the captured image Pa, based on the similarities k1 to k3 calculated for the three lamp part regions Pb1 to Pb3.
• Specifically, the lamp color recognizing unit 104 determines, based on the similarities k1 to k3, whether there exists a dissimilar region, that is, a lamp part region that is not similar to any of the other lamp part regions among the three lamp part regions Pb1 to Pb3 (step S105). If the lamp color recognition unit 104 determines that a dissimilar region exists (Yes in step S105), it recognizes the lamp color based on the position of the dissimilar region in the traffic light region Pb (step S106).
  • the region extraction unit 101 acquires a captured image Pa as a recognition target image. Then, the area extraction unit 101 extracts an area where a traffic light having N lamp parts is displayed as a traffic light area from the recognition target image.
  • the area dividing unit 102 divides the traffic signal area to generate N lamp part areas each displaying the lamp part.
  • the similarity calculation unit 103 calculates the similarity of each of the N lamp unit regions.
• the lamp color recognizing unit 104 determines, based on the similarities calculated for the N lamp part regions, whether there exists a dissimilar region, that is, a lamp part region that is not similar to any of the other lamp part regions among the N lamp part regions.
• If the lamp color recognition unit 104 determines that a dissimilar region exists, it recognizes the color of the lit lamp part among the N lamp parts as the lamp color in the recognition target image, based on the position of the dissimilar region in the traffic light region.
  • the lamp color recognition unit 104 refers to the association information indicating the colors associated with the positions of the N lamp unit areas in the traffic signal area Pb.
• Then, the lamp color recognition unit 104 recognizes the color associated with the position of the dissimilar region in the association information as the lamp color in the recognition target image.
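The recognition flow of steps S101 to S106 can be sketched end to end under two loud assumptions: (1) each lamp part region is reduced to its mean brightness as a stand-in feature amount, and (2) a region's similarity is the average of 100 minus the absolute feature difference against the other regions. Both are illustrative only; the patent leaves the actual feature and similarity measures to the earlier description.

```python
# Illustrative feature and similarity: mean brightness per lamp part
# region, and for each region the average closeness to the others.
def region_similarities(regions):
    feats = [sum(r) / len(r) for r in regions]  # mean brightness per region
    sims = []
    for i, fi in enumerate(feats):
        others = [100 - abs(fi - fj) for j, fj in enumerate(feats) if j != i]
        sims.append(sum(others) / len(others))
    return sims

def recognize_lamp_color(regions, colors=("blue", "yellow", "red"), th=50):
    sims = region_similarities(regions)   # step S104
    k_min = min(sims)
    if k_min > th:                        # step S105: no dissimilar region
        return None
    return colors[sims.index(k_min)]      # step S106: color at that position
```

With one bright (lit) region among two dark ones, the bright region's similarity drops below the threshold and the associated color is returned; with three similar dark regions, no dissimilar region is found.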
  • the association information indicating the color associated with each position of the N lamp section regions may be included in the map information as described above, or may not be included in the map information.
  • the association information may be stored in advance in a memory, or may be stored in a cloud server that can communicate with the traffic signal recognition apparatus 100.
  • the light color recognition unit 104 may read the association information from the memory, or may dynamically read the association information from the cloud server. Further, the light color recognition unit 104 may inquire the cloud server about the color associated with the position of the dissimilar region and specify the color.
• For example, when one lamp part of the traffic light is lit, the lamp part region in which that lit lamp part is displayed is determined to be a dissimilar region. If the dissimilar region is at the left end of the traffic light region, blue is recognized as the lamp color. As described above, since the lamp color is recognized based on the similarity between the N lamp part regions, that is, the difference in the feature amounts of the N lamp part regions, the lamp color can be appropriately recognized.
• For example, when color saturation occurs, even if the blue lamp part of the traffic light is lit, the lamp part may appear not blue but in a color close to white or yellow in the lamp part region of the recognition target image corresponding to that lamp part.
• Even for such a recognition target image, since the lamp color is recognized based on the difference in the feature amounts of the N lamp part regions, it is possible to appropriately recognize the lamp color.
• Since color saturation in the recognition target image can be tolerated, it is unnecessary to adjust the camera 210 to a predetermined shutter speed so that color saturation does not occur when the traffic light is captured by the camera 210. As a result, the shutter speed of the camera 210 can be fixed. In addition, it is possible to appropriately recognize the lamp color even for a recognition target image generated by shooting at night or in rainy weather, where color saturation is likely to occur.
  • the camera 210 that is not highly accurate can be used for shooting the traffic light.
• the traffic light recognition apparatus 100 can recognize the lamp color even from a captured image generated by shooting with a camera 210 that is not highly accurate. Therefore, it is possible to appropriately recognize the lamp color even for a captured image generated by a camera 210 that is attached to the vehicle, is not highly accurate, and can withstand severe environmental conditions.
  • the shape of each of the N lamp units included in the traffic light may be circular or quadrangular, or any shape.
  • the color of the lamp portion of the traffic light is not limited to blue, yellow and red, and any color can be appropriately recognized.
  • the memory capacity for holding the template can be reduced.
• (Modification 1) Due to the flicker of the traffic light, an unlit traffic light may be displayed in the captured image Pa.
• That is, the lighting of the traffic light's lamp parts is periodic.
• For example, the lamp part blinks at a frequency of 100 Hz or 120 Hz. Therefore, at the timing when the captured image Pa is generated by capturing with the camera 210, none of the lamp parts of the traffic light may be lit. In a captured image Pa generated at such a timing, every lamp part of the traffic light appears dark.
• Therefore, when an unlit traffic light is displayed in the captured image Pa that is the recognition target image, the traffic light recognition device according to this modification recognizes the lamp color in the recognition target image using the lamp color recognized for a past captured image.
  • FIG. 5 is a block diagram showing the configuration of the traffic signal recognition apparatus according to this modification.
  • the traffic signal recognizing device 100a includes the components of the traffic signal recognizing device 100 in the above embodiment, and also includes a history holding unit 105 and a flicker processing unit 106. That is, the traffic signal recognition apparatus 100a includes an area extraction unit 101, an area division unit 102, a similarity calculation unit 103, a light color recognition unit 104, a history holding unit 105, and a flicker processing unit 106.
• When the lamp color recognizing unit 104 determines that a dissimilar region exists in the traffic light region Pb, it outputs the lamp color information to the history holding unit 105 and the flicker processing unit 106. That is, if the traffic light displayed in the captured image Pa that is the recognition target image is not unlit, the lamp color information is output. On the other hand, if the lamp color recognition unit 104 determines that no dissimilar region exists in the traffic light region Pb, it outputs no-light information indicating that the traffic light is unlit to the flicker processing unit 106. That is, when the traffic light shown in the captured image Pa is unlit, the no-light information is output.
  • the history holding unit 105 is a recording medium for holding the lamp color information output from the lamp color recognition unit 104 as history information.
  • the history holding unit 105 includes a hard disk or a memory.
  • the memory may be nonvolatile or volatile.
  • the memory may be ROM (Read Only Memory) or RAM (Random Access Memory).
• When the history holding unit 105 acquires the lamp color information from the lamp color recognition unit 104, it holds the lamp color information as history information.
• When storing the lamp color information in the history holding unit 105, the lamp color recognition unit 104 may delete the old lamp color information that has already been stored.
  • the lamp color recognition unit 104 may store only the latest lamp color information in the history holding unit 105.
• When the flicker processing unit 106 acquires the lamp color information from the lamp color recognition unit 104, it outputs the lamp color information to the vehicle control unit 240. On the other hand, when the flicker processing unit 106 acquires the no-light information from the lamp color recognition unit 104, it reads the history information held in the history holding unit 105 and outputs that history information as the lamp color information for the captured image Pa that is the recognition target image.
  • FIG. 6 is a flowchart showing the processing operation of the traffic signal recognition apparatus 100a according to this modification.
  • the traffic light recognition apparatus 100a executes the processes of steps S101 to S106, similar to the processing operation of the flowchart shown in FIG.
• When the lamp color recognition unit 104 of the traffic light recognition apparatus 100a recognizes the lamp color in step S106, it stores the lamp color information indicating the lamp color in the history holding unit 105 as history information (step S107). This history information is used when it is determined, during the recognition of the lamp color for a future captured image, that no dissimilar region exists.
• When the flicker processing unit 106 acquires the lamp color information from the lamp color recognition unit 104, it outputs the lamp color information to the outside of the traffic light recognition device 100a (step S108). For example, the flicker processing unit 106 outputs the lamp color information to the vehicle control unit 240.
• If it is determined in step S105 that no dissimilar region exists (No in step S105), the lamp color recognition unit 104 outputs the no-light information to the flicker processing unit 106. For example, when an unlit traffic light is displayed in the captured image Pa that is the recognition target image, the smallest similarity is greater than the threshold Th, and it is therefore determined in step S105 that no dissimilar region exists. In such a case, the lamp color recognition unit 104 outputs the no-light information to the flicker processing unit 106. When the flicker processing unit 106 acquires the no-light information, it reads the history information from the history holding unit 105 (step S109) and outputs the read history information as the lamp color information for the recognition target image (step S110).
• As described above, in this modification, when the lamp color recognition unit 104 determines that no dissimilar region exists, the flicker processing unit 106 refers to the history information indicating the lamp color recognized for a captured image acquired in the past.
  • the flicker processing unit 106 recognizes the lamp color indicated by the history information as the lamp color in the recognition target image.
• That is, the lamp color recognized for the past captured image is recognized as the lamp color in the recognition target image. Accordingly, it is possible to appropriately suppress failures to recognize the lamp color that are caused by flicker.
• Note that the past captured image is, for example, the captured image (that is, the frame) immediately preceding the recognition target image among the images generated by shooting at a constant frame rate by the camera 210. Therefore, there is a high possibility that a lamp part that was turned off by flicker at the timing of capturing the recognition target image was lit at the timing of capturing the preceding captured image. Therefore, by referring to the history information as described above, it is possible to appropriately recognize the lamp color even if flicker occurs.
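The flicker handling of Modification 1 can be sketched as follows; representing the no-light information as `None` is an assumption made here for illustration:

```python
# Sketch of the Modification 1 flicker handling: when the recognizer
# reports an unlit traffic light (None), the most recent lamp color in
# the history is reused as the lamp color for the recognition target image.
def flicker_process(lamp_color, history):
    if lamp_color is not None:
        history.append(lamp_color)  # step S107: keep as history information
        return lamp_color           # step S108: output as-is
    if history:
        return history[-1]          # steps S109-S110: reuse the past color
    return None                     # no history available yet
```

For example, if blue was recognized for the previous frame and the current frame shows an unlit traffic light, blue is output for the current frame as well.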
  • the cause that makes it difficult to recognize the light color in the recognition target image is not limited to the flicker of the traffic light.
  • a captured image in which it is difficult to recognize the light color may be generated due to noise in the camera 210 or a shooting environment.
  • the traffic light recognition apparatus recognizes the light color in the recognition target image using the processing results for a plurality of past captured images.
  • FIG. 7 is a block diagram showing the configuration of the traffic signal recognition apparatus according to this modification.
• the traffic light recognizing device 100b includes the components of the traffic light recognizing device 100 in the above embodiment, and further includes a history holding unit 105 and a time series processing unit 107. That is, the traffic light recognition apparatus 100b includes an area extraction unit 101, an area division unit 102, a similarity calculation unit 103, a lamp color recognition unit 104, a history holding unit 105, and a time series processing unit 107.
  • the lamp color recognition unit 104 determines that there is a dissimilar area in the traffic signal area Pb, the lamp color information is output to the history holding unit 105 as in the first modification. On the other hand, if it is determined that there is no dissimilar area in the traffic signal area Pb, the lamp color recognition unit 104 recognizes the lamp color as black, for example, and outputs lamp color information indicating the black color to the history holding unit 105.
• When the history holding unit 105 acquires the lamp color information from the lamp color recognition unit 104, it holds the lamp color information as a part of the history information.
• the history holding unit 105 has a storage capacity for holding L pieces of lamp color information (L is an integer of 3 or more). Therefore, when storing new lamp color information in the history holding unit 105 while L pieces of lamp color information are already stored, the lamp color recognition unit 104 deletes the oldest of those pieces of lamp color information. Thereby, free capacity is secured in the history holding unit 105, and the lamp color recognition unit 104 stores the latest lamp color information in it. As a result, the history holding unit 105 always holds the latest L pieces of lamp color information as history information.
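The delete-oldest-then-store behavior of the history holding unit 105 maps naturally onto a fixed-capacity buffer; `collections.deque(maxlen=L)` discards the oldest entry automatically on each new append. L = 4 below is only an example capacity (the text requires L to be at least 3).

```python
from collections import deque

# Sketch of the history holding unit 105 as a fixed-capacity buffer.
L = 4
history = deque(maxlen=L)
for frame_color in ["blue", "blue", "blue", "black", "blue"]:
    history.append(frame_color)  # the first "blue" is dropped at the 5th append
```

After the five appends, only the latest four lamp colors remain in the history.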
  • the time series processing unit 107 reads out the history information stored in the history holding unit 105 every time the lighting color information is stored in the history holding unit 105.
  • the history information includes the latest L pieces of lamp color information.
  • the time series processing unit 107 updates the already recognized lamp color for the recognition target image based on the lamp color indicated by each of the L lamp color information.
• That is, the lamp color recognition unit 104 performs a provisional determination of the lamp color for the recognition target image, and the time-series processing unit 107 makes a final determination of the lamp color for the recognition target image using the lamp colors recognized for the past (L-1) captured images.
  • FIG. 8 is a diagram for explaining the recognition of the light color by the traffic signal recognition apparatus 100b according to the present modification.
  • the history holding unit 105 holds, as history information, lamp color information indicating the lamp color recognized for each of the past four captured images.
  • These four captured images are composed of the (n-4) th frame, the (n-3) th frame, the (n-2) th frame, and the (n-1) th frame.
  • the lighting color information of the (n ⁇ 1) th frame included in the history information indicates black as the lighting color
  • the other three lighting color information indicates blue as the lighting color.
  • Each frame is a captured image generated by shooting at a constant frame rate by the camera 210.
  • the lamp color recognition unit 104 recognizes the lamp color with respect to the nth frame which is the latest captured image and is the recognition target image. For example, the lamp color recognition unit 104 recognizes a blue lamp color.
  • the lamp color recognition unit 104 stores lamp color information indicating the lamp color recognized for the nth frame in the history holding unit 105.
  • the light color recognition unit 104 first deletes the light color information of the (n ⁇ 4) th frame from the history information stored in the history holding unit 105.
  • the lamp color recognition unit 104 stores lamp color information indicating the lamp color recognized for the nth frame in the history holding unit 105 as new lamp color information.
  • the history information of the history holding unit 105 includes the light color information of the nth frame instead of the light color information of the (n-4) th frame.
• Next, the time series processing unit 107 reads out the history information stored in the history holding unit 105. Then, the time-series processing unit 107 updates the lamp color (for example, blue) recognized for the nth frame based on the lamp colors indicated by the latest four pieces of lamp color information included in the history information. That is, the time-series processing unit 107 makes a final determination of the lamp color for the nth frame using the lamp colors obtained by the provisional determinations for the (n-3)th frame, the (n-2)th frame, the (n-1)th frame, and the nth frame.
• Specifically, the time-series processing unit 107 takes a majority vote of the lamp colors indicated by the four pieces of lamp color information, and updates the lamp color recognized for the nth frame to the lamp color determined by that majority vote. For example, the time-series processing unit 107 takes a majority vote of the lamp colors recognized for the (n-3)th frame, the (n-2)th frame, the (n-1)th frame, and the nth frame, that is, blue, blue, black, and blue. The time-series processing unit 107 specifies the lamp color determined by the majority vote, that is, blue, as the majority lamp color, and updates the lamp color recognized for the nth frame to that majority lamp color.
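The majority vote over the lamp colors in the history information can be sketched with `collections.Counter`; the function name is chosen here for clarity:

```python
from collections import Counter

# Sketch of the majority vote: return the lamp color that occurs most
# often among the provisional determinations held in the history.
def majority_lamp_color(colors):
    return Counter(colors).most_common(1)[0][0]
```

With the example colors blue, blue, black, and blue, the majority lamp color is blue, so a single black (e.g., flicker- or noise-induced) determination is overruled.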
  • FIG. 9 is a flowchart showing a processing operation by the traffic signal recognition apparatus 100b according to the present modification.
  • the traffic signal recognition apparatus 100b executes steps S101 to S106 in the same manner as the processing operation of the flowchart shown in FIG.
• When the lamp color recognition unit 104 of the traffic light recognition apparatus 100b recognizes the lamp color in step S106, it stores the lamp color information indicating the lamp color in the history holding unit 105 as part of the history information (step S122). If the lamp color recognition unit 104 determines in step S105 that no dissimilar region exists (No in step S105), it recognizes the lamp color for the recognition target image as, for example, black (step S121). Then, the lamp color recognition unit 104 stores the lamp color information indicating black in the history holding unit 105 as part of the history information (step S122).
• Next, the time series processing unit 107 reads the history information stored in the history holding unit 105 (step S123). Then, the time-series processing unit 107 updates the lamp color in the recognition target image recognized in step S106 or S121 by a majority vote of the four lamp colors indicated by the history information (step S124).
  • the history holding unit 105 holds history information including four pieces of light color information.
• However, the number of pieces of lamp color information included in the history information is not limited to four, and may be three, or may be five or more.
  • the time series processing unit 107 reads the history information from the history holding unit 105 to acquire the light color information of the recognition target image included in the history information.
  • the time series processing unit 107 may directly acquire the light color information of the recognition target image from the light color recognition unit 104.
  • the history information read from the history holding unit 105 by the time-series processing unit 107 does not include the light color information of the recognition target image, but includes only the light color information of the past captured image.
• As described above, in this modification, the time-series processing unit 107 refers to history information indicating the lamp color recognized for each of a plurality of captured images acquired before the recognition target image. Then, the time-series processing unit 107 specifies, as the majority lamp color, the lamp color that occurs most frequently among the lamp color recognized for the recognition target image and the lamp colors for the plurality of captured images indicated by the history information. The time-series processing unit 107 then updates the lamp color recognized for the recognition target image to the majority lamp color.
  • the traffic signal recognition apparatus recognizes the light color in the recognition target image using the processing results for a plurality of past captured images.
  • the traffic light recognition apparatus according to the present modification uses not the lamp color recognition result but the similarity calculation result as the processing result for a plurality of past captured images.
  • FIG. 10 is a block diagram showing a configuration of a traffic signal recognition apparatus according to this modification.
• the traffic light recognition device 100c includes a lamp color recognition unit 104c instead of the lamp color recognition unit 104 among the components of the traffic light recognition device 100 in the above embodiment, and further includes a history holding unit 105. That is, the traffic light recognition apparatus 100c includes an area extraction unit 101, an area division unit 102, a similarity calculation unit 103, a lamp color recognition unit 104c, and a history holding unit 105.
  • the history holding unit 105 holds the similarity information output from the similarity calculation unit 103 as part of the history information.
• the history holding unit 105 has a storage capacity for holding L pieces of similarity information (L is an integer of 2 or more). Therefore, when storing new similarity information in the history holding unit 105 while L pieces of similarity information are already stored, the similarity calculation unit 103 deletes the oldest of those pieces of similarity information. Thereby, free capacity is secured in the history holding unit 105, and the similarity calculation unit 103 stores the latest similarity information in it. As a result, the history holding unit 105 always holds the latest L pieces of similarity information as history information.
  • the lighting color recognition unit 104c reads the history information stored in the history holding unit 105 every time the similarity information is stored in the history holding unit 105.
• the history information includes the latest L pieces of similarity information.
• For example, the lamp color recognition unit 104c calculates the average of the similarities of the lamp part region at the left end as the average similarity of that lamp part region, and likewise calculates the average similarities of the lamp part regions at the center and at the right end. Then, if the smallest of the three calculated average similarities is, for example, the average similarity of the lamp part region at the left end, the lamp color recognition unit 104c specifies the lamp part region at the left end in the traffic light region Pb of the recognition target image as a dissimilar region.
  • FIG. 11 is a diagram for explaining the recognition of the light color by the traffic signal recognition device 100c according to this modification.
  • the history holding unit 105 holds similarity information indicating similarity calculated for each of the past four captured images as history information.
  • These four captured images are composed of the (n-4) th frame, the (n-3) th frame, the (n-2) th frame, and the (n-1) th frame. Note that these frames are captured images generated by capturing at a constant frame rate by the camera 210.
  • the similarity calculation unit 103 calculates the similarity of each of the three lamp unit regions with respect to the nth frame which is the latest captured image and is the recognition target image. Then, the similarity calculation unit 103 stores similarity information indicating the similarity calculated for the n-th frame in the history holding unit 105. At this time, the similarity calculation unit 103 first deletes the similarity information of the (n ⁇ 4) th frame from the history information stored in the history holding unit 105. Next, the similarity calculation unit 103 stores similarity information indicating the similarity calculated for the n-th frame in the history holding unit 105 as new similarity information. As a result, the history information of the history holding unit 105 includes the similarity information of the nth frame instead of the similarity information of the (n ⁇ 4) th frame.
• the lamp color recognition unit 104c reads the history information stored in the history holding unit 105.
• the history information includes the similarity information of the (n-3)th frame, the similarity information of the (n-2)th frame, the similarity information of the (n-1)th frame, and the similarity information of the nth frame.
  • the similarity information of the (n-3) th frame indicates 30/101/99 as the similarity of the lamp area at each of the left end, the center, and the right end in the traffic light area Pb.
  • the similarity information of the (n ⁇ 2) th frame indicates 70/111/105 as the similarity of the lamp area at each of the left end, the center, and the right end in the traffic light area Pb.
  • the similarity information of the (n ⁇ 1) th frame indicates 107/110/114 as the similarity of the lamp area at each of the left end, the center, and the right end in the traffic light area Pb.
  • the similarity information of the nth frame indicates 21/112/105 as the similarity of the lamp area at each of the left end, the center, and the right end in the traffic light area Pb.
  • the lamp color recognition unit 104c calculates the average of the similarity of the lamp area at the left end in the traffic signal area Pb indicated by the similarity information of each of the four frames. That is, the lamp color recognition unit 104c calculates the average similarity of the leftmost lamp unit region by (30 + 70 + 107 + 21) / 4. Similarly, the lamp color recognition unit 104c calculates the average similarity of the central lamp unit area by (101 + 111 + 110 + 112) / 4, and calculates the average similarity of the rightmost lamp unit region by (99 + 105 + 114 + 105) / 4.
• As a result, the lamp color recognition unit 104c calculates 57/108/106 as the average similarities of the lamp part regions at the left end, the center, and the right end in the traffic light region Pb.
  • the lamp color recognition unit 104c identifies the position corresponding to the smallest average similarity “57” among these average similarities, that is, the lamp region at the left end, as a dissimilar region. That is, the lamp color recognizing unit 104c determines that the leftmost lamp unit region among the three lamp unit regions in the recognition target image exists as a dissimilar region.
  • the lamp color recognition unit 104c recognizes the lamp color in the recognition target image based on the position of the dissimilar region. For example, if the lamp color recognition unit 104c determines that the leftmost lamp unit region exists as a dissimilar region as described above, it recognizes blue as the lamp color in the recognition target image.
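The FIG. 11 numbers can be reproduced directly; this sketch averages the per-frame similarities per position and takes the position with the smallest average as the dissimilar region (57 is the doc's rounded value for the left end):

```python
# Per-frame similarities (left end, center, right end) from FIG. 11.
frames = [
    (30, 101, 99),    # (n-3)th frame
    (70, 111, 105),   # (n-2)th frame
    (107, 110, 114),  # (n-1)th frame
    (21, 112, 105),   # nth frame (recognition target image)
]
averages = [sum(column) / len(frames) for column in zip(*frames)]
dissimilar = averages.index(min(averages))         # 0 -> left end
lamp_color = ("blue", "yellow", "red")[dissimilar]
```

The smallest average similarity is at the left end, so the left-end lamp part region is the dissimilar region and blue is recognized as the lamp color.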
  • FIG. 12 is a flowchart showing a processing operation by the traffic signal recognition apparatus 100c according to the present modification.
  • the traffic signal recognition apparatus 100c executes the processing of steps S101 to S104, similarly to the processing operation of the flowchart shown in FIG.
  • the similarity calculation unit 103 of the traffic signal recognition apparatus 100c stores the similarity information indicating the similarity of each lamp unit area calculated in step S104 in the history holding unit 105 (step S104a).
• Next, based on the average similarities, the lamp color recognition unit 104c recognizes the color of the lit lamp part among the three lamp parts of the traffic light as the lamp color in the recognition target image.
• Specifically, the lamp color recognition unit 104c determines, based on the average similarities, whether there exists a dissimilar region, that is, a lamp part region that is not similar to any of the other lamp part regions among the three lamp part regions (step S105). If the lamp color recognition unit 104c determines that a dissimilar region exists (Yes in step S105), it recognizes the lamp color based on the position of the dissimilar region in the traffic light region Pb (step S106).
• In this modification, the lamp color recognition unit 104c calculates the average of the similarities of the lamp part region across the frames, but it may calculate a weighted average as the average similarity. For example, the closer a frame (captured image) is in time to the recognition target image, the greater the weight by which the similarity of the lamp part region in that frame is multiplied. This makes it possible to calculate a more appropriate average similarity for the recognition target image.
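The weighted-average variant mentioned above can be sketched as follows; the weights 1 to 4 (larger for frames closer in time to the recognition target image) are illustrative, not specified by the patent:

```python
# Sketch of the weighted average similarity: similarities from more
# recent frames are multiplied by larger weights before averaging.
def weighted_average_similarity(similarities, weights):
    return sum(s * w for s, w in zip(similarities, weights)) / sum(weights)
```

For the left-end similarities 30, 70, 107, 21 with weights 1, 2, 3, 4, this yields 57.5, which weights the nth frame's low similarity more heavily than the plain average does.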
  • the light color recognition unit 104c reads the history information from the history holding unit 105, thereby acquiring similarity information of the recognition target image included in the history information.
  • the light color recognition unit 104c may directly acquire the similarity information of the recognition target image from the similarity calculation unit 103.
  • the history information read from the history holding unit 105 by the lamp color recognition unit 104c does not include the similarity information of the recognition target image, and includes only the similarity information of the past captured images.
  • the lamp color recognition unit 104c has history information indicating similarity calculated for each of a plurality of captured images acquired before the recognition target image. Refer to This history information indicates, for each of a plurality of photographed images, the degree of similarity of the lamp section region at the position for each position in the traffic light region Pb in the photographed image. Then, for each position in the traffic light area, the lamp color recognition unit 104c calculates the average similarity of the lamp area at the position in the recognition target image and the plurality of images as the average similarity. Further, the lamp color recognizing unit 104c has a lamp unit region at a position corresponding to the smallest average similarity among the average similarity calculated for each of the N lamp unit regions in the recognition target image. It is determined that it exists as a dissimilar region.
  • Since the dissimilar region is determined based on the average similarity, which incorporates past similarities, it can be determined more accurately than by using the light colors recognized in the past.
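The averaging over history and the selection of the dissimilar region can be sketched as follows; the list-based data layout and the function name are assumptions for illustration.

```python
def find_dissimilar_region(history, current):
    """Return the index of the dissimilar (lit) lamp area.

    `history` holds, for each past captured image, a list of per-position
    similarities of the lamp areas in the traffic light region Pb;
    `current` holds the same for the recognition target image. For each
    position, the average similarity over all frames is computed, and the
    position with the smallest average is taken as the dissimilar region.
    """
    frames = history + [current]
    n_positions = len(current)
    averages = [
        sum(frame[i] for frame in frames) / len(frames)
        for i in range(n_positions)
    ]
    return min(range(n_positions), key=averages.__getitem__)
```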
  • the traffic light recognition device recognizes the light color of the traffic light.
  • However, for a traffic light having an arrow light portion, the traffic signal recognizing device may recognize not only the lamp color but also the traveling direction indicated by the lit arrow light portion.
  • Such an arrow-shaped lamp part is hereinafter referred to as a direction indicator lamp part, and a lamp part indicating a color such as blue, yellow, or red is hereinafter referred to as a color lamp part.
  • FIG. 13 is a diagram for explaining the recognition of the traveling direction by the traffic signal recognition device according to the present modification.
  • the traffic signal includes not only a blue color lamp unit, a yellow color lamp unit, and a red color lamp unit, but also a direction indicator lamp unit that indicates the right direction as the traveling direction.
  • The area extraction unit 101 of the traffic light recognition device extracts, from the captured image Pa, the area in which the three color lamp units and the one direction indicator lamp unit are projected as the traffic signal area Pb. Then, the area dividing unit 102 divides the traffic light area Pb, thereby generating the lamp part areas Pb1, Pb2, and Pb3, in which the three color lamp parts are respectively displayed, and the lamp part area Pb4, in which the direction indicator lamp part is displayed.
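As a rough sketch of the division performed by the area dividing unit 102, the traffic light area can be split into equal boxes; the horizontal, equal-width lamp arrangement and the function name are illustrative assumptions, not the disclosed implementation.

```python
def divide_traffic_light_area(width, height, n_lamps):
    """Split traffic light area Pb into n_lamps equal lamp part areas.

    Assumes the lamp parts are arranged horizontally with equal widths
    (an illustrative simplification). Returns (x, y, w, h) boxes ordered
    left to right, e.g. Pb1..Pb4 for a signal with an arrow lamp.
    """
    lamp_w = width // n_lamps
    return [(i * lamp_w, 0, lamp_w, height) for i in range(n_lamps)]
```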
  • First, the traffic signal recognizing device recognizes the light color of the traffic signal based on the lamp areas Pb1, Pb2, and Pb3, as in the above embodiment and the first to third modifications. Then, the traffic light recognition device determines that the direction indicator lamp unit is not lit if the lamp color is a predetermined color. For example, when the lamp color is blue or yellow, the traffic light recognition device determines that the direction indicator lamp unit is not lit. On the other hand, when the lamp color is red, the traffic light recognition device determines whether the direction indicator lamp unit is lit using the lamp unit region Pb4. For example, the traffic light recognition apparatus inputs the feature amount of the lamp part region Pb4 to a model, such as a neural network, generated by machine learning, and determines whether the direction indicator lamp unit is lit according to the output of the model.
  • Alternatively, the traffic light recognition device may calculate a correlation coefficient between a region other than the dissimilar region among the lamp unit regions Pb1, Pb2, and Pb3 (hereinafter referred to as a similar region) and the lamp unit region Pb4, and determine whether the correlation coefficient is greater than or equal to a threshold value. Since the similar region appears dark in the captured image, a correlation coefficient at or above the threshold suggests that the lamp area Pb4 is also dark and the direction indicator lamp section is likely off, whereas a correlation coefficient below the threshold suggests that the lamp area Pb4 is bright and the direction indicator lamp section is likely lit. Therefore, the traffic light recognition device determines that the direction indicator lamp unit is off when the correlation coefficient is greater than or equal to the threshold value, and that it is lit when the correlation coefficient is less than the threshold value.
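The correlation-based decision can be sketched as below; the use of a Pearson correlation over flattened pixel lists and the particular threshold value are illustrative assumptions.

```python
def correlation(a, b):
    """Pearson correlation coefficient of two equal-length pixel lists."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    var_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    return cov / (var_a * var_b)

def arrow_lamp_lit(similar_area, arrow_area, threshold=0.5):
    """The similar region is dark; if lamp area Pb4 correlates with it,
    Pb4 is probably dark too (arrow off), otherwise the arrow is likely
    lit. The threshold value is an illustrative assumption."""
    return correlation(similar_area, arrow_area) < threshold
```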
  • When the traffic light recognition device determines that the direction indicator lamp unit is lit as described above, it refers to the association information. Like the association information shown in FIG. 3, this association information indicates the colors associated with the positions of the lamp areas Pb1, Pb2, and Pb3, and additionally indicates the traveling direction associated with the position of the lamp area Pb4. The traffic signal recognition device recognizes the traveling direction associated in this information with the position of the lamp unit region Pb4 (for example, the right direction) as the traveling direction of the lit direction indicator lamp unit.
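A minimal sketch of such association information, extended with a traveling direction for the lamp area Pb4, is shown below; the concrete mapping values are illustrative assumptions in the spirit of FIG. 3.

```python
# association information: lamp area position -> recognized meaning
# (values are illustrative assumptions, not the disclosed table)
ASSOCIATION_INFO = {
    "Pb1": ("color", "blue"),
    "Pb2": ("color", "yellow"),
    "Pb3": ("color", "red"),
    "Pb4": ("direction", "right"),
}

def recognize_meaning(lamp_area_position):
    """Return what a lit lamp area at this position means."""
    return ASSOCIATION_INFO[lamp_area_position]
```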
  • A traffic signal having a plurality of direction indicator lamp units is also found on roads. In such a traffic light, only one of the direction indicator lamp units may be lit, or two or more direction indicator lamp units may be lit simultaneously.
  • Even in this case, the traffic light recognition device first recognizes the light color of the traffic light based on the respective lamp areas of the N lamp sections, as in the above embodiment and the first to third modifications. Then, the traffic signal recognizing device narrows down the candidates for the lit direction indicator lamp units from among the M (M is an integer of 1 or more) direction indicator lamp units based on the lamp color. For example, if the lamp color is blue, the traffic light recognition device narrows the candidates to the direction indicator lamp unit indicating the straight direction, and if the lamp color is red, it narrows the candidates to the direction indicator lamp unit indicating the right direction. Then, similarly to the above, the traffic signal recognizing device determines whether each candidate direction indicator lamp unit is lit using a model or a correlation coefficient.
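The narrowing step can be sketched as a lookup from lamp color to the permissible direction indicator lamps; the mapping values follow the example in the text, and the data layout is otherwise an assumption.

```python
# which direction indicator lamps may be lit for each lamp color
# (values follow the example in the text; otherwise an assumption)
CANDIDATES_BY_COLOR = {
    "blue": {"straight"},
    "red": {"right"},
}

def narrow_candidates(lamp_color, direction_lamps):
    """Narrow the M direction indicator lamps to the candidates that can
    be lit given the recognized lamp color."""
    allowed = CANDIDATES_BY_COLOR.get(lamp_color, set())
    return [d for d in direction_lamps if d in allowed]
```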
  • In other words, when the traffic light has M (M is an integer of 1 or more) direction indicator lamp units, each of which indicates a traveling direction of the vehicle, the traffic signal recognition device recognizes the traveling direction indicated by the lit direction indicator lamp unit. That is, the area dividing unit 102 divides the traffic light area Pb, thereby generating the N lamp part areas and M lamp part areas in which the direction indicator lamp parts are respectively projected. The traffic light recognition device then recognizes, based on the lamp color recognized for the recognition target image and the feature quantities of the M lamp unit regions, the traveling direction indicated by at least one direction indicator lamp unit that is lit among the M direction indicator lamp units.
  • Alternatively, the traffic light recognition device may first recognize the traveling direction indicated by at least one lit direction indication lamp unit among the M direction indication lamp units, and then recognize the light color of the color lamp units based on that recognition result.
  • The above-described feature amount may be the vector described in the above embodiment or a correlation coefficient.
  • Thereby, the traveling direction indicated by a lit direction indicator lamp part can also be recognized. For example, when the direction indicator lamp units that may be lit are predetermined for each lamp color of the traffic light, the candidates for the lit direction indicator lamp unit can be narrowed down from the M direction indicator lamp units according to the recognized lamp color. Further, for example, by inputting the feature amounts of the M lamp areas into a model such as a neural network generated by machine learning, the lit direction indicator lamp unit can be properly recognized from among the narrowed candidates.
  • FIG. 14 is a diagram illustrating an example of a method of calculating the similarity of the first lamp part region Pb1 in the above embodiment.
  • The similarity calculation unit 103 divides the first lamp unit region Pb1 into a plurality of blocks each composed of n × n pixels, as described above. Then, for each block, the similarity calculation unit 103 searches the second lamp part area Pb2 for the block most similar to the image of that block.
  • a block is also called a patch.
  • For example, as shown in (a) of FIG. 14, the similarity calculation unit 103 searches the second lamp part area Pb2 for the block most similar to the image of the block B11 at the upper left corner of the first lamp part area Pb1. Then, the similarity calculation unit 103 calculates the inter-vector distance (for example, "5") between the block B11 and the most similar block found in the second lamp unit region Pb2. A larger inter-vector distance indicates that the two blocks are less similar, and a smaller inter-vector distance indicates that they are more similar. Therefore, the inter-vector distance can be regarded as a dissimilarity.
  • Next, as shown in (b) of FIG. 14, the similarity calculation unit 103 performs the same processing on the block B12 next to the block B11. That is, the similarity calculation unit 103 searches the second lamp part area Pb2 for the block most similar to the image of the block B12, the second block from the upper left end of the first lamp part area Pb1 toward the right. Then, the similarity calculation unit 103 calculates the inter-vector distance between the block B12 and the most similar block found in the second lamp unit region Pb2.
  • By repeatedly executing the processes shown in (a) and (b) of FIG. 14, the similarity calculation unit 103 calculates an inter-vector distance (that is, a dissimilarity) for each block included in the first lamp area Pb1, as shown in (c) of FIG. 14.
  • Then, the similarity calculation unit 103 calculates the average or sum of the dissimilarities of these blocks, and calculates the reciprocal of the result as the correlation coefficient k12 of the first lamp part region Pb1 with respect to the second lamp part region Pb2. Similarly, the similarity calculation unit 103 calculates the correlation coefficient k13 of the first lamp part region Pb1 with respect to the third lamp part region Pb3, and calculates the larger of the correlation coefficients k12 and k13 as the similarity k1 of the first lamp area Pb1.
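The block-matching procedure of FIG. 14 can be sketched as follows; blocks are represented as flattened pixel lists, and the L1 metric and the epsilon guard against division by zero are implementation assumptions.

```python
def block_distance(b1, b2):
    """Inter-vector distance between two flattened pixel blocks; larger
    means less similar (a dissimilarity). L1 is an illustrative metric."""
    return sum(abs(x - y) for x, y in zip(b1, b2))

def best_match_distance(block, search_blocks):
    """Distance to the most similar block found in the search target."""
    return min(block_distance(block, c) for c in search_blocks)

def correlation_coefficient(src_blocks, dst_blocks):
    """Correlation coefficient of one lamp area against another: the
    reciprocal of the average per-block dissimilarity (the epsilon is an
    implementation assumption)."""
    avg = sum(best_match_distance(b, dst_blocks) for b in src_blocks) / len(src_blocks)
    return 1.0 / (avg + 1e-6)

def similarity_k1(pb1_blocks, pb2_blocks, pb3_blocks):
    """Similarity k1 of the first lamp area: the larger of k12 and k13."""
    k12 = correlation_coefficient(pb1_blocks, pb2_blocks)
    k13 = correlation_coefficient(pb1_blocks, pb3_blocks)
    return max(k12, k13)
```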
  • In this way, the correlation coefficient k12 of the first lamp part area Pb1 with respect to the second lamp part area Pb2 is calculated as the first similarity by comparing the first lamp part area Pb1 with the second lamp part area Pb2. Likewise, the correlation coefficient k13 of the first lamp part area Pb1 with respect to the third lamp part area Pb3 is calculated as the second similarity. The larger of the first similarity and the second similarity is then taken as the similarity of the first lamp area Pb1. In this calculation, the first lamp part area Pb1 is compared individually with each of the second lamp part area Pb2 and the third lamp part area Pb3. The similarities of the second lamp part area Pb2 and the third lamp part area Pb3 are calculated in the same manner as the similarity of the first lamp part area Pb1.
  • In the present modification, when the similarity calculation unit 103 calculates the similarity of the first lamp part area Pb1, it does not compare the first lamp part area Pb1 individually with each of the second lamp part area Pb2 and the third lamp part area Pb3. Instead, the similarity calculation unit 103 calculates the similarity of the first lamp part area Pb1 by comparing it with the group consisting of the second lamp part area Pb2 and the third lamp part area Pb3.
  • FIG. 15 is a diagram illustrating an example of a method of calculating the similarity of each lamp area according to the present modification.
  • When the similarity calculation unit 103 calculates the similarity of the first lamp unit area Pb1, as shown in (a) of FIG. 15, it compares the first lamp unit area Pb1 with the first group G1 consisting of the second lamp unit area Pb2 and the third lamp unit area Pb3. That is, the similarity calculation unit 103 divides the first lamp unit area Pb1 into a plurality of blocks and, for each block of the first lamp unit area Pb1, searches the first group G1 for the block most similar to the image of that block. Then, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the first lamp area Pb1 through this search and, based on those inter-vector distances, calculates the correlation coefficient of the first lamp area Pb1 as its similarity.
  • Likewise, when calculating the similarity of the second lamp unit area Pb2, as shown in (b) of FIG. 15, the similarity calculation unit 103 compares the second lamp unit area Pb2 with the second group G2 consisting of the first lamp unit area Pb1 and the third lamp unit area Pb3. That is, the similarity calculation unit 103 divides the second lamp unit region Pb2 into a plurality of blocks and, for each block of the second lamp unit region Pb2, searches the second group G2 for the block most similar to the image of that block. As in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the second lamp part region Pb2 through this search and, based on those distances, calculates the correlation coefficient of the second lamp area Pb2 as its similarity.
  • Similarly, when calculating the similarity of the third lamp unit area Pb3, as shown in (c) of FIG. 15, the similarity calculation unit 103 compares the third lamp unit area Pb3 with the third group G3 consisting of the first lamp unit area Pb1 and the second lamp unit area Pb2. That is, the similarity calculation unit 103 divides the third lamp unit region Pb3 into a plurality of blocks and, for each block of the third lamp unit region Pb3, searches the third group G3 for the block most similar to the image of that block. Then, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the third lamp part region Pb3 through this search and, based on those distances, calculates the correlation coefficient of the third lamp area Pb3 as its similarity.
  • In the calculation method shown in FIG. 15 as well, the respective similarities of the first lamp area Pb1, the second lamp area Pb2, and the third lamp area Pb3 are calculated using groups. For example, if the lamp part projected in the first lamp part area Pb1 is lit and the other lamp parts are off, a low similarity is calculated only for the first lamp part area Pb1, and a high similarity is calculated for each of the second lamp part area Pb2 and the third lamp part area Pb3. Therefore, even with the calculation method shown in FIG. 15, an appropriate similarity can be calculated for each lamp area, as in the above embodiment.
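The group-based variant of FIG. 15 can be sketched as follows; a lit lamp area, having no good match in the group of the other (unlit) areas, receives a low similarity. The block representation, the L1 metric, and the epsilon are illustrative assumptions.

```python
def _block_distance(b1, b2):
    # inter-vector (L1) distance between flattened blocks (assumption)
    return sum(abs(x - y) for x, y in zip(b1, b2))

def group_similarity(target_blocks, other_areas):
    """Similarity of one lamp area against the group of the other lamp
    areas: each target block searches the whole group for its most
    similar block, and the reciprocal of the average distance is the
    correlation coefficient of the target area."""
    group = [b for area in other_areas for b in area]
    avg = sum(
        min(_block_distance(b, g) for g in group) for b in target_blocks
    ) / len(target_blocks)
    return 1.0 / (avg + 1e-6)
```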
  • FIG. 16 is a diagram showing another example of a method for calculating the similarity of each lamp section according to the present modification.
  • In the example shown in FIG. 15, the lamp section area that is the target of similarity calculation is divided into a plurality of blocks; conversely, the group may be divided into a plurality of blocks.
  • When the similarity calculation unit 103 calculates the similarity of the third lamp part region Pb3, as shown in (a) of FIG. 16, it compares the third group G3 consisting of the first lamp part region Pb1 and the second lamp part region Pb2 with the third lamp part region Pb3. That is, the similarity calculation unit 103 divides the third group G3 into a plurality of blocks and, for each block of the third group G3, searches the third lamp unit region Pb3 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the third group G3 through this search. The similarity calculation unit 103 then calculates the correlation coefficient of the third group G3 as the similarity of the third lamp part area Pb3.
  • Similarly, when calculating the similarity of the second lamp area Pb2, as shown in (b) of FIG. 16, the similarity calculation section 103 compares the second group G2 consisting of the third lamp section area Pb3 and the first lamp section area Pb1 with the second lamp area Pb2. That is, the similarity calculation unit 103 divides the second group G2 into a plurality of blocks and, for each block of the second group G2, searches the second lamp unit region Pb2 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the second group G2 through this search, and calculates the correlation coefficient of the second group G2 as the similarity of the second lamp part area Pb2.
  • Likewise, when calculating the similarity of the first lamp part area Pb1, as shown in (c) of FIG. 16, the similarity calculation unit 103 compares the first group G1 consisting of the third lamp part area Pb3 and the second lamp part area Pb2 with the first lamp area Pb1. That is, the similarity calculation unit 103 divides the first group G1 into a plurality of blocks and, for each block of the first group G1, searches the first lamp unit region Pb1 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the first group G1 through this search, and calculates the correlation coefficient of the first group G1 as the similarity of the first lamp part area Pb1.
  • Even in the calculation method shown in FIG. 16, the respective similarities of the first lamp part area Pb1, the second lamp part area Pb2, and the third lamp part area Pb3 are calculated using groups. For example, if the lamp part projected in the first lamp part area Pb1 is lit and the other lamp parts are off, a low similarity is calculated only for the first lamp part area Pb1, and a high similarity is calculated for each of the second lamp part area Pb2 and the third lamp part area Pb3. Therefore, even with the calculation method shown in FIG. 16, an appropriate similarity can be calculated for each lamp area, as in the above embodiment.
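The reversed variant of FIG. 16 (dividing the group into blocks and searching within the single lamp area) can be sketched as follows, under the same illustrative assumptions as the earlier sketches.

```python
def _block_distance(b1, b2):
    # inter-vector (L1) distance between flattened blocks (assumption)
    return sum(abs(x - y) for x, y in zip(b1, b2))

def reversed_group_similarity(group_areas, target_blocks):
    """FIG. 16 style: each block of the group searches the single lamp
    area for its most similar block; the reciprocal of the average
    distance is the similarity assigned to that lamp area."""
    group = [b for area in group_areas for b in area]
    avg = sum(
        min(_block_distance(g, t) for t in target_blocks) for g in group
    ) / len(group)
    return 1.0 / (avg + 1e-6)
```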
  • In this example, the traffic light area of the recognition target image obtained by photographing is divided into four lamp part areas, namely the first lamp part area Pb1, the second lamp part area Pb2, the third lamp part area Pb3, and the fourth lamp part area Pb4.
  • When the similarity calculation unit 103 calculates the similarity of the fourth lamp part area Pb4, as shown in (a) of FIG. 17, it compares the fourth group Gp4 consisting of the first lamp part area Pb1, the second lamp part area Pb2, and the third lamp part area Pb3 with the fourth lamp part area Pb4. That is, the similarity calculation unit 103 divides the fourth group Gp4 into a plurality of blocks and, for each block of the fourth group Gp4, searches the fourth lamp unit region Pb4 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the fourth group Gp4 through this search, and calculates the correlation coefficient of the fourth group Gp4 as the similarity of the fourth lamp part area Pb4.
  • Similarly, when calculating the similarity of the first lamp area Pb1, as shown in (b) of FIG. 17, the similarity calculation unit 103 compares the first group Gp1 consisting of the second lamp area Pb2, the third lamp area Pb3, and the fourth lamp area Pb4 with the first lamp part area Pb1. That is, the similarity calculation unit 103 divides the first group Gp1 into a plurality of blocks and, for each block of the first group Gp1, searches the first lamp unit region Pb1 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the first group Gp1 through this search, and calculates the correlation coefficient of the first group Gp1 as the similarity of the first lamp part area Pb1.
  • Likewise, when calculating the similarity of the second lamp part area Pb2, as shown in (c) of FIG. 17, the similarity calculation unit 103 compares the second group Gp2 consisting of the third lamp part area Pb3, the fourth lamp part area Pb4, and the first lamp part area Pb1 with the second lamp area Pb2. That is, the similarity calculation unit 103 divides the second group Gp2 into a plurality of blocks and, for each block of the second group Gp2, searches the second lamp unit region Pb2 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the second group Gp2 through this search, and calculates the correlation coefficient of the second group Gp2 as the similarity of the second lamp part area Pb2.
  • Finally, when calculating the similarity of the third lamp part area Pb3, as shown in (d) of FIG. 17, the similarity calculation unit 103 compares the third group Gp3 consisting of the fourth lamp part area Pb4, the first lamp part area Pb1, and the second lamp part area Pb2 with the third lamp part area Pb3. That is, the similarity calculation unit 103 divides the third group Gp3 into a plurality of blocks and, for each block of the third group Gp3, searches the third lamp unit region Pb3 for the block most similar to the image of that block. Next, as in the example shown in FIG. 14, the similarity calculation unit 103 obtains the inter-vector distance of each block included in the third group Gp3 through this search, and calculates the correlation coefficient of the third group Gp3 as the similarity of the third lamp part area Pb3.
  • In this way, even when the traffic light has four lamp parts, the similarities of the lamp part areas are calculated using groups. For example, if the lamp part projected in the first lamp part area Pb1 is lit and the other lamp parts are off, a low similarity is calculated only for the first lamp part area Pb1, and a high similarity is calculated for each of the second lamp part area Pb2, the third lamp part area Pb3, and the fourth lamp part area Pb4. Therefore, even with the calculation method shown in FIG. 17, an appropriate similarity can be calculated for each lamp area, as in the above embodiment.
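Generalizing the FIG. 17 procedure to any number of lamp areas, the dissimilar (lit) area is the one with the smallest group-based similarity; the sketch below uses the same illustrative block representation and metric assumptions as the earlier examples.

```python
def _block_distance(b1, b2):
    # inter-vector (L1) distance between flattened blocks (assumption)
    return sum(abs(x - y) for x, y in zip(b1, b2))

def lit_lamp_index(areas):
    """For each of the N lamp areas, compare the group of the other
    (N - 1) areas against that area (FIG. 17 style) and return the index
    of the area with the smallest similarity, i.e. the dissimilar area."""
    similarities = []
    for i, target in enumerate(areas):
        group = [b for j, area in enumerate(areas) if j != i for b in area]
        avg = sum(
            min(_block_distance(g, t) for t in target) for g in group
        ) / len(group)
        similarities.append(1.0 / (avg + 1e-6))
    return min(range(len(areas)), key=similarities.__getitem__)
```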
  • As described above, in the present modification, for each of the N lamp unit regions, the similarity calculation unit 103 calculates the similarity of the lamp unit region by comparing the image of the lamp unit region with the image of a group consisting of at least two other lamp unit regions. Specifically, for each of the N lamp unit regions, the similarity calculation unit 103 calculates the similarity of the lamp unit region by searching the group, for each block included in the lamp unit region, for the block most similar to the image of that block. Alternatively, for each of the N lamp unit regions, the similarity calculation unit 103 compares the image of the lamp unit region with the image of the group by searching the lamp unit region, for each block included in the group, for the block most similar to the image of that block. This saves the trouble of comparing the lamp unit region individually with each of the other lamp unit regions. That is, by comparing the lamp unit region with a group of the (N - 1) lamp unit regions other than that region, its similarity can be calculated easily. Furthermore, in this calculation of the similarity, image regions having different sizes (for example, a lamp unit region and a group) can be compared appropriately.
  • Each component included in the traffic signal recognition apparatus may be configured by dedicated hardware, or may be realized by executing a software program suitable for the component. Each component may be realized by a program execution unit, such as a CPU or a processor, reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • Here, the software that realizes the traffic signal recognition apparatus of the embodiment and each modification causes a computer to execute the steps included in the flowcharts shown in FIGS. 4, 6, 9, and 12.
  • As described above, the traffic signal recognizing device according to one or more aspects has been described based on the embodiment and the modifications, but the present disclosure is not limited to them. Forms obtained by applying various modifications conceived by those skilled in the art to the embodiment or the modifications, and forms constructed by combining components of the embodiment and the modifications, may also be included within the scope of the present disclosure, as long as they do not depart from the gist of the present disclosure.
  • For example, in the embodiment and each modification, the area extraction unit 101 geometrically determines the traffic light area Pb in the captured image Pa using the position of the vehicle 10, the position and form of the traffic light, and the map information. However, the area extraction unit 101 may detect the traffic signal area Pb using another detection method. For example, the area extraction unit 101 may detect the traffic light area Pb by image recognition based on machine learning, or by pattern matching.
  • the number (N) of the lamp units may be any number as long as it is three or more.
  • the colors of the N lamp parts may be colors other than blue, yellow, and red, and two or more of the N lamp parts may be the same color.
  • The traffic signal recognition apparatus shown in each of FIGS. 1, 5, 7, and 10 may be configured by a processor and a memory. That is, among the constituent elements of the traffic signal recognition apparatus, those other than the history holding unit 105 are realized by the processor, which implements them by executing a program stored in the memory. The memory may also be configured as the history holding unit 105; that is, the memory may store the program for controlling the processor and may store the history information.
  • the present disclosure can be used for a traffic signal recognition device that is mounted on, for example, an automatic traveling vehicle and recognizes a traffic signal on a traveling route of the vehicle.


Abstract

Provided is a traffic light recognition device capable of relaxing restrictions on the imaging of a traffic light. A traffic light recognition device (100) includes a processor and a memory. Using the memory, the processor extracts, as a traffic light area, an area in which a traffic light having N lamp units is shown from a recognition target image acquired by a sensor, and calculates, from the traffic light area, the degree of similarity of a plurality of images in which mutually different lamp units are shown.
PCT/JP2019/010260 2018-03-14 2019-03-13 Dispositif de reconnaissance de signal de circulation, procédé de reconnaissance de signal de circulation, et programme WO2019177019A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018047116 2018-03-14
JP2018-047116 2018-03-14

Publications (1)

Publication Number Publication Date
WO2019177019A1 true WO2019177019A1 (fr) 2019-09-19

Family

ID=67907774

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/010260 WO2019177019A1 (fr) 2018-03-14 2019-03-13 Dispositif de reconnaissance de signal de circulation, procédé de reconnaissance de signal de circulation, et programme

Country Status (1)

Country Link
WO (1) WO2019177019A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003016429A (ja) * 2001-06-28 2003-01-17 Honda Motor Co Ltd 車両周辺監視装置
JP2013186507A (ja) * 2012-03-05 2013-09-19 Honda Motor Co Ltd 車両周辺監視装置
JP2013242686A (ja) * 2012-05-21 2013-12-05 Nissan Motor Co Ltd 信号機検出装置及び信号機検出方法
JP2017022492A (ja) * 2015-07-08 2017-01-26 オムロン株式会社 画像処理装置およびこれを備えた交通管理システム、画像処理方法



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19767081

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19767081

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP