WO2023021726A1 - Sign recognition device and sign recognition method - Google Patents

Sign recognition device and sign recognition method

Info

Publication number
WO2023021726A1
Authority
WO
WIPO (PCT)
Prior art keywords
sign
candidate
recognition
registered
image
Prior art date
Application number
PCT/JP2022/004760
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
ユイビン ツーン
雄飛 椎名
健 永崎
Original Assignee
Hitachi Astemo, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Astemo, Ltd.
Priority to DE112022002739.8T (published as DE112022002739T5)
Priority to JP2023542187A (published as JPWO2023021726A1)
Publication of WO2023021726A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/56 Extraction of image or video features relating to colour
    • G06V 10/62 Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; pattern tracking
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V 20/582 Recognition of traffic signs
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions

Definitions

  • the present invention relates to a sign recognition device and a sign recognition method used in an in-vehicle sensing device such as a stereo camera device.
  • A stereo camera device, which is a type of in-vehicle sensing device, simultaneously obtains visual information from images and distance information of the objects in those images. It can therefore grasp the environment around the vehicle (three-dimensional objects, road surfaces, road signs, signboards, etc.) in detail, greatly contributing to improving the safety of automatic driving control and driving support control.
  • Some vehicles equipped with a stereo camera device use the recognized traffic signs for acceleration and deceleration control of the vehicle.
  • In Euro NCAP, which is an evaluation index for advanced driver assistance systems, speed assistance systems (SAS) include an automatic speed control function (ISA: Intelligent Speed Assistance). ISA requires recognition of conditional speed limits such as valid sections, time limits, and vehicle-type limits, so there is a demand for expanding the range of recognizable traffic signs.
  • Patent Document 1 is cited as a conventional technology that focuses on improving the recognition performance of deregulation signs (signs that release regulations such as speed limits and overtaking prohibitions).
  • Patent Document 1 provides a camera device capable of improving detection accuracy: its arithmetic processing unit searches the image for diagonal line candidates (diagonal line search 504), selects the diagonal line candidate of the deregulation sign from the detected candidates (diagonal line determination 505), and identifies the deregulation sign from the image of the selected diagonal line candidate (identification process 506).
  • The present invention has been made in view of the above problems, and its object is to provide a sign recognition device and a sign recognition method that suppress erroneous detection of image patterns resembling the design of a deregulation sign, such as poles, utility poles, and tree branches, as deregulation signs that do not actually exist.
  • In order to solve the above problems, the sign recognition device of the present invention includes: a sign candidate recognition unit that recognizes a sign candidate of a predetermined type from an image; a paired sign search unit that, when a first sign is recognized as the sign candidate of the predetermined type, associates the first sign with a second sign installed in correspondence with the first sign; a color information extraction unit that extracts color information of the second sign associated with the first sign; and a threshold determination unit that determines a threshold used when extracting color information of a sign candidate.
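  • As a rough illustration of how these units could cooperate, the following Python sketch models the data flow described above. It is a minimal sketch only: the class names, the position-based pairing rule, and the numeric values are illustrative assumptions, not details disclosed in the publication.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Sign:
    kind: str            # e.g. "deregulation" (first sign) or "speed_limit" (second sign)
    position_m: float    # position along the road (placeholder)
    color_score: float   # e.g. blue score of a deregulation candidate, red score of a speed limit sign

def recognize_sign_candidates(detections: List[Sign]) -> List[Sign]:
    """Sign candidate recognition unit: keep only candidates of the predetermined type."""
    return [d for d in detections if d.kind == "deregulation"]

def search_paired_sign(first: Sign, tracked: List[Sign]) -> Optional[Sign]:
    """Paired sign search unit: associate the first sign with a second sign installed near it."""
    nearby = [s for s in tracked if s.kind != "deregulation"
              and abs(s.position_m - first.position_m) < 2.0]
    return min(nearby, key=lambda s: abs(s.position_m - first.position_m), default=None)

def extract_color_information(second: Sign) -> float:
    """Color information extraction unit: color score of the second (paired) sign."""
    return second.color_score

def determine_threshold(pair_color_score: float) -> float:
    """Threshold determination unit: make the acceptance threshold follow the paired sign's score."""
    return 0.5 * pair_color_score

# Toy usage: a deregulation candidate installed on the same post as a speed limit sign.
tracked = [Sign("speed_limit", 10.0, 0.80), Sign("deregulation", 10.5, 0.45)]
for cand in recognize_sign_candidates(tracked):
    pair = search_paired_sign(cand, tracked)
    if pair is not None:
        threshold = determine_threshold(extract_color_information(pair))
        print(cand.kind, "accepted" if cand.color_score >= threshold else "rejected")
```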
  • According to the sign recognition device and the sign recognition method of the present invention, it is possible to suppress false detection of image patterns that resemble the design of a deregulation sign, such as poles, utility poles, and tree branches, as deregulation signs that do not actually exist.
  • FIG. 1 is a functional block diagram showing a schematic configuration of a stereo camera device of one embodiment.
  • FIG. 2 is a flowchart of the basic processing of the stereo camera device of one embodiment.
  • FIG. 3 is a flowchart of the sign recognition processing of the sign recognition device of one embodiment.
  • FIG. 4 is a flowchart of the circular object extraction processing of the sign recognition device of one embodiment.
  • FIG. 5A shows the image pattern of a speed limit sign, and FIG. 5B is an explanatory diagram of the center estimation processing and radius estimation processing applied to the edge image of the speed limit sign.
  • FIG. 6 is a flowchart of the valid straight line search processing of the sign recognition device of one embodiment.
  • FIG. 7A is an explanatory diagram of the left and right edge search processing in FIG. 6, and FIG. 7B is an explanatory diagram of the straight line detection processing in FIG. 6.
  • FIG. 7C is an explanatory diagram of the valid straight line determination processing in FIG. 6.
  • FIG. 1 is a functional block diagram showing a schematic configuration of a stereo camera device 100 according to one embodiment of the invention.
  • The stereo camera device 100 is a type of in-vehicle sensing device mounted on a vehicle that executes automatic driving control and driving support control, and it recognizes the environment outside the vehicle (white lines, pedestrians, other vehicles, other three-dimensional objects, traffic lights, traffic signs, lighting lamps, etc.) based on image information of the imaging target area in front of the vehicle.
  • In addition, the stereo camera device 100 determines a control policy (braking, steering, etc.) for the own vehicle according to the environment outside the vehicle and outputs the control policy to an ECU (Electronic Control Unit) via the in-vehicle network CAN (Controller Area Network). The ECU then controls the braking system and steering system of the own vehicle according to the control policy from the stereo camera device 100, thereby safely decelerating the own vehicle and causing it to avoid obstacles.
  • the stereo camera device 100 of this embodiment has a camera 1 and a sign recognition device 2.
  • The camera 1 is composed of a left camera 1L and a right camera 1R arranged side by side, and outputs a pair of left and right images (left image PL and right image PR) synchronously photographed in front of the vehicle.
  • the sign recognition device 2 recognizes traffic signs in the image captured by the camera 1.
  • the sign recognition device 2 includes a processor such as a CPU (Central Processing Unit), a memory such as a ROM (Read Only Memory), a RAM (Random Access Memory), and an SSD (Solid State Drive).
  • Each function of the sign recognition device 2 is realized by the processor executing a program stored in the ROM.
  • the RAM and SSD store data such as intermediate data for computation by the program and images of the camera 1 .
  • Note that the stereo camera device 100 is capable of recognizing aspects of the environment outside the vehicle other than traffic signs (white lines on the road, pedestrians, other vehicles, etc.). However, since this embodiment focuses on the sign recognition function, the device 2, which could more generally be called an environment recognition device, is referred to as the sign recognition device 2 in this embodiment.
  • The sign recognition device 2 includes an image input interface 21, an image processing unit 22, an arithmetic processing unit 23, a storage unit 24, a CAN interface 25, a monitoring processing unit 26, and an internal bus 27 that interconnects them. Each unit is outlined below.
  • the image input interface 21 is an interface that controls the imaging device of the camera 1 and captures the captured image.
  • the image captured by the image input interface 21 is transmitted to the image processing section 22 and the arithmetic processing section 23 via the internal bus 27 .
  • The image processing unit 22 compares the left image PL captured by the imaging element of the left camera 1L with the right image PR captured by the imaging element of the right camera 1R, performs image correction such as correcting device-specific deviations and interpolating noise, and stores the corrected left and right images in the storage unit 24. The image processing unit 22 further extracts mutually corresponding portions between the corrected left and right images, calculates distance information for each pixel on the image based on the parallax between the corresponding portions, and stores the calculated distance information in the storage unit 24.
  • the arithmetic processing unit 23 uses the corrected left and right images and the distance information stored in the storage unit 24 to recognize various objects necessary for perceiving the environment around the vehicle.
  • Various objects recognized here include, for example, pedestrians, other vehicles, other obstacles, traffic lights, traffic signs, tail lamps and headlights of vehicles, and the like. Some of these recognition results and intermediate calculation results are recorded in the storage unit 24 . Furthermore, the arithmetic processing unit 23 determines a control strategy for the own vehicle using the recognition results of various objects.
  • the storage unit 24 stores data during or after processing by the image processing unit 22 and the arithmetic processing unit 23 . Further, various information necessary for recognizing various objects is registered in advance in the storage unit 24 . Note that the storage unit 24 is specifically a memory such as the above-described RAM or SSD, and includes an image buffer 24a and a discriminator 24b, which will be described later.
  • the CAN interface 25 is an interface that transmits various objects recognized by the arithmetic processing unit 23 and vehicle control policies determined by the arithmetic processing unit 23 to the in-vehicle network CAN.
  • the ECU connected to the in-vehicle network CAN executes braking of the own vehicle, warning to the driver, etc., based on the object recognition result and control policy transmitted via the in-vehicle network CAN.
  • the monitoring processing unit 26 monitors whether any of the above-described units are operating abnormally, or whether an error has occurred during data transfer, and is a mechanism for preventing abnormal operations.
  • In step S1L, the image processing unit 22 causes the left camera 1L to capture the left image PL.
  • In step S2L, the image processing unit 22 performs processing such as image correction that absorbs the unique characteristics of the imaging element on the captured left image PL, and stores the corrected left image PL in the image buffer 24a of the storage unit 24.
  • In step S1R, the image processing unit 22 causes the right camera 1R to capture the right image PR.
  • In step S2R, the image processing unit 22 performs processing such as image correction that absorbs the unique characteristics of the imaging element on the captured right image PR, and stores the corrected right image PR in the image buffer 24a of the storage unit 24.
  • In step S3, the image processing unit 22 compares the corrected left image PL and right image PR stored in the image buffer 24a and calculates the parallax between the left and right images.
  • The parallax indicates where a given point on an object appears in each of the left and right images, so the distance to the object can be calculated by the principle of triangulation. Note that the distance information obtained in this step is also stored in the storage unit 24.
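  • For reference, the triangulation mentioned here reduces to the pinhole-stereo relation Z = f * B / d (distance = focal length times baseline divided by disparity). Below is a minimal sketch; the focal length and baseline values are illustrative assumptions, not parameters of the device described here.

```python
def disparity_to_distance(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Distance Z from stereo disparity via triangulation: Z = f * B / d (pinhole camera model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example with assumed parameters: focal length 1400 px, baseline 0.35 m, disparity 20 px -> 24.5 m.
print(disparity_to_distance(20.0, focal_px=1400.0, baseline_m=0.35))
```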
  • In step S4, the arithmetic processing unit 23 recognizes various objects using the corrected left and right images and the distance information stored in the image buffer 24a.
  • Objects to be recognized include pedestrians, other vehicles, other three-dimensional objects, traffic signs, traffic lights, tail lamps, and the like.
  • the discriminator 24b is used as necessary for various object recognitions in this step.
  • the discriminator 24b stores and records, for example, features of an object to be recognized as machine learning data. Details of the sign recognition processing (step S40) executed in this step will be described later.
  • In step S5, the arithmetic processing unit 23 determines a vehicle control policy in consideration of the recognition results of the various objects in step S4 and the state of the own vehicle (speed, steering angle, etc.). For example, if the own vehicle exceeds the speed limit, a control policy such as issuing a warning to the occupants or braking the own vehicle is determined. In this embodiment, as described below, the arithmetic processing unit 23 determines control of the own vehicle based on information on the sign recognized in step S4.
  • In step S6, the CAN interface 25 outputs the various objects recognized in step S4 and the control policy of the own vehicle determined in step S5 to the external ECU through the in-vehicle network CAN.
  • the ECU can issue a warning to the occupants or brake the vehicle in accordance with the control policy determined by the stereo camera device 100 .
  • Next, the sign recognition processing (step S40), which is one aspect of the various object recognition processing (step S4) in FIG. 2, will be described using the flowchart in FIG. 3.
  • In step S41, the arithmetic processing unit 23 extracts an image pattern including a circular object from the post-correction image in the image buffer 24a.
  • This processing is divided into edge image generation processing (step S41a), center estimation processing (step S41b), and radius estimation processing (step S41c). Each process will be described in turn.
  • In the edge image generation processing (step S41a), the arithmetic processing unit 23 generates an edge image PE1 by extracting the edge components of the post-correction image in the image buffer 24a. For example, if the image pattern of a speed limit sign as shown in FIG. 5A appears in part of the post-correction image, an edge image PE1 as shown in FIG. 5B is generated in this step.
  • In the center estimation processing (step S41b), the arithmetic processing unit 23 estimates the center of the circular object. Specifically, as shown in FIG. 5B, the arithmetic processing unit 23 draws a line segment Ln in the normal direction from each edge of the edge image PE1, and assumes a center C at a point where a certain number or more of intersections of the line segments Ln overlap. In the example of FIG. 5B, the center C is estimated to lie between the numeral "11" and the numeral "0".
  • In the radius estimation processing (step S41c), the arithmetic processing unit 23 estimates the radius of the circular object based on a histogram of the edge image PE1. By counting the edge pixels of the edge image PE1 according to their distance from the center C, the histogram of the circular object in FIG. 5B can be generated. Peaks appear at the horizontal-axis positions corresponding to the distance from the center C to the first edge group E1 and the distance from the center C to the second edge group E2, so it can be assumed that both the first edge group E1 and the second edge group E2 are substantially annular. The arithmetic processing unit 23 then estimates that the second edge group E2, which has the larger radius, corresponds to the outer circumference of the circular object.
  • Through the above processing, the arithmetic processing unit 23 can extract a circular object and its features (center position and radius) from the corrected image in the image buffer 24a.
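  • The following is a simplified sketch of the center-voting and radius-histogram idea described for step S41, written in Python with NumPy. The gradient-walking loop, the voting scheme, and all parameter values are illustrative assumptions; the publication does not disclose the implementation at this level of detail.

```python
import numpy as np

def estimate_circle(edge_points, normals, img_shape, max_radius=60):
    """Vote along each edge normal for a circle center, then take a radial histogram of the edges.

    edge_points: (N, 2) array of (row, col) edge coordinates.
    normals:     (N, 2) array of unit normal vectors at each edge point.
    """
    votes = np.zeros(img_shape, dtype=np.int32)
    for (r, c), (nr, nc) in zip(edge_points, normals):
        for t in range(1, max_radius):                 # walk along the normal in both directions
            for sign in (+1, -1):
                rr, cc = int(round(r + sign * t * nr)), int(round(c + sign * t * nc))
                if 0 <= rr < img_shape[0] and 0 <= cc < img_shape[1]:
                    votes[rr, cc] += 1
    center = np.unravel_index(np.argmax(votes), votes.shape)   # where many normals intersect (center C)

    dists = np.linalg.norm(np.asarray(edge_points) - np.asarray(center), axis=1)
    hist, bins = np.histogram(dists, bins=max_radius, range=(0, max_radius))
    radius = bins[np.argmax(hist)]                     # strongest ring; with two rings (E1, E2), the
    return center, radius                              # larger-radius ring would be taken as the outer one

# Synthetic test: 90 edge points on a circle of radius 20 centered at (50, 50).
theta = np.linspace(0, 2 * np.pi, 90, endpoint=False)
pts = np.stack([50 + 20 * np.sin(theta), 50 + 20 * np.cos(theta)], axis=1)
nrm = np.stack([np.sin(theta), np.cos(theta)], axis=1)
print(estimate_circle(pts, nrm, (100, 100)))           # roughly ((50, 50), 20.0)
```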
  • In step S42, the arithmetic processing unit 23 acquires the color information of the circular object extracted in step S41. This color information of the main sign is used as a reference when judging the authenticity of the deregulation sign candidate in step S4c, which will be described later. Note that step S42 may be processed after step S43, which will be described later, or in parallel with step S43.
  • In step S43, the arithmetic processing unit 23 identifies whether the circular object extracted in step S41 is a traffic sign candidate. Specifically, the arithmetic processing unit 23 applies identification processing using the discriminator 24b to the image pattern of the corrected image in the image buffer 24a that corresponds to the position of the circular object extracted in step S41. As a result, it is determined whether the circular object extracted in step S41 is a traffic sign candidate, and if so, the content of the traffic sign (for example, the speed limit) is recognized. When the object is identified as a traffic sign candidate, a first identification score representing the likelihood of being a traffic sign is output by this identification processing and stored in memory. The traffic sign candidate identified in this step is hereinafter referred to as the "main sign". If no circular object was extracted in step S41, this step may be omitted.
  • In steps S44 to S45 and S47 to S4b, candidates for the deregulation sign are searched for.
  • In step S44, the arithmetic processing unit 23 searches the corrected image in the image buffer 24a for valid straight lines that can be candidates for the deregulation sign.
  • This valid straight line search processing is divided into left and right edge search processing (step S44a), straight line detection processing (step S44b), and valid straight line determination processing (step S44c). Each process will be described in order with reference to FIGS. 7A to 7C.
  • In the left and right edge search processing (step S44a), the arithmetic processing unit 23 scans the pixels of the corrected image in the image buffer 24a row by row from left to right, as shown in FIG. 7A. If a left edge EL and a right edge ER are found within a certain distance of each other, they are extracted as a pair of left and right edges. By applying this processing to the entire post-correction image, an edge image PE2 consisting of the extracted pairs of left and right edges can be generated (FIG. 7B shows an excerpt of this edge image).
  • In the straight line detection processing (step S44b), the arithmetic processing unit 23 detects straight line portions from the edge image PE2 generated in step S44a.
  • The edge image PE2 illustrated in FIG. 7B contains portions in which pairs of left and right edges continue over two or more rows and portions in which a pair appears in only one row. In this step, a portion in which pairs of left and right edges continue vertically over two or more rows is detected as a straight line.
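  • A simplified Python sketch of this row-wise left/right edge pairing and of the vertical grouping into straight portions is shown below. The gradient threshold, the maximum pair width, and the assumption that the line is darker than its background are illustrative choices, not values from the publication.

```python
import numpy as np

def find_edge_pairs(gray: np.ndarray, grad_thresh: int = 30, max_pair_width: int = 20):
    """Scan each row from left to right and pair a left edge EL with a nearby right edge ER.

    Assumes the line is darker than its background, so the left edge is a bright-to-dark
    transition and the right edge is the following dark-to-bright transition.
    """
    grad = np.diff(gray.astype(np.int32), axis=1)        # horizontal gradient per row
    pairs = []                                           # (row, left_col, right_col)
    for r, row in enumerate(grad):
        left = None
        for c, g in enumerate(row):
            if g <= -grad_thresh:                        # left edge EL
                left = c
            elif g >= grad_thresh and left is not None:
                if c - left <= max_pair_width:           # right edge ER within a certain distance
                    pairs.append((r, left, c))
                left = None
    return pairs

def group_into_lines(pairs, min_rows: int = 2, col_tol: int = 2):
    """Detect straight portions: edge pairs that continue vertically over two or more rows."""
    lines, run = [], []
    for p in sorted(pairs):                              # simplification: at most one pair per row
        if run and p[0] == run[-1][0] + 1 and abs(p[1] - run[-1][1]) <= col_tol:
            run.append(p)
        else:
            if len(run) >= min_rows:
                lines.append(run)
            run = [p]
    if len(run) >= min_rows:
        lines.append(run)
    return lines

# Toy image: a dark vertical bar on rows 2-7, columns 10-13, over a bright background.
img = np.full((10, 30), 200, dtype=np.uint8)
img[2:8, 10:14] = 40
print(group_into_lines(find_edge_pairs(img)))            # one straight portion spanning six rows
```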
  • In step S44c, the arithmetic processing unit 23 determines whether each straight line detected in step S44b can be a valid candidate for the deregulation sign, and extracts the valid straight lines that can be candidates.
  • Japan's deregulation sign has a conspicuous thick blue diagonal line inside a circular sign (see the upper sign in FIG. 11(a)), and the deregulation signs of other countries (for example, Germany, Sweden, etc.) likewise contain diagonal lines inside a circular sign. In this step, therefore, straight portions that can be candidates for the deregulation sign are extracted from the post-correction image.
  • Since the stereo camera device 100 has distance information for the image, the distance to the straight line detected in step S44b can be calculated. Therefore, assuming that a deregulation sign exists at the calculated distance, the width and height that the diagonal line portion of the deregulation sign would have on the corrected image can also be approximated. Consequently, if the width of the edge pair illustrated in FIG. 7B is too narrow or too wide, the straight line formed by that edge pair is determined to be an invalid straight line that cannot be a candidate for the deregulation sign and is excluded from the valid straight lines. Similarly, if the height of the straight portion is too short to reach the lower limit of the effective range considered appropriate for the height of the diagonal line portion of the deregulation sign (FIG. 7C(b)), or is longer than the upper limit of that range (FIG. 7C(c)), the straight line is determined to be an invalid straight line that cannot be a candidate for the deregulation sign and is excluded from the valid straight lines. Conversely, if the straight portion falls within the effective range (FIG. 7C(a)), it is extracted as a valid straight line.
  • By the processing in step S44, image patterns such as utility poles and support posts whose form differs significantly from the diagonal line portion of the deregulation sign are excluded from the deregulation sign candidates, and only the image patterns of straight portions that may be deregulation signs are extracted.
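  • The plausibility check of step S44c can be pictured as follows: given the measured distance to a detected straight portion, the width and height that a deregulation-sign diagonal would have on the image are predicted with the pinhole model and compared against tolerance bands. The physical stripe size, the focal length, and the tolerances below are illustrative assumptions, not values from the publication.

```python
def expected_pixels(size_m: float, distance_m: float, focal_px: float = 1400.0) -> float:
    """On-image size in pixels of an object of physical size size_m seen at distance_m (pinhole model)."""
    return focal_px * size_m / distance_m

def is_valid_line(width_px: float, height_px: float, distance_m: float,
                  stripe_width_m: float = 0.08, stripe_height_m: float = 0.55,
                  tol: float = 0.5) -> bool:
    """Keep only straight portions whose width and height are plausible for a deregulation-sign diagonal."""
    exp_w = expected_pixels(stripe_width_m, distance_m)
    exp_h = expected_pixels(stripe_height_m, distance_m)
    width_ok = (1 - tol) * exp_w <= width_px <= (1 + tol) * exp_w    # neither too narrow nor too wide
    height_ok = (1 - tol) * exp_h <= height_px <= (1 + tol) * exp_h  # within the effective height range
    return width_ok and height_ok

# A 6 px wide, 35 px tall line at 25 m is kept; a 2 px wide line at the same distance is rejected.
print(is_valid_line(6, 35, 25.0), is_valid_line(2, 35, 25.0))        # True False
```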
  • In step S45, the arithmetic processing unit 23 determines whether a valid straight line was extracted by the processing in step S44. If no valid straight line was extracted, the process proceeds to step S46; if a valid straight line was extracted, the process proceeds to step S47.
  • In step S46, the arithmetic processing unit 23 tracks the main sign (traffic sign) candidate identified in step S43.
  • Here, "tracking" refers to storing in memory, in association with the time series, the shape information and position information of the sign candidate obtained in step S41 and the color information obtained in step S42.
  • Specifically, the type of the traffic sign candidate identified in step S43 (for example, a speed limit sign) is registered in the tracking list L1, which is a list for registering the traffic signs currently being imaged by the stereo camera device 100. Note that when the own vehicle passes the main sign candidate registered in the tracking list L1 and the candidate can no longer be imaged, the information on that main sign candidate is transferred from the tracking list L1 to the recognized list L2.
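  • One way to picture the tracking list L1 and the recognized list L2 is as two simple containers: an entry for a sign currently in view accumulates per-frame observations, and is moved to L2 once the sign can no longer be imaged. This is a hedged data-structure sketch; the field names are illustrative only.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TrackedSign:
    sign_type: str                  # e.g. "speed_limit_60" or "deregulation_candidate"
    track_id: int
    observations: List[dict] = field(default_factory=list)   # per-frame shape, position and color info

    def add_observation(self, frame: int, bbox, color_score: float):
        self.observations.append({"frame": frame, "bbox": bbox, "color_score": color_score})

tracking_list_l1: List[TrackedSign] = []    # signs currently being imaged
recognized_list_l2: List[TrackedSign] = []  # signs the vehicle has already passed

def retire_lost_tracks(lost_ids: List[int]) -> None:
    """When a sign can no longer be imaged, transfer its entry from L1 to L2."""
    global tracking_list_l1
    recognized_list_l2.extend(s for s in tracking_list_l1 if s.track_id in lost_ids)
    tracking_list_l1 = [s for s in tracking_list_l1 if s.track_id not in lost_ids]
```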
  • In step S47, the arithmetic processing unit 23 calculates color information for the valid straight line extracted in step S44.
  • Specifically, the arithmetic processing unit 23 places a window shaped to accommodate the valid straight line in the image and calculates a blue score within that window.
  • The blue score is calculated, for example, as follows: if the color of each pixel in the image is defined by the three primary colors red (R), green (G), and blue (B), the blue score is obtained by dividing the area of the blue pixels in the window by the total area of the window.
  • the calculated blue score is stored in memory.
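  • A minimal sketch of the blue-score computation is shown below. The publication only states that the blue pixel area is divided by the total window area; the RGB dominance rule used here to decide whether a pixel counts as blue is an assumption for illustration.

```python
import numpy as np

def blue_score(window_rgb: np.ndarray, margin: int = 30) -> float:
    """Fraction of window pixels classified as blue (blue pixel area / total window area)."""
    r = window_rgb[..., 0].astype(np.int32)
    g = window_rgb[..., 1].astype(np.int32)
    b = window_rgb[..., 2].astype(np.int32)
    is_blue = (b > r + margin) & (b > g + margin)       # simple dominance test for "blue" pixels
    return float(is_blue.mean())

# A window whose left half is pure blue scores 0.5.
win = np.zeros((10, 10, 3), dtype=np.uint8)
win[:, :5, 2] = 255
print(blue_score(win))                                  # 0.5
```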
  • Although blue is used as an example here, color information other than blue may be extracted according to the design of the sign.
  • The deregulation sign in Japan is designed with a conspicuous blue diagonal line (see FIG. 11(a)), so the blue score of a window that actually contains a deregulation sign should be high. Therefore, by focusing on the blue score of the window, valid straight lines that are clearly not blue, such as white or black straight lines, can be excluded from the deregulation sign candidates, and only blue valid straight lines can be extracted as candidates.
  • Note that the threshold compared with the blue score is set low in this step. As a result, not only actual deregulation signs but also valid straight lines that are not deregulation signs may be extracted as deregulation sign candidates, so the candidates obtained in this step are not yet sufficiently reliable.
  • In step S48, the deregulation sign candidates are identified. This is performed in the same manner as the identification of the main sign in step S43: the arithmetic processing unit 23 identifies the deregulation sign candidates using the discriminator 24b. A second identification score representing the likelihood of being a deregulation sign is output by this identification processing and stored in memory.
  • The deregulation sign candidates identified in this step are hereinafter referred to as "deregulation signs". Note that step S48 may be processed before step S47 or in parallel with step S47.
  • In step S49, the arithmetic processing unit 23 tracks the main sign candidate identified in step S43 and the deregulation sign candidate extracted in step S48.
  • Here, "tracking" has the same meaning as described for step S46.
  • Specifically, the type of the main sign candidate identified in step S43 (for example, a speed limit sign) and the deregulation sign (candidate) extracted in step S48 are registered in the tracking list L1, which is the list for registering the signs currently being imaged by the stereo camera device 100. Note that when the own vehicle passes a sign registered in the tracking list L1 and the sign can no longer be imaged, the information on that sign is transferred from the tracking list L1 to the recognized list L2.
  • In step S4a, the arithmetic processing unit 23 searches for the main sign (for example, a speed limit sign or a no-overtaking sign) that is paired with the deregulation sign candidate extracted in step S48.
  • The method of searching for the paired sign in this step differs from country to country. For example, in Japan, as shown in FIG. 8A, the main sign is installed directly below the deregulation sign, so the main sign installed below the deregulation sign should be associated as the paired sign. In other countries, on the other hand, as shown in FIG. 8B, the main sign and the deregulation sign are installed at separate locations, so the previously registered main sign should be associated with the deregulation sign as the paired sign. By using either search method, the main sign paired with the deregulation sign can be found regardless of the country.
  • The details of the paired sign search processing in step S4a will be described using the flowchart of FIG. 9. This flowchart is divided into the processing from step S4aa to step S4ad and the processing from step S4af to step S4ah.
  • In steps S4aa to S4ad, the arithmetic processing unit 23 searches for the paired sign based on the tracking list L1. This is the paired sign search method for countries where, as illustrated in FIG. 8A, the main sign and the deregulation sign are installed at the same location.
  • In step S4ab, the arithmetic processing unit 23 checks, based on the tracking information registered in the tracking list L1 in step S49 (information identifying each sign, such as its installation position, size, type, and tracking number), whether there is a main sign paired with the deregulation sign. For example, as shown in FIG. 8A, if a deregulation sign and a speed limit sign are registered in the tracking list L1, they are extracted as paired sign candidates.
  • In step S4ac, the arithmetic processing unit 23 checks the arrangement of the deregulation sign and the paired sign candidate, and determines that the two are paired signs if their installation positions are close to each other. For example, as shown in FIG. 10A, if a speed limit sign that is the first paired sign candidate is installed on the same post as the deregulation sign and a no-overtaking sign that is the second paired sign candidate is installed on a post far from the deregulation sign, the speed limit sign installed on the same post as the deregulation sign is determined to be the paired sign, and the distant no-overtaking sign is not determined to be the paired sign. In addition, when a plurality of main signs are installed on the same post as the deregulation sign, the plurality of main signs may be extracted as paired signs.
  • In step S4ad, the arithmetic processing unit 23 stores the information on the paired sign extracted in step S4ac in the storage unit 24.
  • In step S4ae, the arithmetic processing unit 23 checks whether a paired sign has been registered. If a paired sign has been registered, the processing of step S4a ends and the process proceeds to step S4b. If no paired sign has been registered, the process proceeds to step S4af.
  • In steps S4af to S4ah, the arithmetic processing unit 23 searches for the paired sign based on the tracking list L1 and the recognized list L2. This is the paired sign search method for countries where, as illustrated in FIG. 8B, the main sign and the deregulation sign are installed at separate locations.
  • In step S4ag, the arithmetic processing unit 23 checks, based on the recognized information registered in the recognized list L2 (previously recognized sign information such as the installation position, size, type, and registration number of each sign), whether there is a main sign paired with the deregulation sign registered in the tracking list L1. For example, as shown in FIG. 8B, if a previously imaged speed limit sign is registered in the recognized list L2, it is extracted as a paired sign candidate. Since a large amount of previously recognized sign information may be registered in the recognized list L2, the most recently registered main sign is determined as the paired sign candidate in this step. For example, as shown in FIG. 10B, if a speed limit sign that is the first paired sign candidate is installed on a post passed before the deregulation sign and a no-overtaking sign that is the second paired sign candidate is installed on a post passed before the speed limit sign, the most recently registered speed limit sign is determined to be the paired sign, and the earlier registered no-overtaking sign is determined not to be the paired sign.
  • In the flowchart of FIG. 9, both the processing from step S4aa to step S4ad corresponding to the environment in FIG. 8A and the processing from step S4af to step S4ah corresponding to the environment in FIG. 8B are executed. However, if the country in which the own vehicle is traveling can be identified, only the processing corresponding to that country may be executed.
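  • The two search branches of FIG. 9 can be condensed as follows: first look in the tracking list L1 for a main sign installed at (nearly) the same position as the deregulation sign (FIG. 8A style), and otherwise fall back to the most recently registered main sign in the recognized list L2 (FIG. 8B style). The record type, the field names, and the distance threshold are illustrative assumptions.

```python
from collections import namedtuple
from typing import List, Optional

# Hypothetical record type standing in for an entry of the tracking/recognized lists.
TrackedSign = namedtuple("TrackedSign", "sign_type position_m")

def search_same_location(dereg: TrackedSign, tracking_l1: List[TrackedSign],
                         max_gap_m: float = 1.0) -> Optional[TrackedSign]:
    """Steps S4aa-S4ad: look in the tracking list L1 for a main sign installed on the same post."""
    candidates = [s for s in tracking_l1 if s.sign_type != "deregulation"
                  and abs(s.position_m - dereg.position_m) <= max_gap_m]
    return min(candidates, key=lambda s: abs(s.position_m - dereg.position_m), default=None)

def search_most_recent(recognized_l2: List[TrackedSign]) -> Optional[TrackedSign]:
    """Steps S4af-S4ah: fall back to the most recently registered main sign already passed."""
    return recognized_l2[-1] if recognized_l2 else None

def search_paired_sign(dereg, tracking_l1, recognized_l2):
    pair = search_same_location(dereg, tracking_l1)      # FIG. 8A installation style
    if pair is None:
        pair = search_most_recent(recognized_l2)         # FIG. 8B installation style
    return pair

# Toy usage: the deregulation sign shares a post with a speed limit sign 0.3 m away.
l1 = [TrackedSign("speed_limit_60", 40.2), TrackedSign("deregulation", 40.5)]
l2 = [TrackedSign("no_overtaking", -120.0), TrackedSign("speed_limit_60", -35.0)]
print(search_paired_sign(l1[1], l1, l2).sign_type)       # speed_limit_60
```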
  • After the processing of step S4a shown in FIG. 9 is completed, the main sign paired with the deregulation sign can be extracted regardless of how the deregulation sign is installed (FIG. 8A or FIG. 8B).
  • As described above, a valid straight line that is not actually a deregulation sign may have been extracted as a candidate. Therefore, in the following processing, the color information of the main sign is used to confirm whether the candidate extracted in step S48 is truly a deregulation sign. The color information of the main sign was acquired in step S42 and stored in memory.
  • In step S4b, the arithmetic processing unit 23 determines whether the current time of day allows the authenticity determination processing of the deregulation sign candidate (step S4c) to be performed.
  • Step S4c uses the color information of the paired sign, but the reliability of color information obtained at night is low. Therefore, in this step it is determined, for example based on time information acquired from the built-in timer of the ECU or from GPS, whether it is daytime or nighttime. If it is daytime, the process proceeds to step S4c; at night, it is judged difficult to obtain highly reliable color information, and the process proceeds to step S4d.
  • This step also checks whether a main sign candidate was extracted in step S43 and whether a deregulation sign candidate was extracted in step S48. If no main sign candidate was extracted in step S43, the processing of step S4c cannot be executed, and if no deregulation sign candidate was extracted in step S48, the processing of step S4c, which determines the authenticity of a candidate, is unnecessary in the first place. In these cases as well, step S4c is skipped and the process proceeds to step S4d.
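  • The gating of step S4b amounts to a simple predicate: run the color-based authenticity check only when it is daytime and both a main sign candidate and a deregulation sign candidate exist. The daytime hour range in the sketch below is an illustrative assumption; the publication only says that the decision can be based on time information from the ECU timer or GPS.

```python
from datetime import datetime

def can_check_authenticity(now: datetime, has_main_sign: bool, has_dereg_candidate: bool) -> bool:
    """Step S4b: proceed to step S4c only if reliable color information can be expected."""
    is_daytime = 6 <= now.hour < 18                     # assumed daytime window
    return is_daytime and has_main_sign and has_dereg_candidate

print(can_check_authenticity(datetime(2022, 2, 7, 13, 0), True, True))   # True  -> go to step S4c
print(can_check_authenticity(datetime(2022, 2, 7, 22, 0), True, True))   # False -> skip to step S4d
```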
  • In step S4c, the arithmetic processing unit 23 determines whether the deregulation sign candidate extracted in step S48 is genuine, and rejects candidates that are judged to be false.
  • As described above, the candidates for the deregulation sign were extracted based on the blue score of the valid straight line, and so that deregulation signs illuminated with colored light or seen in cloudy or rainy weather would not be excluded from the candidates, the threshold compared with the blue score was set low; as a result, valid straight lines that are not actual deregulation signs may also have been extracted as candidates. Therefore, in this step, the authenticity of the deregulation sign candidate can be accurately judged by referring to the color score of the main sign acquired in step S42 and adjusting the threshold used for the authenticity determination of the deregulation sign candidate.
  • For example, if the main sign is a speed limit sign with a red ring around its circumference, the red score within a window shaped to accommodate the red ring should be a high value. However, depending on whether the main sign is illuminated by colored light, or whether the weather is cloudy, rainy, or clear, the red score of the main sign will change, and in such conditions the blue score of the deregulation sign should change to a similar extent. Therefore, if the red score of the main sign is low, the blue-score threshold for judging the authenticity of the deregulation sign candidate is also set low; if the red score is high, the blue-score threshold is also set high.
  • Instead of the color information itself (for example, the blue score, that is, the confidence that a region is blue), the threshold for the color information (for example, the blue score) used to recognize a candidate as a deregulation sign may be determined from the rate of change of the color information acquired when recognizing the main sign relative to the color information of a template image of the main sign. The color information of a sign and the rate of change of that color information are collectively referred to as the "reliability" of the sign.
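  • A hedged sketch of the threshold adjustment described above: the blue-score threshold applied to the deregulation sign candidate is scaled by the paired main sign's color reliability, i.e. its measured red score relative to the red score expected from a template image. The baseline threshold, the template score, and the clamping range are illustrative numbers, not values disclosed in the publication.

```python
def adjusted_blue_threshold(pair_red_score: float,
                            template_red_score: float = 0.30,
                            base_threshold: float = 0.25) -> float:
    """Scale the blue-score threshold by the paired sign's color reliability (score vs. template)."""
    reliability = pair_red_score / template_red_score          # < 1 in poor light, about 1 in clear weather
    return base_threshold * min(max(reliability, 0.2), 1.5)    # clamp to keep the threshold sensible

def is_true_deregulation_sign(candidate_blue_score: float, pair_red_score: float) -> bool:
    """Step S4c: accept the candidate only if its blue score clears the adjusted threshold."""
    return candidate_blue_score >= adjusted_blue_threshold(pair_red_score)

# Under colored light both scores drop together, so a genuine sign still passes:
print(is_true_deregulation_sign(candidate_blue_score=0.15, pair_red_score=0.12))   # True (lowered threshold)
# In clear weather the same low blue score indicates a pole or branch, not a sign:
print(is_true_deregulation_sign(candidate_blue_score=0.15, pair_red_score=0.30))   # False
```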
  • In step S4d, the arithmetic processing unit 23 refers to the tracking list L1 or the recognized list L2 and determines the sign recognition result. That is, if the process proceeded from step S45 to step S46, only the main sign candidate identified in step S43 is determined as the recognized sign. On the other hand, if the process proceeded from step S45 to step S47, the main sign candidate identified in step S43 is determined, for example, as the paired sign of the deregulation sign candidate extracted in steps S48 and S4c. When such a paired sign is recognized in this step, a control policy that cancels the regulation indicated by the main sign is determined in the subsequent step S5.
  • Here, "determine" means outputting the information obtained from the recognized sign to the subsequent control processing (step S5 in FIG. 2) as the result of the recognition processing.
  • With the stereo camera device 100 of the present embodiment described above, not only can the deregulation sign shown in FIG. 11(a) be recognized, but erroneous detection of, for example, the upper ends of posts whose thickness is similar to that of the diagonal line of the deregulation sign can also be suppressed, so the reliability of sign detection can be improved.
  • the stereo camera device 100 composed of two cameras was used, but one camera or three or more cameras may be used.
  • each of the above configurations, functions, processing units, processing means, and the like may be realized by hardware, for example, by designing a part or all of them using an integrated circuit.
  • each of the above configurations, functions, etc. may be realized by software by a processor interpreting and executing a program for realizing each function.
  • Information such as programs, tables, and files that implement each function can be stored in storage devices such as memory, hard disks, SSDs (Solid State Drives), or recording media such as IC cards, SD cards, and DVDs.
  • DESCRIPTION OF SYMBOLS: 100... stereo camera device, 1... camera, 1L... left camera, 1R... right camera, 2... sign recognition device, 21... image input interface, 22... image processing unit, 23... arithmetic processing unit, 24... storage unit, 24a... image buffer, 24b... discriminator, 25... CAN interface, 26... monitoring processing unit, 27... internal bus, CAN... in-vehicle network

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
PCT/JP2022/004760 2021-08-18 2022-02-07 標識認識装置、および、標識認識方法 WO2023021726A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112022002739.8T DE112022002739T5 (de) 2021-08-18 2022-02-07 Schilderkennungsvorrichtung und schilderkennungsverfahren
JP2023542187A JPWO2023021726A1 (de) 2021-08-18 2022-02-07

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-133360 2021-08-18
JP2021133360 2021-08-18

Publications (1)

Publication Number Publication Date
WO2023021726A1 true WO2023021726A1 (ja) 2023-02-23

Family

ID=85240273

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/004760 WO2023021726A1 (ja) 2021-08-18 2022-02-07 標識認識装置、および、標識認識方法

Country Status (3)

Country Link
JP (1) JPWO2023021726A1 (de)
DE (1) DE112022002739T5 (de)
WO (1) WO2023021726A1 (de)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015191619A (ja) * 2014-03-28 2015-11-02 富士重工業株式会社 車外環境認識装置
JP2017146711A (ja) * 2016-02-16 2017-08-24 株式会社日立製作所 画像処理装置、警告装置、画像処理システム、画像処理方法
JP2021056575A (ja) * 2019-09-27 2021-04-08 スズキ株式会社 車両用運転支援システム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KAGEYAMA, YOICHI; NISHIDA, MAKOTO: "Extraction of Circular Road Sign Considering Scene Image Features, Utilization of Brightness Information and Automatic Setting of Threshold", IMAGE LAB, vol. 22, no. 5, 10 May 2011 (2011-05-10), JP , pages 8 - 14, XP009543528, ISSN: 0915-6755 *

Also Published As

Publication number Publication date
DE112022002739T5 (de) 2024-04-11
JPWO2023021726A1 (de) 2023-02-23

Similar Documents

Publication Publication Date Title
US11854272B2 (en) Hazard detection from a camera in a scene with moving shadows
CN106647776B (zh) 车辆变道趋势的判断方法、判断装置和计算机存储介质
CN110197589B (zh) 一种基于深度学习的闯红灯违法检测方法
WO2020000251A1 (zh) 基于摄像机协同接力的路口违章视频识别方法
US9558412B2 (en) Vehicle exterior environment recognition device
JP4871909B2 (ja) 物体認識装置、および物体認識方法
US9659497B2 (en) Lane departure warning system and lane departure warning method
CN106169244A (zh) 利用人行横道识别结果的引导信息提供装置及方法
US20150161796A1 (en) Method and device for recognizing pedestrian and vehicle supporting the same
US20050102070A1 (en) Vehicle image processing device
CN104036279A (zh) 一种智能车行进控制方法及系统
CN105426864A (zh) 一种基于等距边缘点匹配的多车道线检测方法
JP6226368B2 (ja) 車両監視装置、および車両監視方法
US11373417B2 (en) Section line recognition device
JP2007179386A (ja) 白線認識方法及び白線認識装置
CN111222441A (zh) 基于车路协同的点云目标检测和盲区目标检测方法及系统
JP2020077293A (ja) 区画線検出装置及び区画線検出方法
WO2023021726A1 (ja) 標識認識装置、および、標識認識方法
CN112270258A (zh) 一种非机动车的违规信息获取方法及装置
EP3287940A1 (de) Kreuzungserkennungssystem für ein fahrzeug
KR20210002893A (ko) 하이브리드 기법을 이용한 번호판 인식 방법 및 그 시스템
CN115565363A (zh) 信号识别装置
US11679769B2 (en) Traffic signal recognition method and traffic signal recognition device
JP7058753B2 (ja) カメラ装置
JP7005762B2 (ja) カメラ装置の標識認識方法及び標識認識装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22858051

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023542187

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 112022002739

Country of ref document: DE