CN117518181A - System and method for evaluating highway traffic sign functionality - Google Patents
- Publication number: CN117518181A (application CN202311541304.4A)
- Authority: CN
- Legal status: Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01M—TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
- G01M11/00—Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
- G01M11/02—Testing optical properties
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P3/00—Measuring linear or angular speed; Measuring differences of linear or angular speeds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/582—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of traffic signs
Abstract
The present invention relates to a system and a method for evaluating highway traffic sign functionality. The system comprises: a lidar mounted on the roof of an evaluation vehicle directly above the driver's seat, facing forward and kept horizontal; a camera unit mounted at the same position and with the same orientation as the lidar, for acquiring images of the sign; a speed sensor for acquiring speed information of the evaluation vehicle; a processing unit communicatively connected with the lidar, the camera unit and the speed sensor, capable of receiving the point cloud data, the images and the speed information; and a trigger unit communicatively connected with the processing unit, for receiving operation instructions and thereby controlling the processing unit to receive the point cloud data, the images and the speed information. The processing unit determines the type, the straight-line visual recognition distance and the viewing angles of the sign based on the point cloud data, the images and the speed information, and thereby evaluates whether the sign functions normally. The invention can evaluate the functionality of a sign from the objective environment of the driver, based on the sign's actual use effect.
Description
Technical Field
The invention belongs to the technical field of road traffic safety facility detection and evaluation, and relates to a system and a method for rapid on-site functional evaluation of in-service traffic signs.
Background
The highway traffic sign is an important traffic safety facility that conveys specific information through graphic symbols and characters, serving the purposes of traffic management, traffic guidance, road smoothness and traffic safety. At present, retroreflective traffic signs remain the main type of sign used on roads; their performance is mainly reflected by the retroreflection coefficient of the reflective film, and product quality inspection standards are built around measuring this coefficient.
However, in actual use, the factors determining whether a traffic sign can function properly include not only parameters of the sign itself, such as its retroreflective performance and font size, but also external environmental factors such as the position where the sign is installed, the road alignment, obstacle shielding, and lighting conditions.
A traffic sign is usually installed outdoors, where dust easily accumulates; even if its technical parameters initially meet the relevant standards, its retroreflective capacity changes over its service life, so the sign may no longer function properly. Moreover, a sign that meets the relevant standards may still fail to function properly because of factors such as its installation position, the road alignment, obstruction, and lighting conditions. These conditions are common in actual road use and can prevent drivers from identifying the sign content in a timely and effective manner, so that the sign cannot accurately play its traffic-control role, leaving a great potential safety hazard.
The Chinese patent application with publication number CN112507902A detects traffic signs in images extracted from traffic monitoring video, compares the detection result with a reference traffic sign region, and judges from the similarity whether the sign has become abnormal due to a traffic accident or natural disaster.
The Chinese patent with publication number CN215574640U discloses a sign retroreflection measuring instrument that allows a user to clean traffic signs, particularly dust on the surface of elevated signs, and to measure the reflective film in a hand-held manner.
The Chinese patent publication No. CN207690301U discloses a vehicle-mounted traffic sign usability detecting device, which can measure the retroreflection coefficient of the traffic sign in batches through a retroreflection coefficient measuring instrument arranged at the front end of a vehicle.
It can be seen that the existing schemes detect the state or the retroreflection coefficient of the traffic sign. Such detection cannot reflect the effect of the sign in a specific use process, and therefore cannot be used to judge whether the sign properly plays its role in actual use, that is, whether it functions normally in service.
Therefore, there is a need in the art for a system and a method that can effectively evaluate the functionality of in-service highway traffic signs according to their actual use effect, starting from the objective environment in which the driver is located.
Disclosure of Invention
The invention aims to provide a system and a method for evaluating the functionality of a highway traffic sign, which are used for scientifically and effectively judging the actual use effect of an in-service highway traffic sign.
To achieve the above object, in one aspect, the present invention proposes a system for evaluating highway traffic sign functionality, comprising:
a lidar mounted on the roof of the evaluation vehicle directly above the driving position, facing the front of the vehicle and kept horizontal;
a camera unit mounted at the same position as the lidar and with the same orientation, for acquiring images of the target traffic sign;
a speed sensor provided on the evaluation vehicle for acquiring speed information of the evaluation vehicle;
the processing unit is in communication connection with the laser radar, the image pickup unit and the speed sensor and can receive point cloud data provided by the laser radar, images provided by the image pickup unit and vehicle speed information provided by the speed sensor;
a trigger unit communicatively connected with the processing unit, for receiving an operation instruction of a tester and thereby controlling the processing unit to receive the point cloud data from the lidar, the images from the camera unit and the vehicle speed information from the speed sensor;
wherein the processing unit determines the type of the target traffic sign, its straight-line visual recognition distance, and its horizontal and vertical included angles based on the point cloud data, the images and the vehicle speed information, and, when the straight-line visual recognition distance is greater than or equal to the minimum visual recognition distance, evaluates whether the target traffic sign functions normally by judging whether it is within the effective visual field range of the driver.
Preferably, the system further comprises a storage unit in communicative connection with the processing unit, the lidar, the camera unit and the speed sensor for storing the point cloud data, the image and the vehicle speed information.
Preferably, the trigger unit includes a first trigger, and the processing unit receives the point cloud data, the image, and the vehicle speed information when the first trigger is triggered.
Preferably, the triggering unit further includes a second trigger, and the processing unit stops receiving the point cloud data, the image, and the vehicle speed information when the second trigger is triggered.
In another aspect, the present invention provides a method of evaluating highway traffic sign functionality, comprising the steps of:
step one: installing the system for evaluating highway traffic sign functionality on an evaluation vehicle, driving the evaluation vehicle toward a target traffic sign at the speed limit of the road on which it is located, with a tester sitting in the front passenger seat of the evaluation vehicle;
step two: as soon as the tester can see the characters and icons of the target traffic sign, operating the trigger unit, which controls the processing unit to receive the point cloud data from the lidar, the image from the camera unit and the vehicle speed information from the speed sensor at that moment;
step three: judging the type of the target traffic sign in the image through a processing unit, and determining the minimum visible distance of the target traffic sign;
step four: determining a linear visual recognition distance of the target traffic sign based on point cloud data of the laser radar, and determining that the target traffic sign is abnormal in function if the linear visual recognition distance is smaller than the minimum visual recognition distance;
step five: if the straight-line visual recognition distance is greater than or equal to the minimum visual recognition distance, determining, based on the lidar point cloud data, whether the target traffic sign is within the effective visual field range of the driver at that moment; if it is, determining that the target traffic sign functions normally;
step six: if the target traffic sign is not within the effective visual field range of the driver, executing steps four, five and six again with the processing unit, based on the point cloud data from the lidar, the image from the camera unit and the vehicle speed information from the speed sensor at the next moment.
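The six steps above amount to a per-frame evaluation loop: classify the sign once, then at each recorded moment compare its straight-line visual recognition distance against the minimum and test whether it lies in the driver's effective field of view, advancing to the next frame until a verdict is reached. A minimal sketch of that control flow (all function names here are hypothetical placeholders for the processing-unit operations, not identifiers from the patent):

```python
# Hypothetical sketch of the evaluation loop in steps one to six.
# classify_sign, min_distance_for, visual_distance and in_effective_fov
# stand in for the processing-unit operations described above.

def evaluate_sign(frames, classify_sign, min_distance_for,
                  visual_distance, in_effective_fov):
    """frames: time-ordered (point_cloud, image, speed) triples recorded
    from the moment the tester operates the trigger unit."""
    _, first_image, _ = frames[0]
    sign_type = classify_sign(first_image)          # step three
    d_min = min_distance_for(sign_type)             # step three

    for cloud, image, speed in frames:
        d = visual_distance(cloud)                  # step four
        if d < d_min:
            return "abnormal"                       # seen too late
        if in_effective_fov(cloud, speed):          # step five
            return "normal"
        # step six: sign seen early enough but outside the effective
        # field of view at this moment; re-evaluate with the next frame
    return "abnormal"
```

Because the distance to the sign shrinks as the vehicle approaches, the loop terminates: either the sign enters the effective field of view while the distance is still sufficient (normal), or the distance drops below the minimum first (abnormal).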
Preferably, the processing unit synchronously records the point cloud data of the lidar and the images of the camera unit at a rate of not less than 10 frames per second.
Preferably, the process of determining whether the target traffic sign is within the effective field of view of the driver further comprises:
step A1: based on the type and speed of the vehicle being evaluated, an effective horizontal viewing angle and an effective vertical viewing angle of the driver are determined,
a_right = 83.08·e^(−0.014v)
a_left = θ1 when v ≤ v_th1; a_left = 83.08·e^(−0.014v) when v > v_th1
wherein a_right is the effective horizontal viewing angle on the right side of the driver; a_left is the effective horizontal viewing angle on the left side of the driver; θ1 is the included angle between the straight-ahead direction and the right side edge of the left A-pillar when the driver is in the driving position; v is the vehicle speed; v_th1 is the critical vehicle speed at which the left boundary of the driver's effective horizontal field of view coincides with the right side edge of the vehicle's left A-pillar;
b = θ2 when v ≤ v_th2; b = μ·83.08·e^(−0.014v) when v > v_th2
wherein b is the effective vertical viewing angle of the driver; θ2 is the included angle between the straight-ahead direction and the upper edge of the front window when the driver is in the driving position; v is the vehicle speed; v_th2 is the critical vehicle speed at which the upper boundary of the driver's effective vertical field of view coincides with the upper edge of the vehicle's front window; μ is the proportionality coefficient between the vertical and horizontal viewing angles;
step A2: determining the horizontal included angle and the vertical included angle of the target traffic sign, and judging whether they satisfy the following condition:
(α/a)² + (β/b)² ≤ 1
wherein α is the horizontal included angle of the sign, β is the vertical included angle of the sign, a is the effective horizontal viewing angle of the driver, and b is the effective vertical viewing angle of the driver; a = a_left when the target traffic sign is located to the left of the vehicle longitudinal center line, and a = a_right when the target traffic sign is located to the right of the vehicle longitudinal center line;
and if the horizontal and vertical included angles of the target traffic sign satisfy the condition, determining that the target traffic sign is within the effective visual field range of the driver; otherwise, determining that it is not.
Preferably, the process of determining the straight line visual recognition distance of the target traffic sign based on the point cloud data of the laser radar further comprises:
step B1: establishing a mapping relationship between the coordinates of each scanning point in the point cloud data and the coordinates of each pixel in the image;
step B2: identifying a target traffic sign from the image, and determining a sign scanning point corresponding to the target traffic sign in the point cloud data through the mapping relation;
step B3: calculating the straight-line visual recognition distance d of the target traffic sign by the following formula:
d = (1/n)·Σᵢ₌₁ⁿ √(xᵢ² + yᵢ² + zᵢ²)
wherein n is the number of sign scanning points used for calculating the straight-line visual recognition distance, xᵢ is the abscissa of the i-th sign scanning point, yᵢ is the ordinate of the i-th sign scanning point, and zᵢ is the depth coordinate of the i-th sign scanning point.
Preferably, the horizontal included angle α of the target traffic sign satisfies α = arctan(x̄/z̄), and the vertical included angle β of the target traffic sign satisfies β = arctan(ȳ/z̄),
wherein x̄ is the average value of xᵢ, ȳ is the average value of yᵢ, and z̄ is the average value of zᵢ.
Preferably, the n sign scanning points used for calculating the straight-line visual recognition distance of the target traffic sign are the points falling within the 10% to 20% band after all sign scanning points are sorted by straight-line distance in descending order.
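Steps B1–B3 together with the 10%–20% selection rule can be sketched as follows: once the sign's scanning points have been identified through the image-to-point-cloud mapping, sort them by straight-line distance in descending order, keep the points in the 10%–20% band, and average their Euclidean distances. This is a sketch under the stated assumptions; the mapping step itself (B1–B2) is taken as given, and the fallback for very small point sets is my own addition:

```python
import math

def straight_line_visual_distance(sign_points):
    """sign_points: (x, y, z) lidar coordinates of the scanning points
    that fall on the target sign panel (step B2).  Returns the mean
    straight-line distance over the 10%-20% band of the descending
    distance ordering, per the preferred embodiment."""
    dists = sorted((math.sqrt(x * x + y * y + z * z)
                    for x, y, z in sign_points), reverse=True)
    lo = int(len(dists) * 0.10)
    hi = int(len(dists) * 0.20)
    band = dists[lo:hi] or dists   # assumed fallback if too few points
    return sum(band) / len(band)
```

Discarding the nearest 80% and the farthest 10% of points keeps the estimate on the sign panel itself while rejecting outlier returns at the panel edges.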
By changing the object of detection, the invention takes the actual use effect of the road traffic sign as the starting point of evaluation, breaks through the traditional detection methods based on material retroreflection performance indexes, and truly realizes a comprehensive evaluation of the overall functionality of a traffic sign in a real scene.
The system and the method for evaluating highway traffic sign functionality of the invention can scientifically evaluate sign functionality according to the actual use effect of the sign, based on the driver's visual perception during driving, so that problems and hidden dangers in the use of traffic signs can be accurately discovered, a basis is provided for scientific management and control of highway traffic, and vehicle traffic safety is effectively guaranteed. Moreover, the system and the method can evaluate traffic signs in real time while the vehicle travels normally at normal speed, giving them great advantages in detection speed and efficiency. In addition, the evaluation equipment is operated by a tester in the front passenger seat, without the driver's participation, which effectively ensures driving safety during the evaluation process.
Drawings
Fig. 1 is a schematic structural view of a system for evaluating the functionality of a highway traffic sign according to a preferred embodiment of the present invention.
Fig. 2 is a flowchart of a method for evaluating highway traffic sign functionality according to a preferred embodiment of the present invention.
Fig. 3 is a schematic diagram of a laser radar measuring a linear distance from a target traffic sign according to a preferred embodiment of the present invention.
Detailed Description
Road traffic signs are typically placed on either side of the road or above it, generally facing oncoming vehicles. Since the vehicle travels along the road at a certain speed, a traffic sign requires that the driver be able to see the sign content before the vehicle comes within a certain distance of it, so that the driver has sufficient time, after seeing the content, to understand its meaning and steer the vehicle accordingly. This particular distance is the minimum visual recognition distance of the traffic sign.
In the present invention, the minimum visual recognition distance is the lower limit of the visual recognition distance that a traffic sign should have when the vehicle uses its low-beam headlights. Different road grades have different speed limits, and hence different minimum visual recognition distances. For example, for expressways and first-class highways with speed limits exceeding 100 km/h, the minimum visual recognition distance is 120 m; for expressways, first-class and second-class highways with a speed limit of 80 km/h, the minimum visual recognition distances are 120 m, 90 m and 90 m respectively; for first-class, second-class and third-class highways with speed limits of 60 km/h and 40 km/h, the minimum visual recognition distance is 70 m; and for fourth-class and lower-grade roads with speed limits not exceeding 30 km/h, the minimum visual recognition distance of the traffic sign is 50 m. In addition, different types of traffic signs may have different minimum visual recognition distances: prohibition and warning signs use the values above, while guide signs, indication signs and service facility signs may be adjusted by about 10% on this basis. In general, the more important and the more influential the traffic sign, the greater its minimum visual recognition distance, so as to allow the driver more time to respond.
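The graded values above can be captured in a small lookup table. The sketch below follows the figures quoted in this paragraph; the key scheme is my own encoding, and applying the "about 10%" adjustment as a 10% reduction for guide, indication and service-facility signs is an assumption about how the adjustment is meant to be used:

```python
# Minimum visual recognition distances (m) quoted in the description,
# keyed by (speed-limit band, road grade).  Key names are illustrative.
MIN_DISTANCE_M = {
    (">100", "expressway"): 120, (">100", "first-class"): 120,
    ("80", "expressway"): 120, ("80", "first-class"): 90,
    ("80", "second-class"): 90,
    ("60-40", "first-class"): 70, ("60-40", "second-class"): 70,
    ("60-40", "third-class"): 70,
    ("<=30", "fourth-class-or-lower"): 50,
}

def min_visual_distance(speed_band, road_grade, sign_class="prohibition"):
    base = MIN_DISTANCE_M[(speed_band, road_grade)]
    # Prohibition and warning signs use the base value; guide,
    # indication and service-facility signs may be adjusted ~10%
    # (assumed here to mean a 10% reduction is permissible).
    if sign_class in ("guide", "indication", "service"):
        return base * 0.9
    return base
```

A table of this shape is what step three of the method consults once the sign type has been classified from the image.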
Regardless of whether the traffic sign is placed at the roadside or above the road, the included angle of the sign (i.e., the angle between the straight-line direction connecting the driver and the sign and the driver's straight-ahead direction) grows as the distance between the vehicle and the sign shrinks. If the sign's function is abnormal and its visibility is poor, the driver can only make out the sign content when the vehicle is already very close to it, which means the included angle of the sign is relatively large. However, safe driving requires the driver to look ahead in a natural driving state; the driver should not have to make large adjustments of body posture in order to see a traffic sign. That is, at the moment the driver reads the sign content, the sign should be within the driver's effective field of view in the normal driving posture.
The inventors have found that the driver's static field of view covers a viewing angle range of about 160° to the left and right of straight ahead, but the dynamic effective viewing angle gradually decreases as the vehicle speed increases: at 40 km/h the effective viewing angle shrinks to 100°, at 60 km/h to 75°, at 80 km/h to 60°, and at 100 km/h to 40°. Data fitting shows that vehicle speed and effective viewing angle are in an exponential relationship.
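The quoted figures are total (left-plus-right) angles; halving them to per-side angles and fitting an exponential by least squares on the logarithm lands near the 83.08·e^(−0.014v) relationship used below. The exact constants depend on the fitting method, so the sketch below is a plausibility check rather than a reproduction of the patent's fit:

```python
import math

# (speed km/h, total effective viewing angle in degrees) from the text
data = [(40, 100), (60, 75), (80, 60), (100, 40)]

# per-side angle is half the total; fit ln(a) = ln(A) - k*v
xs = [v for v, _ in data]
ys = [math.log(total / 2) for _, total in data]

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
k = -slope                 # decay constant, near the patent's 0.014
A = math.exp(my + k * mx)  # pre-exponential factor, near 83.08
```

With these four points, k comes out close to 0.014 and A lands in the same neighborhood as 83.08, consistent with an exponential speed dependence.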
Meanwhile, for a driver of the left rudder vehicle, the horizontal viewing angle of the driver is also influenced by the left side A column of the front window. When the vehicle speed does not reach a specific critical value, the effective visual field of the driver is narrowed at the left side of the central line due to the limitation of the A column, and the position corresponding to the left side edge of the front window is taken as the left side boundary of the effective visual field of the driver, so that the left effective horizontal visual field of the driver is determined; when the speed exceeds a threshold value, an effective horizontal viewing angle that is exponentially related to the vehicle speed is employed.
Considering the above, the relationship between the effective horizontal viewing angle a of the vehicle driver and the vehicle speed v is:
a_right = 83.08·e^(−0.014v)
a_left = θ1 when v ≤ v_th1; a_left = 83.08·e^(−0.014v) when v > v_th1
a_right is the effective horizontal viewing angle on the right side of the driver (i.e., the angle between the right boundary of the driver's effective horizontal field of view and the straight-ahead direction); a_left is the effective horizontal viewing angle on the left side of the driver (i.e., the angle between the left boundary of the driver's effective horizontal field of view and the straight-ahead direction); θ1, the angle between the straight-ahead direction and the right side edge of the left A-pillar when the driver is in the driving position, is related to the vehicle type: when the vehicle is a sedan, θ1 is 15°; when the vehicle is a sport utility vehicle (SUV), θ1 is 20°; when the vehicle is a flat-head vehicle such as a road maintenance engineering vehicle, θ1 is 30°; v_th1 is the critical vehicle speed at which the left boundary of the driver's effective horizontal field of view coincides with the right side edge of the vehicle's left A-pillar.
Similarly, the driver's vertical field of view is affected by the combination of the front window height and the vehicle speed. The normal person's upper and lower visual field is narrower than the left and right visual field, and the proportionality coefficient mu between the upper and lower visual field angle and the left and right visual field angle is smaller than 1, and the value is usually 2/3. At the same speed, the vertical effective field of view is μ times the horizontal effective field of view. The higher the vehicle speed, the smaller the effective viewing angle, and the less noticeable the information on the edges of the field of view. When the vehicle speed does not reach a specific critical value, the effective visual field of the driver is narrowed upwards due to the limitation of the height of the front window, and the position corresponding to the upper edge of the front window is taken as the upper boundary of the effective visual field of the driver, so that the upper effective vertical visual field of the driver is determined; when the speed exceeds a threshold, an effective vertical viewing angle that is exponentially related to the vehicle speed is employed.
Therefore, the relationship between the effective vertical viewing angle b of the vehicle driver and the vehicle speed v is:
b = θ2 when v ≤ v_th2; b = μ·83.08·e^(−0.014v) when v > v_th2
θ2, the angle between the driver's straight-ahead direction and the upper edge of the front window when the driver is in the driving position, is related to the vehicle type: when the vehicle is a sedan, θ2 is 30°; when the vehicle is an SUV, θ2 is 45°; when the vehicle is a flat-head vehicle such as a road maintenance engineering vehicle, θ2 is 60°; v_th2 is the critical vehicle speed at which the upper boundary of the driver's effective vertical field of view coincides with the upper edge of the vehicle's front window.
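Combining the two piecewise relationships, each effective viewing angle is the smaller of a structural cap (θ1 or θ2, set by vehicle type) and the speed-dependent exponential term; writing them with min() makes the critical speeds v_th1 and v_th2 implicit. A sketch under the stated assumptions, with μ = 2/3 and the function name my own:

```python
import math

# Structural viewing-angle caps (degrees) by vehicle type, per the text:
# theta1 = angle to the left A-pillar edge, theta2 = angle to the
# front-window upper edge.
THETA = {"sedan": (15, 30), "suv": (20, 45), "flat-head": (30, 60)}
MU = 2 / 3  # vertical-to-horizontal viewing-angle ratio

def effective_angles(vehicle_type, speed_kmh):
    """Return (a_left, a_right, b) in degrees for a left-hand-drive
    vehicle at the given speed."""
    theta1, theta2 = THETA[vehicle_type]
    expo = 83.08 * math.exp(-0.014 * speed_kmh)
    a_right = expo                  # no A-pillar restriction on the right
    a_left = min(theta1, expo)      # below v_th1 the A-pillar governs
    b = min(theta2, MU * expo)      # below v_th2 the window edge governs
    return a_left, a_right, b
```

For a sedan at 100 km/h this gives a_right of about 20.5°, while the A-pillar cap θ1 = 15° still governs on the left, since v_th1 for a sedan works out to roughly 122 km/h.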
When the driver maintains a normal driving posture in the driving position, the overall effective visual field range is substantially elliptical. When the viewing angle of the traffic sign is within the effective visual field of the driver, the following condition is satisfied:
(α/a)² + (β/b)² ≤ 1
where α is the horizontal included angle of the sign (i.e., the angle between the projection, on the horizontal plane, of the straight line connecting the driver and the sign and the driver's straight-ahead direction); β is the vertical included angle of the sign (i.e., the angle between the projection, on the vertical plane containing the driver's straight-ahead direction, of that straight line and the straight-ahead direction); a is the effective horizontal viewing angle of the driver; b is the effective vertical viewing angle of the driver; a = a_left when the traffic sign is located to the left of the vehicle longitudinal center line, and a = a_right when the traffic sign is located to the right of the vehicle longitudinal center line.
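With a and b in hand, the membership test for the elliptical effective field of view reduces to a single inequality. The sketch below assumes all angles are in degrees and selects a_left or a_right according to which side of the longitudinal center line the sign lies on (the function name is illustrative):

```python
def sign_in_effective_fov(alpha, beta, a_left, a_right, b, sign_on_left):
    """alpha: horizontal included angle of the sign (deg); beta: vertical
    included angle (deg).  Returns True when the sign lies inside the
    elliptical effective field of view: (alpha/a)^2 + (beta/b)^2 <= 1."""
    a = a_left if sign_on_left else a_right
    return (alpha / a) ** 2 + (beta / b) ** 2 <= 1.0
```

A sign exactly on the ellipse boundary counts as visible, which matches the "less than or equal" form of the condition above.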
Fig. 1 shows a schematic structural diagram of a system 100 for evaluating highway traffic sign functionality according to a preferred embodiment of the present invention. As shown in fig. 1, a system 100 for evaluating highway traffic sign functionality includes: laser radar 110, camera unit 120, speed sensor 130, processing unit 140, trigger unit 150, and storage unit 160.
The lidar 110 is mounted on the vehicle roof directly above the driving position of the evaluation vehicle, facing the front of the vehicle and kept horizontal. It measures the point-to-point spatial straight-line distance between the target traffic sign panel and the radar lens; the three-axis coordinates (x, y, z) of the target traffic sign in a spatial rectangular coordinate system can be obtained from the point cloud data output by the lidar 110, from which the distance of the target traffic sign relative to the radar is easily calculated.
The camera unit 120 is mounted at the same position as the lidar 110 and with the same orientation, to capture images of the target traffic sign panel. The camera unit 120 preferably employs a short-focal-length wide-angle lens so that a large field of view can be obtained. Because the relative positions of the camera unit 120 and the radar lens are fixed, the outline of the sign panel in the video image output by the camera unit 120 can be mapped onto the point cloud image of the lidar 110 at the same moment.
Since the mounting positions of the laser radar 110 and the camera unit 120 are very close to the head of the driver, the data and information collected by the laser radar 110 and the camera unit 120 can well represent the actual situation of the traffic sign observed by the driver when driving the vehicle.
A speed sensor 130 is provided on the evaluation vehicle for collecting speed information of the vehicle. The speed sensor 130 may be implemented by a speed sensor of the vehicle itself, or a separate speed sensor (e.g., a wheel encoder) may be employed. In a preferred embodiment of the present invention, a GPS location sensor, beidou location sensor, or the like, for example, may be employed to provide both vehicle position and speed information.
The laser radar 110, the camera unit 120 and the speed sensor 130 are all in communication connection with the processing unit 140, the laser radar 110 provides point cloud data to the processing unit 140, the camera unit 120 provides images acquired by the camera unit to the processing unit 140, and the speed sensor 130 provides speed information of the vehicle to the processing unit 140.
The trigger unit 150 is also communicatively connected to the processing unit 140, and the trigger unit 150 is operated by a tester to receive an operation instruction of the tester, thereby controlling the processing unit 140 to receive the point cloud data from the lidar 110, the image from the camera unit 120, and the vehicle speed information from the speed sensor 130 and store them in the storage unit 160. The storage unit 160 is also communicatively connected to the lidar 110, the camera unit 120, and the speed sensor 130.
The trigger unit 150 further includes a first trigger 151 and a second trigger 152. When the system 100 is used to evaluate the functionality of a highway traffic sign, the evaluation vehicle travels along the road, and the first trigger 151 and the second trigger 152 are operated by a tester seated in the co-driving position of the evaluation vehicle. When the tester sees the target traffic sign and can make out its information, the tester operates the first trigger 151, causing the processing unit 140 to receive data and information from the laser radar 110, the camera unit 120 and the speed sensor 130 and store them in the storage unit 160; when the vehicle reaches the same vertical section of the road as the target traffic sign, the tester operates the second trigger 152, ending the storage process. That is, the system 100 records the point cloud data of the laser radar 110, the images of the camera unit 120 and the vehicle speed information of the speed sensor 130 during the period from the triggering of the first trigger 151 to the triggering of the second trigger 152, and evaluates the functionality of the target traffic sign on that basis.
In the preferred embodiment of the present invention, the first trigger 151 and the second trigger 152 each employ a self-resetting key that springs back immediately after being pressed, supporting a higher operating frequency. In particular, when the system 100 of the present invention is implemented on a notebook computer, a specific key on the notebook keyboard can serve as a trigger, since keyboard keys are self-resetting.
Since the evaluation vehicle travels at relatively high speed, the rate at which the laser radar 110 provides point cloud data and the camera unit 120 provides images is preferably not lower than 10 fps, to avoid missing key point cloud data and image frames. If the evaluation vehicle travels at 90 km/h, 10 fps means the interval between adjacent point cloud data and image frames is 100 ms, during which the vehicle covers 2.5 m; at an interval of 200 ms (corresponding to 5 fps), however, the vehicle covers 5 m, and key point cloud data and image frames may be missed, leading to inaccurate evaluation results.
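The arithmetic above can be sketched in a few lines (a minimal illustration; the function name is ours, not from the source):

```python
def metres_per_frame(speed_kmh: float, fps: float) -> float:
    """Distance the evaluation vehicle covers between consecutive frames."""
    return (speed_kmh / 3.6) / fps

# At 90 km/h: 10 fps -> 2.5 m between frames, 5 fps -> 5 m.
```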
In the preferred embodiment of the present invention, the lidar 110, the camera unit 120 and the speed sensor 130 communicate with the processing unit 140 and the storage unit 160 over high-speed links to reduce transmission-induced time differences among the point cloud data, image frames and speed information. To avoid errors caused by clock differences among the laser radar 110, the camera unit 120 and the speed sensor 130, the devices may be timed uniformly by a common clock, for example via the NTP protocol (Network Time Protocol) or, for more accurate time synchronization, the PTP protocol (Precision Time Protocol).
When evaluating the functionality of a road traffic sign, the system 100 according to the preferred embodiment of the present invention is installed on an evaluation vehicle. The evaluation is performed, as far as possible, on a clear night with little traffic, to avoid the influence of other vehicles' lamps on the results, and the vehicle's dipped (low-beam) headlights are used for illumination. The evaluation vehicle travels at the speed limit of the road under test so as to represent the actual usage environment of the traffic sign as closely as possible; when the direction of travel has multiple lanes, the evaluation vehicle should travel in the lane farthest from the traffic sign (usually the first lane on the left side of the road). The tester should have normal eyesight, sit in the co-driving position of the evaluation vehicle, and observe the road traffic signs in a natural state, looking ahead at the road surface while the vehicle travels. Although the co-driving position differs from the driving position, the resulting difference in distance to the target traffic sign is small and can be ignored. The two fields of view do differ considerably: the driver's left field of view is narrow and the right is wide, whereas the co-driver's right field of view is narrow and the left is wide; however, this difference does not affect the functional evaluation result obtained by the evaluation method of the invention, which is described in detail later.
Fig. 2 shows a flow chart of a method for assessing highway traffic sign functionality according to a preferred embodiment of the present invention. While the evaluation vehicle travels, as soon as the tester can see the text and icons of the target traffic sign, the first trigger is pressed, instructing the processing unit to record the start time and to receive the point cloud data from the lidar and the image from the camera unit at that moment. The tester need only see the sign clearly to press the first trigger; understanding the specific meaning of the sign's information is not required.
The processing unit determines the type of the target traffic sign in the image through an image recognition algorithm, and determines the minimum visible distance the target traffic sign should satisfy in combination with the type information of the road. Preferably, the minimum visible distances of different sign types under different road grades and speed limit values are stored in a lookup table, so that the minimum visible distance corresponding to the road grade can be queried according to the type of the traffic sign panel.
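Such a lookup can be sketched as below. Only the 120 m entry for a prohibition sign on an expressway follows the worked example later in this document; every other key and value here is a placeholder, not data from the source:

```python
# Hypothetical lookup table: (road grade, sign category) -> minimum visible
# distance in metres. The 120 m expressway/prohibition entry follows the
# worked example in this document; the other rows are placeholder values.
MIN_VISIBLE_DISTANCE_M = {
    ("expressway", "prohibition"): 120,
    ("expressway", "warning"): 150,        # placeholder value
    ("first-class", "prohibition"): 100,   # placeholder value
}

def minimum_visible_distance(road_grade: str, sign_type: str) -> int:
    """Query the minimum visible distance for a recognised sign type."""
    return MIN_VISIBLE_DISTANCE_M[(road_grade, sign_type)]
```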
The processing unit determines the straight-line visual recognition distance of the target traffic sign from the point cloud data of the laser radar, i.e., the straight-line distance between the laser radar and the sign at the moment the tester sees the target traffic sign, which is taken as equivalent to the straight-line distance between the driver and the sign. To this end, the point coordinates corresponding to the target traffic sign must be found in the point cloud data.
In a preferred embodiment of the invention, the point cloud coordinates in the range of the target traffic sign image are determined by performing position fusion on the point cloud data of the laser radar and the image of the camera unit, and the specific process is as follows:
After the relative positions of the laser radar and the camera unit are mechanically fixed, the mapping between the horizontal and vertical coordinates of scanning points on the radar point cloud image and the pixel coordinates in the image can be calibrated. Let (x_r, y_r) denote the coordinates of a scanning point on the radar point cloud (its distances along the x-axis and y-axis relative to the coordinate origin at the radar lens), and let (x_c, y_c) denote the coordinates of a pixel in the video image captured by the camera unit; for a 1920×1080 picture, for example, the top-left pixel has coordinates (0, 0) and the bottom-right pixel (1919, 1079). The mapping between scanning-point coordinates on the radar point cloud and pixel coordinates in the video image can then be expressed as follows:
By determining the pixel coordinates of the target traffic sign in the video image, the position coordinates of the target traffic sign on the point cloud image can be determined.
In the evaluation process, the pixel coordinates of the sign region recognized in the image can be expressed in matrix form: an abscissa pixel matrix (x_c1, x_c2, …, x_cm) and an ordinate pixel matrix (y_c1, y_c2, …, y_cm). Applying the inverse of the calibrated mapping to each pixel yields the corresponding abscissas and ordinates of the sign within the point cloud region.
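A minimal sketch of such a calibrated mapping and its inverse, under the simplifying assumption that the mapping is affine in each axis (the source does not give the mapping's form; all names and calibration values here are ours):

```python
# Sketch of the calibrated radar<->pixel mapping, assumed affine per axis;
# sx, tx, sy, ty would come from calibration and are hypothetical here.
def radar_to_pixel(xr: float, yr: float, sx: float, tx: float,
                   sy: float, ty: float) -> tuple:
    """Map point-cloud coordinates (xr, yr) to pixel coordinates."""
    return sx * xr + tx, sy * yr + ty

def pixel_to_radar(xc: float, yc: float, sx: float, tx: float,
                   sy: float, ty: float) -> tuple:
    """Inverse mapping: pixel coordinates of the sign region back to
    point-cloud coordinates."""
    return (xc - tx) / sx, (yc - ty) / sy
```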
The depth coordinates corresponding to these point-cloud abscissas and ordinates are then looked up in the radar point cloud data, and a spatial right triangle is constructed from the horizontal, vertical and depth coordinates to calculate the straight-line distance between each scanning point and the radar lens.
To avoid errors, and considering that existing image recognition algorithms cannot recognize the points along the contour edge of a traffic sign very accurately, a portion of the scanning points whose straight-line distances to the radar lens are largest and smallest can be excluded when calculating the actual straight-line visual recognition distance, and the arithmetic mean of the straight-line distances of the remaining scanning points taken as the actual straight-line visual recognition distance. For example, after sorting the straight-line distances of the scanning points from largest to smallest, the first 10% and the last 10% are excluded and the actual straight-line visual recognition distance is calculated from the middle 10%-90% of the scanning points. In the preferred embodiment of the invention, the points whose straight-line distances fall in the 10%-20% band of the descending order are used for the calculation, which avoids interference from individual points with unreliable distance data while improving the execution speed of the algorithm.
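The 10%-20% band selection of the preferred embodiment can be sketched as follows (function names and band defaults are ours):

```python
def band_of_distances(distances, lo=0.10, hi=0.20):
    """Return the scan-point distances that fall in the lo-hi band after
    sorting from largest to smallest (10%-20% in the preferred embodiment)."""
    ordered = sorted(distances, reverse=True)
    n = len(ordered)
    return ordered[int(n * lo): int(n * hi)]

def actual_viewing_distance(distances):
    """Arithmetic mean of the retained band, taken as the actual
    straight-line visual recognition distance."""
    band = band_of_distances(distances)
    return sum(band) / len(band)
```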
Fig. 3 shows a schematic diagram of the manner in which the lidar measures the linear distance to the target traffic sign according to the preferred embodiment of the present invention. In the figure, x, y and z are coordinates of the target traffic sign under a rectangular coordinate system of the laser radar, and D is a linear distance between the target traffic sign and the laser radar.
At the moment the first trigger is triggered, n radar scanning points (i.e., sign scanning points) are selected from the target traffic sign panel region through the aforementioned position-fusion process (for example, the points in the 10%-20% band after sorting the straight-line distances from largest to smallest). Denoting the coordinates of the i-th sign scanning point as (x_i, y_i, z_i), 1 ≤ i ≤ n, the straight-line distance D between the target traffic sign and the lidar (i.e., the actual visual recognition distance of the sign) can be expressed as:

D = √(x̄² + ȳ² + z̄²)

where x̄ is the average of the x_i, i.e., the average horizontal distance between the target traffic sign and the radar lens; ȳ is the average of the y_i, i.e., the average vertical distance between the target traffic sign and the radar lens; and z̄ is the average of the z_i, i.e., the average depth distance between the target traffic sign and the radar lens.
Thereby, the horizontal included angle α of the target traffic sign satisfies tan α = x̄/z̄, and the vertical included angle β satisfies tan β = ȳ/z̄, so that α and β can be obtained by simple calculation.
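These formulas can be sketched directly (x lateral, y vertical, z depth in the radar frame; the arctangent reconstruction of the angles is our reading of the right-triangle construction described above):

```python
import math

def sign_geometry(points):
    """points: sign scan points (x_i, y_i, z_i) in the radar frame, with x
    lateral, y vertical and z depth. Returns (D, alpha_deg, beta_deg) using
    D = sqrt(xbar^2 + ybar^2 + zbar^2), tan(alpha) = xbar/zbar and
    tan(beta) = ybar/zbar, as in the document's formulas."""
    n = len(points)
    xbar = sum(p[0] for p in points) / n
    ybar = sum(p[1] for p in points) / n
    zbar = sum(p[2] for p in points) / n
    dist = math.sqrt(xbar ** 2 + ybar ** 2 + zbar ** 2)
    alpha = math.degrees(math.atan2(abs(xbar), zbar))
    beta = math.degrees(math.atan2(abs(ybar), zbar))
    return dist, alpha, beta
```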
After the straight-line visual recognition distance of the target traffic sign is determined, it is compared with the sign's minimum visible distance. If the straight-line visual recognition distance is smaller than the minimum visible distance, the driver cannot see the target traffic sign clearly in time, and the target traffic sign is determined to be functionally abnormal. If the straight-line visual recognition distance is greater than or equal to the minimum visible distance, it must be further determined whether the target traffic sign is within the driver's effective field of view.
As described above, the driver's effective horizontal viewing angle a and effective vertical viewing angle b can be obtained from the speed of the evaluation vehicle at the moment the first trigger is triggered and from the vehicle type. If the horizontal included angle α and the vertical included angle β of the target traffic sign satisfy the following conditions, namely α ≤ a and β ≤ b,
it may be determined that the target traffic sign is within the driver's effective field of view. In other words, at the moment the first trigger is triggered, the driver can see the sign at a distance greater than the minimum visible distance, and the sign lies within the driver's effective field of view, so the driver needs no large body-posture adjustment to see it. The target traffic sign can therefore be determined to be functioning normally.
If, instead, the horizontal included angle α and the vertical included angle β of the target traffic sign satisfy α > a or β > b,
it may be determined that the target traffic sign is outside the driver's effective field of view, or at its boundary. At the moment the first trigger is triggered, the position of the target traffic sign relative to the driver makes it hard for the driver to notice. In this case, however, the target traffic sign should not yet be judged functionally abnormal: because the driver's distance to the sign is still greater than the sign's minimum visible distance, the sign may come within the driver's effective field of view as the relative position of driver and sign changes, before the evaluation vehicle travels so far that the distance to the sign falls below the minimum visible distance. It is therefore necessary to continue acquiring the point cloud data of the lidar, the images of the camera unit and the vehicle speed information provided by the speed sensor.
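The field-of-view test, together with the a_right formula from claim 7, can be sketched as follows (the ≤ inequalities are our reconstruction of the condition, which is rendered as an image in the source):

```python
import math

def effective_horizontal_angle_right(v_kmh: float) -> float:
    """a_right = 83.08 * e^(-0.014 v), the formula given in claim 7."""
    return 83.08 * math.exp(-0.014 * v_kmh)

def sign_in_effective_fov(alpha: float, beta: float,
                          a: float, b: float) -> bool:
    """True when both included angles are within the effective viewing
    angles (reconstructed inequality)."""
    return alpha <= a and beta <= b
```

At 98 km/h the first function gives roughly 21°, matching the worked example later in the text.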
After the first trigger is pressed, the processing unit synchronously records the point cloud data of the lidar and the images of the camera unit at a given rate (for example, not lower than 10 fps) while the evaluation vehicle continues to travel; when the target traffic sign and the tester are on the same vertical section of the road, the tester presses the second trigger, and the processing unit stops recording the point cloud data from the lidar, the images from the camera unit and the speed information from the speed sensor.
In a period between the pressing of the first trigger by the tester and the pressing of the second trigger, the point cloud data, the captured image, and the speed information of the vehicle are stored in the storage unit in a format shown in table 1, for example.
TABLE 1
Wherein the "trigger" item in the first row is marked "1" indicating that the first trigger was pressed, the "trigger" item in the last row is marked "2" indicating that the second trigger was pressed, and each row of "trigger" items therebetween is marked "0" indicating that no trigger was pressed.
For the case where the straight-line visual recognition distance is greater than or equal to the minimum visible distance of the target traffic sign but the sign is outside the driver's effective field of view, the processing unit re-determines the sign's straight-line visual recognition distance D, horizontal included angle α and vertical included angle β from the next frame of point cloud data, camera image and vehicle speed information, and repeats the preceding evaluation process. If, before the straight-line visual recognition distance D falls below the minimum visible distance, the sign comes within the driver's effective field of view, the sign can be determined to be functioning normally.
If the sign has still not come within the driver's effective field of view by the time the straight-line distance between the lidar and the target traffic sign falls below the sign's minimum visible distance, the sign can be determined to be malfunctioning.
If a sign is judged functionally abnormal because its actual visual recognition distance is smaller than the minimum visible distance, it can generally be inferred that its retroreflective performance is unsatisfactory: the retroreflectivity of the reflective material may need improvement, or the orientation of the sign may be set improperly. If a sign's actual visual recognition distance is greater than the minimum visible distance but the sign never enters the driver's effective field of view, it can generally be inferred that its retroreflectivity reaches the required level but its mounting position or angle is improper.
When the system and method of the invention are applied to evaluate the functionality of road traffic signs, the processing unit receives the operator's instructions through the trigger unit and need not analyze all of the data output by the laser radar and the camera unit: at most, the data within the period from the triggering of the first trigger to the triggering of the second trigger must be processed to evaluate the functionality of the target traffic sign. This effectively reduces the demands that the evaluation process places on the data processing and computing capacity of the processing unit and on the data transmission and storage capacity of the system.
The most important purpose of evaluating the functionality of road traffic signs is to determine whether a sign can timely and accurately convey information to the drivers of passing vehicles. For driving-safety reasons, however, the driver cannot participate in the evaluation process while driving the vehicle. According to the invention, the driver's viewpoint is simulated by the laser radar and the camera unit mounted on the roof directly above the driving position, while the evaluation process is started by the tester in the co-driving position, so that the effect of the sign in its actual usage scenario can be checked without affecting safe driving. In addition, calculating the horizontal and vertical included angles of the sign relative to the driving position effectively avoids the deviation caused by the difference between the tester's and the driver's fields of view, and eliminates errors that could arise if the tester actively searched for the sign by, for example, twisting, raising the head, or leaning the body; after all, in a natural driving state the driver must keep looking ahead for safe driving and should not make movements that seriously affect driving safety. Moreover, in situations such as a sign panel hung high above the road, a sign on the outer side of a one-way multi-lane road, a small horizontal-curve radius or a large longitudinal grade, even when the sign's straight-line visual recognition distance meets the functional evaluation index, its visual recognition angles usually still need to be checked, to ensure that the driver can effectively acquire the information presented by the sign in a natural driving state.
In a preferred embodiment of the invention, the second trigger is triggered when the evaluation vehicle travels to the same vertical section of the road as the target traffic sign, providing an indication signal that causes the processing unit to stop recording data and information from the lidar, the camera unit and the speed sensor. Those skilled in the art will appreciate, however, that the second trigger may be omitted: for example, from the positional relationship between the vehicle and the sign at the moment the first trigger is triggered, combined with the speed of the evaluation vehicle, the time required for the vehicle to pass the sign can be calculated, so that recording of data and information from the laser radar, the camera unit and the speed sensor can be stopped automatically after a set time has elapsed following the first trigger.
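A minimal sketch of such an automatic stop delay (the margin factor is our addition, not from the source):

```python
def auto_stop_delay_s(depth_to_sign_m: float, speed_kmh: float,
                      margin: float = 1.1) -> float:
    """Seconds to keep recording after the first trigger when the second
    trigger is omitted: the time for the vehicle to cover the remaining
    depth distance to the sign, with a small safety margin."""
    return margin * depth_to_sign_m / (speed_kmh / 3.6)
```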
It will be appreciated by those skilled in the art that any image recognition algorithm currently known or available in the future may be employed by the processing unit, provided that the type of target traffic sign is effectively recognized. Parameters affecting the minimum viewing distance, such as road grade, sign type, etc., and the correspondence with the minimum viewing distance may be preset in a program executed by the processing unit, so that the minimum viewing distance corresponding to the target traffic sign may be directly determined based on these parameters.
The embodiments of the present invention will be further described using the evaluation of a speed-limit sign board on the outer side of a one-way three-lane expressway as an example. The speed limit of the inside lane of the three-lane expressway is 100 km/h, the horizontal alignment of the road is a circular curve, and the evaluation vehicle used is a sport utility vehicle (SUV).
When the tester can clearly see the characters and icons of the sign, the first trigger is pressed at once and a video image of that moment is acquired. Image recognition determines the speed-limit sign to be one of the prohibition signs, so its minimum visible distance should reach 120 m. At the moment the first trigger is triggered, the vehicle travel speed is 98 km/h. The horizontal distance of the sign is x_1. The sign is located to the right of the evaluation vehicle's longitudinal centerline, and the effective horizontal viewing angle is calculated as a_1 = a_right = 21°; taking μ = 2/3, the effective vertical viewing angle is calculated as b_1 = 14°, the critical speed for the evaluation vehicle's vertical viewing angle not having been reached. The sign point cloud data collected by the laser radar are as follows:
The actual visual recognition distance is calculated as D_1 = 133.23 m, satisfying the requirement that the actual visual recognition distance be greater than the minimum visible distance. Further calculation of the sign's visual recognition angles gives α_1 = 21.2° and β_1 = 2.03°, from which α_1 = 21.2° > a_1 = 21°, indicating that at the moment the first trigger is triggered the sign is outside the driver's effective field of view and difficult to see clearly.
The data at the next moment are then calculated. The vehicle travel speed is still 98 km/h, so the effective horizontal and vertical viewing angles are unchanged: a_2 = 21°, b_2 = 14°. The sign point cloud data collected by the laser radar are as follows:
The straight-line distance between the laser radar and the sign at this moment is calculated as D_2 = 125.08 m, greater than the minimum visible distance. Further calculation of the sign's visual recognition angles gives α_2 = 20.78° and β_2 = 1.75°, from which α_2 = 20.78° ≤ a_2 = 21° and β_2 = 1.75° ≤ b_2 = 14°, indicating that at this moment the sign is within the driver's effective field of view. The sign can therefore be determined to be functioning normally.
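The two frames of this worked example can be re-checked numerically (the relation b = μ·a below the critical speed is our assumption, consistent with the reported 21° and 14°):

```python
import math

v = 98.0                                  # vehicle speed, km/h
a_right = 83.08 * math.exp(-0.014 * v)    # effective horizontal angle, ~21 deg
mu = 2.0 / 3.0
b = mu * a_right                          # ~14 deg; b = mu * a below the
                                          # critical speed is an assumption

# First frame: alpha_1 = 21.2 deg exceeds a_right -> outside the field.
frame1_outside = (21.2 > a_right) or (2.03 > b)
# Next frame: alpha_2 = 20.78 deg and beta_2 = 1.75 deg both fit -> inside.
frame2_inside = (20.78 <= a_right) and (1.75 <= b)
```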
In a preferred embodiment of the invention, the processing unit evaluates the functionality of the target traffic sign in real time and records the evaluation result corresponding to the sign. Since the point cloud data of the laser radar, the images of the camera unit and the speed information of the vehicle are stored in the storage unit between the triggering of the first trigger and the triggering of the second trigger, the information of all target traffic signs within a given road section can alternatively be processed uniformly after acquisition is complete.
It should be appreciated that the specific values in the above embodiments may all be adjusted within reasonable ranges according to the specific application and are not limited to the cases given above. That is, those skilled in the art may make various changes and/or modifications to the specific embodiments without departing from the spirit of the invention.
Claims (10)
1. A system for assessing highway traffic sign functionality, comprising:
a lidar mounted on the roof of the evaluation vehicle directly above the driving position, facing the front of the vehicle and kept horizontal;
the camera shooting unit is arranged at the same position with the laser radar and keeps consistent orientation, and is used for acquiring images of the target traffic sign;
a speed sensor provided on the evaluation vehicle for acquiring speed information of the evaluation vehicle;
the processing unit is in communication connection with the laser radar, the image pickup unit and the speed sensor and can receive point cloud data provided by the laser radar, images provided by the image pickup unit and vehicle speed information provided by the speed sensor;
The triggering unit is in communication connection with the processing unit and is used for receiving an operation instruction of a tester, so that the processing unit is controlled to receive point cloud data from the laser radar, images from the camera unit and vehicle speed information from the speed sensor;
the processing unit determines the type of the target traffic sign, the straight line visual recognition distance of the target traffic sign, the horizontal included angle and the vertical included angle based on the point cloud data, the image and the vehicle speed information, and evaluates whether the function of the target traffic sign is normal by judging whether the target traffic sign is in the effective visual field range of the driver when the straight line visual recognition distance of the target traffic sign is greater than or equal to the minimum visual recognition distance.
2. The system of claim 1, further comprising a storage unit in communicative connection with the processing unit, the lidar, the camera unit, and the speed sensor for storing the point cloud data, the image, and the vehicle speed information.
3. The system of claim 1 or 2, wherein the trigger unit comprises a first trigger, the processing unit receiving the point cloud data, the image, and the vehicle speed information when the first trigger is triggered.
4. The system of claim 3, wherein the trigger unit further comprises a second trigger, the processing unit ceasing to receive the point cloud data, the image, and the vehicle speed information when the second trigger is triggered.
5. A method of evaluating highway traffic sign functionality comprising the steps of:
step one: setting the system according to any one of claims 1 to 4 on an evaluation vehicle, enabling the evaluation vehicle to run towards a target traffic sign according to the speed limit value of the road on which the evaluation vehicle is located, and enabling a tester to sit at the co-driving position of the evaluation vehicle;
step two: when a tester can see the content of the target traffic sign, the trigger unit is immediately operated, and the control processing unit receives point cloud data from the laser radar, images from the camera unit and vehicle speed information from the speed sensor at the moment;
step three: judging the type of the target traffic sign in the image through a processing unit, and determining the minimum visible distance of the target traffic sign;
step four: determining a linear visual recognition distance of the target traffic sign based on point cloud data of the laser radar, and determining that the target traffic sign is abnormal in function if the linear visual recognition distance is smaller than the minimum visual recognition distance;
Step five: if the straight line visual recognition distance is larger than or equal to the minimum visual recognition distance, determining whether the target traffic sign is in the effective visual field range of the driver at the moment based on the point cloud data of the laser radar, and if the target traffic sign is in the effective visual field range of the driver, determining that the target traffic sign is normal in function;
step six: if the target traffic sign is not within the effective visual field of the driver, the fourth, fifth and sixth steps are executed again by the processing unit based on the point cloud data from the lidar, the image from the camera unit and the vehicle speed information from the speed sensor at the next time.
6. The method according to claim 5, wherein the processing unit synchronously records the point cloud data of the lidar and the images of the camera unit at a rate of not less than 10 fps.
7. The method of claim 5, wherein determining whether the target traffic sign is within the effective field of view of the driver further comprises:
step A1: based on the type and speed of the vehicle being evaluated, an effective horizontal viewing angle and an effective vertical viewing angle of the driver are determined,
a_right = 83.08·e^(−0.014·v)

where a_right is the driver's effective horizontal viewing angle on the right side; a_left is the driver's effective horizontal viewing angle on the left side; θ_1 is the included angle between the straight-ahead direction and the right-side edge of the left A-pillar when the driver is in the driving position; v is the vehicle speed; v_th1 is the critical vehicle speed at which the left boundary of the driver's effective horizontal field of view coincides with the right-side edge of the vehicle's left A-pillar; b is the driver's effective vertical viewing angle; θ_2 is the included angle between the driver's straight-ahead direction and the upper edge of the front window when the driver is in the driving position; v_th2 is the critical vehicle speed at which the upper boundary of the driver's effective vertical field of view coincides with the upper edge of the vehicle's front window; and μ is the proportionality coefficient between the vertical and horizontal viewing angles;
step A2: determining the horizontal included angle and the vertical included angle of the target traffic sign, and judging whether they satisfy the following conditions: α ≤ a and β ≤ b,
wherein α is the horizontal angle of the sign, β is the vertical angle of the sign, a is the effective horizontal angle of view of the driver, b is the effective vertical angle of view of the driver, and a=a when the target traffic sign is located to the left of the longitudinal centerline of the vehicle Left side A=a when the traffic sign is located on the right side of the vehicle longitudinal center line Right side ;
And if the horizontal included angle and the vertical included angle of the target traffic sign meet the conditions, determining that the target traffic sign is in the effective visual field range of the driver, otherwise, determining that the target traffic sign is not in the effective visual field range of the driver.
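The field-of-view test of steps A1 and A2 can be sketched as follows. The exponential model 83.08·e^(−0.014·v) appears in the claims; the A-pillar angle `theta1`, the front-window angle `theta2` and the proportionality coefficient `mu` are illustrative defaults, not values from the patent:

```python
import math

def effective_angles(v, theta1=30.0, theta2=15.0, mu=0.5):
    """Driver's effective viewing angles (degrees) at speed v (km/h).

    The exponential model is taken from the claims; theta1 (left
    A-pillar angle), theta2 (front-window angle) and mu
    (vertical/horizontal proportionality) are illustrative defaults.
    """
    a_exp = 83.08 * math.exp(-0.014 * v)
    a_right = a_exp              # right side: unobstructed by the left A-pillar
    a_left = min(a_exp, theta1)  # left side: clamped by the left A-pillar
    b = min(mu * a_exp, theta2)  # vertical: clamped by the front-window edge
    return a_left, a_right, b

def sign_in_view(alpha, beta, v, sign_on_left):
    """True if a sign at horizontal angle alpha and vertical angle beta
    (degrees) falls inside the driver's effective field of view."""
    a_left, a_right, b = effective_angles(v)
    a = a_left if sign_on_left else a_right
    return alpha <= a and beta <= b
```

At higher speeds the exponential term shrinks, so a sign at a fixed angular offset drops out of the effective field of view, which is what triggers the re-evaluation in step six.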
8. The method of claim 5, wherein determining the straight-line visual recognition distance of the target traffic sign based on the point cloud data of the lidar further comprises:
step B1: establishing a mapping relation between the horizontal and vertical coordinates of each scanning point in the point cloud data and the horizontal and vertical coordinates of each pixel point in the image;
step B2: identifying a target traffic sign from the image, and determining a sign scanning point corresponding to the target traffic sign in the point cloud data through the mapping relation;
step B3: calculating the straight-line visual recognition distance L of the target traffic sign by the following formula:

L = (1/n)·Σ_{i=1..n} √(x_i² + y_i² + z_i²)

wherein n is the number of sign scan points used to calculate the straight-line visual recognition distance, x_i is the abscissa of the i-th sign scan point, y_i is the ordinate of the i-th sign scan point, and z_i is the depth coordinate of the i-th sign scan point.
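One natural reading of step B3 — the straight-line visual recognition distance as the mean Euclidean distance of the n sign scan points — can be sketched as:

```python
import math

def straight_line_distance(points):
    """Mean Euclidean distance of sign scan points (x_i, y_i, z_i),
    read here as the straight-line visual recognition distance."""
    n = len(points)
    return sum(math.sqrt(x * x + y * y + z * z) for x, y, z in points) / n
```

Averaging over the sign's scan points smooths out per-point lidar range noise, which a single-point measurement would pass through directly.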
9. The method of claim 8, wherein the horizontal included angle α of the target traffic sign satisfies α = arctan(|x̄|/z̄), and the vertical included angle β of the target traffic sign satisfies β = arctan(|ȳ|/z̄),

wherein x̄ is the average value of x_i, ȳ is the average value of y_i, and z̄ is the average value of z_i.
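A sketch of the claim 9 angle computation, reading the angles as arctangents of the mean scan-point coordinates (this arctangent form is an assumption on our part):

```python
import math

def sign_angles(points):
    """Horizontal angle alpha and vertical angle beta (degrees) of a
    target traffic sign, computed from the mean scan-point coordinates.
    The arctangent form is an assumed reading of the claim."""
    n = len(points)
    xm = sum(p[0] for p in points) / n  # mean lateral offset
    ym = sum(p[1] for p in points) / n  # mean vertical offset
    zm = sum(p[2] for p in points) / n  # mean depth
    alpha = math.degrees(math.atan2(abs(xm), zm))
    beta = math.degrees(math.atan2(abs(ym), zm))
    return alpha, beta
```

The absolute values make the angles side-agnostic; which effective viewing angle (a_left or a_right) they are compared against is decided separately by the sign's position relative to the vehicle centerline.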
10. The method of claim 8, wherein the n sign scan points used to calculate the straight-line visual recognition distance of the target traffic sign are the points falling between the top 10% and the top 20% after all sign scan points are sorted by straight-line distance in descending order.
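The selection rule of claim 10 — keep the points between the top 10% and top 20% when sorted by straight-line distance, largest first — can be sketched as:

```python
def select_scan_points(points):
    """Return the sign scan points falling between the top 10% and the
    top 20% after sorting by straight-line distance, largest first."""
    ranked = sorted(points,
                    key=lambda p: (p[0] ** 2 + p[1] ** 2 + p[2] ** 2) ** 0.5,
                    reverse=True)
    m = len(ranked)
    return ranked[int(0.1 * m):int(0.2 * m)]
```

Discarding the top 10% trims outlier returns beyond the sign face, while capping at 20% keeps the sample on the far edge of the sign, where the recognition distance is measured.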
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311541304.4A CN117518181A (en) | 2023-11-17 | 2023-11-17 | System and method for evaluating highway traffic sign functionality |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117518181A true CN117518181A (en) | 2024-02-06 |
Family
ID=89754670
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101727744A (en) * | 2009-11-27 | 2010-06-09 | 东南大学 | Method for determining optimal setting position of roadside speed limit sign on ordinary road |
CN201716017U (en) * | 2010-05-07 | 2011-01-19 | 交通部公路科学研究所 | Traffic sign dynamic visual cognition distance analogue tester |
CN102620766A (en) * | 2012-04-11 | 2012-08-01 | 天津市市政工程设计研究院 | Dynamic legibility evaluation method for road tunnel traffic signs |
CN103760774A (en) * | 2013-12-26 | 2014-04-30 | 西南交通大学 | Simulation assessment system for reasonability of traffic sign design and set position |
US9972230B1 (en) * | 2012-09-12 | 2018-05-15 | Delorean, Llc | Traffic display with viewing distance control |
CN114724104A (en) * | 2022-05-24 | 2022-07-08 | 交通运输部公路科学研究所 | Method, device, electronic equipment, system and medium for detecting visual recognition distance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||