CN111398989A - Performance analysis method and test equipment of driving assistance system - Google Patents


Info

Publication number: CN111398989A
Application number: CN202010256859.4A
Authority: CN (China)
Legal status: Pending
Inventor: 张磊 (Zhang Lei)
Assignee (current and original): Kunyi Electronic Technology Shanghai Co Ltd
Original language: Chinese (zh)
Prior art keywords: information, true value, target information, evaluated, driving assistance
Application filed by Kunyi Electronic Technology Shanghai Co Ltd, with priority to CN202010256859.4A


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/87 — Combinations of systems using electromagnetic waves other than radio waves

Abstract

The invention provides a performance analysis method for a driving assistance system, comprising the steps of: acquiring at least one target information true value in real time at test equipment independent of the vehicle-mounted sensors; acquiring at least one function information true value related to driving assistance in real time at the test equipment; acquiring the target information to be evaluated and the function information to be evaluated determined by the driving assistance system; and comparing the target information to be evaluated with the associated target information true value, and the function information to be evaluated with the associated function information true value, to evaluate the performance of the driving assistance system.

Description

Performance analysis method and test equipment of driving assistance system
Technical Field
The present invention relates to driving assistance technology, and in particular to a performance analysis method and test apparatus for a driving assistance system.
Background
An autonomous vehicle can automatically acquire environmental information around the vehicle, make decisions, and plan paths by means of technologies such as artificial intelligence, computer vision, radar, global positioning systems, and high-precision maps, thereby realizing driving that is fully independent of human operation. Autonomous vehicles rely on the vehicle's Advanced Driver Assistance System (ADAS) to perform the required functions. Testing the assisted-driving performance of the ADAS is a key link in ensuring vehicle safety.
Disclosure of Invention
The invention aims to provide a performance analysis method and test equipment for a driving assistance system that can analyze driving-assistance performance in real time.
In order to solve the above technical problem, the present invention provides a method for analyzing the performance of a driving assistance system, including the steps of: acquiring at least one target information true value in real time at test equipment independent of the vehicle-mounted sensors; acquiring at least one function information true value related to driving assistance in real time at the test equipment; acquiring the target information to be evaluated and the function information to be evaluated determined by the driving assistance system; and comparing the target information to be evaluated with the associated target information true value, and the function information to be evaluated with the associated function information true value, to evaluate the performance of the driving assistance system.
In an embodiment of the present invention, the step of obtaining at least one true value of the target information in real time at a test device independent of the vehicle-mounted sensor comprises: acquiring raw data of an external sensor independent of a vehicle-mounted sensor; and extracting the target information true value from the original data.
In an embodiment of the present invention, the step of obtaining at least one true value of the target information in real time at a test device independent of the vehicle-mounted sensor comprises: target data is acquired from an external sensor independent of the vehicle-mounted sensor, and a target information true value is generated according to the target data.
In an embodiment of the invention, the method further includes fusing data of a plurality of external sensors to obtain the true value of the target information.
In an embodiment of the present invention, the step of acquiring at least one function information true value related to driving assistance in real time at the test device includes: calculating the function information true value from at least one target information true value; or calculating first function information from at least one target information true value, acquiring second function information from an external sensor independent of the vehicle-mounted sensors, and performing function arbitration to obtain the function information true value.
In an embodiment of the present invention, a method for comparing the target information to be evaluated with the associated target information truth value includes: comparing the parameter deviation of the target information to be evaluated and the associated target information true value at the same time point; and comparing the identification time deviation of the target information to be evaluated and the associated target information true value.
In an embodiment of the present invention, a method for comparing the function information to be evaluated with the associated function information truth value includes: comparing the parameter deviation of the functional information to be evaluated and the associated functional information true value at the same time point; and comparing the identification time deviation of the function information to be evaluated and the associated function information true value.
In an embodiment of the present invention, the method further includes triggering one or more system behaviors according to one or more evaluation results.
In an embodiment of the invention, the system behavior includes: recording vehicle data and environment data before and after the failure moment of the driving assistance system, and issuing a prompt.
The invention also proposes a test device for performance analysis of a driving assistance system, the test device comprising a processor, a memory and a computer program which, when executed by the processor, performs the method as described above.
Compared with the prior art, the performance test method and test equipment can acquire true values of ADAS target objects and function data in real time and simultaneously compare and analyze them against the information from the ADAS controller under test, thereby helping to judge whether the corresponding functions of the vehicle's ADAS controller under test are normal.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the principle of the invention. In the drawings:
fig. 1 is a system implementation environment of a driving assistance system performance analysis according to an embodiment of the present invention.
Fig. 2 is a block diagram of a test apparatus according to an embodiment of the present invention.
FIG. 3 is a block diagram of a test apparatus according to another embodiment of the present invention.
Fig. 4 is a flowchart of a driving assistance system performance analysis method according to an embodiment of the present invention.
FIG. 5 is an example of test equipment acquiring data according to an embodiment of the present invention.
FIG. 6 is an example of test equipment acquiring data according to another embodiment of the present invention.
Fig. 7 is a data flow diagram according to an embodiment of the invention.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in describing the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only examples or embodiments of the application, from which the application can also be applied to other similar scenarios without inventive effort on the part of a person skilled in the art. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may include other steps or elements.
The relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present application unless specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description. Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
In the description of the present application, it is to be understood that the orientation or positional relationship indicated by the directional terms such as "front, rear, upper, lower, left, right", "lateral, vertical, horizontal" and "top, bottom", etc., are generally based on the orientation or positional relationship shown in the drawings, and are used for convenience of description and simplicity of description only, and in the case of not making a reverse description, these directional terms do not indicate and imply that the device or element being referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore, should not be considered as limiting the scope of the present application; the terms "inner and outer" refer to the inner and outer relative to the profile of the respective component itself.
Spatially relative terms, such as "above," "over," "on top of," and the like, may be used herein for ease of description to describe one device's or feature's spatial relationship to another device or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, devices described as "above" or "on" other devices or configurations would then be oriented "below" or "under" the other devices or configurations. Thus, the exemplary term "above" can encompass both an orientation of "above" and one of "below." The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
It should be noted that the terms "first", "second", and the like are used to define the components, and are only used for convenience of distinguishing the corresponding components, and the terms have no special meanings unless otherwise stated, and therefore, the scope of protection of the present application is not to be construed as being limited. Further, although the terms used in the present application are selected from publicly known and used terms, some of the terms mentioned in the specification of the present application may be selected by the applicant at his or her discretion, the detailed meanings of which are described in relevant parts of the description herein. Further, it is required that the present application is understood not only by the actual terms used but also by the meaning of each term lying within.
It will be understood that when an element is referred to as being "on," "connected to," "coupled to" or "contacting" another element, it can be directly on, connected or coupled to, or contacting the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly on," "directly connected to," "directly coupled to" or "directly contacting" another element, there are no intervening elements present. Similarly, when a first component is said to be "in electrical contact with" or "electrically coupled to" a second component, there is an electrical path between the first component and the second component that allows current to flow. The electrical path may include capacitors, coupled inductors, and/or other components that allow current to flow even without direct contact between the conductive components.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed in the exact order in which they are performed. Rather, various steps may be processed in reverse order or simultaneously. Meanwhile, other operations are added to or removed from these processes.
Fig. 1 is a system implementation environment for driving assistance system performance analysis according to an embodiment of the present invention. Referring to fig. 1, the system implementation environment of the present embodiment has a test apparatus 100, an onboard ADAS controller 200, sensors such as a camera 310 and a lidar 320, and a CAN bus 400. The test device 100 and the onboard ADAS controller 200 are both connected to the CAN bus 400. The camera 310 and the lidar 320 are connected to the test apparatus 100. The vehicle-mounted ADAS controller 200 executes the ADAS to realize driving assistance; it can automatically acquire environmental information around the vehicle, make decisions, and plan paths by means of technologies such as computer vision, radar, global positioning systems, high-precision maps, and artificial intelligence. The test apparatus 100, the camera 310, and the lidar 320 are all devices independent of the vehicle-mounted system and are additionally connected to it during testing. The test device 100 may communicate with the onboard ADAS controller 200 via the CAN bus 400. It is understood that the camera 310 and the lidar 320 are examples only, and other sensors, such as a positioning and navigation system, may be present in the system implementation environment. In addition, the lidar 320 may be replaced with another radar, such as a millimeter-wave radar.
Fig. 2 is a block diagram of a test apparatus according to an embodiment of the present invention. Referring to fig. 2, the test apparatus 100 of the present embodiment may include a processor 101, a memory 102, a video interface 103, an ethernet interface 104, and a CAN interface 105. The CAN interface 105 is connected to the CAN bus 400. The video interface 103 and the ethernet interface 104 are respectively connected to the camera 310 and the laser radar 320. The processor 101 is connected to a video interface 103, an ethernet interface 104 and a CAN interface 105. The memory 102 may store computer instructions. The processor 101 may execute computer instructions to implement the desired test functions.
During ADAS development and testing, true values are required as auxiliary references both for identifying target objects in the environment and for analyzing ADAS functions; the analysis determines whether the corresponding functions of the vehicle's ADAS are normal.
In the context of the present application, a true value is the actual value, objectively existing under given conditions, of the quantity to be measured. A true value is usually an unknown quantity; in general it refers to a theoretical true value, a specified true value, or a relative true value. A vehicle ADAS first senses the surrounding environment and obtains information about target objects in the current environment, such as where there is an obstacle ahead. The ADAS then needs to obtain certain key parameters to support ADAS function calculations; for example, an emergency braking system requires the distance to the obstacle ahead, the relative speed, and the time to collision. Finally, the emergency braking system generates a corresponding decision instruction, such as a collision warning or the initiation of braking. All of the above information is acquired, measured, and calculated by the ADAS controller 200, and is not always consistent with the actual information of the real environment; it may even be far from it. For this reason, the test device 100 and the associated sensors 310 and 320 are introduced to obtain more accurate information than the ADAS controller 200 under test. For example, when there are actually 10 obstacles in front of the vehicle, the ADAS controller 200 under test may detect only 3 of them while the test equipment detects 7. Likewise, when the actual distance between two vehicles is 30.5 meters, the measurement of the ADAS controller 200 may be 28 meters while that of the testing device 100 is 30.2 meters. The test apparatus 100 may therefore be considered a relative-truth system.
For example, the truth values may include a target information true value and a function information true value. The target information true value is the real information relating to a target in the environment, and may include position data, size data, and the like of targets such as people, vehicles, and traffic signs. The function information true value is the reasonable value of the key index parameters and key output parameters of the assistance functions.
In the context of the present application, target information true values may include, but are not limited to: the relative distance [m] between the target object and the sensor, the azimuth [deg] of the target object relative to the sensor, the pitch angle [deg] of the target object relative to the sensor, the longitudinal speed [m/s] of the target object, the heading compass angle [deg] of the target object, the lane valid flag, the lane information (lane position parameter, lane curvature, lane-line start and stop points), the traffic-signal valid flag, the traffic-signal information (traffic-signal type, traffic-signal color), the target-vehicle valid flag, and the target-vehicle information (length, width, height, type).
In the context of the present application, function information true values may include, but are not limited to: a functional state parameter [current state], an accelerator-pedal opening command, a brake warning-lamp signal, the time to collision (TTC), the headway time (HWT), a lane-departure warning signal, and a lane-keeping command.
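The two truth-value categories above can be sketched as simple records. This is an illustrative sketch only; the patent lists the quantities but specifies no schema, so every field name here is hypothetical.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TargetTruth:
    """One target information true value sample (hypothetical field names)."""
    timestamp: float              # s, on the test equipment's unified clock
    rel_distance_m: float         # relative distance between target and sensor [m]
    azimuth_deg: float            # azimuth of target relative to sensor [deg]
    pitch_deg: float              # pitch angle of target relative to sensor [deg]
    long_speed_mps: float         # longitudinal speed of target [m/s]
    heading_deg: float            # heading compass angle of target [deg]
    lane_valid: bool = False
    traffic_signal_valid: bool = False
    vehicle_valid: bool = False
    vehicle_lwh_m: Optional[Tuple[float, float, float]] = None  # (length, width, height)

@dataclass
class FunctionTruth:
    """One function information true value sample (hypothetical field names)."""
    timestamp: float
    state: str                    # functional state parameter [current state]
    ttc_s: Optional[float] = None   # time to collision
    hwt_s: Optional[float] = None   # headway time
    lane_departure_warning: bool = False
```

Records like these are what steps 401 and 402 would produce in real time, and what step 404 would compare against the controller's output.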
Fig. 4 is a flowchart of a driving assistance system performance analysis method according to an embodiment of the present invention. This method may be performed in a test apparatus 100, such as a processor 101. Referring to fig. 1 to 4, the performance analysis method of the driving assistance system of the present embodiment includes the steps of:
at step 401, at least one target information true value is obtained in real time at the test apparatus 100.
In this step, test apparatus 100 may obtain a true value of target information based on data from camera 310 and/or lidar 320. The target information true values can be the various true values exemplified above, and are not expanded here.
In one embodiment, as shown in FIG. 3, the camera 310 may output raw video data. The processor 101 in the test equipment 100 identifies target objects in the image data by means of image-processing algorithms (including image preprocessing, feature extraction, and classification) and deep learning, and extracts target information true values. Likewise, the lidar 320 outputs raw point-cloud data, and the processor 101 of the testing device 100 performs target-object identification on the raw point cloud using machine learning (including detection, clustering, and classifier-based identification) and extracts target information true values.
In another embodiment, as shown in FIG. 5, the camera 310 and the lidar 320 may output target information directly. The processor 101 of the testing apparatus 100 may then obtain a high-accuracy target true value by applying a sensor-information fusion algorithm to the target information sent by the camera 310 and the lidar 320. In this embodiment, the camera 310 and the lidar 320 are also connected to the CAN bus 400, and the test apparatus 100 obtains the target information through the CAN interface 105, as shown in fig. 3.
In various embodiments, as shown in fig. 5 and 6, whereas the ADAS controller 200 senses the environment through a limited, fixed set of sensors, the testing device 100 of the present application improves the accuracy of the target true value through multi-sensor information fusion (e.g., the camera 310, the lidar 320, and any other available sensors). Specifically, the differing confidence each sensor has in different characteristics of the target object can be exploited by the fusion algorithm to raise the overall confidence of the true value. For example, the camera 310 identifies color and image information of the target object with high accuracy, while the lidar 320 identifies spatial parameters such as geometric parameters, position parameters, and motion-state parameters with high accuracy.
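The confidence-weighted fusion idea described above can be sketched as follows. This is a minimal illustration under the assumption that each sensor reports a scalar estimate with a positive confidence weight; the patent does not specify the actual fusion algorithm.

```python
def fuse(estimates):
    """Confidence-weighted average of per-sensor estimates.

    estimates: list of (value, confidence) pairs, confidence > 0.
    Sensors trusted more for a given characteristic (e.g. lidar for
    distance) contribute more to the fused true value.
    """
    total = sum(c for _, c in estimates)
    return sum(v * c for v, c in estimates) / total

# Illustrative numbers: lidar gets a high spatial confidence, the
# camera a lower one, so the fused distance stays near the lidar value.
fused_distance = fuse([(30.4, 0.9),   # lidar estimate [m]
                       (29.5, 0.3)])  # camera estimate [m]
```

The weights here are placeholders; in practice they would come from per-sensor characterization of the kind the paragraph above describes.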
At step 402, at least one functional information true value related to driving assistance is acquired in real time at a test device. The function information true values can be the above-exemplified true values, and are not expanded here.
In one embodiment, the test equipment 100 may calculate the function information true value from the target information true values using ADAS control-software algorithm modules loaded on the high-performance processor 101 of the system. In one embodiment, the high-accuracy target information true value obtained by the fusion algorithm may be used for this calculation, resulting in a function information true value of higher accuracy.
In another embodiment, the test equipment 100 may calculate first function information from at least one target information true value as described above. The test apparatus 100 may also acquire second function information from an external sensor independent of the vehicle-mounted sensors (such as the camera 310, the lidar 320, and/or other sensors). The test apparatus 100 may then perform function arbitration between the first function information and the second function information to obtain a function information true value of higher accuracy.
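As a sketch of this step, TTC (one of the function information true values listed earlier) can be computed from target information true values, and a simple rule can arbitrate between the computed first function information and the externally acquired second function information. The tolerance and the averaging rule are assumptions for illustration; the patent does not specify the arbitration logic.

```python
def ttc_seconds(rel_distance_m, closing_speed_mps):
    """Time to collision from target truth; None when the gap is not closing."""
    if closing_speed_mps <= 0:
        return None
    return rel_distance_m / closing_speed_mps

def arbitrate(first, second, tol=0.5):
    """Function arbitration sketch: average values that agree within tol,
    otherwise prefer the first value (computed from fused target truth)."""
    if first is None:
        return second
    if second is None:
        return first
    if abs(first - second) <= tol:
        return (first + second) / 2
    return first
```

A usage example: with a 30 m gap closing at 10 m/s, `ttc_seconds` yields 3.0 s, and arbitration against an external estimate of 3.2 s yields 3.1 s.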
In step 403, target information to be evaluated and function information to be evaluated determined by the driving assistance system are acquired.
In this step, the test apparatus 100 may obtain the target information to be evaluated and the function information to be evaluated from the ADAS controller 200 under test through the CAN interface 105.
In one embodiment, the mounting positions and angles of the camera 310, the lidar 320, and the corresponding sensors under test, together with the vehicle digital mock-up information, may be used as parameters and introduced into a coordinate-system calculation module of the testing apparatus 100 (executed in the processor 101) to eliminate target-object spatial-coordinate deviations caused by mounting-position and angle differences between the sensors.
In one embodiment, the inherent time-delay offset of each sensor in the system may be measured during the system initialization stage, for example by signal-excitation feedback through a unified time-stamping module of the testing apparatus 100 (e.g., implemented in the processor 101); the per-sensor delay-compensation coefficients are then adjusted to eliminate the inherent time offsets.
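The two compensations described above (mounting-offset removal and inherent-delay removal) can be sketched as follows, assuming a 2-D planar mounting transform and a known per-sensor delay coefficient; the actual modules in the processor 101 are not detailed in the patent.

```python
import math

def to_vehicle_frame(x_s, y_s, mount_x, mount_y, mount_yaw_deg):
    """Transform a target position from a sensor frame into the common
    vehicle frame, removing the sensor's mounting position/angle offset
    (2-D sketch; a full implementation would use 3-D extrinsics)."""
    a = math.radians(mount_yaw_deg)
    xv = mount_x + x_s * math.cos(a) - y_s * math.sin(a)
    yv = mount_y + x_s * math.sin(a) + y_s * math.cos(a)
    return xv, yv

def compensate_timestamp(t_meas, delay_coeff_s):
    """Apply the per-sensor delay-compensation coefficient found at
    initialization, aligning the sample to the unified clock."""
    return t_meas - delay_coeff_s
```

With both corrections applied, samples from different sensors refer to the same spatial frame and the same time base, which is what makes the comparisons in step 404 meaningful.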
In step 404, the target information to be evaluated is compared with the associated target information true value, and the function information to be evaluated is compared with the associated function information true value, to evaluate the performance of the driving assistance system.
In this step, the test apparatus 100 may perform the above comparison to evaluate the performance of the driving assistance system.
In one embodiment, when comparing the target information truth value with the target information to be evaluated, the parameter deviation of the target information to be evaluated and the associated target information truth value at the same time point can be compared, and the identification time deviation of the target information to be evaluated and the associated target information truth value can be compared.
In one embodiment, when comparing the true value of the function information with the true value of the function information to be evaluated, the parameter deviation of the function information to be evaluated and the associated true value of the function information at the same time point can be compared, and the recognition time deviation of the function information to be evaluated and the associated true value of the function information can be compared.
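A minimal sketch of the two comparisons described in the paragraphs above, assuming scalar parameters sampled on a unified clock; function names are illustrative only.

```python
def parameter_deviation(truth, measured):
    """Deviation between the value under evaluation and the associated
    true value at the same time point."""
    return measured - truth

def recognition_time_deviation(t_truth_first_seen, t_eval_first_seen):
    """How much later the system under test first recognized the target
    (or function condition) than the truth system did."""
    return t_eval_first_seen - t_truth_first_seen

# e.g. true inter-vehicle distance 30.2 m vs controller reading 28 m
dev = parameter_deviation(30.2, 28.0)
```

Thresholds on these deviations would then feed the trigger conditions described below in the patent's own terms (missed report, false report, etc.).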
After the evaluation result is obtained, certain system behaviors of the test device 100 are triggered if the analysis finds an anomaly. Here, an anomaly may include a missed detection or a false alarm of a target, and the like. The system behaviors may include recording data around the anomalous time point, issuing a reminder, and so on.
In an embodiment of the present application, as shown in fig. 7, the evaluation result signal obtained in step 404 may control various system behavior outputs through one or more trigger conditions, and this process may be executed inside the test apparatus 100. In one embodiment, multiple trigger conditions may run simultaneously. In one embodiment, the same truth-comparison analysis result can serve as a data source for several trigger conditions at once. In one embodiment, the same trigger condition may draw on several truth-comparison analysis results at once. In one embodiment, the same trigger condition may control several system output behaviors at once. In one embodiment, the same system output behavior may be controlled by several trigger conditions at once.
Through this mechanism, various desired specific applications can be realized. For example, when the truth analysis finds that the target-object output of the ADAS controller 200 under test contains a missed detection (target missed report), the vehicle data and environmental data before and after the failure time point can be recorded. When the truth analysis finds that the detected-target output of the ADAS controller 200 under test contains a false alarm (target false alarm), the user can be alerted by sound and/or light. When the truth analysis finds that the ADAS controller 200 under test has issued a braking request although none should have been issued (function false alarm), the relevant personnel can be notified by short message and/or mail. And when a braking request should have been issued but the ADAS controller 200 under test did not issue one (function missed report), the relevant alarm signal can be pushed to a remote end.
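The example trigger-to-behavior pairings above can be sketched as a simple mapping; the event and action names here are hypothetical, and the patent explicitly allows many-to-many relations between trigger conditions and output behaviors.

```python
# Hypothetical event/action names mirroring the patent's four examples.
TRIGGER_ACTIONS = {
    "target_missed":        ["record_data_around_failure"],
    "target_false_alarm":   ["sound_light_alert"],
    "function_false_alarm": ["notify_sms_email"],
    "function_missed":      ["push_remote_alarm"],
}

def dispatch(evaluation_events):
    """Map truth-comparison results to system behaviors. One event may
    drive several actions, and one action may be driven by several
    events, matching the many-to-many design described above."""
    actions = []
    for ev in evaluation_events:
        actions.extend(TRIGGER_ACTIONS.get(ev, []))
    return actions
```

Extending an entry's action list (e.g. adding `"sound_light_alert"` to `"target_missed"`) is how one trigger would control multiple output behaviors at once.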
The performance test method and the test equipment can acquire true values of ADAS target objects and function data in real time and simultaneously compare and analyze them against the information from the ADAS controller under test, thereby helping to judge whether the corresponding functions of the vehicle's ADAS controller under test are normal.
By contrast, in a comparative offline approach, vehicle and sensor data are first collected by independent off-board sensors over a large number of road tests, after which the data from the ADAS controller under test and from the off-board sensors must be compared and analyzed manually, offline, on a host computer. For target-object data, the sensor data must first be labeled manually (if it contains no labeling information), and the target information in the video must then be compared manually to judge whether missed detections or false alarms occurred. ADAS function data are even harder to process manually and may require secondary analysis, for example by writing additional scripts. The present method is therefore lower in cost, higher in accuracy, and capable of more kinds of truth-value analysis.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing disclosure is by way of example only, and is not intended to limit the present application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software.
The computer readable medium may comprise a propagated data signal with the computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, and the like, or any suitable combination. The computer readable medium can be any computer readable medium that can communicate, propagate, or transport the program for use by or in connection with an instruction execution system, apparatus, or device. Program code on a computer readable medium may be propagated over any suitable medium, including radio, electrical cable, fiber optic cable, radio frequency signals, or the like, or any combination of the preceding.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in the claims. Indeed, an embodiment may be characterized by fewer than all of the features of a single embodiment disclosed above.
Numerals describing quantities of components, attributes, and the like are used in some embodiments; it should be understood that such numerals used in the description of the embodiments are in some instances modified by the terms "about", "approximately" or "substantially". Unless otherwise indicated, "about", "approximately" or "substantially" indicates that the number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general digit-preserving approach. Although the numerical ranges and parameters setting forth the broad scope of the application are approximations, in the specific examples such numerical values are set forth as precisely as practicable.
Although the present application has been described with reference to the present specific embodiments, it will be recognized by those skilled in the art that the foregoing embodiments are merely illustrative of the present application and that various changes and substitutions of equivalents may be made without departing from the spirit of the application, and therefore, it is intended that all changes and modifications to the above-described embodiments that come within the spirit of the application fall within the scope of the claims of the application.

Claims (10)

1. A performance analysis method of a driving assistance system, comprising the steps of:
acquiring at least one target information true value in real time on test equipment independent of a vehicle-mounted sensor;
acquiring at least one functional information true value related to driving assistance in real time on the test equipment;
acquiring target information to be evaluated and function information to be evaluated, which are determined by the driving assistance system; and
comparing the target information to be evaluated with the associated target information true value, and comparing the function information to be evaluated with the associated function information true value, to evaluate the performance of the driving assistance system.
2. The method of claim 1, wherein the step of acquiring at least one target information true value in real time on test equipment independent of a vehicle-mounted sensor comprises:
acquiring raw data from an external sensor independent of the vehicle-mounted sensor; and
extracting the target information true value from the raw data.
3. The method of claim 1, wherein the step of acquiring at least one target information true value in real time on test equipment independent of a vehicle-mounted sensor comprises:
acquiring target data from an external sensor independent of the vehicle-mounted sensor, and generating the target information true value from the target data.
4. The method of claim 2 or 3, further comprising fusing data from a plurality of external sensors to obtain the target information true value.
5. The method of claim 1, wherein the step of acquiring at least one functional information true value related to driving assistance in real time on the test equipment comprises:
calculating the function information true value according to at least one target information true value; or
calculating first function information according to at least one target information true value, acquiring second function information from an external sensor independent of the vehicle-mounted sensor, and performing function arbitration to obtain the function information true value.
6. The method of claim 1, wherein comparing the target information to be evaluated with the associated target information true value comprises: comparing the parameter deviation between the target information to be evaluated and the associated target information true value at the same time point; and comparing the identification time deviation between the target information to be evaluated and the associated target information true value.
7. The method of claim 1, wherein comparing the function information to be evaluated with the associated function information true value comprises: comparing the parameter deviation between the function information to be evaluated and the associated function information true value at the same time point; and comparing the identification time deviation between the function information to be evaluated and the associated function information true value.
8. The method of claim 1, further comprising triggering one or more system behaviors based on one or more evaluation results.
9. The method of claim 8, wherein the system behavior comprises: recording vehicle data and environment data before and after a failure moment of the driving assistance system, and issuing a prompt.
10. A test apparatus for performance analysis of a driving assistance system, the test apparatus comprising:
a processor;
a memory; and
a computer program which, when executed by the processor, performs the method of any one of claims 1 to 9.
CN202010256859.4A 2020-04-02 2020-04-02 Performance analysis method and test equipment of driving assistance system Pending CN111398989A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010256859.4A CN111398989A (en) 2020-04-02 2020-04-02 Performance analysis method and test equipment of driving assistance system

Publications (1)

Publication Number Publication Date
CN111398989A true CN111398989A (en) 2020-07-10

Family

ID=71434799

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010256859.4A Pending CN111398989A (en) 2020-04-02 2020-04-02 Performance analysis method and test equipment of driving assistance system

Country Status (1)

Country Link
CN (1) CN111398989A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105549023A (en) * 2014-10-23 2016-05-04 现代摩比斯株式会社 Object detecting apparatus, and method of operating the same
DE102018131417A1 (en) * 2017-12-19 2019-06-19 Toyota Jidosha Kabushiki Kaisha Automatic generation of pedestrians in a virtual simulation of intersections
CN108334055A (en) * 2018-01-30 2018-07-27 赵兴华 The method of inspection, device, equipment and the storage medium of Vehicular automatic driving algorithm
CN108334056A (en) * 2018-02-02 2018-07-27 安徽江淮汽车集团股份有限公司 A kind of ADAS test system and test method
CN108303271A (en) * 2018-03-01 2018-07-20 北京理工华创电动车技术有限公司 Auxiliary drives product testing system and test method
CN108663677A (en) * 2018-03-29 2018-10-16 上海智瞳通科技有限公司 A kind of method that multisensor depth integration improves target detection capabilities
CN109655825A (en) * 2018-03-29 2019-04-19 上海智瞳通科技有限公司 Data processing method, device and the multiple sensor integrated method of Multi-sensor Fusion
CN109839634A (en) * 2019-01-25 2019-06-04 中国汽车技术研究中心有限公司 A kind of subject fusion method of vehicle-mounted camera and radar
CN110940355A (en) * 2019-12-04 2020-03-31 上海创程车联网络科技有限公司 Function test method of ADAS (advanced data acquisition System) equipment

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111964922A (en) * 2020-08-28 2020-11-20 福瑞泰克智能系统有限公司 Intelligent driving vehicle test system
CN112327806A (en) * 2020-11-02 2021-02-05 东软睿驰汽车技术(沈阳)有限公司 Automatic driving test method and device and electronic equipment
CN112327806B (en) * 2020-11-02 2022-02-15 东软睿驰汽车技术(沈阳)有限公司 Automatic driving test method and device, electronic equipment and storage medium
CN112887172A (en) * 2021-02-19 2021-06-01 北京百度网讯科技有限公司 Vehicle perception system test method, device, equipment and storage medium
CN113268411A (en) * 2021-04-25 2021-08-17 福瑞泰克智能系统有限公司 Driving assistance algorithm testing method and device, electronic device and storage medium
CN113673105A (en) * 2021-08-20 2021-11-19 安徽江淮汽车集团股份有限公司 Design method of true value comparison strategy
CN113805566A (en) * 2021-09-17 2021-12-17 南斗六星系统集成有限公司 Detection method and system for integrated driving assistance system controller
CN114136356A (en) * 2021-11-30 2022-03-04 上汽通用五菱汽车股份有限公司 Parameter acquisition test system, method, device and computer readable storage medium
CN114047361A (en) * 2022-01-11 2022-02-15 深圳佑驾创新科技有限公司 Calibration system of ADAS visual equipment
CN114047361B (en) * 2022-01-11 2022-04-05 深圳佑驾创新科技有限公司 Calibration system of ADAS visual equipment
CN114077218A (en) * 2022-01-19 2022-02-22 浙江吉利控股集团有限公司 Road data evaluation report generation method, device, equipment and storage medium
CN114091626A (en) * 2022-01-19 2022-02-25 浙江吉利控股集团有限公司 True value detection method, device, equipment and storage medium
CN114091626B (en) * 2022-01-19 2022-04-22 浙江吉利控股集团有限公司 True value detection method, device, equipment and storage medium
CN114077218B (en) * 2022-01-19 2022-04-22 浙江吉利控股集团有限公司 Road data evaluation report generation method, device, equipment and storage medium
WO2023137863A1 (en) * 2022-01-19 2023-07-27 浙江吉利控股集团有限公司 Method, apparatus and device for generating road data evaluation report, and storage medium
CN115797401A (en) * 2022-11-17 2023-03-14 昆易电子科技(上海)有限公司 Verification method and device of alignment parameters, storage medium and electronic equipment
CN115797401B (en) * 2022-11-17 2023-06-06 昆易电子科技(上海)有限公司 Verification method and device for alignment parameters, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN111398989A (en) Performance analysis method and test equipment of driving assistance system
CN106503653B (en) Region labeling method and device and electronic equipment
CN110796007B (en) Scene recognition method and computing device
US10369993B2 (en) Method and device for monitoring a setpoint trajectory to be traveled by a vehicle for being collision free
CN109085829B (en) Dynamic and static target identification method
US20210024095A1 (en) Method and device for controlling autonomous driving of vehicle, medium, and system
CN111038522B (en) Vehicle control unit and method for evaluating a training data set of a driver assistance system
Katare et al. Embedded system enabled vehicle collision detection: an ANN classifier
CN110936959B (en) On-line diagnosis and prediction of vehicle perception system
TW202031538A (en) Auxiliary driving method and system
CN113933858A (en) Abnormal detection method and device of positioning sensor and terminal equipment
US20230205202A1 (en) Systems and Methods for Remote Status Detection of Autonomous Vehicles
CN114639085A (en) Traffic signal lamp identification method and device, computer equipment and storage medium
JP5895728B2 (en) Vehicle group management device
CN110758244B (en) Method and system for automatically preventing rear-end collision of electric automobile
CN113532499B (en) Sensor security detection method and device for unmanned system and storage medium
CN111947669A (en) Method for using feature-based positioning maps for vehicles
CN114595738A (en) Method for generating training data for recognition model and method for generating recognition model
CN114184218A (en) Method, device and storage medium for testing a sensor system of a motor vehicle
EP3495222A1 (en) Image-capturing device
CN210760742U (en) Intelligent vehicle auxiliary driving system
CN110745145A (en) Multi-sensor management system for ADAS
CN114782748A (en) Vehicle door detection method and device, storage medium and automatic driving method
CN113591673A (en) Method and device for recognizing traffic signs
CN115402347A (en) Method for identifying a drivable region of a vehicle and driving assistance method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200710