CN112732510A - Testing device for unmanned vehicle computing platform - Google Patents


Info

Publication number: CN112732510A
Application number: CN202110365002.0A
Authority: CN (China)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Prior art keywords: tested, computing platform, test, computing, platforms
Other languages: Chinese (zh)
Inventor: 牛晓伟
Current Assignee: Neolix Technologies Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Neolix Technologies Co Ltd
Application filed by Neolix Technologies Co Ltd

Classifications

    • G06F11/2205: Detection or location of defective computer hardware by testing during standby operation or during idle time (e.g. start-up testing), using arrangements specific to the hardware being tested
    • G06F11/2273: Test methods
    • G06F11/3013: Monitoring arrangements specially adapted to embedded systems, i.e. combinations of hardware and software dedicated to a certain function in mobile devices, printers, automotive or aircraft systems
    • G06F11/3051: Monitoring the configuration of the computing system or of a computing system component, e.g. the presence of processing resources, peripherals, I/O links or software programs
    • G06F11/321: Display for diagnostics, e.g. diagnostic result display, self-test user interface
    • G06F11/3447: Performance evaluation by modeling

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Computer Hardware Design (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Test And Diagnosis Of Digital Computers (AREA)

Abstract

The invention discloses a testing device for an unmanned vehicle computing platform, comprising: a display for displaying test results and entering test tasks; test equipment connected to the display and detachably connected to the computing platform to be tested; a power supply that powers the test equipment and the computing platform to be tested and is detachably connected to the computing platform to be tested; and a power supply controller connected to both the test equipment and the power supply. When the test equipment receives a test task, it sends a command to the power supply controller, which controls the power supply to power the computing platform to be tested up and down. After power-up, the computing platform to be tested feeds a performance-problem result back to the test equipment, which transmits it to the display; one test is completed once the computing platform to be tested is powered down. The technical scheme of the invention enables quality testing of the computing platform to be tested.

Description

Testing device for unmanned vehicle computing platform
Technical Field
The invention relates to the technical field of unmanned vehicle testing, in particular to a testing device of an unmanned vehicle computing platform.
Background
To ensure that the unmanned vehicle computing platform works properly, its controller needs to be tested before being installed in the computing platform.
At present, the basic functions of the controller are verified in isolation, and the controller is installed in the computing platform only after this verification succeeds.
However, a controller that passes standalone verification of its basic functions may still cause the computing platform to malfunction once installed.
Disclosure of Invention
The invention provides a testing device for an unmanned vehicle computing platform. It enables quality testing of the computing platform to be tested by means of test equipment; only controllers inside computing platforms that pass the quality test are put into use, which reduces the probability of problems arising during the controller's later use.
In a first aspect, the present invention provides a testing apparatus for an unmanned vehicle computing platform, comprising:
a display for displaying test results and entering test tasks, a test task comprising the performance problems to be checked on the computing platform to be tested and the number of tests;
the display is connected with the test equipment so that the test equipment receives test tasks, and the test equipment is detachably connected with the computing platform to be tested;
the power supply is used for supplying power to the test equipment and the computing platform to be tested, and is detachably connected with the computing platform to be tested;
the power supply controller is connected with the test equipment and the power supply respectively;
when the test equipment receives a test task, it sends a command to the power supply controller, so that the power supply controller controls the power supply to power the computing platform to be tested up and down, the computing platform to be tested thereby simulating power-up and power-down on an unmanned vehicle;
after power-up, the computing platform to be tested feeds a performance-problem result back to the test equipment, and the test equipment transmits the result to the display;
and one test of the computing platform to be tested is completed after power-down.
The testing device provided by the invention for an unmanned vehicle computing platform comprises a display, test equipment, a power supply and a power supply controller, connected as set out above. According to the technical scheme, the power supply is repeatedly controlled, according to the configured number of tests, to power the computing platform to be tested up and down, so that the test equipment obtains the performance-problem results fed back by the platform. This simulates the working conditions of the controller installed inside the computing platform and realizes quality testing of that controller; using only controllers that pass the quality test reduces the probability of problems arising during the controller's later use.
Further effects of the above preferred modes are described below in conjunction with specific embodiments.
Drawings
To illustrate the embodiments of the present invention or the prior art solutions more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a testing apparatus of an unmanned vehicle computing platform according to an embodiment of the present invention;
FIG. 2 is another testing apparatus for an unmanned vehicle computing platform according to an embodiment of the present invention;
FIG. 3 illustrates a method for testing an unmanned vehicle computing platform according to an embodiment of the present invention;
FIG. 4 is a block diagram of a test module for an unmanned vehicle computing platform according to an embodiment of the present invention;
FIG. 5 is an electronic device according to an embodiment of the invention;
Reference numerals in the drawings:
1-a display; 2-testing equipment; 3-a computing platform; 4-a power supply controller; 5-power supply.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be described in detail and completely with reference to the following embodiments and accompanying drawings. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in FIG. 1, a testing apparatus for an unmanned vehicle computing platform provided in an embodiment of the present invention includes:
the display 1 is used for displaying test results and entering test tasks, a test task comprising the performance problems to be checked on the computing platform 3 to be tested and the number of tests;
the testing device 2 is connected with the display 1, so that the testing device 2 receives testing tasks, and the testing device 2 is detachably connected with the computing platform 3 to be tested;
the power supply 5 is used for supplying power to the testing device 2 and the computing platform 3 to be tested, and the power supply 5 is detachably connected with the computing platform 3 to be tested;
the power supply controller 4 is connected with the test equipment 2 and the power supply 5 respectively;
when the test equipment 2 receives a test task, the test equipment 2 sends a command to the power controller 4, so that the power controller 4 controls the power supply 5 to supply power and cut off power to the computing platform 3 to be tested, and the computing platform 3 to be tested simulates power supply and power off on an unmanned vehicle;
the computing platform 3 to be tested feeds back the performance problem result to the testing equipment 2 after power supply, and the testing equipment 2 transmits the performance problem result to the display 1 for display;
and one test of the computing platform 3 to be tested is completed after power-down.
The computing platform 3 to be tested can be understood as the product into which the controller is to be installed; quality testing of the controller is realized by simulating the controller's actual performance inside that product. It should be understood that any device with an installed controller that is connected to the test equipment 2 can be called a computing platform 3 to be tested, and every computing platform 3 to be tested is tested by the same method. It should be noted that the controller is installed inside the computing platform 3, and whether it can be used normally is known from the performance-problem result fed back by the computing platform 3. The controller may be a master device that controls the starting, speed regulation, braking and reversing of a motor by changing the wiring of the main or control circuit and the resistance values in the circuit in a predetermined sequence; it comprises a program counter, an instruction register, an instruction decoder, a timing generator and an operation controller. The controller may be, for example, an rk3399, imx6q or imx5744; this embodiment is not limiting, and any controller known in the art or developed in the future may be used.
Specifically, the test equipment 2 is a device that can be detachably connected to a plurality of computing platforms 3 to be tested and can exchange and process data with them in order to test their performance problems. To facilitate displaying the performance-problem results, in some feasible implementations the test equipment 2 and the display 1 are integrated; in others, the test equipment 2 is connected to an external display 1. As for the connection between the test equipment 2 and the computing platform 3 to be tested: in some feasible implementations they are connected by wire, specifically via one of several USB (Universal Serial Bus) interfaces provided on the test equipment 2; in other implementations they are connected wirelessly. Since a wired connection is highly reliable and not easily disturbed by the external environment, the computing platform 3 to be tested and the test equipment 2 are preferably connected by wire.
In one embodiment, the test task may further include information of a test start time, a time interval between two adjacent tests, and the like. It should be understood that the user sets the test tasks through the test task setting page displayed on the display 1.
The working principle of the testing device of the unmanned vehicle computing platform 3 provided by the embodiment is as follows:
A user configures a test task on the display 1 and then starts it, so that the test equipment 2 receives the test task. For the first test, the test equipment 2 issues a command to the power supply controller 4, and the power supply controller 4 controls the power supply 5 to power up and then power down the computing platform 3 to be tested, completing the first test. After power-up, the computing platform 3 to be tested feeds its performance-problem result back to the test equipment 2, which transmits it to the display 1. The test equipment 2 then records the number of completed tests; as long as this number differs from the number of tests in the test task, the next test is performed in the same way, until the two numbers are equal.
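The repetition logic above can be sketched as a small loop. This is a hypothetical illustration only: the patent gives no code, so the function and callback names (power_on, read_result, and so on) are all assumed.

```python
def run_test_task(num_tests, power_on, power_off, read_result, show):
    """Repeat the power-cycle test until the configured count is reached."""
    completed = 0
    results = []
    while completed < num_tests:
        power_on()              # power controller powers up the platform
        result = read_result()  # platform feeds back a performance-problem result
        show(result)            # test equipment forwards the result to the display
        results.append(result)
        power_off()             # powering down completes one test
        completed += 1          # record the number of completed tests
    return results
```

In use, the four callbacks would wrap the power supply controller and the platform's feedback channel; here they are left abstract.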
The testing device for the unmanned vehicle computing platform 3 provided by the embodiment at least has the following technical effects:
the power supply 5 is controlled repeatedly to supply power and cut off power to the computing platform 3 to be tested according to the test times, so that the test equipment 2 obtains a performance problem result fed back by the computing platform 3 to be tested, the simulation controller is installed on the working condition inside the computing platform 3, the quality test of the controller inside the computing platform 3 is realized, the controller passing the quality test is used, and the probability of problems occurring in the later-stage use of the controller is reduced.
In one embodiment, the computing platform 3 to be tested comprises an autonomous driving computing unit, a parallel driving system and/or an Internet-of-Vehicles system. It should be noted that this embodiment does not limit the computing platform 3 to be tested; the choice depends on actual requirements.
In one embodiment, the performance problems include: whether camera initialization is normal, whether CAN (Controller Area Network, an ISO-standardized serial communication protocol) communication is normal, and/or whether network communication is normal. It should be noted that this embodiment does not limit the performance problems; the choice depends on actual requirements.
In one embodiment, the test equipment 2 issues commands to the power supply controller 4 via a General-Purpose Input/Output (GPIO) interface. It should be noted that this embodiment does not limit the way the test equipment 2 issues commands to the power supply controller 4; the choice depends on actual requirements.
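As a hedged sketch of such a GPIO-issued command, the snippet below writes '1'/'0' in the style of the Linux sysfs GPIO value file; the actual pin, file path and encoding used by the test equipment are not specified in the patent and are assumed here.

```python
def set_power_line(gpio_value_file, powered):
    """Drive the power-control line: '1' commands power-up, '0' power-down.

    gpio_value_file is any writable text stream, e.g. an opened
    /sys/class/gpio/gpioN/value file on Linux (path assumed for illustration).
    """
    gpio_value_file.write("1" if powered else "0")
    gpio_value_file.flush()
```

With real hardware this would be called on a stream such as open('/sys/class/gpio/gpio17/value', 'w'); the pin number 17 is hypothetical.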
In one embodiment, the test equipment counts how many times each distinct performance-problem result of the computing platform 3 to be tested recurs and outputs these counts to the display 1, so that the user can judge from the success and failure counts of the different results whether the controller inside the computing platform 3 can be used normally.
In this embodiment, quality testing of the controller inside the computing platform 3 to be tested is realized by counting the repetitions of the different performance test results and considering the platform's overall performance across the multiple tests; using only controllers that pass the quality test reduces the probability of problems in later use. For example, suppose that over A tests the computing platform 3 to be tested produces two distinct performance-problem results, "CAN communication normal" and "CAN communication abnormal", occurring a1 and a2 times respectively, so that a1 + a2 = A.
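The tally of a1, a2, and so on can be sketched with a counter; the result strings below are illustrative, not taken from the patent.

```python
from collections import Counter


def tally_results(results):
    """Count how many times each distinct performance-problem result recurred
    over the repeated tests (the counts sum to the total test count A)."""
    return Counter(results)
```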
It will be appreciated that the test results displayed by the display 1 include performance problem results and the number of repetitions of the performance problem results.
In one embodiment, the computing platform 3 to be tested feeds back a start result to the test equipment 2 after power-up; the test equipment 2 receives the performance-problem result fed back by the platform only when it judges, based on that start result, that the platform started normally.
In this embodiment, the computing platform 3 to be tested feeds back a start result to the test equipment 2 after power-up. The test equipment 2 judges from this start result whether the platform started normally, and only then receives the performance-problem result fed back by the platform, which ensures the accuracy of the performance test result. If the platform is judged to have started abnormally, there is no need to receive a performance-problem result; the abnormal start is reported directly, which speeds up the test. Because the start-up of the computing platform 3 to be tested is taken into account, the performance test results have a relatively high reference value.
It is to be noted that after the test equipment 2 powers up the computing platform 3 to be tested, the platform performs a start-up self-check to obtain its own start result. A number or letter may represent the outcome: for example, 1 may represent a normal start and 0 an abnormal start. The platform then sends the start result together with its device identifier to the test equipment 2, so that the test equipment obtains both. If the test equipment 2 is wired to the computing platform 3 to be tested, the device identifier may be a product identifier.
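A minimal sketch of collecting these self-check results follows, assuming each platform reports a (device identifier, '1' or '0') pair; the message format is an assumption, since the patent only names the two fields.

```python
def collect_start_results(messages):
    """Map each device identifier to True for a normal start ('1')
    or False for an abnormal start ('0')."""
    return {dev_id: flag == "1" for dev_id, flag in messages}
```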
In one embodiment, when the test equipment 2 determines that the computing platform 3 to be tested is not in flashing (firmware re-flashing) mode, it proceeds as above: it judges the start result fed back by the platform and, when the start is normal, receives the performance-problem result fed back by the platform.
It should be noted that the computing platform 3 to be tested may be in flashing mode, so the test equipment 2 first needs to verify whether this is the case. If the platform is not in flashing mode, the test equipment receives the performance-problem result once it judges the start result to be normal; if the platform is in flashing mode, the display 1 can be made to show the device identifier of that platform, so that the user knows which computing platform 3 is being flashed.
In one embodiment, the testing device 2 is detachably connected with at least two computing platforms 3 to be tested; when the test equipment 2 receives a test task, the test equipment 2 issues a command to the power controller 4, so that the power controller 4 controls the power supply 5 to supply power to and cut off power from the at least two computing platforms 3 to be tested.
In this embodiment, when there are two or more to-be-tested computing platforms 3 connected to the testing device 2, the testing device 2 can control the power supply 5 to simultaneously supply power to and cut off power from the two or more to-be-tested computing platforms 3, thereby performing batch testing and improving testing efficiency.
When two or more computing platforms 3 to be tested are connected to the test equipment 2, in one embodiment the test equipment 2 determines the number of normally started platforms from the start results fed back by the at least two platforms. If this number differs from the number of connected platforms, the test equipment 2 checks each computing platform 3 to be tested individually; if the two numbers are equal, the test equipment 2 judges that every computing platform 3 to be tested started normally.
In this embodiment, comparing the number of computing platforms 3 to be tested controlled by the test equipment 2 with the number of those that started normally gives a preliminary judgment of whether the platforms started properly. Subsequently, each normally started platform runs its performance test, determines a performance-problem result and feeds it back to the test equipment 2; because the result takes the overall start-up situation of all controlled platforms into account, it has a relatively high reference value.
It can be understood that the number of platforms is the number of computing platforms 3 to be tested that the test equipment 2 controls in one test; in practice the test equipment 2 generally controls all platforms connected to it, so the number of connected platforms is the number of platforms. The number of normally started platforms is the number of those platforms that started normally in that test.
It is noted that whether the two numbers are equal reflects the overall start-up situation of all controlled platforms in one test. In one possible situation, the numbers are equal; every computing platform 3 to be tested can then be considered to have started normally, and the platforms verify their performance problems to obtain performance-problem results. In the other possible situation, the numbers differ; each computing platform 3 to be tested must then be checked individually to ensure the accuracy of the test.
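The count comparison can be sketched as follows; the function name and the dict-based start-result representation are assumptions made for illustration.

```python
def platforms_needing_check(start_results):
    """start_results maps a device identifier to its started-normally flag.

    If the number of normal starts equals the number of platforms, every
    platform is judged to have started normally and no individual check is
    needed; otherwise every platform must be verified individually.
    """
    normal = sum(1 for ok in start_results.values() if ok)
    if normal == len(start_results):
        return []                 # all platforms started normally
    return sorted(start_results)  # mismatch: verify each platform
```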
In one embodiment, the test equipment 2 is wired to the at least two computing platforms 3 to be tested. When the number of normally started platforms differs from the number of connected platforms, the test equipment 2 performs network verification on each computing platform 3 to be tested using its Internet Protocol address: if the verification passes, the platform is judged to have started normally; if it fails, the start is judged abnormal.
Since the wired connection is highly reliable, a mismatch between the platform count and the normal-start count means that some computing platforms 3 to be tested may not have started normally, although the interface between a platform and the test equipment 2 may also be at fault. To determine the start-up state more accurately, each computing platform 3 to be tested must therefore be verified further; optionally, the start-up of each platform is verified over the network, ensuring the accuracy of the resulting performance test results.
Specifically, network verification of a computing platform 3 to be tested is realized by obtaining its Internet Protocol (IP) address and issuing a network verification instruction, such as ping, to that address. If the verification passes, the platform started normally; if it fails, the start was abnormal. It should be noted that the network verification method is the same for every computing platform 3 to be tested, and in practice the test equipment 2 verifies all platforms at the same time.
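A hedged sketch of such a ping-based check is below; the embodiment names ping as the verification instruction, but the exact command options and the injectable runner are assumptions (the runner makes the sketch testable without a live network).

```python
import subprocess


def ping_ok(ip_address, runner=subprocess.run):
    """Return True if a single ping to ip_address succeeds (exit status 0),
    i.e. the computing platform under test is judged to have started normally."""
    proc = runner(
        ["ping", "-c", "1", "-W", "1", ip_address],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return proc.returncode == 0
```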
Referring to fig. 2, a specific application scenario is given as follows:
The test equipment is provided with N USB interfaces and is wired to N computing platforms 3 to be tested, one platform per USB interface; each of the N platforms has its own IP address. The test equipment is connected to a display 1 so that the user can set a test task through it, for example: 10000 tests; performance problems covering whether camera initialization is normal, whether CAN communication is normal and/or whether network communication is normal; and a time interval t between any two tests. According to this task, the test equipment 2 repeatedly sends commands to the power supply controller 4 connected to it, so that the power supply controller 4 makes the power supply 5 power the N computing platforms 3 to be tested up and down 10000 times in succession, i.e. each test powers all N platforms up and then down.
In practical application, for each computing platform 3 to be tested in each test, when the testing device 2 judges that the computing platform 3 to be tested is not in the flashing mode, it can receive a starting result and a device identifier fed back by that computing platform 3, where the starting result is 1 or 0: 1 represents a normal start and 0 an abnormal start. When the testing device 2 has received the device identifiers of the N computing platforms 3 to be tested, it can determine that the number of platforms under test in the current test is N. Of course, if a computing platform 3 to be tested feeds back no device identifier, or the device identifier is the product identifier of its USB interface, the number of platforms N is determined directly from the number of USB interfaces on the testing device 2, and different computing platforms 3 to be tested are then distinguished by the product identifiers of their USB interfaces.
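As an illustration of this feedback handling, the platform count could be derived roughly as follows (the feedback format and the USB product-identifier naming are assumptions, not fixed by the patent):

```python
def collect_platform_count(feedbacks, usb_interface_count):
    """feedbacks: list of (device_identifier, start_result) tuples, where the
    starting result is 1 for a normal start and 0 for an abnormal one.

    Returns (platform_count, identifiers). When some platform feeds back no
    usable identifier, fall back to counting the tester's USB interfaces and
    distinguishing platforms by the product identifiers of those interfaces.
    """
    identifiers = [dev for dev, _ in feedbacks if dev is not None]
    if feedbacks and len(identifiers) == len(feedbacks):
        # Every platform fed back its own identifier: N = number of feedbacks.
        return len(identifiers), identifiers
    # Fallback: number of platforms = number of USB interfaces on the tester.
    usb_ids = [f"usb-port-{i}" for i in range(usb_interface_count)]
    return usb_interface_count, usb_ids
```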
If the wired communication between the testing device 2 and the N computing platforms 3 to be tested is working, the testing device 2 can obtain the starting results fed back by the N computing platforms 3 to be tested. It then counts how many 1s appear; if there are n 1s, the number of normal starting platforms is determined to be n.
When the testing device 2 judges that n is equal to N, it can judge that the N computing platforms 3 to be tested all started normally.
When the testing device 2 judges that n is not equal to N, then for each of the N computing platforms 3 to be tested: when the testing device 2 judges that it can communicate normally with the computing platform 3 to be tested based on that platform's IP address, the computing platform 3 to be tested started normally; otherwise, it started abnormally.
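Putting the counting and fallback logic of the preceding paragraphs together, a sketch might look like this (`verify_by_ping` is a hypothetical helper standing in for the IP-based network verification):

```python
def evaluate_startup(start_results, verify_by_ping):
    """start_results maps device identifier -> 1 (normal start) or 0 (abnormal).

    Returns a dict mapping each identifier to True/False for a normal start.
    """
    N = len(start_results)                                # platforms under test
    n = sum(1 for r in start_results.values() if r == 1)  # count of 1s fed back
    if n == N:
        # All N platforms reported 1: every platform started normally.
        return {dev: True for dev in start_results}
    # n != N: fall back to per-platform network verification by IP address.
    return {dev: verify_by_ping(dev) for dev in start_results}
```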
After the testing device 2 judges that a computing platform 3 to be tested has started normally, it can receive the performance problem result fed back by that computing platform 3. Meanwhile, the testing device 2 counts the respective repetition times of the different performance problem results and outputs these repetition times to the display 1 for display. In this way, the testing device 2 performs multiple tests on the plurality of computing platforms 3 to be tested and comprehensively considers their overall performance across the multiple tests, realizing batch testing of computing platforms 3, improving test efficiency and ensuring test accuracy; and when a controller that has passed the quality test is installed in other products for use, the probability of the controller developing problems can be reduced.
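Counting the repetition times of the different performance problem results is a plain tally; a short sketch with illustrative result strings (the patent does not fix a result format):

```python
from collections import Counter

# Performance problem results fed back by one platform over several tests
# (illustrative values only).
observed = [
    "camera_init_ok", "camera_init_ok",
    "can_comm_failed", "camera_init_ok",
]

# Count how many times each distinct result repeats; these per-result
# repetition counts are what the testing device outputs to the display.
repetitions = Counter(observed)
```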
Based on the same concept as the device embodiment of the present invention, please refer to fig. 3, an embodiment of the present invention further provides a testing method for an unmanned vehicle computing platform, which is applied to the testing device for the unmanned vehicle computing platform according to any of the embodiments described above, and the testing method includes:
301, when receiving a test task, the test equipment issues a command to the power supply controller, so that the power supply controller controls the power supply to supply power to and cut off power from the computing platform to be tested, and the computing platform to be tested thereby simulates being powered on and off on an unmanned vehicle, wherein the test task includes the performance problems contained in the computing platform to be tested and the number of tests;
and 302, after power is supplied, the computing platform to be tested feeds back a performance problem result to the testing equipment, the testing equipment transmits the performance problem result to a display for display, and one test is completed after power to the computing platform to be tested is cut off.
In one embodiment, the testing device counts respective repetition times of different performance problem results of the computing platform to be tested, and outputs the respective repetition times of the different performance problem results to the display for display.
In one embodiment, further comprising:
the computing platform to be tested feeds a starting result back to the testing equipment after power is supplied;
and the test equipment receives a performance problem result fed back by the computing platform to be tested when judging that the computing platform to be tested is started normally based on the starting result fed back by the computing platform to be tested.
In one embodiment, further comprising:
the testing equipment determines the number of normal starting platforms according to the starting results fed back by the at least two computing platforms to be tested respectively;
when the number of the normal starting platforms is different from the number of the at least two computing platforms to be tested, the testing equipment judges whether each computing platform to be tested is started normally;
and when the number of the normal starting platforms is the same as that of the at least two computing platforms to be tested, the testing equipment judges that each computing platform to be tested is started normally.
In one embodiment, the testing device is in wired connection with the at least two computing platforms to be tested;
when the number of the normal starting platforms is different from the number of the at least two computing platforms to be tested, the testing device determining whether each computing platform to be tested started normally includes:
when the number of the normal starting platforms is different from the number of the at least two computing platforms to be tested, the testing equipment performs network verification on the computing platforms to be tested based on the Internet protocol addresses of the computing platforms to be tested;
and the test equipment judges that the computing platform to be tested is normally started when the network verification of the computing platform to be tested passes, and judges that the computing platform to be tested is abnormally started when the network verification fails.
In an embodiment, when the test device judges that the computing platform to be tested is not in the flashing mode, it performs the step of judging, based on the starting result fed back by the computing platform to be tested, whether that platform started normally, and receives the performance problem result fed back by the computing platform to be tested when the computing platform to be tested is judged to have started normally.
Based on the same concept as the method embodiment of the present invention, referring to fig. 4, an embodiment of the present invention further provides a testing module for an unmanned vehicle computing platform, which is applied to the testing device for the unmanned vehicle computing platform according to any of the foregoing embodiments, and the testing module includes:
the simulation unit 401 is configured to issue a command to the power supply controller when a test task is received by the test equipment, so that the power supply controller controls the power supply to supply power to and cut off power from the computing platform to be tested, so that the computing platform to be tested simulates power supply and power off on an unmanned vehicle, and the test task includes a performance problem and test times included in the computing platform to be tested;
and the feedback display unit 402 is used for feeding back the performance problem result to the test equipment after the power is supplied to the computing platform to be tested, transmitting the performance problem result to a display for displaying through the test equipment, and completing one test after the power is off for the computing platform to be tested.
In an embodiment, the feedback display unit 402 is configured to count, by the testing device, respective repetition times of different performance problem results of the computing platform to be tested, and output the respective repetition times of the different performance problem results to the display for displaying.
In one embodiment, further comprising: a feedback unit and a receiving unit; wherein:
the feedback unit is used for feeding back a starting result to the testing equipment after power is supplied to the computing platform to be tested;
the receiving unit is used for receiving the performance problem result fed back by the computing platform to be tested when the testing equipment judges that the computing platform to be tested is started normally based on the starting result fed back by the computing platform to be tested.
In one embodiment, further comprising: a quantity determining unit, a first judging unit and a second judging unit; wherein:
the quantity determining unit is used for the testing equipment to determine the number of the normal starting platforms according to the starting results respectively fed back by the at least two computing platforms to be tested;
the first judging unit is used for judging whether each computing platform to be tested is started normally or not when the number of the normally started platforms is different from the number of the platforms of the at least two computing platforms to be tested by the testing equipment;
the second judging unit is configured to judge, by the testing device, that each computing platform to be tested starts normally when the number of the normal starting platforms is the same as that of the at least two computing platforms to be tested.
In one embodiment, the testing device is in wired connection with the at least two computing platforms to be tested;
the first judging unit includes: a verification subunit and a judging subunit; wherein:
the verification subunit is configured to perform network verification on the computing platforms to be tested based on the internet protocol addresses of the computing platforms to be tested when the number of the normal startup platforms is different from the number of the at least two computing platforms to be tested by the testing device;
the judging subunit is configured to judge, by the testing device, that the computing platform to be tested starts normally when the network verification of the computing platform to be tested passes, and judge that the computing platform to be tested starts abnormally when the network verification fails.
In one embodiment, further comprising: a triggering unit; wherein:
the triggering unit is used for triggering the receiving unit when the testing equipment judges that the computing platform to be tested is not in the flash mode.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. At the hardware level, the electronic device includes a processor 501 and a memory 502 storing execution instructions, and optionally an internal bus 503 and a network interface 504. The memory 502 may include a memory 5021, such as a Random-Access Memory (RAM), and may further include a non-volatile memory 5022, such as at least one disk memory. The processor 501, the network interface 504 and the memory 502 may be interconnected by the internal bus 503, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The internal bus 503 may be divided into an address bus, a data bus, a control bus, etc.; for convenience of illustration it is indicated by only one double-headed arrow in fig. 5, but this does not mean there is only one bus or one type of bus. Of course, the electronic device may also include hardware required for other services. When the processor 501 executes the execution instructions stored in the memory 502, the processor 501 performs the method of any of the embodiments of the present invention, at least including the method shown in fig. 3.
In a possible implementation, the processor reads the corresponding execution instructions from the non-volatile memory into the memory and then runs them; the corresponding execution instructions can also be obtained from other equipment. A test module of the unmanned vehicle computing platform is thereby formed at the logic level. The processor executes the execution instructions stored in the memory, so that, through their execution, the test method of the unmanned vehicle computing platform provided by any embodiment of the present invention is realized.
The processor may be an integrated circuit chip with signal processing capabilities. During implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
Embodiments of the present invention further provide a computer-readable storage medium, which includes an execution instruction, and when a processor of an electronic device executes the execution instruction, the processor executes a method provided in any one of the embodiments of the present invention. The electronic device may specifically be the electronic device shown in fig. 5; the execution instruction is a computer program corresponding to a test module of the unmanned vehicle computing platform.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects.
The embodiments of the present invention are described in a progressive manner, and the same and similar parts among the embodiments can be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the apparatus embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present invention, and is not intended to limit the present invention. Various modifications and alterations to this invention will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (10)

1. A testing device for an unmanned vehicle computing platform, comprising:
the display is used for displaying the test result and inputting the test task, and the test task comprises the performance problem and the test frequency contained in the computing platform to be tested;
the display is connected with the testing equipment so that the testing equipment receives testing tasks, and the testing equipment is detachably connected with the computing platform to be tested;
the power supply is used for supplying power to the testing equipment and the computing platform to be tested, and is detachably connected with the computing platform to be tested;
the power supply controller is respectively connected with the test equipment and the power supply;
when the test equipment receives a test task, the test equipment sends a command to the power supply controller, so that the power supply controller controls the power supply to supply power to and cut off power of the computing platform to be tested, and the computing platform to be tested simulates power supply and power off on an unmanned vehicle;
the computing platform to be tested feeds back a performance problem result to the testing equipment after power supply, and the testing equipment transmits the performance problem result to the display for display;
and completing one test of the computing platform to be tested after the power is cut off.
2. The apparatus of claim 1, wherein the computing platform to be tested comprises an autonomous driving computing unit, a parallel driving system, and/or an internet of vehicles system.
3. The apparatus of claim 1, wherein the performance problem comprises: whether camera initialization is normal, whether CAN communication is normal and/or whether network communication is normal.
4. The apparatus of claim 1, wherein the test device issues commands to the power controller by way of GPIO.
5. The apparatus of claim 1, wherein the testing device counts respective repetition times of different performance problem results of the computing platform to be tested, and outputs the respective repetition times of the different performance problem results to the display for displaying.
6. The apparatus of claim 1, wherein the computing platform under test feeds back a start-up result to the testing device after power is supplied;
and the test equipment receives a performance problem result fed back by the computing platform to be tested when judging that the computing platform to be tested is started normally based on the starting result fed back by the computing platform to be tested.
7. The apparatus of claim 6, wherein the testing device is detachably connected with at least two computing platforms to be tested;
when the test equipment receives a test task, the test equipment issues a command to the power supply controller, so that the power supply controller controls the power supply to supply power to the at least two computing platforms to be tested and cut off the power.
8. The apparatus according to claim 7, wherein the testing device determines the number of normal boot platforms according to the boot results fed back by the at least two computing platforms to be tested;
when the number of the normal starting platforms is different from the number of the at least two computing platforms to be tested, the testing equipment judges whether each computing platform to be tested is started normally;
and when the number of the normal starting platforms is the same as that of the at least two computing platforms to be tested, the testing equipment judges that each computing platform to be tested is started normally.
9. The apparatus of claim 8, wherein the testing device is in wired connection with the at least two computing platforms under test;
when the number of the normal starting platforms is different from the number of the at least two computing platforms to be tested, the testing equipment performs network verification on the computing platforms to be tested based on the Internet protocol addresses of the computing platforms to be tested;
and the test equipment judges that the computing platform to be tested is normally started when the network verification of the computing platform to be tested passes, and judges that the computing platform to be tested is abnormally started when the network verification fails.
10. The apparatus according to claim 7, wherein the testing device, when judging that the computing platform to be tested is not in the flashing mode, performs the judgment based on the starting result fed back by the computing platform to be tested, and receives the performance problem result fed back by the computing platform to be tested when judging that the computing platform to be tested started normally.
CN202110365002.0A 2021-04-06 2021-04-06 Testing device for unmanned vehicle computing platform Pending CN112732510A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110365002.0A CN112732510A (en) 2021-04-06 2021-04-06 Testing device for unmanned vehicle computing platform


Publications (1)

Publication Number Publication Date
CN112732510A (en) 2021-04-30

Family

ID=75596413


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113189969A (en) * 2021-05-08 2021-07-30 东风汽车集团股份有限公司 Upper computer system for real-time monitoring and early warning of unmanned vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2015084A2 (en) * 2007-07-11 2009-01-14 Vector Informatik GmbH Testing device for electric components
CN102200777A (en) * 2011-03-25 2011-09-28 上海汽车集团股份有限公司 Performance monitoring method for new energy vehicle controller verification test
CN203405728U (en) * 2013-07-16 2014-01-22 北京汽车股份有限公司 Automatic test system of vehicle electronic control unit
CN209590584U (en) * 2019-01-21 2019-11-05 深圳市菲菱科思通信技术股份有限公司 Automatic controller for electric consumption up and down



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210430