CN114756463A - Test environment development method, system, equipment and medium - Google Patents


Info

Publication number
CN114756463A
CN114756463A (application CN202210397318.2A)
Authority
CN
China
Prior art keywords
data
simulation platform
comparator
reference model
software simulation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210397318.2A
Other languages
Chinese (zh)
Inventor
Li Yan (李岩)
Shao Haibo (邵海波)
Current Assignee
Shandong Yunhai Guochuang Cloud Computing Equipment Industry Innovation Center Co Ltd
Original Assignee
Shandong Yunhai Guochuang Cloud Computing Equipment Industry Innovation Center Co Ltd
Priority date
Filing date
Publication date
Application filed by Shandong Yunhai Guochuang Cloud Computing Equipment Industry Innovation Center Co Ltd
Priority claimed from application CN202210397318.2A
Publication of CN114756463A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; error correction; monitoring
    • G06F 11/36: Preventing errors by testing or debugging software
    • G06F 11/362: Software debugging
    • G06F 11/3648: Software debugging using additional hardware
    • G06F 11/3668: Software testing
    • G06F 11/3672: Test management
    • G06F 11/3684: Test management for test design, e.g. generating new test cases
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites

Abstract

The invention discloses a test environment development method comprising the following steps: generating an initial software simulation platform comprising a reference model and a comparator; collecting, according to preset acquisition parameters, the input and output data generated by the initial software simulation platform while simulating the unit to be tested; building a lightweight software simulation platform from the reference model and the comparator; feeding the collected input data into the lightweight software simulation platform and debugging the reference model and the comparator against the collected output data; and replacing the original reference model and comparator in the initial software simulation platform with the debugged ones to obtain a final software simulation platform. The invention also discloses a system, a computer device, and a readable storage medium. The proposed scheme can greatly reduce the development time of the reference model and the comparator.

Description

Test environment development method, system, equipment and medium
Technical Field
The invention relates to the field of testing, and in particular to a test environment development method, system, device, and storage medium.
Background
Chip designs have grown increasingly complex as chip development has advanced. To ensure design correctness, a chip design must be comprehensively verified by simulation before tape-out: the simulation platform generates stimulus as the chip's input signals and checks the chip's output signals against the expected output signals to verify that the design is correct.
When a verification engineer receives a unit to be tested from a designer, a software environment capable of running simulation verification, a testbench (TB), must be built for it. As shown in fig. 1, the testbench consists of the unit to be tested and a verification component. The verification component mainly comprises configuration information, an agent, a reference model, and a comparator. Depending on where the unit to be tested runs, a classical chip verification platform can be divided into a software simulation platform and a hardware simulation platform.
As shown in fig. 1, a software simulation platform uses simulation compilation software to compile both the verification component and the unit to be tested into a binary file executable by a computer, and then uses simulation run software to execute that file on a dedicated server. This combination of simulation software and dedicated servers is referred to as a software simulation platform.
As shown in fig. 2, a hardware simulation platform mainly consists of two parts: a hardware platform and a host. The hardware platform runs the unit to be tested, while the host runs the verification component; this combination of emulation software and dedicated hardware is referred to as a hardware simulation platform. The hardware emulation software first uses synthesis and place-and-route tools to compile the unit to be tested into an executable binary bitstream. The verification component is then compiled with a simulation compilation tool into a binary that the host is responsible for running. The hardware platform is connected to the host through a dedicated high-speed communication interface, over which the host obtains the portion of run-time data it needs; this data serves as the input to the host-side reference model and comparator.
Software and hardware simulation each have advantages and disadvantages, as shown in fig. 3. The software simulation platform compiles to a binary executed by a computer, so compilation is far faster than on the hardware platform: software compilation of a small communication interface takes only a few minutes, whereas hardware compilation can take several hours. Conversely, the hardware platform simulates much faster than the software platform, especially for very complex algorithms or systems-on-chip; a typical system-on-chip may execute at 1 to 10 Hz on the hardware platform, while the software platform runs at only one hundredth to one thousandth of that speed. Either platform can verify the function of the unit to be tested well. With the help of debugging tools, the software simulation platform can observe every signal in the verification component and the unit to be tested, whereas the hardware platform can observe only some of the unit's signals, and only for a limited duration. The hardware platform also costs far more than the software platform because additional hardware must be purchased.
To combine the advantages of software simulation and hardware simulation, the current mainstream chip verification flow first uses a software simulation platform to test the chip's basic functions, and then uses a hardware simulation platform to verify its complex functions and sequential logic.
As shown in fig. 4, a verification engineer first uses an automation tool to generate a basic verification component and connects the corresponding signals to the inputs and outputs of the unit to be tested. Input data is generally generated by a test case, converted by a driver, and then sent to the unit to be tested; the output signals are typically connected to a comparator. A reference model and comparator are then developed for the unit to be tested, and the comparator compares the reference model's expected values with the unit's actual values.
In the early development stage of the software simulation platform, the difficulties are mainly the following:
1. Test platform development is time-consuming
Test platform development mainly covers the reference model, the comparator, and the test cases. In the early stage of verification, because the reference model and comparator are immature, the data of a typical scenario may need to be compiled and debugged several times, the whole test platform must be restarted each time, and this process consumes a great deal of time.
2. Diversity of test scenarios
For example, for a given typical scenario, a verification engineer develops a reference model and comparator on the software simulation platform, finds after a run that the reference model is wrong, and only after several attempts and modifications do the reference model's outputs match the unit's in that scenario. Similar work must then be repeated for every other typical scenario; the test platform must be recompiled for every debugging round, and each run after compilation consumes a large amount of time. In more extreme cases, when an error occurs at a specific time or on a specific event after the platform has run for a long while, re-running the simulation after modifying the reference model again takes a large amount of time just to reach the point of the previous error.
3. Reproducibility of special scenarios
During testing, the response of the unit to be tested to erroneous input sometimes needs to be tested, and constructing such special erroneous input requires building a complex test case.
After entering the hardware simulation phase, the problems encountered in software simulation become much less likely, but new difficulties arise:
4. Debugging difficulty
As shown in fig. 3, because the hardware simulation platform executes a hardware bitstream produced by compilation, synthesis, and routing, the whole unit to be tested is a black box to the verification component, which can only draw conclusions by comparing the unit's output with the reference model's output. When a difference is found, the problem can be located only by adding debug signals to the bitstream, recompiling and re-routing, and observing the specified signals with a dedicated tool; and as noted above, hardware compilation and routing are very time-consuming. Moreover, because of the limited size of the built-in memory, the number of debug signals and the capture duration cannot be increased at will, which greatly restricts the use of the hardware simulation platform.
Disclosure of Invention
In view of the above, in order to overcome at least one aspect of the above problems, an embodiment of the present invention provides a test environment development method, including:
generating an initial software simulation platform comprising a reference model and a comparator;
acquiring input and output data generated by the initial software simulation platform in the process of simulating the unit to be tested according to preset acquisition parameters;
building a lightweight software simulation platform by using the reference model and the comparator;
inputting the input data into the lightweight software simulation platform and debugging the reference model and the comparator according to the output data;
and replacing the original reference model and comparator in the initial software simulation platform with the debugged reference model and comparator to obtain a final software simulation platform.
In some embodiments, further comprising:
constructing a hardware simulation platform;
debugging the unit to be tested by using the final software simulation platform;
simulating the debugged unit to be tested by using the hardware simulation platform;
collecting the input data of the hardware simulation platform when the debugged unit to be tested produces an error during simulation, or collecting input data under a preset scenario;
and sending the acquired data as input data to the final software simulation platform to debug the debugged unit to be tested again.
In some embodiments, acquiring input and output data generated by the initial software simulation platform in the process of simulating the unit to be tested according to preset acquisition parameters further includes:
an acquisition mode is determined according to the acquisition parameters, wherein the acquisition mode comprises a continuous mode, an interval mode and an event mode.
In some embodiments, further comprising:
setting up a data buffer in response to the acquisition mode being the continuous mode;
storing the acquired data in the data buffer;
and converting the data in the data buffer into a preset data format, timestamping it, and storing it in a file.
In some embodiments, further comprising:
starting a first timer and a second timer in response to the acquisition mode being the interval mode;
using the first timer to generate an acquisition trigger signal every first preset time period;
using the second timer to record the time during which no data arrives, and generating an acquisition stop signal when that time reaches a preset duration;
and converting the acquired data into a preset data format, timestamping it, and storing it in a file.
In some embodiments, further comprising:
acquiring data according to set time, address, and data trigger conditions in response to the acquisition mode being the event mode;
and converting the acquired data into a preset data format, timestamping it, and storing it in a file.
In some embodiments, further comprising:
extracting data and a timestamp in the file and respectively putting the data and the timestamp in a first queue and a second queue;
and sending the corresponding data in the first queue to the corresponding simulation platform according to the time of the timestamp in the second queue.
Based on the same inventive concept, according to another aspect of the present invention, an embodiment of the present invention further provides a test environment development system, including:
a generation module configured to generate an initial software simulation platform comprising a reference model and a comparator;
the acquisition module is configured to acquire input and output data generated by the initial software simulation platform in the process of simulating the unit to be tested according to preset acquisition parameters;
the building module is configured to build a lightweight software simulation platform by using the reference model and the comparator;
the debugging module is configured to input the input data into the lightweight software simulation platform and debug the reference model and the comparator according to the output data;
and a replacing module configured to replace the original reference model and comparator in the initial software simulation platform with the debugged reference model and comparator to obtain a final software simulation platform.
Based on the same inventive concept, according to another aspect of the present invention, an embodiment of the present invention further provides a computer apparatus, including:
at least one processor; and
a memory storing a computer program operable on the processor, wherein the processor executes the program to perform any of the test environment development method steps as described above.
Based on the same inventive concept, according to another aspect of the present invention, an embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the steps of any one of the test environment development methods described above.
The invention provides at least the following beneficial technical effect: the proposed scheme can greatly reduce the development time of the reference model and the comparator. Once the reference model and comparator have been developed and stabilized in the lightweight verification environment, they can be put straight back into the traditional verification environment for large-scale regression testing.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in describing them are briefly introduced below. The drawings described below obviously show only some embodiments of the invention, and those skilled in the art can derive other embodiments from them without creative effort.
FIG. 1 is a schematic diagram of a conventional software emulation verification platform;
FIG. 2 is a diagram of a conventional hardware simulation platform;
FIG. 3 is a schematic diagram comparing a software simulation platform with a hardware simulation platform;
FIG. 4 is a flow chart of a conventional chip verification process;
FIG. 5 is a flowchart illustrating a method for developing a test environment according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an initial software simulation platform provided by an embodiment of the present invention;
FIG. 7 is a schematic diagram of a lightweight software simulation platform provided in an embodiment of the present invention;
FIG. 8 is a block diagram of a data collection process provided by an embodiment of the present invention;
FIG. 9 is a schematic diagram of storing collected data in a file according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of a test environment development system according to an embodiment of the present invention;
FIG. 11 is a schematic structural diagram of a computer device provided in an embodiment of the present invention;
fig. 12 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the following embodiments of the present invention are described in further detail with reference to the accompanying drawings.
It should be noted that all expressions using "first" and "second" in the embodiments of the present invention distinguish two entities or parameters of the same name; "first" and "second" are used only for convenience of expression and should not be construed as limiting the embodiments. This will not be repeated in the following embodiments.
According to an aspect of the present invention, an embodiment of the present invention provides a test environment development method, as shown in fig. 5, which may include the steps of:
s1, generating an initial software simulation platform comprising a reference model and a comparator;
s2, acquiring input and output data generated by the initial software simulation platform in the process of simulating the unit to be tested according to preset acquisition parameters;
s3, building a lightweight software simulation platform by using the reference model and the comparator;
s4, inputting the input data into the lightweight software simulation platform and debugging the reference model and the comparator according to the output data;
and S5, replacing the original reference model and comparator in the initial software simulation platform with the debugged reference model and comparator to obtain a final software simulation platform.
The scheme provided by the invention can greatly reduce the development time of the reference model and the comparator. Therefore, after the reference model and the comparator are developed and stabilized in a lightweight verification environment, the reference model and the comparator can be directly put back to a traditional verification environment for large-scale regression testing.
In some embodiments, steps S1 to S5 work as follows. An initial software simulation platform including a reference model and a comparator is shown in fig. 6. In the software simulation platform both the unit under test and the test component run on the host, whereas in the hardware simulation platform the unit under test runs on the hardware platform and the test component runs on the host. The test component mainly comprises the reference model and the comparator. Test data reaches the unit under test and the reference model through the driver; the outputs of both serve as the comparator's inputs, and the comparator compares the unit under test's actual values with the reference model's expected values to reach a conclusion.
As shown in fig. 7, the lightweight software simulation platform implements development of the reference model and comparator by collecting the input and output data generated by the initial software simulation platform of fig. 6, feeding the collected input data into the reference model, and comparing the reference model's output with the collected output data.
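The comparison loop described above can be sketched in Python. This is an illustrative sketch only: the record layout `(timestamp, input, expected_output)`, the function names, and the toy reference model are assumptions for illustration, not details from the patent.

```python
def replay(records, reference_model):
    """Drive recorded inputs through the reference model and compare
    its outputs with the outputs captured from the full platform."""
    mismatches = []
    for ts, stimulus, expected in records:
        actual = reference_model(stimulus)
        if actual != expected:
            # keep everything needed to debug this record offline
            mismatches.append((ts, stimulus, expected, actual))
    return mismatches

# Example: debug a toy reference model against captured data;
# the third record disagrees with the model, so it gets flagged.
captured = [(0, 1, 2), (10, 2, 4), (20, 3, 7)]
bad_model = lambda x: 2 * x
print(replay(captured, bad_model))  # → [(20, 3, 7, 6)]
```

In a real flow, `records` would come from the data analysis module's parsed capture file, and each mismatch points at a record worth debugging without recompiling the full testbench.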
In some embodiments, further comprising:
constructing a hardware simulation platform;
debugging the unit to be tested by using the final software simulation platform;
simulating the debugged unit to be tested by using the hardware simulation platform;
collecting the input data when the debugged unit to be tested produces an error while the hardware simulation platform simulates it, or collecting input data under a preset scenario;
and sending the acquired data as input data to the final software simulation platform to debug the debugged unit to be tested again.
Specifically, special data is collected from the hardware simulation platform, including extreme-scenario data and data for which the unit to be tested produced errors. This error data is analyzed and then re-injected into the final software simulation platform, where the final debugging work is carried out.
In some embodiments, acquiring input and output data generated by the initial software simulation platform in the process of simulating the unit to be tested according to preset acquisition parameters further includes:
an acquisition mode is determined according to the acquisition parameters, wherein the acquisition mode comprises a continuous mode, an interval mode and an event mode.
Specifically, during collection, traffic on a specific input/output bus protocol of the unit to be tested can be converted into a specific data format and stored in a file, with a time added to each data packet. A data analysis unit later converts the data in the file back into specific data packets that serve as input to the reference model and comparator. Collection can run in three modes. The first is the continuous mode, in which the data conversion unit records all input and output data into the file; this preserves the largest possible data set, but produces a huge data volume and places heavy demands on storage space. The second is the interval mode, in which data is stored at a fixed interval frequency; this stores far less data, but useful data may be lost. The third is the event-triggered mode, in which the input and output of a specific scenario are saved by configuring a trigger condition and a save duration; this both makes extreme cases quick to find and greatly reduces the amount of data saved.
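The three-way mode choice might be dispatched as below; the parameter key `"mode"`, the enum names, and the function name are illustrative assumptions, not part of the patent.

```python
from enum import Enum

class CaptureMode(Enum):
    CONTINUOUS = "continuous"  # record every packet into the file
    INTERVAL = "interval"      # sample at a fixed interval frequency
    EVENT = "event"            # sample only when a trigger condition fires

def select_mode(acq_params: dict) -> CaptureMode:
    # the preset acquisition parameters carry the mode name in this sketch
    return CaptureMode(acq_params["mode"])

print(select_mode({"mode": "event"}))  # CaptureMode.EVENT
```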
In some embodiments, further comprising:
setting up a data buffer in response to the acquisition mode being the continuous mode;
storing the acquired data in the data buffer;
and converting the data in the data buffer into a preset data format, timestamping it, and storing it in a file.
Specifically, as shown in fig. 8, after entering the continuous mode, data packets are continuously written to the file through the interface conversion module. To ensure the best file-writing performance, a data buffer must be created. If the rate at which continuous-mode data arrives is lower than the rate at which the data conversion module writes the file, all data can be stored safely; if it is higher, the buffer fills after some time. There are then two options: discard the newest data or discard the oldest data. Both cause data loss, neither is clearly better, and the user chooses between them as needed. If no acquisition time or length is specified, sampling stops when the file is full.
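The overflow behavior of the continuous-mode buffer can be sketched as follows; the class and its `drop_oldest` flag are illustrative assumptions modeling the two user-selectable policies described above.

```python
from collections import deque

class CaptureBuffer:
    """Bounded buffer between the bus sampler and the file writer."""
    def __init__(self, capacity, drop_oldest=True):
        self.buf = deque()
        self.capacity = capacity
        self.drop_oldest = drop_oldest
        self.dropped = 0  # count of packets lost to overflow

    def push(self, packet):
        if len(self.buf) >= self.capacity:
            self.dropped += 1
            if self.drop_oldest:
                self.buf.popleft()  # discard the earliest packet
            else:
                return              # discard the newest packet instead
        self.buf.append(packet)

buf = CaptureBuffer(capacity=2, drop_oldest=True)
for p in ["p0", "p1", "p2"]:
    buf.push(p)
print(list(buf.buf))  # ['p1', 'p2'], the oldest packet was dropped
```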
In some embodiments, further comprising:
starting a first timer and a second timer in response to the acquisition mode being the interval mode;
using the first timer to generate an acquisition trigger signal every first preset time period;
using the second timer to record the time during which no data arrives, and generating an acquisition stop signal when that time reaches a preset duration;
and converting the acquired data into a preset data format, timestamping it, and storing it in a file.
Specifically, as shown in fig. 8, after entering the interval mode, the data conversion module starts a timer that sends a trigger signal at a fixed interval to notify the data acquisition unit. On receiving the trigger, the acquisition part of the data acquisition module waits for data on the bus: if data is present it enters the sampling state, otherwise it remains in the waiting state. To prevent the module from waiting forever, a second timer is started when the acquisition module enters the sampling state; if no data arrives within the time this timer specifies, the module leaves the acquisition state.
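A minimal simulated-time sketch of this two-timer scheme: the first timer fires a capture trigger every `interval` ticks, and the second aborts the wait if no bus data arrives within `timeout` ticks. The tick-based scheduling, the bus model (a dict of tick to data), and all names are illustrative assumptions.

```python
def interval_capture(bus, interval, timeout, total_ticks):
    captured = []
    waiting = False
    wait_left = 0
    for tick in range(total_ticks):
        if tick % interval == 0:       # first timer: periodic trigger
            waiting, wait_left = True, timeout
        if waiting:
            data = bus.get(tick)       # bus modeled as {tick: data}
            if data is not None:
                captured.append((tick, data))
                waiting = False        # sample taken, stop this attempt
            else:
                wait_left -= 1         # second timer: no-data countdown
                if wait_left == 0:
                    waiting = False    # give up, wait for next trigger
    return captured

bus = {1: "a", 9: "b"}                 # data present only at ticks 1 and 9
print(interval_capture(bus, interval=4, timeout=2, total_ticks=12))
# → [(1, 'a'), (9, 'b')]
```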
In some embodiments, further comprising:
acquiring data according to set time, address, and data trigger conditions in response to the acquisition mode being the event mode;
and converting the acquired data into a preset data format, timestamping it, and storing it in a file.
Specifically, as shown in fig. 8, after entering the event-triggered mode, the data conversion module triggers sampling according to the configured trigger conditions, which are mainly time triggers, address triggers, and data triggers.
Time triggering means that the signal sampling module starts working when simulation time advances to a certain moment T, and stops after sampling for a certain time or a certain amount of data.
Address triggering means that the signal sampling module starts working when a transmit/receive event to a specific address occurs on the bus, and stops after sampling for a certain time or a certain amount of data.
Data triggering means that the signal sampling module starts working when an event of writing/reading specific data occurs on the bus, and stops after sampling for a certain time or a certain amount of data.
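The three trigger kinds reduce to a predicate over each bus transaction; this sketch assumes a transaction dict with `time`, `addr`, and `data` fields and a trigger config holding any subset of the three conditions. All field names are illustrative assumptions.

```python
def should_trigger(txn, cfg):
    """Start sampling when any configured condition matches."""
    if "time" in cfg and txn["time"] >= cfg["time"]:
        return True   # time trigger: simulation reached moment T
    if "addr" in cfg and txn["addr"] == cfg["addr"]:
        return True   # address trigger: access to a specific address
    if "data" in cfg and txn["data"] == cfg["data"]:
        return True   # data trigger: specific data written/read
    return False

txn = {"time": 120, "addr": 0x4000, "data": 0xDEAD}
print(should_trigger(txn, {"addr": 0x4000}))  # True: address trigger hit
print(should_trigger(txn, {"time": 500}))     # False: moment T not reached
```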
In some embodiments, further comprising:
extracting data and a timestamp in the file and respectively putting the data and the timestamp in a first queue and a second queue;
and sending the corresponding data in the first queue to the corresponding simulation platform according to the time of the timestamp in the second queue.
Specifically, data is acquired according to the selected trigger mode: data on the bus is sampled on the clock and packed into data packets, which are written through the system's file read/write interface to the host's disk or the hardware platform's memory. As shown in fig. 9, the packets are stored serially in the storage medium, and each packet consists of two parts: a timestamp recording the start time of the packet's samples, and the sampled input/output data. Typical packet content includes address information and data information, the latter mainly the data bit width and length. When the packets need to be fed into the corresponding simulation platform, a data analysis module extracts the data and timestamps from the file, placing the timestamps in one queue and the data packets in another, with the two queues in one-to-one correspondence. Once the packets and timestamps are queued, the sending module starts a timer and sends each packet to the output interface at the time given by its timestamp.
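A plausible encoding of this packet layout and the parsing step can be sketched as follows. The exact on-disk format is not specified in the patent; the field order (timestamp, address, bit width, payload length, payload) and sizes here are assumptions chosen to match the described packet content.

```python
import struct
from collections import deque

# Header: timestamp, address, bit width, payload length (little-endian)
FMT = "<QQHH"

def pack_packet(ts, addr, width, payload: bytes) -> bytes:
    """Data acquisition side: serialize one sampled packet."""
    return struct.pack(FMT, ts, addr, width, len(payload)) + payload

def parse_file(blob: bytes):
    """Data analysis side: split a capture file into one-to-one
    timestamp and data queues, as in the description."""
    ts_q, data_q = deque(), deque()
    off, hdr = 0, struct.calcsize(FMT)
    while off < len(blob):
        ts, addr, width, length = struct.unpack_from(FMT, blob, off)
        off += hdr
        ts_q.append(ts)
        data_q.append((addr, width, blob[off:off + length]))
        off += length
    return ts_q, data_q

blob = pack_packet(100, 0x10, 32, b"\x01\x02") + pack_packet(250, 0x14, 32, b"\x03")
ts_q, data_q = parse_file(blob)
print(list(ts_q))  # [100, 250]
print(data_q[0])   # (16, 32, b'\x01\x02')
```

The sending module would then pop both queues in lockstep, scheduling each packet at its timestamp.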
In the scheme provided by the invention, the reference model and comparator are separated from the traditional verification platform; the data acquisition module provided by the invention then stores specific test data from the bus into a file in a certain format, and the data analysis module provided by the invention feeds the test data in the file to the reference model and comparator. Because the reference model and comparator are developed on an independent lightweight verification platform, recompiling the whole test component and unit to be tested each time is avoided. At the same time, the sampling method provided by the invention can sample specific data, which greatly reduces the development time of the reference model and comparator. Once they have been developed and stabilized in the lightweight verification environment, they can be put straight back into the traditional verification environment for large-scale regression testing.
A traditional reference model and comparator are generally developed against a set of common data, so errors easily occur when specific corner-case data is encountered; each such error requires recompiling the whole verification environment, which is difficult and time-consuming. Moreover, in some cases an error in the reference model or comparator only appears after a long simulation run; because the erroneous data is not stored, verifying whether a correction is right requires running the long simulation again.
The invention also uses the sampled data to solve the problem of difficult debugging of the unit to be tested in hardware simulation. Special data or problem data collected from hardware is injected into the final software simulation platform, where all signals of the unit to be tested can be observed with a verification tool, so debugging of the unit to be tested can be completed quickly.
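The lightweight debug loop can be illustrated with a minimal sketch. The reference model, comparator, and the doubling behavior below are hypothetical stand-ins, not the patent's actual components: captured input/output pairs are replayed against the reference model, and the comparator flags mismatches without recompiling the full environment.

```python
def reference_model(stimulus):
    # Hypothetical golden model: the unit under test is assumed
    # to double its input.
    return stimulus * 2

def comparator(expected, actual):
    """Report whether the reference output matches the captured output."""
    return expected == actual

def replay(captured_pairs):
    """Feed each captured input to the reference model and compare the
    result with the captured output; return indices of mismatches."""
    mismatches = []
    for i, (stimulus, captured_output) in enumerate(captured_pairs):
        if not comparator(reference_model(stimulus), captured_output):
            mismatches.append(i)
    return mismatches

# Two good samples plus one captured "problem" sample from hardware.
capture = [(3, 6), (5, 10), (7, 15)]
bad_indices = replay(capture)
```

Because only this small harness is compiled, each fix to the reference model or comparator can be re-checked against the stored problem data in seconds rather than rerunning a long simulation.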
Based on the same inventive concept, according to another aspect of the present invention, an embodiment of the present invention further provides a test environment development system 400, as shown in fig. 10, including:
a generation module 401 configured to generate an initial software simulation platform comprising a reference model and a comparator;
an acquisition module 402 configured to acquire input and output data generated by the initial software simulation platform in a simulation process of a unit to be tested according to preset acquisition parameters;
a building module 403 configured to build a lightweight software simulation platform by using the reference model and the comparator;
a debugging module 404 configured to input the input data into the lightweight software simulation platform and debug the reference model and the comparator according to the output data;
a replacing module 405 configured to replace the reference model and the comparator in the initial software simulation platform with the debugged reference model and comparator to obtain a final software simulation platform.
Based on the same inventive concept, according to another aspect of the present invention, as shown in fig. 11, an embodiment of the present invention further provides a computer apparatus 501, comprising:
at least one processor 520; and
a memory 510, the memory 510 storing a computer program 511 executable on the processor, the processor 520 executing the program to perform the steps of any of the test environment development methods described above.
Based on the same inventive concept, according to another aspect of the present invention, as shown in fig. 12, an embodiment of the present invention further provides a computer-readable storage medium 601, where the computer-readable storage medium 601 stores computer program instructions 610, and the computer program instructions 610, when executed by a processor, perform the steps of any one of the test environment development methods as above.
Finally, it should be noted that, as will be understood by those skilled in the art, all or part of the processes of the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the embodiments of the methods described above.
Further, it should be appreciated that the computer-readable storage media (e.g., memory) herein can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as software or hardware depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments of the present invention.
The foregoing is an exemplary embodiment of the present disclosure, but it should be noted that various changes and modifications could be made herein without departing from the scope of the present disclosure as defined by the appended claims. The functions, steps and/or actions of the method claims in accordance with the disclosed embodiments described herein need not be performed in any particular order. Furthermore, although elements of the disclosed embodiments of the invention may be described or claimed in the singular, the plural is contemplated unless limitation to the singular is explicitly stated.
It should be understood that, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly supports the exception. It should also be understood that "and/or" as used herein is meant to include any and all possible combinations of one or more of the associated listed items.
The numbers of the embodiments disclosed in the embodiments of the present invention are merely for description, and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
Those of ordinary skill in the art will understand that the discussion of any embodiment above is meant to be exemplary only, and is not intended to imply that the scope of the disclosure, including the claims, of embodiments of the invention is limited to these examples; within the spirit of the embodiments of the invention, technical features of the above embodiment or of different embodiments may also be combined, and many other variations of the different aspects of the embodiments exist as described above, which are not provided in detail for the sake of brevity. Therefore, any omissions, modifications, substitutions, improvements, and the like that may be made without departing from the spirit and principles of the embodiments of the present invention are intended to be included within the scope of the embodiments of the present invention.

Claims (10)

1. A test environment development method is characterized by comprising the following steps:
generating an initial software simulation platform comprising a reference model and a comparator;
acquiring input and output data generated by the initial software simulation platform in the process of simulating the unit to be tested according to preset acquisition parameters;
building a lightweight software simulation platform by using the reference model and the comparator;
inputting the input data into the lightweight software simulation platform and debugging the reference model and the comparator according to the output data;
and replacing the reference model and the comparator in the initial software simulation platform with the debugged reference model and comparator to obtain a final software simulation platform.
2. The method of claim 1, further comprising:
constructing a hardware simulation platform;
debugging the unit to be tested by using the final software simulation platform;
simulating the debugged unit to be tested by using the hardware simulation platform;
acquiring input data of the hardware simulation platform when an error occurs in the debugged unit to be tested during its simulation, or acquiring input data under a preset scene;
and sending the acquired data as input data to the final software simulation platform to debug the debugged unit to be tested again.
3. The method of claim 1, wherein collecting input and output data generated by the initial software simulation platform during the simulation of the unit under test according to preset collection parameters further comprises:
an acquisition mode is determined according to the acquisition parameters, wherein the acquisition mode comprises a continuous mode, an interval mode and an event mode.
4. The method of claim 3, further comprising:
setting a data buffer area in response to the acquisition mode being a continuous mode;
storing the acquired data into the data buffer area;
and converting the data in the data buffer area into a preset data format, stamping a time stamp, and storing the data in a file.
5. The method of claim 3, further comprising:
in response to the acquisition mode being an interval mode, starting a first timer and a second timer;
generating an acquisition trigger signal every other first preset time period by utilizing the first timer;
recording, with the second timer, the time during which no data is input, and generating an acquisition stop signal when that time reaches a second preset time period;
and converting the acquired data into a preset data format, stamping a time stamp, and storing the data in a file.
6. The method of claim 3, further comprising:
in response to the acquisition mode being an event mode, acquiring data according to a set time trigger condition, address trigger condition, and data trigger condition;
and converting the acquired data into a preset data format, stamping a time stamp, and storing the data in a file.
7. The method of any one of claims 4-6, further comprising:
extracting data and a timestamp in the file and respectively putting the data and the timestamp in a first queue and a second queue;
and sending the corresponding data in the first queue to the corresponding simulation platform according to the time of the timestamp in the second queue.
8. A test environment development system, comprising:
a generation module configured to generate an initial software simulation platform comprising a reference model and a comparator;
the acquisition module is configured to acquire input and output data generated by the initial software simulation platform in the process of simulating the unit to be tested according to preset acquisition parameters;
the building module is configured to build a lightweight software simulation platform by using the reference model and the comparator;
the debugging module is configured to input the input data into the lightweight software simulation platform and debug the reference model and the comparator according to the output data;
and the replacing module is configured to replace the reference model and the comparator in the initial software simulation platform with the debugged reference model and comparator to obtain a final software simulation platform.
9. A computer device, comprising:
at least one processor; and
memory storing a computer program operable on the processor, wherein the processor executes the program to perform the steps of the method according to any of claims 1-7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1-7.
Priority Applications (1)

CN202210397318.2A — priority/filing date 2022-04-15 — Test environment development method, system, equipment and medium — status: Pending

Publications (1)

CN114756463A (en) — published 2022-07-15

Family ID: 82331372

Country: CN


Legal Events

- PB01: Publication
- SE01: Entry into force of request for substantive examination