CN115034173A - Test method of chip simulation model - Google Patents

Test method of chip simulation model Download PDF

Info

Publication number
CN115034173A
CN115034173A (application CN202210628025.0A)
Authority
CN
China
Prior art keywords
simulation model
test
chip simulation
test case
command line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210628025.0A
Other languages
Chinese (zh)
Inventor
聂玲子
黄洋科
田立
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spreadtrum Communications Shanghai Co Ltd
Original Assignee
Spreadtrum Communications Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spreadtrum Communications Shanghai Co Ltd filed Critical Spreadtrum Communications Shanghai Co Ltd
Priority to CN202210628025.0A priority Critical patent/CN115034173A/en
Publication of CN115034173A publication Critical patent/CN115034173A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/30Circuit design
    • G06F30/39Circuit design at the physical level
    • G06F30/398Design verification or optimisation, e.g. using design rule check [DRC], layout versus schematics [LVS] or finite element methods [FEM]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3684Test management for test design, e.g. generating new test cases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis

Abstract

The invention provides a testing method of a chip simulation model, which is applied to an automatic testing framework and comprises the following steps: receiving command line parameters transmitted by a control script; analyzing the command line parameters into command lines for executing the chip simulation model; starting a chip simulation model to enable the chip simulation model to execute a target test case according to the command line; monitoring the execution state of the chip simulation model until a mark of the execution end of the chip simulation model is obtained; and acquiring an output result of the chip simulation model, and comparing the output result with a target result of the target test case to obtain a test result of the test case. The test method of the chip simulation model provided by the invention can realize the test of the chip design scheme in an automatic execution mode and improve the test efficiency.

Description

Test method of chip simulation model
Technical Field
The invention relates to the technical field of chip design, in particular to a test method of a chip simulation model.
Background
Chips play an important role in a wide variety of electronic devices, most of which rely on their internal chips to operate. Chip implementation includes two stages: chip design and chip manufacturing. The main step of chip manufacturing is tape-out according to the chip design results, but once the chip has been taped out, its internal logic cannot be changed. If a problem introduced during the chip design phase is discovered afterwards, the chip can only be re-spun, and the resulting loss is clearly enormous. The cost of a tape-out is very high, which places higher demands on chip design and on ensuring its correctness. For this reason, in the chip design process, the correctness of the chip logic functions is usually verified by building a chip simulation model. However, in the prior art there is no method for verifying the chip logic functions in an automated manner.
Disclosure of Invention
The test method of the chip simulation model provided by the invention can realize the test of the chip design scheme in an automatic execution mode and improve the test efficiency.
In a first aspect, the present invention provides a method for testing a chip simulation model, which is applied to an automatic test framework, and includes:
receiving command line parameters transmitted by a control script;
analyzing the command line parameters into command lines for executing a chip simulation model;
starting a chip simulation model to enable the chip simulation model to execute a target test case according to the command line;
monitoring the execution state of the chip simulation model until a mark of the execution end of the chip simulation model is obtained;
and acquiring an output result of the chip simulation model, and comparing the output result with a target result of the target test case to obtain a test result of the test case.
Optionally, monitoring the execution state of the chip simulation model until obtaining the flag indicating that the execution of the chip simulation model is finished includes:
and circularly reading the target memory address according to the preset time until the mark of the execution end of the subprocess is obtained from the target memory address.
Optionally, parsing the command line parameters into a command line for execution by a chip simulation model includes:
and determining an executable file and input data corresponding to the command line parameters according to the command line parameters, and combining to form a command line according to the executable file and the input data.
Optionally, the obtaining an output result of the chip simulation model, and comparing the output result with the target result of the target test case includes:
and comparing the output result of the chip simulation model with the target result byte by byte using the assertions of the automatic test framework.
Optionally, before starting the chip simulation model, the method further includes:
judging whether the target test case is a multi-core program;
when the target test case is a multi-core program, determining the number of cores required by the current program and sending the number of the cores to the chip simulation model so as to enable the chip simulation model and the cores required by the multi-core program to be in one-to-one correspondence to create threads.
Optionally, the method further comprises: separate memory is specified for the stack on which each thread runs.
Optionally, the method further comprises: one of the multiple threads is designated as a main thread, so that the execution results of the multiple threads are synchronized to the main thread and are transferred to a target storage address by the main thread.
Optionally, monitoring an execution state of the chip simulation model until obtaining a flag indicating that the execution of the chip simulation model is finished further includes:
and finishing the execution of the chip simulation model and recycling resources.
Optionally, starting the chip simulation model, so that the chip simulation model executes the target test case according to the command line includes:
executing fork function to establish a subprocess to start a chip simulation model;
and replacing the context of the subprocess by adopting an execl function so that the chip simulation model executes a target test case according to the command line.
In a second aspect, the present invention provides a method for testing a chip simulation model, which is applied to a control script, and includes:
traversing each test case and a plurality of input data corresponding to each test case, and forming command line parameters to be transmitted to the automatic test framework, so that the automatic test framework executes any of the above test methods of the chip simulation model;
obtaining a test result of the automatic test framework on each group of input data of each test case;
and storing and counting the test result output by the automatic test framework.
Optionally, the obtaining a test result of the automatic test framework for each group of input parameters of each test case includes:
monitoring a test result of the automatic test framework;
when the automatic test framework outputs a test result within the preset time, the test result output by the automatic test framework is stored;
and when the automatic test framework does not output a test result within the preset time, it is determined that the test of the current test case has failed.
Optionally, traversing each test case and a plurality of input data corresponding to each test case, and forming command line parameters to be transferred to the automatic test framework includes:
when each test case is traversed, starting to traverse the input data corresponding to the current test case;
and combining the command line options corresponding to the current test case and the command line options corresponding to the current input data to form command line parameters when the input data corresponding to the current test case is traversed, and transmitting the command line parameters to the automatic test framework.
Optionally, before traversing each test case and the plurality of input data corresponding to each test case, the method further includes:
acquiring a test case and a directory of input data of the test case;
and creating a directory of the output file and the log file.
Optionally, traversing each test case and the plurality of input data corresponding to each test case further includes:
and judging whether the input data meets the data requirements of the test case.
In the technical scheme provided by the invention, command line parameters are transmitted to the automatic test framework in a script control mode, the automatic test framework analyzes the command line parameters to form a command line executable by the chip simulation model, and after the chip simulation model completes the execution of the command line, the automatic test framework compares the execution result with an expected target result to obtain whether the execution of the chip simulation model is correct or not. According to the technical scheme provided by the invention, the automatic test framework is controlled by the control script, so that the automatic test of the chip simulation model can be realized, and the test efficiency and accuracy are improved.
Drawings
FIG. 1 is a flow chart of a method for testing a chip simulation model according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating a compiling method of a test method of a chip simulation model according to another embodiment of the present invention;
FIG. 3 is a flowchart illustrating a method for testing a chip simulation model according to another embodiment of the present invention to execute a multi-core test case.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the invention provides a method for testing a chip simulation model, which is applied to an automatic test framework and comprises the following steps of:
receiving command line parameters transmitted by a control script;
in some embodiments, a control script is used to control the logic of the entire test process. To achieve batch testing, the control script needs to traverse all the test cases and the different inputs of each test case; different inputs and their related command line parameters are used to start different cases, and if a command line had to be typed manually for every case it would be time-consuming, laborious and error-prone. For these reasons, in this step the control script performs the following operations: it traverses each test case, and each time a test case is reached it traverses the corresponding input data under that test case, and it combines the command line options corresponding to the test case and the command line options corresponding to the input data into the command line parameters. To traverse the test cases and the input data, the control script executes the following steps: 1) initialization: obtaining the case path, creating the directories for the output files and the logs, and checking whether the parameters are legal; 2) an outer loop, traversing the cases one by one by case name; 3) an inner loop, traversing the different inputs of each case, such as different feature maps, different weights and different biases.
Analyzing the command line parameters into command lines for executing a chip simulation model;
in some embodiments, the command line parameters point to the executable file of the test case and to the file where the input parameters are located. According to the command line parameters, the automatic test framework obtains the command line contained in the executable file and the specific parameters contained in the input parameters, and combines them to form the command line executable by the chip simulation model.
Starting a chip simulation model to enable the chip simulation model to execute a target test case according to the command line;
in some embodiments, the test target is to determine whether the functions of the chip simulation model are complete. Therefore, in this step the automatic test framework starts a sub-process, which is the running process of the chip simulation model, and the chip simulation model then executes the corresponding command line; from the execution result of the chip simulation model it can be determined whether its functions are complete. For example, the automatic test framework may be the GTest test framework, and the chip simulation model may be implemented with QEMU. The test framework uses the fork function to start a sub-process, i.e., to start the chip simulation model, and then performs context replacement through the execl function, so that the chip simulation model executes the target test case according to the command line.
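For illustration only, the launch step described above might be sketched as follows. This is a minimal sketch, not the patent's implementation: the QEMU binary location, the -kernel/-append option layout and the file paths are assumptions introduced here.
```cpp
#include <cstdio>
#include <cstdlib>
#include <sys/types.h>
#include <unistd.h>

// Hypothetical launcher: fork a sub-process and replace its context so that
// it runs the chip simulation model (QEMU) with the assembled command line.
pid_t launch_simulator(const char* elf_path, const char* input_path) {
    pid_t pid = fork();                 // split the current process in two
    if (pid < 0) {
        std::perror("fork");
        return -1;
    }
    if (pid == 0) {                     // child: becomes the chip simulation model
        execl("/usr/bin/qemu-system-riscv64",   // assumed QEMU binary location
              "qemu-system-riscv64",
              "-nographic",
              "-kernel", elf_path,               // test-case executable file
              "-append", input_path,             // assumed way of passing input data
              static_cast<char*>(nullptr));
        std::perror("execl");           // reached only if execl fails
        std::exit(EXIT_FAILURE);
    }
    return pid;                         // parent: pid is kept for monitoring
}
```
The parent process keeps the returned pid so that it can monitor the sub-process as described in the following steps.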
Monitoring the execution state of the chip simulation model until a mark of the execution end of the chip simulation model is obtained;
in some embodiments, the automatic test framework listens to the execution state of the sub-process, i.e., the running state of the chip simulation model. When the chip simulation model finishes executing the target test case, it writes an end flag; the automatic test framework monitors whether the end flag has been written, and once it has been written, it can determine that the sub-process has finished executing the target test case.
And acquiring an output result of the chip simulation model, and comparing the output result with a target result of the target test case to obtain a test result of the test case.
In some embodiments, when the automatic test framework receives the command line parameters from the control script and fetches the data they point to, it also obtains the target result expected for executing the current test case on the current input data. When the output result of the chip simulation model has been obtained, the target result is compared with the output result, and it can be determined whether the current test case was executed successfully or not. The automatic test framework outputs the test result to the control script; the control script can store and print the comparison result of a single test case, and after several test cases have been executed it can output an overall test report.
In the technical scheme provided by this embodiment, command line parameters are transferred to the automatic test framework in a script control manner, the automatic test framework analyzes the command line parameters to form a command line executable by the chip simulation model, and after the chip simulation model completes execution of the command line, the automatic test framework compares an execution result with an expected target result to obtain whether the chip simulation model is executed correctly. In the technical scheme provided by the embodiment, the automatic test framework is controlled by the control script, so that the automatic test of the chip simulation model can be realized, and the test efficiency and accuracy are improved.
Before the above-described embodiment is executed, a preparation step is required to prepare the test program and the test data. As shown in fig. 2, the preparation mainly covers the test code written at the host/server end and the data and executable files required by the QEMU-simulated RISC-V processor. The host/server end code mainly relies on the GTest test framework: each test case is registered as a TEST, and the result produced by the simulator is compared byte by byte with the prepared reference result using the framework's built-in assertions to give a test result. The QEMU end needs to prepare input data; because the simulation targets an AI chip, the input data of the invention consists of feature maps, weights, biases and similar data. The cases to be tested are also written at this stage: for example, if an addition instruction needs to be tested, a test case for the addition instruction is added. Each instruction or model may have several inputs, and every case and input can be executed by the test framework. After the programs and data are ready, the programs are compiled into executable files and the input data are placed in the build output folder for later use. The TEST registered in GTest is used to call the executable file of each case.
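For orientation, a skeleton of the host-side registration described above is sketched below. It is only a sketch under assumptions: the case name, the file paths and the helpers launch_simulator, wait_for_end_flag and compare_bytes are hypothetical names introduced here (possible bodies are sketched in later examples), and the test is assumed to be linked against gtest_main.
```cpp
#include <gtest/gtest.h>
#include <sys/types.h>

// Hypothetical helpers corresponding to the steps of this embodiment;
// possible bodies are sketched in the later examples.
pid_t launch_simulator(const char* elf_path, const char* input_path);
bool  wait_for_end_flag(const char* flag_path);
void  compare_bytes(const char* output_file, const char* reference_file);

// Each test case is registered as a GTest TEST; its body drives one QEMU run
// and compares the output against the prepared reference result.
TEST(ChipSimulation, VectorAddInstruction) {                // assumed case name
    pid_t pid = launch_simulator("cases/vector_add.elf",    // assumed paths
                                 "inputs/vector_add_0.bin");
    ASSERT_GE(pid, 0) << "failed to start the chip simulation model";
    ASSERT_TRUE(wait_for_end_flag("out/vector_add_0.flag")) << "no end flag";
    compare_bytes("out/vector_add_0.bin", "ref/vector_add_0.bin");
}
```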
As an optional implementation manner, monitoring the execution state of the chip simulation model until obtaining the flag indicating that the execution of the chip simulation model is finished includes:
and circularly reading the target memory address according to the preset time until the mark of the execution end of the subprocess is obtained from the target memory address.
In this embodiment, a fixed time interval is set and the target memory address is read once per predetermined interval. When the target memory address contains no end flag, the framework waits for the next interval; when the end flag is present, the execution result of the chip simulation model for the target test case is obtained.
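A minimal sketch of such a polling loop is given below. It assumes, purely for illustration, that the end flag is visible to the host as a single byte written to a flag file; the patent does not specify the exact mechanism, the flag value or the interval.
```cpp
#include <chrono>
#include <fstream>
#include <thread>

// Read an assumed flag file at a fixed interval until the end flag appears.
// The flag path, the magic value 0x5A and the limits are illustrative assumptions.
bool wait_for_end_flag(const char* flag_path) {
    constexpr auto kInterval = std::chrono::milliseconds(100);  // assumed polling period
    constexpr int  kMaxPolls = 600;                             // assumed overall limit
    for (int i = 0; i < kMaxPolls; ++i) {
        std::ifstream f(flag_path, std::ios::binary);
        char flag = 0;
        if (f && f.get(flag) && flag == 0x5A) {
            return true;                         // end flag written by the simulator
        }
        std::this_thread::sleep_for(kInterval);  // wait one more interval
    }
    return false;                                // no end flag within the limit
}
```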
As an optional implementation, parsing the command line parameters into command lines for execution by the chip simulation model includes:
and determining an executable file and input data corresponding to the command line parameters according to the command line parameters, and combining to form a command line according to the executable file and the input data.
In some embodiments, the command line parameters specify the executable file of the target test case and the input data. Following the command line parameters, the automatic test framework extracts the command line to be executed by the chip simulation model from the executable file together with the specific input data, and combines them to form the command line. Once passed to the chip simulation model, the combined command line can be executed by it directly.
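As a hedged illustration, combining the pieces into one command line could look like the following sketch; the QEMU-style option layout is an assumption carried over from the earlier launch sketch, not the patent's actual format.
```cpp
#include <string>
#include <vector>

// Combine the executable file and the input data files referenced by the
// command line parameters into one command line for the simulation model.
std::vector<std::string> build_command_line(const std::string& elf_path,
                                            const std::vector<std::string>& inputs) {
    std::vector<std::string> argv = {"qemu-system-riscv64", "-nographic",
                                     "-kernel", elf_path};
    std::string appended;
    for (const auto& in : inputs) {        // e.g. feature map, weight, bias files
        if (!appended.empty()) appended += " ";
        appended += in;
    }
    argv.push_back("-append");             // assumed way of handing the inputs over
    argv.push_back(appended);
    return argv;                           // ready to hand to the launch step
}
```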
As an optional implementation manner, the obtaining an output result of the chip simulation model, and comparing the output result with the target result of the target test case includes:
and comparing the output result of the chip simulation model with the target result byte by byte using the assertions of the automatic test framework.
In some embodiments, the automatic test framework provides a rich set of assertions, which can return different kinds of diagnostics for different test cases; that is, assertions with different functions make tests with different purposes possible. In this embodiment, the judgment of the execution result of the chip simulation model can be realized by using the assertions of the automatic test framework.
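A possible shape of the byte-by-byte comparison using GTest assertions is sketched below; reading both results from files is an assumption made for this illustration.
```cpp
#include <gtest/gtest.h>
#include <fstream>
#include <iterator>
#include <vector>

// Compare the simulator's output with the reference result byte by byte,
// reporting the first mismatching offset through a GTest assertion.
void compare_bytes(const char* output_file, const char* reference_file) {
    std::ifstream out(output_file, std::ios::binary);
    std::ifstream ref(reference_file, std::ios::binary);
    ASSERT_TRUE(out.is_open()) << "missing output file: " << output_file;
    ASSERT_TRUE(ref.is_open()) << "missing reference file: " << reference_file;

    std::vector<char> out_bytes((std::istreambuf_iterator<char>(out)),
                                std::istreambuf_iterator<char>());
    std::vector<char> ref_bytes((std::istreambuf_iterator<char>(ref)),
                                std::istreambuf_iterator<char>());
    ASSERT_EQ(out_bytes.size(), ref_bytes.size()) << "output length differs";
    for (size_t i = 0; i < ref_bytes.size(); ++i) {
        ASSERT_EQ(out_bytes[i], ref_bytes[i]) << "first mismatch at byte " << i;
    }
}
```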
In some embodiments, as shown in fig. 3, before starting the chip simulation model, the method further includes:
judging whether the target test case is a multi-core program;
in some embodiments, some test cases require multiple cores to work together, and such multi-core cases may differ from single-core cases in the resources they need and in the way their data is output. Therefore, in this embodiment the target test case is also examined before it is executed.
When the target test case is a multi-core program, determining the number of cores required by the current program and sending the number of the cores to the chip simulation model so as to enable the chip simulation model and the cores required by the multi-core program to be in one-to-one correspondence to create threads.
In some embodiments, to enable multi-core booting, a parameter giving the number of cores to boot is added to the Linux host-side program. The host-side program is responsible for parsing this parameter and passing it into the command line that boots the QEMU program, and QEMU creates one thread for each processor core according to the parameter and starts it.
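The one-thread-per-core idea can be illustrated with the following host-runnable sketch. It is not QEMU's actual vCPU code; it only mirrors the described behaviour of reading a core count from the command line and creating one thread per core.
```cpp
#include <cstdlib>
#include <iostream>
#include <thread>
#include <vector>

// Illustration only: one worker thread per requested core, as the simulation
// model is described to do for each processor core.
void run_core(int core_id) {
    // In the real model each thread would execute the guest code of one core.
    std::cout << "core " << core_id << " running\n";
}

int main(int argc, char** argv) {
    int num_cores = (argc > 1) ? std::atoi(argv[1]) : 1;  // core count from the command line
    std::vector<std::thread> cores;
    for (int id = 0; id < num_cores; ++id) {
        cores.emplace_back(run_core, id);   // threads in one-to-one correspondence with cores
    }
    for (auto& t : cores) t.join();
    return 0;
}
```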
As an optional implementation, the method further includes: separate memory is designated for the stack on which each thread runs.
In some embodiments, to ensure the correct operation of a multi-core program, the runtime stack of each core needs to be stored in separate memory. The stack normally holds local variables, function return addresses and similar information while the program runs; because a multi-core program executes in parallel, storing the stacks in the same location would corrupt their contents and disturb the normal operation of the program. The present embodiment solves this problem by assigning the stack of each core to a different physical memory address at runtime. In addition, if the processor supports multi-core applications that use symbols such as global variables stored in the data segments, the data segments also raise problems when the multi-core application runs in parallel. If the data segments are stored in the same location, there is a risk of data hazards such as write-after-read and write-after-write conflicts on the same address, affecting the correct operation of the program. A first solution is to compile a separate ELF file for each core and, at compile time, assign the addresses of their data segments to different physical memories. A second solution is to let all cores run the same ELF file and, when compiling that ELF file, give the program's data segment the same virtual address range, while mapping that range to a core-specific, i.e. different, physical address range on each core. The second scheme also depends on each core having its own on-chip memory implemented in QEMU, and it requires modifying the ELF-loading code in QEMU, because QEMU does not currently support parsing the data segments and automatically copying them into the respective on-chip memory of each core.
As an optional implementation, the method further includes: one of the multiple threads is designated as a main thread, so that the execution results of the multiple threads are synchronized to the main thread and are transferred to a target storage address by the main thread.
In some embodiments, to complete the automatic comparison work of the multi-core program efficiently, this embodiment also designs a suitable scheme for the multi-core test framework. Given the application scenario of the multi-core processor in this embodiment, the final result data of a multi-core program is usually stored in a shared DDR memory, so the multi-core test only needs to compare the final DDR result. A main core is designated among the multiple cores; using a multi-core synchronization mechanism, the main core triggers a dump function after confirming that the programs of all cores have finished running, and the results are then compared.
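A minimal sketch of this synchronization pattern is shown below; the shared buffer standing in for the DDR memory, the dump target and the choice of core 0 as the main core are assumptions made for illustration.
```cpp
#include <atomic>
#include <thread>
#include <vector>

// Sketch of the described pattern: every core records completion and only the
// designated main core (core 0 here) dumps the shared result buffer.
std::atomic<int> cores_done{0};

void core_body(int core_id, int num_cores,
               std::vector<char>& ddr, std::vector<char>& dump_target) {
    // ... the per-core computation writing into `ddr` would run here ...
    cores_done.fetch_add(1);                     // signal that this core has finished
    if (core_id == 0) {                          // core 0 acts as the main core
        while (cores_done.load() < num_cores) {  // wait for all cores to finish
            std::this_thread::yield();
        }
        dump_target = ddr;                       // main core dumps the final DDR result
    }
}
```
Each core would run core_body on its own thread; only core 0 performs the dump once the counter reaches num_cores, after which the comparison step can run on dump_target.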
As an optional implementation manner, monitoring the execution state of the chip simulation model until obtaining the flag indicating that the execution of the chip simulation model is finished further includes:
and finishing the execution of the chip simulation model and recycling resources.
In some embodiments, once the flag indicating that the execution of the chip simulation model has finished is read, the current test case has ended. To save resources, the execution of the chip simulation model is terminated, for example by killing the sub-process, and the resources occupied during the execution of the chip simulation model are recovered.
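For illustration, ending the sub-process and reclaiming its resources might be sketched as follows; the use of SIGTERM is an assumption, since the patent only mentions killing the sub-process.
```cpp
#include <signal.h>
#include <sys/types.h>
#include <sys/wait.h>

// After the end flag has been read, stop the chip simulation model and reclaim
// the resources of its sub-process.
void finish_simulator(pid_t pid) {
    kill(pid, SIGTERM);          // ask the simulation sub-process to exit
    int status = 0;
    waitpid(pid, &status, 0);    // reap the child so its resources are recovered
}
```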
As an optional implementation manner, starting the chip simulation model so that the chip simulation model executes the target test case according to the command line includes:
executing fork function to establish a subprocess to start a chip simulation model;
in some embodiments, the fork function splits the currently executing process into two processes; that is, the fork function is used to create the sub-process.
And replacing the context of the subprocess by adopting an execl function so that the chip simulation model executes a target test case according to the command line.
In some embodiments, the child process created by the fork function inherits the execution variables, environment and other parameters of the parent process. To make the chip simulation model execute the target test case, the execl function is used to replace the context of the child process, changing its variables, environment and other parameters.
An embodiment of the present invention further provides a method for testing a chip simulation model, which is applied to a control script, and as shown in fig. 1, the method includes:
traversing each test case and a plurality of input data corresponding to each test case, and forming command line parameters to be transmitted to the automatic test framework; so that the automatic test framework executes the test method of any one chip simulation model;
in some embodiments, for the purpose of batch testing, all test cases need to be traversed together with the different inputs of each test case; different inputs and their related command line parameters are used to start different cases, and if a command line had to be typed manually for every case it would be laborious, time-consuming and error-prone. The script therefore performs two kinds of work: first, traversal of the cases and their inputs; second, combination of the command lines. The main purpose of the control script is to implement the logic control of the automatic test process: by traversing the test cases, traversing the input data of each test case, and combining the command line options corresponding to the test case with the command line options corresponding to the input data, it forms the combined command line parameters and passes them to the automatic test framework. Specifically, the traversal process of the control script is as follows: 1) an outer loop, traversing the cases one by one by case name; 2) an inner loop, traversing the different inputs of each case, such as different feature maps, different weights and different biases. After the automatic test framework receives the command line parameters, it executes the test case according to the test method of the chip simulation model described above.
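The control-script logic can be sketched as follows. The patent does not specify a scripting language, so this sketch is written in C++ for consistency with the other examples; the directory layout and the test_framework binary name and options are assumptions introduced here.
```cpp
#include <cstdlib>
#include <filesystem>
#include <iostream>
#include <string>

namespace fs = std::filesystem;

// Two-level traversal: outer loop over test cases, inner loop over the input
// data of each case, combining both into command line parameters for the
// automatic test framework.
int main() {
    const fs::path cases_dir  = "cases";    // one sub-directory per test case (assumed layout)
    const fs::path inputs_dir = "inputs";   // per-case input data (assumed layout)
    fs::create_directories("out");          // output files
    fs::create_directories("log");          // log files

    for (const auto& case_entry : fs::directory_iterator(cases_dir)) {       // outer loop: cases
        const std::string case_name = case_entry.path().filename().string();
        for (const auto& in_entry : fs::directory_iterator(inputs_dir / case_name)) {  // inner loop: inputs
            std::string cmd = "./test_framework --case=" + case_name +
                              " --input=" + in_entry.path().string();
            std::cout << "running: " << cmd << "\n";
            int rc = std::system(cmd.c_str());   // hand the combined parameters to the framework
            if (rc != 0) std::cout << case_name << ": FAILED\n";
        }
    }
    return 0;
}
```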
Obtaining a test result of the automatic test framework on each group of input data of each test case;
in some embodiments, in order to implement automation of the test, the test results need to be collected and processed in an automatic manner, and in this step, the control script obtains the test results of each group of input data of each test case without manual processing.
And storing and counting the test results output by the automatic test framework.
In some embodiments, the previous step is the acquisition of the test result of a single test case, and in this step, all the test results are stored and counted, so that a brief and clear test report can be given.
In the technical solution provided in this embodiment, command line parameters are transmitted to the automatic test framework in a script control manner, the automatic test framework analyzes the command line parameters to form a command line executable by the chip simulation model, and after the chip simulation model completes execution of the command line, the automatic test framework compares the execution result with an expected target result to obtain whether the chip simulation model is executed correctly. In the technical scheme provided by the embodiment, the automatic test framework is controlled by the control script, so that the automatic test of the chip simulation model can be realized, and the test efficiency and accuracy are improved.
As an optional implementation manner, the obtaining the test result of the automatic test framework for each set of input parameters of each test case includes:
monitoring a test result of the automatic test framework;
when the automatic test framework outputs a test result within the preset time, the test result output by the automatic test framework is stored;
and when the automatic test framework does not output a test result within the preset time, it is determined that the test of the current test case has failed.
In this embodiment, the test result output by the automatic test framework is monitored. However, faulty test cases often run for a long time without finishing. To improve test efficiency, this embodiment therefore also sets a time limit: when no test result has been output after the predetermined time, the test of the test case is determined to have failed.
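A minimal sketch of such a time-limited wait is given below; polling for a result file is an assumed delivery mechanism, not something the patent prescribes.
```cpp
#include <chrono>
#include <filesystem>
#include <thread>

namespace fs = std::filesystem;

// Watch for the framework's result within a time limit; if nothing appears
// before the deadline, the current test case is treated as failed.
bool wait_for_result(const fs::path& result_file, std::chrono::seconds limit) {
    auto deadline = std::chrono::steady_clock::now() + limit;
    while (std::chrono::steady_clock::now() < deadline) {
        if (fs::exists(result_file)) {
            return true;                                  // result produced in time
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(200));
    }
    return false;                                         // timeout: mark the case as failed
}
```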
As an optional implementation, traversing each test case and a plurality of input data corresponding to each test case, and forming command line parameters to be passed to the automatic test framework includes:
when a test case is traversed, input data corresponding to the current test case is traversed;
and combining the command line options corresponding to the current test case and the command line options corresponding to the current input data to form command line parameters when the input data corresponding to the current test case is traversed, and transmitting the command line parameters to the automatic test framework.
In this embodiment, the combination of test cases and input data is realized through two levels of loops: the outer loop traverses the test cases and the inner loop traverses the input data of each test case, so that one test case can be tested intensively with different test data, making it easier to locate problems in the test case.
As an optional implementation manner, before traversing each test case and the plurality of input data corresponding to each test case, the method further includes:
acquiring a test case and a directory of input data of the test case;
and creating a directory of the output file and the log file.
In this embodiment, the key settings determining the test case source, the input data source, the output file location and the log file location are made, and the script is initialized.
As an optional implementation manner, traversing each test case and the plurality of input data corresponding to each test case further includes:
and judging whether the input data meets the data requirement of the test case.
In some embodiments, the validity of the input data is a key element for ensuring the accuracy of the test result of the test case.
For flexibility in the testing process, the control scripts of the foregoing embodiments should also support a customizable command line, for example to set the run time of a case and how much data the final dump should contain for the comparison.
In each embodiment provided by the invention, a control script is used to pass the command line parameters to the automatic test framework, and the automatic test framework parses the command line parameters into a command line executable by the chip simulation model, thereby realizing the automatic test of the chip simulation model. In summary, as soon as a command line is started, a test program starts: the host/server side starts a process related to GTest, which may be called the parent process. The parent process first obtains the command line parameters given by the control script and then creates a child process; after the child process has been created, the parent process enters a cyclic waiting mode, querying during the wait whether the child process has finished. The child process is the case to be tested: the execl function is used to replace the context of the process and execute the child process's program; during execution the child process executes the commands invoked in the case one by one to obtain the computed result, and when all commands have been executed an end flag is written to a file. The parent process reclaims the child process after reading the end flag, and the execution of one case is complete.
It will be understood by those skilled in the art that all or part of the processes of the embodiments of the methods described above may be implemented by a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above description is only for the specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (14)

1. A method for testing a chip simulation model is applied to an automatic test framework and comprises the following steps:
receiving command line parameters transmitted by a control script;
analyzing the command line parameters into command lines for executing the chip simulation model;
starting a chip simulation model to enable the chip simulation model to execute a target test case according to the command line;
monitoring the execution state of the chip simulation model until a mark indicating that the execution of the chip simulation model is finished is obtained;
and acquiring an output result of the chip simulation model, and comparing the output result with a target result of the target test case to obtain a test result of the test case.
2. The method of claim 1, wherein monitoring the execution status of the chip simulation model until obtaining the indication of the end of the execution of the chip simulation model comprises:
and circularly reading the target memory address according to the preset time until the mark of the execution end of the subprocess is obtained from the target memory address.
3. The method of claim 1, wherein parsing the command line parameters into command lines for chip simulation model execution comprises:
and determining an executable file and input data corresponding to the command line parameters according to the command line parameters, and combining to form a command line according to the executable file and the input data.
4. The method of claim 1, wherein obtaining the output result of the chip simulation model and comparing the output result with the target result of the target test case comprises:
and comparing the output result of the chip simulation model with the target result byte by byte using the assertions of the automatic test framework.
5. The method of claim 1, wherein before starting the chip simulation model, the method further comprises:
judging whether the target test case is a multi-core program or not;
when the target test case is a multi-core program, determining the number of cores required by the current program and sending the number of cores to the chip simulation model, so that the chip simulation model and the cores required by the multi-core program are in one-to-one correspondence to create threads.
6. The method of claim 5, further comprising: separate memory is specified for the stack on which each thread runs.
7. The method of claim 5, further comprising: one of the multiple threads is designated as a main thread, so that the execution results of the multiple threads are synchronized to the main thread and are transferred to a target storage address by the main thread.
8. The method of claim 1, wherein monitoring the execution status of the chip simulation model until obtaining the indication of the end of the execution of the chip simulation model further comprises:
and finishing the execution of the chip simulation model and recycling resources.
9. The method of claim 1, wherein starting a chip simulation model to cause the chip simulation model to execute a target test case according to the command line comprises:
executing fork function to establish a subprocess to start a chip simulation model;
and replacing the context of the subprocess by adopting an execl function so that the chip simulation model executes a target test case according to the command line.
10. A method for testing a chip simulation model is applied to a control script and comprises the following steps:
traversing each test case and a plurality of input data corresponding to each test case, and forming command line parameters to be transmitted to an automatic test framework, so that the automatic test framework performs the method of any one of claims 1 to 9;
obtaining a test result of the automatic test framework on each group of input data of each test case;
and storing and counting the test results output by the automatic test framework.
11. The method of claim 10, wherein obtaining the test results of the automated test framework for each set of input parameters for each test case comprises:
monitoring a test result of the automatic test framework;
when the automatic test framework outputs the test result within the preset time, storing the test result output by the automatic test framework;
and when the automatic test framework does not output the test result within the preset time, determining that the test of the current test case fails.
12. The method of claim 10, wherein traversing each test case and a plurality of input data corresponding to each test case and forming command line parameters for passing to an automated test framework comprises:
when each test case is traversed, starting to traverse the input data corresponding to the current test case;
and when the input data corresponding to the current test case is traversed, combining the command line option corresponding to the current test case and the command line option corresponding to the current input data to form a command line parameter, and transmitting the command line parameter to the automatic test framework.
13. The method of claim 10, wherein traversing each test case and the plurality of input data corresponding to each test case further comprises:
acquiring a test case and a directory of input data of the test case;
and creating directories of the output file and the log file.
14. The method of claim 10, wherein traversing each test case and the plurality of input data corresponding to each test case further comprises:
and judging whether the input data meets the data requirements of the test case.
CN202210628025.0A 2022-06-02 2022-06-02 Test method of chip simulation model Pending CN115034173A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210628025.0A CN115034173A (en) 2022-06-02 2022-06-02 Test method of chip simulation model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210628025.0A CN115034173A (en) 2022-06-02 2022-06-02 Test method of chip simulation model

Publications (1)

Publication Number Publication Date
CN115034173A true CN115034173A (en) 2022-09-09

Family

ID=83122457

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210628025.0A Pending CN115034173A (en) 2022-06-02 2022-06-02 Test method of chip simulation model

Country Status (1)

Country Link
CN (1) CN115034173A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115809622A (en) * 2023-01-19 2023-03-17 南京集成电路产业服务中心有限公司 Chip simulation acceleration system with automatic optimization configuration function

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination