CN109074411B - Method and system for evaluating electronic components - Google Patents
- Publication number
- CN109074411B CN109074411B CN201780021918.6A CN201780021918A CN109074411B CN 109074411 B CN109074411 B CN 109074411B CN 201780021918 A CN201780021918 A CN 201780021918A CN 109074411 B CN109074411 B CN 109074411B
- Authority
- CN
- China
- Prior art keywords
- model
- data
- parameter
- electronic component
- computing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/30—Circuit design
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2111/00—Details relating to CAD techniques
- G06F2111/02—CAD in a network environment, e.g. collaborative CAD or distributed simulation
Landscapes
- Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Evolutionary Computation (AREA)
- Geometry (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Various examples relate to systems and methods for evaluating electronic components. The server computing device may provide an evaluation user interface for a user application executing on the user computing device. The server computing device may receive, from the user application and via the user interface, an indication to evaluate the electronic component, and may access a configuration dataset of the electronic component. The configuration data set may include parameter data describing a set of parameters of the electronic component; binding data describing a relationship between a first parameter of the parameter set and a first model input parameter; and simulator data describing a model of the electronic component. The server computing device may also evaluate the electronic component based at least in part on the configuration data set.
Description
Claim of Priority
The present application claims the benefit of U.S. provisional application Ser. No. 62/315,380 filed 3/30/2016 and U.S. patent application Ser. No. 15/439,822 filed 2/2017, the disclosures of which are incorporated herein by reference in their entirety.
Background
Interfacing with many types of environmental sensors (e.g., temperature, vibration, acceleration, light, speed, sound, etc.) typically involves using analog circuitry and mixed signal circuitry to receive and convert signals from the sensor to provide discrete time or digital representations. As the range of available sensor types expands and as accuracy continues to increase, the design of circuits capable of processing and converting signals from the sensors becomes more and more complex. At the same time, analog hardware design expertise is gradually declining among application and system design engineers, as focus shifts toward software or sensor expertise.
Summary of the Invention
Various examples relate to systems and methods for evaluating electronic components. The server computing device may provide an evaluation user interface for a user application executing on the user computing device. The server computing device may receive, from the user application and via the user interface, an indication of an electronic component to evaluate, and may access a configuration dataset of the electronic component. The configuration data set may include: parameter data describing a set of parameters of the electronic component; binding data describing a relationship between a first parameter of the parameter set and a first model input parameter; and simulator data describing a model of the electronic component. The model may reside on a server computing device or in a different location, as described herein. The server computing device may also evaluate the electronic component based at least in part on the configuration data set.
Additional examples are also disclosed, including method and machine-readable medium examples.
Brief Description of Drawings
In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. The same numbers with different letter suffixes may represent different instances of similar components. The accompanying drawings illustrate by way of example, and not by way of limitation, various example embodiments discussed in the present document.
Fig. 1 illustrates an environment according to an example embodiment of the present disclosure.
FIG. 2 illustrates an evaluation system according to an example embodiment of the present disclosure.
Fig. 3 is a diagram illustrating another example embodiment of an environment according to an example embodiment of the present disclosure.
FIG. 4 is a diagram illustrating one example embodiment of a configuration dataset.
Fig. 5 illustrates a signal chain according to an example embodiment of the present disclosure.
Fig. 6 is a diagram illustrating another example signal chain of fig. 5.
Fig. 7 is a diagram illustrating yet another example signal chain.
FIG. 8 is a diagram illustrating one example evaluation of an example signal chain utilizing a client and an evaluation tool.
FIG. 9 is a flow chart illustrating one example embodiment of a process flow that may be performed to implement Fast Fourier Transform (FFT) analysis.
Fig. 10 is a flowchart showing one example of a process flow that may be performed to detect clipping of an analog-to-digital converter (ADC).
FIG. 11 is a screen shot showing one example of an evaluation screen that may be provided to a user via a User Interface (UI).
Fig. 12 is a screen shot showing another example of an evaluation screen with additional features.
Fig. 13 is a screen shot showing another example of the evaluation screens of fig. 11 and 12 in which the FFT analysis tab is selected.
Fig. 14 is a screen shot showing another example of the evaluation screen of fig. 11 and 12 in which a waveform tab is selected.
Fig. 15 is a screen shot showing another example of the evaluation screen of fig. 11 and 12, in which a next step tab is selected.
Fig. 16 is a screen shot showing an example of another configuration of the evaluation screen of fig. 11 and 12.
Fig. 17 is a block diagram illustrating one example of a software architecture for a computing device.
FIG. 18 is a block diagram illustrating a computing device hardware architecture in which a set or series of instructions may be executed to cause a machine to perform an example of any of the methods discussed herein.
Detailed Description
Certain details are set forth below to provide a thorough understanding of example embodiments of the disclosure. However, it will be apparent to one skilled in the art that the example embodiments of the present disclosure may be practiced without various of these specific details. In some instances, well-known circuits, control signals, timing protocols, computer system components, and software operations have not been shown in detail in order to avoid unnecessarily obscuring the described example embodiments of the disclosure.
The inventors have recognized that erosion of analog or mixed signal design expertise can lead to increased design costs, schedule delays, and even degraded final product quality. For example, while a system engineer may be able to select from a range of different conversion and signal processing devices, the engineer may not have sufficient expertise or tools to make informed decisions regarding device selection or device configuration within the system, including internal configuration or external support circuitry. Thus, using simulation tools may enable a user to evaluate various different devices and device configurations in the context of a particular sensing application in a "virtual" manner.
Examples of virtual evaluation tools described herein can provide an evaluation of selected circuit or system components and configurations based on received selection and configuration settings. Circuit and system designs typically include specifications for one or more system parameters, such as sensor specifications (e.g., including one or more of sensed parameter ranges, environmental operating ranges or limitations, or sensed parameter measurement accuracy). The virtual evaluation tool can provide an interface that allows a user to select an electronic component (also referred to as a component). Examples of components may include electronic devices, circuits, systems, and the like. The virtual evaluation tool can also provide an interface for setting or selecting the configuration of the selected component. The virtual evaluation tool can also provide stimulus signals (e.g., selected waveforms or input signals having specified characteristics) to selected components in order to provide evaluation data indicative of behavioral responses to the provided signals. The evaluation data may be used to determine whether the selected component meets the needs of the user.
Fig. 1 illustrates an environment 100 according to an example embodiment of the present disclosure. The environment 100 may include a server 120, and the server 120 may execute an evaluation tool 124, the evaluation tool 124 providing a behavioral model response of the selected one or more components to the clients 102(1-N) in response to receiving a stimulus signal. The clients 102(1-N) may be or include any suitable computing devices including, for example, desktop computers, laptop computers, workstations, mobile devices, tablet computers, and the like. The evaluation tool 124 can provide the behavioral model response to the clients 102(1-N) in response to receiving a selection of at least one of the components or configurations from the clients 102(1-N). Each client 102(1-N) may include at least one computer-readable medium encoded with executable instructions, such as the application 103(1-N) on the respective client 102(1-N), executable by one or more processing units of the client 102(1-N) for transmitting, receiving, encoding, and/or decoding data from the server 120. In some examples, an application 103(1-N) may include a web-based application for sending, receiving, encoding, or decoding data from the sensors 104(1-M) or the server 120. For example, the applications 103(1-N) may execute in a Web browser application. In some examples, an application 103(1-N) may include another, non-web-based application for sending, receiving, encoding, and/or decoding data from the sensors or the server 120.
Each client 102(1-N) may be coupled to the network 110 to provide one or more component or configuration selections to the server 120 and to receive evaluation data from the server 120. Network 110 may be implemented using one or more networks such as, but not limited to, a Local Area Network (LAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), a cellular network, and/or the Internet. Communications to and from the network 110 may be wired and/or wireless, and may be provided by any network device known in the art, now or in the future. Devices communicating over the network 110 may communicate using any communication protocol including, but not limited to, TCP/IP, UDP, RS-232, IEEE 802.11, Long Term Evolution (LTE) or LTE-Advanced wireless communication, or any other cellular/wireless communication standard.
Server 120 may provide, process, and/or communicate with interfaces, modules, and/or circuits to evaluate selected components and configurations received from clients and to provide evaluation data to clients. The server 120 may include one or more processor units 121 and a computer-readable medium 123. The computer-readable medium 123 may encode instructions for executing an evaluation tool 124, which can include a client communication interface, a signal generator, a data analyzer module, and the like. For example, instructions of the evaluation tool 124 may cause the one or more processor units 121 to receive a request from one of the clients 102(1-N) to evaluate a selected component having a selected configuration and, in response to the request, run a simulation using the generated signal and provide evaluation data to the requesting client. The server may be capable of executing multiple instances of the instructions of the evaluation tool 124 in response to multiple contemporaneous requests from a single client, multiple different clients, or a combination thereof. Server 120 may provide one or more of a list of components or user-adjustable configuration options to the user. The component list and user-adjustable configuration options may be stored in one or more databases accessible to server 120. For example, the databases may be stored in the data store 130.
In operation, one of the clients 102(1-N) may launch its corresponding application 103(1-N) to engage the server 120, which holds instructions for executing the virtual evaluation tool 124. For example, the client may navigate to a website hosted by server 120 to request access to the evaluation tool 124. Server 120 may provide the requesting client with a list of component options and may receive a selection of one or more components from the component list. In some examples, the component list may include analog circuit components such as digital-to-analog converters (DACs), ADCs, filters, or other signal conditioning circuits, and the like. In other examples, the selection of a component may come from another application running on the server 120, based on a previous selection by the client.
The components evaluated by the evaluation tool 124 may include discrete devices and/or chains or other arrangements of devices. Discrete devices may include, for example, DACs, ADCs, filters, sensors, transistors, and the like. A chain of discrete devices (referred to herein as a signal chain) may include instances of discrete devices that communicate with one another to process a stimulus signal. For example, a component may comprise a system, circuit, device, etc., as described herein. In some examples, data describing the signal chain to be evaluated is received from a user via a user interface, such as the UI 364 described herein.
In response to receiving the selection of the component, the evaluation tool 124 may provide the requesting client with user-adjustable configuration options for the component. The user-adjustable configuration options may correspond to configuration settings available for one or more components (e.g., clock rate, external jitter, or whether to enable one or more features, such as a Digital Down Converter (DDC)). Configuration options available for a particular component may be described by a configuration data set stored in a configuration file database, such as at data store 130. The configuration data set may be or include any suitable logical data unit, e.g., a file. The evaluation may also include feeding a stimulus signal to the component. The server 120 may also provide user-defined configuration options, such as signal type (shape), frequency, amplitude, etc., to the signal generator that provides the data signals. In some examples, the input signal configuration options may be retrieved from a database, such as the configuration file database.
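A signal generator of this kind can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and parameters (shape, frequency, amplitude, sample_rate) are hypothetical stand-ins for the user-defined configuration options described above.

```python
import math

def generate_stimulus(shape="sine", frequency=1000.0, amplitude=1.0,
                      sample_rate=48000.0, n_samples=8):
    """Produce a stimulus waveform from user-defined configuration options.

    shape, frequency, and amplitude mirror the signal type (shape),
    frequency, and amplitude options mentioned in the text.
    """
    samples = []
    for n in range(n_samples):
        t = n / sample_rate  # time of the nth discrete sample
        phase = 2 * math.pi * frequency * t
        if shape == "sine":
            samples.append(amplitude * math.sin(phase))
        elif shape == "square":
            samples.append(amplitude if math.sin(phase) >= 0 else -amplitude)
        else:
            raise ValueError("unsupported signal shape: " + shape)
    return samples
```

In the tool described here, such a generator would feed the first stage of the simulated signal chain.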
In response to receiving the selected configuration, the evaluation tool 124 can run an evaluation by providing the generated stimulus signal to a model of the component (based on the received configuration selection), e.g., to produce evaluation output data. The evaluation of the component may be based on information about the component retrieved from the product database and/or configuration options of the component received from the clients 102(1-N). The evaluation tool 124 may implement a signal chain to run simulations, including a signal generator, one or more simulators, and one or more data analyzers. The evaluation tool 124 may provide the evaluation output data to the requesting client 102(1-N), for example, through a user interface implemented by the application 103(1-N). In some examples, the requesting client 102(1-N) may adjust one or more configuration options and submit a request to perform the evaluation again.
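The generator, simulator, and analyzer arrangement described above can be sketched as a simple pipeline. This is an illustrative sketch under assumed interfaces (plain Python callables), not the tool's actual API:

```python
def run_evaluation(generator, simulators, analyzers):
    """Run a signal-chain evaluation: a generator feeds one or more
    simulators in series, and analyzers reduce the final output."""
    signal = generator()
    for simulate in simulators:  # each simulator models one component in the chain
        signal = simulate(signal)
    return {name: analyze(signal) for name, analyze in analyzers.items()}

# Toy stand-ins: a ramp stimulus, an "ADC" that clips at 1.0, and a peak analyzer.
result = run_evaluation(
    generator=lambda: [0.0, 0.5, 1.0, 1.5],
    simulators=[lambda s: [min(x, 1.0) for x in s]],
    analyzers={"peak": max},
)
```

Re-running the evaluation with adjusted configuration options amounts to calling the pipeline again with a different generator or simulator setup.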
Fig. 2 illustrates an evaluation system 200 according to an example embodiment of the present disclosure. For example, the evaluation system 200 may illustrate one example implementation and/or use of the evaluation tool 124 of the environment 100. The system 200 may include an evaluation tool 224 configured to perform virtual evaluation based on data from the component database 230 and the configuration data set 240. The evaluation tool 224 can be implemented in the server 120 of fig. 1. Component database 230 and/or configuration data set 240 may be implemented in server 120 and/or data store 130 of fig. 1.
FIGS. 11-15, described in more detail herein, depict various example screen shots showing an evaluation screen 1100 associated with an evaluation tool. The evaluation screen 1100 is an example and is not intended to be limiting. In some examples, the evaluation tool 224 can run as a web-based application on a server (e.g., server 120) hosting a collection of Dynamic Link Libraries (DLLs). For example, the DLLs may include behavioral functional models, implemented through functional Modeling Operations (MOTIF) DLLs, and a virtual evaluation DLL. For example, the DLLs may be coded in C# or C++. The MOTIF DLL may include an Application Program Interface (API) that receives one or more component model files, corresponding to a selected component, from the component database, and may run a simulation of the selected component using the corresponding component model files.
The evaluation tool 224 may rely on the component database 230 and the configuration data set 240. The component database 230 may be stored in any suitable format, such as an Extensible Markup Language (XML) format. Data from the component database may be loaded into a MOTIF DLL to enable simulation of high-speed components. The configuration data set 240, which may be stored in an XML format, may be loaded into or by a component of the evaluation tool 224 (e.g., a DLL or other similar component) to configure the user interface for each product. The configuration data set 240 may also be used by the virtual evaluation DLL to configure the detailed structural relationships between the user interface and the underlying simulation.
The evaluation tool 224 can interface with a client running in a browser (e.g., one of the clients 102(1-N) of fig. 1). In some examples, the client software may be written using a JavaScript framework. Additional functionality may be provided by an open-source graphics library called D3. The virtual evaluation tool 224 can provide the client with a list of components from the configuration dataset 240. In some examples, the component list may be divided into categories. In some examples, the component list may be or include a product diagram, such as of a chip or module. The diagram may include one or more highlighted components. In some examples, the highlighting of the diagram may change to emphasize the currently selected component. In some examples, the evaluation tool 224 can present links to print an evaluation session, navigate to a data sheet, navigate to a product page, or purchase a device associated with the component, such as depicted in the screen capture image 800 of fig. 8.
Each component supported by the evaluation tool 224 may be associated with a unique configuration data set of the configuration data sets 240. The configuration dataset 240 for a component may include various data describing the component, such as the layout of all or part of the user interface for receiving user configuration settings, as well as relationships between settings, results, and simulations, and the like. The evaluation tool 224 uses these interface descriptions and structural relationships to perform the evaluation. The configuration dataset 240, in some examples, is generated or encoded in XML, and the file may include several parts, such as Args, Bindings, Bundles, Generators, Simulators, Analyses, Evaluations, and Pages. Each part may include one or more elements. An element of the configuration may correspond to an instance of an object defined in the software. When the virtual evaluation tool 224 initializes a session, it reads the configuration file of the appropriate product and creates the various associated instances of the specified objects.
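The initialization step described above, reading a configuration file and instantiating an object per section, might be sketched as follows. The section classes, their attributes, and the XML snippet are hypothetical; only the Args and Bindings section names come from the text:

```python
import xml.etree.ElementTree as ET

CONFIG_XML = """
<Config>
  <Args><Float Name="vStart" Value="0.001"/></Args>
  <Bindings><Bind Source="vStart" Target="vIn"/></Bindings>
</Config>
"""

SECTION_CLASSES = {}  # maps a section tag to the object class it instantiates

def section(tag):
    def register(cls):
        SECTION_CLASSES[tag] = cls
        return cls
    return register

@section("Args")
class ArgsSection:
    def __init__(self, element):
        self.names = [child.get("Name") for child in element]

@section("Bindings")
class BindingsSection:
    def __init__(self, element):
        self.pairs = [(c.get("Source"), c.get("Target")) for c in element]

def load_config(xml_text):
    """Create an instance of the associated object for each recognized section."""
    root = ET.fromstring(xml_text)
    return {child.tag: SECTION_CLASSES[child.tag](child)
            for child in root if child.tag in SECTION_CLASSES}
```

A session would then hand each instantiated section object to the part of the tool that consumes it (UI layout, simulation wiring, and so on).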
Fig. 3 is a diagram illustrating another example embodiment of an environment 300 according to an example embodiment of the present disclosure. The environment 300 includes a client 302 and a server side 301. Client 302 may operate in a manner similar to that described herein with respect to clients 102(1-N). For example, client 302 may execute a Web browser application 350. Any suitable Web browser application 350 may be used, for example, Internet Explorer™ from Microsoft Corporation, Firefox™ from the Mozilla Foundation, Chrome™ from Google, Inc., and the like. The Web application 352 may execute in or by the Web browser application 350. For example, Web browser application 350 and/or Web application 352 may perform the operations of the applications 103(1-N) of FIG. 1. The Web application 352 may generate and/or populate a User Interface (UI) 364, which may be displayed on a screen or other input/output (I/O) device of the client 302.
The server side 301 may include an evaluation tool 324, which may operate in a manner similar to that described herein with respect to the evaluation tools 124, 224. For example, the evaluation tool 324 can execute at one or more servers (e.g., server 120 of environment 100). The evaluation tool 324 can communicate with the Web application 352 over the communication link 357, for example, to receive parameter values through the UI 364 and/or to provide evaluation output data or other data for populating the UI 364. Communication link 357 may be or be implemented through a network similar to network 110 of fig. 1. In some examples, communication link 357 may be implemented using Asynchronous JavaScript and XML (AJAX) programming techniques, although any suitable technique may be used. In some examples, the evaluation tool 324 can include multiple executable programs and/or DLLs that execute together to implement the operations of the evaluation tool described herein.
The evaluation tool 324 can be in communication with one or more models 356. The models 356 are shown on the server side 301 and may be implemented or executed on the same server that executes the evaluation tool 324 and/or at a different computing device. Any suitable type of model may be used. For example, a data model 358 may describe modeled responses of an electronic component to different stimulus signals in a database, table, or other suitable data structure. The evaluation tool 324 can query the data model 358 to retrieve data describing the response of the modeled component to various pre-simulated stimulus signals. An executable model 360 may be an algorithmic model executed by a server or other suitable computing device. Examples of executable models include Simulation Program with Integrated Circuit Emphasis (SPICE) models and MOTIF models, such as the ADIsimADC and ADIsimDAC models available from Analog Devices, Inc.
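The distinction between a data model (a lookup of pre-simulated responses) and an executable model (a response computed on demand) can be sketched like this; the class names and the query signature are hypothetical:

```python
class DataModel:
    """Lookup-table model: responses pre-simulated and keyed by stimulus settings."""
    def __init__(self, table):
        self.table = table  # e.g. {(frequency, amplitude): response_samples}

    def query(self, frequency, amplitude):
        key = (frequency, amplitude)
        if key not in self.table:
            raise KeyError("no pre-simulated response for %r" % (key,))
        return self.table[key]

class ExecutableModel:
    """Algorithmic model: computes the component's response when invoked."""
    def __init__(self, algorithm):
        self.algorithm = algorithm

    def query(self, frequency, amplitude):
        return self.algorithm(frequency, amplitude)
```

From the evaluation tool's point of view, both kinds of model answer the same query; only how the answer is produced differs, which is what lets the tool launch either interchangeably.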
In one example, the evaluation tool 324 can also be in communication with one or more workbench models 362. A workbench model 362 may be a "hardware-in-the-loop" model that stimulates a physical instance of the component to be modeled with a stimulus signal and measures its response. For example, the workbench model 362 may be implemented in a physical laboratory facility that may be remote from the client 302 and/or from the server or other computing device executing the evaluation tool 324. A computing device in the laboratory facility may receive data describing the desired stimulus signal, prompt a device at the laboratory facility to generate the stimulus signal, and provide the stimulus signal to the physical instance of the modeled component. The computing device at the laboratory facility may also capture the response of the component and send the captured response back to the evaluation tool 324.
The evaluation tool 324 can initiate the models 356 during an evaluation. For example, the evaluation tool 324 can launch the executable model 360 by causing the executable model 360 to execute at the server executing the evaluation tool 324 and/or by requesting that another computing device execute the executable model 360. The evaluation tool 324 can launch the data model 358 by accessing a data store or other storage location that includes the data that constitutes the data model 358. The evaluation tool can launch the workbench model 362, for example, by requesting that a computing device at the laboratory facility execute the workbench model 362, e.g., as described herein.
The evaluation tool 324 can also be in communication with one or more configuration data sets 340. A configuration dataset 340, as described in greater detail herein with respect to fig. 4, may include various metadata for facilitating simulation of a component with one or more of the models 356. In some examples, each component that may be evaluated by the evaluation tool 324 may be associated with a configuration dataset 340. In some examples, a single configuration data set 340 may describe any kind of component, including, for example, a signal chain as described herein.
Fig. 4 is a diagram illustrating one example embodiment of a configuration dataset 340(1). For example, the configuration data set 340(1) may be one of the configuration data sets 340 shown in fig. 3. Configuration data set 340(1) may include parameter data 450, binding data 452, bundle data 454, generator data 456, simulator data 458, analysis data 460, evaluation data 462, and page data 464.
The parameter data 450 may describe parameters that may be used, as described herein, to interface between a user (e.g., operating a client 102(1-N)) and one or more of the models 356. A parameter, or arg, may be any value (e.g., a custom value) used in the evaluation tool 324, such as a user-defined setting or a performance result, e.g., along with associated metadata. The parameters may have various attributes described by metadata (which may be part of the parameter data 450). Example parameter attributes include a type, such as floating point, integer, Boolean, etc., and a unique name for the parameter. Other example parameter attributes include a display name, unit information, minimum and maximum allowable values, and the like. The parameter data 450 of the configuration dataset 340(1) may simply enumerate the data and metadata for the parameters to be used in the evaluation session, without performing any operations on them.
The parameter data 450 may include data (e.g., metadata) describing the different parameters used by the evaluation tool 324. For example, parameter data 450 describing a particular parameter may indicate the type of the parameter. Example parameter types may include floating point, integer, string, Boolean, enumerable, etc. Parameter data 450 describing a particular parameter may also include a constraint object describing a set of allowable values. The constraints may take the form of a minimum, a maximum, or a minimum-maximum pair; a set of discrete allowed values; or no limit at all. When a value is assigned to a parameter, the evaluation tool 324 or other suitable component may check the validity of the received value against the constraints (if any) described by the parameter data.
In some examples, the parameters described by the parameter data 450 provide a generic format for receiving input parameters for the models 356. As described herein, the parameters may allow the UI 364 to provide simulation input parameters to, and/or request them from, a user in a consistent manner, whatever underlying simulator, generator, evaluation, or analysis is used. Example parameter data 450 is shown below. In this example of parameter data 450, an unconstrained floating point parameter and a toggle (string) parameter with a discrete set of allowed options, "FS" and "ODR", are initialized.
<Args>
  <Float Name="vStart" DisplayName="Start Voltage" Unit="V" Value="0.001"/>
  <Toggle Name="fsOrODR" DisplayName="Control FS or ODR" Value="FS">
    <Option>FS</Option>
    <Option>ODR</Option>
  </Toggle>
</Args>
In this example, the parameter vStart is given a name, a display name, a unit, and a value, and the toggle parameter fsOrODR provides a selection. fsOrODR may indicate whether the full scale (FS) or the Output Data Rate (ODR) is to be controlled. Note that this example parameter data 450 is an example and is not intended to be limiting.
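A sketch of how such parameter data might be loaded is shown below, using the example Args data above (with attribute spacing normalized). The dictionary layout is illustrative only and is not the evaluation tool's actual data model:

```python
import xml.etree.ElementTree as ET

ARGS_XML = """
<Args>
  <Float Name="vStart" DisplayName="Start Voltage" Unit="V" Value="0.001"/>
  <Toggle Name="fsOrODR" DisplayName="Control FS or ODR" Value="FS">
    <Option>FS</Option>
    <Option>ODR</Option>
  </Toggle>
</Args>
"""

def parse_args(xml_text):
    """Return {name: parameter-dict}; each dict holds the attributes
    described by the parameter data (type, display name, unit, value)."""
    params = {}
    for el in ET.fromstring(xml_text):
        param = {
            "type": el.tag,                 # Float, Toggle, ...
            "display": el.get("DisplayName"),
            "unit": el.get("Unit"),         # None when no unit is given
            "value": el.get("Value"),
            "options": [o.text for o in el.findall("Option")],
        }
        if el.tag == "Float":
            param["value"] = float(param["value"])
        params[el.get("Name")] = param
    return params
```

Parsing the example yields a vStart entry with value 0.001 V and an fsOrODR entry whose allowed options are "FS" and "ODR".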
Binding data 452 may describe bindings, or relationships, between parameters (e.g., parameters described by the parameter data 450). A binding may be a directed relationship between one or more source parameters (also referred to as sources) and one or more target parameters (also referred to as destinations). One example binding directly correlates a first parameter ArgA with a second parameter ArgB: the value of ArgB tracks the value of ArgA, so that when the value of ArgA changes, the value of ArgB is set to the same value. In some example bindings, several parameters may be related mathematically by a formula (e.g., an algebraic relationship). For example, one parameter ArgC may be set to the square root of the sum of the squares of two other parameters, ArgA and ArgB.
A binding may represent the mathematical relationship between parameters in any suitable format, for example, reverse Polish notation. In some examples, a binding may be conditional or scenario-based. For example, a conditional binding may describe a relationship between parameters that holds under certain conditions and not under others: the condition or conditions of the binding may be met when a parameter or set of parameters is set to a particular value, but not otherwise.
The following example binding data 452 shows two example bindings. The first binding sets a parameter named 4800OverODR equal to 4800 divided by the value of the parameter odr. The second binding is a conditional binding: if the value of fsOrODR is "FS", fsInternal is set to the value of fsbits, but if the value of fsOrODR is "ODR", fsInternal is set to the value of 4800OverODR.
<Bindings>
  <Binding RPN="4800 @odr /" Destination="4800OverODR"/>
  <Binding Toggle="fsOrODR" Destination="fsInternal">
    <Case Value="FS" Source="fsbits" Destination="fsInternal"/>
    <Case Value="ODR" Source="4800OverODR" Destination="fsInternal"/>
  </Binding>
</Bindings>
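The two bindings above can be illustrated with a short sketch. The `@`-prefix convention for parameter references and the function names are assumptions for illustration; only the reverse-Polish expression and the conditional logic come from the example data:

```python
def eval_rpn(tokens, args):
    """Evaluate a reverse-Polish binding expression such as ['4800', '@odr', '/'].
    Tokens starting with '@' are treated as parameter references."""
    stack = []
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b}
    for tok in tokens:
        if tok in ops:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[tok](a, b))
        elif tok.startswith("@"):
            stack.append(args[tok[1:]])
        else:
            stack.append(float(tok))
    return stack.pop()

def apply_bindings(args):
    """Apply the two example bindings from the binding data 452 above."""
    args["4800OverODR"] = eval_rpn(["4800", "@odr", "/"], args)
    # Conditional binding keyed on the fsOrODR toggle.
    if args["fsOrODR"] == "FS":
        args["fsInternal"] = args["fsbits"]
    else:  # "ODR"
        args["fsInternal"] = args["4800OverODR"]
    return args
```

With odr set to 2400 and fsOrODR set to "ODR", 4800OverODR evaluates to 2.0 and fsInternal is bound to that value; with fsOrODR set to "FS", fsInternal instead tracks fsbits.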
In some examples, bindings may allow the evaluation tool 324 to translate information between the UI 364 and the models 356 in an efficient and/or consistent manner, as described herein. For example, different models 356 may expect different sets of input parameters that are specified in different ways, e.g., in different units. A binding described by the binding data 452 may associate one or more parameters received from and/or provided to the user via the UI 364 with one or more other parameters representing input parameters of a particular model 356. For example, bindings may ensure that a set of user-defined settings maps to the input parameters of the various models 356. A particular model, generator, evaluation, or analysis may expect to receive a starting voltage in millivolts (mV), while the user-facing parameter may provide the starting voltage in volts (V). A binding may convert the parameter into the input expected by the simulator, generator, evaluation, or analysis. In another example, when a model 356 returns performance results, bindings may map the results back to a user-friendly format for delivery.
In some examples, the binding data 452 may include any suitable number of bindings having any suitable depth. In graph-theory terms, the bindings that determine a parameter's value may be expressed as a rooted directed tree, or anti-arborescence. Deep bindings may be useful for devices whose configuration changes fundamentally when features such as digital down conversion or a numerically controlled oscillator are enabled or disabled.
Bundle data 454 describes bundles. A bundle may describe a grouping of inputs and/or outputs that appear together, for example, in a common area of the UI 364. For example, in the example evaluation screen 1100 shown in FIGS. 12-14, a bundle may describe the inputs and/or outputs that appear in the settings column, including the fields 1120, 1122, 1124 shown on the left side of the evaluation screen 1100. The contents of a bundle typically include parameters, e.g., described by the parameter data 450. A bundle may also be empty. For example, a bundle for an enabled external RC circuit feature may cause the UI 364 to display inputs and/or outputs related to that feature; when the external RC circuit feature is disabled, a corresponding empty bundle may cause the UI 364 to display no such inputs or outputs. The following example bundle data 454 demonstrates two bundles. The first, called "extEnabled", indicates that both a "resistance" input and a "capacitance" input are to be displayed. The second, called "extDisabled", shows an empty set of inputs and/or outputs to be displayed, for example, when the RC circuit feature is disabled.
<Def Name="COMMON-BUNDLES">
  <Bundle Name="extEnabled">
    <Arg Name="resistance"/>
    <Arg Name="capacitance"/>
  </Bundle>
  <Bundle Name="extDisabled"/>
</Def>
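The bundle-selection behavior can be sketched as follows, using the example bundle data above (spacing normalized). The function name and the enabled/disabled mapping are illustrative assumptions:

```python
import xml.etree.ElementTree as ET

BUNDLES_XML = """
<Def Name="COMMON-BUNDLES">
  <Bundle Name="extEnabled">
    <Arg Name="resistance"/>
    <Arg Name="capacitance"/>
  </Bundle>
  <Bundle Name="extDisabled"/>
</Def>
"""

def bundle_fields(xml_text, ext_rc_enabled):
    """Return the parameter names a UI would display for the RC feature state."""
    root = ET.fromstring(xml_text)
    wanted = "extEnabled" if ext_rc_enabled else "extDisabled"
    for bundle in root.findall("Bundle"):
        if bundle.get("Name") == wanted:
            return [arg.get("Name") for arg in bundle.findall("Arg")]
    return []
```

When the RC feature is enabled the resistance and capacitance inputs are shown; when it is disabled the empty extDisabled bundle yields nothing to display.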
The generator data 456 may describe one or more generators that may be used to generate one or more stimulus signals for a model of the component described by the configuration data set 340(1). A generator may specify a stimulus signal according to a defined pattern (e.g., a single-tone sine wave, etc.). Any suitable generator may be used. In some examples, the generator data 456 may describe generator input parameters. The following example generator data 456 describes a two-tone generator:
<Generators>
  <TwoTone Name="TwoTone">
    <Binding Source="amp1" Destination="amplitude1"/>
    <Binding Source="freq1" Destination="frequency1"/>
    <Binding Source="amp2" Destination="amplitude2"/>
    <Binding Source="freq2" Destination="frequency2"/>
  </TwoTone>
</Generators>
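A two-tone generator of the kind declared above simply sums two sinusoids. The sketch below is a generic illustration of such a generator, not the tool's implementation; the parameter names follow the bindings' destinations (amplitude1, frequency1, etc.):

```python
import math

def two_tone(amplitude1, frequency1, amplitude2, frequency2,
             sample_rate, n_samples):
    """Generate a two-tone stimulus signal: the sum of two sinusoids,
    sampled at sample_rate for n_samples points."""
    return [amplitude1 * math.sin(2 * math.pi * frequency1 * n / sample_rate)
            + amplitude2 * math.sin(2 * math.pi * frequency2 * n / sample_rate)
            for n in range(n_samples)]
```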
Simulator data 458 may describe one or more simulators. A simulator may wrap a model 356. For example, the simulator may include metadata describing the model's input parameters, e.g., in terms of parameters and/or bindings (e.g., as described by the parameter data 450 and/or binding data 452). The simulator may also include metadata for converting a stimulus signal generated by a generator into the format expected by the model 356. Analysis data 460 may describe one or more analyses that may be performed on the output of a model 356. For example, an analysis may receive output data from the model 356 and generate evaluation data, which may take the form of graphics and performance metrics.
Examples of analyses include an X-Y plot analysis that generates evaluation data describing voltage, current, power, or another parameter of the model output in the time domain, a Fourier transform analysis (e.g., an FFT) that generates evaluation data describing the frequency content of the model output, and so on.
The evaluation data 462 may describe one or more evaluations that may utilize the configuration data set 340(1). For example, the evaluation data 462 may describe one or more signal chains, one or more generators, and one or more analyses. Page data 464 may describe different pages, also referred to as screens, of the UI 364. For example, the page data 464 of a page may specify which parameters or bundles are displayed on the page and what results are displayed. (Results may also be described in terms of parameters and/or bundles.) In some examples, the page data 464 of a page may specify what evaluations and/or analyses are to be run and/or what graphical or other output is to be displayed. For example, a page for a signal chain ending in an FFT analysis module may describe the presentation of an FFT graph, e.g., as shown in FIG. 13.
Within an object such as a simulator, generator, or analysis, additional parameters may be set to make the object functional. Which parameters a given object expects may be known a priori based on the object's type. For example, the two-tone generator described above may require two amplitudes and two frequencies to produce a signal output. These may be kept at default values or may be specified by bindings to previously declared parameters.
Performance simulations in the evaluation tool 324 may rely on declared objects corresponding to generators, simulators, and analyses. For example, a signal chain may include one or more generators, one or more simulators, and one or more analyses. FIG. 5 illustrates a signal chain 500 according to an example embodiment of the present disclosure. The signal chain 500 may include a signal generator 510, a simulator 520, a filter 530, and a data analyzer 540. The signal chain 500 may be implemented by any of the various examples of the evaluation tool described herein. Many of the evaluations in the evaluation tool, particularly for high-speed products, may be calculated using a signal-chain superstructure (e.g., signal chain 500). A signal chain may include three kinds of substructures, known as chain links, connected together in a specified manner. The first link is the signal generator 510 chain link, which may include the ability to generate and provide stimulus signals. The signal generator 510 may be described by the generator data of the configuration data set.
The signal generator 510 may be linked in turn to one or more simulators 520. As described above, a simulator may wrap a model 356 that models a component or set of components. The simulator 520 may provide the generated stimulus signal (or the output of a previous simulator) to a model 356, such as a MOTIF model, for simulation. The output of the simulator 520 may be provided to other simulators and/or to the data analyzer 540 chain link, which may include one or more analyses that analyze model output data to determine performance characteristics and provide evaluation data. In one example, model output data may pass through the filter 530 before reaching the data analyzer 540. In some examples, the filter 530 may include a sinc filter (e.g., sin(x)/x or sin(πx)/(πx)). In some examples, the filter may be simulated using a model 356 and a simulator described by the simulator data 458.
The data analyzer 540 includes two example analyses. A Frequency Domain (FD) analysis 542 may perform and/or facilitate an FFT that transforms model output data to the frequency domain. In some examples, the FD analysis 542 may further determine noise and distortion characteristics, such as a signal-to-noise ratio (SNR), based on the transformed data. For example, the screen capture of FIG. 13 depicts an FFT graph displayed when the FFT tab is selected. In some examples, the data analyzer 540 may include a Time Domain (TD) analysis 544 in addition to, or in place of, the FD analysis 542. A TD analysis may plot model output data on an X-Y curve showing a model output parameter (e.g., current, voltage, power, etc.) versus time. For example, the evaluation screen 1100 shown in FIG. 14 depicts an X-Y plot at the graph screen 1128 when the waveform tab 1114 is selected.
The example of FIG. 5 is only one possible signal-chain arrangement, and in some examples the signal-chain structure may be more complex. For example, each link may contain a set of statically possible items, each instantiated in the configuration data set. In that case, each link determines at run time which item to apply to the chain as a whole, typically based on the configuration received from the client. For example, the signal generator 510 may apply either a single-tone generator or a two-tone generator, with the selection determined at run time from the user's configuration choices. Similarly, the simulator 520 may apply either of two different models 356 (e.g., MOTIF models for the selected component) depending on whether the user has enabled a feature of the component (e.g., DDC).
FIG. 6 is a diagram illustrating another example signal chain 600. The signal chain 600 includes a generator 602 configured to generate a single-tone sinusoidal waveform. The signal chain also includes the components ADC 604 and filter 606. The ADC 604 may convert an analog signal to a digital signal. The filter 606 may be, for example, a low-pass filter to eliminate high-frequency noise. The analysis 608 is a Fourier transform (e.g., FD analysis) to show frequency content. Any suitable Fourier transform may be used, including, for example, FFT algorithms. The signal chain 600 may be described by a configuration data set 340. For example, the generator data 456 may describe the generator 602, including, for example, attributes and/or input parameters of the generator 602, which may be described in terms of parameters and/or bindings. The simulator data 458 may describe the model 356 of the ADC 604 and the model 356 of the filter 606, as well as the models' input parameters, e.g., parameters and/or bindings. The analysis data 460 may describe the Fourier transform analysis 608, for example, as well as input parameters such as the type of Fourier transform to be used, the frequency range or frequencies of interest of the analysis, etc.
FIG. 7 is a diagram illustrating yet another example signal chain 700. The signal chain 700 is a composite signal chain that includes chain links whose constituents can be activated or deactivated to change the configuration of the signal chain 700. The generator link 702 may include two generators: a single-tone generator 710 to generate a single-tone sinusoidal stimulus signal and a two-tone generator 712 to generate a two-tone stimulus signal. The evaluation tool can determine which generator 710, 712 to execute, e.g., based on runtime or other settings provided by the user.
Similarly, the simulator link 704 may include two constituent simulators 714, 718, each wrapping a model of the same component (in this example, an ADC). For example, the simulator 714 may wrap a model of the ADC with a certain feature enabled (e.g., a DDC feature), while the simulator 718 may wrap a model of the ADC with that feature disabled. The evaluation tool 324 can determine which simulator 714, 718 and associated model 356 to use based on input parameters, which may be provided by the user at or before runtime. The signal chain 700 also includes analyses 706, 708, which may include a TD X-Y analysis 706 and an FD FFT analysis 708.
FIG. 8 is a diagram illustrating one example evaluation 800 of an example signal chain 802 utilizing a client 302 and an evaluation tool 324. Although the evaluation 800 is described as being performed by the client 302, the evaluation tool 324, and the configuration data set 340 (1), the evaluation 800 and similar evaluations may be performed with various evaluation tools, clients, and configuration data sets, as described herein.
In the example of fig. 8, the signal chain 802 includes a generator 820, a simulator 822, and an analysis 824, but other signal chains may be implemented by an evaluation tool in a similar manner. Prior to conducting the evaluation 800, the evaluation tool 324 can retrieve a configuration data set associated with the signal chain 802 and/or one of the components of the signal chain 802. The configuration data set may include parameter data describing parameters 806, 808, 812, 814, 816, 818, generator data describing generator 820, simulator data describing simulator 822, and analysis data describing analysis 824.
The assessment tool can receive values for some or all of the parameters 806, 808, 812, 814, 816, 818 from the user. For example, a user may provide values for some or all of parameters 806, 808, 812, 814, 816, 818 to Web application 352 via user interface 364. Web application 352 may provide values for parameters 806, 808, 812, 814, 816, 818 to assessment tool 324 via communication link 357 (fig. 3). In some examples, some or all of the parameters 806, 808, 812, 814, 816, 818 have default values (e.g., defined in the configuration dataset) such that it is not necessary in some cases to receive evaluation-specific values from the user.
Parameters may be provided to the generator 820, simulator 822, and/or analysis 824 either directly as input parameters or as the result of a binding, such as binding 810. For example, parameter 806 is provided to the generator 820 as a generator input parameter. The simulator 822 receives parameters 814 and 816 as simulator input parameters. The simulator 822 also receives another input parameter, which in this example is a mathematical combination of parameters 808 and 812. The analysis 824 receives parameter 818 as an analysis input parameter.
The generator 820 may provide the received generator input parameters to a generator implementation 826. The generator implementation 826 may be any suitable implementation of a generator that provides a stimulus signal for provision to the simulator 822. For example, the generator implementation 826 may execute on the same server that executes the evaluation tool 324 and/or on a different computing device. In some examples, the generator implementation 826 may be or include a data generator including, for example, a lookup table or other data indicating a stimulus signal, such as a repeated stimulus signal. In some examples, the generator implementation 826 may be or include an application that generates a data stream representing a stimulus signal. In some examples, the generator implementation 826 may be or include a physical signal generator (e.g., for a stage model 362 as described herein).
The simulator 822 may receive the model input parameters and the stimulus signal and provide them to a model 828, which may be any suitable type of model, such as described with respect to the models 356 above. In some examples, the generator implementation 826 may provide the stimulus signal directly to the model 828, e.g., bypassing the generator 820 and simulator 822. The model 828 may generate a model output signal or data stream that may be provided to the simulator 822 and/or directly to an analysis implementation 830. The analysis 824 may receive analysis input parameters and provide them to the analysis implementation 830, which may generate evaluation output data. The evaluation output data can be provided to the analysis 824 and/or directly to the evaluation tool 324. The evaluation tool 324 can provide the evaluation output data to the Web application 352, which can incorporate it into the user interface 364, e.g., as described herein.
FIG. 9 is a flowchart illustrating one example embodiment of a process flow 900 that may be performed to implement an FFT analysis. In some examples, the process flow 900 may be performed by the evaluation tool 324, for example, as part of the code implementing the evaluation tool 324. In other examples, the process flow 900 may be performed by a separate analysis implementation invoked by the evaluation tool. The analysis implementation may execute on the same server that executes the evaluation tool 324 and/or at a different computing device. The process flow 900 is described as being performed by a computing device, which may be a server or any other suitable computing device executing the evaluation tool 324. The process flow 900 may operate on an output data stream generated by a model 356.
At operation 902, the computing device may determine a frequency range of the output data stream. The frequency range may be determined, for example, by the value of a parameter. In some examples, the frequency range extends from Direct Current (DC), e.g., 0 Hz, to twice the Nyquist frequency, i.e., the sampling rate of the data stream. The sampling rate of the data stream may be set, for example, by an ADC model earlier in the evaluated signal chain.
In one example, setting the frequency range from DC to twice the Nyquist frequency generalizes the FFT analysis so that it applies to either real or complex signals. At operation 904, the computing device may determine whether the data stream is real or complex and mark it accordingly: the computing device may mark the data stream as real if it is real, or as complex if it is complex. Because a data stream is either real or complex, in some examples the computing device marks it as only one or the other. When the computing device performs an operation that processes real and complex signals differently, it may check how the data stream is marked and apply the appropriate real or complex operation. Example operations that differ according to the real or complex nature of the data stream include Fourier transforms, such as the FFT. For example, the spurious energy reported for a complex data stream may differ from that of an equivalent real data stream. Moreover, for various operations, the analyzed frequency band may differ between complex and real operations. For example, when a real data stream is received, an operation may consider the band from DC to the Nyquist frequency; when a complex data stream is received, the operation may consider the band from DC to twice the Nyquist frequency.
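The marking and band-selection behavior of operation 904 can be sketched as follows; the function names are hypothetical and the sketch only illustrates the real-versus-complex distinction described above:

```python
def mark_stream(samples):
    """Tag a data stream 'real' or 'complex'; a stream is one or the other."""
    is_complex = any(isinstance(s, complex) and s.imag != 0 for s in samples)
    return "complex" if is_complex else "real"

def analysis_band(tag, nyquist):
    """Band considered by an operation: DC to Nyquist for real streams,
    DC to twice the Nyquist frequency for complex streams."""
    return (0.0, nyquist) if tag == "real" else (0.0, 2.0 * nyquist)
```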
At operation 906, the computing device may determine the frequency spurs for the FFT analysis. For example, the FFT analysis may divide the frequency range of the data stream into a plurality of frequency bins, where each bin represents a discrete set of frequencies. A frequency spur may be or include a frequency bin where the data stream is expected to have frequency content. For example, for a single-tone analysis (e.g., based on a single-tone stimulus signal), the frequency spurs may be at or near the frequency of the input tone and at the positive and negative harmonics of the tone. In one example, the frequency spurs for a single-tone analysis (e.g., an evaluation of a signal chain including a single-tone generator) may be determined according to the following equation [1]:
F_n = k * n * (F + M_0) + M_1    [1]
in equation [1], fn represents the frequency spurious. The variable n represents the harmonic of the stimulus signal to be considered. In some examples, n may include an integer range from-7 to 7. In examples where n includes an integer range of-7 to 7, the analysis may consider seven negative harmonics of the tone and seven positive harmonics of the tone, although any other suitable number or range of harmonics may be considered. The variable k in equation [1] is a constant multiplier. In some examples, the variable k is equal to 1. In some examples of components having spectral inversion characteristics, the variable k is equal to minus 1. The variable F in equation [1] is the frequency of the tone (e.g., the frequency of the tone stimulus signal used).
In equation [1], the variables M_0 and M_1 represent mixing before and after the domain transform. For example, in a signal chain including a converter such as an ADC or DAC, the converter or another component of the signal chain may perform mixing to achieve up-conversion or down-conversion of the output. The variable M_0 may model mixing before the domain conversion, and the variable M_1 may model mixing after the domain conversion.
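Equation [1] can be computed directly; the sketch below uses the default range of harmonics (n from -7 to 7) and the variable names of the equation, with `premix` standing for M_0 and `postmix` for M_1:

```python
def single_tone_spurs(tone_freq, k=1, premix=0.0, postmix=0.0,
                      n_range=range(-7, 8)):
    """Spur frequencies per equation [1]: F_n = k * n * (F + M_0) + M_1."""
    return [k * n * (tone_freq + premix) + postmix for n in n_range]
```

For a 100 Hz tone with k = 1 and no mixing, this yields the tone itself plus seven positive and seven negative harmonics (and the n = 0 term).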
An example equation for determining frequency spurs in a two-tone analysis is given by equation [2] below. A two-tone analysis may be an analysis of a periodic signal in which the stimulus signal comprises two discrete frequencies.
IMD_n = k * n_1 * (F_1 + M_0) + k * n_2 * (F_2 + M_0) + M_1    [2]
In equation [2], IMD_n is an intermodulation distortion term generated from the frequencies F_1 and F_2. The variables n_1 and n_2 represent harmonics of the individual tones of the stimulus signal. For example, n_1 and n_2 may be integers from -3 to 3. In some examples, n_1 and n_2 may also capture the frequencies of expected intermodulation distortion between the two tones of the stimulus signal. The variable k may be a constant multiplier. F_1 and F_2 may be the frequencies of the individual tones of the stimulus signal. The variables M_0 and M_1 may take the same values described above with respect to equation [1]. In some examples, the terms of equations [1] and [2] are determined during execution, for example, based on parameters and/or bindings.
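Equation [2] enumerates intermodulation products over all pairs (n_1, n_2); a direct sketch, with `premix` for M_0 and `postmix` for M_1:

```python
def two_tone_imd_spurs(f1, f2, k=1, premix=0.0, postmix=0.0, order=3):
    """IMD spur frequencies per equation [2], for n1, n2 in [-order, order]."""
    return sorted({k * n1 * (f1 + premix) + k * n2 * (f2 + premix) + postmix
                   for n1 in range(-order, order + 1)
                   for n2 in range(-order, order + 1)})
```

For tones at 100 Hz and 110 Hz this produces, among others, the sum product at 210 Hz, the difference product at 10 Hz, and the third-order product 2*F_1 - F_2 at 90 Hz.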
In one example, a converter (e.g., the Analog Devices AD9625 ADC) provides a DDC option that can be enabled or disabled. When the DDC option is enabled, the resulting signal is mixed after conversion by a frequency sometimes referred to as the Numerically Controlled Oscillator (NCO) frequency. For example, a binding may involve a parameter indicating the NCO frequency. In some examples, the NCO frequency may offset each generated spur. In some examples, this offset is referred to as Postmix.
At optional operation 908, the computing device may exclude frequencies distorted by noise shaping from the frequency range considered further. For example, some components may use noise shaping to deliberately move the frequency content of the noise so that it falls outside the frequency range in which the signal is expected to lie. This may improve the signal-to-noise ratio of the output; at the same time, it distorts the frequency content of the data stream by increasing the power of the out-of-band content. The analysis input parameters used to determine which frequencies to exclude due to noise shaping may be received and/or calculated based on various parameters and/or bound values. For example, a component such as a converter that utilizes noise shaping may have a setting called a tuning word, which may be represented as a parameter. In some examples, the user may set the value of the tuning word, which may be associated with the FFT analysis by a binding. The value of the tuning word may then be used, for example according to a formula or relationship, to determine which frequencies to exclude. The relationship may differ for different components.
At operation 910, the computing device may determine the average noise of the data stream, e.g., excluding the harmonics of the stimulus signal and the intermodulation frequencies (e.g., if the stimulus signal is multi-tone). At operation 912, the computing device may determine the power of the data stream at each spur determined at operation 906. For example, the computing device may combine the power of a threshold number of frequency bins around each spur and compare the combined spur power to the noise floor determined at operation 910. At operation 914, the computing device may scan the remaining bins of the frequency range to determine the worst-other bin. The worst-other may be the highest-power frequency bin or group of bins selected from those not at harmonic or intermodulation frequencies. At operation 916, the computing device may determine output performance characteristics of the data stream, such as SNR, Spurious-Free Dynamic Range (SFDR), and the like.
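Operations 910-916 can be sketched on a per-bin power spectrum. This is an illustrative simplification, not the tool's algorithm: the guard width, the assumption that the first listed spur is the fundamental, and the SNR/SFDR formulas are all assumptions for the sketch:

```python
import math

def analyze_bins(bin_power, spur_bins, guard=1):
    """Sketch of operations 910-916: noise floor, spur powers, worst-other,
    and SNR/SFDR. bin_power is linear power per FFT bin; spur_bins are the
    bin indices of the spurs from equations [1]/[2] (fundamental first,
    followed by at least one harmonic/IMD spur)."""
    excluded = set()
    for b in spur_bins:
        excluded.update(range(max(0, b - guard),
                              min(len(bin_power), b + guard + 1)))
    # Operation 910: average noise over bins away from any spur.
    noise = [p for i, p in enumerate(bin_power) if i not in excluded]
    avg_noise = sum(noise) / len(noise)
    # Operation 912: combine bins around each spur.
    spur_power = {b: sum(bin_power[max(0, b - guard):b + guard + 1])
                  for b in spur_bins}
    signal = spur_power[spur_bins[0]]   # fundamental (assumption)
    # Operation 914: worst remaining bin.
    worst_other = max(p for i, p in enumerate(bin_power) if i not in excluded)
    # Operation 916: output performance characteristics.
    snr_db = 10 * math.log10(signal / (avg_noise * len(noise)))
    worst_spur = max(max(spur_power[b] for b in spur_bins[1:]), worst_other)
    sfdr_db = 10 * math.log10(signal / worst_spur)
    return snr_db, sfdr_db
```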
In some examples, the computing device may add additional spurs to account for and/or detect out-of-band spectral content. For example, some components, such as some DACs used in Radio Frequency (RF) and other transmission applications, may introduce significant spectral content outside the first Nyquist zone, and thus outside the frequency range determined at operation 902. When present, such out-of-band spectral content sometimes leaks into other frequency bands in which the user does not have legal transmission rights. In some examples, the FFT analysis may detect out-of-band spectral content by selecting and tracking additional spurs, which may be frequency-shifted versions of some or all of the spurs determined at operation 906, translated into the second Nyquist zone. To select out-of-band spurs, the computing device may, in some examples, replicate one or more spurs determined at operation 906 and move the replicated spurs to the second or a subsequent Nyquist zone. The computing device may calculate the power at one or more out-of-band spurs. If the computed power of an out-of-band spur is above a threshold level, this may be reported to the user by writing a description of the out-of-band spur to the evaluation output data provided to the user via the UI 364. For example, out-of-band spurs may be included in the output of the FFT analysis of FIG. 9. A user can use this feature to determine the need for, and the design of, an analog low-pass filter to be applied after the DAC.
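One simple way to replicate spurs into the second Nyquist zone, sketched here under the assumption that a real DAC output's image of a spur at f appears mirrored at Fs - f (the text does not specify the exact shift used):

```python
def out_of_band_images(spurs, sample_rate):
    """Hypothetical sketch: mirror each first-Nyquist-zone spur frequency
    into the second Nyquist zone (Fs/2 .. Fs), where a DAC image of that
    spur would appear for a real output."""
    nyquist = sample_rate / 2.0
    return [sample_rate - f for f in spurs if 0.0 <= f <= nyquist]
```

The powers at these replicated spur frequencies could then be computed and compared against the reporting threshold described above.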
In some examples, the evaluation tool 324 or another suitable component may be configured to detect a bad frequency plan in a signal chain. For example, when an ADC receives a stimulus signal having a very low frequency relative to the ADC clock frequency, the stimulus signal and its harmonics may not be distinguishable from one another, making the output of the ADC very noisy. In some examples, a designer may inadvertently create a poor frequency plan for the ADC. For example, DDC, decimation, and other similar ADC and pre-ADC processing operations may reduce the frequency of the ADC stimulus signal in ways that are not obvious to some designers. To address this issue, the evaluation tool 324 or other suitable component may be configured to compare the spurs generated at operation 906. For example, the evaluation tool 324 or other suitable component may apply an offset to a first spur to generate an offset spur. The offset spur may represent, for example, harmonic or intermodulation distortion of the stimulus signal. The evaluation tool 324 or other suitable component may then determine whether the power at the offset spur is greater than the power at the original spur. If so, this may indicate that the frequency of the ADC input is too low. The evaluation tool 324 or other suitable component can generate an alert that can be provided to the user, for example, through the evaluation output data displayed at the UI 364. The user may choose to redesign the circuit and/or continue the evaluation.
In one example, the evaluation tool 324 or another suitable component may be configured to detect the use of an unsuitable tuning word when evaluating an ADC that utilizes noise shaping. For example, the noise shaping used by a particular ADC may be configured by providing a tuning word. The tuning word may describe the frequency band (e.g., the ideal frequency band) in which the stimulus signal is expected to reside. In some examples, the user may select the wrong tuning word, in which case the ADC may shift noise into the bandwidth occupied by the stimulus signal, resulting in a low signal-to-noise ratio. In some examples, the evaluation tool 324 or other suitable component may compare the stimulus signal to the ideal frequency band indicated by the tuning word. If the stimulus signal falls outside the ideal frequency band, the evaluation tool 324 or other suitable component may alert the user, for example, by writing an indication of the incorrect noise shaping to the evaluation output data provided at the UI 364. The user may choose to redesign the circuit and/or continue the evaluation.
In some examples, the evaluation tool may be configured to detect, and alert the user, when the user attempts to evaluate a component or signal chain that experiences excessive clipping. For example, clipping may occur at an ADC when the ADC receives an analog signal higher than the ADC can represent in its output digital data stream. The highest value that the ADC can represent in the output digital data stream (referred to herein as the full-scale value) may be the highest digital value generated by the ADC. For example, a 16-bit ADC has 65536 unique output codes, and a full-scale value occurs when the output of the ADC is its maximum code. The physical value (e.g., voltage, current, etc.) corresponding to the full-scale value may vary, for example, based on the resolution of the ADC.
In some examples, the evaluation tool 324 may be configured to detect clipping of an ADC during evaluation of a component or signal chain. Fig. 10 is a flow chart illustrating one example of a process flow 1000 that may be performed to detect clipping of an ADC. In some examples, process flow 1000 may be performed by the evaluation tool 324, for example, as part of the code implementing the evaluation tool 324. In other examples, process flow 1000 may be performed by a separate analysis implementation invoked by the evaluation tool. The analysis implementation may be executed at the same server that executes the evaluation tool 324 and/or at a different computing device. Process flow 1000 is described as being performed by a computing device, which may be a server or any other suitable computing device executing the evaluation tool 324.
At operation 1002, the computing device may determine the number of times a full-scale value occurs in the data stream. At operation 1004, the computing device may determine a full-scale threshold for the data stream. The full-scale threshold may be the maximum acceptable rate of full-scale values in the data stream. In some examples, the full-scale threshold may describe the number of full-scale values that would occur when the ADC receives an analog input of a sine wave having an amplitude that matches the full scale of the ADC. For example, the computing device may determine a peak value, which may be related to the proportion of output samples that will be discretized into a full-scale value when the analog input to the ADC is a full-scale-amplitude sinusoid. In some examples, the peak value may be given by the following equation [3]:
In equation [3], N_codes may be the number of unique output codes that the ADC can generate. For example, a 16-bit converter may generate 65,536 unique output codes. The full-scale threshold may then be given by the following equation [4]:
threshold = (1 − peak) × length   [4]
In equation [4], length may be the length of the data stream, expressed as the number of samples acquired by the ADC. Equations [3]-[4] show only one example method of determining the full-scale threshold.
At operation 1006, the computing device may compare the data stream to the full-scale threshold determined at operation 1004. If the number of full-scale values in the data stream is not greater than the full-scale threshold, the computing device may proceed with the evaluation at operation 1008. For example, the computing device may perform an FFT or other analysis of the evaluated signal chain as described with respect to fig. 9. If the number of full-scale values in the data stream is greater than the full-scale threshold, this may be an indication that the ADC received a signal exceeding the full-scale amplitude that it can accurately discretize. The computing device may alert the user at operation 1010. For example, the evaluation tool 324 can provide an alert message to the client 302, which can surface the alert through the evaluation output data provided at the UI 364. In some examples, the evaluation may stop. In other examples, the user may be prompted to continue the evaluation despite the clipping.
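Process flow 1000 can be sketched as follows. Equation [3] is not reproduced in the text above, so the `peak` expression below is an assumption made for illustration: it takes (1 − peak) to be the fraction of a full-scale sine's samples that land in a rail code of an N-code quantizer. All function and variable names are hypothetical.

```python
import math

def full_scale_threshold(n_bits, length):
    """Maximum acceptable count of full-scale codes in a data stream of
    `length` samples, per equation [4]: threshold = (1 - peak) * length.

    Assumption (equation [3] is not reproduced in the text): for a
    full-scale sine into an N-code quantizer, the fraction of samples
    landing in the two rail codes is 1 - (2/pi) * asin((N - 2) / N),
    which is taken here as (1 - peak).
    """
    n_codes = 2 ** n_bits
    peak = (2.0 / math.pi) * math.asin((n_codes - 2) / n_codes)
    return (1.0 - peak) * length

def detect_clipping(samples, n_bits):
    """Sketch of process flow 1000: count occurrences of the full-scale
    code (operation 1002), determine the threshold (operation 1004), and
    compare (operation 1006).  Returns True when the count exceeds the
    threshold, i.e., when the user should be alerted (operation 1010)."""
    full_scale = 2 ** n_bits - 1          # e.g., 0xFFFF for a 16-bit ADC
    count = sum(1 for s in samples if s >= full_scale)
    return count > full_scale_threshold(n_bits, len(samples))
```

Under this assumed `peak`, a 16-bit ADC would tolerate roughly 0.5% of samples at the rail before an alert is raised; a data stream dominated by full-scale codes would trip the comparison at operation 1006.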
Figs. 11-15 illustrate screenshots that may be provided to a user, for example, by the evaluation tool 324 and/or the client 302 via the UI 364, as described herein. For example, fig. 11 is a screenshot showing one example of an evaluation screen 1100. In the example of fig. 11, the evaluation screen 1100 includes a component selection window 1102. In fig. 11, the component selection window 1102 includes a component category field 1104 from which a user can select a component category. Any suitable category of components may be selected. When a user selects a component category, the product field 1106 may list particular instances of components in the selected category. In some examples, the components selected through the component selection window 1102 may make up all or part of the signal chain described by the signal chain data for the signal chain to be evaluated.
Fig. 12 is a screenshot showing another example of the evaluation screen 1100 with additional features. As shown in fig. 12, the evaluation screen 1100 may include various tabs 1110, 1112, 1114, 1116, 1118 that can be selected to access the functionality of the evaluation tool 324. In the example of the evaluation screen 1100 shown in fig. 12, the chart tab 1110 is selected. For example, the evaluation screen 1100 may include a chart field 1126 that shows a diagram of the component to be modeled. For example, in fig. 12, the chart field 1126 shows a diagram of an ADC. In one example, the chart field 1126 may show a signal chain to be modeled.
In the example of fig. 12, the evaluation screen 1100 also includes a settings field 1120 and parameter fields 1122, 1124. The settings field 1120 may include control inputs for the evaluation. For example, the settings field 1120 includes a run button that, when selected, may prompt the client 302 to initiate an evaluation via the evaluation tool 324. The parameter fields 1122, 1124 may include fields for receiving values of various parameters. The parameters prompted at fields 1122, 1124 may be parameters describing the selected component or signal chain, e.g., as indicated by parameter data at the configuration data set. In one example, the particular parameters prompted at fields 1122, 1124 may be described by bundle data at the configuration data set.
Fig. 13 is a screenshot showing another example of the evaluation screen 1100 in which the FFT analysis tab 1112 is selected. The graphical result field 1128 shows evaluation output data, which includes a graphical representation of an FFT analysis of the component. The FFT analysis may be performed, for example, as described herein with respect to figs. 9-10. The value result field 1130 shows various numerical results of the FFT analysis, including, for example, results related to signal, noise, and distortion. Fig. 14 is a screenshot showing another example of the evaluation screen 1100 in which the waveform tab 1114 is selected. This may cause the evaluation screen 1100 to display evaluation output data including an X-Y analysis, as described herein. For example, in the example of fig. 14, the evaluation screen 1100 includes a graphical field 1140 that shows an example output signal.
Fig. 15 is a screenshot showing another example of the evaluation screen 1100 in which the next step tab 1116 is selected. The next step field 1142 may provide links to other activities that the user may engage in, for example, in connection with one or more evaluated components. For example, the print evaluation session button 1144, when selected, may cause the client 302 to print and/or save at least a portion of the information provided by the tabs 1110, 1112, 1114, etc. The open data sheet button 1146, when selected by the user, may cause the client 302 to access a data sheet for the evaluated component or components. The purchase button 1148, when selected by the user, may cause the client 302 to link to a Web page, e.g., via the Web browser application 350, or other location where the user may purchase one or more evaluated components, e.g., alone or in combination with an evaluation board. The access product page button 1150, when selected by the user, may cause the Web browser application 350 to load a Web page that provides information about the one or more components that are the subject of the evaluation. The help tab 1118, when selected, may provide the user with various offline and/or real-time help options for using the evaluation tool 324.
Fig. 16 is a screenshot showing an example of another configuration of the evaluation screen of figs. 11 and 12. In the example of fig. 16, a download data sheet button 1170 and a purchase evaluation board button 1172 are added. The download data sheet button 1170, when selected by the user, may cause the Web browser application 350 to load a Web page that includes a data sheet for one or more evaluated components. The purchase evaluation board button 1172, when selected by the user, may cause the client 302 to link to a Web page or other location where the user may purchase one or more evaluated components, either alone or with an evaluation board. In the example of fig. 16, the next step tab 1116 shown in other examples of the screen 1100 is omitted. In various examples, the buttons 1170 and 1172 may be included as a replacement for the next step tab 1116 and/or in addition to the next step tab 1116. Further, although the FFT tab 1112 is selected in fig. 16, in various examples the buttons 1170, 1172 may appear on the screen 1100 when various other tabs 1110, 1114, 1118, etc. are selected.
Fig. 17 is a block diagram 1700 illustrating one example of a software architecture 1702 for a computing device. Architecture 1702 may be used in conjunction with various hardware architectures, e.g., server 120, clients 102 (1-N), 302, etc., e.g., as described herein. Fig. 17 is merely a non-limiting example of a software architecture 1702 and many other architectures may be implemented to facilitate the functionality described herein. A representative hardware layer 1704 is illustrated that can represent, for example, any of the computing devices described above. In some examples, the hardware layer 1704 may be implemented in accordance with the architecture 1702 of fig. 17 and/or the architecture 1800 of fig. 18.
The representative hardware layer 1704 includes one or more processor units 1706 that have associated executable instructions 1708. Executable instructions 1708 represent executable instructions of software architecture 1702, including implementations of the methods, modules, components, etc. of fig. 1-18. The hardware layer 1704 also includes a memory and/or storage module 1710 that also has executable instructions 1708. The hardware layer 1704 may also include other hardware indicated by other hardware 1712, which represents any other hardware of the hardware layer 1704, such as the other hardware shown as part of the hardware architecture 1800.
In the example architecture of fig. 17, the software architecture 1702 may be conceptualized as a stack of layers, with each layer providing specific functionality. For example, the software architecture 1702 may include layers such as an operating system 1714, libraries 1716, frameworks/middleware 1718, applications 1720, and a presentation layer 1744. Operationally, the applications 1720 and/or other components within the layers may invoke API calls 1724 through the software stack and receive responses, returned values, etc., illustrated as messages 1726, in response to the API calls 1724. The layers shown are representative in nature, and not all software architectures have all layers. For example, some mobile or special-purpose operating systems may not provide a frameworks/middleware layer 1718, while others may provide such a layer. Other software architectures may include additional or different layers.
The operating system 1714 may manage hardware resources and provide common services. The operating system 1714 may include, for example, a kernel 1728, services 1730, and drivers 1732. The kernel 1728 may act as an abstraction layer between the hardware and the other software layers. For example, the kernel 1728 may be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and the like. The services 1730 may provide other common services for the other software layers. In some examples, the services 1730 include an interrupt service. The interrupt service may detect receipt of a hardware or software interrupt and, in response, cause the architecture 1702 to suspend its current processing and execute an interrupt service routine (ISR) upon receipt of the interrupt. As described herein, the ISR may generate an alert.
The drivers 1732 may be responsible for controlling or interfacing with the underlying hardware. For example, the drivers 1732 may include display drivers, camera drivers, Bluetooth drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), Wi-Fi drivers, NFC drivers, audio drivers, power management drivers, and so forth, depending on the hardware configuration.
The libraries 1716 may provide a common infrastructure that may be used by the applications 1720 and/or other components and/or layers. The libraries 1716 typically provide functionality that allows other software modules to perform tasks in an easier manner than interfacing directly with the underlying operating system 1714 functionality (e.g., kernel 1728, services 1730, and/or drivers 1732). The libraries 1716 may include system libraries 1734 (e.g., the C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematical functions, and the like. In addition, the libraries 1716 may include API libraries 1736, such as media libraries (e.g., libraries supporting presentation and manipulation of various media formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG), graphics libraries (e.g., an OpenGL framework that may be used to render 2D and 3D graphics content on a display), database libraries (e.g., SQLite, which may provide various relational database functions), web libraries (e.g., WebKit, which may provide web browsing functionality), and the like. The libraries 1716 may also include a wide variety of other libraries 1738 to provide many other APIs to the applications 1720 and other software components/modules.
The framework 1718 (also sometimes referred to as middleware) can provide a higher level of common infrastructure that can be used by the applications 1720 and/or other software components/modules. For example, the framework 1718 may provide various Graphical User Interface (GUI) functions, advanced resource management, advanced location services, and the like. The framework 1718 can provide a broad spectrum of other APIs that can be used by the application 1720 and/or other software components/modules, some of which can be specific to a particular operating system or platform.
The applications 1720 include built-in applications 1740 and/or third-party applications 1742. Examples of representative built-in applications 1740 may include, but are not limited to, a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, and/or a game application. The third-party applications 1742 may include any of the built-in applications 1740 as well as a broad assortment of other applications. In a particular example, a third-party application 1742 (e.g., an application developed using the Android™ or iOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as iOS™, Android™, Windows® Phone, or another user computing device operating system. In this example, the third-party application 1742 may invoke the API calls 1724 provided by the mobile operating system, such as the operating system 1714, to facilitate the functionality described herein.
The applications 1720 may utilize built-in operating system functions (e.g., kernel 1728, services 1730, and/or drivers 1732), libraries (e.g., system libraries 1734, API libraries 1736, and other libraries 1738), and frameworks/middleware 1718 to create user interfaces to interact with users of the system. Alternatively or additionally, in some systems, interactions with a user may occur through a presentation layer (e.g., presentation layer 1744). In these systems, the application/module "logic" may be separated from the aspects of the application/module that interact with the user.
Some software architectures utilize virtual machines. For example, the systems described herein may be executed using one or more virtual machines executing at one or more server computing machines. In the example of fig. 17, this is illustrated by a virtual machine 1748. A virtual machine creates a software environment in which applications/modules may execute as if they were executing on a hardware computing device. The virtual machine is hosted by a host operating system (operating system 1714) and typically, although not always, has a virtual machine monitor 1746 that manages the operation of the virtual machine 1748 and the interface with the host operating system (i.e., operating system 1714). A software architecture executes within the virtual machine 1748, including, for example, an operating system 1750, libraries 1752, frameworks/middleware 1754, applications 1756, and/or a presentation layer 1758. These software architecture layers executing within the virtual machine 1748 may be the same as the corresponding layers previously described or may be different.
Fig. 18 is a block diagram illustrating a computing device hardware architecture 1800 in which a set or series of instructions may be executed to cause a machine to perform an example of any of the methods discussed herein. For example, architecture 1800 may execute software architecture 1702 described with respect to fig. 17. Architecture 1800 may operate as a stand-alone device or may be connected (e.g., networked) to other machines. In a networked deployment, the architecture 1800 may operate in the capacity of a server or client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. Architecture 1800 may be implemented in a Personal Computer (PC), tablet PC, hybrid tablet computer, set-top box (STB), personal Digital Assistant (PDA), mobile telephone, network appliance, network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify operations to be performed by that machine.
The example architecture 1800 includes a processor unit 1802, the processor unit 1802 including at least one processor (e.g., a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or both, a processor core, a compute node, etc.). Architecture 1800 may also include a main memory 1804 and a static memory 1806, which communicate with each other via a link 1808 (e.g., a bus). Architecture 1800 may also include a video display unit 1810, an alphanumeric input device 1812 (e.g., a keyboard), and a UI navigation device 1814 (e.g., a mouse). In some examples, video display unit 1810, input device 1812 and UI navigation device 1814 are incorporated into a touch screen display. Architecture 1800 may additionally include a storage device 1816 (e.g., a drive unit), a signal generation device 1818 (e.g., a speaker), a network interface device 1820, and one or more sensors (not shown), such as a Global Positioning System (GPS) sensor, compass, accelerometer, or other sensor.
In some examples, the processor unit 1802 or another suitable hardware component may support hardware interrupts. In response to a hardware interrupt, the processor unit 1802 may suspend its processing and execute an ISR, e.g., as described herein.
The storage device 1816 includes a machine-readable medium 1822 on which are stored one or more sets of data structures and instructions 1824 (e.g., software) embodying or used by any one or more of the methods or functions described herein. The instructions 1824 may also reside, completely or at least partially, within the main memory 1804, the static memory 1806, and/or within the processor unit 1802 during execution thereof by the architecture 1800, with the main memory 1804, the static memory 1806, and the processor unit 1802 also constituting machine-readable media. The instructions stored at the machine-readable medium 1822 may include, for example, instructions for implementing the software architecture 1702, instructions for performing any of the features described herein, and so forth.
While the machine-readable medium 1822 is shown in an example to be a single medium, the term "machine-readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 1824. The term "machine-readable medium" shall also be taken to include any tangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine such that the instructions cause the machine to perform any one or more of the methodologies of the present disclosure, or that is capable of storing, encoding, or carrying data structures used by or associated with such instructions. The term "machine-readable medium" shall accordingly be taken to include, but not be limited to, solid-state memories and optical and magnetic media. Specific examples of machine-readable media include non-volatile memory, including by way of example semiconductor memory devices (e.g., electrically programmable read-only memory (EPROM) and electrically erasable programmable read-only memory (EEPROM)) and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
The instructions 1824 may also be transmitted or received over a communications network 1826 using a transmission medium via the network interface device 1820, using any one of a number of well-known transfer protocols (e.g., the hypertext transfer protocol (HTTP)). Examples of communication networks include LANs, WANs, the Internet, mobile telephone networks, plain old telephone service (POTS) networks, and wireless data networks (e.g., Wi-Fi, 3G, and 4G LTE/LTE-A or WiMAX networks). The term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
The above detailed description includes references to the accompanying drawings, which form a part of the detailed description. The drawings show, by way of illustration, specific example embodiments in which the disclosure may be practiced. These example embodiments are also referred to herein as "examples." Such examples may include elements in addition to those shown or described. However, the inventors also contemplate examples in which only those elements shown or described are provided. Moreover, the inventors also contemplate examples using any combination or permutation of those elements shown or described (or one or more aspects thereof), either with respect to a particular example (or one or more aspects thereof) or with respect to other examples (or one or more aspects thereof) shown or described herein.
If the usage between the present document and any document incorporated by reference is inconsistent, the usage in the present document controls.
In this document, the terms "a" or "an" are used, as is common in patent documents, to include one or more than one, independent of any other instances or usages of "at least one" or "one or more." In this document, the term "or" is used to refer to a nonexclusive or, such that "A or B" includes "A but not B," "B but not A," and "A and B," unless otherwise indicated. Herein, the terms "including" and "in which" are used as the plain-English equivalents of the respective terms "comprising" and "wherein." Furthermore, in the following claims, the terms "comprise" and "include" are open-ended; that is, a system, device, article, composition, formulation, or process that includes elements in addition to those listed after such a term in a claim is still considered to fall within the scope of that claim. Moreover, in the following claims, the terms "first," "second," and "third," etc. are used merely as labels and are not intended to impose numerical requirements on their objects.
The method examples described herein may be at least partially machine- or computer-implemented. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples. An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer-readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times. Examples of these tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic tapes, memory cards or sticks, random access memories (RAMs), read-only memories (ROMs), and the like. The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other. Other embodiments may be used, such as by one of ordinary skill in the art upon reviewing the above description. The Abstract is provided to comply with 37 C.F.R. §1.72(b), to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Also, in the above detailed description, various features may be grouped together to streamline the disclosure. This should not be interpreted as intending that an unclaimed disclosed feature is essential to any claim. Rather, inventive subject matter may lie in less than all features of a particular disclosed example embodiment.
Thus, the following claims are incorporated into the detailed description as examples or example embodiments, with each claim standing on its own as a separate example embodiment, and it is contemplated that these example embodiments may be combined with one another in various combinations or permutations. The scope of the disclosure should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
Examples of the invention
Example 1 is a system for evaluating an electronic component, the system comprising: a server computing device comprising a processor unit and configured to execute an evaluation application, wherein the evaluation application is configured to perform operations comprising: providing an assessment user interface for a user application executing on a user computing device; receiving an indication of an electronic component for evaluation from the user application and via the user interface; accessing a configuration data set of the electronic component, wherein the configuration data set comprises: parameter data describing a set of parameters of the electronic component; binding data describing a relationship between a first parameter of the parameter set and a first model input parameter; and simulator data describing a model of the electronic component; and evaluating the electronic component based at least in part on the configuration dataset.
In example 2, the subject matter of example 1 optionally includes: wherein the configuration data set further comprises a first bundle of data describing a first bundle comprising a subset of parameters of the set of parameters, wherein the evaluation application is further configured to perform operations comprising: determining that the first bundle is enabled; and providing application field description data describing components of the user interface to the user for receiving values of a subset of parameters.
In example 3, the subject matter of any one or more of examples 1-2 optionally includes: wherein the evaluation application is further configured to perform operations comprising: receiving signal chain data from the user application, wherein the signal chain data describes the electronic component and analysis; receiving a data stream; and initiating analysis to generate assessment output data from the data stream.
In example 4, the subject matter of example 3 optionally includes: wherein the evaluation output data comprises plot data describing the first model output parameters versus time.
In example 5, the subject matter of any one or more of examples 3-4 optionally includes: wherein the evaluation output data comprises frequency content data describing frequency content of the first model output parameters.
In example 6, the subject matter of any one or more of examples 1-5 optionally includes: wherein the evaluation application is further configured to perform operations comprising: receiving a first value of a first parameter from the user application and via the user interface; generating a first value of the first model input parameter based at least in part on the first value of the first parameter and the binding data; initiating a model of the electronic component based at least in part on a first value of the first model input parameter; and providing, to the user application, evaluation output data for display on the user interface, wherein the evaluation output data is based at least in part on the output of the model.
In example 7, the subject matter of example 6 optionally includes: wherein launching the model includes executing the executable model.
In example 8, the subject matter of any one or more of examples 6-7 optionally includes: wherein launching the model includes launching physical model circuitry in communication with the server computing device.
In example 9, the subject matter of any one or more of examples 6-8 optionally includes: wherein the evaluation application is further configured to perform operations comprising: receiving a data stream describing a first model output parameter of the model; selecting a set of frequency spurs of the first model output parameter based at least in part on a first value of the first parameter; and determining a power of a first frequency spur of the set of frequency spurs, wherein the evaluation output data comprises power data describing the power of the first frequency spur.
In example 10, the subject matter of example 9 optionally includes: wherein the electronic component is a converter; and wherein the evaluation application is further configured to perform operations comprising: receiving stimulation signal data describing a stimulation signal of the model; and determining the first frequency spurs based at least in part on a fundamental frequency and a pre-conversion offset of the stimulation signal.
In example 11, the subject matter of any one or more of examples 9-10 optionally includes: wherein the electronic component is a converter; and wherein the evaluation application is further configured to perform operations comprising: receiving stimulation signal data describing a stimulation signal of the model; determining the first frequency spur based at least in part on a fundamental frequency of the stimulation signal; and determining a second frequency spur of the set of frequency spurs based at least in part on the fundamental frequency and a multiplier of the stimulation signal.
In example 12, the subject matter of any one or more of examples 9-11 optionally includes: wherein the first frequency spur is located in a first Nyquist zone; and wherein the evaluation application is further configured to perform operations comprising: determining a second frequency spur of the set of frequency spurs, wherein the second frequency spur corresponds to the first frequency spur shifted to a second Nyquist zone; determining that the power of the second frequency spur is greater than a noise floor by at least a threshold amount; and writing a description of the second frequency spur to the evaluation output data.
In example 13, the subject matter of example 12 optionally includes: wherein the first frequency spur is at a fundamental frequency of the stimulation signal, and wherein the second frequency spur is a harmonic of the fundamental frequency; and wherein the evaluation application is further configured to perform operations comprising: applying an offset to the first frequency spur based at least in part on the first value of the first parameter to generate a first offset spur; determining that the first offset spur is greater than the second frequency spur; and writing low-frequency warning data to the evaluation output data.
In example 14, the subject matter of any one or more of examples 6-13 optionally includes: wherein the evaluation application is further configured to perform operations comprising: receiving a data stream describing a first model output parameter of the model; determining that at least a threshold number of values of the first model output parameter described by the data stream correspond to a full-scale value of the electronic component; and writing clip warning data to the evaluation output data.
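The clip detection of example 14 amounts to counting samples pinned at full scale. A minimal sketch, with hypothetical names and the assumption that clipping at either polarity counts:

```python
def clipping_detected(samples, full_scale, threshold_count):
    """Flag clipping when at least threshold_count samples sit at or
    beyond the converter's full-scale value (either polarity)."""
    hits = sum(1 for s in samples if abs(s) >= full_scale)
    return hits >= threshold_count
```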
In example 15, the subject matter of any one or more of examples 6-14 optionally includes: wherein the electronic component is a converter, and wherein the first value of the first parameter is indicative of a tuning word of a noise shaping characteristic of the converter; and wherein the evaluation application is further configured to perform operations comprising: determining an ideal frequency band of the electronic component; determining that the tuning word is outside the ideal frequency band; and writing incorrect-noise-shaping data to the evaluation output data.
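The band check in example 15 is a simple range test; a sketch, assuming the ideal band is expressed as a (low, high) pair of frequencies and the tuning word resolves to a frequency:

```python
def tuning_word_in_band(tuning_word_hz, band_hz):
    """True when the noise-shaping tuning word falls inside the ideal
    band; the evaluation would warn when this returns False."""
    low, high = band_hz
    return low <= tuning_word_hz <= high
```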
Example 16 is a method of evaluating an electronic component, the method comprising: providing, by a server computing device, an evaluation user interface for a user application executing on a user computing device; receiving, by the server computing device, an indication of an electronic component for evaluation from the user application and via the user interface; accessing, by the server computing device, a configuration dataset of the electronic component, wherein the configuration dataset comprises: parameter data describing a set of parameters of the electronic component; binding data describing a relationship between a first parameter of the parameter set and a first model input parameter; and simulator data describing a model of the electronic component; and evaluating, by the server computing device, the electronic component based at least in part on the configuration dataset.
In example 17, the subject matter of example 16 optionally includes: receiving, by the server computing device, a first value of the first parameter from the user application and via the user interface; generating, by the server computing device, a first value of the first model input parameter based at least in part on the first value of the first parameter and the binding data; initiating, by the server computing device, the model of the electronic component based at least in part on the first value of the first model input parameter; and providing, by the server computing device, evaluation output data to the user application for display on the user interface, wherein the evaluation output data is based at least in part on an output of the model.
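The binding step of example 17 — generating a model input value from a user parameter value and the binding data — can be sketched as a lookup-plus-conversion. The table contents and names here are hypothetical; the patent only says that binding data "describes a relationship" between the two parameters:

```python
# Hypothetical binding table: each entry relates a user-facing parameter
# to a model input parameter via a conversion function.
BINDINGS = {
    "sample_rate_msps": ("fs_hz", lambda v: v * 1e6),  # MSPS -> Hz
}

def bind(param_name, value):
    """Resolve a user parameter value to (model input name, model value)."""
    model_name, convert = BINDINGS[param_name]
    return model_name, convert(value)
```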
In example 18, the subject matter of any one or more of examples 16-17 optionally includes: receiving, by the server computing device, a data stream describing a first model output parameter of the model; selecting, by the server computing device, a set of frequency spurs of the first model output parameter based at least in part on the first value of the first parameter; and determining, by the server computing device, a power of a first frequency spur of the set of frequency spurs, wherein the evaluation output data includes power data describing the power of the first frequency spur.
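One way to determine spur power from a streamed model output, as in example 18, is a plain FFT-bin lookup. This is a simplification under stated assumptions (unit-amplitude reference, coherent sampling, no windowing), not the analysis the evaluation application necessarily performs:

```python
import numpy as np

def spur_power_db(samples, fs_hz, spur_hz):
    """Estimate the power (in dB relative to a unit-amplitude tone)
    of a spur at spur_hz from a block of model output samples,
    using a single FFT-bin lookup."""
    n = len(samples)
    spectrum = np.fft.rfft(samples) / n           # normalized amplitudes
    bin_idx = int(round(spur_hz * n / fs_hz))     # nearest FFT bin
    power = 2.0 * np.abs(spectrum[bin_idx]) ** 2  # one-sided power
    return 10.0 * np.log10(power + 1e-30)
```

A full-scale sine measured this way comes out near -3.01 dB, the power of a unit-amplitude tone.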
Example 19 is a machine-readable medium comprising instructions that, when executed by a computing device, cause the computing device to perform operations comprising: providing an evaluation user interface for a user application executing on a user computing device; receiving an indication of an electronic component for evaluation from the user application and via the user interface; accessing a configuration data set of the electronic component, wherein the configuration data set comprises: parameter data describing a set of parameters of the electronic component; binding data describing a relationship between a first parameter of the parameter set and a first model input parameter; and simulator data describing a model of the electronic component; and evaluating the electronic component based at least in part on the configuration data set.
In example 20, the subject matter of example 19 optionally includes instructions that, when executed by the computing device, cause the computing device to perform operations comprising: receiving a first value of the first parameter from the user application and via the user interface; generating a first value of the first model input parameter based at least in part on the first value of the first parameter and the binding data; initiating the model of the electronic component based at least in part on the first value of the first model input parameter; and providing, to the user application, evaluation output data for display on the user interface, wherein the evaluation output data is based at least in part on the output of the model.
Claims (19)
1. A system for evaluating electronic components, the system comprising:
a server computing device comprising a processor unit and configured to execute an evaluation application, wherein the evaluation application is configured to perform operations comprising:
providing a user interface for a user application executing on a user computing device;
receiving an indication of a first signal chain, the first signal chain comprising a signal generator and a first component simulator for modeling the electronic component, the indication of the first signal chain being received from the user application and via the user interface;
accessing a configuration data set for the first signal chain, wherein the configuration data set comprises:
parameter data describing a set of parameters for the signal generator and the first component simulator;
binding data describing a relationship between a first parameter of the parameter set and a model input parameter for a first model of the electronic component, the binding data further describing a relationship between the first parameter of the parameter set and a model input parameter for a second model of the electronic component, the second model being different from the first model; and
simulator data describing at least the first model for the electronic component; and
executing an implementation of the signal generator to generate a stimulus signal;
selecting the first model for evaluation of the electronic component;
determining values of the model input parameters of the first model using the parameter data and the binding data at least in part by performing a transformation on the parameter data, the transformation being based on a relationship between the first parameter of the parameter set and the model input parameters of the first model described by the binding data;
evaluating, using the first component simulator, the electronic component at least in part by executing the first model for the electronic component to generate an electronic component output, the evaluating based at least in part on the stimulus signal and the values of the model input parameters for the first model;
executing at least one data analyzer to generate first signal chain result data using the electronic component output; and
providing the first signal chain result data to the user computing device through the user interface.
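The structure of claim 1 — a signal generator feeding a component simulator whose output feeds a data analyzer — can be sketched as a toy pipeline. All three stages below are illustrative stand-ins, not the patented models:

```python
# Toy end-to-end signal chain mirroring the structure of claim 1.

def signal_generator(n):
    return [i % 4 for i in range(n)]     # trivial stand-in stimulus signal

def component_simulator(stimulus, gain):
    return [gain * s for s in stimulus]  # stand-in "first model" of the component

def data_analyzer(output):
    # stand-in analyzer producing result data from the component output
    return {"peak": max(output), "mean": sum(output) / len(output)}

stimulus = signal_generator(8)
result = data_analyzer(component_simulator(stimulus, gain=2))
```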
2. The system of claim 1, wherein the configuration data set further comprises first bundle data describing a first bundle comprising a subset of parameters of the parameter set, wherein the evaluation application is further configured to perform operations comprising:
determining that the first bundle is enabled; and
providing field description data to the user application, the field description data describing components of the user interface for receiving values of the subset of parameters.
3. The system of claim 1, wherein the first signal chain result data comprises plot data describing a model output parameter versus time.
4. The system of claim 1, wherein the first signal chain result data includes frequency content data describing frequency content of a model output parameter of the first model.
5. The system of any one of claims 1 to 4, wherein the evaluation application is further configured to perform operations comprising:
receiving a first value for the first parameter from the user application and via the user interface;
generating a first value for the model input parameter based at least in part on the binding data and the first value for the first parameter;
initiating the first model for the electronic component based at least in part on a first value for the model input parameter; and
providing evaluation output data to the user application for display at the user interface, wherein the evaluation output data is based at least in part on the output of the first model.
6. The system of claim 5, wherein launching the first model comprises executing an executable model.
7. The system of claim 5, wherein launching the first model comprises launching a physical model circuit in communication with the server computing device.
8. The system of claim 5, wherein the evaluation application is further configured to perform operations comprising:
receiving a data stream describing model output parameters of the first model;
selecting a set of frequency spurs of the model output parameter based at least in part on a first value for the first parameter; and
determining power for a first frequency spur in the set of frequency spurs, wherein the evaluation output data includes power data describing power for the first frequency spur.
9. The system of claim 8, wherein the electronic component is a converter; and
wherein the evaluation application is further configured to perform operations comprising:
receiving stimulus signal data describing a stimulus signal for the first model; and
determining the first frequency spur based at least in part on a fundamental frequency and a pre-conversion offset of the stimulus signal.
10. The system of claim 8 or 9, wherein the electronic component is a converter; and
wherein the evaluation application is further configured to perform operations comprising:
receiving stimulus signal data describing a stimulus signal for the first model;
determining the first frequency spur based at least in part on a fundamental frequency of the stimulus signal; and
determining a second frequency spur of the set of frequency spurs based at least in part on the fundamental frequency and a multiplier of the stimulus signal.
11. The system of claim 8 or 9, wherein the first frequency spur is located in a first Nyquist zone; and
wherein the evaluation application is further configured to perform operations comprising:
determining a second frequency spur of the set of frequency spurs, wherein the second frequency spur corresponds to the first frequency spur shifted to a second Nyquist zone;
determining that a power of the second frequency spur is greater than a noise floor by at least a threshold amount; and
writing a description of the second frequency spur to the evaluation output data.
12. The system of claim 11, wherein the first frequency spur is a fundamental frequency of the stimulus signal, wherein a second spur is a harmonic of the fundamental frequency; and
wherein the evaluation application is further configured to perform operations comprising:
applying an offset to the first frequency spur based at least in part on the first value for the first parameter to generate a first offset spur;
determining that the first offset spur is greater than the second spur; and
writing low-frequency warning data to the evaluation output data.
13. The system of claim 5, wherein the evaluation application is further configured to perform operations comprising:
receiving a data stream describing model output parameters of the first model;
determining that at least a threshold number of values of the model output parameter described by the data stream correspond to a full-scale value of the electronic component; and
writing clip warning data to the evaluation output data.
14. The system of claim 5, wherein the electronic component is a converter, and wherein the first value for the first parameter is indicative of a tuning word for a noise shaping feature of the converter; and
wherein the evaluation application is further configured to perform operations comprising:
determining an ideal frequency band of the electronic component;
determining that the tuning word is outside the ideal frequency band; and
writing incorrect-noise-shaping data to the evaluation output data.
15. A method of evaluating an electronic component, the method comprising:
providing, by a server computing device, an evaluation user interface for a user application executing on a user computing device;
receiving, by the server computing device, an indication of a first signal chain, the first signal chain including a signal generator and a first component simulator for modeling the electronic component, the indication of the first signal chain received from the user application and via the user interface;
accessing, by the server computing device, a configuration data set for the first signal chain, wherein the configuration data set comprises:
parameter data describing a set of parameters for the signal generator and the first component simulator;
binding data describing a relationship between a first parameter of the parameter set and a model input parameter for a first model of the electronic component, the binding data further describing a relationship between the first parameter of the parameter set and a model input parameter for a second model of the electronic component, the second model being different from the first model; and
simulator data describing at least the first model for the electronic component;
executing an implementation of the signal generator to generate a stimulus signal;
selecting the first model for evaluation of the electronic component;
determining values of the model input parameters for the first model using the parameter data and the binding data at least in part by performing a transformation on the parameter data, the transformation being based on a relationship between the first parameter of the parameter set and the model input parameters of the first model described by the binding data;
evaluating, by the server computing device and using the first component simulator, the electronic component at least in part by executing the first model for the electronic component to generate an electronic component output, the evaluating based at least in part on the stimulus signal and the values of the model input parameters for the first model;
executing at least one data analyzer to generate first signal chain result data using the electronic component output; and
providing the first signal chain result data to the user computing device through the user interface.
16. The method of claim 15, further comprising:
receiving, by the server computing device, a first value for the first parameter from the user application and via the user interface;
generating, by the server computing device, a first value for the model input parameter based at least in part on the binding data and the first value for the first parameter;
initiating, by the server computing device, the first model for the electronic component based at least in part on a first value for the model input parameter; and
providing, by the server computing device, evaluation output data to the user application for display at the user interface, wherein the evaluation output data is based at least in part on an output of the first model.
17. The method of claim 15 or 16, further comprising:
receiving, by the server computing device, a data stream describing first model output parameters of the first model;
selecting, by the server computing device, a set of frequency spurs for the first model output parameter based at least in part on a first value for the first parameter; and
determining, by the server computing device, power for a first frequency spur of the set of frequency spurs, wherein the evaluation output data includes power data describing power for the first frequency spur.
18. A machine-readable medium comprising instructions that, when executed by a computing device, cause the computing device to perform operations comprising:
providing a user interface for a user application executing on a user computing device;
receiving an indication of a first signal chain, the first signal chain comprising a signal generator and a first component simulator for modeling an electronic component, the indication of the first signal chain received from the user application and via the user interface;
accessing a configuration data set for the first signal chain, wherein the configuration data set comprises:
parameter data describing a set of parameters for the signal generator and the first component simulator;
binding data describing a relationship between a first parameter of the parameter set and a model input parameter for a first model of the electronic component, the binding data further describing a relationship between the first parameter of the parameter set and a model input parameter for a second model of the electronic component, the second model being different from the first model; and
simulator data describing at least the first model for the electronic component; and
executing an implementation of the signal generator to generate a stimulus signal;
selecting the first model for evaluation of the electronic component;
determining values of the model input parameters for the first model using the parameter data and the binding data at least in part by performing a transformation on the parameter data, the transformation being based on a relationship between the first parameter of the parameter set and the model input parameters of the first model described by the binding data;
using the first component simulator, evaluating the electronic component at least in part by executing the first model for the electronic component to generate an electronic component output, the evaluating based at least in part on the stimulus signal and the values of the model input parameters for the first model;
executing at least one data analyzer to generate first signal chain result data using the electronic component output; and
providing the first signal chain result data to the user computing device through the user interface.
19. The medium of claim 18, further comprising instructions thereon that, when executed by the computing device, cause the computing device to perform operations comprising:
receiving a first value for the first parameter from the user application and via the user interface;
generating a first value for the model input parameter based at least in part on the binding data and the first value for the first parameter;
initiating the first model for the electronic component based at least in part on a first value for the model input parameter; and
providing evaluation output data to the user application for display at the user interface, wherein the evaluation output data is based at least in part on the output of the first model.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662315380P | 2016-03-30 | 2016-03-30 | |
US62/315,380 | 2016-03-30 | ||
US15/439,822 | 2017-02-22 | ||
US15/439,822 US10871382B2 (en) | 2016-02-22 | 2017-02-22 | Measurement-driven electronic device design environment and techniques |
PCT/US2017/025141 WO2017173143A1 (en) | 2016-03-30 | 2017-03-30 | Method and system for evalulaton of electronic components |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109074411A CN109074411A (en) | 2018-12-21 |
CN109074411B true CN109074411B (en) | 2023-10-24 |
Family
ID=59965181
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201780021918.6A Active CN109074411B (en) | 2016-03-30 | 2017-03-30 | Method and system for evaluating electronic components |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3436991A1 (en) |
CN (1) | CN109074411B (en) |
WO (1) | WO2017173143A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11599697B2 (en) | 2016-02-22 | 2023-03-07 | Analog Devices, Inc. | Virtual evaluation for circuits and systems |
EP3420421B1 (en) | 2016-02-22 | 2020-06-24 | Analog Devices Global Unlimited Company | System and method for configuring a measurement module |
CN113011120A (en) * | 2021-03-04 | 2021-06-22 | 北京润尼尔网络科技有限公司 | Electronic circuit simulation system, method and machine-readable storage medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6530065B1 (en) * | 2000-03-14 | 2003-03-04 | Transim Technology Corporation | Client-server simulator, such as an electrical circuit simulator provided by a web server over the internet |
US6889172B2 (en) * | 2001-08-15 | 2005-05-03 | National Instruments Corporation | Network-based system for configuring a measurement system using software programs generated based on a user specification |
US7043393B2 (en) * | 2001-08-15 | 2006-05-09 | National Instruments Corporation | System and method for online specification of measurement hardware |
CN100399849C (en) * | 2005-03-31 | 2008-07-02 | 联想(北京)有限公司 | Multi-mode coexistence method for multi mode communication equipment |
US8818305B1 (en) * | 2012-11-14 | 2014-08-26 | Motorola Mobility Llc | Supply transitions in an envelope tracked power amplifier |
2017
- 2017-03-30 CN CN201780021918.6A patent/CN109074411B/en active Active
- 2017-03-30 WO PCT/US2017/025141 patent/WO2017173143A1/en active Application Filing
- 2017-03-30 EP EP17717598.1A patent/EP3436991A1/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
WO2017173143A1 (en) | 2017-10-05 |
EP3436991A1 (en) | 2019-02-06 |
CN109074411A (en) | 2018-12-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11599697B2 (en) | Virtual evaluation for circuits and systems | |
JP6861738B2 (en) | Systems and methods for configuring measurement modules | |
CN109074411B (en) | Method and system for evaluating electronic components | |
US10656907B2 (en) | Translation of natural language into user interface actions | |
US9612946B2 (en) | Using linked data to determine package quality | |
US20130125092A1 (en) | Generating deployable code from simulation models | |
US9122801B2 (en) | Test planning with order coverage requirements | |
CN114662427B (en) | Debugging method and device for logic system design | |
CN112560401A (en) | Verilog file conversion method, device, storage medium and equipment | |
CN111104123A (en) | Automatic deployment of applications | |
CN113742221A (en) | Method for generating test case, electronic device and storage medium | |
CN102779045B (en) | A kind of interface creating method, interface editor and electronic equipment | |
US9846571B1 (en) | Utilizing clock rate pipelining to generate code for multi-rate systems | |
CN116684120A (en) | Block chain-based data credibility verification method, device, equipment and storage medium | |
CN116956797A (en) | Power integrity simulation method and device and electronic equipment | |
CN108153665A (en) | Data test method and device | |
Brada et al. | Ensuring component application consistency on small devices: A repository-based approach | |
US8984342B2 (en) | Unit testing and analysis using a stored reference signal | |
Šaliga et al. | ADC test library in LabVIEW | |
US20170123959A1 (en) | Optimized instrumentation based on functional coverage | |
US8782618B1 (en) | Instrument based processing | |
Pricop | On the Implementation of a Wireless Sensor Node Based on NodeMCU Development Platform. | |
US11990949B2 (en) | Radio frequency signal integrity verification | |
Zhu et al. | High‐order ridge reconstruction for more accurate signal estimate from time‐frequency representations | |
US11237940B2 (en) | Determining dynamic system performance from a static behavior model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||