CN113127276A - Product testing method, device and system - Google Patents

Product testing method, device and system

Info

Publication number
CN113127276A
CN113127276A (application number CN202110259174.XA)
Authority
CN
China
Prior art keywords
product
detection
interface
detection data
data
Prior art date
Legal status (the status listed is an assumption, not a legal conclusion)
Pending
Application number
CN202110259174.XA
Other languages
Chinese (zh)
Inventor
狄素素
王德信
付晖
王见荣
王晓强
Current Assignee (the listed assignee may be inaccurate)
Qingdao Goertek Intelligent Sensor Co Ltd
Original Assignee
Qingdao Goertek Intelligent Sensor Co Ltd
Priority date (the date listed is an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Qingdao Goertek Intelligent Sensor Co Ltd filed Critical Qingdao Goertek Intelligent Sensor Co Ltd
Priority to CN202110259174.XA
Publication of CN113127276A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00 Error detection; Error correction; Monitoring
    • G06F 11/22 Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing
    • G06F 11/2205 Detection or location of defective computer hardware by testing during standby operation or during idle time, using arrangements specific to the hardware being tested
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B 5/02438 Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means specially adapted to be attached to or worn on the body surface
    • A61B 5/6802 Sensor mounted on worn items
    • A61B 5/681 Wristwatch-type devices
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements


Abstract

The present disclosure relates to a product testing method, device, and system. The product testing method includes: while a tested product is in a detection state, obtaining a plurality of interface images of a detection interface of the product captured by a camera, wherein the detection interface is within an image acquisition area of the camera and the interface images have different acquisition time points; obtaining, from the plurality of interface images, detection data of the product at the different acquisition time points; and processing the detection data according to a set data processing rule to obtain a test result of the product.

Description

Product testing method, device and system
Technical Field
The disclosed embodiments relate to the field of computer technologies, and in particular, to a method, an apparatus, and a system for product testing.
Background
Existing products are usually provided with several algorithms for implementing different functions. For example, a smart bracelet product may carry an algorithm for measuring a human heart rate. When the user triggers the bracelet's heart rate detection function, the bracelet uses the corresponding internal algorithm to detect the user's heart rate value.
In order to ensure the product quality, for example, to ensure the detection accuracy of the algorithm inside the product, the product needs to be tested.
Disclosure of Invention
An object of the disclosed embodiment is to provide a technical solution for product testing.
According to a first aspect of the present disclosure, there is provided a product testing method comprising: while a tested product is in a detection state, obtaining a plurality of interface images of a detection interface of the product captured by a camera, wherein the detection interface is within an image acquisition area of the camera and the interface images have different acquisition time points; obtaining, from the plurality of interface images, detection data of the product at the different acquisition time points; and processing the detection data according to a set data processing rule to obtain a test result of the product.
Optionally, the processing the detection data according to the set data processing rule includes: dividing the product's detection data at the different acquisition time points into groups based on a set target number, obtaining at least two groups of detection data, where the number of detection data in each group equals the target number; calculating an error value for each detection data in each group; obtaining the maximum error value and the minimum error value of each group; and calculating the average of the obtained maximum and minimum error values as the average error value of the product; the test result of the product includes the average error value of the product.
Optionally, the method further comprises: obtaining standard data of the product at the different acquisition time points, wherein the standard data are obtained by detecting a target object with a standard detection device under the same test conditions, the target object being the object detected by the product in the detection state;
the calculating an error value of each detection data in each group of detection data includes: matching the detection data and the standard data at the same time point; and calculating the error value of each detection data in each group according to the standard data corresponding to that detection data.
Optionally, the number of the products is N, and N is an integer not less than 2;
the method further comprises: comparing the average error values of the products to obtain a comparison result; and displaying, through a user interface, at least one of the detection data of each product at the different acquisition time points, the average error value of each product, and the comparison result.
Optionally, the number of the products is N, and N is an integer not less than 2;
the method further comprises: calculating a cross-correlation coefficient of any two of the products according to the detection data of those two products at the different acquisition time points; and displaying the cross-correlation coefficient through a user interface.
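As an illustrative sketch of the cross-correlation computation described above (assuming the Pearson correlation coefficient is intended; the disclosure does not specify the exact formula), the coefficient of any two products' detection data series can be computed with NumPy:

```python
import numpy as np

def cross_correlation(data_a, data_b):
    """Pearson cross-correlation coefficient of two products'
    detection data series sampled at the same time points."""
    a = np.asarray(data_a, dtype=float)
    b = np.asarray(data_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])

# Example: heart rate series of two bracelets over the same sampling times
hr_a = [72, 75, 78, 80, 77, 74]
hr_b = [70, 74, 77, 81, 76, 73]
print(round(cross_correlation(hr_a, hr_b), 3))
```

A coefficient close to 1 would indicate that the two products' readings track each other closely over the test period.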
Optionally, the number of the products is N, and N is an integer not less than 2;
the N detection interfaces of the product are all in the image acquisition area of the camera;
detecting the same detected object by the N products in the detection state;
the N products comprise a first type of product and/or a second type of product.
Optionally, the obtaining, from the plurality of interface images, detection data of the product at the different acquisition time points includes: for each of the plurality of interface images, identifying a numerical value within a set identification area of the interface image; and taking the identified numerical value as the detection data of the product at the corresponding acquisition time point, thereby obtaining the detection data of the product at the different acquisition time points.
Optionally, the acquiring a plurality of interface images of the detection interface of the product captured by the camera includes: controlling the camera, according to a set time interval value, to periodically capture interface images of the detection interface of the product, and obtaining the plurality of captured interface images; or controlling the camera, according to a set time interval, to record a video clip of the detection interface of the product during that interval, and sampling a plurality of interface images of the detection interface from the video clip according to a set sampling period.
According to a second aspect of the present disclosure, there is also provided a product testing apparatus, comprising: a camera; the image acquisition module is used for acquiring a plurality of interface images of a detection interface of a product, which are acquired by the camera under the condition that the tested product is in a detection state, wherein the detection interface is in an image acquisition area of the camera, and the interface images have different acquisition time points; the image processing module is used for acquiring each detection data of the product at different acquisition time points according to the plurality of interface images; and the data processing module is used for processing each detection data according to a set data processing rule to obtain a test result of the product.
According to a third aspect of the present disclosure, there is also provided a product testing apparatus comprising a memory for storing a computer program and a processor; the processor is adapted to execute the computer program to implement the method according to the first aspect of the present disclosure.
According to a fourth aspect of the present disclosure, there is also provided a product testing system, comprising: a product to be tested and a product testing device according to the second or third aspect of the present disclosure;
wherein the product, when in a detection state, detects the detected object using a built-in algorithm to obtain detection data, and displays the detection data on a detection interface of the product.
According to a fifth aspect of the present disclosure, there is also provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to the first aspect of the present disclosure.
The embodiments use a camera to collect interface images of the detection interface of the tested product at different time points, process the collected images with image processing techniques to obtain the product's detection data, and then process the detection data to obtain the product's test result. Based on the test result, the detection accuracy of the product's internal algorithm can be determined, so that the quality of the product can be guaranteed.
Other features of embodiments of the present disclosure and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which is to be read in connection with the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the embodiments of the disclosure.
FIG. 1 is a schematic diagram of an implementation environment in which a product testing method according to one embodiment can be applied and a system component structure in which the method can be implemented;
FIG. 2 is a schematic flow diagram of a method of product testing according to one embodiment;
FIG. 3 is a schematic view of a detection interface of a product according to one embodiment;
FIG. 4 is a schematic diagram of an interface image according to one embodiment;
FIG. 5 is a schematic flow diagram of a method of product testing according to another embodiment;
FIG. 6 is a block schematic diagram of a product testing apparatus according to one embodiment;
FIG. 7 is a schematic diagram of a hardware configuration of a product testing apparatus according to one embodiment;
FIG. 8 is a block schematic diagram of a product testing system according to one embodiment.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
One application scenario of the embodiment of the present disclosure is to test a product, and the product can utilize a built-in algorithm to perform corresponding detection on a detected object in a detection state, and display a detected result in real time on a detection interface.
In the course of implementation, the inventors found that when testing a product, the result detected by the product often cannot be obtained, or cannot be obtained conveniently. For example, one alternative approach is to manually read and record the detection results displayed by the product in real time, and then enter the recorded results into a processing device so that the device can test the product against them. This makes the testing operation cumbersome; records may not be timely and manual recording is error-prone, which degrades the test effect and the user experience. As another example, the user may establish a communication connection between a user terminal and the product in advance so that the product's detection results are transmitted to the terminal for convenient viewing. This still has drawbacks: the user must perform the connection setup, and the user terminal generally cannot directly test the product.
In view of the above technical problems, the inventors propose a product testing method in which a camera collects interface images of the detection interface of a tested product at different time points, the collected images are processed by image processing techniques to obtain detection data of the product, and the obtained detection data are further processed to obtain a test result of the product. Based on the test result, the detection accuracy of the product's internal algorithm can be determined, so that the quality of the product can be guaranteed.
< implementation Environment and hardware configuration >
Fig. 1 is a schematic diagram of the composition of a product testing system 100 to which a product testing method according to an embodiment can be applied. As shown in fig. 1, the product testing system 100 includes a product testing device 1000, a product 2000 to be tested, and an object 3000 to be detected, and can be applied in product testing scenarios. The product 2000 detects the object 3000, and the product testing device 1000 tests the product 2000.
In this embodiment, the product testing apparatus 1000 is, for example, a desktop computer, a mobile phone, a portable computer, a tablet computer, a palm computer, or the like. As shown in fig. 1, the product testing device 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, an output device 1500, an input device 1600, a camera 1700, and the like.
The processor 1100 may be a central processing unit CPU, a microprocessor MCU, or the like, for executing a computer program, which may be written in an instruction set of architectures such as x86, Arm, RISC, MIPS, SSE, or the like. The memory 1200 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 1300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 1400 is capable of wired or wireless communication, for example, the communication device 1400 may include at least one short-range communication module, such as any module for performing short-range wireless communication based on short-range wireless communication protocols, such as the Hilink protocol, WiFi (IEEE 802.11 protocol), Mesh, bluetooth, ZigBee, Thread, Z-Wave, NFC, UWB, LiFi, and the like, and the communication device 1400 may also include a long-range communication module, such as any module for performing WLAN, GPRS, 2G/3G/4G/5G long-range communication. The output device 1500 may be, for example, a device that outputs a signal, may be a display device such as a liquid crystal display, an LED display, a touch display, or the like, or may be a speaker or the like that outputs voice information or the like. The input device 1600 may include, for example, a touch screen, a keyboard, and the like. The display device 1800 is, for example, a liquid crystal display panel, a touch panel, or the like.
As applied to the disclosed embodiments, the memory 1200 of the product testing device 1000 is used to store a computer program for controlling the processor 1100 of the product testing device 1000 to operate, implementing a product testing method according to any of the embodiments. A skilled person can design a computer program according to the solution of the embodiments of the present disclosure. How the computer program controls the processor to operate is well known in the art and will not be described in detail here.
Although fig. 1 shows a number of components of the product testing device 1000, the present disclosure may involve only some of them; for example, the product testing device 1000 may involve only the memory 1200, the processor 1100, and the camera 1700.
In this embodiment, the product 2000 is, for example, a smart bracelet, a smart watch, a smart phone, or another object detection device. As shown in fig. 1, the product 2000 may include a processor 2100, a memory 2200, an interface device 2300, a communication device 2400, a display device 2500, an input device 2600, and so on.
The processor 2100 may be a central processing unit CPU, a microprocessor MCU, or the like, for executing a computer program, which may be written in an instruction set of architectures such as x86, Arm, RISC, MIPS, SSE, or the like. The memory 2200 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 2300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 2400 can perform wired or wireless communication, for example, the communication device 2400 may include at least one short-range communication module, for example, any module that performs short-range wireless communication based on a short-range wireless communication protocol such as a Hilink protocol, WiFi (IEEE 802.11 protocol), Mesh, bluetooth, ZigBee, Thread, Z-Wave, NFC, UWB, LiFi, and the like, and the communication device 2400 may also include a long-range communication module, for example, any module that performs WLAN, GPRS, 2G/3G/4G/5G long-range communication. The display device 2500 is, for example, a liquid crystal display, an LED display, a touch display, or the like. The input device 2600 may include, for example, a touch screen, a keyboard, and the like.
As applied to embodiments of the present disclosure, the memory 2200 of the product 2000 is configured to store a computer program configured to control the processor 2100 of the product 2000 to operate, providing support for implementing a product testing method according to any of the embodiments. A skilled person can design a computer program according to the solution of the embodiments of the present disclosure. How the computer program controls the processor to operate is well known in the art and will not be described in detail here.
Although fig. 1 shows a number of components of the product 2000, the present disclosure may involve only some of them; for example, the product 2000 may involve only the memory 2200, the processor 2100, and the display device 2500.
In this embodiment, the detected object 3000 corresponds to the product 2000 and is a detection object of the product 2000. For example, the product 2000 is a smart band, and when the detection index is a human heart rate, the detected object 3000 may be a human body. Of course, the detected object 3000 may also be other objects, such as hardware of an electronic device.
It should be understood that although fig. 1 shows only one product testing device 1000 and product 2000, it is not meant to limit the respective numbers, and the product testing system 100 may contain a plurality of product testing devices 1000, a plurality of products 2000.
The product testing system 100 shown in FIG. 1 is illustrative only and is not intended to limit the invention, its application, or uses in any way.
Various embodiments and examples according to the present invention are described below with reference to the accompanying drawings.
< method examples >
FIG. 2 is a flow diagram of a product testing method according to one embodiment. The execution body of this embodiment is, for example, the product testing apparatus 1000 in fig. 1.
As shown in fig. 2, the product testing method of the present embodiment may include the following steps S201 to S203:
step S201, under the condition that a tested product is in a detection state, acquiring a plurality of interface images of a detection interface of the product, wherein the detection interface is acquired by a camera, the detection interface is in an image acquisition area of the camera, and the interface images have different acquisition time points.
In detail, several algorithms may be provided in the tested product, and based on the implementation of different algorithms, the product may have different detection functions. And detecting the detected object by using a built-in algorithm to obtain detection data when the product is in a detection state, and displaying the detection data on a detection interface of the product.
For example, an algorithm for measuring human heart rate can be built into a smart bracelet product. When the user triggers the bracelet's heart rate detection function, the bracelet enters the heart rate detection state and uses its preset internal heart rate detection algorithm to measure the wearer's heart rate value. The wearing of the smart bracelet can be as shown in fig. 1 or fig. 4, and the detection interface of the bracelet in the heart rate detection state can be as shown in fig. 1, fig. 3 or fig. 4.
Therefore, in this step, while the product is in the detection state, the camera can be used to capture interface images of the product's detection interface, and the captured images can then be obtained. Of course, to enable image capture, the detection interface of the product should be within the image acquisition area of the camera.
Taking heart rate testing with a smart bracelet as an example: since heart rate detection usually measures the change of the user's heart rate over a period of time, this step typically acquires multiple interface images with different acquisition time points, reflecting the user's heart rate changes over the corresponding period.
In this step, there are at least two ways to obtain the plurality of interface images: one is for the camera to capture interface images periodically; the other is for the camera to record an interface video over a period of time, from which interface images at the required time points are then extracted.
Based on this, for the implementation manner of periodically acquiring the interface images, in an embodiment of the present disclosure, in step S201, the acquiring a plurality of interface images of the detection interface of the product acquired by the camera includes: and controlling the camera according to a set time interval value, periodically acquiring interface images of a detection interface of the product, and acquiring a plurality of interface images acquired by the camera.
In detail, the time interval value may be set according to the product's detection behavior. For example, it may be an integral multiple of the product's detection interval, which ensures regular value acquisition, ensures that data acquired at different sampling time points come from different detection passes, and improves the data processing effect. For example, if the smart bracelet detects the heart rate every 1 s, the time interval value can be set to 1 s, 2 s, and so on.
Of course, for valid data acquisition, the moment at which the camera captures an interface image should usually fall between two adjacent detection time points of the product.
In addition, as to the implementation manner of the above acquiring the interface videos, in another embodiment of the present disclosure, in step S201, the acquiring a plurality of interface images of the detection interface of the product acquired by a camera includes: and controlling the camera according to a set time interval, acquiring a video clip of the detection interface of the product in the time interval, and sampling a plurality of interface images of the detection interface of the product from the video clip according to a set sampling period.
In detail, this time interval can be set as needed. For example, when using the smart bracelet to detect a person's heart rate, the interval's duration can be 1 min, and the interval should lie within the time range corresponding to the bracelet's heart rate detection process.
In detail, based on the same implementation principle, the sampling period may be set according to the detection situation of the product. For example, the sampling period may be an integer multiple of the detection interval value.
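The video-sampling variant described above can be sketched as follows: given the camera's frame rate, the length of the time interval, and the sampling period, the indices of the frames to extract from the video clip follow from simple arithmetic. This is an illustrative sketch; the frame rate and durations are assumed example values, not taken from the disclosure:

```python
def sampled_frame_indices(fps, clip_seconds, sampling_period_s):
    """Indices of the video frames to extract so that one interface
    image is taken every `sampling_period_s` seconds."""
    step = int(round(fps * sampling_period_s))  # frames between samples
    total = int(fps * clip_seconds)             # frames in the clip
    return list(range(0, total, step))

# 30 fps camera, 60 s clip, one image per 1 s detection interval
idx = sampled_frame_indices(30, 60, 1.0)
print(len(idx), idx[:3])  # 60 sampled images: frames 0, 30, 60, ...
```

In practice the frames themselves could be read with a video library such as OpenCV, decoding only the frames at these indices.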
Step S202, obtaining each detection data of the product at different acquisition time points according to the plurality of interface images obtained in step S201.
In detail, each interface image may be processed by an image processing technique to obtain corresponding detection data.
Specifically, in order to improve the recognition efficiency and the recognition accuracy, a recognition area may be preset, so as to recognize the recognition area of each interface image.
Based on this, in order to illustrate one possible implementation manner of identifying the interface image according to the identification area, in an embodiment of the present disclosure, the step S202, obtaining the respective detection data of the product at different acquisition time points according to the plurality of interface images, includes the following steps S2021-S2022:
step S2021, identifying, for each interface image of the plurality of interface images, a numerical value in the identification area of the interface image based on the set identification area.
Specifically, the identification area may be an area where a display position of the detection data in the interface image is located. For example, the interface image of the smart band may be as shown in fig. 3, and the identification area may be an area within a curve box as shown in fig. 3.
Step S2022, using the identified numerical value as the detection data of the product at the corresponding acquisition time point, to obtain each detection data of the product at different acquisition time points.
By recognizing the interface image shown in fig. 3, the detection data "75" can be recognized, meaning the heart rate of the smart bracelet wearer at the corresponding sampling time point is 75 beats per minute.
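A minimal sketch of steps S2021-S2022: crop the preset identification area from an interface image, then parse the recognized text into a numeric detection value. The actual digit recognition (OCR) step is assumed and not shown here; `crop_recognition_area` and `parse_detection_value` are hypothetical helper names, not from the disclosure:

```python
import numpy as np

def crop_recognition_area(interface_image, top, left, height, width):
    """Cut out the preset identification area (where the detection
    value is displayed) from an interface image given as an H x W array."""
    return interface_image[top:top + height, left:left + width]

def parse_detection_value(recognized_text):
    """Turn OCR output such as ' 75 ' into an integer heart rate value.
    The OCR step itself (e.g. a digit recognizer) is assumed."""
    digits = "".join(ch for ch in recognized_text if ch.isdigit())
    return int(digits) if digits else None

frame = np.zeros((240, 240), dtype=np.uint8)   # stand-in interface image
region = crop_recognition_area(frame, 100, 80, 40, 80)
print(region.shape, parse_detection_value(" 75 "))
```

Restricting recognition to the cropped region, as the description notes, keeps the recognizer from being confused by other interface elements and speeds up processing.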
Step S203, processing the detection data obtained in step S202 according to a set data processing rule to obtain a test result of the product.
Since the obtained detection data directly reflect the detection situation, in one embodiment of the present disclosure the detection data can be displayed directly, so that testers obtain the product test result by viewing them; that is, the product test result may include the detection data obtained in step S202. Direct display of the detection data also makes it convenient to obtain the detected object's results at different time points.
In addition, because the amount of detection data is usually large and it is inconvenient for testers to quickly grasp the test situation from raw values, in this step the obtained detection data can instead be processed to produce the product test result; that is, the product test result may include the data processing result.
As can be seen from the above, according to the product testing method provided by the embodiment of the present disclosure, the camera can be used to collect interface images of the detection interface of the tested product at different time points, the collected interface images are processed by image processing techniques to obtain the detection data of the product, and the obtained detection data is then processed to obtain the test result of the product. Based on the test result, the detection accuracy of the product's internal algorithm can be determined, and the product quality can be guaranteed.
Based on the above, in order to illustrate one possible implementation manner of processing the detection data, in an embodiment of the present disclosure, the processing the respective detection data according to the set data processing rule in step S203 includes the following steps S2031 to S2034:
step S2031, based on the set number of targets, dividing each detection data of the product at different acquisition time points to obtain at least two groups of detection data, wherein the number of the detection data in any group of detection data is equal to the number of the targets.
In detail, the target number may be set as needed, and may be, for example, 5, 10, or the like.
Taking the smart band product as an example, assuming that the time interval value is 1 s, 60 interface images are collected in total, and 60 heart rate values can then be obtained for each smart band product. If the target number is 5, the 60 heart rate values may be equally divided into 12 groups.
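The grouping in step S2031 can be sketched as a simple chunking of the time-ordered readings, where `group_size` plays the role of the set target number (function and variable names are illustrative, not from the disclosure):

```python
def group_readings(readings, group_size):
    """Split time-ordered readings into consecutive groups of group_size.

    A trailing partial group (fewer than group_size readings) is dropped,
    so every returned group has exactly group_size elements, matching the
    requirement that each group's size equals the target number.
    """
    usable = len(readings) // group_size * group_size
    return [readings[i:i + group_size] for i in range(0, usable, group_size)]
```

With 60 heart rate values and a target number of 5, this yields the 12 groups described above.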
Step S2032, an error value of each detection data in each set of detection data is calculated.
In detail, an error value of each detection data may be calculated based on the corresponding standard data, and a product test condition may be reflected based on the error value.
Step S2033, a maximum error value and a minimum error value corresponding to each group of detection data are obtained.
Following the above example, there are currently 12 groups of heart rate values, each group comprising 5 heart rate values and hence 5 heart rate error values; the largest and smallest of the 5 error values in each group are obtained. Thus, for each smart band product, there are 12 maximum heart rate error values and 12 minimum heart rate error values.
Step S2034, calculating an average value for each of the obtained maximum error values and each of the obtained minimum error values as an average error value of the product; the test result of the product includes an average error value of the product.
Based on the above, an average of 12 maximum heart rate error values and 12 minimum heart rate error values may be calculated as an average heart rate error value of the tested smart band.
Alternatively, for each group of heart rate values, an average may be calculated from the maximum and minimum error values corresponding to that group, yielding one average per group (12 averages in total); the average of these 12 averages is then taken as the average error value of the product.
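Steps S2032 to S2034 (per-reading error, per-group extremes, overall average) can be put together as follows. This is a sketch under the assumption, made explicit later in the disclosure, that the error of each reading is the absolute difference from the time-aligned standard reading; all names are illustrative.

```python
def average_error_value(detected, standard, group_size):
    """Average of the per-group maximum and minimum error values.

    detected/standard: time-aligned reading sequences.
    group_size: the set target number (readings per group).
    """
    # Step S2032 (assumed form): error = |detected - standard| per time point.
    errors = [abs(d - s) for d, s in zip(detected, standard)]
    # Step S2031: split errors into consecutive groups of group_size.
    usable = len(errors) // group_size * group_size
    groups = [errors[i:i + group_size] for i in range(0, usable, group_size)]
    # Steps S2033-S2034: collect each group's max and min, then average.
    extremes = [max(g) for g in groups] + [min(g) for g in groups]
    return sum(extremes) / len(extremes)
```

For 60 readings and a target number of 5, this averages 12 maximum and 12 minimum error values, as in the smart band example.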
In this step, by processing each acquired detection data, a corresponding average error value can be obtained, and the data volume is significantly reduced, thereby facilitating storage, transmission, viewing and the like of the product test result. For example, by displaying the average error value in the product test result, the tester can quickly know the test condition of the currently tested product, and further quickly know the product quality condition.
As mentioned above, the error value may be calculated from the standard data, and based on this, in an embodiment of the present disclosure, the method further includes the following step S204:
step S204, obtaining each standard data of the product at different acquisition time points, wherein the standard data are obtained by detecting a target object by using a standard detection device under the same test condition, and the target object is an object detected by the product in the detection state.
In detail, the standard data detected by the standard detection device is a reference standard for reflecting the data accuracy of the detection data, and based on the comparison between the two types of data, the detection accuracy of the tested product can be obtained, the product quality can be reflected, and the algorithm precision of the corresponding algorithm in the product can be reflected.
In this embodiment, the standard data and the detection data are obtained under the same test conditions (for example, the test environments are the same, the standard detection device and the detection object of the product are the same, the data acquisition time is consistent, and the like), so as to ensure the calculation accuracy of the error value.
The standard detection device may be a standard instrument for measuring a person's heart rate, used to acquire standard heart rate values. The standard detection device and the two smart band products described above detect the heart rate of the same user simultaneously.
Based on the above, in step S2032, calculating an error value of each detection data in each set of detection data includes: corresponding the detection data and the standard data at the same time point; and calculating the error value of each detection data in each group of detection data according to the standard data corresponding to the detection data.
In general, the values obtained by detecting the same object at the same time point should be the same. The standard data serves as a reference for judging the detection data: if the detection data closely matches the corresponding standard data, the quality of the tested product can be considered good, or the algorithm precision of the corresponding algorithm in the tested product can be considered high. Conversely, if the detection data differs greatly from the corresponding standard data, the quality of the tested product can be considered poor, or the algorithm precision of the corresponding algorithm can be considered low.
In detail, the error value may be the absolute value of the difference between the product's heart rate value and the standard heart rate value. Thus, for the smart band product, 60 heart rate error values can be obtained.
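The pairing of detection data with standard data at the same time point, followed by the absolute-difference error, can be sketched with readings keyed by acquisition time point; only time points present in both series produce an error value. The dict-based representation is an assumption for illustration.

```python
def error_values_by_time(detected, standard):
    """Map each shared acquisition time point to |detected - standard|.

    detected/standard: dicts mapping acquisition time point -> reading.
    Time points missing from either series are skipped, since an error
    value requires both readings for the same time point.
    """
    return {t: abs(v - standard[t]) for t, v in detected.items() if t in standard}
```

With 60 shared time points, this yields the 60 heart rate error values described above.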
Based on the above, the product testing method provided by the embodiment of the disclosure can test one product. For example, referring to fig. 1 and fig. 3, the heart rate detection condition of the smart band worn on the wrist of the user can be tested.
In consideration of factors such as improving product testing efficiency and enabling comparative analysis between different products, the product testing method provided by the embodiment of the disclosure can test not only a single product but also a plurality of products simultaneously. For example, referring to fig. 4, the heart rate detection of two smart bands worn on the user's wrist can be tested. Of course, in other embodiments of the present disclosure, the two smart bands may be worn on different wrists of the same user.
Under the condition of simultaneously testing a plurality of products, the test result of each product can be obtained, and the test results of different products can be compared. Based on this, in one embodiment of the present disclosure, the number of the products is N, and N is an integer not less than 2; the method further comprises the following steps S205-S206:
step S205, comparing the average error value of each product to obtain a comparison result.
In detail, by comparing the average error values of the two products, a comparison result can be obtained. Generally, the smaller the average error value, the higher the detection accuracy of the product and the better the product quality. When the comparison result is displayed, the user can conveniently and quickly know the quality comparison conditions of different products.
Step S206, displaying at least one of each detection data, an average error value of each product, and the comparison result of each product at different acquisition time points through a User Interface (UI).
In the step, the obtained test result is displayed, so that the tester can conveniently check the test result. Based on the test result, the detection accuracy of the built-in detection algorithm of the product can be known, the product quality of the product can be known, and the method can be used in application scenes such as algorithm evaluation, competitive product analysis and the like.
As described above, in the case of simultaneously testing a plurality of products, not only can the average error values be compared for analysis between different products, but a cross-correlation coefficient can also be calculated for different products based on their detection data.
Based on this, in one embodiment of the present disclosure, the number of the products is N, and N is an integer not less than 2; the method further comprises the following steps S207-S208:
step S207, calculating the cross correlation coefficient of any two products according to the detection data of any two products at different acquisition time points.
Referring to fig. 4, when two smart bands are tested simultaneously, the cross-correlation coefficient of the two may be calculated from the 60 heart rate values of each. In general, the larger the cross-correlation coefficient, the closer the heart rate detection accuracies of the two smart bands, the closer their product quality, the higher their consistency, and the better the consistency of the products' internal algorithms.
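The disclosure does not fix the exact formula for the cross-correlation coefficient; a common choice for comparing two equal-length reading series for consistency is the zero-lag Pearson correlation coefficient, which the sketch below assumes.

```python
def cross_correlation(xs, ys):
    """Zero-lag Pearson correlation of two equal-length reading series.

    Returns a value in [-1, 1]; values near 1 indicate the two series
    move together closely. Assumes both series have nonzero variance.
    """
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)
```

Applied to the two 60-value heart rate series of fig. 4, a coefficient close to 1 would indicate high consistency between the two smart bands.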
In detail, the two products may be any two products in the same/different batches produced by the same manufacturer, and may also be products produced by different manufacturers. For two products of the same manufacturer, if the cross correlation coefficient of the two products is high, the stability of the products is good, and the consistency of the products is high.
Step S208, displaying the cross-correlation coefficient calculated in step S207 through a user interface. Displaying the cross-correlation coefficient makes it convenient for a tester to compare two products, for example, to learn the consistency between them.
Based on the above, in one embodiment of the present disclosure, the number of the products is N, where N is an integer not less than 2; the N detection interfaces of the product are all in the image acquisition area of the camera; detecting the same detected object by the N products in the detection state; the N products comprise a first type of product and/or a second type of product.
Positioning the detection interfaces of different products in the image acquisition area of the same camera can improve product testing efficiency, reduce the amount of data processing, and reduce the testing cost, among other benefits.
In this embodiment, among a plurality of products tested simultaneously, the types of products may be the same or different between different products. For example, the plurality of products may be a plurality of products in the same/different batches produced by the same manufacturer, and may be applied to scenarios such as product consistency analysis. For another example, the plurality of products may be products produced by different manufacturers, and may be applied to scenes such as competitive product analysis and the like.
In summary, based on the product testing method provided by this embodiment, a user can simply and quickly obtain the accuracy and consistency of the internal algorithm of the product, a large amount of manpower and material resources are saved, and information such as the accuracy and consistency of the algorithm is presented on a user interface, so that the testing result is clear at a glance, and the user can conveniently check the testing result.
Based on the above, fig. 5 is a flowchart illustrating a product testing method according to an embodiment, and the product testing method of the embodiment will now be described by taking the product testing system 100 shown in fig. 1 as an example.
As shown in fig. 5, the method of this embodiment may include steps S501 to S510 as follows:
step S501, under the condition that N tested products are in a detection state, the camera is controlled according to a set time interval value, interface images of detection interfaces of the products are periodically collected, and a plurality of interface images collected by the camera are obtained, wherein N is an integer not less than 2, the N products all detect the same detected object under the detection state, the detection interfaces of the N products are all in an image collection area of the camera, and the interface images have different collection time points.
In detail, the N said products comprise products of a first type and/or products of a second type.
For example, N may be 2; when the product is a smart band product, the two products may be worn on the wrist of the user as shown in fig. 4. The smart band product has a built-in algorithm for measuring the user's heart rate value. When the user triggers the heart rate detection function of the smart band product, the product enters the heart rate detection state and displays the detected heart rate value on the detection interface in real time.
Step S502, for each product in the N products, obtaining each detection data of the product at different acquisition time points according to the plurality of interface images, and executing step S503 and step S509.
For example, one of the interface images may be the image inside the dashed box in fig. 4. Based on this image, the heart rate values of the two smart band products at the corresponding acquisition time point can be obtained, for example 75 beats per minute for one and 76 beats per minute for the other.
Step S503, based on the set number of targets, dividing each detection data of the product at different acquisition time points to obtain at least two groups of detection data, wherein the number of the detection data in any group of detection data is equal to the number of the targets.
Assuming that the time interval value is 1 s, 60 interface images are collected in total, and 60 heart rate values can then be obtained for each smart band product. If the target number is 5, the 60 heart rate values may be equally divided into 12 groups.
Step S504, obtaining each standard data of the product at different collection time points, wherein the standard data are obtained by detecting a target object by using a standard detection device under the same test condition, and the target object is an object detected by the product in the detection state.
The standard detection device may be a standard instrument for measuring a person's heart rate, used to acquire standard heart rate values. The standard detection device and the two smart band products described above detect the heart rate of the same user simultaneously.
Step S505, the detection data at the same time point is correlated with the standard data, and an error value of each detection data in each set of detection data is calculated according to the standard data corresponding to the detection data.
The error value may be the absolute value of the difference between the product's heart rate value and the standard heart rate value. Thus, for each smart band product, 60 heart rate error values can be obtained.
In step S506, a maximum error value and a minimum error value corresponding to each set of detection data are obtained.
For each smart band product, there are 12 groups of heart rate values, each group comprising 5 heart rate values and hence 5 heart rate error values; the maximum and minimum of the 5 error values in each group are obtained. Thus, for each smart band product, there are 12 maximum heart rate error values and 12 minimum heart rate error values.
Step S507, calculating an average value for each of the obtained maximum error values and each of the obtained minimum error values, as an average error value of the product.
The average of the 12 maximum heart rate error values and the 12 minimum heart rate error values is calculated as the average heart rate error value of the smart band product.
Step S508, comparing the average error values of the products to obtain a comparison result, and performing step S510.
The average heart rate error values of the two smart band products are compared to obtain a comparison result. In general, the smaller the average heart rate error value, the higher the heart rate detection accuracy of the smart band product and the better the product quality.
Step S509, calculating a cross-correlation coefficient of any two products according to each detection data of any two products at different acquisition time points.
The cross-correlation coefficient of the two smart band products is calculated from the 60 heart rate values of each product. In general, the larger the cross-correlation coefficient, the closer the heart rate detection accuracies of the two smart band products and the closer their product quality.
Step S510, displaying a test result through a user interface, wherein the test result comprises each detection data of each product at different acquisition time points, an average error value of each product, the comparison result and the cross-correlation coefficient.
The obtained heart rate test results are displayed for the tester to check conveniently. Based on the heart rate test results, the detection accuracy of the heart rate detection algorithm built into the smart band product can be determined, the product quality can be assessed, and the method can be used in application scenarios such as algorithm evaluation and competitive product analysis.
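Putting steps S503 to S510 together for two products, a hedged end-to-end sketch follows. It assumes per-reading error is the absolute difference from a time-aligned standard series and that the cross-correlation coefficient is the zero-lag Pearson coefficient; all names and the result dictionary are illustrative, not from the disclosure.

```python
def run_heart_rate_test(data_a, data_b, standard, group_size=5):
    """Sketch of steps S503-S510 for two products A and B."""

    def avg_error(detected):
        # Steps S503-S507: per-reading error, grouping, per-group
        # max/min, and the average of those extremes.
        errs = [abs(d - s) for d, s in zip(detected, standard)]
        usable = len(errs) // group_size * group_size
        groups = [errs[i:i + group_size] for i in range(0, usable, group_size)]
        extremes = [max(g) for g in groups] + [min(g) for g in groups]
        return sum(extremes) / len(extremes)

    def pearson(xs, ys):
        # Step S509: zero-lag Pearson coefficient as the cross-correlation.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    err_a, err_b = avg_error(data_a), avg_error(data_b)
    return {
        "avg_error_a": err_a,
        "avg_error_b": err_b,
        "better": "A" if err_a < err_b else "B",       # Step S508 comparison
        "cross_correlation": pearson(data_a, data_b),  # Step S509
    }
```

The returned dictionary corresponds to the test result displayed in step S510: per-product average error values, their comparison, and the cross-correlation coefficient.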
< apparatus embodiment >
FIG. 6 is a functional block diagram of a product testing device 60 according to one embodiment. As shown in fig. 6, the product testing device 60 may include a camera 601, an image acquisition module 602, an image processing module 603, and a data processing module 604.
The image obtaining module 602 obtains a plurality of interface images of a detection interface of a product, which are acquired by the camera 601, when the product to be tested is in a detection state, wherein the detection interface is in an image acquisition area of the camera 601, and the plurality of interface images have different acquisition time points. The image processing module 603 obtains each detection data of the product at different acquisition time points according to the plurality of interface images. The data processing module 604 processes the detection data according to the set data processing rule to obtain the test result of the product.
The product testing device 60 may be the product testing device 1000 of fig. 1, and the product being tested may be the product 2000 of fig. 1.
In an embodiment of the present disclosure, the data processing module 604 divides each detection data of the product at different acquisition time points based on a set number of targets to obtain at least two groups of detection data, where the number of the detection data in any group of detection data is equal to the number of the targets; calculating an error value of each detection data in each group of detection data; acquiring a maximum error value and a minimum error value corresponding to each group of detection data; calculating an average value of each obtained maximum error value and each obtained minimum error value to serve as an average error value of the product; the test result of the product includes an average error value of the product.
In one embodiment of the present disclosure, the product testing device 60 further includes: and the module is used for acquiring the standard data of the product at different acquisition time points. The standard data is data obtained by detecting a target object by using a standard detection device under the same test condition, and the target object is an object detected by the product in the detection state. Based on this, the data processing module 604 corresponds the detection data of the same time point with the standard data; and calculating the error value of each detection data in each group of detection data according to the standard data corresponding to the detection data.
In one embodiment of the present disclosure, the number of the products is N, and N is an integer not less than 2. Based on this, the product testing apparatus 60 further includes: a module for comparing the average error values of the products to obtain a comparison result; and the module is used for displaying at least one of detection data of each product at different acquisition time points, an average error value of each product and the comparison result through a user interface.
In one embodiment of the present disclosure, the number of the products is N, and N is an integer not less than 2. Based on this, the product testing apparatus 60 further includes: the cross-correlation coefficient of any two products is calculated according to each detection data of any two products at different acquisition time points; and the module is used for displaying the cross-correlation coefficient through a user interface.
In one embodiment of the present disclosure, the number of the products is N, and N is an integer not less than 2; the N detection interfaces of the product are all in the image acquisition area of the camera; detecting the same detected object by the N products in the detection state; the N products comprise a first type of product and/or a second type of product.
In an embodiment of the present disclosure, the image processing module 603 identifies, for each interface image in the plurality of interface images, a numerical value in the identification area of the interface image based on a set identification area; and taking the identified numerical value as the detection data of the product at the corresponding acquisition time point to obtain each detection data of the product at different acquisition time points.
In an embodiment of the present disclosure, the image obtaining module 602 controls the camera according to a set time interval value, periodically collects interface images of a detection interface of the product, and obtains a plurality of interface images collected by the camera. Or, the image obtaining module 602 controls the camera according to a set time interval, collects a video clip of the detection interface of the product in the time interval, and samples a plurality of interface images of the detection interface of the product from the video clip according to a set sampling period.
Fig. 7 is a hardware configuration diagram of a product testing device 70 according to another embodiment.
As shown in fig. 7, the product testing device 70 comprises a processor 701 and a memory 702, the memory 702 is used for storing an executable computer program, and the processor 701 is used for executing the method according to any of the above method embodiments according to the control of the computer program.
The product testing device 70 may be the product testing device 1000 of fig. 1.
The modules of the product testing apparatus 70 may be implemented by the processor 701 in the present embodiment executing the computer program stored in the memory 702, or may be implemented by other circuit configurations, which is not limited herein.
< System embodiment >
FIG. 8 is a functional block diagram of a product testing system 80 according to one embodiment. As shown in fig. 8, the product testing system 80 may include: a product 801 to be tested and a product testing device 802. In the detection state of the product 801, the detected object is detected using a built-in algorithm to obtain detection data, and the detection data is displayed on the detection interface of the product 801.
For example, referring to fig. 3 or fig. 4, an algorithm for detecting a heart rate of a person may be built in the smart band product, the smart band product may be worn on the wrist of the user, and the smart band product detects a heart rate value of the user by using the algorithm in a detection state, and displays the detected heart rate value on the product detection interface in real time.
As described above, in the case of testing the product 801 by using the product testing apparatus 802, the detection interface of the product 801 is in the image capturing area of the camera of the product testing apparatus 802.
The object to be inspected by the product 801 may be the inspected object 3000 of fig. 1. The product 801 may be the product 2000 of fig. 1. The product testing device 802 may be the product testing device 1000 in fig. 1, or may be any one of the product testing devices described in the embodiments of the apparatus of the present disclosure, such as the product testing device 60 or the product testing device 70 described above.
Furthermore, an embodiment of the present disclosure also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method as any of the above method embodiments.
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and is not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, their practical application, or improvements over technology available in the marketplace, and to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (12)

1. A product testing method, comprising:
under the condition that a tested product is in a detection state, acquiring a plurality of interface images, captured by a camera, of a detection interface of the product, wherein the detection interface is in an image acquisition area of the camera, and the interface images correspond to different acquisition time points;
obtaining, according to the plurality of interface images, respective detection data of the product at the different acquisition time points;
and processing the detection data according to a set data processing rule to obtain a test result of the product.
2. The method of claim 1, wherein said processing said respective detection data according to set data processing rules comprises:
dividing the detection data of the product at the different acquisition time points into groups based on a set target number to obtain at least two groups of detection data, wherein the number of detection data in any group of detection data is equal to the target number;
calculating an error value of each detection data in each group of detection data;
acquiring a maximum error value and a minimum error value corresponding to each group of detection data;
calculating an average value of all the obtained maximum error values and minimum error values to serve as an average error value of the product;
the test result of the product includes an average error value of the product.
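The grouping rule of claim 2 can be illustrated as follows. This is a minimal Python sketch with illustrative names, not the patent's implementation; `error_fn` stands in for the per-datum error calculation, which claim 3 defines against standard data.

```python
def average_error_value(detection_data, target_number, error_fn):
    """Divide the time-ordered detection data into groups of `target_number`,
    compute an error value for each datum, take each group's maximum and
    minimum error, and average all of those extremes -- the quantity the
    claims call the product's 'average error value'."""
    if len(detection_data) < 2 * target_number:
        raise ValueError("need at least two full groups of detection data")
    # Keep only complete groups, as the claim requires equal group sizes.
    groups = [detection_data[i:i + target_number]
              for i in range(0, len(detection_data), target_number)
              if i + target_number <= len(detection_data)]
    extremes = []
    for group in groups:
        errors = [error_fn(d) for d in group]
        extremes.append(max(errors))
        extremes.append(min(errors))
    return sum(extremes) / len(extremes)
```

For example, six readings grouped in threes yield two maxima and two minima, whose mean is the average error value.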
3. The method of claim 2, wherein the method further comprises:
acquiring standard data of the product at the different acquisition time points, wherein the standard data are obtained by detecting a target object with a standard detection device under the same test conditions, and the target object is the object detected by the product in the detection state;
the calculating an error value of each detection data in each set of detection data includes:
matching the detection data with the standard data at the same acquisition time points;
and calculating the error value of each item of detection data in each group of detection data according to the standard data corresponding to that item.
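Pairing detection data with standard data by time point, as in claim 3, might look like the sketch below. Treating the error value as the absolute difference is an assumption for illustration; the claims do not fix a particular error definition.

```python
def error_values(detections, standards):
    """detections, standards: dicts mapping acquisition time point -> value.
    Pair each detection with the standard value at the same time point and
    return one error value per shared time point (absolute difference is an
    assumed error definition, not specified by the claims)."""
    return {t: abs(v - standards[t])
            for t, v in detections.items() if t in standards}
```

Time points present in only one of the two series are simply skipped, since an error requires both a detection and its standard counterpart.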
4. The method of claim 2, wherein the number of products is N, N being an integer not less than 2;
the method further comprises the following steps: comparing the average error values of the products to obtain a comparison result;
and displaying, through a user interface, at least one of: the detection data of each product at the different acquisition time points, the average error value of each product, and the comparison result.
5. The method of claim 1, wherein the number of products is N, N being an integer not less than 2;
the method further comprises the following steps: calculating the cross-correlation coefficient of any two products according to the detection data of any two products at different acquisition time points;
and displaying the cross-correlation coefficient through a user interface.
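The claims do not define the "cross-correlation coefficient"; one plausible reading is the Pearson correlation of two products' detection series at the same acquisition time points, sketched here in plain Python with illustrative names.

```python
import math

def cross_correlation(xs, ys):
    """Pearson correlation coefficient of two products' detection data
    sampled at the same acquisition time points (an assumed interpretation
    of the patent's 'cross-correlation coefficient')."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A coefficient near 1 indicates that two products' readings move together over the test, which is a natural consistency check when several units measure the same detected object.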
6. The method of claim 1, wherein the number of products is N, N being an integer not less than 2;
the N detection interfaces of the product are all in the image acquisition area of the camera;
detecting the same detected object by the N products in the detection state;
the N products comprise a first type of product and/or a second type of product.
7. The method of claim 1, wherein the obtaining, from the plurality of interface images, respective inspection data of the product at different acquisition time points comprises:
identifying, for each interface image of the plurality of interface images, a numerical value within a set identification area of the interface image;
and taking each identified numerical value as the detection data of the product at the corresponding acquisition time point, thereby obtaining the detection data of the product at the different acquisition time points.
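The geometric part of claim 7, cropping the set identification area out of each interface image, can be sketched directly; the actual digit recognition on the crop would be done by an OCR engine (e.g. Tesseract) and is replaced here by an injected stub. All names are illustrative, not from the patent.

```python
def read_region(image, region):
    """Crop the set identification area (top, left, height, width) from a
    row-major image given as a list of rows -- claim 7's geometric step."""
    top, left, h, w = region
    return [row[left:left + w] for row in image[top:top + h]]

def detections_from_images(timed_images, region, recognize):
    """Map each (time point, interface image) pair to a detection value by
    cropping the set area and handing it to the injected recogniser stub."""
    return {t: recognize(read_region(img, region)) for t, img in timed_images}
```

Fixing the identification area once per product model keeps the recognition step cheap, since only the digits region of each frame is ever examined.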
8. The method of any one of claims 1 to 7, wherein said acquiring a plurality of interface images of a detection interface of the product captured by a camera comprises:
controlling the camera according to a set time interval value to periodically capture interface images of the detection interface of the product, and acquiring the plurality of interface images captured by the camera; or
controlling the camera according to a set time interval to capture a video clip of the detection interface of the product within the time interval, and sampling the plurality of interface images of the detection interface of the product from the video clip according to a set sampling period.
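The two capture strategies of claim 8 differ only in where the sampling happens; for the video branch, choosing which frames to keep at a set sampling period reduces to simple index arithmetic. A sketch with illustrative names follows.

```python
def sample_frame_indices(fps, clip_seconds, sampling_period):
    """Indices of the frames to keep when sampling interface images from a
    video clip recorded at `fps` frames per second, keeping one frame per
    `sampling_period` seconds (the second branch of claim 8)."""
    step = max(1, round(fps * sampling_period))   # frames between samples
    total_frames = int(fps * clip_seconds)
    return list(range(0, total_frames, step))
```

For a 2-second clip at 30 fps sampled every 0.5 s, this keeps frames 0, 15, 30 and 45, giving one interface image per sampling period.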
9. A product testing device, comprising:
a camera;
the image acquisition module is used for acquiring, under the condition that a tested product is in a detection state, a plurality of interface images, captured by the camera, of a detection interface of the product, wherein the detection interface is in an image acquisition area of the camera, and the interface images correspond to different acquisition time points;
the image processing module is used for acquiring each detection data of the product at different acquisition time points according to the plurality of interface images;
and the data processing module is used for processing each detection data according to a set data processing rule to obtain a test result of the product.
10. A product testing apparatus comprising a memory and a processor, the memory for storing a computer program; the processor is adapted to execute the computer program to implement the method according to any of claims 1-8.
11. A product testing system, comprising: a product to be tested and the product testing device of claim 9 or 10;
wherein, when the product is in a detection state, the product detects a detected object by using a built-in algorithm to obtain detection data, and displays the detection data on a detection interface of the product.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-8.
CN202110259174.XA 2021-03-10 2021-03-10 Product testing method, device and system Pending CN113127276A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110259174.XA CN113127276A (en) 2021-03-10 2021-03-10 Product testing method, device and system

Publications (1)

Publication Number Publication Date
CN113127276A true CN113127276A (en) 2021-07-16

Family

ID=76772888

Country Status (1)

Country Link
CN (1) CN113127276A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201731958U (en) * 2010-05-05 2011-02-02 中国计量学院 Multi-tank automatic electronic thermometer calibration device
CN107157460A (en) * 2017-06-16 2017-09-15 广州视源电子科技股份有限公司 A kind of sphygmomanometer stability test methods, devices and systems
CN207370705U (en) * 2017-04-21 2018-05-18 广州视源电子科技股份有限公司 Electronic sphygmomanometer test device
CN110141207A (en) * 2019-06-10 2019-08-20 出门问问信息科技有限公司 Heart rate detection adjustment method, device, storage medium and computer program product
CN110749597A (en) * 2019-11-01 2020-02-04 李聪 Method, system and device for automatically processing detection result

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210716