CN107274442B - Image identification method and device - Google Patents


Info

Publication number
CN107274442B
CN107274442B (application CN201710536756.1A)
Authority
CN
China
Prior art keywords
image
standard
standard image
determining
feature point
Prior art date
Legal status
Active
Application number
CN201710536756.1A
Other languages
Chinese (zh)
Other versions
CN107274442A (en)
Inventor
蒋晓海
谢春鸿
Current Assignee
Beijing Testin Information Technology Co Ltd
Original Assignee
Beijing Testin Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Testin Information Technology Co Ltd
Priority to CN201710536756.1A
Publication of CN107274442A
Application granted
Publication of CN107274442B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Image registration using feature-based methods
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/36 Preventing errors by testing or debugging software
    • G06F 11/3668 Software testing
    • G06T 3/40 Scaling the whole image or part thereof
    • G06T 3/60 Rotation of a whole image or part thereof
    • G06T 5/40 Image enhancement or restoration by the use of histogram techniques
    • G06T 7/70 Determining position or orientation of objects or cameras

Abstract

The application discloses an image identification method and device. The method comprises: acquiring a standard image and the position coordinates of the standard image by running a test script; determining a first image of a preset size in the screen of a test terminal according to the position coordinates; determining a second image within the first image by performing feature matching between the first image and the standard image; and identifying the target image corresponding to the standard image by performing image comparison between the second image and the standard image. Through this dual identification process of feature matching followed by image comparison, the method effectively improves image identification accuracy when the test script is run on different test terminal devices for automated APP testing.

Description

Image identification method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image recognition method and apparatus.
Background
With the rapid development of third-party application programs (APPs), the requirements for APP testing keep rising in order to guarantee APP quality and performance.
At present, automated testing of APPs by means of software testing tools is gaining increasingly widespread use. APP automated testing generally adopts a record-and-playback technique: a tester records the test operations performed on the APP with a script recording tool to generate a test script, and the test script is then played back on a test terminal device, thereby automating the APP test on that device.
When the test script is played back on the test terminal device, an event recorded in the script that performs a test operation on an image element in the APP interface (for example, a click on a particular bird in the game Angry Birds) requires that the image element first be identified in the APP interface on the test terminal device before the corresponding test operation can be performed on it.
However, since the screen resolutions of different test terminal devices are not necessarily the same, the recognition accuracy for image elements is low when the APP test script is played back on different terminal devices for automated testing.
Disclosure of Invention
In view of this, embodiments of the present application provide an image recognition method and apparatus to solve the problem of low image recognition accuracy in existing automated testing.
The embodiment of the application provides an image identification method, which comprises the following steps:
acquiring a standard image and a position coordinate of the standard image by running a test script;
determining a first image with a preset size in a test terminal screen according to the position coordinates;
determining a second image in the first image by performing feature matching on the first image and the standard image;
and identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image.
Optionally, determining a second image in the first image by performing feature matching on the first image and the standard image, including:
determining matched feature points between the standard image and the first image by performing feature matching on the first image and the standard image;
determining a third image in the first image according to the matched feature points, and determining the scaling and/or the rotation angle of the third image relative to the standard image;
and according to the scaling and/or the rotation angle, carrying out scaling processing and/or rotation processing on the third image to obtain the second image.
Optionally, determining a scaling and/or a rotation angle of the third image relative to the standard image comprises:
selecting a first feature point and a second feature point which are positioned in the standard image from the matched feature points;
selecting a third feature point which is located in the first image and matched with the first feature point and a fourth feature point which is located in the first image and matched with the second feature point from the matched feature points;
and determining the scaling and/or the rotation angle of the third image relative to the standard image according to the first characteristic point, the second characteristic point, the third characteristic point and the fourth characteristic point.
Optionally, selecting a first feature point and a second feature point located in the standard image from the matched feature points includes:
determining the matching degree of the matched feature points between the standard image and the first image;
sorting the feature points in the standard image from large matching degree to small matching degree according to the matching degree;
the feature point ranked at the first position is determined as the first feature point, and the feature point ranked at the second position is determined as the second feature point.
Optionally, identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image includes:
determining a similarity between the second image and the standard image by performing image comparison on the second image and the standard image;
and when the similarity is greater than a preset value, identifying the second image as the target image.
Optionally, performing image comparison on the second image and the standard image includes:
performing image comparison on the second image and the standard image through at least one of the following algorithms: a fault-tolerant pixel comparison algorithm, a grayscale comparison algorithm, and a histogram comparison algorithm.
Optionally, the fault-tolerant pixel comparison algorithm includes:
setting a fault tolerance for pixel color variation between the second image and the standard image; and/or,
setting a fault tolerance for pixel offset between the second image and the standard image.
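As a concrete illustration of these two tolerances, the following is a minimal NumPy sketch; the function name, default tolerance values, and the per-pixel neighbourhood search are assumptions made for illustration, not the patent's actual implementation:

```python
import numpy as np

def fault_tolerant_pixel_match(img_a, img_b, color_tol=10, offset_tol=1):
    """Count a pixel of img_a as matching if some pixel of img_b within
    offset_tol positions of it has every channel within color_tol;
    return the fraction of matching pixels (images: (h, w, 3) uint8)."""
    h, w = img_a.shape[:2]
    matched = 0
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - offset_tol), min(h, y + offset_tol + 1)
            x0, x1 = max(0, x - offset_tol), min(w, x + offset_tol + 1)
            window = img_b[y0:y1, x0:x1].astype(int)
            diff = np.abs(window - img_a[y, x].astype(int))
            # the pixel matches if any position in the window is within tolerance
            if (diff.reshape(-1, diff.shape[-1]) <= color_tol).all(axis=1).any():
                matched += 1
    return matched / (h * w)
```

The color tolerance absorbs small per-channel differences (e.g. from rendering), while the offset tolerance absorbs sub-pixel shifts introduced by scaling to a different resolution.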
The embodiment of the present application further provides an image recognition apparatus, including: an acquisition unit, a determination unit and an identification unit, wherein:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring a standard image and a position coordinate of the standard image by running a test script;
the determining unit is used for determining a first image with a preset size in a screen of the test terminal according to the position coordinates;
the determining unit is further used for determining a second image in the first image by performing feature matching on the first image and the standard image;
and the identification unit is used for identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image.
The embodiment of the present application further provides an image recognition apparatus, including: a memory and a processor, wherein:
a memory for storing a program;
a processor for executing the program stored in the memory, and specifically executing:
acquiring a standard image and a position coordinate of the standard image by running a test script;
determining a first image with a preset size in a test terminal screen according to the position coordinates;
determining a second image in the first image by performing feature matching on the first image and the standard image;
and identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image.
Embodiments of the present application also provide a computer-readable storage medium storing one or more programs that, when executed by an electronic device including a plurality of application programs, cause the electronic device to perform the following method:
acquiring a standard image and a position coordinate of the standard image by running a test script;
determining a first image with a preset size in a test terminal screen according to the position coordinates;
determining a second image in the first image by performing feature matching on the first image and the standard image;
and identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image.
The embodiment of the application adopts at least one technical scheme which can achieve the following beneficial effects:
the method comprises the steps of obtaining position coordinates of a standard image and a standard image by running a test script in a test terminal, determining a first image with a preset size in a screen of the test terminal according to the position coordinates, performing feature matching on the first image and the standard image, determining a second image in the first image, performing image comparison on the second image and the standard image, identifying a target image corresponding to the standard image, and effectively improving the image identification accuracy when the test script is run in different test terminal devices to perform APP automatic tests by means of a dual identification process of feature matching and image comparison.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of an image recognition method according to an embodiment of the present application;
fig. 2 is a schematic page diagram of a script recording tool according to an embodiment of the present application;
fig. 3 is a schematic diagram of recording a test script according to an embodiment of the present application;
FIG. 4 is a schematic diagram of position coordinates of a standard image provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of determining a scaling ratio provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of determining a rotation angle according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an image recognition apparatus according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of an image recognition apparatus according to an embodiment of the present application.
Detailed Description
In order to achieve the purpose of the present application, an embodiment of the present application provides an image recognition method and an image recognition apparatus. The method includes: obtaining a standard image and its position coordinates by running a test script on a test terminal; determining a first image of a preset size in the test terminal screen according to the position coordinates; determining a second image within the first image by feature matching between the first image and the standard image; and identifying the target image corresponding to the standard image by image comparison between the second image and the standard image. By means of this dual identification process of feature matching and image comparison, the image identification accuracy when running the test script on different test terminal devices for automated APP testing is effectively improved.
The technical solutions of the present application will be described clearly and completely below with reference to the specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Example 1
Fig. 1 is a schematic flowchart of an image recognition method according to an embodiment of the present application. The method may be as follows.
Step 102: acquiring the standard image and the position coordinates of the standard image by running the test script.
Before automated testing of an APP can be carried out, a test script must first be recorded for the APP. During recording, a tester installs a script recording tool (e.g., iTestin from Testin) on a PC and connects a recording terminal device (e.g., a smartphone) to the PC. The tester then completes the recording of the APP test script through the script recording tool on the PC and the connected recording terminal device.
Fig. 2 is a schematic page view of a script recording tool according to an embodiment of the present application. As shown in fig. 2, the interface of the script recording tool shows the mirrored screen of the recording terminal device on the left and the test event recording area on the right. The tester performs test operations on the APP interface in the mirrored screen using the mouse and keyboard, and the test events corresponding to those operations are recorded in the test event recording area.
Fig. 3 is a schematic diagram of recording a test script according to an embodiment of the present application.
As shown in fig. 3, the tester first clicks the box-selection button at the upper right corner of the mirrored screen, or holds the Ctrl key while dragging with the left mouse button, to select the area to be operated on (the rectangle in fig. 3); next, right-clicking within the selected area displays a list of operation types (e.g., click, double-click), from which one operation is chosen; the test event is then recorded in the test event recording area on the right, including: the image corresponding to the selected area (namely the standard image), the position coordinates of the standard image, the size of the standard image, and the test operation (e.g., a click) corresponding to the standard image; finally, the test script is generated from all recorded test events.
In the embodiment of the application, the APP is displayed full-screen on the test terminal screen, and the position coordinates of the standard image are its coordinates in a normalized coordinate system corresponding to the screen interface of the test terminal device (hereinafter referred to as the screen interface).
Fig. 4 is a schematic diagram of position coordinates of a standard image according to an embodiment of the present application.
As shown in fig. 4, a normalized coordinate system is established with the top-left vertex of the screen interface as the origin, using the screen width as the unit length of the abscissa axis (x axis) and the screen length as the unit length of the ordinate axis (y axis). The position of the standard image a in the screen interface is shown in the figure, and the coordinate value of at least one point of the standard image a in this coordinate system is determined.
For example, the coordinates of the vertex at the upper left corner of the standard image a are (0.2, 0.15), the coordinates of the center point are (0.4, 0.2), or the coordinates of the vertex at the lower right corner are (0.6, 0.25).
The position coordinates of the standard image in the normalized coordinate system corresponding to the screen interface can clearly show the relative position of the standard image in the screen interface, and are irrelevant to the resolution of the screen interface, so that the position range of the target image corresponding to the standard image can be found in the screen interfaces with different resolutions according to the position coordinates of the standard image.
In the embodiment of the present application, the size of the standard image is a size in a normalized coordinate system corresponding to the screen interface.
Still taking the above fig. 4 as an example, the length of the standard image a is 0.4 and the width is 0.1, that is, the length of the standard image a is 0.4 times the length of the screen interface, and the width of the standard image a is 0.1 times the width of the screen interface.
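Converting such normalized coordinates back to device pixels is a single multiplication per axis. A minimal sketch (the function name and the example resolutions are illustrative):

```python
def normalized_to_pixels(point, screen_w, screen_h):
    """Map a normalized (x, y) point, each component in [0, 1], to pixel
    coordinates on a screen of the given resolution (origin: top-left)."""
    x, y = point
    return (round(x * screen_w), round(y * screen_h))
```

The same normalized vertex (0.2, 0.15) lands at (216, 288) on a 1080×1920 screen and at (144, 192) on a 720×1280 screen, which is why the recorded coordinates transfer across resolutions.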
After the test script corresponding to the APP is recorded, the test script is played back in different test terminal devices, and the automatic test of the APP can be realized in different test terminals.
In the test terminal device, the standard image and the position coordinates of the standard image are obtained by running the test script.
Still taking fig. 4 as an example, running the test script obtains the standard image a and the position coordinates of the standard image a: the coordinate values (0.2, 0.15) of its top-left vertex.
The position coordinates of the standard image a include the coordinate value of at least one position point.
Step 104: and determining a first image with a preset size in the screen of the test terminal according to the position coordinates.
Wherein the first image may represent a search range for a target image corresponding to the standard image in the test terminal screen.
For the test terminal screen, a normalized coordinate system is likewise established with the top-left vertex of the screen interface as the origin, using the screen width as the unit length of the abscissa axis (x axis) and the screen length as the unit length of the ordinate axis (y axis).
A target position point corresponding to the coordinate value of a position point included in the position coordinates of the standard image is found in the normalized coordinate system corresponding to the test terminal screen.
Still taking the above fig. 4 as an example, the position coordinates of the standard image a are: the coordinates of the central point are (0.4, 0.2), so that in the normalized coordinate system corresponding to the screen of the test terminal, the point with the coordinates of (0.4, 0.2) is the target position point to be searched.
A first image of the preset size is then determined in the terminal screen according to the target position point.
To widen the search range and thus ensure the accuracy of image recognition, the preset size should be no smaller than the size of the standard image; its specific value may be chosen according to the actual situation and is not specifically limited here.
Also taking fig. 4 as an example, the center point of the standard image a is (0.4, 0.2), its length 0.4 and its width 0.1. The preset size may then be set, for example, to a region centered at (0.4, 0.2) with length 0.8 and width 0.2; this region is the first image, i.e., the search range for the target image.
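The construction of the first image can be sketched in normalized coordinates; the function name and the clip-to-screen behaviour are illustrative assumptions:

```python
def search_region(center, region_w, region_h):
    """Return (x0, y0, x1, y1) of the first image: a region of the preset
    size centered on the target point, clipped to the screen's [0, 1] range."""
    cx, cy = center
    return (max(0.0, cx - region_w / 2), max(0.0, cy - region_h / 2),
            min(1.0, cx + region_w / 2), min(1.0, cy + region_h / 2))
```

For the target point (0.4, 0.2) with preset size 0.8 × 0.2, this yields the region from (0.0, 0.1) to (0.8, 0.3).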
Step 106: the second image is determined in the first image by feature matching the first image with the standard image.
Wherein the second image may represent an image most similar to the standard image in the first image.
Specifically, the method comprises the following steps: determining matched feature points between the standard image and the first image by performing feature matching on the first image and the standard image;
determining a third image in the first image according to the matched feature points, and determining the scaling and/or the rotation angle of the third image relative to the standard image;
and according to the scaling and/or the rotation angle, carrying out scaling processing and/or rotation processing on the third image to obtain a second image.
Here the third image represents the image within the first image that feature matching determines to be most similar to the standard image; the second image is obtained from it by the scaling processing and/or rotation processing.
It should be noted that, the first image and the standard image may be subjected to feature matching by using a SURF feature matching algorithm, a SIFT feature matching algorithm, or other feature matching algorithms, which is not specifically limited herein.
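In practice the descriptors would come from an OpenCV SIFT or SURF detector (e.g. `cv2.SIFT_create().detectAndCompute(...)`); the matching step itself can be illustrated with a small, self-contained NumPy sketch. The function name, the ratio-test threshold, and the toy descriptors are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def match_features(desc_std, desc_first, ratio=0.75):
    """Brute-force nearest-neighbour matching between two descriptor sets,
    with Lowe's ratio test to discard ambiguous matches.
    desc_std, desc_first: (n, d) arrays of feature descriptors.
    Returns a list of (index_in_std, index_in_first, distance)."""
    matches = []
    for i, d in enumerate(desc_std):
        dists = np.linalg.norm(desc_first - d, axis=1)
        order = np.argsort(dists)
        best, runner_up = order[0], order[1]
        # keep the match only if it is clearly better than the runner-up
        if dists[best] < ratio * dists[runner_up]:
            matches.append((i, int(best), float(dists[best])))
    return matches
```

The ratio test is one common way to obtain only well-matched feature point pairs; the distance of each accepted pair doubles as a matching-degree score (smaller is better) for the ranking described below.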
In practical application, because the resolution of the test terminal screen may differ from that of the recording terminal screen, the third image may be scaled by some factor relative to the standard image; and/or the third image in the test terminal screen may be rotated by some angle relative to the standard image in the recording terminal screen (this occurs mostly in game APPs).
In order to improve the image recognition accuracy in the APP automation test, the scaling and/or rotation angle of the third image with respect to the standard image needs to be determined.
In an embodiment of the present application, determining a scaling and/or a rotation angle of the third image relative to the standard image includes:
selecting a first feature point and a second feature point which are positioned in the standard image from the matched feature points;
selecting a third feature point which is positioned in the first image and matched with the first feature point and a fourth feature point which is positioned in the first image and matched with the second feature point from the matched feature points;
and determining the scaling and/or the rotation angle of the third image relative to the standard image according to the first characteristic point, the second characteristic point, the third characteristic point and the fourth characteristic point.
In the embodiment of the present application, selecting a first feature point and a second feature point located in a standard image from matched feature points includes:
determining the matching degree of the matched characteristic points between the standard image and the first image;
sorting the feature points in the standard image from large matching degree to small matching degree according to the matching degree;
the feature point ranked at the first position is determined as a first feature point, and the feature point ranked at the second position is determined as a second feature point.
Through feature matching, a plurality of feature points matched between the standard image and the first image are determined, but the matching degree between the feature points is different.
In order to accurately obtain the scaling and/or the rotation angle between the third image and the standard image, the feature points with high matching degree are selected for subsequent processing.
Either of the following two methods may be used to select feature points with a high matching degree:
Method one:
firstly, determining the matching degree of each feature point matched between a standard image and a first image;
secondly, sorting the feature points in the standard image from large matching degree to small matching degree according to the matching degree of each feature point;
then, the feature point ranked first is determined as a first feature point, and the feature point ranked second is determined as a second feature point.
Finally, in the first image, the feature point matched with the first feature point is determined as a third feature point, and the feature point matched with the second feature point is determined as a fourth feature point.
Since the matched feature points in the standard image and the first image occur in pairs, the feature points with a higher matching degree may also be selected by method two below.
Method two:
firstly, determining the matching degree of each feature point matched between a standard image and a first image;
secondly, sorting the feature points in the first image from large to small according to the matching degree of each feature point;
then, determining the feature point ranked at the first position as a third feature point, and determining the feature point ranked at the second position as a fourth feature point;
finally, in the standard image, the feature point matching the third feature point is determined as the first feature point, and the feature point matching the fourth feature point is determined as the second feature point.
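The top-two selection of method one can be sketched as follows, building on a match list of `(i, j, distance)` triples in which a smaller descriptor distance means a higher matching degree; all names are illustrative:

```python
def top_two_pairs(matches, kp_std, kp_first):
    """matches: (i, j, distance) triples; kp_std / kp_first: keypoint
    coordinates in the standard image and in the first image.
    Returns ((p1, p3), (p2, p4)): the two best-matched point pairs,
    p1/p2 in the standard image and p3/p4 their matches in the first image."""
    ranked = sorted(matches, key=lambda m: m[2])  # best (smallest) first
    (i1, j1, _), (i2, j2, _) = ranked[0], ranked[1]
    return (kp_std[i1], kp_first[j1]), (kp_std[i2], kp_first[j2])
```

Because matches come in pairs, ranking by the same score from the first image's side (method two) selects the same two pairs.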
In this embodiment of the application, determining a scaling of the third image with respect to the standard image according to the first feature point, the second feature point, the third feature point, and the fourth feature point includes:
determining a first length of a line segment formed by the first characteristic point and the second characteristic point in a first preset coordinate system;
determining a second length of a line segment formed by the third characteristic point and the fourth characteristic point in a first preset coordinate system;
a ratio of the second length to the first length is determined and the ratio is determined as a scaling.
Fig. 5 is a schematic diagram of determining a scaling ratio according to an embodiment of the present application.
As shown in fig. 5, a first preset coordinate system is established with the first feature point a as the origin. In this coordinate system the coordinates of the first feature point a are (0, 0) and those of the second feature point b are (x, y), so the first length of the line segment formed by a and b is L = √(x² + y²).
As shown in fig. 5, a first preset coordinate system is likewise established with the third feature point a' as the origin. In this coordinate system the coordinates of the third feature point a' are (0, 0) and those of the fourth feature point b' are (x', y'), so the second length of the line segment formed by a' and b' is L' = √(x'² + y'²).
The ratio of the second length L' to the first length L is K = L'/L; the scaling of the third image relative to the standard image is therefore K.
It should be noted that the first preset coordinate system may be a standard coordinate system, and may also be another coordinate system, which is not limited herein.
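The scaling computation above reduces to two segment lengths and their ratio; a minimal sketch (the function name is illustrative):

```python
import math

def scaling_ratio(p1, p2, p3, p4):
    """K = L'/L, where L is the length of segment p1-p2 (feature points in
    the standard image) and L' that of p3-p4 (their matches in the first image)."""
    L = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    L_prime = math.hypot(p4[0] - p3[0], p4[1] - p3[1])
    return L_prime / L
```

For example, a 3-4-5 segment in the standard image matched to a 6-8-10 segment in the first image gives K = 2, i.e. the third image is twice the size of the standard image.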
In this embodiment of the application, determining a rotation angle of the third image relative to the standard image according to the first feature point, the second feature point, the third feature point, and the fourth feature point includes:
determining a first included angle between a line segment formed by the first characteristic point and the second characteristic point and a preset direction in a second preset coordinate system;
determining a second included angle between a line segment formed by the third characteristic point and the fourth characteristic point and the preset direction in a second preset coordinate system;
an angular difference between the second angle and the first angle is determined and the angular difference is determined as the angle of rotation.
Fig. 6 is a schematic diagram of determining a rotation angle according to an embodiment of the present application.
As shown in fig. 6, a second preset coordinate system is established with the first feature point a as the origin. In this coordinate system the coordinates of the first feature point a are (0, 0) and those of the second feature point b are (m, n), so the first included angle between the line segment formed by a and b and the preset direction (the y direction) is α = arctan(m/n).
As shown in fig. 6, a second preset coordinate system is likewise established with the third feature point a' as the origin. In this coordinate system the coordinates of the third feature point a' are (0, 0) and those of the fourth feature point b' are (m', n'), so the second included angle between the line segment formed by a' and b' and the preset direction (the y direction) is α' = arctan(m'/n').
The angle difference between the second included angle α' and the first included angle α is Δα = α' − α; the rotation angle of the third image relative to the standard image is therefore Δα.
It should be noted that the first preset coordinate system and the second preset coordinate system may be the same coordinate system or different coordinate systems, and are not limited herein.
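The angle-difference computation can be sketched as follows (an illustrative helper; `atan2` is used instead of a bare `arctan(m/n)` so that segments in any quadrant are handled, which the patent's figures do not address):

```python
import math

def rotation_angle(p_a, p_b, p_a2, p_b2):
    """Rotation angle (in degrees) of segment a'b' relative to segment ab.
    Each included angle is measured from the preset y direction, as in fig. 6:
    for a segment from the origin to (x, y), tan(alpha) = x / y."""
    # first included angle alpha between ab and the y direction
    alpha = math.atan2(p_b[0] - p_a[0], p_b[1] - p_a[1])
    # second included angle alpha' between a'b' and the y direction
    alpha2 = math.atan2(p_b2[0] - p_a2[0], p_b2[1] - p_a2[1])
    return math.degrees(alpha2 - alpha)  # delta-alpha = alpha' - alpha
```

For instance, a segment along the y axis compared with a segment along the x axis yields a rotation of 90 degrees.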
According to the determined scaling and/or rotation angle, the third image is subjected to scaling processing and/or rotation processing to obtain a second image that is most similar to the standard image and has the same size and/or the same angle.
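Undoing the scaling K and the rotation Δα can be sketched at the level of a single point (a minimal illustration; a real implementation would resample the whole third image, for example with an affine warp):

```python
import math

def scale_and_rotate_point(pt, k, delta_deg, origin=(0.0, 0.0)):
    """Map a point of the third image back into the standard image's frame by
    rotating through -delta and then dividing out the scaling K (a sketch of
    the inverse transform applied during the scaling/rotation processing)."""
    t = math.radians(-delta_deg)                 # rotate back by -delta
    x, y = pt[0] - origin[0], pt[1] - origin[1]
    xr = x * math.cos(t) - y * math.sin(t)
    yr = x * math.sin(t) + y * math.cos(t)
    return (xr / k + origin[0], yr / k + origin[1])  # then scale back by 1/K
```

Applying the same inverse transform to every pixel of the third image yields the second image used in the subsequent comparison step.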
Step 108: and identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image.
In practical applications, for the second image obtained after the feature matching and the scaling and/or rotation process, the second image may not be the target image corresponding to the standard image that is required to be obtained due to the limitation of the accuracy of the feature matching.
For example, two images that are not identical may still have similar partial regions, so that after feature matching the two images are considered similar.
In order to further verify whether the second image is the target image corresponding to the standard image, the second image and the standard image are subjected to image comparison.
Specifically, the similarity between the second image and the standard image is determined by performing image comparison on the second image and the standard image;
and when the similarity is greater than a preset value, identifying the second image as the target image.
It should be noted that the preset value may be determined according to actual situations, and is not specifically limited herein.
In the embodiment of the present application, performing image comparison on the second image and the standard image includes:
and performing image comparison on the second image and the standard image through at least one of the following algorithms: pixel contrast algorithm with fault tolerance, gray scale contrast algorithm, histogram contrast algorithm.
In the embodiment of the present application, the pixel comparison algorithm with fault tolerance includes:
setting the fault tolerance rate of pixel color change between the second image and the standard image; and/or,
a fault tolerance for pixel offsets between the second image and the standard image is set.
In practical applications, since the resolution and/or display parameter information of the test terminal screen and the recording terminal screen may be different, the second image may have a certain pixel color variation and/or pixel shift with respect to the standard image.
In order to improve the image identification accuracy in the APP automatic test, the second image and the standard image are subjected to image comparison by adopting a pixel comparison algorithm with fault tolerance, wherein the pixel comparison algorithm is provided with the fault tolerance of pixel color change and/or the fault tolerance of pixel offset, and the identification accuracy of the target image can be effectively improved.
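A pixel contrast with both kinds of fault tolerance can be sketched as follows (an illustrative helper on grayscale values; the patent does not disclose its exact algorithm, and the tolerance defaults here are assumptions):

```python
def pixels_match(img_a, img_b, color_tol=10, offset_tol=1):
    """Pixel contrast with fault tolerance: a pixel of img_a counts as matched
    if some pixel of img_b within offset_tol positions of it differs by at most
    color_tol in value. Images are equally sized 2-D lists of 0-255 grayscale
    values. Returns the fraction of matched pixels in [0, 1]."""
    h, w = len(img_a), len(img_a[0])
    matched = 0
    for i in range(h):
        for j in range(w):
            ok = False
            for di in range(-offset_tol, offset_tol + 1):   # pixel offset tolerance
                for dj in range(-offset_tol, offset_tol + 1):
                    ni, nj = i + di, j + dj
                    if (0 <= ni < h and 0 <= nj < w
                            and abs(img_a[i][j] - img_b[ni][nj]) <= color_tol):
                        ok = True                           # color-change tolerance
            if ok:
                matched += 1
    return matched / (h * w)
```

The returned fraction can then be compared against the preset similarity value to decide whether the second image is identified as the target image.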
In the embodiment of the application, in order to further improve the identification accuracy of the target image, the second image and the standard image may be subjected to image comparison by one or a combination of the pixel contrast algorithm with fault tolerance, the gray scale contrast algorithm and the histogram contrast algorithm.
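The histogram contrast algorithm mentioned above can be sketched as follows (an illustrative helper using histogram intersection as the similarity measure, which is one common choice and an assumption here):

```python
def histogram_similarity(img_a, img_b, bins=16):
    """Histogram contrast: build a normalized grayscale histogram for each
    equally sized image (2-D lists of 0-255 values) and return the histogram
    intersection as a similarity in [0, 1] (1.0 = identical histograms)."""
    def hist(img):
        h = [0] * bins
        for row in img:
            for v in row:
                h[v * bins // 256] += 1   # bucket each pixel value
        total = sum(h)
        return [c / total for c in h]     # normalize to a distribution
    ha, hb = hist(img_a), hist(img_b)
    return sum(min(a, b) for a, b in zip(ha, hb))
```

Because the histogram ignores pixel positions, this comparison is robust to small shifts but should be combined with the pixel contrast above when position matters.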
Through feature matching and zooming and/or rotating, a second image similar to the standard image is identified in the first image (namely the search range), and then, through image comparison between the second image and the standard image, whether the second image is the target image is further determined.
After the target image is identified, corresponding test operation is executed on the target image according to the test operation corresponding to the standard image recorded in the test script, and finally, the automatic test of the APP is realized.
For example, when the test operation corresponding to the standard image is a click operation, the coordinates of the operation point corresponding to the click operation are determined, wherein the coordinates are the relative coordinates of the operation point relative to the standard image. And determining a target operation point in the target image according to the coordinates of the operation point, and further executing click operation on the target operation point.
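Mapping the recorded operation point onto the identified target image can be sketched as follows (a hypothetical helper illustrating the click-mapping step; the names and parameters are assumptions, not the patent's API):

```python
def target_click_point(rel_pt, standard_size, target_origin, target_size):
    """Map an operation point recorded relative to the standard image onto the
    identified target image: rel_pt is (x, y) inside the standard image,
    standard_size and target_size are (width, height), and target_origin is
    the target image's top-left corner on the test terminal screen."""
    sx = target_size[0] / standard_size[0]   # horizontal scale between images
    sy = target_size[1] / standard_size[1]   # vertical scale between images
    return (target_origin[0] + rel_pt[0] * sx,
            target_origin[1] + rel_pt[1] * sy)
```

The returned screen coordinates are then used as the target operation point on which the click operation is executed.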
In the embodiment of the application, when the similarity between the second image and the standard image is not greater than the preset value, it may be determined that the target image is not recognized.
In practical application, when the target image is to be identified on the test terminal screen, the target image may not yet be present in the screen interface, for example, because the screen interface is still loading, so that the image identification method does not identify the target image.
Therefore, after the target image is not recognized, steps 102 to 108 are repeatedly executed after a preset time interval to re-identify the target image; and/or, when the number of identification attempts for the target image is greater than a preset number, it is determined that the target image cannot be identified.
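The retry behavior can be sketched as follows (a sketch only; in the real flow each attempt would re-capture the test terminal screen and re-run steps 102 to 108):

```python
import time

def recognize_with_retry(recognize, interval_s=1.0, max_attempts=3):
    """Re-run the recognition flow at a preset interval until a target image
    is found or the attempt count reaches the preset number. `recognize` is
    any callable returning the target image, or None when nothing is found."""
    for attempt in range(max_attempts):
        target = recognize()
        if target is not None:
            return target
        if attempt < max_attempts - 1:
            time.sleep(interval_s)   # wait before re-identifying
    return None  # the target image cannot be identified
```

This keeps the test script from failing immediately when, for example, the screen interface is still loading.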
According to the technical scheme, the position coordinates of the standard image and the standard image are obtained by running the test script in the test terminal, the first image with the preset size is determined in the screen of the test terminal according to the position coordinates, the second image is determined in the first image by performing feature matching on the first image and the standard image, the target image corresponding to the standard image is identified by performing image comparison on the second image and the standard image, and the image identification accuracy when the APP automatic test is performed by running the test script in different test terminal devices is effectively improved by the aid of the dual identification process of feature matching and image comparison.
Example 2
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 7, at the hardware level, the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile memory, and may also include hardware required for other services. The processor reads a corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to form the image recognition device on a logic level. Of course, besides the software implementation, the present application does not exclude other implementations, such as logic devices or a combination of software and hardware, and the like, that is, the execution subject of the following processing flow is not limited to each logic unit, and may also be hardware or logic devices.
Fig. 8 is a schematic structural diagram of an image recognition apparatus according to an embodiment of the present application. The apparatus 800 comprises: an acquisition unit 801, a determination unit 802, and a recognition unit 803, wherein:
an obtaining unit 801, configured to obtain a standard image and a position coordinate of the standard image by running a test script;
a determining unit 802, configured to determine, according to the position coordinates, a first image with a preset size in a test terminal screen;
the determining unit 802 is further configured to determine a second image in the first image by performing feature matching on the first image and the standard image;
and the identification unit is used for identifying the target image corresponding to the standard image by carrying out image comparison on the second image and the standard image.
Optionally, the determining unit 802 determines the second image in the first image by performing feature matching on the first image and the standard image, including:
determining matched feature points between the standard image and the first image by performing feature matching on the first image and the standard image;
determining a third image in the first image according to the matched feature points, and determining the scaling and/or the rotation angle of the third image relative to the standard image;
and according to the scaling and/or the rotation angle, carrying out scaling processing and/or rotation processing on the third image to obtain a second image.
Optionally, the determining unit 802 determines a scaling and/or a rotation angle of the third image with respect to the standard image, including:
selecting a first feature point and a second feature point which are positioned in the standard image from the matched feature points;
selecting a third feature point which is positioned in the first image and matched with the first feature point and a fourth feature point which is positioned in the first image and matched with the second feature point from the matched feature points;
and determining the scaling and/or the rotation angle of the third image relative to the standard image according to the first characteristic point, the second characteristic point, the third characteristic point and the fourth characteristic point.
Optionally, the determining unit 802 selects a first feature point and a second feature point located in the standard image from the matched feature points, and includes:
determining the matching degree of the matched characteristic points between the standard image and the first image;
sorting the feature points in the standard image from large matching degree to small matching degree according to the matching degree;
the feature point ranked at the first position is determined as a first feature point, and the feature point ranked at the second position is determined as a second feature point.
Optionally, the identifying unit 803 identifies the target image corresponding to the standard image by performing image comparison on the second image and the standard image, including:
determining the similarity between the second image and the standard image by performing image comparison on the second image and the standard image;
and when the similarity is greater than a preset value, identifying the second image as a target image.
Optionally, the recognition unit 803 performs image comparison on the second image and the standard image, including:
and performing image comparison on the second image and the standard image through at least one of the following algorithms: pixel contrast algorithm with fault tolerance, gray scale contrast algorithm, histogram contrast algorithm.
Optionally, the pixel comparison algorithm with fault tolerance includes:
setting the fault tolerance rate of pixel color change between the second image and the standard image; and/or,
a fault tolerance for pixel offsets between the second image and the standard image is set.
According to the image recognition device, the acquisition unit is used for acquiring a standard image and the position coordinates of the standard image by running a test script; the determining unit is used for determining a first image with a preset size in a screen of the test terminal according to the position coordinates; the determining unit is further used for determining a second image in the first image by performing feature matching on the first image and the standard image; and the identification unit is used for identifying the target image corresponding to the standard image by performing image comparison on the second image and the standard image, so that the image identification accuracy rate when the test script is operated in different test terminal equipment to perform APP automatic test is effectively improved through the dual identification process of feature matching and image comparison.
Fig. 9 is a schematic structural diagram of an image recognition apparatus according to an embodiment of the present application. The apparatus 900 may include: a channel interface 901 and a processor 902, optionally including a memory 903.
The channel interface 901, the processor 902 and the memory 903 may be interconnected by a bus 904 system. The bus 904 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 9, but this does not indicate only one bus or one type of bus.
Optionally, the memory 903 is included for storing a program. In particular, the program may include program code, and the program code includes computer operating instructions. The memory 903 may include both read-only memory and random access memory, and provides instructions and data to the processor 902. The memory 903 may include a random-access memory (RAM) and may also include a non-volatile memory, such as at least one disk memory.
The processor 902 is configured to execute a program stored in the memory 903, and is specifically configured to perform the following operations:
acquiring a standard image and a position coordinate of the standard image by running a test script;
determining a first image with a preset size in a test terminal screen according to the position coordinates;
determining a second image in the first image by performing feature matching on the first image and the standard image;
and identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image.
The method performed by the image recognition apparatus or Master node according to the embodiments disclosed in fig. 1 and fig. 7-8 of the present application may be implemented in the processor 902 or implemented by the processor 902. The processor 902 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 902. The processor 902 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM or EPROM, and registers. The storage medium is located in the memory 903, and the processor 902 reads the information in the memory 903 and performs the steps of the above method in combination with its hardware.
The image recognition device 900 may also perform the method of fig. 1 and implement the method performed by the Master node.
Example 3
Embodiments of the present application also provide a computer-readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a portable electronic device comprising a plurality of application programs, enable the portable electronic device to perform the method of embodiment 1.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In the 1990s, an improvement in a technology could be clearly distinguished as an improvement in hardware (for example, an improvement in a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement in a method flow). However, as technology advances, many of today's improvements in method flows can be regarded as direct improvements in hardware circuit structures. Designers almost always obtain the corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Thus, it cannot be said that an improvement in a method flow cannot be realized by hardware entity modules. For example, a programmable logic device (PLD), such as a field-programmable gate array (FPGA), is an integrated circuit whose logic functions are determined by a user's programming of the device. A designer programs to "integrate" a digital system onto a PLD, without requiring the chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, nowadays, instead of manually making an integrated circuit chip, such programming is mostly implemented by "logic compiler" software, which is similar to a software compiler used in program development, and the original code before compiling must also be written in a specific programming language, which is called a hardware description language (HDL). There is not only one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), among which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used at present.
It will also be apparent to those skilled in the art that hardware circuitry that implements the logical method flows can be readily obtained by merely slightly programming the method flows into an integrated circuit using the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor and a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art will also appreciate that, in addition to implementing the controller as pure computer-readable program code, the same functionality can be implemented by logically programming the method steps so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may thus be considered a hardware component, and the means included therein for performing the various functions may also be considered structures within the hardware component. Or even the means for performing the functions may be regarded as being both a software module for performing the method and a structure within a hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functionality of the units may be implemented in one or more software and/or hardware when implementing the present application.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (14)

1. An image recognition method, comprising:
acquiring a standard image and a position coordinate of the standard image under a normalized coordinate system by running a test script;
determining a first image with a preset size in a test terminal screen according to the position coordinates;
determining a second image in the first image by performing feature matching on the first image and the standard image;
and identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image.
2. The method of claim 1, wherein determining a second image in the first image by feature matching the first image and the standard image comprises:
determining matched feature points between the standard image and the first image by performing feature matching on the first image and the standard image;
determining a third image in the first image according to the matched feature points, and determining the scaling and/or the rotation angle of the third image relative to the standard image;
and according to the scaling and/or the rotation angle, carrying out scaling processing and/or rotation processing on the third image to obtain the second image.
3. The method of claim 2, wherein determining a scale and/or a rotation angle of the third image relative to the standard image comprises:
selecting a first feature point and a second feature point which are positioned in the standard image from the matched feature points;
selecting a third feature point which is located in the first image and matched with the first feature point and a fourth feature point which is located in the first image and matched with the second feature point from the matched feature points;
and determining the scaling and/or the rotation angle of the third image relative to the standard image according to the first characteristic point, the second characteristic point, the third characteristic point and the fourth characteristic point.
4. The method of claim 3, wherein selecting a first feature point and a second feature point located in the standard image among the matched feature points comprises:
determining the matching degree of the matched feature points between the standard image and the first image;
sorting the feature points in the standard image from large matching degree to small matching degree according to the matching degree;
the feature point ranked at the first position is determined as the first feature point, and the feature point ranked at the second position is determined as the second feature point.
5. The method of claim 1, wherein identifying the target image corresponding to the standard image by image-comparing the second image and the standard image comprises:
determining a similarity between the second image and the standard image by performing image comparison on the second image and the standard image;
and when the similarity is greater than a preset value, identifying the second image as the target image.
6. The method of claim 5, wherein image-comparing the second image and the standard image comprises:
image comparison is performed on the second image and the standard image through at least one of the following algorithms: the pixel contrast algorithm with fault tolerance comprises a pixel contrast algorithm with fault tolerance, a gray scale contrast algorithm and a histogram contrast algorithm, wherein the pixel contrast algorithm with fault tolerance is a pixel contrast algorithm with fault tolerance, and the pixel contrast algorithm with fault tolerance is provided with the fault tolerance of pixel color change and/or the fault tolerance of pixel offset.
7. An image recognition apparatus, comprising: an acquisition unit, a determination unit and an identification unit, wherein:
the acquisition unit is used for acquiring a standard image and position coordinates of the standard image under a normalized coordinate system by running a test script;
the determining unit is used for determining a first image with a preset size in a screen of the test terminal according to the position coordinates;
the determination unit is also used for
Determining a second image in the first image by performing feature matching on the first image and the standard image;
and the identification unit is used for identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image.
8. The apparatus of claim 7, wherein the determining unit determines a second image in the first image by feature matching the first image and the standard image, comprising:
determining matched feature points between the standard image and the first image by performing feature matching on the first image and the standard image;
determining a third image in the first image according to the matched feature points, and determining the scaling and/or the rotation angle of the third image relative to the standard image;
and according to the scaling and/or the rotation angle, carrying out scaling processing and/or rotation processing on the third image to obtain the second image.
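The normalization step of claim 8 can be illustrated in a deliberately restricted form: integer scaling factors and rotations in multiples of 90 degrees, so no interpolation is needed (a real implementation would interpolate for arbitrary scales and angles; the function name is an assumption):

```python
import numpy as np

def normalize_third_image(third, scale, quarter_turns):
    """Undo an integer zoom and a multiple-of-90-degree rotation so the
    third image can be compared against the standard image pixel for
    pixel. Restricted sketch: integer scale, 90-degree steps only."""
    img = np.rot90(third, -quarter_turns)  # undo the rotation
    return img[::scale, ::scale]           # subsample to undo the zoom

std = np.arange(4, dtype=np.uint8).reshape(2, 2)
# Third image: the standard image zoomed 2x and rotated 90 degrees.
third = np.rot90(np.kron(std, np.ones((2, 2), dtype=np.uint8)), 1)
second = normalize_third_image(third, 2, 1)
print(np.array_equal(second, std))  # True
```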
9. The apparatus of claim 8, wherein the determining unit determines a scaling and/or a rotation angle of the third image relative to the standard image, comprising:
selecting a first feature point and a second feature point which are positioned in the standard image from the matched feature points;
selecting a third feature point which is located in the first image and matched with the first feature point and a fourth feature point which is located in the first image and matched with the second feature point from the matched feature points;
and determining the scaling and/or the rotation angle of the third image relative to the standard image according to the first characteristic point, the second characteristic point, the third characteristic point and the fourth characteristic point.
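Given the two matched point pairs of claim 9, the scaling is the ratio of the segment lengths and the rotation angle is the difference of the segment orientations. A minimal sketch (the function name and point convention are assumptions):

```python
import math

def scale_and_rotation(p1, p2, q1, q2):
    """Scaling and rotation angle (radians) of the image containing
    q1, q2 relative to the image containing p1, p2, computed from the
    matched pairs p1->q1 (first/third points) and p2->q2
    (second/fourth points)."""
    dxp, dyp = p2[0] - p1[0], p2[1] - p1[1]
    dxq, dyq = q2[0] - q1[0], q2[1] - q1[1]
    scale = math.hypot(dxq, dyq) / math.hypot(dxp, dyp)
    angle = math.atan2(dyq, dxq) - math.atan2(dyp, dxp)
    return scale, angle

# Standard-image segment (0,0)->(1,0); in the first image it appears
# twice as long and turned a quarter turn: (0,0)->(0,2).
s, a = scale_and_rotation((0, 0), (1, 0), (0, 0), (0, 2))
print(s, math.degrees(a))  # scale 2.0, rotation about 90 degrees
```

Two point pairs suffice because a similarity transform (scale plus rotation) has exactly those two unknowns once translation is factored out.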
10. The apparatus according to claim 9, wherein the determination unit selects a first feature point and a second feature point located in the standard image among the matched feature points, including:
determining the matching degree of the matched feature points between the standard image and the first image;
sorting the feature points in the standard image in descending order of matching degree;
and determining the feature point ranked first as the first feature point, and the feature point ranked second as the second feature point.
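The selection rule of claim 10, sorting matched feature points by matching degree and taking the top two, can be sketched as follows (the tuple layout of a match record is an assumption for illustration):

```python
def pick_top_two(matches):
    """matches: list of (standard_point, first_image_point, degree)
    tuples. Sort by matching degree, descending, and return the
    standard-image points ranked first and second."""
    ranked = sorted(matches, key=lambda m: m[2], reverse=True)
    first_feature_point = ranked[0][0]
    second_feature_point = ranked[1][0]
    return first_feature_point, second_feature_point

matches = [((3, 4), (30, 40), 0.72),
           ((1, 2), (10, 20), 0.95),
           ((5, 6), (50, 60), 0.88)]
print(pick_top_two(matches))  # ((1, 2), (5, 6))
```

Using the two most reliable matches limits the influence of spurious correspondences on the scaling and rotation estimate of claim 9.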
11. The apparatus according to claim 7, wherein the identifying unit identifies the target image corresponding to the standard image by image-comparing the second image and the standard image, including:
determining a similarity between the second image and the standard image by performing image comparison on the second image and the standard image;
and when the similarity is greater than a preset value, identifying the second image as the target image.
12. The apparatus of claim 11, wherein the recognition unit performs image comparison of the second image and the standard image, comprising:
image comparison is performed on the second image and the standard image through at least one of the following algorithms: a pixel contrast algorithm with fault tolerance, a gray-scale contrast algorithm, and a histogram contrast algorithm, wherein the pixel contrast algorithm with fault tolerance is configured with a fault tolerance for pixel color change and/or a fault tolerance for pixel offset.
13. An image recognition apparatus, comprising: a memory and a processor, wherein:
a memory for storing a program;
a processor for executing the program stored in the memory, and specifically executing:
acquiring a standard image and a position coordinate of the standard image under a normalized coordinate system by running a test script;
determining a first image with a preset size in a test terminal screen according to the position coordinates;
determining a second image in the first image by performing feature matching on the first image and the standard image;
and identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image.
14. A computer-readable storage medium storing one or more programs which, when executed by an electronic device including a plurality of application programs, cause the electronic device to perform a method of:
acquiring a standard image and a position coordinate of the standard image under a normalized coordinate system by running a test script;
determining a first image with a preset size in a test terminal screen according to the position coordinates;
determining a second image in the first image by performing feature matching on the first image and the standard image;
and identifying a target image corresponding to the standard image by performing image comparison on the second image and the standard image.
CN201710536756.1A 2017-07-04 2017-07-04 Image identification method and device Active CN107274442B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710536756.1A CN107274442B (en) 2017-07-04 2017-07-04 Image identification method and device

Publications (2)

Publication Number Publication Date
CN107274442A (en) 2017-10-20
CN107274442B (en) 2020-03-10

Family

ID=60070001

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710536756.1A Active CN107274442B (en) 2017-07-04 2017-07-04 Image identification method and device

Country Status (1)

Country Link
CN (1) CN107274442B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109901988A (en) * 2017-12-11 2019-06-18 北京京东尚科信息技术有限公司 A kind of page elements localization method and device for automatic test
CN108021346A (en) * 2017-12-26 2018-05-11 歌尔科技有限公司 VR helmets show method, VR helmets and the system of image
CN108490011B (en) * 2018-03-07 2020-08-14 燕山大学 Method for positioning detected area of transmission electron microscope block sample
CN108648172B (en) * 2018-03-30 2021-08-03 四川元匠科技有限公司 CT (computed tomography) map pulmonary nodule detection system based on 3D-Unet
CN108564082B (en) * 2018-04-28 2023-06-09 苏州赛腾精密电子股份有限公司 Image processing method, device, server and medium
CN109332192A (en) * 2018-08-03 2019-02-15 小黄狗环保科技有限公司 A kind of image-recognizing method classified for pop can and beverage bottle
CN109472279B (en) * 2018-08-31 2020-02-07 杭州千讯智能科技有限公司 Article identification method and device based on image processing
CN109919164B (en) * 2019-02-22 2021-01-05 腾讯科技(深圳)有限公司 User interface object identification method and device
CN109934305A (en) * 2019-04-01 2019-06-25 成都大学 Image-recognizing method and device based on image recognition model
CN110967350B (en) * 2019-11-05 2023-04-11 北京地平线机器人技术研发有限公司 Chip testing method and device based on image recognition and electronic equipment
CN111079730B (en) * 2019-11-20 2023-12-22 北京云聚智慧科技有限公司 Method for determining area of sample graph in interface graph and electronic equipment
CN112733862A (en) * 2021-01-05 2021-04-30 卓望数码技术(深圳)有限公司 Terminal image automatic matching method, system and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102117412A (en) * 2009-12-31 2011-07-06 北大方正集团有限公司 Method and device for image recognition
CN105513038A (en) * 2014-10-20 2016-04-20 网易(杭州)网络有限公司 Image matching method and mobile phone application test platform
CN105740874A (en) * 2016-03-04 2016-07-06 网易(杭州)网络有限公司 Method and device for determining operation coordinate of automation test script during playback
CN105868102A (en) * 2016-03-22 2016-08-17 中国科学院软件研究所 Computer vision based mobile terminal application testing system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Song Fuhua, Lu Bin. "An Automatic Video Image Mosaic Algorithm Based on SIFT Feature Matching." Proceedings of the 2012 International Conference on Communication, Electronics and Automation Engineering, Springer Berlin Heidelberg, 2013, pp. 879-886. *

Also Published As

Publication number Publication date
CN107274442A (en) 2017-10-20

Similar Documents

Publication Publication Date Title
CN107274442B (en) Image identification method and device
CN109189682B (en) Script recording method and device
CN107609437B (en) Target graphic code identification method and device
CN109344789B (en) Face tracking method and device
CN109034183B (en) Target detection method, device and equipment
KR102316230B1 (en) Image processing method and device
CN109033772B (en) Verification information input method and device
CN110689010A (en) Certificate identification method and device
CN110826894A (en) Hyper-parameter determination method and device and electronic equipment
CN112308113A (en) Target identification method, device and medium based on semi-supervision
CN112347512A (en) Image processing method, device, equipment and storage medium
CN110888756A (en) Diagnostic log generation method and device
CN109978044B (en) Training data generation method and device, and model training method and device
US11354544B2 (en) Fingerprint image processing methods and apparatuses
CN112966577B (en) Method and device for model training and information providing
CN111368902A (en) Data labeling method and device
CN110968513B (en) Recording method and device of test script
CN115061618A (en) Method and device for displaying list data in sliding mode and electronic equipment
CN114926437A (en) Image quality evaluation method and device
CN109376289B (en) Method and device for determining target application ranking in application search result
CN110866478B (en) Method, device and equipment for identifying object in image
CN109325127B (en) Risk identification method and device
CN110674495B (en) Detection method, device and equipment for group border crossing access
CN116188919B (en) Test method and device, readable storage medium and electronic equipment
CN115617638A (en) Test script generation method and device, storage medium and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant