CN111738321B - Data processing method, device, terminal equipment and storage medium - Google Patents


Info

Publication number
CN111738321B
Authority
CN
China
Prior art keywords
image
processed
sub
target
similarity threshold
Prior art date
Legal status
Active
Application number
CN202010537349.4A
Other languages
Chinese (zh)
Other versions
CN111738321A (en)
Inventor
李蔼莉
Current Assignee
Tencent Music Entertainment Technology Shenzhen Co Ltd
Original Assignee
Tencent Music Entertainment Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Music Entertainment Technology Shenzhen Co Ltd
Priority to CN202010537349.4A
Publication of CN111738321A
Application granted
Publication of CN111738321B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V10/7515 Shifting the patterns to accommodate for positional errors
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the application discloses a data processing method, an apparatus, a terminal device, and a storage medium. The method comprises the following steps: acquiring a plurality of first sub-images to be processed and a first template image, and calculating a first similarity value between each first sub-image to be processed and the first template image; counting the target sub-images to be processed whose first similarity values exceed a preset similarity threshold, and adjusting the preset similarity threshold, based on that count and the known number of target images, until the number of target sub-images determined under the adjusted threshold equals the number of target images; acquiring second sub-images to be processed and a second template image; and calculating a second similarity value between each second sub-image to be processed and the second template image, then determining the target matching images among the second sub-images to be processed based on the adjusted similarity threshold. The embodiments of the application improve both the accuracy and the efficiency of identifying target matching images.

Description

Data processing method, device, terminal equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a data processing method, a data processing device, a terminal device, and a storage medium.
Background
At present, many automated testing tools support only control recognition: the tool identifies a UI control and then performs the automated test operation on it. In practice, however, many application scenarios do not support control lookup. Game applications in particular render display interfaces in which no controls can be identified, so such tools cannot automate their testing. Some automated testing tools address game applications through image recognition instead, but their recognition accuracy is low and the similarity threshold must be tuned manually, which makes recognition inefficient. How to improve both the accuracy and the efficiency of image recognition is therefore an urgent problem.
Disclosure of Invention
The embodiments of the present application provide a data processing method, a data processing apparatus, a terminal device, and a storage medium, which can improve the accuracy and efficiency of identifying target matching images and are widely applicable.
In a first aspect, an embodiment of the present application provides a data processing method, including:
Acquiring a first image to be processed and acquiring first sliding window information, wherein sliding is performed from a starting point position of the first image to be processed based on a first window indicated by the first sliding window information to obtain a plurality of first sub-images to be processed, which are included in the first image to be processed, and the first sliding window information comprises a first window size which is the same as the size of a first template image;
acquiring the first template image, and calculating a first similarity value of each first sub-image to be processed and the first template image;
acquiring a preset similarity threshold and the number of target images included in the first image to be processed, determining the number of target sub-images to be processed corresponding to a first similarity value larger than the preset similarity threshold, adjusting the preset similarity threshold until the number of target sub-images to be processed determined based on the adjusted preset similarity threshold is equal to the number of target images, and determining the preset similarity threshold after stopping adjustment as a target similarity threshold, wherein the first template image is an image which comprises the same object as the target image;
acquiring second sliding window information, a second image to be processed and a second template image, and acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information, wherein the first image to be processed is the same as the second image to be processed in image type, the second sliding window information comprises a second window size, and the second window size is the same as the second template image in size;
and acquiring a second similarity value of each second sub-image to be processed and the second template image, and determining the second sub-image to be processed corresponding to the second similarity value larger than the target similarity threshold value as a target matching image.
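The sliding-window matching described above can be sketched as follows. This is an illustrative Python example, not code from the patent: the function names are invented here, and normalised cross-correlation is assumed as the similarity measure (a production implementation would typically use OpenCV's cv2.matchTemplate for the same purpose).

```python
import math

def sliding_sub_images(image, window_h, window_w, stride=1):
    """Slide a window from the top-left starting position over a 2-D
    list of pixel values, yielding each sub-image with its corner."""
    h, w = len(image), len(image[0])
    for y in range(0, h - window_h + 1, stride):
        for x in range(0, w - window_w + 1, stride):
            yield (y, x), [row[x:x + window_w] for row in image[y:y + window_h]]

def ncc_similarity(sub, template):
    """Normalised cross-correlation of two equal-sized greyscale
    patches: 1.0 for a perfect match, falling off towards -1."""
    a = [p for row in sub for p in row]
    b = [p for row in template for p in row]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [p - ma for p in a]
    db = [p - mb for p in b]
    denom = math.sqrt(sum(p * p for p in da)) * math.sqrt(sum(p * p for p in db))
    return sum(x * y for x, y in zip(da, db)) / denom if denom else 0.0

def find_matches(image, template, threshold):
    """Every sub-image whose similarity to the template exceeds the
    (preset, later adjusted) threshold is a candidate match."""
    th, tw = len(template), len(template[0])
    return [(pos, s) for pos, sub in sliding_sub_images(image, th, tw)
            if (s := ncc_similarity(sub, template)) > threshold]
```

Note that the window size equals the template size, as the first aspect requires, so each sub-image can be compared with the template pixel for pixel.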
With reference to the first aspect, in a possible implementation manner, the acquiring a preset similarity threshold includes:
acquiring a preset similarity threshold range, wherein the preset similarity threshold range comprises a left similarity threshold boundary and a right similarity threshold boundary;
and determining an arithmetic average value of the left boundary of the similarity threshold and the right boundary of the similarity threshold as the preset similarity threshold.
With reference to the first aspect, in a possible implementation manner, the adjusting the preset similarity threshold includes:
If the number of the target sub-images to be processed is greater than the number of the target images, updating the left boundary of the similarity threshold to be the preset similarity threshold, and determining the arithmetic average value of the updated left boundary of the similarity threshold and the right boundary of the similarity threshold to be the adjusted preset similarity threshold;
if the number of the target sub-images to be processed is smaller than the number of the target images, updating the right boundary of the similarity threshold to be the preset similarity threshold, and determining the arithmetic average value of the right boundary of the updated similarity threshold and the left boundary of the similarity threshold to be the adjusted preset similarity threshold.
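The two boundary-update rules above amount to a binary search over the threshold range. A minimal sketch, assuming similarity values in [0, 1] and an invented helper name (the iteration cap is an illustrative safeguard the patent does not specify):

```python
def calibrate_threshold(similarities, target_count, lo=0.0, hi=1.0, max_iter=50):
    """Binary-search a similarity threshold until the number of
    sub-images whose similarity exceeds it equals target_count."""
    for _ in range(max_iter):
        mid = (lo + hi) / 2.0           # arithmetic mean of the two boundaries
        matched = sum(1 for s in similarities if s > mid)
        if matched == target_count:
            return mid
        if matched > target_count:      # too many matches: raise the left boundary
            lo = mid
        else:                           # too few matches: lower the right boundary
            hi = mid
    return (lo + hi) / 2.0
```

This replaces the manual threshold tuning criticised in the Background: the known number of target images in a calibration image drives the search automatically.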
With reference to the first aspect, in a possible implementation manner, before the acquiring the first sliding window information, the method further includes:
acquiring a first feature vector corresponding to a first feature point included in a third template image, and acquiring a second feature vector corresponding to a second feature point included in the first image to be processed, wherein the feature point comprises a contour point in the image, and the feature vector is a vector describing pixel points around the feature point;
and calculating the Hamming distance between the first feature vector and the second feature vector, determining a third similarity value between the third template image and the first image to be processed according to the Hamming distance, and acquiring first sliding window information if the third similarity value is not greater than a first preset similarity threshold value so as to acquire a plurality of first sub-images to be processed included in the first image to be processed according to the first sliding window information.
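The Hamming-distance comparison of binary feature descriptors (as produced by detectors such as ORB) can be illustrated as below. The mapping from distance to a [0, 1] similarity is one reasonable choice, not necessarily the patent's exact formula:

```python
def hamming_distance(v1, v2):
    """Number of differing bits between two equal-length binary
    descriptors given as sequences of 0/1 values."""
    return sum(b1 != b2 for b1, b2 in zip(v1, v2))

def descriptor_similarity(v1, v2):
    """Map the Hamming distance to a [0, 1] similarity: identical
    descriptors score 1.0, fully differing descriptors score 0.0."""
    return 1.0 - hamming_distance(v1, v2) / len(v1)
```

If this whole-image feature similarity already exceeds the first preset similarity threshold, the costlier sliding-window pass can be skipped, which is the point of running the feature comparison first.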
With reference to the first aspect, in a possible implementation manner, the second similarity value includes a normalized correlation matching similarity value and a normalized correlation coefficient matching similarity value; the method further comprises the steps of:
determining a second sub-image to be processed corresponding to the normalized correlation matching similarity value larger than the target similarity threshold from a plurality of normalized correlation matching similarity values to generate a first candidate image set;
determining a second sub-image to be processed corresponding to the normalized correlation coefficient matching similarity value larger than the target similarity threshold from a plurality of normalized correlation coefficient matching similarity values to generate a second candidate image set;
acquiring first center point positions corresponding to first candidate images included in the first candidate image set, acquiring second center point positions corresponding to second candidate images included in the second candidate image set, and calculating center point position difference values between the first center point positions and the second center point positions;
and if the center point position difference between the first center point position corresponding to any first candidate image and the second center point position corresponding to any second candidate image is not greater than the preset center point position difference, determining whichever of the two candidate images has the larger similarity value as the target matching image.
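The cross-validation of the two candidate sets by centre-point proximity might be sketched like this. The Euclidean distance between centre points and the data layout are assumptions, since the text does not fix how the position difference is computed:

```python
import math

def fuse_candidates(ccorr_hits, ccoeff_hits, max_center_diff=5.0):
    """Cross-validate two candidate lists: a match is kept only when a
    candidate from each matching method has (nearly) the same centre
    point, and the candidate with the larger similarity value wins.
    Each hit is ((cx, cy), similarity)."""
    matches = []
    for c1, s1 in ccorr_hits:
        for c2, s2 in ccoeff_hits:
            if math.dist(c1, c2) <= max_center_diff:
                matches.append((c1, s1) if s1 >= s2 else (c2, s2))
    return matches
```

Requiring agreement between normalised correlation matching and normalised correlation coefficient matching filters out locations that only one of the two measures scores highly.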
With reference to the first aspect, in a possible implementation manner, the second image to be processed is an RGB image; the method further comprises the steps of:
acquiring an R channel image similarity value, a G channel image similarity value and a B channel image similarity value of each second sub-image to be processed in the plurality of second sub-images to be processed and the second template image;
acquiring a first weight value corresponding to a preset R channel image, a second weight value corresponding to a G channel image and a third weight value corresponding to a B channel image, wherein the sum of the first weight value, the second weight value and the third weight value is equal to 1;
carrying out weighted summation on the R channel image similarity value, the G channel image similarity value, the B channel image similarity value, the first weight value, the second weight value and the third weight value corresponding to each second sub-image to be processed to obtain a fourth similarity value corresponding to each second sub-image to be processed;
and determining the second sub-image to be processed corresponding to the fourth similarity value larger than the target similarity threshold as a target matching image.
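The per-channel fusion above reduces to a weighted sum. A sketch with illustrative default weights (the text only requires that the three weights sum to 1):

```python
def weighted_rgb_similarity(sim_r, sim_g, sim_b, w_r=0.3, w_g=0.4, w_b=0.3):
    """Weighted sum of the R, G and B channel similarity values; the
    three weights must sum to 1, as the description requires."""
    assert abs(w_r + w_g + w_b - 1.0) < 1e-9
    return w_r * sim_r + w_g * sim_g + w_b * sim_b
```

Weighting the channels separately lets a match be scored higher in the channel that carries the most distinguishing colour information for a given template.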
With reference to the first aspect, in a possible implementation manner, the second image to be processed is a plurality of interface images included in the application program; the method further comprises the steps of:
acquiring target matching images included in each interface image in the plurality of interface images, and executing test operation in each target matching image;
acquiring performance index parameters based on a performance data acquisition interface, wherein the performance index parameters comprise CPU occupancy rate parameters, memory occupancy rate parameters and fluency FPS parameters of a central processing unit;
and acquiring log information corresponding to the application program, extracting alarm information included in the log information, and generating a performance test report based on the performance index parameter and the alarm information.
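Assembling the performance test report from the collected metrics and the application log could look like the following sketch; the metric keys and the warning-matching pattern are illustrative assumptions, not the patent's actual interface:

```python
import re

def build_performance_report(metrics, log_text):
    """Assemble a minimal performance-test report: the collected
    metric values plus every warning line found in the app log."""
    warnings = [line for line in log_text.splitlines()
                if re.search(r"\b(WARN|WARNING)\b", line)]
    return {
        "cpu_percent": metrics.get("cpu_percent"),
        "memory_percent": metrics.get("memory_percent"),
        "fps": metrics.get("fps"),
        "warnings": warnings,
    }
```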
In a second aspect, an embodiment of the present application provides a data processing apparatus, including:
the first sub-image to be processed acquisition module is used for acquiring a first image to be processed and acquiring first sliding window information, and sliding from a starting point position of the first image to be processed based on a first window indicated by the first sliding window information so as to obtain a plurality of first sub-images to be processed, which are included in the first image to be processed, wherein the first sliding window information comprises a first window size which is the same as the size of the first template image;
the similarity value acquisition module is used for acquiring the first template image and calculating a first similarity value of each first sub-image to be processed and the first template image;
The similarity threshold determining module is configured to obtain a preset similarity threshold and a number of target images included in the first to-be-processed image, determine a number of target to-be-processed sub-images corresponding to a first similarity value greater than the preset similarity threshold, adjust the preset similarity threshold until the number of target to-be-processed sub-images determined based on the adjusted preset similarity threshold is equal to the number of target images, and determine the preset similarity threshold after stopping adjustment as a target similarity threshold, where the first template image is an image including the same object as the target image;
the second sub-image to be processed acquisition module is used for acquiring second sliding window information, a second image to be processed and a second template image, acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information, wherein the first image to be processed is the same as the second image to be processed in image type, the second sliding window information comprises a second window size, and the second window size is the same as the second template image in size;
the target matching image acquisition module is used for acquiring a second similarity value of each second sub-image to be processed and the second template image, and determining the second sub-image to be processed corresponding to the second similarity value which is larger than the target similarity threshold value as a target matching image.
With reference to the second aspect, in one possible implementation manner, the similarity threshold determining module includes a preset similarity threshold obtaining unit and a similarity threshold adjusting unit, where the preset similarity threshold obtaining unit includes:
the threshold range obtaining subunit is used for obtaining a preset similarity threshold range, wherein the preset similarity threshold range comprises a left similarity threshold boundary and a right similarity threshold boundary;
and the similarity threshold determining subunit is used for determining an arithmetic average value of the left boundary of the similarity threshold and the right boundary of the similarity threshold as the preset similarity threshold.
With reference to the second aspect, in one possible implementation manner, the similarity threshold adjustment unit is specifically configured to:
if the number of the target sub-images to be processed is greater than the number of the target images, updating the left boundary of the similarity threshold to be the preset similarity threshold, and determining the arithmetic average value of the updated left boundary of the similarity threshold and the right boundary of the similarity threshold to be the adjusted preset similarity threshold;
if the number of the target sub-images to be processed is smaller than the number of the target images, updating the right boundary of the similarity threshold to be the preset similarity threshold, and determining the arithmetic average value of the right boundary of the updated similarity threshold and the left boundary of the similarity threshold to be the adjusted preset similarity threshold.
With reference to the second aspect, in a possible implementation manner, the apparatus further includes a feature matching module, where the feature matching module includes:
the feature point acquisition unit is used for acquiring a first feature vector corresponding to a first feature point included in the third template image and acquiring a second feature vector corresponding to a second feature point included in the first image to be processed, wherein the feature point comprises a contour point in the image, and the feature vector is a vector describing pixel points around the feature point;
and the feature point similarity calculation unit is used for calculating the Hamming distance between the first feature vector and the second feature vector, determining a third similarity value between the third template image and the first image to be processed according to the Hamming distance, and acquiring first sliding window information if the third similarity value is not greater than a first preset similarity threshold value so as to acquire a plurality of first sub-images to be processed included in the first image to be processed according to the first sliding window information.
With reference to the second aspect, in a possible implementation manner, the second similarity value includes a normalized correlation matching similarity value and a normalized correlation coefficient matching similarity value; the target matching image acquisition module comprises:
The first candidate image set acquisition unit is used for determining a second sub-image to be processed corresponding to the normalized correlation matching similarity value larger than the target similarity threshold from a plurality of normalized correlation matching similarity values so as to generate a first candidate image set;
the second candidate image set obtaining unit is used for determining a second sub-image to be processed corresponding to the normalized correlation coefficient matching similarity value larger than the target similarity threshold from the multiple normalized correlation coefficient matching similarity values so as to generate a second candidate image set;
a center point difference value obtaining unit, configured to obtain each first center point position corresponding to each first candidate image included in the first candidate image set, and obtain each second center point position corresponding to each second candidate image included in the second candidate image set, and calculate a center point position difference value between each first center point position and each second center point position;
and the first target matching image determining unit is used for determining whichever of any first candidate image and any second candidate image has the larger similarity value as the target matching image, if the center point position difference between the first center point position corresponding to that first candidate image and the second center point position corresponding to that second candidate image is not greater than the preset center point position difference.
With reference to the second aspect, in a possible implementation manner, the second image to be processed is an RGB image; the target matching image acquisition module comprises:
the similarity value acquisition unit is used for acquiring an R channel image similarity value, a G channel image similarity value and a B channel image similarity value of each second sub-image to be processed in the plurality of second sub-images to be processed and the second template image;
the weight value acquisition unit is used for acquiring a first weight value corresponding to a preset R channel image, a second weight value corresponding to a G channel image and a third weight value corresponding to a B channel image, wherein the sum of the first weight value, the second weight value and the third weight value is equal to 1;
the similarity value processing unit is used for carrying out weighted summation on the R channel image similarity value, the G channel image similarity value, the B channel image similarity value, the first weight value, the second weight value and the third weight value corresponding to each second sub-image to be processed to obtain a fourth similarity value corresponding to each second sub-image to be processed;
and the second target matching image determining unit is used for determining a second sub-image to be processed corresponding to a fourth similarity value larger than the target similarity threshold as a target matching image.
With reference to the second aspect, in a possible implementation manner, the second image to be processed is a plurality of interface images included in the application program; the apparatus further includes a performance test report acquisition module, the performance test report acquisition module including:
a test operation execution unit, configured to obtain a target matching image included in each of the plurality of interface images, and execute a test operation in each of the target matching images;
the performance parameter acquisition unit is used for acquiring performance index parameters based on the performance data acquisition interface, wherein the performance index parameters comprise CPU occupancy rate parameters, memory occupancy rate parameters and fluency FPS parameters of a central processing unit;
and the performance test report generating unit is used for acquiring the log information corresponding to the application program, extracting the alarm information included in the log information, and generating a performance test report based on the performance index parameter and the alarm information.
In a third aspect, embodiments of the present application provide a terminal device that includes a processor and a memory, where the processor and the memory are interconnected. The memory is configured to store a computer program supporting the terminal device to perform the method provided by the first aspect and/or any of the possible implementation manners of the first aspect, the computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method provided by the first aspect and/or any of the possible implementation manners of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method provided by the first aspect and/or any one of the possible implementations of the first aspect.
In this embodiment of the present application, by acquiring the first to-be-processed image and acquiring the first sliding window information, the first window indicated by the first sliding window information may be slid from the starting point position of the first to-be-processed image, so as to obtain a plurality of first to-be-processed sub-images included in the first to-be-processed image. By acquiring the first template image, a first similarity value between each first sub-image to be processed in the plurality of first sub-images to be processed and the first template image can be calculated. Acquiring a preset similarity threshold and the number of target images corresponding to the target images included in the first image to be processed, determining the number of sub-images to be processed corresponding to a first similarity value larger than the preset similarity threshold from a plurality of first similarity values, and adjusting the preset similarity threshold based on the magnitude relation between the number of sub-images to be processed and the number of target images until the number of sub-images to be processed determined based on the adjusted preset similarity threshold is equal to the number of target images, and determining the adjusted preset similarity threshold as the target similarity threshold. And acquiring second sliding window information, a second image to be processed and a second template image, and acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information. 
And acquiring a second similarity value between each second sub-image to be processed in the plurality of second sub-images to be processed and the second template image, and determining, from the plurality of second similarity values corresponding to the plurality of second sub-images to be processed, the second sub-image to be processed whose second similarity value is greater than the target similarity threshold as the target matching image. By adopting the embodiments of the application, the accuracy and efficiency of identifying the target matching image can be improved, the time required to locate the target matching image can be shortened, and the applicability is high.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flowchart of a data processing method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an application scenario of a target image according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an application scenario for acquiring a first sub-image to be processed according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an application scenario of a first template image according to an embodiment of the present application;
FIG. 5 is another schematic flowchart of a data processing method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an application scenario for determining a target matching image according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
FIG. 8 is another schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The data processing method provided by the embodiment of the application can be widely applied to terminal equipment capable of carrying out image recognition. The terminal device includes, but is not limited to, a server, a smart phone, a tablet computer, a notebook computer, a desktop computer, and the like, which are not limited herein. For convenience of description, a terminal device will be described as an example. According to the method, by acquiring the first to-be-processed image and acquiring the first sliding window information, the first window indicated by the first sliding window information can slide from the starting point position of the first to-be-processed image so as to obtain a plurality of first to-be-processed sub-images included in the first to-be-processed image. By acquiring the first template image, a first similarity value between each first sub-image to be processed in the plurality of first sub-images to be processed and the first template image can be calculated. Acquiring a preset similarity threshold and the number of target images corresponding to the target images included in the first image to be processed, determining the number of sub-images to be processed corresponding to a first similarity value larger than the preset similarity threshold from a plurality of first similarity values, and adjusting the preset similarity threshold based on the magnitude relation between the number of sub-images to be processed and the number of target images until the number of sub-images to be processed determined based on the adjusted preset similarity threshold is equal to the number of target images, and determining the adjusted preset similarity threshold as the target similarity threshold. 
And acquiring second sliding window information, a second image to be processed and a second template image, and acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information. And acquiring a second similarity value of each second to-be-processed sub-image in the plurality of second to-be-processed sub-images and the second template image, and determining a second to-be-processed sub-image corresponding to the second similarity value larger than the target similarity threshold value from the plurality of second similarity values corresponding to the plurality of second to-be-processed sub-images as a target matching image. By adopting the embodiment of the application, the accuracy and the recognition efficiency of recognizing the target matching image can be improved.
The method and the related apparatus according to the embodiments of the present application will be described in detail below with reference to fig. 1 to 9. The method provided by the embodiment of the application may include a data processing stage for acquiring a first similarity value between a first template image and each first sub-image to be processed, determining the number of matching sub-images based on a preset similarity threshold and the first similarity values, adjusting the preset similarity threshold based on the number of target images and the number of matching sub-images, and determining a target matching image from second sub-images to be processed based on the adjusted preset similarity threshold. The implementations of these data processing stages are shown in fig. 1 and fig. 5.
Referring to fig. 1, fig. 1 is a flow chart of a data processing method according to an embodiment of the present application. The method provided by the embodiment of the application may include the following steps 101 to 105:
101. and acquiring a first image to be processed and acquiring first sliding window information, and sliding from the starting point position of the first image to be processed based on the first window indicated by the first sliding window information so as to obtain a plurality of first sub-images to be processed, which are included in the first image to be processed.
The first image to be processed includes a target image. It should be understood that the target image may be any specified image, that is, the image that the user specifies to be obtained; for example, the target image may be a face, a number, or a symbol in the first image to be processed, which is not limited herein. For example, referring to fig. 2, fig. 2 is a schematic view of an application scenario of a target image according to an embodiment of the present application. As shown in fig. 2, the entire first image to be processed is composed of one number "2", one number "3", four numbers "4", one number "5", one number "6" and some blank portions. Assuming that the user designates the number "4" included in the first image to be processed as the desired image, the target image is all the numbers "4" selected by the rectangular frames shown in fig. 2.
It should be understood that the first sliding window information includes information such as a first window size corresponding to the first window and a window step size corresponding to the first window. Wherein the first window size is the same as the size of the acquired first template image. Based on the first window indicated by the first sliding window information, starting from the starting point position of the first image to be processed, sliding by only moving the distance of one window step length at a time according to the sequence from left to right and from top to bottom, and obtaining a plurality of first sub images to be processed, wherein the first sub images are included in the first image to be processed.
For example, referring to fig. 3, fig. 3 is a schematic view of an application scenario for acquiring a first sub-image to be processed according to an embodiment of the present application. Assuming that the first to-be-processed image is a 720p image, that is, the image line width of the first to-be-processed image is 720, and the image column width of the first to-be-processed image is 1280, the size of the first to-be-processed image may be expressed as the image line width×the image column width, that is, 720×1280. The first window is assumed to have a first window size of 300×300, and a window step size of 84, so that, starting from a starting point position of the first image to be processed, the first window (i.e., 300×300) is utilized to sequentially slide the first image to be processed from left to right and from top to bottom according to a distance of one window step (i.e., 84) moved each time, so that an image area covered by the first window in the first image to be processed each time can be determined as a first sub-image to be processed. 
That is, the first window slides, in left-to-right order, from the start position of the first image to be processed (i.e., row 0, column 0) until it reaches the last column of the first row (i.e., column 1279 of row 0). The first window is then moved down by one window step (84), and sliding continues from left to right starting at row 84, column 0, moving one window step at a time, and so on, until the first window reaches the end position of the first image to be processed (i.e., the last column of the last row). The image area covered by the first window at each position is determined as a first sub-image to be processed, thereby obtaining the plurality of first sub-images to be processed covered by the entire sliding process.
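As a minimal pure-Python sketch of the sliding procedure above (illustrative only, not the patent's implementation; how windows that would overrun the image boundary are handled is an assumption — here they are simply not emitted):

```python
def sliding_window_positions(img_h, img_w, win_h, win_w, step):
    """Enumerate the top-left (row, col) position of every window placement,
    sliding left-to-right, then top-to-bottom, one step at a time.
    Each position corresponds to one first sub-image to be processed."""
    positions = []
    row = 0
    while row + win_h <= img_h:       # stop before the window overruns the bottom
        col = 0
        while col + win_w <= img_w:   # stop before the window overruns the right edge
            positions.append((row, col))
            col += step
        row += step
    return positions

# Using the example's 720 x 1280 image, 300 x 300 window, step 84:
positions = sliding_window_positions(720, 1280, 300, 300, 84)
```

Each `(row, col)` pair identifies one sub-image; the patent's description of moving down one step after finishing a row corresponds to the outer loop.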
102. And acquiring a first template image, and calculating a first similarity value of each first sub-image to be processed in the plurality of first sub-images to be processed and the first template image.
In some possible embodiments, by acquiring the first template image, a similarity value between each of the plurality of first sub-images to be processed and the first template image, that is, the first similarity value, may be calculated according to the first template image. The size of the first template image is the same as the size of the first window for sliding, and the first template image may be an image completely the same as the target image included in the first image to be processed (i.e., may be a target image selected from the first image to be processed), or the first template image may be an image including the same object as the target image. For ease of understanding, the embodiments of the present application will be described by taking the first template image as an image that is identical to the target image included in the first image to be processed as an example.
For example, referring to fig. 4, fig. 4 is a schematic view of an application scenario of a first template image provided in an embodiment of the present application. As shown in fig. 4, the first image to be processed is composed of a head portrait of the person "Xiaoming", a sun, a flower, and some blank parts. Assuming that the target image is the head portrait of the person "Xiaoming" in the first image to be processed, the first template image may be exactly the same image as the target image, as in image 1 shown in fig. 4. Alternatively, the first template image may be an image including the same object (e.g., "Xiaoming") as the target image, such as image 2 shown in fig. 4; it is easy to see that both image 2 and the target image shown in fig. 4 are images of the person "Xiaoming", except that image 2 shows another expression of "Xiaoming".
It should be understood that each first sub-image to be processed corresponds to one similarity value, and that the greater the similarity value between a certain sub-image to be processed and the first template image, the more similar the sub-image to be processed and the first template image are. Generally, the methods for calculating the similarity value between two images (i.e., the first template image and each first sub-image to be processed) mainly include the square difference matching method (cv2.TM_SQDIFF), the normalized square difference matching method (cv2.TM_SQDIFF_NORMED), the correlation matching method (cv2.TM_CCORR), the normalized correlation matching method (cv2.TM_CCORR_NORMED), the correlation coefficient matching method (cv2.TM_CCOEFF), the normalized correlation coefficient matching method (cv2.TM_CCOEFF_NORMED), and the like. For easy understanding, the embodiment of the application mainly takes the normalized correlation matching method as the example for calculating the similarity value. It should be understood that the similarity value between the template image and each sub-image to be processed calculated based on the normalized correlation matching method is a normalized correlation matching similarity value, and the similarity value between the template image and each sub-image to be processed calculated based on the normalized correlation coefficient matching method is a normalized correlation coefficient matching similarity value.
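A minimal pure-Python sketch of the normalized correlation matching score used above (mirroring the cv2.TM_CCORR_NORMED formula on equally-sized grayscale patches; the function name is illustrative, and a real implementation would use cv2.matchTemplate):

```python
import math

def tm_ccorr_normed(template, patch):
    """Normalized correlation match score between a template and an
    equally-sized sub-image patch, each given as a 2-D list of gray values:
    sum(T*I) / sqrt(sum(T^2) * sum(I^2)). Higher means more similar."""
    num = sum(t * p for trow, prow in zip(template, patch)
              for t, p in zip(trow, prow))
    t_energy = sum(t * t for trow in template for t in trow)
    p_energy = sum(p * p for prow in patch for p in prow)
    return num / math.sqrt(t_energy * p_energy)
```

An identical template and patch score exactly 1.0, which is why a threshold close to 1 selects near-exact matches.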
103. Acquiring a preset similarity threshold and the number of target images included in the first image to be processed, determining the number of target sub-images to be processed corresponding to a first similarity value larger than the preset similarity threshold, adjusting the preset similarity threshold based on the size relation between the number of target sub-images to be processed and the number of target images until the number of target sub-images to be processed determined based on the adjusted preset similarity threshold is equal to the number of target images, and determining the adjusted preset similarity threshold as the target similarity threshold.
In some possible embodiments, after calculating the first similarity value between each first sub-image to be processed and the first template image, the preset similarity threshold and the number of target images included in the first image to be processed may be further acquired. According to the magnitude relation between the first similarity value corresponding to each first sub-image to be processed and the preset similarity threshold, whether each first sub-image to be processed is a target image can be determined. Generally, a first sub-image to be processed whose first similarity value is greater than or equal to the preset similarity threshold may be determined as the target image.
Specifically, a first similarity value greater than a preset similarity threshold is determined, and the sub-image to be processed corresponding to the first similarity value may be referred to as a target sub-image to be processed. And counting the number of target sub-images to be processed. Or, since one sub-image to be processed corresponds to one first similarity value, the number of the first similarity values greater than the preset similarity threshold may be directly determined as the number of the target sub-images to be processed.
It should be understood that the number of target images is the number of target images actually included in the first image to be processed, whereas the preset similarity threshold is only an initialized similarity threshold, so the number of target sub-images to be processed determined according to the initialized threshold is merely an estimate obtained through testing. Therefore, the embodiment of the application adjusts the preset similarity threshold based on the magnitude relation between the number of target sub-images to be processed and the number of target images, stops adjusting when the number of target sub-images to be processed determined based on the adjusted preset similarity threshold equals the number of target images, and determines the preset similarity threshold at the moment adjustment stops as the target similarity threshold. That is, the size of the similarity threshold affects the accuracy of identifying the target image based on the template matching algorithm. Specifically, when the number of target sub-images to be processed is greater than the number of target images, the initialized similarity threshold is too small, so the preset similarity threshold is increased; if the number of target sub-images to be processed is smaller than the number of target images, the initialized similarity threshold is too large, so the preset similarity threshold can be reduced.
Alternatively, in some possible embodiments, a preset similarity threshold range may be obtained, and the similarity threshold set at initialization, that is, the preset similarity threshold, may then be determined based on the preset similarity threshold range. The preset similarity threshold range includes a similarity threshold left boundary and a similarity threshold right boundary; for example, if the preset similarity threshold range is [a, b], the similarity threshold left boundary is a and the similarity threshold right boundary is b. The arithmetic average of the similarity threshold left boundary and the similarity threshold right boundary may be determined as the preset similarity threshold; for the range [a, b], the preset similarity threshold = (a+b)/2. Further, when the preset similarity threshold is adjusted based on the magnitude relation between the number of target sub-images to be processed and the number of target images: if the number of target sub-images to be processed is greater than the number of target images, the similarity threshold left boundary is updated to the preset similarity threshold, and the arithmetic average of the updated left boundary and the right boundary is determined as the adjusted preset similarity threshold; if the number of target sub-images to be processed is smaller than the number of target images, the similarity threshold right boundary is updated to the preset similarity threshold, and the arithmetic average of the updated right boundary and the left boundary is determined as the adjusted preset similarity threshold. This continues until the number of target sub-images to be processed determined based on the adjusted preset similarity threshold equals the number of target images, at which point the adjusted preset similarity threshold is determined as the target similarity threshold.
For example, taking the preset similarity threshold range as [a, b], assume the preset similarity threshold is (a+b)/2. When the number of target sub-images to be processed whose first similarity values are greater than the preset similarity threshold (a+b)/2 is greater than the number of target images, the similarity threshold left boundary is updated to the preset similarity threshold, that is, the updated similarity threshold range becomes [(a+b)/2, b]. The adjusted similarity threshold is then determined as the arithmetic average of the updated left boundary and the right boundary, namely [(a+b)/2+b]/2.
For another example, taking the preset similarity threshold range as [a, b], assume the preset similarity threshold is (a+b)/2. When the number of target sub-images to be processed whose first similarity values are greater than the preset similarity threshold (a+b)/2 is smaller than the number of target images, the similarity threshold right boundary is updated to the preset similarity threshold, that is, the updated similarity threshold range becomes [a, (a+b)/2]. The adjusted similarity threshold is then determined as the arithmetic average of the left boundary and the updated right boundary, namely [(a+b)/2+a]/2.
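The boundary-updating procedure above is a bisection search over the similarity threshold range. A minimal pure-Python sketch (function and parameter names are illustrative; the iteration cap is an assumption to guarantee termination when no threshold yields an exact match):

```python
def find_target_threshold(sim_values, target_count, a, b, max_iter=50):
    """Bisect the preset range [a, b] until the number of similarity
    values above the threshold equals the known target-image count.
    sim_values: first similarity values, one per first sub-image."""
    for _ in range(max_iter):
        thr = (a + b) / 2                                # arithmetic average
        count_above = sum(1 for s in sim_values if s > thr)
        if count_above == target_count:
            return thr                                   # target similarity threshold
        if count_above > target_count:
            a = thr   # too many matches: threshold too small, raise left boundary
        else:
            b = thr   # too few matches: threshold too large, lower right boundary
    return (a + b) / 2
```

With similarity values [0.1, 0.4, 0.6, 0.8, 0.9] and a known count of 2 targets, bisecting [0, 1] first tries 0.5 (three matches, too many), then 0.75 (two matches), which becomes the target similarity threshold.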
104. And acquiring second sliding window information, a second image to be processed and a second template image, and acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information.
In some possible embodiments, by acquiring the second image to be processed and acquiring the second sliding window information, sliding may be performed from the starting point position of the second image to be processed based on the second window indicated by the second sliding window information, so as to obtain a plurality of second sub-images to be processed included in the second image to be processed. The second sliding window information includes information such as a second window size and a window step length corresponding to the second window, where the second window size is the same as the size of the acquired second template image. It should be appreciated that the first image to be processed is of the same image type as the second image to be processed; for example, both are images of a face image type, or both are images of a scenery image type. Specifically, based on the second window indicated by the second sliding window information, sliding is performed from the starting point position of the second image to be processed, moving one window step at a time in left-to-right, top-to-bottom order, so as to obtain the plurality of second sub-images to be processed included in the second image to be processed.
105. And acquiring a second similarity value of each second sub-image to be processed and the second template image, and determining the second sub-image to be processed corresponding to the second similarity value larger than the target similarity threshold value as a target matching image.
In some possible embodiments, by acquiring the second template image, a similarity value, that is, a second similarity value, between each of the plurality of second sub-images to be processed and the second template image may be calculated according to the second template image. The size of the second template image is the same as the size of the second window used for sliding, and the second template image may be an image completely the same as the target image included in the second image to be processed (i.e., may be a target image selected from the second image to be processed), or may be an image that merely includes the same object as the target image. The embodiment of the application is described taking the second template image as an image completely the same as the target image included in the second image to be processed as an example. It should be understood that each second sub-image to be processed corresponds to one similarity value, and that the greater the similarity value between a certain sub-image to be processed and the second template image, the more similar the sub-image to be processed and the second template image are. The embodiment of the application is described taking calculation of the similarity value according to the normalized correlation matching method as an example.
And after calculating the second similarity value of each second sub-image to be processed and the second template image in the plurality of second sub-images to be processed included in the second image to be processed, comparing each second similarity value with the determined target similarity threshold value to determine the target matching image. Specifically, a second similarity value greater than a target similarity threshold may be determined from a plurality of second similarity values corresponding to a plurality of second sub-images to be processed, and the second sub-image to be processed corresponding to the determined second similarity value is used as the target matching image.
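The selection in step 105 reduces to a filter over the second similarity values. A minimal pure-Python sketch (names are illustrative):

```python
def select_matches(sub_images, sim_values, target_threshold):
    """Keep every second sub-image to be processed whose second similarity
    value with the second template image exceeds the target similarity
    threshold; the survivors are the target matching images."""
    return [img for img, s in zip(sub_images, sim_values)
            if s > target_threshold]
```

For instance, with similarity values [0.2, 0.9, 0.95] and a target threshold of 0.75, only the second and third sub-images are returned as target matching images.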
In this embodiment of the present application, by acquiring the first to-be-processed image and acquiring the first sliding window information, the first window indicated by the first sliding window information may be slid from the starting point position of the first to-be-processed image, so as to obtain a plurality of first to-be-processed sub-images included in the first to-be-processed image. By acquiring the first template image, a first similarity value between each first sub-image to be processed in the plurality of first sub-images to be processed and the first template image can be calculated. Acquiring a preset similarity threshold and the number of target images included in the first image to be processed, determining the number of target sub-images to be processed corresponding to a first similarity value larger than the preset similarity threshold, adjusting the preset similarity threshold based on the size relation between the number of target sub-images to be processed and the number of target images until the number of target sub-images to be processed determined based on the adjusted preset similarity threshold is equal to the number of target images, and determining the adjusted preset similarity threshold as the target similarity threshold. And acquiring second sliding window information, a second image to be processed and a second template image, and acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information. And acquiring a second similarity value of each second sub-image to be processed and the second template image, and taking the second sub-image to be processed corresponding to the second similarity value larger than the target similarity threshold value as a target matching image. 
By adopting the embodiment of the application, the accuracy and efficiency of identifying the target matching image can be improved, the time for hitting the target matching image can be shortened, and the applicability is strong.
Referring to fig. 5, fig. 5 is another flow chart of the data processing method according to the embodiment of the present application. The method provided by the embodiment of the present application may be illustrated by the implementation manner provided in the following steps 201 to 206:
201. and acquiring a first image to be processed and acquiring first sliding window information, and sliding from the starting point position of the first image to be processed based on the first window indicated by the first sliding window information so as to obtain a plurality of first sub-images to be processed, which are included in the first image to be processed.
202. And acquiring a first template image, and calculating a first similarity value of each first sub-image to be processed in the plurality of first sub-images to be processed and the first template image.
203. Acquiring a preset similarity threshold and the number of target images included in the first image to be processed, determining the number of target sub-images to be processed corresponding to a first similarity value larger than the preset similarity threshold, adjusting the preset similarity threshold based on the size relation between the number of target sub-images to be processed and the number of target images until the number of target sub-images to be processed determined based on the adjusted preset similarity threshold is equal to the number of target images, and determining the adjusted preset similarity threshold as the target similarity threshold.
204. And acquiring second sliding window information, a second image to be processed and a second template image, and acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information.
The specific implementation manner of step 201 to step 204 may refer to the descriptions of step 101 to step 104 in the corresponding embodiment of fig. 1, and will not be described herein.
205. And acquiring a second similarity value of each second sub-image to be processed and the second template image, and determining the second sub-image to be processed corresponding to the second similarity value larger than the target similarity threshold value as a target matching image.
In some possible embodiments, by acquiring the second template image, a similarity value, that is, a second similarity value, between each of the plurality of second sub-images to be processed and the second template image may be calculated according to the second template image. The size of the second template image is the same as the size of the second window used for sliding, and the second template image may be an image completely the same as the target image included in the second image to be processed (i.e., may be a target image selected from the second image to be processed), or may be an image that merely includes the same object as the target image. The embodiment of the application is described taking the second template image as an image completely the same as the target image included in the second image to be processed as an example. It should be understood that each second sub-image to be processed corresponds to one similarity value, and that the greater the similarity value between a certain sub-image to be processed and the second template image, the more similar the sub-image to be processed and the second template image are. The embodiment of the application is described taking calculation of the similarity value according to the normalized correlation matching method as an example.
And after calculating the second similarity value of each second sub-image to be processed and the second template image in the plurality of second sub-images to be processed included in the second image to be processed, comparing each second similarity value with the determined target similarity threshold value to determine the target matching image. Specifically, from a plurality of second similarity values corresponding to a plurality of second sub-images to be processed, a second sub-image to be processed corresponding to a second similarity value greater than a target similarity threshold may be determined as the target matching image.
Alternatively, in some possible embodiments, when the first image to be processed is an image with scaling, rotation or brightness change, the first image to be processed may first be processed based on a feature matching algorithm, and the method provided in steps 201 to 205 is performed only when the target image cannot be matched based on the feature matching algorithm. It should be appreciated that the feature matching algorithms used include, without limitation, the scale-invariant feature transform (Scale Invariant Feature Transform, SIFT) algorithm, the speeded-up robust features (Speeded Up Robust Features, SURF) algorithm, and the like. For convenience of description, the embodiment of the present application uses the SIFT algorithm as an example. Specifically, by acquiring a first feature vector corresponding to a first feature point included in the third template image and acquiring a second feature vector corresponding to a second feature point included in the first image to be processed, a hamming distance between the first feature vector and the second feature vector can be calculated, and a third similarity value between the third template image and the first image to be processed is determined according to the hamming distance. If the third similarity value is not greater than the first preset similarity threshold, the method provided in steps 201 to 205 is performed. It should be understood that feature points include contour points, corner points, edge points, bright points in dark areas, dark points in bright areas, and the like in the image, and a feature vector is a vector describing the pixel points around a feature point.
Optionally, in some possible embodiments, when the second similarity value includes at least two similarity values, that is, when the second similarity value includes similarity values between the second template image and each second sub-image to be processed calculated according to at least two template matching algorithms, candidate images may be determined from the plurality of second sub-images to be processed according to the magnitude relation between each similarity value and the target similarity threshold, so as to finally determine the target matching image from the candidate images.
For convenience of description, the embodiment of the application is described by taking an example that the second similarity value includes a normalized correlation matching similarity value and a normalized correlation coefficient matching similarity value. Specifically, from a plurality of normalized correlation matching similarity values corresponding to a plurality of second to-be-processed sub-images, a second to-be-processed sub-image corresponding to a normalized correlation matching similarity value greater than a target similarity threshold may be determined, so as to generate a first candidate image set, and for convenience of description, the second to-be-processed sub-image included in the first candidate image set may be referred to as a first candidate image. Correspondingly, from a plurality of normalized correlation coefficient matching similarity values corresponding to a plurality of second to-be-processed sub-images, a second to-be-processed sub-image corresponding to the normalized correlation coefficient matching similarity value greater than the target similarity threshold may be determined, so as to generate a second candidate image set, and for convenience of description, the second to-be-processed sub-image included in the second candidate image set may be referred to as a second candidate image. The center point position difference between each first center point position and each second center point position can be calculated by acquiring each first center point position corresponding to each first candidate image included in the first candidate image set and each second center point position corresponding to each second candidate image included in the second candidate image set. 
And if the difference value of the central point position between the first central point position corresponding to any one of the first candidate images and the second central point position corresponding to any one of the second candidate images is not greater than the preset central point position difference value, determining the image with the larger similarity value in the first candidate image and the second candidate image as the target matching image. It should be understood that the normalized correlation matching similarity value included in the second similarity value is a similarity value between the second template image and the plurality of second sub-images to be processed calculated according to the normalized correlation matching method, and the normalized correlation coefficient matching similarity value included in the second similarity value is a similarity value between the second template image and the plurality of second sub-images to be processed calculated according to the normalized correlation coefficient matching method. The center point difference between the first center point position and the second center point position may be a euclidean distance between the first center point position and the second center point position, etc., without limitation.
For example, referring to fig. 6, fig. 6 is a schematic view of an application scenario for determining a target matching image according to an embodiment of the present application. As shown in fig. 6, the entire second image to be processed is composed of one number "2", one number "3", one number "4", one number "5", one number "6", one number "7" and some blank parts. The number "7" included in the second image to be processed is the image the user designates to obtain. It is assumed that, based on the plurality of normalized correlation matching similarity values, the first candidate image corresponding to a normalized correlation matching similarity value greater than the target similarity threshold is determined to be sub-image 1, where the normalized correlation matching similarity value corresponding to sub-image 1 is s1. Based on the plurality of normalized correlation coefficient matching similarity values, the second candidate image corresponding to a normalized correlation coefficient matching similarity value greater than the target similarity threshold is determined to be sub-image 2, where the normalized correlation coefficient matching similarity value corresponding to sub-image 2 is s2, and s1 > s2. Assuming that the center point position (i.e., the first center point position) corresponding to sub-image 1 is (x1, y1), and the center point position (i.e., the second center point position) corresponding to sub-image 2 is (x2, y2), the center point position difference value z1 can be obtained as the Euclidean distance between the two center points, that is, z1 = √((x1 − x2)² + (y1 − y2)²). Assuming that the preset center point position difference value is equal to z, and z1 < z, the sub-image 1 corresponding to s1 may be determined as the target matching image.
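The center-point comparison in the example above can be expressed as a short sketch; the function name and the (similarity, center) tuple layout are illustrative assumptions, not taken from the patent:

```python
import math

def pick_target_match(cand1, cand2, max_center_diff):
    """cand1/cand2: (similarity, (cx, cy)) pairs for the candidates selected
    by the two matching methods. If the Euclidean distance between their
    center points does not exceed max_center_diff, both methods located the
    same region, so the candidate with the larger similarity value wins."""
    (s1, c1), (s2, c2) = cand1, cand2
    if math.dist(c1, c2) <= max_center_diff:  # z1 <= z in the fig. 6 example
        return cand1 if s1 >= s2 else cand2
    return None  # centers disagree: no target matching image here
```

In the fig. 6 scenario, `pick_target_match((s1, (x1, y1)), (s2, (x2, y2)), z)` returns the sub-image 1 candidate, since z1 < z and s1 > s2.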
It should be understood that if single-target recognition is performed on the second to-be-processed image, the second to-be-processed sub-image corresponding to the largest normalized correlation matching similarity value may be determined as the first candidate image from the plurality of normalized correlation matching similarity values corresponding to the plurality of second to-be-processed sub-images. Then, the second to-be-processed sub-image corresponding to the largest normalized correlation coefficient matching similarity value is determined as the second candidate image from the plurality of normalized correlation coefficient matching similarity values corresponding to the plurality of second to-be-processed sub-images, and the target matching image is further determined according to the center point position difference value between the first candidate image and the second candidate image.
Optionally, in some possible embodiments, if the second image to be processed is an RGB image, after the RGB image is converted into the gray-scale image, a plurality of second sub-images to be processed included in the second image to be processed are acquired based on the second sliding window information, and according to the second template image, a second similarity value between each second sub-image to be processed and the second template image is calculated, and finally, from a plurality of second similarity values corresponding to the plurality of second sub-images to be processed, a second similarity value greater than a target similarity threshold is determined, and the second sub-image to be processed corresponding to the determined second similarity value is used as the target matching image. Or when the second image to be processed is an RGB image, the second image to be processed can be converted into a single-channel image, namely an R-channel image, a G-channel image and a B-channel image, and then a plurality of second sub-images to be processed, which are included in each single-channel image, are respectively obtained based on the second sliding window information. And then according to the second template image, determining similarity values between a plurality of second sub-images to be processed, which are included in each single-channel image, and the second template image respectively, namely an R-channel image similarity value, a G-channel image similarity value and a B-channel image similarity value. 
A preset first weight value corresponding to the R channel image, a second weight value corresponding to the G channel image and a third weight value corresponding to the B channel image are acquired, where the sum of the first weight value, the second weight value and the third weight value is equal to 1. The R channel image similarity value, the G channel image similarity value and the B channel image similarity value corresponding to each second sub-image to be processed may then be weighted and summed with the first weight value, the second weight value and the third weight value, respectively, to obtain a fourth similarity value corresponding to each second sub-image to be processed. Finally, a fourth similarity value greater than the target similarity threshold is determined from the plurality of fourth similarity values corresponding to the plurality of second sub-images to be processed, and the second sub-image to be processed corresponding to the determined fourth similarity value is taken as the target matching image. Generally, the weight values of the three RGB channels are 0.299 for the R channel image, 0.587 for the G channel image, and 0.114 for the B channel image. For example, assuming that for a certain second sub-image to be processed the similarity value between its R channel image and the second template image is similarity value 1, the similarity value between its G channel image and the second template image is similarity value 2, and the similarity value between its B channel image and the second template image is similarity value 3, then the fourth similarity value corresponding to that second sub-image to be processed = similarity value 1 × first weight value + similarity value 2 × second weight value + similarity value 3 × third weight value.
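The weighted fusion of the three per-channel similarity values can be sketched as follows; the default weights are the luminance coefficients quoted above, and the function name is an illustrative assumption:

```python
def fused_similarity(sim_r, sim_g, sim_b, weights=(0.299, 0.587, 0.114)):
    """Fourth similarity value: weighted sum of the R/G/B channel
    similarity values. The weights must sum to 1, as the text requires."""
    w_r, w_g, w_b = weights
    assert abs(w_r + w_g + w_b - 1.0) < 1e-9
    return sim_r * w_r + sim_g * w_g + sim_b * w_b
```

For instance, `fused_similarity(0.8, 0.9, 0.7)` yields 0.8 × 0.299 + 0.9 × 0.587 + 0.7 × 0.114 = 0.8473.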
206. Acquire the target matching image included in each interface image of the plurality of interface images, execute a test operation in each target matching image, acquire performance index parameters based on a performance data acquisition interface, acquire log information corresponding to the application program, extract alarm information included in the log information, and generate a performance test report based on the performance index parameters and the alarm information.
In some possible embodiments, the second image to be processed may be an interface image included in a certain Application (APP), for example, may be a game interface image included in a game Application. Thus, based on the methods provided in steps 201 to 205, for a plurality of interface images included in the application program, a target matching image may be respectively identified from each interface image, so as to perform a test operation on a position of each identified target matching image. The test operation includes a click operation, a slide operation, a long press operation, and the like, which are not limited herein. It should be appreciated that performance index parameters may also be obtained during execution of the test operation based on the performance data acquisition interface, wherein the collected performance index parameters include, without limitation, central processing unit (central processing unit, CPU) occupancy parameters, memory occupancy parameters, fluency (Frames Per Second, FPS) parameters, and the like. Further, after the test is finished, by acquiring the log information corresponding to the application program, the alarm information can be extracted from the log information, and a performance test report is generated based on the extracted performance index parameters and the alarm information so as to be checked by a user.
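The final report-assembly step might look like the following sketch. The function name, metric keys, and alarm keywords are illustrative assumptions, since the patent does not fix a log format or collection API:

```python
def build_performance_report(metrics, log_text, alarm_keywords=("WARN", "ERROR")):
    """Extract alarm lines from the application log and combine them with
    the collected performance index parameters (e.g. CPU occupancy, memory
    occupancy, FPS) into a simple report structure for the user to review."""
    alarms = [line for line in log_text.splitlines()
              if any(key in line for key in alarm_keywords)]
    return {"metrics": metrics, "alarms": alarms}
```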
In this embodiment of the present application, by acquiring the first to-be-processed image and acquiring the first sliding window information, the first window indicated by the first sliding window information may be slid from the starting point position of the first to-be-processed image, so as to obtain a plurality of first to-be-processed sub-images included in the first to-be-processed image. By acquiring the first template image, a first similarity value between each first sub-image to be processed in the plurality of first sub-images to be processed and the first template image can be calculated. Acquiring a preset similarity threshold and the number of target images corresponding to target images included in the first image to be processed, determining the number of target sub-images to be processed corresponding to a first similarity value larger than the preset similarity threshold, adjusting the preset similarity threshold based on the size relation between the number of target sub-images to be processed and the number of target images until the number of target sub-images to be processed determined based on the adjusted preset similarity threshold is equal to the number of target images, and determining the adjusted preset similarity threshold as the target similarity threshold. And acquiring second sliding window information, a second image to be processed and a second template image, and acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information. And acquiring a second similarity value of each second to-be-processed sub-image in the plurality of second to-be-processed sub-images and the second template image, and taking the second to-be-processed sub-image corresponding to the second similarity value larger than the target similarity threshold value as the target matching image. 
Further, by identifying a target matching image included in each of a plurality of interface images of the application, a test operation may be performed in each of the target matching images. The performance data acquisition interface is used for acquiring performance index parameters and log information corresponding to the application program, and can be used for extracting alarm information included in the log information and generating a performance test report based on the performance index parameters and the alarm information. By adopting the embodiment of the application, the accuracy and efficiency of identifying the target matching image can be improved, the time for hitting the target matching image can be shortened, and the applicability is high.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application. The data processing device provided by the embodiment of the application comprises:
a first sub-image to be processed obtaining module 31, configured to obtain a first image to be processed and obtain first sliding window information, and slide from a start position of the first image to be processed based on a first window indicated by the first sliding window information, so as to obtain a plurality of first sub-images to be processed included in the first image to be processed, where the first sliding window information includes a first window size, and the first window size is the same as a size of a first template image;
a similarity value obtaining module 32, configured to obtain the first template image, and calculate a first similarity value of each first sub-image to be processed and the first template image;
a similarity threshold determining module 33, configured to obtain a preset similarity threshold and a number of target images included in the first to-be-processed image, determine a number of target to-be-processed sub-images corresponding to a first similarity value greater than the preset similarity threshold, adjust the preset similarity threshold until the number of target to-be-processed sub-images determined based on the adjusted preset similarity threshold is equal to the number of target images, and determine the preset similarity threshold after stopping adjustment as a target similarity threshold, where the first template image is an image including the same object as the target image;
A second sub-image to be processed obtaining module 34, configured to obtain second sliding window information, a second image to be processed, and a second template image, obtain, based on the second sliding window information, a plurality of second sub-images to be processed included in the second image to be processed, where the first image to be processed is the same as the image type of the second image to be processed, and the second sliding window information includes a second window size, and the second window size is the same as the size of the second template image;
the target matching image obtaining module 35 is configured to obtain a second similarity value of each of the second sub-images to be processed and the second template image, and determine a second sub-image to be processed corresponding to the second similarity value greater than the target similarity threshold as a target matching image.
Referring to fig. 8, fig. 8 is another schematic structural diagram of a data processing apparatus according to an embodiment of the present application, where:
in some possible embodiments, the similarity threshold determining module 33 includes a preset similarity threshold obtaining unit 331 and a similarity threshold adjusting unit 332, where the preset similarity threshold obtaining unit 331 includes:
A threshold range obtaining subunit 3311, configured to obtain a preset similarity threshold range, where the preset similarity threshold range includes a similarity threshold left boundary and a similarity threshold right boundary;
a similarity threshold determining subunit 3312, configured to determine an arithmetic average of the left boundary of the similarity threshold and the right boundary of the similarity threshold as the preset similarity threshold.
In some possible embodiments, the similarity threshold adjustment unit 332 is specifically configured to:
if the number of the target sub-images to be processed is greater than the number of the target images, updating the left boundary of the similarity threshold to be the preset similarity threshold, and determining the arithmetic average value of the updated left boundary of the similarity threshold and the right boundary of the similarity threshold to be the adjusted preset similarity threshold;
if the number of the target sub-images to be processed is smaller than the number of the target images, updating the right boundary of the similarity threshold to be the preset similarity threshold, and determining the arithmetic average value of the right boundary of the updated similarity threshold and the left boundary of the similarity threshold to be the adjusted preset similarity threshold.
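The adjustment rule of the similarity threshold adjustment unit 332 amounts to a binary search over the preset similarity threshold range. A minimal sketch, with illustrative function and parameter names:

```python
def calibrate_threshold(similarities, target_count, left=0.0, right=1.0, eps=1e-6):
    """Bisect the [left, right] similarity threshold range until exactly
    target_count sub-images score above the threshold: too many matches
    means the threshold is too permissive (raise the left boundary); too
    few means it is too strict (lower the right boundary)."""
    while right - left > eps:
        threshold = (left + right) / 2.0  # arithmetic mean of the boundaries
        matched = sum(1 for s in similarities if s > threshold)
        if matched == target_count:
            return threshold
        if matched > target_count:
            left = threshold              # update the left boundary
        else:
            right = threshold             # update the right boundary
    return (left + right) / 2.0
```

This converges in at most about log2(1/eps) iterations regardless of how many sub-images were scored.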
In some possible embodiments, the apparatus further comprises a feature matching module 36, the feature matching module 36 comprising:
A feature point obtaining unit 361, configured to obtain a first feature vector corresponding to a first feature point included in a third template image, and obtain a second feature vector corresponding to a second feature point included in the first image to be processed, where the feature point includes a contour point in the image, and the feature vector is a vector describing a pixel point around the feature point;
the feature point similarity calculation unit 362 is configured to calculate a hamming distance between the first feature vector and the second feature vector, determine a third similarity value between the third template image and the first image to be processed according to the hamming distance, and if the third similarity value is not greater than a first preset similarity threshold, obtain first sliding window information, so as to obtain a plurality of first sub-images to be processed included in the first image to be processed according to the first sliding window information.
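For binary feature descriptors (for example ORB-style descriptors packed as bytes), the Hamming-distance comparison used by feature matching module 36 can be sketched as follows; the mapping of the distance to a [0, 1] similarity is an illustrative assumption:

```python
def hamming_distance(desc1, desc2):
    """Number of differing bits between two equal-length binary descriptors."""
    return sum(bin(a ^ b).count("1") for a, b in zip(desc1, desc2))

def descriptor_similarity(desc1, desc2):
    """Map the Hamming distance to [0, 1]: 1.0 means identical descriptors.
    When this coarse similarity is not greater than the first preset
    similarity threshold, the method falls back to sliding-window
    template matching over the full first image to be processed."""
    total_bits = 8 * len(desc1)
    return 1.0 - hamming_distance(desc1, desc2) / total_bits
```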
In some possible embodiments, the second similarity value includes a normalized correlation matching similarity value, a normalized correlation coefficient matching similarity value; the object-matching image acquisition module 35 includes:
a first candidate image set obtaining unit 351, configured to determine, from a plurality of normalized correlation match similarity values, a second sub-image to be processed corresponding to the normalized correlation match similarity value that is greater than the target similarity threshold, so as to generate a first candidate image set;
A second candidate image set obtaining unit 352, configured to determine, from a plurality of normalized correlation coefficient matching similarity values, a second sub-image to be processed corresponding to the normalized correlation coefficient matching similarity value that is greater than the target similarity threshold, so as to generate a second candidate image set;
a center point difference value obtaining unit 353, configured to obtain each first center point position corresponding to each first candidate image included in the first candidate image set, and obtain each second center point position corresponding to each second candidate image included in the second candidate image set, and calculate a center point position difference value between each first center point position and each second center point position;
the first target matching image determining unit 354 is configured to determine an image with a larger similarity value in the any first candidate image and the any second candidate image as a target matching image if a difference between a first center point position corresponding to the any first candidate image and a second center point position corresponding to the any second candidate image is not greater than a preset center point position difference.
In some possible embodiments, the second image to be processed is an RGB image; the object-matching image acquisition module 35 includes:
A similarity value obtaining unit 355, configured to obtain an R-channel image similarity value, a G-channel image similarity value, and a B-channel image similarity value of each of the plurality of second sub-images to be processed and the second template image;
the weight value obtaining unit 356 is configured to obtain a first weight value corresponding to a preset R channel image, a second weight value corresponding to a G channel image, and a third weight value corresponding to a B channel image, where a sum of the first weight value, the second weight value, and the third weight value is equal to 1;
a similarity value processing unit 357, configured to perform weighted summation on the R-channel image similarity value, the G-channel image similarity value, the B-channel image similarity value, the first weight value, the second weight value, and the third weight value corresponding to each second sub-image to be processed to obtain a fourth similarity value corresponding to each second sub-image to be processed;
and a second target matching image determining unit 358, configured to determine a second sub-image to be processed corresponding to a fourth similarity value greater than the target similarity threshold as a target matching image.
In some possible embodiments, the second image to be processed is a plurality of interface images included in the application program; the apparatus further comprises a performance test report acquisition module 37, the performance test report acquisition module 37 comprising:
A test operation execution unit 371 for acquiring a target matching image included in each of the plurality of interface images, and executing a test operation in each of the target matching images;
a performance parameter obtaining unit 372, configured to obtain performance index parameters based on a performance data acquisition interface, where the performance index parameters include a CPU occupancy parameter, a memory occupancy parameter, and a fluency FPS parameter of a central processing unit;
the performance test report generating unit 373 is configured to obtain log information corresponding to the application, extract alarm information included in the log information, and generate a performance test report based on the performance index parameter and the alarm information.
In a specific implementation, the data processing apparatus may execute, through each functional module built therein, the implementations provided in each step of fig. 1 and fig. 5. For example, the first sub-image to be processed obtaining module 31 may be configured to perform the implementations of acquiring the first image to be processed, acquiring the first sliding window information, and obtaining the first sub-images to be processed in the above steps; reference may be made to the implementations provided in the above steps, which are not described herein again. The similarity value obtaining module 32 may be configured to perform the implementations of acquiring the first template image and calculating the similarity value between the first template image and each first sub-image to be processed in the above steps; reference may be made to the implementations provided in the above steps, which are not described herein again. The similarity threshold determining module 33 may be configured to perform the implementations of acquiring the number of target images, determining the number of target sub-images to be processed based on the plurality of first similarity values and the preset similarity threshold, and adjusting the preset similarity threshold based on the number of target images and the number of target sub-images to be processed in the above steps; reference may be made to the implementations provided in the above steps, which are not described herein again. The second sub-image to be processed obtaining module 34 may be configured to perform the implementations of obtaining the second sub-images to be processed included in the second image to be processed, acquiring the second template image, and the like in the above steps; reference may be made to the implementations provided in the above steps, which are not described herein again.
The above-mentioned target matching image obtaining module 35 may be configured to perform the above-mentioned implementation manners of calculating the similarity value between the second template image and the second sub-image to be processed in each step, determining the target matching image based on the target similarity threshold, etc., and specifically refer to the implementation manners provided in each step, which are not described herein again.
In this embodiment of the present application, the data processing apparatus may obtain the plurality of first sub-images to be processed included in the first image to be processed by obtaining the first image to be processed and obtaining the first sliding window information, and sliding the first window indicated by the first sliding window information from the start position of the first image to be processed based on the first window indicated by the first sliding window information. By acquiring the first template image, a first similarity value between each first sub-image to be processed in the plurality of first sub-images to be processed and the first template image can be calculated. Acquiring a preset similarity threshold and the number of target images corresponding to the target images included in the first image to be processed, determining the number of sub-images to be processed corresponding to a first similarity value larger than the preset similarity threshold from a plurality of first similarity values, and adjusting the preset similarity threshold based on the magnitude relation between the number of sub-images to be processed and the number of target images until the number of sub-images to be processed determined based on the adjusted preset similarity threshold is equal to the number of target images, and determining the adjusted preset similarity threshold as the target similarity threshold. And acquiring second sliding window information, a second image to be processed and a second template image, and acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information. 
And acquiring a second similarity value of each second to-be-processed sub-image in the plurality of second to-be-processed sub-images and the second template image, and determining a second to-be-processed sub-image corresponding to the second similarity value larger than the target similarity threshold value from the plurality of second similarity values corresponding to the plurality of second to-be-processed sub-images as a target matching image. Further, by identifying a target matching image included in each of a plurality of interface images of the application, a test operation may be performed in each of the target matching images. The performance data acquisition interface is used for acquiring performance index parameters and log information corresponding to the application program, and can be used for extracting alarm information included in the log information and generating a performance test report based on the performance index parameters and the alarm information. By adopting the embodiment of the application, the accuracy and efficiency of identifying the target matching image can be improved, the time for hitting the target matching image can be shortened, the flexibility is high, and the application range is wide.
Referring to fig. 9, fig. 9 is a schematic structural diagram of a terminal device provided in an embodiment of the present application. As shown in fig. 9, the terminal device in the present embodiment may include: one or more processors 401 and a memory 402. The processor 401 and the memory 402 are connected via a bus 403. The memory 402 is used for storing a computer program comprising program instructions, and the processor 401 is used for executing the program instructions stored in the memory 402 for performing the following operations:
acquiring a first image to be processed and acquiring first sliding window information, wherein sliding is performed from a starting point position of the first image to be processed based on a first window indicated by the first sliding window information to obtain a plurality of first sub-images to be processed, which are included in the first image to be processed, and the first sliding window information comprises a first window size which is the same as the size of a first template image;
acquiring the first template image, and calculating a first similarity value of each first sub-image to be processed and the first template image;
acquiring a preset similarity threshold and the number of target images included in the first image to be processed, determining the number of target sub-images to be processed corresponding to a first similarity value larger than the preset similarity threshold, adjusting the preset similarity threshold until the number of target sub-images to be processed determined based on the adjusted preset similarity threshold is equal to the number of target images, and determining the preset similarity threshold after stopping adjustment as a target similarity threshold, wherein the first template image is an image which comprises the same object as the target image;
Acquiring second sliding window information, a second image to be processed and a second template image, and acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information, wherein the first image to be processed is the same as the second image to be processed in image type, the second sliding window information comprises a second window size, and the second window size is the same as the second template image in size;
and acquiring a second similarity value of each second sub-image to be processed and the second template image, and determining the second sub-image to be processed corresponding to the second similarity value larger than the target similarity threshold value as a target matching image.
In some possible embodiments, the processor 401 is configured to:
acquiring a preset similarity threshold range, wherein the preset similarity threshold range comprises a left similarity threshold boundary and a right similarity threshold boundary;
and determining an arithmetic average value of the left boundary of the similarity threshold and the right boundary of the similarity threshold as the preset similarity threshold.
In some possible embodiments, the processor 401 is configured to:
if the number of the target sub-images to be processed is greater than the number of the target images, updating the left boundary of the similarity threshold to be the preset similarity threshold, and determining the arithmetic average value of the updated left boundary of the similarity threshold and the right boundary of the similarity threshold to be the adjusted preset similarity threshold;
If the number of the target sub-images to be processed is smaller than the number of the target images, updating the right boundary of the similarity threshold to be the preset similarity threshold, and determining the arithmetic average value of the right boundary of the updated similarity threshold and the left boundary of the similarity threshold to be the adjusted preset similarity threshold.
In some possible embodiments, the processor 401 is configured to:
acquiring a first feature vector corresponding to a first feature point included in a third template image, and acquiring a second feature vector corresponding to a second feature point included in the first image to be processed, wherein the feature point comprises a contour point in the image, and the feature vector is a vector describing pixel points around the feature point;
and calculating the Hamming distance between the first feature vector and the second feature vector, determining a third similarity value between the third template image and the first image to be processed according to the Hamming distance, and acquiring first sliding window information if the third similarity value is not greater than a first preset similarity threshold value so as to acquire a plurality of first sub-images to be processed included in the first image to be processed according to the first sliding window information.
In some possible embodiments, the second similarity value includes a normalized correlation matching similarity value, a normalized correlation coefficient matching similarity value; the processor 401 is configured to:
determining a second sub-image to be processed corresponding to the normalized correlation matching similarity value larger than the target similarity threshold from a plurality of normalized correlation matching similarity values to generate a first candidate image set;
determining a second sub-image to be processed corresponding to the normalized correlation coefficient matching similarity value larger than the target similarity threshold from a plurality of normalized correlation coefficient matching similarity values to generate a second candidate image set;
acquiring first center point positions corresponding to first candidate images included in the first candidate image set, acquiring second center point positions corresponding to second candidate images included in the second candidate image set, and calculating center point position difference values between the first center point positions and the second center point positions;
and if the center point position difference value between the first center point position corresponding to any first candidate image and the second center point position corresponding to any second candidate image is not greater than a preset center point position difference value, determining whichever of that first candidate image and that second candidate image has the larger similarity value as the target matching image.
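The candidate-merging step above can be sketched as follows. The Euclidean distance between center points, and the (x, y, similarity) tuple layout, are assumptions for illustration; the embodiment does not specify how the center point difference value is measured.

```python
def merge_candidates(first_set, second_set, max_center_diff):
    """first_set / second_set: lists of (center_x, center_y, similarity).
    When a candidate from each set has center points no farther apart than
    max_center_diff, keep whichever candidate has the larger similarity."""
    matches = []
    for cx1, cy1, s1 in first_set:
        for cx2, cy2, s2 in second_set:
            # Euclidean distance between the two center point positions.
            diff = ((cx1 - cx2) ** 2 + (cy1 - cy2) ** 2) ** 0.5
            if diff <= max_center_diff:
                # Tuple comparison picks the candidate with the larger similarity.
                matches.append(max((s1, (cx1, cy1)), (s2, (cx2, cy2))))
    return matches

# One candidate from the normalized correlation set, two from the
# normalized correlation coefficient set (hypothetical values).
first = [(10.0, 12.0, 0.91)]
second = [(11.0, 12.0, 0.95), (80.0, 5.0, 0.90)]
print(merge_candidates(first, second, max_center_diff=3.0))
# [(0.95, (11.0, 12.0))]
```

Only the pair whose centers nearly coincide is merged; the far-away candidate is discarded, which is how the two matching methods cross-validate each other.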
In some possible embodiments, the second image to be processed is an RGB image; the processor 401 is configured to:
acquiring an R channel image similarity value, a G channel image similarity value and a B channel image similarity value of each second sub-image to be processed in the plurality of second sub-images to be processed and the second template image;
acquiring a preset first weight value corresponding to the R channel image, a second weight value corresponding to the G channel image and a third weight value corresponding to the B channel image, wherein the sum of the first weight value, the second weight value and the third weight value is equal to 1;
carrying out weighted summation on the R channel image similarity value, the G channel image similarity value, the B channel image similarity value, the first weight value, the second weight value and the third weight value corresponding to each second sub-image to be processed to obtain a fourth similarity value corresponding to each second sub-image to be processed;
and determining the second sub-image to be processed corresponding to the fourth similarity value larger than the target similarity threshold as a target matching image.
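The per-channel weighted summation above reduces to a short computation. The concrete weight values (0.3, 0.4, 0.3) and channel similarities below are hypothetical; the embodiment only requires that the three weights sum to 1.

```python
def weighted_rgb_similarity(sim_r, sim_g, sim_b, w_r=0.3, w_g=0.4, w_b=0.3):
    """Fourth similarity value: weighted sum of the R, G and B channel
    similarity values; the three weights must sum to 1."""
    assert abs(w_r + w_g + w_b - 1.0) < 1e-9
    return w_r * sim_r + w_g * sim_g + w_b * sim_b

# Channel similarity values of one second sub-image against the template.
fourth_similarity = weighted_rgb_similarity(0.9, 0.8, 0.7)
print(round(fourth_similarity, 2))  # 0.8

# Compare against the target similarity threshold to decide the match.
target_threshold = 0.75
print(fourth_similarity > target_threshold)  # True
```

Weighting the channels separately lets a dominant channel (for example G, to which the eye is most sensitive) carry more influence than a flat grayscale comparison would.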
In some possible embodiments, the second image to be processed is a plurality of interface images included in the application program; the processor 401 is configured to:
acquiring target matching images included in each interface image in the plurality of interface images, and executing a test operation in each target matching image;
acquiring performance index parameters based on a performance data acquisition interface, wherein the performance index parameters include a central processing unit (CPU) occupancy parameter, a memory occupancy parameter and a fluency frames-per-second (FPS) parameter;
and acquiring log information corresponding to the application program, extracting alarm information included in the log information, and generating a performance test report based on the performance index parameter and the alarm information.
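The report-generation step above can be sketched as follows. The metric field names and the "WARN" alarm keyword are assumptions for illustration; the embodiment does not specify a log format or report schema.

```python
def generate_performance_report(metrics, log_lines, alarm_keyword="WARN"):
    """metrics: dict of performance index parameters, e.g.
    {"cpu_percent": ..., "memory_percent": ..., "fps": ...}.
    Extract log lines containing the alarm keyword as alarm information
    and bundle them with the metrics into a performance test report."""
    alarms = [line for line in log_lines if alarm_keyword in line]
    return {"performance": metrics, "alarms": alarms}

report = generate_performance_report(
    {"cpu_percent": 23.5, "memory_percent": 41.0, "fps": 58},
    ["INFO app started", "WARN frame drop detected", "INFO test done"],
)
print(report["alarms"])  # ['WARN frame drop detected']
```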
It should be appreciated that in some possible embodiments, the processor 401 described above may be a central processing unit (CPU), and may also be another general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or any conventional processor. The memory 402 may include read-only memory and random access memory, and provides instructions and data to the processor 401. A portion of the memory 402 may also include non-volatile random access memory. For example, the memory 402 may also store device type information.
In a specific implementation, the terminal device may execute, through its built-in functional modules, the implementation manners provided by the steps in fig. 1 and fig. 5; reference may be made to the implementation manner provided by each step, and details are not described herein again.
In this embodiment of the present application, by acquiring the first image to be processed and the first sliding window information, the terminal device may slide the first window indicated by the first sliding window information from the starting point position of the first image to be processed, so as to obtain a plurality of first sub-images to be processed included in the first image to be processed. By acquiring the first template image, a first similarity value between each first sub-image to be processed and the first template image can be calculated. A preset similarity threshold and the number of target images included in the first image to be processed are acquired, and the number of sub-images to be processed whose first similarity value is greater than the preset similarity threshold is determined from the plurality of first similarity values. The preset similarity threshold is then adjusted based on the relation between the number of sub-images to be processed and the number of target images, until the number of sub-images to be processed determined under the adjusted threshold equals the number of target images; the adjusted preset similarity threshold is determined as the target similarity threshold. Second sliding window information, a second image to be processed and a second template image are acquired, and a plurality of second sub-images to be processed included in the second image to be processed are obtained based on the second sliding window information.
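The sliding-window extraction of sub-images described above can be sketched as follows, treating an image as a 2-D list of pixel values. The step size of 1 is an assumption for illustration; the embodiment does not fix a stride.

```python
def sliding_sub_images(image, window_h, window_w, step=1):
    """image: 2-D list of pixel values. Slide a window_h x window_w window
    from the top-left starting point position and collect every sub-image."""
    rows, cols = len(image), len(image[0])
    subs = []
    for top in range(0, rows - window_h + 1, step):
        for left in range(0, cols - window_w + 1, step):
            subs.append([row[left:left + window_w]
                         for row in image[top:top + window_h]])
    return subs

# A 3x3 image swept by a 2x2 window yields four sub-images.
img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
subs = sliding_sub_images(img, 2, 2)
print(len(subs))  # 4
print(subs[0])    # [[1, 2], [4, 5]]
```

Each collected sub-image is then compared against the template image of the same size.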
A second similarity value between each of the second sub-images to be processed and the second template image is acquired, and the second sub-image to be processed whose second similarity value is greater than the target similarity threshold is determined, from the plurality of second similarity values, as a target matching image. Further, by identifying the target matching image included in each of a plurality of interface images of the application program, a test operation may be performed in each target matching image. Performance index parameters are acquired through the performance data acquisition interface, log information corresponding to the application program is acquired, alarm information included in the log information is extracted, and a performance test report is generated based on the performance index parameters and the alarm information. By adopting the embodiments of the present application, the accuracy of identifying the target matching image can be improved, with high flexibility and a wide application range.
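The threshold calibration summarized above (detailed in claims 2 and 3) amounts to a binary search over the similarity threshold range. The sketch below assumes an initial range of [0.0, 1.0] and a fixed iteration cap, both illustrative choices not mandated by the embodiment.

```python
def calibrate_threshold(similarities, target_count, left=0.0, right=1.0, iters=50):
    """Binary-search a similarity threshold until exactly target_count
    similarity values exceed it: the threshold starts at the arithmetic
    mean of the range boundaries, and the matching boundary is moved to
    the current threshold on each step."""
    for _ in range(iters):
        threshold = (left + right) / 2.0
        count = sum(1 for s in similarities if s > threshold)
        if count == target_count:
            return threshold
        if count > target_count:   # too many matches: raise the threshold
            left = threshold
        else:                      # too few matches: lower the threshold
            right = threshold
    return (left + right) / 2.0

# Five first-sub-image similarity values; two target images are known
# to appear in the first image to be processed.
sims = [0.95, 0.91, 0.62, 0.40, 0.33]
t = calibrate_threshold(sims, target_count=2)
print(sum(1 for s in sims if s > t))  # 2
```

The calibrated value t then serves as the target similarity threshold when matching the second image to be processed.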
The embodiments of the present application further provide a computer readable storage medium storing a computer program. The computer program includes program instructions which, when executed by a processor, implement the data processing method provided by the steps in fig. 1 and fig. 5; reference may be made to the implementation manner provided by each step, and details are not described herein again.
The computer readable storage medium may be an internal storage unit of the data processing apparatus or terminal device provided in any of the foregoing embodiments, for example, a hard disk or memory of an electronic device. The computer readable storage medium may also be an external storage device of the electronic device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card or a flash card provided on the electronic device. Further, the computer readable storage medium may include both an internal storage unit and an external storage device of the electronic device. The computer readable storage medium is used to store the computer program and other programs and data required by the electronic device, and may also be used to temporarily store data that has been output or is to be output.
The terms "first," "second," "third," "fourth," and the like in the claims and in the description and drawings of the present application, are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments. The term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations. Those of ordinary skill in the art will appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied in electronic hardware, in computer software, or in a combination of the two, and that the elements and steps of the examples have been generally described in terms of function in the foregoing description to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The methods and related devices provided in the embodiments of the present application are described with reference to the method flowcharts and/or structure diagrams provided in the embodiments of the present application, and each flowchart and/or block of the method flowcharts and/or structure diagrams may be implemented by computer program instructions, and combinations of flowcharts and/or blocks in the flowchart and/or block diagrams. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or structural diagram block or blocks. These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or structures.

Claims (10)

1. A method of data processing, the method comprising:
acquiring a first image to be processed and acquiring first sliding window information, wherein sliding is performed from a starting point position of the first image to be processed based on a first window indicated by the first sliding window information to obtain a plurality of first sub-images to be processed, which are included in the first image to be processed, and the first sliding window information comprises a first window size which is the same as the size of a first template image;
acquiring the first template image, and calculating a first similarity value of each first sub-image to be processed and the first template image;
acquiring a preset similarity threshold and the number of target images included in the first image to be processed, determining the number of target sub-images to be processed corresponding to a first similarity value larger than the preset similarity threshold, adjusting the preset similarity threshold until the number of target sub-images to be processed determined based on the adjusted preset similarity threshold is equal to the number of target images, and determining the preset similarity threshold after stopping adjustment as a target similarity threshold, wherein the first template image is an image which comprises the same object as the target image;
acquiring second sliding window information, a second image to be processed and a second template image, and acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information, wherein the first image to be processed is the same as the second image to be processed in image type, the second sliding window information comprises a second window size, and the second window size is the same as the second template image in size; wherein the image type of any image to be processed is the type of an object included in the any image to be processed;
and acquiring a second similarity value of each second sub-image to be processed and the second template image, and determining the second sub-image to be processed corresponding to the second similarity value larger than the target similarity threshold value as a target matching image.
2. The method of claim 1, wherein the obtaining a preset similarity threshold comprises:
acquiring a preset similarity threshold range, wherein the preset similarity threshold range comprises a left similarity threshold boundary and a right similarity threshold boundary;
and determining an arithmetic average value of the left boundary of the similarity threshold and the right boundary of the similarity threshold as the preset similarity threshold.
3. The method of claim 2, wherein said adjusting the preset similarity threshold comprises:
if the number of the target sub-images to be processed is greater than the number of the target images, updating the left boundary of the similarity threshold to be the preset similarity threshold, and determining the arithmetic average value of the updated left boundary of the similarity threshold and the right boundary of the similarity threshold to be the adjusted preset similarity threshold;
if the number of the target sub-images to be processed is smaller than the number of the target images, updating the right boundary of the similarity threshold to be the preset similarity threshold, and determining the arithmetic average value of the right boundary of the updated similarity threshold and the left boundary of the similarity threshold to be the adjusted preset similarity threshold.
4. The method of claim 1, wherein prior to the obtaining the first sliding window information, the method further comprises:
acquiring a first feature vector corresponding to a first feature point included in a third template image, and acquiring a second feature vector corresponding to a second feature point included in the first image to be processed, wherein the feature point comprises a contour point in the image, and the feature vector is a vector describing pixel points around the feature point;
and calculating the Hamming distance between the first feature vector and the second feature vector, determining a third similarity value between the third template image and the first image to be processed according to the Hamming distance, and acquiring first sliding window information if the third similarity value is not greater than a first preset similarity threshold value so as to acquire a plurality of first sub-images to be processed included in the first image to be processed according to the first sliding window information.
5. The method of claim 1, wherein the second similarity value comprises a normalized correlation match similarity value, a normalized correlation coefficient match similarity value; the method further comprises the steps of:
determining a second sub-image to be processed corresponding to the normalized correlation matching similarity value larger than the target similarity threshold from a plurality of normalized correlation matching similarity values to generate a first candidate image set;
determining a second sub-image to be processed corresponding to the normalized correlation coefficient matching similarity value larger than the target similarity threshold from a plurality of normalized correlation coefficient matching similarity values to generate a second candidate image set;
acquiring first center point positions corresponding to first candidate images included in the first candidate image set, acquiring second center point positions corresponding to second candidate images included in the second candidate image set, and calculating center point position difference values between the first center point positions and the second center point positions;
and if the center point position difference value between the first center point position corresponding to any first candidate image and the second center point position corresponding to any second candidate image is not greater than a preset center point position difference value, determining whichever of that first candidate image and that second candidate image has the larger similarity value as the target matching image.
6. The method of claim 1, wherein the second image to be processed is an RGB image; the method further comprises the steps of:
acquiring an R channel image similarity value, a G channel image similarity value and a B channel image similarity value of each second sub-image to be processed in the plurality of second sub-images to be processed and the second template image;
acquiring a preset first weight value corresponding to the R channel image, a second weight value corresponding to the G channel image and a third weight value corresponding to the B channel image, wherein the sum of the first weight value, the second weight value and the third weight value is equal to 1;
carrying out weighted summation on the R channel image similarity value, the G channel image similarity value, the B channel image similarity value, the first weight value, the second weight value and the third weight value corresponding to each second sub-image to be processed to obtain a fourth similarity value corresponding to each second sub-image to be processed;
and determining the second sub-image to be processed corresponding to the fourth similarity value larger than the target similarity threshold as a target matching image.
7. The method according to any one of claims 1 to 6, wherein the second image to be processed is a plurality of interface images included in an application program; the method further comprises the steps of:
acquiring target matching images included in each interface image in the plurality of interface images, and executing a test operation in each target matching image;
acquiring performance index parameters based on a performance data acquisition interface, wherein the performance index parameters include a central processing unit (CPU) occupancy parameter, a memory occupancy parameter and a fluency frames-per-second (FPS) parameter;
and acquiring log information corresponding to the application program, extracting alarm information included in the log information, and generating a performance test report based on the performance index parameter and the alarm information.
8. A data processing apparatus, the apparatus comprising:
the first sub-image to be processed acquisition module is used for acquiring a first image to be processed and acquiring first sliding window information, sliding is performed from a starting point position of the first image to be processed based on a first window indicated by the first sliding window information so as to obtain a plurality of first sub-images to be processed, which are included in the first image to be processed, wherein the first sliding window information comprises a first window size, and the first window size is the same as the size of the first template image;
The similarity value acquisition module is used for acquiring the first template image and calculating a first similarity value of each first sub-image to be processed and the first template image;
the similarity threshold determining module is configured to obtain a preset similarity threshold and a number of target images included in the first to-be-processed image, determine a number of target to-be-processed sub-images corresponding to a first similarity value greater than the preset similarity threshold, adjust the preset similarity threshold until the number of target to-be-processed sub-images determined based on the adjusted preset similarity threshold is equal to the number of target images, and determine the preset similarity threshold after stopping adjustment as a target similarity threshold, where the first template image is an image including the same object as the target image;
the second sub-image to be processed acquisition module is used for acquiring second sliding window information, a second image to be processed and a second template image, acquiring a plurality of second sub-images to be processed, which are included in the second image to be processed, based on the second sliding window information, wherein the first image to be processed is the same as the second image to be processed in image type, the second sliding window information comprises a second window size, and the second window size is the same as the second template image in size; wherein the image type of any image to be processed is the type of an object included in the any image to be processed;
The target matching image acquisition module is used for acquiring a second similarity value of each second sub-image to be processed and the second template image, and determining the second sub-image to be processed corresponding to the second similarity value which is larger than the target similarity threshold value as a target matching image.
9. A terminal device comprising a processor and a memory, said processor and memory being interconnected;
the memory is for storing a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1-7.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of any of claims 1-7.
CN202010537349.4A 2020-06-12 2020-06-12 Data processing method, device, terminal equipment and storage medium Active CN111738321B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010537349.4A CN111738321B (en) 2020-06-12 2020-06-12 Data processing method, device, terminal equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010537349.4A CN111738321B (en) 2020-06-12 2020-06-12 Data processing method, device, terminal equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111738321A CN111738321A (en) 2020-10-02
CN111738321B true CN111738321B (en) 2023-08-08

Family

ID=72649083

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010537349.4A Active CN111738321B (en) 2020-06-12 2020-06-12 Data processing method, device, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111738321B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112148908A (en) * 2020-10-23 2020-12-29 北京百度网讯科技有限公司 Image database updating method and device, electronic equipment and medium
CN112257714B (en) * 2020-11-13 2023-10-10 南京工业大学 Template matching method for non-rigid change image
CN112528761B (en) * 2020-11-24 2023-04-07 上海墨说科教设备有限公司 Method and system for extracting specific target in image, electronic device and storage medium
CN112668629A (en) * 2020-12-24 2021-04-16 深圳壹账通智能科技有限公司 Intelligent warehousing method, system, equipment and storage medium based on picture identification
CN113436068B (en) * 2021-06-10 2022-12-02 浙江大华技术股份有限公司 Image splicing method and device, electronic equipment and storage medium
CN113591921A (en) * 2021-06-30 2021-11-02 北京旷视科技有限公司 Image recognition method and device, electronic equipment and storage medium
CN113762097A (en) * 2021-08-18 2021-12-07 合肥联宝信息技术有限公司 Automatic document auditing method and device and computer readable storage medium
CN113923514B (en) * 2021-09-23 2024-03-01 青岛信芯微电子科技股份有限公司 Display device and MEMC repeated frame discarding method
CN114397901A (en) * 2021-11-30 2022-04-26 国网北京市电力公司 Unmanned aerial vehicle and unmanned aerial vehicle inspection method
CN114139007B (en) * 2022-01-26 2022-06-21 荣耀终端有限公司 Image searching method, electronic device, and medium thereof
CN116403170A (en) * 2023-06-02 2023-07-07 江西省水投江河信息技术有限公司 Multi-target tracking method and system for sand carrier and sand production ship

Citations (5)

Publication number Priority date Publication date Assignee Title
WO2018120420A1 (en) * 2016-12-26 2018-07-05 华为技术有限公司 Prediction method and device based on template matching
CN108830279A (en) * 2018-04-03 2018-11-16 南昌奇眸科技有限公司 A kind of image characteristics extraction and matching process
CN109241985A (en) * 2017-07-11 2019-01-18 普天信息技术有限公司 A kind of image-recognizing method and device
CN110689535A (en) * 2019-09-29 2020-01-14 歌尔股份有限公司 Workpiece identification method and device, electronic equipment and storage medium
CN111124902A (en) * 2019-12-12 2020-05-08 腾讯科技(深圳)有限公司 Object operating method and device, computer-readable storage medium and electronic device

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US8934675B2 (en) * 2012-06-25 2015-01-13 Aquifi, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
WO2018120420A1 (en) * 2016-12-26 2018-07-05 华为技术有限公司 Prediction method and device based on template matching
CN109241985A (en) * 2017-07-11 2019-01-18 普天信息技术有限公司 A kind of image-recognizing method and device
CN108830279A (en) * 2018-04-03 2018-11-16 南昌奇眸科技有限公司 A kind of image characteristics extraction and matching process
CN110689535A (en) * 2019-09-29 2020-01-14 歌尔股份有限公司 Workpiece identification method and device, electronic equipment and storage medium
CN111124902A (en) * 2019-12-12 2020-05-08 腾讯科技(深圳)有限公司 Object operating method and device, computer-readable storage medium and electronic device

Non-Patent Citations (1)

Title
Fast image stitching method based on region blocking and feature point matching; Wang Tengfeng; China Master's Theses Full-text Database, Information Science and Technology (No. 1); pp. I138-2102 *

Also Published As

Publication number Publication date
CN111738321A (en) 2020-10-02

Similar Documents

Publication Publication Date Title
CN111738321B (en) Data processing method, device, terminal equipment and storage medium
CN108961303B (en) Image processing method and device, electronic equipment and computer readable medium
US10534957B2 (en) Eyeball movement analysis method and device, and storage medium
US10635946B2 (en) Eyeglass positioning method, apparatus and storage medium
CN102667810B (en) Face recognition in digital images
US8718324B2 (en) Method, apparatus and computer program product for providing object tracking using template switching and feature adaptation
JP2020144907A (en) Configurable convolution engine for interleave channel data
CN108509231B (en) VR-based application program opening method, electronic device, equipment and storage medium
US9058655B2 (en) Region of interest based image registration
US10650234B2 (en) Eyeball movement capturing method and device, and storage medium
WO2017054442A1 (en) Image information recognition processing method and device, and computer storage medium
WO2018082308A1 (en) Image processing method and terminal
CN105095860B (en) character segmentation method and device
CN112733767B (en) Human body key point detection method and device, storage medium and terminal equipment
WO2014074959A1 (en) Real-time face detection using pixel pairs
CN110503682B (en) Rectangular control identification method and device, terminal and storage medium
CN112966725B (en) Method and device for matching template images and terminal equipment
CN111626163A (en) Human face living body detection method and device and computer equipment
CN111028276A (en) Image alignment method and device, storage medium and electronic equipment
CN112529939A (en) Target track matching method and device, machine readable medium and equipment
CN108960247B (en) Image significance detection method and device and electronic equipment
CN110717452B (en) Image recognition method, device, terminal and computer readable storage medium
CN103955713B (en) A kind of icon-based programming method and apparatus
CN105447846B (en) Image processing method and electronic equipment
CN111080683A (en) Image processing method, image processing device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant