CN113313125A - Image processing method and device, electronic equipment and computer readable medium - Google Patents


Info

Publication number
CN113313125A
Authority
CN
China
Legal status
Pending
Application number
CN202110662853.1A
Other languages
Chinese (zh)
Inventor
刘华丽
黄琦
沈惠玲
张沛
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110662853.1A
Publication of CN113313125A


Classifications

    • G06V10/32 Normalisation of the pattern dimensions (image preprocessing)
    • G06F18/22 Matching criteria, e.g. proximity measures (pattern recognition)
    • G06F18/24 Classification techniques (pattern recognition)
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
    • G06F11/3684 Test management for test design, e.g. generating new test cases (software testing)
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites (software testing)


Abstract

The present disclosure provides an image processing method and apparatus, relating to technical fields including image processing and deep learning. The specific implementation scheme is as follows: an image to be processed is acquired in real time; feature points are extracted from the image to be processed to obtain its feature point set to be detected, which comprises at least one feature point to be detected; the feature point set to be detected is matched with a preset target feature point set of a target image to obtain at least one matching point pair; region mapping is performed on the image to be processed based on the at least one matching point pair to obtain a mapping region of the image to be processed; and the image to be processed is detected based on the mapping region and the target image to obtain a detection result. This embodiment improves the accuracy of image detection.

Description

Image processing method and device, electronic equipment and computer readable medium
Technical Field
The present disclosure relates to the field of computer technologies, in particular to the fields of image processing and deep learning, and more particularly to an image processing method and apparatus, an electronic device, a computer-readable medium, and a computer program product.
Background
At present there is a huge base of mobile phone users, and a rich set of applications has become a necessity for them. For an application to be chosen and retained by users over the long term, it needs a polished interactive interface, sound functionality, and so on. A great deal of testing is therefore required before a product is released, and a large part of that testing verifies that the interactive interface displays correctly in each scenario.
Disclosure of Invention
An image processing method and apparatus, an electronic device, a computer-readable medium, and a computer program product are provided.
According to a first aspect, there is provided an image processing method comprising: acquiring an image to be processed in real time; extracting characteristic points of the image to be processed to obtain a characteristic point set to be detected of the image to be processed; matching the feature point set to be detected with a preset target feature point set of a target image to obtain at least one matching point pair; performing area mapping on the image to be processed based on the at least one matching point pair to obtain a mapping area of the image to be processed; and detecting the image to be processed based on the mapping area and the target image to obtain a detection result.
According to a second aspect, there is provided an image processing apparatus comprising: an acquisition unit configured to acquire an image to be processed in real time; the extraction unit is configured to extract the characteristic points of the image to be processed to obtain a characteristic point set to be detected of the image to be processed; the matching unit is configured to match the feature point set to be detected with a target feature point set of a preset target image to obtain at least one matching point pair; the mapping unit is configured to perform area mapping on the image to be processed based on the at least one matching point pair to obtain a mapping area of the image to be processed; and the obtaining unit is configured to detect the image to be processed based on the mapping area and the target image to obtain a detection result.
According to a third aspect, there is provided an electronic device comprising: at least one processor; and a memory communicatively connected to the at least one processor, wherein the memory stores instructions executable by the at least one processor, the instructions being executable by the at least one processor to enable the at least one processor to perform the method as described in any one of the implementations of the first aspect.
According to a fourth aspect, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform a method as described in any one of the implementations of the first aspect.
According to a fifth aspect, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method as described in any of the implementations of the first aspect.
According to the image processing method and apparatus provided by the embodiments of the present disclosure, firstly, an image to be processed is acquired in real time; secondly, feature points are extracted from the image to be processed to obtain its feature point set to be detected; thirdly, the feature point set to be detected is matched with a preset target feature point set of a target image to obtain at least one matching point pair; fourthly, region mapping is performed on the image to be processed based on the at least one matching point pair to obtain the mapping region of the image to be processed; and finally, the image to be processed is detected based on the mapping region and the target image to obtain a detection result. The mapping region is thus determined through feature point extraction and image detection is performed on that region, which improves detection accuracy, enables real-time automated testing of images on a terminal, and addresses the diversity problem of cross-terminal detection while improving the precision of terminal image testing.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a flow diagram of one embodiment of an image processing method according to the present disclosure;
FIG. 2 is a schematic diagram illustrating matching of a to-be-processed image and a target image on the same terminal according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram illustrating matching of images to be processed and target images on different terminals according to an embodiment of the present disclosure;
FIG. 4 is a flow chart of a method of acquiring images to be processed in real time according to the present disclosure;
FIG. 5 is a schematic block diagram of an embodiment of an image processing apparatus according to the present disclosure;
fig. 6 is a block diagram of an electronic device for implementing an image processing method of an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 shows a flow 100 of an embodiment of an image processing method according to the present disclosure, the image processing method comprising the steps of:
Step 101: acquiring an image to be processed in real time.
In this embodiment, the image to be processed is an image to be detected that is acquired in real time by the executing body on which the image processing method runs; the source providing the image to be processed may differ with the detection requirement. For example, to detect whether an interface image displayed by an application on a mobile terminal matches the expected scene, the image to be processed may be a picture obtained by capturing the mobile terminal's screen in real time.
Optionally, the image to be processed may be an image stored in a database: the executing body on which the image processing method runs acquires the image to be processed from an application on the terminal according to the detection requirement and stores it in the database, so that it can later fetch the image from the database in real time, providing data support for image detection.
Optionally, the image to be processed may also be one of the various images generated in real time by an application on the terminal while executing a real-time task (e.g., a display task of a communication interface): for example, the opening interface image displayed after a communication application is launched, or the in-call interface image displayed while the communication application is sending out a communication signal.
Step 102: extracting feature points from the image to be processed to obtain the feature point set to be detected of the image to be processed.
Image features mainly comprise three kinds: points, lines, and planes. Point features are among the most common and mainly include edge points, center points, and intersection points of different objects in the image.
In this embodiment, the feature point set to be detected comprises at least one feature point to be detected, and each feature point to be detected has attributes such as position and pixel value. When feature points are extracted from the image to be processed, feature points can be extracted from the target image at the same time to obtain the target feature point set of the target image, which may comprise at least one target feature point. In this embodiment, the target image is a preset image that matches the expected scene; by matching the image to be processed against the target image, it is determined whether their similarity is high, thereby achieving the purpose of detecting the image to be processed.
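The text does not name a particular feature detector, so as one illustrative possibility, a minimal Harris-corner detector can produce a feature point set of the kind described above. All parameter values here are assumptions, not values from the patent:

```python
import numpy as np

def harris_corners(img, k=0.05, window=3, threshold=0.01):
    """Minimal Harris corner detector returning (row, col) feature points.

    `img` is a 2-D grayscale array. The detector choice and its
    parameters are illustrative; the patent does not specify them."""
    # Image gradients via central differences (axis 0 = rows = y).
    Iy, Ix = np.gradient(img.astype(float))
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    h, w = img.shape
    off = window // 2
    resp = np.zeros((h, w))
    for r in range(off, h - off):
        for c in range(off, w - off):
            # Sum gradient products over the local window.
            sxx = Ixx[r - off:r + off + 1, c - off:c + off + 1].sum()
            syy = Iyy[r - off:r + off + 1, c - off:c + off + 1].sum()
            sxy = Ixy[r - off:r + off + 1, c - off:c + off + 1].sum()
            det = sxx * syy - sxy * sxy
            trace = sxx + syy
            resp[r, c] = det - k * trace * trace  # Harris response

    if resp.max() <= 0:
        return []
    # Keep points whose response exceeds a fraction of the maximum.
    keep = resp > threshold * resp.max()
    return list(zip(*np.nonzero(keep)))
```

Running this on the image to be processed and on the target image yields the two feature point sets that the following steps match against each other.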
Step 103: matching the feature point set to be detected with a preset target feature point set of the target image to obtain at least one matching point pair.
In this embodiment, matching the feature point set to be detected with the target feature point set means finding, for each target feature point in the target feature point set, the most similar feature point in the feature point set to be detected. Various matching methods may be used. For example, with brute-force matching, the distance between a given target feature point and every feature point to be detected is computed, the resulting distances are sorted, and the feature point with the smallest distance is taken as the matching point of the pair.
It should be noted that although brute-force matching is straightforward, in practice the matching point pairs it produces contain a large number of wrong pairs, so some mechanism is needed to filter the wrong pairs out.
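The brute-force matching described above can be sketched as follows. The descriptor layout (one row per feature point) and the use of Euclidean distance are assumptions for illustration:

```python
import numpy as np

def brute_force_match(target_desc, candidate_desc):
    """Brute-force matching: for each target feature descriptor, compute
    the distance to every candidate descriptor and keep the closest one.

    Descriptors are rows of a 2-D array; Euclidean distance is assumed.
    Returns (target_index, candidate_index, distance) triples."""
    matches = []
    for i, d in enumerate(target_desc):
        dists = np.linalg.norm(candidate_desc - d, axis=1)  # distance to all candidates
        j = int(np.argmin(dists))                           # closest candidate wins
        matches.append((i, j, float(dists[j])))
    return matches
```

As the note above observes, the output still contains wrong pairs; the ratio test and homography mask described later are the screening mechanisms.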
Step 104: performing region mapping on the image to be processed based on the at least one matching point pair to obtain the mapping region of the image to be processed.
In this embodiment, the mapping region is the region of the image to be processed associated with the at least one matching point pair, i.e., the region with the highest degree of matching with the target image; the feature points to be detected inside the mapping region have the highest similarity to the target feature point set of the target image.
Step 105: detecting the image to be processed based on the mapping region and the target image to obtain a detection result.
In this embodiment, the target image and the image to be processed may be two images displayed or stored on the same carrier (e.g., a terminal and a server), and the target image and the image to be processed may also be two images respectively displayed or stored on different carriers.
As shown in fig. 2, for a target image M and an image to be processed that are displayed or stored on the same carrier, the display mechanism is identical. When detecting the image to be processed, the portion of it inside the mapping region C is compared directly with the target image M for similarity. If the similarity is greater than a similarity threshold (for example, 90%), the image to be processed is determined to be qualified: the content in the mapping region meets the image display requirement, and a detection result is output. The detection result may include a confidence and a position identification of the target range, where the confidence is determined by the similarity comparison, and the position identification of the target range includes the upper-left corner coordinates of the mapping region, its length, its width, and so on. When the similarity does not exceed the similarity threshold, the image to be processed is determined to be unqualified and an empty detection result is output. An empty detection result indicates that the expected result was not met, and the cause of the problem needs to be investigated.
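The decision step just described (similarity threshold, confidence, position identification of the target range) might look like the following sketch. The similarity measure used here, one minus the mean absolute pixel difference on [0, 1] images, is an illustrative stand-in, not a metric prescribed by the patent, and the region tuple layout is an assumption:

```python
import numpy as np

def detect(cropped, target, region, threshold=0.9):
    """Decide qualified / unqualified for the image under the mapping
    region. `cropped` is the image content inside the mapping region,
    already the same size as `target`; `region` is assumed to be
    (top_left_x, top_left_y, width, height)."""
    similarity = 1.0 - float(np.abs(cropped - target).mean())
    if similarity > threshold:
        # Qualified: return the confidence plus the position
        # identification of the target range.
        return {"confidence": similarity, "region": region}
    return None  # unqualified: empty detection result
```

An empty (`None`) result corresponds to the "detection result of null" case, signalling that the displayed interface did not match the expected target image.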
As shown in fig. 3, for a target image M′ and an image to be processed that are displayed or stored on different carriers, the target image M′ must first be transformed according to the display mechanisms of the respective carriers. When detecting the image to be processed, the image C′ in the mapping region is compared with the transformed target image for similarity; if the similarity is greater than a similarity threshold (e.g., 90%), the image to be processed is determined to be qualified, the content in the mapping region meets the image display requirement, and a detection result is output.
This embodiment provides an image processing method whose feature detection and matching algorithm operates at a fine granularity: on the one hand it enables fine-grained image processing, and on the other hand it is robust to resolution changes such as scaling. The target image can therefore be drawn from preset public image resources, which reduces the cost of manually inspecting the image to be processed and improves detection efficiency.
The image processing method provided by the embodiments of the present disclosure proceeds as follows: firstly, an image to be processed is acquired in real time; secondly, feature points are extracted from the image to be processed to obtain its feature point set to be detected; thirdly, the feature point set to be detected is matched with a preset target feature point set of a target image to obtain at least one matching point pair; fourthly, region mapping is performed on the image to be processed based on the at least one matching point pair to obtain the mapping region of the image to be processed; and finally, the image to be processed is detected based on the mapping region and the target image to obtain a detection result. The mapping region is thus determined through feature point extraction and image processing is performed on that region, which improves processing accuracy, enables real-time automated testing of images on a terminal, and addresses the diversity problem of cross-terminal detection while improving the precision of terminal image testing.
Fig. 4 shows a flowchart 400 of a method for acquiring an image to be processed in real time according to the present disclosure, which includes the following steps:
step 401, obtaining a test task from the test task list.
In this embodiment, a test task is a task for testing an application on a terminal, for example testing whether the display program executed by the application is correct. The executing body can obtain the image to be processed corresponding to the test task from the terminal's application, and can determine from that image whether the terminal's application executed the test task successfully.
Before a test task is executed, a test framework can be installed on the terminal. For testing interface designs, a User Interface (UI) test framework (such as UI Automator) can be adopted; it is suitable for cross-system, cross-application functional UI testing of installed applications. Through the UI test framework, an operator can check in real time whether the interactive interface meets the expected result, and the executing body can likewise acquire the image to be processed in real time through the UI test framework and store it in a database, providing data support for subsequent image processing.
Step 402: distributing the test task to the terminal so that the test task runs on the terminal.
In this embodiment, the test task is a task suited to the terminal; after the terminal executes the test task, the image to be detected corresponding to that task is generated.
In this embodiment, there is at least one terminal; the model and operating system of each terminal are not limited, each terminal has at least one application installed, and after receiving the interface display task, the application on each terminal can present a user interface image with the same effect.
Step 403: receiving, in real time, the image to be processed corresponding to the test task sent by the terminal.
In this optional implementation, the executing body on which the image processing method runs first triggers the test tasks, generating a task list comprising at least one test task and distributing it to a plurality of terminals. The terminals then run their respective test tasks, each test task generates a corresponding image to be processed, and each terminal uploads its image to the executing body. The executing body detects each image to be processed, gives a detection result, and judges whether the test succeeded; in case of failure, the result is fed back for analysis of functional interactions and other possible defects.
Optionally, the executing body on which the image processing method runs obtains the image to be detected in real time and displays it through the UI test framework; an operator checks whether the image to be processed matches the target image, and when it is unqualified, investigates whether the problem lies in a bug in the application on the terminal executing the test task or in the test task itself.
According to this optional implementation for acquiring the image to be processed, the test tasks in the test task list are distributed to the terminals so that each terminal generates its own image to be processed, and the images sent by the terminals are received in real time. This achieves the goal of automatically testing terminal images and ensures a real-time test of each terminal's image to be processed.
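The distribute-and-collect flow of steps 401 to 403 can be sketched as below. The terminal callables and the data shapes are hypothetical stand-ins for real device communication, which the patent does not specify:

```python
def run_test_tasks(task_list, terminals, detect_fn):
    """Distribute each test task to every terminal, collect the image to
    be processed that each terminal produces, and run detection on it.

    `terminals` maps a terminal name to a callable that runs a task and
    returns an image; `detect_fn` is the detection step of the method.
    Both interfaces are illustrative assumptions."""
    results = {}
    for task in task_list:
        for name, run_on_terminal in terminals.items():
            image = run_on_terminal(task)             # terminal runs the task and uploads its image
            results[(task, name)] = detect_fn(image)  # executing body detects the image
    return results
```

On failure, an implementation would feed the per-terminal results back for defect analysis, as described above.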
In some optional implementations of this embodiment, detecting the image to be processed based on the mapping region and the target image to obtain a detection result includes: cropping the image to be processed based on the mapping region to obtain a cropped image; in response to detecting that the size of the cropped image is the same as that of the target image, calculating the similarity between the cropped image and the target image; and in response to determining that the similarity between the cropped image and the target image is greater than a set threshold, determining that the image to be processed is qualified.
In this optional implementation, after the mapping region is obtained, the image to be processed is cropped to the mapping region, so that the position within it that corresponds to the target image is accurately determined. Further, when the cropped image and the target image have the same size, whether the image to be processed is qualified is decided by their similarity, ensuring reliable detection of the image to be processed.
In some optional implementations of this embodiment, detecting the image to be processed based on the mapping region and the target image to obtain a detection result further includes: in response to detecting that the size of the cropped image differs from that of the target image, scaling the cropped image so that the scaled cropped image has the same size as the target image.
In this optional implementation, the cropped image may be scaled using an image scaling algorithm or an image scaling model to obtain a cropped image with the same size as the target image.
In this optional implementation, when the sizes of the cropped image and the target image are inconsistent, the cropped image is scaled, ensuring that the similarity comparison between the cropped image and the target image is reliable.
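The crop, size-check, scale, and compare flow of these optional implementations can be sketched as follows. Nearest-neighbour scaling, the similarity measure, and the (top, left, height, width) region layout are illustrative choices, not mandated by the text:

```python
import numpy as np

def nn_resize(img, out_h, out_w):
    """Nearest-neighbour scaling, a minimal stand-in for the 'image
    scaling algorithm' mentioned above."""
    h, w = img.shape[:2]
    rows = np.arange(out_h) * h // out_h   # source row for each output row
    cols = np.arange(out_w) * w // out_w   # source column for each output column
    return img[rows][:, cols]

def crop_and_compare(image, target, region, threshold=0.9):
    """Crop the image to the mapping region, rescale the crop if its
    size differs from the target, then compare similarity.

    `region` is assumed to be (top, left, height, width); the
    similarity measure (one minus mean absolute difference) is an
    illustrative assumption."""
    top, left, h, w = region
    cropped = image[top:top + h, left:left + w]
    if cropped.shape != target.shape:
        cropped = nn_resize(cropped, *target.shape[:2])  # bring crop to target size
    similarity = 1.0 - float(np.abs(cropped - target).mean())
    return similarity > threshold
```

A production implementation would likely use a higher-quality interpolation and a more discriminative similarity metric; the control flow is what the optional implementations above describe.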
In some optional implementations of this embodiment, matching the feature point set to be detected with a preset target feature point set of the target image to obtain at least one matching point pair includes: matching the feature point set to be detected with the target feature point set of the target image by a feature matching algorithm to obtain at least one initial point pair; and screening the at least one initial point pair to obtain at least one matching point pair.
In this optional implementation, the feature matching algorithm may be a kNN (k-nearest neighbour) classification algorithm. The core idea of kNN is: if, among its k nearest neighbours in the feature space, most samples belong to a certain class, then the sample also belongs to that class and shares the characteristics of that class. In making a classification decision, this method determines the class of the sample to be classified based only on the class of the nearest sample or samples.
In this optional implementation, each matching point pair comes from the at least one initial point pair: the matching point pairs are the more accurately paired point pairs obtained by screening the at least one initial point pair, and each initial point pair and each matching point pair contains a target feature point and a feature point to be detected that correspond to each other.
In this optional implementation, after the feature point set to be detected has been matched with the target feature point set by the feature matching algorithm, the accuracy of the resulting initial point pairs is not high; screening the initial point pairs to obtain more accurate matching point pairs ensures the accuracy of the matching result between the feature point set of the image to be processed and that of the target image.
In some optional implementations of this embodiment, screening the at least one initial point pair to obtain at least one matching point pair includes: for each of the at least one initial point pair, determining the target feature point closest to the feature point to be detected in that pair and the next-closest target feature point; calculating the ratio of the Euclidean distances to the closest and next-closest target feature points; and in response to the ratio being greater than a set threshold, taking the initial point pair as a matching point pair.
In this alternative, the nearest target feature point and the next-nearest target feature point may each be a single feature point or several feature points.
In this optional implementation, the accuracy of the obtained matching point pairs is ensured by the ratio of the Euclidean distances between the feature point to be detected in the initial point pair and, respectively, the closest and next-closest target feature points.
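The ratio-test screening above can be sketched as below. Note that the claim keeps a pair when the ratio exceeds a threshold, which corresponds to defining the ratio as the second-nearest distance over the nearest distance; Lowe's classic formulation inverts the ratio and keeps a pair when it falls below a threshold. The threshold value used here is an assumption:

```python
import numpy as np

def ratio_test(query_desc, ref_desc, threshold=1.5):
    """Ratio-test screening of initial point pairs. For each query
    descriptor, find the nearest and second-nearest reference
    descriptors; keep the pair only when second_distance / nearest_distance
    exceeds the threshold, i.e. when the best match is unambiguous.

    Returns the surviving (query_index, nearest_ref_index) pairs."""
    kept = []
    for i, d in enumerate(query_desc):
        dists = np.linalg.norm(ref_desc - d, axis=1)
        order = np.argsort(dists)
        nearest, second = dists[order[0]], dists[order[1]]
        if nearest == 0 or second / nearest > threshold:
            kept.append((i, int(order[0])))  # unambiguous: keep as matching point pair
        # otherwise the two best candidates are too close together: discard
    return kept
```

The same screening applies symmetrically in the other direction (nearest and next-nearest feature points to be detected for a given target feature point), as the optional variant below describes.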
Optionally, screening the at least one initial point pair to obtain at least one matching point pair may further include: for each of the at least one initial point pair, determining the feature point to be detected closest to the target feature point in that pair and the next-closest feature point to be detected; calculating the ratio of the Euclidean distances to the closest and next-closest feature points to be detected; when the ratio is greater than a set threshold, taking the initial point pair as a matching point pair; and when the ratio is less than or equal to the set threshold, discarding the initial point pair.
In this alternative, the nearest feature point to be detected and the next-nearest feature point to be detected may each be a single feature point or several feature points.
In some optional implementations of this embodiment, screening the at least one initial point pair to obtain at least one matching point pair further includes: obtaining a mask matrix by applying a homography transformation based on the target feature points in the at least one matching point pair; and removing the wrong matching point pairs from the at least one matching point pair based on the mask matrix.
In this optional implementation, a homography transformation describes the positional mapping of an object between the world coordinate system and the pixel coordinate system. Each matching point pair contains a target feature point and a feature point to be detected that correspond to each other; using a homography transformation, the target feature points of all matching point pairs are mapped onto their corresponding feature points to be detected on the image to be processed, and a mask matrix reflecting the mapping state can be computed.
The mask matrix masks the image to control the image region or the processing applied to it. Through the mask matrix, each pixel value in the image can be recalculated; the mask matrix controls the degree to which the pixel at the current position of the old image, and the pixels around it, influence the pixel value at the current position of the new image.
In this embodiment, a wrong matching point pair is one for which, through the homography transformation, no corresponding feature point to be detected can be found on the image to be processed for the current target feature point, or for which the current feature point to be detected on the image to be processed has no corresponding target feature point on the target image. Such pairs need to be removed, so as to optimize the feature point set to be detected.
In this embodiment, the mask matrix obtained through the homography transformation can effectively distinguish correct from incorrect mappings of the feature points to be detected onto the plane of the image to be processed; screening with the mask matrix thus filters the erroneous feature points out of the feature point set to be detected.
In this optional implementation, the mask matrix obtained through the homography transformation records whether each target feature point maps to a correct or a wrong point on the image to be processed, which ensures the reliability of the matching point pair screening.
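As a rough illustration of the mask-based filtering, the sketch below fits a homography to the matching point pairs by the direct linear transform and marks as wrong any pair whose target feature point, once mapped onto the plane of the image to be processed, lands far from its claimed feature point to be detected. The function names, the reprojection tolerance, and the use of a plain least-squares fit (rather than a robust estimator such as RANSAC) are all assumptions made for illustration.

```python
import numpy as np

def estimate_homography(src, dst):
    """Fit a 3x3 homography H mapping src -> dst (direct linear transform).

    src, dst: (N, 2) arrays of matched target / to-be-detected points, N >= 4.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of the stacked constraint matrix.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def homography_mask(src, dst, H, tol=3.0):
    """Mask matrix: 1 where mapping src through H lands within tol pixels of
    dst (a correct mapping), 0 for a wrong matching point pair."""
    pts = np.hstack([src, np.ones((len(src), 1))])
    proj = pts @ H.T
    proj = proj[:, :2] / proj[:, 2:3]          # back to inhomogeneous coords
    err = np.linalg.norm(proj - np.asarray(dst, dtype=float), axis=1)
    return (err < tol).astype(np.uint8)
```

Wrong pairs are then removed by keeping only the pairs whose mask entry is 1.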
In some optional implementations of this embodiment, performing region mapping on the image to be processed based on the at least one matching point pair to obtain a mapping region of the image to be processed includes: determining, based on the at least one matching point pair, all the feature points to be detected corresponding to all the matching points on the image to be processed; connecting the feature points to be detected that are closest to the border of the image to be processed to form a mapping curve enclosing all the feature points to be detected; and taking the area enclosed by the mapping curve as the mapping region of the image to be processed.
In this optional implementation, all the feature points to be detected on the image to be processed are determined, and those closest to the border of the image to be processed are connected to form the mapping region, which ensures the reliability of the mapping region.
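Connecting the feature points to be detected that lie closest to the image border into a closed curve enclosing all the points amounts to taking their convex hull. A minimal sketch (Andrew's monotone chain; the function name is an assumption):

```python
def mapping_region(points):
    """Connect the outermost feature points to be detected into a closed
    mapping curve enclosing all points: the convex hull via Andrew's
    monotone chain, returned counter-clockwise without repeated vertices."""
    pts = sorted(map(tuple, points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); <= 0 means no left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build the lower boundary
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build the upper boundary
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```

Interior feature points are enclosed by, but do not appear on, the resulting mapping curve.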
With further reference to fig. 5, as an implementation of the method shown in the above figures, the present disclosure provides an embodiment of an image processing apparatus, which corresponds to the method embodiment shown in fig. 1 and is particularly applicable to various electronic devices.
As shown in fig. 5, the present embodiment provides an image processing apparatus 500 including: an acquiring unit 501, an extracting unit 502, a matching unit 503, a mapping unit 504, and an obtaining unit 505. The acquiring unit 501 may be configured to acquire an image to be processed in real time. The extracting unit 502 may be configured to perform feature point extraction on the image to be processed to obtain a feature point set to be detected of the image to be processed, where the feature point set to be detected includes at least one feature point to be detected. The matching unit 503 may be configured to match the feature point set to be detected with a target feature point set of a preset target image to obtain at least one matching point pair. The mapping unit 504 may be configured to perform region mapping on the image to be processed based on the at least one matching point pair to obtain a mapping region of the image to be processed. The obtaining unit 505 may be configured to detect the image to be processed based on the mapping region and the target image to obtain a detection result.
In the present embodiment, in the image processing apparatus 500, for the specific processing of the acquiring unit 501, the extracting unit 502, the matching unit 503, the mapping unit 504, and the obtaining unit 505 and the technical effects thereof, reference may be made to the descriptions of step 101 through step 105 in the embodiment corresponding to fig. 1, which are not repeated here.
In some optional implementations of this embodiment, the acquiring unit 501 includes: an obtaining module (not shown in the figure), a distribution module (not shown in the figure), and a receiving module (not shown in the figure). The obtaining module may be configured to obtain a test task from a test task list. The distribution module may be configured to distribute the test task to a terminal so as to run the test task on the terminal. The receiving module may be configured to receive, in real time, the image to be processed corresponding to the test task sent by the terminal.
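The obtain/distribute/receive flow of these modules can be sketched as below. The `terminal` object, its `run` method, and the queue of frames are all hypothetical stand-ins for whatever terminal interface an implementation actually provides:

```python
from queue import Queue

def acquire_images(task_list, terminal):
    """Sketch of the acquiring unit: obtain each test task from the task
    list, distribute it to the terminal, and receive the to-be-processed
    images the terminal sends back while running the task."""
    frames = Queue()
    for task in task_list:                 # obtain tasks from the list
        for image in terminal.run(task):   # terminal runs the test task
            frames.put((task, image))      # images received in real time
    return frames
```

A real deployment would receive frames asynchronously over a network connection rather than from a synchronous iterator.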
In some optional implementations of this embodiment, the obtaining unit 505 includes: a cropping module (not shown), a calculation module (not shown), and a determination module (not shown). The cropping module may be configured to crop the image to be processed based on the mapping region, so as to obtain a cropped image. The calculating module may be configured to calculate a similarity between the cropped image and the target image in response to detecting that the cropped image and the target image are the same size. The determining module may be configured to determine that the image to be processed is qualified in response to determining that the similarity between the cropped image and the target image is greater than a set threshold.
In some optional implementation manners of this embodiment, the obtaining unit 505 further includes: a scaling module (not shown in the figure). The scaling module may be configured to, in response to detecting that the size of the cropped image is different from that of the target image, scale the cropped image so that the scaled cropped image has the same size as the target image.
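The crop/scale/compare flow of the cropping, scaling, calculating, and determining modules above can be sketched as follows. The bounding box taken from the mapping region, the nearest-neighbour scaling, and the normalized cross-correlation similarity are illustrative choices only; the patent does not fix a particular similarity metric, and the threshold 0.9 is an assumption.

```python
import numpy as np

def detect(image, region, target, threshold=0.9):
    """Crop `image` to the bounding box of the mapping region's (x, y)
    feature points, scale the crop to the target's size if the sizes
    differ, and score similarity by normalized cross-correlation."""
    xs, ys = zip(*region)
    crop = image[min(ys):max(ys) + 1, min(xs):max(xs) + 1]
    if crop.shape != target.shape:
        # Nearest-neighbour scaling so the cropped image matches the target size.
        ry = np.linspace(0, crop.shape[0] - 1, target.shape[0]).round().astype(int)
        rx = np.linspace(0, crop.shape[1] - 1, target.shape[1]).round().astype(int)
        crop = crop[np.ix_(ry, rx)]
    a = crop.astype(float) - crop.mean()
    b = target.astype(float) - target.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    score = float((a * b).sum() / denom) if denom else 1.0
    # Qualified when the similarity exceeds the set threshold.
    return score > threshold, score
```

For color images an implementation would apply the same comparison per channel or after conversion to grayscale.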
In some optional implementations of this embodiment, the matching unit 503 includes: a matching module (not shown) and a screening module (not shown). The matching module may be configured to match the feature point set to be detected with a target feature point set of the target image by using a feature matching algorithm to obtain at least one initial point pair. The screening module may be configured to screen the at least one initial point pair to obtain at least one matching point pair.
In some optional implementations of this embodiment, the screening module includes: a determining sub-module (not shown), a calculating sub-module (not shown), and a screening sub-module (not shown). The determining sub-module may be configured to determine, for each initial point pair of the at least one initial point pair, the target feature point closest to the feature point to be detected in the initial point pair and the next closest target feature point. The calculating sub-module may be configured to calculate the ratio of the Euclidean distances between the nearest target feature point and the next nearest target feature point. The screening sub-module may be configured to treat the initial point pair as a matching point pair in response to the ratio being greater than a set threshold.
In some optional implementations of this embodiment, the screening module further includes: an obtaining sub-module (not shown) and a removing sub-module (not shown). The obtaining sub-module may be configured to obtain the mask matrix by homography transformation based on the target feature points in the at least one matching point pair. The removing sub-module may be configured to remove, based on the mask matrix, the wrong matching point pairs in the at least one matching point pair.
In some optional implementations of this embodiment, the mapping unit 504 includes: an adaptation module (not shown), a connection module (not shown), and a localization module (not shown). The adaptation module may be configured to determine all feature points to be detected corresponding to all matching points on the image to be processed based on at least one matching point pair. The connection module may be configured to connect all feature points to be measured closest to a frame of the image to be processed, and form a mapping curve surrounding all the feature points to be measured. The localization module may be configured to use an area surrounded by the mapping curve as a mapping area of the image to be processed.
In the image processing apparatus provided by this embodiment of the present disclosure, first, the acquiring unit 501 acquires an image to be processed in real time; secondly, the extracting unit 502 performs feature point extraction on the image to be processed to obtain a feature point set to be detected of the image to be processed; thirdly, the matching unit 503 matches the feature point set to be detected with a target feature point set of a preset target image to obtain at least one matching point pair; then, the mapping unit 504 performs region mapping on the image to be processed based on the at least one matching point pair to obtain a mapping region of the image to be processed; finally, the obtaining unit 505 detects the image to be processed based on the mapping region and the target image to obtain a detection result. In this way, the mapping region is determined through feature point extraction and image detection is performed based on the mapping region, which improves the precision of image detection, realizes real-time automatic testing of images on the terminal, and, while improving the precision of image testing on the terminal, alleviates the problem of cross-terminal detection diversity.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
FIG. 6 illustrates a schematic block diagram of an example electronic device 600 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the device 600 includes a computing unit 601, which can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 602 or a computer program loaded from a storage unit 608 into a random access memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the device 600 can also be stored. The computing unit 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
A number of components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, a mouse, or the like; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 601 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 601 executes the respective methods and processes described above, such as the image processing method. For example, in some embodiments, the image processing method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the computing unit 601, one or more steps of the image processing method described above may be performed. Alternatively, in other embodiments, the computing unit 601 may be configured to perform the image processing method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable image processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
In the technical solution of the present disclosure, the acquisition, storage, application, and the like of the personal information of the users involved all comply with the provisions of relevant laws and regulations and do not violate public order and good morals.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel or sequentially or in different orders, and are not limited herein as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (19)

1. A method of image processing, the method comprising:
acquiring an image to be processed in real time;
extracting characteristic points of the image to be processed to obtain a characteristic point set to be detected of the image to be processed;
matching the feature point set to be detected with a preset target feature point set of a target image to obtain at least one matching point pair;
performing area mapping on the image to be processed based on the at least one matching point pair to obtain a mapping area of the image to be processed;
and detecting the image to be processed based on the mapping area and the target image to obtain a detection result.
2. The method of claim 1, wherein the acquiring the image to be processed in real-time comprises:
acquiring a test task from a test task list;
distributing the test task to a terminal to run the test task on the terminal;
and receiving the image to be processed corresponding to the test task sent by the terminal in real time.
3. The method according to claim 1, wherein the detecting the image to be processed based on the mapping region and the target image to obtain a detection result comprises:
based on the mapping area, cutting the image to be processed to obtain a cut image;
in response to detecting that the cropped image is the same size as the target image, calculating a similarity between the cropped image and the target image;
and determining that the image to be processed is qualified in response to determining that the similarity between the cut image and the target image is greater than a set threshold.
4. The method according to claim 3, wherein the detecting the image to be processed based on the mapping region and the target image to obtain a detection result further comprises:
and in response to detecting that the sizes of the cut image and the target image are different, scaling the cut image so that the scaled cut image and the target image have the same size.
5. The method according to one of claims 1 to 4, wherein the matching the feature point set to be tested with a target feature point set of a preset target image to obtain at least one matching point pair comprises:
matching the feature point set to be detected with a target feature point set of the target image by adopting a feature matching algorithm to obtain at least one initial point pair;
and screening the at least one initial point pair to obtain at least one matched point pair.
6. The method of claim 5, wherein the screening the at least one initial point pair to obtain at least one matching point pair comprises:
for each initial point pair of the at least one initial point pair, determining a target characteristic point closest to the characteristic point to be detected and a next closest target characteristic point in the initial point pair;
respectively calculating the ratio of Euclidean distances between the nearest target feature point and the next nearest target feature point;
and in response to the ratio being greater than a set threshold, taking the initial point pair as a matching point pair.
7. The method of claim 6, wherein the screening the at least one initial point pair to obtain at least one matching point pair further comprises:
obtaining a mask matrix by adopting homography transformation based on the target characteristic points in the at least one matching point pair;
and removing, based on the mask matrix, the wrong matching point pair in the at least one matching point pair.
8. The method according to claim 5, wherein the performing region mapping on the image to be processed based on the at least one matching point pair to obtain a mapped region of the image to be processed comprises:
determining all feature points to be detected corresponding to all matching points on the image to be processed based on the at least one matching point pair;
connecting all the feature points to be detected closest to the frame of the image to be processed to form a mapping curve surrounding all the feature points to be detected;
and taking the area surrounded by the mapping curve as the mapping area of the image to be processed.
9. An image processing apparatus, the apparatus comprising:
an acquisition unit configured to acquire an image to be processed in real time;
the extraction unit is configured to extract the feature points of the image to be processed to obtain a feature point set to be detected of the image to be processed;
the matching unit is configured to match the feature point set to be detected with a target feature point set of a preset target image to obtain at least one matching point pair;
the mapping unit is configured to perform area mapping on the image to be processed based on the at least one matching point pair to obtain a mapping area of the image to be processed;
and the obtaining unit is configured to detect the image to be processed based on the mapping area and the target image to obtain a detection result.
10. The apparatus of claim 9, wherein the obtaining unit comprises:
an obtaining module configured to obtain a test task from a test task list;
an allocation module configured to allocate the test task to a terminal to run the test task on the terminal;
and the receiving module is configured to receive the image to be processed corresponding to the test task sent by the terminal in real time.
11. The apparatus of claim 9, wherein the obtaining unit comprises:
the cutting module is configured to cut the image to be processed based on the mapping area to obtain a cut image;
a calculation module configured to calculate a similarity between the cropped image and the target image in response to detecting that the cropped image and the target image are the same size;
a determination module configured to determine that the image to be processed is qualified in response to determining that a similarity between the cropped image and the target image is greater than a set threshold.
12. The apparatus of claim 11, wherein the obtaining unit further comprises:
a scaling module configured to scale the cropped image such that the scaled cropped image is the same size as the target image in response to detecting that the cropped image is not the same size as the target image.
13. The apparatus according to one of claims 9-12, wherein the matching unit comprises:
the matching module is configured to match the feature point set to be detected with a target feature point set of the target image by adopting a feature matching algorithm to obtain at least one initial point pair;
and the screening module is configured to screen the at least one initial point pair to obtain at least one matched point pair.
14. The apparatus of claim 13, wherein the screening module comprises:
a determining submodule configured to determine, for each of the at least one initial point pair, a target feature point closest to the feature point to be measured and a next closest target feature point in the initial point pair;
a calculation submodule configured to calculate a ratio of euclidean distances between the nearest target feature point and the next nearest target feature point, respectively;
a screening submodule configured to treat the initial point pair as a matching point pair in response to the ratio being greater than a set threshold.
15. The apparatus of claim 14, wherein the screening module further comprises:
an obtaining submodule configured to obtain a mask matrix by using homography transformation based on the target feature point in the at least one matching point pair;
a removing submodule configured to remove, based on the mask matrix, an erroneous matching point pair of the at least one matching point pair.
16. The apparatus of claim 13, wherein the mapping unit comprises:
an adaptation module configured to determine all feature points to be detected corresponding to all matching points on the image to be processed based on the at least one matching point pair;
the connecting module is configured to connect all the feature points to be detected closest to the frame of the image to be processed to form a mapping curve surrounding all the feature points to be detected;
and the localization module is configured to take the area surrounded by the mapping curve as the mapping area of the image to be processed.
17. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein the content of the first and second substances,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
18. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-8.
19. A computer program product comprising a computer program which, when executed by a processor, implements the method of any one of claims 1-8.
CN202110662853.1A 2021-06-15 2021-06-15 Image processing method and device, electronic equipment and computer readable medium Pending CN113313125A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110662853.1A CN113313125A (en) 2021-06-15 2021-06-15 Image processing method and device, electronic equipment and computer readable medium


Publications (1)

Publication Number Publication Date
CN113313125A true CN113313125A (en) 2021-08-27

Family

ID=77378895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110662853.1A Pending CN113313125A (en) 2021-06-15 2021-06-15 Image processing method and device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN113313125A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115937546A (en) * 2022-11-30 2023-04-07 北京百度网讯科技有限公司 Image matching method, three-dimensional image reconstruction method, image matching device, three-dimensional image reconstruction device, electronic apparatus, and medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111767956A (en) * 2020-06-30 2020-10-13 苏州科达科技股份有限公司 Image tampering detection method, electronic device, and storage medium
CN112052186A (en) * 2020-10-10 2020-12-08 腾讯科技(深圳)有限公司 Target detection method, device, equipment and storage medium
CN112150464A (en) * 2020-10-23 2020-12-29 腾讯科技(深圳)有限公司 Image detection method and device, electronic equipment and storage medium


Similar Documents

Publication Publication Date Title
CN113971751A (en) Training feature extraction model, and method and device for detecting similar images
CN112597837A (en) Image detection method, apparatus, device, storage medium and computer program product
CN113591864B (en) Training method, device and system for text recognition model framework
CN114419035B (en) Product identification method, model training device and electronic equipment
CN112784760B (en) Human behavior recognition method, device, equipment and storage medium
CN112949767A (en) Sample image increment, image detection model training and image detection method
CN113256583A (en) Image quality detection method and apparatus, computer device, and medium
CN113378696A (en) Image processing method, device, equipment and storage medium
CN113360918A (en) Vulnerability rapid scanning method, device, equipment and storage medium
CN113643260A (en) Method, apparatus, device, medium and product for detecting image quality
CN113361468A (en) Business quality inspection method, device, equipment and storage medium
CN114596188A (en) Watermark detection method, model training method, device and electronic equipment
CN114022865A (en) Image processing method, apparatus, device and medium based on lane line recognition model
CN113313125A (en) Image processing method and device, electronic equipment and computer readable medium
CN113723305A (en) Image and video detection method, device, electronic equipment and medium
CN115631376A (en) Confrontation sample image generation method, training method and target detection method
CN114842476A (en) Watermark detection method and device and model training method and device
CN113887394A (en) Image processing method, device, equipment and storage medium
CN115131315A (en) Image change detection method, device, equipment and storage medium
CN114119990A (en) Method, apparatus and computer program product for image feature point matching
CN113362227A (en) Image processing method, image processing device, electronic equipment and storage medium
CN114092739B (en) Image processing method, apparatus, device, storage medium, and program product
CN114998906B (en) Text detection method, training method and device of model, electronic equipment and medium
CN112988011B (en) Word-taking translation method and device
CN113570607B (en) Target segmentation method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination