CN111325777A - Method and apparatus for processing image

Method and apparatus for processing image

Info

Publication number
CN111325777A
CN111325777A
Authority
CN
China
Prior art keywords
matched
material image
image
images
image set
Prior art date
Legal status
Pending
Application number
CN201811536270.9A
Other languages
Chinese (zh)
Inventor
闫创 (Yan Chuang)
Current Assignee
Beijing Wodong Tianjun Information Technology Co Ltd
Original Assignee
Beijing Wodong Tianjun Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Wodong Tianjun Information Technology Co Ltd
Priority to CN201811536270.9A
Publication of CN111325777A
Legal status: Pending

Classifications

    • G06T 7/33: Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06T 7/11: Region-based segmentation
    • G06T 7/194: Segmentation involving foreground-background segmentation
    • G06V 10/44: Local feature extraction by analysis of parts of the pattern (e.g. edges, contours, corners, strokes, intersections); connectivity analysis (e.g. of connected components)
    • G06T 2207/20132: Image cropping

Abstract

The embodiments of the present application disclose a method and apparatus for processing an image. One embodiment of the method comprises: receiving material information including a material image and presentation information of the material image; acquiring a set of material images to be matched corresponding to the presentation information; removing from the set those material images to be matched whose association degree with the material image is below a preset association degree threshold; and determining, from the set after the removal operation, a target material image that matches the material image. This embodiment increases the flexibility of image processing.

Description

Method and apparatus for processing image
Technical Field
The embodiments of the present application relate to the field of computer technology, in particular to the field of Internet technology, and more specifically to a method and apparatus for processing images.
Background
Image features generally include global image features and local image features. Global image features are features that represent an entire image; in contrast to local image features, they describe overall characteristics such as the color and shape of an image or object. Local image features are local expressions of image characteristics: they reflect local properties of an image and are well suited to applications such as image matching and retrieval.
Image matching is a technique that determines, based on image features, whether two images depict the same target object. Image matching can be applied in a variety of scenarios, such as target recognition, target detection and tracking, and information retrieval.
Disclosure of Invention
The embodiment of the application provides a method and a device for processing an image.
In a first aspect, an embodiment of the present application provides a method for processing an image, the method comprising: receiving material information including a material image and presentation information of the material image; acquiring a set of material images to be matched corresponding to the presentation information; removing from the set those material images to be matched whose association degree with the material image is below a preset association degree threshold; and determining, from the set of material images to be matched after the removal operation, a target material image to be matched that matches the material image.
In some embodiments, the association degree between a material image to be matched in the set and the material image is determined by the following steps: extracting global image features of the material image and of each material image to be matched in the set; and determining the association degree between each material image to be matched and the material image according to those global image features.
In some embodiments, determining, from the set of material images to be matched after the removal operation, a target material image to be matched that matches the material image includes: extracting local image features of the material image and of each material image to be matched in the set after the removal operation; determining the matching degree between each material image to be matched and the material image according to those local image features; and determining, as the target material image to be matched, a material image to be matched in the set whose matching degree with the material image reaches a preset matching degree threshold.
In some embodiments, the local image features described above include a binary feature descriptor.
In some embodiments, the association degree between a material image to be matched in the set and the material image is determined by the following steps: extracting color moments of the material image and of each material image to be matched in the set; and determining the association degree between each material image to be matched and the material image according to those color moments.
In some embodiments, determining, from the set of material images to be matched after the removal operation, a target material image to be matched that matches the material image includes: extracting feature points and feature vectors of local image features of the material image and of each material image to be matched in the set after the removal operation; taking the smaller of the two images' feature point counts as the minimum feature point count; generating at least one matching feature point between the material image to be matched and the material image based on their feature vectors; dividing the number of matching feature points by the minimum feature point count to obtain the similarity between the two images; and determining, as the target material image to be matched, any material image to be matched in the pruned set whose similarity is greater than a similarity threshold and whose number of matching feature points is greater than a matching feature point threshold.
In some embodiments, the above method further comprises: and in response to detecting that the material image contains a preset non-material object, cutting the preset non-material object from the material image.
In some embodiments, the above method further comprises: and acquiring and pushing the attribute information of the target material image to be matched.
In a second aspect, an embodiment of the present application provides an apparatus for processing an image, the apparatus comprising: a receiving unit configured to receive material information including a material image and presentation information of the material image; an obtaining unit configured to obtain a set of material images to be matched corresponding to the presentation information; a removing unit configured to remove from the set those material images to be matched whose association degree with the material image is below a preset association degree threshold; and a determining unit configured to determine, from the set after the removal operation, a target material image to be matched that matches the material image.
In some embodiments, the association degree between the material image to be matched in the material image set to be matched and the material image is determined by the following steps: extracting the material images to be matched in the material image set to be matched and the global image characteristics of the material images; and determining the association degree between the material image to be matched in the material image set to be matched and the material image according to the material image to be matched in the material image set to be matched and the global image characteristics of the material image.
In some embodiments, the determining unit is further configured to determine a target material image to be matched, which is matched with the material image, from the material image set to be matched after the removing operation as follows: extracting the material images to be matched in the material image set to be matched after the removing operation and the local image characteristics of the material images; determining the matching degree of the material image to be matched in the material image set to be matched after the removing operation and the material image according to the material image to be matched in the material image set to be matched after the removing operation and the local image characteristics of the material image; and determining the material image to be matched, which has the matching degree with the material image reaching a preset matching degree threshold value, in the material image set to be matched after the removing operation as a target material image to be matched.
In some embodiments, the above apparatus further comprises: a clipping unit configured to clip a preset non-material object from the material image in response to detecting that the material image contains the preset non-material object.
In some embodiments, the above apparatus further comprises: and the pushing unit is configured to acquire and push the attribute information of the target material image to be matched.
In a third aspect, an embodiment of the present application provides a server, including: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement a method as in any embodiment of the method for processing images provided by the first aspect.
In a fourth aspect, the present application provides a computer-readable medium, on which a computer program is stored, which when executed by a processor implements the method of any one of the embodiments of the method for processing an image as provided in the first aspect.
With the method and apparatus for processing an image provided by the embodiments of the present application, material information including a material image and presentation information of the material image is first received. Next, a set of material images to be matched corresponding to the presentation information is acquired, yielding a candidate set preliminarily filtered by the presentation information. Then, material images to be matched whose association degree with the material image is below a preset association degree threshold are removed, further filtering the candidate set by association degree. Finally, a target material image that matches the material image is determined from the set after the removal operation. By preliminarily screening candidates through the presentation information, further eliminating unmatched images through the association degree, and only then matching the remaining material images to be matched, the method and apparatus improve image matching efficiency.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for processing an image according to the present application;
FIGS. 3A and 3B are schematic diagrams of an application scenario according to one embodiment of a method for processing an image according to the present application;
FIG. 4 is a schematic illustration of an application scenario of a method for processing an image according to an embodiment of the present application;
FIG. 5 is a flow diagram of yet another embodiment of a method for processing an image according to the present application;
FIG. 6 is a schematic block diagram of one embodiment of an apparatus for processing images according to the present application;
FIG. 7 is a block diagram of a computer system suitable for use in implementing a server according to embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which the method for processing an image or the apparatus for processing an image of the embodiments of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The terminal devices 101, 102, 103 interact with a server 105 via a network 104 to receive or send messages or the like. Various applications may be installed on the terminal devices 101, 102, 103, such as shopping applications, search applications, instant messaging tools, mailbox clients, social platform software, text editing applications, browser applications, reading applications, and the like.
The terminal devices 101, 102, and 103 may be hardware or software. When they are hardware, they may be various electronic devices having a display screen, including but not limited to smartphones, tablet computers, e-book readers, laptop computers, desktop computers, and the like. When they are software, they may be installed on the electronic devices listed above and implemented either as multiple pieces of software or software modules (for example, to provide a shopping service) or as a single piece of software or software module. No specific limitation is imposed here.
The server 105 may be a server providing various services, for example a back-end server for shopping applications on the terminal devices 101, 102, and 103. The back-end server may receive a material image reported by the terminal devices 101, 102, or 103, search the library of material images to be matched for an image matching it, perform other processing, generate a processing result, and optionally feed the result (such as the material image to be matched that matches the reported image) back to the terminal devices.
Note that the material images may also be stored locally on the server 105, which can then extract and process them directly; in this case the terminal devices 101, 102, and 103 and the network 104 need not be present.
It should be noted that the method for processing an image provided in the embodiment of the present application is generally performed by the server 105, and accordingly, the apparatus for processing an image is generally disposed in the server 105.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for processing an image according to the present application is shown. The method for processing the image comprises the following steps:
Step 201: receive material information including a material image and presentation information of the material image.
In the present embodiment, the execution subject of the method for processing an image (e.g., the server 105 shown in fig. 1) can receive, over a wired or wireless connection, material information including a material image and presentation information of the material image. As one example, the execution subject may receive the material information from a terminal device. As another example, the material information may be entered directly by a user on the execution subject.
The material image may be an image that serves as content material and that a user has reported as problematic (e.g., illegal or containing false content) through a terminal device; it may be in any of various formats. For example, the material image may be an image captured by a mobile phone or computer terminal, or a screenshot of a terminal's display interface. The presentation information may be any information characterizing how the material image is presented, such as presentation position information, presentation time information, and presentation manner information. As an example, the material image may be an advertisement material image, in which case the presentation information may include: advertisement slot information, the business line it belongs to, charging mode information, and playing time. Advertisement slot information generally refers to the position of the material image on the terminal interface (e.g., the upper left corner of a page). The charging mode information may be CPD (Cost Per Day) mode information, RTB (Real-Time Bidding) mode information, or other charging mode information; no limitation is imposed here.
In practice, the material image may contain an image from an application interface that presents information about a specified item or event, for example an item information display or an activity promotion graphic. When a user browsing such an image finds that it contains illegal content, the user can generate a material image containing it by taking a screenshot or photograph and send the material image to the execution subject.
Step 202: acquire a set of material images to be matched corresponding to the presentation information.
In this embodiment, presentation information and sets of material images to be matched have a correspondence; for example, it may be based on a pre-built correspondence table storing the relationships between multiple pieces of presentation information and sets of material images to be matched. The set of material images to be matched is obtained by selecting, from a designated set of material images, those whose presentation information is consistent with (for example, identical to) the received presentation information.
As an example, suppose the material image and the material images to be matched belong to the advertising domain: the material image is a problematic advertisement material image, and the material images to be matched are advertisement material images. If the basic attributes corresponding to the material image are known (i.e. its presentation information, such as the business line the advertisement material belongs to, the advertisement charging mode, and the advertisement playing time), filtering can be performed on these basic attributes to obtain the set of advertisement material images that could possibly match the problematic material image. Specifically, the advertisement material images to be matched that share the same basic attributes as the problematic material image are screened out and used as its set of material images to be matched.
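The basic-attribute screening described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the helper name `filter_by_presentation` and the metadata fields (`slot`, `billing`, `line`) are assumptions chosen for the example.

```python
def filter_by_presentation(candidates, presentation):
    """Keep only candidates whose metadata matches every given
    presentation attribute of the reported material image."""
    return [c for c in candidates
            if all(c["meta"].get(k) == v for k, v in presentation.items())]

# Hypothetical candidate records pairing an image id with presentation metadata.
candidates = [
    {"id": "m1", "meta": {"slot": "top-left", "billing": "CPD", "line": "apparel"}},
    {"id": "m2", "meta": {"slot": "banner", "billing": "RTB", "line": "apparel"}},
    {"id": "m3", "meta": {"slot": "top-left", "billing": "CPD", "line": "apparel"}},
]
matched = filter_by_presentation(candidates, {"slot": "top-left", "billing": "CPD"})
# matched keeps m1 and m3
```

In a real system the candidate metadata would come from the pre-built correspondence table rather than an in-memory list.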
It should be noted that the above-mentioned material image set to be matched may be directly stored locally, or may be stored in other electronic devices that are in communication connection with the execution subject. When the material image set to be matched is stored locally, the execution subject can directly extract the locally stored material image set to be matched for processing. When the material image set to be matched is stored in other electronic equipment (such as storage equipment in a distributed storage system) in communication connection with the execution main body, the execution main body can acquire the material image set to be matched for processing through a wired connection mode or a wireless connection mode.
Step 203: remove material images to be matched whose association degree with the material image is below a preset association degree threshold.
In this embodiment, the association degree between each image to be matched in the set and the material image may first be computed; any material image to be matched whose association degree is below a preset association degree threshold is then removed from the set. The association degree may be expressed as a similarity or as any other index that measures how related the two images are, and the association degree threshold can be set according to actual needs.
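The pruning logic of step 203 can be sketched as below. The association measure here is a deliberately toy stand-in (comparing mean grey levels), labeled as such; the patent leaves the concrete measure open.

```python
def prune_candidates(candidates, reference, association, threshold):
    """Step-203-style pruning: drop every candidate whose association
    degree with the reference image falls below the threshold."""
    return [c for c in candidates if association(c, reference) >= threshold]

def toy_association(a, b):
    """Toy association degree in [0, 1]: 1 minus the normalized
    difference of two mean grey levels (a stand-in for a real measure)."""
    return 1.0 - abs(a - b) / 255.0

# Candidates summarized by their mean grey level; the reference mean is 250.
kept = prune_candidates([10, 100, 240], 250, toy_association, 0.8)  # -> [240]
```

Passing the association function as a parameter mirrors the text: similarity or any other index can be plugged in without changing the pruning step.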
In some optional implementations of this embodiment, the association degree between a material image to be matched in the set and the material image may be determined by the following steps:
First, extract global image features of the material image and of each material image to be matched in the set.
Second, determine the association degree between each material image to be matched and the material image according to those global image features.
It should be noted that the global image feature may be a pixel-based global image feature, such as a color moment, a color histogram, a color coherence vector, or any other usable global image feature; no limitation is imposed here. As an example, since color distribution information is concentrated mainly in the low-order moments, the color distribution of an image can be expressed using the first-order moment (mean), second-order moment (variance), and third-order moment (skewness) of its colors, and these features need not be vectorized. Because global image features describe the whole image, they are easier to extract and better express an image's overall characteristics than local image features, which focus only on foreground or background information; by comparing the global features of the material image with those of the material images to be matched, clearly unmatched candidates can be eliminated quickly.
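The three color moments named above can be computed as follows. This sketch uses the raw second and third central moments; note that some formulations of color moments instead use the standard deviation and the cube root of the third moment.

```python
def color_moments(channel):
    """First three moments of one colour channel: mean (first moment),
    variance (second central moment), and skewness taken here as the raw
    third central moment; some variants use its cube root instead."""
    n = len(channel)
    mean = sum(channel) / n
    variance = sum((p - mean) ** 2 for p in channel) / n
    skewness = sum((p - mean) ** 3 for p in channel) / n
    return mean, variance, skewness

# For an RGB image the descriptor would be 9 numbers (3 moments x 3 channels);
# a single toy channel is shown here.
m, v, s = color_moments([10, 20, 30])  # mean 20.0; symmetric, so skewness 0.0
```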
Step 204: determine, from the set of material images to be matched after the removal operation, a target material image that matches the material image.
In this embodiment, any of several image matching algorithms may be used to determine the material image to be matched that matches the material image. For example, a grayscale-based image matching algorithm such as MAD (Mean Absolute Difference) or SAD (Sum of Absolute Differences) may be used, or image matching may be performed with a machine learning model such as a neural network.
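As a minimal sketch of the grayscale-based approach, SAD scores two equally sized blocks by summing pixel-wise absolute differences; the candidate with the lowest score is the best match. The helper names are illustrative, not from the patent.

```python
def sad(block_a, block_b):
    """Sum of Absolute Differences between two equally sized grey blocks,
    given as flat lists of pixel values; lower means more similar."""
    return sum(abs(a - b) for a, b in zip(block_a, block_b))

def best_match(reference, candidates):
    """Index of the candidate block with the smallest SAD score."""
    scores = [sad(reference, c) for c in candidates]
    return scores.index(min(scores))

idx = best_match([1, 2, 3], [[9, 9, 9], [1, 2, 4], [0, 0, 0]])  # -> 1
```

MAD is the same quantity divided by the number of pixels, so the two rank candidates identically for equal-sized blocks.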
Optionally, the target material image to be matched that matches the material image may also be determined by the following steps:
First, for each material image to be matched in the set after the removal operation, generate the similarity between that image and the material image. Image features of both images may be extracted, and the similarity between these features computed and used as the similarity between the two images. The feature matching algorithm may be, for example, the SURF (Speeded-Up Robust Features) algorithm or the ORB (Oriented FAST and Rotated BRIEF) algorithm.
Then, in response to determining that the generated similarity is greater than a similarity threshold, store the material image to be matched in the matching image set. The similarity threshold may be set according to actual needs, for example 80%. The matching image set contains the images that match the material image; note that it may be empty.
Optionally, the target material image to be matched that matches the material image may also be determined by the following steps:
firstly, extracting the material images to be matched in the material image set to be matched after the removing operation and the local image characteristics of the material images. Local image features are typically features extracted from local regions of the image, including edges, corners, lines, curves, and regions of particular attributes, among others. For example, the local image features may be Binary feature descriptors of one or more local non-color information of the image extracted by an algorithm such as SIFT (Scale-invariant feature transform) or BRISK (Binary Robust scalable Keypoints).
And then, determining the matching degree of the material image to be matched in the material image set to be matched after the removing operation and the material image according to the material image to be matched in the material image set to be matched after the removing operation and the local image characteristics of the material image. The degree of matching can be determined using a BF (Brute Force) algorithm or KNN (k-nearest neighbor algorithm).
And finally, determining the material image to be matched, which has the matching degree with the material image reaching a preset matching degree threshold value, in the material image set to be matched after the removing operation as a target material image to be matched. The matching degree threshold can be set according to actual needs.
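Brute-force matching of binary descriptors (such as BRISK produces) can be sketched as below, with each descriptor encoded as an integer bit string and compared by Hamming distance. This is a simplified illustration of the BF approach, not a production matcher: real matchers typically add cross-checking or a ratio test.

```python
def hamming(d1, d2):
    """Hamming distance between two binary descriptors encoded as ints."""
    return bin(d1 ^ d2).count("1")

def bf_match(desc_a, desc_b, max_dist):
    """Brute-force matching: pair each descriptor in desc_a with its
    nearest neighbour in desc_b and keep pairs whose Hamming distance
    does not exceed max_dist. Returns (index_a, index_b, distance)."""
    matches = []
    for i, da in enumerate(desc_a):
        j, dist = min(((j, hamming(da, db)) for j, db in enumerate(desc_b)),
                      key=lambda t: t[1])
        if dist <= max_dist:
            matches.append((i, j, dist))
    return matches

pairs = bf_match([0b1010, 0b1111], [0b1011, 0b0000], max_dist=1)
```

A KNN variant would keep the k nearest neighbours per descriptor instead of only the single best one.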
In some optional implementations of this embodiment, determining, from the set of material images to be matched after the removal operation, a target material image to be matched that matches the material image may include: extracting feature points and feature vectors of local image features of the material image and of each material image to be matched in the set after the removal operation; taking the smaller of the two images' feature point counts as the minimum feature point count; generating at least one matching feature point between the material image to be matched and the material image based on their feature vectors; dividing the number of matching feature points by the minimum feature point count to obtain the similarity between the two images; and determining, as the target material image to be matched, any material image to be matched in the pruned set whose similarity is greater than a similarity threshold and whose number of matching feature points is greater than a matching feature point threshold.
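The decision rule above reduces to simple arithmetic once the feature points have been matched; the function names below are illustrative.

```python
def match_similarity(n_matches, n_kp_query, n_kp_candidate):
    """Similarity = number of matched feature points divided by the
    smaller of the two images' feature point counts."""
    return n_matches / min(n_kp_query, n_kp_candidate)

def is_target(n_matches, n_kp_query, n_kp_candidate,
              sim_threshold, match_count_threshold):
    """Both conditions from the text must hold: similarity above its
    threshold AND an absolute number of matches above its threshold."""
    sim = match_similarity(n_matches, n_kp_query, n_kp_candidate)
    return sim > sim_threshold and n_matches > match_count_threshold
```

For example, 40 matches against images with 50 and 80 feature points gives a similarity of 40 / 50 = 0.8. Requiring a minimum absolute match count in addition to the ratio guards against spuriously high similarity when both images have very few feature points.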
In some optional implementations of this embodiment, the method may further include: in response to detecting that the material image contains a preset non-material object, cutting the preset non-material object out of the material image, and then performing the above steps 203 and 204 with the material image after the cutting operation as a new material image. Here, a non-material object may be a preset object that does not represent the content of the material, such as an icon or text. For example, the operator icon and time identifier in a material image obtained by taking a screenshot on a mobile phone are non-material objects. For another example, in a material image formed from the variable recommended-content area of an application's home page, the fixed objects on that home page (for example, a search box or preset section icons) are non-material objects.
The executing body may detect non-material objects in the material image through the SSD (Single Shot MultiBox Detector) algorithm, through the STDN (Scale-Transferrable Detection Network) algorithm, or through other detection algorithms, which is not limited herein.
The executing body may cut the detected non-material object out of the material image using the Liang-Barsky clipping algorithm, the Cohen-Sutherland clipping algorithm, or another clipping algorithm, which is not limited herein.
After the detected non-material object is cut out of the material image, the material image after the cutting operation is taken as a new material image and the above steps 203 and 204 are performed. This avoids the influence of non-material objects on subsequent image matching and improves matching efficiency.
As an example, the material image is a screenshot of a mobile phone (e.g., fig. 3A); in the top left corner of the material image is an operator icon, in the top right corner is a time identifier, and below the operator icon and time identifier is a search box. First, the non-material objects (the operator icon, time identifier and search box) in the material image can be detected by the SSD algorithm. Then, the detected non-material objects can be cut out of the material image by the Cohen-Sutherland clipping algorithm. Thereafter, the above steps 203 and 204 are performed with the material image after the cutting operation (e.g., fig. 3B) as a new material image, thereby avoiding the influence of the non-material objects (the operator icon, time identifier and search box) on image matching.
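As a simplified sketch of this pre-processing step (an assumption for illustration — here the detected boxes are blanked rather than geometrically clipped, so they contribute no features to the subsequent matching):

```python
def blank_regions(image, boxes, fill=0):
    """Replace each detected non-material bounding box (x0, y0, x1, y1),
    exclusive of x1/y1, with a fill value so the region no longer
    contributes image features. `image` is a list of rows of pixel values."""
    out = [row[:] for row in image]  # copy so the original stays intact
    for x0, y0, x1, y1 in boxes:
        for y in range(y0, y1):
            for x in range(x0, x1):
                out[y][x] = fill
    return out

img = [[1] * 6 for _ in range(4)]
# e.g. an operator icon in the top-left and a time identifier in the top-right
cleaned = blank_regions(img, [(0, 0, 2, 1), (4, 0, 6, 1)])
print(cleaned[0])  # → [0, 0, 1, 1, 0, 0]
```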
In some optional implementations of this embodiment, the method for processing an image may further include: acquiring and pushing attribute information of the target material image to be matched. The attribute information may include the information for presentation, such as an advertisement slot and an affiliated business line, and may also include an image identifier, an image content summary, a storage location and the like of the target material image to be matched. The pushing manner may be pushing to a display page of the executing body, pushing to a terminal communicatively connected to the executing body, or another pushing manner, which is not limited herein.
With continued reference to fig. 4, fig. 4 is a schematic diagram of an application scenario of the method for processing an image according to the present embodiment. In the application scenario 400 of fig. 4, the server 401 receives the advertisement material A and its charging mode CPD, that is, (material A, CPD), and acquires the set of materials to be matched corresponding to the charging mode CPD: {(material A1, CPD), (material AB2, CPD), (material B, CPD), (material D, CPD)}. The materials to be matched whose second-order color moment correlation with material A is below the correlation threshold of 50% are then removed. Specifically, in response to determining that the second-order color moment correlations of (material B, CPD) and (material D, CPD) with (material A, CPD) are below the correlation threshold, (material B, CPD) and (material D, CPD) are removed from the set of materials to be matched. Finally, the target material to be matched, (material A1, CPD), which matches the material, is determined from the set of materials to be matched after the removing operation, {(material A1, CPD), (material AB2, CPD)}.
The method provided by the embodiment of the application first receives material information including a material image and information for presentation of the material image; it then acquires the material image set to be matched corresponding to the information for presentation, so that the candidate images are preliminarily filtered by the information for presentation. Next, the material images to be matched whose association degree with the material image is lower than a preset association degree threshold are removed from the material image set to be matched, further filtering the set by association degree. Finally, a target material image to be matched that matches the material image is determined from the material image set to be matched after the removing operation. The preliminary filtering by information for presentation and by association degree eliminates unmatched images and improves the efficiency of image matching. Because matching is performed on the filtered material images to be matched, images can be matched without manual screening, which reduces labor cost.
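The overall filter-then-match flow summarized above can be sketched as follows; the stand-in similarity functions, the tuple representation of "images" and the threshold values are all assumptions for illustration:

```python
def filter_pipeline(material, candidates, global_sim, local_match,
                    assoc_threshold=0.5, match_threshold=0.6):
    """Two-stage filtering: (1) remove candidates whose global-feature
    association with the material is below assoc_threshold; (2) among the
    survivors, keep those whose local-feature matching degree reaches
    match_threshold."""
    survivors = [c for c in candidates
                 if global_sim(material, c) >= assoc_threshold]
    return [c for c in survivors
            if local_match(material, c) >= match_threshold]

# Toy stand-ins: each "image" is a (global_feature, local_feature) pair.
global_sim = lambda a, b: 1.0 - abs(a[0] - b[0])
local_match = lambda a, b: 1.0 - abs(a[1] - b[1])

material = (0.9, 0.9)
candidates = [(0.85, 0.88), (0.2, 0.9), (0.88, 0.1)]
print(filter_pipeline(material, candidates, global_sim, local_match))
# → [(0.85, 0.88)]: the second candidate fails the global filter,
#   the third fails the local match.
```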
With further reference to FIG. 5, a flow 500 of another embodiment of a method for processing an image is shown. The flow 500 of the method for processing an image comprises the steps of:
step 501, receiving material information including a material image and information for presenting the material image.
Step 502, acquiring a material image set to be matched corresponding to the presentation information.
Step 503, extracting the material images to be matched in the material image set to be matched and the global image features of the material images.
It should be noted that the global image feature may be a pixel-based global image feature, such as a color moment, a color histogram, a color coherence vector, or another usable global image feature, which is not limited herein.
And 504, determining the association degree of the material images to be matched and the material images in the material image set to be matched according to the material images to be matched in the material image set to be matched and the global image characteristics of the material images.
The association degree threshold may be set according to actual needs. As an example, since color distribution information is mainly concentrated in the low-order moments, the color distribution of an image can be expressed by the first-order moment (mean), second-order moment (variance) and third-order moment (skewness) of its colors; these features do not need to be vectorized. By comparing each material image to be matched in the material image set to be matched with the material image, some unmatched material images to be matched can be eliminated.
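A minimal sketch of the first three color moments of a single channel, following the mean/variance/skewness description above (using standard deviation for the second moment and a signed cube root of the third central moment for skewness — one common convention, assumed here):

```python
def color_moments(channel):
    """First three color moments of one channel: mean, standard deviation
    (from the second-order central moment) and signed cube root of the
    third-order central moment (skewness)."""
    n = len(channel)
    mean = sum(channel) / n
    var = sum((p - mean) ** 2 for p in channel) / n
    std = var ** 0.5
    third = sum((p - mean) ** 3 for p in channel) / n
    skew = abs(third) ** (1 / 3) * (1 if third >= 0 else -1)
    return mean, std, skew

pixels = [10, 10, 10, 250]  # mostly dark channel with one bright pixel
m, s, k = color_moments(pixels)
print(round(m, 1), round(s, 1))  # → 70.0 103.9
```

For an RGB image this yields a 9-dimensional feature (three moments per channel) that two images can be compared on directly.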
And 505, removing the material images to be matched, which have the association degree with the material images lower than a preset association degree threshold value, in the material image set to be matched.
Step 506, extracting the material images to be matched in the removed material image set to be matched and the local image features of the material images.
Local image features are typically features extracted from local regions of an image, such as edges, corners, lines, curves and regions with particular attributes. For example, the local image feature may be a SIFT feature or a binary feature descriptor (e.g., a BRISK feature).
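As a toy illustration of the corner-type local features mentioned above (a simplified Harris-style response, not the SIFT or binary descriptors named in the text — all constants here are illustrative assumptions):

```python
def harris_response(img, x, y, k=0.04):
    """Toy Harris corner response at pixel (x, y) over a 3x3 window:
    positive at corners, negative along edges, near zero in flat regions.
    `img` is a list of rows of grayscale values."""
    def grad(ix, iy):
        gx = img[iy][ix + 1] - img[iy][ix - 1]  # central differences
        gy = img[iy + 1][ix] - img[iy - 1][ix]
        return gx, gy
    sxx = sxy = syy = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            gx, gy = grad(x + dx, y + dy)
            sxx += gx * gx
            sxy += gx * gy
            syy += gy * gy
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace * trace

# Bright square in the lower-right quadrant: corner at (4, 4),
# vertical edge through (4, 7), flat region around (2, 2).
img = [[255 if (x >= 4 and y >= 4) else 0 for x in range(10)] for y in range(10)]
print(harris_response(img, 4, 4) > 0)   # corner → True
print(harris_response(img, 4, 7) < 0)   # edge → True
print(harris_response(img, 2, 2) == 0)  # flat → True
```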
And step 507, determining the matching degree of the material image to be matched and the material image in the material image set to be matched after the removing operation according to the material image to be matched and the local image characteristics of the material image in the material image set to be matched after the removing operation.
The algorithm for determining the matching degree may be the brute-force (BF) matching algorithm or the k-nearest neighbor (KNN) algorithm.
And step 508, determining the material image to be matched, which has the matching degree with the material image reaching a preset matching degree threshold value, in the removed material image set to be matched as the target material image to be matched.
The matching degree threshold can be set according to actual needs.
Optionally, for the material image to be matched in the material image set to be matched after the removing operation, extracting feature points and feature vectors of local image features of the material image to be matched; taking the minimum value of the number of the characteristic points of the material image to be matched and the number of the characteristic points of the material image as the minimum value of the number of the characteristic points; generating at least one matching feature point of the material image to be matched and the material image based on the feature vector of the material image to be matched and the feature vector of the material image; and obtaining the matching degree of the material image to be matched and the material image based on the generated number of the matching characteristic points and the minimum value of the number of the characteristic points.
In this embodiment, at least one matching feature point between the material image to be matched and the material image may be generated by a feature matching algorithm, such as the brute-force (BF) matching algorithm or the k-nearest neighbor (KNN) algorithm. Then, the result of dividing the number of generated matching feature points by the minimum feature point number may be taken as the matching degree between the material image to be matched and the material image.
The steps 501, 502, and 505 may be performed in a manner similar to the steps 201, 202, and 203 of the embodiment shown in fig. 2, and are not described herein again.
In some optional implementations of this embodiment, step 508 may further include: pushing the material image to be matched whose matching degree reaches the preset matching degree threshold. The matching degree threshold may be set according to actual needs. The pushing manner may be pushing to a display page of the executing body, pushing to a terminal communicatively connected to the executing body, or another pushing manner, which is not limited herein.
As can be seen from fig. 5, compared with the embodiment corresponding to fig. 2, the flow 500 of the method for processing an image in the present embodiment elaborates steps 503, 504, 506, 507 and 508. The scheme described in this embodiment extracts the local image features of the material image and of the material images to be matched, determines the matching degree between each material image to be matched in the material image set to be matched after the removing operation and the material image, and determines the material image to be matched whose matching degree with the material image reaches a preset matching degree threshold as the target material image to be matched. Filtering by global features eliminates some unmatched images, which reduces the amount of computation in the subsequent matching steps and improves efficiency; accurate matching by local features then yields the target material image to be matched. Manual screening is not needed, which reduces labor cost and further improves matching efficiency.
With further reference to fig. 6, as an implementation of the methods shown in the above figures, the present application provides an embodiment of an apparatus for processing an image, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 6, the apparatus 600 for processing an image according to the present embodiment includes: a receiving unit 601 configured to receive material information including a material image and information for presentation of the material image. An obtaining unit 602 configured to obtain a set of material images to be matched corresponding to the presentation information. A removing unit 603 configured to remove material images to be matched, which have an association degree with the material image lower than a preset association degree threshold value, from the material image set to be matched. The determining unit 604 is configured to determine a target material image to be matched, which matches the material image, from the material image set to be matched after the removing operation.
In the present embodiment, in the apparatus 600 for processing an image: the specific processing of the receiving unit 601, the obtaining unit 602, the removing unit 603, and the determining unit 604 and the technical effects thereof can refer to the related descriptions of step 201, step 202, step 203, and step 204 in the corresponding embodiment of fig. 2, which are not described herein again.
In some optional implementation manners of this embodiment, the association degree between the material image set to be matched and the material image is determined through the following steps: extracting the material images to be matched in the material image set to be matched and the global image characteristics of the material images; and determining the association degree between the material image to be matched in the material image set to be matched and the material image according to the material image to be matched in the material image set to be matched and the global image characteristics of the material image.
In some optional implementation manners of this embodiment, the determining unit includes: the extraction subunit is configured to extract the material images to be matched in the material image set to be matched after the removal operation and the local image features of the material images; the first determining subunit is configured to determine the matching degree of the material image to be matched in the material image set to be matched after the removing operation and the material image according to the material image to be matched in the material image set to be matched after the removing operation and the local image characteristics of the material image; and the second determining subunit is configured to determine, as a target material image to be matched, a material image to be matched, in the removed material image set to be matched, whose matching degree with the material image reaches a preset matching degree threshold.
In some optional implementations of this embodiment, the apparatus further includes: a clipping unit configured to clip a preset non-material object from the material image in response to detecting that the material image contains the preset non-material object.
In some optional implementations of this embodiment, the apparatus further includes: and the pushing unit is configured to acquire and push the attribute information of the target material image to be matched.
The apparatus provided by the above embodiment of the present application receives, through the receiving unit 601, material information including a material image and information for presentation of the material image; the acquiring unit 602 acquires the material image set to be matched corresponding to the information for presentation; the removing unit 603 removes the material images to be matched whose association degree with the material image is lower than a preset association degree threshold from the material image set to be matched; and the determining unit 604 determines a target material image to be matched that matches the material image from the material image set to be matched after the removing operation. The apparatus eliminates some unmatched images through the preliminary filtering by information for presentation and by association degree, which helps improve the efficiency of image matching. Matching is then performed on the filtered material images to be matched, so images can be matched without manual screening, reducing labor cost.
Referring now to FIG. 7, shown is a block diagram of a computer system 700 suitable for use in implementing a server according to embodiments of the present application. The server shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU)701, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the system 700 are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program, when executed by a Central Processing Unit (CPU)701, performs the above-described functions defined in the method of the present application. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. 
In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a receiving unit, an obtaining unit, a removing unit, and a determining unit. Here, the names of these units do not constitute a limitation of the unit itself in some cases, and for example, the receiving unit may also be described as a "unit that receives material information including a material image and information for presentation of the material image".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments; or may be present separately and not assembled into the device. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: receiving material information including a material image and information for presentation of the material image; acquiring a material image set to be matched corresponding to the presentation information; removing the material images to be matched, of which the association degree with the material images in the material image set to be matched is lower than a preset association degree threshold value; and determining a target material image to be matched with the material image from the material image set to be matched after the removing operation.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (11)

1. A method for processing an image, comprising:
receiving material information including a material image and presentation information of the material image;
acquiring a material image set to be matched corresponding to the presentation information;
removing the material images to be matched, of which the association degree with the material images in the material image set to be matched is lower than a preset association degree threshold value;
and determining a target material image to be matched with the material image from the material image set to be matched after the removing operation.
2. The method according to claim 1, wherein the relevance of the material image to be matched in the set of material images to be matched and the material image is determined by:
extracting the material images to be matched in the material image set to be matched and the global image characteristics of the material images;
and determining the association degree of the material images to be matched in the material image set to be matched and the material images according to the material images to be matched in the material image set to be matched and the global image characteristics of the material images.
3. The method according to claim 1 or 2, wherein the determining a target material image to be matched, which is matched with the material image, from the material image set to be matched after the removing operation comprises:
extracting the material image to be matched in the material image set to be matched after the removing operation and the local image characteristics of the material image;
determining the matching degree of the material image to be matched in the material image set to be matched after the removing operation and the material image according to the material image to be matched in the material image set to be matched after the removing operation and the local image characteristics of the material image;
and determining the material image to be matched, which has the matching degree with the material image reaching a preset matching degree threshold value, in the material image set to be matched after the removing operation as a target material image to be matched.
4. The method of claim 3, wherein the local image features comprise a binary feature descriptor.
5. The method according to claim 1, wherein the relevance of the material image to be matched in the set of material images to be matched and the material image is determined by:
extracting the material images to be matched in the material image set to be matched and the color moments of the material images;
and determining the similarity between the material image to be matched in the material image set to be matched and the material image according to the material image to be matched in the material image set to be matched and the color moment of the material image.
6. The method according to claim 1, wherein the determining a target material image to be matched, which is matched with the material image, from the material image set to be matched after the removing operation comprises:
extracting feature points and feature vectors of local image features of the material images to be matched in the material image set to be matched after the removing operation; taking the minimum value of the number of the characteristic points of the material image to be matched and the number of the characteristic points of the material image as the minimum value of the number of the characteristic points; generating at least one matching feature point of the material image to be matched and the material image based on the feature vector of the material image to be matched and the feature vector of the material image; dividing the generated number of the matching characteristic points by the minimum value of the number of the characteristic points to obtain a result which is used as the similarity of the material image to be matched and the material image;
and determining the material images to be matched with the similarity greater than the similarity threshold value and the number of the matched characteristic points greater than the matched characteristic point threshold value in the material image set to be matched after the removing operation as target material images to be matched.
7. The method of claim 1, wherein the method further comprises:
and in response to detecting that the material image contains a preset non-material object, cutting the preset non-material object from the material image.
8. The method of claim 1, wherein the method further comprises:
and acquiring and pushing the attribute information of the target material image to be matched.
9. An apparatus for processing an image, comprising:
a receiving unit configured to receive material information including a material image and information for presentation of the material image;
the acquisition unit is configured to acquire a material image set to be matched corresponding to the presentation information;
a removing unit configured to remove the material image to be matched, of which the association degree with the material image in the material image set to be matched is lower than a preset association degree threshold value;
and the determining unit is configured to determine a target material image to be matched with the material image from the material image set to be matched after the removing operation.
10. A server, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-8.
11. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-8.
CN201811536270.9A 2018-12-14 2018-12-14 Method and apparatus for processing image Pending CN111325777A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811536270.9A CN111325777A (en) 2018-12-14 2018-12-14 Method and apparatus for processing image

Publications (1)

Publication Number Publication Date
CN111325777A true CN111325777A (en) 2020-06-23

Family

ID=71172606

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811536270.9A Pending CN111325777A (en) 2018-12-14 2018-12-14 Method and apparatus for processing image

Country Status (1)

Country Link
CN (1) CN111325777A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6526156B1 (en) * 1997-01-10 2003-02-25 Xerox Corporation Apparatus and method for identifying and tracking objects with view-based representations
CN102012939A (en) * 2010-12-13 2011-04-13 中国人民解放军国防科学技术大学 Method for automatically tagging animation scenes for matching through comprehensively utilizing overall color feature and local invariant features
CN103292804A (en) * 2013-05-27 2013-09-11 浙江大学 Monocular natural vision landmark assisted mobile robot positioning method
CN105550381A (en) * 2016-03-17 2016-05-04 北京工业大学 Efficient image retrieval method based on improved SIFT (scale invariant feature transform) feature
CN105608230A (en) * 2016-02-03 2016-05-25 南京云创大数据科技股份有限公司 Image retrieval based business information recommendation system and image retrieval based business information recommendation method
CN106933816A (en) * 2015-12-29 2017-07-07 北京大唐高鸿数据网络技术有限公司 Across camera lens object retrieval system and method based on global characteristics and local feature


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
WU HAO: "Research on Retrieval-Based Multi-Choice Image Inpainting", China Doctoral Dissertations Full-text Database, Information Science and Technology *
ZHU QIGUANG et al.: "Research and Application of an Image Hierarchical Matching Algorithm Based on Global and Local Features", China Mechanical Engineering *
XIAO SONG et al.: "Principles and Applications of Computer Graphics", 30 June 2014, Xidian University Press *

Similar Documents

Publication Publication Date Title
CN107590255B (en) Information pushing method and device
CN108229419B (en) Method and apparatus for clustering images
CN108734185B (en) Image verification method and device
CN109308681B (en) Image processing method and device
CN106303599B (en) Information processing method, system and server
US9977995B2 (en) Image clustering method, image clustering system, and image clustering server
CN107220652B (en) Method and device for processing pictures
CN109086834B (en) Character recognition method, character recognition device, electronic equipment and storage medium
US20190303499A1 (en) Systems and methods for determining video content relevance
CN109711508B (en) Image processing method and device
CN109583389B (en) Drawing recognition method and device
CN110942061A (en) Character recognition method, device, equipment and computer readable medium
CN108182457B (en) Method and apparatus for generating information
CN110807472B (en) Image recognition method and device, electronic equipment and storage medium
CN110941978B (en) Face clustering method and device for unidentified personnel and storage medium
CN112766284A (en) Image recognition method and device, storage medium and electronic equipment
CN114390368B (en) Live video data processing method and device, equipment and readable medium
CN109919220B (en) Method and apparatus for generating feature vectors of video
CN112100430B (en) Article tracing method and device
CN111598128B (en) Control state identification and control method, device, equipment and medium of user interface
CN108446737B (en) Method and device for identifying objects
CN110765304A (en) Image processing method, image processing device, electronic equipment and computer readable medium
CN112487943B (en) Key frame de-duplication method and device and electronic equipment
CN113742485A (en) Method and device for processing text
CN111325777A (en) Method and apparatus for processing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200623