CN116503357A - Image processing method and device - Google Patents


Info

Publication number
CN116503357A
Authority
CN
China
Prior art keywords
image
application
definition
key
project
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310476481.2A
Other languages
Chinese (zh)
Inventor
路彭悦
马哲
王洪彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202310476481.2A
Publication of CN116503357A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G06V10/765 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects using rules for classification or partitioning the feature space
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection

Abstract

The embodiments of this specification provide an image processing method and apparatus. The image processing method comprises the following steps: after acquiring an application image for a project application and its content classification result, both sent by an upstream node of the project, performing key image block detection on the application image based on the project; extracting image features from the detected key image blocks and calculating the image definition of the application image according to the image features; if the content classification result is empty, determining the image content classification of the key image blocks; and, where the image content classification is the project adaptation classification, determining an image quality parameter of the application image based on the image definition and sending the application image to a downstream node of the project so that project application processing can be performed based on the image quality parameter.

Description

Image processing method and device
Technical Field
The present document relates to the field of data processing technologies, and in particular, to an image processing method and apparatus.
Background
With the rapid development of internet technology, project applications for various projects have gradually moved from offline to online, so users no longer need to visit offline project institutions to apply. During online project application, image quality needs to be evaluated. Image quality evaluation is divided into subjective and objective evaluation: subjective evaluation judges image quality according to the subjective impression of a viewer, while objective evaluation calculates a quantized value of image quality, for example by introducing a mathematical model. In this process, how to better implement image processing has become a key concern for all parties.
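The objective evaluation just mentioned can be made concrete with a classical quantized model: the variance of the Laplacian response over a grayscale image, which tends to be larger for sharper images. The sketch below is a minimal pure-Python illustration; the function name and the use of a 4-neighbour Laplacian are assumptions for illustration, not a model prescribed by this document.

```python
def laplacian_variance(gray):
    """Quantized sharpness of a 2-D list of grayscale values (needs >= 3x3).

    A sharper image has stronger local intensity changes, so the variance
    of its Laplacian response is higher.
    """
    h, w = len(gray), len(gray[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # 4-neighbour discrete Laplacian at (x, y)
            lap = (gray[y - 1][x] + gray[y + 1][x]
                   + gray[y][x - 1] + gray[y][x + 1]
                   - 4 * gray[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)
```

A perfectly flat image yields a variance of zero, while an image containing an edge yields a positive value, which is the intuition behind using such a model as a quantized quality score.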
Disclosure of Invention
One or more embodiments of the present specification provide an image processing method including: and acquiring an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image. And detecting the key image block of the application image based on the item to obtain a key image block. And extracting image features from the key image blocks, and calculating the image definition of the application image based on the image features. And if the content classification result is empty, determining the image content classification of the key image block based on the identification result obtained by carrying out image content identification on the key image block. And determining an image quality parameter of the application image based on the image definition and sending the image quality parameter to a downstream node of the project to perform project application processing of the project based on the image quality parameter under the condition that the image content is classified into a project adaptation classification.
One or more embodiments of the present specification provide an image processing apparatus including: the application image acquisition module is configured to acquire an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image. And the image block detection module is configured to detect the key image block of the application image based on the item to obtain the key image block. And the definition calculating module is configured to extract image features from the key image blocks and calculate the image definition of the application image based on the image features. And if the content classification result is empty, running a content classification determining module, wherein the content classification determining module is configured to determine the image content classification of the key image block based on the identification result obtained by carrying out image content identification on the key image block. And the quality parameter determining module is configured to determine an image quality parameter of the application image based on the image definition and send the image quality parameter to a downstream node of the project to perform project application processing of the project based on the image quality parameter in the case that the image content is classified into a project adaptation class.
One or more embodiments of the present specification provide an image processing apparatus including: a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to: and acquiring an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image. And detecting the key image block of the application image based on the item to obtain a key image block. And extracting image features from the key image blocks, and calculating the image definition of the application image based on the image features. And if the content classification result is empty, determining the image content classification of the key image block based on the identification result obtained by carrying out image content identification on the key image block. And determining an image quality parameter of the application image based on the image definition and sending the image quality parameter to a downstream node of the project to perform project application processing of the project based on the image quality parameter under the condition that the image content is classified into a project adaptation classification.
One or more embodiments of the present specification provide a storage medium storing computer-executable instructions that, when executed by a processor, implement the following: and acquiring an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image. And detecting the key image block of the application image based on the item to obtain a key image block. And extracting image features from the key image blocks, and calculating the image definition of the application image based on the image features. And if the content classification result is empty, determining the image content classification of the key image block based on the identification result obtained by carrying out image content identification on the key image block. And determining an image quality parameter of the application image based on the image definition and sending the image quality parameter to a downstream node of the project to perform project application processing of the project based on the image quality parameter under the condition that the image content is classified into a project adaptation classification.
Drawings
For a clearer description of one or more embodiments of the present specification or of prior-art solutions, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. It is obvious that the drawings described below are only some of the embodiments of the present specification, and that a person skilled in the art could obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an environment in which an image processing method according to one or more embodiments of the present disclosure is implemented;
FIG. 2 is a process flow diagram of an image processing method according to one or more embodiments of the present disclosure;
FIG. 3 is a process flow diagram of an image processing method for a secured project scenario provided by one or more embodiments of the present disclosure;
FIG. 4 is a schematic diagram of an embodiment of an image processing apparatus according to one or more embodiments of the present disclosure;
fig. 5 is a schematic structural diagram of an image processing apparatus according to one or more embodiments of the present disclosure.
Detailed Description
To enable a person skilled in the art to better understand the technical solutions in one or more embodiments of the present specification, those technical solutions will be described clearly and completely below with reference to the drawings in one or more embodiments of the present specification. Obviously, the described embodiments are only some, not all, of the embodiments of the present specification. All other embodiments that can be obtained from one or more embodiments of the present specification without inventive effort shall fall within the scope of protection of the present disclosure.
Referring to fig. 1, one or more embodiments of the present disclosure provide a schematic diagram of an implementation environment of an image processing method.
The image processing method provided in one or more embodiments of the present disclosure may be applicable to an implementation environment of a project application, where the implementation environment includes at least the image processing system 101. In addition, the implementation environment may also include an upstream node 102 and/or a downstream node 103.
The image processing system 101 may correspond to one server, or corresponds to a server cluster formed by a plurality of servers, or corresponds to one or more cloud servers in the cloud computing platform, and is configured to perform image processing in a project application process.
The upstream node 102 may correspond to one server, or to a server cluster formed by a plurality of servers, or to one or more cloud servers in the cloud computing platform; the downstream node 103 may also correspond to a server, or to a server cluster formed by several servers, or to one or more cloud servers in a cloud computing platform.
In this implementation environment, after acquiring the application image for the project application and its content classification result, both sent by the upstream node 102 of the project, the image processing system 101 may detect key image blocks in the application image based on the project and calculate the image definition of the application image from the image features extracted from the key image blocks. If the content classification result is empty, the image content classification of the key image blocks is determined; and if the image content classification is the project adaptation classification, the image quality parameter of the application image is determined according to the image definition and sent to the downstream node 103 of the project, so that the downstream node 103 performs project application processing based on the image quality parameter. This improves the image quality of the application image and makes project application processing at the downstream node more convenient and effective.
One or more embodiments of an image processing method provided in the present specification are as follows:
referring to fig. 2, the image processing method provided in the present embodiment specifically includes steps S202 to S210.
Step S202, acquiring an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image.
The projects described in this embodiment include any project that requires image quality evaluation, such as a guarantee project (a health, asset or vehicle guarantee project), an examination project, a driving test project, or a medical insurance certificate application project. The upstream node is an upstream processing node of the project, for example an upstream processing node in the handling flow of a guarantee project. A project application means applying for a service within the project, such as applying for a guarantee in a guarantee project; a specific guarantee application may be a claim. This embodiment may be applied to an image processing system, with the execution order of the project's nodes being: upstream node, image processing system, downstream node.
The application image comprises an image collected for the project application and uploaded by the user, such as an image used for claim settlement, for example an image of claim materials photographed by the user during claim settlement in a guarantee project. The content classification result is the result of classifying the application image according to its content; it may or may not be empty.
In practical application, to improve the convenience of image processing, the application image may first be roughly classified by the upstream node of the project; in that case, the application image for the project application and its content classification result, both sent by the upstream node, are acquired. Alternatively, the upstream node may not roughly classify the application image, in which case step S202 may be replaced by acquiring only the application image of the project application sent by the upstream node, forming a new implementation together with the other processing steps provided in this embodiment.
Step S204, performing key image block detection on the application image based on the project to obtain key image blocks.
In this step, in order to improve the accuracy of image quality evaluation of the application image and avoid interference from project-irrelevant information in the application image, key image block detection is performed on the application image based on the project to obtain key image blocks.
The key image blocks in this embodiment are the image blocks in the application image that are related to the project. Key image block detection means detecting the image blocks related to the project in the application image, for example detecting the image blocks related to the guarantee application (claim), i.e. performing background-removal processing on the application image based on the project.
In the specific implementation, in order to prevent project-irrelevant information in the application image from interfering with image quality evaluation, key image blocks may first be detected in the application image and then processed, improving both processing efficiency and processing accuracy. In an optional implementation provided in this embodiment, the following operations are performed:
Performing key image block detection on the application image based on the item to obtain boundary information of a plurality of image blocks in the application image and corresponding confidence;
and screening candidate boundary information from a plurality of boundary information according to the confidence, and extracting the key image block from the application image based on the candidate boundary information.
Wherein the boundary information of the plurality of image blocks comprises the boundary frames of the plurality of image blocks or coordinate information of the boundary frames of the plurality of image blocks; the confidence degree refers to an index representing the credibility of boundary information of a plurality of image blocks; the plurality of boundary information refers to boundary information of the plurality of image blocks.
Specifically, key image block detection is performed on the application image based on the project to obtain the coordinate information of the bounding boxes of a plurality of image blocks in the application image, and the confidence of each piece of coordinate information is calculated. The coordinate information is then sorted in descending order of confidence, candidate coordinate information ranked before a target rank is screened from the sorting result, and key image blocks are extracted from the application image according to the candidate coordinate information.
In the process of extracting the key image block from the application image based on the candidate boundary information, the following operations are performed in an optional implementation manner provided in this embodiment:
determining target boundary information with confidence sequencing rank at a preset rank in the candidate boundary information;
performing overlapping degree calculation on the image block corresponding to the target boundary information and the image block corresponding to the residual boundary information to obtain overlapping degree;
and determining intermediate boundary information in the residual boundary information based on the overlapping degree, and extracting the key image block from the application image according to the intermediate boundary information and the target boundary information.
The confidence ranking order comprises ranking orders obtained by ranking according to the confidence. The remaining boundary information refers to boundary information other than the target boundary information among the candidate boundary information. The overlapping degree refers to the overlapping degree of the image block corresponding to the target boundary information and the image block corresponding to the residual boundary information.
Specifically, the target boundary information ranked first by confidence is determined among the candidate boundary information; the intersection area and union area of the image block corresponding to the target boundary information and each image block corresponding to the remaining boundary information are calculated, and the ratio of intersection area to union area is taken. Intermediate boundary information is then determined among the remaining boundary information according to this ratio, and key image blocks are extracted from the application image according to the intermediate boundary information and the target boundary information.
In determining the intermediate boundary information among the remaining boundary information according to the ratio, the following operations may be performed: if the ratio between first boundary information in the remaining boundary information and the target boundary information exceeds a ratio threshold, the first boundary information is deleted from the remaining boundary information, obtaining target remaining boundary information. The boundary information ranked first by confidence in the target remaining boundary information is then determined, and its overlap with each other piece of boundary information in the target remaining boundary information is calculated; any other boundary information whose overlap exceeds the overlap threshold is deleted from the target remaining boundary information. The operation of determining the boundary information ranked first by confidence is then executed again, and this is repeated until no calculated overlap exceeds the overlap threshold; the boundary information ranked first in each screening round is taken as the intermediate boundary information.
For example, key image block detection is performed on an application image based on a guarantee project, yielding the bounding boxes and confidences of 6 image blocks. The top 4 bounding boxes by confidence, A, B, C and D, are screened from the 6. Among these 4 boxes, the box with the highest confidence rank, A, is determined as the target bounding box, and the overlap between A and each of B, C and D is calculated. If the overlap between B and A exceeds the overlap threshold, B is deleted from the remaining 3 boxes. The box with the highest confidence rank among C and D, namely C, is then determined, and the overlap between D and C is calculated; if it exceeds the overlap threshold, D is deleted. C is taken as an intermediate bounding box, and key image blocks are extracted from the application image according to bounding boxes A and C.
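The confidence-ranked screening and overlap-based suppression described above is essentially non-maximum suppression (NMS) with an intersection-over-union (IoU) overlap measure. The following minimal Python sketch illustrates it, assuming boxes are (x1, y1, x2, y2) tuples; the function names, the top-4 screening and the 0.5 overlap threshold are illustrative defaults, not values fixed by this document.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def screen_key_blocks(boxes, scores, top_k=4, overlap_threshold=0.5):
    """Keep the top_k boxes by confidence, then greedily suppress any box
    whose overlap with a higher-confidence kept box exceeds the threshold."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)[:top_k]
    kept = []
    while order:
        best = order.pop(0)            # highest-confidence remaining box
        kept.append(best)
        order = [i for i in order
                 if iou(boxes[best], boxes[i]) <= overlap_threshold]
    return [boxes[i] for i in kept]
```

Run on a configuration like the A/B/C/D example above (two near-duplicate pairs), the function keeps one box per pair, mirroring how A and C survive the screening.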
In addition, in the process of performing key image block detection on the application image based on the project, a target detection model may be introduced: the application image is input into the target detection model for project-based key image block detection, obtaining the key image blocks in the application image. The target detection model may employ a CNN (Convolutional Neural Network).
In addition, in practical application, the key image block detection performed on the application image based on the project may detect nothing; in that case, the application image itself may be used as the key image block, improving the success rate of image processing. In an optional implementation provided in this embodiment, the following operations are performed:
Performing key image block detection on the application image based on the item;
if the detection result is empty, taking the application image as the key image block; after that, the following step S206 is performed;
if the detection result is not null, the following step S206 is performed.
It should be added that, after performing key image block detection on the application image based on the project and obtaining the key image blocks, an image parameter update may also be applied to the key image blocks, for example cropping and/or scaling them, so that the updated key image blocks are used for the subsequent image feature extraction.
And step S206, extracting image features from the key image blocks, and calculating the image definition of the application image based on the image features.
In this step, image features are extracted from the key image blocks, and the image definition of the application image is calculated based on the image features.
The image features of the embodiment comprise image feature vectors extracted from key image blocks; the image definition refers to an index for representing the definition degree of an application image.
In a specific execution process, in order to improve the efficiency and accuracy of feature extraction, a feature extraction model may be introduced: the key image blocks are input into the feature extraction model for feature extraction, obtaining the image features. The feature extraction model may adopt a Backbone network; in particular, a split-attention network (Split-Attention Networks) based on a multi-path structure and a feature-map attention mechanism may be adopted as the backbone.
In particular, in order to improve the efficiency and accuracy of calculating the image definition, a definition calculation model may be introduced. In the first alternative implementation provided in this embodiment, the following operations are performed in the process of calculating the image definition of the application image based on the image features:
And inputting the image characteristics into a definition calculation model to calculate the definition, and obtaining the image definition.
In a specific implementation process, the definition computing model may be obtained by training in advance, and in an optional implementation manner provided in this embodiment, the definition computing model is trained in the following manner:
inputting a first image feature in an image feature sample pair into a first definition network in a model to be trained to perform definition calculation, and inputting a second image feature into a second definition network in the model to be trained to perform definition calculation;
and calculating a loss value according to a sample sorting result and a sorting result obtained by sorting the calculated first definition and second definition, and carrying out parameter adjustment on the model to be trained according to the loss value.
Optionally, the image feature sample pair is obtained after performing key image block detection on the image sample pair and performing image feature extraction on the key image block pair obtained by detection; optionally, the image quality level of the first image is different from the image quality level of the second image in the image sample pair.
The implementation process of performing the key image block detection on the first image and the second image in the image sample pair is similar to the implementation process of performing the key image block detection on the application image, and the embodiment is not repeated herein.
The image quality level of the first image in the image sample pair may be higher or lower than that of the second image; that is, the image definitions of the first and second images differ, and the definition of the first image may be either greater or less than that of the second image. The image sample pair thus improves the convenience of model training.
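The pairwise training scheme above, with twin definition networks scoring a higher-quality and a lower-quality sample, is naturally trained with a ranking objective over the two scores. The sketch below shows one plausible form, a margin ranking loss with a shared linear scoring head; the shared head, the function names and the margin value are illustrative assumptions, not details given in this document.

```python
def definition_score(features, weights):
    # Shared-weight scoring head: both branches of the sample pair use the
    # same weights, mirroring the twin "definition networks".
    return sum(f * w for f, w in zip(features, weights))

def margin_ranking_loss(score_first, score_second, label, margin=1.0):
    # label = +1 if the first sample should rank higher, -1 otherwise
    # (the "sample sorting result"). The loss is zero once the pair is
    # ordered correctly by at least `margin`, and grows linearly otherwise.
    return max(0.0, -label * (score_first - score_second) + margin)
```

During training, this loss value would be back-propagated to adjust the shared parameters so that the higher-quality image of each pair receives the higher definition score.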
In addition, in order to reduce the number of times the user must upload application images, super-resolution processing may be performed on a target key image block whose definition is below a preset definition threshold. In the second alternative implementation provided in this embodiment, the following operations are performed in the process of calculating the image definition of the application image based on the image features:
calculating the definition of each key image block based on the image features extracted from it;
if the definition of a target key image block is smaller than a preset definition threshold, performing super-resolution processing on the target key image block;
and calculating the image definition of the application image according to the definition of the remaining key image blocks and the definition of the super-resolution-processed target key image block.
The remaining key image blocks are the key image blocks other than the target key image block.
Specifically, in calculating the image definition according to the definition of the super-resolution-processed target key image block and the definition of the remaining key image blocks, the image definition of the application image may be obtained by a weighted calculation over the definition of the remaining key image blocks, the definition of the super-resolution-processed target key image block, and their respective weight values.
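A minimal sketch of the weighted calculation just described: per-block definition scores are combined into an image-level definition via their weight values. How the weights are assigned is not specified in the text, so the weights here are assumed importance values for illustration.

```python
def aggregate_image_definition(block_definitions, block_weights):
    # Weighted average of per-block definition scores: blocks with larger
    # weight values contribute more to the overall image definition.
    total = sum(block_weights)
    return sum(d * w for d, w in zip(block_definitions, block_weights)) / total
```

With equal weights this reduces to a plain average, so the weighting only changes the result when some key image blocks are deemed more important than others.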
The implementation of calculating the image sharpness of the application image based on the image features may instead be carried out as follows: calculate the sharpness of each key image block based on the image features extracted from it; perform super-resolution processing on each key image block according to the super-resolution level corresponding to its sharpness; and calculate the image sharpness of the application image based on the sharpness of each super-resolution-processed key image block. Optionally, the super-resolution level corresponding to each key image block is negatively correlated with its sharpness: the lower the sharpness, the higher the super-resolution level; the higher the sharpness, the lower the super-resolution level.
Alternatively, the sharpness of each key image block may be calculated based on the image features extracted from it, the super-resolution level of each key image block determined according to both its sharpness and its correlation with the project, super-resolution processing performed on each key image block according to that level, and the image sharpness of the application image calculated based on the sharpness of each super-resolution-processed key image block. Here, correlation refers to the degree to which each key image block is related to the project.
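A minimal sketch of the negative correlation between sharpness and super-resolution level (the threshold values and level numbering are assumptions chosen for illustration):

```python
def super_resolution_level(sharpness, thresholds=(0.3, 0.6, 0.8)):
    """Map a block's sharpness in [0, 1] to a super-resolution level.
    The mapping is negatively correlated: lower sharpness -> higher level."""
    if sharpness < thresholds[0]:
        return 3  # strongest upscaling for the blurriest blocks
    if sharpness < thresholds[1]:
        return 2
    if sharpness < thresholds[2]:
        return 1
    return 0      # sharp enough; no super-resolution needed
```

A variant that also weighs the block's correlation with the project could, for example, raise the level for highly relevant blocks and skip super-resolution for irrelevant ones.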
In addition, in order to calculate the image sharpness of the application image in a targeted and flexible manner, a sharpness calculation model corresponding to the application image may be determined according to the content classification result of the application image, and the image features input into that model for sharpness calculation to obtain the image sharpness. In a third alternative implementation provided in this embodiment, the image sharpness of the application image is calculated based on the image features by performing the following operations:
determining a definition calculation model corresponding to the application image according to the content classification result;
And inputting the image characteristics into the definition calculation model to calculate the definition, so as to obtain the image definition.
Specifically, if the content classification of the application image in the content classification result is the document classification, the sharpness calculation model corresponding to the application image is determined to be a document sharpness calculation model; if the content classification is the list classification, the corresponding model is determined to be a list sharpness calculation model; and if the content classification result is empty, the corresponding model is determined to be the target sharpness calculation model. In each case, the image features are input into the selected model for sharpness calculation to obtain the image sharpness.
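As a hedged sketch, the model selection by content classification can be expressed as a simple lookup (the classification names and the stand-in models are assumptions, not the embodiment's actual trained models):

```python
def select_sharpness_model(content_classification, models):
    """Pick the sharpness calculation model matching the content classification;
    an empty or None classification result falls back to the target model."""
    if not content_classification:
        return models["target"]
    return models.get(content_classification, models["target"])

# Stand-ins for trained models: each maps image features to a sharpness score.
models = {
    "document": lambda features: 0.9,
    "list":     lambda features: 0.8,
    "target":   lambda features: 0.7,
}
```

Unknown classifications also fall back to the target model here, which is one possible design choice the embodiment leaves open.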
In addition, in calculating the image sharpness of the application image based on the image features, the image sharpness of each key image block may be calculated from the image features of each key image block.
In a specific implementation process, after the image features are extracted from the key image block and the image sharpness of the application image is calculated based on the image features, in an optional implementation provided in this embodiment, the following operations are further performed:
And if the content classification result is not null, determining an image quality parameter of the application image based on the image definition, and sending the image quality parameter to the downstream node to perform project application processing of the project based on the image quality parameter.
Specifically, if the content classification result sent by the upstream node is not null, the content classification representing the application image is classified into the item adaptation classification, the image quality parameter of the application image is determined based on the image definition, and the image quality parameter is sent to the downstream node of the item to perform the item application processing of the item based on the image quality parameter.
Step S208, if the content classification result is null, determining the image content classification of the key image block based on the identification result obtained by performing image content identification on the key image block.
In the foregoing step, image features are extracted from the key image block and the image sharpness of the application image is calculated based on those features. In this step, if the content classification result is empty, the image content classification of the key image block is determined based on the identification result obtained by performing image content identification on the key image block; if the content classification result is not empty, the image quality parameter of the application image is determined based on the image sharpness and sent to the downstream node to perform project application processing of the project based on the image quality parameter.
The recognition result in this embodiment includes a recognition result obtained by recognizing the image content in the key image block. The image content classification includes a content classification obtained by dividing the image content of the key image block, the image content classification including a voucher classification, a manifest classification and/or other classification.
In the implementation, if the content classification result is empty, the image content of the key image block can be identified to obtain an identification result, and the image content classification of the key image block is determined according to the identification result.
In a specific implementation process, in an optional implementation provided in this embodiment, after the image content classification of the key image block is determined based on the identification result obtained by performing image content identification on the key image block when the content classification result is null, the following operations are further performed:
determining an image quality parameter of the application image based on the image sharpness if the image content classification is not an adaptation classification for the item;
and marking the determined image quality parameters in a failure mode, and sending the marked image quality parameters to the downstream node.
The project adaptation classification refers to image content classification adapted to the project; the project adaptation categorization includes a credential categorization and/or a manifest categorization. The image quality parameter refers to a parameter that characterizes the image quality of the application image, such as an image quality score, an image quality level, and the like.
In addition, in the process of calculating the image definition of the application image based on the image features, the image definition of each key image block may be calculated according to the image features of each key image block, and if the content classification result is empty, the image content classification of each key image block may be determined according to the identification result obtained by performing image content identification on each key image block.
It should be added that step S202 may be replaced with obtaining the application image of the project application sent by the upstream node of the project, and step S208 may be replaced with determining the image content classification of the key image block based on the identification result obtained by performing image content identification on the key image block; these replacements form a new implementation together with the other processing steps provided in this embodiment.
Step S210, in the case that the image content is classified into an item adaptation classification, determining an image quality parameter of the application image based on the image definition, and transmitting to a downstream node of the item to perform item application processing of the item based on the image quality parameter.
In the foregoing step, if the content classification result is empty, the image content classification of the key image block is determined based on the identification result obtained by performing image content identification on the key image block. In this step, when the image content classification is the project adaptation classification, the image quality parameter of the application image is determined based on the image sharpness and sent to the downstream node of the project, so that the downstream node performs the project application processing of the project based on the image quality parameter.
The image content classification according to this embodiment includes a content classification obtained by dividing the image content of the key image block, where the image content classification includes a voucher classification, a manifest classification, and/or other classification. The project application processing refers to related application processing of the project, such as guarantee application processing of guaranteeing the project.
In a specific implementation, in order to improve convenience of performing project application processing by a downstream node, in an optional implementation provided in this embodiment, in a process of determining an image quality parameter of an application image based on image definition, the following operations are performed:
if the image definition is larger than a definition threshold, determining that the image quality parameter of the application image is clear;
If the image definition is smaller than or equal to the definition threshold, determining that the image quality parameter of the application image is fuzzy;
optionally, the sharpness threshold is obtained from the upstream node or is determined based on a model training result of a sharpness calculation model.
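The threshold comparison above can be sketched as follows (the default value 0.6 is a placeholder; as stated, the real threshold comes from the upstream node or from the training result of the sharpness calculation model):

```python
def image_quality_parameter(image_sharpness, sharpness_threshold=0.6):
    """Return 'clear' when sharpness strictly exceeds the threshold,
    otherwise 'blurry' (sharpness equal to the threshold counts as blurry)."""
    return "clear" if image_sharpness > sharpness_threshold else "blurry"
```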
In a specific implementation process, in an optional implementation manner provided in this embodiment, in a process of determining an image quality parameter of an application image based on image definition and sending the application image to a downstream node of an item, the following operations are performed:
determining an image quality parameter of the application image according to the image definition;
determining an execution action based on the image quality parameter, and judging whether the execution action is an application action or not;
if yes, the image quality parameters are sent to the downstream node.
In an optional implementation provided in this embodiment, if the result of judging whether the execution action is the application action is no, the following operations are executed:
and sending a retry instruction to the upstream node, and returning to execute the step S202.
Specifically, in the process of determining the execution action based on the image quality parameter, if the image quality parameter is clear, the execution action is determined to be the application action; if the image quality parameter is blurry, the execution action is determined to be a retry action.
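Combining the two branches, a minimal illustrative sketch (the parameter and action names are assumptions) is:

```python
def decide_action(image_quality_parameter):
    """Clear images proceed to the application action (quality parameter sent
    to the downstream node); blurry images trigger a retry instruction that is
    returned to the upstream node."""
    return "apply" if image_quality_parameter == "clear" else "retry"
```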
In particular, in an alternative implementation manner provided in this embodiment, during the process of performing the project application processing of the project based on the image quality parameter, the following operations are performed:
if the image quality parameter is a preset quality parameter, carrying out project verification under the project on the application image, and carrying out project application to a user after the verification is passed;
if the image quality parameter is not the preset quality parameter, a retry reminder is sent to the user, and the above step S202 is executed again.
Wherein the preset quality parameter may be clear; the project verification comprises verification of authenticity of the application image. The project application includes related applications for the project, such as a guarantee application (claim settlement) for the guarantee project.
In addition, on the basis of determining the image content classification of each key image block according to the identification result obtained by performing image content identification on each key image block, if the image content classification of a first key image block is the project adaptation classification, the image quality parameter of that key image block is determined based on its image sharpness and sent to the downstream node of the project. If the image quality parameter is the preset quality parameter, project verification under the project is performed on the application image and the project application is made for the user after the verification passes; if the image quality parameter is not the preset quality parameter, a retry reminder is sent to the user to obtain the application image corresponding to the key image block uploaded again by the user.
It should be noted that, the step S204 may be replaced by performing key image block detection on the application image based on the item to obtain a mask image containing the key image block; step S206 may be replaced with extracting image features from the mask image and calculating the image sharpness of the application image based on the image features; step S208 may be replaced with determining the image content classification of the mask image based on the recognition result obtained by performing the image content recognition on the mask image if the content classification result is null; step S210 may be replaced with determining an image quality parameter of the application image based on the image sharpness and transmitting to a downstream node of the item to perform item application processing of the item based on the image quality parameter in the case where the image content is classified as the item adaptation classification.
Performing key image block detection on the application image based on the project to obtain a mask image containing the key image block includes: performing key image block detection on the application image based on the project to obtain the key image block, and performing pixel processing, according to a preset pixel value, on the image blocks in the application image other than the key image block, thereby obtaining the mask image. For the content of the other replaced steps, reference may be made to the execution process of each step; this embodiment does not repeat them here.
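A hedged sketch of the mask-image construction (the box format and the preset pixel value 0 are assumptions; any preset pixel value could be substituted):

```python
import numpy as np

def build_mask_image(application_image, key_block_boxes, fill_value=0):
    """Keep the pixels inside the key image blocks and set every other pixel
    to a preset value; boxes are (top, left, bottom, right) with exclusive
    bottom/right bounds."""
    mask = np.full_like(application_image, fill_value)
    for top, left, bottom, right in key_block_boxes:
        mask[top:bottom, left:right] = application_image[top:bottom, left:right]
    return mask
```

Feature extraction and sharpness calculation would then operate on the mask image instead of on the separate key image blocks.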
The image processing method provided in this embodiment is further described below, taking its application to a guarantee project scenario as an example. Referring to fig. 3, the image processing method applied to the guarantee project scenario specifically includes the following steps.
Step S302, acquiring an application image for guaranteeing application sent by an upstream node of the guarantee item and a content classification result of the application image.
Step S304, key image block detection is carried out on the application image based on the guarantee item, and boundary information of a plurality of image blocks in the application image and corresponding confidence are obtained.
And step S306, screening candidate boundary information from the plurality of boundary information according to the confidence level, and extracting key image blocks from the application image based on the candidate boundary information.
Step S308, extracting image features from the key image blocks, and calculating the definition of the application image based on the image features.
In step S310, if the content classification result is null, the image content classification of the key image block is determined based on the identification result obtained by performing the image content identification on the key image block.
Step S312, in the case where the image content classification is the project adaptation classification, the image quality parameter of the application image is determined based on the image sharpness and sent to the downstream node of the guarantee project to perform the guarantee application processing of the guarantee project based on the image quality parameter.
It should be noted that, if the content classification result is not null, steps S310 to S312 may be replaced with determining the image quality parameter of the application image based on the image sharpness and sending it to the downstream node to perform the guarantee application processing of the guarantee project, forming a new implementation together with the other processing steps provided in this embodiment;
in addition, step S312 may be replaced with determining an image quality parameter of the application image based on the image sharpness in the case where the image content classification is not the item adaptation classification; and carrying out failure adaptation marking on the determined image quality parameters, sending the marked image quality parameters to a downstream node, and forming a new implementation mode with other processing steps provided by the embodiment.
An embodiment of an image processing apparatus provided in the present specification is as follows:
corresponding to the image processing method provided in the above embodiments, an image processing apparatus is also provided, and is described below with reference to the accompanying drawings.
Referring to fig. 4, a schematic diagram of an embodiment of an image processing apparatus provided in this embodiment is shown.
Since the apparatus embodiments correspond to the method embodiments, the description is relatively simple, and the relevant portions should be referred to the corresponding descriptions of the method embodiments provided above. The device embodiments described below are merely illustrative.
The present embodiment provides an image processing apparatus including:
an application image obtaining module 402, configured to obtain an application image for applying for an item sent by an upstream node of the item, and a content classification result of the application image;
an image block detection module 404 configured to perform key image block detection on the application image based on the item, to obtain a key image block;
a sharpness calculation module 406 configured to extract image features from the key image block and calculate image sharpness of the application image based on the image features;
if the content classification result is empty, running a content classification determining module 408, wherein the content classification determining module 408 is configured to determine the image content classification of the key image block based on the identification result obtained by performing image content identification on the key image block;
a quality parameter determining module 410 configured to determine an image quality parameter of the application image based on the image sharpness and send to a downstream node of the item to perform item application processing of the item based on the image quality parameter, if the image content is classified as an item adaptation class.
An embodiment of an image processing apparatus provided in the present specification is as follows:
in correspondence to the above-described image processing method, one or more embodiments of the present disclosure further provide an image processing apparatus for performing the above-provided image processing method, based on the same technical concept, and fig. 5 is a schematic structural diagram of the image processing apparatus provided by the one or more embodiments of the present disclosure.
An image processing apparatus provided in this embodiment includes:
as shown in fig. 5, the image processing apparatus may have a relatively large difference due to different configurations or performances, and may include one or more processors 501 and a memory 502, where one or more storage applications or data may be stored in the memory 502. Wherein the memory 502 may be transient storage or persistent storage. The application programs stored in memory 502 may include one or more modules (not shown), each of which may include a series of computer executable instructions in the image processing apparatus. Still further, the processor 501 may be configured to communicate with the memory 502 and execute a series of computer executable instructions in the memory 502 on the image processing device. The image processing device may also include one or more power supplies 503, one or more wired or wireless network interfaces 504, one or more input/output interfaces 505, one or more keyboards 506, and the like.
In a particular embodiment, an image processing apparatus includes a memory, and one or more programs, wherein the one or more programs are stored in the memory, and the one or more programs may include one or more modules, and each module may include a series of computer-executable instructions for the image processing apparatus, and configured to be executed by the one or more processors, the one or more programs comprising computer-executable instructions for:
acquiring an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image;
performing key image block detection on the application image based on the item to obtain a key image block;
extracting image features from the key image blocks, and calculating the image definition of the application image based on the image features;
if the content classification result is empty, determining the image content classification of the key image block based on an identification result obtained by carrying out image content identification on the key image block;
and determining an image quality parameter of the application image based on the image definition and sending the image quality parameter to a downstream node of the project to perform project application processing of the project based on the image quality parameter under the condition that the image content is classified into a project adaptation classification.
An embodiment of a storage medium provided in the present specification is as follows:
in correspondence with the above-described image processing method, one or more embodiments of the present specification further provide a storage medium based on the same technical idea.
The storage medium provided in this embodiment is configured to store computer executable instructions that, when executed by a processor, implement the following flow:
acquiring an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image;
performing key image block detection on the application image based on the item to obtain a key image block;
extracting image features from the key image blocks, and calculating the image definition of the application image based on the image features;
if the content classification result is empty, determining the image content classification of the key image block based on an identification result obtained by carrying out image content identification on the key image block;
and determining an image quality parameter of the application image based on the image definition and sending the image quality parameter to a downstream node of the project to perform project application processing of the project based on the image quality parameter under the condition that the image content is classified into a project adaptation classification.
It should be noted that, in the present specification, an embodiment of a storage medium and an embodiment of an image processing method in the present specification are based on the same inventive concept, so that a specific implementation of the embodiment may refer to an implementation of the foregoing corresponding method, and a repetition is omitted.
In this specification, each embodiment is described in a progressive manner, and the same or similar parts of each embodiment are referred to each other, and each embodiment focuses on the differences from other embodiments, for example, an apparatus embodiment, and a storage medium embodiment, which are all similar to a method embodiment, so that description is relatively simple, and relevant content in reading apparatus embodiments, and storage medium embodiments is referred to the part description of the method embodiment.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In the 1930s, an improvement to a technology could be clearly distinguished as an improvement in hardware (e.g., an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). With the development of technology, however, improvements to many of today's method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be implemented by a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD) (e.g., a field programmable gate array (Field Programmable Gate Array, FPGA)) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer "integrates" a digital system onto a PLD by programming, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually manufacturing integrated circuit chips, such programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the original code to be compiled must be written in a specific programming language called a hardware description language (Hardware Description Language, HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing the logic method flow can be readily obtained by merely slightly programming the method flow into an integrated circuit using several of the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller; examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing the controller purely as computer-readable program code, it is entirely possible to logically program the method steps so that the controller implements the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for performing various functions may also be regarded as structures within the hardware component. Or even the means for performing the various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each unit may be implemented in the same piece or pieces of software and/or hardware when implementing the embodiments of the present specification.
One skilled in the relevant art will recognize that one or more embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, one or more embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media does not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
One or more embodiments of the present specification may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. One or more embodiments of the specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
In this specification, the embodiments are described in a progressive manner; identical and similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the system embodiments are described relatively briefly since they are substantially similar to the method embodiments; for relevant parts, reference may be made to the corresponding description of the method embodiments.
The foregoing description is by way of example only and is not intended to limit this document. Various modifications and changes may occur to those skilled in the art. Any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of this document are intended to be included within the scope of the claims of this document.

Claims (17)

1. An image processing method, comprising:
acquiring an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image;
performing key image block detection on the application image based on the item to obtain a key image block;
extracting image features from the key image blocks, and calculating the image definition of the application image based on the image features;
if the content classification result is empty, determining the image content classification of the key image block based on an identification result obtained by carrying out image content identification on the key image block;
and in the case that the image content classification is a project adaptation classification, determining an image quality parameter of the application image based on the image definition and sending the image quality parameter to a downstream node of the project, to perform project application processing of the project based on the image quality parameter.
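The processing flow of claim 1 can be summarized as a short sketch. All function and parameter names below are illustrative placeholders assumed for this sketch, not names taken from the patent, and the 0.5 sharpness cutoff is likewise an assumption:

```python
# Hypothetical sketch of the claim-1 pipeline. The helper callables
# (detect, extract_features, sharpness_model, classify_content) stand in
# for the detection, feature-extraction, definition-calculation and
# content-identification steps described in the claim.

def process_application_image(image, content_classification, item,
                              detect, extract_features, sharpness_model,
                              classify_content, adaptation_classes):
    """Return (quality_parameter, classification) for one application image."""
    # Step 1: detect the key image blocks relevant to the item.
    blocks = detect(image, item)
    # Step 2: extract features and compute the image definition.
    features = [extract_features(b) for b in blocks]
    sharpness = sharpness_model(features)
    # Step 3: if no upstream classification arrived, classify the blocks.
    if not content_classification:
        content_classification = classify_content(blocks)
    # Step 4: only adaptation-classified images get a quality parameter.
    if content_classification in adaptation_classes:
        quality = "clear" if sharpness > 0.5 else "blurred"
        return quality, content_classification
    return None, content_classification
```

The conditional in step 3 mirrors the "if the content classification result is empty" branch of the claim; when an upstream classification already exists, it is used as-is.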
2. The image processing method according to claim 1, wherein the performing key image block detection on the application image based on the item to obtain a key image block comprises:
performing key image block detection on the application image based on the item to obtain boundary information of a plurality of image blocks in the application image and corresponding confidence;
and screening candidate boundary information from a plurality of boundary information according to the confidence, and extracting the key image block from the application image based on the candidate boundary information.
3. The image processing method according to claim 2, wherein the extracting the key image block from the application image based on the candidate boundary information comprises:
determining, in the candidate boundary information, target boundary information whose confidence ranking is at a preset rank;
performing overlapping degree calculation on the image block corresponding to the target boundary information and the image blocks corresponding to the remaining boundary information to obtain overlapping degrees;
and determining intermediate boundary information in the remaining boundary information based on the overlapping degrees, and extracting the key image block from the application image according to the intermediate boundary information and the target boundary information.
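The confidence screening of claim 2 followed by the overlap-based selection of claim 3 resembles standard non-maximum suppression (NMS). A minimal plain-Python sketch, in which the (x1, y1, x2, y2) box format and both thresholds are illustrative assumptions rather than values from the patent:

```python
# NMS-style selection over detected boundary boxes with confidences.

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter) if inter else 0.0

def select_key_boxes(boxes, scores, conf_thresh=0.5, iou_thresh=0.5):
    """Screen candidates by confidence, then suppress heavy overlaps."""
    # Claim 2: keep only candidates above the confidence threshold,
    # ordered from highest to lowest confidence.
    cand = sorted((p for p in zip(scores, boxes) if p[0] >= conf_thresh),
                  reverse=True)
    kept = []
    for score, box in cand:
        # Claim 3: a lower-ranked box survives only if it does not
        # overlap an already-kept box too much.
        if all(iou(box, k) <= iou_thresh for _, k in kept):
            kept.append((score, box))
    return [box for _, box in kept]
```

The highest-confidence box plays the role of the "target boundary information" at the preset rank, and the surviving boxes play the role of the "intermediate boundary information".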
4. The image processing method according to claim 1, wherein after the step of determining the image content classification of the key image block based on the identification result obtained by performing image content identification on the key image block is performed, the method further comprises:
determining an image quality parameter of the application image based on the image sharpness if the image content classification is not an adaptation classification for the item;
and marking the determined image quality parameter as failed, and sending the marked image quality parameter to the downstream node.
5. The image processing method according to claim 1, wherein after the step of extracting image features from the key image block and calculating the image definition of the application image based on the image features is performed, the method further comprises:
and if the content classification result is not empty, determining an image quality parameter of the application image based on the image definition, and sending the image quality parameter to the downstream node to perform project application processing of the project based on the image quality parameter.
6. The image processing method according to claim 1, wherein the determining an image quality parameter of the application image based on the image definition comprises:
if the image definition is greater than a definition threshold, determining that the image quality parameter of the application image is clear;
if the image definition is less than or equal to the definition threshold, determining that the image quality parameter of the application image is blurred;
wherein the definition threshold is obtained from the upstream node or is determined based on a model training result of a definition calculation model.
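The thresholding rule of claim 6 reduces to a one-line comparison. In this sketch the threshold value and the "clear"/"blurred" labels are illustrative assumptions:

```python
# Claim-6 mapping from a definition (sharpness) score to the binary
# image quality parameter. Strictly greater than the threshold counts
# as clear; equal or below counts as blurred, matching the claim.

def quality_parameter(image_definition, definition_threshold=0.6):
    return "clear" if image_definition > definition_threshold else "blurred"
```

Per the claim, the threshold may come from the upstream node or be derived from training of the definition calculation model; either way it arrives here as a plain number.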
7. The image processing method according to claim 1, wherein the calculating the image definition of the application image based on the image features comprises:
inputting the image characteristics into a definition calculation model for definition calculation to obtain the image definition;
wherein the definition calculation model is trained in the following manner:
inputting a first image feature in an image feature sample pair into a first definition network in a model to be trained to perform definition calculation, and inputting a second image feature into a second definition network in the model to be trained to perform definition calculation;
and calculating a loss value according to a sample ranking result and a ranking result obtained by ranking the calculated first definition and second definition, and adjusting parameters of the model to be trained according to the loss value.
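The two-network training of claim 7 is a pairwise ranking setup: a shared scorer evaluates both members of a feature pair, and the loss penalizes ranking the blurrier sample above the sharper one. A toy pure-Python sketch, in which the linear model, logistic ranking loss, learning rate, and data are all illustrative assumptions standing in for the patent's definition networks:

```python
import math
import random

def score(w, x):
    """Shared 'definition network': a dot product for this sketch."""
    return sum(wi * xi for wi, xi in zip(w, x))

def train_ranker(pairs, dim, lr=0.1, epochs=200, seed=0):
    """pairs: list of (sharper_features, blurrier_features) tuples."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.1, 0.1) for _ in range(dim)]
    for _ in range(epochs):
        for x_sharp, x_blur in pairs:
            # Logistic ranking loss: L = log(1 + exp(s_blur - s_sharp)).
            margin = score(w, x_blur) - score(w, x_sharp)
            g = 1.0 / (1.0 + math.exp(-margin))  # dL/dmargin
            # Gradient step pushes the sharp score above the blurry one.
            for i in range(dim):
                w[i] -= lr * g * (x_blur[i] - x_sharp[i])
    return w
```

After training, the single shared scorer ranks the sharper member of each pair above the blurrier one, which matches the claim's loss computed from sample ranking versus predicted ranking.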
8. The image processing method according to claim 7, wherein the image feature sample pair is obtained by performing key image block detection on an image sample pair and performing image feature extraction on the key image block pair obtained by the detection;
wherein the image quality level of the first image and the image quality level of the second image in the image sample pair are different.
9. The image processing method according to claim 1, wherein the performing project application processing of the project based on the image quality parameter comprises:
if the image quality parameter is a preset quality parameter, performing project verification on the application image under the project, and performing project application for a user after the verification passes;
and if the image quality parameter is not the preset quality parameter, sending a retry reminder to the user.
10. The image processing method according to claim 1, wherein the performing key image block detection on the application image based on the item to obtain a key image block comprises:
performing key image block detection on the application image based on the item;
and if the detection result is empty, taking the application image as the key image block.
11. The image processing method according to claim 1, wherein the calculating the image definition of the application image based on the image features comprises:
calculating sharpness of each key image block based on image features extracted from the key image blocks;
if the definition of a target key image block is less than a preset definition threshold, performing super-resolution processing on the target key image block;
and calculating the image definition of the application image according to the definition of the remaining key image blocks and the definition of the target key image block after the super-resolution processing.
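The aggregation of claim 11 can be sketched as follows. The per-block scoring function, the stand-in for super-resolution, the threshold, and the mean aggregation are all illustrative assumptions, not the patent's actual models:

```python
# Claim-11 sketch: blocks whose definition falls below a threshold are
# enhanced (a placeholder for super-resolution) and re-scored before
# the per-block definitions are combined into an image-level score.

def aggregate_definition(blocks, definition_fn, super_resolve,
                         block_threshold=0.5):
    scores = []
    for block in blocks:
        s = definition_fn(block)
        if s < block_threshold:
            # Low-definition target block: enhance, then re-score.
            s = definition_fn(super_resolve(block))
        scores.append(s)
    # Image definition as the mean of per-block scores (an assumption;
    # the patent does not specify the aggregation).
    return sum(scores) / len(scores)
```

With real inputs, `definition_fn` would be the definition calculation model applied to features of the block, and `super_resolve` the super-resolution network.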
12. The image processing method according to claim 1, wherein the calculating the image definition of the application image based on the image features comprises:
determining a definition calculation model corresponding to the application image according to the content classification result;
and inputting the image characteristics into the definition calculation model to calculate the definition, so as to obtain the image definition.
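The per-category model selection of claim 12 amounts to a dispatch on the content classification result. Category names, models, and the default fallback in this sketch are illustrative assumptions:

```python
# Claim-12 sketch: the content classification result selects which
# definition calculation model scores the extracted image features.

def definition_for(features, content_class, models, default_model):
    """Pick a definition model by content class, then score the features."""
    model = models.get(content_class, default_model)
    return model(features)
```

In practice `models` would map each supported content category (e.g. a document photo versus a scene photo) to a model trained for that category, so definition is judged by category-appropriate criteria.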
13. The image processing method according to claim 1, wherein the determining an image quality parameter of the application image based on the image definition and sending the image quality parameter to a downstream node of the project comprises:
determining an image quality parameter of the application image according to the image definition;
determining an execution action based on the image quality parameter, and determining whether the execution action is an application action;
and if yes, sending the image quality parameter to the downstream node.
14. The image processing method according to claim 13, wherein if the result of determining whether the execution action is an application action is no, the following operations are performed:
sending a retry instruction to the upstream node, and returning to the step of acquiring the application image for applying for the project sent by the upstream node of the project and the content classification result of the application image.
15. An image processing apparatus comprising:
the application image acquisition module is configured to acquire an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image;
the image block detection module is configured to detect key image blocks of the application image based on the item, and obtain the key image blocks;
a definition calculation module configured to extract image features from the key image block and calculate image definition of the application image based on the image features;
a content classification determining module, which is operated if the content classification result is empty, configured to determine the image content classification of the key image block based on the identification result obtained by performing image content identification on the key image block;
and the quality parameter determining module is configured to, in the case that the image content classification is a project adaptation classification, determine an image quality parameter of the application image based on the image definition and send the image quality parameter to a downstream node of the project to perform project application processing of the project based on the image quality parameter.
16. An image processing apparatus comprising:
a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to:
acquiring an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image;
performing key image block detection on the application image based on the item to obtain a key image block;
extracting image features from the key image blocks, and calculating the image definition of the application image based on the image features;
if the content classification result is empty, determining the image content classification of the key image block based on an identification result obtained by carrying out image content identification on the key image block;
and determining an image quality parameter of the application image based on the image definition and sending the image quality parameter to a downstream node of the project to perform project application processing of the project based on the image quality parameter under the condition that the image content is classified into a project adaptation classification.
17. A storage medium storing computer-executable instructions that when executed by a processor implement the following:
acquiring an application image for applying for the project, which is sent by an upstream node of the project, and a content classification result of the application image;
performing key image block detection on the application image based on the item to obtain a key image block;
extracting image features from the key image blocks, and calculating the image definition of the application image based on the image features;
if the content classification result is empty, determining the image content classification of the key image block based on an identification result obtained by carrying out image content identification on the key image block;
and in the case that the image content classification is a project adaptation classification, determining an image quality parameter of the application image based on the image definition and sending the image quality parameter to a downstream node of the project, to perform project application processing of the project based on the image quality parameter.
CN202310476481.2A 2023-04-28 2023-04-28 Image processing method and device Pending CN116503357A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310476481.2A CN116503357A (en) 2023-04-28 2023-04-28 Image processing method and device


Publications (1)

Publication Number Publication Date
CN116503357A true CN116503357A (en) 2023-07-28

Family

ID=87322589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310476481.2A Pending CN116503357A (en) 2023-04-28 2023-04-28 Image processing method and device

Country Status (1)

Country Link
CN (1) CN116503357A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116994007A (en) * 2023-09-26 2023-11-03 支付宝(杭州)信息技术有限公司 Commodity texture detection processing method and device
CN116994007B (en) * 2023-09-26 2024-03-19 支付宝(杭州)信息技术有限公司 Commodity texture detection processing method and device

Similar Documents

Publication Publication Date Title
CN112417093B (en) Model training method and device
CN109299276B (en) Method and device for converting text into word embedding and text classification
CN115712866B (en) Data processing method, device and equipment
CN116503357A (en) Image processing method and device
CN115840802A (en) Service processing method and device
CN110490058B (en) Training method, device and system of pedestrian detection model and computer readable medium
US20210044864A1 (en) Method and apparatus for identifying video content based on biometric features of characters
JP2016212879A (en) Information processing method and information processing apparatus
CN113222022A (en) Webpage classification identification method and device
CN116186330B (en) Video deduplication method and device based on multi-mode learning
CN115567371B (en) Abnormity detection method, device, equipment and readable storage medium
CN116664514A (en) Data processing method, device and equipment
CN114443916B (en) Supply and demand matching method and system for test data
CN115358777A (en) Advertisement putting processing method and device of virtual world
CN114969253A (en) Market subject and policy matching method and device, computing device and medium
CN111898626B (en) Model determination method and device and electronic equipment
CN114254588A (en) Data tag processing method and device
CN113807407A (en) Target detection model training method, model performance detection method and device
CN111539520A (en) Method and device for enhancing robustness of deep learning model
CN115860749B (en) Data processing method, device and equipment
CN116630029A (en) Risk identification model training method and device
CN117689006A (en) Federal migration learning method and device
CN117541963A (en) Method and device for extracting key video frames containing text risks
CN114996447A (en) Text hierarchy classification method and device based on center loss
CN116824339A (en) Image processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination