CN112836759B - Machine-selected picture evaluation method and device, storage medium and electronic equipment - Google Patents

Machine-selected picture evaluation method and device, storage medium and electronic equipment

Info

Publication number
CN112836759B
Authority
CN
China
Prior art keywords
pictures
picture
type
calibration
similarity
Prior art date
Legal status
Active
Application number
CN202110181683.5A
Other languages
Chinese (zh)
Other versions
CN112836759A (en)
Inventor
胡舒瀚
刘超
彭豪杨
夏伟
陈婉婉
Current Assignee
Chongqing Unisinsight Technology Co Ltd
Original Assignee
Chongqing Unisinsight Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Unisinsight Technology Co Ltd filed Critical Chongqing Unisinsight Technology Co Ltd
Priority to CN202110181683.5A priority Critical patent/CN112836759B/en
Publication of CN112836759A publication Critical patent/CN112836759A/en
Application granted granted Critical
Publication of CN112836759B publication Critical patent/CN112836759B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques

Abstract

The application provides a machine-selected picture evaluation method and device, a storage medium and electronic equipment. First, matching pictures are screened out of a machine selection set according to a calibration set. The calibration set comprises a first number of calibration pictures, which are generated by manually calibrating a target video; the machine selection set comprises a second number of machine-selected pictures, which are generated by performing a machine selection operation on the target video; and a matching picture is a machine-selected picture that matches any one calibration picture. The preference rate of the machine-selected pictures is then obtained automatically from the first number, the second number and the number of matching pictures, and the preference rate characterizes the accuracy of the machine selection operation. The accuracy of the machine selection operation can therefore be evaluated without manual judgment, which reduces labor cost, eliminates the influence of subjective manual judgment, unifies the evaluation standard, yields a quantitative analysis result and narrows the range of evaluation error.

Description

Machine-selected picture evaluation method and device, storage medium and electronic equipment
Technical Field
The present invention relates to the field of image processing, and in particular to a machine-selected picture evaluation method and device, a storage medium, and an electronic device.
Background
With the increasing popularity of security cameras, target-recognition security equipment based on video algorithms plays an ever more prominent role in daily life. Technologies such as enterprise security protection, personnel monitoring and vehicle control are now largely mature, and intelligent recognition has made society more efficient and life more convenient. The better the quality of the pictures selected by the algorithm, the greater the system's contribution to intelligent security; in practical applications, the most important use of the algorithm-selected pictures is for downstream services such as personnel tracking and motor-vehicle control.
At present, most evaluation of an algorithm's picture-selection effect depends on manual screening and judgment, and the industry currently lacks a unified evaluation standard. The manual approach has high labor cost, yields no quantitative analysis result and carries a large test error, which is out of step with the rapidly advancing pace of video recognition technology. An automated test method for evaluating algorithm-selected pictures of person, vehicle and other non-face targets, together with well-defined evaluation indices, is therefore urgently needed.
Disclosure of Invention
An object of the present invention is to provide a machine-selected picture evaluation method, device, storage medium and electronic device, so as to at least partially address the above problems.
In order to achieve the above purpose, the technical solution adopted in the embodiment of the present application is as follows:
in a first aspect, an embodiment of the present application provides a machine-selected picture evaluation method, where the method includes:
screening out matching pictures from the machine selection set according to the calibration set;
the calibration set comprises a first number of calibration pictures, the calibration pictures are pictures generated by calibrating a target video manually, the machine selection set comprises a second number of machine selection pictures, the machine selection pictures are pictures generated by performing machine selection operation on the target video, and the matched pictures are machine selection pictures matched with any one calibration picture;
and obtaining the preference rate of the mechanically selected pictures according to the first quantity, the second quantity and the quantity of the matched pictures, wherein the preference rate characterizes the accuracy of the mechanically selected operation.
In a second aspect, an embodiment of the present application provides a machine-selected picture evaluation device, including:
the matching unit is used for screening matching pictures from the machine selection set according to the calibration set;
the calibration set comprises a first number of calibration pictures, the calibration pictures are pictures generated by calibrating a target video manually, the machine selection set comprises a second number of machine selection pictures, the machine selection pictures are pictures generated by performing machine selection operation on the target video, and the matched pictures are machine selection pictures matched with any one calibration picture;
and the processing unit is used for acquiring the preference rate of the mechanically selected pictures according to the first quantity, the second quantity and the quantity of the matched pictures, wherein the preference rate characterizes the accuracy of the mechanically selected operation.
In a third aspect, embodiments of the present application provide a storage medium having stored thereon a computer program which, when executed by a processor, implements the method described above.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory for storing one or more programs; the above-described method is implemented when the one or more programs are executed by the processor.
Compared with the prior art, in the machine-selected picture evaluation method and device, storage medium and electronic equipment provided by the embodiments of the application, matching pictures are screened out of the machine selection set according to the calibration set. The calibration set comprises a first number of calibration pictures generated by manually calibrating a target video; the machine selection set comprises a second number of machine-selected pictures generated by performing a machine selection operation on the target video; and a matching picture is a machine-selected picture that matches any one calibration picture. The preference rate of the machine-selected pictures is obtained from the first number, the second number and the number of matching pictures, and characterizes the accuracy of the machine selection operation. Because the preference rate is obtained automatically, the accuracy of the machine selection operation can be evaluated without manual judgment, which reduces labor cost, eliminates the influence of subjective manual judgment, unifies the evaluation standard, yields a quantitative analysis result and narrows the range of evaluation error.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting in scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a flow chart of an evaluation method for machine-selected pictures according to an embodiment of the present application;
fig. 3 is a schematic diagram of sub-steps of S101 provided in an embodiment of the present application;
fig. 4 is a schematic diagram of sub-steps of S103 provided in the embodiment of the present application;
fig. 5 is a schematic unit diagram of a machine-selected picture evaluation device according to an embodiment of the present application.
In the figure: 10-a processor; 11-memory; 12-bus; 13-a communication interface; 201-a matching unit; 202-a processing unit.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the description of the present application, it should be noted that, the terms "upper," "lower," "inner," "outer," and the like indicate an orientation or a positional relationship based on the orientation or the positional relationship shown in the drawings, or an orientation or a positional relationship conventionally put in use of the product of the application, merely for convenience of description and simplification of the description, and do not indicate or imply that the apparatus or element to be referred to must have a specific orientation, be configured and operated in a specific orientation, and therefore should not be construed as limiting the present application.
In the description of the present application, it should also be noted that, unless explicitly specified and limited otherwise, the terms "disposed," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the terms in this application will be understood by those of ordinary skill in the art in a specific context.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The following embodiments and features of the embodiments may be combined with each other without conflict.
The embodiment of the application provides electronic equipment which can be server equipment or a computer terminal. Referring to fig. 1, a schematic structure of an electronic device is shown. The electronic device comprises a processor 10, a memory 11, a bus 12. The processor 10 and the memory 11 are connected by a bus 12, the processor 10 being adapted to execute executable modules, such as computer programs, stored in the memory 11.
The processor 10 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the machine-selected picture evaluation method may be accomplished by instructions in the form of integrated logic circuits of hardware or software in the processor 10. The processor 10 may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU for short), a network processor (Network Processor, NP for short), etc.; but also digital signal processors (Digital Signal Processor, DSP for short), application specific integrated circuits (Application Specific Integrated Circuit, ASIC for short), field-programmable gate arrays (Field-Programmable Gate Array, FPGA for short) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components.
The memory 11 may comprise a high-speed random access memory (RAM: random Access Memory) and may also comprise a non-volatile memory (non-volatile memory), such as at least one disk memory.
Bus 12 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. Only one double-headed arrow is shown in fig. 1, but this does not mean that there is only one bus 12 or only one type of bus 12.
The memory 11 is used for storing programs, for example, the program corresponding to the machine-selected picture evaluation device. The machine-selected picture evaluation device comprises at least one software function module which may be stored in the memory 11 in the form of software or firmware or solidified in the operating system (OS) of the electronic device. After receiving an execution instruction, the processor 10 executes the program to implement the machine-selected picture evaluation method.
Possibly, the electronic device provided in the embodiment of the present application further includes a communication interface 13. The communication interface 13 is connected to the processor 10 via a bus. The electronic device may receive video recordings transmitted by other terminals via the communication interface 13.
It should be understood that the structure shown in fig. 1 is a schematic structural diagram of only a portion of an electronic device, which may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
The method for evaluating the machine-selected picture provided in the embodiment of the present application may be applied to, but is not limited to, the electronic device shown in fig. 1, and the specific flow is shown in fig. 2:
s101, screening out matching pictures from the machine selection set according to the calibration set.
The calibration set comprises a first number of calibration pictures, the calibration pictures are pictures generated by calibrating a target video manually, the machine selection set comprises a second number of machine selection pictures, the machine selection pictures are pictures generated by performing machine selection operation on the target video, and the matching pictures are machine selection pictures matched with any one calibration picture.
Optionally, when a machine-selected picture satisfies the similarity conditions with respect to any one of the calibration pictures, it is determined to be a matching picture.
S102, obtaining the preference rate of the mechanically selected pictures according to the first number, the second number and the number of the matched pictures.
Wherein, the preference rate characterizes the accuracy of the mechanical selection operation.
Optionally, the preference rate of the machine-selected pictures is obtained automatically from the first number, the second number and the number of matching pictures, so that the accuracy of the machine selection operation can be evaluated. No manual judgment is needed during evaluation, which reduces labor cost, eliminates the influence of subjective manual judgment, unifies the evaluation standard, yields a quantitative analysis result and narrows the range of evaluation error.
In summary, the embodiment of the application provides a machine-selected picture evaluation method in which matching pictures are screened out of a machine selection set according to a calibration set. The calibration set comprises a first number of calibration pictures generated by manually calibrating a target video; the machine selection set comprises a second number of machine-selected pictures generated by performing a machine selection operation on the target video; and a matching picture is a machine-selected picture that matches any one calibration picture. The preference rate of the machine-selected pictures is then obtained from the first number, the second number and the number of matching pictures, and characterizes the accuracy of the machine selection operation. Because the preference rate is obtained automatically, the accuracy of the machine selection operation can be evaluated without manual judgment, which reduces labor cost, eliminates the influence of subjective manual judgment, unifies the evaluation standard, yields a quantitative analysis result and narrows the range of evaluation error.
Optionally, before the evaluation starts, the target video is manually pre-calibrated, and the best preferred picture of each video target is screened out as a calibration picture; all calibration pictures form the calibration set. For example, the calibration set SR comprises calibration pictures { S1, S2, … Sn }, where the edges of each calibration picture are close to the edges of the target (face). Further, each calibration picture may be uniquely named.
Optionally, the machine selection operation is performed on the target video to obtain machine-selected pictures, and all machine-selected pictures are combined into the machine selection set PA, which contains machine-selected pictures { P1, P2, … Pm }. Further, each machine-selected picture may be uniquely named.
Optionally, the feature value vector, RGB features and resolution information of each picture in SR and PA are extracted and stored as txt files, named after the corresponding picture, in a corresponding folder. The resolution information is recorded as W px horizontal pixels by H px vertical pixels; the data of a calibration picture is denoted H×W and that of a machine-selected picture h×w. Resolution is a simple and effective indicator of picture quality: the lower the resolution, the lower the quality, and matching pictures of the same video target should first of all be close in resolution. The RGB features use normalized histogram data of the RGB colors, stored as lists whose index ([0,255]) represents the corresponding color value; the histogram data are normalized so that each list value is the number of pixels with that color value divided by the total number of pixels in the picture, and the data finally lie in [0,1] for comparison and statistics. The normalized data of a calibration picture is denoted Hr and that of a machine-selected picture hr. The RGB features reflect the color distribution of the picture and can be used to judge the scene. The picture feature value vector is extracted with a target (face) recognition algorithm and the cosine of the angle between two vectors is calculated for judgment; the feature vector data of a calibration picture is denoted F and that of a machine-selected picture f.
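As a minimal illustrative sketch of this extraction step (assuming OpenCV and NumPy are available; this is not the patented implementation itself), the per-picture data described above could be gathered as follows. The feature_extractor callable stands in for the target (face) recognition model, which the application does not disclose.

import cv2
import numpy as np

def extract_picture_data(path, feature_extractor=None):
    """Extract resolution, normalized RGB histograms and a feature vector for one picture."""
    img = cv2.imread(path)                     # BGR image of shape (H, W, 3)
    h, w = img.shape[:2]                       # H vertical rows, W horizontal columns

    # One normalized 256-bin histogram per color channel: index [0, 255] is the color
    # value and each entry is (pixels with that value) / (total pixels), so every
    # histogram sums to 1, matching the description above.
    hists = {}
    for idx, name in zip((2, 1, 0), ("r", "g", "b")):   # OpenCV stores channels as BGR
        hist = cv2.calcHist([img], [idx], None, [256], [0, 256]).ravel()
        hists[name] = hist / hist.sum()

    # Placeholder for the feature value vector produced by a recognition model.
    feature = feature_extractor(img) if feature_extractor is not None else None
    return {"resolution": (h, w), "histograms": hists, "feature": feature}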
With respect to S101 on the basis of fig. 2, the embodiment of the present application further provides a possible implementation; referring to fig. 3, S101 includes:
s101-1, sequentially screening out corresponding first type pictures from the machine selection set according to the calibration pictures in the calibration set.
The first type of pictures are machine-selected pictures with the resolution similarity with the calibrated pictures being larger than a resolution threshold.
As described above, the resolution of a picture is a simple and effective indicator of picture quality; the lower the resolution, the lower the quality, and mutually matching pictures should first of all be close in resolution. When the resolution similarity is less than or equal to the resolution threshold, the resolution of the machine-selected picture differs greatly from that of the calibration picture, the matching condition is not met, and the picture can be excluded.
Optionally, the resolution similarity lies in [0,1]; the closer the value is to 1, the higher the degree of matching.
The resolution similarity is calculated as follows:
RES_simi = min(H px × W px, h px × w px) / max(H px × W px, h px × w px)
wherein RES_simi denotes the resolution similarity, H px × W px is the resolution information of the calibration picture, and h px × w px is the resolution information of the machine-selected picture.
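A short sketch of the resolution screen in S101-1, assuming the min/max pixel-count ratio reconstructed above (the original formula is only available as an image in the publication) and reusing the dictionaries from the extraction sketch:

def resolution_similarity(cal_res, sel_res):
    """Ratio of the smaller pixel count to the larger one; 1.0 means identical resolution."""
    cal_pixels = cal_res[0] * cal_res[1]   # H * W of the calibration picture
    sel_pixels = sel_res[0] * sel_res[1]   # h * w of the machine-selected picture
    return min(cal_pixels, sel_pixels) / max(cal_pixels, sel_pixels)

def first_type_pictures(calibration, selections, resolution_threshold):
    """Keep only machine-selected pictures whose resolution similarity exceeds the threshold."""
    return [p for p in selections
            if resolution_similarity(calibration["resolution"], p["resolution"]) > resolution_threshold]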
S101-2, screening out corresponding second-type pictures from the first-type pictures according to the calibration pictures.
The second type of pictures are first type of pictures with RGB feature similarity with the calibration pictures being larger than RGB threshold values.
Optionally, the RGB feature similarity is obtained by comparing the normalized RGB color histograms of the first-type picture with those of the corresponding calibration picture. The RGB feature similarity is a value in [-1,1]; the closer it is to 1, the higher the similarity.
Optionally, the RGB channels of a picture are converted into 3 normalized histograms; the abscissa is the integer color value in [0,255], the ordinate is the probability of that color value appearing in the picture, and the total area of each normalized histogram is 1.
A correlation comparison algorithm is used. For example, if the normalized red-space histograms of the calibration picture and the machine-selected picture are Hr1 and Hr2 respectively, the RGB similarity for the red channel is calculated as:
d(Hr1, Hr2) = Σ_I [(Hr1(I) − mean(Hr1)) · (Hr2(I) − mean(Hr2))] / sqrt( Σ_I (Hr1(I) − mean(Hr1))² · Σ_I (Hr2(I) − mean(Hr2))² )
wherein Hr1(I) is the probability value for red color value I in the histogram of the calibration picture, mean(Hr1) is the arithmetic mean of the red histogram of the calibration picture, Hr2(I) is the probability value for red color value I in the histogram of the machine-selected picture, mean(Hr2) is the arithmetic mean of the red histogram of the machine-selected picture, and d(Hr1, Hr2) is the RGB similarity of the machine-selected picture to the calibration picture for the red channel.
That is, the sum over all color values of the cross-deviations of the two histograms (their covariance) is divided by the product of their accumulated standard deviations.
The RGB similarities of the first-type picture to the green and blue space histograms of the calibration picture are obtained in the same way.
Optionally, the average of the RGB similarities of the first-type picture to the red, green and blue space histograms of the calibration picture is calculated and compared with the RGB threshold; if the average is greater than the RGB threshold, the first-type picture is taken as a second-type picture.
Optionally, the RGB similarities of the first-type picture to the red, green and blue space histograms of the calibration picture are instead each compared with the RGB threshold. In one alternative, if any one of them fails to exceed the RGB threshold, the RGB feature similarity between the first-type picture and the calibration picture is considered not to satisfy the condition of being greater than the RGB threshold; in another alternative, if any one of them exceeds the RGB threshold, the RGB feature similarity between the first-type picture and the calibration picture is considered to satisfy the condition.
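A sketch of the RGB comparison in S101-2 under the same assumptions as the extraction sketch (normalized 256-bin histograms per channel). The require_all flag covers the two alternatives just described; none of the names here come from the patent itself.

import numpy as np

def histogram_correlation(h1, h2):
    """Correlation between two normalized 256-bin histograms; the result lies in [-1, 1]."""
    d1, d2 = h1 - h1.mean(), h2 - h2.mean()
    return float((d1 * d2).sum() / np.sqrt((d1 ** 2).sum() * (d2 ** 2).sum()))

def rgb_feature_passes(cal_hists, sel_hists, rgb_threshold, require_all=True):
    """Compare the red, green and blue histogram correlations against the RGB threshold.

    require_all=True demands that every channel exceed the threshold; require_all=False
    accepts the picture as soon as any single channel exceeds it.
    """
    scores = [histogram_correlation(cal_hists[c], sel_hists[c]) for c in ("r", "g", "b")]
    passes = all if require_all else any
    return passes(s > rgb_threshold for s in scores)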
S101-3, screening out corresponding third-class pictures from the second-class pictures according to the calibration pictures.
The third type of pictures are second type of pictures with the similarity of the feature vectors with the calibrated pictures being larger than the feature vector threshold value.
Optionally, the feature vector similarity between each second-type picture and the corresponding calibration picture is calculated. The feature vector similarity is the cosine of the angle between the feature vectors of the second-type picture and the corresponding calibration picture; the result is a decimal in (0,1), and the closer the value is to 1, the higher the picture similarity.
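A minimal cosine-similarity sketch for S101-3; the feature vectors themselves are assumed to come from whatever recognition model is in use:

import numpy as np

def feature_similarity(f_cal, f_sel):
    """Cosine of the angle between two feature vectors; closer to 1 means more similar pictures."""
    f_cal = np.asarray(f_cal, dtype=float)
    f_sel = np.asarray(f_sel, dtype=float)
    return float(f_cal @ f_sel / (np.linalg.norm(f_cal) * np.linalg.norm(f_sel)))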
Optionally, each calibration picture may ultimately correspond to one or more third-type pictures, or to no third-type picture at all.
When there is only one third-type picture, that picture is taken as the matching picture of the calibration picture; when the number of third-type pictures is greater than or equal to 2, further screening is needed, please continue to refer to fig. 3.
S101-4, judging whether the similarity values of the feature vectors corresponding to the third type of pictures are the same. If yes, executing S101-6; if not, S101-5 is performed.
Optionally, the similarity between feature vectors represents the similarity between the pictures. When the maximum value of the feature vector similarity is shared by more than one third-type picture, further judgment is needed and S101-6 is executed; otherwise, when the maximum value of the feature vector similarity is unique, the third-type picture with that maximum value is taken as the matching picture, i.e. S101-5 is executed.
S101-5, taking the third type picture with the maximum value of the feature vector similarity as a matching picture.
S101-6, judging whether the values of the RGB feature similarity corresponding to the third type of pictures are the same. If yes, executing S101-8; if not, S101-7 is performed.
Optionally, when the maximum value of the RGB feature similarities is shared by more than one third-type picture, further judgment is needed and S101-8 is executed; otherwise, when the maximum value of the RGB feature similarities is unique, the third-type picture with that maximum value is taken as the matching picture, i.e. S101-7 is executed.
S101-7, taking the third type of picture with the maximum RGB feature similarity value as a matching picture.
S101-8, judging whether the values of the corresponding resolution similarities of the third type of pictures are the same. If yes, executing S101-10; if not, S101-9 is performed.
Optionally, when the maximum value of the resolution similarities is shared by more than one third-type picture, further judgment is needed and S101-10 is executed; otherwise, when the maximum value of the resolution similarities is unique, the third-type picture with that maximum value is taken as the matching picture, i.e. S101-9 is executed.
And S101-9, taking the third type picture with the maximum resolution similarity value as a matching picture.
S101-10, storing the third type of pictures in the undetermined set.
S101-11, determining matching pictures from the to-be-determined set according to a specified command input by a user.
Optionally, a corresponding threshold is set for each type of evaluation index (resolution similarity, RGB feature similarity and feature vector similarity). If all comparison values meet their thresholds, a subfolder "calibration chart" is created in the "machine-selected picture" folder, the calibration picture is stored in the "calibration chart" folder, and the machine-selected picture is stored in a "snapshot" folder. If a comparison value does not meet its threshold, that picture is skipped and the next picture is compared, until all pictures have been compared. If several pictures meet the thresholds, the picture with the highest comparison value is selected and stored in the folders according to the successful-matching procedure above. If the comparison values are identical, subfolders "calibration chart" and "snapshot" are created in a "repeated pictures" folder, the calibration picture is stored in the "calibration chart" folder and all candidate snapshots are stored in the "snapshot" folder, so that manual judgment can follow the automatic comparison.
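The tie-breaking cascade of S101-4 to S101-11 can be sketched as follows. Candidate dictionaries with feature_sim, rgb_sim and res_sim keys are assumed purely for illustration; a non-empty second return value corresponds to the pending set that is handed over to the user's command.

def pick_matching_picture(candidates):
    """Resolve ties among third-type candidates in the order described above:
    feature vector similarity, then RGB feature similarity, then resolution similarity."""
    remaining = list(candidates)
    for key in ("feature_sim", "rgb_sim", "res_sim"):
        best = max(c[key] for c in remaining)
        remaining = [c for c in remaining if c[key] == best]
        if len(remaining) == 1:
            return remaining[0], None      # unique winner becomes the matching picture
    return None, remaining                 # identical on all three indices: defer to the user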
On the basis of fig. 2, with respect to S102, the embodiment of the present application further provides a possible implementation; referring to fig. 4, S102 includes:
s102-1, determining the matching success rate according to the second number and the number of the matching pictures.
Optionally, match success rate = number of matching pictures (RYX) / second number. The machine selection set may contain duplicate and erroneous pictures, so the match success rate indicates the proportion of correct preferred pictures in the results produced by the machine selection operation.
S102-2, determining the detection rate according to the first quantity and the quantity of the matched pictures.
Optionally, detection rate = number of matching pictures (RYX) / first number, where first number = number of correctly detected targets + number of missed targets. The detection rate represents the proportion of targets actually appearing in the video that are successfully selected.
S102-3, determining the preference rate according to the matching success rate and the detection rate.
Optionally, preference rate = 2 × match success rate × detection rate / (match success rate + detection rate). The preference rate combines the detection rate and the match success rate; the closer it is to 1, the better the algorithm performs.
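A sketch of the three indices in S102; the preference rate is the harmonic mean of the match success rate and the detection rate (an F1-style score).

def evaluate_selection(first_number, second_number, matched_number):
    """Return (match success rate, detection rate, preference rate)."""
    match_success_rate = matched_number / second_number   # correct picks among all machine-selected pictures
    detection_rate = matched_number / first_number        # matched targets among all calibrated targets
    if match_success_rate + detection_rate == 0:
        return 0.0, 0.0, 0.0                               # no successful match at all
    preference_rate = (2 * match_success_rate * detection_rate
                       / (match_success_rate + detection_rate))
    return match_success_rate, detection_rate, preference_rate

For example, 90 matches against 100 calibration pictures and 120 machine-selected pictures give a match success rate of 0.75, a detection rate of 0.9 and a preference rate of about 0.818.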
Optionally, after all pictures have been compared, the successfully matched pictures in PA are counted and duplicates with high similarity are removed, giving the final number of successful preferred matches RYX. The original number of pictures in PA is TYX, so the match success rate of the algorithm snapshot pictures is RYX/TYX × 100%. The following table illustrates the processed data.
(Table of processed data: reproduced as an image in the original publication.)
Referring to fig. 5, fig. 5 is a schematic unit diagram of a machine-selected picture evaluation device according to an embodiment of the present application; optionally, the machine-selected picture evaluation device is applied to the electronic device described above.
The machine-selected picture evaluation device includes: a matching unit 201 and a processing unit 202.
A matching unit 201, configured to screen out matching pictures from the machine selection set according to the calibration set;
the calibration set comprises a first number of calibration pictures, the calibration pictures are pictures generated by calibrating a target video manually, the machine selection set comprises a second number of machine selection pictures, the machine selection pictures are pictures generated by performing machine selection operation on the target video, and the matching pictures are machine selection pictures matched with any one calibration picture. Alternatively, the matching unit 201 may perform S101 described above.
The processing unit 202 is configured to obtain a preference rate of the mechanically selected pictures according to the first number, the second number and the number of matching pictures, where the preference rate characterizes accuracy of the mechanically selected operation. Alternatively, the processing unit 202 may perform S102 described above.
Optionally, the matching unit 201 is further configured to screen out corresponding first type pictures from the machine selection set sequentially according to the calibration pictures in the calibration set, where the first type pictures are machine selection pictures with a resolution similarity with the calibration pictures being greater than a resolution threshold;
the matching unit 201 is further configured to screen a corresponding second type of picture from the first type of pictures according to the calibration picture, where the second type of picture is a first type of picture with an RGB feature similarity with the calibration picture being greater than an RGB threshold;
the matching unit 201 is further configured to screen a corresponding third type of picture from the second type of pictures according to the calibration picture, where the third type of picture is the second type of picture with a feature vector similarity with the calibration picture being greater than a feature vector threshold;
the matching unit 201 is further configured to take, as a matching picture, a third type of picture having a maximum value of the feature vector similarity.
Optionally, when the number of the third type of pictures is greater than or equal to 2, the matching unit 201 is further configured to determine whether the values of the similarity of the feature vectors corresponding to the third type of pictures are the same;
if not, the matching unit 201 is further configured to use a third type of picture with the largest value of the feature vector similarity as a matching picture;
if yes, the matching unit 201 is further configured to determine whether the values of the RGB feature similarities corresponding to the third type of pictures are the same;
if not, the matching unit 201 is further configured to use a third type of picture with the maximum value of the RGB feature similarity as a matching picture;
if yes, the matching unit 201 is further configured to determine whether the values of the corresponding resolution similarities of the third type of pictures are the same;
if not, the matching unit 201 is further configured to use the third type of picture with the largest resolution similarity as a matching picture;
if the values of the resolution similarities corresponding to the third type of pictures are the same, the matching unit 201 is further configured to store the third type of pictures in the pending set;
the matching unit 201 is further configured to determine a matching picture from the pending set according to a specified command input by a user.
Alternatively, the matching unit 201 may perform S101-1 to S101-11 described above.
It should be noted that the machine-selected picture evaluation device provided in this embodiment can execute the method flow shown in the method embodiments to achieve the corresponding technical effects. For brevity, reference is made to the corresponding parts of the above embodiments for anything not mentioned in this embodiment.
The present application also provides a computer-readable storage medium storing computer instructions which, when read and executed, perform the machine-selected picture evaluation method of the above embodiments. The storage medium may include memory, flash memory, registers, combinations thereof, or the like.
The following provides an electronic device, which may be a server device or a computer terminal device, as shown in fig. 1, and may implement the above-mentioned machine-selected picture evaluation method; specifically, the electronic device includes: a processor 10, a memory 11, a bus 12. The processor 10 may be a CPU. The memory 11 is used to store one or more programs that, when executed by the processor 10, perform the machine-selected picture evaluation method of the above-described embodiment.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners as well. The apparatus embodiments described above are merely illustrative, for example, flow diagrams and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the same, but rather, various modifications and variations may be made by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application should be included in the protection scope of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (7)

1. A machine-selected picture evaluation method, characterized in that the method comprises:
screening out matching pictures from the machine selection set according to the calibration set;
the calibration set comprises a first number of calibration pictures, the calibration pictures are pictures generated by calibrating a target video manually, the machine selection set comprises a second number of machine selection pictures, the machine selection pictures are pictures generated by performing machine selection operation on the target video, and the matched pictures are machine selection pictures matched with any one calibration picture;
obtaining a preference rate of the mechanically selected pictures according to the first quantity, the second quantity and the quantity of the matched pictures, wherein the preference rate characterizes the accuracy of the mechanically selected operation;
the step of screening the matching pictures from the machine selection set according to the calibration set comprises the following steps:
sequentially screening out corresponding first type pictures from the machine selection set according to the calibration pictures in the calibration set, wherein the first type pictures are machine selection pictures with resolution similarity larger than a resolution threshold value with the calibration pictures;
screening out a corresponding second type of picture from the first type of picture according to the calibration picture, wherein the second type of picture is a first type of picture with RGB feature similarity larger than an RGB threshold value with the calibration picture;
screening out a corresponding third type of picture from the second type of pictures according to the calibration picture, wherein the third type of picture is the second type of picture with the similarity of the feature vector with the calibration picture being larger than a feature vector threshold;
taking a third type picture with the maximum value of the feature vector similarity as the matching picture;
the step of screening the corresponding second type of pictures from the first type of pictures according to the calibration pictures comprises the following steps:
calculating RGB similarity corresponding to the red space histogram, the green space histogram and the blue space histogram of the first type picture and the calibration picture;
comparing RGB similarity corresponding to the red space histogram, the green space histogram and the blue space histogram of the first type picture and the calibration picture with RGB threshold values respectively;
when any one of them does not satisfy the condition of being greater than the RGB threshold, the RGB feature similarity between the first-type picture and the calibration picture is considered not to satisfy the condition of being greater than the RGB threshold;
or, when any one of them satisfies the condition of being greater than the RGB threshold, the RGB feature similarity between the first-type picture and the calibration picture is considered to satisfy the condition of being greater than the RGB threshold;
the step of obtaining the preference rate of the mechanically selected pictures according to the first number, the second number and the number of the matched pictures comprises the following steps:
determining a matching success rate according to the second quantity and the quantity of the matching pictures;
determining a detection rate according to the first quantity and the quantity of the matched pictures;
determining the optimal selection rate according to the matching success rate and the detection rate;
preference rate=2×match success rate×detection rate/(match success rate+detection rate).
2. The machine-selected picture evaluation method according to claim 1, wherein when the number of the third-class pictures is greater than or equal to 2, before taking the third-class picture with the largest value of the feature vector similarity as the matching picture, the step of screening the matching picture from the machine-selected set according to the calibration set further comprises:
judging whether the similarity values of the feature vectors corresponding to the third type of pictures are the same or not;
if not, taking the third type of picture with the maximum value of the feature vector similarity as the matching picture;
if yes, judging whether the values of the RGB feature similarity corresponding to the third type of pictures are the same or not;
if not, taking the third type of picture with the maximum RGB feature similarity value as the matching picture;
if yes, judging whether the values of the resolution similarity corresponding to the third type of pictures are the same or not;
and if not, taking the third type of picture with the maximum resolution similarity value as the matching picture.
3. The machine-selected picture evaluation method as claimed in claim 2, wherein the step of screening out matching pictures from the machine-selected set according to the calibration set further comprises:
if the values of the resolution similarity corresponding to the third type of pictures are the same, storing the third type of pictures in a pending set;
and determining the matched pictures from the to-be-determined set according to the specified command input by the user.
4. A machine-selected picture evaluation device, characterized in that the device comprises:
the matching unit is used for screening matching pictures from the machine selection set according to the calibration set;
the calibration set comprises a first number of calibration pictures, the calibration pictures are pictures generated by calibrating a target video manually, the machine selection set comprises a second number of machine selection pictures, the machine selection pictures are pictures generated by performing machine selection operation on the target video, and the matched pictures are machine selection pictures matched with any one calibration picture;
the processing unit is used for acquiring the preference rate of the mechanically selected pictures according to the first quantity, the second quantity and the quantity of the matched pictures, wherein the preference rate characterizes the accuracy of the mechanically selected operation;
the matching unit is also used for sequentially screening out corresponding first type pictures from the machine selection set according to the calibration pictures in the calibration set, wherein the first type pictures are machine selection pictures with the resolution similarity larger than a resolution threshold value with the calibration pictures;
the matching unit is further used for screening out corresponding second type pictures from the first type pictures according to the calibration pictures, wherein the second type pictures are first type pictures with RGB feature similarity larger than an RGB threshold value with the calibration pictures;
the matching unit is further used for screening out a corresponding third type of picture from the second type of pictures according to the calibration picture, wherein the third type of picture is the second type of picture with the similarity of the feature vector with the calibration picture being greater than a feature vector threshold;
the matching unit is further used for taking a third type of picture with the maximum value of the feature vector similarity as the matching picture;
the step of screening the corresponding second type of pictures from the first type of pictures according to the calibration pictures comprises the following steps:
calculating RGB similarity corresponding to the red space histogram, the green space histogram and the blue space histogram of the first type picture and the calibration picture;
comparing RGB similarity corresponding to the red space histogram, the green space histogram and the blue space histogram of the first type picture and the calibration picture with RGB threshold values respectively;
when any one of them does not satisfy the condition of being greater than the RGB threshold, the RGB feature similarity between the first-type picture and the calibration picture is considered not to satisfy the condition of being greater than the RGB threshold;
or, when any one of them satisfies the condition of being greater than the RGB threshold, the RGB feature similarity between the first-type picture and the calibration picture is considered to satisfy the condition of being greater than the RGB threshold;
the obtaining the preference rate of the mechanically selected pictures according to the first number, the second number and the number of the matched pictures comprises:
determining a matching success rate according to the second quantity and the quantity of the matching pictures;
determining a detection rate according to the first quantity and the quantity of the matched pictures;
determining the optimal selection rate according to the matching success rate and the detection rate;
preference rate=2×match success rate×detection rate/(match success rate+detection rate).
5. The apparatus for evaluating machine-selected pictures as set forth in claim 4, wherein when the number of said third-class pictures is greater than or equal to 2, said matching unit is further configured to determine whether the values of the similarity of said feature vectors corresponding to said third-class pictures are the same;
if not, the matching unit is further configured to use a third type of picture with the largest value of the feature vector similarity as the matching picture;
if yes, the matching unit is further configured to determine whether the values of the RGB feature similarities corresponding to the third type of pictures are the same;
if not, the matching unit is further configured to use a third type of picture with the maximum value of the RGB feature similarity as the matching picture;
if yes, the matching unit is further configured to determine whether the values of the resolution similarity corresponding to the third type of pictures are the same;
if not, the matching unit is further configured to use a third type of picture with the largest resolution similarity as the matching picture;
if the values of the resolution similarities corresponding to the third type of pictures are the same, the matching unit is further configured to store the third type of pictures in a pending set;
the matching unit is also used for determining matching pictures from the to-be-determined set according to the specified command input by the user.
6. A computer readable storage medium, on which a computer program is stored, which computer program, when being executed by a processor, implements the method according to any of claims 1-3.
7. An electronic device, comprising: a processor and a memory for storing one or more programs; the method of any of claims 1-3 being implemented when the one or more programs are executed by the processor.
CN202110181683.5A 2021-02-09 2021-02-09 Machine-selected picture evaluation method and device, storage medium and electronic equipment Active CN112836759B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110181683.5A CN112836759B (en) 2021-02-09 2021-02-09 Machine-selected picture evaluation method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110181683.5A CN112836759B (en) 2021-02-09 2021-02-09 Machine-selected picture evaluation method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112836759A CN112836759A (en) 2021-05-25
CN112836759B true CN112836759B (en) 2023-05-30

Family

ID=75933322

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110181683.5A Active CN112836759B (en) 2021-02-09 2021-02-09 Machine-selected picture evaluation method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112836759B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113345037A (en) * 2021-06-07 2021-09-03 重庆紫光华山智安科技有限公司 Automatic testing method, system, medium and terminal for motor vehicle algorithm indexes

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106469293A (en) * 2015-08-21 2017-03-01 上海羽视澄蓝信息科技有限公司 The method and system of quick detection target
CN106897549A (en) * 2017-01-25 2017-06-27 浙江大学 Cancer Individual treatment policy selection method based on molecular physiology group
WO2020037932A1 (en) * 2018-08-20 2020-02-27 深圳云天励飞技术有限公司 Image quality assessment method, apparatus, electronic device and computer readable storage medium
CN111126122A (en) * 2018-10-31 2020-05-08 浙江宇视科技有限公司 Face recognition algorithm evaluation method and device
CN111222856A (en) * 2020-01-15 2020-06-02 深信服科技股份有限公司 Mail identification method, device, equipment and storage medium
CN111738349A (en) * 2020-06-29 2020-10-02 重庆紫光华山智安科技有限公司 Detection effect evaluation method and device of target detection algorithm, storage medium and equipment
CN112231076A (en) * 2020-09-18 2021-01-15 广东奥博信息产业股份有限公司 Data annotation task scheduling method based on intelligent optimization

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070009159A1 (en) * 2005-06-24 2007-01-11 Nokia Corporation Image recognition system and method using holistic Harr-like feature matching
CN103530638B (en) * 2013-10-29 2016-08-17 无锡赛思汇智科技有限公司 Method for pedestrian matching under multi-cam
US20190325507A1 (en) * 2014-12-08 2019-10-24 International Cruise & Excursion Gallery, Inc. Systems and Methods for Customer Engagement in Travel Related Programs
CN106547744B (en) * 2015-09-16 2020-11-06 杭州海康威视数字技术股份有限公司 Image retrieval method and system
CN105824862A (en) * 2015-10-20 2016-08-03 维沃移动通信有限公司 Image classification method based on electronic equipment and electronic equipment
CN109063784B (en) * 2018-08-23 2021-03-05 深圳码隆科技有限公司 Character clothing image data screening method and device
CN109271932A (en) * 2018-09-17 2019-01-25 中国电子科技集团公司第二十八研究所 Pedestrian based on color-match recognition methods again
CN110084268A (en) * 2019-03-18 2019-08-02 浙江大华技术股份有限公司 Image comparison method, face identification method and device, computer storage medium
CN111401265B (en) * 2020-03-19 2020-12-25 重庆紫光华山智安科技有限公司 Pedestrian re-identification method and device, electronic equipment and computer-readable storage medium
CN111506750B (en) * 2020-06-15 2021-03-16 北京金山云网络技术有限公司 Picture retrieval method and device and electronic equipment

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106469293A (en) * 2015-08-21 2017-03-01 上海羽视澄蓝信息科技有限公司 The method and system of quick detection target
CN106897549A (en) * 2017-01-25 2017-06-27 浙江大学 Cancer Individual treatment policy selection method based on molecular physiology group
WO2020037932A1 (en) * 2018-08-20 2020-02-27 深圳云天励飞技术有限公司 Image quality assessment method, apparatus, electronic device and computer readable storage medium
CN111126122A (en) * 2018-10-31 2020-05-08 浙江宇视科技有限公司 Face recognition algorithm evaluation method and device
CN111222856A (en) * 2020-01-15 2020-06-02 深信服科技股份有限公司 Mail identification method, device, equipment and storage medium
CN111738349A (en) * 2020-06-29 2020-10-02 重庆紫光华山智安科技有限公司 Detection effect evaluation method and device of target detection algorithm, storage medium and equipment
CN112231076A (en) * 2020-09-18 2021-01-15 广东奥博信息产业股份有限公司 Data annotation task scheduling method based on intelligent optimization

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhou Yang et al., "Research on Feature-Based MBD Model Retrieval Method", Modular Machine Tool & Automatic Manufacturing Technique, 2018, No. 10, pp. 151-155 and 160. *

Also Published As

Publication number Publication date
CN112836759A (en) 2021-05-25

Similar Documents

Publication Publication Date Title
WO2017000716A2 (en) Image management method and device, and terminal device
US7826657B2 (en) Automatically generating a content-based quality metric for digital images
CN109886928B (en) Target cell marking method, device, storage medium and terminal equipment
CN110443297B (en) Image clustering method and device and computer storage medium
CN108182421A (en) Methods of video segmentation and device
CN107240082B (en) Splicing line optimization method and equipment
CN109903210B (en) Watermark removal method, watermark removal device and server
JP2001325593A (en) Duplicate picture detecting method in automatic albuming system
US20230214989A1 (en) Defect detection method, electronic device and readable storage medium
CN111738349A (en) Detection effect evaluation method and device of target detection algorithm, storage medium and equipment
CN104065863A (en) Image processing method and processing device
WO2020143165A1 (en) Reproduced image recognition method and system, and terminal device
WO2021184628A1 (en) Image processing method and device
CN112836759B (en) Machine-selected picture evaluation method and device, storage medium and electronic equipment
CN111291778B (en) Training method of depth classification model, exposure anomaly detection method and device
JP2006285956A (en) Red eye detecting method and device, and program
CN111598176A (en) Image matching processing method and device
CN112966687B (en) Image segmentation model training method and device and communication equipment
CN112287905A (en) Vehicle damage identification method, device, equipment and storage medium
WO2024001309A1 (en) Method and apparatus for generating and producing template for infrared thermal image analysis report
CN109919164B (en) User interface object identification method and device
CN116993654A (en) Camera module defect detection method, device, equipment, storage medium and product
CN113158773B (en) Training method and training device for living body detection model
CN115147633A (en) Image clustering method, device, equipment and storage medium
CN111400534B (en) Cover determination method and device for image data and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant