CN113034611A - Operation evaluation method, operation evaluation device, electronic apparatus, and medium


Info

Publication number
CN113034611A
CN113034611A
Authority
CN
China
Prior art keywords
parameter
target
foreground
picture
grid map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110316365.5A
Other languages
Chinese (zh)
Inventor
王淳毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110316365.5A priority Critical patent/CN113034611A/en
Publication of CN113034611A publication Critical patent/CN113034611A/en
Withdrawn legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 — Subject of image; Context of image processing
    • G06T 2207/30168 — Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an operation evaluation method, an operation evaluation device, an electronic device, and a medium, and belongs to the field of mobile communication. The method comprises the following steps: acquiring a first parameter and a second parameter of a target processing operation, the target processing operation being an operation of processing an original picture to obtain a target picture; the first parameter comprises at least one of an orthomorphism parameter of a foreground portion and a bendability parameter of a background portion, the foreground portion being the foreground of the target picture and the background portion being the background of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture; and obtaining evaluation information of the target processing operation according to the first parameter and the second parameter. The embodiment of the application addresses the prior-art problems of high cost and poor robustness of the scoring mechanism when the MOS value is used to evaluate portrait distortion.

Description

Operation evaluation method, operation evaluation device, electronic apparatus, and medium
Technical Field
The present application relates to the field of mobile communications, and in particular, to an operation evaluation method, apparatus, electronic device, and medium.
Background
With the rapid development of mobile communication technology, various mobile electronic devices and non-mobile electronic devices have become indispensable tools in various aspects of people's lives. The functions of various Application programs (APPs) of the electronic equipment are gradually improved, and the functions do not only play a role in communication, but also provide various intelligent services for users, so that great convenience is brought to the work and life of the users.
With respect to the shooting function, electronic devices have become the main devices currently used to shoot multimedia files such as pictures or videos. When shooting with an electronic device, portrait distortion is a phenomenon that is very noticeable to the human eye, especially in wide-angle camera imaging. It is therefore necessary to apply distortion correction to the captured image to reduce the distortion in the portrait region. Each user perceives the same correction result differently, so the general public's acceptance of the distortion is inferred from the subjective scores of many users, for example by using the Mean Opinion Score (MOS) to evaluate the quality of an image after distortion correction. However, evaluating portrait distortion with MOS values requires a large amount of labor cost and time cost, and different experiments carry their own deviations and prior conditions, which reduces the robustness of the scoring mechanism.
Disclosure of Invention
An object of the embodiments of the present application is to provide an operation evaluation method, an operation evaluation device, an electronic device, and a medium, which can solve the problems in the prior art that the cost is high and the robustness of a scoring mechanism is poor when a MOS value is used to evaluate portrait distortion.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an operation evaluation method, where the method includes:
acquiring a first parameter and a second parameter of target processing operation;
the target processing operation is an operation of processing an original picture and obtaining a target picture;
the first parameters comprise at least one of an orthomorphism parameter of a foreground part and a bendability parameter of a background part, the foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture;
and obtaining the evaluation information of the target processing operation according to the first parameter and the second parameter.
In a second aspect, an embodiment of the present application further provides an operation evaluation apparatus, including:
the acquisition module is used for acquiring a first parameter and a second parameter of target processing operation;
the target processing operation is an operation of processing an original picture and obtaining a target picture;
the first parameters comprise at least one of an orthomorphism parameter of a foreground part and a bendability parameter of a background part, the foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture;
and the evaluation module is used for obtaining the evaluation information of the target processing operation according to the first parameter and the second parameter.
In a third aspect, an embodiment of the present application further provides an electronic device, which includes a memory, a processor, and a program or an instruction stored on the memory and executable on the processor, and when the processor executes the program or the instruction, the steps in the operation evaluation method described above are implemented.
In a fourth aspect, the present application further provides a readable storage medium, on which a program or instructions are stored, and when the program or instructions are executed by a processor, the program or instructions implement the steps in the operation evaluation method as described above.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method described above.
In the embodiment of the application, a first parameter and a second parameter of target processing operation are obtained; the first parameters comprise at least one of an orthomorphism parameter of a foreground part and a bendability parameter of a background part, the foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture; obtaining evaluation information of the target processing operation according to the first parameter and the second parameter, wherein in the process of evaluating the target processing operation, subjective scores of a large number of users do not need to be acquired, so that time cost and labor cost are saved; by using the evaluation information as an objective scoring mechanism, a large amount of scoring verification time and uncertainty in the scoring verification process are reduced, and the robustness of the scoring mechanism is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the description of the embodiments of the present application will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
FIG. 1 is a flow chart of an operation evaluation method provided by an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a first example provided by an embodiment of the present application;
FIG. 3 shows a flow chart of a second example provided by an embodiment of the present application;
FIG. 4 is a schematic diagram illustrating a second example provided by an embodiment of the present application;
FIG. 5 shows a block diagram of an operation evaluation device provided by an embodiment of the present application;
FIG. 6 shows a block diagram of an electronic device provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practised in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The operation evaluation method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings by specific embodiments and application scenarios thereof.
Referring to fig. 1, an embodiment of the present application provides an operation evaluation method, which is optionally applicable to electronic devices including various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, as well as various forms of Mobile Stations (MSs), Terminal devices (Terminal devices), and the like.
The method comprises the following steps:
step 101, acquiring a first parameter and a second parameter of target processing operation;
the target processing operation is an operation of processing an original picture and obtaining a target picture, for example, an adjustment operation of adjusting portrait distortion in the original picture;
the first parameter includes at least one of a Conformality parameter of the foreground portion and a bendability parameter of the background portion, the Conformality parameter (conformability energy) being used for representing a degree of keeping a shape consistent before and after adjustment, and being used for representing a fidelity degree of the foreground; the warping parameter (warping energy) indicates the degree of warping of the image before and after adjustment, and is used to indicate the degree of distortion of the background.
The foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; it is understood that the foreground portion of the target picture is the same as the foreground portion of the original picture, and the background portion of the target picture is the same as the background portion of the original picture.
The second parameter includes a view angle change parameter between the target picture and the original picture. The view angle, i.e. the field of view (FOV), is the included angle, with the lens of the optical instrument as its vertex, formed by the two edges of the largest range through which the image of the measured target can pass.
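As a concrete illustration of this view angle change parameter, the following minimal Python sketch computes a diagonal field of view from a simple pinhole-camera model and expresses the change as the fraction of the original field of view that the target picture no longer covers. The ratio-based definition, the function names and the example numbers are illustrative assumptions, not taken from the patent.

```python
import math

def diagonal_fov_deg(focal_px: float, width: int, height: int) -> float:
    """Diagonal field of view (in degrees) of a pinhole camera with a focal
    length given in pixels and an image of the given pixel size."""
    half_diag = 0.5 * math.hypot(width, height)
    return 2.0 * math.degrees(math.atan2(half_diag, focal_px))

def fov_change_parameter(original_fov_deg: float, target_fov_deg: float) -> float:
    """Assumed definition: fraction of the original field of view lost by the
    target (corrected) picture; 0.0 means no loss, larger values mean more cropping."""
    return max(0.0, (original_fov_deg - target_fov_deg) / original_fov_deg)

# Example: a 108-degree wide-angle shot whose corrected output covers only 98 degrees
# has lost roughly 9% of its field of view.
c = fov_change_parameter(108.0, 98.0)
print(f"view angle change parameter C = {c:.3f}")
```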
And 102, obtaining evaluation information of the target processing operation according to the first parameter and the second parameter.
After the first parameter and the second parameter are obtained, evaluating and processing the target processing operation according to a preset evaluation algorithm to obtain evaluation information; alternatively, the evaluation information may include an evaluation value of the target processing operation, and the evaluation value may be calculated by performing a weighted summation on the first parameter and the second parameter.
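A minimal sketch of such a weighted summation is shown below. Treating all three parameters as penalty energies (smaller is better), the concrete weights and the function name are illustrative assumptions; the patent only states that the evaluation value may be calculated by a weighted summation of the first parameter and the second parameter.

```python
def evaluation_value(conformality: float, bendability: float, fov_change: float,
                     weights: tuple = (0.5, 0.3, 0.2)) -> float:
    """Weighted summation of the first parameters (conformality/orthomorphism and
    bendability) and the second parameter (view angle change). All inputs are
    treated as penalty terms, so a smaller result indicates a better correction.
    The weights are placeholders that would be tuned in practice."""
    w_c, w_b, w_f = weights
    return w_c * conformality + w_b * bendability + w_f * fov_change

# Example values only.
score = evaluation_value(conformality=0.12, bendability=0.30, fov_change=0.09)
print(f"evaluation value = {score:.3f}")
```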
In the embodiment of the application, a first parameter and a second parameter of target processing operation are obtained; the first parameters comprise at least one of an orthomorphism parameter of a foreground part and a bendability parameter of a background part, the foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture; obtaining evaluation information of the target processing operation according to the first parameter and the second parameter, wherein in the process of evaluating the target processing operation, subjective scores of a large number of users do not need to be acquired, so that time cost and labor cost are saved; by using the evaluation information as an objective scoring mechanism, a large amount of scoring verification time and uncertainty in the scoring verification process are reduced, and the robustness of the scoring mechanism is improved. The embodiment of the application solves the problems that in the prior art, when the MOS value is used for evaluating the portrait distortion, the cost is high, and the robustness of a scoring mechanism is poor.
In an optional embodiment, the obtaining the first parameter of the target processing operation includes:
extracting a foreground part and a background part of the target picture; optionally, the subject identification may be performed on the target picture, so as to obtain a foreground portion including the subject portion and a background portion not including the subject portion; as a first example, as shown in fig. 2, the part indicated by the mark a is a foreground part, and the remaining black part is a background part;
determining an orthomorphism parameter for the foreground portion and determining a bendability parameter for the background portion; the orthomorphism (conformality) parameter expresses the degree to which the shape is kept consistent before and after adjustment, i.e. the fidelity of the foreground, and the bendability parameter expresses the degree of bending of the image before and after adjustment, i.e. the degree of distortion of the background.
In an optional embodiment, the extracting the foreground part and the background part of the target picture includes:
carrying out portrait cutting processing on the original picture to obtain a mask image;
and carrying out segmentation processing on the target picture according to the mask image to obtain a foreground part and a background part.
The mask image, i.e. a layer mask, is attached to a layer and masks that layer as a gray-scale image; for example, the black part of the mask makes the corresponding part of the layer transparent, so that the layer below becomes visible.
Portrait cutting processing is performed on the original picture to obtain a mask image, and the target picture is then segmented according to the mask image to obtain the foreground part and the background part, as sketched below.
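The sketch below illustrates this segmentation step, assuming the mask is a single-channel array aligned with the target picture and non-zero inside the portrait; the function name and the NumPy-based representation are assumptions made for illustration only.

```python
import numpy as np

def split_by_mask(target_rgb: np.ndarray, mask: np.ndarray):
    """Split the target picture into a foreground part and a background part using
    a portrait mask. `mask` is assumed to be a single-channel array with the same
    height and width as the picture, non-zero where the portrait is."""
    fg_mask = mask.astype(bool)
    foreground = np.where(fg_mask[..., None], target_rgb, 0)  # keep portrait pixels only
    background = np.where(fg_mask[..., None], 0, target_rgb)  # keep everything else
    return foreground, background

# Toy usage: a 4x4 picture with a 2x2 portrait region in the middle.
img = np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8)
mask = np.zeros((4, 4), dtype=np.uint8)
mask[1:3, 1:3] = 255
fg, bg = split_by_mask(img, mask)
```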
In an optional embodiment, the determining the conformality parameter of the foreground portion includes:
carrying out grid map sampling on the foreground part to obtain a first grid map; the grid graph is a mesh graph;
determining the orthomorphism parameters of the foreground part in the first grid map according to a first data relationship, wherein the first data relationship is as follows:
[The first data relationship is given as an equation image in the original publication and is not reproduced here.]
where E_c represents the orthomorphism parameter;
s_{i,j} represents a coordinate point in said first grid map, whose coordinate values in the corresponding correction plane are (u_{i,j}, v_{i,j}); u_{i,j} is the horizontal coordinate value on the correction plane and v_{i,j} the vertical coordinate value on the correction plane;
λ_{i,j} denotes the longitude value of s_{i,j}; φ_{i,j} denotes the latitude value of s_{i,j};
V represents the set of coordinate points in the first grid map;
w_{i,j} represents a preset weight.
For a given point on the grid map, the four grid points adjacent to it are used to calculate its orthomorphism parameter. For example, for a coordinate point s_{i,j} in the first grid map with corresponding correction-plane coordinates (u_{i,j}, v_{i,j}), the orthomorphism parameter of the point is obtained from the longitude or latitude relationships among the neighbouring values v_{i+1,j}, u_{i,j+1}, u_{i+1,j} and v_{i,j+1}.
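Because the first data relationship itself is only given as an equation image, the sketch below uses a generic discrete conformality (Cauchy-Riemann style) energy over the sampled grid as a stand-in: it accumulates weighted local deviations from a similarity transform using each point's right and down neighbours. It omits the longitude and latitude terms λ_{i,j} and φ_{i,j} that the patent's formula involves, so it should be read as an assumed illustration of a conformality energy, not as the patented formula.

```python
import numpy as np

def conformality_energy(u: np.ndarray, v: np.ndarray, w: np.ndarray) -> float:
    """Illustrative discrete conformality energy on a sampled grid.
    u, v: (H, W) correction-plane coordinates of the grid points s_{i,j}.
    w:    (H, W) per-point weights w_{i,j} (e.g. 1 inside the portrait mask, 0 outside).
    A conformal (shape-preserving) map satisfies du/dx = dv/dy and du/dy = -dv/dx,
    so the squared residuals of these relations are summed over the grid."""
    du_x = u[:, 1:] - u[:, :-1]   # u_{i,j+1} - u_{i,j}
    dv_x = v[:, 1:] - v[:, :-1]
    du_y = u[1:, :] - u[:-1, :]   # u_{i+1,j} - u_{i,j}
    dv_y = v[1:, :] - v[:-1, :]
    # Crop all differences to the common (H-1, W-1) block so they align.
    r1 = du_x[:-1, :] - dv_y[:, :-1]   # du/dx - dv/dy
    r2 = du_y[:, :-1] + dv_x[:-1, :]   # du/dy + dv/dx
    return float(np.sum(w[:-1, :-1] * (r1 ** 2 + r2 ** 2)))
```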
In an optional embodiment, the determining the bendability parameter of the background portion includes:
sampling the grid map of the background part to obtain a second grid map;
determining the bendability parameter of the background part in the second grid map according to a second data relationship, wherein the second data relationship is as follows:
[The second data relationship is given as an equation image in the original publication and is not reproduced here.]
where the vector symbol shown as an equation image in the original represents a first vector between a first target point and a second target point in the second grid map; e represents a second vector between the first target point and the second target point in a third grid map of the original picture; S_e represents a preset value.
The positional displacement of each pixel between the original picture and the target picture is calculated by optical flow to obtain the grid deformation (mesh warping); grid maps are then sampled on the original picture and on the target picture respectively, the degree of bending of all straight lines (vectors) on the grid maps is calculated, and the results are summed to obtain the bending (warping) energy.
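A minimal sketch of such a bending (warping) energy follows. It assumes that corresponding grid points are available for the original and the target picture (for example from the optical flow mentioned above) and that, for every horizontal and vertical grid edge, the vector in the target grid (the "first vector") is compared with the same edge in the original grid (the "second vector"), scaled by a preset value S_e. This pairing of the two vectors and the role of S_e are assumed readings of the text, since the exact second data relationship appears only as an equation image.

```python
import numpy as np

def bendability_energy(grid_target: np.ndarray, grid_orig: np.ndarray, s_e: float = 1.0) -> float:
    """grid_target, grid_orig: (H, W, 2) corresponding grid-point positions sampled on
    the target and the original picture. For every horizontal and vertical grid edge,
    the squared difference between the target-grid edge vector and the original-grid
    edge vector is accumulated and scaled by the preset value s_e."""
    total = 0.0
    for axis in (0, 1):                             # vertical edges, then horizontal edges
        e_hat = np.diff(grid_target, axis=axis)     # first vectors (second grid map)
        e = np.diff(grid_orig, axis=axis)           # second vectors (third grid map)
        total += float(np.sum(np.linalg.norm(e_hat - e, axis=-1) ** 2)) / s_e
    return total
```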
As a second example, referring to fig. 3, the operation evaluation method provided by the embodiment of the present application mainly includes the following processes:
step 301, performing portrait cutting processing on the original picture to obtain a mask image.
Step 302, performing segmentation processing on the target picture according to the mask map to obtain a foreground part and a background part.
Step 303, at least one of the orthomorphism parameter of the foreground part and the bendability parameter of the background part is extracted.
In step 304, a viewing angle variation parameter, i.e. a degree of reduction of the viewing angle of the target picture relative to the original picture, is determined.
Step 305, determining evaluation information according to the view angle change parameter and at least one of the orthomorphism parameter and the bendability parameter.
Taking as an example the case where the first parameter includes both the orthomorphism parameter of the foreground part and the bendability parameter of the background part, and with reference to fig. 4, the process of calculating the evaluation information mainly includes:
The orthomorphism parameter A and the bendability parameter B are calculated from the target picture, the original picture and the grid map; the view angle change parameter C is calculated from the target picture and the original picture; a weighted summation D is then performed on A, B and C to obtain the evaluation value E. If the summation D needs to be adjusted later, only the weight values need to be changed, so the algorithm has high robustness.
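For completeness, the snippet below strings the hypothetical helpers sketched earlier (conformality_energy, bendability_energy, fov_change_parameter and evaluation_value) into the flow of fig. 4 using synthetic data; it assumes those sketches are in scope and is purely illustrative of how A, B, C, D and E fit together.

```python
import numpy as np

H, W = 9, 12
u, v = np.meshgrid(np.linspace(0.0, 1.0, W), np.linspace(0.0, 1.0, H))  # correction-plane coords
weights_map = np.ones((H, W))                                           # w_{i,j}
grid_orig = np.stack(np.meshgrid(np.arange(W), np.arange(H)), axis=-1).astype(float)
grid_target = grid_orig + 0.05 * np.random.randn(H, W, 2)               # mildly warped grid

A = conformality_energy(u, v, weights_map)       # orthomorphism parameter of the foreground
B = bendability_energy(grid_target, grid_orig)   # bendability parameter of the background
C = fov_change_parameter(108.0, 98.0)            # view angle change parameter
E = evaluation_value(A, B, C)                    # weighted summation D yields evaluation value E
print(f"A={A:.4f}  B={B:.4f}  C={C:.4f}  E={E:.4f}")
```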
In the embodiment of the application, a first parameter and a second parameter of target processing operation are obtained; the first parameters comprise at least one of an orthomorphism parameter of a foreground part and a bendability parameter of a background part, the foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture; obtaining evaluation information of the target processing operation according to the first parameter and the second parameter, wherein in the process of evaluating the target processing operation, subjective scores of a large number of users do not need to be acquired, so that time cost and labor cost are saved; by using the evaluation information as an objective scoring mechanism, a large amount of scoring verification time and uncertainty in the scoring verification process are reduced, and the robustness of the scoring mechanism is improved.
Having described the operation evaluation method provided by the embodiments of the present application, the operation evaluation apparatus provided by the embodiments of the present application will be described below with reference to the accompanying drawings.
It should be noted that, in the operation evaluation method provided in the embodiment of the present application, the execution subject may be an operation evaluation device, or a control module in the operation evaluation device for executing the operation evaluation method. In the embodiment of the present application, an operation evaluation method executed by an operation evaluation device is taken as an example, and the operation evaluation method provided in the embodiment of the present application is described.
Referring to fig. 5, an embodiment of the present application further provides an operation evaluation apparatus 500, including:
an obtaining module 501, configured to obtain a first parameter and a second parameter of a target processing operation;
the target processing operation is an operation of processing an original picture and obtaining a target picture, for example, an adjustment operation of adjusting portrait distortion in the original picture;
the first parameter includes at least one of a Conformality parameter of the foreground portion and a bendability parameter of the background portion, the Conformality parameter (conformability energy) being used for representing a degree of keeping a shape consistent before and after adjustment, and being used for representing a fidelity degree of the foreground; the warping parameter (warping energy) indicates the degree of warping of the image before and after adjustment, and is used to indicate the degree of distortion of the background.
The foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; it is understood that the foreground portion of the target picture is the same as the foreground portion of the original picture, and the background portion of the target picture is the same as the background portion of the original picture.
The second parameter includes a view angle change parameter between the target picture and the original picture; the view angle, i.e. the field of view (FOV), is the included angle, with the lens of the optical instrument as its vertex, formed by the two edges of the largest range through which the image of the measured target can pass.
An evaluation module 502, configured to obtain evaluation information of the target processing operation according to the first parameter and the second parameter.
After the first parameter and the second parameter are obtained, evaluating and processing the target processing operation according to a preset evaluation algorithm to obtain evaluation information; alternatively, the evaluation information may include an evaluation value of the target processing operation, and the evaluation value may be calculated by performing a weighted summation on the first parameter and the second parameter.
Optionally, in this embodiment of the present application, the obtaining module 501 includes:
the extraction submodule is used for extracting a foreground part and a background part of the target picture;
a determining sub-module for determining a conformality parameter of the foreground portion and determining a bendability parameter of the background portion.
Optionally, in an embodiment of the present application, the extracting sub-module is configured to:
carrying out portrait cutting processing on the original picture to obtain a mask image;
and carrying out segmentation processing on the target picture according to the mask image to obtain a foreground part and a background part.
Optionally, in an embodiment of the present application, the determining sub-module is configured to:
carrying out grid map sampling on the foreground part to obtain a first grid map;
determining the orthomorphism parameters of the foreground part in the first grid map according to a first data relationship, wherein the first data relationship is as follows:
[The first data relationship is given as an equation image in the original publication and is not reproduced here.]
where E_c represents the orthomorphism parameter;
s_{i,j} represents a coordinate point in said first grid map, whose coordinate values in the corresponding correction plane are (u_{i,j}, v_{i,j}); u_{i,j} is the horizontal coordinate value on the correction plane and v_{i,j} the vertical coordinate value on the correction plane;
λ_{i,j} denotes the longitude value of s_{i,j}; φ_{i,j} denotes the latitude value of s_{i,j};
V represents the set of coordinate points in the first grid map;
w_{i,j} represents a preset weight.
Optionally, in an embodiment of the present application, the determining sub-module is configured to:
sampling the grid map of the background part to obtain a second grid map;
determining the bendability parameter of the background part in the second grid map according to a second data relationship, wherein the second data relationship is as follows:
[The second data relationship is given as an equation image in the original publication and is not reproduced here.]
where the vector symbol shown as an equation image in the original represents a first vector between a first target point and a second target point in the second grid map; e represents a second vector between the first target point and the second target point in a third grid map of the original picture; S_e represents a preset value.
In the embodiment of the present application, the obtaining module 501 obtains a first parameter and a second parameter of a target processing operation; the first parameter comprises at least one of an orthomorphism parameter of a foreground portion and a bendability parameter of a background portion, the foreground portion being a foreground portion of the target picture and the background portion being a background portion of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture. The evaluation module 502 obtains the evaluation information of the target processing operation according to the first parameter and the second parameter, and subjective scores of a large number of users do not need to be collected in the process of evaluating the target processing operation, so time cost and labor cost are saved; by using the evaluation information as an objective scoring mechanism, a large amount of scoring verification time and uncertainty in the scoring verification process are reduced, and the robustness of the scoring mechanism is improved.
The operation evaluation device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the Mobile electronic device may be a Mobile phone, a tablet Computer, a notebook Computer, a palm top Computer, an in-vehicle electronic device, a wearable device, an Ultra-Mobile Personal Computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-Mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (Personal Computer, PC), a Television (TV), a teller machine, a self-service machine, and the like, and the embodiments of the present application are not limited in particular.
The operation evaluation device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The operation evaluation device provided in the embodiment of the present application can implement each process implemented by the operation evaluation device in the method embodiments of fig. 1 to fig. 4, and is not described here again to avoid repetition.
Optionally, an electronic device is further provided in this embodiment of the present application, and includes a processor 610, a memory 609, and a program or an instruction stored in the memory 609 and capable of being executed on the processor 610, where the program or the instruction is executed by the processor 610 to implement each process of the above operation evaluation method embodiment, and can achieve the same technical effect, and details are not described here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 6 is a schematic hardware structure diagram of an electronic device 600 implementing various embodiments of the present application;
the electronic device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and a power supply 611.
Those skilled in the art will appreciate that the electronic device 600 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 610 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
A processor 610 for obtaining a first parameter and a second parameter of a target processing operation;
the target processing operation is an operation of processing an original picture and obtaining a target picture;
the first parameters comprise at least one of an orthomorphism parameter of a foreground part and a bendability parameter of a background part, the foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture;
and obtaining the evaluation information of the target processing operation according to the first parameter and the second parameter.
Optionally, the processor 610 is configured to:
extracting a foreground part and a background part of the target picture;
determining an orthomorphism parameter for the foreground portion and determining a bendability parameter for the background portion.
Optionally, the processor 610 is configured to:
carrying out portrait cutting processing on the original picture to obtain a mask image;
and carrying out segmentation processing on the target picture according to the mask image to obtain a foreground part and a background part.
Optionally, the processor 610 is configured to:
carrying out grid map sampling on the foreground part to obtain a first grid map;
determining the orthomorphism parameters of the foreground part in the first grid map according to a first data relationship, wherein the first data relationship is as follows:
[The first data relationship is given as an equation image in the original publication and is not reproduced here.]
where E_c represents the orthomorphism parameter;
s_{i,j} represents a coordinate point in said first grid map, whose coordinate values in the corresponding correction plane are (u_{i,j}, v_{i,j}); u_{i,j} is the horizontal coordinate value on the correction plane and v_{i,j} the vertical coordinate value on the correction plane;
λ_{i,j} denotes the longitude value of s_{i,j}; φ_{i,j} denotes the latitude value of s_{i,j};
V represents the set of coordinate points in the first grid map;
w_{i,j} represents a preset weight.
Optionally, the processor 610 is configured to:
sampling the grid map of the background part to obtain a second grid map;
determining the bendability parameter of the background part in the second grid map according to a second data relationship, wherein the second data relationship is as follows:
[The second data relationship is given as an equation image in the original publication and is not reproduced here.]
where the vector symbol shown as an equation image in the original represents a first vector between a first target point and a second target point in the second grid map; e represents a second vector between the first target point and the second target point in a third grid map of the original picture; S_e represents a preset value.
In the embodiment of the application, a first parameter and a second parameter of target processing operation are obtained; the first parameters comprise at least one of an orthomorphism parameter of a foreground part and a bendability parameter of a background part, the foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture; obtaining evaluation information of the target processing operation according to the first parameter and the second parameter, wherein in the process of evaluating the target processing operation, subjective scores of a large number of users do not need to be acquired, so that time cost and labor cost are saved; by using the evaluation information as an objective scoring mechanism, a large amount of scoring verification time and uncertainty in the scoring verification process are reduced, and the robustness of the scoring mechanism is improved.
It is to be understood that, in the embodiment of the present application, the input Unit 604 may include a Graphics Processing Unit (GPU) 6041 and a microphone 6042, and the Graphics Processing Unit 6041 processes image data of a still picture or a video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 607 includes a touch panel 6071 and other input devices 6072. A touch panel 6071, also referred to as a touch screen. The touch panel 6071 may include two parts of a touch detection device and a touch controller. Other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 609 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 610 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the operation evaluation method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above operation evaluation method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. An operation assessment method, characterized in that the method comprises:
acquiring a first parameter and a second parameter of target processing operation;
the target processing operation is an operation of processing an original picture and obtaining a target picture;
the first parameters comprise at least one of an orthomorphism parameter of a foreground part and a bendability parameter of a background part, the foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture;
and obtaining the evaluation information of the target processing operation according to the first parameter and the second parameter.
2. The operation evaluation method of claim 1, wherein said obtaining a first parameter of a target processing operation comprises:
extracting a foreground part and a background part of the target picture;
determining an orthomorphism parameter for the foreground portion and determining a bendability parameter for the background portion.
3. The operation evaluation method according to claim 2, wherein the extracting of the foreground portion and the background portion of the target picture comprises:
carrying out portrait cutting processing on the original picture to obtain a mask image;
and carrying out segmentation processing on the target picture according to the mask image to obtain a foreground part and a background part.
4. The operation evaluation method according to claim 2, wherein the determining of the conformality parameter of the foreground portion comprises:
carrying out grid map sampling on the foreground part to obtain a first grid map;
determining the orthomorphism parameters of the foreground part in the first grid map according to a first data relationship, wherein the first data relationship is as follows:
[The first data relationship is given as an equation image in the original publication and is not reproduced here.]
wherein E_c represents an orthomorphism parameter;
s_{i,j} represents a coordinate point in said first grid map, whose coordinate values in the corresponding correction plane are (u_{i,j}, v_{i,j}); u_{i,j} represents the horizontal coordinate value on the correction plane and v_{i,j} the vertical coordinate value on the correction plane;
λ_{i,j} denotes the longitude value of s_{i,j}; φ_{i,j} denotes the latitude value of s_{i,j};
V represents the set of coordinate points in the first grid map;
w_{i,j} represents a preset weight.
5. The operation evaluation method according to claim 2, wherein the determining of the bendability parameter of the background portion includes:
sampling the grid map of the background part to obtain a second grid map;
determining the bendability parameter of the background part in the second grid map according to a second data relationship, wherein the second data relationship is as follows:
[The second data relationship is given as an equation image in the original publication and is not reproduced here.]
wherein the vector symbol shown as an equation image in the original represents a first vector between a first target point and a second target point in the second grid map; e represents a second vector between the first target point and the second target point in a third grid map of the original picture; S_e represents a preset value.
6. An operation evaluation device, characterized in that the device comprises:
the acquisition module is used for acquiring a first parameter and a second parameter of target processing operation;
the target processing operation is an operation of processing an original picture and obtaining a target picture;
the first parameters comprise at least one of an orthomorphism parameter of a foreground part and a bendability parameter of a background part, the foreground part is a foreground part of the target picture, and the background part is a background part of the target picture; the second parameter comprises a view angle change parameter between the target picture and the original picture;
and the evaluation module is used for obtaining the evaluation information of the target processing operation according to the first parameter and the second parameter.
7. The operation evaluation device according to claim 6, wherein the acquisition module comprises:
the extraction submodule is used for extracting a foreground part and a background part of the target picture;
a determining sub-module for determining a conformality parameter of the foreground portion and determining a bendability parameter of the background portion.
8. The operation evaluation device of claim 7, wherein the extraction sub-module is configured to:
carrying out portrait cutting processing on the original picture to obtain a mask image;
and carrying out segmentation processing on the target picture according to the mask image to obtain a foreground part and a background part.
9. The operation evaluation device of claim 7, wherein the determination sub-module is configured to:
carrying out grid map sampling on the foreground part to obtain a first grid map;
determining the orthomorphism parameters of the foreground part in the first grid map according to a first data relationship, wherein the first data relationship is as follows:
[The first data relationship is given as an equation image in the original publication and is not reproduced here.]
wherein E_c represents an orthomorphism parameter;
s_{i,j} represents a coordinate point in said first grid map, whose coordinate values in the corresponding correction plane are (u_{i,j}, v_{i,j}); u_{i,j} represents the horizontal coordinate value on the correction plane and v_{i,j} the vertical coordinate value on the correction plane;
λ_{i,j} denotes the longitude value of s_{i,j}; φ_{i,j} denotes the latitude value of s_{i,j};
V represents the set of coordinate points in the first grid map;
w_{i,j} represents a preset weight.
10. The operation evaluation device of claim 7, wherein the determination sub-module is configured to:
sampling the grid map of the background part to obtain a second grid map;
determining the bendability parameter of the background part in the second grid map according to a second data relationship, wherein the second data relationship is as follows:
[The second data relationship is given as an equation image in the original publication and is not reproduced here.]
wherein the vector symbol shown as an equation image in the original represents a first vector between a first target point and a second target point in the second grid map; e represents a second vector between the first target point and the second target point in a third grid map of the original picture; S_e represents a preset value.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the operation evaluation method of any one of claims 1 to 5.
12. A readable storage medium, characterized in that the readable storage medium stores thereon a program or instructions which, when executed by a processor, implement the steps of the operation evaluation method of any one of claims 1 to 5.
CN202110316365.5A 2021-03-24 2021-03-24 Operation evaluation method, operation evaluation device, electronic apparatus, and medium Withdrawn CN113034611A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110316365.5A CN113034611A (en) 2021-03-24 2021-03-24 Operation evaluation method, operation evaluation device, electronic apparatus, and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110316365.5A CN113034611A (en) 2021-03-24 2021-03-24 Operation evaluation method, operation evaluation device, electronic apparatus, and medium

Publications (1)

Publication Number Publication Date
CN113034611A (en) 2021-06-25

Family

ID=76473473

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110316365.5A Withdrawn CN113034611A (en) 2021-03-24 2021-03-24 Operation evaluation method, operation evaluation device, electronic apparatus, and medium

Country Status (1)

Country Link
CN (1) CN113034611A (en)

Similar Documents

Publication Publication Date Title
CN110210571B (en) Image recognition method and device, computer equipment and computer readable storage medium
WO2020224479A1 (en) Method and apparatus for acquiring positions of target, and computer device and storage medium
CN113205568B (en) Image processing method, device, electronic equipment and storage medium
CN110570460B (en) Target tracking method, device, computer equipment and computer readable storage medium
US11030733B2 (en) Method, electronic device and storage medium for processing image
CN112333385B (en) Electronic anti-shake control method and device
CN110807769B (en) Image display control method and device
CN113489909B (en) Shooting parameter determining method and device and electronic equipment
CN110232417B (en) Image recognition method and device, computer equipment and computer readable storage medium
CN110047126B (en) Method, apparatus, electronic device, and computer-readable storage medium for rendering image
CN112150486B (en) Image processing method and device
CN111953907B (en) Composition method and device
CN113034611A (en) Operation evaluation method, operation evaluation device, electronic apparatus, and medium
CN114241127A (en) Panoramic image generation method and device, electronic equipment and medium
CN112561787A (en) Image processing method, image processing device, electronic equipment and storage medium
CN113407774A (en) Cover determining method and device, computer equipment and storage medium
CN112565605A (en) Image display method and device and electronic equipment
CN112381719B (en) Image processing method and device
CN112887621B (en) Control method and electronic device
CN112367468B (en) Image processing method and device and electronic equipment
CN113055599B (en) Camera switching method and device, electronic equipment and readable storage medium
CN112333388B (en) Image display method and device and electronic equipment
CN113489901B (en) Shooting method and device thereof
CN112367470B (en) Image processing method and device and electronic equipment
CN113703901A (en) Graphic code display method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
WW01 Invention patent application withdrawn after publication

Application publication date: 20210625
