CN112783413B - Image processing method and device and electronic equipment - Google Patents

Image processing method and device and electronic equipment

Info

Publication number
CN112783413B
CN112783413B
Authority
CN
China
Prior art keywords
image
input
processed
selection frame
adjusting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011624223.7A
Other languages
Chinese (zh)
Other versions
CN112783413A
Inventor
李家凯 (Li Jiakai)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Hangzhou Co Ltd
Original Assignee
Vivo Mobile Communication Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Hangzhou Co Ltd
Priority to CN202011624223.7A
Publication of CN112783413A
Application granted
Publication of CN112783413B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Editing Of Facsimile Originals (AREA)

Abstract

The application discloses an image processing method, an image processing device and electronic equipment, and belongs to the technical field of communication. The method comprises: receiving a first input of an image to be processed; in response to the first input, displaying a region selection frame for selecting an image region of the image to be processed and an adjustment control for adjusting the region selection frame; receiving a second input to the adjustment control; and, in response to the second input, adjusting the size of the region selection frame with a target adjustment precision and determining the adjusted image in the region selection frame as a target image to be saved. According to the image processing method, the region selection frame is adjusted with the target adjustment precision by means of the adjustment control, so the adjustment precision of the region selection frame can be guaranteed, which solves the current problem of poor precision when the size is adjusted by manually dragging the region selection frame directly.

Description

Image processing method and device and electronic equipment
Technical Field
The application belongs to the technical field of communication, and particularly relates to an image processing method and device and electronic equipment.
Background
At present, electronic devices with screens generally support touch-screen operation, for example the touch display screen of a mobile phone. Most functions of a mobile phone rely on finger operation of the touch screen: opening applications, dragging applications, editing images and so on are all performed by tapping and dragging with a finger. Ordinary finger touch operation can meet most of a user's operation requirements. However, in some scenarios the following problem exists: for operations that require higher fineness, it is difficult to perform fine adjustment because of the thickness of the operator's fingers and the limited stability of finger movement. In particular, when a user crops an image, or uses a tool such as a scanning-related tool and needs to select a certain part of an image, the operation is currently completed by dragging a selection frame with a finger and adjusting it to a suitable position. Because the operator's finger blocks the line of sight and finger movement is not perfectly stable, it is difficult to drag the selection frame steadily over a small distance when fine-tuning it, so fine adjustment and selection cannot be completed.
Summary of the application
The embodiments of the application aim to provide an image processing method, an image processing device, an electronic device and a computer program product, which can solve the problem that, during frame-selection editing of an image, fine adjustment and selection are difficult to complete because the operator's finger blocks the line of sight and finger operation is not sufficiently stable.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an image processing method, where the method includes:
receiving a first input of an image to be processed;
in response to the first input, displaying a region selection frame for selecting an image region of the image to be processed and an adjustment control for adjusting the region selection frame;
receiving a second input to the adjustment control;
and, in response to the second input, adjusting the size of the region selection frame with a target adjustment precision, and determining the adjusted image in the region selection frame as a target image to be saved.
In a second aspect, an embodiment of the present application provides an apparatus for image processing, including:
the first receiving module is used for receiving a first input of an image to be processed;
the first response module is used for responding to the first input, displaying a region selection frame for selecting an image region of the image to be processed and an adjusting control for adjusting the region selection frame;
a second receiving module, configured to receive a second input to the adjustment control;
and the second response module is used for responding to the second input, adjusting the size of the area selection frame with target adjustment precision, and determining the adjusted image in the area selection frame as a target image for storage.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
According to the image processing method provided by the embodiments of the application, the region selection frame is adjusted with the target adjustment precision by means of the adjustment control, so the adjustment precision of the region selection frame can be guaranteed, which solves the current problem of poor precision when the size is adjusted by manually dragging the region selection frame directly.
Drawings
FIG. 1 is a schematic diagram illustrating steps of an image processing method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an image processing method according to an embodiment of the present application;
FIG. 3 is a second schematic diagram of an image processing method according to an embodiment of the present application;
FIG. 4 is a third schematic diagram of an image processing method according to an embodiment of the present application;
FIG. 5 is a fourth schematic diagram of an image processing method according to an embodiment of the present application;
FIG. 6 is a block diagram of an image processing apparatus according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 8 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
The terms "first", "second" and the like in the description and in the claims of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application can be implemented in sequences other than those illustrated or described herein. In addition, "and/or" in the description and claims means at least one of the connected objects, and the character "/" generally indicates that the objects before and after it are in an "or" relationship.
The image processing method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings by using specific embodiments and application scenarios thereof.
As shown in fig. 1, an embodiment of the present application provides an image processing method, including:
step 101, receiving a first input of an image to be processed;
step 102, in response to the first input, displaying a region selection frame for selecting an image region of the image to be processed and an adjustment control for adjusting the region selection frame;
step 103, receiving a second input to the adjusting control;
step 104, in response to the second input, adjusting the size of the region selection frame with a target adjustment precision, and determining the adjusted image in the region selection frame as a target image to be saved.
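For ease of understanding only, the following Kotlin sketch models steps 101 to 104 as plain data and functions; the class names, field names and the simple corner-based resize are illustrative assumptions and are not taken from the embodiment.

```kotlin
// Minimal sketch of steps 101-104. Class and function names are illustrative
// assumptions, not the patented implementation.
data class Frame(var left: Float, var top: Float, var right: Float, var bottom: Float)

class RegionSelector {
    var selectionFrame: Frame? = null   // region selection frame shown in step 102
    var adjustControlVisible = false    // adjustment control shown in step 102

    // Steps 101-102: a first input on the image to be processed displays the
    // selection frame and the adjustment control.
    fun onFirstInput(initialFrame: Frame) {
        selectionFrame = initialFrame
        adjustControlVisible = true
    }

    // Steps 103-104: a second input on the adjustment control resizes the frame
    // with the target adjustment precision (scale units moved : actual units = precision : 1).
    fun onAdjustControlInput(scaleDx: Float, scaleDy: Float, precision: Float): Frame? {
        val frame = selectionFrame ?: return null
        frame.right += scaleDx / precision   // e.g. precision 2: two scale units -> one unit
        frame.bottom += scaleDy / precision
        return frame                         // the image inside the adjusted frame is saved
    }
}

fun main() {
    val selector = RegionSelector()
    selector.onFirstInput(Frame(100f, 100f, 500f, 500f))
    println(selector.onAdjustControlInput(scaleDx = 2f, scaleDy = 2f, precision = 2f))
}
```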
Optionally, the displaying of a region selection frame for selecting an image region of the image to be processed and an adjustment control for adjusting the region selection frame includes:
displaying the area selection frame on the image to be processed, and displaying an adjusting control for adjusting the area selection frame in the area selection frame; or
Displaying the area selection frame on the image to be processed, and displaying an adjusting control for adjusting the area selection frame in a target area of the electronic equipment, wherein the target area is an area on a screen of the electronic equipment except for the display area of the image to be processed.
For example, as shown in fig. 2 and fig. 3, the area selection frame 2 is displayed on the image to be processed 1, and the adjustment control is displayed on the area selection frame 2 or on the part of the display screen below the image to be processed 1.
optionally, the adjustment control may be displayed in a semi-transparent state, for example, the corresponding transparency of the semi-transparent state is lower than the corresponding transparency of the image to be processed or the region selection box, so as to avoid that the adjustment control affects the display effect of the image to be processed or the region selection box.
According to the image processing method, the region selection frame is adjusted with the target adjustment precision through the setting of the adjustment control, the adjustment precision of the region selection frame can be guaranteed, and the problem that the precision is poor when the size is adjusted directly through manually operating the region selection frame at present is solved.
Optionally, the first input comprises at least one of:
inputting a first operation point of an image area of the image to be processed;
and performing trigger operation input of image editing on the image to be processed.
Optionally, as an implementation manner, the input of the first operation point on the image area of the image to be processed includes: a long-press or re-press input on a position point of the first area selection frame displayed on the image to be processed.
It should be noted that the long-press input includes a click input whose click duration is greater than or equal to a preset duration; the re-press input includes a click input with a click pressure greater than or equal to a preset pressure.
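The note above can be expressed as a small classification routine; the sketch below is an illustration only, and the preset duration and preset pressure values are invented for the example.

```kotlin
// Illustrative classification of the first input based on the thresholds described
// above; the concrete threshold values and type names are assumptions.
enum class PressType { LONG_PRESS, RE_PRESS, PLAIN_CLICK }

fun classifyPress(
    durationMs: Long,
    pressure: Float,
    presetDurationMs: Long = 500L,   // preset duration (assumed value)
    presetPressure: Float = 0.8f     // preset pressure (assumed value)
): PressType = when {
    pressure >= presetPressure -> PressType.RE_PRESS        // click pressure >= preset pressure
    durationMs >= presetDurationMs -> PressType.LONG_PRESS  // click duration >= preset duration
    else -> PressType.PLAIN_CLICK
}

fun main() {
    println(classifyPress(durationMs = 650, pressure = 0.3f))  // LONG_PRESS
    println(classifyPress(durationMs = 120, pressure = 0.9f))  // RE_PRESS
}
```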
Optionally, as an implementation manner, the trigger operation input for editing the image to be processed includes: an operation input for cropping or frame selection of the image to be processed.
As shown in fig. 2, when the area selection frame 2 is already displayed and the display screen of the electronic device is long-pressed or re-pressed, the adjustment control is displayed directly on the area selection frame 2 or below the image to be processed 1.
For example, as shown in fig. 2 and fig. 3, when the user long-presses or re-presses the area selection frame 2, the area selection frame 2 is displayed on the image to be processed 1, and an adjustment control is displayed on the area selection frame 2 or below the image to be processed 1; alternatively,
when the user performs an image-cropping or frame-selection input, an adjustment control is displayed on the area selection frame 2 or below the image to be processed 1, and a plurality of zoom points on the area selection frame are displayed at the same time.
It should be noted that, when the first input is an input of a first operation point on an image area of the image to be processed, the displaying, in response to the first input, of an area selection frame for selecting the image area of the image to be processed includes:
displaying the first area selection frame on the image to be processed as the area selection frame; or
displaying a second area selection frame on the image to be processed as the area selection frame;
wherein the second area selection frame is a selection frame different from the first area selection frame.
According to the image processing method, the region selection frame is adjusted with the target adjustment precision through the setting of the adjustment control, the adjustment precision of the region selection frame can be guaranteed, and the problem that the precision is poor when the size is adjusted directly through manually operating the region selection frame at present is solved.
Optionally, the adjustment control includes an adjustment scale extending in multiple directions around one of the position points on the region selection frame.
As shown in fig. 2, the adjustment scale extends upward, downward, leftward and rightward around the origin of the target scale mark 41 in the adjustment control.
According to the image processing method, the region selection frame is adjusted with the target adjustment precision through the setting of the adjustment control, the adjustment precision of the region selection frame can be guaranteed, and the problem that the precision is poor when the size is adjusted directly through manually operating the region selection frame at present is solved.
Optionally, the method further comprises:
and determining the target adjusting precision on the adjusting scale according to the distance between the area selection frame and the image boundary of the image to be processed.
Here, the target adjustment precision is the ratio of the numerical value indicated by the target scale mark on the adjustment control to the target adjustment size.
As shown in fig. 2 or fig. 3, the target adjustment precision may be obtained according to the positional relationship between the first zoom point 31 on the area selection frame and the image to be processed 1, and the target adjustment precision in the horizontal direction and that in the vertical direction may be the same or different.
For one direction, for example, the horizontal direction or the vertical direction, the closer the first operation point is to the image boundary of the image to be processed, the larger the target adjustment precision value is.
For different directions, such as horizontal and vertical:
as shown in fig. 2 or fig. 3, for the horizontal direction, a first target adjustment precision is determined according to the distance from the first zoom point 31 on the area selection frame to the first image boundary 11 of the image to be processed 1;
for the vertical direction, a second target adjustment precision is determined according to the distance from the first zoom point 31 on the area selection frame to the third image boundary 13 of the image to be processed 1.
The first target adjustment precision and the second target adjustment precision may be the same or different.
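As a sketch of this per-direction determination, the mapping below turns a distance to the image boundary into a precision value; the concrete thresholds and precision levels are assumptions, since the embodiment only states that a smaller distance corresponds to a larger precision value.

```kotlin
// Illustrative sketch of determining the target adjustment precision per direction
// from the distance between a zoom point and the image boundary. The concrete
// thresholds and precision levels are assumptions.
fun precisionForDistance(distanceToBoundaryPx: Float): Float = when {
    distanceToBoundaryPx < 50f -> 4f   // very close to the boundary: finest control
    distanceToBoundaryPx < 200f -> 2f  // moderately close
    else -> 1f                         // far from the boundary: 1:1 adjustment
}

fun main() {
    val horizontalPrecision = precisionForDistance(distanceToBoundaryPx = 30f)   // e.g. to boundary 11
    val verticalPrecision = precisionForDistance(distanceToBoundaryPx = 400f)    // e.g. to boundary 13
    println("horizontal=$horizontalPrecision, vertical=$verticalPrecision")      // may differ per direction
}
```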
According to the image processing method provided by this embodiment, by providing the adjustment control, the target adjustment precision can be determined, during image cropping or frame selection, according to the positional relationship between the zoom point on the area selection frame and the image to be processed, which solves the current problem of poor precision when the size is adjusted by manually dragging the area selection frame directly.
Optionally, the method further comprises:
determining a second operation point on an image area of the image to be processed;
adjusting the size of the region selection box with a target adjustment accuracy in response to the second input, including:
and according to the target adjustment precision, taking the second operation point as an image adjustment starting point, and performing size adjustment on the image to be processed.
Here, after the second operation point on the image area of the image to be processed is determined, as shown in fig. 4, take the first zoom point 31 as the second operation point. When the target adjustment precision is 2, moving the first zoom point 31 or the target scale mark 41 upward by two scales and rightward by two scales moves the area selection frame 2 upward by one unit distance and rightward by one unit distance, with the position of the first zoom point 31 as the starting point. It should be noted that adjusting the operation point is adjusting the size of the area selection frame 2, and dragging the first zoom point 31 or the operation point in the target scale mark 41 can achieve the same effect.
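The 2x example above reduces to simple arithmetic; the following sketch only illustrates the ratio actual movement = scale movement / target adjustment precision and is not the embodiment's implementation.

```kotlin
// Illustrative arithmetic for the 2x example: dragging the zoom point or the target
// scale mark by two scale units moves the selection-frame corner by one unit distance,
// because actual movement = scale movement / target adjustment precision.
fun actualMovement(scaleUnitsMoved: Float, precision: Float): Float =
    scaleUnitsMoved / precision

fun main() {
    val precision = 2f
    val dxScale = 2f   // two scale units to the right
    val dyScale = 2f   // two scale units upward
    // Starting from the position of the zoom point taken as the second operation point:
    println("right by ${actualMovement(dxScale, precision)} unit(s)")  // 1.0
    println("up by ${actualMovement(dyScale, precision)} unit(s)")     // 1.0
}
```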
According to the image processing method provided by this embodiment, by providing the adjustment control, the target adjustment precision can be determined, during image cropping or frame selection, according to the positional relationship between the zoom point on the area selection frame and the image to be processed, which solves the current problem of poor precision when the size is adjusted by manually dragging the area selection frame directly.
Optionally, the determining a second operation point on the image area of the image to be processed includes:
and determining that the first operation point of the first input in the image area of the image to be processed is the second operation point.
Optionally, the adjustment control includes: an operation point selection control;
the determining a second operation point on an image area of the image to be processed includes:
and determining the second operation point according to a third input to the operation point selection control.
Optionally, the third input comprises:
an input to any one of a plurality of zoom point identifiers on the area selection frame; or
a selection input to a zoom point selection button in the adjustment control.
Optionally, when the first input is a trigger operation input for editing the image to be processed, the area selection frame is displayed and, at the same time, zoom point identifiers are displayed on the area selection frame, as shown in fig. 3: a first zoom point 31, a second zoom point 32, a third zoom point 33 and a fourth zoom point 34. Owing to the zoom point selection button 42 provided in the adjustment control, when it is difficult for the user to select a zoom point with a finger, the user can determine and accurately select the zoom point through a sliding input on the zoom point selection button 42, which solves the problem that a zoom point is difficult to select accurately by direct manual operation when the area selection frame is small.
For example, the third input may be a click input or a long-press input on any one of the plurality of zoom points on the area selection frame 2, or a slide input on the zoom point selection button 42, so as to determine the second zoom point 32. After the second zoom point 32 is determined, the area selection frame 2 is adjusted by a drag input on the second zoom point 32 or on the operation point in the target scale mark 41; that is, the position of the second zoom point 32 is adjusted with the target adjustment precision by dragging the second zoom point 32 or the target scale mark 41, with the position of the second zoom point 32 as the starting point.
The second operation point may be one of the intersections of the boundary lines of the area selection frame, or may be a point on a boundary line. When the second operation point is a point on a boundary line, the corresponding boundary line is taken as the adjustment starting point, that is, adjustment is performed in only one direction. For example, when the second operation point is a point on the boundary line in the horizontal direction, adjustment is made only in the horizontal direction; when the second operation point is a point on the boundary line in the vertical direction, only the vertical direction is adjusted.
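The one-direction restriction described above can be sketched as follows; the enum names merely restate the example in the preceding paragraph and the rest is an assumption for illustration.

```kotlin
// Illustrative sketch of restricting adjustment to one direction when the second
// operation point lies on a boundary line rather than on a corner; names and the
// direction mapping follow the description above and are otherwise assumptions.
enum class OperationPointKind { CORNER, HORIZONTAL_BOUNDARY, VERTICAL_BOUNDARY }

data class Adjustment(val dx: Float, val dy: Float)

fun restrictToDirection(kind: OperationPointKind, dx: Float, dy: Float): Adjustment = when (kind) {
    OperationPointKind.CORNER -> Adjustment(dx, dy)                // both directions
    OperationPointKind.HORIZONTAL_BOUNDARY -> Adjustment(dx, 0f)   // horizontal direction only
    OperationPointKind.VERTICAL_BOUNDARY -> Adjustment(0f, dy)     // vertical direction only
}

fun main() {
    println(restrictToDirection(OperationPointKind.HORIZONTAL_BOUNDARY, dx = 1f, dy = 1f))
    // Adjustment(dx=1.0, dy=0.0)
}
```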
Optionally, adjusting the size of the region selection box with the target adjustment accuracy includes:
receiving a fourth input to a precision selection button of the adjustment control, and determining the target adjustment precision in response to the fourth input.
Here, the fourth input may be a slide input for the accuracy selection button, and the target adjustment accuracy may be selected by switching between different adjustment accuracies through a slide of the accuracy selection button.
As shown in fig. 2, the zoom point to be operated can be determined through a sliding input on the zoom point selection button 42 of the adjustment control, and the required target adjustment precision is determined through a sliding input on the precision selection button 43 of the adjustment control.
The size of the area selection frame 2 is then determined by adjusting the target zoom point with the target adjustment precision, and finally the image area corresponding to the adjusted area selection frame 2 is determined as the target image area of the image to be processed 1.
Here, the fourth input includes a drag input on the target scale mark of the adjustment control. By dragging the target scale mark of the adjustment control, the size of the area selection frame can be adjusted correspondingly according to the target adjustment precision. As shown in fig. 4, dragging the target scale mark 41 of the adjustment control adjusts the size of the area selection frame 2 correspondingly, according to the drag of the operation point in the target scale mark 41 and the target adjustment precision. Further, as shown in fig. 5, when the target adjustment precision is 2, moving the target scale mark 41 up by two scales and right by two scales moves the area selection frame 2 up by one unit and right by one unit, with the position of the second zoom point 32 as the starting point.
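As an illustration of the fourth input, the sketch below lets a slide on the precision selection button switch between precision values and then applies a drag of the target scale mark; the list of selectable precisions is an assumption.

```kotlin
// Illustrative sketch of the fourth input: a slide on the precision selection button
// switches between adjustment precisions, and dragging the target scale mark then
// resizes the frame by (scale drag / selected precision). The list of selectable
// precisions is an assumption.
class PrecisionSelector(private val options: List<Float> = listOf(1f, 2f, 4f, 8f)) {
    private var index = 0
    val current: Float get() = options[index]

    // Each slide step on the precision selection button moves to the next precision.
    fun onSlide(steps: Int) {
        index = ((index + steps) % options.size + options.size) % options.size
    }
}

fun main() {
    val selector = PrecisionSelector()
    selector.onSlide(1)                       // switch from 1x to 2x
    val scaleDragUnits = 2f                   // target scale mark dragged by two scales
    val frameMoveUnits = scaleDragUnits / selector.current
    println("precision=${selector.current}, frame moves $frameMoveUnits unit(s)")  // 2.0, 1.0
}
```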
According to the image processing method, the target adjusting accuracy can be determined according to the position relation between the zooming point on the area selection frame and the image to be processed in the image cutting or framing process, the size of the area selection frame is correspondingly adjusted through the target scale mark on the adjusting control, and the problems that the size of the area selection frame is small, the area selection frame cannot be directly operated through fingers, and the size of the area selection frame cannot be accurately adjusted are solved.
It should be noted that the image processing method may further include:
receiving a fifth input;
in response to a fifth input, the display mode of the adjustment control is exited.
Here, the fifth input may include a double-click input or a long-press input with respect to the display screen of the electronic device.
The exit from the display mode of the adjustment control comprises:
quitting the adjusting function of the adjusting control and quitting the display mode of the adjusting control;
alternatively, the display of the adjustment control is hidden, but the adjustment function of the adjustment control remains active.
For example, after the cropping or frame selection of the image to be processed is completed, the adjustment function of the current adjustment control can be exited and the display of the adjustment control cancelled through a double-click input or a long-press input on the display screen of the electronic device; or, when the area selection frame is being adjusted and a slide-down input or a double-click input is received, the area selection frame is displayed in full screen, the adjustment function of the adjustment control can still be used, and the display of the adjustment control is cancelled, providing a better visual effect for the user.
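The two ways of leaving the display mode can be sketched as a small state update; the state fields and the input mapping are assumptions used only to illustrate the distinction between exiting the function and merely hiding the control.

```kotlin
// Illustrative sketch of the two ways of leaving the display mode of the adjustment
// control described above; the input mapping and state names are assumptions.
data class AdjustControlState(var visible: Boolean = true, var functionActive: Boolean = true)

fun onFifthInput(state: AdjustControlState, exitFunction: Boolean) {
    if (exitFunction) {
        // Quit the adjustment function and its display mode (e.g. after cropping is done).
        state.functionActive = false
        state.visible = false
    } else {
        // Hide only the display (e.g. full-screen preview); the adjustment function stays active.
        state.visible = false
    }
}

fun main() {
    val state = AdjustControlState()
    onFifthInput(state, exitFunction = false)
    println(state)  // AdjustControlState(visible=false, functionActive=true)
}
```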
Optionally, the receiving a second input to the adjustment control includes:
receiving a second input of a target scale mark of the adjusting control;
the adjusting the size of the region selection box with a target adjustment precision in response to the second input comprises:
responding to the second input, determining a target adjustment size according to the target scale mark and the target adjustment precision, and adjusting the size of the area selection frame according to the target adjustment size;
wherein the target adjustment precision is as follows: the target scale identifies the ratio of the indicated numerical value to the target adjustment size.
According to the image processing method, the target adjusting accuracy can be determined according to the position relation between the zooming point on the area selection frame and the image to be processed in the image cutting or framing process, the size of the area selection frame is correspondingly adjusted through the target scale mark on the adjusting control, and the problem that the size of the area selection frame is small, the area selection frame cannot be directly operated through fingers, and the size of the area selection frame cannot be accurately adjusted is solved.
It should be noted that, in the image processing method provided in the embodiments of the present application, the execution subject may be an image processing apparatus, or a control module in the image processing apparatus for executing the image processing method. In the embodiments of the present application, an image processing apparatus executing the image processing method is taken as an example to describe the image processing apparatus provided in the embodiments of the present application.
As shown in fig. 6, an embodiment of the present application further provides an image processing apparatus 600, including:
a first receiving module 601, configured to receive a first input of an image to be processed;
a first response module 602, configured to, in response to the first input, display a region selection box that selects an image region of the image to be processed, and an adjustment control that adjusts the region selection box;
a second receiving module 603, configured to receive a second input to the adjustment control;
a second response module 604, configured to, in response to the second input, adjust the size of the region selection frame with the target adjustment precision, and determine an image in the region selection frame after adjustment as a target image for saving.
According to the image processing device, the area selection frame is adjusted according to the target adjusting precision through the setting of the adjusting control, the adjusting precision of the area selection frame can be guaranteed, and the problem that the precision is poor when the size is adjusted directly through a manual operation area selection frame at present is solved.
Optionally, the first input comprises at least one of:
inputting a first operation point of an image area of the image to be processed;
and performing triggering operation input of image editing on the image to be processed.
Optionally, the adjustment control includes an adjustment scale extending in multiple directions around one of the position points on the region selection frame.
Optionally, the image processing apparatus 600 further includes:
the first determining module is used for determining the target adjusting precision on the adjusting scale according to the distance between the area selection frame and the image boundary of the image to be processed.
Optionally, the image processing apparatus 600 further includes:
the second determining module is used for determining a second operating point on the image area of the image to be processed;
the second responding module 604 is further configured to:
and according to the target adjusting precision, taking the second operation point as an image adjusting starting point, and adjusting the size of the image to be processed.
The second determining module includes:
a first determining unit, configured to determine that a first operation point of the first input on an image area of the image to be processed is the second operation point.
Optionally, the adjusting control includes an operation point selecting control;
the second determining module includes: and the second determining unit is used for determining the second operating point according to a third input to the operating point selection control.
According to the image processing device, the area selection frame is adjusted according to the target adjusting precision through the setting of the adjusting control, the adjusting precision of the area selection frame can be guaranteed, and the problem that the precision is poor when the size is adjusted directly through a manual operation area selection frame at present is solved.
The image processing apparatus in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiment of the present application is not particularly limited.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, and the embodiments of the present application are not specifically limited in this respect.
The image processing apparatus provided in the embodiment of the present application can implement each process implemented by the image processing apparatus in the method embodiments of fig. 1 to fig. 5, and is not described herein again to avoid repetition.
According to the image processing method, the region selection frame is adjusted with the target adjustment precision through the setting of the adjustment control, the adjustment precision of the region selection frame can be guaranteed, and the problem that the precision is poor when the size is adjusted directly through manually operating the region selection frame at present is solved.
Optionally, as shown in fig. 7, an electronic device 700 is further provided in the embodiment of the present application, and includes a processor 710, a memory 709, and a program or an instruction that is stored in the memory 709 and can be executed on the processor 710, where the program or the instruction is executed by the processor 710 to implement each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, the description is not repeated here.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 800 includes, but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, and a processor 810.
Those skilled in the art will appreciate that the electronic device 800 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 810 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system. The electronic device structure shown in fig. 8 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
Wherein, the user input unit 807 is configured to receive a first input to the image to be processed;
a processor 810 for responding to the first input;
a display unit 806, configured to display an area selection frame for selecting an image area of an image to be processed, and an adjustment control for adjusting the area selection frame;
a user input unit 807 for receiving a second input to the adjustment control;
The processor 810 is further configured to adjust, in response to the second input, the size of the region selection frame with a target adjustment precision, and determine an image in the region selection frame after adjustment as a target image for saving.
Optionally, the processor 810 is further configured to determine the target adjustment precision on the adjustment scale according to a distance between the region selection frame and an image boundary of the image to be processed;
optionally, the processor 810 is further configured to determine a second operation point on an image area of the image to be processed;
adjusting the size of the region selection box with a target adjustment accuracy in response to the second input, including:
and according to the target adjusting precision, taking the second operation point as an image adjusting starting point, and adjusting the size of the image to be processed.
Optionally, the processor 810 is further configured to determine that the first operation point of the first input on the image area of the image to be processed is the second operation point.
Optionally, the processor 810 is further configured to determine the second operation point according to a third input to the operation point selection control.
The electronic device provided by the embodiment of the application adjusts the region selection frame with the target adjustment precision by means of the adjustment control, which can guarantee the adjustment precision of the region selection frame and solves the current problem of poor precision when the size is adjusted by manually dragging the region selection frame directly.
The embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction implements the processes of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the image processing method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system or a system-on-chip, etc.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of another identical element in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may also include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application or portions thereof that contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. An image processing method, characterized by comprising:
receiving a first input of an image to be processed;
responding to the first input, displaying a region selection frame for selecting an image region of the image to be processed and an adjusting control for adjusting the region selection frame; the region selection frame is used for cutting or framing at least part of the image to be processed;
receiving a second input to the adjustment control;
determining target adjustment accuracy according to the distance between the region selection frame and the image boundary of the image to be processed, wherein the target adjustment accuracy is the same or different in the horizontal direction and the vertical direction of the region selection frame;
in response to the second input, adjusting the size of the area selection frame with the target adjustment precision, and determining the adjusted image in the area selection frame as a target image for saving;
the adjustment control comprises an operating point selection control;
and determining a second operation point according to a third input to the operation point selection control.
2. The image processing method of claim 1, wherein the first input comprises at least one of:
inputting a first operation point of an image area of the image to be processed;
and performing triggering operation input of image editing on the image to be processed.
3. The method according to claim 1, wherein the adjustment control includes an adjustment scale extending in a plurality of directions with a position point on the region selection frame as a center.
4. The method of claim 3, further comprising:
determining a second operation point on an image area of the image to be processed;
in response to the second input, adjusting the size of the region selection box with a target adjustment precision, comprising:
and according to the target adjustment precision, taking the second operation point as an image adjustment starting point, and performing size adjustment on the image to be processed.
5. The image processing method according to claim 4, wherein the determining a second operation point on the image area of the image to be processed comprises:
determining that a first operation point of the first input on an image area of the image to be processed is the second operation point.
6. An image processing apparatus characterized by comprising:
the first receiving module is used for receiving a first input of an image to be processed;
the first response module is used for responding to the first input, displaying a region selection frame for selecting an image region of the image to be processed and an adjusting control for adjusting the region selection frame; the region selection frame is used for cutting or framing at least part of the image to be processed;
a second receiving module, configured to receive a second input to the adjustment control;
the first determining module is used for determining the target adjusting precision according to the distance between the area selection frame and the image boundary of the image to be processed;
the second response module is used for responding to the second input, adjusting the size of the area selection frame with the target adjustment precision, and determining the adjusted image in the area selection frame as a target image to be stored;
the adjustment control comprises an operation point selection control;
a second determination module to:
and determining a second operation point according to a third input to the operation point selection control.
7. The image processing apparatus of claim 6, wherein the first input comprises at least one of:
inputting a first operation point of an image area of the image to be processed;
and performing trigger operation input of image editing on the image to be processed.
8. The image processing apparatus according to claim 6, wherein the adjustment control includes an adjustment scale extending in a plurality of directions centering on one of the position points on the region selection frame.
9. The image processing apparatus according to claim 8, wherein the second determining module is further configured to:
determining a second operation point on an image area of the image to be processed;
the second response module is further configured to:
and according to the target adjusting precision, taking the second operation point as an image adjusting starting point, and adjusting the size of the image to be processed.
10. The image processing apparatus according to claim 8, wherein the second determination module includes:
a first determining unit, configured to determine that a first operation point of the first input on an image area of the image to be processed is the second operation point.
11. An electronic device, comprising a processor, a memory and a program or instructions stored on the memory and executable on the processor, which program or instructions, when executed by the processor, implement the steps of the image processing method according to any one of claims 1 to 5.
12. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the image processing method according to any one of claims 1 to 5.
CN202011624223.7A 2020-12-31 2020-12-31 Image processing method and device and electronic equipment Active CN112783413B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011624223.7A CN112783413B (en) 2020-12-31 2020-12-31 Image processing method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011624223.7A CN112783413B (en) 2020-12-31 2020-12-31 Image processing method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN112783413A CN112783413A (en) 2021-05-11
CN112783413B 2023-01-24

Family

ID=75754441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011624223.7A Active CN112783413B (en) 2020-12-31 2020-12-31 Image processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112783413B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104217397A (en) * 2014-08-21 2014-12-17 深圳市金立通信设备有限公司 Image processing method
CN105205780A (en) * 2015-10-19 2015-12-30 新华瑞德(北京)网络科技有限公司 Picture cropping method and device
CN110032328A (en) * 2018-12-06 2019-07-19 阿里巴巴集团控股有限公司 A kind of size adjustment method and device of operation object
KR20190125855A (en) * 2018-04-30 2019-11-07 삼성전자주식회사 Image display device and operating method for the same
CN111957041A (en) * 2020-09-07 2020-11-20 网易(杭州)网络有限公司 Map viewing method in game, terminal, electronic equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106791062A (en) * 2016-12-12 2017-05-31 努比亚技术有限公司 A kind of sectional drawing terminal and screenshot method
CN111158561A (en) * 2019-12-03 2020-05-15 深圳传音控股股份有限公司 Intelligent terminal, view clipping method and computer readable storage medium

Also Published As

Publication number Publication date
CN112783413A (en) 2021-05-11

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant