CN111986207A - Picture cropping method and terminal - Google Patents

Picture cropping method and terminal

Info

Publication number
CN111986207A
Authority
CN
China
Prior art keywords
engine
cropping
target
image
clipping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910438724.7A
Other languages
Chinese (zh)
Other versions
CN111986207B (en)
Inventor
孙鹏飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910438724.7A priority Critical patent/CN111986207B/en
Publication of CN111986207A publication Critical patent/CN111986207A/en
Application granted granted Critical
Publication of CN111986207B publication Critical patent/CN111986207B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image

Abstract

The invention provides a picture cropping method and a terminal, wherein the method comprises the following steps: receiving a picture to be cropped input by a user; and performing cropping processing on the picture to be cropped according to a target cropping pickup engine to obtain a target picture, wherein the target cropping pickup engine is a cropping pickup engine constructed based on target cropping factors, and the target cropping factors are generated iteratively from cropping behaviors. In the embodiments of the invention, the picture to be cropped is processed by a target cropping pickup engine constructed based on users' cropping behaviors, so that automatic cropping of the picture is realized.

Description

Picture cropping method and terminal
Technical Field
The invention relates to the technical field of communication, in particular to a picture cropping method and a terminal.
Background
In current online promotion, rich media advertisements occupy a dominant position, and pictures account for a large proportion of such advertisements. Pictures used for promotion generally need to be cropped to meet promotion requirements, and at present this cropping is usually performed manually by the user. Manual cropping is time-consuming and labor-intensive, and easily leads to user churn. The prior art therefore suffers from a cumbersome cropping process.
Disclosure of Invention
Embodiments of the present invention provide a picture cropping method and a terminal, so as to solve the problem that the cropping process in the prior art is cumbersome.
In a first aspect, an embodiment of the present invention provides a method for cropping a picture, including:
receiving a picture to be cropped input by a user;
performing cropping processing on the picture to be cropped according to a target cropping pickup engine to obtain a target picture; wherein the target cropping pickup engine is a cropping pickup engine constructed based on target cropping factors, and the target cropping factors are generated iteratively from cropping behaviors.
In a second aspect, an embodiment of the present invention further provides a terminal, where the terminal includes:
a receiving module, configured to receive a picture to be cropped input by a user;
a cropping module, configured to perform cropping processing on the picture to be cropped according to a target cropping pickup engine to obtain a target picture; wherein the target cropping pickup engine is a cropping pickup engine constructed based on target cropping factors, and the target cropping factors are generated iteratively from cropping behaviors.
In a third aspect, an embodiment of the present invention further provides a terminal, which includes a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the above-mentioned picture cropping method.
In a fourth aspect, the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the above-mentioned picture cropping method.
In the embodiments of the invention, the picture to be cropped is processed by a target cropping pickup engine constructed based on users' cropping behaviors, so that automatic cropping of the picture is realized.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
FIG. 1 is a flowchart of a method for cropping a picture according to an embodiment of the present invention;
FIG. 2 is a second flowchart of a method for cropping a picture according to an embodiment of the present invention;
FIG. 3 is a third flowchart of a method for cropping a picture according to an embodiment of the present invention;
FIG. 4 is a fourth flowchart of a method for cropping a picture according to an embodiment of the present invention;
FIG. 5 is a block diagram of a terminal according to an embodiment of the present invention;
FIG. 6 is a block diagram of a terminal according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a picture cropping method according to an embodiment of the present invention, as shown in fig. 1, including the following steps:
Step 101, receiving a picture to be cropped input by a user;
in the embodiment of the present invention, the picture to be cut may be one or more pictures, for example, an operation interface may be provided for a user to input the picture to be cut, the user generally refers to an advertisement publisher, and the picture to be cut may be a picture that is processed by the user in advance, for example, a picture that is formed after a photographed picture is processed, or a picture that is formed after a network picture or a screenshot is processed.
Step 102, performing cropping processing on the picture to be cropped according to a target cropping pickup engine to obtain a target picture; wherein the target cropping pickup engine is a cropping pickup engine constructed based on target cropping factors, and the target cropping factors are generated iteratively from cropping behaviors.
In an optional embodiment, the user may select a cropping mode: the picture to be cropped may be cropped manually in a manual cropping mode, or cropped automatically in an automatic cropping mode. Specifically, in the automatic cropping mode, the target cropping pickup engine may be used to perform cropping processing on the picture to be cropped to obtain the target picture.
The target picture may have any of a variety of styles; for example, in this embodiment, the target picture may have a PC, wise, bar, or clue style, or the like.
Specifically, the target picture may include a picture of one style, or may include pictures of multiple styles, which is not further limited herein.
The target cropping pickup engine may include one or more cropping pickup sub-engines, each corresponding to one style. A given cropping pickup sub-engine A may be a cropping pickup engine constructed based on a cropping factor A, where the cropping factor A is obtained by iteratively processing the cropping behaviors with which multiple users cropped pictures into pictures of a style A, and the style A corresponds to the cropping pickup sub-engine A.
For example, user A crops picture 1 through cropping behavior 1 to obtain a picture of style 1, and crops picture 1 through cropping behavior 2 to obtain a picture of style 2; user B crops picture 2 through cropping behavior 3 to obtain a picture of style 1, and crops picture 3 through cropping behavior 4 to obtain a picture of style 2; user C crops picture 4 through cropping behavior 5 to obtain a picture of style 1, and crops picture 4 through cropping behavior 6 to obtain a picture of style 2.
In this way, by iteratively processing cropping behaviors 1, 3 and 5, a cropping factor can be obtained, and based on that cropping factor the cropping pickup sub-engine 1 corresponding to style 1 can be constructed. Similarly, by iteratively processing cropping behaviors 2, 4 and 6, another cropping factor can be obtained, and the cropping pickup sub-engine 2 corresponding to style 2 can be constructed based on it.
Cropping the picture to be cropped with cropping pickup sub-engine 1 yields a picture of style 1; cropping it with cropping pickup sub-engine 2 yields a picture of style 2.
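The following Python sketch is only an illustration of the example above, under assumptions the patent does not specify: each cropping behavior is represented as a normalized crop rectangle, and the "iterative processing" of behaviors into a cropping factor is modeled as a running average of those rectangles. The class names, rectangle values, and the use of the Pillow library are likewise illustrative assumptions, not the patent's actual implementation.

```python
# Minimal sketch, assuming a cropping behavior is a normalized crop rectangle and a
# cropping factor is the running average of the rectangles recorded for one style.
from dataclasses import dataclass
from PIL import Image

@dataclass
class CroppingBehavior:
    style: str       # e.g. "style 1", "style 2"
    rect: tuple      # (left, top, right, bottom), normalized to [0, 1]

def iterate_cropping_factor(behaviors):
    """Iteratively fold the cropping behaviors of one style into a single factor."""
    factor = None
    for n, b in enumerate(behaviors, start=1):
        if factor is None:
            factor = list(b.rect)
        else:
            # running-average update: one assumed form of "iterative generation"
            factor = [f + (r - f) / n for f, r in zip(factor, b.rect)]
    return tuple(factor)

class CroppingPickupSubEngine:
    """Sub-engine for one style, constructed from that style's cropping factor."""
    def __init__(self, style, factor):
        self.style, self.factor = style, factor

    def crop(self, picture: Image.Image) -> Image.Image:
        w, h = picture.size
        l, t, r, b = self.factor
        return picture.crop((int(l * w), int(t * h), int(r * w), int(b * h)))

# Behaviors 1, 3, 5 correspond to style 1; behaviors 2, 4, 6 to style 2 (values are illustrative).
behaviors = [
    CroppingBehavior("style 1", (0.05, 0.10, 0.95, 0.60)),   # behavior 1, user A
    CroppingBehavior("style 2", (0.20, 0.00, 0.80, 1.00)),   # behavior 2, user A
    CroppingBehavior("style 1", (0.00, 0.12, 0.90, 0.58)),   # behavior 3, user B
    CroppingBehavior("style 2", (0.18, 0.02, 0.82, 0.98)),   # behavior 4, user B
    CroppingBehavior("style 1", (0.04, 0.08, 0.96, 0.62)),   # behavior 5, user C
    CroppingBehavior("style 2", (0.22, 0.00, 0.78, 1.00)),   # behavior 6, user C
]

sub_engines = {}
for style in ("style 1", "style 2"):
    factor = iterate_cropping_factor([b for b in behaviors if b.style == style])
    sub_engines[style] = CroppingPickupSubEngine(style, factor)

picture_to_crop = Image.new("RGB", (1200, 800))   # stand-in for the user's picture
target_pictures = {s: e.crop(picture_to_crop) for s, e in sub_engines.items()}
```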
In the embodiments of the invention, the picture to be cropped is processed by a target cropping pickup engine constructed based on users' cropping behaviors, so that automatic cropping of the picture is realized. Manual cropping is therefore no longer needed, which can improve advertisement promotion efficiency.
Further, referring to fig. 2, based on the foregoing embodiment, in this embodiment, the foregoing step 102 includes:
Step 1021, performing cropping processing on the picture to be cropped according to a first cropping pickup engine or a second cropping pickup engine to obtain the target picture;
wherein the cropping factor corresponding to the first cropping pickup engine is generated iteratively from the cropping behaviors of different users, and the cropping factor corresponding to the second cropping pickup engine is generated iteratively from the cropping behavior of the user.
It should be noted that the manner of constructing the cropping pickup engine may be set according to actual needs; for example, in this embodiment, the construction of the first cropping pickup engine and the second cropping pickup engine is described in detail.
Specifically, referring to fig. 3, in an optional embodiment, before receiving the picture to be cropped input by the user, the method further includes:
Step 103, acquiring first historical cropping behaviors of different users for pictures of each style;
Step 104, iterating the first historical cropping behaviors to generate a cropping factor corresponding to each style;
Step 105, constructing a cropping pickup sub-engine corresponding to each style according to the cropping factor corresponding to each style;
wherein the first cropping pickup engine comprises the cropping pickup sub-engine corresponding to each style.
In this embodiment, the cropping pickup sub-engine corresponding to each style may be constructed according to the first historical cropping behaviors corresponding to that style, and the first cropping pickup engine may include cropping pickup sub-engines corresponding to one or more styles. When the first cropping pickup engine is used to crop the picture to be cropped, pictures of one or more styles can be obtained; specifically, when the first cropping pickup engine includes multiple cropping pickup sub-engines, the obtained target pictures include pictures of the styles corresponding to those sub-engines. Since cropped pictures of multiple styles can be obtained in a single pass, the cropping operation is simplified. Specifically, the first historical cropping behaviors are cropping behaviors generated when multiple different users manually crop pictures of each style, and a cropping behavior may include the area or location of the crop within the picture.
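As an illustration of steps 103 to 105, the sketch below builds a first cropping pickup engine from the first historical cropping behaviors of several users. The data shapes, the running-average cropping factor, and the Pillow-based crop are assumptions carried over from the earlier sketch, not details given in the patent.

```python
# Minimal sketch of steps 103-105: group many users' cropping behaviors by style,
# iterate each group into a cropping factor, and keep one sub-engine per style so
# that a single call yields a target picture for every style at once.
from collections import defaultdict
from PIL import Image

def iterate_cropping_factor(rects):
    """Running-average fold of normalized crop rectangles (an assumed factor form)."""
    factor = None
    for n, rect in enumerate(rects, start=1):
        factor = list(rect) if factor is None else [f + (r - f) / n for f, r in zip(factor, rect)]
    return tuple(factor)

class FirstCroppingPickupEngine:
    def __init__(self, first_historical_behaviors):
        # first_historical_behaviors: list of (user_id, style, normalized_rect) tuples
        by_style = defaultdict(list)
        for _user, style, rect in first_historical_behaviors:
            by_style[style].append(rect)
        # steps 104-105: one cropping factor, hence one sub-engine, per style
        self.factors = {style: iterate_cropping_factor(rects) for style, rects in by_style.items()}

    def crop(self, picture: Image.Image) -> dict:
        """Return a cropped target picture for every style in a single pass."""
        w, h = picture.size
        return {
            style: picture.crop((int(l * w), int(t * h), int(r * w), int(b * h)))
            for style, (l, t, r, b) in self.factors.items()
        }

first_engine = FirstCroppingPickupEngine([
    ("user A", "PC",   (0.05, 0.10, 0.95, 0.60)),
    ("user B", "PC",   (0.00, 0.12, 0.90, 0.58)),
    ("user C", "wise", (0.20, 0.00, 0.80, 1.00)),
])
all_styles = first_engine.crop(Image.new("RGB", (1200, 800)))
```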
In another alternative embodiment, referring to fig. 4, before receiving the picture to be cropped input by the user, the method further includes:
Step 106, acquiring a second historical cropping behavior of the user for pictures of a first target style;
Step 107, iterating the second historical cropping behavior to generate a cropping factor corresponding to the first target style;
Step 108, constructing a cropping pickup sub-engine corresponding to the first target style according to the cropping factor corresponding to the first target style;
wherein the second cropping pickup engine comprises at least one cropping pickup sub-engine corresponding to a style.
In this embodiment, when the user selects the cropping mode, if the manual cropping mode is selected, the picture to be cropped may be displayed on a cropping interface together with a cropping toolbar, and the user may crop the picture manually with the cropping tool. The user's current cropping behavior is then recorded and stored. Specifically, when the number of pictures of a certain style obtained by the user's manual cropping is greater than a preset value, the cropping behaviors corresponding to that style may be iterated to obtain the cropping factor corresponding to the style, and the cropping pickup sub-engine corresponding to the style is then constructed based on that cropping factor. Since the second cropping pickup engine is constructed based on the user's own manual cropping behavior, the cropping can be better tailored to the user, and satisfaction with automatic cropping can be improved. It should be understood that the order in which the first and second cropping pickup engines are constructed is not particularly limited; that is, steps 103 to 105 may be performed either before or after steps 106 to 108, and no further limitation is made herein.
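The sketch below illustrates steps 106 to 108 together with the threshold just described. The preset value, the data shapes, and the way recorded behaviors are folded into a cropping factor are all assumptions for illustration; the patent does not specify them.

```python
# Minimal sketch of steps 106-108, assuming the terminal records the user's manual
# crops per style and builds that style's sub-engine once a preset count is exceeded.
from collections import defaultdict

PRESET_COUNT = 5   # assumed threshold for "greater than a preset value"

def iterate_cropping_factor(rects):
    """Running-average fold of normalized crop rectangles (same assumption as above)."""
    factor = None
    for n, rect in enumerate(rects, start=1):
        factor = list(rect) if factor is None else [f + (r - f) / n for f, r in zip(factor, rect)]
    return tuple(factor)

class SecondCroppingPickupEngine:
    def __init__(self):
        self._second_historical_behaviors = defaultdict(list)   # style -> crop rectangles
        self.sub_engine_factors = {}                             # style -> cropping factor

    def record_manual_crop(self, style, rect):
        """Called each time the user finishes a manual crop in the cropping interface."""
        self._second_historical_behaviors[style].append(rect)
        if len(self._second_historical_behaviors[style]) > PRESET_COUNT:
            # steps 107-108: iterate the recorded behaviors into a factor and
            # (re)build the sub-engine for this first target style
            self.sub_engine_factors[style] = iterate_cropping_factor(
                self._second_historical_behaviors[style]
            )

    def has_sub_engine(self, style):
        return style in self.sub_engine_factors

    def crop(self, picture, style):
        """Crop a PIL image with the sub-engine factor built for the given style."""
        w, h = picture.size
        l, t, r, b = self.sub_engine_factors[style]
        return picture.crop((int(l * w), int(t * h), int(r * w), int(b * h)))
```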
Further, if the system includes both the first cropping pickup engine and the second cropping pickup engine, the manner of selecting between them may be set according to actual needs. For example, in this embodiment, step 1021 specifically includes:
Step 10211, receiving a cropping instruction input by the user;
Step 10212, if the cropping instruction does not include a second target style selected by the user, or the cropping instruction includes the second target style but the second cropping pickup engine does not include a cropping pickup sub-engine corresponding to the second target style, performing cropping processing on the picture to be cropped according to the first cropping pickup engine to obtain a first target picture;
Step 10213, if the cropping instruction includes the second target style and the second cropping pickup engine includes the cropping pickup sub-engine corresponding to the second target style, performing cropping processing on the picture to be cropped according to that cropping pickup sub-engine to obtain a second target picture.
In this embodiment, the target cropping pickup engine used for cropping can be determined by judging whether the user has selected a picture of a specific style. When the user selects a cropping style, it can be judged whether the second cropping pickup engine includes a cropping pickup sub-engine corresponding to that style; if such a sub-engine exists, cropping is performed with it, and if not, the picture to be cropped is cropped with the first cropping pickup engine. In addition, when the user does not select a cropping style, cropping is performed with the first cropping pickup engine.
It should be noted that in another alternative embodiment, the available cropping style options may be offered according to the styles corresponding to the cropping pickup sub-engines included in the second cropping pickup engine. In that case, when the user selects a cropping style, cropping is performed with the second cropping pickup engine; when the user does not select a cropping style, cropping is performed with the first cropping pickup engine.
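A minimal dispatch sketch of steps 10211 to 10213 follows, using the engine interfaces assumed in the earlier sketches; the dictionary-shaped cropping instruction and the has_sub_engine helper are illustrative assumptions, not part of the patent.

```python
# Minimal sketch of the engine selection in steps 10211-10213: use the second
# engine's sub-engine when the instruction names a style it covers, otherwise
# fall back to the first engine (which returns every style it knows).
def crop_with_target_engine(picture, cropping_instruction, first_engine, second_engine):
    style = cropping_instruction.get("second_target_style")   # None if no style was selected
    if style is not None and second_engine.has_sub_engine(style):
        # step 10213: the sub-engine built from the user's own behavior exists
        return {style: second_engine.crop(picture, style)}
    # step 10212: no style selected, or the second engine has no matching sub-engine
    return first_engine.crop(picture)
```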
It should be noted that, various optional implementations described in the embodiments of the present invention may be implemented in combination with each other or implemented separately, and the embodiments of the present invention are not limited thereto.
Referring to fig. 5, fig. 5 is a structural diagram of a terminal according to an embodiment of the present invention, and as shown in fig. 5, a terminal 500 includes:
a receiving module 501, configured to receive a picture to be cropped input by a user;
a cropping module 502, configured to perform cropping processing on the picture to be cropped according to a target cropping pickup engine to obtain a target picture; wherein the target cropping pickup engine is a cropping pickup engine constructed based on target cropping factors, and the target cropping factors are generated iteratively from cropping behaviors.
Optionally, the cropping module 502 is specifically configured to: perform cropping processing on the picture to be cropped according to a first cropping pickup engine or a second cropping pickup engine to obtain the target picture; wherein the cropping factor corresponding to the first cropping pickup engine is generated iteratively from the cropping behaviors of different users, and the cropping factor corresponding to the second cropping pickup engine is generated iteratively from the cropping behavior of the user.
Optionally, the terminal 500 further includes:
a first acquisition module, configured to acquire a second historical cropping behavior of the user for pictures of a first target style;
a first iteration module, configured to iterate the second historical cropping behavior to generate a cropping factor corresponding to the first target style;
a first construction module, configured to construct a cropping pickup sub-engine corresponding to the first target style according to the cropping factor corresponding to the first target style;
wherein the second cropping pickup engine comprises at least one cropping pickup sub-engine corresponding to a style.
Optionally, the cropping module 502 includes:
a receiving unit, configured to receive a cropping instruction input by the user;
a processing unit, configured to: perform cropping processing on the picture to be cropped according to the first cropping pickup engine to obtain a first target picture if the cropping instruction does not include a second target style selected by the user, or if the cropping instruction includes the second target style but the second cropping pickup engine does not include a cropping pickup sub-engine corresponding to the second target style; and, if the cropping instruction includes the second target style and the second cropping pickup engine includes the cropping pickup sub-engine corresponding to the second target style, perform cropping processing on the picture to be cropped according to that cropping pickup sub-engine to obtain a second target picture.
Optionally, the terminal 500 further includes:
a second acquisition module, configured to acquire first historical cropping behaviors of different users for pictures of each style;
a second iteration module, configured to iterate the first historical cropping behaviors to generate a cropping factor corresponding to each style;
a second construction module, configured to construct a cropping pickup sub-engine corresponding to each style according to the cropping factor corresponding to each style;
wherein the first cropping pickup engine comprises the cropping pickup sub-engine corresponding to each style.
The terminal provided by the embodiment of the present invention can implement each process implemented by the terminal in the method embodiments of fig. 1 to fig. 4, and is not described herein again to avoid repetition.
Fig. 6 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention.
The terminal 600 includes but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and a power supply 611. Those skilled in the art will appreciate that the terminal configuration shown in fig. 6 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 610 is configured to: receive a picture to be cropped input by a user; and perform cropping processing on the picture to be cropped according to a target cropping pickup engine to obtain a target picture; wherein the target cropping pickup engine is a cropping pickup engine constructed based on target cropping factors, and the target cropping factors are generated iteratively from cropping behaviors.
Optionally, the processor 610 is specifically configured to perform cropping processing on the picture to be cropped according to the first cropping pickup engine or the second cropping pickup engine to obtain the target picture;
wherein the cropping factor corresponding to the first cropping pickup engine is generated iteratively from the cropping behaviors of different users, and the cropping factor corresponding to the second cropping pickup engine is generated iteratively from the cropping behavior of the user.
Optionally, the processor 610 is further configured to:
acquiring a second historical cropping behavior of the user for pictures of a first target style;
iterating the second historical cropping behavior to generate a cropping factor corresponding to the first target style;
constructing a cropping pickup sub-engine corresponding to the first target style according to the cropping factor corresponding to the first target style;
wherein the second cropping pickup engine comprises at least one cropping pickup sub-engine corresponding to a style.
Optionally, the processor 610 is specifically configured to:
receiving a cropping instruction input by the user;
if the cropping instruction does not include a second target style selected by the user, or the cropping instruction includes the second target style but the second cropping pickup engine does not include a cropping pickup sub-engine corresponding to the second target style, performing cropping processing on the picture to be cropped according to the first cropping pickup engine to obtain a first target picture;
and if the cropping instruction includes the second target style and the second cropping pickup engine includes the cropping pickup sub-engine corresponding to the second target style, performing cropping processing on the picture to be cropped according to that cropping pickup sub-engine to obtain a second target picture.
Optionally, the processor 610 is further configured to:
acquiring first historical cropping behaviors of different users for pictures of each style;
iterating the first historical cropping behaviors to generate a cropping factor corresponding to each style;
constructing a cropping pickup sub-engine corresponding to each style according to the cropping factor corresponding to each style;
wherein the first cropping pickup engine comprises the cropping pickup sub-engine corresponding to each style.
In the embodiments of the invention, the picture to be cropped is processed by a target cropping pickup engine constructed based on users' cropping behaviors, so that automatic cropping of the picture is realized.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 601 may be used for receiving and sending signals during a message sending and receiving process or a call process; specifically, it receives downlink data from a base station and delivers it to the processor 610 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 601 may also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 602, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into an audio signal and output as sound. Also, the audio output unit 603 can also provide audio output related to a specific function performed by the terminal 600 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
The input unit 604 is used to receive audio or video signals. The input unit 604 may include a Graphics Processing Unit (GPU) 6041 and a microphone 6042, and the graphics processor 6041 processes image data of a still picture or video obtained by an image capturing apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 606. The image frames processed by the graphics processor 6041 may be stored in the memory 609 (or other storage medium) or transmitted via the radio frequency unit 601 or the network module 602. The microphone 6042 can receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 601 and output.
The terminal 600 also includes at least one sensor 605, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 6061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 6061 and/or the backlight when the terminal 600 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 605 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 606 is used to display information input by the user or information provided to the user. The Display unit 606 may include a Display panel 6061, and the Display panel 6061 may be configured by a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 607 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. Touch panel 6071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 6071 using a finger, stylus, or any suitable object or accessory). The touch panel 6071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 610, receives a command from the processor 610, and executes the command. In addition, the touch panel 6071 can be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The user input unit 607 may include other input devices 6072 in addition to the touch panel 6071. Specifically, the other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 6071 can be overlaid on the display panel 6061, and when the touch panel 6071 detects a touch operation on or near the touch panel 6071, the touch operation is transmitted to the processor 610 to determine the type of the touch event, and then the processor 610 provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although in fig. 6, the touch panel 6071 and the display panel 6061 are two independent components to realize the input and output functions of the terminal, in some embodiments, the touch panel 6071 and the display panel 6061 may be integrated to realize the input and output functions of the terminal, and this is not limited here.
The interface unit 608 is an interface for connecting an external device to the terminal 600. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal 600 or may be used to transmit data between the terminal 600 and an external device.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 609 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 610 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 609 and calling data stored in the memory 609, thereby performing overall monitoring of the terminal. Processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The terminal 600 may further include a power supply 611 (e.g., a battery) for supplying power to the various components, and preferably, the power supply 611 is logically connected to the processor 610 via a power management system, so that functions of managing charging, discharging, and power consumption are performed via the power management system.
In addition, the terminal 600 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 610, a memory 609, and a computer program stored in the memory 609 and capable of running on the processor 610, where the computer program is executed by the processor 610 to implement each process of the above-mentioned image cropping method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned embodiment of the picture cropping method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (12)

1. A picture cropping method is characterized by comprising the following steps:
receiving a picture to be cropped input by a user;
performing cropping processing on the picture to be cropped according to a target cropping pickup engine to obtain a target picture; wherein the target cropping pickup engine is a cropping pickup engine constructed based on target cropping factors, and the target cropping factors are generated iteratively from cropping behaviors.
2. The method according to claim 1, wherein the performing cropping processing on the picture to be cropped according to a target cropping pickup engine to obtain a target picture comprises:
performing cropping processing on the picture to be cropped according to a first cropping pickup engine or a second cropping pickup engine to obtain the target picture;
wherein the cropping factor corresponding to the first cropping pickup engine is generated iteratively from the cropping behaviors of different users, and the cropping factor corresponding to the second cropping pickup engine is generated iteratively from the cropping behavior of the user.
3. The method of claim 2, wherein before receiving the picture to be cropped input by the user, the method further comprises:
acquiring a second historical cropping behavior of the user for pictures of a first target style;
iterating the second historical cropping behavior to generate a cropping factor corresponding to the first target style;
constructing a cropping pickup sub-engine corresponding to the first target style according to the cropping factor corresponding to the first target style;
wherein the second cropping pickup engine comprises at least one cropping pickup sub-engine corresponding to a style.
4. The method according to claim 3, wherein the performing cropping processing on the picture to be cropped according to a first cropping pickup engine or a second cropping pickup engine to obtain the target picture comprises:
receiving a cropping instruction input by the user;
if the cropping instruction does not include a second target style selected by the user, or the cropping instruction includes the second target style but the second cropping pickup engine does not include a cropping pickup sub-engine corresponding to the second target style, performing cropping processing on the picture to be cropped according to the first cropping pickup engine to obtain a first target picture;
and if the cropping instruction includes the second target style and the second cropping pickup engine includes the cropping pickup sub-engine corresponding to the second target style, performing cropping processing on the picture to be cropped according to that cropping pickup sub-engine to obtain a second target picture.
5. The method of claim 2, wherein before receiving the picture to be cropped input by the user, the method further comprises:
acquiring first historical cropping behaviors of different users for pictures of each style;
iterating the first historical cropping behaviors to generate a cropping factor corresponding to each style;
constructing a cropping pickup sub-engine corresponding to each style according to the cropping factor corresponding to each style;
wherein the first cropping pickup engine comprises the cropping pickup sub-engine corresponding to each style.
6. A terminal, comprising:
a receiving module, configured to receive a picture to be cropped input by a user;
a cropping module, configured to perform cropping processing on the picture to be cropped according to a target cropping pickup engine to obtain a target picture; wherein the target cropping pickup engine is a cropping pickup engine constructed based on target cropping factors, and the target cropping factors are generated iteratively from cropping behaviors.
7. The terminal of claim 6, wherein the cropping module is specifically configured to: perform cropping processing on the picture to be cropped according to a first cropping pickup engine or a second cropping pickup engine to obtain the target picture;
wherein the cropping factor corresponding to the first cropping pickup engine is generated iteratively from the cropping behaviors of different users, and the cropping factor corresponding to the second cropping pickup engine is generated iteratively from the cropping behavior of the user.
8. The terminal of claim 7, further comprising:
a first acquisition module, configured to acquire a second historical cropping behavior of the user for pictures of a first target style;
a first iteration module, configured to iterate the second historical cropping behavior to generate a cropping factor corresponding to the first target style;
a first construction module, configured to construct a cropping pickup sub-engine corresponding to the first target style according to the cropping factor corresponding to the first target style;
wherein the second cropping pickup engine comprises at least one cropping pickup sub-engine corresponding to a style.
9. The terminal of claim 8, wherein the cropping module comprises:
a receiving unit, configured to receive a cropping instruction input by the user;
a processing unit, configured to: perform cropping processing on the picture to be cropped according to the first cropping pickup engine to obtain a first target picture if the cropping instruction does not include a second target style selected by the user, or if the cropping instruction includes the second target style but the second cropping pickup engine does not include a cropping pickup sub-engine corresponding to the second target style; and, if the cropping instruction includes the second target style and the second cropping pickup engine includes the cropping pickup sub-engine corresponding to the second target style, perform cropping processing on the picture to be cropped according to that cropping pickup sub-engine to obtain a second target picture.
10. The terminal of claim 7, further comprising:
a second acquisition module, configured to acquire first historical cropping behaviors of different users for pictures of each style;
a second iteration module, configured to iterate the first historical cropping behaviors to generate a cropping factor corresponding to each style;
a second construction module, configured to construct a cropping pickup sub-engine corresponding to each style according to the cropping factor corresponding to each style;
wherein the first cropping pickup engine comprises the cropping pickup sub-engine corresponding to each style.
11. A terminal, comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the picture cropping method as claimed in any one of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the picture cropping method of any one of claims 1 to 5.
CN201910438724.7A 2019-05-24 2019-05-24 Picture cutting method and terminal Active CN111986207B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910438724.7A CN111986207B (en) 2019-05-24 2019-05-24 Picture cutting method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910438724.7A CN111986207B (en) 2019-05-24 2019-05-24 Picture cutting method and terminal

Publications (2)

Publication Number Publication Date
CN111986207A (en) 2020-11-24
CN111986207B (en) 2023-09-05

Family

ID=73436107

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910438724.7A Active CN111986207B (en) 2019-05-24 2019-05-24 Picture cutting method and terminal

Country Status (1)

Country Link
CN (1) CN111986207B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130069980A1 (en) * 2011-09-15 2013-03-21 Beau R. Hartshorne Dynamically Cropping Images
US20150104111A1 (en) * 2013-10-11 2015-04-16 Disney Enterprises, Inc. Methods and systems of local signal equalization
CN104504649A (en) * 2014-12-30 2015-04-08 百度在线网络技术(北京)有限公司 Picture cutting method and device
US20180181593A1 (en) * 2016-12-28 2018-06-28 Shutterstock, Inc. Identification of a salient portion of an image
CN107480725A (en) * 2017-08-23 2017-12-15 京东方科技集团股份有限公司 Image-recognizing method, device and computer equipment based on deep learning
CN108062755A (en) * 2017-11-02 2018-05-22 广东数相智能科技有限公司 A kind of picture intelligence method of cutting out and device
CN109657643A (en) * 2018-12-29 2019-04-19 百度在线网络技术(北京)有限公司 A kind of image processing method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨善文; 刘丽; 李玮: "A Method for Automatic Sheet Division of Strip Topographic Maps in Linear Engineering", Natural Gas and Oil, no. 01, pages 61-64 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112632422A (en) * 2020-12-29 2021-04-09 中国平安财产保险股份有限公司 Intelligent image cutting method and device, electronic equipment and storage medium
CN112632422B (en) * 2020-12-29 2024-02-02 中国平安财产保险股份有限公司 Intelligent graph cutting method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111986207B (en) 2023-09-05

Similar Documents

Publication Publication Date Title
CN107977144B (en) Screen capture processing method and mobile terminal
CN108845853B (en) Application program starting method and mobile terminal
CN108234289B (en) Message display method and device and mobile terminal
CN109525874B (en) Screen capturing method and terminal equipment
CN110096326B (en) Screen capturing method, terminal equipment and computer readable storage medium
CN110109593B (en) Screen capturing method and terminal equipment
CN109343755B (en) File processing method and terminal equipment
CN109857297B (en) Information processing method and terminal equipment
CN108038825B (en) Image processing method and mobile terminal
CN108900695B (en) Display processing method, terminal equipment and computer readable storage medium
CN108600089B (en) Expression image display method and terminal equipment
CN109412932B (en) Screen capturing method and terminal
CN107608606A (en) A kind of image display method, mobile terminal and computer-readable recording medium
CN109618218B (en) Video processing method and mobile terminal
CN108718389B (en) Shooting mode selection method and mobile terminal
CN111026305A (en) Audio processing method and electronic equipment
CN110609648A (en) Application program control method and terminal
CN108174109B (en) Photographing method and mobile terminal
CN108132749B (en) Image editing method and mobile terminal
CN108196781B (en) Interface display method and mobile terminal
CN110933494A (en) Picture sharing method and electronic equipment
CN110413363B (en) Screenshot method and terminal equipment
CN110191426B (en) Information sharing method and terminal
CN110022551B (en) Information interaction method and terminal equipment
CN109358913B (en) Application program starting method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant