CN108898555B - Image processing method and terminal equipment - Google Patents
- Publication number
- CN108898555B (application CN201810845920.1A)
- Authority
- CN
- China
- Prior art keywords
- screen
- image
- target image
- area
- input
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses an image processing method and a terminal device. The method comprises: receiving a first input from a user on a target image among N images displayed on a first screen and a second screen; cropping an image of a first region of the target image in response to the first input; and synthesizing an image of a second region of the target image with the other N-1 images to output the target image. The images of the first region and the second region together constitute the target image; the number of target images is greater than or equal to 1; the cropped image of the first region is invisible; at least one of the N images is displayed on the first screen and at least one on the second screen; and N is an integer greater than 1. By receiving the user's first input on a target image among the N images displayed on the two screens, the image of the first region of the target image can be cropped without launching professional retouching software, which simplifies the operation steps.
Description
Technical Field
The embodiment of the invention relates to the technical field of image processing, in particular to an image processing method and terminal equipment.
Background
With the widespread popularization of terminal devices, users' requirements for them keep rising. For example, users want a terminal device not only to capture high-quality pictures but also to edit them. Taking image stitching as an example, a conventional image processing method stitches a local region of one image with other images, which requires professional retouching software for the cropping and is cumbersome to operate.
Disclosure of Invention
The embodiments of the invention provide an image processing method and a terminal device, aiming to solve the problem that image processing methods in the prior art are cumbersome to operate.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an image processing method, which is applied to a terminal device including a first screen and a second screen, and the method includes:
receiving a first input of a user to a target image in the N images displayed on the first screen and the second screen;
cropping, in response to the first input, an image of a first region of the target image;
synthesizing an image of a second region of the target image with the other N-1 images, and outputting the target image;
wherein the images of the first region and the second region together constitute the target image; the number of target images is greater than or equal to 1; the cropped image of the first region is invisible; at least one of the N images is displayed on the first screen and at least one on the second screen; and N is an integer greater than 1.
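The three claimed steps can be illustrated with a minimal sketch. The pixel-row list representation, the side-by-side stitching, and all function names below are illustrative assumptions for exposition only, not the patented implementation:

```python
def crop_first_region(image, boundary_x):
    """Crop (hide) the first region: all columns at or beyond boundary_x.

    `image` is modeled as a list of pixel rows; this data model is an
    assumption made for illustration, not part of the patent.
    """
    return [row[:boundary_x] for row in image]

def synthesize(second_region, other_images):
    """Stitch the retained second region with the other N-1 images side by side."""
    out = [list(row) for row in second_region]
    for img in other_images:
        for out_row, img_row in zip(out, img):
            out_row.extend(img_row)
    return out

# A 2x4 target image and one other image (N = 2)
target = [[1, 2, 3, 4], [5, 6, 7, 8]]
other = [[9, 9], [9, 9]]
second = crop_first_region(target, 2)   # columns 2..3 become the invisible first region
result = synthesize(second, [other])    # second region stitched with the other image
```

The cropped first region is simply not carried into the output, matching the claim that it becomes invisible.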
In a second aspect, an embodiment of the present invention further provides a terminal device that includes a first screen and a second screen, where the terminal device includes:
a receiving module, configured to receive a first input from a user on a target image among the N images displayed on the first screen and the second screen;
a cropping module to crop an image of a first region of the target image in response to the first input;
a synthesis module, configured to synthesize the image of the second region of the target image with the other N-1 images and output the target image;
wherein the images of the first region and the second region together constitute the target image; the number of target images is greater than or equal to 1; the cropped image of the first region is invisible; at least one of the N images is displayed on the first screen and at least one on the second screen; and N is an integer greater than 1.
In a third aspect, an embodiment of the present invention further provides a terminal device, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the image processing method described above.
In a fourth aspect, an embodiment of the present invention further provides a readable storage medium storing a computer program which, when executed by a processor, implements the steps of the image processing method described above.
In the embodiments of the invention, a first input from a user on a target image among N images displayed on a first screen and a second screen is received; an image of a first region of the target image is cropped in response to the first input; and the image of the second region of the target image is synthesized with the other N-1 images to output the target image. By receiving the user's first input on a target image among the N images displayed on the two screens, the image of the first region can be cropped without launching professional retouching software, which simplifies the operation steps.
Drawings
FIG. 1 is a flowchart of an image processing method according to an embodiment of the present invention;
fig. 2 is one of schematic diagrams of an image processing method provided in an embodiment of the present invention in an actual application scenario;
fig. 3 is a second schematic diagram of an image processing method in an actual application scenario according to an embodiment of the present invention;
fig. 4 is a third schematic diagram of an image processing method in an actual application scenario according to an embodiment of the present invention;
FIG. 5 is a fourth schematic diagram of an image processing method in an actual application scenario according to an embodiment of the present invention;
FIG. 6 is a fifth schematic view of an image processing method provided in an embodiment of the present invention in an actual application scenario;
fig. 7 is a sixth schematic view of an image processing method provided in an embodiment of the present invention in an actual application scenario;
fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 9 is a second schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention without inventive effort fall within the scope of protection of the present invention. The technical solutions provided by the embodiments are described in detail below with reference to the accompanying drawings.
The invention provides an image processing method whose execution subject may be, but is not limited to, a terminal device (such as a mobile phone or a tablet computer) that includes a first screen and a second screen, or any apparatus configurable to execute the method provided by the embodiments of the present invention.
For convenience of description, the following embodiments are described taking as an example a terminal device that includes a first screen and a second screen and is capable of performing the method. It should be understood that performing the method on such a terminal device is merely an exemplary illustration and should not be construed as limiting the method.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention, where the method in fig. 1 may be performed by a terminal device including a first screen and a second screen, and as shown in fig. 1, the method may include:
step 101, receiving a first input of a user to a target image in N images displayed on a first screen and a second screen.
The first input may be a touch input, which may include a click input, a long press input, or a slide input.
Of course, the first input may also refer to a touch operation, which may be a sliding operation and/or a clicking operation.
The sliding operation may be a single-contact or a multi-contact sliding operation, for example a single-finger sliding operation or a sliding operation with at least two fingers. The sliding direction may be set arbitrarily according to actual requirements. For a multi-contact sliding operation, the contacts preferably slide in the same direction.
The clicking operation may be a single-click or a multi-click operation. A multi-click operation may be performed on the same screen, or on the first screen and the second screen respectively. The number, frequency, and position of the clicks may be set according to actual requirements; embodiments of the present invention are not particularly limited in this respect.
At least one of the N images is displayed on the first screen and at least one on the second screen; N is an integer greater than 1.
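This display constraint can be stated as a trivial check. The function name and the list-of-images data model are illustrative assumptions:

```python
def valid_layout(first_screen_images, second_screen_images):
    """N images split across two screens: at least one per screen, and N > 1,
    as required before step 101 can apply."""
    n = len(first_screen_images) + len(second_screen_images)
    return n > 1 and len(first_screen_images) >= 1 and len(second_screen_images) >= 1
```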
Step 102, in response to the first input, cropping an image of a first region of the target image.
Wherein the cropped image of the first region is invisible.
Step 103, synthesizing the image of the second region of the target image with the other N-1 images, and outputting the target image.
Wherein the images of the first region and the second region together constitute the target image; the number of target images is greater than or equal to 1.
In the embodiment of the invention, a first input from a user on a target image among N images displayed on a first screen and a second screen is received; an image of a first region of the target image is cropped in response to the first input; and the image of the second region of the target image is synthesized with the other N-1 images to output the target image. By receiving the user's first input, the image of the first region can be cropped without launching professional retouching software, which simplifies the operation steps.
Optionally, as an embodiment, the first input includes a first touch input;
after step 101 is executed, the image processing method provided in the embodiment of the present invention further includes:
updating the display position of the target image;
step 102 may be specifically implemented as:
cropping an image of a first region of the target image located on a first side of a target boundary line;
where the first region is the region of the target image located on the first side of the target boundary line, and the second region is the region located on the second side of the target boundary line.
Updating the display position of the target image may be implemented, for example, by refreshing the display or in response to a touch operation.
The target boundary line divides the first region from the second region and may be preset.
In the embodiment of the present invention, updating the display position of the target image in response to the first touch input may be implemented as updating the display position in response to a sliding operation, for example moving the part of the target image displayed in the first region so that it is displayed in the second region. Step 102 may then be implemented as cropping the image of the first region of the target image located on the first side of the target boundary line.
For example, take the first touch input to be a sliding operation in which the user slides the target image displayed on the first screen 1 from the first screen 1 toward the second screen 2. As shown in fig. 2, the first screen 1 displays the part of the target image containing the image 11, and the second screen 2 displays the part containing the image 21. As shown in fig. 3, in response to the sliding operation, the target image moves toward the intermediate area between the first screen 1 and the second screen 2 (i.e., the target boundary line); at this time the image 11 is located on the first side of the target boundary line, and the image 11 is cropped.
According to this embodiment, the display position of the target image is updated in response to the user's first touch input so that part of the target image falls within the first region on the first side of the target boundary line, and that part is cropped. An image can thus be cropped quickly with a single touch input; the operation is simple and saves time.
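The slide-then-crop behavior above can be sketched in one dimension. The coordinate convention (x increases toward the second screen) and the function name are assumptions for illustration, not the patented implementation:

```python
def visible_width_after_slide(image_left, image_width, boundary_x):
    """Width of the target image that remains visible after sliding.

    Columns that have crossed to the first side of the target boundary line
    (x >= boundary_x) are cropped and become invisible. A left-to-right slide
    and integer pixel coordinates are illustrative assumptions.
    """
    crossed = max(0, image_left + image_width - boundary_x)
    return image_width - min(crossed, image_width)
```

For instance, a 100-pixel-wide image whose right portion has slid 20 pixels past the boundary keeps 80 visible pixels.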
Optionally, as an embodiment, before performing step 102, the image processing method provided in the embodiment of the present invention further includes:
in a case where a first side edge of the target image intersects a second side edge of the first screen or of the second screen, the target boundary line is determined based on that second side edge.
It can be understood that, when the first side edge of the target image moves until it intersects the second side edge of the first screen or of the second screen, that second side edge is determined to be the target boundary line.
According to the embodiment of the present invention, determining the target boundary line from the second side edge once the first side edge of the target image intersects it makes the subsequent cropping convenient, and the operation is simple.
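This boundary-determination rule can be sketched as a one-dimensional edge test. The coordinate model and function name are illustrative assumptions:

```python
def determine_target_boundary(image_first_edge_x, screen_second_edge_x):
    """Return the target boundary line once the target image's first side edge
    has reached the screen's second side edge; None while the edges have not
    yet intersected. One-dimensional x-coordinates are an assumption made
    for illustration."""
    if image_first_edge_x >= screen_second_edge_x:
        return screen_second_edge_x
    return None
```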
Optionally, as an embodiment, the first input includes a second touch input;
the updating the display position of the target image comprises:
controlling the target image to move towards a first direction;
after step 102 is executed, the image processing method provided by the embodiment of the present invention further includes:
in response to the second touch input, controlling the target image to move in a second direction, and restoring the display of an image of a third region of the target image according to the distance the target image has moved;
wherein the third region is associated with a distance moved by the target image.
The first direction and the second direction may be the same direction or different directions.
It can be understood that, in response to the user's second touch input, the target image is controlled to move in the second direction and the display of the image of the third region is restored according to the distance moved; that is, the cropping of the image in the target image is gradually undone.
For example, take the second touch input to be a sliding operation in which the user slides the target image displayed on the first screen 1 from the second screen 2 back toward the first screen 1. As shown in fig. 4, in response to this sliding operation, the target image moves in the direction indicated by the arrow and the image 11 gradually leaves the first side of the target boundary line; at this time the display of the image of the third region of the target image, which is associated with the distance the target image has moved, is restored.
In the embodiment of the invention, by controlling the target image to move in the second direction in response to the second touch input and restoring the display of the image of the third region according to the distance moved, the cropping of the image in the target image can be undone through the user's second touch input. If the user finds the cropping was wrong, it can be cancelled at any time; the operation is simple and fast.
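The distance-proportional undo described above can be sketched as follows. The 1:1 mapping between slide distance and restored pixels, and the function name, are illustrative assumptions:

```python
def remaining_cropped_width(cropped_width, distance_moved_back):
    """Moving the target image back in the second direction restores the third
    region in proportion to the distance moved; the restored width never
    exceeds what was cropped. A 1:1 pixel mapping is an assumption made
    for illustration."""
    restored = min(cropped_width, distance_moved_back)
    return cropped_width - restored
```

Sliding back far enough fully cancels the crop, matching the behavior of fig. 4.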
Optionally, as an embodiment, the first screen and the second screen are relatively slidable, and the first input includes an input for controlling the relative sliding of the first screen and the second screen;
after step 101 is executed, the image processing method provided by the embodiment of the present invention further includes:
controlling the first screen to slide relative to the second screen or the second screen to slide relative to the first screen;
step 102 may be specifically implemented as including:
acquiring an overlapping area of the first screen and the second screen;
cropping an image of a first region of the target image located in the overlapping area;
where the first region is the part of the target image located in the overlapping area, and the second region is the rest of the target image.
Illustratively, referring to fig. 2 and 6, in response to the user controlling the first screen 1 to slide relative to the second screen 2, an overlapping area exists between the two screens after the sliding. The first region is the area in which the first screen 1 and the second screen 2 overlap (i.e., the area occupied by the image 11 displayed on the first screen 1 and the image 21 displayed on the second screen 2); this overlapping area is determined as the image region to be cropped, and the image 11 and the image 21 are cropped.
According to the embodiment of the present invention, in response to an input that controls the relative sliding of the first screen and the second screen, the overlapping area of the two screens is acquired, the first region within the overlapping area is determined as the region to be cropped, and the image of that first region of the target image is cropped. By simply sliding the two screens relative to each other, the user can crop the image in the overlapping area; the operation is simple and saves time.
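The overlap computation at the core of this embodiment can be sketched by modeling each screen as a horizontal interval after sliding. The interval representation and function name are illustrative assumptions:

```python
def overlap_region(screen1, screen2):
    """Screens modeled as (left, right) intervals after relative sliding;
    the overlap, if any, is the first region whose image content is cropped.
    The 1-D interval model is an assumption made for illustration."""
    left = max(screen1[0], screen2[0])
    right = min(screen1[1], screen2[1])
    return (left, right) if left < right else None
```

Sliding screen 2 leftward over screen 1 produces a non-empty overlap; abutting screens produce none.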
Optionally, as an embodiment, the step 103 may be specifically implemented as:
in response to a second input from the user, synthesizing the image of the second region of the target image with the other N-1 images, and outputting the target image.
The second input may be a touch input, which may include a click input, a long press input, or a slide input.
Of course, the second input may also refer to a touch operation, which may be a sliding operation and/or a clicking operation.
The sliding operation may be a single-contact or a multi-contact sliding operation, for example a single-finger sliding operation or a sliding operation with at least two fingers. The sliding direction may be set arbitrarily according to actual requirements. For a multi-contact sliding operation, the contacts preferably slide in the same direction.
The clicking operation may be a single-click or a multi-click operation. A multi-click operation may be performed on the same screen, or on the first screen and the second screen respectively. The number, frequency, and position of the clicks may be set according to actual requirements; embodiments of the present invention are not particularly limited in this respect.
For example, take the second input to be a sliding operation in which two fingers of the user slide from the bottom to the top of the display on the first screen 1 and the second screen 2, respectively. As shown in fig. 5 and 7, in response to this sliding operation, the image of the second region of the target image is synthesized with the other N-1 images, and the target image is output.
According to the embodiment of the present invention, the image of the second region of the target image is synthesized with the other N-1 images in response to the user's second input, and the target image is output. Image synthesis can thus be triggered by a single second input, allowing the user to view the cropped result; the operation is simple.
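The gesture gate in this example can be sketched as a simple predicate: synthesis fires only when both screens register the upward swipe. The gesture encoding and function name are illustrative assumptions, not the patented implementation:

```python
def should_synthesize(first_screen_gesture, second_screen_gesture):
    """Trigger image synthesis only when both screens register an upward
    swipe, matching the two-finger bottom-to-top gesture of figs. 5 and 7.
    The string gesture encoding is an assumption made for illustration."""
    return first_screen_gesture == "swipe_up" and second_screen_gesture == "swipe_up"
```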
The image processing method according to the embodiment of the present invention is described in detail above with reference to fig. 1 to 7, and the terminal device according to the embodiment of the present invention is described in detail below with reference to fig. 8.
Fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present invention, and as shown in fig. 8, the terminal device is based on the image processing method according to an embodiment of the present invention, and the terminal device includes a first screen and a second screen, and may include:
a receiving module 801, configured to receive a first input of a user to a target image in N images displayed on the first screen and the second screen;
a cropping module 802 for cropping an image of a first region of the target image in response to the first input;
a synthesis module 803, configured to synthesize the image of the second region of the target image with the other N-1 images and output the target image;
wherein the images of the first region and the second region together constitute the target image; the number of target images is greater than or equal to 1; the cropped image of the first region is invisible; at least one of the N images is displayed on the first screen and at least one on the second screen; and N is an integer greater than 1.
In one embodiment, the first input comprises a first touch input;
the terminal device further includes:
an updating module 804, configured to update a display position of the target image;
the cropping module 802 includes:
a first cropping unit, electrically connected to the receiving module 801, for cropping an image of a first region of the target image located on a first side of a target boundary line;
where the first region is the region of the target image located on the first side of the target boundary line, and the second region is the region located on the second side of the target boundary line.
In one embodiment, the terminal device further includes:
a determining module 805, configured to determine the target boundary line based on a second side edge of the first screen or the second screen when a first side edge of the target image intersects that second side edge.
In one embodiment, the first input comprises a second touch input;
the update module 804 includes:
a control unit electrically connected to the receiving module 801 for controlling the target image to move towards a first direction;
the terminal device further includes:
a restoring module 806, configured to control the target image to move towards a second direction in response to the second touch input, and restore display of an image of a third area of the target image according to a distance that the target image moves;
wherein the third region is associated with a distance that the target image moves.
In one embodiment, the first screen and the second screen are relatively slidable, and the first input comprises an input for controlling the relative sliding of the first screen and the second screen;
the terminal device further includes:
a control module 807 for controlling sliding of the first screen relative to the second screen or sliding of the second screen relative to the first screen;
the cropping module 802 includes:
an acquisition unit electrically connected to the control module 807, configured to acquire an overlapping area of the first screen and the second screen;
a second cropping unit, electrically connected to the acquisition unit, configured to crop an image of a first region of the target image located in the overlapping area;
where the first region is the part of the target image located in the overlapping area, and the second region is the rest of the target image.
In the embodiment of the invention, a first input from a user on a target image among N images displayed on a first screen and a second screen is received; an image of a first region of the target image is cropped in response to the first input; and the image of the second region of the target image is synthesized with the other N-1 images to output the target image. By receiving the user's first input, the image of the first region can be cropped without launching professional retouching software, which simplifies the operation steps.
Figure 9 is a schematic diagram of a hardware structure of a terminal device implementing various embodiments of the present invention.
the terminal device 900 includes but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, and a power supply 911. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 9 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 907 is configured to receive a first input of a user on a target image in the N images displayed on the first screen and the second screen;
a processor 910, configured to crop an image of a first region of the target image in response to the first input, synthesize the image of the second region of the target image with the other N-1 images, and output the target image; wherein the images of the first region and the second region together constitute the target image; the number of target images is greater than or equal to 1; the cropped image of the first region is invisible; at least one of the N images is displayed on the first screen and at least one on the second screen; and N is an integer greater than 1.
In the embodiment of the invention, a first input from a user on a target image among N images displayed on a first screen and a second screen is received; an image of a first region of the target image is cropped in response to the first input; and the image of the second region of the target image is synthesized with the other N-1 images to output the target image. By receiving the user's first input, the image of the first region can be cropped without launching professional retouching software, which simplifies the operation steps.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 901 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 910; in addition, the uplink data is transmitted to the base station. Generally, the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 901 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband Internet access through the network module 902, for example helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902, or stored in the memory 909, into an audio signal and output it as sound. Moreover, the audio output unit 903 may provide audio output related to a specific function performed by the terminal device 900 (for example, a call signal reception sound or a message reception sound). The audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
The input unit 904 is used to receive audio or video signals. The input unit 904 may include a graphics processing unit (GPU) 9041 and a microphone 9042. The graphics processor 9041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 906. The image frames processed by the graphics processor 9041 may be stored in the memory 909 (or another storage medium) or transmitted via the radio frequency unit 901 or the network module 902. The microphone 9042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 901, and then output.
The terminal device 900 also includes at least one sensor 905, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 9061 according to the brightness of ambient light, and a proximity sensor, which can turn off the display panel 9061 and/or the backlight when the terminal device 900 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used to identify the posture of the terminal device (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-identification functions (such as a pedometer or tap detection). The sensor 905 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail here.
The display unit 906 is used to display information input by the user or information provided to the user. The display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 907 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also referred to as a touch screen, can collect touch operations by a user on or near it (such as operations performed on or near the touch panel 9071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 9071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, sends the coordinates to the processor 910, and receives and executes commands from the processor 910. In addition, the touch panel 9071 may be implemented as a resistive, capacitive, infrared, or surface-acoustic-wave panel, among other types. Besides the touch panel 9071, the user input unit 907 may include other input devices 9072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and a power key), a trackball, a mouse, and a joystick; these are not described in detail here.
Further, the touch panel 9071 may be overlaid on the display panel 9061. When the touch panel 9071 detects a touch operation on or near it, the operation is transmitted to the processor 910 to determine the type of the touch event, and the processor 910 then provides a corresponding visual output on the display panel 9061 according to that type. Although in fig. 9 the touch panel 9071 and the display panel 9061 are implemented as two independent components to realize the input and output functions of the terminal device, in some embodiments the touch panel 9071 and the display panel 9061 may be integrated to realize these functions; this is not limited here.
The interface unit 908 is an interface for connecting an external device to the terminal device 900. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 908 may be used to receive input (for example, data or power) from an external device and transmit the received input to one or more elements within the terminal device 900, or to transmit data between the terminal device 900 and an external device.
The memory 909 may be used to store software programs and various data. The memory 909 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook). Further, the memory 909 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic-disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 910 is the control center of the terminal device. It connects the various parts of the entire terminal device through various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 909 and by calling the data stored in the memory 909, thereby monitoring the terminal device as a whole. The processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It should be understood that the modem processor may also not be integrated into the processor 910.
The terminal device 900 may further include a power supply 911 (for example, a battery) for supplying power to the various components. Preferably, the power supply 911 is logically connected to the processor 910 through a power management system, so that charging, discharging, and power consumption are managed through the power management system.
In addition, the terminal device 900 includes some functional modules that are not shown, and are not described in detail here.
Preferably, an embodiment of the present invention further provides a terminal device, including a processor 910, a memory 909, and a computer program stored in the memory 909 and executable on the processor 910. When executed by the processor 910, the computer program implements each process of the above embodiment of the image processing method and can achieve the same technical effect; to avoid repetition, the details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the embodiment of the image processing method and can achieve the same technical effect; to avoid repetition, the details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or by hardware alone, although in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
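As a hedged illustration of how the overlapping area of the two screens might be computed in the method described above (assuming two equal-width screens, with the second sliding over the first by an offset in pixels; all function and variable names here are hypothetical, not from the patent):

```python
# Hypothetical model: the overlap, expressed in the first screen's
# horizontal coordinates, is the band covered by both screens after the
# second screen slides over the first by `offset` pixels. An image region
# inside this band is the "first region" to be cropped.

def overlap_region(screen_width, offset):
    """Overlap interval [start, end) on the first screen for a slide of `offset` px."""
    offset = max(0, min(offset, screen_width))  # clamp to a sane range
    return (screen_width - offset, screen_width)

def first_region_of(image_span, overlap):
    """Intersection of an image's horizontal span with the overlap interval."""
    start = max(image_span[0], overlap[0])
    end = min(image_span[1], overlap[1])
    return (start, end) if start < end else None

ov = overlap_region(1080, 300)            # second screen slid 300 px over the first
print(ov)                                 # (780, 1080)
print(first_region_of((600, 900), ov))    # (780, 900): the part of the image to crop
print(first_region_of((0, 500), ov))      # None: image untouched by the overlap
```

Only the intersection of the image's span with the overlap is cropped; an image lying entirely outside the overlap keeps its full display, consistent with the claim language.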
Claims (8)
1. An image processing method applied to a terminal device including a first screen and a second screen, the method comprising:
receiving a first input from a user on a target image among N images displayed on the first screen and the second screen, wherein the first screen and the second screen are slidable relative to each other, and the first input comprises an input for controlling the relative sliding of the first screen and the second screen;
controlling the first screen to slide relative to the second screen or the second screen to slide relative to the first screen;
acquiring an overlapping area of the first screen and the second screen, and cropping an image of a first region of the target image that is located in the overlapping area, wherein the first region is the region of the target image located in the overlapping area, and a second region is all of the image regions of the target image other than the first region;
synthesizing the image of the second region of the target image with the N-1 images, and outputting the target image;
wherein the image of the first region and the image of the second region together constitute the target image; the number of target images is greater than or equal to 1; the cropped image of the first region is invisible; at least one of the N images is displayed on the first screen and at least one on the second screen; and N is an integer greater than 1.
2. The method of claim 1, wherein the first input comprises a first touch input;
after the receiving of the first input from the user on the target image among the N images displayed on the first screen and the second screen, the method further comprises:
updating the display position of the target image.
3. The method of claim 2, wherein the first input comprises a second touch input;
the updating the display position of the target image comprises:
controlling the target image to move in a first direction;
after the cropping of the image of the first region of the target image located in the overlapping area, the method further comprises:
in response to the second touch input, controlling the target image to move in a second direction, and restoring display of the image of a third region of the target image according to the distance the target image moves;
wherein the third region is associated with the distance that the target image moves.
4. A terminal device comprising a first screen and a second screen, characterized in that the terminal device comprises:
a receiving module, configured to receive a first input from a user on a target image among N images displayed on the first screen and the second screen, wherein the first screen and the second screen are slidable relative to each other, and the first input comprises an input for controlling the relative sliding of the first screen and the second screen;
a control module, configured to control the first screen to slide relative to the second screen or the second screen to slide relative to the first screen;
a cropping module, configured to crop an image of a first region of the target image in response to the first input, the cropping module comprising: an acquiring unit, configured to acquire an overlapping area of the first screen and the second screen; and a second cropping unit, configured to crop the image of the first region of the target image that is located in the overlapping area, wherein the first region is the region of the target image located in the overlapping area, and a second region is all of the image regions of the target image other than the first region;
a synthesis module, configured to synthesize the image of the second region of the target image with the N-1 images and to output the target image;
wherein the image of the first region and the image of the second region together constitute the target image; the number of target images is greater than or equal to 1; the cropped image of the first region is invisible; at least one of the N images is displayed on the first screen and at least one on the second screen; and N is an integer greater than 1.
5. The terminal device of claim 4, wherein the first input comprises a first touch input;
the terminal device further includes:
an updating module, configured to update the display position of the target image.
6. The terminal device of claim 5, wherein the first input comprises a second touch input;
the update module includes:
a control unit, configured to control the target image to move in a first direction;
the terminal device further comprises:
a restoring module, configured to: in response to the second touch input, control the target image to move in a second direction, and restore display of the image of a third region of the target image according to the distance the target image moves;
wherein the third region is associated with the distance that the target image moves.
7. A terminal device, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image processing method according to any one of claims 1 to 3.
8. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810845920.1A CN108898555B (en) | 2018-07-27 | 2018-07-27 | Image processing method and terminal equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810845920.1A CN108898555B (en) | 2018-07-27 | 2018-07-27 | Image processing method and terminal equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108898555A CN108898555A (en) | 2018-11-27 |
CN108898555B true CN108898555B (en) | 2022-09-23 |
Family
ID=64352602
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810845920.1A Active CN108898555B (en) | 2018-07-27 | 2018-07-27 | Image processing method and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108898555B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109769089B (en) * | 2018-12-28 | 2021-03-16 | 维沃移动通信有限公司 | Image processing method and terminal equipment |
CN109886000B (en) * | 2019-02-01 | 2024-03-01 | 维沃移动通信有限公司 | Image encryption method and mobile terminal |
CN109993711A (en) * | 2019-03-25 | 2019-07-09 | 维沃移动通信有限公司 | A kind of image processing method and terminal device |
CN111311489B (en) * | 2020-01-17 | 2023-07-04 | 维沃移动通信有限公司 | Image clipping method and electronic equipment |
CN111784695B (en) * | 2020-06-01 | 2024-03-22 | 北京像素软件科技股份有限公司 | Method and device for cutting graph, electronic equipment and storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005032125A1 (en) * | 2003-09-26 | 2005-04-07 | Sharp Kabushiki Kaisha | Panorama image creation device and panorama image imaging device |
CN102150413A (en) * | 2008-09-11 | 2011-08-10 | 索尼爱立信移动通讯有限公司 | Display device and method for displaying images in a variable size display area |
CN105761211A (en) * | 2016-03-30 | 2016-07-13 | 努比亚技术有限公司 | Method and device for splicing frames of mobile terminal |
CN106780346A (en) * | 2017-01-19 | 2017-05-31 | 努比亚技术有限公司 | A kind of picture splicing apparatus and method based on edge input |
CN107872623A (en) * | 2017-12-22 | 2018-04-03 | 维沃移动通信有限公司 | A kind of image pickup method, mobile terminal and computer-readable recording medium |
CN108053364A (en) * | 2017-12-28 | 2018-05-18 | 努比亚技术有限公司 | Picture method of cutting out, mobile terminal and computer readable storage medium |
CN108205801A (en) * | 2017-12-27 | 2018-06-26 | 中兴通讯股份有限公司 | A kind of method and terminal for supporting image mosaic |
CN108234891A (en) * | 2018-04-04 | 2018-06-29 | 维沃移动通信有限公司 | A kind of photographic method and mobile terminal |
- 2018-07-27 CN CN201810845920.1A patent/CN108898555B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN108898555A (en) | 2018-11-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220276909A1 (en) | Screen projection control method and electronic device | |
CN108536365B (en) | Image sharing method and terminal | |
CN109495711B (en) | Video call processing method, sending terminal, receiving terminal and electronic equipment | |
CN108898555B (en) | Image processing method and terminal equipment | |
CN110096326B (en) | Screen capturing method, terminal equipment and computer readable storage medium | |
CN109525874B (en) | Screen capturing method and terminal equipment | |
CN109032445B (en) | Screen display control method and terminal equipment | |
CN110196667B (en) | Notification message processing method and terminal | |
CN109032486B (en) | Display control method and terminal equipment | |
CN109710349B (en) | Screen capturing method and mobile terminal | |
CN108196815B (en) | Method for adjusting call sound and mobile terminal | |
CN107728923B (en) | Operation processing method and mobile terminal | |
CN109407948B (en) | Interface display method and mobile terminal | |
CN108228902B (en) | File display method and mobile terminal | |
CN109144393B (en) | Image display method and mobile terminal | |
CN110413363B (en) | Screenshot method and terminal equipment | |
CN108132749B (en) | Image editing method and mobile terminal | |
CN107734172B (en) | Information display method and mobile terminal | |
CN109669656B (en) | Information display method and terminal equipment | |
CN110442279B (en) | Message sending method and mobile terminal | |
CN110096203B (en) | Screenshot method and mobile terminal | |
CN109710130B (en) | Display method and terminal | |
CN109542321B (en) | Control method and device for screen display content | |
CN107729100B (en) | Interface display control method and mobile terminal | |
CN110536005B (en) | Object display adjustment method and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||