CN109062483B - Image processing method and terminal equipment - Google Patents
- Publication number
- CN109062483B (application CN201810843918.0A)
- Authority
- CN
- China
- Prior art keywords
- target
- image
- screen
- target screen
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
Abstract
The invention discloses an image processing method and a terminal device. The method includes: receiving a first input of a user; and, in response to the first input, controlling a first target screen to slide relative to a second target screen and hiding an image of a target area of a target image displayed in the first target screen, where the first target screen and the second target screen are each one of the first screen and the second screen. A user can thus select the image to be hidden according to need, without hiding the whole picture, which provides high flexibility. Moreover, a single first input suffices both to slide the first target screen relative to the second target screen and to hide the image of the target area, so the operation steps are simple.
Description
Technical Field
Embodiments of the invention relate to the technical field of image processing, and in particular to an image processing method and a terminal device.
Background
With the widespread adoption of terminal devices, users' expectations of them keep rising. For example, users want a terminal device not only to capture high-quality pictures but also to edit them. When a user wants to protect the privacy of part of an image in a picture, the existing image processing approach is to store the whole picture in a privacy vault, which involves cumbersome operation steps and offers poor flexibility. If the user then wants to view the picture, it must be retrieved from the privacy vault again, which is likewise cumbersome and makes viewing inconvenient.
Disclosure of Invention
Embodiments of the invention provide an image processing method and a terminal device, aiming to solve the prior-art problems that protecting the privacy of part of an image in a picture requires storing the whole picture in a privacy vault, with complex operation steps and poor flexibility.
To solve the above technical problem, the invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides an image processing method, which is applied to a terminal device including a first screen and a second screen that are slidable relative to each other, and the method includes:
receiving a first input of a user;
in response to the first input, controlling the first target screen to slide relative to the second target screen and hiding an image of a target area of a target image displayed in the first target screen;
wherein the first target screen and the second target screen are one of the first screen and the second screen, respectively.
In a second aspect, an embodiment of the present invention further provides a terminal device including a first screen and a second screen that are slidable relative to each other, the terminal device including:
the first receiving module is used for receiving a first input of a user;
a hiding module, configured to control the first target screen to slide relative to the second target screen in response to the first input, and hide an image of a target area of a target image displayed in the first target screen;
wherein the first target screen and the second target screen are one of the first screen and the second screen, respectively.
In a third aspect, an embodiment of the present invention further provides a terminal device, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the image processing method described above.
In a fourth aspect, the embodiment of the present invention further provides a readable storage medium, where a computer program is stored on the readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps of the image processing method described above.
In the embodiment of the invention, a first input of a user is received; in response to the first input, the first target screen is controlled to slide relative to the second target screen, and an image of a target area of a target image displayed in the first target screen is hidden; the first target screen and the second target screen are each one of the first screen and the second screen. A user can therefore select the image to be hidden according to need, without hiding the whole picture, which provides high flexibility. Moreover, a single first input suffices both to slide the first target screen relative to the second target screen and to hide the image of the target area, so the operation steps are simple.
Drawings
FIG. 1 is a flowchart of an image processing method according to an embodiment of the present invention;
fig. 2 is one of schematic diagrams of an image processing method provided in an embodiment of the present invention in an actual application scenario;
fig. 3 is a second schematic diagram of an image processing method in an actual application scenario according to an embodiment of the present invention;
fig. 4 is a third schematic diagram of an image processing method in an actual application scenario according to an embodiment of the present invention;
FIG. 5 is a fourth schematic diagram of an image processing method in an actual application scenario according to an embodiment of the present invention;
FIG. 6 is a fifth schematic view of an image processing method provided in an embodiment of the present invention in an actual application scenario;
fig. 7 is a sixth schematic view of an image processing method provided in an embodiment of the present invention in an actual application scenario;
fig. 8 is a seventh schematic diagram of an image processing method in an actual application scenario according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 10 is a second schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention. The technical solutions provided by the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
The invention provides an image processing method. The execution subject of the method may be, but is not limited to, a terminal device with multiple screens (such as a mobile phone or a tablet computer), or any apparatus that can be configured to execute the method provided by the embodiments of the invention.
For convenience of description, the following embodiments take as an example a terminal device, capable of performing the method, that includes a first screen and a second screen slidable relative to each other. It should be understood that this is only an exemplary illustration and should not be construed as limiting the method.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention, where the method of fig. 1 may be applied to a terminal device including a first screen and a second screen that are slidable with respect to each other, and as shown in fig. 1, the method may include:
Step 101, receiving a first input of a user.
The first input may include controlling the first target screen and the second target screen to slide relative to each other, and may also include selecting an image area to be hidden.
Specifically, controlling the first target screen and the second target screen to slide relative to each other may be implemented by sliding the first target screen relative to the second target screen, or by sliding the second target screen relative to the first target screen.
The image area to be hidden may be selected by a tap, by a sliding track, by hover touch, or by a voice instruction.
Step 102, responding to the first input, controlling the first target screen to slide relative to the second target screen, and hiding an image of a target area of a target image displayed in the first target screen;
wherein the first target screen and the second target screen are one of the first screen and the second screen, respectively.
The target area of the target image may be an overlapping area of the first target screen and the second target screen, or a selected image area to be hidden.
In this step, hiding the image of the target area of the target image may specifically be implemented by compositing all images other than the image of the target area into a synthesized picture. Alternatively, the image of the target area of the target image may be hidden directly. The embodiment of the present invention is not particularly limited in this respect; the approach may be chosen according to actual requirements in a specific implementation.
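As an illustrative sketch only (the patent specifies no implementation), the first hiding strategy above, compositing everything outside the target area, can be modeled on a toy pixel grid. All names and the use of `None` as a blank pixel are assumptions:

```python
def hide_target_area(image, target_area):
    """Return a copy of `image` (a 2D list of pixels) with the pixels
    inside the target rectangle blanked out; everything else is kept,
    mimicking "compositing all images except the target area".

    target_area -- (row0, col0, row1, col1), half-open bounds (assumed).
    """
    r0, c0, r1, c1 = target_area
    return [
        [None if (r0 <= r < r1 and c0 <= c < c1) else px
         for c, px in enumerate(row)]
        for r, row in enumerate(image)
    ]

image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
# hide the 2x2 block in the top-right corner
hidden = hide_target_area(image, (0, 1, 2, 3))
```

On a real device the blanked pixels would simply be excluded from the composited output picture, but the selection logic is the same.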
In the embodiment of the invention, a first input of a user is received; in response to the first input, the first target screen is controlled to slide relative to the second target screen, and an image of a target area of a target image displayed in the first target screen is hidden; the first target screen and the second target screen are each one of the first screen and the second screen. A user can therefore select the image to be hidden according to need, without hiding the whole picture, which provides high flexibility. Moreover, a single first input suffices both to slide the first target screen relative to the second target screen and to hide the image of the target area, so the operation steps are simple.
Optionally, as an embodiment, the first input includes a first sub-input, and the first sub-input is used to control the first target screen and the second target screen to slide relatively;
step 102 may be specifically implemented as:
acquiring an overlapping area of the first target screen and the second target screen;
determining a first image area in the target image, which is located in the overlapping area, as a target area;
concealing an image of the target area of the target image.
The first image area may be an image area on the first target screen, an image area on the second target screen, or an area spanning both the first target screen and the second target screen.
Illustratively, as shown in fig. 3, an image is displayed in the first target screen 1, and an image 21 of the target area of the target image is displayed in the second target screen 2. As shown in fig. 4, after the first target screen 1 and the second target screen 2 slide relative to each other, an overlapping area exists between them; the first image area within the overlapping area (i.e., the area where the image 21 displayed in the second target screen 2 is located) is blocked by the first target screen 1. That is, the area where the image 21 is located is determined to be the image area to be hidden, and the image 21 displayed in the second target screen 2 is hidden.
In this embodiment, in response to sliding between the first target screen and the second target screen, the overlapping area of the two screens is acquired, and the first image area within the overlapping area is determined as the target area, i.e., the image area to be hidden, whose image is then hidden. A user can thus hide the image of the target area simply by sliding the two screens relative to each other, which is simple to operate and shortens the operation time.
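A minimal sketch of the overlap-area step this embodiment relies on, assuming both screens report axis-aligned rectangles in one shared coordinate space (the patent does not specify a coordinate model):

```python
def overlap_area(screen_a, screen_b):
    """Intersection of two screen rectangles (x0, y0, x1, y1),
    or None if the screens do not overlap."""
    x0 = max(screen_a[0], screen_b[0])
    y0 = max(screen_a[1], screen_b[1])
    x1 = min(screen_a[2], screen_b[2])
    y1 = min(screen_a[3], screen_b[3])
    if x0 >= x1 or y0 >= y1:
        return None
    return (x0, y0, x1, y1)

# a 100x100 second screen slid so that 40 px lie under the first screen
first = (0, 0, 100, 100)
second = (60, 0, 160, 100)
region = overlap_area(first, second)  # (60, 0, 100, 100)
```

Any image area falling inside `region` would then be treated as the first image area of the overlapping area.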
Optionally, as an embodiment, the first input includes a second sub-input, where the second sub-input is used to select an image area to be hidden;
step 102 may be specifically implemented as:
acquiring an overlapping area of the first target screen and the second target screen;
acquiring a second image area selected by the second sub-input;
determining the second image area as a target area, or determining a first image area and the second image area which are positioned in the overlapping area in the target image as target areas;
concealing an image of the target area of the target image.
For example, if the second sub-input is a sliding operation, the image area to be hidden is selected by the sliding track. The sliding track may fully enclose the image, e.g., a closed circle with the image inside it; alternatively, the sliding track may semi-enclose the image, e.g., an arc with the image located in its concave side (such as the image 21 shown in fig. 6).
Illustratively, as shown in fig. 6, an image is displayed in the first target screen 1, and another image, the image 21, is displayed in the second target screen 2; the second image area containing the image 21 is selected in response to the second sub-input. As shown in fig. 7, the selected second image area is determined as the target area, i.e., the image area to be hidden, and the image 21 displayed in the second image area of the second target screen 2 is hidden. Alternatively, as shown in fig. 8, after the first target screen 1 and the second target screen 2 slide relative to each other, an overlapping area exists between them; the first image area within the overlapping area is blocked by the first target screen 1, and the area where the image 21 of the first image area is located is determined as a target area. Meanwhile, the second image area selected in response to the second sub-input is also determined as a target area. The area where the image 21 is located can thus be determined as the image area to be hidden, and the image 21 displayed in the second target screen 2 is hidden.
In this embodiment, in response to the selection of the image area to be hidden, the selected second image area is determined as the target area, i.e., the image area to be hidden, whose image is then hidden. A user can thus select exactly the image to be hidden according to need, which provides high flexibility.
In addition, by responding both to the sliding between the first target screen and the second target screen and to the selection of the image area to be hidden, this embodiment can determine both the first image area within the overlapping area of the two screens and the selected second image area as target areas, i.e., image areas to be hidden, and hide their images. A user can therefore hide images both by sliding the two screens relative to each other and by selecting the images to be hidden according to need; the operation is simple, the operation time is shortened, and the flexibility is high.
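The two ways of choosing target areas in this embodiment, the second image area alone (fig. 7) or the first image area in the overlap together with the second image area (fig. 8), can be sketched as follows; all names are hypothetical and rectangles follow the same assumed (x0, y0, x1, y1) model:

```python
def determine_target_areas(selected_area, overlap_region=None):
    """Collect the target areas to hide: always the user-selected
    second image area, plus the first image area inside the screen
    overlap when the screens have been slid over each other."""
    targets = [selected_area]
    if overlap_region is not None and overlap_region != selected_area:
        targets.append(overlap_region)
    return targets

# selection only (fig. 7), then selection plus overlap (fig. 8)
only_selected = determine_target_areas((10, 10, 30, 30))
both = determine_target_areas((10, 10, 30, 30), (60, 0, 100, 100))
```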
Optionally, as an embodiment, the step 102 may be specifically implemented as:
compositing the images of all areas of the target image other than the target area with the N images displayed on the first target screen and the second target screen, and outputting a target picture;
wherein at least one of the N images is displayed on the second target screen; n is a positive integer.
It can be understood that all images displayed on the first target screen and the second target screen, except the hidden image, are composited, and the target picture is output; at least one of the N images is displayed on the second target screen.
The compositing of the images of all areas other than the target area with the N images displayed on the first and second target screens may specifically be implemented as follows: in response to a third input of the user, the images of all areas of the target image other than the target area are composited with the N images displayed on the first target screen and the second target screen. The third input is used for controlling the first target screen and the second target screen to slide relative to each other and/or a touch operation on the first target screen and/or the second target screen.
For example, the third input may be a sliding operation in which two fingers of the user slide from the bottom to the top of the screen on the first target screen and the second target screen respectively, together with a sliding operation in which the user slides at least one of the two target screens toward the other. Then, as shown in fig. 4, fig. 7 and fig. 8, in response to these sliding operations, the images on the first target screen 1 and the second target screen 2 other than the image 21 to be hidden are composited to obtain the target picture.
In this embodiment, the images of all areas of the target image other than the target area are composited with the N images displayed on the first and second target screens, and the target picture is output, so that the user can view all images except the hidden one, meeting the user's viewing needs.
Optionally, as an embodiment, after performing step 102, the image processing method provided in the embodiment of the present invention may further include:
receiving a second input of the user;
controlling the first target screen or the second target screen to slide in response to the second input;
and displaying the image of the target area of the target image when the relative position of the first target screen and the second target screen matches a preset position feature.
The preset position feature may be that the first target screen and the second target screen have no overlapping area, that they overlap completely, or that they overlap partially.
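The three preset position features listed above can be sketched as a small classifier over the same assumed rectangle model (the patent does not prescribe how the relative position is detected):

```python
def relative_position(first, second):
    """Classify two screen rectangles (x0, y0, x1, y1) as having
    'no' overlap, 'full' overlap, or 'partial' overlap."""
    x0 = max(first[0], second[0])
    y0 = max(first[1], second[1])
    x1 = min(first[2], second[2])
    y1 = min(first[3], second[3])
    if x0 >= x1 or y0 >= y1:
        return "no"
    # full overlap: the intersection equals one of the screens,
    # i.e. one screen completely covers the other
    if (x0, y0, x1, y1) in (tuple(first), tuple(second)):
        return "full"
    return "partial"
```

A second input would then trigger the displaying of the hidden image only when `relative_position(...)` matches the configured preset feature.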
The second input is used for controlling the first target screen and the second target screen to slide relative to each other and/or a touch operation on the first target screen and/or the second target screen.
For example, the second input may be a sliding operation in which two fingers of the user slide from the top to the bottom of the screen on the first target screen and the second target screen respectively, together with a sliding operation in which the user controls at least one of the two target screens to slide toward the other. Then, as shown in fig. 5, in response to these sliding operations, the image of the target area of the target image is displayed, restoring the initial display state of the first target screen 1 and the second target screen 2.
In this embodiment, the image of the target area of the target image is displayed in response to the second input of the user, so that the composited image is flexibly restored to the initial image under the user's control. If the user wants to view an image that was hidden for privacy protection, this can be achieved with the second input alone; the operation is simple and viewing is convenient.
Preferably, the second input and the third input in the above embodiments are inputs with opposite parameters. For example, if the third input is a two-finger slide from the bottom to the top of the screen on the first and second target screens together with sliding at least one target screen toward the other, then the second input is a two-finger slide from the top to the bottom of the screen on the first and second target screens together with sliding at least one target screen toward the other.
Based on a second input opposite in parameters to the third input, the image of the target area of the target image is displayed, so that the composited image is flexibly restored to the initial image, and the user can easily distinguish the operation for hiding an image from the operation for displaying it.
Optionally, as an embodiment, before executing step 102, the image processing method provided in the embodiment of the present invention may further include:
adjusting a display position of an image of the target area on the first target screen in response to a fourth input by the user.
The fourth input may include a sliding operation, and the specific implementation of this step may be:
in response to a sliding operation of the user sliding from one edge to the other edge of the second target screen, the image of the target area on the second target screen is moved from a position far from the first target screen to a position close to the first target screen.
Illustratively, as shown in fig. 2, the user slides the image 21 of the target area on the second target screen from one edge (the right edge) to the other edge (the left edge) of the second target screen; in response to this sliding operation, the image 21 on the second target screen is moved from a position far from the first target screen to a position close to the first target screen (as shown in fig. 3).
In this embodiment, the display position of the image of the target area is adjusted in response to the fourth input of the user, so the display position can be adjusted at will. In particular, the image of the target area can be moved to a position close to the first target screen, so that after the first target screen slides relative to the second target screen, the image of the target area falls within the overlapping area of the two screens and is hidden; the operation is simple and the flexibility is high.
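A one-dimensional sketch of the position adjustment in this embodiment, assuming the first screen sits to the left of the second and the image's position is reduced to a single x-offset (all names are hypothetical):

```python
def adjust_toward_first_screen(image_x, drag_dx, screen_width):
    """Shift the image's x-offset on the second screen by the drag
    distance, clamped to [0, screen_width]; with the first screen on
    the left, a leftward drag (negative dx) moves the image toward
    the edge adjoining the first screen."""
    return max(0, min(screen_width, image_x + drag_dx))

# dragging from the right edge all the way to the left edge (fig. 2 to fig. 3)
new_x = adjust_toward_first_screen(300, -300, 400)
```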
The image processing method according to the embodiment of the present invention is described in detail above with reference to fig. 1 to 8, and the terminal device according to the embodiment of the present invention is described in detail below with reference to fig. 9.
Fig. 9 is a schematic structural diagram of a terminal device provided in an embodiment of the present invention, and as shown in fig. 9, the terminal device is applied to a terminal device including a first screen and a second screen that are slidable with respect to each other, and may include:
a first receiving module 901, configured to receive a first input of a user;
a hiding module 902, configured to, in response to the first input, control the first target screen to slide relative to the second target screen and hide an image of a target area of a target image displayed in the first target screen;
wherein the first target screen and the second target screen are one of the first screen and the second screen, respectively.
In one embodiment, the first input includes a first sub-input for controlling the first target screen and the second target screen to slide relative to each other;
the hiding module 902 includes:
a first obtaining unit, electrically connected to the first receiving module 901, configured to obtain an overlapping area of the first target screen and the second target screen;
a first determining unit electrically connected to the first acquiring unit, for determining a first image area located in the overlapping area in the target image as a target area;
a first hiding unit electrically connected to the first determining unit for hiding an image of the target area of the target image.
In one embodiment, the first input comprises a second sub-input, and the second sub-input is used for selecting an image area to be hidden;
the hiding module 902 includes:
a second obtaining unit, electrically connected to the first receiving module 901, configured to obtain an overlapping area of the first target screen and the second target screen;
a third obtaining unit, electrically connected to the first receiving module 901, configured to obtain a second image area selected by the second sub-input;
a second determining unit electrically connected to the second acquiring unit and the third acquiring unit, configured to determine the second image area as a target area, or determine a first image area and a second image area located in the overlapping area in the target image as target areas;
a second hiding unit electrically connected to the second determining unit, for hiding an image of the target area of the target image.
In one embodiment, the terminal device further includes:
an output module 903, configured to composite the images of all areas of the target image other than the target area with the N images displayed on the first target screen and the second target screen, and output a target picture;
wherein at least one of the N images is displayed on the second target screen; n is a positive integer.
In one embodiment, the terminal device further includes:
a second receiving module 904, configured to receive a second input of the user;
a control module 905, configured to control the first target screen or the second target screen to slide in response to the second input;
a display module 906, configured to display an image of a target area of the target image when a relative position of the first target screen and the second target screen is a preset position feature.
In one embodiment, the terminal device further includes:
an adjusting module 907, configured to adjust a display position of the image of the target area on the first target screen in response to a fourth input by the user.
The terminal device may also execute the method of fig. 1, and implement the functions of the terminal device in the embodiments shown in fig. 1 to fig. 7, which are not described again.
In the embodiment of the invention, an image to be hidden in an initial picture displayed on at least one of the first target screen and the second target screen is selected; in response to a first operation of the user, the images on the first and second target screens other than the image to be hidden are composited to obtain a synthesized picture; and the synthesized picture is displayed on at least one of the two target screens. A user can thus select the image to be hidden according to need, without hiding the whole picture, which provides high flexibility. Moreover, through a single first operation of the user, the images other than the image to be hidden can be composited and displayed; the operation steps are simple and viewing is convenient.
Figure 10 is a schematic diagram of the hardware structure of a terminal device implementing various embodiments of the present invention. The terminal device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and a power supply 1011. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 10 is not limiting: terminal devices may include more or fewer components than shown, combine some components, or arrange the components differently. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 1007 is used for receiving a first input of a user;
a processor 1010, configured to control the first target screen to slide relative to the second target screen in response to the first input, and to hide an image of a target area of a target image displayed in the first target screen;
wherein the first target screen and the second target screen are one of the first screen and the second screen, respectively.
In the embodiment of the invention, a first input of the user is received; in response to the first input, the first target screen is controlled to slide relative to the second target screen, and the image of the target area of the target image displayed on the first target screen is hidden; the first target screen and the second target screen are each one of the first screen and the second screen. The user can thus select the image to be hidden as needed, without hiding the whole image, which provides greater flexibility. Moreover, a single first input both slides the first target screen relative to the second target screen and hides the image of the target area, so the operation steps are simple.
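The hiding step tied to the sliding screens — acquiring the overlap of the two screens and hiding the part of the image that falls inside it, as later recited in claim 2 — can be sketched in plain Python. The rectangle representation `(x, y, w, h)`, the nested-list image, and the fill value are illustrative assumptions.

```python
def overlap_rect(a, b):
    """Intersection of two screen rectangles (x, y, w, h); None if disjoint.
    A minimal sketch of the 'acquire overlapping area' step."""
    x = max(a[0], b[0])
    y = max(a[1], b[1])
    x2 = min(a[0] + a[2], b[0] + b[2])
    y2 = min(a[1] + a[3], b[1] + b[3])
    if x2 <= x or y2 <= y:
        return None
    return (x, y, x2 - x, y2 - y)

def hide_overlap(image, first_screen, second_screen, fill=0):
    """Hide the part of `image` (shown full-screen on the first target
    screen) that lies inside the overlap of the two screens. The image is
    a nested list of pixel values, purely for illustration."""
    ov = overlap_rect(first_screen, second_screen)
    if ov is None:
        return image
    x, y, w, h = ov
    out = [row[:] for row in image]  # copy so the original stays intact
    for r in range(y, y + h):
        for c in range(x, x + w):
            out[r][c] = fill
    return out
```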
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1001 may be used to receive and send signals during message transmission or a call. Specifically, it receives downlink data from a base station and forwards the data to the processor 1010 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 1001 may also communicate with a network and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband internet access through the network module 1002, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002 or stored in the memory 1009 into an audio signal and output as sound. Also, the audio output unit 1003 can also provide audio output related to a specific function performed by the terminal apparatus 1000 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1004 is used to receive audio or video signals. The input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042. The graphics processor 10041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 1006, stored in the memory 1009 (or another storage medium), or transmitted via the radio frequency unit 1001 or the network module 1002. The microphone 10042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 1001 and output.
The display unit 1006 is used to display information input by the user or information provided to the user. The Display unit 1006 may include a Display panel 10061, and the Display panel 10061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1007 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, collects touch operations by a user on or near it (e.g., operations performed on or near the touch panel 10071 with a finger, a stylus, or any other suitable object or attachment). The touch panel 10071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, sends the coordinates to the processor 1010, and receives and executes commands sent by the processor 1010. The touch panel 10071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 10071, the user input unit 1007 can include other input devices 10072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick; these are not described again here.
Further, the touch panel 10071 can be overlaid on the display panel 10061, and when the touch panel 10071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 1010 to determine the type of the touch event, and then the processor 1010 provides a corresponding visual output on the display panel 10061 according to the type of the touch event. Although in fig. 10, the touch panel 10071 and the display panel 10061 are two independent components for implementing the input and output functions of the terminal device, in some embodiments, the touch panel 10071 and the display panel 10061 may be integrated to implement the input and output functions of the terminal device, and the implementation is not limited herein.
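The touch-input pipeline described above — detection, conversion to touch-point coordinates by the controller, and classification by the processor to produce a visual response — can be modeled as a toy sketch. The data shapes, field names, and response labels are all illustrative assumptions, not part of the patent or any real touch-controller API.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float
    y: float
    kind: str  # "down", "move", or "up"

def controller(raw: dict) -> TouchEvent:
    # Touch controller: convert a raw detection sample into touch-point
    # coordinates (toy model of the pipeline described above).
    return TouchEvent(x=raw["px"] / raw["dpi"], y=raw["py"] / raw["dpi"], kind=raw["k"])

def processor(event: TouchEvent) -> str:
    # Processor: classify the touch event type and pick the visual output.
    return {"down": "highlight", "move": "drag", "up": "commit"}[event.kind]
```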
The interface unit 1008 is an interface for connecting an external device to the terminal apparatus 1000. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. Interface unit 1008 can be used to receive input from external devices (e.g., data information, power, etc.) and transmit the received input to one or more elements within terminal apparatus 1000 or can be used to transmit data between terminal apparatus 1000 and external devices.
The memory 1009 may be used to store software programs and various data. The memory 1009 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and application programs required by at least one function (such as a sound playing function and an image playing function), and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook). Further, the memory 1009 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 1010 is a control center of the terminal device, connects various parts of the entire terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by operating or executing software programs and/or modules stored in the memory 1009 and calling data stored in the memory 1009, thereby performing overall monitoring of the terminal device. Processor 1010 may include one or more processing units; preferably, the processor 1010 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
In addition, the terminal device 1000 includes some functional modules that are not shown, and are not described herein again.
Preferably, an embodiment of the present invention further provides a terminal device, including a processor 1010, a memory 1009, and a computer program stored in the memory 1009 and executable on the processor 1010. When executed by the processor 1010, the computer program implements each process of the above image processing method embodiment and achieves the same technical effect; to avoid repetition, details are not repeated here.
The embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the image processing method embodiment and achieves the same technical effect; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (8)
1. An image processing method applied to a terminal device including a first screen and a second screen which are slidable relative to each other, the method comprising:
receiving a first input of a user;
in response to the first input, controlling a first target screen to slide relative to a second target screen and hiding an image of a target area of a target image displayed in the first target screen;
wherein the first target screen and the second target screen are one of the first screen and the second screen, respectively;
the first input comprises an image area to be hidden, and the target area of the target image is the selected image area to be hidden;
the first input comprises a first sub-input and a second sub-input, the first sub-input being used for controlling the first target screen and the second target screen to slide relative to each other; the second sub-input is a sliding operation, and the image area to be hidden is selected through its sliding track;
after the controlling the first target screen to slide relative to the second target screen and hiding the image of the target area of the target image displayed in the first target screen, the method further includes:
receiving a second input of the user;
controlling the first target screen or the second target screen to slide in response to the second input;
and displaying the image of the target area of the target image when the relative position of the first target screen and the second target screen matches a preset position.
2. The method of claim 1, wherein the first input comprises a first sub-input for controlling the first target screen and the second target screen to slide relative to each other;
the hiding the image of the target area of the target image displayed in the first target screen includes:
acquiring an overlapping area of the first target screen and the second target screen;
determining a first image area in the target image, which is located in the overlapping area, as a target area;
concealing an image of the target area of the target image.
3. The method of claim 1, wherein after controlling the first target screen to slide relative to the second target screen and hide an image of a target area of a target image displayed in the first target screen, further comprising:
synthesizing the images of all regions of the target image other than the target region with the N images displayed on the first target screen and the second target screen, and outputting the resulting image;
wherein at least one of the N images is displayed on the second target screen, and N is a positive integer.
4. A terminal device including a first screen and a second screen that are slidable relative to each other, comprising:
the first receiving module is used for receiving a first input of a user;
the hiding module is used for responding to the first input, controlling the first target screen to slide relative to the second target screen and hiding the image of the target area of the target image displayed in the first target screen;
wherein the first target screen and the second target screen are one of the first screen and the second screen, respectively;
the first input comprises an image area to be hidden, and the target area of the target image is the selected image area to be hidden;
the first input comprises a first sub-input and a second sub-input, the first sub-input being used for controlling the first target screen and the second target screen to slide relative to each other; the second sub-input is a sliding operation, and the image area to be hidden is selected through its sliding track;
the second receiving module is used for receiving a second input of the user;
the control module is used for responding to the second input and controlling the first target screen or the second target screen to slide;
and the display module is used for displaying the image of the target area of the target image when the relative position of the first target screen and the second target screen matches a preset position.
5. The terminal device of claim 4, wherein the first input comprises a first sub-input for controlling the first target screen and the second target screen to slide relative to each other;
the concealment module includes:
a first acquisition unit configured to acquire an overlapping area of the first target screen and the second target screen;
a first determining unit configured to determine a first image area located in the overlap area in the target image as a target area;
a first hiding unit configured to hide an image of the target area of the target image.
6. The terminal device of claim 4, further comprising:
the output module is used for synthesizing the images of all regions of the target image other than the target region with the N images displayed on the first target screen and the second target screen, and outputting the resulting image;
wherein at least one of the N images is displayed on the second target screen, and N is a positive integer.
7. A terminal device, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image processing method according to any one of claims 1 to 3.
8. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810843918.0A CN109062483B (en) | 2018-07-27 | 2018-07-27 | Image processing method and terminal equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109062483A CN109062483A (en) | 2018-12-21 |
CN109062483B true CN109062483B (en) | 2021-02-19 |
Family
ID=64835912
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810843918.0A Active CN109062483B (en) | 2018-07-27 | 2018-07-27 | Image processing method and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109062483B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109828732B (en) * | 2018-12-26 | 2022-07-01 | 维沃移动通信有限公司 | Display control method and terminal equipment |
CN109769089B (en) * | 2018-12-28 | 2021-03-16 | 维沃移动通信有限公司 | Image processing method and terminal equipment |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102150099A (en) * | 2008-09-08 | 2011-08-10 | 高通股份有限公司 | Multi-panel electronic device |
CN102150413A (en) * | 2008-09-11 | 2011-08-10 | 索尼爱立信移动通讯有限公司 | Display device and method for displaying images in a variable size display area |
WO2012026567A1 (en) * | 2010-08-27 | 2012-03-01 | 京セラ株式会社 | Portable terminal device |
CN107577246A (en) * | 2017-09-29 | 2018-01-12 | 深圳市富斯科技有限公司 | A kind of image capturing method, system and electronic platform and aircraft |
CN108205801A (en) * | 2017-12-27 | 2018-06-26 | 中兴通讯股份有限公司 | A kind of method and terminal for supporting image mosaic |
CN108234891A (en) * | 2018-04-04 | 2018-06-29 | 维沃移动通信有限公司 | A kind of photographic method and mobile terminal |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8239785B2 (en) * | 2010-01-27 | 2012-08-07 | Microsoft Corporation | Edge gestures |
CN102968268B (en) * | 2012-10-23 | 2018-07-10 | 努比亚技术有限公司 | A kind of method and apparatus for cutting picture |
CN107678648A (en) * | 2017-09-27 | 2018-02-09 | 北京小米移动软件有限公司 | Screenshotss processing method and processing device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP4047940A1 (en) | Screencast control method and electronic device | |
CN107957839B (en) | Display control method and mobile terminal | |
CN109992231B (en) | Screen projection method and terminal | |
CN107817939B (en) | Image processing method and mobile terminal | |
CN108495029B (en) | Photographing method and mobile terminal | |
CN110096326B (en) | Screen capturing method, terminal equipment and computer readable storage medium | |
CN109525874B (en) | Screen capturing method and terminal equipment | |
CN109032445B (en) | Screen display control method and terminal equipment | |
CN109240577B (en) | Screen capturing method and terminal | |
CN109032486B (en) | Display control method and terminal equipment | |
CN109213407B (en) | Screenshot method and terminal equipment | |
CN109710349B (en) | Screen capturing method and mobile terminal | |
CN109151367B (en) | Video call method and terminal equipment | |
CN108763317B (en) | Method for assisting in selecting picture and terminal equipment | |
CN109407948B (en) | Interface display method and mobile terminal | |
CN108900695B (en) | Display processing method, terminal equipment and computer readable storage medium | |
CN108898555B (en) | Image processing method and terminal equipment | |
CN109144393B (en) | Image display method and mobile terminal | |
CN110413363B (en) | Screenshot method and terminal equipment | |
CN111031253B (en) | Shooting method and electronic equipment | |
CN109669656B (en) | Information display method and terminal equipment | |
CN110990172A (en) | Application sharing method, first electronic device and computer-readable storage medium | |
CN109862172B (en) | Screen parameter adjusting method and terminal | |
CN109257505A (en) | A kind of screen control method and mobile terminal | |
CN110795021A (en) | Information display method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||