CN109711138B - Picture processing method and mobile terminal - Google Patents

Picture processing method and mobile terminal

Info

Publication number
CN109711138B
Authority
CN
China
Prior art keywords
screen
mobile terminal
unlocking
user
picture
Prior art date
Legal status
Active
Application number
CN201811591700.7A
Other languages
Chinese (zh)
Other versions
CN109711138A (en)
Inventor
李少赓
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201811591700.7A
Publication of CN109711138A
Application granted
Publication of CN109711138B

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The invention provides a picture processing method and a mobile terminal. The method includes: acquiring a sliding track input by a user on a first screen while the first screen is in a screen-off state; and, when the sliding track matches a pre-stored gesture pattern, controlling a second screen to display a target picture corresponding to the sliding track. Because the user draws a gesture pattern on the first screen and the corresponding target picture is displayed on the second screen, unlocking and picture-searching operations are avoided, which reduces the difficulty of viewing pictures.

Description

Picture processing method and mobile terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a picture processing method and a mobile terminal.
Background
With the rapid development of terminal technology, smart terminals are quietly changing the way people live. The screen serves as the display output device of a smart terminal (especially a mobile terminal). From the early small screens, to large screens, to today's full screens and double-sided screens, screen sizes have kept growing to improve the user's viewing experience; an oversized screen, however, reduces the portability of the mobile terminal, and the double-sided screen emerged to solve this portability problem. Specifically, a mobile terminal with a double-sided screen has a main display on the front of the device and an auxiliary display on the back, and the two screens can also display content separately. The appearance of the double-sided-screen mobile terminal brings users a brand-new viewing experience.
Storing pictures is one of the important functions of existing mobile terminals. However, when a user wants to show a stored picture to other users, the user usually has to unlock the mobile terminal and then search for the picture; if the number of stored pictures is large, the search can take a long time, which makes the operation tedious.
Disclosure of Invention
The embodiments of the present invention provide a picture processing method and a mobile terminal, aiming to solve the problem of cumbersome picture-viewing operations.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a picture processing method applied to a mobile terminal having a first screen and a second screen, including:
the method comprises the steps that when a first screen is in a screen-off state, a sliding track input by a user on the first screen is obtained;
and, when the sliding track matches a pre-stored gesture pattern, controlling the second screen to display a target picture corresponding to the sliding track.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, where the mobile terminal has a first screen and a second screen, and includes:
the acquisition module is used for acquiring a sliding track input by a user on a first screen when the first screen is in a screen-off state;
and the control module is used for controlling the second screen to display the target picture corresponding to the sliding track when the sliding track matches the pre-stored gesture pattern.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the above-mentioned picture processing method.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the above-mentioned picture processing method.
In the embodiments of the present invention, a sliding track input by a user on a first screen is acquired while the first screen is in a screen-off state; when the sliding track matches a pre-stored gesture pattern, the second screen is controlled to display a target picture corresponding to the sliding track. Because the user draws a gesture pattern on the first screen and the corresponding target picture is displayed on the second screen, unlocking and picture-searching operations are avoided, which reduces the difficulty of viewing pictures.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a picture processing method according to an embodiment of the present invention;
fig. 2 is the first of the operation interfaces for setting the correspondence between a gesture pattern and a picture in the picture processing method according to the embodiment of the present invention;
fig. 3 is the second of the operation interfaces for setting the correspondence between a gesture pattern and a picture in the picture processing method according to the embodiment of the present invention;
fig. 4 is the third of the operation interfaces for setting the correspondence between a gesture pattern and a picture in the picture processing method according to the embodiment of the present invention;
fig. 5 is the fourth of the operation interfaces for setting the correspondence between a gesture pattern and a picture in the picture processing method according to the embodiment of the present invention;
fig. 6 is the fifth of the operation interfaces for setting the correspondence between a gesture pattern and a picture in the picture processing method according to the embodiment of the present invention;
fig. 7 is a block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 8 is a block diagram of another mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a picture processing method according to an embodiment of the present invention, where the mobile terminal is a dual-screen mobile terminal, and the mobile terminal includes a first screen and a second screen, and as shown in fig. 1, the method includes the following steps:
Step 101: the mobile terminal acquires a sliding track input by a user on a first screen while the first screen is in a screen-off state.
in the embodiment of the present invention, the first screen may be a main screen or an auxiliary screen, and is not further limited herein. Specifically, the mobile terminal may be provided with a "mood transfer" function, and when the function is turned on, a sliding track may be input through a sliding operation in a state where the first screen is in a screen-rest state, where the sliding track corresponds to a pre-stored gesture pattern, for example, the gesture pattern may include "V" and "W", and the user may draw a sliding track of "V" or "W" through the sliding operation on the first screen.
It should be noted that the above-mentioned screen-off state generally refers to a standby state in which the first screen is locked.
Step 102: the mobile terminal controls the second screen to display a target picture corresponding to the sliding track when the sliding track matches a pre-stored gesture pattern.
In this embodiment, there may be one or more gesture patterns, and each gesture pattern is associated with a picture. After the mobile terminal acquires the sliding track, it compares the track with the pre-stored gesture patterns to determine whether any of them matches the track. If a matching gesture pattern exists, the picture associated with that pattern is taken as the target picture corresponding to the sliding track, and the second screen is then controlled to display the target picture.
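As an illustrative sketch only (the patent does not specify a matching algorithm), steps 101 and 102 can be pictured as a small template matcher: the sliding track is resampled to a fixed number of points along its arc length, normalized into a unit box, and compared point by point against each pre-stored gesture pattern; the picture associated with the closest pattern within a tolerance is the target picture. The templates, the association table, and the threshold below are all invented for the example.

```python
import math

# Hypothetical pre-stored gesture patterns, each a polyline of (x, y) points.
TEMPLATES = {
    "V": [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)],
    "W": [(0.0, 0.0), (0.25, 1.0), (0.5, 0.0), (0.75, 1.0), (1.0, 0.0)],
}

# Hypothetical association table configured on the setting interface.
GESTURE_TO_PICTURE = {"V": "picture_2.jpg", "W": "landscape.jpg"}

def resample(points, n=32):
    """Resample a track to n points evenly spaced along its arc length."""
    pts = list(points)
    total = sum(math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1)) or 1.0
    step = total / (n - 1)
    out, acc, i = [pts[0]], 0.0, 0
    while len(out) < n and i < len(pts) - 1:
        d = math.dist(pts[i], pts[i + 1])
        if d > 0 and acc + d >= step:
            t = (step - acc) / d
            q = (pts[i][0] + t * (pts[i + 1][0] - pts[i][0]),
                 pts[i][1] + t * (pts[i + 1][1] - pts[i][1]))
            out.append(q)
            pts[i] = q          # continue measuring from the inserted point
            acc = 0.0
        else:
            acc += d
            i += 1
    out.extend([pts[-1]] * (n - len(out)))  # pad if rounding left us short
    return out

def normalize(points):
    """Translate to the origin and scale the larger dimension to 1."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    s = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - min(xs)) / s, (y - min(ys)) / s) for x, y in points]

def match_track(track, templates, threshold=0.25):
    """Return the name of the best-matching gesture pattern, or None."""
    probe = normalize(resample(track))
    best, best_d = None, float("inf")
    for name, tpl in templates.items():
        ref = normalize(resample(tpl))
        d = sum(math.dist(a, b) for a, b in zip(probe, ref)) / len(probe)
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= threshold else None

# A "V" drawn on the screen-off first screen selects the associated picture.
gesture = match_track([(0, 0), (240, 480), (480, 0)], TEMPLATES)
target_picture = GESTURE_TO_PICTURE.get(gesture)
```

With these definitions, a track that resembles no stored pattern (for example a straight swipe) yields None, and nothing is displayed on the second screen.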
It should be noted that the "mood transfer" function may be implemented based on an intelligent wake-up function. Specifically, the second screen is also in the screen-off state at this time, and the interface displaying the target picture may be suspended over the unlocking interface of the second screen.
The target picture may be a preset picture for expressing mood, or a picture that the user wants to share with others. For example, a user may take a beautiful landscape photo somewhere and associate that photo with a gesture pattern.
The method comprises the steps that a sliding track input by a user on a first screen is obtained when the first screen is in a screen-off state; and under the condition that the sliding track is matched with a pre-stored gesture graph, controlling the second screen to display a target picture corresponding to the sliding track. Therefore, the user performs gesture graph drawing operation on the first screen and displays the corresponding target picture on the second screen, so that unlocking and picture searching operation can be avoided, and the operation difficulty of picture checking is reduced.
It should be noted that the association between gesture patterns and pictures can be preset. For example, before step 101, the method further includes:
the mobile terminal displays a preset setting interface;
and the mobile terminal receives the user's input on the setting interface and determines the association between the gesture pattern and the currently selected picture.
In the embodiment of the present invention, the gesture pattern "V" is taken as an example. Specifically, as shown in fig. 2, the application interface of the "mood transfer" function displays an operation control labeled "draw 'V' on the other screen to wake up the mood picture"; when the user switches it on, the "mood transfer" function is enabled. The interface then jumps to a setting interface. As shown in fig. 3, the setting interface may include the system's default pictures "picture 1", "picture 2", and "picture 3", and may also include a "custom" option.
After the user selects, say, picture 2 from "picture 1", "picture 2", and "picture 3", the interface jumps to a preview of the selected picture (as shown in fig. 4); on the preview interface, selecting the "apply" option sets the picture associated with the gesture pattern "V". When the user selects "custom", picture-source options are displayed, which may include the three sources camera, screenshot, and WeChat (as shown in fig. 5); after the final picture is chosen, a preview of that picture is displayed (as shown in fig. 6), and selecting the "confirm" option sets the picture associated with the gesture pattern "V". On the interface shown in fig. 6, the user can also edit the picture, for example by cropping it with the dashed box shown in fig. 6.
In this embodiment, the user can set the association between gesture patterns and pictures on the setting interface, so different gesture patterns can be associated with different pictures according to the actual scene, which improves the flexibility of picture display.
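The settings flow just described can be sketched as a tiny persistent store. This is an illustrative stand-in for the real settings interface; the class name, JSON file format, and picture names are assumptions, not the patent's implementation:

```python
import json
from pathlib import Path

class MoodTransferSettings:
    """Backing store for the setting interface: each gesture pattern
    maps to exactly one picture (a system default or a custom one)."""

    DEFAULTS = ["picture_1.jpg", "picture_2.jpg", "picture_3.jpg"]

    def __init__(self, store: Path):
        self.store = store
        self.table = json.loads(store.read_text()) if store.exists() else {}

    def apply_default(self, gesture: str, index: int) -> None:
        # "Apply" on the preview interface (fig. 4): bind a default picture.
        self.table[gesture] = self.DEFAULTS[index]
        self._flush()

    def apply_custom(self, gesture: str, picture_path: str) -> None:
        # "Confirm" after choosing from camera/screenshot/WeChat (figs. 5-6).
        self.table[gesture] = picture_path
        self._flush()

    def picture_for(self, gesture: str):
        # Lookup used at step 102; None if the gesture has no association.
        return self.table.get(gesture)

    def _flush(self) -> None:
        self.store.write_text(json.dumps(self.table))
```

Persisting the table means the associations survive a reboot, which matches the idea that the gesture patterns are "pre-stored" by the time step 101 runs.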
Further, based on the foregoing embodiment, in this embodiment, when the target picture is associated with an application program, after step 102, the method further includes:
the mobile terminal receives unlocking input of a user on the second screen aiming at the target picture to obtain unlocking data;
the mobile terminal unlocks the first screen under the condition that the unlocking data passes verification;
and the mobile terminal runs a target application program associated with the target picture and displays the target application program on the unlocked first screen.
In the embodiment of the present invention, the manner of associating the target picture with an application program may be set according to actual needs. For example, the association between the application program and the target picture may be configured on the setting interface, or the target picture may itself carry the association: for instance, the target picture is an operation interface of the application program, or the target picture carries identification information of the application program.
Specifically, the identification information may be a name, an icon, or anything else that uniquely identifies the application program, which is not further limited herein. For example, the target picture may be a promotional picture of the application program, or a stored photo of a poster, where the picture carries the application program's identification information. Because the target picture is an operation interface of the application program, or carries the application program's identification information, the difficulty of associating an application program with a picture is reduced.
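A minimal sketch of resolving a target picture to an application program, assuming an invented metadata dictionary and app registry — a real terminal would consult its package manager, and none of the names below come from the patent itself:

```python
# Hypothetical registry: identification info (an app name stamped on the
# picture) resolved to an installable package.
INSTALLED_APPS = {
    "GameN": "com.example.game_n",
    "ReaderM": "com.example.reader_m",
}

def resolve_target_app(picture_meta: dict):
    """Return the package associated with a target picture, or None.

    picture_meta is the metadata stored with the picture. It may carry an
    explicit association set on the setting interface ("linked_app"), or
    identification info recognized in the picture itself ("app_name").
    """
    explicit = picture_meta.get("linked_app")  # set via the setting interface
    if explicit:
        return explicit
    return INSTALLED_APPS.get(picture_meta.get("app_name"))
```

The two lookup paths mirror the two association manners named in the text: explicit configuration, or identification information carried by the picture.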
In this embodiment, receiving the unlocking input of the user for the target picture on the second screen to obtain the unlocking data includes:
the mobile terminal receives touch input aiming at the target picture on the second screen by a user;
the mobile terminal responds to the touch input and displays an unlocking interface for unlocking the first screen;
and the mobile terminal acquires unlocking data input by the user on the unlocking interface.
In this embodiment, the touch input may be an operation of tapping, double-tapping, or sliding on the target picture. After the touch input, an unlocking interface may be displayed; the unlocking interface may be a pattern-unlock interface, a numeric-password interface, a fingerprint-unlock interface, a face-recognition interface, or the like, on which the user performs unlock verification. The unlocking data may be any one of an unlock pattern, an unlock character password (digits, letters, and the like), an unlock fingerprint, or unlock facial features input by the user.
It should be noted that, in addition to the unlocking interface, the touch input may also bring up a close control; the user may tap the close control to dismiss the currently displayed target picture and return the second screen to the screen-off state.
After the entered unlocking data passes verification, the first screen can be unlocked and lit up, and the target application program is run and displayed on the first screen. In this way, the user can operate the target application program on the first screen.
For a better understanding of the present invention, the following detailed description is given by way of specific examples.
For example, the mobile terminal belongs to user A, and user B is a friend or relative of user A. On the mobile terminal, target picture 1, corresponding to gesture pattern V, is associated with application N, and target picture 2, corresponding to gesture pattern W, is associated with application M. If user B wants to use application N on the mobile terminal, user B may draw gesture pattern V on the first screen; target picture 1, associated with application N, is then displayed on the second screen, and user A can decide on the second screen whether to allow user B to use application N. If so, user A enters the password on the second screen.
To protect user A's privacy, after user B exits the application program corresponding to the target picture on the first screen, the first screen may be controlled to re-enter the locked state. This limits user B's permissions to an authorized-use state, prevents children or friends from overusing the mobile terminal, and protects the user's privacy.
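The authorized-use state described here can be sketched as a small state holder; the class and field names are illustrative assumptions, not the patent's implementation:

```python
class AuthorizedSession:
    """After user A approves on the second screen, user B may use exactly
    one application on the first screen; when that application exits, the
    first screen relocks automatically."""

    def __init__(self):
        self.first_screen_locked = True
        self.running_app = None

    def authorize(self, app: str) -> None:
        # User A entered the correct password on the second screen.
        self.first_screen_locked = False
        self.running_app = app

    def on_app_exit(self) -> None:
        # User B closed the authorized application: relock immediately,
        # so the rest of the terminal stays private.
        self.running_app = None
        self.first_screen_locked = True
```

Tying the unlock to a single running application is what bounds user B's permissions without sharing the device password.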
In the embodiment of the present invention, entering the unlocking data on the second screen effectively prevents the password from being observed by others, improving the security of the mobile terminal. It also avoids the awkwardness of the owner entering a password while someone else is using the mobile terminal, improving the user experience.
It should be noted that, various optional implementations described in the embodiments of the present invention may be implemented in combination with each other or implemented separately, and the embodiments of the present invention are not limited thereto.
Referring to fig. 7, fig. 7 is a block diagram of a mobile terminal according to an embodiment of the present invention, where the mobile terminal 700 has a first screen and a second screen, and as shown in fig. 7, the mobile terminal 700 includes:
the acquisition module 701 is used for acquiring a sliding track input by a user on a first screen when the first screen is in a screen-off state;
a control module 702, configured to control the second screen to display a target picture corresponding to the sliding track when the sliding track matches a pre-stored gesture pattern.
Optionally, the mobile terminal 700 further includes:
the display module is used for displaying a preset setting interface;
and the first receiving module is used for receiving the user's input on the setting interface and determining the association between the gesture pattern and the currently selected picture.
Optionally, in a case that the target picture is associated with an application, the mobile terminal 700 further includes:
the second receiving module is used for receiving unlocking input of a user on the second screen aiming at the target picture to obtain unlocking data;
the unlocking module is used for unlocking the first screen under the condition that the unlocking data passes verification;
and the processing module is used for running a target application program associated with the target picture and displaying the target application program on the unlocked first screen.
Optionally, the target picture is an operation interface of an application program, or the target picture carries identification information of the application program.
Optionally, the second receiving module includes:
the receiving unit is used for receiving touch input of a user on the second screen aiming at the target picture;
the display unit is used for responding to the touch input and displaying an unlocking interface for unlocking the first screen;
and the acquisition unit is used for acquiring the unlocking data input by the user on the unlocking interface.
The mobile terminal provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 6, and is not described herein again to avoid repetition.
Fig. 8 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention.
The mobile terminal 800 includes, but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power supply 811. Those skilled in the art will appreciate that the mobile terminal architecture illustrated in fig. 8 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein the display unit 806 includes a first screen and a second screen;
the processor 810 is configured to acquire a sliding track input by a user on a first screen while the first screen is in a screen-off state, and to control the second screen, when the sliding track matches a pre-stored gesture pattern, to display a target picture corresponding to the sliding track.
Optionally, the processor 810 is further configured to display a preset setting interface, receive the user's input on the setting interface, and determine the association between the gesture pattern and the currently selected picture.
Optionally, in a case that the target picture is associated with an application, the user input unit 807 is configured to receive an unlocking input of the user on the second screen for the target picture, so as to obtain unlocking data;
the processor 810 is further configured to unlock the first screen if the unlocking data is verified; and running a target application program associated with the target picture, and displaying the target application program on the unlocked first screen.
Optionally, the target picture is an operation interface of an application program, or the target picture carries identification information of the application program.
Optionally, the user input unit 807 is specifically configured to receive a touch input of the user on the second screen for the target picture;
processor 810 is further configured to: responding to the touch input, and displaying an unlocking interface for unlocking the first screen; and acquiring unlocking data input by a user on the unlocking interface.
In the embodiments of the present invention, a sliding track input by a user on the first screen is acquired while the first screen is in a screen-off state; when the sliding track matches a pre-stored gesture pattern, the second screen is controlled to display a target picture corresponding to the sliding track. Because the user draws a gesture pattern on the first screen and the corresponding target picture is displayed on the second screen, unlocking and picture-searching operations are avoided, which reduces the difficulty of viewing pictures.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 801 may be used to receive and send signals during messaging or calls; specifically, it receives downlink data from a base station and forwards it to the processor 810 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 801 can also communicate with the network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 802, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802 or stored in the memory 809 into an audio signal and output as sound. Also, the audio output unit 803 may also provide audio output related to a specific function performed by the mobile terminal 800 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is used for receiving audio or video signals. The input unit 804 may include a graphics processing unit (GPU) 8041 and a microphone 8042. The graphics processor 8041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video-capture or image-capture mode. The processed image frames may be displayed on the display unit 806. The image frames processed by the graphics processor 8041 may be stored in the memory 809 (or another storage medium) or transmitted via the radio frequency unit 801 or the network module 802. The microphone 8042 can receive sound and process it into audio data; in a phone-call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 801.
The mobile terminal 800 also includes at least one sensor 805, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 8061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 8061 and/or the backlight when the mobile terminal 800 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 805 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 806 may include a display panel 8061, and the display panel 8061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like.
The user input unit 807 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 8071 (e.g., operations by a user on or near the touch panel 8071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 8071 may include two portions of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 810, receives a command from the processor 810, and executes the command. In addition, the touch panel 8071 can be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 8071, the user input unit 807 can include other input devices 8072. In particular, other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 8071 can be overlaid on the display panel 8061, and when the touch panel 8071 detects a touch operation on or near the touch panel 8071, the touch operation is transmitted to the processor 810 to determine the type of the touch event, and then the processor 810 provides a corresponding visual output on the display panel 8061 according to the type of the touch event. Although in fig. 8, the touch panel 8071 and the display panel 8061 are two independent components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 8071 and the display panel 8061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 808 is an interface through which an external device is connected to the mobile terminal 800. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 808 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 800 or may be used to transmit data between the mobile terminal 800 and external devices.
The memory 809 may be used to store software programs as well as various data. The memory 809 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile terminal (such as audio data, a phonebook, etc.). Further, the memory 809 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 810 is the control center of the mobile terminal. It connects the various parts of the entire mobile terminal through various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 809 and calling data stored in the memory 809, thereby monitoring the mobile terminal as a whole. The processor 810 may include one or more processing units; preferably, the processor 810 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor may also not be integrated into the processor 810.
The mobile terminal 800 may also include a power supply 811 (e.g., a battery) for powering the various components, and the power supply 811 may be logically coupled to the processor 810 via a power management system that may be used to manage charging, discharging, and power consumption.
In addition, the mobile terminal 800 includes some functional modules that are not shown, and thus, are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 810, a memory 809, and a computer program stored in the memory 809 and executable on the processor 810. When executed by the processor 810, the computer program implements each process of the above picture processing method embodiment and achieves the same technical effect; to avoid repetition, the details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above picture processing method embodiment and achieves the same technical effect; to avoid repetition, the details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (9)

1. A picture processing method, applied to a double-sided screen mobile terminal having a first screen and a second screen, characterized by comprising the following steps:
acquiring, when the first screen is in a screen-off state, a sliding track input by a user on the first screen;
under the condition that the sliding track is matched with a pre-stored gesture graph, controlling the second screen to display a target picture corresponding to the sliding track;
in a case where the target picture is associated with an application program, after controlling the second screen to display the target picture corresponding to the sliding track under the condition that the sliding track matches the pre-stored gesture graph, the method further comprises:
receiving unlocking input of a user on the second screen aiming at the target picture to obtain unlocking data;
unlocking the first screen under the condition that the unlocking data passes verification;
running a target application program associated with the target picture, and displaying the target application program on the unlocked first screen;
the first screen is a main display screen arranged on the front side of the double-sided screen mobile terminal, and the second screen is an auxiliary display screen arranged on the back side of the double-sided screen mobile terminal.
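Outside the claim language itself, the flow of claim 1 can be illustrated with a minimal sketch. The data structures, the matcher, and the verification callback below are assumptions made for illustration only; the claim does not prescribe any particular matching or verification technique.

```python
# Hypothetical sketch of the claim-1 flow: off-screen slide on the first
# screen -> gesture-graph match -> target picture shown on the second
# screen -> unlock input -> verified unlock -> launch the associated app.

def process_slide(track, gesture_table, matcher):
    """Return (picture, app) for the first matching pre-stored gesture,
    or None if no gesture graph matches (both screens stay as they are)."""
    for pattern, (picture, app) in gesture_table.items():
        if matcher(track, pattern):
            return picture, app
    return None

def try_unlock(unlock_data, verify):
    """Unlock the first screen only when the unlock data passes verification."""
    return verify(unlock_data)

# Usage with toy data: exact-sequence matching and a fixed PIN.
table = {("L", "down"): ("chat_icon.png", "ChatApp")}
hit = process_slide(("L", "down"), table, lambda t, p: tuple(t) == p)
unlocked = try_unlock("1234", lambda d: d == "1234")
print(hit, unlocked)
```

A real implementation would use fuzzy trajectory matching rather than exact sequence equality, and the unlock verification would be a pattern, PIN, or biometric check; both are abstracted behind callbacks here.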
2. The method according to claim 1, further comprising, before acquiring the sliding track input by the user on the first screen while the first screen is in the screen-off state:
displaying a preset setting interface;
and receiving input of a user on the setting interface, and determining the association relationship between the gesture graph and the currently selected picture.
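The settings step of claim 2 amounts to recording a mapping from a gesture graph to the currently selected picture. A toy sketch follows; the dictionary representation and names are assumptions, not part of the claim.

```python
# Hypothetical sketch of claim 2: the setting interface records an
# association between a gesture graph and the currently selected picture.

def associate(associations, gesture_graph, picture_id):
    """Store the user's chosen gesture-graph -> picture association."""
    associations[tuple(gesture_graph)] = picture_id
    return associations

prefs = associate({}, ["L", "down"], "gallery/cat.jpg")
print(prefs)
```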
3. The method according to claim 1, wherein the target picture is an operation interface of an application program, or the target picture carries identification information of the application program.
4. The method according to claim 1, wherein receiving the unlocking input of the user on the second screen for the target picture to obtain the unlocking data comprises:
receiving touch input of a user on the second screen aiming at the target picture;
responding to the touch input, and displaying an unlocking interface for unlocking the first screen;
and acquiring unlocking data input by a user on the unlocking interface.
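The three sub-steps of claim 4 can be modeled as a small state machine; the state names and methods below are illustrative assumptions only.

```python
# Hypothetical state machine for claim 4: touching the target picture on
# the second screen -> an unlock interface for the first screen is
# displayed -> unlock data entered on that interface is acquired.

class SecondScreenUnlockFlow:
    def __init__(self):
        self.state = "showing_picture"

    def on_picture_touch(self):
        if self.state == "showing_picture":
            self.state = "unlock_interface"  # display the unlock interface
        return self.state

    def on_unlock_input(self, data):
        if self.state != "unlock_interface":
            return None                      # ignore input outside the flow
        self.state = "acquired"
        return data                          # unlock data, to be verified

flow = SecondScreenUnlockFlow()
flow.on_picture_touch()
print(flow.on_unlock_input("7-3-9-1"))
```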
5. A mobile terminal, the mobile terminal being a double-sided screen mobile terminal having a first screen and a second screen, characterized by comprising:
the acquisition module is used for acquiring a sliding track input by a user on a first screen when the first screen is in a screen-off state;
the control module is used for controlling the second screen to display a target picture corresponding to the sliding track under the condition that the sliding track is matched with a pre-stored gesture graph;
in a case that the target picture is associated with an application program, the mobile terminal further includes:
the second receiving module is used for receiving unlocking input of a user on the second screen aiming at the target picture to obtain unlocking data;
the unlocking module is used for unlocking the first screen under the condition that the unlocking data passes verification;
the processing module is used for running a target application program associated with the target picture and displaying the target application program on the unlocked first screen;
the first screen is a main display screen arranged on the front side of the double-sided screen mobile terminal, and the second screen is an auxiliary display screen arranged on the back side of the double-sided screen mobile terminal.
6. The mobile terminal of claim 5, wherein the mobile terminal further comprises:
the display module is used for displaying a preset setting interface;
and the first receiving module is used for receiving the input of a user on the setting interface and determining the association relation between the gesture graph and the currently selected picture.
7. The mobile terminal according to claim 5, wherein the target picture is an operation interface of an application program, or the target picture carries identification information of the application program.
8. The mobile terminal of claim 5, wherein the second receiving module comprises:
the receiving unit is used for receiving touch input of a user on the second screen aiming at the target picture;
the display unit is used for responding to the touch input and displaying an unlocking interface for unlocking the first screen;
and the acquisition unit is used for acquiring the unlocking data input by the user on the unlocking interface.
9. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, implements the steps of the picture processing method according to any one of claims 1 to 4.
CN201811591700.7A 2018-12-25 2018-12-25 Picture processing method and mobile terminal Active CN109711138B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811591700.7A CN109711138B (en) 2018-12-25 2018-12-25 Picture processing method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811591700.7A CN109711138B (en) 2018-12-25 2018-12-25 Picture processing method and mobile terminal

Publications (2)

Publication Number Publication Date
CN109711138A (en) 2019-05-03
CN109711138B true CN109711138B (en) 2020-08-04

Family

ID=66257567

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811591700.7A Active CN109711138B (en) 2018-12-25 2018-12-25 Picture processing method and mobile terminal

Country Status (1)

Country Link
CN (1) CN109711138B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101778165A (en) * 2010-01-06 2010-07-14 宇龙计算机通信科技(深圳)有限公司 Unlocking method and system of screen and mobile terminal
CN106249943A (en) * 2016-07-13 2016-12-21 广东欧珀移动通信有限公司 A kind of method for information display, device and terminal
CN106354406A (en) * 2016-08-22 2017-01-25 贵州万臻时代通讯技术有限公司 Method and device for controlling double screen mobile phone
CN107278368A (en) * 2017-04-24 2017-10-20 北京小米移动软件有限公司 Screen control method and device
CN108055380A (en) * 2017-11-23 2018-05-18 北京珠穆朗玛移动通信有限公司 Double screen unlocking method, mobile terminal and storage medium
CN108337376A (en) * 2018-01-25 2018-07-27 北京珠穆朗玛移动通信有限公司 Pair screen method for information display, mobile terminal and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101650102B1 (en) * 2009-09-22 2016-08-23 삼성전자주식회사 Method of Providing User Interface of Mobile Terminal Equipped with Touch screen and Mobile Terminal thereof
JP5581986B2 (en) * 2010-11-12 2014-09-03 コニカミノルタ株式会社 Image forming apparatus and computer program


Also Published As

Publication number Publication date
CN109711138A (en) 2019-05-03

Similar Documents

Publication Publication Date Title
CN107977144B (en) Screen capture processing method and mobile terminal
CN108459797B (en) Control method of folding screen and mobile terminal
CN109992231B (en) Screen projection method and terminal
CN109078319B (en) Game interface display method and terminal
CN109241775B (en) Privacy protection method and terminal
CN108459815B (en) Display control method and mobile terminal
WO2021169959A1 (en) Application starting method and electronic device
CN108256308B (en) Face recognition unlocking control method and mobile terminal
CN108629171B (en) Unread message processing method and terminal
CN108108111B (en) Method and device for checking lock screen picture information and mobile terminal
CN111124179A (en) Information processing method and electronic equipment
CN110519443B (en) Screen lightening method and mobile terminal
CN108388459B (en) Message display processing method and mobile terminal
CN111124537B (en) Application starting method and electronic equipment
CN109743634B (en) Video playing control method and terminal
CN110113486B (en) Application icon moving method and terminal
CN109753776B (en) Information processing method and device and mobile terminal
CN111460537A (en) Method for hiding page content and electronic equipment
CN108108608B (en) Control method of mobile terminal and mobile terminal
CN107491685B (en) Face recognition method and mobile terminal
CN109740000B (en) Multimedia file processing method and mobile terminal
CN109711138B (en) Picture processing method and mobile terminal
CN110989832B (en) Control method and electronic equipment
CN111381753B (en) Multimedia file playing method and electronic equipment
CN110879896B (en) Encryption and decryption method for folding screen terminal and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant