CN110647277A - Control method and terminal equipment - Google Patents


Info

Publication number
CN110647277A
Authority
CN
China
Prior art keywords
input
terminal device
camera
target
area
Prior art date
Legal status (assumed; not a legal conclusion): Pending
Application number
CN201910803952.XA
Other languages
Chinese (zh)
Inventor
庾增增
Current Assignee (the listed assignee may be inaccurate)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201910803952.XA priority Critical patent/CN110647277A/en
Publication of CN110647277A publication Critical patent/CN110647277A/en
Priority to PCT/CN2020/111443 priority patent/WO2021037073A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using specific features of the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Abstract

The embodiment of the invention provides a control method and a terminal device, relates to the field of communication technology, and aims to solve the problem that the process by which a terminal device runs an application program is tedious and time-consuming, with poor human-computer interaction. The scheme comprises the following steps: receiving a first input of a user in a target area, and, in response to the first input, executing a first action corresponding to the first input. The target area is an area on the screen of the terminal device corresponding to a first camera located below the screen, and the target area is located in the navigation bar of the terminal device. The method can be applied to terminal devices with a hole-punch (hole-digging) screen.

Description

Control method and terminal equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a control method and terminal equipment.
Background
With the development of terminal technology, more and more applications are installed in terminal equipment.
At present, when a user wants a terminal device to run an application program, the user must search the desktop of the terminal device for the application icon indicating that program and, after finding it, operate the icon to trigger the terminal device to run the application program.
However, with this method, the user must perform the series of operations described above each time an application program is to be run, which makes the process of running an application program tedious and time-consuming and results in poor human-computer interaction.
Disclosure of Invention
The embodiment of the invention provides a control method and terminal equipment, and aims to solve the problems that the process of running an application program by the terminal equipment is complicated and time-consuming, and the man-machine interaction performance is poor.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present invention provides a control method, which is applied to a terminal device, and the method includes receiving a first input of a user in a target area, where the target area is an area on a screen of the terminal device and corresponding to a first camera located below the screen of the terminal device; and in response to the first input, performing a first action corresponding to the first input; and the target area is positioned in the navigation bar of the terminal equipment.
In a second aspect, an embodiment of the present invention provides a terminal device, where the terminal device includes: the device comprises a receiving module and a processing module. The receiving module is used for receiving a first input of a user in a target area, wherein the target area is an area which is on a screen of the terminal equipment and corresponds to a first camera positioned below the screen of the terminal equipment; and the processing module is used for responding to the first input received by the receiving module and executing a first action corresponding to the first input. And the target area is positioned in the navigation bar of the terminal equipment.
In a third aspect, an embodiment of the present invention provides a terminal device, which includes a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the control method in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the control method of the first aspect.
In the embodiment of the present invention, the terminal device may receive a first input of a user in a target area (an area on the screen of the terminal device corresponding to a first camera located below the screen), and may, in response to the first input, perform a first action corresponding to that input; the target area may be located in the navigation bar of the terminal device. With this scheme, when the user wants the terminal device to perform a certain action, for example a first action, the user can perform the first input corresponding to that action in the target area to trigger the terminal device to perform it directly. Therefore, on the basis of increasing the screen-to-body ratio of the terminal device, the reliability and service life of the terminal device can be ensured, the process of performing a certain action can be simplified, and human-computer interaction is improved.
Drawings
Fig. 1 is a schematic architecture diagram of a possible Android operating system according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a control method according to an embodiment of the present invention;
Fig. 3 is a first schematic interface diagram of an application of the control method according to an embodiment of the present invention;
Fig. 4 is a second schematic interface diagram of an application of the control method according to an embodiment of the present invention;
Fig. 5 is a third schematic interface diagram of an application of the control method according to an embodiment of the present invention;
Fig. 6 is a fourth schematic interface diagram of an application of the control method according to an embodiment of the present invention;
Fig. 7 is a fifth schematic interface diagram of an application of the control method according to an embodiment of the present invention;
Fig. 8 is a sixth schematic interface diagram of an application of the control method according to an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
Fig. 10 is a hardware schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The term "and/or" herein describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The symbol "/" herein denotes an "or" relationship between the associated objects; for example, A/B denotes A or B.
The terms "first" and "second," etc. herein are used to distinguish between different objects and are not used to describe a particular order of objects. For example, the first input and the second input, etc. are for distinguishing different inputs, rather than for describing a particular order of inputs.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to serve as examples, illustrations, or descriptions. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the use of these words is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present invention, unless otherwise specified, "a plurality" means two or more, for example, a plurality of elements means two or more elements, and the like.
Some of the nouns or terms referred to in the claims and the specification of the present application will be explained first.
Preview picture collected by the camera: the preview picture captured by the camera currently enabled on the terminal device, i.e., the picture displayed in the camera preview interface.
Camera preview interface: the preview interface of the camera application, i.e., the interface displayed by the terminal device while the camera application runs in the foreground of the terminal device.
Navigation Bar: a shortcut-button bar on the screen of the terminal device. Generally, the navigation bar appears in a peripheral area of the screen of a mobile phone in the form of virtual keys (for example, it may be located in the top, bottom, left, or right area of the screen). The shortcut buttons typically include a back button (back), a home button (home), and a recent-apps button (recents).
Hole-punch screen (hole-digging screen): also called a punch-hole screen. A hole is cut in the screen of the terminal device to expose the front camera disposed at the hole, so that the screen-to-body ratio of the terminal device can be increased without sacrificing the imaging quality of the front camera.
A terminal device provided with a hole-punch screen may include a display layer and a touch layer, and the hole can be made in either of two ways (mode one and mode two below).
Mode one: punch through both the display layer and the touch layer of the screen. In this case, the punched area of the screen can neither display content normally nor respond to the user's touch input.
Mode two: punch through the display layer of the screen while keeping the touch layer. In this case, although the punched area cannot display content normally, the retained touch layer allows it to respond normally to the user's touch input.
It should be noted that the hole-punch screen in the embodiment of the present invention may be implemented in mode two above.
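The difference between the two modes can be restated as a short capability matrix. The sketch below is illustrative only; the type and method names are hypothetical and do not appear in the patent.

```java
// Hypothetical sketch: capabilities of the punched screen area under the
// two hole-punch modes described above. In neither mode can the punched
// area display content; only mode two (display layer punched, touch layer
// retained) can still respond to touch input.
enum PunchMode { THROUGH_BOTH_LAYERS, DISPLAY_LAYER_ONLY }

final class PunchedArea {
    static boolean canDisplay(PunchMode mode) {
        return false; // the display layer is punched through in both modes
    }
    static boolean canRespondToTouch(PunchMode mode) {
        return mode == PunchMode.DISPLAY_LAYER_ONLY; // touch layer retained
    }
}
```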
The embodiment of the invention provides a control method and a terminal device. The terminal device may receive a first input of a user in a target area (an area on the screen of the terminal device corresponding to a first camera located below the screen), and may, in response to the first input, perform a first action corresponding to that input; the target area may be located in the navigation bar of the terminal device. Then, when the user wants the terminal device to perform a certain action, for example a first action, the user may perform the first input corresponding to that action in the target area to trigger the terminal device to perform it directly. Therefore, on the basis of increasing the screen-to-body ratio of the terminal device, the reliability and service life of the terminal device can be ensured, the process of performing a certain action can be simplified, and human-computer interaction is improved.
The terminal device in the embodiment of the present invention may be a terminal device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiment of the present invention is not specifically limited.
The following takes an android operating system as an example to describe a software environment to which the control method provided by the embodiment of the present invention is applied.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is the framework of applications; developers can develop applications based on the application framework layer while complying with its development principles.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking the Android operating system as an example, in the embodiment of the present invention, a developer may develop a software program implementing the control method provided by the embodiment of the present invention based on the system architecture of the Android operating system shown in Fig. 1, so that the control method can operate based on the Android operating system shown in Fig. 1. That is, the processor or the terminal device can implement the control method provided by the embodiment of the present invention by running that software program in the Android operating system.
The terminal device in the embodiment of the present invention may be a mobile terminal or a non-mobile terminal. For example, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the non-mobile terminal may be a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like. The embodiment of the present invention is not specifically limited.
The execution body of the control method provided by the embodiment of the present invention may be the terminal device, or a functional module and/or functional entity in the terminal device capable of implementing the control method; this may be determined according to actual use requirements, and the embodiment of the present invention is not limited thereto. The following exemplarily explains the control method provided by the embodiment of the present invention, taking a terminal device as the execution body.
In the embodiment of the present invention, a camera may be disposed below an area of the screen of the terminal device (for example, the target area in the embodiment of the present invention, which may be the area corresponding to a first camera located below the screen and may be located in the navigation bar). If the user wants the terminal device to perform a certain action related to the camera (for example, the first action in the embodiment of the present invention), the user may perform an input in that area (for example, the first input in the embodiment of the present invention) to trigger the terminal device to directly perform the action corresponding to that input. Therefore, on the basis of increasing the screen-to-body ratio of the terminal device, the reliability and service life of the terminal device can be ensured, the process of performing a certain action can be simplified, and human-computer interaction is improved.
The following describes an exemplary control method provided by an embodiment of the present invention with reference to the drawings.
As shown in fig. 2, an embodiment of the present invention provides a control method, which may include S201 and S202 described below.
S201, the terminal equipment receives a first input of a user in a target area.
The target area can be an area which is on a screen of the terminal device and corresponds to a first camera positioned below the screen of the terminal device, and the target area can be positioned on a navigation bar of the terminal device.
Optionally, in the embodiment of the present invention, the screen of the terminal device may be a hole-digging screen, and the screen may include a display layer and a touch layer. Specifically, in the embodiment of the present invention, a target hole may be disposed at a position of the display layer corresponding to the target area, and the first camera may be disposed at the target hole.
It can be understood that the first camera located below the screen of the terminal device can be understood as a first camera located below the touch layer of the screen of the terminal device.
In the embodiment of the present invention, the hole is cut only in the display layer of the hole-punch screen (i.e., the target hole is disposed in the display layer) and the touch layer is not cut through. Since the display layer is punched through while the touch layer is retained, the first camera is exposed and, at the same time, the touch function of the target area remains normally usable.
For the description of the hole-punch screen, reference may be made to the noun explanation section above; details are not repeated here.
Optionally, in this embodiment of the present invention, the first camera may be a front camera of the terminal device.
Optionally, in the embodiment of the present invention, the terminal device may further include a second camera, and the second camera may be a rear camera of the terminal device.
For the description of the navigation bar, reference may be made to the related description of the navigation bar in the noun explanation section, and details are not repeated here.
Optionally, in the embodiment of the present invention, the navigation bar of the terminal device may be located in any area of the screen of the terminal device. For example, the navigation bar may be located in a top area of the screen of the terminal device, or may be located in a bottom area of the screen of the terminal device, or may be located in a left area of the screen of the terminal device, or may be located in a right area of the screen of the terminal device, which may be determined according to actual usage requirements, and the embodiment of the present invention is not limited.
Optionally, in the embodiment of the present invention, the target area is located in the navigation bar of the terminal device and is the area of the screen corresponding to the first camera below the screen; that is, the first camera is disposed below the target area in the navigation bar. The first camera therefore occupies no effective display area of the screen and does not need to extend and retract mechanically. Thus, on the basis of increasing the screen-to-body ratio of the terminal device, a pop-up front camera (for example, the first camera in the embodiment of the present invention) is avoided, which improves the reliability and service life of the terminal device.
Optionally, in an embodiment of the present invention, the target area may be any area in the navigation bar.
Optionally, in the embodiment of the present invention, the first input may be any possible form of input, such as a click input, a long-press input, a force-press input, a drag input, or a slide input; this may be determined according to actual use requirements, and the embodiment of the present invention is not limited.
The click input may be an input of clicking a first preset number of times. The long-press input may be an input of maintaining contact for a first preset duration. The force-press input, also referred to as a pressure touch input, is an input in which the user presses with a pressure value greater than or equal to a first pressure threshold. The drag input may be an input of dragging in an arbitrary direction. The slide input may be an input of sliding in a first direction.
In an embodiment of the present invention, the first predetermined number of times may be two or more times. The first preset time, the first pressure threshold, and the first direction may be determined according to actual use requirements, and the embodiment of the present invention is not limited.
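The thresholds above (first preset number of times, first preset duration, first pressure threshold) suggest a simple classification step before any action is dispatched. The sketch below is a minimal illustration under assumed names; a real implementation would sit in the operating system's touch-event pipeline, and the patent does not prescribe any code.

```java
// Hypothetical sketch: classify the "first input" from raw touch
// parameters using the thresholds described above.
enum FirstInput { CLICK, LONG_PRESS, FORCE_PRESS, NONE }

final class InputClassifier {
    final int presetClickCount;    // first preset number of times (two or more)
    final long presetDurationMs;   // first preset duration
    final float pressureThreshold; // first pressure threshold

    InputClassifier(int clicks, long durationMs, float pressure) {
        this.presetClickCount = clicks;
        this.presetDurationMs = durationMs;
        this.pressureThreshold = pressure;
    }

    FirstInput classify(int tapCount, long contactMs, float pressure) {
        if (pressure >= pressureThreshold) return FirstInput.FORCE_PRESS;
        if (contactMs >= presetDurationMs) return FirstInput.LONG_PRESS;
        if (tapCount >= presetClickCount) return FirstInput.CLICK;
        return FirstInput.NONE;
    }
}
```

The check order (pressure, then duration, then tap count) is one design choice among several; the patent only requires that each input form be distinguishable.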
S202, the terminal equipment responds to the first input and executes a first action corresponding to the first input.
The first action may be an action related to the first camera.
Optionally, in this embodiment of the present invention, the action related to the first camera may be understood as performing an opening action on an object related to the first camera.
It is understood that the object associated with the first camera may include: a first camera, a second camera, a camera application, a gallery application, and so forth. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
Optionally, in the embodiment of the present invention, the first action may include any one of the following: opening the first camera and displaying the preview picture captured by the first camera; opening the second camera and displaying the preview picture captured by the second camera; or displaying an interface of the gallery application program.
It can be understood that, in the embodiment of the present invention, when the terminal device displays the preview picture captured by the first camera, it is displaying the preview picture captured by its front camera. For example, when the terminal device opens the first camera, it may run the camera application in the foreground and display the preview picture captured by the first camera in the camera preview interface (i.e., the preview interface of the camera application). Likewise, when the terminal device displays the preview picture captured by the second camera, it is displaying the preview picture captured by its rear camera; when the terminal device opens the second camera, it may run the camera application in the foreground and display the preview picture captured by the second camera in the camera preview interface.
For the descriptions of the preview picture acquired by the first camera, the preview picture acquired by the second camera, and the camera preview interface, reference may be made specifically to the description of the preview picture acquired by the camera in the above noun explanation section, which is not described herein again.
Optionally, in the embodiment of the present invention, the interface of the gallery application program may be any one interface in the gallery application program.
Optionally, in this embodiment of the present invention, the first input is different, and the first action performed by the terminal device after responding to the first input may also be different. That is, the user can trigger the terminal device to execute different first actions through different first inputs.
It can be understood that, in the embodiment of the present invention, a specific first input need not be bound to one particular predetermined first action; it is only required that each specific first input corresponds uniquely to one specific first action. This may be determined according to actual use requirements, and the embodiment of the present invention is not limited.
The correspondence of the first input to the first action is exemplarily described below in three possible implementations, respectively.
In a possible implementation manner, if the user needs to use the first camera, the user may click on the target area (that is, the first input is a click input), so that the terminal device may respond to the click input, open the first camera, and display a preview picture acquired by the first camera.
In the embodiment of the present invention, the number of clicks of the click input may be two or more. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
Optionally, in the embodiment of the present invention, when the terminal device displays any interface, the terminal device may respond to the click input, start the first camera, and display a preview picture acquired by the first camera.
Specifically, when the terminal device displays any interface, the user may click on the target area (with two or more clicks), i.e., the terminal device receives the first input of the user. The terminal device may then determine whether the interface it currently displays is the camera preview interface. If the current interface is not the camera preview interface, the terminal device may run the camera application in the foreground and open the first camera, so that it displays the camera preview interface with the preview picture captured by the first camera. If the current interface is the camera preview interface, the terminal device may further determine which camera's preview picture is displayed there: if it is the preview picture captured by the first camera, the terminal device may keep the first camera open and keep displaying its preview picture; if it is the preview picture captured by the second camera, the terminal device may close the second camera, open the first camera, and then display the preview picture captured by the first camera.
For example, in one possible implementation, assume that the interface currently displayed by the terminal device is the interface of a chat application, the first input is a double-click input, and the navigation bar is located in the bottom area of the screen. Then, as shown in (a) in Fig. 3, the terminal device displays the interface 30 of the chat application and the navigation bar 31, where the navigation bar 31 is located in the bottom area of the screen and the target area 32 is located in the navigation bar 31. The user may double-click on the target area 32, i.e., the terminal device receives the first input of the user. In response to the first input, as shown in (b) in Fig. 3, the terminal device may display the camera preview interface and display the preview picture 33 captured by the first camera in it (i.e., the terminal device opens the first camera and displays the preview picture captured by the first camera).
In the embodiment of the present invention, because the user can directly trigger the terminal device, through the first input in the target area, to open the first camera and display the preview picture acquired by the first camera, the user does not need to trigger the terminal device to display the preview picture acquired by the first camera through a plurality of inputs (for example, first searching for an identifier indicating the camera application program, then triggering the terminal device to display the preview interface of the camera application program through an input on the identifier, and finally switching to the first camera through an input in the camera preview interface). Therefore, the operation process of opening the first camera and displaying the preview picture acquired by the first camera can be simplified, and the human-computer interaction performance can be improved.
In another possible implementation manner, if the user needs to use the second camera, the user may press the target area for a long time (that is, the first input is a long-press input), so that the terminal device may open the second camera and display a preview picture acquired by the second camera in response to the long-press input.
Optionally, in the embodiment of the present invention, when the terminal device displays any interface, the terminal device may respond to the long press input, start the second camera, and display a preview picture acquired by the second camera.
Specifically, in the case that the terminal device displays any interface, the user may long-press in the target area, that is, the terminal device receives the first input of the user, and the terminal device may then determine whether the interface it currently displays is the camera preview interface. If the current interface is not the camera preview interface, the terminal device may first run the camera application program in the foreground of the terminal device and open the second camera, so that the terminal device may display the camera preview interface and display, in the camera preview interface, a preview picture acquired by the second camera. If the current interface is the camera preview interface, the terminal device may further determine whether the preview picture displayed in the camera preview interface is acquired by the second camera. If the preview picture acquired by the second camera is displayed in the camera preview interface, the terminal device may keep the second camera on and keep displaying the preview picture acquired by the second camera; if the preview picture acquired by the first camera is displayed in the camera preview interface, the terminal device may close the first camera and open the second camera, so that the terminal device may display the preview picture acquired by the second camera.
For example, in another possible implementation manner, assume that the interface currently displayed by the terminal device is the interface of a chat application, the first input is a long-press input, and the navigation bar is located in the bottom area of the screen of the terminal device. Then, as shown in (a) in fig. 4, the terminal device displays the interface 40 of the chat application and the navigation bar 41, and the target area 42 is located in the navigation bar 41. The user may long-press in the target area 42, that is, the terminal device receives the first input of the user; in response to the first input, as shown in (b) of fig. 4, the terminal device may display a camera preview interface and display, in the camera preview interface, a preview picture 43 acquired by the second camera (that is, the terminal device turns on the second camera and displays the preview picture acquired by the second camera).
In the embodiment of the present invention, because the user can directly trigger the terminal device, through the first input in the target area, to open the second camera and display the preview picture acquired by the second camera, the user does not need to trigger the terminal device to display the preview picture acquired by the second camera through a plurality of inputs (for example, first searching for an identifier indicating the camera application program, then triggering the terminal device to display the preview interface of the camera application program through an input on the identifier, and finally switching to the second camera through an input in the camera preview interface). Therefore, the operation process of opening the second camera and displaying the preview picture acquired by the second camera can be simplified, and the human-computer interaction performance can be improved.
In yet another possible implementation manner, if the user needs to view the gallery application, the user may perform a slide input in the target area (that is, the first input is a slide input), so that the terminal device may display the interface of the gallery application in response to the slide input.
Optionally, in the embodiment of the present invention, when the terminal device displays any interface, the terminal device may respond to the sliding input to display the interface of the gallery application program.
Specifically, in the case that the terminal device displays any interface, the user may slide in the target area (for example, slide to the left), that is, the terminal device receives the first input of the user, and then the terminal device may determine whether the current interface displayed by the terminal device is the interface of the gallery application program. If the current interface is not the interface of the gallery application program, the terminal device can run the gallery application program in the foreground of the terminal device and display the interface of the gallery application program. If the current interface is the interface of the gallery application program, the terminal device can keep running the gallery application program in the foreground of the terminal device and display the interface of the gallery application program.
For example, in yet another possible implementation manner, assume that the interface currently displayed by the terminal device is the interface of a chat application, the first input is a slide input, and the navigation bar is located in the bottom area of the screen of the terminal device. As shown in (a) in fig. 5, the terminal device displays an interface 50 of the chat application and a navigation bar 51, and the target area 52 is located in the navigation bar 51. The user may slide to the left in the target area 52, that is, the terminal device receives the first input of the user; in response to the first input, as shown in (b) in fig. 5, the terminal device may display an interface 53 of the gallery application.
In the embodiment of the present invention, because the user can directly trigger the terminal device to display the interface of the gallery application program through the first input in the target area, the user does not need to first search for an identifier indicating the gallery application program (for example, the application icon of the gallery application program) and then trigger the terminal device to display the interface of the gallery application program through an input on the identifier. Therefore, the operation process of opening and displaying the interface of the gallery application program can be simplified, and the human-computer interaction performance can be improved.
It is understood that, in the embodiment of the present invention, the first action corresponding to each first input described in the above three possible implementations is an exemplary enumeration and does not set any limit to the embodiment of the present invention. In practical implementation, when the first input is a click input with two or more clicks, the terminal device may instead open the second camera and display the preview picture acquired by the second camera in response to the click input; or, when the first input is a long-press input of the user in the target area, the terminal device may instead display the interface of the gallery application program in response to the long-press input; or, when the first input is a sliding input of the user sliding to the left in the target area, the terminal device may instead open the first camera and display the preview picture acquired by the first camera in response to the sliding input.
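The three exemplary pairings of first input and first action can be summarized in a small dispatch table. The sketch below is illustrative only; the gesture names and action strings are hypothetical labels, and, as noted above, other pairings are equally possible.

```python
# Hypothetical mapping from first-input type to first action, following the
# three exemplary implementations: double-click -> first camera, long press
# -> second camera, leftward slide -> gallery interface.
FIRST_ACTIONS = {
    "double_click": "open first camera and display its preview",
    "long_press": "open second camera and display its preview",
    "slide_left": "display gallery application interface",
}


def handle_first_input(input_type):
    # Gestures in the target area with no configured first action are ignored.
    return FIRST_ACTIONS.get(input_type)
```

Remapping the table (for example, sending `double_click` to the second camera) yields the alternative pairings described in the preceding paragraph.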
In the embodiment of the present invention, when the terminal device displays any interface, the user can perform an input in the target area to trigger the terminal device to directly execute the first action corresponding to the input, without first triggering the terminal device to return to the desktop and then searching the desktop for an identifier indicating the action. Therefore, the convenience of executing actions on the terminal device can be improved, and the human-computer interaction performance can be improved.
Optionally, in an embodiment of the present invention, the navigation bar may include at least one virtual key, the target area may be an area where the target virtual key is located, and the target virtual key may be a virtual key in the at least one virtual key.
Optionally, in the embodiment of the present invention, the area where the target virtual key is located may be an area for displaying the target virtual key in a screen of the terminal device, may also be a peripheral area (for example, an annular area surrounding the target virtual key) of the area for displaying the target virtual key in the screen of the terminal device, and may also be an area formed by the area for displaying the target virtual key and the peripheral area of the area in the screen of the terminal device, which may specifically be determined according to an actual use requirement, which is not limited in the embodiment of the present invention.
Optionally, in this embodiment of the present invention, each of the at least one virtual key may be any one of the following: a return key, a home key, or a recently used key. This can be determined according to actual use requirements, and the embodiment of the present invention is not limited.
It should be noted that, in the embodiment of the present invention, each of the at least one virtual key is different.
Optionally, in an embodiment of the present invention, the target virtual key may be any one of the at least one virtual key, for example, the target virtual key may be a return key, a home key, or a recently used key. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
In the embodiment of the invention, the first camera can be arranged in the area where the target virtual key is located, so that the screen occupation ratio of the terminal equipment can be further improved on the basis of ensuring the normal use of the first camera and the target virtual key.
Optionally, in the embodiment of the present invention, a target identifier is displayed in the area of the target area other than the first area, the target identifier may be used to indicate the target virtual key, and the first area may be the projection area of the first camera on the screen of the terminal device. The projection area of the first camera on the screen of the terminal device can be understood as the orthographic projection area of the first camera on the screen of the terminal device.
Optionally, in this embodiment of the present invention, the target area may include a first area and a second area (i.e., the area of the target area other than the first area). The first area may be located at the center of the target area and the second area may be located outside the first area, so that the target identifier can be displayed outside the first camera. Visually, the first camera and the target identifier together form one identifier on the surface of the screen, so that the first camera can be prevented from affecting the display effect of the screen.
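The division of the target area into a central first area and a surrounding second area can be sketched as a simple point classifier. This is a hypothetical sketch assuming, purely for illustration, that both regions are circular and concentric with the camera's projection centre; the function and parameter names are not from the embodiment.

```python
import math


def classify_point(x, y, cx, cy, r_first, r_target):
    """Classify a touch point within the target area.

    Illustrative assumption: the first area is a disc of radius r_first
    centred on the camera's projection centre (cx, cy), and the second
    area is the surrounding annulus out to radius r_target.
    """
    d = math.hypot(x - cx, y - cy)
    if d <= r_first:
        return "first"    # over the camera's orthographic projection area
    if d <= r_target:
        return "second"   # ring where the target identifier is displayed
    return "outside"      # not in the target area
```

Both regions still accept touch input (the first area cannot display content but keeps its touch function, as noted below); the classifier only distinguishes where a point falls.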
For example, as shown in fig. 5 (b), the target region may be 52 shown in fig. 5 (b), the first region may be a mesh filling region 521 shown in fig. 5 (b), and the second region may be a black region 522 shown in fig. 5 (b).
Optionally, in the embodiment of the present invention, the target hole for setting the first camera may be located in the first area. Specifically, the target hole may be provided on the display layer of the first area, and the area on the screen of the terminal device corresponding to the first area cannot display the content but has a touch function.
In the embodiment of the present invention, because the target identifier indicating the target virtual key is displayed in the area of the target area other than the first area, the position of the target virtual key on the screen of the terminal device can be prompted to the user while further improving the screen-to-body ratio, so that the operation convenience and the human-computer interaction performance can be improved.
In the control method provided by the embodiment of the present invention, when a user requires a terminal device to execute a certain action, for example, a first action, the user may execute a first input corresponding to the first action in a target area to trigger the terminal device to directly execute the first action. Therefore, on the basis of improving the screen occupation ratio of the terminal equipment, the use reliability and the service life of the terminal equipment can be ensured, the action execution process of the terminal equipment can be simplified, and the man-machine interaction performance is improved.
Optionally, in an embodiment of the present invention, the first action may include displaying an interface of the gallery application program. After S202, the control method provided in the embodiment of the present invention may further include S203 and S204 described below.
S203, in the case of displaying the target image, the terminal equipment receives a second input of the user in the target area.
The target image may be an image in a gallery application.
Optionally, in the embodiment of the present invention, the second input may be any possible form of input such as a drag input and a slide input, which may be determined specifically according to actual use requirements, and the embodiment of the present invention is not limited.
The drag input may be an input of dragging in an arbitrary direction. The slide input may be an input to slide in a second direction.
In an embodiment of the present invention, the second input is different from the first input. Specifically, the second direction is different from the first direction.
It is to be understood that, in the embodiment of the present invention, the second input may be an input for the target image.
It should be noted that, in the embodiment of the present invention, after the terminal device displays the interface of the gallery application program, a user may trigger the terminal device to display the target image through an input on the interface of the gallery application program.
And S204, the terminal equipment responds to the second input and executes a second action corresponding to the second input on the target image.
Wherein the second action may include any one of the following: displaying the target image in a reduced size, displaying the target image in an enlarged size, and deleting the target image.
Optionally, in this embodiment of the present invention, the second input is different, and the second action executed by the terminal device after responding to the second input may also be different.
Optionally, in the embodiment of the present invention, take the second input being a sliding input as an example. When the second input is an input of the user sliding down in the target area, the terminal device may display the target image in a reduced size in response to the second input. When the second input is an input of the user sliding up in the target area, the terminal device may display the target image in an enlarged size in response to the second input. When the second input is an input of the user sliding to the right in the target area, the terminal device may delete the target image in response to the second input. It is understood that, in this case, after the terminal device deletes the target image, the previous image or the next image of the target image may be automatically displayed.
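The second-action dispatch just described can be sketched as follows. This is an illustrative Python sketch; the function name, the gallery being modeled as a plain list, and the choice of showing the previous image after deletion are all assumptions, not details fixed by the embodiment.

```python
def handle_second_input(direction, images, index):
    """Illustrative sketch of the second-action dispatch on the target image.

    `images` models the gallery as a list and `index` is the target image's
    position; returns (action, images, index). All names are hypothetical.
    """
    if direction == "up":
        return ("enlarge", images, index)
    if direction == "down":
        return ("reduce", images, index)
    if direction == "right":
        # Delete the target image; then show the previous image if one
        # exists, otherwise the next image (as described above).
        images = images[:index] + images[index + 1:]
        index = max(index - 1, 0)
        return ("delete", images, index)
    # Other gestures leave the target image unchanged.
    return (None, images, index)
```

For example, deleting the middle image of a three-image gallery leaves the previous image displayed.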
For example, suppose the terminal device displays a person image, the target image is an image in the gallery application, and the second input is a slide input; further suppose that the navigation bar is located in the bottom area of the screen of the terminal device. In the first implementation, as shown in (a) of fig. 6, the terminal device displays the person image 60 and the navigation bar, and the target area 61 is located in the navigation bar. The user may slide up in the target area 61, that is, the terminal device receives a second input of the user; then, in response to the second input, as shown in (b) of fig. 6, the terminal device may display a first interface 62 including an enlarged image of the person image, that is, the terminal device displays the target image in an enlarged size. In the second implementation, as shown in (a) of fig. 7, the terminal device displays the person image 70 and the navigation bar, and the target area 71 is located in the navigation bar. The user may slide down in the target area 71, that is, the terminal device receives a second input of the user; then, in response to the second input, as shown in (b) of fig. 7, the terminal device may display a second interface 72 including a reduced image of the person image, that is, the terminal device displays the target image in a reduced size. In the third implementation, as shown in (a) of fig. 8, the terminal device displays the person image 80 and the navigation bar, and the target area 81 is located in the navigation bar. The user may slide to the right in the target area 81, that is, the terminal device receives a second input of the user; then, in response to the second input, as shown in (b) of fig. 8, the terminal device deletes the person image and may display a third interface 82 including the previous image of the person image.
It is to be understood that the second input may also be other inputs different from the first input, which may be determined according to actual usage requirements, and the embodiment of the present invention is not limited.
In the embodiment of the present invention, when the terminal device displays the target image, the user can trigger the terminal device, through different inputs in the target area, to display the target image in a reduced size, display the target image in an enlarged size, or delete the target image. That is, the user can directly trigger the terminal device to execute the corresponding action on the target image through an input in the target area, so that the convenience of operating the target image can be improved and the human-computer interaction performance can be improved.
In the embodiment of the present invention, the control methods shown in the above method drawings are all exemplarily described with reference to one drawing in the embodiment of the present invention. In specific implementation, the control methods shown in the above method drawings may also be implemented by combining with any other drawings that may be combined, which are illustrated in the above embodiments, and are not described herein again.
As shown in fig. 9, an embodiment of the present invention provides a terminal device 900, where the terminal device 900 may include a receiving module 901 and a processing module 902. A receiving module 901, configured to receive a first input of a user in a target area; the processing module 902 may be configured to perform a first action corresponding to the first input in response to the first input received by the receiving module 901. The target area may be an area on the screen of the terminal device and corresponding to the first camera located under the screen of the terminal device, and the target area may be located on a navigation bar of the terminal device.
Optionally, in an embodiment of the present invention, the first action may include any one of: and opening the first camera and displaying the preview picture acquired by the first camera, opening the second camera and displaying the preview picture acquired by the second camera, and displaying the interface of the gallery application program.
Optionally, in an embodiment of the present invention, the navigation bar may include at least one virtual key, the target area may be an area where the target virtual key is located, and the target virtual key may be a virtual key in the at least one virtual key.
Optionally, in this embodiment of the present invention, a target identifier is displayed in another area of the target area except the first area, where the target identifier may be used to indicate a target virtual key, and the first area may be a projection area of the first camera on the screen of the terminal device.
Optionally, in an embodiment of the present invention, the first action may include displaying an interface of a gallery application. A receiving module 901, further configured to receive a second input of the user in the target area in a case where the target image is displayed after the processing module 902 performs the first action corresponding to the first input; the processing module 902 may be further configured to, in response to the second input received by the receiving module 901, perform a second action corresponding to the second input on the target image. Wherein, the target image may be an image in a gallery application program, and the second action may include any one of: the target image is displayed in a reduced size, enlarged, and deleted.
The terminal device 900 provided in the embodiment of the present invention can implement each process implemented by the terminal device 900 shown in the foregoing method embodiment, and for avoiding repetition, details are not described here again.
The embodiment of the invention provides a terminal device, which can receive a first input of a user in a target area (an area which is on a screen of the terminal device and corresponds to a first camera positioned below the screen of the terminal device); and may perform a first action corresponding to the first input in response to the first input; wherein the target area can be located in a navigation bar of the terminal device. Then, when the user requires the terminal device to perform a certain action, for example, a first action, the user may perform a first input corresponding to the first action in the target area to trigger the terminal device to directly perform the first action. Therefore, on the basis of improving the screen occupation ratio of the terminal equipment, the use reliability and the service life of the terminal equipment can be ensured, the process of executing a certain action by the terminal equipment can be simplified, and the man-machine interaction performance is improved.
Fig. 10 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention. As shown in fig. 10, the terminal device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 10 is not intended to be limiting, and that terminal devices may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal device, a wearable device, a pedometer, and the like.
The user input unit 107 is used for receiving a first input of a user in the target area; the processor 110 is configured to execute a first action corresponding to the first input in response to the first input received by the user input unit 107. The target area is an area which is on the screen of the terminal equipment and corresponds to the first camera positioned below the screen of the terminal equipment, and the target area is positioned on the navigation bar of the terminal equipment.
It can be understood that, in the embodiment of the present invention, the receiving module 901 in the structural schematic diagram of the terminal device (for example, fig. 9) may be implemented by the user input unit 107. The processing module 902 in the schematic structural diagram of the terminal device (for example, fig. 9) may be implemented by the processor 110.
The embodiment of the invention provides a terminal device, which can receive a first input of a user in a target area (an area which is on a screen of the terminal device and corresponds to a first camera positioned below the screen of the terminal device); and may perform a first action corresponding to the first input in response to the first input; wherein the target area can be located in a navigation bar of the terminal device. Then, when the user requires the terminal device to perform a certain action, for example, a first action, the user may perform a first input corresponding to the first action in the target area to trigger the terminal device to directly perform the first action. Therefore, on the basis of improving the screen occupation ratio of the terminal equipment, the use reliability and the service life of the terminal equipment can be ensured, the process of executing a certain action by the terminal equipment can be simplified, and the man-machine interaction performance is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and may be capable of processing such sound into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 for output.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 10, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the terminal device, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal device 100, or may be used to transmit data between the terminal device 100 and an external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the terminal device (such as audio data and a phonebook), and the like. Further, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the terminal device. It connects the various parts of the entire terminal device by using various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units. Optionally, the processor 110 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, which are not described in detail here.
Optionally, an embodiment of the present invention further provides a terminal device, including the processor 110 shown in fig. 10, the memory 109, and a computer program stored in the memory 109 and executable on the processor 110. When executed by the processor 110, the computer program implements the processes of the foregoing method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements the processes of the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware alone, although in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present application may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), which includes instructions for causing a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present application.
While the embodiments have been described with reference to the accompanying drawings, the invention is not limited to the precise embodiments described above, which are illustrative rather than restrictive; those skilled in the art may make various changes without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. A control method, applied to a terminal device, wherein the method comprises:
receiving a first input of a user in a target area, wherein the target area is an area on a screen of the terminal device corresponding to a first camera located below the screen of the terminal device;
in response to the first input, performing a first action corresponding to the first input;
wherein the target area is located on a navigation bar of the terminal device.
2. The method according to claim 1, wherein the first action comprises any one of: opening the first camera and displaying a preview picture acquired by the first camera; opening a second camera and displaying a preview picture acquired by the second camera; and displaying an interface of a gallery application program.
3. The method of claim 1, wherein the navigation bar comprises at least one virtual key, the target area is an area where a target virtual key is located, and the target virtual key is a virtual key of the at least one virtual key.
4. The method according to claim 3, wherein a target identifier is displayed in an area of the target area other than a first area, the target identifier is used to indicate the target virtual key, and the first area is a projection area of the first camera on the screen of the terminal device.
5. The method of claim 2, wherein the first action comprises displaying an interface of the gallery application;
after the performing the first action corresponding to the first input, the method further comprises:
receiving a second input of a user in the target area in a case that a target image is displayed, wherein the target image is an image in the gallery application program;
in response to the second input, performing a second action on the target image corresponding to the second input;
wherein the second action comprises any one of: displaying the target image in a reduced size; displaying the target image in an enlarged size; and deleting the target image.
6. A terminal device, wherein the terminal device comprises a receiving module and a processing module;
the receiving module is configured to receive a first input of a user in a target area, wherein the target area is an area on a screen of the terminal device corresponding to a first camera located below the screen of the terminal device;
the processing module is configured to perform, in response to the first input received by the receiving module, a first action corresponding to the first input, wherein the first action is an action associated with the first camera;
wherein the target area is located on a navigation bar of the terminal device.
7. The terminal device according to claim 6, wherein the first action comprises any one of: opening the first camera and displaying a preview picture acquired by the first camera; opening a second camera and displaying a preview picture acquired by the second camera; and displaying an interface of a gallery application program.
8. The terminal device according to claim 6, wherein the navigation bar comprises at least one virtual key, the target area is an area where a target virtual key is located, and the target virtual key is a virtual key of the at least one virtual key.
9. The terminal device according to claim 8, wherein a target identifier is displayed in a region other than a first region in the target region, the target identifier is used for indicating the target virtual key, and the first region is a projection region of the first camera on the screen of the terminal device.
10. The terminal device of claim 7, wherein the first action comprises displaying an interface of a gallery application;
the receiving module is further configured to receive a second input of the user in the target area after the processing module performs the first action corresponding to the first input and in a case that a target image is displayed;
the processing module is further configured to perform, in response to the second input received by the receiving module, a second action corresponding to the second input on the target image;
wherein the second action comprises any one of: displaying the target image in a reduced size; displaying the target image in an enlarged size; and deleting the target image; and the target image is an image in the gallery application program.
11. A terminal device, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, implements the steps of the control method according to any one of claims 1 to 5.
12. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the control method according to any one of claims 1 to 5.
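The control flow recited in claims 1 to 5 can be sketched as a simple dispatch: a first input inside the target area over the under-display camera triggers a camera- or gallery-related action, and, once an image is displayed, a second input in the same area acts on that image. This is an illustrative sketch only, not the patented implementation; the gesture names and action strings are hypothetical.

```python
# Hypothetical dispatch tables for the claimed control method: which gesture
# in the target area maps to which action, before and after an image is shown.
FIRST_ACTIONS = {
    "single_tap": "open_first_camera_preview",
    "double_tap": "open_second_camera_preview",
    "long_press": "open_gallery_interface",
}

SECOND_ACTIONS = {
    "pinch_in": "shrink_image",
    "pinch_out": "enlarge_image",
    "swipe_up": "delete_image",
}


def handle_input(gesture: str, in_target_area: bool, image_displayed: bool) -> str:
    """Return the action for a gesture, or 'no_op' outside the target area."""
    if not in_target_area:
        return "no_op"
    # First input: no image shown yet; second input: an image is displayed.
    table = SECOND_ACTIONS if image_displayed else FIRST_ACTIONS
    return table.get(gesture, "no_op")


print(handle_input("long_press", True, False))  # -> open_gallery_interface
print(handle_input("pinch_out", True, True))    # -> enlarge_image
```

The key design point from the claims is that the same screen region (the navigation-bar area over the camera) is reused for both stages, with the displayed state selecting the action table.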
CN201910803952.XA 2019-08-28 2019-08-28 Control method and terminal equipment Pending CN110647277A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910803952.XA CN110647277A (en) 2019-08-28 2019-08-28 Control method and terminal equipment
PCT/CN2020/111443 WO2021037073A1 (en) 2019-08-28 2020-08-26 Control method and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910803952.XA CN110647277A (en) 2019-08-28 2019-08-28 Control method and terminal equipment

Publications (1)

Publication Number Publication Date
CN110647277A true CN110647277A (en) 2020-01-03

Family

ID=68991072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910803952.XA Pending CN110647277A (en) 2019-08-28 2019-08-28 Control method and terminal equipment

Country Status (2)

Country Link
CN (1) CN110647277A (en)
WO (1) WO2021037073A1 (en)


Citations (10)

Publication number Priority date Publication date Assignee Title
CN104679401A (en) * 2013-12-03 2015-06-03 上海思立微电子科技有限公司 Terminal and touch method thereof
CN106921767A (en) * 2017-03-07 2017-07-04 捷开通讯(深圳)有限公司 A kind of mobile terminal of screen accounting high
CN107580179A (en) * 2017-08-18 2018-01-12 维沃移动通信有限公司 A kind of camera starts method and mobile terminal
CN107948496A (en) * 2017-10-09 2018-04-20 广东小天才科技有限公司 Photographic method, device, mobile terminal and the storage medium of mobile terminal
CN108196722A (en) * 2018-01-29 2018-06-22 广东欧珀移动通信有限公司 A kind of electronic equipment and its touch control method, computer readable storage medium
CN108196714A (en) * 2018-01-02 2018-06-22 联想(北京)有限公司 A kind of electronic equipment
CN108491142A (en) * 2018-03-09 2018-09-04 广东欧珀移动通信有限公司 A kind of control method of mobile terminal, mobile terminal and storage medium
CN108803990A (en) * 2018-06-12 2018-11-13 Oppo广东移动通信有限公司 Exchange method, device and terminal
CN109151296A (en) * 2017-06-19 2019-01-04 北京小米移动软件有限公司 Electronic equipment, switching method and device, computer readable storage medium
CN109740519A (en) * 2018-12-29 2019-05-10 联想(北京)有限公司 Control method and electronic equipment

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
CN103167179B (en) * 2013-03-12 2015-03-18 广东欧珀移动通信有限公司 Method for rapidly starting photographing function and mobile device
KR20150072922A (en) * 2013-12-20 2015-06-30 엘지전자 주식회사 Mobile terminal and operation method thereof
CN104216639B (en) * 2014-08-28 2018-02-09 深圳市金立通信设备有限公司 A kind of terminal operation method
CN107147847A (en) * 2017-05-15 2017-09-08 上海与德科技有限公司 A kind of control device, method and the mobile terminal of the camera of mobile terminal
CN110647277A (en) * 2019-08-28 2020-01-03 维沃移动通信有限公司 Control method and terminal equipment
CN110572575A (en) * 2019-09-20 2019-12-13 三星电子(中国)研发中心 camera shooting control method and device
CN111163260B (en) * 2019-12-20 2021-11-19 维沃移动通信有限公司 Camera starting method and electronic equipment


Cited By (8)

Publication number Priority date Publication date Assignee Title
WO2021037073A1 (en) * 2019-08-28 2021-03-04 维沃移动通信有限公司 Control method and terminal device
CN111522478A (en) * 2020-04-17 2020-08-11 维沃移动通信有限公司 Icon moving method and electronic equipment
CN111522478B (en) * 2020-04-17 2021-09-07 维沃移动通信有限公司 Icon moving method and electronic equipment
CN111966237A (en) * 2020-08-06 2020-11-20 Tcl通讯(宁波)有限公司 Touch compensation method and device for perforated screen and terminal
CN111953900A (en) * 2020-08-07 2020-11-17 维沃移动通信有限公司 Picture shooting method and device and electronic equipment
CN111953900B (en) * 2020-08-07 2022-01-28 维沃移动通信有限公司 Picture shooting method and device and electronic equipment
WO2022028495A1 (en) * 2020-08-07 2022-02-10 维沃移动通信有限公司 Picture photographing method and apparatus, and electronic device
EP4195652A4 (en) * 2020-08-07 2024-01-10 Vivo Mobile Communication Co Ltd Picture photographing method and apparatus, and electronic device

Also Published As

Publication number Publication date
WO2021037073A1 (en) 2021-03-04

Similar Documents

Publication Publication Date Title
CN110851051B (en) Object sharing method and electronic equipment
CN111459355B (en) Content sharing method and electronic equipment
CN110502163B (en) Terminal device control method and terminal device
CN111061574A (en) Object sharing method and electronic equipment
CN110489029B (en) Icon display method and terminal equipment
CN110752981B (en) Information control method and electronic equipment
CN111142723B (en) Icon moving method and electronic equipment
CN110928461A (en) Icon moving method and electronic equipment
CN110099296B (en) Information display method and terminal equipment
CN110502162B (en) Folder creating method and terminal equipment
CN110069188B (en) Identification display method and terminal equipment
CN109408072B (en) Application program deleting method and terminal equipment
CN109828731B (en) Searching method and terminal equipment
CN109976611B (en) Terminal device control method and terminal device
CN110703972B (en) File control method and electronic equipment
CN111026299A (en) Information sharing method and electronic equipment
CN110647277A (en) Control method and terminal equipment
CN110750188A (en) Message display method and electronic equipment
CN110502164B (en) Interface display method and electronic equipment
CN111163224B (en) Voice message playing method and electronic equipment
CN110231972B (en) Message display method and terminal equipment
CN111026350A (en) Display control method and electronic equipment
CN108874906B (en) Information recommendation method and terminal
CN111190517B (en) Split screen display method and electronic equipment
CN111338525A (en) Control method of electronic equipment and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination