WO2021037073A1 - Control method and terminal device - Google Patents

Control method and terminal device

Info

Publication number
WO2021037073A1
WO2021037073A1 (PCT/CN2020/111443; CN2020111443W)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal device
camera
input
target
area
Prior art date
Application number
PCT/CN2020/111443
Other languages
English (en)
French (fr)
Inventor
庾增增
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2021037073A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • the embodiments of the present application relate to the field of communication technologies, and in particular, to a control method and terminal equipment.
  • Typically, to run an application program, the user needs to first search the desktop of the terminal device for the application icon indicating that application program, and only after finding the icon can the user operate on it to trigger the terminal device to run the application.
  • The embodiments of the present application provide a control method and a terminal device, to solve the problem that running an application program on a terminal device is a cumbersome and time-consuming process with poor human-computer interaction performance.
  • an embodiment of the present application provides a control method applied to a terminal device.
  • The method includes: receiving a user's first input in a target area, where the target area is an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device; and, in response to the first input, performing a first action corresponding to the first input.
  • an embodiment of the present application provides a terminal device.
  • the terminal device includes a receiving module and a processing module.
  • The receiving module is configured to receive the user's first input in the target area, the target area being the area on the screen of the terminal device corresponding to the first camera located under the screen of the terminal device.
  • The processing module is configured to perform, in response to the first input received by the receiving module, the first action corresponding to the first input.
  • the target area is located in the navigation bar of the terminal device.
  • an embodiment of the present application provides a terminal device, including a processor, a memory, and a computer program stored on the memory and capable of running on the processor.
  • When the computer program is executed by the processor, the steps of the control method of the above-mentioned first aspect are implemented.
  • In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the steps of the control method of the first aspect are implemented.
  • With the embodiments of the present application, the terminal device may receive the user's first input in the target area (the area on the screen of the terminal device corresponding to the first camera located under the screen of the terminal device), and may, in response to the first input, perform a first action corresponding to the first input; the target area may be located in the navigation bar of the terminal device.
  • When the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to that action in the target area to trigger the terminal device to directly perform it.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of this application;
  • FIG. 2 is a schematic diagram of a control method provided by an embodiment of this application;
  • FIG. 3 is the first schematic diagram of an interface to which the control method provided by an embodiment of this application is applied;
  • FIG. 4 is the second schematic diagram of an interface to which the control method provided by an embodiment of this application is applied;
  • FIG. 5 is the third schematic diagram of an interface to which the control method provided by an embodiment of this application is applied;
  • FIG. 6 is the fourth schematic diagram of an interface to which the control method provided by an embodiment of this application is applied;
  • FIG. 7 is the fifth schematic diagram of an interface to which the control method provided by an embodiment of this application is applied;
  • FIG. 8 is the sixth schematic diagram of an interface to which the control method provided by an embodiment of this application is applied;
  • FIG. 9 is a schematic structural diagram of a terminal device provided by an embodiment of this application;
  • FIG. 10 is a schematic hardware diagram of a terminal device provided by an embodiment of this application.
  • The terms "first" and "second" in this document are used to distinguish different objects, rather than to describe a specific order of objects. For example, the first input and the second input are used to distinguish different inputs, rather than to describe a specific order of inputs.
  • Words such as "exemplary" or "for example" are used to indicate an example, illustration, or explanation. Any embodiment or design solution described as "exemplary" or "for example" in the embodiments of the present application should not be construed as being more preferable or advantageous than other embodiments or design solutions. Rather, such words are used to present related concepts in a concrete manner.
  • "Plural" refers to two or more; for example, a plurality of elements refers to two or more elements.
  • Preview image captured by the camera: the preview image captured by the camera currently turned on by the terminal device, that is, the image displayed on the camera preview interface.
  • Camera preview interface: the preview interface of the camera application, i.e., the interface displayed by the terminal device when the camera application runs in the foreground of the terminal device.
  • Navigation bar: the shortcut button bar on the screen of the terminal device.
  • Generally, the navigation bar appears in a peripheral area of the phone screen in the form of virtual buttons (for example, it can be located in the top, bottom, left, or right area of the screen).
  • the shortcut buttons on the screen of the terminal device usually include a back button (back), a home button (home), and a recently used button (recents).
  • Hole-punch screen: also called a hole-drilled screen. It refers to a screen of the terminal device in which a hole is cut to expose the front camera set at the hole. In this way, the screen-to-body ratio of the terminal device can be increased without sacrificing the photographing effect of the front camera.
  • a terminal device provided with a hole-punching screen may include a display layer and a touch layer, and punching holes on the hole-punching screen may be implemented in two ways (namely, the following method 1 and method 2).
  • Method 1: Punch through both the display layer of the screen and the touch layer of the screen. In this case, the hole-punched area of the screen cannot display content normally, nor can it respond normally to the user's touch input.
  • Method 2: Punch through the display layer of the screen and retain the touch layer of the screen. In this case, although the punched area of the screen cannot display content normally, it can still respond normally to the user's touch input, since the touch layer is retained.
  • The hole-punch screen in the embodiments of the present application can be implemented using Method 2 described above.
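As an illustrative aside (not part of the patent text), the behavioral difference between the two hole-punching methods can be modeled in a few lines of Java; the class and method names here are invented for illustration:

```java
// Hypothetical model of the two hole-punch approaches: Method 1 pierces
// both layers, Method 2 pierces only the display layer and keeps the
// touch layer intact.
public class HolePunchScreen {
    public enum PunchMethod { BOTH_LAYERS, DISPLAY_LAYER_ONLY }

    private final PunchMethod method;

    public HolePunchScreen(PunchMethod method) {
        this.method = method;
    }

    // The punched area can never display content normally, regardless of method.
    public boolean holeAreaCanDisplay() {
        return false;
    }

    // Only Method 2 (display layer only) retains the touch layer, so only
    // then does the hole area still respond to the user's touch input.
    public boolean holeAreaRespondsToTouch() {
        return method == PunchMethod.DISPLAY_LAYER_ONLY;
    }
}
```

The second method is what makes the target area usable: touch events over the camera hole still reach the touch layer.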
  • The embodiments of the present application provide a control method and a terminal device.
  • The terminal device can receive a user's first input in a target area (the area on the screen of the terminal device corresponding to the first camera located under the screen of the terminal device) and can, in response to the first input, perform a first action corresponding to the first input; the target area can be located in the navigation bar of the terminal device. Then, when the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to that action in the target area to trigger the terminal device to directly perform it.
  • the terminal device in the embodiment of the present application may be a terminal device with an operating system.
  • The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
  • the following uses the Android operating system as an example to introduce the software environment to which the control method provided in the embodiments of the present application is applied.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of this application.
  • the architecture of the Android operating system includes 4 layers, namely: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime library layer includes a library (also called a system library) and an Android operating system runtime environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system operating environment is used to provide a software environment for the Android operating system.
  • The kernel layer is the operating system layer of the Android operating system and is the lowest level of the Android operating system software.
  • Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the Android operating system.
  • A developer can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program that implements the control method provided in the embodiments of this application, so that the control method can run based on the Android operating system shown in FIG. 1. That is, the processor or the terminal device can implement the control method provided in the embodiments of the present application by running the software program in the Android operating system.
  • the terminal device in the embodiment of the present application may be a mobile terminal or a non-mobile terminal.
  • The mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), etc.
  • the non-mobile terminal may be a personal computer (PC), a television (television, TV), a teller machine, or a self-service machine, etc., which are not specifically limited in the embodiment of the present application.
  • The execution subject of the control method provided in the embodiments of the present application may be the above-mentioned terminal device, or may be a functional module and/or functional entity in the terminal device capable of implementing the control method, which can be determined according to actual usage requirements and is not limited in the embodiments of the present application.
  • a terminal device is taken as an example to illustrate the control method provided in the embodiment of the present application.
  • In the embodiments of the present application, the target area may be the area corresponding to the first camera located under the screen of the terminal device, and the target area may be located in the navigation bar.
  • Since the target area corresponds to a camera, if the user requires the terminal device to perform a certain camera-related action (for example, the first action in the embodiments of this application), the user can, through an input in this area (for example, the first input in the embodiments of this application), trigger the terminal device to directly execute the action corresponding to that input.
  • an embodiment of the present application provides a control method, which may include the following S201 and S202.
  • S201: The terminal device receives a user's first input in the target area.
  • the above-mentioned target area may be an area on the screen of the terminal device and corresponding to the first camera located under the screen of the terminal device, and the target area may be located in the navigation bar of the terminal device.
  • the screen of the terminal device may be a hole-drilling screen, and the screen may include a display layer and a touch layer.
  • a target hole may be provided at the display layer corresponding to the target area, and the above-mentioned first camera may be provided at the target hole.
  • first camera located under the screen of the terminal device may be understood as the first camera located under the touch layer of the screen of the terminal device.
  • Since the hole-punch screen of the embodiments of the present application only punches the display layer (that is, the target hole is set in the display layer) and does not pierce the touch layer, i.e., it penetrates the display layer of the screen while the touch layer of the screen is retained, the first camera can be exposed while ensuring that the touch function of the target area works normally.
  • the above-mentioned first camera may be a front camera of the terminal device.
  • the terminal device may further include a second camera, and the second camera may be a rear camera of the terminal device.
  • the navigation bar of the terminal device may be located in any area of the screen of the terminal device.
  • The navigation bar can be located in the top, bottom, left, or right area of the terminal device screen, which is determined according to actual usage requirements and is not limited in the embodiments of the present application.
  • In the embodiments of the present application, the target area is located in the navigation bar of the terminal device, and the target area is the area on the screen of the terminal device corresponding to the first camera under the screen; that is, the first camera is set under the target area in the navigation bar of the terminal device. It therefore does not occupy the effective display area of the screen, and no retractable (pop-up) mechanism is needed. Thus, on the basis of increasing the screen-to-body ratio of the terminal device, the front camera of the terminal device (such as the first camera in the embodiments of this application) requires no moving parts, so that the reliability and lifespan of the terminal device can be improved.
  • the aforementioned target area may be any area in the aforementioned navigation bar.
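Determining whether a touch falls inside the target area can be sketched as a simple hit test. This is a hypothetical illustration; the patent does not fix the area's shape, so a circular region centered on the camera hole is assumed here:

```java
// Hypothetical hit test for the target area: a circular region of the
// screen centered on the under-display camera hole (shape assumed).
public class TargetAreaHitTest {
    private final double centerX, centerY, radius;

    public TargetAreaHitTest(double centerX, double centerY, double radius) {
        this.centerX = centerX;
        this.centerY = centerY;
        this.radius = radius;
    }

    // True if the touch point (x, y) lies inside the circular target area.
    public boolean contains(double x, double y) {
        double dx = x - centerX;
        double dy = y - centerY;
        return dx * dx + dy * dy <= radius * radius;
    }
}
```

A touch dispatcher would run this test on each touch-down event and route matching events to the camera-control logic instead of the normal navigation-bar handling.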
  • The above-mentioned first input may be any possible form of input, such as a click input, a long-press input, a force-press input, a drag input, or a sliding input, which can be determined according to actual usage requirements and is not limited in the embodiments of the present application.
  • The above-mentioned click input may be an input with a first preset number of clicks.
  • The above-mentioned long-press input may be an input whose contact duration reaches a first preset duration.
  • The above-mentioned force-press input, also called pressure touch input, refers to an input in which the user presses with a pressure value greater than or equal to a first pressure threshold.
  • the aforementioned drag input may be an input of dragging in any direction.
  • the aforementioned sliding input may be an input of sliding in the first direction.
  • the above-mentioned first preset number of times may be two or more times.
  • the foregoing first preset duration, the foregoing first pressure threshold, and the first direction may all be determined according to actual use requirements, which are not limited in the embodiment of the present application.
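The input forms above can be distinguished by a small classifier over a gesture's duration, pressure, travel distance, and tap count. The thresholds below are invented placeholders; the patent explicitly leaves the "first preset duration", "first pressure threshold", and similar parameters to actual usage requirements:

```java
// Hypothetical classifier for the first input; threshold values are
// illustrative assumptions, not values from the patent.
public class FirstInputClassifier {
    public enum InputType { CLICK, LONG_PRESS, FORCE_PRESS, SLIDE, NONE }

    static final long LONG_PRESS_MS = 500;     // "first preset duration"
    static final double FORCE_PRESSURE = 0.8;  // "first pressure threshold"
    static final double SLIDE_MIN_PX = 24.0;   // minimal travel to count as a slide
    static final int MIN_TAP_COUNT = 2;        // "first preset number": two or more

    public static InputType classify(long contactMs, double pressure,
                                     double movedPx, int tapCount) {
        if (pressure >= FORCE_PRESSURE) return InputType.FORCE_PRESS;
        if (movedPx >= SLIDE_MIN_PX)    return InputType.SLIDE;
        if (contactMs >= LONG_PRESS_MS) return InputType.LONG_PRESS;
        if (tapCount >= MIN_TAP_COUNT)  return InputType.CLICK;
        return InputType.NONE;
    }
}
```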
  • S202: The terminal device performs a first action corresponding to the first input.
  • the above-mentioned first action may be an action related to the first camera.
  • the above-mentioned action related to the first camera may be understood as performing an opening action on an object related to the first camera.
  • objects related to the first camera may include: a first camera, a second camera, a camera application, a gallery application, and so on.
  • the details can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
  • Optionally, the above-mentioned first action may include any one of the following: turning on the first camera and displaying the preview image collected by the first camera; turning on the second camera and displaying the preview image collected by the second camera; and displaying the interface of the gallery application.
  • The terminal device displaying the preview image collected by the first camera means that the terminal device displays the preview image collected by its front camera; for example, the terminal device can turn on the first camera while running the camera application in the foreground of the terminal device, and display the preview image collected by the first camera in the camera preview interface (that is, the preview interface of the camera application).
  • The terminal device displaying the preview image captured by the second camera means that the terminal device displays the preview image captured by its rear camera; for example, the terminal device can run the camera application in the foreground of the terminal device while the second camera is turned on, and display the preview image collected by the second camera in the camera preview interface (that is, the preview interface of the camera application).
  • the interface of the aforementioned gallery application may be any interface in the gallery application.
  • Different first inputs may cause the terminal device to perform different first actions; that is, the user can trigger the terminal device to perform different first actions through different first inputs. Which specific first input triggers which specific first action is not restricted; it is only necessary that each specific first input uniquely corresponds to one first action.
  • the details can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
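The one-to-one correspondence between first inputs and first actions can be sketched as a simple dispatch table. This is a hypothetical illustration; the particular pairings below are only the examples used in this document's three implementation manners (double-click for the front camera, long press for the rear camera, slide left for the gallery):

```java
// Hypothetical dispatch table mapping each first input to exactly one
// first action, as the unique-correspondence requirement demands.
public class ActionDispatcher {
    public enum Input { DOUBLE_CLICK, LONG_PRESS, SLIDE_LEFT }
    public enum Action { OPEN_FRONT_CAMERA_PREVIEW, OPEN_REAR_CAMERA_PREVIEW, SHOW_GALLERY }

    public static Action dispatch(Input input) {
        switch (input) {
            case DOUBLE_CLICK: return Action.OPEN_FRONT_CAMERA_PREVIEW;
            case LONG_PRESS:   return Action.OPEN_REAR_CAMERA_PREVIEW;
            case SLIDE_LEFT:   return Action.SHOW_GALLERY;
        }
        throw new IllegalArgumentException("unmapped input");
    }
}
```

Because the mapping is a total function from inputs to actions, each input uniquely corresponds to one action, but the pairings themselves can be changed freely to suit actual usage requirements.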
  • In a first possible implementation manner, the user can click in the target area (that is, the first input is a click input), so that the terminal device can, in response to the click input, turn on the first camera and display the preview image captured by the first camera.
  • the number of clicks for the above-mentioned click input may be two or more times.
  • the details can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
  • When the terminal device displays any interface, the terminal device may respond to the above-mentioned click input by turning on the first camera and displaying the preview image collected by the first camera.
  • Specifically, when the terminal device displays any interface, the user can click in the target area (the number of clicks being two or more), i.e., the terminal device receives the user's first input. The terminal device can then first determine whether the currently displayed interface is the camera preview interface. If the current interface is not the camera preview interface, the terminal device can run the camera application in the foreground and turn on the first camera, so that the terminal device displays the camera preview interface and shows in it the preview image collected by the first camera. If the current interface is the camera preview interface, the terminal device can further determine whether the camera preview interface displays the preview image captured by the first camera: if so, the terminal device can keep the first camera on and keep displaying its preview image; if the preview image captured by the second camera is displayed instead, the terminal device can turn off the second camera and turn on the first camera, and then display the preview image captured by the first camera.
  • For example, suppose the terminal device displays the interface 30 of a chat application and the navigation bar 31, where the navigation bar 31 is located in the bottom area of the screen of the terminal device and the target area 32 is located in the navigation bar 31. The user can double-click in the target area 32, i.e., the terminal device receives the user's first input, so that in response to the first input, as shown in Figure 3(b), the terminal device can display the camera preview interface and show in it the preview image 33 captured by the first camera (that is, the terminal device turns on the first camera and displays the preview image captured by the first camera).
  • Since the user can directly trigger the terminal device to turn on the first camera and display the preview image collected by the first camera through the first input in the target area, the user does not need multiple inputs (for example, first searching for the identifier indicating the camera application, then inputting on that identifier to trigger the terminal device to display the preview interface of the camera application, and then inputting on the camera-switch identifier in the camera preview interface) to trigger the terminal device to display the preview image collected by the first camera. Therefore, the operation process of turning on the first camera and displaying its preview image can be simplified, which improves human-computer interaction performance.
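The decision flow just described (check whether the current interface is the camera preview interface, then launch, keep, or switch accordingly) can be sketched as a small state check. Names are invented for illustration; this is a sketch of the described flow, not the patent's implementation:

```java
// Hypothetical sketch of the double-click decision flow for the first
// (front) camera; enum and method names are invented.
public class FrontCameraClickHandler {
    public enum Screen { OTHER_INTERFACE, FRONT_PREVIEW, REAR_PREVIEW }
    public enum Step { LAUNCH_CAMERA_WITH_FRONT, KEEP_FRONT_PREVIEW, SWITCH_REAR_TO_FRONT }

    public static Step onDoubleClick(Screen current) {
        switch (current) {
            case OTHER_INTERFACE: return Step.LAUNCH_CAMERA_WITH_FRONT; // not in camera preview yet
            case FRONT_PREVIEW:   return Step.KEEP_FRONT_PREVIEW;       // already showing front preview
            case REAR_PREVIEW:    return Step.SWITCH_REAR_TO_FRONT;     // close rear camera, open front
        }
        throw new IllegalStateException("unreachable");
    }
}
```

The long-press flow for the second (rear) camera is the mirror image: the same three-way check with the roles of the front and rear cameras exchanged.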
  • In a second possible implementation manner, the user can press and hold in the target area (that is, the first input is a long-press input), so that the terminal device can, in response to the long-press input, turn on the second camera and display the preview image captured by the second camera.
  • When the terminal device displays any interface, the terminal device may respond to the aforementioned long-press input by turning on the second camera and displaying the preview image collected by the second camera.
  • Specifically, when the terminal device displays any interface, the user can press and hold the target area, i.e., the terminal device receives the user's first input. The terminal device can then first determine whether the currently displayed interface is the camera preview interface. If it is not, the terminal device can run the camera application in the foreground and turn on the second camera, so that the terminal device displays the camera preview interface and shows in it the preview image captured by the second camera. If the current interface is the camera preview interface, the terminal device can further determine whether the preview image captured by the second camera is displayed: if so, the terminal device can keep the second camera on and keep displaying its preview image; if the preview image captured by the first camera is displayed instead, the terminal device can turn off the first camera and turn on the second camera, so that the terminal device displays the preview image captured by the second camera.
  • For example, suppose the terminal device displays the interface 40 of a chat application and the navigation bar 41, and the target area 42 is located in the navigation bar 41. The user can press and hold the target area 42, i.e., the terminal device receives the user's first input, so that in response the terminal device can display the camera preview interface and show in it the preview image 43 captured by the second camera (that is, the terminal device turns on the second camera and displays the preview image captured by the second camera).
  • Since the user can directly trigger the terminal device to turn on the second camera and display the preview image collected by the second camera through the first input in the target area, the user does not need multiple inputs (for example, first searching for the identifier indicating the camera application, then inputting on that identifier to trigger the terminal device to display the preview interface of the camera application, and then inputting on the camera-switch identifier in the camera preview interface) to trigger the terminal device to display the preview image collected by the second camera. Therefore, the operation process of turning on the second camera and displaying its preview image can be simplified, which improves human-computer interaction performance.
  • In a third possible implementation manner, the user can slide in the target area (that is, the first input is a sliding input), so that the terminal device can, in response to the sliding input, display the interface of the gallery application.
  • When the terminal device displays any interface, the terminal device may display the interface of the gallery application in response to the above-mentioned sliding input.
  • Specifically, when the terminal device displays any interface, the user can slide in the target area (for example, slide to the left), i.e., the terminal device receives the user's first input. The terminal device can then first determine whether the currently displayed interface is the interface of the gallery application. If it is not, the terminal device can run the gallery application in the foreground of the terminal device and display its interface. If it is, the terminal device can keep running the gallery application in the foreground and keep displaying its interface.
  • For example, suppose the interface currently displayed on the terminal device is the interface of a chat application, the first input is a sliding input in the target area, and the navigation bar is located on the screen of the terminal device.
  • Suppose the terminal device displays the chat application interface 50 and the navigation bar 51, and the target area 52 is located in the navigation bar 51. The user can slide left in the target area 52, that is, the terminal device receives the user's first input, so that in response to the first input the terminal device can display an interface 53 of the gallery application.
  • Since the user can directly trigger the terminal device to display the interface of the gallery application through the first input in the target area, the user does not need to first search for the identifier indicating the gallery application (for example, the application icon of the gallery application) and then input on that identifier to trigger the terminal device to display the gallery application's interface. Therefore, the operation process of displaying the interface of the gallery application can be simplified, and human-computer interaction performance can be improved.
  • It should be noted that the first action corresponding to each first input described in the foregoing three possible implementation manners is an exemplary enumeration, which does not impose any limitation on the embodiments of the present application.
  • For example, when the first input is the user's long-press input in the target area, the terminal device may also, in response to the long-press input, turn on the second camera and display the preview image collected by the second camera, or display the interface of the gallery application; or, when the first input is the user's sliding input in the target area with a sliding direction to the left, the terminal device may also, in response to the sliding input, turn on the first camera and display the preview image collected by the first camera.
  • Since the user can perform an input in the target area while the terminal device displays any interface, thereby triggering the terminal device to directly perform the first action corresponding to that input, the user does not need to first trigger the terminal device to return to the desktop and then search the desktop for the identifier used to indicate the action. This improves the convenience of performing actions on the terminal device and improves the human-computer interaction performance.
  • the navigation bar may include at least one virtual button
  • the target area may be an area where the target virtual button is located
  • the target virtual button may be a virtual button of the at least one virtual button
  • The area where the target virtual key is located can be understood as the area of the terminal device's screen used to display the target virtual key, or the peripheral area of that area (for example, an annular area surrounding the target virtual key), or the area formed by the display area of the target virtual key together with its peripheral area. It can be determined according to actual use requirements, and the embodiments of the present application do not limit it.
  • Each of the above-mentioned at least one virtual key may be any one of the following: a return key, a home key, and a recents key.
  • the details can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
  • The virtual keys in the above-mentioned at least one virtual key are different from one another.
  • the above-mentioned target virtual button may be any one of the above-mentioned at least one virtual button.
  • the above-mentioned target virtual button may be a return button, a home button or a recently used button.
  • the details can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
  • the screen-to-body ratio of the terminal device can be further increased on the basis of ensuring the normal use of the first camera and the target virtual button.
  • A target identifier is displayed in the target area other than the first area; the target identifier may be used to indicate the target virtual key, and the first area may be the projection area of the first camera on the screen of the terminal device, which can be understood as the orthographic projection area of the first camera on the screen of the terminal device.
  • the aforementioned target area may include a first area and a second area (that is, other areas in the target area except the first area).
  • The first area may be located in the center of the target area, and the second area may be located outside the first area, so that the target identifier can be displayed around the first camera. In this way, visually, the first camera and the target identifier together form a mark on the surface of the screen, preventing the first camera from affecting the display effect of the screen.
  • The target area may be the area 52 shown in (b) in FIG. 5, the first area may be the grid-filled area 521 shown in (b) in FIG. 5, and the second area may be the black area 522 shown in (b) in FIG. 5.
  • the above-mentioned target hole for setting the first camera may be located in the first area.
  • The target hole may be provided in the display layer at the first area, so the area of the terminal device's screen corresponding to the first area cannot display content but retains the touch function.
  • In this way, on the basis of further increasing the screen-to-body ratio, the user can be prompted of the location of the target virtual key on the screen of the terminal device, which can improve the convenience of operation and the human-computer interaction performance.
  • When the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to the first action in the target area to trigger the terminal device to directly execute the first action. In this way, the process of executing actions by the terminal device can be simplified, and the human-computer interaction performance can be improved.
  • the above-mentioned first action may include displaying an interface of a gallery application.
  • the control method provided in the embodiment of the present application may further include the following S203 and S204.
  • the terminal device receives a second input of the user in the target area.
  • the above-mentioned target image may be an image in a gallery application.
  • the above-mentioned second input may be any possible input such as a drag input, a sliding input, etc., which may be specifically determined according to actual use requirements, and the embodiment of the present application does not limit it.
  • the aforementioned drag input may be an input of dragging in any direction.
  • the aforementioned sliding input may be an input of sliding in the second direction.
  • the above-mentioned second input is different from the above-mentioned first input.
  • the second direction is different from the first direction.
  • the above-mentioned second input may be an input for a target image.
  • the user can trigger the terminal device to display the target image through an input on the interface of the gallery application.
  • the terminal device In response to the second input, the terminal device performs a second action corresponding to the second input on the target image.
  • the above-mentioned second action may include any one of the following: zoom out and display the target image, zoom in and display the target image, and delete the target image.
  • the second action performed by the terminal device after responding to the second input may also be different.
  • the second input is a sliding input as an example.
  • the terminal device can zoom out and display the target image in response to the second input.
  • the terminal device can zoom in and display the target image in response to the second input.
  • The terminal device can delete the target image in response to the second input. Understandably, in this case, after deleting the target image, the terminal device can automatically display the previous image or the next image.
  • the terminal device displays a character image
  • the target image is an image in a gallery application
  • the second input is a sliding input
  • the navigation bar is located at the bottom area of the terminal device.
  • the terminal device displays the character image 60 and the navigation bar
  • the target area 61 is located in the navigation bar, and the user can slide upwards in the target area 61, that is, the terminal device
  • The terminal device may respond to the second input and, as shown in (b) of FIG. 6, display a first interface 62 that includes an enlarged image of the character image; that is, the terminal device zooms in and displays the target image.
  • the terminal device displays the character image 70 and the navigation bar, and the target area 71 is located in the navigation bar.
  • the user can slide down on the target area 71, that is, the terminal device receives
  • The terminal device can then respond to the second input and, as shown in (b) in FIG. 7, display a second interface 72 that includes a reduced image of the character image; that is, the terminal device zooms out and displays the target image.
  • the terminal device displays a character image 80 and a navigation bar, and the target area 81 is located in the navigation bar.
  • The user can slide right in the target area 81, that is, the terminal device receives the user's second input. The terminal device can then respond to the second input: as shown in (b) in FIG. 8, the terminal device deletes the character image and can display a third interface 82 that includes the previous image of the character image.
  • The above-mentioned second input may also be another input different from the above-mentioned first input, and may be specifically determined according to actual use requirements, which is not limited in the embodiments of the present application.
  • When the terminal device displays the target image, the user can trigger the terminal device to zoom in on, zoom out of, or delete the target image through different inputs in the target area; that is, the user can directly trigger, through an input in the target area, the terminal device to perform the corresponding action on the target image. This increases the convenience of operating on the target image and improves the human-computer interaction performance.
  • The control methods shown in each of the foregoing method drawings are exemplarily described in conjunction with one drawing of the embodiments of the present application. In specific implementation, the control methods shown in the above drawings can also be implemented in combination with any other drawings that can be combined as illustrated in the above embodiments, and details are not repeated here.
  • the terminal device 900 may include a receiving module 901 and a processing module 902.
  • the receiving module 901 may be used to receive a user's first input in the target area; the processing module 902 may be used to perform a first action corresponding to the first input in response to the first input received by the receiving module 901.
  • the target area may be an area on the screen of the terminal device and corresponding to the first camera located under the screen of the terminal device, and the target area may be located in the navigation bar of the terminal device.
  • the above-mentioned first action may include any one of the following: turning on the first camera and displaying the preview image collected by the first camera, turning on the second camera and displaying the preview image collected by the second camera, and displaying The interface of the gallery application.
  • the aforementioned navigation bar may include at least one virtual button
  • the target area may be an area where the target virtual button is located
  • the target virtual button may be a virtual button of the at least one virtual button
  • a target identifier is displayed in the above-mentioned target area other than the first area.
  • the target identifier may be used to indicate the target virtual button.
  • The first area may be the projection area of the first camera on the screen of the terminal device.
  • the above-mentioned first action may include displaying an interface of a gallery application.
  • the receiving module 901 can also be used to receive the user's second input in the target area when the target image is displayed after the processing module 902 performs the first action corresponding to the first input; the processing module 902 can also be used to In response to the second input received by the receiving module 901, a second action corresponding to the second input is performed on the target image.
  • the target image may be an image in a gallery application, and the second action may include any one of the following: zoom out and display the target image, zoom in and display the target image, and delete the target image.
  • the terminal device 900 provided in the embodiment of the present application can implement the various processes implemented by the terminal device 900 shown in the foregoing method embodiments. To avoid repetition, details are not described herein again.
  • An embodiment of the present application provides a terminal device, which can receive a user's first input in a target area (an area on the screen of the terminal device corresponding to the first camera located under the screen of the terminal device), and, in response to the first input, perform a first action corresponding to the first input, wherein the target area may be located in the navigation bar of the terminal device. Then, when the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to the first action in the target area to trigger the terminal device to directly perform the first action.
  • FIG. 10 is a schematic diagram of the hardware structure of a terminal device that implements each embodiment of the present application.
  • The terminal device 100 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components. Those skilled in the art can understand that the structure of the terminal device shown in FIG. 10 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than shown, combine certain components, or arrange the components differently.
  • terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminal devices, wearable devices, and pedometers.
  • the user input unit 107 is configured to receive a user's first input in the target area; the processor 110 is configured to perform a first action corresponding to the first input in response to the first input received by the user input unit 107.
  • the target area is an area on the screen of the terminal device and corresponding to the first camera located under the screen of the terminal device, and the target area is located in the navigation bar of the terminal device.
  • the receiving module 901 in the above-mentioned structural schematic diagram of the terminal device may be implemented by the above-mentioned user input unit 107.
  • the processing module 902 in the above-mentioned structural schematic diagram of the terminal device may be implemented by the above-mentioned processor 110.
  • An embodiment of the present application provides a terminal device, which can receive a user's first input in a target area (an area on the screen of the terminal device corresponding to the first camera located under the screen of the terminal device), and, in response to the first input, perform a first action corresponding to the first input, wherein the target area may be located in the navigation bar of the terminal device. Then, when the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to the first action in the target area to trigger the terminal device to directly perform the first action.
  • The radio frequency unit 101 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, after receiving downlink data from a base station, it delivers the data to the processor 110 for processing; in addition, it sends uplink data to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042.
  • The graphics processor 1041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames can be displayed on the display unit 106.
  • the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • In the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output.
  • the terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • The proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear.
  • As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the terminal device (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer and tapping). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 107 can be used to receive input digital or character information, and generate key signal inputs related to user settings and function control of the terminal device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 110, and receives and executes the commands sent by the processor 110.
  • the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • After the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
  • Although in FIG. 10 the touch panel 1071 and the display panel 1061 are two independent components used to implement the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device, which is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device with the terminal device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • The interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the terminal device 100, or to transfer data between the terminal device 100 and an external device.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • The program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), etc.; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book), etc.
  • The memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The processor 110 is the control center of the terminal device. It uses various interfaces and lines to connect the various parts of the entire terminal device, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, so as to monitor the terminal device as a whole.
  • The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, etc., and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may also not be integrated into the processor 110.
  • The terminal device 100 may also include a power supply 111 (such as a battery) for supplying power to the various components. Optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • the terminal device 100 includes some functional modules not shown, which will not be repeated here.
  • Optionally, an embodiment of the present application further provides a terminal device, including a processor 110 as shown in FIG. 10, a memory 109, and a computer program stored in the memory 109 and runnable on the processor 110. When the computer program is executed by the processor 110, each process of the foregoing method embodiments is realized, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • The embodiments of the present application also provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the above-mentioned method embodiments is realized, and the same technical effect can be achieved; to avoid repetition, details are not repeated here. The computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, etc.
  • Through the description of the above embodiments, those skilled in the art can understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform; of course, they can also be implemented by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present application can be embodied in the form of a computer software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions to cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the methods described in the embodiments of the present application.
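The second-input handling summarized in the bullets above (slide up to zoom in, slide down to zoom out, slide right to delete and show the adjacent image, per FIGS. 6–8) can be sketched as follows. This is a minimal illustrative model: the `Gallery` class, the direction names, and the 2x zoom step are assumptions for the sketch, not details from the application.

```python
class Gallery:
    """Illustrative model of performing a second action on the target image."""

    def __init__(self, images, current=0):
        self.images = list(images)
        self.current = current
        self.scale = 1.0  # display scale of the target image

    def current_image(self):
        if 0 <= self.current < len(self.images):
            return self.images[self.current]
        return None

    def on_second_input(self, direction):
        """Dispatch a sliding second input in the target area to a second action."""
        if direction == "up":        # FIG. 6: slide up -> zoom in on the target image
            self.scale *= 2.0
            return "zoom in"
        if direction == "down":      # FIG. 7: slide down -> zoom out of the target image
            self.scale /= 2.0
            return "zoom out"
        if direction == "right":     # FIG. 8: slide right -> delete the target image
            del self.images[self.current]
            # After deletion, automatically display the previous image if one
            # exists, otherwise the next one (which now sits at the same index).
            if self.current > 0:
                self.current -= 1
            return "delete"
        return "ignored"             # other directions are not second inputs here
```

For example, deleting the middle image of three then shows the previous one, matching the third interface 82 of FIG. 8.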

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present application provides a control method and a terminal device. The solution includes: receiving a user's first input in a target area, and in response to the first input, performing a first action corresponding to the first input; wherein the target area is an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device, and the target area is located in the navigation bar of the terminal device.

Description

Control method and terminal device
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 201910803952.X, filed in China on August 28, 2019, the entire contents of which are incorporated herein by reference.
Technical field
The embodiments of the present application relate to the field of communication technologies, and in particular to a control method and a terminal device.
Background
With the development of terminal technologies, more and more application programs are installed in terminal devices.
At present, when a user requires a terminal device to run an application program, the user needs to first search the desktop of the terminal device for the application icon indicating the application program and, after finding it, operate on the application icon to trigger the terminal device to run the application program.
However, according to the above method, when the user requires the terminal device to run an application program, the above series of operations must be performed before the terminal device is triggered to run the application program, making the process of running an application program on the terminal device cumbersome and time-consuming, with poor human-computer interaction performance.
Summary
The embodiments of the present application provide a control method and a terminal device, to solve the problem that the process of running an application program on a terminal device is cumbersome and time-consuming and the human-computer interaction performance is poor.
To solve the above technical problem, the present application is implemented as follows:
In a first aspect, an embodiment of the present application provides a control method applied to a terminal device. The method includes: receiving a user's first input in a target area, the target area being an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device; and in response to the first input, performing a first action corresponding to the first input; wherein the target area is located in the navigation bar of the terminal device.
In a second aspect, an embodiment of the present application provides a terminal device. The terminal device includes a receiving module and a processing module. The receiving module is configured to receive a user's first input in a target area, the target area being an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device; the processing module is configured to perform, in response to the first input received by the receiving module, a first action corresponding to the first input; wherein the target area is located in the navigation bar of the terminal device.
In a third aspect, an embodiment of the present application provides a terminal device, including a processor, a memory, and a computer program stored in the memory and runnable on the processor, where the computer program, when executed by the processor, implements the steps of the control method of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the control method of the first aspect.
In the embodiments of the present application, the terminal device can receive a user's first input in a target area (an area on the screen of the terminal device corresponding to the first camera located under the screen of the terminal device), and can, in response to the first input, perform a first action corresponding to the first input, wherein the target area may be located in the navigation bar of the terminal device. With this solution, when the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to the first action in the target area to trigger the terminal device to directly perform the first action. In this way, on the basis of increasing the screen-to-body ratio of the terminal device, not only can the use reliability and service life of the terminal device be ensured, but the process of the terminal device performing an action can also be simplified, improving the human-computer interaction performance.
Brief description of the drawings
FIG. 1 is a schematic architectural diagram of a possible Android operating system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a control method according to an embodiment of the present application;
FIG. 3 is a first schematic diagram of an interface to which the control method according to an embodiment of the present application is applied;
FIG. 4 is a second schematic diagram of an interface to which the control method according to an embodiment of the present application is applied;
FIG. 5 is a third schematic diagram of an interface to which the control method according to an embodiment of the present application is applied;
FIG. 6 is a fourth schematic diagram of an interface to which the control method according to an embodiment of the present application is applied;
FIG. 7 is a fifth schematic diagram of an interface to which the control method according to an embodiment of the present application is applied;
FIG. 8 is a sixth schematic diagram of an interface to which the control method according to an embodiment of the present application is applied;
FIG. 9 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
FIG. 10 is a schematic hardware diagram of a terminal device according to an embodiment of the present application.
Detailed description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are some rather than all of the embodiments of the present application. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present application without creative work fall within the protection scope of the present application.
The term "and/or" herein describes an association relationship of associated objects and indicates that three relationships may exist. For example, A and/or B may indicate the following three cases: A exists alone, both A and B exist, and B exists alone. The symbol "/" herein indicates an "or" relationship between associated objects; for example, A/B means A or B.
The terms "first" and "second" herein are used to distinguish different objects rather than to describe a specific order of the objects. For example, the first input and the second input are used to distinguish different inputs rather than to describe a specific order of inputs.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate an example, illustration, or explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present application should not be construed as being more preferred or advantageous than other embodiments or designs. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner.
In the description of the embodiments of the present application, unless otherwise stated, "multiple" means two or more; for example, multiple elements means two or more elements.
Some nouns or terms involved in the claims and specification of the present application are first explained below.
Preview image collected by a camera: the preview image collected by the camera currently turned on by the terminal device, that is, the image displayed on the camera preview interface.
Camera preview interface: the preview interface of the camera application, that is, the interface displayed by the terminal device when the camera application in the terminal device runs in the foreground of the terminal device.
Navigation bar: a shortcut button bar on the screen of the terminal device. Generally, the navigation bar appears in the form of virtual keys in a peripheral area of the phone screen (for example, it may be located in the top area, the bottom area, the left area, or the right area of the screen). The shortcut buttons on the screen of the terminal device usually include a back key, a home key, and a recents key.
Hole-punch screen: also called a drilled screen. It means cutting a hole in the screen of the terminal device to expose the front camera disposed at the hole; in this way, the screen-to-body ratio of the terminal device can be increased without sacrificing the shooting effect of the front camera.
A terminal device provided with a hole-punch screen may include a display layer and a touch layer, and the hole in the hole-punch screen can be made in either of two ways (namely, Mode 1 and Mode 2 below).
Mode 1: Both the display layer and the touch layer of the screen are cut through. In this case, the hole area of the screen can neither display content normally nor respond normally to the user's touch input.
Mode 2: The display layer of the screen is cut through while the touch layer of the screen is retained. In this case, although the hole area of the screen cannot display content normally, it can respond normally to the user's touch input because the touch layer is retained.
It should be noted that the hole-punch screen in the embodiments of the present application may be implemented in Mode 2 above.
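The two hole-punch modes above can be captured in a minimal model: each mode is defined by which screen layers are cut through, and Mode 2 keeps touch working where display does not. The class and attribute names below are assumptions for illustration, not terms from the application.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class HoleArea:
    """State of the hole area of a hole-punch screen."""
    display_cut_through: bool  # True if the display layer is cut through
    touch_cut_through: bool    # True if the touch layer is cut through

    def can_display(self) -> bool:
        # The hole area can display content only if the display layer is intact.
        return not self.display_cut_through

    def can_touch(self) -> bool:
        # The hole area can respond to touch only if the touch layer is intact.
        return not self.touch_cut_through


# Mode 1: both layers cut through -> neither display nor touch works in the hole.
MODE_1 = HoleArea(display_cut_through=True, touch_cut_through=True)
# Mode 2: only the display layer cut through -> touch still works in the hole.
MODE_2 = HoleArea(display_cut_through=True, touch_cut_through=False)
```

Mode 2 is what makes the target area usable as an input area over the under-screen camera in the embodiments.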
The embodiments of the present application provide a control method and a terminal device. The terminal device can receive a user's first input in a target area (an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device), and can, in response to the first input, perform a first action corresponding to the first input, wherein the target area may be located in the navigation bar of the terminal device. Then, when the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to the first action in the target area to trigger the terminal device to directly perform the first action. In this way, on the basis of increasing the screen-to-body ratio of the terminal device, not only can the use reliability and service life of the terminal device be ensured, but the process of the terminal device performing an action can also be simplified, improving the human-computer interaction performance.
The terminal device in the embodiments of the present application may be a terminal device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
Taking the Android operating system as an example, the following introduces the software environment to which the control method provided by the embodiments of the present application is applied.
FIG. 1 is a schematic architectural diagram of a possible Android operating system provided by an embodiment of the present application. In FIG. 1, the architecture of the Android operating system includes four layers: an application layer, an application framework layer, a system runtime library layer, and a kernel layer (which may specifically be a Linux kernel layer).
The application layer includes the various applications in the Android operating system (including system applications and third-party applications).
The application framework layer is the framework of applications. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
The system runtime library layer includes libraries (also called system libraries) and the Android operating system runtime environment. The libraries mainly provide the Android operating system with the various resources it needs. The Android operating system runtime environment provides the software environment for the Android operating system.
The kernel layer is the operating system layer of the Android operating system and is the lowest layer of the Android operating system software hierarchy. Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the Android operating system.
Taking the Android operating system as an example, in the embodiments of the present application, developers can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program implementing the control method provided by the embodiments of the present application, so that the control method can run based on the Android operating system shown in FIG. 1. That is, the processor or the terminal device can implement the control method provided by the embodiments of the present application by running the software program in the Android operating system.
The terminal device in the embodiments of the present application may be a mobile terminal or a non-mobile terminal. Exemplarily, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile terminal may be a personal computer (PC), a television (TV), a teller machine, or a self-service machine, which is not specifically limited in the embodiments of the present application.
The execution subject of the control method provided by the embodiments of the present application may be the above-mentioned terminal device, or a functional module and/or functional entity in the terminal device capable of implementing the control method, which may be specifically determined according to actual use requirements and is not limited in the embodiments of the present application. The control method provided by the embodiments of the present application is exemplarily described below by taking the terminal device as an example.
In the embodiments of the present application, in the case where a camera is disposed below an area of the screen of the terminal device (for example, the target area in the embodiments of the present application, which may be the area corresponding to the first camera located under the screen of the terminal device and may be located in the navigation bar), if the user requires the terminal device to perform an action related to the camera (for example, the first action in the embodiments of the present application), the user can trigger, through an input in that area (for example, the first input in the embodiments of the present application), the terminal device to directly perform the action corresponding to that input. In this way, on the basis of increasing the screen-to-body ratio of the terminal device, not only can the use reliability and service life of the terminal device be ensured, but the process of the terminal device performing an action can also be simplified, improving the human-computer interaction performance.
The control method provided by the embodiments of the present application is exemplarily described below with reference to the drawings.
如图2所示,本申请实施例提供一种控制方法,该方法可以包括下述的S201和S202。
S201、终端设备接收用户在目标区域的第一输入。
其中,上述目标区域可以为终端设备的屏幕上、且与位于终端设备屏下的第一摄像头对应的区域,且目标区域可以位于所述终端设备的导航栏。
可选地,本申请实施例中,终端设备的屏幕可以为挖孔屏,该屏幕可以包括显示层和触控层。具体的,本申请实施例中,该显示层对应于目标区域处可以设置一个目标孔,上述第一摄像头可以设置在该目标孔处。
可以理解,上述位于终端设备屏下的第一摄像头可以理解为位于终端设备屏幕的触摸层下方的第一摄像头。
可以理解,本申请实施例中,由于本申请实施例的挖孔屏只在显示层挖孔(即目标孔设置在显示层),触控层并没有挖穿,即打穿屏幕的显示层且保留屏幕的触控层,因此可以在露出第一摄像头的同时,保证目标区域的触摸功能能够正常使用。
对于挖孔屏的相关描述具体可以参见上述名词解释部分对挖孔屏的相关描述,此处不再赘述。
可选地,本申请实施例中,上述第一摄像头可以为终端设备的前置摄像头。
可选地,本申请实施例中,终端设备还可以包括第二摄像头,且第二摄像头可以 为终端设备的后置摄像头。
对于导航栏的描述具体可以参见上述名词解释部分对导航栏的相关描述,此处不再赘述。
可选地,本申请实施例中,终端设备的导航栏可以位于终端设备的屏幕的任意区域。例如,导航栏可以位于终端设备屏幕的顶部区域,或者可以位于终端设备屏幕的底部区域,或者可以位于终端设备屏幕的左部区域,或者可以位于终端设备屏幕的右部区域,具体可以根据实际使用需求确定,本申请实施例不作限定。
Optionally, in the embodiments of the present application, since the target area is located in the navigation bar of the terminal device and is the area on the screen corresponding to the first camera located under the screen, the first camera is arranged below the target area in the navigation bar; it neither occupies the effective display area of the screen nor needs to extend and retract. Therefore, on the basis of increasing the screen-to-body ratio of the terminal device, extension and retraction of the front camera of the terminal device (for example, the first camera in the embodiments of the present application) can be avoided, which can improve the reliability and service life of the terminal device.
Optionally, the target area may be any region within the navigation bar.
Optionally, the first input may be an input in any possible form, such as a click input, a long-press input, a heavy-press input, a drag input, or a swipe input, which can be determined according to actual use requirements and is not limited in the embodiments of the present application.
The click input may be an input of clicking a first preset number of times. The long-press input may be an input of contact lasting a first preset duration. The heavy-press input, also called a force-touch input, refers to an input in which the user presses with a pressure value greater than or equal to a first pressure threshold. The drag input may be an input of dragging in any direction. The swipe input may be an input of swiping in a first direction.
In the embodiments of the present application, the first preset number of times may be two or more. The first preset duration, the first pressure threshold, and the first direction can all be determined according to actual use requirements and are not limited in the embodiments of the present application.
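The input forms above can be made concrete with a small classification sketch. This is only an illustration, not part of the embodiment: the `Touch` fields and all threshold constants (standing in for the first preset number of times, the first preset duration, and the first pressure threshold) are invented assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Touch:
    """A simplified touch event in the target area (illustrative only)."""
    taps: int = 0                 # number of consecutive taps
    duration_ms: int = 0          # contact time of a single press
    pressure: float = 0.0         # normalized press force, 0.0 to 1.0
    swipe: Optional[str] = None   # "left", "right", "up", "down", or None

# Hypothetical values standing in for the first preset count, the first
# preset duration, and the first pressure threshold mentioned in the text.
PRESET_TAP_COUNT = 2      # the text requires two or more taps
PRESET_LONG_PRESS_MS = 500
PRESET_PRESSURE = 0.8

def classify_first_input(t: Touch) -> Optional[str]:
    """Map a raw touch in the target area to one of the possible forms
    of the first input: swipe, heavy-press, long-press, or click."""
    if t.swipe is not None:
        return "swipe-" + t.swipe
    if t.pressure >= PRESET_PRESSURE:
        return "heavy-press"
    if t.duration_ms >= PRESET_LONG_PRESS_MS:
        return "long-press"
    if t.taps >= PRESET_TAP_COUNT:
        return "click"
    return None  # not recognized as a first input
```

The ordering of the checks is one possible disambiguation policy (a swipe wins over a press); the embodiment leaves such details open.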
S202: In response to the first input, the terminal device performs a first action corresponding to the first input.
The first action may be an action related to the first camera.
Optionally, in the embodiments of the present application, an action related to the first camera can be understood as performing an opening action on an object related to the first camera.
It can be understood that objects related to the first camera may include the first camera, the second camera, a camera application, a gallery application, and the like, which can be determined according to actual use requirements and is not limited in the embodiments of the present application.
Optionally, the first action may include any one of the following: opening the first camera and displaying the preview picture captured by the first camera, opening the second camera and displaying the preview picture captured by the second camera, or displaying an interface of the gallery application.
It can be understood that the terminal device displaying the preview picture captured by the first camera means that it displays the preview picture captured by the front camera of the terminal device. For example, while opening the first camera, the terminal device can run the camera application in the foreground and display, in the camera preview interface (that is, the preview interface of the camera application), the preview picture captured by the first camera. Likewise, the terminal device displaying the preview picture captured by the second camera means that it displays the preview picture captured by the rear camera; for example, while opening the second camera, the terminal device can run the camera application in the foreground and display, in the camera preview interface, the preview picture captured by the second camera.
For descriptions of the preview picture captured by the first camera, the preview picture captured by the second camera, and the camera preview interface, refer to the related description of the preview picture captured by a camera in the glossary section above; details are not repeated here.
Optionally, the interface of the gallery application may be any interface in the gallery application.
Optionally, different first inputs may cause the terminal device to perform different first actions in response. That is, the user can trigger the terminal device to perform different first actions through different first inputs.
It can be understood that the embodiments of the present application need not fix which specific first input triggers which determined first action; it is only required that each specific first input corresponds uniquely to one determined first action, which can be determined according to actual use requirements and is not limited in the embodiments of the present application.
The correspondence between the first input and the first action is exemplarily described below through three possible implementations.
In one possible implementation, if the user needs to use the first camera, the user can click in the target area (that is, the first input is a click input), and the terminal device can, in response to the click input, open the first camera and display the preview picture captured by the first camera.
It should be noted that the number of clicks of the click input may be two or more, which can be determined according to actual use requirements and is not limited in the embodiments of the present application.
Optionally, whatever interface the terminal device is currently displaying, the terminal device can, in response to the click input, open the first camera and display the preview picture captured by the first camera.
Specifically, with any interface displayed, the user can click in the target area (with two or more clicks), that is, the terminal device receives the first input of the user; the terminal device can then first determine whether the currently displayed interface is the camera preview interface. If the current interface is not the camera preview interface, the terminal device can run the camera application in the foreground and open the first camera, so that the terminal device displays the camera preview interface with the preview picture captured by the first camera shown in it. If the current interface is the camera preview interface, the terminal device can further determine whether the camera preview interface shows the preview picture captured by the first camera; if it does, the terminal device can keep the first camera open and keep displaying its preview picture; if the camera preview interface shows the preview picture captured by the second camera, the terminal device can close the second camera, open the first camera, and then display the preview picture captured by the first camera.
For example, in one possible implementation, suppose the interface currently displayed by the terminal device is an interface of a chat application, the first input is a double-click input, and the navigation bar is located in the bottom region of the screen. Then, as shown in (a) of FIG. 3, the terminal device displays the interface 30 of the chat application and the navigation bar 31, where the navigation bar 31 is located in the bottom region of the screen and the target area 32 is located in the navigation bar 31. The user can then double-click in the target area 32, that is, the terminal device receives the first input of the user; in response to the first input, as shown in (b) of FIG. 3, the terminal device can display the camera preview interface and show in it the preview picture 33 captured by the first camera (that is, the terminal device opens the first camera and displays the preview picture captured by the first camera).
In the embodiments of the present application, the user can, through the first input in the target area, directly trigger the terminal device to open the first camera and display its preview picture, without triggering this through multiple inputs (for example, first finding the identifier indicating the camera application, then making an input on that identifier to trigger the terminal device to display the preview interface of the camera application, and then making an input on the camera-switching identifier in that camera preview interface). The operation process of opening the first camera and displaying its preview picture can therefore be simplified, improving human-computer interaction performance.
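The branch logic of this first implementation can be sketched as a minimal state transition. `CameraState` and its field names are invented for illustration; the sketch deliberately ignores the real Android activity and camera APIs.

```python
class CameraState:
    """Minimal model of what is on screen; names are illustrative."""
    def __init__(self, current_interface="chat", active_camera=None):
        self.current_interface = current_interface  # e.g. "chat", "camera_preview"
        self.active_camera = active_camera          # None, "first", or "second"

def handle_double_tap(state):
    """A double tap in the target area always ends with the camera preview
    interface showing the first (front) camera's preview picture."""
    if state.current_interface != "camera_preview":
        # Not in the camera preview: run the camera application in the
        # foreground and open the first camera.
        state.current_interface = "camera_preview"
        state.active_camera = "first"
    elif state.active_camera == "second":
        # Preview currently shows the second camera: close it and
        # open the first camera instead.
        state.active_camera = "first"
    # Otherwise the first camera's preview is already shown; keep it.
    return state
```

Whatever the starting interface, the function converges on the same final state, matching the "any interface" behavior described above.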
In another possible implementation, if the user needs to use the second camera, the user can long-press in the target area (that is, the first input is a long-press input), and the terminal device can, in response to the long-press input, open the second camera and display the preview picture captured by the second camera.
Optionally, whatever interface the terminal device is currently displaying, the terminal device can, in response to the long-press input, open the second camera and display the preview picture captured by the second camera.
Specifically, with any interface displayed, the user can long-press in the target area, that is, the terminal device receives the first input of the user; the terminal device can then first determine whether the currently displayed interface is the camera preview interface. If the current interface is not the camera preview interface, the terminal device can run the camera application in the foreground and open the second camera, so that the terminal device displays the camera preview interface with the preview picture captured by the second camera shown in it. If the current interface is the camera preview interface, the terminal device can further determine whether the camera preview interface shows the preview picture captured by the second camera; if it does, the terminal device can keep the second camera open and keep displaying its preview picture; if the camera preview interface shows the preview picture captured by the first camera, the terminal device can close the first camera, open the second camera, and then display the preview picture captured by the second camera.
For example, in another possible implementation, suppose the interface currently displayed by the terminal device is an interface of a chat application, the first input is a long-press input, and the navigation bar is located in the bottom region of the screen. Then, as shown in (a) of FIG. 4, the terminal device displays the interface 40 of the chat application and the navigation bar 41, with the target area 42 located in the navigation bar 41. The user can then long-press in the target area 42, that is, the terminal device receives the first input of the user; in response to the first input, as shown in (b) of FIG. 4, the terminal device can display the camera preview interface and show in it the preview picture 43 captured by the second camera (that is, the terminal device opens the second camera and displays the preview picture captured by the second camera).
In the embodiments of the present application, the user can, through the first input in the target area, directly trigger the terminal device to open the second camera and display its preview picture, without triggering this through multiple inputs (for example, first finding the identifier indicating the camera application, then making an input on that identifier to trigger the terminal device to display the preview interface of the camera application, and then making an input on the camera-switching identifier in that camera preview interface). The operation process of opening the second camera and displaying its preview picture can therefore be simplified, improving human-computer interaction performance.
In yet another possible implementation, if the user needs to view the gallery application, the user can swipe in the target area (that is, the first input is a swipe input), and the terminal device can, in response to the swipe input, display the interface of the gallery application.
Optionally, whatever interface the terminal device is currently displaying, the terminal device can, in response to the swipe input, display the interface of the gallery application.
Specifically, with any interface displayed, the user can swipe in the target area (for example, swipe left), that is, the terminal device receives the first input of the user; the terminal device can then first determine whether the currently displayed interface is an interface of the gallery application. If the current interface is not an interface of the gallery application, the terminal device can run the gallery application in the foreground and display the interface of the gallery application. If the current interface is an interface of the gallery application, the terminal device can keep running the gallery application in the foreground and keep displaying its interface.
For example, in yet another possible implementation, suppose the interface currently displayed by the terminal device is an interface of a chat application, the first input is a swipe input, and the navigation bar is located in the bottom region of the screen. Then, as shown in (a) of FIG. 5, the terminal device displays the interface 50 of the chat application and the navigation bar 51, with the target area 52 located in the navigation bar 51. The user can swipe left in the target area 52, that is, the terminal device receives the first input of the user; in response to the first input, as shown in (b) of FIG. 5, the terminal device can display the interface 53 of the gallery application.
In the embodiments of the present application, the user can, through the first input in the target area, directly trigger the terminal device to display the interface of the gallery application, without first finding the identifier indicating the gallery application (for example, the application icon of the gallery application) and then making an input on that identifier to trigger the terminal device to display the gallery application. The operation process of displaying the interface of the gallery application can therefore be simplified, improving human-computer interaction performance.
It can be understood that the first action corresponding to each first input described in the above three possible implementations is merely an exemplary enumeration and does not limit the embodiments of the present application in any way. In actual implementation, it may also be the case that, when the first input is a click input with two or more clicks, the terminal device, in response to the click input, opens the second camera and displays the preview picture captured by the second camera; or, when the first input is a long-press input of the user in the target area, the terminal device, in response to the long-press input, displays the interface of the gallery application; or, when the first input is a swipe input of the user in the target area with the swipe direction being leftward, the terminal device, in response to the swipe input, opens the first camera and displays the preview picture captured by the first camera.
In the embodiments of the present application, whatever the current interface of the terminal device is, the user can make an input in the target area to trigger the terminal device to directly perform the first action corresponding to that input, without triggering the terminal device to first return to the desktop and then find, on the desktop, the identifier indicating that action. The convenience of the terminal device's actions can therefore be increased, improving human-computer interaction performance.
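The input-to-action correspondence discussed above can be pictured as a lookup table. The pairing below follows the three example implementations, but, as the text stresses, any pairing works as long as each first input resolves to exactly one first action; all names are illustrative, not fixed by the embodiment.

```python
# One possible binding of first inputs to first actions; a dict keyed by
# input kind guarantees the required property that each concrete first
# input corresponds uniquely to one determined first action.
FIRST_ACTION_OF_INPUT = {
    "click":      "open_first_camera_preview",   # front camera preview
    "long-press": "open_second_camera_preview",  # rear camera preview
    "swipe-left": "show_gallery_interface",      # gallery application
}

def dispatch_first_input(input_kind):
    """Return the single first action bound to an input kind,
    or None for an input with no binding."""
    return FIRST_ACTION_OF_INPUT.get(input_kind)
```

Swapping the table's values rebinds inputs to actions without touching the dispatch logic, which mirrors the flexibility the paragraph above describes.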
Optionally, in the embodiments of the present application, the navigation bar may include at least one virtual key, the target area may be the area where a target virtual key is located, and the target virtual key may be a virtual key among the at least one virtual key.
Optionally, the area where the target virtual key is located can be understood as the area of the screen used to display the target virtual key, or as the peripheral area of the area used to display the target virtual key (for example, an annular area surrounding the target virtual key), or as the area formed by the area used to display the target virtual key together with its peripheral area, which can be determined according to actual use requirements and is not limited in the embodiments of the present application.
Optionally, each virtual key among the at least one virtual key may be any one of the following: a back key, a home key, or a recents key, which can be determined according to actual use requirements and is not limited in the embodiments of the present application.
It should be noted that the virtual keys among the at least one virtual key are different from one another.
Optionally, the target virtual key may be any one of the at least one virtual key; for example, the target virtual key may be the back key, the home key, or the recents key, which can be determined according to actual use requirements and is not limited in the embodiments of the present application.
In the embodiments of the present application, since the first camera can be arranged in the area where the target virtual key is located, the screen-to-body ratio of the terminal device can be further increased while the normal use of both the first camera and the target virtual key is guaranteed.
Optionally, in the embodiments of the present application, a target identifier is displayed in the part of the target area other than a first region; the target identifier may be used to indicate the target virtual key, and the first region may be the projection region of the first camera on the screen of the terminal device. Here, "on the screen of the terminal device" refers to the surface of the screen, and the projection region of the first camera on the screen can be understood as the orthographic projection region of the first camera on the screen of the terminal device.
Optionally, the target area may include the first region and a second region (that is, the part of the target area other than the first region). The first region may be located at the center of the target area, and the second region may be located outside the first region, so that the target identifier is displayed around the outside of the first camera. In this way, the first camera and the target identifier visually form a single identifier on the surface of the screen, which prevents the first camera from affecting the display effect of the screen.
For example, the target area may be 52 shown in (b) of FIG. 5, the first region may be the grid-filled region 521 shown in (b) of FIG. 5, and the second region may be the black region 522 shown in (b) of FIG. 5.
Optionally, the target hole in which the first camera is arranged may be located in the first region. Specifically, the target hole may be provided in the display layer of the first region, and the part of the screen corresponding to the first region cannot display content but has the touch function.
In the embodiments of the present application, since the target identifier indicating the target virtual key is displayed in the part of the target area other than the first region, the user can be informed of the position of the target virtual key on the screen while the screen-to-body ratio is further increased, which can improve operation convenience and human-computer interaction performance.
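One way to picture the two sub-regions is to model the target area as a disc (the first region, the camera's orthographic projection on the screen) surrounded by a ring (the remaining region, where the target identifier is drawn). The sketch below is only an illustration; the center coordinates and radii are invented example values, not taken from the embodiment.

```python
import math

CENTER = (540.0, 2300.0)  # assumed center of the target area, in pixels
R_FIRST = 20.0            # assumed radius of the first region (camera hole)
R_TARGET = 36.0           # assumed outer radius of the whole target area

def region_of(x, y):
    """Classify a touch point: the first region can sense touches but
    displays nothing; the surrounding ring displays the target identifier."""
    d = math.hypot(x - CENTER[0], y - CENTER[1])
    if d <= R_FIRST:
        return "first"   # over the hole in the display layer
    if d <= R_TARGET:
        return "other"   # ring where the target identifier is drawn
    return None          # outside the target area
```

Because the touch layer is not cut through, a touch landing in the `"first"` region is still delivered, which is exactly what allows the first input to be received over the camera itself.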
In the control method provided by the embodiments of the present application, when the user needs the terminal device to perform some action, for example the first action, the user can perform, in the target area, the first input corresponding to the first action to trigger the terminal device to directly perform the first action. In this way, on the basis of increasing the screen-to-body ratio of the terminal device, not only can the reliability and service life of the terminal device be guaranteed, but the process for the terminal device to perform an action can also be simplified, improving human-computer interaction performance.
Optionally, in the embodiments of the present application, the first action may include displaying the interface of the gallery application. After the above S202, the control method provided by the embodiments of the present application may further include the following S203 and S204.
S203: While displaying a target image, the terminal device receives a second input of the user in the target area.
The target image may be an image in the gallery application.
Optionally, the second input may be an input in any possible form, such as a drag input or a swipe input, which can be determined according to actual use requirements and is not limited in the embodiments of the present application.
The drag input may be an input of dragging in any direction. The swipe input may be an input of swiping in a second direction.
It should be noted that the second input is different from the first input. Specifically, the second direction is different from the first direction.
It can be understood that the second input may be an input directed at the target image.
It should be noted that, after the terminal device displays the interface of the gallery application, the user can trigger the terminal device to display the target image through an input on the interface of the gallery application.
S204: In response to the second input, the terminal device performs, on the target image, a second action corresponding to the second input.
The second action may include any one of the following: displaying the target image reduced, displaying the target image enlarged, or deleting the target image.
Optionally, different second inputs may cause the terminal device to perform different second actions in response.
Optionally, take the second input being a swipe input as an example. When the second input is an input of the user swiping down in the target area, the terminal device can, in response, display the target image reduced. When the second input is an input of the user swiping up in the target area, the terminal device can, in response, display the target image enlarged. When the second input is an input of the user swiping right in the target area, the terminal device can, in response, delete the target image; it can be understood that, in this case, after deleting the target image the terminal device can automatically display the image preceding or following the target image.
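The second-input handling of S204 can be sketched as a pure function over a tiny gallery model. The direction-to-action pairing follows the example in the paragraph above (up enlarges, down reduces, right deletes and moves to a neighboring image); every name and the zoom factor are illustrative assumptions.

```python
def handle_second_input(images, index, zoom, swipe):
    """Return (images, index, zoom) after applying the second action to
    the target image at `images[index]` shown at scale `zoom`."""
    if swipe == "up":
        return images, index, zoom * 2.0          # display enlarged
    if swipe == "down":
        return images, index, zoom / 2.0          # display reduced
    if swipe == "right":
        remaining = images[:index] + images[index + 1:]
        # After deletion, show the previous image if one exists,
        # otherwise the next; -1 means nothing left to show.
        new_index = index - 1 if index > 0 else (0 if remaining else -1)
        return remaining, new_index, 1.0
    return images, index, zoom                    # unrecognized: no change
```

Modeling the action as a pure function makes the three branches easy to verify independently; a real implementation would instead drive the gallery application's view state.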
For example, suppose the terminal device displays an image of a person, the target image is an image in the gallery application, the second input is a swipe input, and the navigation bar is located in the bottom region of the terminal device. Then, in a first implementation, as shown in (a) of FIG. 6, the terminal device displays the person image 60 and the navigation bar, with the target area 61 located in the navigation bar; the user can swipe up in the target area 61, that is, the terminal device receives the second input of the user, and in response, as shown in (b) of FIG. 6, the terminal device displays a first interface 62 that includes an enlarged version of the person image, that is, the terminal device displays the target image enlarged. In a second implementation, as shown in (a) of FIG. 7, the terminal device displays the person image 70 and the navigation bar, with the target area 71 located in the navigation bar; the user can swipe down in the target area 71, that is, the terminal device receives the second input of the user, and in response, as shown in (b) of FIG. 7, the terminal device displays a second interface 72 that includes a reduced version of the person image, that is, the terminal device displays the target image reduced. In a third implementation, as shown in (a) of FIG. 8, the terminal device displays the person image 80 and the navigation bar, with the target area 81 located in the navigation bar; the user can swipe right in the target area 81, that is, the terminal device receives the second input of the user, and in response, as shown in (b) of FIG. 8, the terminal device deletes the person image and can display a third interface 82 that includes the image preceding the person image.
It can be understood that the second input may also be another input different from the first input, which can be determined according to actual use requirements and is not limited in the embodiments of the present application.
In the embodiments of the present application, when the terminal device displays the target image, the user can, through different inputs in the target area, trigger the terminal device to display the target image reduced, display it enlarged, or delete it; that is, the user can directly trigger the terminal device to perform the corresponding action on the target image through an input in the target area, which increases the convenience of operating on the target image and improves human-computer interaction performance.
It should be noted that each control method shown in the above method drawings is exemplarily described with reference to one drawing of the embodiments of the present application. In specific implementation, the control methods shown in the above method drawings can also be implemented in combination with any other combinable drawings illustrated in the above embodiments; details are not repeated here.
As shown in FIG. 9, an embodiment of the present application provides a terminal device 900, which may include a receiving module 901 and a processing module 902. The receiving module 901 may be configured to receive a first input of a user in a target area; the processing module 902 may be configured to perform, in response to the first input received by the receiving module 901, a first action corresponding to the first input. The target area may be an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device, and the target area may be located in the navigation bar of the terminal device.
Optionally, the first action may include any one of the following: opening the first camera and displaying the preview picture captured by the first camera, opening the second camera and displaying the preview picture captured by the second camera, or displaying the interface of the gallery application.
Optionally, the navigation bar may include at least one virtual key, the target area may be the area where a target virtual key is located, and the target virtual key may be a virtual key among the at least one virtual key.
Optionally, a target identifier is displayed in the part of the target area other than a first region; the target identifier may be used to indicate the target virtual key, and the first region may be the projection region of the first camera on the screen of the terminal device.
Optionally, the first action may include displaying the interface of the gallery application. The receiving module 901 may be further configured to receive, after the processing module 902 performs the first action corresponding to the first input and while a target image is displayed, a second input of the user in the target area; the processing module 902 may be further configured to perform, in response to the second input received by the receiving module 901, a second action on the target image corresponding to the second input. The target image may be an image in the gallery application, and the second action may include any one of the following: displaying the target image reduced, displaying the target image enlarged, or deleting the target image.
The terminal device 900 provided by the embodiments of the present application can implement the processes implemented by the terminal device in the above method embodiments; to avoid repetition, details are not repeated here.
An embodiment of the present application provides a terminal device that can receive a first input of a user in a target area (an area on the screen of the terminal device corresponding to a first camera located under the screen) and can, in response to the first input, perform a first action corresponding to the first input, where the target area may be located in the navigation bar of the terminal device. Then, when the user needs the terminal device to perform some action, for example the first action, the user can perform, in the target area, the first input corresponding to the first action to trigger the terminal device to directly perform the first action. In this way, on the basis of increasing the screen-to-body ratio of the terminal device, not only can the reliability and service life of the terminal device be guaranteed, but the process for the terminal device to perform an action can also be simplified, improving human-computer interaction performance.
FIG. 10 is a schematic diagram of the hardware structure of a terminal device implementing the embodiments of the present application. As shown in FIG. 10, the terminal device 100 includes, but is not limited to, a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art can understand that the terminal device structure shown in FIG. 10 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than shown, combine certain components, or arrange the components differently. In the embodiments of the present application, terminal devices include, but are not limited to, mobile phones, tablet computers, laptop computers, palmtop computers, in-vehicle terminal devices, wearable devices, pedometers, and the like.
The user input unit 107 is configured to receive a first input of a user in a target area; the processor 110 is configured to perform, in response to the first input received by the user input unit 107, a first action corresponding to the first input. The target area is an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device, and the target area is located in the navigation bar of the terminal device.
It can be understood that, in the embodiments of the present application, the receiving module 901 in the structural schematic diagram of the terminal device (for example, FIG. 9) can be implemented by the user input unit 107, and the processing module 902 in the structural schematic diagram of the terminal device (for example, FIG. 9) can be implemented by the processor 110.
An embodiment of the present application provides a terminal device that can receive a first input of a user in a target area (an area on the screen of the terminal device corresponding to a first camera located under the screen) and can, in response to the first input, perform a first action corresponding to the first input, where the target area may be located in the navigation bar of the terminal device. Then, when the user needs the terminal device to perform some action, for example the first action, the user can perform, in the target area, the first input corresponding to the first action to trigger the terminal device to directly perform the first action. In this way, on the basis of increasing the screen-to-body ratio of the terminal device, not only can the reliability and service life of the terminal device be guaranteed, but the process for the terminal device to perform an action can also be simplified, improving human-computer interaction performance.
It should be understood that, in the embodiments of the present application, the radio frequency unit 101 can be used to receive and send signals during the sending and receiving of information or during a call; specifically, it receives downlink data from a base station and passes it to the processor 110 for processing, and sends uplink data to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
The terminal device provides the user with wireless broadband Internet access through the network module 102, for example helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 103 can convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the terminal device 100 (for example, a call signal reception sound or a message reception sound). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is configured to receive audio or video signals. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processing unit 1041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames can be displayed on the display unit 106. The image frames processed by the graphics processing unit 1041 can be stored in the memory 109 (or another storage medium) or sent via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and can process such sound into audio data; in a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
The terminal device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and, when stationary, the magnitude and direction of gravity; it can be used for recognizing the attitude of the terminal device (for example, switching between portrait and landscape, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (for example, pedometer and tapping). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like; details are not repeated here.
The display unit 106 is configured to display information entered by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 107 can be used to receive entered numeric or character information and to generate key signal input related to the user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, can collect touch operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any suitable object or accessory). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072, which specifically may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and on/off keys), a trackball, a mouse, and a joystick; details are not repeated here.
Further, the touch panel 1071 can be overlaid on the display panel 1061; after the touch panel 1071 detects a touch operation on or near it, the operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of the touch event. Although in FIG. 10 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to implement the input and output functions of the terminal device, which is not specifically limited here.
The interface unit 108 is the interface through which an external device is connected to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 can be used to receive input from an external device (for example, data information or electric power) and transmit the received input to one or more elements in the terminal device 100, or to transmit data between the terminal device 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area; the program storage area can store the operating system and the application required for at least one function (such as a sound playback function or an image playback function), and the data storage area can store data created according to the use of the mobile phone (such as audio data or a phone book). In addition, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
The processor 110 is the control center of the terminal device. It connects all parts of the entire terminal device using various interfaces and lines and, by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, performs the various functions of the terminal device and processes data, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and so on, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) supplying power to each component; optionally, the power supply 111 can be logically connected to the processor 110 through a power management system, thereby implementing functions such as charging management, discharging management, and power consumption management through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, which are not repeated here.
Optionally, an embodiment of the present application further provides a terminal device, including the processor 110 shown in FIG. 10, the memory 109, and a computer program stored in the memory 109 and runnable on the processor 110; when executed by the processor 110, the computer program implements the processes of the above method embodiments and can achieve the same technical effects, which are not repeated here to avoid repetition.
An embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the processes of the above method embodiments and can achieve the same technical effects, which are not repeated here to avoid repetition. The computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
It should be noted that, herein, the terms "comprise", "include", and any variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not explicitly listed, or also includes elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of additional identical elements in the process, method, article, or apparatus that includes that element.
Through the description of the above implementations, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present application, in essence or in the part contributing to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above specific implementations. The above specific implementations are merely illustrative, not restrictive. Inspired by the present application, those of ordinary skill in the art can make many further forms without departing from the spirit of the present application and the scope protected by the claims, all of which fall within the protection of the present application.

Claims (12)

  1. A control method, applied to a terminal device, the method comprising:
    receiving a first input of a user in a target area, the target area being an area on a screen of the terminal device corresponding to a first camera located under the screen of the terminal device;
    in response to the first input, performing a first action corresponding to the first input;
    wherein the target area is located in a navigation bar of the terminal device.
  2. The method according to claim 1, wherein the first action comprises any one of the following: opening the first camera and displaying a preview picture captured by the first camera, opening a second camera and displaying a preview picture captured by the second camera, or displaying an interface of a gallery application.
  3. The method according to claim 1, wherein the navigation bar comprises at least one virtual key, the target area is an area where a target virtual key is located, and the target virtual key is a virtual key among the at least one virtual key.
  4. The method according to claim 3, wherein a target identifier is displayed in a part of the target area other than a first region, the target identifier is used to indicate the target virtual key, and the first region is a projection region of the first camera on the screen of the terminal device.
  5. The method according to claim 2, wherein the first action comprises displaying the interface of the gallery application;
    after the performing a first action corresponding to the first input, the method further comprises:
    while a target image is displayed, receiving a second input of the user in the target area, the target image being an image in the gallery application;
    in response to the second input, performing, on the target image, a second action corresponding to the second input;
    wherein the second action comprises any one of the following: displaying the target image reduced, displaying the target image enlarged, or deleting the target image.
  6. A terminal device, wherein the terminal device comprises a receiving module and a processing module;
    the receiving module is configured to receive a first input of a user in a target area, the target area being an area on a screen of the terminal device corresponding to a first camera located under the screen of the terminal device;
    the processing module is configured to perform, in response to the first input received by the receiving module, a first action corresponding to the first input, the first action being an action related to the first camera;
    wherein the target area is located in a navigation bar of the terminal device.
  7. The terminal device according to claim 6, wherein the first action comprises any one of the following: opening the first camera and displaying a preview picture captured by the first camera, opening a second camera and displaying a preview picture captured by the second camera, or displaying an interface of a gallery application.
  8. The terminal device according to claim 6, wherein the navigation bar comprises at least one virtual key, the target area is an area where a target virtual key is located, and the target virtual key is a virtual key among the at least one virtual key.
  9. The terminal device according to claim 8, wherein a target identifier is displayed in a part of the target area other than a first region, the target identifier is used to indicate the target virtual key, and the first region is a projection region of the first camera on the screen of the terminal device.
  10. The terminal device according to claim 7, wherein the first action comprises displaying the interface of the gallery application;
    the receiving module is further configured to receive a second input of the user in the target area after the processing module performs the first action corresponding to the first input and while the processing module displays a target image;
    the processing module is further configured to perform, in response to the second input received by the receiving module, a second action on the target image corresponding to the second input;
    wherein the second action comprises any one of the following: displaying the target image reduced, displaying the target image enlarged, or deleting the target image; and the target image is an image in the gallery application.
  11. A terminal device, comprising a processor, a memory, and a computer program stored in the memory and runnable on the processor, wherein the computer program, when executed by the processor, implements the steps of the control method according to any one of claims 1 to 5.
  12. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements the steps of the control method according to any one of claims 1 to 5.
PCT/CN2020/111443 2019-08-28 2020-08-26 控制方法及终端设备 WO2021037073A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910803952.XA CN110647277A (zh) 2019-08-28 2019-08-28 一种控制方法及终端设备
CN201910803952.X 2019-08-28

Publications (1)

Publication Number Publication Date
WO2021037073A1 true WO2021037073A1 (zh) 2021-03-04

Family

ID=68991072

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/111443 WO2021037073A1 (zh) 2019-08-28 2020-08-26 控制方法及终端设备

Country Status (2)

Country Link
CN (1) CN110647277A (zh)
WO (1) WO2021037073A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110647277A (zh) * 2019-08-28 2020-01-03 维沃移动通信有限公司 一种控制方法及终端设备
CN111522478B (zh) * 2020-04-17 2021-09-07 维沃移动通信有限公司 一种图标的移动方法及电子设备
CN111966237B (zh) * 2020-08-06 2023-06-20 Tcl通讯(宁波)有限公司 一种开孔屏触摸补偿方法、装置及终端
CN111953900B (zh) * 2020-08-07 2022-01-28 维沃移动通信有限公司 图片拍摄方法、装置和电子设备

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103167179A (zh) * 2013-03-12 2013-06-19 广东欧珀移动通信有限公司 一种快速启动拍照功能的方法及移动设备
CN104216639A (zh) * 2014-08-28 2014-12-17 深圳市金立通信设备有限公司 一种终端操作方法
CN104679401A (zh) * 2013-12-03 2015-06-03 上海思立微电子科技有限公司 一种终端的触控方法及终端
KR20150072922A (ko) * 2013-12-20 2015-06-30 엘지전자 주식회사 이동 단말기 및 그 동작 방법
CN107147847A (zh) * 2017-05-15 2017-09-08 上海与德科技有限公司 一种移动终端的照相机的控制装置、方法以及移动终端
CN109151296A (zh) * 2017-06-19 2019-01-04 北京小米移动软件有限公司 电子设备、切换方法及装置、计算机可读存储介质
CN110572575A (zh) * 2019-09-20 2019-12-13 三星电子(中国)研发中心 一种摄像头拍摄控制方法和装置
CN110647277A (zh) * 2019-08-28 2020-01-03 维沃移动通信有限公司 一种控制方法及终端设备
CN111163260A (zh) * 2019-12-20 2020-05-15 维沃移动通信有限公司 一种摄像头启动方法及电子设备

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106921767A (zh) * 2017-03-07 2017-07-04 捷开通讯(深圳)有限公司 一种高屏占比的移动终端
CN107580179A (zh) * 2017-08-18 2018-01-12 维沃移动通信有限公司 一种摄像头启动方法及移动终端
CN107948496A (zh) * 2017-10-09 2018-04-20 广东小天才科技有限公司 移动终端的拍照方法、装置、移动终端及存储介质
CN108196714B (zh) * 2018-01-02 2021-01-15 联想(北京)有限公司 一种电子设备
CN108196722A (zh) * 2018-01-29 2018-06-22 广东欧珀移动通信有限公司 一种电子设备及其触控方法、计算机可读存储介质
CN108491142B (zh) * 2018-03-09 2020-08-11 Oppo广东移动通信有限公司 一种移动终端的控制方法、移动终端及存储介质
CN108803990B (zh) * 2018-06-12 2021-03-16 Oppo广东移动通信有限公司 交互方法、装置及终端
CN109740519B (zh) * 2018-12-29 2021-08-13 联想(北京)有限公司 控制方法和电子设备

Also Published As

Publication number Publication date
CN110647277A (zh) 2020-01-03

Similar Documents

Publication Publication Date Title
CN110851051B (zh) 一种对象分享方法及电子设备
WO2021104365A1 (zh) 对象分享方法及电子设备
WO2021218902A1 (zh) 显示控制方法、装置及电子设备
WO2020258929A1 (zh) 文件夹界面切换方法及终端设备
WO2021104195A1 (zh) 图像显示方法及电子设备
WO2020063091A1 (zh) 一种图片处理方法及终端设备
WO2021083132A1 (zh) 图标移动方法及电子设备
WO2021037073A1 (zh) 控制方法及终端设备
CN110221885B (zh) 一种界面显示方法及终端设备
WO2020151525A1 (zh) 消息发送方法及终端设备
WO2020192299A1 (zh) 信息显示方法及终端设备
WO2020151460A1 (zh) 对象处理方法及终端设备
WO2021129536A1 (zh) 图标移动方法及电子设备
WO2020199783A1 (zh) 界面显示方法及终端设备
WO2021012927A1 (zh) 图标显示方法及终端设备
WO2021104163A1 (zh) 图标整理方法及电子设备
WO2021004327A1 (zh) 应用权限设置方法及终端设备
CN108920226B (zh) 屏幕录制方法及装置
WO2020182035A1 (zh) 图像处理方法及终端设备
WO2021121265A1 (zh) 摄像头启动方法及电子设备
WO2021057290A1 (zh) 信息控制方法及电子设备
WO2020192324A1 (zh) 界面显示方法及终端设备
WO2021164716A1 (zh) 显示方法及电子设备
WO2021017738A1 (zh) 界面显示方法及电子设备
WO2021031717A1 (zh) 截屏方法及终端设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20856937

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20856937

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.10.2022)
