WO2021037073A1 - Control method and terminal device - Google Patents
Control method and terminal device
- Publication number
- WO2021037073A1 (application PCT/CN2020/111443)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- terminal device
- camera
- input
- target
- area
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the embodiments of the present application relate to the field of communication technologies, and in particular, to a control method and a terminal device.
- the user needs to first search the desktop of the terminal device for the application icon indicating the application program, and after finding the application icon, operate on it to trigger the terminal device to run the application.
- the embodiments of the present application provide a control method and a terminal device, to solve the problem that the process of running an application program on a terminal device is cumbersome and time-consuming and that human-computer interaction performance is poor.
- an embodiment of the present application provides a control method applied to a terminal device.
- the method includes: receiving a user's first input in a target area, where the target area is an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device; and, in response to the first input, performing a first action corresponding to the first input.
- an embodiment of the present application provides a terminal device.
- the terminal device includes a receiving module and a processing module.
- the receiving module is configured to receive a user's first input in a target area, where the target area is an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device;
- the processing module is configured to perform, in response to the first input received by the receiving module, a first action corresponding to the first input.
- the target area is located in the navigation bar of the terminal device.
- an embodiment of the present application provides a terminal device, including a processor, a memory, and a computer program stored on the memory and capable of running on the processor.
- when the computer program is executed by the processor, the steps of the control method of the above-mentioned first aspect are implemented.
- an embodiment of the present application provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the steps of the control method of the first aspect are implemented.
- the terminal device may receive a user's first input in a target area (an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device), and may, in response to the first input, perform a first action corresponding to the first input; the target area may be located in the navigation bar of the terminal device.
- when the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to the first action in the target area to trigger the terminal device to directly perform the first action.
- FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the application
- Fig. 2 is a schematic diagram of a control method provided by an embodiment of the application.
- FIG. 3 is one of the schematic diagrams of the interface of the control method application provided by the embodiment of the application.
- FIG. 5 is the third schematic diagram of the interface of the control method application provided by the embodiment of this application.
- FIG. 6 is the fourth schematic diagram of the interface of the control method application provided by the embodiment of this application.
- FIG. 7 is the fifth schematic diagram of the interface of the control method application provided by the embodiment of this application.
- FIG. 8 is the sixth schematic diagram of the interface of the control method application provided by the embodiment of this application.
- FIG. 9 is a schematic structural diagram of a terminal device provided by an embodiment of the application.
- FIG. 10 is a schematic diagram of hardware of a terminal device provided by an embodiment of the application.
- the terms "first" and "second" in this document are used to distinguish different objects, rather than to describe a specific order of objects.
- for example, the first input and the second input are used to distinguish different inputs, rather than to describe a specific order of inputs.
- words such as "exemplary" or "for example" are used to present examples, illustrations, or explanations. Any embodiment or design solution described as "exemplary" or "for example" in the embodiments of the present application should not be construed as more preferable or advantageous than other embodiments or design solutions. Rather, words such as "exemplary" or "for example" are used to present related concepts in a specific manner.
- "plural" refers to two or more; for example, a plurality of elements refers to two or more elements.
- Preview image captured by the camera: the preview image captured by the camera currently turned on by the terminal device, that is, the image displayed on the camera preview interface.
- Camera preview interface: the preview interface of the camera application, that is, the interface displayed by the terminal device when the camera application runs in the foreground of the terminal device.
- Navigation bar: the shortcut button bar on the screen of the terminal device.
- generally, the navigation bar appears in a peripheral area of the phone screen in the form of virtual buttons (for example, it can be located in the top, bottom, left, or right area of the screen).
- the shortcut buttons on the screen of the terminal device usually include a back button (back), a home button (home), and a recently used button (recents).
- Hole-punch screen: also called a hole-drilling screen. It refers to cutting a hole in the screen of the terminal device to expose the front camera set at the hole. In this way, the screen-to-body ratio of the terminal device can be increased without sacrificing the photographing effect of the front camera.
- a terminal device provided with a hole-punching screen may include a display layer and a touch layer, and punching holes on the hole-punching screen may be implemented in two ways (namely, the following method 1 and method 2).
- Method 1: punch through both the display layer and the touch layer of the screen. In this case, the hole-punched area of the screen can neither display content normally nor respond normally to the user's touch input.
- Method 2: punch through the display layer of the screen but retain the touch layer. In this case, although the perforated area of the screen cannot display content normally, it can still respond normally to the user's touch input because the touch layer is retained.
- the hole-punch screen in the embodiments of the present application can be implemented using Method 2 described above.
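The difference between the two methods can be modeled in a few lines. The sketch below is an illustrative Python model, not code from the patent; the names (`Screen`, `hole`, `handles_touch`) are invented for illustration. Under Method 1 a touch landing in the punched region is lost, while under Method 2 the retained touch layer still reports it.

```python
# Illustrative model of the two hole-punch methods described above.
# All names here are invented for illustration.

class Screen:
    def __init__(self, touch_layer_retained, hole):
        # hole: (x, y, w, h) rectangle cut out of the display layer
        self.touch_layer_retained = touch_layer_retained
        self.hole = hole

    def in_hole(self, x, y):
        hx, hy, hw, hh = self.hole
        return hx <= x < hx + hw and hy <= y < hy + hh

    def handles_touch(self, x, y):
        # Method 1 pierces both layers: touches in the hole are lost.
        # Method 2 retains the touch layer: touches in the hole still work.
        if self.in_hole(x, y):
            return self.touch_layer_retained
        return True

method1 = Screen(touch_layer_retained=False, hole=(100, 0, 40, 40))
method2 = Screen(touch_layer_retained=True, hole=(100, 0, 40, 40))
```

This is why the embodiment chooses Method 2: the target area over the first camera keeps its touch function even though it cannot display content.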
- the embodiments of the present application provide a control method and a terminal device.
- the terminal device can receive a user's first input in a target area (an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device), and can, in response to the first input, perform a first action corresponding to the first input; the target area can be located in the navigation bar of the terminal device. Then, when the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to the first action in the target area to trigger the terminal device to directly perform the first action.
- the terminal device in the embodiment of the present application may be a terminal device with an operating system.
- the operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
- the following uses the Android operating system as an example to introduce the software environment to which the control method provided in the embodiments of the present application is applied.
- FIG. 1 it is a schematic structural diagram of a possible Android operating system provided by an embodiment of this application.
- the architecture of the Android operating system includes 4 layers, namely: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
- the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
- the application framework layer is the framework of applications. Developers can develop applications based on the application framework layer while complying with the development principles of the application framework.
- the system runtime library layer includes a library (also called a system library) and an Android operating system runtime environment.
- the library mainly provides various resources needed by the Android operating system.
- the Android operating system operating environment is used to provide a software environment for the Android operating system.
- the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software.
- the kernel layer is based on the Linux kernel and provides core system services and hardware-related drivers for the Android operating system.
- a developer can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program that implements the control method provided in the embodiments of this application, so that the control method can run based on the Android operating system shown in FIG. 1. That is, the processor or the terminal device can implement the control method provided in the embodiments of the present application by running the software program in the Android operating system.
- the terminal device in the embodiment of the present application may be a mobile terminal or a non-mobile terminal.
- the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), etc.
- the non-mobile terminal may be a personal computer (PC), a television (television, TV), a teller machine, or a self-service machine, etc., which are not specifically limited in the embodiment of the present application.
- the execution subject of the control method provided in the embodiments of the present application may be the above-mentioned terminal device, or may be a functional module and/or functional entity in the terminal device that can implement the control method, which can be determined according to actual usage requirements and is not limited in the embodiments of the present application.
- a terminal device is taken as an example to illustrate the control method provided in the embodiment of the present application.
- in the embodiments of the present application, the target area may be the area corresponding to the first camera located under the screen of the terminal device, and the target area may be located in the navigation bar of the terminal device.
- since the target area corresponds to a camera, if the user requires the terminal device to perform a certain camera-related action (for example, the first action in the embodiments of this application), the user can perform an input in this area (for example, the first input in the embodiments of this application) to trigger the terminal device to directly execute the action corresponding to the input.
- an embodiment of the present application provides a control method, which may include the following S201 and S202.
- S201: the terminal device receives a user's first input in the target area.
- the above-mentioned target area may be an area on the screen of the terminal device and corresponding to the first camera located under the screen of the terminal device, and the target area may be located in the navigation bar of the terminal device.
- the screen of the terminal device may be a hole-drilling screen, and the screen may include a display layer and a touch layer.
- a target hole may be provided at the display layer corresponding to the target area, and the above-mentioned first camera may be provided at the target hole.
- first camera located under the screen of the terminal device may be understood as the first camera located under the touch layer of the screen of the terminal device.
- since the hole-punch screen in the embodiments of the present application only punches holes in the display layer (that is, the target hole is set in the display layer) and does not pierce the touch layer, that is, it penetrates the display layer of the screen while retaining the touch layer, the first camera can be exposed while ensuring that the touch function of the target area can be used normally.
- the above-mentioned first camera may be a front camera of the terminal device.
- the terminal device may further include a second camera, and the second camera may be a rear camera of the terminal device.
- the navigation bar of the terminal device may be located in any area of the screen of the terminal device.
- for example, the navigation bar can be located in the top, bottom, left, or right area of the terminal device screen, which can be determined according to actual usage requirements and is not limited in the embodiments of the present application.
- since the target area is located in the navigation bar of the terminal device and is the area on the screen corresponding to the first camera under the screen, the first camera is set under the target area in the navigation bar; it therefore does not occupy the effective display area of the terminal device screen, and the front camera of the terminal device (such as the first camera in the embodiments of this application) does not need to be a retractable (pop-up) camera. Thus, on the basis of increasing the screen-to-body ratio of the terminal device, the reliability and lifespan of the terminal device can be improved.
- the aforementioned target area may be any area in the aforementioned navigation bar.
- the above-mentioned first input may be any possible form of input, such as a click input, a long-press input, a deep-press input, a drag input, or a sliding input, which can be determined according to actual usage requirements and is not limited in the embodiments of the present application.
- the above-mentioned click input may be an input of a first preset number of clicks.
- the aforementioned long-press input may be an input whose contact duration reaches a first preset duration.
- the aforementioned deep-press input (also called re-press or pressure touch input) refers to an input in which the user presses with a pressure value greater than or equal to a first pressure threshold.
- the aforementioned drag input may be an input of dragging in any direction.
- the aforementioned sliding input may be an input of sliding in the first direction.
- the above-mentioned first preset number of times may be two or more times.
- the foregoing first preset duration, the foregoing first pressure threshold, and the foregoing first direction may all be determined according to actual usage requirements, which are not limited in the embodiments of the present application.
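The input types defined above could be distinguished roughly as follows. This is a hedged Python sketch, not code from the patent; the threshold values and the function name are invented, and the patent explicitly leaves the real thresholds to actual usage requirements.

```python
# Hypothetical classifier for the input types listed above.
# All thresholds are invented placeholders, not values from the patent.

LONG_PRESS_MS = 500        # first preset duration (assumed value)
PRESSURE_THRESHOLD = 1.0   # first pressure threshold (assumed value)
SLIDE_DISTANCE_PX = 20     # minimum movement to count as a slide (assumed)

def classify_input(duration_ms, pressure, dx, dy, taps=1):
    """Map one gesture in the target area to an input type."""
    if abs(dx) >= SLIDE_DISTANCE_PX or abs(dy) >= SLIDE_DISTANCE_PX:
        return "slide"
    if pressure >= PRESSURE_THRESHOLD:
        return "deep_press"
    if duration_ms >= LONG_PRESS_MS:
        return "long_press"
    return "double_click" if taps >= 2 else "click"
```

The ordering of the checks matters: movement is tested first so a long slide is not misread as a long press.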
- S202: in response to the first input, the terminal device executes a first action corresponding to the first input.
- the above-mentioned first action may be an action related to the first camera.
- the above-mentioned action related to the first camera may be understood as performing an opening action on an object related to the first camera.
- objects related to the first camera may include: a first camera, a second camera, a camera application, a gallery application, and so on.
- the details can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
- for example, the above-mentioned first action may include any one of the following: turning on the first camera and displaying the preview image captured by the first camera; turning on the second camera and displaying the preview image captured by the second camera; and displaying the interface of the gallery application.
- the terminal device displays the preview image captured by the first camera, that is, the terminal device displays the preview image captured by its front camera. For example, the terminal device can turn on the first camera while running the camera application in the foreground of the terminal device, and display the preview image captured by the first camera in the camera preview interface (that is, the preview interface of the camera application).
- the terminal device displays the preview image captured by the second camera, that is, the terminal device displays the preview image captured by its rear camera. For example, the terminal device can run the camera application in the foreground of the terminal device while the second camera is turned on, and display the preview image captured by the second camera in the camera preview interface (that is, the preview interface of the camera application).
- the interface of the aforementioned gallery application may be any interface in the gallery application.
- different first inputs may correspond to different first actions performed by the terminal device in response; that is, the user can trigger the terminal device to perform different first actions through different first inputs.
- the embodiments of the present application do not restrict which specific first input triggers the terminal device to perform which specific first action; it is only necessary that each specific first input corresponds uniquely to a certain first action.
- the details can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
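The one-to-one pairing between first inputs and first actions can be modeled as a simple lookup table. This is an illustrative sketch; the particular assignment shown is just the example mapping used later in this document (double-click → front camera, long press → rear camera, slide left → gallery), and per the paragraph above any other unique assignment would do.

```python
# Example mapping: each first input corresponds to exactly one first action.
# The names of the actions are invented for illustration.
FIRST_ACTIONS = {
    "double_click": "open_first_camera_preview",   # front camera
    "long_press":   "open_second_camera_preview",  # rear camera
    "slide_left":   "open_gallery_interface",
}

def first_action_for(input_type):
    """Look up the unique first action for a given first input."""
    return FIRST_ACTIONS[input_type]

# Sanity check that the mapping is unique, as the embodiment requires:
assert len(set(FIRST_ACTIONS.values())) == len(FIRST_ACTIONS)
```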
- for example, the user can click in the target area (that is, the first input is a click input), so that the terminal device can respond to the click input by turning on the first camera and displaying the preview image captured by the first camera.
- the number of clicks for the above-mentioned click input may be two or more times.
- the details can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
- when the terminal device displays any interface, the terminal device may respond to the above-mentioned click input by turning on the first camera and displaying the preview image captured by the first camera.
- specifically, when the terminal device displays any interface, the user can click in the target area (the number of clicks being two or more), that is, the terminal device receives the user's first input. The terminal device can then first determine whether the interface it currently displays is the camera preview interface. If the current interface is not the camera preview interface, the terminal device can run the camera application in the foreground and turn on the first camera, so that the terminal device displays the camera preview interface and shows the preview image captured by the first camera in it. If the current interface is the camera preview interface, the terminal device can further determine whether the camera preview interface is displaying the preview image captured by the first camera: if so, the terminal device can keep the first camera on and keep displaying its preview image; if the preview image captured by the second camera is displayed instead, the terminal device can turn off the second camera and turn on the first camera, and then display the preview image captured by the first camera.
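The branching just described can be sketched as a small pure function. This is an illustrative Python model of the flow, not code from the patent; the state names and step names are invented. Whatever the starting state, the flow always ends with the first camera's preview showing.

```python
def handle_double_click(current_interface, active_camera):
    """Model of the flow above: returns the steps the terminal device
    takes and the resulting (interface, camera) state after a
    double-click in the target area."""
    if current_interface != "camera_preview":
        # Not in the camera preview: run the camera app in the
        # foreground and turn on the first camera.
        steps = ["run_camera_app_in_foreground", "turn_on_first_camera"]
    elif active_camera == "first":
        # Already showing the first camera: keep it on, change nothing.
        steps = []
    else:
        # Rear camera preview was showing: swap cameras.
        steps = ["turn_off_second_camera", "turn_on_first_camera"]
    return steps, ("camera_preview", "first")
```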
- for example, the terminal device displays the interface 30 of the chat application program and the navigation bar 31, where the navigation bar 31 is located in the bottom area of the screen of the terminal device, and the target area 32 is located in the navigation bar 31.
- the user can double-click in the target area 32, that is, the terminal device receives the user's first input, so that the terminal device can respond to the first input: as shown in (b) of Figure 3, the terminal device can display the camera preview interface and display the preview image 33 captured by the first camera in it (that is, the terminal device turns on the first camera and displays the preview image captured by the first camera).
- since the user can directly trigger the terminal device to turn on the first camera and display the preview image captured by the first camera through the first input in the target area, the user does not need to perform multiple inputs (for example, first searching for the identifier indicating the camera application, then inputting on that identifier to trigger the terminal device to display the preview interface of the camera application, and then inputting on the camera switch identifier in the camera preview interface) to trigger the terminal device to display the preview image captured by the first camera. Therefore, the operation process of turning on the first camera and displaying the preview image captured by the first camera can be simplified, and human-computer interaction performance can be improved.
- for another example, the user can press and hold in the target area (that is, the first input is a long-press input), so that the terminal device can respond to the long-press input by turning on the second camera and displaying the preview image captured by the second camera.
- when the terminal device displays any interface, the terminal device may respond to the aforementioned long-press input by turning on the second camera and displaying the preview image captured by the second camera.
- specifically, when the terminal device displays any interface, the user can press and hold in the target area, that is, the terminal device receives the user's first input. The terminal device can then first determine whether the interface it currently displays is the camera preview interface. If the current interface is not the camera preview interface, the terminal device can run the camera application in the foreground and turn on the second camera, so that the terminal device displays the camera preview interface and shows the preview image captured by the second camera in it. If the current interface is the camera preview interface, the terminal device can further determine whether the camera preview interface is displaying the preview image captured by the second camera: if so, the terminal device can keep the second camera on and keep displaying its preview image; if the preview image captured by the first camera is displayed instead, the terminal device can turn off the first camera and turn on the second camera, so that the terminal device can display the preview image captured by the second camera.
- for example, the terminal device displays the interface 40 of the chat application program and the navigation bar 41, and the target area 42 is located in the navigation bar 41. The user can press and hold in the target area 42, so that the terminal device can display the camera preview interface and display the preview image 43 captured by the second camera in it (that is, the terminal device turns on the second camera and displays the preview image captured by the second camera).
- since the user can directly trigger the terminal device to turn on the second camera and display the preview image captured by the second camera through the first input in the target area, the user does not need to perform multiple inputs (for example, first searching for the identifier indicating the camera application, then inputting on that identifier to trigger the terminal device to display the preview interface of the camera application, and then inputting on the camera switch identifier in the camera preview interface) to trigger the terminal device to display the preview image captured by the second camera. Therefore, the operation process of turning on the second camera and displaying the preview image captured by the second camera can be simplified, and human-computer interaction performance can be improved.
- for another example, the user can slide in the target area (that is, the first input is a sliding input), so that the terminal device can respond to the sliding input by displaying the interface of the gallery application.
- when the terminal device displays any interface, the terminal device may display the interface of the gallery application in response to the above-mentioned sliding input.
- specifically, when the terminal device displays any interface, the user can slide in the target area (for example, slide to the left), that is, the terminal device receives the user's first input. The terminal device can then first determine whether the interface it currently displays is the interface of the gallery application. If the current interface is not the interface of the gallery application, the terminal device can run the gallery application in the foreground and display the interface of the gallery application. If the current interface is the interface of the gallery application, the terminal device can keep running the gallery application in the foreground and keep displaying its interface.
- for example, assume that the interface currently displayed on the terminal device is the interface of the chat application, the first input is a sliding input, and the navigation bar is located on the screen of the terminal device. The terminal device displays the chat application interface 50 and the navigation bar 51, and the target area 52 is located in the navigation bar 51. The user can slide left in the target area 52, that is, the terminal device receives the user's first input, so that the terminal device can respond to the first input and display an interface 53 of the gallery application.
- since the user can directly trigger the terminal device to display the interface of the gallery application through the first input in the target area, the user does not need to first search for the identifier indicating the gallery application (for example, the application icon of the gallery application) and then input on that identifier to trigger the terminal device to display the interface of the gallery application. Therefore, the operation process of displaying the interface of the gallery application can be simplified, and human-computer interaction performance can be improved.
- the first action corresponding to each first input described in the foregoing three possible implementation manners is an exemplary enumeration, which does not impose any limitation on the embodiment of the present application.
- for example, after responding to the click input, the terminal device can also turn on the second camera and display the preview image captured by the second camera; the terminal device may also display the interface of the gallery application after responding to the long-press input; or, when the first input is the user's sliding input in the target area with the sliding direction to the left, after responding to the sliding input, the terminal device can also turn on the first camera and display the preview image captured by the first camera.
- since the user can perform an input in the target area when the terminal device displays any interface, thereby triggering the terminal device to directly perform the first action corresponding to the input without first triggering the terminal device to return to the desktop and then searching the desktop for the identifier indicating the action, the convenience of triggering the terminal device to perform actions can be improved, and human-computer interaction performance can be improved.
- the navigation bar may include at least one virtual button
- the target area may be an area where the target virtual button is located
- the target virtual button may be a virtual button of the at least one virtual button
- The area where the target virtual key is located can be understood as the area on the screen of the terminal device used to display the target virtual key, or as the peripheral area of that display area (for example, a ring-shaped area surrounding the target virtual key), or as the area formed by the display area of the target virtual key together with its peripheral area. It can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
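The three interpretations of "the area where the target virtual key is located" can be illustrated with a simple hit test; this is a sketch under assumed geometry (a circular key), and the function name and parameters are hypothetical.

```python
import math

def in_target_area(x, y, key_cx, key_cy, key_radius, ring_width):
    """Hypothetical hit test treating the target area as the circular
    display area of the target virtual key plus a surrounding ring
    (the combined interpretation above; set ring_width=0 to test only
    the display area itself)."""
    distance = math.hypot(x - key_cx, y - key_cy)
    return distance <= key_radius + ring_width

# A touch just outside the key but inside the peripheral ring still counts.
print(in_target_area(x=110, y=100, key_cx=100, key_cy=100,
                     key_radius=8, ring_width=6))  # True
```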
- each of the above-mentioned at least one virtual key may be any one of the following: a return key, a home key, and a recently used key.
- the details can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
- each of the above-mentioned at least one virtual key is different.
- the above-mentioned target virtual button may be any one of the above-mentioned at least one virtual button.
- the above-mentioned target virtual button may be a return button, a home button or a recently used button.
- the details can be determined according to actual use requirements, and the embodiment of the present application does not limit it.
- the screen-to-body ratio of the terminal device can be further increased on the basis of ensuring the normal use of the first camera and the target virtual button.
- Optionally, a target identifier is displayed in the area of the target area other than the first area. The target identifier may be used to indicate the target virtual key, and the first area may be the projection area of the first camera on the screen of the terminal device. The projection area of the first camera on the screen of the terminal device can be understood as the orthographic projection area of the first camera on the screen of the terminal device.
- the aforementioned target area may include a first area and a second area (that is, other areas in the target area except the first area).
- the first area may be located in the center of the target area, and the second area may be located outside the first area. Therefore, the target identifier can be displayed around the outside of the first camera. In this way, visually, the first camera and the target identifier together form a single identifier on the surface of the screen, so as to prevent the first camera from affecting the display effect of the screen.
- the target area may be the area 52 shown in (b) in FIG. 5, the first area may be the grid-filled area 521 shown in (b) in FIG. 5, and the second area may be the black area 522 shown in (b) in FIG. 5.
- the above-mentioned target hole for setting the first camera may be located in the first area.
- the target hole may be set on the display layer of the first area, and the area on the screen of the terminal device corresponding to the first area cannot display content, but has a touch function.
- In this way, on the basis of further increasing the screen-to-body ratio, the user can be prompted with the location of the target virtual key on the screen of the terminal device, which can improve the convenience of operation and the human-computer interaction performance.
- When the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to the first action in the target area to trigger the terminal device to directly execute the first action.
- the process of executing actions by the terminal device can be simplified, and the human-computer interaction performance can be improved.
- the above-mentioned first action may include displaying an interface of a gallery application.
- the control method provided in the embodiment of the present application may further include the following S203 and S204.
- the terminal device receives a second input of the user in the target area.
- the above-mentioned target image may be an image in a gallery application.
- the above-mentioned second input may be any possible input such as a drag input, a sliding input, etc., which may be specifically determined according to actual use requirements, and the embodiment of the present application does not limit it.
- the aforementioned drag input may be an input of dragging in any direction.
- the aforementioned sliding input may be an input of sliding in the second direction.
- the above-mentioned second input is different from the above-mentioned first input.
- the second direction is different from the first direction.
- the above-mentioned second input may be an input for a target image.
- the user can trigger the terminal device to display the target image through an input on the interface of the gallery application.
- In response to the second input, the terminal device performs a second action corresponding to the second input on the target image.
- the above-mentioned second action may include any one of the following: zoom out and display the target image, zoom in and display the target image, and delete the target image.
- For different second inputs, the second action performed by the terminal device in response to the second input may also be different.
- Take the case where the second input is a sliding input as an example.
- the terminal device can zoom out and display the target image in response to the second input.
- the terminal device can zoom in and display the target image in response to the second input.
- the terminal device can delete the target image in response to the second input. Understandably, in this case, after deleting the target image, the terminal device can automatically display the previous image or the next image of the target image.
- the terminal device displays a character image
- the target image is an image in a gallery application
- the second input is a sliding input
- the navigation bar is located at the bottom area of the terminal device.
- As shown in (a) in FIG. 6, the terminal device displays the character image 60 and the navigation bar, and the target area 61 is located in the navigation bar. The user can slide upward in the target area 61, that is, the terminal device receives the user's second input. The terminal device may then respond to the second input and, as shown in (b) in FIG. 6, display a first interface 62 that includes an enlarged image of the character image, that is, the terminal device zooms in and displays the target image.
- As shown in (a) in FIG. 7, the terminal device displays the character image 70 and the navigation bar, and the target area 71 is located in the navigation bar. The user can slide downward in the target area 71, that is, the terminal device receives the user's second input. The terminal device can then respond to the second input and, as shown in (b) in FIG. 7, display a second interface 72 that includes a reduced image of the character image, that is, the terminal device zooms out and displays the target image.
- As shown in (a) in FIG. 8, the terminal device displays the character image 80 and the navigation bar, and the target area 81 is located in the navigation bar. The user can slide rightward in the target area 81, that is, the terminal device receives the user's second input. The terminal device can then respond to the second input; as shown in (b) in FIG. 8, the terminal device deletes the character image and can display a third interface 82 that includes the previous image of the character image.
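The behavior illustrated in FIGs. 6 to 8 can be summarized in a small model; the class, gesture names, and zoom factors below are hypothetical illustrations of the described mapping, not the claimed implementation.

```python
class GalleryViewer:
    """Hypothetical model of the second-input handling: slide up zooms in
    (FIG. 6), slide down zooms out (FIG. 7), slide right deletes the
    current image and automatically shows a neighboring one (FIG. 8)."""

    def __init__(self, images):
        self.images = list(images)
        self.index = 0
        self.scale = 1.0  # current display scale of the target image

    def second_input(self, gesture):
        if gesture == "slide_up":         # zoom in and display
            self.scale *= 2.0
        elif gesture == "slide_down":     # zoom out and display
            self.scale /= 2.0
        elif gesture == "slide_right":    # delete, then show previous/next
            del self.images[self.index]
            self.index = max(self.index - 1, 0)
        else:
            raise ValueError(f"unbound gesture {gesture!r}")

    @property
    def current(self):
        return self.images[self.index] if self.images else None

viewer = GalleryViewer(["img_a", "img_b", "img_c"])
viewer.second_input("slide_up")
print(viewer.scale)    # 2.0
viewer.index = 1       # now viewing img_b
viewer.second_input("slide_right")
print(viewer.current)  # img_a
```

Deleting the middle image leaves the previous one displayed, matching the automatic display behavior described for FIG. 8.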
- It can be understood that the above-mentioned second input may also be another input different from the inputs described above, which may be specifically determined according to actual use requirements and is not limited in the embodiment of the present application.
- When the terminal device displays the target image, the user can trigger the terminal device to zoom out the target image, zoom in the target image, or delete the target image through different inputs in the target area. That is, the user can directly trigger the terminal device to perform the corresponding action on the target image through an input in the target area, so the convenience of operating the target image can be increased and the human-computer interaction performance can be improved.
- The control methods shown in each of the foregoing method drawings are all exemplarily described in conjunction with one drawing in the embodiments of the present application.
- During specific implementation, the control methods shown in the foregoing method drawings can also be implemented in combination with any other combinable drawings illustrated in the above embodiments, and will not be repeated here.
- the terminal device 900 may include a receiving module 901 and a processing module 902.
- the receiving module 901 may be used to receive a user's first input in the target area; the processing module 902 may be used to perform a first action corresponding to the first input in response to the first input received by the receiving module 901.
- the target area may be an area on the screen of the terminal device and corresponding to the first camera located under the screen of the terminal device, and the target area may be located in the navigation bar of the terminal device.
- the above-mentioned first action may include any one of the following: turning on the first camera and displaying the preview image collected by the first camera, turning on the second camera and displaying the preview image collected by the second camera, and displaying The interface of the gallery application.
- the aforementioned navigation bar may include at least one virtual button
- the target area may be an area where the target virtual button is located
- the target virtual button may be a virtual button of the at least one virtual button
- a target identifier is displayed in the above-mentioned target area other than the first area.
- the target identifier may be used to indicate the target virtual button.
- the first area may be the projection area of the first camera on the screen of the terminal device.
- the above-mentioned first action may include displaying an interface of a gallery application.
- the receiving module 901 can also be used to receive the user's second input in the target area when the target image is displayed after the processing module 902 performs the first action corresponding to the first input; the processing module 902 can also be used to In response to the second input received by the receiving module 901, a second action corresponding to the second input is performed on the target image.
- the target image may be an image in a gallery application, and the second action may include any one of the following: zoom out and display the target image, zoom in and display the target image, and delete the target image.
- the terminal device 900 provided in the embodiment of the present application can implement the various processes implemented by the terminal device in the foregoing method embodiments. To avoid repetition, details are not described herein again.
- An embodiment of the present application provides a terminal device, which can receive a user's first input in a target area (an area on the screen of the terminal device corresponding to the first camera located under the screen of the terminal device) and, in response to the first input, perform a first action corresponding to the first input; the target area may be located in the navigation bar of the terminal device. Then, when the user requires the terminal device to perform a certain action, such as the first action, the user can perform the first input corresponding to the first action in the target area to trigger the terminal device to directly perform the first action.
- FIG. 10 is a schematic diagram of the hardware structure of a terminal device that implements each embodiment of the present application.
- the terminal device 100 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components.
- the structure of the terminal device shown in FIG. 10 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than those shown in the figure, a combination of certain components, or a different component layout.
- terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminal devices, wearable devices, and pedometers.
- the user input unit 107 is configured to receive a user's first input in the target area; the processor 110 is configured to perform a first action corresponding to the first input in response to the first input received by the user input unit 107.
- the target area is an area on the screen of the terminal device and corresponding to the first camera located under the screen of the terminal device, and the target area is located in the navigation bar of the terminal device.
- the receiving module 901 in the above-mentioned structural schematic diagram of the terminal device may be implemented by the above-mentioned user input unit 107.
- the processing module 902 in the above-mentioned structural schematic diagram of the terminal device may be implemented by the above-mentioned processor 110.
- the radio frequency unit 101 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, downlink data from the base station is received and then processed by the processor 110; in addition, uplink data is sent to the base station.
- the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
- the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
- the terminal device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
- the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (for example, call signal reception sound, message reception sound, etc.).
- the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
- the input unit 104 is used to receive audio or video signals.
- the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042.
- The graphics processor 1041 processes the image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frame can be displayed on the display unit 106.
- the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
- the microphone 1042 can receive sound, and can process such sound into audio data.
- In the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
- the terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
- the light sensor includes an ambient light sensor and a proximity sensor.
- the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
- the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear.
- the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration), vibration recognition related functions (such as a pedometer and tapping), and the like; the sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
- the display unit 106 is used to display information input by the user or information provided to the user.
- the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
- the user input unit 107 can be used to receive input digital or character information, and generate key signal inputs related to user settings and function control of the terminal device.
- the user input unit 107 includes a touch panel 1071 and other input devices 1072.
- The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory).
- the touch panel 1071 may include two parts: a touch detection device and a touch controller.
- The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes the commands sent by the processor 110.
- the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
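The two-part touch panel described above (a detection device feeding a controller that converts raw signals into contact coordinates for the processor) might be sketched as follows; the class name, screen resolution, and raw ADC range are assumptions for illustration only.

```python
class TouchController:
    """Hypothetical touch controller: converts the raw readings reported
    by the touch detection device into on-screen contact coordinates,
    which would then be sent on to the processor."""

    def __init__(self, screen_w, screen_h, raw_max=4095):
        self.screen_w = screen_w
        self.screen_h = screen_h
        self.raw_max = raw_max  # assumed full-scale value of the raw reading

    def to_contact_coords(self, raw_x, raw_y):
        # Scale raw readings linearly into pixel coordinates.
        px = raw_x * self.screen_w // self.raw_max
        py = raw_y * self.screen_h // self.raw_max
        return px, py

ctrl = TouchController(screen_w=1080, screen_h=2340)
print(ctrl.to_contact_coords(2048, 1024))  # (540, 585)
```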
- the user input unit 107 may also include other input devices 1072.
- other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
- the touch panel 1071 can be overlaid on the display panel 1061.
- When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
- Although in FIG. 10 the touch panel 1071 and the display panel 1061 are used as two independent components to implement the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to implement the input and output functions of the terminal device, which is not specifically limited here.
- the interface unit 108 is an interface for connecting an external device with the terminal device 100.
- the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
- The interface unit 108 can be used to receive input from an external device (for example, data information, power, etc.) and transmit the received input to one or more elements in the terminal device 100, or can be used to transfer data between the terminal device 100 and an external device.
- the memory 109 can be used to store software programs and various data.
- the memory 109 may mainly include a program storage area and a data storage area.
- The program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book), and the like.
- the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
- the processor 110 is the control center of the terminal device. It uses various interfaces and lines to connect the various parts of the entire terminal device, runs or executes the software programs and/or modules stored in the memory 109, and calls the data stored in the memory 109 to perform the various functions of the terminal device and process data, thereby monitoring the terminal device as a whole.
- the processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may also not be integrated into the processor 110.
- the terminal device 100 may also include a power source 111 (such as a battery) for supplying power to various components.
- the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
- the terminal device 100 includes some functional modules not shown, which will not be repeated here.
- Optionally, an embodiment of the present application further provides a terminal device, including a processor 110 as shown in FIG. 10, a memory 109, and a computer program stored in the memory 109 and executable on the processor 110. When the computer program is executed by the processor 110, each process of the foregoing method embodiment is realized, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
- The embodiments of the present application also provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the above-mentioned method embodiment is realized, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
- the computer-readable storage medium may include read-only memory (ROM), random access memory (RAM), magnetic disk or optical disk, etc.
- The methods of the above embodiments can be implemented by means of software plus a necessary general hardware platform; of course, they can also be implemented by hardware, but in many cases the former is the better implementation.
- The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions to make a terminal device (which can be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) execute the methods described in the embodiments of the present application.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (12)
- A control method, applied to a terminal device, the method comprising: receiving a first input of a user in a target area, the target area being an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device; and in response to the first input, performing a first action corresponding to the first input; wherein the target area is located in the navigation bar of the terminal device.
- The method according to claim 1, wherein the first action comprises any one of the following: turning on the first camera and displaying a preview image collected by the first camera, turning on a second camera and displaying a preview image collected by the second camera, and displaying an interface of a gallery application.
- The method according to claim 1, wherein the navigation bar comprises at least one virtual key, the target area is the area where a target virtual key is located, and the target virtual key is a virtual key among the at least one virtual key.
- The method according to claim 3, wherein a target identifier is displayed in the area of the target area other than a first area, the target identifier is used to indicate the target virtual key, and the first area is the projection area of the first camera on the screen of the terminal device.
- The method according to claim 2, wherein the first action comprises displaying the interface of the gallery application; and after the performing of the first action corresponding to the first input, the method further comprises: in the case of displaying a target image, receiving a second input of the user in the target area, the target image being an image in the gallery application; and in response to the second input, performing a second action corresponding to the second input on the target image; wherein the second action comprises any one of the following: zooming out the display of the target image, zooming in the display of the target image, and deleting the target image.
- A terminal device, wherein the terminal device comprises a receiving module and a processing module; the receiving module is configured to receive a first input of a user in a target area, the target area being an area on the screen of the terminal device corresponding to a first camera located under the screen of the terminal device; the processing module is configured to, in response to the first input received by the receiving module, perform a first action corresponding to the first input, the first action being an action related to the first camera; wherein the target area is located in the navigation bar of the terminal device.
- The terminal device according to claim 6, wherein the first action comprises any one of the following: turning on the first camera and displaying a preview image collected by the first camera, turning on a second camera and displaying a preview image collected by the second camera, and displaying an interface of a gallery application.
- The terminal device according to claim 6, wherein the navigation bar comprises at least one virtual key, the target area is the area where a target virtual key is located, and the target virtual key is a virtual key among the at least one virtual key.
- The terminal device according to claim 8, wherein a target identifier is displayed in the area of the target area other than a first area, the target identifier is used to indicate the target virtual key, and the first area is the projection area of the first camera on the screen of the terminal device.
- The terminal device according to claim 7, wherein the first action comprises displaying an interface of a gallery application; the receiving module is further configured to receive a second input of the user in the target area after the processing module performs the first action corresponding to the first input and in the case that the processing module displays a target image; the processing module is further configured to, in response to the second input received by the receiving module, perform a second action corresponding to the second input on the target image; wherein the second action comprises any one of the following: zooming out the display of the target image, zooming in the display of the target image, and deleting the target image; and the target image is an image in the gallery application.
- A terminal device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein when the computer program is executed by the processor, the steps of the control method according to any one of claims 1 to 5 are implemented.
- A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the control method according to any one of claims 1 to 5 are implemented.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910803952.XA CN110647277A (zh) | 2019-08-28 | 2019-08-28 | 一种控制方法及终端设备 |
CN201910803952.X | 2019-08-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021037073A1 true WO2021037073A1 (zh) | 2021-03-04 |
Family
ID=68991072
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/111443 WO2021037073A1 (zh) | 2019-08-28 | 2020-08-26 | 控制方法及终端设备 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110647277A (zh) |
WO (1) | WO2021037073A1 (zh) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110647277A (zh) * | 2019-08-28 | 2020-01-03 | 维沃移动通信有限公司 | 一种控制方法及终端设备 |
CN111522478B (zh) * | 2020-04-17 | 2021-09-07 | 维沃移动通信有限公司 | 一种图标的移动方法及电子设备 |
CN111966237B (zh) * | 2020-08-06 | 2023-06-20 | Tcl通讯(宁波)有限公司 | 一种开孔屏触摸补偿方法、装置及终端 |
CN111953900B (zh) * | 2020-08-07 | 2022-01-28 | 维沃移动通信有限公司 | 图片拍摄方法、装置和电子设备 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103167179A (zh) * | 2013-03-12 | 2013-06-19 | 广东欧珀移动通信有限公司 | 一种快速启动拍照功能的方法及移动设备 |
CN104216639A (zh) * | 2014-08-28 | 2014-12-17 | 深圳市金立通信设备有限公司 | 一种终端操作方法 |
CN104679401A (zh) * | 2013-12-03 | 2015-06-03 | 上海思立微电子科技有限公司 | 一种终端的触控方法及终端 |
KR20150072922A (ko) * | 2013-12-20 | 2015-06-30 | 엘지전자 주식회사 | 이동 단말기 및 그 동작 방법 |
CN107147847A (zh) * | 2017-05-15 | 2017-09-08 | 上海与德科技有限公司 | 一种移动终端的照相机的控制装置、方法以及移动终端 |
CN109151296A (zh) * | 2017-06-19 | 2019-01-04 | 北京小米移动软件有限公司 | 电子设备、切换方法及装置、计算机可读存储介质 |
CN110572575A (zh) * | 2019-09-20 | 2019-12-13 | 三星电子(中国)研发中心 | 一种摄像头拍摄控制方法和装置 |
CN110647277A (zh) * | 2019-08-28 | 2020-01-03 | 维沃移动通信有限公司 | 一种控制方法及终端设备 |
CN111163260A (zh) * | 2019-12-20 | 2020-05-15 | 维沃移动通信有限公司 | 一种摄像头启动方法及电子设备 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106921767A (zh) * | 2017-03-07 | 2017-07-04 | 捷开通讯(深圳)有限公司 | 一种高屏占比的移动终端 |
CN107580179A (zh) * | 2017-08-18 | 2018-01-12 | 维沃移动通信有限公司 | 一种摄像头启动方法及移动终端 |
CN107948496A (zh) * | 2017-10-09 | 2018-04-20 | 广东小天才科技有限公司 | 移动终端的拍照方法、装置、移动终端及存储介质 |
CN108196714B (zh) * | 2018-01-02 | 2021-01-15 | 联想(北京)有限公司 | 一种电子设备 |
CN108196722A (zh) * | 2018-01-29 | 2018-06-22 | 广东欧珀移动通信有限公司 | 一种电子设备及其触控方法、计算机可读存储介质 |
CN108491142B (zh) * | 2018-03-09 | 2020-08-11 | Oppo广东移动通信有限公司 | 一种移动终端的控制方法、移动终端及存储介质 |
CN108803990B (zh) * | 2018-06-12 | 2021-03-16 | Oppo广东移动通信有限公司 | 交互方法、装置及终端 |
CN109740519B (zh) * | 2018-12-29 | 2021-08-13 | 联想(北京)有限公司 | 控制方法和电子设备 |
-
2019
- 2019-08-28 CN CN201910803952.XA patent/CN110647277A/zh active Pending
-
2020
- 2020-08-26 WO PCT/CN2020/111443 patent/WO2021037073A1/zh active Application Filing
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103167179A (zh) * | 2013-03-12 | 2013-06-19 | 广东欧珀移动通信有限公司 | 一种快速启动拍照功能的方法及移动设备 |
CN104679401A (zh) * | 2013-12-03 | 2015-06-03 | 上海思立微电子科技有限公司 | 一种终端的触控方法及终端 |
KR20150072922A (ko) * | 2013-12-20 | 2015-06-30 | 엘지전자 주식회사 | 이동 단말기 및 그 동작 방법 |
CN104216639A (zh) * | 2014-08-28 | 2014-12-17 | 深圳市金立通信设备有限公司 | 一种终端操作方法 |
CN107147847A (zh) * | 2017-05-15 | 2017-09-08 | 上海与德科技有限公司 | 一种移动终端的照相机的控制装置、方法以及移动终端 |
CN109151296A (zh) * | 2017-06-19 | 2019-01-04 | 北京小米移动软件有限公司 | 电子设备、切换方法及装置、计算机可读存储介质 |
CN110647277A (zh) * | 2019-08-28 | 2020-01-03 | 维沃移动通信有限公司 | 一种控制方法及终端设备 |
CN110572575A (zh) * | 2019-09-20 | 2019-12-13 | 三星电子(中国)研发中心 | 一种摄像头拍摄控制方法和装置 |
CN111163260A (zh) * | 2019-12-20 | 2020-05-15 | 维沃移动通信有限公司 | 一种摄像头启动方法及电子设备 |
Also Published As
Publication number | Publication date |
---|---|
CN110647277A (zh) | 2020-01-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110851051B (zh) | 一种对象分享方法及电子设备 | |
WO2021104365A1 (zh) | 对象分享方法及电子设备 | |
WO2021218902A1 (zh) | 显示控制方法、装置及电子设备 | |
WO2020258929A1 (zh) | 文件夹界面切换方法及终端设备 | |
WO2021104195A1 (zh) | 图像显示方法及电子设备 | |
WO2020063091A1 (zh) | 一种图片处理方法及终端设备 | |
WO2021083132A1 (zh) | 图标移动方法及电子设备 | |
WO2021037073A1 (zh) | 控制方法及终端设备 | |
CN110221885B (zh) | 一种界面显示方法及终端设备 | |
WO2020151525A1 (zh) | 消息发送方法及终端设备 | |
WO2020192299A1 (zh) | 信息显示方法及终端设备 | |
WO2020151460A1 (zh) | 对象处理方法及终端设备 | |
WO2021129536A1 (zh) | 图标移动方法及电子设备 | |
WO2020199783A1 (zh) | 界面显示方法及终端设备 | |
WO2021012927A1 (zh) | 图标显示方法及终端设备 | |
WO2021104163A1 (zh) | 图标整理方法及电子设备 | |
WO2021004327A1 (zh) | 应用权限设置方法及终端设备 | |
CN108920226B (zh) | 屏幕录制方法及装置 | |
WO2020182035A1 (zh) | 图像处理方法及终端设备 | |
WO2021121265A1 (zh) | 摄像头启动方法及电子设备 | |
WO2021057290A1 (zh) | 信息控制方法及电子设备 | |
WO2020192324A1 (zh) | 界面显示方法及终端设备 | |
WO2021164716A1 (zh) | 显示方法及电子设备 | |
WO2021017738A1 (zh) | 界面显示方法及电子设备 | |
WO2021031717A1 (zh) | 截屏方法及终端设备 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20856937 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20856937 Country of ref document: EP Kind code of ref document: A1 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20856937 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.10.2022) |
|