WO2020151675A1 - Object control method and terminal device - Google Patents

Object control method and terminal device

Info

Publication number
WO2020151675A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
screen
target
input
terminal device
Prior art date
Application number
PCT/CN2020/073301
Other languages
English (en)
French (fr)
Inventor
Tang Junkun (唐俊坤)
Original Assignee
Vivo Mobile Communication Co., Ltd. (维沃移动通信有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co., Ltd.
Publication of WO2020151675A1
Priority to US 17/380,029 (US11526320B2)

Classifications

    • G06F3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/0481 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 — Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/1423 — Digital output to display device; cooperation and interconnection of the display device with other functional units; controlling a plurality of local displays, e.g. CRT and flat panel display
    • G09G2354/00 — Aspects of interface with display user

Definitions

  • The embodiments of the present disclosure relate to the field of communication technologies, and in particular to an object control method and a terminal device.
  • A double-sided-screen terminal device can display a first interface on its main screen and a second interface on its secondary screen.
  • The user can perform an input on an icon in the first interface displayed on the main screen to trigger the terminal device to perform an action corresponding to the input (for example, changing the position of the icon or moving the icon into a folder).
  • Likewise, the user can perform an input on an icon in the second interface displayed on the secondary screen to trigger the terminal device to perform an action corresponding to that input.
  • Because the first interface and the second interface are located on different screens of the double-sided-screen terminal device, a user who needs to perform touch operations on both interfaces often has to switch between the main screen and the secondary screen multiple times. As a result, a multi-screen terminal device offers poor convenience when controlling objects on different screens.
  • The embodiments of the present disclosure provide an object control method and a terminal device, so as to solve the problem that existing multi-screen terminal devices are inconvenient when controlling objects on different screens.
  • The embodiments of the present disclosure provide an object control method applied to a terminal device.
  • The terminal device includes at least two screens.
  • The method includes: receiving a user's first input on a target manipulation control and a first object in a first screen, where an object in the target manipulation control is an object in a second screen, and the second screen is a screen other than the first screen among the at least two screens; and, in response to the first input, performing on the first screen a first action corresponding to the first input on the first object, where the first object is an object in the target manipulation control or an object in a target area, and the target area is the area of the first screen other than the area where the target manipulation control is located.
  • The embodiments of the present disclosure also provide a terminal device.
  • The terminal device includes at least two screens.
  • The terminal device may include a receiving module and a control module.
  • The receiving module is configured to receive a user's first input on a target manipulation control and a first object in a first screen, where an object in the target manipulation control is an object in a second screen, and the second screen is a screen other than the first screen among the at least two screens.
  • The control module is configured to, in response to the first input received by the receiving module, perform on the first screen a first action corresponding to the first input on the first object.
  • The first object is an object in the target manipulation control or an object in a target area.
  • The target area is the area of the first screen other than the area where the target manipulation control is located.
  • The embodiments of the present disclosure further provide a terminal device that includes a processor, a memory, and a computer program stored in the memory and runnable on the processor. When the computer program is executed by the processor, the steps of the object control method in the first aspect are implemented.
  • The embodiments of the present disclosure further provide a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, the steps of the object control method in the first aspect are implemented.
  • With the embodiments of the present disclosure, the terminal device receives a user's first input on a target manipulation control and a first object in the first screen (an object in the target manipulation control is an object in the second screen, and the second screen is a screen other than the first screen among the at least two screens), and, in response to the first input, performs on the first screen a first action corresponding to the first input on the first object.
  • The first object is an object in the target manipulation control or an object in a target area.
  • The target area is the area of the first screen other than the area where the target manipulation control is located.
  • Since the manipulation control can be used to trigger display of the second screen's interface on the first screen, objects in the second screen can be controlled directly on the first screen.
  • This can improve the convenience with which a multi-screen terminal device controls objects on different screens.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure;
  • FIG. 3 is the first of the schematic interface diagrams for applying the object control method provided by embodiments of the disclosure;
  • FIG. 5 is the third schematic interface diagram for applying the object control method provided by embodiments of the disclosure;
  • FIG. 6 is the fourth schematic interface diagram for applying the object control method provided by embodiments of the disclosure;
  • FIG. 7 is the fifth schematic interface diagram for applying the object control method provided by embodiments of the disclosure;
  • FIG. 8 is the sixth schematic interface diagram for applying the object control method provided by embodiments of the disclosure;
  • FIG. 9 is the second schematic diagram of an object control method provided by an embodiment of the disclosure;
  • FIG. 10 is the seventh schematic interface diagram for applying the object control method provided by embodiments of the disclosure;
  • FIG. 11 is the third schematic diagram of an object control method provided by an embodiment of the disclosure;
  • FIG. 12 is a schematic structural diagram of a terminal device provided by an embodiment of the disclosure;
  • FIG. 13 is a schematic hardware diagram of a terminal device provided by an embodiment of the disclosure.
  • The terms "first" and "second" in the specification and claims of the present disclosure are used to distinguish different objects, rather than to describe a specific order of objects.
  • For example, the first screen and the second screen are used to distinguish different screens, rather than to describe a specific order of the screens.
  • In the embodiments of the present disclosure, words such as "exemplary" or "for example" are used to indicate an example or illustration. Any embodiment or design described as "exemplary" or "for example" should not be construed as more preferable or advantageous than other embodiments or designs; rather, such words are used to present related concepts in a concrete manner.
  • In the embodiments of the present disclosure, "multiple" means two or more; for example, multiple processing units means two or more processing units.
  • The embodiments of the present disclosure provide an object control method and a terminal device. The terminal device can receive a user's first input on a target manipulation control and a first object in a first screen (an object in the target manipulation control is an object in the second screen, and the second screen is a screen other than the first screen among the at least two screens) and, in response to the first input, perform on the first screen a first action corresponding to the first input on the first object, where the first object is an object in the target manipulation control or an object in a target area, and the target area is the area of the first screen other than the area where the target manipulation control is located.
  • Since the manipulation control can be used to trigger display of the second screen's interface on the first screen, objects in the second screen can be controlled directly on the first screen, which can improve the convenience with which a multi-screen terminal device controls objects on different screens.
  • The terminal device in the embodiments of the present disclosure may be a terminal device with an operating system.
  • The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present disclosure.
  • The following takes the Android operating system as an example to introduce the software environment to which the object control method provided by the embodiments of the present disclosure is applied.
  • As shown in FIG. 1, it is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure.
  • The architecture of the Android operating system includes four layers: the application layer, the application framework layer, the system runtime library layer, and the kernel layer (specifically, the Linux kernel layer).
  • The application layer includes the various applications (both system applications and third-party applications) in the Android operating system.
  • The application framework layer is the framework of the applications. Developers can develop applications based on the application framework layer while complying with its development principles.
  • The system runtime library layer includes libraries (also called system libraries) and the Android operating system runtime environment.
  • The libraries mainly provide the various resources needed by the Android operating system.
  • The Android operating system runtime environment provides the software environment in which the Android operating system runs.
  • The kernel layer is the operating system layer of the Android operating system and is the lowest level of the Android operating system software.
  • The kernel layer provides core system services and hardware-related drivers for the Android operating system based on the Linux kernel.
  • Taking the Android operating system as an example, developers can develop software programs that implement the object control method provided by the embodiments of the present disclosure based on the system architecture of the Android operating system shown in FIG. 1, so that the object control method can run on that operating system. That is, the processor or the terminal device can implement the object control method provided by the embodiments of the present disclosure by running the software program in the Android operating system.
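  • By way of illustration only, the following is a minimal Kotlin sketch of how such an application-layer program might model the screens and the mirrored manipulation control. None of these names come from the patent; they are assumptions for the sketch:

```kotlin
// Hypothetical data model; all names are illustrative assumptions.
data class ScreenObject(val id: String, val label: String)

// A screen (e.g., main screen or secondary screen) holding displayable objects.
class Screen(val name: String) {
    val objects = mutableListOf<ScreenObject>()
}

// A manipulation control displayed on the first screen that projects (mirrors)
// the objects of another screen, as the text above describes.
class ManipulationControl(val mirrored: Screen) {
    // The objects "in" the control are exactly the mirrored screen's objects.
    fun objects(): List<ScreenObject> = mirrored.objects
}
```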
  • The terminal device in the embodiments of the present disclosure may be a mobile terminal or a non-mobile terminal.
  • The mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA).
  • The non-mobile terminal may be a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like, which is not specifically limited in the embodiments of the present disclosure.
  • The execution subject of the object control method provided by the embodiments of the present disclosure may be the above-mentioned terminal device, or a functional module and/or functional entity in the terminal device that can implement the object control method; this can be determined according to actual usage requirements, and the embodiments of the present disclosure are not limited in this respect.
  • The following takes a terminal device as an example to illustrate the object control method provided by the embodiments of the present disclosure.
  • An embodiment of the present disclosure provides an object control method that can be applied to a multi-screen terminal device. The object control method may include the following S200 and S201.
  • S200: The terminal device receives a user's first input on a target manipulation control and a first object in a first screen, where an object in the target manipulation control is an object in a second screen.
  • The multi-screen terminal device may include the first screen and other screens besides the first screen, and each screen may display its own interface (for example, a desktop or another display interface).
  • At least one manipulation control may also be displayed in the first screen. The target manipulation control is the manipulation control corresponding to the second screen, and the second screen is a screen other than the first screen among the at least two screens; that is, the second screen is the screen, among the other screens, to which the target manipulation control corresponds.
  • The target manipulation control can be understood as a projection or mapping, on the first screen, of the second screen's interface or of the objects in the second screen; that is, an object in the target manipulation control is the projection or mapping, on the first screen, of an object in the second screen.
  • The target manipulation control can be used to trigger display of the second screen's interface on the first screen.
  • Therefore, the embodiments of the present disclosure make it possible to directly control or operate, on the first screen, the objects in the second screen; interactive operations between the first screen and the second screen on the objects or content of either screen can be implemented on the first screen.
  • The display interface of the first screen is not limited to one target manipulation control; it may also include multiple manipulation controls, each corresponding to a different one of the other screens, and each of them is equivalent to the aforementioned target manipulation control.
  • In this way, the user can perform an input (i.e., the first input) on the target manipulation control and the first object in the first screen to trigger the terminal device to perform, on the first screen, an action corresponding to the input on the first object.
  • The first screen may be the main screen of the multi-screen terminal device and the other screens may be its secondary screens; of course, the first screen may also be a secondary screen of the multi-screen terminal device, and one of the other screens may be the main screen. This can be determined according to actual usage requirements, and the embodiments of the present disclosure do not limit it.
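  • As a hedged sketch of S200, the first input can be classified by whether it lands inside the target manipulation control's bounds or in the target area (the rest of the first screen). The geometry types below are stand-ins for platform equivalents and are not from the patent:

```kotlin
// Illustrative only: classifying where a first input lands on the first screen.
data class Point(val x: Int, val y: Int)

data class Bounds(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(p: Point) = p.x in left until right && p.y in top until bottom
}

enum class FirstObjectLocation { IN_CONTROL, IN_TARGET_AREA }

// The target area is the first screen minus the control's bounds.
fun classifyFirstInput(touch: Point, controlBounds: Bounds): FirstObjectLocation =
    if (controlBounds.contains(touch)) FirstObjectLocation.IN_CONTROL
    else FirstObjectLocation.IN_TARGET_AREA
```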
  • For ease of description, the following takes a double-sided-screen terminal device as an example, where the terminal device includes a first screen and a second screen, the first screen being the main screen and the second screen being the secondary screen.
  • FIG. 3 exemplarily shows a schematic interface diagram of an application of the object control method provided by an embodiment of the present disclosure.
  • As shown in FIG. 3, the double-sided-screen terminal device may include a main screen 30 and a secondary screen 31.
  • In the main screen 30, the manipulation control 1 corresponding to the secondary screen 31 is displayed.
  • The secondary screen 31 includes application icon 2 and folder icon 2 (hereinafter referred to as secondary-screen objects).
  • The manipulation control 1 also includes application icon 2 and folder icon 2; that is, the objects in the manipulation control 1 are the same as the objects in the secondary screen 31. It can be understood that the objects in the manipulation control 1 are the projection or mapping of the objects in the secondary screen 31 on the main screen 30.
  • The above-mentioned target manipulation control is one of the at least one manipulation control in the first screen. As shown in FIG. 3, the target manipulation control is the manipulation control 1 in the main screen 30.
  • The above-mentioned first object may be an icon (such as an application icon or a folder icon), a video playback window, a browser page, or any other possible object, determined according to actual usage requirements; the embodiments of the present disclosure do not limit it.
  • In one implementation, the first object may be an object in the target manipulation control. As shown in FIG. 3, the first object may be one of application icon 2 and folder icon 2.
  • In another implementation, the first object may be one of the at least one object in the first screen. As shown in FIG. 3, the first object may be one of application icon 1 and folder icon 1 in the main screen 30 (i.e., a main-screen object).
  • The objects and manipulation controls in the above first screen are all exemplary enumerations; that is, the embodiments of the present disclosure include but are not limited to those enumerated above. The first screen may also include any other possible types or numbers of objects and manipulation controls, determined according to actual usage requirements, which the embodiments of the present disclosure do not limit.
  • The user's first input may be a click input (for example, a single-click or double-click input), a drag input, or any other possible form of input, determined according to actual usage requirements; the embodiments of the present disclosure do not limit it.
  • S201: In response to the first input, the terminal device performs, on the first screen, a first action corresponding to the first input on the first object, where the first object is an object in the target manipulation control or an object in a target area, and the target area is the area of the first screen other than the area where the target manipulation control is located.
  • The above-mentioned first action may be determined according to the user's first input on the target manipulation control and the first object: different first objects and different first inputs lead to different first actions.
  • The following describes in detail the object control method provided by the embodiments of the present disclosure for the case where the first object is an object in the target manipulation control (the first implementation below) and the case where the first object is an object in the target area (the second implementation below).
  • In the first implementation, where the first object is an object in the target manipulation control, the first action may be any one of the following (a), (b), and (c):
  • (a) The terminal device displays an interface corresponding to the first object in the area corresponding to the target manipulation control.
  • If the user inputs on a secondary-screen object in the target manipulation control, the terminal device can respond to the input by displaying, in the area where the target manipulation control is located, the interface corresponding to that secondary-screen object, which is equivalent to responding on the main screen to the user's input on the secondary-screen object. In this way, the objects (or content) in the secondary screen can be operated directly on the main screen, improving the operating convenience of the multi-screen terminal device.
  • For example, if the first input is the user's input on application icon 2 in the manipulation control 1, the terminal device may respond to the input by running the application program corresponding to application icon 2 and displaying the interface 32 of that application program in the area where the manipulation control 1 is located. In this way, the application icons in the secondary screen can be operated directly on the main screen.
  • Alternatively, if the first input is the user's input on folder icon 2 in the target manipulation control, the terminal device can respond to the input by displaying, in the area where the target manipulation control is located, the folder expansion page 33 corresponding to folder icon 2 (where the folder expansion page includes application icon 3 and application icon 4). In this way, the folder icons in the secondary screen can be operated directly on the main screen.
  • (b) The terminal device moves the first object from a first position in the target manipulation control to a second position in the target manipulation control.
  • If the user's input drags a secondary-screen object from a first position in the target manipulation control to a second position in the target manipulation control, the terminal device can respond accordingly, so that the position of the secondary-screen object in the target manipulation control changes; consequently, the position of that object in the secondary screen also changes, which is equivalent to moving the secondary-screen object within the secondary screen.
  • For example, if the first input is the user dragging application icon 2 onto folder icon 2, the terminal device can respond to the input by moving application icon 2 into folder icon 2, after which folder icon 2 contains application icon 2, application icon 3, and application icon 4.
  • In this way, the objects or content in the secondary screen can be operated directly on the main screen.
  • (c) The terminal device moves the first object from the target manipulation control to the target area.
  • If the user's input drags a secondary-screen object from the target manipulation control to the target area of the main screen, the terminal device can respond accordingly, which is equivalent to moving the secondary-screen object from the secondary screen to the main screen, so that the secondary-screen object becomes a main-screen object.
  • For example, if the first object is application icon 2 in the manipulation control 1 and the first input is the user dragging application icon 2 to the target area in the main screen 30, the terminal device can respond to the input by moving application icon 2 from the manipulation control 1 to the main screen 30, so that application icon 2 becomes a main-screen object. In this way, interactive operations on objects or content from different screens can be performed on the main screen.
  • In the second implementation, where the first object is an object in the target area, the first action may include: the terminal device moves the first object from the target area into the target manipulation control.
  • If the user's input drags a main-screen object from the target area of the main screen into the target manipulation control, the terminal device can respond accordingly, which is equivalent to moving the main-screen object from the main screen to the secondary screen, so that the main-screen object becomes a secondary-screen object. In this way, interactive operations on objects or content from different screens can be completed on the main screen, improving the operating convenience of the multi-screen terminal device.
  • For example, if the first object is application icon 1 in the main screen 30 and the first input is the user dragging application icon 1 from the main screen 30 into the manipulation control 1, the terminal device can respond to the input by moving application icon 1 from the main screen 30 into the manipulation control 1, which is equivalent to moving application icon 1 from the main screen 30 to the secondary screen 31, so that application icon 1 becomes a secondary-screen object.
  • In this way, interactive operations on objects or content from different screens can be performed on the main screen.
  • The above first actions are all exemplary enumerations; that is, the embodiments of the present disclosure include but are not limited to the actions enumerated above. The first action may also include any other possible action (see the sketch below), determined according to actual usage requirements, which the embodiments of the present disclosure do not limit.
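  • The first actions enumerated above can be summarized in code. The following sketch reuses the hypothetical Screen/ScreenObject/ManipulationControl types from the earlier sketch; the action names are assumptions, and only the shared-model bookkeeping is shown:

```kotlin
// Illustrative dispatch over the first actions (a), (b), (c) and the second
// implementation; not the patent's implementation.
sealed class FirstAction {
    data class OpenInterface(val obj: ScreenObject) : FirstAction()                       // (a)
    data class MoveWithinControl(val obj: ScreenObject, val toIndex: Int) : FirstAction() // (b)
    data class MoveToTargetArea(val obj: ScreenObject) : FirstAction()                    // (c)
    data class MoveIntoControl(val obj: ScreenObject) : FirstAction()  // second implementation
}

fun perform(action: FirstAction, control: ManipulationControl, firstScreen: Screen) {
    when (action) {
        is FirstAction.OpenInterface ->
            println("show interface of ${action.obj.label} in the control's area")
        is FirstAction.MoveWithinControl -> {
            // Moving the object inside the control also moves it on the second
            // screen, because the control mirrors the second screen's list.
            control.mirrored.objects.remove(action.obj)
            control.mirrored.objects.add(action.toIndex, action.obj)
        }
        is FirstAction.MoveToTargetArea -> {
            control.mirrored.objects.remove(action.obj) // leaves the second screen
            firstScreen.objects.add(action.obj)         // becomes a first-screen object
        }
        is FirstAction.MoveIntoControl -> {
            firstScreen.objects.remove(action.obj)      // leaves the first screen
            control.mirrored.objects.add(action.obj)    // becomes a second-screen object
        }
    }
}
```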
  • Items (a) and (b) in the above first implementation are specific implementations of operations inside the manipulation control. The above examples illustrate these internal operations by triggering display of the interface corresponding to the first object and by moving the first object, but the embodiments of the present disclosure include but are not limited to these implementations. For example, the user can long-press an icon in the manipulation control and drag it to any area within the manipulation control to trigger a move operation, or can move icon A onto icon B to trigger the creation of a folder containing icon A and icon B; such operations are like operating the secondary screen directly on the main screen. This can be determined according to actual usage requirements, and the embodiments of the present disclosure do not limit it.
  • Item (c) in the above first implementation and the second implementation are specific implementations of the interactive operation of moving a secondary-screen object or a main-screen object between the main screen and the secondary screen, that is, moving objects from the secondary screen to the main screen or from the main screen to the secondary screen. In this way, the user can quickly operate objects or content in different secondary screens on the main screen, achieving fast switching and content adjustment between the main screen and the secondary screens, which is convenient to operate and greatly improves the user experience.
  • The object control method provided by the embodiments of the present disclosure can receive a user's first input on a target manipulation control and a first object in a first screen (an object in the target manipulation control is an object in the second screen, and the second screen is a screen other than the first screen among the at least two screens) and, in response to the first input, perform on the first screen a first action corresponding to the first input on the first object.
  • The first object is an object in the target manipulation control or an object in a target area, and the target area is the area of the first screen other than the area where the target manipulation control is located.
  • Since the manipulation control can be used to trigger display of the second screen's interface on the first screen, objects in the second screen can be controlled directly on the first screen.
  • This can improve the convenience with which a multi-screen terminal device controls objects on different screens.
  • Optionally, the object control method provided in the embodiments of the present disclosure may further include the following S202 and S203.
  • S202: The terminal device receives a second input of the user on the target manipulation control.
  • In the embodiments of the present disclosure, the user can input on the target manipulation control in the main screen to trigger display, on the main screen, of the interface of the secondary screen corresponding to the target manipulation control (that is, the secondary-screen projection window); the user can then operate on that display interface of the secondary screen to trigger the terminal device to perform a corresponding action, thereby operating the secondary screen on the main screen.
  • The user's second input may be a click input (for example, a single-click or double-click input), a long-press input, or any other possible form of input, determined according to actual usage requirements; the embodiments of the present disclosure do not limit it.
  • S203: In response to the second input, the terminal device updates the display mode of the target manipulation control from a first display mode to a second display mode, where at least one second object is displayed in the target manipulation control in the second display mode, and the at least one second object is an object in the second screen.
  • In the embodiments of the present disclosure, the display mode of the manipulation control may include a first display mode and a second display mode.
  • The display mode of the manipulation control can change after the manipulation control is triggered. Specifically, the display mode can be updated from the first display mode (that is, the initial state) to the second display mode to display the objects in the manipulation control (the above-mentioned second objects), and from the second display mode back to the first display mode to cancel the display of the objects in the manipulation control (that is, to return to the initial state).
  • The target manipulation control in the first display mode may be a folder identifier (for example, a folder icon), and the target manipulation control in the second display mode may be a display interface. This display interface may be the projection or mapping, on the main screen, of the interface of the secondary screen corresponding to the target manipulation control, and can be understood as the folder expansion page of that folder identifier.
  • Accordingly, the target manipulation control in the first display mode may be referred to as the secondary-screen projection folder (or projection folder), and the target manipulation control in the second display mode may be referred to as the expanded page of the secondary-screen projection folder (or the projection folder's expanded page).
  • In other words, after the manipulation control is triggered, its display mode can be updated from the first display mode to the second display mode to display (enter) the projection folder's expanded page, and from the second display mode to the first display mode to cancel the display of (exit) the projection folder's expanded page. A sketch of this toggle follows.
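  • A minimal sketch of the two display modes and the S202/S203 toggle, under the same hypothetical naming as the earlier sketches:

```kotlin
// Illustrative only: the control's first display mode (folder identifier)
// and second display mode (expanded projection page).
enum class DisplayMode { FOLDER_ICON, EXPANDED_PAGE }

class ControlDisplayState(var mode: DisplayMode = DisplayMode.FOLDER_ICON) {
    // S203: a second input toggles between the initial state and the
    // expanded page that shows the second-screen objects.
    fun onSecondInput() {
        mode = when (mode) {
            DisplayMode.FOLDER_ICON -> DisplayMode.EXPANDED_PAGE
            DisplayMode.EXPANDED_PAGE -> DisplayMode.FOLDER_ICON
        }
    }
}
```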
  • The above-mentioned display interface may be displayed floating on the first screen, or may be superimposed on the display interface of the first screen, determined according to actual usage requirements; the embodiments of the present disclosure do not limit it.
  • The shape and size of the above-mentioned display interface can likewise be determined according to actual usage requirements, which the embodiments of the present disclosure do not limit.
  • Exemplarily, the initial state (i.e., the first display mode) of the manipulation control 1 in the main screen 30 is presented as a projection folder. If the user inputs on the manipulation control 1, then, as shown in (b) in FIG. 10, the terminal device can respond to the input by updating the display mode of the manipulation control 1 from the projection-folder mode to the projection folder's expanded page 41, where the objects on the expanded page 41 are the objects in the secondary screen corresponding to the manipulation control 1.
  • In the embodiments of the present disclosure, the display mode of the target manipulation control can be changed in response to the user's input on the target manipulation control, so that the display interface of the secondary screen corresponding to the target manipulation control is displayed on the main screen.
  • In this way, the objects or content in the secondary screen can be operated directly on the main screen, and interactive operations on objects or content from different screens can be performed on the main screen, improving the operating convenience of the multi-screen terminal device.
  • It should be noted that the secondary-screen projection folder belongs to the folder type of the main screen, so the objects in the secondary-screen projection folder (i.e., secondary-screen objects) can be dragged directly to the main screen according to the folder operation strategy, so that a secondary-screen object becomes a main-screen object.
  • The expanded page of the secondary-screen projection folder is equivalent to the expanded page of an ordinary folder, and the boundary of the expanded page is equivalent to the boundary of the secondary screen. Therefore, dragging an object in the secondary-screen projection folder across the boundary of the expanded page to the main screen moves the object from the secondary-screen desktop to the main-screen desktop.
  • Conversely, the terminal device can display the expanded page of the secondary-screen projection folder, and the user can then drag application icon 1 across the boundary of the expanded page into any area of the secondary-screen projection folder (including a folder within it, etc.), so that the main-screen icon is moved from the main screen into the secondary-screen projection folder and thus to the secondary screen, and the main-screen object becomes a secondary-screen object.
  • Optionally, the user can also drag application icon 1 onto a folder icon in the secondary-screen projection folder.
  • The foregoing description takes the target manipulation control in the first display mode as a folder identifier as an example. It can be understood that the embodiments of the present disclosure include but are not limited to this; the target manipulation control in the first display mode can also be presented in any other possible display form, determined according to actual usage requirements, which the embodiments of the present disclosure do not limit.
  • In the embodiments of the present disclosure, the user can input on the secondary-screen projection folder to trigger display of its expanded page, that is, to display the display page of the secondary screen, so that the user can operate the secondary screen like an ordinary folder and realize content interaction between the secondary-screen desktop and the main-screen desktop.
  • Optionally, the terminal device can quickly display (i.e., enter) or cancel the display of (i.e., exit) the expanded page of the secondary-screen projection folder in response to a user's input on the first screen (hereinafter referred to as a shortcut gesture input).
  • For example, the first input may be a multi-finger (for example, two-finger) zoom-in gesture input, used to trigger entering the expanded page of the secondary-screen projection folder from any display interface on the main screen.
  • Correspondingly, the first input may be a multi-finger (for example, two-finger) zoom-out gesture input, used to trigger exiting the expanded page of the secondary-screen projection folder and quickly returning to the display interface of the main screen.
  • In this way, the user can trigger quick entry into or exit from the expanded page of the secondary-screen projection folder without having to search for its icon on the main screen or input on it, improving the convenience of the user's operation.
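  • On Android, such shortcut gestures could be recognized with the platform's ScaleGestureDetector; the mapping below (zoom-in enters the expanded page, zoom-out exits it) and the callback names are assumptions of this sketch, not the patent's implementation:

```kotlin
import android.content.Context
import android.view.ScaleGestureDetector

// Hedged sketch: a host view would forward touch events to
// detector.onTouchEvent(event) for this handler to receive pinch gestures.
class ProjectionGestureHandler(
    context: Context,
    private val enterExpandedPage: () -> Unit, // hypothetical callback
    private val exitExpandedPage: () -> Unit   // hypothetical callback
) {
    private var accumulated = 1f

    val detector = ScaleGestureDetector(context,
        object : ScaleGestureDetector.SimpleOnScaleGestureListener() {
            override fun onScaleBegin(d: ScaleGestureDetector): Boolean {
                accumulated = 1f
                return true
            }
            override fun onScale(d: ScaleGestureDetector): Boolean {
                accumulated *= d.scaleFactor // > 1 means fingers spreading apart
                return true
            }
            override fun onScaleEnd(d: ScaleGestureDetector) {
                if (accumulated > 1f) enterExpandedPage() else exitExpandedPage()
            }
        })
}
```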
  • Optionally, the same navigation bar as in the display interface of the secondary screen may be displayed in the sub-interface, and the navigation bar may include function keys such as a return key (i.e., Back key) and/or a home key.
  • The terminal device can respond to the user's input on the function keys in the sub-interface either on the display interface of the secondary screen (that is, in the sub-interface) or on the display interface of the main screen; this is determined by the software strategy, and the embodiments of the present disclosure set no limitation.
  • For example, the terminal device may respond to the user's input on the home key in the sub-interface by updating the sub-interface to display the main interface of the secondary screen.
  • Optionally, the user can use a multi-finger zoom-out gesture input on the first screen to trigger the terminal device to cancel displaying the main interface of the secondary screen and display the main interface of the main screen.
  • Alternatively, the terminal device may, in response to the user's input on the home key in the sub-interface, cancel displaying the expanded page of the secondary-screen projection folder and display the main interface of the main screen.
  • Optionally, the object control method provided in the embodiments of the present disclosure may further include the following S204.
  • S204: The terminal device may, in response to the user's first input on the target manipulation control and the first object in the first screen, perform on the first screen the first action corresponding to the first input on the first object, and display the result of performing the first action on the first object on the second screen corresponding to the target manipulation control.
  • For example, if the first input is the user dragging application icon 2 onto folder icon 2, the terminal device can, in response to the input, move application icon 2 into folder icon 2 (that is, the above-mentioned first action) and display the result of performing the first action on application icon 2 on the secondary screen 31 corresponding to the manipulation control 1; that is, the folder expansion page 34 of folder icon 2 includes application icon 2, application icon 3, and application icon 4.
  • It can be understood that the embodiments of the present disclosure are not limited to the display result on the secondary screen 31 described above; for example, the display result on the secondary screen 31 may be folder icon 2 itself instead of its folder expansion page 34. This can be determined according to actual usage requirements, and the embodiments of the present disclosure do not limit it.
  • In the embodiments of the present disclosure, the display interface of the secondary screen can be displayed on the main screen, so that the objects in the secondary screen can be operated directly on the main screen, and objects on the main screen or the secondary screen can be operated to realize the interaction of secondary-screen objects or main-screen objects between the main-screen interface and the secondary-screen interface.
  • It should be noted that the embodiments of the present disclosure do not limit the execution order of S201 and S204; that is, S201 may be executed first and then S204, S204 may be executed first and then S201, or S201 and S204 may be executed simultaneously. It can be understood that the above-mentioned FIG. 11 is an example in which S201 is executed first and S204 afterwards.
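  • Because the control mirrors the second screen's object list in the sketches above, applying the first action to the shared model (S201) and refreshing the second screen (S204) can happen in either order, matching the note above. A hypothetical sequence, reusing the earlier sketch types:

```kotlin
// Illustrative only: one mutation of the shared model is reflected on both screens.
fun onFirstInput(action: FirstAction, control: ManipulationControl, firstScreen: Screen) {
    perform(action, control, firstScreen) // S201: perform the first action
    render(firstScreen)                   // redraw the first screen (control included)
    render(control.mirrored)              // S204: the second screen shows the same result
}

fun render(screen: Screen) {
    println("${screen.name}: ${screen.objects.map { it.label }}")
}
```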
  • The foregoing takes icon projection as an example. It can be understood that the embodiments of the present disclosure include but are not limited to this object projection method; object projection may also include theme projection, settings-interface projection, video-interface projection, browser-page projection, and so on, determined according to actual usage requirements, which the embodiments of the present disclosure do not limit.
  • As shown in FIG. 12, an embodiment of the present disclosure provides a terminal device 700.
  • The terminal device 700 includes at least two screens.
  • The terminal device 700 may include a receiving module 701 and a control module 702.
  • The receiving module 701 is configured to receive a user's first input on a target manipulation control and a first object in a first screen, where an object in the target manipulation control is an object in a second screen, and the second screen is a screen other than the first screen among the at least two screens.
  • The control module 702 is configured to, in response to the first input received by the receiving module 701, perform on the first screen a first action corresponding to the first input on the first object.
  • The first object is an object in the target manipulation control or an object in a target area, and the target area is the area of the first screen other than the area where the target manipulation control is located.
  • Optionally, the object in any one of the above-mentioned at least one manipulation control is an object on the screen corresponding to that manipulation control.
  • Optionally, the above-mentioned first object may be an object in the target manipulation control, and the above-mentioned first action may include any one of the following: displaying an interface corresponding to the first object in the area corresponding to the target manipulation control; moving the first object from the target manipulation control to the target area; and moving the first object from a first position in the target manipulation control to a second position in the target manipulation control.
  • Optionally, the above-mentioned first object may be an object in the target area, and the above-mentioned first action may include: moving the first object from the target area into the target manipulation control.
  • Optionally, the control module 702 is further configured to, after the receiving module 701 receives the user's first input on the target manipulation control and the first object in the first screen, respond to the first input by displaying, on the second screen, the result of performing the first action on the first object.
  • Optionally, the receiving module 701 is further configured to receive a second input of the user on the target manipulation control in the first screen before receiving the user's first input on the target manipulation control and the first object.
  • The control module 702 is further configured to, in response to the second input received by the receiving module 701, update the display mode of the target manipulation control from a first display mode to a second display mode, where at least one second object is displayed in the target manipulation control in the second display mode, and the at least one second object is an object in the second screen.
  • Optionally, the target manipulation control in the first display mode is a folder identifier, and the target manipulation control in the second display mode is a display interface.
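  • The receiving module / control module split described above might be sketched as follows, reusing the hypothetical types from the earlier sketches; the interface and method names are assumptions:

```kotlin
// Illustrative decomposition into a receiving module and a control module.
interface ReceivingModule {
    fun onFirstInput(handler: (FirstAction) -> Unit)
    fun onSecondInput(handler: () -> Unit)
}

class ControlModule(
    private val control: ManipulationControl,
    private val firstScreen: Screen,
    private val displayState: ControlDisplayState
) {
    fun bind(receiver: ReceivingModule) {
        // First input: perform the corresponding first action (S201).
        receiver.onFirstInput { action -> perform(action, control, firstScreen) }
        // Second input: toggle the control's display mode (S202/S203).
        receiver.onSecondInput { displayState.onSecondInput() }
    }
}
```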
  • The terminal device provided in the embodiments of the present disclosure can implement each process implemented by the terminal device in the foregoing method embodiments; to avoid repetition, details are not described here again.
  • With the terminal device provided by the embodiments of the present disclosure, the terminal device can receive a user's first input on a target manipulation control and a first object in a first screen (an object in the target manipulation control is an object in the second screen, and the second screen is a screen other than the first screen among the at least two screens) and, in response to the first input, perform on the first screen a first action corresponding to the first input on the first object, where the first object is an object in the target manipulation control or an object in a target area, and the target area is the area of the first screen other than the area where the target manipulation control is located.
  • Since the manipulation control can be used to trigger display of the second screen's interface on the first screen, objects in the second screen can be controlled directly on the first screen, which can improve the convenience with which a multi-screen terminal device controls objects on different screens.
  • FIG. 13 is a schematic diagram of the hardware structure of a terminal device that implements various embodiments of the present disclosure.
  • As shown in FIG. 13, the terminal device 800 includes but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power supply 811.
  • Those skilled in the art can understand that the terminal device structure shown in FIG. 13 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently.
  • In the embodiments of the present disclosure, terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, and the like.
  • The user input unit 807 is configured to receive a user's first input on a target manipulation control and a first object in a first screen, where an object in the target manipulation control is an object in a second screen, and the second screen is a screen other than the first screen among the at least two screens.
  • The processor 810 is configured to, in response to the first input received by the user input unit 807, perform on the first screen a first action corresponding to the first input on the first object.
  • The first object is an object in the target manipulation control or an object in a target area, and the target area is the area of the first screen other than the area where the target manipulation control is located.
  • The embodiments of the present disclosure provide a terminal device that can receive a user's first input on a target manipulation control and a first object in a first screen (an object in the target manipulation control is an object in the second screen, and the second screen is a screen other than the first screen among the at least two screens) and, in response to the first input, perform on the first screen a first action corresponding to the first input on the first object, where the first object is an object in the target manipulation control or an object in a target area, and the target area is the area of the first screen other than the area where the target manipulation control is located.
  • Since the manipulation control can be used to trigger display of the second screen's interface on the first screen, objects in the second screen can be controlled directly on the first screen, which can improve the convenience with which a multi-screen terminal device controls objects on different screens.
  • It should be understood that, in the embodiments of the present disclosure, the radio frequency unit 801 can be used for receiving and sending signals during information transmission and reception or during a call. Specifically, downlink data from a base station is received and then processed by the processor 810, and uplink data is sent to the base station.
  • Generally, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like.
  • In addition, the radio frequency unit 801 can also communicate with the network and other devices through a wireless communication system.
  • The terminal device 800 provides users with wireless broadband Internet access through the network module 802, for example helping users send and receive e-mails, browse web pages, and access streaming media.
  • The audio output unit 803 can convert audio data received by the radio frequency unit 801 or the network module 802, or stored in the memory 809, into audio signals and output them as sound. Moreover, the audio output unit 803 may also provide audio output related to a specific function performed by the terminal device 800 (for example, a call signal reception sound or a message reception sound).
  • The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
  • The input unit 804 is used to receive audio or video signals.
  • The input unit 804 may include a graphics processing unit (GPU) 8041 and a microphone 8042.
  • The graphics processor 8041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 806, stored in the memory 809 (or another storage medium), or sent via the radio frequency unit 801 or the network module 802.
  • The microphone 8042 can receive sound and process it into audio data. In telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 801 for output.
  • The terminal device 800 also includes at least one sensor 805, such as a light sensor, a motion sensor, and other sensors.
  • Specifically, the light sensor includes an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 8061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 8061 and/or the backlight when the terminal device 800 is moved to the ear.
  • As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes) and, when stationary, the magnitude and direction of gravity; it can be used to identify the terminal device's posture (such as switching between landscape and portrait, related games, and magnetometer attitude calibration), for vibration-recognition-related functions (such as a pedometer or tapping), and the like. The sensor 805 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and so on, which will not be repeated here.
  • The display unit 806 is used to display information input by the user or information provided to the user.
  • The display unit 806 may include a display panel 8061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
  • The user input unit 807 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device.
  • Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072.
  • The touch panel 8071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 8071 using a finger, a stylus, or any other suitable object or accessory).
  • The touch panel 8071 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the position of the user's touch and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 810, then receives and executes the commands sent by the processor 810.
  • In addition, the touch panel 8071 can be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave.
  • In addition to the touch panel 8071, the user input unit 807 may also include other input devices 8072. Specifically, the other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which will not be repeated here.
  • the touch panel 8071 can cover the display panel 8061.
  • when the touch panel 8071 detects a touch operation on or near it, it transmits the operation to the processor 810 to determine the type of the touch event, and the processor 810 then provides corresponding visual output on the display panel 8061 according to the type of the touch event.
  • although the touch panel 8071 and the display panel 8061 can be used as two independent components to realize the input and output functions of the terminal device, in some embodiments the touch panel 8071 and the display panel 8061 can be integrated to realize the input and output functions of the terminal device; this is not specifically limited here.
  • the interface unit 808 is an interface for connecting an external device with the terminal device 800.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, etc.
  • the interface unit 808 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the terminal device 800, or can be used to transfer data between the terminal device 800 and an external device.
  • the memory 809 can be used to store software programs and various data.
  • the memory 809 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), etc.; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.).
  • the memory 809 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 810 is the control center of the terminal device; it connects the various parts of the entire terminal device using various interfaces and lines and, by running or executing the software programs and/or modules stored in the memory 809 and calling the data stored in the memory 809, performs the various functions of the terminal device and processes data, thereby monitoring the terminal device as a whole.
  • the processor 810 may include one or more processing units; optionally, the processor 810 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, etc., and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 810.
  • the terminal device 800 may also include a power source 811 (such as a battery) for supplying power to various components.
  • the power source 811 may be logically connected to the processor 810 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
  • the terminal device 800 includes some functional modules not shown, which will not be repeated here.
  • an embodiment of the present disclosure further provides a terminal device, including a processor 810 as shown in FIG. 13, a memory 809, and a computer program stored in the memory 809 and capable of running on the processor 810; when the computer program is executed by the processor 810, each process of the foregoing object control method embodiment is implemented, and the same technical effect can be achieved. To avoid repetition, details are not described here again.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, each process of the foregoing object control method embodiment is realized, and the same technical effect can be achieved. To avoid repetition, details are not described here again.
  • the computer-readable storage medium may include read-only memory (ROM), random access memory (RAM), magnetic disk or optical disk, etc.
  • the technical solution of the present disclosure, in essence or the part that contributes to the related technology, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to make a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) execute the methods disclosed in the various embodiments of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure provide an object control method and a terminal device. The method includes: receiving a first input performed by a user on a target control widget and a first object in a first screen, where objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens; and in response to the first input, performing, on the first screen, a first action corresponding to the first input on the first object, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located. The method can be applied to object control scenarios of multi-screen terminal devices.

Description

Object control method and terminal device
This application claims priority to Chinese Patent Application No. 201910052676.8, filed with the China National Intellectual Property Administration on January 21, 2019 and entitled "Object control method and terminal device", the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present disclosure relate to the field of communications technologies, and in particular, to an object control method and a terminal device.
Background
As terminal devices are used in an ever wider range of applications, users increasingly demand convenience when using multi-screen terminal devices.
Taking a dual-screen terminal device as an example, such a device can display a first interface on its main screen and a second interface on its secondary screen. The user can perform an input on an icon in the first interface displayed on the main screen to trigger the terminal device to perform an action corresponding to the input (for example, changing the position of the icon, or moving the icon into a folder). The user can likewise perform an input on an icon in the second interface displayed on the secondary screen to trigger the terminal device to perform an action corresponding to that input (for example, changing the position of the icon, or moving the icon into a folder).
However, since the first interface and the second interface are located on different screens of the dual-screen terminal device, a user who needs to perform touch operations on both interfaces often has to switch between the main screen and the secondary screen repeatedly. As a result, multi-screen terminal devices are inconvenient when controlling objects on different screens.
Summary
Embodiments of the present disclosure provide an object control method and a terminal device, to solve the problem that existing multi-screen terminal devices are inconvenient when controlling objects on different screens.
To solve the foregoing technical problem, the present disclosure is implemented as follows:
According to a first aspect, embodiments of the present disclosure provide an object control method applied to a terminal device including at least two screens. The method includes: receiving a first input performed by a user on a target control widget and a first object in a first screen, where objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens; and in response to the first input, performing, on the first screen, a first action corresponding to the first input on the first object, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located.
According to a second aspect, embodiments of the present disclosure provide a terminal device including at least two screens. The terminal device may include a receiving module and a control module. The receiving module is configured to receive a first input performed by a user on a target control widget and a first object in a first screen, where objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens. The control module is configured to, in response to the first input received by the receiving module, perform, on the first screen, a first action corresponding to the first input on the first object, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located.
According to a third aspect, embodiments of the present disclosure provide a terminal device including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the object control method in the first aspect.
According to a fourth aspect, embodiments of the present disclosure provide a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the object control method in the first aspect.
In the embodiments of the present disclosure, a first input performed by a user on a target control widget and a first object in a first screen can be received (objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens); and in response to the first input, a first action corresponding to the first input is performed on the first object on the first screen, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located. With this solution, since the first screen includes a control widget corresponding to the second screen, and the control widget can be used to trigger display of the second screen's display interface on the first screen, objects in the second screen can be controlled or operated directly on the first screen, and interactive operations on objects or content between the first screen and the second screen can also be carried out on the first screen. In this way, the embodiments of the present disclosure improve the convenience of controlling objects on different screens of a multi-screen terminal device.
Brief Description of the Drawings
FIG. 1 is a schematic architectural diagram of a possible Android operating system according to an embodiment of the present disclosure;
FIG. 2 is a first schematic flowchart of an object control method according to an embodiment of the present disclosure;
FIG. 3 is a first schematic diagram of an interface to which the object control method is applied according to an embodiment of the present disclosure;
FIG. 4 is a second schematic diagram of an interface to which the object control method is applied according to an embodiment of the present disclosure;
FIG. 5 is a third schematic diagram of an interface to which the object control method is applied according to an embodiment of the present disclosure;
FIG. 6 is a fourth schematic diagram of an interface to which the object control method is applied according to an embodiment of the present disclosure;
FIG. 7 is a fifth schematic diagram of an interface to which the object control method is applied according to an embodiment of the present disclosure;
FIG. 8 is a sixth schematic diagram of an interface to which the object control method is applied according to an embodiment of the present disclosure;
FIG. 9 is a second schematic diagram of the object control method according to an embodiment of the present disclosure;
FIG. 10 is a seventh schematic diagram of an interface to which the object control method is applied according to an embodiment of the present disclosure;
FIG. 11 is a third schematic diagram of the object control method according to an embodiment of the present disclosure;
FIG. 12 is a schematic structural diagram of a terminal device according to an embodiment of the present disclosure;
FIG. 13 is a schematic hardware diagram of a terminal device according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some rather than all of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative effort shall fall within the protection scope of the present disclosure.
The term "and/or" herein describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may indicate three cases: only A exists, both A and B exist, and only B exists. The symbol "/" herein indicates an "or" relationship between associated objects; for example, A/B means A or B.
The terms "first" and "second" in the specification and claims of the present disclosure are used to distinguish between different objects rather than to describe a particular order of the objects. For example, the first screen and the second screen are used to distinguish between different screens rather than to describe a particular order of the screens.
In the embodiments of the present disclosure, words such as "exemplary" or "for example" are used to indicate an example, an illustration, or a description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present disclosure should not be construed as being preferred or more advantageous than other embodiments or designs. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner.
In the descriptions of the embodiments of the present disclosure, unless otherwise specified, "a plurality of" means two or more; for example, a plurality of processing units means two or more processing units.
Embodiments of the present disclosure provide an object control method and a terminal device, which can receive a first input performed by a user on a target control widget and a first object in a first screen (objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens); and, in response to the first input, perform, on the first screen, a first action corresponding to the first input on the first object, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located. With this solution, since the first screen includes a control widget corresponding to the second screen, and the control widget can be used to trigger display of the second screen's display interface on the first screen, objects in the second screen can be controlled or operated directly on the first screen, and interactive operations on objects or content between the first screen and the second screen can also be carried out on the first screen. In this way, the embodiments of the present disclosure improve the convenience of controlling objects on different screens of a multi-screen terminal device.
The terminal device in the embodiments of the present disclosure may be a terminal device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present disclosure.
Taking the Android operating system as an example, the following describes the software environment to which the object control method provided by the embodiments of the present disclosure is applied.
FIG. 1 is a schematic architectural diagram of a possible Android operating system according to an embodiment of the present disclosure. In FIG. 1, the architecture of the Android operating system includes four layers: an application layer, an application framework layer, a system runtime library layer, and a kernel layer (which may specifically be a Linux kernel layer).
The application layer includes the applications in the Android operating system (including system applications and third-party applications).
The application framework layer is the framework of applications; developers can develop applications based on the application framework layer while following its development principles.
The system runtime library layer includes libraries (also called system libraries) and the Android operating system runtime environment. The libraries mainly provide the Android operating system with the various resources it needs. The Android operating system runtime environment provides the software environment for the Android operating system.
The kernel layer is the operating system layer of the Android operating system and is the lowest level of the Android operating system's software hierarchy. Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the Android operating system.
Taking the Android operating system as an example, in the embodiments of the present disclosure, developers can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program that implements the object control method provided by the embodiments of the present disclosure, so that the object control method can run on the Android operating system shown in FIG. 1. That is, the processor or the terminal device can implement the object control method provided by the embodiments of the present disclosure by running the software program in the Android operating system.
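Purely as an illustrative aid, and not as part of the disclosed embodiments, the following Java sketch shows one plausible way such a software program could hook into the Android application layer. The layout resource, view ID, and handler methods are hypothetical names introduced here for illustration only:

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.DragEvent;
import android.view.View;

// Hypothetical Activity wiring the object control method into the Android
// application layer; R.layout.main_screen, R.id.projection_folder and the
// handler methods are illustrative names, not APIs defined by this disclosure.
public class ObjectControlActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main_screen);

        View projectionFolder = findViewById(R.id.projection_folder);
        // Second input (S202 below): a tap on the widget expands the
        // projection folder.
        projectionFolder.setOnClickListener(v -> expandProjectionFolder());
        // First input (S200/S201 below): a drop onto the widget moves a
        // main-screen object into the secondary screen.
        projectionFolder.setOnDragListener((v, event) ->
                handleDragOntoWidget(event));
    }

    private void expandProjectionFolder() { /* see the later sketches */ }

    private boolean handleDragOntoWidget(DragEvent event) {
        return event.getAction() == DragEvent.ACTION_DROP; // placeholder
    }
}
```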
The terminal device in the embodiments of the present disclosure may be a mobile terminal or a non-mobile terminal. For example, the mobile terminal may be a mobile phone, a tablet computer, a laptop computer, a palmtop computer, an in-vehicle terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the non-mobile terminal may be a personal computer (PC), a television (TV), a teller machine, or a self-service machine; this is not specifically limited in the embodiments of the present disclosure.
The object control method provided by the embodiments of the present disclosure may be executed by the foregoing terminal device, or by a functional module and/or functional entity in the terminal device capable of implementing the method, which may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure. The following uses the terminal device as an example to describe the object control method provided by the embodiments of the present disclosure.
As shown in FIG. 2, an embodiment of the present disclosure provides an object control method, which can be applied to a multi-screen terminal device and may include the following S200 and S201.
S200: The terminal device receives a first input performed by a user on a target control widget and a first object in a first screen, where objects in the target control widget are objects in a second screen.
In this embodiment of the present disclosure, the multi-screen terminal device may include a first screen and other screens besides the first screen, and each screen may display its own interface (for example, a desktop or a display interface). In addition to at least one object (for example, application icons and folder icons), the display interface of the first screen may also display a target control widget. The target control widget is a control widget corresponding to the second screen, and the second screen is a screen other than the first screen among the at least two screens; in other words, the second screen is the screen, among the foregoing screens, that corresponds to the target control widget. The target control widget can be understood as a presentation that projects or maps the interface of the second screen, or the objects in the second screen, onto the first screen, where the objects in the target control widget are projections or mappings, on the first screen, of the objects in the second screen.
In this embodiment of the present disclosure, the target control widget can be used to trigger display of the second screen's display interface on the first screen. In this way, objects in the second screen can be controlled or operated directly on the first screen, and interactive operations on objects or content between the first screen and the second screen can also be carried out on the first screen, as described in detail below.
It should be noted that the display interface of the first screen includes, but is not limited to, one target control widget; it may certainly include a plurality of control widgets, where each of the plurality of control widgets corresponds to a different one of the other screens, and each of these control widgets is equivalent to the foregoing target control widget.
In this embodiment of the present disclosure, if the user wants to operate, on the first screen, an object in another screen, or to perform interactive operations, on the first screen, on objects (or content) in different screens of the multi-screen device, the user can perform an input (that is, the first input) on the target control widget and the first object in the first screen to trigger the terminal device to perform, on the first screen, an action corresponding to the input on the first object, as described in detail below.
Optionally, in this embodiment of the present disclosure, the first screen may be the main screen of the multi-screen terminal device and the other screens may be secondary screens; of course, the first screen may also be a secondary screen, and one of the other screens may be the main screen. This may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure.
For ease of description and understanding, the following uses a dual-screen terminal device as an example to describe the object control method provided by the embodiments of the present disclosure. It is assumed that the dual-screen terminal device includes a first screen and a second screen, where the first screen is the main screen and the second screen is the secondary screen.
FIG. 3 exemplarily shows a schematic interface diagram to which the object control method provided by an embodiment of the present disclosure is applied. As shown in FIG. 3, the dual-screen terminal device may include a main screen 30 and a secondary screen 31. In addition to at least one object (for example, application icon 1 and folder icon 1, hereinafter referred to as main-screen objects), the main screen 30 also displays a control widget 1 corresponding to the secondary screen 31. The secondary screen 31 includes application icon 2 and folder icon 2 (hereinafter referred to as secondary-screen objects), and the control widget 1 likewise includes application icon 2 and folder icon 2; that is, the objects in the control widget 1 are the same as the objects in the secondary screen 31. It can be understood that the objects in the control widget 1 are projections or mappings, on the main screen 30, of the objects in the secondary screen 31.
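To make the projection relationship of FIG. 3 concrete, here is a minimal, hypothetical Java model: the control widget does not copy the secondary screen's objects but holds references to the same records, so an edit made through the widget is an edit to the secondary screen itself. All class names here are illustrative assumptions, not terms of the disclosure:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical model: one desktop object (application icon, folder icon, ...).
class ScreenObject {
    enum Kind { APP, FOLDER }
    final Kind kind;
    final String name;   // e.g. "application icon 2"
    int position;        // slot index on its screen
    ScreenObject(Kind kind, String name, int position) {
        this.kind = kind; this.name = name; this.position = position;
    }
}

// Hypothetical model of one screen's desktop.
class Screen {
    final List<ScreenObject> objects = new ArrayList<>();
}

// The control widget holds the same object records as the secondary screen,
// so operating on the widget operates on the secondary screen itself.
class ControlWidget {
    final Screen secondaryScreen;
    ControlWidget(Screen secondaryScreen) { this.secondaryScreen = secondaryScreen; }
    List<ScreenObject> objects() { return secondaryScreen.objects; }
}
```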
In this embodiment of the present disclosure, the target control widget is one of the at least one control widget in the first screen. As shown in FIG. 3, the target control widget is the control widget 1 in the main screen 30.
Optionally, in this embodiment of the present disclosure, the first object may be an icon (for example, an application icon or a folder icon), a video playback window, a browser page, or any other possible object, which may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure.
Optionally, in this embodiment of the present disclosure, the first object may be an object in the target control widget. As shown in FIG. 3, the first object may be one of application icon 2 and folder icon 2.
Optionally, in this embodiment of the present disclosure, the first object may be one of the at least one object in the first screen. As shown in FIG. 3, the first object may be one of application icon 1 and folder icon 1 (that is, a main-screen object) in the main screen 30.
It can be understood that the objects and control widgets in the first screen described above are merely exemplary; that is, the embodiments of the present disclosure include, but are not limited to, the objects and control widgets listed above. In actual implementation, the first screen may further include objects and control widgets of any other possible type or quantity, which may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure.
Optionally, in this embodiment of the present disclosure, the user's first input may be a click input (for example, a single-click input or a double-click input), a drag input, or an input in any other possible form, which may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure.
S201: In response to the first input, the terminal device performs, on the first screen, a first action corresponding to the first input on the first object, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located.
In this embodiment of the present disclosure, the first action may be specifically determined according to the user's first input on the target control widget and the first object: a different first object implies a different first input, and hence a different first action. The following describes in detail the object control method provided by the embodiments of the present disclosure for the case where the first object is an object in the target control widget (the first implementation below) and the case where the first object is an object in the target region (the second implementation below).
First implementation
Optionally, in the first implementation, assuming that the first object is an object in the target control widget (that is, a secondary-screen object), the first action may be any one of the following (a), (b), and (c):
(a) The terminal device displays, in the region corresponding to the target control widget, an interface corresponding to the first object.
In this embodiment of the present disclosure, assuming the first input is specifically the user's input on a secondary-screen object (that is, the first object) in the target control widget, the terminal device may, in response to the input, display the interface corresponding to the secondary-screen object in the region where the target control widget is located, which is equivalent to responding, on the main screen, to the user's input on the secondary-screen object. In this way, objects (or content) in the secondary screen can be operated directly on the main screen, thereby improving the operating convenience of the multi-screen terminal device.
For example, with reference to FIG. 3, as shown in FIG. 4, assuming the first object is application icon 2 in the control widget 1 and the first input is the user's single-click input on application icon 2, the terminal device may, in response to the input, run the application corresponding to application icon 2 and display the application's interface 32 in the region where the control widget 1 is located. In this way, an application icon in the secondary screen can be operated directly on the main screen.
As another example, with reference to FIG. 3, as shown in FIG. 5, assuming the first object is folder icon 2 in the control widget 1, the terminal device may, in response to the user's input on folder icon 2, display, in the region where the target control widget is located, the expanded folder page 33 corresponding to folder icon 2 (where the expanded folder page includes application icon 3 and application icon 4). In this way, a folder icon in the secondary screen can be operated directly on the main screen.
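A hedged sketch of action (a), reusing the hypothetical model above: a tap on an application icon inside the widget renders the application's interface into the widget's region, and a tap on a folder icon renders the expanded folder page there. The helper methods are illustrative stubs, not APIs from the disclosure:

```java
import android.view.View;
import android.view.ViewGroup;

// Hypothetical handler for action (a); launchApplication,
// buildExpandedFolderPage and showInWidgetRegion are illustrative stubs.
class WidgetClickHandler {
    void onWidgetObjectClicked(ScreenObject obj, ViewGroup widgetRegion) {
        switch (obj.kind) {
            case APP:
                // Run the application and show its interface (interface 32 in
                // FIG. 4) in the region where control widget 1 is located.
                showInWidgetRegion(widgetRegion, launchApplication(obj));
                break;
            case FOLDER:
                // Show the expanded folder page (page 33 in FIG. 5) there.
                showInWidgetRegion(widgetRegion, buildExpandedFolderPage(obj));
                break;
        }
    }

    View launchApplication(ScreenObject obj) { return null; }       // stub
    View buildExpandedFolderPage(ScreenObject obj) { return null; } // stub

    void showInWidgetRegion(ViewGroup region, View content) {
        region.removeAllViews();
        if (content != null) region.addView(content);
    }
}
```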
(b) The terminal device moves the first object from a first position in the target control widget to a second position in the target control widget.
In this embodiment of the present disclosure, assuming the first input is specifically the user dragging a secondary-screen object (that is, the first object) from a first position in the target control widget to a second position in the target control widget, the terminal device may, in response to the input, move the secondary-screen object accordingly, so that the position of the secondary-screen object in the target control widget changes, and hence its position in the secondary screen also changes; this is equivalent to moving the secondary-screen object within the secondary screen. In this way, objects or content in the secondary screen can be operated directly on the main screen, thereby improving the operating convenience of the multi-screen terminal device.
For example, with reference to FIG. 3 and FIG. 5, as shown in FIG. 6, assuming the first object is application icon 2 in the control widget 1 and the first input is the user dragging application icon 2 to the region where folder icon 2 is located (that is, the foregoing second position), the terminal device may, in response to the input, move application icon 2 into folder icon 2, so that folder icon 2 contains application icon 2, application icon 3, and application icon 4. In this way, objects or content in the secondary screen can be operated directly on the main screen.
(c) The terminal device moves the first object out of the target control widget into the target region.
In this embodiment of the present disclosure, assuming the first input is specifically the user dragging a secondary-screen object (that is, the first object) out of the target control widget into the target region of the main screen, the terminal device may, in response to the input, move the secondary-screen object from the target control widget to the target region of the main screen; this is equivalent to moving the secondary-screen object from the secondary screen to the main screen, so that the secondary-screen object becomes a main-screen object. In this way, interactive operations on objects or content across different screens can be performed on the main screen, thereby improving the operating convenience of the multi-screen terminal device.
For example, with reference to FIG. 3, as shown in FIG. 7, assuming the first object is application icon 2 in the control widget 1 and the first input is the user dragging application icon 2 to the target region in the main screen 30, the terminal device may, in response to the input, move application icon 2 from the control widget 1 to the main screen 30, so that application icon 2 becomes a main-screen object. In this way, interactive operations on objects or content across different screens can be performed on the main screen.
Second implementation
In the second implementation, assuming the first object is an object in the target region, the first action may include: the terminal device moving the first object from the target region into the target control widget.
In this embodiment of the present disclosure, assuming the first input is specifically the user dragging a main-screen object (that is, the first object) from the target region of the main screen into the target control widget, the terminal device may, in response to the input, move the main-screen object accordingly; this is equivalent to moving the main-screen object from the main screen to the secondary screen, so that the main-screen object becomes a secondary-screen object. In this way, interactive operations on objects or content across different screens can be completed on the main screen, thereby improving the operating convenience of the multi-screen terminal device.
For example, with reference to FIG. 3, as shown in FIG. 8, assuming the first object is application icon 1 in the main screen 30 and the first input is the user dragging application icon 1 from the main screen 30 into the control widget 1, the terminal device may, in response to the input, move application icon 1 from the main screen 30 into the control widget 1; this is equivalent to moving application icon 1 from the main screen 30 to the secondary screen 31, so that application icon 1 becomes a secondary-screen object. In this way, interactive operations on objects or content across different screens can be performed on the main screen.
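The three move actions described above — (b) a move inside the widget, (c) a move out of the widget into the target region, and the reverse move of the second implementation — all reduce to deciding which screen's object list an object leaves and which it joins. A minimal sketch under the assumptions of the earlier hypothetical model (a real implementation would likely layer this on a drag-and-drop framework such as Android's):

```java
// Hypothetical move logic shared by action (b), action (c) and the second
// implementation; mainScreen backs the target region, and widget.objects()
// backs the control widget (and hence the secondary screen).
class MoveHandler {
    final Screen mainScreen;
    final ControlWidget widget;

    MoveHandler(Screen mainScreen, ControlWidget widget) {
        this.mainScreen = mainScreen; this.widget = widget;
    }

    // Action (b): reposition an object inside the widget; because the widget
    // maps the secondary screen, the object also moves on the secondary screen.
    void moveWithinWidget(ScreenObject obj, int newPosition) {
        obj.position = newPosition;
    }

    // Action (c): drag out of the widget into the target region of the main
    // screen; the secondary-screen object becomes a main-screen object.
    void moveWidgetToMain(ScreenObject obj, int mainPosition) {
        widget.objects().remove(obj);
        obj.position = mainPosition;
        mainScreen.objects.add(obj);
    }

    // Second implementation: drag from the target region into the widget;
    // the main-screen object becomes a secondary-screen object.
    void moveMainToWidget(ScreenObject obj, int widgetPosition) {
        mainScreen.objects.remove(obj);
        obj.position = widgetPosition;
        widget.objects().add(obj);
    }
}
```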
It can be understood that the foregoing first actions are merely exemplary; that is, the embodiments of the present disclosure include, but are not limited to, the actions listed above. In actual implementation, the first action may further include any other possible action, which may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure.
In this embodiment of the present disclosure, (a) and (b) in the first implementation are specific implementations of operations internal to the control widget. It should be noted that the internal operations of the control widget are exemplarily described above using triggering display of the interface corresponding to the first object and moving the first object as examples. It can be understood that the embodiments of the present disclosure include, but are not limited to, these implementations. For example, the user can long-press an icon in the control widget and drag it to any region within the widget to trigger a move operation on the icon, or can move icon A onto icon B to trigger creation of a folder containing icon A and icon B; such operations behave as if the secondary screen were being operated directly from the main screen. This may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure.
In this embodiment of the present disclosure, (c) in the first implementation and the second implementation are specific implementations of interactive operations that move secondary-screen objects or main-screen objects between the main screen and the secondary screen: moving a secondary-screen object to the main screen, or moving a main-screen object to the secondary screen. In this way, the user can quickly operate objects or content in different secondary screens on the main screen, achieving quick switching between the main screen and secondary screens and quick content adjustment; the operation is convenient and greatly improves the user experience.
With the object control method provided by the embodiments of the present disclosure, a first input performed by a user on a target control widget and a first object in a first screen can be received (objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens); and, in response to the first input, a first action corresponding to the first input is performed on the first object on the first screen, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located. With this solution, since the first screen includes a control widget corresponding to the second screen, and the control widget can be used to trigger display of the second screen's display interface on the first screen, objects in the second screen can be controlled or operated directly on the first screen, and interactive operations on objects or content between the first screen and the second screen can also be carried out on the first screen. In this way, the embodiments of the present disclosure improve the convenience of controlling objects on different screens of a multi-screen terminal device.
Optionally, with reference to FIG. 2, as shown in FIG. 9, before the foregoing S200, the object control method provided by this embodiment of the present disclosure may further include the following S202 and S203.
S202: The terminal device receives a second input performed by the user on the target control widget.
In this embodiment of the present disclosure, if the user wants to operate the secondary screen from the main screen, the user can perform an input on the target control widget in the main screen to trigger display, on the main screen, of the display interface of the secondary screen corresponding to the target control widget (that is, the secondary-screen projection window). The user can then operate on the secondary screen's display interface to trigger the terminal device to perform the corresponding action, thereby operating the secondary screen from the main screen.
Optionally, in this embodiment of the present disclosure, the user's second input may be a click input (for example, a single-click input or a double-click input), a long-press input, or an input in any other possible form, which may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure.
S203: In response to the second input, the terminal device updates the display mode of the target control widget from a first display mode to a second display mode, where at least one second object is displayed in the target control widget in the second display mode.
The at least one second object is an object in the second screen.
In this embodiment of the present disclosure, the display modes of the control widget may include the first display mode and the second display mode, and the display mode may change after the control widget is triggered. Specifically, the display mode of the control widget may be updated from the first display mode (that is, the initial state) to the second display mode to display the objects in the control widget (the foregoing second objects), or from the second display mode back to the first display mode to cancel display of the objects in the control widget (that is, return to the initial state).
Optionally, the target control widget in the first display mode may be a folder identifier (for example, a folder icon), and the target control widget in the second display mode may be a display interface, which may be a projection or mapping, on the main screen, of the interface of the secondary screen corresponding to the target control widget; the display interface can be understood as the expanded folder page of the folder identifier. Specifically, the target control widget in the first display mode may be called a secondary-screen projection folder (or projection folder), and the target control widget in the second display mode may be called the expanded page of the secondary-screen projection folder (or the expanded page of the projection folder).
Specifically, the display mode of the control widget may change after the control widget is triggered: it may be updated from the first display mode to the second display mode to display, or enter, the expanded page of the projection folder, or from the second display mode to the first display mode to cancel display of, or exit, the expanded page of the projection folder.
Optionally, in this embodiment of the present disclosure, the display interface may be displayed floating over the first screen, or overlaid on the display interface of the first screen, which may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure. In addition, the shape and size of the display interface may be determined according to actual use requirements and are not limited in the embodiments of the present disclosure.
As shown in (a) of FIG. 10, the initial state (that is, the first display mode) of the control widget 1 in the main screen 30 is presented as a projection folder. If the user performs an input on the control widget 1, then, as shown in (b) of FIG. 10, the terminal device may, in response to the input, update the display mode of the control widget 1 from the projection-folder mode to the mode of the projection folder's expanded page 41, where the objects in the expanded page 41 are the objects in the secondary screen corresponding to the control widget 1.
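A hedged sketch of S202/S203 with the same hypothetical model: the widget flips between a folder-icon presentation (the first display mode) and an expanded projection page listing the second objects (the second display mode). View drawing is stubbed out, and all names are illustrative:

```java
// Hypothetical two-state presentation for the secondary-screen projection
// folder; the first display mode is the initial state.
class ProjectionFolderView {
    enum Mode { FOLDER_ICON, EXPANDED_PAGE }

    private Mode mode = Mode.FOLDER_ICON;
    private final ControlWidget widget;

    ProjectionFolderView(ControlWidget widget) { this.widget = widget; }

    // The second input (S202) toggles the display mode (S203).
    void onSecondInput() {
        mode = (mode == Mode.FOLDER_ICON) ? Mode.EXPANDED_PAGE : Mode.FOLDER_ICON;
        render();
    }

    private void render() {
        if (mode == Mode.EXPANDED_PAGE) {
            // Second display mode: show the second objects, i.e. the
            // secondary screen's objects projected onto the main screen.
            for (ScreenObject obj : widget.objects()) {
                drawObject(obj);   // stub
            }
        } else {
            drawFolderIcon();      // stub: folder identifier, as in FIG. 10(a)
        }
    }

    void drawObject(ScreenObject obj) {}
    void drawFolderIcon() {}
}
```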
In this embodiment of the present disclosure, the display mode of the target control widget can be changed in response to the user's input on the target control widget, so that the display interface of the secondary screen corresponding to the target control widget is displayed on the main screen. In this way, objects or content in the secondary screen can be operated directly on the main screen, or interactive operations on objects or content across different screens can be performed on the main screen, thereby improving the operating convenience of the multi-screen terminal device.
Based on the foregoing S202 and S203, the following describes a specific implementation of moving a secondary-screen object from the secondary-screen projection folder to the main screen.
Specifically, since the secondary-screen projection folder is a folder type in the main screen, an object in the secondary-screen projection folder (that is, a secondary-screen object) can be dragged directly into the main screen following the folder-operation strategy, so that the secondary-screen object becomes a main-screen object. Based on folder projection and nesting, from the perspective of the main-screen desktop, the expanded page of the secondary-screen projection folder is equivalent to the expanded page of an ordinary folder; and from the perspective of the secondary-screen desktop, the boundary of the expanded page of the secondary-screen projection folder is equivalent to the boundary of the secondary screen. Therefore, dragging an object in the secondary-screen projection folder across the boundary of the expanded page into the main screen moves the object from the secondary-screen desktop to the main-screen desktop.
For example, to drag application icon 3, located in folder icon 2 within the secondary-screen projection folder, to the main screen, first drag application icon 3 from folder icon 2 to the expanded page of the secondary-screen projection folder, and then continue dragging application icon 3 across the boundary of the expanded page, out of the secondary-screen projection folder. In this way, the icon is moved from the secondary-screen projection folder to the main screen, so that the secondary-screen icon is moved to the main screen and the secondary-screen object becomes a main-screen object.
Based on the foregoing S202 and S203, the following further describes a specific implementation of moving a main-screen object from the main screen into the secondary-screen projection folder.
Specifically, if the user long-presses application icon 1 in the main screen and drags it onto the secondary-screen projection folder, the terminal device may display the expanded page of the secondary-screen projection folder, and the user can then drag application icon 1 across the boundary of the expanded page into any region of the secondary-screen projection folder (including the folder's persistent bar, etc.). In this way, the main-screen icon is moved from the main screen into the secondary-screen projection folder, so that the main-screen icon is moved to the secondary screen and the main-screen object becomes a secondary-screen object. Further, the user can drag application icon 1 onto a folder icon in the secondary-screen projection folder.
It should be noted that the foregoing description exemplarily uses a folder identifier as the target control widget in the first display mode. It can be understood that the embodiments of the present disclosure include, but are not limited to, this first display mode; the target control widget in the first display mode may also be presented in any other possible display mode, which may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure.
In this embodiment of the present disclosure, by displaying a special folder on the main screen as the secondary-screen projection folder, the user can perform an input on the secondary-screen projection folder to trigger display of its expanded page, that is, the display page of the secondary screen. The user can thus operate the secondary screen just like an ordinary folder, realizing content interaction between the secondary-screen desktop and the main-screen desktop.
Optionally, in this embodiment of the present disclosure, the terminal device may, in response to the user's input on the first screen (hereinafter referred to as a shortcut gesture input), quickly display (that is, enter) or cancel display of (that is, exit) the expanded page of the secondary-screen projection folder.
For example, the first input may be a multi-finger (for example, two-finger) zoom-in gesture input, used to trigger entry into the expanded page of the secondary-screen projection folder from any display interface of the main screen. Similarly, the first input may be a multi-finger (for example, two-finger) zoom-out gesture input, used to trigger exit from the expanded page of the secondary-screen projection folder and a quick return to the display interface of the main screen. In this way, since the user neither needs to look for the icon of the secondary-screen projection folder on the main screen nor needs to perform an input on the projection folder in order to quickly enter or exit its expanded page, the operating convenience for the user is improved.
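On Android, a two-finger zoom gesture of this kind could plausibly be detected with the framework's ScaleGestureDetector; the sketch below shows one possible wiring, with the enter/exit calls defined as an assumed interface and the thresholds chosen arbitrarily for illustration:

```java
import android.content.Context;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;

// Hypothetical shortcut-gesture handler: a two-finger zoom-in anywhere on the
// main screen enters the projection folder's expanded page; a zoom-out exits it.
class ShortcutGestureHandler {
    interface ExpandedPageController {
        void enterExpandedPage();
        void exitExpandedPage();
    }

    private final ScaleGestureDetector detector;
    private float accumulated = 1f;

    ShortcutGestureHandler(Context context, ExpandedPageController controller) {
        detector = new ScaleGestureDetector(context,
                new ScaleGestureDetector.SimpleOnScaleGestureListener() {
                    @Override
                    public boolean onScaleBegin(ScaleGestureDetector d) {
                        accumulated = 1f;
                        return true;
                    }
                    @Override
                    public boolean onScale(ScaleGestureDetector d) {
                        accumulated *= d.getScaleFactor();
                        return true;
                    }
                    @Override
                    public void onScaleEnd(ScaleGestureDetector d) {
                        // Thresholds are illustrative; a real implementation
                        // would tune them.
                        if (accumulated > 1.2f) {
                            controller.enterExpandedPage();  // zoom-in gesture
                        } else if (accumulated < 0.8f) {
                            controller.exitExpandedPage();   // zoom-out gesture
                        }
                    }
                });
    }

    // Feed main-screen touch events here, e.g. from a View.OnTouchListener.
    boolean onTouchEvent(MotionEvent event) {
        return detector.onTouchEvent(event);
    }
}
```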
In this embodiment of the present disclosure, since the sub-interface is a projection or mapping, on the main screen, of the secondary screen's display interface, the sub-interface may display the same navigation bar as the secondary screen's display interface, and the navigation bar may include function keys such as a Back key and/or a Home key. The terminal device may respond to the user's input on a function key in the sub-interface either on the secondary screen's display interface (that is, the sub-interface) or on the main screen's display interface, which may depend on the software strategy and is not limited in the embodiments of the present disclosure.
For example, the terminal device may, in response to the user's input on the Home key in the sub-interface, update the sub-interface to display the home interface of the secondary screen. In this case, if the user wants to return to the home interface of the main screen, the user can perform a multi-finger zoom-out gesture input on the first screen to trigger the terminal device to cancel display of the secondary screen's home interface and display the main screen's home interface.
As another example, the terminal device may, in response to the user's input on the Home key or the like in the sub-interface, cancel display of the expanded page of the secondary-screen projection folder and display the home interface of the main screen.
Optionally, with reference to FIG. 2, as shown in FIG. 11, after the foregoing S200, the object control method provided by this embodiment of the present disclosure may further include the following S204.
S204: In response to the first input, the terminal device displays, on the second screen, the result of performing the first action on the first object.
In this embodiment of the present disclosure, the terminal device may, in response to the user's first input on the target control widget and the first object in the first screen, perform the first action corresponding to the first input on the first object on the first screen, and display, on the second screen corresponding to the target control widget, the result of performing the first action on the first object.
For example, with reference to the foregoing FIG. 6, assuming the first object is application icon 2 in the control widget 1 and the first input is the user dragging application icon 2 to the region where folder icon 2 is located, the terminal device may, in response to the input, move application icon 2 into folder icon 2 (that is, the foregoing first action) and display, on the secondary screen 31 corresponding to the control widget 1, the result of performing the first action on application icon 2: the expanded folder page 34 of folder icon 2 includes application icon 2, application icon 3, and application icon 4.
It should be noted that the foregoing description exemplarily uses the expanded folder page as the result displayed on the secondary screen 31 after the first action is performed on application icon 2. The embodiments of the present disclosure are not limited to this display result; for example, the result displayed on the secondary screen 31 may be folder icon 2 rather than its expanded folder page 34. This may be determined according to actual use requirements and is not limited in the embodiments of the present disclosure.
With this solution, the display interface of the secondary screen can be displayed on the main screen, so that objects in the secondary screen can be operated directly on the main screen, and interaction of secondary-screen objects or main-screen objects between the main-screen interface and the secondary-screen interface can be realized by operating objects in the main screen or in the secondary screen.
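On a device running stock Android, keeping a physically separate second screen in sync as in S204 could plausibly be built on the framework's DisplayManager and Presentation classes, which render a window on a secondary display. The sketch below shows the wiring only; it is one possible realization, not the implementation stated by this disclosure:

```java
import android.app.Presentation;
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.view.Display;
import android.view.View;

// Hypothetical S204 helper: after the first action has been performed on the
// main screen, re-render the secondary screen so that it shows the result.
class SecondaryScreenSync {
    private final Context context;

    SecondaryScreenSync(Context context) { this.context = context; }

    void showResultOnSecondScreen(View resultView) {
        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        Display[] displays =
                dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);
        if (displays.length == 0) {
            return;  // no secondary display attached
        }
        // A Presentation is a dialog-like window drawn on another display;
        // resultView could be, e.g., the expanded folder page 34 of FIG. 6.
        Presentation presentation = new Presentation(context, displays[0]);
        presentation.setContentView(resultView);
        presentation.show();
    }
}
```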
It should be noted that the embodiments of the present disclosure may not limit the execution order of S201 and S204: S201 may be performed before S204, S204 may be performed before S201, or S201 and S204 may be performed simultaneously. It can be understood that FIG. 11 illustrates the example in which S201 is performed before S204.
It should also be noted that the embodiments of the present disclosure are exemplarily described using icon projection as an example. It can be understood that the embodiments of the present disclosure include, but are not limited to, this object projection mode; object projection modes may further include theme projection, settings-interface projection, video-interface projection, browser-page projection, and so on, which may be determined according to actual use requirements and are not limited in the embodiments of the present disclosure.
As shown in FIG. 12, an embodiment of the present disclosure provides a terminal device 700 that includes at least two screens; the terminal device 700 may include a receiving module 701 and a control module 702. The receiving module 701 is configured to receive a first input performed by a user on a target control widget and a first object in a first screen, where objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens. The control module 702 is configured to, in response to the first input received by the receiving module 701, perform, on the first screen, a first action corresponding to the first input on the first object, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located.
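The receiving-module/control-module split of FIG. 12 maps naturally onto two small interfaces; the following hedged Java sketch shows the decomposition, with names that are descriptive assumptions rather than terms from any framework:

```java
// Hypothetical decomposition mirroring FIG. 12: terminal device 700 with a
// receiving module 701 and a control module 702.
interface ReceivingModule {
    // Receives the user's first input on the target control widget and the
    // first object in the first screen and forwards it for handling.
    void onFirstInput(ScreenObject firstObject, ControlWidget targetWidget);
}

interface ControlModule {
    // Performs, on the first screen, the first action corresponding to the
    // first input on the first object.
    void performFirstAction(ScreenObject firstObject, ControlWidget targetWidget);
}

// Minimal wiring: the receiving module delegates to the control module.
class TerminalDevice700 implements ReceivingModule {
    private final ControlModule controlModule;

    TerminalDevice700(ControlModule controlModule) {
        this.controlModule = controlModule;
    }

    @Override
    public void onFirstInput(ScreenObject firstObject, ControlWidget targetWidget) {
        controlModule.performFirstAction(firstObject, targetWidget);
    }
}
```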
Optionally, in this embodiment of the present disclosure, the objects in one of the at least one control widget are objects in the screen corresponding to that control widget.
Optionally, in this embodiment of the present disclosure, the first object may be an object in the target control widget, and the first action may include any one of the following: displaying, in the region corresponding to the target control widget, an interface corresponding to the first object; moving the first object from the target control widget to the target region; or moving the first object from a first position in the target control widget to a second position in the target control widget.
Optionally, in this embodiment of the present disclosure, the first object may be an object in the target region, and the first action may include: moving the first object from the target region into the target control widget.
Optionally, in this embodiment of the present disclosure, the control module 702 is further configured to, after the receiving module 701 receives the first input performed by the user on the target control widget and the first object in the first screen, display, on the second screen in response to the first input, the result of performing the first action on the first object.
Optionally, in this embodiment of the present disclosure, the receiving module 701 is further configured to receive a second input performed by the user on the target control widget before receiving the first input performed by the user on the target control widget and the first object in the first screen. The control module 702 is further configured to, in response to the second input received by the receiving module 701, update the display mode of the target control widget from a first display mode to a second display mode, where at least one second object is displayed in the target control widget in the second display mode, and the at least one second object is an object in the second screen.
Optionally, in this embodiment of the present disclosure, the target control widget in the first display mode is a folder identifier, and the target control widget in the second display mode is a display interface.
The terminal device provided by this embodiment of the present disclosure can implement each process implemented by the terminal device in the foregoing method embodiments; to avoid repetition, details are not described here again.
The terminal device provided by the embodiments of the present disclosure can receive a first input performed by a user on a target control widget and a first object in a first screen (objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens); and, in response to the first input, perform, on the first screen, a first action corresponding to the first input on the first object, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located. With this solution, since the first screen includes a control widget corresponding to the second screen, and the control widget can be used to trigger display of the second screen's display interface on the first screen, objects in the second screen can be controlled or operated directly on the first screen, and interactive operations on objects or content between the first screen and the second screen can also be carried out on the first screen. In this way, the embodiments of the present disclosure improve the convenience of controlling objects on different screens of a multi-screen terminal device.
FIG. 13 is a schematic diagram of the hardware structure of a terminal device implementing the embodiments of the present disclosure. As shown in FIG. 13, the terminal device 800 includes, but is not limited to, components such as a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power source 811. A person skilled in the art can understand that the terminal device structure shown in FIG. 13 does not constitute a limitation on the terminal device, and the terminal device may include more or fewer components than shown, or combine certain components, or use a different component arrangement. In the embodiments of the present disclosure, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a laptop computer, a palmtop computer, an in-vehicle terminal, a wearable device, a pedometer, and the like.
The user input unit 807 is configured to receive a first input performed by a user on a target control widget and a first object in a first screen, where objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens. The processor 810 is configured to, in response to the first input received by the user input unit 807, perform, on the first screen, a first action corresponding to the first input on the first object, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located.
Embodiments of the present disclosure provide a terminal device that can receive a first input performed by a user on a target control widget and a first object in a first screen (objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens); and, in response to the first input, perform, on the first screen, a first action corresponding to the first input on the first object, where the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located. With this solution, since the first screen includes a control widget corresponding to the second screen, and the control widget can be used to trigger display of the second screen's display interface on the first screen, objects in the second screen can be controlled or operated directly on the first screen, and interactive operations on objects or content between the first screen and the second screen can also be carried out on the first screen. In this way, the embodiments of the present disclosure improve the convenience of controlling objects on different screens of a multi-screen terminal device.
It should be understood that, in this embodiment of the present disclosure, the radio frequency unit 801 may be used to receive and send signals during information transmission and reception or during a call; specifically, after receiving downlink data from a base station, it sends the data to the processor 810 for processing, and it also sends uplink data to the base station. Generally, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 801 may also communicate with a network and other devices through a wireless communication system.
The terminal device 800 provides the user with wireless broadband Internet access through the network module 802, for example, helping the user send and receive emails, browse web pages, and access streaming media.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802, or stored in the memory 809, into audio signals and output them as sound. Moreover, the audio output unit 803 may also provide audio output related to specific functions performed by the terminal device 800 (for example, a call signal reception sound and a message reception sound). The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is configured to receive audio or video signals. The input unit 804 may include a graphics processing unit (GPU) 8041 and a microphone 8042. The graphics processor 8041 processes the image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 806. The image frames processed by the graphics processor 8041 may be stored in the memory 809 (or another storage medium) or sent via the radio frequency unit 801 or the network module 802. The microphone 8042 can receive sound and process such sound into audio data. In the telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 801 and output.
The terminal device 800 further includes at least one sensor 805, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 8061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 8061 and/or the backlight when the terminal device 800 is moved to the ear. As a type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device's posture (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or tapping); the sensor 805 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be repeated here.
The display unit 806 is configured to display information input by the user or information provided to the user. The display unit 806 may include a display panel 8061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 807 may be configured to receive input numeric or character information and to generate key signal input related to user settings and function control of the terminal device. Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 8071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 8071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 810, and receives and executes the commands sent by the processor 810. In addition, the touch panel 8071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 8071, the user input unit 807 may further include other input devices 8072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which will not be repeated here.
Further, the touch panel 8071 may cover the display panel 8061. When the touch panel 8071 detects a touch operation on or near it, it transmits the operation to the processor 810 to determine the type of the touch event, and the processor 810 then provides corresponding visual output on the display panel 8061 according to the type of the touch event. Although in FIG. 13 the touch panel 8071 and the display panel 8061 are two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 8071 and the display panel 8061 may be integrated to implement the input and output functions of the terminal device, which is not specifically limited here.
The interface unit 808 is an interface for connecting an external device to the terminal device 800. For example, the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, and the like. The interface unit 808 may be used to receive input (for example, data information and power) from an external device and transmit the received input to one or more elements in the terminal device 800, or may be used to transfer data between the terminal device 800 and an external device.
The memory 809 may be used to store software programs and various data. The memory 809 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 809 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 810 is the control center of the terminal device; it connects the various parts of the entire terminal device using various interfaces and lines and, by running or executing the software programs and/or modules stored in the memory 809 and calling the data stored in the memory 809, performs the various functions of the terminal device and processes data, thereby monitoring the terminal device as a whole. The processor 810 may include one or more processing units; optionally, the processor 810 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 810.
The terminal device 800 may further include a power source 811 (such as a battery) for supplying power to the various components. Optionally, the power source 811 may be logically connected to the processor 810 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the terminal device 800 includes some functional modules that are not shown, which will not be repeated here.
Optionally, an embodiment of the present disclosure further provides a terminal device, including a processor 810 as shown in FIG. 13, a memory 809, and a computer program stored in the memory 809 and executable on the processor 810; when the computer program is executed by the processor 810, each process of the foregoing object control method embodiments is implemented, and the same technical effect can be achieved. To avoid repetition, details are not described here again.
An embodiment of the present disclosure further provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, each process of the foregoing object control method embodiments is implemented, and the same technical effect can be achieved. To avoid repetition, details are not described here again. The computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
It should be noted that, in this document, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by "includes a ..." does not preclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
From the foregoing description of the embodiments, a person skilled in the art can clearly understand that the methods of the foregoing embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware, although in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present disclosure, in essence or the part that contributes to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods disclosed in the embodiments of the present disclosure.
The embodiments of the present disclosure are described above with reference to the accompanying drawings, but the present disclosure is not limited to the foregoing specific implementations, which are merely illustrative rather than restrictive. Inspired by the present disclosure, a person of ordinary skill in the art can derive many other forms without departing from the spirit of the present disclosure and the scope protected by the claims, all of which fall within the protection of the present disclosure.

Claims (12)

  1. An object control method, applied to a terminal device, wherein the terminal device comprises at least two screens, and the method comprises:
    receiving a first input performed by a user on a target control widget and a first object in a first screen, wherein objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens;
    in response to the first input, performing, on the first screen, a first action corresponding to the first input on the first object;
    wherein the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located.
  2. The method according to claim 1, wherein the first object is an object in the target control widget, and the first action comprises any one of the following: displaying, in a region corresponding to the target control widget, an interface corresponding to the first object; moving the first object from the target control widget to the target region; or moving the first object from a first position in the target control widget to a second position in the target control widget;
    or,
    the first object is an object in the target region, and the first action comprises: moving the first object from the target region into the target control widget.
  3. The method according to claim 2, wherein after the receiving a first input performed by a user on a target control widget and a first object in a first screen, the method further comprises:
    in response to the first input, displaying, on the second screen, a result of performing the first action on the first object.
  4. The method according to any one of claims 1 to 3, wherein before the receiving a first input performed by a user on a target control widget and a first object in a first screen, the method further comprises:
    receiving a second input performed by the user on the target control widget;
    in response to the second input, updating a display mode of the target control widget from a first display mode to a second display mode, wherein at least one second object is displayed in the target control widget in the second display mode, and the at least one second object is an object in the second screen.
  5. The method according to claim 4, wherein the target control widget in the first display mode is a folder identifier, and the target control widget in the second display mode is a display interface.
  6. A terminal device, wherein the terminal device comprises at least two screens, and the terminal device comprises a receiving module and a control module;
    the receiving module is configured to receive a first input performed by a user on a target control widget and a first object in a first screen, wherein objects in the target control widget are objects in a second screen, and the second screen is a screen other than the first screen among the at least two screens;
    the control module is configured to, in response to the first input received by the receiving module, perform, on the first screen, a first action corresponding to the first input on the first object;
    wherein the first object is an object in the target control widget or an object in a target region, and the target region is a region on the first screen other than the region where the target control widget is located.
  7. The terminal device according to claim 6, wherein the first object is an object in the target control widget, and the first action comprises any one of the following: displaying, in a region corresponding to the target control widget, an interface corresponding to the first object; moving the first object from the target control widget to the target region; or moving the first object from a first position in the target control widget to a second position in the target control widget;
    or,
    the first object is an object in the target region, and the first action comprises: moving the first object from the target region into the target control widget.
  8. The terminal device according to claim 7, wherein the control module is further configured to, after the receiving module receives the first input performed by the user on the target control widget and the first object in the first screen, display, on the second screen in response to the first input, a result of performing the first action on the first object.
  9. The terminal device according to any one of claims 6 to 8, wherein the receiving module is further configured to receive a second input performed by the user on the target control widget before receiving the first input performed by the user on the target control widget and the first object in the first screen;
    the control module is further configured to, in response to the second input received by the receiving module, update a display mode of the target control widget from a first display mode to a second display mode, wherein at least one second object is displayed in the target control widget in the second display mode, and the at least one second object is an object in the second screen.
  10. The terminal device according to claim 9, wherein the target control widget in the first display mode is a folder identifier, and the target control widget in the second display mode is a display interface.
  11. A terminal device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the object control method according to any one of claims 1 to 5.
  12. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements the steps of the object control method applied to a terminal device according to any one of claims 1 to 5.
PCT/CN2020/073301 2019-01-21 2020-01-20 Object control method and terminal device WO2020151675A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/380,029 US11526320B2 (en) 2019-01-21 2021-07-20 Multi-screen interface control method and terminal device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910052676.8A CN109901760B (zh) 2019-01-21 2019-01-21 Object control method and terminal device
CN201910052676.8 2019-01-21

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/380,029 Continuation US11526320B2 (en) 2019-01-21 2021-07-20 Multi-screen interface control method and terminal device

Publications (1)

Publication Number Publication Date
WO2020151675A1 true WO2020151675A1 (zh) 2020-07-30

Family

ID=66943969

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/073301 WO2020151675A1 (zh) 2019-01-21 2020-01-20 Object control method and terminal device

Country Status (3)

Country Link
US (1) US11526320B2 (zh)
CN (1) CN109901760B (zh)
WO (1) WO2020151675A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109901760B (zh) 2019-01-21 2020-07-28 维沃移动通信有限公司 Object control method and terminal device
CN110851066B (zh) * 2019-10-24 2021-12-10 瑞芯微电子股份有限公司 Method and device for supporting touch control on multiple display screens


Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324435B (zh) * 2013-05-24 2017-02-08 华为技术有限公司 Split-screen display method and apparatus, and electronic device thereof
CN104660789B (zh) * 2013-11-20 2018-07-06 联想(北京)有限公司 Information processing method and electronic device
CN104915144A (zh) * 2015-06-29 2015-09-16 惠州华阳通用电子有限公司 Dual-screen interactive user interface projection method
CN104915173A (zh) * 2015-06-29 2015-09-16 惠州华阳通用电子有限公司 Dual-screen interaction control method
CN105117099A (zh) * 2015-08-07 2015-12-02 深圳市金立通信设备有限公司 Terminal interface control method and terminal
KR20170096408A (ko) * 2016-02-16 2017-08-24 삼성전자주식회사 Method for displaying an application and electronic device supporting the same
CN107797747A (zh) * 2016-08-31 2018-03-13 中兴通讯股份有限公司 Multi-screen-based screen control method, apparatus, and terminal
CN108008890B (zh) * 2017-11-30 2021-04-23 努比亚技术有限公司 Dual-screen mobile terminal, cross-application quick transfer method therefor, and readable storage medium
CN108153504A (zh) * 2017-12-25 2018-06-12 努比亚技术有限公司 Dual-screen information interaction method, mobile terminal, and computer-readable storage medium
CN108958580B (zh) * 2018-06-28 2021-07-23 维沃移动通信有限公司 Display control method and terminal device
CN109002268A (zh) * 2018-06-28 2018-12-14 努比亚技术有限公司 Double-sided-screen terminal operation method, mobile terminal, and computer-readable storage medium
CN109032486B (zh) 2018-07-10 2021-01-22 维沃移动通信有限公司 Display control method and terminal device
CN109194815A (zh) * 2018-07-20 2019-01-11 重庆宝力优特科技有限公司 Operation method and apparatus based on a multi-screen terminal, and computer-readable storage medium
CN109164965A (zh) * 2018-08-10 2019-01-08 奇酷互联网络科技(深圳)有限公司 Mobile terminal and method, apparatus, and readable storage medium for shrinking its screen interface
CN109213401A (zh) * 2018-08-28 2019-01-15 南昌努比亚技术有限公司 Double-sided-screen application icon arrangement method, mobile terminal, and readable storage medium
CN109522278B (zh) * 2018-10-18 2020-08-04 维沃移动通信有限公司 File storage method and terminal device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104125332A (zh) * 2013-04-29 2014-10-29 Lg电子株式会社 Mobile terminal and method of controlling the mobile terminal
CN104133629A (zh) * 2014-07-10 2014-11-05 深圳市中兴移动通信有限公司 Dual-screen interaction method and mobile terminal
CN108932092A (zh) * 2018-06-29 2018-12-04 维沃移动通信有限公司 Display control method and terminal device
CN109901760A (zh) * 2019-01-21 2019-06-18 维沃移动通信有限公司 Object control method and terminal device

Also Published As

Publication number Publication date
US20210349671A1 (en) 2021-11-11
CN109901760A (zh) 2019-06-18
CN109901760B (zh) 2020-07-28
US11526320B2 (en) 2022-12-13

Similar Documents

Publication Publication Date Title
WO2021083052A1 (zh) 2021-05-06 Object sharing method and electronic device
WO2020258929A1 (zh) 2020-12-30 Folder interface switching method and terminal device
WO2021218902A1 (zh) 2021-11-04 Display control method and apparatus, and electronic device
WO2021136133A1 (zh) 2021-07-08 Application switching method and electronic device
WO2021057337A1 (zh) 2021-04-01 Operation method and electronic device
WO2020063091A1 (zh) 2020-04-02 Picture processing method and terminal device
WO2021083132A1 (zh) 2021-05-06 Icon moving method and electronic device
WO2021082711A1 (zh) 2021-05-06 Image display method and electronic device
WO2020181942A1 (zh) 2020-09-17 Icon control method and terminal device
CN109614061B (zh) 2021-08-27 Display method and terminal
WO2020151460A1 (zh) 2020-07-30 Object processing method and terminal device
WO2021012927A1 (zh) 2021-01-28 Icon display method and terminal device
WO2021104163A1 (zh) 2021-06-03 Icon arrangement method and electronic device
WO2020151525A1 (zh) 2020-07-30 Message sending method and terminal device
WO2020192296A1 (zh) 2020-10-01 Interface display method and terminal device
WO2021129536A1 (zh) 2021-07-01 Icon moving method and electronic device
WO2021004341A1 (zh) 2021-01-14 Folder creation method and terminal device
WO2020173235A1 (zh) 2020-09-03 Task switching method and terminal device
WO2020215982A1 (zh) 2020-10-29 Desktop icon management method and terminal device
WO2021004327A1 (zh) 2021-01-14 Application permission setting method and terminal device
WO2020192324A1 (zh) 2020-10-01 Interface display method and terminal device
CN109857289B (zh) 2021-05-18 Display control method and terminal device
WO2020199783A1 (zh) 2020-10-08 Interface display method and terminal device
WO2020192298A1 (zh) 2020-10-01 Image processing method and terminal device
WO2020215950A1 (zh) 2020-10-29 Interface display method and terminal device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20746029

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20746029

Country of ref document: EP

Kind code of ref document: A1