WO2021036594A1 - Control method applied to a screen projection scene and related device - Google Patents

Control method applied to a screen projection scene and related device

Info

Publication number
WO2021036594A1
WO2021036594A1 (PCT application PCT/CN2020/103440, CN2020103440W)
Authority
WO
WIPO (PCT)
Prior art keywords
navigation
navigation function
mobile phone
screen
display device
Prior art date
Application number
PCT/CN2020/103440
Other languages
English (en)
French (fr)
Inventor
谷贺瑾
牛思月
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP20858332.8A (published as EP4016272A4)
Priority to MX2022002472A
Priority to US17/638,567 (published as US11809704B2)
Publication of WO2021036594A1
Priority to US18/473,483 (published as US20240012562A1)

Classifications

    • G06F 3/1423: Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/04886: GUI interaction techniques using a touch-screen or digitiser, by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 3/023: Arrangements for converting discrete items of information into a coded form, e.g. keyboard-generated codes
    • G06F 3/038: Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F 3/0481: GUI interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. sliders or dials
    • G06F 3/1454: Digital output to display device; copying the display data of a local workstation or window to a remote workstation or window (teledisplay)
    • G09G 5/08: Cursor circuits
    • G09G 5/12: Synchronisation between the display unit and other units, e.g. other display units, video-disc players
    • G09G 5/14: Display of multiple viewports
    • H04M 1/72412: User interfaces for mobile telephones interfacing with external accessories via two-way short-range wireless interfaces
    • G06F 2203/0383: Remote input; signals generated by a pointing device transmitted to a PC at a remote location
    • G06F 2203/04803: Split screen; subdividing the display area or window area into separate subareas
    • G09G 2354/00: Aspects of interface with display user
    • G09G 2370/04: Exchange of auxiliary data (other than image data) between monitor and graphics controller
    • G09G 2370/16: Use of wireless transmission of display information
    • H04M 2250/16: Telephonic subscriber devices including more than one display unit
    • H04N 21/42204: User interfaces for controlling a client device through a remote control device

Definitions

  • This application relates to the field of computer technology, and in particular to a control method and related equipment applied to a screen projection scene.
  • Take a screen projection scene that includes a mobile phone and a computer as an example. When the mobile phone is projected to the computer, the computer can present a collaboration window.
  • The content of the mobile phone screen can be displayed in the collaboration window, that is, a mirror of the mobile phone is presented on the computer.
  • The user can control the mobile phone by using keyboard and mouse operations in the computer's collaboration window.
  • The present application provides a control method and related equipment applied to a screen projection scene, in which keyboard and mouse operations are used to implement functions that replace those which are difficult to simulate with keyboard and mouse operations, thereby improving the user's operating experience.
  • In a first aspect, this application provides a control method applied to a screen projection scene. The screen projection scene can include a mobile phone and a display device.
  • After the mobile phone establishes a connection with the display device, the display device receives the first screen content and the target navigation function identifier sent by the mobile phone; generates, according to the target navigation function identifier, a collaboration window that includes a projection area and a navigation bar, and displays the first screen content in the projection area; receives a keyboard-and-mouse operation acting on a virtual navigation key of the navigation bar; and generates a key command according to the keyboard-and-mouse operation and sends the key command to the mobile phone, so that the mobile phone executes a navigation function according to the key command and can use the navigation function to adjust the first screen content to the second screen content. The display device then receives the second screen content sent by the mobile phone and displays it in the projection area.
  • the first screen content refers to the content displayed on the screen of the mobile phone when the mobile phone establishes a connection with the display device.
  • The target navigation function refers to a mobile phone navigation function other than the three-key navigation function.
  • the navigation bar includes three virtual navigation buttons.
  • the three virtual navigation buttons correspond to different navigation functions.
  • the three virtual navigation buttons are the menu button, the desktop button, and the return button.
  • The menu key is used to enter the task menu, the desktop key is used to return to the desktop, and the back key is used to return to the previous level.
  • the functions shown above are commonly used navigation functions. In actual applications, the functions of the three virtual navigation buttons can also be set as other types of navigation functions, which are not limited in this application.
  • In this application, the display device sets a navigation bar in the collaboration window, and the user can perform the system navigation functions of the mobile phone through the three virtual navigation keys in the navigation bar.
  • In this way, the mobile phone screen content and the collaboration window content can be updated synchronously, which reduces accidental triggering when the user controls a mobile phone that uses another navigation method and improves the user's experience of controlling the mobile phone through the display device.
  • the current navigation function includes a gesture navigation function and/or an off-screen physical navigation function
  • the off-screen physical navigation function is implemented by a physical button.
  • The above method further includes: when the mode of the collaboration window is the maximized window mode and the pointer position is not in the first target area, hiding the navigation bar, where the first target area is the part of the edge area of the projection area that corresponds to the navigation bar; and when the mode of the collaboration window is the maximized window mode and the pointer position is in the first target area, displaying the navigation bar in the first target area.
  • The collaboration window also includes a title bar. When the mode of the collaboration window is the maximized window mode and the pointer position is not in the second target area, the title bar is hidden, where the second target area is the part of the edge area of the projection area that corresponds to the title bar; when the mode of the collaboration window is the maximized window mode and the pointer position is in the second target area, the title bar is displayed in the second target area.
  • When the collaboration window is a portrait window, the navigation bar is located below or above the projection area and is adjacent to the projection area; when the collaboration window is a landscape window, the navigation bar is located on the right or left side of the projection area and is adjacent to the projection area.
  • The above-mentioned method further includes: disabling, in the collaboration window, keyboard-and-mouse simulation of the navigation function corresponding to the target navigation function identifier.
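For illustration only, the following Kotlin sketch outlines the display-device side of the first aspect under stated assumptions: all type and function names (NavFunction, KeyCommand, CollaborationWindow, buildCollaborationWindow, onVirtualKeyClicked) are hypothetical and are not defined by the patent, and the screen content is reduced to a placeholder string.

```kotlin
// Hypothetical sketch of the display-device side of the first aspect;
// the names below are illustrative, not part of the patent.

enum class NavFunction { THREE_KEY, GESTURE, OFF_SCREEN_PHYSICAL }
enum class KeyCommand { MENU, DESKTOP, BACK }

data class CollaborationWindow(
    val projectionContent: String,   // placeholder for the first screen content
    val showNavigationBar: Boolean   // navigation bar with three virtual keys
)

fun buildCollaborationWindow(firstScreen: String, target: NavFunction) =
    // The navigation bar is generated when the phone uses a navigation
    // function other than three-key navigation.
    CollaborationWindow(firstScreen, showNavigationBar = target != NavFunction.THREE_KEY)

fun onVirtualKeyClicked(key: KeyCommand, sendToPhone: (KeyCommand) -> Unit) {
    // A click on a virtual navigation key becomes a key command that is
    // forwarded to the phone over the projection link.
    sendToPhone(key)
}

fun main() {
    val window = buildCollaborationWindow("first screen", NavFunction.GESTURE)
    println("navigation bar shown: ${window.showNavigationBar}")
    onVirtualKeyClicked(KeyCommand.DESKTOP) { cmd -> println("send $cmd to phone") }
}
```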
  • In a second aspect, this application provides a control method applied to a screen projection scene.
  • The method includes: sending the first screen content and the target navigation function identifier to a display device; receiving a key command sent by the display device; executing, according to a preset correspondence, the navigation function corresponding to the key command, where the navigation function is used to adjust the first screen content to the second screen content; and sending the second screen content to the display device, so that the display device displays the second screen content in the projection area of the collaboration window.
  • the key instruction is generated by the display device according to the operation of the mouse and keyboard.
  • The target navigation function identifier is used to identify the current navigation function on the mobile phone other than the three-key navigation function.
  • the current navigation function includes a gesture navigation function and/or an off-screen physical navigation function, and the off-screen physical navigation function is implemented by a physical button.
  • the present application provides a display device with functions that can implement the control method in the first aspect or any implementation manner.
  • This function can be realized by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above-mentioned functions.
  • the present application provides a mobile phone with functions that can implement the control method in the second aspect or any implementation manner.
  • This function can be realized by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above-mentioned functions.
  • this application provides a screen projection system.
  • the projection system includes the display device provided in the third aspect or any one of the implementation manners, and the mobile phone provided in the fourth aspect or any one of the implementation manners.
  • the present application provides a computer-readable storage medium that stores instructions that, when executed on a computer, cause the computer to execute the method of the first aspect or the second aspect.
  • the present application provides a computer program product, which when running on a computer, causes the computer to execute the method of the first aspect or the second aspect.
  • Figure 1 is a schematic diagram of a screen projection scene in this application.
  • Figure 2 is a signaling interaction diagram of the control method applied to the screen projection scenario in this application;
  • Figure 3A is a schematic diagram of the collaboration window in this application.
  • FIG. 3B is a schematic diagram of the collaboration window in this application as a maximized portrait window
  • FIG. 3C is a schematic diagram of the collaboration window in this application as a maximized horizontal screen window
  • Figure 3D is another schematic diagram of the collaboration window in this application being a maximized horizontal screen window
  • Figure 3E is another schematic diagram of the collaboration window in this application.
  • Figure 3F is another schematic diagram of the collaboration window in this application.
  • Figure 4 is a schematic diagram of the structure of the display device in this application.
  • FIG. 5 is a schematic diagram of the structure of the mobile phone in this application.
  • Figure 6 is a schematic diagram of the screen projection system in this application.
  • FIG. 7 is another schematic diagram of the structure of the display device in this application.
  • Fig. 8 is another schematic diagram of the structure of the mobile phone in this application.
  • This application relates to a control method applied to a screen projection scene.
  • FIG. 1 is a schematic diagram of a screen projection scene.
  • the screen projection scene includes the mobile phone 10 and the display device 20.
  • the mobile phone 10 and the display device 20 may be connected through a wireless link 30.
  • the mobile phone 10 and the display device 20 may also be connected via a wired connection, such as a data cable.
  • the display device 20 may generate a collaboration window 40 according to the content displayed on the screen of the mobile phone 10.
  • the user's operation in the collaboration window 40 can update the screen content of the mobile phone 10 and the content of the collaboration window 40 simultaneously.
  • the display device 20 refers to a computing device that performs input operations through a keyboard and/or a mouse and has a display, such as a desktop computer, a notebook computer, and the like.
  • The mobile phone 10 may also be referred to as a cellular phone or handset.
  • the system navigation function on the mobile phone is also called the system navigation mode. System navigation functions include gesture navigation, off-screen physical navigation, and three-button navigation.
  • the display device 20 takes a computer as an example. After the mobile phone 10 is projected to the computer, in the case of using gesture navigation on the mobile phone 10, it is difficult for the user to accurately simulate the mobile phone navigation operation on the computer through keyboard and mouse operations. For example, swipe up from the bottom left of the phone screen to return to the desktop. Swipe up from the bottom left of the phone screen and stay for a while to enter the task menu. Swipe to the right from the far left of the phone screen to return to the previous level. In this way, when the user uses the keyboard and mouse to move on the computer, it is easy to start the navigation function by mistake. For example, when the user wants to enter the task menu, sliding the mouse pointer up from the lower left end of the collaboration window incorrectly activates the function of returning to the desktop.
  • In the case of using off-screen physical navigation on the mobile phone 10, pressing the physical button once means returning to the previous level, and pressing the physical button twice quickly means returning to the desktop.
  • Keyboard and mouse operations on the computer can easily trigger such functions by mistake. For example, when the user wants to activate the return-to-desktop function, three clicks may instead activate the navigation function of returning to the previous level.
  • For this reason, this application provides virtual three-key navigation buttons on the computer to replace gesture navigation operations.
  • The three virtual navigation buttons on the navigation bar can be operated with the mouse and keyboard, which reduces accidental triggering of navigation functions and offers better accuracy, thereby improving the user's experience of controlling the phone through another device.
  • an embodiment of the control method provided by the present application includes:
  • Step 201 The display device receives the first screen content and the target navigation function identifier sent by the mobile phone.
  • the mobile phone can send the target navigation function identifier to the display device.
  • The target navigation function identifier is used to identify the current navigation function other than the three-key navigation function.
  • the target navigation function identification may be a character string used to identify the current navigation function, or a digital number. In practical applications, the target navigation function identification can also be expressed in other ways, such as pictures, symbols, text, and so on.
  • the target navigation function identifier can be carried in a message.
  • the current navigation function includes but is not limited to: gesture navigation function and/or off-screen physical navigation function.
  • the off-screen physical navigation function is realized by a physical button.
  • the three-key navigation function refers to the three-key navigation function on the screen or the three-key navigation function outside the screen.
  • the display device sends a query instruction to the mobile phone, and the mobile phone obtains the target navigation function identifier according to the query instruction, and then sends the target navigation function identifier to the display device.
  • The display device may receive the first screen content and the target navigation function identifier from the mobile phone together or separately.
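The patent only states that the identifier may be a character string or a number carried in a message; the following minimal Kotlin sketch shows one possible encoding, with the field names and JSON layout being assumptions.

```kotlin
// Illustrative encoding of the target navigation function identifier.
// Field names and values are assumptions, not prescribed by the patent.

data class NavIdentifierMessage(
    val type: String = "NAV_FUNCTION",
    val identifier: String           // e.g. "GESTURE" or "OFF_SCREEN_PHYSICAL"
)

fun serialize(msg: NavIdentifierMessage): String =
    """{"type":"${msg.type}","identifier":"${msg.identifier}"}"""

fun main() {
    println(serialize(NavIdentifierMessage(identifier = "GESTURE")))
}
```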
  • Step 202 The display device generates a collaboration window including a projection area and a navigation bar according to the target navigation function identifier, and displays the first screen content in the projection area.
  • According to the target navigation function identifier, the display device can determine that the current navigation function is not the three-key navigation function, which means that simulating, on the computer, the navigation function being used on the mobile phone through keyboard-and-mouse operations may cause false touches.
  • the mobile phone can send the usage status of all navigation functions to the display device, and the display device can also determine the navigation function being used on the mobile phone according to the usage status of each navigation function.
  • The projection area is used to display the content of the mobile phone screen.
  • the navigation bar includes three virtual navigation buttons.
  • the three virtual navigation buttons correspond to different navigation functions.
  • the three virtual navigation buttons are the menu button, the desktop button, and the return button.
  • the menu key is used to enter the task menu
  • the desktop key is used to return to the desktop
  • the return key is used to return to the previous level. It is understandable that the functions shown above are commonly used navigation functions. In actual applications, the functions of the three virtual navigation buttons can also be set as other types of navigation functions, which are not limited in this application.
  • the configuration of the projection area and the navigation bar in the collaboration window of the display device can be executed together or separately. For example, when the display device receives the first screen content sent by the mobile phone, the display device generates a projection area in the collaboration window. When the display device receives the target navigation function identifier, the display device generates a navigation bar in the collaboration window.
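A minimal Kotlin sketch of step 202, assuming the screen content and the identifier may arrive separately and that the navigation bar is generated only when the identifier does not indicate three-key navigation; the class name and key labels are illustrative.

```kotlin
// Sketch of step 202 on the display device: the projection area and the
// navigation bar of the collaboration window are configured independently
// as the screen content and the navigation identifier arrive.

class Window {
    var projection: String? = null
    var navigationKeys: List<String> = emptyList()

    fun onScreenContent(content: String) { projection = content }

    fun onNavIdentifier(identifier: String) {
        // Only a non-three-key identifier triggers the virtual navigation bar.
        if (identifier != "THREE_KEY") {
            navigationKeys = listOf("MENU", "DESKTOP", "BACK")
        }
    }
}
```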
  • Step 203 The display device receives the mouse and keyboard operation acting on the virtual navigation button.
  • Keyboard-and-mouse operations refer to operations input with the keyboard and/or operations input with the mouse, for example, clicking a virtual navigation key after the mouse pointer has been moved onto it.
  • the keyboard and mouse operations are not limited to the above examples.
  • Step 204 The display device generates a key instruction according to the mouse and keyboard operation.
  • the display device can generate a key instruction corresponding to the virtual navigation key. For example, if the virtual navigation key is a menu key, a key command corresponding to the menu key is generated. If the virtual navigation key is a desktop key, a key instruction corresponding to the desktop key is generated. If the virtual navigation key is a return key, a key instruction corresponding to the return key is generated.
  • Step 205 The display device sends the key command to the mobile phone.
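A minimal Kotlin sketch of steps 203 to 205, assuming hypothetical key-command strings; the transport to the phone is abstracted as a callback.

```kotlin
// Sketch of steps 203-205: a click on a virtual navigation key is translated
// into the corresponding key command and sent to the phone. The command
// strings and the sendToPhone callback are placeholders.

enum class VirtualKey { MENU, DESKTOP, BACK }

fun handleClick(key: VirtualKey, sendToPhone: (String) -> Unit) {
    val command = when (key) {
        VirtualKey.MENU -> "KEY_MENU"
        VirtualKey.DESKTOP -> "KEY_HOME"
        VirtualKey.BACK -> "KEY_BACK"
    }
    sendToPhone(command)
}
```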
  • Step 206 The mobile phone executes the navigation function corresponding to the key command according to the preset correspondence relationship, and adjusts the first screen content to the second screen content.
  • the preset correspondence relationship refers to the correspondence relationship between the key command from the display device and the navigation function of the mobile phone.
  • If the key command corresponds to the menu key, the navigation function executed on the mobile phone is entering the task menu.
  • If the key command corresponds to the desktop key, the navigation function executed on the mobile phone is returning to the desktop.
  • If the key command corresponds to the return key, the navigation function executed on the mobile phone is returning to the previous level.
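A minimal Kotlin sketch of the preset correspondence in step 206 on the phone side; the command strings and printed actions are placeholders for the real navigation behaviour.

```kotlin
// Sketch of step 206 on the phone: a preset correspondence maps each key
// command from the display device to a system navigation action.

val presetCorrespondence: Map<String, () -> Unit> = mapOf(
    "KEY_MENU" to { println("enter task menu") },
    "KEY_HOME" to { println("return to desktop") },
    "KEY_BACK" to { println("return to previous level") }
)

fun onKeyCommand(command: String) {
    presetCorrespondence[command]?.invoke()
    // After the navigation action runs, the phone captures the updated
    // (second) screen content and sends it back to the display device.
}
```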
  • Step 207 The display device receives the second screen content sent by the mobile phone.
  • Step 208 The display device displays the second screen content in the projection area.
  • In this embodiment, when the mobile phone uses a navigation function other than the three-key navigation function, the display device sets a navigation bar in the collaboration window, and the user performs the system navigation functions of the mobile phone through the three virtual navigation keys in the navigation bar.
  • In this way, the mobile phone screen content and the collaboration window content can be updated synchronously, which reduces accidental triggering when the user controls a mobile phone that uses another navigation method and improves the user's experience of controlling the mobile phone through the display device.
  • the collaboration window 40 of the present application may include a projection area 301, a navigation bar 302, and a title bar 303.
  • the navigation bar 302 and the title bar 303 can be hidden according to actual needs.
  • the navigation bar 302 includes three virtual navigation buttons, which are a first virtual navigation button 3021, a second virtual navigation button 3022, and a third virtual navigation button 3023, respectively.
  • the first virtual navigation key 3021, the second virtual navigation key 3022, and the third virtual navigation key 3023 are respectively: a return key, a desktop key, and a menu key.
  • the functions of the virtual navigation buttons and the sequence between the virtual navigation buttons can be adjusted according to actual needs.
  • the navigation bar 302 can be set outside the projection area 301 or within the projection area 301.
  • When the collaboration window 40 is a maximized window, the navigation bar 302 is set within the projection area 301; when the collaboration window 40 is not a maximized window, the navigation bar 302 is set outside the projection area 301.
  • the title bar 303 includes a minimize window button, a maximize window button, and a close window button.
  • the title bar 303 may also include a window name, a direction lock button, etc., which are not limited here.
  • The following describes the case where the collaboration window 40 is a maximized window and the navigation bar 302 is set within the projection area 301:
  • When the mode of the collaboration window 40 is the maximized window mode and the pointer position is not in the first target area, the navigation bar 302 is hidden, where the first target area is the part of the edge area of the projection area 301 that corresponds to the navigation bar 302; when the mode of the collaboration window 40 is the maximized window mode and the pointer position is in the first target area, the navigation bar 302 is displayed in the first target area.
  • The first target area may specifically be one or a combination of the upper, lower, left, or right edge areas of the projection area 301, or it may be a part of any one of these edge areas, which is not limited in this application.
  • When the collaboration window 40 is a maximized portrait window, the first target area is the lower edge of the projection area 301.
  • When the pointer is in the first target area, the navigation bar 302 is displayed on the lower edge of the projection area 301; otherwise, the navigation bar 302 is hidden.
  • The navigation bar 302 may be hidden by, but is not limited to, moving downward from the lower edge of the projection area 301 until it disappears, or disappearing directly.
  • When the collaboration window 40 is a maximized landscape window, the first target area is the right edge of the projection area 301, and the projection area 301 may be a full-screen area.
  • When the pointer is in the first target area, the navigation bar 302 is displayed on the right edge of the projection area 301; otherwise, the navigation bar 302 is hidden.
  • The navigation bar 302 may be hidden by, but is not limited to, moving out to the right from the right edge of the projection area 301 until it disappears, or disappearing directly.
  • When the navigation bar 302 is displayed in the first target area within the projection area 301, the navigation bar 302 blocks part of the displayed content of the projection area 301.
  • the background of the navigation bar 302 can be set to be transparent, so as to reduce the occlusion of the projection area 301.
  • hiding the navigation bar 302 can display the mobile phone screen content in full screen. This provides a full-screen display of the mobile phone screen content on the computer, which can improve the user's experience of watching the screencast. And a navigation bar that can be hidden or expanded is provided. When the user wants to navigate the system, the navigation bar can be quickly called up, which has the advantages of convenience and speed.
  • The above method further includes: when the mode of the collaboration window 40 is the maximized window mode and the pointer position is not in the second target area, hiding the title bar 303, where the second target area is the part of the edge area of the projection area 301 that corresponds to the title bar 303; and when the mode of the collaboration window 40 is the maximized window mode and the pointer position is in the second target area, displaying the title bar 303 in the second target area.
  • the collaboration window 40 further includes a title bar 303.
  • This applies when the collaboration window 40 is a maximized portrait window, and also when the collaboration window 40 is a maximized landscape window, in which case the projection area 301 is a full-screen area.
  • The second target area may specifically be one or a combination of the upper, lower, left, or right edge areas of the projection area 301, or it may be a part of any one of these edge areas, which is not limited in this application. It can be understood that the second target area and the first target area are usually set as two independent areas.
  • Hiding the title bar 303 allows the mobile phone screen content to be displayed in full screen on the computer, which improves the user's experience of watching the projected screen.
  • When needed, the title bar 303 can be called up quickly, which is convenient and fast.
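A minimal Kotlin sketch of the auto-hide behaviour for the navigation bar and the title bar in maximized-window mode, assuming simple rectangular target areas; the coordinate model is an assumption.

```kotlin
// Sketch of the auto-hide behaviour: in maximized-window mode the navigation
// bar and title bar are shown only while the pointer is inside their edge
// areas (the first and second target areas).

data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left..right && y in top..bottom
}

class MaximizedWindow(
    private val firstTargetArea: Rect,   // edge area corresponding to the navigation bar
    private val secondTargetArea: Rect   // edge area corresponding to the title bar
) {
    var navigationBarVisible = false
        private set
    var titleBarVisible = false
        private set

    fun onPointerMoved(x: Int, y: Int) {
        navigationBarVisible = firstTargetArea.contains(x, y)
        titleBarVisible = secondTargetArea.contains(x, y)
    }
}
```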
  • When the collaboration window 40 is a portrait window, the navigation bar 302 is located below the projection area 301 and is adjacent to the projection area 301.
  • This setting conforms to the user's habit of using the three-button navigation function in portrait mode, and can improve user experience.
  • the navigation bar 302 can also be set in other positions of the projection area 301, for example, above the projection area 301.
  • When the collaboration window 40 is a landscape window, the navigation bar 302 is located on the right side of the projection area 301 and is adjacent to the projection area 301.
  • This setting conforms to the user's habit of using the three-button navigation function in the landscape mode, and can improve the user experience.
  • the navigation bar 302 can also be set in other positions of the projection area 301, for example, to the left of the projection area 301.
  • When a zoom operation is performed on the collaboration window 40, the display device can scale the projection area 301 and the navigation bar 302 by the same ratio according to the zoom operation.
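A minimal Kotlin sketch of scaling the projection area and the navigation bar by the same factor during a zoom operation; the size representation is an assumption.

```kotlin
// Sketch of proportional zooming: both the projection area and the navigation
// bar are scaled by the same factor when the collaboration window is resized.

data class Size(val width: Int, val height: Int)

fun zoom(projection: Size, navigationBar: Size, factor: Double): Pair<Size, Size> {
    fun scale(s: Size) = Size((s.width * factor).toInt(), (s.height * factor).toInt())
    return scale(projection) to scale(navigationBar)
}
```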
  • the display device may enable multiple navigation functions at the same time, or may only enable one navigation function and prohibit other navigation functions.
  • each navigation function in the collaboration window can be implemented through multiple types of keyboard and mouse operations.
  • the navigation function of returning to the desktop is realized through two kinds of keyboard and mouse operations.
  • One type of mouse and keyboard operation is to click a desktop button, which is a virtual navigation button.
  • Another keyboard and mouse operation is to swipe up from the bottom left of the collaboration window.
  • the keyboard and mouse operations for other navigation functions can be set according to the actual situation.
  • The above method further includes: disabling, in the collaboration window, keyboard-and-mouse simulation of the navigation function corresponding to the target navigation function identifier.
  • That is, the display device can prevent keyboard-and-mouse operations in the collaboration window from simulating the aforementioned navigation function.
  • In this way, the display device provides only the three-key navigation function, which avoids, in the collaboration window, false touches caused by using keyboard-and-mouse operations to simulate gesture navigation or off-screen physical navigation.
  • The above method further includes: the display device sends a navigation-function modification instruction to the mobile phone according to the target navigation function identifier, and the mobile phone modifies the current navigation function to the on-screen three-key navigation function according to that instruction.
  • In this way, the navigation function of the mobile phone is changed to the on-screen three-key navigation function.
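A minimal Kotlin sketch of disabling, in the collaboration window, the keyboard-and-mouse simulation of the navigation function named by the target identifier; the identifier strings are placeholders.

```kotlin
// Sketch of disabling simulated navigation in the collaboration window so
// that only the three-key virtual navigation remains active.

class CollaborationInput {
    private val disabledSimulations = mutableSetOf<String>()

    fun disableSimulationFor(targetIdentifier: String) {
        // e.g. "GESTURE" or "OFF_SCREEN_PHYSICAL", as named by the identifier
        disabledSimulations += targetIdentifier
    }

    fun shouldForwardGesture(kind: String): Boolean = kind !in disabledSimulations
}
```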
  • The control method can also be applied to other electronic devices that use gesture navigation or off-screen physical navigation.
  • an embodiment of the display device provided by the present application includes:
  • The receiving module 401 is configured to receive the first screen content and the target navigation function identifier sent by the mobile phone, where the target navigation function identifier is used to identify the current navigation function on the mobile phone other than the three-key navigation function;
  • the display module 402 is configured to generate a collaboration window including a projection area and a navigation bar according to the target navigation function identifier.
  • the navigation bar includes three virtual navigation buttons; the first screen content is displayed in the projection area;
  • the input module 403 is used to receive mouse and keyboard operations on the virtual navigation buttons
  • the processing module 404 is configured to generate key commands according to the keyboard and mouse operations
  • the sending module 405 is configured to send the key command to the mobile phone, so that the mobile phone executes a navigation function according to the key command, and the navigation function is used to adjust the content of the first screen to the content of the second screen;
  • the receiving module 401 is also used to receive the second screen content sent by the mobile phone;
  • the display module 402 is also used to display the second screen content in the collaboration window.
  • the current navigation function includes a gesture navigation function and/or an off-screen physical navigation function, and the off-screen physical navigation function is implemented by a physical button.
  • The display module 402 is also configured to hide the navigation bar when the mode of the collaboration window is the maximized window mode and the pointer position is not in the first target area, where the first target area is the part of the edge area of the projection area that corresponds to the navigation bar; and to display the navigation bar in the first target area when the mode of the collaboration window is the maximized window mode and the pointer position is in the first target area.
  • the collaboration window further includes a title bar
  • The display module 402 is also used to hide the title bar when the mode of the collaboration window is the maximized window mode and the pointer position is not in the second target area, where the second target area is the part of the edge area of the projection area that corresponds to the title bar; and to display the title bar in the second target area when the mode of the collaboration window is the maximized window mode and the pointer position is in the second target area.
  • The processing module 404 is further configured to disable, in the collaboration window, keyboard-and-mouse simulation of the navigation function corresponding to the target navigation function identifier.
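For illustration, the module decomposition above could be expressed as the following hypothetical Kotlin interfaces; none of these names come from the patent.

```kotlin
// Illustrative interfaces for the display-device modules described above.

interface ReceivingModule { fun receiveFromPhone(): Pair<ByteArray, String> } // screen content + identifier
interface DisplayModule { fun render(projection: ByteArray, showNavBar: Boolean) }
interface InputModule { fun onVirtualKey(handler: (String) -> Unit) }
interface ProcessingModule { fun toKeyCommand(virtualKey: String): String }
interface SendingModule { fun sendToPhone(keyCommand: String) }
```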
  • an embodiment of the mobile phone provided by this application includes:
  • The sending module 501 is configured to send the first screen content and the target navigation function identifier to the display device, where the target navigation function identifier is used to identify the current navigation function on the mobile phone other than the three-key navigation function;
  • the receiving module 502 is configured to receive the key instruction sent by the display device, the key instruction is generated by the display device according to the operation of the mouse and keyboard;
  • the processing module 503 is configured to execute the navigation function corresponding to the key instruction according to the preset correspondence relationship, and the navigation function is used to adjust the content of the first screen to the content of the second screen;
  • the sending module 501 is also configured to send the second screen content to the display device, so that the display device displays the second screen content in the projection area of the collaboration window.
  • the current navigation function includes a gesture navigation function and/or an off-screen physical navigation function, and the off-screen physical navigation function is implemented by a physical button.
  • an embodiment of the screen projection system provided by the present application includes:
  • the display device 20 and the mobile phone 10, and the mobile phone 10 and the display device 20 are connected through a wireless link 30;
  • the display device 20 is configured to receive the first screen content and the status of the target navigation function sent by the mobile phone 10.
  • The target navigation function includes mobile phone navigation functions other than the three-key navigation function. When the target navigation function is in the on state, the display device generates a collaboration window including a projection area and a navigation bar, and displays the first screen content in the projection area, where the navigation bar includes three virtual navigation keys; receives a keyboard-and-mouse operation acting on a virtual navigation key; generates a key command according to the keyboard-and-mouse operation and sends the key command to the mobile phone, so that the mobile phone performs the navigation function according to the key command; and receives the second screen content sent by the mobile phone and displays the second screen content in the projection area;
  • The mobile phone 10 is configured to send the first screen content and the status of the target navigation function to the display device 20; when the status of the target navigation function is the on state, receive the key command sent by the display device; execute the navigation function according to the key command, where the navigation function is used to adjust the first screen content to the second screen content; and send the second screen content to the display device.
  • the wireless link may be a wireless fidelity (WiFi) link or a Bluetooth link.
  • the function of the display device 20 is the same as the display device in the embodiment shown in FIG. 4 or the alternative embodiment.
  • the function of the mobile phone 10 is the same as the mobile phone in the embodiment shown in FIG. 5 or in the alternative embodiment.
  • another embodiment of the display device 20 provided by the present application includes:
  • the input device 701, the display device 702, the memory 703, the processor 704, the transceiver 705, and the data interface 706; the input device 701, the display device 702, the memory 703, the processor 704, the transceiver 705, and the data interface 706 may be connected by a bus.
  • the input device 701 may be a keyboard or a mouse.
  • the display device 702 may be a display, a projector or other equipment for display.
  • the memory 703 may be a volatile memory or a non-volatile memory, or include both volatile and non-volatile memory.
  • the non-volatile memory can be read-only memory (ROM), programmable read-only memory (programmable ROM, PROM), erasable programmable read-only memory (erasable PROM, EPROM), and electrically available Erase programmable read-only memory (electrically EPROM, EEPROM) or flash memory.
  • the volatile memory may be random access memory (RAM), which is used as an external cache. It should be noted that the memories described herein are intended to include, but are not limited to, these and any other suitable types of memories.
  • the processor 704 can be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it can also be a digital signal processing (DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), field-programmable gate array (field-programmable gate array, FPGA) or other programmable logic devices, etc.
  • the processor 704 is used to implement the function of the display device in the above embodiment by calling the program code in the memory 703.
  • the transceiver 705 is used to receive and transmit data in wireless communication.
  • The device for implementing the receiving function in the transceiver 705 can be regarded as a receiver, and the device for implementing the sending function in the transceiver 705 can be regarded as a transmitter; that is, the transceiver 705 includes a receiver and a transmitter.
  • the transceiver 705 may also be referred to as a transceiver or a transceiver circuit or the like.
  • the receiver can sometimes be called a receiver or a receiving circuit.
  • the transmitter can sometimes be called a transmitter or a transmitting circuit.
  • the data interface 706 is connected to the mobile phone in a wired manner.
  • The display device 20 may include any number of input devices 701, display devices 702, memories 703, processors 704, transceivers 705, data interfaces 706, and the like, to implement the functions or operations performed by the display device 20 in the device embodiments of this application, and all devices that can implement this application fall within the protection scope of this application.
  • the display device 20 may also include a power source and the like.
  • the power supply is used to supply power to each component, and it can be logically connected to the processor 704 through a power management system, and functions such as charging, discharging, and power management are realized through the power management system.
  • As shown in FIG. 8, another embodiment of the mobile phone provided by the present application includes:
  • a radio frequency (RF) circuit 810, a memory 820, an input unit 830, a display unit 840, a Bluetooth module 850, an audio circuit 860, a WiFi module 870, a processor 880, a power supply 890, and other components.
  • the RF circuit 810 can be used for receiving and sending signals during the process of sending and receiving information or talking. In particular, after receiving the downlink information of the base station, it is processed by the processor 880; in addition, the designed uplink data is sent to the base station.
  • the RF circuit 810 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like.
  • the RF circuit 810 can also communicate with the network and other devices through wireless communication.
  • the above-mentioned wireless communication can use any communication standard or protocol, including but not limited to Global System of Mobile Communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (Code Division) Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), Email, Short Messaging Service (SMS), etc.
  • the memory 820 may be used to store software programs and modules.
  • the processor 880 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 820.
  • the memory 820 may mainly include a program storage area and a data storage area.
  • The program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book), and the like.
  • the memory 820 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • the input unit 830 may be used to receive inputted digital or character information, and generate key signal input related to user settings and function control of the mobile phone.
  • the input unit 830 may include a touch panel 831 and other input devices 832.
  • The touch panel 831, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 831 with a finger, a stylus, or any other suitable object or accessory), and drive the corresponding connection apparatus according to a preset program.
  • the touch panel 831 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and then sends them to the processor 880, and can receive and execute commands sent by the processor 880.
  • the touch panel 831 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the input unit 830 may also include other input devices 832.
  • other input devices 832 may include, but are not limited to, one or more of function keys (such as volume control keys, switch keys, etc.), trackballs, and joysticks.
  • the display unit 840 may be used to display information input by the user or information provided to the user and various menus of the mobile phone.
  • the display unit 840 may include a display panel 841.
  • the display panel 841 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), etc.
  • The touch panel 831 can cover the display panel 841. When the touch panel 831 detects a touch operation on or near it, it transmits the operation to the processor 880 to determine the type of the touch event, and the processor 880 then provides corresponding visual output on the display panel 841 according to the type of the touch event.
  • The touch panel 831 and the display panel 841 are used as two independent components to implement the input and output functions of the mobile phone, but in some embodiments, the touch panel 831 and the display panel 841 may be integrated to implement the input and output functions of the mobile phone.
  • the mobile phone may also include a Bluetooth module 850.
  • the audio circuit 860, the speaker 861, and the microphone 862 can provide an audio interface between the user and the mobile phone.
  • The audio circuit 860 can convert received audio data into an electrical signal and transmit it to the speaker 861, and the speaker 861 converts it into a sound signal for output; on the other hand, the microphone 862 converts a collected sound signal into an electrical signal, which the audio circuit 860 receives and converts into audio data; after the audio data is processed by the processor 880, it is sent, for example, to another mobile phone via the RF circuit 810, or output to the memory 820 for further processing.
  • WiFi is a short-distance wireless transmission technology.
  • the mobile phone can help users send and receive emails, browse web pages, and access streaming media through the WiFi module 870. It provides users with wireless broadband Internet access.
  • Although FIG. 8 shows the WiFi module 870, it is understandable that it is not a necessary component of the mobile phone and can be omitted as needed without changing the essence of the invention.
  • The processor 880 is the control center of the mobile phone. It connects all parts of the entire mobile phone through various interfaces and lines, and performs various functions of the mobile phone and processes data by running or executing software programs and/or modules stored in the memory 820 and calling data stored in the memory 820, thereby monitoring the mobile phone as a whole.
  • The processor 880 may include one or more processing units. Preferably, the processor 880 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 880.
  • the mobile phone also includes a power supply 890 (such as a battery) for supplying power to various components.
  • the power supply can be logically connected to the processor 880 through a power management system, so that functions such as charging, discharging, and power management can be managed through the power management system.
  • the mobile phone may also include a camera, a sensor, etc., which will not be repeated here.
  • the processor 880 included in the mobile phone can implement the functions of the mobile phone in the embodiment shown in FIG. 2 or an optional embodiment.
  • the present application provides a computer storage medium including instructions that, when run on a computing device, cause the computing device to execute the steps implemented by the display device in any one of the foregoing embodiments.
  • the present application also provides a computer storage medium including instructions that, when run on a computing device, cause the computing device to execute the steps implemented by the mobile phone in any one of the foregoing embodiments.
  • the present application also provides a computer program product that, when run on a computing device, causes the computing device to execute the steps implemented by the display device in any one of the foregoing embodiments.
  • the present application also provides a computer program product that, when run on a computing device, causes the computing device to execute the steps implemented by the mobile phone in any one of the foregoing embodiments.
  • the computer program product includes one or more computer instructions.
  • Computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • Computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line) or a wireless manner (for example, infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more available media.
  • the available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).

Abstract

一种应用于投屏场景的控制方法包括:接收手机发送的第一屏幕内容和目标导航功能标识;根据目标导航功能标识生成包括投屏区域和导航栏的协同窗口,导航栏包括三个虚拟导航按键,在投屏区域显示第一屏幕内容;接收作用于虚拟导航按键上的键鼠操作;根据键鼠操作生成按键指令,将按键指令发送给手机,使得手机根据按键指令执行导航功能,手机通过导航功能可以将第一屏幕内容调整为第二屏幕内容;显示设备接收手机发送的第二屏幕内容,在投屏区域显示第二屏幕内容。本申请还提供一种能实现以上控制方法的相关设备。

Description

一种应用于投屏场景的控制方法以及相关设备
本申请要求于2019年08月29日提交中国专利局、申请号为201910809113.9、申请名称为“一种应用于投屏场景的控制方法以及相关设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及计算机技术领域,尤其涉及一种应用于投屏场景的控制方法以及相关设备。
背景技术
随着计算机技术的发展,不同类型的设备(例如使用不同操作系统的设备)之间可以进行投屏,这样可以实现屏幕共享。
以包括手机和电脑的投屏场景为例,当手机投屏到电脑时,电脑可以呈现一个协同窗口。在协同窗口内可以显示手机屏幕内容,即在电脑上呈现手机镜像。用户在电脑的协同窗口中使用键鼠操作,可以控制手机。
但是,手机通常采用触摸屏作为输入设备,电脑采用键盘和鼠标作为输入设备。通过鼠标或键盘模拟触摸操作存在不足,影响用户通过电脑控制手机的体验。
发明内容
本申请提供一种应用于投屏场景的控制方法以及相关设备,在投屏场景下使用键鼠操作实现功能,以替代键鼠操作难以模拟实现的功能,提高用户的操作体验。
第一方面,本申请提供一种应用于投屏场景的控制方法。投屏场景可以包括手机和显示设备。当手机与显示设备建立连接之后,显示设备接收手机发送的第一屏幕内容和目标导航功能标识;根据目标导航功能标识生成包括投屏区域和导航栏的协同窗口,在投屏区域显示第一屏幕内容;接收作用于在导航栏上虚拟导航按键上的键鼠操作;根据键鼠操作生成按键指令,将按键指令发送给手机,使得手机根据按键指令执行导航功能,手机通过导航功能可以将第一屏幕内容调整为第二屏幕内容;显示设备接收手机发送的第二屏幕内容,在投屏区域显示第二屏幕内容。
其中,第一屏幕内容是指在手机与显示设备建立连接时在手机的屏幕上显示的内容。目标导航功能包括除了三键导航功能之外的手机导航功能。导航栏包括三个虚拟导航按键,三个虚拟导航按键分别对应不同的导航功能,例如三个虚拟导航按键分别为菜单键、桌面键和返回键,菜单键用于进入任务菜单,桌面键用于回到桌面,返回键用于返回上一级。可以理解的是,以上示出的功能是常用的导航功能,在实际应用中还可以将三个虚拟导航按键的功能设置为其他类型的导航功能,本申请不作限定。
本申请中,在手机使用除了三键导航方式之外的其他导航方式的情况下,显示设备在协同窗口设置导航栏,用户通过导航栏中的三个虚拟导航键能够对手机执行系统导航功能,使得手机屏幕内容和协同窗口内容能够同步更新,从而减少了用户使用其他导航方式控制 手机时容易发生误触的问题,由此改善了用户使用显示设备控制手机的体验。
在一种可能的实现方式中,当前导航功能包括手势导航功能和/或屏幕外物理导航功能,屏幕外物理导航功能由一个物理按键实现。
在另一种可能的实现方式中,上述方法还包括:当协同窗口的模式为最大化窗口模式且指针位置不处于第一目标区域时,隐藏导航栏,第一目标区域是在投屏区域的边缘区域中与导航栏对应的部分;当协同窗口的模式为最大化窗口模式且指针位置处于第一目标区域时,在第一目标区域显示导航栏。
在另一种可能的实现方式中,协同窗口还包括标题栏;当协同窗口的模式为最大化窗口模式且指针位置不处于第二目标区域时,隐藏标题栏,第二目标区域是在投屏区域的边缘区域中与标题栏对应的部分;当协同窗口的模式为最大化窗口模式且指针位置处于第二目标区域时,在第二目标区域显示标题栏。
在另一种可能的实现方式中,当协同窗口为竖屏窗口时,导航栏位于投屏区域的下方或上方,导航栏与投屏区域相邻;当协同窗口为横屏窗口时,导航栏位于投屏区域的右侧或左侧,导航栏与投屏区域相邻。
在另一种可能的实现方式中,上述方法还包括:在协同窗口中禁用键鼠操作模拟与目标导航功能标识对应的导航功能。
第二方面,本申请提供一种应用于投屏场景的控制方法。该方法包括:向显示设备发送第一屏幕内容和目标导航功能标识;接收显示设备发送的按键指令;根据预设对应关系执行与按键指令对应的导航功能,导航功能用于将第一屏幕内容调整为第二屏幕内容;手机将第二屏幕内容发送给显示设备,使得显示设备在协同窗口的投屏区域显示第二屏幕内容。
其中,按键指令是显示设备根据键鼠操作生成的。目标导航功能标识用于标识在手机上除了三键导航功能之外的当前导航功能。在一种可能的实现方式中,当前导航功能包括手势导航功能和/或屏幕外物理导航功能,屏幕外物理导航功能由一个物理按键实现。
第三方面,本申请提供一种显示装置,其具有的功能可以实现第一方面或任意一种实现方式中的控制方法。该功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与上述功能相对应的模块。
第四方面,本申请提供一种手机,其具有的功能可以实现第二方面或任意一种实现方式中的控制方法。该功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与上述功能相对应的模块。
第五方面,本申请提供一种投屏系统。该投屏系统包括如第三方面或其中任意一种实现方式提供的显示设备,和第四方面或其中任意一种实现方式提供的手机。
第六方面,本申请提供一种计算机可读存储介质,其存储有指令,当指令在计算机上运行时,使得计算机执行第一方面或第二方面的方法。
第七方面,本申请提供一种计算机程序产品,该计算机程序产品在计算机上运行时,使得计算机执行第一方面或第二方面的方法。
附图说明
图1为本申请中投屏场景的一个示意图;
图2为本申请中应用于投屏场景的控制方法的一个信令交互图;
图3A为本申请中协同窗口的一个示意图;
图3B为本申请中协同窗口为最大化竖屏窗口的一个示意图;
图3C为本申请中协同窗口为最大化横屏窗口的一个示意图;
图3D为本申请中协同窗口为最大化横屏窗口的另一个示意图;
图3E为本申请中协同窗口的另一个示意图;
图3F为本申请中协同窗口的另一个示意图;
图4为本申请中显示设备的一个结构示意图;
图5为本申请中手机的一个结构示意图;
图6为本申请中投屏系统的一个示意图;
图7为本申请中显示设备的另一个结构示意图;
图8为本申请中手机的另一个结构示意图。
具体实施方式
本申请涉及一种应用于投屏场景的控制方法。
图1为投屏场景的一个示意图。投屏场景包括手机10和显示设备20。手机10和显示设备20可以通过无线链路30连接。手机10和显示设备20也可以有线连接,例如数据线。在手机10和显示设备20建立连接之后,显示设备20可以根据手机10的屏幕显示内容生成协同窗口40。用户在协同窗口40的操作,可以同步更新手机10的屏幕内容和协同窗口40的内容。
显示设备20是指通过键盘和/或鼠标执行输入操作并且具有显示器的计算设备,例如台式电脑、笔记本电脑等。手机10也称为移动电话。在手机上的系统导航功能也称为系统导航方式。系统导航功能包括手势导航、屏幕外物理导航和三键导航等。
显示设备20以电脑为例,当手机10投屏到电脑之后,在手机10上使用手势导航的情况下,用户通过键鼠操作在电脑上难以准确模拟手机导航操作。例如,从手机屏幕的左下端向上滑动,则返回桌面。从手机屏幕的左下端向上滑动并停留一段时间,则进入任务菜单。从手机屏幕的最左侧向右滑动,则返回上一级。这样,用户在电脑上使用键鼠操作移动时,很容易错误启动导航功能。例如,用户想要进入任务菜单,将鼠标指针从协同窗口的左下端向上滑动时错误启动了返回桌面的功能。
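For illustration only, the gesture navigation rules described above can be sketched as a simple classifier; the coordinate convention (x/y ratios in [0, 1], y increasing downward) and the edge and hold thresholds are assumptions chosen only to show how easily nearby mouse movements map to the wrong navigation function, and are not part of the claimed embodiments.

```kotlin
// Illustrative classification of the gesture navigation rules described above.
data class Swipe(val startXRatio: Float, val startYRatio: Float, val dx: Float, val dy: Float, val holdMillis: Long)

fun classifyGesture(s: Swipe): String = when {
    s.startXRatio < 0.03f && s.dx > 0f                        -> "back"       // swipe right from the far left edge
    s.startYRatio > 0.95f && s.dy < 0f && s.holdMillis > 300  -> "task menu"  // swipe up from the bottom and hold
    s.startYRatio > 0.95f && s.dy < 0f                        -> "home"       // swipe up from the bottom
    else                                                      -> "none"
}

fun main() {
    // A mouse drag that starts slightly too close to the bottom edge returns "home" even when the
    // user only wanted to move the pointer, which is the mis-touch problem described above.
    println(classifyGesture(Swipe(startXRatio = 0.10f, startYRatio = 0.97f, dx = 0f, dy = -50f, holdMillis = 0)))
}
```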
或者,在手机10上使用屏幕外物理导航的情况下,按压物理按键一次,表示返回上一级,快速按压物理按键两次,表示返回桌面。在电脑上通过键鼠操作容易引发误触,例如用户想要启动返回桌面功能时,点击三次启动了返回上一级的导航功能。
以上可以看出,在手机使用手势导航,或者屏幕外物理导航的情况下,用户通过键鼠操作模拟上述导航方式时容易引发误触,由此难以准确控制手机进行系统导航,导致用户 操作体验不佳。
对于上述场景下存在的问题,本申请通过在电脑上提供三键导航虚拟按键来代替手势导航操作,这样对导航栏的三个虚拟导航按键执行键鼠操作,能够减少误触导航功能的情况,具有更好的准确性,从而提高用户通过其他设备控制手机的操作体验。
下面对上述投屏场景下的控制方法进行介绍。参阅图2,本申请提供的控制方法的一个实施例包括:
步骤201、显示设备接收手机发送的第一屏幕内容和目标导航功能标识。
本实施例中,手机可以将目标导航功能标识发送给显示设备。目标导航功能标识用于标识除了三键导航功能之外的当前导航功能。具体的,目标导航功能标识可以是用于标识当前导航功能的字符串,或者数字编号。在实际应用中,目标导航功能标识还可以通过其他方式进行表示,例如图片、符号、文字等等。当手机发送目标导航功能标识时,可以将目标导航功能标识携带于一个消息中。
当前导航功能包括但不限于:手势导航功能和/或屏幕外物理导航功能,屏幕外物理导航功能由一个物理按键实现。三键导航功能是指屏幕内三键导航功能或者屏幕外三键导航功能。
可选的,显示设备向手机发送查询指令,手机根据查询指令获取目标导航功能标识,然后将目标导航功能标识发送给显示设备。
需要说明的是,显示设备接收手机发送的第一屏幕内容和目标导航功能标识可以一起执行,也可以分开执行。
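As a non-limiting illustration of step 201, the Kotlin sketch below packages the first screen content together with the target navigation function identifier in one hypothetical message; the type names, enum values, and string identifiers are assumptions, and, as noted above, the screen content and the identifier may equally be sent separately.

```kotlin
// Hypothetical message sent from the mobile phone to the display device in step 201.
enum class NavigationMode { GESTURE, OFF_SCREEN_PHYSICAL_KEY, THREE_KEY_ON_SCREEN, THREE_KEY_OFF_SCREEN }

data class ProjectionStartMessage(
    val screenFrame: ByteArray,        // encoded first screen content
    val targetNavigationId: String?    // null when the phone already uses three-key navigation
)

fun buildStartMessage(frame: ByteArray, current: NavigationMode): ProjectionStartMessage =
    ProjectionStartMessage(
        screenFrame = frame,
        targetNavigationId = when (current) {
            NavigationMode.GESTURE -> "gesture"
            NavigationMode.OFF_SCREEN_PHYSICAL_KEY -> "physical-key"
            NavigationMode.THREE_KEY_ON_SCREEN, NavigationMode.THREE_KEY_OFF_SCREEN -> null
        }
    )
```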
步骤202、显示设备根据目标导航功能标识生成包括投屏区域和导航栏的协同窗口,在投屏区域显示第一屏幕内容。
当显示设备收到手机发送的目标导航功能标识时,显示设备可以确定目标导航功能标识对应的当前导航功能不是三键导航功能,即表明在电脑上通过键鼠操作模拟在手机上正在使用的导航功能可能引发误触。可选的,手机可以将全部导航功能的使用状态发送给显示设备,显示设备根据各导航功能的使用状态也可以确定手机上正在使用的导航功能。
投屏区域用于根据手机屏幕的内容进行显示。
导航栏包括三个虚拟导航按键。三个虚拟导航按键分别对应不同的导航功能,例如三个虚拟导航按键分别为菜单键、桌面键和返回键。菜单键用于进入任务菜单,桌面键用于回到桌面,返回键用于返回上一级。可以理解的是,以上示出的功能是常用的导航功能,在实际应用中还可以将三个虚拟导航按键的功能设置为其他类型的导航功能,本申请不作限定。
需要说明的是,显示设备在协同窗口中配置投屏区域和导航栏可以在一起执行,也可以分开执行。例如,当显示设备收到手机发送的第一屏幕内容时,显示设备在协同窗口中生成投屏区域。当显示设备收到目标导航功能标识时,显示设备在协同窗口中生成导航栏。
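A minimal sketch of step 202 on the display device side follows; the CollaborationWindow and NavigationKey names are illustrative assumptions, and the point shown is only that the projection area always mirrors the received screen content while the navigation bar with its three virtual keys is added when a target navigation function identifier was received.

```kotlin
// Minimal sketch of step 202: build the collaboration window from the received data.
enum class NavigationKey { BACK, HOME, MENU }

data class CollaborationWindow(
    val projectionArea: ByteArray,           // shows the first screen content
    val navigationBar: List<NavigationKey>?  // null when no navigation bar is needed
)

fun buildCollaborationWindow(screenContent: ByteArray, targetNavigationId: String?): CollaborationWindow =
    CollaborationWindow(
        projectionArea = screenContent,
        navigationBar = targetNavigationId?.let { listOf(NavigationKey.BACK, NavigationKey.HOME, NavigationKey.MENU) }
    )
```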
步骤203、显示设备接收作用于虚拟导航按键上的键鼠操作。
键鼠操作是指使用键盘输入的操作,和/或,使用鼠标输入的操作。例如,当鼠标指针选中虚拟按键时,单击虚拟导航按键。键鼠操作不限于以上举例。
步骤204、显示设备根据键鼠操作生成按键指令。
对虚拟导航按键执行键鼠操作时,显示设备可以生成与虚拟导航按键对应的按键指令。例如,若虚拟导航按键为菜单键,则生成与菜单键对应的按键指令。若虚拟导航按键为桌面键,则生成与桌面键对应的按键指令。若虚拟导航按键为返回键,则生成与返回键对应的按键指令。
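Step 204 can be pictured as a simple mapping from the clicked virtual navigation key to a key instruction, as in the sketch below; the instruction codes are illustrative assumptions, and any encoding agreed between the two devices would serve.

```kotlin
// Step 204 as a mapping: a click on a virtual navigation key becomes a key instruction.
enum class NavigationKey { MENU, HOME, BACK }
enum class KeyInstruction(val code: Int) { ENTER_TASK_MENU(1), GO_TO_HOME(2), GO_BACK(3) }

fun toKeyInstruction(clickedKey: NavigationKey): KeyInstruction = when (clickedKey) {
    NavigationKey.MENU -> KeyInstruction.ENTER_TASK_MENU
    NavigationKey.HOME -> KeyInstruction.GO_TO_HOME
    NavigationKey.BACK -> KeyInstruction.GO_BACK
}
```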
步骤205、显示设备将按键指令发送给手机。
步骤206、手机根据预设对应关系执行与按键指令对应的导航功能,将第一屏幕内容调整为第二屏幕内容。
预设对应关系是指来自显示设备的按键指令与手机导航功能之间的对应关系。当按键指令与菜单键对应时,在手机上执行导航功能为进入任务菜单。当按键指令与桌面键对应时,在手机上执行导航功能为回到桌面。当按键指令与返回键对应时,在手机上执行导航功能为返回上一级。可以理解,电脑上的虚拟导航按键的功能与手机的系统导航功能是一一对应的。
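A phone-side sketch of step 206, assuming a hypothetical SystemNavigator interface standing in for the phone's real navigation service, shows the preset one-to-one correspondence between received key instructions and system navigation functions.

```kotlin
// Phone-side sketch of step 206: map each received key instruction to a navigation function.
enum class KeyInstruction { ENTER_TASK_MENU, GO_TO_HOME, GO_BACK }

interface SystemNavigator {
    fun enterTaskMenu()
    fun goToHomeScreen()
    fun goBackOneLevel()
}

fun executeNavigation(instruction: KeyInstruction, navigator: SystemNavigator) = when (instruction) {
    KeyInstruction.ENTER_TASK_MENU -> navigator.enterTaskMenu()
    KeyInstruction.GO_TO_HOME -> navigator.goToHomeScreen()
    KeyInstruction.GO_BACK -> navigator.goBackOneLevel()
}
```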
步骤207、显示设备接收手机发送的第二屏幕内容。
步骤208、显示设备在投屏区域显示第二屏幕内容。
本实施例中,在手机使用除了三键导航功能之外的其他导航功能的情况下,显示设备在协同窗口设置导航栏,用户通过导航栏中的三个虚拟导航键对手机执行系统导航功能,使得手机屏幕内容和协同窗口内容能够同步更新,从而减少了用户使用其他导航方式控制手机时容易发生误触的问题,由此改善了用户使用显示设备控制手机的体验。
参阅图3A,本申请的协同窗口40可以包括投屏区域301、导航栏302和标题栏303。导航栏302和标题栏303根据实际需要可以隐藏。
导航栏302包括三个虚拟导航按键,分别为第一虚拟导航按键3021、第二虚拟导航按键3022和第三虚拟导航按键3023。可选的,第一虚拟导航按键3021、第二虚拟导航按键3022和第三虚拟导航按键3023分别为:返回键、桌面键和菜单键。虚拟导航按键的功能和虚拟导航按键之间的顺序可以根据实际需要进行调整。
导航栏302可以设置在投屏区域301之外,也可以设置在投屏区域301之内。可选的,当协同窗口40为最大化窗口时,导航栏302设置在投屏区域301之内;当协同窗口40不是最大化窗口时,导航栏302设置在投屏区域301之外。
标题栏303包括最小化窗口按钮、最大化窗口按钮和关闭窗口按钮。另外,标题栏303还可以包括窗口名称、方向锁定按钮等,在此不作限定。
下面对协同窗口40为最大化窗口且导航栏302设置在投屏区域301之内进行介绍:
在另一个可选实施例中,当协同窗口40的模式为最大化窗口模式且指针位置不处于第一目标区域时,隐藏导航栏302,第一目标区域是在投屏区域301的边缘区域中与导航栏302对应的部分;当协同窗口40的模式为最大化窗口模式且指针位置处于第一目标区域时,在第一目标区域显示导航栏302。
其中,第一目标区域具体可以是投屏区域301的上侧边缘区域、下侧边缘区域、左侧边缘区域或右侧边缘区域中的一个或组合,也可以是以上任意一个边缘区域的部分,本申 请不作限定。
参阅图3B,一种情况下,协同窗口40为最大化的竖屏窗口,第一目标区域为投屏区域301的下侧边缘。当指针处于投屏区域301的下侧边缘时,在投屏区域301的下侧边缘显示导航栏302。当指针不处于投屏区域301的下侧边缘时,隐藏导航栏302。隐藏导航栏302方式可以是但不限于:导航栏302从投屏区域301的下侧边缘向下移出消失,或者导航栏302直接消失。
参阅图3C,在另一种情况下,协同窗口40为最大化的横屏窗口,第一目标区域为投屏区域301的右侧边缘。在协同窗口40为最大化的横屏窗口的情况下,投屏区域301可以为全屏区域。当指针处于投屏区域301的右侧边缘时,在投屏区域301的右侧边缘显示导航栏302。当指针不处于投屏区域301的右侧边缘时,隐藏导航栏302。隐藏导航栏302方式可以是但不限于:导航栏302从投屏区域301的右侧边缘向右移出消失,或者导航栏302直接消失。
当导航栏302在投屏区域301之内的第一目标区域内显示时,导航栏302会遮挡一部分投屏区域301的显示内容。导航栏302的背景可以设置为透明,这样减少对投屏区域301的遮挡。
当协同窗口40的模式为最大化窗口模式且指针位置不处于第一目标区域时,隐藏导航栏302可以全屏显示手机屏幕内容。这样提供了一种在电脑上全屏显示手机屏幕内容的方式,能够提高用户观看投屏的体验。并且提供了一种可隐藏或展开的导航栏,当用户想要进行系统导航时,能够快速唤出导航栏,具有方便快捷的优点。
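The first-target-region rule can be summarized by the following sketch, assuming hypothetical Pointer and Region types: in maximized window mode the navigation bar is drawn only while the pointer is inside the edge region of the projection area reserved for it.

```kotlin
// Sketch of the first-target-region rule for showing or hiding the navigation bar.
data class Pointer(val x: Float, val y: Float)
data class Region(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: Pointer) = p.x in left..right && p.y in top..bottom
}

fun navigationBarVisible(isMaximized: Boolean, pointer: Pointer, firstTargetRegion: Region): Boolean =
    if (!isMaximized) true          // outside maximized mode the bar sits next to the projection area
    else firstTargetRegion.contains(pointer)
```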
参阅图3D,在另一个可选实施例中,上述方法还包括:当协同窗口40的模式为最大化窗口模式且指针位置不处于第二目标区域时,隐藏标题栏303,第二目标区域是在投屏区域301的边缘区域中与标题栏303对应的部分;当协同窗口40的模式为最大化窗口模式且指针位置处于第二目标区域时,在第二目标区域显示标题栏303。
本实施例中,协同窗口40还包括标题栏303。一种情况下,协同窗口40为最大化的竖屏窗口。另一种情况下,协同窗口40为最大化的横屏窗口,此时投屏区域301为全屏区域。第二目标区域具体可以是投屏区域301的上侧边缘区域、下侧边缘区域、左侧边缘区域或右侧边缘区域中的一个或组合,也可以是以上任意一个边缘区域的部分,本申请不作限定。可以理解的是,第二目标区域与第一目标区域通常设置为两个独立的区域。
当协同窗口40的模式为最大化窗口模式且指针位置不处于第二目标区域时,隐藏标题栏303可以全屏显示手机屏幕内容。这样提供了一种在电脑上全屏显示手机屏幕内容的方式,能够提高用户观看投屏的体验。当用户想要进行查看标题栏303或者调整协同窗口40时,能够快速唤出标题栏303,具有方便快捷的优点。
下面对导航栏302设置在投屏区域301之外的情况进行介绍:
参阅图3E,在另一个可选实施例中,当协同窗口40为竖屏窗口时,导航栏302位于投屏区域301的下方,导航栏302与投屏区域301相邻。这样设置符合用户在竖屏模式下使用三键导航功能的习惯,能够提高用户体验。当协同窗口40为竖屏窗口时,导航栏302还可以设置在投屏区域301的其他方位,例如投屏区域301的上方。
参阅图3F,在另一个可选实施例中,当协同窗口40为横屏窗口时,导航栏302位于投屏区域301的右侧,导航栏302与投屏区域301相邻。这样设置符合用户在横屏模式下使用三键导航功能的习惯,能够提高用户体验。当协同窗口40为横屏窗口时,导航栏302还可以设置在投屏区域301的其他方位,例如投屏区域301的左侧。
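The default placement rule described for portrait and landscape windows could be expressed as follows; the enum names are illustrative, and as noted above other placements (above, or on the left) remain possible.

```kotlin
// Illustrative default placement of the navigation bar relative to the projection area.
enum class WindowOrientation { PORTRAIT, LANDSCAPE }
enum class BarPlacement { BELOW, ABOVE, RIGHT, LEFT }

fun defaultNavigationBarPlacement(orientation: WindowOrientation): BarPlacement = when (orientation) {
    WindowOrientation.PORTRAIT -> BarPlacement.BELOW    // matches three-key habits in portrait mode
    WindowOrientation.LANDSCAPE -> BarPlacement.RIGHT   // matches three-key habits in landscape mode
}
```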
需要说明的是,对协同窗口40进行缩放操作时,显示设备根据缩放操作可以按照相同的比例将投屏区域301和导航栏302进行缩放。
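A rough sketch of the proportional scaling just described, with illustrative Size and WindowLayout types, is given below; the only point is that the projection area and the navigation bar are scaled by the same factor so their relative layout is preserved.

```kotlin
// Scale the projection area and the navigation bar by the same factor when the window is resized.
data class Size(val width: Float, val height: Float)
data class WindowLayout(val projectionArea: Size, val navigationBar: Size)

fun scale(layout: WindowLayout, factor: Float): WindowLayout = WindowLayout(
    projectionArea = Size(layout.projectionArea.width * factor, layout.projectionArea.height * factor),
    navigationBar = Size(layout.navigationBar.width * factor, layout.navigationBar.height * factor)
)
```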
在本申请的协同窗口40中,显示设备可以同时启用多种导航功能,也可以只启用一种导航功能且禁止其他导航功能。
在一个可选实施例中,在协同窗口中的每个导航功能可以通过多种类型的键鼠操作实现。
例如,对于返回桌面的导航功能通过两种键鼠操作实现。一种键鼠操作是单击桌面键,该桌面键为虚拟导航按键。另一种键鼠操作是从协同窗口的左下端向上滑动。依此类推,可以根据实际情况设置用于实现其他导航功能的键鼠操作。
当显示设备启用多种导航功能时,用户可以根据需求选择键鼠操作实现导航功能,提供了实施的灵活性。
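As an illustration of enabling several key-mouse operations for the same navigation function, the sketch below lets both a click on the virtual home key and a swipe up from the lower-left corner resolve to "go to home"; the 300 ms hold threshold and all type names are assumptions for illustration only.

```kotlin
// Several key-mouse operations may resolve to the same navigation function.
enum class NavigationFunction { ENTER_TASK_MENU, GO_TO_HOME, GO_BACK }

sealed interface KeyMouseOperation
data class ClickVirtualKey(val keyName: String) : KeyMouseOperation
data class SwipeUpFromLowerLeft(val holdMillis: Long = 0) : KeyMouseOperation

fun resolve(op: KeyMouseOperation): NavigationFunction? = when (op) {
    is ClickVirtualKey -> when (op.keyName) {
        "menu" -> NavigationFunction.ENTER_TASK_MENU
        "home" -> NavigationFunction.GO_TO_HOME
        "back" -> NavigationFunction.GO_BACK
        else -> null
    }
    is SwipeUpFromLowerLeft ->
        if (op.holdMillis > 300) NavigationFunction.ENTER_TASK_MENU else NavigationFunction.GO_TO_HOME
}
```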
在另一个可选实施例中,上述方法还包括:在协同窗口中禁用键鼠操作模拟与目标导航功能标识对应的导航功能。
本实施例中,当手机采用手势导航或屏幕外物理导航时,显示设备可以在协同窗口中禁用键鼠操作模拟上述导航功能。显示设备仅提供三键导航功能,这样在协同窗口中能够避免使用键鼠操作模拟手势导航或模拟屏幕外物理导航引发的误触。
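One possible, purely illustrative way to picture this disabling is shown below: edge gestures made with the mouse are simply not forwarded as navigation commands, so only the three virtual navigation keys can drive system navigation; the class and field names are assumptions.

```kotlin
// Sketch: do not translate mouse edge gestures into navigation commands when simulation is disabled.
data class PointerGesture(val startsAtWindowEdge: Boolean, val dx: Float, val dy: Float)

class CollaborationWindowInput(private val simulateGestureNavigation: Boolean) {
    // Returns true only when gesture simulation is still enabled for this window.
    fun shouldForwardAsNavigation(gesture: PointerGesture): Boolean =
        simulateGestureNavigation && gesture.startsAtWindowEdge
}
```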
在另一个可选实施例中,上述方法还包括:显示设备根据目标导航功能标识向手机发送修改导航功能指令,手机根据该修改导航功能指令将当前导航功能修改为屏幕内三键导航功能。本实施例中,当手机使用其他导航功能的情况下,将手机的导航功能修改为屏幕内三键导航功能。在投屏时,手机和显示设备上都显示屏幕内三键导航栏,在显示设备上通过屏幕内三键导航栏可以对手机进行控制。
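The modify-navigation-function instruction of this optional embodiment might be sketched as follows; the instruction format, the mode string, and the NavigationSettings interface are hypothetical stand-ins for whatever settings mechanism the phone actually exposes.

```kotlin
// Hypothetical instruction asking the phone to switch to on-screen three-key navigation.
data class ModifyNavigationInstruction(val targetMode: String = "three-key-on-screen")

interface NavigationSettings {               // stands in for the phone's settings service
    fun setMode(mode: String)
}

fun applyModifyInstruction(instruction: ModifyNavigationInstruction, settings: NavigationSettings) {
    settings.setMode(instruction.targetMode) // the phone then shows its own on-screen three-key bar
}
```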
需要说明的是,除了手机之外,上述控制方法还可以应用于其他通过手势导航或者屏幕外物理导航的电子设备。
以上对本申请的控制方法进行了介绍,下面对本申请用于实现以上控制方法的相关装置进行介绍。参阅图4,本申请提供的显示装置的一个实施例包括:
接收模块401,用于接收手机发送的第一屏幕内容和目标导航功能标识,目标导航功能标识用于标识在手机上除了三键导航功能之外的当前导航功能;
显示模块402,用于根据目标导航功能标识生成包括投屏区域和导航栏的协同窗口,导航栏包括三个虚拟导航按键;在投屏区域显示第一屏幕内容;
输入模块403,用于接收作用于虚拟导航按键上的键鼠操作;
处理模块404,用于根据键鼠操作生成按键指令;
发送模块405,用于将按键指令发送给手机,使得手机根据按键指令执行导航功能,导航功能用于将第一屏幕内容调整为第二屏幕内容;
接收模块401,还用于接收手机发送的第二屏幕内容;
显示模块402,还用于在协同窗口显示第二屏幕内容。
在一个可选实施例中,当前导航功能包括手势导航功能和/或屏幕外物理导航功能,屏幕外物理导航功能由一个物理按键实现。
在另一个可选实施例中,
显示模块402,还用于当协同窗口的模式为最大化窗口模式且指针位置不处于第一目标区域时,隐藏导航栏,第一目标区域是在投屏区域的边缘区域中与导航栏对应的部分;当协同窗口的模式为最大化窗口模式且指针位置处于第一目标区域时,在第一目标区域显示导航栏。
在另一个可选实施例中,协同窗口还包括标题栏;
显示模块402,还用于当协同窗口的模式为最大化窗口模式且指针位置不处于第二目标区域时,隐藏标题栏,第二目标区域是在投屏区域的边缘区域中与标题栏对应的部分;当协同窗口的模式为最大化窗口模式且指针位置处于第二目标区域时,在第二目标区域显示标题栏。
在另一个可选实施例中,处理模块404,还用于在协同窗口中禁用键鼠操作模拟与目标导航功能标识对应的导航功能。
参阅图5,本申请提供的手机的一个实施例包括:
发送模块501,用于向显示设备发送第一屏幕内容和目标导航功能标识,目标导航功能标识用于标识在手机上除了三键导航功能之外的当前导航功能;
接收模块502,用于接收显示设备发送的按键指令,按键指令是显示设备根据键鼠操作生成的;
处理模块503,用于根据预设对应关系执行与按键指令对应的导航功能,导航功能用于将第一屏幕内容调整为第二屏幕内容;
发送模块501,还用于将第二屏幕内容发送给显示设备,使得显示设备在协同窗口的投屏区域中显示第二屏幕内容。
在一个可选实施例中,当前导航功能包括手势导航功能和/或屏幕外物理导航功能,屏幕外物理导航功能由一个物理按键实现。
参阅图6,本申请提供的投屏系统的一个实施例包括:
显示设备20和手机10,手机10与显示设备20通过无线链路30连接;
显示设备20,用于接收手机10发送的第一屏幕内容和目标导航功能的状态,目标导航功能包括除了三键导航功能之外的手机导航功能;当目标导航功能的状态为开启状态时,生成包括投屏区域和导航栏的协同窗口,在投屏区域显示第一屏幕内容;导航栏包括三个虚拟导航按键;接收作用于虚拟导航按键上的键鼠操作;根据键鼠操作生成按键指令,将按键指令发送给手机,使得手机根据按键指令执行导航功能;接收手机发送的第二屏幕内容,在投屏区域显示第二屏幕内容;
手机10,用于向显示设备20发送第一屏幕内容和目标导航功能的状态;当目标导航功能的状态处于开启状态时,接收显示设备发送的按键指令;根据按键指令执行导航功能, 导航功能用于将第一屏幕内容调整为第二屏幕内容;将第二屏幕内容发送给显示设备。
具体的,无线链路可以是无线保真(wireless fidelity,WiFi)链路或蓝牙链路等。显示设备20的功能与图4所示实施例或可选实施例中的显示设备相同。手机10的功能与图5所示实施例或可选实施例中的手机相同。
参阅图7,本申请提供的显示设备20的另一个实施例包括:
输入装置701、显示装置702、存储器703、处理器704、收发器705和数据接口706;输入装置701、显示装置702、存储器703、处理器704、收发器705和数据接口706可以通过总线连接。
输入装置701可以是键盘或鼠标。
显示装置702可以是显示器,投影仪或其他用于显示的设备。
存储器703可以是易失性存储器或非易失性存储器,或包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。应注意,本文描述的存储器旨在包括但不限于这些和任意其它适合类型的存储器。
处理器704可以是通用处理器,包括中央处理器(central processing unit,CPU)、网络处理器(network processor,NP)等;还可以是数字信号处理器(digital signal processing,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现场可编程门阵列(field-programmable gate array,FPGA)或者其他可编程逻辑器件等。处理器704通过调用存储器703中的程序代码,用于实现以上实施例中显示设备的功能。
收发器705用于在无线通信中接收和发送数据。可以将收发器705中用于实现接收功能的器件视为接收器,将收发器705中用于实现发送功能的器件视为发送器,即收发器701包括接收器和发送器。收发器705也可以称为收发机或收发电路等。接收器有时也可以称为接收机或接收电路等。发送器有时也可以称为发射机或者发射电路等。
数据接口706通过有线方式与手机进行连接。
本实施例中给出的结构图仅示出了显示设备20的简化设计。在实际应用中,显示设备20可以包含任意数量的输入装置701、显示装置702、存储器703、处理器704、收发器705和数据接口706等,以实现本申请各装置实施例中显示设备20所执行的功能或操作,而所有可以实现本申请的装置都在本申请的保护范围之内。尽管未示出,显示设备20还可以包括电源等。电源用于为各组件供电,其可以通过电源管理系统与处理器704逻辑连接,通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
参阅图8,本申请提供的手机的另一个实施例包括:
射频(Radio Frequency,RF)电路810、存储器820、输入单元830、显示单元840、蓝牙模块850、音频电路860、WiFi模块870、处理器880以及电源890等部件。本领域技术人员可以理解,图8中示出的手机结构并不构成对手机的限定,可以包括比图示更多或 更少的部件,或者组合某些部件,或者不同的部件布置。
下面结合图8对手机的各个构成部件进行具体的介绍:
RF电路810可用于收发信息或通话过程中,信号的接收和发送,特别地,将基站的下行信息接收后,给处理器880处理;另外,将设计上行的数据发送给基站。通常,RF电路810包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器(Low Noise Amplifier,LNA)、双工器等。此外,RF电路810还可以通过无线通信与网络和其他设备通信。上述无线通信可以使用任一通信标准或协议,包括但不限于全球移动通讯系统(Global System of Mobile communication,GSM)、通用分组无线服务(General Packet Radio Service,GPRS)、码分多址(Code Division Multiple Access,CDMA)、宽带码分多址(Wideband Code Division Multiple Access,WCDMA)、长期演进(Long Term Evolution,LTE)、电子邮件、短消息服务(Short Messaging Service,SMS)等。
存储器820可用于存储软件程序以及模块,处理器880通过运行存储在存储器820的软件程序以及模块,从而执行手机的各种功能应用以及数据处理。存储器820可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器820可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
输入单元830可用于接收输入的数字或字符信息,以及产生与手机的用户设置以及功能控制有关的键信号输入。具体地,输入单元830可包括触控面板831以及其他输入设备832。触控面板831,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板831上或在触控面板831附近的操作),并根据预先设定的程式驱动相应的连接装置。可选的,触控面板831可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器880,并能接收处理器880发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板831。除了触控面板831,输入单元830还可以包括其他输入设备832。具体地,其他输入设备832可以包括但不限于功能键(比如音量控制按键、开关按键等)、轨迹球、操作杆等中的一种或多种。
显示单元840可用于显示由用户输入的信息或提供给用户的信息以及手机的各种菜单。显示单元840可包括显示面板841,可选的,可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板841。进一步的,触控面板831可覆盖显示面板841,当触控面板831检测到在其上或附近的触摸操作后,传送给处理器880以确定触摸事件的类型,随后处理器880根据触摸事件的类型在显示面板841上提供相应的视觉输出。虽然在图8中,触控面板831与显示面板841是作为两个独立的部件来实现手机的输入和输入功能,但是在某些实施例中,可以将触控面板831与显示面板841集成而实现手机的输入和输出功能。
手机还可以包括蓝牙模块850。
音频电路860、扬声器861,传声器862可提供用户与手机之间的音频接口。音频电路860可将接收到的音频数据转换后的电信号,传输到扬声器861,由扬声器861转换为声音信号输出;另一方面,传声器862将收集的声音信号转换为电信号,由音频电路860接收后转换为音频数据,再将音频数据输出处理器880处理后,经RF电路810以发送给比如另一手机,或者将音频数据输出至存储器820以便进一步处理。
WiFi属于短距离无线传输技术,手机通过WiFi模块870可以帮助用户收发电子邮件、浏览网页和访问流式媒体等,它为用户提供了无线的宽带互联网访问。虽然图8示出了WiFi模块870,但是可以理解的是,其并不属于手机的必须构成,完全可以根据需要在不改变发明的本质的范围内而省略。
处理器880是手机的控制中心,利用各种接口和线路连接整个手机的各个部分,通过运行或执行存储在存储器820内的软件程序和/或模块,以及调用存储在存储器820内的数据,执行手机的各种功能和处理数据,从而对手机进行整体监控。可选的,处理器880可包括一个或多个处理单元;优选的,处理器880可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器880中。
手机还包括给各个部件供电的电源890(比如电池),优选的,电源可以通过电源管理系统与处理器880逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
尽管未示出,手机还可以包括摄像头、传感器等,在此不再赘述。
在本发明实施例中,通过调用存储器820存储的程序,该手机所包括的处理器880能够实现图2所示实施例或可选实施例中手机的功能。
在本申请中,“多个”是指两个或两个以上,其他量词与之类似。“和/或”表示可以存在三种关系,例如,A和/或B可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。应理解,在本发明的各种实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本发明实施例的实施过程构成任何限定。
本申请提供一种计算机存储介质,包括指令,当指令在计算设备上运行时,使得计算设备执行以上任意一个实施例中显示设备所实施的步骤。
本申请还提供一种计算机存储介质,包括指令,当指令在计算设备上运行时,使得计算设备执行以上任意一个实施例中手机所实施的步骤。
本申请还提供一种计算机程序产品,当其在计算设备上运行时,使得计算设备执行以上任意一个实施例中显示设备所实施的步骤。
本申请还提供一种计算机程序产品,当其在计算设备上运行时,使得计算设备执行以上任意一个实施例中手机所实施的步骤。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。
计算机程序产品包括一个或多个计算机指令。在显示设备或手机上加载和执行计算机程序指令时,全部或部分地产生按照本申请的流程或功能。计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一计算机可读存储介质传输,例如,计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。可用介质可以是磁性介质,(例如软盘、硬盘、磁带)、光介质(例如DVD)、或者半导体介质(例如固态硬盘(Solid State Disk,SSD))等。
以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例的技术方案的范围。

Claims (17)

  1. 一种应用于投屏场景的控制方法,其特征在于,包括:
    接收手机发送的第一屏幕内容和目标导航功能标识,所述目标导航功能标识用于标识在手机上除了三键导航功能之外的当前导航功能;
    根据所述目标导航功能标识生成包括投屏区域和导航栏的协同窗口,所述导航栏包括三个虚拟导航按键;
    在所述投屏区域显示所述第一屏幕内容;
    接收作用于所述虚拟导航按键上的键鼠操作;
    根据所述键鼠操作生成按键指令,将所述按键指令发送给所述手机,使得所述手机根据所述按键指令执行导航功能,所述导航功能用于将所述第一屏幕内容调整为第二屏幕内容;
    接收所述手机发送的第二屏幕内容;
    在所述投屏区域显示所述第二屏幕内容。
  2. 根据权利要求1所述的方法,其特征在于,所述当前导航功能包括手势导航功能和/或屏幕外物理导航功能,所述屏幕外物理导航功能由一个物理按键实现。
  3. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    当所述协同窗口的模式为最大化窗口模式且指针位置不处于第一目标区域时,隐藏所述导航栏,所述第一目标区域是在所述投屏区域的边缘区域中与所述导航栏对应的部分;
    当所述协同窗口的模式为最大化窗口模式且所述指针位置处于所述第一目标区域时,在所述第一目标区域显示所述导航栏。
  4. 根据权利要求1至3中任一项所述的方法,其特征在于,所述协同窗口还包括标题栏;
    当所述协同窗口的模式为最大化窗口模式且指针位置不处于第二目标区域时,隐藏所述标题栏,所述第二目标区域是在所述投屏区域的边缘区域中与所述标题栏对应的部分;
    当所述协同窗口的模式为最大化窗口模式且所述指针位置处于第二目标区域时,在所述第二目标区域显示所述标题栏。
  5. 根据权利要求1或2所述的方法,其特征在于,
    当所述协同窗口为竖屏窗口时,所述导航栏位于所述投屏区域的下方或上方,所述导航栏与所述投屏区域相邻;
    当所述协同窗口为横屏窗口时,所述导航栏位于所述投屏区域的右侧或左侧。
  6. 根据权利要求1至3中任一项所述的方法,其特征在于,所述方法还包括:
    在所述协同窗口中禁用键鼠操作模拟与所述目标导航功能标识对应的导航功能。
  7. 一种控制方法,其特征在于,包括:
    向显示设备发送第一屏幕内容和目标导航功能标识,所述目标导航功能标识用于标识在手机上除了三键导航功能之外的当前导航功能;
    接收所述显示设备发送的按键指令,所述按键指令是所述显示设备根据键鼠操作生成的;
    根据预设对应关系执行与所述按键指令对应的导航功能,所述导航功能用于将所述第一屏幕内容调整为第二屏幕内容;
    将所述第二屏幕内容发送给所述显示设备,使得所述显示设备在协同窗口的投屏区域显示所述第二屏幕内容。
  8. 根据权利要求7所述的方法,其特征在于,所述当前导航功能包括手势导航功能和/或屏幕外物理导航功能,所述屏幕外物理导航功能由一个物理按键实现。
  9. 一种显示设备,其特征在于,包括:
    接收模块,用于接收手机发送的第一屏幕内容和目标导航功能标识,所述目标导航功能标识用于标识在手机上除了三键导航功能之外的当前导航功能;
    显示模块,用于根据所述目标导航功能标识生成包括投屏区域和导航栏的协同窗口,所述导航栏包括三个虚拟导航按键;在所述投屏区域显示所述第一屏幕内容;
    输入模块,用于接收作用于所述虚拟导航按键上的键鼠操作;
    处理模块,用于根据所述键鼠操作生成按键指令;
    发送模块,用于将所述按键指令发送给所述手机,使得所述手机根据所述按键指令执行导航功能,所述导航功能用于将所述第一屏幕内容调整为第二屏幕内容;
    所述接收模块,还用于接收所述手机发送的所述第二屏幕内容;
    所述显示模块,还用于在所述投屏区域显示所述第二屏幕内容。
  10. 根据权利要求9所述的显示设备,其特征在于,所述当前导航功能包括手势导航功能和/或屏幕外物理导航功能,所述屏幕外物理导航功能由一个物理按键实现。
  11. 根据权利要求9所述的显示设备,其特征在于,
    所述显示模块,还用于当所述协同窗口的模式为最大化窗口模式且指针位置不处于第一目标区域时,隐藏所述导航栏,所述第一目标区域是在所述投屏区域的边缘区域中与所述导航栏对应的部分;当所述协同窗口的模式为最大化窗口模式且所述指针位置处于所述第一目标区域时,在所述第一目标区域显示所述导航栏。
  12. 根据权利要求9至11中任一项所述的显示设备,其特征在于,所述协同窗口还包括标题栏;
    所述显示模块,还用于当所述协同窗口的模式为最大化窗口模式且指针位置不处于第二目标区域时,隐藏所述标题栏,所述第二目标区域是在所述投屏区域的边缘区域中与所述标题栏对应的部分;当所述协同窗口的模式为最大化窗口模式且所述指针位置处于第二目标区域时,在所述第二目标区域显示所述标题栏。
  13. 根据权利要求9至11中任一项所述的显示设备,其特征在于,
    所述处理模块,还用于在所述协同窗口中禁用键鼠操作模拟与所述目标导航功能标识对应的导航功能。
  14. 一种手机,其特征在于,包括:
    发送模块,用于向显示设备发送第一屏幕内容和目标导航功能标识,所述目标导航功能标识用于标识在手机上除了三键导航功能之外的当前导航功能;
    接收模块,用于接收所述显示设备发送的按键指令,所述按键指令是所述显示设备根 据键鼠操作生成的;
    处理模块,用于根据预设对应关系执行与所述按键指令对应的导航功能,所述导航功能用于将所述第一屏幕内容调整为第二屏幕内容;
    所述发送模块,还用于将所述第二屏幕内容发送给所述显示设备,使得所述显示设备在协同窗口的投屏区域中显示所述第二屏幕内容。
  15. 根据权利要求14所述的手机,其特征在于,所述当前导航功能包括手势导航功能和/或屏幕外物理导航功能,所述屏幕外物理导航功能由一个物理按键实现。
  16. 一种投屏系统,其特征在于,包括显示设备和手机;
    所述显示设备,用于接收手机发送的第一屏幕内容和目标导航功能标识,所述目标导航功能标识用于标识在手机上除了三键导航功能之外的当前导航功能;据所述目标导航功能标识生成包括投屏区域和导航栏的协同窗口,在所述投屏区域显示所述第一屏幕内容;所述导航栏包括三个虚拟导航按键;接收作用于所述虚拟导航按键上的键鼠操作;根据所述键鼠操作生成按键指令,将所述按键指令发送给所述手机;接收所述手机发送的第二屏幕内容,在所述投屏区域显示所述第二屏幕内容;
    所述手机,用于向所述显示设备发送第一屏幕内容和目标导航功能标识;接收所述显示设备发送的按键指令;所述根据所述按键指令执行导航功能,所述导航功能用于将所述第一屏幕内容调整为第二屏幕内容;将所述第二屏幕内容发送给所述显示设备。
  17. 一种计算机可读存储介质,包括指令,当其在计算机上运行时,使得计算机执行如权利要求1至8中任一项所述的方法。
PCT/CN2020/103440 2019-08-29 2020-07-22 一种应用于投屏场景的控制方法以及相关设备 WO2021036594A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP20858332.8A EP4016272A4 (en) 2019-08-29 2020-07-22 CONTROL METHOD FOR A SCREEN PROJECTION SCENARIO AND ASSOCIATED DEVICE
MX2022002472A MX2022002472A (es) 2019-08-29 2020-07-22 Método de control aplicado a escenario de proyección en pantalla y dispositivo relacionado.
US17/638,567 US11809704B2 (en) 2019-08-29 2020-07-22 Control method applied to screen projection scenario and related device
US18/473,483 US20240012562A1 (en) 2019-08-29 2023-09-25 Control method applied to screen projection scenario and related device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910809113.9 2019-08-29
CN201910809113.9A CN110673782B (zh) 2019-08-29 2019-08-29 一种应用于投屏场景的控制方法以及相关设备

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US17/638,567 A-371-Of-International US11809704B2 (en) 2019-08-29 2020-07-22 Control method applied to screen projection scenario and related device
US18/473,483 Continuation US20240012562A1 (en) 2019-08-29 2023-09-25 Control method applied to screen projection scenario and related device

Publications (1)

Publication Number Publication Date
WO2021036594A1 true WO2021036594A1 (zh) 2021-03-04

Family

ID=69075691

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/103440 WO2021036594A1 (zh) 2019-08-29 2020-07-22 一种应用于投屏场景的控制方法以及相关设备

Country Status (5)

Country Link
US (2) US11809704B2 (zh)
EP (1) EP4016272A4 (zh)
CN (3) CN115357178B (zh)
MX (1) MX2022002472A (zh)
WO (1) WO2021036594A1 (zh)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115357178B (zh) * 2019-08-29 2023-08-08 荣耀终端有限公司 一种应用于投屏场景的控制方法以及相关设备
CN111327769B (zh) * 2020-02-25 2022-04-08 北京小米移动软件有限公司 多屏互动方法及装置、存储介质
CN111596875B (zh) * 2020-04-17 2022-09-13 维沃移动通信有限公司 屏幕扩展方法及电子设备
CN111562896B (zh) * 2020-04-26 2023-05-19 维沃移动通信(杭州)有限公司 投屏方法及电子设备
CN111597000B (zh) * 2020-05-14 2023-08-01 青岛海信移动通信技术有限公司 一种小窗口管理方法及终端
CN111913628B (zh) * 2020-06-22 2022-05-06 维沃移动通信有限公司 分享方法、装置和电子设备
CN114237529A (zh) * 2020-09-07 2022-03-25 华为技术有限公司 一种导航栏显示方法、显示方法与第一电子设备
CN114253496A (zh) * 2020-09-10 2022-03-29 华为技术有限公司 一种显示方法及电子设备
US20230376264A1 (en) * 2020-09-10 2023-11-23 Huawei Technologies Co., Ltd. Display Method and Electronic Device
CN113553014B (zh) * 2020-09-10 2023-01-06 华为技术有限公司 多窗口投屏场景下的应用界面显示方法及电子设备
CN117093165A (zh) * 2020-12-24 2023-11-21 华为技术有限公司 设备控制方法和终端设备
CN114691059B (zh) * 2020-12-25 2024-03-26 华为技术有限公司 一种投屏显示方法及电子设备
CN112631538A (zh) * 2020-12-30 2021-04-09 安徽鸿程光电有限公司 显示方法、装置、设备及计算机存储介质
CN115442509B (zh) * 2021-06-01 2023-10-13 荣耀终端有限公司 拍摄方法、用户界面及电子设备
CN113507694B (zh) * 2021-06-18 2024-02-23 厦门亿联网络技术股份有限公司 一种基于无线辅流设备的投屏方法及装置
CN115562525B (zh) * 2022-03-15 2023-06-13 荣耀终端有限公司 截屏方法及装置
CN114860143A (zh) * 2022-05-20 2022-08-05 Oppo广东移动通信有限公司 导航控制方法及装置、终端设备、存储介质

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103713847A (zh) * 2013-12-25 2014-04-09 华为终端有限公司 用户设备的系统栏的控制方法和用户设备
CN109753256A (zh) * 2018-03-29 2019-05-14 北京字节跳动网络技术有限公司 一种窗口控制栏的布局方法、装置及设备
CN110673782A (zh) * 2019-08-29 2020-01-10 华为技术有限公司 一种应用于投屏场景的控制方法以及相关设备

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050091359A1 (en) 2003-10-24 2005-04-28 Microsoft Corporation Systems and methods for projecting content from computing devices
KR100714707B1 (ko) * 2006-01-06 2007-05-04 삼성전자주식회사 3차원 그래픽 유저 인터페이스를 위한 네비게이션 장치 및방법
JP2008123408A (ja) * 2006-11-15 2008-05-29 Brother Ind Ltd 投影装置、プログラム、投影方法、並びに投影システム
US8850052B2 (en) * 2008-09-30 2014-09-30 Apple Inc. System and method for simplified resource sharing
US8698845B2 (en) * 2010-01-06 2014-04-15 Apple Inc. Device, method, and graphical user interface with interactive popup views
KR101763593B1 (ko) * 2010-08-24 2017-08-01 엘지전자 주식회사 컨텐츠 동기화 방법 및 그 방법을 채용한 사용자 단말기
WO2012046890A1 (ko) * 2010-10-06 2012-04-12 엘지전자 주식회사 이동단말기, 디스플레이 장치 및 그 제어 방법
KR101495190B1 (ko) * 2011-03-25 2015-02-24 엘지전자 주식회사 영상표시장치 및 그 영상표시장치의 동작 방법
KR101788060B1 (ko) * 2011-04-13 2017-11-15 엘지전자 주식회사 영상표시장치 및 이를 이용한 콘텐츠 관리방법
KR101857563B1 (ko) 2011-05-11 2018-05-15 삼성전자 주식회사 네트워크 전자기기들 간 데이터 공유 방법 및 장치
US8806369B2 (en) * 2011-08-26 2014-08-12 Apple Inc. Device, method, and graphical user interface for managing and interacting with concurrently open software applications
EP2602730B1 (en) * 2011-12-07 2018-02-14 BlackBerry Limited Presenting context information in a computing device
KR101522399B1 (ko) * 2011-12-23 2015-05-22 주식회사 케이티 휴대 단말에서 외부 디스플레이 기기로의 화면 표출 방법 및 그 휴대 단말
EP2808773A4 (en) 2012-01-26 2015-12-16 Panasonic Corp MOBILE TERMINAL, TELEPHONE RECEIVER AND DEVICE CONNECTING METHOD
US20130219303A1 (en) * 2012-02-21 2013-08-22 Research In Motion Tat Ab Method, apparatus, and system for providing a shared user interface
US9513793B2 (en) * 2012-02-24 2016-12-06 Blackberry Limited Method and apparatus for interconnected devices
US9360997B2 (en) * 2012-08-29 2016-06-07 Apple Inc. Content presentation and interaction across multiple displays
US9619128B2 (en) * 2013-07-01 2017-04-11 Microsoft Technology Licensing, Llc Dynamic presentation prototyping and generation
KR102138505B1 (ko) 2013-07-10 2020-07-28 엘지전자 주식회사 이동단말기 및 그 제어방법
KR102144553B1 (ko) * 2013-08-30 2020-08-13 삼성전자주식회사 다중 디스플레이 방법, 저장 매체 및 전자 장치
US9507482B2 (en) * 2013-10-07 2016-11-29 Narsys, LLC Electronic slide presentation controller
US9870058B2 (en) * 2014-04-23 2018-01-16 Sony Corporation Control of a real world object user interface
KR102219861B1 (ko) * 2014-05-23 2021-02-24 삼성전자주식회사 화면 공유 방법 및 그 전자 장치
KR102269481B1 (ko) * 2014-10-17 2021-06-28 삼성전자주식회사 디바이스 간에 화면 공유 방법 및 이를 이용하는 디바이스
CN106155614B (zh) * 2015-04-24 2020-04-24 联想(北京)有限公司 一种操作信息传输方法及电子设备
CN104967886B (zh) 2015-05-28 2018-09-25 深圳市创易联合科技有限公司 无线投影方法和系统
US10459675B2 (en) * 2015-09-04 2019-10-29 Fives Cinetic Corp. System and method for controlling a process line using a PLC and scalable HMI control template
CN107870754A (zh) * 2016-09-28 2018-04-03 法乐第(北京)网络科技有限公司 一种控制设备上展示的内容的方法及装置
CN106873846A (zh) * 2016-12-29 2017-06-20 北京奇虎科技有限公司 一种pc端控制移动设备的方法及系统
CN108702414B (zh) * 2017-06-16 2021-04-09 华为技术有限公司 一种屏幕锁定方法、装置及计算机可读存储介质
CN109218731B (zh) * 2017-06-30 2021-06-01 腾讯科技(深圳)有限公司 移动设备的投屏方法、装置及系统
KR20190021016A (ko) 2017-08-22 2019-03-05 삼성전자주식회사 전자 장치 및 그 제어 방법
CN107547750B (zh) * 2017-09-11 2019-01-25 Oppo广东移动通信有限公司 终端的控制方法、装置和存储介质
CN107682724B (zh) 2017-09-29 2020-05-01 北京盛世辉科技有限公司 显示方法、装置、智能遥控器及计算机可读存储介质
CN107743220A (zh) 2017-10-24 2018-02-27 西安万像电子科技有限公司 状态提示的方法、装置和中转投屏系统
CN108459836B (zh) * 2018-01-19 2019-05-31 广州视源电子科技股份有限公司 批注显示方法、装置、设备及存储介质
CN108595137B (zh) * 2018-04-25 2021-05-04 广州视源电子科技股份有限公司 无线投屏方法、装置和投屏器

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103713847A (zh) * 2013-12-25 2014-04-09 华为终端有限公司 用户设备的系统栏的控制方法和用户设备
CN109753256A (zh) * 2018-03-29 2019-05-14 北京字节跳动网络技术有限公司 一种窗口控制栏的布局方法、装置及设备
CN110673782A (zh) * 2019-08-29 2020-01-10 华为技术有限公司 一种应用于投屏场景的控制方法以及相关设备

Also Published As

Publication number Publication date
US11809704B2 (en) 2023-11-07
MX2022002472A (es) 2022-06-08
CN115357178B (zh) 2023-08-08
US20220300153A1 (en) 2022-09-22
CN110673782A (zh) 2020-01-10
US20240012562A1 (en) 2024-01-11
EP4016272A4 (en) 2022-11-09
EP4016272A1 (en) 2022-06-22
CN115793950A (zh) 2023-03-14
CN115357178A (zh) 2022-11-18
CN110673782B (zh) 2022-11-29

Similar Documents

Publication Publication Date Title
WO2021036594A1 (zh) 一种应用于投屏场景的控制方法以及相关设备
WO2021104365A1 (zh) 对象分享方法及电子设备
WO2019174611A1 (zh) 应用程序的设置方法及移动终端
WO2020181942A1 (zh) 图标控制方法及终端设备
US20220300302A1 (en) Application sharing method and electronic device
JP2023529868A (ja) 共有方法、装置及び電子機器
WO2019206036A1 (zh) 消息管理方法及终端
CN109407930A (zh) 一种应用程序处理方法及终端设备
WO2020192428A1 (zh) 对象管理方法及移动终端
US10275056B2 (en) Method and apparatus for processing input using display
CN110168487A (zh) 一种触摸控制方法及装置
WO2020181956A1 (zh) 应用标识的显示方法及终端设备
WO2020001358A1 (zh) 图标的整理方法及终端设备
WO2020215969A1 (zh) 内容输入方法及终端设备
WO2018086234A1 (zh) 一种对象处理方法和终端
WO2021088706A1 (zh) 应用程序的控制方法和电子设备
US20210109699A1 (en) Data Processing Method and Mobile Device
US11681410B2 (en) Icon management method and terminal device
WO2022111397A1 (zh) 控制方法、装置和电子设备
WO2019047129A1 (zh) 一种移动应用图标的方法及终端
CN108881742B (zh) 一种视频生成方法及终端设备
CN104238931B (zh) 信息输入方法、装置及电子设备
KR20220154825A (ko) 노트 생성 방법 및 전자기기
WO2020151675A1 (zh) 对象控制方法及终端设备
CN110531905B (zh) 一种图标控制方法及终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20858332; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2020858332; Country of ref document: EP; Effective date: 20220316)