US20240012562A1 - Control method applied to screen projection scenario and related device - Google Patents


Info

Publication number
US20240012562A1
US20240012562A1 (Application US18/473,483)
Authority
US
United States
Prior art keywords
navigation
key
electronic device
screen
navigation function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/473,483
Inventor
Hejin Gu
Siyue NIU
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co., Ltd.
Priority to US18/473,483
Assigned to Honor Device Co., Ltd. Assignors: GU, Hejin; NIU, Siyue
Publication of US20240012562A1
Legal status: Pending

Classifications

    • G06F3/04886 — GUI interaction techniques using a touch-screen or digitiser, by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/1423 — Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/023 — Arrangements for converting discrete items of information into a coded form, e.g. interpreting keyboard-generated codes
    • G06F3/038 — Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F3/0481 — GUI interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F3/04847 — Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/1454 — Digital output to display device; copying display data of a local workstation or window to a remote workstation or window (teledisplay)
    • G09G5/08 — Cursor circuits
    • G09G5/12 — Synchronisation between the display unit and other units, e.g. other display units
    • G09G5/14 — Display of multiple viewports
    • H04M1/72412 — Mobile-telephone user interfaces interfacing with external accessories using two-way short-range wireless interfaces
    • G06F2203/0383 — Remote input: signals generated by a pointing device transmitted to a PC at a remote location
    • G06F2203/04803 — Split screen: subdividing the display area or window area into separate subareas
    • G09G2354/00 — Aspects of interface with display user
    • G09G2370/04 — Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • G09G2370/16 — Use of wireless transmission of display information
    • H04M2250/16 — Telephonic subscriber devices including more than one display unit
    • H04N21/42204 — User interfaces for controlling a client device through a remote control device

Definitions

  • This application relates to the field of computer technologies, and in particular, to a control method applied to a screen projection scenario and a related device.
  • Devices of different types may perform screen projection to implement screen sharing.
  • When a mobile phone is projected to a computer, the computer may present a collaboration window.
  • Screen content of the mobile phone may be displayed in the collaboration window, that is, a mobile phone mirror is presented on the computer.
  • A user may control the mobile phone by performing keyboard and mouse operations in the collaboration window of the computer.
  • However, the mobile phone generally uses a touch screen as an input device, whereas the computer uses a keyboard and a mouse as input devices. Simulating a touch operation with a mouse or keyboard has inherent disadvantages, which degrades the user's experience of controlling the mobile phone from the computer.
  • This application provides a control method applied to a screen projection scenario and a related device, so that navigation functions that are difficult to simulate with keyboard and mouse operations can instead be implemented through dedicated keyboard and mouse operations in the screen projection scenario, thereby improving the user's operation experience.
  • According to a first aspect, this application provides a control method applied to a screen projection scenario.
  • The screen projection scenario may include a mobile phone and a display device. After the mobile phone and the display device establish a connection, the display device receives first screen content and a target navigation function identifier sent by the mobile phone; generates a collaboration window including a screen projection area and a navigation bar according to the target navigation function identifier, and displays the first screen content in the screen projection area; receives a keyboard and mouse operation acting on virtual navigation keys on the navigation bar; and generates a key instruction according to the keyboard and mouse operation and sends the key instruction to the mobile phone, to cause the mobile phone to execute a navigation function according to the key instruction, where the mobile phone may adjust the first screen content to second screen content by means of the navigation function.
  • The display device then receives the second screen content sent by the mobile phone, and displays the second screen content in the screen projection area.
  • The first screen content refers to the content displayed on the screen of the mobile phone when the mobile phone and the display device establish the connection.
  • The target navigation function is any mobile phone navigation function other than the three-key navigation function.
  • The navigation bar includes three virtual navigation keys, and the three virtual navigation keys respectively correspond to different navigation functions.
  • The three virtual navigation keys are respectively a menu key, a desktop key, and a return key, where the menu key is configured to enter a task menu, the desktop key is configured to return to a desktop, and the return key is configured to return to an upper level.
  • The functions shown above are common navigation functions. In an actual application, the functions of the three virtual navigation keys may alternatively be set to navigation functions of other types, which is not limited in this application.
  • The display device sets the navigation bar in the collaboration window, and a user may perform a system navigation function on the mobile phone by using the three virtual navigation keys in the navigation bar, causing the screen content of the mobile phone and the content of the collaboration window to be updated synchronously. This reduces the likelihood of a mistaken touch when the user controls the mobile phone in another navigation manner, and improves the experience of using the display device to control the mobile phone.
  • A current navigation function includes a gesture navigation function and/or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key.
  • The method further includes: when a mode of the collaboration window is a maximized window mode and a pointer position is not located in a first target area, hiding the navigation bar, where the first target area is the part of the edge area of the screen projection area corresponding to the navigation bar; and when the mode of the collaboration window is the maximized window mode and the pointer position is located in the first target area, displaying the navigation bar in the first target area.
  • The collaboration window further includes a title bar. When the mode of the collaboration window is the maximized window mode and the pointer position is not located in a second target area, the title bar is hidden, where the second target area is the part of the edge area of the screen projection area corresponding to the title bar; and when the mode of the collaboration window is the maximized window mode and the pointer position is located in the second target area, the title bar is displayed in the second target area.
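The show/hide rule for the navigation bar and title bar in maximized-window mode amounts to a pointer hit test against an edge target area. The following sketch is illustrative only; the function names and the (x, y, width, height) rectangle representation are assumptions, not taken from the application.

```python
# Hypothetical sketch of the maximized-window show/hide rule.
# Rectangles are (x, y, width, height); all names are illustrative assumptions.

def contains(rect, point):
    """Return True if point (px, py) falls inside rect."""
    x, y, w, h = rect
    px, py = point
    return x <= px < x + w and y <= py < y + h

def bar_visible(maximized, pointer, target_area):
    """A bar (navigation bar or title bar) is shown in maximized mode
    only while the pointer is inside its edge target area."""
    if not maximized:
        return True  # a non-maximized collaboration window shows the bar
    return contains(target_area, pointer)

# Example first target area: a strip along the bottom edge of a
# 360x720 screen projection area (dimensions assumed for illustration).
nav_area = (0, 680, 360, 40)
```

The same predicate serves both bars; only the target rectangle differs (bottom strip for the navigation bar, top strip for the title bar).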
  • When the collaboration window is a portrait window, the navigation bar is located below or above the screen projection area and adjacent to it; when the collaboration window is a landscape window, the navigation bar is located on the right side or the left side of the screen projection area and adjacent to it.
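The orientation-dependent placement rule can be sketched as a small helper; the function name and the string side labels are illustrative assumptions.

```python
# Illustrative sketch of navigation-bar placement: adjacent to the screen
# projection area, below/above in portrait and right/left in landscape.

def navbar_side(portrait, prefer_primary=True):
    """Return which side of the projection area the navigation bar occupies.
    `prefer_primary` is a hypothetical toggle between the two allowed sides."""
    if portrait:
        return "below" if prefer_primary else "above"
    return "right" if prefer_primary else "left"
```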
  • The method further includes: forbidding the keyboard and mouse operation from simulating, in the collaboration window, the navigation function corresponding to the target navigation function identifier.
  • According to a second aspect, this application provides a control method applied to a screen projection scenario, applied to a mobile phone.
  • The method includes the following steps: sending first screen content and a target navigation function identifier to a display device; receiving a key instruction sent by the display device; executing a navigation function corresponding to the key instruction according to a preset correspondence, where the navigation function is used for adjusting the first screen content to second screen content; and sending the second screen content to the display device, to cause the display device to display the second screen content in a screen projection area of a collaboration window.
  • The key instruction is generated by the display device according to a keyboard and mouse operation.
  • The target navigation function identifier is used for identifying the current navigation function on the mobile phone, other than a three-key navigation function.
  • The current navigation function includes a gesture navigation function and/or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key.
  • According to a third aspect, this application provides a display apparatus whose functions can implement the control method according to the first aspect or any implementation thereof.
  • The function may be implemented by using hardware, or may be implemented by hardware executing corresponding software.
  • The hardware or software includes one or more modules corresponding to the above function.
  • According to a fourth aspect, this application provides a mobile phone whose functions can implement the control method according to the second aspect or any implementation thereof.
  • The function may be implemented by using hardware, or may be implemented by hardware executing corresponding software.
  • The hardware or software includes one or more modules corresponding to the above function.
  • This application further provides a screen projection system.
  • The screen projection system includes the display device according to the third aspect or any implementation thereof and the mobile phone according to the fourth aspect or any implementation thereof.
  • This application further provides a computer-readable storage medium storing instructions that, when run on a computer, cause the computer to perform the method according to the first aspect or the second aspect.
  • This application further provides a computer program product that, when run on a computer, causes the computer to perform the method according to the first aspect or the second aspect.
  • FIG. 1 is a schematic diagram of a screen projection scenario according to this application.
  • FIG. 2 is a signaling interaction diagram of a control method applied to a screen projection scenario according to this application.
  • FIG. 3A is a schematic diagram of a collaboration window according to this application.
  • FIG. 3B is a schematic diagram of a collaboration window being a maximized portrait window according to this application.
  • FIG. 3C is a schematic diagram of a collaboration window being a maximized landscape window according to this application.
  • FIG. 3D is another schematic diagram of a collaboration window being a maximized landscape window according to this application.
  • FIG. 3E is another schematic diagram of a collaboration window according to this application.
  • FIG. 3F is another schematic diagram of a collaboration window according to this application.
  • FIG. 4 is a schematic structural diagram of a display device according to this application.
  • FIG. 5 is a schematic structural diagram of a mobile phone according to this application.
  • FIG. 6 is a schematic diagram of a screen projection system according to this application.
  • FIG. 7 is another schematic structural diagram of a display device according to this application.
  • FIG. 8 is another schematic structural diagram of a mobile phone according to this application.
  • This application relates to a control method applied to a screen projection scenario.
  • FIG. 1 is a schematic diagram of a screen projection scenario.
  • The screen projection scenario includes a mobile phone 10 and a display device 20.
  • The mobile phone 10 and the display device 20 may be connected to each other through a radio link 30.
  • Alternatively, the mobile phone 10 and the display device 20 may be connected in a wired manner, such as through a data cable.
  • The display device 20 may generate a collaboration window 40 according to the content displayed on the screen of the mobile phone 10.
  • An operation performed by a user in the collaboration window 40 may update the screen content of the mobile phone 10 and the content of the collaboration window 40 synchronously.
  • The display device 20 refers to a computing device that accepts input by using a keyboard and/or a mouse and that has a display, such as a desktop computer or a notebook computer.
  • The mobile phone 10 is also referred to as a cell phone.
  • A system navigation function on the mobile phone is also referred to as a system navigation manner.
  • The system navigation function includes gesture navigation, out-of-screen physical navigation, three-key navigation, and the like.
  • Taking the display device 20 being a computer as an example, after the mobile phone is projected to the computer, if gesture navigation is used on the mobile phone 10, the user can hardly simulate a mobile phone navigation operation accurately on the computer through a keyboard and mouse operation.
  • For example, the desktop is returned to by sliding upward from the lower left end of the screen of the mobile phone.
  • A task menu is entered by sliding upward from the lower left end of the screen of the mobile phone and holding for a period of time.
  • An upper level is returned to by sliding rightward from the leftmost side of the screen of the mobile phone.
  • The user may easily start a navigation function mistakenly when using a keyboard and mouse operation to move the pointer on the computer.
  • For example, the function of returning to the desktop is started mistakenly when the mouse pointer is slid upward from the lower left end of the collaboration window.
  • When out-of-screen physical navigation is used on the mobile phone, pressing a physical key once represents returning to the upper level, and pressing the physical key twice quickly represents returning to the desktop.
  • Such a navigation manner is also easily mis-triggered by a keyboard and mouse operation on the computer. For example, when the user intends to start the function of returning to the desktop, the navigation function of returning to the upper level is instead started by three clicks.
  • In other words, when the mobile phone uses gesture navigation or out-of-screen physical navigation, the user may easily trigger a mistaken touch when simulating the above navigation manner with a keyboard and mouse operation, so that the mobile phone cannot be controlled accurately to perform system navigation, leading to poor operation experience.
  • For this reason, this application provides three virtual navigation keys on the computer to replace gesture navigation operations.
  • Performing keyboard and mouse operations on the three virtual navigation keys of the navigation bar reduces the chance that another navigation function is mistakenly triggered and offers better accuracy, thereby improving the user's experience of controlling the mobile phone from another device.
  • An embodiment of a control method provided in this application includes the following steps:
  • Step 201: A display device receives first screen content and a target navigation function identifier sent by a mobile phone.
  • The mobile phone may send a target navigation function identifier to the display device.
  • The target navigation function identifier is used for identifying the current navigation function, which is not the three-key navigation function.
  • The target navigation function identifier may be a character string or a number used for identifying the current navigation function.
  • The target navigation function identifier may alternatively be represented in another manner, such as a picture, a symbol, or text.
  • The target navigation function identifier may be carried in a message.
  • The current navigation function includes, but is not limited to, a gesture navigation function and/or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key.
  • The three-key navigation function refers to an in-screen three-key navigation function or an out-of-screen three-key navigation function.
  • The display device may send a query instruction to the mobile phone, and the mobile phone obtains the target navigation function identifier according to the query instruction and then sends it to the display device.
  • The display device may receive the first screen content and the target navigation function identifier sent by the mobile phone at the same moment or at different moments.
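The identifier exchange in Step 201 could be carried in a small structured message, for example JSON over the established link. The field names and identifier strings below are illustrative assumptions, not part of the application.

```python
import json

# Hypothetical encoding of the target navigation function identifier as a
# JSON message; the field names and identifier values are assumptions.

GESTURE = "gesture_navigation"
PHYSICAL = "out_of_screen_physical_navigation"

def build_identifier_message(current_navigation):
    """Phone side: answer the display device's query with the identifier of
    the current navigation function (which is not three-key navigation)."""
    return json.dumps({"type": "nav_function_id", "id": current_navigation})

def parse_identifier_message(raw):
    """Display-device side: extract the identifier from a received message."""
    msg = json.loads(raw)
    if msg.get("type") != "nav_function_id":
        raise ValueError("not a navigation function identifier message")
    return msg["id"]
```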
  • Step 202: The display device generates a collaboration window including a screen projection area and a navigation bar according to the target navigation function identifier, and displays the first screen content in the screen projection area.
  • The display device may determine, according to the target navigation function identifier, that the current navigation function is not the three-key navigation function. That is, simulating the navigation function currently used on the mobile phone on the computer by using a keyboard and mouse operation may trigger a mistaken touch.
  • Alternatively, the mobile phone may send service states of all navigation functions to the display device, and the display device may determine the navigation function currently used on the mobile phone according to these service states.
  • The screen projection area is used for displaying the screen content of the mobile phone.
  • The navigation bar includes three virtual navigation keys.
  • The three virtual navigation keys respectively correspond to different navigation functions.
  • The three virtual navigation keys are respectively a menu key, a desktop key, and a return key.
  • The menu key is configured to enter a task menu.
  • The desktop key is configured to return to a desktop.
  • The return key is configured to return to an upper level.
  • The functions shown above are common navigation functions.
  • In an actual application, the functions of the three virtual navigation keys may alternatively be set to navigation functions of other types, which is not limited in this application.
  • The display device may configure the screen projection area and the navigation bar in the collaboration window at the same moment or at different moments. For example, when the display device receives the first screen content sent by the mobile phone, the display device generates the screen projection area in the collaboration window; when the display device receives the target navigation function identifier, the display device generates the navigation bar in the collaboration window.
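The window-generation logic of Step 202 can be sketched as follows: the projection area is always created, while the navigation bar is added only when the received identifier indicates a navigation function other than three-key navigation. The dictionary model and name strings are illustrative assumptions.

```python
# Sketch of Step 202: build a collaboration-window model from the received
# screen content and navigation function identifier. Names are assumptions.

THREE_KEY = "three_key_navigation"

def build_collaboration_window(screen_content, nav_function_id):
    """Generate the collaboration window: always a screen projection area;
    add the three-key navigation bar only when the phone's current
    navigation function is not three-key navigation."""
    window = {"projection_area": screen_content, "navigation_bar": None}
    if nav_function_id != THREE_KEY:
        window["navigation_bar"] = ["menu", "desktop", "return"]
    return window
```

Under this sketch, a phone already using three-key navigation would not need the extra bar, since its navigation keys can be shown in the mirrored screen content itself.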
  • Step 203: The display device receives a keyboard and mouse operation acting on the virtual navigation keys.
  • The keyboard and mouse operation refers to an operation inputted by using a keyboard and/or an operation inputted by using a mouse. For example, the user may move the mouse pointer onto a virtual navigation key and click it.
  • The keyboard and mouse operation is not limited to the foregoing example.
  • Step 204: The display device generates a key instruction according to the keyboard and mouse operation.
  • According to the virtual navigation key on which the keyboard and mouse operation acts, the display device may generate a key instruction corresponding to that virtual navigation key. For example, if the virtual navigation key is the menu key, a key instruction corresponding to the menu key is generated; if it is the desktop key, a key instruction corresponding to the desktop key is generated; and if it is the return key, a key instruction corresponding to the return key is generated.
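Step 204 is a straightforward lookup from the clicked virtual key to a key instruction. The instruction strings below (`KEY_MENU`, `KEY_HOME`, `KEY_BACK`) are hypothetical placeholders, not values specified by the application.

```python
# Sketch of Step 204: map a clicked virtual navigation key to the key
# instruction sent to the mobile phone. Instruction values are assumptions.

KEY_INSTRUCTIONS = {
    "menu": "KEY_MENU",     # enter the task menu
    "desktop": "KEY_HOME",  # return to the desktop
    "return": "KEY_BACK",   # return to the upper level
}

def make_key_instruction(clicked_key):
    """Translate a click on a virtual navigation key into a key instruction;
    clicks that do not land on a navigation key are rejected."""
    try:
        return KEY_INSTRUCTIONS[clicked_key]
    except KeyError:
        raise ValueError(f"not a virtual navigation key: {clicked_key}")
```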
  • Step 205 The display device sends the key instruction to the mobile phone.
  • Step 206 The mobile phone executes a navigation function corresponding to the key instruction according to a preset correspondence, to adjust the first screen content to second screen content.
  • the preset correspondence refers to a correspondence between key instructions from the display device and navigation functions of the mobile phone.
  • the navigation function performed on the mobile phone is to enter the task menu.
  • the navigation function performed on the mobile phone is to return to the desktop.
  • the navigation function performed on the mobile phone is to return to the upper level. It may be understood that, functions of the virtual navigation keys on the computer and system navigation functions of the mobile phone are in a one-to-one correspondence.
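The preset correspondence described in Step 206 can be sketched as a simple one-to-one lookup table on the mobile phone side. The key-instruction and function names below are illustrative assumptions, not identifiers from this application:

```python
# Hypothetical sketch of the preset correspondence: each key instruction
# received from the display device maps to exactly one system navigation
# function on the mobile phone. All names are illustrative.

KEY_TO_NAVIGATION = {
    "KEY_MENU": "enter_task_menu",       # menu key -> enter the task menu
    "KEY_DESKTOP": "return_to_desktop",  # desktop key -> return to the desktop
    "KEY_RETURN": "return_to_upper",     # return key -> return to the upper level
}

def execute_navigation(key_instruction: str) -> str:
    """Look up the navigation function for a key instruction (one-to-one)."""
    try:
        return KEY_TO_NAVIGATION[key_instruction]
    except KeyError:
        raise ValueError(f"unknown key instruction: {key_instruction}")
```

Because the mapping is one-to-one, the virtual navigation keys on the computer behave exactly like the in-screen three-key navigation of the phone.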
  • Step 207 The display device receives the second screen content sent by the mobile phone.
  • Step 208 The display device displays the second screen content in the screen projection area.
  • the display device sets the navigation bar in the collaboration window, and a user performs a system navigation function on the mobile phone by using the three virtual navigation keys in the navigation bar, so that screen content of the mobile phone and content of the collaboration window are updated synchronously. This reduces the mistaken touches that the user is prone to when controlling the mobile phone in the other navigation manner, and improves the experience of using the display device to control the mobile phone.
  • the collaboration window 40 of this application may include a screen projection area 301 , a navigation bar 302 , and a title bar 303 .
  • the navigation bar 302 and the title bar 303 may be hidden according to actual requirements.
  • the navigation bar 302 includes three virtual navigation keys, which are respectively a first virtual navigation key 3021 , a second virtual navigation key 3022 , and a third virtual navigation key 3023 .
  • the first virtual navigation key 3021 , the second virtual navigation key 3022 , and the third virtual navigation key 3023 are respectively a return key, a desktop key, and a menu key. Functions of the virtual navigation keys and a sequence of the virtual navigation keys may be adjusted according to actual requirements.
  • the navigation bar 302 may be disposed outside the screen projection area 301 or may be disposed in the screen projection area 301 .
  • when the collaboration window is a maximized window, the navigation bar 302 is disposed in the screen projection area 301; and when the collaboration window 40 is not a maximized window, the navigation bar 302 is disposed outside the screen projection area 301.
  • the title bar 303 includes a minimized window key, a maximized window key, and a close window key.
  • the title bar 303 may further include a window name, a direction locking key, and the like, which is not limited herein.
  • collaboration window 40 is a maximized window and the navigation bar 302 is disposed in the screen projection area 301 :
  • when a mode of the collaboration window 40 is the maximized window mode and a pointer position is not located in a first target area, the navigation bar 302 is hidden, where the first target area is a part of an edge area of the screen projection area 301 corresponding to the navigation bar 302; and when the mode of the collaboration window 40 is the maximized window mode and the pointer position is located in the first target area, the navigation bar 302 is displayed in the first target area.
  • the first target area may be one or a combination of an upper edge area, a lower edge area, a left edge area, or a right edge area of the screen projection area 301 , or may be a part of any edge area above, which is not limited in this application.
  • the collaboration window 40 is a maximized portrait window
  • the first target area is a lower edge of the screen projection area 301 .
  • the navigation bar 302 is displayed at the lower edge of the screen projection area 301 .
  • the navigation bar 302 is hidden.
  • a manner for hiding the navigation bar 302 may be, but is not limited to: the navigation bar 302 moves downward from the lower edge of the screen projection area 301 and disappears, or the navigation bar 302 directly disappears.
  • the collaboration window 40 is a maximized landscape window
  • the first target area is a right edge of the screen projection area 301 .
  • the screen projection area 301 may be a full-screen area.
  • the navigation bar 302 is displayed at the right edge of the screen projection area 301 .
  • the navigation bar 302 is hidden.
  • a manner for hiding the navigation bar 302 may be, but is not limited to: the navigation bar 302 moves rightward from the right edge of the screen projection area 301 and disappears, or the navigation bar 302 directly disappears.
  • When the navigation bar 302 is displayed in the first target area of the screen projection area 301, the navigation bar 302 may block some displayed content of the screen projection area 301.
  • the background of the navigation bar 302 may be set to transparent, to reduce blocking on the screen projection area 301 .
  • screen content of the mobile phone may be displayed in a full-screen manner by hiding the navigation bar 302 .
  • a manner for displaying the screen content of the mobile phone in a full-screen manner on the computer is provided, thereby improving user experience when watching a projected screen.
  • a navigation bar that can be hidden or unfolded is provided. When the user intends to perform system navigation, the navigation bar may be invoked quickly, thereby providing convenient and quick access.
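The hide/unfold behavior in the maximized window mode can be sketched as a pointer test against the first target area. The strip height, coordinate convention, and portrait-window example below are assumptions for illustration only:

```python
# Hypothetical sketch: in the maximized window mode the navigation bar is
# shown only while the pointer is inside the first target area, taken here
# to be a lower edge strip of the screen projection area (the maximized
# portrait window example above). Coordinates grow downward from the top.

def navigation_bar_visible(maximized: bool,
                           pointer: tuple,
                           area_height: float,
                           strip_height: float = 48.0) -> bool:
    """Return True if the navigation bar should be displayed."""
    if not maximized:
        # Non-maximized window: the bar sits outside the projection area
        # and does not need to be hidden.
        return True
    _, y = pointer
    # Pointer inside the lower edge strip (the first target area)?
    return area_height - strip_height <= y <= area_height
```

For a maximized landscape window the same test would be applied to a right edge strip instead; the hiding animation (sliding off the edge or disappearing directly) is a separate presentation concern.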
  • the foregoing method further includes: when the mode of the collaboration window 40 is the maximized window mode and the pointer position is not located in a second target area, hiding the title bar 303 , where the second target area is a part in an edge area of the screen projection area 301 and corresponding to the title bar 303 ; and when the mode of the collaboration window 40 is the maximized window mode and the pointer position is located in the second target area, displaying the title bar 303 in the second target area.
  • the collaboration window 40 further includes the title bar 303 .
  • the collaboration window 40 is a maximized portrait window.
  • the collaboration window 40 is a maximized landscape window
  • the screen projection area 301 is a full-screen area in this case.
  • the second target area may be one or a combination of an upper edge area, a lower edge area, a left edge area, or a right edge area of the screen projection area 301 , or may be a part of any edge area above, which is not limited in this application. It may be understood that, the second target area and the first target area are generally set to two separate areas.
  • screen content of the mobile phone may be displayed in a full-screen manner by hiding the title bar 303 .
  • a manner for displaying the screen content of the mobile phone in a full-screen manner on the computer is provided, thereby improving user experience when watching a projected screen.
  • the title bar 303 may be invoked quickly, thereby providing convenient and quick access.
  • when the collaboration window 40 is a portrait window, the navigation bar 302 is located below the screen projection area 301, and the navigation bar 302 is adjacent to the screen projection area 301.
  • This arrangement matches the habit of using a three-key navigation function in a portrait mode, thereby improving user experience.
  • the navigation bar 302 may alternatively be disposed at another position relative to the screen projection area 301, such as above the screen projection area 301.
  • when the collaboration window 40 is a landscape window, the navigation bar 302 is located on a right side of the screen projection area 301, and the navigation bar 302 is adjacent to the screen projection area 301.
  • This arrangement matches the habit of using a three-key navigation function in a landscape mode, thereby improving user experience.
  • the navigation bar 302 may alternatively be disposed at another position relative to the screen projection area 301, such as on a left side of the screen projection area 301.
  • when receiving a scaling operation, the display device may scale the screen projection area 301 and the navigation bar 302 at the same ratio.
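The placement and proportional-scaling rules above can be sketched together as follows; all names and sizes are illustrative assumptions, not values from this application:

```python
# Hypothetical sketch: the navigation bar sits below a portrait window and
# to the right of a landscape window, and the projection area and the bar
# are scaled at the same ratio when the window is resized.

def navigation_bar_side(is_portrait: bool) -> str:
    """Default side of the screen projection area for the navigation bar."""
    return "bottom" if is_portrait else "right"

def scale_together(area: tuple, bar: tuple, ratio: float) -> tuple:
    """Scale the projection area and the navigation bar at the same ratio,
    so the bar stays proportioned to the projected screen content."""
    def scale(size):
        return (size[0] * ratio, size[1] * ratio)
    return scale(area), scale(bar)
```

Scaling both rectangles with one ratio keeps the bar aligned with the projected content regardless of the window size the user chooses.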
  • the display device may enable multiple navigation functions at the same moment, or may enable only one navigation function and forbid the other navigation functions.
  • each navigation function in the collaboration window may be implemented through multiple types of keyboard and mouse operations.
  • a navigation function of returning to a desktop is implemented through two keyboard and mouse operations.
  • One keyboard and mouse operation is to click a desktop key, where the desktop key is a virtual navigation key.
  • Another keyboard and mouse operation is to slide upward from a lower left end of the collaboration window. The rest may be deduced by analogy, and keyboard and mouse operations used for implementing other navigation functions may be set according to actual situations.
  • the user may select keyboard and mouse operations to implement the navigation functions according to requirements, thereby providing implementation flexibility.
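The many-to-one binding of keyboard and mouse operations to navigation functions described above can be sketched as a lookup table; the operation names are illustrative assumptions:

```python
# Hypothetical sketch: several keyboard-and-mouse operations may trigger the
# same navigation function, e.g. clicking the desktop key or sliding upward
# from the lower-left of the collaboration window both return to the desktop.
from typing import Optional

OPERATION_TO_NAVIGATION = {
    "click_desktop_key": "return_to_desktop",
    "slide_up_lower_left": "return_to_desktop",  # alternative gesture binding
    "click_menu_key": "enter_task_menu",
    "click_return_key": "return_to_upper",
}

def resolve_operation(operation: str) -> Optional[str]:
    """Map an input operation to a navigation function, or None if unbound."""
    return OPERATION_TO_NAVIGATION.get(operation)
```

Because the table is data rather than code, bindings can be added or removed per the actual situation, which is the flexibility the passage describes.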
  • the foregoing method further includes: forbidding the keyboard and mouse operation from simulating the navigation function corresponding to the target navigation function identifier in the collaboration window.
  • the display device may forbid a keyboard and mouse operation from simulating the foregoing navigation function in the collaboration window.
  • the display device only provides a three-key navigation function, so that a mistaken touch caused by using a keyboard and mouse operation to simulate the gesture navigation or out-of-screen physical navigation may be avoided in the collaboration window.
  • the foregoing method further includes: sending, by the display device, a navigation function modification instruction to the mobile phone according to the target navigation function identifier, and modifying, by the mobile phone, the current navigation function to an in-screen three-key navigation function according to the navigation function modification instruction.
  • the navigation function of the mobile phone is modified to the in-screen three-key navigation function.
  • an in-screen three-key navigation bar is displayed both on the mobile phone and the display device, and the mobile phone may be controlled through the in-screen three-key navigation bar on the display device.
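The optional modification handshake, in which the display device asks the phone to switch to in-screen three-key navigation, can be sketched as one function per device. The message shapes are assumptions for illustration only:

```python
# Hypothetical sketch of the navigation function modification instruction.
# Field names and values are illustrative, not from this application.

def build_modification_instruction(target_navigation_id: str) -> dict:
    """Display-device side: request in-screen three-key navigation when the
    phone currently uses another mode (gesture or out-of-screen physical)."""
    return {
        "type": "modify_navigation",
        "current": target_navigation_id,
        "requested": "in_screen_three_key",
    }

def apply_modification(phone_state: dict, instruction: dict) -> dict:
    """Phone side: adopt the requested navigation mode."""
    if instruction.get("type") == "modify_navigation":
        phone_state = dict(phone_state, navigation=instruction["requested"])
    return phone_state
```

After this exchange, both the phone and the display device present the same in-screen three-key navigation bar, matching the synchronized behavior described above.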
  • control method may be further applied to another electronic device that uses gesture navigation or out-of-screen physical navigation.
  • an embodiment of a display apparatus provided in this application includes:
  • a receiving module 401 configured to receive first screen content and a target navigation function identifier sent by a mobile phone, where the target navigation function identifier is used for identifying a current navigation function except a three-key navigation function on the mobile phone;
  • a display module 402 configured to generate a collaboration window including a screen projection area and a navigation bar according to the target navigation function identifier, where the navigation bar includes three virtual navigation keys; and display the first screen content in the screen projection area;
  • an input module 403 configured to receive a keyboard and mouse operation acting on the virtual navigation keys
  • a processing module 404 configured to generate a key instruction according to the keyboard and mouse operation
  • a sending module 405 configured to send the key instruction to the mobile phone, to cause the mobile phone to execute a navigation function according to the key instruction, where the navigation function is used for adjusting the first screen content to second screen content;
  • the receiving module 401 being further configured to receive the second screen content sent by the mobile phone;
  • the display module 402 being further configured to display the second screen content in the collaboration window.
  • the current navigation function includes a gesture navigation function and/or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key.
  • the display module 402 is further configured to: when a mode of the collaboration window is a maximized window mode and a pointer position is not located in a first target area, hide the navigation bar, where the first target area is a part in an edge area of the screen projection area and corresponding to the navigation bar; and when the mode of the collaboration window is the maximized window mode and the pointer position is located in the first target area, display the navigation bar in the first target area.
  • the collaboration window further includes a title bar
  • the display module 402 is further configured to: when the mode of the collaboration window is the maximized window mode and the pointer position is not located in a second target area, hide the title bar, where the second target area is a part in the edge area of the screen projection area and corresponding to the title bar; and when the mode of the collaboration window is the maximized window mode and the pointer position is located in the second target area, display the title bar in the second target area.
  • processing module 404 is further configured to forbid the keyboard and mouse operation from simulating the navigation function corresponding to the target navigation function identifier in the collaboration window.
  • an embodiment of a mobile phone provided in this application includes:
  • a sending module 501 configured to send first screen content and a target navigation function identifier to a display device, where the target navigation function identifier is used for identifying a current navigation function except a three-key navigation function on the mobile phone;
  • a receiving module 502 configured to receive a key instruction sent by the display device, where the key instruction is generated by the display device according to a keyboard and mouse operation;
  • a processing module 503 configured to execute a navigation function corresponding to the key instruction according to a preset correspondence, where the navigation function is used for adjusting the first screen content to second screen content;
  • the sending module 501 being further configured to send the second screen content to the display device, to cause the display device to display the second screen content in a screen projection area of a collaboration window.
  • the current navigation function includes a gesture navigation function and/or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key.
  • an embodiment of a screen projection system provided in this application includes:
  • a display device 20 and a mobile phone 10, where the mobile phone 10 and the display device 20 are connected to each other through a radio link 30;
  • the display device 20 is configured to: receive first screen content and a state of a target navigation function sent by the mobile phone 10 , where the target navigation function includes mobile phone navigation functions except a three-key navigation function; generate a collaboration window including a screen projection area and a navigation bar when the state of the target navigation function is a started state, and display the first screen content in the screen projection area, where the navigation bar includes three virtual navigation keys; receive a keyboard and mouse operation acting on the virtual navigation keys; generate a key instruction according to the keyboard and mouse operation and send the key instruction to the mobile phone to cause the mobile phone to execute a navigation function according to the key instruction; receive second screen content sent by the mobile phone; and display the second screen content in the screen projection area; and
  • the mobile phone 10 is configured to: send first screen content and a state of a target navigation function to the display device 20 ; receive a key instruction sent by the display device when the state of the target navigation function is a started state; execute a navigation function according to the key instruction, where the navigation function is used for adjusting the first screen content to second screen content; and send the second screen content to the display device.
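The message exchange between the display device 20 and the mobile phone 10 can be sketched end to end, with the radio link 30 replaced by direct method calls. All class, method, and content names are illustrative assumptions:

```python
# Hypothetical end-to-end sketch of the screen projection system described
# above. The radio link is simulated by direct calls; nothing here is an
# identifier from this application.

class Phone:
    def __init__(self):
        self.screen = "first screen content"
        # Preset correspondence: key instruction -> resulting screen content.
        self.mapping = {"KEY_DESKTOP": "second screen content (desktop)"}

    def handshake(self):
        # Send the screen content plus the state of the target navigation
        # function (here: gesture navigation is started).
        return self.screen, {"gesture_navigation": "started"}

    def on_key_instruction(self, key):
        self.screen = self.mapping[key]  # execute the navigation function
        return self.screen               # send updated screen content back

class DisplayDevice:
    def __init__(self, phone):
        self.phone = phone
        self.projection_area = None
        self.navigation_bar = None

    def connect(self):
        content, nav_state = self.phone.handshake()
        # Generate the navigation bar only when the target navigation
        # function is in the started state.
        if nav_state.get("gesture_navigation") == "started":
            self.navigation_bar = ["return", "desktop", "menu"]
        self.projection_area = content

    def click_virtual_key(self, key):
        # Keyboard-and-mouse operation -> key instruction -> phone, then
        # display the returned second screen content in the projection area.
        self.projection_area = self.phone.on_key_instruction(key)
```

A usage pass would connect the two devices, click the virtual desktop key, and observe the projection area update to the phone's desktop content.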
  • the radio link may be a wireless fidelity (wireless fidelity, WiFi) link or a Bluetooth link.
  • Functions of the display device 20 may be the same as those of the display device in the embodiment shown in FIG. 4 or other optional embodiments.
  • Functions of the mobile phone 10 may be the same as those of the mobile phone in the embodiment shown in FIG. 5 or other optional embodiments.
  • FIG. 7 another embodiment of a display device 20 provided in this application includes:
  • an input apparatus 701, a display apparatus 702, a memory 703, a processor 704, a transceiver 705, and a data interface 706, where the input apparatus 701, the display apparatus 702, the memory 703, the processor 704, the transceiver 705, and the data interface 706 may be connected through a bus.
  • the input apparatus 701 may be a keyboard or a mouse.
  • the display apparatus 702 may be a display, a projector, or another device for display.
  • the memory 703 may be a volatile memory or a non-volatile memory, or may include a volatile memory and a non-volatile memory.
  • the non-volatile memory may be a read-only memory (read-only memory, ROM), a programmable ROM (programmable ROM, PROM), an erasable PROM (erasable PROM, EPROM), an electrically EPROM (electrically EPROM, EEPROM), or a flash memory.
  • the volatile memory may be a random access memory (random access memory, RAM), and is used as an external cache. It should be noted that the memory described herein is intended to include, but is not limited to, these and any other suitable types of memories.
  • the foregoing processor 704 may be a general-purpose processor, including a central processing unit (central processing unit, CPU), a network processor (network processor, NP), and the like; or may be a digital signal processor (digital signal processing, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), a field-programmable gate array (field-programmable gate array, FPGA), or another programmable logic device.
  • the processor 704 is configured to implement the functions of the display device in the foregoing embodiments by invoking program code in the memory 703 .
  • the transceiver 705 is configured to send and receive data in wireless communication.
  • a component configured to implement a receiving function in the transceiver 705 may be regarded as a receiver, and a component configured to implement a sending function in the transceiver 705 may be regarded as a sender. That is, the transceiver 705 includes a receiver and a sender.
  • the transceiver 705 may also be referred to as a transceiver machine or a transceiver circuit.
  • the receiver may also be referred to as a receiving machine or a receiving circuit sometimes.
  • the sender may also be referred to as a transmitting machine or a transmitting circuit sometimes.
  • the data interface 706 is connected to the mobile phone in a wired manner.
  • the display device 20 may include any quantity of input apparatuses 701 , display apparatuses 702 , memories 703 , processors 704 , transceivers 705 , and data interfaces 706 , to implement the functions or operations performed by the display device 20 in the apparatus embodiments of this application, and all apparatuses that can implement this application fall within the protection scope of this application.
  • the display device 20 may further include a power supply, and the like.
  • the power supply is configured to supply power to various components, and may be logically connected to the processor 704 by using a power management system, thereby implementing functions such as charging, discharging, and power consumption management by using the power management system.
  • FIG. 8 another embodiment of a mobile phone provided in this application includes:
  • a radio frequency (radio frequency, RF) circuit 810, a memory 820, an input unit 830, a display unit 840, a Bluetooth module 850, an audio circuit 860, a WiFi module 870, a processor 880, and a power supply 890.
  • FIG. 8 does not constitute a limitation on the mobile phone, and the mobile phone may include more components or fewer components than those shown in the figure, or some components may be combined, or a different component deployment may be used.
  • the RF circuit 810 may be configured to send and receive signals during an information receiving and sending process or a call process. Specifically, the RF circuit receives downlink information from a base station, delivers the downlink information to the processor 880 for processing, and sends related uplink data to the base station.
  • the RF circuit 810 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (low noise amplifier, LNA), a duplexer, and the like.
  • the RF circuit 810 may also communicate with a network and another device through wireless communication.
  • the wireless communication may use any communication standard or protocol, including, but not limited to, Global System for Mobile Communications (Global System of Mobile communications, GSM), General Packet Radio Service (General Packet Radio Service, GPRS), Code Division Multiple Access (Code Division Multiple Access, CDMA), Wideband Code Division Multiple Access (Wideband Code Division Multiple Access, WCDMA), Long Term Evolution (Long Term Evolution, LTE), email, Short Messaging Service (Short Messaging Service, SMS), and the like.
  • the memory 820 may be configured to store a software program and a module.
  • the processor 880 runs the software program and the module that are stored in the memory 820 , to implement various functional applications and data processing of the mobile phone.
  • the memory 820 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application program required by at least one function (for example, a sound playback function and an image display function), and the like.
  • the data storage area may store data (for example, audio data and an address book) created according to the use of the mobile phone, and the like.
  • the memory 820 may include a high-speed RAM, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory, or another non-volatile solid-state storage device.
  • the input unit 830 may be configured to receive inputted digit or character information, and generate a keyboard signal input related to the user setting and function control of the mobile phone.
  • the input unit 830 may include a touch panel 831 and another input device 832 .
  • the touch panel 831, which may also be referred to as a touch screen, may collect a touch operation of a user on or near the touch panel (such as an operation performed by the user on or near the touch panel 831 by using any suitable object or accessory such as a finger or a stylus), and drive a corresponding connection apparatus according to a preset program.
  • the touch panel 831 may include two parts: a touch detection apparatus and a touch controller.
  • the touch detection apparatus detects a touch position of the user, detects a signal generated by the touch operation, and transfers the signal to the touch controller.
  • the touch controller receives touch information from the touch detection apparatus, converts the touch information into touch point coordinates, and transmits the touch point coordinates to the processor 880 .
  • the touch controller can receive a command transmitted by the processor 880 and execute the command.
  • the touch panel 831 may be implemented by using various types, such as a resistive type, a capacitive type, an infrared type, and a surface acoustic wave type.
  • the input unit 830 may further include the another input device 832 .
  • the another input device 832 may include, but is not limited to, one or more of a functional key (such as a volume control key or a switch key), a trackball, and a joystick.
  • the display unit 840 may be configured to display information inputted by the user or information provided for the user, and various menus of the mobile phone.
  • the display unit 840 may include a display panel 841 .
  • the display panel 841 may be configured by using a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), or the like.
  • the touch panel 831 may cover the display panel 841 . After detecting a touch operation on or near the touch panel, the touch panel 831 transfers the touch operation to the processor 880 , to determine a type of a touch event. Then, the processor 880 provides a corresponding visual output on the display panel 841 according to the type of the touch event.
  • Although the touch panel 831 and the display panel 841 are used as two separate parts to implement input and output functions of the mobile phone in the foregoing description, in some embodiments, the touch panel 831 and the display panel 841 may be integrated to implement the input and output functions of the mobile phone.
  • the mobile phone may further include a Bluetooth module 850 .
  • the audio circuit 860 , a speaker 861 , and a microphone 862 may provide audio interfaces between the user and the mobile phone.
  • the audio circuit 860 may convert received audio data into an electrical signal and transmit the electrical signal to the speaker 861 .
  • the speaker 861 converts the electrical signal into a sound signal for output.
  • the microphone 862 converts a collected sound signal into an electrical signal.
  • the audio circuit 860 receives the electrical signal and converts the electrical signal into audio data, and outputs the audio data to the processor 880 for processing. Then, the processor sends the audio data to, for example, another mobile phone by using the RF circuit 810 , or outputs the audio data to the memory 820 for further processing.
  • WiFi is a short distance wireless transmission technology.
  • the mobile phone may help, by using the WiFi module 870 , a user to send and receive an email, browse a web page, access stream media, and the like. This provides wireless broadband Internet access for the user.
  • Although FIG. 8 shows the WiFi module 870, it may be understood that the WiFi module is not a necessary component of the mobile phone, and the WiFi module may be omitted as required provided that the scope of the essence of the present invention is not changed.
  • the processor 880 is a control center of the mobile phone, and is connected to various parts of the entire mobile phone by using various interfaces and lines. By running or executing the software program and/or the module stored in the memory 820 , and invoking data stored in the memory 820 , the processor executes various functions of the mobile phone and performs data processing, thereby monitoring the entire mobile phone.
  • the processor 880 may include one or more processing units.
  • the processor 880 may integrate an application processor and a modem processor.
  • the application processor mainly processes an operating system, a user interface, an application program, and the like.
  • the modem processor mainly processes wireless communication. It may be understood that the modem processor may alternatively not be integrated into the processor 880.
  • the mobile phone further includes the power supply 890 (such as a battery) for supplying power to the components.
  • the power supply may be logically connected to the processor 880 by using a power management system, thereby implementing functions such as charging, discharging, and power consumption management by using the power management system.
  • the mobile phone may further include a camera, a sensor, and the like. Details are not described herein again.
  • the processor 880 included in the mobile phone may implement the functions of the mobile phone in the embodiment shown in FIG. 2 or other optional embodiments.
  • This application provides a computer storage medium, including instructions, the instructions, when run on a computing device, causing the computing device to perform the steps implemented by a display device in any embodiment above.
  • This application further provides a computer storage medium, including instructions, the instructions, when run on a computing device, causing the computing device to perform the steps implemented by a mobile phone in any embodiment above.
  • This application further provides a computer program product, the computer program product, when run on a computing device, causing the computing device to perform the steps implemented by a display device in any embodiment above.
  • This application further provides a computer program product, the computer program product, when run on a computing device, causing the computing device to perform the steps implemented by a mobile phone in any embodiment above.
  • All or some of the foregoing embodiments may be implemented by using software, hardware, firmware, or any combination thereof.
  • implementation may be entirely or partially performed in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • the computer instructions may be stored in a computer-readable storage medium or may be transmitted from a computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, a coaxial cable, an optical fiber, or a digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner.
  • the computer-readable storage medium may be any usable medium accessible by the computer, or a data storage device, such as a server or a data center, including one or more usable media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (solid state disk, SSD)), or the like.

Abstract

Provided is a control method applied to a screen projection scenario. The method includes: receiving first screen content and a target navigation function identifier sent by a mobile phone; generating, according to the target navigation function identifier, a collaboration window including a screen projection area and a navigation bar, where the navigation bar includes three virtual navigation keys; displaying the first screen content in the screen projection area; receiving a keyboard and mouse operation acting on the virtual navigation keys; generating a key instruction according to the keyboard and mouse operation, and sending the key instruction to the mobile phone, so that the mobile phone executes a navigation function according to the key instruction, and the mobile phone can adjust the first screen content to second screen content, and displaying the second screen content in the screen projection area.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a continuation of U.S. patent application Ser. No. 17/638,567, filed on Feb. 25, 2022, which is national stage of International Application No. PCT/CN2020/103440, filed on Jul. 22, 2020, which claims priority to Chinese Patent Application No. 201910809113.9, filed on Aug. 29, 2019. All of the aforementioned applications are hereby incorporated by reference in their entireties.
  • TECHNICAL FIELD
  • This application relates to the field of computer technologies, and in particular, to a control method applied to a screen projection scenario and a related device.
  • BACKGROUND
  • With the development of computer technologies, devices of different types (for example, devices using different operating systems) may perform screen projection to implement screen sharing.
  • Using a screen projection scenario of a mobile phone and a computer as an example, when the mobile phone is projected to the computer, the computer may present a collaboration window. Screen content of the mobile phone may be displayed in the collaboration window, that is, a mobile phone mirror is presented on the computer. A user may control the mobile phone by performing a keyboard and mouse operation in the collaboration window of the computer.
  • However, the mobile phone generally uses a touch screen as an input device, whereas the computer uses a keyboard and a mouse as input devices. Simulating a touch operation by using a mouse or a keyboard is inaccurate and error-prone, thereby affecting user experience of controlling the mobile phone by using the computer.
  • SUMMARY
  • This application provides a control method applied to a screen projection scenario and a related device, to implement, in the screen projection scenario, navigation functions by using keyboard and mouse operations on virtual navigation keys in place of functions that are hard to simulate accurately by keyboard and mouse operations, thereby improving operation experience of a user.
  • According to a first aspect, this application provides a control method applied to a screen projection scenario. The screen projection scenario may include a mobile phone and a display device. After the mobile phone and the display device establish a connection, the display device receives first screen content and a target navigation function identifier sent by the mobile phone; generates a collaboration window including a screen projection area and a navigation bar according to the target navigation function identifier, and displays the first screen content in the screen projection area; receives a keyboard and mouse operation acting on virtual navigation keys on the navigation bar; and generates a key instruction according to the keyboard and mouse operation, and sends the key instruction to the mobile phone, to cause the mobile phone to execute a navigation function according to the key instruction, where the mobile phone may adjust the first screen content to second screen content by means of the navigation function. The display device receives the second screen content sent by the mobile phone, and displays the second screen content in the screen projection area.
  • The first screen content refers to content displayed on a screen of the mobile phone when the mobile phone and the display device establish the connection. Target navigation functions include mobile phone navigation functions except a three-key navigation function. The navigation bar includes three virtual navigation keys, and the three virtual navigation keys respectively correspond to different navigation functions. For example, the three virtual navigation keys are respectively a menu key, a desktop key, and a return key, where the menu key is configured to enter a task menu, the desktop key is configured to return to a desktop, and the return key is configured to return to an upper level. It may be understood that, the functions shown above are common navigation functions. In an actual application, the functions of the three virtual navigation keys may be alternatively set to navigation functions of other types, which is not limited in this application.
  • In this application, if the mobile phone uses a navigation manner other than a three-key navigation manner, the display device sets the navigation bar in the collaboration window. A user may perform a system navigation function on the mobile phone by using the three virtual navigation keys in the navigation bar, so that the screen content of the mobile phone and the content of the collaboration window are updated synchronously. This reduces mistaken touches that the user is prone to when controlling the mobile phone in the other navigation manner, and improves experience when the user uses the display device to control the mobile phone.
  • In a possible implementation, a current navigation function includes a gesture navigation function and/or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key.
  • In another possible implementation, the method further includes: when a mode of the collaboration window is a maximized window mode and a pointer position is not located in a first target area, hiding the navigation bar, where the first target area is a part in an edge area of the screen projection area and corresponding to the navigation bar; and when the mode of the collaboration window is the maximized window mode and the pointer position is located in the first target area, displaying the navigation bar in the first target area.
  • In another possible implementation, the collaboration window further includes a title bar; when the mode of the collaboration window is the maximized window mode and the pointer position is not located in a second target area, hiding the title bar, where the second target area is a part in the edge area of the screen projection area and corresponding to the title bar; and when the mode of the collaboration window is the maximized window mode and the pointer position is located in the second target area, displaying the title bar in the second target area.
  • In another possible implementation, when the collaboration window is a portrait window, the navigation bar is located below or above the screen projection area, and the navigation bar is adjacent to the screen projection area; and when the collaboration window is a landscape window, the navigation bar is located on a right side or a left side of the screen projection area, and the navigation bar is adjacent to the screen projection area.
  • In another possible implementation, the method further includes: forbidding the keyboard and mouse operation from simulating the navigation function corresponding to the target navigation function identifier in the collaboration window.
  • According to a second aspect, this application provides a control method applied to a screen projection scenario. The method includes the following steps: sending first screen content and a target navigation function identifier to a display device; receiving a key instruction sent by the display device; executing a navigation function corresponding to the key instruction according to a preset correspondence, where the navigation function is used for adjusting the first screen content to second screen content; and sending the second screen content to the display device by a mobile phone, to cause the display device to display the second screen content in a screen projection area of a collaboration window.
  • The key instruction is generated by the display device according to a keyboard and mouse operation. The target navigation function identifier is used for identifying a current navigation function except a three-key navigation function on the mobile phone. In a possible implementation, the current navigation function includes a gesture navigation function and/or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key.
  • According to a third aspect, this application provides a display apparatus whose function may implement the control method according to the first aspect or any implementation. The function may be implemented by using hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above function.
  • According to a fourth aspect, this application provides a mobile phone whose function may implement the control method according to the second aspect or any implementation. The function may be implemented by using hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above function.
  • According to a fifth aspect, this application provides a screen projection system. The screen projection system includes the display device according to the third aspect or any implementation thereof and the mobile phone according to the fourth aspect or any implementation thereof.
  • According to a sixth aspect, this application provides a computer-readable storage medium, storing instructions, the instructions, when run on a computer, causing the computer to perform the method according to the first aspect or the second aspect.
  • According to a seventh aspect, this application provides a computer program product, the computer program product, when run on a computer, causing the computer to perform the method according to the first aspect or the second aspect.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a schematic diagram of a screen projection scenario according to this application;
  • FIG. 2 is a signaling interaction diagram of a control method applied to a screen projection scenario according to this application;
  • FIG. 3A is a schematic diagram of a collaboration window according to this application;
  • FIG. 3B is a schematic diagram of a collaboration window being a maximized portrait window according to this application;
  • FIG. 3C is a schematic diagram of a collaboration window being a maximized landscape window according to this application;
  • FIG. 3D is another schematic diagram of a collaboration window being a maximized landscape window according to this application;
  • FIG. 3E is another schematic diagram of a collaboration window according to this application;
  • FIG. 3F is another schematic diagram of a collaboration window according to this application;
  • FIG. 4 is a schematic structural diagram of a display device according to this application;
  • FIG. 5 is a schematic structural diagram of a mobile phone according to this application;
  • FIG. 6 is a schematic diagram of a screen projection system according to this application;
  • FIG. 7 is another schematic structural diagram of a display device according to this application; and
  • FIG. 8 is another schematic structural diagram of a mobile phone according to this application.
  • DESCRIPTION OF EMBODIMENTS
  • This application relates to a control method applied to a screen projection scenario.
  • FIG. 1 is a schematic diagram of a screen projection scenario. The screen projection scenario includes a mobile phone 10 and a display device 20. The mobile phone 10 and the display device 20 may be connected to each other through a radio link 30. The mobile phone 10 and the display device 20 may be alternatively connected in a wired manner, such as through a data cable. After the mobile phone 10 and the display device 20 establish a connection, the display device 20 may generate a collaboration window 40 according to content displayed by a screen of the mobile phone 10. An operation of a user in the collaboration window 40 may update screen content of the mobile phone 10 and content of the collaboration window 40 synchronously.
  • The display device 20 refers to a computing device that performs an input operation by using a keyboard and/or a mouse and that has a display, such as a desktop computer or a notebook computer. The mobile phone 10 is also referred to as a cell phone. A system navigation function on the mobile phone is also referred to as a system navigation manner. The system navigation function includes gesture navigation, out-of-screen physical navigation, three-key navigation, and the like.
  • Using the display device 20 being a computer as an example, after the mobile phone is projected to the computer, if the gesture navigation is used on the mobile phone 10, the user can hardly simulate a mobile phone navigation operation accurately on the computer through a keyboard and mouse operation. For example, the desktop is returned by sliding upward from a lower left end of the screen of the mobile phone; the task menu is entered by sliding upward from the lower left end of the screen of the mobile phone and holding for a period of time; and the upper level is returned by sliding rightward from the leftmost side of the screen of the mobile phone. In this way, the user may easily start a navigation function mistakenly when moving the pointer on the computer through a keyboard and mouse operation. For example, when the user intends to enter the task menu, the function of returning to the desktop is started mistakenly when the mouse pointer is slid upward from a lower left end of the collaboration window.
  • Alternatively, if the out-of-screen physical navigation is used on the mobile phone, pressing a physical key once represents returning to the upper level, and pressing the physical key twice quickly represents returning to the desktop. A mistaken touch is easily triggered by a keyboard and mouse operation on the computer. For example, when the user intends to start the function of returning to the desktop, the navigation function of returning to the upper level is started instead by three clicks.
  • As can be seen from the above, if the mobile phone uses the gesture navigation or the out-of-screen physical navigation, the user may easily trigger a mistaken touch when simulating the above navigation manner by using a keyboard and mouse operation, so that the mobile phone cannot be controlled accurately to perform system navigation, thereby leading to poor operation experience of the user.
  • For the problems existing in the foregoing scenario, this application provides three-key navigation virtual keys on the computer to replace gesture navigation operations. In this way, performing keyboard and mouse operations on the three virtual navigation keys of the navigation bar reduces mistaken triggering of other navigation functions and offers better accuracy, thereby improving the operation experience of the user controlling the mobile phone by using another device.
  • The following describes the foregoing control method in a screen projection scenario. Referring to FIG. 2 , an embodiment of a control method provided in this application includes the following steps:
  • Step 201. A display device receives first screen content and a target navigation function identifier sent by a mobile phone.
  • In this embodiment, the mobile phone may send a target navigation function identifier to the display device. The target navigation function identifier is used for identifying a current navigation function other than a three-key navigation function. Specifically, the target navigation function identifier may be a character string or a number used for identifying the current navigation function. In an actual application, the target navigation function identifier may be alternatively represented in another manner, such as a picture, a symbol, or text. When the mobile phone sends the target navigation function identifier, the target navigation function identifier may be carried in a message.
  • The current navigation function includes, but is not limited to, a gesture navigation function and/or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key. The three-key navigation function refers to an in-screen three-key navigation function or an out-of-screen three-key navigation function.
  • Optionally, the display device sends a query instruction to the mobile phone, and the mobile phone obtains the target navigation function identifier according to the query instruction and then sends the target navigation function identifier to the display device.
  • It should be noted that, the display device may receive the first screen content and the target navigation function identifier sent by the mobile phone at the same moment or at different moments.
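The message carrying the first screen content and the target navigation function identifier, as described in Step 201, can be sketched as follows. This is a minimal illustration only: the application does not specify a wire format, so the field names and the identifier values (`NAV_GESTURE`, `NAV_PHYSICAL`, `target_nav_id`, and so on) are assumptions.

```python
# Hypothetical identifiers for current navigation functions other than
# three-key navigation (names are assumptions, not from the application).
NAV_GESTURE = "gesture"
NAV_PHYSICAL = "out_of_screen_physical"

def build_projection_message(first_screen_content: bytes, current_nav: str) -> dict:
    """Carry the first screen content and the target navigation function
    identifier together; the text permits sending them in one message."""
    return {
        "screen_content": first_screen_content,
        # Identifies the navigation function currently used on the phone,
        # which is not the three-key navigation function.
        "target_nav_id": current_nav,
    }

# The phone currently uses gesture navigation, so it sends that identifier.
msg = build_projection_message(b"<frame-bytes>", NAV_GESTURE)
```

As the text notes, the identifier may equally be sent in a separate message or in response to a query instruction from the display device.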
  • Step 202. The display device generates a collaboration window including a screen projection area and a navigation bar according to the target navigation function identifier, and displays the first screen content in the screen projection area.
  • When the display device receives the target navigation function identifier sent by the mobile phone, the display device may determine that the current navigation function corresponding to the target navigation function identifier is not the three-key navigation function. That is, it indicates that simulating a navigation function that is currently used on the mobile phone on the computer by using a keyboard and mouse operation may trigger a mistaken touch. Optionally, the mobile phone may send service states of all navigation functions to the display device, and the display device may also determine the navigation function that is currently used on the mobile phone according to the service states of the navigation functions.
  • The screen projection area is used for displaying the screen content of the mobile phone.
  • The navigation bar includes three virtual navigation keys. The three virtual navigation keys respectively correspond to different navigation functions. For example, the three virtual navigation keys are respectively a menu key, a desktop key, and a return key. The menu key is configured to enter a task menu, the desktop key is configured to return to a desktop, and the return key is configured to return to an upper level. It may be understood that, the functions shown above are common navigation functions. In an actual application, the functions of the three virtual navigation keys may be alternatively set to navigation functions of other types, which is not limited in this application.
  • It should be noted that, the display device may configure the screen projection area and the navigation bar in the collaboration window at the same moment or at different moments. For example, when the display device receives the first screen content sent by the mobile phone, the display device generates the screen projection area in the collaboration window. When the display device receives the target navigation function identifier, the display device generates the navigation bar in the collaboration window.
  • Step 203. The display device receives a keyboard and mouse operation acting on virtual navigation keys.
  • The keyboard and mouse operation refers to an operation inputted by using a keyboard and/or an operation inputted by using a mouse. For example, the mouse pointer is moved over a virtual navigation key, and the key is clicked. The keyboard and mouse operation is not limited to the foregoing example.
  • Step 204. The display device generates a key instruction according to the keyboard and mouse operation.
  • When a keyboard and mouse operation is performed on a virtual navigation key, the display device may generate a key instruction corresponding to the virtual navigation key. For example, if the virtual navigation key is the menu key, a key instruction corresponding to the menu key is generated. If the virtual navigation key is the desktop key, a key instruction corresponding to the desktop key is generated. If the virtual navigation key is the return key, a key instruction corresponding to the return key is generated.
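The per-key instruction generation in Step 204 can be sketched as a simple lookup from the virtual navigation key to its key instruction. The key names and the Android-style instruction codes below are illustrative assumptions; the application does not define concrete codes.

```python
# Assumed mapping from virtual navigation keys to key instructions
# (instruction codes are hypothetical, Android-style names).
KEY_INSTRUCTIONS = {
    "menu": "KEYCODE_APP_SWITCH",   # menu key: enter the task menu
    "desktop": "KEYCODE_HOME",      # desktop key: return to the desktop
    "return": "KEYCODE_BACK",       # return key: return to the upper level
}

def generate_key_instruction(virtual_key: str) -> str:
    """Map the virtual navigation key acted on by the keyboard and mouse
    operation to the key instruction sent to the mobile phone."""
    try:
        return KEY_INSTRUCTIONS[virtual_key]
    except KeyError:
        raise ValueError(f"unknown virtual navigation key: {virtual_key}")
```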
  • Step 205. The display device sends the key instruction to the mobile phone.
  • Step 206. The mobile phone executes a navigation function corresponding to the key instruction according to a preset correspondence, to adjust the first screen content to second screen content.
  • The preset correspondence refers to a correspondence between key instructions from the display device and navigation functions of the mobile phone. When the key instruction corresponds to the menu key, the navigation function performed on the mobile phone is to enter the task menu. When the key instruction corresponds to the desktop key, the navigation function performed on the mobile phone is to return to the desktop. When the key instruction corresponds to the return key, the navigation function performed on the mobile phone is to return to the upper level. It may be understood that, functions of the virtual navigation keys on the computer and system navigation functions of the mobile phone are in a one-to-one correspondence.
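The preset correspondence on the mobile phone side can likewise be sketched as a one-to-one table from key instructions to navigation functions, consistent with the one-to-one correspondence the text describes. The instruction codes, function names, and returned screen identifiers are illustrative assumptions.

```python
# Hypothetical navigation functions; each returns the resulting
# second screen content (represented here by a placeholder string).
def enter_task_menu() -> str:
    return "task_menu_screen"

def return_to_desktop() -> str:
    return "desktop_screen"

def return_to_upper_level() -> str:
    return "previous_screen"

# The preset correspondence: key instructions from the display device
# map one-to-one to system navigation functions of the phone.
PRESET_CORRESPONDENCE = {
    "KEYCODE_APP_SWITCH": enter_task_menu,
    "KEYCODE_HOME": return_to_desktop,
    "KEYCODE_BACK": return_to_upper_level,
}

def execute_navigation(key_instruction: str) -> str:
    """Execute the navigation function for the received key instruction
    and return the second screen content to send back."""
    return PRESET_CORRESPONDENCE[key_instruction]()
```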
  • Step 207. The display device receives the second screen content sent by the mobile phone.
  • Step 208. The display device displays the second screen content in the screen projection area.
  • In this embodiment, if the mobile phone uses a navigation function other than a three-key navigation function, the display device sets the navigation bar in the collaboration window. A user performs a system navigation function on the mobile phone by using the three virtual navigation keys in the navigation bar, so that the screen content of the mobile phone and the content of the collaboration window are updated synchronously. This reduces mistaken touches that the user is prone to when controlling the mobile phone in the other navigation manner, and improves experience when the user uses the display device to control the mobile phone.
  • Referring to FIG. 3A, the collaboration window 40 of this application may include a screen projection area 301, a navigation bar 302, and a title bar 303. The navigation bar 302 and the title bar 303 may be hidden according to actual requirements.
  • The navigation bar 302 includes three virtual navigation keys, which are respectively a first virtual navigation key 3021, a second virtual navigation key 3022, and a third virtual navigation key 3023. Optionally, the first virtual navigation key 3021, the second virtual navigation key 3022, and the third virtual navigation key 3023 are respectively a return key, a desktop key, and a menu key. Functions of the virtual navigation keys and a sequence of the virtual navigation keys may be adjusted according to actual requirements.
  • The navigation bar 302 may be disposed outside the screen projection area 301 or may be disposed in the screen projection area 301. Optionally, when the collaboration window is a maximized window, the navigation bar 302 is disposed in the screen projection area 301; and when the collaboration window 40 is not a maximized window, the navigation bar 302 is disposed outside the screen projection area 301.
  • The title bar 303 includes a minimized window key, a maximized window key, and a close window key. In addition, the title bar 303 may further include a window name, a direction locking key, and the like, which is not limited herein.
  • The following describes a case that the collaboration window 40 is a maximized window and the navigation bar 302 is disposed in the screen projection area 301:
  • In another optional embodiment, when a mode of the collaboration window 40 is the maximized window mode and a pointer position is not located in a first target area, the navigation bar 302 is hidden, where the first target area is a part in an edge area of the screen projection area 301 and corresponding to the navigation bar 302; and when the mode of the collaboration window 40 is the maximized window mode and the pointer position is located in the first target area, the navigation bar 302 is displayed in the first target area.
  • Specifically, the first target area may be one or a combination of an upper edge area, a lower edge area, a left edge area, or a right edge area of the screen projection area 301, or may be a part of any edge area above, which is not limited in this application.
  • Referring to FIG. 3B, in one case, the collaboration window 40 is a maximized portrait window, and the first target area is a lower edge of the screen projection area 301. When the pointer is located at the lower edge of the screen projection area 301, the navigation bar 302 is displayed at the lower edge of the screen projection area 301. When the pointer is not located at the lower edge of the screen projection area 301, the navigation bar 302 is hidden. A manner for hiding the navigation bar 302 may be, but is not limited to: the navigation bar 302 moves downward from the lower edge of the screen projection area 301 and disappears, or the navigation bar 302 directly disappears.
  • Referring to FIG. 3C, in another case, the collaboration window 40 is a maximized landscape window, and the first target area is a right edge of the screen projection area 301. If the collaboration window 40 is the maximized landscape window, the screen projection area 301 may be a full-screen area. When the pointer is located at the right edge of the screen projection area 301, the navigation bar 302 is displayed at the right edge of the screen projection area 301. When the pointer is not located at the right edge of the screen projection area 301, the navigation bar 302 is hidden. A manner for hiding the navigation bar 302 may be, but is not limited to: the navigation bar 302 moves rightward from the right edge of the screen projection area 301 and disappears, or the navigation bar 302 directly disappears.
  • When the navigation bar 302 is displayed in the first target area in the screen projection area 301, the navigation bar 302 may block some displayed content of the screen projection area 301. The background of the navigation bar 302 may be set to transparent, to reduce blocking on the screen projection area 301.
  • When the mode of the collaboration window 40 is the maximized window mode and the pointer position is not located in the first target area, screen content of the mobile phone may be displayed in a full-screen manner by hiding the navigation bar 302. In this way, a manner for displaying the screen content of the mobile phone in a full-screen manner on the computer is provided, thereby improving user experience when watching a projected screen. In addition, a navigation bar that can be hidden or unfolded is provided. When the user intends to perform system navigation, the navigation bar may be invoked quickly, thereby providing convenient and quick access.
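The show/hide decision for the navigation bar in the maximized window mode can be sketched as a pointer hit test against the first target area. The sketch below assumes a portrait window whose first target area is a bottom-edge strip of the screen projection area; the strip height is an assumed parameter, not specified by the application.

```python
def nav_bar_visible(pointer_xy, area_w, area_h, strip_h=48, maximized=True):
    """Return True if the navigation bar should be displayed.

    In the maximized window mode the bar is shown only while the pointer
    is inside the first target area, taken here as the bottom strip of
    the screen projection area (strip_h is an assumed height in pixels).
    """
    if not maximized:
        # Outside the maximized window mode the bar is laid out normally.
        return True
    x, y = pointer_xy
    # Hit test: pointer inside the bottom-edge strip of the projection area.
    return 0 <= x <= area_w and area_h - strip_h <= y <= area_h
```

For a maximized landscape window, the same test would be applied to a right-edge strip instead.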
  • Referring to FIG. 3D, in another optional embodiment, the foregoing method further includes: when the mode of the collaboration window 40 is the maximized window mode and the pointer position is not located in a second target area, hiding the title bar 303, where the second target area is a part in an edge area of the screen projection area 301 and corresponding to the title bar 303; and when the mode of the collaboration window 40 is the maximized window mode and the pointer position is located in the second target area, displaying the title bar 303 in the second target area.
  • In this embodiment, the collaboration window 40 further includes the title bar 303. In one case, the collaboration window 40 is a maximized portrait window. In another case, the collaboration window 40 is a maximized landscape window, and the screen projection area 301 is a full-screen area in this case. Specifically, the second target area may be one or a combination of an upper edge area, a lower edge area, a left edge area, or a right edge area of the screen projection area 301, or may be a part of any edge area above, which is not limited in this application. It may be understood that, the second target area and the first target area are generally set to two separate areas.
  • When the mode of the collaboration window 40 is the maximized window mode and the pointer position is not located in the second target area, screen content of the mobile phone may be displayed in a full-screen manner by hiding the title bar 303. In this way, a manner for displaying the screen content of the mobile phone in a full-screen manner on the computer is provided, thereby improving user experience when watching a projected screen. When the user intends to view the title bar 303 or adjust the collaboration window 40, the title bar 303 may be invoked quickly, thereby providing convenient and quick access.
  • The following describes a case that the navigation bar 302 is disposed outside the screen projection area 301:
  • Referring to FIG. 3E, in another optional embodiment, when the collaboration window 40 is a portrait window, the navigation bar 302 is located below the screen projection area 301, and the navigation bar 302 is adjacent to the screen projection area 301. Such settings meet a habit that the user uses a three-key navigation function in a portrait mode, thereby improving user experience. When the collaboration window 40 is a portrait window, the navigation bar 302 may be alternatively disposed at another orientation relative to the screen projection area 301, such as above the screen projection area 301.
  • Referring to FIG. 3F, in another optional embodiment, when the collaboration window 40 is a landscape window, the navigation bar 302 is located on a right side of the screen projection area 301, and the navigation bar 302 is adjacent to the screen projection area 301. Such settings meet a habit that the user uses a three-key navigation function in a landscape mode, thereby improving user experience. When the collaboration window 40 is a landscape window, the navigation bar 302 may be alternatively disposed at another orientation of the screen projection area 301, such as a left side of the screen projection area 301.
  • It should be noted that, when a scale operation is performed on the collaboration window 40, the display device may scale the screen projection area 301 and the navigation bar 302 at the same ratio according to the scale operation.
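Scaling the screen projection area and the navigation bar at the same ratio can be sketched as follows. Sizes are (width, height) tuples; the rounding behavior is an assumption for illustration.

```python
def scale_window(projection_size, nav_bar_size, ratio):
    """Scale the screen projection area and the navigation bar by the
    same ratio when a scale operation is performed on the window."""
    pw, ph = projection_size
    nw, nh = nav_bar_size
    scaled_projection = (round(pw * ratio), round(ph * ratio))
    scaled_nav_bar = (round(nw * ratio), round(nh * ratio))
    return scaled_projection, scaled_nav_bar
```

Applying one ratio to both elements keeps the bar aligned with the projection area as the collaboration window is resized.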
  • In the collaboration window 40 of this application, the display device may start multiple navigation functions at the same moment, or may only start one navigation function and forbid other navigation functions.
  • In an optional embodiment, each navigation function in the collaboration window may be implemented through multiple types of keyboard and mouse operations.
  • For example, a navigation function of returning to a desktop is implemented through two keyboard and mouse operations. One keyboard and mouse operation is to click a desktop key, where the desktop key is a virtual navigation key. Another keyboard and mouse operation is to slide upward from a lower left end of the collaboration window. The rest may be deduced by analogy, and keyboard and mouse operations used for implementing other navigation functions may be set according to actual situations.
  • When the display device starts multiple navigation functions, the user may select keyboard and mouse operations to implement the navigation functions according to requirements, thereby providing implementation flexibility.
  • In another optional embodiment, the foregoing method further includes: forbidding the keyboard and mouse operation from simulating the navigation function corresponding to the target navigation function identifier in the collaboration window.
  • In this embodiment, when the mobile phone uses gesture navigation or out-of-screen physical navigation, the display device may forbid a keyboard and mouse operation from simulating the foregoing navigation function in the collaboration window. The display device only provides a three-key navigation function, so that a mistaken touch caused by using a keyboard and mouse operation to simulate the gesture navigation or out-of-screen physical navigation may be avoided in the collaboration window.
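The forbidding behavior can be sketched as an event filter in the collaboration window: when a non-three-key navigation function is active on the phone, simulated gesture or physical-key events are dropped and only the virtual-key path remains. The event-type names below are assumptions for illustration.

```python
from typing import Optional

# Hypothetical event types that would simulate gesture navigation or
# out-of-screen physical navigation in the collaboration window.
FORBIDDEN_WHEN_NON_THREE_KEY = {"edge_swipe_up", "edge_swipe_right", "physical_key_tap"}

def allow_event(event_type: str, target_nav_id: Optional[str]) -> bool:
    """Return True if the keyboard/mouse event may be forwarded.

    When target_nav_id is set (the phone uses a navigation function other
    than three-key navigation), events that would simulate that function
    are forbidden; ordinary events such as clicks pass through.
    """
    if target_nav_id is not None and event_type in FORBIDDEN_WHEN_NON_THREE_KEY:
        return False
    return True
```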
  • In another optional embodiment, the foregoing method further includes: sending, by the display device, a navigation function modification instruction to the mobile phone according to the target navigation function identifier, and modifying, by the mobile phone, the current navigation function to an in-screen three-key navigation function according to the navigation function modification instruction. In this embodiment, if the mobile phone uses another navigation function, the navigation function of the mobile phone is modified to the in-screen three-key navigation function. During screen projection, an in-screen three-key navigation bar is displayed both on the mobile phone and the display device, and the mobile phone may be controlled through the in-screen three-key navigation bar on the display device.
  • It should be noted that, in addition to a mobile phone, the foregoing control method may be further applied to another electronic device that uses gesture navigation or out-of-screen physical navigation.
  • The control method of this application is described above, and the following describes a related apparatus configured to implement the foregoing control method of this application. Referring to FIG. 4 , an embodiment of a display apparatus provided in this application includes:
  • a receiving module 401, configured to receive first screen content and a target navigation function identifier sent by a mobile phone, where the target navigation function identifier is used for identifying a current navigation function except a three-key navigation function on the mobile phone;
  • a display module 402, configured to generate a collaboration window including a screen projection area and a navigation bar according to the target navigation function identifier, where the navigation bar includes three virtual navigation keys; and display the first screen content in the screen projection area;
  • an input module 403, configured to receive a keyboard and mouse operation acting on the virtual navigation keys;
  • a processing module 404, configured to generate a key instruction according to the keyboard and mouse operation;
  • a sending module 405, configured to send the key instruction to the mobile phone, to cause the mobile phone to execute a navigation function according to the key instruction, where the navigation function is used for adjusting the first screen content to second screen content;
  • the receiving module 401 being further configured to receive the second screen content sent by the mobile phone; and
  • the display module 402 being further configured to display the second screen content in the collaboration window.
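The module pipeline of FIG. 4 can be sketched as follows. The module numbering follows the text (receiving 401, display 402, input 403, processing 404, sending 405), but the key-instruction values and message shapes are assumptions, not details from the patent.

```python
class DisplayApparatus:
    """Sketch of FIG. 4: receive screen content, show the navigation bar when
    the phone does not use three-key navigation, and turn key clicks into key
    instructions sent back to the phone."""

    KEY_INSTRUCTIONS = {            # processing module 404 (assumed values)
        "menu_key": "KEYCODE_MENU",
        "desktop_key": "KEYCODE_HOME",
        "return_key": "KEYCODE_BACK",
    }

    def __init__(self, send):
        self.send = send            # sending module 405
        self.screen_content = None
        self.navigation_bar_shown = False

    def on_receive(self, content, target_navigation_id):
        # receiving module 401 + display module 402
        self.navigation_bar_shown = target_navigation_id != "three_key"
        self.screen_content = content

    def on_key_click(self, key):
        # input module 403 + processing module 404 + sending module 405
        self.send(self.KEY_INSTRUCTIONS[key])
```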
  • In an optional embodiment, the current navigation function includes a gesture navigation function and/or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key.
  • In another optional embodiment,
  • the display module 402 is further configured to: when a mode of the collaboration window is a maximized window mode and a pointer position is not located in a first target area, hide the navigation bar, where the first target area is a part in an edge area of the screen projection area and corresponding to the navigation bar; and when the mode of the collaboration window is the maximized window mode and the pointer position is located in the first target area, display the navigation bar in the first target area.
  • In another optional embodiment, the collaboration window further includes a title bar; and
  • the display module 402 is further configured to: when the mode of the collaboration window is the maximized window mode and the pointer position is not located in a second target area, hide the title bar, where the second target area is a part in the edge area of the screen projection area and corresponding to the title bar; and when the mode of the collaboration window is the maximized window mode and the pointer position is located in the second target area, display the title bar in the second target area.
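The pointer hit test behind the two embodiments above can be sketched as follows: in maximized-window mode the title bar (top edge strip, the second target area) and the navigation bar (bottom edge strip, the first target area) are shown only while the pointer is inside their strips. The coordinate convention (y grows downward) and the strip height are illustrative assumptions.

```python
def bars_visible(maximized, pointer_y, window_height, strip=40):
    """Return (title_bar_visible, navigation_bar_visible).

    Outside maximized-window mode both bars are always shown; in maximized
    mode each bar appears only while the pointer is in its edge strip."""
    if not maximized:
        return True, True
    in_title_strip = pointer_y < strip                    # second target area
    in_nav_strip = pointer_y > window_height - strip      # first target area
    return in_title_strip, in_nav_strip
```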
  • In another optional embodiment, the processing module 404 is further configured to forbid the keyboard and mouse operation from simulating the navigation function corresponding to the target navigation function identifier in the collaboration window.
  • Referring to FIG. 5 , an embodiment of a mobile phone provided in this application includes:
  • a sending module 501, configured to send first screen content and a target navigation function identifier to a display device, where the target navigation function identifier is used for identifying a current navigation function except a three-key navigation function on the mobile phone;
  • a receiving module 502, configured to receive a key instruction sent by the display device, where the key instruction is generated by the display device according to a keyboard and mouse operation;
  • a processing module 503, configured to execute a navigation function corresponding to the key instruction according to a preset correspondence, where the navigation function is used for adjusting the first screen content to second screen content; and
  • the sending module 501 being further configured to send the second screen content to the display device, to cause the display device to display the second screen content in a screen projection area of a collaboration window.
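The phone-side "preset correspondence" of processing module 503 can be sketched as a table from key instruction to navigation function; executing the function yields the second screen content returned to the display device. The instruction values and the `render` callback are assumptions for illustration.

```python
# Assumed preset correspondence between key instructions and navigation
# functions (processing module 503).
PRESET_CORRESPONDENCE = {
    "KEYCODE_MENU": "enter_task_menu",
    "KEYCODE_HOME": "return_to_desktop",
    "KEYCODE_BACK": "return_to_upper_level",
}

def handle_key_instruction(instruction, render):
    """Execute the navigation function for a key instruction and return the
    adjusted (second) screen content produced by the render callback."""
    function = PRESET_CORRESPONDENCE[instruction]
    return render(function)
```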
  • In an optional embodiment, the current navigation function includes a gesture navigation function and/or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key.
  • Referring to FIG. 6 , an embodiment of a screen projection system provided in this application includes:
  • a display device 20 and a mobile phone 10, where the mobile phone 10 and the display device 20 are connected to each other through a radio link 30;
  • the display device 20 is configured to: receive first screen content and a state of a target navigation function sent by the mobile phone 10, where the target navigation function includes mobile phone navigation functions except a three-key navigation function; generate a collaboration window including a screen projection area and a navigation bar when the state of the target navigation function is a started state, and display the first screen content in the screen projection area, where the navigation bar includes three virtual navigation keys; receive a keyboard and mouse operation acting on the virtual navigation keys; generate a key instruction according to the keyboard and mouse operation and send the key instruction to the mobile phone to cause the mobile phone to execute a navigation function according to the key instruction; receive second screen content sent by the mobile phone; and display the second screen content in the screen projection area; and
  • the mobile phone 10 is configured to: send first screen content and a state of a target navigation function to the display device 20; receive a key instruction sent by the display device when the state of the target navigation function is a started state; execute a navigation function according to the key instruction, where the navigation function is used for adjusting the first screen content to second screen content; and send the second screen content to the display device.
  • Specifically, the radio link may be a wireless fidelity (wireless fidelity, WiFi) link or a Bluetooth link. Functions of the display device 20 may be the same as those of the display device in the embodiment shown in FIG. 4 or other optional embodiments. Functions of the mobile phone 10 may be the same as those of the mobile phone in the embodiment shown in FIG. 5 or other optional embodiments.
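The end-to-end round trip of the FIG. 6 system can be sketched with one object per side, connected by an assumed in-process call standing in for the radio link; all instruction values and content strings are illustrative.

```python
class PhoneSide:
    """Phone 10: projects screen content and executes key instructions."""
    def __init__(self, screens):
        self.screens = screens          # key instruction -> resulting content
        self.current = "first screen content"

    def on_key_instruction(self, instruction):
        self.current = self.screens[instruction]
        return self.current             # second screen content sent back

class DisplaySide:
    """Display device 20: shows projected content, relays key instructions."""
    def __init__(self, phone):
        self.phone = phone
        self.projection_area = phone.current   # initial projection

    def click_virtual_key(self, instruction):
        # keyboard/mouse click -> key instruction -> phone -> updated content
        self.projection_area = self.phone.on_key_instruction(instruction)
```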
  • Referring to FIG. 7 , another embodiment of a display device 20 provided in this application includes:
  • an input apparatus 701, a display apparatus 702, a memory 703, a processor 704, a transceiver 705, and a data interface 706, where the input apparatus 701, the display apparatus 702, the memory 703, the processor 704, the transceiver 705, and the data interface 706 may be connected through a bus.
  • The input apparatus 701 may be a keyboard or a mouse.
  • The display apparatus 702 may be a display, a projector, or another device for display.
  • The memory 703 may be a volatile memory or a non-volatile memory, or may include a volatile memory and a non-volatile memory. The non-volatile memory may be a read-only memory (read-only memory, ROM), a programmable ROM (programmable ROM, PROM), an erasable PROM (erasable PROM, EPROM), an electrically EPROM (electrically EPROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (random access memory, RAM), and is used as an external cache. It should be noted that the memory described herein is intended to include, but is not limited to, these memories and any other suitable types of memories.
  • The foregoing processor 704 may be a general-purpose processor, including a central processing unit (central processing unit, CPU), a network processor (network processor, NP), and the like; or may be a digital signal processor (digital signal processing, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), a field-programmable gate array (field-programmable gate array, FPGA), or another programmable logic device. The processor 704 is configured to implement the functions of the display device in the foregoing embodiments by invoking program code in the memory 703.
  • The transceiver 705 is configured to send and receive data in wireless communication. A component configured to implement a receiving function in the transceiver 705 may be regarded as a receiver, and a component configured to implement a sending function in the transceiver 705 may be regarded as a sender. That is, the transceiver 705 includes a receiver and a sender. The transceiver 705 may also be referred to as a transceiver machine or a transceiver circuit. The receiver may sometimes also be referred to as a receiving machine or a receiving circuit. The sender may sometimes also be referred to as a transmitting machine or a transmitting circuit.
  • The data interface 706 is connected to the mobile phone in a wired manner.
  • The structural diagram provided in this embodiment only shows a simplified design of the display device 20. In an actual application, the display device 20 may include any quantity of input apparatuses 701, display apparatuses 702, memories 703, processors 704, transceivers 705, and data interfaces 706, to implement the functions or operations performed by the display device 20 in the apparatus embodiments of this application, and all apparatuses that can implement this application fall within the protection scope of this application. Although not shown in the figure, the display device 20 may further include a power supply, and the like. The power supply is configured to supply power to various components, and may be logically connected to the processor 704 by using a power management system, thereby implementing functions such as charging, discharging, and power consumption management by using the power management system.
  • Referring to FIG. 8 , another embodiment of a mobile phone provided in this application includes:
  • components such as a radio frequency (radio frequency, RF) circuit 810, a memory 820, an input unit 830, a display unit 840, a Bluetooth module 850, an audio circuit 860, a WiFi module 870, a processor 880, and a power supply 890. A person skilled in the art may understand that the structure of the mobile phone shown in FIG. 8 does not constitute a limitation on the mobile phone, and the mobile phone may include more components or fewer components than those shown in the figure, or some components may be combined, or a different component deployment may be used.
  • The following makes a detailed description of the components of the mobile phone with reference to FIG. 8 :
  • The RF circuit 810 may be configured to send and receive signals during an information receiving and sending process or a call process. Specifically, the RF circuit receives downlink information from a base station, then delivers the downlink information to the processor 880 for processing, and sends related uplink data to the base station. Generally, the RF circuit 810 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (low noise amplifier, LNA), a duplexer, and the like. In addition, the RF circuit 810 may also communicate with a network and another device through wireless communication. The wireless communication may use any communication standard or protocol, including, but not limited to, Global System for Mobile Communications (Global System of Mobile communications, GSM), General Packet Radio Service (General Packet Radio Service, GPRS), Code Division Multiple Access (Code Division Multiple Access, CDMA), Wideband Code Division Multiple Access (Wideband Code Division Multiple Access, WCDMA), Long Term Evolution (Long Term Evolution, LTE), email, Short Messaging Service (Short Messaging Service, SMS), and the like.
  • The memory 820 may be configured to store a software program and a module. The processor 880 runs the software program and the module that are stored in the memory 820, to implement various functional applications and data processing of the mobile phone. The memory 820 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (for example, a sound playback function and an image display function), and the like. The data storage area may store data (for example, audio data and an address book) created according to the use of the mobile phone, and the like. In addition, the memory 820 may include a high speed RAM, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory, or another non-volatile solid-state storage device.
  • The input unit 830 may be configured to receive inputted digit or character information, and generate a keyboard signal input related to the user setting and function control of the mobile phone. Specifically, the input unit 830 may include a touch panel 831 and another input device 832. The touch panel 831, which may also be referred to as a touch screen, may collect a touch operation of a user on or near the touch panel (such as an operation of a user on or near the touch panel 831 by using any suitable object or accessory such as a finger or a stylus), and drive a corresponding connection apparatus according to a preset program. Optionally, the touch panel 831 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects a touch position of the user, detects a signal generated by the touch operation, and transfers the signal to the touch controller. The touch controller receives touch information from the touch detection apparatus, converts the touch information into touch point coordinates, and transmits the touch point coordinates to the processor 880. In addition, the touch controller can receive a command transmitted by the processor 880 and execute the command. The touch panel 831 may be implemented in various types, such as a resistive type, a capacitive type, an infrared type, and a surface acoustic wave type. In addition to the touch panel 831, the input unit 830 may further include another input device 832. Specifically, the another input device 832 may include, but is not limited to, one or more of a functional key (such as a volume control key or a switch key), a track ball, and a joystick.
  • The display unit 840 may be configured to display information inputted by the user or information provided for the user, and various menus of the mobile phone. The display unit 840 may include a display panel 841. Optionally, the display panel 841 may be configured by using a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), or the like. Further, the touch panel 831 may cover the display panel 841. After detecting a touch operation on or near the touch panel, the touch panel 831 transfers the touch operation to the processor 880, to determine a type of a touch event. Then, the processor 880 provides a corresponding visual output on the display panel 841 according to the type of the touch event. Although in FIG. 8 , the touch panel 831 and the display panel 841 are used as two separate parts to implement input and output functions of the mobile phone, in some embodiments, the touch panel 831 and the display panel 841 may be integrated to implement the input and output functions of the mobile phone.
  • The mobile phone may further include a Bluetooth module 850.
  • The audio circuit 860, a speaker 861, and a microphone 862 may provide audio interfaces between the user and the mobile phone. The audio circuit 860 may convert received audio data into an electrical signal and transmit the electrical signal to the speaker 861. The speaker 861 converts the electrical signal into a sound signal for output. On the other hand, the microphone 862 converts a collected sound signal into an electrical signal. The audio circuit 860 receives the electrical signal and converts the electrical signal into audio data, and outputs the audio data to the processor 880 for processing. Then, the processor sends the audio data to, for example, another mobile phone by using the RF circuit 810, or outputs the audio data to the memory 820 for further processing.
  • WiFi is a short distance wireless transmission technology. The mobile phone may help, by using the WiFi module 870, a user to send and receive an email, browse a web page, access streaming media, and the like. This provides wireless broadband Internet access for the user. Although FIG. 8 shows the WiFi module 870, it may be understood that the WiFi module is not a necessary component of the mobile phone, and the WiFi module may be omitted as required provided that the scope of the essence of the present invention is not changed.
  • The processor 880 is a control center of the mobile phone, and is connected to various parts of the entire mobile phone by using various interfaces and lines. By running or executing the software program and/or the module stored in the memory 820, and invoking data stored in the memory 820, the processor executes various functions of the mobile phone and performs data processing, thereby monitoring the entire mobile phone. Optionally, the processor 880 may include one or more processing units. For example, the processor 880 may integrate an application processor and a modem processor. The application processor mainly processes an operating system, a user interface, an application program, and the like. The modem processor mainly processes wireless communication. It may be understood that the modem processor may alternatively not be integrated into the processor 880.
  • The mobile phone further includes the power supply 890 (such as a battery) for supplying power to the components. For example, the power supply may be logically connected to the processor 880 by using a power management system, thereby implementing functions such as charging, discharging, and power consumption management by using the power management system.
  • Although not shown in the figure, the mobile phone may further include a camera, a sensor, and the like. Details are not described herein again.
  • In the embodiments of the present invention, by invoking the program stored in the memory 820, the processor 880 included in the mobile phone may implement the functions of the mobile phone in the embodiment shown in FIG. 2 or other optional embodiments.
  • In this application, “multiple” means two or more, and another quantifier is similar to this. “And/or” represents that three relationships may exist. For example, A and/or B may represent the following three cases: only A exists, both A and B exist, and only B exists. It should be understood that sequence numbers of the foregoing processes do not mean execution sequences in various embodiments of the present invention. The execution sequences of the processes should be determined according to functions and internal logic of the processes, and should not be construed as any limitation on the implementation processes of the embodiments of the present invention.
  • This application provides a computer storage medium, including instructions, the instructions, when run on a computing device, causing the computing device to perform the steps implemented by a display device in any embodiment above.
  • This application further provides a computer storage medium, including instructions, the instructions, when run on a computing device, causing the computing device to perform the steps implemented by a mobile phone in any embodiment above.
  • This application further provides a computer program product, the computer program product, when run on a computing device, causing the computing device to perform the steps implemented by a display device in any embodiment above.
  • This application further provides a computer program product, the computer program product, when run on a computing device, causing the computing device to perform the steps implemented by a mobile phone in any embodiment above.
  • All or some of the foregoing embodiments may be implemented by using software, hardware, firmware, or any combination thereof. When software is used for implementation, implementation may be entirely or partially performed in the form of a computer program product.
  • The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the display device or the mobile phone, the procedure or functions according to this application are all or partially generated. The computer instructions may be stored in a computer-readable storage medium or may be transmitted from a computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, a coaxial cable, an optical fiber, or a digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any usable medium accessible by the computer, or a data storage device, such as a server or a data center, including one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (solid state disk, SSD)), or the like.
  • The foregoing embodiments are merely intended for describing the technical solutions of this application, but not for limiting this application. Although this application is described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that modifications may be still made to the technical solutions recorded in the foregoing embodiments or equivalent replacements may be made to some technical features thereof, and these modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (20)

What is claimed is:
1. An electronic device, comprising:
one or more processors; a display screen coupled to the one or more processors; and
a memory coupled to the one or more processors, wherein the memory is configured to store instructions that, when executed by the one or more processors, cause the electronic device to be configured to:
display a first screen content;
send the first screen content to a display device, wherein a current navigation function of the electronic device is a navigation function other than a three-key navigation function;
receive a first message sent by the display device, wherein the first message is generated by the display device according to a first operation, the first operation acts on a virtual navigation key, and the virtual navigation key is generated by the display device in response to the current navigation function of the electronic device being a navigation function other than a three-key navigation function;
perform a navigation function corresponding to the virtual navigation key according to the first message, and adjust the first screen content to a second screen content; and
send the second screen content to the display device, so that the display device displays the second screen content in a screen projection area.
2. The electronic device according to claim 1, wherein the instructions, when executed by the one or more processors, further cause the electronic device to be configured to:
before receiving the first message sent by the display device, send a target navigation function identifier to the display device, wherein the target navigation function identifier is used for identifying the current navigation function except a three-key navigation function on the electronic device.
3. The electronic device according to claim 1, wherein
the virtual navigation key is in a navigation bar, and the navigation bar is generated by the display device in response to the current navigation function of the electronic device being a navigation function other than a three-key navigation function; the navigation bar includes three virtual navigation keys, and the first operation acts on one virtual navigation key among the three virtual navigation keys.
4. The electronic device according to claim 1, wherein the first message includes a key press instruction.
5. The electronic device according to claim 2, wherein
the three virtual navigation keys include a menu key, a desktop key and a return key, wherein the menu key corresponds to a navigation function of entering a task menu, the desktop key corresponds to a navigation function of returning to a desktop, and the return key corresponds to a navigation function of returning to an upper level.
6. The electronic device according to claim 1, wherein the electronic device and the display device are connected by wired connection or wireless connection.
7. The electronic device according to claim 1, wherein the first operation is a single-click on the virtual navigation key through a mouse pointer.
8. The electronic device according to claim 1, wherein the navigation bar is located below the screen projection area.
9. The electronic device according to claim 1, wherein the current navigation function comprises a gesture navigation function or an out-of-screen physical navigation function, and the out-of-screen physical navigation function is implemented by using a physical key.
10. The electronic device according to claim 1, wherein the display device is a desktop computer or a notebook computer.
11. The electronic device according to claim 1, wherein the instructions, when executed by the one or more processors, further cause the electronic device to be configured to:
after adjusting the first screen content to the second screen content, receive a second message sent by the display device, wherein the second message is generated by the display device according to a second operation, and the second operation is an operation that slides inward from an edge of the screen projection area; and
perform a navigation function corresponding to the second operation according to the second message, and adjust the second screen content to a third screen content.
12. The electronic device according to claim 1, wherein the screen projection area and the navigation bar are located in a same collaboration window.
13. The electronic device according to claim 1, wherein the collaboration window comprises a title bar, and the title bar comprises at least a minimized window key, a maximized window key, and a close window key.
14. The electronic device according to claim 13, wherein the title bar is located above the screen projection area.
15. The electronic device according to claim 1, wherein
when the first operation is an operation acting on a menu key, the second screen content is a task menu;
when the first operation is an operation acting on a desktop key, the second screen content is a desktop;
when the first operation is an operation acting on a return key, the second screen content is the content of an upper level.
16. The electronic device according to claim 2, wherein the target navigation function identifier includes any one of a string, a number, a picture, a symbol, or text.
17. The electronic device according to claim 1, wherein the instructions, when executed by the one or more processors, further cause the electronic device to be configured to:
display a fourth screen content when the electronic device is in the three-key navigation function, wherein the fourth screen content includes an in-screen three-key navigation bar; and
send the fourth screen content to the display device, so that the display device displays the fourth screen content in the screen projection area and does not display the navigation bar.
18. The electronic device according to claim 1, wherein the instructions, when executed by the one or more processors, further cause the electronic device to be configured to:
return to a desktop when the electronic device receives an operation of sliding up from the lower left end of the screen;
enter a task menu when the electronic device receives an operation of swiping up from the lower left end of the screen and holding for a period of time; and
return to an upper level when the electronic device receives an operation of sliding rightward from the leftmost side of the screen of the electronic device.
19. A control method applied to a screen projection scenario, comprising:
displaying a first screen content;
sending the first screen content to a display device, wherein a current navigation function of an electronic device is a navigation function other than a three-key navigation function;
receiving a first message sent by the display device, wherein the first message is generated by the display device according to a first operation, the first operation acts on a virtual navigation key, and the virtual navigation key is generated by the display device in response to the current navigation function of the electronic device being a navigation function other than a three-key navigation function;
performing a navigation function corresponding to the virtual navigation key according to the first message, and adjusting the first screen content to a second screen content; and
sending the second screen content to the display device, so that the display device displays the second screen content in a screen projection area.
20. A non-transitory computer-readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device to:
display a first screen content;
send the first screen content to a display device, wherein a current navigation function of the electronic device is a navigation function other than a three-key navigation function;
receive a first message sent by the display device, wherein the first message is generated by the display device according to a first operation, the first operation acts on a virtual navigation key, and the virtual navigation key is generated by the display device in response to the current navigation function of the electronic device being a navigation function other than the three-key navigation function;
perform the navigation function corresponding to the virtual navigation key according to the first message, and adjust the first screen content to a second screen content; and
send the second screen content to the display device, so that the display device displays the second screen content in a screen projection area.
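The control loop recited in claims 19 and 20 can be sketched as follows. This is a minimal illustration only: every class, method, and string below is a hypothetical name chosen for the sketch and does not appear in the patent's disclosure.

```python
# Sketch of the claimed flow: the display device generates virtual
# navigation keys only when the projecting device's current navigation
# function is something other than three-key navigation; an operation on
# a virtual key produces a message, the projecting device performs the
# navigation function and sends back the adjusted screen content.
from dataclasses import dataclass
from enum import Enum, auto


class NavMode(Enum):
    THREE_KEY = auto()  # on-screen Back/Home/Recents keys already in the projected image
    GESTURE = auto()    # a navigation function other than three-key navigation


@dataclass
class NavMessage:
    """The 'first message': identifies which virtual key the operation acted on."""
    key: str


class ElectronicDevice:
    """The projecting device (e.g., a phone)."""

    def __init__(self, nav_mode: NavMode):
        self.nav_mode = nav_mode
        self.screen = "first screen content"

    def handle_message(self, msg: NavMessage) -> str:
        # Perform the navigation function corresponding to the virtual key,
        # adjusting the first screen content to a second screen content.
        if msg.key == "back":
            self.screen = "second screen content (upper level)"
        return self.screen


class DisplayDevice:
    """The display device rendering the screen projection area."""

    def __init__(self, source: ElectronicDevice):
        self.source = source
        # Virtual keys exist only when the source does not already draw
        # three-key navigation inside its projected screen content.
        self.virtual_keys = (
            ["back", "home", "recents"]
            if source.nav_mode != NavMode.THREE_KEY
            else []
        )
        self.projection_area = source.screen

    def press_virtual_key(self, key: str) -> None:
        # The 'first operation': generate the first message, let the source
        # perform the navigation, then display the second screen content.
        assert key in self.virtual_keys
        self.projection_area = self.source.handle_message(NavMessage(key))


phone = ElectronicDevice(NavMode.GESTURE)
tv = DisplayDevice(phone)
tv.press_virtual_key("back")
```

The design point of the claim is the condition on key generation: a gesture-navigated phone projects a frame with no navigation controls in it, so the display device must synthesize them; a three-key phone already projects its keys, so synthesizing more would duplicate them.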
US18/473,483 2019-08-29 2023-09-25 Control method applied to screen projection scenario and related device Pending US20240012562A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/473,483 US20240012562A1 (en) 2019-08-29 2023-09-25 Control method applied to screen projection scenario and related device

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN201910809113.9A CN110673782B (en) 2019-08-29 2019-08-29 Control method applied to screen projection scene and related equipment
CN201910809113.9 2019-08-29
PCT/CN2020/103440 WO2021036594A1 (en) 2019-08-29 2020-07-22 Control method applied to screen projection scenario and related device
US202217638567A 2022-02-25 2022-02-25
US18/473,483 US20240012562A1 (en) 2019-08-29 2023-09-25 Control method applied to screen projection scenario and related device

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US17/638,567 Continuation US11809704B2 (en) 2019-08-29 2020-07-22 Control method applied to screen projection scenario and related device
PCT/CN2020/103440 Continuation WO2021036594A1 (en) 2019-08-29 2020-07-22 Control method applied to screen projection scenario and related device

Publications (1)

Publication Number Publication Date
US20240012562A1 true US20240012562A1 (en) 2024-01-11

Family

ID=69075691

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/638,567 Active US11809704B2 (en) 2019-08-29 2020-07-22 Control method applied to screen projection scenario and related device
US18/473,483 Pending US20240012562A1 (en) 2019-08-29 2023-09-25 Control method applied to screen projection scenario and related device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/638,567 Active US11809704B2 (en) 2019-08-29 2020-07-22 Control method applied to screen projection scenario and related device

Country Status (5)

Country Link
US (2) US11809704B2 (en)
EP (1) EP4016272A4 (en)
CN (3) CN115357178B (en)
MX (1) MX2022002472A (en)
WO (1) WO2021036594A1 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115357178B (en) 2019-08-29 2023-08-08 荣耀终端有限公司 Control method applied to screen projection scenario and related device
CN111327769B (en) 2020-02-25 2022-04-08 北京小米移动软件有限公司 Multi-screen interaction method and device and storage medium
CN111596875B (en) * 2020-04-17 2022-09-13 维沃移动通信有限公司 Screen expansion method and electronic equipment
CN111562896B (en) * 2020-04-26 2023-05-19 维沃移动通信(杭州)有限公司 Screen projection method and electronic equipment
CN111597000B (en) * 2020-05-14 2023-08-01 青岛海信移动通信技术有限公司 Small window management method and terminal
CN111913628B (en) * 2020-06-22 2022-05-06 维沃移动通信有限公司 Sharing method and device and electronic equipment
CN115480670A (en) * 2020-09-07 2022-12-16 华为技术有限公司 Navigation bar display method, navigation bar display method and first electronic equipment
CN113553014B (en) * 2020-09-10 2023-01-06 华为技术有限公司 Application interface display method under multi-window screen projection scene and electronic equipment
WO2022052907A1 (en) * 2020-09-10 2022-03-17 华为技术有限公司 Display method and electronic device
CN114253496A (en) * 2020-09-10 2022-03-29 华为技术有限公司 Display method and electronic equipment
CN117093165A (en) * 2020-12-24 2023-11-21 华为技术有限公司 Equipment control method and terminal equipment
CN114691059B (en) * 2020-12-25 2024-03-26 华为技术有限公司 Screen-throwing display method and electronic equipment
CN112631538A (en) * 2020-12-30 2021-04-09 安徽鸿程光电有限公司 Display method, device, equipment and computer storage medium
CN115442509B (en) * 2021-06-01 2023-10-13 荣耀终端有限公司 Shooting method, user interface and electronic equipment
CN113507694B (en) * 2021-06-18 2024-02-23 厦门亿联网络技术股份有限公司 Screen projection method and device based on wireless auxiliary stream equipment
CN116774870A (en) * 2022-03-15 2023-09-19 荣耀终端有限公司 Screen capturing method and device
CN114860143A (en) * 2022-05-20 2022-08-05 Oppo广东移动通信有限公司 Navigation control method and device, terminal equipment and storage medium

Family Cites Families (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050091359A1 (en) 2003-10-24 2005-04-28 Microsoft Corporation Systems and methods for projecting content from computing devices
KR100714707B1 (en) * 2006-01-06 2007-05-04 삼성전자주식회사 Apparatus and method for navigation in 3-dimensional graphic user interface
JP2008123408A (en) * 2006-11-15 2008-05-29 Brother Ind Ltd Projection apparatus, program, projection method, and projection system
US8850052B2 (en) * 2008-09-30 2014-09-30 Apple Inc. System and method for simplified resource sharing
US8698845B2 (en) * 2010-01-06 2014-04-15 Apple Inc. Device, method, and graphical user interface with interactive popup views
KR101763593B1 (en) * 2010-08-24 2017-08-01 엘지전자 주식회사 Method for synchronizing contents and user device enabling of the method
WO2012046890A1 (en) * 2010-10-06 2012-04-12 엘지전자 주식회사 Mobile terminal, display device, and method for controlling same
KR101495190B1 (en) * 2011-03-25 2015-02-24 엘지전자 주식회사 Image display device and operation method of the image display device
KR101788060B1 (en) * 2011-04-13 2017-11-15 엘지전자 주식회사 Image display device and method of managing contents using the same
KR101857563B1 (en) 2011-05-11 2018-05-15 삼성전자 주식회사 Method and apparatus for data sharing of between different network electronic devices
US8806369B2 (en) * 2011-08-26 2014-08-12 Apple Inc. Device, method, and graphical user interface for managing and interacting with concurrently open software applications
CA2798291C (en) * 2011-12-07 2016-11-01 Research In Motion Limited Presenting context information in a computing device
KR101522399B1 (en) * 2011-12-23 2015-05-22 주식회사 케이티 Method for displaying image from handheld terminal to display device and handheld terminal thereof
JP5999452B2 (en) 2012-01-26 2016-09-28 パナソニックIpマネジメント株式会社 Mobile terminal and device linkage method
US20130219303A1 (en) * 2012-02-21 2013-08-22 Research In Motion Tat Ab Method, apparatus, and system for providing a shared user interface
US9513793B2 (en) * 2012-02-24 2016-12-06 Blackberry Limited Method and apparatus for interconnected devices
US9360997B2 (en) * 2012-08-29 2016-06-07 Apple Inc. Content presentation and interaction across multiple displays
US9619128B2 (en) * 2013-07-01 2017-04-11 Microsoft Technology Licensing, Llc Dynamic presentation prototyping and generation
KR102138505B1 (en) 2013-07-10 2020-07-28 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR102144553B1 (en) * 2013-08-30 2020-08-13 삼성전자주식회사 Multiple-display method, machine-readable storage medium and electronic device
US9507482B2 (en) * 2013-10-07 2016-11-29 Narsys, LLC Electronic slide presentation controller
CN103713847A (en) * 2013-12-25 2014-04-09 华为终端有限公司 System bar control method of user equipment and user equipment
US9870058B2 (en) * 2014-04-23 2018-01-16 Sony Corporation Control of a real world object user interface
KR102219861B1 (en) * 2014-05-23 2021-02-24 삼성전자주식회사 Method for sharing screen and electronic device thereof
KR102269481B1 (en) * 2014-10-17 2021-06-28 삼성전자주식회사 Method for screen sharing with devices and device thereof
CN106155614B (en) * 2015-04-24 2020-04-24 联想(北京)有限公司 Operation information transmission method and electronic equipment
CN104967886B (en) 2015-05-28 2018-09-25 深圳市创易联合科技有限公司 Wireless display method and system
US10459675B2 (en) * 2015-09-04 2019-10-29 Fives Cinetic Corp. System and method for controlling a process line using a PLC and scalable HMI control template
CN107870754A (en) * 2016-09-28 2018-04-03 法乐第(北京)网络科技有限公司 A kind of method and device of the content shown on control device
CN106873846A (en) * 2016-12-29 2017-06-20 北京奇虎科技有限公司 A kind of PC ends control the method and system of mobile device
CN108702414B (en) * 2017-06-16 2021-04-09 华为技术有限公司 Screen locking method and device and computer readable storage medium
CN109218731B (en) * 2017-06-30 2021-06-01 腾讯科技(深圳)有限公司 Screen projection method, device and system of mobile equipment
KR20190021016A (en) 2017-08-22 2019-03-05 삼성전자주식회사 Electronic device and control method thereof
CN107547750B (en) * 2017-09-11 2019-01-25 Oppo广东移动通信有限公司 Control method, device and the storage medium of terminal
CN107682724B (en) 2017-09-29 2020-05-01 北京盛世辉科技有限公司 Display method, display device, intelligent remote controller and computer readable storage medium
CN107743220 (en) 2017-10-24 2018-02-27 西安万像电子科技有限公司 Status prompting method and apparatus, and screen projection system
CN108459836B (en) * 2018-01-19 2019-05-31 广州视源电子科技股份有限公司 Annotate display methods, device, equipment and storage medium
CN109753256 (en) 2018-03-29 2019-05-14 北京字节跳动网络技术有限公司 Layout method, apparatus, and device for a window control bar
CN108595137B (en) * 2018-04-25 2021-05-04 广州视源电子科技股份有限公司 Wireless screen projection method and device and screen projector
CN115357178B (en) 2019-08-29 2023-08-08 荣耀终端有限公司 Control method applied to screen projection scenario and related device

Also Published As

Publication number Publication date
CN115357178A (en) 2022-11-18
CN110673782B (en) 2022-11-29
CN115793950A (en) 2023-03-14
CN115357178B (en) 2023-08-08
EP4016272A1 (en) 2022-06-22
WO2021036594A1 (en) 2021-03-04
EP4016272A4 (en) 2022-11-09
MX2022002472A (en) 2022-06-08
CN110673782A (en) 2020-01-10
US11809704B2 (en) 2023-11-07
US20220300153A1 (en) 2022-09-22

Similar Documents

Publication Publication Date Title
US20240012562A1 (en) Control method applied to screen projection scenario and related device
US11861161B2 (en) Display method and apparatus
CN110069179B (en) Icon control method and terminal equipment
WO2019015404A1 (en) Method and apparatus for switching applications in split screen mode, and related device thereof
US10057201B2 (en) Method and apparatus for managing the display of messages of a group chat
US11579946B2 (en) Method for managing multiple operating systems in a terminal
US20220300302A1 (en) Application sharing method and electronic device
JP6492184B2 (en) Method, device, and system for managing information recommendations
CN106445340B (en) Method and device for displaying stereoscopic image by double-screen terminal
EP4228226A1 (en) File sending method and apparatus, and electronic device
WO2020215969A1 (en) Content input method and terminal device
US10298590B2 (en) Application-based service providing method, apparatus, and system
CN111512278B (en) Method for processing application of terminal equipment and terminal equipment
CN104238931B (en) Information input method and device and electronic equipment
CN108811177B (en) Communication method and terminal
CN109739409B (en) Batch processing method and device and terminal equipment
RU2791547C1 (en) Control method applied to the screen projection scenario and the corresponding device
RU2816127C2 (en) Control method applied to screen projection scenario, and corresponding device
CN106681845B (en) Method and device for managing communication messages
CN111124149A (en) Input method and electronic equipment
CN110543273A (en) split screen display method and device for terminal
CN110109582B (en) Display method and device of mobile terminal and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: HONOR DEVICE CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GU, HEJIN;NIU, SIYUE;REEL/FRAME:065007/0672

Effective date: 20220217

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION