WO2020253760A1 - Input method, electronic device and screen projection system - Google Patents

Input method, electronic device and screen projection system

Info

Publication number
WO2020253760A1
WO2020253760A1 (PCT/CN2020/096726; CN2020096726W)
Authority
WO
WIPO (PCT)
Prior art keywords
projection
input
screen
content
edit box
Prior art date
Application number
PCT/CN2020/096726
Other languages
English (en)
French (fr)
Inventor
Li Ronggen (李荣根)
Hu Kai (胡凯)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to ES20825506T: ES2967240T3 (es)
Priority to EP20825506.7A: EP3955556B1 (en)
Priority to US17/615,977: US12032866B2 (en)
Publication of WO2020253760A1 (zh)

Classifications

    • G06F3/1454: Digital output to display device; cooperation and interconnection of the display device with other functional units, involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric, operand or instruction codes
    • G06F3/0233: Character input methods
    • G06F3/038: Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04895: Guidance during keyboard input operation, e.g. prompting
    • G06F2203/0383: Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN
    • G09G2354/00: Aspects of interface with display user
    • G09G5/12: Synchronisation between the display unit and other units, e.g. other display units, video-disc players

Definitions

  • This application relates to the technical field of data processing, and in particular, to an input method, an electronic device, and a screen projection system.
  • Mobile phones can play videos and can also display documents, pictures, application interfaces, or web pages. Since the display screen of a mobile phone is small, when the content displayed on the phone needs to be shown to others, that content can be projected onto a display device (such as a TV, a computer, or another mobile phone) through screen projection technology. In the screen projection scenario, when a user wants to input content into the mobile phone, this can currently be achieved in the following two ways.
  • This application provides an input method, an electronic device, and a screen projection system.
  • The content to be displayed can be input through the input device of the projection destination and displayed in the edit box of the projection source, so that content can be input to the projection source quickly and efficiently.
  • A first aspect provides an input method, including: a projection destination displays a projection window on at least a part of a display interface, where the projection window is a mirror image of the screen display content of the projection source, and the screen display content includes a first edit box; the projection destination receives input activation information from the projection source, where the input activation information is used to indicate that the first edit box has obtained the input focus; in response to the input activation information, the projection destination obtains the text content or image input by the input device of the projection destination as the content to be displayed; the projection destination sends the content to be displayed to the projection source; the projection destination updates the projection window, where the updated projection window is a mirror image of the updated screen display content of the projection source, and the content to be displayed is shown in the first edit box in the updated screen display content.
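The first-aspect flow above (mirror display, input activation, local capture, send, refresh) can be sketched as a small state machine. This is an illustrative reconstruction: the class, message, and field names are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the projection-destination side of the first-aspect
# method; message names and the transport interface are illustrative.

class ProjectionDestination:
    def __init__(self, transport):
        self.transport = transport  # channel toward the projection source
        self.mirror = None          # current mirror of the source screen
        self.capturing = False      # True after input activation arrives

    def show_window(self, frame):
        """Display the projection window: a mirror of the source screen."""
        self.mirror = frame

    def on_input_activation(self, info):
        """The source reports that its first edit box got the input focus."""
        self.capturing = True       # start capturing local input

    def on_local_input(self, text_or_image):
        """Text or image produced by the destination's own input device."""
        if not self.capturing:
            return
        # Send the content to be displayed to the projection source.
        self.transport.send({"type": "content", "payload": text_or_image})

    def on_screen_update(self, frame):
        """The source re-projects its screen, now showing the content in the
        first edit box; the local mirror is refreshed."""
        self.mirror = frame
```

Local input arriving before the activation message is deliberately dropped, mirroring the claim's ordering: capture begins only once the source signals that its edit box holds the focus.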
  • That the projection destination obtains the text content or image input by its input device includes: in response to the input activation information, the projection destination sets a second edit box of the projection destination to the input state; the projection destination monitors a content change event of the second edit box; in response to the content change event, the projection destination obtains the text content or image that triggered the content change event and uses it as the content to be displayed.
  • The projection destination starts the second edit box according to the input activation information, and the content to be displayed can be obtained by inputting text content or images into the second edit box, thereby enabling the content to be displayed to be input flexibly and locally at the projection destination.
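The second-edit-box mechanism described above (set to input state, monitor content change events, harvest the changed text) reduces to an observer pattern. The sketch below stands in for a real platform edit control; all names are illustrative assumptions.

```python
# Minimal sketch of the "second edit box": a local, invisible edit box whose
# content-change events yield the content to be displayed.

class HiddenEditBox:
    def __init__(self):
        self.text = ""
        self.listeners = []

    def add_change_listener(self, fn):
        """Register a callback for the content change event."""
        self.listeners.append(fn)

    def set_text(self, text):
        """Called by the local input method when it commits text."""
        self.text = text
        for fn in self.listeners:
            fn(text)                 # fire the content change event

captured = []
box = HiddenEditBox()
box.add_change_listener(captured.append)   # monitor change events
box.set_text("hello")                      # local input method commits text
# captured[0] is now the content to be displayed, ready to send to the source.
```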
  • The text content or image is generated by the first input method of the projection destination according to operation commands generated by the input device.
  • The content to be displayed can be generated by the input method of the projection destination according to the operation commands generated by the input device of the projection destination.
  • The method further includes: the projection destination, in response to the input activation information, activates the first input method.
  • the projection destination can automatically start the input method of the projection destination according to the received input activation information, so that there is no need to manually start the input method of the projection destination, which saves user operations and improves user input experience.
  • the second edit box is a hidden edit box or a transparent edit box;
  • the input activation information includes the first cursor position of the first edit box;
  • the method further includes: the projection destination terminal sets the position of the second edit box on the projection window according to the first cursor position.
  • The edit box at the projection destination is a hidden or transparent edit box, so it neither blocks nor interferes with the display, at the projection destination, of the mirror image of the edit box at the projection source, which improves the user experience. In addition, the position of the destination's edit box on the screen is set according to the cursor position of the source's edit box, so that while the user is typing, the candidate-word prompt box is displayed near the mirror image of the source's edit box, which improves the user's input experience.
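Placing the destination's hidden edit box at the mirrored cursor requires mapping the source-screen cursor coordinates into the projection window. A minimal sketch, assuming a uniform scale plus window offset; the actual mapping in a real implementation may differ:

```python
# Map a cursor position on the source screen into the projection window
# shown on the destination's display. Assumes the window is an axis-aligned
# scaled copy of the source screen (an illustrative simplification).

def map_cursor_to_window(cursor_xy, src_size, win_origin, win_size):
    sx = win_size[0] / src_size[0]      # horizontal scale factor
    sy = win_size[1] / src_size[1]      # vertical scale factor
    return (win_origin[0] + cursor_xy[0] * sx,
            win_origin[1] + cursor_xy[1] * sy)

# A 1080x1920 phone screen mirrored into a 360x640 window at (100, 50):
pos = map_cursor_to_window((540, 960), (1080, 1920), (100, 50), (360, 640))
# pos → (280.0, 370.0): the hidden edit box (and hence the candidate-word
# box) is placed at the mirror of the source cursor.
```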
  • After the projection destination sends the content to be displayed to the projection source, the method further includes: the projection destination deletes the text content or image from the second edit box.
  • After the content to be displayed in the edit box of the projection destination is sent to the projection source, it is deleted, so that the projection destination displays only the mirror image of the content to be displayed in the edit box at the projection source. This prevents the locally displayed content at the projection destination from blocking or interfering with that mirror image, and improves the user's screen projection experience.
  • The second edit box is set on a part of the display interface of the projection destination other than the projection window.
  • the projection window is a part of the display interface of the projection destination
  • The local edit box of the projection destination is set in a part of the destination's display interface other than the projection window, which prevents the local edit box from blocking the mirror image of the edit box at the projection source and improves the user's screen projection experience.
  • Before the projection destination receives the input activation information from the projection source, the method further includes: the projection destination acquires a click signal on the mirror image of the first edit box in the projection window, where the click signal includes a click type and a click position; the projection destination sends the click signal to the projection source, so that the first edit box obtains the input focus.
  • The user can perform operations on the projection destination so that the edit box on the projection source side obtains the input focus and enters the input state, allowing it to display the content to be displayed.
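Forwarding a click on the mirrored edit box back to the source is the inverse of the display mapping: window coordinates are converted to source-screen coordinates and packaged with the click type. The field names and the uniform-scaling assumption are illustrative, not specified by the patent.

```python
# Build the click signal sent from the projection destination to the source:
# click type plus the click position translated into source-screen space.

def forward_click(click_type, win_xy, win_origin, win_size, src_size):
    sx = src_size[0] / win_size[0]      # inverse horizontal scale
    sy = src_size[1] / win_size[1]      # inverse vertical scale
    return {
        "type": click_type,                        # e.g. "tap"
        "x": (win_xy[0] - win_origin[0]) * sx,     # position on source screen
        "y": (win_xy[1] - win_origin[1]) * sy,
    }

# A tap at (280, 370) inside a 360x640 window at (100, 50), mirroring a
# 1080x1920 source screen, lands on the source at (540.0, 960.0); the source
# dispatches it so the first edit box gains the input focus.
```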
  • The input device is any of the following:
  • the content to be displayed is text generated according to an operation command corresponding to a character key on the keyboard;
  • the content to be displayed is text converted from voice input through the microphone;
  • the content to be displayed is text or an image extracted from a picture taken by the camera;
  • the content to be displayed is text or an image extracted from a picture input by the scanner;
  • the content to be displayed is text or an image input through the handwriting tablet or the stylus.
  • the user can input the content to be displayed through different input devices locally on the screen projection destination to send it to the projection source terminal and display it in the edit box of the projection source terminal.
  • the input device is a keyboard
  • the method further includes: obtaining, by the screen projection destination, an operation command corresponding to a non-character key of the keyboard;
  • the screen projection destination sends the operation command to the screen projection source terminal, so that the screen projection source terminal edits the content in the first edit box according to the operation command.
  • The user can operate the keyboard of the projection destination to edit the content in the edit box at the projection source, which improves the efficiency of editing that content.
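The division of labor described above, character keys handled by the destination's input method and non-character keys forwarded as operation commands, amounts to a routing rule like the following. The key set and return shape are assumptions for illustration only.

```python
# Route keyboard input at the projection destination: character keys feed the
# local input method (which fills the hidden second edit box), while
# non-character editing keys are sent to the source as operation commands so
# the source edits its first edit box directly.

NON_CHARACTER_KEYS = {
    "Backspace", "Delete", "Enter",
    "ArrowLeft", "ArrowRight", "ArrowUp", "ArrowDown",
    "Home", "End",
}

def route_key(key):
    if key in NON_CHARACTER_KEYS:
        return ("to_source", {"op": key})   # operation command for the source
    return ("to_local_ime", key)            # character key: local input method
```

With this split, composition (e.g. pinyin candidates) stays entirely local to the destination, and only committed text plus editing commands cross the link.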
  • A second aspect provides an input method, including: a projection source projects its screen display content to a projection destination, so that the projection destination displays a projection window on at least a part of a display interface, where the projection window is a mirror image of the screen display content, and the screen display content includes a first edit box; the projection source sends input activation information to the projection destination, where the input activation information is used to indicate that the first edit box has obtained the input focus; the projection source receives the content to be displayed from the projection destination, where the content to be displayed is obtained by the projection destination in response to the input activation information and is text content or an image input by the input device of the projection destination; the projection source displays the content to be displayed in the first edit box to update the screen display content.
  • The input activation information is used to set the second edit box of the projection destination to the input state, and the content to be displayed is the text content or image that triggers the content change event of the second edit box.
  • The content to be displayed is generated by the first input method of the projection destination according to operation commands generated by the input device.
  • the second edit box is a hidden edit box or a transparent edit box.
  • The input activation information includes the first cursor position of the first edit box; the first cursor position is used to set the position of the second edit box on the projection window.
  • Before the projection source sends the input activation information to the projection destination, the method further includes: the projection source receives, from the projection destination, a click signal on the mirror image of the first edit box, where the click signal includes a click type and a click position; the projection source activates its second input method according to the click signal, so as to monitor the cursor position of the first edit box through the second input method.
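The "second input method" on the source side need not render a keyboard at all; its role is to observe when the first edit box gains the input focus and where its cursor sits, then emit the input activation information toward the destination. A hedged sketch with illustrative names:

```python
# Sketch of a source-side monitoring input method: an IME stub that, instead
# of drawing a soft keyboard, reports focus and cursor position to the
# projection destination as input activation information.

class MonitorIME:
    def __init__(self, send):
        self.send = send                     # callback toward the destination

    def on_focus_gained(self, edit_box_id, cursor_xy):
        """Platform notifies the IME that an edit box took the input focus."""
        # Emit the input activation information, including the first cursor
        # position so the destination can place its hidden edit box.
        self.send({"type": "input_activation",
                   "edit_box": edit_box_id,
                   "cursor": cursor_xy})
```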
  • The second edit box is set on a part of the display interface of the projection destination other than the projection window.
  • The method further includes: the projection source obtains, from the projection destination, a click signal on the mirror image of the first edit box, where the click signal includes a click type and a click position; according to the click signal, the projection source causes the first edit box to obtain the input focus and sends the input activation information to the projection destination.
  • The input device is any of the following:
  • the content to be displayed is text generated according to an operation command corresponding to a character key on the keyboard;
  • the content to be displayed is text converted from voice input through the microphone;
  • the content to be displayed is text or an image extracted from a picture taken by the camera;
  • the content to be displayed is text or an image extracted from a picture input by the scanner;
  • the content to be displayed is text or an image input through the handwriting tablet or the stylus.
  • the input device is a keyboard
  • The method further includes: the projection source receives, from the projection destination, an operation command corresponding to a non-character key of the keyboard; the projection source edits the content in the first edit box according to the operation command.
  • A third aspect provides a projection destination, including a processor, a memory, a transceiver, and a display screen, where the memory is used to store computer-executable instructions; when the projection destination runs, the processor executes the computer-executable instructions stored in the memory, so that the projection destination performs the method described in the first aspect. It should be understood that the projection destination is an electronic device.
  • A fourth aspect provides a projection source, including a processor, a memory, a transceiver, and a display screen, where the memory is used to store computer-executable instructions; when the projection source runs, the processor executes the computer-executable instructions stored in the memory, so that the projection source performs the method described in the second aspect. It should be understood that the projection source is an electronic device.
  • a fifth aspect provides a screen projection system, including the screen projection destination terminal described in the third aspect and the screen projection source terminal described in the fourth aspect.
  • A sixth aspect of this application provides an apparatus, including:
  • a display unit configured to display a screen projection window on at least a part of the display interface, the screen projection window being a mirror image of the screen display content of the screen projection source end, and the screen display content includes a first edit box;
  • a receiving unit configured to receive input activation information from the projection source terminal, where the input activation information is used to indicate that the first edit box has obtained the input focus;
  • An obtaining unit configured to obtain the text content or image input by the input device of the apparatus in response to the input activation information, so as to obtain the content to be displayed;
  • a sending unit configured to send the content to be displayed to the source end of the projection
  • an update unit configured to update the projection window, where the updated projection window is a mirror image of the updated screen display content of the projection source, and the content to be displayed is displayed in the first edit box in the updated screen display content.
  • The acquiring unit includes a setting subunit, a monitoring subunit, and an acquiring subunit. The setting subunit is configured to, in response to the input activation information, set the second edit box of the projection destination to the input state; the monitoring subunit is configured to monitor a content change event of the second edit box; the acquiring subunit is configured to, in response to the content change event, obtain the text content or image that triggered the content change event as the content to be displayed.
  • The text content or image is generated by the first input method of the apparatus according to the operation commands generated by the input device.
  • The apparatus further includes an activation unit configured to activate the first input method in response to the input activation information.
  • the second edit box is a hidden edit box or a transparent edit box;
  • the input activation information includes the first cursor position of the first edit box;
  • the setting subunit is further configured to set the position of the second edit box on the projection window according to the first cursor position.
  • The apparatus further includes a deleting unit configured to delete the text content or image from the second edit box after the sending unit sends the content to be displayed to the projection source.
  • The second edit box is set on a part of the display interface of the projection destination other than the projection window.
  • The acquiring unit is further configured to, before the receiving unit receives the input activation information from the projection source, acquire a click signal on the mirror image of the first edit box in the projection window, where the click signal includes a click type and a click position; the sending unit is further configured to send the click signal to the projection source, so that the first edit box obtains the input focus.
  • The input device is any of the following:
  • the content to be displayed is text generated according to an operation command corresponding to a character key on the keyboard;
  • the content to be displayed is text converted from voice input through the microphone;
  • the content to be displayed is text or an image extracted from a picture taken by the camera;
  • the content to be displayed is text or an image extracted from a picture input by the scanner;
  • the content to be displayed is text or an image input through the handwriting tablet or the stylus.
  • the input device is a keyboard
  • the acquiring unit is configured to acquire operation commands corresponding to non-character keys of the keyboard
  • the sending unit is configured to send the operation command to the projection source, so that the projection source edits the content in the first edit box according to the operation command.
  • A seventh aspect provides an apparatus, including:
  • a projection unit configured to project the screen display content of the apparatus to a projection destination, so that the projection destination displays a projection window on at least a part of the display interface, where the projection window is a mirror image of the screen display content, and the screen display content includes a first edit box;
  • a sending unit configured to send input activation information to the screen projection destination, where the input activation information is used to indicate that the first edit box has obtained the input focus;
  • the receiving unit is configured to receive content to be displayed from the projection destination, the content to be displayed is obtained by the projection destination in response to the input activation information, and the content to be displayed is the projection The text content or image input by the input device at the destination;
  • a display unit configured to display the content to be displayed in the first edit box to update the screen display content.
  • The input activation information is used to set the second edit box of the projection destination to the input state, and the content to be displayed is the text content or image that triggers the content change event of the second edit box.
  • The content to be displayed is generated by the first input method of the projection destination according to operation commands generated by the input device.
  • the second edit box is a hidden edit box or a transparent edit box.
  • The input activation information includes the first cursor position of the first edit box; the first cursor position is used to set the position of the second edit box on the projection window.
  • The apparatus further includes an activation unit. The receiving unit is further configured to, before the sending unit sends the input activation information to the projection destination, receive from the projection destination a click signal on the mirror image of the first edit box, where the click signal includes a click type and a click position; the activation unit is configured to activate the second input method of the projection source according to the click signal, so as to monitor the cursor position of the first edit box through the second input method.
  • The second edit box is set on a part of the display interface of the projection destination other than the projection window.
  • The apparatus further includes an acquiring unit. The receiving unit is further configured to, before the sending unit sends the input activation information to the projection destination, receive from the projection destination a click signal on the mirror image of the first edit box, where the click signal includes a click type and a click position; the acquiring unit is configured to cause the first edit box to obtain the input focus according to the click signal; the sending unit is configured to send the input activation information to the projection destination.
  • The input device is any of the following:
  • the content to be displayed is text generated according to an operation command corresponding to a character key on the keyboard;
  • the content to be displayed is text converted from voice input through the microphone;
  • the content to be displayed is text or an image extracted from a picture taken by the camera;
  • the content to be displayed is text or an image extracted from a picture input by the scanner;
  • the content to be displayed is text or an image input through the handwriting tablet or the stylus.
  • the input device is a keyboard
  • The apparatus further includes an editing unit. The receiving unit is further configured to receive, from the projection destination, operation commands corresponding to non-character keys of the keyboard; the editing unit is configured to edit the content in the first edit box according to the operation commands.
  • An eighth aspect provides a computer storage medium, the computer storage medium including computer instructions that, when run on a terminal, cause the terminal to execute the method described in the first aspect or the method described in the second aspect.
  • A ninth aspect provides a computer program product; when the program code included in the computer program product is executed by a processor in a terminal, the method described in the first aspect or the method described in the second aspect is implemented.
  • an input method, a projection destination, a projection source, and a device are provided.
  • The projection source projects its screen content to the projection destination. The user can operate the input device of the projection destination to generate text content or images locally at the projection destination; the generated text content or images are sent to the projection source as the content to be displayed. The projection source receives the text content or image sent by the projection destination and submits the content to be displayed to its edit box for display, so that local input at the projection destination is synchronized to the projection source seamlessly, which improves the user input experience.
  • Take a computer as the projection destination and a mobile phone as the projection source as an example. The user can use the computer's keyboard to input content into the mobile phone, breaking through the near-total isolation of data and services between the phone and the computer and allowing the two devices to interoperate quickly. The user can thus complete word processing on the mobile phone through the computer's keyboard and the computer's input method, which can greatly improve the efficiency of processing information on the phone in scenarios such as office work.
  • FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the application
  • FIG. 2 is a flowchart of an input method provided by an embodiment of the application.
• FIG. 3a is an interface display diagram provided by an embodiment of the application.
• FIG. 3b is an interface display diagram provided by an embodiment of the application.
• FIG. 3c is an interface display diagram provided by an embodiment of the application.
• FIG. 3d is an interface display diagram provided by an embodiment of the application.
  • FIG. 4 is an interface display diagram provided by an embodiment of the application.
  • FIG. 5 is a schematic diagram of an application architecture of an input method provided by an embodiment of this application.
  • FIG. 6 is a flowchart of an input method provided by an embodiment of the application.
  • FIG. 7 is a schematic diagram of an application architecture of an input method provided by an embodiment of the application.
  • FIG. 8 is a flowchart of an input method provided by an embodiment of the application.
  • FIG. 9 is a flowchart of an input method provided by an embodiment of the application.
  • FIG. 10 is a schematic block diagram of an apparatus provided by an embodiment of this application.
  • FIG. 11 is a schematic block diagram of an apparatus provided by an embodiment of this application.
  • FIG. 12 is a schematic block diagram of a screen projection destination provided by an embodiment of the application.
  • FIG. 13 is a schematic block diagram of a screen projection source provided by an embodiment of this application.
• Screen projection refers to a device projecting its screen display content onto the display screen or display medium of another device; it is a typical method of information synchronization between devices with screens.
  • the device that projects its screen display content may be referred to as the projection source
  • the device that receives the projection from the projection source and displays the screen display content of the projection source may be referred to as the projection destination.
  • the projection source can project its screen display content to the projection destination for display.
  • the projecting source can compress and encode the video stream data of its screen content and send it to the projecting destination.
  • the projection destination receives the video stream data from the projection source, and displays the screen display content of the projection source on the display screen of the projection destination after decoding.
  • the screen display content of the projection source displayed on the display screen of the projection destination can be referred to as a mirror image of the screen display content of the projection source.
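The compress-encode/decode round trip described above can be sketched as follows. This is a minimal illustration that uses generic byte compression as a stand-in for a real video codec; all function names are illustrative assumptions, not APIs from the patent.

```python
import zlib

def encode_frame(frame_bytes: bytes) -> bytes:
    # Projection source: compress and encode the screen content before sending.
    return zlib.compress(frame_bytes)

def decode_frame(payload: bytes) -> bytes:
    # Projection destination: decode the received payload back into screen content.
    return zlib.decompress(payload)

# The projection window shows a byte-for-byte mirror of the source's screen content.
source_screen = b"screen display content of the projection source"
mirror = decode_frame(encode_frame(source_screen))
assert mirror == source_screen
```

The key property, as the text states, is that what the destination displays is a mirror image: decoding recovers exactly the screen content the source encoded.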
  • the electronic device can be a mobile phone, a tablet computer, etc.
  • the devices that can be used as the destination of the screen are personal computers, smart TVs, projectors, etc.
• In scenarios such as searching for videos or modifying documents while watching the screen display content of the projection source displayed on the projection destination, the user needs to input text content or images into the edit box of the projection source.
  • the embodiment of the present application provides an input method, which can be applied to the scenario shown in FIG. 1.
• The user can manipulate the input device of the projection destination to locally generate text content or images on the projection destination as the content to be displayed; then, the projection destination sends the content to be displayed to the projection source; and the projection source displays, in its edit box, the content to be displayed received from the projection destination.
  • the local input of the projection destination can be synchronized to the projection source indiscriminately, thereby improving the user input experience and input efficiency.
  • the input method provided by the embodiment of the present application will be specifically introduced.
  • the method can be executed by any electronic device with a display screen and data processing capabilities.
• The electronic device that executes the input method may be referred to as the screen projection destination. As shown in FIG. 2,
• the method includes: step 202, the projection destination displays a projection window on at least a part of the display interface, the projection window being a mirror image of the screen display content of the projection source, and the screen display content including a first edit box; step 204, the projection destination receives input activation information from the projection source, the input activation information being used to indicate that the first edit box has obtained the input focus; step 206, in response to the input activation information, the projection destination obtains the text content or image input by the input device of the projection destination to obtain the content to be displayed; step 208, the projection destination sends the content to be displayed to the projection source; step 210, the projection destination updates the projection window, the updated projection window being a mirror image of the updated screen display content of the projection source.
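Steps 202 through 210 can be sketched as a minimal state machine at the projection destination. The class, method, and message names below are illustrative assumptions chosen for the sketch, not identifiers from the patent.

```python
class ProjectionDestination:
    """Illustrative sketch of steps 202-210 (all names are assumptions)."""

    def __init__(self):
        self.projection_window = ""  # mirror of the source's screen display content
        self.input_active = False
        self.outbox = []             # stands in for the channel back to the source

    def display_projection_window(self, source_screen: str):
        # Step 202: display a mirror image of the source's screen content.
        self.projection_window = source_screen

    def on_input_activation(self, info: dict):
        # Step 204: the first edit box at the source has obtained the input focus.
        if info.get("type") == "input_activation":
            self.input_active = True

    def capture_and_send(self, typed_text: str):
        # Steps 206/208: obtain input from the local input device and send it.
        if self.input_active:
            self.outbox.append({"type": "content", "payload": typed_text})

    def update_window(self, updated_source_screen: str):
        # Step 210: the window becomes a mirror of the updated source screen.
        self.projection_window = updated_source_screen

dest = ProjectionDestination()
dest.display_projection_window("[first edit box: ]")
dest.on_input_activation({"type": "input_activation"})
dest.capture_and_send("Wolf Warrior")
dest.update_window("[first edit box: Wolf Warrior]")
assert dest.outbox == [{"type": "content", "payload": "Wolf Warrior"}]
```

Note that the destination never writes into the mirror directly: the content travels to the source, and the destination only sees it again when the updated screen content is projected back (step 210).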
• The content to be displayed is displayed in the first edit box in the updated screen display content. It should be understood that both the projection destination and the projection source are electronic devices.
  • the projection destination displays a projection window on at least a part of the display interface, the projection window is a mirror image of the screen display content of the projection source terminal, and the screen display content includes a first edit box.
• The mobile phone can serve as the projection source and project its screen display content to the computer's display screen;
• in this case, the computer is the projection destination. Alternatively, the computer can serve as the projection source and project its screen display content to the screen of the mobile phone;
• in this case, the mobile phone is the projection destination.
  • the computer can simultaneously display the mirror image of the screen display content of the mobile phone.
  • the window in which the computer displays the mirror image of the mobile phone screen can be called the projection window.
  • the computer can display the projection window on its entire display interface, that is, the projection window occupies all of the computer display interface.
  • the computer can also display a projection window on part of its display interface, that is, the projection window is only part of the computer interface display content.
  • the screen display content of the mobile phone includes the first edit box, and correspondingly, the computer also displays the mirror image of the first edit box in the projection window.
• The first edit box refers to a window that can obtain the input focus to receive, accommodate, and edit input content. Specifically, it can be an input box, a text box, a picture box, an address bar, a search box, an editable page (e.g., a Word page), a form containing input content (e.g., an Excel sheet), etc. It should be noted that the above is only an example of the first edit box, not an exhaustive list.
  • the projection source can display multiple windows, and the multiple windows include at least one edit box.
  • the first edit box is any edit box of the at least one edit box.
• The first edit box may specifically be the edit box of a first input control at the projection source.
• When the first edit box is a text box, the text box is specifically the text box of a first text box (TextBox) control at the projection source;
• when the first edit box is a picture box,
• the picture box is specifically the picture box of a first picture box (PictureBox) control at the projection source.
  • the screen projection destination receives input activation information from the screen projection source terminal, and the input activation information is used to indicate that the first edit box has obtained the input focus.
• The input activation information can be sent to the projection destination.
  • the input activation information is used to indicate that the first edit box has obtained the input focus of the projection source, and can display text content or images received by the projection source.
  • the screen projection source can use the video stream data channel to send the input start information to the screen projection destination.
  • the video stream data channel is a data channel used to transmit the video stream data of the screen display content when the screen projection source terminal projects the screen display content to the screen projection destination terminal.
  • the input method provided by the embodiment of the present application further includes: obtaining a click signal for the mirror image of the first edit box in the projection window; the click signal includes a click type and Click position; send the click signal to the projection source terminal to enable the first edit box to obtain the input focus.
  • the user can click the mirror image of the first edit box in the projection window at the projection destination, so that the projection destination generates a click signal.
• The click signal may include the click type (for example, single click, double click, etc.) and the click position (specifically, it may be a coordinate).
  • the projection destination can send the click signal to the projection source. After receiving the click signal, the projection source terminal may determine that the user operation object is the first edit box according to the click position in the click signal.
• If the click type in the click signal is the click type that causes the first edit box to obtain the input focus (for example, the edit box can be preset so that a single click makes it obtain the input focus),
• the projection source, according to the click type,
• causes the first edit box to obtain the input focus and enter the input state, and then sends input activation information to the projection destination.
  • the projection destination can send the click signal to the projection source through the reverse control channel.
  • the screen projection source terminal and the screen projection destination terminal can negotiate to establish a reverse control channel through the video stream data channel.
  • the projection destination can send operation commands such as click signals to the projection source through the reverse control channel.
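The click-signal exchange over the reverse control channel can be sketched as below. The JSON message shape, the focus-granting click type (a single click), and the rectangle hit test are all illustrative assumptions, not details specified by the patent.

```python
import json

def make_click_signal(click_type: str, x: int, y: int) -> bytes:
    # Projection destination: package the click type and click position.
    return json.dumps({"type": click_type, "pos": [x, y]}).encode()

def handle_click(payload: bytes, edit_box_rect: tuple):
    # Projection source: if the click position falls inside the first edit box
    # and the click type is the focus-granting type, the edit box obtains the
    # input focus and input activation information is returned to the sender.
    sig = json.loads(payload)
    x, y = sig["pos"]
    left, top, right, bottom = edit_box_rect
    if sig["type"] == "click" and left <= x <= right and top <= y <= bottom:
        return {"type": "input_activation"}
    return None  # click outside the edit box: no activation

assert handle_click(make_click_signal("click", 5, 5), (0, 0, 10, 10)) == {"type": "input_activation"}
assert handle_click(make_click_signal("click", 50, 5), (0, 0, 10, 10)) is None
```

The hit test stands in for the source "determining that the user operation object is the first edit box according to the click position."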
• The screen projection source may obtain a click signal for the first edit box, cause the first edit box to obtain the input focus according to the click signal, and send input activation information to the projection destination.
• The user can click the first edit box at the projection source to make the projection source generate a click signal.
• According to the click signal, the projection source causes the first edit box to obtain the input focus and enter the input state, and then sends input activation information to the projection destination.
• In step 206, the screen projection destination, in response to the input activation information, obtains the text content or image input by the input device of the screen projection destination to obtain the content to be displayed.
  • the user can manipulate the input device of the projection destination to input text content or images to the projection destination.
• For an electronic device such as a personal computer, the conventional input device is a keyboard.
• For a mobile phone, the input device is its touch screen.
  • microphones, scanners, cameras, etc. can also be used as input devices for electronic devices.
  • the input device can be either built-in or external.
  • its input device can be its own keyboard, camera, microphone, etc., or an external keyboard, camera, microphone, etc.
  • the input device is a keyboard.
• The user can generate corresponding operation commands by tapping or pressing the character keys on the keyboard, so that the screen projection destination can generate text content according to the operation commands to obtain the content to be displayed.
  • the input device is a microphone.
• The user can input voice to the projection destination through the microphone.
• The projection destination converts the voice input by the user into text to obtain the input content.
  • the input device is a camera. Users can take pictures through the camera.
• The projection destination can extract text from the picture taken by the camera to obtain the content to be displayed.
• Specifically, optical character recognition (OCR) technology can be used for text extraction.
• The projection destination can also extract images from the pictures taken by the camera and use them as the content to be displayed.
• When the input device is a scanner, users can input scanned pictures through the scanner.
  • the screen projection destination can extract text from the scanned picture input through the scanner to get the content to be displayed. Specifically, OCR technology can be used for text extraction.
  • the projection destination can also extract images from scanned pictures to get the content to be displayed.
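The text-or-image extraction step for camera and scanner input can be sketched as below. The dict-based picture representation and the `recognized_text` field are placeholders standing in for the output of a real OCR engine (such as Tesseract) run over the image bytes; both field names are assumptions made for the sketch.

```python
def extract_content(picture: dict) -> dict:
    """Turn a camera/scanner picture into content to be displayed.

    `picture` is a stand-in: `recognized_text` plays the role of OCR output,
    and `bytes` the raw image. Both field names are illustrative assumptions.
    """
    text = picture.get("recognized_text", "")
    if text:
        # Text was recognized: send it to the source as text content.
        return {"kind": "text", "value": text}
    # No text recognized: fall back to sending the extracted image itself.
    return {"kind": "image", "value": picture.get("bytes", b"")}

assert extract_content({"recognized_text": "hello"}) == {"kind": "text", "value": "hello"}
assert extract_content({"bytes": b"\x89PNG"}) == {"kind": "image", "value": b"\x89PNG"}
```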
  • the input device is a writing tablet.
• The text content or image sent to the projection source is the text or image input through the writing tablet.
  • the input device is a stylus pen.
• The text content or image sent to the projection source is the text or image input through the stylus pen.
  • step 206 includes: in response to the input activation information, the projection destination sets the second edit box of the projection destination to the input state; and monitors content change events of the second edit box; In response to the content change event, the text content or image that triggers the content change event is acquired, and the acquired text content or image is used as the content to be displayed.
  • the second edit box is the edit box of the screen projection destination.
• When receiving the input activation information, the projection destination responds to the input activation information so that the second edit box obtains the input focus of the projection destination and enters the input state.
  • the screen projection destination can monitor the content change event in the second edit box in the input state.
• When text content or an image is input into the second edit box, the input text content or image triggers a content change event in the second edit box.
  • the screen projection destination obtains the text content or image that triggers the content change event, and uses the text content or image as the content to be displayed.
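The hidden second edit box and its content-change monitoring can be sketched as a small callback-driven class. Everything here is an illustrative assumption about how such an edit box might be modeled; it is not an API from the patent.

```python
class HiddenEditBox:
    # Sketch of the second edit box: it is hidden, can hold the input focus,
    # and fires a content-change callback whenever content is entered.
    def __init__(self, on_change):
        self.text = ""
        self.focused = False
        self._on_change = on_change  # delivers the content to be displayed

    def set_input_state(self):
        # Response to the input activation information from the source.
        self.focused = True

    def input(self, new_text: str):
        if self.focused:
            self.text += new_text
            self._on_change(new_text)  # content change event fires here

to_send = []                      # stands in for the channel to the source
box = HiddenEditBox(to_send.append)
box.input("ignored")              # not in the input state yet: no event
box.set_input_state()
box.input("Wolf Warrior")
assert to_send == ["Wolf Warrior"]
```

The callback models the destination "monitoring the content change event" and taking the triggering text as the content to be displayed.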
• The second edit box may be the edit box of a second input control at the projection destination.
• When the second edit box is a text box, the text box is specifically the text box of a second text box (TextBox) control at the projection destination;
• when the second edit box is a picture box, the picture box is specifically the picture box of a second picture box (PictureBox) control at the projection destination.
  • the screen projection destination can input the text converted from the voice, the text or image extracted from the picture, the text or image input through the handwriting tablet or the stylus into the second edit box, and trigger the content change event.
  • the input content is generated by the first input method of the projection destination according to the input of the input device.
  • the first input method is the input method of the screen projection destination, which can be the input method that comes with the screen projection destination system or a third-party input method that can be installed.
• When a physical keyboard is connected to the projection destination, after the second edit box enters the input state, the user can start the input method of the projection destination through the physical keyboard of the projection destination.
• When the input device is a physical keyboard, English letters and Arabic numerals can be directly input into the second edit box through the physical keyboard; the input method is not required at this time, so there is no need to activate it.
• For other characters, the physical keyboard must be combined with the input method of the projection destination.
• The user can start the input method of the projection destination by operating the physical keyboard of the projection destination (for example, pressing the Ctrl key and the space bar at the same time).
  • the user can tap the keys of the physical keyboard to generate operation commands, so that the first input method generates text content according to the operation commands.
• The first input method can also generate corresponding candidate words in the candidate word prompt box according to the operation command, determine the text content from the candidate words according to the operation command corresponding to the user's selection operation, and input it into the second edit box to trigger the content change event of the second edit box.
• The screen projection destination can activate the first input method in response to the input activation information.
• If a physical keyboard is connected to the screen projection destination, only the status bar of the first input method may be displayed.
  • the user can tap the keys of the physical keyboard to generate operation commands, so that the first input method generates text content according to the operation commands.
• The first input method can also generate corresponding candidate words in the candidate word prompt box according to the operation command, determine the text content from the candidate words according to the operation command corresponding to the user's selection operation, and input it into the second edit box to trigger the content change event of the second edit box.
  • the virtual keyboard of the first input method can be displayed.
  • the user can click or touch the keys of the virtual keyboard to generate operation commands, so that the first input method generates text content according to the operation commands.
• The first input method can also generate corresponding candidate words in the candidate word prompt box according to the operation command, determine the text content from the candidate words according to the operation command corresponding to the user's selection operation, and input it into the second edit box to trigger the content change event of the second edit box.
  • the second edit box is a hidden edit box or a transparent edit box.
• When the second edit box is a hidden edit box, its cursor is also a hidden cursor.
• When the second edit box is a transparent edit box, its cursor is also a transparent cursor. This avoids the influence of the cursor in the second edit box on the user's visual experience and further improves that experience.
• The input activation information includes the first cursor position of the first edit box; the input method provided in the embodiment of the present application further includes: setting the position of the second edit box on the projection window according to the first cursor position.
  • the projection source can acquire the cursor position of the first edit box, carry the cursor position in the input activation information, and send it to the projection destination.
• The start input event of the first edit box can start the input method of the projection source; the input method of the projection source can monitor the cursor position of the first edit box, carry the cursor position in the input activation information, and send it to the projection destination.
  • the projection destination sets the position of the second edit box according to the cursor position.
  • the second edit box can be set near the mirror image of the first cursor.
  • the left border of the second edit box is superimposed on the cursor mirror image of the first edit box.
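Mapping the first edit box's cursor position into the projection window can be sketched as a simple coordinate transform. The window origin and scale factor are illustrative assumptions about how the mirror is laid out on the destination's display; the patent does not specify a formula.

```python
def place_second_edit_box(cursor_pos, window_origin, scale):
    # Map the cursor position (source screen coordinates) into destination
    # display coordinates, so the second edit box's left border sits on the
    # mirror image of the first edit box's cursor.
    cx, cy = cursor_pos
    ox, oy = window_origin
    return (ox + cx * scale, oy + cy * scale)

# Source cursor at (100, 40); projection window drawn at (300, 50), half scale.
assert place_second_edit_box((100, 40), (300, 50), 0.5) == (350.0, 70.0)
```

With the second edit box positioned this way, the candidate word prompt box of the destination's input method appears near the mirror of the first cursor, as the surrounding text describes.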
• The text content that triggers the content change event is obtained and used as the text content to be sent to the projection source.
  • the text content is English letters or numbers, as long as a letter or number is input into the second edit box, the content change event can be triggered, and the letter or number that triggered the content change event can be obtained.
• The input content may also be text in a language other than English, such as Chinese characters.
  • the second edit box is set on an interface other than the screen projection window in the display interface of the screen projection destination.
  • the projection window may be a part of the display interface of the projection destination.
• The second edit box is set at the projection destination in a part outside the projection window, so that the second edit box does not block the mirror image of the content displayed at the projection source, which improves the user's visual experience.
• In step 208, the content to be displayed is sent to the projection source.
  • the projection destination can send the acquired content to be displayed to the projection source, and specifically can transmit the content to be displayed through the video stream data channel.
• After receiving the content to be displayed, the projection source submits the content to be displayed into the first edit box so that it can be displayed there, thereby updating the screen display content of the projection source.
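The source-side half of step 208 can be sketched as follows; the message shape and method names are illustrative assumptions for the sketch, not identifiers from the patent.

```python
class FirstEditBox:
    # Sketch of the first edit box at the projection source.
    def __init__(self):
        self.text = ""

    def submit(self, content: str):
        # Submitting the received content updates the screen display content.
        self.text += content

def on_content_received(edit_box: FirstEditBox, message: dict) -> str:
    edit_box.submit(message["payload"])
    # The updated text is part of the screen content that gets re-projected.
    return edit_box.text

box = FirstEditBox()
assert on_content_received(box, {"payload": "Wolf "}) == "Wolf "
assert on_content_received(box, {"payload": "Warrior"}) == "Wolf Warrior"
```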
• The projection destination updates the projection window, the updated projection window being a mirror image of the updated screen display content of the projection source;
• the content to be displayed is displayed in the first edit box of the updated screen display content.
• The projection window of the projection destination is a mirror image of the screen display content of the projection source, and when the screen display content of the projection source is updated,
• the projection window is updated accordingly; the updated projection window is a mirror image of the updated screen content.
  • the content to be displayed is displayed in the first edit box in the updated screen display content.
  • the mirror image of the first edit box in the updated screen projection window also contains a mirror image of the content to be displayed.
• The projection source transmits the data of its screen display content to the projection destination in real time through the video stream data channel, and the projection destination, based on the received data,
• displays the mirror image of the source's screen content. Therefore, after the content to be displayed is shown in the first edit box, the corresponding screen display content can be sent by the projection source to the projection destination, and the mirror image displayed by the projection destination based on this data
• contains a mirror image of the content to be displayed, which is located in the mirror image of the first edit box.
  • the input method provided in this application further includes: deleting the text content or image in the second edit box.
• For example, the projection destination deletes the "Wolf Warrior" text in the second edit box, so that the "Wolf Warrior" text in the second edit box does not block the "Wolf Warrior" mirror image in the first edit box.
• The projection source can monitor the cursor position in the first edit box.
• Specifically, the input method at the projection source can monitor the cursor position in the first edit box.
• The input method of the projection source can be activated by the start input event of the first edit box. It is easy to understand that after content is input into the first edit box, the cursor position of the first edit box will change.
• The projection source can monitor the change of the cursor position of the first edit box and send the changed cursor position of the first edit box to the projection destination; specifically, the changed cursor position can be transmitted through the video stream data channel.
• After the projection destination receives the changed cursor position of the first edit box, it can reset the position of the second edit box according to the changed cursor position, so that when inputting using the input device of the projection destination,
• the candidate word prompt box of the input method of the projection destination can be displayed near the mirror image of the first edit box, improving the user's input experience.
• The input method provided in the embodiment of the present application further includes: the projection destination obtains the operation command corresponding to the non-character keys of the keyboard, and sends the operation command corresponding to the non-character keys to the projection source. Specifically, the operation command corresponding to the non-character keys can be transmitted through the reverse control channel.
  • the projection source terminal controls its screen display content according to the operation command.
• When the operation command is an operation command for editing text content or an image,
• the projection source edits the text content or image displayed in the first edit box according to the operation command.
• The keys on the keyboard can be divided into character keys and non-character keys. When the user taps or presses a character key, an operation command corresponding to that character key is generated, and the device can
• generate text content according to the operation command corresponding to the character key.
• The character keys can specifically include number keys (0-9), letter keys (a-z), punctuation keys (for example, comma, period, exclamation mark, question mark, etc.), and special character keys (for example, #, %, *, etc.).
• Non-character keys refer to keys other than character keys on the keyboard, specifically including control (Ctrl), shift (Shift), alt (Alt), caps lock (Caps Lock), insert (Insert), home (Home), end (End), delete (Del), page up (PgUp), page down (PgDn), enter (Enter), backspace (BackSpace), and the arrow keys.
  • the user can generate corresponding operation commands by tapping or pressing the non-character keys of the keyboard.
  • the operation command can be sent to the projection source through the reverse control channel.
• The projection source can perform actions such as cursor shifting, case switching, insertion, deletion, line feed, and sending in the first edit box according to the operation command.
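The split between character keys (handled by the destination's input method) and non-character keys (forwarded as operation commands over the reverse control channel) can be sketched as a routing function. The key names and message shapes are illustrative assumptions for the sketch.

```python
# Non-character keys listed in this embodiment (key names are illustrative).
NON_CHARACTER_KEYS = {
    "Ctrl", "Shift", "Alt", "CapsLock", "Insert", "Home", "End", "Del",
    "PgUp", "PgDn", "Enter", "BackSpace", "Up", "Down", "Left", "Right",
}

def route_key(key: str) -> dict:
    # Non-character keys become operation commands sent over the reverse
    # control channel; character keys feed the local input method instead.
    if key in NON_CHARACTER_KEYS:
        return {"channel": "reverse_control", "command": key}
    return {"channel": "local_input", "char": key}

assert route_key("Enter") == {"channel": "reverse_control", "command": "Enter"}
assert route_key("a") == {"channel": "local_input", "char": "a"}
```

Routing this way lets the source perform cursor shifting, deletion, line feeds, and similar editing actions in the first edit box while ordinary text continues to be composed locally at the destination.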
  • the user can click on the interface outside the first edit box in the projection window of the projection destination to generate a click signal.
• The click signal includes the click type (specifically, a single click) and the click position (specifically, it may be a coordinate).
  • the projection destination can send the click signal to the projection source, and specifically can transmit the click signal through the reverse control channel.
  • the screen projection source parses the click position and click type, so that the first edit box loses the input focus and exits the input state.
• The screen projection source can send input exit information to the projection destination.
• Specifically, the input exit information can be transmitted through the video stream data channel.
• After the projection destination receives the input exit information, in response to the input exit information, the second edit box loses the input focus and exits the input state.
• After the screen display content is projected by the projection source to the projection destination, the user can generate text content or images locally at the projection destination by manipulating the input device of the projection destination, and
• the generated text or image is sent to the projection source as the content to be displayed; after receiving the text or image sent by the projection destination, the projection source submits the content to be displayed to its edit box for display, so that
• the local input of the projection destination can be synchronized to the projection source without distinction, which improves the user input experience. Taking the projection destination as a computer and the projection source as a mobile phone as an example, with the input method provided in the embodiment of this application, the user can use the keyboard of the computer to input content into the mobile phone, which can break the situation in which the data and services of the mobile phone and the computer are almost independent of each other, allowing them to intercommunicate quickly. This allows the user to complete word processing on the mobile phone through the keyboard and input method of the computer.
• The cross-device input method provided in the embodiment of the present application is introduced below by way of an example.
• The architecture of the input method application provided in this embodiment is shown in FIG. 5, which includes a projection destination and a projection source.
  • the projection destination and the projection source can be connected by a short-range wireless communication technology or a data cable.
  • the input device of the projection destination can include a keyboard, a mouse or a touch screen.
  • An input method and a projection client are installed on the projection destination.
  • the input method of the screen projection destination can generate text content or images according to the operation commands generated by the keyboard or touch screen as the content to be displayed.
  • the projection destination can receive the video stream data of the screen display content of the projection source through the projection client, and display the mirror image of the screen display content of the projection source.
  • the projection client may include a first input management module, a first video stream data module, and a first reverse control module.
  • the first input management module includes a second edit box, the second edit box may be a text box, the text box is a hidden text box, and its border is not displayed on the screen at the screen destination.
  • the first input management module also includes an input monitoring sub-module and a first input event management sub-module.
• The above example takes the case where the first input management module, the first video stream data module, and the first reverse control module are integrated into the projection client as an example, and does not constitute a limitation on the form in which these modules exist.
• An application is installed on the projection source.
  • the application includes a first edit box, and the first edit box may be a text box.
  • the first edit box is displayed on the screen at the source end of the screen, and correspondingly, the mirror image of the first edit box is displayed on the screen at the destination end of the screen.
  • the projection source terminal includes a second input management module, a second video stream data module, and a second reverse control module.
  • the second input management module includes an input method service sub-module and a second input event management sub-module.
  • the input method service submodule can be an input method module installed at the source end of the projection screen, or a module with an input method in the source end system of the projection screen.
  • the first video stream data module and the second video stream data module communicate through the video stream data channel.
  • the first reverse control module and the second reverse control module communicate through the reverse control channel.
  • the projection destination terminal uses the first reverse control module to send the first click signal to the projection source terminal through the reverse control channel.
	• after the projection source receives the first click signal through the second reverse control module, it distributes the first click signal to the application; the application parses the first click signal, so that the first edit box obtains the input focus, and a service request is made to the input method service sub-module to start the input method.
	• the input method service sub-module responds to the request to start the input method and begins monitoring the cursor position of the first edit box; through the second input event management sub-module and the second reverse control module, it includes the cursor position of the first edit box in the input activation information and sends it to the projection destination over the reverse control channel.
  • the input activation information is used to indicate that the first edit box has obtained the input focus.
	• the projection destination receives, through the first reverse control module, the input activation information that includes the cursor position of the first edit box.
	• the first input event management sub-module responds to the input activation information, sets the second edit box to the input state, and starts the input method of the projection destination; it also sets the position of the second edit box according to the cursor position of the first edit box.
	• the left border of the second edit box may overlap the mirrored position of the cursor of the first edit box.
  • the user taps the keyboard at the destination of the projection screen to generate operation commands.
  • the input method of the screen projection destination displays the candidate word prompt box on the left side of the second edit box according to the operation command, that is, near the cursor mirror in the first edit box.
  • the user can tap the space key on the keyboard to determine the candidate word in the candidate word prompt box as the input text content, and enter it into the second edit box.
  • the content change event of the second edit box is triggered.
	• when the input monitoring sub-module detects the content change event, it notifies the first input event management sub-module to take the text content out of the second edit box, send it to the projection source through the first video stream data module as the content to be displayed, and then delete the text content from the second edit box.
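The monitor-forward-clear behaviour of the hidden second edit box described above can be sketched as follows. This is purely an illustrative model, not the patent's actual implementation; all class and function names are invented:

```python
class HiddenEditBox:
    """Minimal model of the hidden second edit box at the projection
    destination (illustrative; names are invented)."""

    def __init__(self):
        self.text = ""
        self._listeners = []

    def add_content_change_listener(self, listener):
        self._listeners.append(listener)

    def commit(self, text):
        # Called by the local input method when a candidate word is chosen.
        self.text += text
        for listener in self._listeners:
            listener(self)


sent_to_source = []  # stands in for the first video stream data module

def forward_and_clear(box):
    # Input monitoring sub-module: take the text out, send it as the
    # content to be displayed, then delete it from the box.
    sent_to_source.append(box.text)
    box.text = ""

box = HiddenEditBox()
box.add_content_change_listener(forward_and_clear)
box.commit("hello")
box.commit("world")
# sent_to_source is now ["hello", "world"] and the box is empty again.
```

Because the box is cleared after every forwarded change, no text accumulates at the destination; the first edit box at the source remains the single authoritative copy of the content.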
  • the projection source terminal receives the content to be displayed through the second video stream data module.
  • the second input event management submodule submits the content to be displayed to the first edit box for display through the input method service submodule, and correspondingly, a mirror image of the content to be displayed is displayed on the screen of the projection destination.
  • the cursor position of the first edit box is changed.
	• the input method service sub-module monitors the change of the cursor position of the first edit box, and sends the changed cursor position to the projection destination through the second video stream data module.
  • the first input management module updates the position of the second edit box according to the changed cursor position.
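As an illustration of how the second edit box could follow the mirrored cursor, the sketch below maps a cursor position from the source screen's coordinate space into the projection window. A plain proportional mapping is assumed here; the patent does not specify the mapping:

```python
def mirror_cursor_position(cursor_xy, src_size, window_origin, window_size):
    """Map a cursor position given in the projection source's screen
    coordinates to the matching point inside the projection window at the
    destination. A plain proportional scaling is assumed; a real mapping
    might also handle rotation or letterboxing."""
    sx, sy = cursor_xy
    sw, sh = src_size
    ox, oy = window_origin
    ww, wh = window_size
    return (ox + sx * ww / sw, oy + sy * wh / sh)

# Place the second edit box so its left border overlaps the mirrored cursor:
# a 1080x1920 phone screen mirrored into a 360x640 window at (100, 50).
pos = mirror_cursor_position((540, 960), (1080, 1920), (100, 50), (360, 640))
# pos == (280.0, 370.0)
```

With such a mapping, every cursor-position update from the source translates directly into a new position for the second edit box, which keeps the candidate word prompt box visually anchored to the mirrored cursor.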
	• when the user taps a non-character key on the keyboard of the projection destination, an operation command corresponding to the non-character key is generated.
  • the projection destination terminal sends the operation command to the projection source terminal through the first reverse control module.
	• the second reverse control module at the projection source receives the operation command and distributes it to the application, and the application performs the corresponding operation according to the operation command. For example, when the user taps the delete key on the keyboard of the projection destination, the application can delete text content in the first edit box according to the delete key.
	• when the user clicks a position outside the mirror image of the first edit box, a second click signal is generated, which includes the click type and the click position.
  • the projection destination terminal sends the second click signal to the projection source terminal through the first reverse control module.
  • the second reverse control module receives the second click signal, and distributes the second click signal to the application, so that the application causes the first edit box to lose the input focus and exit the input state.
	• the first edit box loses the input focus, which triggers end input information; the end input information is used to request the input method service sub-module to close the input method.
  • the input method service sub-module responds to the request to close the input method, and sends the end input message to the screen projection destination through the second video stream data module.
  • the first video stream data module receives the end input information.
	• the first input event management sub-module responds to the end input information, causes the second edit box to lose focus and exit the input state, and closes the input method.
	• after the screen display content is projected by the projection source to the projection destination, the user can generate text content or images locally at the projection destination by manipulating the input device of the projection destination, and the generated text content or image is sent to the projection source as the content to be displayed; after receiving the content to be displayed from the projection destination, the projection source submits the content to be displayed to its edit box for display, so that the local input at the projection destination is synchronized to the projection source without distinction, which improves the user input experience. Taking the projection destination being a computer and the projection source being a mobile phone as an example, with the input method provided in the embodiments of this application, when the screen content of the mobile phone is projected to the computer, the user can use the keyboard of the computer to input content into the mobile phone. This breaks the situation in which the data and services of the mobile phone and the computer are almost isolated islands, so that the mobile phone and the computer can communicate quickly, and allows the user to complete word processing on the mobile phone through the keyboard and input method of the computer.
	• the embodiment of the present application provides an input method that is performed by the projection source.
	• the method includes: step 602, the projection source projects its screen display content to the projection destination, so that the projection destination displays a projection window on at least a part of the display interface, the projection window being a mirror image of the screen display content, and the screen display content including a first edit box; step 604, the projection source sends input activation information to the projection destination, where the input activation information is used to indicate that the first edit box has obtained the input focus; step 606, the projection source receives the content to be displayed from the projection destination, where the content to be displayed is obtained by the projection destination in response to the input activation information, and is the text content or image input by the input device of the projection destination; step 608, the projection source displays the content to be displayed in the first edit box to update the screen display content.
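Steps 602 through 608 can be modeled schematically as two objects exchanging messages in memory. This is an illustrative sketch only; every class and method name here is invented, and the real system exchanges these messages over the video stream and reverse control channels:

```python
class ProjectionDestination:
    def __init__(self):
        self.mirror = ""                # mirror of the source's screen content
        self.second_edit_box_pos = None

    def show_mirror(self, content):     # receives video stream updates
        self.mirror = content

    def on_input_activation(self, cursor_pos):
        # Position the (hidden) second edit box at the mirrored cursor.
        self.second_edit_box_pos = cursor_pos


class ProjectionSource:
    def __init__(self):
        self.first_edit_box = ""
        self.destination = None

    def project(self, destination):                 # step 602
        self.destination = destination
        destination.show_mirror(self.first_edit_box)

    def send_input_activation(self, cursor_pos):    # step 604
        self.destination.on_input_activation(cursor_pos)

    def receive_content(self, content):             # steps 606 and 608
        self.first_edit_box += content
        self.destination.show_mirror(self.first_edit_box)


src, dst = ProjectionSource(), ProjectionDestination()
src.project(dst)                     # 602: mirror displayed at the destination
src.send_input_activation((10, 20))  # 604: second edit box positioned
src.receive_content("hi")            # 606/608: content shown and re-mirrored
```

The key point the sketch captures is that the source is the only place where content is committed; the destination only captures input and displays the updated mirror.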
  • the input activation information is used to set the input state of the second edit box of the projection destination, and the content to be displayed is the text content or image that triggers the content change event of the second edit box .
  • the content to be displayed is generated by the first input method of the projection destination according to the operation command generated by the input device.
  • the second edit box is a hidden edit box or a transparent edit box.
  • the input activation information includes the first cursor position of the first edit box; the first cursor position is used to set the position of the second edit box on the projection window .
	• before the projection source sends the input activation information to the projection destination, the method further includes: receiving, from the projection destination, a click signal for the mirror image of the first edit box, the click signal including the click type and the click position; and activating the second input method of the projection source according to the click signal, so as to monitor the cursor position of the first edit box through the second input method.
  • the second edit box is set on an interface other than the screen projection window in the display interface of the screen projection destination.
	• the method further includes: the projection source obtains, from the projection destination, a click signal for the mirror image of the first edit box, the click signal including the click type and the click position; the projection source causes the first edit box to obtain the input focus according to the click signal, and sends the input activation information to the projection destination.
  • the input device is any of the following:
  • the content to be displayed is text generated according to an operation command corresponding to a character key on the keyboard;
	• the content to be displayed is text converted from the voice input by the microphone;
  • the content to be displayed is text or image extracted from a picture taken by the camera;
	• the content to be displayed is text or an image extracted from the input picture of the scanner;
  • the content to be displayed is text or image input through the handwriting tablet or the stylus pen.
	• when the input device is a keyboard, the method further includes:
  • the screen projection source terminal receives the operation command corresponding to the non-character key of the keyboard from the screen projection destination terminal;
  • the screen projection source terminal edits the content in the first edit box according to the operation command.
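As a sketch of the editing step above, an operation command for a non-character key can be applied to the content of the first edit box at the source. The command names (DELETE, ENTER) and the cursor model below are invented for illustration; the patent does not define a command set:

```python
def apply_operation_command(text, cursor, command):
    """Apply a non-character operation command to the first edit box's
    content at the given cursor index; returns (new_text, new_cursor).
    Only two illustrative commands are modeled."""
    if command == "DELETE" and cursor > 0:
        # backspace-like behaviour: remove the character before the cursor
        return text[:cursor - 1] + text[cursor:], cursor - 1
    if command == "ENTER":
        return text[:cursor] + "\n" + text[cursor:], cursor + 1
    return text, cursor

state = apply_operation_command("hello", 5, "DELETE")
# state == ("hell", 4)
```

Note that unlike character input, which travels through the second edit box as content to be displayed, these commands are forwarded over the reverse control channel and executed by the application at the source.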
	• after the screen display content is projected by the projection source to the projection destination, the user can generate text content or images locally at the projection destination by manipulating the input device of the projection destination, and the generated text or image is sent to the projection source as the content to be displayed; after receiving the text or image sent by the projection destination, the projection source submits the content to be displayed to its edit box for display, so that the local input of the projection destination can be synchronized to the projection source without distinction, which improves the user input experience. Taking the projection destination being a computer and the projection source being a mobile phone as an example, with the input method provided in the embodiment of this application, when the screen content of the mobile phone is projected to the computer, the user can use the keyboard of the computer to input content into the mobile phone, which breaks the situation in which the data and services of the mobile phone and the computer are almost isolated islands, so that the mobile phone and the computer can communicate quickly. This allows the user to complete word processing on the mobile phone through the keyboard of the computer and the input method of the computer.
  • an application architecture to which the input method described in the embodiment of the present application can be applied is provided.
  • the application architecture includes a mobile phone and a personal computer (PC), where the mobile phone and the PC are connected through a network.
  • the mobile phone has a projection server module, application, Android open-source project (AOSP) architecture, and AOSP native method module.
	• the projection server module has a projection input service module, an input method synchronization module, and a reverse control module; the application includes the application input box.
  • the AOSP architecture has a key event module (keyevent), an input management service module (inputmanager service), and an input method management service module (input method manager service).
  • the AOSP native method module includes an input dispatcher service module and an input management service module (input manager service).
  • PC has a PC input method module, system, PC input box, input management module, reverse control module, input method synchronization module, and window synchronization module.
  • the input management module of the PC includes an input monitoring module, a state control module, and a cursor following module.
  • the mobile phone can project its screen display content onto the PC screen and synchronize the windows, that is, display content synchronization.
  • the input management service module of the AOSP architecture can obtain the cursor position of the application input box and send it to the projection input service module of the mobile phone.
  • the input method management service module of the AOSP architecture can obtain the input status of the application input box and send it to the screen projection input service module of the mobile phone.
  • the screen projection input service module of the mobile phone can send the input status and cursor position of the application input box to the input method synchronization module of the mobile phone.
  • the input method synchronization module of the mobile phone can send the input state and cursor position of the application input box to the input method synchronization module of the PC.
  • the input method synchronization module of the PC can send the input state of the application input box to the state control module of the PC, and the state control module sets the input state of the PC input box according to the input state of the application input box.
  • the PC input method synchronization module can send the cursor position of the application input box to the cursor following module of the PC, and the cursor following module sets the position of the PC input box according to the cursor position.
	• the user can input content into the PC input box through the PC input method.
  • the PC input box sends the input content to the input monitoring module of the PC.
  • the input monitoring module of the PC can send the input content to the input method synchronization module of the mobile phone through the input method synchronization module of the PC.
  • the mobile phone input method synchronization module sends the input content to the screen projection input service module of the mobile phone.
  • the screen projection input service module of the mobile phone submits the input content to the application input box for display.
  • the input monitoring module of the PC can also monitor the input events of the system and send the input events to the mobile phone through the reverse control module.
  • the AOSP native method module obtains input events from the reverse control module of the mobile phone and sends them to the input distribution service module.
  • the input distribution service module can distribute input events to applications, so that the application can manage the content in the application input box according to the input events.
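The input-content path of FIG. 7, from the PC input box through the synchronization modules to the application input box, can be pictured as a simple relay chain. Module names are abbreviated and the behavior is purely illustrative; each real module also transforms or transports the content, which the sketch omits:

```python
from functools import reduce

trace = []  # records which module handled the content, in order

def module(name):
    def handle(content):
        trace.append(name)
        return content  # each hop just forwards the content unchanged
    return handle

# The FIG. 7 input-content path, PC side first, phone side last:
path = [
    module("pc_input_box"),
    module("pc_input_monitoring"),
    module("pc_input_method_sync"),
    module("phone_input_method_sync"),
    module("phone_projection_input_service"),
    module("application_input_box"),
]

displayed = reduce(lambda content, handle: handle(content), path, "hello")
# displayed == "hello"; trace lists all six hops in order.
```

The symmetry of the architecture is visible here: state and cursor information flow phone-to-PC through the same pair of input method synchronization modules that carry input content PC-to-phone.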
  • an input method is provided, refer to FIG. 8.
  • the method includes the following.
  • the mobile phone projects its screen display content to a computer (PC), so that the PC displays the mobile phone screen display content.
  • the user can operate the mouse of the PC and input mouse click events into the PC input box.
  • the user can also operate the PC keyboard and enter PC keyboard input events into the PC input box.
  • the PC input box performs event monitoring, and assembles and sends click/input events.
  • the projection window module sends the click/input event received from the PC input box to the projection client module.
  • the projection client module sends the click/input event to the network module of the mobile phone via network transmission.
  • the network module performs network monitoring, event reception and analysis. After that, the parsed click/input event is sent to the projection server module.
  • the screen projection server module injects click/input events into the input management service module (inputmanager service).
  • the input management service module can distribute click/input events to the input dispatcher.
  • the input distribution module distributes click/input events to the phone input box (edittext).
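The assembly, transmission, and parsing of a click/input event in the flow above could look like the following sketch. The patent does not specify a wire format, so JSON and these field names are assumed here purely for illustration:

```python
import json

def assemble_event(event_type, **payload):
    """PC side: assemble a click/input event for network transmission."""
    return json.dumps({"type": event_type, "payload": payload})

def parse_event(raw):
    """Phone side: parse a received event before it is injected into the
    input management service."""
    msg = json.loads(raw)
    return msg["type"], msg["payload"]

raw = assemble_event("click", x=120, y=340, click_type="down")
kind, payload = parse_event(raw)
# kind == "click"; payload == {"x": 120, "y": 340, "click_type": "down"}
```

After parsing, the phone-side projection server module hands the event to the input management service, which dispatches it exactly as it would a locally generated touch or key event.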
  • an input method is provided, refer to FIG. 9.
  • the method includes the following.
	• the input management service module (inputmanager service) of the mobile phone sends the input box's enter-input event (that is, the event of entering the input state) to the projection input method service module.
	• the projection input method service module sends the enter-input event to the projection server module.
	• the projection server module sends the enter-input event to the network module of the PC.
	• the network module sends the enter-input event it receives to the projection client module.
	• the projection client module processes the enter-input event and sends it to the projection window module, so that the projection window module activates the PC input box and then pulls up the PC input method, allowing the user to input through the PC keyboard on the PC side.
  • the user can input text into the PC input box through the PC input method.
  • the projection window module obtains the input text in the PC input box.
	• the projection window module sends a text input event, which includes the input text, to the projection server module through the projection client module and the network module in sequence.
	• the projection server module processes the text input event to obtain the input text, and then submits the input text to the input box of the mobile phone through the input method management service module, so that the user's input on the PC side is displayed in the mobile phone's input box.
  • the input box of the mobile phone sends the cursor position change to the projection input method service module, and the projection input method service module sends the cursor position to the projection server module.
  • the projection server module sends the cursor position to the network module.
  • the network module sends the cursor position to the projection client module.
  • the projection client module sends the cursor position to the projection window module.
  • the projection window module sends the cursor position to the PC input box, so that the PC input box can adjust its position according to the cursor position of the mobile phone input box.
	• the input management service module (inputmanager service) of the mobile phone sends the input box's exit-input event (that is, the event of exiting the input state) to the projection input method service module.
  • the projection input method service module sends the exit input event to the projection server module.
  • the projection server module sends the exit input event to the network module of the PC.
  • the network module sends the exit input event it receives to the projection client module.
  • the projection client module processes the exit input event and sends it to the projection window module, so that the projection window module deactivates the PC input box, thereby hiding the PC input method.
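The enter-input and exit-input handling of FIG. 9 amounts to a small state machine on the PC side, sketched below with invented names; it is not the actual implementation:

```python
class PCInputBox:
    """Tiny state machine for the PC input box: an enter-input event from
    the phone activates the box and pulls up the PC input method; an
    exit-input event deactivates it and hides the input method."""

    def __init__(self):
        self.active = False
        self.ime_visible = False

    def handle(self, event):
        if event == "enter_input":
            self.active = True
            self.ime_visible = True
        elif event == "exit_input":
            self.active = False
            self.ime_visible = False

box = PCInputBox()
box.handle("enter_input")
active_after_enter = box.active      # True: the user can type via the PC IME
box.handle("exit_input")             # box deactivated, input method hidden
```

Driving the PC input box's state entirely from events originating at the phone keeps the two ends consistent: the PC input method is visible exactly when the phone's input box is in the input state.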
	• an apparatus 1000 is provided; referring to FIG. 10, the apparatus 1000 includes:
  • the display unit 1010 is configured to display a screen projection window on at least a part of the display interface, the screen projection window being a mirror image of the screen display content of the screen projection source end, and the screen display content includes a first edit box;
  • the receiving unit 1020 is configured to receive input activation information from the projection source end, where the input activation information is used to indicate that the first edit box has obtained the input focus;
  • the obtaining unit 1030 is configured to obtain the text content or image input by the input device of the apparatus in response to the input activation information, so as to obtain the content to be displayed;
	• the sending unit 1040 is configured to send the content to be displayed to the projection source end;
  • the update unit 1050 is configured to update the projection window, the updated projection window is a mirror image of the updated screen display content of the projection source end, and the first part of the updated screen display content is The content to be displayed is displayed in the edit box.
  • the acquiring unit 1030 includes a setting sub-unit 1031, a monitoring sub-unit 1032, and an acquiring sub-unit 1033;
	• the setting sub-unit 1031 is configured to set the second edit box to the input state in response to the input activation information;
  • the monitoring subunit 1032 is used for monitoring the content change event of the second edit box;
	• the acquiring sub-unit 1033 is configured to obtain, in response to the content change event, the text content or image that triggered the content change event as the content to be displayed.
  • the device further includes an activation unit 1060, configured to activate the first input method in response to the input activation information.
  • the device further includes a deleting unit 1070, configured to delete the text content in the second edit box after the sending unit sends the content to be displayed to the screen source end Or image.
	• each functional unit of the apparatus 1000 can be implemented with reference to the method embodiment shown in FIG. 2, which will not be repeated here.
	• after the screen display content is projected to the projection destination by the projection source, the user can generate text content or images locally at the projection destination by manipulating the input device of the projection destination, and the generated text content or image is sent to the projection source as the content to be displayed; after receiving the text content or image sent by the projection destination, the projection source submits the content to be displayed to its edit box for display, so that the local input of the projection destination can be synchronized to the projection source without distinction, which improves the user input experience.
  • the apparatus 1100 includes:
	• the projection unit 1110 is configured to project its screen display content to a projection destination, so that the projection destination displays a projection window on at least a part of the display interface, the projection window being a mirror image of the screen display content, and the screen display content including a first edit box;
	• the sending unit 1120 is configured to send input activation information to the projection destination, where the input activation information is used to indicate that the first edit box has obtained the input focus;
	• the receiving unit 1130 is configured to receive content to be displayed from the projection destination, where the content to be displayed is obtained by the projection destination in response to the input activation information, and is the text content or image input by the input device of the projection destination;
  • the display unit 1140 is configured to display the content to be displayed in the first edit box to update the on-screen content.
	• the device 1100 further includes an activation unit 1150; the receiving unit 1130 is further configured to receive, from the projection destination, a click signal for the mirror image of the first edit box before the sending unit 1120 sends the input activation information, where the click signal includes a click type and a click position; the activation unit 1150 is configured to activate the second input method of the projection source according to the click signal, so as to monitor the cursor position of the first edit box through the second input method.
	• the device 1100 further includes an acquiring unit 1160; the receiving unit 1130 is further configured to acquire, from the projection destination, a click signal for the mirror image of the first edit box before the sending unit 1120 sends the input activation information, where the click signal includes a click type and a click position; the acquiring unit 1160 is configured to enable the first edit box to obtain the input focus according to the click signal; the sending unit 1120 is used to send the input activation information to the projection destination.
	• each functional unit of the apparatus 1100 can be implemented with reference to the method embodiment shown in FIG. 6, which will not be repeated here.
	• after the screen display content is projected to the projection destination by the projection source, the user can generate text content or images locally at the projection destination by manipulating the input device of the projection destination, and the generated text content or image is sent to the projection source as the content to be displayed; after receiving the text content or image sent by the projection destination, the projection source submits the content to be displayed to its edit box for display, so that the local input of the projection destination can be synchronized to the projection source without distinction, which improves the user input experience.
	• to implement the above functions, each electronic device includes a hardware structure and/or a software module corresponding to each function.
	• the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or by computer software driving hardware depends on the specific application and the design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of this application.
  • the electronic device can be divided into functional modules.
	• each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules. It should be noted that the division of modules in the embodiments of the present application is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
  • the projection destination terminal includes a processor 1210, a memory 1220, a transceiver 1230, and a display screen 1240.
	• the memory is used to store computer execution instructions; when the projection destination runs, the processor 1210 executes the computer execution instructions stored in the memory 1220, so that the projection destination performs the method shown in FIG. 2.
	• the display screen 1240 is used to display a projection window on at least a part of the display interface, the projection window being a mirror image of the screen display content of the projection source, and the screen display content including a first edit box; the transceiver 1230 is used to receive input activation information from the projection source, where the input activation information is used to indicate that the first edit box has obtained the input focus; the processor 1210 is configured to obtain, in response to the input activation information, the text content or image input by the input device of the device, so as to obtain the content to be displayed; the transceiver 1230 is used to send the content to be displayed to the projection source; the display screen 1240 is used to update the projection window, where the updated projection window is a mirror image of the updated screen display content of the projection source, and the content to be displayed is displayed in the first edit box in the updated screen display content.
	• the projection destination further includes a communication bus 1250, where the processor 1210 can be connected to the memory 1220, the transceiver 1230, and the display screen 1240 through the communication bus 1250, so as to control the transceiver 1230 and the display screen 1240 according to the computer execution instructions stored in the memory 1220.
	• the user can generate text content or images locally at the projection destination by manipulating the input device of the projection destination, and the generated text content or image is sent to the projection source as the content to be displayed; after receiving the text content or image sent by the projection destination, the projection source submits the content to be displayed to its edit box for display, so that the local input at the projection destination can be synchronized to the projection source without distinction, which improves the user input experience.
  • the projection source terminal includes a processor 1310, a memory 1320, a transceiver 1330 and a display screen 1340.
	• the memory is used to store computer execution instructions; when the projection source runs, the processor 1310 executes the computer execution instructions stored in the memory 1320, so that the projection source performs the method shown in FIG. 6.
  • the transceiver 1330 is used to project the screen display content of the display screen 1340 to the projection destination, so that the projection destination displays a projection window on at least a part of the display interface, and the projection window is the A mirror image of the on-screen display content.
	• the screen display content includes a first edit box; the transceiver 1330 is used to send input activation information to the projection destination, where the input activation information is used to indicate that the first edit box has obtained the input focus; the transceiver 1330 is used to receive the content to be displayed from the projection destination, where the content to be displayed is obtained by the projection destination in response to the input activation information and is the text content or image input by the input device of the projection destination; the display screen 1340 is used to display the content to be displayed in the first edit box to update the screen display content.
  • the projection source further includes a communication bus 1350, where the processor 1310 may be connected to the memory 1320, the transceiver 1330, and the display screen 1340 through the communication bus 1350, so that the processor 1310 can control the transceiver 1330 and the display screen 1340 according to the computer-executable instructions stored in the memory 1320.
  • after the projection source projects its on-screen content to the projection destination, the user can generate text content or images locally at the projection destination by operating the input device of the projection destination; the generated text content or images are sent to the projection source as the content to be displayed; after receiving the text content or images sent by the projection destination, the projection source submits the content to be displayed to its edit box for display, so that local input at the projection destination is synchronized to the projection source without any difference, which improves the user input experience.
  • the processor in the embodiments of this application may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
  • the general-purpose processor may be a microprocessor or any conventional processor.
  • the method steps in the embodiments of the present application can be implemented by hardware, or can be implemented by a processor executing software instructions.
  • Software instructions can be composed of corresponding software modules.
  • Software modules can be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art.
  • An exemplary storage medium is coupled to the processor, so that the processor can read information from the storage medium and can write information to the storage medium.
  • the storage medium may also be an integral part of the processor.
  • the processor and the storage medium may be located in the ASIC.
  • When implemented in software, the foregoing embodiments may be implemented wholly or partly in the form of a computer program product; the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted through the computer-readable storage medium.
  • the computer instructions can be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (such as infrared, radio, or microwave) manner.
  • the computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or a data center that integrates one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).


Abstract

This application provides an input method, an electronic device, and a screen projection system. The method includes: a projection destination displays a projection window on at least a part of its display interface, where the projection window is a mirror image of the on-screen content of a projection source, and the on-screen content includes a first edit box; the projection destination receives input activation information from the projection source, where the input activation information is used to indicate that the first edit box has obtained the input focus; in response to the input activation information, the projection destination obtains text content or an image entered through an input device of the projection destination, to obtain content to be displayed; the projection destination sends the content to be displayed to the projection source; and the projection destination updates the projection window, where the updated projection window is a mirror image of the updated on-screen content of the projection source, and the content to be displayed is shown in the first edit box of the updated on-screen content.

Description

一种输入方法、电子设备和投屏系统
本申请要求于2019年6月20日提交中国国家知识产权局、申请号为201910537753.9、发明名称为“一种输入方法、电子设备和投屏系统”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及数据处理技术领域,具体涉及一种输入方法、电子设备和投屏系统。
背景技术
手机可以播放视频,也可以显示文档、图片、应用界面或网页。由于手机的显示屏较小,在需要将手机显示的内容给别人看时,可以将手机显示的内容通过投屏技术投射到显示设备(例如电视、计算机或另一个手机)上。在投屏场景下,用户想要向手机输入内容时,目前可通过以下两种方式实现。
1.直接触摸或点击手机的显示屏,输入内容。
2.点击显示设备显示的手机输入法的虚拟键盘的镜像,并将产生的点击事件以及点击的位置发送给手机,手机解析该点击事件以及点击的位置,以产生输入内容。
现有技术中,在投屏场景下,在向手机输入内容时,无论是在手机上进行操作,还是在显示设备上进行操作,都必须依赖手机的输入法。这种输入方式输入效率较低。
发明内容
本申请提供了一种输入方法、电子设备和投屏系统,可以通过投屏目的端的输入设备输入显示内容,并将该显示内容显示到投屏源端的编辑框中,从而可以实现向投屏源端快捷高效的输入内容。
第一方面提供了一种输入方法,包括:投屏目的端在至少一部分显示界面上显示投屏窗口,所述投屏窗口为投屏源端的屏显内容的镜像,所述屏显内容包括第一编辑框;所述投屏目的端从所述投屏源端接收输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点;所述投屏目的端响应于所述输入启动信息,获取所述投屏目的端的输入设备输入的文本内容或图像,以得到待显示内容;所述投屏目的端向所述投屏源端发送所述待显示内容;所述投屏目的端更新所述投屏窗口,更新后的投屏窗口为所述投屏源端更新后的屏显内容的镜像,在所述更新后的屏显内容中的所述第一编辑框中显示有所述待显示内容。
结合第一方面,在第一方面第一种可能的实现方式中,所述投屏目的端响应于所述输入启动信息,获取所述投屏目的端的输入设备输入的文本内容或图像包括:所述投屏目的端响应于所述输入启动信息,将所述投屏目的端的第二编辑框设置为输入状态;所述投屏目的端监听所述第二编辑框的内容变更事件;所述投屏目的端响应于所述内容变更事件,获取触发所述内容变更事件的文本内容或图像,以获取的文本内容或图像作为所述待显示内容。
在该实现方式中,投屏目的端根据输入启动信息启动第二编辑框,通过向第二编辑框输入文本内容或图像,进而可得到待显示内容,从而可以实现在投屏目的端本地灵活输入待显示内容。
结合第一方面第一种可能的实现方式,在第一方面第二种可能的实现方式中,所述文本内容或图像为所述投屏目的端的第一输入法根据所述输入设备产生的操作命令而生成的。
在该实现方式中,对于需要借助输入法的待显示内容,例如汉字等,可以通过投屏目的端的输入法根据投屏目的端的输入设备产生的操作命令生成该显示内容。
结合第一方面第二种可能的实现方式,在第一方面第三种可能的实现方式中,所述方法还包括:所述投屏目的端响应于所述输入启动信息,启动所述第一输入法。
在该实现方式中,投屏目的端可根据接收到的输入启动信息,自动启动投屏目的端的输入法,从而无须人为启动投屏目的端的输入法,节省了用户操作,改善了用户输入体验。
结合第一方面第一种可能的实现方式,在第一方面第四种可能的实现方式中,所述第二编辑框为隐藏编辑框或透明编辑框;所述输入启动信息包括所述第一编辑框的第一光标位置;所述方法还包括:所述投屏目的端根据所述第一光标位置,设置所述第二编辑框在所述投屏窗口上的位置。
在该实现方式中,投屏目的端的编辑框为隐藏或透明编辑框,从而不会遮挡或干扰投屏源端的编辑框的镜像在投屏目的端的显示,改善了用户体验;并且,通过投屏源端的编辑框的光标位置,设置投屏目的端编辑框在投屏窗口上的位置,可使用户在输入时,候选词提示框可以在投屏源端的编辑框的镜像附近显示,改善了用户输入体验。
结合第一方面第四种可能的实现方式,在第一方面第五种可能的实现方式中,在所述投屏目的端向所述投屏源端发送所述待显示内容之后,所述方法还包括:所述投屏目的端在所述第二编辑框中删除所述文本内容或图像。
在该实现方式中,在将投屏目的端编辑框中的待显示内容发送给投屏源端后,删除投屏目的端编辑框中的待显示内容,从而使得投屏目的端仅显示投屏源端投射来的位于投屏源端编辑框中的待显示内容的镜像,避免了投屏目的端本地显示的待显示内容对待显示内容的镜像的遮挡或干扰,改善了用户投屏体验。
结合第一方面第一种可能的实现方式,在第一方面第六种可能的实现方式中,所述第二编辑框设置在所述投屏目的端的显示界面中除投屏窗口之外的界面上。
在该实现方式中,投屏窗口为投屏目的端显示界面的部分界面,将投屏目的端本地编辑框设置在投屏目的端的显示界面中除投屏窗口之外的界面,可以避免投屏目的端本地编辑框遮挡投屏源端编辑框的镜像,改善了用户投屏体验。
结合第一方面,在第一方面第七种可能的实现方式中,在所述投屏目的端从所述投屏源端接收输入启动信息之前,所述方法还包括:所述投屏目的端获取针对所述投屏窗口中所述第一编辑框的镜像的点击信号,所述点击信号包括点击类型和点击位置;所述投屏目的端向所述投屏源端发送所述点击信号,以使第一编辑框获得输入焦点。
在该实现方式中,用户可以在投屏目的端进行操作,使得投屏源端的编辑框获得输入焦点,进入输入状态,进而可以使得投屏源端的编辑框显示待显示内容。
结合第一方面,在第一方面第八种可能的实现方式中,
所述输入设备为以下任一种:
键盘、麦克风、摄像头、扫描仪、手写板、手写笔;其中,
当所述输入设备为键盘时,所述待显示内容为根据所述键盘上的字符按键对应的操作命令生成的文本;
当所述输入设备为麦克风时,所述待显示内容为所述麦克风输入语音的转换文本;
当所述输入设备为摄像头时,所述待显示内容为从所述摄像头拍摄的图片中提取的文本或图像;
当所述输入设备为扫描仪时,所述待显示内容为从所述扫描仪输入图片中提取的文本或图像;
当所述输入设备为手写板或手写笔时,所述待显示内容为通过所述手写板或手写笔输入的文本或图像。
在该实现方式中,用户可以通过投屏目的端本地的不同输入设备输入待显示内容,以发送给投屏源端,并在投屏源端的编辑框中显示。
结合第一方面,在第一方面第九种可能的实现方式中,所述输入设备为键盘,所述方法还包括:所述投屏目的端获取所述键盘的非字符按键对应的操作命令;所述投屏目的端将所述操作命令发送至所述投屏源端,以使所述投屏源端根据所述操作命令对所述第一编辑框中的内容进行编辑。
在该实现方式中,用户可以操控投屏目的端的键盘实现对投屏源端编辑框中内容的编辑,提高了对投屏源端编辑框中内容进行编辑的效率。
第二方面提供了一种输入方法,包括:投屏源端将其屏显内容投射到投屏目的端,以使所述投屏目的端在至少一部分显示界面上显示投屏窗口,所述投屏窗口为所述屏显内容的镜像,所述屏显内容包括第一编辑框;所述投屏源端向所述投屏目的端发送输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点;所述投屏源端从所述投屏目的端接收待显示内容,所述待显示内容为所述投屏目的端响应于所述输入启动信息而获取的,所述待显示内容为所述投屏目的端的输入设备输入的文本内容或图像;所述投屏源端在所述第一编辑框显示所述待显示内容,以更新所述屏显内容。
结合第二方面,在第二方面第一种可能的实现方式中,所述输入启动信息用于设置所述投屏目的端的第二编辑框进入输入状态,所述待显示内容为触发所述第二编辑框的内容变更事件的文本内容或图像。
结合第二方面第一种可能的实现方式,在第二方面第二种可能的实现方式中,所述待显示内容为所述投屏目的端的第一输入法根据所述输入设备产生的操作命令而生成的。
结合第二方面第一种可能的实现方式,在第二方面第三种可能的实现方式中,所述第二编辑框为隐藏编辑框或透明编辑框。
结合第二方面第三种可能的实现方式,在第二方面第四种可能的实现方式中,所述输入启动信息包括所述第一编辑框的第一光标位置;所述第一光标位置用于设置所述第二编辑框在所述投屏窗口上的位置。
结合第二方面第四种可能的实现方式,在第二方面第五种可能的实现方式中,在所述投屏源端向所述投屏目的端发送输入启动信息之前,所述方法还包括:所述投屏源端从所述投屏目的端接收针对第一编辑框的镜像的点击信号,所述点击信号包括点击类型和点击位置;根据所述点击信号启动所述投屏源端的第二输入法,以便通过所述第二输入法监听所述第一编辑框的光标位置。
结合第二方面第一种可能的实现方式,在第二方面第六种可能的实现方式中,所述第二编辑框设置在所述投屏目的端的显示界面中除投屏窗口之外的界面上。
结合第二方面,在第二方面第七种可能的实现方式中,在所述投屏源端向所述投屏目的端发送输入启动信息之前,所述方法还包括:所述投屏源端从所述投屏目的端获取针对第一编辑框的镜像的点击信号,所述点击信号包括点击类型和点击位置;所述投屏源端根据所述点击信号使所述第一编辑框获得输入焦点,并向所述投屏目的端发送所述输入启动信息。
结合第二方面,在第二方面第八种可能的实现方式中,
所述输入设备为以下任一种:
键盘、麦克风、摄像头、扫描仪、手写板、手写笔;其中,
当所述输入设备为键盘时,所述待显示内容为根据所述键盘上的字符按键对应的操作命令生成的文本;
当所述输入设备为麦克风时,所述待显示内容为所述麦克风输入语音的转换文本;
当所述输入设备为摄像头时,所述待显示内容为从所述摄像头拍摄的图片中提取的文本或图像;
当所述输入设备为扫描仪时,所述待显示内容为从所述扫描仪输入图片中提取的文本或图像;
当所述输入设备为手写板或手写笔时,所述待显示内容为通过所述手写板或手写笔输入的文本或图像。
结合第二方面,在第二方面第九种可能的实现方式中,所述输入设备为键盘,所述方法还包括:所述投屏源端从所述投屏目的端接收所述键盘的非字符按键对应的操作命令;所述投屏源端根据所述操作命令对所述第一编辑框中的内容进行编辑。
第三方面提供了一种投屏目的端,包括处理器、存储器、收发器和显示屏;其中,所述存储器用于存储计算机执行指令;当所述投屏目的端运行时,所述处理器执行所述存储器存储的所述计算机执行指令,以使所述投屏目的端执行第一方面所述的方法。应理解:该投屏目的端为一种电子设备。
第四方面提供了一种投屏源端,包括处理器、存储器、收发器和显示屏;其中,所述存储器用于存储计算机执行指令;当所述投屏源端运行时,所述处理器执行所述存储器存储的所述计算机执行指令,以使所述投屏源端执行第一方面所述的方法。应理解:该投屏源端为一种电子设备。
第五方面提供了一种投屏系统,包括第三方面所述的投屏目的端和第四方面所述的投屏源端。
第六方面本申请提供了一种装置,包括:
显示单元,用于在至少一部分显示界面上显示投屏窗口,所述投屏窗口为投屏源端的屏显内容的镜像,所述屏显内容包括第一编辑框;
接收单元,用于从所述投屏源端接收输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点;
获取单元,用于响应于所述输入启动信息,获取所述装置的输入设备输入的文本内容或图像,以得到待显示内容;
发送单元,用于向所述投屏源端发送所述待显示内容;
更新单元,用于更新所述投屏窗口,更新后的投屏窗口为所述投屏源端更新后的屏显内容的镜像,在所述更新后的屏显内容中的所述第一编辑框中显示有所述待显示内容。
结合第六方面在第六方面第一种可能的实现方式中,所述获取单元包括设置子单元、监听子单元、获取子单元;所述设置子单元用于响应于所述输入启动信息,设置所述投屏目的端的第二编辑框进入输入状态;所述监听子单元用于监听所述第二编辑框的内容变更事件;所述获取子单元用于响应于所述内容变更事件,获取触发所述内容变更事件的文本内容或图像,以作为所述待显示内容。
结合第六方面第一种可能的实现方式,在第六方面第二种可能的实现方式中,所述文本内容或图像为所述装置的第一输入法根据所述输入设备产生的操作命令而生成的。
结合第六方面第二种可能的实现方式,在第六方面第三种可能的实现方式中,所述装置还包括:启动单元,用于响应于所述输入启动信息,启动所述第一输入法。
结合第六方面第一种可能的实现方式,在第六方面第四种可能的实现方式中,所述第二编辑框为隐藏编辑框或透明编辑框;所述输入启动信息包括所述第一编辑框的第一光标位置;所述设置子单元还用于根据所述第一光标位置,设置所述第二编辑框在所述投屏窗口上的位置。
结合第六方面第四种可能的实现方式,在第六方面第五种可能的实现方式中,所述装置还包括删除单元,用于在所述发送单元向所述投屏源端发送所述待显示内容之后,在所述第二编辑框中删除所述文本内容或图像。
结合第六方面第一种可能的实现方式,在第六方面第六种可能的实现方式中,所述第二编辑框设置在所述投屏目的端的显示界面中除投屏窗口之外的界面上。
结合第六方面,在第六方面第七种可能的实现方式中,所述获取单元还用于在所述接收单元从所述投屏源端接收输入启动信息之前,获取针对所述投屏窗口中所述第一编辑框的镜像的点击信号,所述点击信号包括点击类型和点击位置;所述发送单元还用于向所述投屏源端发送所述点击信号,以使第一编辑框获得输入焦点。
结合第六方面,在第六方面第八种可能的实现方式中,
所述输入设备为以下任一种:
键盘、麦克风、摄像头、扫描仪、手写板、手写笔;其中,
当所述输入设备为键盘时,所述待显示内容为根据所述键盘上的字符按键对应的操作命令生成的文本;
当所述输入设备为麦克风时,所述待显示内容为所述麦克风输入语音的转换文本;
当所述输入设备为摄像头时,所述待显示内容为从所述摄像头拍摄的图片中提取的文本或图像;
当所述输入设备为扫描仪时,所述待显示内容为从所述扫描仪输入图片中提取的文本或图像;
当所述输入设备为手写板或手写笔时,所述待显示内容为通过所述手写板或手写笔输入的文本或图像。
结合第六方面,在第六方面第九种可能的实现方式中,所述输入设备为键盘,所述获取单元用于获取所述键盘的非字符按键对应的操作命令;所述发送单元用于将所述操作命令发送至所述投屏源端,以使所述投屏源端根据所述操作命令对所述第一编辑框中的内容进行编辑。
第七方面提供了一种装置,包括:
投射单元,用于将其屏显内容投射到投屏目的端,以使所述投屏目的端在至少一部分显示界面上显示投屏窗口,所述投屏窗口为所述屏显内容的镜像,所述屏显内容包括第一编辑框;
发送单元,用于向所述投屏目的端发送输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点;
接收单元,用于从所述投屏目的端接收待显示内容,所述待显示内容为所述投屏目的端响应于所述输入启动信息而获取的,所述待显示内容为所述投屏目的端的输入设备输入的文本内容或图像;
显示单元,用于在所述第一编辑框显示所述待显示内容,以更新所述屏显内容。
结合第七方面,在第七方面第一种可能的实现方式中,所述输入启动信息用于设置所述投屏目的端的第二编辑框进入输入状态,所述待显示内容为触发所述第二编辑框的内容变更事件的文本内容或图像。
结合第七方面第一种可能的实现方式,在第七方面第二种可能的实现方式中,所述待显示内容为所述投屏目的端的第一输入法根据所述输入设备产生的操作命令而生成的。
结合第七方面第一种可能的实现方式,在第七方面第三种可能的实现方式中,所述第二编辑框为隐藏编辑框或透明编辑框。
结合第七方面第三种可能的实现方式,在第七方面第四种可能的实现方式中,所述输入启动信息包括所述第一编辑框的第一光标位置;所述第一光标位置用于设置所述第二编辑框在所述投屏窗口上的位置。
结合第七方面第四种可能的实现方式,在第七方面第五种可能的实现方式中,所述装置还包括启动单元;所述接收单元还用于在所述发送单元向所述投屏目的端发送输入启动信息之前,从所述投屏目的端接收针对第一编辑框的镜像的点击信号,所述点击信号包括点击类型和点击位置;所述启动单元用于根据所述点击信号启动所述投屏源端的第二输入法,以便通过所述第二输入法监听所述第一编辑框的光标位置。
结合第七方面第一种可能的实现方式,在第七方面第六种可能的实现方式中,所述第二编辑框设置在所述投屏目的端的显示界面中除投屏窗口之外的界面上。
结合第七方面,在第七方面第七种可能的实现方式中,所述装置还包括获取单元;所述接收单元还用于在所述发送单元向所述投屏目的端发送输入启动信息之前,从所述投屏目的端获取针对第一编辑框的镜像的点击信号,所述点击信号包括点击类型和点击位置;所述获取单元用于根据所述点击信号使所述第一编辑框获得输入焦点;所述发送单元用于向所述投屏目的端发送所述输入启动信息。
结合第七方面,在第七方面第八种可能的实现方式中,
所述输入设备为以下任一种:
键盘、麦克风、摄像头、扫描仪、手写板、手写笔;其中,
当所述输入设备为键盘时,所述待显示内容为根据所述键盘上的字符按键对应的操作命令生成的文本;
当所述输入设备为麦克风时,所述待显示内容为所述麦克风输入语音的转换文本;
当所述输入设备为摄像头时,所述待显示内容为从所述摄像头拍摄的图片中提取的文本或图像;
当所述输入设备为扫描仪时,所述待显示内容为从所述扫描仪输入图片中提取的文本或图像;
当所述输入设备为手写板或手写笔时,所述待显示内容为通过所述手写板或手写笔输入的文本或图像。
结合第七方面,在第七方面第九种可能的实现方式中,所述输入设备为键盘,所述装置还包括编辑单元;所述接收单元还用于从所述投屏目的端接收所述键盘的非字符按键对应的操作命令;所述编辑单元用于根据所述操作命令对所述第一编辑框中的内容进行编辑。
第八方面提供了一种计算机存储介质,所述计算机存储介质包括计算机指令,当所述计算机指令在终端上运行时,使得所述终端执行第一方面所述的方法或第二方面所述的方法。
第九方面提供了一种计算机程序产品,所述计算机程序产品包含的程序代码被终端中的处理器执行时,实现第一方面所述的方法或第二方面所述的方法。
根据本申请实施例提供的输入方法、投屏目的端、投屏源端和装置,在投屏源端将其屏显内容投射到投屏目的端之后,用户可以通过操控投屏目的端的输入设备,在投屏目的端本地生成文本内容或图像;并将其生成的文本内容或图像作为待显示内容发送至投屏源端;投屏源端在接收到投屏目的端发送的文本内容或图像后,将该待显示内容提交至其编辑框中显示,使得投屏目的端的本地输入可以无差别地同步到投屏源端,提高了用户输入体验。以投屏目的端为电脑、投屏源端为手机为例,通过本申请实施例提供的输入方法,当手机的屏显内容投射到电脑上时,用户可以借助电脑的键盘向手机输入内容,打破了手机和电脑之间的数据和服务几乎相互独立的局面,使得手机和电脑可以快捷互通,使得用户可以通过电脑的键盘和电脑的输入法来完成手机端的文字处理,在办公等场景下可以极大地提高用户处理手机上信息的效率。
附图说明
图1为本申请实施例提供的一种应用场景示意图;
图2为本申请实施例提供的一种输入方法流程图;
图3a为本申请实施例提供的一种界面展示图;
图3b为本申请实施例提供的一种界面展示图;
图3c为本申请实施例提供的一种界面展示图;
图3d为本申请实施例提供的一种界面展示图;
图4为本申请实施例提供的一种界面展示图;
图5为本申请实施例提供的一种输入方法的应用架构示意图;
图6为本申请实施例提供的一种输入方法流程图;
图7为本申请实施例提供的一种输入方法的应用架构示意图;
图8为本申请实施例提供的一种输入方法流程图;
图9为本申请实施例提供的一种输入方法流程图;
图10为本申请实施例提供的一种装置示意性框图;
图11为本申请实施例提供的一种装置示意性框图;
图12为本申请实施例提供的一种投屏目的端示意性框图。
图13为本申请实施例提供的一种投屏源端示意性框图。
具体实施方式
下面将结合附图,对本发明实施例中的技术方案进行描述。显然,所描述的实施例仅是本发明一部分实施例,而不是全部的实施例。在本申请实施例中,如无特殊说明,“多个”包括“两个”。
投屏是指一个设备将其屏显内容投射到另一设备的显示屏上或显示介质上,是有屏设备间较为典型的信息同步方式。在本申请实施例中,可以将投射其屏显内容的设备称为投屏源端,接收投屏源端的投射并显示投屏源端屏显内容的设备可以称为投屏目的端。可参考图1,投屏源端可以将其屏显内容投射到投屏目的端上显示。在投屏时,投屏源端可以把其屏显内容的视频流数据压缩编码并发送到投屏目的端。投屏目的端接收投屏源端的视频流数据,解码后在投屏目的端的显示屏上显示投屏源端屏显内容。为表述方便,可以将投屏目的端显示屏显示的投屏源端屏显内容称为投屏源端屏显内容的镜像。
随着电子设备的发展,其可提供的服务越来越丰富多彩,可以随时随地使用。为了更好的视觉体验,利用投屏技术,用户可将电子设备作为投屏源端,将其屏显内容投射到投屏目的端的显示屏或显示介质上。电子设备可以为手机、平板电脑等。可作为投屏目的端的设备有个人电脑、智能电视(smart TV)、投影仪等。在一些场景下,例如,搜索视频、修改文档等,用户观看投屏目的端显示的投屏源端屏显内容的同时,需要向投屏源端的编辑框输入文本内容或图像。
本申请实施例提供了一种输入方法,可应用于图1所示的场景。用户可以操控投屏目的端的输入设备,在投屏目的端本地生成文本内容或图像,并作为待显示内容;然后,投屏目的端将待显示内容发送至投屏源端;投屏源端在其编辑框显示其从投屏目的端接收到的待显示内容。从而实现了投屏目的端的本地输入无差别地同步到投屏源端,改善了用户输入体验,提高了输入效率。
接下来,参考图2,对本申请实施例提供的输入方法进行具体介绍。所述方法可以由任何具有显示屏以及数据处理能力的电子设备执行。执行所述输入方法的电子设备可以称为投屏目的端。如图2所示,该方法包括:步骤202、投屏目的端在至少一部分显示界面上显示投屏窗口,所述投屏窗口为投屏源端的屏显内容的镜像,所述屏显内容包括第一编辑框;步骤204、所述投屏目的端从所述投屏源端接收输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点;步骤206、所述投屏目的端响应于所述输入启动信息,获取所述投屏目的端的输入设备输入的文本内容或图像,以得到待显示内容;步骤208、所述投屏目的端向所述投屏源端发送待显示内容;步骤210、所述投屏目的端更新所述投屏窗口,更新后的投屏窗口为所述投屏源端更新后的屏显内容的镜像,在所述更新后的屏显内容中的所述第一编辑框中显示有所述待显示内容。应理解:投屏目的端和投屏源端都是电子设备。
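上述步骤202~210的协作关系可以用如下极简的Python示意代码表示(仅为便于理解的假设性示例,其中的类名ProjectionSource、ProjectionDestination及各方法名均为本文为说明而假设的,并非本申请的实际实现):

```python
class ProjectionSource:
    """模拟投屏源端:维护第一编辑框的内容,并向目的端发送输入启动信息。"""
    def __init__(self):
        self.first_edit_box = ""       # 第一编辑框的内容
        self.destination = None

    def focus_first_edit_box(self):
        # 第一编辑框获得输入焦点 -> 向目的端发送输入启动信息(可携带光标位置)
        self.destination.on_input_activation({"cursor": len(self.first_edit_box)})

    def receive_content(self, content):
        # 在第一编辑框显示待显示内容,更新屏显内容,并同步镜像(对应步骤210)
        self.first_edit_box += content
        self.destination.update_mirror(self.first_edit_box)


class ProjectionDestination:
    """模拟投屏目的端:接收输入启动信息,采集本地输入并回传(对应步骤204~208)。"""
    def __init__(self, source):
        self.source = source
        source.destination = self
        self.mirror = ""               # 投屏窗口中屏显内容的镜像
        self.input_active = False

    def on_input_activation(self, info):
        # 步骤204/206:响应输入启动信息,进入输入状态
        self.input_active = True

    def local_input(self, text):
        # 步骤206/208:获取输入设备输入的文本作为待显示内容,发送给源端
        if self.input_active:
            self.source.receive_content(text)

    def update_mirror(self, screen_content):
        # 步骤210:更新投屏窗口为更新后屏显内容的镜像
        self.mirror = screen_content
```

在该示意中,目的端收到输入启动信息后进入输入状态,本地输入的文本被作为待显示内容回传,源端第一编辑框与目的端的镜像随之同步更新。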
接下来,结合具体例子对本申请实施例提供的输入方法进行介绍。
首先,在步骤202中,投屏目的端在至少一部分显示界面上显示投屏窗口,所述投屏窗口为投屏源端的屏显内容的镜像,所述屏显内容包括第一编辑框。
以电脑和手机这两种电子设备为例。当电脑和手机通过近场通讯技术(例如蓝牙、Wi-Fi、点对点(P2P)、ZigBee(紫蜂)等)或数据线(例如USB数据线)连接时,利用投屏技术,手机可以作为投屏源端将其屏显内容投射到电脑的显示屏上,此时电脑为投屏目的端;或者电脑作为投屏源端可以将其屏显内容投射到手机的显示屏上,此时手机为投屏目的端。
具体以手机向电脑投射为例。当手机向电脑投射手机的屏显内容时,电脑可以同步显示手机的屏显内容的镜像。电脑显示手机屏显内容的镜像的窗口可以称为投屏窗口。电脑可以在其整个显示界面上显示投屏窗口,即投屏窗口占据了电脑显示界面的全部。电脑也可以在其显示界面部分区域上显示投屏窗口,即投屏窗口只是电脑的部分界面显示内容。手机屏显内容包括第一编辑框,相应的,电脑在投屏窗口中也显示所述第一编辑框的镜像。
第一编辑框是指能够作为输入焦点,可接收、容纳、编辑输入内容的窗口,具体可以为输入框、文本框、图片框、地址栏、搜索框、可编辑页面(word页面等)、可容纳输入内容的表格(excel表格等)等等。需要说明的是,上文仅对第一编辑框进行举例说明,并非穷举。
容易理解,投屏源端可以显示多个窗口,所述多个窗口包括至少一个编辑框。所述第一编辑框为所述至少一个编辑框中的任一编辑框。
在一些实施例中,第一编辑框具体可以为投屏源端的第一输入控件的编辑框,例如当第一编辑框为文本框时,该文本框具体为投屏源端的第一文本框(TextBox)控件的文本框;当第一编辑框为图片框时,该图片框具体为投屏源端的第一图片框(PictureBox)控件的图片框。
接着,在步骤204中,投屏目的端从所述投屏源端接收输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点。
投屏源端在使第一编辑框获得了输入焦点时,可以向投屏目的端发送输入启动信息。输入启动信息用于指示第一编辑框获得了投屏源端的输入焦点,可显示投屏源端接收的文本内容或图像。
投屏源端可以使用视频流数据通道向投屏目的端发送所述输入启动信息。视频流数据通道为投屏源端在向投屏目的端投射屏显内容时,用于传输屏显内容视频流数据的数据通道。
在一些实施例中,在步骤204之前,本申请实施例提供的输入方法还包括:获取针对所述投屏窗口中所述第一编辑框的镜像的点击信号;所述点击信号包括点击类型和点击位置;向所述投屏源端发送所述点击信号,以使第一编辑框获得输入焦点。
在这些实施例中,用户可以在投屏目的端,点击投屏窗口中第一编辑框的镜像,使得投屏目的端产生点击信号。该点击信号可以包括点击类型(例如,单击、双击等)和点击位置(具体可以为坐标)。投屏目的端可以将该点击信号发送至投屏源端。投屏源端在接收到所述点击信号后,可以根据该点击信号中的点击位置,确定用户操作对象为第一编辑框。当点击信号中的点击类型为使第一编辑框获得输入焦点的点击类型(例如,可以预设单击编辑框,可使得该编辑框获得输入焦点)时,投屏源端基于该点击类型使第一编辑框获得输入焦点以进入输入状态,进而向投屏目的端发送输入启动信息。
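点击信号的透传与源端的焦点判定可以用如下极简示意表示(假设性代码,函数名与点击类型取值均为说明而设,非本申请的实现):

```python
def make_click_signal(click_type, x, y):
    # 点击信号包括点击类型和点击位置(坐标)
    return {"type": click_type, "pos": (x, y)}


def source_handle_click(signal, edit_box_rect):
    """源端根据点击位置判断操作对象是否为第一编辑框,
    并根据点击类型(此处假设为单击)决定是否使其获得输入焦点。"""
    x, y = signal["pos"]
    left, top, right, bottom = edit_box_rect
    hit = left <= x <= right and top <= y <= bottom
    return hit and signal["type"] == "single_click"
```

若判定结果为真,源端即可使第一编辑框获得输入焦点并向目的端发送输入启动信息。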
在这些实施例的一个例子中,投屏目的端可以通过反向控制通道将该点击信号发送至投屏源端。在投屏源端和投屏目的端之间建立了视频流数据通道后,投屏源端和投屏目的端可以通过视频流数据通道进行协商建立反向控制信道。投屏目的端可以通过反向控制通道向投屏源端发送点击信号等操作命令。
在一些实施例中,在步骤204之前,投屏源端可以获取针对所述第一编辑框的点击信号,并根据该点击信号使得第一编辑框获得输入焦点,并向投屏目的端发送输入启动信息。
在这些实施例中,用户可以在投屏源端,点击第一编辑框,以使投屏源端产生点击信号。投屏源端根据该点击信号使得第一编辑框获得输入焦点以进入输入状态,进而向投屏目的端发送输入启动信息。
其次,在步骤206中,投屏目的端响应于所述输入启动信息,获取所述投屏目的端的输入设备输入的文本内容或图像,以得到待显示内容。
用户可以操控投屏目的端的输入设备,向投屏目的端输入文本内容或图像。容易理解,不同的电子设备具有输入设备,例如个人电脑,其常规输入设备为键盘。再例如手机,其输入设备为其触摸屏。另外,麦克风、扫描仪、摄像头等也可以作为电子设备的输入设备。
对于作为投屏目的端的电子设备而言,其输入设备可以为自带的,也可以为外接的。例如投屏目的端为笔记本电脑时,其输入设备可以为其自带的键盘、摄像头、麦克风等,也可以为外接的键盘、摄像头、麦克风等。
在一些实施例中,所述输入设备为键盘。用户可以通过敲击或按压键盘上的字符按键,生成相应的操作命令,使得投屏目的端可根据操作命令生成文本内容,以得到待显示内容。
在一些实施例中,所述输入设备为麦克风。用户可以通过麦克风向投屏目的端输入语音。投屏目的端将用户输入的语音转换成文本,得到待显示内容。
在一些实施例中,所述输入设备为摄像头。用户可以通过摄像头拍摄图片。投屏目的端可以从摄像头拍摄的图片中提取文本,得到待显示内容,具体可以采用光学字符识别(optical character recognition,OCR)技术进行文本提取;投屏目的端也可以从摄像头拍摄的图片中提取图像,并作为待显示内容。
在一些实施例中,所述输入设备为扫描仪。用户可以通过扫描仪输入扫描图片。投屏目的端可以从扫描图片中提取文本,得到待显示内容,具体可以采用OCR技术进行文本提取;投屏目的端也可以从扫描图片中提取图像,得到待显示内容。
在一些实施例中,所述输入设备为手写板。用于向投屏源端发送的待显示内容为通过所述手写板输入的文本或图像。
在一些实施例中,所述输入设备为手写笔。用于向投屏源端发送的待显示内容为通过所述手写笔输入的文本或图像。
在一些实施例中,步骤206包括:投屏目的端响应于所述输入启动信息,将所述投屏目的端的第二编辑框设置为输入状态;监听所述第二编辑框的内容变更事件;响应于所述内容变更事件,获取触发所述内容变更事件的文本内容或图像,以获取的文本内容或图像作为待显示内容。
第二编辑框为投屏目的端的编辑框。在接收到输入启动信息时,投屏目的端响应于输入启动信息,使第二编辑框获取投屏目的端的输入焦点,并进入输入状态。投屏目的端可以监听处于输入状态的第二编辑框中的内容变更事件。当向第二编辑框输入文本内容或图像后,输入的文本内容或图像触发第二编辑框的内容变更事件。投屏目的端响应于内容变更事件,获取触发该内容变更事件的文本内容或图像,将该文本内容或图像作为待显示内容。
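第二编辑框的内容变更事件监听,以及待显示内容发送后的删除(见下文相关实现方式),可以用如下假设性的Python示意表示(类名与回调名均为说明而设):

```python
class SecondEditBox:
    """模拟投屏目的端的第二编辑框:输入触发内容变更事件。"""
    def __init__(self, on_content_changed):
        self.text = ""
        self.on_content_changed = on_content_changed  # 内容变更事件的回调

    def input(self, content):
        self.text += content
        self.on_content_changed(content)  # 输入文本或图像时触发内容变更事件


sent = []  # 模拟已发送给投屏源端的待显示内容

def content_changed(content):
    # 响应内容变更事件:取出触发事件的内容,作为待显示内容发送给源端
    sent.append(content)

box = SecondEditBox(content_changed)
box.input("战狼")
box.text = ""  # 发送后在第二编辑框中删除该内容,避免遮挡镜像
```

该示意中,只有真正提交到第二编辑框的内容才会触发事件并被回传,与正文中“候选词未上屏则不触发内容变更事件”的描述一致。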
在一些实施例中,第二编辑框可以为投屏目的端的第二输入控件的编辑框,例如当第二编辑框为文本框时,该文本框具体为投屏目的端的第二文本框(TextBox)控件的文本框;当第二编辑框为图片框时,该图片框具体为投屏目的端的第二图片框(PictureBox)控件的图片框。
投屏目的端可以将从语音转换的文本、从图片提取的文本或图像、通过手写板或手写笔输入的文本或图像输入至第二编辑框,并触发内容变更事件。
在这些实施例的第一个示例中,所述待显示内容为所述投屏目的端的第一输入法根据所述输入设备产生的操作命令而生成的。
第一输入法为投屏目的端的输入法,具体可以为投屏目的端系统自带的输入法,也可以为安装的第三方输入法。
在一个例子中,当投屏目的端连接有实体键盘时,在第二编辑框进入输入状态后,用户可以通过投屏目的端的实体键盘启动投屏目的端的输入法。容易理解,当输入设备为实体键盘时,对于英文和阿拉伯数字,可以通过实体键盘直接向第二编辑框输入内容,此时不需要输入法,因此无需启动输入法。但对于英文之外的语言文字,需要实体键盘结合投屏目的端的输入法进行输入,此时,用户可以通过操作投屏目的端的实体键盘(例如,同时按压Ctrl键和空格键)启动投屏目的端的输入法。用户可以敲击实体键盘的按键,产生操作命令,进而使得第一输入法根据操作命令生成文本内容。当输入英文之外的语言文字时,第一输入法还可以根据操作命令在候选词提示框生成相应的候选词,以及根据用户的选择操作对应的操作命令从候选词中确定文本内容,并输入至第二编辑框,从而触发第二编辑框的内容变更事件。
在一个例子中,投屏目的端响应于输入启动信息,可以启动第一输入法。当投屏目的端连接有实体键盘时,可以仅显示第一输入法的状态栏。用户可以敲击实体键盘的按键,产生操作命令,进而使得第一输入法根据操作命令生成文本内容。当输入英文之外的语言文字时,第一输入法还可以根据操作命令在候选词提示框生成相应的候选词,以及根据用户的选择操作对应的操作命令从候选词中确定文本内容,并输入至第二编辑框,从而触发第二编辑框的内容变更事件。
当投屏目的端为含有触摸屏的设备时,可以显示第一输入法的虚拟键盘。用户可以点击或触摸虚拟键盘的按键,产生操作命令,进而使得第一输入法根据操作命令生成文本内容。当输入英文之外的语言文字时,第一输入法还可以根据操作命令在候选词提示框生成相应的候选词,以及根据用户的选择操作对应的操作命令从候选词中确定文本内容,并输入至第二编辑框,从而触发第二编辑框的内容变更事件。
在这些实施例的第二示例中,所述第二编辑框为隐藏编辑框或透明编辑框。
将第二编辑框设置为隐藏编辑框或透明编辑框,不会遮挡投屏源端的屏显内容的镜像,从而可以改善用户的视觉体验。
在该示例的一个例子中,当第二编辑框为隐藏编辑框时,其光标也为隐藏光标。当第二编辑框为透明编辑框时,其光标也为透明光标。从而避免了第二编辑框的光标对用户视觉体验的影响,进一步改善用户的视觉体验。
在第二示例的一个例子中,所述输入启动信息包括所述第一编辑框的第一光标位置;本申请实施例提供的输入方法还包括:根据所述第一光标位置,设置所述第二编辑框在所述投屏窗口上的位置。
第一编辑框获得投屏源端的输入焦点后,投屏源端可以获取第一编辑框的光标位置,并将该光标位置携带在输入启动信息中,发送至投屏目的端。在一个例子中,第一编辑框的开始输入事件可以启动投屏源端的输入法,投屏源端的输入法可以监听第一编辑框的光标位置,并将该光标位置携带在输入启动信息中,发送至投屏目的端。
投屏目的端根据该光标位置设置第二编辑框的位置。可以将第二编辑框设置在第一光标镜像的附近。如图3a所示,将第二编辑框的左侧边框叠加到第一编辑框的光标镜像上。通过前述方案,在向第二编辑框输入文本内容时,可在第一光标镜像的附近显示候选词提示框,改善用户输入体验。具体参考图3b,以输入内容为汉字“战狼”为例,假设用户依次敲击实体键盘的“Z”、“L”键,第一输入法在候选词提示框生成相应的候选词,候选词提示框显示在第一编辑框的光标的镜像附近。
需要说明的是,每当监听到第二编辑框的内容变更事件时,获取触发该内容变更事件的文本内容,并作为用于向投屏源端发送的文本内容。当文本内容为英文字母或数字时,只要向第二编辑框输入了一个字母或数字,就可触发内容变更事件,获取触发该内容变更事件的字母或数字。当输入内容为英文之外的语言文字(例如汉字)时,当从候选词中确定了文本内容并输入至第二编辑框时,才触发内容变更事件。具体的,参考图3b和图3c,当第一输入法仅在候选词提示框生成相应的候选词时,这时还未向第二编辑框输入“战狼”,所以没有触发第二编辑框的内容变更事件。当通过敲击空格键等向第二编辑框输入了“战狼”,则触发第二编辑框的内容变更事件,进而投屏目的端可以获取“战狼”这一输入内容。
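根据第一编辑框的光标位置设置第二编辑框位置的坐标换算,可以用如下假设性的示意表示(其中scale、window_origin为假设的缩放比例与投屏窗口偏移参数,非本申请限定的实现):

```python
def map_cursor_to_window(cursor_xy, scale, window_origin):
    """把源端屏幕坐标系下的光标位置映射到目的端投屏窗口坐标系:
    假设投屏窗口相对源端屏幕按比例缩放,并平移到窗口原点处。"""
    cx, cy = cursor_xy
    ox, oy = window_origin
    return (ox + cx * scale, oy + cy * scale)


def place_second_edit_box(cursor_xy, scale, window_origin):
    # 将第二编辑框的左侧边框叠加到第一编辑框光标的镜像位置上,
    # 使候选词提示框可以在光标镜像附近显示
    return map_cursor_to_window(cursor_xy, scale, window_origin)
```

例如,源端光标在(100, 200)、投屏窗口按0.5倍缩放且原点位于目的端(10, 20)时,第二编辑框应放置在(60.0, 120.0)处。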
在这些实施例的第三示例中,所述第二编辑框设置在所述投屏目的端的显示界面中除投屏窗口之外的界面上。
在该示例中,参考图4,投屏窗口可以为投屏目的端的显示界面的中一部分。投屏目的端将第二编辑框设置在投屏窗口之外的部分,从而使第二编辑框不遮挡投屏源端屏显内容的镜像,提高了用户的视觉体验。
然后,在步骤208中,向所述投屏源端发送所述待显示内容。
投屏目的端可以将获取的待显示内容发送至投屏源端,具体可以通过视频流数据通道传输该待显示内容。投屏源端在接收到待显示内容后,将待显示内容提交至第一编辑框中,使得待显示内容可以显示在第一编辑框中,从而更新了投屏源端的屏显内容。
之后,在步骤210中,投屏目的端更新所述投屏窗口,更新后的投屏窗口为所述投屏源端更新后的屏显内容的镜像,在所述更新后的屏显内容中的所述第一编辑框中显示有所述待显示内容。
容易理解,在投屏源端向投屏目的端投射投屏源端的屏显内容的场景中,投屏目的端的投屏窗口为投屏源端屏显内容的镜像,投屏源端的屏显内容更新了,投屏窗口也进行相应更新,更新后的投屏窗口为更新后的屏显内容的镜像。更新后的屏显内容中的第一编辑框中显示有所述待显示内容,相应的,更新后的投屏窗口中第一编辑框的镜像中也有所述待显示内容的镜像。
更具体的,在一个例子中,投屏源端实时将其屏显内容的数据通过视频流数据通道传输至投屏目的端,投屏目的端根据其接收到的投屏源端屏显内容的数据显示投屏源端屏显内容的镜像。因此,投屏源端可以将在第一编辑框显示了待显示内容后对应的屏显内容的数据发送给投屏目的端,投屏目的端根据该数据显示的投屏源端屏显内容的镜像中具有待显示内容的镜像,该待显示内容的镜像位于第一编辑框的镜像中。
在一些实施例中,在步骤208之后,本申请提供的输入方法还包括:在所述第二编辑框中删除所述文本内容或图像。参考图3d,以待显示内容为“战狼”为例,在将“战狼”作为待显示内容发送给投屏源端后,投屏目的端将第二编辑框中的“战狼”删除,从而使得第二编辑框中的“战狼”,不会遮挡第一编辑框镜像中的“战狼”镜像。
在一些实施例中,投屏源端可以监听第一编辑框中的光标位置。在一个例子中,具体可以由投屏源端的输入法监听第一编辑框中的光标位置。投屏源端的输入法可以由第一编辑框的输入事件启动。容易理解,当向第一编辑框输入内容后,会导致第一编辑框的光标位置变化,投屏源端可以监听到第一编辑框的光标位置的变化,并向投屏目的端发送第一编辑框变化后的光标位置,具体可以通过视频流数据通道传输第一编辑框变化后的光标位置。投屏目的端接收到第一编辑框变化后的光标位置后,可以根据第一编辑框变化后的光标位置重新设置第二编辑框的位置,从而可以在使用投屏目的端的输入设备输入时,使投屏目的端的输入法的候选词提示框在第一编辑框的镜像附近显示,改善用户的输入体验。
在一些实施例中,当投屏目的端的输入设备为键盘时,本申请实施例提供的输入方法还包括:投屏目的端获取所述键盘的非字符按键对应的操作命令;将所述非字符按键对应的操作命令发送至所述投屏源端。具体可以通过反向控制通道传输所述非字符按键对应的操作命令。
投屏源端接收到所述操作命令后,根据所述操作命令对其屏显内容进行控制。在一个例子中,当所述操作命令为用于编辑文本内容或图像的操作命令时,投屏源端根据该操作命令对所述第一编辑框中显示的文本内容或图像进行编辑。
需要说明的是,在本申请实施例中,键盘上的按键可以分为字符按键和非字符按键;其中,当用户敲击或按压字符按键时,产生字符按键对应的操作命令,设备可以根据字符按键对应的操作命令生成文本内容。字符按键具体可以包括数字按键(0-9)、字母按键(a-z)、标点符号按键(例如“,”“。”“!”“?”等)、特殊字符按键(例如#、¥、%、*、\等)。
而非字符按键是指键盘上字符按键之外的按键。具体为键盘的控制(Ctrl)、上档(Shift)、换档(Alt)、大写锁定(Caps Lock)、插入(Insert)、起始(Home)、结束(End)、删除(Del)、上页(PgUp)、下页(PgDn)、回车(Enter)、回格(BackSpace)以及方向键等非字符按键。用户通过敲击或按压键盘的非字符按键的操作,可生成相应的操作命令。操作命令可以通过反向控制信道发送至投屏源端。使得投屏源端根据操作命令在第一编辑框内进行光标移位、大小写切换、插入、删除、换行、发送等动作。
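字符按键与非字符按键的分流,以及源端按操作命令编辑第一编辑框内容的过程,可以用如下假设性的示意表示(按键名、通道名均为说明而设,且只示意删除与换行两种命令):

```python
# 非字符按键集合(示意,未穷举方向键等)
NON_CHARACTER_KEYS = {"Ctrl", "Shift", "Alt", "CapsLock", "Insert", "Home",
                      "End", "Del", "PgUp", "PgDn", "Enter", "BackSpace"}


def route_key(key):
    # 目的端:非字符按键的操作命令走反向控制通道发给源端,
    # 字符按键则交给目的端本地输入法生成文本内容
    return "reverse_control" if key in NON_CHARACTER_KEYS else "local_ime"


def apply_command_on_source(edit_box_text, key):
    """源端根据非字符按键对应的操作命令编辑第一编辑框中的内容(简化:
    此处不区分光标位置,删除均作用于末尾字符)。"""
    if key in ("Del", "BackSpace"):
        return edit_box_text[:-1]      # 删除
    if key == "Enter":
        return edit_box_text + "\n"    # 换行
    return edit_box_text               # 其余命令此处从略
```

例如,目的端键盘的删除键命中反向控制通道,源端据此删除第一编辑框末尾的一个字符。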
在一些实施例中,用户可以点击投屏目的端的投屏窗口中第一编辑框之外的界面,产生点击信号,该点击信号包括点击类型(具体可以为单击)和点击位置(具体可以为坐标)。投屏目的端可以将该点击信号发送给投屏源端,具体可以通过反向控制通道传输该点击信号。投屏源端解析该点击位置和点击类型,使第一编辑框失去输入焦点、退出输入状态。
在这些实施例的一个示例中,在第一编辑框失去焦点、退出输入状态时,投屏源端可以向投屏目的端发送输入退出信息,具体可以通过视频流数据通道传输所述输入退出信息。投屏目的端接收到输入退出信息后,响应于输入退出信息,使第二编辑框失去输入焦点、退出输入状态。
在本申请实施例中,在投屏源端将其屏显内容投射到投屏目的端之后,用户可以通过操控投屏目的端的输入设备,在投屏目的端本地生成文本内容或图像;并将其生成的文本内容或图像作为待显示内容发送至投屏源端;投屏源端在接收到投屏目的端发送的文本内容或图像后,将该待显示内容提交至其编辑框中显示,使得投屏目的端的本地输入可以无差别地同步到投屏源端,提高了用户输入体验。以投屏目的端为电脑、投屏源端为手机为例,通过本申请实施例提供的输入方法,当手机的屏显内容投射到电脑上时,用户可以借助电脑的键盘向手机输入内容,打破了手机和电脑之间的数据和服务几乎相互独立的局面,使得手机和电脑可以快捷互通,使得用户可以通过电脑的键盘和电脑的输入法来完成手机端的文字处理,在办公等场景下可以极大地提高用户处理手机上信息的效率。
接下来,在一个实施例中,对本申请实施例提供的跨设备输入方法进行举例介绍。
本实施例提供的输入方法应用的架构如图5所示,包括投屏目的端和投屏源端,投屏目的端和投屏源端可以通过近距离无线通信技术连接或数据线连接。投屏目的端的输入设备可以包括键盘,还可以包括鼠标或触摸屏。投屏目的端安装有输入法和投屏客户端。投屏目的端的输入法可根据键盘或者触摸屏产生的操作命令生成文本内容或图像,作为待显示内容。投屏目的端可以通过投屏客户端,接收投屏源端的屏显内容的视频流数据,并显示投屏源端的屏显内容的镜像。具体的,投屏客户端可以包括第一输入管理模块、第一视频流数据模块、第一反向控制模块。第一输入管理模块包括第二编辑框,第二编辑框可以为文本框,该文本框为隐藏文本框,其边框不在投屏目的端的屏幕上显示。第一输入管理模块还包括输入监听子模块和第一输入事件管理子模块。
需要说明的是,上文以第一输入管理模块、第一视频流数据模块、第一反向控制模块集成到投屏客户端为例,对这些模块进行了说明,并不构成对这些模块的存在方式构成限制。
投屏源端安装有应用。应用包括第一编辑框,第一编辑框可以为文本框。第一编辑框显示在投屏源端的屏幕上,相应的,第一编辑框的镜像显示在投屏目的端的屏幕上。投屏源端包括第二输入管理模块、第二视频流数据模块、第二反向控制模块。第二输入管理模块包括输入法服务子模块和第二输入事件管理子模块。输入法服务子模块可以为投屏源端安装的输入法的模块,也可以为投屏源端系统自带输入法的模块。
第一视频流数据模块和第二视频流数据模块通过视频流数据通道进行通信。第一反向控制模块和第二反向控制模块通过反向控制通道进行通信。
用户通过投屏目的端和投屏源端使用投屏服务时,可以利用鼠标或触摸屏点击投屏目的端屏幕上的第一编辑框的镜像,生成第一点击信号,第一点击信号包括点击类型和点击坐标。投屏目的端使用第一反向控制模块通过反向控制通道将第一点击信号发送给投屏源端。
投屏源端使用第二反向控制模块接收到第一点击信号后,将第一点击信号分发到应用,应用解析第一点击信号,使得第一编辑框获得输入焦点并向输入法服务子模块请求启动输入法。输入法服务子模块响应启动输入法的请求,开启对第一编辑框的光标位置的监听,并通过第二输入事件管理子模块使用第二反向控制模块通过反向控制通道将第一编辑框的光标位置包括在输入启动信息中发送给投屏目的端。输入启动信息用于指示第一编辑框获得了输入焦点。
投屏目的端通过第一反向控制模块接收到包括第一编辑框的光标位置的输入启动信息后,第一输入事件管理子模块响应开始输入事件,将第二编辑框设置为输入状态,并启动投屏目的端的输入法;以及根据第一编辑框的光标位置设置第二编辑框的位置。具体的,可将第二编辑框的左侧边框和第一编辑框的光标镜像的位置重叠。
以输入中文为例,用户敲击投屏目的端的键盘,产生操作命令。投屏目的端的输入法根据操作命令在第二编辑框左侧,即第一编辑框光标镜像附近显示候选词提示框。用户可以敲击键盘上的空格按键,将候选词提示框中的候选词确定为输入的文本内容,并输入至第二编辑框中。文本内容输入至第二编辑框中时,触发第二编辑框的内容变更事件。输入监听子模块监听到内容变更事件时,通知第一输入事件管理子模块将第二编辑框中的文本内容取出,并作为待显示内容通过第一视频流数据模块发送给投屏源端,同时将文本内容从第二编辑框中删除。
投屏源端通过第二视频流数据模块接收到待显示内容。第二输入事件管理子模块通过输入法服务子模块将待显示内容提交至第一编辑框中进行显示,相应的,投屏目的端的屏幕上显示待显示内容镜像。将待显示内容提交至第一编辑框后,导致第一编辑框的光标位置发生变化,输入法服务子模块监听到第一编辑框的光标位置的变化,并将变化后的光标位置通过第二视频流数据模块发送给投屏目的端。
第一视频流数据模块接收到变化后的光标位置后,第一输入管理模块根据变化后的光标位置更新第二编辑框的位置。
用户敲击投屏目的端的键盘的非字符按键时,产生非字符按键对应的操作命令。投屏目的端将该操作命令通过第一反向控制模块发送给投屏源端。投屏源端的第二反向控制模块接收到该操作命令,并分发给应用,使用应用根据该操作命令进行相关操作。例如,用户敲击投屏目的端的键盘的删除按键时,应用可以根据该删除按键对第一编辑框中的文本内容进行删除操作。
当用户使用鼠标或触摸屏点击第一编辑框镜像以外的位置时,生成第二点击信号,其中包括点击类型和点击位置。投屏目的端将第二点击信号通过第一反向控制模块发送给投屏源端。第二反向控制模块接收到第二点击信号,并将第二点击信号分发给应用,以使应用使第一编辑框失去输入焦点,退出输入状态。
第一编辑框失去输入焦点触发结束输入信息。结束输入信息用于向输入法服务子模块请求关闭输入法。输入法服务子模块响应关闭输入法的请求,通过第二视频流数据模块向投屏目的端发送结束输入信息。
第一视频流数据模块接收到结束输入信息。第一输入事件管理子模块响应结束输入信息,使第二编辑框失去焦点,退出输入状态,并关闭输入法。
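上述输入状态的建立与退出的生命周期,可以用如下假设性的示意表示(类名与方法名为说明而设,非本申请的实现):

```python
class InputSession:
    """模拟图5流程中两端输入状态的同步建立与退出。"""
    def __init__(self):
        self.first_box_focused = False   # 源端第一编辑框是否获得输入焦点
        self.second_box_active = False   # 目的端第二编辑框是否处于输入状态
        self.ime_started = False         # 目的端输入法是否启动

    def click_on_mirror(self):
        # 第一点击信号:第一编辑框获得焦点 -> 输入启动信息 ->
        # 第二编辑框进入输入状态并启动目的端输入法
        self.first_box_focused = True
        self.second_box_active = True
        self.ime_started = True

    def click_outside_mirror(self):
        # 第二点击信号:第一编辑框失去焦点 -> 结束输入信息 ->
        # 第二编辑框退出输入状态并关闭目的端输入法
        self.first_box_focused = False
        self.second_box_active = False
        self.ime_started = False
```

点击第一编辑框的镜像即建立输入会话,点击镜像以外位置即结束输入会话,两端状态保持一致。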
在本申请实施例中,在投屏源端将其屏显内容投射到投屏目的端之后,用户可以通过操控投屏目的端的输入设备,在投屏目的端本地生成文本内容或图像;并将其生成的文本内容或图像作为待显示内容发送至投屏源端;投屏源端在接收到投屏目的端发送的待显示内容后,将该待显示内容提交至其编辑框中显示,使得投屏目的端的本地输入可以无差别地同步到投屏源端,提高了用户输入体验。以投屏目的端为电脑、投屏源端为手机为例,通过本申请实施例提供的输入方法,当手机的屏显内容投射到电脑上时,用户可以借助电脑的键盘向手机输入内容,打破了手机和电脑之间的数据和服务几乎相互独立的局面,使得手机和电脑可以快捷互通,使得用户可以通过电脑的键盘和电脑的输入法来完成手机端的文字处理,在办公等场景下可以极大地提高用户处理手机上信息的效率。
本申请实施例提供了一种输入方法,该方法执行主体为投屏源端。参阅图6,该方法包括:步骤602、投屏源端将其屏显内容投射到投屏目的端,以使所述投屏目的端在至少一部分显示界面上显示投屏窗口,所述投屏窗口为所述屏显内容的镜像,所述屏显内容包括第一编辑框;步骤604、所述投屏源端向所述投屏目的端发送输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点;步骤606、所述投屏源端从所述投屏目的端接收待显示内容,所述待显示内容为所述投屏目的端响应于所述输入启动信息而获取的,所述待显示内容为所述投屏目的端的输入设备输入的文本内容或图像;步骤608、所述投屏源端在所述第一编辑框显示所述待显示内容,以更新所述屏显内容。
在一些实施例中,所述输入启动信息用于设置所述投屏目的端的第二编辑框进入输入状态,所述待显示内容为触发所述第二编辑框的内容变更事件的文本内容或图像。
在这些实施例的一个示例中,所述待显示内容为所述投屏目的端的第一输入法根据所述输入设备产生的操作命令而生成的。
在这些实施例的一个示例中,所述第二编辑框为隐藏编辑框或透明编辑框。
在该示例的一个例子中,所述输入启动信息包括所述第一编辑框的第一光标位置;所述第一光标位置用于设置所述第二编辑框在所述投屏窗口上的位置。
在该例子的一种具体的可能实现方式中,在所述投屏源端向所述投屏目的端发送输入启动信息之前,所述方法还包括:所述投屏源端从所述投屏目的端接收针对第一编辑框的镜像的点击信号,所述点击信号包括点击类型和点击位置;根据所述点击信号启动所述投屏源端的第二输入法,以便通过所述第二输入法监听所述第一编辑框的光标位置。
这些实施例的一个示例中,所述第二编辑框设置在所述投屏目的端的显示界面中除投屏窗口之外的界面上。
在一些实施例中,在所述投屏源端向所述投屏目的端发送输入启动信息之前,所述方法还包括:所述投屏源端从所述投屏目的端获取针对第一编辑框的镜像的点击信号,所述点击信号包括点击类型和点击位置;所述投屏源端根据所述点击信号使所述第一编辑框获得输入焦点,并向所述投屏目的端发送所述输入启动信息。
在一些实施例中,所述输入设备为以下任一种:
键盘、麦克风、摄像头、扫描仪、手写板、手写笔;其中,
当所述输入设备为键盘时,所述待显示内容为根据所述键盘上的字符按键对应的操作命令生成的文本;
当所述输入设备为麦克风时,所述待显示内容为所述麦克风输入语音的转换文本;
当所述输入设备为摄像头时,所述待显示内容为从所述摄像头拍摄的图片中提取的文本或图像;
当所述输入设备为扫描仪时,所述待显示内容为从所述扫描仪输入图片中提取的文本或图像;
当所述输入设备为手写板或手写笔时,所述待显示内容为通过所述手写板或手写笔输入的文本或图像。
在一些实施例中,所述输入设备为键盘,所述方法还包括:
所述投屏源端从所述投屏目的端接收所述键盘的非字符按键对应的操作命令;
所述投屏源端根据所述操作命令对所述第一编辑框中的内容进行编辑。
图6所示的各步骤具体可以参考图2所示的方法实施例实现,在此不再赘述。
在本申请实施例中,在投屏源端将其屏显内容投射到投屏目的端之后,用户可以通过操控投屏目的端的输入设备,在投屏目的端本地生成文本内容或图像;并将其生成的文本内容或图像作为待显示内容发送至投屏源端;投屏源端在接收到投屏目的端发送的文本内容或图像后,将该待显示内容提交至其编辑框中显示,使得投屏目的端的本地输入可以无差别地同步到投屏源端,提高了用户输入体验。以投屏目的端为电脑、投屏源端为手机为例,通过本申请实施例提供的输入方法,当手机的屏显内容投射到电脑上时,用户可以借助电脑的键盘向手机输入内容,打破了手机和电脑之间的数据和服务几乎是相互独立的孤岛这一局面,使得手机和电脑可以快捷互通,使得用户可以通过电脑的键盘和电脑的输入法来完成手机端的文字处理,在办公等场景下可以极大地提高用户处理手机上信息的效率。
在一个实施例中,提供了本申请实施例所述的输入方法可应用的一种应用架构。参考图7,该应用架构包括手机和个人电脑(PC),其中,手机和PC通过网络连接。手机具有投屏服务端模块、应用、安卓开放源代码项目(android open-source project,AOSP)架构、AOSP原生方法模块。投屏服务端模块具有投屏输入服务模块、输入法同步模块、反向控制模块。应用包括应用输入框。AOSP架构具有按键事件模块(keyevent)、输入管理服务模块(inputmanager service)、输入法管理服务模块(inputmethod managerservice)。AOSP原生方法模块包括输入分发服务模块(input dispatcher)、输入管理服务模块(inputmanager service)。
PC作为投屏客户端,具有PC输入法模块、系统、PC输入框、输入管理模块、反向控制模块、输入法同步模块、窗口同步模块。PC的输入管理模块包括输入监听模块、状态控制模块、光标跟随模块。
手机可以将其屏显内容投射到PC屏幕上,并进行窗口同步,即显示内容同步。在应用输入框获得输入焦点,进入输入状态后,AOSP架构的输入管理服务模块可以获取应用输入框的光标位置,并发送给手机的投屏输入服务模块。AOSP架构的输入法管理服务模块可以获取应用输入框的输入状态,并发送给手机的投屏输入服务模块。手机的投屏输入服务模块可以将应用输入框的输入状态、光标位置发送给手机的输入法同步模块。手机的输入法同步模块可以将应用输入框的输入状态和光标位置发送给PC的输入法同步模块。
PC的输入法同步模块可以将应用输入框的输入状态发送给PC的状态控制模块,状态控制模块根据应用输入框的输入状态设置PC输入框的输入状态。PC的输入法同步模块可以将应用输入框的光标位置发送给PC的光标跟随模块,光标跟随模块根据该光标位置设置PC输入框的位置。
在PC输入框进入输入状态后,用户可通过PC输入法向PC输入框输入内容。PC输入框将输入内容发送给PC的输入监听模块。PC的输入监听模块可以将输入内容通过PC的输入法同步模块发送给手机的输入法同步模块。手机的输入法同步模块将输入内容发送给手机的投屏输入服务模块。手机的投屏输入服务模块将输入内容提交至应用输入框进行显示。
PC的输入监听模块还可以监听系统的输入事件,并通过反向控制模块将输入事件发送给手机。AOSP原生方法模块从手机的反向控制模块获取输入事件,并发送给输入分发服务模块。输入分发服务模块可以将输入事件分发给应用,以使应用可根据输入事件对应用输入框中的内容进行管理。
在一个实施例中,提供了一种输入方法,参考图8。该方法包括以下内容。
在手机将其屏显内容投射到电脑(PC),以使PC显示手机屏显内容的情况下,用户可以操作PC的鼠标,向PC输入框输入鼠标点击事件;用户也可以操作PC键盘,向PC输入框输入PC键盘输入事件。PC输入框进行事件监听,并组装发送点击/输入事件。投屏窗口模块将其从PC输入框收到的点击/输入事件发送给投屏客户端模块。投屏客户端模块通过网络传输将该点击/输入事件发送给手机的网络模块。网络模块进行网络监听,并进行事件接收、解析。之后将解析后的点击/输入事件发送到投屏服务端模块。投屏服务端模块将点击/输入事件注入到输入管理服务模块(inputmanager service)。输入管理服务模块可以将点击/输入事件分发到输入分发模块(input dispatcher)。输入分发模块将点击/输入事件分发到手机输入框(edittext)。
在一个实施例中,提供了一种输入方法,参考图9。该方法包括以下内容。
在手机将其屏显内容投射到PC,以使PC显示手机屏显内容的情况下,当手机的输入框(edittext)获得输入焦点时,可以激活手机输入法。手机的输入管理服务模块(inputmanager service)将输入框进入输入事件(即进入输入状态的事件)发送给投屏输入法服务模块。投屏输入法服务模块将进入输入事件发送给投屏服务端模块。投屏服务端模块将进入输入事件发送给PC的网络模块。网络模块将其接收到的进入输入事件,发送给投屏客户端模块。投屏客户端模块处理进入输入事件,并发送给投屏窗口模块,使得投屏窗口模块激活PC输入框,进而拉起PC输入法。从而使得用户可以在PC端通过PC键盘进行输入。
用户可以通过PC输入法向PC输入框输入文本。投屏窗口模块获取PC输入框中的输入文本。投屏窗口模块依次通过投屏客户端模块、网络模块向投屏服务端模块发送文本输入事件,其中包括输入文本。投屏服务端模块处理文本输入事件,得到其中的输入文本,之后向输入法管理服务模块提交输入文本,并提交至手机的输入框中,从而可以在手机框中显示用户在PC端进行的输入。
手机的输入框将光标位置变化发送给投屏输入法服务模块,投屏输入法服务模块向投屏服务端模块发送光标位置。投屏服务端模块向网络模块发送光标位置。网络模块向投屏客户端模块发送光标位置。投屏客户端模块向投屏窗口模块发送光标位置。投屏窗口模块向PC输入框发送光标位置,从而PC输入框可以根据手机输入框的光标位置调整其自身的位置。
当手机的输入框失去输入焦点时,可以去激活手机输入法。手机的输入管理服务模块(inputmanager service)将输入框退出输入事件(即退出输入状态的事件)发送给投屏输入法服务模块。投屏输入法服务模块将退出输入事件发送给投屏服务端模块。投屏服务端模块将退出输入事件发送给PC的网络模块。网络模块将其接收到的退出输入事件发送给投屏客户端模块。投屏客户端模块处理退出输入事件,并发送给投屏窗口模块,使得投屏窗口模块去激活PC输入框,进而隐藏PC输入法。
本申请实施例提供了一种装置1000,参阅图10,装置1000包括:
显示单元1010,用于在至少一部分显示界面上显示投屏窗口,所述投屏窗口为投屏源端的屏显内容的镜像,所述屏显内容包括第一编辑框;
接收单元1020,用于从所述投屏源端接收输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点;
获取单元1030,用于响应于所述输入启动信息,获取所述装置的输入设备输入的文本内容或图像,以得到待显示内容;
发送单元1040,用于向所述投屏源端发送所述待显示内容;
更新单元1050,用于更新所述投屏窗口,更新后的投屏窗口为所述投屏源端更新后的屏显内容的镜像,在所述更新后的屏显内容中的所述第一编辑框中显示有所述待显示内容。
在一些实施例中,所述获取单元1030包括设置子单元1031、监听子单元1032、获取子单元1033;所述设置子单元1031用于响应于所述输入启动信息,将所述投屏目的端的第二编辑框设置为输入状态;所述监听子单元1032用于监听所述第二编辑框的内容变更事件;所述获取子单元1033用于响应于所述内容变更事件,获取触发所述内容变更事件的文本内容或图像,以作为所述待显示内容。
在一些实施例中,所述装置还包括启动单元1060,用于响应于所述输入启动信息,启动所述第一输入法。
在一些实施例中,所述装置还包括删除单元1070,用于在所述发送单元向所述投屏源端发送所述待显示内容之后,在所述第二编辑框中删除所述文本内容或图像。
装置1000的各功能单元的功能可以参照图2所示的方法实施例实现,此处不再赘述。
根据本申请实施例提供的装置,在投屏源端将其屏显内容投射到投屏目的端之后,用户可以通过操控投屏目的端的输入设备,在投屏目的端本地生成文本内容或图像;并将其生成的文本内容或图像作为待显示内容发送至投屏源端;投屏源端在接收到投屏目的端发送的文本内容或图像后,将该待显示内容提交至其编辑框中显示,使得投屏目的端的本地输入可以无差别的同步到投屏源端,提高了用户输入体验。
本申请实施例提供了一种装置1100,参阅图11,装置1100包括:
投射单元1110,用于将其屏显内容投射到投屏目的端,以使所述投屏目的端在至少一部分显示界面上显示投屏窗口,所述投屏窗口为所述屏显内容的镜像,所述屏显内容包括第一编辑框;
发送单元1120,用于向所述投屏目的端发送输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点;
接收单元1130,用于从所述投屏目的端接收待显示内容,所述待显示内容为所述投屏目的端响应于所述输入启动信息而获取的,所述待显示内容为所述投屏目的端的输入设备输入的文本内容或图像;
显示单元1140,用于在所述第一编辑框显示所述待显示内容,以更新所述屏显内容。
在一些实施例中,所述装置1100还包括启动单元1150;所述接收单元1130还用于在所述发送单元1120向所述投屏目的端发送输入启动信息之前,从所述投屏目的端接收针对第一编辑框的镜像的点击信号,所述点击信号包括点击类型和点击位置;所述启动单元1150用于根据所述点击信号启动所述投屏源端的第二输入法,以便通过所述第二输入法监听所述第一编辑框的光标位置。
在一些实施例中,所述装置1100还包括获取单元1160;所述接收单元1130还用于在所述发送单元1120向所述投屏目的端发送输入启动信息之前,从所述投屏目的端获取针对第一编辑框的镜像的点击信号,所述点击信号包括点击类型和点击位置;所述获取单元1160用于根据所述点击信号使所述第一编辑框获得输入焦点;所述发送单元1120用于向所述投屏目的端发送所述输入启动信息。
装置1100的各功能单元的功能可以参照图6所示的方法实施例实现,此处不再赘述。
根据本申请实施例提供的装置,在投屏源端将其屏显内容投射到投屏目的端之后,用户可以通过操控投屏目的端的输入设备,在投屏目的端本地生成文本内容或图像;并将其生成的文本内容或图像作为待显示内容发送至投屏源端;投屏源端在接收到投屏目的端发送的文本内容或图像后,将该待显示内容提交至其编辑框中显示,使得投屏目的端的本地输入可以无差别的同步到投屏源端,提高了用户输入体验。
上文主要从方法流程的角度对本申请实施例提供的装置进行了介绍。可以理解的是,各个电子设备为了实现上述功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以根据图2所示的方法实施例或图6所示的方法实施例对电子设备等进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
本申请实施例提供了一种投屏目的端,参阅图12,该投屏目的端包括处理器1210、存储器1220、收发器1230和显示屏1240。其中,所述存储器用于存储计算机执行指令;当所述投屏目的端运行时,所述处理器1210执行所述存储器1220存储的所述计算机执行指令,以使所述投屏目的端执行图2所示的方法。其中,显示屏1240,用于在至少一部分显示界面上显示投屏窗口,所述投屏窗口为投屏源端的屏显内容的镜像,所述屏显内容包括第一编辑框;收发器1230,用于从所述投屏源端接收输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点;处理器1210,用于响应于所述输入启动信息,获取所述投屏目的端的输入设备输入的文本内容或图像,以得到待显示内容;收发器1230,用于向所述投屏源端发送所述待显示内容;显示屏1240,用于更新所述投屏窗口,更新后的投屏窗口为所述投屏源端更新后的屏显内容的镜像,在所述更新后的屏显内容中的所述第一编辑框中显示有所述待显示内容。
在一些实施例中,该投屏目的端还包括通信总线1250,其中,处理器1210可通过通信总线1250与存储器1220、收发器1230和显示屏1240连接,从而可实现根据存储器1220存储的计算机执行指令,对收发器1230和显示屏1240进行相应控制。
本申请实施例的投屏目的端各个部件/器件的具体实施方式,可参照上文如图2所示的各方法实施例实现,此处不再赘述。
由此,在投屏源端将其屏显内容投射到投屏目的端之后,用户可以通过操控投屏目的端的输入设备,在投屏目的端本地生成文本内容或图像;并将其生成的文本内容或图像作为待显示内容发送至投屏源端;投屏源端在接收到投屏目的端发送的文本内容或图像后,将该待显示内容提交至其编辑框中显示,使得投屏目的端的本地输入可以无差别的同步到投屏源端,提高了用户输入体验。
本申请实施例提供了一种投屏源端,参阅图13,该投屏源端包括处理器1310、存储器1320、收发器1330和显示屏1340。其中,所述存储器用于存储计算机执行指令;当所述投屏源端运行时,所述处理器1310执行所述存储器1320存储的所述计算机执行指令,以使所述投屏源端执行图6所示的方法。其中,收发器1330,用于将显示屏1340的屏显内容投射到投屏目的端,以使所述投屏目的端在至少一部分显示界面上显示投屏窗口,所述投屏窗口为所述屏显内容的镜像,所述屏显内容包括第一编辑框;收发器1330,用于向所述投屏目的端发送输入启动信息,所述输入启动信息用于指示所述第一编辑框获得了输入焦点;收发器1330,用于从所述投屏目的端接收待显示内容,所述待显示内容为所述投屏目的端响应于所述输入启动信息而获取的,所述待显示内容为所述投屏目的端的输入设备输入的文本内容或图像;显示屏1340,用于在所述第一编辑框显示所述待显示内容,以更新所述屏显内容。
在一些实施例中,该投屏源端还包括通信总线1350,其中,处理器1310可通过通信总线1350与存储器1320、收发器1330和显示屏1340连接,从而可实现根据存储器1320存储的计算机执行指令,对收发器1330和显示屏1340进行相应控制。
本申请实施例的投屏源端各个部件/器件的具体实施方式,可参照上文如图6所示的各方法实施例实现,此处不再赘述。
由此,在投屏源端将其屏显内容投射到投屏目的端之后,用户可以通过操控投屏目的端的输入设备,在投屏目的端本地生成文本内容或图像;并将其生成的文本内容或图像作为待显示内容发送至投屏源端;投屏源端在接收到投屏目的端发送的文本内容或图像后,将该待显示内容提交至其编辑框中显示,使得投屏目的端的本地输入可以无差别的同步到投屏源端,提高了用户输入体验。
可以理解的是,本申请的实施例中的处理器可以是中央处理单元(central processing unit,CPU),还可以是其他通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现场可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、晶体管逻辑器件,硬件部件或者其任意组合。通用处理器可以是微处理器,也可以是任何常规的处理器。
The method steps in the embodiments of this application may be implemented by hardware, or by a processor executing software instructions. The software instructions may consist of corresponding software modules, and the software modules may be stored in a random access memory (RAM), a flash memory, a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically EPROM (EEPROM), a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art. An exemplary storage medium is coupled to the processor, so that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be an integral part of the processor. The processor and the storage medium may reside in an ASIC.
The foregoing embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When software is used for implementation, the embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of this application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted through the computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any usable medium accessible by a computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (SSD)), or the like.
It can be understood that the various numbers used in the embodiments of this application are merely for ease of description and distinction, and are not intended to limit the scope of the embodiments of this application.

Claims (19)

  1. An input method, comprising:
    displaying, by a projection destination, a projection window on at least a part of a display interface, wherein the projection window is a mirror of on-screen content of a projection source, and the on-screen content comprises a first edit box;
    receiving, by the projection destination, input start information from the projection source, wherein the input start information indicates that the first edit box has obtained input focus;
    obtaining, by the projection destination in response to the input start information, text content or an image input by an input device of the projection destination, to obtain to-be-displayed content;
    sending, by the projection destination, the to-be-displayed content to the projection source; and
    updating, by the projection destination, the projection window, wherein the updated projection window is a mirror of updated on-screen content of the projection source, and the to-be-displayed content is displayed in the first edit box in the updated on-screen content.
  2. The method according to claim 1, wherein the obtaining, by the projection destination in response to the input start information, text content or an image input by an input device of the projection destination comprises:
    setting, by the projection destination in response to the input start information, a second edit box of the projection destination to an input state;
    listening, by the projection destination, for a content change event of the second edit box; and
    obtaining, by the projection destination in response to the content change event, the text content or image that triggered the content change event, and using the obtained text content or image as the to-be-displayed content.
  3. The method according to claim 1 or 2, wherein the text content or image is generated by a first input method of the projection destination according to an operation command produced by the input device.
  4. The method according to any one of claims 1 to 3, further comprising: starting, by the projection destination, the first input method in response to the input start information.
  5. The method according to any one of claims 1 to 4, wherein the second edit box is a hidden edit box or a transparent edit box;
    the input start information comprises a first cursor position of the first edit box; and the method further comprises:
    setting, by the projection destination according to the first cursor position, a position of the second edit box on the projection window.
  6. The method according to any one of claims 1 to 5, wherein after the sending, by the projection destination, the to-be-displayed content to the projection source, the method further comprises:
    deleting, by the projection destination, the text content or image from the second edit box.
  7. The method according to any one of claims 1 to 6, wherein the second edit box is disposed on a part of the display interface of the projection destination other than the projection window.
  8. The method according to any one of claims 1 to 7, wherein before the receiving, by the projection destination, input start information from the projection source, the method further comprises:
    obtaining, by the projection destination, a click signal directed at the mirror of the first edit box in the projection window, wherein the click signal comprises a click type and a click position; and
    sending, by the projection destination, the click signal to the projection source, so that the first edit box obtains input focus.
  9. An input method, comprising:
    projecting, by a projection source, its on-screen content to a projection destination, so that the projection destination displays a projection window on at least a part of a display interface, wherein the projection window is a mirror of the on-screen content, and the on-screen content comprises a first edit box;
    sending, by the projection source, input start information to the projection destination, wherein the input start information indicates that the first edit box has obtained input focus;
    receiving, by the projection source, to-be-displayed content from the projection destination, wherein the to-be-displayed content is obtained by the projection destination in response to the input start information, and the to-be-displayed content is text content or an image input by an input device of the projection destination; and
    displaying, by the projection source, the to-be-displayed content in the first edit box, to update the on-screen content.
  10. The method according to claim 9, wherein the input start information is used to set a second edit box of the projection destination to an input state, and the to-be-displayed content is text content or an image that triggers a content change event of the second edit box.
  11. The method according to claim 9 or 10, wherein the to-be-displayed content is generated by a first input method of the projection destination according to an operation command produced by the input device.
  12. The method according to any one of claims 9 to 11, wherein the second edit box is a hidden edit box or a transparent edit box.
  13. The method according to any one of claims 9 to 12, wherein the input start information comprises a first cursor position of the first edit box, and the first cursor position is used to set a position of the second edit box on the projection window.
  14. The method according to any one of claims 9 to 13, wherein before the sending, by the projection source, input start information to the projection destination, the method further comprises:
    receiving, by the projection source from the projection destination, a click signal directed at the mirror of the first edit box, wherein the click signal comprises a click type and a click position; and
    starting a second input method of the projection source according to the click signal, so as to monitor a cursor position of the first edit box through the second input method.
  15. The method according to any one of claims 9 to 14, wherein the second edit box is disposed on a part of the display interface of the projection destination other than the projection window.
  16. The method according to any one of claims 9 to 15, wherein before the sending, by the projection source, input start information to the projection destination, the method further comprises:
    obtaining, by the projection source from the projection destination, a click signal directed at the mirror of the first edit box, wherein the click signal comprises a click type and a click position; and
    causing, by the projection source according to the click signal, the first edit box to obtain input focus, and sending the input start information to the projection destination.
  17. A projection destination, comprising a processor, a memory, a transceiver, and a display, wherein the memory is configured to store computer-executable instructions; and when the projection destination runs, the processor executes the computer-executable instructions stored in the memory, so that the projection destination performs the method according to any one of claims 1 to 8.
  18. A projection source, comprising a processor, a memory, a transceiver, and a display, wherein the memory is configured to store computer-executable instructions; and when the projection source runs, the processor executes the computer-executable instructions stored in the memory, so that the projection source performs the method according to any one of claims 9 to 16.
  19. A computer storage medium, comprising computer instructions, wherein when the computer instructions run on a terminal, the terminal is caused to perform the method according to any one of claims 1 to 8 or the method according to any one of claims 9 to 16.
PCT/CN2020/096726 2019-06-20 2020-06-18 Input method, electronic device, and screen projection system WO2020253760A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
ES20825506T ES2967240T3 (es) 2019-06-20 2020-06-18 Input method, electronic device, and screen projection system
EP20825506.7A EP3955556B1 (en) 2019-06-20 2020-06-18 Input method, electronic device, and screen projection system
US17/615,977 US12032866B2 (en) 2019-06-20 2020-06-18 Input method, electronic device, and screen projection system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910537753.9A CN110417992B (zh) 2019-06-20 2019-06-20 一种输入方法、电子设备和投屏系统
CN201910537753.9 2019-06-20

Publications (1)

Publication Number Publication Date
WO2020253760A1 true WO2020253760A1 (zh) 2020-12-24

Family

ID=68359381

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/096726 WO2020253760A1 (zh) 2019-06-20 2020-06-18 一种输入方法、电子设备和投屏系统

Country Status (5)

Country Link
US (1) US12032866B2 (zh)
EP (1) EP3955556B1 (zh)
CN (2) CN112968991B (zh)
ES (1) ES2967240T3 (zh)
WO (1) WO2020253760A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116029746A (zh) * 2023-03-28 2023-04-28 深圳市湘凡科技有限公司 Data processing method and related apparatus

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112968991B (zh) * 2019-06-20 2022-07-29 Huawei Technologies Co., Ltd. Input method, electronic device, and screen projection system
CN110837308B (zh) * 2019-11-13 2022-06-28 Lenovo (Beijing) Co., Ltd. Information processing method and apparatus, and electronic device
CN113129202B (zh) * 2020-01-10 2023-05-09 Huawei Technologies Co., Ltd. Data transmission method and apparatus, data processing system, and storage medium
CN111324248A (zh) * 2020-01-21 2020-06-23 维达力实业(深圳)有限公司 Terminal control method and apparatus, terminal, computer device, and storage medium
CN111414097A (zh) * 2020-03-23 2020-07-14 Vivo Mobile Communication Co., Ltd. Interaction method, interaction system, and display device
CN115623115B (zh) * 2020-03-31 2024-06-18 Huawei Technologies Co., Ltd. Method, device, and system for creating an application shortcut across devices, and storage medium
CN113542825B (zh) * 2020-04-20 2022-10-11 Huawei Technologies Co., Ltd. Screen projection display method and system, terminal device, and storage medium
CN114115629A (zh) * 2020-08-26 2022-03-01 Huawei Technologies Co., Ltd. Interface display method and device
CN113741708B (zh) * 2020-05-31 2024-06-11 Huawei Technologies Co., Ltd. Input method and electronic device
CN111653330B (zh) * 2020-06-05 2024-01-30 上海杏脉信息科技有限公司 Method, system, terminal, and medium for displaying medical images and generating diagnostic information
CN111787410B (zh) * 2020-07-03 2022-03-29 Samsung Electronics (China) R&D Center Keyboard input method and keyboard input apparatus
CN114764298B (zh) 2020-07-29 2023-03-03 Huawei Technologies Co., Ltd. Cross-device object dragging method and device
CN114071207B (zh) * 2020-07-30 2023-03-24 Huawei Technologies Co., Ltd. Method and apparatus for controlling display of a large-screen device, large-screen device, and storage medium
CN112019914B (zh) * 2020-08-27 2021-10-22 Beijing ByteDance Network Technology Co., Ltd. Screen projection method and apparatus, electronic device, and computer-readable medium
US12050837B2 (en) 2020-08-27 2024-07-30 Douyin Vision Co., Ltd. Screen projection method and apparatus, electronic device, and computer-readable medium
CN112256224A (zh) * 2020-10-21 2021-01-22 深圳市艾酷通信软件有限公司 Screen projection method and electronic device
US20230403421A1 (en) * 2020-10-31 2023-12-14 Huawei Technologies Co., Ltd. Device Communication Method and System, and Apparatus
CN114518854B (zh) * 2020-11-20 2024-07-30 Huawei Technologies Co., Ltd. Screen projection method and device
CN112507882A (zh) * 2020-12-10 2021-03-16 Spreadtrum Communications (Shanghai) Co., Ltd. Input-box-based information input method and system, mobile terminal, and storage medium
CN112684993A (zh) * 2020-12-23 2021-04-20 Beijing Xiaomi Mobile Software Co., Ltd. Display method and apparatus based on cross-screen collaboration, and medium
CN112667183A (zh) * 2020-12-31 2021-04-16 Nubia Technology Co., Ltd. Screen projection method, mobile terminal, and computer-readable storage medium
CN112764853A (zh) * 2021-01-19 2021-05-07 深圳乐播科技有限公司 Screen projection method, device, and system
CN113515244B (zh) * 2021-03-24 2024-03-22 深圳乐播科技有限公司 Screen-projection-based terminal remote control method, apparatus, and device, and storage medium
CN113542706B (zh) * 2021-06-25 2023-06-13 深圳乐播科技有限公司 Screen projection method, apparatus, and device for a treadmill, and storage medium
CN115705128A (zh) * 2021-08-03 2023-02-17 Huawei Technologies Co., Ltd. Cross-device input method, device, and system
CN113891127A (zh) * 2021-08-31 2022-01-04 Vivo Mobile Communication Co., Ltd. Video editing method and apparatus, and electronic device
CN113760213A (zh) * 2021-09-08 2021-12-07 Lenovo (Beijing) Co., Ltd. Screen projection method and system, and electronic device
CN115033195A (zh) * 2022-04-25 2022-09-09 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Picture display method, apparatus, device, storage medium, and program product
CN115665473A (zh) * 2022-10-14 2023-01-31 Vivo Mobile Communication Co., Ltd. Screen projection method and apparatus, electronic device, and storage medium
CN115759012B (zh) * 2022-11-01 2023-04-11 佳瑛科技有限公司 Method and apparatus for managing annotations of a screen-projected shared document based on a wireless connection

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100137026A1 (en) * 2008-12-02 2010-06-03 Lg Electronics Inc. Mobile terminal and method of controlling display thereof
WO2017166602A1 (zh) * 2016-03-31 2017-10-05 Baidu Online Network Technology (Beijing) Co., Ltd. Control method for collaborative input between a vehicle-mounted terminal and a mobile terminal, and mobile terminal
CN108900697A (zh) * 2018-05-30 2018-11-27 武汉卡比特信息有限公司 Terminal text information input system and method for interconnection between a mobile phone and a computer-type terminal
CN110417992A (zh) * 2019-06-20 2019-11-05 Huawei Technologies Co., Ltd. Input method, electronic device, and screen projection system

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101609250B1 (ko) * 2008-11-26 2016-04-06 Samsung Electronics Co., Ltd. Interface method for a transmission/reception system using a data stream
TWM380521U (en) 2009-09-18 2010-05-11 Aten Int Co Ltd Remote control device and server and client incoporating the same
CN102033838A (zh) 2010-10-27 2011-04-27 东莞宇龙通信科技有限公司 Terminal control method, terminal, and computer
JP5799628B2 (ja) * 2011-07-15 2015-10-28 Sony Corporation Information processing apparatus, information processing method, and program
US20130124969A1 (en) * 2011-11-14 2013-05-16 Crowell Solutions, Inc. Xml editor within a wysiwyg application
EP2602730B1 (en) * 2011-12-07 2018-02-14 BlackBerry Limited Presenting context information in a computing device
CN102541795A (zh) 2012-01-09 2012-07-04 Nanjing University of Aeronautics and Astronautics USB dynamic real-time control method and system based on the Android system
US20150020012A1 (en) * 2013-07-11 2015-01-15 Htc Corporation Electronic device and input method editor window adjustment method thereof
KR20150018256A (ko) * 2013-08-09 2015-02-23 LG Electronics Inc. Mobile device and control method thereof
CN103593213B (zh) * 2013-11-04 2017-04-05 Huawei Technologies Co., Ltd. Text information input method and apparatus
CN103702436B (zh) 2013-12-11 2016-09-14 Neusoft Corporation Miracast reverse control method and system
CN103744763A (zh) 2013-12-25 2014-04-23 广东明创软件科技有限公司 Method for synchronously controlling a mobile terminal from a PC in automated testing
KR20150096826A (ko) * 2014-02-17 2015-08-26 LG Electronics Inc. Display apparatus and control method
JP2015162040A (ja) 2014-02-27 2015-09-07 Sharp Corporation Electronic device
TWI610221B (zh) * 2014-06-18 2018-01-01 Wistron Corporation Screen broadcasting method, and system and apparatus using the same
US20160073098A1 (en) * 2014-09-10 2016-03-10 Continental Automotive Systems, Inc. Head-up display system using auto-stereoscopy 3d transparent electronic display
US20160085396A1 (en) * 2014-09-24 2016-03-24 Microsoft Corporation Interactive text preview
TWI520051B (zh) * 2014-11-05 2016-02-01 奇揚網科股份有限公司 Mirroring display system and mirroring display method
CN105094839A (zh) 2015-08-14 2015-11-25 深圳市众联悠游科技有限公司 Method for running Android system applications on a PC
CN106886361A (zh) * 2015-12-16 2017-06-23 深圳市光峰光电技术有限公司 Input control method and system for a projection device, projection device, and mobile terminal
CN105677329B (zh) * 2015-12-30 2018-12-14 Lenovo (Beijing) Co., Ltd. Control method and electronic device
US10530856B2 (en) * 2016-02-09 2020-01-07 Qualcomm Incorporated Sharing data between a plurality of source devices that are each connected to a sink device
KR20180023609A (ko) * 2016-08-26 2018-03-07 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
CN106488016B (zh) * 2016-09-30 2020-03-27 Nubia Technology Co., Ltd. Application control method and mobile terminal
CN106547563B (zh) * 2016-11-14 2020-03-31 Hytera Communications Corporation Limited Method and apparatus for implementing operation functions in a walkie-talkie, and walkie-talkie terminal
EP3563234B1 (en) * 2017-01-31 2021-12-01 Samsung Electronics Co., Ltd. Electronic apparatus and controlling method thereof
US10901679B2 (en) * 2017-02-06 2021-01-26 Hewlett-Packard Development Company, L.P. Mirroring of screens
US10671357B2 (en) * 2017-06-05 2020-06-02 Apptimize Llc Preview changes to mobile applications at different display resolutions
KR102350933B1 (ko) * 2017-06-20 2022-01-12 LG Electronics Inc. Image display apparatus
CN109309822A (zh) * 2017-07-28 2019-02-05 Coretronic Corporation Projection method and projection system
US11314399B2 (en) * 2017-10-21 2022-04-26 Eyecam, Inc. Adaptive graphic user interfacing system
US11451619B2 (en) 2017-12-12 2022-09-20 Honor Device Co., Ltd. App remote control method and related devices
CN111629100A (zh) 2019-02-27 2020-09-04 Huawei Technologies Co., Ltd. Instruction processing method and apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100137026A1 (en) * 2008-12-02 2010-06-03 Lg Electronics Inc. Mobile terminal and method of controlling display thereof
WO2017166602A1 (zh) * 2016-03-31 2017-10-05 Baidu Online Network Technology (Beijing) Co., Ltd. Control method for collaborative input between a vehicle-mounted terminal and a mobile terminal, and mobile terminal
CN108900697A (zh) * 2018-05-30 2018-11-27 武汉卡比特信息有限公司 Terminal text information input system and method for interconnection between a mobile phone and a computer-type terminal
CN110417992A (zh) * 2019-06-20 2019-11-05 Huawei Technologies Co., Ltd. Input method, electronic device, and screen projection system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116029746A (zh) * 2023-03-28 2023-04-28 深圳市湘凡科技有限公司 Data processing method and related apparatus
CN116029746B (zh) * 2023-03-28 2023-07-04 深圳市湘凡科技有限公司 Data processing method and related apparatus

Also Published As

Publication number Publication date
EP3955556A1 (en) 2022-02-16
EP3955556A4 (en) 2022-07-27
US20240094974A1 (en) 2024-03-21
ES2967240T3 (es) 2024-04-29
US12032866B2 (en) 2024-07-09
CN110417992B (zh) 2021-02-12
EP3955556B1 (en) 2023-11-08
CN110417992A (zh) 2019-11-05
CN112968991B (zh) 2022-07-29
CN112968991A (zh) 2021-06-15

Similar Documents

Publication Publication Date Title
WO2020253760A1 (zh) Input method, electronic device, and screen projection system
US10659200B2 (en) Companion application for activity cooperation
WO2020211437A1 (zh) Screen transfer method, and multi-screen interaction apparatus and system
US11025980B2 (en) User terminal apparatus, electronic apparatus, system, and control method thereof
CN115145439B (zh) Desktop metadata display method, access method, and related apparatus
WO2019011295A1 (zh) Method and apparatus for pushing and requesting minus-one-screen content, server, and terminal
US12028558B2 (en) Method for processing live broadcast information, electronic device and storage medium
US20170031897A1 (en) Modification of input based on language content background
US20160350062A1 (en) Remote screen display system, remote screen display method and non-transitory computer-readable recording medium
CN110704647A (zh) Content processing method and apparatus
WO2023185817A1 (zh) Multi-device collaboration method and apparatus, electronic device, and medium
US20130111327A1 (en) Electronic apparatus and display control method
CN112507329A (zh) Security protection method and apparatus
WO2017080203A1 (zh) Handwriting input method and apparatus, and mobile device
JP5281324B2 (ja) Screen output converter, display apparatus, and screen display method
JP5752759B2 (ja) Electronic device, method, and program
CN110456919B (zh) Data processing method and apparatus, and apparatus for data processing
KR20230154786A (ko) Interaction method between a display apparatus and a terminal apparatus, storage medium, and electronic device
KR20210079879A (ko) Apparatus and method for providing a touch interface for an IPTV service
CN109672646B (zh) Cross-device input method and apparatus, and apparatus for cross-device input
CN115103054B (zh) Information processing method and apparatus, electronic device, and medium
WO2022188297A1 (zh) Request processing method, apparatus, and medium
WO2019015089A1 (zh) Global menu control method, apparatus, device, and storage medium
WO2018205072A1 (zh) Text-to-speech method and apparatus
CN114721563A (zh) Display device, annotation method, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20825506

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 17615977

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2020825506

Country of ref document: EP

Effective date: 20211112

NENP Non-entry into the national phase

Ref country code: DE