CN110850964A - Method, device and system for remotely inputting VR equipment - Google Patents

Method, device and system for remotely inputting VR equipment

Info

Publication number
CN110850964A
Authority
CN
China
Prior art keywords
input
computer
information
input information
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201910968615.6A
Other languages
Chinese (zh)
Inventor
何建邦
朱庆友
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing IQIYI Intelligent Technology Co Ltd
Original Assignee
Chongqing IQIYI Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing IQIYI Intelligent Technology Co Ltd filed Critical Chongqing IQIYI Intelligent Technology Co Ltd
Priority to CN201910968615.6A priority Critical patent/CN110850964A/en
Publication of CN110850964A publication Critical patent/CN110850964A/en
Withdrawn legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a method, a device and a system for remote input to a VR device. The VR device projects its current display content to a display device of the computer device; the computer device receives local information input from a user; the computer device encodes the input information and sends it to the VR device; the VR device receives the encoded input information and invokes its local input method to decode it; the VR device displays the input information at a target input location. Compared with the prior art, the computer device and the VR device are connected via TCP, so data can be synchronized quickly: the current display image of the VR device is sent to the computer device, and text entered on the computer device is sent to the VR device, with good real-time performance. In addition, the invention supports the input of Chinese text and emoji.

Description

Method, device and system for remotely inputting VR equipment
Technical Field
The invention relates to the technical field of VR (virtual reality), and in particular to a technique for remotely inputting information to a VR device during the debugging and development of the VR device.
Background
The VR (Virtual Reality) industry is developing rapidly, and there is often a need to display and control the VR device's picture on a large screen. VR devices generally use a handle (controller) or head control as the input device, which makes input rather inconvenient. When the VR device is connected to a computer, input can be performed through an input method on the computer side and pushed directly into a text box on the VR device. The usual scheme is to use an adb shell input text command on the PC to send text to the VR device.
The adb shell input text approach is inefficient: the adb service has to be started for every input, which takes about 1 s, so fast input cannot be achieved. In addition, this method cannot input Chinese or emoji; when Chinese text or emoji are sent, garbled characters appear on the VR device, which makes the approach impractical.
Disclosure of Invention
The invention aims to provide a method, a device and a system for remote input to a VR device.
According to an aspect of the invention, there is provided a method of remotely inputting to a VR device, wherein the method comprises the steps of:
the VR equipment projects the current display content to a display device of the computer equipment;
the computer equipment receives local information input of a user;
the computer device encodes the input information and sends the encoded information to the VR device;
the VR device receives the encoded input information and invokes its local input method to decode the input information;
the VR device displays the input information at a target input location.
According to an aspect of the present invention, there is also provided a method for implementing remote input at a VR device, where the method includes the following steps:
sending the current display content for screen projection to the computer equipment;
receiving encoded input information from the computer device;
invoking a local input method to decode the encoded input information;
and displaying the input information at the target input position.
According to an aspect of the invention, there is also provided a method of remote input of a VR device by a computer device, wherein the method comprises the steps of:
receiving an image sent by a VR device and presenting a rendered image on a display device, wherein the image corresponds to the current display content of the VR device;
receiving local information input of a user;
encoding the input information and sending it to the VR device.
According to an aspect of the invention, there is also provided a system for remote input to a VR device, wherein the system comprises the VR device and a computer device, the computer device coupled with a display,
wherein the VR device includes:
the screen projection device is used for projecting the current display content of the VR equipment to the display device of the computer equipment;
receiving means for receiving encoded input information from the computer device;
decoding means for invoking a local input method to decode the encoded input information;
the display device is used for displaying the input information at a target input position;
wherein the computer device comprises:
the submitting device is used for receiving local information input of a user;
encoding means for encoding the input information;
and the transceiver is used for transmitting the coded input information to the VR equipment.
According to an aspect of the present invention, there is also provided an apparatus for implementing remote input at a VR device, where the apparatus includes:
the screen projection device is used for sending the current display content for screen projection to the computer equipment;
receiving means for receiving encoded input information from the computer device;
decoding means for invoking a local input method to decode the encoded input information;
and the display device is used for displaying the input information at a target input position.
According to an aspect of the present invention, there is also provided an apparatus for remotely inputting a VR device at a computer device side, wherein the computer device is coupled with a display device;
wherein, the device includes:
the device comprises a transceiver and a processing unit, wherein the transceiver is used for receiving an image sent by a VR device, and the image corresponds to the current display content of the VR device;
the display device is used for rendering the image and presenting the rendered image;
the submitting device is used for receiving local information input of a user;
encoding means for encoding the input information;
the transceiver is further configured to transmit the encoded input information to the VR device.
According to an aspect of the present invention, there is also provided a VR device comprising a memory and a processor, wherein the memory stores computer program instructions, and when the computer program instructions are executed by the processor, a method for implementing remote input at a VR device as described above is implemented.
According to an aspect of the invention, there is also provided a computer program product comprising computer program instructions which, when executed by a VR device, implement a method of enabling remote input at a VR device as described above.
According to an aspect of the present invention, there is also provided a computer readable storage medium storing computer program instructions which, when executed by a VR device, implement a method for enabling remote input at a VR device as described above.
According to an aspect of the invention, there is also provided a computer device comprising a memory and a processor, wherein the memory stores computer program instructions which, when executed by the processor, implement a method of remote input to a VR device by a computer device as described above.
According to an aspect of the invention, there is also provided a computer program product comprising computer program instructions which, when executed by a computer device, implement a method of remotely inputting a VR device by a computer device as hereinbefore described.
According to an aspect of the invention, there is also provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a computer device, implement a method of remotely inputting a VR device by a computer device as previously described.
Compared with the prior art, the computer device and the VR device are connected via TCP, so data can be synchronized quickly: the current display image of the VR device is sent to the computer device, and text entered on the computer device is sent to the VR device, with good real-time performance. In addition, the invention supports the input of Chinese text and emoji. During the development, debugging, testing and demonstration of VR devices, information can be entered quickly, which improves input efficiency.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
FIG. 1 shows a flow diagram of a method of remotely inputting to a VR device in accordance with an embodiment of the invention;
fig. 2 shows a schematic diagram of a system for remote input to a VR device, according to an embodiment of the invention.
The same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Specific structural and functional details disclosed herein are merely representative and are provided for purposes of describing example embodiments of the present invention. The present invention may, however, be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element may be termed a second element, and, similarly, a second element may be termed a first element, without departing from the scope of example embodiments. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements present. Other words used to describe the relationship between elements (e.g., "between" versus "directly between", "adjacent" versus "directly adjacent to", etc.) should be interpreted in a similar manner.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be noted that, in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may, in fact, be executed substantially concurrently, or the figures may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
The present invention is described in further detail below with reference to the attached drawing figures.
Fig. 1 illustrates a method flow diagram, which particularly shows a process of remote input to a VR device, in accordance with an embodiment of the present invention.
As shown in fig. 1, in step S1, the VR device projects its current display content to the display device of the computer device; in step S2, the computer device receives local information input from the user; in step S3, the computer device encodes the input information; in step S4, the computer device sends the encoded input information to the VR device; in step S5, the VR device invokes its local input method to decode the encoded input information; in step S6, the VR device displays the input information at the target input position.
Referring to fig. 1, in step S1, the VR device projects its current display content to the display device of the computer device.
Here, the computer device includes, but is not limited to, any electronic device coupled with a display device that can execute predetermined processes such as numerical and/or logical computation by executing predetermined programs or instructions. It generally includes a processor and a memory; the predetermined processes are executed by the processor running program instructions pre-stored in the memory, or by hardware such as an ASIC, FPGA or DSP, or by a combination of the two. Computer devices include, but are not limited to, personal computers (PCs), laptops, tablets, and the like.
The VR device is typically an all-in-one (standalone) VR device, specifically a VR head-mounted display (virtual reality head-mounted display device) with an independent processor, and has independent computing, input and output capabilities.
The step of screen projection from the VR device to the computer device further comprises the following sub-steps:
In step S101, a TCP connection is established between the VR device and the computer device through an adb connection.
The VR device is connected to the computer device via USB or Wi-Fi, and once adb is working normally, the two establish a TCP network connection.
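As a rough illustration of this sub-step, the sketch below shows one way the VR device could listen on a local TCP port and the PC could reach it after an adb port forward. The port number (6000), class names and framing are illustrative assumptions, not details given by the patent.

```java
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

// Illustrative sketch only: the patent does not specify port numbers or APIs.
// VR device side (Android): listen on a local TCP port. The PC side first maps
// that port over adb, e.g.:  adb forward tcp:6000 tcp:6000
public class ScreenCastServer {
    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(6000)) {   // port 6000 is an assumption
            Socket pc = server.accept();                        // PC connects via the forwarded port
            System.out.println("PC connected: " + pc.getRemoteSocketAddress());
            // ... exchange screen images and encoded input over pc's streams ...
        }
    }
}

// PC side: after "adb forward", connect to the forwarded local port.
class PcClient {
    static Socket connect() throws IOException {
        return new Socket("127.0.0.1", 6000);
    }
}
```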
Step S102, the VR device captures an image of its currently displayed content and sends the image to the computer device via the TCP connection.
The VR device captures images of its currently displayed content in real-time and sends them to the computer device so that the computer device can synchronize the VR device's currently displayed content.
Here, to reduce the amount of data processing and data transmission, the VR device may compare the currently captured image with the previously captured image; when the two are identical, the VR device does not send the current image to the computer device. In other words, the changed display image is synchronized to the computer device only when the currently displayed content of the VR device changes.
Further, the current display image captured by the VR device may also be compressed and then sent to the computer device via a specific socket port of the TCP connection between the two.
Because the computer device and the VR device are connected via TCP, text entered on the computer device can be sent to the VR device quickly, with good real-time performance.
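The per-frame flow just described (capture, skip unchanged frames, compress, send over the socket) could look roughly like the following sketch. It assumes the captured frame is available as an Android Bitmap, uses JPEG compression and a digest-based change check, and length-prefixed framing; all of these choices and names are assumptions for illustration only.

```java
import android.graphics.Bitmap;
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Arrays;

// Illustrative sketch: frame capture itself is device-specific and omitted here.
public class FrameSender {
    private byte[] lastDigest;   // digest of the previously sent frame

    /** Compress the frame to JPEG and send it only if it differs from the last one. */
    public void sendIfChanged(Bitmap frame, DataOutputStream out)
            throws IOException, NoSuchAlgorithmException {
        ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
        frame.compress(Bitmap.CompressFormat.JPEG, 70, jpeg);   // 70% quality is an assumption
        byte[] payload = jpeg.toByteArray();

        byte[] digest = MessageDigest.getInstance("MD5").digest(payload);
        if (Arrays.equals(digest, lastDigest)) {
            return;                        // display content unchanged: skip this frame
        }
        lastDigest = digest;

        out.writeInt(payload.length);      // simple length-prefixed framing (assumption)
        out.write(payload);
        out.flush();
    }
}
```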
In step S103, the computer device renders the received image and displays the rendered image on its display device.
Here, the display device is typically a display (monitor).
Accordingly, the current display content of the VR device is synchronized to the computer device side. When an input requirement appears in the displayed content, for example when the page contains an input box or an input box is activated, the user can perform a local input through the input device of the computer device so that the local input is synchronized to the VR device side.
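On the computer device side, one plain-Java way to render each received frame is to decode the JPEG bytes and show them in a window. The patent does not prescribe a GUI toolkit, so the Swing viewer and the length-prefixed framing below are assumptions used purely to make the step concrete.

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import javax.imageio.ImageIO;
import javax.swing.ImageIcon;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;

// Illustrative sketch of the PC-side viewer; framing and window setup are assumptions.
public class FrameViewer {
    private final JLabel canvas = new JLabel();

    public FrameViewer() {
        JFrame window = new JFrame("VR screen mirror");
        window.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        window.add(canvas);
        window.setSize(1280, 720);
        window.setVisible(true);
    }

    /** Reads one length-prefixed JPEG frame from the socket stream and displays it. */
    public void showNextFrame(DataInputStream in) throws IOException {
        int length = in.readInt();
        byte[] payload = new byte[length];
        in.readFully(payload);
        BufferedImage image = ImageIO.read(new ByteArrayInputStream(payload));
        SwingUtilities.invokeLater(() -> canvas.setIcon(new ImageIcon(image)));
    }
}
```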
In step S2, the computer device receives local information input from the user.
Here, the user's local input on the computer device can be made through an input device; input devices include a virtual keyboard and a physical keyboard, each provided with a corresponding input method configuration.
While browsing the VR device display content synchronized to the display device, the user uses the input device of the computer device to make an input when input is needed; relative to input on the VR device side, this input can be understood as a kind of "local input" on the computer device side.
In step S3, the computer device encodes the input information.
Here, the encoding is typically UTF-8, which supports multiple languages including Chinese. Of course, those skilled in the art will appreciate that the present invention is not limited to UTF-8 encoding; any other encoding scheme that can accommodate Chinese or other language input may be used.
In step S4, the computer device transmits the encoded input information to the VR device.
After the computer device has UTF-8 encoded the input information, it sends the encoded input information to the VR device over the TCP connection established with the VR device. In other words, the computer device side captures the user's input in real time and synchronizes it to the VR device in real time.
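Encoding the captured text (including Chinese and emoji) as UTF-8 and pushing it over the TCP connection can be sketched as follows; the helper class name and the length-prefixed framing are assumptions added for illustration.

```java
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

// Illustrative sketch: UTF-8 carries Chinese and emoji without the garbling
// seen with "adb shell input text".
public class InputSender {
    /** Encode the user's input as UTF-8 and send it with a length prefix. */
    public static void send(String text, DataOutputStream out) throws IOException {
        byte[] utf8 = text.getBytes(StandardCharsets.UTF_8);
        out.writeInt(utf8.length);
        out.write(utf8);
        out.flush();
    }
}

// Example (hypothetical): InputSender.send("你好 😀", out);
```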
In step S5, the VR device invokes its local input method to decode the encoded input information.
Here, the local input method needs to be able to handle the input sequence with which the input device on the computer device side generates the input information. For example, for input with a physical keyboard, the VR device may create a standard Android input method locally so that it can recognize the input sequence generated by the keystrokes. For input with a virtual keyboard, the VR device may, for example, invoke a native Google input method to recognize the input sequence generated by touches on the touchscreen.
In step S6, the VR device displays the input information at the target input position.
The target input position refers to the position where the input focus is located in the currently displayed content. For example, the VR device may call the system function commitText() to send the input information into the currently focused text box.
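The decode-and-commit step on the VR device could be sketched as below, using a custom Android input method that decodes the UTF-8 payload and commits it to the focused text box via InputConnection.commitText(), which the description names. The background socket handling and length-prefixed framing are assumptions, not the patent's exact implementation.

```java
import android.inputmethodservice.InputMethodService;
import android.view.inputmethod.InputConnection;
import java.io.DataInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

// Illustrative sketch of the VR-side input method; how the socket stream reaches
// this service (e.g. a background thread) is an assumption and omitted here.
public class RemoteInputMethodService extends InputMethodService {

    /** Decode one UTF-8 payload from the PC and insert it at the current input focus. */
    void handleIncoming(DataInputStream in) throws IOException {
        int length = in.readInt();
        byte[] payload = new byte[length];
        in.readFully(payload);
        String text = new String(payload, StandardCharsets.UTF_8);   // UTF-8 decode (Chinese/emoji safe)

        InputConnection connection = getCurrentInputConnection();
        if (connection != null) {
            // commitText() places the text into the currently focused text box.
            connection.commitText(text, 1);
        }
    }
}
```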
Fig. 2 shows a system diagram, which particularly shows a system for remote input to a VR device, according to an embodiment of the invention.
As shown in fig. 2, system 20 includes VR device 210 and computer device 220. VR device 210 includes a screen projection device 211, a receiving device 212, a decoding device 213, and a presentation device 214. Computer device 220 includes a submitting device 223, an encoding device 224 and a transceiver device 225, and is coupled with a display device 221.
The screen projection device 211 projects the current display content of VR device 210 to the display device 221 of computer device 220; the submitting device 223 of computer device 220 receives local information input by the user; the encoding device 224 of computer device 220 encodes the input information; the transceiver device 225 of computer device 220 transmits the encoded input information to VR device 210, and the receiving device 212 of VR device 210 receives the encoded input information; the decoding device 213 of VR device 210 invokes the local input method to decode the encoded input information; the presentation device 214 of VR device 210 displays the input information at the target input location.
Specifically, still referring to fig. 2, the screen projection device 211 of the VR device 210 projects the currently displayed content to the display device 221 of the computer device 220.
Here, computer device 220 includes, but is not limited to, any electronic device coupled with the display device 221 that can execute predetermined processes such as numerical and/or logical computation by executing predetermined programs or instructions. It generally includes a processor and a memory; the predetermined processes are executed by the processor running program instructions pre-stored in the memory, or by hardware such as an ASIC, FPGA or DSP, or by a combination of the two. Computer devices 220 include, but are not limited to, personal computers (PCs), laptops, tablets, and the like.
The VR device 210 is typically an all-in-one (standalone) VR device, specifically a VR head-mounted display (virtual reality head-mounted display device) with an independent processor, and has independent computing, input and output capabilities.
The screen projection operation performed by the screen projection device 211 toward the computer device 220 further comprises the following sub-operations:
the TCP connection is established between the screen-projecting device 211 and the transceiver 225 of the computer device 220 via an adb connection.
The screen projection device 211 and the transceiver device 225 are connected via USB or Wi-Fi, and once adb is working normally, the two establish a TCP network connection.
Subsequently, the screen-projecting device 211 captures an image of the content currently displayed by the VR device 210 and transmits the captured image to the transceiving device 225 of the computer device 220 via the TCP connection.
The screen projection device 211 captures images of the currently displayed content of the VR device 210 in real time and transmits them to the computer device 220 so that the computer device 220 can synchronize the currently displayed content of the VR device 210.
Here, to reduce the amount of data processing and data transmission, the screen projection device 211 may compare the currently captured image with the previously captured image; when the two are identical, the current image is not transmitted to computer device 220. In other words, the changed display image is synchronized to computer device 220 only when the currently displayed content of VR device 210 changes.
Further, the current display image captured by the screen projection device 211 may also be compressed and then sent to computer device 220 via a specific socket port of the TCP connection between the two.
Because computer device 220 and VR device 210 are connected via TCP, text entered on computer device 220 can be sent to VR device 210 quickly, with good real-time performance.
Next, the display device 221 of computer device 220 renders the received image and displays the rendered image.
Here, the display device 221 is typically a display.
Accordingly, the currently displayed content of VR device 210 is synchronized to computer device 220. When an input requirement appears in the displayed content, for example when a page contains an input box or an input box is activated, the user may make a local input via input device 222 of computer device 220 to synchronize the local input to VR device 210.
The submitting device 223 of computer device 220 receives local information input by the user.
Here, computer device 220 may also be coupled with an input device (not shown) so that the user's local input at computer device 220 can be made through that input device. Input devices include a virtual keyboard and a physical keyboard, each provided with a corresponding input method configuration.
While browsing the VR device display content synchronized to display device 221, the user uses input device 222 of computer device 220 to make an input when input is needed; relative to input on the VR device side, this input may be understood as a kind of "local input" on the computer device side.
Then, the encoding means 224 of the computer device 220 encodes the input information, and the transceiver means 225 transmits the encoded input information to the VR device 210.
Here, the encoding is typically UTF-8, which supports multiple languages including Chinese. Of course, those skilled in the art will appreciate that the present invention is not limited to UTF-8 encoding; any other encoding scheme that can accommodate Chinese or other language input may be used.
After the encoding device 224 has UTF-8 encoded the input information, the transceiver device 225 transmits the encoded input information to VR device 210 over the TCP connection with VR device 210. In other words, computer device 220 captures the user's input in real time and synchronizes it to VR device 210 in real time.
Next, the receiving means 212 of the VR device 210 receives the encoded input information, and the decoding means 213 invokes its local input method to decode the encoded input information.
Here, the local input method needs to be able to handle the input sequence with which the input device on the computer device side generates the input information. For example, for input with a physical keyboard, the decoding device 213 may create a standard Android input method locally so that it can recognize the input sequence generated by the keystrokes. For input with a virtual keyboard, the decoding device 213 may, for example, invoke a native Google input method to recognize the input sequence generated by touches on the touchscreen.
Subsequently, presentation device 214 of VR device 210 displays the input information at the target input location.
The target input position refers to the position where the input focus is located in the currently displayed content. For example, the presentation device 214 may call the system function commitText() to send the input information into the currently focused text box.
It should be noted that the present invention may be implemented in software and/or in a combination of software and hardware, for example, as an Application Specific Integrated Circuit (ASIC), a general purpose computer or any other similar hardware device. In one embodiment, the software program of the present invention may be executed by a processor to implement the steps or functions described above. Also, the software programs (including associated data structures) of the present invention can be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Further, some of the steps or functions of the present invention may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
In addition, some of the present invention can be applied as a computer program product, such as computer program instructions, which when executed by a computer, can invoke or provide the method and/or technical solution according to the present invention through the operation of the computer. Program instructions which invoke the methods of the present invention may be stored on a fixed or removable recording medium and/or transmitted via a data stream on a broadcast or other signal-bearing medium and/or stored within a working memory of a computer device operating in accordance with the program instructions. An embodiment according to the invention herein comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the apparatus to perform a method and/or solution according to embodiments of the invention as described above.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the system claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.

Claims (23)

1. A method of remotely inputting to a VR device, wherein the method comprises the steps of:
the VR equipment projects the current display content to a display device of the computer equipment;
the computer equipment receives local information input of a user;
the computer device encodes the input information and sends the encoded information to the VR device;
the VR device receives the encoded input information and invokes its local input method to decode the input information;
the VR device displays the input information at a target input location.
2. The method of claim 1, wherein the screen-projecting step specifically comprises:
establishing a TCP connection between the VR device and the computer device through an adb connection;
the VR device capturing an image of its currently displayed content and sending the image to the computer device via the TCP connection;
the computer device renders the image and displays the rendered image on its display device.
3. The method according to claim 1 or 2, wherein the encoding is UTF8 encoding.
4. The method of claim 3, wherein the local information input is via a physical keyboard, the input information being generated based on a keyboard key code.
5. The method of claim 3, wherein the local input method supports keyboard key codes.
6. A method for realizing remote input at a VR device, wherein the method comprises the following steps:
sending the current display content for screen projection to the computer equipment;
receiving encoded input information from the computer device;
invoking a local input method to decode the encoded input information;
and displaying the input information at the target input position.
7. The method according to claim 6, wherein the sending step specifically comprises:
establishing a TCP connection with the computer equipment through an adb connection;
capturing an image of the currently displayed content and sending the image to the computer device via the TCP connection.
8. The method of claim 6 or 7, wherein the encoding is UTF8 encoding.
9. The method of claim 8, wherein the local input method supports keyboard key codes.
10. A method of remote input of a VR device by a computer device, wherein the method comprises the steps of:
receiving an image sent by a VR device and presenting a rendered image on a display device, wherein the image corresponds to the current display content of the VR device;
receiving local information input of a user;
encoding the input information and sending it to the VR device.
11. The method of claim 10, wherein the receiving and presenting steps specifically comprise:
establishing a TCP connection with the VR device through an adb connection;
receiving from the VR device via the TCP connection an image of the currently displayed content it captures;
rendering the image and displaying the rendered image on its display device.
12. The method according to claim 10 or 11, wherein the encoding is UTF8 encoding.
13. The method of claim 12, wherein the local information input is via a physical keyboard, the input information being generated based on a keyboard key code.
14. A system for remote input to a VR device, wherein the system includes the VR device and a computer device, the computer device coupled with a display,
wherein the VR device includes:
the screen projection device is used for projecting the current display content of the VR equipment to the display device of the computer equipment;
receiving means for receiving encoded input information from the computer device;
decoding means for invoking a local input method to decode the encoded input information;
the display device is used for displaying the input information at a target input position;
wherein the computer device comprises:
the submitting device is used for receiving local information input of a user;
encoding means for encoding the input information;
and the transceiver is used for transmitting the coded input information to the VR equipment.
15. The system of claim 14, wherein the encoding is UTF8 encoding.
16. An apparatus for implementing remote input at a VR device, wherein the apparatus comprises:
the screen projection device is used for sending the current display content for screen projection to the computer equipment;
receiving means for receiving encoded input information from the computer device;
decoding means for invoking a local input method to decode the encoded input information;
and the display device is used for displaying the input information at a target input position.
17. An apparatus for remote input of a VR device at a computer device side, wherein the computer device is coupled with a display device;
wherein, the device includes:
the device comprises a transceiver and a processing unit, wherein the transceiver is used for receiving an image sent by a VR device, and the image corresponds to the current display content of the VR device;
the display device is used for rendering the image and presenting the rendered image;
the submitting device is used for receiving local information input of a user;
encoding means for encoding the input information;
the transceiver is further configured to transmit the encoded input information to the VR device.
18. A VR device comprising a memory and a processor, wherein the memory stores computer program instructions that, when executed by the processor, implement the method of any of claims 6 to 9.
19. A computer program product comprising computer program instructions to implement the method of any one of claims 6 to 9 when executed by a VR device.
20. A computer readable storage medium storing computer program instructions which, when executed by a VR device, implement the method of any of claims 6 to 9.
21. A computer device comprising a memory and a processor, wherein the memory stores computer program instructions that, when executed by the processor, implement the method of any of claims 10 to 13.
22. A computer program product comprising computer program instructions which, when executed by a computer device, implement the method of any one of claims 10 to 13.
23. A computer readable storage medium storing computer program instructions which, when executed by a computer device, implement the method of any one of claims 10 to 13.
CN201910968615.6A 2019-10-12 2019-10-12 Method, device and system for remotely inputting VR equipment Withdrawn CN110850964A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910968615.6A CN110850964A (en) 2019-10-12 2019-10-12 Method, device and system for remotely inputting VR equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910968615.6A CN110850964A (en) 2019-10-12 2019-10-12 Method, device and system for remotely inputting VR equipment

Publications (1)

Publication Number Publication Date
CN110850964A (en) 2020-02-28

Family

ID=69597414

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910968615.6A Withdrawn CN110850964A (en) 2019-10-12 2019-10-12 Method, device and system for remotely inputting VR equipment

Country Status (1)

Country Link
CN (1) CN110850964A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101488154A (en) * 2009-03-04 2009-07-22 西安双捷科技有限责任公司 Words input implementing method used for web page
CN103095850A (en) * 2013-02-07 2013-05-08 珠海市君天电子科技有限公司 Method and system sharing network through computer for mobile terminal
CN104079865A (en) * 2013-03-29 2014-10-01 思科技术公司 Annotating a presentation in a telepresence meeting
CN105677329A (en) * 2015-12-30 2016-06-15 联想(北京)有限公司 Controlling method and electronic device
CN107566793A (en) * 2017-08-31 2018-01-09 中科云创(北京)科技有限公司 Method, apparatus, system and electronic equipment for remote assistance
CN109218731A (en) * 2017-06-30 2019-01-15 腾讯科技(深圳)有限公司 The throwing screen method, apparatus and system of mobile device


Similar Documents

Publication Publication Date Title
KR101629072B1 (en) Gesture visualization and sharing between electronic devices and remote displays
JP6368033B2 (en) Terminal, server, and terminal control method
US20180189083A1 (en) Method and device for operating target application on corresponding equipment
WO2016197590A1 (en) Method and apparatus for providing screenshot service on terminal device and storage medium and device
US20140108940A1 (en) Method and system of remote communication over a network
US20140292999A1 (en) Annotating a presentation in a telepresence meeting
JP2014520306A (en) Adaptive use of wireless display
CN112422868A (en) Data processing method, terminal device and server
US20220076476A1 (en) Method for generating user avatar, related apparatus and computer program product
US10699664B2 (en) Image display system and method of transforming display panels of mobile devices into being compatible with medical images display standard
US10169628B1 (en) Scanning image codes in virtual mobile infrastructures
CN114071190A (en) Cloud application video stream processing method, related device and computer program product
CN110850964A (en) Method, device and system for remotely inputting VR equipment
CN102073376B (en) Portable wireless operation system and method
CN107291834B (en) Information input method, equipment and terminal based on readable codes
CN107315970B (en) Sensitive data interaction method and device
CN114610212A (en) Front-end visual video content editing method, device, equipment and storage medium
CN111107316B (en) Image display method, device and system
CN113835816A (en) Virtual machine desktop display method, device, equipment and readable storage medium
CN111190675A (en) Three-dimensional image transmission method and equipment based on Roc processor
KR101694289B1 (en) Method for Client Graphic Device-Separated Software Execution
US11860771B1 (en) Multisession mode in remote device infrastructure
CN203773534U (en) Remote virtual machine screen display control system
US11650735B1 (en) Accessibility feature in remote device infrastructure
CN111835804A (en) Method, device and system for data transmission between internal network and external network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200228