CN110850963A - Virtual input method and related device - Google Patents

Virtual input method and related device

Info

Publication number
CN110850963A
CN110850963A (application CN201910947895.2A)
Authority
CN
China
Prior art keywords
input
subsystem
equipment
command
peripheral control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910947895.2A
Other languages
Chinese (zh)
Other versions
CN110850963B (en)
Inventor
冯军军
胡震宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huole Science and Technology Development Co Ltd
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd filed Critical Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN201910947895.2A priority Critical patent/CN110850963B/en
Publication of CN110850963A publication Critical patent/CN110850963A/en
Application granted granted Critical
Publication of CN110850963B publication Critical patent/CN110850963B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application discloses a virtual input method and a related device, applied to an intelligent projection device that comprises a plurality of input device subsystems. The method comprises the following steps: connecting a peripheral control device; selecting an input device subsystem according to a selection instruction, wherein the selected subsystem is one of the plurality of input device subsystems, and the plurality of input device subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem; receiving a first command sent by the peripheral control device and converting the first command into a first input value; and executing a target function according to the first input value. Implementing the embodiments of the invention makes virtual input flexible and improves the efficiency and timeliness of input.

Description

Virtual input method and related device
Technical Field
The invention relates to the technical field of computers, in particular to a virtual input method and a related device.
Background
With the rapid development of mobile internet technology, remote control technology for intelligent projection devices is also developing quickly. However, the number of input devices keeps growing, and such devices are easily damaged and cannot be used indefinitely. In the prior art, an intelligent projection device is controlled through the simulated keys built into its own system, which executes inefficiently, is time-consuming, and gives a very poor experience in human-computer interaction. Some more efficient key-simulation schemes exist in the prior art, but they require extensive modification of the native system code of the intelligent projection device.
Disclosure of Invention
The embodiments of the invention provide a virtual input method and a related device. Implementing these embodiments makes virtual input flexible and improves the efficiency and timeliness of input.
In a first aspect, an embodiment of the present application provides a virtual input method, which is applied to an intelligent projection device, where the intelligent projection device includes multiple input device subsystems, and the method includes:
connecting peripheral control equipment;
selecting an input equipment subsystem according to a selection instruction, wherein the input equipment subsystem is one of the input equipment subsystems, and the input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem;
receiving a first command sent by the peripheral control equipment, and converting the first command into a first input value;
and executing a target function according to the first input value.
In a second aspect, the present application provides a virtual input apparatus for a smart projection device, the smart projection device including a plurality of input device subsystems, the virtual input apparatus including a processing unit and a communication unit, wherein,
the processing unit is used for connecting the peripheral control equipment through the communication unit; selecting an input equipment subsystem according to a selection instruction, wherein the input equipment subsystem is one of the input equipment subsystems, and the input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem; receiving a first command sent by the peripheral control equipment through the communication unit, and converting the first command into a first input value; and executing a target function according to the first input value.
In a third aspect, an embodiment of the present application provides an intelligent projection device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing the steps in any of the methods of the first aspect of the embodiment of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods of the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps as described in any one of the methods of the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the application, the intelligent projection device first connects to the peripheral control device; then selects an input device subsystem according to a selection instruction, the subsystem being one of a plurality of input device subsystems that comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem; next receives a first command sent by the peripheral control device and converts it into a first input value; and finally executes a target function according to the first input value. The intelligent projection device can thus convert a command received from the peripheral control device into the simulated input value of the corresponding input device subsystem and realize the target function associated with that value, making virtual input flexible and improving the efficiency and timeliness of input.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort. Wherein:
FIG. 1 is a schematic diagram of a virtual input system according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a virtual input method according to another embodiment of the present invention;
FIG. 3a is a schematic diagram of a system selection interface of a peripheral control device according to an embodiment of the present invention;
fig. 3b is a schematic diagram of the peripheral control device and the corresponding position of the cursor in the intelligent projection device according to an embodiment of the present invention;
FIG. 3c is a diagram illustrating cursor movement according to an embodiment of the present invention;
FIG. 3d is a schematic diagram of an input of a keyboard input subsystem according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a virtual input and a standard input of a remote control according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating another virtual input method according to an embodiment of the present invention;
fig. 6 is a schematic diagram of an intelligent projection apparatus according to an embodiment of the present invention;
fig. 7 is a block diagram illustrating functional units of a virtual input device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The following are detailed below.
The terms "first" and "second" in the description and claims of the present invention and the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
As shown in fig. 1, fig. 1 is a schematic diagram of a virtual input system 100. The virtual input system 100 includes a peripheral control device 110 and an intelligent projection device 120; the intelligent projection device 120 includes a command receiver 121 and an input device subsystem 122. The peripheral control device 110 is connected to the intelligent projection device 120 through a wireless network and is configured to obtain a first command and send it to the intelligent projection device 120. The command receiver 121 of the intelligent projection device 120 is configured to receive the first command, convert it into a first input value, and write the first input value into the input device subsystem 122, and the input device subsystem 122 implements a target function according to the first input value. The intelligent projection device 120 may be a single integrated device or multiple devices, and may include various handheld devices with wireless communication functions, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, as well as various forms of User Equipment (UE), Mobile Stations (MS), terminal devices (terminal), and the like.
At present, the prior art sends simulated keys through the mechanism built into the intelligent projection device system, which uses broadcast. Broadcast is an asynchronous, blocking processing mode with no timeliness guarantee, so execution is inefficient and time-consuming, and the user experience is very poor.
Based on this, the embodiments of the present application provide a virtual input method to solve the above problems, and the embodiments of the present application are described in detail below.
First, referring to fig. 2, fig. 2 is a flowchart illustrating a virtual input method according to an embodiment of the present invention, which is applied to the intelligent projection device shown in fig. 1, where the intelligent projection device includes a plurality of input device subsystems, and the method includes:
s201, connecting the intelligent projection equipment with peripheral control equipment;
wherein the smart projection device may be an android smart device, such as a projector. The peripheral control device may be a device with wireless network transmission capabilities, including but not limited to a cell phone; the peripheral control equipment is connected and transmits data through a wireless transmission protocol.
In a specific implementation, the intelligent projection device includes a command receiver, which listens for and receives command data from the peripheral control device. The intelligent projection device has a security mechanism, and the permissions of the input device subsystems under this mechanism need to be modified before the peripheral control device is connected. For example, after the command receiver of the intelligent projection device detects the user's mobile phone, whether to connect to it is determined according to whether the input device subsystem in the intelligent projection device system is open.
For example, before the projector is connected to the user's mobile phone, the input device subsystems in the projector system may be searched and their permissions released to the user's mobile phone, so that the projector can connect to the phone and the input device subsystem can receive commands sent by it.
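The connection gate described above can be sketched in a few lines. This is a hedged illustration only: the subsystem names, the permission table, and the function `accept_connection` are assumptions of this sketch, not identifiers from the patent.

```python
# Sketch of the security-mechanism check: the command receiver accepts a
# peripheral only if the requested input device subsystem has been opened
# (i.e., its permission was released beforehand). Table values are illustrative.
SUBSYSTEM_PERMISSIONS = {
    "mouse": True,      # permission released to peripherals
    "keyboard": True,
    "remote": False,    # still locked by the security mechanism
}

def accept_connection(subsystem: str) -> bool:
    """Return True if the peripheral control device may connect for this subsystem."""
    return SUBSYSTEM_PERMISSIONS.get(subsystem, False)

print(accept_connection("mouse"))    # a phone asking to drive the mouse subsystem
print(accept_connection("remote"))
```

An unknown subsystem name is treated as closed, mirroring the default-deny behavior a security mechanism would be expected to have.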
S202, the intelligent projection equipment selects an input equipment subsystem according to a selection instruction, wherein the input equipment subsystem is one of the input equipment subsystems, and the input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem;
the system of the intelligent projection device may include a plurality of input device subsystems, such as input device subsystems of a mouse, a keyboard, a remote controller, and the like.
In a specific implementation, as shown in fig. 3a, fig. 3a is a schematic diagram of a system selection interface of the peripheral control device. When the peripheral control device detects a selection operation of the user, it sends a selection instruction to the intelligent projection device, and the intelligent projection device receives the selection instruction to determine the input device subsystem. For example, when the peripheral control device detects that the user selects the mouse input subsystem, it sends a selection instruction for selecting the mouse input subsystem to the intelligent projection device, and the intelligent projection device receives the selection instruction and, according to it, establishes connection and command transmission between the mouse input subsystem and the peripheral control device.
S203, the intelligent projection equipment receives a first command sent by the peripheral control equipment and converts the first command into a first input value;
After receiving a first command from the peripheral control device, the command receiver of the intelligent projection device converts the first command into the first input value of the corresponding input device subsystem and writes it into that subsystem; the first input value is a simulated input value corresponding to the input value of a physical input device.
S204, the intelligent projection equipment executes a target function according to the first input value.
After the input device subsystem of the intelligent projection device receives the first input value, the simulated input value is processed through the same flow and mechanism as an input value from a physical input device, executing the corresponding target function.
It can be seen that, in the embodiment of the application, the intelligent projection device first connects to the peripheral control device; then selects an input device subsystem according to a selection instruction, the subsystem being one of a plurality of input device subsystems that comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem; next receives a first command sent by the peripheral control device and converts it into a first input value; and finally executes a target function according to the first input value. The intelligent projection device can thus convert a command received from the peripheral control device into the simulated input value of the corresponding input device subsystem and realize the target function associated with that value, making virtual input flexible and improving the efficiency and timeliness of input.
In one possible example, the selecting an input device subsystem according to a selection instruction includes: receiving a selection instruction sent by the peripheral control device, wherein the selection instruction is generated by the peripheral control device according to a detected user selection; and determining the corresponding input device subsystem according to the selection instruction.
As shown in fig. 3a, the intelligent projection device determines the type of the input device subsystem controlled by the peripheral control device according to the selection instruction sent by the peripheral control device.
Therefore, in this example, the intelligent projection device can determine the type of input device subsystem to use according to the selection instruction sent by the peripheral control device, so that the input device subsystems of the intelligent projection device are controlled accurately, improving accuracy and effectiveness.
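The selection step (S202) amounts to resolving the received instruction to one of the subsystems. A minimal sketch, assuming the selection instruction is carried as a small integer code (the codes and names below are illustrative, not from the patent):

```python
# Resolve a selection instruction from the peripheral control device to one
# of the plurality of input device subsystems, as in step S202.
SELECTION_CODES = {0: "mouse", 1: "keyboard", 2: "remote"}

def select_subsystem(code: int) -> str:
    """Map a selection-instruction code to the chosen input device subsystem."""
    try:
        return SELECTION_CODES[code]
    except KeyError:
        raise ValueError(f"unknown selection instruction: {code}") from None

print(select_subsystem(0))   # user picked the mouse input subsystem
```

Rejecting unknown codes keeps the device from routing commands to a subsystem it never opened.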
In one possible example, the input subsystem includes a mouse input subsystem, the receiving a first command sent by the peripheral control device and converting the first command into a first input value includes: acquiring a first command sent by the peripheral control equipment, wherein the first command comprises a first cursor position; converting the first cursor position to a first input value, the first input value being used to determine a second cursor position in the smart projection device.
When the input device subsystem of the intelligent projection device is determined to be the mouse input subsystem according to the selection instruction sent by the peripheral control device, the display area of the peripheral control device can display a first cursor activity area with the same aspect ratio as the display area of the intelligent projection device. The peripheral control device determines a first cursor position according to the cursor activity area and the first cursor and sends it to the intelligent projection device; after receiving the first cursor position, the intelligent projection device determines its second cursor position according to the first cursor activity area and its own display area.
For example, as shown in fig. 3b, fig. 3b is a schematic diagram of the peripheral control device and the corresponding position of the cursor in the intelligent projection device. The mobile phone display area is provided with a first cursor moving area determined according to the projector display area, such as a mobile phone gray area in the figure, the length-width ratio of the first cursor moving area is the same as that of the projector display area, and the position of a second cursor in the projector is determined according to the ratio of the projector display area to the first cursor moving area and the position of the first cursor in the first cursor moving area.
Therefore, in this example, the intelligent projection device can determine the cursor position in the intelligent projection device according to the cursor position in the peripheral control device, so that the flexibility of controlling the intelligent projection device is realized, and the accuracy and the effectiveness are improved.
In one possible example, the converting the first cursor position to a first input value includes: determining the first cursor position coordinates (x0, y0) from the first cursor position; obtaining a length L1 and width W1 of a display area of the intelligent projection device; determining a length L2 and width W2 of a cursor movable region of the peripheral control device according to the length L1 and width W1; determining a display scale coefficient k according to the length L1 and width W1 of the display area and the length L2 and width W2 of the cursor movable region; and determining, according to the display scale coefficient k and the first cursor position coordinates (x0, y0), a second cursor position corresponding to the first input value, wherein the second cursor position is (k*x0, k*y0).
The length L1 and width W1 of the display area of the intelligent projection device may be used to determine the length and width of the cursor movable region of the peripheral control device; the length and width of the cursor movable region need not be limited by the size of the display region of the peripheral control device.
In particular implementations, the intelligent projection device obtains an aspect ratio q from the length L1 and width W1 of its display area, determines the first cursor movement area of the peripheral control device according to the aspect ratio q, and determines the length L2 and width W2 of the first cursor movement area according to user settings or default values of the intelligent projection device system. The display scale coefficient k is then determined from the display-area length L1 and width W1 and the first-cursor-movement-area length L2 and width W2, and the intelligent projection device determines the second cursor position from the received first cursor position coordinates and the coefficient k. For example, for first cursor position coordinates (x0, y0) and coefficient k = 20, the second cursor position coordinates are (20*x0, 20*y0).
In a specific implementation, as shown in fig. 3c, fig. 3c is a schematic diagram of cursor movement. The first cursor at position a corresponds to the second cursor at position c. When the first cursor moves from position a to position b with a lateral movement distance x and a longitudinal movement distance y, the second cursor moves laterally by kx and longitudinally by ky from position c, giving the second cursor position d corresponding to the first cursor position b.
In specific implementation, the intelligent projection device may further receive commands such as scrolling and page turning sent by the peripheral control device, and execute functions such as scrolling and page turning according to the commands such as scrolling and page turning.
Therefore, in this example, the intelligent projection device can determine the cursor position in the intelligent projection device according to the cursor position in the peripheral control device, so that the flexibility of controlling the intelligent projection device is realized, and the accuracy and the effectiveness are improved.
In one possible example, the input subsystem comprises a keyboard input subsystem, the receiving a first command sent by the peripheral control device and converting the first command into a first input value comprises: acquiring a first command sent by the peripheral control equipment, wherein the first command comprises a first key value, and the first key value is determined according to the position of a virtual keyboard in a display area of the peripheral control equipment; and converting the first key value into a first input value corresponding to an entity keyboard key value in the keyboard input subsystem.
When the input device subsystem of the intelligent projection device is determined to be the keyboard input subsystem according to the selection instruction sent by the peripheral control device, the display area of the intelligent projection device can display the keyboard key values corresponding to the commands sent by the peripheral control device, where the key values include but are not limited to those corresponding to characters. Specifically, the position information of a virtual key within the keyboard region of the peripheral control device is calculated, the key pressed on the virtual keyboard is determined from that position information, and a first command is generated and sent to the intelligent projection device; after receiving the first command, the intelligent projection device converts it into a first input value, that is, the key value corresponding to the physical keyboard.
For example, as shown in fig. 3d, fig. 3d is a schematic diagram of the input of the keyboard input subsystem, in which only some of the virtual keys are shown. The peripheral control device is a mobile phone and the intelligent projection device is a projector; a virtual keyboard can be displayed in the display area of the mobile phone, and the display area of the projector can show a keyboard image corresponding to the virtual keyboard on the phone, or show only the input characters.
In a specific implementation, when the keyboard input subsystem is selected, the peripheral control device can also receive voice and then convert the voice into a command to be sent to the intelligent projection device.
As can be seen, in this example, the intelligent projection device may determine, from the first command sent by the virtual keyboard of the peripheral control device, the corresponding physical-keyboard input value in its keyboard input subsystem, so that flexible virtual input to the intelligent projection device is achieved, improving accuracy and effectiveness.
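The position-to-key-value conversion above can be sketched as follows. The one-row layout, key dimensions, and key-code table are illustrative assumptions; the key codes shown happen to follow the Linux input-event-codes convention (KEY_Q = 16, KEY_W = 17, ...), which a real system may or may not use.

```python
# Map a tap position on the peripheral's virtual keyboard to a character,
# then to the input value the keyboard input subsystem expects.
ROW0 = "qwertyuiop"
KEY_WIDTH, KEY_HEIGHT = 40, 60   # assumed virtual key size in pixels

# Partial key-code table (Linux input-event-codes values, as an assumption):
KEYCODES = {"q": 16, "w": 17, "e": 18, "r": 19, "t": 20}

def key_at(x: int, y: int) -> str:
    """Resolve a tap at (x, y) in the keyboard region to a character."""
    row, col = y // KEY_HEIGHT, x // KEY_WIDTH
    if row == 0 and col < len(ROW0):
        return ROW0[col]
    raise ValueError("position outside the sketched layout")

def to_input_value(char: str) -> int:
    """Convert the character to the physical-keyboard key value (first input value)."""
    return KEYCODES[char]

print(key_at(45, 10))            # tap on the second key of row 0 -> 'w'
print(to_input_value("w"))       # -> 17
```

Only the position is sent over the wire in some variants; in others the peripheral resolves the character itself and sends it in the first command. Either way the projection-side conversion ends at a physical key value like the one returned here.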
In one possible example, the input subsystem comprises a remote control input subsystem, the receiving a first command sent by the peripheral control device and converting the first command into a first input value, comprising: acquiring a first command of the peripheral control equipment, wherein the first command comprises a remote control command; and converting the remote control command into a first input value corresponding to an entity remote control key value in the remote control input subsystem.
When the input equipment subsystem of the intelligent projection equipment is determined to be the remote controller input subsystem according to the selection instruction sent by the peripheral control equipment, the virtual remote controller key information in the peripheral control equipment can be sent to the intelligent projection equipment, and after the intelligent projection equipment receives the virtual remote controller key information, the virtual remote controller key information is converted into the key value corresponding to the entity remote controller.
In this example, the intelligent projection device can determine the input value of the remote controller input system in the intelligent projection device based on the first command of the virtual remote controller in the peripheral control device, so that the flexibility of virtual input and control of the intelligent projection device is realized, and the accuracy and the effectiveness are improved.
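The remote-controller case is the same conversion with a smaller key set. A hedged sketch (the command names are assumptions; the numeric values follow the Linux input-event-codes convention, e.g. KEY_UP = 103, which stands in for whatever the patented system actually uses):

```python
# Convert a virtual remote-controller command into the key value of the
# corresponding physical remote-controller key.
REMOTE_KEYMAP = {
    "up": 103, "down": 108, "left": 105, "right": 106,
    "ok": 28,       # mapped to KEY_ENTER in this sketch
    "back": 158,    # KEY_BACK
}

def remote_input_value(command: str) -> int:
    """First command (remote control command) -> first input value."""
    if command not in REMOTE_KEYMAP:
        raise ValueError(f"unsupported remote command: {command}")
    return REMOTE_KEYMAP[command]

print(remote_input_value("up"))   # -> 103
```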
In one possible example, the performing a target function according to the first input value includes: writing the first input value to the input device subsystem; inquiring an input process corresponding to entity input equipment in the input equipment subsystem according to the first input value; and importing the first input value into the input flow to realize a target function.
After the first input value is written into the input equipment subsystem, it traverses the standard input flow as if it were an input value from the entity input equipment, and triggers the corresponding function. For example, as shown in fig. 4, which is a schematic diagram of virtual input and standard input of a remote controller, a mobile phone or other peripheral control device sends a first command to the intelligent projection device. A command receiver of the intelligent projection device receives the first command and converts it into a first input value, i.e., an analog key value, and then writes the first input value into the remote controller input subsystem of the projector. After the analog key value is written into the remote controller input subsystem, it follows the standard key process as a key value of the projector's remote controller, and the key function is implemented.
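The flow of fig. 4 can be sketched as a minimal illustration with assumed names; the real subsystem is an operating-system input driver, not a Python class.

```python
# Sketch of the fig. 4 flow: an analog key value written into the
# remote controller input subsystem follows the same dispatch path
# as a physical remote key press. Names and key values are assumptions.
class RemoteInputSubsystem:
    def __init__(self):
        self.handlers = {}  # key value -> key function

    def register(self, key_value, handler):
        self.handlers[key_value] = handler

    def write(self, key_value):
        # The analog key value takes the standard key process:
        # look up and run the function bound to that key value.
        return self.handlers[key_value]()

subsystem = RemoteInputSubsystem()
subsystem.register(0x1C, lambda: "confirm")  # assumed entity "OK" key

def on_first_command(command):
    # command receiver: convert the first command to an analog key value
    analog_key_value = 0x1C if command == "VIRTUAL_OK" else None
    return subsystem.write(analog_key_value)

print(on_first_command("VIRTUAL_OK"))  # confirm
```

Because the analog value enters the subsystem at the same point as a hardware key value, no application above the subsystem needs to distinguish virtual input from physical input.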
In this example, the intelligent projection device can convert the first command sent by the peripheral control device into the first input value, and the target function is realized through the implementation process of the input device subsystem, so that the accuracy and the effectiveness of virtual input are improved.
Referring to fig. 5, fig. 5 is a schematic flowchart of a virtual input method provided in an embodiment of the present application, and the virtual input method is applied to the intelligent projection apparatus shown in fig. 1, where as shown in the diagram, the virtual input method includes:
s501, connecting the intelligent projection equipment with peripheral control equipment;
s502, the intelligent projection equipment receives a selection instruction sent by the peripheral control equipment, wherein the selection instruction is an instruction generated by the peripheral equipment according to the detected user selection;
s503, the intelligent projection equipment determines a corresponding input equipment subsystem according to the selection instruction, wherein the input equipment subsystem is one of the input equipment subsystems, and the input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem;
s504, the intelligent projection equipment receives a first command sent by the peripheral control equipment and converts the first command into a first input value;
and S505, the intelligent projection equipment executes a target function according to the first input value.
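Steps S501 to S505 can be sketched as follows; the class, method, and subsystem names are assumptions for illustration only.

```python
# Minimal sketch of steps S501-S505 with assumed names.
class IntelligentProjectionDevice:
    SUBSYSTEMS = {"mouse", "keyboard", "remote"}

    def __init__(self):
        self.peripheral = None
        self.subsystem = None

    def connect(self, peripheral):             # S501: connect peripheral
        self.peripheral = peripheral

    def select_subsystem(self, selection):     # S502-S503: selection instruction
        if selection not in self.SUBSYSTEMS:
            raise ValueError(selection)
        self.subsystem = selection

    def receive_command(self, first_command):  # S504: convert to first input value
        # conversion is subsystem-specific; identity here for brevity
        return first_command

    def execute(self, first_input_value):      # S505: execute target function
        return f"{self.subsystem}:{first_input_value}"

dev = IntelligentProjectionDevice()
dev.connect("phone")
dev.select_subsystem("remote")
value = dev.receive_command(0x1C)
print(dev.execute(value))  # remote:28
```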
It can be seen that, in the embodiment of the application, the intelligent projection device can connect with the peripheral control device; then, an input equipment subsystem is selected according to the selection instruction, wherein the input equipment subsystem is one of a plurality of input equipment subsystems, and the plurality of input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem; next, a first command sent by the peripheral control equipment is received and converted into a first input value; and finally, a target function is executed according to the first input value. Therefore, the intelligent projection equipment can convert the received command sent by the peripheral control equipment into the analog input value corresponding to the input equipment subsystem and realize the corresponding target function, which achieves flexible virtual input and improves the efficiency and timeliness of input.
In addition, the intelligent projection equipment can determine the type of its input equipment subsystem according to the selection instruction sent by the peripheral control equipment, so that the input equipment subsystem of the intelligent projection equipment is accurately controlled and the accuracy and effectiveness are improved.
In accordance with the embodiments shown in fig. 2 and fig. 5, please refer to fig. 6, which is a schematic structural diagram of an intelligent projection device 600 provided in an embodiment of the present application. As shown in the figure, the intelligent projection device 600 includes an application processor 610, a memory 620, a communication interface 630, and one or more programs 621, where the one or more programs 621 are stored in the memory 620 and configured to be executed by the application processor 610, and the one or more programs 621 include instructions for performing the following steps:
connecting peripheral control equipment;
selecting an input equipment subsystem according to a selection instruction, wherein the input equipment subsystem is one of the input equipment subsystems, and the input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem;
receiving a first command sent by the peripheral control equipment, and converting the first command into a first input value;
and executing a target function according to the first input value.
It can be seen that, in the embodiment of the application, the intelligent projection device can connect with the peripheral control device; then, an input equipment subsystem is selected according to the selection instruction, wherein the input equipment subsystem is one of a plurality of input equipment subsystems, and the plurality of input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem; next, a first command sent by the peripheral control equipment is received and converted into a first input value; and finally, a target function is executed according to the first input value. Therefore, the intelligent projection equipment can convert the received command sent by the peripheral control equipment into the analog input value corresponding to the input equipment subsystem and realize the corresponding target function, which achieves flexible virtual input and improves the efficiency and timeliness of input.
In one possible example, in the aspect of selecting an input device subsystem according to a user instruction, the instructions in the program are specifically configured to: receiving a selection instruction sent by the peripheral control equipment, wherein the selection instruction is generated by the peripheral equipment according to the detected user selection; and determining a corresponding input equipment subsystem according to the selection instruction.
In one possible example, in a case where the input subsystem includes a mouse input subsystem, in terms of receiving a first command sent by the peripheral control device and converting the first command into a first input value, the instructions in the program are specifically configured to perform the following operations: acquiring a first command sent by the peripheral control equipment, wherein the first command comprises a first cursor position; converting the first cursor position to a first input value, the first input value being used to determine a second cursor position in the smart projection device.
In one possible example, in said converting the first cursor position to a first input value, the instructions in the program are specifically configured to: determining first cursor position coordinates (x0, y0) from the first cursor position; obtaining a length L1 and a width W1 of a display area of the intelligent projection device; determining a length L2 and a width W2 of a cursor movable region of the peripheral control device according to the length L1 and the width W1 of the display area; determining a display scale factor k according to the length L1 and width W1 of the display area and the length L2 and width W2 of the cursor movable region; and determining, according to the display scale factor k and the first cursor position coordinates (x0, y0), a second cursor position corresponding to the first input value, wherein the second cursor position is (k·x0, k·y0).
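As a minimal numeric sketch of this mapping: the application does not give a formula for k, so taking k = L1/L2, with the two regions assumed to share an aspect ratio, is an assumption.

```python
# Sketch of the cursor mapping (x0, y0) -> (k*x0, k*y0).
# Assumption: k = L1 / L2, and the display area and the cursor
# movable region have the same aspect ratio (L1/L2 == W1/W2).
def map_cursor(x0, y0, L1, W1, L2, W2):
    k = L1 / L2  # assumed definition of the display scale factor
    assert abs(L1 / L2 - W1 / W2) < 1e-9, "regions assumed to share aspect ratio"
    return (k * x0, k * y0)

# phone movable region 960x540 mapped onto a 1920x1080 projection
print(map_cursor(100, 50, 1920, 1080, 960, 540))  # (200.0, 100.0)
```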
In one possible example, where the input subsystem comprises a keyboard input subsystem, the instructions in the program are specifically configured to, in receiving a first command sent by the peripheral control device and converting the first command into a first input value: acquiring a first command sent by the peripheral control equipment, wherein the first command comprises a first key value, and the first key value is determined according to the position of a virtual keyboard in a display area of the peripheral control equipment; and converting the first key value into a first input value corresponding to an entity keyboard key value in the keyboard input subsystem.
In one possible example, where the input subsystem includes a remote control input subsystem, the instructions in the program are specifically configured to, in receiving a first command sent by the peripheral control device and converting the first command into a first input value: acquiring a first command of the peripheral control equipment, wherein the first command comprises a remote control command; and converting the remote control command into a first input value corresponding to an entity remote control key value in the remote control input subsystem.
In one possible example, in said performing a target function according to said first input value, the instructions in said program are specifically configured to perform the following operations: writing the first input value to the input device subsystem; inquiring an input process corresponding to entity input equipment in the input equipment subsystem according to the first input value; and importing the first input value into the input flow to realize a target function.
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It will be appreciated that the smart projection device, in order to implement the above-described functions, includes corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments provided herein can be implemented as hardware, or as a combination of hardware and computer software. Whether a function is performed as hardware or as computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the functional units may be divided according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 7 is a block diagram showing functional units of a virtual input device 700 according to an embodiment of the present application. The virtual input apparatus 700 is applied to a smart projection device comprising a plurality of input device subsystems, the virtual input apparatus comprising a processing unit 701 and a communication unit 702, wherein,
the processing unit 701 is configured to connect to a peripheral control device through the communication unit 702; selecting an input equipment subsystem according to a selection instruction, wherein the input equipment subsystem is one of the input equipment subsystems, and the input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem; receiving a first command sent by the peripheral control device through the communication unit 702, and converting the first command into a first input value; and executing a target function according to the first input value.
The virtual input device 700 may further include a storage unit 703 for storing program codes and data of the smart projection apparatus. The processing unit 701 may be a processor, the communication unit 702 may be an internal communication interface, and the storage unit 703 may be a memory.
It can be seen that, in the embodiment of the application, the intelligent projection device can connect with the peripheral control device; then, an input equipment subsystem is selected according to the selection instruction, wherein the input equipment subsystem is one of a plurality of input equipment subsystems, and the plurality of input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem; next, a first command sent by the peripheral control equipment is received and converted into a first input value; and finally, a target function is executed according to the first input value. Therefore, the intelligent projection equipment can convert the received command sent by the peripheral control equipment into the analog input value corresponding to the input equipment subsystem and realize the corresponding target function, which achieves flexible virtual input and improves the efficiency and timeliness of input.
In one possible example, in terms of the selecting an input device subsystem according to a user instruction, the processing unit 701 is specifically configured to: receiving a selection instruction sent by the peripheral control equipment, wherein the selection instruction is generated by the peripheral equipment according to the detected user selection; and determining a corresponding input equipment subsystem according to the selection instruction.
In a possible example, in a case that the input subsystem includes a mouse input subsystem, in terms of receiving a first command sent by the peripheral control device and converting the first command into a first input value, the processing unit 701 is specifically configured to: acquire a first command sent by the peripheral control equipment, wherein the first command comprises a first cursor position; and convert the first cursor position to a first input value, the first input value being used to determine a second cursor position in the smart projection device.
In one possible example, in terms of the converting the first cursor position into the first input value, the processing unit 701 is specifically configured to: determine first cursor position coordinates (x0, y0) from the first cursor position; obtain a length L1 and a width W1 of a display area of the intelligent projection device; determine a length L2 and a width W2 of a cursor movable region of the peripheral control device according to the length L1 and the width W1 of the display area; determine a display scale factor k according to the length L1 and width W1 of the display area and the length L2 and width W2 of the cursor movable region; and determine, according to the display scale factor k and the first cursor position coordinates (x0, y0), a second cursor position corresponding to the first input value, wherein the second cursor position is (k·x0, k·y0).
In a possible example, in a case that the input subsystem includes a keyboard input subsystem, in terms of receiving a first command sent by the peripheral control device and converting the first command into a first input value, the processing unit 701 is specifically configured to: acquire a first command sent by the peripheral control equipment, wherein the first command comprises a first key value, and the first key value is determined according to the position of a virtual keyboard in a display area of the peripheral control equipment; and convert the first key value into a first input value corresponding to an entity keyboard key value in the keyboard input subsystem.
In a possible example, in an aspect that the input subsystem includes a remote controller input subsystem, the receiving a first command sent by the peripheral control device, and converting the first command into a first input value, the processing unit 701 is specifically configured to: acquiring a first command of the peripheral control equipment, wherein the first command comprises a remote control command; and converting the remote control command into a first input value corresponding to an entity remote control key value in the remote control input subsystem.
In one possible example, in terms of performing the target function according to the first input value, the processing unit 701 is specifically configured to: writing the first input value to the input device subsystem; inquiring an input process corresponding to entity input equipment in the input equipment subsystem according to the first input value; and importing the first input value into the input flow to realize a target function.
It can be understood that, since the method embodiment and the apparatus embodiment are different presentation forms of the same technical concept, the content of the method embodiment portion in the present application should be synchronously adapted to the apparatus embodiment portion, and is not described herein again.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, the computer program enables a computer to execute part or all of the steps of any one of the methods as described in the above method embodiments, and the computer includes an intelligent projection device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising the intelligent projection device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application, or the part thereof contributing to the prior art, may be embodied in whole or in part in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the above-mentioned methods of the embodiments of the present application. The aforementioned memory includes: a U-disk, a Read-only memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash memory disks, Read-only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. A virtual input method applied to an intelligent projection device, wherein the intelligent projection device comprises a plurality of input device subsystems, and the method comprises the following steps:
connecting peripheral control equipment;
selecting an input equipment subsystem according to a selection instruction, wherein the input equipment subsystem is one of the input equipment subsystems, and the input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem;
receiving a first command sent by the peripheral control equipment, and converting the first command into a first input value;
and executing a target function according to the first input value.
2. The method of claim 1, wherein selecting an input device subsystem according to a user instruction comprises:
receiving a selection instruction sent by the peripheral control equipment, wherein the selection instruction is generated by the peripheral equipment according to the detected user selection;
and determining a corresponding input equipment subsystem according to the selection instruction.
3. The method of claim 2, wherein the input subsystem comprises a mouse input subsystem, and wherein receiving a first command sent by the peripheral control device and converting the first command to a first input value comprises:
acquiring a first command sent by the peripheral control equipment, wherein the first command comprises a first cursor position;
converting the first cursor position to a first input value, the first input value being used to determine a second cursor position in the smart projection device.
4. The method of claim 3, wherein converting the first cursor position to a first input value comprises:
determining first cursor position coordinates (x0, y0) from the first cursor position;
obtaining a length L1 and a width W1 of a display area of the intelligent projection device;
determining a length L2 and a width W2 of a cursor movable region of the peripheral control device according to the length L1 and the width W1 of the display area;
determining a display scale factor k according to the length L1 and the width W1 of the display area and the length L2 and the width W2 of the cursor movable region;
and determining, according to the display scale factor k and the first cursor position coordinates (x0, y0), a second cursor position corresponding to the first input value, wherein the second cursor position is (k·x0, k·y0).
5. The method of claim 2, wherein the input subsystem comprises a keyboard input subsystem, and wherein receiving a first command sent by the peripheral control device and converting the first command to a first input value comprises:
acquiring a first command sent by the peripheral control equipment, wherein the first command comprises a first key value, and the first key value is determined according to the position of a virtual keyboard in a display area of the peripheral control equipment;
and converting the first key value into a first input value corresponding to an entity keyboard key value in the keyboard input subsystem.
6. The method of claim 2, wherein the input subsystem comprises a remote control input subsystem, and wherein receiving a first command sent by the peripheral control device and converting the first command to a first input value comprises:
acquiring a first command of the peripheral control equipment, wherein the first command comprises a remote control command;
and converting the remote control command into a first input value corresponding to an entity remote control key value in the remote control input subsystem.
7. The method of any of claims 1-6, wherein performing a target function based on the first input value comprises:
writing the first input value to the input device subsystem;
inquiring an input process corresponding to entity input equipment in the input equipment subsystem according to the first input value;
and importing the first input value into the input flow to realize a target function.
8. A virtual input apparatus for use in an intelligent projection device, the intelligent projection device comprising a plurality of input device subsystems, the virtual input apparatus comprising a processing unit and a communication unit, wherein,
the processing unit is used for connecting the peripheral control equipment through the communication unit; selecting an input equipment subsystem according to a selection instruction, wherein the input equipment subsystem is one of the input equipment subsystems, and the input equipment subsystems comprise a mouse input subsystem, a keyboard input subsystem and a remote controller input subsystem; receiving a first command sent by the peripheral control equipment through the communication unit, and converting the first command into a first input value; and executing a target function according to the first input value.
9. An intelligent projection device comprising a processor, a memory, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-7.
10. A computer-readable storage medium, characterized in that a computer program for electronic data exchange is stored, wherein the computer program causes a computer to perform the method according to any one of claims 1-7.
CN201910947895.2A 2019-09-29 2019-09-29 Virtual input method and related device Active CN110850963B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910947895.2A CN110850963B (en) 2019-09-29 2019-09-29 Virtual input method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910947895.2A CN110850963B (en) 2019-09-29 2019-09-29 Virtual input method and related device

Publications (2)

Publication Number Publication Date
CN110850963A true CN110850963A (en) 2020-02-28
CN110850963B CN110850963B (en) 2023-12-12

Family

ID=69596400

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910947895.2A Active CN110850963B (en) 2019-09-29 2019-09-29 Virtual input method and related device

Country Status (1)

Country Link
CN (1) CN110850963B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080120448A1 (en) * 2006-11-21 2008-05-22 Microsoft Corporation Remote mouse and keyboard using bluetooth
US20090271710A1 (en) * 2008-04-23 2009-10-29 Infocus Corporation Remote On-Screen Display Control
CN103596028A (en) * 2013-11-25 2014-02-19 乐视致新电子科技(天津)有限公司 Method and device for controlling smart television
CN106534558A (en) * 2016-11-25 2017-03-22 重庆杰夫与友文化创意有限公司 Method and device for controlling projector


Also Published As

Publication number Publication date
CN110850963B (en) 2023-12-12

Similar Documents

Publication Publication Date Title
KR102084633B1 (en) Method for screen mirroring, and source device thereof
CN110597474B (en) Information processing method and electronic equipment
US9261995B2 (en) Apparatus, method, and computer readable recording medium for selecting object by using multi-touch with related reference point
KR102056175B1 (en) Method of making augmented reality contents and terminal implementing the same
US9582094B2 (en) Information processing device, display device with touch panel, information processing method, and program
US10705649B2 (en) Pressure touch control method and electronic device
CN108664475A (en) Translate display methods, device, mobile terminal and storage medium
CN114816208A (en) Touch control method and device
CN107734183A (en) A kind of method, storage medium and the mobile terminal of one-handed performance mobile terminal
WO2021027485A9 (en) Information processing method and apparatus, storage medium, and electronic device
KR101690656B1 (en) Method and apparatus for generating media signal
CN110244884B (en) Desktop icon management method and terminal equipment
US20170205980A1 (en) Method and an apparatus for providing a multitasking view
EP3441865B1 (en) Electronic device for storing user data, and method therefor
US20140223298A1 (en) Method of editing content and electronic device for implementing the same
US20140164186A1 (en) Method for providing application information and mobile terminal thereof
CN105518634B (en) The method, apparatus and recording medium interacted with exterior terminal
CN112218134A (en) Input method and related equipment
CN111104236A (en) Paste control method and electronic equipment
CN107959932B (en) Method and device for processing wireless access point information and computer readable storage medium
CN113282214A (en) Stroke rendering method, device, storage medium and terminal
KR20190056523A (en) System and method for synchronizing display of virtual reality content
CN111090529A (en) Method for sharing information and electronic equipment
CN103377093A (en) Multimedia synchronizing method for multiple-system device and electronic device
KR20150001891A (en) electro device for sharing question message and method for controlling thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant