CN106055108B - Virtual touch screen control method and system - Google Patents

Virtual touch screen control method and system

Info

Publication number
CN106055108B
CN106055108B (application CN201610405995.9A)
Authority
CN
China
Prior art keywords
touch screen
virtual touch
control device
simulation area
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201610405995.9A
Other languages
Chinese (zh)
Other versions
CN106055108A (en)
Inventor
张素丰 (Zhang Sufeng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Shingyun Technology Co ltd
Original Assignee
Beijing Shingyun Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Shingyun Technology Co ltd
Priority to CN201610405995.9A
Publication of CN106055108A
Application granted
Publication of CN106055108B
Legal status: Expired - Fee Related
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 - Hand-worn input/output arrangements, e.g. data gloves
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 - Indexing scheme relating to G06F3/01
    • G06F2203/012 - Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a method and a system for operating a virtual touch screen. The method comprises: acquiring the position of a control device in the virtual touch screen; acquiring the motion trajectory of the control device; generating an operation instruction of the control device on the virtual touch screen based on that position and motion trajectory; and executing the operation instruction. A simulation area of the virtual touch screen is set and displayed within the viewfinding range of a camera; the camera scans the simulation area for the identification image of the control device; the position of the control device in the simulation area is determined from the identification image; the operation instruction is generated from that position and the device's motion trajectory and then executed. As a result, data processing is fast, the virtual touch screen can be controlled without the help of external equipment, and the technical effect of improving the user experience is achieved.

Description

Virtual touch screen control method and system
Technical Field
The invention relates to the technical field of virtual reality, in particular to a method and a system for controlling a virtual touch screen.
Background
Virtual Reality (VR) technology uses computer simulation to generate a three-dimensional virtual world, providing the user with simulated visual, auditory, tactile and other sensory input. The user can observe objects in the three-dimensional space in real time and without restriction, as if personally on the scene; when the user moves, the computer immediately performs the complex computation needed to return an accurate 3D image of the world, producing a sense of presence.
Augmented Reality (AR) is a technology that enhances the user's perception of the real world with information supplied by a computer system: virtual information is applied to the real world, and computer-generated virtual objects, scenes or system prompts are superimposed onto the real scene, thereby augmenting reality.
Currently, with the rapid development of virtual reality and augmented reality technologies, many VR/AR products are emerging. Existing VR/AR products have the following defects. On one hand, virtual touch screen operation is mostly based on image recognition technology, and because the volume of data in the processing pipeline is large, processing is slow, which degrades the user experience. On the other hand, many VR/AR products are operated through external devices, so completing a virtual touch screen operation requires handling that external hardware.
In sum, AR/VR products in the prior art are insufficiently user-friendly, and the user experience is poor.
Disclosure of Invention
The embodiments of the invention aim to provide a virtual touch screen control method and system in which the position of the control device in a simulation area is obtained by recognizing the position of a marker image; the position and the motion trajectory of the control device in the simulation area are then combined to generate, and execute, an operation instruction of the control device on the virtual touch screen.
According to an aspect of the present invention, there is provided a method for manipulating a virtual touch screen, including: acquiring the position of the control equipment in the virtual touch screen; acquiring a motion track of the control equipment; generating an operation instruction of the control device on the virtual touch screen based on the position and the motion track of the control device; and executing the operation instruction.
Further, the step of acquiring the position of the manipulation device in the virtual touch screen comprises: setting a simulation area of a virtual touch screen in a camera view finding range; the camera identifies an identification image of the control equipment; the position of the control device in the simulation region is determined by the identification image.
Further, the step of setting the simulation area of the virtual touch screen in the camera view range specifically includes: determining a view finding range of the camera; taking the central position of the virtual touch screen in the viewing range as the central position of a simulation area, and selecting a part of area of the virtual touch screen as the simulation area according to a preset percentage; the preset percentage is the proportion of the simulation area in the virtual touch screen area.
Further, the step of identifying the identification image of the control device by the camera comprises: the camera scans whether the identification image exists in the simulation area; if yes, identifying an identification image of the control device; if not, the simulation area continues to be scanned.
Further, the step of determining the position of the control device in the simulation area from the identification image specifically includes: using the gray values of the identification image in a reference picture as the reference, matching pixels in the image shot by the camera whose gray values equal those of the identification image, the reference picture being a picture, shot in advance, that contains the identification image; and taking the position, within the whole image, of the image formed by the matched pixels as the position of the control device in the simulation area.
Further, the step of generating an operation instruction of the manipulation device on the virtual touch screen based on the position and the motion trajectory of the manipulation device specifically includes: converting the motion trajectory into a quaternion; matching in a database based on the quaternion to obtain a corresponding instruction; and synthesizing the instruction and the position of the control device in the simulation area to obtain an operation instruction of the control device on the virtual touch screen.
Further, the step of executing the operation instruction comprises: executing an operation instruction of the control equipment on the virtual touch screen; triggering an application operation instruction on the virtual touch screen; and executing the application operation instruction.
According to another aspect of the present invention, there is provided a manipulation system of a virtual touch screen, including: the image acquisition module is connected to the central processing module and used for scanning and identifying the identification image of the control device in the simulation area of the virtual touch screen; the display module is connected to the central processing module and used for setting and displaying the simulation area of the virtual touch screen in the view finding range of the image acquisition module; the central processing module is used for determining the position of the control device in the simulation area according to the identification image and generating an operation instruction of the control device on the virtual touch screen based on the position and the motion track of the control device; and the control equipment is provided with an identification image and generates motion trail data in the motion process to send to the central processing module.
Further, the central processing module executes an operation instruction of the control device on the virtual touch screen, and triggers and executes an application operation instruction on the virtual touch screen by executing the operation instruction of the control device.
Further, the display module includes: a viewing range determining unit for determining a viewing range of the camera; the simulation area selection module is used for taking the central position of the virtual touch screen in the viewing range as the central position of the simulation area, and selecting a part of area of the virtual touch screen as the simulation area according to a preset percentage; the preset percentage is the proportion of the simulation area in the virtual touch screen area; and the display unit is used for displaying the simulation area.
Further, the central processing module includes: the first matching unit is used for matching pixel points with the same gray value as the identification image in the image shot by the camera by taking the gray value of the identification image in the reference image as a reference; the reference picture is a picture which is shot in advance and contains an identification image; and the composition unit is used for taking the position of an image which is matched in the reference image and is composed of pixel points with the same gray value as the identification image in the whole image as the position of the control equipment in the simulation area.
Further, the central processing module further comprises: the conversion unit is used for converting the motion trail into a quaternion; the second matching unit is used for matching in a database based on the quaternion to obtain a corresponding instruction; and the synthesis unit is used for synthesizing the instruction and the position of the control device in the simulation area to obtain an operation instruction of the control device on the virtual touch screen.
By setting and displaying a simulation area of the virtual touch screen within the camera's viewfinding range, scanning and recognizing the identification image of the control device in that area with the camera, determining the position of the control device in the simulation area from the identification image, generating an operation instruction of the control device on the virtual touch screen from that position and the device's motion trajectory, and executing the instruction, the invention enables control of virtual planar media through the virtual touch screen. At the same time, content from existing flat-screen devices can be ported into virtual/augmented reality devices, bridging existing planar technology to 3D scene rendering. This solves the prior-art problems of slow data processing, caused by large data volumes, degrading the user experience, and of insufficient user-friendliness caused by dependence on external equipment: data processing is fast, the virtual touch screen can be controlled without external equipment, and the technical effect of improving the user experience is achieved.
Drawings
Fig. 1 is a schematic flowchart of a method for controlling a virtual touch screen according to a first embodiment of the present invention;
FIG. 2 is a schematic flow chart diagram according to a second embodiment of the present invention;
FIG. 3 is a schematic flow chart of a third embodiment of the present invention;
FIG. 4 is a schematic flow chart of an implementation of a fourth embodiment of the present invention;
FIG. 5 is a schematic flow chart diagram of an implementation of example five of the present invention;
FIG. 6 is a schematic flow chart of an implementation of example six of the present invention;
FIG. 7 is a schematic flow chart diagram illustrating an implementation of example seven of the present invention;
fig. 8 is a schematic diagram of module connections of an operating system of a virtual touch screen according to an eighth embodiment of the present invention;
FIG. 9 is a schematic view of an internal module connection of a display module according to the ninth embodiment of the present invention;
fig. 10 is a schematic view showing another internal module connection of a display module according to a tenth embodiment of the present invention;
FIG. 11 is a schematic diagram of an internal module connection of a CPU module according to an eleventh embodiment of the present invention;
FIG. 12 is a schematic diagram of another internal module connection of a CPU module according to a twelfth embodiment of the present invention;
fig. 13 is a schematic diagram of connection of internal modules of the control device in the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings in conjunction with the following detailed description. It should be understood that the description is intended to be exemplary only, and is not intended to limit the scope of the present invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
Fig. 1 is a flowchart illustrating a method for manipulating a virtual touch screen according to a first embodiment of the present invention.
As shown in fig. 1, a method for manipulating a virtual touch screen includes:
s101, acquiring the position of the control equipment in the virtual touch screen;
in the specific implementation of step S101, the control device is a device capable of controlling the virtual touch screen, and its position in the virtual touch screen is obtained according to where the device is located on the screen. The control device may be a ring or any other device capable of controlling the virtual touch screen.
S102, obtaining a motion track of the control equipment;
in the specific implementation of step S102, the motion trajectory of the control device refers to trajectory data of the motion of the control device on the virtual touch screen, and the motion trajectory of the control device can be obtained according to the motion trajectory data.
S103, generating an operation instruction of the control device on the virtual touch screen based on the position and the motion track of the control device;
in a specific implementation of step S103, an operation instruction at the corresponding position on the virtual touch screen is obtained from the position of the control device and its motion trajectory, for example a click or move operation at that location on the virtual touch screen.
And S104, executing the operation instruction.
After the operation instruction on the virtual touch screen is generated, the operation instruction is executed, so that the corresponding operation effect is obtained.
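For illustration only, the four steps S101-S104 can be sketched as a single pass of a control loop. Every callable name below is a hypothetical placeholder standing in for the components described later in the embodiments, not a name taken from the patent:

```python
def run_virtual_touchscreen(get_position, get_trajectory, match_instruction, execute):
    """One pass of the manipulation method (steps S101-S104).

    All four callables are illustrative placeholders:
      get_position()       -> (x, y) of the control device in the screen   (S101)
      get_trajectory()     -> raw motion-trajectory data                   (S102)
      match_instruction(t) -> gesture name, e.g. 'click' or 'move'         (S103)
      execute(op)          -> perform the synthesized operation            (S104)
    """
    position = get_position()                        # S101
    trajectory = get_trajectory()                    # S102
    # S103: synthesize the matched gesture and the position into one
    # operation instruction on the virtual touch screen.
    operation = (match_instruction(trajectory), position)
    execute(operation)                               # S104
    return operation
```

In use, the device would run this pass repeatedly; here a single pass with stub callables suffices to show the data flow.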
Fig. 2 shows a schematic flow chart of a method according to a second embodiment of the present invention.
As shown in fig. 2, the step S101 of acquiring the position of the manipulation device in the virtual touch screen includes:
s201, setting a simulation area of a virtual touch screen in a camera view finding range;
in the specific implementation of step S201, within the camera's viewfinding range the resolution of the virtual touch screen may be low in some areas, so the remaining area, excluding those parts, is selected as the simulation area, that is, the area in which the identification image is recognized.
S202, identifying an identification image of the control equipment by the camera;
the camera is arranged at the glasses end of the intelligent helmet display device, and scans the simulation area in the framing range. The identification image sets up on control device, and control device can be for the ring, and the ring is worn on the finger, and the identification image on the ring must be towards the camera direction, and is just to the camera direction better to the identification image is scanned by the camera. Of course, the control device may be other devices. The smart head-mounted display device may be a VR headset or AR glasses.
S203, determining the position of the manipulation device in the simulation area through the identification image.
Since the identification image is located on the control device, the position of the identification image in the simulation area is the position of the control device in the simulation area.
Fig. 3 shows a schematic flow chart of a method according to a third embodiment of the present invention.
As shown in fig. 3, the step S201 of setting the simulation area of the virtual touch screen in the camera view range specifically includes:
s301, determining a view finding range of the camera;
the viewing range is the camera viewing range, the viewing range is larger than or equal to the range of the simulation area, the virtual touch screen and the simulation area are both mapping of the range of the internal area of the intelligent head-mounted display device, the simulation area is the area range where a user executes operation, and the user can operate the virtual image in the simulation area. The viewing range may be determined based on parameters of the camera.
S302, taking the center position of the virtual touch screen in the view finding range as the center position of a simulation area, and selecting a part of area of the virtual touch screen as the simulation area according to a preset percentage; the preset percentage is the proportion of the simulation area in the virtual touch screen area.
When S302 is implemented, it may alternatively be: within the camera's viewfinding range, select as the simulation area the region formed by all pixels except those edge pixels whose resolution is below a threshold.
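As an illustrative sketch only, the centered, percentage-based selection of steps S301-S302 might be computed as follows. The function name and geometry are assumptions: the patent gives no formula, and this sketch interprets the preset percentage as an area ratio.

```python
def select_simulation_area(view_w, view_h, percent):
    """Select a centered sub-rectangle of the viewfinder as the simulation area.

    `percent` is the preset proportion of the virtual touch screen's area that
    the simulation area occupies (0 < percent <= 1).  Returns the rectangle as
    (left, top, right, bottom).
    """
    # Scale each side by sqrt(percent) so the *area* ratio equals `percent`
    # (this interpretation of the preset percentage is an assumption).
    scale = percent ** 0.5
    sim_w, sim_h = view_w * scale, view_h * scale
    # Keep the simulation area centered on the viewfinder's center (S302).
    left = (view_w - sim_w) / 2
    top = (view_h - sim_h) / 2
    return (left, top, left + sim_w, top + sim_h)
```

For example, with a 640x480 viewfinder and a preset percentage of 25%, the simulation area is a 320x240 rectangle centered in the frame.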
Fig. 4 shows a schematic flow chart of a method according to a fourth embodiment of the present invention.
As shown in fig. 4, the step S202 of recognizing the identification image of the manipulation device by the camera includes:
s401, a camera scans whether an identification image exists in the simulation area;
s402, if yes, identifying an identification image of the control device;
s403, if not, continuing to scan the simulation area.
In a specific implementation of the fourth embodiment, the camera scans back and forth across the simulation area. If the identification image is scanned during this process, it is recognized, and the position of the control device in the simulation area is then determined from it; if not, scanning continues until the identification image is found. To make the identification image quick to scan, a distinctive black-and-white icon may be chosen, such as a two-dimensional code or a barcode.
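The S401-S403 scan loop can be sketched as below. `capture_frame` and `detect_marker` are assumed stand-ins for the camera capture and a marker recognizer (for instance a QR-code or barcode detector), not names from the patent; the frame bound exists only to keep this sketch terminating.

```python
def scan_for_marker(capture_frame, detect_marker, max_frames=100):
    """Keep scanning the simulation area until the identification image appears.

    `max_frames` bounds the loop for this sketch; the device itself would
    simply keep scanning until the marker is found.
    """
    for _ in range(max_frames):
        frame = capture_frame()          # S401: scan the simulation area
        marker = detect_marker(frame)
        if marker is not None:           # S402: marker found -> recognize it
            return marker
        # S403: marker not found -> continue scanning with the next frame
    return None
```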
Fig. 5 shows a schematic flow chart of a method according to a fifth embodiment of the present invention.
As shown in fig. 5, the step S203 of determining the position of the manipulation device in the simulation area by the identification image specifically includes:
s501, matching pixel points with the same gray value as the identification image in the image shot by the camera by taking the gray value of the identification image in the reference picture as a reference; the reference picture is a picture which is shot in advance and contains an identification image;
in the specific implementation of step S501, during the camera's scanning, the gray value of each pixel in the currently scanned image is compared, one by one, with the gray values of the identification image in the reference picture.
And S502, taking the position of the image formed by the matched pixel points with the same gray value as the identification image in the whole image as the position of the control equipment in the simulation area.
In the specific implementation of step S502, if pixels whose gray values match those of the identification image in the reference picture are found, the position within the whole image of the image those pixels form is taken as the position of the control device in the simulation area.
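A toy version of the gray-value matching in steps S501-S502 is sketched below. It is an illustrative simplification: real implementations would tolerate noise and scale, whereas this brute-force comparison over 2-D lists of gray values (0-255) requires an exact match.

```python
def locate_marker(frame, reference):
    """Find where the reference marker's gray values recur in the camera frame.

    `frame` and `reference` are 2-D lists of gray values.  Returns the (x, y)
    of the marker's top-left corner in the frame, or None if no exact match;
    the marker's position is taken as the control device's position (S502).
    """
    fh, fw = len(frame), len(frame[0])
    rh, rw = len(reference), len(reference[0])
    for y in range(fh - rh + 1):
        for x in range(fw - rw + 1):
            # S501: compare every pixel of the candidate window, one by one,
            # with the corresponding gray value of the reference marker.
            if all(frame[y + dy][x + dx] == reference[dy][dx]
                   for dy in range(rh) for dx in range(rw)):
                return (x, y)
    return None
```

A production system would more likely use normalized cross-correlation template matching, but the exact-match loop shows the principle the embodiment describes.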
Fig. 6 shows a schematic flow chart of a method according to a sixth embodiment of the present invention.
As shown in fig. 6, the step S103 of generating an operation instruction of the control device on the virtual touch screen based on the position and the motion trajectory of the control device specifically includes:
s601, converting the motion trail into quaternion;
here, the motion trajectory is received and converted into a quaternion; the conversion procedure itself is not detailed in this embodiment.
S602, matching in a database based on the quaternion to obtain a corresponding instruction;
the quaternion obtained in step S601 is matched in a database to obtain the corresponding instruction, which may be a click, a horizontal movement, or a vertical movement.
And S603, synthesizing the instruction and the position of the control device in the simulation area to obtain an operation instruction of the control device on the virtual touch screen.
The instruction obtained in step S602 is synthesized with the position of the control device in the simulation area, yielding the action performed by the hand at a given position in the simulation area, that is, the operation instruction on the virtual touch screen.
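Steps S601-S603 can be sketched as below. Since the patent does not describe the trajectory-to-quaternion conversion or the database, both are assumptions here: the trajectory is modeled as Euler angles converted by the standard formula, and `GESTURE_DB` is a hypothetical lookup table matched by nearest neighbour.

```python
import math

def euler_to_quaternion(roll, pitch, yaw):
    """Standard Euler-angle -> unit quaternion conversion, returned as (w, x, y, z)."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return (cr * cp * cy + sr * sp * sy,
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy)

# Hypothetical gesture database for S602: quaternion -> instruction.
GESTURE_DB = {
    euler_to_quaternion(0.0, 0.0, 0.0): "click",
    euler_to_quaternion(0.0, 0.0, math.pi / 2): "move horizontally",
    euler_to_quaternion(0.0, math.pi / 2, 0.0): "move vertically",
}

def build_operation(trajectory_angles, position):
    """S601-S603: convert the trajectory to a quaternion, match it in the
    database, and synthesize the result with the device's position."""
    q = euler_to_quaternion(*trajectory_angles)                       # S601
    # S602: nearest-neighbour match by squared distance in quaternion space.
    nearest = min(GESTURE_DB,
                  key=lambda k: sum((a - b) ** 2 for a, b in zip(k, q)))
    # S603: pair the matched instruction with the position in the simulation area.
    return (GESTURE_DB[nearest], position)
```

Nearest-neighbour matching is a design choice for this sketch: it makes the lookup tolerant of the small angular noise a real ring-worn sensor would produce.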
Fig. 7 shows a schematic flow chart of a seventh method according to an embodiment of the present invention.
As shown in fig. 7, the step S104 of executing the operation instruction includes:
s701, executing an operation instruction of the control equipment on the virtual touch screen;
s702, triggering an application operation instruction on the virtual touch screen;
and S703, executing the application operation instruction.
After the operation instruction of the control device on the virtual touch screen is acquired, it is executed. Executing the control device's operation instruction triggers the application operation instruction on the virtual touch screen, which is then executed, finally completing control of the virtual touch screen.
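The S701-S703 execution chain can be sketched as a small dispatcher. The application registry and all names here are hypothetical; the patent only specifies that the device instruction triggers, and is followed by, the application instruction.

```python
class VirtualTouchScreen:
    """Sketch of the S701-S703 execution chain with a hypothetical app registry."""

    def __init__(self):
        self.apps = {}   # app name -> handler turning a device instruction into an app instruction
        self.log = []    # record of executed instructions, for inspection

    def register_app(self, name, handler):
        self.apps[name] = handler

    def execute_operation(self, app_name, instruction):
        # S701: execute the control device's operation instruction.
        self.log.append(("device", app_name, instruction))
        # S702: executing it triggers the application's own operation instruction.
        app_instruction = self.apps[app_name](instruction)
        # S703: execute the application operation instruction.
        self.log.append(("app", app_name, app_instruction))
        return app_instruction
```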
By setting and displaying a simulation area of the virtual touch screen within the camera's viewfinding range, scanning and recognizing the identification image of the control device in that area with the camera, determining the position of the control device in the simulation area from the identification image, generating an operation instruction of the control device on the virtual touch screen from that position and the device's motion trajectory, and executing the instruction, the method enables control of virtual planar media through the virtual touch screen. At the same time, content from existing flat-screen devices can be ported into virtual/augmented reality devices, bridging existing planar technology to 3D scene rendering. This solves the prior-art problems of slow data processing, caused by large data volumes, degrading the user experience, and of insufficient user-friendliness caused by dependence on external equipment: data processing is fast, the virtual touch screen can be controlled without external equipment, and the technical effect of improving the user experience is achieved.
Fig. 8 is a schematic diagram showing module connections of a virtual touch screen control system according to an eighth embodiment of the present invention.
As shown in fig. 8, a manipulation system of a virtual touch screen includes:
the image acquisition module 10 is connected to the central processing module 12 and is used for scanning and identifying the identification image of the control device in the simulation area of the virtual touch screen;
the image acquisition module 10 may be a camera that scans back and forth across the simulation area. If the identification image is scanned, it is recognized, and the position of the control device in the simulation area is determined from it; if not, the module keeps scanning until the identification image is found. To make the identification image easy to scan, a distinctive black-and-white icon may be chosen, such as a two-dimensional code or a barcode.
The display module 11 is connected to the central processing module 12 and is used for setting and displaying the simulation area of the virtual touch screen in the view range of the image acquisition module 10;
the display module selects a part of area of the virtual touch screen as a simulation area in the view finding range of the image acquisition module, and displays the simulation area.
The central processing module 12 is configured to determine a position of the control device in the simulation area according to the identification image, and generate an operation instruction of the control device on the virtual touch screen based on the position and the motion trajectory of the control device;
the position of the identification image in the simulation area is the position of the control device in the simulation area, and therefore the position of the finger on the virtual touch screen is determined. And synthesizing the position and the motion track of the finger on the virtual touch screen to obtain an operation instruction of the finger on the virtual touch screen.
And the control device 13 is provided with an identification image and generates motion track data during the motion process so as to send the motion track data to the central processing module 12.
The motion trajectory data may be detected by a motion sensor disposed on the control device and sent to the central processing module 12.
The central processing module 12 further executes an operation instruction of a control device on the virtual touch screen, and triggers and executes an application operation instruction on the virtual touch screen by executing the operation instruction of the control device.
After acquiring the operation instruction of the control device on the virtual touch screen, the central processing module 12 further executes the instruction. After the operation instruction of the control device is executed, the application operation instruction on the virtual touch screen is triggered based on the operation instruction of the control device, so that the application operation instruction is executed, and finally, the control on the virtual touch screen is completed.
Fig. 9 is a schematic diagram illustrating an internal module connection of an embodiment of a display module according to example nine of the present invention.
As shown in fig. 9, the display module 11 includes:
a finder range determining unit 111 for determining a finder range of the camera;
the viewfinding range is the camera's viewfinding range and is larger than or equal to the simulation area. Both the virtual touch screen and the simulation area map the area inside the smart head-mounted display device; the simulation area is the region in which the user performs operations, and the user can operate the virtual image within it. The range can be determined from the camera's parameters.
A simulation area selecting unit 112, configured to select a part of the area of the virtual touch screen as a simulation area according to a preset percentage, where the center of the virtual touch screen in the viewing range is used as the center of the simulation area; the preset percentage is the proportion of the simulation area in the virtual touch screen area;
a display unit 113 for displaying the simulation area.
Fig. 10 is a schematic view showing the connection of internal modules of another embodiment of a display module according to a tenth embodiment of the present invention.
As shown in fig. 10, the display module 11 includes:
a framing determination unit 114 for determining a framing range of the camera;
a simulation area selecting unit 115, configured to select as the simulation area the portion of the viewing range that excludes edge pixels of the virtual touch screen whose pixel values are below a threshold;
Since the resolution at the edge of the virtual touch screen is generally low, the identification image is difficult to recognize there; the area of the virtual touch screen excluding this edge portion is therefore selected as the simulation area.
and a display unit 116 for displaying the simulation area.
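A minimal sketch of the edge-exclusion strategy of unit 115, assuming the camera frame is available as a 2D grid of gray values (the shrinking-from-each-edge approach and the function name are illustrative, not specified by the patent):

```python
def select_simulation_area_by_threshold(frame, threshold):
    """Shrink the simulation area inward from each edge, dropping
    rows/columns whose pixel values are all below `threshold`
    (a sketch of the edge-exclusion idea; frame is a list of rows).

    Returns the retained region as (left, top, right, bottom) indices.
    """
    h, w = len(frame), len(frame[0])
    top, bottom, left, right = 0, h - 1, 0, w - 1
    # Drop dim rows from the top and bottom edges.
    while top < bottom and max(frame[top]) < threshold:
        top += 1
    while bottom > top and max(frame[bottom]) < threshold:
        bottom -= 1
    # Drop dim columns from the left and right edges.
    while left < right and max(row[left] for row in frame) < threshold:
        left += 1
    while right > left and max(row[right] for row in frame) < threshold:
        right -= 1
    return (left, top, right, bottom)
```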
Fig. 11 is a schematic diagram showing the internal module connections of the central processing module according to an eleventh embodiment of the present invention.
As shown in fig. 11, the central processing module 12 includes:
the first matching unit 121, configured to match, using the gray value of the identification image in a reference image as a reference, pixel points in the image captured by the camera whose gray values are the same as those of the identification image; the reference image is an image captured in advance that contains the identification image;
During scanning, the first matching unit 121 compares the gray value of each pixel point in the current image with the gray value of the identification image in the reference image, one by one.
and the composition unit 122, configured to take the position, within the whole image, of the image formed by the matched pixel points whose gray values are the same as those of the identification image, as the position of the control device in the simulation area.
In operation, if pixel points with the same gray values as the identification image are matched during the comparison, the position of the image formed by those pixel points within the whole image is taken as the position of the control device in the simulation area; otherwise, the pixel points are discarded.
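The gray-value comparison performed by units 121 and 122 can be sketched as a naive template match; this is an assumed, simplified realization (exact-value matching on small integer grids), not the patent's actual implementation:

```python
def locate_marker(frame, template):
    """Slide the reference identification image (`template`) over the
    camera frame and return the top-left (x, y) of the first position
    where every gray value matches exactly, or None if no match.

    Illustrative sketch; a real system would tolerate noise rather
    than require exact equality.
    """
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            if all(frame[y + dy][x + dx] == template[dy][dx]
                   for dy in range(th) for dx in range(tw)):
                return (x, y)
    return None  # marker not in view; the camera keeps scanning
```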
Fig. 12 is a schematic diagram showing the internal module connections of another embodiment of the central processing module according to a twelfth embodiment of the present invention.
As shown in fig. 12, the central processing module 12 further includes:
a conversion unit 123, configured to convert the motion trajectory into a quaternion;
The conversion unit 123 receives the motion trajectory and converts it into a quaternion; the conversion process itself is not described in detail in this embodiment.
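Although the patent leaves the conversion unspecified, one common way to express an orientation sample of the motion trajectory as a quaternion is the Euler-angle conversion below; the ZYX (yaw-pitch-roll) convention is an assumption for illustration only:

```python
import math

def euler_to_quaternion(roll, pitch, yaw):
    """Convert one orientation sample (roll, pitch, yaw in radians)
    into a unit quaternion (w, x, y, z), ZYX convention.

    Hypothetical helper: the patent does not specify how the nine-axis
    sensor's trajectory is encoded as a quaternion.
    """
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return (w, x, y, z)
```

The identity orientation (all angles zero) maps to the identity quaternion (1, 0, 0, 0).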
a second matching unit 124, configured to match the quaternion in a database to obtain the corresponding instruction;
The quaternion obtained by the conversion unit 123 is matched in the database to obtain the corresponding instruction, such as a click, a horizontal movement, or a vertical movement.
and a synthesizing unit 125, configured to synthesize the instruction with the position of the control device in the simulation area, obtaining the operation instruction of the control device on the virtual touch screen.
The instruction acquired by the second matching unit 124 is combined with the position of the control device in the simulation area (that is, the action performed by the hand at a certain position in the simulation area), yielding the operation instruction on the virtual touch screen.
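The database lookup of unit 124 and the synthesis step of unit 125 can be sketched together as follows; the database keys, instruction names, and dictionary shape are all hypothetical (the patent only states that the quaternion is matched against stored instructions such as click or move):

```python
# Hypothetical instruction database keyed by quantized gesture
# signatures derived from the quaternion trajectory.
GESTURE_DB = {
    "q_click": "click",
    "q_horizontal": "move_horizontal",
    "q_vertical": "move_vertical",
}

def build_operation(gesture_key, position):
    """Match the quaternion-encoded trajectory against the database
    (second matching unit 124) and combine the resulting instruction
    with the marker position (synthesizing unit 125) to form the
    operation instruction on the virtual touch screen.
    """
    instruction = GESTURE_DB.get(gesture_key)
    if instruction is None:
        return None  # unrecognized gesture: discard
    return {"instruction": instruction, "position": position}
```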
Fig. 13 is a schematic diagram illustrating connection of internal modules of the control device according to the embodiment of the present invention.
As shown in fig. 13, the control device in the embodiment of the present invention may include a sensor 131, a key switch 132, and a bluetooth chip 133. Wherein:
The sensor 131 may be a nine-axis sensor configured to collect the motion trajectory of the ring and send it to the bluetooth chip 133, which may convert the motion trajectory into quaternion instruction data.
The key switch 132 sends a key signal to the bluetooth chip 133, which converts the key signal into instruction data.
In addition, the ring of the embodiment of the present invention may further include a storage chip 134 for storing instruction data of the instruction database in advance. After reading the instruction data stored in the storage chip 134, the bluetooth chip 133 may determine whether the instruction data sent by the sensor 131 and the key switch 132 match it, and if so, send the corresponding instruction to the terminal.
In one embodiment, the ring may further include a power supply 135 for powering the sensor 131, the key switch 132, the bluetooth chip 133, and the storage chip 134.
The above embodiment introduces the basic structure of a single ring; by wearing the ring on the hand, the user can control the simulated virtual touch screen.
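The ring's signal flow described above (sensor and key switch feeding the bluetooth chip, which filters against the storage chip's preloaded database before sending to the terminal) can be modeled with a toy class; the class and method names are illustrative and not taken from the patent:

```python
class RingController:
    """Toy model of the ring's data flow: sensor/key-switch instruction
    data -> bluetooth chip -> (if matched against the storage chip's
    preloaded database) -> terminal. Names are illustrative only."""

    def __init__(self, stored_instructions):
        # Contents preloaded into the storage chip 134.
        self.stored = set(stored_instructions)

    def on_instruction_data(self, instruction_data):
        # The bluetooth chip 133 forwards only instruction data that
        # matches an entry in the stored instruction database.
        if instruction_data in self.stored:
            return ("send_to_terminal", instruction_data)
        return ("ignore", instruction_data)
```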
In the embodiments of the present invention, the simulation area of the virtual touch screen is set and displayed within the viewing range of the image acquisition module; the image acquisition module scans and recognizes the identification image of the control device in the simulation area; the position of the control device in the simulation area is determined from the identification image; an operation instruction of the control device on the virtual touch screen is generated based on the position and motion trajectory of the control device; and the operation instruction is executed. Operating the virtual touch screen enables control of virtual planar media, and the content of existing planar devices can be transplanted into virtual/augmented reality devices, realizing the transition of existing planarization technology to 3D scene rendering. This solves the prior-art problems of slow data processing, which degrades user experience, and of reliance on external equipment, which is insufficiently user-friendly; fast data processing and control of the virtual touch screen without external equipment are thereby achieved, improving user experience.
In the description of the present invention, it should be noted that the terms "first", "second", and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.

Claims (8)

1. A method for controlling a virtual touch screen is characterized by comprising the following steps:
the method comprises the steps of acquiring the position of a control device in a virtual touch screen, wherein the step of acquiring the position of the control device in the virtual touch screen comprises the following steps: setting a simulation area of a virtual touch screen in a camera view finding range; the camera identifies an identification image of the control equipment; determining the position of the control device in the simulation area through the identification image;
acquiring a motion track of the control equipment;
generating an operation instruction of the control device on the virtual touch screen based on the position and the motion trail of the control device, wherein the step of generating the operation instruction of the control device on the virtual touch screen based on the position and the motion trail of the control device specifically comprises the following steps: converting the motion trajectory into a quaternion; matching in a database based on the quaternion to obtain a corresponding instruction; synthesizing the instruction and the position of the control equipment in the simulation area to obtain an operation instruction of the control equipment on the virtual touch screen;
executing the operation instruction, wherein the step of executing the operation instruction comprises the following steps: executing an operation instruction of the control equipment on the virtual touch screen; triggering an application operation instruction on the virtual touch screen; and executing the application operation instruction.
2. The control method according to claim 1, wherein the step of setting the simulation area of the virtual touch screen within the viewing range of the camera is specifically:
determining a view finding range of the camera;
taking the central position of the virtual touch screen in the viewing range as the central position of a simulation area, and selecting a part of area of the virtual touch screen as the simulation area according to a preset percentage; the preset percentage is the proportion of the simulation area in the virtual touch screen area.
3. The control method according to claim 1, wherein the step of the camera recognizing the identification image of the control device comprises:
the camera scans whether the identification image exists in the simulation area;
if yes, identifying an identification image of the control device;
if not, the simulation area continues to be scanned.
4. The control method according to claim 1, wherein the step of determining the position of the control device in the simulation area by identifying the image is specifically:
matching pixel points with the same gray value as the gray value of the identification image in the image shot by the camera by taking the gray value of the identification image in the reference image as a reference; the reference picture is a picture which is shot in advance and contains an identification image;
and taking the position of the image formed by the matched pixel points which are the same as the gray value of the identification image in the reference image in the whole image as the position of the control equipment in the simulation area.
5. A control system for a virtual touch screen, comprising:
the image acquisition module (10) is connected to the central processing module (12) and is used for scanning and identifying the identification image of the control device in the simulation area of the virtual touch screen;
the display module (11) is connected to the central processing module (12) and is used for setting and displaying the simulation area of the virtual touch screen in the view range of the image acquisition module (10);
the central processing module (12) is used for determining the position of the control device in the simulation area according to the identification image and generating an operation instruction of the control device on the virtual touch screen based on the position and the motion track of the control device, and the central processing module (12) further comprises: a conversion unit (123) for converting the motion trajectory into a quaternion; the second matching unit (124) is used for matching in the database based on the quaternion to obtain a corresponding instruction; the synthesis unit (125) is used for synthesizing the instruction and the position of the control device in the simulation area to obtain an operation instruction of the control device on the virtual touch screen;
and the control device (13) is provided with an identification image and generates motion track data in the motion process to send to the central processing module (12).
6. The control system of claim 5,
the central processing module (12) further executes an operation instruction of the control device on the virtual touch screen, and triggers and executes an application operation instruction on the virtual touch screen by executing the operation instruction of the control device.
7. The control system according to claim 5, the display module (11) comprising:
a viewing range determination unit (111) for determining a viewing range of the camera;
a simulation area selection unit (112) for selecting a part of area of the virtual touch screen as a simulation area according to a preset percentage by taking the center position of the virtual touch screen in the viewing range as the center position of the simulation area; the preset percentage is the proportion of the simulation area in the virtual touch screen area;
a display unit (113) for displaying the simulation region.
8. The control system according to claim 5, the central processing module (12) comprising:
the first matching unit (121) is used for matching pixel points with the same gray value as the identification image in the image shot by the camera by taking the gray value of the identification image in the reference image as a reference; the reference picture is a picture which is shot in advance and contains an identification image;
and the composition unit (122) is used for taking the position of an image which is matched in the reference image and is composed of pixel points with the same gray value as the identification image in the whole image as the position of the control equipment in the simulation area.
CN201610405995.9A 2016-06-10 2016-06-10 Virtual touch screen control method and system Expired - Fee Related CN106055108B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610405995.9A CN106055108B (en) 2016-06-10 2016-06-10 Virtual touch screen control method and system


Publications (2)

Publication Number Publication Date
CN106055108A CN106055108A (en) 2016-10-26
CN106055108B true CN106055108B (en) 2020-11-13

Family

ID=57170676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610405995.9A Expired - Fee Related CN106055108B (en) 2016-06-10 2016-06-10 Virtual touch screen control method and system

Country Status (1)

Country Link
CN (1) CN106055108B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110597385B (en) * 2019-09-03 2021-12-21 维沃移动通信有限公司 Control method and electronic equipment
CN111324253B (en) * 2020-02-12 2021-08-03 腾讯科技(深圳)有限公司 Virtual article interaction method and device, computer equipment and storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN102306053A (en) * 2011-08-29 2012-01-04 Tcl集团股份有限公司 Virtual touch screen-based man-machine interaction method and device and electronic equipment
CN102508550A (en) * 2011-11-15 2012-06-20 深圳市中兴移动通信有限公司 Method for virtualizing input equipment by using sensor
CN103760983A (en) * 2014-01-23 2014-04-30 中国联合网络通信集团有限公司 Virtual gesture input method and gesture collecting device

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
CN101364159A (en) * 2008-09-04 2009-02-11 合肥吉天电子科技有限公司 Virtual touch screen system based on image recognition
CN101673161B (en) * 2009-10-15 2011-12-07 复旦大学 Visual, operable and non-solid touch screen system
US9829971B2 (en) * 2013-01-21 2017-11-28 Facebook, Inc. Systems and methods of eye tracking control
CN103995620A (en) * 2013-12-02 2014-08-20 深圳市云立方信息科技有限公司 Air touch system
CN103914260B (en) * 2014-03-31 2021-03-23 苏州浩辰软件股份有限公司 Control method and device for operation object based on touch screen
JP6069282B2 (en) * 2014-10-30 2017-02-01 本田技研工業株式会社 Vehicle control device
CN104461013B (en) * 2014-12-25 2017-09-22 中国科学院合肥物质科学研究院 A kind of human action reconstruct and analysis system and method based on inertia sensing unit
CN104702755A (en) * 2015-03-24 2015-06-10 黄小曼 Virtual mobile phone touch screen device and method



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201113