CN212749772U - VR keyboard and VR office device - Google Patents


Info

Publication number
CN212749772U
Authority
CN
China
Prior art keywords: keyboard, light, display, head, panel
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202021248605.XU
Other languages
Chinese (zh)
Inventor
谷逍驰
唐聚学
陈帅帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Daishi Technology Co ltd
Original Assignee
Shenzhen Daishi Technology Co ltd
Application filed by Shenzhen Daishi Technology Co ltd filed Critical Shenzhen Daishi Technology Co ltd
Priority claimed from CN202021248605.XU
Application granted; published as CN212749772U
Legal status: Active

Classifications

  • User Interface Of Digital Computer (AREA)

Abstract

The utility model relates to a VR keyboard and a VR office device. The VR keyboard comprises a keyboard, a positioning tag, transceiver sensors, and a detection chip. The positioning tag is arranged on the panel of the keyboard and marks the spatial position of the keyboard for the VR head display; the spatial position is used to instruct the VR head display to display the keyboard in the VR environment. The transceiver sensors comprise a plurality of pairs of light emitters and light receivers; each pair is arranged on two different sides of the periphery of the keyboard panel, forming an orthogonal light array over the top surface of the keycaps and detecting finger-position signals near each keycap. The detection chip is arranged on the keyboard, is electrically connected to each transceiver sensor, and scans the finger-position signals of each transceiver sensor and sends them to the host of the VR head display. The finger-position signals are used to instruct the host to control the VR head display to show the position of the fingers on the keyboard displayed in the VR environment. The purpose of markedly improving the VR display performance of the keyboard is thereby achieved.

Description

VR keyboard and VR office device
Technical Field
The application relates to the technical field of electronic products, in particular to a VR keyboard and a VR office device.
Background
With the development of electronic products, VR head displays have emerged and are gradually becoming popular, bringing people a brand-new interactive experience. At present, most VR head displays rely on handheld controllers for interaction; when text editing or web browsing is needed, controller operation is cumbersome and inefficient. When a keyboard is used instead, the user needs to see the positions of the keys and of their fingers within the display scene of the VR head display, so a VR head display is clearly difficult to use together with a keyboard. The conventional way of displaying a keyboard in VR is to use a see-through technique, such as shooting the keyboard with an external camera and mapping the captured content into the VR display scene. However, in the course of making the present invention, the inventors found that this conventional approach suffers at least from low keyboard display performance.
SUMMARY OF THE UTILITY MODEL
In view of the above, there is a need to provide a VR keyboard and a VR office device that address the above-mentioned problems in the prior art.
In order to achieve the above object, the embodiment of the present invention provides the following technical solutions:
In one aspect, an embodiment of the utility model provides a VR keyboard, comprising:
a keyboard;
a positioning tag, arranged on a panel of the keyboard and used for marking the spatial position of the keyboard for the VR head display; the spatial position is used to instruct the VR head display to display the keyboard in the VR environment;
a transceiver sensor, comprising a plurality of pairs of light emitters and light receivers, each pair being arranged on two different sides of the periphery of the panel of the keyboard, for forming an orthogonal light array over the top surface of the keycaps of the keyboard and detecting finger-position signals near each keycap;
a detection chip, arranged on the keyboard and electrically connected to each transceiver sensor, for scanning and acquiring the finger-position signals of each transceiver sensor and sending them to the host of the VR head display; the finger-position signals are used to instruct the host to control the VR head display to show the position of the fingers on the keyboard displayed in the VR environment.
In one embodiment, the light emitter is an infrared transmitting tube, and the light receiver is an infrared receiving tube;
on any side of the periphery of the panel of the keyboard, the infrared transmitting tubes and/or the infrared receiving tubes are arranged at equal intervals.
In one embodiment, the plurality of infrared transmitting tubes are arranged on two adjacent sides of the periphery of the panel of the keyboard at equal intervals, and the plurality of infrared receiving tubes corresponding to the plurality of infrared transmitting tubes one by one are arranged on the other two adjacent sides at equal intervals.
In one embodiment, the light emitter and the light receiver are mechanically connected to the panel of the keyboard by welding, snap-fitting, or a mating tube mounting bar.
In one embodiment, the positioning tag comprises a two-dimensional code tag attached to a peripheral side corner of the panel of the keyboard.
In one embodiment, the positioning tag comprises LED lamp beads or infrared lamp beads, and the LED lamp beads or the infrared lamp beads are arranged at the peripheral side corners of the panel of the keyboard and are electrically connected with a power supply source of the keyboard.
In one embodiment, the positioning tag further comprises an IMU unit electrically connected to the detection chip, and configured to output a gesture signal of the keyboard to the host through the detection chip; the gesture signal is used for indicating the host computer to control the VR head display to display the gesture of the keyboard in the VR environment.
In one embodiment, the detection chip comprises an MCU chip, an FPGA chip, a CPU chip, a GPU chip, or a Raspberry Pi chip.
In another aspect, a VR office device is also provided, comprising a VR head display and the above VR keyboard, the VR head display being communicatively connected to the detection chip of the VR keyboard.
In one embodiment, the VR head display is provided with one or more visual recognition cameras that capture positioning tags on the VR keyboard and identify the spatial location of the VR keyboard.
In one embodiment, the VR office device further includes a mouse; a positioning tag is disposed on the housing of the mouse and used for indicating the spatial position of the mouse to the VR head display.
One of the above technical solutions has the following advantages and beneficial effects:
According to the above VR keyboard and VR office device, a positioning tag is designed on the keyboard to cooperate with the VR head display, so that the VR head display can acquire the six-degree-of-freedom spatial position of the positioning tag, and hence of the keyboard. At the same time, a plurality of transceiver sensors arranged around the panel of the keyboard form an orthogonal light array over the top surface of the keycaps; when a finger approaches a keycap to perform a key operation, the orthogonal light array is partially blocked, so the coordinates at which the light array is blocked can be used to detect the position of the finger on the keyboard. The detection chip scans the output signals of the transceiver sensors in real time and, after acquiring the finger-position signals, sends them to the host of the VR head display for signal processing, so that the host can accurately identify the finger positions and control the VR head display to show them on the keyboard displayed in its VR environment. In this way, the VR keyboard can be accurately positioned and displayed in the VR environment, the positions of the operating fingers on the displayed keyboard can be accurately acquired and shown, the keyboard display picture is clear and stable, and the user's view is freed so that the user need not frequently lower their head to find the keyboard, thereby achieving the purpose of markedly improving the keyboard's VR display performance.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application or in the conventional technologies, the drawings needed in the descriptions of the embodiments or the conventional technologies are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic diagram of a VR keyboard in one embodiment;
FIG. 2 is a schematic diagram of a VR keyboard in another embodiment;
FIG. 3 is a diagram illustrating detection of finger positions by a VR keyboard in accordance with an embodiment;
FIG. 4 is a schematic top view of an embodiment of a VR keyboard;
FIG. 5 is a schematic diagram of a VR keyboard in yet another embodiment;
FIG. 6 is a schematic diagram of a VR office device in accordance with an embodiment;
FIG. 7 is a schematic diagram of applications of a VR office device and a VR display in an embodiment.
Detailed Description
To facilitate an understanding of the present application, the present application will now be described more fully with reference to the accompanying drawings. Preferred embodiments of the present application are shown in the drawings. This application may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
Spatially relative terms such as "under," "below," "beneath," "above," "over," and the like may be used herein for convenience in describing the relationship of one element or feature to another element or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements or features described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, the exemplary terms "below" and "beneath" can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, elements, and/or components, but do not preclude the presence or addition of one or more other features, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of the associated listed items.
When a traditional VR head display is used together with a keyboard, the user cannot observe the external environment and therefore knows neither the position of the keyboard nor the positions of its keys, which makes the keyboard very inconvenient to use. Existing VR head displays with a see-through function, which shoot the external environment through a camera, let the user observe the keyboard in the real environment directly. However, in the course of realizing the present application, the inventors found that in this conventional VR display technique the keyboard picture has low definition, so the keys cannot be made out accurately; in addition, the limited field of view of the VR head display forces the user to frequently lower their head to see the keyboard. Both problems amount to low keyboard display performance. In view of these technical problems, the present application provides the following technical solutions:
referring to fig. 1, the present invention provides a VR keyboard 100, which includes a keyboard 12, a positioning tag 14, a transceiver module 16 and a detection chip 18. The location tag 14 is disposed on the panel 122 of the keyboard 12 for indicating the spatial location of the keyboard 12 to the VR head. The spatial location is used to indicate to the VR head that keyboard 12 is displayed in the VR environment. The transceiver sensor 16 includes a plurality of pairs of optical transmitters 162 and optical receivers 164, and each pair of optical transmitter 162 and optical receiver 164 is disposed on two different sides of the periphery of the panel 122 of the keyboard 12, respectively, for forming an orthogonal optical array on the top surface of the key caps 124 of the keyboard 12, and detecting finger position signals approaching each key cap 124. The detecting chip 18 is disposed on the keyboard 12 and electrically connected to each of the transceiving sensors 16, for scanning and acquiring the finger position signal of each of the transceiving sensors 16 and transmitting the finger position signal to the host of the VR head display. The finger position signal is used to indicate the location of a finger on the display keyboard 12 in the VR environment that the host controls the VR head to display.
It is understood that the keyboard 12 may be the body of any type of conventional keyboard known in the art, and generally includes the bottom plate, the panel 122, the keys, the key caps 124, and the keyboard circuitry. The positioning tag 14 may be any tag that is distinguishable from the environment surrounding keyboard 12; attached to the panel 122, it serves as a visual feature from which the VR head display captures the six-degree-of-freedom position of keyboard 12. In some embodiments, the positioning tag 14 may be a barcode, a light-emitting tag, a light-absorbing tag, a magnetic tag, or another kind of tag. The positioning tags 14 may be located, but are not limited to, on any one of the peripheral corners of the panel 122, or on any pair of peripheral sides of the panel 122; for example, when two or three positioning tags 14 are used, they may be disposed symmetrically or asymmetrically on the panel 122. The positioning tag 14 may be, but is not limited to, attached directly to the panel 122 of the keyboard 12, fastened to it by screws, or integrally molded with the panel 122 to form an embedded positioning tag 14.
The transceiver sensor 16 may be an infrared or other laser transceiver pair, and detects whether an object is present in a predetermined area based on whether the emitted light is blocked. Each pair of light emitter 162 and light receiver 164 is arranged symmetrically on the peripheral sides of the panel 122 of the keyboard 12: a row of light emitters 162 is installed along one side, and a row of light receivers 164, corresponding one-to-one to those emitters, is installed along the adjacent or opposite side. Transceiver sensors 16 can be disposed on the other two sides of the panel 122 in the same manner, so that an orthogonal light array is formed on the top plane of the key caps 124 of the keyboard 12. By assigning coordinate points, such as rectangular coordinates, to each light emitter 162 and light receiver 164, the coordinates of the emitter-receiver pairs whose beams are occluded over a key cap 124 can be used to pinpoint the location of the finger.
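As a minimal sketch of this coordinate scheme (the 19 mm beam pitch and zero-based indexing below are illustrative assumptions, not taken from the patent), the indices of the blocked beams on the two orthogonal axes directly give the finger's position over the keycaps:

```python
# Hypothetical sketch of the orthogonal light-array coordinates: emitter/
# receiver pairs along each axis are indexed 0..N-1 at a fixed pitch, and
# the indices of the blocked beams give the finger's (x, y) position.

X_PITCH_MM = 19  # assumed beam spacing along the long edge (one key pitch)
Y_PITCH_MM = 19  # assumed beam spacing along the short edge

def finger_coordinates(blocked_x, blocked_y):
    """Map blocked receiver indices on the two axes to (x, y) points in mm."""
    return [(i * X_PITCH_MM, j * Y_PITCH_MM) for i in blocked_x for j in blocked_y]

# A finger blocking beam 3 on the x axis and beam 1 on the y axis:
print(finger_coordinates([3], [1]))  # [(57, 19)]
```

Note that with several fingers on the array this simple cross-product reports all intersections, including "ghost" points; resolving those would need additional logic on the host.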
It is understood that, within the transceiver sensors 16 arranged in a row along any one side of the periphery of the panel 122, light emitters 162 and light receivers 164 from different pairs may be mixed in the same row, as long as the desired orthogonal light array can be formed. In addition, the number of optical devices arranged in each row may be determined by the target accuracy for finger-position detection on the VR keyboard 100; in general, the more devices are arranged, the higher the detection accuracy.
The detection chip 18 may be any chip device in the art that has computing capability and provides the corresponding signal input/output interfaces, data interfaces, and/or wireless communication modules; it may be a general-purpose or special-purpose scanning and processing chip existing in the art. The scanning of the transceiver sensors 16 by the detection chip 18 (optionally only the outputs of the light receivers 164) may be real-time scanning, for example scanning the output signals of the transceiver sensors 16 in real time and then packaging and sending them to the host of the VR head display for processing. Alternatively it may be timed scanning: after scanning the output signal of each transceiver sensor 16 once, the chip scans again after a certain interval (e.g., any interval from tens to hundreds of milliseconds), and so on. In the timed case, if the signals obtained by two successive scans both include the finger-position signal produced when a finger is detected, this indicates that the user's finger really is operating keyboard 12; false touches and environmental interference can thus be effectively filtered out. Whether it scans all or only part of the transceiver sensors 16, the detection chip 18 can transmit the scan data to the host of the VR head display in a wired or wireless manner.
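The two-successive-scans filter described above can be sketched as follows; the `scan_fn` interface and the choice of exactly two consecutive scans are illustrative assumptions, not the patent's specification:

```python
# Debounce sketch: a beam only counts as a finger if it is blocked in
# `required` consecutive scans, filtering out brief false touches or
# environmental interference.

def debounced_blocked(scan_fn, required=2):
    """scan_fn() returns the set of beam indices currently blocked.
    Returns only the beams blocked in every one of `required` scans."""
    blocked = scan_fn()
    for _ in range(required - 1):
        blocked &= scan_fn()  # keep only beams blocked in this scan too
    return blocked

# A transient glitch on beam 7 disappears after the second scan:
scans = iter([{3, 7}, {3}])
print(debounced_blocked(lambda: next(scans)))  # {3}
```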
The host of the VR head display may be a host built into the VR head display itself, or an independent host set up outside it that provides the video source for the VR head display. The host runs a VR graphics engine existing in the art. After the spatial position of keyboard 12 acquired by the VR head display and the finger-position signals scanned by the detection chip 18 are supplied to the VR graphics engine for processing, a keyboard 12 can be simulated in the VR picture, or the real keyboard 12 can be displayed directly in the VR scene; at the same time, the key cap 124 triggered by the finger on keyboard 12 is displayed correspondingly, or the user's hand on keyboard 12 is simulated and displayed. The VR graphics engine can be selected according to the needs of the actual application scenario. In the schematic structural diagram of the VR keyboard 100 shown in fig. 1, only part of the key caps 124 is shown; the dashed elements represent elements located under, and hidden by, the solid elements, for example the detection chip 18 disposed in the internal space between the panel 122 and the bottom plate of keyboard 12. The detection chip 18 may also be disposed outside keyboard 12, such as on the outer surface of the panel 122 or of the bottom plate, which is not particularly limited in this embodiment.
In particular, the VR head display may typically be provided with a position detector to obtain the six-degree-of-freedom position of keyboard 12 in real space. For example, a visual recognition camera captures the positioning tags 14 disposed on the real keyboard 12; computer vision techniques existing in the art locate the positioning tags 14, calculate their relative positions and rotational orientations in the camera image, and compare these with the known absolute positions of the positioning tags 14 on the hardware, so that the current position and orientation of keyboard 12 relative to the camera can be calculated and a virtual keyboard simulated for display in the VR environment. Alternatively, an infrared transmitter may emit infrared light toward the positioning tags 14 on keyboard 12 (which may correspondingly be infrared receivers), capturing the position of keyboard 12 through the infrared absorption intensity; or an ultrasonic transmitter may emit ultrasonic waves toward the positioning tags 14 (correspondingly, ultrasound-absorbing tags), capturing the position through the ultrasonic absorption intensity; or a magnetic field emitter may emit a magnetic field toward the positioning tags 14 (correspondingly, magnetic-field-absorbing tags), capturing the position through the magnetic absorption intensity. The specific manner of capturing the spatial position of keyboard 12 may be determined by the type of positioning tag 14 used, the type of position detector on the VR head display, and the conventional positioning technology adopted.
When a user operates the real keyboard 12, a finger approaching a key to trigger it first enters the orthogonal light array and blocks the light emitted by the light emitter 162 at the position of that key cap 124, so the light receiver 164 paired with the blocked light emitter 162 cannot receive the light signal transmitted from the opposite side; the output signal generated as a result is the finger-position signal. After the detection chip 18 obtains the output signals of the light receivers 164 by scanning, it sends the acquired finger-position signals to the host of the VR head display for processing. Using the output signals of the two orthogonal sets of light receivers 164, the host can accurately locate the coordinates x and y of the blocked position and thus judge which key cap 124 the finger is on. Having obtained the finger position, the host outputs the corresponding video data, so that the VR head display shows the finger position on the keyboard 12 displayed in the VR environment as a prompt to the user; when the user presses a key on the real keyboard 12, the corresponding key on the displayed keyboard 12 is pressed simultaneously.
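The host's coordinate-to-keycap judgment can be sketched with a made-up layout fragment; the 19 mm key pitch and the three-row layout below are illustrative assumptions, not the patent's data:

```python
# Map a blocked-beam coordinate (in mm over the panel) to the keycap under it.
KEY_PITCH = 19                                   # assumed key pitch in mm
ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]    # hypothetical layout fragment

def keycap_at(x_mm, y_mm):
    """Return the keycap label at a blocked (x, y) coordinate."""
    row = ROWS[min(y_mm // KEY_PITCH, len(ROWS) - 1)]   # which row the y beam hits
    col = min(x_mm // KEY_PITCH, len(row) - 1)          # which column the x beam hits
    return row[col]

print(keycap_at(40, 5))   # third key of the top row -> 'E'
```

A real keyboard has staggered rows and keys of different widths, so per-key bounding boxes rather than a uniform grid would be used in practice.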
The display of the finger position on the keyboard 12 shown in the VR environment may be, but is not limited to, the following: for example, the triggered virtual key may simply be highlighted; or a virtual hand may be simulated in the VR environment from the activated keys or fingertip positions. For example, activated keys on the left half of keyboard 12 are identified as fingers of the left hand and activated keys on the right half as fingers of the right hand. On the left hand, the keys from left to right correspond to the little finger through the index finger, with the thumb usually resting on the space bar. If fewer than four keys are activated, identification proceeds in order from the index finger outward: a single activated key is judged to be the index finger, two activated keys the index and middle fingers, and so on; the right hand is interpreted in the same way. The specific display mode can be determined by the needs of the application scenario and the display modes the VR head display supports.
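The left-hand assignment heuristic above can be sketched as follows (the right hand mirrors it); the coordinate convention and return format are illustrative assumptions:

```python
# Sketch of the left-hand finger-assignment heuristic: with fewer than four
# activated keys, fingers are assigned from the index finger outward,
# starting at the rightmost activated key on the left half of the keyboard.

FINGERS = ["index", "middle", "ring", "little"]

def left_hand_fingers(key_x_positions):
    """Assign fingers to activated keys on the left half of the keyboard.

    key_x_positions: x coordinates of activated keys, in any order.
    Returns (x, finger) pairs; the rightmost key gets the index finger.
    """
    ordered = sorted(key_x_positions, reverse=True)  # rightmost first
    return [(x, FINGERS[i]) for i, x in enumerate(ordered[:4])]

print(left_hand_fingers([10, 50, 30]))
# rightmost (50) -> index, 30 -> middle, 10 -> ring
```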
In the VR keyboard 100 above, a positioning tag 14 is configured on the keyboard 12 to cooperate with the VR head display, so that the VR head display can acquire the six-degree-of-freedom spatial position of the positioning tag 14, and hence of keyboard 12. Meanwhile, the transceiver sensors 16 disposed around the panel 122 of keyboard 12 form an orthogonal light array over the top surface of the key caps 124, so that when a finger approaches a key cap 124 for a key press, the orthogonal light array is partially blocked and the coordinates at which it is blocked can be used to detect the finger's position on keyboard 12. The detection chip 18 scans the output signals of the transceiver sensors 16 in real time and sends the acquired finger-position signals to the host of the VR head display for signal processing, so that the host can accurately identify the finger positions and control the VR head display to show them on the keyboard 12 in its VR environment. In this way, the VR keyboard 100 can be accurately positioned and displayed in the VR environment, the positions of the operating fingers on the displayed keyboard 12 can be accurately acquired and shown, the display picture of keyboard 12 is clear and stable, and the user's view is freed so that they need not frequently lower their head to find keyboard 12, thereby markedly improving the keyboard's VR display performance.
In one embodiment, the light emitter 162 is an infrared emitting tube. The light receiver 164 is an infrared receiving tube. On either side of the periphery of the panel 122 of the keyboard 12, infrared transmitting tubes and/or infrared receiving tubes are arranged at equal intervals.
It is understood that in this embodiment, infrared transmitting tubes and infrared receiving tubes are used as the transceiver sensor 16. Each infrared transmitting tube and its corresponding infrared receiving tube are electrically connected to corresponding signal pins of the detection chip 18 to implement driving and signal scanning. The infrared tubes disposed on any one side of the periphery of the panel 122 may be all transmitting tubes or all receiving tubes, or partly transmitting tubes (with their corresponding receiving tubes on the adjacent or opposite side) and partly receiving tubes (with their corresponding transmitting tubes on the adjacent or opposite side); on each side, the infrared tubes are disposed at equal intervals.
By using infrared emitter-receiver pairs as the transceiver sensor 16, the design and manufacturing cost is low and the detection reliability is high. Arranging the infrared tubes on each side at equal intervals forms an equally spaced orthogonal light array, so finger-position detection is more accurate; moreover, the equidistant coordinate distribution of the infrared tubes simplifies the signal processing of the host of the VR head display and improves the VR display efficiency of the finger position on keyboard 12.
Referring to fig. 2 and 3, in one embodiment, the infrared transmitting tubes are disposed on two adjacent sides of the periphery of the panel 122 of the keyboard 12 at equal intervals, and the infrared receiving tubes corresponding to the infrared transmitting tubes are disposed on the other two adjacent sides at equal intervals.
Optionally, in this embodiment, mounting the infrared transmitting tubes and infrared receiving tubes separately on different sides further simplifies the arrangement coordinates of the infrared tubes and forms an orthogonal light array with a consistent light-emission direction on each side and equal spacing, so that finger-position detection is more accurate; at the same time, the equidistant coordinate distribution of the infrared tubes further simplifies the signal processing of the host of the VR head display and improves the VR display efficiency of the finger position on keyboard 12. In the finger-position detection diagram shown in fig. 3, D represents the position blocked by the finger.
In one embodiment, the light emitter 162 and light receiver 164 are mechanically connected to the panel 122 of keyboard 12 by soldering, snap-fitting, or a mating tube mounting bar.
It is understood that, in this embodiment, when the light emitters 162 and light receivers 164 are mounted on the periphery of the panel 122 of keyboard 12, they may be connected and fixed by ordinary soldering, but also by snap-fitting: for example, each light emitter 162 and light receiver 164 is inserted into a pre-opened bayonet on the periphery of the panel 122 to be retained, and then glued. Alternatively, a tube mounting bar may be provided between the panel 122 and the key caps 124 of keyboard 12 as a mounting layer that carries a row of light emitters 162 and light receivers 164. The tube mounting bar may be a single annular bar, or a bar divided into two or four sections, as determined by the installation requirements of the light emitters 162 and light receivers 164 on the panel 122. Mounting seats for the individual light emitters 162 and light receivers 164 are provided on the tube mounting bar so that they are mounted on it collectively; the bar may then be welded or screwed to the peripheral side of the panel 122, or embedded in it, so that the light emitters 162 and light receivers 164 are distributed around the periphery of the panel 122, with great ease of mounting and dismounting.
Installing the light emitters 162 and light receivers 164 on the peripheral side of the panel 122 of the keyboard 12 by the above mechanical connections is flexible to manufacture and low in cost, and adds little weight to the keyboard 12, so its usage efficiency is not affected.
Referring to fig. 4, in one embodiment, the positioning tag 14 comprises a two-dimensional code label. The two-dimensional code label is attached to a peripheral corner of the panel 122 of the keyboard 12.
Alternatively, in the present embodiment, a peripheral corner of the keyboard 12 may refer to any one of its four peripheral corners (that is, the four top corners of the panel 122; for a non-rectangular panel 122, these can likewise be understood as four centrally symmetric points on its peripheral side). A two-dimensional code label may be attached to one of the four corners, to any two or three of them, or to all four, serving as visual feature points placed on the keyboard 12. The camera mounted on the VR head display can then capture the positions of these visual feature points through computer vision, calculate their relative positions and rotational orientations in the VR image, and compare them with the known absolute positions of the two-dimensional code labels on the keyboard 12. From this, the position and orientation of the keyboard 12 relative to the camera can be calculated, and a virtual keyboard can be simulated in the VR environment for display.
By using two-dimensional code labels as the positioning tags 14, the VR head display can effectively acquire the six-degree-of-freedom position of the keyboard 12 in space. After the VR head display acquires the keyboard 12 position and renders it in VR, the user can operate the keyboard 12 directly with the aid of the displayed finger positions, without frequently lowering the head to look for the keyboard 12. In addition, two-dimensional code labels are inexpensive to manufacture and use, which keeps the hardware cost of the VR keyboard 100 low.
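As a rough illustration of the computer-vision step described above (not taken from the patent; the focal length, tag size, and function name are assumptions), the distance and in-plane rotation of a tag of known physical size can be recovered from its detected corner pixels under a simple pinhole-camera model:

```python
import math

def tag_pose(corners_px, tag_size_mm, focal_px):
    """Estimate distance and in-plane rotation of a square visual tag.

    corners_px: four (x, y) pixel corners in order TL, TR, BR, BL
    tag_size_mm: known physical edge length of the tag
    focal_px: camera focal length in pixels (pinhole model)
    """
    (x0, y0), (x1, y1) = corners_px[0], corners_px[1]
    edge_px = math.hypot(x1 - x0, y1 - y0)          # apparent top-edge length
    distance_mm = focal_px * tag_size_mm / edge_px  # similar triangles
    roll_deg = math.degrees(math.atan2(y1 - y0, x1 - x0))  # tag tilt in image
    return distance_mm, roll_deg

# A 20 mm tag whose top edge spans 100 px under a 500 px focal length
# sits about 100 mm from the camera:
d, roll = tag_pose([(0, 0), (100, 0), (100, 100), (0, 100)], 20, 500)
```

A full six-degree-of-freedom solution of the kind the description refers to would use all four corners with a perspective-n-point solver; this sketch shows only the distance and roll terms.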
In one embodiment, the positioning tag 14 comprises an LED bead or an infrared bead. The LED bead or infrared bead is mounted at a peripheral corner of the panel 122 of the keyboard 12 and is electrically connected to the power supply of the keyboard 12.
Alternatively, in the present embodiment, a peripheral corner of the keyboard 12 may refer to any one of its four peripheral corners. LED beads or infrared beads (one or more, as visual recognition requires) may be installed at one of the four corners, at any two or three of them, or at all four, serving as visual feature points placed on the keyboard 12. The power supply of the keyboard 12 powers each bead so that it can light up. The camera carried by the VR head display can then capture the positions of these visual feature points through computer vision, calculate their relative positions and rotational orientations in the VR image, and compare them with the known absolute positions of the LED or infrared beads on the keyboard 12, so that the position and orientation of the keyboard 12 relative to the camera can be calculated and a virtual keyboard simulated in the VR environment for display.
By using LED beads or infrared beads as the positioning tags 14, the VR head display can likewise effectively acquire the six-degree-of-freedom position of the keyboard 12 in space, so that after the VR head display acquires the keyboard 12 position and renders it in VR, the user can operate the keyboard 12 directly with the aid of the displayed finger positions, without frequently lowering the head to look for the keyboard 12.
Referring to FIG. 5, in one embodiment, the positioning tag 14 further includes an IMU unit 20. The IMU unit 20 is electrically connected to the detection chip 18 and is configured to output an attitude signal of the keyboard 12 to the host through the detection chip 18. The attitude signal is used to instruct the host to control the VR head display to display the attitude of the keyboard 12 in the VR environment.
It will be appreciated that the IMU unit 20, also known as an inertial measurement unit, may be used to measure the attitude of an object. The number of IMU units 20 may be one, two, or more; the specific number and mounting positions may be determined according to design criteria such as the size of the keyboard 12 and the required attitude detection sensitivity and accuracy. The IMU unit 20 may be secured to the keyboard 12 by soldering, snap-fitting, or other mechanical attachment means.
Optionally, in this embodiment, the IMU unit 20 may be added to the keyboard 12 as an auxiliary means of detecting its position, so that the tilt attitude, acceleration, and similar quantities of the keyboard 12 during use are converted into corresponding attitude signals and output to the detection chip 18. The detection chip 18 sends the attitude signal to the host of the VR head display, which can then calculate data such as the tilt attitude and acceleration of the keyboard 12 from the received signal, supplementing the positioning and VR display of the keyboard 12. As a result, besides the positioning, rotation, and translation display realized from the positioning tag 14, the keyboard 12 displayed in the VR head display can also show in real time its tilting, an accidental fall, and the like, further improving the VR display performance of the keyboard.
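As an illustrative sketch (the function and constants below are assumptions, not part of the claimed design), a host could derive the static tilt attitude of the keyboard from the accelerometer portion of such an IMU signal:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Compute static pitch and roll (degrees) from accelerometer axes.

    When the keyboard is at rest, the measured acceleration is gravity,
    so its direction gives the tilt of the keyboard plane.
    """
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Lying flat, gravity falls entirely on the z axis: pitch and roll are zero.
flat = tilt_from_accel(0.0, 0.0, 9.81)
# Rolled about the x axis so gravity splits equally between y and z: roll is 45 degrees.
rolled = tilt_from_accel(0.0, 6.94, 6.94)
```

Dynamic quantities such as an accidental fall would additionally use the gyroscope rates and the magnitude of the acceleration vector, which this static sketch does not model.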
In one embodiment, the detection chip 18 comprises an MCU chip, an FPGA chip, a CPU chip, a GPU chip, or a Raspberry Pi. Optionally, in this embodiment, the detection chip 18 may be any MCU, CPU, GPU, or FPGA chip that provides the device driving and signal scanning output functions described above, or a Raspberry Pi. These devices have strong computational processing capability, rich functions, and small size; they can easily load and run the existing computing and control applications in the field to support the required functions, and their applications are convenient to update. Applying such a detection chip 18 therefore simplifies the circuit structure of the VR keyboard 100, reduces its application cost, improves the detection output efficiency, and effectively promotes the display performance of the VR keyboard.
Referring to fig. 6 and 7, in one embodiment, there is also provided a VR office device 200 comprising a VR head display 201 and the VR keyboard 100 described above. The VR head display 201 is communicatively coupled to the detection chip 18 of the VR keyboard 100.
It is to be understood that, for a specific explanation of the VR keyboard 100 in this embodiment, reference may be made to the corresponding explanations in the embodiments of the VR keyboard 100 above, which are not repeated here. Fig. 7 is a schematic diagram of an application of the VR office device 200, where A shows a user operating the VR keyboard 100, B shows the keyboard VR image displayed in the VR environment of the VR head display 201, and C shows the keyboard VR image with the fingers virtually displayed together in the VR environment. The VR head display 201 may be a glasses-type VR display or a head-mounted display in the form of a head ring.
Specifically, after the VR head display 201 obtains the spatial position of the keyboard 12 through the positioning tag 14 disposed on it, a virtual keyboard may be simulated in the VR scene for display, or the real keyboard 12 may be displayed directly in the VR scene. The positions of the user's fingers on the keyboard 12 are then detected through the orthogonal light arrays generated by the transceiving sensors 16 on the periphery of the keyboard 12, and the corresponding finger positions are displayed on the keyboard 12 in the VR scene to prompt the user. When the user presses a key on the real keyboard 12, the corresponding key on the keyboard 12 displayed in the VR environment is pressed synchronously, realizing visualization of the keyboard 12 for the user.
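As a minimal sketch of the scan-and-report cycle just described (the frame format, scan rate, and function names are assumptions, not taken from the patent):

```python
import json
import time

def scan_cycle(read_blocked_beams, send_to_host, hz=120):
    """One detection-chip style cycle: scan the light array, report to host.

    read_blocked_beams: callable returning (rows, cols) of interrupted beams
    send_to_host: callable accepting one serialized finger-position frame
    Returns the period the caller should sleep before the next scan.
    """
    period = 1.0 / hz
    rows, cols = read_blocked_beams()
    frame = json.dumps({"t": time.time(),
                        "rows": sorted(rows),
                        "cols": sorted(cols)})
    send_to_host(frame)
    return period

# Stub hardware and transport for demonstration:
sent = []
period = scan_cycle(lambda: ({3}, {7}), sent.append)
```

In the device itself the transport would be the wired or wireless link to the host of the VR head display 201, and the host would map the beam coordinates onto the displayed keyboard.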
In the VR office device 200, the positioning tag 14 on the keyboard 12 cooperates with the VR head display 201, so that the VR head display 201 can use the positioning tag 14 to acquire the six-degree-of-freedom spatial position of the keyboard 12. Meanwhile, the transceiving sensors 16 disposed around the panel 122 of the keyboard 12 form an orthogonal light array on the top surface of the key caps 124, so that when a finger approaches a key cap 124 for a key press, the orthogonal light array is partially shielded and the position of the finger on the keyboard 12 can be detected from the shielded coordinates of the array. The detection chip 18 scans the output signals of the transceiving sensors 16, acquiring the finger position signals in real time and sending them to the host of the VR head display 201 for processing, so that the host can accurately identify the finger positions and control the VR head display 201 to show them on the keyboard 12 in its VR environment. In this way, the VR keyboard 100 can be precisely positioned and displayed in the VR environment, and the finger positions during operation can be accurately acquired and displayed on the shown keyboard 12; the keyboard 12 display is clear and stable, freeing the user's view so that the user need not frequently lower the head to look for the keyboard 12, and markedly improving the VR display performance of the keyboard.
In one embodiment, the VR head display 201 is provided with one or more visual recognition cameras. The visual recognition camera captures the positioning tag 14 on the VR keyboard 100 and identifies the spatial position of the VR keyboard 100.
It can be understood that, in the present embodiment, the above positioning tag 14 is a visual feature point, such as a two-dimensional code label or a light-emitting bead. The visual recognition camera on the VR head display 201 can be communicatively connected to the host of the VR head display 201, transmitting the spatial position data of the keyboard 12 to the host, where it is converted into corresponding video data for the VR head display 201 to show the keyboard 12 in its VR scene. Specifically, the visual recognition camera mounted on the VR head display 201 captures the positions of the visual feature points through its computer vision function, calculates their relative positions and rotational orientations in the VR image, and compares them with the known absolute positions of the two-dimensional code labels on the keyboard 12, so as to calculate the position and orientation of the keyboard 12 relative to the camera and simulate a virtual keyboard in the VR environment for display.
By using the visual recognition camera carried on the VR head display 201 to identify the positioning tag 14 on the keyboard 12, the VR head display 201 can effectively acquire the six-degree-of-freedom position of the keyboard 12 in space, so that after the VR head display 201 acquires the keyboard 12 position and renders it in VR, the user can operate the keyboard 12 directly with the aid of the displayed finger positions, conveniently seeing the positions of the keyboard 12 and the fingers without frequently lowering the head to look for the keyboard 12.
In one embodiment, the VR office device 200 described above may also include a touch pad. The touch pad may be an independent input device connected separately to the host of the VR head display 201 (e.g., by wired USB, or wirelessly by WIFI or Bluetooth), or an input device integrated with the keyboard 12. When the touch pad is an independent input device, a positioning tag 14 may be disposed on its periphery to realize positioning and VR display of the touch pad.
By adding such a touch pad, the usage efficiency of the VR office device 200 can be further improved, bringing the user a more convenient operating experience.
In one embodiment, the VR office device 200 may also include a mouse. The mouse may be a wireless mouse (directly wirelessly connected to the host of the VR head display 201) or a wired mouse connected to the keyboard 12. In this embodiment, a positioning tag 14 may further be disposed on the housing of the mouse (on the side facing the palm of the user's hand) for marking the spatial position of the mouse to the VR head display 201, so that the mouse can be positioned and then displayed in the VR environment together with the VR keyboard 100 captured and displayed by the VR head display 201. Adding such a mouse can further improve the usage efficiency of the VR office device 200 and give the user a richer mode of operation.
In one embodiment, the host of the VR head display 201 may be integrated with the VR head display 201, or may be a separate computing device external to it that provides its video source. The detection chip 18 of the VR keyboard 100 may be connected to the host through a wired connection (a USB data line or another data transmission line commonly used in the art) or through a wireless connection such as WIFI or Bluetooth, as determined by the specific type of the detection chip 18 (i.e., the communication module it carries). When the detection chip 18 is connected to the host by wire, data transmission is highly real-time and interference-resistant, guaranteeing low latency and picture stability of the VR display scene to the greatest extent. When the detection chip 18 is connected to the host wirelessly, the flexibility of use between the VR keyboard 100 and the VR head display 201 is greatly improved: movement is less restricted, and the office operation space of the VR scene can be expanded.
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, any such combination should be considered within the scope of this specification as long as it contains no contradiction.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they are not to be construed as limiting the claims. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (11)

1. A VR keyboard, comprising:
a keyboard;
the positioning tag is arranged on a panel of the keyboard and is used for marking the spatial position of the keyboard to a VR head display; the spatial position is used to instruct the VR head display to display the keyboard in a VR environment;
the transceiving sensor comprises a plurality of pairs of light emitters and light receivers, each pair being respectively arranged on two different side edges of the periphery of the panel of the keyboard, and is used for forming an orthogonal light array on the top surfaces of the key caps of the keyboard and detecting finger position signals of fingers approaching the key caps;
the detection chip is arranged on the keyboard, is electrically connected to each transceiving sensor, and is used for scanning and acquiring the finger position signals of each transceiving sensor and sending them to a host of the VR head display; the finger position signals are used to instruct the host to control the VR head display to display the finger positions on the keyboard in the VR environment.
2. The VR keyboard of claim 1, wherein the light emitter is an infrared emitting tube and the light receiver is an infrared receiving tube;
on any side edge of the periphery of the panel of the keyboard, the infrared emitting tubes and/or the infrared receiving tubes are arranged at equal intervals.
3. The VR keyboard of claim 2, wherein the plurality of infrared emitting tubes are equally spaced on two adjacent sides of the periphery of the panel of the keyboard, and the plurality of infrared receiving tubes, one for each of the plurality of infrared emitting tubes, are equally spaced on the other two adjacent sides.
4. The VR keyboard of any of claims 1 to 3, wherein the light emitter and the light receiver are mechanically coupled to the panel of the keyboard by welding, snap-fitting, or a mating tube mounting bar.
5. The VR keyboard of claim 4, wherein the positioning tag comprises a two-dimensional code tag attached to a peripheral side corner of a panel of the keyboard.
6. The VR keyboard of claim 4, wherein the positioning tags include LED light beads or infrared light beads, and the LED light beads or the infrared light beads are mounted at the peripheral corners of the panel of the keyboard and electrically connected to a power supply of the keyboard.
7. The VR keyboard of claim 5 or 6, wherein the positioning tag further comprises an IMU unit electrically connected to the detection chip for outputting an attitude signal of the keyboard to the host through the detection chip; the attitude signal is used to instruct the host to control the VR head display to display the attitude of the keyboard in the VR environment.
8. The VR keyboard of claim 4, wherein the detection chip comprises an MCU chip, an FPGA chip, a CPU chip, a GPU chip, or a Raspberry Pi.
9. A VR office device comprising a VR headset and the VR keyboard of any one of claims 1 to 8, the VR headset communicatively coupled to a detection chip of the VR keyboard.
10. The VR office device of claim 9, wherein the VR head display is provided with one or more visual recognition cameras that capture positioning tags on the VR keyboard and identify a spatial location of the VR keyboard.
11. The VR office device of claim 9 or 10, further comprising a mouse, a positioning tag being disposed on a housing of the mouse for marking the spatial position of the mouse to the VR head display.
CN202021248605.XU 2020-06-30 2020-06-30 VR keyboard and VR office device Active CN212749772U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202021248605.XU CN212749772U (en) 2020-06-30 2020-06-30 VR keyboard and VR office device


Publications (1)

Publication Number Publication Date
CN212749772U true CN212749772U (en) 2021-03-19

Family

ID=75011865

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202021248605.XU Active CN212749772U (en) 2020-06-30 2020-06-30 VR keyboard and VR office device

Country Status (1)

Country Link
CN (1) CN212749772U (en)


Legal Events

Date Code Title Description
GR01 Patent grant