CN221039907U - Space positioning interaction 3D panel and interaction equipment - Google Patents

Space positioning interaction 3D panel and interaction equipment

Info

Publication number
CN221039907U
CN221039907U (Application CN202323130661.XU)
Authority
CN
China
Prior art keywords
tablet
interactive
interaction
infrared
bottom shell
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202323130661.XU
Other languages
Chinese (zh)
Inventor
罗军
黄斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangxi Kejun Industrial Co ltd
Original Assignee
Jiangxi Kejun Industrial Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangxi Kejun Industrial Co ltd filed Critical Jiangxi Kejun Industrial Co ltd
Priority to CN202323130661.XU
Application granted
Publication of CN221039907U
Legal status: Active

Landscapes

  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The utility model discloses a space positioning interaction 3D panel and interaction equipment. The space positioning interaction 3D panel comprises a panel body, and the panel body comprises a housing, a display screen, a front camera and at least three infrared positioning camera modules. At least three fixing cavities are arranged on the housing at intervals, and the at least three infrared positioning camera modules are respectively retained in the at least three fixing cavities, so that the panel body is highly integrated and the at least three infrared positioning camera modules can capture a scene of a larger range. The display screen comprises a touch screen, a polarizer, an OC screen, a backlight module and a liquid crystal light valve, and the polarizer is arranged between the liquid crystal light valve and the OC screen, so that the space positioning interaction 3D panel has a polarized 3D display function. The space positioning interaction 3D panel therefore has both polarized 3D display and space positioning interaction functions, offers more varied functions, and better meets the requirements of users.

Description

Space positioning interaction 3D panel and interaction equipment
Technical Field
The utility model relates to the technical field of virtual reality equipment, in particular to a space positioning interaction 3D (three-dimensional) tablet and interaction equipment.
Background
An interactive panel is an integrated device capable of realizing man-machine interaction. It mainly controls the content displayed on the panel through touch technology so as to provide a convenient man-machine interaction experience. Because the interactive panel integrates the functions of a television, a computer, sound equipment, a video conference terminal and the like, it can be widely applied in fields such as education and teaching, enterprise conferences and commercial display.
However, the existing interactive panel has no polarized 3D display technology and therefore cannot display 3D content for viewing, has no space positioning interaction function, cannot provide a semi-immersive virtual display space with positioning interaction, cannot perform 6DoF (six degrees of freedom) action interaction on a 3D model, and thus cannot meet the requirements of users.
Disclosure of utility model
The utility model mainly aims to provide a space positioning interaction 3D panel, so as to solve the problems that the existing interactive panel lacks polarized 3D display technology, cannot display 3D content for viewing, cannot provide a space positioning interaction function or a semi-immersive virtual display space with positioning interaction, cannot perform 6DoF action interaction on a 3D model, and cannot meet the requirements of users.
In order to achieve the above object, the spatial positioning interactive 3D tablet provided by the present utility model includes a tablet body, where the tablet body includes:
the shell is provided with at least three fixing cavities at intervals;
The display screen is arranged on the shell and comprises an OC screen, a polarizer and a liquid crystal light valve, wherein the polarizer is arranged between the liquid crystal light valve and the OC screen; and
The at least three infrared positioning camera modules are respectively limited in the at least three fixed cavities.
Optionally, the housing includes a bottom shell and a middle frame, the display screen is connected with the middle frame, the bottom shell is located on a side of the middle frame away from the display screen, and the middle frame is provided with the at least three fixing cavities.
Optionally, the middle frame includes a first fixing portion, a second fixing portion and a third fixing portion, the first fixing portion and the second fixing portion are respectively located at two corners of the tablet body, the third fixing portion connects the first fixing portion and the second fixing portion, the first fixing portion and the second fixing portion both protrude outwards from the tablet body, and the at least three infrared positioning camera modules are respectively mounted on the first fixing portion, the second fixing portion and the third fixing portion.
Optionally, the infrared positioning camera module comprises an infrared video camera and an infrared light supplementing lamp, and the infrared light supplementing lamp is close to the infrared video camera.
Optionally, the panel body further includes a support frame, where the support frame is disposed on the bottom shell and rotates relative to the bottom shell to support the bottom shell.
Optionally, the middle frame includes a mounting portion and an accommodating portion, the accommodating portion protrudes relative to the mounting portion and is provided with the at least three fixing cavities, a battery pack is fixed in an accommodating cavity between the mounting portion and the bottom shell, and the display screen is mounted on a side of the mounting portion away from the bottom shell.
Optionally, an interface is arranged on a side surface of the bottom shell, an interface small board is installed on the back of the mounting portion, and the interface small board is arranged corresponding to the interface.
Optionally, a heat dissipation hole is formed at the bottom of the bottom shell, and the heat dissipation hole is arranged to avoid the support frame.
The utility model also provides interaction equipment which comprises polarized glasses, an interaction pen and a space positioning interaction 3D panel.
According to the technical scheme, at least three fixing cavities are arranged on the shell at intervals and at least three infrared positioning camera modules are respectively arranged in the at least three fixing cavities, so that the at least three infrared positioning camera modules can be retained on the panel body through one middle frame and the degree of integration is high. Through the at least three infrared positioning camera modules, the space positioning interaction 3D panel can capture a scene of a larger range, positioning and tracking are more stable, and blind areas are smaller. The signals sent or reflected by the interactive accessories can also be captured over a larger range, and the interaction experience is better. The panel body further comprises a display screen, the display screen comprises an OC screen, a polarizer and a liquid crystal light valve, and the polarizer is arranged between the liquid crystal light valve and the OC screen. The backlight module emits light, the incident light irradiates the OC screen, the left-eye/right-eye frame images on the screen are converted into light signals, and the light signals pass through the polarizer and are emitted as polarized light of a specific direction: the left-eye image light is converted into left circularly polarized light and emitted, the left circularly polarized light is controlled by the liquid crystal light valve to pass through the left lens of the polarized glasses, and the right circularly polarized light cannot pass through the right lens because the liquid crystal light valve is closed; the right-eye image light is converted into right circularly polarized light and emitted, the right circularly polarized light is controlled by the liquid crystal light valve to pass through the right lens of the polarized glasses, and the left circularly polarized light cannot pass through the left lens because the liquid crystal light valve is closed. In the continuous sequence of frames, left frames enter only the left eye and right frames enter only the right eye, and the left-eye and right-eye images are fused into a stereoscopic image in the brain due to parallax. Therefore, the space positioning interaction 3D panel has both polarized 3D display and space positioning interaction functions, offers more varied functions, and meets the requirements of users.
Drawings
In order to more clearly illustrate the embodiments of the present utility model or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are only some embodiments of the present utility model, and other drawings may be obtained according to the structures shown in these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic perspective view of an embodiment of a spatially localized interactive 3D tablet according to the present utility model;
FIG. 2 is a schematic diagram of an exploded structure of one embodiment of a spatially localized interactive 3D tablet according to the present utility model;
FIG. 3 is a schematic diagram illustrating an exploded structure of an embodiment of a spatially localized interactive 3D tablet according to the present utility model at another viewing angle;
Fig. 4 is a schematic partially exploded view of another embodiment of a spatially localized interactive 3D tablet according to the present utility model.
Reference numerals illustrate:
The achievement of the objects, functional features and advantages of the present utility model will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
The following description of the embodiments of the present utility model will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the utility model. All other embodiments, which can be made by those skilled in the art based on the embodiments of the utility model without making any inventive effort, are intended to be within the scope of the utility model.
It should be noted that all directional indicators (such as up, down, left, right, front, rear, etc.) in the embodiments of the present utility model are merely used to explain the relative positional relationship, movement, etc. between the components in a particular posture (as shown in the drawings), and if the particular posture is changed, the directional indicator is changed accordingly.
In the present utility model, unless specifically stated and limited otherwise, the terms "connected," "affixed," and the like are to be construed broadly, and for example, "affixed" may be a fixed connection, a removable connection, or an integral body; can be mechanically or electrically connected; either directly or indirectly, through intermediaries, or both, may be in communication with each other or in interaction with each other, unless expressly defined otherwise. The specific meaning of the above terms in the present utility model can be understood by those of ordinary skill in the art according to the specific circumstances.
Furthermore, descriptions such as "first," "second," and the like are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature defined by "first" or "second" may explicitly or implicitly include at least one such feature. In addition, "and/or" throughout this document covers three parallel schemes; for example, "A and/or B" includes scheme A, scheme B, or a scheme in which both A and B are satisfied. In addition, the technical solutions of the embodiments may be combined with each other, but only on the basis that they can be realized by those skilled in the art; when a combination of technical solutions is contradictory or cannot be realized, such a combination should be considered not to exist and not within the scope of protection claimed in the present utility model.
The present utility model proposes a spatially localized interactive 3D tablet 100. Referring to fig. 1 to 4, fig. 1 is a schematic perspective view of a spatial positioning interaction 3D panel 100 according to an embodiment of the present utility model; FIG. 2 is a schematic diagram of an exploded structure of an embodiment of a spatially localized interactive 3D tablet 100 according to the present utility model; FIG. 3 is a schematic diagram illustrating an exploded structure of a spatially localized interactive 3D tablet 100 according to an embodiment of the present utility model at another viewing angle; fig. 4 is a schematic partially exploded view of another embodiment of the spatially localized interactive 3D tablet 100 of the present utility model.
In an embodiment of the present utility model, as shown in fig. 1 to 4, a spatially-positioned interactive 3D tablet 100 according to the present utility model includes a tablet body 10, where the tablet body 10 includes:
a housing provided with at least three fixing chambers 21 at intervals;
The display screen 11, wherein the display screen 11 comprises a touch screen, a polarizer 111, an OC screen 110, a backlight module and a liquid crystal light valve 112, and the polarizer 111 is arranged between the liquid crystal light valve 112 and the OC screen 110; and
At least three infrared positioning camera modules 30, the at least three infrared positioning camera modules 30 are respectively limited in the at least three fixing cavities 21.
The OC screen 110 is attached to the liquid crystal light valve 112, the polarizer 111, etc. to form a complete display module.
According to the technical scheme, at least three fixing cavities 21 are arranged on the middle frame 20 at intervals and the at least three infrared positioning camera modules 30 are respectively arranged in the at least three fixing cavities 21, so that the at least three infrared positioning camera modules 30 can be fixed on the panel body 10 by one middle frame 20 and the degree of integration is high. Through the at least three infrared positioning camera modules 30, the space positioning interaction 3D panel 100 can capture a scene of a larger range, so that the signals sent or reflected by the interactive accessories can be captured over a larger range and the interaction experience is better.
The space positioning interaction 3D panel 100 captures the signals sent or reflected by the interactive accessories through the at least three infrared positioning camera modules 30 and thereby senses information such as the positions and postures of the interactive accessories in three-dimensional space. The space positioning interaction 3D panel 100 performs positioning and tracking of the interactive accessories, and the spatial interaction between the interactive accessories and the space positioning interaction 3D panel 100 can be achieved by defining and developing interaction actions in software.
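The utility model does not set out a specific positioning algorithm; as an illustration only, the following Python sketch shows one standard way a marker's 3D position could be recovered from two of the infrared positioning cameras by linear (DLT) triangulation. The camera matrices and pixel coordinates below are hypothetical values, not calibration data from the device.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one infrared marker seen by two cameras."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)       # the solution is the last right singular vector
    X = vt[-1]
    return X[:3] / X[3]               # homogeneous -> Euclidean coordinates

# Hypothetical calibration: two cameras 30 cm apart, both looking along +Z.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.3], [0.0], [0.0]])])
print(triangulate(P1, P2, (400.0, 260.0), (352.0, 260.0)))  # approx. [0.5, 0.125, 5.0] metres
```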
The working principle by which the space positioning interaction 3D panel 100 cooperates with the polarized glasses to realize 3D display is as follows: the backlight module emits light, the incident light irradiates the OC screen 110, the continuous left-eye/right-eye frame images on the screen are converted into light signals, and the light signals pass through the polarizer 111 and are emitted as polarized light of a specific direction: the left-eye image light is converted into left circularly polarized light and emitted, the left circularly polarized light is controlled by the liquid crystal light valve 112 to enter the left lens of the polarized glasses, and the right circularly polarized light cannot enter the right lens because the liquid crystal light valve 112 is closed; the right-eye image light is converted into right circularly polarized light and emitted, the right circularly polarized light is controlled by the liquid crystal light valve 112 to enter the right lens of the polarized glasses, and the left circularly polarized light cannot enter the left lens because the liquid crystal light valve 112 is closed. In the continuous sequence of frames, left frames enter only the left eye and right frames enter only the right eye, and the left-eye and right-eye images are fused into a stereoscopic image in the brain due to parallax.
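Purely as an illustration of the frame-sequential logic described above (a simplified model, not code from the utility model), the following sketch shows how the liquid crystal light valve alternates the circular polarization each frame so that left-eye frames reach only the left lens of the polarized glasses and right-eye frames only the right lens.

```python
LEFT, RIGHT = "left-circular", "right-circular"

def valve_polarization(frame_index):
    """Assume even frames carry the left-eye image and odd frames the right-eye image."""
    return LEFT if frame_index % 2 == 0 else RIGHT

def lens_passes(lens_polarization, light_polarization):
    """A circularly polarized lens only transmits light of the matching handedness."""
    return lens_polarization == light_polarization

for frame in range(4):
    light = valve_polarization(frame)
    left_state = "open" if lens_passes(LEFT, light) else "blocked"
    right_state = "open" if lens_passes(RIGHT, light) else "blocked"
    print(f"frame {frame}: valve -> {light}, left lens {left_state}, right lens {right_state}")
```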
The control unit may be a controller, a control module, or the like. The control unit may employ an existing control method. The present utility model is not limited to the structure and control method of the control unit.
The number of layers of the polarizer 111 and the number of layers of the liquid crystal light valve 112 may be one, two, three, or the like, and the number of layers of the polarizer 111 and the number of layers of the liquid crystal light valve 112 are not limited in the present utility model.
In one embodiment, each infrared positioning camera module 30 includes an infrared camera and an infrared light supplement lamp. The infrared camera works on the basis of optical positioning and tracking technology and can be called an infrared optical positioning and tracking camera. The at least three infrared positioning camera modules 30 are positioned on the panel body 10 by the at least three fixing cavities 21 of the middle frame 20. The at least three infrared positioning camera modules 30 are inclined at fixed angles, which are adjusted and fixed through actual debugging and verification, so that the overall viewing angle of the at least three infrared positioning camera modules 30 can cover the operation space.
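The text states only that the tilt angles are tuned so that the combined viewing angle covers the operation space; the following rough sketch (camera positions, tilt directions and field of view are assumed numbers, not debugging data from the device) illustrates how such coverage could be sampled and checked.

```python
import numpy as np

def in_fov(cam_pos, cam_dir, half_fov_deg, point):
    """True if `point` lies inside the camera's viewing cone."""
    v = point - cam_pos
    cos_angle = (v @ cam_dir) / (np.linalg.norm(v) * np.linalg.norm(cam_dir))
    return cos_angle >= np.cos(np.radians(half_fov_deg))

# Two corner cameras and one middle camera, all tilted toward the user space (+Z).
cameras = [
    (np.array([-0.25, 0.15, 0.0]), np.array([0.3, -0.1, 1.0])),
    (np.array([0.25, 0.15, 0.0]), np.array([-0.3, -0.1, 1.0])),
    (np.array([0.05, 0.15, 0.0]), np.array([0.0, -0.1, 1.0])),
]
# Sample a 60 cm wide operation volume in front of the screen and count coverage.
grid = np.mgrid[-0.3:0.31:0.15, -0.3:0.31:0.15, 0.2:0.81:0.15].reshape(3, -1).T
covered = sum(any(in_fov(pos, d, 40.0, p) for pos, d in cameras) for p in grid)
print(f"{covered}/{len(grid)} sampled points are seen by at least one camera")
```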
The interactive accessory can comprise an interactive pen and polarized glasses, wherein the interactive pen can emit infrared light, and 5 reflection material particles on the polarized glasses can reflect the infrared light. The at least three infrared positioning camera modules 30 capture, identify the light signals emitted or reflected by the interactive accessory and determine the spatial coordinates of the interactive accessory in a three-dimensional coordinate system according to an algorithm. Specifically, the user wears polarized glasses, the at least three infrared positioning camera modules 30 on the spatial positioning interaction 3D panel 100 may capture and identify the light signals reflected by the 5 reflective material points on the polarized glasses, and determine the coordinates of the polarized glasses in the three-dimensional space through an algorithm, so as to determine the coordinates of the head or eyes of the user in the three-dimensional space. The at least three infrared positioning camera modules 30 can also capture and identify the light signals sent by the interactive pen, and determine the coordinates of the interactive pen in the three-dimensional space according to an algorithm, so as to realize the signal interaction between the interactive pen and the space positioning interactive 3D panel 100 in the three-dimensional space.
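The text above does not name the algorithm; one common way to obtain the glasses' position and orientation from the identified reflective points, shown here only as a hedged sketch with a hypothetical marker layout, is to align the known geometry of the 5 markers with their triangulated 3D positions using the Kabsch algorithm.

```python
import numpy as np

def rigid_body_pose(model_pts, measured_pts):
    """Return rotation R and translation t mapping the marker model onto the measurement."""
    cm, cs = model_pts.mean(axis=0), measured_pts.mean(axis=0)
    H = (model_pts - cm).T @ (measured_pts - cs)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cs - R @ cm
    return R, t

# Hypothetical layout of the 5 reflective points (metres, in the glasses' own frame).
model = np.array([[-0.07, 0.02, 0.0], [0.07, 0.02, 0.0], [0.0, 0.04, 0.01],
                  [-0.05, -0.01, 0.02], [0.05, -0.01, 0.02]])
# Pretend these positions were triangulated by the infrared cameras (glasses 0.6 m away).
measured = model + np.array([0.0, 0.0, 0.6])
R, t = rigid_body_pose(model, measured)
print(np.round(t, 3))   # -> [0. 0. 0.6]: the head position in camera coordinates
```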
The interactive pen emits light signals that are captured in real time by the at least three infrared positioning camera modules, and 6DoF action positioning and tracking is performed by optical-inertial fusion with the real-time inertial data of the interactive pen; the polarized tracking glasses reflect, via marker points made of reflective material on the glasses, the infrared light emitted by the infrared light supplement lamps, and the infrared positioning camera modules capture this light in real time and then perform 6DoF action positioning and tracking.
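The utility model does not describe the fusion method itself; the following minimal complementary-filter sketch only illustrates the general idea of optical-inertial fusion, where fast inertial dead-reckoning of the pen is periodically corrected by slower absolute position fixes from the infrared cameras. The update rates, gains and values are assumptions.

```python
import numpy as np

class FusedTracker:
    """Toy optical-inertial position fusion; not the device's actual tracker."""

    def __init__(self, alpha=0.05):
        self.position = np.zeros(3)
        self.velocity = np.zeros(3)
        self.alpha = alpha                # weight given to each absolute optical fix

    def imu_update(self, accel, dt):
        """Dead-reckon between camera frames using pen acceleration (m/s^2)."""
        self.velocity += accel * dt
        self.position += self.velocity * dt

    def optical_update(self, optical_position):
        """Blend in an absolute position measured by the infrared camera modules."""
        self.position = (1.0 - self.alpha) * self.position + self.alpha * optical_position
        self.velocity *= 0.9              # damp drift whenever an absolute fix arrives

tracker = FusedTracker()
for step in range(10):
    tracker.imu_update(accel=np.array([0.0, 0.0, 0.1]), dt=0.005)   # e.g. 200 Hz IMU
    if step % 4 == 0:                                               # slower optical rate
        tracker.optical_update(np.array([0.0, 0.0, 0.001 * step]))
print(np.round(tracker.position, 6))
```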
The middle frame 20 includes a first fixing portion 231, a second fixing portion 233 and a third fixing portion 232, the first fixing portion 231 and the second fixing portion 233 are respectively located at two corners of the panel body 10, the first fixing portion 231 and the second fixing portion 233 are respectively protruding from the panel body 10, the third fixing portion 232 is connected with the first fixing portion 231 and the second fixing portion 233, and the at least three infrared positioning camera modules 30 are respectively limited at the first fixing portion 231, the second fixing portion 233 and the third fixing portion 232.
Wherein, the middle frame 20 is further fixed with a front infrared positioning camera module. The at least three infrared positioning camera modules 30 are installed in the at least three fixing cavities 21 of the middle frame 20, the positions of the at least three fixing cavities 21 are arranged at intervals, two fixing cavities 21 are located at the left side corner and the right side corner of the panel body 10, and the other fixing cavity 21 is arranged between the two infrared positioning camera modules 30. In this way, the at least three infrared positioning camera modules 30 are respectively limited at the first fixing portion 231, the second fixing portion 233 and the third fixing portion 232, so that two infrared positioning camera modules 30 of the at least three infrared positioning camera modules 30 are located at the left side corner and the right side corner of the panel body 10, and another infrared positioning camera module 30 is disposed between the two infrared positioning camera modules 30 and is located at the right side of the front RGB camera.
In an embodiment, as shown in fig. 1 to 3, the at least three infrared positioning camera modules 30 include a first infrared positioning camera module 31, a second infrared positioning camera module 32 and a third infrared positioning camera module 33, wherein the first infrared positioning camera module 31 is mounted on the first fixing portion 231, the second infrared positioning camera module 32 is mounted on the second fixing portion 233, and the third infrared positioning camera module 33 is mounted on the third fixing portion 232. The first infrared positioning camera module 31 is installed in the first fixing cavity 211 of the first fixing portion 231, the first infrared positioning camera module 31 is disposed in the first fixing cavity 211, and the inner wall of the first fixing portion 231 can limit the first infrared positioning camera module 31. The second infrared positioning camera module 32 is installed in the second fixing cavity 212 of the second fixing portion 233, the second infrared positioning camera module 32 is disposed in the second fixing cavity 212, and the inner wall of the second fixing portion 233 can limit the second infrared positioning camera module 32. The first infrared positioning camera module 31 and the second infrared positioning camera module 32 may be fixed at two corners of the panel body 10 by the first fixing portion 231 and the second fixing portion 233, and may not be easily separated from the first fixing cavity 211 and the second fixing cavity 212.
In an embodiment, a third fixing cavity 213 may be disposed to the right of the middle of the third fixing portion 232, the third infrared positioning camera module 33 may be installed in the third fixing cavity 213, and the third infrared positioning camera module 33 is built into the third fixing cavity 213. The range over which the space positioning interaction 3D panel 100 can capture the light signals sent or reflected by the interactive accessories through the first infrared positioning camera module 31, the second infrared positioning camera module 32 and the third infrared positioning camera module 33 is larger than the range that can be captured with only one or two infrared positioning camera modules on an existing panel.
Since the middle frame 20 is provided with the first fixing portion 231, the second fixing portion 233 and the third fixing portion 232, the first infrared positioning camera module 31, the second infrared positioning camera module 32 and the third infrared positioning camera module 33 are correspondingly mounted to the first fixing portion 231, the second fixing portion 233 and the third fixing portion 232. The first infrared positioning camera module 31, the second infrared positioning camera module 32 and the third infrared positioning camera module 33 are limited on the panel body 10 by the middle frame 20, so that the structure is compact. Wherein, the casing is integrated structure.
In order to facilitate the placement of the spatially-positioned interactive 3D tablet 100 on a table or other object, the tablet body 10 further includes a support frame 13, where the support frame 13 is disposed on the bottom shell 12 and rotates relative to the bottom shell 12 to support the bottom shell 12.
In an embodiment, the supporting frame 13 is mounted on the back of the bottom shell 12, so that the bottom shell 12 can be supported by the supporting frame 13, and the purpose of supporting the whole space positioning interaction 3D panel 100 is achieved.
The supporting frame 13 is rotatably connected with the bottom shell 12, and the supporting frame 13 can be accommodated in a groove of the bottom shell 12 or can be rotated and unfolded for a certain angle relative to the bottom shell 12. The supporting frame 13 is unfolded, and the supporting frame 13, the bottom shell 12 and the desktop form a triangle structure, so that the supporting frame 13 can support the bottom shell 12 and the display screen 11 on the bottom shell 12.
In order to limit the display angle of the space positioning interaction 3D panel 100, the rotation angle of the support frame 13 relative to the bottom shell 12 ranges from 0 to 135 degrees; the support frame can be opened, closed and fixed within this range, and when the rotation angle reaches 135 degrees the opening angle cannot be enlarged further due to the structural limitation.
In an embodiment, the supporting frame 13 is rotatably connected to the bottom shell 12, and the supporting frame 13 is received in a groove of the bottom shell 12. The support frame 13 is unfolded, the bottom edge of the bottom shell 12 and the support frame 13 together support the whole space positioning interaction 3D panel 100, and the space positioning interaction 3D panel 100 can be placed on a desktop and is obliquely arranged, so that a user can see an image displayed by the display screen 11 on the space positioning interaction 3D panel 100 at a proper viewing angle.
The bottom edge of the bottom shell 12 is designed to be arc-shaped, so that when the angle of the support frame 13 is adjusted, the bottom edge of the bottom shell 12 always contacts the tabletop along the same edge, and a sensor inside the host can accurately identify the tilt angle of the display screen 11 and adjust the viewing angle of the 3D display picture in real time. Two anti-slip pads are installed at the two ends of the bottom edge of the bottom shell 12 to prevent the tablet body 10 from sliding when it is placed obliquely on a tabletop.
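The text does not specify the sensor type; assuming a gravity (accelerometer) reading, the tilt of the display could be estimated roughly as in the following sketch and then passed to the 3D renderer.

```python
import math

def screen_tilt_deg(accel_y, accel_z):
    """Tilt of the screen from vertical, estimated from the gravity vector (m/s^2)."""
    return math.degrees(math.atan2(accel_y, accel_z))

# Example: gravity split between the panel's Y and Z axes corresponds to a screen
# leaned back by roughly 30 degrees; the renderer would use this angle directly.
print(round(screen_tilt_deg(accel_y=4.9, accel_z=8.5), 1))   # -> 30.0
```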
In an embodiment, the middle frame 20 includes a mounting portion 22 and a receiving portion 23, the receiving portion 23 is disposed protruding with respect to the mounting portion 22, the receiving portion 23 is provided with at least three fixing cavities 21, the mounting portion 22 is connected with the bottom shell 12 and encloses a receiving cavity 121, the receiving cavity 121 is provided with a battery pack 122, and the display screen 11 is mounted on a side of the mounting portion 22 away from the bottom shell 12.
Wherein, the accommodating cavity 121 is further fixed with a circuit board motherboard, an FPGA board, an interface small board 123, a speaker, and the like.
In one embodiment, the mounting portion 22 of the middle frame 20 is configured to be coupled to the bottom shell 12. The mounting portion 22 may be screwed, fastened, or inserted into the bottom case 12. The connection between the mounting portion 22 and the bottom case 12 is not limited in the present utility model.
The middle frame 20 is connected with the bottom shell 12 and encloses a receiving cavity 121, and the battery pack 122 may be received in the receiving cavity 121. To secure the battery pack 122, it may be fixed to the middle frame 20 by a battery frame and screws.
In order to facilitate charging, an interface 120 is disposed on a side surface of the bottom case 12, an interface small plate 123 is installed in the accommodating cavity 121, and the interface small plate 123 is disposed corresponding to the interface 120.
In an embodiment, the interface small board 123 is connected to the circuit board motherboard through a flat cable, and the interface small board 123 is disposed corresponding to the interface 120, so that the plug of a charging cable can be inserted into the interface 120 and the space positioning interaction 3D panel 100 can be charged.
It should be noted that, the left side and the right side of the bottom case are both provided with the plug-in interfaces 120, and have different functions, such as charging or data transmission. The plug-in port for charging the space positioning interaction 3D flat plate can be arranged on the left side of the bottom shell, and the plug-in port for data transmission with the interaction pen is arranged on the right side of the bottom shell.
Because the circuit board motherboard and the FPGA board are installed in the accommodating cavity 121 and produce heat when they work, the upper half of the bottom shell 12 is provided with a heat dissipation hole 124; the heat dissipation hole 124 is arranged to avoid the support frame 13, and the heat produced by the circuit board motherboard and the FPGA board can be dissipated through the heat dissipation hole 124.
In one embodiment, the heat dissipation hole 124 is located above the circuit board motherboard and the FPGA board.
The utility model also proposes an interaction device, which includes polarized glasses, an interactive pen and the space positioning interaction 3D panel 100. For the specific structure of the space positioning interaction 3D panel 100, reference is made to the above embodiments; since the interaction device adopts all the technical solutions of all the above embodiments, it at least has all the beneficial effects brought by those technical solutions, which will not be repeated here.
The user wears the polarized glasses, also called tracking glasses. The polarized glasses can be captured by at least two adjacent infrared positioning camera modules 30 of the at least three infrared positioning camera modules 30 of the space positioning interaction 3D panel 100, and their spatial position can be determined, so as to determine the coordinates of the user within the space. The operator holds and operates the interactive pen, and at least two adjacent infrared positioning camera modules 30 of the at least three capture the interactive pen and determine its spatial position, thereby realizing positioning of the interaction in three-dimensional space and facilitating interactive operation in real space. For example, by operating the interactive pen, a navigation ray consistent with the orientation of the pen can be generated in the virtual scene. After the navigation ray touches a target object, the user can click and select it through the keys on the interactive pen, and 6DoF (six degrees of freedom) interaction with the 3D model target object in the virtual scene is realized through the 6DoF actions of the interactive pen.
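For illustration only (assumed geometry and values, not the utility model's implementation), the following sketch shows the basic navigation-ray selection just described: a ray cast from the pen tip along the pen's tracked orientation is tested against a spherical pick target, and the target counts as selected when the ray hits it and the pen key is pressed.

```python
import numpy as np

def ray_hits_sphere(origin, direction, centre, radius):
    """True if the pen ray intersects a sphere-shaped pick target in front of it."""
    d = direction / np.linalg.norm(direction)
    oc = origin - centre
    b = oc @ d
    c = oc @ oc - radius * radius
    disc = b * b - c
    return disc >= 0 and (-b + np.sqrt(max(disc, 0.0))) >= 0

pen_tip = np.array([0.0, 0.0, 0.0])
pen_dir = np.array([0.0, 0.0, 1.0])            # e.g. from tracked tip/tail positions
model_centre, model_radius = np.array([0.05, 0.0, 0.8]), 0.1
button_pressed = True

if button_pressed and ray_hits_sphere(pen_tip, pen_dir, model_centre, model_radius):
    print("3D model selected; it can now follow the pen's 6DoF motion")
```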
The infrared light supplement lamps of the infrared positioning camera modules 30 emit infrared light, which irradiates the 5 reflective points on the tracking glasses and is reflected back. The infrared cameras of the infrared positioning camera modules 30 capture and identify any at least 3 adjacent ones of the 5 reflective points, so that the coordinate position of the tracking-glasses rigid body in the spatial coordinate system can be accurately located and identified; on one hand a signal is sent to switch the picture from 2D mode to 3D mode, and on the other hand the real-time spatial coordinates of the tracking glasses are added into the 3D mode picture, so that the 3D display picture can adjust the viewing angle in real time following the tracking glasses, adapting to viewing-angle changes of the tracking glasses in different directions. The head and the tail of the interactive pen are each provided with an infrared LED lamp, and they can simultaneously emit infrared light signals with different patterns. After the infrared positioning camera modules 30 capture the active infrared light signals sent by the interactive pen and distinguish the pen tip from the pen tail, the coordinate position of the interactive-pen rigid body in the spatial coordinate system and the direction from the pen tip to the pen tail can be accurately located and identified; these are added into the virtual space of the 3D mode picture and a navigation ray is generated. After the front end of the navigation ray touches a 3D model in the virtual space, the 3D model can be selected by pressing a key on the interactive pen, and, combined with the six-axis IMU data in the interactive pen, six-degree-of-freedom interactive operation of the interactive pen on the 3D model can be realized.
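Finally, as an illustrative sketch with assumed coordinate conventions and an assumed eye separation (not values from the utility model), updating the stereo render viewpoint from the tracked position of the glasses could look as follows.

```python
import numpy as np

def look_at_screen(head_position, screen_centre=np.array([0.0, 0.0, 0.0])):
    """Build a view direction from the tracked head position toward the screen centre."""
    forward = screen_centre - head_position
    return forward / np.linalg.norm(forward)

def update_virtual_camera(head_position, eye_separation=0.063):
    """Return left/right eye positions and view direction for the next stereo frame."""
    right = np.array([1.0, 0.0, 0.0])
    half = 0.5 * eye_separation * right
    return head_position - half, head_position + half, look_at_screen(head_position)

# Glasses tracked 10 cm to the right of centre and 60 cm in front of the screen.
left_eye, right_eye, view_dir = update_virtual_camera(np.array([0.10, 0.0, 0.6]))
print(np.round(left_eye, 3), np.round(right_eye, 3), np.round(view_dir, 3))
```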
The foregoing description is only of the preferred embodiments of the present utility model and is not intended to limit the scope of the utility model, and all equivalent structural changes made by the description of the present utility model and the accompanying drawings or direct/indirect application in other related technical fields are included in the scope of the utility model.

Claims (9)

1. A spatially localized interactive 3D tablet comprising a tablet body, the tablet body comprising:
the shell is provided with at least three fixing cavities at intervals;
The display screen is arranged on the shell and comprises an OC screen, a polarizer and a liquid crystal light valve, wherein the polarizer is arranged between the liquid crystal light valve and the OC screen; and
The at least three infrared positioning camera modules are respectively limited in the at least three fixed cavities.
2. The spatially-positioned interactive 3D tablet of claim 1, wherein the housing comprises a bottom shell and a middle frame, the display screen is connected to the middle frame, the bottom shell is disposed on a side of the middle frame away from the display screen, and the middle frame is provided with the at least three fixing cavities.
3. The spatially-positioned interactive 3D tablet of claim 2, wherein the middle frame includes a first fixed portion, a second fixed portion, and a third fixed portion, the first fixed portion and the second fixed portion are respectively located at two corners of the tablet body, the third fixed portion connects the first fixed portion and the second fixed portion, and the first fixed portion and the second fixed portion are each arranged protruding outwards relative to the tablet body, and the at least three infrared positioning camera modules are respectively mounted at the first fixed portion, the second fixed portion, and the third fixed portion.
4. The spatially-positioned interactive 3D tablet of claim 3, wherein the infrared positioning camera module comprises an infrared video camera and an infrared light supplement lamp, the infrared light supplement lamp disposed proximate to the infrared video camera.
5. The spatially-positioned interactive 3D tablet of claim 4, wherein the tablet body further comprises a support frame disposed on the bottom shell and rotatable relative to the bottom shell to support the bottom shell.
6. The spatially-positioned interactive 3D tablet of claim 5, wherein the middle frame comprises a mounting portion and a receiving portion, the receiving portion is disposed protruding with respect to the mounting portion, the receiving portion is provided with the at least three fixing cavities, a battery pack is fixed in the receiving cavity between the mounting portion and the bottom shell, and the display screen is mounted on a side of the mounting portion away from the bottom shell.
7. The spatially-positioned interactive 3D tablet of claim 6, wherein an interface is provided on a side of the bottom shell, and an interface small board is mounted on the back of the mounting portion, the interface small board being disposed corresponding to the interface.
8. The spatially-positioned interactive 3D tablet of claim 7, wherein the bottom of the bottom shell is provided with a heat dissipation hole, and the heat dissipation hole is arranged to avoid the support frame.
9. An interactive device comprising polarized glasses, an interactive pen and a spatially localized interactive 3D tablet as in any one of claims 1 to 8.
CN202323130661.XU 2023-11-20 2023-11-20 Space positioning interaction 3D panel and interaction equipment Active CN221039907U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202323130661.XU CN221039907U (en) 2023-11-20 2023-11-20 Space positioning interaction 3D panel and interaction equipment

Publications (1)

Publication Number Publication Date
CN221039907U true CN221039907U (en) 2024-05-28

Family

ID=91170643

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202323130661.XU Active CN221039907U (en) 2023-11-20 2023-11-20 Space positioning interaction 3D panel and interaction equipment

Country Status (1)

Country Link
CN (1) CN221039907U (en)


Legal Events

Date Code Title Description
GR01 Patent grant