CN219285524U - Non-contact interactive display device - Google Patents


Info

Publication number
CN219285524U
CN219285524U · CN202321404009.XU
Authority
CN
China
Prior art keywords
interaction
housing
plane
projection element
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202321404009.XU
Other languages
Chinese (zh)
Inventor
Name withheld at the inventor's request
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongxian Holographic Beijing Technology Co ltd
Original Assignee
Zhongxian Holographic Beijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongxian Holographic Beijing Technology Co ltd
Priority to CN202321404009.XU
Application granted
Publication of CN219285524U
Legal status: Active

Landscapes

  • Position Input By Displaying (AREA)

Abstract

The application provides a non-contact interactive display device, belonging to the technical field of spatial stereoscopic imaging. The non-contact interactive display device includes a housing, a projection element, an image source, and an interaction module. The projection element and the image source are arranged in the housing at a reference angle, and light emitted by the image source is projected by the projection element to form a real image. The interaction module is configured to transmit signals in a direction parallel to a display reference plane of the real image, such that, along the direction perpendicular to the display reference plane, the distance between the central plane of the interaction space formed by the signals and the display reference plane is always less than or equal to a reference value. The display reference plane and the plane in which the image source lies are symmetrical with respect to the projection element. By confining the detection range of the interaction module to the space within the reference value of the display reference plane, the device improves the anti-interference performance of the non-contact interactive display device.

Description

Non-contact interactive display device
Technical Field
The present application relates to the technical field of spatial stereoscopic imaging, and in particular to a non-contact interactive display device.
Background
Spatial stereoscopic imaging technology mainly uses orthogonal reflecting surfaces to reflect light twice according to the law of reflection, so that the light finally converges in mid-air into a real image visible to the naked eye.
Prior-art non-contact interactive display devices that adopt aerial stereoscopic imaging have limited anti-interference performance during interaction, which increases the rate of erroneous operations.
Therefore, a non-contact interactive display device with strong anti-interference capability is needed.
Disclosure of Invention
In order to solve the technical problem that existing non-contact interactive display devices adopting aerial stereoscopic imaging have insufficient anti-interference capability, embodiments of the present application provide a non-contact interactive display device comprising a housing, a projection element, an image source, and an interaction module;
the projection element and the image source are arranged in the housing at a reference angle, and light emitted by the image source forms a real image after being projected by the projection element;
the interactive module transmits signals in a direction parallel to a display reference plane of the real image, and,
the distance between the central plane of the interaction space formed by the signals and the display reference plane is always smaller than or equal to a reference value along the direction perpendicular to the display reference plane;
the display datum plane and the plane where the image source is located are symmetrical with respect to the projection element.
Optionally, the projection element comprises at least one layer of imaging units; wherein each layer of the at least one layer of imaging units comprises a first lens and a second lens;
a plurality of first reflecting surfaces, parallel to one another and spaced apart, are arranged in the first lens and are perpendicular to the mirror surface of the first lens;
a plurality of second reflecting surfaces, parallel to one another and spaced apart, are arranged in the second lens and are perpendicular to the mirror surface of the second lens;
the first lens is laminated with the second lens, and the first reflecting surface is orthogonal to the second reflecting surface.
Optionally, the magnitude of the reference angle is adjustable.
Optionally, the position and/or posture of the interaction module changes with adjustment of the reference angle, so that the distance between the central plane of the interaction space and the display reference plane is always less than or equal to the reference value.
Optionally, the reference angle is greater than or equal to 0° and less than or equal to 90°.
Optionally, the reference angle is greater than or equal to 60° and less than or equal to 75°.
Optionally, at least one side boundary of the interaction space is located downstream of the display reference plane along an imaging optical path of the non-contact interactive display device.
Optionally, the reference value is greater than or equal to 0 and less than or equal to 3.8 mm.
Optionally, the housing is connected with the interaction module.
Optionally, the housing and the interaction module are independent of each other, and the real image is located in the interaction space.
Optionally, the housing comprises a first housing and a second housing;
wherein the projection element and the image source are arranged in the first housing at the reference angle such that the real image is at least partially located in the second housing;
the second shell is used for shielding ambient light, and one side of the second shell is provided with a window so as to realize observation and/or interaction based on the real image.
Optionally, the display reference plane is located in the second housing and does not exceed the plane of the window.
Optionally, the non-contact interactive display device further comprises a first adjustment mechanism;
the first adjustment mechanism is configured to adjust the position and/or posture of the projection element and/or the image source to adjust the magnitude of the reference angle.
Optionally, the non-contact interactive display device further comprises a second adjustment mechanism;
the second adjustment mechanism is configured to adjust the position and/or posture of the interaction module based on the reference angle, so that the distance between the central plane of the interaction space and the display reference plane is always less than or equal to the reference value.
Optionally, the projection element has a size greater than or equal to 1 m × 1 m.
Optionally, the projection element has a size greater than or equal to 1.5 m × 1.5 m.
Optionally, the interaction module includes an infrared touch frame.
Optionally, the interaction module further comprises a gesture sensor;
the gesture sensor is arranged on the shell.
Optionally, the interaction module further comprises a projection device;
the projection device is arranged on the housing and is used for projecting an observation area toward the user side, so that a user positioned in the observation area can observe the complete real image.
Optionally, along the direction perpendicular to the display reference plane, the thickness of the interaction space is greater than or equal to 0 mm and less than or equal to 3.8 mm.
The non-contact interactive display device provided by the embodiment of the application has the following beneficial effects:
according to the non-contact interactive display device, the projection element and the image source are arranged in the shell at the reference angle, so that light rays emitted by the image source are projected into real images through the projection element, the interactive module is arranged on the shell, signals are transmitted along the direction parallel to the display reference plane through the interactive module, the distance between the central plane of the interactive space formed by the signals and the display reference plane is always smaller than or equal to the reference value, and the interactive range of the interactive module is limited in the space within the reference value from the display reference plane. Therefore, the non-contact interactive display device does not recognize interactive actions outside the interactive space, and eliminates interference actions outside the interactive space, thereby reducing the misoperation rate.
Drawings
In order to more clearly describe the technical solutions in the embodiments or the background of the present application, the drawings required by the embodiments or the background are described below.
Fig. 1 shows an alternative structural schematic diagram of a non-contact interactive display device provided in an embodiment of the present application.
Fig. 2 shows a schematic structural diagram of still another alternative non-contact interactive display device provided in an embodiment of the present application.
Fig. 3 shows an alternative schematic diagram of a non-contact interactive display device provided in an embodiment of the present application.
Fig. 4 shows a schematic structural diagram of still another alternative non-contact interactive display device provided in an embodiment of the present application.
Fig. 5 shows an alternative structural schematic of the projection element provided in an embodiment of the present application.
Fig. 6 shows a schematic diagram of a projection element provided in an embodiment of the present application.
Reference numerals in the drawings denote:
10-housing; 20-projection element; 30-image source; 40-interaction module; 50-display reference plane; 60-real image;
101-first housing; 102-second housing; 1021-window; 201-first lens; 202-second lens; 2011-first reflecting surface; 2021-second reflecting surface; 401-emitter; 402-receiver; 403-gesture sensor; 404-projection device.
Detailed Description
In order to facilitate an understanding of the present application, a more complete description of the present application will now be provided with reference to the relevant figures. Preferred embodiments of the present application are shown in the drawings. This application may, however, be embodied in many different forms and is not limited to the embodiments described herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
It is to be understood that the terminology used herein is for the purpose of describing particular example embodiments only, and is not intended to be limiting. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises," "comprising," "includes," "including," and "having" are inclusive and therefore specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof. The method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order described or illustrated, unless an order of performance is explicitly stated. It should also be appreciated that additional or alternative steps may be used.
Although the terms first, second, third, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms may be only used to distinguish one element, component, region, layer or section from another region, layer or section. Terms such as "first," "second," and other numerical terms when used herein do not imply a sequence or order unless clearly indicated by the context. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the example embodiments.
For ease of description, spatially relative terms, such as "inner," "outer," "lower," "below," "upper," "above," and the like, may be used herein to describe one element or feature's relationship to another element or feature as illustrated in the figures. Such spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" or "over" the other elements or features. Thus, the exemplary term "below" may include both upper and lower orientations. The device may be otherwise oriented, such as rotated 90 degrees or in other orientations, and the spatially relative descriptors used herein are interpreted accordingly.
Holographic display devices in the prior art capture the user's actions (gestures, line of sight, etc.) by means of infrared technology to achieve interaction based on the holographic image. However, such prior-art holographic interaction is susceptible to interference. For example, when several users are present in the region where the hologram can be observed, the actions of each user can trigger human-computer interaction, so erroneous operations easily occur. In particular, if the system expects only one operator, the actions of other users in the operator's background can interfere with the current operator's work.
The inventors of the present application found that, in the prior art, the interaction module projects signals toward the user and thereby captures the user's interactive actions. However, this interaction approach makes it easy for the interaction module to capture unwanted actions in the background, causing erroneous operations.
In view of the above technical problems, embodiments of the present application provide a non-contact interactive display device, as shown in fig. 1 to 4, which includes a housing 10, a projection element 20, an image source 30, and an interaction module 40. As shown in fig. 1 and 2, the projection element 20 is disposed in the housing 10 at a reference angle with respect to the image source 30. The light emitted from the image source 30 is projected by the projection element 20 to form a real image 60. Optionally, the real image 60 is a two-dimensional real image visible to the naked eye. It should be noted that the reference angle covers both the angle between the projection element 20 and the image source 30 and the angle between the projection element 20 and/or the image source 30 and the horizontal plane. A change in the reference angle may change the position and/or posture of the display reference plane 50 of the real image 60.
The interaction module 40 transmits the signal 400 in a direction parallel to the display reference plane 50 of the real image 60, whereby the signal 400 forms an interaction space. The distance between the central plane of the interaction space and the display reference plane 50 is always less than or equal to the reference value. As shown in fig. 1 to 4, the display reference plane 50 and the plane in which the image source 30 lies are symmetrical with respect to the center plane of the projection element 20. According to the embodiments of the present application, the angle between the display reference plane 50 and the horizontal plane is not particularly limited, as long as the observation requirement can be satisfied. Fig. 1 and 2 only exemplarily show the embodiment in which the center plane of the interaction space coincides with the display reference plane 50. In other alternative embodiments, the distance between the center plane of the interaction space and the display reference plane 50, along the direction perpendicular to the display reference plane 50, is greater than 0.
In some alternative embodiments, as shown in fig. 1 and 2, the image source 30 is enclosed within a cavity of the housing 10, the projection element 20 is disposed in the housing 10, and the projection element 20 is at the reference angle to the image source 30. It should be understood that disposing the projection element 20 in the housing 10 may mean enclosing the projection element 20 within a cavity of the housing 10, or embedding the projection element 20 within a wall of the housing 10. Illustratively, the surface of the projection element 20 that is in contact with the air is covered with a transparent protective layer.
Fig. 3 shows the working principle of the present application. As shown in fig. 3, after the light emitted by the image source 30 is modulated by the projection element 20, a real image 60 is formed in the space on the side of the projection element 20 away from the image source 30. The light-emitting surface of the image source 30 and the display reference plane 50 of the real image 60 are symmetrical with respect to the center plane of the projection element 20.
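The mirror-imaging relation above can be expressed as a simple plane reflection: each point on the image source's light-emitting surface maps to the symmetric point on the other side of the projection element's center plane. The following Python sketch illustrates this (the coordinate frame, plane position, and point values are illustrative assumptions, not values from the patent):

```python
def reflect_across_plane(point, plane_point, normal):
    """Mirror `point` across the plane through `plane_point` with normal `normal`.

    Models the imaging relation of the projection element 20: a point on the
    image source's light-emitting surface maps to the symmetric point on the
    display reference plane. All coordinates are hypothetical, in meters.
    """
    nx, ny, nz = normal
    mag = (nx * nx + ny * ny + nz * nz) ** 0.5
    nx, ny, nz = nx / mag, ny / mag, nz / mag          # unit normal
    d = ((point[0] - plane_point[0]) * nx              # signed distance to plane
         + (point[1] - plane_point[1]) * ny
         + (point[2] - plane_point[2]) * nz)
    return (point[0] - 2 * d * nx,
            point[1] - 2 * d * ny,
            point[2] - 2 * d * nz)

# A source point 0.2 m behind the projection element's center plane (z = 0)
# images to the symmetric point 0.2 m in front of it:
src = (0.1, 0.0, -0.2)
img = reflect_across_plane(src, plane_point=(0.0, 0.0, 0.0), normal=(0.0, 0.0, 1.0))
print(img)  # (0.1, 0.0, 0.2)
```

Tilting the assumed plane normal models the case where the reference angle, and hence the display reference plane 50, is adjusted.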
In an alternative embodiment, referring to fig. 1, the interaction module 40 and the housing 10 are independent of each other. The installation position of the interaction module 40 can thus be configured flexibly according to the actual use scenario, and the housing 10 can also be concealed. As shown in fig. 1, the interaction module 40 and the housing 10 may be mounted separately; for example, the interaction module 40 may be embedded in a wall or a riser, while the housing 10 may be hidden under the floor or a stage. According to embodiments of the present application, the interaction module 40 and the image source 30 may be directly or indirectly electrically connected by wire (e.g., cable or optical fiber) or wirelessly (e.g., Bluetooth or wireless fidelity technology). It should be appreciated that the interaction module 40 and the image source 30 may be connected through a controller or an upper computer. It should be noted that the non-contact interactive display device according to the embodiments of the present application further includes a power source, a controller, an upper computer, and other conventional components that a person skilled in the art would contemplate, which are not described in detail herein. In yet another alternative embodiment, as shown in fig. 2, the interaction module 40 is connected to the housing 10. This facilitates integration of the interaction module 40 with the housing 10, making it easier to move the non-contact interactive display device and to exhibit it in a smaller space. The interaction module 40 transmits the signal 400 in a direction parallel to the display reference plane 50. The center plane of the interaction space formed by the signal 400 is parallel to the display reference plane 50, and the distance between the center plane of the interaction space and the display reference plane 50 is less than or equal to the reference value.
Preferably, the housing 10 is opaque to visible light for reducing interference of ambient light with the imaging light path.
In yet other alternative embodiments, as shown in fig. 2, the housing 10 includes a first housing 101 and a second housing 102, the first housing 101 being connected to the second housing 102. One side of the second housing 102 has a window 1021, which enables observation and/or interaction based on the real image 60. The image source 30 and the projection element 20 are disposed within the first housing 101 at the reference angle such that the real image projected by the projection element 20 is at least partially located within the second housing 102. The second housing 102 shields ambient light and prevents it from interfering with the display effect of the real image. The presence of the second housing 102 also makes installation of the interaction module 40 more flexible, so that the position and/or posture of the center plane of the interaction space is easier to adjust and calibrate. Preferably, the first housing 101 is opaque to visible light, to attenuate interference of ambient light with the imaging light path. Optionally, the second housing 102 is a foldable structure, allowing the position and posture of the window 1021 to be changed. Optionally, the display reference plane 50 is located within the cavity of the second housing 102 and does not extend beyond the plane of the window 1021.
It should be noted that fig. 1 and 2 only exemplarily show the positions and postures of the projection element 20 and the image source 30. The positions and postures of the projection element 20 and the image source 30 are not particularly limited and may be set according to display requirements. It should be understood that the postures described in this specification refer to the angle of an object relative to the horizontal and/or vertical direction.
Fig. 1 and 2 schematically show the interaction module 40 transmitting the signal 400 in a direction parallel to the display reference plane 50. As shown in fig. 1 and 2, the interaction space contains at least one signal plane along the direction perpendicular to the display reference plane 50. For ease of understanding, the orthogonal signals 400 in fig. 1 and 2 are drawn in different planes, and the distance between the signal planes is exaggerated; fig. 1 and 2 are schematic and do not reflect actual dimensions or proportions. Fig. 4 shows a schematic view of the real image 60 seen along the direction perpendicular to the display reference plane 50; for ease of understanding, the signal 400 is not drawn on the display reference plane 50 in fig. 4. According to embodiments of the present application, the signal 400 may coincide with the display reference plane 50, i.e., the signal 400 need not avoid the display reference plane, because the signal 400 transmitted by the interaction module 40 is generally in a band invisible to the user.
In some alternative embodiments, as shown in fig. 5, the projection element 20 is a multi-layer lens structure. Optionally, the projection element 20 comprises at least one layer of imaging units; an optional structure of one layer of imaging units is shown in fig. 5. Each layer of the at least one layer of imaging units comprises a first lens 201 and a second lens 202. As shown in the enlarged partial view of the first lens 201 in fig. 5, a plurality of first reflecting surfaces 2011, parallel to one another and spaced apart, are disposed in the first lens 201 and are perpendicular to the mirror surface of the first lens 201. Similarly, a plurality of second reflecting surfaces 2021, parallel to one another and spaced apart, are disposed in the second lens 202, each perpendicular to the mirror surface of the second lens 202. As shown in fig. 5, the first lens 201 and the second lens 202 are laminated together, and the first reflecting surfaces 2011 are orthogonal to the second reflecting surfaces 2021. It should be understood that the first lens 201 and the second lens 202 are both plate-shaped structures whose mirror surfaces are the surfaces on the light-incident and light-emergent sides. Optionally, according to embodiments of the present application, the area of the projection element 20 is greater than or equal to 1 m × 1 m. Optionally, the area of the projection element 20 is greater than or equal to 1.5 m × 1.5 m. It will be appreciated that the surface of the projection element 20 (i.e., the surface adjacent to the air) may also be provided with a cover plate to protect the projection element 20.
Fig. 6 shows the principle of the projection element 20 in an embodiment of the present application. When a light beam emitted by the light source S is reflected twice by the orthogonal reflecting surfaces (e.g., the second reflecting surface 2021 and then the first reflecting surface 2011), the transverse component of the beam's direction is reversed while the remaining component is preserved, so that rays diverging from S converge again to form a real image S'. As shown in fig. 6, the outgoing light, the incoming light, and the once-reflected light in the projection element 20 lie in the same plane. Fig. 5 shows only one optional configuration of the projection element 20 of the present application; other configurations with arrays of orthogonal mirror surfaces can likewise project the light from the image source 30 to form a real image in space. Referring to the principle of the projection element 20 and fig. 3 to 5, it will be understood that the light-emitting surface of the image source 30 and the display reference plane 50 of the real image 60 are symmetrical with respect to the center plane of the projection element 20. According to embodiments of the present application, the image source 30 may be a flat-panel display device. Optionally, the brightness of the image source 30 is greater than or equal to 1000 nits.
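The double reflection can be checked with elementary vector reflections. In this sketch (the mirror normals and the incident direction are hypothetical), two reflections off mutually orthogonal surfaces reverse the transverse components of a ray's direction while preserving the third, which is why rays diverging from S reconverge at the symmetric point S':

```python
def reflect_dir(v, n):
    """Reflect direction vector v off a mirror with unit normal n: v - 2(v·n)n."""
    dot = v[0] * n[0] + v[1] * n[1] + v[2] * n[2]
    return tuple(v[i] - 2 * dot * n[i] for i in range(3))

# Two reflections off orthogonal surfaces (normals along x and y, standing in
# for a second reflecting surface 2021 and a first reflecting surface 2011):
v_in = (0.3, -0.5, 0.8)
v_out = reflect_dir(reflect_dir(v_in, (1.0, 0.0, 0.0)), (0.0, 1.0, 0.0))
print(v_out)  # (-0.3, 0.5, 0.8): x and y reversed, z preserved
```

Because only the transverse components flip, every ray leaving a source point re-crosses at the point mirror-symmetric to it, consistent with the symmetry stated above.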
According to embodiments of the present application, optionally, the reference value is greater than or equal to 0 mm and less than or equal to 3.8 mm. The interaction range of the interaction module 40 is thereby limited to a predetermined space, so that actions in the background do not interfere with the operator. Preferably, the reference value is equal to 0, that is, the center plane of the interaction space coincides with the display reference plane 50, which further eliminates interference from background actions and improves interaction efficiency. The non-contact interactive display device thus makes its response to an interactive action occur synchronously with the user's touch of the real image, which helps improve the synchronization between the interactive action and the displayed picture and improves the user's interactive experience. Through the limitation of the reference value, only interactive actions occurring in the predetermined interaction space are recognized. Preferably, along the imaging light path, at least one side boundary of the interaction space is located downstream of the display reference plane 50, i.e., the display reference plane 50 is located within the interaction space. In this way, the user's interactive action is recognized before the user touches the real image 60, avoiding the situation in which the user's limb is recognized only after passing through the real image 60, i.e., avoiding a delayed interactive response of the non-contact interactive display device. The interaction module 40 of the present application therefore defines the boundary of interaction, and the display reference plane 50 is located within the interaction space defined by the interaction module 40, which improves the anti-interference performance when the user interacts based on the real image.
This also provides a psychological cue to the user, who will consciously interact within the boundary during operation, improving the accuracy with which the non-contact interactive display device recognizes interactive actions and giving the user a touch experience similar to that of a physical touch screen. Still further, the interaction module 40 is electrically connected to the image source 30; when an interactive action occurs within the boundary defined by the interaction module, the image source 30 displays a graphical user interface (GUI) corresponding to that action, further improving the user experience.
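The gating described above can be sketched as a simple filter. In the sketch below (the signed-distance event representation is an illustrative assumption, not an API defined by the patent), only events whose perpendicular distance from the display reference plane 50 is within the reference value are recognized:

```python
def recognize(events_mm, reference_value_mm=3.8):
    """Keep only interaction events whose perpendicular distance (in mm,
    signed relative to the display reference plane 50) is within the
    reference value; background actions farther away are ignored."""
    return [z for z in events_mm if abs(z) <= reference_value_mm]

# A touch at the plane and slight offsets pass; background gestures do not:
print(recognize([0.0, 2.5, -3.8, 40.0, -120.0]))  # [0.0, 2.5, -3.8]
```

Setting `reference_value_mm=0.0` corresponds to the preferred case where the center plane of the interaction space coincides with the display reference plane.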
According to an alternative embodiment, the signal 400 is an infrared-band signal. Optionally, the signal 400 emitted by the interaction module 40 forms a two-dimensional detection plane, e.g., an infrared light curtain. In some alternative embodiments, the signals emitted by the interaction module 40 form a three-dimensional space with a certain thickness, and an interactive response may be triggered when an interactive action occurs within that space. Illustratively, along the direction perpendicular to the display reference plane 50, the size (i.e., thickness) of the interaction space formed by the signal 400 is greater than or equal to 0 and less than or equal to 3.8 mm.
In an alternative embodiment, as shown in fig. 2, the interaction module 40 includes a transmitter 401 and a receiver 402, disposed opposite each other on the housing 10. The transmitter 401 emits the signal 400, which is received by the receiver 402 after traveling in a direction parallel to the display reference plane 50. A single transmitter 401 and its corresponding receiver 402 form one pair of detection modules. Further, the interaction module 40 includes two pairs of detection modules whose signal transmission paths are orthogonal in the same plane, forming a matrix. It is easy to understand that the denser the grid formed by the orthogonal signals in the matrix, the higher the touch accuracy. Illustratively, the touch resolution of the interaction module 40 provided in the embodiments of the present application is 4096 × 4096. Optionally, the interaction module 40 can recognize multi-touch.
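A much-simplified model of reading a touch point out of such a matrix: each axis reports which of its beams are blocked, and the touch coordinate is estimated from their centroid. Real frames, including the 4096 × 4096 module mentioned above, weight many oblique rays rather than a plain axis-aligned grid, so the sketch below is an illustrative assumption only:

```python
def locate_touch(blocked_x, blocked_y):
    """Estimate a touch point from the indices of blocked beams on the two
    orthogonal emitter/receiver axes of the matrix. Returns the centroid of
    the blocked beams, or None when either axis reports no occlusion."""
    if not blocked_x or not blocked_y:
        return None
    return (sum(blocked_x) / len(blocked_x),
            sum(blocked_y) / len(blocked_y))

# A fingertip occluding beams 100-102 horizontally and 200-201 vertically:
print(locate_touch([100, 101, 102], [200, 201]))  # (101.0, 200.5)
```

Multi-touch would require clustering the blocked indices per axis before taking centroids, which is beyond this sketch.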
Optionally, the interaction module 40 is an infrared touch frame. The infrared touch frame employs a dense array of emitting and receiving tubes in an oblique-angle coordinate system: each emitter sends out rays at many different angles, and each receiver receives rays arriving from many different angles, so that blind spots in the touch space are thoroughly eliminated and an accurate touch effect is achieved; touch points are then computed by weighted mathematical calculation, realizing interaction based on the real image 60. Alternatively, the infrared touch frame may be integrated on the second housing 102; for example, the border of the window 1021 may be an infrared touch frame. It should be understood that the external shape of the interaction module 40 is not limited, and a suitable shape, such as a rectangular parallelepiped, a cube, or a cylinder, may be selected according to the installation space. Optionally, the interaction module 40 forms a plurality of signal planes along the direction perpendicular to the display reference plane 50.
According to embodiments of the present application, the reference angle may be a fixed value or may be adjustable. Optionally, the reference angle is greater than or equal to 0° and less than or equal to 90°. Preferably, the reference angle is greater than or equal to 45° and less than or equal to 90°. More preferably, the reference angle is greater than or equal to 60° and less than or equal to 75°, which can improve the display effect of the real image 60.
It should be noted that adjustment of the reference angle may be achieved by changing the position and/or posture of the projection element 20 and/or the image source 30 by technical means commonly used in the art (such as a hinge, a multi-link mechanism, etc.). It should be appreciated that the adjustment of the position and/or posture of the projection element 20 and/or the image source 30 may be applied directly to the body of the projection element 20 and/or the image source 30, or applied indirectly. For example, the housing 10 may be configured with a partially telescoping, folding, or rotating structure, with the telescoping, folding, or rotating of the housing 10 effecting displacement or rotation of the projection element 20 and/or the image source 30. Optionally, the non-contact interactive display device provided in the embodiments of the present application further includes a first adjustment mechanism for adjusting the position and/or posture of the projection element 20 and/or the image source 30, thereby adjusting the magnitude of the reference angle. The first adjustment mechanism acts directly or indirectly on the projection element 20 and/or the image source 30.
According to an embodiment of the present application, by the imaging principle of the projection element 20, the position and/or posture of the display reference plane 50 varies with the adjustment of the reference angle. Thus, the position and/or posture of the display reference plane 50 may be adjusted, according to the requirements of observation and/or interaction, by adjusting the reference angle. Optionally, the non-contact interactive display device further includes a second adjustment mechanism for adjusting the position and/or posture of the interaction module 40, so that along the direction perpendicular to the display reference plane 50, the distance between the central plane of the interaction space formed by the signal 400 and the display reference plane is always smaller than or equal to the reference value.
It should be appreciated that, according to embodiments of the present application, the central plane of the interaction space formed by the signal 400 remains parallel to the display reference plane 50 at all times. The second adjustment mechanism acts directly or indirectly on the interaction module 40. The specific structure of the second adjustment mechanism is not particularly limited, as long as it can change the position and/or posture of the interaction module 40; for example, this may be achieved by means of a sliding rail. Preferably, the position and/or posture of the interaction module 40 varies with the reference angle, so that the central plane of the interaction space moves in synchronization with the display reference plane 50.
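To make the synchronization concrete, here is a toy 2D model (an illustrative assumption, not the patent's mechanism): the projection element is taken as the plane x = 0, the display reference plane is the mirror image of the source plane across it, and the second adjustment mechanism simply re-poses the signal center plane to the mirrored pose, then verifies the offset against the reference value (claims 6 and 10 bound it at 3.8 mm).

```python
import math

REFERENCE_MM = 3.8  # upper bound on the center-plane offset (claims 6 and 10)

def mirror_plane(point, normal):
    """Reflect a plane (a point on it plus its normal) across the
    projection-element plane, modelled here as x = 0."""
    (px, py), (nx, ny) = point, normal
    return (-px, py), (-nx, ny)

def retarget_signal_plane(source_point, source_normal):
    """Second-adjustment-mechanism sketch: after a reference-angle change
    moves the image source, pose the signal center plane to coincide with
    the new display reference plane (the mirrored source plane)."""
    return mirror_plane(source_point, source_normal)

def offset_ok(center_point, plane_point, plane_normal):
    """Check that the signal center plane's distance from the display
    reference plane stays within REFERENCE_MM."""
    nx, ny = plane_normal
    d = ((center_point[0] - plane_point[0]) * nx +
         (center_point[1] - plane_point[1]) * ny) / math.hypot(nx, ny)
    return abs(d) <= REFERENCE_MM
```

Tilting the source by the reference angle tilts the mirrored display plane by the same amount on the other side of the projection element, which is why the interaction module must track every angle change rather than stay fixed.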
In some alternative embodiments, the interaction module 40 further includes a gesture sensor 403 for increasing the freedom of interaction. Illustratively, the gesture sensor 403 is mounted on the housing 10. Also illustratively, the gesture sensor 403 is integrated in the interaction module 40. In some exemplary embodiments, the interaction module 40 further includes a projection device 404 for projecting an observation area toward the user side. The observation area indicates the optimal observation position of the non-contact interactive display device: a user located in the observation area can observe the complete real image. The projection device 404 may be mounted on the housing 10 or integrated into the interaction module 40.
In summary, in the non-contact interactive display device provided by the embodiments of the application, the projection element and the image source are disposed in the housing at the reference angle, so that the light emitted by the image source is projected by the projection element into a real image. The interaction module is mounted on the housing and transmits a signal along the direction parallel to the display reference plane, and the distance between the central plane of the signal and the display reference plane is always smaller than or equal to the reference value, so that the detection range of the interaction module is limited to the space within the reference value from the display reference plane. Interactions outside the detection range are therefore excluded, reducing the rate of false operations. In addition, by setting the signal detection boundary to limit the effective interaction area, a touch experience similar to that of a touch screen is provided, improving the user experience. According to embodiments of the application, the reference angle is adjustable and the central plane of the signal changes along with the reference angle, so that the distance between the central plane of the signal and the display reference plane always remains smaller than or equal to the reference value. This further improves the anti-interference capability of the non-contact interactive display device and avoids misoperation after the real image position is adjusted according to user requirements, improving the user experience.
The foregoing is merely a specific implementation of the embodiments of the present application, but the protection scope of the embodiments of the present application is not limited thereto, and any person skilled in the art may easily think about changes or substitutions within the technical scope of the embodiments of the present application, and all changes and substitutions are included in the protection scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A contactless interactive display device, characterized in that it comprises a housing (10), a projection element (20), an image source (30) and an interaction module (40);
the projection element (20) and the image source (30) are arranged in the shell (10) at a reference angle, and light rays emitted by the image source (30) are projected by the projection element (20) to form a real image (60);
the interaction module (40) is configured to transmit a signal (400) in a direction parallel to a display reference plane (50) of the real image (60), and,
along a direction perpendicular to the display reference plane (50), the distance between the central plane of the interaction space formed by the signals (400) and the display reference plane (50) is always smaller than or equal to a reference value;
the display reference plane (50) and the plane of the image source (30) are symmetrical with respect to the projection element (20).
2. The contactless interactive display device according to claim 1, characterized in that the projection element (20) comprises at least one layer of imaging units; wherein each layer of the at least one layer of imaging units comprises a first lens (201) and a second lens (202);
a plurality of first reflecting surfaces (2011) which are arranged in parallel at intervals are arranged in the first lens (201), and the plurality of first reflecting surfaces (2011) are perpendicular to the mirror surface of the first lens (201);
a plurality of second reflecting surfaces (2021) which are arranged in parallel at intervals are arranged in the second lens (202), and the second reflecting surfaces (2021) are perpendicular to the mirror surface of the second lens (202);
the first lens (201) and the second lens (202) are laminated, and the first reflecting surface (2011) is orthogonal to the second reflecting surface (2021).
3. The contactless interactive display apparatus according to claim 1, wherein a magnitude of the reference angle is adjustable; and,
the position and/or posture of the interaction module (40) changes along with the adjustment of the reference angle, so that the distance between the central plane of the interaction space and the display datum plane (50) is always smaller than or equal to a reference value.
4. The contactless interactive display device according to any one of claims 1 to 3, wherein the reference angle is greater than or equal to 0° and less than or equal to 90°.
5. The non-contact interactive display device according to claim 1, characterized in that at least one side boundary of the interaction space is located downstream of the display reference plane (50) along an imaging light path of the non-contact interactive display device.
6. The contactless interactive display apparatus according to claim 1 or 3, wherein the reference value is greater than or equal to 0 and less than or equal to 3.8 mm.
7. The contactless interactive display device according to claim 1, characterized in that the housing (10) is connected with the interaction module (40); or alternatively,
the housing (10) is independent of the interaction module (40) and the real image (60) is located in the interaction space.
8. The contactless interactive display device according to claim 7, characterized in that the housing comprises a first housing (101) and a second housing (102);
wherein the projection element (20) and the image source (30) are arranged within the first housing (101) at the reference angle such that the real image (60) is at least partially located in the second housing (102);
the second housing (102) is used for shielding ambient light, and one side of the second housing (102) is provided with a window (1021) for realizing observation and/or interaction based on the real image (60).
9. The contactless interactive display device according to claim 1, characterized in that the interaction module (40) comprises an infrared touch frame; or alternatively,
the interaction module (40) comprises an infrared touch frame and a gesture sensor (403) and/or a projection device (404);
the gesture sensor (403) is used to increase the freedom of interaction;
the projection device (404) is used for projecting the observation area to the user side so that the user can observe the complete real image (60) when the user is positioned in the observation area.
10. The non-contact interactive display device according to claim 1, characterized in that the thickness of the interaction space is greater than or equal to 0 mm and less than or equal to 3.8mm in a direction perpendicular to the display reference plane (50).
CN202321404009.XU 2023-06-05 2023-06-05 Non-contact interactive display device Active CN219285524U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202321404009.XU CN219285524U (en) 2023-06-05 2023-06-05 Non-contact interactive display device

Publications (1)

Publication Number Publication Date
CN219285524U true CN219285524U (en) 2023-06-30


Legal Events

Date Code Title Description
GR01 Patent grant