CN113885702A - Aerial imaging device - Google Patents

Info

Publication number
CN113885702A
CN113885702A
Authority
CN
China
Prior art keywords
air
touch
gas
imaging device
aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111149579.4A
Other languages
Chinese (zh)
Inventor
黄瑞
刘鸿
Current Assignee
Anhui Easpeed Technology Co Ltd
Original Assignee
Anhui Easpeed Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Anhui Easpeed Technology Co Ltd filed Critical Anhui Easpeed Technology Co Ltd
Priority to CN202111149579.4A priority Critical patent/CN113885702A/en
Publication of CN113885702A publication Critical patent/CN113885702A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures


Abstract

The invention discloses an aerial imaging device comprising an imaging subsystem and a tactile feedback subsystem. The imaging subsystem images and displays human-computer interaction information in an aerial target area to form an aerial interactive interface, and, when it detects an interaction signal between a user and that interface, sends a touch feedback control instruction together with touch-point coordinate information. The tactile feedback subsystem, connected to the imaging subsystem, jets gas at the touch point according to the control instruction and the coordinate information. The device gives the user non-contact tactile feedback, satisfies the interactive experience both visually and tactilely, and is more hygienic and safer.

Description

Aerial imaging device
Technical Field
The invention relates to the field of aerial imaging, in particular to an aerial imaging device.
Background
The tactile feedback mechanisms common on the market today are mainly contact-based: contact tactile feedback technology relies on auxiliary equipment (such as screens or gloves) and produces tactile perception through vibration, electrostatic force, and similar principles. Although such devices are now miniaturized and widely commercialized, the user must make physical contact with the auxiliary device, which raises two problems. On one hand, shared use of the equipment creates public-health and safety concerns, and the user cannot escape the constraint of cumbersome hardware; on the other hand, fingerprints, palm prints, and other personal traces left on the device after use put the user's personal information at risk of leakage.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. To this end, an object of the present invention is to propose an aerial imaging device.
The aerial imaging device comprises an imaging subsystem and a tactile feedback subsystem. The imaging subsystem images and displays human-computer interaction information in an aerial target area to form an aerial interactive interface, and sends a touch feedback control instruction and touch-point coordinate information when it detects an interaction signal between a user and the interface. The tactile feedback subsystem is connected to the imaging subsystem and jets gas at the touch point according to the control instruction and the coordinate information.
The interactive imaging subsystem forms a floating real image at a given position in the air. The jet region of the air injection unit in the air-jet tactile feedback subsystem is adjusted to cover the position of the floating real image; this region forms the user's touch area. When the user touches the floating real image in the touch area with a finger or palm, the air-jet tactile feedback subsystem precisely controls the airflow jet so that the finger or palm feels the airflow impact, realizing the tactile sensation of a floating touch object.
According to the aerial imaging device of the embodiment of the invention, the imaging subsystem images and displays the human-computer interaction information in the aerial target area to form an aerial interactive interface, which serves as the reference surface for the user's touch perception and guides the user's touch. When the imaging subsystem detects an interaction signal between the user and the interface, it sends a touch feedback control instruction and touch-point coordinate information to the tactile feedback subsystem. The tactile feedback subsystem jets gas at the touch point according to the received instruction and coordinates, producing a tactile feedback effect between the human body and the touch point, so that the user physically senses the existence of the touch point: non-contact tactile feedback is achieved. Based on the aerial interactive interface presented in the aerial target area, touch feedback is triggered simply by the user touching the interface; the interaction satisfies the user both visually and tactilely, the operation is more natural and comfortable, no additional device constraining the user's operation is required, and the risk of touching equipment during operation is avoided. At the same time, the information-security problem of personal information leaking through residual fingerprints is avoided.
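The control flow just described can be sketched as a minimal event pipeline. This is an illustration only: the class and method names below are invented, and real coordinate frames, instruction formats, and transport (wired or wireless) would differ.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    # touch-point coordinates in the aerial interactive interface (metres)
    x: float
    y: float
    z: float

class HapticFeedbackSubsystem:
    def on_touch(self, event: TouchEvent) -> str:
        # act on the touch feedback control instruction:
        # jet gas toward the reported touch-point coordinates
        return f"jet gas at ({event.x:.2f}, {event.y:.2f}, {event.z:.2f})"

class ImagingSubsystem:
    def __init__(self, haptics: HapticFeedbackSubsystem):
        self.haptics = haptics  # the two subsystems are connected

    def detect_interaction(self, x: float, y: float, z: float) -> str:
        # on detecting a user/interface interaction signal, forward the
        # control instruction and touch-point coordinates to the haptics
        return self.haptics.on_touch(TouchEvent(x, y, z))

imaging = ImagingSubsystem(HapticFeedbackSubsystem())
print(imaging.detect_interaction(0.10, 0.25, 0.40))  # jet gas at (0.10, 0.25, 0.40)
```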
In some embodiments, the imaging subsystem comprises a housing, an imaging assembly, a detection module, and a main control module. The housing is formed with a display window and an internal accommodating cavity. The imaging assembly is arranged in the accommodating cavity and images and displays the human-computer interaction information in the aerial target area to form the aerial interactive interface. The detection module detects the interaction signal between the user and the interface. The main control module is connected to the imaging assembly and the detection module and, in response to the interaction signal, sends the touch feedback control instruction and the touch-point coordinate information.
In some embodiments, the imaging assembly comprises a display and an optical assembly, both arranged in the accommodating cavity. The display is connected to the main control module and displays the human-computer interaction information. The optical assembly converges and images the light carrying that information in the aerial target area to form the aerial interactive interface. The display sits on the light-source side of the optical assembly, and the display window sits on its imaging side.
In some embodiments, the optical assembly includes two transparent substrates and, between them, a first optical waveguide array and a second optical waveguide array. Each array is composed of a plurality of reflective strips with rectangular cross sections; a reflective film is disposed on one or both sides along the arrangement direction of the strips, and each reflective strip of the first array is orthogonal to each reflective strip of the second array.
In some embodiments, the haptic feedback subsystem comprises a gas injection unit, a data processing module, and a driving module. The gas injection unit jets the gas. The data processing module is electrically connected to the main control module, which sends it the touch-point coordinates of the operation interface; from these coordinates and the position of the gas injection unit, the data processing module calculates the jet angle corresponding to each touch point. The driving module is electrically connected to the data processing module to receive its control signal and derive a driving signal, and is electrically connected to the gas injection unit so that gas is jetted toward the touch area or the current touch point.
In some embodiments, the air injection unit jets air to form an airflow shaped as an air beam or an air curtain. The unit's jetting mode is either normally open or triggered in real time: a real-time (triggered) unit jets air when the user enters the touch area or clicks a touch point, while a normally-open unit keeps jetting continuously.
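The two modes can be expressed as a tiny decision rule (a hypothetical sketch; the enum and function names are invented, not from the patent):

```python
from enum import Enum

class JetMode(Enum):
    NORMALLY_OPEN = "normally_open"  # airflow is kept on continuously
    TRIGGERED = "triggered"          # airflow only while the user interacts

def should_jet(mode: JetMode, user_present: bool) -> bool:
    # a normally-open unit always jets; a triggered unit jets only when the
    # user has entered the touch area or clicked a touch point
    return mode is JetMode.NORMALLY_OPEN or user_present

print(should_jet(JetMode.TRIGGERED, user_present=False))  # False
```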
In some embodiments, the air injection unit includes an air compressor mounted on a pan-tilt head. The air compressor jets the air, and the pan-tilt head is configured to deflect in real time on receiving the touch signal of a touch point so as to aim the jet at the current touch point.
In some embodiments, the air injection unit includes a plurality of air injection members, each an air compressor or an air knife. The members are installed at preset positions; the jet direction of each member is fixed, and its jet range covers the corresponding touch point.
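With fixed members, driving reduces to choosing which one to fire for a given touch point. The sketch below uses an invented nearest-nozzle rule with made-up positions; the patent itself only requires that each member's jet range cover its touch point:

```python
import math

# preset (x, y) positions of the air injection members in the touch plane,
# in metres -- illustrative values, not from the patent
NOZZLES = {"jet_a": (0.0, 0.0), "jet_b": (0.2, 0.0), "jet_c": (0.1, 0.2)}

def select_nozzle(touch_xy, nozzles=NOZZLES):
    """Return the id of the member closest to the touch point."""
    return min(nozzles, key=lambda n: math.dist(nozzles[n], touch_xy))

print(select_nozzle((0.18, 0.05)))  # jet_b
```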
In some embodiments, the device further comprises a gas source connected to the gas injection unit to supply it with gas.
In some embodiments, the gas injection unit further comprises a temperature adjustment module that adjusts, according to the outdoor ambient temperature, the temperature of the gas stored in the gas source or jetted by the unit.
In some embodiments, the device further comprises a silencing unit arranged at at least one of the following: outside the gas source, at the air inlet of the gas injection unit, and at the air outlet of the gas injection unit.
In some embodiments, when the user clicks a touch point, the wind noise emitted by the air injection unit serves as a touch prompt tone.
In some embodiments, the gas injection unit also has a carrier for containing a disinfecting liquid and an atomizing member for atomizing the disinfecting liquid so that the gas injection unit can inject a mixture of gas and disinfecting liquid.
In some embodiments, the airflow jetted by the air injection unit is constrained into an air beam or an air curtain.
In some embodiments, the data processing module comprises a processing unit, a data storage module, an interface module, and a power supply module. The processing unit receives the coordinate data of the touch points and calculates the phases of all channels. The data storage module comprises a data storage unit, which stores data produced during processing, and a program storage unit, which stores the system boot and loader programs. The interface module handles data communication with external equipment, and the power supply module converts the input power supply into the required supplies.
In some embodiments, the data processing module is configured to calculate the pitch and yaw angles for each touch point from the spatial coordinates of the touch point and the position of the nozzle of the air injection unit, to calculate the jet intensity from the distance between the touch point and the nozzle, and to calculate the temperature of the airflow to be jetted from the ambient temperature.
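The geometric part of this computation is straightforward. The sketch below assumes the nozzle at the origin, intensity proportional to distance, and a simple linear temperature offset toward a target temperature; all constants and names are invented for illustration and are not taken from the patent:

```python
import math

def aim_and_drive(touch, nozzle=(0.0, 0.0, 0.0),
                  base_intensity=1.0, ambient_c=20.0, target_c=30.0):
    """Return (yaw, pitch, intensity, heating) for one touch point."""
    dx, dy, dz = (t - n for t, n in zip(touch, nozzle))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    yaw = math.degrees(math.atan2(dy, dx))                    # horizontal aim
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # vertical aim
    intensity = base_intensity * dist      # farther touch point, stronger jet
    heating_c = target_c - ambient_c       # warm (or cool) toward the target
    return yaw, pitch, intensity, heating_c

yaw, pitch, intensity, heat = aim_and_drive((0.3, 0.0, 0.4))
print(round(yaw, 1), round(pitch, 1), round(intensity, 2), heat)  # 0.0 53.1 0.5 10.0
```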
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic view of an aerial imaging device according to one embodiment of the invention.
FIG. 2 is a schematic view of an aerial imaging device according to another embodiment of the invention.
FIG. 3 is a schematic diagram of human-computer interaction of an aerial imaging device according to another embodiment of the invention.
FIG. 4 is a schematic diagram of the working principle of the aerial imaging device according to an embodiment of the invention.
FIG. 5 is a schematic structural diagram of an optical assembly of an aerial imaging device according to an embodiment of the invention.
FIG. 6 is a schematic diagram of the first optical waveguide array and the second optical waveguide array of an aerial imaging device according to an embodiment of the invention.
FIG. 7 is a schematic top view of an optical assembly of an aerial imaging device according to an embodiment of the invention.
FIG. 8 is a partial schematic view of the first optical waveguide array and the second optical waveguide array of an aerial imaging device according to an embodiment of the invention.
FIG. 9 is a schematic optical-path diagram of an optical assembly of an aerial imaging device according to an embodiment of the invention.
FIG. 10 is a block diagram of a data processing module according to one embodiment of the invention.
FIG. 11 is a block diagram of a driving module according to one embodiment of the invention.
FIG. 12 is a schematic view of the constraint of the airflow jetted by the air injection unit according to one embodiment of the invention.
FIG. 13 is a schematic view of the constraint of the airflow jetted by the air injection unit according to another embodiment of the invention.
FIG. 14 is a schematic view of the constraint of the airflow jetted by the air injection unit according to still another embodiment of the invention.
FIG. 15 is a schematic view of a control scheme for the air beam or air curtain according to one embodiment of the invention.
FIG. 16 is a schematic view of a control scheme for the air beam or air curtain according to another embodiment of the invention.
FIG. 17 is a schematic view of a control scheme for the air beam or air curtain according to yet another embodiment of the invention.
Reference numerals:
aerial imaging device 100, imaging subsystem 200, haptic feedback subsystem 300,
housing 1, optical assembly 2, display 3, detection module 4, main control module 5, first optical waveguide array 6, second optical waveguide array 7, transparent substrate 8, reflective strip 9, reflective film 10, photosensitive adhesive 11, shell 12, air injection unit 13, data processing module 14, driving module 15, touch area 16, air knife 17, constraining mechanism 18, pan-tilt head 19, and three-dimensional touch space D in which the touch area lies.
Detailed Description
Embodiments of the present invention are described in detail below with reference to figs. 1 to 17; the embodiments described with reference to the drawings are exemplary.
The aerial imaging device 100 with the haptic feedback function according to the embodiment of the present invention includes: an imaging subsystem 200, and a haptic feedback subsystem 300.
As shown in fig. 1, the imaging subsystem 200 is configured to image and display human-computer interaction information in an air target area to form an air interaction interface, and send a touch feedback control instruction and touch point coordinate information when an interaction signal between a user and the air interaction interface is detected. The tactile feedback sub-system 300 is connected to the imaging sub-system 200, and the tactile feedback sub-system 300 is configured to eject gas to the touch point according to the touch feedback control command and the touch point coordinate information.
In the embodiment of the present invention, the imaging subsystem 200 employs an interactive aerial imaging technology, and forms a floating real image, i.e., an aerial interactive interface, at a certain position in the air to serve as a reference surface for the user to sense the touch, and a three-dimensional space where the floating real image is covered is an aerial target area. Human-computer interaction information is converged and imaged in an air target area through the imaging subsystem 200 to form an air interaction interface, and when an interaction signal of a user and the air interaction interface is detected, the imaging subsystem 200 sends a touch feedback control instruction and touch point coordinate information to the tactile feedback subsystem 300.
The touch area of the haptic feedback subsystem 300 covers the three-dimensional space in which the aerial interactive interface lies, i.e., the aerial target area. Specifically, since the size and display position of the aerial interface are relatively fixed, the haptic feedback subsystem 300 generates a touch feedback plane of equal size and position from the known interface. In other words, the imaging subsystem 200 provides the aerial interface to guide the user's touch, and the haptic feedback subsystem 300 provides the touch feedback plane to give the user tactile perception of the touched object.
According to the aerial imaging device 100 of the embodiment of the invention, the imaging subsystem 200 images and displays the human-computer interaction information in the aerial target area to form an aerial interactive interface, which serves as the reference surface for the user's touch perception and guides the user's touch. When the imaging subsystem 200 detects an interaction signal between the user and the interface, it sends a touch feedback control instruction and touch-point coordinate information to the haptic feedback subsystem 300. The haptic feedback subsystem 300 jets gas at the touch point according to the received instruction and coordinates, so that the user physically senses the existence of the touch point and non-contact tactile feedback is achieved. Touch feedback is triggered simply by the user touching the interface presented in the aerial target area; the interaction satisfies the user both visually and tactilely, the operation is more natural and comfortable, no additional device constraining the user's operation is required, the risk of touching equipment during operation is avoided, and so is the information-security problem of personal information leaking through residual fingerprints.
Therefore, the non-contact tactile feedback technology and the interactive aerial imaging technology are deeply integrated: through system integration and development, a novel interactive touch feedback device is realized, creating a brand-new display and interaction experience for users both visually and tactilely.
In some embodiments, as shown in fig. 2, the imaging subsystem 200 includes a housing 1, an imaging assembly, a detection module 4, and a main control module 5, where the imaging assembly comprises an optical assembly 2 and a display 3. The housing 1 is formed with a display window and an internal accommodating cavity; the imaging assembly is arranged in the accommodating cavity and images and displays the human-computer interaction information in the aerial target area to form the aerial interactive interface. Specifically, the display 3 and the optical assembly 2 are both disposed in the accommodating cavity, the display 3 is connected to the main control module 5, the display 3 is disposed on the light-source side of the optical assembly 2, and the display window is located on the imaging side of the optical assembly 2. The display 3 displays the human-computer interaction information; the light carrying this information is projected onto the optical assembly 2, which converges and images it in the aerial target area to form the aerial interactive interface. For example, the display 3 may show a guidance picture, of which the aerial interface presents a real image; the user touches an icon in the real image to send an interaction signal, and the main control module 5 receives the signal and controls the display 3 to show the corresponding interaction information, thereby implementing contactless human-computer interaction.
The detection module 4 is used for detecting the interaction signal of the user and the air interaction interface. In the embodiment, the sensing form of the detection module 4 includes, but is not limited to, far and near infrared, ultrasonic, laser interference, grating, encoder, fiber-optic or CCD (Charge-coupled Device), etc. The sensing area of the detection module 4 and the floating real image are located on the same plane and cover the three-dimensional space where the floating real image is located, the optimal sensing form can be selected according to the installation space, the viewing angle and the use environment, a user can operate the floating real image in the optimal posture conveniently, and the sensitivity and the convenience of user operation are improved.
The main control module 5 is connected to the detection module 4, and as shown in fig. 3, the main control module 5 and the detection module 4 may be connected in a wired or wireless manner to transmit digital or analog signals, so that the volume of the aerial imaging device 100 may be flexibly controlled, and the electrical stability of the aerial imaging device 100 may be enhanced. The main control module 5 is configured to send a touch feedback control instruction and touch point coordinate information in response to the interaction signal. The touch feedback subsystem 300 receives a touch feedback control command, and sprays gas to the coordinate position of the touch point, so that a user can feel touch feedback, and interaction experience is improved.
The position of the floating real image generated by the imaging assembly is relatively fixed in the air, making it an excellent interaction guide. The ejection region of the haptic feedback subsystem 300 is set to cover the three-dimensional space in which the floating real image lies; for example, the path of the airflow jetted by the haptic feedback subsystem 300 can be set to cover the region of the floating real image to be touched. When the user touches the floating real image, the subsystem jets an air curtain or air beam at the appropriate moment; on the jet path the body feels the impact of the airflow, so the user genuinely senses the existence of the touch point and perceives the touched object. Non-contact interactive operation thus becomes easier, more natural, more efficient, and more flexible, the risk of contacting the device body is avoided, and truly non-contact tactile feedback is realized.
In some embodiments, as shown in fig. 4, the optical component 2 is mounted on the housing 1, and the display 3 is located on one side of the optical component 2 and within the housing 1 to generate an operation interface in the form of a floating real image on the other side of the optical component 2.
Preferably, the included angle between the optical assembly 2 and the display 3 is in the range 45° ± 5°, which makes fuller use of the size of the optical assembly 2 while giving better imaging quality and a smaller afterimage effect. Other angles may be chosen at the cost of some imaging quality if the imaging position imposes other requirements. It is likewise preferred to size the optical assembly 2 so that the user sees the entire floating real image of the display 3 at a glance; if in practice only part of the display content needs to be visible, the size and position of the optical assembly 2 can be chosen freely according to the actual display picture.
The imaging mode of the display 3 may include RGB (red, green, blue) light-emitting diodes (LEDs), LCOS (liquid crystal on silicon), an OLED (organic light-emitting diode) array, projection, laser diodes, or any other suitable display or stereoscopic display. The display 3 may provide a clear, bright, high-contrast dynamic image light source; in some embodiments, the luminance of the display 3 is not lower than 500 cd/m², which reduces the effect of luminance loss during light-path propagation.
In addition, according to some embodiments of the present invention, the visible angle control processing is performed on the display image surface of the display 3, so that the ghost of the floating real image can be reduced, the image quality can be improved, and the peeping of others can be prevented, thereby being widely applied to input devices requiring privacy information protection.
In the embodiment of the present invention, the optical component 2 employs an equivalent negative refractive index optical element, as shown in fig. 5 to 8, the optical component 2 includes two transparent substrates 8, and a first optical waveguide array 6 and a second optical waveguide array 7 disposed between the two transparent substrates 8, the first optical waveguide array 6 and the second optical waveguide array 7 are composed of a plurality of reflective strips 9 having rectangular cross sections, a reflective film 10 is disposed on one side or both sides along the arrangement direction of the reflective strips 9, and each reflective strip 9 of the first optical waveguide array 6 and each reflective strip 9 of the second optical waveguide array 7 are orthogonal to each other.
In some embodiments, as shown in figs. 6 and 7, the first optical waveguide array 6 and the second optical waveguide array 7 have the same thickness. Giving the two arrays the same thickness simplifies their structure, reduces manufacturing difficulty, improves production efficiency, and lowers production cost. Note that "the same thickness" here admits a tolerance rather than absolute equality: in the interest of production efficiency, a thickness difference between the optical waveguide arrays is acceptable as long as the aerial imaging quality is not affected.
Specifically, as shown in fig. 5, the optical assembly 2 comprises, in order from the display 3 side to the floating-real-image side, one transparent substrate 8, the first optical waveguide array 6, the second optical waveguide array 7, and another transparent substrate 8. Each of the two transparent substrates 8 has two optical faces; the transparent substrate 8 has a transmittance of about 90%-100% for light with wavelengths of about 390 nm to 760 nm, and its material may include at least one of glass, plastic, polymer, and acrylic, serving to protect the optical waveguide arrays and filter out unwanted light. Note that if the strength of the two orthogonally bonded waveguide arrays is sufficient, or if the mounting environment limits the available thickness, only one transparent substrate 8, or none, may be used.
In this embodiment, the length of the reflective strips 9 is limited by the outer dimensions of the optical waveguide array, so the lengths of the reflective strips 9 in the first optical waveguide array 6 and the second optical waveguide array 7 differ. As shown in fig. 7, the extending direction of the reflective strips 9 in the first optical waveguide array 6 is taken as the X direction, the extending direction of the reflective strips 9 in the second optical waveguide array 7 as the Y direction, and the thickness direction of the optical waveguide arrays as the Z direction. The extending (waveguide) directions of the reflective strips 9 in the two arrays are mutually perpendicular; that is, viewed along the Z direction (thickness direction), the first optical waveguide array 6 and the second optical waveguide array 7 are arranged orthogonally. Light beams in the two orthogonal directions thus converge at one point, which ensures that the object and image planes are symmetrical with respect to the equivalent negative-refractive-index optical component, produces an equivalent negative-refraction phenomenon, and realizes aerial imaging.
As shown in fig. 7, the first optical waveguide array 6 and the second optical waveguide array 7 are each composed of a plurality of parallel reflective strips 9 inclined at 45° as seen from the user's viewing angle. For example, the first optical waveguide array 6 may be composed of reflective strips 9 of rectangular cross section arranged side by side at 45° toward the lower left, and the second optical waveguide array 7 of reflective strips 9 of rectangular cross section arranged side by side at 45° toward the lower right; the arrangement directions of the reflective strips 9 in the two arrays may also be interchanged, i.e., the first array inclined toward the lower right and the second toward the lower left. In an embodiment of the invention, the optical waveguide material has a refractive index n1 with n1 > 1.4, for example n1 = 1.5, n1 = 1.55, n1 = 1.8, or n1 = 2.
Two interfaces exist between each reflective strip 9 and its adjacent reflective strips 9. As shown in fig. 8, each interface is bonded with photosensitive glue 11 or thermosetting glue of thickness T1; in an embodiment, T1 > 0.001 mm, for example T1 = 0.002 mm, T1 = 0.004 mm, or T1 = 0.005 mm. Photosensitive adhesive 11 is likewise provided between adjacent optical waveguides in the optical component 2 and between each optical waveguide array and the transparent substrate 8.
The cross section of each reflective strip 9 is rectangular, and a reflective film 10 is provided on one or both sides along the arrangement direction of the reflective strips 9. Specifically, in the arrangement direction, both sides of each reflective strip 9 are plated with reflective films 10; the reflective film 10 may be made of a metal such as aluminum or silver, or of another non-metallic compound material that achieves total reflection. The reflective film 10 prevents light that is not totally reflected from entering an adjacent optical waveguide and forming stray light that would degrade imaging. Further, a dielectric film may be added over the reflective film 10 to increase its reflectance.
The cross-sectional width a and length b of a single reflective strip 9 satisfy 0.1 mm < a < 5 mm and 0.1 mm < b < 5 mm. For large-screen display, large sizes can be achieved by splicing several optical waveguide arrays. The overall shape of the optical waveguide array can be chosen to suit the application scenario. In some embodiments, the two optical waveguide arrays are rectangular overall; the reflective strips 9 at the two diagonal corners are triangular and the middle reflective strips 9 are trapezoidal, so the individual strips 9 differ in length: the reflective strip 9 on the diagonal of the rectangle is the longest, and the strips 9 at the two ends are the shortest.
In addition, the optical assembly 2 further includes an anti-reflection component and a viewing-angle control component (not shown). The anti-reflection component improves the overall transmittance of the optical assembly 2 and thus the clarity and brightness of the floating real image. The viewing-angle control component eliminates residual images of the floating real image, reduces visual vertigo, and prevents observers from peering into the aerial imaging device 100 from other angles, improving the overall appearance of the device. The two components may be combined, or each arranged independently between the transparent substrate 8 and a waveguide array, between the two waveguide arrays, or on the outer surface of a transparent substrate 8.
The imaging principle of the optical assembly of the embodiment of the present invention is explained with reference to fig. 9.
The optical component 2 is described taking two mutually orthogonal optical waveguide arrays as an example. Specifically, on the micrometer scale, the mutually orthogonal double-layer waveguide array structure orthogonally decomposes any optical signal into two mutually perpendicular component signals. Signal X is totally reflected at the surface of the reflective film 10 in the first optical waveguide array 6, with a reflection angle equal to the incidence angle; signal Y remains parallel to the first optical waveguide array 6, passes through it, and is then totally reflected at the surface of the reflective film 10 in the second optical waveguide array 7, again with a reflection angle equal to the incidence angle. The reflected optical signal composed of the reflected signals X and Y is mirror-symmetric to the original optical signal. Light in any direction therefore becomes mirror-symmetric after passing through the optical component 2, so the divergent light of any light source re-converges into a floating real image at the symmetric position. The imaging distance of the floating real image equals the distance from the optical component 2 to the image source (display 3); that is, the optical component 2 images at equal distances. Since the floating real image lies in the air, no special carrier is needed, and the real image forms directly in the air. The aerial image seen by the user is thus a floating real image formed by converging light emitted from a physically existing object.
A light source, such as the display 3, emits light carrying the interactive information, and the above process takes place as the light passes through the optical assembly 2. Specifically, as shown in fig. 9, the light source is at distance L from the optical component 2. Rays from the light source strike the first optical waveguide array 6 at incidence angles α1, α2, α3 and are reflected at angles β1, β2, β3, with α1 = β1, α2 = β2, α3 = β3. The reflected rays are projected onto the second optical waveguide array 7 at incidence angles γ1, γ2, γ3 and are reflected at angles δ1, δ2, δ3, with γ1 = δ1, γ2 = δ2, γ3 = δ3. The rays leaving the second optical waveguide array 7 exit at angles α1, α2, α3 … αn and converge at an image point at distance L from the optical component 2; that is, the image and the light source (display 3) are equidistant from the optical component 2, and the visible angle is ε = 2·max(α). Consequently, if the optical waveguide array is small, the image can only be seen within a certain distance in front of it; increasing the array size allows a larger imaging distance and hence a larger field of view.
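The equal-distance imaging relationship described above can be sketched numerically. A minimal sketch follows; the function names and the 150 mm / 30° figures are illustrative assumptions, not values from this patent:

```python
def image_distance_mm(source_distance_mm):
    """The optical component images at equal distance: the floating real
    image forms at the same distance L on the opposite side (illustrative)."""
    return source_distance_mm

def visible_angle_deg(max_incident_angle_deg):
    """Visible angle epsilon = 2 * max(alpha), per the geometry above."""
    return 2.0 * max_incident_angle_deg

# Example: display 150 mm behind the optical component, rays incident
# at up to 30 degrees -> image 150 mm in front, 60 degree viewing cone.
L = image_distance_mm(150.0)
eps = visible_angle_deg(30.0)
```

This makes explicit why a larger waveguide array (which admits larger max(α) and larger L) widens both the imaging distance and the field of view.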
It should be noted that the above mainly describes the imaging principle of the optical module 2 using the double-layer structure of the first optical waveguide array 6 and the second optical waveguide array 7. If a plurality of prism-shaped reflective strips 9, each with reflective films on its four peripheral surfaces, are arrayed in both the X and Y directions within a single optical waveguide structure (that is, the two optical waveguide arrays are merged into one layer), the imaging principle is the same, and such a structure may also be used for the optical module 2 of the embodiment of the present invention. The structure of the optical module 2 is therefore not particularly limited here.
The haptic feedback subsystem 300 of an embodiment of the present invention is described below.
In some embodiments, as shown in fig. 1 and 2, the haptic feedback subsystem 300 includes an air injection unit 13, a data processing module 14, and a driving module 15. The air injection unit 13 injects gas. The data processing module 14 is electrically connected to the main control module 5; the main control module 5 receives the interaction signal detected by the detection module 4 and sends a touch feedback control instruction together with touch point coordinate information, i.e., whenever an interactive operation occurs on the interactive interface, the main control module 5 sends the touch point coordinates of the interactive interface to the data processing module 14. The data processing module 14 is small; it can be fixed on the driving board through a high-speed connector, connected to the driving module 15 on that board, and connected to the air injection unit 13 through a cable. From the touch point coordinates and the position of the air injection unit 13, the data processing module 14 calculates the jet angle and jet intensity corresponding to the touch point. The driving module 15 is electrically connected to the data processing module 14 to receive its control signal and output a driving signal, and is electrically connected to the air injection unit 13 to control it to inject gas toward the touch area 16 or the current touch point.
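As a minimal sketch of the calculation the data processing module 14 is described as performing, the following derives a jet angle and intensity from a touch point and nozzle position. The coordinate frame, function names, and linear intensity model are assumptions for illustration only:

```python
import math

def jet_angles_deg(touch_xyz, nozzle_xyz):
    """Pan (horizontal deflection) and tilt (pitch) angles, in degrees,
    that point the nozzle at the touch point (illustrative frame)."""
    dx = touch_xyz[0] - nozzle_xyz[0]
    dy = touch_xyz[1] - nozzle_xyz[1]
    dz = touch_xyz[2] - nozzle_xyz[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

def jet_intensity(touch_xyz, nozzle_xyz, base=1.0):
    """Intensity scaled with distance so the airflow still reaches the
    touch point; the real mapping would be device-specific."""
    return base * math.dist(touch_xyz, nozzle_xyz)
```

A touch at (3, 4, 0) seen from a nozzle at the origin, for instance, yields an intensity proportional to the 5-unit distance.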
The air injection unit 13 is the core unit of the haptic feedback subsystem 300, generating the air beam or air curtain required for interaction. When a human body contacts the air beam or air curtain, the impact of the airflow on the skin can be sensed, realizing tactile feedback. In this system, the data processing module 14 processes and calculates the coordinate data of the focusing point and sends the resulting control signal to the driving module 15; the driving module 15 amplifies the control signal into a driving signal that controls the air injection unit 13, so that the unit injects airflow at the proper time onto the coordinate position the user is operating on the interactive interface, letting the user feel the airflow and thus achieving tactile feedback.
In an embodiment, the data processing module 14 handles coordinate data reception, phase data calculation, and parallel data transmission; its functional block diagram is shown in fig. 10. As shown in fig. 10, the data processing module 14 mainly includes a data processing unit, a first data storage unit, a first interface unit, and a first power supply unit. The data processing unit mainly performs touch coordinate data reception and per-channel phase calculation; in some embodiments a floating-point Digital Signal Processing (DSP) chip may be adopted, preferably with a dominant frequency above 200 MHz, for example 250 MHz, 300 MHz, or 400 MHz, to meet the system's real-time requirements. The first data storage unit is divided into a data storage subunit and a program storage subunit: the data storage subunit may be DDR2 (Double Data Rate 2) or SDRAM (synchronous dynamic random access memory) for data storage during processing, while the program storage subunit uses NAND FLASH memory to store the system boot program and loader program. The first interface unit handles data communication between the data processing unit and external equipment: it connects to the main control module 5 through a serial/USB port to transfer touch coordinate data and system control instructions, and transfers phase data and feedback data through the uPP parallel port integrated in the DSP. The first power supply unit mainly performs power conversion, turning the input supply into the stable, reliable supplies each unit requires.
As shown in fig. 11, the driving module 15 includes a control unit, a second data storage unit, a second interface unit, a driving unit, and a second power supply unit. The control unit receives and processes control parameters and outputs control signals; an ARM chip with abundant pin resources can be selected as the main control chip, whose rich interface resources meet the system design requirements. The second data storage unit has an external SPI FLASH for data storage, which can also preserve initialization parameters across power loss. The second interface unit communicates with the data processing unit through the uPP parallel port. The driving unit amplifies the control signal output by the ARM and outputs a driving signal that meets the requirements. The second power supply unit provides a stable, reliable supply for the whole system: external power input to the driving board is converted by DC-DC and linear regulation into the supplies each powered unit needs. Because of system design constraints, the distance between the driving unit and the air injection unit 13 may be large; the driving signal can therefore be transmitted as differential signals, over a 485 bus, or over an industrial field bus, and the signal cable uses multiple strands of single-core twisted pair to suppress interference as far as possible.
In an embodiment, the air injection unit 13 may use, but is not limited to, a fan, a medium/high-pressure centrifugal fan, an air pump, or an air compressor to drive the injection. By the form of the injected airflow, air injection units 13 divide into an air beam type and an air curtain type: the air beam type suits precise-control scenarios, the air curtain type suits simple interaction scenarios, and the two schemes arrange their air injection units 13 differently. As shown in fig. 1, the layout of the air beam type is flexible: the unit can adopt a circular-hole injector head with two degrees of freedom of angle control, the injected airflow forms an air beam, and the data processing module 14 derives the injection angle from the touch coordinate information to adjust and correct the head's two-dimensional angle, achieving accurate injection at any point in three-dimensional space. As shown in fig. 2, the air curtain type injects air to form a curtain that must lie in, and coincide with, the plane of the aerial interface, i.e., the touch area 16; touch feedback can be realized by continuous injection or by well-timed intermittent injection. Examples follow.
In some embodiments, the air injection unit 13 may be an air compressor, such as a miniature oil-free silent air compressor, in which a motor drives two screws to compress air, and which can inject a quantitative, directional bundled airflow through a dedicated circular-hole flexible nozzle. Specifically, the air compressor is mounted on a pan-tilt head, for example a two-degree-of-freedom head, configured to deflect in real time after receiving a signal that a touch point has been touched, so that the compressor sprays air at the current touch point. For example, when a user operates the aerial interactive interface, the main control module 5 sends the touch coordinate information to the data processing module 14, which determines the jet angle from the touch coordinates and the position of the air injection unit 13 and sends a control signal to the driving module 15 to drive the head's deflection motors. The compressor's nozzle is thus quickly corrected to the right jet angle, its jet path covers the point to be touched, and the user feels the airflow, realizing touch feedback.
In other embodiments, as shown in fig. 2, the air injection unit 13 may employ an air knife 17, which uses the Coanda effect to turn a blown air source or compressed air into an air curtain with a certain initial thickness, for example 0.05 mm. The blown air source may come from a blower or a medium/high-pressure centrifugal fan, and the compressed air from an air pump or air compressor. The width of the curtain formed by the air knife 17 can be freely customized, which makes it convenient to integrate into the contactless interaction device.
Further, the air curtain formed by the air knife 17 has a certain divergence angle, for example between 5° and 6°, so the farther from the air knife 17, the thicker the curtain. To ensure a good user experience, three optimizations are possible. First, the air knives 17 may be mounted offset by half the divergence angle α: as shown in fig. 12, if the curtain points straight at the touch area 16, the divergence angle scatters part of the curtain to both sides of the touch plane, degrading the experience; instead, the boundary surface of the curtain can be made to coincide with the plane of the touch area 16, i.e., the ejection angle of the air knife 17 is offset by α/2 so that the curtain coincides with the touch plane. Second, as shown in fig. 13, a constraining mechanism 18 may be placed between the air knife 17 and the touch area 16 to constrain the divergence angle of the curtain or beam ejected by the air injection unit 13: as the diverging curtain passes through, the constraining mechanism 18 eliminates or reduces its divergence angle, lessening its effect on touch feedback and improving the experience. Third, as shown in fig. 14, the air knives 17 may be mounted close, i.e., the distance H between the air knives 17 and the touch area 16 is reduced; when the touch area 16 is close to the air knives 17, the divergence angle has little effect there and the curtain's thickness has a negligible effect on touch feedback, which suits small devices.
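The divergence-angle geometry underlying these optimizations can be sketched as follows. The 5.5° divergence and 0.05 mm initial thickness mirror the examples above, while the function names and the symmetric-spread model are assumptions:

```python
import math

def curtain_thickness_mm(initial_mm, distance_mm, divergence_deg):
    """Curtain thickness at distance H from the air knife, assuming the
    curtain spreads symmetrically: t(H) = t0 + 2*H*tan(divergence/2)."""
    half = math.radians(divergence_deg / 2.0)
    return initial_mm + 2.0 * distance_mm * math.tan(half)

def mount_offset_deg(divergence_deg):
    """Offset-mounting the knife by half the divergence angle keeps one
    boundary of the curtain on the plane of the touch area."""
    return divergence_deg / 2.0

# Shortening the knife-to-touch-area distance H directly shrinks the
# thickness penalty:
near = curtain_thickness_mm(0.05, 50.0, 5.5)
far = curtain_thickness_mm(0.05, 200.0, 5.5)
```

This also shows why the close-mounting option suits small devices: at small H the added thickness term is nearly zero.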
Of course, the haptic feedback subsystem 300 may further include a housing 12, and the housing 12 is used for carrying and fixing the air injection unit 13, the data processing module 14, the driving module 15, and other components. The housing 12 provides protection and support for the haptic feedback subsystem 300 against damage to the system from external shock and vibration. The structure of the housing 12 includes, but is not limited to, a cabinet, a case, etc., depending on the application scenario and the actual requirements.
In combination with the above implementations of the air injection unit 13, the air beam or air curtain can also be controlled in various ways, suiting more application scenarios. In an embodiment, the injection mode of the air injection unit 13 may be normally open or triggered in real time: the real-time unit injects air when a user enters the touch area 16 or clicks a touch point, while the normally open unit stays in a continuous injection state, i.e., injects at all times.
The manner in which the air stream or curtain is controlled is illustrated below.
In the first embodiment, the air injection unit 13 injects an air beam for precise control. As shown in fig. 15, the air injection unit 13 includes an air compressor and a pan-tilt head 19; the compressor injects the air and sits on the head 19, which deflects in real time after receiving a signal that a touch point has been touched, so that the compressor sprays air at the current touch point and touch feedback is achieved.
Specifically, the air injection unit 13 is an air compressor whose nozzle orientation is fixed relative to the compressor body. The compressor is mounted on the two-degree-of-freedom pan-tilt head 19, whose drive motors, under control signals, can deflect it to any angle horizontally and vertically, or freely within a certain angular range; the pitch and horizontal deflection angles of the nozzle are thus adjustable within a certain range, and the injected air beam can cover any touch point in a certain space. In this embodiment, the injection of the air beam is controlled in real time: only when an object touches the touch area 16 is the compressor triggered to act quickly and inject a fixed amount of air. This approach is mainly applied where high-precision touch is required, or where the control units are numerous and small; it can serve three-dimensional touch objects such as the three-dimensional touch space D containing the touch area 16 in fig. 15. It offers strong real-time behavior and fast response, and its timely, quantitative injection saves a large amount of compressed gas and improves energy utilization.
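A minimal sketch of this real-time triggered behavior follows; all class and attribute names are assumed for illustration. The head deflects to the corrected angle first, and only a touch event triggers a fixed quantitative burst:

```python
from dataclasses import dataclass

@dataclass
class PanTiltJet:
    """Illustrative controller for an air compressor on a two-DOF head."""
    pan: float = 0.0
    tilt: float = 0.0
    bursts: int = 0

    def on_touch(self, pan_deg, tilt_deg):
        # Drive the deflection motors to the corrected spray angle first,
        # so the jet path covers the point to be touched...
        self.pan, self.tilt = pan_deg, tilt_deg
        # ...then trigger a fixed, quantitative burst of compressed air.
        self.bursts += 1

jet = PanTiltJet()
jet.on_touch(15.0, -5.0)  # touch event needing 15 deg pan, -5 deg tilt
```

Because nothing fires without a touch event, compressed gas is spent only when feedback is actually needed, matching the energy-saving claim above.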
In the second embodiment, the air injection unit 13 is controlled to inject multiple air jets or air curtains to achieve directional control. As shown in fig. 16, the air injection unit 13 includes a plurality of air injection members, each of which is an air compressor or an air knife 17, the plurality of air injection members are installed at preset positions, the injection direction of each air injection member is fixed, and the injection range of each air injection member covers a corresponding touch point.
Specifically, as shown in fig. 16, a plurality of air compressors or small air knives 17 (each serving as an air injection unit 13) are installed at fixed positions, ensuring that their injection paths cover the points to be touched within the three-dimensional touch space D containing the touch area 16. In this embodiment, the ejection of the air beams or curtains is controlled in real time: only when an object contacts the touch area 16 is the corresponding air injection unit 13 triggered to eject a set amount of air. This approach suits simple touch objects with few touch points, and also the three-dimensional touch space D of a three-dimensional touch object: when a touch point in the space D is operated, the air compressor or air knife 17 corresponding to that coordinate position is activated and ejects its beam or curtain toward the touch position without any re-aiming of the air injection unit 13, so the control process is simple.
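The fixed-direction scheme thus reduces to a lookup: each injection member covers a fixed region, and a touch activates the member whose region contains the point. The coverage mapping and names below are illustrative assumptions:

```python
def select_jet(touch_point, coverage):
    """Return the id of the fixed jet whose rectangular coverage region
    contains the 2D touch point, or None if no jet covers it."""
    x, y = touch_point
    for jet_id, ((xmin, ymin), (xmax, ymax)) in coverage.items():
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return jet_id
    return None

# Two small air knives, each covering half of the touch area:
coverage = {
    "knife_left": ((0.0, 0.0), (50.0, 100.0)),
    "knife_right": ((50.0, 0.0), (100.0, 100.0)),
}
```

No angle computation is needed at runtime, which is exactly why the control process stays simple compared with the pan-tilt scheme.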
In the third embodiment, the air injection unit 13 is controlled to inject multiple air jets or air curtains to realize directional normally open type air injection. As shown in fig. 16, the present embodiment is basically the same as the control method of the second embodiment, except that in the present embodiment, the injection manner of the air beam or the air curtain is a normally open type, that is, after the haptic feedback function is started, the air injection unit 13 is controlled to be in the state of injecting air all the time, and it is not necessary to identify whether an object contacts the touch area 16 or the three-dimensional touch space D where the touch area 16 is located.
In the fourth embodiment, the gas injection unit 13 injects gas to form a single air curtain for directional control. Specifically, as shown in fig. 17, a large air knife 17 may be used whose size matches the touch area, so that the curtain it sprays coincides with the touch area 16; when an object is recognized as touching the touch area 16, the air knife 17 is rapidly commanded to spray the curtain. For a fixed touch area 16, the air knife 17 may sit on any side: as shown in fig. 17, taking the touch area 16 as the origin, the knife may be placed in the +x, -x, +y, or -y direction, and in each case its curtain can coincide with the touch area 16. This approach only suits planar touch objects, but works whether the touch units are simple or complex, making it a general scheme: the equipment integrates easily, demands little compressed gas, keeps the control process simple, and keeps the device volume well under control.
The fifth embodiment controls the gas injection unit 13 to inject gas to form a single gas curtain to realize directional normally open control, which is basically the same as the control of the fourth embodiment, except that in this embodiment, the injection manner of the gas curtain is changed to a normally open type, that is, the gas injection unit 13 is controlled to be in a state of always injecting gas, and it is not necessary to identify whether an object contacts the touch area 16. Compared to the fourth embodiment above, the method can further simplify the control process and the complexity of the apparatus.
The above is merely an illustration of the arrangement and the air injection control manner of the air injection unit 13, and is not exhaustive here, and other control manners and arrangement modifications based on the embodiment of the present invention also belong to the protection scope of the present application.
In some embodiments, the haptic feedback subsystem 300 further comprises a gas source connected to the gas injection unit 13 to provide gas to the gas injection unit 13. For example, the gas source may be a device for storing the gas to be sprayed by the gas spraying unit 13, so as to facilitate the treatment of the gas to be sprayed, such as temperature adjustment or disinfection, and improve the user experience.
For example, in some embodiments, the haptic feedback subsystem 300 further comprises a temperature adjustment module for adjusting, according to the ambient temperature, the temperature of the gas stored in the gas source or ejected by the gas injection unit 13. Specifically, the temperature adjustment module sets the temperature of the gas forming the air beam or curtain: a warm airflow in a low-temperature environment, a cool airflow in a high-temperature environment, with both kept within the comfort range of human perception, ensuring the user's comfort on feeling the airflow. The air source includes, but is not limited to, a fan, a medium/high-pressure centrifugal fan, an air pump, or an air compressor. In normal operation, the air source supplies the intake of the air injection unit 13, which then ejects the beam or curtain to create an airflow impact at the touch area 16 or touch point, realizing touch feedback; because the temperature adjustment module conditions the beam or curtain, the user feels no discomfort on contact, improving the experience.
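The temperature adjustment can be sketched as clamping the jet temperature into a human-comfort band. The 20-28 °C band and the function name below are assumptions for illustration, not values stated here:

```python
def target_air_temp_c(ambient_c, comfort_low=20.0, comfort_high=28.0):
    """Warm the airflow in cold environments and cool it in hot ones by
    clamping into a comfort band (band limits are illustrative)."""
    return min(max(ambient_c, comfort_low), comfort_high)
```

In a 5 °C environment the module would warm the flow up to the band's lower edge; in a 35 °C environment it would cool it down to the upper edge; mild ambient temperatures pass through unchanged.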
In some embodiments, the haptic feedback subsystem 300 further comprises a sound attenuation unit disposed in at least one of the following locations: outside the air source, at the air inlet of the air injection unit 13, and at the air outlet of the air injection unit 13. When the air source and the air injection unit 13 operate, the sound attenuation unit thus eliminates or reduces the noise. For example, a sealed isolation cover outside the air source effectively suppresses the noise the air source generates; likewise, a silencer at the air inlet and/or outlet of the air injection unit 13 effectively suppresses the unit's operating noise, and the inlet silencer also filters dust.
In some embodiments, when the user touches each touch point, the wind noise emitted by the air jet unit 13 can serve as a touch prompt tone. In this way, when the user operates the touch area 16, the air jet unit 13 jets an air flow, and a short-term wind noise thereof may be set as a touch alert tone, so that the aerial imaging device 100 may provide the user with a more comprehensive interactive experience in terms of vision, touch, and hearing.
In some embodiments, the gas injection unit 13 also has a carrier containing a disinfecting liquid and an atomizing member for atomizing it, so that the gas injection unit 13 can inject a mixture of gas and disinfectant. Configured this way, the air beam or curtain ejected by the unit is a gas-disinfectant mixture, so the user's touch-operating body part, such as a hand, is disinfected while experiencing tactile feedback. The sprayed additive includes, but is not limited to, water, disinfectant, or fine particles, as determined by the application scenario and actual demand.
Based on the above description, the aerial imaging device 100 of the present invention combines the imaging subsystem 200 and the haptic feedback subsystem 300. The imaging subsystem 200 adopts a floating imaging technology to present a real image in the air, realizing contactless touch control, avoiding the spread of viruses or bacteria, and improving safety of use. The haptic feedback subsystem 300 adopts gas injection and realizes tactile feedback through the impact of the gas flow, so that the user can perform non-contact interactive operation more easily and naturally, and human tactile perception is achieved without the help of a physical device.
Specifically, when the aerial imaging device 100 according to the embodiment of the present invention works, in the initial state the main control module 5 controls the display 3 to display a guide picture, and the optical assembly 2 converges the light rays carrying the guide picture emitted by the display 3 to form an image in the air on the other side of the optical assembly 2, so as to guide the user to touch in the correct area. Meanwhile, the main control module 5 transmits the coordinate information of the touch points in the interactive interface to the data processing module 14. The data processing module 14 calculates, from the coordinates of each touch point and the position of the jetting head of the air injection unit 13, parameters such as the two-dimensional angle corresponding to each touch point (for example, the pitch angle and the deflection angle of the jetting head), the jetting intensity, and the airflow temperature, wherein the jetting intensity is related to the distance between the touch point and the air injection unit 13, and the airflow temperature is related to the ambient temperature. The data processing module 14 sends the calculated parameters to the driving module 15 according to the control instruction of the main control module 5. The driving module 15 generates a control signal according to the control instruction and the relevant parameters, and drives and amplifies it to obtain the final required driving signal. The driving module 15 transmits the driving signal to the air injection unit 13, which deflects to the proper angle under the control of the driving signal and ejects a given amount of constant-temperature air jet.
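The parameter computation performed by the data processing module can be sketched as follows. This is an illustrative model only: the function name, the linear intensity-versus-distance law, and the 32 °C comfort target are assumptions for the sketch, not the patent's actual algorithm.

```python
import math

def jet_parameters(touch_xyz, nozzle_xyz, ambient_c,
                   base_intensity=1.0, target_c=32.0):
    """Derive two-dimensional deflection angles, jet intensity, and an
    airflow heating offset for one touch point (illustrative sketch)."""
    dx = touch_xyz[0] - nozzle_xyz[0]
    dy = touch_xyz[1] - nozzle_xyz[1]
    dz = touch_xyz[2] - nozzle_xyz[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)

    # Two-dimensional angle of the jetting head: deflection (yaw) in the
    # horizontal plane, pitch toward the touch point's height.
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))

    # Farther touch points need a stronger jet; a linear model is assumed.
    intensity = base_intensity * dist

    # Offset by which the temperature adjusting module heats or cools the
    # air stream toward a comfortable target temperature.
    heater_delta = target_c - ambient_c

    return {"pitch_deg": pitch, "yaw_deg": yaw,
            "intensity": intensity, "heater_delta_c": heater_delta}
```

In a real device these parameters would then be converted by the driving module into amplified drive signals for the gimbal and the jet actuator.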
In some embodiments, if there are multiple touch points, the system can refresh at high speed to realize touch feedback for each of them, combined with the real image presented by the optical assembly 2 to guide the user's touch operation. In addition, a plurality of air injection units 13 can work synchronously; through combination and splicing, human-computer interaction on a complex operation interface, and even contour perception of a three-dimensionally displayed object, can be realized.
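For the spliced fixed-direction arrangement (each jet member covering a preset region, as in claim 8), routing touch points to jet units can be sketched as below. The dictionary layout and circular-coverage model are assumptions for illustration, not taken from the patent.

```python
def assign_touch_points(touch_points, jet_units):
    """Route each 2D touch point to the first jet unit whose circular
    coverage region contains it (illustrative sketch)."""
    assignments = {}
    for pt in touch_points:
        for unit in jet_units:
            cx, cy, r = unit["center_x"], unit["center_y"], unit["radius"]
            if (pt[0] - cx) ** 2 + (pt[1] - cy) ** 2 <= r * r:
                # First covering unit handles the point; all assigned
                # units can then fire synchronously in one refresh cycle.
                assignments.setdefault(unit["id"], []).append(pt)
                break
    return assignments
```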
In this embodiment, the air injection unit 13 may adopt a low-power, small-sized air injection device. The data processing module 14 and the driving module 15 control the intensity and duration of the air flow ejected by the air injection unit 13 according to the distance between the touched object and the air injection unit 13, and the temperature of the ejected air flow can be adjusted according to the ambient temperature by the temperature adjusting module, so that the haptic feedback subsystem 300 of the embodiment of the present invention causes no harm to the human body.
In addition, the haptic feedback subsystem 300 is miniaturized and modularized, simple to arrange, low in cost, and convenient to use and integrate. For application scenarios with large touch areas or three-dimensional touch objects, a plurality of haptic feedback subsystems 300 can be combined and spliced to significantly increase the number of touch feedback focus points, making the subsystem suitable for more application scenarios and giving it a wider application range.
In the description of the present invention, it is to be understood that the terms "central," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," "axial," "radial," "circumferential," and the like are used in the orientations and positional relationships indicated in the drawings for convenience in describing the invention and to simplify the description; they are not intended to indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation, and are not to be considered limiting of the invention.
In the description of the present invention, "the first feature" and "the second feature" may include one or more of the features. In the description of the present invention, "a plurality" means two or more. In the description of the present invention, the first feature being "on" or "under" the second feature may include the first and second features being in direct contact, and may also include the first and second features being in contact not directly but through another feature therebetween. In the description of the invention, the first feature being "above", "over," or "on top of" the second feature includes the first feature being directly above or obliquely above the second feature, or simply means that the first feature is at a higher level than the second feature.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example.
While embodiments of the invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (15)

1. An aerial imaging device, comprising:
the imaging subsystem is used for imaging and displaying human-computer interaction information in an air target area to form an air interaction interface and sending a touch feedback control instruction and touch point coordinate information when an interaction signal of a user and the air interaction interface is detected; and
a haptic feedback subsystem connected with the imaging subsystem and used for jetting gas to the touch point according to the touch feedback control instruction and the touch point coordinate information.
2. The aerial imaging device of claim 1, wherein the imaging subsystem comprises:
a housing having a display window and an accommodating cavity formed therein;
the imaging assembly is arranged in the accommodating cavity and used for imaging and displaying human-computer interaction information in an aerial target area to form an aerial interaction interface;
the detection module is used for detecting an interaction signal of a user and the air interaction interface;
and the main control module is connected with the imaging assembly and the detection module and used for responding to the interaction signal and sending a touch feedback control instruction and touch point coordinate information.
3. The aerial imaging device of claim 2, wherein the imaging assembly comprises:
the display is arranged in the accommodating cavity, connected with the main control module and used for displaying the human-computer interaction information;
the optical assembly is arranged in the accommodating cavity and is used for converging and imaging the light rays carrying the human-computer interaction information in the air target area to form an air interaction interface;
the display is arranged on the light source side of the optical assembly, and the display window is arranged on the imaging side of the optical assembly.
4. The aerial imaging device of claim 3, wherein the optical assembly comprises two transparent substrates, and a first optical waveguide array and a second optical waveguide array disposed between the two transparent substrates, the first optical waveguide array and the second optical waveguide array are formed by a plurality of reflective strips having rectangular cross sections, a reflective film is disposed on one side or both sides of the arrangement direction of the reflective strips, and each reflective strip of the first optical waveguide array and each reflective strip of the second optical waveguide array are orthogonal to each other.
5. The aerial imaging device of claim 1, wherein the haptic feedback subsystem comprises:
a gas injection unit for injecting gas;
a data processing module electrically connected with the main control module, wherein the main control module sends the touch point coordinates of the operation interface to the data processing module, and the data processing module calculates the injection angle corresponding to the touch point according to the touch point coordinates and the position of the gas injection unit;
a driving module electrically connected with the data processing module to receive a control signal of the data processing module and generate a driving signal, the driving module being electrically connected with the gas injection unit to drive the gas injection unit to inject gas toward the touch area or the current touch point.
6. The aerial imaging device according to claim 5, wherein the jet unit jets the gas to form an air flow in the form of a beam or a curtain; the jet unit is of a normally open type or a real-time open type, wherein a real-time open type jet unit jets the gas when a user enters the touch area or clicks a touch point, and a normally open type jet unit remains in a normally open state.
7. The aerial imaging device of claim 6, wherein the air injection unit comprises an air compressor and a cradle head, the air compressor is used for injecting air, the air compressor is arranged on the cradle head, and the cradle head is configured to deflect in real time after receiving the touch signal of the touch point so as to inject air toward the current touch point.
8. The aerial imaging device according to claim 7, wherein the air injection unit comprises a plurality of air injection members, the air injection members are air compressors or air knives, the plurality of air injection members are installed at preset positions, the injection direction of each air injection member is fixed, and the injection range of each air injection member covers the corresponding touch point.
9. The aerial imaging device of any of claims 5-8, further comprising a gas source connected to the gas injection unit to provide gas to the gas injection unit.
10. The aerial imaging device of claim 9, further comprising a temperature adjustment module to adjust a temperature of gas stored within the gas source or ejected by the gas ejection unit as a function of outdoor ambient temperature.
11. The aerial imaging device of claim 9, further comprising a noise reduction unit disposed at at least one of: the outside of the air source, an air inlet of the air injection unit, and an air outlet of the air injection unit.
12. The aerial imaging device of claim 9, wherein, when a user clicks each touch point, the wind noise emitted as the jet unit jets gas serves as a touch prompt tone.
13. An aerial imaging device as claimed in any one of claims 5 to 8, wherein the gas injection unit further has a carrier for containing a disinfecting liquid and an atomizing member for atomizing the disinfecting liquid so that the gas injection unit can inject a mixture of gas and disinfecting liquid.
14. An aerial imaging device as claimed in any one of claims 5 to 8, further comprising a constraining mechanism for constraining the divergence angle of the curtain or beam ejected by the jet unit.
15. An aerial imaging device as claimed in any one of claims 2 to 8, wherein the data processing module is configured to calculate a pitch angle and a yaw angle corresponding to each touch point from the spatial coordinates of each touch point and the position of the jetting head of the jetting unit, calculate a jet intensity from the distance between the touch point and the jetting head, and calculate the temperature of the air stream to be jetted from the ambient temperature.
CN202111149579.4A 2021-09-29 2021-09-29 Aerial imaging device Pending CN113885702A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111149579.4A CN113885702A (en) 2021-09-29 2021-09-29 Aerial imaging device


Publications (1)

Publication Number Publication Date
CN113885702A CN113885702A (en) 2022-01-04

Family

ID=79007982

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111149579.4A Pending CN113885702A (en) 2021-09-29 2021-09-29 Aerial imaging device

Country Status (1)

Country Link
CN (1) CN113885702A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114625278A (en) * 2022-03-09 2022-06-14 杭州名卡科技有限公司 Non-contact intelligent control terminal and intelligent opening and closing method thereof
CN115366792A (en) * 2022-04-02 2022-11-22 安徽省东超科技有限公司 Bus reminding device, operation method and storage medium
WO2024125021A1 (en) * 2022-12-12 2024-06-20 华为技术有限公司 Display device and related device

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090002330A1 (en) * 2007-06-29 2009-01-01 Shun-Ta Chien Method and system for carrying out non-contact testing of touch panel
KR20130065235A (en) * 2011-12-09 2013-06-19 엘지전자 주식회사 Mobile terminal and control method thereof
CN104094193A (en) * 2011-12-27 2014-10-08 英特尔公司 Full 3d interaction on mobile devices
CN104685447A (en) * 2012-09-28 2015-06-03 Lg电子株式会社 Display device and control method thereof
US20150277610A1 (en) * 2014-03-27 2015-10-01 Industry-Academic Cooperation Foundation, Yonsei University Apparatus and method for providing three-dimensional air-touch feedback
CN106249884A (en) * 2016-07-26 2016-12-21 北京航空航天大学 The force feedback of a kind of air pressure driving and haptic feedback devices
US20180136730A1 (en) * 2016-11-11 2018-05-17 Japan Display Inc. Display device
CN110085083A (en) * 2019-06-05 2019-08-02 南京航空航天大学 A kind of Tiny pore injection virtual report control platform of array
CN111340961A (en) * 2020-02-16 2020-06-26 吉林大学 Ultrasonic aerial touch rendering method based on three-dimensional grid model
CN112764592A (en) * 2021-01-15 2021-05-07 安徽省东超科技有限公司 Touch feedback system, terminal device, touch feedback control method and storage medium



Similar Documents

Publication Publication Date Title
CN113885702A (en) Aerial imaging device
US11489993B2 (en) Camera assembly and electronic device
JP7482163B2 (en) Augmented Reality Display System
CN110832443B (en) Compact optical system with MEMS scanner for image generation and object tracking
CN112764592B (en) Touch feedback system, terminal device, touch feedback control method and storage medium
CN110493400A (en) Projective module group and terminal
CN111062370B (en) Optical detection device
CN110458125B (en) Optical detection device
CN110458148A (en) Optical detection apparatus
EP3736622A1 (en) Electronic devices with enhanced display areas
US11886246B2 (en) Electronic devices with shape-transforming displays
CN110458124B (en) Optical detection device
KR20230017742A (en) Optical fiber illumination by a set of light emitters
JP2005504356A (en) Image display presenting large effective images
CN101859207A (en) Touch panel and electronic device applying same
CN212160680U (en) Optical detection device
CN214202336U (en) Touch feedback system and terminal equipment
CN209044429U (en) A kind of equipment
CN210402398U (en) Optical detection device
CN212160685U (en) Optical detection device and electronic apparatus
CN211427363U (en) Optical detection device
CN115547214A (en) Aerial imaging system and man-machine interaction system based on aerial imaging
JP5710715B2 (en) Camera module for optical touch screen
CN214202303U (en) Touch feedback subsystem, touch feedback system and terminal equipment
WO2022152221A1 (en) Touch-control feedback system, terminal device, touch-control feedback control method, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination