CN116795241A - Interactive system and interactive method - Google Patents

Interactive system and interactive method

Info

Publication number
CN116795241A
CN116795241A CN202210258802.7A
Authority
CN
China
Prior art keywords
interaction
interactive
area
light
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210258802.7A
Other languages
Chinese (zh)
Inventor
朱海朋
韩东成
范超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Easpeed Technology Co Ltd
Original Assignee
Anhui Easpeed Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Easpeed Technology Co Ltd filed Critical Anhui Easpeed Technology Co Ltd
Priority to CN202210258802.7A priority Critical patent/CN116795241A/en
Publication of CN116795241A publication Critical patent/CN116795241A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4817Constructional features, e.g. arrangements of optical elements relating to scanning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0428Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present application provides an interactive system comprising: an emitting device for emitting a detection light toward an external object, the exit angle of the detection light changing periodically within a single plane so that, over one period, the optical path of the detection light sweeps out a scanning area in that plane, the scanning area containing an interaction area, the interaction area being the area in which the interactive system recognizes interactive operations; a receiving device for receiving the reflected light returned by the external object struck by the detection light; and a processor, electrically connected to the emitting device and the receiving device, for calculating and recording the position of the external object's interactive operation on the interaction area and generating an interaction signal. The application also provides an interaction method.

Description

Interactive system and interactive method
Technical Field
The application relates to the field of optical detection, in particular to an interaction system and an interaction method.
Background
Existing man-machine interaction systems generally rely on touch control, physical keys, gesture recognition and the like, and are typically applied to terminals equipped with touch screens and cameras. However, with the development of new display types such as virtual reality, augmented reality, projection display and aerial imaging, existing man-machine interaction systems struggle to satisfy the interaction requirements. Specifically, touch and key interaction is usually realized by a screen or a physical device whose size, once manufactured, cannot be changed; gesture recognition requires a large amount of computation, responds slowly, and is prone to errors.
Disclosure of Invention
In one aspect, the application provides an interactive system comprising:
an emitting device for emitting a detection light toward an external object, the exit angle of the detection light changing periodically within a single plane so that, over one period, the optical path of the detection light sweeps out a scanning area in that plane, the scanning area containing an interaction area, the interaction area being the area in which the interactive system recognizes interactive operations;
a receiving device for receiving the reflected light returned by the external object struck by the detection light; and
a processor, electrically connected to the emitting device and the receiving device, for calculating and recording the position of the external object's interactive operation on the interaction area and generating an interaction signal.
In an embodiment, the emitting device includes a laser source and a scanning mirror; the laser source emits the probe light toward the scanning mirror, which reflects it. The scanning mirror rotates periodically about an axis so that the exit angle of the probe light changes periodically.
In one embodiment, the processor is electrically connected to the scan mirror for controlling and recording the angle of rotation and the period of rotation of the scan mirror.
In one embodiment, the receiving device includes a receiving lens and a photoelectric converter; the receiving lens receives the reflected light and directs it onto the photoelectric converter, which converts the optical signal of the reflected light into an electrical signal.
In an embodiment, the processor is electrically connected to the photoelectric converter, and is configured to obtain a time of flight of the probe light according to the electrical signal, so as to calculate the position of the external object.
In an embodiment, the interactive system further includes a projection device, electrically connected to the processor, for projecting an interactive interface on the interactive area, the interactive interface being configured to visualize the interactive area; the processor is used for identifying the interaction signals generated at different positions on the interaction area as different interaction instructions according to the interaction interface.
In an embodiment, the processor is further configured to set a location and a range of the interaction area.
In an embodiment, the processor is further configured to set a scanning density of the probe light on the interaction region to adjust an identification accuracy of the interaction region.
In an embodiment, the angular range swept by the optical path of the probe light is smaller than or equal to the field of view of the receiving device.
According to the interactive system provided by the embodiments of the application, a scanning mirror rotating about a single axis keeps every optical path of the probe light in the same plane, and by defining an interaction area on that plane, objects on the interaction area can be located and an interactive function realized. Compared with a touch screen, the interactive system achieves large-area interaction at lower cost, and the size of the interaction area can be changed as required; compared with an interactive system based on gesture recognition, the interactive system provided by the application scans faster and offers small volume, low energy consumption and high integration.
Another aspect of the present application provides an interaction method, including:
providing a detection light with periodically changed emergent angle;
receiving a positioning signal, and forming an interaction area on a plane where a light path of the detection light is located according to the positioning signal;
receiving reflected light returned by the detection light after encountering an external object on the interaction area, and calculating the position of the external object;
and generating an interaction signal according to the position of the external object.
In an embodiment, forming an interaction area on the plane containing the optical path of the probe light according to the positioning signal specifically includes: providing a plurality of external objects, calculating their positions, and using those positions as the boundary of the interaction area to form the interaction area.
According to the interaction method provided by the embodiments of the application, placing the interaction area on the plane containing the optical path of the probe light allows the interaction area to be customized: an appropriately sized area can be selected for the requirements of different environments, the scanning speed is high, and the computation is simple.
Drawings
Fig. 1 is a schematic structural diagram of an interactive system according to an embodiment of the application.
FIG. 2 is a functional block diagram of an interactive system according to an embodiment of the present application.
FIG. 3 is a schematic diagram illustrating operation of a scanning mirror in accordance with an embodiment of the present application.
Fig. 4 is a schematic diagram illustrating an operation of the projection apparatus according to an embodiment of the application.
FIG. 5 is a flow chart of an interaction method according to an embodiment of the application.
Fig. 6 is a schematic diagram of step S2 in fig. 5.
Description of the main reference signs
Interactive system 100, 200
Transmitting device 10
Laser source 11
Collimating lens 13
Scanning mirror 15
Axis 151
Receiving device 30
Photoelectric converter 31
Receiving lens 33
Optical filter 35
Processor 50
Projection device 70
Rotation angle alpha
Scanning area 20
Interaction area 201
Interaction interface 203
External objects A, A1, A2, A3, A4
Probe light L
Reflected light B
The application will be further described in the following detailed description in conjunction with the above-described figures.
Detailed Description
The following description of the embodiments of the present application is made clearly and completely with reference to the accompanying drawings; evidently, the described embodiments are some, but not all, embodiments of the application.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
The application will be described in detail below with reference to the drawings and preferred embodiments thereof, in order to further explain the technical means and effects of the application to achieve the intended purpose.
Example 1
Referring to fig. 1, an interactive system 100 according to an embodiment of the present application includes a transmitting device 10, a receiving device 30, and a processor 50. The transmitting device 10 emits the probe light L toward the external object A, and the exit angle of the probe light L changes periodically within a single plane, so that over one period the optical path of the probe light L sweeps out a fan-shaped scanning area 20 in that plane; the scanning area 20 contains an interaction area 201. The interaction area 201 is the region in which the interactive system 100 recognizes interactive operations, i.e., an interactive operation occurring in the interaction area 201 can be recognized by the interactive system 100. The specific location and extent of the interaction area 201 may be set by the processor 50 and may be adjusted anywhere within the scanning area 20. The receiving device 30 receives the reflected light B returned by the external object A struck by the probe light L. The processor 50 is electrically connected to the transmitting device 10 and the receiving device 30, calculates and records the position of the interactive operation of the external object on the interaction area 201, and generates an interaction signal.
In this embodiment, referring to fig. 1 and fig. 2 together, the transmitting device 10 includes a laser source 11, a collimating lens 13 and a scanning mirror 15. The laser source 11 emits the probe light L toward the scanning mirror 15, which reflects it; the collimating lens 13, arranged between the laser source 11 and the scanning mirror 15, collimates the probe light L, reducing its divergence angle and improving the beam quality.
In this embodiment, the laser source 11 is electrically connected to the processor 50, and the processor 50 modulates the parameters of the probe light L emitted by the laser source 11. Specifically, the laser source 11 is a Vertical-Cavity Surface-Emitting Laser (VCSEL), and the processor 50 transmits a modulation signal to the laser source 11 to modulate the waveform and frequency of the probe light L; the waveform of the probe light L may be a square wave with a certain duty cycle or a sine wave of a certain frequency. The reflected light B generated when the probe light L strikes the external object A has the same waveform and frequency as the probe light L. The processor 50 also records the waveform and frequency of the probe light L.
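The comparison between emitted and received modulation can be made concrete with a small sketch. The patent does not specify a demodulation scheme; the following assumes a sine-modulated beam whose round-trip phase shift encodes distance, and the 10 MHz modulation frequency and all names are illustrative assumptions rather than details from the text.

```python
# Phase-based reading of the modulated probe light: an illustrative assumption
# (the text only says the modulated waveforms of emitted and received light
# are compared), with a made-up 10 MHz modulation frequency.
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_shift_rad, mod_freq_hz):
    """Distance from the phase shift of a sinusoidally modulated beam.

    The light travels to the object and back (round trip 2*d), so
        delta_phi = 2*pi*f * (2*d / c)   =>   d = c * delta_phi / (4*pi*f)
    """
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# Example: a pi/2 phase shift at 10 MHz corresponds to roughly 3.75 m
d = distance_from_phase(math.pi / 2.0, 10e6)
```

A square wave with a known duty cycle could be handled the same way by timing its edges instead of measuring a phase.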
In this embodiment, the wavelength of the probe light L emitted by the laser source 11 is 850 nm-940 nm, i.e., the probe light L lies in the near-infrared band, which helps avoid interference from ambient light and prevents visible emitted light from disturbing the interaction. In other embodiments, the wavelength of the probe light L may be any other suitable band, which the present application does not limit.
In the present embodiment, the collimating lens 13 is a single lens of a specific focal length, disposed on the outgoing light path of the laser source 11 to focus and collimate the divergent probe light L into a narrow beam. In other embodiments, the collimating lens 13 may be a lens group composed of a plurality of lenses serving the same purpose, which the present application does not limit.
In the present embodiment, referring to fig. 2 and 3, the scanning mirror 15 rotates periodically about an axis 151, so that the exit angle of the probe light L changes periodically. The processor 50 is electrically connected to the scanning mirror 15 and controls and records the rotation angle and rotation period of the scanning mirror 15. Specifically, the scanning mirror 15 is a one-dimensional Micro-Electro-Mechanical System (MEMS) micro-mirror, i.e., a mirror that rotates about the axis 151; by changing its own rotation angle, it sends the probe light L striking it out at different reflection angles. The rotation angle of a MEMS micro-mirror varies with the applied voltage, so the processor 50 can control the rotation angle of the scanning mirror 15 by controlling the magnitude of the voltage applied to it. Since the scanning mirror 15 rotates only about the axis 151, the probe lights L emitted at different reflection angles are coplanar, i.e., the optical paths of the probe lights L emitted at different angles all lie in the scanning area 20. In this embodiment, the axis 151 is perpendicular to the scanning area 20.
In this embodiment, the scanning mirror 15 rotates periodically about the axis 151 as follows: it rotates through the rotation angle α from an initial position, then rotates back through α in the opposite direction, and this process repeats continuously. Defining one rotation period as the time the scanning mirror 15 takes to return to its initial position after leaving it, the specific value of the rotation angle α and the rotation period of the scanning mirror 15 can be set by the processor 50, which also records them. As described above, the area swept by the probe light L in one rotation period of the scanning mirror 15 is a sector, and the interaction area 201 lies within that sector.
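The back-and-forth sweep through α described in this paragraph can be modelled as a triangular angle waveform. The sketch below is an illustrative model rather than part of the patent, and uses the optical fact that a mirror rotation of θ deflects the reflected beam by 2θ; all names are assumptions.

```python
# Triangular-sweep model of the scanning mirror: 0 -> alpha -> 0 each period.
# Illustrative assumption, not from the text; angles in degrees for clarity.
def mirror_angle(t, alpha, period):
    """Mechanical mirror angle at time t for the back-and-forth sweep."""
    phase = (t % period) / period        # position within one rotation period
    if phase < 0.5:
        return alpha * (2.0 * phase)     # forward sweep: 0 -> alpha
    return alpha * (2.0 - 2.0 * phase)   # return sweep: alpha -> 0

def beam_angle(t, alpha, period):
    """A mirror rotation of theta deflects the reflected beam by 2*theta,
    so the optical fan of the scanning area spans twice the mechanical sweep."""
    return 2.0 * mirror_angle(t, alpha, period)
```

Replacing the triangular profile with a sinusoid would model the simple-harmonic variant mentioned in the next paragraph.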
In other embodiments, the scan mirror 15 may also be rotated periodically about the axis 151 in other ways, such as by simple harmonic vibration, etc., as the application is not limited in this regard.
In the present embodiment, referring to fig. 1 and 2, the receiving device 30 includes a photoelectric converter 31, a receiving lens 33 and an optical filter 35. The receiving lens 33 receives the reflected light B and directs it onto the photoelectric converter 31, which converts the optical signal of the reflected light B into an electrical signal. The optical filter 35, disposed on the side of the receiving lens 33 away from the photoelectric converter 31, filters out external light other than the reflected light B.
In the present embodiment, the angular range swept by the optical path of the probe light L is equal to or smaller than the field of view of the receiving device 30. That is, the field of view of the receiving device 30, namely the range within which it can receive the reflected light B, covers the full range of angles at which the transmitting device 10 emits the probe light L, which is the fan-shaped region swept by the probe light L.
In the present embodiment, the receiving lens 33 is a single lens of a specific focal length that focuses the divergent reflected light B onto the photoelectric converter 31. In other embodiments, the receiving lens 33 may instead be a lens group of a specific focal length serving the same purpose, which the present application does not limit.
In the present embodiment, the photoelectric converter 31 includes an Avalanche Photodiode (APD) detector and a current-to-voltage converter, and converts the optical signal of the reflected light B into an electrical signal transmitted to the processor 50. Since the reflected light B has the same waveform and frequency as the probe light L, so does the converted electrical signal. The APD detector converts the optical signal into a current signal, and the current-to-voltage converter converts that current signal into a voltage signal. In other embodiments, the photoelectric converter 31 may be any other device capable of photoelectric conversion, which the present application does not limit.
In this embodiment, after the processor 50 receives the electrical signal, it compares the signal with the recorded waveform and frequency of the probe light L to obtain the time of flight, i.e., the time elapsed from the probe light L leaving the laser source 11, striking the external object A and generating the reflected light B, to the reflected light B being received by the photoelectric converter 31. From this time of flight the processor 50 obtains the distance between the external object A and the interactive system 100, and combining that distance with the recorded rotation angle of the probe light L yields the position of the external object A.
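As a rough sketch of this calculation: the time of flight gives the range, the recorded scan angle gives the direction, and the object's coordinates in the scan plane follow from simple trigonometry. The function name and the numbers in the example are illustrative assumptions.

```python
# From time of flight and scan angle to an (x, y) position in the scan plane,
# with the emitter/receiver at the origin. Names are illustrative assumptions.
import math

C = 299_792_458.0  # speed of light, m/s

def object_position(round_trip_s, scan_angle_rad):
    """Position of external object A: range from ToF, direction from the angle."""
    distance = C * round_trip_s / 2.0    # halve the out-and-back path
    return (distance * math.cos(scan_angle_rad),
            distance * math.sin(scan_angle_rad))

# Example: a 10 ns round trip at a 30-degree scan angle, about 1.5 m away
x, y = object_position(10e-9, math.radians(30.0))
```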
In the present embodiment, the processor 50 includes a digital-to-analog converter for generating the control signal that deflects the scanning mirror 15, an analog-to-digital converter for calculating and recording the deflection angle of the scanning mirror 15 from the electrical signal fed back by the scanning mirror 15, a driving unit for emitting the modulation signal to the laser source 11, a comparator for comparing the modulation signal with the electrical signal generated by the photoelectric converter 31, and a calculating unit for calculating the position of the external object A. In other embodiments, the processor 50 may include other means for performing the functions described above, which the present application does not limit.
In the present embodiment, the interaction area 201 implements the interaction with the interactive system 100. Specifically, the interaction area 201 is the effective sensing range of the interactive system 100: when the interactive system 100 recognizes that the external object A is located on the interaction area 201, an interaction signal is generated; if the external object A is not located on the interaction area 201, it is not recognized and no interaction signal is generated.
In this embodiment, the location and range of the interaction area 201 may be set by the processor 50. Specifically, the processor 50 may use the positions of a plurality of external objects A as the boundary of the interaction area 201, thereby defining its range. For example, when the interactive system 100 starts, it receives the reflected light B generated when the probe light L strikes a plurality of external objects A, calculates the position of each external object A and takes it as a vertex of the interaction area 201; the lines connecting adjacent vertices form the boundary of the interaction area 201, defining its position and range. In other embodiments, the processor 50 may directly designate a certain area at a certain distance from the interactive system 100 as the interaction area 201, without detecting external objects.
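Defining the region by vertices implies a point-in-polygon test when deciding whether a detected position should generate an interaction signal. The patent does not name an algorithm; the sketch below uses the standard ray-casting method, and all identifiers are illustrative.

```python
# Ray-casting point-in-polygon test; the text describes the region only as
# the polygon spanned by the calibration objects' positions, so this concrete
# algorithm and all names are illustrative assumptions.
def in_interaction_region(point, vertices):
    """True if point (x, y) lies inside the polygon given by its vertices."""
    x, y = point
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # toggle on each polygon edge crossed by a horizontal ray from the point
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

# Four calibration objects as the corners of a 1 m x 1 m square region
region = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```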
In this embodiment, the processor 50 may also adjust the rotation angle of the probe light L so that the boundary of the scanning area 20 (i.e., the boundary of the sector) coincides with part of the boundary of the interaction area 201, making maximal use of the probe light L and saving energy.
In this embodiment, the processor 50 also sets the scanning density of the probe light L on the interaction area 201 to adjust the recognition accuracy of the interaction area 201. Specifically, the process by which the scanning mirror 15 periodically changes the exit angle of the probe light L can be approximated as forming a plurality of discrete probe-light paths across the scanning area 20, and the external object A must lie on at least one of these paths for detectable reflected light B to be generated. The spacing between the paths of two adjacent probe lights L therefore determines the recognition accuracy of the interactive system 100; by controlling the rotation of the scanning mirror 15, the processor 50 controls the spacing of the probe-light paths crossing the interaction area 201, adjusting the scanning density and hence the recognition accuracy of the interactive system 100 on the interaction area 201.
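The relationship between angular step and recognition accuracy can be sketched numerically: with a fixed angular step between adjacent probe-light paths, the gap between paths grows linearly with range, so the step required for a given accuracy depends on how far the interaction area sits from the mirror. The numbers and names below are illustrative assumptions.

```python
# Path spacing vs. angular step: at range r, adjacent probe-light paths
# separated by an angular step d_theta are about r * d_theta apart.
import math

def path_spacing(range_m, angular_step_rad):
    """Approximate gap between adjacent probe-light paths at a given range."""
    return range_m * angular_step_rad    # small-angle arc-length approximation

def required_step(range_m, accuracy_m):
    """Largest angular step that still resolves features of size accuracy_m."""
    return accuracy_m / range_m

# Resolving a 5 mm fingertip at 1 m needs steps of at most about 0.29 degrees
step_deg = math.degrees(required_step(1.0, 0.005))
```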
In this embodiment, the processor 50 may include one or a combination of chips such as a micro control unit (Microcontroller Unit, MCU), a central processing unit, or a single chip microcomputer. For example, the elements of the digital-to-analog converter, the analog-to-digital converter, the driving unit, the comparator, the calculating unit, etc. may be part of the structure of the MCU chip. In other embodiments, the processor 50 may also be a circuit board composed of a plurality of chips, and the present application is not limited thereto.
In this embodiment, the interactive system 100 further includes a display device (not shown) for displaying images, where the images may be changed according to the interactive signals, so as to implement interactive functions. In other embodiments, the interactive system 100 may further include other feedback terminals, such as a speaker, a mechanical device, etc., for feeding back the interactive signal, which is not limited by the present application.
According to the interactive system 100 provided by this embodiment of the application, the transmitting device 10, the receiving device 30 and the processor 50 together determine the position of the external object A, and the interaction area 201 is formed on the scanning area 20 containing the optical path of the probe light L, so that the interactive operation of the external object A on the interaction area 201 can be located and an interaction signal generated, realizing the interactive function. Since the processor 50 can adjust the scanning range of the probe light L by adjusting the rotation angle α of the scanning mirror 15, the size of the interaction area 201 can also be adjusted freely, i.e., the interactive system 100 can adapt the interaction area 201 to the requirement; compared with interaction modes such as touch screens, this offers a higher degree of freedom and saves cost.
Example two
Referring to fig. 4, compared with the interactive system 100 of the first embodiment, the interactive system 200 further includes a projection device 70 electrically connected to the processor 50 and configured to project an interactive interface 203 on the scanning area 20, the interactive interface 203 visualizing the interaction area 201; the processor 50 identifies the interaction signals generated at different positions on the interaction area 201 as different interaction instructions according to the interactive interface 203.
In this embodiment, the interactive interface 203 is a projected image interface that can be observed by human eyes, and the interactive area 201 is located in the interactive interface 203, so that the user can determine the position of the interactive area 201, thereby facilitating the interactive operation. For example, the interactive interface 203 may display the same image as the display device in the first embodiment, so as to form a plane similar to the touch display surface together with the interactive area 201, so as to facilitate the user to directly interact with the interactive system 200.
The image displayed by the interactive interface 203 may also be an operation-interface image, such as an image of a keyboard; such an image divides the interactive interface 203 into a plurality of instruction areas, and the processor 50 may partition the interaction area 201 accordingly while setting the image of the interactive interface 203, so that a corresponding interaction instruction is generated when the external object A is located in a given instruction area.
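The instruction-area mapping described here amounts to resolving a touch position to a cell of the projected layout. A minimal sketch under the assumption of a uniform grid; the grid, labels and all names are illustrative, not taken from the patent.

```python
# Resolving a touch position to an instruction area, assuming the projected
# operation interface is a uniform grid of equally sized cells.
def instruction_at(point, origin, cell_w, cell_h, labels):
    """Label of the grid cell containing point (x, y), or None if outside."""
    col = int((point[0] - origin[0]) // cell_w)
    row = int((point[1] - origin[1]) // cell_h)
    if 0 <= row < len(labels) and 0 <= col < len(labels[row]):
        return labels[row][col]
    return None

# A projected 2 x 3 "keyboard" of 10 cm cells
keys = [["A", "B", "C"],
        ["D", "E", "F"]]
key = instruction_at((0.15, 0.05), (0.0, 0.0), 0.1, 0.1, keys)
```

A non-uniform layout would only change the lookup from grid arithmetic to a per-cell bounds test.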
In this embodiment, the relative positions of the projection device 70 and the transmitting device 10, receiving device 30 and processor 50 are not fixed, and signals can be transmitted between the projection device 70 and the processor 50 in a wired or wireless manner, so that the interactive system 200 can be positioned according to the user's needs.
According to the interactive system 200 provided by the embodiment of the application, the projection device 70 is arranged, so that the interactive area 201 can be visualized, the interactive operation of a user is facilitated, and the accuracy of the interactive operation is improved.
Example III
An interaction method according to the third embodiment of the present application, referring to fig. 5, includes:
step S1: providing probe light whose exit angle changes periodically;
step S2: receiving a positioning signal, and forming an interaction area, according to the positioning signal, on the plane where the optical path of the probe light lies;
step S3: receiving reflected light returned after the probe light encounters an external object on the interaction area, and calculating the position of the external object;
step S4: generating an interaction signal according to the position of the external object.
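The position calculation in step S3 can be sketched by combining the instantaneous exit angle of the probe light (known from the scanning mirror's rotation) with the round-trip time of flight measured by the receiving device. The patent gives no implementation; the function name and the coordinate convention below (emitter at the origin of the scanning plane) are illustrative assumptions.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def object_position(exit_angle_rad: float,
                    time_of_flight_s: float) -> tuple[float, float]:
    """Convert the probe light's exit angle and its round-trip time of
    flight into (x, y) coordinates on the scanning plane, with the
    emitter at the origin and the angle measured from the x axis."""
    # The light travels to the object and back, so halve the path length.
    distance = SPEED_OF_LIGHT * time_of_flight_s / 2.0
    return (distance * math.cos(exit_angle_rad),
            distance * math.sin(exit_angle_rad))
```

For example, a round-trip time corresponding to 1.5 m at exit angle 0 places the object at (1.5, 0) in this convention.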
In this embodiment, referring to fig. 6, step S2 specifically includes: setting a plurality of external objects, calculating their positions, and taking those positions as the boundary of the interaction area, the boundary enclosing the interaction area. Specifically, the external objects A1, A2, A3 and A4 each serve as one vertex of the interaction area 201. When the interactive system 100 starts to operate, the reflected light B generated when the probe light L encounters the external objects A1, A2, A3 and A4 is used as the positioning signal. After the positions of A1, A2, A3 and A4 have been calculated, the processor 50 takes them as the four vertices of the interaction area 201, with the lines connecting adjacent vertices serving as the boundary, so that the boundary defines the position of the interaction area 201. In other embodiments, the number of external objects used for positioning may be three or more. The plurality of external objects may be set simultaneously or sequentially; the present application is not limited in this respect.
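The calibration step above — taking the measured positions of several external objects as vertices of the interaction area — can be sketched as follows. The patent does not specify how the vertices are ordered into a boundary; sorting them by angle around their centroid, as done here, is one simple assumption that yields a simple polygon for three or more points, and the function name is illustrative.

```python
import math

def region_from_calibration(
        points: list[tuple[float, float]]) -> list[tuple[float, float]]:
    """Order N >= 3 calibration positions into a simple polygon whose
    edges form the boundary of the interaction area. Vertices are sorted
    by angle around their centroid so adjacent list entries are adjacent
    polygon vertices."""
    if len(points) < 3:
        raise ValueError("at least three calibration points are required")
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return sorted(points, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
```

With four corner points given in arbitrary order, the result traverses the quadrilateral's boundary in counter-clockwise order, so consecutive vertices can be joined to form the boundary described above.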
In other embodiments, the positioning signal may also be set directly by the processor 50; for example, the processor 50 may directly define an area at a certain distance in front of the transmitting device 10 as the interaction area 201, without manual setting.
In this embodiment, steps S3 and S4 further include judging the calculated position of the external object A: if the external object A is located on the interaction area 201, an interaction signal is generated; if it is not, no interaction signal is generated.
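This judgment is a point-in-region test. A sketch using the standard ray-casting algorithm, assuming the interaction area 201 is stored as the polygon of vertices obtained during calibration (the names are illustrative, not from the patent):

```python
def inside_region(x: float, y: float,
                  polygon: list[tuple[float, float]]) -> bool:
    """Ray-casting point-in-polygon test: an interaction signal should be
    generated only when the computed object position lies inside the
    interaction area's boundary polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray cast to the right of (x, y);
        # an odd number of crossings means the point is inside.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

A position inside the calibrated polygon would then trigger step S4, while positions elsewhere on the scanning area would be ignored.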
In the interaction method provided by this embodiment of the application, the interaction area 201 is generated by setting the interactive system to receive a positioning signal, so the position and size of the interaction area 201 can be customized. This offers greater freedom and removes the constraint that interactive operation be bounded by physical components.
It will be appreciated by persons skilled in the art that the above embodiments are provided to illustrate the application and are not to be construed as limiting it; suitable modifications and variations of the above embodiments fall within the scope of the application as claimed.

Claims (11)

1. An interactive system, comprising:
an emitting device configured to emit probe light toward an external object, wherein the exit angle of the probe light changes periodically within a plane, so that the optical path of the probe light in one period forms a scanning area on the plane, the scanning area comprising an interaction area, the interaction area being the area in which the interactive system identifies an interactive operation;
a receiving device configured to receive reflected light returned from the external object in response to the probe light; and
a processor electrically connected to the emitting device and the receiving device, configured to calculate and record the position of the interactive operation of the external object on the interaction area and to generate an interaction signal.
2. The interactive system of claim 1, wherein the emitting device comprises a laser source and a scanning mirror, the laser source being configured to emit the probe light to the scanning mirror and the scanning mirror being configured to reflect the probe light; the scanning mirror rotates periodically about an axis so that the exit angle of the probe light changes periodically.
3. The interactive system of claim 2, wherein said processor is electrically connected to said scanning mirror for controlling and recording the angle of rotation and the period of rotation of said scanning mirror.
4. The interactive system of claim 1, wherein the receiving device comprises a receiving optic and a photoelectric converter, the receiving optic being configured to receive the reflected light and direct it onto the photoelectric converter; the photoelectric converter is configured to convert the optical signal of the reflected light into an electrical signal.
5. The interactive system of claim 4, wherein said processor is electrically connected to said photoelectric converter for deriving a time of flight of said probe light from said electrical signal to thereby calculate a position of said external object.
6. The interactive system of claim 1, further comprising a projection device electrically coupled to the processor for projecting an interactive interface on the plane, the interactive interface for visualizing the interactive region; the processor is used for identifying the interaction signals generated at different positions on the interaction area as different interaction instructions according to the interaction interface.
7. The interactive system of claim 1, wherein the processor is further configured to set a location and a range of the interactive region.
8. The interactive system of claim 1, wherein the processor is further configured to set a scanning density of the probe light over the interactive area to adjust an accuracy of identification of the interactive area.
9. The interactive system of claim 1, wherein the area scanned by the optical path of the probe light is less than or equal to the field of view of the receiving device.
10. An interaction method, comprising:
providing probe light whose exit angle changes periodically;
receiving a positioning signal, and forming an interaction area, according to the positioning signal, on the plane where the optical path of the probe light lies;
receiving reflected light returned after the probe light encounters an external object on the interaction area, and calculating the position of the external object; and
generating an interaction signal according to the position of the external object.
11. The interaction method of claim 10, wherein forming an interaction area on the plane where the optical path of the probe light lies according to the positioning signal comprises: setting a plurality of external objects, calculating their positions, and taking those positions as the boundary of the interaction area to form the interaction area.
CN202210258802.7A 2022-03-16 2022-03-16 Interactive system and interactive method Pending CN116795241A (en)

Publications (1)

Publication Number Publication Date
CN116795241A true CN116795241A (en) 2023-09-22

Family

ID=88045529



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination