CN114527871A - Projection lamp human-computer interaction system - Google Patents
Projection lamp human-computer interaction system
- Publication number
- CN114527871A CN114527871A CN202210046795.4A CN202210046795A CN114527871A CN 114527871 A CN114527871 A CN 114527871A CN 202210046795 A CN202210046795 A CN 202210046795A CN 114527871 A CN114527871 A CN 114527871A
- Authority
- CN
- China
- Prior art keywords
- projection
- human
- computer interaction
- interaction system
- projection lamp
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/142—Adjusting of projection optics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
- Projection Apparatus (AREA)
Abstract
The invention relates to the technical field of human-computer interaction and discloses a projection lamp human-computer interaction system comprising: a computer for searching and creating projection data; a projection lamp for transmitting projection data; a projection screen for displaying projection data; and a wireless transmission device for connecting the computer and the projection lamp. A data processor, a gear unit, a direct current motor, a control circuit, and a camera are arranged inside the projection lamp, and the included angle between the projection axis of the projection lamp and the projection screen is sixty to ninety degrees. By providing this human-computer interaction system, the invention enables the projection to generate virtual graphics directly on the surface of a real environment, so that users can observe the information directly without wearing special glasses or using a mouse.
Description
Technical Field
The invention relates to the technical field of projection lamps, in particular to a human-computer interaction system of a projection lamp.
Background
Human-computer interaction (HCI) is the technology concerned with interaction between people and computers.
Conventional human-computer interaction systems are mainly based on touch or pointer devices (such as a mouse) or on wearing special glasses. Such interaction cannot fully reproduce natural human actions, which makes operation inconvenient. In addition, conventional systems generally require an extra image display device to present image information, which raises cost while offering limited installation space and a small projection area. Finally, the projection lens of a conventional system cannot be adjusted quickly, which lowers its projection accuracy.
Disclosure of Invention
In order to overcome the above defects in the prior art, the present invention provides a projection lamp human-computer interaction system that solves the problems described in the background art.
In order to achieve the purpose, the invention provides the following technical scheme: a projection lamp human-computer interaction system, characterized in that the projection lamp human-computer interaction system comprises:
a computer for searching and creating projection data;
the projection lamp is used for transmitting projection data;
the projection screen is used for displaying projection data;
and the wireless transmission device is used for connecting the computer and the projection lamp.
In a preferred embodiment, a data processor, a gear unit, a direct current motor, a control circuit and a camera are arranged inside the projection lamp.
In a preferred embodiment, the projection axis of the projection lamp is at an angle of sixty to ninety degrees to the projection screen.
In a preferred embodiment, the projection lamp adjusts the projection angle by means of the gear unit, the direct current motor, and the control circuit. The output shaft of the direct current motor carries a 134-tooth gear and the projection lens carries a 90-tooth gear, giving a tooth ratio of 90:134, or about 1:1.489. The control circuit controls the speed of the direct current motor by pulse-width modulation, and the projection lamp rotates 1.4 degrees per step.
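As an illustrative sketch (not part of the patent text), the gear relationship in the embodiment above can be checked numerically. The tooth counts are taken from the text; the function names and the ~0.94-degree motor step used in the example are hypothetical, chosen only to show how the stated 1.4-degree lens step would follow:

```python
# Sketch: check the gear relationship stated in the embodiment.
# Motor gear: 134 teeth; lens gear: 90 teeth (values from the text).

def tooth_ratio(motor_teeth: int, lens_teeth: int) -> float:
    """Lens-gear revolutions per motor-gear revolution."""
    return motor_teeth / lens_teeth

def lens_rotation(motor_degrees: float,
                  motor_teeth: int = 134,
                  lens_teeth: int = 90) -> float:
    """Degrees the lens gear rotates for a given motor rotation."""
    return motor_degrees * motor_teeth / lens_teeth

ratio = tooth_ratio(134, 90)
print(round(ratio, 3))                 # matches the stated 1:1.489 tooth ratio
print(round(lens_rotation(0.94), 2))   # a ~0.94 deg motor step -> ~1.4 deg at the lens
```

With these tooth counts, any motor step of about 0.94 degrees yields the 1.4-degree lens rotation the embodiment describes.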
In a preferred embodiment, the camera is a surveillance camera with a resolution of 130,000 to 20,000,000 pixels.
In a preferred embodiment, a human hand tracking module is arranged inside the camera, and the hand tracking algorithm comprises Kalman filtering and particle filtering.
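The patent names Kalman filtering as one building block of the hand tracker but gives no equations. The following is a minimal sketch of a one-dimensional constant-velocity Kalman filter of the kind such a tracker might apply per image axis; all noise parameters and the sample measurements are illustrative assumptions, not values from the patent:

```python
# Minimal 1D constant-velocity Kalman filter, one per image axis, as a
# sketch of the Kalman-filtering step named in the text. The process and
# measurement noise values below are illustrative assumptions.

class Kalman1D:
    def __init__(self, q=1e-2, r=1.0):
        self.x = [0.0, 0.0]                 # state: [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]   # state covariance
        self.q, self.r = q, r               # process / measurement noise

    def step(self, z, dt=1.0):
        # Predict with F = [[1, dt], [0, 1]]
        px = self.x[0] + dt * self.x[1]
        pv = self.x[1]
        P = self.P
        P00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + self.q
        P01 = P[0][1] + dt * P[1][1]
        P10 = P[1][0] + dt * P[1][1]
        P11 = P[1][1] + self.q
        # Update with a position measurement z (H = [1, 0])
        S = P00 + self.r
        K0, K1 = P00 / S, P10 / S
        y = z - px
        self.x = [px + K0 * y, pv + K1 * y]
        self.P = [[(1 - K0) * P00, (1 - K0) * P01],
                  [P10 - K1 * P00, P11 - K1 * P01]]
        return self.x[0]

kf = Kalman1D()
for z in [0.0, 1.1, 1.9, 3.2, 4.0]:   # noisy per-frame hand x-coordinates
    est = kf.step(z)
print(round(est, 1))                   # smoothed estimate of the last position
```

A full tracker along the lines the patent names would run one such filter per coordinate, or switch to a particle filter when the motion is strongly non-linear.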
In a preferred embodiment, the data processor is a data imaging device.
In a preferred embodiment, an infrared distance measuring device is arranged inside the camera; it measures the distance to the gesture in order to validate it.
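The patent states only that the rangefinder's distance measurement is used to decide whether a gesture counts as valid. A minimal sketch of such distance gating follows; the working range is an assumption for illustration, not a value from the patent:

```python
# Sketch of distance-gated gesture validation. The near/far limits of the
# assumed reliable-recognition range are illustrative, not from the patent.

VALID_RANGE_M = (0.3, 2.0)  # assumed working range of the gesture recognizer

def gesture_is_valid(distance_m: float, valid_range=VALID_RANGE_M) -> bool:
    """Accept a gesture only if the measured hand distance is in range."""
    near, far = valid_range
    return near <= distance_m <= far

print(gesture_is_valid(1.2))   # True: within the assumed working range
print(gesture_is_valid(3.5))   # False: too far for reliable recognition
```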
In a preferred embodiment, the camera is provided with an autofocus module for magnifying a local region of the projected image.
In a preferred embodiment, the projection screen has a touch function.
The technical effects and advantages of the invention are as follows:
1. By providing the human-computer interaction system, the invention enables the projection to generate virtual graphics directly on the surface of a real environment, so that users can observe the information directly without wearing special glasses or using a mouse.
2. By providing the gear unit, the direct current motor, and the control circuit, the invention facilitates adjustment of the projection lens, thereby improving the projection precision of the projection lens.
Drawings
FIG. 1 is a schematic diagram of a human-computer interaction system of a projection lamp according to the present invention.
FIG. 2 is a schematic diagram of a second embodiment of the present invention.
Detailed Description
The technical solution of the present invention will be described clearly and completely below with reference to the accompanying drawings. The structures described in the following embodiments are merely examples; the projection lamp human-computer interaction system according to the present invention is not limited to these structures, and all other embodiments obtained by a person skilled in the art without creative work fall within the scope of the present invention.
The first embodiment is as follows:
The invention provides a projection lamp human-computer interaction system, comprising:
a computer for searching and authoring projection data;
the projection lamp is used for transmitting projection data;
the projection screen is used for displaying projection data;
and the wireless transmission device is used for connecting the computer and the projection lamp.
Furthermore, a data processor, a gear unit, a direct current motor, a control circuit, and a camera are arranged inside the projection lamp.
Furthermore, the included angle between the projection axis of the projection lamp and the projection screen is sixty degrees to ninety degrees.
Furthermore, the projection lamp adjusts the projection angle through the gear unit, the direct current motor, and the control circuit. The output shaft of the direct current motor carries a 134-tooth gear and the projection lens carries a 90-tooth gear, giving a tooth ratio of 90:134, or about 1:1.489. The control circuit controls the speed of the direct current motor by pulse-width modulation, and the projection lamp rotates 1.4 degrees per step.
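The embodiment names pulse-width modulation for motor speed control without giving parameters. The sketch below shows the usual duty-cycle idea; the carrier frequency, the maximum speed, and the assumed linear speed-to-duty mapping are all illustrative assumptions, not values from the patent:

```python
# Sketch of PWM speed control as named in the embodiment. The frequency,
# maximum speed, and linear mapping are illustrative assumptions.

PWM_FREQ_HZ = 1000            # assumed PWM carrier frequency
MAX_SPEED_RPM = 3000          # assumed no-load motor speed at 100% duty

def duty_cycle(target_rpm: float, max_rpm: float = MAX_SPEED_RPM) -> float:
    """Duty cycle in [0, 1] for a target speed, clamped to the valid range."""
    return max(0.0, min(1.0, target_rpm / max_rpm))

def pulse_widths(target_rpm: float, freq_hz: float = PWM_FREQ_HZ):
    """(high_time, low_time) in seconds for one PWM period."""
    period = 1.0 / freq_hz
    d = duty_cycle(target_rpm)
    return d * period, (1.0 - d) * period

print(duty_cycle(1500))        # 0.5: half the assumed max speed -> 50% duty
print(pulse_widths(1500))      # equal high and low times at 50% duty
```

Averaged over a period, the motor sees a supply voltage proportional to the duty cycle, which is what lets the control circuit set the speed of the angle-adjustment motor in fine steps.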
Furthermore, the camera is a surveillance camera with a resolution of 130,000 to 2,000,000 pixels.
Furthermore, a human hand tracking module is arranged inside the camera, and the hand tracking algorithm comprises Kalman filtering and particle filtering.
Further, the data processor is a data imaging device.
Furthermore, an infrared distance measuring device is arranged inside the camera; it measures the distance to the gesture in order to validate it.
Furthermore, the camera is provided with an autofocus module for magnifying a local region of the projected image.
Further, the projection screen has a touch function.
Example two:
the projection lamp human-computer interaction system comprises a voice module, a cloud computer and a projection lamp, wherein the projection lamp comprises a control module, a behavior tracking module and a light source induction module.
In this projection lamp human-computer interaction system, the voice module receives voice commands and forwards them to the cloud computer; the control module receives the instructions sent by the voice module and sends projection control commands to the projection lamp; the behavior tracking module detects action commands and sends control commands to the control module; and the light source sensing module senses the environment around the projection lamp and adjusts the projection brightness accordingly.
By providing the voice module, the cloud computer, and the projection lamp, the projection lamp human-computer interaction system of the second embodiment becomes more intelligent and more convenient to operate.
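The module wiring of embodiment two can be sketched as follows. The patent names only the modules and the direction of the commands between them; every class name, method name, and the sample command are hypothetical:

```python
# Sketch of the module wiring in embodiment two: voice module -> cloud
# computer (logging) and voice module -> control module -> projection lamp.
# All names are hypothetical; the patent specifies only the data flow.

class CloudComputer:
    def __init__(self):
        self.history = []
    def log(self, command: str):
        self.history.append(command)     # cloud side records the voice command

class VoiceModule:
    def __init__(self, cloud: CloudComputer):
        self.cloud = cloud
    def receive(self, command: str) -> str:
        self.cloud.log(command)          # forward the voice command to the cloud
        return command

class ProjectionLamp:
    def __init__(self):
        self.last_command = None
    def project(self, command: str):
        self.last_command = command      # act on the projection control command

class ControlModule:
    def __init__(self, lamp: ProjectionLamp):
        self.lamp = lamp
    def handle(self, command: str):
        self.lamp.project(command)       # issue a projection control command

cloud = CloudComputer()
lamp = ProjectionLamp()
voice = VoiceModule(cloud)
control = ControlModule(lamp)

control.handle(voice.receive("show slide 1"))
print(lamp.last_command)
print(cloud.history)
```

The behavior tracking and light source sensing modules would feed the control module in the same way, each translating its sensor input into a control command.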
Finally, the following points should be noted. First, in the description of the present application, unless otherwise specified and limited, the terms "mounted," "connected," and "coupled" should be understood broadly: they may denote a mechanical connection, an electrical connection, or communication between two elements, and may be a direct connection. The terms "upper," "lower," "left," and "right" indicate only a relative positional relationship, which may change when the absolute position of the described object changes.
Second, the drawings of the disclosed embodiments show only the structures related to those embodiments; other structures may follow common designs, and, in the absence of conflict, the embodiments of the invention and features of different embodiments may be combined with one another.
Finally, the above description covers only preferred embodiments of the present invention and is not intended to limit it; any modifications, equivalents, and improvements made within the spirit and principle of the present invention are intended to fall within its scope.
Claims (10)
1. A projection lamp human-computer interaction system, characterized in that the projection lamp human-computer interaction system comprises:
a computer for searching and authoring projection data;
the projection lamp is used for transmitting projection data;
the projection screen is used for displaying projection data;
and the wireless transmission device is used for connecting the computer and the projection lamp.
2. The human-computer interaction system of claim 1, wherein: the inner side of the projection lamp is provided with a data processor, a gear unit, a direct current motor, a control circuit and a camera.
3. The human-computer interaction system of claim 1, wherein: the included angle between the projection axis of the projection lamp and the projection screen is sixty degrees to ninety degrees.
4. The human-computer interaction system of claim 1, wherein: the projection lamp adjusts the projection angle through the gear unit, the direct current motor, and the control circuit; the output shaft of the direct current motor carries a 134-tooth gear and the projection lens carries a 90-tooth gear, giving a tooth ratio of 90:134, or about 1:1.489; the control circuit controls the speed of the direct current motor by pulse-width modulation; and the projection lamp rotates 1.4 degrees per step.
5. The human-computer interaction system of claim 1, wherein: the camera is a surveillance camera with a resolution of 130,000 to 2,000,000 pixels.
6. The human-computer interaction system of claim 1, wherein: the camera is internally provided with a human hand tracking algorithm module, and the human hand tracking algorithm comprises Kalman filtering and particle filtering.
7. The human-computer interaction system of claim 1, wherein: the data processor is a data imaging device.
8. The human-computer interaction system of claim 1, wherein: an infrared distance measuring device is arranged inside the camera, and the infrared distance measuring device measures the distance to the gesture in order to validate it.
9. The human-computer interaction system of claim 1, wherein: the camera is provided with an autofocus module, and the autofocus module is used for magnifying a local region of the projected image.
10. The human-computer interaction system of claim 1, wherein: the projection screen has a touch function.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210046795.4A CN114527871A (en) | 2022-01-14 | 2022-01-14 | Projection lamp human-computer interaction system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114527871A | 2022-05-24 |
Family
ID=81619942
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210046795.4A Pending CN114527871A (en) | 2022-01-14 | 2022-01-14 | Projection lamp human-computer interaction system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114527871A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104881109A (en) * | 2014-02-28 | 2015-09-02 | 联想(北京)有限公司 | Action identification method and device and electronic device |
CN105260021A (en) * | 2015-10-15 | 2016-01-20 | 深圳市祈锦通信技术有限公司 | Intelligent interactive projection system |
CN105763775A (en) * | 2016-03-02 | 2016-07-13 | 太仓思比科微电子技术有限公司 | Automatic focusing digital camera for body surface detection |
CN108900820A (en) * | 2018-05-14 | 2018-11-27 | 河南大学 | A kind of control method and device of projector |
CN215072723U (en) * | 2021-05-07 | 2021-12-07 | 深圳市智博通电子有限公司 | Intelligent voice control IPTV projection terminal |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3598274B1 (en) | System and method for hybrid eye tracker | |
CN106687887B (en) | Projected interactive virtual desktop | |
CN109637463B (en) | Backlight black insertion optimization method and device, medium and electronic equipment | |
WO2014208168A1 (en) | Information processing device, control method, program, and storage medium | |
US11353708B1 (en) | Custom mixed reality smart glasses and software for vision impaired use | |
CN108279496B (en) | Eyeball tracking module and method of video glasses and video glasses | |
KR20230017849A (en) | Augmented Reality Guide | |
KR20230025909A (en) | Augmented Reality Eyewear 3D Painting | |
WO2022006116A1 (en) | Augmented reality eyewear with speech bubbles and translation | |
CN114531951A (en) | Automatic video capture and compositing system | |
WO2022005715A1 (en) | Augmented reality eyewear with 3d costumes | |
KR20240009975A (en) | Eyewear device dynamic power configuration | |
JP2017182109A (en) | Display system, information processing device, projector, and information processing method | |
CN110488980B (en) | Human-computer interaction system of projection system | |
US20240082697A1 (en) | Context-sensitive remote eyewear controller | |
KR20230027299A (en) | Eyewear with shared gaze responsive viewing | |
CN114527871A (en) | Projection lamp human-computer interaction system | |
CN105446580A (en) | Control method and portable electronic equipment | |
US20200241721A1 (en) | Interactive display apparatus and method | |
CN203606780U (en) | Multi-touch and gesture recognition fusion system | |
CN209248473U (en) | A kind of interactive display unit | |
EP4172731A1 (en) | Dynamic sensor selection for visual inertial odometry systems | |
CN208820949U (en) | A kind of laser cartoon projection arrangement | |
US20230400978A1 (en) | Eyewear device user interface | |
WO2024025779A1 (en) | Magnified overlays correlated with virtual markers |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||