CN112418080A - Finger action recognition method of laser scanning imager - Google Patents
Finger action recognition method of laser scanning imager
- Publication number
- CN112418080A (application CN202011313713.5A)
- Authority
- CN
- China
- Prior art keywords
- finger
- image
- movement
- track
- laser scanning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a finger action recognition method for a laser scanning imager, which mainly comprises: acquiring the movement track of a finger action; if the track of the finger action movement matches a pre-stored image, executing step S3, otherwise executing step S4; controlling the laser scanning imager according to the control instruction corresponding to the matched image; if no matching image data is acquired within a predetermined time, executing step S5; displaying the pre-stored images and the control instructions corresponding to them in a display area; and sending an instruction according to an image displayed in step S5 to control the laser scanning imager. By acquiring the fingertip movement track and using it to control the operation of the laser scanning imager, a multifunctional control method is realized, which solves the technical problem in the related art that a laser scanning imager cannot be controlled by hand.
Description
Technical Field
The invention relates to the technical field of gesture-based control, and in particular to a finger action recognition method for a laser scanning imager.
Background
With the development of science and technology, electric appliances and electronic devices have been applied to many fields of people's work and life, and their controllers or control components come in a wide variety. The most traditional control component is the knob, which adjusts the resistance of a slide rheostat in a circuit and thereby controls the equipment by regulating the current. Later, keys corresponding to particular functions were placed on electrical and electronic equipment, and those functions were invoked by pressing the keys. With the development of capacitive touch technology, capacitive touch control components appeared; they can accurately sense an effective finger touch through an insulating shell more than 20 mm thick. Capacitive touch components come in two types: independent simulated keys and capacitive touch screens. An independent simulated key functions much like a traditional key, the user invoking a function by touching the corresponding key, while a capacitive touch screen allows the user to click directly on the control screen of the appliance, making device operation simpler and the available operations more comprehensive.

In the related art, a laser scanning imager is generally controlled by a remote controller, and voice-controlled laser scanning imagers have also been proposed to further improve ease of use. However, the imagers in the related art cannot be controlled by hand gestures, so deaf-mute users face certain obstacles in using and learning such equipment.
Disclosure of Invention
The present invention is directed to a finger action recognition method for a laser scanning imager, so as to solve the problems mentioned in the background art.
In order to achieve the above purpose, the invention provides the following technical solution: a finger action recognition method of a laser scanning imager, comprising the following steps:
S1, acquiring the movement track of the finger action;
S2, if the track of the finger action movement matches a pre-stored image, executing step S3; otherwise, executing step S4;
S3, controlling the laser scanning imager according to the control instruction corresponding to the matched image;
S4, if no matching image data is acquired within a predetermined time, executing step S5;
S5, displaying the pre-stored images and the control instructions corresponding to them in a display area;
and S6, sending an instruction according to an image displayed in step S5 to control the laser scanning imager.
Further, acquiring the track of the finger action movement comprises acquiring a dynamic image of the finger action movement and extracting the fingertip movement track from that dynamic image.
Further, acquiring the dynamic image of the finger action movement comprises capturing a plurality of finger action images at a preset acquisition frequency with an image acquisition device and generating the dynamic image of the finger action movement from the plurality of images.
Further, recognizing the movement track of the finger action according to the pre-stored images comprises searching an action track database in a big data system by comparing the captured fingertip movement track with the fingertip movement tracks stored there, and, when an identical fingertip movement track is found in the database, retrieving the hand action corresponding to that pre-stored image from the big data system.
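By way of illustration only, the following Python sketch shows one possible way to realize the database comparison described above: the captured fingertip track is resampled to a fixed length and compared against stored reference tracks by mean point-wise distance. The resampling length, the distance measure, and the `max_distance` threshold are assumptions of this sketch, not details prescribed by the patent.

```python
import numpy as np

def resample(track, n_points=16):
    # Resample a variable-length (x, y) fingertip track to a fixed number of points.
    track = np.asarray(track, dtype=float)
    idx = np.linspace(0, len(track) - 1, n_points)
    return np.stack(
        [np.interp(idx, np.arange(len(track)), track[:, d]) for d in range(2)], axis=1
    )

def find_matching_gesture(track, stored_tracks, max_distance=25.0):
    """stored_tracks: dict mapping a gesture label to a stored fingertip track."""
    query = resample(track)
    best_label, best_dist = None, float("inf")
    for label, reference in stored_tracks.items():
        dist = np.linalg.norm(query - resample(reference), axis=1).mean()
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= max_distance else None  # None -> no match found
```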
Further, the method comprises displaying the pre-stored images and their corresponding instructions in the display area when no identical fingertip movement track can be found in the action track database of the big data system within a preset waiting time.
Further, the preset waiting time is 5 s to 10 s.
Further, the method includes hand-type calibration, in which the user moves a finger according to the reserved images displayed in sequence in the display area to complete calibration of the projected gestures.
Compared with the prior art, the invention has the beneficial effects that:
the method for acquiring the movement track of the finger end and controlling the laser scanning imager to work is adopted, so that a multifunctional control method is realized, and the technical problem that the laser scanning imager cannot be controlled manually in the related technology is solved.
In addition to the objects, features and advantages described above, the present invention has other objects, features and advantages, which will be described in further detail below.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides the following technical solution: a finger action recognition method of a laser scanning imager, comprising the following steps:
step S1, acquiring the movement track of the finger action;
step S2, if the track of the finger action movement matches a pre-stored image, executing step S3; otherwise, executing step S4;
step S3, controlling the laser scanning imager according to the control instruction corresponding to the matched image;
step S4, if no matching image data is acquired within a predetermined time, executing step S5;
step S5, displaying the pre-stored images and the control instructions corresponding to them in a display area;
and step S6, sending an instruction according to an image displayed in step S5 to control the laser scanning imager.
In this embodiment, the laser scanning imager is controlled by acquiring the movement track of the fingertip, so that a multifunctional control method is realized and the technical problem in the related art that a laser scanning imager cannot be controlled by hand is solved. A minimal sketch of this control flow is given below.
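The following Python sketch illustrates one possible arrangement of the S1-S6 flow described above. The callables `acquire_trajectory`, `match_trajectory`, `execute_command` and `show_reference_images`, as well as the 8-second timeout, are hypothetical placeholders standing in for the imager's own routines; this is a sketch under those assumptions, not a definitive implementation.

```python
import time
from typing import Callable, List, Optional, Tuple

Track = List[Tuple[int, int]]  # ordered (x, y) fingertip positions

def run_recognition_cycle(
    acquire_trajectory: Callable[[], Track],             # S1: capture a fingertip track
    match_trajectory: Callable[[Track], Optional[str]],  # S2: command name, or None if no match
    execute_command: Callable[[str], None],              # S3 / S6: drive the imager
    show_reference_images: Callable[[], str],            # S5: show images, return the user's pick
    timeout_s: float = 8.0,                              # within the 5-10 s window of the embodiment
) -> None:
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        command = match_trajectory(acquire_trajectory())
        if command is not None:
            execute_command(command)                     # S3: a pre-stored image matched
            return
    # S4: no matching image data within the predetermined time -> S5 and S6
    execute_command(show_reference_images())
```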
The track of the finger action movement is the gesture change presented when the user performs a hand action, with the finger moving along a certain path. A dynamic image of the finger action movement is acquired, and the fingertip movement track is extracted from that dynamic image.
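As a concrete illustration, the sketch below extracts a fingertip track from a sequence of frames using simple skin-colour segmentation with OpenCV. The HSV skin range and the "topmost contour point is the fingertip" heuristic are assumptions of this sketch; the patent does not prescribe a particular extraction algorithm.

```python
import cv2
import numpy as np

def fingertip_track(frames):
    track = []
    lower = np.array([0, 48, 80], dtype=np.uint8)     # rough HSV skin-colour range (assumed)
    upper = np.array([20, 255, 255], dtype=np.uint8)
    for frame in frames:
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, lower, upper)
        # [-2] selects the contour list under both OpenCV 3.x and 4.x return conventions
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
        if not contours:
            continue
        hand = max(contours, key=cv2.contourArea)      # largest skin blob taken as the hand
        x, y = hand[hand[:, :, 1].argmin()][0]         # topmost point taken as the fingertip
        track.append((int(x), int(y)))
    return track                                       # ordered (x, y) fingertip positions
```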
In this embodiment, acquiring the dynamic image of the finger action movement comprises capturing a plurality of finger action images at a preset acquisition frequency with an image acquisition device and generating the dynamic image of the finger action movement from them. The image acquisition device captures the user's extended hand at a rate of 3 hand images per second, so that the plurality of hand images forms a dynamic hand image. Further, recognizing the movement track of the finger action according to the pre-stored images comprises searching an action track database in a big data system by comparing the captured fingertip movement track with the fingertip movement tracks stored there, and, when an identical fingertip movement track is found, retrieving the hand action corresponding to that pre-stored image from the big data system.
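A minimal sketch of frame acquisition at the 3-frames-per-second rate mentioned in this embodiment might look as follows; the camera index, frame count, and sleep-based pacing are illustrative assumptions.

```python
import time
import cv2

def capture_motion_sequence(camera_index=0, n_frames=15, fps=3):
    """Collect n_frames hand images at roughly `fps` frames per second."""
    cap = cv2.VideoCapture(camera_index)
    frames, interval = [], 1.0 / fps
    try:
        while len(frames) < n_frames:
            ok, frame = cap.read()
            if ok:
                frames.append(frame)
            time.sleep(interval)   # crude pacing to approximate the preset acquisition frequency
    finally:
        cap.release()
    return frames                  # the "dynamic image" is this ordered frame sequence
```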
In this embodiment, the pre-stored images define the control modes of the laser scanning imager as follows: when the user moves two fingers upward, the scanning angle of the imager is raised; when the user moves two fingers downward, the scanning angle is lowered; when the user moves two fingers to the left, the scanning angle is turned to the left; when the user moves two fingers to the right, the scanning angle is turned to the right; when the user closes the five fingers into a fist, the confirm key of the imager is triggered; and when the user spreads the five fingers, the return key of the imager is triggered. This design makes the hand actions easy to read.
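Such a pre-stored mapping could be held in a simple lookup table, as in the sketch below; the gesture labels and command names are illustrative placeholders, not the imager's actual command set.

```python
# Illustrative lookup table; labels and command names are placeholders, not a real imager API.
GESTURE_COMMANDS = {
    "two_fingers_up":    "ANGLE_UP",      # raise the imager's scanning angle
    "two_fingers_down":  "ANGLE_DOWN",    # lower the scanning angle
    "two_fingers_left":  "ANGLE_LEFT",    # turn the scanning angle to the left
    "two_fingers_right": "ANGLE_RIGHT",   # turn the scanning angle to the right
    "fist":              "CONFIRM",       # five fingers closed into a fist -> confirm key
    "open_palm":         "RETURN",        # five fingers spread -> return key
}

def command_for(gesture_label):
    """Return the imager command for a recognized gesture, or None if the gesture is unknown."""
    return GESTURE_COMMANDS.get(gesture_label)
```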
The corresponding hand action is recognized from the fingertip movement track by a recognition model associated with the pre-stored images: the fingertip movement track is input to the recognition model, and the model outputs the hand action corresponding to that track. The recognition model is trained on multiple groups of training data, each group comprising an input fingertip movement track and the hand action corresponding to it.
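One way to realize such a trained recognition model is sketched below: fingertip tracks are resampled and normalised into fixed-length feature vectors and fed to a nearest-neighbour classifier. The feature construction and the choice of classifier are assumptions of this sketch; the patent only requires a model trained on track/hand-action pairs.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def track_features(track, n_points=16):
    """Resample a fingertip track to a fixed length and normalise position and scale."""
    track = np.asarray(track, dtype=float)
    idx = np.linspace(0, len(track) - 1, n_points)
    resampled = np.stack(
        [np.interp(idx, np.arange(len(track)), track[:, d]) for d in range(2)], axis=1
    )
    resampled -= resampled.mean(axis=0)          # remove translation
    scale = float(np.abs(resampled).max()) or 1.0
    return (resampled / scale).ravel()           # flatten to a fixed-length feature vector

def train_recognizer(training_tracks, hand_action_labels):
    """Fit a 1-nearest-neighbour model on (fingertip track, hand action) training pairs."""
    X = np.stack([track_features(t) for t in training_tracks])
    model = KNeighborsClassifier(n_neighbors=1)
    model.fit(X, hand_action_labels)
    return model  # model.predict([track_features(new_track)]) yields the hand action label
```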
The method further comprises, when no identical fingertip movement track can be found in the action track database of the big data system after waiting 5 to 10 seconds, displaying the pre-stored images and the instructions corresponding to them in the display area; by viewing the images in the display area, the user can further refine the finger movement track so that image matching can be completed smoothly afterwards. Hand-type calibration is also provided: the user moves a finger according to the reserved images displayed in sequence in the display area to complete calibration of the projected gestures. When the track of the finger movement fails to match a reserved image, this hand-action calibration ensures that subsequent images can be matched conveniently.
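A calibration pass of the kind described here could be sketched as follows, where `show_image` and `acquire_trajectory` are hypothetical stand-ins for the imager's display and capture routines: each reserved image is shown in turn and the user's fingertip track for it is stored as a personal template for later matching.

```python
def calibrate_hand_type(reserved_images, show_image, acquire_trajectory):
    """Show each reserved image in turn and record the user's fingertip track for it."""
    templates = {}
    for label, image in reserved_images.items():
        show_image(image)                        # prompt the user with the reserved image
        templates[label] = acquire_trajectory()  # record the user's matching finger movement
    return templates                             # per-user templates used to refine later matching
```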
Because the laser scanning imager can be controlled by simple hand actions, special groups of users, such as deaf-mutes, can operate the machine easily. The functions of the device are targeted and reflect a humanized design.
In the embodiments provided in the present application, it should be understood that the disclosed technical content can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Claims (7)
1. A finger action recognition method of a laser scanning imager, characterized by comprising the following steps:
S1, acquiring the movement track of the finger action;
S2, if the track of the finger action movement matches a pre-stored image, executing step S3; otherwise, executing step S4;
S3, controlling the laser scanning imager according to the control instruction corresponding to the matched image;
S4, if no matching image data is acquired within a predetermined time, executing step S5;
S5, displaying the pre-stored images and the control instructions corresponding to them in a display area;
and S6, sending an instruction according to an image displayed in step S5 to control the laser scanning imager.
2. The method of claim 1, wherein acquiring the track of the finger action movement comprises acquiring a dynamic image of the finger action movement and extracting the fingertip movement track from the dynamic image.
3. The method of claim 2, wherein acquiring the dynamic image of the finger action movement comprises capturing a plurality of finger action images at a preset acquisition frequency with an image acquisition device and generating the dynamic image of the finger action movement from the plurality of images.
4. The method of claim 3, wherein recognizing the track of the finger action movement according to the pre-stored images comprises searching an action track database in a big data system by comparing the captured fingertip movement track with the fingertip movement tracks stored there, and, when an identical fingertip movement track is found in the database, retrieving the hand action corresponding to that pre-stored image from the big data system.
5. The method of claim 4, further comprising displaying the pre-stored images and the instructions corresponding to them in the display area when no identical fingertip movement track can be found in the action track database of the big data system within a preset waiting time.
6. The method of claim 5, wherein the preset waiting time is 5 s to 10 s.
7. The method of claim 6, further comprising hand-type calibration, in which the user moves a finger according to the reserved images displayed in sequence in the display area to complete calibration of the projected gestures.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011313713.5A CN112418080A (en) | 2020-11-20 | 2020-11-20 | Finger action recognition method of laser scanning imager |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011313713.5A CN112418080A (en) | 2020-11-20 | 2020-11-20 | Finger action recognition method of laser scanning imager |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112418080A (en) | 2021-02-26 |
Family
ID=74778645
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011313713.5A (Pending) | Finger action recognition method of laser scanning imager | 2020-11-20 | 2020-11-20 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112418080A (en) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103118189A (en) * | 2013-01-25 | 2013-05-22 | 广东欧珀移动通信有限公司 | Post-call gesture operation method and post-call gesture operation device for mobile phone |
CN104954829A (en) * | 2014-03-27 | 2015-09-30 | Lg电子株式会社 | Display device and operating method thereof |
CN105302452A (en) * | 2014-07-22 | 2016-02-03 | 腾讯科技(深圳)有限公司 | Gesture interaction-based operation method and device |
CN106886275A (en) * | 2015-12-15 | 2017-06-23 | 比亚迪股份有限公司 | The control method of car-mounted terminal, device and vehicle |
CN108268181A (en) * | 2017-01-04 | 2018-07-10 | 奥克斯空调股份有限公司 | A kind of control method and device of non-contact gesture identification |
CN107105093A (en) * | 2017-04-18 | 2017-08-29 | 广东欧珀移动通信有限公司 | Camera control method, device and terminal based on hand track |
CN110368097A (en) * | 2019-07-18 | 2019-10-25 | 上海联影医疗科技有限公司 | A kind of Medical Devices and its control method |
CN110736223A (en) * | 2019-10-29 | 2020-01-31 | 珠海格力电器股份有限公司 | Air conditioner control method and device |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115530855A (en) * | 2022-09-30 | 2022-12-30 | 先临三维科技股份有限公司 | Control method and device of three-dimensional data acquisition equipment and three-dimensional data acquisition equipment |
WO2024067027A1 (en) * | 2022-09-30 | 2024-04-04 | 先临三维科技股份有限公司 | Control method and apparatus for three-dimensional data acquisition device, and three-dimensional data acquisition device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104463152B (en) | A kind of gesture identification method, system, terminal device and Wearable | |
CN107485844B (en) | Limb rehabilitation training method and system and embedded equipment | |
CN106598335B (en) | A kind of touch screen control method, device and mobile terminal of mobile terminal | |
CN106775407A (en) | A kind of touch-screen control method of mobile terminal, device and mobile terminal | |
WO2007097548A1 (en) | Method and apparatus for user-interface using the hand trace | |
US20240077948A1 (en) | Gesture-based display interface control method and apparatus, device and storage medium | |
CN104881122A (en) | Somatosensory interactive system activation method and somatosensory interactive method and system | |
CN112364799A (en) | Gesture recognition method and device | |
Arai et al. | Eye-based human computer interaction allowing phoning, reading e-book/e-comic/e-learning, internet browsing, and tv information extraction | |
CN109976553A (en) | Operation processing method, device, equipment and medium based on keyboard | |
CN111103982A (en) | Data processing method, device and system based on somatosensory interaction | |
CN108829239A (en) | Control method, device and the terminal of terminal | |
Modanwal et al. | A new dactylology and interactive system development for blind–computer interaction | |
CN112418080A (en) | Finger action recognition method of laser scanning imager | |
CN111901518B (en) | Display method and device and electronic equipment | |
Muranaka et al. | A home appliance control system with hand gesture based on pose estimation | |
CN110244853A (en) | Gestural control method, device, intelligent display terminal and storage medium | |
CN110347323A (en) | The input of augmented reality keyboard is transcribed based on hand gesture | |
Lin et al. | Projection-based user interface for smart home environments | |
CN114360686B (en) | Rehabilitation training computer device fusing games, running method and storage medium | |
CN110333780A (en) | Function triggering method, device, equipment and storage medium | |
CN104007916B (en) | A kind of information processing method and electronic equipment | |
CN113407031B (en) | VR (virtual reality) interaction method, VR interaction system, mobile terminal and computer readable storage medium | |
CN113240481A (en) | Model processing method and device, electronic equipment and readable storage medium | |
CN115543135A (en) | Control method, device and equipment for display screen |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20210226 |