CN109583404A - Plane gesture control system and control method based on feature pattern recognition - Google Patents

Plane gesture control system and control method based on feature pattern recognition

Info

Publication number
CN109583404A
CN109583404A (application CN201811487619.4A)
Authority
CN
China
Prior art keywords
marker point
operating area
user
marker
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811487619.4A
Other languages
Chinese (zh)
Other versions
CN109583404B (en)
Inventor
孙晅
杜国铭
李美娟
李超
李东
张文亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Top Technology Co Ltd
Original Assignee
Harbin Top Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Top Technology Co Ltd filed Critical Harbin Top Technology Co Ltd
Priority to CN201811487619.4A priority Critical patent/CN109583404B/en
Publication of CN109583404A publication Critical patent/CN109583404A/en
Application granted granted Critical
Publication of CN109583404B publication Critical patent/CN109583404B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107: Static hand or arm
    • G06V40/113: Recognition of static hand signs
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/56: Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention proposes a plane gesture control system and control method based on feature pattern recognition. The control system completely separates the signal sending end from the receiving end: a camera captures images of the operating region, and the user can hand-draw a control-area pattern on any plane and then control devices within the drawn region. The proposed method and system effectively solve the problem that the user's operating region is generally restricted in current plane gesture control systems, and provide a freer, more flexible, accurate and efficient interactive experience. Moreover, the invention does not depend on laser or projection devices at fixed positions, and using the feature pattern to assist gesture recognition also effectively reduces the computational load of the algorithm, lowering the hardware requirements on the computing platform.

Description

Plane gesture control system and control method based on feature pattern recognition
Technical field
The invention belongs to the field of plane gesture control, and more particularly relates to a plane gesture control system and control method based on feature pattern recognition.
Background art
Plane gesture control has the advantages of simple and efficient interaction, strong scalability and high compatibility, and is one of the current research hotspots in human-computer interaction. Plane gesture control at this stage mainly falls into four classes: based on wearable devices, on touch devices, on ultrasonic sensing, and on two-dimensional image recognition. Wearable devices include sensor gloves, wristbands, etc.; touch devices mainly include capacitive/resistive touch screens, touch-screen films and touch pads; ultrasonic sensing devices capture gestures by emitting ultrasonic waves and receiving the signals reflected from the hand; two-dimensional image recognition methods identify gestures from hand images captured under active or passive illumination, and include laser keyboards, electronic whiteboards, etc.
Wearable devices require the user to wear designated equipment, which may impose an extra burden. The operable area of a touch device is limited by the device size: to keep portability and low cost, the operating area is usually small, while a larger operating area markedly increases device weight and cost. Ultrasonic sensing requires the ultrasonic transmitter, the hand and the receiver to form a complete loop, and to guarantee detection the transmitter and receiver are often integrated in the same module, so the operating area is likewise confined to the vicinity of the device. Two-dimensional image recognition methods usually need no extra worn equipment and are relatively flexible in operating-area size and location, but certain restrictions remain. Specifically:
Laser keyboards and similar devices are highly portable, but, like ultrasonic sensing devices, a certain angle must be maintained between the laser emitter, the hand and the camera, which constrains the device dimensions. In addition, such devices usually require an extra miniature projection unit to mark the effective operating area.
Electronic whiteboards and similar devices offer a large operable area and relatively flexible placement of transmitter and receiver, but they are mostly used for fixed-position interaction and presentation and are not portable. Furthermore, systems such as WorldKit rely heavily on projection equipment, whose size, power consumption and heat output may cause difficulties in practical applications.
Among two-dimensional image recognition methods that do not rely on active illumination, marker-free methods usually carry a heavy computational load and demand more from the computing platform, while marker-based methods either require the user to wear designated equipment or confine the control area to a specific position, under-using the advantages of two-dimensional image recognition.
Summary of the invention
The purpose of the invention is to solve the above problems in the prior art by providing a plane gesture control system and control method based on feature pattern recognition.
The invention is achieved through the following technical solution. The invention proposes a plane gesture control system based on feature pattern recognition, the system comprising a master controller, an operation platform module, a data acquisition module, a data processing module, a data communication module and a controlled device module;
The master controller controls the operation of the other modules and monitors their running status in real time. The operation platform module is the control region on which the user hand-draws the feature pattern, or an object surface that carries the feature pattern in advance; the user makes gesture actions in the marked region of the operation platform module to interact with the system. The data acquisition module captures the gesture actions inside the operating area in real time through a camera and passes the image data to the data processing module. The data processing module extracts hand features from the incoming image data, recognizes gesture commands and passes them to the data communication module. The data communication module receives the gesture commands and sends corresponding function instructions to the controlled device module. The controlled device module receives the function instructions, executes the corresponding functions and returns the running status of the controlled devices.
Further, the controlled device module includes one or more controlled devices.
The invention further proposes a control method for the plane gesture control system based on feature pattern recognition, comprising:
Step 1: initialize the control system;
Step 2: detect the hand-drawn marker points;
Step 3: locate the operating area according to the marker points and the other drawn graphics or annotations, and identify the selected function;
Step 4: detect user gestures in the functional area; when a gesture is detected, execute step 5, otherwise execute step 6;
Step 5: send a control instruction according to the user gesture;
Step 6: if the operating-area graphics have changed, execute step 8, otherwise execute step 7;
Step 7: repeat steps 2 to 6; when control ends, execute step 10;
Step 8: if the operating-area graphics are incomplete or have disappeared, execute step 10, otherwise execute step 9;
Step 9: repeat steps 2 to 8 until control ends;
Step 10: finish control.
Further, step 2 specifically comprises:
Step 2.1: adaptively adjust the imaging parameters according to the imaging conditions;
Step 2.2: capture an image with the new imaging parameters;
Step 2.3: perform color segmentation on the image using the marker color recorded during initialization;
Step 2.4: perform HOG feature detection on the segmentation result to obtain the position and size of each marker point.
Further, step 3 specifically comprises:
Step 3.1: according to the marker detection result, execute step 3.2 when there is exactly one marker point and step 3.5 when there are multiple marker points; when there is no marker point, label the current operating-area state as "invalid operating area" and execute step 3.9;
Step 3.2: if there is an additional annotation around the marker point, execute step 3.3; otherwise label the marker point and its adjacent region as "operating area with a single marker point and no annotation" and execute step 3.9;
Step 3.3: recognize the annotation content by OCR;
Step 3.4: if the annotation content meets the preset template requirements, label the marker point and annotation region as "operating area with a single marker point and annotation"; otherwise label it "invalid operating area"; execute step 3.9;
Step 3.5: if the arrangement of the marker points meets the preset template requirements, execute step 3.6; otherwise label the region around each marker point as "invalid operating area" and execute step 3.9;
Step 3.6: if there are additional annotations around the marker points, execute step 3.7; otherwise label the region enclosed by the marker points as "operating area with multiple marker points and no annotation" and execute step 3.9;
Step 3.7: recognize the annotation content by OCR;
Step 3.8: if the annotation content meets the preset template requirements, label the region enclosed by the marker points and annotations as "operating area with multiple marker points and annotations"; otherwise label it "invalid operating area"; execute step 3.9;
Step 3.9: determine the operating-area function from the operating-area type, the marker-point type and the annotation content.
Further, detecting user gestures in the functional area specifically comprises:
Step 4.1: capture an image and read the operating-area parameters;
Step 4.2: extract the spatial and temporal features of finger shape, position and trajectory;
Step 4.3: analyze the extracted features to obtain the gesture command.
Further, there may be multiple operating areas, and the user can control the same device or different devices by drawing multiple operating areas.
Further, the multiple operating areas can be controlled by multiple people: when there are multiple users, each user draws an operating area and controls the same or a different device.
Further, when the same device is controlled through multiple operating areas simultaneously, the control system takes the first recognized instruction as authoritative to prevent instruction conflicts; within 30 s after an instruction from a given operating area has been recognized and executed, the controlled device responds only to instructions from that operating area.
Further, the user can supplement and/or modify an existing operating area.
Beneficial effects of the invention:
1. The user can define an operating area by hand-drawing a feature pattern and control the corresponding device; compared with conventional methods, the operating region is far less spatially restricted and the interaction is freer, more flexible, accurate and efficient;
2. The user can draw a temporary operating area with a fading-ink pen; the ink evaporates without trace after a period of time, making the method suitable for any scenario where a human-computer interface must be set up temporarily;
3. The feature pattern may be, but is not limited to, hand-drawn; any surface carrying the feature pattern can serve as an operating area;
4. The feature pattern can be customized to user demand, making the system more user-friendly;
5. The invention does not rely on laser or projection devices at fixed positions, so the hardware can be installed more flexibly;
6. Using the feature pattern to assist gesture recognition effectively reduces the computational load of the algorithm and lowers the hardware requirements on the computing platform.
Description of the drawings
Fig. 1 is the structure diagram of the plane gesture control system based on feature pattern recognition;
Fig. 2 is the flow chart of the plane gesture control method based on feature pattern recognition;
Fig. 3 is the flow chart of marker-point detection;
Fig. 4 is the flow chart of operating-area localization and function identification;
Fig. 5 is the indoor layout diagram;
Fig. 6 is a schematic diagram of an operating area with a single marker point and no annotation;
Fig. 7 is a schematic diagram of an operating area with a single marker point and annotations;
Fig. 8 is a schematic diagram of an operating area with multiple marker points and no annotation;
Fig. 9 is a schematic diagram of an operating area with multiple marker points and annotations;
Fig. 10 is a schematic diagram of a control panel composed of multiple operating areas;
Fig. 11 is a schematic diagram of operating areas drawn by different users;
Fig. 12 is a schematic diagram of operating-area supplementation;
Fig. 13 is a schematic diagram of the TV operating area after modification.
Specific embodiments
The technical solution in the embodiments of the invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only part of the embodiments of the invention rather than all of them. Based on the embodiments of the invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the invention.
As shown in Fig. 1, the invention proposes a plane gesture control system based on feature pattern recognition, the system comprising a master controller, an operation platform module, a data acquisition module, a data processing module, a data communication module and a controlled device module;
The master controller controls the operation of the other modules and monitors their running status in real time. The operation platform module is the control region on which the user hand-draws the feature pattern, or an object surface that carries the feature pattern in advance; the user makes gesture actions in the marked region of the operation platform module to interact with the system. The data acquisition module captures the gesture actions inside the operating area in real time through a camera and passes the image data to the data processing module. The data processing module extracts hand features from the incoming image data, recognizes gesture commands and passes them to the data communication module. The data communication module receives the gesture commands and sends corresponding function instructions to the controlled device module. The controlled device module receives the function instructions, executes the corresponding functions and returns the running status of the controlled devices. The controlled device module includes one or more controlled devices.
With reference to Fig. 2, the invention further proposes a control method for the plane gesture control system based on feature pattern recognition:
Step 1: initialize the control system;
Step 2: detect the hand-drawn marker points;
Step 3: locate the operating area according to the marker points and the other drawn graphics or annotations, and identify the selected function;
Step 4: detect user gestures in the functional area; when a gesture is detected, execute step 5, otherwise execute step 6;
Step 5: send a control instruction according to the user gesture;
Step 6: if the operating-area graphics have changed, execute step 8, otherwise execute step 7;
Step 7: repeat steps 2 to 6; when control ends, execute step 10;
Step 8: if the operating-area graphics are incomplete or have disappeared, execute step 10, otherwise execute step 9;
Step 9: repeat steps 2 to 8 until control ends;
Step 10: finish control.
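Under the assumptions stated below, steps 1-10 can be sketched as a single polling loop. The patent specifies no code, so every name here is an illustrative stand-in, and the detection, localization and communication stages are injected as callables:

```python
def run_control_loop(detect_markers, locate_areas, detect_gesture,
                     send_command, area_changed, area_gone,
                     control_ended, max_iter=1000):
    """Steps 2-10 as a polling loop; every stage is an injected callable."""
    for _ in range(max_iter):
        markers = detect_markers()        # step 2: find hand-drawn markers
        areas = locate_areas(markers)     # step 3: locate areas, pick function
        gesture = detect_gesture(areas)   # step 4: watch the functional area
        if gesture is not None:
            send_command(gesture)         # step 5: forward the instruction
        if area_changed():                # step 6: drawing altered?
            if area_gone():               # step 8: incomplete or erased?
                break                     # -> step 10
        elif control_ended():             # steps 7/9: loop until control ends
            break
    return "finished"                     # step 10: finish control
```

The callables make the branch structure of steps 6-9 testable without any camera or device hardware.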
Step 1 specifically comprises:
Step 1.1: read into the system, or input, the following parameters:
Function definitions;
Marker patterns;
Operating-area shape and size;
Marker color;
Reference skin tones;
Hand template;
Face template;
White balance;
Exposure;
Contrast;
Camera intrinsic parameters;
Other relevant parameters;
Step 1.2: configure the system according to the parameters.
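As a sketch only, the step 1.1 parameter set might be held in a single configuration object. The field names, types and default values below are assumptions for illustration, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class InitParams:
    """Step 1.1 parameter set; names, types and defaults are assumptions."""
    function_definitions: dict = field(default_factory=dict)
    marker_patterns: list = field(default_factory=lambda: ["triangle", "TV"])
    area_shape: str = "rectangle"
    marker_color: tuple = (255, 0, 0)       # assumed RGB reference colour
    reference_skin_tone: tuple = (224, 172, 105)
    white_balance: float = 1.0
    exposure: float = 0.0
    contrast: float = 1.0
    camera_intrinsics: tuple = ()           # fx, fy, cx, cy once calibrated

def configure_system(params: InitParams) -> dict:
    """Step 1.2 stand-in: derive a runtime config from the parameters."""
    return {"shape": params.area_shape,
            "patterns": list(params.marker_patterns)}
```

A dataclass keeps the parameter list explicit while letting individual values (e.g. exposure) be overridden at initialization.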
With reference to Fig. 3, step 2 specifically comprises:
Step 2.1: adaptively adjust the imaging parameters according to imaging conditions such as illumination;
Step 2.2: capture an image with the new imaging parameters;
Step 2.3: perform color segmentation on the image using the marker color recorded during initialization;
Step 2.4: perform HOG feature detection on the segmentation result to obtain the position and size of each marker point.
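A minimal illustration of steps 2.3-2.4, assuming an RGB image held as nested lists of pixel tuples; the per-channel colour test and the bounding-box step are simplified stand-ins for the patent's unspecified segmentation and HOG detector:

```python
def segment_marker_color(image, target, tol=30):
    """Step 2.3 sketch: binary mask of pixels whose RGB colour lies
    within `tol` of the marker colour recorded at initialization.
    `image` is a list of rows of (r, g, b) tuples."""
    return [[all(abs(c - t) <= tol for c, t in zip(px, target))
             for px in row] for row in image]

def marker_bounding_box(mask):
    """Step 2.4 stand-in: position and size of the segmented region as
    (x, y, width, height). A real system would run a HOG detector on
    the mask; here we simply take its bounding box."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, on in enumerate(row) if on]
    if not pts:
        return None
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    return (min(xs), min(ys),
            max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)
```

In practice the segmentation would run in a perceptual colour space (e.g. HSV) to be robust to the illumination changes step 2.1 compensates for.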
With reference to Fig. 4, step 3 specifically comprises:
Step 3.1: according to the marker detection result, execute step 3.2 when there is exactly one marker point and step 3.5 when there are multiple marker points; when there is no marker point, label the current operating-area state as "invalid operating area" and execute step 3.9;
Step 3.2: if there is an additional annotation around the marker point, execute step 3.3; otherwise label the marker point and its adjacent region as "operating area with a single marker point and no annotation" and execute step 3.9;
Step 3.3: recognize the annotation content by OCR;
Step 3.4: if the annotation content meets the preset template requirements, label the marker point and annotation region as "operating area with a single marker point and annotation"; otherwise label it "invalid operating area"; execute step 3.9;
Step 3.5: if the arrangement of the marker points meets the preset template requirements, execute step 3.6; otherwise label the region around each marker point as "invalid operating area" and execute step 3.9;
Step 3.6: if there are additional annotations around the marker points, execute step 3.7; otherwise label the region enclosed by the marker points as "operating area with multiple marker points and no annotation" and execute step 3.9;
Step 3.7: recognize the annotation content by OCR;
Step 3.8: if the annotation content meets the preset template requirements, label the region enclosed by the marker points and annotations as "operating area with multiple marker points and annotations"; otherwise label it "invalid operating area"; execute step 3.9;
Step 3.9: determine the operating-area function from the operating-area type, the marker-point type and the annotation content.
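The branching of steps 3.1-3.9 can be condensed into one decision function; `layout_ok` and `annotation_ok` are hypothetical predicates standing in for the preset-template checks, and the label strings mirror the four area types named above:

```python
def classify_area(num_markers, annotation, layout_ok, annotation_ok):
    """Steps 3.1-3.8 condensed: map the detection result to one of the
    four operating-area types or to 'invalid operating area'.
    `annotation` is the OCR text (None when nothing is written)."""
    if num_markers == 0:                                   # step 3.1
        return "invalid operating area"
    if num_markers == 1:
        if annotation is None:                             # step 3.2
            return "single marker, no annotation"
        return ("single marker with annotation"            # steps 3.3-3.4
                if annotation_ok(annotation) else "invalid operating area")
    if not layout_ok():                                    # step 3.5
        return "invalid operating area"
    if annotation is None:                                 # step 3.6
        return "multiple markers, no annotation"
    return ("multiple markers with annotations"            # steps 3.7-3.8
            if annotation_ok(annotation) else "invalid operating area")
```

Step 3.9 would then look up the concrete function (lamp switch, channel control, etc.) from the returned type together with the marker pattern and annotation text.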
Detecting user gestures in the functional area specifically comprises:
Step 4.1: capture an image and read the operating-area parameters;
Step 4.2: extract the spatial and temporal features of finger shape, position and trajectory;
Step 4.3: analyze the extracted features to obtain the gesture command.
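A toy version of steps 4.2-4.3 for stroke gestures, echoing the Fig. 8 examples (a rightward stroke switches to the next channel, an upward stroke raises the volume). The dominant-axis rule and the command names are illustrative assumptions, not the patent's actual feature analysis:

```python
def classify_stroke(points):
    """Reduce a fingertip trajectory (list of (x, y) image coordinates)
    to one of four commands by its dominant displacement axis.
    The image y axis points downward, so dy = y0 - y1 is 'up'."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y0 - y1
    if abs(dx) >= abs(dy):
        return "next_channel" if dx > 0 else "previous_channel"
    return "volume_up" if dy > 0 else "volume_down"
```

A full implementation would also use the shape and timing features of step 4.2, e.g. to separate a polyline stroke from an accidental swipe.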
The invention uses a camera to capture and recognize predefined feature patterns so as to control terminal devices. The user can draw the given pattern at any chosen location, for example by drawing the control area on paper and pasting it in place, or by drawing a temporary control area on a wall, desktop or similar surface with a fading-ink pen, whose ink fades by itself after a period of time and requires no cleaning. The feature pattern may be, but is not limited to, hand-drawn; any surface carrying the feature pattern can serve as an operating area, and the corresponding hardware device can also act as an organic component of the system.
The control system completely separates the signal sending end from the receiving end: a camera captures images of the operating region, and the user can hand-draw a control-area pattern on any plane and then control devices within the drawn region. The proposed method and system effectively solve the problem that the user's operating region is generally restricted in current plane gesture control systems, and provide a freer, more flexible, accurate and efficient interactive experience. Moreover, the invention does not rely on laser or projection devices at fixed positions, and using the feature pattern to assist gesture recognition also effectively reduces the computational load of the algorithm, lowering the hardware requirements on the computing platform.
Smart home control system based on a hand-drawn control panel
The following takes an application in the smart home field as an example to illustrate an embodiment of the invention:
Hardware devices
The system uses a monocular fisheye panoramic camera as the capture device and a fading-ink pen as the drawing tool; the controlled devices include a TV, an electrically controlled floor lamp, an electric curtain and an air conditioner. The indoor layout is shown in Fig. 5.
Single-operating-area control
Operating area with a single marker point and no annotation
Taking floor-lamp control as an example, the control method for an operating area with a single marker point and no annotation is illustrated below.
First, a triangle is defined in the system as the control marker of the floor lamp. After the user draws a triangle pattern at the chosen location, the system recognizes the drawing as the floor-lamp switch, as shown in Fig. 6(a). The user can then switch the floor lamp on or off by tapping the triangle with a finger, as shown in Fig. 6(b).
Other figures can also be defined as marker points; the operation is essentially the same. Besides tapping, when controlling a floor lamp with a dimming function, the user can also slide a finger over the marker point to adjust the brightness.
Operating area with a single marker point and annotations
Taking TV control as an example, the control method for an operating area with a single marker point and annotations is illustrated below.
The letters "TV" are defined as the TV control marker. The user draws the "TV" marker and makes annotations around it, as shown in Fig. 7(a); the system recognizes "P-" as "switch to the previous channel", "P+" as "switch to the next channel", "V-" as "decrease volume" and "V+" as "increase volume". The corresponding function executes when the user taps an annotation with a finger; for example, tapping "V+" increases the TV volume, as shown in Fig. 7(b).
When the annotation content is illegible, as in Fig. 7(c), the system cannot recognize it and the area is treated as an invalid operating area.
In addition to the above control modes, marker points and annotations can be customized by the user.
Operating area with multiple marker points and no annotation
Taking TV control as an example, the control method for an operating area with multiple marker points and no annotation is illustrated below.
The letters "TV" are defined as the TV control marker and the operating region as a rectangle. When the user draws two "TV" markers along a diagonal, the system recognizes the rectangle whose diagonal vertices are the drawn markers as the operating area, as shown in Fig. 8(a), where the "TV" markers are hand-drawn by the user and the dashed line is the region recognized by the system. A corresponding function executes when the user draws a specified graphic inside the operating area: drawing a rightward polyline, as in Fig. 8(b), means "switch to the next channel", and drawing an upward polyline, as in Fig. 8(c), means "increase volume".
Since the defined operating region is a rectangle, marker points arranged in a rectangle are recognized successfully, as shown in Fig. 8(d), while marker points whose arrangement is not rectangular are treated as an invalid operating area, as in Fig. 8(e) and (f).
If the system definition is changed, for example to a triangular operating region, then Fig. 8(e) will be recognized successfully as an operating area and the other arrangements will be treated as invalid. Marker points and drawn graphic commands can likewise be customized by the user.
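The rectangle implied by two diagonal "TV" markers (Fig. 8(a)) and a simplified layout check can be sketched as follows; both functions are illustrative assumptions rather than the patent's actual geometry routine:

```python
def area_from_diagonal(m1, m2):
    """Fig. 8(a) sketch: the axis-aligned rectangle whose diagonal
    vertices are the two hand-drawn markers, as (x, y, width, height)."""
    x, y = min(m1[0], m2[0]), min(m1[1], m2[1])
    return (x, y, abs(m1[0] - m2[0]), abs(m1[1] - m2[1]))

def layout_matches(markers, shape="rectangle"):
    """Preset-template layout check, heavily simplified: two markers
    always define a rectangle diagonal; other counts are rejected here,
    although the patent also allows e.g. triangular definitions."""
    return shape == "rectangle" and len(markers) == 2
```

Swapping `layout_matches` for a triangle test is how the redefinition described above (Fig. 8(e) becoming valid) would be modelled.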
Operating area with multiple marker points and annotations
The control mode of an operating area with multiple marker points and annotations supplements that of an operating area with multiple marker points and no annotation. Additional annotations make the operating-area functions clearer and more intuitive, and can be customized by the user. In the operating area shown in Fig. 9, compared with Fig. 8(a), since "P+", "P-", "V+" and "V-" are annotated, the user can execute a function by tapping the annotation directly and no longer needs to draw a polyline gesture; the polyline gestures can therefore be mapped to other functions, such as switching the TV on or off.
Multi-operating-area control
The user can control the same device or different devices by drawing multiple operating areas.
Single user, multiple operating areas
The user can draw the same operating area at different locations to control the same device, or draw several different operating areas as needed to form a complete home control panel, as shown in Fig. 10, where the triangle represents the floor lamp (tapping "ON" switches it on, "OFF" switches it off), "TV" represents the TV (whose annotation functions were introduced above), the rectangle represents the electric curtain (tapping "OPEN" opens the curtain, "CLOSE" closes it), and the asterisk represents the air conditioner (tapping "T+" raises the temperature, "T-" lowers it).
All marker points and annotations can be customized by the user; within the defined scope, the order in which marker points and annotations are arranged does not affect their function.
Multiple users, multiple operating areas
When there are multiple users, each user can draw an operating area and control the same or different devices, as shown in Fig. 11, where (a) is the operating area drawn by user A, containing the two control functions of floor lamp and TV, and (b) is the operating area drawn by user B, containing the two control functions of TV and electric curtain. Both user A and user B can then control the TV.
When the same device is controlled through multiple operating areas simultaneously, the system takes the first recognized instruction as authoritative to prevent instruction conflicts. Within 30 s after an instruction from a given operating area has been recognized and executed, the controlled device responds only to instructions from that operating area.
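The 30 s first-recognized-wins rule can be sketched as a small lock object. The class and method names are illustrative, and the clock is injected so the behaviour can be tested without real waiting:

```python
import time

class ConflictLock:
    """After executing an instruction from one operating area, the
    device answers only that area for `window` seconds (30 s in the
    text); instructions from other areas are rejected meanwhile."""
    def __init__(self, window=30.0, clock=time.monotonic):
        self.window = window
        self.clock = clock
        self.owner = None      # operating area currently holding the lock
        self.since = 0.0       # time its last instruction was executed

    def try_execute(self, area_id):
        """Return True if this area's instruction may execute now."""
        now = self.clock()
        locked_out = (self.owner is not None
                      and now - self.since < self.window
                      and area_id != self.owner)
        if locked_out:
            return False
        self.owner, self.since = area_id, now
        return True
```

Renewing `since` on each accepted instruction means an actively used operating area keeps its priority, which seems consistent with the "only responds to that area" wording.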
Operating-area supplementation and modification
Operating-area supplementation
The user can supplement an existing operating area. Fig. 12(a) shows a drawn TV operating area; the user can extend it by adding marker points or annotations. As shown in Fig. 12(b), the system recognizes the right-hand region as a channel control area, and when the user writes a channel number with a finger in that area, the TV jumps to the corresponding channel, as shown in Fig. 12(c).
The supplementary marker points and annotations recognized by the system can be customized by the user. In use, the supplementary content must conform to the definitions, otherwise it will be treated as an invalid operating area.
Operating-area modification
The user can modify an existing operating area. For example, in Fig. 12(b), once the user has added the channel control area, its function duplicates the two channel-switching functions "P+" and "P-" on the left; the user can then cross out "P+" and "P-" and draw "V+" and "V-", as shown in Fig. 13, whereupon the system removes the original two channel-switching functions and adds the volume-adjustment function.
The plane gesture control system and control method based on feature pattern recognition provided by the invention have been described in detail above. Specific examples are used herein to explain the principle and implementation of the invention, and the description of the above embodiments is only intended to help understand the method of the invention and its core idea. Meanwhile, those of ordinary skill in the art may make changes to the specific implementation and application scope according to the idea of the invention. In summary, the contents of this specification shall not be construed as limiting the invention.

Claims (10)

1. it is a kind of based on characteristic pattern identification plane gestural control system, it is characterised in that: the system comprises master controller, Operation platform module, data acquisition module, data processing module, data communication module and controlled plant module;
The operation of other each modules of the main controller controls simultaneously monitors each module operating status in real time;Operation platform module is to use The manipulation region of family Freehandhand-drawing characteristic pattern or the body surface region for having characteristic pattern in advance, user can be in operation platform module On marked region in make gesture motion and system interaction;Data acquisition module is captured in operating space in real time by camera Gesture motion, and image data is passed to data processing module;Data processing module extracts hand from incoming image data Feature identifies gesture command, and incoming data communication module;Data communication module receives gesture command, and to controlled plant mould Block sends corresponding function instruction;Controlled plant module receive capabilities instruct and execute corresponding function, while returning to controlled plant fortune Row state.
2. The system according to claim 1, characterized in that: the controlled device module comprises one or more controlled devices.
3. A control method for the plane gesture control system based on characteristic pattern recognition according to claim 1, characterized in that:
Step 1: initialize the control system;
Step 2: detect hand-drawn index points;
Step 3: locate and mark the operating area according to the index points and the other drawn figures, and identify the selected function;
Step 4: detect user gestures in the functional area; when a gesture is detected, execute Step 5, otherwise execute Step 6;
Step 5: send a control instruction according to the user gesture;
Step 6: if the operating-area figure has changed, execute Step 8, otherwise execute Step 7;
Step 7: repeat Steps 2 to 6 until control ends, then execute Step 10;
Step 8: if the operating-area figure is incomplete or has disappeared, execute Step 10, otherwise execute Step 9;
Step 9: repeat Steps 2 to 8 until control ends;
Step 10: end control.
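The Steps 2-10 loop above can be sketched as follows. The callback names are hypothetical placeholders for the detection routines of claims 4-6; the loop structure (detect, act, re-check the drawn area, repeat until the area is broken or control ends) follows the claim.

```python
# Illustrative sketch of the claim-3 control loop. Each callable is an
# assumed stand-in: detect_markers/locate_area/detect_gesture for Steps 2-4,
# send_command for Step 5, area_changed/area_broken for Steps 6 and 8.

def control_loop(detect_markers, locate_area, detect_gesture, send_command,
                 area_changed, area_broken, should_stop, max_iters=1000):
    for _ in range(max_iters):
        markers = detect_markers()                 # Step 2
        area = locate_area(markers)                # Step 3
        gesture = detect_gesture(area)             # Step 4
        if gesture is not None:
            send_command(gesture)                  # Step 5
        if area_changed() and area_broken():       # Steps 6 and 8
            break                                  # figure incomplete/gone
        if should_stop():                          # Steps 7/9: control ends
            break
    return "ended"                                 # Step 10
```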
4. The method according to claim 3, characterized in that Step 2 specifically comprises:
Step 2.1: adaptively adjust the imaging parameters according to the imaging conditions;
Step 2.2: capture an image using the new imaging parameters;
Step 2.3: perform color segmentation on the image using the marker color entered during initialization;
Step 2.4: perform HOG feature detection on the color segmentation result to obtain the position and size of each index point.
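Steps 2.3-2.4 can be approximated with plain NumPy: threshold pixels near the recorded marker color, then take the bounding box of the mask as the index point's position and size. This is a minimal sketch, not the patented method — a real implementation would verify each candidate region with an HOG descriptor (e.g. OpenCV's `cv2.HOGDescriptor`), and the distance threshold here is an assumption.

```python
import numpy as np

# Minimal sketch of color segmentation (Step 2.3) followed by extracting an
# index point's position and size (Step 2.4). HOG verification is omitted.

def detect_index_point(image, marker_color, threshold=40.0):
    # distance of every pixel to the marker color recorded at initialization
    dist = np.linalg.norm(image.astype(float)
                          - np.asarray(marker_color, float), axis=-1)
    mask = dist < threshold                       # Step 2.3: color segmentation
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None                               # no index point found
    x, y = xs.min(), ys.min()                     # Step 2.4: position ...
    w, h = xs.max() - x + 1, ys.max() - y + 1     # ... and size
    return (int(x), int(y), int(w), int(h))
```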
5. The method according to claim 4, characterized in that Step 3 specifically comprises:
Step 3.1: according to the index point detection result, execute Step 3.2 when there is exactly one index point and Step 3.5 when there are multiple index points; when there is no index point, mark the current operating-area state as "invalid operating area" and execute Step 3.9;
Step 3.2: if there is an additional mark around the index point, execute Step 3.3; otherwise mark the index point and its adjacent region as "operating area containing a single index point without mark" and execute Step 3.9;
Step 3.3: recognize the mark content by an OCR method;
Step 3.4: if the mark content meets the preset template requirements, mark the index point and the region adjacent to the mark as "operating area containing a single index point and mark"; otherwise mark it as "invalid operating area"; execute Step 3.9;
Step 3.5: if the arrangement of the index points meets the preset template requirements, execute Step 3.6; otherwise mark the region enclosed by the index points as "invalid operating area" and execute Step 3.9;
Step 3.6: if there is an additional mark around the index points, execute Step 3.7; otherwise mark the region enclosed by the index points as "operating area containing multiple index points without mark" and execute Step 3.9;
Step 3.7: recognize the mark content by an OCR method;
Step 3.8: if the mark content meets the preset template requirements, mark the region enclosed by the index points and the mark as "operating area containing multiple index points and mark"; otherwise mark it as "invalid operating area"; execute Step 3.9;
Step 3.9: determine the operating-area function according to the operating-area type, index point type and mark content.
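The branching of Steps 3.1-3.8 reduces to a small decision table over three facts: how many index points were found, whether an additional mark is present, and whether the arrangement / OCR'd mark content match the preset template. The label strings and function signature below are illustrative, not the patent's wording.

```python
# Hypothetical encoding of the claim-5 decision logic as a pure function.
# arrangement_ok: Step 3.5 template check; mark_ok: Steps 3.4/3.8 OCR check.

def classify_area(n_points, has_mark, arrangement_ok=True, mark_ok=True):
    if n_points == 0:                             # Step 3.1: no index point
        return "invalid operating area"
    if n_points == 1:                             # Steps 3.2-3.4
        if not has_mark:
            return "single index point, no mark"
        return ("single index point and mark" if mark_ok
                else "invalid operating area")
    if not arrangement_ok:                        # Step 3.5
        return "invalid operating area"
    if not has_mark:                              # Step 3.6
        return "multiple index points, no mark"
    return ("multiple index points and mark" if mark_ok   # Step 3.8
            else "invalid operating area")
```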
6. The method according to claim 5, characterized in that detecting user gestures in the functional area specifically comprises:
Step 4.1: capture an image and read the operating-area parameters;
Step 4.2: extract the spatial and temporal features of finger shape, position and trajectory;
Step 4.3: analyze the extracted features to obtain the gesture command.
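As one illustration of Steps 4.2-4.3, a fingertip trajectory can be reduced to a coarse net-displacement feature and mapped to a command. The distance threshold and the gesture/command names are assumptions, not taken from the patent.

```python
import math

# Sketch: classify a tracked fingertip trajectory (list of (x, y) points)
# into a tap or one of four swipe commands by its net displacement.

def gesture_from_track(points, min_dist=30.0):
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < min_dist:
        return "tap"                          # little net motion: a tap
    if abs(dx) >= abs(dy):                    # dominant horizontal motion
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"   # image y grows downward
```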
7. The method according to any one of claims 3-6, characterized in that: there are multiple operating areas, and the user controls the same device or different devices by drawing multiple operating areas.
8. The method according to claim 7, characterized in that: the multiple operating areas are controlled by multiple people; when there are multiple users, each user draws an operating area and controls the same device or different devices.
9. The method according to claim 8, characterized in that: when multiple operating areas control the same device simultaneously, the control system acts on the instruction recognized first in order to prevent instruction conflicts; after an instruction from a given operating area has been recognized and executed, for the following 30 s the controlled device responds only to instructions from that operating area.
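The claim-9 conflict rule amounts to a 30-second ownership lock on the controlled device. A minimal sketch, with an injected clock so the timing can be exercised deterministically (the class and method names are hypothetical):

```python
# Sketch of the claim-9 arbitration: the first-recognized instruction wins,
# and for LOCK_SECONDS afterwards only the same operating area is accepted.

LOCK_SECONDS = 30.0

class ConflictArbiter:
    def __init__(self, clock):
        self.clock = clock        # injected time source, e.g. time.monotonic
        self.owner = None         # operating area currently holding the lock
        self.until = 0.0          # lock expiry time

    def accept(self, area_id):
        """Return True if an instruction from area_id may be executed now."""
        now = self.clock()
        if now < self.until and area_id != self.owner:
            return False          # another area holds the 30 s lock
        self.owner = area_id      # (re)take the lock and refresh the window
        self.until = now + LOCK_SECONDS
        return True
```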
10. The method according to any one of claims 3-6, 8 and 9, characterized in that: the user can augment and/or modify the system on the basis of the existing operating areas.
CN201811487619.4A 2018-12-06 2018-12-06 A kind of plane gestural control system and control method based on characteristic pattern identification Active CN109583404B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811487619.4A CN109583404B (en) 2018-12-06 2018-12-06 A kind of plane gestural control system and control method based on characteristic pattern identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811487619.4A CN109583404B (en) 2018-12-06 2018-12-06 A kind of plane gestural control system and control method based on characteristic pattern identification

Publications (2)

Publication Number Publication Date
CN109583404A true CN109583404A (en) 2019-04-05
CN109583404B CN109583404B (en) 2019-08-02

Family

ID=65927455

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811487619.4A Active CN109583404B (en) 2018-12-06 2018-12-06 A kind of plane gestural control system and control method based on characteristic pattern identification

Country Status (1)

Country Link
CN (1) CN109583404B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111258430A (en) * 2020-01-21 2020-06-09 哈尔滨拓博科技有限公司 Desktop interaction system based on monocular gesture control
CN114020192A (en) * 2021-09-18 2022-02-08 特斯联科技集团有限公司 Interaction method and system for realizing non-metal plane based on curved surface capacitor

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102609153A (en) * 2012-01-29 2012-07-25 胡世曦 Device for calculating contact position between object and working plane
CN106201179A (en) * 2016-06-28 2016-12-07 努比亚技术有限公司 Electronic equipment and information processing method
CN107656637A (en) * 2017-08-28 2018-02-02 哈尔滨拓博科技有限公司 A kind of automation scaling method using the projected keyboard for choosing manually at 4 points
CN107998670A (en) * 2017-12-13 2018-05-08 哈尔滨拓博科技有限公司 Remote-control toy control system based on plane gesture identification
CN108870757A (en) * 2018-06-29 2018-11-23 哈尔滨拓博科技有限公司 A kind of controlling device for water heater and control method based on plane gesture identification



Also Published As

Publication number Publication date
CN109583404B (en) 2019-08-02

Similar Documents

Publication Publication Date Title
RU2439653C2 (en) Virtual controller for display images
CN100465867C (en) Handwritten information input apparatus
DE102010031878A1 (en) System and method for remote on-screen virtual input
WO2014030888A1 (en) System and method for perceiving images with multimodal feedback
CN101498973B (en) Touch control interpretation structure and method for executing touch control application program by multi-finger gesture
US20110280441A1 (en) Projector and projection control method
CN107589832A (en) It is a kind of based on optoelectronic induction every empty gesture identification method and its control device
CN103150019A (en) Handwriting input system and method
CN109583404B (en) A kind of plane gestural control system and control method based on characteristic pattern identification
WO2015050322A1 (en) Method by which eyeglass-type display device recognizes and inputs movement
Chen et al. Interacting with digital signage using hand gestures
CN107454304A (en) A kind of terminal control method, control device and terminal
CN101923433A (en) Man-computer interaction mode based on hand shadow identification
CN113641283A (en) Electronic device, screen writing mode switching method and medium thereof
CN106033286A (en) A projection display-based virtual touch control interaction method and device and a robot
Cantzler et al. A Novel Form of Pointing Device.
Khan et al. Computer vision based mouse control using object detection and marker motion tracking
CN111679737B (en) Hand segmentation method and electronic device
CN106339089A (en) Human-computer interaction action identification system and method
CN104777900A (en) Gesture trend-based graphical interface response method
CN113497885A (en) Camera starting method, electronic equipment and storage medium
CN101840291A (en) Light source type positioning system and method thereof
Kadam et al. Mouse operations using finger tracking
Suto et al. A tabletop system using infrared image recognition for multi-user identification
Krishna Automated Gesture Recognition System Using Raspberry Pi

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Du Guoming

Inventor after: Sun Xuan

Inventor after: Li Meijuan

Inventor after: Li Chao

Inventor after: Li Dong

Inventor after: Zhang Wenliang

Inventor before: Sun Xuan

Inventor before: Du Guoming

Inventor before: Li Meijuan

Inventor before: Li Chao

Inventor before: Li Dong

Inventor before: Zhang Wenliang

GR01 Patent grant