CN107678425A - Trolley control device based on Kinect gesture recognition - Google Patents

Trolley control device based on Kinect gesture recognition

Info

Publication number
CN107678425A
CN107678425A (application CN201710758670.3A)
Authority
CN
China
Prior art keywords
joint
coordinate
module
gesture
axis coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710758670.3A
Other languages
Chinese (zh)
Inventor
史鸣谦
严惠
何新
戴伟
周青云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology
Priority to CN201710758670.3A
Publication of CN107678425A
Legal status: Pending

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Abstract

The present invention is a trolley control device based on Kinect gesture recognition, comprising: a data acquisition module, namely a Kinect sensor, responsible for receiving action commands from the user; a data processing and gesture recognition module, which analyses the action commands collected by the sensor, passes them to the screen display module, and generates the corresponding control instructions for the lower computer; a screen display module, which displays in real time the data processed by the data processing and gesture recognition module and the corresponding instructions; a communication module, responsible for the wireless communication between the upper computer and the lower computer; and a lower computer, a three-wheeled trolley with two-wheel drive controlled by an 8051-family single-chip microcomputer, which remotely receives the control instructions issued by the upper computer and controls the forward rotation, reversal and stopping of the motors.

Description

Trolley control device based on Kinect gesture recognition
Technical field
The invention belongs to the technical field of image recognition and control, and relates to a trolley control device based on Kinect gesture recognition.
Background art
In a typical trolley control system, controlling the motion posture of the trolley requires a person to hold a joystick or to enter the corresponding commands on a controller. This mode of operation cannot satisfy particular applications in which the user needs to be separated from the controller. Compared with traditional control methods, the advantage of gesture control is that the trolley can be controlled without touching any controller.
Traditional gesture recognition, moreover, is performed either with a single camera or with a wearable device. The former struggles to overcome the influence of complex backgrounds and lighting on the recognition result, and recognition is slow; the latter requires an additional wearable device. Controlling a trolley with either technique inevitably brings many inconveniences.
Therefore, a new trolley control device is needed to solve the above problems.
Summary of the invention
In view of the defects of the prior art, the present invention provides a trolley control device based on Kinect gesture recognition.
A trolley control device based on Kinect gesture recognition comprises a data acquisition module, a gesture recognition module, a display module, a communication module and a lower-computer module;
the data acquisition module comprises a Kinect sensor, which collects human-body depth information and skeleton information;
the gesture recognition module processes the data collected by the data acquisition module and identifies the operator's control instructions;
the display module receives the processed data from the data processing and gesture recognition module;
the communication module transmits the control instructions of the gesture recognition module to the lower-computer module;
the lower-computer module controls the motion state of the trolley according to the control instructions of the gesture recognition module.
Further, the operator's gestures include an advance gesture, a retreat gesture, a left-turn gesture, a right-turn gesture and a stop gesture. The advance gesture uses the coordinates of the left hand joint and of the waist joint, where hl.Position.X and hc.Position.X denote the X-axis coordinates of the left hand joint and of the waist joint respectively, and hl.Position.Y and hc.Position.Y denote the Y-axis coordinates of the left hand joint and of the waist joint respectively; when hl.Position.X > hc.Position.X and hl.Position.Y > hc.Position.Y, an advance signal is determined;
the retreat gesture uses the coordinates of the right hand joint and of the waist joint, where hr.Position.X and hc.Position.X denote the X-axis coordinates of the right hand joint and of the waist joint respectively, and hr.Position.Y and hc.Position.Y denote the Y-axis coordinates of the right hand joint and of the waist joint respectively; when hr.Position.X > hc.Position.X and hr.Position.Y > hc.Position.Y, a retreat signal is determined;
the left-turn gesture uses the coordinates of the left hand joint, the left wrist joint, the left elbow joint and the left shoulder joint, where hl.Position.Y denotes the Y-axis coordinate of the left hand joint, sl.Position.Y the Y-axis coordinate of the left shoulder joint, wl.Position.X the X-axis coordinate of the left wrist joint, and el.Position.X the X-axis coordinate of the left elbow joint; when hl.Position.Y > sl.Position.Y and wl.Position.X > el.Position.X, a left-turn signal is determined;
the right-turn gesture uses the coordinates of the right hand joint, the right wrist joint, the right elbow joint and the right shoulder joint, where hr.Position.Y denotes the Y-axis coordinate of the right hand joint, sr.Position.Y the Y-axis coordinate of the right shoulder joint, wr.Position.X the X-axis coordinate of the right wrist joint, and er.Position.X the X-axis coordinate of the right elbow joint; when hr.Position.Y > sr.Position.Y and wr.Position.X > er.Position.X, a right-turn signal is determined;
the stop gesture uses the left hand joint coordinate, the right hand joint coordinate and the waist joint coordinate, where hl.Position.Y denotes the Y-axis coordinate of the left hand joint, hr.Position.Y the Y-axis coordinate of the right hand joint, and hc.Position.Y the Y-axis coordinate of the waist joint; when hl.Position.Y < hc.Position.Y and hr.Position.Y < hc.Position.Y, a stop command is determined.
Beneficial effects: the trolley control device based on Kinect gesture recognition of the present invention requires little computation and can process real-time data without interruption, improving the timeliness and continuity of the data; the whole device is inexpensive and simple to maintain, so it can be accepted by most users and is suitable for large-scale promotion.
Brief description of the drawings
Fig. 1 is a schematic diagram of the operating flow of the trolley control device based on Kinect gesture recognition.
Fig. 2 is a schematic diagram of the device structure of the trolley control device based on Kinect gesture recognition.
Fig. 3 is a schematic diagram of the results of the trolley control device based on Kinect gesture recognition.
Fig. 4 is a schematic diagram of the gestures.
Fig. 5 is a schematic diagram of the electronic structure of the trolley control device based on Kinect gesture recognition.
Detailed description of the embodiments
The present invention is further elucidated below in conjunction with the accompanying drawings and specific embodiments. It should be understood that these embodiments are merely illustrative of the present invention and do not limit its scope; after reading the present invention, modifications of various equivalent forms by those skilled in the art all fall within the scope defined by the claims appended to this application.
The trolley control system based on Kinect gesture recognition of the present invention comprises a data acquisition module 1, a gesture recognition module 2, a screen display module 3, a communication module 4 and a lower-computer module 5.
The data acquisition module 1 is placed on a fixed platform, facing the operator. It mainly uses the Kinect to collect depth information and skeleton information: the depth information is used to determine the operator's position, and the skeleton information is used for gesture recognition. After the above information about the operator is collected, it is transferred to the data processing and gesture recognition module 2 through a USB cable.
After the gesture recognition module 2 parses an action command issued by the operator, the result is passed to the screen display module 3, while the corresponding control instruction is transmitted to the lower computer 5 through the communication module 4. This module is a series of programs and algorithms running on a computer, developed in C#. The gesture recognition module 2 comprises two submodules: a data processing submodule 201 and a gesture recognition submodule 202.
The data processing submodule 201 mainly processes the data from the data acquisition module. The collected data are obtained with the depth and skeleton acquisition functions of the Kinect SDK; at the same time, the collected skeleton information is overlaid on the real-time image and shown on the screen display module 3, so that the operator obtains real-time visual feedback. The depth and skeleton information comprises the coordinates of the major human joints in a spatial rectangular coordinate system, and the acquired depth and skeleton information is mainly used for gesture recognition.
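As an illustration of this step, the following is a minimal C# sketch of a skeleton-frame handler, assuming Kinect for Windows SDK v1.8; the class and member names (SkeletonCapture, OnSkeletonFrameReady) are illustrative, not from the patent.
using System;
using System.Linq;
using Microsoft.Kinect;

class SkeletonCapture
{
    private Skeleton[] skeletons = new Skeleton[0];

    public void Start()
    {
        // Pick the first connected Kinect sensor, if any.
        KinectSensor sensor = KinectSensor.KinectSensors
            .FirstOrDefault(s => s.Status == KinectStatus.Connected);
        if (sensor == null) return;

        sensor.ColorStream.Enable();    // RGB video for the on-screen overlay
        sensor.DepthStream.Enable();    // depth data (operator position)
        sensor.SkeletonStream.Enable(); // skeleton (bone) data for gestures
        sensor.SkeletonFrameReady += OnSkeletonFrameReady;
        sensor.Start();
    }

    private void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;
            if (skeletons.Length != frame.SkeletonArrayLength)
                skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);

            // The first tracked skeleton is taken as the operator; its joint
            // coordinates feed the gesture recognition submodule 202 and are
            // drawn over the live video on the screen display module 3.
            Skeleton user = skeletons.FirstOrDefault(
                s => s.TrackingState == SkeletonTrackingState.Tracked);
            if (user == null) return;
        }
    }
}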
The gesture recognition submodule 202 compares the collected Kinect depth and skeleton information with the gestures preset in the submodule, and identifies the gesture on that basis. After recognition is complete, the recognition result is shown on the screen display module 3 and the corresponding instruction is sent to the communication module 4.
The gesture recognition submodule 202 further comprises five submodules: advance 2021, retreat 2022, turn left 2023, turn right 2024 and stop 2025, representing the five preset gestures respectively.
The advance submodule 2021 uses two joint coordinates, namely the coordinate of the left hand joint and the waist joint coordinate; the two joint-point variables are as follows.
Joint hl = s.Joints[JointType.HandLeft];
Joint hc = s.Joints[JointType.HipCenter];
hl.Position.X and hc.Position.X denote, respectively, the X-axis coordinates of the left hand joint and of the waist joint in the plane rectangular coordinate system formed by the plane facing the data acquisition module 1, and hl.Position.Y and hc.Position.Y denote, respectively, the Y-axis coordinates of the left hand joint and of the waist joint. If hl.Position.X > hc.Position.X and hl.Position.Y > hc.Position.Y, that is, if the left hand is moved to the right side of the body and raised above the waist, an advance signal is determined.
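Completing the fragment above, a minimal sketch of the advance check might read as follows; the advance flag is an illustrative name, not from the patent.
// Left hand on the right side of the body and above the waist: advance.
bool advance = hl.Position.X > hc.Position.X
            && hl.Position.Y > hc.Position.Y;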
The retreat submodule 2022 uses two joint coordinates, namely the coordinate of the right hand joint and the waist joint coordinate; the two joint-point variables are as follows.
Joint hr = s.Joints[JointType.HandRight];
Joint hc = s.Joints[JointType.HipCenter];
If hr.Position.X > hc.Position.X and hr.Position.Y > hc.Position.Y, that is, if the right hand is moved to the left side of the body and raised above the waist, a retreat signal is determined.
The turn-left submodule 2023 needs four joint coordinates, namely the left hand joint, left wrist joint, left elbow joint and left shoulder joint, corresponding respectively to the following four variables.
Joint hl = s.Joints[JointType.HandLeft];
Joint wl = s.Joints[JointType.WristLeft];
Joint el = s.Joints[JointType.ElbowLeft];
Joint sl = s.Joints[JointType.ShoulderLeft];
If hl.Position.Y > sl.Position.Y and wl.Position.X > el.Position.X, that is, if the left hand rises above the shoulder and the left wrist is waved toward the body beyond the X-axis coordinate of the left elbow joint, a counting variable is incremented by 1. hl.Position.Y and sl.Position.Y denote, respectively, the Y-axis coordinates of the left hand joint and of the left shoulder joint, and wl.Position.X and el.Position.X denote, respectively, the X-axis coordinates of the left wrist joint and of the left elbow joint. If the counting variable exceeds a certain value within a certain time, the motion is judged to be waving and the turn-left instruction is established.
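A sketch of this wave-counting idea is given below; the 1.5-second window and the frame-count threshold of 10 are illustrative values, not taken from the patent.
// Fields and method of the recognition class
// (using System; using Microsoft.Kinect; assumed).
private int waveCount;
private DateTime windowStart;

private bool IsLeftTurnWave(Skeleton s)
{
    Joint hl = s.Joints[JointType.HandLeft];
    Joint sl = s.Joints[JointType.ShoulderLeft];
    Joint wl = s.Joints[JointType.WristLeft];
    Joint el = s.Joints[JointType.ElbowLeft];

    // Hand above the shoulder with the wrist swung past the elbow:
    // count one more frame of the wave.
    if (hl.Position.Y > sl.Position.Y && wl.Position.X > el.Position.X)
    {
        if (waveCount == 0) windowStart = DateTime.Now; // open a new window
        waveCount++;
    }

    // When the window closes, decide whether enough swings occurred.
    if (waveCount > 0 && (DateTime.Now - windowStart).TotalSeconds > 1.5)
    {
        bool waved = waveCount >= 10; // illustrative threshold
        waveCount = 0;                // reset for the next window
        return waved;
    }
    return false;
}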
The turn-right submodule 2024 works on the same principle as the turn-left submodule. It needs four joint coordinates, namely the right hand joint, right wrist joint, right elbow joint and right shoulder joint, corresponding respectively to the following four variables.
Joint hr = s.Joints[JointType.HandRight];
Joint wr = s.Joints[JointType.WristRight];
Joint er = s.Joints[JointType.ElbowRight];
Joint sr = s.Joints[JointType.ShoulderRight];
The stop submodule 2025 needs three joint coordinates, namely the left hand joint coordinate, the right hand joint coordinate and the waist joint coordinate; the three coordinates are expressed as follows.
Joint hl = s.Joints[JointType.HandLeft];
Joint hr = s.Joints[JointType.HandRight];
Joint hc = s.Joints[JointType.HipCenter];
If hl.Position.Y < hc.Position.Y and hr.Position.Y < hc.Position.Y, a stop command is determined; that is, the operator letting both hands hang down is judged to be the stop instruction.
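Taken together, the five recognition results map onto the one-character commands that the communication module sends to the lower computer (listed in the lower-computer description below). A hedged C# sketch of that mapping follows; the GestureResult enum and the method name are illustrative.
// Illustrative mapping from recognised gestures to command characters;
// '0' (stop) matches the lower computer's initial Sbuffer value of 0.
enum GestureResult { None, Advance, Retreat, TurnLeft, TurnRight, Stop }

static char? ToCommandChar(GestureResult g)
{
    switch (g)
    {
        case GestureResult.TurnLeft:  return '1';
        case GestureResult.TurnRight: return '3';
        case GestureResult.Advance:   return '2';
        case GestureResult.Retreat:   return '4';
        case GestureResult.Stop:      return '0';
        default:                      return null; // no gesture: send nothing
    }
}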
The screen display module 3 is divided into two parts. The left half shows real-time video of the operator, in which the operator's limbs and trunk, and the position of each joint, are marked with green line segments. The right half contains a text box showing the command status identified by the gesture recognition module 2. The screen display module 3 and the gesture recognition module 2 are integrated in one computer.
The communication module 4 comprises an RS485 serial communication submodule 401, an HC06 Bluetooth master submodule 402 and an HC06 Bluetooth slave submodule 403. The RS485 serial communication submodule 401 and the HC06 Bluetooth master submodule 402 are connected to the computer and together form part of the upper computer; the RS485 serial communication submodule 401 receives the instructions from the gesture recognition module 2 and sends them through the HC06 Bluetooth master submodule 402 to the HC06 Bluetooth slave submodule 403 located on the lower-computer module 5.
The lower-computer module 5 comprises a chassis, the HC06 Bluetooth slave submodule, a single-chip-microcomputer minimum system board, an L298N drive module, two drive motors, a 9 V DC power supply, drive wheels and a guide wheel. The upper and lower surfaces of the chassis are horizontal planes. The HC06 Bluetooth slave submodule, the minimum system board and the L298N drive module are mounted on the upper surface of the chassis. The two drive wheels are connected to the two drive motors respectively, and the 9 V DC power supply and the guide wheel are mounted on the lower surface of the chassis. The lower-computer module 5 controls the forward rotation, reversal and stopping of its motors according to the instructions of the data processing and gesture recognition module 2, thereby controlling the motion state of the trolley; through the differential of the two drive wheels it can go straight, retreat, stop, turn left and turn right. The lower-computer module 5 receives five kinds of instructions from the gesture recognition module 2, namely the characters "1", "3", "2", "4" and "0": character "1" represents the turn-left instruction, character "3" the turn-right instruction, character "2" the advance instruction, character "4" the retreat instruction, and character "0" the stop instruction. The lower-computer module 5 is not on the same static platform as the other modules, which share one platform; in this way remote control of the lower-computer module 5 can be realised.
The present invention identifies gesture using Kinect, and dolly can be controlled without controller, wireless communication module The remote control realized to dolly is added, and the process of gesture identification is quick and precisely.Other modification of program is simple and convenient, can To change program as needed, to identify more gestures, realize and moving of car state is precisely controlled.Used in the present invention Dolly controlled using single-chip microcomputer, simply, reliably.
The system design utilizes Kinect sensor sampling depth and bone information, the data transfer that sensor will collect Handled to computer, wherein software is based on C# language in combination with Kinect SDK exploitations, runs on Window systems On computer.After the completion of data processing, the action message of operating personnel and bone information are included aobvious in the form of video The instruction of current operation personnel is shown in display screen, while on screen.Then instruction is transferred to by monolithic by bluetooth communication The dolly of machine control, realizes the remote control to moving of car state.
The present invention is further described below in conjunction with the accompanying drawings.
As shown in Fig. 1, the structure diagram of the whole system, the system can be divided into an upper-computer part and a lower-computer part. The upper-computer part comprises the data acquisition module 1, the data processing and gesture recognition module 2, the screen display module 3 and part of the communication module 4. The Kinect shown in the upper-computer part of Fig. 1 is the data acquisition module 1; the computer contains the gesture recognition module 2 and the screen display module 3; the Bluetooth master module and the USB-to-serial module together form the upper-computer part of the communication module 4. The upper-computer part is mainly used to acquire gestures, recognise gestures, display the recognition result and send control signals to the lower computer.
The lower-computer part comprises the lower-computer module 5 and part of the communication module 4. The lower-computer module 5 is the trolley in this system; the lower-computer part comprises the HC06 Bluetooth slave module, the single-chip-microcomputer minimum system, the L298N drive module, the drive motors, the trolley chassis and the 9 V DC power supply. The Bluetooth module is connected to the P3.0 and P3.1 pins of the single-chip microcomputer. The four control terminals of the L298N drive module are connected to P1.0-P1.3, and its PWM inputs are connected to the P0.0 and P0.1 pins. The lower-computer part receives the control information sent by the upper computer and responds according to that information.
The implementation process of system is as follows:
Upper-computer part: the Kinect sensor receives the original image and the depth image, then passes the received images to the computer through a USB cable.
On the computer, the data processing and gesture recognition module 2 uses Kinect for Windows SDK v1.8 and its skeleton tracking technique to extract the human skeleton points from the image. Then, by comparing the relative positions and motion trajectories of the skeleton points, the corresponding gesture is identified; according to the identified gesture, the RS485 USB-to-serial module and the HC06 Bluetooth master module, which are linked together, send the corresponding control character to the lower computer.
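On the upper-computer side, sending that character amounts to a single serial write. A minimal C# sketch follows; the port name and baud rate are assumptions (HC06 modules commonly default to 9600 baud), as is the CommandSender class itself.
using System;
using System.IO.Ports;

class CommandSender : IDisposable
{
    private readonly SerialPort port;

    public CommandSender(string portName = "COM3", int baudRate = 9600)
    {
        // The RS485 USB-to-serial adapter appears as an ordinary COM port.
        port = new SerialPort(portName, baudRate, Parity.None, 8, StopBits.One);
        port.Open();
    }

    // One character per command, e.g. Send('2') for advance.
    public void Send(char command)
    {
        port.Write(new[] { command }, 0, 1);
    }

    public void Dispose()
    {
        port.Close();
    }
}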
Lower-computer part: the lower computer is the controlled trolley, powered by the 9 V DC power supply. It receives the control character transmitted from the upper computer through the HC06 Bluetooth slave module; after processing by the single-chip microcomputer, control signals are sent from the P1.0-P1.3 pins (and PWM signals from P0.0 and P0.1) to the L298N drive module, and the drive module controls the two motors to rotate forward, reverse or stop according to the control signal. The motors are connected to the two drive wheels, making the trolley perform the corresponding action: if the motors stop, the trolley stops; if the two motors rotate in the same direction, the trolley advances or retreats; if the two motors rotate in the same direction but at different speeds, the trolley turns left or right.
Fig. 2 shows the flow chart of the single-chip-microcomputer program running in the lower-computer module 5; the detailed process is as follows:
The serial port is initialised and the program waits in the stopped state for the first interrupt. The program executes the corresponding instruction according to the value of Sbuffer; the initial value of Sbuffer is 0, representing stop, so the stop instruction is executed continuously in a while loop until an interrupt arrives.
The upper-computer part sends a control character; after the lower computer receives it, the program enters the interrupt subroutine, where the corresponding value is assigned to Sbuffer. For example, if the upper-computer part transmits the character "2", representing advance, then 2 is assigned to Sbuffer in the interrupt subroutine, and the program immediately jumps to the corresponding subroutine, which sends the command to the L298N drive module; the L298N drive module drives the two motors to rotate in the same direction, and the trolley advances.
If no new interrupt arrives, the instruction corresponding to the current value of Sbuffer is executed continuously in the while loop until a new interrupt is received.
Fig. 3 shows the program flow chart of the computer's data processing and gesture recognition module 2; the detailed process is as follows:
The program first detects whether the Kinect sensor is connected. If it is connected, initialisation is carried out, video is obtained through the RGB camera and the skeleton data stream is obtained through the depth camera; if it is not connected, the program terminates directly.
The software interface of the screen display module 3 displays the acquired video stream, and the skeleton image, consisting of the human joint positions and the lines between the joints, is mapped onto the video stream so that it corresponds to the human body in the video.
According to the acquired skeleton point coordinates, the program judges whether the motion matches one of the five predetermined gestures. If it matches, the recognition result is shown on the software interface and the corresponding character is sent to the lower computer; otherwise the program returns directly to the beginning. After an instruction is sent, the program likewise returns to the beginning.
Fig. 4 shows the gestures set in this system for manipulating the lower computer. From top to bottom and from left to right, they represent "advance", "retreat", "turn left", "turn right" and "stop" respectively. The person shown in the figure faces out of the page. In the upper-left figure the left hand is higher than the waist and stretched to the right, representing the "advance" instruction; in the upper-right figure the right hand is higher than the waist and stretched to the left, representing the "retreat" instruction; in the middle-left figure the left hand is higher than the shoulder and waved quickly, representing the "turn left" instruction; in the middle-right figure the right hand is higher than the shoulder and waved quickly, representing the "turn right" instruction; in the last figure both hands hang down naturally, representing the "stop" instruction.

Claims (2)

1. A trolley control device based on Kinect gesture recognition, characterised by comprising a data acquisition module (1), a gesture recognition module (2), a display module (3), a communication module (4) and a lower-computer module (5);
the data acquisition module (1) comprises a Kinect sensor, which collects human-body depth information and skeleton information;
the gesture recognition module (2) processes the data collected by the data acquisition module (1) and identifies the operator's control instructions;
the display module (3) receives the processed data from the data processing and gesture recognition module (2);
the communication module (4) transmits the control instructions of the gesture recognition module (2) to the lower-computer module (5);
the lower-computer module (5) controls the motion state of the trolley according to the control instructions of the gesture recognition module (2).
2. The trolley control device based on Kinect gesture recognition according to claim 1, characterised in that the operator's control instructions include an advance instruction, a retreat instruction, a turn-left instruction, a turn-right instruction and a stop instruction; the advance gesture uses the coordinates of the left hand joint and of the waist joint, where hl.Position.X and hc.Position.X denote the X-axis coordinates of the left hand joint and of the waist joint respectively, and hl.Position.Y and hc.Position.Y denote the Y-axis coordinates of the left hand joint and of the waist joint respectively; when hl.Position.X > hc.Position.X and hl.Position.Y > hc.Position.Y, an advance signal is determined;
the retreat gesture uses the coordinates of the right hand joint and of the waist joint, where hr.Position.X and hc.Position.X denote the X-axis coordinates of the right hand joint and of the waist joint respectively, and hr.Position.Y and hc.Position.Y denote the Y-axis coordinates of the right hand joint and of the waist joint respectively; when hr.Position.X > hc.Position.X and hr.Position.Y > hc.Position.Y, a retreat signal is determined;
the left-turn gesture uses the coordinates of the left hand joint, the left wrist joint, the left elbow joint and the left shoulder joint, where hl.Position.Y denotes the Y-axis coordinate of the left hand joint, sl.Position.Y the Y-axis coordinate of the left shoulder joint, wl.Position.X the X-axis coordinate of the left wrist joint, and el.Position.X the X-axis coordinate of the left elbow joint; when hl.Position.Y > sl.Position.Y and wl.Position.X > el.Position.X, a left-turn signal is determined;
the right-turn gesture uses the coordinates of the right hand joint, the right wrist joint, the right elbow joint and the right shoulder joint, where hr.Position.Y denotes the Y-axis coordinate of the right hand joint, sr.Position.Y the Y-axis coordinate of the right shoulder joint, wr.Position.X the X-axis coordinate of the right wrist joint, and er.Position.X the X-axis coordinate of the right elbow joint; when hr.Position.Y > sr.Position.Y and wr.Position.X > er.Position.X, a right-turn signal is determined;
the stop gesture uses the left hand joint coordinate, the right hand joint coordinate and the waist joint coordinate, where hl.Position.Y denotes the Y-axis coordinate of the left hand joint, hr.Position.Y the Y-axis coordinate of the right hand joint, and hc.Position.Y the Y-axis coordinate of the waist joint; when hl.Position.Y < hc.Position.Y and hr.Position.Y < hc.Position.Y, a stop command is determined.
CN201710758670.3A 2017-08-29 2017-08-29 Trolley control device based on Kinect gesture recognition Pending CN107678425A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710758670.3A 2017-08-29 2017-08-29 CN107678425A (en) Trolley control device based on Kinect gesture recognition

Publications (1)

Publication Number Publication Date
CN107678425A (en) 2018-02-09

Family

ID=61135265

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710758670.3A Pending CN107678425A (en) 2017-08-29 2017-08-29 A kind of car controller based on Kinect gesture identifications

Country Status (1)

Country Link
CN (1) CN107678425A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA3052597A1 (en) * 2014-06-08 2015-12-17 Hsien-Hsiang Chiu Gestural interface with virtual control layers
CN105677206A (en) * 2016-01-08 2016-06-15 北京乐驾科技有限公司 System and method for controlling head-up display based on vision
CN106125928A (en) * 2016-06-24 2016-11-16 同济大学 Kinect-based PPT presentation aid system
CN106354129A (en) * 2016-08-30 2017-01-25 江南大学 Kinect based gesture recognition control system and method for smart car
WO2017075932A1 (en) * 2015-11-02 2017-05-11 深圳奥比中光科技有限公司 Gesture-based control method and system based on three-dimensional displaying
CN106909216A (en) * 2017-01-05 2017-06-30 华南理工大学 Humanoid manipulator control method based on Kinect sensor


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination