CN107053214B - Robot fight device based on somatosensory control and control method - Google Patents
Robot fight device based on somatosensory control and control method
- Publication number
- CN107053214B (application CN201710026359.XA)
- Authority
- CN
- China
- Prior art keywords
- robot
- kinect
- control
- player
- singlechip
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
- Toys (AREA)
Abstract
The invention discloses a robot combat device based on somatosensory control, and a corresponding control method. The device comprises a combat vehicle robot; a Kinect device that recognizes the player's actions and collects action information; and a control device that receives the action information collected by the Kinect device and controls the combat vehicle robot accordingly. The Kinect device comprises a Kinect sensor, a camera for motion recognition, a depth sensor and a PC, the sensor, camera and depth sensor all being connected to the PC. The control device, which is mounted inside the combat vehicle robot, comprises a WIFI module through which the Kinect device communicates with it. By using this human-computer interaction module to capture, collect and extract the player's original actions, the device realizes control of the combat vehicle robot.
Description
Technical Field
The invention relates to the field of human-computer interaction and automatic control, and in particular to a robot combat device based on somatosensory control and a corresponding control method.
Background
Most combat games currently on the market are either software games or hand-held remote-control combat games. Software games are operated mainly through peripherals such as a keyboard and mouse, while remote-control games are operated through a hand-held controller. In both kinds of game the player fights only by pressing keys, so there is no real combat experience, and long play sessions quickly become tiring and monotonous.
Kinect is a 3D somatosensory camera with functions including motion capture, image recognition, microphone input and voice recognition. The Kinect device developed by Microsoft performs very well in human-body tracking and posture estimation, and Microsoft also provides the Kinect for Windows SDK. With this kit, Kinect somatosensory technology can acquire human depth information and infer the operator's intention by recognizing body actions and gestures, so that a computer can be used to operate a robot effectively.
Using the motion-sensing control of a Kinect device to control a fighting robot increases the variety of game operation and the entertainment value of the game, and such an operating mode is also in line with one of the trends in future robot development. It embodies human-computer interaction theory and lets players genuinely experience the pleasure of 'combat'. In the somatosensory-controlled combat vehicle robot device, two players each select and control either the robot part or the car part, and cooperate with each other in the combat game.
Disclosure of Invention
The main aim of the invention is to overcome the defects and shortcomings of the prior art by providing a robot combat device and control method based on somatosensory control, in which the combat vehicle robot is controlled through a Kinect device.
To achieve the above purpose, the present invention adopts the following technical solution:
The invention provides a robot combat device based on somatosensory control, and a corresponding control method. The device comprises a combat vehicle robot; a Kinect device that recognizes the player's actions and collects action information; and a control device that receives the action information collected by the Kinect device and controls the combat vehicle robot accordingly. The Kinect device comprises a Kinect sensor, a Kinect somatosensory camera for motion recognition, a depth sensor and a PC, the sensor, camera and depth sensor all being connected to the PC. The control device comprises a WIFI module, through which the Kinect device communicates with the control device, and is mounted inside the combat vehicle robot.
In a preferred solution, the control device comprises a microcontroller, a power module, a communication module, a motor drive module and a function key module, with the power, communication, motor drive and function key modules all connected to the microcontroller.
In a preferred solution, the combat vehicle robot comprises a motor and a servo (steering engine). The microcontroller assigns two I/O ports to control the motor: changing the high/low levels and the square-wave duty cycle on these two ports controls the forward rotation, reverse rotation and speed of one motor. The microcontroller also assigns an I/O port to control the servo: changing the duty cycle of the high level on this port within the servo's working period rotates the servo through 0-180 degrees.
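The angle-to-duty-cycle relationship described above can be sketched as follows. The 20 ms (50 Hz) servo period and the 0.5-2.5 ms pulse range are assumed values common for hobby servos; the patent does not state the exact pulse range.

```python
# Sketch of the servo angle-to-duty-cycle mapping (assumed timing values).
SERVO_PERIOD_MS = 20.0  # assumed 50 Hz servo working period
PULSE_MIN_MS = 0.5      # assumed pulse width at 0 degrees
PULSE_MAX_MS = 2.5      # assumed pulse width at 180 degrees

def servo_pulse_ms(angle_deg: float) -> float:
    """Return the high-level pulse width (ms) for a target angle in [0, 180]."""
    angle_deg = max(0.0, min(180.0, angle_deg))
    return PULSE_MIN_MS + (PULSE_MAX_MS - PULSE_MIN_MS) * angle_deg / 180.0

def servo_duty(angle_deg: float) -> float:
    """Return the duty cycle: the fraction of the working period held high."""
    return servo_pulse_ms(angle_deg) / SERVO_PERIOD_MS
```

With these assumed values, a 90-degree command corresponds to a 1.5 ms pulse, i.e. a 7.5% duty cycle within the 20 ms period.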
In a preferred solution, the power module uses a 7.4 V rechargeable battery and a voltage-regulator chip to power the control device.
In a preferred solution, the combat vehicle robot carries a plurality of microswitches and a plurality of LED lamps, all connected to the control device. Each time a microswitch is struck, one LED lamp is extinguished; when all the LED lamps are extinguished, the WIFI module disconnects the Kinect device from the control device.
The invention also provides a control method for the robot combat device based on somatosensory control, comprising the following steps:
1) Connect the Kinect device to the host (server) computer so that the Kinect device can capture the body actions of the player standing in front of the Kinect sensor, and use the Kinect device to collect the player's action information in real time;
2) Switch on the power supply of the combat vehicle robot so that the client microcontroller on the robot connects to the host (server) PC;
3) The PC analyses and processes the position information of each of the player's joints read by the Kinect somatosensory camera to obtain effective joint angles or gesture intentions, parses the action information, and derives the control information required to control the combat vehicle robot;
4) The PC, connected to the client microcontroller through the wireless WIFI module, sends the control information to the microcontroller;
5) The microcontroller receives the control information from the PC in real time, parses it with a specific algorithm, and drives the motor and servo drive circuits according to the result: it changes the motor speed by changing the duty cycle of the PWM wave fed to the motor driver, and changes the servo angle by changing the duty cycle of the PWM wave fed to the servo, thereby realizing control of the combat vehicle robot.
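The two-port motor control in step 5 can be sketched as below: one H-bridge input pair carries both direction (which port is pulsed) and speed (the PWM duty cycle). The port naming and the signed-speed convention are illustrative assumptions, not details given by the patent.

```python
# Sketch of two-I/O-port motor control for an L298-style H-bridge
# (port semantics are assumed for illustration).
def motor_ports(speed: float) -> tuple:
    """Map a signed speed in [-1, 1] to (in1_duty, in2_duty) PWM duties.

    Forward rotation pulses IN1 and holds IN2 low; reverse rotation
    swaps them. The magnitude of the duty cycle sets the motor speed.
    """
    speed = max(-1.0, min(1.0, speed))
    if speed >= 0:
        return speed, 0.0
    return 0.0, -speed
```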
In a preferred solution, step 3) specifically comprises:
The Kinect device begins to recognize and capture the skeleton of a player within a range of 1-3 meters in front of the Kinect somatosensory camera and collects the player's body actions; it can map a 3D scene within the camera's range and output three-dimensional spatial coordinates. Through this spatial positioning, the Kinect device obtains the player's effective joint angles or recognizes gesture intentions, parses the action information, and derives the control information required to control the combat vehicle robot. The information read comprises: for the player controlling the robot, the angles of six joints of the hands and arms, the waist rotation direction, and the inclination of the upper body; for the player controlling the car, the relative positions of the two hands, specifically the angle between the line joining the two palms and the horizontal. The former information controls the robot's actions; the latter controls the car's forward motion, reverse motion, turning and speed.
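The "angle between the line joining the two palms and the horizontal" used to drive the car part can be sketched as below. The coordinate convention (x right, y up) and the sign of the steering angle are assumptions; the patent only states that this angle is read from the Kinect skeleton.

```python
import math

# Sketch of the palm-line steering measurement (coordinate convention assumed).
def palm_line_angle_deg(left_palm: tuple, right_palm: tuple) -> float:
    """Angle in degrees between the left-to-right palm line and the horizontal.

    Each palm is an (x, y) position taken from the Kinect skeleton; a
    positive angle means the right palm is higher than the left.
    """
    dx = right_palm[0] - left_palm[0]
    dy = right_palm[1] - left_palm[1]
    return math.degrees(math.atan2(dy, dx))
```

A level "steering wheel" (both palms at the same height) gives 0 degrees, which would map to driving straight ahead.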
In a preferred solution, step 4) specifically comprises:
after the microcontroller completes initialization and establishes a connection with the PC through the WIFI module, it waits to receive control data from the PC;
when data is sent to the microcontroller through the WIFI module, the microcontroller parses the received information and extracts the required servo and motor control information, namely angle control information for 9 servos and control information for 2 motors. The microcontroller changes the angle of each servo by changing the PWM duty cycle on the corresponding output port; different combinations of servo angles produce different actions of the combat vehicle robot, so the robot performs the expected action simply by feeding each servo a specific angle. The microcontroller continuously updates the player action information acquired by the Kinect device, so that with a low update delay the robot's actions stay synchronized with the player's. Meanwhile, the motor drive module controls the speed and direction of the motors: when the two driving wheels rotate at different speeds the car turns, with a larger speed difference giving a sharper turn, and reversing the motor direction moves the car forward or backward.
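The differential-drive behaviour described above can be sketched as a simple mixing rule: the two wheel speeds are derived from a forward command and a turn command, and a larger difference between them gives a sharper turn. The mixing formula itself is an assumption; the patent describes only the resulting behaviour.

```python
# Sketch of differential-drive mixing (formula assumed for illustration).
def wheel_speeds(forward: float, turn: float) -> tuple:
    """Map forward in [-1, 1] and turn in [-1, 1] to (left, right) wheel speeds.

    A nonzero turn creates a speed difference between the wheels, which
    makes the car turn; negating forward reverses the car.
    """
    left = max(-1.0, min(1.0, forward + turn))
    right = max(-1.0, min(1.0, forward - turn))
    return left, right
```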
In a preferred solution, 5 LED lamps are installed on the combat vehicle robot as its life bar. When the device enters its normal working state, the microswitches are armed and all 5 LED lamps are lit. Whenever one of the microswitches senses a hit, one LED lamp is extinguished; when all 5 LED lamps are off, the client microcontroller is disconnected from the server computer, data can no longer be transmitted, and the player can no longer operate the combat vehicle robot.
In a preferred solution, the combat vehicle robot device uses a 7.4 V rechargeable battery and a voltage-regulator chip to power its modules and components.
Compared with the prior art, the invention has the following advantages and beneficial effects:
1. In the invention, the Kinect device is connected to the control device through the WIFI module. After the Kinect device analyses the collected player actions and obtains the required joint information, it sends a signal through the wireless communication module to the microcontroller in the control device; the microcontroller processes the data received from the communication module and then controls the rotation of the servos and motors, thereby controlling the actions of the combat vehicle robot.
2. The invention uses a human-computer interaction module to acquire the player's action signals; after the player's original actions are collected and the information extracted, control of the combat vehicle robot is realized.
3. The motion-sensing control of the Kinect replaces the monotonous operation of earlier games, displays the human-computer interaction concept more thoroughly, and truly immerses the player in the battle. Two players each select and control either the robot part or the car part, and cooperate in the combat game: the player controlling the robot performs upper-body actions for the robot to imitate, while the player controlling the car mimics a driving posture to command forward motion, reverse motion, turning and so on, thereby realizing the robot's movement. The two players fight together in cooperation.
Drawings
Fig. 1 is a block diagram of the circuit configuration of the robot combat device based on somatosensory control of the present invention;
Fig. 2 (a) is a schematic diagram of the Arduino Mega2560 minimum-system circuit of the present invention;
Fig. 2 (b) is a schematic circuit diagram of the motor drive module of the present invention;
Fig. 2 (c) is a schematic diagram of the 5 V / 3 V voltage-regulator module circuit of the present invention;
Fig. 2 (d) is a schematic circuit diagram showing the chip pins corresponding to each circuit interface of the WIFI module of the present invention;
Fig. 3 is a flow chart of the microcontroller of the robot combat device based on somatosensory control;
Fig. 4 is a flow chart of the control method of the robot combat device of the present invention.
Detailed Description
The present invention will be described in further detail with reference to examples and drawings, but embodiments of the present invention are not limited thereto.
Examples
As shown in Fig. 1 and Figs. 2 (a)-2 (d), the robot combat device based on somatosensory control of the present embodiment comprises a combat vehicle robot; a Kinect device that recognizes the player's actions and collects action information; and a control device that receives the action information collected by the Kinect device and controls the combat vehicle robot accordingly. The Kinect device comprises a Kinect sensor, a camera for motion recognition, a depth sensor and a PC, the sensor, camera and depth sensor all being connected to the PC; the control device comprises a WIFI module, through which the Kinect device communicates with the control device, and is mounted inside the combat vehicle robot.
The Kinect sensor is a Kinect for Xbox One unit, with the depth sensor embedded in it.
The WIFI module is an ESP8266 module, which provides wireless data transmission between the upper computer (the PC) and the lower computer (the microcontroller).
The control device comprises a microcontroller, a power module, a communication module, a motor drive module and a function key module, with the power, communication, motor drive and function key modules all connected to the microcontroller.
The combat vehicle robot comprises a motor and a servo (steering engine). The microcontroller assigns two I/O ports to control the motor: changing the high/low levels and the square-wave duty cycle on these two ports controls the forward rotation, reverse rotation and speed of one motor. The microcontroller also assigns an I/O port to control the servo: changing the duty cycle of the high level on this port within the servo's working period rotates the servo through 0-180 degrees.
The microcontroller is an Arduino Mega2560, which serves as the main control chip and parses in real time the action data acquired and transmitted by the computer.
The motor drive module uses an L298P chip, which drives the motors to perform the corresponding actions.
The power module uses a 7.4 V rechargeable battery and a voltage-regulator chip to power the control device; it provides a stable voltage so that the robot combat device can operate continuously.
The combat vehicle robot carries a plurality of microswitches and a plurality of LED lamps, all connected to the control device. Each time a microswitch is struck, one LED lamp is extinguished; when all the LED lamps are extinguished, the WIFI module disconnects the Kinect device from the control device.
The working voltage of the combat vehicle robot is 7.4-8.6 V DC; the Kinect apparatus operates from 220 V mains.
The microcontroller control unit consists of an ATmega2560 chip, produced by ATMEL, and its peripheral circuits.
The working process of the robot combat device of this embodiment is as follows:
Referring to Fig. 3 in combination with Fig. 2: after the ATmega2560 microcontroller is initialized, the ESP8266 WIFI module on the microcontroller establishes, through its own initialization, a connection between the microcontroller and the PC attached to the Kinect device; the WIFI module acts as the client and the PC as the server. Once wireless communication is established, the Kinect device starts to recognize the player's actions, collects the effective information and, after some processing, sends the data to the microcontroller through the WIFI module. The microcontroller parses the received information and extracts the required servo and motor control information; the parsing and extraction use conventional means. Two output I/O ports are assigned, and changing their high/low levels and square-wave duty cycle controls the forward rotation, reverse rotation and speed of one motor. A further output I/O port is assigned, and changing its high-level duty cycle within the servo's working period rotates the servo through 0-180 degrees. The motor information derived from the Kinect device can include direction (forward/reverse) and speed; the servo information can include the servo angle for each joint of the robot. The microcontroller outputs the processed information to the drive circuits, thereby controlling the movement of the vehicle and the actions of the robot. In addition, several microswitches are mounted on the robot body; each time a microswitch is hit, one of the 5 LED lamps on the vehicle is turned off. When all the lamps are off, the WIFI module disconnects from the PC, the robot can no longer receive information, and the player can no longer control the combat vehicle robot through the Kinect device.
As shown in fig. 4, the control method of the robot fighter device based on motion sensing control of the present embodiment includes the following steps:
and (3) connecting a Kinect device power supply, connecting the Kinect device with a host server computer, completing initialization of the Kinect device, simultaneously turning on a power switch of the warfare robot, initializing the singlechip, and enabling the client singlechip to be connected with the PC.
Once the connection is complete, the Kinect device begins to recognize and capture the skeleton of the player within a range of about 1-3 meters in front of the somatosensory camera and to collect the player's body actions. The PC analyses and processes the position information of each skeletal joint of the player read by the Kinect somatosensory camera. The Kinect device can map a 3D scene within the camera's range and output three-dimensional spatial coordinates; through this spatial positioning it obtains the player's effective joint angles or recognizes gesture intentions, parses the action information, and derives the control information required to control the combat vehicle robot. The information read comprises: for the player controlling the robot, the angles of six joints of the hands and arms, the waist rotation direction, and the inclination of the upper body; for the player controlling the car, the relative positions of the two hands, specifically the angle between the line joining the two palms and the horizontal. The former information controls the robot's actions; the latter controls the car's forward motion, reverse motion, turning and speed.
Parsing the action information means judging the player's intended action by methods such as comparing relative positions, measuring and comparing angles, and setting time parameters.
A specific action model is set for each action characteristic to be captured; if the detected action characteristics match the set model, the corresponding program is triggered. For example:
1. Detecting a hand raise: taking the shoulder-to-head distance as one unit, if the palm position is one or more units higher than the head, the player is considered to have raised both hands.
2. Detecting an arm flat-lift: taking the body plane as the coordinate reference frame, the player's arm is considered flat-lifted when the angle between the arm bones and the vertical axis is 90 degrees plus or minus 10 degrees.
3. Detecting a punch: a time parameter is set; taking the shoulder length as one unit, if within that time the palm-to-shoulder distance grows from one unit to two or more units, the player is considered to have thrown a punch.
In this embodiment, action models for the other actions are likewise built from their action characteristics.
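The three action models above can be sketched as threshold tests. The thresholds follow the text (one shoulder-to-head unit for a raise, 90 +/- 10 degrees for a flat-lift, growth to two shoulder lengths within a time window for a punch); the coordinate convention, the 1.2-unit starting tolerance and the 0.3 s window are assumptions, since the patent does not fix them.

```python
# Sketches of the three action models (thresholds per the text; coordinate
# convention and the starting tolerance / time window are assumptions).
def hands_raised(palm_y: float, head_y: float, shoulder_y: float) -> bool:
    """Raise detected if the palm is at least one shoulder-to-head unit
    above the head (y grows upward)."""
    unit = head_y - shoulder_y
    return palm_y >= head_y + unit

def arm_flat(arm_angle_from_vertical_deg: float) -> bool:
    """Flat-lift detected when the arm bone is 90 +/- 10 degrees from vertical."""
    return abs(arm_angle_from_vertical_deg - 90.0) <= 10.0

def punch_detected(dist_start: float, dist_end: float, dt: float,
                   shoulder_len: float, window: float = 0.3) -> bool:
    """Punch detected if the palm-to-shoulder distance grows from about one
    shoulder length to two or more within the time window (seconds)."""
    return (dt <= window
            and dist_start <= 1.2 * shoulder_len
            and dist_end >= 2.0 * shoulder_len)
```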
After the required control information has been acquired, the PC packs and packages it and sends it to the microcontroller over the connection that the WIFI module has established with the client microcontroller.
After the microcontroller completes initialization and establishes a connection with the server computer through the WIFI module, it waits to receive control data from the server. When data arrives through the WIFI module, the microcontroller parses the received information and extracts the required servo and motor control information, namely angle control information for the 9 servos and control information for the 2 motors. The microcontroller changes each servo's angle by changing the PWM duty cycle on the corresponding output port; different combinations of servo angles produce different robot actions, so the robot performs the expected action simply by feeding each servo a specific angle. The microcontroller continuously updates the player action information acquired by the Kinect device, so that with a low update delay the robot's actions stay synchronized with the player's. In addition, the motor drive module changes the speed and direction of the motors: when the two driving wheels rotate at different speeds the car turns, a larger speed difference giving a sharper turn, and reversing the motor direction moves the car forward or backward.
The microcontroller parses the received information and extracts the servo and motor control information as follows:
(1) The PC packs the data and sends it to the microcontroller; the packet consists of a data discriminant, skill-judgment data, servo and motor control information, and a check byte.
(2) The data discriminant serves as the packet header, alerting the microcontroller to receive the information. The skill-judgment data is likewise a two-digit hexadecimal value indicating whether, and which, skill is triggered; if a skill is triggered, the remaining data is skipped and the skill program is executed directly. If the command is not a skill command, the servo and motor control information is parsed: the 9 required servo angles and the 2 motor commands are each converted to two-digit hexadecimal and arranged in a fixed order. The check byte is the remainder modulo 256 of the sum of all the hexadecimal data except the data discriminant.
(3) After receiving a complete frame, the microcontroller sums all the data except the check byte, takes the remainder modulo 256 and compares it with the check byte in the frame. If the check byte is incorrect, the information is discarded unparsed and the next frame is awaited; if it is correct, the skill data or the servo and motor control data is extracted and decoded back to decimal control information, which then drives the circuits.
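The packing and checking steps above can be sketched as a frame encoder/decoder: a header byte, a skill byte, nine servo-angle bytes, two motor bytes, and a checksum equal to the sum of every byte after the header taken modulo 256. The header value 0xAA is an assumption; the patent does not give the discriminant's actual value.

```python
# Sketch of the frame format (header value assumed; layout per the text).
HEADER = 0xAA  # assumed data discriminant

def pack_frame(skill: int, servo_angles: list, motor_cmds: list) -> bytes:
    """Build a 14-byte frame: header, skill, 9 servo bytes, 2 motor bytes, checksum."""
    assert len(servo_angles) == 9 and len(motor_cmds) == 2
    payload = ([skill & 0xFF]
               + [a & 0xFF for a in servo_angles]
               + [m & 0xFF for m in motor_cmds])
    checksum = sum(payload) % 256  # sum of everything except the header
    return bytes([HEADER] + payload + [checksum])

def unpack_frame(frame: bytes):
    """Return (skill, servo_angles, motor_cmds), or None if the frame is invalid."""
    if len(frame) != 14 or frame[0] != HEADER:
        return None
    payload, checksum = list(frame[1:-1]), frame[-1]
    if sum(payload) % 256 != checksum:
        return None  # discard and await the next frame, as the MCU does
    return payload[0], payload[1:10], payload[10:12]
```

A corrupted byte makes the recomputed remainder disagree with the check byte, so the frame is dropped rather than misinterpreted.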
In addition, several microswitches are mounted on the body of the robot and serve as touch points when the robot is hit. Five LED lamps mounted at the rear of the vehicle serve as the robot's life value. When the device enters its normal working state, the microswitches are armed and all 5 LED lamps are lit. Whenever one of the microswitches senses a hit, one LED lamp is extinguished; when all 5 LED lamps are off, the client microcontroller is disconnected from the PC, data can no longer be transmitted, and the player can no longer control the combat vehicle robot through the Kinect.
To trigger an action command, the player only needs to perform the action set for the corresponding skill. When the computer successfully recognizes the skill action, it sends a specific skill instruction to the microcontroller, which executes the corresponding program so that the robot performs the preset skill action.
The above examples are preferred embodiments of the present invention, but the embodiments of the present invention are not limited to them; any other change, modification, substitution, combination or simplification that does not depart from the spirit and principle of the present invention is an equivalent replacement and is included in the scope of protection of the present invention.
Claims (7)
1. A robot combat device based on somatosensory control, comprising a combat vehicle robot having a robot part and a car part, characterized in that the device further comprises a Kinect device that recognizes the player's actions and collects action information, and a control device that receives the action information collected by the Kinect device and controls the combat vehicle robot accordingly; the Kinect device comprises a Kinect sensor, a Kinect somatosensory camera for motion recognition, a depth sensor and a PC, the sensor, camera and depth sensor all being connected to the PC; the control device comprises a WIFI module, through which the Kinect device communicates with the control device; the control device is mounted inside the combat vehicle robot;
the control device comprises a microcontroller, a power module, a communication module, a motor drive module and a function key module, with the power, communication, motor drive and function key modules all connected to the microcontroller;
the combat vehicle robot comprises a motor and a servo (steering engine); the microcontroller is provided with two I/O ports for controlling the motor, and changing the high/low levels and the square-wave duty cycle on these two ports controls the forward rotation, reverse rotation and speed of one motor; the microcontroller is also provided with an I/O port for controlling the servo, and changing the high-level duty cycle on this port within the servo's working period rotates the servo through 0-180 degrees;
the Kinect device recognizes and captures the skeleton of a player within a range of 1-3 meters in front of the somatosensory camera and collects the player's body actions; the PC analyses and processes the position information of each skeletal joint of the player read through the Kinect somatosensory camera; the Kinect device maps a 3D scene within the camera's range, outputs three-dimensional spatial coordinates and, through spatial positioning, obtains the player's effective joint angles or recognizes gesture intentions, parses the action information, and derives the control information required to control the combat vehicle robot; the information read comprises: for the player controlling the robot, the angles of six joints of the hands and arms, the waist rotation direction and the inclination of the upper body; for the player controlling the car, the relative positions of the two hands, specifically the angle between the line joining the two palms and the horizontal; the former information controls the robot's actions, and the latter controls the car's forward motion, reverse motion, turning and speed.
2. The robot combat device based on motion sensing control of claim 1, wherein the power module employs a 7.4V rechargeable battery and a voltage-regulator chip to power the control device.
3. The robot combat device based on motion sensing control according to claim 1, wherein the combat vehicle robot is provided with a plurality of micro switches and a plurality of LED lamps, the micro switches and the LED lamps being connected with the control device; each time a micro switch is hit, one LED lamp is extinguished, and when all the LED lamps are extinguished, the WIFI module disconnects the Kinect device from the control device.
4. A control method of a robot combat device based on motion sensing control according to any one of claims 1-3, characterized by comprising the steps of:
1) Connecting the Kinect device with the host server computer (PC) so that the Kinect device can collect the body actions of a player in front of the Kinect somatosensory camera, and collecting the player's action information in real time with the Kinect device;
2) Switching on the power supply of the combat vehicle robot so that the client single-chip microcomputer on the robot connects with the host server computer (PC);
3) The PC analyzes and processes the position information of each joint of the player read by the Kinect somatosensory camera to obtain effective joint angles or gesture intentions, analyzes the action information, and extracts the control information required to control the combat vehicle robot;
4) The PC establishes a connection with the client single-chip microcomputer through the wireless WIFI module and sends control information to it; this step specifically comprises:
after the single-chip microcomputer completes initialization and the connection with the PC is established through the WIFI module, the microcomputer waits to receive control data from the PC;
when data is sent to the single-chip microcomputer through the WIFI module, the microcomputer parses the received information and extracts the required steering engine and motor control information, namely the angle control information of the 9 steering engines and the control information of the 2 motors; the microcomputer changes the rotating angle of each steering engine by changing the PWM duty ratio of the corresponding output port, and different combinations of steering engine angles produce different actions of the combat vehicle robot, so the robot performs the expected action once a specific angle is given to each steering engine; the microcomputer continuously updates with the player's action information acquired by the Kinect device, so that with low update delay the robot's actions stay synchronized with the player's; meanwhile, the motor driving module controls the rotating speed and direction of the motors: when the two driving wheels rotate at different speeds the vehicle turns, the larger the speed difference the sharper the turn, and reversing the motor direction switches between forward and backward motion;
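For illustration (not from the patent), the differential-drive rule just described - equal wheel speeds drive straight, a speed difference turns the vehicle, reversed speeds back it up - can be sketched as a mapping from a forward-speed command and a turn command to the two wheel speeds. Speeds here are normalized duty values in [-1, 1]; the function name is an assumption.

```python
def wheel_speeds(speed: float, turn: float):
    """Map a forward speed and turn command to (left, right) wheel speeds.

    speed: -1 (full reverse) .. 1 (full forward)
    turn:  positive values speed up the left wheel and slow the right,
           turning the vehicle right; the larger |turn|, the sharper the turn.
    """
    left = speed + turn
    right = speed - turn
    # Clamp to the valid normalized duty-cycle range.
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(left), clamp(right)

print(wheel_speeds(0.5, 0.0))   # (0.5, 0.5)  straight ahead
print(wheel_speeds(0.5, 0.2))   # (0.7, 0.3)  gentle right turn
print(wheel_speeds(-0.5, 0.0))  # (-0.5, -0.5) reverse
```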
5) The single-chip microcomputer receives the control information acquired by the PC in real time, parses it with a specific algorithm, and drives the motor driving circuit and the steering engine driving circuit accordingly: the rotating speed of a motor is changed by changing the duty ratio of the PWM wave fed to the motor driver, and the rotating angle of a steering engine is changed by changing the duty ratio of the PWM wave fed to the steering engine, thereby realizing control of the combat vehicle robot.
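For illustration (not the patent's protocol), parsing one control frame carrying the 9 steering engine angles and 2 motor commands named in the claims might look as follows. The comma-separated ASCII frame format and the function name are assumptions; the patent only says a "specific algorithm" parses the data.

```python
def parse_frame(frame: str):
    """Split a control frame into 9 servo angles and 2 motor commands.

    Assumed frame layout: 11 comma-separated integers, the first 9 being
    steering engine angles (0-180) and the last 2 being signed motor values.
    """
    fields = [int(f) for f in frame.split(",")]
    if len(fields) != 11:
        raise ValueError("expected 9 servo angles + 2 motor values")
    servos, motors = fields[:9], fields[9:]
    if any(not 0 <= a <= 180 for a in servos):
        raise ValueError("servo angle out of the 0-180 range")
    return servos, motors

servos, motors = parse_frame("90,90,45,135,90,90,30,150,90,60,-60")
print(servos)  # [90, 90, 45, 135, 90, 90, 30, 150, 90]
print(motors)  # [60, -60]
```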
5. The control method of a robot combat device based on motion sensing control according to claim 4, wherein step 3) specifically comprises:
the Kinect device recognizes and captures the skeleton of a player within a range of 1-3 meters in front of the somatosensory camera and collects the player's body actions; the PC analyzes and processes the position information of each skeletal joint of the player read by the Kinect somatosensory camera; the Kinect device maps a 3D scene within the camera range, gives three-dimensional space coordinates, obtains the player's effective joint angles or gesture-recognition intentions through spatial positioning, analyzes the action information, and extracts the control information required to control the combat vehicle robot; the extracted information comprises: for controlling the robot, the angles of six joints of the player's arms, the waist rotation direction and the upper-body inclination angle; for controlling the vehicle, the relative positions of the player's hands, specifically the angle between the line connecting the two palms and the horizontal; the former is used to control the actions of the robot, and the latter is used to control the forward motion, backward motion, turning and speed of the vehicle.
6. The control method of the robot combat device based on somatosensory control according to claim 4, wherein 5 LED lamps are installed on the combat vehicle robot as its life bar; when the device enters the normal working state, the micro switches are in a sensing-ready state and all 5 LED lamps are lit; whenever any micro switch senses a touch, one LED lamp is extinguished; when all 5 LED lamps are off, the client single-chip microcomputer is disconnected from the server computer, data can no longer be transmitted, and the player can no longer operate the combat vehicle robot.
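For illustration (not from the patent), the life-bar behaviour of claim 6 - five LEDs lit at start, one extinguished per micro-switch hit, and the wireless link dropped when all are out - reduces to a small state machine. The class and method names are assumptions.

```python
class LifeBar:
    """State machine for the claimed 5-LED life bar and hit-out rule."""

    def __init__(self, leds: int = 5):
        self.leds_on = leds    # all 5 LED lamps lit when play starts
        self.connected = True  # WIFI link between PC and single-chip MCU

    def register_hit(self):
        """Called whenever any micro switch senses a touch."""
        if self.leds_on > 0:
            self.leds_on -= 1          # extinguish one LED lamp
        if self.leds_on == 0:
            self.connected = False     # all LEDs off: drop the link,
                                       # the robot can no longer be driven

bar = LifeBar()
for _ in range(5):
    bar.register_hit()
print(bar.leds_on, bar.connected)  # 0 False
```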
7. The control method of a robot combat device based on motion sensing control according to claim 4, wherein the combat device uses a 7.4V rechargeable battery and a voltage-regulator chip to supply power to each module and component.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710026359.XA CN107053214B (en) | 2017-01-13 | 2017-01-13 | Robot fight device based on somatosensory control and control method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107053214A CN107053214A (en) | 2017-08-18 |
CN107053214B true CN107053214B (en) | 2023-09-05 |
Family
ID=59599340
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710026359.XA Active CN107053214B (en) | 2017-01-13 | 2017-01-13 | Robot fight device based on somatosensory control and control method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107053214B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107908288A (en) * | 2017-11-30 | 2018-04-13 | 沈阳工业大学 | A kind of quick human motion recognition method towards human-computer interaction |
CN108481348A (en) * | 2018-03-14 | 2018-09-04 | 合肥工业大学 | Hexapod Robot control system based on Arduino platforms |
CN109906134B (en) * | 2018-03-27 | 2022-06-24 | 尤中乾 | Robot avoidance control method and related device |
CN108594865A (en) * | 2018-05-17 | 2018-09-28 | 广州悦享环球文化科技有限公司 | A kind of control robot imaging system and method |
WO2020133628A1 (en) * | 2018-12-29 | 2020-07-02 | 深圳市工匠社科技有限公司 | Humanoid robotic arm somatosensory control system and related product |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103399637A (en) * | 2013-07-31 | 2013-11-20 | 西北师范大学 | Man-computer interaction method for intelligent human skeleton tracking control robot on basis of kinect |
CN203941451U (en) * | 2014-04-15 | 2014-11-12 | 桂林电子科技大学 | Based on the automatic obstacle avoidance trolley of gesture identification |
CN104971502A (en) * | 2015-07-22 | 2015-10-14 | 黑龙江大学 | Multi-driving autonomous networked boxing model robot system and control method thereof |
CN205323228U (en) * | 2015-12-15 | 2016-06-22 | 广州大学 | Brain wave and acceleration of gravity vehicle actuated control's fought robot toy |
CN106313072A (en) * | 2016-10-12 | 2017-01-11 | 南昌大学 | Humanoid robot based on leap motion of Kinect |
CN206393653U (en) * | 2017-01-13 | 2017-08-11 | 广州大学 | A kind of robot battle device based on motion sensing control |
Also Published As
Publication number | Publication date |
---|---|
CN107053214A (en) | 2017-08-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107053214B (en) | Robot fight device based on somatosensory control and control method | |
US10384129B2 (en) | System and method for detecting moment of impact and/or strength of a swing based on accelerometer data | |
US10445884B2 (en) | Control device for communicating visual information | |
US8403753B2 (en) | Computer-readable storage medium storing game program, game apparatus, and processing method | |
CN202150897U (en) | Body feeling control game television set | |
WO2007129432A1 (en) | Game device | |
US20160059120A1 (en) | Method of using motion states of a control device for control of a system | |
EP2347320B1 (en) | Control device for communicating visual information | |
CN102004840B (en) | Method and system for realizing virtual boxing based on computer | |
JP2009514106A (en) | System and method for interfacing with a computer program | |
US7999812B2 (en) | Locality based morphing between less and more deformed models in a computer graphics system | |
CN106426206A (en) | Wrestling robot, control equipment and game system | |
CN109421052A (en) | A kind of quintet game Chinese-chess robot based on artificial intelligence | |
JP2013078634A (en) | Game system and game program | |
Tsai et al. | 3D hand gesture recognition for drone control in unity | |
CN106502416B (en) | A kind of driving simulation system and its control method of intelligent recognition bimanual input | |
CN106512391B (en) | A kind of bimanual input recognition methods and the driving simulation system based on it, method | |
CN111228791A (en) | Real person AR shooting game equipment, and shooting fighting system and method based on AR technology | |
US11285394B1 (en) | Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method | |
US20240350902A1 (en) | System and method for augmented reality for a moveable real-world object | |
CN206393653U (en) | A kind of robot battle device based on motion sensing control | |
Ionescu et al. | Gesture control: a new and intelligent man-machine interface | |
CN206105877U (en) | Fistfight robot, controlgear and recreation system | |
CN114333054A (en) | Virtual sports application method based on sketch somatosensory gestures | |
CN110732145A (en) | Remote control method and device for toys |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||