CN110065066A - Method for interactively controlling a robot with a mobile terminal - Google Patents
Method for interactively controlling a robot with a mobile terminal
- Publication number
- CN110065066A (application CN201810070014.9A)
- Authority
- CN
- China
- Prior art keywords
- mobile terminal
- robot
- camera
- motor
- interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/006—Controls for manipulators by means of a wireless system for controlling one or several manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Evolutionary Computation (AREA)
- Fuzzy Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Computer Networks & Wireless Communication (AREA)
- Toys (AREA)
- Manipulator (AREA)
Abstract
The present invention relates to a method for interactively controlling a robot with a mobile terminal. The robot comprises a wireless communication module, a central controller, a processor, a servo motor, a motor controller, a motor drive circuit, and a camera driven by a rotation motor. The method comprises the following steps: the mobile terminal connects to the wireless communication module of the robot over a local area network or via Bluetooth; the mobile terminal sends control instructions to the central controller of the robot; the motor controller controls the motor drive circuit to drive the motor to perform the corresponding action; the motor controller periodically feeds the execution result of the motor back to the central controller; the central controller transmits the execution result to the mobile terminal through the wireless communication module; and the mobile terminal displays the execution result on its touch screen.
Description
Technical field
The present invention relates to the field of robot technology, and more particularly to a method for interactively controlling a robot with a mobile terminal.
Background art
With the continuous development of robot technology, more and more robots are taking over tasks once performed by humans. A robot is an automatically controlled machine; the term covers all machines that simulate human behavior or thought, as well as machines that simulate other living creatures (such as robot dogs or Doraemon). There are many classifications of, and disputes over, the definition of a robot, and in the narrow sense some computer programs are also called robots. In modern industry, a robot is an artificial device that can execute tasks automatically, either to replace or to assist human work. The ideal highly humanoid robot is a product of advanced control theory, mechatronics, computer science and artificial intelligence, materials science, and bionics, and the scientific community is currently developing in this direction. However, remote control of robots is still imperfect, the application of big data is not yet widespread, robot data acquisition is still largely offline, and deep learning in robots still relies on locally stored data.
With the continuous progress of science and technology and of robot technology, intelligent robots are gradually entering millions of households, and many intelligent robots on the market bring convenience and enjoyment to people's lives. Among them, interactive robots, as one category of intelligent robot, can interact with people and add pleasure to daily life, especially for the elderly and children. Existing interactive robots on the market take natural language processing and semantic understanding as their core and integrate technologies such as speech recognition to realize personified interaction with various devices. However, these existing interactive robots also have shortcomings: the accuracy with which the robot matches the actual instruction is low, and the robot easily misreads the actual operation instruction, causing inaccurate operation.
Therefore, how to achieve accurate recognition of actual instructions by the robot, realize precise control of the robot, and better achieve intelligent interaction has become a technical problem to be urgently solved by those skilled in the art.
Summary of the invention
The object of the present invention is to provide a method for interactively controlling a robot with a mobile terminal, which can achieve accurate recognition of actual instructions by the robot, realize precise control of the robot, and better achieve intelligent interaction.
To achieve the above object, the present invention provides a method for interactively controlling a robot with a mobile terminal. The robot comprises a wireless communication module, a central controller, a processor, a servo motor, a motor controller, a motor drive circuit, and a camera driven by a rotation motor. The method comprises the following steps:
The mobile terminal connects to the wireless communication module of the robot over a local area network or via Bluetooth;
The mobile terminal sends a control instruction to the central controller of the robot;
The processor obtains the control instructions issued by the central controller and stores them in a dynamic buffer region; when the number of control instructions stored in the dynamic buffer reaches a preset frame count, the instruction data is retrieved in a single transfer and sent to the motor controller. The central controller and the processor communicate through a message server: each connects to the message server as a client, and the message server is located in the mobile terminal;
The rotation motor drives the camera to rotate and acquire limb-movement images of the operator. A three-dimensional coordinate system of a skeleton model is established for the human body in the limb-movement images. Taking the left-hand node, right-hand node, left-wrist node, right-wrist node, left-shoulder node, and right-shoulder node of the skeleton model as feature-processing objects, the posture and gesture motion vectors of these nodes in the three-dimensional coordinate system are obtained. These vectors are compared with the preset feature posture and gesture feature vectors in a gesture database, the operation instruction represented by the matching feature vector is obtained, and the operation instruction is sent to the motor controller;
The motor controller controls the motor drive circuit to drive the motor to perform the corresponding action;
The motor controller periodically feeds the execution result of the motor back to the central controller; the central controller transmits the execution result to the mobile terminal through the wireless communication module, and the mobile terminal displays the execution result on its touch screen.
Preferably, the camera comprises a first depth camera, an RGB camera, and a second depth camera arranged side by side; the RGB camera is used to capture color images, and the first depth camera and the second depth camera are used to capture dynamic images.
Preferably, the robot further comprises a face recognition module. The camera captures a face image of the operator, and the face recognition module analyzes the face image to judge whether the operator's face image matches a preset face image. If it matches the preset face image, the rotation motor drives the camera to rotate and acquire the limb-movement images of the operator; if it does not match, the camera stops acquiring the limb-movement images of the operator.
Preferably, the origin of the three-dimensional coordinate system is the midpoint between the two shoulders.
Preferably, the control instructions include a camera-on instruction and forward, backward, turn-left, and turn-right instructions.
Preferably, the mobile terminal is a mobile phone, a tablet computer, or a remote controller.
Preferably, the robot further comprises a speech recognition module. The speech recognition module recognizes a voice operation command issued by the operator and generates a corresponding voice operation instruction, which is sent to the central controller. The central controller parses the voice operation instruction and sends it to the motor controller, and the motor controller controls the motor drive circuit to drive the motor to perform the corresponding action.
In the method for interactively controlling a robot with a mobile terminal provided by the present invention, operation instructions are sent to the robot from the mobile terminal, and the robot can accurately identify the actual instructions, so that precise control of the robot is realized and intelligent interaction is better achieved.
Brief description of the drawings
Fig. 1 is a flow diagram of a specific embodiment of the method for interactively controlling a robot with a mobile terminal provided by the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only a part of the embodiments of the present application, rather than all of them. Based on the embodiments of the present application, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application.
Please refer to Fig. 1, which is a flow diagram of a specific embodiment of the method for interactively controlling a robot with a mobile terminal provided by the present invention.
As shown in Fig. 1, the present invention provides a method for interactively controlling a robot with a mobile terminal. The robot comprises a wireless communication module, a central controller, a processor, a servo motor, a motor controller, a motor drive circuit, and a camera driven by a rotation motor. The method comprises the following steps:
The mobile terminal connects to the wireless communication module of the robot over a local area network or via Bluetooth;
The mobile terminal sends a control instruction to the central controller of the robot;
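To make these first two steps concrete, the sketch below shows how a mobile-terminal application might open a LAN connection to the robot's wireless communication module and deliver one control instruction. It is a minimal illustration only: the host address, port, and line-delimited JSON wire format are assumptions, since the invention does not prescribe a transport protocol.

```python
import json
import socket

# Assumed endpoint of the robot's wireless communication module on the LAN;
# the invention does not specify an address, port, or wire format.
ROBOT_HOST = "192.168.1.50"
ROBOT_PORT = 9000

def send_control_instruction(instruction: str) -> None:
    """Send one control instruction (e.g. 'forward') to the central controller."""
    with socket.create_connection((ROBOT_HOST, ROBOT_PORT), timeout=5) as sock:
        # Hypothetical line-delimited JSON message layout.
        message = json.dumps({"type": "control", "instruction": instruction})
        sock.sendall(message.encode("utf-8") + b"\n")

if __name__ == "__main__":
    send_control_instruction("forward")
```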
The processor obtains the control instructions issued by the central controller and stores them in a dynamic buffer region; when the number of control instructions stored in the dynamic buffer reaches a preset frame count, the instruction data is retrieved in a single transfer and sent to the motor controller. The central controller and the processor communicate through a message server: each connects to the message server as a client, and the message server is located in the mobile terminal;
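The buffering behaviour described above can be sketched as follows: instructions accumulate in a dynamic buffer until the preset frame count is reached, at which point the whole batch is handed to the motor controller in a single transfer. The class name, the dispatch callback, and the example frame count are illustrative assumptions.

```python
from typing import Callable, List

class InstructionBuffer:
    """Dynamic buffer that batches control instructions before dispatch."""

    def __init__(self, frame_count: int, dispatch: Callable[[List[str]], None]):
        self.frame_count = frame_count   # preset frame number
        self.dispatch = dispatch         # hands a batch to the motor controller
        self._buffer: List[str] = []

    def push(self, instruction: str) -> None:
        """Store one instruction; dispatch the batch once the threshold is hit."""
        self._buffer.append(instruction)
        if len(self._buffer) >= self.frame_count:
            batch, self._buffer = self._buffer, []
            self.dispatch(batch)         # single transfer of the whole batch

# Usage: dispatch just prints; a real system would feed the motor controller.
buf = InstructionBuffer(frame_count=3, dispatch=print)
for cmd in ["forward", "forward", "turn_left"]:
    buf.push(cmd)  # prints ['forward', 'forward', 'turn_left'] on the third push
```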
The rotation motor drives the camera to rotate and acquire limb-movement images of the operator. A three-dimensional coordinate system of a skeleton model is established for the human body in the limb-movement images. Taking the left-hand node, right-hand node, left-wrist node, right-wrist node, left-shoulder node, and right-shoulder node of the skeleton model as feature-processing objects, the posture and gesture motion vectors of these nodes in the three-dimensional coordinate system are obtained. These vectors are compared with the preset feature posture and gesture feature vectors in a gesture database, the operation instruction represented by the matching feature vector is obtained, and the operation instruction is sent to the motor controller;
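As a sketch of the comparison step, the code below flattens the six skeleton nodes into one feature vector (expressed relative to the shoulder midpoint, which a preferred scheme uses as the coordinate origin) and matches it against a gesture database by nearest neighbour. The description only says the vectors are "compared", so the distance metric and threshold are assumptions.

```python
import numpy as np

# The six skeleton nodes taken as feature-processing objects.
NODES = ["left_hand", "right_hand", "left_wrist", "right_wrist",
         "left_shoulder", "right_shoulder"]

def feature_vector(joints: dict) -> np.ndarray:
    """Flatten the six nodes' 3-D positions into one 18-element vector,
    re-expressed relative to the midpoint of the two shoulders."""
    origin = (np.asarray(joints["left_shoulder"], dtype=float) +
              np.asarray(joints["right_shoulder"], dtype=float)) / 2.0
    return np.concatenate([np.asarray(joints[n], dtype=float) - origin
                           for n in NODES])

def match_gesture(vec: np.ndarray, database: dict, threshold: float = 0.5):
    """Return the operation instruction whose preset feature vector lies
    closest to `vec`, or None if nothing is within the threshold."""
    best, best_dist = None, float("inf")
    for instruction, ref in database.items():
        dist = float(np.linalg.norm(vec - ref))
        if dist < best_dist:
            best, best_dist = instruction, dist
    return best if best_dist <= threshold else None
```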
The motor controller controls the motor drive circuit to drive the motor to perform the corresponding action;
The motor controller periodically feeds the execution result of the motor back to the central controller; the central controller transmits the execution result to the mobile terminal through the wireless communication module, and the mobile terminal displays the execution result on its touch screen.
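The periodic feedback path might look like the sketch below, with the motor controller's status query and the central controller's wireless forwarding reduced to placeholder callables; the reporting period is likewise an assumption.

```python
import threading

def report_periodically(get_result, send_to_terminal,
                        stop: threading.Event, period_s: float = 1.0) -> None:
    """Periodically feed the motor's execution result back to the terminal.

    get_result() stands in for the motor controller's status query and
    send_to_terminal() for the central controller forwarding the result over
    the wireless module; the mobile terminal would then show it on its screen.
    """
    while not stop.wait(period_s):       # wake once per period until stopped
        send_to_terminal(get_result())

# Usage: run in a background thread and set `stop` to end the feedback loop.
stop = threading.Event()
worker = threading.Thread(
    target=report_periodically,
    args=(lambda: "ok", print, stop),
    daemon=True,
)
worker.start()
stop.set()  # stop immediately in this example
```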
In the method for interactively controlling a robot with a mobile terminal provided by the present invention, operation instructions are sent to the robot from the mobile terminal, and the robot can accurately identify the actual instructions, so that precise control of the robot is realized and intelligent interaction is better achieved.
In a preferred scheme, the camera comprises a first depth camera, an RGB camera, and a second depth camera arranged side by side; the RGB camera is used to capture color images, and the first depth camera and the second depth camera are used to capture dynamic images.
In a preferred scheme, the robot further comprises a face recognition module. The camera captures a face image of the operator, and the face recognition module analyzes the face image to judge whether the operator's face image matches a preset face image. If it matches the preset face image, the rotation motor drives the camera to rotate and acquire the limb-movement images of the operator; if it does not match, the camera stops acquiring the limb-movement images of the operator.
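The face-recognition gate can be expressed as a small sketch in which the recognition module and the camera controls are placeholder callables; the invention does not specify a particular recognition algorithm.

```python
def gated_limb_capture(face_image, matches_preset_face,
                       start_capture, stop_capture) -> bool:
    """Start limb-movement acquisition only for a recognized operator.

    matches_preset_face(face_image) -> bool stands in for the face
    recognition module's comparison against the preset face image;
    start_capture/stop_capture stand in for driving the rotation motor
    and camera. All three are illustrative placeholders.
    """
    if matches_preset_face(face_image):
        start_capture()   # authorized operator: acquire limb-movement images
        return True
    stop_capture()        # unknown face: stop acquiring limb-movement images
    return False
```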
In a preferred scheme, the origin of the three-dimensional coordinate system is the midpoint between the two shoulders.
In a preferred scheme, the control instructions include a camera-on instruction and forward, backward, turn-left, and turn-right instructions.
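This instruction set could be enumerated as below; the string values are assumptions chosen for readability.

```python
from enum import Enum

class ControlInstruction(Enum):
    """The control instructions listed above; values are illustrative."""
    CAMERA_ON = "camera_on"
    FORWARD = "forward"
    BACKWARD = "backward"
    TURN_LEFT = "turn_left"
    TURN_RIGHT = "turn_right"
```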
In a preferred scheme, the mobile terminal may be a mobile phone, a tablet computer, a remote controller, or a similar mobile device.
In a preferred scheme, the robot further comprises a speech recognition module. The speech recognition module recognizes a voice operation command issued by the operator and generates a corresponding voice operation instruction, which is sent to the central controller. The central controller parses the voice operation instruction and sends it to the motor controller, and the motor controller controls the motor drive circuit to drive the motor to perform the corresponding action.
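The voice path described above reduces to a three-stage pipeline, sketched here with the recognition, parsing, and motor-control stages as placeholder callables, since the invention does not name particular algorithms or interfaces for them.

```python
def handle_voice_command(audio, recognize_speech, parse_instruction,
                         drive_motor) -> None:
    """Voice pipeline: speech recognition -> central-controller parsing -> motor.

    recognize_speech(audio) -> str stands in for the speech recognition module,
    parse_instruction(text) for the central controller's parsing step, and
    drive_motor(instruction) for the motor controller driving the motor through
    the motor drive circuit. All three are illustrative placeholders.
    """
    voice_instruction = recognize_speech(audio)        # e.g. "turn left"
    instruction = parse_instruction(voice_instruction)
    drive_motor(instruction)                           # execute the action
```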
The structure, features, and effects of the present invention have been described in detail above on the basis of the embodiments shown in the drawings. The above is only a preferred embodiment of the present invention, and the scope of implementation of the present invention is not limited to what is shown in the drawings. Any change made according to the conception of the present invention, or any modification into an equivalent embodiment of equivalent variation, shall fall within the protection scope of the present invention as long as it does not go beyond the spirit covered by the description and the drawings.
Claims (7)
1. A method for interactively controlling a robot with a mobile terminal, the robot comprising a wireless communication module, a central controller, a processor, a servo motor, a motor controller, a motor drive circuit, and a camera driven by a rotation motor, characterized in that the method comprises the following steps:
the mobile terminal connects to the wireless communication module of the robot over a local area network or via Bluetooth;
the mobile terminal sends a control instruction to the central controller of the robot;
the processor obtains the control instructions issued by the central controller and stores them in a dynamic buffer region; when the number of control instructions stored in the dynamic buffer reaches a preset frame count, the instruction data is retrieved in a single transfer and sent to the motor controller; the central controller and the processor communicate through a message server, each connecting to the message server as a client, and the message server is located in the mobile terminal;
the rotation motor drives the camera to rotate and acquire limb-movement images of the operator; a three-dimensional coordinate system of a skeleton model is established for the human body in the limb-movement images; taking the left-hand node, right-hand node, left-wrist node, right-wrist node, left-shoulder node, and right-shoulder node of the skeleton model as feature-processing objects, the posture and gesture motion vectors of these nodes in the three-dimensional coordinate system are obtained; these vectors are compared with the preset feature posture and gesture feature vectors in a gesture database, the operation instruction represented by the matching feature vector is obtained, and the operation instruction is sent to the motor controller;
the motor controller controls the motor drive circuit to drive the motor to perform the corresponding action;
the motor controller periodically feeds the execution result of the motor back to the central controller; the central controller transmits the execution result to the mobile terminal through the wireless communication module, and the mobile terminal displays the execution result on its touch screen.
2. The method for interactively controlling a robot with a mobile terminal according to claim 1, characterized in that the camera comprises a first depth camera, an RGB camera, and a second depth camera arranged side by side; the RGB camera is used to capture color images, and the first depth camera and the second depth camera are used to capture dynamic images.
3. The method for interactively controlling a robot with a mobile terminal according to claim 2, characterized in that the robot further comprises a face recognition module; the camera captures a face image of the operator, and the face recognition module analyzes the face image to judge whether the operator's face image matches a preset face image; if it matches the preset face image, the rotation motor drives the camera to rotate and acquire the limb-movement images of the operator; if it does not match, the camera stops acquiring the limb-movement images of the operator.
4. The method for interactively controlling a robot with a mobile terminal according to claim 1, characterized in that the origin of the three-dimensional coordinate system is the midpoint between the two shoulders.
5. The method for interactively controlling a robot with a mobile terminal according to claim 3, characterized in that the control instructions include a camera-on instruction and forward, backward, turn-left, and turn-right instructions.
6. The method for interactively controlling a robot with a mobile terminal according to claim 1, characterized in that the mobile terminal is a mobile phone, a tablet computer, or a remote controller.
7. The method for interactively controlling a robot with a mobile terminal according to claim 1, characterized in that the robot further comprises a speech recognition module; the speech recognition module recognizes a voice operation command issued by the operator and generates a corresponding voice operation instruction, which is sent to the central controller; the central controller parses the voice operation instruction and sends it to the motor controller, and the motor controller controls the motor drive circuit to drive the motor to perform the corresponding action.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810070014.9A CN110065066A (en) | 2018-01-24 | 2018-01-24 | Method for interactively controlling a robot with a mobile terminal
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810070014.9A CN110065066A (en) | 2018-01-24 | 2018-01-24 | Method for interactively controlling a robot with a mobile terminal
Publications (1)
Publication Number | Publication Date |
---|---|
CN110065066A (en) | 2019-07-30
Family
ID=67365642
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810070014.9A (Pending) | Method for interactively controlling a robot with a mobile terminal | 2018-01-24 | 2018-01-24
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110065066A (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170015003A1 (en) * | 2012-12-21 | 2017-01-19 | Crosswing Inc. | Control system for mobile robot |
CN104777775A (en) * | 2015-03-25 | 2015-07-15 | 北京工业大学 | Two-wheeled self-balancing robot control method based on Kinect device |
CN105549595A (en) * | 2016-02-03 | 2016-05-04 | 南京聚特机器人技术有限公司 | Robot control system based on intelligent mobile terminal and control method |
CN105643590A (en) * | 2016-03-31 | 2016-06-08 | 河北工业大学 | Wheeled mobile robot controlled by gestures and operation method of wheeled mobile robot |
CN107398902A (en) * | 2017-08-02 | 2017-11-28 | 合肥中导机器人科技有限公司 | robot control method, robot control system |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113221834A (en) * | 2021-06-01 | 2021-08-06 | 北京字节跳动网络技术有限公司 | Terminal control method and device, terminal and storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190730 |