CN109213101A - Method and system for preprocessing under a robot system - Google Patents
- Publication number
- CN109213101A (application CN201811032911.7A)
- Authority
- CN
- China
- Prior art keywords
- industrial robot
- control system
- robot
- touch screen
- plc module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
- G05B19/41865—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by job scheduling, process planning, material flow
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32252—Scheduling production, machining, job shop
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention discloses a method and system for preprocessing under a robot system. The system includes an industrial robot, a vision controller mounted on the industrial robot, and a control system. The control system includes an Ethernet module, a touch screen, a processor, other modules, and a PLC module; it connects to the vision controller via the Ethernet module and to the industrial robot via the PLC module. The embodiments of the invention use a corresponding preprocessing mechanism to judge whether the industrial robot is in a normal operating state, ensuring that subsequent operations act on the correct operation object.
Description
Technical field
The present invention relates to the field of robot technology, and in particular to a method and system for preprocessing under a robot system.
Background technique
With the continuous development of robot technology, more and more robots are beginning to perform tasks in place of humans. A robot is, in common usage, an automatically controlled machine, including machines that simulate human behavior or thought as well as those that simulate other living things (such as robot dogs, Doraemon, and so on). There are many classifications of, and disputes over, the narrow definition of a robot, and some computer programs are even referred to as robots. In modern industry, a robot is an artificial device that can execute tasks automatically, to replace or assist human work. The ideal highly humanoid robot is a product of advanced control theory, mechatronics, computing, artificial intelligence, materials science, and bionics. Research is currently moving in this direction, but remote process control of robots remains imperfect, the application of big data is not yet widespread, robot data acquisition is still offline, and deep learning for robots still relies on locally stored data.
An industrial robot is a multi-joint manipulator or multi-degree-of-freedom mechanical device for the industrial field. It can perform work automatically and realizes various functions through its own power and control capability. It can accept human command or run according to a pre-programmed sequence, and a modern industrial robot can also act according to principles formulated with artificial intelligence technology.
Industrial robots combined with machine vision have been applied in various industrial automation production processes. However, using a machine-vision-based preprocessing mechanism to examine whether an industrial robot is working normally involves adjusting the entire industrial robot system, and can only be realized by providing a suitable control method and system for the existing industrial robot system.
Summary of the invention
The present invention provides a method and system for preprocessing under a robot system. A preprocessing mechanism triggers the vision controller and the industrial robot to work at the same time, and the corresponding preprocessing mechanism is used to judge whether the industrial robot is in a normal operating state, ensuring that subsequent operations act on the correct operation object.
The present invention provides a method for preprocessing under a robot system. The industrial robot system includes an industrial robot, a vision controller mounted on the industrial robot, and a control system. The control system includes an Ethernet module, a touch screen, a processor, other modules, and a PLC module; it connects to the vision controller via the Ethernet module and to the industrial robot via the PLC module. The method includes:
The control system sends, via the PLC module, a preprocessing instruction to the vision controller and the industrial robot, the preprocessing instruction triggering the vision controller and the industrial robot to work at the same time;
The vision controller uses its camera to capture operation images of the industrial robot's operation object within a preset time specified in the preprocessing instruction, and uses a dedicated neural network model trained with a deep learning algorithm to extract the position and posture of the operation object in the operation images;
It is judged whether the position and posture of the operation object in the operation image differ in motion from the background image in the model library, and any operation image with a motion difference is sent via the Ethernet module to the processor in the control system;
After the processor in the control system receives the operation image, it displays the operation image to the user on the touch screen;
A user instruction triggered by the user is received via the touch screen;
The control system sends a control adjustment instruction to the industrial robot based on the user instruction;
The industrial robot, based on the control adjustment instruction and under the action of the vision controller, adjusts to an operating process consistent with the background image in the model library.
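The decision flow above can be sketched in Python. Everything below is an illustrative assumption, not part of the claimed system: the pose representation as an (x, y, angle) tuple, the tolerance value, and the function names are all hypothetical.

```python
def motion_difference(observed, background, tol=0.05):
    """Hypothetical comparison of an extracted operation-object pose
    against the model-library background pose.  Poses are (x, y, angle)
    tuples; any per-component difference above `tol` counts as a
    motion difference."""
    return any(abs(o - b) > tol for o, b in zip(observed, background))

def preprocess(observed, background):
    """One pass of the preprocessing decision described in the method:
    forward differing images to the processor/touch screen, otherwise
    report normal operation via the PLC module."""
    if motion_difference(observed, background):
        return "send operation image to processor; await user instruction"
    return "display normal-operation information on touch screen"

print(preprocess((1.00, 2.00, 0.10), (1.00, 2.00, 0.10)))
print(preprocess((1.20, 2.00, 0.10), (1.00, 2.00, 0.10)))
```

The branch mirrors the two outcomes in the text: a motion difference routes the image to the processor for user review, while a match reports normal operation.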
The method also includes:
If it is judged that the position and posture of the operation object in the operation image do not differ in motion from the background image in the model library, information indicating normal operation is displayed on the touch screen via the PLC module.
The method also includes:
The control system sends an operation instruction to the industrial robot via the PLC module;
The industrial robot completes the corresponding operation based on the operation instruction.
Displaying the operation image to the user on the touch screen after the processor in the control system receives it includes:
Performing position and posture similarity analysis based on the operation image, and producing a similarity-percentage comparison chart from the position and posture matching;
Displaying the operation image together with the similarity-percentage comparison chart of its position and posture to the user on the touch screen.
The industrial robot adjusting, based on the control adjustment instruction and under the action of the vision controller, to an operating process consistent with the background image in the model library includes:
Receiving, via the touch screen, a user instruction triggered by the user based on the similarity percentage;
The control system storing the operation image with the motion difference in the memory module of the control system based on the user instruction.
The method also includes:
The PLC module performs data analysis on the control instruction triggered on the touch screen and sends the corresponding command information to the industrial robot; the industrial robot executes the corresponding steps according to the command information converted from the touch state;
After finishing execution, the industrial robot sends a completion signal to the PLC module, and issues a further response instruction only after the PLC module's return signal arrives.
The deep learning algorithm uses stochastic gradient descent.
The dedicated neural network model uses the AlexNet convolutional neural network model.
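Stochastic gradient descent, as used here, updates the model on one randomly chosen sample per iteration. The following is a minimal illustrative sketch on a 1-D least-squares model, not the patent's actual AlexNet training; the model, data, learning rate, and iteration count are all assumptions.

```python
import random

def sgd(samples, lr=0.1, iters=200, seed=0):
    """Minimal stochastic gradient descent on the model y = w * x.
    Each iteration picks ONE random sample and steps along its
    gradient, which is what keeps the per-step cost low and the
    convergence fast, as the text notes."""
    rng = random.Random(seed)
    w = 0.0
    for _ in range(iters):
        x, y = rng.choice(samples)      # single randomly selected sample
        grad = 2 * (w * x - y) * x      # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

# Data generated from y = 3x; SGD should recover w close to 3.
data = [(x, 3.0 * x) for x in (0.5, 1.0, 1.5, 2.0)]
print(round(sgd(data), 2))
```

In the patented system the same per-sample update rule would be applied to the weights of the AlexNet model rather than a scalar.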
Correspondingly, the present invention also provides a robot control system. The industrial robot system includes an industrial robot, a vision controller mounted on the industrial robot, and a control system. The control system includes an Ethernet module, a touch screen, a processor, other modules, and a PLC module; it connects to the vision controller via the Ethernet module and to the industrial robot via the PLC module. The industrial robot control system executes the method described above.
In the present invention, the control system realizes interaction between the PLC module, the vision controller, and the other modules, and extends various functional interfaces of the PLC module within the control system. In embodiments of the present invention, the PLC module and the touch screen run the industrial robot operating system, so that the industrial robot's parameters can be configured, and its operating state monitored, through the touch screen interface; operation is more user-friendly and adapts to work requirements in which the operation object changes frequently. A modular programming method is used: it is easy to call, its programming logic is clear, and each functional module is relatively independent; the same I/O interface can serve different purposes, which reduces wiring, improves utilization, and improves the stability of system operation. After receiving the preprocessing instruction from the control system, the vision controller can rapidly extract key-frame images of the operation object based on the corresponding neural network model, thereby realizing differentiated motion detection, and can quickly feed the key-frame image content back to the touch screen via the PLC module. This provides motion-difference analysis of the operation object during preprocessing on the industrial robot, realizes the adjustment process before the operation object is handled, and ensures that the subsequent overall control system adaptively controls the operation of the entire robot system.
Brief description of the drawings
In order to explain the embodiments of the invention or the technical solutions in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is the robot control system architecture schematic diagram in the embodiment of the present invention;
Fig. 2 is pretreated method flow diagram under the robot system in the embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art without creative effort, based on the embodiments of the present invention, shall fall within the protection scope of the present invention.
In the method for preprocessing under a robot system in an embodiment of the present invention, the industrial robot system includes an industrial robot, a vision controller mounted on the industrial robot, and a control system. The control system includes an Ethernet module, a touch screen, a processor, other modules, and a PLC module; it connects to the vision controller via the Ethernet module and to the industrial robot via the PLC module. The method includes: the control system sending, via the PLC module, a preprocessing instruction to the vision controller and the industrial robot, the preprocessing instruction triggering the vision controller and the industrial robot to work at the same time; the vision controller using its camera to capture operation images of the industrial robot's operation object within a preset time specified in the preprocessing instruction, and using a dedicated neural network model trained with a deep learning algorithm to extract the position and posture of the operation object in the operation images; judging whether the position and posture of the operation object in the operation image differ in motion from the background image in the model library, and sending any operation image with a motion difference via the Ethernet module to the processor in the control system; after the processor in the control system receives the operation image, displaying the operation image to the user on the touch screen; receiving, via the touch screen, a user instruction triggered by the user; the control system sending a control adjustment instruction to the industrial robot based on the user instruction; and the industrial robot, based on the control adjustment instruction and under the action of the vision controller, adjusting to an operating process consistent with the background image in the model library.
Fig. 1 shows the architecture of the robot control system in an embodiment of the present invention. The industrial robot system includes an industrial robot, a vision controller mounted on the industrial robot, and a control system. The control system includes an Ethernet module, a touch screen, a processor, other modules, and a PLC module; it connects to the vision controller via the Ethernet module and to the industrial robot via the PLC module.
Here, the control system sends a preprocessing instruction via the PLC module to the vision controller and the industrial robot; the preprocessing instruction triggers the vision controller and the industrial robot to work at the same time.
Here, the vision controller uses its camera to capture operation images of the industrial robot's operation object within the preset time specified in the preprocessing instruction, and uses a dedicated neural network model trained with a deep learning algorithm to extract the position and posture of the operation object in the operation images. It judges whether the position and posture of the operation object in the operation image differ in motion from the background image in the model library, and sends any operation image with a motion difference via the Ethernet module to the processor in the control system.
Here, after the processor in the control system receives the operation image, it displays the operation image to the user on the touch screen; the control system receives, via the touch screen, a user instruction triggered by the user; and the control system sends a control adjustment instruction to the industrial robot based on the user instruction.
Here, the industrial robot, based on the control adjustment instruction and under the action of the vision controller, adjusts to an operating process consistent with the background image in the model library.
If the vision controller judges that the position and posture of the operation object in the operation image do not differ in motion from the background image in the model library, information indicating normal operation is displayed on the touch screen via the PLC module.
The control system sends an operation instruction to the industrial robot via the PLC module; the industrial robot completes the corresponding operation based on the operation instruction.
The control system performs position and posture similarity analysis based on the operation image, produces a similarity-percentage comparison chart from the position and posture matching, and displays the operation image together with the similarity-percentage comparison chart of its position and posture to the user on the touch screen.
The control system receives, via the touch screen, a user instruction triggered by the user based on the similarity percentage; based on the user instruction, the control system stores the operation image with the motion difference in the memory module of the control system.
The PLC module in the control system performs data analysis on the control instruction triggered on the touch screen and sends the corresponding command information to the industrial robot; the industrial robot executes the corresponding steps according to the command information converted from the touch state. After finishing execution, the industrial robot sends a completion signal to the PLC module, and issues a further response instruction only after the PLC module's return signal arrives.
Specifically, Fig. 2 shows the flow of the method for preprocessing under the robot system in an embodiment of the present invention. The industrial robot system includes an industrial robot, a vision controller mounted on the industrial robot, and a control system; the control system includes an Ethernet module, a touch screen, a processor, other modules, and a PLC module, connects to the vision controller via the Ethernet module, and connects to the industrial robot via the PLC module. The specific steps are as follows:
S201: The control system sends a preprocessing instruction via the PLC module to the vision controller and the industrial robot; the preprocessing instruction triggers the vision controller and the industrial robot to work at the same time;
S202: The vision controller uses its camera to capture operation images of the industrial robot's operation object within the preset time specified in the preprocessing instruction, and uses a dedicated neural network model trained with a deep learning algorithm to extract the position and posture of the operation object in the operation images;
It should be noted that the preset time is generally measured in seconds, takes a value between 0 s and 30 s, and can be set by the user through the user interface.
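A user-supplied capture window can be validated against the stated range with a simple helper. Only the 0–30 s range comes from the text; the function name and the choice to reject rather than clamp out-of-range values are illustrative assumptions.

```python
def set_capture_window(seconds):
    """Validate a user-supplied preset capture time against the
    0-30 s range stated for the preprocessing instruction.
    Out-of-range values are rejected rather than silently clamped."""
    if not 0 < seconds <= 30:
        raise ValueError("preset time must be in (0, 30] seconds")
    return float(seconds)

print(set_capture_window(5))
```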
It should be noted that the deep learning algorithm here uses stochastic gradient descent, and the dedicated neural network model here uses the AlexNet convolutional neural network model.
S203: Judge whether the position and posture of the operation object in the operation image differ in motion from the background image in the model library; if a motion difference exists, proceed to S204, otherwise proceed to S209;
S204: Send the operation image with the motion difference via the Ethernet module to the processor in the control system;
In a specific implementation, position and posture similarity analysis is performed on the operation image, a similarity-percentage comparison chart is produced from the position and posture matching, and the operation image together with the similarity-percentage comparison chart of its position and posture is displayed to the user on the touch screen.
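The patent does not define how the similarity percentage is computed. One plausible sketch, assuming (x, y, angle) poses and hypothetical normalization scales (one metre for position, pi radians for posture), is:

```python
import math

def similarity_percent(observed, reference, pos_scale=1.0, ang_scale=math.pi):
    """Hypothetical position/posture similarity score.  Poses are
    (x, y, angle) tuples; position error is normalized by pos_scale
    and posture error by ang_scale, each clipped to [0, 1] and then
    mapped to a 0-100 % similarity."""
    dx, dy = observed[0] - reference[0], observed[1] - reference[1]
    pos_err = min(math.hypot(dx, dy) / pos_scale, 1.0)
    ang_err = min(abs(observed[2] - reference[2]) / ang_scale, 1.0)
    return 100.0 * (1.0 - pos_err), 100.0 * (1.0 - ang_err)

print(similarity_percent((1.0, 2.0, 0.0), (1.0, 2.0, 0.0)))  # (100.0, 100.0)
```

The two returned percentages correspond to the position and posture entries that the comparison chart would display alongside the operation image.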
S205: After the processor in the control system receives the operation image, the operation image is displayed to the user on the touch screen;
S206: A user instruction triggered by the user is received via the touch screen;
S207: The control system sends a control adjustment instruction to the industrial robot based on the user instruction;
S208: The industrial robot, based on the control adjustment instruction and under the action of the vision controller, adjusts to an operating process consistent with the background image in the model library;
Here, the industrial robot's adjustment, based on the control adjustment instruction and under the action of the vision controller, to an operating process consistent with the background image in the model library includes: receiving, via the touch screen, a user instruction triggered by the user based on the similarity percentage; and the control system storing the operation image with the motion difference in the memory module of the control system based on the user instruction.
S209: If it is judged that the position and posture of the operation object in the operation image do not differ in motion from the background image in the model library, information indicating normal operation is displayed on the touch screen via the PLC module;
S210: The control system sends an operation instruction to the industrial robot via the PLC module;
S211: The industrial robot completes the corresponding operation based on the operation instruction.
In a specific implementation, the PLC module performs data analysis on the control instruction triggered on the touch screen and sends the corresponding command information to the industrial robot; the industrial robot executes the corresponding steps according to the command information converted from the touch state. After finishing execution, the industrial robot sends a completion signal to the PLC module, and issues a further response instruction only after the PLC module's return signal arrives.
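The command/completion/return-signal handshake described above can be modelled as a short simulation. The message strings and the function name are illustrative; the actual PLC protocol is not specified in the text.

```python
def plc_exchange(commands):
    """Simulate the handshake in the text: the PLC sends a command,
    the robot executes it and reports completion, and only after the
    PLC's return signal does the robot move on to its next response."""
    log = []
    for cmd in commands:
        log.append(f"PLC -> robot: {cmd}")
        log.append(f"robot: executing {cmd}")
        log.append("robot -> PLC: completion signal")
        log.append("PLC -> robot: return signal")  # robot waits for this
    return log

for line in plc_exchange(["adjust pose"]):
    print(line)
```

The strict four-step ordering per command reflects the requirement that the robot wait for the PLC's return signal before issuing any further response.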
This method uses the preprocessing mechanism to judge whether the work pattern of the industrial robot is correct. Only when the vision controller finds a consistent match is the full control process carried out; when the match is inconsistent, the preprocessing stage must continue adjusting the entire industrial robot until consistency is reached, and only then does normal operation proceed. The vision controller combines acquisition, calibration, training, and recognition, which reduces system complexity and the difficulty of field deployment, makes it easy to add visual recognition capability to an industrial robot, and reduces the difficulty of using recognition and position detection on operation objects in machine-vision products paired with industrial robots. Using a dedicated convolutional neural network model as the main recognition algorithm of the vision controller gives higher accuracy and better robustness. The deep learning algorithm according to the present invention uses stochastic gradient descent; each iteration randomly selects one sample from the training set to learn from, which keeps the computational cost low and the training convergence fast, so the neural network model can be trained to convergence within a short time. Key-frame extraction is used to pull selected key images out of the video frame data stream, which also reduces the data volume the subsequent control system must process and keeps the information flow lean.
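Key-frame extraction is not specified in detail in the text. A common minimal approach, shown here purely as an assumption, keeps a frame only when it differs sufficiently from the last kept frame, which is what reduces the downstream data volume.

```python
def key_frames(frames, threshold=10):
    """Keep only frames that differ from the last kept frame by more
    than `threshold` (sum of absolute pixel differences).  Frames are
    flat lists of pixel intensities for illustration; a real system
    would operate on camera images."""
    kept = []
    for i, frame in enumerate(frames):
        if not kept or sum(
            abs(a - b) for a, b in zip(frame, frames[kept[-1]])
        ) > threshold:
            kept.append(i)
    return kept

stream = [[0, 0, 0], [0, 1, 0], [20, 0, 0], [21, 0, 0]]
print(key_frames(stream))  # [0, 2]
```

Frames 1 and 3 are dropped because they barely differ from the frames already kept, so only the first frame and the large change at frame 2 reach the control system.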
By implementing the embodiments of the present invention, the control system realizes interaction between the PLC module, the vision controller, and the other modules, and extends various functional interfaces of the PLC module within the control system. In embodiments of the present invention, the PLC module and the touch screen run the industrial robot operating system, so that the industrial robot's parameters can be configured, and its operating state monitored, through the touch screen interface; operation is more user-friendly and adapts to work requirements in which the operation object changes frequently. A modular programming method is used: it is easy to call, its programming logic is clear, and each functional module is relatively independent; the same I/O interface can serve different purposes, which reduces wiring, improves utilization, and improves the stability of system operation. After receiving the preprocessing instruction from the control system, the vision controller can rapidly extract key-frame images of the operation object based on the corresponding neural network model, thereby realizing differentiated motion detection, and can quickly feed the key-frame image content back to the touch screen via the PLC module. This provides motion-difference analysis of the operation object during preprocessing on the industrial robot, realizes the adjustment process before the operation object is handled, and ensures that the subsequent overall control system adaptively controls the operation of the entire robot system.
Those of ordinary skill in the art will understand that all or part of the steps in the various methods of the above embodiments can be completed by a program instructing the relevant hardware. The program can be stored in a computer-readable storage medium, and the storage medium may include read-only memory (ROM), random access memory (RAM), a magnetic disk, an optical disc, and the like.
The method and system for preprocessing under a robot system provided by the embodiments of the present invention have been introduced in detail above. Specific examples are used herein to explain the principle and implementation of the invention, and the description of the above embodiments is only intended to help in understanding the method of the invention and its core idea. Meanwhile, for those skilled in the art, there will be changes in the specific implementation and scope of application according to the idea of the present invention. In summary, the content of this specification should not be understood as a limitation of the present invention.
Claims (9)
1. A method for preprocessing under a robot system, characterized in that the industrial robot system includes an industrial robot, a vision controller mounted on the industrial robot, and a control system, the control system including an Ethernet module, a touch screen, a processor, other modules, and a PLC module, the control system being connected to the vision controller via the Ethernet module and to the industrial robot via the PLC module, the method comprising:
the control system sending, via the PLC module, a preprocessing instruction to the vision controller and the industrial robot, the preprocessing instruction triggering the vision controller and the industrial robot to work at the same time;
the vision controller using its camera to capture operation images of the industrial robot's operation object within a preset time specified in the preprocessing instruction, and using a dedicated neural network model trained with a deep learning algorithm to extract the position and posture of the operation object in the operation images;
judging whether the position and posture of the operation object in the operation image differ in motion from the background image in the model library, and sending any operation image with a motion difference via the Ethernet module to the processor in the control system;
after the processor in the control system receives the operation image, displaying the operation image to the user on the touch screen;
receiving, via the touch screen, a user instruction triggered by the user;
the control system sending a control adjustment instruction to the industrial robot based on the user instruction; and
the industrial robot, based on the control adjustment instruction and under the action of the vision controller, adjusting to an operating process consistent with the background image in the model library.
2. The method for preprocessing under a robot system according to claim 1, characterized in that the method further includes:
if it is judged that the position and posture of the operation object in the operation image do not differ in motion from the background image in the model library, displaying information indicating normal operation on the touch screen via the PLC module.
3. The method for preprocessing under a robot system according to claim 1, characterized in that the method further includes:
the control system sending an operation instruction to the industrial robot via the PLC module; and
the industrial robot completing the corresponding operation based on the operation instruction.
4. The method for preprocessing under a robot system according to claim 1, characterized in that displaying the operation image to the user on the touch screen after the processor in the control system receives the operation image includes:
performing position and posture similarity analysis based on the operation image, and producing a similarity-percentage comparison chart from the position and posture matching; and
displaying the operation image together with the similarity-percentage comparison chart of its position and posture to the user on the touch screen.
5. The method for preprocessing under a robot system according to claim 4, characterized in that the industrial robot adjusting, based on the control adjustment instruction and under the action of the vision controller, to an operating process consistent with the background image in the model library includes:
receiving, via the touch screen, a user instruction triggered by the user based on the similarity percentage; and
the control system storing the operation image with the motion difference in the memory module of the control system based on the user instruction.
6. The method for preprocessing under a robot system according to any one of claims 1 to 5, characterized in that the method further comprises:
the PLC module performing data analysis and processing on the control instruction triggered on the touch screen, and sending the corresponding command information to the industrial robot; the industrial robot executing the corresponding steps according to the command information converted from the touch-screen state transition;
the industrial robot transmitting a completion signal to the PLC module after execution is finished, and issuing a further response instruction only after waiting for the return signal of the PLC module.
7. The method for preprocessing under a robot system according to claim 6, characterized in that the deep learning algorithm employs the stochastic gradient descent method.
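Stochastic gradient descent, as named in claim 7, updates the model from one randomly drawn sample at a time. A minimal sketch on a one-parameter least-squares fit (the learning rate, epoch count, and data are illustrative):

```python
import random

def sgd_fit(samples, lr=0.05, epochs=200, seed=0):
    """Fit y ≈ w*x by stochastic gradient descent: one randomly drawn
    sample per update, stepping down the gradient of (y - w*x)^2."""
    rng = random.Random(seed)
    w = 0.0
    for _ in range(epochs):
        x, y = rng.choice(samples)
        grad = -2.0 * x * (y - w * x)   # d/dw of the squared error
        w -= lr * grad
    return w

data = [(x, 3.0 * x) for x in (1.0, 2.0, 3.0)]   # noiseless y = 3x
w = sgd_fit(data)
print(round(w, 2))  # → 3.0
```

In the patent's setting the same per-sample update rule would be applied to the network's weights rather than a single scalar.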
8. The method for preprocessing under a robot system according to claim 6, characterized in that the dedicated neural network model employs the AlexNet convolutional neural network model.
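For orientation only: the five convolutional layers of the standard AlexNet configuration (Krizhevsky et al., 2012) and their parameter count. The patent does not specify which AlexNet variant is used, so these shapes are the commonly cited ones, not taken from the patent:

```python
def conv_params(in_ch, out_ch, k):
    """Weights plus biases of one k x k convolution layer."""
    return in_ch * out_ch * k * k + out_ch

# (in_channels, out_channels, kernel_size) for AlexNet's conv layers
ALEXNET_CONV = [(3, 96, 11), (96, 256, 5), (256, 384, 3),
                (384, 384, 3), (384, 256, 3)]

total = sum(conv_params(i, o, k) for i, o, k in ALEXNET_CONV)
print(f"conv parameters: {total:,}")  # → conv parameters: 3,747,200
```

The bulk of AlexNet's ~60M parameters sits in its fully connected layers; only the convolutional stack is tallied here.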
9. A robot control system, characterized in that the industrial robot system comprises: an industrial robot, and a vision controller and a control system located on the industrial robot; the control system comprises an Ethernet module, a touch screen, a processor, other modules, and a PLC module; the control system is connected to the vision controller via the Ethernet module, and is connected to the industrial robot via the PLC module; and the industrial robot control system executes the method according to any one of claims 1 to 8.
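The component wiring of claim 9 (Ethernet module to the vision controller, PLC module to the industrial robot) can be expressed as a small data structure; all identifiers here are illustrative, not from the patent:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ControlSystem:
    """Composition sketch: the control system reaches the vision
    controller over its Ethernet module and the industrial robot
    through its PLC module."""
    modules: List[str] = field(default_factory=lambda: [
        "ethernet", "touch_screen", "processor", "plc"])
    links: Dict[str, str] = field(default_factory=lambda: {
        "ethernet": "vision_controller",   # Ethernet module <-> vision controller
        "plc": "industrial_robot",         # PLC module <-> industrial robot
    })

    def peer(self, module: str) -> str:
        """Return the external component a given module connects to."""
        return self.links[module]

cs = ControlSystem()
print(cs.peer("plc"))  # → industrial_robot
```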
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811032911.7A CN109213101B (en) | 2018-09-05 | 2018-09-05 | Method and system for preprocessing under robot system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109213101A true CN109213101A (en) | 2019-01-15 |
CN109213101B CN109213101B (en) | 2021-05-25 |
Family
ID=64987636
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811032911.7A Expired - Fee Related CN109213101B (en) | 2018-09-05 | 2018-09-05 | Method and system for preprocessing under robot system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109213101B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108972593A (en) * | 2018-09-07 | 2018-12-11 | 顺德职业技术学院 | Control method and system under an industrial robot system |
CN110883772A (en) * | 2019-10-23 | 2020-03-17 | 中国国家铁路集团有限公司 | Method and system for processing potential safety hazard of railway station by using robot |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102541253A (en) * | 2010-12-30 | 2012-07-04 | 德信互动科技(北京)有限公司 | Analog control system and analog control method |
CN105643607A (en) * | 2016-04-08 | 2016-06-08 | 深圳市中科智敏机器人科技有限公司 | Intelligent industrial robot with sensing and cognitive abilities |
US20170140539A1 (en) * | 2015-11-16 | 2017-05-18 | Abb Technology Ag | Three-dimensional visual servoing for robot positioning |
CN107169519A (en) * | 2017-05-18 | 2017-09-15 | 重庆卓来科技有限责任公司 | Industrial robot vision system and teaching method thereof |
Non-Patent Citations (1)
Title |
---|
杨碧玉 (Yang Biyu): "Design of a Fully Automated Production Line Based on PLC, Vision System and Robot", Science & Technology Information (《科技资讯》) * |
Also Published As
Publication number | Publication date |
---|---|
CN109213101B (en) | 2021-05-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240028896A1 (en) | Method and system for activity classification | |
CN112668687B (en) | Cloud robot system, cloud server, robot control module and robot | |
CN108972593A (en) | Control method and system under an industrial robot system | |
CN108983979B (en) | Gesture tracking recognition method and device and intelligent equipment | |
JPWO2003019475A1 (en) | Robot device, face recognition method, and face recognition device | |
US11654552B2 (en) | Backup control based continuous training of robots | |
CN108073851B (en) | Grabbing gesture recognition method and device and electronic equipment | |
CN107553496B (en) | Method and device for determining and correcting errors of inverse kinematics solving method of mechanical arm | |
JP7320885B2 (en) | Systems, methods and media for manufacturing processes | |
Wachs et al. | Real-time hand gesture telerobotic system using fuzzy c-means clustering | |
CN114730407A (en) | Modeling human behavior in a work environment using neural networks | |
CN109213101A (en) | Method and system for preprocessing under a robot system | |
CN112775967A (en) | Mechanical arm grabbing method, device and equipment based on machine vision | |
CN110807391A (en) | Human body posture instruction identification method for human-unmanned aerial vehicle interaction based on vision | |
CN108415386A (en) | Augmented reality system and its working method for intelligent workshop | |
CN111152227A (en) | Mechanical arm control method based on guided DQN control | |
Li et al. | Teleoperation of a virtual icub robot under framework of parallel system via hand gesture recognition | |
Brecher et al. | Towards anthropomorphic movements for industrial robots | |
CN115847422A (en) | Gesture recognition method, device and system for teleoperation | |
JP7216190B2 (en) | Modular Acceleration Module for Programmable Logic Controller Based Artificial Intelligence | |
WO2020142499A1 (en) | Robot object learning system and method | |
CN109040688A (en) | Method and system for storing captured industrial robot operation videos | |
CN106774178B (en) | Automatic control system and method and mechanical equipment | |
CN106779047B (en) | Information processing method and device | |
KR20230100101A (en) | Robot control system and method for robot setting and robot control using the same |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20210525; Termination date: 20210905 |