CN116713994A - Robot control system and method based on machine vision - Google Patents


Info

Publication number
CN116713994A
CN116713994A (application CN202310729255.0A)
Authority
CN
China
Prior art keywords
robot
behavior
request command
data
machine vision
Prior art date
Legal status
Pending
Application number
CN202310729255.0A
Other languages
Chinese (zh)
Inventor
王佩
张晓伟
冉江婧
崔忠伟
李斌
郭龙
马勋兰
Current Assignee
Guizhou Education University
Original Assignee
Guizhou Education University
Priority date
Filing date
Publication date
Application filed by Guizhou Education University filed Critical Guizhou Education University
Priority to CN202310729255.0A priority Critical patent/CN116713994A/en
Publication of CN116713994A publication Critical patent/CN116713994A/en
Pending legal-status Critical Current


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/1605: Simulation of manipulator lay-out, design, modelling of manipulator
    • B25J9/1628: Programme controls characterised by the control loop
    • B25J9/163: Learning, adaptive, model-based, rule-based expert control
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • B25J9/1697: Vision controlled systems
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a machine-vision-based robot control system and method. The system comprises: a receiving module, used for receiving a request command sent by a client; an analysis module, used for starting machine-vision monitoring equipment preset on the robot and for collecting and analyzing state information of the robot in a dynamic environment; and a calculation module, used for inputting the request command and the state information into a behavior control model, calculating the robot's behavior weights, and obtaining the execution behavior with the minimum cost.

Description

Robot control system and method based on machine vision
Technical Field
The invention relates to the technical field of machine vision, in particular to a robot control system and method based on machine vision.
Background
Most robots currently in wide use in China are still at the teach-and-playback stage: the robot's trajectory is determined in advance, so flexibility is lacking. Such robots also demand very high positioning accuracy from their surroundings, and this requirement must be met even when the final product's accuracy requirement is low, which greatly reduces production flexibility and makes it difficult to satisfy the needs of production and daily life. Robot development is therefore necessarily moving toward the integration of multiple fields. In recent years, with the progress of industrial automation, and in particular the rapid development of digital sensors, image processing, and pattern recognition, accurate positioning and tracking of targets is no longer a problem. Robots based on visual servo control are consequently being widely researched and applied, and the control accuracy and flexibility of robots urgently need to be improved.
Disclosure of Invention
To overcome the problems in the background art, this technical solution provides a machine-vision-based robot control system and method.
This technical solution provides a machine-vision-based robot control system, comprising:
a receiving module, used for receiving a request command sent by a client;
an analysis module, used for starting machine-vision monitoring equipment preset on the robot, and for collecting and analyzing state information of the robot in a dynamic environment;
and a calculation module, used for inputting the request command and the state information into a behavior control model, calculating the robot's behavior weights, and obtaining the execution behavior with the minimum cost.
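The flow through the three modules can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the patent does not disclose the internal form of the behavior control model, so the dictionary-based cost table and all function and field names below are assumptions.

```python
# Hypothetical sketch of the receiving -> analysis -> calculation pipeline.
# The cost table standing in for the behavior control model is invented.

def receiving_module(raw_request):
    """Receiving module: accept a request command sent by the client."""
    return {"command": raw_request}

def analysis_module():
    """Analysis module stand-in: return state information of the robot in
    a (mock) dynamic environment, as machine-vision monitoring would."""
    return {"obstacle_distance_m": 1.5, "heading_deg": 90}

def calculation_module(request, state):
    """Calculation module: score candidate behaviors with toy weights and
    return the execution behavior with the minimum cost."""
    candidate_costs = {
        "move_forward": 2.0 + 1.0 / state["obstacle_distance_m"],
        "turn_left": 3.0,
        "stop": 5.0,
    }
    return min(candidate_costs, key=candidate_costs.get)

request = receiving_module("GOTO station_3")
state = analysis_module()
best = calculation_module(request, state)
```

With the mock state above, moving forward costs about 2.67 against 3.0 for turning and 5.0 for stopping, so it is selected as the minimum-cost behavior.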
As an embodiment of this technical solution, the analysis module comprises:
an acquisition unit, used for starting the machine-vision monitoring equipment preset on the robot and collecting information about the robot's surrounding environment, wherein the environment information comprises at least 3D road-condition information and obstacle information;
a deduction unit, used for performing behavior simulation in a behavior simulation model preset on the robot using the environment information and the request command, and for deducing the dynamic environment of the robot's work;
an analysis unit, used for acquiring and analyzing the state information of the robot in the dynamic environment.
As an embodiment of this technical solution, the deduction unit comprises:
a first computing subunit, used for calculating a target constraint value of the environment from the environment information;
a second computing subunit, used for inserting the target constraint value into the behavior simulation model preset on the robot via the request command, and for calculating the constrained pose under the target constraint value;
a deduction subunit, used for performing behavior simulation in the behavior simulation model preset on the robot based on the robot's constrained pose, and for deducing the dynamic environment of the robot's work.
As an embodiment of this technical solution, the calculation module comprises:
an acquisition unit, used for inputting the request command and the state information into the behavior control model and acquiring the robot's behavior path and target execution behavior, wherein the behavior path comprises at least a movement direction and a movement speed;
a weight-calculation unit, used for inputting the behavior path and the target execution behavior into the behavior control model, calculating the robot's behavior weights, and determining the behavior weights;
an execution-behavior unit, used for acquiring the minimum behavior weight and determining the corresponding minimum-cost execution behavior.
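The minimum-weight selection performed by the weight-calculation and execution-behavior units can be sketched as follows. This is an illustrative assumption only: the patent states that a behavior path (direction, speed) and a target execution behavior are mapped to a behavior weight and that the minimum-weight behavior is executed, but it does not disclose the weighting function, so the toy function below is invented for the sketch.

```python
# Hypothetical sketch of the weight-calculation and execution-behavior units.
# The weighting rule (penalizing turning, speed, and grasping) is an assumed
# stand-in for the undisclosed behavior control model.

def behavior_weight(path, target_behavior):
    """Toy behavior weight: straighter, slower paths cost less."""
    direction_penalty = abs(path["direction_deg"]) / 90.0
    speed_penalty = path["speed_mps"] / 2.0
    action_penalty = 0.5 if target_behavior == "grasp" else 0.0
    return direction_penalty + speed_penalty + action_penalty

def min_cost_behavior(candidates):
    """Execution-behavior unit: pick the (path, behavior) pair whose
    behavior weight is minimal, i.e. the minimum-cost execution behavior."""
    return min(candidates, key=lambda c: behavior_weight(c[0], c[1]))

candidates = [
    ({"direction_deg": 0, "speed_mps": 1.0}, "move"),
    ({"direction_deg": 45, "speed_mps": 0.5}, "move"),
    ({"direction_deg": 0, "speed_mps": 0.5}, "grasp"),
]
best_path, best_behavior = min_cost_behavior(candidates)
```

Under this toy weighting the straight path at 1.0 m/s scores 0.5, beating both alternatives at 0.75, so it is returned as the minimum-cost behavior.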
This technical solution provides a method of receiving a request command sent by a client, with the following specific steps.
Step A1: the robot performs data analysis on the received data using formula (1) to judge whether it is a request command sent by the client,
where E denotes the state value of the received data; D₁₆ denotes the hexadecimal form of the received data; A₁₆ denotes the hexadecimal form of the preset standard frame header of a request command sent by the client; W₁₆ denotes the hexadecimal form of the preset standard frame tail of a request command sent by the client; >> denotes a right shift; << denotes a left shift; len() denotes the total number of bits of the data in brackets; and f{} denotes a function whose value is 1 if every expression inside the braces holds, and 0 if any expression does not hold.
If E = 1, the received data is a request command sent by the client;
if E = 0, the received data is not a request command sent by the client.
step A2: if the received data is a request command sent by the client, performing data verification on the command by using a formula (2)
Wherein J represents a data verification state value when the received data is a request command sent by the client; d (D) 16 {len(A 16 )+1→[len(D 16 )-len(W 16 )]Data D is represented by } 16 Middle len (A) 16 ) +1 to [ len (D) 16 )-len(W 16 )]16-ary data on bits; s is S 16 Representing data D 16 Eliminating 16-system data after the head and the tail of the frame; sum (S) 16 >>2) Representing a pair of 16-ary data S 16 >>2 each bit value is added;
step A3: controlling the type of the returned data according to the received data state and the data verification state when the received data is the request command sent by the client by using the formula (3)
Wherein H represents a type control value of the return data;
if h=1, it indicates that the type of the control return data is a received data format error;
if h=2, the type of the control return data is that the received data has a data anomaly;
if h=3, it indicates that the type of the control return data is successful;
Start the machine-vision monitoring equipment preset on the robot, and collect and analyze the state information of the robot in the dynamic environment.
Input the request command and the state information into the behavior control model, calculate the robot's behavior weights, and obtain the minimum-cost execution behavior.
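Steps A1 to A3 above can be sketched as a three-state frame check. Note that formulas (1)-(3) are not reproduced in this text (they appear only as figures in the original), so the concrete rules below are assumptions made for illustration: the header and tail values, the string-based length handling, and the checksum rule (summing the hex digits of the body right-shifted by two bits against a stored expected value) are all invented; only the overall structure (header/tail match, then body verification, then the H = 1/2/3 reply) follows the text.

```python
# Hypothetical sketch of steps A1-A3. HEADER, TAIL, and the checksum rule
# are assumptions, not the patent's actual formulas (1)-(3).

HEADER = "AA55"   # assumed preset standard frame header (hex)
TAIL = "0D0A"     # assumed preset standard frame tail (hex)

def is_request_command(frame_hex):
    """Step A1: E = 1 iff the frame carries the preset header and tail."""
    return frame_hex.startswith(HEADER) and frame_hex.endswith(TAIL)

def verify_body(frame_hex, expected_checksum):
    """Step A2: verify the body between header and tail. The rule used
    here (digit sum of the body shifted right by 2 bits) is assumed."""
    body = frame_hex[len(HEADER):len(frame_hex) - len(TAIL)]
    shifted = int(body, 16) >> 2
    digit_sum = sum(int(d, 16) for d in format(shifted, "x"))
    return digit_sum == expected_checksum

def reply_type(frame_hex, expected_checksum):
    """Step A3: H = 1 format error, H = 2 data anomaly, H = 3 success."""
    if not is_request_command(frame_hex):
        return 1
    if not verify_body(frame_hex, expected_checksum):
        return 2
    return 3

frame = HEADER + "1F2C" + TAIL
status = reply_type(frame, expected_checksum=30)
```

For this example frame the body 0x1F2C shifted right by two bits is 0x7CB, whose hex digits sum to 30, so the check passes and H = 3 is returned.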
As an embodiment of this technical solution, receiving the request command sent by the client comprises:
starting the machine-vision monitoring equipment preset on the robot and collecting information about the robot's surrounding environment, wherein the environment information comprises at least 3D road-condition information and obstacle information;
performing behavior simulation in the behavior simulation model preset on the robot using the environment information and the request command, and deducing the dynamic environment of the robot's work;
acquiring and analyzing the state information of the robot in the dynamic environment.
As an embodiment of the present disclosure, performing behavior simulation in the behavior simulation model preset on the robot using the environment information and the request command, and deducing the dynamic environment of the robot's work, comprises:
calculating a target constraint value of the environment from the environment information;
inserting the target constraint value into the behavior simulation model preset on the robot via the request command, and calculating the constrained pose under the target constraint value;
based on the robot's constrained pose, performing behavior simulation in the behavior simulation model preset on the robot, and deducing the dynamic environment of the robot's work.
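The three sub-steps above (target constraint value, then constrained pose, then simulation of the dynamic environment) can be sketched as below. The patent does not specify how any of these quantities are computed, so the distance-based constraint, the stopping-distance speed clamp, and the roll-forward simulation are illustrative assumptions only.

```python
# Hypothetical sketch of the deduction unit's three sub-steps. All concrete
# rules here are assumed; the patent names the steps but not their math.

def target_constraint_value(environment):
    """First computing subunit: derive a scalar constraint from the 3D
    road-condition/obstacle information (here: nearest obstacle distance)."""
    return min(o["distance_m"] for o in environment["obstacles"])

def constrained_pose(request, constraint_value):
    """Second computing subunit: clamp the requested speed so the robot
    stays within the constraint distance (toy rule: half the distance)."""
    max_speed = min(request["speed_mps"], constraint_value / 2.0)
    return {"direction_deg": request["direction_deg"], "speed_mps": max_speed}

def deduce_dynamic_environment(pose, steps=3):
    """Deduction subunit: roll the constrained pose forward to simulate
    the dynamic environment of the robot's work."""
    return [{"t": t, "travelled_m": pose["speed_mps"] * t}
            for t in range(1, steps + 1)]

env = {"obstacles": [{"distance_m": 2.0}, {"distance_m": 4.5}]}
pose = constrained_pose({"direction_deg": 10, "speed_mps": 1.5},
                        target_constraint_value(env))
timeline = deduce_dynamic_environment(pose)
```

With the nearest obstacle at 2.0 m, the requested 1.5 m/s is clamped to 1.0 m/s before the simulated roll-out.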
As an embodiment of this technical solution, inputting the request command and the state information into the behavior control model to calculate the robot's behavior weights, and obtaining the minimum-cost execution behavior, comprises:
inputting the request command and the state information into the behavior control model to acquire the robot's behavior path and target execution behavior, wherein the behavior path comprises at least a movement direction and a movement speed;
inputting the behavior path and the target execution behavior into the behavior control model, calculating the robot's behavior weights, and determining the behavior weights;
acquiring the minimum behavior weight, and determining the corresponding minimum-cost execution behavior.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
The technical scheme of the invention is further described in detail through the drawings and the embodiments.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention. In the drawings:
FIG. 1 is a block diagram of a robot control system based on machine vision in accordance with an embodiment of the present invention;
FIG. 2 is a block diagram of a robot control system based on machine vision in accordance with an embodiment of the present invention;
FIG. 3 is a block diagram of a robot control system based on machine vision according to an embodiment of the present invention.
Detailed Description
The preferred embodiments of the present invention will be described below with reference to the accompanying drawings, it being understood that the preferred embodiments described herein are for illustration and explanation of the present invention only, and are not intended to limit the present invention.
Example 1:
As shown in FIG. 1, this technical solution provides a machine-vision-based robot control system, comprising:
a receiving module, used for receiving a request command sent by a client;
an analysis module, used for starting machine-vision monitoring equipment preset on the robot, and for collecting and analyzing state information of the robot in a dynamic environment;
and a calculation module, used for inputting the request command and the state information into a behavior control model, calculating the robot's behavior weights, and obtaining the execution behavior with the minimum cost.
The working principle and beneficial effects of this technical solution are as follows:
this technical solution provides a machine-vision-based robot control system comprising: a receiving module, used for receiving a request command sent by a client; an analysis module, used for starting machine-vision monitoring equipment preset on the robot (the vision monitoring equipment is generally a camera device) and for collecting and analyzing state information of the robot in a dynamic environment; and a calculation module, used for inputting the request command and the state information into a behavior control model, calculating the robot's behavior weights, and obtaining the minimum-cost execution behavior, thereby improving the robot's working efficiency.
Example 2:
As shown in FIG. 2, the present disclosure provides an embodiment in which the analysis module comprises:
an acquisition unit, used for starting the machine-vision monitoring equipment preset on the robot and collecting information about the robot's surrounding environment, wherein the environment information comprises at least 3D road-condition information and obstacle information;
a deduction unit, used for performing behavior simulation in the behavior simulation model preset on the robot using the environment information and the request command, and for deducing the dynamic environment of the robot's work;
an analysis unit, used for acquiring and analyzing the state information of the robot in the dynamic environment.
The working principle and beneficial effects of this technical solution are as follows:
the analysis module of this technical solution comprises an acquisition unit, used for starting the machine-vision monitoring equipment preset on the robot and collecting information about the robot's surrounding environment; the environment information comprises at least 3D road-condition information and obstacle information, so that an optimal running route for the robot can be calculated. The deduction unit deduces the dynamic environment of the robot's work, and the analysis unit acquires and analyzes the state information of the robot in that environment, making it convenient for the robot to learn while acting.
Example 3:
As shown in the embodiment of FIG. 3, the deduction unit comprises:
a first computing subunit, used for calculating a target constraint value of the environment from the environment information;
a second computing subunit, used for inserting the target constraint value into the behavior simulation model preset on the robot via the request command, and for calculating the constrained pose under the target constraint value;
a deduction subunit, used for performing behavior simulation in the behavior simulation model preset on the robot based on the robot's constrained pose, and for deducing the dynamic environment of the robot's work.
The working principle and beneficial effects of this technical solution are as follows:
the deduction unit of this technical solution comprises: a first computing subunit, used for calculating a target constraint value of the environment from the environment information, so that the environment's constraints on the robot's behavior can be computed numerically; a second computing subunit, used for inserting the target constraint value into the behavior simulation model preset on the robot via the request command and calculating the constrained pose under the target constraint value; and a deduction subunit, used for performing behavior simulation in the behavior simulation model preset on the robot based on the robot's constrained pose, and for deducing the dynamic environment of the robot's work, so as to optimize the robot's learning behavior.
Example 4:
This technical solution provides an embodiment in which the calculation module comprises:
an acquisition unit, used for inputting the request command and the state information into the behavior control model and acquiring the robot's behavior path and target execution behavior, wherein the behavior path comprises at least a movement direction and a movement speed;
a weight-calculation unit, used for inputting the behavior path and the target execution behavior into the behavior control model, calculating the robot's behavior weights, and determining the behavior weights;
an execution-behavior unit, used for acquiring the minimum behavior weight and determining the corresponding minimum-cost execution behavior.
The working principle and beneficial effects of this technical solution are as follows:
the calculation module of this technical solution comprises: an acquisition unit, used for inputting the request command and the state information into the behavior control model and acquiring the robot's behavior path and target execution behavior; the behavior path comprises at least a movement direction and a movement speed, which optimizes the behavior control of the robot and improves its optimized route. The weight-calculation unit inputs the behavior path and the target execution behavior into the behavior control model, calculates the robot's behavior weights, and determines them; the execution-behavior unit acquires the minimum behavior weight and determines the corresponding minimum-cost execution behavior.
Example 5:
This technical solution provides a machine-vision-based robot control method, which comprises the following steps.
Receive a request command sent by a client; the specific steps are as follows.
Step A1: the robot performs data analysis on the received data using formula (1) to judge whether it is a request command sent by the client,
where E denotes the state value of the received data; D₁₆ denotes the hexadecimal form of the received data; A₁₆ denotes the hexadecimal form of the preset standard frame header of a request command sent by the client; W₁₆ denotes the hexadecimal form of the preset standard frame tail of a request command sent by the client; >> denotes a right shift; << denotes a left shift; len() denotes the total number of bits of the data in brackets; and f{} denotes a function whose value is 1 if every expression inside the braces holds, and 0 if any expression does not hold.
If E = 1, the received data is a request command sent by the client;
if E = 0, the received data is not a request command sent by the client.
Step A2: if the received data is a request command sent by the client, perform data verification on the command using formula (2),
where J denotes the data-verification state value when the received data is a request command sent by the client; D₁₆{len(A₁₆)+1 → [len(D₁₆)−len(W₁₆)]} denotes the hexadecimal data of D₁₆ from bit len(A₁₆)+1 to bit [len(D₁₆)−len(W₁₆)]; S₁₆ denotes the hexadecimal data of D₁₆ after the frame header and tail are removed; and sum(S₁₆>>2) denotes the sum of each digit of the hexadecimal data S₁₆>>2.
Step A3: using formula (3), control the type of the returned data according to the received-data state and the data-verification state when the received data is a request command sent by the client,
where H denotes the type-control value of the returned data.
If H = 1, the returned data reports that the received data has a format error;
if H = 2, the returned data reports that the received data contains a data anomaly;
if H = 3, the returned data reports that reception was successful.
Start the machine-vision monitoring equipment preset on the robot, and collect and analyze the state information of the robot in the dynamic environment.
Input the request command and the state information into the behavior control model, calculate the robot's behavior weights, and obtain the minimum-cost execution behavior.
The working principle and beneficial effects of this technical solution are as follows:
this technical solution provides a machine-vision-based robot control method. A request command sent by a client is received. The machine-vision monitoring equipment preset on the robot (generally a camera device) is started, and the state information of the robot in a dynamic environment is collected and analyzed. Formula (1) of step A1 performs data analysis on the received data to judge whether it is a request command sent by the client, so that the format state of the received command is known, which helps to classify the data intelligently and effectively and facilitates its subsequent processing. Formula (2) of step A2 then performs data verification on the command, revealing whether the received data is abnormal and ensuring the reliability and accuracy of the data the robot receives. Finally, formula (3) of step A3 controls the type of the returned data according to the received-data state and the data-verification state, so that the robot returns data of the corresponding type for each state of the received data; this facilitates further operations on the robot, makes its internal condition easy to determine, and allows subsequent steps and measures to proceed efficiently.
The request command and the state information are input into the behavior control model to calculate the robot's behavior weights and obtain the minimum-cost execution behavior, thereby improving the robot's working efficiency.
Example 6:
This technical solution provides an embodiment in which receiving the request command sent by the client comprises:
starting the machine-vision monitoring equipment preset on the robot and collecting information about the robot's surrounding environment, wherein the environment information comprises at least 3D road-condition information and obstacle information;
performing behavior simulation in the behavior simulation model preset on the robot using the environment information and the request command, and deducing the dynamic environment of the robot's work;
acquiring and analyzing the state information of the robot in the dynamic environment.
The working principle and beneficial effects of this technical solution are as follows:
in this technical solution, the machine-vision monitoring equipment preset on the robot is started and information about the robot's surrounding environment is collected. The environment information comprises at least 3D road-condition information and obstacle information, which makes it convenient to calculate an optimal running route for the robot. Behavior simulation is performed in the behavior simulation model preset on the robot using the environment information and the request command, and the dynamic environment of the robot's work is deduced, strengthening the robot's ability to learn optimally. The state information of the robot in the dynamic environment is then acquired and analyzed, making it convenient for the robot to learn while acting.
Example 7:
the present technical solution provides an embodiment, where the performing, by using the environmental information and the request command, a behavior simulation in a behavior simulation model preset by the robot, and deducting a dynamic environment of the robot work includes:
calculating a target constraint value of the environment through the environment information;
inserting the target constraint value into a behavior simulation model preset by the robot through the request command to calculate the constraint gesture under the target constraint value;
based on the constraint gesture of the robot, performing behavior simulation in a behavior simulation model preset by the robot, and deducting the dynamic environment of the robot work.
The working principle and beneficial effects of this technical solution are as follows:
in this technical solution, the target constraint value of the environment is calculated from the environment information, so that the environment's constraints on the robot's behavior can be computed numerically, and the constrained pose under the target constraint value is calculated in the behavior simulation model preset on the robot via the request command. Based on the robot's constrained pose, behavior simulation is performed in the behavior simulation model preset on the robot, and the dynamic environment of the robot's work is deduced, so as to optimize the robot's learning behavior.
Example 8:
the technical solution provides an embodiment, the step of inputting the request command and the state information into a behavior control model to calculate the behavior weight of the robot, and the step of obtaining the execution behavior with the minimum cost includes:
inputting the request command and the state information into a behavior control model to acquire a behavior path and a target execution behavior of the robot; wherein,,
the behavior path at least comprises a moving direction and a moving speed;
inputting the behavior path and the target execution behavior into a behavior control model to calculate the behavior weight of the robot and determine the behavior weight;
and acquiring the minimum behavior weight, and determining the corresponding execution behavior with the minimum cost.
The working principle and beneficial effects of this technical solution are as follows:
in this technical solution, the request command and the state information are input into the behavior control model, and the robot's behavior path and target execution behavior are acquired. The behavior path comprises at least a movement direction and a movement speed, which optimizes the behavior control of the robot and improves its optimized route. The behavior path and the target execution behavior are then input into the behavior control model, the robot's behavior weights are calculated and determined, the minimum behavior weight is acquired, and the corresponding minimum-cost execution behavior is determined.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, magnetic disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (8)

1. A machine vision-based robotic control system, comprising:
a receiving module, configured to receive a request command sent by a client;
an analysis module, configured to start a machine vision monitoring device preset on the robot, and to collect and analyze state information of the robot in a dynamic environment;
and a calculation module, configured to input the request command and the state information into a behavior control model, calculate behavior weights of the robot, and obtain the minimum-cost execution behavior.
2. The machine vision-based robotic control system of claim 1, wherein the analysis module comprises:
an acquisition unit, configured to start the machine vision monitoring device preset on the robot and collect environment information around the robot; wherein
the environment information comprises at least 3D road condition information and obstacle information;
a deduction unit, configured to perform behavior simulation in a behavior simulation model preset on the robot using the environment information and the request command, and to deduce the dynamic environment of the robot's work;
and an analysis unit, configured to acquire and analyze the state information of the robot in the dynamic environment.
3. The machine vision based robotic control system of claim 2, wherein the deduction unit comprises:
a first calculation subunit, configured to calculate a target constraint value of the environment from the environment information;
a second calculation subunit, configured to insert the target constraint value into the behavior simulation model preset on the robot according to the request command, and to calculate the constrained pose under the target constraint value;
and a deduction subunit, configured to perform behavior simulation in the behavior simulation model preset on the robot based on the constrained pose of the robot, and to deduce the dynamic environment of the robot's work.
4. The machine vision-based robotic control system of claim 1, wherein the computing module comprises:
an acquisition unit, configured to input the request command and the state information into the behavior control model and acquire a behavior path and a target execution behavior of the robot; wherein
the behavior path comprises at least a moving direction and a moving speed;
a weight calculation unit, configured to input the behavior path and the target execution behavior into the behavior control model, calculate the behavior weights of the robot, and determine the behavior weights;
and an execution behavior unit, configured to obtain the minimum behavior weight and determine the corresponding minimum-cost execution behavior.
5. A robot control method based on machine vision, comprising:
receiving a request command sent by a client, which specifically comprises:
Step A1: the robot analyzes the received data using formula (1) to determine whether the received data is a request command sent by the client;
wherein E denotes the received-data state value; D16 denotes the hexadecimal form of the received data; A16 denotes the hexadecimal form of the preset standard frame header of a request command sent by the client; W16 denotes the hexadecimal form of the preset standard frame tail of a request command sent by the client; >> denotes a right shift; << denotes a left shift; len() denotes the total number of bits of the data in brackets; F{} denotes a function whose value is 1 if every expression in the braces holds, and 0 if any expression in the braces does not hold;
if e=1, it indicates that the received data is a request command sent by the client;
if e=0, it indicates that the received data is not the request command sent by the client;
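Formula (1) itself is not reproduced in this text, but the header/tail check it performs (per the symbol definitions above) can be illustrated with a small sketch; the frame-header and frame-tail values below are hypothetical:

```python
# Illustrative check corresponding to step A1: E = 1 only when the received
# hex data D16 begins with the preset standard frame header A16 and ends with
# the preset standard frame tail W16. Header/tail byte values are assumptions.

A16 = "AA55"   # hypothetical preset frame header (hex)
W16 = "0D0A"   # hypothetical preset frame tail (hex)

def frame_state(d16: str) -> int:
    """Return E: 1 if d16 carries the standard header and tail, else 0."""
    long_enough = len(d16) >= len(A16) + len(W16)
    ok = long_enough and d16.startswith(A16) and d16.endswith(W16)
    return 1 if ok else 0

E = frame_state("AA55010203040D0A")   # well-formed frame, so E = 1
```

The shift operations in the patent's symbol list serve the same purpose: isolating the leading and trailing portions of D16 for comparison against A16 and W16.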
step A2: if the received data is a request command sent by the client, performing data verification on the command by using a formula (2)
wherein J denotes the data-verification state value when the received data is a request command sent by the client; D16{len(A16)+1→[len(D16)−len(W16)]} denotes the hexadecimal data of D16 from bit len(A16)+1 to bit [len(D16)−len(W16)]; S16 denotes the hexadecimal data of D16 with the frame header and frame tail removed; sum(S16>>2) denotes the sum of the values of each bit of the hexadecimal data S16>>2;
step A3: controlling the type of the returned data according to the received data state and the data verification state when the received data is the request command sent by the client by using the formula (3)
wherein H denotes the type control value of the returned data;
if H=1, the returned data indicates a format error in the received data;
if H=2, the returned data indicates a data anomaly in the received data;
if H=3, the returned data indicates success;
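Formulas (2) and (3) are likewise not reproduced here. Their roles, per the definitions above, can be illustrated as follows; the digit-sum checksum rule is an invented stand-in for formula (2), and the header/tail values are the same hypothetical ones as in the step A1 sketch:

```python
# Illustrative sketch of steps A2 and A3: verify the payload S16 (D16 with
# frame header and tail removed) against a checksum, then map the (E, J)
# states to the return-data type H. The checksum rule is an assumption.

A16, W16 = "AA55", "0D0A"   # hypothetical header/tail (hex)

def payload(d16: str) -> str:
    """S16: the hex data with the frame header and tail removed."""
    return d16[len(A16):len(d16) - len(W16)]

def verify(d16: str, expected: int) -> int:
    """Return J: 1 if the digit-sum check on the payload passes, else 0."""
    s16 = payload(d16)
    return 1 if sum(int(c, 16) for c in s16) == expected else 0

def return_type(e: int, j: int) -> int:
    """H per step A3: 1 = format error, 2 = data anomaly, 3 = success."""
    if e == 0:
        return 1          # received data is not a valid request command
    return 3 if j == 1 else 2
```

For the sample frame `"AA55010203040D0A"`, the payload is `"01020304"` and its hex-digit sum is 10, so a matching expected checksum yields J=1 and, with E=1, H=3 (success).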
starting a machine vision monitoring device preset by the robot, and collecting and analyzing state information of the robot in a dynamic environment;
and inputting the request command and the state information into a behavior control model to calculate the behavior weight of the robot, and acquiring the execution behavior with the minimum cost.
6. The machine vision-based robot control method of claim 5, wherein the starting a machine vision monitoring device preset by the robot, and collecting and analyzing state information of the robot in a dynamic environment, comprises:
starting the machine vision monitoring device preset on the robot, and collecting environment information around the robot; wherein
the environment information comprises at least 3D road condition information and obstacle information;
performing behavior simulation in a behavior simulation model preset on the robot using the environment information and the request command, and deducing the dynamic environment of the robot's work;
and acquiring and analyzing the state information of the robot in the dynamic environment.
7. The machine vision-based robot control method of claim 6, wherein the performing behavior simulation in a behavior simulation model preset on the robot using the environment information and the request command, and deducing the dynamic environment of the robot's work, comprises:
calculating a target constraint value of the environment through the environment information;
inserting the target constraint value into the behavior simulation model preset on the robot according to the request command, and calculating the constrained pose under the target constraint value;
and performing behavior simulation in the behavior simulation model preset on the robot based on the constrained pose of the robot, and deducing the dynamic environment of the robot's work.
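One possible reading of claim 7's pipeline can be sketched in a one-dimensional toy form; the clamp-style constraint and the straight-line motion model are invented for illustration, since the patent does not specify the constraint form:

```python
# Hypothetical sketch of claim 7: derive a target constraint value from the
# environment information, constrain each robot pose with it, then step a
# simple simulation to deduce the trajectory. All models are assumptions.

def target_constraint(obstacle_distances):
    """Target constraint value of the environment: here, the nearest
    obstacle distance, which bounds any single movement step."""
    return min(obstacle_distances)

def constrained_pose(x, step, limit):
    """Constrained pose: the next position, with the step clamped so it
    never exceeds the constraint limit."""
    return x + min(step, limit)

def simulate(x0, steps, obstacle_distances):
    """Deduce the robot's trajectory under the constrained poses."""
    limit = target_constraint(obstacle_distances)
    traj = [x0]
    for s in steps:
        traj.append(constrained_pose(traj[-1], s, limit))
    return traj

# Obstacles at 1.0 and 3.5 units: every step is clamped to at most 1.0.
path = simulate(0.0, [0.5, 2.0, 0.5], [1.0, 3.5])
```

Here the requested 2.0-unit step is clamped to 1.0 by the target constraint value, mirroring how a constrained pose keeps the simulated behavior within the deduced dynamic environment.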
8. The machine vision-based robot control method of claim 5, wherein the inputting the request command and the state information into the behavior control model to calculate behavior weights of the robot, and obtaining the minimum-cost execution behavior, comprises:
inputting the request command and the state information into the behavior control model to acquire a behavior path and a target execution behavior of the robot; wherein
the behavior path comprises at least a moving direction and a moving speed;
inputting the behavior path and the target execution behavior into the behavior control model to calculate the behavior weights of the robot, and determining the behavior weights;
and obtaining the minimum behavior weight, and determining the corresponding minimum-cost execution behavior.
CN202310729255.0A 2023-06-20 2023-06-20 Robot control system and method based on machine vision Pending CN116713994A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310729255.0A CN116713994A (en) 2023-06-20 2023-06-20 Robot control system and method based on machine vision


Publications (1)

Publication Number Publication Date
CN116713994A 2023-09-08

Family

ID=87871232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310729255.0A Pending CN116713994A (en) 2023-06-20 2023-06-20 Robot control system and method based on machine vision

Country Status (1)

Country Link
CN (1) CN116713994A (en)

Similar Documents

Publication Publication Date Title
CN109159113B (en) Robot operation method based on visual reasoning
CN109807882A (en) Holding system, learning device and holding method
CN109693146B (en) Service life estimation device and machine learning device
CN105082132A (en) Rapid robotic imitation learning of force-torque tasks
JP6977686B2 (en) Control system and control unit
CN113910218B (en) Robot calibration method and device based on kinematic and deep neural network fusion
CN113074959B (en) Automatic driving system test analysis method
CN114355953A (en) High-precision control method and system of multi-axis servo system based on machine vision
CN111340834B (en) Lining plate assembly system and method based on laser radar and binocular camera data fusion
CN117572863A (en) Path optimization method and system for substation robot
CN117067261A (en) Robot monitoring method, device, equipment and storage medium
CN116713994A (en) Robot control system and method based on machine vision
CN116690988A (en) 3D printing system and method for large building model
CN109447235A (en) Feed system model training neural network based and prediction technique and its system
US20200202178A1 (en) Automatic visual data generation for object training and evaluation
CN112379656A (en) Processing method, device, equipment and medium for detecting abnormal data of industrial system
CN114211173B (en) Method, device and system for determining welding position
CN116197918B (en) Manipulator control system based on action record analysis
CN117381805B (en) Mechanical arm operation control method and system for conflict handling
CN113850929B (en) Display method, device, equipment and medium for processing annotation data stream
CN116604535B (en) Teaching system and method applied to modular design of robot industrial chain
Lai et al. Active Data Acquisition in Autonomous Driving Simulation
CN115861592A (en) Speed precision optimization method and system of action capture system based on neural network
JP6731603B1 (en) Inspection system
CN114545936A (en) Robot path optimization method and system based on obstacle avoidance planning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination