CN115519546A - Space and ground collaborative space scientific experiment robot based on intelligent vision - Google Patents

Space and ground collaborative space scientific experiment robot based on intelligent vision Download PDF

Info

Publication number
CN115519546A
CN115519546A CN202211309652.4A CN202211309652A CN115519546A CN 115519546 A CN115519546 A CN 115519546A CN 202211309652 A CN202211309652 A CN 202211309652A CN 115519546 A CN115519546 A CN 115519546A
Authority
CN
China
Prior art keywords
space
sample box
module
representing
material science
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211309652.4A
Other languages
Chinese (zh)
Other versions
CN115519546B (en
Inventor
于强
鲁鹏飞
于泽华
刘晓珂
任俊竹
戴宏伟
霍晓智
李欣泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Space Science Center of CAS
Original Assignee
National Space Science Center of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Space Science Center of CAS filed Critical National Space Science Center of CAS
Priority to CN202211309652.4A priority Critical patent/CN115519546B/en
Publication of CN115519546A publication Critical patent/CN115519546A/en
Application granted granted Critical
Publication of CN115519546B publication Critical patent/CN115519546B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1612Programme controls characterised by the hand, wrist, grip control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1689Teleoperation
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Analysing Materials By The Use Of Radiation (AREA)

Abstract

The invention relates to a world collaborative space science experiment robot based on intelligent vision, which comprises a space material science experiment sample box vision positioning module, a space material science experiment sample box grabbing module, a space material science experiment operation instruction database module and a world collaborative operation module; the visual positioning module is used for acquiring the pose information of the door handles of the batch sample management module, the space material science experiment sample boxes and the sample box handles in the X-ray transmission imaging module in the high-temperature cabinet and sending the pose information to the space material experiment sample box grabbing module; the grabbing module is used for planning a path according to the pose information and executing a sample box replacing task; the operation instruction database module is used for storing related instruction data by using a database; and the space and ground cooperative operation module is used for calling data stored in the space material science experiment operation instruction database module and issuing related instructions of a sample box replacing task to the space material science experiment sample box grabbing module.

Description

Space and ground collaborative space scientific experiment robot based on intelligent vision
Technical Field
The invention relates to the field of developing space science experiments on a space station in China, in particular to a space and ground collaborative space science experiment robot based on intelligent vision.
Background
A high-temperature material scientific experimental cabinet (high-temperature cabinet) of a space station in China is to develop scientific research on space materials by using advanced technologies such as a sample experiment on-line measurement technology and an X-ray perspective imaging technology. The high-temperature cabinet supports the scientific experiment of melt growth and solidification of various materials.
The high temperature cabinet can be loaded with 16 samples per batch. After all 16 samples were tested, they needed to be replaced by an astronaut in a batch.
The high temperature cabinet is therefore faced with the following problems during the in-track experiment:
(1) Replacement of entire batch of samples during the track. Astronauts require long training on the ground and various operations in space to accomplish such tasks.
(2) The problem of avoiding X-ray experimental radiation. In case of its power-on, the astronaut cannot access it. Thus, many real-time online experimental operations cannot be done.
(3) The improvement of the overall automation and the intelligent degree of the experimental equipment. The space station of China is provided with 13 scientific experiment cabinets, and relates to a plurality of disciplines such as space materials, space fluids, space combustion, space physics, space life and the like. Astronauts have many missions and limited number of people, and the experiment task is heavy, so that the workload of the astronauts needs to be reduced so as to carry out more space science experiments.
The space swordsman can replace an astronaut to efficiently and safely complete a space-burdensome experiment task, the safety of the astronaut is guaranteed not to be influenced by dangerous experiment processes such as high temperature and radiation, the efficiency of a space science experiment is improved, and the cost of the space science experiment is reduced. The experimental cabinet can be used for high-temperature material scientific experimental cabinets in space stations and can also be used for other experimental cabinets.
Disclosure of Invention
The invention aims to realize the replacement of a sample box in a high-temperature cabinet when developing advanced technology for developing space material scientific research in a space station high-temperature material scientific experiment cabinet (high-temperature cabinet), and provides a space and ground collaborative space scientific experiment robot based on intelligent vision.
In order to achieve the above purpose, the invention is realized by the following technical scheme.
The invention provides a space-ground collaborative space scientific experimental robot based on intelligent vision, which is used for completing the replacement of an experimental material sample box in a high-temperature cabinet of a space station on a rail; the robot includes: the space material science experiment sample box comprises a space material science experiment sample box visual positioning module, a space material science experiment sample box grabbing module, a space material science experiment operation instruction database module and a world cooperative operation module; wherein the content of the first and second substances,
the space material science experiment sample box visual positioning module is used for acquiring the position and posture information of door handles of batch sample management modules in the high-temperature cabinet, the space material science experiment sample box and sample box handles in the X-ray transmission imaging module by using a space science experiment image acquisition technology and sending the position and posture information to the space material experiment sample box grabbing module;
the space material science experiment sample box grabbing module is used for planning a path according to the pose information and executing a sample box replacing task;
the space material science experiment operation instruction database module is used for storing instruction data related to the robot by using a database;
the space and ground cooperative operation module is used for calling instruction data stored in the space material science experiment operation instruction database module and issuing related instructions of a sample box replacing task to the space material science experiment sample box grabbing module; and the robot module is also used for monitoring the working state of the robot module.
As an improvement of the above technical solution, the sample cartridge replacement task sequentially comprises the following steps:
grasping a door handle of the batch sample management module of the high-temperature cabinet, and opening the door of the batch sample management module of the high-temperature cabinet;
extracting the space material science experiment sample box in the high-temperature cabinet or the space material science experiment sample box in the X-ray transmission imaging module;
taking out the material sample box which has completed the space material science experiment;
replacing the new material sample cartridge;
a space material science experiment sample box inserted into the high-temperature cabinet;
and (4) grasping a door handle of the high-temperature cabinet batch sample management module, and closing the door of the high-temperature cabinet batch sample management module.
As an improvement of the above technical solution, the visual positioning module for a space material science experiment sample box comprises: the device comprises a space science experiment image sensor, a pose sensing component, a space material science experiment material sample box data analysis component and a space material science experiment material sample box pose data exchange component; wherein, the first and the second end of the pipe are connected with each other,
the space science experiment image sensor is used for acquiring three-dimensional information of a door handle in the high-temperature cabinet batch sample management module, a space material science experiment sample box and a sample box handle in the X-ray transmission imaging module by using a space science experiment image acquisition technology and transmitting the three-dimensional information to the pose sensing component;
the pose sensing component is used for converting three-dimensional information of a door handle in the high-temperature cabinet batch sample management module, a space material science experiment sample box and a sample box handle in the X-ray transmission imaging module into image information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box and the sample box handle in the X-ray transmission imaging module, and transmitting the image information to the space material science experiment material sample box data analysis component;
the space material science experiment material sample box data analysis component is used for analyzing image information transmitted by the space material science experiment sample box visual positioning module to obtain the position and posture information of a door handle in the high-temperature cabinet batch sample management module, a space material science experiment sample box and a sample box handle in the X-ray transmission imaging module;
the position and orientation data exchange component of the space material science experimental material sample box is used for transmitting the position and orientation information of a door handle in the high-temperature cabinet batch sample management module, the space material science experimental sample box and a sample box handle in the X-ray transmission imaging module to the world cooperative operation module.
As an improvement of the above technical solution, the obtaining of pose information by the visual positioning module for the space material science experiment sample box includes:
firstly calibrating a camera, and then calibrating a projector:
Figure BDA0003907488870000031
wherein u represents a coordinate of a horizontal axis in a pixel coordinate system, v represents a coordinate of a vertical axis in the pixel coordinate system, and Z c Representing object distance, A representing a camera reference matrix, R representing a camera rotation matrix, T representing a camera translation matrix, X representing an X-axis coordinate under a world coordinate system, Y representing a Y-axis coordinate under the world coordinate system, and Z representing a Z-axis coordinate under the world coordinate system;
performing two-dimensional Fourier transform:
Figure BDA0003907488870000032
f (u, v) represents a frequency domain image after Fourier transform, M represents a width of the image, N represents a height of the image, F (x, y) represents a time domain image before Fourier transform,
choosing a transfer function H (u, v):
Figure BDA0003907488870000033
wherein D (u, v) represents the distance of the point (u, v) from the center of Fourier transform, D 0 Represents the cut-off frequency;
performing two-dimensional inverse Fourier transform:
Figure BDA0003907488870000034
obtaining Gray code pixel coordinates by using an edge detection algorithm; wherein the content of the first and second substances,
world coordinates of the point P to be measured:
Figure BDA0003907488870000041
wherein, O c Denotes the origin, X, in the camera coordinate system woc Representing the x-coordinate, Y, of the origin in the camera coordinate system woc Representing the y coordinate of origin, Z, in a camera coordinate system woc Representing the z-coordinate of the origin, R, in a camera coordinate system c Representing a rotation matrix, t, in a coordinate system representing the camera c Representing a translation matrix, P, representing the camera coordinate system c Representing the coordinates of a point P, X in the camera coordinate system wpc X-coordinate, Y, representing point P in camera coordinate system wpc Y-coordinate, Z, representing point P in the camera coordinate system wpc Z-coordinate, x, representing point P in the camera coordinate system c Representing normalized x-coordinate, y, in a camera coordinate system c Representing the normalized y-coordinate, z, in the camera coordinate system c Denotes the normalized z-coordinate, O, in the camera coordinate system p Representing the origin, X, in the projector coordinate system wop Representing the x, Y coordinates of the origin in the projector coordinate system wop Representing the y-coordinate of the origin, Z, in the projector coordinate system wop Representing the z-coordinate of the origin, P, in the projector coordinate system p Representing the coordinate system of the projectorPoint P coordinate, X wpp X-coordinate, Y, representing the origin in the projector coordinate system wpp Y-coordinate, Z, representing the origin in the projector coordinate system wpp Z-coordinate, x, representing the origin in the projector coordinate system p X-coordinate, y, representing a point P in the projector coordinate system p Y-coordinate, z, representing a point P in the projector coordinate system p A z-coordinate representing a point P in the projector coordinate system;
the ICP point cloud registration is carried out, and the method comprises the following steps:
given two sets of point clouds X and P,
X=(x 1 ,x 2 ,…,x n ) (6)
P=(p 1 ,p 2 ,…,p n ) (7)
wherein n represents the total number of point clouds, x n Representing point clouds in a collection X, p n Representing a point cloud in the collection P;
solving R and T to minimize E (R, T),
Figure BDA0003907488870000042
wherein E (R, t) represents an error function, R represents a rotation matrix, t represents a translation matrix, x i Representing point clouds in a collection X, p i Representing a point cloud in the collection P;
obtaining the centroids of the two groups of point clouds:
Figure BDA0003907488870000051
Figure BDA0003907488870000052
obtaining coordinates of points in the two groups of point clouds with the centroid as an origin:
X′=x i -u x =x′ i (11)
P′=p i -u p =p′ i (12)
x ' represents the centroid coordinate, X ' of set X ' i Representing the centroid coordinates of set X; p ' represents the centroid coordinate, P ' of the set P ' i Representing the centroid coordinates of set P;
and acquiring omega and carrying out SVD (singular value decomposition) on the omega:
Figure BDA0003907488870000053
omega represents a matrix to be SVD decomposed, U represents an orthogonal matrix, T represents transposition, delta 1 Representing non-zero singular values, δ, of the matrix ω 2 Representing non-zero singular values, δ, of the matrix ω 3 Non-zero singular values representing a matrix ω;
the transformation relation between the two groups of point clouds is as follows:
R=VU T (14)
v denotes an orthogonal matrix, and U denotes an orthogonal matrix;
t=u x -Ru p (15)
t denotes a translation matrix.
As an improvement of the above technical solution, the space material science experiment sample box grabbing module includes: a mechanical arm and a gripper; the mechanical arm and the clamping hand are used for completing the task of replacing the sample box according to the acquired pose information and the planned path, and the method specifically comprises the following steps:
setting an initial position of the mechanical arm;
the position of a door handle in a batch sample management module of the high-temperature cabinet, the position of a sample box in a scientific experimental space material and the position of a sample box handle in an X-ray transmission imaging module are obtained through a visual positioning module of the sample box in the scientific experimental space material, and the position information is sent to a mechanical arm;
planning a path of the mechanical arm;
and the mechanical arm and the clamping hand are controlled to open the door in the batch sample management module of the high-temperature cabinet, so that the task of replacing a space material scientific experiment sample box is completed, or the task of replacing a sample box in the X-ray transmission imaging module is completed.
As an improvement of the above technical solution, the path planning of the mechanical arm includes:
inserting a middle point between the starting point and the end point by utilizing fifth-order polynomial interpolation, wherein the expression is as follows:
Figure BDA0003907488870000061
where θ (t) represents the angular displacement at time t,
Figure BDA0003907488870000062
which represents the angular velocity at the time t,
Figure BDA0003907488870000063
represents angular acceleration at time t, a 0 、a 1 、a 2 、a 3 、a 4 、a 5 Representing the correlation coefficient required to be solved in each formula;
and constraining the angular speed of the starting point and the stopping point, wherein the constraint condition meets the following formula:
Figure BDA0003907488870000064
wherein, θ (t) 0 ) Which represents the location of the start point and,
Figure BDA0003907488870000065
which is indicative of the speed of the starting point,
Figure BDA0003907488870000066
acceleration at starting point, t 0 As the starting point time, θ (t) f ) Indicating the location of the termination point,
Figure BDA0003907488870000067
the speed of the end point is indicated,
Figure BDA0003907488870000068
indicates the end-point acceleration, t f Representing the ending point time;
solving to obtain:
Figure BDA0003907488870000069
θ 0 which represents the starting position of the device,
Figure BDA00039074888700000610
which is indicative of the speed of the start-up,
Figure BDA00039074888700000611
denotes the initial acceleration, θ f The position of the termination is indicated and,
Figure BDA00039074888700000612
it is indicated that the speed of termination is,
Figure BDA00039074888700000613
indicating the ending acceleration.
As an improvement of the above technical solution, the space-ground cooperative operation module includes a robot operation module and a robot state monitoring module; wherein the content of the first and second substances,
the robot operation module is used for controlling the motors of the mechanical arms and the motors of the clamping hands, and controlling the corresponding mechanical arm motors to move and the clamping hands to finish clamping actions after receiving an operation instruction, so that a sample box replacing task is finished;
and the state monitoring module is used for monitoring the motion state and key parameters of the robot during the in-orbit period so as to judge the working state of the robot.
As an improvement of the above technical solution, the motor of the robot arm includes: the wrist training device comprises a base motor, a shoulder motor, an elbow motor, a first wrist motor, a second wrist motor and a third wrist motor;
the base motor is positioned at the bottommost layer of the mechanical arm and is used for controlling the whole mechanical arm to rotate on the horizontal plane;
the shoulder motor is positioned at the upper part of the base motor and is used for controlling the mechanical arm to rotate on a vertical surface;
the elbow motor is positioned in the middle of the mechanical arm and used for driving the first wrist motor, the second wrist motor and the third wrist motor to move back and forth;
the first wrist motor, the second wrist motor and the third wrist motor are used for controlling the clamping hands to rotate front and back and left and right, and further moving the clamping hands to the designated positions.
As an improvement of the above technical solution, the parameter table related to the robot includes: an experiment parameter table, an alarm log table, a machine parameter table, an action flow table, a replacement flow instruction table and a picture video storage table; wherein, the first and the second end of the pipe are connected with each other,
the database, comprising: the system comprises a service application, a client driver and an OpenGauss server; the service application supports the function of a world cooperative control module; the client driver is responsible for receiving an access request from the service application, returning an execution result to the application, communicating with the OpenGauss server, issuing SQL (structured query language) to be executed on the OpenGauss server and receiving the execution result; the OpenGauss server is responsible for storing service data, executing a data query task and returning an execution result to the client driver;
the experimental parameter table is used for storing the state information of the door handle in the time and high-temperature cabinet batch sample management module, the state information of the sample box handle in the space material science experimental sample box and the X-ray transmission imaging module, and simultaneously can also monitor the working state information in the experimental process in real time; the state information of door handle, space material science experiment sample box among the sample management module in batches of high temperature cabinet, sample box handle among X ray transmission imaging module includes: time, the opening and closing state of a door handle in a batch sample management module of the high-temperature cabinet, the sample replacement batch of a space material science experiment sample box, the number of samples which are not replaced and the sample replacement state; the working state information in the experimental process comprises: temperature, shell temperature, current and voltage in the high-temperature cabinet;
the alarm log table is used for recording the abnormal time, abnormal data value, expected data value and abnormal physical quantity information of the experimental data when some parameters exceed the specified range in the experimental process;
the robot parameter table is used for recording the current state of the mechanical arm motor and the state parameters of the robot; the state of the robot arm motor includes: the angle of rotation, the distance of movement and the power of the motor; the state parameters of the robot include: ambient temperature;
the action flow table is used for recording the action execution time, the executed action ID number, the object for executing the action, the parameters used in the action execution process, the duration for executing the action and the action execution state;
the replacing flow instruction list is used for storing a sample replacing flow in the experiment process, and sequencing a plurality of actions stored in the action flow list according to the execution time, so that the robot can sequentially execute related actions according to the action sequence in the replacing flow list to complete a task of replacing the sample;
the picture video storage table is used for storing videos and images shot by the visual positioning module of the space material science experiment sample box, pictures shot by the visual positioning module can be seen in real time in the experiment process, and experiment videos can also be seen after the experiment is finished.
Compared with the prior art, the invention has the advantages that:
1. according to the invention, a three-dimensional visual perception algorithm is adopted to obtain the spatial poses of the high-temperature cabinet and the samples in the space station, the spatial poses are input into the motion controller, the robot is driven to complete the replacement tasks of the whole batch of samples (the samples are replaced one by one), the online experiment operation is carried out by replacing an astronaut, the overall automation and intelligence level of the experiment equipment is improved, the workload of the astronaut can be reduced, and the space science experiment task can be rapidly and efficiently completed;
2. the invention can ensure the safety of astronauts, thoroughly avoid the astronauts from being influenced by dangerous experimental processes such as high temperature, X-ray radiation and the like, improve the efficiency of space science experiments and reduce the operation cost of the space science experiments; the robot 'astronaut' is used for carrying out auxiliary experiments, the operation time is short, the operation is stable, the two hands of an astronaut can be liberated, the time of the astronaut is saved, and the astronaut can put more time and energy on other important tasks;
3. the robot is mainly applied to the experiment process of developing space material science in a high-temperature material science experiment cabinet (high-temperature cabinet) of a space station in China, and can also be applied to the experiment process of other scientific experiment cabinets on the space station in China;
4. according to the invention, different databases such as instructions, processes, parameters and the like are arranged in the space material science experiment operation instruction database module, so that an astronaut can conveniently obtain instructions from the database required by operation, the robot can be accurately controlled, and the task of replacing a sample box in a space station is completed.
Drawings
FIG. 1 is a view of the constitution of the present invention;
FIG. 2 is a block diagram of a visual positioning module of a sample box for space materials science experiments according to the present invention;
FIG. 3 is a diagram of a model robot arm of the present invention;
FIG. 4 is a diagram of the database architecture of the present invention.
Detailed Description
The invention discloses a space-ground cooperative space science experiment robot based on intelligent vision, namely an astronaut robot, and relates to a method for automatically completing space station science experiment tasks (including the operations of changing experiment samples, changing experiment modules, carrying out experiments in an X-ray transmission imaging module and the like) by means of artificial intelligence and space-ground cooperative operation. The space and ground collaborative space science experiment robot based on intelligent vision comprises: the space material science experiment sample box system comprises a space material science experiment sample box visual positioning module, a space material experiment sample box grabbing module, an "astronaut" world-ground cooperative operation module and a space material science experiment operation instruction database module. The robot is mainly applied to the experiment process of developing space material science in a high-temperature material science experiment cabinet (high-temperature cabinet) of a space station in China, and can also be applied to the experiment process of other scientific experiment cabinets on the space station in China. The spatial position and pose of a high-temperature cabinet and a sample in a space station are obtained by adopting a three-dimensional visual perception algorithm, and are input into a motion controller to drive a robot to complete the task of replacing the whole batch of samples (replacing the samples one by one), so that the online experimental operation of astronauts is replaced, the overall automation and intelligence level of experimental equipment is improved, the workload of the astronauts can be reduced, and the task of a space science experiment can be completed quickly and efficiently. The invention designs the space science experiment assisting robot for assisting astronauts to efficiently and safely complete space heavy scientific experiment tasks, ensures the safety of the astronauts, thoroughly avoids the influence of dangerous experiment processes such as high temperature, X-ray radiation and the like on the astronauts, improves the efficiency of space science experiments, and reduces the operation cost of the space science experiments. The auxiliary experiment is carried out by utilizing the astronaut, the operation time is short, the operation is stable, the hands of the astronaut can be liberated, the time of the astronaut is saved, and the astronaut can put more time and energy on other important tasks.
The invention aims to provide a method for a space-ground collaborative space science experiment robot based on intelligent vision, which is an auxiliary robot capable of replacing astronauts to perform experiments in a space station:
in order to realize the purpose, the invention adopts the following technical scheme: as shown in fig. 1, a composition structure diagram of the robot of the present invention, the robot for scientific experiments in space and earth based on intelligent vision mainly comprises:
the space material science experiment sample box comprises a space material science experiment sample box visual positioning module, a space material science experiment sample box grabbing module, an astronaut heaven and earth cooperative operation module and a space material science experiment operation instruction database module; wherein, the first and the second end of the pipe are connected with each other,
the space material science experiment sample box visual positioning module is used for acquiring pose information of door handles of batch sample management modules in a high-temperature cabinet, sample box handles in a space material science experiment sample box and an X-ray transmission imaging module and the like, and sending the pose information to the space material experiment sample box grabbing module to complete a task of replacing a sample box;
the space material science experiment sample box grabbing module is used for executing a sample box replacing task and mainly comprises the steps of opening and closing a door handle of a batch sample management module of the high-temperature cabinet, drawing out and inserting a space material science experiment sample box in the high-temperature cabinet or a space material science experiment sample box in an X-ray transmission imaging module, and taking out a material sample box which has completed a space material science experiment, a new material sample box and the like;
the space and ground cooperative operation module is used for assisting ground operators to judge whether the current working state of the robot is good or not, taking out space and ground cooperative instructions from the database, sending the instructions to the robot on the space station through the space and ground instruction injection system, and verifying the instructions with the received 6-joint mechanical arm data mutually to provide visual space and ground cooperative space scientific experimental robot state information based on intelligent vision for the ground operators;
the space material science experiment operation instruction database module is used for storing an uplink injection operation instruction, a space and ground cooperative operation parameter, a space and ground cooperative space science experiment robot alarm log based on intelligent vision, a space high-temperature material science experiment process and a ground worker taking and checking conveniently, and also brings convenience for astronauts in the space station to monitor the space high-temperature material science experiment process and the working state of the space and ground cooperative space science experiment robot based on the intelligent vision in real time and on line.
Furthermore, the visual positioning module of the space material science experiment sample box comprises a pose sensing component, a space material science experiment material sample box data analysis component and a space material science experiment material sample box pose data exchange component;
furthermore, the pose sensing component is used for transmitting visual information of door handles in the batch sample management module of the high-temperature cabinet, sample box handles in a space material science experiment sample box and an X-ray transmission imaging module and the like to the data analysis component of the sample box of the space material science experiment material, and image information of the door handles in the batch sample management module of the high-temperature cabinet, the sample box handles in the space material science experiment sample box and the sample box handles in the X-ray transmission imaging module and the like is obtained by adopting a space science experiment image sensor acquisition technology;
furthermore, the space material science experiment material sample box data analysis component is used for analyzing the image information transmitted by the space material science experiment sample box visual positioning module to obtain the pose information of a door handle in the high-temperature cabinet batch sample management module, a sample box handle in the space material science experiment sample box and the X-ray transmission imaging module and the like;
furthermore, the pose data exchange component of the sample box of the space material science experiment material is used for transmitting pose information of the door handle in the batch sample management module of the high-temperature cabinet, the sample box handle in the sample box of the space material science experiment and the X-ray transmission imaging module to the controller;
further, the image sensor for space science experiment is to project a pattern with a special structure designed in advance onto the surface of a three-dimensional object by using a space science experiment image acquisition technology, and observe the distortion condition of imaging on a three-dimensional physical surface by using the image sensor for space science experiment, so as to obtain three-dimensional information of a door handle in a batch sample management module of a high-temperature cabinet, a sample box handle in a space material science experiment sample box and an X-ray transmission imaging module, and the flow is shown in fig. 2;
furthermore, the principle of the space science experiment image acquisition technology is that a mathematical model of a replacement sample is extracted by using an algorithm based on simulation learning, preliminary generalization is carried out on a working scene of the space man at a space station so as to reduce a transformation space of a control quantity, and then an action strategy is optimized by using a reinforcement learning algorithm. The experience of actual operation of astronauts is introduced by means of simulation learning, the learning process is accelerated, and meanwhile the feasibility of an action strategy at the initial stage of reinforcement learning training is improved, so that the reliability of a task execution link is effectively improved.
The vision calibration needs to calibrate the camera firstly and then calibrate the projector:
Figure BDA0003907488870000111
the method of using two-dimensional Fourier transform:
Figure BDA0003907488870000112
transfer function:
Figure BDA0003907488870000113
two-dimensional inverse Fourier transform:
Figure BDA0003907488870000114
and obtaining gray code pixel coordinates by using an edge detection algorithm.
World coordinates of the point P to be measured:
Figure BDA0003907488870000115
the ICP point cloud is registered as follows:
given two point cloud sets
X=(x 1 ,x 2 ,…,x n ) (6)
P=(p 1 ,p 2 ,…,p n ) (7)
Solving for R and T, minimizing the following:
Figure BDA0003907488870000116
centroids of the two sets of point clouds:
Figure BDA0003907488870000117
Figure BDA0003907488870000121
the coordinates of the points in the two groups of point clouds with the centroid as the origin:
X′=x i -u x =x′ i (11)
P′=p i -u p =p′ i (12)
ω and SVD decomposition thereof:
Figure BDA0003907488870000122
the transformation relationship between the two groups of point clouds is as follows:
R=VU T (14)
t=u x -Ru p (15)
furthermore, the space material science experiment sample box grabbing module mainly comprises a mechanical arm, a clamping hand and a path planning device for the mechanical arm; wherein the robot arm is selected from a 6-axis robot arm and its gripper as shown in fig. 3.
Further, the working process of the space navigation man mechanical arm is to set the mechanical arm, and the working process comprises an initial position, the position of a door handle in the high-temperature cabinet batch sample management module, the position of a space material science experiment sample box, the position of a sample box handle in the X-ray transmission imaging module and the like, wherein the position of the door handle in the high-temperature cabinet batch sample management module, the position of the space material science experiment sample box and the position of the sample box handle in the X-ray transmission imaging module are positioned through the space material science experiment sample box visual positioning module, the space material science experiment sample box visual positioning module sends the position information to the mechanical arm after obtaining the positioning information, and after obtaining the position information, the mechanical arm is programmed to control the mechanical arm and the mechanical clamp to open the operation of the door in the high-temperature cabinet batch sample management module, so as to complete replacement, the task of the space material science experiment sample box, or the task of replacing the sample box in the X-ray transmission imaging module.
Furthermore, the path planning is performed on the mechanical arm in order to realize the stable motion of the mechanical arm along the track, the relevant path planning is performed on the mechanical arm, a middle point is inserted between the starting point and the end point, the fifth-order polynomial interpolation can solve the problems that the angular velocity change is not smooth and the acceleration jumps, and the expression is as follows:
Figure BDA0003907488870000123
constraint of start-stop angular velocity:
Figure BDA0003907488870000131
solving:
Figure BDA0003907488870000132
furthermore, the space-ground cooperative operation module of the astronomy knight comprises an astronomy knight mechanical arm, a tong operation module and an astronomy knight state monitoring module; the space navigation man mechanical arm and gripper operation module controls 6 motors of the mechanical arm and a gripper motor, and after receiving an operation instruction, controls the corresponding 6-axis motor of the mechanical arm to move and controls the gripper to complete clamping, so that the door of the batch sample management module of the high-temperature cabinet is opened and closed, the space material science experiment sample box and the space experiment material sample box in the X-ray transmission imaging module are drawn out and inserted, the space material science experiment sample box which has completed an experiment is taken out, a new space material science experiment sample box is replaced, or the space experiment material sample box in the X-ray transmission imaging module which has completed the experiment is taken out, the space experiment material sample box in the new X-ray transmission imaging module is replaced, and other tasks are carried out; the state monitoring module of the astronaut is used for monitoring the motion state and key parameters of the astronaut during the orbit period, and the astronaut and ground workers can conveniently judge the working state of the astronaut through the monitoring module.
Furthermore, the "astronaut" arm, tong operation module are used for controlling arm motion and monitoring the arm state, wherein the arm mainly contains 6 motors and tong motor, and 6 motors in the arm are frame motor, shoulder motor, elbow motor, wrist motor 1, wrist motor 2, wrist motor 3 respectively. The base motor is positioned at the bottommost layer of the mechanical arm and is used for controlling the whole mechanical arm to rotate on the horizontal plane; the shoulder motor is positioned at the upper part of the base motor and is used for controlling the mechanical arm to rotate on a vertical surface; the elbow motor is positioned in the middle of the mechanical arm and is responsible for driving the other three wrist motors to move back and forth; the wrist motors 1, 2 and 3 are used for accurately controlling the clamping hands to rotate front and back and left and right within a certain range, the clamping hands are moved to designated positions, and the movement and rotation of the mechanical arm in a three-dimensional space can be realized under the cooperation of the 6 motors. The experimenter determines experimental actions after analyzing the experimental tasks, generates operation instructions by programming the space swords, sends the instructions to the space station from the ground through a space and ground cooperative operation module, and executes the operation instructions by the space swords to complete corresponding experiments;
furthermore, the "astronaut" state monitoring module mainly comprises an image and video data which are shot by the visual positioning module of the space material science experiment sample box and displays the working states of all motors of the mechanical arm and the information of the current spatial position of the mechanical arm, and through the "astronaut" state monitoring module, an astronaut and a ground worker can check the video and the image which are shot by the visual positioning module of the space material science experiment sample box in the experimental process at any time through equipment, so that the running state of the equipment and the experimental operation process are judged and monitored, the astronaut and the ground worker are helped to review the experiment, the inquiry problem and the like, meanwhile, the running power, the working temperature and the like of 6 motors and a clamping arm motor of the mechanical arm can also be displayed, the astronaut and the ground worker can conveniently judge whether the working state and the working environment of the mechanical arm are normal or not, the abnormal condition of the mechanical arm can be found in time, the danger is conveniently and quickly treated, the position information of the current mechanical arm can be seen on a monitoring page, the progress of the experimental process is judged, and the progress of the experimental process can also be avoided, and the mechanical arm can be collided with other equipment;
further, the space material science experiment operation instruction database module comprises a database for storing an experiment parameter table, an alarm log table, a machine parameter table, an experiment flow table, a replacement flow instruction table, a picture video storage table and the like related to the space navigation man;
furthermore, the database mainly comprises a service application, a client driver and an OpenGauss server, wherein the service application is matched with the space-ground cooperative function. The client driver is responsible for receiving an access request from the application, returning an execution result to the application, communicating with the OpenGauss server, issuing SQL to be executed on the OpenGauss server, and receiving the execution result; the OpenGauss server is responsible for storing service data, executing a data query task and returning an execution result to the client driver; storage is a local Storage resource of the server, data is stored in a persistent mode, and the detailed architecture is shown in FIG. 4;
the experimental parameter table stores state information of door handles in the batch sample management module of the high-temperature cabinet, sample boxes for space material science experiments, sample box handles in the X-ray transmission imaging module and the like, and can also monitor working state information in the experimental process in real time;
TABLE 1 Experimental parameters Table
Figure BDA0003907488870000141
Figure BDA0003907488870000151
When certain parameters exceed a specified range in the experimental process, the alarm log table can record information such as abnormal time, abnormal data values, abnormal physical quantities and the like of experimental data, so that follow-up workers can conveniently call and check the information;
TABLE 2 alarm Log Table
Name of field Description of the field Type (B) Length of Main key
Time Time of alarm Date 20 Yes
Name Physical quantity name Varchar 20
Value Alarm value Varchar 20
Expectvalue Expected value Varchar 20
The robot parameter table comprises the current state of a mechanical arm motor, such as a rotating angle, a moving distance, the power of the motor and the like, and the state parameters of the robot, such as the environment temperature and the like;
TABLE 3 machine ginseng number table
Figure BDA0003907488870000152
Figure BDA0003907488870000161
The action flow table records the action execution time, the executed action ID number, the object for executing the action, the parameters used in the action execution process, the duration for executing the action and the action execution state respectively;
TABLE 4 action flow chart
Name of field Description of field Types of Length of Main key
Time Action execution time Date 20 Yes
ID Action ID Int 10
Subject Executing action object Varchar 20
Para Motion parameter Decimal 20
Duration Duration of time Int 20
State Execution state Int 10
The replacement flow instruction list stores a sample replacement flow in the experimental process, a plurality of actions stored in the action flow list are sequenced according to execution time, and the "astronaut" can sequentially execute related actions according to the action sequence in the replacement flow list to complete a task of replacing the sample;
TABLE 5 Replacing flow instruction sheet
Figure BDA0003907488870000171
Figure BDA0003907488870000181
The picture video storage table is used for storing videos and images shot by the visual positioning module of the space material science experiment sample box, pictures shot by the visual positioning module can be seen in real time in the experiment process, and the experiment videos can be reviewed after the experiment is finished, so that workers can conveniently monitor and analyze the experiment process;
table 6 picture video storage table
Name of field Description of field Type (B) Length of Main key
Time Time of picture acquisition Date 20 Yes
ID Picture numbering Int 10
Name Picture name Varchar 20
Pic_URL Picture path Varchar 50
Time Video acquisition time Date 20 Yes
ID Video numbering Int 10
Name Video name Varchar 20
video_URL Video path Varchar 50
The technical scheme provided by the invention is further illustrated by combining the following embodiments.
Examples
The embodiment of the invention provides a space and ground collaborative space scientific experiment robot based on intelligent vision. The invention can complete the replacement of the sample by using the mechanical arm. The robot overall operation process mainly comprises the steps of obtaining an uplink instruction from a space material science experiment operation instruction database module, carrying out visual positioning by a space material science experiment sample box visual positioning module, carrying out path planning by a mechanical arm, and controlling the mechanical arm in a space material science experiment sample box grabbing module to complete actions of taking out a sample box, replacing the sample box and the like;
the system comprises a space material science experiment operation instruction database module, an astronomical man-machine-based space-ground cooperative operation module, a high-temperature cabinet batch sample management module, an X-ray transmission imaging module, a door handle, a space material science experiment sample box, an X-ray transmission imaging module, a door handle, a space material science experiment sample box, an X-ray transmission imaging module and the like, wherein an uplink instruction is obtained from the space material science experiment operation instruction database module; reading a robot parameter table in a database, and judging whether the parameter states (current value, voltage value, position coordinate, rotation angle and the like) of each motor of the robot are normal or not; making a motion related action table (such as an initial position of the motion, a door handle position in a batch sample management module of a high-temperature cabinet, a sample box position in a space material science experiment, a sample box handle position in an X-ray transmission imaging module and the like) of the mechanical arm, making a corresponding action table for each action of the mechanical arm, giving a unique action ID to each action, and subsequently finishing the corresponding action by the robot in the experiment flow table by virtue of the action ID; after the action table is made, corresponding execution time and the action ID are corresponding according to the time sequence of the movement of the mechanical arm, so that the mechanical arm can execute corresponding operation at the corresponding time, and the task of replacing the sample is completed; the obtained up-down command can also be a video and an image which are shot by a visual positioning module of the space material science experiment sample box in the space material science experiment operation command database module, and ground workers and astronauts can review and watch the video and the image in real time.
The visual positioning is carried out through a visual positioning module of a space material science experiment sample box, and the visual positioning system is mainly used for positioning the initial position of a mechanical arm, the position of a door handle in a high-temperature cabinet batch sample management module, the position of the space material science experiment sample box, the position of a sample box handle in an X-ray transmission imaging module and the like, the position information coordinates are stored in a database and sent to the mechanical arm, and the mechanical arm executes related operations according to the position information to finish the replacement of samples.
After coordinates such as an initial position, a door handle position in a high-temperature cabinet batch sample management module, a space material science experiment sample box position, a sample box handle position in an X-ray transmission imaging module and the like are obtained, the mechanical arm needs to carry out path planning and mainly comprises the following steps:
(1) Task planning usually requires high-level task decision, and takes the planning problem determined by the initial state and the planning target state of the mechanical arm as input, so as to reasonably plan the action decision and action sequence required by the mechanical arm to complete the task.
(2) The path planning indicates that a path of the end effector reaching a target point in a designated pose or a desired configuration needs to be generated by a specific trajectory planning algorithm to generate a motion trajectory of the end of the mechanical arm. The measure of the good and bad track planning is whether the mechanical arm is smooth and smooth in the running process, and the appearance of extreme positions and high-speed points is generally required to be eliminated, so that the mechanical arm is prevented from generating abnormal dangerous motions such as high speed and shaking.
(3) The track optimization generally refers to optimizing the position, speed and acceleration of a path point on the basis of path planning, so that the indexes such as the motion performance of the space manipulator, the task execution efficiency and the like are optimized, and an optimal path is found and reached.
After the moving path is determined, according to the position information of the target, the mechanical arm in the space material science experiment sample box grabbing module can be controlled to move to the target positions such as the door handle position in the high-temperature cabinet batch sample management module, the space material science experiment sample box position, the sample box handle position in the X-ray transmission imaging module and the like, and the tasks of opening the door of the sample management module, taking out the space material science experiment sample box and replacing the sample box are completed.
As can be seen from the above detailed description of the invention, the space-ground collaborative space science experiment robot based on intelligent vision is used for space station experimenters to perform space science experiments by using assistance of "astronauts knight-errants", so that scientific experiment tasks in a space station can be completed more efficiently and quickly, and the personal safety of astronauts is improved. The robot is an auxiliary experimental robot in the first space station in China. By using the control technology of the space robot, the designated task can be completed by the operation of the ground person and the intelligent operation of the far-end space robot. Ground experimenters utilize "astronaut knight-errants", combine the cooperative operation of world, can look over the experiment progress and develop the experiment in real time, and it has improved the efficiency of space science experiment greatly, solves people and operation target space-time isolation problem. The method comprises the following steps: the space material science experiment sample box system comprises a space material science experiment sample box visual positioning module, a space material experiment sample box grabbing module, an "astronaut" heaven and earth cooperative operation module and a space material science experiment operation instruction database module.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present invention and are not limited. Although the present invention has been described in detail with reference to the embodiments, it should be understood by those skilled in the art that the technical solutions of the present invention may be modified or substituted with equivalents without departing from the spirit and scope of the technical solutions of the present invention, and all of them should be covered by the scope of the claims of the present invention.

Claims (9)

1. A space-ground collaborative space science experiment robot based on intelligent vision is used for completing replacement of an experiment material sample box in a high-temperature cabinet of a space station on a rail; characterized in that the robot comprises: the space material science experiment sample box comprises a space material science experiment sample box visual positioning module, a space material science experiment sample box grabbing module, a space material science experiment operation instruction database module and a world cooperative operation module; wherein the content of the first and second substances,
the space material science experiment sample box visual positioning module is used for acquiring the position and posture information of door handles of batch sample management modules in the high-temperature cabinet, the space material science experiment sample box and sample box handles in the X-ray transmission imaging module by using a space science experiment image acquisition technology and sending the position and posture information to the space material experiment sample box grabbing module;
the space material science experiment sample box grabbing module is used for planning a path according to the pose information and executing a sample box replacing task;
the space material science experiment operation instruction database module is used for storing instruction data related to the robot by using a database;
the space and ground cooperative operation module is used for calling data stored in the space material science experiment operation instruction database module and issuing a related instruction of a sample box replacing task to the space material science experiment sample box grabbing module; and the robot module is also used for monitoring the working state of the robot module.
2. The intelligent vision-based space-earth collaborative space science experimental robot as claimed in claim 1, wherein the sample box replacement task comprises the following steps in sequence:
grasping a door handle of the batch sample management module of the high-temperature cabinet, and opening the door of the batch sample management module of the high-temperature cabinet;
extracting the space material science experiment sample box in the high-temperature cabinet or the space material science experiment sample box in the X-ray transmission imaging module;
taking out the material sample box which has completed the space material science experiment;
replacing the new material sample cartridge;
a space material science experiment sample box inserted into the high-temperature cabinet;
and (4) grasping a door handle of the batch sample management module of the high-temperature cabinet, and closing the door of the batch sample management module of the high-temperature cabinet.
3. The intelligent vision-based space-earth collaborative space science experiment robot as claimed in claim 1, wherein the space material science experiment sample box visual positioning module comprises: the system comprises a space science experiment image sensor, a pose sensing component, a space material science experiment material sample box data analysis component and a space material science experiment material sample box pose data exchange component; wherein the content of the first and second substances,
the space science experiment image sensor is used for acquiring three-dimensional information of a door handle in the high-temperature cabinet batch sample management module, a space material science experiment sample box and a sample box handle in the X-ray transmission imaging module by using a space science experiment image acquisition technology and transmitting the three-dimensional information to the pose sensing component;
the pose sensing component is used for converting three-dimensional information of a door handle in the high-temperature cabinet batch sample management module, a space material science experiment sample box and a sample box handle in the X-ray transmission imaging module into image information of the door handle in the high-temperature cabinet batch sample management module, the space material science experiment sample box and the sample box handle in the X-ray transmission imaging module, and transmitting the image information to the space material science experiment material sample box data analysis component;
the space material science experimental material sample box data analysis component is used for analyzing image information transmitted by the space material science experimental sample box visual positioning module to obtain position and attitude information of a door handle in the high-temperature cabinet batch sample management module, a space material science experimental sample box and a sample box handle in the X-ray transmission imaging module;
the pose data exchange component of the space material science experimental material sample box is used for transmitting pose information of a door handle in the high-temperature cabinet batch sample management module, the space material science experimental sample box and a sample box handle in the X-ray transmission imaging module to the world cooperative operation module.
4. The intelligent vision-based space-ground collaborative space science experiment robot of claim 3, wherein the space material science experiment sample box visual positioning module acquires pose information comprising:
firstly calibrating a camera, and then calibrating a projector:
Figure FDA0003907488860000021
wherein u represents a coordinate of a horizontal axis in a pixel coordinate system, v represents a coordinate of a vertical axis in the pixel coordinate system, and Z c Representing object distance, A representing a camera reference matrix, R representing a camera rotation matrix, T representing a camera translation matrix, X representing an X-axis coordinate under a world coordinate system, Y representing a Y-axis coordinate under the world coordinate system, and Z representing a Z-axis coordinate under the world coordinate system;
performing two-dimensional Fourier transform:
Figure FDA0003907488860000022
f (u, v) represents a frequency domain image after Fourier transform, M represents a width of the image, N represents a height of the image, F (x, y) represents a time domain image before Fourier transform,
choosing a transfer function H (u, v):
Figure FDA0003907488860000023
wherein D (u, v) represents the distance of the point (u, v) from the center of Fourier transform, D 0 Represents the cut-off frequency;
performing two-dimensional inverse Fourier transform:
Figure FDA0003907488860000031
obtaining Gray code pixel coordinates by using an edge detection algorithm; wherein, the first and the second end of the pipe are connected with each other,
world coordinates of the point P to be measured:
Figure FDA0003907488860000032
wherein, O c Denotes the origin, X, in the camera coordinate system woc Representing the x-coordinate, Y, of the origin in the camera coordinate system woc Representing the y coordinate of origin, Z, in a camera coordinate system woc Representing the z-coordinate of the origin, R, in a camera coordinate system c Representing a rotation matrix, t, in a coordinate system representing the camera c Representing a translation matrix, P, representing the camera coordinate system c Representing the coordinates of a point P, X in the camera coordinate system wpc X-coordinate, Y, representing point P in the camera coordinate system wpc Y-coordinate, Z, representing point P in the camera coordinate system wpc Z-coordinate, x, representing point P in the camera coordinate system c Representing the normalized x-coordinate, y, in the camera coordinate system c Representing the normalized y-coordinate, z, in the camera coordinate system c Denotes the normalized z-coordinate, O, in the camera coordinate system p Representing the origin, X, in the projector coordinate system wop Representing the origin x-coordinate, Y, in the projector coordinate system wop Representing the y-coordinate of the origin, Z, in the projector coordinate system wop Representing the z-coordinate of the origin, P, in the projector coordinate system p Representing the coordinates of a point P, X in the projector coordinate system wpp X-coordinate, Y, representing the origin in the projector coordinate system wpp Y-coordinate, Z, representing the origin in the projector coordinate system wpp Z-coordinate, x, representing the origin in the projector coordinate system p X-coordinate, y, representing a point P in the projector coordinate system p Y-coordinate, z, representing a point P in the projector coordinate system p A z-coordinate representing a point P in the projector coordinate system;
the ICP point cloud registration is carried out, and the method comprises the following steps:
given two sets of point clouds X and P,
X=(x 1 ,x 2 ,…,x n ) (6)
P=(p 1 ,p 2 ,…,p n ) (7)
wherein n represents the total number of point clouds, x n Representing point clouds in a collection X, p n Representing point clouds in the set P;
solving R and T to minimize E (R, T),
Figure FDA0003907488860000041
wherein E (R, t) represents an error function, R represents a rotation matrix, t represents a translation matrix, x i Representing point clouds in a collection X, p i Representing point clouds in the set P;
obtaining the centroids of the two groups of point clouds:
Figure FDA0003907488860000042
Figure FDA0003907488860000043
obtaining coordinates of points in the two groups of point clouds with the centroid as an origin:
X′=x i -u x =x′ i (11)
P′=p i -u p =p′ i (12)
x' denotes the set XCentroid coordinate, x' i Representing the centroid coordinates of set X; p ' represents the centroid coordinate, P ' of the set P ' i Representing the centroid coordinates of set P;
and acquiring omega and carrying out SVD (singular value decomposition) on the omega:
Figure FDA0003907488860000044
omega represents a matrix to be SVD decomposed, U represents an orthogonal matrix, T represents transposition, delta 1 Representing non-zero singular values, δ, of the matrix ω 2 Representing non-zero singular values, δ, of the matrix ω 3 Non-zero singular values representing a matrix ω;
the transformation relation between the two groups of point clouds is as follows:
R=VU T (14)
v denotes an orthogonal matrix, and U denotes an orthogonal matrix;
t=u x -Ru p (15)
t denotes a translation matrix.
5. The intelligent vision-based space science experimental robot for heaven and earth collaboration based on claim 1, wherein the space material science experiment sample box grabbing module comprises: a mechanical arm and a gripper; the mechanical arm and the clamping hand are used for completing the task of replacing the sample box according to the acquired pose information and the planned path, and the method specifically comprises the following steps:
setting an initial position of the mechanical arm;
the position of a door handle in a batch sample management module of the high-temperature cabinet, the position of a sample box in a scientific experiment of the space material and the position of a sample box handle in an X-ray transmission imaging module are obtained through a visual positioning module of the sample box in the scientific experiment of the space material, and the position information is sent to a mechanical arm;
planning a path of the mechanical arm;
and the mechanical arm and the clamping hand are controlled to open the door in the batch sample management module of the high-temperature cabinet, so that the task of replacing a space material scientific experiment sample box is completed, or the task of replacing a sample box in the X-ray transmission imaging module is completed.
6. The intelligent vision-based space-earth collaborative space science experimental robot as claimed in claim 5, wherein the path planning of the mechanical arm comprises:
inserting a middle point between the starting point and the end point by utilizing fifth-order polynomial interpolation, wherein the expression is as follows:
Figure FDA0003907488860000051
where θ (t) represents the angular displacement at time t,
Figure FDA0003907488860000052
the angular velocity at the time of the t-time is indicated,
Figure FDA0003907488860000053
represents angular acceleration at time t, a 0 、a 1 、a 2 、a 3 、a 4 、a 5 Representing the correlation coefficient required to be solved in each formula;
and constraining the angular speed of the starting point and the stopping point, wherein the constraint condition meets the following formula:
Figure FDA0003907488860000054
wherein, θ (t) 0 ) Which represents the location of the start point and,
Figure FDA0003907488860000055
which is indicative of the speed of the starting point,
Figure FDA0003907488860000056
acceleration at starting point, t 0 As a starting point time, θ (t) f ) Indicating the location of the termination point and,
Figure FDA0003907488860000057
the speed of the end point is indicated,
Figure FDA0003907488860000058
indicating end point acceleration, t f Indicating a termination point time;
solving to obtain:
Figure FDA0003907488860000059
θ 0 which represents the starting position of the device,
Figure FDA0003907488860000061
which is indicative of the speed of the start-up,
Figure FDA0003907488860000062
denotes the initial acceleration, θ f The position of the termination is indicated,
Figure FDA0003907488860000063
it is indicated that the speed of termination is,
Figure FDA0003907488860000064
indicating the termination of acceleration.
7. The intelligent vision-based space-ground collaborative space science experiment robot is characterized in that the space-ground collaborative operation module comprises a robot operation module and a robot state monitoring module; wherein the content of the first and second substances,
the robot operation module is used for controlling the motors of the mechanical arm and the grippers, and controlling the corresponding mechanical arm motors to move and the grippers to finish clamping after receiving an operation instruction, so that a sample box replacing task is finished;
and the state monitoring module is used for monitoring the motion state and key parameters of the robot during the in-orbit period so as to judge the working state of the robot.
8. The intelligent vision-based space-earth collaborative space science experimental robot according to claim 7, wherein the motor of the mechanical arm comprises: the wrist training machine comprises a machine base motor, a shoulder motor, an elbow motor, a first wrist motor, a second wrist motor and a third wrist motor;
the base motor is positioned at the bottommost layer of the mechanical arm and is used for controlling the whole mechanical arm to rotate on the horizontal plane;
the shoulder motor is positioned at the upper part of the base motor and is used for controlling the mechanical arm to rotate on a vertical surface;
the elbow motor is positioned in the middle of the mechanical arm and used for driving the first wrist motor, the second wrist motor and the third wrist motor to move back and forth;
the first wrist motor, the second wrist motor and the third wrist motor are used for controlling the clamping hand to rotate forwards, backwards, leftwards and rightwards so as to move the clamping hand to a specified position.
9. The smart vision-based space-earth collaborative space science experimental robot as claimed in claim 1, wherein the robot-related parameter table includes: an experiment parameter table, an alarm log table, a machine parameter table, an action flow table, a replacement flow instruction table and a picture video storage table; wherein the content of the first and second substances,
the database, comprising: the system comprises a service application, a client driver and an OpenGauss server; the service application supports the function of a world cooperative control module; the client driver is responsible for receiving an access request from the service application, returning an execution result to the application, communicating with the OpenGauss server, issuing SQL (structured query language) to be executed on the OpenGauss server and receiving the execution result; the OpenGauss server is responsible for storing service data, executing a data query task and returning an execution result to the client driver;
the experimental parameter table is used for storing the state information of the door handle in the time and high-temperature cabinet batch sample management module, the state information of the sample box handle in the space material science experimental sample box and the X-ray transmission imaging module, and simultaneously can also monitor the working state information in the experimental process in real time; the state information of door handle, space material science experiment sample box among the sample management module in batches of high temperature cabinet, sample box handle among X ray transmission imaging module includes: the time, the opening and closing state of a door handle in the batch sample management module of the high-temperature cabinet, the sample replacement batch of the space material science experiment sample box, the number of samples which are not replaced and the sample replacement state; the working state information in the experimental process comprises: temperature, shell temperature, current and voltage in the high-temperature cabinet;
the alarm log table is used for recording the abnormal time, abnormal data value, expected data value and abnormal physical quantity information of the experimental data when some parameters exceed the specified range in the experimental process;
the robot parameter table is used for recording the current state of the mechanical arm motor and the state parameters of the robot; the state of the robot arm motor includes: the angle of rotation, the distance of movement and the power of the motor; the state parameters of the robot include: ambient temperature;
the action flow table is used for recording the action execution time, the executed action ID number, the object for executing the action, the parameters used in the action execution process, the duration for executing the action and the action execution state;
the replacing process instruction list is used for storing a sample replacing process in the experimental process, and sequencing a plurality of actions stored in the action process list according to the execution time, so that the robot can sequentially execute related actions according to the action sequence in the replacing process list to complete a task of replacing the sample;
the picture video storage table is used for storing videos and images shot by the visual positioning module of the space material science experiment sample box, pictures shot by the visual positioning module can be seen in real time in the experiment process, and experiment videos can also be seen after the experiment is finished.
CN202211309652.4A 2022-10-25 2022-10-25 Space science experiment robot is cooperated to world based on intelligent vision Active CN115519546B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211309652.4A CN115519546B (en) 2022-10-25 2022-10-25 Space science experiment robot is cooperated to world based on intelligent vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211309652.4A CN115519546B (en) 2022-10-25 2022-10-25 Space science experiment robot is cooperated to world based on intelligent vision

Publications (2)

Publication Number Publication Date
CN115519546A true CN115519546A (en) 2022-12-27
CN115519546B CN115519546B (en) 2023-06-27

Family

ID=84704372

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211309652.4A Active CN115519546B (en) 2022-10-25 2022-10-25 Space science experiment robot is cooperated to world based on intelligent vision

Country Status (1)

Country Link
CN (1) CN115519546B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116130037A (en) * 2023-01-28 2023-05-16 钢研纳克检测技术股份有限公司 Material high-throughput preparation-statistics mapping characterization integrated research and development system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0724767A (en) * 1993-07-15 1995-01-27 Toshiba Corp Remote control device for robot
US5499320A (en) * 1993-03-24 1996-03-12 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Extended task space control for robotic manipulators
CN103302668A (en) * 2013-05-22 2013-09-18 东南大学 Kinect-based space teleoperation robot control system and method thereof
CN111496770A (en) * 2020-04-09 2020-08-07 上海电机学院 Intelligent carrying mechanical arm system based on 3D vision and deep learning and use method
CN114505869A (en) * 2022-02-17 2022-05-17 西安建筑科技大学 Chemical reagent intelligent distribution machine control system
CN114912287A (en) * 2022-05-26 2022-08-16 四川大学 Robot autonomous grabbing simulation system and method based on target 6D pose estimation

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5499320A (en) * 1993-03-24 1996-03-12 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Extended task space control for robotic manipulators
JPH0724767A (en) * 1993-07-15 1995-01-27 Toshiba Corp Remote control device for robot
CN103302668A (en) * 2013-05-22 2013-09-18 东南大学 Kinect-based space teleoperation robot control system and method thereof
CN111496770A (en) * 2020-04-09 2020-08-07 上海电机学院 Intelligent carrying mechanical arm system based on 3D vision and deep learning and use method
CN114505869A (en) * 2022-02-17 2022-05-17 西安建筑科技大学 Chemical reagent intelligent distribution machine control system
CN114912287A (en) * 2022-05-26 2022-08-16 四川大学 Robot autonomous grabbing simulation system and method based on target 6D pose estimation

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116130037A (en) * 2023-01-28 2023-05-16 钢研纳克检测技术股份有限公司 Material high-throughput preparation-statistics mapping characterization integrated research and development system
CN116130037B (en) * 2023-01-28 2023-10-10 钢研纳克检测技术股份有限公司 Material high-throughput preparation-statistics mapping characterization integrated research and development system

Also Published As

Publication number Publication date
CN115519546B (en) 2023-06-27

Similar Documents

Publication Publication Date Title
Skaar et al. Camera-space manipulation
Liu et al. Uncalibrated visual servoing of robots using a depth-independent interaction matrix
Siradjuddin et al. A position based visual tracking system for a 7 DOF robot manipulator using a Kinect camera
Fu et al. Active learning-based grasp for accurate industrial manipulation
Rambow et al. Autonomous manipulation of deformable objects based on teleoperated demonstrations
CN114912287A (en) Robot autonomous grabbing simulation system and method based on target 6D pose estimation
CN115519546B (en) Space science experiment robot is cooperated to world based on intelligent vision
CN111590567A (en) Space manipulator teleoperation planning method based on Omega handle
Nori et al. Autonomous learning of 3D reaching in a humanoid robot
Agustian et al. Robot manipulator control with inverse kinematics PD-pseudoinverse Jacobian and forward kinematics Denavit Hartenberg
Nicola et al. Human-robot co-manipulation of soft materials: enable a robot manual guidance using a depth map feedback
Seo et al. Deep Imitation Learning for Humanoid Loco-manipulation through Human Teleoperation
Allen et al. Optimal path planning for image based visual servoing
Su et al. A ROS based open source simulation environment for robotics beginners
Van Molle et al. Learning to grasp from a single demonstration
Li et al. A novel semi-autonomous teleoperation method for the tiangong-2 manipulator system
CN114888768A (en) Mobile duplex robot cooperative grabbing system and method based on multi-sensor fusion
D’Ago et al. Modelling and identification methods for simulation of cable-suspended dual-arm robotic systems
Hu et al. A minimal dataset construction method based on similar training for capture position recognition of space robot
David et al. Digital assistances in remote operations for ITER test blanket system replacement: An experimental validation
Tsakiris et al. Experiments in real-time vision-based point stabilization of a nonholonomic mobile manipulator
Du et al. A Feature Reserved Teaching Method for Pick-Place System under Robot Operating System
Jiabu et al. Research on intelligent grasping system of monocular vision guided manipulator
CN116197918B (en) Manipulator control system based on action record analysis
Li A Design of Robot System for Rapidly Sorting Express Carton with Mechanical Arm Based on Computer Vision Technology

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant