CN113183147B - Large-area coverage electronic skin system with remote proximity sense


Info

Publication number
CN113183147B
Authority
CN
China
Prior art keywords
perception
sensing
unit
proximity
basic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110337517.XA
Other languages
Chinese (zh)
Other versions
CN113183147A (en)
Inventor
薛光明
孙立宁
陈国栋
孔向东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou University
Original Assignee
Suzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou University
Priority to CN202110337517.XA
Publication of CN113183147A
Application granted
Publication of CN113183147B
Legal status: Active (current)

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/29 Graphical models, e.g. Bayesian networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a large-area coverage electronic skin system with remote proximity perception, comprising: a basic sensing unit, which comprises a plurality of sensing modules and a processing module, the sensing modules being mutually expandable and connectable, each sensing module comprising a contact force detection module and a remote proximity detection module; and a central hub unit, which is connected with the basic sensing unit, collects the perception data of the basic sensing unit in real time, monitors the working state of the basic sensing unit in real time, and processes the collected perception data. The remote proximity perception and contact force perception information of the activated sensing units in the electronic skin system is fused with the robot's proprioception to estimate the spatial position, approach speed and direction, and shape and size of an approaching object; the robot's motion-planning path information is further fused to predict the collision probability; and the collision scene type and collision risk are estimated by a collision-recognition deep learning neural network model.

Description

Large-area coverage electronic skin system with remote proximity sense
Technical Field
The invention relates to a large-area coverage electronic skin system with remote proximity sense, and in particular to acquiring and processing the tactile information and remote proximity information of electronic skin that must cover large areas of a new-generation human-robot collaborative robot.
Background
In the industrial field, much work that requires complex cognition and flexible handling of contingencies can at present only be completed through cooperation between robots and humans, which requires robots and humans to share the same workspace. In the field of life services, completely isolating robots from humans is even more difficult. How to enable robots and humans to cooperate safely and harmoniously in the same space is therefore an important direction for robot development. In particular, detecting collisions that may occur between robots and humans, and between robots and the surrounding unstructured environment, is a key technology for achieving human-robot co-existence. A great deal of research has been carried out in this area and many results have been reported, but none of them is mature and few have been applied in practice. The main related technologies and their drawbacks include: (1) identifying collisions from robot joint torque or joint motor current, which suffers from poor reliability, poor scene adaptability, complex modelling and high sensor cost; (2) detecting collisions with machine vision or lidar, which has a limited field of view, is easily affected by changes in ambient light and by occlusion, requires complex algorithmic processing, offers poor reliability and detection accuracy, and cannot handle collisions at very close range; (3) with the development of electronic skin technology in recent years, schemes that detect collisions with electronic skin carrying force-tactile sensors and short-range proximity sensor arrays have been proposed, but because the detection distance is too short, collisions are detected only when they occur, or when a collision is detected the distance is already too short and the response time insufficient for a timely avoidance action.
Disclosure of Invention
In view of the shortcomings of the prior art, it is an object of the present invention to provide a large-area coverage electronic skin system with remote proximity perception.
To achieve this object, the invention provides the following technical solution:
a large area coverage electronic skin system with distance proximity perception, comprising:
a basic sensing unit; the basic sensing unit comprises a plurality of sensing modules and a processing module, the sensing modules can be expanded and coupled to one another, and each sensing module comprises a contact force detection module for sensing contact force information and a remote proximity detection module for sensing information about approaching objects;
the central hub unit is connected with the basic sensing unit, collects the sensing data of the basic sensing unit in real time, monitors the working state of the basic sensing unit in real time and processes the collected sensing data;
the system comprises a sensing module, a Bayesian information fusion algorithm, a contact force priority algorithm, a distance sensing unit, a basic sensing unit, a distance sensing unit, a contact force priority algorithm, a distance sensing unit, a contact force sensing unit and a proximity sensing unit, wherein the Bayesian information fusion algorithm and the contact force priority algorithm are adopted to carry out fusion processing on contact force sensing information and proximity sensing information sensed by the sensing module, the proximity object is sensed mainly at a far position, the proximity object is sensed mainly by adopting the distance sensing information and the contact force sensing information when the proximity object is in contact with the basic sensing unit comprehensively.
The contact force detection module comprises a contact force conduction silica gel layer, a plurality of thin-film piezoresistive force tactile sensors, a pressure buffer silica gel layer and a flexible circuit board layer; the thin-film piezoresistive force tactile sensors are arranged at intervals on the upper side of the flexible circuit board layer, and the pressure buffer silica gel layer and the contact force conduction silica gel layer are arranged in sequence above the thin-film piezoresistive force tactile sensors.
The remote proximity detection module comprises a TOF long-range infrared ranging sensor.
The sensing module is provided with a detection status light which can serve as a position reference light source during vision-based calibration of the module's deployment position.
The sensing module and the processing module are provided with expansion unit interfaces which can be used for expansion connection.
The central unit is in communication connection with the basic sensing unit through a CAN bus interface.
The central unit is in communication connection with the robot host computer through a primary network, the central unit is in communication connection with the basic sensing units through a secondary network, and the basic sensing units are in communication connection with their subordinate extension sensing units through a tertiary network.
The processing method of the central unit comprises: acquiring the proximity perception information and the contact force perception information collected by the basic sensing unit; preprocessing the collected perception information; combining the current perception data with the current robot pose to calculate the spatial coordinates of each perception position in the robot base coordinate frame; and finally fusing all electronic skin perception information with the current robot pose data to estimate the current spatial position, motion speed and motion direction of the approaching object.
When the relative approach speed of an approaching object exceeds the set stimulation threshold, the basic sensing unit enters an activated state.
The sensing modules are connected through a cuttable structure.
The invention has the following beneficial effects: the remote proximity perception and contact force perception information of the activated sensing units in the electronic skin system is fused with the robot's proprioception to estimate the spatial position, approach speed and direction, and shape and size of an approaching object; the robot's motion-planning path information is further fused to predict the collision probability; and the collision scene type and collision risk are estimated by a collision-recognition deep learning neural network model. This greatly increases the early-warning distance of the electronic skin and enhances the effectiveness of fusing electronic skin proximity information with machine vision information.
Drawings
FIG. 1 is a schematic structural view of a hub unit of the present invention.
Fig. 2 is a schematic structural diagram of a basic sensing unit according to the present invention.
Fig. 3 is a schematic structural diagram of the composite multilayer force-conducting silica gel of the present invention.
FIG. 4 is a block diagram of the hardware circuit structure of the hub unit of the present invention.
FIG. 5 is a block diagram of the basic sensing unit hardware circuit structure of the present invention.
FIG. 6 is a software framework diagram of the present invention.
Fig. 7 is a schematic diagram of a network architecture according to the present invention.
Fig. 8 is a schematic diagram of the splicing in the length direction of the present invention.
Fig. 9 is a schematic diagram of the invention spliced in the width direction by the expansion unit interface.
Fig. 10 is a schematic diagram of the present invention splicing in the width direction through a CAN bus interface.
FIG. 11 is a schematic view of the deployment of the present invention on a human-machine-cooperative mechanical arm.
In the figures: 1. CAN bus interface; 2. current sensor; 3. self-recovery fuse; 4. power switch; 5. direct current power input port; 6. first reset button; 7. STM32 processor core board; 8. touch screen; 9. first debugging interface; 10. standard industrial bus interface; 11. detection status light; 12. TOF long-range infrared ranging sensor; 13. thin-film piezoresistive force tactile sensor; 14. first expansion unit interface; 15. power indicator light; 16. second debugging interface; 17. STM32 processor; 18. CAN bus access interface; 19. power supply voltage stabilization module; 110. CAN bus output interface; 111. second expansion unit interface; 112. CAN drive module; 113. second reset button; 114. composite multilayer force-conducting silica gel; 21. contact force conduction silica gel layer; 22. thin-film piezoresistive force tactile sensor; 23. pressure buffer silica gel layer; 24. flexible circuit board layer; 31. basic sensing unit with two sensing modules; 32. first connecting line; 33. basic sensing unit with six sensing modules; 41. basic sensing unit cut to six sensing modules; 42. cut basic sensing unit with four sensing modules; 43. second connecting line; 51. waist; 52. upper arm; 53. forearm; 54. end effector; 55. wrist.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
The present invention provides a large area coverage electronic skin system with distance proximity perception, comprising:
a basic sensing unit; the basic sensing unit comprises a plurality of sensing modules and a processing module, the sensing modules can be expanded and coupled to one another, and each sensing module comprises a contact force detection module for sensing contact force information and a remote proximity detection module for sensing information about approaching objects;
The basic sensing unit can be provided with a plurality of sensing modules, for example six. The sensing modules are connected through a cuttable structure, so their number can be reduced as required, which allows basic sensing units of different lengths to be deployed.
In addition, each sensing module comprises a detection status light 11, a TOF long-range infrared ranging sensor 12, a thin-film piezoresistive force tactile sensor 13 and a first expansion unit interface 14, and several sensing modules can be directly connected and expanded through the first expansion unit interface.
The contact force detection module can be a composite multilayer force-conducting silica gel 114, comprising a contact force conduction silica gel layer 21, a plurality of thin-film piezoresistive force tactile sensors 22, a pressure buffer silica gel layer 23 and a flexible circuit board layer 24; the thin-film piezoresistive force tactile sensors are arranged at intervals on the upper side of the flexible circuit board layer, and the pressure buffer silica gel layer and the contact force conduction silica gel layer are arranged in sequence above them. Thanks to the connecting effect of the contact force conduction silica gel layer, when a contact point falls between two thin-film piezoresistive sensors, the adjacent sensors both detect the contact force, so that the force at the intermediate contact point can be estimated and the influence of the gap between sensors on contact force perception is eliminated. Because the thin-film piezoresistive sensor has a large area, the pressure buffer silica gel layer makes the load on it more uniform, giving higher and more stable detection accuracy, while also buffering the contact force and protecting the basic sensing module. The thin-film piezoresistive force tactile sensors measure the magnitude of the contact force, so that information such as the magnitude, position and sliding direction of the contact force can be perceived.
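As a simple illustration of how the force at a contact point lying between two thin-film sensors might be estimated from the readings of the neighbouring sensors, the following sketch applies a lever-rule style interpolation; the sensor pitch and readings are assumed values, not taken from the patent.

```python
# Hypothetical sketch: estimating the position and magnitude of a contact that
# falls between two adjacent thin-film piezoresistive sensors, based on the
# force shared through the contact force conduction silica gel layer.
# Sensor pitch and readings are illustrative assumptions.

def estimate_contact(f_left: float, f_right: float, pitch_mm: float = 20.0):
    """Return (offset in mm from the left sensor, estimated total contact force)."""
    total = f_left + f_right
    if total <= 0.0:
        return None  # neither sensor detects a contact
    # Lever-rule assumption: the conduction layer splits the load in inverse
    # proportion to the distance from the contact point to each sensor.
    offset_mm = pitch_mm * f_right / total
    return offset_mm, total

print(estimate_contact(1.2, 0.4))  # contact closer to the left sensor
```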
The remote proximity detection module comprises a TOF long-range infrared ranging sensor, which measures the distance to an approaching object so as to perceive information such as its approach direction, position and speed.
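As an illustration, the relative approach speed can be derived from consecutive TOF range readings of one module and compared with the stimulation threshold mentioned later for activating the sensing unit; the sampling period and threshold below are assumed values.

```python
# Illustrative sketch: estimating the relative approach speed of an object
# from two consecutive TOF range readings of one sensing module, and checking
# it against the stimulation threshold used to activate the sensing unit.
# The sampling period and threshold are assumed values.

SAMPLE_PERIOD_S = 0.02            # assumed 50 Hz TOF sampling rate
STIMULATION_THRESHOLD_M_S = 0.3   # assumed activation threshold

def approach_speed(prev_range_m: float, curr_range_m: float) -> float:
    """Positive when the object is moving toward the module."""
    return (prev_range_m - curr_range_m) / SAMPLE_PERIOD_S

def is_activated(prev_range_m: float, curr_range_m: float) -> bool:
    return approach_speed(prev_range_m, curr_range_m) > STIMULATION_THRESHOLD_M_S

print(approach_speed(0.80, 0.78), is_activated(0.80, 0.78))  # about 1.0 m/s -> activated
```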
The sensing module is provided with a detection status light. When an approaching object is perceived, the status light on the corresponding sensing module lights up; the status light can also serve as a position reference light source during vision-based calibration of the module's deployment position.
The basic sensing unit is responsible for measuring the electronic skin's force-tactile and remote proximity perception information, managing the sensors, preprocessing data, converting data into point clouds and communicating with the central unit. It is also responsible for data acquisition and monitoring of the next-level extension sensing units. The processing module is therefore provided with a power indicator light 15, a second debugging interface 16, an STM32 processor 17, a CAN bus access interface 18, a power supply voltage stabilization module 19, a CAN bus output interface 110, a second expansion unit interface 111, a CAN drive module 112 and a second reset button 113. The expansion unit interface of the processing module can be connected to the expansion unit interface of a sensing module. The STM32 processor is the core processing module of the basic electronic skin sensing unit and can be debugged through the second debugging interface. Through the CAN bus access interface and the CAN bus output interface, the basic sensing unit joins the electronic skin system network in a CAN bus topology and communicates with the central unit. A power supply line is also integrated in the CAN bus interface, and a stable supply is provided to the basic sensing unit through the power supply voltage stabilization module.
The central hub unit is connected with the basic sensing unit, collects the sensing data of the basic sensing unit in real time, monitors the working state of the basic sensing unit in real time and processes the collected sensing data;
the central unit comprises a CAN bus interface 1, a current sensor 2, a self-recovery fuse 3, a power switch 4, a direct current power input port 5, a first reset button 6, an STM processor core plate 7 (at the bottom layer), a touch screen 8, a first debugging interface 9, a standard industrial bus interface 10 and the like.
The central unit is responsible for collecting the tactile perception data of each basic sensing unit of the electronic skin, monitoring the working state of each basic sensing unit, setting the corresponding parameters of the basic sensing units according to instructions from the robot host computer, analysing and processing the collected perception data, and reporting it to the robot host computer. It is also responsible for managing the power supply of the entire electronic skin system. The STM32 processor core board 7 is the core processing module of the electronic skin central unit and can be program-debugged through the first debugging interface 9. The touch screen 8 serves as the human-machine interface of the electronic skin system, used to enter system setting parameters and to display the current status information of the system. After analysis and processing, the collected perception data is uploaded to the robot host computer through the standard industrial bus interface 10. In addition, power for the electronic skin system is introduced through the direct current power input port 5; the power switch 4 is the main power switch of the electronic skin; the supply is protected by the self-recovery fuse 3; the current consumption of the electronic skin system is monitored by the current sensor 2; and the supply is then output through the CAN bus interface 1 to power each basic sensing unit of the electronic skin.
The Bayesian information fusion algorithm and the contact force priority algorithm are adopted to perform fusion processing on contact force perception information and proximity perception information perceived by the perception module, so that a proximity object is perceived mainly by remote proximity perception information at a far position, the proximity object is perceived mainly by remote proximity perception and contact force perception information fusion processing when being near the perception unit, and the proximity object is perceived mainly by contact force perception information when being in full contact with the basic perception unit.
The central unit is in communication connection with the basic sensing unit through a CAN bus interface.
The central unit is in communication connection with the robot host computer through a primary network, the central unit is in communication connection with the basic sensing units through a secondary network, and the basic sensing units are in communication connection with their subordinate extension sensing units through a tertiary network.
The overall information communication network consists of a communication network between the robot host computer and each subordinate electronic skin central unit (the primary network), a communication network between the central unit and each of its subordinate basic sensing units (the secondary network), and a communication network between each basic sensing unit and its subordinate extension sensing units (the tertiary network). The primary network uses a standard industrial bus communication protocol (such as EtherCAT, Industrial Ethernet or RS485) to provide high-speed communication between the robot host computer and the electronic skin central unit; it is responsible for uploading electronic skin perception information and working states to the host computer and for sending internal electronic skin setting parameters, current robot joint position parameters and control command parameters down to the central unit. The secondary network uses the CAN bus communication protocol; the central unit schedules the communication timing of each basic sensing unit, uploads the perception information and working state of each basic sensing unit, and distributes the internal setting parameters, current robot joint position parameters and control command parameters sent by the host computer to the corresponding basic sensing units. The tertiary network uses the I2C serial bus communication protocol; each basic sensing unit schedules the communication timing of its subordinate extension sensing units, uploads their perception information and working states, and distributes the relevant internal setting parameters, current robot joint position parameters and control command parameters to the corresponding extension sensing units.
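As a conceptual illustration of this tiered scheduling (not the actual firmware), the following plain-Python sketch models a central unit polling its basic sensing units in a fixed order, with each basic unit in turn reading its own sensors and those of its extension units; the node IDs and payload fields are made-up placeholders rather than values defined by the patent.

```python
# Conceptual model of the three-level scheduling described above, written in
# plain Python rather than against real EtherCAT/CAN/I2C drivers.

class ExtensionUnit:                       # third-level node (I2C slave)
    def __init__(self, addr):
        self.addr = addr
    def read(self):
        return {"addr": self.addr, "ranges_m": [0.80], "forces_n": [0.0]}

class BasicSensingUnit:                    # second-level node (CAN node)
    def __init__(self, can_id, extensions):
        self.can_id, self.extensions = can_id, extensions
    def poll(self):
        own = {"can_id": self.can_id, "ranges_m": [0.70], "forces_n": [0.1]}
        return [own] + [ext.read() for ext in self.extensions]

class CentralUnit:                         # schedules the secondary network
    def __init__(self, units):
        self.units = units
    def collect(self):
        frames = []
        for unit in self.units:            # fixed polling order acts as the time schedule
            frames.extend(unit.poll())
        return frames                      # forwarded to the robot host over the primary network

hub = CentralUnit([BasicSensingUnit(0x11, [ExtensionUnit(0x40)]),
                   BasicSensingUnit(0x12, [])])
print(hub.collect())
```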
The electronic skin collects proximity perception information and contact force perception information through the sensing modules and preprocesses the collected data, including normalization, digital filtering and extraction of non-background feature data. The current perception data is then combined with the current robot pose to calculate the spatial coordinates of each perception position in the robot base coordinate frame. Finally, all electronic skin perception information is fused with the current robot pose and other data to estimate the current position, motion speed and motion direction of the obstacle.
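As an illustration of the coordinate calculation described above, the following sketch (using NumPy) chains an assumed calibrated module pose with an assumed robot link pose to express a TOF detection in the robot base frame; all transforms and the reading are invented for the example.

```python
# Sketch of the coordinate calculation described above: a TOF reading taken in
# a sensing module's frame is expressed in the robot base frame by chaining
# the calibrated module pose with the current robot link pose.
import numpy as np

def transform_point(T: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to a 3D point."""
    return (T @ np.append(p, 1.0))[:3]

T_base_link = np.eye(4)                    # from the robot's forward kinematics (assumed)
T_base_link[:3, 3] = [0.0, 0.0, 0.4]
T_link_module = np.eye(4)                  # from the vision-based module calibration (assumed)
T_link_module[:3, 3] = [0.05, 0.0, 0.0]

p_module = np.array([0.0, 0.0, 0.30])      # TOF reading along the module's outward axis
p_base = transform_point(T_base_link @ T_link_module, p_module)
print(p_base)                              # perceived point in the robot base frame
```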
Each sensing unit provides both remote proximity perception and contact force perception, and the two kinds of information are fused by a Bayesian information fusion algorithm combined with a contact-force-priority rule. At long range, the object is perceived through several proximity readings; a Bayesian optimal fusion algorithm yields a more credible spatial position of the approaching object, and its motion speed and direction are estimated. When the approaching object is very close, the contact force sense and the proximity sense act together, and the object's position is estimated by a weighted fusion algorithm. When a collision occurs (large contact force and proximity reading close to zero), the collision position is estimated from the contact force information and the proximity information at that position is treated as invalid. The proximity and contact force readings at multiple positions of the electronic skin are fused using the Gestalt closure principle to form a closed outline of the approaching object, from which its size is estimated. The collision probability and the danger posed by the approaching object can then be estimated from the object's position, size, approach speed and direction, together with the robot's current motion trajectory and speed.
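The following minimal sketch illustrates the kind of fusion rule described above: inverse-variance ("Bayesian optimal") fusion of several proximity range estimates, a weighted near-zone regime, and a contact-force-priority rule at collision. The variances, the near-zone range and the force threshold are illustrative assumptions, not parameters given in the patent.

```python
# Minimal sketch of the distance fusion logic, under assumed thresholds.

def bayes_fuse(estimates):
    """estimates: list of (range_m, variance). Returns (fused range, fused variance)."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(r * w for (r, _), w in zip(estimates, weights)) / sum(weights)
    return fused, 1.0 / sum(weights)

def object_distance(proximity_est, contact_force_n,
                    near_range_m=0.05, force_eps_n=0.2):
    """Fused estimate of the distance (m) from the skin to the approaching object."""
    prox, _ = bayes_fuse(proximity_est)
    if contact_force_n > force_eps_n:
        return 0.0                    # collision: contact force has priority,
                                      # the proximity reading here is treated as invalid
    if prox < near_range_m and contact_force_n > 0.0:
        w = prox / near_range_m       # near zone: weight proximity against the tactile cue
        return w * prox               # a light touch pulls the estimate toward contact
    return prox                       # far zone: proximity sensing only

print(object_distance([(0.42, 0.01), (0.45, 0.02)], contact_force_n=0.0))  # about 0.43 m
```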
Thus the approaching object is perceived mainly from the remote proximity information when it is far away, from the fusion of remote proximity and contact force information when it is near the sensing unit, and mainly from the contact force information when it is in full contact with the sensing unit. When the relative approach speed of an approaching object exceeds the set stimulation threshold, the sensing unit enters an activated state.
The remote proximity perception and contact force perception information of the activated sensing units in the electronic skin system is fused with the robot's proprioception to estimate the spatial position, approach speed and direction, and shape and size of the approaching object, and the collision probability and collision risk are predicted by a collision-recognition deep learning neural network model. This algorithm model has the advantages of a small data processing load, high processing speed and strong anti-interference capability.
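A collision-recognition network of the kind referred to above could, for example, be sketched as a small multilayer perceptron with one head for the collision scene class and one for the collision probability. The sketch below assumes PyTorch and made-up feature and class dimensions; the patent does not specify the network structure.

```python
# Hedged sketch of a collision-recognition network; feature and class
# dimensions are assumptions, not values from the patent.
import torch
import torch.nn as nn

N_FEATURES = 16   # assumed size of the fused feature vector (object state, proprioception, path)
N_SCENES = 4      # assumed number of collision scene types

class CollisionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(N_FEATURES, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU())
        self.scene_head = nn.Linear(64, N_SCENES)   # collision scene classification
        self.risk_head = nn.Linear(64, 1)           # collision probability / risk

    def forward(self, x):
        h = self.backbone(x)
        return self.scene_head(h), torch.sigmoid(self.risk_head(h))

net = CollisionNet()
scene_logits, risk = net(torch.randn(1, N_FEATURES))
print(scene_logits.shape, float(risk))
```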
The cutting, expansion and networking of the sensing modules are as follows:
To meet the needs of different robot outer surfaces for sensing modules of different deployment lengths and widths, the six modules of the sensing unit designed by the invention can be cut as required, and two basic sensing units can be spliced together through the expansion unit interface; splicing in the length direction is shown in Fig. 8. The basic sensing unit 31 cut to two sensing modules and the basic sensing unit 33 with six sensing modules are spliced through the first connecting line 32 of the expansion unit interface.
Splicing in the width direction is shown in Fig. 9. The basic sensing unit 41 cut to six sensing modules and the cut basic sensing unit 42 with four sensing modules are spliced through the second connecting line 43 of the expansion unit interface.
Connection in the width direction can also be made via the CAN bus interface, as shown in Fig. 10. The basic sensing unit 41 cut to six sensing modules and the cut basic sensing unit 42 with four sensing modules are spliced through the second connecting line 43 of the CAN bus interface.
The deployment of the invention on a human-robot collaborative mechanical arm is shown in Fig. 11. The deployment positions on the robot include the waist 51, upper arm 52, forearm 53, end effector 54 and wrist 55; two sensing units in the width direction can be deployed staggered by half a module to improve the coverage of the sensing space.
The application steps of the invention are as follows:
deploying an electronic skin basic sensing unit on the outer surface of the robot;
light up the status monitoring indicator on each module of the electronic skin as a spatial position reference light source for the module, calibrate the deployment position parameters of each electronic skin sensing module using machine vision positioning, and write the deployment position parameters to the corresponding basic sensing unit;
the electronic skin basic sensing unit is connected with the expansion sensing unit through an expansion unit interface;
connecting the electronic skin basic sensing units through a CAN bus interface to form a CAN bus network;
connecting an electronic skin CAN bus network to an electronic skin center module CAN bus interface;
connecting the electronic skin center unit with a robot host computer through a standard industrial bus interface;
the power supply is introduced to the electronic skin system by a direct current power supply input interface of the electronic skin center unit.
The robot host computer obtains electronic skin perception information through communication with the electronic skin center unit and sends the current state of the robot and the electronic skin setting parameters to the electronic skin.
The innovation points of the invention are as follows:
the electronic skin sensing module adopts a flexible, modular, cuttable, expandable and customizable design, which greatly enhances the flexibility of electronic skin deployment;
distributed parallel preprocessing of the tactile information improves tactile processing speed and response time, reduces the computational load on each individual microprocessor, and improves the redundancy and reliability of the system;
one microprocessor processes several groups of sensing modules at the same time, which reduces the total number of microprocessors in the system and hence its power consumption and cost;
machine vision calibration of the deployment positions of the sensing modules is supported, which greatly improves the speed and accuracy with which module deployment positions are determined and improves the efficiency and convenience of on-site electronic skin deployment.
The examples should not be construed as limiting the present invention, but any modifications made based on the spirit of the present invention should be within the scope of protection of the present invention.

Claims (8)

1. A large-area coverage electronic skin system with remote proximity perception, characterized in that it comprises:
a basic sensing unit; the basic sensing unit comprises a plurality of sensing modules and a processing module, the sensing modules can be expanded and coupled to one another, and each sensing module comprises a contact force detection module for sensing contact force information and a remote proximity detection module for sensing information about approaching objects;
the central hub unit is connected with the basic sensing unit, collects the sensing data of the basic sensing unit in real time, monitors the working state of the basic sensing unit in real time and processes the collected sensing data;
wherein a Bayesian information fusion algorithm and a contact-force-priority algorithm are adopted to fuse the contact force perception information and the proximity perception information sensed by the sensing modules, so that an approaching object is perceived mainly from the remote proximity information when it is far away, from the fused remote proximity and contact force information when it is near the sensing unit, and mainly from the contact force information when it is in full contact with the basic sensing unit,
the remote proximity detection module comprises a TOF long-range infrared ranging sensor,
and the processing method of the central unit comprises the following steps: acquiring the proximity perception information and the contact force perception information collected by the basic sensing unit; preprocessing the collected perception information; combining the current perception data with the current robot pose to calculate the spatial coordinates of each perception position in the robot base coordinate frame; and finally fusing all electronic skin perception information with the current robot pose data to estimate the current spatial position, motion speed and motion direction of the approaching object.
2. The large-area coverage electronic skin system with remote proximity perception according to claim 1, wherein: the contact force detection module comprises a contact force conduction silica gel layer, a plurality of thin-film piezoresistive force tactile sensors, a pressure buffer silica gel layer and a flexible circuit board layer; the thin-film piezoresistive force tactile sensors are arranged at intervals on the upper side of the flexible circuit board layer, and the pressure buffer silica gel layer and the contact force conduction silica gel layer are arranged in sequence above the thin-film piezoresistive force tactile sensors.
3. The large-area coverage electronic skin system with remote proximity perception according to claim 1 or 2, wherein: the sensing module is provided with a detection status light which can serve as a position reference light source during vision-based calibration of the module's deployment position.
4. The large-area coverage electronic skin system with remote proximity perception according to claim 1 or 2, wherein: the sensing module and the processing module are provided with expansion unit interfaces which can be used for expansion connection.
5. The large-area coverage electronic skin system with remote proximity perception according to claim 1, wherein: the central unit is in communication connection with the basic sensing unit through a CAN bus interface.
6. The large-area coverage electronic skin system with remote proximity perception according to claim 1, wherein: the central unit is in communication connection with the robot host computer through a primary network, the central unit is in communication connection with the basic sensing units through a secondary network, and the basic sensing units are in communication connection with their subordinate extension sensing units through a tertiary network.
7. The large-area coverage electronic skin system with remote proximity perception according to claim 1, wherein: when the relative approach speed of an approaching object exceeds the set stimulation threshold, the basic sensing unit enters an activated state.
8. The large-area coverage electronic skin system with remote proximity perception according to claim 1, wherein: the sensing modules are connected through a cuttable structure.
CN202110337517.XA 2021-03-30 2021-03-30 Large-area coverage electronic skin system with remote proximity sense Active CN113183147B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110337517.XA CN113183147B (en) 2021-03-30 2021-03-30 Large-area coverage electronic skin system with remote proximity sense

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110337517.XA CN113183147B (en) 2021-03-30 2021-03-30 Large-area coverage electronic skin system with remote proximity sense

Publications (2)

Publication Number Publication Date
CN113183147A CN113183147A (en) 2021-07-30
CN113183147B (en) 2022-08-23

Family

ID=76974280

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110337517.XA Active CN113183147B (en) 2021-03-30 2021-03-30 Large-area coverage electronic skin system with remote proximity sense

Country Status (1)

Country Link
CN (1) CN113183147B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113721515A (en) * 2021-08-30 2021-11-30 太原理工大学 Active safety device of mechanical arm and safety control method thereof
CN114489338B (en) * 2022-01-25 2024-03-29 同济大学 Multilayer electronic skin structure
CN114536355B (en) * 2022-01-26 2023-07-07 浙江大学 Extensible reconfigurable multistage sensing flexible robot skin

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN209820667U (en) * 2019-05-06 2019-12-20 北京他山科技有限公司 Capacitive touch sensor, electronic skin and intelligent robot
CN110978017A (en) * 2019-12-02 2020-04-10 温州大学瓯江学院 Modular electronic skin suitable for large-area coverage of robot and application method thereof
CN111844046A (en) * 2017-03-11 2020-10-30 陕西爱尚物联科技有限公司 Robot hardware system and robot thereof
CN111906778A (en) * 2020-06-24 2020-11-10 深圳市越疆科技有限公司 Robot safety control method and device based on multiple perceptions
CN212254424U (en) * 2020-07-29 2020-12-29 河北工业大学 Flexible proximity sense and touch sense dual-mode sensor for robot

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111844046A (en) * 2017-03-11 2020-10-30 陕西爱尚物联科技有限公司 Robot hardware system and robot thereof
CN209820667U (en) * 2019-05-06 2019-12-20 北京他山科技有限公司 Capacitive touch sensor, electronic skin and intelligent robot
CN110978017A (en) * 2019-12-02 2020-04-10 温州大学瓯江学院 Modular electronic skin suitable for large-area coverage of robot and application method thereof
CN111906778A (en) * 2020-06-24 2020-11-10 深圳市越疆科技有限公司 Robot safety control method and device based on multiple perceptions
CN212254424U (en) * 2020-07-29 2020-12-29 河北工业大学 Flexible proximity sense and touch sense dual-mode sensor for robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"机器人接近-接触双模式感知柔性电子皮肤研究";张永进;《北京理工大学硕士学位论文》;20180601;正文部分第4-14页 *

Also Published As

Publication number Publication date
CN113183147A (en) 2021-07-30

Similar Documents

Publication Publication Date Title
CN113183147B (en) Large-area coverage electronic skin system with remote proximity sense
CN102085664B (en) Autonomous operation forestry robot intelligent control system
Su Automatic fire detection system using adaptive fusion algorithm for fire fighting robot
CN110492607A (en) A kind of intelligent substation condition monitoring system based on ubiquitous electric power Internet of Things
CN103235562A (en) Patrol-robot-based comprehensive parameter detection system and method for substations
CN111813130A (en) Autonomous navigation obstacle avoidance system of intelligent patrol robot of power transmission and transformation station
CN107378971A (en) A kind of Study of Intelligent Robot Control system
CN110850413A (en) Method and system for detecting front obstacle of automobile
CN110978017A (en) Modular electronic skin suitable for large-area coverage of robot and application method thereof
Chien et al. Develop a multiple interface based fire fighting robot
CN113733089B (en) Mechanical arm control method, device, equipment, system, storage medium and mechanical arm
CN102853831B (en) Legged robot state sensing system based on dual core processing technology
Khan et al. Surface estimation of a pedestrian walk for outdoor use of power wheelchair based robot
CN113160447A (en) Intelligent inspection method and inspection system
CN102528811B (en) Mechanical arm positioning and obstacle avoiding system in Tokamak cavity
CN212515475U (en) Autonomous navigation obstacle avoidance system of intelligent patrol robot of power transmission and transformation station
CN111152267A (en) Multi-mode inspection robot protection system and protection method
CN207172091U (en) A kind of Study of Intelligent Robot Control system
CN103317513A (en) Networked robot control system based on CPUs
CN107756403A (en) A kind of modularization autonomous underwater exploring robot control system and method
Zeng et al. Construction of multi-modal perception model of communicative robot in non-structural cyber physical system environment based on optimized BT-SVM model
CN209520899U (en) A kind of household service and early warning robot
CN116352722A (en) Multi-sensor fused mine inspection rescue robot and control method thereof
CN114800524B (en) System and method for actively preventing collision of man-machine interaction cooperative robot
Zeng et al. Reliable robot-flock-based monitoring system design via a mobile wireless sensor network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant