WO2021017083A1 - Industrial robot mapping and positioning system and robot - Google Patents

Industrial robot mapping and positioning system and robot

Info

Publication number
WO2021017083A1
WO2021017083A1 · PCT/CN2019/103497 · CN2019103497W
Authority
WO
WIPO (PCT)
Prior art keywords
module
sensor
image
robot
control
Prior art date
Application number
PCT/CN2019/103497
Other languages
English (en)
Chinese (zh)
Inventor
何正文
王宇智
Original Assignee
南京驭逡通信科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 南京驭逡通信科技有限公司
Publication of WO2021017083A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 Vision controlled systems

Definitions

  • The invention belongs to the technical field of robots, and particularly relates to an industrial robot mapping and positioning system and a robot.
  • Existing mobile robots usually use sensors installed on the robot body to scan the surrounding environment, measure the positions of obstacles around the robot, and record the robot's own odometry, and then compute a map of the environment and the robot's location through positioning and map-construction methods.
  • Because each sensor has a fixed installation position and the sensors run at different sampling rates, existing positioning methods produce detection errors on bumpy road surfaces, when the vehicle body pitches excessively, or when the machine rotates or collides. The accuracy and robustness of the mapping and positioning algorithms therefore decrease, which in turn degrades map quality and positioning accuracy.
  • To address this, the present invention provides an industrial robot mapping and positioning system and a robot.
  • The industrial robot mapping and positioning system includes a vision module, a control module, and a power supply module.
  • The vision module is composed of an image acquisition module, an image processing module, and a communication module.
  • The control module is connected to the vision module and the power module; the power module is connected to the vision module; the image acquisition module is connected to the image processing module; and the image processing module is connected to the communication module.
  • The control module is composed of a control-end computer module and a vehicle-mounted computer module, and the control-end computer module is connected with the vehicle-mounted computer module.
  • The image acquisition module includes an image sensor, an optical camera, and an auxiliary light source; the image sensor is a CMOS sensor.
  • The image processing module contains a microprocessor, which is connected to the image sensor; the microprocessor uses an S3C2440 processing chip.
  • The communication module contains a communication interface, which is connected to the microprocessor and the control module.
  • Preferably, the output ends of the control-end computer module are connected to the remote-control handle module and the wireless data module, respectively; this module mainly completes robot control, sensor-information fusion, environment-map construction, and the like.
  • The output ends of the vehicle-mounted computer module are connected to the vehicle control module and the wireless data module, respectively;
  • the wireless data module is connected to the vehicle sensor module;
  • the vehicle control module is connected to the motor drive module;
  • and the vehicle-mounted computer module mainly completes the collection and transmission of data from the sensors, cameras, and the like.
  • The internal processing flow of the image processing module is as follows: first initialize; then acquire the target image; read the Y component of the image and binarize it; obtain the target's edge pixels from the binarized image; and finally determine the spatial position and posture of the target from the edge pixels.
  • The vehicle-mounted sensors are divided into internal sensors and external sensors.
  • The internal sensors include an odometer and a gyroscope.
  • The internal sensors are mainly used to detect internal parameters of the robot.
  • The external sensors include a visual sensor, a laser sensor, an infrared sensor, and an MTI sensor; the MTI sensor is a miniature attitude sensor.
  • A robot includes a robot body; a plurality of cameras are fixed at the top of the robot body, and an observation platform is fixed at the middle of the bottom end of the robot body.
  • Two steering gears are symmetrically fixed at the bottom end of the observation platform and are connected to the controller inside the robot body.
  • The overall structure of the present invention is simple; it obtains more accurate data and reduces measurement error, so that the position and posture of the robot at different times can be determined more conveniently.
  • The position and posture information of the workpiece target can be obtained accurately, which reduces manual labor and makes the system more convenient for people to use.
  • Figure 1 is a structural block diagram of the industrial robot mapping and positioning system provided in Embodiment 1;
  • Figure 2 is a structural block diagram of the control module provided in Embodiment 1;
  • Figure 3 is a flowchart of the image processing module provided in Embodiment 1.
  • In the above figures: 1. vision module; 2. control module; 3. power supply module; 4. image acquisition module; 5. image processing module; 6. communication module; 7. control-end computer module; 8. vehicle-mounted computer module; 9. remote-control handle module; 10. wireless data module; 11. vehicle control module; 12. vehicle sensor module; 13. motor drive module; 14. robot body; 15. camera; 16. observation platform; 17. steering gear.
  • Embodiment 1: as shown in Figures 1 to 3, the present invention provides an industrial robot mapping and positioning system, including a vision module 1, a control module 2, and a power supply module 3.
  • The vision module 1 is composed of an image acquisition module 4, an image processing module 5, and a communication module 6.
  • The control module 2 is connected to the vision module 1 and the power supply module 3; the power supply module 3 is connected to the vision module 1; the image acquisition module 4 is connected to the image processing module 5; and the image processing module 5 is connected to the communication module 6.
  • The control module 2 is composed of a control-end computer module 7 and a vehicle-mounted computer module 8. The control-end computer is the main control computer of the entire system; it is operated directly by the operator and mainly completes environmental-feature recognition, data coordinate transformation, and environment-map creation.
  • The vehicle-mounted computer mainly controls the sensors, receives the environmental information they acquire, performs simple processing on that information, and sends it to the control-end computer; the control-end computer module 7 is connected to the vehicle-mounted computer module 8.
  • The power supply module 3 powers the entire device, and the vision module 1 captures and transmits images through the control module 2.
  • The microprocessor sends an acquisition signal to the image sensor; after receiving the signal, the image sensor converts the optical image into an electrical signal.
  • The resulting data are transferred to the storage unit through DMA.
  • The microprocessor then reads the image data from the storage unit, computes the spatial position and posture of the target, and finally sends the result to the robot controller through the communication interface to control the movement of the robot.
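As a rough illustration, the acquisition-to-control data path described above (trigger, sensor read-out, DMA transfer to storage, read-back, pose computation, transmission) can be sketched in software. The patent describes hardware blocks, not code, so every name below is an illustrative stand-in:

```python
from queue import Queue

# Hypothetical software sketch of the described data path. The queues stand
# in for the DMA-fed storage unit and the communication interface; the
# functions stand in for the hardware blocks. None of these names come from
# the patent itself.
storage_unit = Queue()    # image sensor -> (DMA) -> storage unit
comm_interface = Queue()  # microprocessor -> robot controller

def image_sensor(trigger_signal):
    """On a trigger, convert the 'optical image' into data
    (here: return a dummy 2x2 frame)."""
    if trigger_signal:
        return [[0, 255], [255, 0]]

def acquire():
    frame = image_sensor(trigger_signal=True)   # microprocessor triggers sensor
    storage_unit.put(frame)                     # DMA transfer into storage

def process_and_send(pose_from_frame):
    frame = storage_unit.get()                  # microprocessor reads storage
    comm_interface.put(pose_from_frame(frame))  # pose result to controller

acquire()
process_and_send(lambda f: (len(f), len(f[0])))  # dummy 'pose' computation
```

The point of the sketch is the direction of data flow: the processor never talks to the sensor's output directly; it only consumes what the DMA-fed storage holds.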
  • The image acquisition module 4 includes an image sensor, an optical camera 15, and an auxiliary light source.
  • The image sensor is a CMOS sensor; its processing chip has a high processing speed and can output multiple formats.
  • The image processing module 5 contains a microprocessor, which is connected to the image sensor.
  • The microprocessor uses the S3C2440 processing chip; because this chip has a dedicated interface that supports digital video interface standards, it can be connected directly to the image sensor without an interface conversion circuit.
  • The communication module 6 contains a communication interface, which is connected to the microprocessor and the control module 2.
  • The control-end computer module 7 is connected to the remote-control handle module 9 and the wireless data module 10, respectively, and mainly completes robot control, sensor-information fusion, environment-map construction, and the like.
  • The output terminals of the vehicle-mounted computer module 8 are connected to the vehicle control module 11 and the wireless data module 10, respectively; the wireless data module 10 is connected to the vehicle sensor module 12; the vehicle control module 11 is connected to the motor drive module 13; and the vehicle-mounted computer module 8 mainly completes the collection and transmission of data from the sensors, the cameras 15, and the like.
  • The internal processing flow of the image processing module 5 is as follows: first initialize; then acquire the target image; read the Y component of the image and binarize it; obtain the target's edge pixels from the binarized image; and finally determine the spatial position and posture of the target from the edge pixels. From the principles of image processing, the more pixels an image has, the higher the recognition and calculation accuracy; the Y-component image is therefore selected for binarization and calculation.
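A minimal sketch of this flow in Python/NumPy follows. The patent does not specify the edge-extraction or pose formulas, so the function name, the threshold, the 4-neighbour edge test, and the moment-based orientation are all illustrative assumptions, chosen as one plausible realization:

```python
import numpy as np

def locate_target(y, thresh=128):
    """Follow the described flow: binarize the Y (luma) component, extract
    the target's edge pixels, then derive image position and orientation
    from those pixels. Names and formulas are illustrative, not from the
    patent."""
    b = y >= thresh                              # binarize the Y component
    pad = np.pad(b, 1)
    # Edge pixels: foreground pixels with at least one background 4-neighbour.
    interior = (pad[:-2, 1:-1] & pad[2:, 1:-1] &
                pad[1:-1, :-2] & pad[1:-1, 2:] & b)
    ys, xs = np.nonzero(b & ~interior)
    if xs.size == 0:
        return None                              # no target found
    cx, cy = xs.mean(), ys.mean()                # position: edge centroid
    # Orientation: principal axis of the edge set via second central moments.
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    return cx, cy, 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
```

For an axis-aligned bright rectangle, the edge centroid lands at the rectangle's center and the principal-axis angle is zero, which matches the intuition behind using edge pixels for position and posture.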
  • The vehicle-mounted sensors are divided into internal sensors and external sensors; the internal sensors include an odometer and a gyroscope and are mainly used to detect internal parameters of the robot.
  • The external sensors include a visual sensor, a laser sensor, an infrared sensor, and an MTI sensor; the MTI sensor is a miniature attitude and heading sensor. Data about the external environment are obtained through the external sensors, processed by the on-board computer, and transmitted to the control-end computer.
  • While the robot is moving, slopes and obstacles in the external environment cause it to tilt; when the tilt angle is large, obstacles that do not belong to the ground plane may be scanned in, producing larger deviations in the sensor data used for mapping and positioning.
  • The MTI sensor uses its internal enhanced three-axis gyroscope to track the spatial attitude of the measured object quickly, while measuring three-axis acceleration and the geomagnetic field to provide compensation, which benefits the stability and control of the robot. The MTI is also small and light, so it can easily be mounted on the robot to accurately capture the robot's posture while crossing obstacles.
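The gyroscope-plus-accelerometer compensation idea can be illustrated with a standard complementary filter. This is a generic single-axis sensor-fusion sketch, not the MTI's actual internal algorithm, and all names and the gain value are assumptions:

```python
import math

def tilt_update(angle, gyro_rate, accel_x, accel_z, dt, k=0.98):
    """One complementary-filter step for a single tilt angle (radians).
    The gyro integral tracks fast motion but drifts; the accelerometer's
    gravity direction is noisy but drift-free. Blending the two gives a
    stable tilt estimate -- the same division of labour described for the
    MTI's gyroscope-plus-accelerometer compensation."""
    gyro_angle = angle + gyro_rate * dt          # short-term: integrate rate
    accel_angle = math.atan2(accel_x, accel_z)   # long-term: gravity reference
    return k * gyro_angle + (1.0 - k) * accel_angle

# With the body held at a constant 0.1 rad tilt and a stationary gyro,
# the estimate converges to the accelerometer's gravity reading.
angle = 0.0
for _ in range(500):
    angle = tilt_update(angle, gyro_rate=0.0,
                        accel_x=math.sin(0.1), accel_z=math.cos(0.1), dt=0.01)
```

The gain `k` sets the crossover: close to 1 trusts the gyro over short horizons while still letting the accelerometer cancel long-term drift.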
  • Embodiment 2: as shown in Figure 4, the present invention provides an industrial robot, including a robot body 14; a plurality of cameras 15 are fixed at the top of the robot body 14, and an observation platform 16 is fixed at the middle of the bottom end of the robot body 14.
  • Two steering gears 17 are symmetrically fixed at the bottom end of the observation platform 16 and are connected to the controller inside the robot body 14.
  • The observation platform 16 is a two-degree-of-freedom motion platform driven by the two steering gears 17, which keeps the robot as stable as possible during measurement and thereby improves measurement accuracy.
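One simple way such a two-servo platform can keep its payload level is to drive each servo to the negative of the measured body tilt about its axis, clamped to the servo's travel. The function below is an illustrative sketch; the names, sign convention, and 45-degree travel limit are assumptions, not taken from the patent:

```python
import math

def leveling_commands(body_roll, body_pitch, travel=math.radians(45)):
    """Counter-rotation commands (radians) for the two steering gears of a
    two-degree-of-freedom observation platform: each servo cancels the body
    tilt about its own axis, limited to the servo's mechanical travel."""
    clamp = lambda a: max(-travel, min(travel, a))
    return clamp(-body_roll), clamp(-body_pitch)
```

Within the travel limit the platform tilt cancels the body tilt exactly; beyond it, the servo saturates and residual tilt remains, which is when the attitude sensor's readings matter most.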
  • In operation, the microprocessor first sends an image acquisition signal to the image sensor. After receiving the signal, the image sensor converts the optical image into an electrical signal; after amplification, the data are transferred to the storage unit through DMA. The microprocessor then reads the image data from the storage unit, computes the spatial position and posture of the target, and finally sends the result to the robot controller through the communication interface to control the movement of the robot.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to an industrial robot mapping and positioning system, and to a robot. The system comprises a vision module (1), a control module (2), and a power supply module (3), the vision module (1) being composed of an image acquisition module (4), an image processing module (5), and a communication module (6); the control module (2) is connected to the vision module (1) and to the power supply module (3); the power supply module (3) is connected to the vision module (1); the image acquisition module (4) is connected to the image processing module (5); the image processing module (5) is connected to the communication module (6); the control module (2) is composed of a control-end computer module (7) and a vehicle-mounted computer module (8); and the control-end computer module (7) is connected to the vehicle-mounted computer module (8). The present invention can accurately acquire the position and posture information of a workpiece target, reducing manual labor and making the system more convenient for people to use.
PCT/CN2019/103497 2019-07-28 2019-08-30 Industrial robot mapping and positioning system and robot WO2021017083A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910685916.8A CN110421563A (zh) 2019-07-28 2019-07-28 Industrial robot mapping and positioning system and robot
CN201910685916.8 2019-07-28

Publications (1)

Publication Number Publication Date
WO2021017083A1 (fr) 2021-02-04

Family

ID=68411051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/103497 WO2021017083A1 (fr) 2019-07-28 2019-08-30 Industrial robot mapping and positioning system and robot

Country Status (2)

Country Link
CN (1) CN110421563A (fr)
WO (1) WO2021017083A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114161434B (zh) * 2021-11-03 2024-04-12 深圳市芯众云科技有限公司 Humanoid robot control system based on vision and wireless technology
CN114515923A (zh) * 2022-03-11 2022-05-20 上海隧道工程智造海盐有限公司 Visual positioning system applied to tunnel rebar welding

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101920498A (zh) * 2009-06-16 2010-12-22 泰怡凯电器(苏州)有限公司 Device for simultaneous localization and mapping of an indoor service robot, and robot
CN201904871U (zh) * 2010-04-23 2011-07-20 北京航空航天大学 Novel embedded image acquisition and processing device
CN103399577A (zh) * 2013-08-02 2013-11-20 哈尔滨工程大学 Detection and perception system for a remotely operated rescue robot
CN107169977A (zh) * 2017-04-24 2017-09-15 华南理工大学 Adaptive-threshold color image edge detection method based on FPGA and the Kirsch operator
CN109597406A (zh) * 2018-10-30 2019-04-09 昆山睿力得软件技术有限公司 Visual positioning and guidance device and method
CN109664292A (zh) * 2017-10-16 2019-04-23 南京敏光视觉智能科技有限公司 Robot visual positioning system and method for welding
CN109697720A (zh) * 2017-10-20 2019-04-30 南京敏光视觉智能科技有限公司 Industrial machine-vision recognition and analysis system
US20190143525A1 (en) * 2017-11-16 2019-05-16 Kabushiki Kaisha Toshiba Actuation system and computer program product
CN110039536A (zh) * 2019-03-12 2019-07-23 广东工业大学 Self-navigating robot system and image matching method for indoor map construction and positioning

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN208826630U (zh) * 2018-08-21 2019-05-07 Articulated mechanical arm with a remote master manipulator
CN109352647A (zh) * 2018-10-12 2019-02-19 Visual grasping system of a six-axis robot for automobiles


Also Published As

Publication number Publication date
CN110421563A (zh) 2019-11-08

Similar Documents

Publication Publication Date Title
CN110119698B (zh) Method, apparatus, device, and storage medium for determining an object state
CN106981082B (zh) Vehicle-mounted camera calibration method and apparatus, and vehicle-mounted device
CN112189225B (zh) Lane line information detection apparatus and method, and computer-readable recording medium storing a computer program programmed to execute the method
CN101435704B (zh) Star tracking method for a star sensor under high dynamics
JP2020085886A (ja) Vehicle, vehicle positioning system, and vehicle positioning method
CN106625673A (zh) Assembly system and assembly method for confined spaces
JP2019528501A (ja) Camera alignment in a multi-camera system
CN106066179A (zh) Method and control system for recovering a lost robot position based on the ROS operating system
WO2014027478A1 (fr) Road environment recognition device
CN112734765A (zh) Mobile robot positioning method, system, and medium based on instance segmentation and multi-sensor fusion
CN111967360A (zh) Wheel-based target vehicle attitude detection method
WO2013145025A1 (fr) Stereo camera system and mobile object
CN105835030A (zh) Multifunctional detection vehicle for dangerous areas based on wireless control and video transmission
WO2021017083A1 (fr) Industrial robot mapping and positioning system and robot
CN113311821B (zh) Mapping and positioning system and method for a mobile flaw-detection robot for multiple suspended pipelines
CN113821040A (zh) Robot with fused navigation by depth vision camera and lidar
CN113819905A (zh) Odometry method and device based on multi-sensor fusion
CN110901638B (zh) Driving assistance method and system
CN112819711A (zh) Monocular-vision vehicle inverse positioning method using road lane lines
CN115903857A (zh) RFID-based unmanned grain-surface inspection device and positioning method
CN108253929A (zh) Four-wheel aligner, system, and implementation method
CN111145262A (zh) Vehicle-mounted monocular calibration method
CN111145260A (zh) Vehicle-mounted binocular calibration method
CN103278141B (zh) Infant sleep monitoring system and method
CN113701750A (zh) Underground multi-sensor fusion positioning system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19939621

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19939621

Country of ref document: EP

Kind code of ref document: A1
