CN110893620A - 3D printing robot based on machine vision - Google Patents

3D printing robot based on machine vision

Info

Publication number
CN110893620A
CN110893620A
Authority
CN
China
Prior art keywords
module
printing
model
vision
embedded processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911166679.0A
Other languages
Chinese (zh)
Inventor
袁丽英
王瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin University of Science and Technology
Original Assignee
Harbin University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin University of Science and Technology filed Critical Harbin University of Science and Technology
Priority to CN201911166679.0A priority Critical patent/CN110893620A/en
Publication of CN110893620A publication Critical patent/CN110893620A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1674Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676Avoiding collision or forbidden zones
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages
    • B25J5/005Manipulators mounted on wheels or on carriages mounted on endless tracks or belts
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B29WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29CSHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
    • B29C64/00Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
    • B29C64/20Apparatus for additive manufacturing; Details thereof or accessories therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B33ADDITIVE MANUFACTURING TECHNOLOGY
    • B33YADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y30/00Apparatus for additive manufacturing; Details thereof or accessories therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Chemical & Material Sciences (AREA)
  • Materials Engineering (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)

Abstract

A 3D printing robot based on machine vision is provided, characterized in that it performs three-dimensional stereo imaging of the model to be printed through a machine vision module and constructs a 3D printing structure diagram of the model, from which the 3D printer automatically prints the model. The invention belongs to the technical field of 3D printing. Conventionally, 3D printing requires building the model in corresponding software and drawing its three-dimensional spatial structure before printing with a 3D printer. To replace these cumbersome steps, a 3D printing technique is designed in which machine vision automatically identifies the model and builds it automatically: the machine vision module applies a three-dimensional restoration algorithm to the captured model images to perform three-dimensional modeling, then transmits the result to the 3D printer for printing.

Description

3D printing robot based on machine vision
Technical Field
The invention mainly relates to the field of robot 3D printing modeling, in particular to a model 3D printing modeling robot based on embedded vision.
Background
3D printing modeling robots in the modern 3D printing field come in many varieties, but few can achieve collision avoidance together with three-dimensional restoration of printing connection gaps, so most robots still perform simple 3D printing modeling with fixed printing connection gaps at known positions. A 3D printing modeling robot that can handle unknown positions of printing connection gaps and printed parts, and avoid collisions between the mechanical arm and the insertion corners forming the model, therefore offers higher performance.
Disclosure of Invention
The invention discloses a model 3D printing modeling robot based on embedded vision, whose main feature is that it can automatically identify and calibrate the printing connection gaps of printed parts and achieve collision avoidance. After the 3D printing modeling track is planned, the embedded processor controls the spray head to perform 3D printing modeling along the specified path. When the sensor module detects obstacle information during 3D printing modeling, it immediately transmits the information to the embedded processor, which reacts at once to prevent a collision between the robot's nozzle module and the printed part.
The binocular vision module is installed at the front end of the nozzle of the robot, infrared sensors are installed on all mechanical arms of the nozzle, each sensor is connected with the input port of the embedded processor, and the nozzle module is installed at the tail end of the 3D printing modeling mechanical arm. The embedded processor module is installed in a base of the 3D printing modeling robot, and the power supply module is installed at the lower end of the processor module.
The binocular vision module transmits the acquired two-dimensional image information to the embedded processor, the embedded processor restores the input two-dimensional image information according to a three-dimensional image restoration algorithm, and then positions of printing connection gaps in the three-dimensional restored image are calibrated, so that accurate positioning of the printing connection gaps is achieved.
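The patent does not disclose its three-dimensional image restoration algorithm. For a rectified binocular pair, the standard relation Z = f·B/d recovers depth from the disparity of a matched point; a minimal sketch under that assumption follows, with all names and values illustrative rather than taken from the patent.

```python
def depth_from_disparity(x_left: float, x_right: float,
                         focal_px: float, baseline_m: float) -> float:
    """Recover depth Z (metres) of one matched point from a rectified
    stereo pair: Z = f * B / d, where d = x_left - x_right is the
    disparity in pixels, f the focal length in pixels, and B the
    baseline between the two cameras in metres."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    return focal_px * baseline_m / disparity
```

Repeating this over all matched feature points of the printing connection gap yields the point set on which the gap position is then calibrated.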
As shown in fig. 1, the external structure of the embedded binocular vision model 3D printing modeling robot comprises the following parts: a shell 7, a binocular camera 8, a spray head 9, a binocular camera support 10, a mechanical arm 11, a chassis 12, crawler wheels 13 and a power switch 14. As shown in fig. 2, the internal system consists of an embedded processor module 1, a motor drive module 2, a power supply module 3, an infrared sensor module 4, an embedded binocular vision module 5 and a display screen 6. As shown in fig. 4, the model printing connection gap 15 is formed by inserting a branch pipe 17 into a main pipe 16. The core is the embedded binocular vision module 5 and the embedded processor module 1, which provide a highly reliable solution for detecting, positioning and tracking the model printing connection gap.
The embedded binocular vision module 5 acquires image information of the model's printing connection gap and identifies its characteristic features; a program compiled in KEIL then performs identification and positioning of the printing connection gap, yielding its position information. This position information is processed and analyzed to obtain the X, Y and Z coordinates of the target, which are packaged and sent to the 3D printing modeling robot, thereby realizing the system's identification of the target object.
An STM32 single-chip microcomputer serves as the embedded processor module 1. The motion platform acquires target position information, and programs compiled in KEIL apply PID adjustment so that the 3D printing modeling robot can accurately position the printing connection gap, improving target identification speed and efficiency while ensuring real-time performance.
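The patent states only that PID adjustment is used for positioning, without giving gains or a discrete form. A minimal sketch of a discrete PID controller of the kind the embedded processor could run is shown below; the class name, gains, and sample time are illustrative assumptions.

```python
class PID:
    """Discrete PID controller sketch. Gains kp/ki/kd and sample
    time dt are illustrative; the patent does not specify them."""

    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0      # accumulated error for the I term
        self.prev_error = 0.0    # last error for the D term

    def update(self, setpoint: float, measured: float) -> float:
        """Return the control output for one sample period."""
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

On the real STM32 the output of `update` would be mapped to a PWM duty cycle for the motor driver, as described in the internal-connection section.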
An infrared sensor module 4 is adopted and installed on the mechanical arm of the 3D printing modeling robot. When an infrared sensor detects an obstacle, the posture of the 3D printing modeling mechanical arm is adjusted in time to prevent the robot from colliding with the printed part.
technical scheme
The three-dimensional restoration principle of the invention takes the printing connection gap as the positioning target; a model printing connection gap is formed by two intersecting edges. Depending on the 3D printing modeling requirements, the printing connection gaps of different models vary greatly, so accurate positioning of the printing connection gap is particularly important.
The principle of three-dimensional restoration of the printing connection gap is shown in fig. 3. The gap is detected by the camera, and when the focal length of the camera and the size of the printing connection gap are known, the three-dimensional coordinates of the gap relative to the camera coordinate system can be calculated from the transformation between the gap coordinate system and the camera coordinate system. A coordinate system is established with the center of the printing connection gap as the origin and its plane as the XY plane; when the camera center coincides with the gap center, the position acquired by the system is (0, 0, 0), and the Z coordinate in actually detected target positions is negative. Establishing target position coordinates in this way is the basis of the embedded positioning system.
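The "known focal length plus known gap size" computation above is a standard pinhole-camera back-projection. A minimal single-view sketch is given below, assuming similar triangles (Z = f·W/w); the function and parameter names are illustrative, and the patent's actual binocular scheme would refine the depth estimate with a second view.

```python
def gap_camera_coords(u: float, v: float, pixel_width: float,
                      gap_width_m: float, focal_px: float,
                      cx: float, cy: float):
    """Estimate the 3-D position of a printing connection gap in the
    camera frame from one view, given its known physical width.

    Depth from similar triangles:  Z = f * W / w
    where f is the focal length in pixels, W the physical gap width
    in metres, and w its imaged width in pixels. (u, v) is the gap
    centre in the image; (cx, cy) is the principal point.
    """
    z = focal_px * gap_width_m / pixel_width
    x = (u - cx) * z / focal_px
    y = (v - cy) * z / focal_px
    return x, y, z
```

When the gap centre projects onto the principal point, x and y are both zero, matching the (0, 0, 0) reference configuration described in the text.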
The mechanical arm collision avoidance principle performs three-dimensional restoration of features from the extracted printing connection gap information. The actual height of the model is computed as a three-dimensional coordinate, and the embedded system rapidly adjusts the position and posture of the mechanical arm according to the relative height difference between the arm and the target object in the vertical direction, preventing collision between the mechanical arm and the printed part.
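The height-difference check described above can be sketched as a single comparison against a safety clearance; the margin value and function name below are assumptions for illustration only, as the patent does not quantify the threshold.

```python
SAFETY_MARGIN_M = 0.005  # assumed clearance in metres; not specified in the patent

def needs_pose_adjustment(nozzle_z_m: float, model_top_z_m: float,
                          margin: float = SAFETY_MARGIN_M) -> bool:
    """Return True when the vertical clearance between the nozzle and
    the highest restored point of the printed part falls below the
    margin, signalling the controller to raise or re-pose the arm."""
    return (nozzle_z_m - model_top_z_m) < margin
```

In the described system this check would run each time the restored model height is updated, before the next arm motion is committed.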
As shown in fig. 2, the internal structure of the system includes the embedded processor module 1: an STM32 processor is employed. The motor driving module 2: a full-bridge motor driver chip is adopted, rated for a maximum current of 5 A and supplied from the 24 V power rail. The power supply module 3: the system is powered by a 24 V rechargeable lithium battery, which supplies the motor driving module directly and provides 3.3 V and 5 V rails for the rest of the system through voltage conversion chips. The infrared sensor module 4: a low-power infrared sensor module of model HC-SR501 is adopted, with a working voltage of 3.3 V. The embedded binocular vision module 5: a miniature embedded vision module consisting of an STM32F4-based embedded single chip microcomputer and a camera, powered from the 5 V rail.
As shown in fig. 5, the main program flow of the system software is as follows. The software of the embedded binocular vision 3D printing modeling system first initializes the I/O ports, PWM, SPI and interrupts, then checks whether initialization is complete. Once it is, the serial port selects the embedded binocular vision module and obtains one frame of the camera image, then judges whether a target printing connection gap is present in the image. When one is present, the system receives the three-dimensional coordinates of the gap relative to the camera transmitted by the sensor, parses the coordinate information, and drives the motors according to the corresponding positioning algorithm so that the system locates the target and the mechanical arm accurately reaches the optimal 3D printing modeling position. The robot's movements comprise advancing, retreating, moving left and right, and rotating left and right.
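One pass of the main loop just described can be sketched as a small state function. The callbacks stand in for the serial-port and motor-driver interfaces, which the patent does not specify, so all names here are illustrative assumptions.

```python
def control_cycle(initialized: bool, frame_has_gap: bool,
                  read_gap_coords, drive_motor) -> str:
    """One pass of the main program flow: wait until initialization
    completes, search the current frame for a target printing
    connection gap, then fetch its camera-relative coordinates and
    issue a motion command. Returns the state reached this cycle."""
    if not initialized:
        return "waiting"        # I/O, PWM, SPI, interrupts not ready yet
    if not frame_has_gap:
        return "searching"      # no target gap in this camera frame
    x, y, z = read_gap_coords() # 3-D coords of the gap relative to the camera
    drive_motor(x, y, z)        # positioning algorithm maps coords to motion
    return "positioning"
```

On the real hardware each cycle would be driven by the camera frame rate, with `drive_motor` feeding the PID/PWM chain described in the internal-connection section.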
Drawings
FIG. 1 external structure diagram of embedded binocular vision 3D printing modeling robot
FIG. 2 internal structure diagram of embedded binocular vision 3D printing modeling system
FIG. 3 three-dimensional recovery principle of printing connection gap
FIG. 4 target print joint seam map
FIG. 5 Main program flowchart
In the figure: 1-an embedded processor module; 2, a motor driving module; 3, a power supply module; 4-an infrared sensor module; 5, an embedded binocular vision module; 6, a display screen; 7-a housing; 8-binocular camera; 9-a spray head; 10-binocular camera stand; 11-a mechanical arm; 12-a chassis; 13-a crawler wheel; 14-power switch; 15-model printing of connecting gaps; 16-main pipe; 17-branch pipe.
Detailed Description
External system connection: the crawler-type wheel system 13 and the chassis 12 form a hardware motion platform, each circuit module is fixed on the platform, the tail end of the mechanical arm 11 supports and fixes the spray head 9, the binocular camera support 10 supports and fixes the binocular camera module 8, and the power switch 14 is installed on the side face of the shell 7.
Internal system connection: the power module is connected with each module, the serial port of the embedded processor module 1 is connected with the serial port of the embedded binocular vision module 5, the communication port of the embedded processor module 1 is connected with the infrared sensor module 4, and the serial port of the motor driving module 2 is connected with the serial port of the embedded processor module 1. The infrared sensor module collects position signals of the model, converts the position signals into digital signals and transmits the digital signals to the embedded processor module, and the digital signals are converted into PWM output signals through a PID algorithm to control the motion of the motor.
Through debugging, the embedded binocular vision 3D printing modeling system works stably and achieves autonomous collision avoidance. The identification rate of the printing connection gap is high, its movement can be stably followed and positioned, and the spray head can be placed in the optimal position to perform 3D printing modeling of the printing connection gap.

Claims (4)

1. A 3D printing robot based on machine vision, characterized in that: it comprises an embedded processor module, a binocular vision module, a power supply module, a heating module and a nozzle module; the binocular vision module transmits acquired image information to the embedded processor; the embedded processor module performs three-dimensional restoration on the received images through a corresponding algorithm and, after restoration, marks the three-dimensional coordinates of the model to be printed in the three-dimensional image; the embedded processor then transmits the marked position information to the nozzle module, which prints according to the positions provided by the embedded processor; the heating module heats and melts the 3D printing material; and the power supply module is connected to the embedded processor module, the binocular vision module, the heating module and the nozzle module to provide energy for these 4 modules.
2. The machine-vision-based 3D printing robot of claim 1, wherein: the vision module adopts a Raspberry Pi CM3 binocular vision module, is mainly used for three-dimensional modeling of the model to be printed, and transmits acquired image information to the embedded processor.
3. The machine-vision-based 3D printing robot of claim 1, wherein: the embedded processing module adopts an ST series single chip microcomputer, and the embedded processing module is responsible for processing image information and position information acquired by the binocular vision module and the heating module and controlling the spray head module.
4. The machine-vision-based 3D printing robot of claim 1, wherein: the power supply module is a multi-output power supply providing 24 V, 12 V, 5 V and 3.3 V and is mainly responsible for providing energy for the other modules.
CN201911166679.0A 2019-11-25 2019-11-25 3D printing robot based on machine vision Pending CN110893620A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911166679.0A CN110893620A (en) 2019-11-25 2019-11-25 3D printing robot based on machine vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911166679.0A CN110893620A (en) 2019-11-25 2019-11-25 3D printing robot based on machine vision

Publications (1)

Publication Number Publication Date
CN110893620A true CN110893620A (en) 2020-03-20

Family

ID=69786733

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911166679.0A Pending CN110893620A (en) 2019-11-25 2019-11-25 3D printing robot based on machine vision

Country Status (1)

Country Link
CN (1) CN110893620A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111674048A (en) * 2020-05-13 2020-09-18 广东工业大学 3D printer broken wire alarm device and alarm method based on machine vision


Similar Documents

Publication Publication Date Title
CN205736996U (en) A kind of electric motor car automatic charging mechanical arm
US20170348858A1 (en) Multiaxial motion control device and method, in particular control device and method for a robot arm
CN106774298B (en) Autonomous charging of robots system and method based on camera and laser aiming positioning
CN108908363B (en) Wireless dragging demonstrator system for spraying robot and active demonstration method thereof
CN114728417A (en) Robot autonomous object learning triggered by a remote operator
CN211324756U (en) Intelligent cleaning robot
CN106527439A (en) Motion control method and apparatus
CN108436912A (en) A kind of control system and its control method of reconstruction robot docking mechanism
CN107297748A (en) A kind of dining room service robot system and application
CN110481356A (en) A kind of the manipulator charging system and method for unmanned plane
CN106393142A (en) Intelligent robot
CN110587602B (en) Fish tank cleaning robot motion control device and control method based on three-dimensional vision
Lee et al. Fast perception, planning, and execution for a robotic butler: Wheeled humanoid m-hubo
CN105082137A (en) Novel robot
CN111230888A (en) RGBD camera-based upper limb exoskeleton robot obstacle avoidance method
CN110893620A (en) 3D printing robot based on machine vision
CN212522923U (en) Ball picking robot system
CN111515928A (en) Mechanical arm motion control system
CN110757466B (en) STM 32-based mine survey robot control system
CN205968985U (en) Portable investigation robot based on intelligent Mobile Terminal control
CN107962546A (en) A kind of image recognition element follows pickup robot
JP2008264901A (en) Mobile robot, its automatic coupling method, and drive control method for parallel link mechanism
CN116100565A (en) Immersive real-time remote operation platform based on exoskeleton robot
CN115632462A (en) Automatic charging system of robot
CN111590165B (en) Ship assemblage plate welding robot based on remote correction and welding method

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200320
