CN110722569A - Real-time image acquisition and processing system based on robot action - Google Patents

Info

Publication number
CN110722569A
CN110722569A
Authority
CN
China
Prior art keywords
module
electrically connected
output end
robot
input end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911087855.1A
Other languages
Chinese (zh)
Inventor
He Tao (何涛)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wenzhou Polytechnic
Original Assignee
Wenzhou Polytechnic
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wenzhou Polytechnic filed Critical Wenzhou Polytechnic
Priority to CN201911087855.1A
Publication of CN110722569A
Legal status: Pending

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1679: Programme controls characterised by the tasks executed
    • B25J 9/1689: Teleoperation
    • B25J 9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • B25J 9/1697: Vision controlled systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/66: Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Manipulator (AREA)
  • Toys (AREA)

Abstract

The invention relates to the technical field of robot actions and discloses a real-time image acquisition and processing system based on robot actions. In this system, instructions are entered remotely through the remote module: via the remote module and the instruction output module, a computer, mobile phone, iPad, or dedicated app can be used to input instructions, which is very convenient. When the identification module recognizes the signal transmitted by the instruction output module, it passes the signal to the first control module, which then sets the infrared rangefinder to work. The rangefinder detects the distance to the object to be manipulated, after which the positioning module performs real-time localization. When the light sensing module senses dim light, it sets the light module to work, while the camera module captures and processes images, thereby achieving the goal of high practicability.

Description

Real-time image acquisition and processing system based on robot action
Technical Field
The invention relates to the technical field of robot actions, in particular to a real-time image acquisition and processing system based on robot actions.
Background
A robot is a machine that can execute work automatically. It can accept human commands, communicate with humans, run pre-programmed routines, and act according to principles formulated with artificial-intelligence techniques. Its task is to assist or replace human work, for example in manufacturing, construction, or dangerous jobs. The robot is a product of advanced integrated control theory, mechatronics, computing, materials science, and bionics, and today has important applications in industry, medicine, agriculture, and even military fields.
The general action debugging method for humanoid robots disclosed in Chinese patent CN104057452B obtains all steering-engine parameter data by sending commands to the robot's steering engines. That invention is a general, independent debugging method with strong universality: because it reads back the execution results of the robot's steering engines in real time, action debugging becomes purposeful and directional rather than reliant on experience, and a cubic-spline interpolation technique greatly optimizes the robot's actions, making them stable and smooth. However, that robot cannot be positioned in time for its work, so practicability is low: when small objects are retrieved from a narrow space, the debugging method can make the robot's actions more stable, but the distance and position of the object are not well controlled, precise operation is inconvenient, and practicability is greatly reduced.
Disclosure of Invention
Technical problem to be solved
Aiming at the shortcomings of the prior art, the invention provides a real-time image acquisition and processing system based on robot action with the advantage of high practicability. It solves the problem that, when small articles are retrieved from a narrow space, existing debugging methods can make the robot's actions more stable but do not control the distance and position of the article well, so precise operation is inconvenient and practicability is greatly reduced.
(II) technical scheme
In order to achieve the purpose of high practicability, the invention provides the following technical scheme: a real-time image acquisition and processing system based on robot action comprises a remote module. The output end of the remote module is electrically connected with the input end of an instruction input module; the output end of the instruction input module is electrically connected with the input end of an identification module; the output end of the identification module is electrically connected with the input end of a first control module; the output end of the first control module is electrically connected with the input end of a position sensing module; the output end of the position sensing module is electrically connected with the input end of a positioning module; the output end of the positioning module is electrically connected with the input end of a camera module and with the input end of a light sensing module; the output end of the light sensing module is electrically connected with the input end of a light module; the output end of the camera module is electrically connected with the input end of an acquisition module; the output end of the acquisition module is electrically connected with the input end of a signal collection module; the output end of the signal collection module is electrically connected with the input end of an analysis module; the output end of the analysis module is electrically connected with the input end of a second control module; and the output end of the second control module is electrically connected with the input end of an execution module.
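The chain of output-end-to-input-end connections above amounts to a linear dataflow pipeline. The sketch below models it that way in Python; the module names come from the text, but the stage functions and the data they pass (instruction, validation flags, a range reading) are illustrative assumptions, not the patent's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Pipeline:
    """Each stage receives the previous stage's output, mirroring the
    output-end -> input-end electrical connections described above."""
    stages: list = field(default_factory=list)

    def add(self, name, fn):
        self.stages.append((name, fn))
        return self

    def run(self, signal):
        trace = []
        for name, fn in self.stages:
            signal = fn(signal)
            trace.append(name)
        return signal, trace

# Hypothetical stage functions standing in for the first four modules.
p = (Pipeline()
     .add("remote", lambda s: {"instruction": s})
     .add("instruction_input", lambda s: {**s, "validated": True})
     .add("identification", lambda s: {**s, "recognized": True})
     .add("first_control", lambda s: {**s, "range_cm": 42.0}))

result, order = p.run("grasp")
```

Running `p.run("grasp")` visits the stages in wiring order, which is the property the claim's connection chain guarantees.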
Preferably, the execution module includes a reading module, an output end of the reading module is electrically connected to an input end of the parameter setting module, an output end of the parameter setting module is electrically connected to an input end of the time module, and an output end of the time module is electrically connected to an input end of the action module.
Preferably, the acquisition module extracts a first image from image data of the camera module at a preset frame rate.
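Extracting frames "at a preset frame rate" from a faster camera stream reduces to index arithmetic. The helper below is a minimal sketch of that behaviour; the function name and signature are assumptions, not part of the patent.

```python
def frames_to_sample(camera_fps: float, target_fps: float, n_frames: int):
    """Indices of frames to keep when downsampling a camera stream to a
    preset extraction rate (the acquisition module's described behaviour,
    reduced to index arithmetic)."""
    if target_fps <= 0 or camera_fps <= 0:
        raise ValueError("frame rates must be positive")
    step = camera_fps / target_fps
    kept, next_pick = [], 0.0
    for i in range(n_frames):
        if i >= next_pick:
            kept.append(i)
            next_pick += step
    return kept
```

For example, a 30 fps camera sampled at a preset 10 fps keeps every third frame index.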
Preferably, the analysis module comprises an image conversion unit, which removes noise from the picture and converts it into a digital code acceptable to the device.
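The image conversion unit's two steps, denoising and conversion to a digital code, can be sketched on a single row of pixels with a median filter followed by binarization. This is an illustrative stand-in chosen for brevity; the patent does not specify the filter or the code.

```python
def denoise_and_digitize(pixels, threshold=128):
    """Median-filter a 1-D pixel row to suppress impulse noise, then
    binarize it: a minimal stand-in for 'eliminate noise, convert the
    picture into a digital code'."""
    # Pad the edges so every pixel has a 3-wide neighbourhood.
    padded = [pixels[0]] + list(pixels) + [pixels[-1]]
    smoothed = [sorted(padded[i:i + 3])[1] for i in range(len(pixels))]
    return [1 if p >= threshold else 0 for p in smoothed]
```

Note how the isolated dark pixel in `[200, 200, 0, 200, ...]` is absorbed by the median before thresholding.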
Preferably, the remote module can be a computer, a mobile phone, an iPad, or a dedicated app.
Preferably, the light sensing module is a light sensor, the type of the light sensor can be NHZD10, and the light module includes cool white and warm white lights.
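The light sensing module's role, switching the light module on when ambient light is dim, can be sketched as a threshold rule. The lux threshold and the choice between the cool-white and warm-white lamps are illustrative assumptions; the patent only states that both lamp types exist.

```python
def lamp_command(lux: float, dark_lux: float = 50.0):
    """Turn the light module on when ambient light falls below a
    threshold. The tint selection between the cool-white and warm-white
    lamps is a hypothetical policy, not specified by the text."""
    if lux >= dark_lux:
        return {"on": False, "tint": None}
    tint = "cool_white" if lux > dark_lux / 2 else "warm_white"
    return {"on": True, "tint": tint}
```

A real controller would add hysteresis around `dark_lux` so the lamp does not flicker near the threshold.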
Preferably, the position sensing module is an infrared distance meter, and the model of the infrared distance meter may be KTR-GP2D12.
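GP2D12-class infrared rangefinders output an analog voltage that falls roughly inversely with distance. The power-law fit below is a commonly used approximation for the Sharp GP2D12 family; it is not a calibrated curve for the KTR-GP2D12 named above, and a real deployment would calibrate per sensor.

```python
def gp2d12_distance_cm(voltage: float) -> float:
    """Approximate distance from a GP2D12-style IR rangefinder's analog
    output, using an assumed inverse-voltage power-law fit."""
    if voltage <= 0.0:
        raise ValueError("voltage must be positive")
    return 27.86 * voltage ** -1.15
```

The model captures the sensor's key property: higher output voltage means a closer object.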
Preferably, the execution module is configured to execute a motor control parameter in at least one degree of freedom based on the time series, so that the robot performs an action in at least one degree of freedom.
Preferably, the motion module controls a motor which moves the corresponding moving part in at least one degree of freedom, and the model of the motor may be Y80M1-2.
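Executing "motor control parameters in at least one degree of freedom based on the time series" amounts to playing back time-stamped set-points per degree of freedom in timestamp order. The sketch below assumes that reading; the `send` callback and the timeline layout are hypothetical, not the patent's interface.

```python
def execute_timeline(timeline, send):
    """Play back time-stamped motor set-points, one list per degree of
    freedom, in timestamp order. `send(dof, value, at=t)` is a
    hypothetical motor-command callback."""
    # Flatten (t, value) pairs of every DOF and sort them globally by time.
    events = sorted(
        (t, dof, value)
        for dof, points in enumerate(timeline)
        for t, value in points
    )
    for t, dof, value in events:
        send(dof, value, at=t)
    return [(t, dof) for t, dof, _ in events]
```

Interleaving commands from several degrees of freedom by timestamp is what lets one scheduler drive a multi-joint action.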
(III) advantageous effects
Compared with the prior art, the invention provides a real-time image acquisition and processing system based on robot action, which has the following beneficial effects:
1. This real-time image acquisition and processing system based on robot action allows convenient remote instruction input through the remote module: via the remote module and the instruction output module, a computer, mobile phone, iPad, or dedicated app can be used to enter instructions. After the identification module recognizes the signal transmitted by the instruction output module, the signal is passed to the first control module, which then sets the infrared rangefinder to work. The rangefinder detects the distance to the object to be manipulated, after which the positioning module performs real-time localization. When the light sensing module senses that the light is dim, it sets the light module to work, while the camera module captures images; the acquisition module extracts a first image from the camera module's image data at a preset frame rate and provides it to the data information collection module, thereby achieving the goal of high practicability.
2. This real-time image acquisition and processing system based on robot action arranges the information passed through the information collection module and provides it to the analysis module, which analyzes it. After the reading module reads the information, it is handed to the parameter setting module; once the parameter setting module has set the time and action parameters, they are passed to the time module and the action module. The time module controls how long the robot runs, avoiding wasted electricity, and the action module controls the motor that moves the corresponding moving part in at least one degree of freedom so that the whole robot performs its work, thereby achieving the goal of high practicability.
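The time module's job, capping the robot's running time to avoid wasted electricity, can be sketched as a run-time budget around a repeated task. The function name and budget policy are illustrative assumptions.

```python
import time

def run_with_budget(task, budget_s: float) -> int:
    """Repeat a task until a run-time budget is exhausted, echoing the
    time module's role of limiting run time to save power. Returns the
    number of completed cycles."""
    deadline = time.monotonic() + budget_s
    cycles = 0
    while time.monotonic() < deadline:
        task()
        cycles += 1
    return cycles
```

A production controller would likely also abort a long-running cycle mid-task rather than only checking between cycles.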
Drawings
FIG. 1 is a schematic structural diagram of a real-time image acquisition and processing system based on robot motion according to the present invention;
FIG. 2 is a schematic structural diagram of a real-time image acquisition and processing system based on robot motion according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to figs. 1-2, a real-time image acquisition and processing system based on robot action comprises a remote module. The execution module comprises a reading module, whose output end is electrically connected with the input end of a parameter setting module; the output end of the parameter setting module is electrically connected with the input end of a time module, and the output end of the time module is electrically connected with the input end of an action module. The output end of the remote module is electrically connected with the input end of the instruction input module, the output end of the instruction input module is electrically connected with the input end of the identification module, the output end of the identification module is electrically connected with the input end of the first control module, and the output end of the first control module is electrically connected with the input end of the position sensing module. The remote module is used for remote input of instructions: through the remote module and the instruction output module, a computer, mobile phone, iPad, or dedicated app can conveniently be used to enter instructions. The identification module recognizes signals transmitted by the instruction output module and passes them to the first control module, which then sets the infrared rangefinder to work; the rangefinder detects the distance to the object to be manipulated, after which the positioning module performs real-time localization. When the light sensing module senses dim light, the light module is set to work, and the camera module captures images at the same time; the acquisition module extracts a first image from the camera module's image data at a preset frame rate and provides it to the data information collection module, achieving the goal of high practicability.
The output end of the position sensing module is electrically connected with the input end of the positioning module. The analysis module comprises an image conversion unit, which removes noise from the picture and converts it into a digital code acceptable to the device. The output end of the positioning module is electrically connected with the input end of the camera module and with the input end of the light sensing module. The remote module can be a computer, a mobile phone, an iPad, or a dedicated app. The output end of the light sensing module is electrically connected with the input end of the light module, and the output end of the camera module is electrically connected with the input end of the acquisition module. The light sensing module is a light sensor, whose type can be NHZD10; the light module comprises cool-white and warm-white lights. The output end of the acquisition module is electrically connected with the input end of the signal collection module. The position sensing module is an infrared rangefinder, whose model can be KTR-GP2D12. The output end of the signal collection module is electrically connected with the input end of the analysis module. The execution module is used to execute motor control parameters in at least one degree of freedom based on the time series, so that the robot completes an action in at least one degree of freedom. The output end of the analysis module is electrically connected with the input end of the second control module, and the output end of the second control module is electrically connected with the input end of the execution module. The action module controls a motor that moves the corresponding moving part in at least one degree of freedom; the model of the motor can be Y80M1-2. The transmitted information is sorted by the information collection module and provided to the analysis module, which analyzes it. After the information is read by the reading module, it is handed to the parameter setting module; once the parameter setting module has set the time and action parameters, they are passed to the time module and the action module. The time module controls the robot's running time, avoiding wasted electricity, and the action module controls the motor that moves the corresponding moving part in at least one degree of freedom so that the whole robot works, thereby achieving the goal of high practicability.
In use, instructions are input remotely through the remote module: via the remote module and the instruction output module, a computer, mobile phone, iPad, or dedicated app can be used to enter instructions, which is very convenient. When the identification module recognizes the signal transmitted by the instruction output module, it passes the signal to the first control module, which then sets the infrared rangefinder to work.
In summary, this real-time image acquisition and processing system based on robot motion is convenient to use. The remote module is used for remote input of instructions; through the remote module and the instruction output module, a computer, mobile phone, iPad, or dedicated app can be used to enter them. The identification module recognizes signals transmitted by the instruction output module and passes them to the first control module, which controls the infrared rangefinder. The rangefinder detects the distance to the object to be manipulated, after which the positioning module performs real-time localization. The light module is set to work when the light sensing module senses dim light, while the camera module captures images; the acquisition module extracts a first image from the camera module's image data at a preset frame rate and provides it to the data information collection module, achieving the goal of high practicability.
The transmitted information is arranged by the information collection module and provided to the analysis module, which analyzes it. After the information is read by the reading module, it is sent to the parameter setting module; once the parameter setting module has set the time and action parameters, the time module controls the robot's running time to avoid wasting electricity, and the action module controls the motor that moves the corresponding moving part in at least one degree of freedom to make the whole robot work. This achieves the purpose of high practicability and solves the problem that, when small articles are retrieved from a narrow space, existing debugging methods can make the robot's actions more stable but do not control the distance and position of the article well, so precise operation is inconvenient and practicability is greatly reduced.
It is to be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (9)

1. A real-time image acquisition and processing system based on robot action, comprising a remote module, characterized in that: the output end of the remote module is electrically connected with the input end of the instruction input module, the output end of the instruction input module is electrically connected with the input end of the identification module, the output end of the identification module is electrically connected with the input end of the first control module, the output end of the first control module is electrically connected with the input end of the position sensing module, the output end of the position sensing module is electrically connected with the input end of the positioning module, the output end of the positioning module is electrically connected with the input end of the camera module, the output end of the positioning module is electrically connected with the input end of the light sensing module, the output end of the light sensing module is electrically connected with the input end of the light module, the output end of the camera module is electrically connected with the input end of the acquisition module, the output end of the acquisition module is electrically connected with the input end of the signal collection module, the output end of the signal collection module is electrically connected with the input end of the analysis module, the output end of the analysis module is electrically connected with the input end of the second control module, and the output end of the second control module is electrically connected with the input end of the execution module.
2. The robot-action-based real-time image acquisition and processing system according to claim 1, wherein: the execution module comprises a reading module, the output end of the reading module is electrically connected with the input end of the parameter setting module, the output end of the parameter setting module is electrically connected with the input end of the time module, and the output end of the time module is electrically connected with the input end of the action module.
3. The robot-action-based real-time image acquisition and processing system according to claim 1, wherein: the acquisition module extracts a first image from the image data of the camera module according to a preset frame rate.
4. The robot-action-based real-time image acquisition and processing system according to claim 1, wherein: the analysis module includes an image conversion unit that can remove noise from the picture while converting the picture into a digital code acceptable to the device.
5. The robot-action-based real-time image acquisition and processing system according to claim 1, wherein: the remote module can be a computer, a mobile phone, an iPad, or a dedicated app.
6. The robot-action-based real-time image acquisition and processing system according to claim 1, wherein: the light sensing module is a light sensor, the type of the light sensor can be NHZD10, and the light module comprises cool-white and warm-white lights.
7. The robot-action-based real-time image acquisition and processing system according to claim 1, wherein: the position sensing module is an infrared distance meter, and the model of the infrared distance meter can be KTR-GP2D12.
8. The robot-action-based real-time image acquisition and processing system according to claim 1, wherein: the execution module is used for executing motor control parameters on at least one degree of freedom based on the time sequence to enable the robot to complete actions on at least one degree of freedom.
9. The robot-action-based real-time image acquisition and processing system according to claim 2, wherein: the motion module controls a motor of which the corresponding moving part moves in at least one degree of freedom, and the model of the motor can be Y80M1-2.
CN201911087855.1A 2019-11-08 2019-11-08 Real-time image acquisition and processing system based on robot action Pending CN110722569A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911087855.1A CN110722569A (en) 2019-11-08 2019-11-08 Real-time image acquisition and processing system based on robot action

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911087855.1A CN110722569A (en) 2019-11-08 2019-11-08 Real-time image acquisition and processing system based on robot action

Publications (1)

Publication Number Publication Date
CN110722569A true CN110722569A (en) 2020-01-24

Family

ID=69225114

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911087855.1A Pending CN110722569A (en) 2019-11-08 2019-11-08 Real-time image acquisition and processing system based on robot action

Country Status (1)

Country Link
CN (1) CN110722569A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010037163A1 * 2000-05-01 2001-11-01 iRobot Corporation Method and system for remote control of mobile robot
CN205058045U * 2015-10-26 2016-03-02 Zhongde Dike Technology (Beijing) Co., Ltd. (众德迪克科技(北京)有限公司) Robot with visual servoing
CN108214487A * 2017-12-16 2018-06-29 Electric Power Research Institute of Guangxi Power Grid Co., Ltd. Robot target positioning and grasping method based on binocular vision and laser radar
CN108724192A * 2018-06-28 2018-11-02 Henan Institute of Engineering A robot motion control system
CN109792886A * 2019-02-25 2019-05-24 Yiwu Jilong Technology Co., Ltd. (义乌市吉龙科技有限公司) Automatic orchard fruit-picking system


Similar Documents

Publication Publication Date Title
CN106826838B (en) Interaction bionic mechanical arm control method based on Kinect visual depth sensor
CN205219101U (en) Service robot of family
CN103398702B (en) A kind of mobile robot's remote operation controller and manipulation technology thereof
CN101762231B (en) Device and method for detecting appearance of mobile phone keys
CN205572431U (en) System of assembling of intelligent vision robot
CN104647388A (en) Machine vision-based intelligent control method and machine vision-based intelligent control system for industrial robot
US20210311480A1 (en) Self-learning robot
CN103941864A (en) Somatosensory controller based on human eye binocular visual angle
CN109752724A (en) A kind of image laser integral type navigation positioning system
Amin et al. Detection & distinction of colors using color sorting robotic arm in a pick & place mechanism
CN111507325A (en) Industrial visual OCR recognition system and method based on deep learning
CN111025977A (en) Intelligent clothes airing system based on Internet of things and deep learning algorithm
CN110722569A (en) Real-time image acquisition and processing system based on robot action
Shang-Liang et al. Using deep learning technology to realize the automatic control program of robot arm based on hand gesture recognition
CN205817879U (en) A kind of Smart Home moves robot
CN203243530U (en) A system of internet of things
Das et al. Vision based object tracking by mobile robot
CN209486480U (en) A kind of SCM Based intelligence exploration robot vehicle
CN109754003B (en) Intelligent robot vision technology application detection system and method based on deep learning
Kuo et al. Remote control based hybrid-structure robot design for home security applications
Sampreeth et al. Object Sorting Robot Using Image Processing
Zhang et al. Multifunctional Robot Grasping System Based on Deep Learning and Image Processing
Umapathy et al. Night Surveillance Robot for Women Safety
CN207027516U (en) A kind of robot of view-based access control model and speech-sound intelligent control
Li et al. Research on Gesture Control Manipulator System based on Leap Motion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200124