KR20160116445A - Intelligent tools errands robot - Google Patents

Intelligent tools errands robot

Info

Publication number
KR20160116445A
Authority
KR
South Korea
Prior art keywords
robot
tool
shape
arm
body part
Prior art date
Application number
KR1020150044149A
Other languages
Korean (ko)
Inventor
한성현
Original Assignee
경남대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 경남대학교 산학협력단 filed Critical 경남대학교 산학협력단
Priority to KR1020150044149A priority Critical patent/KR20160116445A/en
Publication of KR20160116445A publication Critical patent/KR20160116445A/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/008 Manipulators for service tasks
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00 Manipulators mounted on wheels or on carriages
    • B25J5/005 Manipulators mounted on wheels or on carriages mounted on endless tracks or belts

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to an intelligent tool errand robot in which the robot, in place of a person, recognizes the shape and model number of a tool and of objects by means of character/shape recognition, retrieves the tool from a worktable or drawer, and delivers it to the worker, thereby providing an intelligent tool errand robot that can serve as a substitute for on-site workforce.
That is, the present invention is characterized by: autonomous traveling means provided with a plurality of proximity sensors for autonomous travel by tracking a line marked on the floor of the use area; a body part coupled to the upper side of the autonomous traveling means and having an ultrasonic sensor for measuring the distance to objects in front; a head unit installed on the upper side of the body and provided with a camera; robot arms installed on the left and right sides of the body, each comprising a shoulder joint, an upper arm, an elbow joint, a lower arm, a wrist joint, and a multi-joint hand assembly, which open the drawer of a tool box and grip the tool housed therein; and a controller installed in the body part for controlling the operation of the autonomous traveling means, the ultrasonic sensor, the camera, and the robot arms.

Description

{Intelligent tools errands robot}

The present invention relates to an intelligent tool errand robot, and more particularly, to a robot that recognizes the shape and model number of a tool in place of a person through character/shape recognition of the shapes of tools and objects and of their model numbers, retrieves the tool from a worktable or a drawer, and delivers it to the worker, thereby providing an intelligent tool errand robot that can serve as a substitute for on-site labor personnel.

Object recognition is a next-generation user interface technology that helps users operate computers, communication devices, and home appliances through image recognition, which lies at the core of next-generation human-machine interfaces. It can be applied throughout the information technology industry and lead to high added value in products.

In conventional robot control based on 2D/3D shape recognition, real-time signal processing is essential for pre-processing and shape recognition of the remote environment before the technique can be applied to a robot. Existing techniques suffer from relatively poor noise removal in the low-frequency band, and detection performance depends heavily on that pre-processing. Post-processing techniques are still at an early research stage, and no universally standardized results are yet available.

Object recognition and control technology has been studied for a long time and applied in everyday life, but many problems remain to be solved before 3D image recognition approaches human-level performance.

Conventional image-based unmanned remote control of robots lacks precision and has a high error rate, which degrades reliability.

In Korea, character and shape recognition systems for non-contact inspection are currently introduced mainly as imported commercial and advanced equipment; some companies develop simple software (S/W), but most systems are expensive and difficult for small and medium-sized enterprises (SMEs) to purchase. A few companies are attempting to develop the core processing devices. Application technology in this field is spurring research and development as companies accumulate know-how, and domestic technology is expected to advance further in the future.

Character and shape recognition systems are expected to be applied in various fields, such as quality assurance for strengthening competitiveness and improving labor productivity, and will therefore play a large role in securing corporate competitiveness. Real-time implementation of vision-based control built on low-cost vision technology is expected to become even more urgent in fields such as displays, optical materials, and precision components.

As prior art, Korean Patent Application Publication No. 10-2008-0083746 (September 19, 2008) proposes an errand robot capable of delivering objects and messages through location recognition.

The above prior art is an errand robot that carries an object or message by mediating between a sender and a receiver, the errand robot comprising: a map building unit for recognizing and registering the position of the robot and of specific places; an obstacle detection unit that moves to a position registered by the map building unit and detects obstacles in the direction of travel; member recognition means for recognizing a member, including the sender or receiver, located at a registered place; a key input unit installed on one side of the mobile robot for receiving command information from the sender and receiver; a message input unit for receiving a voice or video signal from the sender to generate a voice or video message, together with a message output unit for delivering that message to the receiver; and a control unit for controlling each of the above units and means. However, this robot merely delivers goods or messages according to a command; it has no function for selecting a tool or the like.

KR 1020080083746 A Sep. 19, 2008.

The present inventor therefore researched and developed the present invention in light of the above problems. Its object is to develop an errand robot that fetches the tools, objects, or documents needed for a given task, thereby addressing the avoidance of manual labor in small businesses and in so-called 3D (difficult, dirty, dangerous) work environments.

That is, the present invention develops character recognition technology for object shapes and part model numbers so that the errand robot can classify the shapes and appearances of precision parts, recognize objects and tools, and, through autonomous recognition and control, retrieve a tool from a worktable or a drawer and deliver it. The invention thus relates to an intelligent tool errand robot system.

According to the present invention, first, there are provided: autonomous traveling means having a plurality of proximity sensors for autonomous travel by tracking a line marked on the floor of the use area; a body part coupled to the upper side of the autonomous traveling means and having an ultrasonic sensor for measuring the distance to the front; a head unit installed on the upper side of the body and provided with a camera; robot arms installed on the left and right sides of the body, each comprising a shoulder joint, an upper arm, an elbow joint, a lower arm, a wrist joint, and a multi-joint hand assembly, which open the drawer of a tool box and grip the tool housed therein; and a controller installed in the body part for controlling the operation of the autonomous traveling means, the ultrasonic sensor, the camera, and the robot arms.

Second, the controller comprises an image recognition interface software kit that performs pre-processing and post-processing of the recognition process in order to recognize and store the character/shape input by the user, an image/shape detection software kit, an image recognition software kit that performs the actual image recognition on the input image, and a command instruction unit for controlling the robot's hand, body motion, and autonomous travel based on the recognized image signal.

Third, the controller stores the shapes of 50 tools needed in general work processes (manufacturing), such as pliers, hammers, drivers, wrenches, spanners, saws, and drills, as basic model shapes, assigns a unique number to each data address, and, when the unique model number is entered with a button, the robot moves along a designated path, recognizes the shape of the designated tool, and retrieves it.

Fourth, the drawer of the tool box and the tools accommodated therein are provided with marks made of a magnetic material, and each mark is positioned at the center of gravity.

The intelligent tool errand robot provided by the present invention, an autonomous mobile robot control system that fetches tools through image recognition of objects, lets users employ robots in the workplace more easily and conveniently as robot control systems become more intelligent and diversified.

In addition, even as the range of work subjects grows, image recognition of a predetermined set of several dozen (50) tools and objects commonly used at work sites is improved through learning, so that the robot can fetch them accurately and efficiently. As a result, the invention provides errand robot technology that autonomously fetches workplace tools and similar objects with increased reliability of the 2D/3D shape recognition service.

BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is an overall schematic view of an intelligent tool errand robot according to a preferred embodiment of the present invention.
Fig. 2 is a structural view of a robot arm of the intelligent tool errand robot according to a preferred embodiment of the present invention.
Fig. 3 is a structural view of the autonomous traveling means of the intelligent tool errand robot according to a preferred embodiment of the present invention.
Fig. 4 is a block diagram of the control structure of the robot controller of the intelligent tool errand robot system according to a preferred embodiment of the present invention.
Fig. 5 is a photograph illustrating a controller module of the intelligent tool errand robot system according to a preferred embodiment of the present invention.
Fig. 6 is a photograph of the omnidirectional obstacle-avoidance ultrasonic sensor module installed in the robot.
Fig. 7 is a front view (S1 to S5) of a drawer associated with the present invention.
Fig. 8 is an exemplary view of a magnetic-material mark applied to a tool associated with the present invention.

Hereinafter, a preferred embodiment of the intelligent tool errand robot system provided in the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is an overall schematic view of an intelligent tool errand robot according to a preferred embodiment of the present invention, FIG. 2 is a structural view of a robot arm of the intelligent tool errand robot according to a preferred embodiment of the present invention, and FIG. 3 is a structural view of the autonomous traveling means of the intelligent tool errand robot according to the preferred embodiment of the present invention.

As shown in the figures, the intelligent tool errand robot 1 provided in the present invention is equipped with autonomous travel means 2 capable of autonomous travel by tracing a line of silver tape (width 10 cm) laid on the floor of the use area.

As shown in FIG. 3, the autonomous traveling means 2 is provided with three wheels 21 at its lower portion. These three wheels 21 can be driven individually or simultaneously, so that the robot can change direction and move in all directions. A plurality of proximity sensors 22 installed in the autonomous travel means 2 track the silver-tape line and cause the robot to stop or avoid travel when an obstacle is detected in the surroundings.
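
The patent does not give an implementation of this line-tracking behavior; the following minimal Python sketch only illustrates the kind of control loop the proximity sensors 22 could feed. The sensor arrangement (left/center/right tape detectors), thresholds, and wheel-speed interface are all assumptions for illustration, not part of the patent text.

    # Hypothetical sketch of line tracking with an obstacle stop, assuming three
    # line-detecting proximity sensors over the silver tape and one range reading.
    # The drive interface and numeric values are invented for illustration.

    def track_line(left_on_tape: bool, center_on_tape: bool, right_on_tape: bool,
                   obstacle_distance_m: float, stop_distance_m: float = 0.5):
        """Return (left_wheel_speed, right_wheel_speed) in arbitrary units."""
        if obstacle_distance_m < stop_distance_m:
            return 0.0, 0.0              # stop or wait when an obstacle is near
        if center_on_tape:
            return 1.0, 1.0              # drive straight along the tape
        if left_on_tape:
            return 0.4, 1.0              # steer left to re-center on the tape
        if right_on_tape:
            return 1.0, 0.4              # steer right to re-center on the tape
        return 0.2, 0.2                  # tape lost: creep forward slowly

    if __name__ == "__main__":
        # Example: tape drifting to the left with no obstacle ahead.
        print(track_line(True, False, False, obstacle_distance_m=2.0))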

The body part 3 is coupled to the upper part of the autonomous travel means 2; a head part 4 is provided on the upper side of the body part 3, and robot arms 5 are provided on its right and left sides.

An ultrasonic sensor 6 capable of sensing distance is attached vertically to the front of the body part 3, so that, together with the proximity sensors 22 provided in the autonomous travel means 2, the travel of the autonomous travel means 2 can proceed smoothly.

A camera 7 is provided on the head part 4. The camera 7 photographs the shape and model number of a tool and provides the image to the controller 8 built into the body part 3.
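
The patent only states that the camera image is handed to the controller's recognition software. Purely as an illustrative sketch, the fragment below shows how a text string returned by such a character-recognition step might be matched against stored model numbers; the model-number list, pattern, and normalization rule are assumptions.

    import re

    # Hypothetical catalogue of stored model numbers; not from the patent text.
    KNOWN_MODEL_NUMBERS = {"SPN-200", "SPN-250", "PLR-150", "HMR-500"}

    def match_model_number(recognized_text: str) -> str | None:
        """Pick the first known model number appearing in recognized text."""
        # Normalize a common character-recognition confusion (assumed rule).
        cleaned = recognized_text.upper().replace("O", "0")
        for token in re.findall(r"[A-Z]{3}-\d{3}", cleaned):
            if token in KNOWN_MODEL_NUMBERS:
                return token
        return None

    if __name__ == "__main__":
        print(match_model_number("spanner spn-250 chrome vanadium"))  # -> SPN-250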

The robot arms 5 installed on both sides of the body part 3 each include a shoulder joint 51 that can move like a human arm, an upper arm 52, an elbow joint 53, a lower arm 54, a wrist joint 55, and a multi-joint hand assembly 56 capable of gripping an object, so that motions similar to those of a human arm can be realized. The hand assembly 56 is provided with a magnetic sensor capable of sensing magnetism.
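
For illustration only, a minimal data model of the arm described above might look like the following; the joint fields follow the reference numerals, while the angle conventions and the magnetic-sensor grip rule are assumptions not taken from the patent.

    from dataclasses import dataclass

    @dataclass
    class RobotArm:
        """Joint angles (degrees) for one arm 5; values are illustrative only."""
        shoulder: float = 0.0    # 51
        upper_arm: float = 0.0   # 52 (rotation about the upper-arm axis, assumed)
        elbow: float = 0.0       # 53
        lower_arm: float = 0.0   # 54 (rotation about the lower-arm axis, assumed)
        wrist: float = 0.0       # 55
        hand_closed: bool = False  # 56, multi-joint hand assembly

        def grip_if_mark_sensed(self, magnetic_field_mT: float,
                                threshold_mT: float = 1.0) -> bool:
            """Close the hand only when the magnetic mark is sensed (assumed rule)."""
            if magnetic_field_mT >= threshold_mT:
                self.hand_closed = True
            return self.hand_closed

    if __name__ == "__main__":
        arm = RobotArm(shoulder=30, elbow=45, wrist=-10)
        print(arm.grip_if_mark_sensed(magnetic_field_mT=2.5))  # True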

With this configuration, the robot arm 5 can open the drawer 10 with the hand assembly 56 or grip objects on the worktable.

A controller (8) is installed in the body part (3).

In order to recognize and store the character/shape input by the user, the controller 8 comprises an image recognition interface software kit that performs pre-processing and post-processing of the recognition process, an image/shape detection software kit, an image recognition software kit that performs the actual image recognition on the input image, and a command instruction unit 9 for controlling the robot's hand, body motion, and autonomous travel based on the recognized image signal.
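
The patent names these four software components but gives no interfaces for them. As a purely illustrative sketch, the pipeline below strings together placeholder stages under those names; every function body, label, and return value is an assumption.

    # Illustrative pipeline mirroring the four components named in the text:
    # pre/post-processing kit, image/shape detection kit, image recognition kit,
    # and the command instruction unit 9. All stage internals are placeholders.

    def preprocess(raw_image):
        """Pre-processing step of the image recognition interface kit (assumed)."""
        return {"normalized": raw_image}

    def detect_shape(image):
        """Image/shape detection kit: locate the candidate tool region (assumed)."""
        return {"region": (0, 0, 100, 100), "image": image}

    def recognize(detection):
        """Image recognition kit: classify the detected region (assumed)."""
        return {"label": "spanner", "model_number": "SPN-250"}

    def issue_command(recognition):
        """Command instruction unit 9: turn the result into robot actions (assumed)."""
        return ["drive_to_toolbox", "fetch " + recognition["model_number"]]

    if __name__ == "__main__":
        print(issue_command(recognize(detect_shape(preprocess("camera_frame")))))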

The controller 8 also stores the shapes of 50 tools needed in general work processes (manufacturing), such as pliers, hammers, drivers, wrenches, saws, and drills, as basic model shapes, assigns a unique number to each data address, and, when a unique model number is entered with a button, the robot moves along a designated path so that the shape of the designated tool can be recognized and retrieved.
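
As a sketch of how such stored tool data might be organized, the snippet below maps unique numbers to tool names, shape identifiers, and drawer stages. The entries and field names are illustrative assumptions, not the 50-entry table from the patent.

    # Hypothetical excerpt of the tool table kept by the controller 8: each unique
    # number maps to a tool name, a basic model shape identifier, and the drawer
    # stage (S1..S5) where the tool is stored. Entries are illustrative only.
    TOOL_TABLE = {
        1: {"name": "hammer",  "shape_id": "shape_hammer_basic",  "stage": "S1"},
        2: {"name": "pliers",  "shape_id": "shape_pliers_basic",  "stage": "S2"},
        3: {"name": "spanner", "shape_id": "shape_spanner_basic", "stage": "S3"},
        4: {"name": "wrench",  "shape_id": "shape_wrench_basic",  "stage": "S4"},
        5: {"name": "drill",   "shape_id": "shape_drill_basic",   "stage": "S5"},
    }

    def plan_errand(unique_number: int) -> dict:
        """Resolve a button-entered unique number into a fetch plan."""
        tool = TOOL_TABLE[unique_number]
        return {
            "drive_to": "tool_box",           # follow the designated taped path
            "open_drawer": tool["stage"],     # drawer stage holding the tool
            "match_shape": tool["shape_id"],  # shape to confirm with the camera
            "deliver_to": "requesting_worker",
        }

    if __name__ == "__main__":
        print(plan_errand(3))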

The command instruction unit 9 may be configured as a monitor exposed on the outside of the body part 3.

FIG. 4 is a block diagram showing the control structure of the robot controller of the tool errand robot having an interactive function according to a preferred embodiment of the present invention, and FIG. 5 is a photograph of the controller module of the tool errand robot system having an interactive function according to a preferred embodiment of the present invention.

The block diagram shown in Fig. 4 represents the structure of the controller 8. The quantity [equation image pat00001] is the torque required to control the robot's speed, obtained from the measured speed of the robot, and [equation image pat00002] is the torque required to control the robot's heading, obtained from the measured direction angle of the robot. With [equation image pat00003] yielding [equation image pat00004] as the velocity error ratio, and [equation image pat00005] yielding [equation image pat00006] as the direction error ratio, a learning controller for the speed and direction angle of the robot is constructed.
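
The actual control laws appear only as equation images (pat00001 to pat00006) in the original document, so the sketch below is not a reconstruction of them. It merely illustrates one generic way a speed-and-heading controller with error-proportional learning terms could be organized; every gain and update rule is an assumption.

    class SpeedHeadingLearningController:
        """Illustrative speed/heading controller with simple gain adaptation.

        The real control laws are given only as equation images in the patent,
        so this structure is an assumption: each channel outputs a torque
        proportional to its error, and the gain is nudged in proportion to that
        error as a crude 'learning' step.
        """

        def __init__(self, speed_gain=1.0, heading_gain=1.0, learning_rate=0.01):
            self.speed_gain = speed_gain
            self.heading_gain = heading_gain
            self.learning_rate = learning_rate

        def update(self, speed_ref, speed_meas, heading_ref, heading_meas):
            speed_error = speed_ref - speed_meas
            heading_error = heading_ref - heading_meas
            # Torques commanded to the wheel drives (units arbitrary).
            speed_torque = self.speed_gain * speed_error
            heading_torque = self.heading_gain * heading_error
            # Error-proportional gain adaptation (assumed learning rule).
            self.speed_gain += self.learning_rate * abs(speed_error)
            self.heading_gain += self.learning_rate * abs(heading_error)
            return speed_torque, heading_torque

    if __name__ == "__main__":
        ctrl = SpeedHeadingLearningController()
        print(ctrl.update(speed_ref=0.5, speed_meas=0.3,
                          heading_ref=90.0, heading_meas=87.0))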

As shown in FIG. 5, the controller includes a receiving module, a transmitting module, and a main CPU.

FIG. 6 is a photograph of the omnidirectional obstacle-avoidance ultrasonic sensor 6 module installed in the robot.

Fig. 7 shows a front view (S1 to S5) of a drawer 10 in connection with the present invention, and Fig. 8 shows an example of a mark of a magnetic material applied to a tool associated with the present invention.

According to the drawings, the designated tools for each stage (first to fifth stage) of the drawer 10 are arranged first, and their positions are then stored in the controller 8 (e.g., 1st stage: hammers, 2nd stage: pliers, 3rd stage: spanners, 4th stage: wrenches, 5th stage: drills, saws, paper, etc.).

For tools of the same kind but different dimensions, the robot identifies the correct model by reading the tool's model number through character recognition before retrieving it.

Within the drawer 10, each tool is given a unique number: in the first stage S1, tool #1 (S1T1) through tool #10 (S1T10); in the second stage, tool #1 (S2T1) through tool #10 (S2T10); and so on, up to the fifth stage, tool #1 (S5T1) through tool #10 (S5T10). The unique tool numbers within the drawer 10 are thus displayed.
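
As an illustrative sketch of this S{stage}T{tool} numbering scheme (the parsing rule and example labels are assumptions), the identifier can be composed and decomposed as follows.

    import re

    def parse_tool_id(tool_id: str) -> tuple[int, int]:
        """Split an ID like 'S3T7' into (stage, tool) numbers per the SxTy scheme."""
        m = re.fullmatch(r"S([1-5])T(10|[1-9])", tool_id)
        if not m:
            raise ValueError(f"not a valid drawer tool id: {tool_id}")
        return int(m.group(1)), int(m.group(2))

    def make_tool_id(stage: int, tool: int) -> str:
        """Compose the unique tool number for a given drawer stage and slot."""
        return f"S{stage}T{tool}"

    if __name__ == "__main__":
        print(parse_tool_id("S5T10"))   # -> (5, 10)
        print(make_tool_id(2, 3))       # -> S2T3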

A magnetic-material mark is attached to every tool; the robot recognizes the position of the mark through the camera 7 and grasps the object about its center of gravity, the grasping method having been learned beforehand and stored in the controller 8.
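
The patent states only that the mark is found with the camera and that the grasp is made about the center of gravity. The fragment below is a hedged sketch of that idea: it locates a bright mark in a small grayscale frame and uses its centroid as the grasp point. The image format, brightness threshold, and coordinate handling are all assumed.

    # Hedged sketch: locate the magnetic-material mark in a grayscale image
    # (given as a list of rows) and use its centroid as the grasp point, on the
    # assumption that the mark sits at the tool's center of gravity.

    def find_mark_centroid(image, threshold=200):
        """Return the (row, col) centroid of pixels above threshold, or None."""
        rows, cols, count = 0, 0, 0
        for r, row in enumerate(image):
            for c, value in enumerate(row):
                if value >= threshold:
                    rows += r
                    cols += c
                    count += 1
        if count == 0:
            return None
        return rows / count, cols / count

    if __name__ == "__main__":
        # Tiny synthetic frame: the "mark" is the block of 255 values.
        frame = [
            [0,   0,   0,   0, 0],
            [0, 255, 255,   0, 0],
            [0, 255, 255,   0, 0],
            [0,   0,   0,   0, 0],
        ]
        print(find_mark_centroid(frame))  # -> (1.5, 1.5)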

In the intelligent tool errand robot of the present invention configured as described above, when the user enters the unique number or the type of a tool as characters/shapes through the command instruction unit 9 exposed on the body part 3, the controller 8 issues a movement command to the autonomous travel means 2 so that the intelligent tool errand robot moves to the tool box.

Having moved to the front of the tool box, the robot first checks the faces of the drawers 10 with the image provided by the camera 7 installed on the head part 4, identifies the drawer 10 corresponding to the tool number, grips and opens it with the hand assembly 56 of the robot arm 5, and confirms the stored tool with the camera 7. It then grips the tool at its mark portion (center of gravity), operates the autonomous travel means 2 to return to the initial position, and hands the tool to the user.
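
Finally, a high-level sketch of the retrieval sequence just described; each step is a placeholder standing in for a hardware action, so this is an assumption about structure rather than an implementation of the patented system.

    # High-level sketch of the errand sequence described above. Every step is a
    # placeholder printing what the real robot would do; names are assumptions.

    def fetch_tool(tool_id: str) -> None:
        steps = [
            "drive along the taped line to the tool box (autonomous travel means 2)",
            f"inspect the drawer faces with camera 7 and locate the drawer for {tool_id}",
            "open that drawer 10 with the hand assembly 56 of robot arm 5",
            "confirm the stored tool with camera 7",
            "grip the tool at its magnetic mark (center of gravity)",
            "drive back to the initial position and hand the tool to the user",
        ]
        for step in steps:
            print("->", step)

    if __name__ == "__main__":
        fetch_tool("S3T7")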

As described above, the present invention is based on the technical idea of providing a tool errand robot system having an interactive function. Since the embodiment described with reference to the drawings is only one embodiment, the scope of the invention should be determined by the claims rather than by this embodiment.

1: Intelligent tool errand robot 2: Autonomous travel means
21: Wheel 22: Proximity sensor
3: Body part 4: Head part
5: Robot arm 51: Shoulder joint
52: Upper arm 53: Elbow joint
54: Lower arm 55: Wrist joint
56: Hand assembly 6: Ultrasonic sensor
7: Camera 8: Controller
9: Command instruction unit 10: Drawer

Claims (4)

An intelligent tool errand robot comprising:
an autonomous travel means (2) provided with a plurality of proximity sensors (22) so as to allow autonomous travel by tracking a line marked on the floor of the use area;
a body part (3) coupled to the upper side of the autonomous travel means (2) and fitted with an ultrasonic sensor (6) for measuring the distance to the front;
a head part (4) provided above the body part (3) and provided with a camera (7);
robot arms (5) provided at the right and left sides of the body part (3), each comprising a shoulder joint (51), an upper arm (52), an elbow joint (53), a lower arm (54), a wrist joint (55), and a hand assembly (56) made of multiple joints, and realizing motions similar to those of a human arm so as to open the drawer (10) of a tool box and grip the tool housed therein; and
a controller (8) installed in the body part (3) to control the operation of the autonomous travel means (2), the ultrasonic sensor (6), the camera (7), and the robot arms (5).
The intelligent tool errand robot of claim 1,
wherein, in order to recognize and store the character/shape input by the user, the controller (8) comprises an image recognition interface software kit that performs pre-processing and post-processing of the recognition process, an image/shape detection software kit, an image recognition software kit that performs the actual image recognition on the input image, and a command instruction unit (9) for controlling the robot's hand, body motion, and autonomous travel based on the recognized image signal.
The intelligent tool errand robot of claim 1,
wherein the controller (8) stores the shapes of 50 tools needed in general work processes (manufacturing), such as pliers, hammers, drivers, wrenches, saws, and drills, as basic model shapes, assigns a unique number to each data address, and, when the unique model number is entered with a button, the robot moves along a designated path so as to recognize and retrieve the shape of the designated tool.
The intelligent tool errand robot of claim 1,
wherein the drawer (10) of the tool box and the tools housed therein are provided with marks made of a magnetic material, the marks being positioned at the center of gravity.
KR1020150044149A 2015-03-30 2015-03-30 Intelligent tools errands robot KR20160116445A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150044149A KR20160116445A (en) 2015-03-30 2015-03-30 Intelligent tools errands robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150044149A KR20160116445A (en) 2015-03-30 2015-03-30 Intelligent tools errands robot

Publications (1)

Publication Number Publication Date
KR20160116445A true KR20160116445A (en) 2016-10-10

Family

ID=57146035

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150044149A KR20160116445A (en) 2015-03-30 2015-03-30 Intelligent tools errands robot

Country Status (1)

Country Link
KR (1) KR20160116445A (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080083746A (en) 2007-03-13 2008-09-19 주식회사 유진로봇 Errand robot where the thing which leads a location recognition and message delivery are possible

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107088880A (en) * 2017-05-12 2017-08-25 江苏信息职业技术学院 A kind of robot arm
CN110270993A (en) * 2019-07-29 2019-09-24 永嘉县信达智能设备制造有限公司 Robot shoulder structure
WO2021174850A1 (en) * 2020-03-04 2021-09-10 河南理工大学 Roadway roof-falling hazard automatic detection device
CN111958585A (en) * 2020-06-24 2020-11-20 宁波薄言信息技术有限公司 Intelligent disinfection robot

Similar Documents

Publication Publication Date Title
CN111496770B (en) Intelligent carrying mechanical arm system based on 3D vision and deep learning and use method
CN111055281B (en) ROS-based autonomous mobile grabbing system and method
CN114728417B (en) Method and apparatus for autonomous object learning by remote operator triggered robots
Fang et al. A novel augmented reality-based interface for robot path planning
US10759051B2 (en) Architecture and methods for robotic mobile manipulation system
Chen et al. Industrial robot control with object recognition based on deep learning
CN110603122B (en) Automated personalized feedback for interactive learning applications
CN114102585B (en) Article grabbing planning method and system
KR20160116445A (en) Intelligent tools errands robot
CN107160403A (en) A kind of intelligent robot system with multi-functional human-machine interface module
CN110914021A (en) Operating device with an operating device for carrying out at least one work step, and method and computer program
CN111823277A (en) Object grabbing platform and method based on machine vision
US20220241980A1 (en) Object-Based Robot Control
Lunenburg et al. Tech united eindhoven team description 2012
Hentout et al. A telerobotic human/robot interface for mobile manipulators: A study of human operator performance
KR102408327B1 (en) AI-based Autonomous Driving Robot System That Supports Gesture Recognition For Autonomous Driving Sales
Shaju et al. Conceptual design and simulation study of an autonomous indoor medical waste collection robot
US11407117B1 (en) Robot centered augmented reality system
Chen et al. Semiautonomous industrial mobile manipulation for industrial applications
TW202001286A (en) Object positioning system capable of simultaneously recognizing the various types of autonomously moving objects, and continuously updating map information and path planning
EP4088882A1 (en) Method of manipulating a construction object, construction robot system, and computer program product
Gong et al. Mobile robot manipulation system design in given environments
US20220281113A1 (en) Joint Training of a Narrow Field of View Sensor with a Global Map for Broader Context
Gwozdz et al. Enabling semi-autonomous manipulation on Irobot’s Packbot
CN207027516U (en) A kind of robot of view-based access control model and speech-sound intelligent control

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E601 Decision to refuse application