WO2020159029A1 - Electric vehicle charging robot, method and program for controlling the robot, and precise control method and program for docking the robot - Google Patents
Electric vehicle charging robot, method and program for controlling the robot, and precise control method and program for docking the robot
- Publication number
- WO2020159029A1 (PCT/KR2019/011440)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- feature point
- position sensor
- electric vehicle
- docking
- Prior art date
Links
- 238000003032 molecular docking Methods 0.000 title claims abstract description 108
- 238000000034 method Methods 0.000 title claims abstract description 61
- 238000004364 calculation method Methods 0.000 claims description 12
- 230000008859 change Effects 0.000 claims description 11
- 238000012544 monitoring process Methods 0.000 claims description 11
- 238000010586 diagram Methods 0.000 description 10
- 230000000694 effects Effects 0.000 description 7
- 230000008569 process Effects 0.000 description 7
- 230000006870 function Effects 0.000 description 5
- 238000004891 communication Methods 0.000 description 4
- 238000012937 correction Methods 0.000 description 4
- 238000013459 approach Methods 0.000 description 2
- 230000001133 acceleration Effects 0.000 description 1
- 238000013500 data storage Methods 0.000 description 1
- 230000005611 electricity Effects 0.000 description 1
- 238000003912 environmental pollution Methods 0.000 description 1
- 239000002803 fossil fuel Substances 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 239000003550 marker Substances 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000001151 other effect Effects 0.000 description 1
- 238000012827 research and development Methods 0.000 description 1
- 230000004044 response Effects 0.000 description 1
Images
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L53/00—Methods of charging batteries, specially adapted for electric vehicles; Charging stations or on-board charging equipment therefor; Exchange of energy storage elements in electric vehicles
- B60L53/30—Constructional details of charging stations
- B60L53/31—Charging columns specially adapted for electric vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L53/00—Methods of charging batteries, specially adapted for electric vehicles; Charging stations or on-board charging equipment therefor; Exchange of energy storage elements in electric vehicles
- B60L53/30—Constructional details of charging stations
- B60L53/35—Means for automatic or assisted adjustment of the relative position of charging devices and vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/60—Other road transportation technologies with climate change mitigation effect
- Y02T10/70—Energy storage systems for electromobility, e.g. batteries
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/60—Other road transportation technologies with climate change mitigation effect
- Y02T10/7072—Electromobility specific charging systems or methods for batteries, ultracapacitors, supercapacitors or double-layer capacitors
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T90/00—Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation
- Y02T90/10—Technologies relating to charging of electric vehicles
- Y02T90/12—Electric charging stations
Definitions
- the present invention relates to a control method for an electric vehicle charging robot and a precise control method for docking an electric vehicle charging robot.
- the mobile charging device has emerged as an alternative to the above problems, but in the case of the mobile charging device there is the inconvenience that the user must walk to the charging station and bring the device to the vehicle personally.
- the charging terminal of the charging device must be accurately docked to the charging terminal of the vehicle, but the field has not yet presented a precise control method for such docking.
- the present invention can provide a control method of an electric vehicle charging robot in which the robot moves to the parking area and, when the feature point of the docking terminal mounted on the license plate or front charging port of the vehicle is recognized, moves to a first position in front of the vehicle and changes its posture.
- the present invention can provide a control method of an electric vehicle charging robot that measures the distance between the position sensor and the feature point using the feature point image sensed by the robot at the first position, and moves the robot to a second position so that it approaches the vehicle.
- the present invention can provide a control method of an electric vehicle charging robot that, at the second position, drives the robot arm to dock the second docking terminal of the robot arm to the first docking terminal mounted on the vehicle, and performs electric charging of the vehicle once the first docking terminal and the second docking terminal are docked.
- the present invention can provide a precise control method for docking an electric vehicle charging robot, in which the robot arm of the robot that has moved to a predetermined distance from the vehicle is moved by a predetermined distance to calculate the distance and alignment error between the feature point and the camera, the error is compensated, and the robot arm is advanced to dock without error.
- a control method of an electric vehicle charging robot for solving the above-described problem is a method performed by an electric vehicle charging robot, wherein the robot moves to a parking area in which the user's vehicle is parked; Sensing the surroundings by the robot through a position sensor; Searching for a target feature point from the sensing data of the position sensor, and when the feature point is recognized, the robot moves to a first position; Changing the posture of the robot from the first position toward the feature point and moving to the second position; And driving the robot arm to dock the second docking terminal of the robot arm to the charging port of the vehicle or the first docking terminal mounted on the vehicle at the second position.
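The claimed sequence of steps can be sketched as a small state machine. This is an illustrative reading only; the stage names and the `next_stage` helper are assumptions, not part of the claims.

```python
from enum import Enum, auto

class Stage(Enum):
    """Stages of the claimed docking sequence (names illustrative)."""
    MOVE_TO_PARKING = auto()          # move to the parking area
    SEARCH_FEATURE = auto()           # sense surroundings for the feature point
    MOVE_TO_FIRST_POSITION = auto()   # feature recognized: go to first position
    CHANGE_POSTURE = auto()           # turn so the robot arm faces the feature
    MOVE_TO_SECOND_POSITION = auto()  # approach the vehicle
    DOCK = auto()                     # drive the arm to dock the terminals
    DONE = auto()

def next_stage(stage, feature_recognized=True):
    """Advance the docking sequence by one claimed step."""
    if stage is Stage.SEARCH_FEATURE and not feature_recognized:
        return Stage.SEARCH_FEATURE  # keep sensing until the feature point appears
    order = list(Stage)
    return order[min(order.index(stage) + 1, len(order) - 1)]
```

In this reading, the robot only leaves the search stage once the target feature point is recognized, matching the "when the feature point is recognized, the robot moves to a first position" clause above.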
- the position sensor is characterized in that it comprises a first position sensor, provided on the robot, for sensing the surroundings, and a second position sensor provided on the robot arm.
- the moving to the first position may include deriving position information of at least one of the feature point, the charging port of the vehicle, and the first docking terminal from the data sensed through the first position sensor; and calculating, from the position information, the first position to which the robot is to move, and moving to the first position.
- the moving to the second position may include calculating an alignment error between the second position sensor and the feature point using the sensing data of the second position sensor, and controlling the robot to compensate for the error.
- the posture changing step may include changing the robot to a first posture such that the robot arm faces the feature point in the first position;
- the robot acquiring a feature point image in the sensing data of the second position sensor at a first posture;
- the robot rotating at a first angle to change to a second posture, and acquiring sensing data of the second position sensor at the second posture;
- the robot stops moving when a specific object is detected within a certain distance through one or more distance sensors provided on the outer surface of the robot, and resumes moving when a predetermined time has elapsed after the specific object disappears from the monitoring range of the distance sensors.
- the step of driving the robot arm may include: driving the robot arm to move the second position sensor by a first distance in a specific direction; comparing the first distance with the distance by which the feature point image has moved in the sensing data, determined from the number of pixels by which the feature point image shifts within the sensing data of the second position sensor as the second position sensor moves the first distance; and docking the second docking terminal of the robot arm with the first docking terminal.
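A hedged sketch of the pixel-based check described above: moving the arm a known first distance and counting the pixels by which the feature point image shifts yields a scale factor, which converts any remaining pixel offset into a physical correction. Function names and units are illustrative assumptions, not taken from the claims.

```python
def mm_per_pixel(first_distance_mm, pixels_moved):
    """Scale factor obtained by moving the sensor a known first distance
    and observing how many pixels the feature point image shifts."""
    if pixels_moved == 0:
        raise ValueError("feature point image did not move; cannot calibrate")
    return first_distance_mm / abs(pixels_moved)

def correction_mm(pixel_offset, scale_mm_per_px):
    """Physical correction needed to remove a remaining pixel offset."""
    return pixel_offset * scale_mm_per_px
```

For example, if a 50 mm arm move shifts the feature point image by 100 pixels, the scale is 0.5 mm/pixel, and a residual 40-pixel offset corresponds to a 20 mm correction.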
- the moving of the parking area includes moving the robot to the parking area by a user's order signal.
- An electric vehicle charging robot for solving the above-described problems moves to the parking zone in which the user's vehicle is parked, searches for a target feature point in the data sensed from a position sensor, and, when the corresponding feature point is recognized, moves to a first position in front of the vehicle
- the robot arm is driven to dock the second docking terminal of the robot arm to the first docking terminal mounted on the vehicle at a second position.
- the electric vehicle charging robot control program according to an aspect of the present invention for solving the above-described problem is combined with a computer (hardware) and stored in a medium in order to execute the control method of the electric vehicle charging robot.
- a precise control method for docking an electric vehicle according to an aspect of the present invention for solving the above-described problem is a precise control method performed by an electric vehicle charging robot that has moved to a predetermined distance (second position) from the vehicle, and is performed at the second position
- the method includes: driving the robot arm to move the position sensor, which is provided on the robot arm to sense the feature point of the first docking terminal, by a first distance in a specific direction; comparing the first distance with the distance by which the feature point image has moved in the sensing data, determined from the number of pixels by which the feature point image shifts within the sensing data of the position sensor as the position sensor moves the first distance; and moving the robot arm a predetermined distance in the direction of the feature point.
- the robot moves to the parking area where the user's vehicle is parked; Sensing the surroundings by the robot through a position sensor; Searching for a target feature point from the sensing data of the position sensor, and when the feature point is recognized, the robot moves to a first position; Changing the posture of the robot such that the robot arm faces the feature point in the first position; The method further includes measuring a distance between the position sensor and the feature point using the feature point image in the sensing data of the position sensor provided in the robot arm and moving the robot to a second position.
- the position sensor is characterized in that it comprises a first position sensor, provided on the robot, for sensing the surroundings, and a second position sensor provided on the robot arm.
- the moving to the first position may include deriving position information of at least one of the feature point, the charging port of the vehicle, and the first docking terminal from the feature point image in the data sensed by the first position sensor; and calculating, from the position information, the first position to which the robot is to move, and moving the robot to the first position.
- the moving to the second position may include calculating an alignment error between the second position sensor and the feature point using the feature point image in the sensing data of the second position sensor, and controlling the robot to compensate for the error.
- the posture changing step may include: changing the robot to a first posture so that the robot arm faces the feature point in the first position; The robot acquiring a feature point image in the sensing data of the second position sensor at a first posture; The robot rotating at a first angle to change to a second posture, and acquiring a feature point image in the sensing data of the second position sensor at the second posture; And calculating a parallel alignment posture of the second position sensor and the feature point through the feature point images acquired in the first posture and the second posture, and correcting the posture by the robot.
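One plausible reading of the two-posture parallel-alignment calculation above: if the apparent width of the planar feature point shrinks with the cosine of the yaw between sensor and feature-point plane, then comparing the widths observed before and after a known rotation lets the initial yaw be solved for. The model, the assumption, and the function name are illustrative, not the patent's stated math.

```python
import math

def yaw_from_two_views(w1, w2, theta_rad):
    """Estimate the sensor's initial yaw relative to the feature-point plane.

    Assumes the apparent width of the planar feature point scales with
    cos(yaw): w1 is observed at the first posture, w2 after the robot
    rotates by the known first angle theta_rad (second posture).
    From w2/w1 = cos(yaw + theta)/cos(yaw) it follows that
    tan(yaw) = (cos(theta) - w2/w1) / sin(theta).
    """
    ratio = w2 / w1
    return math.atan((math.cos(theta_rad) - ratio) / math.sin(theta_rad))
```

Correcting the posture by the negative of this yaw would bring the sensor parallel to the feature point, after which the robot can reverse straight toward it.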
- the robot stops moving when a specific object is detected within a certain distance through one or more distance sensors provided on the outer surface of the robot, and resumes moving when a predetermined time has elapsed after the specific object disappears from the monitoring range of the distance sensors.
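The stop-and-resume behaviour described above can be sketched as follows; the guard distance, hold-off time, and class name are illustrative assumptions, and an injectable clock is used so the logic can be exercised deterministically.

```python
import time

class SafetyMonitor:
    """Stop while an object is inside the guard distance; resume only after
    it has been clear of the monitoring range for a hold-off period."""

    def __init__(self, guard_mm=500.0, holdoff_s=2.0, clock=time.monotonic):
        self.guard_mm = guard_mm      # "certain distance" from the claim
        self.holdoff_s = holdoff_s    # "predetermined time" from the claim
        self.clock = clock
        self._clear_since = None      # when the range last became clear

    def may_move(self, nearest_obstacle_mm):
        if nearest_obstacle_mm < self.guard_mm:
            self._clear_since = None  # object detected: stop immediately
            return False
        if self._clear_since is None:
            self._clear_since = self.clock()
        return self.clock() - self._clear_since >= self.holdoff_s
```

Note that movement does not resume the instant the object disappears; the hold-off mirrors the claim's "predetermined time elapses after the specific object disappears".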
- the moving of the parking area includes moving the robot to the parking area by a user's order signal.
- An electric vehicle charging robot for solving the above-described problems moves to a predetermined distance (second position) from the vehicle to perform precise control for docking, and the robot arm of the robot is provided with a position sensor for sensing the feature point formed on the first docking terminal;
- the control unit controls the robot arm to compensate for the calculated alignment error.
- a precise control program for docking an electric vehicle charging robot according to an aspect of the present invention for solving the above-described problem is combined with a computer (hardware) and stored in a medium in order to execute the precise control method for docking the electric vehicle.
- according to the present invention, when a user mounts a docking terminal on the license plate or front charging port of a vehicle and connects the charging cable, the electric vehicle charging robot automatically moves, docks, and performs charging.
- since the robot moves to the parking space of the vehicle, moves to the first position in front of the vehicle when the feature point is recognized to change its posture, and compensates for errors using the feature point image captured by the position sensor while approaching the vehicle, it has the effect of being able to dock accurately.
- the present invention has the effect of driving autonomously and safely to the position of the vehicle, because the robot stops operating when an object or person is detected through one or more distance sensors provided on its outer surface and resumes operating when the target is determined to have disappeared.
- since the robot arm of the electric vehicle charging robot is deliberately moved a predetermined distance to calculate the distance and alignment error between the feature point and the camera, the error is compensated, and the robot arm is advanced to perform docking, there is an effect of charging the electric vehicle without docking errors.
- FIG. 1 is a flowchart of a precise control method for docking an electric vehicle charging robot according to an embodiment of the present invention.
- FIG. 2 is a view illustrating that the first docking terminal is mounted on the license plate of the electric vehicle according to the embodiment of the present invention.
- FIG. 3 is a view illustrating that the first docking terminal is mounted on the charging port of the electric vehicle according to the embodiment of the present invention.
- FIG. 4 is a view illustrating that the electric vehicle charging robot according to an embodiment of the present invention moves to a parking area where a vehicle is parked.
- FIG. 5 is a diagram illustrating that the robot moves to the first position by recognizing the feature point through the first position sensor.
- FIG. 6 is a view illustrating that the electric vehicle charging robot changes the posture at the first position.
- FIG. 7 is a flowchart of a posture correction step according to an embodiment of the present invention.
- FIG. 8 is a view illustrating that the electric vehicle charging robot moves to the second position.
- FIG. 9 is a view illustrating a feature point image in a captured image of the second position sensor at the first position.
- FIG. 10 is a view illustrating that the electric vehicle charging robot drives the robot arm for docking in the second position.
- FIG. 11 is a flowchart of a precise control method for docking an electric vehicle charging robot according to an embodiment of the present invention.
- FIG. 12 is a diagram illustrating a feature point image in a captured image of the second position sensor at the second position.
- FIG. 13 is a diagram illustrating that a specific target is detected through a distance sensor while the electric vehicle charging robot is moving.
- FIG. 14 is a block diagram of an electric vehicle charging robot according to an embodiment of the present invention.
- the feature point 42, marked on the first docking terminal 40 mounted on the vehicle, means a marker for identification; a QR code (Quick Response Code) can be applied as shown in the figure, and a barcode, two-dimensional barcode, image, or anything else identifiable through the camera can also be applied.
- the feature point 42 means a physical object marked on the first docking terminal 40, and the feature point image 43 is within the data (shooting image) sensed by the position sensor 20. It means the data of the feature point 42.
- FIG. 1 is a flowchart of a precise control method for docking an electric vehicle charging robot according to an embodiment of the present invention
- FIGS. 2 to 13 are views illustrating an actual application of the control method of the electric vehicle charging robot of FIG. 1.
- the electric vehicle charging robot 10 receives a user's order signal. (Step S505)
- FIG. 2 is a view illustrating that the first docking terminal 40 is mounted on a license plate of an electric vehicle (user vehicle, 99) according to an embodiment of the present invention
- FIG. 3 is a diagram illustrating that the first docking terminal 40 is mounted on the charging port of an electric vehicle according to an embodiment of the present invention.
- the first docking terminal 40 is mounted on the license plate of the vehicle as shown in FIG. 2, and the second terminal 84 is mounted (connected) to the charging port of the vehicle.
- the first docking terminal 40 may be mounted (connected) to the front charging port of the vehicle as shown in FIG. 3.
- the docking terminal mounted on the front charging port of the vehicle may be a form in which the first docking terminal 40 and the second terminal 84 are integrated.
- the vehicle means an electric vehicle.
- the charging connector 80 includes a cable 86, and the cable 86 has a first terminal 82 at one end and a second terminal 84 at the other end.
- the first terminal 82 is mounted on the first docking terminal 40 mounted on the license plate of the vehicle, and the second terminal 84 is mounted on the charging terminal (charger) of the vehicle. Due to the characteristics of the charging connector 80, the first terminal 82 and the second terminal 84 may be distinct or identical.
- the first docking terminal 40 is mounted on the license plate of the vehicle, and the first docking terminal 40 has a feature point 42 formed on the upper side.
- the server receives an order signal from the user terminal to check the location of the zone where the user's vehicle is parked.
- the user's order signal means a signal that the user calls the electric vehicle charging robot to charge the electric vehicle.
- the order signal may be an electronic signal generated when the user selects, on the terminal, the parking space code of the parking zone of the electric vehicle; in this case, receiving the user's order signal may mean receiving the parking space code of the parking zone in which the user's vehicle is parked, but is not limited thereto.
- various methods may be applied to the communication method of the order signal and the parking space code.
- after a user installs the service application on the user terminal, when the user tags the terminal on an NFC tag attached to a pillar of the parking space, the tag data is transmitted as a parking space code, and the server analyzes the parking space code to determine the location of the area where the vehicle is parked.
- when a user accesses the service application and photographs the image or code attached to a pillar of the parking space through the photographing unit of the user terminal, it is transmitted as a parking space code, and the server analyzes the parking space code to determine the location of the area where the vehicle is parked.
- the user may access the service application and directly enter the parking space code and transmit it to the server.
- in the service application, it is possible to select the building name (address), parking floor, and space number of the parking lot.
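The patent does not fix a format for the parking space code; the selections above suggest it at least identifies a building, floor, and space. A minimal sketch of one possible encoding follows, with the class name and the `:`-separated format being pure assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ParkingSpaceCode:
    """Illustrative parking space code: building, floor, and space number."""
    building: str
    floor: str
    space: str

    def encode(self):
        """Serialize for transmission from the user terminal to the server."""
        return f"{self.building}:{self.floor}:{self.space}"

    @staticmethod
    def decode(text):
        """Parse a code received by the server back into its components."""
        building, floor, space = text.split(":")
        return ParkingSpaceCode(building, floor, space)
```

Whatever the real format, the round-trip property (decode(encode(c)) == c) is what lets NFC tags, photographed codes, and manual entry all feed the same dispatch logic.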
- a server in charge of the electric vehicle charging robot 10 may be connected to a parking lot server and receive a parking space code directly from the parking lot server.
- the parking lot server analyzes CCTV images captured at various places in the parking lot to identify the location where the vehicle is parked, and when the vehicle is confirmed to be parked, it sends the parking space code to the electric vehicle charging robot 10 through the server.
- the parking space code can identify the location where the vehicle is parked.
- the electric vehicle charging robot 10 moves to a parking area in which the user's vehicle is parked. (Step S510)
- the electric vehicle charging robot 10 moves to the parking area corresponding to the parking space code included in the user's order signal.
- the parking area means a certain area based on the location where the vehicle is parked.
- FIG. 4 is a view illustrating that the electric vehicle charging robot 10 according to an embodiment of the present invention moves to a parking area where a vehicle is parked.
- the electric vehicle charging robot 10 is provided with a distance sensor 25 at the front and with the first position sensor 22 and the second position sensor 32 on its left and right sides, and the robot arm 30 is formed at the back.
- the electric vehicle charging robot is provided with one position sensor 20 and can perform various operations using the position sensor 20.
- the position sensor 20 may be a concept including the first position sensor 22 and the second position sensor 32.
- the position sensor 20 may be representatively applied to a photographing and imaging device such as a camera, but is not limited thereto, and various devices capable of sensing a feature point may be applied.
- two or more position sensors may be provided as the first position sensor 22 and the second position sensor 32, which can be easily selected by a practitioner of the present invention.
- the robot will be described as having a first position sensor 22 and a second position sensor 32.
- FIGS. 4 to 13 are illustrated at an angle viewed from the upper side to describe the operation of the electric vehicle charging robot 10.
- it is assumed in the following description that the first docking terminal 40 is mounted on the license plate or front charging port of the user vehicle 99.
- the electric vehicle charging robot 10 starts autonomous driving into a parking space where a vehicle is parked.
- the electric vehicle charging robot 10 operates the distance sensor 25 to monitor the surroundings (front, side, rear) of the robot 10.
- the electric vehicle charging robot 10 is provided with at least one distance sensor 25 on at least one surface; a lidar device, an ultrasonic device, or an image sensor can typically be applied as the distance sensor 25, and in addition any sensor or device capable of detecting an object may be applied.
- the parking lot is provided with a charging station where at least one electric vehicle charging robot 10 is charged and waits.
- the electric vehicle charging robot 10 is charged and waits at the charging station, and when the charging signal is assigned, it moves to the parking position of the vehicle.
- when charging of the corresponding vehicle is finished, the electric vehicle charging robot 10 returns to the charging station to charge its battery and wait. At this time, if a charging request signal for another vehicle is allocated while the battery of the electric vehicle charging robot 10 is still sufficient, it may move to the parking position of the other vehicle without returning to the charging station.
- step S510 corresponds to the starting step for docking, and means that the robot moves roughly to the position where the vehicle is parked. Through each of the following steps, the robot 10 is moved ever more precisely for docking, as will be described.
- after step S510, the robot 10 senses (photographs) the surroundings through the first position sensor 22, and when the target feature point image 43 is recognized in the sensing data (captured image), the robot 10 is moved to the first position 51. (Step S520)
- the control unit 60 recognizes the feature point 42 in the sensing data (captured image) of the first position sensor 22 and moves the robot 10 to the first position 51.
- the first position 51 means a position spaced a predetermined distance on a straight line in front of the user vehicle 99.
- the distance between the first position 51 and the user vehicle 99 may be changed flexibly according to the situation of the parking lot.
- step S520 may include the controller 60 deriving position information of at least one of the feature point, the charging port of the vehicle, and the first docking terminal from the feature point image in the sensing data (captured image) of the first position sensor 22, the calculating unit 63 calculating the first position to which the robot should move from the derived position information, and moving the robot 10 to the first position 51.
- the calculator 63 may calculate the distance and angle between the robot 10 and the feature point 42 using the feature point image in the sensing data (captured image), and may use this to calculate the optimal first position 51.
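Assuming the position sensor is a camera and the feature point (e.g. a QR code) has a known physical size, the distance and angle mentioned above could be estimated with a pinhole-camera model. This is a sketch under those assumptions, with illustrative names, not the patent's stated method.

```python
import math

def distance_from_feature(focal_px, real_size_mm, apparent_size_px):
    """Pinhole-model range to the feature point: Z = f * X / x, where
    f is the focal length in pixels, X the known physical size of the
    feature point, and x its apparent size in the captured image."""
    return focal_px * real_size_mm / apparent_size_px

def bearing_from_feature(focal_px, pixel_offset_x):
    """Horizontal angle to the feature point from its pixel offset
    relative to the image center."""
    return math.atan2(pixel_offset_x, focal_px)
```

For example, with an 800-pixel focal length, a 100 mm wide feature point that appears 80 pixels wide lies about 1 m away, and a zero pixel offset means it is dead ahead.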
- after step S520, the control unit 60 changes the posture of the robot 10 so that the robot arm 30 faces the feature point 42 at the first position 51.
- FIG. 6 is a view illustrating that the electric vehicle charging robot 10 changes the posture at the first position 51.
- the electric vehicle charging robot 10 has the robot arm 30 formed inside it, and operates by opening the rear opening/closing member 27 so that the robot arm 30 is drawn out to the outside (the back of the robot). Therefore, the electric vehicle charging robot 10 changes its posture and moves backward in order to bring the robot arm 30 closer to the vehicle.
- the electric vehicle charging robot 10 includes a driving unit 65 and wheels, and the driving unit 65 operates the wheels.
- FIG. 7 is a flowchart of the posture correction step (step S530) according to an embodiment of the present invention.
- step S530 may further include the following steps.
- the control unit 60 changes the robot 10 to the first posture so that the robot arm 30 faces the feature point 42 at the first position 51. (Step S530)
- after step S530, the controller 60 acquires the feature point image 43 in the sensing data (captured image) of the second position sensor 32 with the robot 10 in the first posture. (Step S532)
- after step S532, the control unit 60 rotates the robot 10 by a first angle to change it to the second posture, and acquires the feature point image 43 in the captured image of the second position sensor 32 at the second posture. (Step S535)
- after step S535, the control unit 60 calculates the parallel alignment posture of the second position sensor 32 and the feature point 42 through the feature point images 43 acquired in the first posture and the second posture, and corrects the posture of the robot 10. (Step S537)
- even if steps S532, S535, and S537 are not included, the robot 10 can move to the second position 52 as long as it changes posture accurately in step S530; depending on the situation, however, wheel slip may occur and the angle may change slightly.
- control unit 60 arbitrarily drives the wheel of the robot 10 to rotate the robot 10 at a first angle from the first posture to change the second posture to the second posture, and feature points in the first posture and the second posture.
- after step S530, the calculation unit 63 measures the distance and angle between the second position sensor 32 and the feature point 42 using the feature point image 43 in the sensing data (captured image) of the second position sensor 32 provided on the robot arm 30, and the control unit 60 moves the robot 10 to the second position 52.
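The patent does not spell out how the distance to the feature point is obtained from the image. One common approach, shown here only as an illustrative sketch, is the pinhole-camera relation, in which the apparent size of a target of known physical size shrinks in proportion to distance (the function name and all numbers below are assumptions, not values from the patent):

```python
def estimate_distance_cm(real_width_cm, focal_px, width_px):
    """Pinhole-camera relation: distance = real size * focal length / apparent size."""
    return real_width_cm * focal_px / width_px

# Illustrative numbers: a feature point 5 cm wide, a focal length of 500 px,
# and an apparent width of 50 px put the target at 50 cm.
distance = estimate_distance_cm(5.0, 500.0, 50.0)
```

As the robot reverses toward the vehicle, the feature point image grows, so the estimated distance falls, which is consistent with the approach procedure described in the text.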
- FIG. 8 is a view illustrating the electric vehicle charging robot 10 moving to the second position 52.
- the robot arm 30 provided in the robot 10 according to an embodiment of the present invention carries a second docking terminal 35 for docking to the first docking terminal 40 mounted on the license plate of the vehicle, together with a second position sensor 32 for sensing (photographing) the feature point 42.
- step S540 further includes a step in which the calculation unit 63 calculates the alignment error between the second position sensor 32 and the feature point 42 using the feature point image 43 in the sensing data (captured image) of the second position sensor 32, and a step in which the control unit 60 controls the robot 10 to compensate for the calculated error (step S543).
- the calculation unit 63 calculates the alignment error between the second position sensor 32 and the feature point 42 using the feature point image 43 in the sensing data (captured image) of the second position sensor 32.
- here, alignment has its commonly used meaning (align): the calculation unit 63 calculates, as the alignment error, how far the center of the feature point image 43 deviates from the center point 72 of the sensing data (captured image) of the second position sensor 32, the error being zero when the two coincide.
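The alignment-error definition above, the offset of the feature point image's center from the captured image's center point 72, can be sketched in a few lines; the image size and centroid values are illustrative assumptions, not values from the patent:

```python
def alignment_error(image_w, image_h, feature_cx, feature_cy):
    """Pixel offset of the feature point image's centre from the centre point
    of the captured image; (0, 0) means the two coincide (aligned)."""
    return feature_cx - image_w / 2, feature_cy - image_h / 2

# Illustrative 640x480 image with the feature centred at (380, 200):
dx, dy = alignment_error(640, 480, 380, 200)
# dx > 0 means the feature sits to the right of the centre, dy < 0 above it
# (image y grows downward).
```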
- here, the Z-axis direction is the direction in which the robot 10 reverses, the X-axis direction is the lateral (horizontal) direction of the robot 10, and the Y-axis direction is the vertical direction.
- in steps S540 and S543, the driving unit 65 drives the wheels so that the robot corrects the alignment error while reversing. More specifically, the alignment error in the X-axis direction is measured, the driving unit 65 controls the wheels to compensate for that error, and the robot gradually moves (reverses) in the Z-axis direction.
- FIG. 9 is a diagram illustrating a feature point image 43 in the captured image 70 of the second position sensor 32 in the first position 51 according to an embodiment of the present invention.
- in the example of FIG. 9, the feature point image 43 in the captured image 70 of the second position sensor 32 at the first position 51 is not positioned at the center point 72, but is biased toward the upper right.
- the calculation unit 63 calculates the alignment error using the feature point image 43 in the captured image of the second position sensor 32, and the control unit 60 controls the driving unit 65 to compensate for it.
- the robot 10 then moves a predetermined distance to the right to compensate for the error, so that the feature point image 43 in the captured image of the second position sensor 32 comes to lie approximately at the center point 72.
- the control unit 60 compensates for the alignment error, reverses the robot 10 by a predetermined distance, and measures the error again, repeating these steps until the robot 10 has moved to the second position 52.
- in addition, the control unit 60 may move the robot arm 30 upward or downward.
- as an example, the robot 10 performs step S540 until the distance between the second position sensor 32 and the feature point 42 becomes 11 cm or less.
- the calculation unit 63 calculates the alignment error between the second position sensor 32 and the feature point 42, the robot 10 is controlled to compensate for the error, and the robot 10 is reversed by 1 cm.
- the calculation unit 63 then calculates the alignment error again, the robot 10 is controlled to compensate for it, and the robot 10 is reversed by another 1 cm; this is repeated until the distance reaches 11 cm, at which point the robot 10 is positioned at the second position 52.
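The alternating correct-then-reverse loop of the example above can be sketched as follows; the callback names are illustrative assumptions, and only the 1 cm step and the 11 cm threshold come from the patent's example:

```python
def approach_second_position(distance_cm, measure_error, correct_error,
                             step_cm=1.0, stop_cm=11.0):
    """Alternate lateral-error correction and small reversing steps until the
    sensor-to-feature distance reaches the stop threshold (11 cm in the
    patent's example, i.e. the second position)."""
    while distance_cm > stop_cm:
        correct_error(measure_error())  # compensate the X-axis alignment error
        distance_cm -= step_cm          # then reverse 1 cm in the Z direction
    return distance_cm

# Starting 15 cm away, four correction/reverse cycles reach the 11 cm mark.
corrections = []
final = approach_second_position(15.0, lambda: 0.2, corrections.append)
```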
- in step S550, the control unit 60 drives the robot arm 30 in order to dock the second docking terminal 35 of the robot arm 30 to the first docking terminal 40 mounted on the vehicle at the second position 52.
- FIG. 10 is a view illustrating that the electric vehicle charging robot 10 drives the robot arm 30 for docking at the second position 52.
- the driving unit 65 of the robot 10 controls the wheel to move the robot 10 to the second position 52.
- in step S550, the robot arm 30 is controlled while the wheels of the robot 10 are stopped, so that the first docking terminal 40 and the second docking terminal 35 dock perfectly; this corresponds to the precision control process.
- although the robot 10 has already compensated for errors while moving to the second position 52, slip may occur due to the characteristics of the wheels, and some error may remain depending on the floor condition of the parking lot.
- in step S550, the robot arm 30 is therefore precisely controlled while the wheels are stopped, which has the effect of achieving perfect docking without error.
- FIG. 11 is a flowchart of the robot arm 30 driving step according to an embodiment of the present invention.
- the robot arm 30 driving step includes the following precise control steps.
- the control unit 60 drives the robot arm 30 to move the second position sensor 32 by a first distance in a specific direction (step S552).
- the calculation unit 63 calculates the alignment error between the second position sensor 32 and the feature point 42 from the number of pixels by which the feature point image 43 in the sensing data (captured image) of the second position sensor 32 shifted as a result of the first-distance movement of the second position sensor 32 (step S554).
- steps S552 and S554 may be performed repeatedly to calculate a substantially accurate alignment error. In this case, at least one of the moving direction and the moving distance of the second position sensor 32 may be changed at each repetition; for example, the second position sensor 32 may reciprocate a predetermined distance in the x-axis direction and, at the same time, a predetermined distance in the y-axis direction.
- after the control unit 60 controls the robot arm 30 to compensate for the error calculated in step S554, it moves the robot arm 30 a predetermined distance in the direction of the feature point 42 (step S556).
- by aligning (matching) the distance by which the feature point image 43 moved within the sensing data with the first distance, the second docking terminal 35 and the first docking terminal 40 can be matched accurately.
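The pixel-to-distance reasoning behind steps S552 and S554 amounts to recovering a scale factor: moving the sensor a known first distance and counting the resulting pixel shift gives millimetres per pixel, which then converts any pixel alignment error into a physical error. A minimal sketch, with illustrative numbers (only the 4 mm first distance appears in the patent's later example):

```python
def mm_per_pixel(first_distance_mm, pixels_shifted):
    """Scale factor recovered by moving the arm-mounted sensor a known
    distance and counting how many pixels the feature image shifts."""
    return first_distance_mm / pixels_shifted

def error_mm(offset_px, scale_mm_per_px):
    """Convert a pixel alignment error into millimetres using that scale."""
    return offset_px * scale_mm_per_px

# Illustrative: a 4 mm move shifts the feature image by 20 px -> 0.2 mm/px,
# so a 15 px centring error corresponds to a 3 mm physical error.
scale = mm_per_pixel(4.0, 20)
err = error_mm(15, scale)
```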
- FIG. 12 is a diagram illustrating a feature point image 43 in the captured image 70 of the second position sensor 32 at the second position 52 according to an embodiment of the present invention.
- when the control unit 60 drives the robot arm 30 to move the second position sensor 32 by a first distance in a specific direction, the feature point image 43 in the captured image 70 of the second position sensor 32 will have shifted by a specific number of pixels, since the second position sensor 32 itself moved by the first distance.
- from this pixel shift, the alignment error can also be calculated.
- the control unit 60 controls the robot arm 30 to compensate for the calculated alignment error, and moves the robot arm 30 a predetermined distance in the direction of the feature point 42.
- because the robot arm 30 can be driven along the X-axis, Y-axis, and Z-axis, it can be finely adjusted in each direction, and moved in the direction of the feature point image 43, even while the electric vehicle charging robot 10 is stopped.
- in some cases, the robot arm 30 may employ a multi-joint (e.g., 6-axis) drive configuration.
- after performing the above steps one or more times, if the control unit 60 determines that the distance and alignment error between the second position sensor 32 and the feature point 42 are below a reference range, it docks the second docking terminal 35 of the robot arm 30 to the first docking terminal 40.
- in other words, the control unit 60 performs docking once it determines that the second docking terminal 35 is within a distance and error range from which it can be pushed into the first docking terminal 40, that is, once sufficient reliability for successful docking has been secured.
- as a concrete example, the control unit 60 drives the robot arm 30 to move the second position sensor 32 by 4 mm (the first distance) in the left direction (X-axis direction).
- the calculation unit 63 calculates the alignment error between the second position sensor 32 and the feature point 42 from the number of pixels by which the feature point image 43 in the captured image shifted in response to this 4 mm (first distance) movement of the second position sensor 32.
- since the calculation unit 63 knows that the first distance is 4 mm, it can calculate the distance and alignment error between the second position sensor 32 and the feature point 42 from the 4 mm movement and the number of pixels by which the feature point image 43 shifted.
- the control unit 60 controls the robot arm 30 to compensate for the calculated error, and then advances the robot arm 30 by 5 mm in the direction of the feature point 42.
- if no error is calculated, the control unit 60 skips the compensation operation and only advances the robot arm 30 in the direction of the feature point 42.
- the control unit 60 repeats the above operations one or more times, and when it determines that the distance between the second position sensor 32 and the feature point 42 is 8.8 cm or less, it docks the second docking terminal 35 to the first docking terminal 40.
- the method may further include a step in which the calculation unit 63 calculates the parallel angle error between the second position sensor 32 and the feature point 42 from the angle of the feature point image 43 in the sensing data (captured image) of the second position sensor 32, and a step in which the control unit 60 controls the angle of the robot arm 30 to compensate for the calculated parallel angle error.
- the feature point 42 may be a square or rectangular image. If the second position sensor 32 and the feature point 42 are not parallel but sit at a certain angle to each other, the feature point image 43 within the captured image of the second position sensor 32 will appear as a square or rectangle distorted by that angle.
- the calculation unit 63 analyzes the distortion of the feature point image 43 in the sensing data (captured image) of the second position sensor 32 to calculate the angle error between the second position sensor 32 and the feature point 42; by correcting the angle error together with the alignment error, perfect docking can be performed.
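The patent does not specify how the angle is extracted from the distorted square. As one hedged illustration only, the in-plane rotation component of a square feature can be read from the slope of one of its imaged edges (the corner coordinates below are illustrative assumptions; a full parallel-angle estimate would also analyze the perspective distortion of the quadrilateral):

```python
import math

def feature_rotation_deg(top_left, top_right):
    """In-plane rotation of a square feature, taken from the slope of its
    imaged top edge; 0 degrees when the edge is level in the image."""
    dx = top_right[0] - top_left[0]
    dy = top_right[1] - top_left[1]
    return math.degrees(math.atan2(dy, dx))

# A level top edge gives 0 degrees; an edge that rises by as much as it
# spans horizontally gives 45 degrees.
level = feature_rotation_deg((0.0, 0.0), (10.0, 0.0))
tilted = feature_rotation_deg((0.0, 0.0), (10.0, 10.0))
```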
- to this end, the control unit 60 may further perform a step of driving the robot arm 30 to move the second position sensor 32 by a specific angle.
- FIG. 13 is a diagram illustrating that a specific target is detected through the distance sensor 25 while the electric vehicle charging robot 10 is moving.
- the electric vehicle charging robot 10 is provided with one or more distance sensors 25 on at least one surface; typically, a lidar device, an ultrasonic device, or an image sensor may be applied, but any sensor or device capable of detecting an object can be used.
- in one embodiment, a distance sensor 25 is provided on each of the front, left, right, and rear surfaces of the robot 10 to monitor the surroundings of the robot 10.
- lidar is a representative example of the distance sensor 25: since a single lidar can monitor a range close to 180 degrees, providing one on each of the front, left, right, and rear surfaces of the robot 10 allows the surroundings to be monitored without blind spots.
- when a specific object is detected within a certain distance, the control unit 60 stops the movement of the robot 10, and restarts the robot 10 when a certain time has elapsed after the specific object disappears from the monitoring range of the distance sensor 25.
- the control unit 60 stops the movement of the robot 10 because an accident could occur if driving continued in that state.
- by resuming movement only after the object has disappeared, accidents that might otherwise occur are prevented.
- the electric vehicle charging robot 10 may further include a bumper sensor 67.
- when an impact is detected, the control unit 60 stops the movement of the robot 10, and when it determines that the impact is no longer detected or that the specific object that applied the impact has disappeared from the monitoring range, it resumes operation of the robot 10 after a predetermined time.
- the bumper sensor may be implemented by forming a bumper capable of detecting an external impact on the outer surface of the robot 10, or by providing a sensor inside the robot 10 that detects impact, tilt, and the like, so that impacts applied to the robot 10 can be sensed.
- because a specific object may tilt or shake the robot 10 without applying an impact of a predetermined intensity or higher, the robot 10 may also include a tilt sensor or an acceleration sensor capable of sensing the tilt of the robot 10.
- FIG. 14 is a block diagram of an electric vehicle charging robot 10 according to an embodiment of the present invention.
- the electric vehicle charging robot 10 includes a driving unit 65, a first position sensor 22, a distance sensor 25, a robot arm 30, a second position sensor 32, a second docking terminal 35, a control unit 60, and a calculation unit 63.
- however, the robot may include fewer or more components than those illustrated in FIG. 14.
- the driving unit 65 controls the wheel of the electric vehicle charging robot 10 to move the robot 10 forward or backward, and controls the wheel to change the posture of the robot 10.
- the driving unit 65 controls the robot arm 30 to dock the second docking terminal 35 to the first docking terminal 40.
- the first position sensor 22 is provided on the left and right sides of the robot 10 to photograph the left and right of the robot 10.
- the distance sensor 25 is provided on one or more outer surfaces of the robot 10 to monitor the surroundings of the robot 10; when a specific object within a certain distance is detected through the distance sensor 25, the control unit 60 stops the robot 10, and resumes operation of the robot 10 when a predetermined time has elapsed after the target disappears from the monitoring range of the distance sensor 25.
- the control unit 60 searches for the target feature point 42 in the data sensed by the first position sensor 22 while moving the robot 10 to the parking area where the user's vehicle is parked; when the feature point is recognized in the sensing data of the first position sensor 22, the robot 10 is moved to the first position 51, and the posture of the robot 10 is changed so that the robot arm 30 of the robot 10 at the first position 51 faces the feature point 42.
- the control unit 60 measures the distance between the second position sensor 32 and the feature point 42 using the feature point image 43 in the sensing data (captured image) of the second position sensor 32 provided on the robot arm 30, moves the robot 10 to the second position 52, and drives the robot arm 30 to dock the second docking terminal 35 of the robot arm 30 to the first docking terminal 40 mounted on the vehicle at the second position 52.
- the robot arm 30 is formed inside the robot 10, and a second position sensor 32 and a second docking terminal 35 are formed on the robot arm 30.
- the second position sensor 32 is formed on the robot arm 30, and photographs the feature point 42 provided on the first docking terminal 40 mounted on the vehicle's license plate or charging port.
- the second docking terminal 35 is provided on the robot arm 30 and docked to the first docking terminal 40 mounted on the license plate of the vehicle to charge the electric vehicle by supplying electricity to the vehicle.
- since the electric vehicle charging robot 10 according to the embodiment of the present invention described above belongs to the same category of invention as the control method of the electric vehicle charging robot 10 described with reference to FIGS. 1 to 13, overlapping descriptions and examples are omitted.
- control method of the electric vehicle charging robot according to the embodiment of the present invention described above may be implemented as a control program (or application) of the electric vehicle charging robot to be executed in combination with a server that is hardware, and stored in a medium.
- for the computer to read the program and execute the methods implemented as a program, the above-described program may include code coded in a computer language such as C, C++, JAVA, or machine language that the processor (CPU) of the computer can read through the device interface of the computer.
- the code may include functional code related to functions that define the functions necessary to execute the above methods, and control code related to the execution procedure necessary for the processor of the computer to execute those functions according to a predetermined procedure.
- the code may further include memory-reference-related code indicating which location (address) of the computer's internal or external memory should be referenced for additional information or media necessary for the computer's processor to perform the functions.
- in addition, when the computer's processor needs to communicate with any other remote computer or server to perform the functions, the code may further include communication-related code indicating how to communicate with the remote computer or server using the communication module of the computer, and what information or media to transmit and receive during communication.
- the storage medium refers to a medium that stores data semi-permanently and that can be read by a device, rather than a medium that stores data for a short time, such as registers, caches, and memory.
- examples of the storage medium include, but are not limited to, ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage device. That is, the program may be stored in various recording media on various servers that the computer can access or various recording media on the user's computer.
- the medium may be distributed over computer systems connected through a network, and computer-readable code may be stored therein in a distributed manner.
- the steps of a method or algorithm described in connection with an embodiment of the present invention may be implemented directly in hardware, a software module executed by hardware, or a combination thereof.
- the software modules may include random access memory (RAM), read only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, hard disk, removable disk, CD-ROM, or It may reside on any type of computer readable recording medium well known in the art.
Claims (21)
- A control method of an electric vehicle charging robot, performed by the robot, the method comprising: moving to a parking area in which a user's vehicle is parked; sensing, by the robot, its surroundings through a position sensor; searching for a target feature point in the sensing data of the position sensor and, when the feature point is recognized, moving the robot to a first position; changing the posture of the robot at the first position so as to face the feature point, and moving to a second position; and driving a robot arm at the second position to dock a second docking terminal of the robot arm to a charging port of the vehicle or a first docking terminal mounted on the vehicle.
- The control method of claim 1, wherein the position sensor comprises: a first position sensor provided on the robot to sense the surroundings; and a second position sensor provided on the robot arm.
- The control method of claim 2, wherein the step of moving to the first position comprises: deriving position information of at least one of the feature point, the charging port of the vehicle, and the first docking terminal from the data sensed through the first position sensor; and calculating, from the position information, the first position to which the robot should move, and moving to the first position.
- The control method of claim 2, wherein the step of moving to the second position comprises: calculating an alignment error between the second position sensor and the feature point using the sensing data of the second position sensor, and controlling the robot to compensate for the error.
- The control method of claim 2, wherein the posture changing step comprises: changing the robot to a first posture at the first position so that the robot arm faces the feature point; acquiring, with the robot in the first posture, a feature point image in the sensing data of the second position sensor; rotating the robot by a first angle to change to a second posture, and acquiring the sensing data of the second position sensor in the second posture; and calculating a parallel alignment posture of the second position sensor and the feature point from the sensing data acquired in at least one of the first posture and the second posture, and correcting the posture of the robot.
- The control method of claim 1, wherein, when a specific target is detected within a certain distance through one or more distance sensors provided on an outer surface of the robot, the robot stops moving and resumes operation when a predetermined time has elapsed after the specific target disappears from the monitoring range of the distance sensor; and wherein, when an impact is detected through one or more bumper sensors provided on the outer surface of the robot, the robot stops moving and resumes operation after a predetermined time once it is determined that the impact is no longer detected or that the specific target that applied the impact has disappeared from the monitoring range.
- The control method of claim 2, wherein the robot arm driving step comprises: driving the robot arm to move the second position sensor by a first distance in a specific direction; aligning the distance by which a feature point image has moved within the sensing data of the second position sensor with the first distance, based on the number of pixels by which the feature point image shifted in the sensing data as a result of the first-distance movement of the second position sensor; and docking the second docking terminal of the robot arm to the first docking terminal.
- The control method of claim 1, wherein the step of moving to the parking area comprises: moving the robot to the parking area in response to a user's order signal.
- An electric vehicle charging robot comprising: a control unit that searches for a target feature point in data sensed by a position sensor while moving the robot to a parking area in which a user's vehicle is parked, moves the robot to a first position when the feature point is recognized, and changes the posture of the robot at the first position so that a robot arm of the robot faces the feature point; and a calculation unit that calculates a distance and an angle between a position sensor provided on the robot arm and the feature point using sensing data of that position sensor at the first position, wherein the control unit moves the robot to a second position and, at the second position, drives the robot arm to dock a second docking terminal of the robot arm to a first docking terminal mounted on the vehicle.
- An electric vehicle charging robot control program, combined with a computer that is hardware and stored in a medium in order to execute the method of any one of claims 1 to 8.
- A precision control method for docking of an electric vehicle charging robot, performed by a robot that has been moved to a preset distance (a second position) from a vehicle, the method comprising: driving a robot arm at the second position to move a position sensor by a first distance in a specific direction, the position sensor being provided on the robot arm and sensing a feature point of a first docking terminal; aligning the distance by which a feature point image has moved within the sensing data of the position sensor with the first distance, based on the number of pixels by which the feature point image shifted in the sensing data as a result of the first-distance movement of the position sensor; and moving the robot arm a predetermined distance in the direction of the feature point.
- The precision control method of claim 11, comprising: docking a second docking terminal of the robot arm to the first docking terminal when an alignment error between the position sensor and the feature point is determined to be below a reference range.
- The precision control method of claim 11, further comprising, before the first-distance movement step: moving the robot to a parking area in which a user's vehicle is parked; sensing, by the robot, its surroundings through a position sensor; searching for a target feature point in the sensing data of the position sensor and, when the feature point is recognized, moving the robot to a first position; changing the posture of the robot at the first position so that the robot arm faces the feature point; and measuring a distance and an angle between the position sensor provided on the robot arm and the feature point using a feature point image in the sensing data of that position sensor, and moving the robot to the second position.
- The precision control method of claim 13, wherein the position sensor comprises: a first position sensor provided on the robot to sense the surroundings; and a second position sensor provided on the robot arm.
- The precision control method of claim 14, wherein the step of moving to the first position comprises: deriving position information of at least one of the feature point, the charging port of the vehicle, and the first docking terminal from a feature point image in the data sensed through the first position sensor; and calculating, from the position information, the first position to which the robot should move, and moving the robot to the first position.
- The precision control method of claim 14, wherein the step of moving to the second position comprises: calculating an alignment error between the second position sensor and the feature point using a feature point image in the sensing data of the second position sensor, and controlling the robot to compensate for the error.
- The precision control method of claim 14, wherein the posture changing step comprises: changing the robot to a first posture at the first position so that the robot arm faces the feature point; acquiring, in the first posture, a feature point image in the sensing data of the second position sensor; rotating the robot by a first angle to change to a second posture, and acquiring a feature point image in the sensing data of the second position sensor in the second posture; and calculating a parallel alignment posture of the second position sensor and the feature point from the feature point images acquired in the first and second postures, and correcting the posture of the robot.
- The precision control method of claim 14, wherein, when a specific target is detected within a certain distance through one or more distance sensors provided on an outer surface of the robot, the robot stops moving and resumes operation when a predetermined time has elapsed after the specific target disappears from the monitoring range of the distance sensor; and wherein, when an impact is detected through one or more bumper sensors provided on the outer surface of the robot, the robot stops moving and resumes operation after a predetermined time once it is determined that the impact is no longer detected or that the specific target that applied the impact has disappeared from the monitoring range.
- The precision control method of claim 13, wherein the step of moving to the parking area comprises: moving the robot to the parking area in response to a user's order signal.
- An electric vehicle charging robot that moves to a preset distance (a second position) from a vehicle and performs precision control for docking, the robot comprising: a position sensor provided on a robot arm of the robot to sense a feature point formed on a first docking terminal; a control unit that drives the robot arm at the second position to move the position sensor by a first distance in a specific direction; and a calculation unit that calculates an alignment error between the position sensor and the feature point from the number of pixels by which a feature point image shifted within the sensing data of the position sensor as a result of the first-distance movement of the position sensor, wherein the control unit controls the robot arm to compensate for the calculated alignment error.
- A precision control program for docking of an electric vehicle charging robot, combined with a computer that is hardware and stored in a medium in order to execute the method of any one of claims 11 to 19.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2019-0013387 | 2019-02-01 | ||
KR10-2019-0013388 | 2019-02-01 | ||
KR1020190013388A KR102014340B1 (ko) | 2019-02-01 | 2019-02-01 | 전기차 충전 로봇, 로봇의 도킹을 위한 정밀 제어 방법 및 프로그램 |
KR1020190013387A KR102014338B1 (ko) | 2019-02-01 | 2019-02-01 | 전기차 충전 로봇, 로봇의 제어 방법 및 프로그램 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020159029A1 true WO2020159029A1 (ko) | 2020-08-06 |
Family
ID=71842304
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2019/011440 WO2020159029A1 (ko) | 2019-02-01 | 2019-09-05 | 전기차 충전 로봇, 로봇의 제어 방법 및 프로그램, 로봇의 도킹을 위한 정밀 제어 방법 및 프로그램 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2020159029A1 (ko) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63192414A (ja) * | 1987-02-05 | 1988-08-09 | オ−トマツクス株式会社 | 床面清掃ロボツト |
US20130076902A1 (en) * | 2011-09-26 | 2013-03-28 | Universite Laval | Robotically operated vehicle charging station |
KR20140078025A (ko) * | 2012-12-14 | 2014-06-25 | 신명철 | 이동통신단말기와 qr코드를 활용한 주차위치안내서비스 시스템 |
US9592742B1 (en) * | 2014-04-09 | 2017-03-14 | FreeWire Technologies, Inc. | Systems, apparatus, and methods of charging electric vehicles |
KR20180046600A (ko) * | 2016-10-28 | 2018-05-09 | 삼성전자주식회사 | 전기자동차용 충전장치 및 전기자동차 충전 제어방법 |
- 2019-09-05: WO PCT/KR2019/011440 patent/WO2020159029A1/ko active Application Filing
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SE2150550A1 (en) * | 2021-04-30 | 2022-10-31 | Husqvarna Ab | Method and system for charging a robotic work tool |
SE545080C2 (en) * | 2021-04-30 | 2023-03-21 | Husqvarna Ab | Method and system for charging a robotic work tool |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19912330 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19912330 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17/11/2021) |
|