WO2021221343A1 - Apparatus and method for an indoor mobile robot to recognize the environment in an elevator, recording medium storing a program for executing the same, and computer program stored on the medium for executing the same


Info

Publication number: WO2021221343A1
Authority: WO (WIPO, PCT)
Prior art keywords: elevator, robot, floor, map, coordinate system
Application number: PCT/KR2021/004457
Other languages: English (en), Korean (ko)
Inventors: 천홍석, 이재훈, 변용진, 송재봉, 권아영, 박상운
Original Assignee: 주식회사 트위니
Application filed by 주식회사 트위니
Publication of WO2021221343A1


Classifications

    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators; characterised by motion, path, trajectory planning
    • B25J11/008 Manipulators for service tasks
    • B25J9/1607 Programme controls characterised by the control system, structure, architecture; calculation of inertia, jacobian matrixes and inverses
    • B25J9/161 Programme controls characterised by the control system, structure, architecture; hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1697 Vision controlled systems
    • G06K19/06009 Record carriers with digital markings characterised by optically detectable marking

Definitions

  • The present invention relates to an apparatus and method for an indoor mobile robot to recognize the environment in an elevator, a recording medium storing a program for implementing the same, and a computer program stored in the medium for implementing the same, and more particularly, to how the indoor mobile robot recognizes the environment inside an elevator.
  • In order for an indoor mobile robot to board or get off an elevator, it must first be able to recognize an indoor space such as the elevator interior, estimate its own pose, and plan a route by itself.
  • Inside an elevator, the fixed environmental obstacles consist mainly of simple obstacles such as walls, and the space is relatively narrow and closed compared to a general indoor environment.
  • In general, a global positioning system (GPS) cannot be used in an indoor environment such as an elevator, so the environment is recognized locally and the pose is estimated mainly from the image and distance sensors mounted on the robot.
  • A grid map, the method mainly used by indoor mobile robots to recognize an indoor environment, divides the space into small grids and describes the environment with a map that numerically records the degree to which each grid cell is occupied by obstacles.
  • Korean Patent Registration [10-0877071] discloses a particle filter-based method and apparatus for estimating posture of a mobile robot.
  • Accordingly, an object of the present invention is to provide an environment recognition method and apparatus necessary for an indoor mobile robot to board and get off an elevator, a recording medium storing a program for implementing the same, and a computer program stored in the medium for implementing the same.
  • An apparatus for an indoor mobile robot to recognize the environment in an elevator for achieving the above object comprises: a marker 10 displayed on the inner floor of the elevator;
  • a map storage unit 100 that stores a map of the elevator floor containing information related to the location of the inner wall of the elevator and information related to the location of the markers 10 displayed on the inner floor of the elevator;
  • a vision sensor 200 for acquiring image information of the floor ahead within a specific range from the robot;
  • an image processing unit 300 that, based on the map of the elevator floor stored in the map storage unit 100, identifies the markers 10 in the image information obtained from the vision sensor 200 and generates or updates a map of the elevator floor marking as unoccupied the floor areas containing the identified markers 10;
  • a posture estimation unit 400 for estimating the posture of the robot based on the map of the elevator floor generated or updated by the image processing unit 300;
  • a path planning unit 500 that, based on the map of the elevator floor generated or updated by the image processing unit 300 and the posture of the robot estimated by the posture estimation unit 400, checks whether there is a target area reachable by the robot, determines a reachable target point, and plans a movement path to the target point;
  • and a robot control unit 600 for controlling the robot to move along the movement path generated by the path planning unit 500.
  • The marker 10 is characterized by using any one or more of the shape of an ArUco marker, the shape of a checkerboard, and the shape of a color pattern.
  • The image processing unit 300 obtains the area of the markers 10 that are completely identified on the elevator floor, then compares it with pre-stored information on the elevator floor to distinguish the aggregate area of unoccupied unit blocks, that is, only the area in which the robot can move.
  • The posture estimation unit 400 calculates the coordinates of the vision sensor 200 in the elevator interior map coordinate system, and obtains the direction information of the vision sensor, roll (φ), pitch (θ), and yaw (ψ), by the following expressions:
  • φ = Atan2(Y_z, Z_z), θ = −arcsin(X_z), ψ = Atan2(X_y, X_x)
  • (where Atan2 is the arctangent function that takes two arguments, arcsin is the arcsine function, and X, Y, and Z are the unit vectors in the x-, y-, and z-axis directions of the vision sensor coordinate system transformed into the elevator interior map coordinate system, with the subscripts x, y, and z denoting the x-, y-, and z-axis components of those vectors), and is thereby characterized by calculating the posture of the robot.
  • The robot control unit 600 is characterized in that, when the target area and the movement path cannot be secured, it controls the robot to return to the starting point or to wait in place.
  • A method for an indoor mobile robot to recognize the environment in an elevator is provided in the form of a program executed by an arithmetic processing means including a computer.
  • The floor map acquisition step (S20) obtains the area of the markers 10 completely identified on the elevator floor, then compares it with the pre-stored information on the elevator floor to distinguish the aggregate area of unoccupied unit blocks, that is, only the area in which the robot can move.
  • The coordinates of the vision sensor 200 in the elevator interior map coordinate system are calculated, and the direction information of the vision sensor, roll (φ), pitch (θ), and yaw (ψ), is obtained by the following expressions:
  • φ = Atan2(Y_z, Z_z), θ = −arcsin(X_z), ψ = Atan2(X_y, X_x)
  • (where Atan2 is the arctangent function that takes two arguments, arcsin is the arcsine function, and X, Y, and Z are the unit vectors in the x-, y-, and z-axis directions of the vision sensor coordinate system transformed into the elevator interior map coordinate system, with the subscripts x, y, and z denoting the x-, y-, and z-axis components of those vectors).
  • A computer-readable recording medium storing a program for implementing the method for the indoor mobile robot to recognize the environment in an elevator is provided.
  • A program stored in a computer-readable recording medium in order to implement the method for the indoor mobile robot to recognize the environment in an elevator is provided.
  • According to the apparatus and method for an indoor mobile robot to recognize the environment in an elevator, the recording medium storing a program for implementing the same, and the computer program stored in the medium for implementing the same, the robot's posture is estimated and its movement path planned based on the markers in the image obtained by the vision sensor, which provides the environment recognition necessary for the robot to autonomously board and get off the elevator.
  • This also solves the problem of frequent object-detection errors that arise when a lidar sensor is used, due to the characteristics of the elevator interior.
  • In addition, even if the robot fails to board the elevator, it returns to the boarding waiting point (the starting point) and waits, so that more stable elevator boarding is possible.
  • FIG. 1 is a block diagram of an apparatus for an indoor mobile robot to recognize the environment in an elevator according to an embodiment of the present invention.
  • FIG. 2 is an exemplary view showing the elevator door opening while the robot waits at the elevator boarding waiting point (starting point), making the floor inside the elevator visible.
  • FIG. 3 is an exemplary view comparing the coordinate system of the robot's vision sensor and the coordinate system of the elevator interior map.
  • FIG. 4 is an exemplary view of a case in which ArUco markers are installed on the floor of the elevator.
  • FIG. 5 is an exemplary view of a case in which a checkerboard pattern is installed on the floor of the elevator.
  • FIG. 6 is an exemplary view of a case in which a color pattern is installed on the floor of the elevator.
  • FIG. 7 is an exemplary view of a map of the elevator floor according to an embodiment of the present invention.
  • FIG. 8 is an exemplary view showing the robot setting a first target point and planning a path according to an embodiment of the present invention.
  • FIG. 9 is an exemplary view showing the robot setting a second target point from the first target point and planning a path according to an embodiment of the present invention.
  • FIG. 10 is an exemplary view showing a map of the elevator floor in which the size of the markers varies according to the importance of the interior space, according to an embodiment of the present invention.
  • FIG. 11 is a flowchart of a method for an indoor mobile robot to recognize the environment in an elevator according to an embodiment of the present invention.
  • The apparatus and method for an indoor mobile robot to recognize the environment in an elevator according to an embodiment of the present invention, the recording medium storing a program for implementing the same, and the computer program stored in the medium for implementing the same provide a technology for recognizing the environment required to board and get off an elevator, so that the indoor mobile robot can check by itself whether there is space to board, estimate its own location, and plan a route to move on and off the elevator by itself.
  • In addition, the elevator can know whether the indoor mobile robot attempts to get on or off, and whether the attempt succeeds.
  • Also, the indoor mobile robot can detect when the elevator door is fully opened.
  • It is assumed that the robot waits with its vision sensor 200 facing the front of the elevator.
  • The mobile robot may be moved to the elevator boarding standby position in advance either manually by a person or automatically through autonomous driving, without human operation.
  • A method for moving the mobile robot to the elevator boarding standby position is not addressed in the present invention.
  • The boarding standby position may also be the final target point when the mobile robot gets off the elevator.
  • The apparatus for the indoor mobile robot to recognize the environment in an elevator comprises a marker 10, a map storage unit 100, a vision sensor 200, an image processing unit 300, a posture estimation unit 400, a path planning unit 500, and a robot control unit 600.
  • A marker 10 is displayed on the inner floor of the elevator (see FIGS. 2 to 3).
  • The marker 10 may be any specific recognizable mark or pattern.
  • The markers 10 may be disposed over the entire interior floor of the elevator, or may be arranged according to a predetermined rule.
  • A plurality of different types of markers may also be arranged according to a predetermined rule.
  • For example, a QR (Quick Response) code may be used as a marker, and the QR code may contain information related to the characteristics of the inside of the elevator, such as its size and shape.
  • The map storage unit 100 stores a map of the elevator floor containing information (coordinates, features, size, etc.) related to the location of the inner wall of the elevator and information (coordinates, features, size, etc.) related to the location of the markers 10 displayed on the inner floor of the elevator.
  • That is, the map storage unit 100 stores the map of the elevator floor, which is the basic data used for estimating the posture of the robot and planning the movement path, and this map holds the information related to the locations of the markers 10.
  • The vision sensor 200 acquires image information of the floor ahead within a specific range from the robot.
  • The mobile robot may collect image data by using the vision sensor 200 installed on it.
  • The image processing unit 300 identifies the markers 10 in the image information obtained from the vision sensor 200 based on the map of the elevator floor stored in the map storage unit 100, and generates or updates a map of the elevator floor marking as unoccupied the floor areas containing the identified markers 10.
  • The mobile robot acquires a two-dimensional image with the vision sensor 200, which can be expressed in a coordinate system of the form shown in FIG. 3.
  • The image obtained from the vision sensor 200 undergoes image processing to extract the boundary lines of the left and right walls (see FIGS. 4 to 6 (c)) and the boundary lines between the floor and the unoccupied floor area (see FIGS. 4 to 6 (d)), from which a value for the occupancy of the floor can be obtained.
  • The mobile robot may estimate its own position in the process of boarding the elevator in the manner shown in FIG. 8.
  • Non-environmental obstacles refer to movable obstacles such as people, as opposed to obstacles fixedly placed in the elevator (environmental obstacles).
  • FIG. 7 shows an example of initially generating a map of the elevator floor, and FIG. 8 shows the map of the elevator floor being updated while the robot moves.
  • That is, the image processing unit 300 may obtain an image of the inside of the elevator including the markers 10 and generate (convert it into) a map of the elevator interior as shown in FIG. 7.
  • The target area may then be updated as shown in FIG. 8.
  • The map of the elevator floor created or updated in this way may be stored and managed in a separate storage space.
  • the posture estimation unit 400 estimates the posture of the robot based on the map of the elevator floor generated or updated by the image processing unit 300 .
  • the indoor mobile robot may estimate a posture including its position and direction based on the coordinate system of the map inside the elevator from the marker.
  • The path planning unit 500 checks, based on the map of the elevator floor generated or updated by the image processing unit 300 and the posture of the robot estimated by the posture estimation unit 400, whether there is a target area the robot can reach, then determines a reachable target point and plans a movement path to it.
  • Here, the target point means a point where the robot can get on or off the elevator while avoiding obstacles, and a target area is required for this purpose.
  • the robot control unit 600 controls the robot to move along the movement path generated by the path planning unit 500 .
  • The marker 10 of the apparatus for the indoor mobile robot to recognize the environment in an elevator may use any one or more of the shape of an ArUco marker, a checkerboard, and a color pattern.
  • The marker 10 is an artificial mark made in a certain format and can be used for tracking.
  • The ArUco marker consists of a two-dimensional bit pattern of size n x n and a black border area surrounding it.
  • The black border area enables quick recognition of the marker, and the two-dimensional bit pattern inside expresses the unique ID of the marker as a combination of white and black cells and is used to identify the marker (see FIG. 4).
  • The checkerboard form refers to a form in which two colors (such as white and black) are alternately arranged in a grid (see FIG. 5).
  • The color pattern form refers to a form in which various colors (such as red, blue, green, and yellow) are alternately arranged in a grid (see FIG. 6).
  • Although examples of various markers 10 have been described above, the present invention is not limited thereto; any trackable marker 10 is applicable, such as a form in which ArUco markers are inserted into the bright (white, blank, etc.) cells of a checkerboard.
  • The image processing unit 300 of the apparatus obtains the area of the markers 10 that are completely identified on the floor of the elevator, and then compares it with the pre-stored information on the elevator floor to distinguish the aggregate area of unoccupied unit blocks, that is, only the area in which the robot can move.
  • To this end, the acquired image is first converted into a binary image using a threshold value; various methods such as Canny edge detection or a local adaptive thresholding approach may be used, but the invention is not limited thereto.
  • Contours are then extracted from the binary image; various methods can be used, such as a Sobel filter or Suzuki's topological structural analysis of digitized binary images by border following, but the invention is not limited thereto.
  • Candidate positions of rectangular markers can then be found from the extracted contour image using the Douglas-Peucker algorithm.
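As an illustration of this step, the Douglas-Peucker algorithm recursively drops contour points that lie close to the chord between two endpoints; a contour that simplifies down to four corners is a candidate quadrilateral marker. A minimal pure-Python sketch (the function name and tolerance parameter are illustrative, not from the patent):

```python
import math

def douglas_peucker(points, epsilon):
    """Simplify a polyline with the Douglas-Peucker algorithm.
    Points whose distance to the chord is below epsilon are dropped;
    a contour that simplifies to 4 corners is a quadrilateral
    (marker) candidate."""
    def point_line_dist(p, a, b):
        # Perpendicular distance from p to the line through a and b.
        ax, ay = a; bx, by = b; px, py = p
        num = abs((bx - ax) * (ay - py) - (ax - px) * (by - ay))
        den = math.hypot(bx - ax, by - ay)
        return num / den if den else math.hypot(px - ax, py - ay)

    if len(points) < 3:
        return list(points)
    # Find the point farthest from the chord joining the endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = point_line_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax > epsilon:
        left = douglas_peucker(points[:index + 1], epsilon)
        right = douglas_peucker(points[index:], epsilon)
        return left[:-1] + right      # drop the duplicated split point
    return [points[0], points[-1]]
```

Near-collinear points collapse to the chord endpoints, while genuine corners survive, which is what makes the four-corner test usable on noisy contours.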
  • Here, the appropriate size refers to the size of a rectangle that can be detected as a marker when the robot looks at a marker on the elevator floor from the position where it waits to board, and may be predetermined.
  • For each candidate region consisting of a rectangle of appropriate size, the perspective projection effect is removed using a homography matrix, converting the floor marker into the rectangular shape it would have when viewed vertically from above.
  • The image inside the marker obtained in the above step is binarized using Otsu's method and divided into a grid at regular intervals; dark cells can be converted to 0 and bright cells to 1.
  • For the converted data, referring to the combination of bits excluding the border part (converted to 0), it is checked whether the marker belongs to the robot's marker dictionary; markers that do not belong to the dictionary are excluded, and the area of complete markers can thus be determined.
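The border check and dictionary lookup described above can be sketched as follows. This hypothetical helper assumes the marker patch has already been rectified and averaged into per-cell intensities; it rejects candidates whose border is not uniformly black and matches the inner bit pattern (in any of its four rotations) against a marker dictionary:

```python
import numpy as np

def decode_marker(cell_means, threshold):
    """Decode a rectified marker patch into an n x n bit matrix.

    cell_means: (n+2) x (n+2) array of mean intensities per grid cell,
    including the black border. Returns the inner bits, or None if the
    border is not uniformly dark (i.e. not a valid marker candidate)."""
    bits = (np.asarray(cell_means, dtype=float) > threshold).astype(int)
    border = np.concatenate([bits[0, :], bits[-1, :], bits[:, 0], bits[:, -1]])
    if border.any():          # every border cell must be black (0)
        return None
    return bits[1:-1, 1:-1]

def match_dictionary(bits, dictionary):
    """Look up the decoded bit pattern, in any of its 4 rotations, in a
    marker dictionary {id: n x n bit matrix}. Returns the id or None."""
    if bits is None:
        return None
    for marker_id, ref in dictionary.items():
        ref = np.asarray(ref)
        for k in range(4):
            if np.array_equal(np.rot90(bits, k), ref):
                return marker_id
    return None
```

A candidate that fails either check is treated as a non-marker region, matching the exclusion rule in the text above.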
  • Otherwise, the corresponding part is regarded as an obstacle rather than a marker.
  • In the case of the checkerboard, candidate positions of rectangular markers are found in the same way.
  • If the check fails, the corresponding area is regarded as a non-marker area, and the area of complete markers is determined.
  • In the case of the color pattern, candidate markers having a rectangular shape are likewise found in the same way.
  • After the area of the complete markers on the elevator floor is obtained, it is compared with the pre-stored information on the elevator floor to distinguish only the unoccupied area, that is, the area in which the mobile robot can move.
  • An area determined to be a non-marker area can be regarded as occupied, or as invisible due to non-environmental obstacles.
  • In addition, the boundary line between the elevator floor and the wall can be found by comparing the pre-stored information about the elevator floor with the acquired image.
  • That is, information on the markers corresponding to the left and right ends of the detected elevator-floor markers is derived, and then compared with the pre-stored information on the elevator floor to determine the inner wall of the elevator.
  • In the case of the ArUco marker, the unoccupied space means the set of portions in which the fiducial markers are completely recognized.
  • A portion where a fiducial marker is not recognized may be regarded as occupied space.
  • In the case of the checkerboard, the unoccupied space means the set of parts recognized as perfect rectangles of the checker pattern.
  • If the shape of a detected checker cell is not rectangular, it may be regarded as occupied space.
  • In the case of the color pattern, the unoccupied space means the set of cells in which only the one predefined color is detected at each position of the rectangular color pattern.
  • If the space detected with one color is not a rectangle, it may be regarded as occupied space.
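The rule common to all three marker types, namely that a unit block counts as free only when its marker is recognized completely and counts as occupied otherwise, can be sketched as a simple map update (the cell-set representation below is an assumption for illustration, not the patent's data structure):

```python
def build_floor_map(stored_marker_cells, detected_complete_cells):
    """Mark each unit block of the stored elevator-floor map as free (0)
    or occupied (1). A block is free only when the marker known to lie
    there was detected completely; otherwise it is treated as occupied
    (e.g. hidden by a non-environmental obstacle such as a person).

    stored_marker_cells: set of (row, col) cells that carry a marker.
    detected_complete_cells: set of (row, col) cells whose marker was
    fully recognized in the current image."""
    occupancy = {}
    for cell in stored_marker_cells:
        occupancy[cell] = 0 if cell in detected_complete_cells else 1
    return occupancy
```

Re-running this on each new image yields the continuously updated floor map the path planner consumes.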
  • The posture estimation unit 400 of the apparatus calculates the coordinates of the vision sensor 200 in the elevator interior map coordinate system, and obtains the direction information of the vision sensor, roll (φ), pitch (θ), and yaw (ψ), by the following expressions:
  • φ = Atan2(Y_z, Z_z), θ = −arcsin(X_z), ψ = Atan2(X_y, X_x)
  • (where Atan2 is the arctangent function that takes two arguments, arcsin is the arcsine function, and X, Y, and Z are the unit vectors in the x-, y-, and z-axis directions of the vision sensor coordinate system transformed into the elevator interior map coordinate system, with the subscripts x, y, and z denoting the x-, y-, and z-axis components of those vectors).
  • the map storage unit 100 may store an elevator internal map of the form shown in FIG. 7 .
  • the elevator interior map may display the interior wall of the elevator and coordinates where each marker is located.
  • a point A on the marker on the floor of the elevator is located at (x, y, z) based on the elevator interior map coordinate system.
  • a point A' on the image measured by the vision sensor of the robot with respect to the point A is at (a, b, c) with respect to the three-dimensional coordinate system with the position of the vision sensor as the origin, that is, the vision sensor coordinate system.
  • Here, k1, k2, and k3 are the coefficients for radial distortion, and p1 and p2 are the coefficients for tangential distortion.
  • a point (a, b, c) in the vision sensor coordinate system is expressed as a point A' in the two-dimensional coordinate system of the image acquired by the vision sensor.
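The projection of the point (a, b, c) to A' with the radial and tangential distortion coefficients mentioned above follows, in the usual Brown-Conrady camera model, the sketch below. The exact equations in the patent are given as images and are not reproduced here; the intrinsics fx, fy, cx, cy are standard pinhole parameters, not named in the source:

```python
def project_point(point_cam, fx, fy, cx, cy, k1, k2, k3, p1, p2):
    """Project a 3-D point (a, b, c) in the vision-sensor frame to 2-D
    image coordinates, applying radial (k1, k2, k3) and tangential
    (p1, p2) distortion per the Brown-Conrady model."""
    a, b, c = point_cam
    x, y = a / c, b / c                  # normalized image coordinates
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return fx * x_d + cx, fy * y_d + cy
```

With all distortion coefficients zero this reduces to the plain pinhole projection, which is a useful sanity check.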
  • The direction information of the vision sensor, roll (φ), pitch (θ), and yaw (ψ), can be obtained as follows:
  • φ = Atan2(Y_z, Z_z), θ = −arcsin(X_z), ψ = Atan2(X_y, X_x)
  • Here, Atan2 is the arctangent function that takes two arguments, and arcsin is the arcsine function.
  • X, Y, and Z are the unit vectors in the x-, y-, and z-axis directions of the vision sensor coordinate system transformed into the elevator interior map coordinate system, and the subscripts x, y, and z denote the x-, y-, and z-axis components of those vectors.
  • the coordinates of the vision sensor can be obtained from the map coordinate system inside the elevator, and the posture of the mobile robot can be calculated through this.
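Put together, the roll, pitch, and yaw expressions above amount to extracting ZYX Euler angles from the rotation matrix whose columns are the sensor axes expressed in the map frame. A sketch under that assumption (the sign conventions may differ from those in the patent's original equations, which survive only as images):

```python
import math

def euler_from_axes(x_axis, y_axis, z_axis):
    """Recover roll, pitch, yaw from the vision-sensor axis unit vectors
    expressed in the elevator-interior map frame (i.e. the columns of
    the sensor-to-map rotation matrix), using the ZYX Euler convention."""
    roll = math.atan2(y_axis[2], z_axis[2])    # Atan2(Y_z, Z_z)
    pitch = -math.asin(x_axis[2])              # -arcsin(X_z)
    yaw = math.atan2(x_axis[1], x_axis[0])     # Atan2(X_y, X_x)
    return roll, pitch, yaw
```

For a sensor rotated only about the vertical axis, roll and pitch come out zero and yaw equals the rotation angle, matching the 2-D-map case the embodiment uses.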
  • Although the pitch and roll information of the vision sensor is not used in this embodiment, it may be required if a three-dimensional map is used as the elevator interior map.
  • an area in which an intact marker is found can be determined as a space where the robot can move.
  • the map is updated as shown in FIG. 8 .
  • an elevator interior map as shown in FIG. 8 is generated in real time.
  • The coordinates can be expressed by matching the markers of the two-dimensional image acquired by the vision sensor one-to-one with the markers on the robot's elevator interior map.
  • The unique identification information (ID) of each marker may be used in matching the markers.
  • In the case of a color pattern, matching may be performed using the unique color information and the vertical and horizontal arrangement of the colors.
  • The robot control unit 600 of the apparatus controls the robot to return to the starting point or wait in place if the target area and the movement path cannot be secured.
  • That is, the robot has to wait for the next elevator if there is no space to board.
  • The markers 10 may also be arranged with varying sizes (resolutions) according to the importance of areas of the elevator's inner floor.
  • For example, the size of the markers 10 may decrease toward the edges of the elevator's inner floor.
  • Any marker 10 can be used as long as the posture of the robot can be estimated through it.
  • A method for the indoor mobile robot to recognize the environment in an elevator, implemented in the form of a program executed by an arithmetic processing means including a computer, includes a floor detection step (S10), a floor map acquisition step (S20), and a movement route planning step (S30).
  • In the floor detection step (S10), image information of the floor ahead within a specific range is acquired from the robot using the vision sensor 200.
  • The vision sensor 200 may acquire image information of the floor ahead within a specific range from the robot periodically or in real time.
  • That is, the floor detection step (S10) may be performed periodically or in real time.
  • The robot starts the algorithm for boarding the elevator while initially facing the inside of the elevator.
  • Thereafter, the floor map acquisition step (S20), described below, may be performed.
  • in the floor map acquisition step (S20), based on a map of the elevator floor that stores information on the locations of the elevator's inner walls and of the markers 10 displayed on its inner floor, the markers 10 are identified in the image information obtained from the floor detection step (S10), and a map of the unoccupied elevator floor containing the identified markers 10 is created or updated.
  • the mobile robot acquires an image with the vision sensor 200 at the elevator boarding standby position and searches for the markers 10 in the acquired image.
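The update described in step (S20) can be sketched as follows: unit blocks whose marker was completely identified in the current image are marked unoccupied, and known marker blocks whose marker was not fully seen are marked occupied. This is a minimal sketch; the grid encoding (0 free, 1 occupied, -1 unknown) and the `known_marker_cells` lookup are illustrative assumptions, not details from the disclosure.

```python
def update_floor_map(floor_map, detected_ids, known_marker_cells):
    """Update an elevator-floor occupancy grid from one marker-detection pass.

    A unit block whose marker ID was completely identified in the current image
    is marked free (0); a known marker block whose marker was not fully seen is
    marked occupied (1). Blocks without a known marker keep their value.
    `known_marker_cells` maps marker ID -> (row, col) from the stored floor map.
    """
    for marker_id, (row, col) in known_marker_cells.items():
        floor_map[row][col] = 0 if marker_id in detected_ids else 1
    return floor_map
```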
  • in the movement path planning step (S30), based on the map of the elevator floor updated in the floor map acquisition step (S20), the posture of the robot is estimated; after confirming whether there is a target area the robot can reach, a reachable target point is determined and a route to it is planned.
  • the size of the mobile robot is taken into consideration when searching for the reachable target area and target point and planning the route.
  • if the target area to which the target point belongs is a space the robot can enter, a path to that area can be planned.
  • the robot can be moved after the current posture of the mobile robot is estimated and a movement path is created.
  • a path to the target point of the corresponding target area may be generated and movement may then be started.
  • the movement path may be generated as a single path that can be followed to the end at once, or may consist of a finite number of partial paths.
  • if the target is reached, the process terminates; if not, the process may be repeated by returning to the image acquisition step.
  • if the process terminates without the target point of the target area being reached, the robot may return to the initial point.
  • a general autonomous driving algorithm can be used to generate a route.
  • global route planning algorithms such as the A* algorithm and Dijkstra's algorithm, and local route planning algorithms such as the Dynamic Window Approach (DWA) and Timed Elastic Band (TEB) algorithms, can be used.
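As a sketch of how such a global planner searches the floor map, here is a minimal A* implementation on a 4-connected occupancy grid; the unit step cost and Manhattan heuristic are assumptions for illustration, not details from the present disclosure.

```python
import heapq

def astar(grid, start, goal):
    """A* search on a 4-connected occupancy grid; grid[r][c] == 0 means free.

    Returns the list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance: admissible on a 4-connected unit-cost grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), start)]
    came_from = {}
    g_cost = {start: 0}
    closed = set()
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur in closed:
            continue
        if cur == goal:  # reconstruct path by walking parents back to start
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        closed.add(cur)
        r, c = cur
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nb not in closed:
                new_g = g_cost[cur] + 1
                if new_g < g_cost.get(nb, float("inf")):
                    g_cost[nb] = new_g
                    came_from[nb] = cur
                    heapq.heappush(open_set, (new_g + h(nb), nb))
    return None
```

In the elevator setting, the grid would come from the floor map of unoccupied unit blocks built in step (S20).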
  • An object of the present invention is to provide an environment recognition method and apparatus for autonomous driving, not an autonomous driving algorithm itself.
  • the present invention provides a method for recognizing the environment.
  • the robot can recognize its current posture as follows by using its driving information and its previous posture.
  • the amount of change in the robot's posture can be obtained through the gyro sensor and the motor encoder values.
  • An example of a particle filter-based self-position estimation method using the environment recognition method proposed in the present invention is as follows.
  • the particle filter creates a number (N) of particles as candidates for the robot's posture, and updates the motion of each particle using the measured amount of posture change.
  • the current posture is determined from the weighted particles.
  • the current posture is continuously estimated whenever a predetermined time elapses or the posture changes.
  • resampling is a process of excluding the particles with low weights and dividing the particles with high weights into several copies, thereby renewing the N particles.
  • the particles expressed as candidates for the robot's own posture can thus converge to the set of particles with high weights, which reduces the uncertainty of the posture estimation.
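The particle-filter loop described above (motion update from odometry, weighting against the marker-based pose estimate, then resampling) can be sketched for a planar pose (x, y, θ). The noise level, the weight function 1/(1+error), and the low-variance resampling scheme are assumptions for the sketch, not values from the disclosure.

```python
import math
import random

def motion_update(particles, d_pose, sigma=0.02):
    """Propagate each particle (x, y, theta) by the odometry change (dx, dy,
    dtheta) measured in the robot frame, adding Gaussian noise to model drift."""
    dx, dy, dth = d_pose
    moved = []
    for x, y, th in particles:
        c, s = math.cos(th), math.sin(th)
        moved.append((x + c * dx - s * dy + random.gauss(0, sigma),
                      y + s * dx + c * dy + random.gauss(0, sigma),
                      th + dth + random.gauss(0, sigma)))
    return moved

def weight_update(particles, marker_pose):
    """Weight each particle inversely proportionally to its error against the
    marker-based pose estimate (x, y, theta), then normalize the weights."""
    weights = []
    for x, y, th in particles:
        err = math.hypot(x - marker_pose[0], y - marker_pose[1]) + abs(th - marker_pose[2])
        weights.append(1.0 / (1.0 + err))
    total = sum(weights)
    return [w / total for w in weights]

def resample(particles, weights):
    """Low-variance resampling: drop low-weight particles and duplicate
    high-weight ones, keeping the particle count N constant."""
    n = len(particles)
    step = 1.0 / n
    u = random.random() * step
    cumulative, i, out = weights[0], 0, []
    for _ in range(n):
        while u > cumulative and i < n - 1:
            i += 1
            cumulative += weights[i]
        out.append(particles[i])
        u += step
    return out
```

Repeating the three calls concentrates the particle cloud around the marker-based estimate, which is the convergence behavior described above.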
  • the environment recognition method of the present invention may be utilized.
  • the marker information acquired by the real robot is used to estimate the robot's posture information.
  • the posture information of the robot can be calculated from the pose of the vision sensor in the elevator interior map coordinate system.
  • the weight of each particle may be calculated so that it is inversely proportional to the error value obtained by comparing the particle's posture information, including position and direction, with the estimated posture information of the robot.
  • the candidate having the highest weight may be regarded as the current position.
  • the robot plans a path and moves toward a target point in the target area while continuously updating its position.
  • the robot may not be able to check the marker through the vision sensor.
  • in that case, the robot estimates its own position depending on the encoder values while maintaining the most recent weight values.
  • the weight is updated again and the resampling process is repeated.
  • blind spots that were not initially visible can be recognized by the vision sensor.
  • the target area may be updated with a new point.
  • the criterion for updating the target area may be to give a high weight to points farther inside the elevator, that is, to larger values in the y-axis direction.
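The target-area update criterion above (favoring larger y, i.e. points deeper inside the elevator) might be sketched as a scoring function over reachable free cells; the gain `y_weight` and the centering penalty on |x| are assumed heuristics, not values from the disclosure.

```python
def choose_target(free_cells, y_weight=2.0):
    """Choose a target point among reachable free cells (x, y), giving a high
    weight to the value in the y-axis direction (deeper inside the elevator).

    The gain y_weight and the centering penalty |x| are assumed heuristics.
    """
    return max(free_cells, key=lambda cell: y_weight * cell[1] - abs(cell[0]))
```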
  • according to the present invention, it is possible for a mobile robot to enter a narrow space such as an elevator at low cost.
  • the floor map acquisition step (S20) of the method for the indoor mobile robot to recognize the environment in the elevator may be characterized in that the area of the completely identified markers 10 on the elevator floor is acquired and then compared with the previously stored information on the elevator floor, so that only the aggregate area of unoccupied unit blocks, that is, only the area in which the robot can move, is distinguished.
  • the acquired image is first converted into a binary image using a threshold value; various methods such as Canny edge detection or a local adaptive thresholding approach may be used for this, but the method is not limited thereto.
  • contours are extracted from the converted binary image; various methods such as a Sobel filter or Suzuki's topological structural analysis of digitized binary images by border following may be used, but the method is not limited thereto.
  • candidate positions of rectangular markers can be found from the extracted contour image by using the Douglas-Peucker algorithm.
  • the perspective projection effect is removed by applying a homography matrix to the part of the original image corresponding to each marker candidate, converting the elevator floor marker into the rectangular shape it has when viewed vertically.
  • the image inside the marker is converted into a binary image, divided into a grid at regular intervals, and each cell is converted to a number, 0 or 1; if the resulting combination of numbers, excluding the border part that is converted to 0, does not correspond to a marker, the corresponding part is regarded as an obstacle rather than a marker.
  • the position of the candidate group of rectangular markers is found in the same way.
  • if a marker is only partially visible, the corresponding area is regarded as a non-marker area, and only the area of complete markers is determined.
  • a candidate group for rectangular markers is found in the same way.
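The Douglas-Peucker step used above to find rectangular marker candidates can be sketched in pure Python (practical implementations typically use OpenCV's `approxPolyDP`); the tolerance value is an assumption. A densely sampled closed contour that simplifies to four unique corners is a rectangle candidate.

```python
import math

def _point_segment_distance(p, a, b):
    """Perpendicular distance from point p to the segment ab."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def douglas_peucker(points, eps):
    """Simplify a polyline: keep the point farthest from the end-to-end chord
    if it deviates by more than eps, and recurse on both halves."""
    if len(points) < 3:
        return list(points)
    a, b = points[0], points[-1]
    index, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _point_segment_distance(points[i], a, b)
        if d > dmax:
            index, dmax = i, d
    if dmax <= eps:
        return [a, b]
    left = douglas_peucker(points[: index + 1], eps)
    right = douglas_peucker(points[index:], eps)
    return left[:-1] + right
```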
  • after the area of the complete markers on the elevator floor is acquired, it is compared with the previously stored information on the elevator floor, making it possible to distinguish only the unoccupied area, that is, the area in which the mobile robot can move.
  • the boundary line of the elevator floor can be found by comparing the previously stored information about the elevator floor with the acquired image.
  • information on the markers corresponding to the left and right ends is derived and then compared with the previously stored information about the elevator floor to determine whether it is information at an end.
  • the unoccupied space means a set of portions in which the fiducial marker is completely recognized.
  • a portion where the reference marker is not recognized may be regarded as an occupied space.
  • alternatively, the unoccupied space means a set of parts in which a checkered pattern is recognized in the form of a perfect rectangle.
  • if the shape of the checkered pattern detected in the above example is not rectangular, the corresponding part may be regarded as an occupied space.
  • the unoccupied space means a set of spaces in which only one predefined color is detected at each position of the rectangular color pattern.
  • the space detected with one color is not a rectangle, it may be regarded as an occupied space.
  • in the movement path planning step (S30), the coordinates of the vision sensor 200 are obtained in the map coordinate system inside the elevator.
  • the direction information of the vision sensor, roll (φ), pitch (θ), and yaw (ψ), is given by the following expressions:
  •   roll φ = atan2(y′_z, z′_z), pitch θ = −asin(x′_z), yaw ψ = atan2(x′_y, x′_x)
  • here, atan2 is the arctangent function that takes two arguments and asin is the arcsine function; x′, y′, and z′ denote the unit vectors in the x-, y-, and z-axis directions of the vision sensor coordinate system transformed into the elevator interior map coordinate system, and the subscripts denote their x-, y-, and z-axis components in that coordinate system.
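The roll/pitch/yaw extraction described here can be checked numerically; the sketch assumes the Z-Y-X (yaw-pitch-roll) convention, which is consistent with the component relations described in this section but is an assumption rather than a detail taken from the disclosure.

```python
import math

def euler_zyx_from_matrix(R):
    """Recover roll (phi), pitch (theta), and yaw (psi) from a rotation matrix
    R whose columns are the vision sensor's x-, y-, and z-axis unit vectors
    expressed in the elevator interior map coordinate system (Z-Y-X convention)."""
    phi = math.atan2(R[2][1], R[2][2])   # roll: z-components of the y and z axes
    theta = -math.asin(R[2][0])          # pitch: z-component of the x axis
    psi = math.atan2(R[1][0], R[0][0])   # yaw: x- and y-components of the x axis
    return phi, theta, psi
```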
  • the map storage unit 100 may store an elevator internal map of the form shown in FIG. 7 .
  • the elevator interior map may display the interior wall of the elevator and coordinates where each marker is located.
  • a point A on the marker on the floor of the elevator is located at (x, y, z) based on the elevator interior map coordinate system.
  • a point A' on the image measured by the vision sensor of the robot for the point A is at (a, b, c) with respect to the three-dimensional coordinate system with the position of the vision sensor as the origin, that is, the vision sensor coordinate system.
  • here, k1, k2, and k3 are the coefficients for radial distortion, and p1 and p2 are the coefficients for tangential distortion.
  • a point (a, b, c) in the vision sensor coordinate system is expressed as a point A' in the two-dimensional coordinate system of the image acquired by the vision sensor.
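The projection of a point (a, b, c) in the vision sensor coordinate system to the image point A′ can be sketched with a pinhole model plus the Brown-Conrady radial (k1, k2, k3) and tangential (p1, p2) distortion terms named above; the intrinsic parameters fx, fy, cx, cy are illustrative assumptions.

```python
def project_point(p_cam, fx, fy, cx, cy, k=(0.0, 0.0, 0.0), t=(0.0, 0.0)):
    """Project a 3-D point (a, b, c) in the vision-sensor coordinate system to
    image pixel coordinates (u, v) using a pinhole model with Brown-Conrady
    radial (k1, k2, k3) and tangential (p1, p2) distortion."""
    a, b, c = p_cam
    x, y = a / c, b / c                  # normalized image coordinates
    r2 = x * x + y * y
    k1, k2, k3 = k
    p1, p2 = t
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return fx * xd + cx, fy * yd + cy
```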
  • the direction information of the vision sensor, roll (φ), pitch (θ), and yaw (ψ), can be obtained as follows, where atan2 is the arctangent function that takes two arguments and asin is the arcsine function.
  • - roll φ = atan2(y′_z, z′_z): here y′_z is the z-axis component of y′, the unit vector in the y-axis direction of the vision sensor coordinate system transformed into the elevator interior map coordinate system, and z′_z is the z-axis component of z′, the correspondingly transformed unit vector in the z-axis direction.
  • - pitch θ = −asin(x′_z) and yaw ψ = atan2(x′_y, x′_x): here x′_x, x′_y, and x′_z are the x-, y-, and z-axis components of x′, the correspondingly transformed unit vector in the x-axis direction of the vision sensor coordinate system.
  • the coordinates of the vision sensor can be obtained from the map coordinate system inside the elevator, and the posture of the mobile robot can be calculated through this.
  • although pitch information and roll information of the vision sensor are not used in this embodiment, pitch information and roll information may be required if a three-dimensional map is used as the elevator interior map.
  • an area in which an intact marker is found can be determined as a space where the robot can move.
  • the map is updated as shown in FIG. 8 .
  • an elevator interior map as shown in FIG. 8 is generated in real time.
  • the coordinates can be expressed by matching each marker in the two-dimensional image acquired by the vision sensor one-to-one with the corresponding marker on the robot's map of the elevator interior.
  • unique identification information (ID) of each marker may be used in matching the markers.
  • matching may be performed using unique color information and color patterns arranged vertically and horizontally.
  • a computer-readable recording medium storing a program for implementing the method for the indoor mobile robot to recognize the environment in the elevator, and a computer program stored in a computer-readable recording medium for implementing the method, can also be implemented.
  • it will be understood that the above-described method for the indoor mobile robot to recognize the environment in the elevator may be tangibly implemented as a program of instructions for implementing it and included in a recording medium that can be read by a computer. In other words, it may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable recording medium.
  • the computer-readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the computer-readable recording medium may be specially designed and configured for the present invention, or may be known and used by those skilled in the art of computer software.
  • Examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, flash memory, USB memory, and the like. Examples of program instructions include not only machine language code such as that generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to an apparatus and method for an indoor mobile robot to recognize the environment in an elevator, a recording medium storing a program for executing the same, and a computer program stored on a medium for executing the same, and more specifically, to an apparatus and method for an indoor mobile robot to recognize the environment in an elevator which provide the environment recognition method the indoor mobile robot needs in order to enter and exit the elevator, a recording medium storing a program for executing the same, and a computer program stored on a medium for executing the same.
PCT/KR2021/004457 2020-04-29 2021-04-09 Apparatus and method for recognizing the environment of an indoor mobile robot in an elevator, recording medium storing a program for executing same, and computer program stored on the medium for executing same WO2021221343A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200052101A KR102194426B1 (ko) 2020-04-29 2020-04-29 Apparatus and method for an indoor mobile robot to recognize the environment in an elevator, recording medium storing a program for implementing the same, and computer program stored in a medium for implementing the same
KR10-2020-0052101 2020-04-29

Publications (1)

Publication Number Publication Date
WO2021221343A1 true WO2021221343A1 (fr) 2021-11-04

Family

ID=74087451

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/004457 WO2021221343A1 (fr) 2020-04-29 2021-04-09 Apparatus and method for recognizing the environment of an indoor mobile robot in an elevator, recording medium storing a program for executing same, and computer program stored on the medium for executing same

Country Status (2)

Country Link
KR (1) KR102194426B1 (fr)
WO (1) WO2021221343A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114234966A (zh) * 2021-12-01 2022-03-25 北京云迹科技股份有限公司 移动机器人的乘梯状态检测方法、装置、存储介质及设备
CN114397885A (zh) * 2021-12-16 2022-04-26 北京三快在线科技有限公司 辅助移动机器人乘坐电梯的方法、电子设备及存储介质
CN114434453A (zh) * 2021-12-31 2022-05-06 上海擎朗智能科技有限公司 一种机器人乘梯方法、系统、机器人和存储介质
CN114905508A (zh) * 2022-04-19 2022-08-16 合肥综合性国家科学中心人工智能研究院(安徽省人工智能实验室) 基于异构特征融合的机器人抓取方法
CN115167454A (zh) * 2022-08-03 2022-10-11 北京京东乾石科技有限公司 用于机器人的控制方法、装置、电子设备及存储介质

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102194426B1 (ko) * 2020-04-29 2020-12-24 주식회사 트위니 실내 이동 로봇이 엘리베이터에서 환경을 인식하기 위한 장치 및 방법, 이를 구현하기 위한 프로그램이 저장된 기록매체 및 이를 구현하기 위해 매체에 저장된 컴퓨터프로그램
CN114911221B (zh) * 2021-02-09 2023-11-28 北京小米机器人技术有限公司 机器人的控制方法、装置及机器人
KR20220118329A (ko) 2021-02-18 2022-08-25 호서대학교 산학협력단 이동 로봇의 승강기 탑승 제어 방법
KR20220118327A (ko) 2021-02-18 2022-08-25 호서대학교 산학협력단 이동 로봇의 승강기 탑승 제어 방법
KR20220118328A (ko) 2021-02-18 2022-08-25 호서대학교 산학협력단 작업 우선순위에 따른 이동 로봇의 승강기 탑승 제어 방법
KR102463725B1 (ko) * 2021-02-23 2022-11-07 현대자동차주식회사 위치 추정 장치, 그를 포함한 로봇 시스템 및 그에 관한 방법
CN113021336B (zh) * 2021-02-25 2022-07-05 上海交通大学 一种基于主从移动操作机器人的档案取放系统及方法
US20240184305A1 (en) * 2021-03-24 2024-06-06 Hoseo University Academic Cooperation Foundation Mobile robot for determining whether to board elevator, and operating method therefor
CN113848918A (zh) * 2021-09-27 2021-12-28 上海景吾智能科技有限公司 机器人快速高效低成本的部署方法和系统
CN114505840B (zh) * 2022-01-14 2023-10-20 浙江工业大学 一种自主操作箱式电梯的智能服务机器人
WO2024058411A1 (fr) * 2022-09-15 2024-03-21 삼성전자주식회사 Espace spécifique de déplacement de robot mobile et son procédé de commande
CN118544368B (zh) * 2024-07-30 2024-09-27 西湖交互机器科技(杭州)有限公司 一种基于Aruco码矩阵的机器人手眼标定方法及应用

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100111795A (ko) * 2009-04-08 2010-10-18 (주) 한호기술 Autonomous movement system for a robot and autonomous mobile robot usable in the system
JP2011203257A (ja) * 2005-08-01 2011-10-13 Toyota Motor Corp Posture angle detection device for a moving body
KR101864948B1 (ko) * 2016-10-31 2018-07-04 고려대학교 산학협력단 Method for controlling a mobile robot to get on and off an elevator
KR20190082674A (ko) * 2017-12-31 2019-07-10 사르코스 코퍼레이션 Covert identification tags viewable by robots, and robot devices
KR20190120104A (ko) * 2019-07-03 2019-10-23 엘지전자 주식회사 Marker, method of moving in marker-following mode, and robot implementing the same
KR102194426B1 (ko) * 2020-04-29 2020-12-24 주식회사 트위니 Apparatus and method for an indoor mobile robot to recognize the environment in an elevator, recording medium storing a program for implementing the same, and computer program stored in a medium for implementing the same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100877071B1 (ko) 2007-07-18 2009-01-07 삼성전자주식회사 Method and apparatus for particle filter-based pose estimation of a mobile robot

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114234966A (zh) * 2021-12-01 2022-03-25 北京云迹科技股份有限公司 移动机器人的乘梯状态检测方法、装置、存储介质及设备
CN114234966B (zh) * 2021-12-01 2024-06-04 北京云迹科技股份有限公司 移动机器人的乘梯状态检测方法、装置、存储介质及设备
CN114397885A (zh) * 2021-12-16 2022-04-26 北京三快在线科技有限公司 辅助移动机器人乘坐电梯的方法、电子设备及存储介质
CN114434453A (zh) * 2021-12-31 2022-05-06 上海擎朗智能科技有限公司 一种机器人乘梯方法、系统、机器人和存储介质
CN114434453B (zh) * 2021-12-31 2024-06-07 上海擎朗智能科技有限公司 一种机器人乘梯方法、系统、机器人和存储介质
CN114905508A (zh) * 2022-04-19 2022-08-16 合肥综合性国家科学中心人工智能研究院(安徽省人工智能实验室) 基于异构特征融合的机器人抓取方法
CN114905508B (zh) * 2022-04-19 2023-08-22 合肥综合性国家科学中心人工智能研究院(安徽省人工智能实验室) 基于异构特征融合的机器人抓取方法
CN115167454A (zh) * 2022-08-03 2022-10-11 北京京东乾石科技有限公司 用于机器人的控制方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
KR102194426B1 (ko) 2020-12-24

Similar Documents

Publication Publication Date Title
WO2021221343A1 (fr) Apparatus and method for recognizing the environment of an indoor mobile robot in an elevator, recording medium storing a program for executing same, and computer program stored on the medium for executing same
WO2017188706A1 (fr) Mobile robot and mobile robot control method
WO2018074904A1 (fr) Mobile robot and method for controlling the mobile robot
WO2021010757A1 (fr) Mobile robot and control method thereof
AU2020247141B2 (en) Mobile robot and method of controlling the same
WO2017078304A1 (fr) Cleaning robot and control method thereof
WO2022050507A1 (fr) Method and system for monitoring a photovoltaic power generation module
WO2018139796A1 (fr) Mobile robot and control method therefor
WO2018212608A1 (fr) Mobile marking system, method for controlling a mobile marking device, and computer-readable recording medium
WO2020111808A1 (fr) Autonomous-driving cart
WO2018062647A1 (fr) Apparatus for generating normalized metadata, apparatus for detecting object occlusion, and methods therefor
WO2017105130A1 (fr) Communication device and electronic device containing same
WO2021221344A1 (fr) Apparatus and method for recognizing the environment of a mobile robot in an environment with a slope, recording medium storing a program for implementing same, and computer program for implementing same stored in the medium
AU2020244635B2 (en) Mobile robot control method
WO2016028021A1 (fr) Cleaning robot and control method thereof
WO2020242260A1 (fr) Method and device for machine-learning-based image compression using a global context
WO2022045464A1 (fr) Multifunctional autonomous service robot
WO2018182170A1 (fr) System for speed-power analysis of a sailboat under standard sailing conditions
WO2019199112A1 (fr) Autonomous working system and method, and computer-readable recording medium
WO2019054676A1 (fr) Mobile robot system and method for controlling the same
WO2021006436A1 (fr) Lawn mower robot and control method therefor
WO2018182171A1 (fr) Method for performing standard sailing speed-power analysis of a sailing vessel
WO2020222408A1 (fr) Method for real-time waypoint path improvement, recording medium storing a program for implementing same, and computer program stored in a medium for implementing same
WO2022103236A1 (fr) Player tracking method, player tracking device, and player tracking system
WO2023043079A1 (fr) Autonomous-driving service robot means with transparent object recognition

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21797643

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21797643

Country of ref document: EP

Kind code of ref document: A1