US12320900B2 - Apparatus and method for generating dynamic safety zone of mobile robot - Google Patents

Apparatus and method for generating dynamic safety zone of mobile robot

Info

Publication number
US12320900B2
Authority
US
United States
Prior art keywords
mobile robot
safety zone
safety
movement
dynamic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US18/316,889
Other versions
US20240075620A1 (en)
Inventor
Seong Ju Park
Dong Hyeon SEO
Seung Ho Jang
Min Chang
Yun Jib Kim
Chang Woo Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Miele und Cie KG
Yujin Robot Co Ltd
Original Assignee
Miele und Cie KG
Yujin Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020210153768A external-priority patent/KR102625700B1/en
Application filed by Miele und Cie KG, Yujin Robot Co Ltd filed Critical Miele und Cie KG
Assigned to MIELE & CIE. KG and YUJIN ROBOT CO., LTD. (assignment of assignors' interest; see document for details). Assignors: CHANG, MIN; JANG, SEUNG HO; KIM, CHANG WOO; KIM, YUN JIB; PARK, SEONG JU; SEO, DONG HYEON
Publication of US20240075620A1 publication Critical patent/US20240075620A1/en
Application granted granted Critical
Publication of US12320900B2 publication Critical patent/US12320900B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical


Classifications

    • G01S 17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
    • B25J 11/008: Manipulators for service tasks
    • B25J 13/089: Determining the position of the robot with reference to its environment
    • B25J 19/022: Optical sensing devices using lasers
    • B25J 9/1653: Programme controls characterised by the control loop: parameters identification, estimation, stiffness, accuracy, error analysis
    • B25J 9/1666: Avoiding collision or forbidden zones (motion, path, trajectory planning)
    • B25J 9/1676: Avoiding collision or forbidden zones (safety, monitoring, diagnostic)
    • G05D 1/0214: Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D 1/2424: Means based on the reflection of waves generated by the vehicle for monitoring a plurality of zones
    • G05D 1/617: Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
    • G05D 1/637: Obstacle avoidance using safety zones of adjustable size or shape
    • G05B 2219/39091: Avoid collision with moving obstacles
    • G05B 2219/40476: Collision, planning for collision free path
    • G05D 2109/10: Land vehicles
    • G05D 2111/17: Coherent light, e.g. laser signals

Definitions

  • The present disclosure relates to a functional safety system of a robot and, more particularly, to an apparatus and a method for generating a dynamic safety zone of a mobile robot, where the safety zone is the zone monitored to sense whether an obstacle is present.
  • In the related art, when a dynamic safety zone of a robot is generated, the user inputs and stores a static zone shape for each designated speed; when the robot reaches a corresponding speed, the stored zone is activated and inspected for obstacles, and the robot stops if an obstacle is found. Safety is thus ensured only according to the number and shapes of the zones the user designates per speed.
  • A performance level d (PL-d) LiDAR that uses dynamic safety zones limits the number of safety zones: a budget model may accept up to six zones and a high-end model may accept up to 32 zones.
  • A robot using Mecanum wheels, which must travel in various directions, moves along both the x-axis and the y-axis and can translate while rotating. It is therefore difficult for the user to designate an appropriate safety zone for every speed and direction of such dynamic driving. Further, as the speed is subdivided to increase the density between zones, the number of zones the user must input also increases, and the larger the number of zones, the higher the risk of user error.
  • FIG. 1 is a view for explaining a method of generating a safety zone of the related art.
  • In the related-art safety zone generating method, the user matches and stores each speed with a safety zone; when the corresponding speed condition is satisfied, the robot switches to the stored zone, so the number of safety zones is limited.
  • The higher the density between safety zones, the larger the number of safety zones and the higher the risk of user error.
  • An object to be achieved by the present disclosure is to provide an apparatus and a method for generating a dynamic safety zone of a mobile robot which variably generate the safety zone required for the functional safety of the mobile robot according to the appearance and speed of the mobile robot.
  • a dynamic safety zone generating apparatus of a mobile robot includes an information acquiring unit which acquires a movement direction and a movement speed of a mobile robot; and a safety zone generating unit which dynamically generates a safety zone for the mobile robot based on at least one of shape information of the mobile robot, a movement direction of the mobile robot acquired by the information acquiring unit, and a movement speed of the mobile robot acquired by the information acquiring unit.
  • The safety zone generating unit acquires a future predicted location of the mobile robot based on the movement direction and the movement speed of the mobile robot and dynamically generates a safety zone of the mobile robot based on the shape information of the mobile robot and the future predicted location.
  • The safety zone generating unit acquires the movement speed in the x-axis direction and the movement speed in the y-axis direction based on the movement direction and the movement speed of the mobile robot, acquires the future predicted x-axis location based on the movement speed in the x-axis direction, acquires the future predicted y-axis location based on the movement speed in the y-axis direction, and acquires the future predicted location of the mobile robot based on the future predicted x-axis location and the future predicted y-axis location.
  • the safety zone generating unit acquires a default safety zone based on the shape information of the mobile robot and dynamically generates a safety zone for the mobile robot based on the default safety zone and the future predicted location of the mobile robot.
  • the safety zone generating unit acquires front, rear, left, and right distances with respect to the center point of the mobile robot based on the shape information of the mobile robot and acquires a default safety zone formed of four vertices based on the center point of the robot and the front, rear, left, and right distances.
  • the safety zone generating unit varies the front, rear, left, and right distances according to the default safety zone based on the future predicted location of the mobile robot to dynamically generate the safety zone for the mobile robot.
  • the safety zone generating unit acquires a plurality of sub safety zones by varying the front, rear, left, and right distances according to the default safety zone plural times, based on the current location of the mobile robot and the future predicted location of the mobile robot and dynamically generates the safety zone for the mobile robot based on the plurality of sub safety zones.
  • The safety zone generating unit dynamically generates a safety zone for the mobile robot at a predetermined cycle.
  • The movement direction of the mobile robot is one of a traveling direction according to a straight movement of the mobile robot, a traveling direction according to a rotation of the mobile robot, and a traveling direction when the rotation and the straight movement of the mobile robot occur simultaneously.
  • The straight movement of the mobile robot is one of a straight movement on the x-axis, a straight movement on the y-axis, and a diagonal movement on the x-axis and the y-axis.
  • The rotation of the mobile robot is either a rotational movement to the left side or a rotational movement to the right side.
  • the safety zone generating unit generates the safety zone including a plurality of sub safety areas and speed ranges of a mobile robot required by the sub safety areas are different from each other.
  • the plurality of sub safety areas includes a first sub safety area, a second sub safety area, and a third sub safety area and a first speed range required by the first sub safety area does not overlap a second speed range required by the second sub safety area and is set to be higher than the second speed range.
  • a third speed range required by the third sub safety area is set to be smaller than the second speed range and when the third sub safety area is in contact with a surface of a body of the mobile robot, the third speed range includes a speed value of 0.
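As a minimal sketch, the mapping from the robot's current speed to one of the three sub safety areas with non-overlapping speed ranges could look like the following. The numeric ranges and the area names are illustrative assumptions, not values from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class SubSafetyArea:
    name: str
    min_speed: float  # m/s, inclusive
    max_speed: float  # m/s, exclusive

# Illustrative, non-overlapping speed ranges: the first range is higher
# than the second, the third is lower than the second and includes 0
# because the third sub area is in contact with the robot body.
SUB_AREAS = [
    SubSafetyArea("first", 0.8, 2.0),
    SubSafetyArea("second", 0.3, 0.8),
    SubSafetyArea("third", 0.0, 0.3),
]

def active_sub_areas(speed: float) -> list[str]:
    """Return the names of the sub safety areas whose range covers `speed`."""
    return [a.name for a in SUB_AREAS if a.min_speed <= speed < a.max_speed]
```

Because the ranges do not overlap, any given speed activates at most one sub safety area.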
  • The safety zone generating unit includes a processor and may further include a neural network processor containing a machine learning model trained in advance to improve the processing speed and accuracy of safety zone generation.
  • The processor finally determines the attributes (size, shape, location) of the safety area in consideration of motion information of the mobile robot, environment information acquired from the sensor, or collision prediction information, transmitted from a collision probability prediction model, indicating the possibility of collision between the mobile robot and an obstacle.
  • a dynamic safety zone generating method of a mobile robot includes: acquiring a movement direction and a movement speed of a mobile robot; and dynamically generating a safety zone for the mobile robot based on at least one of shape information of the mobile robot, a movement direction of the mobile robot, and a movement speed of the mobile robot.
  • The dynamically generating of the safety zone includes acquiring a future predicted location of the mobile robot based on the movement direction and the movement speed of the mobile robot and dynamically generating the safety zone of the mobile robot based on the shape information of the mobile robot and the future predicted location.
  • The dynamically generating of the safety zone includes acquiring a default safety zone based on the shape information of the mobile robot and dynamically generating the safety zone for the mobile robot based on the default safety zone and the future predicted location of the mobile robot.
  • An expected location of the mobile robot can be calculated according to its speed and direction from the shape information and direction/speed information of the mobile robot, without the user inputting any safety zone, and a dynamic safety zone is automatically generated. The method can therefore be used for all mobile robots mounted with various wheels, such as a diff wheel or a Mecanum wheel.
  • Safety zones are automatically generated in all directions without the user directly inputting them, so that the safety function for densely spaced zones and omni-directional movement can be performed and user errors caused by manual input may be significantly reduced.
  • FIG. 1 is a view for explaining a method of generating a safety zone of the related art;
  • FIG. 2 is a block diagram for explaining an apparatus for generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure;
  • FIG. 3 is a view for explaining a process of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure;
  • FIG. 4 is a view for explaining an operation of acquiring a default safety zone according to an exemplary embodiment of the present disclosure;
  • FIG. 5 is a view for explaining an operation of acquiring a future predicted location according to an exemplary embodiment of the present disclosure;
  • FIG. 6 is a view for explaining an operation of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure;
  • FIG. 7 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure, illustrating a straight movement on one axis;
  • FIG. 8 is a view for explaining an operation of generating a dynamic safety zone for diagonal movement according to an exemplary embodiment of the present disclosure;
  • FIG. 9 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure, illustrating a diagonal movement on the x-axis and the y-axis;
  • FIG. 10 is a view for explaining an operation of generating a dynamic safety zone for rotational movement according to an exemplary embodiment of the present disclosure;
  • FIG. 11 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure, illustrating a rotational movement;
  • FIG. 12 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure, illustrating a case in which a rotational movement and a straight movement occur simultaneously;
  • FIG. 13 is a flowchart for explaining a method for generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure;
  • FIG. 14 is a block diagram schematically illustrating a mobile robot according to an exemplary embodiment of the present disclosure;
  • FIG. 15 is a block diagram schematically illustrating a control device of a mobile robot according to an exemplary embodiment of the present disclosure;
  • FIGS. 16 and 17 are views for explaining a safety area according to an exemplary embodiment of the present disclosure;
  • FIG. 18 is a block diagram schematically illustrating a control device including a machine learning model according to an exemplary embodiment of the present disclosure; and
  • FIGS. 19 and 20 are exemplary views for explaining an operation of a mobile robot with respect to an obstacle according to an exemplary embodiment of the present disclosure.
  • Terms such as “first” or “second” are used only to distinguish one component from another, so the scope should not be limited by these terms. For example, a first component may be referred to as a second component, and a second component may likewise be referred to as a first component.
  • The terms “have”, “may have”, “include”, or “may include” represent the presence of a characteristic (for example, a numerical value, a function, an operation, or a component such as a part), but do not exclude the presence of additional characteristics.
  • The term “˜unit” refers to a software or hardware component, such as a field programmable gate array (FPGA) or an ASIC, and a “˜unit” performs certain functions.
  • However, “˜unit” is not limited to software or hardware.
  • A “˜unit” may be configured to reside in an addressable storage medium or may be configured to run on one or more processors.
  • A “˜unit” includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, and data structures.
  • Functions provided by the components and “˜units” may be combined into a smaller number of components and “˜units” or further divided into additional components and “˜units”.
  • FIG. 2 is a block diagram for explaining an apparatus for generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure
  • FIG. 3 is a view for explaining a process of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure
  • FIG. 4 is a view for explaining an operation of acquiring a default safety zone according to an exemplary embodiment of the present disclosure
  • FIG. 5 is a view for explaining an operation of acquiring a future predicted location according to an exemplary embodiment of the present disclosure
  • FIG. 6 is a view for explaining an operation of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure.
  • an apparatus 100 for generating a dynamic safety zone of a mobile robot variably generates a safety zone required for the functional safety of the mobile robot according to an appearance and a speed of the mobile robot.
  • the present disclosure is applicable to household cleaning robots, public building cleaning robots, logistics robots, service robots, as well as industrial robots.
  • The dynamic safety zone generating apparatus 100 avoids the related-art method, which limits the number of safety zones and carries a high risk of user error because the user must directly input a safety zone for each speed and direction of the mobile robot. Instead, the apparatus calculates a predicted location of the mobile robot according to its speed and direction from the shape information and direction/speed information of the mobile robot, without any safety zone input by the user, and automatically generates a dynamic safety zone. It can therefore be used for all mobile robots mounted with various wheels, such as a diff wheel or a Mecanum wheel.
  • The safety zone is automatically generated for all directions without the user directly inputting it, so that the safety function covers both dense zone spacing and omni-directional movement, and user errors caused by manual input may be significantly reduced.
  • The dynamic safety zone generating apparatus 100 may be installed in a mobile robot to dynamically generate a safety zone based on information acquired from a sensor mounted in the mobile robot.
  • Alternatively, the dynamic safety zone generating apparatus 100 according to the present disclosure may be installed in a server that remotely manages the mobile robot by wireless communication, dynamically generating the safety zone of the mobile robot based on information provided from the mobile robot and providing information about the generated safety zone to the mobile robot.
  • the dynamic safety zone generating apparatus 100 includes an information acquiring unit 110 and a safety zone generating unit 130 .
  • the information generating unit 110 acquires a movement direction and a movement speed of the mobile robot.
  • the movement direction of the mobile robot refers to one of a traveling direction according to a straight movement of the mobile robot, a traveling direction according to the rotation of the mobile robot, and a traveling direction when the rotational movement and the straight movement of the mobile robot are simultaneously generated.
  • the straight movement of the mobile robot refers to one of a straight movement on the x-axis, a straight movement on the y-axis, and a diagonal movement on the x-axis and the y-axis.
  • the rotation of the mobile robot refers to one of the rotational movement to the left side and the rotational movement to the right side.
  • the safety zone generating unit 130 dynamically generates a safety zone for the mobile robot based on at least one of shape information of the mobile robot, a movement direction of the mobile robot acquired by the information acquiring unit 110 , and a movement speed of the mobile robot acquired by the information acquiring unit 110 .
  • The safety zone generating unit 130 dynamically generates the safety zone for the mobile robot at predetermined cycles.
  • The safety zone generating unit 130 also dynamically generates the safety zone of the mobile robot when at least one of the movement direction and the movement speed of the mobile robot changes.
  • The safety zone generating unit 130 acquires a future predicted location of the mobile robot based on the movement direction and the movement speed of the mobile robot. For example, as illustrated in FIG. 5, the safety zone generating unit 130 may predict a future location of the mobile robot with respect to its current center location, based on the movement direction and the movement speed of the mobile robot.
  • the safety zone generating unit 130 may acquire the movement speed of the x-axis direction and the movement speed of the y-axis direction based on the movement direction and the movement speed of the mobile robot.
  • the safety zone generating unit 130 may acquire a future predicted x-axis location of the mobile robot based on the movement speed of the x-axis direction and a future predicted y-axis location of the mobile robot based on the movement speed of the y-axis direction.
  • the safety zone generating unit 130 acquires the future predicted location of the mobile robot based on the future predicted x-axis location and the future predicted y-axis location.
  • the safety zone generating unit 130 acquires the future predicted location of the mobile robot by means of the following Equation 1.
  • future location_x = speed_x * (response time + braking distance + margin distance)
  • future location_y = speed_y * (response time + braking distance + margin distance)  [Equation 1]
  • future location_x indicates the future predicted x-axis location and future location_y indicates the future predicted y-axis location.
  • speed_x indicates the movement speed of the mobile robot in the x-axis direction and speed_y indicates the movement speed of the mobile robot in the y-axis direction.
  • braking distance indicates a distance required to stop the mobile robot and is set in advance according to the speed of the mobile robot.
  • margin distance indicates a margin distance which is set in advance for the safety of the mobile robot.
  • response time is a response time of a sensor mounted in the mobile robot and is acquired by the following Equation 2.
  • response time = (sensor scanning time * sampling count) + communication delay + margin response time  [Equation 2]
  • sensor scanning time indicates a scanning time of the sensor mounted in the mobile robot.
  • sampling count indicates a number of samplings of the sensor mounted in the mobile robot.
  • communication delay indicates a communication delay time of the sensor mounted in the mobile robot.
  • margin response time indicates a margin response time which is set in advance for exact measurement of the sensor mounted in the mobile robot.
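Equations 1 and 2 above can be combined into a short sketch. Python is used for illustration; the parameter names and the heading convention are assumptions, and Equation 1 is implemented literally as written, with the braking and margin distances summed together with the response time:

```python
import math

def decompose_speed(speed: float, heading_rad: float) -> tuple[float, float]:
    """Split the robot's scalar speed into x-axis and y-axis components
    from its movement direction (heading in radians, an assumed convention)."""
    return speed * math.cos(heading_rad), speed * math.sin(heading_rad)

def response_time(sensor_scan_time: float, sampling_count: int,
                  communication_delay: float, margin_response_time: float) -> float:
    """Equation 2: (sensor scanning time * sampling count)
    + communication delay + margin response time."""
    return sensor_scan_time * sampling_count + communication_delay + margin_response_time

def future_location(speed_x: float, speed_y: float, resp_time: float,
                    braking_distance: float, margin_distance: float) -> tuple[float, float]:
    """Equation 1, read literally: each axis speed is multiplied by the sum
    of the response time, braking distance, and margin distance."""
    factor = resp_time + braking_distance + margin_distance
    return speed_x * factor, speed_y * factor
```

For example, a robot moving straight along the x-axis at 1 m/s with a 0.12 s response time, 0.3 m braking distance, and 0.1 m margin distance would be projected 0.52 units ahead on the x-axis.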
  • the safety zone generating unit 130 may dynamically generate a safety zone for the mobile robot based on the shape information of the mobile robot and the future predicted location of the mobile robot.
  • the shape information of the mobile robot refers to information about the appearance of the robot.
  • the safety zone generating unit 130 may acquire a default safety zone based on the shape information of the mobile robot and dynamically generate a safety zone for the mobile robot based on the default safety zone and the future predicted location of the mobile robot.
  • the safety zone generating unit 130 acquires front, rear, left, and right distances with respect to the center point of the mobile robot based on the shape information of the mobile robot and acquires a default safety zone formed of four vertices based on the center point of the robot and the front, rear, left, and right distances. For example, as illustrated in FIG. 4 , the safety zone generating unit 130 acquires a first distance (a front side distance), a second distance (a rear side distance), a third distance (a left side distance), and a fourth distance (a right side distance) with respect to the center point of the mobile robot and acquires a default safety zone based on the acquired first to fourth distances.
  • speed indicates a movement speed of the mobile robot.
  • response time is a response time of the sensor mounted in the mobile robot and is acquired by Equation 2 above.
  • braking distance indicates a distance required to stop the mobile robot and is set in advance according to the speed of the mobile robot.
  • margin distance indicates a margin distance which is set in advance for the safety of the mobile robot.
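The four-vertex default safety zone described above can be sketched as follows. The axis convention, with the front distance along +x and the left distance along +y, is an assumption not fixed by the text:

```python
def default_safety_zone(cx: float, cy: float, front: float, rear: float,
                        left: float, right: float) -> list[tuple[float, float]]:
    """Return the four vertices of the default safety zone around the
    robot's center point (cx, cy), using the front, rear, left, and right
    distances obtained from the robot's shape information."""
    return [
        (cx + front, cy + left),   # front-left vertex
        (cx + front, cy - right),  # front-right vertex
        (cx - rear,  cy - right),  # rear-right vertex
        (cx - rear,  cy + left),   # rear-left vertex
    ]
```

Varying the four distances per cycle, as described above, then amounts to calling this function with updated distance values.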
  • the safety zone generating unit 130 acquires a plurality of sub safety zones by varying the front, rear, left, and right distances according to the default safety zone plural times, based on the current location of the mobile robot and the future predicted location of the mobile robot and dynamically generates the safety zone for the mobile robot based on the plurality of sub safety zones.
  • The safety zone generating unit 130 divides the path along which the mobile robot moves, with the current location of the mobile robot as the start point and the future predicted location of the mobile robot as the target point, into predetermined distance units, acquires a sub safety zone at each distance unit, and dynamically generates the safety zone for the mobile robot by overlapping the plurality of acquired sub safety zones.
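One way to read the sub-safety-zone step is sketched below: sample a zone every fixed distance along the straight path from the current location to the predicted location, then merge them. Merging into an axis-aligned bounding region is an assumption for illustration; the text only says the zones are overlapped:

```python
import math

def merged_sub_safety_zones(start, target, step, zone_fn):
    """Place a sub safety zone every `step` units along the straight path
    from `start` to `target` (both (x, y) tuples) and merge them into one
    axis-aligned bounding region. `zone_fn(cx, cy)` returns zone vertices."""
    sx, sy = start
    tx, ty = target
    distance = math.hypot(tx - sx, ty - sy)
    n = max(1, int(distance // step))  # number of segments along the path
    vertices = []
    for i in range(n + 1):
        t = i / n  # interpolation parameter from start (0) to target (1)
        vertices.extend(zone_fn(sx + t * (tx - sx), sy + t * (ty - sy)))
    xs = [x for x, _ in vertices]
    ys = [y for _, y in vertices]
    return (min(xs), min(ys)), (max(xs), max(ys))
```

For a diagonal path this naturally yields a zone stretched along the direction of travel, which matches the diagonal-movement behavior described for FIGS. 8 and 9.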
  • FIG. 7 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a straight movement on one axis
  • FIG. 8 is a view for explaining an operation of generating a dynamic safety zone for diagonal movement according to an exemplary embodiment of the present disclosure
  • FIG. 9 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a diagonal movement on an x-axis and a y-axis
  • FIG. 10 is a view for explaining an operation of generating a dynamic safety zone of rotational movement according to an exemplary embodiment of the present disclosure
  • FIG. 11 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a rotational movement
  • FIG. 12 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a case in which a rotational movement and a straight movement simultaneously occur.
  • the faster the speed, the longer the dynamically generated safety zone becomes; the slower the speed, the smaller the safety zone.
  • the safety zone which is dynamically generated should not be smaller than the shape of the mobile robot, that is, the default safety zone.
  • when the mobile robot simultaneously moves along the x-axis and the y-axis, the mobile robot travels diagonally, and the safety zone is dynamically and automatically generated so that the mobile robot does not collide with the obstacle during the diagonal traveling.
  • the shape of the dynamically generated safety zone is changed according to the rotation direction of the mobile robot, so that when the mobile robot rotates to the right side, an upper right end and a lower left end which are likely to collide become longer, and when the mobile robot rotates to the left side, an upper left end and a lower right end may become longer.
  • a safety zone in which a rotational value is added based on a value obtained from the future predicted location of the mobile robot is generated and all the safety zones are automatically determined and changed according to the speed and the direction.
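One way to read the corner behavior above is to extend the two diagonally opposite corners that sweep outward during the turn. The sketch below assumes a positive-yaw-rate-means-left-turn sign convention and an illustrative `gain`; neither is specified in the patent.

```python
def rotation_extended_zone(front, rear, left, right, yaw_rate, gain=0.5):
    """Return the four zone corners (front-left, front-right,
    rear-right, rear-left), pushing out the two diagonal corners
    that sweep outward while rotating. yaw_rate is in rad/s; `gain`
    converts it to an extension distance and is a tuning value."""
    ext = gain * abs(yaw_rate)
    fl, fr = [front, left], [front, -right]
    rr, rl = [-rear, -right], [-rear, left]
    if yaw_rate < 0:     # right turn: front-right and rear-left lengthen
        fr = [front + ext, -(right + ext)]
        rl = [-(rear + ext), left + ext]
    elif yaw_rate > 0:   # left turn: front-left and rear-right lengthen
        fl = [front + ext, left + ext]
        rr = [-(rear + ext), -(right + ext)]
    return [fl, fr, rr, rl]
```

When rotation and straight movement occur simultaneously, the same corner extension would be applied on top of the translation-extended zone.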
  • FIG. 13 is a flowchart for explaining a method for generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure.
  • the dynamic safety zone generating apparatus 100 acquires a movement direction and a movement speed of the mobile robot (S 110 ).
  • the dynamic safety zone generating apparatus 100 dynamically generates the safety zone for the mobile robot based on at least one of the shape information of the mobile robot, the movement direction of the mobile robot, and the movement speed of the mobile robot (S 130 ).
  • the dynamic safety zone generating apparatus 100 dynamically generates the safety zone for the mobile robot in the unit of predetermined cycles.
  • the dynamic safety zone generating apparatus 100 dynamically generates the safety zone of the mobile robot when at least one of the movement direction and the movement speed of the mobile robot changes.
  • the dynamic safety zone generating apparatus 100 acquires a future predicted location of the mobile robot based on the movement direction and the movement speed of the mobile robot. That is, the dynamic safety zone generating apparatus 100 acquires the movement speed of the x-axis direction and the movement speed of the y-axis direction based on the movement direction and the movement speed of the mobile robot, acquires the future predicted x-axis location of the mobile robot based on the movement speed of the x-axis direction and acquires the future predicted y-axis location of the mobile robot based on the movement speed of the y-axis direction, and acquires a future predicted location of the mobile robot based on the future predicted x-axis location and the future predicted y-axis location.
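The decomposition described in this step can be sketched as follows. The heading-angle and prediction-horizon parameterization is an assumption, since the patent works from the movement direction and speed without fixing units.

```python
import math

def predicted_location(x, y, heading, speed, horizon):
    """Future predicted location after `horizon` seconds.

    heading : movement direction in radians (0 = +x axis)
    speed   : movement speed in m/s
    """
    vx = speed * math.cos(heading)   # movement speed of the x-axis direction
    vy = speed * math.sin(heading)   # movement speed of the y-axis direction
    # future predicted x-axis and y-axis locations
    return (x + vx * horizon, y + vy * horizon)
```

A robot at the origin heading along +x at 1 m/s is predicted 2 m ahead after a 2 s horizon; a pure +y heading moves the prediction along y only.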
  • the dynamic safety zone generating apparatus 100 may dynamically generate a safety zone for the mobile robot based on the shape information of the mobile robot and the future predicted location of the mobile robot.
  • the dynamic safety zone generating apparatus 100 may acquire a default safety zone based on the shape information of the mobile robot and dynamically generate a safety zone for the mobile robot based on the default safety zone and the future predicted location of the mobile robot. At this time, the dynamic safety zone generating apparatus 100 acquires front, rear, left, and right distances with respect to the center point of the mobile robot based on the shape information of the mobile robot and acquires a default safety zone formed of four vertices based on the center point of the robot and the front, rear, left, and right distances.
  • the dynamic safety zone generating apparatus 100 varies the front, rear, left, and right distances according to the default safety zone based on the future predicted location of the mobile robot to dynamically generate the safety zone for the mobile robot.
  • the dynamic safety zone generating apparatus 100 acquires a plurality of sub safety zones by varying the front, rear, left, and right distances according to the default safety zone plural times, based on the current location of the mobile robot and the future predicted location of the mobile robot and dynamically generates the safety zone for the mobile robot based on the plurality of sub safety zones.
  • FIG. 14 is a block diagram schematically illustrating a mobile robot according to an exemplary embodiment of the present disclosure.
  • a mobile robot 10 includes an environment sensing device 20 , a power device 30 , a control device 40 , and a driving device 50 .
  • the mobile robot 10 of FIG. 14 is an example so that all blocks illustrated in FIG. 14 are not essential components and in the other exemplary embodiment, some blocks included in the mobile robot 10 may be added, modified, or omitted.
  • the mobile robot 10 may be a household cleaning robot, a public building cleaning robot, a logistics robot, a service robot, or an industrial robot.
  • the environment sensing device 20 refers to a device which senses motion information, surrounding obstacle information, and floor state information for the mobile robot 10 .
  • the environment sensing device 20 includes a plurality of sensors and includes various sensors, such as a LiDAR sensor, a radar sensor, an image sensor, or an IR sensor.
  • the environment sensing device 20 transmits information sensed by the plurality of sensors to the control device 40 .
  • the power device 30 stores and supplies a power for an operation of the mobile robot 10 .
  • the power device 30 applies a power while interworking with various configurations required to be applied with the power in the mobile robot 10 .
  • the power device 30 may be implemented as a battery, but is not limited thereto.
  • the control device 40 performs an operation of controlling an overall operation of the mobile robot 10 .
  • the control device 40 controls a safety area for preventing collision of the mobile robot 10 .
  • the operation of the control device 40 to generate and control the safety area will be described with reference to FIGS. 15 to 18 .
  • the safety area is a concept which includes the safety zone; hereinafter, the term safety zone is used interchangeably with the safety area.
  • the control device 40 performs an operation corresponding to all or a part of operations performed by the dynamic safety zone generating device 100 .
  • control device 40 controls the driving of the mobile robot 10 .
  • the control device 40 generates an operation control signal based on the safety area and transmits the generated operation control signal to at least one motor included in the driving device 50 to control a driving force of the motor, thereby controlling the operation of the mobile robot 10 .
  • the driving device 50 refers to a device including at least one motor equipped in the mobile robot 10 .
  • the driving device 50 may include various types of motors related to the operation of the mobile robot 10 .
  • the driving device 50 may include a movement motor 56 , but is not necessarily limited thereto and may further include various motors according to the type of the mobile robot 10 .
  • the movement motor 56 is a motor for rotating main wheels (not illustrated) of the mobile robot 10 and is connected to the main wheels (not illustrated) and generates a driving force to rotate the main wheels (not illustrated).
  • the movement motor 56 rotates the main wheels (not illustrated) to move the mobile robot 10 along the movement path set by the control device 40 of the mobile robot 10 .
  • the movement motor 56 adaptively adjusts a driving force to rotate the main wheels so as to correspond to a movement speed or a size of the safety area based on the operation control signal received from the control device 40 .
  • FIG. 15 is a block diagram schematically illustrating a control device of a mobile robot according to an exemplary embodiment of the present disclosure.
  • the control device 40 of the mobile robot 10 includes an I/O interface 41 , a communication module 42 , a processor 43 , a memory 44 , and a database 45 .
  • the control device 40 of FIG. 15 is an example so that all blocks illustrated in FIG. 15 are not essential components and in the other exemplary embodiment, some blocks included in the control device 40 may be added, modified, or omitted.
  • the control device 40 may be implemented by a computing device and each component included in the control device 40 may be implemented by a separate software device or a separate hardware device in which the software is combined.
  • the communication module 42 refers to a means which receives or transmits a signal or data.
  • the communication module 42 interworks with the processor 43 to input various types of signals or data or directly acquires data by interworking with a device in the mobile robot or an external device to transmit the signal or data to the processor 43 .
  • the communication module 42 performs an operation corresponding to all or part of the operation performed by the information acquiring unit 110 .
  • the communication module 42 transmits the signal or data generated in the processor 43 to the device in the mobile robot 10 or an external device (for example, a server).
  • the communication module 42 may be connected to the I/O interface 41 .
  • the I/O interface 41 transmits information acquired from the communication module 42 to the processor 43 or receives a control signal from the processor 43 to substantially convert the information or the control signal into a signal for controlling the communication module 42 .
  • the processor 43 performs an operation of generating and controlling a safety area 2000 for preventing the collision with the obstacle present around the mobile robot 10 .
  • the processor 43 performs an operation corresponding to all or a part of an operation performed by the dynamic safety zone generating unit 130 .
  • the memory 44 includes at least one instruction or program which is executable by the processor 43 .
  • the memory 44 includes instructions or programs for controlling the mobile robot 10 .
  • the database 45 refers to a general data structure implemented in a storage space (a hard disk or a memory) of a computer system using a database management system (DBMS) and means a data storage format in which data can be freely searched (extracted), deleted, edited, or added.
  • the database 45 may be implemented according to the object of the exemplary embodiment of the present disclosure using a relational database management system (RDBMS) such as Oracle, Informix, Sybase, or DB2, an object oriented database management system (OODBMS) such as Gemston, Orion, or O2, or an XML native database such as Excelon, Tamino, or Sekaiju, and has appropriate fields or elements to achieve its own function.
  • the database 45 may be implemented as a cloud or a virtual memory.
  • the database 45 stores and provides information about control of the mobile robot 10 and information about the safety area.
  • the database 45 is implemented in the control device 40 , but is not necessarily limited thereto and may be implemented as a separate data storage device.
  • FIGS. 16 and 17 are views for explaining a safety area according to an exemplary embodiment of the present disclosure.
  • the processor 43 generates a safety area 2000 for preventing the collision with the obstacle present around the mobile robot 10 .
  • the safety area 2000 generated around the mobile robot 10 includes a plurality of sub safety areas 2010 , 2020 , 2030 .
  • a speed range required by each sub safety area is different and is defined so as not to overlap.
  • the sub safety areas are distinguished according to a range of deceleration speed required for every sub safety area.
  • a deceleration range required by a first sub safety area 2010 , that is, a required speed range, is set so as not to overlap a deceleration range required by a second sub safety area 2020 .
  • a third sub safety area 2030 corresponds to an area in which a collision is imminent, and a speed range required by the third sub safety area is set so as not to overlap the deceleration range required by the second sub safety area 2020 .
  • the first speed range required by the first sub safety area 2010 does not overlap the second speed range required by the second sub safety area 2020 , but is set to be higher than the second speed range.
  • the third speed range required by the third sub safety area 2030 is set to be lower than the second speed range. If the third sub safety area is in contact with a surface of the body of the mobile robot 10 , the third speed range substantially includes 0.
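The non-overlapping speed ranges described above can be sketched as a simple lookup from obstacle distance to the speed ceiling of the sub safety area it falls in. All radii and speed values below are illustrative; the patent does not give concrete numbers.

```python
def sub_area_speed_limit(obstacle_distance,
                         radii=(0.2, 0.6, 1.2),
                         limits=(0.0, 0.4, 0.8)):
    """Speed ceiling imposed by the sub safety area an obstacle falls
    in: third (innermost, collision imminent) -> stop, second -> slow,
    first (outermost) -> mild deceleration. The ranges do not overlap.
    `radii` are the outer bounds of the third, second, and first areas."""
    for outer_radius, limit in zip(radii, limits):
        if obstacle_distance <= outer_radius:
            return limit
    return None  # obstacle outside every sub safety area: no restriction
```

Because each area maps to a distinct, non-overlapping speed range, the robot's required deceleration is unambiguous for any sensed obstacle distance.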
  • the processor 43 performs the processing for visualizing real-time information related to the above-described safety area.
  • the visualized information may be provided to the user through a screen of a user terminal or a screen of the mobile robot 10 .
  • the processor 43 includes a model which generates a safety area including a plurality of sub safety areas 2010 , 2020 , and 2030 and generates a control variable for setting or managing an attribute (for example, a size, a shape, or a direction) of safety areas according to circumstances.
  • the processor 43 receives the control variable from the safety area generating model to process the visualization for the safety area.
  • the safety area generating model may be implemented to directly process the visualization for the safety area.
  • the safety area may include a fixed safety area in which the attributes (for example, a size, a shape, and a direction) of the safety areas are not changed and a variable safety area in which at least one of the attributes (for example, a size, a shape, and a direction) of the safety areas is changed.
  • the variable safety area may be generated separately from the fixed safety area, or may vary the fixed safety area for a predetermined time (when an obstacle is sensed on the traveling path or there is a collision possibility).
  • the variable safety area is desirably a safety area which is generated in response to a dynamic obstacle or an unidentified static obstacle.
  • each of the fixed safety area and the variable safety area may further include sub safety areas.
  • the processor 43 finally determines the attribute (size, shape, or location) of the variable safety area in consideration of motion information of the mobile robot or environment information acquired from the sensor.
  • the motion information may be a movement speed or a movement direction of the mobile robot and the environment information may be neighbor sensing information or obstacle detection information.
  • the processor 43 generates the variable safety area independently from the fixed safety area to determine an attribute (for example, size, shape, or direction) of the variable safety area in consideration of the motion information of the mobile robot and the environment information.
  • the variable safety area which is generated in consideration of a distance to the obstacle may further include detailed variable sub safety areas.
  • the fixed safety area and the variable safety area may be generated to have different attributes when an unidentified static obstacle or a dynamic obstacle 4000 approaching the mobile robot is sensed. That is, the fixed safety area and the variable safety area 2040 are substantially different, so that there are an overlapped region in which the two areas overlap and a non-overlapped region 2050 which belongs to only one safety area.
  • FIG. 18 is a block diagram schematically illustrating a control device including a machine learning model according to an exemplary embodiment of the present disclosure.
  • the control device 40 of FIG. 18 is an example in which a neural network processor 46 is added to the control device 40 of FIG. 15 . Therefore, a redundant description with the control device 40 of FIG. 15 will be omitted.
  • the memory 44 includes at least one instruction or program which is executable by the processor 43 .
  • the memory 44 includes instructions or programs for controlling the mobile robot 10 . Further, the memory 44 includes an instruction or a program for an operation of preprocessing a neural network learning result and an input value or an output value of the neural network.
  • the control device 40 of the exemplary embodiment further includes a neural network processor 46 including a machine learning model 47 which is trained in advance to improve a processing speed and an accuracy related to the safety area.
  • the processor 43 may interwork with the neural network processor 46 to perform an operation of controlling a safety area.
  • although the processor 43 and the neural network processor 46 are described as different modules, the present invention is not necessarily limited thereto, and the processor and the neural network processor may be combined into one module to perform the individual operations.
  • the neural network processor 46 performs an operation of predicting a collision probability based on artificial intelligence (AI) or managing a safety area.
  • the neural network processor 46 includes an input node, an intermediate node, and an output node and has a structure specified by a determination weight in which training is completed in advance by training data as a connection weight which connects the nodes.
  • An output value of the neural network processor 46 may be a coordinate value of an extended area or a coordinate value of a unit block area and may be implemented as a feature value matrix for the extended area or the unit block area.
  • the machine learning model 47 includes a collision probability prediction model 48 which predicts the collision with an obstacle and a safety area management model 49 which manages the safety area.
  • the collision probability prediction model 48 is a model which generates collision prediction information related to the collision with obstacles on a traveling path which is currently predicted.
  • the collision probability prediction model is a machine learning model which is trained by various training data and may be implemented as software or hardware.
  • the collision probability prediction model 48 takes a sensor value as an input and outputs information related to the collision.
  • a structure of the neural network for the collision probability prediction model 48 is not specifically limited.
  • the neural network may be implemented as a convolutional neural network with a multi-layered neural network structure or as a recurrent neural network. Further, it may be implemented as hardware using an artificial intelligence accelerator.
  • the safety area management model 49 inputs information about the safety area visualized by the processor 43 to a previously trained neural network to output sub area attribute values for every sub area belonging to the visualized area.
  • the sub area attribute value includes information indicating the safety area or the sub safety area to which micro areas, defined for every pixel or for every macro block configured by a plurality of pixels, belong.
  • the safety area management model 49 desirably has a convolutional neural network structure, but is not necessarily limited thereto. Further, it may be implemented as hardware using a graphic accelerator model.
  • the processor 43 finally determines an attribute (size, shape, location) of the variable safety area in consideration of motion information of the mobile robot, environment information acquired from the sensor, or collision prediction information for collision possibility between the mobile robot and the obstacle transmitted from the collision probability prediction model 48 .
  • the motion information may be a movement speed or a movement direction of the mobile robot and the environment information may be neighbor sensing information or obstacle detection information.
  • the collision prediction information includes a collision probability, a remaining time to collide, or a collision prediction location when the mobile robot moves along the predicted movement path.
  • the collision prediction point may be, for example, a body left, a body right, or a front of the mobile robot.
  • the processor 43 generates the variable safety area independently from the fixed safety area to determine an attribute (for example, size, shape, or direction) of the variable safety area in consideration of the motion information of the mobile robot, the environment information, and the collision prediction information.
  • the variable safety area which is generated in consideration of a distance to the obstacle may further include detailed variable sub safety areas.
  • the fixed safety area and the variable safety area may be generated to have different attributes when an unidentified static obstacle or a dynamic obstacle 4000 approaching the mobile robot is sensed. That is, the fixed safety area 2010 and the variable safety area 2040 are substantially different, so that there are an overlapped region in which the two areas overlap and a non-overlapped region 2050 which belongs to only one safety area.
  • the non-overlapped region 2050 is generated by an obstacle 4000 which is suddenly sensed or which is nearby.
  • the non-overlapped region 2050 is an area located at the outside of the “variable safety area”.
  • the safety area management model 49 transmits a sub area attribute value for the non-overlapped region 2050 to the processor 43 .
  • the processor 43 calculates the traveling path again using the sub area attribute value.
  • the processor 43 allows the mobile robot to travel along the calculated traveling path.
  • the processor 43 receives the collision prediction information from the collision probability prediction model 48 and generates a variable safety area 2040 in which the safety area varies according to the location in which the sensed obstacle is located.
  • the processor 43 extends the safety area toward the direction in which the dynamic obstacle is located to generate an extended variable safety area 2040 .
  • the mobile robot 10 includes a variable safety area having a shape with a curvature or an arc shape.
  • as the variable safety area is applied, the mobile robot 10 recovers the traffic line of the mobile robot 10 which is wasted in a polygonal safety area with respect to the static obstacle, and covers an area missed by the polygonal safety area with respect to the static obstacle, thereby supplementing the safety area.
  • the mobile robot 10 may ensure a safety area based moving path which minimizes a space loss.
  • FIGS. 19 and 20 are exemplary views for explaining an operation of a mobile robot according to an obstacle according to an exemplary embodiment of the present disclosure.
  • FIG. 19 illustrates a situation in which when the mobile robot 10 moves from a start point A to an arrival point B, there are static obstacles C and D
  • FIG. 20 illustrates a situation in which when the mobile robot 10 moves from a start point A to an arrival point B, there are static obstacles C and D and a dynamic obstacle or an unidentified static obstacle 4000 .
  • the mobile robot 10 extends the sub safety areas 2010 , 2020 , and 2030 to cover between the static obstacles C and D and moves while reducing a movement speed in accordance with the extension.
  • the sub safety areas 2010 , 2020 , and 2030 are formed as variable safety areas having a shape with a curvature or an arc shape.
  • the traffic line of the mobile robot which is wasted in the polygonal safety area with respect to the static obstacle is supplemented and the area which is missed in the vicinity of the static obstacle is covered to supplement the safety area.
  • when the dynamic obstacle or the unidentified static obstacle 4000 which is in contact with the first sub safety area 2010 is sensed, the mobile robot 10 reduces the movement speed based on a predetermined deceleration range for the first sub safety area 2010 and modifies a traveling path of the mobile robot 10 .
  • the modified traveling path may be set to a shortest distance to reach the arrival point B while avoiding the dynamic obstacle or the unidentified static obstacle 4000 .
  • when the mobile robot 10 avoids the dynamic obstacle or the unidentified obstacle 4000 , the mobile robot 10 moves while maintaining a predetermined distance to the obstacle 4000 .
  • the present invention is not limited to the exemplary embodiment.
  • one or more components may be selectively combined to be operated within a scope of the present invention.
  • all components may be implemented as independent pieces of hardware, but a part or all of the components may be selectively combined and implemented as a computer program which includes a program module performing a part or all of the functions combined in one or a plurality of pieces of hardware.
  • such a computer program may be stored in a computer readable medium such as a USB memory, a CD disk, or a flash memory to be read and executed by a computer to implement the exemplary embodiment of the present invention.
  • the recording media of the computer program may include a magnetic recording medium or an optical recording medium.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

A functional safety system of a robot according to an exemplary embodiment of the present disclosure can generate a safety zone which is a zone to sense whether an obstacle is present.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a Continuation-in-part of pending PCT International Application No. PCT/KR2021/016526 filed on Nov. 12, 2021, which claims priority to Korean Patent Application No. 10-2020-0151001 filed on Nov. 12, 2020, Korean Patent Application No. 10-2021-0153766 filed on Nov. 10, 2021, Korean Patent Application No. 10-2021-0153767 filed on Nov. 10, 2021, and Korean Patent Application No. 10-2021-0153768 filed on Nov. 10, 2021, in the Korean Intellectual Property Office, the entire contents of which are hereby incorporated by reference in their entirety.
BACKGROUND Field
The present disclosure relates to a functional safety system of a robot, and more particularly, to an apparatus and a method for generating a dynamic safety zone of a mobile robot which generate a safety zone which is a zone to sense whether there is an obstacle.
Description of the Related Art
When a dynamic safety zone of a robot is generated in the related art, the user inputs and stores a shape of a static zone for every speed designated by the user; then, when the robot reaches the corresponding speed, the zone is changed to the zone stored by the user to inspect whether there is an obstacle in the corresponding zone and to stop. By doing this, safety is ensured according to the number and shapes of the zones designated by the user for each speed.
As the traveling direction of the robot is diversified, the number of zones according to the speed needs to be increased in proportion to the number of traveling directions, so that a method in which the user directly inputs the safety zone has limitations according to the number of zones. A performance level d (PL-d) LiDAR (light detection and ranging) sensor which uses a dynamic safety zone limits the number of safety zones: a budget model may input up to six zones and a high-end model may input up to 32 zones.
As one example, a robot using mecanum wheels, which requires various directions, travels along both the x-axis and the y-axis and travels while rotating. Therefore, it is difficult for the user to designate an appropriate safety zone according to the speed and the direction for such dynamic driving. Further, as the speed is subdivided to increase the density between zones, the number of zones which need to be input by the user is also increased, so that the larger the number of zones, the higher the risk due to user fault.
FIG. 1 is a view for explaining a method of generating a safety zone of the related art.
That is, as illustrated in FIG. 1 , according to the safety zone generating method of the related art, in a state in which the user matches and stores the speed and the safety zone, the safety zone is changed when the corresponding condition is satisfied, so that the number of safety zones is limited. The higher the density between the safety zones, the larger the number of safety zones and the higher the risk of user fault.
SUMMARY
An object to be achieved by the present disclosure is to provide an apparatus and a method for generating a dynamic safety zone of a mobile robot which variably generate a safety zone required for a functional safety of a mobile robot according to an appearance and a speed of the mobile robot.
Other and further objects of the present disclosure which are not specifically described can be further considered within the scope easily deduced from the following detailed description and the effect.
In order to achieve the above-described objects, according to an aspect of the present disclosure, a dynamic safety zone generating apparatus of a mobile robot includes an information acquiring unit which acquires a movement direction and a movement speed of a mobile robot; and a safety zone generating unit which dynamically generates a safety zone for the mobile robot based on at least one of shape information of the mobile robot, a movement direction of the mobile robot acquired by the information acquiring unit, and a movement speed of the mobile robot acquired by the information acquiring unit.
Here, the safety zone generating unit acquires a future predicted location of the mobile robot based on the movement direction and the movement speed of the mobile robot and dynamically generates a safety zone of the mobile robot based on the shape information of the mobile robot and the future predicted location of the mobile robot.
Here, the safety zone generating unit acquires the movement speed of the x-axis direction and the movement speed of the y-axis direction based on the movement direction and the movement speed of the mobile robot, acquires the future predicted x-axis location of the mobile robot based on the movement speed of the x-axis direction, acquires the future predicted y-axis location of the mobile robot based on the movement speed of the y-axis direction, and acquires a future predicted location of the mobile robot based on the future predicted x-axis location and the future predicted y-axis location.
Here, the safety zone generating unit acquires a default safety zone based on the shape information of the mobile robot and dynamically generates a safety zone for the mobile robot based on the default safety zone and the future predicted location of the mobile robot.
Here, the safety zone generating unit acquires front, rear, left, and right distances with respect to the center point of the mobile robot based on the shape information of the mobile robot and acquires a default safety zone formed of four vertices based on the center point of the robot and the front, rear, left, and right distances.
Here, the safety zone generating unit varies the front, rear, left, and right distances according to the default safety zone based on the future predicted location of the mobile robot to dynamically generate the safety zone for the mobile robot.
Here, the safety zone generating unit acquires a plurality of sub safety zones by varying the front, rear, left, and right distances according to the default safety zone plural times, based on the current location of the mobile robot and the future predicted location of the mobile robot and dynamically generates the safety zone for the mobile robot based on the plurality of sub safety zones.
Here, the safety zone generating unit dynamically generates a safety zone for the mobile robot in the unit of predetermined cycle.
Here, the movement direction of the mobile robot is one of a traveling direction according to a straight movement of the mobile robot, a traveling direction according to the rotation of the mobile robot, and a traveling direction when the rotation and the straight movement of the mobile robot are simultaneously generated, the straight movement of the mobile robot is one of a straight movement on the x-axis, a straight movement on the y-axis, and a diagonal movement on the x-axis and the y-axis, and the rotation of the mobile robot is one of the rotational movement to the left side and the rotational movement to the right side.
Here, the safety zone generating unit generates the safety zone including a plurality of sub safety areas and speed ranges of a mobile robot required by the sub safety areas are different from each other.
Here, the plurality of sub safety areas includes a first sub safety area, a second sub safety area, and a third sub safety area and a first speed range required by the first sub safety area does not overlap a second speed range required by the second sub safety area and is set to be higher than the second speed range.
Here, a third speed range required by the third sub safety area is set to be smaller than the second speed range and when the third sub safety area is in contact with a surface of a body of the mobile robot, the third speed range includes a speed value of 0.
Here, the safety zone generating unit includes a processor and further includes a neural network processor including a machine learning model which is trained in advance to improve a processing speed and an accuracy related to the safety zone.
Here, the processor finally determines an attribute (size, shape, location) of the safety area in consideration of motion information of the mobile robot, environment information acquired from the sensor, or collision prediction information for collision possibility between the mobile robot and the obstacle transmitted from the collision probability prediction model.
In order to achieve the above-described objects, according to an aspect of the present disclosure, a dynamic safety zone generating method of a mobile robot includes: acquiring a movement direction and a movement speed of a mobile robot; and dynamically generating a safety zone for the mobile robot based on at least one of shape information of the mobile robot, a movement direction of the mobile robot, and a movement speed of the mobile robot.
Here, the dynamically generating of a safety zone is configured by acquiring a future predicted position of the mobile robot based on the movement direction and the movement speed of the mobile robot and dynamically generating a safety zone of the mobile robot based on the shape information of the mobile robot and the future predicted location of the mobile robot.
Here, the dynamically generating of a safety zone is configured by acquiring a default safety zone based on the shape information of the mobile robot and dynamically generating a safety zone for the mobile robot based on the default safety zone and the future predicted location of the mobile robot.
According to the apparatus and the method for generating a dynamic safety zone of a mobile robot according to the exemplary embodiment of the present disclosure, an expected location of the mobile robot can be calculated according to its speed and direction from the shape information and the direction/speed information of the mobile robot, without the user inputting the safety zone, and a dynamic safety zone is automatically generated, so that the apparatus and method can be used for all mobile robots mounted with various wheels, such as a diff wheel or a Mecanum wheel.
Further, the safety zones are automatically generated in all directions without the user directly inputting the safety zone, so that the safety function can be performed for the density between the safety zones and for all directions, and a user fault caused by the user input may be significantly reduced.
The effects of the present invention are not limited to the technical effects mentioned above, and other effects which are not mentioned can be clearly understood by those skilled in the art from the following description.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a view for explaining a method of generating a safety zone of the related art;
FIG. 2 is a block diagram for explaining an apparatus of generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure;
FIG. 3 is a view for explaining a process of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure;
FIG. 4 is a view for explaining an operation of acquiring a default safety zone according to an exemplary embodiment of the present disclosure;
FIG. 5 is a view for explaining an operation of acquiring a future predicted location according to an exemplary embodiment of the present disclosure;
FIG. 6 is a view for explaining an operation of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure;
FIG. 7 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a straight movement on one axis;
FIG. 8 is a view for explaining an operation of generating a dynamic safety zone for diagonal movement according to an exemplary embodiment of the present disclosure;
FIG. 9 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a diagonal movement on an x-axis and a y-axis;
FIG. 10 is a view for explaining an operation of generating a dynamic safety zone of rotational movement according to an exemplary embodiment of the present disclosure;
FIG. 11 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a rotational movement;
FIG. 12 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a case in which a rotational movement and a straight movement simultaneously occur;
FIG. 13 is a flowchart for explaining a method for generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure;
FIG. 14 is a block diagram schematically illustrating a mobile robot according to an exemplary embodiment of the present disclosure;
FIG. 15 is a block diagram schematically illustrating a control device of a mobile robot according to an exemplary embodiment of the present disclosure;
FIGS. 16 and 17 are views for explaining a safety area according to an exemplary embodiment of the present disclosure;
FIG. 18 is a block diagram schematically illustrating a control device including a machine learning model according to an exemplary embodiment of the present disclosure; and
FIGS. 19 and 20 are exemplary views for explaining an operation of a mobile robot according to an obstacle according to an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE EMBODIMENT
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. Advantages and features of the present disclosure, and methods for accomplishing the same will be more clearly understood from exemplary embodiments described below with reference to the accompanying drawings. However, the present invention is not limited to exemplary embodiments disclosed herein but will be implemented in various different forms. The exemplary embodiments are provided by way of example only so that a person of ordinary skill in the art can fully understand the disclosures of the present invention and the scope of the present invention. Therefore, the present invention will be defined only by the scope of the appended claims. Like reference numerals generally denote like elements throughout the specification.
Unless otherwise defined, all terms (including technical and scientific terms) used in the present specification may be used as the meaning which may be commonly understood by the person with ordinary skill in the art, to which the present invention belongs. It will be further understood that terms defined in commonly used dictionaries should not be interpreted in an idealized or excessive sense unless expressly and specifically defined.
In the specification, the terms “first” or “second” are used to distinguish one component from the other component so that the scope should not be limited by these terms. For example, a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component.
In the present specification, in each step, numerical symbols (for example, a, b, and c) are used for the convenience of description, but do not explain the order of the steps so that unless the context apparently indicates a specific order, the order may be different from the order described in the specification. That is, the steps may be performed in the order as described or simultaneously, or an opposite order.
In this specification, the terms “have”, “may have”, “include”, or “may include” represent the presence of the characteristic (for example, a numerical value, a function, an operation, or a component such as a part), but do not exclude the presence of additional characteristics.
The term “˜unit” used in the specification refers to a software or hardware component such as a field programmable gate array (FPGA) or an ASIC and “˜unit” performs some functions. However, “˜unit” is not limited to the software or the hardware. “˜unit” may be configured to be in an addressable storage medium or may be configured to reproduce one or more processors. Accordingly, as an example, “˜unit” includes components such as software components, object oriented software components, class components, and task components, processes, functions, attributes, procedures, subroutines, segments of a program code, drivers, a firmware, a microcode, a circuit, data, database, and data structures. A function which is provided in the components and “˜units” may be combined with a smaller number of components and “˜units” or divided into additional components and “˜units”.
Hereinafter, a functional safety system of a robot according to the present disclosure will be described in detail with reference to the accompanying drawing.
First, an apparatus for generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure will be described with reference to FIGS. 2 to 6 .
FIG. 2 is a block diagram for explaining an apparatus of generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure, FIG. 3 is a view for explaining a process of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure, FIG. 4 is a view for explaining an operation of acquiring a default safety zone according to an exemplary embodiment of the present disclosure, FIG. 5 is a view for explaining an operation of acquiring a future predicted location according to an exemplary embodiment of the present disclosure, and FIG. 6 is a view for explaining an operation of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure.
Referring to FIG. 2 , an apparatus 100 for generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure (hereinafter, referred to as a “dynamic safety zone generating apparatus”) variably generates a safety zone required for the functional safety of the mobile robot according to an appearance and a speed of the mobile robot.
Here, the present disclosure is applicable to household cleaning robots, public building cleaning robots, logistics robots, service robots, as well as industrial robots.
That is, the dynamic safety zone generating apparatus 100 according to the present disclosure is free from the method of the related art, in which the user directly inputs a safety zone according to the speed and the direction of the mobile robot, a method which limits the number of safety zones and carries a high risk of a user fault. Instead, the apparatus calculates a predicted location of the mobile robot according to its speed and direction from the shape information and the direction/speed information of the mobile robot, without the user inputting a safety zone, and automatically generates a dynamic safety zone, so that it can be used for all mobile robots mounted with various wheels such as a diff wheel and a Mecanum wheel. Accordingly, unlike the safety zone generating method of the related art, in which a safety zone set by the user is not selectively changed, according to the present disclosure, the safety zone is automatically generated for all directions without the user directly inputting the safety zone, so that the safety function can be performed for the density between the safety zones and for all directions, and a user fault caused by the user input may be significantly reduced.
In the meantime, the dynamic safety zone generating apparatus 100 according to the present disclosure may be loaded in a mobile robot to dynamically generate a safety zone based on information acquired from a sensor mounted in the mobile robot. Alternatively, the dynamic safety zone generating apparatus 100 according to the present disclosure may be loaded in a server which remotely manages the mobile robot by wireless communication, to dynamically generate the safety zone of the mobile robot based on information provided from the mobile robot and to provide information about the generated safety zone to the mobile robot.
To this end, the dynamic safety zone generating apparatus 100 includes an information acquiring unit 110 and a safety zone generating unit 130.
The information acquiring unit 110 acquires a movement direction and a movement speed of the mobile robot.
Here, the movement direction of the mobile robot refers to one of a traveling direction according to a straight movement of the mobile robot, a traveling direction according to the rotation of the mobile robot, and a traveling direction when the rotational movement and the straight movement of the mobile robot are simultaneously generated.
The straight movement of the mobile robot refers to one of a straight movement on the x-axis, a straight movement on the y-axis, and a diagonal movement on the x-axis and the y-axis.
The rotation of the mobile robot refers to one of the rotational movement to the left side and the rotational movement to the right side.
The safety zone generating unit 130 dynamically generates a safety zone for the mobile robot based on at least one of shape information of the mobile robot, a movement direction of the mobile robot acquired by the information acquiring unit 110, and a movement speed of the mobile robot acquired by the information acquiring unit 110.
At this time, the safety zone generating unit 130 dynamically generates the safety zone for the mobile robot in the unit of a predetermined cycle. Alternatively, the safety zone generating unit 130 dynamically generates the safety zone of the mobile robot when at least one of the movement direction and the movement speed of the mobile robot changes.
To be more specific, the safety zone generating unit 130 acquires a future predicted location of the mobile robot based on the movement direction and the movement speed of the mobile robot. For example, as illustrated in FIG. 5 , the safety zone generating unit 130 may acquire a future location of the mobile robot to be predicted with respect to a current center location of the mobile robot, based on the movement direction and the movement speed of the mobile robot.
That is, the safety zone generating unit 130 may acquire the movement speed of the x-axis direction and the movement speed of the y-axis direction based on the movement direction and the movement speed of the mobile robot. The safety zone generating unit 130 may acquire a future predicted x-axis location of the mobile robot based on the movement speed of the x-axis direction and a future predicted y-axis location of the mobile robot based on the movement speed of the y-axis direction. The safety zone generating unit 130 acquires the future predicted location of the mobile robot based on the future predicted x-axis location and the future predicted y-axis location.
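The decomposition described above may be sketched as follows. This is an illustrative, non-limiting example; the function name `axis_speeds` and the convention that the movement direction is given as an angle in radians from the x-axis are assumptions for illustration and are not prescribed by the present disclosure.

```python
import math

def axis_speeds(speed, direction_rad):
    """Decompose a scalar movement speed into x-axis and y-axis components.

    Assumes the movement direction is an angle in radians measured from
    the x-axis (an illustrative convention only).
    """
    speed_x = speed * math.cos(direction_rad)
    speed_y = speed * math.sin(direction_rad)
    return speed_x, speed_y
```

For a robot moving straight along the y-axis at 2 m/s, `axis_speeds(2.0, math.pi / 2)` yields an x-axis speed near zero and a y-axis speed of 2.0.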
For example, the safety zone generating unit 130 acquires the future predicted location of the mobile robot by means of the following Equation 1.
future location·x = speed·x * (response time + braking distance + margin distance)
future location·y = speed·y * (response time + braking distance + margin distance)  [Equation 1]
Here, future location·x indicates the future predicted x-axis location and future location·y indicates the future predicted y-axis location. speed·x indicates a movement speed of the x-axis direction of the mobile robot and speed·y indicates a movement speed of the y-axis direction of the mobile robot. braking distance indicates a distance required to stop the mobile robot and is set in advance according to the speed of the mobile robot. margin distance indicates a margin distance which is set in advance for the safety of the mobile robot. response time is a response time of a sensor mounted in the mobile robot and is acquired by the following Equation 2.
response time=(sensor scanning time*sampling count)+communication delay+margin response time  [Equation 2]
Here, sensor scanning time indicates a scanning time of the sensor mounted in the mobile robot. sampling count indicates a number of samplings of the sensor mounted in the mobile robot. communication delay indicates a communication delay time of the sensor mounted in the mobile robot. margin response time indicates a margin response time which is set in advance for exact measurement of the sensor mounted in the mobile robot.
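As a non-limiting sketch, Equations 1 and 2 may be implemented as follows; the function and parameter names are illustrative assumptions, and the formulas follow Equations 1 and 2 as stated above.

```python
def response_time(sensor_scanning_time, sampling_count,
                  communication_delay, margin_response_time):
    # Equation 2: response time of the sensor mounted in the mobile robot.
    return (sensor_scanning_time * sampling_count
            + communication_delay + margin_response_time)

def future_predicted_location(speed_x, speed_y, resp_time,
                              braking_distance, margin_distance):
    # Equation 1: future predicted x-axis and y-axis locations,
    # relative to the current center location of the mobile robot.
    horizon = resp_time + braking_distance + margin_distance
    return speed_x * horizon, speed_y * horizon
```

For example, with a scanning time of 0.1 s, 3 samplings, a 0.05 s communication delay, and a 0.05 s margin, the response time is 0.4 s, and a robot moving at (1.0, 0.5) m/s with a 0.4 m braking distance and a 0.2 m margin yields a predicted offset of (1.0, 0.5).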
The safety zone generating unit 130 may dynamically generate a safety zone for the mobile robot based on the shape information of the mobile robot and the future predicted location of the mobile robot. Here, the shape information of the mobile robot refers to information about the appearance of the robot.
That is, the safety zone generating unit 130 may acquire a default safety zone based on the shape information of the mobile robot and dynamically generate a safety zone for the mobile robot based on the default safety zone and the future predicted location of the mobile robot.
At this time, the safety zone generating unit 130 acquires front, rear, left, and right distances with respect to the center point of the mobile robot based on the shape information of the mobile robot and acquires a default safety zone formed of four vertices based on the center point of the robot and the front, rear, left, and right distances. For example, as illustrated in FIG. 4 , the safety zone generating unit 130 acquires a first distance (a front side distance), a second distance (a rear side distance), a third distance (a left side distance), and a fourth distance (a right side distance) with respect to the center point of the mobile robot and acquires a default safety zone based on the acquired first to fourth distances.
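The construction of the default safety zone from the center point and the four distances may be sketched as below. The axis convention (front/rear along +x/-x, left/right along +y/-y) and the function name are assumptions made only for illustration.

```python
def default_safety_zone(center, front, rear, left, right):
    """Four vertices of the default safety zone around the robot center.

    `center` is an (x, y) tuple; front/rear extend along +x/-x and
    left/right along +y/-y (an assumed axis convention).
    Returns the vertices in counter-clockwise order.
    """
    cx, cy = center
    return [
        (cx + front, cy + left),   # front-left vertex
        (cx - rear,  cy + left),   # rear-left vertex
        (cx - rear,  cy - right),  # rear-right vertex
        (cx + front, cy - right),  # front-right vertex
    ]
```

For a robot centered at the origin with front and rear distances of 1.0 m and side distances of 0.5 m, the default zone is the rectangle with corners (1.0, 0.5), (-1.0, 0.5), (-1.0, -0.5), and (1.0, -0.5).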
Further, the safety zone generating unit 130 varies the front, rear, left, and right distances according to the default safety zone based on the future predicted location of the mobile robot to dynamically generate the safety zone for the mobile robot. For example, as illustrated in FIG. 6 , the safety zone generating unit 130 dynamically generates the safety zone by referring to the future predicted location of the mobile robot based on the default safety zone with respect to the current center position of the mobile robot. At this time, the distance of the safety zone is acquired by the following Equation 3.
zone distance=speed*response time+braking distance+margin distance  [Equation 3]
Here, speed indicates the movement speed of the mobile robot. response time is the response time of a sensor mounted in the mobile robot and is acquired by Equation 2 above. braking distance indicates a distance required to stop the mobile robot and is set in advance according to the speed of the mobile robot. margin distance indicates a margin distance which is set in advance for the safety of the mobile robot.
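Equation 3 may be sketched as follows; names are illustrative assumptions. The clamp to a default distance reflects the constraint, discussed with FIG. 7 , that the dynamic zone should not become smaller than the default safety zone.

```python
def zone_distance(speed, resp_time, braking_distance, margin_distance,
                  default_distance=0.0):
    # Equation 3: distance of the dynamically generated safety zone,
    # clamped so the zone never shrinks below the default distance.
    d = speed * resp_time + braking_distance + margin_distance
    return max(d, default_distance)
```

For example, at 1.0 m/s with a 0.5 s response time, 0.3 m braking distance, and 0.2 m margin, the zone distance is 1.0 m; at standstill the same parameters yield 0.5 m, which a 0.6 m default distance would clamp up to 0.6 m.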
At this time, the safety zone generating unit 130 acquires a plurality of sub safety zones by varying the front, rear, left, and right distances according to the default safety zone plural times, based on the current location of the mobile robot and the future predicted location of the mobile robot, and dynamically generates the safety zone for the mobile robot based on the plurality of sub safety zones. For example, the safety zone generating unit 130 divides the path along which the mobile robot moves into predetermined distance units, with the current location of the mobile robot as a start point and the future predicted location of the mobile robot as a target point, acquires a sub safety zone for every distance unit, and dynamically generates the safety zone for the mobile robot by overlapping the plurality of acquired sub safety zones.
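The sub safety zone sampling along the path may be sketched as below. This is an assumed, simplified rendering: the zones are represented as axis-aligned boxes, and "overlapping" is illustrated as a simple bounding box of all sub zones; the disclosure does not prescribe a specific union operation.

```python
import math

def sub_safety_zones(current, predicted, step, front, rear, left, right):
    """Sample sub safety zones along the straight path from the current
    location to the future predicted location at a fixed distance step.
    Each zone is an axis-aligned box (min_x, min_y, max_x, max_y).
    """
    cx, cy = current
    px, py = predicted
    dist = math.hypot(px - cx, py - cy)
    steps = max(1, int(dist // step))
    zones = []
    for i in range(steps + 1):
        t = i / steps  # interpolation parameter along the path
        x = cx + (px - cx) * t
        y = cy + (py - cy) * t
        zones.append((x - rear, y - right, x + front, y + left))
    return zones

def union_bounds(zones):
    # Bounding box of all sub safety zones: the simplest "overlap".
    min_xs, min_ys, max_xs, max_ys = zip(*zones)
    return min(min_xs), min(min_ys), max(max_xs), max(max_ys)
```

For a robot moving 1.0 m along the x-axis sampled every 0.5 m with a 0.2 m front/rear and 0.1 m side default zone, three sub zones are produced and their union spans from x = -0.2 to x = 1.2.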
Now, an example of generating a dynamic safety zone according to the first exemplary embodiment of the present disclosure will be described with reference to FIGS. 7 to 12 .
FIG. 7 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a straight movement on one axis, FIG. 8 is a view for explaining an operation of generating a dynamic safety zone for diagonal movement according to an exemplary embodiment of the present disclosure, FIG. 9 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a diagonal movement on an x-axis and a y-axis, FIG. 10 is a view for explaining an operation of generating a dynamic safety zone of rotational movement according to an exemplary embodiment of the present disclosure, FIG. 11 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a rotational movement, and FIG. 12 is a view for explaining an example of generating a dynamic safety zone according to an exemplary embodiment of the present disclosure to illustrate a case in which a rotational movement and a straight movement simultaneously occur.
Referring to FIG. 7 , the faster the speed, the longer the dynamically generated safety zone, and the slower the speed, the smaller the safety zone. However, the dynamically generated safety zone should not be smaller than the shape of the mobile robot, that is, the default safety zone.
Referring to FIGS. 8 and 9 , when the mobile robot simultaneously moves to the x-axis and the y-axis, the mobile robot diagonally travels and the safety zone is dynamically and automatically generated so that the mobile robot does not collide with the obstacle during the diagonal traveling.
Referring to FIGS. 10 and 11 , the shape of the dynamically generated safety zone changes according to the rotation direction of the mobile robot, so that when the mobile robot rotates to the right side, an upper right end and a lower left end which are likely to collide become longer, and when the mobile robot rotates to the left side, the upper left end and the lower right end may become longer.
Referring to FIG. 12 , when the rotational movement and the straight movement of the mobile robot occur simultaneously, so that there are a straight movement speed and a rotation speed at the same time, a safety zone is generated by adding a rotational value to the value obtained from the future predicted location of the mobile robot, and all the safety zones are automatically determined and changed according to the speed and the direction.
Now, a method for generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure will be described with reference to FIG. 13 .
FIG. 13 is a flowchart for explaining a method for generating a dynamic safety zone of a mobile robot according to an exemplary embodiment of the present disclosure.
Referring to FIG. 13 , the dynamic safety zone generating apparatus 100 acquires a movement direction and a movement speed of the mobile robot (S110).
By doing this, the dynamic safety zone generating apparatus 100 dynamically generates the safety zone for the mobile robot based on at least one of the shape information of the mobile robot, the movement direction of the mobile robot, and the movement speed of the mobile robot (S130).
At this time, the dynamic safety zone generating apparatus 100 dynamically generates the safety zone for the mobile robot in the unit of a predetermined cycle. Alternatively, the dynamic safety zone generating apparatus 100 dynamically generates the safety zone of the mobile robot when at least one of the movement direction and the movement speed of the mobile robot changes.
To be more specific, the dynamic safety zone generating apparatus 100 acquires a future predicted location of the mobile robot based on the movement direction and the movement speed of the mobile robot. That is, the dynamic safety zone generating apparatus 100 acquires the movement speed of the x-axis direction and the movement speed of the y-axis direction based on the movement direction and the movement speed of the mobile robot, acquires the future predicted x-axis location of the mobile robot based on the movement speed of the x-axis direction and acquires the future predicted y-axis location of the mobile robot based on the movement speed of the y-axis direction, and acquires a future predicted location of the mobile robot based on the future predicted x-axis location and the future predicted y-axis location.
The dynamic safety zone generating apparatus 100 may dynamically generate a safety zone for the mobile robot based on the shape information of the mobile robot and the future predicted location of the mobile robot.
That is, the dynamic safety zone generating apparatus 100 may acquire a default safety zone based on the shape information of the mobile robot and dynamically generate a safety zone for the mobile robot based on the default safety zone and the future predicted location of the mobile robot. At this time, the dynamic safety zone generating apparatus 100 acquires front, rear, left, and right distances with respect to the center point of the mobile robot based on the shape information of the mobile robot and acquires a default safety zone formed of four vertices based on the center point of the robot and the front, rear, left, and right distances.
Further, the dynamic safety zone generating apparatus 100 varies the front, rear, left, and right distances according to the default safety zone based on the future predicted location of the mobile robot to dynamically generate the safety zone for the mobile robot. At this time, the dynamic safety zone generating apparatus 100 acquires a plurality of sub safety zones by varying the front, rear, left, and right distances according to the default safety zone plural times, based on the current location of the mobile robot and the future predicted location of the mobile robot and dynamically generates the safety zone for the mobile robot based on the plurality of sub safety zones.
FIG. 14 is a block diagram schematically illustrating a mobile robot according to an exemplary embodiment of the present disclosure.
A mobile robot 10 according to the exemplary embodiment of the present disclosure includes an environment sensing device 20, a power device 30, a control device 40, and a driving device 50. The mobile robot 10 of FIG. 14 is an example, so that not all blocks illustrated in FIG. 14 are essential components, and in other exemplary embodiments, some blocks included in the mobile robot 10 may be added, modified, or omitted.
The mobile robot 10 according to the exemplary embodiment may be household cleaning robots, public building cleaning robots, logistics robots, service robots, and industrial robots.
The environment sensing device 20 refers to a device which senses motion information, surrounding obstacle information, and floor state information for the mobile robot 10.
The environment sensing device 20 includes a plurality of sensors and includes various sensors, such as a LiDAR sensor, a radar sensor, an image sensor, or an IR sensor.
The environment sensing device 20 transmits information sensed by the plurality of sensors to the control device 40.
The power device 30 stores and supplies a power for an operation of the mobile robot 10.
The power device 30 applies a power while interworking with various configurations required to be applied with the power in the mobile robot 10.
The power device 30 may be implemented as a battery, but is not limited thereto.
The control device 40 performs an operation of controlling an overall operation of the mobile robot 10.
The control device 40 controls a safety area for preventing collision of the mobile robot 10. The operation of the control device 40 to generate and control the safety area will be described with reference to FIGS. 15 to 18 . Here, the safety area is a concept including the safety zone, and hereinafter the term safety area may be used in place of the safety zone.
The control device 40 performs an operation corresponding to all or a part of operations performed by the dynamic safety zone generating device 100.
Further, the control device 40 controls the driving of the mobile robot 10. The control device 40 generates an operation control signal based on the safety area and transmits the generated operation control signal to at least one motor included in the driving device 50 to control a driving force of the motor, thereby controlling the operation of the mobile robot 10.
The driving device 50 refers to a device including at least one motor equipped in the mobile robot 10. The driving device 50 may include various types of motors related to the operation of the mobile robot 10.
The driving device 50 according to the exemplary embodiment may include a movement motor 56, but is not necessarily limited thereto and may further include various motors according to the type of the mobile robot 10.
The movement motor 56 is a motor for rotating the main wheels (not illustrated) of the mobile robot 10; it is connected to the main wheels and generates a driving force to rotate them.
The movement motor 56 rotates the main wheels to move the mobile robot 10 along the movement path set by the control device 40 of the mobile robot 10.
Further, the movement motor 56 adaptively adjusts the driving force for rotating the main wheels so as to correspond to the movement speed or the size of the safety area, based on the operation control signal received from the control device 40.
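The conversion from a body velocity command to wheel commands described above can be sketched for a differential-drive base. The function and the wheel-radius and axle-length constants below are illustrative assumptions for a two-wheeled robot, not values taken from the patent:

```python
# Hypothetical sketch: convert a body velocity command, derived from the
# safety-area control signal, into left/right wheel angular velocities
# for a differential-drive base. All constants are assumed for illustration.

WHEEL_RADIUS = 0.05  # wheel radius in meters (assumed)
AXLE_LENGTH = 0.30   # distance between the two main wheels in meters (assumed)

def wheel_speeds(linear_v: float, angular_w: float):
    """Map a body command (m/s, rad/s) to (left, right) wheel speeds in rad/s."""
    v_left = linear_v - angular_w * AXLE_LENGTH / 2.0   # inner wheel slows on a turn
    v_right = linear_v + angular_w * AXLE_LENGTH / 2.0  # outer wheel speeds up
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS
```

When the safety logic reduces `linear_v` (for example, because an obstacle entered a sub safety area), both wheel speeds scale down together, which is the adaptive adjustment the passage describes.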
FIG. 15 is a block diagram schematically illustrating a control device of a mobile robot according to an exemplary embodiment of the present disclosure.
The control device 40 of the mobile robot 10 according to the exemplary embodiment includes an I/O interface 41, a communication module 42, a processor 43, a memory 44, and a database 45. The control device 40 of FIG. 15 is an example; not all blocks illustrated in FIG. 15 are essential components, and in other exemplary embodiments, some blocks included in the control device 40 may be added, modified, or omitted. In the meantime, the control device 40 may be implemented by a computing device, and each component included in the control device 40 may be implemented by a separate software device or by a separate hardware device combined with software.
The communication module 42 refers to a means which receives or transmits a signal or data.
The communication module 42 interworks with the processor 43 to input various types of signals or data, or directly acquires data by interworking with a device in the mobile robot or an external device and transmits the signal or data to the processor 43. Here, the communication module 42 performs an operation corresponding to all or part of the operation performed by the information acquiring unit 110.
Further, the communication module 42 transmits the signal or data generated in the processor 43 to the device in the mobile robot 10 or an external device (for example, a server).
The communication module 42 may be connected to the I/O interface 41. The I/O interface 41 transmits information acquired from the communication module 42 to the processor 43, or receives a control signal from the processor 43 and converts it into a signal for controlling the communication module 42.
The processor 43 according to the exemplary embodiment performs an operation of generating and controlling a safety area 2000 for preventing the collision with the obstacle present around the mobile robot 10.
The processor 43 performs an operation corresponding to all or a part of an operation performed by the dynamic safety zone generating unit 130.
The memory 44 includes at least one instruction or program which is executable by the processor 43. The memory 44 includes instructions or programs for controlling the mobile robot 10.
The database 45 refers to a general data structure implemented in a storage space (a hard disk or a memory) of a computer system using a database management system (DBMS), and means a data storage format in which data can be freely searched (extracted), deleted, edited, or added.
The database 45 may be implemented according to the object of the exemplary embodiment of the present disclosure using a relational database management system (RDBMS) such as Oracle, Informix, Sybase, or DB2, an object-oriented database management system (OODBMS) such as GemStone, Orion, or O2, or an XML native database such as Excelon, Tamino, or Sekaiju, and has appropriate fields or elements to achieve its own function. In the meantime, the database 45 may be implemented as a cloud or a virtual memory.
The database 45 according to the exemplary embodiment stores and provides information about control of the mobile robot 10 and information about the safety area.
It has been described that the database 45 is implemented in the control device 40, but is not necessarily limited thereto and may be implemented as a separate data storage device.
Hereinafter, an operation of generating and controlling a safety area by the processor 43 will be described. FIGS. 16 and 17 are views for explaining a safety area according to an exemplary embodiment of the present disclosure.
According to the exemplary embodiment, the processor 43 generates a safety area 2000 for preventing the collision with the obstacle present around the mobile robot 10. The safety area 2000 generated around the mobile robot 10 includes a plurality of sub safety areas 2010, 2020, 2030.
The speed range required by each sub safety area is different, and the ranges are defined so as not to overlap.
The sub safety areas are distinguished according to the range of deceleration speed required for each sub safety area. The deceleration range required by a first sub safety area 2010, that is, its required speed range, is set so as not to overlap the deceleration range required by a second sub safety area 2020. A third sub safety area 2030 corresponds to an area in which a collision is imminent, and the speed range required by the third sub safety area is set so as not to overlap the deceleration range required by the second sub safety area 2020.
For example, the first speed range required by the first sub safety area 2010 does not overlap the second speed range required by the second sub safety area 2020 and is set to be higher than the second speed range. The third speed range required by the third sub safety area 2030 is set to be lower than the second speed range. If the third sub safety area 2030 is in contact with a surface of the body of the mobile robot 10, the third speed range includes 0.
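The non-overlapping speed ranges per sub safety area can be sketched as a simple lookup plus clamp. The concrete numeric ranges below are illustrative assumptions; the patent specifies only their ordering and that the innermost range includes 0:

```python
# Minimal sketch of non-overlapping speed ranges per sub safety area.
# Zone 1 is outermost (highest allowed speeds); zone 3 is innermost,
# touches the body, and includes speed 0. Numbers are assumed, in m/s.

SUB_ZONE_SPEED_RANGES = {
    1: (0.6, 1.0),  # first sub safety area: highest speed range
    2: (0.2, 0.6),  # second: lower range, no overlap with the first
    3: (0.0, 0.2),  # third: collision imminent, range includes 0
}

def allowed_speed(zone: int, requested: float) -> float:
    """Clamp a requested speed into the range required by the deepest
    sub safety area currently entered by an obstacle."""
    lo, hi = SUB_ZONE_SPEED_RANGES[zone]
    return max(lo, min(hi, requested))
```

For instance, a 0.8 m/s request is left unchanged while the obstacle is only in zone 1, but is clamped to 0.2 m/s once the obstacle reaches zone 3.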
The processor 43 performs processing for visualizing real-time information related to the above-described safety area. The visualized information may be provided to the user via a screen of a user terminal or a screen of the mobile robot 10.
The processor 43 includes a model which generates a safety area including a plurality of sub safety areas 2010, 2020, and 2030 and generates a control variable for setting or managing an attribute (for example, a size, a shape, or a direction) of safety areas according to circumstances. The processor 43 receives the control variable from the safety area generating model to process the visualization for the safety area. According to still another exemplary embodiment, the safety area generating model may be implemented to directly process the visualization for the safety area.
In the processor 43 according to the present disclosure, the safety area may include a fixed safety area, in which the attributes (for example, size, shape, and direction) of the safety area do not change, and a variable safety area, in which at least one of the attributes (for example, size, shape, and direction) of the safety area changes.
The variable safety area may be generated separately from the fixed safety area, or may vary the fixed safety area for a predetermined time (for example, when an obstacle is sensed on the traveling path or there is a possibility of collision).
The variable safety area is preferably a safety area generated in response to a dynamic obstacle or an unidentified static obstacle.
As described above, each of the fixed safety area and the variable safety area may further include sub safety areas.
The processor 43 according to the exemplary embodiment finally determines the attribute (size, shape, or location) of the variable safety area in consideration of motion information of the mobile robot or environment information acquired from the sensor. Here, the motion information may be a movement speed or a movement direction of the mobile robot and the environment information may be neighbor sensing information or obstacle detection information.
The processor 43 generates the variable safety area independently from the fixed safety area to determine an attribute (for example, size, shape, or direction) of the variable safety area in consideration of the motion information of the mobile robot and the environment information. When an unidentified static obstacle or a dynamic obstacle is sensed on the traveling path, the variable safety area which is generated in consideration of a distance to the obstacle may further include detailed variable sub safety areas.
In the meantime, the fixed safety area and the variable safety area may be generated to have different attributes when an unidentified static obstacle or a dynamic obstacle 4000 approaching the mobile robot is sensed. That is, the fixed safety area and the variable safety area 2040 are substantially different, so that there are an overlapped region in which the two areas overlap and a non-overlapped region 2050 which belongs to only one safety area.
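The overlapped and non-overlapped regions between the fixed and variable safety areas can be sketched by approximating both areas as axis-aligned rectangles `(x_min, y_min, x_max, y_max)`. The rectangle approximation and the example coordinates are assumptions made for brevity; a real implementation would operate on arbitrary polygons:

```python
# Sketch: overlapped region of a fixed and a variable safety area,
# both approximated as axis-aligned rectangles (assumed shape).

def intersection(a, b):
    """Overlapped region of two rectangles, or None if they are disjoint."""
    x0, y0 = max(a[0], b[0]), max(a[1], b[1])
    x1, y1 = min(a[2], b[2]), min(a[3], b[3])
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

# Example (coordinates assumed): the variable area extends forward
# toward a sensed obstacle, so the strip beyond the fixed area belongs
# only to the variable safety area (the non-overlapped region).
fixed = (-0.5, -0.5, 0.5, 0.5)
variable = (-0.5, -0.5, 0.5, 1.2)
overlap = intersection(fixed, variable)
```

Here `overlap` equals the fixed area itself, and the forward strip between y = 0.5 and y = 1.2 is the non-overlapped region belonging to only one safety area.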
FIG. 18 is a block diagram schematically illustrating a control device including a machine learning model according to an exemplary embodiment of the present disclosure.
The control device 40 of FIG. 18 is an example in which a neural network processor 46 is added to the control device 40 of FIG. 15 . Therefore, a redundant description with the control device 40 of FIG. 15 will be omitted.
The memory 44 includes at least one instruction or program which is executable by the processor 43. The memory 44 includes instructions or programs for controlling the mobile robot 10. Further, the memory 44 includes an instruction or a program for an operation of preprocessing a neural network learning result and an input value or an output value of the neural network.
The control device 40 of the exemplary embodiment further includes a neural network processor 46 including a machine learning model 47 which is trained in advance to improve a processing speed and an accuracy related to the safety area.
The processor 43 may interwork with the neural network processor 46 to perform an operation of controlling a safety area.
In the meantime, even though it is described that the processor 43 and the neural network processor 46 are different modules, the present invention is not necessarily limited thereto, and the processor and the neural network processor may be combined into one module which performs the individual operations.
The neural network processor 46 performs an operation of predicting a collision probability based on artificial intelligence (AI) or managing a safety area.
The neural network processor 46 includes an input node, an intermediate node, and an output node and has a structure specified by a determination weight in which training is completed in advance by training data as a connection weight which connects the nodes. An output value of the neural network processor 46 may be a coordinate value of an extended area or a coordinate value of a unit block area and may be implemented as a feature value matrix for the extended area or the unit block area.
The machine learning model 47 includes a collision probability prediction model 48 which predicts the collision with an obstacle and a safety area management model 49 which manages the safety area.
First, the collision probability prediction model 48 is a model which generates collision prediction information related to collision with obstacles on the currently predicted traveling path. The collision probability prediction model is a machine learning model which is trained with various training data and may be implemented as software or as hardware.
The collision probability prediction model 48 takes a sensor value as an input and outputs information related to the collision. The structure of the neural network for the collision probability prediction model 48 is not specifically limited. For example, the neural network may be implemented as a convolutional neural network with a multi-layered structure or as a recurrent neural network. Further, it may be implemented as hardware using an artificial intelligence accelerator.
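The input/output contract of the collision probability prediction model can be illustrated with a deliberately tiny stand-in: a single logistic unit mapping range-sensor readings to a collision probability. The patent describes a trained CNN or RNN; the feature choice (minimum range) and the weights below are purely hypothetical:

```python
# Illustrative stand-in for the collision probability prediction model 48:
# one logistic unit over the nearest sensed range. Weights are assumed,
# not trained; a real model would be a CNN/RNN over the full sensor input.

import math

def collision_probability(ranges, weight=-4.0, bias=2.0):
    """Map range readings (m) to a probability in (0, 1); closer obstacles
    (smaller minimum range) yield a higher collision probability."""
    nearest = min(ranges)
    return 1.0 / (1.0 + math.exp(-(weight * nearest + bias)))
```

The point of the sketch is the interface, sensor values in, a probability out, which is what the processor 43 consumes when shaping the variable safety area.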
The safety area management model 49 inputs information about the safety area visualized by the processor 43 into a previously trained neural network and outputs sub area attribute values for every sub area belonging to the visualized area. Here, the sub area attribute value includes information indicating the safety area or sub safety area to which micro areas, defined for every pixel or for every macro block configured by a plurality of pixels, belong.
The safety area management model 49 preferably has a convolutional neural network structure, but is not necessarily limited thereto. Further, it may be implemented as hardware using a graphics accelerator.
The processor 43 finally determines an attribute (size, shape, or location) of the variable safety area in consideration of motion information of the mobile robot, environment information acquired from the sensor, or collision prediction information, transmitted from the collision probability prediction model 48, about the possibility of collision between the mobile robot and the obstacle. Here, the motion information may be a movement speed or a movement direction of the mobile robot, and the environment information may be neighbor sensing information or obstacle detection information. Further, the collision prediction information includes a collision probability, a remaining time to collide, or a collision prediction location when the mobile robot moves along the predicted movement path. The collision prediction point may be, for example, the left side of the body, the right side of the body, or the front of the mobile robot.
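Two pieces of the collision prediction information above, the remaining time to collide and the predicted impact point on the body, can be sketched under simple assumptions. The constant-velocity model and the 30-degree front sector threshold are illustrative assumptions, not values from the patent:

```python
# Sketch of collision prediction information: time to collide under a
# constant-velocity assumption, and a coarse impact-point classifier.
# The constant-velocity model and the sector threshold are assumed.

def time_to_collision(distance, closing_speed):
    """Remaining time (s) until impact, or None if the obstacle is not closing."""
    if closing_speed <= 0.0:
        return None
    return distance / closing_speed

def collision_point(bearing_deg):
    """Classify the predicted impact point from the obstacle bearing
    (0 deg = straight ahead, positive = left); threshold assumed."""
    if abs(bearing_deg) <= 30.0:
        return "front"
    return "body left" if bearing_deg > 0 else "body right"
```

An obstacle 2 m ahead closing at 0.5 m/s yields 4 s to collide at the front of the body, which is the kind of tuple the processor 43 would weigh when resizing the variable safety area.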
The processor 43 generates the variable safety area independently from the fixed safety area to determine an attribute (for example, size, shape, or direction) of the variable safety area in consideration of the motion information of the mobile robot, the environment information, and the collision prediction information. When an unidentified static obstacle or a dynamic obstacle is sensed on the traveling path, the variable safety area which is generated in consideration of a distance to the obstacle may further include detailed variable sub safety areas.
In the meantime, the fixed safety area and the variable safety area may be generated to have different attributes by the unidentified static obstacle or the dynamic obstacle 4000 which approaches the mobile robot to be sensed. That is, the fixed safety area 2010 and the variable safety area 2040 are substantially different so that there are an overlapped region in which two areas overlap and a non-overlapped region 2050 which belongs to only one safety area.
Referring to FIG. 17, the non-overlapped region 2050 is generated by an obstacle 4000 which is suddenly sensed or which is nearby. The non-overlapped region 2050 is an area located outside the variable safety area. The safety area management model 49 transmits a sub area attribute value for the non-overlapped region 2050 to the processor 43.
The processor 43 calculates the traveling path again using the sub area attribute value. The processor 43 allows the mobile robot to travel along the calculated traveling path.
Further, as illustrated in FIG. 17, when a dynamic obstacle or an unidentified static obstacle 4000 is sensed at the left side of the traveling direction, the processor 43 receives the collision prediction information from the collision probability prediction model 48 and generates a variable safety area 2040 which varies according to the location of the sensed obstacle. When the sensed dynamic obstacle or unidentified static obstacle 4000 approaches within a predetermined reference, the processor 43 extends the safety area toward the direction in which the dynamic obstacle is located to generate an extended variable safety area 2040.
The mobile robot 10 includes a variable safety area having a shape with a curvature or an arc shape.
As the variable safety area is applied, the mobile robot 10 recovers the portion of its traffic line that would be wasted by a polygonal safety area around a static obstacle. Further, as the variable safety area is applied, the mobile robot 10 covers the area that a polygonal safety area would miss around a static obstacle, supplementing the safety area.
In other words, as the variable safety area having a curved or arc shape is applied, the mobile robot 10 may ensure a safety-area-based moving path which minimizes space loss.
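An arc-shaped variable safety area can be sketched by sampling boundary points of a circular arc centered on the robot and opened toward a heading, for example toward a sensed obstacle. The function, its parameters, and the sampling count are illustrative assumptions rather than the patent's construction:

```python
# Sketch: boundary points of an arc-shaped variable safety area.
# radius, heading, and half-angle are assumed illustrative parameters.

import math

def arc_zone(radius, heading_rad, half_angle_rad, n=9):
    """Sample n boundary points of an arc of the given radius, centered on
    the robot at the origin and facing the direction heading_rad."""
    pts = []
    for i in range(n):
        # sweep from (heading - half_angle) to (heading + half_angle)
        a = heading_rad - half_angle_rad + 2.0 * half_angle_rad * i / (n - 1)
        pts.append((radius * math.cos(a), radius * math.sin(a)))
    return pts
```

Unlike a polygon with fixed corners, the arc hugs the robot's turning envelope, which is why the curved shape wastes less of the traffic line around a static obstacle.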
FIGS. 19 and 20 are exemplary views for explaining an operation of a mobile robot according to an obstacle according to an exemplary embodiment of the present disclosure.
FIG. 19 illustrates a situation in which there are static obstacles C and D when the mobile robot 10 moves from a start point A to an arrival point B, and FIG. 20 illustrates a situation in which there are static obstacles C and D as well as a dynamic obstacle or an unidentified static obstacle 4000 when the mobile robot 10 moves from the start point A to the arrival point B.
In FIG. 19, the mobile robot 10 extends the sub safety areas 2010, 2020, and 2030 to cover the space between the static obstacles C and D and moves while reducing its movement speed in accordance with the extension.
In FIG. 19 , the sub safety areas 2010, 2020, and 2030 are formed as variable safety areas having a shape with a curvature or an arc shape.
As the sub safety areas 2010, 2020, and 2030 having a curved or arc shape are applied, the portion of the traffic line of the mobile robot that would be wasted by a polygonal safety area around the static obstacle is recovered, and the area missed in the vicinity of the static obstacle is covered, supplementing the safety area.
In the meantime, in FIG. 20 , when the dynamic obstacle or the unidentified static obstacle 4000 which is in contact with the first sub safety area 2010 is sensed, the mobile robot 10 reduces the movement speed based on a predetermined deceleration range for the first sub safety area 2010 and modifies a traveling path of the mobile robot 10. Here, the modified traveling path may be set to a shortest distance to reach the arrival point B while avoiding the dynamic obstacle or the unidentified static obstacle 4000.
When the mobile robot 10 avoids the dynamic obstacle or the unidentified obstacle 4000, the mobile robot 10 moves while maintaining a predetermined distance from the obstacle 4000.
Even though it has been described above that all components of the exemplary embodiment of the present invention are combined as one component or operate in combination, the present invention is not limited to the exemplary embodiment. In other words, one or more components may be selectively combined and operated within the scope of the present invention. Further, all components may each be implemented as independent hardware, but a part or all of the components may be selectively combined and implemented as a computer program including a program module which performs a part or all of their functions in one or more hardware components. Such a computer program may be stored in a computer-readable medium such as a USB memory, a CD, or a flash memory to be read and executed by a computer, thereby implementing the exemplary embodiment of the present invention. The recording medium of the computer program may include a magnetic recording medium or an optical recording medium.
The above description illustrates a technical spirit of the present invention as an example and various changes, modifications, and substitutions become apparent to those skilled in the art within a scope of an essential characteristic of the present invention. Therefore, as is evident from the foregoing description, the exemplary embodiments and accompanying drawings disclosed in the present invention do not limit the technical spirit of the present invention and the scope of the technical spirit is not limited by the exemplary embodiments and accompanying drawings. The protective scope of the present disclosure should be construed based on the following claims, and all the technical concepts in the equivalent scope thereof should be construed as falling within the scope of the present disclosure.

Claims (14)

What is claimed is:
1. A dynamic safety zone generating apparatus loaded in a mobile robot and configured to dynamically generate a safety zone based on information acquired from a sensor mounted on the mobile robot, comprising:
an information acquiring circuit which acquires a movement direction and a movement speed of a mobile robot; and
a safety zone generating circuit which dynamically generates a safety zone for the mobile robot based on at least one of shape information of the mobile robot, a movement direction of the mobile robot acquired by the information acquiring circuit, and a movement speed of the mobile robot acquired by the information acquiring circuit,
wherein the safety zone generating circuit acquires a future predicted x-axis location of the mobile robot by multiplying a movement speed of the x-axis direction of the mobile robot by a sum of a braking distance required to stop the mobile robot which is preset according to the speed of the mobile robot, a margin distance preset for the safety of the mobile robot, and a response time of a sensor mounted on the mobile robot,
acquires a future predicted y-axis location of the mobile robot by multiplying a movement speed of the y-axis direction of the mobile robot by the sum of the braking distance, the margin distance, and the response time,
acquires a future predicted location of the mobile robot based on the future predicted x-axis location and the future predicted y-axis location,
dynamically generates the safety zone for the mobile robot based on the shape information of the mobile robot and the future predicted location of the mobile robot,
generates an operation control signal based on the safety zone, and
transmits the operation control signal to at least one motor to control the operation of the mobile robot.
2. The dynamic safety zone generating apparatus according to claim 1, wherein the safety zone generating circuit acquires a default safety zone based on the shape information of the mobile robot and dynamically generates a safety zone for the mobile robot based on the default safety zone and the future predicted location of the mobile robot.
3. The dynamic safety zone generating apparatus according to claim 2, wherein the safety zone generating circuit acquires front, rear, left, and right distances with respect to a center point of the mobile robot based on the shape information of the mobile robot and acquires a default safety zone formed of four vertices based on the center point of the robot and the front, rear, left, and right distances.
4. The dynamic safety zone generating apparatus according to claim 2, wherein the safety zone generating circuit varies the front, rear, left, and right distances according to the default safety zone based on the future predicted location of the mobile robot to dynamically generate the safety zone for the mobile robot.
5. The dynamic safety zone generating apparatus according to claim 4, wherein the safety zone generating circuit acquires a plurality of sub safety zones by varying the front, rear, left, and right distances according to the default safety zone plural times, based on the current location of the mobile robot and the future predicted location of the mobile robot and dynamically generates the safety zone for the mobile robot based on the plurality of sub safety zones.
6. The dynamic safety zone generating apparatus according to claim 1, wherein the safety zone generating circuit dynamically generates a safety zone for the mobile robot at a predetermined cycle.
7. The dynamic safety zone generating apparatus according to claim 1, wherein the movement direction of the mobile robot is one of a traveling direction according to a straight movement of the mobile robot, a traveling direction according to the rotation of the mobile robot, and a traveling direction when the rotation and the straight movement of the mobile robot are simultaneously generated, the straight movement of the mobile robot is one of a straight movement on the x-axis, a straight movement on the y-axis, and a diagonal movement on the x-axis and the y-axis, and the rotation of the mobile robot is one of the rotational movement to the left side and the rotational movement to the right side.
8. The dynamic safety zone generating apparatus according to claim 1, wherein the safety zone generating circuit generates the safety zone including a plurality of sub safety areas and speed ranges of a mobile robot required by the sub safety areas are different from each other.
9. The dynamic safety zone generating apparatus according to claim 8, wherein the plurality of sub safety areas includes a first sub safety area, a second sub safety area, and a third sub safety area and a first speed range required by the first sub safety area does not overlap a second speed range required by the second sub safety area and is set to be higher than the second speed range.
10. The dynamic safety zone generating apparatus according to claim 9, wherein a third speed range required by the third sub safety area is set to be smaller than the second speed range and when the third sub safety area is in contact with a surface of a body of the mobile robot, the third speed range includes a speed value of 0.
11. The dynamic safety zone generating apparatus according to claim 1, wherein the safety zone generating circuit includes a processor and further includes a neural network processor including a machine learning model which predicts a collision with an obstacle and manages the safety area.
12. The dynamic safety zone generating apparatus according to claim 11, wherein the processor finally determines an attribute (size, shape, location) of the safety area in consideration of motion information of the mobile robot, environment information acquired from the sensor, or collision prediction information for collision possibility between the mobile robot and the obstacle transmitted from the collision probability prediction model.
13. A dynamic safety zone generating method performed in a mobile robot and configured to dynamically generate a safety zone based on information acquired from a sensor mounted on the mobile robot, comprising:
acquiring a movement direction and a movement speed of a mobile robot; and
dynamically generating a safety zone for the mobile robot based on at least one of shape information of the mobile robot, a movement direction of the mobile robot, and a movement speed of the mobile robot,
wherein the dynamically generating of a safety zone acquires a future predicted x-axis location of the mobile robot by multiplying a movement speed of the x-axis direction of the mobile robot by a sum of a braking distance required to stop the mobile robot which is preset according to the speed of the mobile robot, a margin distance preset for the safety of the mobile robot, and a response time of a sensor mounted on the mobile robot,
acquires a future predicted y-axis location of the mobile robot by multiplying a movement speed of the y-axis direction of the mobile robot by the sum of the braking distance, the margin distance, and the response time,
acquires a future predicted location of the mobile robot based on the future predicted x-axis location and the future predicted y-axis location,
dynamically generates the safety zone for the mobile robot based on the shape information of the mobile robot and the future predicted location of the mobile robot,
generates an operation control signal based on the safety zone, and
transmits the operation control signal to at least one motor to control the operation of the mobile robot.
14. The dynamic safety zone generating method according to claim 13, wherein the dynamically generating of a safety zone is configured by acquiring a default safety zone based on the shape information of the mobile robot and dynamically generating a safety zone for the mobile robot based on the default safety zone and the future predicted location of the mobile robot.
US18/316,889 2020-11-12 2023-05-12 Apparatus and method for generating dynamic safety zone of mobile robot Active 2042-03-18 US12320900B2 (en)

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
KR20200151001 2020-11-12
KR10-2020-0151001 2020-11-12
KR10-2021-0153766 2021-11-10
KR1020210153768A KR102625700B1 (en) 2020-11-12 2021-11-10 Apparatus and method for sensor duplexing of mobile robot
KR10-2021-0153768 2021-11-10
KR1020210153767A KR102625699B1 (en) 2020-11-12 2021-11-10 Apparatus and method for duplex system architecture modularization of mobile robot
KR10-2021-0153767 2021-11-10
KR1020210153766A KR102651026B1 (en) 2020-11-12 2021-11-10 Apparatus and method for generating dynamic safety zone of mobile robot
PCT/KR2021/016526 WO2022103196A1 (en) 2020-11-12 2021-11-12 Functional safety system of robot

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/016526 Continuation-In-Part WO2022103196A1 (en) 2020-11-12 2021-11-12 Functional safety system of robot

Publications (2)

Publication Number Publication Date
US20240075620A1 US20240075620A1 (en) 2024-03-07
US12320900B2 true US12320900B2 (en) 2025-06-03

Family

ID=81601590

Family Applications (3)

Application Number Title Priority Date Filing Date
US18/316,889 Active 2042-03-18 US12320900B2 (en) 2020-11-12 2023-05-12 Apparatus and method for generating dynamic safety zone of mobile robot
US18/316,909 Active 2043-01-31 US12529798B2 (en) 2020-11-12 2023-05-12 Apparatus and method for modularizing duplex system architecture of mobile robot
US18/316,927 Pending US20240077618A1 (en) 2020-11-12 2023-05-12 Apparatus and method for duplexing sensor of mobile robot

Family Applications After (2)

Application Number Title Priority Date Filing Date
US18/316,909 Active 2043-01-31 US12529798B2 (en) 2020-11-12 2023-05-12 Apparatus and method for modularizing duplex system architecture of mobile robot
US18/316,927 Pending US20240077618A1 (en) 2020-11-12 2023-05-12 Apparatus and method for duplexing sensor of mobile robot

Country Status (3)

Country Link
US (3) US12320900B2 (en)
EP (1) EP4257301A4 (en)
WO (1) WO2022103196A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2025223640A1 (en) 2024-04-23 2025-10-30 Abb Schweiz Ag Method for configuring at least a safety parameter for a safety zone for a robot device

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6771208B2 (en) 2002-04-24 2004-08-03 Medius, Inc. Multi-sensor system
US20050154503A1 (en) 2004-01-12 2005-07-14 Steven Jacobs Mobile vehicle sensor array
KR20080027675A (en) 2006-09-25 2008-03-28 엘지전자 주식회사 Robot cleaner and control method accordingly
US20160313364A1 (en) * 2015-04-27 2016-10-27 Honda Motor Co., Ltd. Moving speed estimation device for mobile body and control device for mobile body
US20170025019A1 (en) 2015-07-21 2017-01-26 Robert Bosch Gmbh Sensor system for recognizing protruding or exposed objects in the surroundings of a vehicle
US20170144307A1 (en) * 2015-11-24 2017-05-25 X Development Llc Safety System for Integrated Human/Robotic Environments
KR20180061929A (en) 2016-11-30 2018-06-08 주식회사 모디엠 MOBILE 3D MAPPING SYSTEM OF RAILWAY FACILITIES EQUIPPED WITH DUAL LIDAR and 3D MAPPING METHOD USING THE SAME
US20180173223A1 (en) * 2015-10-16 2018-06-21 Lemmings LLC Robotic Golf Caddy
KR20180099090A (en) 2017-02-28 2018-09-05 Korea Electronics Technology Institute Multi-Lidar Signal Convergence Device for Autonomous Vehicles
US20180372875A1 (en) 2017-06-27 2018-12-27 Uber Technologies, Inc. Sensor configuration for an autonomous semi-truck
US20190161274A1 (en) * 2017-11-27 2019-05-30 Amazon Technologies, Inc. Collision prevention for autonomous vehicles
KR101986919B1 (en) 2014-06-05 2019-06-07 SoftBank Robotics Europe Humanoid robot with collision avoidance and trajectory recovery capabilities
US20190262993A1 (en) * 2017-09-05 2019-08-29 Abb Schweiz Ag Robot having dynamic safety zones
US20190272671A1 (en) 2016-10-17 2019-09-05 Hangzhou Hikvision Digital Technology Co., Ltd. Method and device for constructing 3d scene model
US20200192341A1 (en) * 2018-03-07 2020-06-18 Skylla Technologies, Inc. Collaborative Determination Of A Load Footprint Of A Robotic Vehicle
US20210018927A1 (en) * 2019-07-15 2021-01-21 Deere & Company Robotic mower boundary detection system
US20210157325A1 (en) * 2019-11-26 2021-05-27 Zoox, Inc. Latency accommodation in trajectory generation
US20210213619A1 (en) * 2018-08-30 2021-07-15 Samsung Electronics Co., Ltd. Robot and control method therefor
WO2021208225A1 (en) * 2020-04-15 2021-10-21 Changsha Zoomlion Environment Industry Co., Ltd. Obstacle avoidance method, apparatus, and device for epidemic-prevention disinfecting and cleaning robot

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170313332A1 (en) * 2002-06-04 2017-11-02 General Electric Company Autonomous vehicle system and method
EP1969437B1 (en) * 2005-12-02 2009-09-09 iRobot Corporation Coverage robot mobility
DE102006048163B4 (en) * 2006-07-31 2013-06-06 Pilz Gmbh & Co. Kg Camera-based monitoring of moving machines and / or moving machine elements for collision prevention
WO2017050358A1 (en) * 2015-09-22 2017-03-30 Bluebotics Sa Dynamic navigation for autonomous vehicles
KR102461938B1 (en) * 2017-04-04 2022-10-31 LG Electronics Inc. Method of identifying obstacle on a driving surface and robot implementing thereof
US10664502B2 (en) * 2017-05-05 2020-05-26 iRobot Corporation Methods, systems, and devices for mapping wireless communication signals for mobile robot guidance
US11435759B2 (en) * 2018-05-04 2022-09-06 Lg Electronics Inc. Plurality of autonomous mobile robots and controlling method for the same
WO2020033808A1 (en) * 2018-08-10 2020-02-13 Brain Corporation Systems, apparatus and methods for removing false positives from sensor detection
US20220203531A1 (en) * 2019-04-17 2022-06-30 Sony Group Corporation Robot, transmission method, and transmission estimation method
CN110989605B (en) * 2019-12-13 2020-09-18 哈尔滨工业大学 Three-body intelligent system architecture and detection robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
International Search Report for PCT/KR2021/016526 by the Korean Intellectual Property Office, dated Feb. 18, 2022.
Office Action for U.S. Appl. No. 18/316,927 by the United States Patent and Trademark Office, dated Apr. 8, 2025.

Also Published As

Publication number Publication date
EP4257301A1 (en) 2023-10-11
US12529798B2 (en) 2026-01-20
WO2022103196A1 (en) 2022-05-19
US20240077618A1 (en) 2024-03-07
US20240075620A1 (en) 2024-03-07
EP4257301A4 (en) 2024-08-21
US20240075622A1 (en) 2024-03-07

Similar Documents

Publication Publication Date Title
KR102770055B1 (en) System and method for optimizing path planning for tight turns of robotic devices
CN113561963B (en) Parking method and device and vehicle
US11762390B1 (en) Autonomous machine safety management in a dynamic environment
CN111949027B (en) Self-adaptive robot navigation method and device
US20240126283A1 (en) Systems and methods for navigation of an autonomous system
US11967157B2 (en) Robot and method for controlling thereof
US11467592B2 (en) Route determination method
CN115328153B (en) Sensor data processing method, system and readable storage medium
Kenk et al. Human-aware Robot Navigation in Logistics Warehouses.
US12320900B2 (en) Apparatus and method for generating dynamic safety zone of mobile robot
Machkour et al. Monocular based navigation system for autonomous ground robots using multiple deep learning models
KR20220064914A (en) Apparatus and method for generating dynamic safety zone of mobile robot
WO2022116628A1 (en) Obstacle avoidance control system, method, storage medium, computer program product, and mobile device
Misir et al. Visual-based obstacle avoidance method using advanced CNN for mobile robots
US12204335B2 (en) Systems and methods for controlling a robotic vehicle
Nasti et al. Obstacle avoidance during robot navigation in dynamic environment using fuzzy controller
CN108646759A (en) Intelligent dismountable moving robot system based on stereoscopic vision and control method
CN114740835A (en) Path planning method, path planning device, robot and storage medium
Yang et al. Research into the application of AI robots in community home leisure interaction
JP7618378B2 (en) Autonomous Vehicles
WO2020021954A1 (en) Information processing device, information processing method, and program
WO2022153669A1 (en) Distributed coordination system and task execution method
Sun et al. Detection and state estimation of moving objects on a moving base for indoor navigation
EP4113065A1 (en) Systems and methods for navigation of an autonomous system
Acquaah et al. Integrating Deep Planning-Based Object Detection with 3D-Depth Camera for Collision Avoidance in Indoor Robotics Navigation

Legal Events

Date Code Title Description
AS Assignment

Owner name: MIELE & CIE. KG, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PARK, SEONG JU;SEO, DONG HYEON;JANG, SEUNG HO;AND OTHERS;REEL/FRAME:063630/0096

Effective date: 20230511

Owner name: YUJIN ROBOT CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PARK, SEONG JU;SEO, DONG HYEON;JANG, SEUNG HO;AND OTHERS;REEL/FRAME:063630/0096

Effective date: 20230511

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCF Information on status: patent grant

Free format text: PATENTED CASE