US11860638B2 - Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods - Google Patents
- Publication number
- US11860638B2 (application US17/065,629)
- Authority
- US
- United States
- Prior art keywords
- bumper
- impact
- sensor
- robot
- orientation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01L—MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
- G01L1/00—Measuring force or stress, in general
- G01L1/12—Measuring force or stress, in general by measuring variations in the magnetic properties of materials resulting from the application of stress
- G01L1/122—Measuring force or stress, in general by measuring variations in the magnetic properties of materials resulting from the application of stress by using permanent magnets
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0227—Control of position or course in two dimensions specially adapted to land vehicles using mechanical sensing means, e.g. for sensing treated area
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01L—MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
- G01L1/00—Measuring force or stress, in general
- G01L1/12—Measuring force or stress, in general by measuring variations in the magnetic properties of materials resulting from the application of stress
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01L—MEASURING FORCE, STRESS, TORQUE, WORK, MECHANICAL POWER, MECHANICAL EFFICIENCY, OR FLUID PRESSURE
- G01L5/00—Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes
- G01L5/0052—Apparatus for, or methods of, measuring force, work, mechanical power, or torque, specially adapted for specific purposes measuring forces due to impact
-
- G05D2201/0203—
Definitions
- the present invention relates to autonomous floor-cleaning robots, and in particular, an autonomous floor-cleaning robot having sensors for obstacle detection and related methods.
- Robot touch sensors have been incorporated into a robot bumper assembly.
- Such bumper assemblies are typically rigid, movable bumpers that are spaced away and suspended from the robot chassis.
- Such bumpers include a rigid outer shell suspended from the robot chassis by a series of hardware, such as pivots/bumper arms and coil springs. The springs absorb the impact energy, but require a high impact force, i.e., require that the bumper deflect by several millimeters to absorb the energy before triggering a switch to indicate that an impact event has occurred.
- An autonomous floor-cleaning robot can be used to clean the floor of a home.
- the autonomous floor-cleaning robot may contact an object, such as a chair, table, or wall.
- the autonomous floor-cleaning robot may execute one or more behaviors (e.g., a cleaning or navigational behavior).
- the behavior can be selected or adjusted based on a portion of the autonomous floor-cleaning robot (e.g., a zone) that makes contact with the object.
- Sensors such as Hall sensors can be used to detect a portion of the autonomous floor-cleaning robot that makes contact with the object.
- the inventors have recognized, among other things, a need for an autonomous floor-cleaning robot having an improved ratio of a number of zones to a number of sensors.
- a number of Hall sensors can be used to sense contact in a corresponding number of spatial regions in a bumper of the autonomous floor-cleaning robot.
- the inventors have recognized, among other things, that an orientation of Hall sensors can be adjusted to increase a number of spatial regions or zones for a fixed number of Hall sensors, thus providing improved performance without the need for additional sensors.
- a mobile robot comprises a robot body; a drive system supporting the robot body above a floor surface for maneuvering the robot across the floor surface; a bumper frame on a front periphery of the robot body, the bumper frame supported by the robot body; and a bumper impact system comprising: a first sensor at a first orientation with respect to the bumper frame that is configured to generate a first signal in response to a magnitude and a direction of movement of the bumper frame relative to the robot body; a second sensor at a second orientation with respect to the bumper frame that is configured to generate a second signal in response to a magnitude and a direction of movement of the bumper frame relative to the robot body, the second orientation being different from the first orientation; and a processor configured to receive the first and second signals from the first and second sensors and to generate an impact profile of a bumper impact event based on the first and second signals responsive to the bumper frame being moved with respect to the robot body.
- the first sensor comprises a first Hall effect sensor coupled to one of the bumper frame or the robot body and a first magnet coupled to the other of the bumper frame or the robot body such that the first Hall effect sensor generates the first signal in response to the bumper frame moving with respect to the robot body; and the second sensor comprises a second Hall effect sensor coupled to one of the bumper frame or the robot body and a second magnet coupled to the other of the bumper frame or the robot body such that the second Hall effect sensor generates the second signal in response to the bumper moving with respect to the robot body.
- the first and second magnets are coupled to the bumper and the first and second Hall effect sensors are coupled to the robot body.
- the first orientation of the first sensor corresponds to a first magnetic pole orientation of the first magnet
- the second orientation of the second sensor corresponds to a second magnetic pole orientation of the second magnet, the first magnetic pole orientation and the second magnetic pole orientation having an offset angle therebetween.
- the offset angle of the first magnetic pole orientation and the second magnetic pole orientation comprises about 90 degrees.
- the first and second magnetic pole orientations of the first and second magnets are oriented about 30-60 degrees or about 45 degrees offset from a major axis of a front edge periphery of the robot body.
- the processor is configured to identify a region of impact based on the impact profile of a bumper impact event.
- the processor is configured to determine a position or region of impact on the bumper frame based on a relative strength of the first and second signals.
- the processor is configured to determine a force of impact on the bumper frame based on a signal strength of the first and second signals.
- the processor is configured to define at least five impact regions of the bumper based on signal profile ranges for the first and second signals, and to identify one of the impact regions for a bumper impact event based on the signal profile ranges for the first and second signals.
- the processor is configured to identify a location of an obstacle based on the position or region of impact on the bumper frame.
- a method of operating a mobile robot includes a robot body, a drive system supporting the robot body above a floor surface for maneuvering the robot across the floor surface, and a bumper frame on a front periphery of the robot body.
- the method includes receiving a first signal from a first sensor positioned at a first orientation with respect to the bumper frame, the first signal being responsive to motion of the bumper frame relative to the robot body; receiving a second signal from a second sensor at a second orientation with respect to the bumper frame, the second signal being responsive to motion of the bumper frame relative to the robot body, the second orientation being different from the first orientation; and generating an impact profile of a bumper impact event based on the first and second signals responsive to the bumper frame being moved with respect to the robot body.
- the first sensor comprises a first Hall effect sensor coupled to one of the bumper frame or the robot body and a first magnet coupled to the other of the bumper frame or the robot body such that the first Hall effect sensor generates the first signal in response to the bumper frame moving with respect to the robot body; and the second sensor comprises a second Hall effect sensor coupled to one of the bumper frame or the robot body and a second magnet coupled to the other of the bumper frame or the robot body such that the second Hall effect sensor generates the second signal in response to the bumper moving with respect to the robot body.
- the first and second magnets are coupled to the bumper and the first and second Hall effect sensors are coupled to the robot body.
- the first orientation of the first sensor corresponds to a first magnetic pole orientation of the first magnet
- the second orientation of the second sensor corresponds to a second magnetic pole orientation of the second magnet, the first magnetic pole orientation and the second magnetic pole orientation having an offset angle therebetween.
- the offset angle of the first magnetic pole orientation and the second magnetic pole orientation comprises about 90 degrees.
- the first and second magnetic pole orientations of the first and second magnets are oriented about 30-60 degrees or about 45 degrees offset from a major axis of a front edge periphery of the robot body.
- the method comprises identifying a position or region of impact based on the impact profile of a bumper impact event.
- the method comprises determining a position of impact on the bumper frame based on a relative strength of the first and second signals.
- the method comprises determining a force of impact on the bumper frame based on a signal strength of the first and second signals.
- the method comprises defining at least five impact regions of the bumper based on signal profile ranges for the first and second signals, and identifying one of the impact regions for a bumper impact event based on the signal profile ranges for the first and second signals.
- the method comprises identifying a location of an obstacle based on the position or region of impact on the bumper frame.
- FIG. 1 is a perspective view of a robot contacting an object with a bumper frame according to some embodiments.
- FIG. 2 is an exploded perspective view of the robot of FIG. 1 with the top surface of the robot removed and interior components omitted for clarity, illustrating a sensor unit according to some embodiments.
- FIGS. 3 - 10 are schematic diagrams of a top view of a sensor unit and bumper frame according to some embodiments.
- FIGS. 11 and 12 are graphs of example signals for bumper frame impacts at the impact locations shown in FIGS. 4 and 5 , respectively.
- FIG. 13 is a plot of data from a bumper impact simulation.
- FIG. 14 is a flowchart of a method according to some embodiments.
- An autonomous floor-cleaning robot can include Hall sensors that can be used to detect a portion of the autonomous floor-cleaning robot that makes contact with an object, such as a piece of furniture or wall. Detecting the location of the impact is limited by the number of switches and suspension points that economically can be incorporated into the robot's mechanical geometry. For many robots, two switches, a left switch and a right switch, are used. At best, this allows for three detection zones, right, left, and center if both switches are triggered. In addition, such a robot cannot generally determine the degree or force of impact. The inventors have recognized, among other things, that an orientation of Hall sensors can be adjusted to increase a number of spatial regions or zones for a fixed number of Hall sensors, thus providing improved performance without the need for additional sensors. Moreover, a degree and/or force of impact may also be determined in some embodiments.
- FIGS. 1 and 2 illustrate a mobile, autonomous floor-cleaning robot 100 having a robot body 110 , a drive system 120 , and a bumper frame 130 on a front portion of the robot body 110 .
- the drive system 120 supports the robot body 110 above a floor surface for maneuvering the robot 100 across the floor surface.
- the bumper frame 130 of the robot 100 may contact an obstacle 10 in the cleaning environment with a given force and at a location on the robot bumper frame 130 .
- the bumper frame 130 moves some distance relative to the robot body 110 .
- the bumper frame 130 may move in a direction with respect to the robot body 110 (e.g., front-back, side-side, diagonally) depending on the location of the impact.
- the robot 100 also includes a processor 170 that is configured to execute various robot control actions and/or receive and analyze information from sensors on the robot 100 , including various sensors described herein.
- the processor 170 may also include mapping capabilities for analyzing information from sensors on the robot 100 to create or update a map of the environment using techniques such as Simultaneous Localization and Mapping (SLAM), for example, as described in U.S. Pat. No. 9,886,037.
- the robot 100 includes a bumper impact system 140 having two sensor units 150 , 160 , each at a different orientation with respect to the bumper frame 130 .
- the sensor unit 150 includes a magnetic field sensor or Hall effect sensor 152 coupled to robot body 110 and a magnet 154 with a magnetic field polarity P 1 coupled to the bumper frame 130
- the sensor unit 160 includes a magnetic field sensor or Hall effect sensor 162 coupled to robot body 110 and a magnet 164 with a magnetic field polarity P 2 coupled to the bumper frame 130 .
- the magnet 154 is mounted on the bumper frame 130 with a magnet receptacle 156
- the magnet 164 may be similarly mounted on the bumper frame 130 with a corresponding magnetic receptacle (not shown) on the opposite side of the bumper frame 130 .
- the magnets 154 , 164 move with respect to the corresponding Hall effect sensors 152 , 162 .
- the resulting change in magnetic field generates signals at the Hall effect sensors 152 , 162 that are responsive to the movement of the bumper frame 130 .
- the Hall effect sensors 152, 162 are positioned along a central axis of the respective magnet 154, 164 and the direction of the polarity P1 of the magnet 154 is at about a ninety degree angle from the direction of the polarity P2 of the magnet 164, with the magnets 154, 164 generally facing respective corners of the bumper frame 130. That is, the magnet 154 is at an orientation that is offset from the plane of the front surface of the bumper frame 130 and the forward direction of movement of the robot 100 by about 45 degrees in a first direction, and the magnet 164 is at an orientation that is offset from the plane of the front surface of the bumper frame 130 by about 45 degrees in a second direction that is opposite the first direction.
- Each of the Hall effect sensors 152 , 162 is configured to generate a signal that is responsive to the magnitude and direction of movement of the bumper frame 130 relative to the robot body 110 .
- the sensor units 150, 160 are at different orientations such that each of the Hall effect sensors 152, 162 detects a different respective change in the magnetic field from the magnets 154, 164 due to movement of the bumper frame 130 in response to an impact location. Accordingly, the signals generated by the Hall effect sensors 152, 162 may be used to identify a location or impact zone of an impact on the bumper frame 130.
- the Hall effect sensors 152 , 162 are typically most sensitive to movement of the magnets 154 , 164 along the direction of the polarity P 1 , P 2 of the magnets 154 , 164 . That is, the Hall effect sensors 152 , 162 generate a larger signal when the magnets 154 , 164 move along the direction of the polarity P 1 , P 2 of the magnets 154 , 164 .
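The directional sensitivity described above can be illustrated with a small numeric sketch that models each Hall sensor reading as the projection of the bumper displacement onto its magnet's polarity direction. The angles, gain constant, and coordinate convention below are illustrative assumptions, not values taken from the patent.

```python
import math

def hall_signal(displacement, polarity_angle_deg, gain=1.0):
    """Model a Hall sensor reading as the bumper displacement projected
    onto the magnet's polarity direction (a simplifying assumption)."""
    dx, dy = displacement
    theta = math.radians(polarity_angle_deg)
    # Unit vector along the magnet's pole orientation
    px, py = math.cos(theta), math.sin(theta)
    return gain * (dx * px + dy * py)

# Assumed geometry: magnets offset ~45 degrees in opposite directions
# from the front of the robot (+y points rearward into the robot body).
LEFT_POLARITY_DEG = 135.0   # left magnet faces the left front corner
RIGHT_POLARITY_DEG = 45.0   # right magnet faces the right front corner

# A front-center impact pushes the bumper straight back (+y):
front_hit = (0.0, 1.0)
left = hall_signal(front_hit, LEFT_POLARITY_DEG)
right = hall_signal(front_hit, RIGHT_POLARITY_DEG)

# A side impact pushes the bumper laterally (+x):
side_hit = (1.0, 0.0)
left_side = hall_signal(side_hit, LEFT_POLARITY_DEG)
right_side = hall_signal(side_hit, RIGHT_POLARITY_DEG)
```

In this toy model, a centered impact yields equal signals on both sensors, while a lateral impact yields signals of opposite sign, which is consistent with the negative-threshold behavior described later for side impacts.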
- the sensitivity of the Hall sensors 152 , 162 to identify locations of impact based on different movements of the bumper frame 130 may be increased.
- the impact profile based on the characteristic signals or ranges of signals corresponding to an impact location or region may be identified with sufficient precision to localize an impact in more than three bumper locations.
- five, seven or nine or more bumper locations may be identified, and in some embodiments, a coordinate location of an impact may be identified to within 1, 5, 10, 15, or 20 cm or less.
- Although the Hall effect sensors 152, 162 are illustrated as being positioned along a central axis of the respective magnet 154, 164, it should be understood that the location of the Hall effect sensors 152, 162 may be offset from the center of the respective magnets 154, 164 and/or offset with respect to the polarity P1, P2 of the respective magnets. Offsetting the magnets 154, 164 may reduce or avoid signal saturation of the respective Hall effect sensors 152, 162. Thus, the Hall effect sensors 152, 162 may be positioned at any suitable location so that changes in the magnetic field may be sensed due to movement of the magnets 154, 164 with the bumper frame 130.
- the magnets 154 , 164 are illustrated as being mounted on the bumper frame 130 by the magnet receptacle 156 and the Hall effect sensors 152 , 162 are mounted on the robot body 110 ; however, any suitable location may be used to detect changes in magnetic field due to movement of the bumper frame 130 .
- the Hall effect sensors 152 , 162 may be mounted on the bumper frame 130 and the magnets 154 , 164 may be mounted on the robot body 110 .
- the offset angle between the directions of the magnetic field polarities P 1 and P 2 is illustrated at about 90 degrees; however, different offset angles may be used, such as between 70 degrees and 110 degrees.
- the magnetic pole orientations or polarities P 1 , P 2 may be oriented about 45 degrees offset from a major axis A of a front edge periphery of the robot body 110 (or the plane of the front surface of the bumper frame 130 ); however, offsets of between about 30 and 60 degrees from the plane of the front surface of the bumper frame 130 may be used.
- a first signal is received from the first Hall effect sensor 152 at Block 200
- a second signal is received from the second Hall effect sensor 162 at Block 210 .
- the first and second signals may be received by the processor 170 ( FIG. 1 ), and the processor 170 may generate an impact profile of a bumper impact event based on the first and second signals from the sensors 152 , 162 responsive to the bumper frame 130 being moved with respect to the robot body 110 at Block 220 .
- the processor 170 may determine or estimate a position or region of impact based on characteristic signals or ranges of signals from the Hall effect sensors 152 , 162 .
- the position or region of impact may be based on a relative strength of the signals from the respective sensors 152 , 162 .
- For example, an impact in zone 2 as shown in FIG. 3 would result in a larger signal in the Hall effect sensor 152 than in the Hall effect sensor 162 due to greater movement of the bumper frame 130 closer to the sensor 152.
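As a rough sketch of this relative-strength comparison, a continuous left-right position estimate can be formed from the normalized difference of signal magnitudes. The function and scale below are hypothetical illustrations, not the patent's specific method.

```python
def estimate_impact_position(left_signal, right_signal):
    """Estimate a lateral impact position in [-1, 1] from relative
    signal strength: -1 ~ far left, 0 ~ front center, +1 ~ far right.
    Purely illustrative; real units would come from calibration."""
    total = abs(left_signal) + abs(right_signal)
    if total == 0:
        return None  # no measurable bumper movement
    return (abs(right_signal) - abs(left_signal)) / total

# A left-zone impact produces a larger left-sensor signal:
estimate_impact_position(0.9, 0.3)   # negative -> left of center
estimate_impact_position(0.6, 0.6)   # zero -> front center
```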
- the processor 170 may be located on the robot 100 as shown; however, in some embodiments, the signals from the robot sensors 152 , 162 may be transmitted to a remote processor for further analysis to generate an impact profile of a bumper impact event as described herein.
- the impact profile or characteristic signals or ranges of signals corresponding to an impact location or region may be experimentally determined based on actual impact signals, which may be stored in a memory of the processor 170 .
- Example signals are shown in FIGS. 11 and 12 .
- the characteristic signals or ranges of signals corresponding to an impact location or region may be generated by a simulation.
- a signal strength or rate of change of the signal of the sensors 152 , 162 may be used to determine a force of impact on the bumper frame.
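One way the signal strength or rate of change might be turned into a force estimate is sketched below. The linear gain mapping signal to force is an assumed calibration constant; the patent does not specify the mapping.

```python
def estimate_impact_force(samples, dt, gain=1.0):
    """Estimate impact force from a window of sensor readings.

    samples: sensor readings during the impact; dt: sample period (s).
    Returns (force_estimate, max_rate). The linear 'gain' is a
    hypothetical calibration constant, not from the patent.
    """
    peak = max(abs(s) for s in samples)
    # Rate of change between consecutive samples
    rates = [abs(b - a) / dt for a, b in zip(samples, samples[1:])]
    max_rate = max(rates) if rates else 0.0
    # A harder hit moves the bumper farther (peak) and faster (rate)
    return gain * peak, max_rate
```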
- Impact regions or zones 1-7 are illustrated in FIGS. 3-10.
- a hit or impact in zone 1, which is in the front and central region of the bumper frame 130, results in movement of the magnet 154 with respect to the Hall effect sensor 152 by a distance D1 and movement of the magnet 164 with respect to the Hall effect sensor 162 by a distance D2.
- the amount of movement at each of the sensors 152, 162 is about the same, i.e., D1 is about the same as D2.
- An example graph of the signals is shown in FIG. 11 for a front, center or zone 1 impact on the bumper frame 130 .
- A front, left or zone 2 impact on the bumper frame 130 is illustrated in FIG. 5.
- the relative movement of the magnet 154 with respect to the sensor 152 is a distance D3, and the relative movement of the magnet 164 with respect to the sensor 162 is a distance D4.
- An example graph of the signals is shown in FIG. 12 for a front, left or zone 2 impact on the bumper frame 130 .
- FIG. 6 illustrates respective relative movements at distances D 5 (negligible or zero) and D 6 for an impact in the front, right or zone 3.
- movement of the magnets 154, 164 and sensors 152, 162 opposite to that shown in FIG. 5 results from the impact being on the opposite side of the bumper frame 130.
- FIG. 7 illustrates a left, side or zone 4 impact with relative sensor movement in a horizontal direction a distance D 7 , D 8 .
- FIG. 8 illustrates a right, side or zone 5 impact with relative sensor movement in a horizontal direction a distance D 9 , D 10 .
- the signals from the sensors 152 , 162 due to the relative movement shown in FIGS. 7 and 8 may be used to identify the relevant impact zone.
- FIG. 9 illustrates a left, corner or zone 6 impact with relative sensor movement in a diagonal direction a distance D 11 , D 12 .
- FIG. 10 illustrates a right, corner or zone 7 impact with relative sensor movement in a diagonal direction a distance D13, D14.
- the signals from the sensors 152 , 162 due to the relative movement shown in FIGS. 9 and 10 may be used to identify the relevant impact zone.
- the bumper frame 130 is illustrated as being divided into seven impact zones, it should be understood that any number of impact zones may be used, such as from three impact zones to ten or twenty impact zones. By using two sensors in the configurations described herein, forces applied to the bumper frame 130 may be divided into at least five zones.
- an impact profile for an impact event is based on a threshold level of the sensor reading at a given time or during a time interval. For example, a maximum reading from the signal generated by the sensors 152 , 162 may be determined at a given time or time interval.
- the processor 170 may store a value for various thresholds, which can be used to classify a location of an impact event. The thresholds may be set based on actual data from known impacts for classification purposes or based on simulated or calculated values.
- three threshold values may be identified based on sensor values: a high threshold (T(2)), a low threshold (T(1)) and a negative threshold (T(−1)) of the sensor value for each sensor 152, 162.
- the negative threshold (T(−1)) indicates that the bumper moves slightly backwards in relationship to the sensor.
- the three thresholds for each sensor result in nine different combinations of thresholds, which may be used to divide the values into nine different zones as shown in the following Table 1.
- a high sensor value (greater than T(2)) on both of the sensors is a Zone 1 impact.
- a low sensor value is one greater than T(1), but less than T(2).
- Negative threshold values (less than T(−1)) occur during a Zone 4 or Zone 5 impact in which the side of the bumper opposite the impact travels backwards slightly.
- an eighth zone located below Zone 4 opposite Zone 6 and a ninth zone located below Zone 5 opposite Zone 6 may be defined by the following sensor conditions shown in Table 2:
- An example graph illustrating bumper impact data is shown in FIG. 13.
- the maximum values of the right and left sensors 152, 162 for each bumper impact are normalized and divided into five regions or gates: the front central zone, the right and left front zones, and the right and left side zones.
- the central portion of data points is below a chosen threshold where sensor readings indicate a force that is likely too small to be due to an impact on the bumper.
- multiple force/displacements were performed at each push point along the bumper. These force/displacements and corresponding sensor readings were used to build a model or graph that correlates bumper positions with the expected responses from each of the Hall effect sensors 152 , 162 .
- the data shown in FIG. 13 may be further used in a trainer or machine learning engine to develop predictions of what sensor readings correspond to impacts on the respective zones of the bumper.
- the data shown in FIG. 13 is combined with a calibration of the robot which determines where the minimum and maximum values are for each of the Hall effect sensors.
- a user may actuate the bumper in different directions to determine the highest and lowest analog values of the sensors 152 , 162 .
- the sensor values at the various locations of the bumper are then normalized to the actual minima/maxima values of the robot to generate the plot shown in FIG. 13 .
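The calibration and normalization step might be sketched as follows. The ADC count scale and calibration bounds are assumptions for illustration (the raw value echoes the scale of the plunger-test sensor readings tabulated below).

```python
def normalize(raw, sensor_min, sensor_max):
    """Normalize a raw analog Hall reading to [0, 1] using the minimum
    and maximum values found during a calibration pass in which the
    bumper is actuated in different directions."""
    span = sensor_max - sensor_min
    if span <= 0:
        raise ValueError("calibration range must be positive")
    # Clamp so readings outside the calibration range stay in [0, 1]
    return min(1.0, max(0.0, (raw - sensor_min) / span))

# Example with assumed calibration bounds of 600-1000 ADC counts:
normalize(861, 600, 1000)  # about 0.65
```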
- the data shown in FIG. 13 may be collected by an automated test in which plungers press on the bumper with a given force at different locations.
- An example of a subset of data collected by an automated bumper test using automated plunger pushes is provided below in Table 2. It should be understood that the following data set is an example and that additional data is typically needed to provide a more accurate and robust model of the bumper impacts.
- new bumper impact data may be analyzed.
- the processor of the robot can receive bumper impact data from the sensors 152 , 162 of the robot and classify the impact data, for example, based on which region of the plot in FIG. 13 the impact falls into.
- the identified impact zone may further be used to identify a location of an obstacle that the robot is contacting with the bumper.
- the location of the obstacle may be used to create or update a map of the environment using techniques such as Simultaneous Localization and Mapping (SLAM), for example, as described in U.S. Pat. No. 9,886,037.
- phrases such as “between X and Y” and “between about X and Y” should be interpreted to include X and Y.
- phrases such as “between about X and Y” mean “between about X and about Y.”
- phrases such as “from about X to Y” mean “from about X to about Y.”
- spatially relative terms such as “under,” “below,” “lower,” “over,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is inverted, elements described as “under” or “beneath” other elements or features would then be oriented “over” the other elements or features.
- the exemplary term “under” can encompass both an orientation of “over” and “under.”
- the device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
- the terms “upwardly,” “downwardly,” “vertical,” “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.
- These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instructions which implement the function/act specified in the block diagrams and/or flowchart block or blocks.
- the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
- the present invention may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.).
- embodiments of the present invention may take the form of a computer program product on a computer-usable or computer-readable non-transient storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system.
- the computer-usable or computer-readable medium may be, for example but not limited to, an electronic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, and a portable compact disc read-only memory (CD-ROM).
Description
| TABLE 1 | ||
| ZONE | LEFT SENSOR (LS) | RIGHT SENSOR (RS) |
| 1 (FRONT CENTER) | LS > T(2) | RS > T(2) |
| 2 (FRONT LEFT) | LS > T(2) | RS > T(1) |
| 3 (FRONT RIGHT) | LS > T(1) | RS > T(2) |
| 4 (LEFT SIDE) | LS > T(1) | RS < T(−1) |
| 5 (RIGHT SIDE) | LS < T(−1) | RS > T(1) |
| 6 (LEFT CORNER) | T(−1) < LS < T(1) | RS > T(1) |
| 7 (RIGHT CORNER) | LS > T(1) | T(−1) < RS < T(1) |
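The threshold combinations of Table 1 can be expressed directly as a zone lookup. The numeric thresholds below are placeholder assumptions; actual values of T(2), T(1), and T(−1) would come from calibration of the sensors 152, 162.

```python
# Placeholder thresholds (assumed, not from the patent):
T2, T1, T_NEG1 = 0.7, 0.3, -0.1

def classify_zone(ls, rs):
    """Map left/right sensor values to an impact zone per Table 1.
    Checks run in order, so zone 1 (both high) takes priority over
    the single-side-high zones 2 and 3."""
    if ls > T2 and rs > T2:
        return 1  # front center
    if ls > T2 and rs > T1:
        return 2  # front left
    if rs > T2 and ls > T1:
        return 3  # front right
    if ls > T1 and rs < T_NEG1:
        return 4  # left side (opposite side moves slightly backwards)
    if rs > T1 and ls < T_NEG1:
        return 5  # right side
    if rs > T1 and T_NEG1 < ls < T1:
        return 6  # left corner
    if ls > T1 and T_NEG1 < rs < T1:
        return 7  # right corner
    return None  # below threshold: likely too small to be an impact
```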
| TABLE 2 | ||
| ZONE | LEFT SENSOR (LS) | RIGHT SENSOR (RS) |
| 8 | LS < T(−1) | RS < T(1) |
| 9 | LS < T(1) | RS < T(−1) |
| TABLE 2 | ||||
| POSITION | FORCE | LEFT SENSOR | RIGHT SENSOR |
| 0 | 0.11 | 861 | 708 |
| 0.02 | 0.12 | 873 | 695 |
| 0.06 | 0.14 | 885 | 682 |
| 0.1 | 0.16 | 897 | 671 |
| 0.14 | 0.18 | 910 | 657 |
| 0.18 | 0.19 | 921 | 647 |
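The calibration rows above are monotonic: as bumper position and applied force increase, the left-sensor count rises while the right-sensor count falls. One way to invert such data is piecewise-linear interpolation; the sketch below is an illustrative assumption (the use of the left sensor alone and the interpolation scheme are not stated in this excerpt), with the sample points copied from the table:

```python
# Calibration samples copied from the table above (units not specified here).
POSITION = [0.0, 0.02, 0.06, 0.10, 0.14, 0.18]
LEFT = [861, 873, 885, 897, 910, 921]  # left-sensor counts

def position_from_left(count: float) -> float:
    """Estimate bumper displacement from a raw left-sensor count by
    piecewise-linear interpolation over the calibration samples,
    clamping outside the calibrated range."""
    if count <= LEFT[0]:
        return POSITION[0]
    if count >= LEFT[-1]:
        return POSITION[-1]
    for i in range(1, len(LEFT)):
        if count <= LEFT[i]:
            frac = (count - LEFT[i - 1]) / (LEFT[i] - LEFT[i - 1])
            return POSITION[i - 1] + frac * (POSITION[i] - POSITION[i - 1])
    return POSITION[-1]  # unreachable; keeps the return type total
```

For example, a count of 867 falls halfway between the 861 and 873 samples, so the estimate is halfway between positions 0 and 0.02.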
Claims (21)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/065,629 US11860638B2 (en) | 2018-09-07 | 2020-10-08 | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/125,051 US10824159B2 (en) | 2018-09-07 | 2018-09-07 | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
| US17/065,629 US11860638B2 (en) | 2018-09-07 | 2020-10-08 | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/125,051 Continuation US10824159B2 (en) | 2018-09-07 | 2018-09-07 | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20210026365A1 (en) | 2021-01-28 |
| US11860638B2 (en) | 2024-01-02 |
Family
ID=69720692
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/125,051 Active US10824159B2 (en) | 2018-09-07 | 2018-09-07 | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
| US17/065,629 Active 2039-10-09 US11860638B2 (en) | 2018-09-07 | 2020-10-08 | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/125,051 Active US10824159B2 (en) | 2018-09-07 | 2018-09-07 | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
Country Status (1)
| Country | Link |
|---|---|
| US (2) | US10824159B2 (en) |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9804594B2 (en) * | 2014-11-07 | 2017-10-31 | Clearpath Robotics, Inc. | Self-calibrating sensors and actuators for unmanned vehicles |
| US10824159B2 (en) | 2018-09-07 | 2020-11-03 | Irobot Corporation | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
| JP7664061B2 (en) * | 2021-03-17 | 2025-04-17 | 株式会社やまびこ | Working robot |
| US12397443B2 (en) * | 2021-10-29 | 2025-08-26 | 11712381 Canada Corporation | Robotics control and sensing system and method |
| WO2023112690A1 (en) * | 2021-12-13 | 2023-06-22 | ソニーグループ株式会社 | Mobile robot, control method, program, and switch module |
| US20240295882A1 (en) * | 2023-03-03 | 2024-09-05 | Samsung Electronics Co., Ltd. | Ultrasonic piezoelectric transceiver sensor for full surface contact localization |
Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030089267A1 (en) * | 2001-10-17 | 2003-05-15 | William Marsh Rice University | Autonomous robotic crawler for in-pipe inspection |
| US20080276407A1 (en) | 2007-05-09 | 2008-11-13 | Irobot Corporation | Compact Autonomous Coverage Robot |
| US20100300230A1 (en) * | 2007-10-19 | 2010-12-02 | Force Dimension | Device for Movement Between an Input Member and an Output Member |
| US20110202175A1 (en) | 2008-04-24 | 2011-08-18 | Nikolai Romanov | Mobile robot for cleaning |
| US20120311810A1 (en) | 2011-01-03 | 2012-12-13 | Irobot Corporation | Autonomous coverage robot |
| US20170215337A1 (en) | 2016-02-02 | 2017-08-03 | Irobot Corporation | Blade Assembly for a Grass Cutting Mobile Robot |
| US20170280960A1 (en) | 2005-02-18 | 2017-10-05 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
| US20170332857A1 (en) | 2016-05-20 | 2017-11-23 | Lg Electronics Inc. | Autonomous cleaner |
| US20180184583A1 (en) | 2016-12-30 | 2018-07-05 | Irobot Corporation | Robot lawn mower bumper system |
| US20180184585A1 (en) * | 2017-01-02 | 2018-07-05 | Lg Electronics Inc. | Lawn mower robot |
| US20180228333A1 (en) * | 2015-04-09 | 2018-08-16 | Irobot Corporation | Wall following robot |
| US20200081447A1 (en) | 2018-09-07 | 2020-03-12 | Irobot Corporation | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
- 2018-09-07: US application US16/125,051, granted as patent US10824159B2 (Active)
- 2020-10-08: US application US17/065,629, granted as patent US11860638B2 (Active)
Patent Citations (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030089267A1 (en) * | 2001-10-17 | 2003-05-15 | William Marsh Rice University | Autonomous robotic crawler for in-pipe inspection |
| US20170280960A1 (en) | 2005-02-18 | 2017-10-05 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
| US20080276407A1 (en) | 2007-05-09 | 2008-11-13 | Irobot Corporation | Compact Autonomous Coverage Robot |
| US20100300230A1 (en) * | 2007-10-19 | 2010-12-02 | Force Dimension | Device for Movement Between an Input Member and an Output Member |
| US20110202175A1 (en) | 2008-04-24 | 2011-08-18 | Nikolai Romanov | Mobile robot for cleaning |
| US20120311810A1 (en) | 2011-01-03 | 2012-12-13 | Irobot Corporation | Autonomous coverage robot |
| US20180228333A1 (en) * | 2015-04-09 | 2018-08-16 | Irobot Corporation | Wall following robot |
| US20170215337A1 (en) | 2016-02-02 | 2017-08-03 | Irobot Corporation | Blade Assembly for a Grass Cutting Mobile Robot |
| US20170332857A1 (en) | 2016-05-20 | 2017-11-23 | Lg Electronics Inc. | Autonomous cleaner |
| US20180184583A1 (en) | 2016-12-30 | 2018-07-05 | Irobot Corporation | Robot lawn mower bumper system |
| US10375880B2 (en) * | 2016-12-30 | 2019-08-13 | Irobot Corporation | Robot lawn mower bumper system |
| US20180184585A1 (en) * | 2017-01-02 | 2018-07-05 | Lg Electronics Inc. | Lawn mower robot |
| US20200081447A1 (en) | 2018-09-07 | 2020-03-12 | Irobot Corporation | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
| US10824159B2 (en) | 2018-09-07 | 2020-11-03 | Irobot Corporation | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods |
Non-Patent Citations (11)
Also Published As
| Publication number | Publication date |
|---|---|
| US20210026365A1 (en) | 2021-01-28 |
| US10824159B2 (en) | 2020-11-03 |
| US20200081447A1 (en) | 2020-03-12 |
Similar Documents
| Publication | Title |
|---|---|
| US11860638B2 (en) | Autonomous floor-cleaning robot having obstacle detection force sensors thereon and related methods | |
| AU2010286429B2 (en) | Method and apparatus for simultaneous localization and mapping of mobile robot environment | |
| JP4852425B2 (en) | Automatic movable floor dust collector | |
| Wijk et al. | Triangulation-based fusion of sonar data with application in robot pose tracking | |
| US10783363B2 (en) | Method of creating map by identifying moving object, and robot implementing the method | |
| TW201813572A (en) | Self-propelled surface treating device | |
| CN108177162A (en) | The interference region setting device of mobile robot | |
| CN105091885B (en) | Robot and self-position estimate method | |
| KR102012548B1 (en) | Method of identifying enterable region of robot and robot implementing thereof | |
| CN109813466A (en) | Tactile Sensor with Slip Sensing | |
| CN103019446A (en) | Bunting position and energy measurement method based on wave propagation time and energy function | |
| CN207650645U (en) | A kind of sweeper | |
| KR102100219B1 (en) | Floating population detection system using multiple PIR sensors and method of detecting and counting floating population using the same | |
| Xia et al. | Improved Capacitive Proximity Detection for Conductive Objects through Target Profile Estimation | |
| CN107850977B (en) | Mapping of position measurements to objects using mobile models | |
| CN111166237A (en) | Sweeping robot and its multidirectional collision detection device and method | |
| CN109645900B (en) | Collision detection device and dust collection robot | |
| WO2015141353A1 (en) | Input apparatus | |
| Yang et al. | Cliff-sensor-based Low-level Obstacle Detection for a Wheeled Robot in an Indoor Environment | |
| CN109702770A (en) | A kind of anti-collision sensor for mobile robot | |
| CN117182920A (en) | Robot collision detection device and method and robot | |
| KR20170092608A (en) | Switch actuating device, mobile device, and method for actuating a switch by means of a non-tactile gesture | |
| WO2019210520A1 (en) | Space partitioning-based detection device, system and method | |
| CN119319580A (en) | Offline detection method, device and system for cleaning robot | |
| Nisa | People Detection from Laser Range Finder Data |
Legal Events
| Code | Title | Description |
|---|---|---|
| FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STPP | Information on status: patent application and granting procedure in general | APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| AS | Assignment | Owner: IROBOT CORPORATION, MASSACHUSETTS. ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: BAUTISTA, DEXTER; FOWLER, ISAAC; GRAZIANI, ANDREW; AND OTHERS; SIGNING DATES FROM 20180919 TO 20181023; REEL/FRAME: 060263/0852 |
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| AS | Assignment | Owner: BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT, NORTH CAROLINA. SECURITY INTEREST; ASSIGNOR: IROBOT CORPORATION; REEL/FRAME: 061878/0097. Effective date: 20221002 |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| STCV | Information on status: appeal procedure | NOTICE OF APPEAL FILED |
| AS | Assignment | Owner: IROBOT CORPORATION, MASSACHUSETTS. RELEASE BY SECURED PARTY; ASSIGNOR: BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT; REEL/FRAME: 064430/0001; effective date: 20230724. Also recorded as RELEASE OF SECURITY INTEREST; same assignor, reel/frame, and effective date |
| AS | Assignment | Owner: TCG SENIOR FUNDING L.L.C., AS COLLATERAL AGENT, NEW YORK. SECURITY INTEREST; ASSIGNOR: IROBOT CORPORATION; REEL/FRAME: 064532/0856. Effective date: 20230807 |
| STCV | Information on status: appeal procedure | APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | PATENTED CASE |