US7957836B2 - Method used by robot for simultaneous localization and map-building - Google Patents
Method used by robot for simultaneous localization and map-building
- Publication number
- US7957836B2 US7957836B2 US11/175,396 US17539605A US7957836B2 US 7957836 B2 US7957836 B2 US 7957836B2 US 17539605 A US17539605 A US 17539605A US 7957836 B2 US7957836 B2 US 7957836B2
- Authority
- US
- United States
- Prior art keywords
- robot
- landmarks
- landmark
- chromosomes
- chromosome
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Links
- 238000000034 method Methods 0.000 title claims abstract description 70
- 230000004807 localization Effects 0.000 title claims abstract description 19
- 210000000349 chromosome Anatomy 0.000 claims abstract description 50
- 238000005070 sampling Methods 0.000 claims abstract description 7
- 239000002245 particle Substances 0.000 claims description 29
- 239000011159 matrix material Substances 0.000 claims description 4
- 238000006073 displacement reaction Methods 0.000 claims description 3
- 230000035772 mutation Effects 0.000 claims description 3
- 238000012545 processing Methods 0.000 claims description 3
- 238000004364 calculation method Methods 0.000 description 9
- 238000012360 testing method Methods 0.000 description 8
- 230000008569 process Effects 0.000 description 5
- 230000008901 benefit Effects 0.000 description 3
- 230000002068 genetic effect Effects 0.000 description 2
- 238000007476 Maximum Likelihood Methods 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000000546 chi-square test Methods 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 238000013500 data storage Methods 0.000 description 1
- 230000004069 differentiation Effects 0.000 description 1
- 239000006185 dispersion Substances 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000004088 simulation Methods 0.000 description 1
- 230000004083 survival effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/12—Computing arrangements based on biological models using genetic models
- G06N3/126—Evolutionary algorithms, e.g. genetic algorithms or genetic programming
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S1/00—Beacons or beacon systems transmitting signals having a characteristic or characteristics capable of being detected by non-directional receivers and defining directions, positions, or position lines fixed relatively to the beacon transmitters; Receivers co-operating therewith
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/19—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by positioning or contouring control systems, e.g. to control position from one programmed point to another or to control movement along a programmed continuous path
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
Definitions
- the present invention relates to a method used by a robot for localization and map-building, and more particularly, to a method used by a mobile robot for simultaneous localization and map-building (SLAM).
- SLAM is an abbreviation for simultaneous localization and map-building.
- In order for a robot to navigate through non-trivial surroundings, it must localize itself and build a map of its surroundings. The map thus built makes it possible for the robot to plan its path, manipulate an object, or communicate with humans.
- In order to navigate through unknown surroundings, a robot has to build a map while localizing itself. However, since the robot localizes itself and builds the map using noisy sensor data, the calculation is difficult.
- Localization means determining the absolute location of a robot in its surroundings using sensor information, beacons, natural landmarks, etc. Since there are several sources of error during navigation (a wheel slipping on the ground, a change in wheel diameter, etc.), the resulting error requires correction.
- Map-building models the surroundings by observing natural or man-made landmarks based on the sensor data. Such modeling makes it possible for the robot to plan its path. For complex surroundings, a reliable map can be built only when localization is guaranteed. Therefore, a method of simultaneously performing localization and map-building within a specified short time is required.
- An aspect of the present invention provides a method used by a robot for simultaneous localization and map-building which estimates the path of the robot using a particle filter and estimates the locations of landmarks by introducing evolutionary computation to build a map.
- a method used by a robot for simultaneous localization and map-building including: initializing a pose of the robot and locations of landmarks; sampling a new pose of the robot during motion of the robot, and constructing chromosomes using the locations of the landmarks; observing the landmarks from a present location of the robot; generating offspring from the chromosomes; and selecting next-generation chromosomes from the chromosomes and the offspring using observation values of the landmarks.
- a method of simultaneous localization and map-building including: initializing a pose of a robot and a location of a landmark, the pose including a direction in which a front of the robot faces and x,y coordinates indicating a location of the robot; sampling a new pose of the robot as the robot moves; constructing a chromosome for an evolutionary computation, the chromosome indicating the location of the landmark and being an object in the evolutionary computation; observing the landmark from the new pose; determining whether a new landmark is present and, if so, initializing a location of the new landmark using an observed distance and angle from the robot to the landmark; generating, when a new landmark is determined not to be present, offspring from a present parent chromosome according to the evolutionary computation method; evaluating fitness of the parent and the offspring, fitness being defined as an objective function according to a difference between an observation value and a prediction value of each landmark; and selecting a next-generation chromosome from the parents and the offspring.
- the aforementioned methods can be realized by computer-readable storage media encoded with processing instructions for causing a processor to perform the operations of the methods.
- FIG. 1 is a flowchart illustrating a method of simultaneous localization and map-building according to an embodiment of the present invention
- FIG. 2 illustrates an example of a robot observing a landmark
- FIG. 3 describes test surroundings used to test one embodiment of the present invention
- FIG. 4 illustrates a robot to which an embodiment of the present invention is applied
- FIGS. 5A and 5B illustrate test results of methods for simultaneous localization and map-building of the conventional art and an embodiment of the present invention, respectively, in a case where the number of particles is 100 and the number of landmarks is 100;
- FIGS. 6A and 6B illustrate test results of methods for simultaneous localization and map-building of the conventional art and an embodiment of the present invention, respectively, in a case where the number of particles is 100 and the number of landmarks is 200; and
- FIG. 7 is an error-bar plot of the average calculation time for 10, 100, 250, and 500 landmarks when the number of particles is 100, according to the conventional art and an embodiment of the present invention, the calculation time being measured 300 times per iteration over a total of 20 iterations.
- FIG. 1 is a flowchart illustrating a method of simultaneous localization and map-building according to an embodiment of the present invention.
- First, the pose (i.e., orientation) of the robot and the locations of landmarks are initialized.
- the pose of the robot includes the direction in which the front of the robot faces, in addition to (x,y) coordinates indicating the location of the robot. Since the present embodiment adopts (i.e., uses) a particle filter to localize the robot, a plurality of particles are generated around the initial pose of the robot for the initialization.
- the surrounding of the initial pose means a range determined experimentally and centered around the initial pose.
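- The following is an illustrative, non-authoritative sketch of such a particle initialization (the patent gives no code); the particle count M, the spread INIT_SPREAD, and the uniform scattering are assumptions:

```python
import numpy as np

M = 100            # number of particles (the tests described below use 100)
INIT_SPREAD = 0.1  # assumed, experimentally chosen spread around the initial pose

def init_particles(x0, y0, theta0, m=M, spread=INIT_SPREAD, rng=None):
    """Scatter m particles (x, y, theta) around the initial robot pose."""
    if rng is None:
        rng = np.random.default_rng()
    particles = np.empty((m, 3))
    particles[:, 0] = x0 + rng.uniform(-spread, spread, m)      # x
    particles[:, 1] = y0 + rng.uniform(-spread, spread, m)      # y
    particles[:, 2] = theta0 + rng.uniform(-spread, spread, m)  # heading
    return particles
```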
- FIG. 2 illustrates an example of observing a landmark from the robot.
- reference numeral 20 indicates a robot
- 21 indicates a landmark
- 22 indicates a direction in which the robot faces. The observation may be expressed as the distance r from the robot 20 to the landmark 21 and the angle φ between the direction 22 of the robot 20 and the direction of the landmark 21.
- the initialization of the landmark location can be performed using an inverse function of the observation function g(s_t, θ_n_t) below, based on the observed r, φ values. The value t indicates the initial time, i.e., 0; after the initialization, t indicates the t-th time step.
- θ_n_t indicates the (x,y) coordinates of the landmark n_t.
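- A minimal sketch of this landmark initialization, assuming an observation (r, φ) and the convention of Equation 2 below (μ_x,t = s_t,x + r·cos(φ + s_t,θ), μ_y,t = s_t,y + r·sin(φ + s_t,θ)); the function and variable names are illustrative:

```python
import math

def init_landmark(pose, r, phi):
    """Invert the observation: place the landmark at range r and bearing phi
    measured from the robot pose (x, y, theta), per Equation 2."""
    x, y, theta = pose
    mu_x = x + r * math.cos(phi + theta)
    mu_y = y + r * math.sin(phi + theta)
    return mu_x, mu_y
```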
- a new pose of the robot is sampled (Operation 11 ).
- a new pose is sampled using a particle filter described below.
- the u denotes a motion command or a desired motion vector
- n ∈ {1, . . . , K} denotes a landmark number
- the z denotes an observation value of the location and direction of the landmark.
- an end point of each path at time t, i.e., the robot pose s_t^[m], can be calculated by Equation 5 according to the end point s_t-1^[m] of the path s^t-1,[m] and the motion model of Equation 4 below.
- p(s_t | u_t, s_t-1)   [Equation 4]
- s_t^[m] ~ p(s_t | u_t, s_t-1^[m])   [Equation 5]
- New particles are distributed in the particle population S_t^p according to the probability density function below.
- p(s_t | z^t-1, u^t, n^t-1)   [Equation 6]
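- A sketch of this sampling step under an assumed velocity motion model with Gaussian noise; the patent does not specify the motion model, so the command u = (v, ω), the noise levels, and the time step are illustrative assumptions:

```python
import numpy as np

def sample_motion(particles, u, dt=1.0, noise=(0.02, 0.02, 0.01), rng=None):
    """Draw s_t[m] ~ p(s_t | u_t, s_{t-1}[m]) for every particle (Equation 5)."""
    if rng is None:
        rng = np.random.default_rng()
    v, w = u                                   # commanded translational/rotational velocity (assumed)
    m = len(particles)
    v_n = v + rng.normal(0.0, noise[0], m)     # noisy translational velocity
    w_n = w + rng.normal(0.0, noise[1], m)     # noisy rotational velocity
    theta = particles[:, 2]
    particles[:, 0] += v_n * dt * np.cos(theta)
    particles[:, 1] += v_n * dt * np.sin(theta)
    particles[:, 2] += w_n * dt + rng.normal(0.0, noise[2], m)  # heading plus small jitter
    return particles
```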
- chromosomes are constructed for an evolutionary computation (Operation 12 ).
- a chromosome, which is expressed as an object in the evolutionary computation method, indicates in the present embodiment the locations of the landmarks discovered at each particle location.
- the evolutionary computation method is a calculation model used to find an optimal solution for a given problem.
- the optimal solution is found by representing potential solutions to real-world problems as coded objects on the computer, collecting several objects to form an object group, and performing an evolution simulation within the object group according to survival of the fittest, exchanging genetic information among the objects or furnishing new genetic information to them as generations go by.
- the chromosome, i.e., the landmark location (μ′_x, μ′_y), can be obtained from the value (μ_x, μ_y) predicted at the previous time, as described below:
- μ′_ix = μ_ix
- μ′_iy = μ_iy   [Equation 7]
- the landmark location can also be adjusted to account for the displacement of each particle from the average pose of the robot; in other words, the landmark location may be determined by subtracting the particle's relative displacement (dx, dy) from the coordinates above (Equation 8), as sketched below.
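- A sketch of the chromosome construction under the reading above: each chromosome holds the landmark coordinates (Equation 7), optionally shifted by the particle's displacement (dx, dy) from the average particle pose (Equation 8); the sign convention of dx and dy is an assumption based on the description:

```python
import numpy as np

def construct_chromosome(landmarks, particle_pose=None, mean_pose=None):
    """Chromosome = array of landmark locations (mu_x, mu_y) per Equation 7;
    if a particle pose and the average pose are given, apply the adjustment of
    Equation 8: mu'_i = mu_i - (particle_xy - mean_xy)."""
    chromosome = np.asarray(landmarks, dtype=float).copy()   # shape (N, 2)
    if particle_pose is not None and mean_pose is not None:
        dx = particle_pose[0] - mean_pose[0]
        dy = particle_pose[1] - mean_pose[1]
        chromosome[:, 0] -= dx
        chromosome[:, 1] -= dy
    return chromosome
```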
- the landmark is observed, as shown in FIG. 2, from each pose of the paths obtained in Equation 6, using a sensor such as a laser or ultrasonic sensor (Operation 13).
- a new landmark can be determined by a known data association method. For example, a maximum likelihood method, a nearest neighbor method, or a Chi-square test method may be used as the data association method.
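- Of the data association methods named above, the nearest-neighbor method is the simplest to sketch; the example below assumes landmarks and observations expressed as (x, y) points and an illustrative gate threshold, and is not the patent's implementation:

```python
import numpy as np

def associate_nearest(observed_xy, landmarks, gate=0.5):
    """Return the index of the closest known landmark, or -1 to flag a new
    landmark when no landmark lies within the (assumed) gate distance."""
    if len(landmarks) == 0:
        return -1
    d = np.linalg.norm(np.asarray(landmarks) - np.asarray(observed_xy), axis=1)
    j = int(np.argmin(d))
    return j if d[j] < gate else -1
```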
- if a landmark is determined to be a new landmark, the location of the new landmark is initialized using the observed r, φ values according to Equations 1 and 2 (Operation 15).
- offspring is generated from the present chromosome according to the evolutionary computation method (Operation 16 ).
- the evolutionary computation method can use a random distribution.
- Non-limiting examples of such a random distribution include a Gaussian distribution and a Cauchy distribution.
- the offspring μ_i,t is generated from the parent μ_i,t-1 according to a Gaussian mutation method using a Gaussian distribution, as described below.
- a different distribution such as the Cauchy distribution may be used.
- μ_i,t = μ_i,t-1 + σ_i,t · N_i(0,1)   [Equation 9]
- N_i(0,1) denotes a random value for the i th landmark according to the Gaussian distribution of mean 0 and variance 1
- σ_i,t denotes the variance of the i th landmark.
- in Equation 9, the variance σ_i,t, which multiplies the Gaussian random value, is obtained as described below.
- σ_i,t = σ_i,t-1 · exp(τ′ · N(0,1) + τ · N_i(0,1))   [Equation 10]
- N(0,1) is a single Gaussian random value shared by every landmark, and τ′ and τ are constants determined according to the number of landmarks.
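- A sketch of this Gaussian (self-adaptive) mutation; since the patent only states that τ′ and τ depend on the number of landmarks, the common evolution-strategy choices τ′ = 1/√(2N) and τ = 1/√(2√N) are used here as assumptions:

```python
import numpy as np

def gaussian_mutation(mu_parent, sigma_parent, rng=None):
    """Generate offspring landmark locations (Equations 9 and 10).
    mu_parent, sigma_parent: (N, 2) arrays of locations and mutation variances."""
    if rng is None:
        rng = np.random.default_rng()
    n = mu_parent.shape[0]
    tau_prime = 1.0 / np.sqrt(2.0 * n)            # assumed constants
    tau = 1.0 / np.sqrt(2.0 * np.sqrt(n))
    common = rng.normal()                          # N(0,1), shared by all landmarks
    per_landmark = rng.normal(size=mu_parent.shape)
    sigma_child = sigma_parent * np.exp(tau_prime * common + tau * per_landmark)  # Eq. 10
    mu_child = mu_parent + sigma_child * rng.normal(size=mu_parent.shape)         # Eq. 9
    return mu_child, sigma_child
```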
- the fitness of the parent and the offspring is evaluated using the objective function of Equation 11, w_t = (z_t − ẑ_n_t,t)^T R^−1 (z_t − ẑ_n_t,t), where T denotes a transpose, R denotes a constant covariance matrix, z_t denotes an observation value, and ẑ_n_t,t is a prediction value.
- the prediction value is the predicted relative distance and angle of the landmark from the present location of the robot, obtained by setting θ_n_t to (μ′_x, μ′_y) in the observation function of Equation 1 and setting s_t to the prediction value of the robot pose according to Equation 5.
- Landmarks selected according to the objective function of Equation 11 and landmarks initialized in Operation 15 are taken as next-generation landmarks (Operation 17).
- the next generation is selected (Operation 18 ).
- the selection is made using a roulette-wheel method, a random competition method, a tournament method, etc., as used in evolutionary computation, based on the calculated values of the objective function of Equation 11; a sketch of one option is given below.
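- A sketch of the fitness evaluation and of one of the selection schemes named above (tournament selection); the quadratic form of the objective function follows the reading of Equation 11 given earlier, and treating a lower value as fitter is an assumption consistent with its interpretation as an observation-prediction distance:

```python
import numpy as np

def objective(z_obs, z_pred, R_inv):
    """Objective function of Equation 11: quadratic distance between the
    observation z_t and the prediction z_hat, weighted by R^-1."""
    d = np.asarray(z_obs) - np.asarray(z_pred)
    return float(d @ R_inv @ d)

def tournament_select(candidates, scores, k=2, rng=None):
    """Pick k random candidates and keep the one with the lowest objective value."""
    if rng is None:
        rng = np.random.default_rng()
    idx = rng.choice(len(candidates), size=k, replace=False)
    best = idx[int(np.argmin(np.asarray(scores)[idx]))]
    return candidates[best]
```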
- FIG. 3 shows an example of test surroundings used to test one embodiment of the present invention, in which 100 landmarks are randomly generated in a two-dimensional plane of 7 m ⁇ 7 m.
- FIG. 4 illustrates a robot to which an embodiment of the present invention is applied.
- Reference numeral 30 indicates a robot
- reference numeral 31 indicates a sensor
- reference numeral 32 indicates a landmark.
- the variance R(r) of the distance from the sensor 31 to the landmark 32 increases linearly, growing by 1/4 whenever the distance exceeds 1 meter.
- FIGS. 5A and 5B illustrate test results of the conventional method using a Kalman filter and of the method for simultaneous localization and map-building of the present invention in a case where the number of both particles and landmarks is 100.
- FIG. 5A illustrates an error of the robot location with respect to the time step, i.e., √(x_err² + y_err²).
- FIG. 5B illustrates an error of the landmark location with respect to the time step.
- the conventional results and the results of the present embodiment show a similar error level for the robot pose; however, it can be seen that the landmark location estimate of the present embodiment converges faster than that of the conventional art.
- FIGS. 6A and 6B illustrate test results of the methods for simultaneous localization and map-building of the conventional art and of the present embodiment, respectively, in a case where the number of particles is 100 and the number of landmarks is 200.
- FIG. 6A illustrates an error of the robot location with respect to the time step, and FIG. 6B illustrates an error of the landmark location with respect to the time step.
- again, the conventional results and the results of the present embodiment show a similar error level for the robot pose; however, the landmark location estimate of the present embodiment converges faster than that of the conventional art.
- FIG. 7 is an error-bar plot of the average calculation time for 10, 100, 250, and 500 landmarks when the number of particles is 100, for the conventional art and for the present embodiment; the calculation time is measured 300 times per iteration over a total of 20 iterations.
- the average time is shown as a broken-line graph, and the standard deviation at each point is shown as the length of a T-shaped error bar.
- the present embodiment becomes increasingly faster than the conventional art as the number of landmarks increases.
- when the number of landmarks is 500, the present embodiment is about 40 times as fast as the conventional art.
- Embodiments of the present invention may be realized in a computer-readable recording medium as a computer-readable code.
- the computer-readable recording medium includes every kind of recording device that stores data readable by a computer system.
- Examples include ROM, RAM, CD-ROMs, magnetic tape, floppy discs, and optical data storage.
- the computer-readable recording medium also includes realization in the form of a carrier wave (e.g., transmission through the Internet).
- the computer-readable recording medium can also be distributed over network-connected computer systems, so that the computer-readable code is stored and executed in a distributed fashion.
- the above-described embodiment of the present invention avoids calculations such as matrix inversion and differentiation, which the conventional art requires for setting the locations of landmarks, thereby reducing calculation time.
- the evolutionary computation inherently enables parallel processing, thereby greatly reducing calculation time when multiple processors are used; a sketch of such parallel evaluation follows.
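- As an illustration of how this parallelism could be exploited (not the patent's implementation), the per-chromosome objective values can be computed in independent worker processes:

```python
from concurrent.futures import ProcessPoolExecutor

def evaluate_population(chromosomes, score_fn):
    """Score each chromosome independently; because no chromosome depends on
    another, the evolutionary evaluation step maps cleanly onto multiple
    processes. score_fn must be a picklable (module-level) function."""
    with ProcessPoolExecutor() as pool:
        return list(pool.map(score_fn, chromosomes))
```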
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Radar, Positioning & Navigation (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- General Health & Medical Sciences (AREA)
- Remote Sensing (AREA)
- Business, Economics & Management (AREA)
- Automation & Control Theory (AREA)
- Aviation & Aerospace Engineering (AREA)
- Physiology (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Genetics & Genomics (AREA)
- Multimedia (AREA)
- Biomedical Technology (AREA)
- Tourism & Hospitality (AREA)
- Economics (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Computer Networks & Wireless Communication (AREA)
- Human Computer Interaction (AREA)
- Manufacturing & Machinery (AREA)
- Manipulator (AREA)
Abstract
Description
μ_x,t = s_t,x + r·cos(φ + s_t,θ)
μ_y,t = s_t,y + r·sin(φ + s_t,θ)   [Equation 2]
p(s^t-1 | z^t-1, u^t-1, n^t-1)   [Equation 3]
p(s_t | u_t, s_t-1)   [Equation 4]
s_t^[m] ~ p(s_t | u_t, s_t-1^[m])   [Equation 5]
p(s_t | z^t-1, u^t, n^t-1)   [Equation 6]
μ′_ix = μ_ix
μ′_iy = μ_iy   [Equation 7]
μ′_ix = μ_ix − dx
μ′_iy = μ_iy − dy, (i = 1, . . . , N)   [Equation 8]
where dx = x − x̄ and dy = y − ȳ (x̄, ȳ: average particle pose)
μ_i,t = μ_i,t-1 + σ_i,t · N_i(0,1)   [Equation 9]
σ_i,t = σ_i,t-1 · exp(τ′ · N(0,1) + τ · N_i(0,1))   [Equation 10]
w_t = (z_t − ẑ_n_t,t)^T R^−1 (z_t − ẑ_n_t,t)   [Equation 11]
Claims (22)
μ_x,t = s_t,x + r·cos(φ + s_t,θ)
μ_y,t = s_t,y + r·sin(φ + s_t,θ).
μ′_ix = μ_ix − dx
μ′_iy = μ_iy − dy, (i = 1, . . . , N)
dx = x − x̄
dy = y − ȳ
μ_i,t = μ_i,t-1 + σ_i,t · N_i(0,1),
σ_i,t = σ_i,t-1 · exp(τ′ · N(0,1) + τ · N_i(0,1)),
w_t = (z_t − ẑ_n_t,t)^T R^−1 (z_t − ẑ_n_t,t)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2004-0061790 | 2004-08-05 | ||
KR1020040061790A KR100601960B1 (en) | 2004-08-05 | 2004-08-05 | Simultaneous localization and map building method for robot |
Publications (2)
Publication Number | Publication Date |
---|---|
US20060041331A1 US20060041331A1 (en) | 2006-02-23 |
US7957836B2 true US7957836B2 (en) | 2011-06-07 |
Family
ID=35910640
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/175,396 Expired - Fee Related US7957836B2 (en) | 2004-08-05 | 2005-07-07 | Method used by robot for simultaneous localization and map-building |
Country Status (2)
Country | Link |
---|---|
US (1) | US7957836B2 (en) |
KR (1) | KR100601960B1 (en) |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080127445A1 (en) * | 2005-02-18 | 2008-06-05 | Irobot Corporation | Autonomous surface cleaning robot for wet cleaning |
US20080294338A1 (en) * | 2005-12-09 | 2008-11-27 | Nakju Doh | Method of Mapping and Navigating Mobile Robot by Artificial Landmark and Local Coordinate |
US20090007366A1 (en) * | 2005-12-02 | 2009-01-08 | Irobot Corporation | Coverage Robot Mobility |
US20100032853A1 (en) * | 2008-08-11 | 2010-02-11 | Nitto Denko Corporation | Method for manufacturing optical waveguide |
US20100049364A1 (en) * | 2002-09-13 | 2010-02-25 | Irobot Corporation | Navigational Control System for a Robotic Device |
US20100263142A1 (en) * | 2001-06-12 | 2010-10-21 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US20100275405A1 (en) * | 2005-02-18 | 2010-11-04 | Christopher John Morse | Autonomous surface cleaning robot for dry cleaning |
US20110268349A1 (en) * | 2010-05-03 | 2011-11-03 | Samsung Electronics Co., Ltd. | System and method building a map |
US8239992B2 (en) | 2007-05-09 | 2012-08-14 | Irobot Corporation | Compact autonomous coverage robot |
US8253368B2 (en) | 2004-01-28 | 2012-08-28 | Irobot Corporation | Debris sensor for cleaning apparatus |
US8368339B2 (en) | 2001-01-24 | 2013-02-05 | Irobot Corporation | Robot confinement |
US8374721B2 (en) | 2005-12-02 | 2013-02-12 | Irobot Corporation | Robot system |
US8380350B2 (en) | 2005-12-02 | 2013-02-19 | Irobot Corporation | Autonomous coverage robot navigation system |
US8390251B2 (en) | 2004-01-21 | 2013-03-05 | Irobot Corporation | Autonomous robot auto-docking and energy management systems and methods |
US8387193B2 (en) | 2005-02-18 | 2013-03-05 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US8396592B2 (en) | 2001-06-12 | 2013-03-12 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US8412377B2 (en) | 2000-01-24 | 2013-04-02 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US8417383B2 (en) | 2006-05-31 | 2013-04-09 | Irobot Corporation | Detecting robot stasis |
US8418303B2 (en) | 2006-05-19 | 2013-04-16 | Irobot Corporation | Cleaning robot roller processing |
US8428778B2 (en) | 2002-09-13 | 2013-04-23 | Irobot Corporation | Navigational control system for a robotic device |
US8474090B2 (en) | 2002-01-03 | 2013-07-02 | Irobot Corporation | Autonomous floor-cleaning robot |
US8515578B2 (en) | 2002-09-13 | 2013-08-20 | Irobot Corporation | Navigational control system for a robotic device |
US8584307B2 (en) | 2005-12-02 | 2013-11-19 | Irobot Corporation | Modular robot |
US8594840B1 (en) | 2004-07-07 | 2013-11-26 | Irobot Corporation | Celestial navigation system for an autonomous robot |
US8780342B2 (en) | 2004-03-29 | 2014-07-15 | Irobot Corporation | Methods and apparatus for position estimation using reflected light sources |
US8788092B2 (en) | 2000-01-24 | 2014-07-22 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US8800107B2 (en) | 2010-02-16 | 2014-08-12 | Irobot Corporation | Vacuum brush |
US8930023B2 (en) | 2009-11-06 | 2015-01-06 | Irobot Corporation | Localization by learning of wave-signal distributions |
US8972052B2 (en) | 2004-07-07 | 2015-03-03 | Irobot Corporation | Celestial navigation system for an autonomous vehicle |
US9008835B2 (en) | 2004-06-24 | 2015-04-14 | Irobot Corporation | Remote control scheduler and method for autonomous robotic device |
US9320398B2 (en) | 2005-12-02 | 2016-04-26 | Irobot Corporation | Autonomous coverage robots |
US11199853B1 (en) | 2018-07-11 | 2021-12-14 | AI Incorporated | Versatile mobile platform |
US11254002B1 (en) | 2018-03-19 | 2022-02-22 | AI Incorporated | Autonomous robotic device |
US11320828B1 (en) | 2018-03-08 | 2022-05-03 | AI Incorporated | Robotic cleaner |
US11340079B1 (en) | 2018-05-21 | 2022-05-24 | AI Incorporated | Simultaneous collaboration, localization, and mapping |
US11402215B2 (en) | 2018-12-31 | 2022-08-02 | Twinny Co., Ltd. | Indoor positioning method for a moving apparatus using first and second two-dimensional maps of z-axis areas |
US11454981B1 (en) | 2018-04-20 | 2022-09-27 | AI Incorporated | Versatile mobile robotic device |
US11548159B1 (en) | 2018-05-31 | 2023-01-10 | AI Incorporated | Modular robot |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100776215B1 (en) | 2005-01-25 | 2007-11-16 | 삼성전자주식회사 | Apparatus and method for estimating location and generating map of mobile body, using upper image, computer-readable recording media storing computer program controlling the apparatus |
AU2006284577B2 (en) * | 2005-09-02 | 2012-09-13 | Neato Robotics, Inc. | Multi-function robotic device |
US8996172B2 (en) * | 2006-09-01 | 2015-03-31 | Neato Robotics, Inc. | Distance sensor system and method |
KR100809352B1 (en) * | 2006-11-16 | 2008-03-05 | 삼성전자주식회사 | Method and apparatus of pose estimation in a mobile robot based on particle filter |
KR101202695B1 (en) * | 2008-10-01 | 2012-11-19 | 무라다기카이가부시끼가이샤 | Autonomous movement device |
US8340852B2 (en) * | 2009-04-29 | 2012-12-25 | Honeywell International Inc. | System and method for simultaneous localization and map building |
KR101686171B1 (en) * | 2010-06-08 | 2016-12-13 | 삼성전자주식회사 | Apparatus for recognizing location using image and range data and method thereof |
CN102402225B (en) * | 2011-11-23 | 2013-09-04 | 中国科学院自动化研究所 | Method for realizing localization and map building of mobile robot at the same time |
CN102778230B (en) * | 2012-06-14 | 2014-10-29 | 辽宁工程技术大学 | Gravity gradient auxiliary positioning method of artificial physical optimization particle filtering |
US9927814B2 (en) * | 2016-03-28 | 2018-03-27 | Fetch Robotics, Inc. | System and method for localization of robots |
CN107450561A (en) * | 2017-09-18 | 2017-12-08 | 河南科技学院 | The autonomous path planning of mobile robot and obstacle avoidance system and its application method |
CN109542093B (en) * | 2017-09-22 | 2022-06-07 | 华为技术有限公司 | Method and device for processing data |
CN108507579B (en) * | 2018-04-08 | 2020-04-21 | 浙江大承机器人科技有限公司 | Repositioning method based on local particle filtering |
WO2020102946A1 (en) * | 2018-11-19 | 2020-05-28 | 珊口(深圳)智能科技有限公司 | Map building method and system, positioning method and system, navigation method and system, control method and system, and mobile robot |
CN110007670B (en) * | 2019-02-14 | 2021-11-23 | 四川阿泰因机器人智能装备有限公司 | Mobile robot positioning and mapping method |
CN110597070B (en) * | 2019-10-17 | 2022-06-17 | 上海电力大学 | Method for identifying model parameters of thermal power generating unit system |
CN112336883A (en) * | 2020-10-28 | 2021-02-09 | 湖南安商医疗科技有限公司 | Autonomous moving pulse xenon lamp and plasma sterilization robot |
CN113703443B (en) * | 2021-08-12 | 2023-10-13 | 北京科技大学 | GNSS independent unmanned vehicle autonomous positioning and environment exploration method |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040167716A1 (en) * | 2002-12-17 | 2004-08-26 | Goncalves Luis Filipe Domingues | Systems and methods for controlling a density of visual landmarks in a visual simultaneous localization and mapping system |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0527832A (en) * | 1991-07-19 | 1993-02-05 | Sanyo Electric Co Ltd | Present position recognizing method for mobile robot |
JPH064127A (en) * | 1992-06-16 | 1994-01-14 | Ishikawajima Harima Heavy Ind Co Ltd | Own-position measuring instrument for indoor moving body |
JPH07121235A (en) * | 1993-10-26 | 1995-05-12 | Nippon Telegr & Teleph Corp <Ntt> | Position self-recognizing method for moving robot |
KR100632242B1 (en) * | 2000-11-22 | 2006-10-11 | 삼성광주전자 주식회사 | Path correction method of mobile robot |
- 2004-08-05: KR KR1020040061790A patent/KR100601960B1/en active IP Right Grant
- 2005-07-07: US US11/175,396 patent/US7957836B2/en not_active Expired - Fee Related
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040167716A1 (en) * | 2002-12-17 | 2004-08-26 | Goncalves Luis Filipe Domingues | Systems and methods for controlling a density of visual landmarks in a visual simultaneous localization and mapping system |
Non-Patent Citations (1)
Title |
---|
Duckett, Tom, A Genetic Algorithm for Simultaneous Localization and Mapping, Sep. 2003, IEEE International Conference on Robotics & Automation, pp. 434-439. * |
Cited By (115)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8761935B2 (en) | 2000-01-24 | 2014-06-24 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US8565920B2 (en) | 2000-01-24 | 2013-10-22 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US8478442B2 (en) | 2000-01-24 | 2013-07-02 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US8788092B2 (en) | 2000-01-24 | 2014-07-22 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US8412377B2 (en) | 2000-01-24 | 2013-04-02 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US9446521B2 (en) | 2000-01-24 | 2016-09-20 | Irobot Corporation | Obstacle following sensor scheme for a mobile robot |
US9144361B2 (en) | 2000-04-04 | 2015-09-29 | Irobot Corporation | Debris sensor for cleaning apparatus |
US8368339B2 (en) | 2001-01-24 | 2013-02-05 | Irobot Corporation | Robot confinement |
US8659256B2 (en) | 2001-01-24 | 2014-02-25 | Irobot Corporation | Robot confinement |
US8659255B2 (en) | 2001-01-24 | 2014-02-25 | Irobot Corporation | Robot confinement |
US9582005B2 (en) | 2001-01-24 | 2017-02-28 | Irobot Corporation | Robot confinement |
US9167946B2 (en) | 2001-01-24 | 2015-10-27 | Irobot Corporation | Autonomous floor cleaning robot |
US8686679B2 (en) | 2001-01-24 | 2014-04-01 | Irobot Corporation | Robot confinement |
US9038233B2 (en) | 2001-01-24 | 2015-05-26 | Irobot Corporation | Autonomous floor-cleaning robot |
US9622635B2 (en) | 2001-01-24 | 2017-04-18 | Irobot Corporation | Autonomous floor-cleaning robot |
US20100263142A1 (en) * | 2001-06-12 | 2010-10-21 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US8463438B2 (en) | 2001-06-12 | 2013-06-11 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US9104204B2 (en) | 2001-06-12 | 2015-08-11 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US8838274B2 (en) | 2001-06-12 | 2014-09-16 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US8396592B2 (en) | 2001-06-12 | 2013-03-12 | Irobot Corporation | Method and system for multi-mode coverage for an autonomous robot |
US8671507B2 (en) | 2002-01-03 | 2014-03-18 | Irobot Corporation | Autonomous floor-cleaning robot |
US8763199B2 (en) | 2002-01-03 | 2014-07-01 | Irobot Corporation | Autonomous floor-cleaning robot |
US8516651B2 (en) | 2002-01-03 | 2013-08-27 | Irobot Corporation | Autonomous floor-cleaning robot |
US8474090B2 (en) | 2002-01-03 | 2013-07-02 | Irobot Corporation | Autonomous floor-cleaning robot |
US8656550B2 (en) | 2002-01-03 | 2014-02-25 | Irobot Corporation | Autonomous floor-cleaning robot |
US9128486B2 (en) | 2002-01-24 | 2015-09-08 | Irobot Corporation | Navigational control system for a robotic device |
US8781626B2 (en) | 2002-09-13 | 2014-07-15 | Irobot Corporation | Navigational control system for a robotic device |
US9949608B2 (en) | 2002-09-13 | 2018-04-24 | Irobot Corporation | Navigational control system for a robotic device |
US8386081B2 (en) | 2002-09-13 | 2013-02-26 | Irobot Corporation | Navigational control system for a robotic device |
US20100049364A1 (en) * | 2002-09-13 | 2010-02-25 | Irobot Corporation | Navigational Control System for a Robotic Device |
US8515578B2 (en) | 2002-09-13 | 2013-08-20 | Irobot Corporation | Navigational control system for a robotic device |
US8428778B2 (en) | 2002-09-13 | 2013-04-23 | Irobot Corporation | Navigational control system for a robotic device |
US8793020B2 (en) | 2002-09-13 | 2014-07-29 | Irobot Corporation | Navigational control system for a robotic device |
US8854001B2 (en) | 2004-01-21 | 2014-10-07 | Irobot Corporation | Autonomous robot auto-docking and energy management systems and methods |
US8390251B2 (en) | 2004-01-21 | 2013-03-05 | Irobot Corporation | Autonomous robot auto-docking and energy management systems and methods |
US8461803B2 (en) | 2004-01-21 | 2013-06-11 | Irobot Corporation | Autonomous robot auto-docking and energy management systems and methods |
US8749196B2 (en) | 2004-01-21 | 2014-06-10 | Irobot Corporation | Autonomous robot auto-docking and energy management systems and methods |
US9215957B2 (en) | 2004-01-21 | 2015-12-22 | Irobot Corporation | Autonomous robot auto-docking and energy management systems and methods |
US8456125B2 (en) | 2004-01-28 | 2013-06-04 | Irobot Corporation | Debris sensor for cleaning apparatus |
US8378613B2 (en) | 2004-01-28 | 2013-02-19 | Irobot Corporation | Debris sensor for cleaning apparatus |
US8253368B2 (en) | 2004-01-28 | 2012-08-28 | Irobot Corporation | Debris sensor for cleaning apparatus |
US8780342B2 (en) | 2004-03-29 | 2014-07-15 | Irobot Corporation | Methods and apparatus for position estimation using reflected light sources |
US9360300B2 (en) | 2004-03-29 | 2016-06-07 | Irobot Corporation | Methods and apparatus for position estimation using reflected light sources |
US9486924B2 (en) | 2004-06-24 | 2016-11-08 | Irobot Corporation | Remote control scheduler and method for autonomous robotic device |
US9008835B2 (en) | 2004-06-24 | 2015-04-14 | Irobot Corporation | Remote control scheduler and method for autonomous robotic device |
US9229454B1 (en) | 2004-07-07 | 2016-01-05 | Irobot Corporation | Autonomous mobile robot system |
US8634956B1 (en) | 2004-07-07 | 2014-01-21 | Irobot Corporation | Celestial navigation system for an autonomous robot |
US8594840B1 (en) | 2004-07-07 | 2013-11-26 | Irobot Corporation | Celestial navigation system for an autonomous robot |
US8874264B1 (en) | 2004-07-07 | 2014-10-28 | Irobot Corporation | Celestial navigation system for an autonomous robot |
US8972052B2 (en) | 2004-07-07 | 2015-03-03 | Irobot Corporation | Celestial navigation system for an autonomous vehicle |
US9223749B2 (en) | 2004-07-07 | 2015-12-29 | Irobot Corporation | Celestial navigation system for an autonomous vehicle |
US8774966B2 (en) | 2005-02-18 | 2014-07-08 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US8392021B2 (en) | 2005-02-18 | 2013-03-05 | Irobot Corporation | Autonomous surface cleaning robot for wet cleaning |
US20100275405A1 (en) * | 2005-02-18 | 2010-11-04 | Christopher John Morse | Autonomous surface cleaning robot for dry cleaning |
US8670866B2 (en) | 2005-02-18 | 2014-03-11 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US8382906B2 (en) | 2005-02-18 | 2013-02-26 | Irobot Corporation | Autonomous surface cleaning robot for wet cleaning |
US20080127445A1 (en) * | 2005-02-18 | 2008-06-05 | Irobot Corporation | Autonomous surface cleaning robot for wet cleaning |
US8387193B2 (en) | 2005-02-18 | 2013-03-05 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US8782848B2 (en) | 2005-02-18 | 2014-07-22 | Irobot Corporation | Autonomous surface cleaning robot for dry cleaning |
US10470629B2 (en) | 2005-02-18 | 2019-11-12 | Irobot Corporation | Autonomous surface cleaning robot for dry cleaning |
US8985127B2 (en) | 2005-02-18 | 2015-03-24 | Irobot Corporation | Autonomous surface cleaning robot for wet cleaning |
US8739355B2 (en) | 2005-02-18 | 2014-06-03 | Irobot Corporation | Autonomous surface cleaning robot for dry cleaning |
US9445702B2 (en) | 2005-02-18 | 2016-09-20 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US8855813B2 (en) | 2005-02-18 | 2014-10-07 | Irobot Corporation | Autonomous surface cleaning robot for wet and dry cleaning |
US20090007366A1 (en) * | 2005-12-02 | 2009-01-08 | Irobot Corporation | Coverage Robot Mobility |
US9144360B2 (en) | 2005-12-02 | 2015-09-29 | Irobot Corporation | Autonomous coverage robot navigation system |
US9599990B2 (en) | 2005-12-02 | 2017-03-21 | Irobot Corporation | Robot system |
US10524629B2 (en) | 2005-12-02 | 2020-01-07 | Irobot Corporation | Modular Robot |
US8950038B2 (en) | 2005-12-02 | 2015-02-10 | Irobot Corporation | Modular robot |
US8954192B2 (en) | 2005-12-02 | 2015-02-10 | Irobot Corporation | Navigating autonomous coverage robots |
US8606401B2 (en) | 2005-12-02 | 2013-12-10 | Irobot Corporation | Autonomous coverage robot navigation system |
US8978196B2 (en) | 2005-12-02 | 2015-03-17 | Irobot Corporation | Coverage robot mobility |
US8584305B2 (en) | 2005-12-02 | 2013-11-19 | Irobot Corporation | Modular robot |
US9392920B2 (en) | 2005-12-02 | 2016-07-19 | Irobot Corporation | Robot system |
US8584307B2 (en) | 2005-12-02 | 2013-11-19 | Irobot Corporation | Modular robot |
US9320398B2 (en) | 2005-12-02 | 2016-04-26 | Irobot Corporation | Autonomous coverage robots |
US8380350B2 (en) | 2005-12-02 | 2013-02-19 | Irobot Corporation | Autonomous coverage robot navigation system |
US8600553B2 (en) | 2005-12-02 | 2013-12-03 | Irobot Corporation | Coverage robot mobility |
US8374721B2 (en) | 2005-12-02 | 2013-02-12 | Irobot Corporation | Robot system |
US9149170B2 (en) | 2005-12-02 | 2015-10-06 | Irobot Corporation | Navigating autonomous coverage robots |
US8761931B2 (en) | 2005-12-02 | 2014-06-24 | Irobot Corporation | Robot system |
US8661605B2 (en) | 2005-12-02 | 2014-03-04 | Irobot Corporation | Coverage robot mobility |
US20080294338A1 (en) * | 2005-12-09 | 2008-11-27 | Nakju Doh | Method of Mapping and Navigating Mobile Robot by Artificial Landmark and Local Coordinate |
US8572799B2 (en) | 2006-05-19 | 2013-11-05 | Irobot Corporation | Removing debris from cleaning robots |
US8528157B2 (en) | 2006-05-19 | 2013-09-10 | Irobot Corporation | Coverage robots and associated cleaning bins |
US9955841B2 (en) | 2006-05-19 | 2018-05-01 | Irobot Corporation | Removing debris from cleaning robots |
US10244915B2 (en) | 2006-05-19 | 2019-04-02 | Irobot Corporation | Coverage robots and associated cleaning bins |
US8418303B2 (en) | 2006-05-19 | 2013-04-16 | Irobot Corporation | Cleaning robot roller processing |
US9492048B2 (en) | 2006-05-19 | 2016-11-15 | Irobot Corporation | Removing debris from cleaning robots |
US9317038B2 (en) | 2006-05-31 | 2016-04-19 | Irobot Corporation | Detecting robot stasis |
US8417383B2 (en) | 2006-05-31 | 2013-04-09 | Irobot Corporation | Detecting robot stasis |
US8726454B2 (en) | 2007-05-09 | 2014-05-20 | Irobot Corporation | Autonomous coverage robot |
US10070764B2 (en) | 2007-05-09 | 2018-09-11 | Irobot Corporation | Compact autonomous coverage robot |
US11498438B2 (en) | 2007-05-09 | 2022-11-15 | Irobot Corporation | Autonomous coverage robot |
US8839477B2 (en) | 2007-05-09 | 2014-09-23 | Irobot Corporation | Compact autonomous coverage robot |
US10299652B2 (en) | 2007-05-09 | 2019-05-28 | Irobot Corporation | Autonomous coverage robot |
US9480381B2 (en) | 2007-05-09 | 2016-11-01 | Irobot Corporation | Compact autonomous coverage robot |
US8239992B2 (en) | 2007-05-09 | 2012-08-14 | Irobot Corporation | Compact autonomous coverage robot |
US8438695B2 (en) | 2007-05-09 | 2013-05-14 | Irobot Corporation | Autonomous coverage robot sensing |
US20100032853A1 (en) * | 2008-08-11 | 2010-02-11 | Nitto Denko Corporation | Method for manufacturing optical waveguide |
US9623557B2 (en) * | 2009-11-06 | 2017-04-18 | Irobot Corporation | Localization by learning of wave-signal distributions |
US8930023B2 (en) | 2009-11-06 | 2015-01-06 | Irobot Corporation | Localization by learning of wave-signal distributions |
US20170050318A1 (en) * | 2009-11-06 | 2017-02-23 | Irobot Corporation | Localization by Learning of Wave-Signal Distributions |
US10314449B2 (en) | 2010-02-16 | 2019-06-11 | Irobot Corporation | Vacuum brush |
US8800107B2 (en) | 2010-02-16 | 2014-08-12 | Irobot Corporation | Vacuum brush |
US11058271B2 (en) | 2010-02-16 | 2021-07-13 | Irobot Corporation | Vacuum brush |
US20110268349A1 (en) * | 2010-05-03 | 2011-11-03 | Samsung Electronics Co., Ltd. | System and method building a map |
US8787614B2 (en) * | 2010-05-03 | 2014-07-22 | Samsung Electronics Co., Ltd. | System and method building a map |
US11320828B1 (en) | 2018-03-08 | 2022-05-03 | AI Incorporated | Robotic cleaner |
US11254002B1 (en) | 2018-03-19 | 2022-02-22 | AI Incorporated | Autonomous robotic device |
US11454981B1 (en) | 2018-04-20 | 2022-09-27 | AI Incorporated | Versatile mobile robotic device |
US11340079B1 (en) | 2018-05-21 | 2022-05-24 | AI Incorporated | Simultaneous collaboration, localization, and mapping |
US11548159B1 (en) | 2018-05-31 | 2023-01-10 | AI Incorporated | Modular robot |
US11199853B1 (en) | 2018-07-11 | 2021-12-14 | AI Incorporated | Versatile mobile platform |
US11402215B2 (en) | 2018-12-31 | 2022-08-02 | Twinny Co., Ltd. | Indoor positioning method for a moving apparatus using first and second two-dimensional maps of z-axis areas |
Also Published As
Publication number | Publication date |
---|---|
KR20060013022A (en) | 2006-02-09 |
US20060041331A1 (en) | 2006-02-23 |
KR100601960B1 (en) | 2006-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7957836B2 (en) | Method used by robot for simultaneous localization and map-building | |
EP3814865B1 (en) | Self-aware visual-textual co-grounded navigation agent | |
Liu et al. | Robot navigation in crowded environments using deep reinforcement learning | |
Blanco et al. | Efficient probabilistic range-only SLAM | |
CN108871351B (en) | Dynamic path planning method for AUV (autonomous Underwater vehicle) submarine topography matching | |
Havangi et al. | A square root unscented FastSLAM with improved proposal distribution and resampling | |
US20170003131A1 (en) | Method and apparatus for relocation of mobile robot in indoor environment | |
Bopardikar et al. | Multiobjective path planning: Localization constraints and collision probability | |
Vasquez et al. | Intentional motion on-line learning and prediction | |
US8315734B2 (en) | Robot and method and medium for localizing the same by using calculated covariance | |
Vahdat et al. | Mobile robot global localization using differential evolution and particle swarm optimization | |
Plagemann et al. | Gaussian Beam Processes: A Nonparametric Bayesian Measurement Model for Range Finders. | |
Frank et al. | Using gaussian process regression for efficient motion planning in environments with deformable objects | |
Azpúrua et al. | Three-dimensional terrain aware autonomous exploration for subterranean and confined spaces | |
Wang et al. | Virtual maps for autonomous exploration of cluttered underwater environments | |
Burks et al. | Optimal continuous state pomdp planning with semantic observations: A variational approach | |
Maurelli et al. | A particle filter approach for AUV localization | |
US11536797B2 (en) | Mobile network localization | |
GhaemiDizaji et al. | Efficient robot localization and SLAM algorithms using Opposition based High Dimensional optimization Algorithm | |
Ellefsen et al. | Planning inspection paths through evolutionary multi-objective optimization | |
Gao et al. | A prediction method of localizability based on deep learning | |
Augenstein et al. | Simultaneous Estimaton of Target Pose and 3-D Shape Using the FastSLAM Algorithm | |
Pedersen | Science target assessment for Mars rover instrument deployment | |
Thomas et al. | Safe motion planning with environment uncertainty | |
Heinemann et al. | A novel approach to efficient monte-carlo localization in robocup |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MYEONG, HYEON;HONG, SUNGI;REEL/FRAME:017010/0247 Effective date: 20050831 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20230607 |