US20190220025A1 - Method, system for obstacle detection and a sensor subsystem

Method, system for obstacle detection and a sensor subsystem

Info

Publication number
US20190220025A1
Authority
US
United States
Prior art keywords
light
self
obstacle
guiding machine
indicator light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/869,291
Other versions
US11009882B2 (en
Inventor
Kai-Shun Chen
Wei-Chung Wang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to US15/869,291 priority Critical patent/US11009882B2/en
Assigned to PIXART IMAGING INC. reassignment PIXART IMAGING INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHEN, KAI-SHUN, WANG, WEI-CHUNG
Priority to CN201810503115.0A priority patent/CN110031002B/en
Priority to CN202210049733.9A priority patent/CN114370881A/en
Publication of US20190220025A1 publication Critical patent/US20190220025A1/en
Priority to US17/227,732 priority patent/US11669103B2/en
Application granted granted Critical
Publication of US11009882B2 publication Critical patent/US11009882B2/en
Priority to US18/140,084 priority patent/US20240085921A1/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0248Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G06K9/00805
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141Control of illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G05D2201/0203
    • G05D2201/0207

Definitions

  • the disclosure is generally related to a technology for detecting obstacles, and in particular to a method for detecting an obstacle, such as a cliff, that is on the path of a moving machine, a system for implementing the method, and a sensor subsystem thereof.
  • a self-guiding device, e.g. an automatic vehicle or an automatic robot cleaning machine, should avoid colliding with other objects or falling from a height.
  • the self-guiding device requires a proximity sensor to determine a distance to the wall in order to avoid the collision.
  • a conventional proximity sensor, such as a radar, an ultrasonic sensor, or a light (infrared) sensor, is able to detect the presence of a nearby object without physical contact.
  • an ultrasonic sensor emits ultrasonic waves and receives the reflected waves to detect the object in front that reflects them.
  • the time difference between emission and reception can be used to determine the distance.
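The round-trip timing described above maps directly to a distance. A minimal sketch, assuming the speed of sound in air (about 343 m/s at room temperature); the function name and interface are illustrative, not from the patent:

```python
# Hypothetical sketch of ultrasonic time-of-flight ranging.
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 degrees C

def echo_distance_m(emit_time_s: float, receive_time_s: float) -> float:
    """Distance to the reflecting object from the round-trip echo time."""
    round_trip_s = receive_time_s - emit_time_s
    # The wave travels to the object and back, so halve the path length.
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# Example: an echo received 10 ms after emission -> 1.715 m.
```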
  • a cliff sensor mounted at the bottom of the self-guiding device utilizes an infrared ray or ultrasonic waves to detect the presence of a floor under the device by measuring the infrared ray or waves reflected or scattered from the surface of the floor. Therefore, the cliff sensor can prevent the device from traveling over a cliff when it detects that the device is moving over one.
  • When the self-guiding device is able to detect the obstacle, e.g. the wall or the cliff, its controller can evaluate the sensed data and instruct a driving system of the self-guiding device to stop the device or avoid the obstacle when the device approaches or reaches the obstacle.
  • One of the objectives of the method and the system for obstacle detection in one aspect of the disclosure is to detect an obstacle on the path a self-guiding machine travels.
  • the process of obstacle detection is performed in the self-guiding machine when it travels around an area.
  • An algorithm operated in the self-guiding machine is used to process the data that is generated by a sensor subsystem installed in the machine.
  • the sensor subsystem of the system for performing the obstacle detection includes a light emitter and a light sensor.
  • the light emitter emits an indicator light
  • the light sensor captures a single image at a time, or a series of images containing the indicator light over a period of time.
  • the information extracted from the image can be used to estimate a distance to the obstacle.
  • a change occurring in the series of images can be used to determine if the self-guiding machine approaches the obstacle, and the self-guiding machine can perform an avoidance measure in order to avoid the risk.
  • the indicator light emitted by the light emitter is such as a linear light projected onto the area in front of the self-guiding machine.
  • the light sensor can capture an image containing the linear light. Through an image analysis process, the captured linear light renders the information that is used to determine the distance to the obstacle, or how close it approaches the obstacle.
  • the method for obstacle detection can be adapted to a self-guiding machine that has a light emitter and a light sensor.
  • the light emitter and the light sensor are preferably set apart at a distance.
  • the light emitter emits an indicator light being projected onto a path the self-guiding machine travels toward, and the light sensor senses the indicator light projected onto the path so as to generate an image containing the indicator light.
  • At least one feature of the indicator light being sensed can be obtained. Then a spatial relationship between the self-guiding machine and an obstacle can be obtained in response to the at least one feature of the indicator light being sensed. This spatial relationship allows the self-guiding machine to compute a distance between the self-guiding machine and the obstacle, and determine if the self-guiding machine will collide with the obstacle when compared with a collision threshold, or determine if the self-guiding machine will fall due to the obstacle when compared with a falling threshold.
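The threshold comparison described above can be sketched as follows. The threshold values and all names here are assumptions for illustration; the patent only states that the computed distance is compared against a collision threshold or a falling threshold:

```python
# Illustrative decision step: compare the estimated distance to the
# obstacle against a collision threshold (walls) or a falling threshold
# (cliffs). Threshold values below are assumed, not from the patent.
COLLISION_THRESHOLD_M = 0.05
FALLING_THRESHOLD_M = 0.10

def assess_risk(distance_m: float, obstacle_type: str) -> str:
    """Return 'avoid' when the relevant threshold is reached, else 'continue'."""
    if obstacle_type == "cliff":
        return "avoid" if distance_m <= FALLING_THRESHOLD_M else "continue"
    return "avoid" if distance_m <= COLLISION_THRESHOLD_M else "continue"
```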
  • a system for obstacle detection is provided.
  • the system installed in a self-guiding machine includes a controller, a light emitter, a light sensor, an image processor and a central processor.
  • the image processor is used to generate an image containing the indicator light that is projected onto the path.
  • a central processor is used to perform the method for obstacle detection.
  • a sensor subsystem is provided in the system for obstacle detection.
  • the sensor subsystem mainly includes a light emitter, a light sensor, and an image processor that are used to emit the indicator light projected onto a path, sense the indicator light, and render the image containing the indicator light.
  • the image of indicator light is provided for the system of the self-guiding machine to analyze and obtain a spatial relationship between the self-guiding machine and the obstacle.
  • the system obtains the spatial relationship that allows the central processor to compute a distance between the self-guiding machine and the obstacle, and determine if the self-guiding machine will collide with the obstacle when compared with a collision threshold stored in a memory of the system, or determine if the self-guiding machine will fall due to the obstacle when compared with a falling threshold stored in the memory. Further, when the self-guiding machine reaches the collision threshold or the falling threshold, the central processor generates a signal for instructing the controller to drive the self-guiding machine to avoid the obstacle.
  • FIG. 1 shows a schematic diagram using a top view to depict a circumstance that a self-guiding machine approaches an obstacle in one embodiment of the present disclosure
  • FIG. 2 shows a schematic diagram depicting a circumstance that a self-guiding machine is in front of a cliff in one embodiment of the present disclosure
  • FIG. 3 shows another schematic diagram depicting a perspective view of a circumstance that the self-guiding machine is in front of a wall in one further embodiment of the present disclosure
  • FIG. 4A and FIG. 4B are the schematic diagrams showing a self-guiding machine approaching a wall in one embodiment of the present disclosure
  • FIG. 5A and FIG. 5B are the schematic diagrams showing a self-guiding machine approaching a cliff in one further embodiment of the present disclosure
  • FIG. 6A through FIG. 6D are the schematic diagrams showing a self-guiding machine approaching an obstacle with a height from a ground in one embodiment of the present disclosure
  • FIG. 7 shows circuit blocks of a system for obstacle detection according to one embodiment of the present disclosure
  • FIG. 8 shows a schematic diagram depicting a change of the indicator light captured by a light sensor of a self-guiding machine approaching a wall according to one embodiment of the present disclosure
  • FIG. 9 shows a schematic diagram depicting a change of the indicator light captured by a light sensor of a self-guiding machine approaching a cliff according to one embodiment of the present disclosure
  • FIGS. 10A, 10B and 10C show examples of the lights received by a self-guiding machine that approaches an obstacle according to one embodiment of the present disclosure
  • FIG. 11 shows a schematic diagram depicting a circular indicator light projected on a wall that a self-guiding machine approaches in one embodiment of the present disclosure
  • FIG. 12 shows a flow chart describing a process for obstacle detection adapted to a self-guiding machine in one embodiment of the present disclosure
  • FIG. 13 shows a flow chart describing a process for obstacle detection adapted to a self-guiding machine in one further embodiment of the present disclosure.
  • the disclosure is related to a method and a system for obstacle detection adapted to a self-guiding machine.
  • the self-guiding machine is such as an autonomous vehicle or an autonomous cleaning robot that can navigate an area automatically.
  • the method allows the self-guiding machine to sense and identify an obstacle in front of the machine, and then drives the self-guiding machine to avoid the obstacle automatically.
  • the system for obstacle detection is exemplified as a sensor subsystem that essentially includes a light emitter and a light sensor that can be installed in the self-guiding machine.
  • the light emitter and the light sensor are set apart at a distance in a horizontal direction, and the light emitter and the light sensor can be disposed at the same or different horizontal level of position.
  • the light emitter may utilize a laser or an LED as a linear light source that emits a linear light as the indicator light, or alternatively emits a certain area of light as the indicator light.
  • the light sensor is used to sense the linear light or the certain area of light that is projected onto a path the self-guiding machine travels toward.
  • one of the objectives of the method and the system for obstacle detection in one aspect of the disclosure is to detect the obstacle, e.g. a wall, a cliff, or a floating obstacle, on the path the self-guiding machine travels.
  • the method can be implemented by an algorithm operated in the system for obstacle detection.
  • the self-guiding machine can itself process the data that is generated by the sensor subsystem and perform an avoidance measure.
  • Reference is made to FIG. 1, a schematic diagram that uses a top view to depict a circumstance in which a self-guiding machine approaches an obstacle in one embodiment of the present disclosure.
  • a self-guiding machine 10 shown in the diagram can be an autonomous robot, e.g. an autonomous cleaning device that navigates an area with various terrains.
  • a system for obstacle detection is installed in the self-guiding machine 10 for sensing the obstacle on a path the self-guiding machine 10 travels toward.
  • the system can exemplarily include a sensor subsystem that essentially includes a light emitter 101 , a light sensor 102 and a processing circuitry.
  • the diagram also shows the light emitter 101 and the light sensor 102 that are set apart at a distance from each other in a horizontal direction. It is noted that the light emitter 101 and the light sensor 102 are not necessary to be disposed at the same horizontal level of position.
  • the light emitter 101 exemplarily utilizes a laser or an LED as a light source to emit a linear light 103 .
  • the linear light 103 may be formed by the light source through a specific lens.
  • the linear light 103 is continuously projected onto a scene, e.g. a ground 19 or any surface of any terrain, in front of the self-guiding machine 10 .
  • the self-guiding machine 10 travels toward a wall 18 that forms the obstacle at a distance ‘d’ from the self-guiding machine 10
  • the linear light 103 acting as an indicator light is projected onto both the wall 18 and the ground 19 .
  • a border 15 between the ground 19 and the wall 18 divides the linear light 103 into a first segment, i.e. a lower segment projected onto the ground 19 , and a second segment, i.e. an upper segment projected onto the wall 18 .
  • the light sensor 102 is such as a camera that has a field of view, indicated by two dotted lines drawn in the diagram, for capturing an image. It is noted that the field of view of the light sensor 102 should cover the indicator light 13 to effectively detect a status of the self-guiding machine 10 .
  • the light sensor 102 is configured to sense the linear light 103 and capture an image containing this linear light 103 .
  • a length ‘s’ of the lower segment of the linear light 103 can be sensed by the light sensor 102 .
  • the length ‘s’ sensed by the light sensor 102 can serve as an indicator for determining the distance ‘d.’
  • the distance ‘d’ between the self-guiding machine 10 and the wall 18 can be determined based on the length ‘s’ that is sensed by the light sensor 102 .
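The patent states that the distance ‘d’ can be determined from the sensed length ‘s’ but does not give the mapping, which depends on the mounting geometry of the emitter and sensor. One plausible approach (an assumption, not from the patent) is a one-time calibration that fits a simple linear model d ≈ a·s + b from measurements at known distances:

```python
# Hypothetical calibration of the length-to-distance mapping.
def fit_linear(samples):
    """Least-squares fit of d = a*s + b from (s, d) calibration pairs."""
    n = len(samples)
    mean_s = sum(s for s, _ in samples) / n
    mean_d = sum(d for _, d in samples) / n
    num = sum((s - mean_s) * (d - mean_d) for s, d in samples)
    den = sum((s - mean_s) ** 2 for s, _ in samples)
    a = num / den
    b = mean_d - a * mean_s
    return a, b

def estimate_distance(s_pixels: float, a: float, b: float) -> float:
    """Map a sensed segment length (pixels) to a distance via the fitted model."""
    return a * s_pixels + b

# Example calibration: segment length (pixels) vs. measured distance (cm).
a, b = fit_linear([(10, 5.0), (20, 10.0), (40, 20.0)])
# estimate_distance(30, a, b) -> 15.0 cm
```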
  • a horizontal position of the upper segment of the linear light 103 can also be the indicator used to determine the distance ‘d.’
  • the light sensor 102 can continuously sense the indicator light, and the system can accordingly generate a series of images containing the indicator light. If any change of the length ‘s’ of the lower segment of the linear light 103 , or of the position/length of the upper segment, is found in the series of images, the change allows the system to acknowledge that the self-guiding machine 10 is approaching or leaving the wall 18 .
  • any change of the aforementioned length or position of the indicator light obtained from the series of images can also be used to determine a status of the self-guiding machine 10 , e.g. a moving trend of the self-guiding machine 10 .
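The change detection over a series of images can be sketched as a trend classifier on the sensed segment length; for a machine approaching a wall, the ground segment shrinks frame after frame. Function and label names are illustrative assumptions:

```python
# Illustrative trend classification over sensed segment lengths,
# one value per captured frame.
def moving_trend(lengths):
    """Classify the moving trend from a series of sensed segment lengths."""
    if len(lengths) < 2:
        return "unknown"
    deltas = [b - a for a, b in zip(lengths, lengths[1:])]
    if all(d < 0 for d in deltas):
        return "approaching"  # segment shrinking frame after frame
    if all(d > 0 for d in deltas):
        return "leaving"      # segment growing frame after frame
    return "steady"           # no consistent change across the series
```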
  • the system for obstacle detection also establishes a warning mechanism for the self-guiding machine 10 according to a spatial relationship between the self-guiding machine 10 and the wall 18 .
  • Reference is made to FIG. 2, a schematic diagram depicting a circumstance in which a self-guiding machine is in front of a cliff in one embodiment of the present disclosure.
  • the self-guiding machine 10 travels over a plane 20 of a terrain, e.g. a table or a ground, and toward a cliff 22 .
  • the cliff 22 can be formed by a vertical section of the table or a downward stairway of the ground.
  • the light emitter 101 of the self-guiding machine 10 continuously emits an indicator light 103 ′ projected onto the way the machine 10 travels over.
  • the indicator light 103 ′ of the present example appears to be a linear light when the light emitter 101 emits the linear light in a vertical direction.
  • the diagram utilizes section lines to present a range of the emitting light.
  • the light sensor 102 senses the indicator light 103 ′ within its field of view and the system generates an image containing the indicator light 103 ′.
  • the indicator light 103 ′ appears to be cut by an edge 15 ′ between the plane 20 and the cliff 22 .
  • the cut indicator light 103 ′ has been shown in the image captured by the light sensor 102 .
  • the indicator light 103 ′ indicates that an obstacle, e.g. the cliff 22 , will be met by the self-guiding machine 10 , and its length or slope/angle sensed by the light sensor 102 allows the system to determine a distance between the self-guiding machine 10 and the obstacle indicated by the edge 15 ′. Therefore, when the system for obstacle detection acknowledges that the cliff 22 is on the way the self-guiding machine 10 travels over, the system will instruct the self-guiding machine 10 to avoid this obstacle.
  • FIG. 3 shows another schematic diagram depicting a perspective view of a circumstance that the self-guiding machine is in front of a wall in one further embodiment of the present disclosure.
  • the self-guiding machine 10 travels over a ground 30 and toward a wall 32 .
  • the light emitter 101 of the self-guiding machine 10 emits a linear light as an indicator light in a vertical direction.
  • the range of the emitting light covers both the ground 30 and the wall 32 in front of the self-guiding machine 10 .
  • the length of an indicator light 103 ′′ indicates how close the self-guiding machine 10 approaches the wall 32 .
  • a slope or angle of the indicator light 103 ′′ can also show the spatial relationship between the self-guiding machine 10 and the wall 32 ; alternatively, a position of segment of the indicator light on the wall 32 can also be the indicator for depicting the spatial relationship.
  • a linear light emitted by the light emitter of the system installed in the self-guiding machine acts as an indicator light and the feature(s) of the indicator light can be used to determine a spatial relationship between the self-guiding machine and an obstacle, e.g. a wall or a cliff. Therefore, the system can effectively prevent the self-guiding machine from colliding with the wall or falling from the cliff. Since the light emitter and the light sensor of the system are set apart at a distance, i.e. preferably a horizontal distance, the features of the indicator light sensed by the light sensor can be analyzed for rendering the information such as a length, a position, a slope/angle and/or an area.
  • At least one of the features is sufficient to be used to determine the spatial relationship between the self-guiding machine and the obstacle. Further, a change of one of the features between at least two images having the indicator light can be used to render a moving trend of the self-guiding machine. This moving trend allows the system to timely issue an alarm of collision or falling to the self-guiding machine.
  • FIG. 4A and FIG. 4B are the schematic diagrams showing a change of one of the features that allows the system to issue the collision alarm to a self-guiding machine when it approaches a wall.
  • this lateral view diagram shows a self-guiding machine 40 such as an autonomous vehicle travels over a ground 43 .
  • a light emitter of the system for obstacle detection installed in the self-guiding machine 40 emits an indicator light to the front of the machine 40 .
  • the light emitter has a range of emission 42 and its emitted light is confined to this range.
  • the indicator light can be sensed by a light sensor of the system.
  • When the self-guiding machine 40 is in front of a wall 44 within a specific distance that allows the system to determine a spatial relationship between the self-guiding machine 40 and the obstacle, the indicator light is projected onto both the ground 43 and the wall 44 . While the indicator light is sensed by the light sensor within its field of view, a first segment having a length ‘ 40 a’ and a second segment having a length ‘ 40 b’ are respectively computed by analyzing the image containing all or part of the indicator light.
  • When the self-guiding machine 40 approaches the wall 44 , as shown in FIG. 4B , the range of emission 42 ′ becomes smaller than in the previous status.
  • the indicator light sensed by the light sensor has been changed.
  • the length of the first segment of the indicator light projected onto the ground 43 has been changed to “ 40 a′”
  • the length of the second segment of the indicator light projected onto the wall 44 has been changed to “ 40 b′.” Therefore, the system for obstacle detection utilizes the length of the first segment or the second segment of the sensed indicator light to determine the distance between the self-guiding machine 40 and the obstacle, e.g. the wall 44 .
  • a change of at least one feature of the indicator light can be used to obtain a moving trend of the machine and to determine if the self-guiding machine approaches the obstacle.
  • FIG. 5A and FIG. 5B are the schematic diagrams showing a change of one of the features that allows the system to issue a falling alarm to a self-guiding machine when it approaches a cliff.
  • the lateral view diagram shows a self-guiding machine 50 that travels over a ground 53 with a cliff 54 at a distance away.
  • a light emitter of the system for obstacle detection in the self-guiding machine 50 emits an indicator light to the front of the machine 50 .
  • the indicator light can be sensed by a light sensor of the system.
  • a segment of the indicator light is projected onto the ground 53 and the rest of the indicator light is cut by an edge of the ground 53 due to the cliff 54 .
  • the indicator light projected onto the ground 53 leaves a length ‘ 50 a’ to be sensed by the light sensor. While the indicator light is sensed by the light sensor within its field of view, the length ‘ 50 a’ is computed by analyzing the image containing all or part of the indicator light.
  • when the self-guiding machine 50 approaches the cliff 54 , the length of the indicator light projected onto the ground 53 becomes shorter.
  • the length of the indicator light sensed by the light sensor and projected onto the ground 53 is ‘ 50 a’ in FIG. 5A , and then the indicator light projected onto the ground 53 changes to “ 50 a′.” Therefore, the system for obstacle detection also utilizes the sensed length of the indicator light to determine the distance between the self-guiding machine 50 and the obstacle, e.g. the cliff 54 .
  • a change of length of the indicator light can be used to obtain a moving trend of the machine and to determine if the self-guiding machine approaches the obstacle.
  • FIG. 6A through FIG. 6D are the schematic diagrams showing a self-guiding machine approaching a floating obstacle with a height from a ground in one further embodiment of the present disclosure.
  • a self-guiding machine 60 travels over a ground 63 and approaches a floating obstacle 64 .
  • a light sensor of the self-guiding machine 60 senses an indicator light, for example a linear light, emitted by a light emitter of the system installed in the self-guiding machine 60 .
  • the indicator light within a range of emission 62 of the light emitter of the system forms two segments, i.e. a segment of the indicator light is projected to the ground 63 and the other segment of the indicator light is projected to the floating obstacle 64 .
  • the segment of the indicator light projected to the floating obstacle 64 forms a light with a length ‘ 60 b’ sensed by the light sensor.
  • the light sensor of the system is driven to capture an image with the segment of indicator light projected onto the ground 63 and the other segment of the indicator light having a length ‘ 60 b’ projected onto the floating obstacle 64 .
  • the image can be referred to as the frame 66 shown in FIG. 6B .
  • the frame 66 shows at least two features extracted from the image captured by the light sensor.
  • a first segment 60 a shown in the frame 66 indicates the segment of indicator light projected onto the ground 63 .
  • a second segment 60 b shown in the frame 66 indicates the segment of indicator light projected onto the floating obstacle 64 .
  • the self-guiding machine 60 is getting close to the floating obstacle 64 over the ground 63 .
  • the floating obstacle 64 causes the range of emission 62 ′ of the light emitter of the system to change.
  • Both the segments of indicator light respectively projected onto the ground 63 and the floating obstacle 64 have been changed.
  • the length of the segment of the indicator light projected onto the floating obstacle 64 has been changed to length “ 60 b′” because the spatial relationship between the self-guiding machine 60 and the floating obstacle 64 has changed.
  • the image captured by the light sensor can be referred to as the frame 66 ′ shown in FIG. 6D .
  • both features of the first segment 60 a′ and the second segment 60 b′ have been changed. It shows both the length and the slope of the first segment 60 a′ have been changed.
  • the length of the first segment 60 a′ is shorter than the length of the first segment 60 a, and the slope thereof also changes when the self-guiding machine 60 approaches the floating obstacle 64 .
  • both the length and the position of the second segment 60 b′ have been changed.
  • the length of the second segment 60 b′ is shorter than the length of the second segment 60 b, and the position thereof shifts to the left in the same instance.
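The per-segment features discussed around FIG. 6A through FIG. 6D (length, slope, and position) can be computed from segment endpoints extracted from the captured frame. A minimal sketch, assuming each segment is already given as two endpoint pixel coordinates; the function name is illustrative, not from the patent:

```python
import math

def segment_features(p1, p2):
    """Length, slope (angle in degrees), and midpoint position of a segment
    given its two endpoint pixel coordinates."""
    (x1, y1), (x2, y2) = p1, p2
    length = math.hypot(x2 - x1, y2 - y1)
    angle_deg = math.degrees(math.atan2(y2 - y1, x2 - x1))
    midpoint = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    return length, angle_deg, midpoint
```

Comparing these features between two frames (e.g. the segments 60 b and 60 b′) reveals the shortening and leftward shift described above.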
  • a sensor module essentially includes a light emitter and a light sensor that are set apart at a distance.
  • the light emitter emits an indicator light
  • the light sensor captures a single image at a time, or a series of images containing the indicator light over a period of time.
  • the indicator light emitted by the light emitter is such as a linear light, a circular light or any shape of light projected onto the way the self-guiding machine travels toward.
  • the light sensor can capture an image containing the indicator light.
  • the information extracted from the indicator light being captured is used to determine a spatial relationship between the self-guiding machine and an obstacle. The spatial relationship allows the system to instruct the driving system of the self-guiding machine to avoid the risk of collision or falling.
  • FIG. 7 shows circuit blocks of a system for obstacle detection according to one embodiment of the present disclosure.
  • the system for obstacle detection includes a sensor subsystem that is capable of data processing since it has its own processor and memory.
  • the sensor subsystem is used to collect the environmental information around it and to process that information for rendering a protection mechanism.
  • the environmental information includes, for example, images of the environment around the self-guiding machine adopting this system. The environmental information allows the system to determine if the self-guiding machine will meet any risk of damage.
  • the functions provided by the system are implemented by the circuit components shown in the diagram.
  • the system includes a controller 701 that is in charge of controlling operations of the other circuit components for operating the system.
  • the controller 701 is used to drive a light emitter 702 to emit the indicator light, and also control a light sensor 705 to sense the light signals within its field of view.
  • the indicator light, such as a linear light or a circular light, can be controlled to function continuously or periodically.
  • the system includes the light emitter 702 coupled to the controller 701 .
  • the light emitter 702 has a light source and its driving circuit and is used to emit an indicator light through a requisite optical component and/or a window that is mounted on a surface of the self-guiding machine.
  • the optical component is such as a lens that can be used to guide the indicator light to be a linear light, a circular light or any shape of the light.
  • the system includes the light sensor 705 coupled to the controller 701 .
  • the light sensor 705 set apart at a distance from the light emitter 702 is used to sense the indicator light emitted by the light emitter 702 .
  • the scene in front of the self-guiding machine is captured by the light sensor 705 and then transmitted to an image processor 704 of the system for generating an image containing the indicator light.
  • the image processor 704 is coupled to the light sensor 705 and is used to generate the image.
  • the image is temporarily buffered, for example, in a memory 706 .
  • the light emitter 702 , the light sensor 705 and the image processor 704 form a sensor subsystem installed in the self-guiding machine.
  • the sensor subsystem is in charge of generating the indicator light and rendering the image of the indicator light. Therefore, the self-guiding machine can use at least one feature of the indicator light being sensed to obtain a spatial relationship between itself and an obstacle when the self-guiding machine approaches the obstacle on its path.
  • the system further includes a central processor 703 that is coupled to the image processor 704 and the controller 701 .
  • the central processor 703 is used to perform the method for obstacle detection.
  • the memory 706 coupled to the central processor 703 acts as a system memory or storage that can be used to store the instructions for performing the functions provided by the system, for example the method for obstacle detection.
  • the method performed by the central processor 703 primarily includes analyzing the image containing the indicator light sensed by the light sensor 705 , obtaining at least one feature of the indicator light being sensed, and obtaining a spatial relationship between the self-guiding machine and an obstacle in response to the at least one feature of the indicator light being sensed when the self-guiding machine approaches the obstacle on its path.
  • the information extracted by the sensor subsystem from the image having the indicator light can be used to estimate a distance to an obstacle.
  • the spatial relationship, e.g. the distance between the self-guiding machine and the obstacle, allows the system to determine if the self-guiding machine will be in any dangerous situation, for example colliding with a wall or falling from a cliff.
  • the controller 701 of the system is coupled to a machine driver 708 that links to a driving system of the self-guiding machine adopting this system.
  • when the system determines that an obstacle exists at a distance from the self-guiding machine, the controller 701 generates a signal instructing the machine driver 708 to respond to the obstacle.
  • the light sensor 705 can also continuously capture a series of images covering the indicator light, and a change occurring in the series of images can be detected and used to determine if the self-guiding machine approaches the obstacle.
  • the self-guiding machine can then perform an avoidance measure in order to avoid the risk.
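The cycle described above can be sketched in a few lines. This is a minimal illustration only: the names (SensorSubsystem, detection_step) and the threshold value are hypothetical stand-ins for the controller 701, light emitter 702, light sensor 705, image processor 704 and machine driver 708, not identifiers from the disclosure.

```python
COLLISION_THRESHOLD_CM = 8.0  # assumed value, for illustration only

class SensorSubsystem:
    """Stand-in for the light emitter + light sensor + image processor.

    A real subsystem would emit the indicator light, capture an image and
    extract features; here each call simply yields a precomputed distance.
    """
    def __init__(self, distances_cm):
        self._distances = iter(distances_cm)

    def sense(self):
        return next(self._distances)

def detection_step(subsystem, avoidance_log):
    """One iteration: sense, compare with the threshold, react if needed."""
    distance_cm = subsystem.sense()
    if distance_cm < COLLISION_THRESHOLD_CM:
        # stand-in for the controller signalling the machine driver
        avoidance_log.append(distance_cm)
    return distance_cm
```

Each call to detection_step mirrors one pass of the loop: only when the sensed distance falls below the threshold is the machine driver signalled.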
  • FIG. 8 shows a schematic diagram depicting a change of the indicator light captured by a light sensor of a self-guiding machine approaching a wall according to one embodiment of the present disclosure.
  • the system installed in a self-guiding machine 80 , 80 ′ is exemplified as the sensor subsystem essentially including a light emitter 801 , 801 ′ and a light sensor 802 , 802 ′.
  • the self-guiding machine 80 , 80 ′ acts as an autonomous vehicle including a computer that drives the light emitter 801 , 801 ′ to emit the indicator light, integrates the data received by the light sensor 802 , 802 ′ and then processes the data to acquire the terrain information regarding the path in front of the self-guiding machine 80 .
  • the self-guiding machine 80 depicted by a solid line is at a first position when it travels over the ground 83 .
  • the self-guiding machine 80 includes the light emitter 801 and the light sensor 802 .
  • the light emitter 801 emits an indicator light 803 .
  • the indicator light 803 depicted in the diagram is based on the image sensed by the light sensor 802 in its viewing angle.
  • the diagram shows the indicator light 803 has a turning point that divides the indicator light 803 into two segments due to a border between the ground 83 and the wall 84 .
  • the self-guiding machine 80 then moves to a second position closer to the wall 84 and is marked as the self-guiding machine 80 ′ that is depicted by a dotted line.
  • the light emitter 801 ′ still emits an indicator light 803 ′ and the light sensor 802 ′ receives the indicator light 803 ′ and generates another image.
  • the indicator light 803 ′ depicted in the diagram is based on the image sensed by the light sensor 802 ′ in its viewing angle.
  • the border between the ground 83 and the wall 84 causes the indicator light 803 ′ to have another turning point that divides the indicator light 803 ′ into two segments.
  • This exemplary example shows several changes of the indicator light 803 , 803 ′ projected onto both the ground 83 and the wall 84 when the self-guiding machine 80 , 80 ′ travels toward the obstacle, i.e. the wall 84 .
  • the light emitter 801 , 801 ′ emits a linear light and the light sensor 802 , 802 ′ senses the light within its sensing range confined by its viewing angle.
  • the diagram shows when the linear light is projected onto both the ground 83 and the obstacle, i.e. the wall 84 , at least one feature of the indicator light 803 , 803 ′ can be found by analyzing the image containing the indicator light 803 , 803 ′.
  • a length, as one of the features, of a first segment of the indicator light 803 , 803 ′ projected onto the ground 83 becomes shorter when the self-guiding machine 80 , 80 ′ is closer to the wall 84 .
  • a slope can act as another feature for detecting the obstacle since the image being sensed shows a slope of the first segment of the indicator light 803 , 803 ′ becomes larger when the self-guiding machine 80 , 80 ′ is closer to the wall 84 .
  • the shorter length or the left-shifted position of a second segment of the indicator light 803 , 803 ′ projected onto the wall 84 can also act as one of the features to detect the obstacle when the self-guiding machine 80 , 80 ′ is closer to the wall 84 . Therefore, in this exemplary example, a length of the first segment or the second segment, a position of the second segment and/or a slope/angle of the first segment can be regarded as the feature(s) allowing the system to detect the obstacle.
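The three features named above (length, slope/angle, position) can be computed from the endpoints of a sensed light segment. The endpoint representation and the function below are an illustrative assumption, not the patent's own algorithm:

```python
import math

def segment_features(p0, p1):
    """Length, slope angle (degrees) and leftmost x-position of a light
    segment, given its two endpoints in image (pixel) coordinates."""
    (x0, y0), (x1, y1) = p0, p1
    length = math.hypot(x1 - x0, y1 - y0)               # Euclidean length in pixels
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0))  # slope expressed as an angle
    position = min(x0, x1)                              # horizontal position
    return length, angle, position
```

A shrinking length and a growing angle of the first segment, or a left-shifted position of the second segment, would then signal that the wall is getting closer.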
  • FIG. 9 shows another schematic diagram depicting a change of the indicator light captured by a light sensor of a self-guiding machine approaching a cliff according to one embodiment of the present disclosure.
  • the present example shows the self-guiding machine 90 , 90 ′ travels over a ground 94 and will meet an obstacle, i.e. a cliff, so the system in the self-guiding machine 90 , 90 ′ is required to detect the obstacle and avoid falling.
  • the diagram shows a self-guiding machine 90 depicted by a solid line is originally at a first position.
  • a light emitter 901 of the self-guiding machine 90 at the first position emits an indicator light ( 903 , 904 ), and a light sensor 902 captures an image containing the indicator light ( 903 , 904 ) in its viewing angle.
  • the self-guiding machine 90 then moves to a second position closer to an edge 93 and is marked as the self-guiding machine 90 ′ depicted by a dotted line.
  • the light sensor 902 ′ senses the indicator light ( 903 ′, 904 ′) emitted by the light emitter 901 ′ of the self-guiding machine 90 ′ in a viewing angle.
  • the edge 93 formed by the cliff cuts the indicator light, and the segment projected onto the ground 94 is marked as a first segment 903 , 903 ′. It should be noted that the segment 904 , 904 ′ of the indicator light being sensed, which is disconnected from the first segment ( 903 , 903 ′), is projected onto a distant wall at a distance from the cliff and is still sensed by the light sensor 902 , 902 ′.
  • This exemplary example shows that the length of the first segment 903 , 903 ′ of the indicator light sensed by the light sensor 902 , 902 ′ becomes shorter and its slope becomes larger when the self-guiding machine 90 , 90 ′ moves from the first position to the second position that is closer to the edge 93 of the cliff. Therefore, the feature(s) of the first segment 903 , 903 ′ of the indicator light sensed by the light sensor 902 , 902 ′ can serve as the information for the system to detect the obstacle, i.e. the cliff. The system accordingly determines if the self-guiding machine 90 , 90 ′ approaches the edge 93 of the cliff.
  • FIGS. 10A, 10B and 10C show three frames of images depicting an example of the lights received by a self-guiding machine that approaches an obstacle at distances of 20 cm, 10 cm and 5 cm.
  • FIG. 10A shows a frame 1001 with a width from pixel 0 to pixel 200 .
  • the frame 1001 shows a light segment 1004 of the indicator light projected onto the path the self-guiding machine travels toward when the self-guiding machine is at a 20 cm distance from an obstacle, e.g. a wall or a cliff.
  • the frame 1001 further uses a dotted line 1005 to indicate the position of the light segment projected onto the wall.
  • FIG. 10B shows another frame 1002 with the same width.
  • the frame 1002 shows a light segment 1006 of the indicator light projected onto the path when the self-guiding machine is at a 10 cm distance from the obstacle.
  • the dotted line 1007 indicates the position of the light segment projected onto the wall. It is noted that the slope of the light segment 1006 is larger than the slope of the light segment 1004 shown in FIG. 10A ; likewise, the length of the light segment 1006 is shorter than the length of the light segment 1004 shown in FIG. 10A since the self-guiding machine is getting closer to the obstacle.
  • FIG. 10C shows one more frame 1003 with the same width.
  • the frame 1003 shows a light segment 1008 of the indicator light projected onto the path when the self-guiding machine is at a 5 cm distance from the obstacle.
  • the dotted line 1009 indicates the position of the light segment projected onto the wall.
  • the slope of the light segment 1008 is larger than the slope of the light segment 1006 shown in FIG. 10B ; likewise, the length of the light segment 1008 is shorter than the length of the light segment 1006 shown in FIG. 10B since the self-guiding machine is getting closer to the obstacle.
  • the position of the segment of indicator light projected onto the obstacle, i.e. the wall, also acts as an indicator of the spatial relationship between the self-guiding machine and the obstacle.
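One plausible way to turn such frames into a distance estimate is a calibration table mapping the sensed slope to a distance, with linear interpolation between calibrated points. The calibration pairs below are invented for illustration and are not measurements from FIGS. 10A to 10C:

```python
def estimate_distance(slope, calibration):
    """Piecewise-linear interpolation of distance (cm) from the sensed slope
    of the light segment; `calibration` holds (slope, distance_cm) pairs.
    The slope grows as the distance shrinks, per the frames above."""
    pts = sorted(calibration)          # sort by slope
    if slope <= pts[0][0]:
        return pts[0][1]               # clamp below the table
    if slope >= pts[-1][0]:
        return pts[-1][1]              # clamp above the table
    for (s0, d0), (s1, d1) in zip(pts, pts[1:]):
        if s0 <= slope <= s1:
            t = (slope - s0) / (s1 - s0)
            return d0 + t * (d1 - d0)  # interpolate between neighbours
```

In practice the table would be measured once for a given emitter/sensor geometry, then consulted at run time.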
  • the indicator light emitted by the light emitter of the system can also be a circular light.
  • FIG. 11 shows a schematic diagram depicting a circular indicator light projected on a wall where a self-guiding machine approaches in one embodiment of the present disclosure.
  • the circular indicator light projected on the wall is sensed by the light sensor at a distance apart from the light emitter, and therefore the circular indicator light, as sensed and shown in the diagram, appears slightly distorted.
  • the circular indicator light being sensed specifies a reference point ( 1111 , 1112 ) and is divided by a dividing line into a first segment ( 111 , 113 ), e.g. the lower area, and a second segment ( 112 , 114 ), e.g. the upper area.
  • the dividing line can be a border between a ground and a wall or an edge of a cliff.
  • the solid circle indicates the self-guiding machine is at a first position
  • the dotted circle indicates the self-guiding machine is at a second position that is closer to the obstacle, e.g. the wall. It appears that both the areas of the first segment and the second segment of the dotted circle are smaller than the solid circle when the self-guiding machine approaches the obstacle. Further, referring to the reference points 1111 and 1112 respective to the solid circle and the dotted circle, it appears that the dotted circle moves to the left relative to the solid circle as the self-guiding machine approaches the obstacle.
  • the area and the position of the circular indicator light can act as the indicator for indicating the spatial relationship between the self-guiding machine and the obstacle.
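For a circular indicator light, the two features named above (area and position) can be read off a binary mask of the sensed light. The mask representation is an assumption for illustration; a real pipeline would first segment the light out of the camera frame:

```python
def circle_features(mask):
    """Area (pixel count) and centroid x-coordinate of the sensed circular
    indicator light, given a binary mask as a list of rows of 0/1."""
    area = 0
    x_sum = 0
    for row in mask:
        for x, v in enumerate(row):
            if v:
                area += 1      # count lit pixels
                x_sum += x     # accumulate for the centroid
    centroid_x = x_sum / area if area else None
    return area, centroid_x
```

A shrinking area together with a leftward-moving centroid would match the behaviour of the dotted circle relative to the solid circle as the machine approaches the obstacle.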
  • when the system obtains the spatial relationship, the central processor of the system accordingly computes a distance between the self-guiding machine and the obstacle, and determines if the self-guiding machine will collide with the obstacle by comparing the distance with a collision threshold stored in a memory of the system. Similarly, the spatial relationship also allows the system to determine if the self-guiding machine will fall due to the obstacle by comparing the distance with a falling threshold stored in the memory.
  • the system shown in FIG. 7 performs the method for obstacle detection adapted to a self-guiding machine.
  • FIG. 12 shows a flow chart describing the method for obstacle detection in one embodiment.
  • step S 121 the light emitter of the system emits an indicator light
  • step S 123 the light sensor of the system senses the indicator light and generates an image.
  • the light sensor is such as a camera that captures the image containing the indicator light within a viewing angle.
  • the indicator light can reflect a spatial relationship between the self-guiding machine and the obstacle when it is projected onto the path, including the ground and/or the obstacle, the self-guiding machine travels over.
  • the image processor of the system then analyzes the image to acquire at least one feature of the indicator light being sensed, as in step S 125 .
  • the feature can be a length, a position, a slope and/or an area of the indicator light being sensed. Any of the features is provided for the central processor of the system to compute the length, the position, and/or the slope regarding the linear indicator light or the area regarding the circular indicator light.
  • the at least one feature allows the system to determine a distance between the self-guiding machine installing the system and an obstacle.
  • the system can also find a moving trend of the self-guiding machine according to the change of the feature extracted from the indicator light being sensed for a period of time by the light sensor of the system. Reference is made to FIG. 13 .
  • step S 151 the light emitter continuously emits an indicator light projected onto the ground and/or the obstacle in front of the self-guiding machine.
  • step S 153 the light sensor is driven to capture at least two different images containing the indicator light within a time period.
  • the system can obtain at least one feature from each individual image.
  • the feature can also be the length, the position and/or the slope obtained from the linear indicator light being sensed by the light sensor, or the area obtained from the circular indicator light.
  • step S 157 any change of length, slope, position and/or area of the indicator lights in both images within the time period can be obtained for determining the change of the spatial relationship between the self-guiding machine and the obstacle.
  • the change of the indicator light projected onto the ground and/or the obstacle can be used to determine the moving trend of the self-guiding machine.
  • the system accordingly can determine if the self-guiding machine is getting close to any obstacle that it should avoid. Therefore, the system can issue an alarm in advance for the self-guiding machine.
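The trend detection of steps S 151 to S 157 can be sketched as a comparison of the first-segment features of two frames taken within the time period. The (length, slope) tuple layout and the "approaching"/"leaving" labels are illustrative assumptions, not terms from the disclosure:

```python
def moving_trend(feat_t0, feat_t1, tol=1e-6):
    """Classify the moving trend from the first-segment features of two
    frames; each feature is a (length, slope) tuple. A shrinking length
    together with a growing slope indicates the machine is closing on
    the obstacle, and the reverse indicates it is moving away."""
    len0, slope0 = feat_t0
    len1, slope1 = feat_t1
    if len1 < len0 - tol and slope1 > slope0 + tol:
        return "approaching"
    if len1 > len0 + tol and slope1 < slope0 - tol:
        return "leaving"
    return "steady"
```

An "approaching" result is what would trigger the advance alarm mentioned above.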
  • the method and the system for obstacle detection can be adapted to a self-guiding machine such as an autonomous vehicle or an autonomous cleaning robot.
  • the system can acquire a spatial relationship between the self-guiding machine and the obstacle according to at least one feature extracted from an indicator light projected onto the path the self-guiding machine travels over.
  • the spatial relationship allows the self-guiding machine to compute a distance between the self-guiding machine and the obstacle so as to determine if the self-guiding machine will collide with a wall, or determine if the self-guiding machine will fall from a cliff.
  • the invention provides the self-guiding machine with a solution to determine a distance from an obstacle and optionally to warn the self-guiding machine when it approaches the obstacle. Further, the system can accordingly instruct the driving system of the self-guiding machine to avoid the obstacle.

Abstract

The disclosure is related to a method and a system for obstacle detection adapted to a self-guiding machine. The method is performed in the system including a controller for driving the system, a light emitter, a light sensor, an image processor and a central processor. The light emitter and the light sensor are set apart at a distance. When the light emitter emits an indicator light being projected onto a path the self-guiding machine travels toward, the light sensor senses the indicator light. An image containing the indicator light is generated. After analyzing the image, at least one feature of the indicator light being sensed can be obtained and used to obtain a spatial relationship between the self-guiding machine and an obstacle. The spatial relationship allows the system to determine if the self-guiding machine will collide with a wall or fall from a cliff.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The disclosure is generally related to a technology for detecting obstacle, and in particular to a method for detecting an obstacle such as a cliff that is on the path of a moving machine, a system for implementing the method, and a sensor subsystem thereof.
  • 2. Description of Related Art
  • A self-guiding device, e.g. an automatic vehicle or an automatic robot cleaning machine, should avoid colliding with other objects or falling from a height. To avoid colliding with a wall, the self-guiding device requires a proximity sensor to determine a distance to the wall in order to avoid the collision. A conventional proximity sensor, such as a radar, an ultrasonic sensor, or a light (infrared) sensor, is able to detect the presence of any nearby object without physical contact.
  • For example, the ultrasonic sensor is utilized to emit ultrasonic waves and receive the reflected waves for detecting the object in front that reflects the ultrasonic waves. The time difference between emission and reception can be used to determine the distance. For cliff detection, a cliff sensor mounted at the bottom of the self-guiding device utilizes an infrared ray or ultrasonic waves to detect the presence of a floor under the device by measuring the infrared ray or waves reflected or scattered from the surface of the floor. Therefore, such a cliff sensor can prevent the device from traveling over the cliff when it detects that the device is moving over one.
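The time-of-flight principle mentioned above reduces to halving the echo's round trip. A minimal sketch, assuming sound travels at roughly 343 m/s in air at room temperature:

```python
SPEED_OF_SOUND_M_S = 343.0  # dry air at about 20 °C

def ultrasonic_distance_m(round_trip_s):
    """Distance to the reflecting object from the echo round-trip time.

    The pulse travels to the object and back, so the one-way distance
    is half the total path covered in round_trip_s seconds.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0
```

So a 10 ms round trip corresponds to an object about 1.7 m away.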
  • When the self-guiding device is able to detect the obstacle, e.g. the wall or the cliff, its controller can evaluate the sensed data and instruct a driving system of the self-guiding device to stop the device or avoid the obstacle when it approaches or arrives at the obstacle.
  • SUMMARY OF THE INVENTION
  • For further understanding of the instant disclosure, reference is made to the following detailed description illustrating the embodiments of the instant disclosure. The description is only for illustrating the instant disclosure, and not for limiting the scope of the claim.
  • One of the objectives of the method and the system for obstacle detection in one aspect of the disclosure is to detect an obstacle on the path a self-guiding machine travels. The process of obstacle detection is performed in the self-guiding machine when it travels around an area. An algorithm operated in the self-guiding machine is used to process the data that is generated by a sensor subsystem installed in the machine.
  • According to one of the embodiments, the sensor subsystem of the system for performing the obstacle detection includes a light emitter and a light sensor. The light emitter emits an indicator light, and the light sensor captures a single image at a time or a series of images containing the indicator light over a period of time. In one aspect of the present disclosure, the information extracted from the image can be used to estimate a distance to the obstacle. In another aspect of the present disclosure, a change occurring in the series of images can be used to determine if the self-guiding machine approaches the obstacle, and the self-guiding machine can then perform an avoidance measure in order to avoid the risk.
  • The indicator light emitted by the light emitter is such as a linear light projected onto the area in front of the self-guiding machine. The light sensor can capture an image containing the linear light. Through an image analysis process, the captured linear light renders the information that is used to determine the distance to the obstacle, or how close it approaches the obstacle.
  • Moreover, the method for obstacle detection can be adapted to a self-guiding machine that has a light emitter and a light sensor. The light emitter and the light sensor are preferably set apart at a distance. The light emitter emits an indicator light projected onto a path the self-guiding machine travels toward, and the light sensor senses the indicator light projected onto the path so as to generate an image containing the indicator light.
  • After analyzing the image, at least one feature of the indicator light being sensed can be obtained. Then a spatial relationship between the self-guiding machine and an obstacle can be obtained in response to the at least one feature of the indicator light being sensed. This spatial relationship allows the self-guiding machine to compute a distance between the self-guiding machine and the obstacle, and determine if the self-guiding machine will collide with the obstacle by comparing the distance with a collision threshold, or determine if the self-guiding machine will fall due to the obstacle by comparing the distance with a falling threshold.
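The two threshold comparisons described above can be sketched as follows; the function signature and the "wall"/"cliff" flag are illustrative assumptions, not the patent's interface:

```python
def classify_risk(distance_cm, collision_threshold_cm, falling_threshold_cm,
                  obstacle_type):
    """Compare the computed distance with the stored thresholds.

    obstacle_type is 'wall' for a collision hazard or 'cliff' for a
    falling hazard, mirroring the two cases in the description above.
    """
    if obstacle_type == "wall" and distance_cm <= collision_threshold_cm:
        return "collision-risk"
    if obstacle_type == "cliff" and distance_cm <= falling_threshold_cm:
        return "falling-risk"
    return "safe"
```

A non-"safe" result is what would prompt the central processor to signal the controller to steer the machine away.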
  • According to one further embodiment, a system for obstacle detection is provided. The system installed in a self-guiding machine includes a controller, a light emitter, a light sensor, an image processor and a central processor. The image processor is used to generate an image containing the indicator light that is projected onto the path. The central processor is used to perform the method for obstacle detection.
  • Further, a sensor subsystem is provided in the system for obstacle detection. In one embodiment, the sensor subsystem mainly includes a light emitter, a light sensor, and an image processor that are used to emit the indicator light projected onto a path, sense the indicator light, and render the image containing the indicator light. The image of indicator light is provided for the system of the self-guiding machine to analyze and obtain a spatial relationship between the self-guiding machine and the obstacle.
  • Moreover, the system obtains the spatial relationship that allows the central processor to compute a distance between the self-guiding machine and the obstacle, and determine if the self-guiding machine will collide with the obstacle by comparing the distance with a collision threshold stored in a memory of the system, or determine if the self-guiding machine will fall due to the obstacle by comparing the distance with a falling threshold stored in the memory. Further, when the self-guiding machine reaches the collision threshold or the falling threshold, the central processor generates a signal for instructing the controller to drive the self-guiding machine to avoid the obstacle.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
  • FIG. 1 shows a schematic diagram using a top view to depict a circumstance that a self-guiding machine approaches an obstacle in one embodiment of the present disclosure;
  • FIG. 2 shows a schematic diagram depicting a circumstance that a self-guiding machine is in front of a cliff in one embodiment of the present disclosure;
  • FIG. 3 shows another schematic diagram depicting a perspective view of a circumstance that the self-guiding machine is in front of a wall in one further embodiment of the present disclosure;
  • FIG. 4A and FIG. 4B are the schematic diagrams showing a self-guiding machine approaches a wall in one embodiment of the present disclosure;
  • FIG. 5A and FIG. 5B are the schematic diagrams showing a self-guiding machine approaches a cliff in one further embodiment of the present disclosure;
  • FIG. 6A through FIG. 6D are the schematic diagrams showing a self-guiding machine approaches an obstacle with a height from a ground in one embodiment of the present disclosure;
  • FIG. 7 shows circuit blocks of a system for obstacle detection according to one embodiment of the present disclosure;
  • FIG. 8 shows a schematic diagram depicting a change of the indicator light captured by a light sensor of a self-guiding machine approaching a wall according to one embodiment of the present disclosure;
  • FIG. 9 shows a schematic diagram depicting a change of the indicator light captured by a light sensor of a self-guiding machine approaching a cliff according to one embodiment of the present disclosure;
  • FIGS. 10A, 10B and 10C show examples of the received lights by a self-guiding machine that approaches an obstacle according to one embodiment of the present disclosure;
  • FIG. 11 shows a schematic diagram depicting a circular indicator light projected on a wall where a self-guiding machine approaches in one embodiment of the present disclosure;
  • FIG. 12 shows a flow chart describing a process for obstacle detection adapted to a self-guiding machine in one embodiment of the present disclosure;
  • FIG. 13 shows a flow chart describing a process for obstacle detection adapted to a self-guiding machine in one further embodiment of the present disclosure.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • The aforementioned illustrations and following detailed descriptions are exemplary for the purpose of further explaining the scope of the instant disclosure. Other objectives and advantages related to the instant disclosure will be illustrated in the subsequent descriptions and appended drawings. In addition, for ease of illustration, similar reference numbers or symbols refer to elements alike.
  • The disclosure is related to a method and a system for obstacle detection adapted to a self-guiding machine. The self-guiding machine is such as an autonomous vehicle or an autonomous cleaning robot that can navigate an area automatically. The method allows the self-guiding machine to sense and identify an obstacle in front of the machine, and then drives the self-guiding machine to avoid the obstacle automatically. The system for obstacle detection is exemplified as a sensor subsystem that essentially includes a light emitter and a light sensor that can be installed in the self-guiding machine.
  • It is featured that the light emitter and the light sensor are set apart at a distance in a horizontal direction, and the light emitter and the light sensor can be disposed at the same or different horizontal levels. The light emitter may utilize a laser or an LED as a linear light source that emits a linear light as the indicator light, or alternatively emits a certain area of light as the indicator light. The light sensor is used to sense the linear light or the area of light that is projected onto a path the self-guiding machine travels toward.
  • Further, one of the objectives of the method and the system for obstacle detection in one aspect of the disclosure is to detect the obstacle, e.g. a wall, a cliff, or a floating obstacle, on the path the self-guiding machine travels. The method can be implemented by an algorithm operated in the system for obstacle detection. The self-guiding machine can itself process the data that is generated by the sensor subsystem and perform an avoidance measure.
  • Reference is made to FIG. 1 showing a schematic diagram that uses a top view to depict a circumstance that a self-guiding machine approaches an obstacle in one embodiment of the present disclosure.
  • A self-guiding machine 10 shown in the diagram can be an autonomous robot, e.g. an autonomous cleaning device that navigates an area with various terrains. A system for obstacle detection is installed in the self-guiding machine 10 for sensing the obstacle on a path the self-guiding machine 10 travels toward. The system can exemplarily include a sensor subsystem that essentially includes a light emitter 101, a light sensor 102 and processing circuitry. The diagram also shows the light emitter 101 and the light sensor 102 set apart at a distance from each other in a horizontal direction. It is noted that the light emitter 101 and the light sensor 102 need not be disposed at the same horizontal level.
  • The light emitter 101 exemplarily utilizes a laser or an LED as a light source to emit a linear light 103. The linear light 103 may be formed by passing the light from the light source through a specific lens. When the self-guiding machine 10 travels, the linear light 103 is continuously projected onto the scene, e.g. a ground 19 or any surface of any terrain, in front of the self-guiding machine 10. For example, when the self-guiding machine 10 travels toward a wall 18 that forms the obstacle at a distance ‘d’ from the self-guiding machine 10, the linear light 103 acting as an indicator light is projected onto both the wall 18 and the ground 19. A border 15 between the ground 19 and the wall 18 divides the linear light 103 into a first segment, i.e. the lower segment, and a second segment, i.e. the upper segment, of the indicator light being sensed. The light sensor 102 is such as a camera whose field of view, indicated by the two dotted lines drawn in the diagram, is used for capturing an image. It is noted that the field of view of the light sensor 102 should cover the indicator light 103 in order to effectively detect a status of the self-guiding machine 10. The light sensor 102 is configured to sense the linear light 103 and capture an image containing it. When the system for obstacle detection is operated in the self-guiding machine 10, a length ‘s’ of the lower segment of the linear light 103 can be sensed by the light sensor 102; this length ‘s’ serves as an indicator for determining the distance ‘d.’
  • It should be noted that the distance ‘d’ between the self-guiding machine 10 and the wall 18 can be determined based on the length ‘s’ sensed by the light sensor 102. Further, a horizontal position of the upper segment of the linear light 103 can also serve as an indicator for determining the distance ‘d.’ Furthermore, in one embodiment, the light sensor 102 can continuously sense the indicator light, and the system can accordingly generate a series of images containing the indicator light. If any change of the length ‘s’ of the lower segment of the linear light 103, or of the position or length of the upper segment, is found across the series of images, the change allows the system to acknowledge that the self-guiding machine 10 is approaching or leaving the wall 18.
  • Therefore, in addition to the length or position of the indicator light captured by the light sensor 102, any change of the aforementioned length or position obtained from the series of images can also be used to determine a status of the self-guiding machine 10, e.g. a moving trend of the self-guiding machine 10. The system for obstacle detection also establishes a warning mechanism for the self-guiding machine 10 according to the spatial relationship between the self-guiding machine 10 and the wall 18.
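As a rough sketch of how the sensed geometry can yield the distance ‘d,’ the snippet below assumes a simple pinhole camera model with a hypothetical focal length and mounting height (neither value comes from the disclosure); a deployed system would obtain this mapping from calibration.

```python
def distance_to_wall(y_px, focal_px=600.0, cam_height_m=0.08):
    """Estimate the distance d (meters) to the wall base from the image
    row y_px (pixels below the principal point) at which the lower
    segment of the indicator light ends.

    Pinhole geometry: a ground point at distance d, lying cam_height_m
    below a forward-looking camera, projects to y = f * h / d, hence
    d = f * h / y.  Both default parameters are illustrative assumptions.
    """
    if y_px <= 0:
        raise ValueError("segment endpoint must lie below the principal point")
    return focal_px * cam_height_m / y_px
```

With these assumed parameters, an endpoint 120 pixels below the principal point would correspond to a wall roughly 0.4 m away; the closer the wall, the lower in the image the segment endpoint appears.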
  • One further type of obstacle is a cliff. Reference is made to FIG. 2, a schematic diagram depicting a circumstance in which a self-guiding machine is in front of a cliff in one embodiment of the present disclosure.
  • The self-guiding machine 10 travels over a plane 20 of a terrain, e.g. a table or a ground, toward a cliff 22. The cliff 22 can be formed by a vertical section of the table or a downward stairway of the ground. The light emitter 101 of the self-guiding machine 10 continuously emits an indicator light 103′ projected onto the way the machine 10 travels over. The indicator light 103′ of the present example appears as a linear light when the light emitter 101 emits the linear light in a vertical direction. The diagram uses section lines to present the range of the emitted light. The light sensor 102 senses the indicator light 103′ within its field of view, and the system generates an image containing the indicator light 103′.
  • According to the present embodiment, the indicator light 103′ appears to be cut by an edge 15′ between the plane 20 and the cliff 22. The cut indicator light 103′ is shown in the image captured by the light sensor 102. The indicator light 103′ shows that an obstacle, e.g. the cliff 22, will be met by the self-guiding machine 10, and its length or slope/angle sensed by the light sensor 102 allows the system to determine a distance between the self-guiding machine 10 and the obstacle indicated by the edge 15′. Therefore, when the system for obstacle detection acknowledges that the cliff 22 is on the way the self-guiding machine 10 travels over, the system will instruct the self-guiding machine 10 to avoid this obstacle.
  • FIG. 3 shows another schematic diagram, a perspective view depicting a circumstance in which the self-guiding machine is in front of a wall in one further embodiment of the present disclosure.
  • In the diagram, the self-guiding machine 10 travels over a ground 30 toward a wall 32. The light emitter 101 of the self-guiding machine 10 emits a linear light as an indicator light in a vertical direction. The range of the emitted light covers both the ground 30 and the wall 32 in front of the self-guiding machine 10.
  • When the self-guiding machine 10, traveling over the ground 30, approaches the wall 32, the length of an indicator light 103″ indicates how closely the self-guiding machine 10 approaches the wall 32. In another aspect of the method for obstacle detection of the present disclosure, a slope or angle of the indicator light 103″ can also show the spatial relationship between the self-guiding machine 10 and the wall 32; alternatively, the position of a segment of the indicator light on the wall 32 can also serve as an indicator depicting the spatial relationship.
  • According to the above embodiments of the method and the system for obstacle detection, a linear light emitted by the light emitter of the system installed in the self-guiding machine acts as an indicator light and the feature(s) of the indicator light can be used to determine a spatial relationship between the self-guiding machine and an obstacle, e.g. a wall or a cliff. Therefore, the system can effectively prevent the self-guiding machine from colliding with the wall or falling from the cliff. Since the light emitter and the light sensor of the system are set apart at a distance, i.e. preferably a horizontal distance, the features of the indicator light sensed by the light sensor can be analyzed for rendering the information such as a length, a position, a slope/angle and/or an area. At least one of the features is sufficient to be used to determine the spatial relationship between the self-guiding machine and the obstacle. Further, a change of one of the features between at least two images having the indicator light can be used to render a moving trend of the self-guiding machine. This moving trend allows the system to timely issue an alarm of collision or falling to the self-guiding machine.
  • FIG. 4A and FIG. 4B are schematic diagrams showing a change of one of the features that allows the system to issue the alarm of collision to a self-guiding machine when it approaches a wall.
  • In FIG. 4A, the lateral view diagram shows a self-guiding machine 40, such as an autonomous vehicle, traveling over a ground 43. A light emitter of the system for obstacle detection installed in the self-guiding machine 40 emits an indicator light to the front of the machine 40. The light emitter has a range of emission 42 and its emitted light is confined to this range. The indicator light can be sensed by a light sensor of the system. When the self-guiding machine 40 is in front of a wall 44 within a specific distance that allows the system to determine a spatial relationship between the self-guiding machine 40 and the obstacle, the indicator light is projected onto both the ground 43 and the wall 44. While the indicator light is sensed by the light sensor within its field of view, the length ‘40a’ of a first segment and the length ‘40b’ of a second segment are respectively computed by analyzing the image containing all or part of the indicator light.
  • Reference is made to FIG. 4B: while the self-guiding machine 40 approaches the wall 44, the range of emission 42′ becomes smaller than in the previous status. Simultaneously, the indicator light sensed by the light sensor changes. For example, the length of the first segment of the indicator light projected onto the ground 43 changes to ‘40a′’ and the length of the second segment of the indicator light projected onto the wall 44 changes to ‘40b′.’ Therefore, the system for obstacle detection utilizes the length of the first segment or the second segment of the indicator light being sensed as the feature for determining the distance between the self-guiding machine 40 and the obstacle, e.g. the wall 44.
  • Furthermore, when the system captures two or more images containing the indicator light, a change of at least one feature of the indicator light can be used to obtain a moving trend of the machine and to determine if the self-guiding machine approaches the obstacle.
  • FIG. 5A and FIG. 5B are schematic diagrams showing a change of one of the features that allows the system to issue a falling alarm to a self-guiding machine when it approaches a cliff.
  • In FIG. 5A, the lateral view diagram shows a self-guiding machine 50 traveling over a ground 53 with a cliff 54 at a distance away. A light emitter of the system for obstacle detection in the self-guiding machine 50 emits an indicator light to the front of the machine 50. The indicator light can be sensed by a light sensor of the system. When the self-guiding machine 50 is at a distance in front of the cliff 54, a segment of the indicator light is projected onto the ground 53 and the rest of the indicator light is cut off by an edge of the ground 53 due to the cliff 54. The indicator light projected onto the ground 53 leaves a length ‘50a’ to be sensed by the light sensor. While the indicator light is sensed by the light sensor within its field of view, the length ‘50a’ is computed by analyzing the image containing all or part of the indicator light.
  • Reference is made to FIG. 5B: while the self-guiding machine 50 approaches the cliff 54, the length of the indicator light projected onto the ground 53 becomes shorter. For example, the length of the indicator light projected onto the ground 53 and sensed by the light sensor is ‘50a’ in FIG. 5A, and it changes to ‘50a′’ in FIG. 5B. Therefore, the system for obstacle detection also utilizes the length of the indicator light being sensed as the feature for determining the distance between the self-guiding machine 50 and the obstacle, e.g. the cliff 54.
  • Similarly, when the system captures two or more images containing the indicator light, a change of length of the indicator light can be used to obtain a moving trend of the machine and to determine if the self-guiding machine approaches the obstacle.
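The two-frame comparison described above can be sketched as follows; the length values and the tolerance are illustrative assumptions, not values from the disclosure.

```python
def moving_trend(len_prev, len_curr, tol=1.0):
    """Classify the machine's motion relative to an obstacle (wall or
    cliff) from the sensed ground-segment length in two consecutive
    frames.  A shrinking segment means the machine is approaching the
    obstacle; `tol` (in pixels) suppresses jitter between frames."""
    delta = len_curr - len_prev
    if delta < -tol:
        return "approaching"
    if delta > tol:
        return "receding"
    return "steady"
```

For instance, if the segment length drops from 80 to 60 pixels between two frames, the function reports that the machine is approaching the obstacle.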
  • FIG. 6A through FIG. 6D are schematic diagrams showing a self-guiding machine approaching a floating obstacle at a height above a ground in one further embodiment of the present disclosure.
  • In FIG. 6A, a self-guiding machine 60 travels over a ground 63 and approaches a floating obstacle 64. A light sensor of the self-guiding machine 60 senses an indicator light, for example a linear light, emitted by a light emitter of the system installed in the self-guiding machine 60. When the self-guiding machine 60 is in front of the floating obstacle 64, which is at a height above the ground 63, the indicator light within a range of emission 62 of the light emitter forms two segments: one segment of the indicator light is projected onto the ground 63 and the other segment is projected onto the floating obstacle 64. The segment projected onto the floating obstacle 64 forms a light with a length ‘60b’ sensed by the light sensor.
  • As the system for obstacle detection operates, the light sensor is driven to capture an image with the segment of indicator light projected onto the ground 63 and the other segment, having a length ‘60b,’ projected onto the floating obstacle 64. The image corresponds to the frame 66 shown in FIG. 6B. The frame 66 shows at least two features extracted from the image captured by the light sensor. A first segment 60a shown in the frame 66 indicates the segment of indicator light projected onto the ground 63. A second segment 60b shown in the frame 66 indicates the segment of indicator light projected onto the floating obstacle 64.
  • Next, in FIG. 6C, the self-guiding machine 60 gets close to the floating obstacle 64 over the ground 63. The floating obstacle 64 changes the range of emission 62′ of the light emitter. Both segments of the indicator light, projected respectively onto the ground 63 and the floating obstacle 64, have changed. For example, the length of the segment projected onto the floating obstacle 64 changes to ‘60b′’ because the spatial relationship between the self-guiding machine 60 and the floating obstacle 64 has changed.
  • The image captured by the light sensor corresponds to the frame 66′ shown in FIG. 6D. When the self-guiding machine 60 approaches the floating obstacle 64, both the features of the first segment 60a′ and the second segment 60b′ change. The frame shows that both the length and the slope of the first segment 60a′ have changed. For example, the length of the first segment 60a′ is shorter than the length of the first segment 60a, and its slope also changes when the self-guiding machine 60 approaches the floating obstacle 64. Further, both the length and the position of the second segment 60b′ have changed. For example, the length of the second segment 60b′ is shorter than the length of the second segment 60b, and its position shifts to the left in the same instance.
  • According to one of the embodiments, a sensor module essentially includes a light emitter and a light sensor that are set apart at a distance. The light emitter emits a type of indicator light, and the light sensor captures one image at a time, or a series of images containing the indicator light over a period of time. The indicator light emitted by the light emitter is such as a linear light, a circular light or any shape of light projected onto the way the self-guiding machine travels toward. Through an image analysis process, the information extracted from the captured indicator light is used to determine a spatial relationship between the self-guiding machine and an obstacle. The spatial relationship allows the system to instruct the driving system of the self-guiding machine to avoid the risk of collision or falling.
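The image analysis process can begin by locating the indicator light in the captured frame. A minimal sketch, assuming an 8-bit grayscale frame in which the indicator light is markedly brighter than the background (the threshold value is an assumption):

```python
import numpy as np

def indicator_pixels(frame, thresh=200):
    """Locate the indicator light in a grayscale frame (2-D uint8 array).
    For each image column, take the brightest pixel and keep it only if
    it exceeds the threshold; returns (cols, rows) index arrays tracing
    the sensed light across the image."""
    rows = frame.argmax(axis=0)            # brightest row in each column
    bright = frame.max(axis=0) >= thresh   # keep only truly lit columns
    cols = np.nonzero(bright)[0]
    return cols, rows[cols]
```

The resulting pixel trace can then be split into segments (e.g. at the turning point caused by a wall border) before the features are measured.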
  • FIG. 7 shows circuit blocks of a system for obstacle detection according to one embodiment of the present disclosure.
  • The system for obstacle detection is provided with a sensor subsystem that is capable of data processing since it has its own processor and memory. The sensor subsystem is used to collect the environmental information around it and to process that information to render a protection mechanism. The environmental information is such as images of the environment around the self-guiding machine adopting this system. The environmental information allows the system to determine if the self-guiding machine will meet any risk of damage.
  • The functions provided by the system are implemented by the circuit components shown in the diagram. The system includes a controller 701 that is in charge of controlling the operations of the other circuit components. The controller 701 is used to drive a light emitter 702 to emit the indicator light, and also to control a light sensor 705 to sense the light signals within its field of view. In an exemplary embodiment, the indicator light, such as a linear light, a circular light or any other type of indicator light, can be controlled to function continuously or periodically. The light emitter 702, coupled to the controller 701, has a light source and its driving circuit, and is used to emit an indicator light through a requisite optical component and/or a window mounted on a surface of the self-guiding machine. The optical component is such as a lens that can be used to shape the indicator light into a linear light, a circular light or any shape of light. The light sensor 705, coupled to the controller 701 and set apart at a distance from the light emitter 702, is used to sense the indicator light emitted by the light emitter 702. The scene in front of the self-guiding machine is captured by the light sensor 705 and transmitted to an image processor 704 of the system for generating an image containing the indicator light. The image processor 704 is coupled to the light sensor 705 and is used to generate the image. The image is temporarily buffered, for example, in a memory 706.
  • According to a circuitry plan in one embodiment of the disclosure, the light emitter 702, the light sensor 705 and the image processor 704 form a sensor subsystem installed in the self-guiding machine. The sensor subsystem is in charge of generating the indicator light and rendering the image of the indicator light. Therefore, the self-guiding machine can use at least one feature of the indicator light being sensed to obtain a spatial relationship between itself and an obstacle when the self-guiding machine approaches the obstacle on its path.
  • After that, at least one feature of the indicator light contained in the image is processed by a central processor 703 that is coupled to the image processor 704 and the controller 701. The central processor 703 is used to perform the method for obstacle detection. In one embodiment of the present disclosure, the memory 706 coupled to the central processor 703 acts as a system memory or storage that can be used to store the instructions for performing the functions provided by the system, for example the method for obstacle detection. The method performed by the central processor 703 primarily includes analyzing the image containing the indicator light sensed by the light sensor 705, obtaining at least one feature of the indicator light being sensed, and obtaining a spatial relationship between the self-guiding machine and an obstacle in response to the at least one feature when the self-guiding machine approaches the obstacle on its path.
  • According to one embodiment of the present disclosure, the information extracted by the sensor subsystem from the image having the indicator light can be used to estimate a distance to an obstacle. Further, the spatial relationship, e.g. the distance between the self-guiding machine and the obstacle, allows the system to determine if the self-guiding machine will be in any dangerous situation, for example colliding with a wall or falling from a cliff. In an exemplary embodiment, the controller 701 of the system is coupled to a machine driver 708 that links to a driving system of the self-guiding machine adopting this system. When the system determines that an obstacle exists at a distance from the self-guiding machine, the controller 701 generates a signal instructing the machine driver 708 to respond to the obstacle.
  • In one further embodiment, the light sensor 705 can also continuously capture a series of images covering the indicator light, and a change occurring in the series of images can be found and used to determine if the self-guiding machine approaches the obstacle. The self-guiding machine can then perform an avoidance measure in order to avoid the risk.
  • As a matter of illustration, the following figures are given as a guide to describe the method for obstacle detection in accordance with the embodiments described above.
  • FIG. 8 shows a schematic diagram depicting a change of the indicator light captured by a light sensor of a self-guiding machine approaching a wall according to one embodiment of the present disclosure.
  • The system installed in a self-guiding machine 80, 80′ is exemplified as the sensor subsystem essentially including a light emitter 801, 801′ and a light sensor 802, 802′. The self-guiding machine 80, 80′ acts as an autonomous vehicle including a computer that drives the light emitter 801, 801′ to emit an indicator light, integrates the data received by the light sensor 802, 802′ and then processes the data to acquire the terrain information regarding the path in front of the self-guiding machine 80.
  • As the diagram shows, the self-guiding machine 80 depicted by a solid line is at a first position when it travels over a ground 83. The self-guiding machine 80 includes the light emitter 801 and the light sensor 802. The light emitter 801 emits an indicator light 803. The indicator light 803 depicted in the diagram is based on the image sensed by the light sensor 802 within its viewing angle. The diagram shows that the indicator light 803 has a turning point that divides it into two segments due to a border between the ground 83 and the wall 84.
  • The self-guiding machine 80 then moves to a second position closer to the wall 84 and is marked as the self-guiding machine 80′, depicted by a dotted line. At the second position, the light emitter 801′ still emits an indicator light 803′, and the light sensor 802′ receives the indicator light 803′ and generates another image. Similarly, the indicator light 803′ depicted in the diagram is based on the image sensed by the light sensor 802′ within its viewing angle. The border between the ground 83 and the wall 84 causes the indicator light 803′ to have another turning point that divides the indicator light 803′ into two segments.
  • This exemplary example shows several changes of the indicator light 803, 803′ projected onto both the ground 83 and the wall 84 when the self-guiding machine 80, 80′ travels toward the obstacle, i.e. the wall 84. It should be noted that the light emitter 801, 801′ emits a linear light and the light sensor 802, 802′ senses the light within a sensing range confined by its viewing angle. The diagram shows that, when the linear light is projected onto both the ground 83 and the obstacle, i.e. the wall 84, at least one feature of the indicator light 803, 803′ can be found by analyzing the image containing the indicator light 803, 803′. For example, the length, as one of the features, of a first segment of the indicator light 803, 803′ projected onto the ground 83 becomes shorter when the self-guiding machine 80, 80′ is closer to the wall 84. Further, a slope can act as another feature for detecting the obstacle, since the image being sensed shows that the slope of the first segment of the indicator light 803, 803′ becomes larger when the self-guiding machine 80, 80′ is closer to the wall 84. Furthermore, the shorter length or the left-shifted position of a second segment of the indicator light 803, 803′ projected onto the wall 84 can also act as one of the features to detect the obstacle when the self-guiding machine 80, 80′ is closer to the wall 84. Therefore, in this exemplary example, a length of the first segment or the second segment, a position of the second segment and/or a slope/angle of the first segment can be regarded as the feature(s) allowing the system to detect the obstacle.
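Once the lit pixels have been split into segments at the turning point, the length and slope features discussed above can be computed per segment. A minimal sketch, assuming each segment is given as an ordered list of (x, y) pixel coordinates:

```python
import math

def segment_features(points):
    """Length and slope of one segment of the sensed indicator light,
    from its first and last pixel coordinates.  A shortening length or a
    growing slope of the ground segment indicates a nearer wall."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    length = math.hypot(x1 - x0, y1 - y0)
    slope = (y1 - y0) / (x1 - x0) if x1 != x0 else float("inf")
    return length, slope
```

A vertical segment (constant x) is reported with infinite slope; real implementations might prefer an angle in degrees to avoid that singularity.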
  • FIG. 9 shows another schematic diagram depicting a change of the indicator light captured by a light sensor of a self-guiding machine approaching a cliff according to one embodiment of the present disclosure. The present example shows a self-guiding machine 90, 90′ traveling over a ground 94 toward an obstacle, i.e. a cliff, and the system in the self-guiding machine 90, 90′ is required to detect the obstacle and avoid falling.
  • The diagram shows a self-guiding machine 90, depicted by a solid line, originally at a first position. A light emitter 901 of the self-guiding machine 90 at the first position emits an indicator light (903, 904), and a light sensor 902 captures an image containing the indicator light (903, 904) within its viewing angle. The self-guiding machine 90 then moves to a second position closer to an edge 93 and is marked as the self-guiding machine 90′, depicted by a dotted line. At the second position, the light sensor 902′ senses the indicator light (903′, 904′) emitted by the light emitter 901′ of the self-guiding machine 90′ within its viewing angle. The edge 93 formed by the cliff cuts the indicator light, and the segment projected onto the ground 94 is marked as a first segment 903, 903′. It should be noted that the segment 904, 904′ of the indicator light, which is not connected to the first segment 903, 903′, is projected onto a distant wall at a distance beyond the cliff and is still sensed by the light sensor 902, 902′.
  • This exemplary example shows that the length of the first segment 903, 903′ of the indicator light sensed by the light sensor 902, 902′ becomes shorter and its slope becomes larger when the self-guiding machine 90, 90′ moves from the first position to the second position, which is closer to the edge 93 of the cliff. Therefore, the feature(s) of the first segment 903, 903′ of the indicator light sensed by the light sensor 902, 902′ can be the information for the system to detect the obstacle, i.e. the cliff. The system accordingly determines if the self-guiding machine 90, 90′ approaches the edge 93 of the cliff.
  • FIGS. 10A, 10B and 10C show three frames of images depicting an example of the lights received by a self-guiding machine at distances of 20 cm, 10 cm and 5 cm from an obstacle.
  • FIG. 10A shows a frame 1001 with a width from pixel 0 to pixel 200. The frame 1001 shows a light segment 1004 of the indicator light projected onto the path the self-guiding machine travels toward when the self-guiding machine is at a distance of 20 cm from an obstacle, e.g. a wall or a cliff. The frame 1001 further uses a dotted line 1005 to indicate the position of the light segment projected onto the wall.
  • FIG. 10B shows another frame 1002 with the same width. The frame 1002 shows a light segment 1006 of the indicator light projected onto the path when the self-guiding machine is at a distance of 10 cm from the obstacle. The dotted line 1007 indicates the position of the light segment projected onto the wall. It is noted that the slope of the light segment 1006 is larger than the slope of the light segment 1004 shown in FIG. 10A, and the length of the light segment 1006 is shorter than the length of the light segment 1004 shown in FIG. 10A, since the self-guiding machine is getting close to the obstacle.
  • FIG. 10C shows one more frame 1003 with the same width. The frame 1003 shows a light segment 1008 of the indicator light projected onto the path when the self-guiding machine is at a distance of 5 cm from the obstacle. The dotted line 1009 indicates the position of the light segment projected onto the wall. Similarly, the slope of the light segment 1008 is larger than the slope of the light segment 1006 shown in FIG. 10B, and the length of the light segment 1008 is shorter than the length of the light segment 1006 shown in FIG. 10B, since the self-guiding machine is getting close to the obstacle.
  • On the other hand, according to the positions indicated by the dotted lines 1005 of FIG. 10A, 1007 of FIG. 10B and 1009 of FIG. 10C, it is found that the segment of indicator light projected onto the obstacle, i.e. the wall, gradually moves to the left as the self-guiding machine approaches the obstacle. Therefore, the position of the light segment also acts as an indicator of the spatial relationship between the self-guiding machine and the obstacle.
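One plausible way to turn the slope feature of FIGS. 10A through 10C into a distance estimate is interpolation over a calibration table. The table values below are purely hypothetical, chosen only to match the qualitative trend in the figures (steeper slope at shorter distance); a real system would measure them during calibration.

```python
import bisect

# Hypothetical calibration: (measured segment slope, distance in cm),
# sorted by slope; steeper slope corresponds to a closer obstacle.
CALIB = [(0.5, 20.0), (1.1, 10.0), (2.0, 5.0)]

def distance_from_slope(slope):
    """Piecewise-linear interpolation over the calibration table,
    clamping at both ends of the table."""
    slopes = [s for s, _ in CALIB]
    if slope <= slopes[0]:
        return CALIB[0][1]
    if slope >= slopes[-1]:
        return CALIB[-1][1]
    i = bisect.bisect_left(slopes, slope)      # first entry >= slope
    (s0, d0), (s1, d1) = CALIB[i - 1], CALIB[i]
    t = (slope - s0) / (s1 - s0)
    return d0 + t * (d1 - d0)
```

A second table keyed on the left shift of the wall segment's position could serve as a cross-check on the same distance estimate.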
  • The indicator light emitted by the light emitter of the system can also be a circular light. FIG. 11 shows a schematic diagram depicting a circular indicator light projected on a wall that a self-guiding machine approaches in one embodiment of the present disclosure.
  • The circular indicator light projected on the wall is sensed by the light sensor at a distance apart from the light emitter, and therefore the circular indicator light being sensed and shown in the diagram appears slightly distorted. The circular indicator light being sensed specifies a reference point (1111, 1112) and is divided by a dividing line into a first segment (111, 113), e.g. the lower area, and a second segment (112, 114), e.g. the upper area. The dividing line can be a border between a ground and a wall, or an edge of a cliff.
  • As the diagram shows, the solid circle indicates that the self-guiding machine is at a first position, and the dotted circle indicates that the self-guiding machine is at a second position closer to the obstacle, e.g. the wall. Both the areas of the first segment and the second segment of the dotted circle are smaller than those of the solid circle when the self-guiding machine approaches the obstacle. Further, referring to the reference points 1111 and 1112 of the solid circle and the dotted circle respectively, the dotted circle moves to the left relative to the solid circle as the self-guiding machine approaches the obstacle.
  • Therefore, the area and the position of the circular indicator light can act as indicators of the spatial relationship between the self-guiding machine and the obstacle.
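For the circular indicator light, the area and position features can be computed from a binary mask of the sensed circle. A minimal sketch, assuming such a mask (e.g. the thresholded frame) is available:

```python
import numpy as np

def circle_features(mask):
    """Area (lit-pixel count) and centroid (x, y) of a circular indicator
    light in a boolean mask.  A shrinking area or a left-shifting
    centroid suggests the machine is approaching the obstacle."""
    ys, xs = np.nonzero(mask)
    area = xs.size
    centroid = (xs.mean(), ys.mean()) if area else (None, None)
    return area, centroid
```

Comparing the centroid x-coordinate between two frames reproduces the left-shift cue of FIG. 11, and comparing the areas reproduces the shrinking-circle cue.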
  • When the system obtains the spatial relationship, the central processor of the system accordingly computes a distance between the self-guiding machine and the obstacle, and determines if the self-guiding machine will collide with the obstacle by comparing the distance with a collision threshold stored in a memory of the system. Similarly, the spatial relationship also allows the system to determine if the self-guiding machine will fall due to the obstacle by comparison with a falling threshold stored in the memory.
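The threshold comparison can be sketched as follows; the threshold values and the return convention are assumptions for illustration, not values from the disclosure.

```python
def check_alarms(distance_cm, collision_thresh=8.0, falling_thresh=12.0,
                 obstacle="wall"):
    """Compare the computed distance against stored thresholds and
    return the alarm to raise, or None when no alarm is needed.
    Default thresholds are illustrative placeholders."""
    if obstacle == "wall" and distance_cm <= collision_thresh:
        return "collision"
    if obstacle == "cliff" and distance_cm <= falling_thresh:
        return "falling"
    return None
```

The falling threshold is set larger than the collision threshold on the assumption that a cliff calls for an earlier reaction than a wall, since falling is typically unrecoverable.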
  • The system exemplarily described in FIG. 7 performs the method for obstacle detection adapted to a self-guiding machine. Reference is made to FIG. 12 that shows a flow chart describing the method for obstacle detection in one embodiment.
  • In step S121, the light emitter of the system emits an indicator light, and in step S123 the light sensor of the system senses the indicator light and generates an image. The light sensor is such as a camera that captures the image containing the indicator light within a viewing angle. The indicator light can reflect a spatial relationship between the self-guiding machine and the obstacle when it is projected onto the path, including the ground and/or the obstacle, that the self-guiding machine travels over. The image processor of the system then analyzes the image to acquire at least one feature of the indicator light being sensed, in step S125.
  • According to the above embodiments, the feature can be a length, a position, a slope and/or an area of the indicator light being sensed. Any of the features is provided for the central processor of the system to compute the length, the position and/or the slope of the linear indicator light, or the area of the circular indicator light. In step S129, the at least one feature allows the system to determine a distance between the self-guiding machine in which the system is installed and an obstacle.
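The flow of FIG. 12 (steps S121 through S129) can be summarized as a small pipeline in which the hardware-dependent steps are injected as callables; all three callables here are placeholders for illustration, not APIs from the disclosure.

```python
def detect_obstacle(capture_frame, extract_feature, feature_to_distance):
    """One pass of the obstacle-detection flow: emitting and sensing the
    indicator light (S121/S123) is hidden inside capture_frame, image
    analysis (S125) inside extract_feature, and the distance
    determination (S129) inside feature_to_distance."""
    frame = capture_frame()                # S123: image with indicator light
    feature = extract_feature(frame)       # S125: e.g. length, slope, area
    return feature_to_distance(feature)    # S129: distance to the obstacle
```

Structuring the pass this way lets the same flow serve the linear and circular indicator lights by swapping in a different feature extractor and calibration function.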
  • The system can also find a moving trend of the self-guiding machine according to the change of the feature extracted from the indicator light being sensed for a period of time by the light sensor of the system. Reference is made to FIG. 13.
  • In step S151, the light emitter continuously emits an indicator light projected onto the ground and/or the obstacle in front of the self-guiding machine. In step S153, the light sensor is driven to capture at least two different images containing the indicator light within a time period. By analyzing the at least two images, in step S155, the system obtains at least one feature from each image. The feature can be the length, the position and/or the slope obtained from a linear indicator light sensed by the light sensor, or the area obtained from a circular indicator light. In step S157, any change in the length, slope, position and/or area of the indicator light between the images within the time period can be obtained to determine the change of the spatial relationship between the self-guiding machine and the obstacle.
  • The change of the indicator light projected onto the ground and/or the obstacle can be used to determine the moving trend of the self-guiding machine. The system accordingly can determine whether the self-guiding machine is getting close to any obstacle that it should avoid. Therefore, the system can issue an alarm to the self-guiding machine in advance.
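The trend determination of steps S155 to S157 can be sketched as follows, under the assumption that the chosen feature (for example, the ground-segment length of the indicator light) shrinks as the machine nears the obstacle; the function name and tolerance are illustrative only:

```python
# Minimal sketch of trend detection: compare a feature extracted from two
# images captured a short time apart; the sign of the change indicates
# whether the machine is approaching the obstacle.

def approaching(feature_earlier, feature_later, tolerance=1e-6):
    """For a feature that shrinks as the machine nears the obstacle,
    a negative change over the time period means the machine is
    getting closer. The tolerance suppresses sensor noise."""
    change = feature_later - feature_earlier
    return change < -tolerance
```

For instance, if the sensed ground segment shortens from one frame to the next, the machine is closing on the obstacle and an alarm can be raised before the collision threshold is reached.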
  • To sum up the above embodiments, the method and the system for obstacle detection can be adapted to a self-guiding machine such as an autonomous vehicle or an autonomous cleaning robot. The system acquires a spatial relationship between the self-guiding machine and the obstacle according to at least one feature extracted from an indicator light projected onto the path the self-guiding machine travels over. The spatial relationship allows the self-guiding machine to compute a distance between itself and the obstacle so as to determine whether it will collide with a wall or fall from a cliff. The invention provides the self-guiding machine with a solution to determine a distance from an obstacle and, optionally, to warn the self-guiding machine when it approaches the obstacle. Further, the system can accordingly instruct the driving system of the self-guiding machine to avoid the obstacle.
  • The descriptions illustrated supra set forth simply the preferred embodiments of the instant disclosure; however, the characteristics of the instant disclosure are by no means restricted thereto. All changes, alterations, or modifications conveniently considered by those skilled in the art are deemed to be encompassed within the scope of the instant disclosure delineated by the following claims.

Claims (20)

What is claimed is:
1. A method for obstacle detection adapted to a self-guiding machine including a light emitter and a light sensor that are set apart at a distance, comprising:
the light emitter emitting an indicator light being projected onto a path the self-guiding machine travels toward;
the light sensor sensing the indicator light projected onto the path so as to generate an image containing the indicator light;
in the self-guiding machine analyzing the image so as to obtain at least one feature of the indicator light being sensed; and
obtaining a spatial relationship between the self-guiding machine and an obstacle in response to the at least one feature of the indicator light being sensed when the self-guiding machine approaches the obstacle on its path.
2. The method as recited in claim 1, wherein the spatial relationship allows the self-guiding machine to compute a distance between the self-guiding machine and the obstacle, and determine if the self-guiding machine will collide with the obstacle when compared with a collision threshold, or determine if the self-guiding machine will fall due to the obstacle when compared with a falling threshold.
3. The method as recited in claim 2, wherein the indicator light forms a linear light that is projected onto a ground that the self-guiding machine travels.
4. The method as recited in claim 3 wherein the linear light is projected onto both the ground and the obstacle so as to form a first segment and a second segment of the indicator light being sensed if the obstacle is a wall, and the at least one feature of the indicator light being sensed is a length of the first segment or the second segment, a position of the second segment and/or a slope of the first segment.
5. The method as recited in claim 3, wherein the linear light is projected onto the ground and is cut by an edge of the ground if the obstacle is a cliff, and the at least one feature of the indicator light being sensed is a length and/or a slope of the indicator light being projected onto the ground.
6. The method as recited in claim 3, wherein the linear light is projected onto both the ground and the obstacle so as to form a first segment and a second segment of the indicator light being sensed if the obstacle is a floating obstacle with a height from the ground, and the at least one feature of the indicator light being sensed is a position of the second segment.
7. The method as recited in claim 2, wherein the indicator light forms a circular light that is projected onto a ground that the self-guiding machine travels.
8. The method as recited in claim 7, wherein the circular light is projected onto both the ground and the obstacle so as to form a first region and a second region of the indicator light being sensed if the obstacle is a wall, and the at least one feature of the indicator light being sensed is an area of the first region or the second region, and/or a position of a reference point of the circular light.
9. The method as recited in claim 7, wherein the circular light is projected onto the ground and is cut by an edge of the ground if the obstacle is a cliff, and the at least one feature of the indicator light being sensed is an area of the circular light being projected onto the ground, and/or a position of a reference point of the circular light.
10. A system for obstacle detection installed in a self-guiding machine, comprising:
a controller;
a light emitter, coupled to the controller, used to emit an indicator light being projected onto a path the self-guiding machine travels toward;
a light sensor, coupled to the controller, used to sense the indicator light projected onto the path, wherein the light emitter and the light sensor are set apart at a distance;
an image processor, coupled to the light sensor, used to generate an image containing the indicator light;
a central processor, coupled to the image processor and the controller, used to perform a method for obstacle detection comprising:
analyzing the image containing the indicator light sensed by the light sensor;
obtaining at least one feature of the indicator light being sensed; and
obtaining a spatial relationship between the self-guiding machine and an obstacle in response to the at least one feature of the indicator light being sensed when the self-guiding machine approaches the obstacle on its path.
11. The system as recited in claim 10, wherein the system obtains the spatial relationship that allows the central processor to compute a distance between the self-guiding machine and the obstacle, and determine if the self-guiding machine will collide with the obstacle when compared with a collision threshold stored in a memory of the system, or determine if the self-guiding machine will fall due to the obstacle when compared with a falling threshold stored in the memory.
12. The system as recited in claim 11, wherein the method performed by the central processor further comprises instructing the controller to drive the self-guiding machine to avoid the obstacle when the self-guiding machine reaches the collision threshold or the falling threshold.
13. The system as recited in claim 11, wherein the light emitter emits a linear light or a circular light that acts as the indicator light being projected onto a ground that the self-guiding machine travels.
14. The system as recited in claim 13, wherein the linear light is projected onto both the ground and the obstacle so as to form a first segment and a second segment of the indicator light being sensed if the obstacle is a wall, and the at least one feature of the indicator light being sensed is a length of the first segment or the second segment, a position of the second segment and/or a slope of the first segment.
15. The system as recited in claim 13, wherein the linear light is projected onto the ground and is cut by an edge of the ground if the obstacle is a cliff, and the at least one feature of the indicator light being sensed is a length and/or a slope of the indicator light being projected onto the ground.
16. The system as recited in claim 13, wherein the linear light is projected onto both the ground and the obstacle so as to form a first segment and a second segment of the indicator light being sensed if the obstacle is a floating obstacle with a height from the ground, and the at least one feature of the indicator light being sensed is a position of the second segment.
17. The system as recited in claim 10, wherein the light emitter utilizes a laser or an LED as a light source that emits a linear light as the indicator light.
18. The system as recited in claim 17, wherein the light sensor senses the linear light and the at least one feature is a length, a position or a slope of the linear light.
19. A sensor subsystem for a self-guiding machine navigating an area, comprising:
a light emitter used to emit an indicator light being projected onto the path the self-guiding machine travels toward;
a light sensor used to sense the indicator light projected onto the path, wherein the light emitter and the light sensor are set apart at a distance; and
an image processor, coupled to the light sensor, used to render an image containing the indicator light;
wherein the self-guiding machine uses at least one feature of the indicator light being sensed to obtain a spatial relationship between the self-guiding machine and an obstacle when the self-guiding machine approaches the obstacle on its path.
20. The sensor subsystem as recited in claim 19, wherein the subsystem is installed in an autonomous robot.
US15/869,291 2018-01-12 2018-01-12 Method, system for obstacle detection and a sensor subsystem Active 2038-10-06 US11009882B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US15/869,291 US11009882B2 (en) 2018-01-12 2018-01-12 Method, system for obstacle detection and a sensor subsystem
CN201810503115.0A CN110031002B (en) 2018-01-12 2018-05-23 Method and system for detecting obstacle and sensor subsystem thereof
CN202210049733.9A CN114370881A (en) 2018-01-12 2018-05-23 Method and system for detecting obstacle and sensor subsystem thereof
US17/227,732 US11669103B2 (en) 2018-01-12 2021-04-12 System for obstacle detection
US18/140,084 US20240085921A1 (en) 2018-01-12 2023-04-27 System for obstacle detection


Publications (2)

Publication Number Publication Date
US20190220025A1 true US20190220025A1 (en) 2019-07-18
US11009882B2 US11009882B2 (en) 2021-05-18

Family

ID=67213872



Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180210448A1 (en) * 2017-01-25 2018-07-26 Lg Electronics Inc. Method of identifying functional region in 3-dimensional space, and robot implementing the method
US20210232143A1 (en) * 2018-01-12 2021-07-29 Pixart Imaging Inc. System for obstacle detection
US20220111786A1 (en) * 2019-06-24 2022-04-14 Event Capture Systems, Inc. Methods, devices, and systems for headlight illumination for semi-autonomous vehicles
US20220160202A1 (en) * 2020-11-24 2022-05-26 Pixart Imaging Inc. Method for eliminating misjudgment of reflective light and optical sensing system
US20220163666A1 (en) * 2020-11-24 2022-05-26 Pixart Imaging Inc. Method for eliminating misjudgment of reflective lights and optical sensing system
US11733360B2 (en) * 2019-06-04 2023-08-22 Texas Instruments Incorporated Optical time of flight sensor for navigation systems in robotic applications

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11698457B2 (en) * 2019-09-04 2023-07-11 Pixart Imaging Inc. Object detecting system and object detecting method
CN113252035A (en) * 2020-02-11 2021-08-13 胜薪科技股份有限公司 Optical navigation device
US20210247516A1 (en) * 2020-02-11 2021-08-12 Visual Sensing Technology Co., Ltd. Optical navigation apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9002511B1 (en) * 2005-10-21 2015-04-07 Irobot Corporation Methods and systems for obstacle detection using structured light
US9508235B2 (en) * 2013-08-06 2016-11-29 Robert Bosch Gmbh Projection unit for a self-directing mobile platform, transport robot and method for operating a self-directing mobile platform
US10261513B2 (en) * 2016-12-19 2019-04-16 drive.ai Inc. Methods for communicating state, intent, and context of an autonomous vehicle




Also Published As

Publication number Publication date
CN114370881A (en) 2022-04-19
US11669103B2 (en) 2023-06-06
CN110031002B (en) 2022-02-01
US20240085921A1 (en) 2024-03-14
CN110031002A (en) 2019-07-19
US11009882B2 (en) 2021-05-18
US20210232143A1 (en) 2021-07-29


Legal Events

AS (Assignment): Owner name: PIXART IMAGING INC., TAIWAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: CHEN, KAI-SHUN; WANG, WEI-CHUNG; REEL/FRAME:044606/0234; Effective date: 20180110
FEPP (Fee payment procedure): ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
STPP (Information on status: patent application and granting procedure in general): NON FINAL ACTION MAILED
STPP: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP: FINAL REJECTION MAILED
STPP: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
STPP: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED
STPP: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED
STCF (Information on status: patent grant): PATENTED CASE