US11614743B2 - System and method for navigating a sensor-equipped mobile platform through an environment to a destination - Google Patents
- Publication number: US11614743B2 (application US15/905,299)
- Authority
- US
- United States
- Prior art keywords
- imaging target
- imaging
- machine
- navigation
- readable code
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/254—Analysis of motion involving subtraction of images
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0234—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10712—Fixed beam scanning
- G06K7/10722—Photodetector array or CCD scanning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/14—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
- G06K7/1404—Methods for optical code recognition
- G06K7/1408—Methods for optical code recognition the method being specifically adapted for the type of code
- G06K7/1417—2D bar codes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G05D2201/0211—
-
- G05D2201/0216—
-
- G05D2201/0218—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20224—Image subtraction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- the present application relates to the field of systems and methods for navigating a sensor-equipped mobile platform through an environment to a destination.
- navigation data is held internally in a memory of a robot, which is then separately registered with the environment by using some other localization process or system.
- Such robots need a high level of programming to navigate an environment.
- a method for navigating a sensor-equipped mobile platform through an environment to a destination includes: capturing a first image in a first state of illumination; capturing a second image in a second state of illumination; generating a difference image from said first image and said second image; locating an imaging target based on said difference image, said imaging target including a machine-readable code embedded therein, said machine-readable code including navigation vector data; extracting said navigation vector data from said machine-readable code; and using said extracted navigation vector data to direct the navigation of the mobile platform through the environment to a destination.
- a system for navigating a sensor-equipped mobile platform through an environment to a destination includes: a plurality of imaging targets at a plurality of locations, each imaging target including a machine-readable code, each said machine-readable code including navigation vector data; and a mobile platform including an imaging device and a computing device.
- the computing device is configured to: capture a first image in a first state of illumination using said imaging device; capture a second image in a second state of illumination using said imaging device; generate a difference image from said first image and said second image; locate an imaging target of said plurality of imaging targets based on said difference image; extract said navigation vector data from said machine-readable code of said located imaging target; and use said extracted navigation vector data to direct the navigation of said mobile platform.
- FIG. 1 A is a schematic representing a system using machine-readable targets for navigating a sensor-equipped mobile platform through an environment to a destination according to an embodiment of the present description;
- FIG. 1 B is a schematic representing navigation vector data of a payload of machine-readable targets of an imaging tag of FIG. 1 A ;
- FIG. 1 C is a schematic representing a system using machine-readable targets for navigating a sensor-equipped mobile platform through an environment to a destination according to another embodiment of the present description;
- FIG. 1 D is a schematic representing navigation vector data of a payload of machine-readable targets of an imaging tag of FIG. 1 C ;
- FIG. 2 is a schematic representing a sensor-equipped mobile platform of the system of FIG. 1 ;
- FIG. 3 A is an illustration representing an image taken by an imaging device in a non-illuminated state;
- FIG. 3 B is an illustration representing an image taken by an imaging device in an illuminated state;
- FIG. 4 A is an illustration representing an imaging tag taken by an imaging device in a non-illuminated state;
- FIG. 4 B is an illustration representing an imaging tag taken by an imaging device in an illuminated state;
- FIG. 5 is an illustration representing a difference image generated from the images of FIGS. 3 A and 3 B ;
- FIGS. 6 A to 6 C are schematics representing a process for locating an imaging tag;
- FIGS. 7 A and 7 B illustrate a round trip path configuration of imaging tags according to an exemplary embodiment of the present description;
- FIGS. 8 A and 8 B illustrate a branching trip path configuration of imaging tags according to an exemplary embodiment of the present description;
- FIG. 9 illustrates a continuous cycle path configuration of imaging tags according to an exemplary embodiment of the present description;
- FIG. 10 illustrates a branching cycle path configuration of imaging tags according to an exemplary embodiment of the present description;
- FIGS. 11 A and 11 B are flow charts representing an exemplary method of navigating a sensor-equipped mobile platform to a destination according to an exemplary embodiment of the present description;
- FIG. 12 is a flow diagram of an aircraft manufacturing and service methodology; and
- FIG. 13 is a block diagram of an aircraft.
- Disclosed herein is a method and system for navigating a sensor-equipped mobile platform through an environment to a destination.
- Various devices, steps, and computer program products may be employed in conjunction with the practice of various aspects of the present disclosure.
- the term “computing device” should be construed broadly to encompass a system having at least one computer or processor, and which may have multiple computers or processors that communicate through a network or bus.
- the terms “computer” and “processor” both refer to devices comprising a processing unit (e.g., a central processing unit) and some form of memory (i.e., computer-readable medium) for storing a program which is readable by the processing unit.
- a system 2 for navigating a sensor-equipped mobile platform through an environment to a destination includes a plurality of imaging targets 4 at a plurality of locations and a sensor-equipped mobile platform 6 .
- each imaging target 4 includes a machine-readable code 8 and a passive marker 10 .
- the imaging targets are each positioned on a surface of an object or otherwise positioned within the environment of the system.
- the machine-readable code 8 is an optically readable code, such as a Quick Response (QR) code.
- QR codes are just one example of an optically-readable code.
- while QR code patterns will be used for the description of implementations herein, other optically-readable codes may be employed, such as UPC standard bar codes, Data Matrix (ECC 200) 2D matrix bar codes, and MaxiCode 2D matrix bar codes (used by UPS, public domain).
- the machine-readable code is dynamic, e.g. such as by being formed using e-ink, and therefore the machine-readable code can be updated from a remote location.
- the illustrated imaging target 4 is a QR code.
- the passive markers 10 of the imaging targets 4 contain, for example, retro-reflective materials that are capable of reflecting light back to the source when displayed under a controlled light source.
- the reflective portion of the passive markers comprises: retro-reflective tape, reflective fabric tape, or reflective tape including microspheres.
- the reflective portion includes other types of passive markers that may show up differently under different lighting conditions.
- passive markers that fluoresce under a blacklight such as ultraviolet or infrared paint
- the illustrated passive marker 10 is a retroreflective passive marker circumscribing a QR code.
- FIG. 3 A shows the imaging target in a non-illuminated state
- FIG. 3 B shows the imaging target in an illuminated state, causing a distinct difference in the appearance of the retroreflective passive marker between the non-illuminated state and the illuminated state.
- the process of positioning the machine-readable codes and passive markers of the imaging targets is achieved through various implementations.
- the machine-readable codes and passive markers are manufactured and embedded into the surfaces of objects.
- the machine-readable codes and passive markers are affixed onto a surface through the application of stickers. It may be noted, however, that various other implementations may also be used to affix machine-readable codes and passive markers to the surface of the objects.
- passive markers and machine-readable code are manufactured together as a single unit (e.g. sticker) or as separate units, which are then applied to a surface of an object.
- the mobile platform 6 includes any robot, vehicle or other mobile device or system that utilizes navigation. As shown in FIG. 2 , the mobile platform 6 includes an imaging device 12 and a computing device 14 .
- the imaging device 12 includes a camera, such as a video camera.
- the imaging device has automated zoom capabilities.
- the imaging device is supported on a pan-tilt mechanism, and both the imaging device and the pan-tilt mechanism are operated by the computing device 14 .
- the pan-tilt mechanism is controlled to positionally adjust the imaging device to selected angles around a vertical, azimuth (pan) axis and a horizontal, elevation (tilt) axis.
- the computing device is integrated with the imaging device, and the pan-tilt mechanism, and therefore the orientation of the imaging device, is controlled using the computing device.
- the mobile platform 6 further includes an illumination device 16 , such as a ring light.
- the illumination device includes an electrically-powered (e.g., battery powered) light source or any other light source with similar functionality.
- the illumination device is a ring light surrounding a lens of the imaging device.
- the illumination device includes two illumination states, e.g. an on state and an off state. In other words, the illumination device is either activated or de-activated. In the on-state, the illumination device provides illumination; and, in the off-state, the illumination device provides no illumination.
- the mobile platform further includes a laser range meter that transmits a laser beam 18 as shown in FIG. 2 .
- the laser range meter is configured to measure a distance to an object.
- the laser range meter has a laser and a unit configured to compute distances based on the laser light detected in response to laser light reflected on the object.
- the laser range meter is incorporated with the imaging device.
- the laser range meter is separate from the imaging device.
- the system further includes three-dimensional localization software loaded into the computing device for determining a position of the mobile platform relative to the imaging targets. In one embodiment, multiple imaging targets (e.g. three or more imaging targets) are utilized to determine the relative position and orientation of the mobile platform.
- the three-dimensional localization software uses the imaging targets and the laser range meter measurements to determine the location (position and orientation) of the mobile platform relative to the imaging targets.
- an estimate of the relative position and orientation of the mobile platform can be acquired from a single target by using the internal registration marks in the target, such as those in a QR code.
- the computing device directs the imaging device to capture a non-illuminated first image.
- the imaging device captures the non-illuminated first image (shown in FIG. 3 A ), which includes an image of an imaging target 4 (close-up shown in FIG. 4 A ) while having the illumination device in the off-state.
- the non-illuminated first image is then sent and stored in a database or memory of the computing device for post-processing.
- the computing device directs the imaging device to capture an illuminated second image, by utilizing the illumination device that is activated to the illumination on-state.
- the imaging device captures the illuminated second image (shown in FIG. 3 B ), which includes an image of the imaging target 4 (close-up shown in FIG. 4 B ) while having the illumination device in the on-state.
- the illuminated second image is then sent and stored in a database or memory of the computing device for post-processing.
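The two-state capture sequence described above can be sketched as follows; `MockLight` and `MockCamera` are hypothetical stand-ins for the illumination device and imaging device, since the description does not specify a device API:

```python
class MockLight:
    """Stand-in for the illumination device (e.g. a ring light)."""
    def __init__(self):
        self.lit = False
    def on(self):
        self.lit = True
    def off(self):
        self.lit = False

class MockCamera:
    """Stand-in for the imaging device; what it sees depends on the light."""
    def __init__(self, light):
        self.light = light
    def capture(self):
        return "illuminated" if self.light.lit else "non-illuminated"

def capture_image_pair(camera, light):
    # First image: illumination device in the off-state
    light.off()
    first = camera.capture()
    # Second image: illumination device in the on-state
    light.on()
    second = camera.capture()
    light.off()
    return first, second
```

Both images would then be stored for the post-processing (differencing) step.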
- FIG. 5 depicts an illustration of an exemplary difference image generated from the first and second images illustrated in FIGS. 3 A and 3 B .
- the generated difference image may be used to determine the location of the imaging target.
- the generation of a difference image may be achieved through the procedures described as follows.
- a distortion function correction is applied to each captured image.
- a difference image is computed that represents the differences between the illuminated image and the non-illuminated image.
- the difference image is segmented into separate areas, which may include filtering using size, color, shape, or other parameters.
- Image segmentation means defining a group of pixels with a specific characteristic. In accordance with one implementation, pixels of a specific color and intensity that are next to each other (i.e. contiguous regions) are found.
- the difference image may have some small artifacts (such as subtle edge outlines) that will be filtered out. This filtering is done using, for example, a blur filter and an intensity threshold filter.
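A minimal sketch of the differencing step, assuming 8-bit grayscale images represented as nested lists; the threshold value is an illustrative assumption, and a production system would use an image-processing library:

```python
def difference_image(illuminated, non_illuminated, threshold=50):
    # Pixel-wise subtraction: the static background cancels out, while the
    # retro-reflective marker changes sharply between the two lighting states.
    # The intensity threshold suppresses small artifacts such as subtle
    # edge outlines, as described above.
    h, w = len(illuminated), len(illuminated[0])
    return [[(illuminated[y][x] - non_illuminated[y][x])
             if (illuminated[y][x] - non_illuminated[y][x]) > threshold else 0
             for x in range(w)]
            for y in range(h)]
```

Only pixels whose brightness rises markedly under illumination (the retro-reflective border) survive the threshold, leaving bright contiguous regions to segment.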
- the computing device calculates the centroid for each segmented region. The centroid is the average X pixel coordinate and average Y pixel coordinate for that region. These X-Y coordinate pairs are used to compute the differences from the X-Y coordinate pair for the center of the image.
- pan and tilt angles for the centroid position of each segmented region in the image may be computed. These are the pan and tilt angles that will be used to direct the pan-tilt mechanism to orient the laser range meter to the center of the imaging target in order to acquire the distance to the target.
- the method for aiming at the imaging target uses the pixel offsets for each of the centroids from the center of the image, the current field-of-view angle of the imaging device, and the distance to the target at the center of the image to compute offset pan and offset tilt angles.
- Automated local positioning measurements of the locations corresponding to the centroids of the segmented regions are performed using the offset pan and offset tilt angles. From the measured distance and the pan and tilt angles, the relative Cartesian coordinates (X,Y,Z) position from the target to the pan-tilt mechanism can be computed.
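The centroid, aiming, and position computations described above can be sketched as follows; the linear pixel-to-angle scaling is a small-angle approximation, and the function names are illustrative assumptions rather than the patented method:

```python
import math

def centroid(region):
    # Average X and Y pixel coordinates of a segmented region
    return (sum(x for x, _ in region) / len(region),
            sum(y for _, y in region) / len(region))

def offset_angles(cx, cy, image_w, image_h, fov_h_deg, fov_v_deg):
    # Pixel offset of the centroid from the image center, scaled by the
    # current field-of-view, approximates the offset pan/tilt angles
    # needed to aim the laser range meter at the target center.
    pan = (cx - image_w / 2) / image_w * fov_h_deg
    tilt = (cy - image_h / 2) / image_h * fov_v_deg
    return pan, tilt

def relative_position(distance, pan_deg, tilt_deg):
    # Spherical-to-Cartesian conversion: relative (X, Y, Z) of the target
    # with respect to the pan-tilt mechanism, from the measured distance
    # and the pan and tilt angles.
    p, t = math.radians(pan_deg), math.radians(tilt_deg)
    return (distance * math.cos(t) * math.cos(p),
            distance * math.cos(t) * math.sin(p),
            distance * math.sin(t))
```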
- FIGS. 6 A- 6 C illustrate a search and scanning process for locating an imaging target 4 .
- the imaging device 12 is initially aimed with a wide field-of-view angle.
- the imaging target 4 may be located anywhere in the environment, and in general its location may not be known to the mobile platform 6 before the process begins, which means that in order to read the imaging target 4 , the mobile platform 6 may need to locate the imaging target 4 first.
- the process for locating the imaging target 4 involves acquiring a mosaic of slightly overlapping wide-angle images of the environment in which the mobile platform 6 is set up. This image acquisition involves changing the pan and tilt angles of the pan-tilt mechanism and setting an appropriate field-of-view (zoom value) to take pairs of images. The process of changing the orientation with the pan-tilt mechanism and taking pairs of images continues until an imaging target 4 has been located.
- the process starts by setting a wide field-of-view angle ( θ1 ) and capturing a first image with the imaging device 12 while having the illumination device in an off state, then capturing a second image with the same field of view with the imaging device 12 while having the illumination device in an on state.
- a difference image may then be computed to determine if an imaging target 4 is within the current field-of-view of the imaging device. If an imaging target is not found, an aim direction of the imaging device is changed using the pan-tilt mechanism to rotate the imaging device to view another region in the environment, where the new field-of-view region partially overlaps the previous field-of-view region (shown in FIG. 6 B ).
- the computing device instructs the pan-tilt mechanism to aim the imaging device at the center region of the imaging target, zoom in on the imaging target (based on the extents determined in the image processing step), and then acquire a close-up (zoomed-in) image of the imaging target. From this zoomed-in image, the payload within the imaging target is read or decoded (which may be in the form of a QR code, DataMatrix code, barcode, or some other machine-readable form).
- pan-tilt angles and a distance reading of the center point of the imaging target are acquired by the measurement instrument. If the payload cannot be decoded into a usable format, the imaging target is rejected, and the process continues the search from where it left off until a valid imaging target has been located. In alternate embodiments, the re-orientation of the imaging device aim direction, zoom-in, and attempt at decoding the payload may be instructed to take place immediately after an imaging target has been located.
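The mosaic search over overlapping wide-angle views might look like the following sketch, where `detect` stands in for the capture-both-images-and-difference step and all angle values are illustrative:

```python
def scan_for_target(pan_limits, tilt_limits, fov_deg, overlap_deg, detect):
    """Sweep the pan-tilt mechanism over slightly overlapping wide-angle
    views until detect(pan, tilt) reports that the difference image at
    that aim direction contains an imaging target. Returns the aim
    direction, or None if the environment was scanned without a find."""
    stride = fov_deg - overlap_deg  # step less than the FOV so views overlap
    pan = pan_limits[0]
    while pan <= pan_limits[1]:
        tilt = tilt_limits[0]
        while tilt <= tilt_limits[1]:
            if detect(pan, tilt):
                return pan, tilt
            tilt += stride
        pan += stride
    return None
```

Once an aim direction is returned, the system would zoom in on the target, attempt to decode the payload, and resume the sweep from this point if decoding fails.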
- navigation vector data 20 is stored in a data payload region of the machine-readable code that is obtained or decoded from the imaging targets 4 .
- the navigation vector data includes information corresponding to a direction through the environment to a destination that will be used to direct navigation of the mobile platform.
- the navigation vector data includes information in terms of a local coordinate system for indicating a navigation direction, such as specific coordinates defined in a coordinate system with Cartesian coordinates x, y, and z.
- the imaging device decodes the machine-readable code to extract the navigation vector data therefrom, which is then sent and stored in a database or memory of the computing device for use in directing the navigation of the mobile platform.
- the machine-readable code includes information corresponding to the current location of the imaging target.
- the navigation vector data extracted from the machine-readable code may further include information corresponding to a distance, velocity, and/or travel time that is also used to direct navigation of the mobile platform.
- the payload data further includes a unique identifier string encoded into the data payload region of the machine-readable code. Additionally, other signal data can also be embedded in the machine-readable code to initiate location specific tasks.
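A sketch of payload decoding under the assumption of a JSON encoding; the field names are hypothetical, since the description specifies the kinds of data the payload carries (unique identifier, current location, navigation vector, optional task data) but not a wire format:

```python
import json

def decode_payload(payload_text):
    """Parse the data payload read from a machine-readable code.
    The JSON encoding and the field names (uid, location, vector,
    speed, task) are illustrative assumptions."""
    data = json.loads(payload_text)
    return {
        "uid": data["uid"],                   # unique identifier string
        "location": tuple(data["location"]),  # current tag location (x, y, z)
        "vector": tuple(data["vector"]),      # navigation direction onward
        "speed": data.get("speed"),           # optional velocity information
        "task": data.get("task"),             # optional location-specific task
    }
```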
- FIGS. 1 C and 1 D show a variation in which a mobile platform is navigated using point-to-point navigation, whereas FIGS. 1 A and 1 B show a direct path to the goal. In the case of FIGS. 1 C and 1 D , there may be an obstruction preventing a direct path to the goal, and a navigation vector from the imaging target may direct the mobile platform to another imaging target rather than to the goal.
- the addition of embedded directional information to imaging targets that a mobile device identifies enables directing the mobile device in accordance with the directional information in the imaging targets.
- field vector information is embedded in and acquired from an imaging target, rather than retrieved from a remote database based on look-up information displayed on the imaging target, and the field vector information is used to direct the mobile platform through the environment to the identified direction/destination.
- the mobile device can obtain navigation information on site solely relying on the acquired image of the imaging target, without requiring wireless communications with a system or remote database located outside the environment.
- the mobile platform with an imaging device captures images in both a non-illuminated state and illuminated state, to identify from a difference image an imaging target(s), and to extract from an optically readable marker in the imaging target embedded field vector information, where the field vector information is acquired from the imaging target and not retrieved from a remote, external database.
- the mobile platform uses the field vector directional information to direct the mobile platform in the identified direction.
- the system provides a series of machine-readable imaging tags positioned (e.g. placed/painted/etched/applied) on a surface of a target object or within the environment to guide mobile platforms (e.g. robotic vehicles) on a pre-defined path.
- Each imaging tag contains the desired navigation direction along with, for example, current location, distance, velocity, and travel time, which are used to guide the mobile platform along a path with minimal path-related, low-level programming.
- the environment tells the robot where it is and where to go next, with only high-level commands from humans required.
- the machine-readable code can be as simple as a direction and a length of a line segment or can include other types of encoding such as QR codes or barcode elements. This type of discrete marking can also be integrated with more common continuous edge markings to make a unified system.
- the first step is to locate the imaging targets in the environment, which may be cluttered, making it difficult for an automated system to find the imaging target in the background.
- the present description solves this problem by using a passive marker, such as a retro-reflective border around the machine-readable code, and a lighting process that enables a simple image processing differencing step to find an imaging target in a cluttered environment.
- Existing digital pheromone concepts require an independent localization process (with external location sensing hardware, such as motion capture) in order to make the connection to the data. That is problematic, since the location element is sometimes challenging to acquire.
- the present description embeds both location and navigation data into the same reference item without the need for external navigation hardware.
- the process of localization relates to determining the position and orientation (location) of a mobile platform relative to a reference location, and the process of navigation relates to going to a specific place or traveling along a specific direction vector.
- Other existing systems need some level of programming to perform a navigation task; at the very least, a program needs to be loaded into the robot. The robot then needs some type of tracking system that provides coordinate information with respect to the environment in which it is operating.
- the environment is set up with some type of low-cost passive imaging tags (QR codes, etc.) that contain both their current location and vector information (i.e. local navigation directions) pointing to the next imaging tag, and if applicable, the previous imaging tag, then robots could be programmed with much simpler instructions. So instead of turn-by-turn commands, robots could be instructed with just high-level commands (such as “go to room 123”).
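The "environment tells the robot where to go" idea can be illustrated with a toy tag-chain follower; the tag identifiers and the `next` field are hypothetical, and `tags` is a stand-in for data the robot reads from the environment rather than from internal memory:

```python
def follow_tags(tags, start_uid, goal_uid):
    """Follow the chain of imaging tags from a starting tag to a goal,
    reading the 'next' pointer embedded in each tag's payload — a toy
    model of giving the robot only a high-level command such as
    'go to room 123' and letting the tags supply the turn-by-turn path."""
    path = [start_uid]
    current = start_uid
    while current != goal_uid:
        current = tags[current]["next"]  # the environment says where to go next
        path.append(current)
    return path
```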
- Navigation vector data includes, but is not limited to, two main types of vector field formats: (1) the direction vector form; and (2) the velocity vector form.
- the direction vector form encodes distance in a magnitude of a vector
- the velocity vector form encodes speed in a magnitude of a vector. It is also possible to include both forms in the same imaging tag with additional fields, such as travel time. Encoding the format type in the imaging tag is also a useful addition. This approach enables robots (and other mobile platforms or applications) to find and read location specific data that can be used as part of a general-purpose navigation process. It provides a way to eliminate detailed path programming and re-programming, and it is much more flexible than buried wire navigation systems.
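The two vector field formats can be illustrated as follows; the function and parameter names are illustrative assumptions:

```python
import math

def interpret_vector(vec, form, speed=None, distance=None):
    """Interpret navigation vector data in either vector field format:
    in the direction form the magnitude encodes distance, and in the
    velocity form it encodes speed. Travel time falls out when the
    missing quantity is supplied (e.g. from an additional payload field)."""
    magnitude = math.sqrt(sum(c * c for c in vec))
    if form == "direction":
        return {"distance": magnitude,
                "travel_time": magnitude / speed if speed else None}
    if form == "velocity":
        return {"speed": magnitude,
                "travel_time": distance / magnitude if distance else None}
    raise ValueError("unknown vector form: " + form)
```

For example, a direction-form vector (3, 4, 0) encodes a 5-unit move; paired with a speed of 2 units/s it implies a 2.5 s travel time.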
- An exemplary process for initially finding imaging tags in cluttered environments involves the use of retro-reflective borders around the imaging tags, and acquisition of two images: one with and the other without an illumination device (such as a ring light).
- An image differencing technique is used to find a border and direct the imaging device to zoom-in on the imaging tag and take a close-up shot of the machine-readable code, and then decode the payload of the machine-readable code.
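The two-shot differencing step can be sketched with synthetic 2-D arrays standing in for camera frames. The array sizes, pixel values, and threshold below are illustrative assumptions; the point is that the retro-reflective border returns far more light in the illuminated frame, so it dominates the difference image.

```python
import numpy as np

# Synthetic frames: one without illumination, one with a ring light on.
dark = np.full((64, 64), 20, dtype=np.int16)   # ambient-only frame
lit = dark.copy()
lit[10:50, 10:50] = 30                         # scene under ring light
lit[10:50, 10:12] = 220                        # retro-reflective left side
lit[10:50, 48:50] = 220                        # retro-reflective right side
lit[10:12, 10:50] = 220                        # retro-reflective top
lit[48:50, 10:50] = 220                        # retro-reflective bottom

diff = np.abs(lit - dark)                      # difference image
mask = diff > 100                              # keep only strong returns

# The bounding box of the bright border approximates the tag region the
# imaging device should zoom in on for the close-up shot.
ys, xs = np.nonzero(mask)
top, bottom, left, right = ys.min(), ys.max(), xs.min(), xs.max()
center = ((top + bottom) // 2, (left + right) // 2)
print("tag region:", (top, left), (bottom, right), "center:", center)
```

In a real system the segmentation step would also verify that the bright region forms a rectangular border before committing to the zoom.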
- imaging tags also provide a way for the system to re-align with the path and re-calibrate distance sensors, thus providing robust and accurate robotic vehicle tracking, along with simplified user interaction.
- the system is set up to request and receive feedback and is used as part of a finite state machine.
- the process enables the system to operate as a finite state machine, which may be enhanced by other sensors and/or actuators co-located with the target to provide additional feedback.
- velocity and acceleration values are integrated into the imaging tag payload.
- multiple imaging tag codes on each marker are available for different trips along the same path.
- imaging tags are dynamic, updated from a remote location to route vehicles around problems (detours), or stop traffic while a problem is cleared. This makes the entire environment of the system easily re-configurable for robotic guidance, such as on a modular manufacturing floor.
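The dynamic-tag idea above can be sketched as a remote controller rewriting the payload a tag displays. The dictionary fields (`id`, `vector`, `stop`) and the `remote_update` helper are hypothetical conventions for illustration; the patent text only states that tags can be updated remotely to detour or stop traffic.

```python
# A dynamic tag's displayed payload, modeled as a mutable record.
tag_42 = {"id": "TAG-42", "vector": (5.0, 0.0, 0.0), "stop": False}

def remote_update(tag, **changes):
    """Apply a remote reconfiguration to the tag's displayed payload."""
    tag.update(changes)
    return tag

# Route vehicles around a blocked aisle by swapping the vector...
remote_update(tag_42, vector=(0.0, 5.0, 0.0))
# ...then halt traffic entirely while the problem is cleared.
remote_update(tag_42, stop=True)
print(tag_42)
```

Because the vehicles only ever read the tag in front of them, re-routing the whole floor reduces to updating tag payloads, with no per-vehicle reprogramming.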
- the present description enables a new type of robotic navigation particularly useful to, for example, companies involved in automated manufacturing and factory applications.
- the systems and methods of the present description could also be used for entertainment and game applications used by people with hand-held devices (smartphones).
- Another potential application is driverless rides or people-movers, such as in a large plant or wild animal park, or a part/package courier system in a factory.
- the present description enables a simpler type of robotic navigation in factories for manufacturing and warehouse applications, such as automated part or package delivery. Particular value lies in cost avoidance for robotic localization and navigation: reducing the time, complexity, and programming errors associated with conventional methods.
- the present description is an alternative solution to buried-wire automated guide vehicle (AGV) systems used in factories.
- Another feature includes enabling path configurations, e.g. one-way, continuous cycle, reverse direction (round trip), branches, multiple imaging tag codes for multiple paths.
- FIGS. 7 A and 7 B illustrate a round trip path configuration 70 of imaging tags 71 , 72 , 73 , and 74 according to an exemplary embodiment of the present description.
- each imaging tag includes a machine-readable code bounded by a retro-reflective passive marker, in which a payload of the machine-readable code includes: a unique identifier string; x-, y-, and z-coordinates corresponding to a current location of the imaging tag; a direction vector to the next target; and a direction vector to the previous target.
- each imaging tag includes a machine-readable code bounded by a retro-reflective passive marker, in which a payload of the machine-readable code includes: a unique identifier string; x-, y-, and z-coordinates corresponding to a current location of the imaging tag; x-, y-, and z-coordinates corresponding to a next imaging tag; and x-, y-, and z-coordinates corresponding to a previous imaging tag, wherein the direction vector data is then computed by the system from the 3D difference between the current location and the next or previous location data.
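The coordinates-only payload variant requires the platform to derive the direction vector itself, as the 3-D difference between the next (or previous) tag's coordinates and the current tag's coordinates. A minimal sketch, with illustrative coordinate values not taken from the patent:

```python
import math

# Current tag's coordinates and the next tag's coordinates, both read
# from the payload (values are illustrative).
current = (5.0, 2.0, 0.0)
next_tag = (8.0, 6.0, 0.0)

# Direction vector = componentwise 3-D difference.
direction = tuple(n - c for n, c in zip(next_tag, current))

# Its magnitude gives the travel distance; normalizing gives heading only.
distance = math.sqrt(sum(d * d for d in direction))
unit = tuple(d / distance for d in direction)

print("direction:", direction, "distance:", distance)
```

Reversing the subtraction (current minus previous-tag coordinates) yields the return-trip vector in the same way.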
- the mobile platform may determine whether the unique identifier string corresponds to the desired navigation path, calibrate the location of the mobile platform with the current location coordinates of imaging tag 71, and navigate the mobile platform based on navigation vector data corresponding to the location of the next imaging tag 72.
- the process is repeated until the mobile platform reaches a destination. In the case of a round trip, the mobile platform may reach imaging tag 74, at which point it may perform a task or wait for further instructions, and then return to imaging tag 71 via imaging tag 73 and imaging tag 72.
- FIGS. 8 A and 8 B illustrate a branching trip path configuration 80 of imaging tags 81 , 82 , 83 A, 83 B, 84 and 85 according to an exemplary embodiment of the present description.
- each imaging tag includes a machine-readable code bounded by a retro-reflective passive marker, in which a payload of the machine-readable code includes: a unique identifier string; x-, y-, and z-coordinates corresponding to a current location of the imaging tag; a direction vector to the next target; and a direction vector to the previous target.
- each imaging tag includes a machine-readable code bounded by a retro-reflective passive marker, in which a payload of the machine-readable code includes: a unique identifier string; x-, y-, and z-coordinates corresponding to a current location of the imaging tag; x-, y-, and z-coordinates corresponding to a next imaging tag; and x-, y-, and z-coordinates corresponding to a previous imaging tag, wherein the direction vector data is then computed by the system from the 3D difference between the current location and the next or previous location data.
- the mobile platform may determine whether the unique identifier string corresponds to the desired navigation path, calibrate the location of the mobile platform with the current location coordinates of imaging tag 81, and navigate the mobile platform based on navigation vector data corresponding to the location of the next imaging tag 82.
- When the mobile platform reaches imaging tag 82, the process is repeated until the mobile platform reaches imaging tags 83A and 83B, at which point the mobile platform determines whether the unique identifier string of each of 83A and 83B corresponds to the desired navigation path, then navigates based on navigation vector data corresponding to one of imaging tags 83A and 83B; the process is repeated until the mobile platform reaches a destination.
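The branch decision at tags 83A/83B reduces to comparing each visible tag's identifier against the planned route and following the one that matches. The identifier strings and route-plan list below are hypothetical, for illustration only:

```python
# Planned route for this trip (high-level instruction expanded into tag IDs).
route_plan = ["TAG-81", "TAG-82", "TAG-83B", "TAG-84"]

# Both branch tags are visible from tag 82's position; each carries its
# own navigation vector (values are illustrative).
visible_tags = {
    "TAG-83A": {"vector": (1.0, 0.0, 0.0)},
    "TAG-83B": {"vector": (0.0, 1.0, 0.0)},
}

# Follow whichever visible tag appears in the planned route.
chosen = next(tag for tag in visible_tags if tag in route_plan)
heading = visible_tags[chosen]["vector"]
print("branch taken:", chosen, "heading:", heading)
```

Because the decision uses only identifier matching, the same tag layout serves every route that passes through the branch point.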
- FIGS. 9 and 10 represent a continuous cycle of imaging tags and a branching continuous cycle of imaging tags, respectively, according to embodiments of the present description. However, it would be understood that the variation of path configurations is not limited to the illustrated path configurations.
- the flowchart 100 shows instructions set forth and executed by one or more computing devices that may be communicatively coupled to one or more processing devices. Hence, when directed by the computing device, the system may perform the instructions set forth in flowchart 100 of FIGS. 11 A and 11 B .
- the system initially sets the imaging device (e.g. camera) to aim at a surface of an object with a wide field-of-view angle. Since imaging targets may be located anywhere in the environment, the system may have to go through a process (see FIGS. 6 A to 6 C ) before being able to fully locate and capture an imaging target.
- the system directs an imaging device to capture a first image of an imaging target located on a surface of an object in a non-illuminated state and a second image of an imaging target located on a surface of an object in an illuminated state.
- a difference image may be generated using the results of the captured first and second images from block 102 .
- the difference image may be computed and then a segmentation process is run on the image to determine contiguous regions of pixels within the image.
- after segmentation of the difference image, the system may locate the four edges of the rectangular border. If a rectangular border is not found, the system changes the orientation of the imaging device to continue searching for a valid imaging target.
- the computing device locates a center point of each of the plurality of imaging targets to further extract information.
- the computing device may instruct the pan-tilt mechanism to aim the imaging device at the center region of one of the imaging targets (block 107), zoom in on that imaging target, and acquire a close-up image of the imaging target (block 108).
- Each imaging target may include a QR code, Data Matrix (DM), other two-dimensional (2D) code, barcode, other one-dimensional (1D) code, or other code of similar functionality that may be machine-readable or understood by a computer.
- the machine-readable code contains position data related to the position of the imaging target and may be used to help the computing device calibrate the location of the mobile platform.
- the machine-readable code also contains navigation vector data corresponding to a direction of a destination or next imaging tag.
- the machine-readable code may also contain additional information.
- the machine-readable code may contain orientation information, part number, information on whether the object has been serviced, damage information, contact information (email, web site, etc.), or other information related to the part or location on the target object.
- the system determines whether the extracted information from the acquired close-up image contains a valid machine-readable label. If the machine-readable label is not decodable into a usable format or cannot be read by the system, then the system may return to block 101 and repeat the search process of finding valid imaging targets. If the machine-readable label can be read by the system, then the system may continue on to block 111.
- the system determines whether the extracted information from the acquired close-up image matches the position format. If the machine-readable label does not match a position format corresponding to a desired navigation destination, then the system may return to block 101 and repeat the search process of finding valid imaging targets. If the position format is matched, then the system may continue on to block 112.
- the system navigates the mobile platform according to navigation direction vector data extracted from the valid machine-readable code of an imaging target corresponding to a desired navigation destination.
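The decision loop of blocks 110-112 behaves as a small state machine: decode, validate, check against the desired route, then either navigate or resume searching. A minimal sketch, where the decoded payload is modeled as a dictionary with a hypothetical `route` field (a real system would obtain it from a QR/Data Matrix decoder):

```python
def step(decoded_payload, desired_route):
    """Return the next action prescribed by flowchart blocks 110-112."""
    if decoded_payload is None:
        # Block 110: label not decodable -> resume the search at block 101.
        return "search_again"
    if decoded_payload.get("route") != desired_route:
        # Block 111: valid label, but not on the desired navigation path.
        return "search_again"
    # Block 112: valid label on the desired path -> follow its vector data.
    return "navigate"

action = step({"route": "room-123", "vector": (1.0, 0.0, 0.0)}, "room-123")
print(action)
```

Both failure branches return to the same search state, which is what makes the overall process a closed loop rather than a one-shot lookup.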
- Examples of the present disclosure may be described in the context of an aircraft manufacturing and service method 200 , as shown in FIG. 12 , and an aircraft 202 , as shown in FIG. 13 .
- the aircraft manufacturing and service method 200 may include specification and design 204 of the aircraft 202 and material procurement 206 .
- component/subassembly manufacturing 208 and system integration 210 of the aircraft 202 take place.
- the aircraft 202 may go through certification and delivery 212 in order to be placed in service 214 .
- routine maintenance and service 216 , which may also include modification, reconfiguration, refurbishment, and the like.
- a system integrator may include without limitation any number of aircraft manufacturers and major-system subcontractors; a third party may include without limitation any number of vendors, subcontractors, and suppliers; and an operator may be an airline, leasing company, military entity, service organization, and so on.
- the system and method of the present disclosure may be employed during any one or more of the stages of the aircraft manufacturing and service method 200 , including specification and design 204 of the aircraft 202 , material procurement 206 , component/subassembly manufacturing 208 , system integration 210 , certification and delivery 212 , placing the aircraft in service 214 , and routine maintenance and service 216 .
- the aircraft 202 produced by example method 200 may include an airframe 218 with a plurality of systems 220 and an interior 222 .
- the plurality of systems 220 may include one or more of a propulsion system 224 , an electrical system 226 , a hydraulic system 228 , and an environmental system 230 . Any number of other systems may be included.
- the system and method of the present disclosure may be employed for any of the systems of the aircraft 202 , including the airframe 218 and the interior 222 .
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Automation & Control Theory (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Toxicology (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- Evolutionary Computation (AREA)
- Game Theory and Decision Science (AREA)
- Medical Informatics (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/905,299 US11614743B2 (en) | 2018-02-26 | 2018-02-26 | System and method for navigating a sensor-equipped mobile platform through an environment to a destination |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/905,299 US11614743B2 (en) | 2018-02-26 | 2018-02-26 | System and method for navigating a sensor-equipped mobile platform through an environment to a destination |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20190265721A1 (en) | 2019-08-29 |
| US11614743B2 (en) | 2023-03-28 |
Family
ID=67685799
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/905,299 Active 2038-04-10 US11614743B2 (en) | 2018-02-26 | 2018-02-26 | System and method for navigating a sensor-equipped mobile platform through an environment to a destination |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US11614743B2 (en) |
Families Citing this family (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| TWI665461B (en) * | 2018-05-04 | 2019-07-11 | 財團法人工業技術研究院 | Laser positioning system and method thereof |
| TWI671610B (en) | 2018-09-28 | 2019-09-11 | 財團法人工業技術研究院 | Automatic guided vehicle , agv control system, and agv control method |
| WO2020256179A1 (en) * | 2019-06-18 | 2020-12-24 | 엘지전자 주식회사 | Marker for space recognition, method for aligning and moving cart robot by recognizing space, and cart robot |
| TWI701423B (en) * | 2019-07-01 | 2020-08-11 | 東元電機股份有限公司 | Auxiliary positioning system with reflective sticker |
| US20220383541A1 (en) * | 2019-11-13 | 2022-12-01 | Battelle Energy Alliance, Llc | Unmanned vehicle navigation, and associated methods, systems, and computer-readable medium |
| TWI764069B (en) * | 2019-12-19 | 2022-05-11 | 財團法人工業技術研究院 | Automatic guided vehicle positioning system and operating method thereof |
| CN110977984B (en) * | 2019-12-23 | 2023-05-05 | 上海钛米机器人科技有限公司 | Control strip, mechanical arm control method, device, system and storage medium |
| DE102020201785B4 (en) * | 2020-02-13 | 2024-05-23 | Zf Friedrichshafen Ag | Markers for defining the movement trajectory of a vehicle |
| EP3979029B1 (en) * | 2020-09-30 | 2025-05-07 | Carnegie Robotics, LLC | Systems and methods for enabling navigation in environments with dynamic objects |
| US11502729B1 (en) | 2021-08-10 | 2022-11-15 | The Boeing Company | Methods for through-structure power and data transfer between mobile robots and sensor nodes |
| CN115097820A (en) * | 2022-06-09 | 2022-09-23 | 上海同岩土木工程科技股份有限公司 | High-precision positioning device and method for limited space plane area |
Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070276558A1 (en) * | 2004-03-27 | 2007-11-29 | Kyeong-Keun Kim | Navigation system for position self control robot and floor materials for providing absolute coordinates used thereof |
| US7643893B2 (en) | 2006-07-24 | 2010-01-05 | The Boeing Company | Closed-loop feedback control using motion capture systems |
| US20110039573A1 (en) * | 2009-08-13 | 2011-02-17 | Qualcomm Incorporated | Accessing positional information for a mobile station using a data code label |
| US8214098B2 (en) | 2008-02-28 | 2012-07-03 | The Boeing Company | System and method for controlling swarm of remote unmanned vehicles through human gestures |
| US20130212130A1 (en) * | 2012-02-15 | 2013-08-15 | Flybits, Inc. | Zone Oriented Applications, Systems and Methods |
| US20150332079A1 (en) * | 2012-12-31 | 2015-11-19 | Ajou University Industry-Academic Cooperation Foundation | Apparatus and method for recognizing quick response code |
| US20160300354A1 (en) * | 2015-04-09 | 2016-10-13 | The Boeing Company | Automated local positioning system calibration using optically readable markers |
| US20180281191A1 (en) * | 2017-03-30 | 2018-10-04 | Brain Corporation | Systems and methods for robotic path planning |
| US20180364740A1 (en) * | 2017-06-20 | 2018-12-20 | Planck Aerosystems Inc. | Systems and methods for charging unmanned aerial vehicles on a moving platform |
| US20190138030A1 (en) * | 2016-07-07 | 2019-05-09 | SZ DJI Technology Co., Ltd. | Method and system for controlling a movable object using machine-readable code |
Patent Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070276558A1 (en) * | 2004-03-27 | 2007-11-29 | Kyeong-Keun Kim | Navigation system for position self control robot and floor materials for providing absolute coordinates used thereof |
| US7643893B2 (en) | 2006-07-24 | 2010-01-05 | The Boeing Company | Closed-loop feedback control using motion capture systems |
| US8214098B2 (en) | 2008-02-28 | 2012-07-03 | The Boeing Company | System and method for controlling swarm of remote unmanned vehicles through human gestures |
| US20110039573A1 (en) * | 2009-08-13 | 2011-02-17 | Qualcomm Incorporated | Accessing positional information for a mobile station using a data code label |
| US20130212130A1 (en) * | 2012-02-15 | 2013-08-15 | Flybits, Inc. | Zone Oriented Applications, Systems and Methods |
| US20150332079A1 (en) * | 2012-12-31 | 2015-11-19 | Ajou University Industry-Academic Cooperation Foundation | Apparatus and method for recognizing quick response code |
| US20160300354A1 (en) * | 2015-04-09 | 2016-10-13 | The Boeing Company | Automated local positioning system calibration using optically readable markers |
| US20190138030A1 (en) * | 2016-07-07 | 2019-05-09 | SZ DJI Technology Co., Ltd. | Method and system for controlling a movable object using machine-readable code |
| US20180281191A1 (en) * | 2017-03-30 | 2018-10-04 | Brain Corporation | Systems and methods for robotic path planning |
| US20180364740A1 (en) * | 2017-06-20 | 2018-12-20 | Planck Aerosystems Inc. | Systems and methods for charging unmanned aerial vehicles on a moving platform |
Non-Patent Citations (1)
| Title |
|---|
| Troy et al., "Closed-Loop Motion Capture Feedback Control of Small-Scale Aerial Vehicles," AIAA Infotech@Aerospace 2007 Conference and Exhibit (2007). |
Also Published As
| Publication number | Publication date |
|---|---|
| US20190265721A1 (en) | 2019-08-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11614743B2 (en) | System and method for navigating a sensor-equipped mobile platform through an environment to a destination | |
| Mautz et al. | Survey of optical indoor positioning systems | |
| Kalaitzakis et al. | Experimental comparison of fiducial markers for pose estimation | |
| CN113474677B (en) | Automated method for landing a UAV on a pipeline | |
| US8807428B2 (en) | Navigation of mobile devices | |
| CN113984081B (en) | Positioning method, positioning device, self-mobile equipment and storage medium | |
| KR20200041355A (en) | Simultaneous positioning and mapping navigation method, device and system combining markers | |
| US10791276B2 (en) | Automated local positioning system calibration using optically readable markers | |
| CN107328420A (en) | Localization method and device | |
| CN102419178A (en) | Mobile robot positioning system and method based on infrared road signs | |
| Lee et al. | Mobile robot localization using infrared light reflecting landmarks | |
| Blaser et al. | Development of a portable high performance mobile mapping system using the robot operating system | |
| Vasquez et al. | Sensor fusion for tour-guide robot localization | |
| KR101272422B1 (en) | Device and method for locationing using laser scanner and landmark matching | |
| Rostkowska et al. | On the application of QR codes for robust self-localization of mobile robots in various application scenarios | |
| CN112074706B (en) | Precise positioning system | |
| Tsukiyama | Global navigation system with RFID tags | |
| CN110703773A (en) | Method for positioning AGV (automatic guided vehicle) by using circle and coded light source as markers | |
| US20220084247A1 (en) | System and method for recalibrating an augmented reality experience using physical markers | |
| Lee et al. | Robust self-localization of ground vehicles using artificial landmark | |
| Mautz et al. | Optical indoor positioning systems | |
| KR101858488B1 (en) | Sphere type cartesian coordinate system, method, application and server for providing location information using the same | |
| Araar et al. | Towards low-cost indoor localisation using a multi-camera system | |
| Sultan et al. | Vision guided path planning system for vehicles using infrared landmark | |
| Moreira et al. | Mobile robot outdoor localization using planar beacons and visual improved odometry |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: THE BOEING COMPANY, ILLINOIS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TROY, JAMES J.;LEA, SCOTT W.;GEORGESON, GARY E.;SIGNING DATES FROM 20180222 TO 20180226;REEL/FRAME:045041/0081 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |