WO2021021869A1 - Systems and methods for calibrating nonvisible light emitting sensors using alignment targets - Google Patents


Info

Publication number
WO2021021869A1
Authority
WO
WIPO (PCT)
Prior art keywords
detection indication
sensor
target
indication signal
detection
Application number
PCT/US2020/043974
Other languages
French (fr)
Inventor
Ken MASTERSON
Stan VALSAINT
Avi HALPERN
Original Assignee
Brain Corporation
Application filed by Brain Corporation
Publication of WO2021021869A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G01S7/4972 Alignment of sensor
    • G01S7/4804 Auxiliary means for detecting or identifying lidar signals or the like, e.g. laser illuminators
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Definitions

  • the present application relates generally to robotics, and more specifically to systems and methods for calibrating nonvisible light emitting sensors using alignment targets.
  • the alignment targets disclosed herein may provide quick and cost effective systems and methods for visualizing a beam of a sensor by a human using light in the visible spectrum such that the visualized measurement plane may be utilized by the human for calibration of the sensor. Additionally, a processor may utilize data from the alignment targets to determine a pose of the sensor and any required adjustments to the pose.
  • an alignment target apparatus comprises: a plurality of detection indication units arranged spatially in at least one linear array, each detection indication unit being configured to detect the incident nonvisible light outputted from the sensor; and at least one target positioned on the alignment target at a location relative to the sensor, the location of the at least one target corresponding to a desired point of intersection between the incident nonvisible light and the alignment target apparatus, the desired point corresponding to the sensor being calibrated.
  • the at least one target comprises a visible light emitting diode configured to visually represent the desired point of intersection between the alignment target and the incident nonvisible light from the sensor.
  • each one of the plurality of detection indication units further comprises a threshold detection logic configured to: determine if the nonvisible light from the sensor is incident on a detection indication unit based on an induced voltage from a photodiode; and output a detection indication signal based on the induced voltage, the detection indication signal comprising either a logical high or logical low detection indication signal, the logical high detection indication signal corresponding to detection of the nonvisible light, and the logical low detection indication signal corresponding to no detection of the nonvisible light.
  • the detection indication signal comprises an output voltage over a visible light diode, the visible light diode configured to: emit visible light based on the output voltage if the detection indication signal is the logical high detection indication signal, and not emit the visible light if the detection indication signal is the logical low detection indication signal.
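  • As an illustrative sketch only (not part of this application), the threshold detection behavior described in the two paragraphs above may be modeled in software as follows; the class name, threshold value, and method names are assumptions chosen for clarity.

```python
# Minimal behavioral model of one detection indication unit: the induced
# photodiode voltage is compared against a threshold, and a logical-high
# detection indication signal drives the visible light diode.

INDUCED_VOLTAGE_THRESHOLD = 0.5  # volts; assumed value for illustration


class DetectionIndicationUnit:
    def __init__(self, threshold: float = INDUCED_VOLTAGE_THRESHOLD):
        self.threshold = threshold
        self.led_on = False  # state of the visible light diode

    def update(self, induced_voltage: float) -> bool:
        """Return the detection indication signal (True = logical high)."""
        detected = induced_voltage > self.threshold
        self.led_on = detected  # output voltage applied over the visible light diode
        return detected


unit = DetectionIndicationUnit()
print(unit.update(1.2))  # True  -> visible light emitted
print(unit.update(0.1))  # False -> no visible light
```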
  • the alignment target apparatus further comprises a non-transitory computer readable storage medium and at least one processor configured to execute the computer readable instructions to: determine at least one spatial discrepancy between the at least one target and an intersection point between the incident nonvisible light outputted from the sensor and the at least one alignment target apparatus, the intersection point being indicated by a detection indication signal outputted by one of the plurality of detection indication units if the detection indication signal is the logical high detection indication signal; and minimize the at least one spatial discrepancy by adjusting a pose of the sensor.
  • the at least one processor may execute the instructions to adjust the pose of the sensor by either: activating at least one servomotor, the at least one servomotor configured to adjust the pose of the sensor; or providing instructions to a human via a user interface, the instructions prompting the human to perform the adjustments to the pose of the sensor manually.
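  • The following is a minimal sketch, under assumed geometry, of how a processor might convert the spatial discrepancy described above into a pose adjustment; the unit spacing, target range, and function names are hypothetical and not taken from the application.

```python
# Convert the offset between the illuminated detection indication unit and
# the reference target into a pitch correction for a servomotor (or into a
# value reported to a human via a user interface).
import math

UNIT_SPACING_M = 0.01    # assumed 1 cm spacing between detection indication units
TARGET_RANGE_M = 1.0     # assumed distance from the sensor to the alignment target


def spatial_discrepancy_m(lit_index: int, target_index: int) -> float:
    """Vertical offset (meters) between the intersection point and the target."""
    return (lit_index - target_index) * UNIT_SPACING_M


def pitch_correction_deg(discrepancy_m: float) -> float:
    """Pitch change that moves the beam onto the target (small-angle geometry)."""
    return -math.degrees(math.atan2(discrepancy_m, TARGET_RANGE_M))


d = spatial_discrepancy_m(lit_index=7, target_index=4)
print(f"discrepancy = {d * 100:.1f} cm, adjust pitch by {pitch_correction_deg(d):.2f} deg")
```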
  • the at least one target comprises a designated at least one detection indication unit of the plurality of detection indication units.
  • According to at least one non-limiting exemplary embodiment, a method for calibrating a sensor on a device is disclosed, the sensor being configured to emit nonvisible light to generate measurements of an environment.
  • the method comprising: utilizing at least one alignment target at a known position relative to the device to determine, for each alignment target, at least one spatial discrepancy between a location of at least one target and a location of at least one intersection point; and minimizing the at least one spatial discrepancy by performing adjustments to a pose of the sensor; wherein, an intersection point corresponds to a location on an alignment target where the nonvisible light is incident; and a target corresponds to a desired location of the intersection point on an alignment target corresponding to a calibrated sensor.
  • the method further comprises: determining the intersection point based on a detection indication signal output from one of a plurality of linearly arranged detection indication units of an alignment target being logical high.
  • the method further comprises: determining the detection indication signal for a detection indication unit based on an induced voltage of a photodiode of the detection indication unit exceeding a value, the voltage being induced due to the nonvisible light from the sensor being incident on the photodiode.
  • the detection indication signal comprises an output voltage over a visible light diode, the output voltage configures the visible light diode to emit visible light when the output detection indication signal is logical high and produce no visible light when the detection indication signal is logical low.
  • the at least one target comprises a designated at least one detection indication unit of the plurality of detection indication units.
  • the at least one target comprises a visible light emitting diode configured to visually represent the desired location of the intersection point.
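  • A brief sketch (assumed names, not part of this application) of determining the intersection point from the detection indication signals of the linearly arranged units, as described in the method above:

```python
# The intersection point is taken as the index of the unit whose detection
# indication signal is logical high; if a wide beam illuminates several
# adjacent units, the middle index is returned.
from typing import List, Optional


def intersection_index(signals: List[bool]) -> Optional[int]:
    high = [i for i, s in enumerate(signals) if s]
    if not high:
        return None          # the beam misses the alignment target entirely
    return high[len(high) // 2]


print(intersection_index([False, False, True, True, False]))  # -> 2
print(intersection_index([False] * 5))                        # -> None
```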
  • a non-transitory computer readable storage medium having a plurality of computer readable instructions embodied thereon.
  • the instructions when executed by a processor, configure the processor to: determine at least one spatial discrepancy between at least one target and at least one intersection point between one or more alignment targets and nonvisible light from a sensor, the intersection point being indicated by one or more detection indication units of an alignment target outputting a logical high detection indication signal; and minimize the spatial discrepancy by performing adjustments to a pose of the sensor.
  • the non-transitory computer readable storage medium further comprises instructions which configure the processor to: perform the adjustments to the pose of the sensor by activating at least one servomotor configured to adjust the pose of the sensor.
  • the non-transitory computer readable storage medium further comprises instructions which configure the processor to: provide instructions to a human via a user interface to perform the adjustments to the pose of the sensor.
  • each of the at least one alignment targets further comprises a plurality of linearly arranged detection indication units, each detection indication unit further comprises a photodiode sensitive to a wavelength of the nonvisible light; and the detection indication signal output is based on an induced voltage of a photodiode exceeding a value, the voltage being induced due to the nonvisible light from the sensor being incident on the photodiode.
  • the at least one target comprises a designated at least one of the plurality of detection indication units located at a desired location of the intersection point, the desired location corresponding to an intersection point of a calibrated sensor.
  • an alignment target apparatus comprises: a plurality of detection indication units arranged spatially in at least one linear array, each detection indication unit being configured to detect the incident nonvisible light from the sensor, each detection indication unit comprising: a threshold detection logic configured to determine if the nonvisible light from the sensor is incident on a detection indication unit based on an induced voltage from a photodiode, the threshold detection logic outputs a detection indication signal based on the induced voltage from the photodiode, the detection indication signal comprising a logical high or low corresponding to a detection or no detection, respectively, of the incident nonvisible light by the photodiode, the detection indication signal comprises an output voltage over a visible light diode, the output voltage configures the visible light diode to emit visible light when the detection indication signal is logical high and produce no visible light when the detection indication signal is logical low; and at least one target positioned on the alignment target at a location relative to the sensor, the location of the at least one target corresponding to a desired point of intersection between the incident nonvisible light and the alignment target apparatus, the desired point corresponding to the sensor being calibrated.
  • FIG. 1A is a functional block diagram of a main robot in accordance with some embodiments of this disclosure.
  • FIG. 1B is a functional block diagram of a controller shown in FIG. 1A in accordance with some embodiments of this disclosure.
  • FIG. 1C illustrates a light detection and ranging (LiDAR) sensor and features thereof in accordance with some embodiments of this disclosure.
  • FIG. 2A is a high-level circuit diagram of an alignment target and components thereof, according to an exemplary embodiment.
  • FIG. 2B illustrates an alignment target being utilized to detect a measurement plane of a sensor for calibration of the sensor, according to an exemplary embodiment.
  • FIG. 2C(i)-(iii) illustrates three exemplary alignment targets, showing additional exemplary embodiments of an alignment target.
  • FIG. 3 is a high-level circuit diagram of a detection indication unit of an alignment target, according to an exemplary embodiment.
  • FIG. 4A is a top view of a LiDAR sensor and three alignment targets to be utilized to calibrate the LiDAR sensor, according to an exemplary embodiment.
  • FIG. 4B is a side view of the LiDAR sensor and three alignment targets illustrated in FIG. 4A to illustrate discrepancies between a target and illuminated visible light diodes to be utilized to calibrate the LiDAR sensor, according to an exemplary embodiment.
  • FIG. 5 is a process flow diagram illustrating a method for an operator to calibrate a LiDAR sensor on a device using at least one alignment target, according to an exemplary embodiment.
  • FIG. 6 is a functional block diagram of a system configured to utilize alignment targets and a processing unit to determine a pose of a sensor and required adjustments to the pose, according to an exemplary embodiment.
  • LiDAR sensors emit near infrared (IR) light, which is invisible to human eyes, thereby making it difficult for a human to visualize a measurement plane of the LiDAR sensors for calibration.
  • Infrared goggles, configured to enable humans to see light in the IR or near-IR spectrum, may be cumbersome and impractical due to vision range limitations of the goggles.
  • Infrared luminescent paints, which glow upon being illuminated with IR light, may also be impractical, as the power emitted by LiDAR sensors per unit area may be orders of magnitude too weak to illuminate the paints.
  • many robots may utilize other sensors, such as radar or ultraviolet sensors, which also utilize beams of nonvisible light to sense an environment.
  • an apparatus may be implemented or a method may be practiced using any number of the aspects set forth herein.
  • the scope of the disclosure is intended to cover such an apparatus or method that is practiced using other structure, functionality, or structure and functionality in addition to or other than the various aspects of the disclosure set forth herein. It should be understood that any aspect disclosed herein may be implemented by one or more elements of a claim.
  • a robot may include mechanical and/or virtual entities configured to carry out a complex series of tasks or actions autonomously.
  • robots may be machines that are guided and/or instructed by computer programs and/or electronic circuitry.
  • robots may include electro mechanical components that are configured for navigation, where the robot may move from one location to another.
  • Such robots may include autonomous and/or semi-autonomous cars, floor cleaners, rovers, drones, planes, boats, carts, trams, wheelchairs, industrial equipment, stocking machines, mobile platforms, personal transportation devices (e.g., hover boards, SEGWAYS®, etc.), trailer movers, vehicles, and the like.
  • Robots may also include any autonomous and/or semi-autonomous machine for transporting items, people, animals, cargo, freight, objects, luggage, and/or anything desirable from one location to another.
  • an operator may comprise a human manufacturer, operator, or user of a device. Operators may be tasked with calibrating sensors on a device using, at least in part, the systems and methods disclosed herein.
  • logical high of a digital signal may correspond to an active state (e.g., a flashlight emitting light, i.e., in an ON state), wherein the voltage of the signal may be 0 V or 5 V based on a specific design (i.e., active low or active high) of the flashlight.
  • visible light comprises light of optical wavelength (i.e., visible to humans) from around 400 - 700 nanometers.
  • Infrared (IR) light may include any wavelengths longer than 700 nanometers and may comprise any or all subsections of infrared (e.g., near IR, mid IR, far IR).
  • Nonvisible light comprises light outside of the optical wavelength (i.e., light invisible to humans).
  • a sensor may utilize other forms of nonvisible light (e.g., microwave, ultraviolet, etc.), wherein the detection of IR light described herein is not intended to be limiting.
  • a pose of an object may comprise an (x, y, z, yaw, pitch, roll) orientation of the object, defined with respect to a predefined origin.
  • a pose of an object may comprise some or all six degrees of freedom, three degrees of freedom being (x, y, z) position and the other three being (yaw, pitch, roll) rotation, wherein the pose of the object may be defined about all available degrees of freedom of the object.
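  • As a small illustrative sketch (field names are assumptions, not from this application), the (x, y, z, yaw, pitch, roll) pose convention described above may be represented as:

```python
# Six-degree-of-freedom pose, defined with respect to a predefined origin.
from dataclasses import dataclass


@dataclass
class Pose:
    x: float = 0.0      # meters
    y: float = 0.0      # meters
    z: float = 0.0      # meters
    yaw: float = 0.0    # degrees, rotation about the vertical axis
    pitch: float = 0.0  # degrees
    roll: float = 0.0   # degrees


sensor_pose = Pose(x=0.20, z=0.45, pitch=1.5)  # e.g., a slightly tilted sensor
print(sensor_pose)
```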
  • network interfaces may include any signal, data, or software interface with a component, network, or process including, without limitation, those of the FireWire (e.g., FW400, FW800, FWS800T, FWS1600, FWS3200, etc.), universal serial bus (“USB”) (e.g., USB 1.X, USB 2.0, USB 3.0, USB Type-C, etc.), Ethernet (e.g., 10/100, 10/100/1000 (Gigabit Ethernet), 10-Gig-E, etc.), multimedia over coax alliance technology (“MoCA”), Coaxsys (e.g., TVNET™), radio frequency tuner (e.g., in-band or OOB, cable modem, etc.), Wi-Fi (802.11), WiMAX (e.g., WiMAX (802.16)), PAN (e.g., PAN/802.15), cellular (e.g., 3G, 4G, or 5G including LTE/LTE-A and variants thereof), and/or other network interfaces.
  • Wi-Fi may include one or more of IEEE-Std. 802.11, variants of IEEE-Std. 802.11, standards related to IEEE-Std. 802.11 (e.g., 802.11 a/b/g/n/ac/ad/af/ah/ai/aj/aq/ax/ay), and/or other wireless standards.
  • processor, microprocessor, and/or digital processor may include any type of digital processing device such as, without limitation, digital signal processors (“DSPs”), reduced instruction set computers (“RISC”), complex instruction set computers (“CISC”), microprocessors, gate arrays (e.g., field programmable gate arrays (“FPGAs”)), programmable logic device (“PLDs”), reconfigurable computer fabrics (“RCFs”), array processors, secure microprocessors, specialized processors (e.g., neuromorphic processors), and application-specific integrated circuits (“ASICs”).
  • computer program and/or software may include any sequence or machine cognizable steps which perform a function.
  • Such computer program and/or software may be rendered in any programming language or environment including, for example, C/C++, C#, Fortran, COBOL, MATLAB™, PASCAL, GO, RUST, SCALA, Python, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (“CORBA”), JAVA™ (including J2ME, Java Beans, etc.), Binary Runtime Environment (e.g., “BREW”), and the like.
  • connection, link, and/or wireless link may include a causal link between any two or more entities (whether physical or logical/virtual), which enables information exchange between the entities.
  • computer and/or computing device may include, but are not limited to, personal computers (“PCs”) and minicomputers, whether desktop, laptop, or otherwise, mainframe computers, workstations, servers, personal digital assistants (“PDAs”), handheld computers, embedded computers, programmable logic devices, personal communicators, tablet computers, mobile devices, portable navigation aids, J2ME equipped devices, cellular telephones, smart phones, personal integrated communication or entertainment devices, and/or any other device capable of executing a set of instructions and processing an incoming data signal.
  • the systems and methods of this disclosure at least: (i) enhance abilities of a human to manually calibrate a sensor on a robot; (ii) allow humans to visualize a measurement plane of an infrared LiDAR sensor; (iii) reduce cost and complexity associated with calibrating a LiDAR sensor; and (iv) improve operation precision of robots by further enhancing LiDAR sensor calibration methods.
  • FIG. 1A is a functional block diagram of a robot 102 in accordance with some principles of this disclosure.
  • robot 102 may include controller 118, memory 120, user interface unit 112, sensor units 114, navigation units 106, actuator unit 108, and communications unit 116, as well as other components and subcomponents (e.g., some of which may not be illustrated).
  • Although a specific embodiment is illustrated in FIG. 1A, it is appreciated that the architecture may be varied in certain embodiments as would be readily apparent to one of ordinary skill given the contents of the present disclosure.
  • robot 102 may be representative at least in part of any robot described in this disclosure.
  • Controller 118 may control the various operations performed by robot 102. Controller 118 may include and/or comprise one or more processing devices (e.g., microprocessing devices) and other peripherals.
  • processing device, microprocessing device, and/or digital processing device may include any type of digital processing device such as, without limitation, digital signal processing devices (“DSPs”), reduced instruction set computers (“RISC”), complex instruction set computers (“CISC”), microprocessing devices, gate arrays (e.g., field programmable gate arrays (“FPGAs”)), programmable logic device (“PLDs”), reconfigurable computer fabrics (“RCFs”), array processing devices, secure microprocessing devices and application- specific integrated circuits (“ASICs”).
  • Peripherals may include hardware accelerators configured to perform a specific function using hardware elements such as, without limitation, encryption/decryption hardware, algebraic processing devices (e.g., tensor processing units, quadratic problem solvers, multipliers, etc.), data compressors, encoders, arithmetic logic units (“ALU”), and the like.
  • Controller 118 may be operatively and/or communicatively coupled to memory 120.
  • Memory 120 may include any type of integrated circuit or other storage device configured to store digital data including, without limitation, read-only memory (“ROM”), random access memory (“RAM”), non-volatile random access memory (“NVRAM”), programmable read-only memory (“PROM”), electrically erasable programmable read-only memory (“EEPROM”), dynamic random- access memory (“DRAM”), Mobile DRAM, synchronous DRAM (“SDRAM”), double data rate SDRAM (“DDR/2 SDRAM”), extended data output (“EDO”) RAM, fast page mode RAM (“FPM”), reduced latency DRAM (“RLDRAM”), static RAM (“SRAM”), flash memory (e.g., NAND/NOR), memristor memory, pseudostatic RAM (“PSRAM”), etc.
  • Memory 120 may provide computer-readable instructions and data to controller 118.
  • memory 120 may be a non-transitory, computer- readable storage apparatus and/or medium having a plurality of instructions stored thereon, the instructions being executable by a processing apparatus (e.g., controller 118) to operate robot 102.
  • the computer-readable instructions may be configured to, when executed by the processing apparatus, cause the processing apparatus to perform the various methods, features, and/or functionality described in this disclosure.
  • controller 118 may perform logical and/or arithmetic operations based on program instructions stored within memory 120.
  • the instructions and/or data of memory 120 may be stored in a combination of hardware, some located locally within robot 102, and some located remote from robot 102 (e.g., in a cloud, server, network, etc.).
  • a processing device may be internal to or on board robot 102 and/or may be external to robot 102 and be communicatively coupled to controller 118 of robot 102 utilizing communication units 116 wherein the external processing device may receive data from robot 102, process the data, and transmit computer-readable instructions back to controller 118.
  • the processing device may be on a remote server (not shown).
  • memory 120 may store a library of sensor data.
  • the sensor data may be associated at least in part with objects and/or people.
  • this library may include sensor data related to objects and/or people in different conditions, such as sensor data related to objects and/or people with different compositions (e.g., materials, reflective properties, molecular makeup, etc.), different lighting conditions, angles, sizes, distances, clarity (e.g., blurred, obstructed/occluded, partially off frame, etc.), colors, surroundings, and/or other conditions.
  • the sensor data in the library may be taken by a sensor (e.g., a sensor of sensor units 114 or any other sensor) and/or generated automatically, such as with a computer program that is configured to generate/simulate (e.g., in a virtual world) library sensor data (e.g., which may generate/simulate these library data entirely digitally and/or beginning from actual sensor data) from different lighting conditions, angles, sizes, distances, clarity (e.g., blurred, obstructed/occluded, partially off frame, etc.), colors, surroundings, and/or other conditions.
  • the number of images in the library may depend at least in part on one or more of the amount of available data, the variability of the surrounding environment in which robot 102 operates, the complexity of objects and/or people, the variability in appearance of objects, physical properties of robots, the characteristics of the sensors, and/or the amount of available storage space (e.g., in the library, memory 120, and/or local or remote storage).
  • the library may be stored on a network (e.g., cloud, server, distributed network, etc.) and/or may not be stored completely within memory 120.
  • various robots may be networked so that data captured by individual robots are collectively shared with other robots.
  • these robots may be configured to learn and/or share sensor data in order to facilitate the ability to readily detect and/or identify errors and/or assist events.
  • operative units 104 may be coupled to controller 118, or any other controller, to perform the various operations described in this disclosure.
  • One, more, or none of the modules in operative units 104 may be included in some embodiments.
  • Throughout this disclosure, reference may be made to various controllers and/or processing devices; a single controller (e.g., controller 118) may serve as the various controllers and/or processing devices described.
  • different controllers and/or processing devices may be used, such as controllers and/or processing devices used particularly for one or more operative units 104.
  • Controller 118 may send and/or receive signals, such as power signals, status signals, data signals, electrical signals, and/or any other desirable signals, including discrete and analog signals to operative units 104. Controller 118 may coordinate and/or manage operative units 104, and/or set timings (e.g., synchronously or asynchronously), turn off/on control power budgets, receive/send network instructions and/or updates, update firmware, send interrogatory signals, receive and/or send statuses, and/or perform any operations for running features of robot 102.
  • operative units 104 may include various units that perform functions for robot 102.
  • operative units 104 includes at least navigation units 106, actuator units 108, user interface units 112, sensor units 114, and communication units 116.
  • Operative units 104 may also comprise other units such as specifically configured task units (not shown) that provide the various functionality of robot 102.
  • operative units 104 may be instantiated in software, hardware, or both software and hardware.
  • units of operative units 104 may comprise computer-implemented instructions executed by a controller 118.
  • units of operative unit 104 may comprise hardcoded logic (e.g., ASICs).
  • units of operative units 104 may comprise both computer-implemented instructions executed by a controller and hardcoded logic. Where operative units 104 are implemented in part in software, operative units 104 may include units/modules of code configured to provide one or more functionalities.
  • navigation units 106 may include systems and methods that may computationally construct and update a map of an environment, localize robot 102 (e.g., find the position) in a map, and navigate robot 102 to/from destinations.
  • the mapping may be performed by imposing data obtained in part by sensor units 114 into a computer-readable map representative at least in part of the environment.
  • a map of an environment may be uploaded to robot 102 through user interface units 112, uploaded wirelessly or through wired connection, or taught to robot 102 by a user.
  • navigation units 106 may include components and/or software configured to provide directional instructions for robot 102 to navigate. Navigation units 106 may process maps, routes, and localization information generated by mapping and localization units, data from sensor units 114, and/or other operative units 104.
  • actuator units 108 may include actuators such as electric motors, gas motors, driven magnet systems, solenoid/ratchet systems, piezoelectric systems (e.g., inchworm motors), magnetostrictive elements, gesticulation, and/or any way of driving an actuator known in the art.
  • actuators may actuate the wheels for robot 102 to navigate a route, navigate around obstacles, or rotate cameras and sensors.
  • actuator unit 108 may include systems that allow movement of robot 102, such as motorized propulsion.
  • motorized propulsion may move robot 102 in a forward or backward direction, and/or be used at least in part in turning robot 102 (e.g., left, right, and/or any other direction).
  • actuator unit 108 may control if robot 102 is moving or is stopped and/or allow robot 102 to navigate from one location to another location.
  • Actuator unit 108 may also include any system used for actuating, in some cases actuating task units to perform tasks.
  • actuator unit 108 may include driven magnet systems, motors/engines (e.g., electric motors, combustion engines, steam engines, and/or any type of motor/engine known in the art), solenoid/ratchet system, piezoelectric system (e.g., an inchworm motor), magnetostrictive elements, gesticulation, and/or any actuator known in the art.
  • sensor units 114 may comprise systems and/or methods that may detect characteristics within and/or around robot 102.
  • Sensor units 114 may comprise a plurality and/or a combination of sensors.
  • Sensor units 114 may include sensors that are internal to robot 102 or external, and/or have components that are partially internal and/or partially external.
  • sensor units 114 may include one or more exteroceptive sensors, such as sonars, light detection and ranging (“LiDAR”) sensors, radars, lasers, cameras (including video cameras (e.g., red-blue-green (“RGB”) cameras, infrared cameras, three-dimensional (“3D”) cameras, thermal cameras, etc.), time of flight (“ToF”) cameras, and structured light cameras), antennas, motion detectors, microphones, and/or any other sensor known in the art.
  • sensor units 114 may collect raw measurements (e.g., currents, voltages, resistances, gate logic, etc.) and/or transformed measurements (e.g., distances, angles, detected points in obstacles, etc.).
  • measurements may be aggregated and/or summarized.
  • Sensor units 114 may generate data based at least in part on distance or height measurements.
  • data may be stored in data structures, such as matrices, arrays, queues, lists, stacks, bags, etc.
  • sensor units 114 may include sensors that may measure internal characteristics of robot 102.
  • sensor units 114 may measure temperature, power levels, statuses, and/or any characteristic of robot 102.
  • sensor units 114 may be configured to determine the odometry of robot 102.
  • sensor units 114 may include proprioceptive sensors, which may comprise sensors such as accelerometers, inertial measurement units (“IMU”), odometers, gyroscopes, speedometers, cameras (e.g. using visual odometry), clock/timer, and the like. Odometry may facilitate autonomous navigation and/or autonomous actions of robot 102.
  • This odometry may include robot 102’s position (e.g., where position may include robot’s location, displacement and/or orientation, and may sometimes be interchangeable with the term pose as used herein) relative to the initial location.
  • Such data may be stored in data structures, such as matrices, arrays, queues, lists, stacks, bags, etc.
  • the data structure of the sensor data may be called an image.
  • sensor units 114 may be in part external to the robot 102 and coupled to communications units 116.
  • a security camera within an environment of a robot 102 may provide a controller 118 of the robot 102 with a video feed via wired or wireless communication channel(s).
  • sensor units 114 may include sensors configured to detect a presence of an object at a location such as, for example without limitation, a pressure or motion sensor may be disposed at a shopping cart storage location of a grocery store, wherein the controller 118 of the robot 102 may utilize data from the pressure or motion sensor to determine if the robot 102 should retrieve more shopping carts for customers.
  • user interface units 112 may be configured to enable a user to interact with robot 102.
  • user interface units 112 may include touch panels, buttons, keypads/keyboards, ports (e.g., universal serial bus (“USB”), digital visual interface (“DVI”), Display Port, E-Sata, Firewire, PS/2, Serial, VGA, SCSI, audioport, high-definition multimedia interface (“HDMI”), personal computer memory card international association (“PCMCIA”) ports, memory card ports (e.g., secure digital (“SD”) and miniSD), and/or ports for computer-readable medium), mice, rollerballs, consoles, vibrators, audio transducers, and/or any interface for a user to input and/or receive data and/or commands, whether coupled wirelessly or through wires.
  • User interface units 112 may include a display, such as, without limitation, liquid crystal displays (“LCDs”), light-emitting diode (“LED”) displays, LED LCD displays, in-plane-switching (“IPS”) displays, cathode ray tubes, plasma displays, high definition (“HD”) panels, 4K displays, retina displays, organic LED displays, touchscreens, surfaces, canvases, and/or any displays, televisions, monitors, panels, and/or devices known in the art for visual presentation.
  • user interface units 112 may be positioned on the body of robot 102.
  • user interface units 112 may be positioned away from the body of robot 102 but may be communicatively coupled to robot 102 (e.g., via communication units including transmitters, receivers, and/or transceivers) directly or indirectly (e.g., through a network, server, and/or a cloud).
  • user interface units 112 may include one or more projections of images on a surface (e.g., the floor) proximally located to the robot, e.g., to provide information to the occupant or to people around the robot. The information could be the direction of future movement of the robot, such as an indication of moving forward, left, right, back, at an angle, and/or any other direction.
  • communications unit 116 may include one or more receivers, transmitters, and/or transceivers. Communications unit 116 may be configured to send/receive a transmission protocol, such as BLUETOOTH®, ZIGBEE®, Wi-Fi, induction wireless data transmission, radio frequencies, radio transmission, radio-frequency identification (“RFID”), near-field communication (“NFC”), infrared, network interfaces, cellular technologies such as 3G (3.5G, 3.75G, 3GPP/3GPP2/HSPA+), 4G (4GPP/4GPP2/LTE/LTE-TDD/LTE-FDD), 5G (5GPP/5GPP2), or 5G LTE (long-term evolution, and variants thereof including LTE-A, LTE-U, LTE-A Pro, etc.), high speed downlink packet access (“HSDPA”), high-speed uplink packet access (“HSUPA”), time division multiple access (“TDMA”), and/or other wireless transmission protocols.
  • Communications unit 116 may also be configured to send/receive signals utilizing a transmission protocol over wired connections, such as any cable that has a signal line and ground.
  • cables may include Ethernet cables, coaxial cables, Universal Serial Bus (“USB”), FireWire, and/or any connection known in the art.
  • Such protocols may be used by communications unit 116 to communicate to external systems, such as computers, smart phones, tablets, data capture systems, mobile telecommunications networks, clouds, servers, or the like.
  • Communications unit 116 may be configured to send and receive signals comprising numbers, letters, alphanumeric characters, and/or symbols.
  • signals may be encrypted, using algorithms such as 128-bit or 256-bit keys and/or other encryption algorithms complying with standards such as the Advanced Encryption Standard (“AES”), RSA, Data Encryption Standard (“DES”), Triple DES, and the like.
  • Communications unit 116 may be configured to send and receive statuses, commands, and other data/information.
  • communications unit 116 may communicate with a user operator to allow the user to control robot 102.
  • Communications unit 116 may communicate with a server/network (e.g., a network) in order to allow robot 102 to send data, statuses, commands, and other communications to the server.
  • the server may also be communicatively coupled to computer(s) and/or device(s) that may be used to monitor and/or control robot 102 remotely.
  • Communications unit 116 may also receive updates (e.g., firmware or data updates), data, statuses, commands, and other communications from a server for robot 102.
  • operating system 110 may be configured to manage memory 120, controller 118, power supply 122, modules in operative units 104, and/or any software, hardware, and/or features of robot 102.
  • operating system 110 may include device drivers to manage hardware resources for robot 102.
  • power supply 122 may include one or more batteries, including, without limitation, lithium, lithium ion, nickel-cadmium, nickel-metal hydride, nickel- hydrogen, carbon-zinc, silver-oxide, zinc-carbon, zinc-air, mercury oxide, alkaline, or any other type of battery known in the art. Certain batteries may be rechargeable, such as wirelessly (e.g., by resonant circuit and/or a resonant tank circuit) and/or plugging into an external power source. Power supply 122 may also be any supplier of energy, including wall sockets and electronic devices that convert solar, wind, water, nuclear, hydrogen, gasoline, natural gas, fossil fuels, mechanical energy, steam, and/or any power source into electricity.
  • One or more of the units described with respect to FIG. 1A may be integrated onto robot 102, such as in an integrated system.
  • one or more of these units may be part of an attachable module.
  • This module may be attached to an existing apparatus to automate it so that it behaves as a robot.
  • the features described in this disclosure with reference to robot 102 may be instantiated in a module that may be attached to an existing apparatus and/or integrated onto robot 102 in an integrated system.
  • a person having ordinary skill in the art would appreciate from the contents of this disclosure that at least a portion of the features described in this disclosure may also be run remotely, such as in a cloud, network, and/or server.
  • a robot 102, a controller 118, or any other controller, processing device, or robot performing a task, operation or transformation illustrated in the figures below comprises a controller executing computer readable instructions stored on a non-transitory computer readable storage apparatus, such as memory 120, as would be appreciated by one skilled in the art.
  • the processing device includes a data bus 128, a receiver 126, a transmitter 134, at least one processor 130, and a memory 132.
  • the receiver 126, the processor 130 and the transmitter 134 all communicate with each other via the data bus 128.
  • the processor 130 is configurable to access the memory 132 which stores computer code or computer readable instructions in order for the processor 130 to execute the specialized algorithms.
  • memory 132 may comprise some, none, different, or all of the features of memory 120 previously illustrated in FIG. 1A.
  • the algorithms executed by the processor 130 are discussed in further detail below.
  • The receiver 126 as shown in FIG. 1B is configurable to receive input signals 124.
  • the input signals 124 may comprise signals from a plurality of operative units 104 illustrated in FIG. 1A including, but not limited to, sensor data from sensor units 114, user inputs, motor feedback, external communication signals (e.g., from a remote server), and/or any other signal from an operative unit 104 requiring further processing.
  • the receiver 126 communicates these received signals to the processor 130 via the data bus 128.
  • the data bus 128 is the means of communication between the different components (receiver, processor, and transmitter) in the processing device.
  • the processor 130 executes the algorithms, as discussed below, by accessing specialized computer-readable instructions from the memory 132.
  • the memory 132 is a storage medium for storing computer code or instructions.
  • the storage medium may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others.
  • Storage medium may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
  • the processor 130 may communicate output signals to transmitter 134 via data bus 128 as illustrated.
  • the transmitter 134 may be configurable to further communicate the output signals to a plurality of operative units 104 illustrated by signal output 136.
  • FIG. 1B may also illustrate an external server architecture configurable to effectuate the control of a robotic apparatus from a remote location, such as server 202 illustrated next in FIG. 2. That is, the server may also include a data bus, a receiver, a transmitter, a processor, and a memory that stores specialized computer readable instructions thereon.
  • a controller 118 of a robot 102 may include one or more processing devices and may further include other peripheral devices used for processing information, such as ASICs, DSPs, proportional-integral-derivative (“PID”) controllers, hardware accelerators (e.g., encryption/decryption hardware), and/or other peripherals (e.g., analog to digital converters) described above in FIG. 1A.
  • the other peripheral devices when instantiated in hardware are commonly used within the art to accelerate specific tasks (e.g., multiplication, encryption, etc.) which may alternatively be performed using the system architecture of FIG. IB.
  • peripheral devices are used as a means for intercommunication between the controller 118 and operative units 104 (e.g., digital to analog converters and/or amplifiers for producing actuator signals).
  • the controller 118 executing computer readable instructions to perform a function may include one or more processing devices thereof executing computer readable instructions and, in some instances, the use of any hardware peripherals known within the art.
  • Controller 118 may be illustrative of various processing devices and peripherals integrated into a single circuit die or distributed to various locations of the robot 102 which receive, process, and output information to/from operative units 104 of the robot 102 to effectuate control of the robot 102 in accordance with instructions stored in a memory 120, 132.
  • controller 118 may include a plurality of processing devices for performing high-level tasks (e.g., planning a route to avoid obstacles) and processing devices for performing low-level tasks (e.g., producing actuator signals in accordance with the route).
  • FIG. 1C illustrates a planar light detection and ranging (“LiDAR”) sensor 138 collecting distance measurements of a wall 142 along a measurement plane in accordance with some exemplary embodiments of the present disclosure.
  • Planar LiDAR sensor 138 may be configured to collect distance measurements of the wall 142 by projecting a plurality of beams 140 of photons at discrete angles along the measurement plane and determining the distance of the wall 142 based on a time of flight (“TOF”) of the photons leaving the LiDAR sensor 138, reflecting off the wall 142, and returning back to the LiDAR sensor 138.
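  • A short worked example (not part of this application) of the time-of-flight relation described above: the beam travels to the wall and back, so the distance is c · t / 2.

```python
# Distance from a round-trip time-of-flight measurement.
SPEED_OF_LIGHT_MPS = 299_792_458.0


def tof_distance_m(round_trip_time_s: float) -> float:
    return SPEED_OF_LIGHT_MPS * round_trip_time_s / 2.0


print(tof_distance_m(20e-9))  # ~3.0 m for a 20 ns round trip
```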
  • a measurement plane of the LiDAR sensor 138 comprises a plane along which the beams 140 are emitted which, for this exemplary embodiment, is the plane of the page.
  • a plurality of sensors 138 may be positioned on a robot 102 chassis to enhance the navigation and localization capabilities of the robot 102 in addition to sensing or detecting objects around the robot 102, such as wall 142.
  • This plurality of sensors 138 may be mounted in static positions (e.g., using screws, bolts, etc.) or may be mounted with servomotors configured to adjust the pose of the sensor 138 on the robot 102.
  • Calibration of these sensors 138 may be essential for the robot 102 to navigate through an environment safely and perform complex tasks with high precision. Calibration of sensors 138 may degrade over time due to, for example, wear and tear, collisions with objects or people, and/or electrical components of the sensor performing abnormally due to, e.g., temperature fluctuations.
  • LiDAR sensors may utilize Doppler frequency shifts of reflected beams 140 to measure a speed of an object relative to the sensor 138.
  • This form of LiDAR sensor is typically used to measure speed using a single beam 140, such as for traffic speed enforcement.
  • the reflected beam 140 may include a ToF proportionate to the distance between the sensor 138 and the object, as well as a blue-shifted frequency (i.e., a higher frequency than the emitted beam 140) if the object is moving toward the sensor 138, or vice versa (i.e., a red-shifted frequency) if the object is moving away from the sensor 138.
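  • The Doppler relation referenced above may be sketched as follows (a simplified model; the 905 nm wavelength and all names are assumptions, not taken from this application): for a beam reflected off a moving object, the frequency shift is f_d = 2·v·f0/c, so v = f_d·c/(2·f0), with a positive (blue) shift indicating the object approaches the sensor.

```python
# Radial speed of a reflector from the Doppler shift of the reflected beam.
SPEED_OF_LIGHT_MPS = 299_792_458.0


def radial_speed_mps(frequency_shift_hz: float, emitted_frequency_hz: float) -> float:
    return frequency_shift_hz * SPEED_OF_LIGHT_MPS / (2.0 * emitted_frequency_hz)


f0 = SPEED_OF_LIGHT_MPS / 905e-9     # carrier frequency of an assumed 905 nm beam
print(radial_speed_mps(22.1e6, f0))  # ~10 m/s toward the sensor for a 22.1 MHz blue shift
```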
  • the alignment targets 202 shown in the following figures may be utilized to calibrate any LiDAR sensor including, but not limited to, planar LiDAR sensors and/or directional laser sensors (e.g., for autonomous vehicles measuring the speed of nearby vehicles).
  • FIG. 2A illustrates an alignment target 202 and high- level circuit diagrams of components thereof, according to an exemplary embodiment.
  • the alignment target 202 may comprise a plurality of detection indication units 204 arranged in a linear array, each further comprising an infrared (“IR”) detector diode 302, amplifier 314, threshold logic 318, and a visible light diode 322 as further illustrated below in FIG. 3.
  • Each of the detection indication units 204 may be configured to detect incident IR light from a LiDAR sensor, or other sensor (e.g., radar), and produce visible light upon detection of the incident IR light.
  • the alignment target 202 may further comprise a target 206 or alternatively referred to as a reference target 206, comprising a visible light diode configured to provide a reference target corresponding to a desired intersection location of the incident light from the sensor with the alignment target 202 to aid in calibration of the sensor.
  • the target 206 may additionally comprise a direct current (“DC”) voltage source 210, Von, such that the target LED 212 remains on during calibration of the LiDAR.
  • the target 206 may output or display a same or different color light than the visible light diodes 322 of the detection indication units 204, dependent on a specific color of LED 212 utilized which may be any visible color, such that the target 206 may be easily distinguishable by an operator from visible light diodes 322 of the detection indication units 204.
  • a location of the target 206 on the alignment target 202 corresponds to a desired point of intersection between a beam or measurement plane of a sensor with the alignment target 202, the desired point of intersection corresponding to a calibrated sensor.
  • the alignment target 202 may be positioned proximate a sensor 138 such that a well-calibrated sensor 138 on a robot 102 may comprise a measurement plane which intersects the alignment target 202 at the location of the reference target 206.
  • the alignment target 202 may comprise N detection indication units 204 arranged in a linear array as illustrated, wherein N may be any positive integer number greater than one and chosen based on practical limitations (e.g., desired size of the alignment target 202, spacing of each detection indication units 204, cost, etc.).
  • the target 206 may be positioned adjacent to any one of the plurality of detection indication units 204.
  • the alignment target 202 may comprise a power supply 208 such as, for example, batteries, a USB port (e.g., USB, micro-USB, USB-C, etc.), alternating current plug with a rectifier circuit, and the like.
  • the power supply 208 couples with connectors 201 which receive power from wired connections (e.g., from an external power source, such as a wall-plug, battery, generator, etc.).
  • Power indication diodes 203 may illuminate when power is received (e.g., the diodes 203 may turn on, change from red (power- off) to green (power-on), flash, blink, etc.).
  • the target 206 may be positioned substantially close to the linear array of the plurality of detection indication units 204 (e.g., within 1 cm), wherein, as illustrated in FIGS. 2A-2B, the spatial separation has been greatly exaggerated for clarity.
  • the alignment target 202 may be configured to receive IR light from a sensor at a substantially normal incident angle to the surface of the alignment target 202 (i.e., substantially normal to the plane of the page). Accordingly, the IR beams along the line of sight of the sensor plane may be detected by one or more detection indication units 204 to illuminate one or more visible light diodes 322, illustrated below in FIG. 3, such that an operator may visualize the location where the incident beam of the sensor intersects with the alignment target 202. That is, a single alignment target 202 may aid in visualizing a measurement plane of the sensor along one dimension or using a single point of intersection.
  • Two or more alignment targets 202 may be utilized by an operator to visualize a two-dimensional measurement plane of a LiDAR sensor, yielding an unconventional result wherein the pose of the LiDAR sensor on a device, such as a robot 102, may be accurately tuned manually, the visualized measurement plane providing the operator with instantaneous visual feedback regarding the exact location/orientation of the measurement plane of the LiDAR sensor, as further illustrated below in FIGS. 4A-B.
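  • A minimal sketch, assuming two alignment targets 202 separated by a known horizontal distance (the geometry and names are illustrative, not from this application), of how the discrepancies observed on each target relate to the tilt of the measurement plane:

```python
# The difference between the vertical discrepancies measured on two targets,
# divided by their horizontal separation, gives the tilt of the measurement
# plane along the axis joining the targets.
import math


def plane_tilt_deg(discrepancy_left_m: float,
                   discrepancy_right_m: float,
                   target_separation_m: float) -> float:
    rise = discrepancy_right_m - discrepancy_left_m
    return math.degrees(math.atan2(rise, target_separation_m))


# Beam on-target at the left target, 2 cm high at the right target, targets 1.5 m apart:
print(plane_tilt_deg(0.0, 0.02, 1.5))  # ~0.76 degrees of tilt
```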
  • an alignment target 202 may comprise a plurality of linear arrays of detection indication units 204 positioned parallel or perpendicular to the single linear array such that additional intersection points of a measurement plane of a LiDAR sensor with an alignment target 202 may be visualized by a human. That is, a single array of detection indication units 204 for an alignment target 202 is not intended to be limiting.
  • an alignment target 202 may further comprise a plurality of reference targets 206 arranged linearly and parallel to a linear array of detection indication units 204, wherein one reference target 206 may be chosen to be illuminated based on a desired point of intersection of a sensor beam with the alignment target 202.
  • a user interface, push buttons, or wired/wireless signal may be utilized to select which target 206 of the plurality of targets 206 to be illuminated based on a desired pose of a sensor being calibrated. Additional exemplary embodiments of an alignment target 202 are further illustrated below in FIG. 2C.
  • an alignment target 202 may comprise a plurality of detection indication units 204 positioned horizontally, wherein the vertical arrangement of the detection indication units 204 is not intended to be limiting.
  • the plurality of detection indication units 204 may alternatively be positioned at any angle (e.g., vertically, horizontally, or anywhere in between).
  • the alignment target 202 is configured to be light (e.g., less than 5 kg) to enable users to place the alignment targets 202 in any position and/or orientation useful for calibrating sensors on a robot 102.
  • an alignment target 202 may comprise any number of linear arrays of detection indication units 204, wherein use of a single linear array is not intended to be limiting, as illustrated in FIG. 2C(i).
  • multiple alignment targets 202 each comprising a single linear array may be utilized with inventive concepts disclosed herein, as illustrated in FIG. 4A-B below.
  • a target 206 of an alignment target 202 may comprise a designated detection indication unit 204 of the plurality of detection indication units 204.
  • the designated detection indication unit 204 (i.e., the target) may be one of the plurality of detection indication units 204 configured to act as the target 206 by illuminating a different visible color than the remaining detection indication units 204, as illustrated in FIG. 2C(iii).
  • This embodiment of an alignment target 202 may further comprise a user interface (e.g., push buttons, remote controller and receiver, etc.) which may adjust which detection indication unit 204 is designated as the target.
  • the alignment target 202 may comprise a controller and non-transitory memory, such as the processor 130 and memory 132 illustrated in FIG. 1B, configured to receive input from the detection indication units 204, the input comprising a logical 1 or 0 signal corresponding to a detection or no detection, respectively, of nonvisible light from a sensor for a given detection indication unit 204.
  • a spatial discrepancy 218 may be determined as illustrated below in FIG. 2B. This spatial discrepancy 218 may be further utilized to determine adjustments to the sensor, as further illustrated below in FIGS. 4-6.
  • the alignment target 202 may further comprise an input configured to adjust the brightness of the various LEDs 204 and 206.
  • a knob coupled to a potentiometer may be used to adjust the light intensity to a suitable level for human use.
  • Other inputs are equally applicable, such as push buttons (e.g., to step the intensity up or down), a slider, and/or modulation of the output light (e.g., using a pulse width modulated signal).
  • the alignment target 202 may comprise a transparent covering, such as one made from plastics, glass, or other transparent materials. It is appreciated that only the illustrated side of the alignment target 202 is required to include such transparent covering such that incident light from a LiDAR sensor 138 may be received by the detection indication units 204 and light emitted from the detection indication units 204 may be visible to humans.
  • FIG. 2B illustrates an alignment target 202 being utilized to visualize a measurement plane 214 of a LiDAR sensor 138 for calibration of the LiDAR sensor 138, according to an exemplary embodiment.
  • the LiDAR sensor 138 may be mounted on a device (not shown) using, for example, screws, servomotors, or bolts which may be adjusted to change a pose (i.e., orientation) of the LiDAR sensor 138.
  • a plurality of beams 140 may be emitted across measurement plane 214, as illustrated in FIG. 1C above, which intersect with the alignment target at a location indicated by an illuminated detection indication unit 204-0.
  • positioning of the LiDAR sensor 138 illustrated in FIG. 2B is representative and not limited to this particular configuration.
  • the LiDAR sensor 138 may be positioned and oriented in different configurations with respect to the alignment target 202.
  • a human calibrating the LiDAR sensor 138 may desire the measurement plane 214 to intersect the alignment target 202 at a location of target 206.
  • the human, or manual intervention, may position the alignment targets 202 such that the target 206 is at a location where the measurement plane of the LiDAR sensor 138 should intersect if the sensor 138 is well calibrated.
  • the human may calibrate the LiDAR sensor 138 by physically or electronically adjusting the pose of the LiDAR sensor on the device until the measurement plane 214 is at a desired measurement plane 216 (noted by dashed lines). By adjusting the LiDAR sensor 138, the measurement plane is essentially being adjusted from first measurement plane 214 to the second, or desired, measurement plane 216.
  • Measurement plane 216 may intersect the alignment target 202 at the target 206 location as indicated by a detection indication unit 204-C directly adjacent to the target 206, wherein detection indication unit 204-C, which is not illuminated in FIG. 2B, becomes illuminated once measurement plane 216 intersects the alignment target 202 at the target 206 location. This change in illumination of the detection indication unit 204-C may indicate to the operator that the LiDAR sensor 138 is properly calibrated.
  • an alignment target 202 may further comprise a microprocessor or controller configured to determine a discrepancy 218 between a target 206 (and adjacent detection indication unit 204-C) and a currently illuminated detection indication unit 204-0.
  • the discrepancy 218 may be measured parallel to the linear array of detection indication units 204 as illustrated and based on (i) a number of detection indication units 204 between an illuminated detection indication unit 204-0 and the target 206, (ii) a distance between the alignment target 202 and a sensor 138, and (iii) a spacing between adjacent detection indication units 204.
  • the microprocessor or controller may then determine adjustments to a pose of a sensor 138 based on the discrepancy 218, as illustrated in FIG. 6 below.
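As a minimal sketch of the computation just described (assuming a uniform 5 mm spacing between detection indication units 204 and hypothetical function names; the disclosure does not prescribe a particular implementation), the discrepancy 218 and a corresponding angular correction might be computed as follows:

```python
import math

def spatial_discrepancy_m(illuminated_index: int, target_index: int,
                          unit_spacing_m: float = 0.005) -> float:
    """Discrepancy 218 along the linear array: the signed number of detection
    indication units 204 between the illuminated unit and the unit adjacent to
    the target 206, multiplied by the unit spacing (5 mm is an assumption)."""
    return (illuminated_index - target_index) * unit_spacing_m

def angular_correction_deg(discrepancy_m: float, sensor_distance_m: float) -> float:
    """Tilt adjustment, in degrees, that would move the beam onto the target for
    an alignment target 202 placed a known distance from the sensor."""
    return math.degrees(math.atan2(discrepancy_m, sensor_distance_m))

# Beam lands four units below the target on a 5 mm pitch array placed 1 m away.
d_218 = spatial_discrepancy_m(illuminated_index=10, target_index=14)  # -0.02 m
print(angular_correction_deg(d_218, sensor_distance_m=1.0))           # about -1.15 degrees
```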
  • the target 206 may be replaced with a designated detection indication unit 204 of the plurality of detection indication units 204 (e.g., detection indication unit 204-C), wherein the microprocessor or controller may receive data comprising which detection indication unit 204 is designated as a target.
  • solid black squares representing detection indication units 204 correspond to a detection indication unit 204 detecting no incident nonvisible light (e.g., detection indication unit 204-C illustrated in FIG. 2B), whereas empty, white or non-black squares representing detection indication units 204 may correspond to the detection indication units 204 detecting incident nonvisible light from a sensor (e.g., illuminated detection indication unit 204-0 illustrated in FIG. 2B).
  • Targets 206 represented with empty squares correspond to a currently active target (i.e., a target 206 comprising an illuminated LED 212), wherein an alignment target 202 may comprise a plurality of other targets 206 which are disabled and/or not illustrated for clarity. That is, black squares indicate no visible light being emitted from a component (e.g., 204/206) while white squares indicate visible light being emitted from a component.
  • FIGS. 2C(i)-(iii) illustrate three additional non-limiting exemplary embodiments of an alignment target 202.
  • FIG. 2C(i) illustrates an alignment target 202(i) comprising two linear arrays of detection indication units 204 and two separate targets 206 as illustrated.
  • a planar LiDAR sensor may illuminate both linear arrays at points indicated by detection indication units 204-0, illustrated with empty boxes, thereby providing an additional data point from which adjustments to a pose of the planar LiDAR may be determined for calibration, wherein the adjustments may be determined by both discrepancies 218.
  • the two linear arrays may comprise a same or differing number of detection indication units 204 positioned in parallel to each other, as illustrated, or at an angle.
  • the two discrepancies 218 shown in FIG. 2C(i) may be of same or different values.
  • Use of two linear arrays of detection indication units 204 may enable an operator to visualize a measurement plane of a LiDAR sensor by providing two of the three spatial data points required to define the measurement plane, the third point, which defines the plane, being the position of the LiDAR sensor itself.
  • an operator may visualize that the measurement plane of the LiDAR sensor intersects the alignment target 202(i) below the targets 206, as shown by discrepancies 218.
  • the illuminated detection indication units 204-0 illustrated communicate to the operator that the LiDAR sensor comprises an incorrect orientation (i.e., rotation), as shown by the discrepancies 218 comprising unequal magnitudes while the targets 206 are configured substantially horizontal with respect to each other.
  • alignment target 202(ii) shown in FIG. 2C(ii) may comprise a linear array of detection indication units 204 and a linear array of targets 206, wherein an active target 206-0 (i.e., a target 206 currently being utilized comprising an illuminated LED 212) may be chosen from the plurality of targets 206 illustrated.
  • the linear array of targets 206 being configured parallel to the linear array of detection indication units 204.
  • a user may designate which target 206 of the plurality may be the active target 206-0 using buttons 220, or other similar input.
  • the buttons 220 comprising up and down buttons configured to move the active target 206-0 up or down, respectively, by one space along the linear array of targets 206.
  • This embodiment may enable a single alignment target 202(ii) to be utilized to calibrate multiple different sensors (e.g., LiDAR sensors 138) by simply adjusting a position of the active target 206-0, provided the sensors utilize a similar wavelength of light (i.e., within a passband of a photodiode 302).
  • the positioning of the active target 206-0 may be determined by a processor of the alignment target 202(ii) or a separate processor communicatively coupled to the alignment target 202(ii) (e.g., as shown in FIG. 6).
  • discrepancy 218, comprising a spatial discrepancy between the active target 206-0 and detection indication unit 204-0 receiving incident light from a LiDAR sensor, may be determined as illustrated.
  • Discrepancy 218 may indicate to the operator viewing the alignment target 202(ii) that the current positioning of a measurement plane of the LiDAR sensor intersects the alignment target 202(ii) at a location lower than the target or desired location (indicated by active target 206-0).
  • alignment target 202(iii), as shown in FIG. 2C(iii), may comprise a single linear array of detection indication units 204 and no stand-alone targets 206.
  • one detection indication unit 204-T may be designated as the target, wherein discrepancy 218 may be measured from the designated detection indication unit 204-T and a detection indication unit 204-0 outputting a logical high detection indication signal 320, illustrated in FIG. 3.
  • the designated target 204-T may output a same or different color than any illuminated detection indication unit 204 (e.g., 204-T may output a green color while 204-0 may output a red color).
  • a processor may be added or communicatively coupled to the alignment target 202(iii) to set the target detection indication unit 204-T and measure the discrepancy 218 as illustrated based on a number of detection indication units 204 between units 204-T and 204-0 as well as the spacing between adjacent detection indication units 204 (e.g., 5 millimeters).
  • push buttons 220 may be utilized to step the position of the target detection indication unit 204-T up or down the linear array.
  • the alignment target 202(iii) may enable configurability of a target 204-T, similar to configurability of target 206-0 of FIG. 2C(ii), while occupying less space on a printed circuit board and/or requiring fewer components to manufacture.
  • Any alignment target 202 comprising a single linear array of detection indication units 204 and a single target 206 (e.g., as depicted above in FIG. 2A) illustrated herein with respect to FIGS. 3-6 is not intended to be limiting to the illustrated embodiment. Additionally, an alignment target 202 may include, but is not limited to, any combination of features illustrated in FIGS. 2C(i)-(iii) (e.g., two linear arrays of detection indication units 204 with no stand-alone targets 206).
  • the configuration of the alignment target 202 utilized may be chosen by the operator based on parameters such as, for example, cost, size, complexity of operation, number of alignment targets 202 used, and/or power consumption.
  • an alignment target 202 may comprise either an analog circuit (i.e., comprising no processor or memory) or a digital system (i.e., comprising a processor and memory) configured to receive logical outputs (i.e., 0 or 1) from the detection indication units 204 corresponding to the location of intersection rather than producing visible light outputs.
  • the digital system may include the alignment target 202 further comprising a specialized processor and non-transitory memory configured to determine a spatial discrepancy 218 between the location of intersection 204-0 and a target 206 based on a detection indication signal 320 (illustrated below in FIG. 3), wherein the target 206 may comprise a designated one of the plurality of detection indication units 204 (e.g., 204-C) rather than a stand-alone target (e.g., target 206 as illustrated) in this embodiment.
  • the processor may utilize the spatial discrepancy 218 to determine any adjustments to a pose of the sensor to configure a beam of the sensor to intersect with the alignment target 202 at a desired location.
  • in embodiments where a processor is to determine and perform adjustments to the sensor, use of visible light diodes for the target 206 and detection indication units 204 may be a redundant feature.
  • a digital system configured to measure the spatial discrepancy 218 and determine adjustments to a pose of a sensor is further illustrated below in FIG. 6. That is, feedback provided by alignment targets 202 for use in calibrating a sensor on a device is not intended to be limited to visible light emitting diodes providing feedback to a human for manual calibration of the sensor.
  • FIG. 3 illustrates a detection indication unit 204, and components thereof, according to an exemplary embodiment.
  • the detection indication unit 204 may first comprise a nonvisible light photodiode 302 configured to, upon receipt of incident nonvisible light 304 (e.g., infrared (IR) or ultraviolet (UV) light) from a sensor (e.g., a LiDAR sensor), induce an output voltage configured to pull up a reference voltage 306 Vref on a line 310, wherein the reference voltage 306 may comprise a constant DC voltage supplied by a DC source located on the alignment target 202, such as power supply 208.
  • Component 308 comprises a resistor with an impedance such that, if no light 304 is detected by photodiode 302, the voltage on line 312 is zero due to a voltage across the photodiode 302 being zero. Similarly, if the light 304 is present and detected by photodiode 302, the voltage on line 312 is an ‘on’ voltage of the photodiode (e.g., 0.7 volts).
  • Line 312, as well as lines 310, 316, and 320 illustrated in FIG. 3 may be illustrative of PCB traces, wirings, or other low-impedance electrical means of transmitting electrical power or a voltage potential from one component to another.
  • the voltage of line 312 may be an input to an op-amp 314, or other similar amplifying circuit, such that a voltage difference between Vref of line 310 and the voltage of line 312, ΔV, may be amplified based on a gain of the op-amp (i.e., Gain × ΔV for an ideal op-amp).
  • the value of the gain of the op-amp may be chosen based on, for example, a value of Vref, power consumption of the circuit, desired output voltage range of line 316, and/or other design choices which may be readily discernable by one skilled in the art.
  • the amplified differential voltage output 316 may be passed to a threshold logic component 318 comprising a comparator circuit configured to output a logical high or low detection indication signal 320 (i.e., logical 1 or 0) to power a visible light diode 322.
  • a logical high detection indication signal 320 may correspond to detection of the nonvisible light 304 by the photodiode 302 and a logical low detection indication signal 320 may correspond to no nonvisible light 304 being detected by the photodiode 302, wherein the detection indication signal 320 may be determined based on the output 316 from the amplifier 314 exceeding or falling below a threshold voltage level.
  • the voltage value of the logical high detection indication signal 320 may comprise a turn on voltage, or slightly larger voltage, of the visible light diode 322 and the logical low detection indication signal 320 voltage value may comprise a voltage lower than a turn on voltage of the visible light diode 322 (e.g., 0 volts).
  • the visible light diode 322 may output visible light 324 to indicate to an operator that nonvisible light 304 from the sensor is received by the photodiode 302.
  • the visible light diode 322 may remain off and produce no visible light corresponding to no nonvisible light 304 being detected by the photodiode 302.
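A behavioral sketch of the FIG. 3 signal chain described above (not a circuit-accurate model; the gain, threshold, and reference values are illustrative assumptions, with the reference on line 310 treated as 0 V as in the grounded variant noted below):

```python
def detection_indication_signal(v_line_312: float, v_ref: float = 0.0,
                                gain: float = 10.0, v_threshold: float = 3.5) -> int:
    """Behavioral sketch of amplifier 314 and threshold logic 318: with no
    incident light 304 the photodiode 302 leaves line 312 near 0 V; with light,
    line 312 sits near the photodiode 'on' voltage (~0.7 V). The amplified
    difference (line 316) is compared against a threshold to produce the
    detection indication signal 320 (1 = detection, 0 = no detection)."""
    v_line_316 = gain * abs(v_line_312 - v_ref)   # idealized amplifier output
    return 1 if v_line_316 > v_threshold else 0   # comparator / threshold logic 318

def visible_diode_drive_v(signal_320: int, led_turn_on_v: float = 2.0) -> float:
    """Logical high supplies at least the turn-on voltage of visible light diode
    322 (emitting light 324); logical low supplies roughly 0 V."""
    return led_turn_on_v if signal_320 else 0.0

print(detection_indication_signal(0.7))  # 1 -> diode 322 on
print(detection_indication_signal(0.0))  # 0 -> diode 322 off
```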
  • a detection indication unit 204 may additionally comprise a plurality of additional circuit components not illustrated in FIG. 3 such as, for example, coupling and/or bypass capacitors, current limiting resistors (e.g., to limit brightness of light 324 emitted from diode 322), supply voltage circuits for Vref and the amplifier 314, Zener diodes, metal-oxide semiconductor (MOS) devices/circuits, and the like, all being well understood within the art.
  • some circuit components illustrated in FIG. 3 may be omitted or changed in some embodiments (e.g., amplifier 314, resistor 308, line 310 may be connected to ground, etc.) without significantly altering the function and purpose of the detection indication unit 204.
  • the photodiode 302 may be configured to be sensitive to any wavelength of incident light 304 such as, for example, ultraviolet, near IR, IR, or microwave, wherein a choice of photodiode 302 may depend on a wavelength of a sensor being calibrated.
  • the spatial positioning of the photodiode 302 and the visible light diode 322 may be configured on, e.g., a PCB such that the two diodes 302 and 322 are in substantially similar locations (e.g., up to 0.5, 1, 5, or 10 cm) to provide human users of an alignment target 202 with an accurate intersection location between a measurement plane of a LiDAR sensor 138 and the alignment target 202.
  • line 320 may further include a variable resistor, such as a potentiometer, configured to adjust the brightness of the visible light 324 emitted by the visible light diode 322.
  • the variable resistor may be coupled to a knob, switch, slider, or other tactile input to enable a human operator to adjust the output brightness of the diode 322.
  • a detection indication signal 320 corresponding to detection or no detection of nonvisible light 304 by a photodiode 302 may be communicated to a processor or microcontroller using, for example, a register (e.g., flip flop, capacitor, etc.).
  • the value stored in the register (i.e., logical 1 or 0) may be read by the processor to determine the location of intersection of the nonvisible light with the alignment target 202.
  • the processor may then utilize the determined location of intersection (i.e., which output 320 of the plurality of detection indication units 204 is logical 1 or high) to calculate a spatial discrepancy 218 between the location of intersection (e.g., 204-0) and a target (e.g., detection indication unit 204-C or separate target 206) to determine adjustments to a pose of the sensor based on minimizing the spatial discrepancy 218, as illustrated below in FIG. 4B and FIG. 6.
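A sketch of how such a processor might scan the per-unit registers to locate the intersection point (function and variable names are hypothetical); if beam spread illuminates two or three adjacent units, as discussed with respect to FIG. 4B below, the centroid of the lit indices may be used:

```python
from typing import Optional, Sequence

def intersection_index(detection_registers: Sequence[int]) -> Optional[float]:
    """Return the (possibly fractional) index of the intersection point from the
    per-unit registers (logical 1 = nonvisible light 304 detected). If several
    adjacent units are lit due to beam spread, use their centroid; return None
    when no unit detects light."""
    lit = [i for i, bit in enumerate(detection_registers) if bit]
    if not lit:
        return None
    return sum(lit) / len(lit)

# Units 6 and 7 of a 16-unit array detect the beam -> intersection at index 6.5.
registers = [0] * 16
registers[6] = registers[7] = 1
print(intersection_index(registers))  # 6.5
```

The returned index, compared against the index designated as the target, yields the unit count used in the discrepancy computation sketched earlier.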
  • FIG. 4A illustrates a top view of a LiDAR sensor 402 and three alignment targets 202 positioned at a known fixed distance d from the sensor 402.
  • the alignment targets 202 being used by an operator to determine alignment or pose of the LiDAR sensor 402 (i.e., calibrate the LiDAR sensor 402) and measurement plane thereof, according to an exemplary embodiment.
  • the LiDAR sensor 402 may be mounted on a device (not shown), such as a robot 102, and the mounting (i.e., pose of the LiDAR sensor 402 on the device) may be adjustable by the operator.
  • LiDAR sensor 402 may include a LiDAR sensor 138 shown and described in FIG. 1C above.
  • the alignment targets 202 may be positioned in an environment configured for calibrating LiDAR sensors 402 of a robot 102. Operators of robots 102 may desire the LiDAR sensors 402 to be configured onto the robot 102 at a specific (x, y, z, yaw, pitch, roll) position.
  • the environment may include a location where a robot 102, including the LiDAR sensors 402, may be fixed.
  • the three alignment targets 202 may be positioned within the environment such that the reference targets 206 thereof intersect the measurement plane of the LiDAR sensor 402 when the LiDAR sensor 402 is well calibrated.
  • Three points are required to define a plane, wherein the three points used by operators to determine the current measurement plane of the LiDAR sensor 402 may be visualized by detection indication units 204 which are illuminated (e.g., as shown in FIG. 2B). In some instances, the sensor 402 may be the third point which defines the measurement plane.
  • the three reference targets 206 denote the ideal, or well-calibrated, measurement plane of the LiDAR sensor 402. To calibrate the LiDAR sensor 402, the operator may adjust the position of the LiDAR sensor 402 such that the illuminated detection indication units 204 align with the reference targets 206 for all three alignment targets 202.
  • the operator may, for example, desire a measurement plane of the LiDAR sensor 402 to be parallel to a flat floor below the LiDAR sensor 402 at a specific height above the floor. Accordingly, the heights of the reference targets 206 of the three alignment targets 202 are positioned at the desired height above the flat floor.
  • the operator may activate the LiDAR sensor 402 to send measurement beams 404 of IR light in all directions across a field of view along the measurement plane, wherein some of the beams 404 are illustrated for clarity.
  • the beams 404 may configure IR detection diodes 302 of a detection indication unit 204 of each respective alignment target 202, as illustrated in FIG. 3 above, to generate an output voltage which configures corresponding threshold logic units 318 to activate visible light diodes 322, indicating that a beam 404 has been detected by one or more of the IR detection diodes 302. It is appreciated that use of a LiDAR sensor 402 is not intended to be limiting, wherein the sensor 402 may comprise any sensor which utilizes nonvisible light to sense an environment along a plane (e.g., radar).
  • FIG. 4B illustrates a side view of the LiDAR sensor 402 and three alignment targets 202, according to the exemplary embodiment illustrated in FIG. 4A above.
  • the LiDAR sensor 402 may emit a plurality of beams 404, wherein the three beams 404 illustrated comprise beams which are incident upon the three alignment targets 202.
  • some detection indication units 204 located at intersection points of the plurality of beams 404 with alignment targets 202 may illuminate (i.e., illuminate LED 322 shown in FIG. 3), as illustrated by empty squares, indicating a location of intersection of a beam 404 with an alignment target 202.
  • Target diodes 206 of the three respective alignment targets 202-L, 202-R, 202-C are positioned at a constant height h from the floor such that a measurement plane intersecting the alignment targets 202 at the locations of all target diodes 206 corresponds to a measurement plane parallel to the flat floor at the height h.
  • the LiDAR sensor 402 may be mounted on a chassis of a device (e.g., robot 102) in an incorrect pose such that a measurement plane formed by beams 404 is not parallel to the floor as shown, wherein the alignment targets 202 may be utilized to correct the pose of the LiDAR sensor 402.
  • the leftmost alignment target 202-L comprises a target 206 at the height h from the floor, wherein an illuminated detection indication unit 204 (white square) may comprise a discrepancy l from the target 206 (i.e., from a detection indication unit 204 adjacent to the target 206) due to the improper pose of the LiDAR sensor 402 as illustrated.
  • the rightmost alignment target 202-R may comprise a target 206 at the height h, wherein the illuminated detection indication unit 204 corresponds to the location where a beam 404 is detected or is incident on the alignment target 202-R.
  • the illuminated detection indication unit 204 yields a discrepancy r from the target 206 for the right alignment target 202-R.
  • the center alignment target 202-C may comprise no or negligible discrepancy between the target 206 and illuminated visible light diodes 322 corresponding to the forward beam 404 being aligned with the target 206 (i.e., no error), as illustrated by both the target 206 and adjacent detection indication unit 204 being illuminated.
  • if there is a discrepancy in the center alignment target 202-C, such discrepancy would be reflected similarly to the discrepancies shown for the right and left alignment targets 202-R, 202-L.
  • an operator may adjust a mounting of the LiDAR sensor 402 along a roll axis illustrated such that the discrepancies l and r become zero, corresponding to the measurement plane of the LiDAR sensor being parallel to the floor and at constant height h from the floor.
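A rough sketch of how the roll error implied by the discrepancies l and r might be quantified; the lateral separation between alignment targets 202-L and 202-R is an assumed setup parameter not specified above, and the sign convention is illustrative:

```python
import math

def roll_error_deg(l_m: float, r_m: float, lateral_baseline_m: float) -> float:
    """Estimate the roll error of the measurement plane from the left and right
    discrepancies l and r (positive = beam above its target), given the assumed
    lateral separation between alignment targets 202-L and 202-R."""
    return math.degrees(math.atan2(l_m - r_m, lateral_baseline_m))

# Beam 10 mm high on the left and 10 mm low on the right, targets 2 m apart.
print(roll_error_deg(0.010, -0.010, 2.0))  # about 0.57 degrees of roll to correct
```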
  • once the discrepancies l and r become zero, detection indication units 204 directly adjacent to respective targets 206 of the left and right alignment targets 202-L and 202-R may both be illuminated, similar to the target 206 and illuminated detection indication unit 204 of the center alignment target 202-C.
  • detection indication units 204 directly adjacent to respective targets 206 of the three alignment targets 202-L, 202-C, 202-R will illuminate. This corresponds to the angular pose of the LiDAR sensor 402 being configured correctly; however the pose may still comprise a discrepancy in translational coordinates (i.e., x and y). Accordingly, distance measurements collected by beams 404 of LiDAR sensor 402 may be utilized to verify the translational position of the LiDAR sensor 402 is correct.
  • distance measurements between the LiDAR sensor 402 and targets 202-L, 202-C, 202-R may be verified to comprise the distance d.
  • the three alignment targets 202-L, 202-C, 202-R may each be positioned at different distances dL, dC, and dR, wherein positioning all three alignment targets 202-L, 202-C, 202-R at a constant distance d from the LiDAR sensor 402 is not intended to be limiting.
  • translational position of the LiDAR sensor 402 may not be configurable due to a specific configuration of screws, bolts, latches, etc. used to couple the LiDAR sensor 402 to a device.
  • a minimum of two alignment targets 202 may be utilized, as the visible light diodes 322 of each corresponding alignment target 202 (202-L, 202-R, 202-C) may define a point on the measurement plane, wherein three points comprise a minimum number of points to define a plane and one of the points comprises the LiDAR sensor 402 itself.
  • Exact position of the LiDAR sensor 402 may be unknown to the operator during calibration, wherein use of three alignment targets 202 to yield three points of the measurement plane may further enhance visualization of the measurement plane for the operator without relying on a known position of the LiDAR sensor 402, thereby enhancing calibration speed and capabilities of the operator.
  • the position of the targets 206 of each corresponding alignment target 202 may be set based on a desired orientation (i.e., (x, y, z, roll, pitch, yaw)) of the LiDAR sensor 402 on the device and/or a field of view of the LiDAR sensor 402, wherein the positions of the targets 206 illustrated is not intended to be limiting.
  • the leftmost target 206 of alignment target 202- L may be at a higher height as compared to a target 206 of alignment target 202-R for a LiDAR sensor measuring at a slant angle.
  • although three alignment targets (202-L, 202-R, 202-C) are illustrated, more or fewer alignment targets may be employed to practice the inventive concepts disclosed herein.
  • multiple (i.e., 2-3) detection indication units 204 may illuminate upon activation of the LiDAR sensor 402 due to spreading of the beams 404 (e.g., point spread through aperture of sensor 402, natural scattering in air, etc.) which may cause multiple photodiodes 302 to receive light from the LiDAR sensor 402 and illuminate multiple visual light diodes 322 at the same time.
  • the distance d may be chosen to comprise a reasonable distance from the LiDAR sensor 402, wherein spreading effects are minimal (e.g., spreading effects may illuminate at most 2-3 detection indication units 204 with nonvisible light 404), however reducing d substantially may reduce an angular resolution of the linear array of detection indication units 204.
  • the distance d may therefore be chosen to be, for example, 0.5 to 2 meters. As shown in FIG. 4B, distance d corresponds to the distance from the LiDAR sensor to the respective alignment targets (202-L, 202-R, 202-C).
  • use of visible light diodes 322 of detection indication units 204 to indicate a position of an IR beam 404 from a LiDAR sensor, or any other nonvisible light sensor, may enhance human abilities to manually adjust a LiDAR sensor 402 on a device by enabling a human to visualize the IR beams 404 which are not visible to human eyes. Additionally, as the human is performing adjustments to the LiDAR sensor 402, the human receives instant visual feedback from the visible light diodes 322 of a current pose of the measurement plane of the LiDAR sensor 402, thereby further enhancing accuracy and efficiency of the human manually adjusting the LiDAR sensor 402 to a desired pose.
  • the accuracy of the manual adjustment may depend on the spacing between adjacent visible light diodes 322 as well as a distance d between an alignment target 202 and the LiDAR sensor 402. As an example, alignment targets 202 at a distance of 1 meter from the LiDAR sensor 402 comprising visible light diodes 322 separated vertically by 5 millimeters may enable the human to adjust an angle of the sensor with an angular precision of approximately 0.29° (i.e., arctan(0.005/1), neglecting spreading effects), vastly more precise than an unaided human.
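The figure above follows from simple trigonometry; a quick check under the stated assumptions (5 mm diode spacing, 1 m range):

```python
import math

spacing_m, distance_m = 0.005, 1.0
precision_deg = math.degrees(math.atan(spacing_m / distance_m))
print(round(precision_deg, 3))  # 0.286 degrees per detection indication unit
```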
  • use of targets 206 positioned at specific locations on each alignment target 202 may provide an additional reference target for the human during adjustment of the LiDAR sensor 402.
  • FIG. 5 illustrates a method 500 for a human operator of a device such as a robotic system (e.g., robot 102 of FIG. 1A) comprising a LiDAR sensor, to calibrate the LiDAR sensor using at least one alignment target 202, according to an exemplary embodiment.
  • the LiDAR sensor described in method 500 may comprise a planar LiDAR, as illustrated in FIG. 1C, or a line of sight LiDAR (i.e., configured to measure distance along a one dimensional line of sight).
  • Block 502 comprises the human operator positioning the at least one alignment target 202 such that the positions of the reference target 206 diodes are configured in known positions corresponding to a desired configuration of a measurement plane of a LiDAR sensor.
  • the alignment target 202 being positioned within a field of view of the LiDAR sensor.
  • the known position may comprise a distance from the LiDAR sensor to the respective targets 206 of the alignment targets 202, as illustrated by distance d and height h of FIG. 4B above.
  • Target diodes 206 of the at least one alignment targets 202 may be at fixed and known locations on the at least one alignment targets 202, wherein the at least one alignment targets 202 may be positioned by aligning the target diodes 206 at desired intersection points of the LiDAR sensor beams 404 and at least one alignment targets 202.
  • Block 504 comprises the human operator activating the LiDAR sensor.
  • the LiDAR sensor may be activated via, e.g., a user interface 112 of a robot 102 which configures the robot 102 into a calibration mode.
  • the calibration mode may cause the controller 118 of the robot 102 to activate one or more LiDAR sensors 114 based on a user input to the user interface 112.
  • the user interface 112 may provide a menu comprising a plurality of user-selectable options (e.g., “calibrate LiDAR 1”, “calibrate LiDAR 2”, etc.) which enable the human operator to unit-test each LiDAR sensor individually.
  • the LiDAR sensor may emit a beam along a fixed line of sight incident on a single alignment target 202 or may emit a plurality of beams along a measurement plane incident on multiple alignment targets 202.
  • Block 506 comprises the human operator determining if the illuminated detection indication units 204 of the at least one alignment targets 202 match the target diodes 206 of each respective alignment target 202, wherein matching corresponds to a detection indication unit 204 directly adjacent to a target 206 being illuminated. “Adjacent” being along the direction orthogonal to the line formed by the array of detection indication units 204.
  • this step may be performed quickly by the human operator as the visible light emitted by the visible light diodes 322 of detection indication units 204 provides rapid feedback of a current pose of the measurement line of sight or plane and therefore a pose of the LiDAR sensor.
  • if the illuminated detection indication units 204 match the respective target diodes 206, the human operator may determine the LiDAR sensor is sufficiently calibrated.
  • if the illuminated detection indication units 204 do not match the respective target diodes 206, the human operator may move to block 508.
  • Block 508 comprises the human operator performing an adjustment to a mounting of the LiDAR sensor.
  • the adjustment may be performed by adjusting one or more screws, bolts, latches, etc. such that x, y, z, yaw, pitch, and/or roll of the LiDAR sensor is modified.
  • the adjustments performed in block 508 may be illustrative of small changes to the pose of the LiDAR sensor, wherein blocks 506 and 508 in conjunction are illustrative of an iterative process of checking that the measurement plane of the LiDAR sensor corresponds to the at least one target diode 206, adjusting the mounting of the LiDAR sensor if it does not, and iterating until the illuminated visible light diodes 322 of the at least one alignment targets 202 match the at least one target diode 206 of the respective alignment targets 202.
  • the adjustments being performed are based on a spatial discrepancy between illuminated visible light diodes 322 and a corresponding target diode 206 (e.g., discrepancies 218 of FIG. 2B).
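The check-and-adjust iteration of blocks 506 and 508, whether carried out by the operator or automated as in FIG. 6 below, can be summarized by a small loop; this is a sketch with hypothetical callbacks, and the step size and tolerance are assumptions:

```python
import math

def calibrate(read_discrepancy_m, apply_tilt_correction_deg,
              tolerance_m: float = 0.005, gain_deg_per_m: float = 57.3,
              max_iterations: int = 50) -> bool:
    """Iteratively re-check the discrepancy (block 506) and adjust the mounting
    (block 508) until the illuminated unit matches the target diode 206 to
    within one unit spacing. The callbacks are hypothetical stand-ins for
    reading the alignment target and turning the mounting screws or servos;
    the gain assumes a target roughly 1 m from the sensor."""
    for _ in range(max_iterations):
        d = read_discrepancy_m()
        if abs(d) <= tolerance_m:
            return True                                   # matches target 206
        apply_tilt_correction_deg(-gain_deg_per_m * d)    # proportional correction
    return False

# Minimal simulated run: the 'sensor' starts about 1.15 degrees off at 1 m range.
state = {"angle_deg": -1.15}
ok = calibrate(lambda: math.tan(math.radians(state["angle_deg"])),
               lambda delta: state.__setitem__("angle_deg", state["angle_deg"] + delta))
print(ok, round(state["angle_deg"], 3))  # True 0.0
```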
  • the human operator may further verify the distance measurements collected by the LiDAR sensor correspond to the known position of the at least one alignment targets 202.
  • the human operator may, with respect to FIG. 4B above, verify distance measurements collected by the LiDAR sensor 402 comprise the distance d to all three alignment targets 202-L, 202-C, 202-R, or other pre-determined distance values.
  • adjusting of a translational position of the LiDAR sensor may not be possible (e.g., due to a configuration of screws, bolts, latches, etc. used to couple the LiDAR sensor to a device), wherein this additional step of verifying distance measurements is not intended to be limiting.
  • use of visible light diodes 322 to indicate a deviation from a desired alignment of a LiDAR sensor, defined by target diodes 206, utilizes a natural ability of humans to recognize patterns in the alignment adjustments performed in block 508 to calibrate the LiDAR sensor.
  • the degree or measure of adjustment needed to be made by the operator is a direct reflection of the corresponding visual light diode 322 of a respective detection indication unit 204 being activated, which enhances the ability of the operator to determine the needed adjustments by visualizing the measurements of the LiDAR sensor.
  • steps illustrated in blocks 506 and 508 may be performed by a separate microcontroller or processor configured to determine a discrepancy between a target diode 206 and illuminated visible light diodes 322 of each alignment target 202. These discrepancies may be utilized by the microcontroller or processor to determine a pose of the sensor and any adjustments to the pose of the sensor required to achieve a desired (i.e., well calibrated) pose of the sensor, as illustrated next in FIG. 6. For example, a position of a LiDAR sensor on a robot 102 may be adjusted by controller 118 issuing signals to actuator units 108 coupled to the LiDAR sensor.
  • FIG. 6 is a functional block diagram of a system configured to utilize alignment targets 202 and a processing unit 602 to determine a pose of a sensor 402 and required adjustments to the pose, according to an exemplary embodiment.
  • As illustrated in FIGS. 4A-B, spatial discrepancies between illuminated visible light diodes 322 (i.e., detection indication units 204) and a target diode 206 on the alignment targets 202 may be determined (i.e., values l and r illustrated above). These discrepancy values may be measured by the alignment targets 202 and communicated to a processing unit 602 via communications 606 comprising wired or wireless communication.
  • Processing unit 602 may be illustrative of a processor and non-transitory computer readable memory, as illustrated above in FIG. 1B.
  • the processing unit 602 may perform an optimization algorithm (e.g., gradient descent, least squares, etc.) to determine a current pose of the sensor based on the spatial discrepancies measured by the alignment targets 202. Using the determined pose, the processor 602 may output adjustment instructions 608 to an adjustment unit 604.
  • the adjustment unit 604 may comprise, for example, a user interface configured to provide instructions for an operator to manually adjust the sensor 402 to a desired pose (e.g., turn top left screw by 5°) or servomotors configured to adjust the pose of the sensor on the device in response to a control signal of instructions 608.
  • the adjustment unit 604 may be either directly coupled to the device of the sensor 402 or, alternatively, the adjustment unit 604 may be coupled to the device via a wired or wireless communication link.
  • the adjustment unit 604 may be illustrative of a user interface unit 112 of a robot 102, a microcontroller controlling actuators which may change a pose of the sensor, a virtual reality output (e.g., configured to enable an operator to visualize a measurement plane in virtual reality), and so forth.
  • the processing unit 602 may collect new discrepancy data, via communications 606 from the alignment targets 202, and utilize the new discrepancy data as a feedback loop to determine optimal pose adjustments to the sensor 402.
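One way the processing unit 602 might implement the optimization mentioned above is an ordinary least-squares fit of a small offset/tilt model to the discrepancies reported over communications 606; the target layout, parameter names, and model are illustrative assumptions rather than the required implementation:

```python
import numpy as np

def estimate_plane_error(target_xy_m: np.ndarray, discrepancies_m: np.ndarray):
    """Least-squares sketch: model each vertical discrepancy e_i measured at an
    alignment target located at (x_i, y_i) in the sensor's horizontal frame as
        e_i ~= dz + a*x_i + b*y_i
    where dz is a height offset of the measurement plane, a is the lateral
    slope (roll) and b is the forward slope (pitch). Returns (dz, roll_deg,
    pitch_deg)."""
    A = np.column_stack([np.ones(len(target_xy_m)), target_xy_m])
    (dz, a, b), *_ = np.linalg.lstsq(A, discrepancies_m, rcond=None)
    return float(dz), float(np.degrees(np.arctan(a))), float(np.degrees(np.arctan(b)))

# Targets left, center (slightly farther), and right; beam low on the left, high on the right.
xy = np.array([[-1.0, 1.0], [0.0, 1.5], [1.0, 1.0]])
e = np.array([-0.010, 0.000, 0.010])
print(estimate_plane_error(xy, e))  # roughly (0.0, 0.57, 0.0)
```

The resulting offset and tilt estimates can then be mapped to instructions 608, e.g., servomotor commands or human-readable prompts issued through adjustment unit 604.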
  • alignment targets 202 may be utilized to calibrate a pose of the sensor 402, wherein additional alignment targets 202 may improve the accuracy of adjustments to the pose of the sensor 402 determined by the processing unit 602.
  • the positions of the alignment targets 202 illustrated are not intended to be limiting.
  • a single alignment target 202 may be moved across a measurement plane of the sensor 402 (e.g., in an arc at a constant distance from the sensor 402) to collect a discrepancy measurement as a function of angle or spatial position relative to the sensor 402.
  • This function may additionally be utilized by processing unit 602 to determine a pose of the sensor 402 and any adjustments to the pose to achieve a desired pose of the sensor 402, as appreciated by one skilled in the art. It is further appreciated that use of a system illustrated in FIG. 6, wherein a separate processing unit determines a pose and adjustments to the pose of the sensor 402 based on locations of the incident beams on each alignment target 202, may replace the visible light diodes 322 with other threshold detection logic corresponding to nonvisible light being received by a respective photodiode 302. That is, if a processing unit 602 performs the discrepancy measurements and pose estimation of the sensor 402, use of visible light diodes may be redundant, as humans may not be required to analyze the discrepancies for calibration of the sensor 402.
  • the term “including” should be read to mean “including, without limitation,” “including but not limited to,” or the like;
  • the term “comprising” as used herein is synonymous with “including,” “containing,” or “characterized by,” and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps;
  • the term “having” should be interpreted as “having at least;” the term “such as” should be interpreted as “such as, without limitation;” the term “includes” should be interpreted as “includes but is not limited to;” the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof, and should be interpreted as “example, but without limitation;” adjectives such as “known,” “normal,” “standard,” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass known, normal, or standard technologies that may be available or known now or at any time in the future.
  • a group of items linked with the conjunction “and” should not be read as requiring that each and every one of those items be present in the grouping, but rather should be read as “and/or” unless expressly stated otherwise.
  • a group of items linked with the conjunction “or” should not be read as requiring mutual exclusivity among that group, but rather should be read as “and/or” unless expressly stated otherwise.
  • the terms “about” or “approximate” and the like are synonymous and are used to indicate that the value modified by the term has an understood range associated with it, where the range may be ±20%, ±15%, ±10%, ±5%, or ±1%.
  • where a result (e.g., a measurement value) is described as being close to a value, close may mean, for example, the result is within 80% of the value, within 90% of the value, within 95% of the value, or within 99% of the value.
  • the terms “defined” or “determined” may include “predefined” or “predetermined” and/or otherwise determined values, conditions, thresholds, measurements, and the like.


Abstract

Systems and methods for calibrating nonvisible light emitting sensors using alignment targets are disclosed herein. According to at least one non-limiting exemplary embodiment, an alignment target may comprise an array of detection indication units configured to detect incident nonvisible light from a sensor, the detection indication units being configured to display a location of intersection between the incident nonvisible light and the alignment target. The location of intersection may be utilized by a human to calibrate the sensor based on visual light feedback from the alignment target or utilized by a separate processing unit to determine a pose of the sensor and any required adjustments to the pose.

Description

SYSTEMS AND METHODS FOR CALIBRATING NONVISIBLE LIGHT EMITTING SENSORS USING ALIGNMENT TARGETS
Priority
[0001] This application claims the benefit of U.S. Provisional Patent Application Serial No.
62/880,334 filed on July 30, 2019, the entire disclosure of which is incorporated herein by reference.
Copyright
[0002] A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
Background
Technological Field
[0003] The present application relates generally to robotics, and more specifically to systems and methods for calibrating nonvisible light emitting sensors using alignment targets.
Summary
[0004] The foregoing needs are satisfied by the present disclosure, which provides for, inter alia, systems and methods for calibrating sensors using alignment targets.
[0005] Exemplary embodiments described herein have innovative features, no single one of which is indispensable or solely responsible for their desirable attributes. Without limiting the scope of the claims, some of the advantageous features will now be summarized.
[0006] The alignment targets disclosed herein may provide quick and cost effective systems and methods for visualizing a beam of a sensor by a human using light in the visible spectrum such that the visualized measurement plane may be utilized by the human for calibration of the sensor. Additionally, a processor may utilize data from the alignment targets to determine a pose of the sensor and any required adjustments to the pose.
[0007] According to at least one non-limiting exemplary embodiment, an alignment target apparatus is disclosed. The alignment target apparatus comprises: a plurality of detection indication units arranged spatially in at least one linear array, each detection indication unit being configured to detect the incident nonvisible light outputted from the sensor; and at least one target positioned on the alignment target at a location relative to the sensor, the location of the at least one target corresponding to a desired point of intersection between the incident nonvisible light and the alignment target apparatus, the desired point corresponding to the sensor being calibrated.
[0008] According to at least one non-limiting exemplary embodiment, the at least one target comprises a visible light emitting diode configured to visually represent the desired point of intersection between the alignment target and the incident nonvisible light from the sensor.
[0009] According to at least one non-limiting exemplary embodiment, each one of the plurality of detection indication units further comprises a threshold detection logic configured to: determine if the nonvisible light from the sensor is incident on a detection indication unit based on an induced voltage from a photodiode; and output a detection indication signal based on the induced voltage, the detection indication signal comprising either a logical high or logical low detection indication signal, the logical high detection indication signal corresponding to detection of the nonvisible light, and the logical low detection indication signal corresponding to no detection of the nonvisible light.
[0010] According to at least one non-limiting exemplary embodiment, the detection indication signal comprises an output voltage over a visible light diode, the visible light diode configured to: emit visible light based on the output voltage if the detection indication signal is the logical high detection indication signal, and not emit the visible light if the detection indication signal is the logical low detection indication signal.
[0011] According to at least one non-limiting exemplary embodiment, the alignment target apparatus further comprises a non-transitory computer readable storage medium and at least one processor configured to execute the computer readable instructions to: determine at least one spatial discrepancy between the at least one target and an intersection point between the incident nonvisible light outputted from the sensor and the at least one alignment target apparatus, the intersection point being indicated by a detection indication signal outputted by one of the plurality of detection indication units if the detection indication signal is the logical high detection indication signal; and minimize the at least one spatial discrepancy by adjusting a pose of the sensor.
[0012] According to at least one non-limiting exemplary embodiment, the at least one processor may execute the instructions to adjust the pose of the sensor by either: activating at least one servomotor, the at least one servomotor configured to adjust the pose of the sensor; or providing instructions to a human via a user interface, the instructions prompt the human to perform the adjustments to the pose of the sensor manually.
[0013] According to at least one non-limiting exemplary embodiment, the at least one target comprises a designated at least one detection indication unit of the plurality of detection indication units.
[0015] According to at least one non-limiting exemplary embodiment, a method for calibrating a sensor on a device is disclosed. The sensor being configured to emit nonvisible light to generate measurements of an environment. The method, comprising: utilizing at least one alignment target at a known position relative to the device to determine, for each alignment target, at least one spatial discrepancy between a location of at least one target and a location of at least one intersection point; and minimizing the at least one spatial discrepancy by performing adjustments to a pose of the sensor; wherein, an intersection point corresponds to a location on an alignment target where the nonvisible light is incident; and a target corresponds to a desired location of the intersection point on an alignment target corresponding to a calibrated sensor.
[0016] According to at least one non-limiting exemplary embodiment, the method further comprises: determining the intersection point based on a detection indication signal output from one of a plurality of linearly arranged detection indication units of an alignment target being logical high.
[0017] According to at least one non-limiting exemplary embodiment, the method further comprises: determining the detection indication signal for a detection indication unit based on an induced voltage of a photodiode of the detection indication unit exceeding a value, the voltage being induced due to the nonvisible light from the sensor being incident on the photodiode.
[0018] According to at least one non-limiting exemplary embodiment, the detection indication signal comprises an output voltage over a visible light diode, the output voltage configures the visible light diode to emit visible light when the output detection indication signal is logical high and produce no visible light when the detection indication signal is logical low.
[0019] According to at least one non-limiting exemplary embodiment, the at least one target comprises a designated at least one detection indication unit of the plurality of detection indication units.
[0020] According to at least one non-limiting exemplary embodiment, the at least one target comprises a visible light emitting diode configured to visually represent the desired location of the intersection point.
[0021] According to at least one non-limiting exemplary embodiment, a non-transitory computer readable storage medium having a plurality of computer readable instructions embodied thereon is disclosed. The instructions, when executed by a processor, configure the processor to: determine at least one spatial discrepancy between at least one target and at least one intersection point between one or more alignment targets and nonvisible light from a sensor, the intersection point being indicated by one or more detection indication units of an alignment target outputting a logical high detection indication signal; and minimize the spatial discrepancy by performing adjustments to a pose of the sensor.
[0022] According to at least one non-limiting exemplary embodiment, the non-transitory computer readable storage medium further comprises instructions which configure the processor to: perform the adjustments to the pose of the sensor by activating at least one servomotor configured to adjust the pose of the sensor.
[0023] According to at least one non-limiting exemplary embodiment, the non-transitory computer readable storage medium further comprises instructions which configure the processor to: provide instructions to a human via a user interface to perform the adjustments to the pose of the sensor.
[0025] According to at least one non-limiting exemplary embodiment, each of the at least one alignment targets further comprises a plurality of linearly arranged detection indication units, each detection indication unit further comprises a photodiode sensitive to a wavelength of the nonvisible light; and the detection indication signal output is based on an induced voltage of a photodiode exceeding a value, the voltage being induced due to the nonvisible light from the sensor being incident on the photodiode.
[0026] According to at least one non-limiting exemplary embodiment, the at least one target comprises a designated at least one of the plurality of detection indication units located at a desired location of the intersection point, the desired location corresponding to an intersection point of a calibrated sensor.
[0027] According to at least one non-limiting exemplary embodiment, an alignment target apparatus is disclosed. The alignment target apparatus comprises: a plurality of detection indication units arranged spatially in at least one linear array, each detection indication unit being configured to detect the incident nonvisible light from the sensor, each detection indication unit comprising: a threshold detection logic configured to determine if the nonvisible light from the sensor is incident on a detection indication unit based on an induced voltage from a photodiode, the threshold detection logic outputs a detection indication signal based on the induced voltage from the photodiode, the detection indication signal comprising a logical high or low corresponding to a detection or no detection, respectively, of the incident nonvisible light by the photodiode, the detection indication signal comprises an output voltage over a visible light diode, the output voltage configures the visible light diode to emit visible light when the detection indication signal is logical high and produce no visible light when the detection indication signal is logical low; and at least one target positioned on the alignment target at a location relative to the sensor, the location of the at least one target corresponding to a desired point of intersection between the incident nonvisible light and the alignment target apparatus, the desired point corresponding to the sensor being calibrated.
[0028] These and other objects, features, and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the disclosure. As used in the specification and in the claims, the singular form of“a”,“an”, and“the” include plural referents unless the context clearly dictates otherwise.
Brief Description of the Drawings
[0029] The disclosed aspects will hereinafter be described in conjunction with the appended drawings, provided to illustrate and not to limit the disclosed aspects, wherein like designations denote like elements.
[0030] FIG. 1A is a functional block diagram of a main robot in accordance with some embodiments of this disclosure.
[0031] FIG. 1B is a functional block diagram of a controller shown in FIG. 1A in accordance with some embodiments of this disclosure.
[0032] FIG. 1C illustrates a light detection and ranging (LiDAR) sensor and features thereof in accordance with some embodiments of this disclosure.
[0033] FIG. 2A is a high-level circuit diagram of an alignment target and components thereof, according to an exemplary embodiment.
[0034] FIG. 2B illustrates an alignment target being utilized to detect a measurement plane of a sensor for calibration of the sensor, according to an exemplary embodiment.
[0035] FIG. 2C(i)-(iii) illustrates three exemplary alignment targets depicting additional exemplary embodiments of an alignment target.
[0036] FIG. 3 is a high-level circuit diagram of a detection indication unit of an alignment target, according to an exemplary embodiment.
[0037] FIG. 4A is a top view of a LiDAR sensor and three alignment targets to be utilized to calibrate the LiDAR sensor, according to an exemplary embodiment.
[0038] FIG. 4B is a side view of the LiDAR sensor and three alignment targets illustrated in
FIG. 4A to illustrate discrepancies between a target and illuminated visible light diodes to be utilized to calibrate the LiDAR sensor, according to an exemplary embodiment.
[0039] FIG. 5 is a process flow diagram illustrating a method for an operator to calibrate a LiDAR sensor on a device using at least one alignment target, according to an exemplary embodiment.
[0040] FIG. 6 is a functional block diagram of a system configured to utilize alignment targets and a processing unit to determine a pose of a sensor and required adjustments to the pose, according to an exemplary embodiment.
[0041] All Figures disclosed herein are © Copyright 2020 Brain Corporation. All rights reserved.
Detailed Description
[0042] Currently, many robots utilize light detection and ranging (LiDAR) sensors to collect data of the world around them. Data the robots collect may be essential for navigation and execution of tasks, wherein calibration of the LiDAR sensors may be critical for the robots to function properly. Furthermore, accurate calibration of a LiDAR sensor may further enhance precision of task execution by the robots. Typically, LiDAR sensors collect distance measurements across a two-dimensional measurement plane, wherein motion of the robots is further utilized to generate three-dimensional point clouds of an environment.
[0043] LiDAR sensors emit near infrared (IR) light, which is invisible to human eyes, thereby making it difficult for a human to visualize a measurement plane of the LiDAR sensors for calibration. Infrared goggles, configured to enable humans to see light in the IR or near IR spectrum, may be cumbersome and impractical due to vision range limitations of the goggles. Infrared luminescent paints, which glow upon being illuminated with IR light, may also be impractical as power emitted by LiDAR sensors per unit area may be orders of magnitude too weak to illuminate the paints. Additionally, many robots may utilize other sensors, such as radar or ultraviolet sensors, which also utilize beams of nonvisible light to sense an environment.
[0044] Accordingly, there is a need in the art for systems and methods for calibrating nonvisible light sensors by human operators of robots by utilizing alignment targets. Inventive concepts disclosed herein are directed to a practical application of utilizing alignment targets to enhance calibration of nonvisible light sensors.
[0045] Various aspects of the novel systems, apparatuses, and methods disclosed herein are described more fully hereinafter with reference to the accompanying drawings. This disclosure can, however, be embodied in many different forms and should not be construed as limited to any specific structure or function presented throughout this disclosure. Rather, these aspects are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. Based on the teachings herein, one skilled in the art would appreciate that the scope of the disclosure is intended to cover any aspect of the novel systems, apparatuses, and methods disclosed herein, whether implemented independently of, or combined with, any other aspect of the disclosure.
[0046] For example, an apparatus may be implemented or a method may be practiced using any number of the aspects set forth herein. In addition, the scope of the disclosure is intended to cover such an apparatus or method that is practiced using other structure, functionality, or structure and functionality in addition to or other than the various aspects of the disclosure set forth herein. It should be understood that any aspect disclosed herein may be implemented by one or more elements of a claim.
[0047] Although particular aspects are described herein, many variations and permutations of these aspects fall within the scope of the disclosure. Although some benefits and advantages of the preferred aspects are mentioned, the scope of the disclosure is not intended to be limited to particular benefits, uses, and/or objectives. The detailed description and drawings are merely illustrative of the disclosure rather than limiting, the scope of the disclosure being defined by the appended claims and equivalents thereof.
[0048] The present disclosure provides for systems and methods for calibrating nonvisible light emitting sensors using alignment targets. As used herein, a robot may include mechanical and/or virtual entities configured to carry out a complex series of tasks or actions autonomously. In some exemplary embodiments, robots may be machines that are guided and/or instructed by computer programs and/or electronic circuitry. In some exemplary embodiments, robots may include electromechanical components that are configured for navigation, where the robot may move from one location to another. Such robots may include autonomous and/or semi-autonomous cars, floor cleaners, rovers, drones, planes, boats, carts, trams, wheelchairs, industrial equipment, stocking machines, mobile platforms, personal transportation devices (e.g., hover boards, SEGWAYS®, etc.), trailer movers, vehicles, and the like. Robots may also include any autonomous and/or semi-autonomous machine for transporting items, people, animals, cargo, freight, objects, luggage, and/or anything desirable from one location to another.
[0049] As used herein, an operator may comprise a human manufacturer, operator, or user of a device. Operators may be tasked with calibrating sensors on a device using, at least in part, the systems and methods disclosed herein.
[0050] As used herein, logical high of a digital signal may correspond to an active state (i.e.,
ON) and logical low may correspond to an inactive state (i.e., OFF), wherein the signals on printed circuit board (PCB) traces may comprise active low or active high voltages. For example, a flashlight emitting light (i.e., in an ON state) may comprise a signal to a bulb in a logical high state, wherein the voltage of the signal may be 0 V or 5 V based on a specific design (i.e., active low or active high) of the flashlight. [0051] As used herein, visible light comprises light of optical wavelength (i.e., visible to humans) from around 400-700 nanometers. Infrared (IR) light, as used herein, may include any wavelengths longer than 700 nanometers and may comprise any or all subsections of infrared (e.g., near IR, mid IR, far IR). Nonvisible light, as used herein, comprises light outside of the optical wavelength (i.e., light invisible to humans). One skilled in the art may appreciate that the systems and methods illustrated below may be applied to any operating wavelength of a sensor (e.g., microwave, ultraviolet, etc.), wherein detection of IR light described herein is not intended to be limiting.
[0052] As used herein, a pose of an object (e.g., a sensor) may comprise an (x, y, z, yaw, pitch, roll) orientation of the object, defined with respect to a predefined origin. A pose of an object may comprise some or all six degrees of freedom, three degrees of freedom being (x, y, z) position and the other three being (yaw, pitch, roll) rotation, wherein the pose of the object may be defined about all available degrees of freedom of the object.
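As a minimal sketch of the pose convention above (the field names and units are illustrative assumptions, not limitations), a six-degree-of-freedom pose may be represented in software as:

    from dataclasses import dataclass

    @dataclass
    class Pose:
        # Translation relative to a predefined origin, e.g., in meters.
        x: float = 0.0
        y: float = 0.0
        z: float = 0.0
        # Rotation about the vertical, lateral, and longitudinal axes, e.g., in degrees.
        yaw: float = 0.0
        pitch: float = 0.0
        roll: float = 0.0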
[0053] As used herein, network interfaces may include any signal, data, or software interface with a component, network, or process including, without limitation, those of the FireWire (e.g., FW400, FW800, FWS800T, FWS1600, FWS3200, etc.), universal serial bus (“USB”) (e.g., USB 1.X, USB 2.0, USB 3.0, USB Type-C, etc.), Ethernet (e.g., 10/100, 10/100/1000 (Gigabit Ethernet), 10-Gig-E, etc.), multimedia over coax alliance technology (“MoCA”), Coaxsys (e.g., TVNET™), radio frequency tuner (e.g., in-band or OOB, cable modem, etc.), Wi-Fi (802.11), WiMAX (e.g., WiMAX (802.16)), PAN (e.g., PAN/802.15), cellular (e.g., 3G, 4G, or 5G including LTE/LTE-A/TD-LTE/TD-LTE, GSM, etc. variants thereof), IrDA families, etc. As used herein, Wi-Fi may include one or more of IEEE-Std. 802.11, variants of IEEE-Std. 802.11, standards related to IEEE-Std. 802.11 (e.g., 802.11 a/b/g/n/ac/ad/af/ah/ai/aj/aq/ax/ay), and/or other wireless standards.
[0054] As used herein, processor, microprocessor, and/or digital processor may include any type of digital processing device such as, without limitation, digital signal processors (“DSPs”), reduced instruction set computers (“RISC”), complex instruction set computers (“CISC”), microprocessors, gate arrays (e.g., field programmable gate arrays (“FPGAs”)), programmable logic device (“PLDs”), reconfigurable computer fabrics (“RCFs”), array processors, secure microprocessors, specialized processors (e.g., neuromorphic processors), and application-specific integrated circuits (“ASICs”). Such digital processors may be contained on a single unitary integrated circuit die or distributed across multiple components.
[0055] As used herein, computer program and/or software may include any sequence or machine cognizable steps which perform a function. Such computer program and/or software may be rendered in any programming language or environment including, for example, C/C++, C#, Fortran, COBOL, MATLAB™, PASCAL, GO, RUST, SCALA, Python, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (“CORBA”), JAVA™ (including J2ME, Java Beans, etc.), Binary Runtime Environment (e.g.,“BREW”), and the like.
[0056] As used herein, connection, link, and/or wireless link may include a causal link between any two or more entities (whether physical or logical/virtual), which enables information exchange between the entities.
[0057] As used herein, computer and/or computing device may include, but are not limited to, personal computers (“PCs”) and minicomputers, whether desktop, laptop, or otherwise, mainframe computers, workstations, servers, personal digital assistants (“PDAs”), handheld computers, embedded computers, programmable logic devices, personal communicators, tablet computers, mobile devices, portable navigation aids, J2ME equipped devices, cellular telephones, smart phones, personal integrated communication or entertainment devices, and/or any other device capable of executing a set of instructions and processing an incoming data signal.
[0058] Detailed descriptions of the various embodiments of the system and methods of the disclosure are now provided. While many examples discussed herein may refer to specific exemplary embodiments, it will be appreciated that the described systems and methods contained herein are applicable to any kind of robot. Myriad other embodiments or uses for the technology described herein would be readily envisaged by those having ordinary skill in the art, given the contents of the present disclosure.
[0059] Advantageously, the systems and methods of this disclosure at least: (i) enhance abilities of a human to manually calibrate a sensor on a robot; (ii) allow humans to visualize a measurement plane of an infrared LiDAR sensor; (iii) reduce cost and complexity associated with calibrating a LiDAR sensor; and (iv) improve operation precision of robots by further enhancing LiDAR sensor calibration methods. Other advantages are readily discernible by one having ordinary skill in the art given the contents of the present disclosure.
[0060] FIG. 1A is a functional block diagram of a robot 102 in accordance with some principles of this disclosure. As illustrated in FIG. 1A, robot 102 may include controller 118, memory 120, user interface unit 112, sensor units 114, navigation units 106, actuator unit 108, and communications unit 116, as well as other components and subcomponents (e.g., some of which may not be illustrated). Although a specific embodiment is illustrated in FIG. 1A, it is appreciated that the architecture may be varied in certain embodiments as would be readily apparent to one of ordinary skill given the contents of the present disclosure. As used herein, robot 102 may be representative at least in part of any robot described in this disclosure.
[0061] Controller 118 may control the various operations performed by robot 102. Controller 118 may include and/or comprise one or more processing devices (e.g., microprocessing devices) and other peripherals. As previously mentioned and used herein, processing device, microprocessing device, and/or digital processing device may include any type of digital processing device such as, without limitation, digital signal processing devices (“DSPs”), reduced instruction set computers (“RISC”), complex instruction set computers (“CISC”), microprocessing devices, gate arrays (e.g., field programmable gate arrays (“FPGAs”)), programmable logic device (“PLDs”), reconfigurable computer fabrics (“RCFs”), array processing devices, secure microprocessing devices and application-specific integrated circuits (“ASICs”). Peripherals may include hardware accelerators configured to perform a specific function using hardware elements such as, without limitation, encryption/decryption hardware, algebraic processing devices (e.g., tensor processing units, quadratic problem solvers, multipliers, etc.), data compressors, encoders, arithmetic logic units (“ALU”), and the like. Such digital processing devices may be contained on a single unitary integrated circuit die, or distributed across multiple components.
[0062] Controller 118 may be operatively and/or communicatively coupled to memory 120.
Memory 120 may include any type of integrated circuit or other storage device configured to store digital data including, without limitation, read-only memory (“ROM”), random access memory (“RAM”), non-volatile random access memory (“NVRAM”), programmable read-only memory (“PROM”), electrically erasable programmable read-only memory (“EEPROM”), dynamic random- access memory (“DRAM”), Mobile DRAM, synchronous DRAM (“SDRAM”), double data rate SDRAM (“DDR/2 SDRAM”), extended data output (“EDO”) RAM, fast page mode RAM (“FPM”), reduced latency DRAM (“RLDRAM”), static RAM (“SRAM”), flash memory (e.g., NAND/NOR), memristor memory, pseudostatic RAM (“PSRAM”), etc. Memory 120 may provide computer-readable instructions and data to controller 118. For example, memory 120 may be a non-transitory, computer- readable storage apparatus and/or medium having a plurality of instructions stored thereon, the instructions being executable by a processing apparatus (e.g., controller 118) to operate robot 102. In some cases, the computer-readable instructions may be configured to, when executed by the processing apparatus, cause the processing apparatus to perform the various methods, features, and/or functionality described in this disclosure. Accordingly, controller 118 may perform logical and/or arithmetic operations based on program instructions stored within memory 120. In some cases, the instructions and/or data of memory 120 may be stored in a combination of hardware, some located locally within robot 102, and some located remote from robot 102 (e.g., in a cloud, server, network, etc.).
[0063] It should be readily apparent to one of ordinary skill in the art that a processing device may be internal to or on board robot 102 and/or may be external to robot 102 and be communicatively coupled to controller 118 of robot 102 utilizing communication units 116 wherein the external processing device may receive data from robot 102, process the data, and transmit computer-readable instructions back to controller 118. In at least one non-limiting exemplary embodiment, the processing device may be on a remote server (not shown).
[0064] In some exemplary embodiments, memory 120, shown in FIG. 1A, may store a library of sensor data. In some cases, the sensor data may be associated at least in part with objects and/or people. In exemplary embodiments, this library may include sensor data related to objects and/or people in different conditions, such as sensor data related to objects and/or people with different compositions (e.g., materials, reflective properties, molecular makeup, etc.), different lighting conditions, angles, sizes, distances, clarity (e.g., blurred, obstructed/occluded, partially off frame, etc.), colors, surroundings, and/or other conditions. The sensor data in the library may be taken by a sensor (e.g., a sensor of sensor units 114 or any other sensor) and/or generated automatically, such as with a computer program that is configured to generate/simulate (e.g., in a virtual world) library sensor data (e.g., which may generate/simulate these library data entirely digitally and/or beginning from actual sensor data) from different lighting conditions, angles, sizes, distances, clarity (e.g., blurred, obstructed/occluded, partially off frame, etc.), colors, surroundings, and/or other conditions.
[0065] The number of images in the library may depend at least in part on one or more of the amount of available data, the variability of the surrounding environment in which robot 102 operates, the complexity of objects and/or people, the variability in appearance of objects, physical properties of robots, the characteristics of the sensors, and/or the amount of available storage space (e.g., in the library, memory 120, and/or local or remote storage). In exemplary embodiments, at least a portion of the library may be stored on a network (e.g., cloud, server, distributed network, etc.) and/or may not be stored completely within memory 120. As yet another exemplary embodiment, various robots (e.g., that are commonly associated, such as robots by a common manufacturer, user, network, etc.) may be networked so that data captured by individual robots are collectively shared with other robots. In such a fashion, these robots may be configured to learn and/or share sensor data in order to facilitate the ability to readily detect and/or identify errors and/or assist events.
[0066] Still referring to FIG. 1A, operative units 104 may be coupled to controller 118, or any other controller, to perform the various operations described in this disclosure. One, more, or none of the modules in operative units 104 may be included in some embodiments. Throughout this disclosure, reference may be to various controllers and/or processing devices. In some embodiments, a single controller (e.g., controller 118) may serve as the various controllers and/or processing devices described. In other embodiments different controllers and/or processing devices may be used, such as controllers and/or processing devices used particularly for one or more operative units 104. Controller 118 may send and/or receive signals, such as power signals, status signals, data signals, electrical signals, and/or any other desirable signals, including discrete and analog signals to operative units 104. Controller 118 may coordinate and/or manage operative units 104, and/or set timings (e.g., synchronously or asynchronously), turn off/on control power budgets, receive/send network instructions and/or updates, update firmware, send interrogatory signals, receive and/or send statuses, and/or perform any operations for running features of robot 102.
[0067] Returning to FIG. 1A, operative units 104 may include various units that perform functions for robot 102. For example, operative units 104 include at least navigation units 106, actuator units 108, user interface units 112, sensor units 114, and communication units 116. Operative units 104 may also comprise other units such as specifically configured task units (not shown) that provide the various functionality of robot 102. In exemplary embodiments, operative units 104 may be instantiated in software, hardware, or both software and hardware. For example, in some cases, units of operative units 104 may comprise computer-implemented instructions executed by a controller 118. In exemplary embodiments, units of operative units 104 may comprise hardcoded logic (e.g., ASICs). In exemplary embodiments, units of operative units 104 may comprise both computer-implemented instructions executed by a controller and hardcoded logic. Where operative units 104 are implemented in part in software, operative units 104 may include units/modules of code configured to provide one or more functionalities.
[0068] In exemplary embodiments, navigation units 106 may include systems and methods that may computationally construct and update a map of an environment, localize robot 102 (e.g., find the position) in a map, and navigate robot 102 to/from destinations. The mapping may be performed by imposing data obtained in part by sensor units 114 into a computer-readable map representative at least in part of the environment. In exemplary embodiments, a map of an environment may be uploaded to robot 102 through user interface units 112, uploaded wirelessly or through wired connection, or taught to robot 102 by a user.
[0069] In exemplary embodiments, navigation units 106 may include components and/or software configured to provide directional instructions for robot 102 to navigate. Navigation units 106 may process maps, routes, and localization information generated by mapping and localization units, data from sensor units 114, and/or other operative units 104.
[0070] Still referring to FIG. 1A, actuator units 108 may include actuators such as electric motors, gas motors, driven magnet systems, solenoid/ratchet systems, piezoelectric systems (e.g., inchworm motors), magnetostrictive elements, gesticulation, and/or any way of driving an actuator known in the art. By way of illustration, such actuators may actuate the wheels for robot 102 to navigate a route; navigate around obstacles; rotate cameras and sensors. According to exemplary embodiments, actuator unit 108 may include systems that allow movement of robot 102, such as motorized propulsion. For example, motorized propulsion may move robot 102 in a forward or backward direction, and/or be used at least in part in turning robot 102 (e.g., left, right, and/or any other direction). By way of illustration, actuator unit 108 may control if robot 102 is moving or is stopped and/or allow robot 102 to navigate from one location to another location.
[0071] Actuator unit 108 may also include any system used for actuating, in some cases actuating task units to perform tasks. For example, actuator unit 108 may include driven magnet systems, motors/engines (e.g., electric motors, combustion engines, steam engines, and/or any type of motor/engine known in the art), solenoid/ratchet system, piezoelectric system (e.g., an inchworm motor), magnetostrictive elements, gesticulation, and/or any actuator known in the art.
[0072] According to exemplary embodiments, sensor units 114 may comprise systems and/or methods that may detect characteristics within and/or around robot 102. Sensor units 114 may comprise a plurality and/or a combination of sensors. Sensor units 114 may include sensors that are internal to robot 102 or external, and/or have components that are partially internal and/or partially external. In some cases, sensor units 114 may include one or more exteroceptive sensors, such as sonars, light detection and ranging (“LiDAR”) sensors, radars, lasers, cameras (including video cameras (e.g., red-blue-green (“RBG”) cameras, infrared cameras, three-dimensional (“3D”) cameras, thermal cameras, etc.)), time of flight (“ToF”) cameras, structured light cameras, antennas, motion detectors, microphones, and/or any other sensor known in the art. According to some exemplary embodiments, sensor units 114 may collect raw measurements (e.g., currents, voltages, resistances, gate logic, etc.) and/or transformed measurements (e.g., distances, angles, detected points in obstacles, etc.). In some cases, measurements may be aggregated and/or summarized. Sensor units 114 may generate data based at least in part on distance or height measurements. Such data may be stored in data structures, such as matrices, arrays, queues, lists, stacks, bags, etc.
[0073] According to exemplary embodiments, sensor units 114 may include sensors that may measure internal characteristics of robot 102. For example, sensor units 114 may measure temperature, power levels, statuses, and/or any characteristic of robot 102. In some cases, sensor units 114 may be configured to determine the odometry of robot 102. For example, sensor units 114 may include proprioceptive sensors, which may comprise sensors such as accelerometers, inertial measurement units (“IMU”), odometers, gyroscopes, speedometers, cameras (e.g. using visual odometry), clock/timer, and the like. Odometry may facilitate autonomous navigation and/or autonomous actions of robot 102. This odometry may include robot 102’s position (e.g., where position may include robot’s location, displacement and/or orientation, and may sometimes be interchangeable with the term pose as used herein) relative to the initial location. Such data may be stored in data structures, such as matrices, arrays, queues, lists, arrays, stacks, bags, etc. According to exemplary embodiments, the data structure of the sensor data may be called an image.
[0074] According to exemplary embodiments, sensor units 114 may be in part external to the robot 102 and coupled to communications units 116. For example, a security camera within an environment of a robot 102 may provide a controller 118 of the robot 102 with a video feed via wired or wireless communication channel(s). In some instances, sensor units 114 may include sensors configured to detect a presence of an object at a location such as, for example without limitation, a pressure or motion sensor may be disposed at a shopping cart storage location of a grocery store, wherein the controller 118 of the robot 102 may utilize data from the pressure or motion sensor to determine if the robot 102 should retrieve more shopping carts for customers.
[0075] According to exemplary embodiments, user interface units 112 may be configured to enable a user to interact with robot 102. For example, user interface units 112 may include touch panels, buttons, keypads/keyboards, ports (e.g., universal serial bus (“USB”), digital visual interface (“DVI”), Display Port, E-Sata, Firewire, PS/2, Serial, VGA, SCSI, audioport, high-definition multimedia interface (“HDMI”), personal computer memory card international association (“PCMCIA”) ports, memory card ports (e.g., secure digital (“SD”) and miniSD), and/or ports for computer-readable medium), mice, rollerballs, consoles, vibrators, audio transducers, and/or any interface for a user to input and/or receive data and/or commands, whether coupled wirelessly or through wires. Users may interact through voice commands or gestures. User interface units 112 may include a display, such as, without limitation, liquid crystal displays (“LCDs”), light-emitting diode (“LED”) displays, LED LCD displays, in-plane-switching (“IPS”) displays, cathode ray tubes, plasma displays, high definition (“HD”) panels, 4K displays, retina displays, organic LED displays, touchscreens, surfaces, canvases, and/or any displays, televisions, monitors, panels, and/or devices known in the art for visual presentation. According to exemplary embodiments, user interface units 112 may be positioned on the body of robot 102. According to exemplary embodiments, user interface units 112 may be positioned away from the body of robot 102 but may be communicatively coupled to robot 102 (e.g., via communication units including transmitters, receivers, and/or transceivers) directly or indirectly (e.g., through a network, server, and/or a cloud). According to exemplary embodiments, user interface units 112 may include one or more projections of images on a surface (e.g., the floor) proximally located to the robot, e.g., to provide information to the occupant or to people around the robot. The information could be the direction of future movement of the robot, such as an indication of moving forward, left, right, back, at an angle, and/or any other direction. In some cases, such information may utilize arrows, colors, symbols, etc. [0076] According to exemplary embodiments, communications unit 116 may include one or more receivers, transmitters, and/or transceivers.
Communications unit 116 may be configured to send/receive a transmission protocol, such as BLUETOOTH®, ZIGBEE®, Wi-Fi, induction wireless data transmission, radio frequencies, radio transmission, radio-frequency identification (“RFID”), near-field communication (“NFC”), infrared, network interfaces, cellular technologies such as 3G (3.5G, 3.75G, 3GPP/3GPP2/HSPA+), 4G (4GPP/4GPP2/LTE/LTE-TDD/LTE-FDD), 5G (5GPP/5GPP2), or 5G LTE (long-term evolution, and variants thereof including LTE-A, LTE-U, LTE-A Pro, etc.), high speed downlink packet access (“HSDPA”), high-speed uplink packet access (“HSUPA”), time division multiple access (“TDMA”), code division multiple access (“CDMA”) (e.g., IS-95A, wideband code division multiple access (“WCDMA”), etc.), frequency hopping spread spectrum (“FHSS”), direct sequence spread spectrum (“DSSS”), global system for mobile communication (“GSM”), Personal Area Network (“PAN”) (e.g., PAN/802.15), worldwide interoperability for microwave access (“WiMAX”), 802.20, long term evolution (“LTE”) (e.g., LTE/LTE-A), time division LTE (“TD-LTE”), global system for mobile communication (“GSM”), narrowband/frequency-division multiple access (“FDMA”), orthogonal frequency-division multiplexing (“OFDM”), analog cellular, cellular digital packet data (“CDPD”), satellite systems, millimeter wave or microwave systems, acoustic, infrared (e.g., infrared data association (“IrDA”)), and/or any other form of wireless data transmission.
[0077] Communications unit 116 may also be configured to send/receive signals utilizing a transmission protocol over wired connections, such as any cable that has a signal line and ground. For example, such cables may include Ethernet cables, coaxial cables, Universal Serial Bus (“USB”), FireWire, and/or any connection known in the art. Such protocols may be used by communications unit 116 to communicate to external systems, such as computers, smart phones, tablets, data capture systems, mobile telecommunications networks, clouds, servers, or the like. Communications unit 116 may be configured to send and receive signals comprising numbers, letters, alphanumeric characters, and/or symbols. In some cases, signals may be encrypted, using algorithms such as 128-bit or 256-bit keys and/or other encryption algorithms complying with standards such as the Advanced Encryption Standard (“AES”), RSA, Data Encryption Standard (“DES”), Triple DES, and the like. Communications unit 116 may be configured to send and receive statuses, commands, and other data/information. For example, communications unit 116 may communicate with a user operator to allow the user to control robot 102. Communications unit 116 may communicate with a server/network (e.g., a network) in order to allow robot 102 to send data, statuses, commands, and other communications to the server. The server may also be communicatively coupled to computer(s) and/or device(s) that may be used to monitor and/or control robot 102 remotely. Communications unit 116 may also receive updates (e.g., firmware or data updates), data, statuses, commands, and other communications from a server for robot 102.
[0078] In exemplary embodiments, operating system 110 may be configured to manage memory 120, controller 118, power supply 122, modules in operative units 104, and/or any software, hardware, and/or features of robot 102. For example, and without limitation, operating system 110 may include device drivers to manage hardware resources for robot 102.
[0079] In exemplary embodiments, power supply 122 may include one or more batteries, including, without limitation, lithium, lithium ion, nickel-cadmium, nickel-metal hydride, nickel- hydrogen, carbon-zinc, silver-oxide, zinc-carbon, zinc-air, mercury oxide, alkaline, or any other type of battery known in the art. Certain batteries may be rechargeable, such as wirelessly (e.g., by resonant circuit and/or a resonant tank circuit) and/or plugging into an external power source. Power supply 122 may also be any supplier of energy, including wall sockets and electronic devices that convert solar, wind, water, nuclear, hydrogen, gasoline, natural gas, fossil fuels, mechanical energy, steam, and/or any power source into electricity.
[0080] One or more of the units described with respect to FIG. 1A (including memory 120, controller 118, sensor units 114, user interface unit 112, actuator unit 108, communications unit 116, mapping and localization unit 126, and/or other units) may be integrated onto robot 102, such as in an integrated system. However, according to some exemplary embodiments, one or more of these units may be part of an attachable module. This module may be attached to an existing apparatus to automate it so that it behaves as a robot. Accordingly, the features described in this disclosure with reference to robot 102 may be instantiated in a module that may be attached to an existing apparatus and/or integrated onto robot 102 in an integrated system. Moreover, in some cases, a person having ordinary skill in the art would appreciate from the contents of this disclosure that at least a portion of the features described in this disclosure may also be run remotely, such as in a cloud, network, and/or server.
[0081] As used herein, a robot 102, a controller 118, or any other controller, processing device, or robot performing a task, operation or transformation illustrated in the figures below comprises a controller executing computer readable instructions stored on a non-transitory computer readable storage apparatus, such as memory 120, as would be appreciated by one skilled in the art.
[0082] Next referring to FIG. 1B, the architecture of a processor or processing device is illustrated according to an exemplary embodiment. As illustrated in FIG. 1B, the processing device includes a data bus 128, a receiver 126, a transmitter 134, at least one processor 130, and a memory 132. The receiver 126, the processor 130 and the transmitter 134 all communicate with each other via the data bus 128. The processor 130 is configurable to access the memory 132 which stores computer code or computer readable instructions in order for the processor 130 to execute the specialized algorithms. As illustrated in FIG. 1B, memory 132 may comprise some, none, different, or all of the features of memory 120 previously illustrated in FIG. 1A. The algorithms executed by the processor 130 are discussed in further detail below. The receiver 126 as shown in FIG. 1B is configurable to receive input signals 124. The input signals 124 may comprise signals from a plurality of operative units 104 illustrated in FIG. 1A including, but not limited to, sensor data from sensor units 114, user inputs, motor feedback, external communication signals (e.g., from a remote server), and/or any other signal from an operative unit 104 requiring further processing. The receiver 126 communicates these received signals to the processor 130 via the data bus 128. As one skilled in the art would appreciate, the data bus 128 is the means of communication between the different components (receiver, processor, and transmitter) in the processing device. The processor 130 executes the algorithms, as discussed below, by accessing specialized computer-readable instructions from the memory 132. Further detailed description as to the processor 130 executing the specialized algorithms in receiving, processing and transmitting of these signals is discussed above with respect to FIG. 1A. The memory 132 is a storage medium for storing computer code or instructions. The storage medium may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage medium may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. The processor 130 may communicate output signals to transmitter 134 via data bus 128 as illustrated. The transmitter 134 may be configurable to further communicate the output signals to a plurality of operative units 104 illustrated by signal output 136.
[0083] One of ordinary skill in the art would appreciate that the architecture illustrated in FIG.
1B may illustrate an external server architecture configurable to effectuate the control of a robotic apparatus from a remote location. That is, the server may also include a data bus, a receiver, a transmitter, a processor, and a memory that stores specialized computer readable instructions thereon.
[0084] One of ordinary skill in the art would appreciate that a controller 118 of a robot 102 may include one or more processing devices and may further include other peripheral devices used for processing information, such as ASICs, DSPs, proportional-integral-derivative (“PID”) controllers, hardware accelerators (e.g., encryption/decryption hardware), and/or other peripherals (e.g., analog to digital converters) described above in FIG. 1A. The other peripheral devices when instantiated in hardware are commonly used within the art to accelerate specific tasks (e.g., multiplication, encryption, etc.) which may alternatively be performed using the system architecture of FIG. 1B. In some instances, peripheral devices are used as a means for intercommunication between the controller 118 and operative units 104 (e.g., digital to analog converters and/or amplifiers for producing actuator signals). Accordingly, as used herein, the controller 118 executing computer readable instructions to perform a function may include one or more processing devices thereof executing computer readable instructions and, in some instances, the use of any hardware peripherals known within the art. Controller 118 may be illustrative of various processing devices and peripherals integrated into a single circuit die or distributed to various locations of the robot 102 which receive, process, and output information to/from operative units 104 of the robot 102 to effectuate control of the robot 102 in accordance with instructions stored in a memory 120, 132. For example, controller 118 may include a plurality of processing devices for performing high-level tasks (e.g., planning a route to avoid obstacles) and processing devices for performing low-level tasks (e.g., producing actuator signals in accordance with the route).
[0085] Next, FIG. 1C will be discussed. FIG. 1C illustrates a planar light detection and ranging
(“LiDAR”) sensor 138 collecting distance measurements of a wall 142 along a measurement plane in accordance with some exemplary embodiments of the present disclosure. Planar LiDAR sensor 138 may be configured to collect distance measurements of the wall 142 by projecting a plurality of beams 140 of photons at discrete angles along the measurement plane and determining the distance of the wall 142 based on a time of flight (“TOF”) of the photons leaving the LiDAR sensor 138, reflecting off the wall 142, and returning back to the LiDAR sensor 138. A measurement plane of the LiDAR sensor 138 comprises a plane along which the beams 140 are emitted which, for this exemplary embodiment, is the plane of the page.
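As a non-limiting numerical sketch of the time-of-flight relationship described above (the constant and function names are illustrative), the distance to the wall 142 follows from the round-trip travel time of a beam 140:

    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    def distance_from_tof(round_trip_time_s):
        # The beam travels from the sensor to the wall and back, so the one-way
        # distance is half of the total path length covered during the time of flight.
        return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

For example, a round-trip time of roughly 66.7 nanoseconds corresponds to a distance of approximately 10 meters.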
[0086] One skilled in the art would appreciate that a plurality of sensors 138 may be positioned on a robot 102 chassis to enhance the navigation and localization capabilities of the robot 102 in addition to sensing or detecting objects around the robot 102, such as wall 142. This plurality of sensors 138 may be mounted in static positions (e.g., using screws, bolts, etc.) or may be mounted with servomotors configured to adjust the pose of the sensor 138 on the robot 102. Calibration of these sensors 138 may be essential for the robot 102 to navigate through an environment safely and perform complex tasks with high precision. Calibration of sensors 138 may degrade over time due to, for example, wear and tear, collisions with objects or people, and/or electrical components of the sensor performing abnormally due to, e.g., temperature fluctuations.
[0087] According to at least one non-limiting exemplary embodiment, LiDAR sensors may utilize Doppler frequency shifts of reflected beams 140 to measure a speed of an object relative to the sensor 138. These forms of LiDAR sensors are typically used to measure speed using a single beam 140, such as for traffic speed enforcement. For example, for a stationary sensor 138 measuring a speed and distance to an object approaching the sensor 138, the reflected beam 140 may include a ToF proportionate to the distance between the sensor 138 and the object as well as a blue-shifted frequency (i.e., a higher frequency than the emitted beam 140), or vice versa if the object is moving away from the sensor 138 (i.e., red-shift). That is, the alignment targets 202 shown in the following figures may be utilized to calibrate any LiDAR sensor including, but not limited to, planar LiDAR sensors and/or directional laser sensors (e.g., for autonomous vehicles measuring the speed of nearby vehicles).
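As a non-limiting sketch of the Doppler relationship described above, under a non-relativistic approximation and with illustrative function and variable names, the relative speed may be recovered from the frequency shift of the reflected beam 140:

    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

    def relative_speed_from_doppler(emitted_freq_hz, reflected_freq_hz):
        # For a beam reflected off a moving object, the observed shift is
        # approximately delta_f = 2 * f_emitted * v / c; inverting gives the speed.
        # A positive result indicates an approaching (blue-shifted) object and a
        # negative result a receding (red-shifted) object.
        delta_f = reflected_freq_hz - emitted_freq_hz
        return SPEED_OF_LIGHT_M_PER_S * delta_f / (2.0 * emitted_freq_hz)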
[0088] Next, FIG. 2A will be discussed. FIG. 2A illustrates an alignment target 202 and high- level circuit diagrams of components thereof, according to an exemplary embodiment. The alignment target 202 may comprise a plurality of detection indication units 204 arranged in a linear array, each further comprising an infrared (“IR”) detector diode 302, amplifier 314, threshold logic 318, and a visible light diode 322 as further illustrated below in FIG. 3. Each of the detection indication units 204 may be configured to detect incident IR light from a LiDAR sensor, or other sensor (e.g., radar), and produce visible light upon detection of the incident IR light. The alignment target 202 may further comprise a target 206 or alternatively referred to as a reference target 206, comprising a visible light diode configured to provide a reference target corresponding to a desired intersection location of the incident light from the sensor with the alignment target 202 to aid in calibration of the sensor. The target 206 may additionally comprise a direct current (“DC”) voltage source 210, Von, such that the target LED 212 remains on during calibration of the LiDAR. The target 206 may output or display a same or different color light than the visible light diodes 322 of the detection indication units 204, dependent on a specific color of LED 212 utilized which may be any visible color, such that the target 206 may be easily distinguishable by an operator from visible light diodes 322 of the detection indication units 204. A location of the target 206 on the alignment target 202 corresponds to a desired point of intersection between a beam or measurement plane of a sensor with the alignment target 202, the desired point of intersection corresponding to a calibrated sensor. The alignment target 202 may be positioned proximate a sensor 138 such that a well-calibrated sensor 138 on a robot 102 may comprise a measurement plane which intersects the alignment target 202 at the location of the reference target 206.
[0089] The alignment target 202 may comprise N detection indication units 204 arranged in a linear array as illustrated, wherein N may be any positive integer number greater than one and chosen based on practical limitations (e.g., desired size of the alignment target 202, spacing of each detection indication unit 204, cost, etc.). The target 206 may be positioned adjacent to any one of the plurality of detection indication units 204. Additionally, the alignment target 202 may comprise a power supply 208 such as, for example, batteries, a USB port (e.g., USB, micro-USB, USB-C, etc.), alternating current plug with a rectifier circuit, and the like. In the non-limiting exemplary embodiment illustrated, the power supply 208 couples with connectors 201 which receive power from wired connections (e.g., from an external power source, such as a wall-plug, battery, generator, etc.). Power indication diodes 203 may illuminate when power is received (e.g., the diodes 203 may turn on, change from red (power-off) to green (power-on), flash, blink, etc.). It will be appreciated by one skilled in the art that the target 206 may be positioned substantially close to the linear array of the plurality of detection indication units 204 (e.g., within 1 cm), wherein, as illustrated in FIGS. 2A-2B, the spatial separation has been greatly exaggerated for clarity.
[0090] The alignment target 202 may be configured to receive IR light from a sensor at a substantially normal incident angle to the surface of the alignment target 202 (i.e., substantially normal to the plane of the page). Accordingly, the IR beams along the line of sight of the sensor plane may be detected by one or more detection indication units 204 to illuminate one or more visible light diodes 322, illustrated below in FIG. 3, such that an operator may visualize the location where the incident beam of the sensor intersects with the alignment target 202. That is, a single alignment target 202 may aid in visualizing a measurement plane of the sensor along one dimension or using a single point of intersection. A plurality of (e.g., two or more) alignment targets 202, separated spatially, may be utilized to visualize a two-dimensional measurement plane of a LiDAR sensor by an operator, yielding an unconventional result wherein the LiDAR sensor on a device, such as a robot 102, may be accurately tuned manually because the visualized measurement plane provides the operator with instantaneous visual feedback regarding the exact location and orientation of the measurement plane, as further illustrated below in FIGS. 4A-B.
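Because three non-collinear points define a plane, the measurement plane visualized in this manner may also be reconstructed numerically, as in the following non-limiting sketch; the common coordinate frame and the function name are assumptions made only for illustration:

    import numpy as np

    def measurement_plane_normal(sensor_origin, point_a, point_b):
        # Unit normal of the plane through three non-collinear points: the LiDAR
        # sensor origin and the two illuminated detection indication units on two
        # spatially separated alignment targets, all expressed in one reference frame.
        p0, p1, p2 = (np.asarray(p, dtype=float) for p in (sensor_origin, point_a, point_b))
        normal = np.cross(p1 - p0, p2 - p0)
        return normal / np.linalg.norm(normal)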
[0091 ] According to at least one non-limiting exemplary embodiment, an alignment target 202 may comprise a plurality of linear arrays of detection indication units 204 positioned parallel or perpendicular to the single linear array such that additional intersection points of a measurement plane of a LiDAR sensor with an alignment target 202 may be visualized by a human. That is, a single array of detection indication units 204 for an alignment target 202 is not intended to be limiting. According to at least one non-limiting exemplary embodiment, an alignment target 202 may further comprise a plurality of reference targets 206 arranged linearly and parallel to a linear array of detection indication units 204, wherein one reference target 206 may be chosen to be illuminated based on a desired point of intersection of a sensor beam with the alignment target 202. A user interface, push buttons, or wired/wireless signal may be utilized to select which target 206 of the plurality of targets 206 to be illuminated based on a desired pose of a sensor being calibrated. Additional exemplary embodiments of an alignment target 202 are further illustrated below in FIG. 2C.
[0092] According to at least one non-limiting exemplary embodiment, an alignment target 202 may comprise a plurality of detection indication units 204 positioned horizontally, wherein the vertical arrangement of the detection indication units 204 is not intended to be limiting. One skilled in the art would appreciate that the plurality of detection indication units 204 may alternatively be positioned at any angle (e.g., vertically, horizontally, or anywhere in between). Additionally, the alignment target 202 is configured to be light (e.g., less than 5 kg) to enable users to place the alignment targets 202 in any position and/or orientation useful for calibrating sensors on a robot 102. Similarly, according to another non-limiting exemplary embodiment, an alignment target 202 may comprise any number of linear arrays of detection indication units 204, wherein use of a single linear array is not intended to be limiting, as illustrated in FIG. 2C(i). Alternatively, one skilled in the art would appreciate that multiple alignment targets 202 each comprising a single linear array may be utilized with inventive concepts disclosed herein, as illustrated in FIG. 4A-B below.
[0093] According to at least one non-limiting exemplary embodiment, a target 206 of an alignment target 202 may comprise a designated detection indication unit 204 of the plurality of detection indication units 204. The designated detection indication unit 204 (i.e., the target) may, for example, output a different color than the other detection indication units 204 or otherwise differentiate itself from the rest. Stated differently, one of the plurality of detection indication units 204 may be configured to act as the target 206 by illuminating a different visible color than the remaining detection indication units 204, as illustrated in FIG. 2C(iii). This embodiment of an alignment target 202 may further comprise a user interface (e.g., push buttons, remote controller and receiver, etc.) which may adjust which detection indication unit 204 is designated as the target.
[0094] According to at least one non-limiting exemplary embodiment, the alignment target
202 may further comprise a controller and non-transitory memory, such as processor 130 and memory 132 illustrated in FIG. IB, configured to receive input from the detection indication units 204, the input comprising a logical 1 or 0 signal corresponding to a detection or no detection, respectively, of nonvisible light from a sensor for a given detection indication unit 204. Using detection data from the detection indication units 204 (i.e., which one receives incident nonvisible light from the sensor), known spacing between each of the detection indication units 204, and a known position of the target 206 (e.g., the target may be one of the plurality of detection indication units 204 or a separate adjacent target 206 as illustrated), a spatial discrepancy 218 may be determined as illustrated below in FIG. 2B. This spatial discrepancy 218 may be further utilized to determine adjustments to the sensor, as further illustrated below in FIGS. 4-6.
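One non-limiting way such a controller may compute the spatial discrepancy 218 from the logical detection indication signals is sketched below; the function names, the assumption of a single illuminated unit, and the sign convention are illustrative only:

    def illuminated_unit_index(detection_signals):
        # Index of the detection indication unit reporting logical high, or None
        # if the incident nonvisible light misses the linear array entirely.
        for index, is_high in enumerate(detection_signals):
            if is_high:
                return index
        return None

    def spatial_discrepancy(detection_signals, target_index, unit_spacing_m):
        # Signed discrepancy 218 along the linear array, in meters, between the
        # illuminated detection indication unit and the designated target position.
        hit_index = illuminated_unit_index(detection_signals)
        if hit_index is None:
            return None
        return (hit_index - target_index) * unit_spacing_m

For example, with a unit spacing of 0.01 m, a beam striking the detection indication unit two positions away from the designated target yields a discrepancy of magnitude 0.02 m.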
[0095] According to at least one non-limiting exemplary embodiment, the alignment target
202 may further comprise an input configured to adjust the brightness of the various LEDs 204 and 206. For example, a knob coupled to a potentiometer may be used to adjust the light intensity to a suitable level for human use. Other inputs are equally applicable, such as push buttons (e.g., to step the intensity up or down), a slider, and/or modulation of the output light (e.g., using a pulse width modulated signal). [0096] According to at least one non-limiting exemplary embodiment, the alignment target
202 may be fully or partially enclosed within a transparent covering, such as one made from plastics, glass, or other transparent materials. It is appreciated that only the illustrated side of the alignment target 202 is required to include such transparent covering such that incident light from a LiDAR sensor 138 may be received by the detection indication units 204 and to enable light emitted from the detection indication units 204 to be visible to humans.
[0097] FIG. 2B illustrates an alignment target 202 being utilized to visualize a measurement plane 214 of a LiDAR sensor 138 for calibration of the LiDAR sensor 138, according to an exemplary embodiment. The LiDAR sensor 138 may be mounted on a device (not shown) using, for example, screws, servomotors, or bolts which may be adjusted to change a pose (i.e., orientation) of the LiDAR sensor 138. Upon the LiDAR sensor 138 being activated, a plurality of beams 140 may be emitted across measurement plane 214, as illustrated in FIG. 1C above, which intersect with the alignment target at a location indicated by an illuminated detection indication unit 204-0. One skilled in the art would appreciate that the positioning of the LiDAR sensor 138 illustrated in FIG. 2B is representative and not limited to this particular configuration. The LiDAR sensor 138 may be positioned and oriented in different configurations with respect to the alignment target 202.
[0098] A human calibrating the LiDAR sensor 138 may desire the measurement plane 214 to intersect the alignment target 202 at a location of target 206. The human, or manual intervention, may position the alignment targets 202 such that the target 206 is at a location where the measurement plane of the LiDAR sensor 138 should intersect if the sensor 138 is well calibrated. The human may calibrate the LiDAR sensor 138 by physically or electronically adjusting the pose of the LiDAR sensor on the device until the measurement plane 214 is at a desired measurement plane 216 (noted by dashed lines). By adjusting the LiDAR sensor 138, the measurement plane is essentially being adjusted from the first measurement plane 214 to the second, or desired, measurement plane 216. Measurement plane 216 may intersect the alignment target 202 at the target 206 location as indicated by a detection indication unit 204-C directly adjacent to the target 206, wherein detection indication unit 204-C, which is not illuminated, becomes illuminated once measurement plane 216 intersects the alignment target 202 at the target 206 location. This change in illumination of the detection indication unit 204-C may indicate to the operator that the LiDAR sensor 138 is properly calibrated. It is appreciated that as the pose of the LiDAR sensor 138 is adjusted from measuring across the first plane 214 to the second plane 216, different detection indication units 204 between detection indication units 204-0 and 204-C may illuminate sequentially as the pose is adjusted, thereby providing the human with instant visual feedback of a current pose of the measurement plane of the sensor, which improves manual calibration by the human in both precision and speed. One skilled in the art may appreciate that visualization of the measurement plane 214 of the LiDAR sensor 138 may require use of an additional alignment target 202, as a minimum of three points are required to define a plane, one of the points comprising the LiDAR sensor 138 itself and the other two remaining points corresponding to detection indication units 204 illuminated by the light emitted by the LiDAR sensor 138, as illustrated in FIG. 4A-B below.
[0099] According to at least one non-limiting exemplary embodiment, an alignment target 202 may further comprise a microprocessor or controller configured to determine a discrepancy 218 between a target 206 (and adjacent detection indication unit 204-C) and a currently illuminated detection indication unit 204-0. The discrepancy 218 may be measured parallel to the linear array of detection indication units 204 as illustrated and based on (i) a number of detection indication units 204 between an illuminated detection indication unit 204-0 and the target 206, (ii) a distance between the alignment target 202 and a sensor 138, and (iii) a spacing between adjacent detection indication units 204. The microprocessor or controller may then determine adjustments to a pose of a sensor 138 based on the discrepancy 218, as illustrated in FIG. 6 below. In this embodiment, the target 206 may be replaced with a designated detection indication unit 204 of the plurality of detection indication units 204 (e.g., detection indication unit 204-C), wherein the microprocessor or controller may receive data comprising which detection indication unit 204 is designated as a target.
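Under simple geometric assumptions (a single planar beam and a known sensor-to-target distance), the adjustment determination described above may reduce to the following non-limiting sketch, wherein the function name and sign convention are illustrative:

    import math

    def angular_correction_deg(discrepancy_m, sensor_to_target_m):
        # Tilt adjustment that would move the intersection point onto the target 206,
        # computed from the discrepancy 218 measured along the linear array and the
        # distance between the sensor 138 and the alignment target 202.
        return math.degrees(math.atan2(discrepancy_m, sensor_to_target_m))

For example, a discrepancy of 0.05 m observed 2 m from the sensor corresponds to an angular correction of approximately 1.4 degrees.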
[00100] As illustrated from here on with respect to FIGS. 2C-6 below, solid black squares representing detection indication units 204 correspond to a detection indication unit 204 detecting no incident nonvisible light (e.g., detection indication unit 204-C illustrated in FIG. 2B), whereas empty, white, or non-black squares representing detection indication units 204 may correspond to the detection indication units 204 detecting incident nonvisible light from a sensor (e.g., illuminated detection indication unit 204-0 illustrated in FIG. 2B). Targets 206 represented with empty squares correspond to a currently active target (i.e., a target 206 comprising an illuminated LED 212), wherein an alignment target 202 may comprise a plurality of other targets 206 which are disabled and/or not illustrated for clarity. That is, black squares indicate no visible light being emitted from a component (e.g., 204/206), while white squares indicate visible light being emitted from a component.
[00101] Next, FIGS. 2C(i)-(iii) will be discussed in detail. FIGS. 2C(i)-(iii) illustrate three additional non-limiting exemplary embodiments of an alignment target 202. First, FIG. 2C(i) illustrates an alignment target 202(i) comprising two linear arrays of detection indication units 204 and two separate targets 206 as illustrated. A planar LiDAR sensor may illuminate both linear arrays at points indicated by detection indication units 204-0, illustrated with empty boxes, thereby providing an additional data point from which adjustments to a pose of the planar LiDAR may be determined for calibration, wherein the adjustments may be determined by both discrepancies 218. The two linear arrays may comprise a same or differing number of detection indication units 204 positioned parallel to each other, as illustrated, or at an angle. The two discrepancies 218 shown in FIG. 2C(i) may be of the same or different values. Use of two linear arrays of detection indication units 204 may enable an operator to visualize a measurement plane of a LiDAR sensor by providing two of the three spatial data points required to define the measurement plane, the third point being the position of the LiDAR sensor itself. For example, based on the two illuminated detection indication units 204-0 illustrated, an operator may visualize that the measurement plane of the LiDAR sensor intersects the alignment target 202(i) below the targets 206, as shown by discrepancies 218. Further, the illuminated detection indication units 204-0 communicate to the operator that the LiDAR sensor comprises an incorrect orientation (i.e., rotation), as shown by the discrepancies 218 comprising unequal magnitudes while the targets 206 are configured substantially horizontal with respect to each other.
[00102] Next, alignment target 202(ii) shown in FIG. 2C(ii) may comprise a linear array of detection indication units 204 and a linear array of targets 206, wherein an active target 206-0 (i.e., a target 206 currently being utilized comprising an illuminated LED 212) may be chosen from the plurality of targets 206 illustrated. The linear array of targets 206 is configured parallel to the linear array of detection indication units 204. A user may designate which target 206 of the plurality is the active target 206-0 using buttons 220, or other similar input. The buttons 220 comprise up and down buttons configured to move the active target 206-0 up or down, respectively, by one space along the linear array of targets 206. This embodiment may enable a single alignment target 202(ii) to be utilized to calibrate multiple different sensors (e.g., LiDAR sensors 138) by simply adjusting a position of the active target 206-0, provided the sensors utilize a similar wavelength of light (i.e., within a passband of a photodiode 302). Alternatively, the positioning of the active target 206-0 may be determined by a processor of the alignment target 202(ii) or a separate processor communicatively coupled to the alignment target 202(ii) (e.g., as shown in FIG. 6). Accordingly, discrepancy 218, comprising a spatial discrepancy between the active target 206-0 and the detection indication unit 204-0 receiving incident light from a LiDAR sensor, may be determined as illustrated. Discrepancy 218 may indicate to the operator viewing the alignment target 202(ii) that the current positioning of a measurement plane of the LiDAR sensor intersects the alignment target 202(ii) at a location lower than the target or desired location (indicated by active target 206-0).
[00103] Lastly, alignment target 202(iii), as shown in FIG. 2C(iii), may comprise a single linear array of detection indication units 204 and no stand-alone targets 206. In this embodiment, one detection indication unit 204-T may be designated as the target, wherein discrepancy 218 may be measured from the designated detection indication unit 204-T to a detection indication unit 204-0 outputting a logical high detection indication signal 320, illustrated in FIG. 3. The designated target 204-T may output a same or different color than any illuminated detection indication unit 204 (e.g., 204-T may output a green color while 204-0 may output a red color). A processor may be added or communicatively coupled to the alignment target 202(iii) to set the target detection indication unit 204-T and measure the discrepancy 218 as illustrated, based on a number of detection indication units 204 between units 204-T and 204-0 as well as the distance between each detection indication unit 204 (e.g., 5 millimeters). Alternatively, push buttons 220, or other similar input, may be utilized to step the position of the target detection indication unit 204-T up or down the linear array. Advantageously, the alignment target 202(iii) may enable configurability of a target 204-T, similar to the configurability of target 206-0 of FIG. 2C(ii), while occupying less space on a printed circuit board and/or requiring fewer components to manufacture.
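For illustration only, the button-driven designation of a target unit described above for alignment targets 202(ii) and 202(iii) could be tracked by a small state holder such as the following; the class name, array length, and 5 mm pitch are hypothetical values not specified by this disclosure.

```python
class ConfigurableTarget:
    """Tracks which detection indication unit is designated as target 204-T and
    reports the discrepancy 218 to the currently illuminated unit 204-0."""

    def __init__(self, num_units: int, unit_spacing_mm: float, target_idx: int = 0):
        self.num_units = num_units
        self.unit_spacing_mm = unit_spacing_mm
        self.target_idx = target_idx

    def step_up(self) -> None:
        # Up button 220: move the designated target one unit up the array.
        self.target_idx = min(self.target_idx + 1, self.num_units - 1)

    def step_down(self) -> None:
        # Down button 220: move the designated target one unit down the array.
        self.target_idx = max(self.target_idx - 1, 0)

    def discrepancy_mm(self, illuminated_idx: int) -> float:
        # Signed distance from the designated target unit to the illuminated unit.
        return (illuminated_idx - self.target_idx) * self.unit_spacing_mm

# Designate unit 10 of a 20-unit, 5 mm pitch array as the target, then step it up once.
target = ConfigurableTarget(num_units=20, unit_spacing_mm=5.0, target_idx=10)
target.step_up()
print(target.discrepancy_mm(illuminated_idx=8))   # -15.0 mm: beam strikes three units low
```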
[00104] Any alignment target 202 comprising a single linear array of detection indication units 204 and a single target 206 (e.g., as depicted above in FIG. 2A) illustrated herein with respect to FIGS. 3-6 is not intended to be limiting to the illustrated embodiment. Additionally, an alignment target 202 may include, but is not limited to, any combination of features illustrated in FIGS. 2C(i)-(iii) (e.g., two linear arrays of detection indication units 204 with no stand-alone targets 206). One skilled in the art may appreciate that the specific configuration of alignment target 202 used may be based on parameters such as, for example, cost, size, complexity of operation, number of alignment targets 202 used, and/or power consumption, which may be considered by the operator when deciding the configuration of the alignment target 202 utilized.
[00105] It is understood by one skilled in the art that an analog circuit (i.e., comprising no processor or memory) configured to output a visible light output to a human to illustrate a location of intersection of a nonvisible sensor beam with an alignment target 202 (e.g., FIG. 3) may instead be replaced by a digital system (i.e., comprising a processor and memory) configured to receive logical outputs (i.e., 0 or 1) from the detection indication units 204 corresponding to the location of intersection rather than producing visible light outputs. The digital system may include the alignment target 202 further comprising a specialized processor and non-transitory memory configured to determine a spatial discrepancy 218 between the location of intersection 204-0 and a target 206 based on a detection indication signal 320 (illustrated below in FIG. 3), wherein the target 206 may comprise a designated one of the plurality of detection indication units 204 (e.g., 204-C) rather than a stand-alone target (e.g., target 206 as illustrated) in this embodiment. The processor may utilize the spatial discrepancy 218 to determine any adjustments to a pose of the sensor to configure a beam of the sensor to intersect with the alignment target 202 at a desired location. If a processor is to determine and perform adjustments to the sensor, use of visible light diodes for the target 206 and detection indication units 204 may be a redundant feature. A digital system configured to measure the spatial discrepancy 218 and determine adjustments to a pose of a sensor is further illustrated below in FIG. 6. That is, feedback provided by alignment targets 202 for use in calibrating a sensor on a device is not intended to be limited to visible light emitting diodes providing feedback to a human for manual calibration of the sensor.
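A hedged sketch of how such a digital system might reduce the logical outputs of the detection indication units 204 to a location of intersection and a spatial discrepancy 218 is shown below; treating the registered bits as a Python list, taking a centroid when beam spreading lights several adjacent units, and the 5 mm pitch are all illustrative assumptions rather than the actual firmware.

```python
from typing import List, Optional

def intersection_index(bits_320: List[int]) -> Optional[float]:
    # Index of the illuminated detection indication unit 204-0; the centroid is
    # used when beam spreading causes several adjacent units to read logical high.
    lit = [i for i, bit in enumerate(bits_320) if bit == 1]
    if not lit:
        return None   # no unit detected the nonvisible light
    return sum(lit) / len(lit)

def spatial_discrepancy_mm(bits_320: List[int], target_idx: int,
                           unit_spacing_mm: float) -> Optional[float]:
    # Discrepancy 218 between the designated target unit and the struck location.
    idx = intersection_index(bits_320)
    return None if idx is None else (idx - target_idx) * unit_spacing_mm

# Beam spread lights units 7 and 8 of a 20-unit array; the target is unit 10; 5 mm pitch.
bits = [0] * 7 + [1, 1] + [0] * 11
print(spatial_discrepancy_mm(bits, target_idx=10, unit_spacing_mm=5.0))   # -12.5 mm
```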
[00106] FIG. 3 illustrates a detection indication unit 204, and components thereof, according to an exemplary embodiment. The detection indication unit 204 may first comprise a nonvisible light photodiode 302 configured to, upon receipt of incident nonvisible light 304 (e.g., infrared (IR) or ultraviolet (UV) light) from a sensor (e.g., a LiDAR sensor), induce an output voltage configured to pull up a reference voltage 306 Vref on a line 310, wherein the reference voltage 306 may comprise a constant DC voltage supplied by a DC source located on the alignment target 202, such as power supply 208. Component 308 comprises a resistor with an impedance such that, if no light 304 is detected by photodiode 302, the voltage on line 312 is zero due to a voltage across the photodiode 302 being zero. Similarly, if the light 304 is present and detected by photodiode 302, the voltage on line 312 is an ‘on’ voltage of the photodiode (e.g., 0.7 volts). Line 312, as well as lines 310, 316, and 320 illustrated in FIG. 3, may be illustrative of PCB traces, wiring, or other low-impedance electrical means of transmitting electrical power or a voltage potential from one component to another.
[00107] The voltage of line 312 may be an input to an op-amp 314, or other similar amplifying circuit, such that a voltage difference between Vref of line 310 and the voltage of line 312, ΔV, may be amplified based on a gain of the op-amp (i.e., Gain × ΔV for an ideal op-amp). The value of the gain of the op-amp may be chosen based on, for example, a value of Vref, power consumption of the circuit, desired output voltage range of line 316, and/or other design choices which may be readily discernible by one skilled in the art. The amplified differential voltage output 316 may be passed to a threshold logic component 318 comprising a comparator circuit configured to output a logical high or low detection indication signal 320 (i.e., logical 1 or 0) to power a visible light diode 322. A logical high detection indication signal 320 may correspond to detection of the nonvisible light 304 by the photodiode 302 and a logical low detection indication signal 320 may correspond to no nonvisible light 304 being detected by the photodiode 302, wherein the detection indication signal 320 may be determined based on the output 316 from the amplifier 314 exceeding or falling below a threshold voltage level. The voltage value of the logical high detection indication signal 320 may comprise a turn-on voltage, or slightly larger voltage, of the visible light diode 322, and the logical low detection indication signal 320 voltage value may comprise a voltage lower than the turn-on voltage of the visible light diode 322 (e.g., 0 volts). Upon the threshold logic component 318 outputting a logical high detection indication signal 320, the visible light diode 322 may output visible light 324 to indicate to an operator that nonvisible light 304 from the sensor is received by the photodiode 302. Conversely, upon the threshold logic component 318 outputting a logical low detection indication signal 320, the visible light diode 322 may remain off and produce no visible light, corresponding to no nonvisible light 304 being detected by the photodiode 302.
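A minimal behavioral sketch of the amplify-and-threshold chain just described is given below, assuming idealized components; the gain, threshold, and photodiode voltages are placeholder values since the disclosure leaves them as design choices, and Vref is treated here as the 0 V baseline of line 312 for simplicity.

```python
def detection_indication_signal(photodiode_v: float,
                                v_ref: float = 0.0,
                                gain: float = 100.0,
                                threshold_v: float = 2.5) -> int:
    # Idealized amplifier 314 output 316: gain times the difference between
    # line 312 (photodiode voltage) and line 310 (Vref).
    amplified_316 = gain * (photodiode_v - v_ref)
    # Threshold logic 318 (comparator): logical high (1) only above the threshold.
    return 1 if amplified_316 >= threshold_v else 0

def led_emits_light(signal_320: int) -> bool:
    # Visible light diode 322 is driven only by a logical high signal 320.
    return signal_320 == 1

# Photodiode 'on' voltage of ~0.7 V when IR light 304 is incident, ~0 V otherwise.
print(led_emits_light(detection_indication_signal(0.7)))   # True  -> diode 322 emits light 324
print(led_emits_light(detection_indication_signal(0.0)))   # False -> diode 322 stays dark
```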
[00108] One skilled in the art may appreciate that a detection indication unit 204 may additionally comprise a plurality of additional circuit components not illustrated in FIG. 3 such as, for example, coupling and/or bypass capacitors, current limiting resistors (e.g., to limit brightness of light 324 emitted from diode 322), supply voltage circuits for Vref and the amplifier 314, Zener diodes, metal-oxide semiconductor (MOS) devices/circuits, and the like, all being well understood within the art. Similarly, some circuit components illustrated in FIG. 3 may be omitted or changed in some embodiments (e.g., amplifier 314 or resistor 308 may be omitted, line 310 may be connected to ground, etc.) without significantly altering the function and purpose of the detection indication unit 204. Additionally, it is appreciated that the photodiode 302 may be configured to be sensitive to any wavelength of incident light 304 such as, for example, ultraviolet, near IR, IR, or microwave, wherein a choice of photodiode 302 may depend on a wavelength of a sensor being calibrated. Further, the spatial positioning of the photodiode 302 and the visible light diode 322 may be configured on, e.g., a PCB such that the two diodes 302 and 322 are in substantially similar locations (e.g., up to 0.5, 1, 5, or 10 cm apart) to provide human users of an alignment target 202 with an accurate intersection location between a measurement plane of a LiDAR sensor 138 and the alignment target 202.
[00109] According to at least one non-limiting exemplary embodiment, line 320 may further include a variable resistor, such as a potentiometer, configured to adjust the brightness of the visible light 324 emitted by the visible light diode 322. The variable resistor may be coupled to a knob, switch, slider, or other tactile input to enable a human operator to adjust the output brightness of the diode 322.
[00110] According to at least one non-limiting exemplary embodiment, a detection indication signal 320 corresponding to detection or no detection of nonvisible light 304 by a photodiode 302 may be communicated to a processor or microcontroller using, for example, a register (e.g., flip flop, capacitor, etc.). The value stored in the register (i.e., logical 1 or 0) may be utilized by a processor or microcontroller to determine a location on an alignment target 202 where a measurement beam/plane from a sensor intersects the alignment target 202 based on which detection indication unit 204, of a plurality of linearly arranged detection indication units 204, produces a logical high detection indication signal 320. The processor may then utilize the determined location of intersection (i.e., which output 320 of the plurality of detection indication units 204 is logical 1 or high) to calculate a spatial discrepancy 218 between the location of intersection (e.g., 204-0) and a target (e.g., detection indication unit 204-C or separate target 206) to determine adjustments to a pose of the sensor based on minimizing the spatial discrepancy 218, as illustrated below in FIG. 4B and FIG. 6.
[00111] Next, FIG. 4A will be discussed. FIG. 4A illustrates a top view of a LiDAR sensor 402 and three alignment targets 202 positioned at a known fixed distance d from the sensor 402. The alignment targets 202 are used by an operator to determine the alignment or pose of the LiDAR sensor 402 (i.e., to calibrate the LiDAR sensor 402) and the measurement plane thereof, according to an exemplary embodiment. The LiDAR sensor 402 may be mounted on a device (not shown), such as a robot 102, and the mounting (i.e., pose of the LiDAR sensor 402 on the device) may be adjustable by the operator. LiDAR sensor 402 may include a LiDAR sensor 138 as shown and described in FIG. 1C above.
[00112] For example, the alignment targets 202 may be positioned in an environment configured for calibrating LiDAR sensors 402 of a robot 102. Operators of robots 102 may desire the LiDAR sensors 402 to be configured onto the robot 102 at a specific (x, y, z, yaw, pitch, roll) position. The environment may include a location where a robot 102, including the LiDAR sensors 402, may be fixed. The three alignment targets 202 may be positioned within the environment such that the reference targets 206 thereof intersect the measurement plane of the LiDAR sensor 402 when the LiDAR sensor 402 is well calibrated. Three points are required to define a plane, wherein the three points used by operators to determine the current measurement plane of the LiDAR sensor 402 may be visualized by detection indication units 204 which are illuminated (e.g., as shown in FIG. 2B). In some instances, the sensor 402 may be the third point which defines the measurement plane. The three reference targets 206 denote the ideal, or well-calibrated, measurement plane of the LiDAR sensor 402. To calibrate the LiDAR sensor 402, the operator may adjust the position of the LiDAR sensor 402 such that the illuminated detection indication units 204 align with the reference targets 206 for all three alignment targets 202.
[00113] The operator may, for example, desire a measurement plane of the LiDAR sensor 402 to be parallel to a flat floor below the LiDAR sensor 402 at a specific height above the floor. Accordingly, the reference targets 206 of the three alignment targets 202 are positioned at the desired height above the flat floor. To determine if the measurement plane is parallel to the flat floor, the operator may activate the LiDAR sensor 402 to send measurement beams 404 of IR light in all directions across a field of view along the measurement plane, wherein only some of the beams 404 are illustrated for clarity. The beams 404 may configure IR detection diodes 302, as illustrated in FIG. 3 above, to generate an output voltage which configures corresponding threshold logic units 318 to activate visible light diodes 322, indicating that a beam 404 has been detected by one or more of the IR detection diodes 302 of a detection indication unit 204 of each respective alignment target 202. It is appreciated that use of a LiDAR sensor 402 is not intended to be limiting, wherein the sensor 402 may comprise any sensor which utilizes nonvisible light to sense an environment along a plane (e.g., radar).
[00114] FIG. 4B illustrates a side view of the LiDAR sensor 402 and three alignment targets 202, according to the exemplary embodiment illustrated in FIG. 4A above. The LiDAR sensor 402 may emit a plurality of beams 404, wherein the three beams 404 illustrated comprise beams which are incident upon the three alignment targets 202. Accordingly, some detection indication units 204 located at intersection points of the plurality of beams 404 with alignment targets 202 may illuminate (i.e., illuminate LED 322 shown in FIG. 3), as illustrated by empty squares, indicating a location of intersection of a beam 404 with an alignment target 202. Target diodes 206 of the three respective alignment targets 202-L, 202-R, 202-C are positioned at a constant height h from the floor such that a measurement plane intersecting the alignment targets 202 at the locations of all target diodes 206 corresponds to a measurement plane parallel to the flat floor at the height h. The LiDAR sensor 402 may be mounted on a chassis of a device (e.g., robot 102) in an incorrect pose such that a measurement plane formed by beams 404 is not parallel to the floor as shown, wherein the alignment targets 202 may be utilized to correct the pose of the LiDAR sensor 402.
[00115] For example, the leftmost alignment target 202-L comprises a target 206 at the height h from the floor, wherein an illuminated detection indication unit 204 (white square) may comprise a discrepancy l from the target 206 (i.e., from a detection indication unit 204 adjacent to the target 206) due to the improper pose of the LiDAR sensor 402 as illustrated. Similarly, the rightmost alignment target 202-R may comprise a target 206 at the height h, wherein the illuminated detection indication unit 204 corresponds to the location where a beam 404 is detected or incident on the alignment target 202-R. Accordingly, the illuminated detection indication unit 204 (white square) yields a discrepancy r from the target 206 for the right alignment target 202-R. Lastly, the center alignment target 202-C may comprise no or negligible discrepancy between the target 206 and the illuminated visible light diode 322, corresponding to the forward beam 404 being aligned with the target 206 (i.e., no error), as illustrated by both the target 206 and the adjacent detection indication unit 204 being illuminated. However, one skilled in the art would appreciate that if there were a discrepancy in the center alignment target 202-C, it would be reflected similarly to the discrepancies of the right and left alignment targets 202-R, 202-L. Based on the discrepancies l, r, and zero measured by the respective left, right, and center alignment targets, an operator may adjust a mounting of the LiDAR sensor 402 along the roll axis illustrated such that the discrepancies l and r become zero, corresponding to the measurement plane of the LiDAR sensor being parallel to the floor and at constant height h from the floor. With discrepancies l and r being zero (i.e., for a well calibrated sensor 402), detection indication units 204 directly adjacent to the respective targets 206 of the left and right alignment targets 202-L and 202-R may both be illuminated, similar to the target 206 and illuminated detection indication unit 204 of the center alignment target 202-C.
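For illustration, the discrepancies of FIG. 4B could be converted into approximate roll and pitch corrections as sketched below; this assumes the left and right targets sit symmetrically about the sensor's forward axis and that the center target lies on that axis at the known range, a layout the disclosure does not mandate, so both the geometry and the sign conventions here are assumptions.

```python
import math

def roll_pitch_correction_deg(l_mm: float, r_mm: float, c_mm: float,
                              lateral_separation_mm: float,
                              forward_range_mm: float) -> tuple:
    # Assumed sign convention: a positive discrepancy means the beam struck below
    # its target. Roll is inferred from the left/right difference over the lateral
    # separation between the targets; pitch from the center discrepancy over range d.
    roll = math.atan2(l_mm - r_mm, lateral_separation_mm)
    pitch = math.atan2(c_mm, forward_range_mm)
    return math.degrees(roll), math.degrees(pitch)

# Beam 20 mm low on 202-L, 20 mm high on 202-R, aligned on 202-C, targets ~1 m apart/away:
print(roll_pitch_correction_deg(20.0, -20.0, 0.0, 1000.0, 1000.0))
# approximately (2.29, 0.0): correct ~2.3 degrees of roll, no pitch change needed
```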
[00116] One skilled in the art may appreciate that, upon configuring a measurement plane formed by beams 404 to be parallel to the floor at the height h from the floor, detection indication units 204 directly adjacent to respective targets 206 of the three alignment targets 202-L, 202-C, 202-R will illuminate. This corresponds to the angular pose of the LiDAR sensor 402 being configured correctly; however, the pose may still comprise a discrepancy in translational coordinates (i.e., x and y). Accordingly, distance measurements collected by beams 404 of LiDAR sensor 402 may be utilized to verify that the translational position of the LiDAR sensor 402 is correct. That is, distance measurements between the LiDAR sensor 402 and targets 202-L, 202-C, 202-R (e.g., using beams 404) may be verified to comprise the distance d. In some embodiments, the three alignment targets 202-L, 202-C, 202-R may each be positioned at different distances dL, dC, and dR, wherein positioning all three alignment targets 202-L, 202-C, 202-R at a constant distance d from the LiDAR sensor 402 is not intended to be limiting. In some embodiments, however, the translational position of the LiDAR sensor 402 may not be configurable due to a specific configuration of screws, bolts, latches, etc. used to couple the LiDAR sensor 402 to a device.
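As a brief, non-authoritative sketch of the translational check described above, the measured ranges to each target may simply be compared against the known setup distances; the dictionary keys, tolerance, and example values below are hypothetical.

```python
def translation_within_tolerance(measured_mm: dict, expected_mm: dict,
                                 tol_mm: float = 10.0) -> bool:
    # True only if every measured range to an alignment target matches the known
    # setup distance (d, or dL/dC/dR when they differ) within the tolerance.
    return all(abs(measured_mm[name] - expected_mm[name]) <= tol_mm
               for name in expected_mm)

expected = {"202-L": 1000.0, "202-C": 1000.0, "202-R": 1000.0}   # setup distances d
measured = {"202-L": 1004.0, "202-C": 998.0, "202-R": 1012.0}    # ranges from beams 404
print(translation_within_tolerance(measured, expected))          # False: 202-R is 12 mm off
```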
[00117] It is appreciated that, to define a measurement plane of the LiDAR sensor 402, a minimum of two alignment targets 202 may be utilized, as the illuminated visible light diodes 322 of each corresponding alignment target 202 (202-L, 202-R, 202-C) may define a point on the measurement plane, wherein three points comprise the minimum number of points to define a plane and one of the points comprises the LiDAR sensor 402 itself. The exact position of the LiDAR sensor 402 may be unknown to the operator during calibration, wherein use of three alignment targets 202 to yield three points of the measurement plane may further enhance visualization of the measurement plane for the operator without relying on a known position of the LiDAR sensor 402, thereby enhancing calibration speed and capabilities of the operator. Additionally, the position of the targets 206 of each corresponding alignment target 202 may be set based on a desired pose (i.e., (x, y, z, roll, pitch, yaw)) of the LiDAR sensor 402 on the device and/or a field of view of the LiDAR sensor 402, wherein the positions of the targets 206 illustrated are not intended to be limiting. For example, the leftmost target 206 of alignment target 202-L may be positioned higher than a target 206 of alignment target 202-R for a LiDAR sensor measuring at a slant angle. One skilled in the art would appreciate that although three alignment targets (202-L, 202-R, 202-C) are illustrated, more or fewer alignment targets may be employed to practice the inventive concepts disclosed herein.
[00118] One skilled in the art may appreciate that, in practice, multiple (e.g., 2-3) detection indication units 204 may illuminate upon activation of the LiDAR sensor 402 due to spreading of the beams 404 (e.g., point spread through the aperture of sensor 402, natural scattering in air, etc.), which may cause multiple photodiodes 302 to receive light from the LiDAR sensor 402 and illuminate multiple visible light diodes 322 at the same time. Accordingly, the distance d may be chosen to comprise a reasonable distance from the LiDAR sensor 402 wherein spreading effects are minimal (e.g., spreading effects may illuminate at most 2-3 detection indication units 204 with nonvisible light 404); however, reducing d substantially may reduce the angular resolution of the linear array of detection indication units 204. The distance d may therefore be chosen to be, for example, 0.5 to 2 meters. As shown in FIG. 4B, distance d corresponds to the distance from the LiDAR sensor to the respective alignment targets (202-L, 202-R, 202-C).
[00119] Advantageously, use of visible light diodes 322 of detection indication units 204 to indicate a position of an IR beam 404 from a LiDAR sensor, or any other nonvisible light sensor, may enhance human abilities to manually adjust a LiDAR sensor 402 on a device by enabling a human to visualize the IR beams 404, which are not visible to human eyes. Additionally, as the human performs adjustments to the LiDAR sensor 402, the human receives instant visual feedback from the visible light diodes 322 of the current pose of the measurement plane of the LiDAR sensor 402, thereby further enhancing accuracy and efficiency of the human manually adjusting the LiDAR sensor 402 to a desired pose. The accuracy of the manual adjustment may depend on the spacing between adjacent visible light diodes 322 as well as the distance d between an alignment target 202 and the LiDAR sensor 402. As an example, alignment targets 202 positioned 1 meter from the LiDAR sensor 402 and comprising visible light diodes 322 separated vertically by 5 millimeters may enable the human to adjust an angle of the sensor with an angular precision of approximately 0.29° (i.e., arctan(0.005 m / 1 m), neglecting spreading effects), vastly more precise than an unaided human could achieve. Lastly, use of targets 206 positioned at specific locations on each alignment target 202 may provide an additional reference target for the human during adjustment of the LiDAR sensor 402.
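The angular precision figure above follows from the simple relation sketched below; the spacing and range values are only the example numbers from the preceding paragraph.

```python
import math

def angular_resolution_deg(unit_spacing_m: float, range_m: float) -> float:
    # Smallest tilt change that moves the beam by one detection indication unit
    # at the given range, neglecting beam spreading.
    return math.degrees(math.atan2(unit_spacing_m, range_m))

print(angular_resolution_deg(0.005, 1.0))   # ~0.29 degrees per 5 mm unit at 1 m
print(angular_resolution_deg(0.005, 2.0))   # ~0.14 degrees: farther targets resolve finer tilts
```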
[00120] Next, FIG. 5 will be discussed. FIG. 5 illustrates a method 500 for a human operator of a device, such as a robotic system (e.g., robot 102 of FIG. 1A) comprising a LiDAR sensor, to calibrate the LiDAR sensor using at least one alignment target 202, according to an exemplary embodiment. The LiDAR sensor described in method 500 may comprise a planar LiDAR, as illustrated in FIG. 1C, or a line-of-sight LiDAR (i.e., configured to measure distance along a one-dimensional line of sight).
[00121] Block 502 comprises the human operator positioning the at least one alignment target
202 at a known position relative to the device comprising the LiDAR sensor. Specifically, the positions of the reference target 206 diodes are configured in known positions corresponding to a desired configuration of a measurement plane of a LiDAR sensor. The alignment target 202 is positioned within a field of view of the LiDAR sensor. The known position may comprise a distance from the LiDAR sensor to the respective targets 206 of the alignment targets 202, as illustrated by distance d and height h of FIG. 4B above. Target diodes 206 of the at least one alignment target 202 may be at fixed and known locations on the at least one alignment target 202, wherein the at least one alignment target 202 may be positioned by aligning the target diodes 206 at desired intersection points of the LiDAR sensor beams 404 and the at least one alignment target 202.
[00122] Block 504 comprises the human operator activating the LiDAR sensor. The LiDAR sensor may be activated via, e.g., a user interface 112 of a robot 102, which configures the robot 102 into a calibration mode. The calibration mode may cause the controller 118 of the robot 102 to activate one or more LiDAR sensors 114 based on a user input to the user interface 112. For example, the user interface 112 may provide a menu comprising a plurality of user-selectable options (e.g., “calibrate LiDAR 1”, “calibrate LiDAR 2”, etc.) which enable the human operator to unit-test each LiDAR sensor individually. The LiDAR sensor may emit a beam along a fixed line of sight incident on a single alignment target 202 or may emit a plurality of beams along a measurement plane incident on multiple alignment targets 202.
[00123] Block 506 comprises the human operator determining if the illuminated detection indication units 204 of the at least one alignment target 202 match the target diodes 206 of each respective alignment target 202, wherein matching corresponds to a detection indication unit 204 directly adjacent to a target 206 being illuminated. “Adjacent” refers to the direction orthogonal to the line formed by the array of detection indication units 204. Advantageously, this step may be performed quickly by the human operator, as the visible light emitted by the visible light diodes 322 of the detection indication units 204 provides rapid feedback of the current pose of the measurement line of sight or plane, and therefore the pose of the LiDAR sensor.
[00124] Upon the human operator determining that the illuminated visible light diodes 322 match the target 206, the human operator may determine the LiDAR sensor is sufficiently calibrated.
[00125] Upon the human operator determining that the illuminated visible light diodes 322 do not match the target 206, the human operator may move to block 508.
[00126] Block 508 comprises the human operator performing an adjustment to a mounting of the LiDAR sensor. The adjustment may be performed by adjusting one or more screws, bolts, latches, etc. such that x, y, z, yaw, pitch, and/or roll of the LiDAR sensor is modified. The adjustments performed in block 508 may be illustrative of small changes to the pose of the LiDAR sensor, wherein blocks 506 and 508 in conjunction are illustrative of an iterative process of checking that the measurement plane of the LiDAR sensor corresponds to the at least one target diode 206, adjusting the mounting of the LiDAR sensor if it does not, and iterating until the illuminated visible light diodes 322 of the at least one alignment target 202 match the at least one target diode 206 of the respective alignment targets 202. The adjustments being performed are based on a spatial discrepancy between the illuminated visible light diodes 322 and a corresponding target diode 206 (e.g., discrepancies 218 of FIGS. 2C(i)-(iii), discrepancies l and r illustrated above in FIG. 4B, etc.).
[00127] According to at least one non-limiting exemplary embodiment, upon the human operator determining that the illuminated visible light diodes 322 match the target 206, the human operator may further verify that the distance measurements collected by the LiDAR sensor correspond to the known position of the at least one alignment target 202. For example, the human operator may, with respect to FIG. 4B above, verify that distance measurements collected by the LiDAR sensor 402 comprise the distance d to all three alignment targets 202-L, 202-C, 202-R, or other pre-determined distance values. In some instances, adjusting a translational position of the LiDAR sensor may not be possible (e.g., due to a configuration of screws, bolts, latches, etc. used to couple the LiDAR sensor to a device), wherein this additional step of verifying distance measurements is not intended to be limiting.
[00128] Advantageously, use of visible light diodes 322 to indicate a deviation from a desired alignment of a LiDAR sensor, defined by target diodes 206, leverages the natural ability of humans to recognize patterns when performing the alignment adjustments of block 508 to calibrate the LiDAR sensor. Stated differently, the degree or measure of adjustment needed from the operator is directly reflected by which visible light diode 322 of a respective detection indication unit 204 is activated, which enhances the ability of the operator to determine the needed adjustments by visualizing the measurements of the LiDAR sensor.
[00129] According to at least one non-limiting exemplary embodiment, the steps illustrated in blocks 506 and 508 may be performed by a separate microcontroller or processor configured to determine a discrepancy between a target diode 206 and the illuminated visible light diodes 322 of each alignment target 202. These discrepancies may be utilized by the microcontroller or processor to determine a pose of the sensor and any adjustments to the pose of the sensor required to achieve a desired (i.e., well calibrated) pose of the sensor, as illustrated next in FIG. 6. For example, a position of a LiDAR sensor on a robot 102 may be adjusted by controller 118 issuing signals to actuator units 108 coupled to the LiDAR sensor.
[00130] FIG. 6 is a functional block diagram of a system configured to utilize alignment targets
202 to adjust a pose of a sensor 402, according to an exemplary embodiment. As illustrated above in FIGS. 4A-B, spatial discrepancies between illuminated visible light diodes 322 (i.e., of detection indication units 204) and a target diode 206 on the alignment targets 202 may be determined (i.e., the values l and r illustrated above). These discrepancy values may be measured by the alignment targets 202 and communicated to a processing unit 602 via communications 606 comprising wired or wireless communication. Processing unit 602 may be illustrative of a processor and non-transitory computer readable memory, as illustrated above in FIGS. 1A-B, and may be located on a device comprising the sensor 402 or on a separate device (e.g., a cloud server). The processing unit 602 may perform an optimization algorithm (e.g., gradient descent, least squares, etc.) to determine a current pose of the sensor based on the spatial discrepancies measured by the alignment targets 202. Using the determined pose, the processing unit 602 may output adjustment instructions 608 to an adjustment unit 604. The adjustment unit 604 may comprise, for example, a user interface configured to provide instructions for an operator to manually adjust the sensor 402 to a desired pose (e.g., turn the top left screw by 5°) or servomotors configured to adjust the pose of the sensor on the device in response to a control signal of instructions 608. The adjustment unit 604 may be either directly coupled to the device of the sensor 402 or, alternatively, the adjustment unit 604 may be coupled to the device via a wired or wireless communication link. For example, the adjustment unit 604 may be illustrative of a user interface unit 112 of a robot 102, a microcontroller controlling actuators which may change a pose of the sensor, a virtual reality output (e.g., configured to enable an operator to visualize a measurement plane in virtual reality), and so forth. As adjustments to the pose of the sensor 402 are performed, the processing unit 602 may collect new discrepancy data, via communications 606 from the alignment targets 202, and utilize the new discrepancy data as a feedback loop to determine optimal pose adjustments to the sensor 402.
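The disclosure leaves the optimization performed by processing unit 602 open (gradient descent, least squares, etc.); purely as a hedged sketch, an ordinary least-squares fit of a tilted plane to the heights at which the beams struck each alignment target could look like the following, where the target coordinates, the use of NumPy, and the sign conventions are illustrative assumptions rather than the actual implementation.

```python
import numpy as np

def fit_measurement_plane(target_xy_m, strike_height_m):
    # Least-squares fit of z = a*x + b*y + c to the heights at which beams 404
    # struck each alignment target 202, with target (x, y) positions known from
    # the calibration setup; returns roll/pitch tilts of the current measurement
    # plane and its height at the origin, from which pose corrections may be derived.
    xy = np.asarray(target_xy_m, dtype=float)
    z = np.asarray(strike_height_m, dtype=float)
    A = np.column_stack([xy[:, 0], xy[:, 1], np.ones(len(z))])
    (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
    roll_deg = np.degrees(np.arctan(a))    # side-to-side tilt (slope across x)
    pitch_deg = np.degrees(np.arctan(b))   # fore-aft tilt (slope along y)
    return roll_deg, pitch_deg, c

# Three targets at distinct positions so all three plane parameters are observable.
positions = [(-0.5, 1.0), (0.0, 1.5), (0.5, 1.0)]   # (lateral x, forward y) in meters
heights = [0.18, 0.20, 0.22]                         # struck heights; desired plane: 0.20 m
print(fit_measurement_plane(positions, heights))     # ~ (2.29 deg roll, 0.0 pitch, 0.20 m)
```

In a closed-loop arrangement such as FIG. 6, such a fit would simply be repeated on fresh discrepancy data after each adjustment until the recovered tilts fall below a chosen tolerance.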
[00131] Although three alignment targets 202 have been illustrated, it is appreciated that any number of alignment targets 202 may be utilized to calibrate a pose of the sensor 402, wherein additional alignment targets 202 may improve the accuracy of adjustments to the pose of the sensor 402 determined by the processing unit 602. Similarly, the positions of the alignment targets 202 illustrated are not intended to be limiting.
[00132] According to at least one non-limiting exemplary embodiment, a single alignment target 202 may be moved across a measurement plane of the sensor 402 (e.g., in an arc at a constant distance from the sensor 402) to collect a discrepancy measurement as a function of angle or spatial position relative to the sensor 402. This function may additionally be utilized by processing unit 602 to determine a pose of the sensor 402 and any adjustments to the pose to achieve a desired pose of the sensor 402, as appreciated by one skilled in the art. It is further appreciated that use of a system illustrated in FIG. 6, wherein a separate processing unit determines a pose and adjustments to the pose of the sensor 402 based on locations of the incident beams on each alignment target 202, may replace the visible light diodes 322 with other threshold detection logic corresponding to nonvisible light being received by a respective photodiode 302. That is, if a processing unit 602 performs the discrepancy measurements and pose estimation of the sensor 402, use of visible light diodes may be redundant, as humans may not be required to analyze the discrepancies for calibration of the sensor 402.
[00133] It will be recognized that while certain aspects of the disclosure are described in terms of a specific sequence of steps of a method, these descriptions are only illustrative of the broader methods of the disclosure, and may be modified as required by the particular application. Certain steps may be rendered unnecessary or optional under certain circumstances. Additionally, certain steps or functionality may be added to the disclosed embodiments, or the order of performance of two or more steps permuted. All such variations are considered to be encompassed within the disclosure disclosed and claimed herein.
[00134] While the above detailed description has shown, described, and pointed out novel features of the disclosure as applied to various exemplary embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the device or process illustrated may be made by those skilled in the art without departing from the disclosure. The foregoing description is of the best mode presently contemplated of carrying out the disclosure. This description is in no way meant to be limiting, but rather should be taken as illustrative of the general principles of the disclosure. The scope of the disclosure should be determined with reference to the claims.
[00135] While the disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The disclosure is not limited to the disclosed embodiments. Variations to the disclosed embodiments and/or implementations may be understood and effected by those skilled in the art in practicing the claimed disclosure, from a study of the drawings, the disclosure and the appended claims.
[00136] It should be noted that the use of particular terminology when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being re-defined herein to be restricted to include any specific characteristics of the features or aspects of the disclosure with which that terminology is associated. Terms and phrases used in this application, and variations thereof, especially in the appended claims, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing, the term “including” should be read to mean “including, without limitation,” “including but not limited to,” or the like; the term “comprising” as used herein is synonymous with “including,” “containing,” or “characterized by,” and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps; the term “having” should be interpreted as “having at least;” the term “such as” should be interpreted as “such as, without limitation;” the term “includes” should be interpreted as “includes but is not limited to;” the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof, and should be interpreted as “example, but without limitation;” adjectives such as “known,” “normal,” “standard,” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass known, normal, or standard technologies that may be available or known now or at any time in the future; and use of terms like “preferably,” “preferred,” “desired,” or “desirable,” and words of similar meaning should not be understood as implying that certain features are critical, essential, or even important to the structure or function of the present disclosure, but instead as merely intended to highlight alternative or additional features that may or may not be utilized in a particular embodiment. Likewise, a group of items linked with the conjunction “and” should not be read as requiring that each and every one of those items be present in the grouping, but rather should be read as “and/or” unless expressly stated otherwise. Similarly, a group of items linked with the conjunction “or” should not be read as requiring mutual exclusivity among that group, but rather should be read as “and/or” unless expressly stated otherwise. The terms “about” or “approximate” and the like are synonymous and are used to indicate that the value modified by the term has an understood range associated with it, where the range may be ±20%, ±15%, ±10%, ±5%, or ±1%. The term “substantially” is used to indicate that a result (e.g., measurement value) is close to a targeted value, where close may mean, for example, the result is within 80% of the value, within 90% of the value, within 95% of the value, or within 99% of the value. Also, as used herein “defined” or “determined” may include “predefined” or “predetermined” and/or otherwise determined values, conditions, thresholds, measurements, and the like.

Claims

WHAT IS CLAIMED IS:
1. An alignment target apparatus configured to detect incident nonvisible light from a sensor, comprising:
a plurality of detection indication units arranged spatially in at least one linear array, each detection indication unit being configured to detect the incident nonvisible light outputted from the sensor; and
at least one target positioned on the alignment target at a location relative to the sensor, the location of the at least one target corresponding to a desired point of intersection between the incident nonvisible light and the alignment target apparatus, the desired point corresponding to the sensor being calibrated.
2. The apparatus of Claim 1, wherein,
the at least one target comprises a visible light emitting diode configured to visually represent the desired point of intersection between the alignment target and the incident nonvisible light from the sensor.
3. The apparatus of Claim 1, wherein each one of the plurality of detection indication units further comprises,
a threshold detection logic configured to:
determine if the nonvisible light from the sensor is incident on a detection indication unit based on an induced voltage from a photodiode; and
output a detection indication signal based on the induced voltage, the detection indication signal comprising either a logical high or logical low detection indication signal, the logical high detection indication signal corresponding to detection of the nonvisible light, and the logical low detection indication signal corresponding to no detection of the nonvisible light.
4. The apparatus of Claim 3, wherein,
the detection indication signal comprises an output voltage over a visible light diode, the visible light diode configured to,
emit visible light based on the output voltage if the detection indication signal is the logical high detection indication signal, and
not emit the visible light if the detection indication signal is the logical low detection indication signal.
5. The apparatus of Claim 3, further comprising:
a non-transitory computer readable storage medium; and
a processor configured to execute the computer readable instructions to:
determine at least one spatial discrepancy between the at least one target and an intersection point between the incident nonvisible light outputted from the sensor and the at least one alignment target apparatus, the intersection point being indicated by a detection indication signal outputted by one of the plurality of detection indication units if the detection indication signal is the logical high detection indication signal; and
minimize the at least one spatial discrepancy by adjusting a pose of the sensor.
6. The apparatus of Claim 5, wherein the processor is further configured to execute the computer readable instructions to,
adjust the pose of the sensor by either,
activating at least one servomotor, the at least one servomotor configured to adjust the pose of the sensor; or
providing instructions to a human via a user interface, the instructions prompt the human to perform the adjustments to the pose of the sensor manually.
7. The apparatus of Claim 5, wherein the at least one target further comprises:
a designated at least one detection indication unit of the plurality of detection indication units.
8. A method for calibrating a sensor on a device, the sensor being configured to emit nonvisible light to generate measurements of an environment, comprising:
utilizing at least one alignment target at a known position relative to the device to determine, for each alignment target, at least one spatial discrepancy between a location of at least one target and a location of at least one intersection point; and
minimizing the at least one spatial discrepancy by performing adjustments to a pose of the sensor;
wherein, an intersection point corresponds to a location on an alignment target where the nonvisible light is incident; and
a target corresponds to a desired location of the intersection point on an alignment target corresponding to a calibrated sensor.
9. The method of Claim 8, further comprising:
determining the intersection point based on a detection indication signal output from one of a plurality of linearly arranged detection indication units of an alignment target being logical high.
10. The method of Claim 9, further comprising:
determining the detection indication signal for a detection indication unit based on an induced voltage of a photodiode of the detection indication unit exceeding a value, the voltage being induced due to the nonvisible light from the sensor being incident on the photodiode.
11. The method of Claim 9, wherein,
the detection indication signal comprises an output voltage over a visible light diode, the visible light diode configured to,
emit visible light based on the output voltage if the detection indication signal is the logical high detection indication signal, and
not emit the visible light if the detection indication signal is the logical low detection indication signal.
12. The method of Claim 9, wherein,
the at least one target comprises a designated at least one detection indication unit of the plurality of detection indication units.
13. The method of Claim 9, wherein,
the at least one target comprises a visible light emitting diode configured to visually represent the desired location of the intersection point.
14. A non-transitory computer readable storage medium comprising a plurality of computer readable instructions embodied thereon, that when executed by a processor, configure the processor to:
determine at least one spatial discrepancy between the at least one target and an intersection point between the incident nonvisible light outputted from the sensor and the at least one alignment target apparatus, the intersection point being indicated by a detection indication signal outputted by one of the plurality of detection indication units if the detection indication signal is the logical high detection indication signal; and minimize the at least one spatial discrepancy by adjusting a pose of the sensor.
15. The non-transitory computer readable storage medium of Claim 14, wherein the processor is further configured to execute the computer readable instructions to,
perform the adjustments to the pose of the sensor by activating at least one servomotor configured to adjust the pose of the sensor.
16. The non-transitory computer readable storage medium of Claim 14, wherein the processor is further configured to execute the computer readable instructions to,
provide instructions to a human via a user interface to perform the adjustments to the pose of the sensor.
17. The non-transitory computer readable storage medium of Claim 14, wherein,
each of the at least one alignment targets further comprises a plurality of linearly arranged detection indication units, each detection indication unit further comprises a photodiode sensitive to a wavelength of the nonvisible light; and
the detection indication signal output is based on an induced voltage of a photodiode exceeding a value, the voltage being induced due to the nonvisible light from the sensor being incident on the photodiode.
18. The non-transitory computer readable storage medium of Claim 14, wherein,
the at least one target comprises a designated at least one of the plurality of detection indication units located at a desired location of the intersection point, the desired location corresponding to an intersection point of a calibrated sensor.
19. An alignment target apparatus configured to detect incident nonvisible light from a sensor, comprising:
a plurality of detection indication units arranged spatially in at least one linear array, each detection indication unit being configured to detect the incident nonvisible light from the sensor, each detection indication unit comprising:
a threshold detection logic configured to determine if the nonvisible light from the sensor is incident on a detection indication unit based on an induced voltage from a photodiode, the threshold detection logic outputs a detection indication signal based on the induced voltage from the photodiode, the detection indication signal comprising a logical high or low corresponding to a detection or no detection, respectively, of the incident nonvisible light by the photodiode, the detection indication signal comprises an output voltage over a visible light diode, the output voltage configures the visible light diode to emit visible light when the detection indication signal is logical high and produce no visible light when the detection indication signal is logical low; and at least one target positioned on the alignment target at a location relative to the sensor, the location of the at least one target corresponding to a desired point of intersection between the incident nonvisible light and the alignment target apparatus, the desired point corresponding to the sensor being calibrated.
PCT/US2020/043974 2019-07-30 2020-07-29 Systems and methods for calibrating nonvisible light emitting sensors using alignment targets WO2021021869A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962880334P 2019-07-30 2019-07-30
US62/880,334 2019-07-30

Publications (1)

Publication Number Publication Date
WO2021021869A1

Family

ID=74228689

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/043974 WO2021021869A1 (en) 2019-07-30 2020-07-29 Systems and methods for calibrating nonvisible light emitting sensors using alignment targets

Country Status (2)

Country Link
TW (1) TW202119055A (en)
WO (1) WO2021021869A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020064761A1 (en) * 2000-11-30 2002-05-30 Ripingill Allen E. Infrared laser transmitter alignment verifier and targeting system
JP2014107867A (en) * 2012-11-23 2014-06-09 Sick Ag Portable terminal for aligning sensor
US20190120934A1 (en) * 2017-10-19 2019-04-25 GM Global Technology Operations LLC Three-dimensional alignment of radar and camera sensors
US20190204427A1 (en) * 2017-12-28 2019-07-04 Lyft, Inc. Sensor calibration facility

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Alignment Bar Operating Instructions", INSTRUCTION MANUAL, 30 August 2018 (2018-08-30), pages 1 - 2, XP055790976 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113985906A (en) * 2021-10-28 2022-01-28 上海航天测控通信研究所 Vehicle-mounted mobile type calibration system and method based on unmanned aerial vehicle platform

Also Published As

Publication number Publication date
TW202119055A (en) 2021-05-16

Similar Documents

Publication Publication Date Title
US20210146942A1 (en) Systems, methods and apparatuses for calibrating sensors mounted on a device
US20210294328A1 (en) Systems and methods for determining a pose of a sensor on a robot
US11865731B2 (en) Systems, apparatuses, and methods for dynamic filtering of high intensity broadband electromagnetic waves from image data from a sensor coupled to a robot
US11892318B2 (en) Systems, apparatuses, and methods for bias determination and value calculation of parameters of a robot
US11886198B2 (en) Systems and methods for detecting blind spots for robots
US20210354302A1 (en) Systems and methods for laser and imaging odometry for autonomous robots
US20210232149A1 (en) Systems and methods for persistent mapping of environmental parameters using a centralized cloud server and a robotic network
US20220042824A1 (en) Systems, and methods for merging disjointed map and route data with respect to a single origin for autonomous robots
US11529736B2 (en) Systems, apparatuses, and methods for detecting escalators
US20220365192A1 (en) SYSTEMS, APPARATUSES AND METHODS FOR CALIBRATING LiDAR SENSORS OF A ROBOT USING INTERSECTING LiDAR SENSORS
US20230071953A1 (en) Systems, and methods for real time calibration of multiple range sensors on a robot
US11940805B2 (en) Systems and methods for enhancing performance and mapping of robots using modular devices
WO2021021869A1 (en) Systems and methods for calibrating nonvisible light emitting sensors using alignment targets
US20210298552A1 (en) Systems and methods for improved control of nonholonomic robotic systems
US20210215811A1 (en) Systems, methods and apparatuses for calibrating sensors mounted on a device
US10857684B2 (en) Robots with perception-based fiber-optic tactile sensing and methods for providing the same
US20230120781A1 (en) Systems, apparatuses, and methods for calibrating lidar sensors of a robot using intersecting lidar sensors
US20220163644A1 (en) Systems and methods for filtering underestimated distance measurements from periodic pulse-modulated time-of-flight sensors
US20230236607A1 (en) Systems and methods for determining position errors of front hazard sensore on robots
WO2022183096A1 (en) Systems, apparatuses, and methods for online calibration of range sensors for robots
US20210220996A1 (en) Systems, apparatuses and methods for removing false positives from sensor detection
US20210323156A1 (en) Systems and methods for quantitatively measuring wheel slippage in differential drive robots
US20230358888A1 (en) Systems and methods for detecting floor from noisy depth measurements for robots
WO2022146971A1 (en) Systems and methods for precisely estimating a robotic footprint for execution of near-collision motions

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20848350

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20848350

Country of ref document: EP

Kind code of ref document: A1