WO2023137744A1 - Methods of camera-to-robotic device calibration - Google Patents

Methods of camera-to-robotic device calibration

Info

Publication number
WO2023137744A1
Authority
WO
WIPO (PCT)
Prior art keywords
robotic device
camera
axis
matrix
processor
Prior art date
Application number
PCT/CN2022/073429
Other languages
French (fr)
Inventor
Wumei FANG
Yanming Zou
Weizhang LUO
Jun Liu
Zixiang Wang
Zhaoyuan CHENG
Original Assignee
Qualcomm Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Incorporated filed Critical Qualcomm Incorporated
Priority to PCT/CN2022/073429 priority Critical patent/WO2023137744A1/en
Priority to TW111145426A priority patent/TW202345099A/en
Publication of WO2023137744A1 publication Critical patent/WO2023137744A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39016Simultaneous calibration of manipulator and camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Definitions

  • Robotic devices are being developed for a wide range of applications.
  • Robotic devices may be equipped with cameras capable of capturing an image, a sequence of images, or video, and using such data in performing a robotic operation, such as navigation, guiding an actuator, etc.
  • Some robotic devices may be equipped with an image sensor, such as a camera.
  • Robotic devices may use image data from a camera to perform any number of tasks. To properly make use of such image data to perform operations including maneuvering, object manipulation, and the like, information about the location and position of the robotic device must be coordinated with information about the location and position of the camera, generally referred to as camera-to-robotic device calibration.
  • Various embodiments include methods that may be implemented on a processor of a robotic device for calibration of a camera to a robotic device.
  • Various aspects may include maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device, while maneuvering the robotic device in the plane defined by the X axis and the Y axis determining robotic device pose information at a plurality of times, determining camera pose information at the plurality of times, and determining camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times, determining whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold, determining a camera transform matrix using the camera pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold, determining a robotic device transform matrix using the robotic device pose information from one or more of the plurality of times, and performing a calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
  • performing the calibration of the camera to the robotic device comprises performing a robotic actuator-to-camera calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
  • performing the actuator-to-camera calibration of the camera to the robotic device comprises determining a transformation matrix wherein the robotic device transform matrix multiplied by the transformation matrix is substantially equal to the transformation matrix multiplied by the camera transform matrix.
  • Some aspects may include, while maneuvering the robotic device in the plane defined by the X axis and the Y axis, maintaining an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold. Some aspects may include performing a visual odometry initialization of the camera to determine a coordinate system of the camera.
  • Various aspects may include maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device, while maneuvering the robotic device in the plane defined by the X axis and the Y axis determining robotic device pose information at a plurality of times, determining camera pose information at the plurality of times, and determining a camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times, determining whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold, determining a robotic device pose matrix using the robotic device pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold, determining robotic device pose information at the location relative to a visual odometry coordinate, determining a camera pose matrix at the plurality of times from visual odometry, and performing a calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  • performing the calibration of the camera to the robotic device comprises performing a robotic actuator-to-camera calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix. In some aspects, performing the calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix comprises determining a transformation matrix wherein the robotic device pose matrix multiplied by the transformation matrix is substantially equal to the camera pose matrix.
  • Some aspects may include, while maneuvering the robotic device in the plane defined by the X axis and the Y axis, maintaining an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold. Some aspects may include performing a visual odometry initialization of the camera to determine a coordinate system of the camera.
  • Further embodiments may include a robotic device configured with processor-executable instructions to perform operations of any of the methods summarized above. Further embodiments may include a processing device for use in a robotic device configured to perform operations of any of the methods summarized above. Further embodiments may include a robotic device including means for performing functions of any of the methods summarized above.
  • FIG. 1 is a system block diagram of a robotic device operating within a communication system according to various embodiments.
  • FIG. 2 is a component block diagram illustrating components of a robotic device according to various embodiments.
  • FIG. 3 is a component block diagram illustrating a processing device suitable for use in robotic devices implementing various embodiments.
  • FIG. 4 is a component block diagram illustrating components of an image capture and processing system of a robotic device suitable for use with various embodiments.
  • FIG. 5 is a process flow diagram illustrating a method of camera-to-robotic device calibration performed by a processing device for use in a robotic device according to various embodiments.
  • FIG. 6 is a process flow diagram illustrating operations that may be performed by a processing device for use in a robotic device as part of the method of camera-to-robotic device calibration according to various embodiments.
  • FIG. 7 is a process flow diagram illustrating a method of camera-to-robotic device calibration performed by a processing device for use in a robotic device according to some embodiments.
  • Various embodiments include methods that may be implemented on a processor of a robotic device for robot-to-camera calibration.
  • the methods may include methods for actuator-to-camera calibration of the camera to the robotic device.
  • the methods may include methods for cross calibration of the camera to the robotic device.
  • the term “robotic device” refers to one of various types of vehicles, automated and self-propelled machines, and other forms of robots including a camera system and an onboard processing device configured to provide some autonomous or semi-autonomous capabilities.
  • robotic devices include but are not limited to: factory robotic devices, autonomous robots, aerial vehicles, such as an unmanned aerial vehicle (UAV); ground vehicles (e.g., an autonomous or semi-autonomous car, a vacuum robot, etc.); water-based vehicles (i.e., vehicles configured for operation on the surface of the water or under water); space-based vehicles (e.g., a spacecraft or space probe); and/or some combination thereof.
  • the robotic device may be manned.
  • the robotic device may be unmanned.
  • the robotic device may include an onboard computing device configured to maneuver and/or navigate the robotic device without remote operating instructions (i.e., autonomously), such as from a human operator (e.g., via a remote computing device).
  • the robotic device may include an onboard computing device configured to receive some information or instructions, such as from a human operator (e.g., via a remote computing device), and autonomously maneuver and/or navigate the robotic device consistent with the received information or instructions.
  • a robotic device may include a variety of components and/or payloads that may perform a variety of functions.
  • Calibrating parameters of a robot vision system is crucial for operations and tasks in which a robotic device may physically interact with its environment. Improper camera-to-robotic device calibration may impede a robot’s ability to navigate or maneuver around, or to interact with or manipulate, objects in the environment.
  • a processor device of a robotic device may be configured to perform methods of camera-to-robotic device calibration.
  • the methods may include methods of actuator-to-camera calibration.
  • the methods may include methods of cross camera calibration.
  • camera-to-robotic device calibration may include determining a correlation between a coordinate system of a body of the robotic device and a coordinate system of the camera.
  • a robot may be configured with a robotic arm or other suitable extension on which a camera may be disposed.
  • the robotic arm may be configured to move relative to the robotic device body, and further may be configured with one or more joints that enable a segment of the robotic arm to move relative to another segment of the robotic arm.
  • Various embodiments may include maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device. While maneuvering the robotic device in the plane defined by the X axis and the Y axis, the robotic device may determine robotic device pose information at a plurality of times, determine camera pose information at the plurality of times, and determine a camera coordinate drift at the plurality of times in a Z axis of the coordinate system (i.e., the axis perpendicular to the two axes defining the plane on which the robotic device maneuvers).
  • the X axis may be defined as the axis of forward or reverse motion of the robotic device
  • the Y axis may be defined as the axis perpendicular to the X axis and extending to the left and right of the robotic device body.
  • the Z axis perpendicular to the X and Y axes may be defined as extending above and below the robotic device body.
  • the Z axis may be in the direction (or in the opposite direction) of the gravity gradient in the vicinity of the robotic device.
  • the robotic device may determine whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold. For example, the robotic device may determine whether the camera coordinate drift in the Z axis is less than (or less than or equal to) 10%.
  • the robotic device may determine a camera transform matrix using the camera pose information.
  • the robotic device may determine the camera transform matrix using the camera pose information from one or more of the plurality of times.
  • the robotic device may determine a robotic device transform matrix using the robotic device pose information from one or more of the plurality of times.
  • the robotic device may perform a calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
  • the robotic device may perform a visual odometry initialization of the camera to determine a coordinate system of the camera.
  • performing the calibration of the camera to the robotic device may include performing a robotic actuator-to-camera calibration of the camera of the robotic device using the camera transform matrix and the robotic device transform matrix.
  • the robotic device may maintain an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet the deviation threshold (e.g., a deviation of 10% or more).
  • the robotic device may determine whether the deviation of the robotic device in the Z axis is greater than (or greater than or equal to) the deviation threshold.
  • determining camera pose information at the plurality of times may include determining camera pose information using the coordinate system of the camera at the plurality of times.
  • performing the calibration of the camera to the robotic device may include performing a cross calibration of the camera of the robotic device using the camera pose matrix and the robotic device pose matrix.
  • the robotic device may initially perform a visual odometry initialization of the camera to determine a coordinate system of the camera.
  • the robotic device may maneuver the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device.
  • the robotic device may determine robotic device pose information at a plurality of times, determine camera pose information at the plurality of times, and determine a camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times.
  • the robotic device may determine whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold. In response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold, the robotic device may determine a robotic device pose matrix using the robotic device pose information. In some embodiments, the robotic device may determine the robotic device pose matrix using the robotic device pose information from one or more of the plurality of times. The robotic device may determine robotic device pose information at the location relative to a visual odometry coordinate. The robotic device may determine a camera pose matrix at the plurality of times from visual odometry. The robotic device may perform a calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  • the robotic device may perform a robotic actuator-to-camera calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  • the robotic device may maintain an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet or exceed (i.e., remains less than) a deviation threshold.
  • the communication system 100 may include a robotic device 102, a network device 104, such as a network node or base station, an access point 106, a communication network 108, and a network element 110.
  • the robotic device 102 may be equipped with a camera 103.
  • the camera 103 may include a monocular camera, a binocular camera, or a multi-ocular camera.
  • the network device 104 and the access point 106 may provide wireless communications to access the communication network 108 over a wired and/or wireless communication backhaul 116 and 118, respectively.
  • a network node may be implemented as an aggregated base station, as a disaggregated base station, an integrated access and backhaul (IAB) node, a relay node, a sidelink node, etc.
  • a network device may be implemented in an aggregated or monolithic base station architecture, or alternatively, in a disaggregated base station architecture, and may include one or more of a central unit (CU), a distributed unit (DU), a radio unit (RU), a Near-Real Time (Near-RT) RAN Intelligent Controller (RIC), or a Non-Real Time (Non-RT) RIC.
  • the network device 104 may include network nodes or RUs configured to provide wireless communications over a wide area (e.g., macro cells), as well as small cells, which may include a micro cell, a femto cell, a pico cell, and other similar network access points. Other examples of network devices are also possible.
  • the access point 106 may include access points configured to provide wireless communications over a relatively smaller area.
  • access points may be WiFi transceivers or hotspots coupled to the Internet.
  • Other examples of access points are also possible.
  • the robotic device 102 may communicate with the network device 104 over a wireless communication link 112, and with the access point 106 over a wireless communication link 114.
  • the wireless communication links 112 and 114 may include a plurality of carrier signals, frequencies, or frequency bands, each of which may include a plurality of logical channels.
  • the wireless communication links 112 and 114 may utilize one or more radio access technologies (RATs) .
  • examples of RATs include 3GPP Long Term Evolution (LTE), 3G, 4G, 5G, Global System for Mobility (GSM), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Worldwide Interoperability for Microwave Access (WiMAX), Time Division Multiple Access (TDMA), and other cellular mobile telephony communication technologies.
  • RATs that may be used in one or more of the various wireless communication links within the communication system 100 include medium range protocols such as Wi-Fi, LTE-U, LTE-Direct, LAA, MuLTEfire, and relatively short range RATs such as ZigBee, Bluetooth, and Bluetooth Low Energy (LE).
  • the network element 110 may include a network server or another similar network element.
  • the network element 110 may communicate with the communication network 108 over a communication link 122.
  • the robotic device 102 and the network element 110 may communicate via the communication network 108.
  • the network element 110 may provide the robotic device 102 with a variety of information, such as navigation information, weather information, information about local air, ground, and/or sea traffic, movement control instructions, and other information, instructions, or commands relevant to operations of the robotic device 102.
  • the robotic device 102 may move in an environment 120.
  • the robotic device 102 may be configured to perform operations to calibrate the robotic device 102 to the camera 103 to enable the robotic device 102 to maneuver in and interact with the environment 120.
  • the camera may be disposed on a robotic arm or other suitable extension from the robotic device body.
  • the robotic arm may be configured to move relative to the robotic device body, and further may be configured with one or more joints that enable a segment of the robotic arm to move relative to another segment of the robotic arm. The movement of the robotic arm and/or any of its segments may be effected by one or more actuators.
  • the robotic device 102 and the camera 103 may each be configured with a three-dimensional coordinate system (represented in FIG. 1 as X, Y, and Z-axis systems).
  • camera-to-robotic device calibration may include determining a correlation between the coordinate system of the robotic system (e.g., relative to a body of the robotic device) and the coordinate system of the camera. Some embodiments may include methods of actuator-to-camera calibration. Some embodiments may include methods of cross camera calibration.
  • the robotic device may make use of one or more images of a target image 125 received using the camera 103.
  • FIG. 2 is a component block diagram illustrating components of an example robotic device 200 according to various embodiments.
  • Robotic devices may include winged or rotorcraft varieties.
  • Example robotic device 200 is illustrated as a ground vehicle design that utilizes one or more wheels 202 driven by corresponding motors to provide locomotion to the robotic device 200.
  • the illustration of robotic device 200 is not intended to imply or require that various embodiments are limited to ground robotic devices.
  • various embodiments may be used with rotorcraft or winged robotic devices, water-borne robotic devices, and space-based robotic devices.
  • the robotic device 200 may be similar to the robotic device 102.
  • the robotic device 200 may include a number of wheels 202, a frame 204, and a camera 206 (e.g., camera 103).
  • the frame 204 may provide structural support for the motors and their associated wheels 202 as well as for the camera 206.
  • the frame may support a camera 206.
  • the frame 204 may support an arm 208 or another suitable extension, which may in turn support the camera 206.
  • the arm 208, or segments of the arm 208 may be configured to articulate or move by one or more joints, bending elements, or rotating elements.
  • the camera 206 may be moveably attached to the arm 208 by a joint element that enables the camera 206 to move relative to the arm 208.
  • some detailed aspects of the robotic device 200 are omitted such as wiring, motors, frame structure interconnects, or other features that would be known to one of skill in the art.
  • while the illustrated robotic device 200 has wheels 202, this is merely exemplary, and various embodiments may include any variety of components to provide propulsion and maneuvering capabilities, such as treads, paddles, skids, or any combination thereof or of other components.
  • the robotic device 200 may further include a control unit 210 that may house various circuits and devices used to power and control the operation of the robotic device 200.
  • the control unit 210 may include a processor 220, a power module 230, sensors 240, one or more payload securing units 244, one or more image sensors 245 (e.g., cameras), an output module 250, an input module 260, and a radio module 270.
  • the processor 220 may be configured with processor-executable instructions to control travel and other operations of the robotic device 200, including operations of various embodiments.
  • the processor 220 may include or be coupled to a navigation unit 222, a memory 224, a gyro/accelerometer unit 226, and a maneuvering data module 228.
  • the processor 220 and/or the navigation unit 222 may be configured to communicate with a server through a wireless connection (e.g., a cellular data network) to receive data useful in navigation, provide real-time position reports, and assess data.
  • the maneuvering data module 228 may be coupled to the processor 220 and/or the navigation unit 222, and may be configured to provide travel control-related information such as orientation, attitude, speed, heading, and similar information that the navigation unit 222 may use for navigation purposes, such as dead reckoning between Global Navigation Satellite System (GNSS) position updates.
  • the gyro/accelerometer unit 226 may include an accelerometer, a gyroscope, an inertial sensor, an inertial measurement unit (IMU), or other similar sensors.
  • the maneuvering data module 228 may include or receive data from the gyro/accelerometer unit 226 that provides data regarding the orientation and accelerations of the robotic device 200 that may be used in navigation and positioning calculations, as well as providing data used in various embodiments for processing images.
  • the processor 220 may further receive additional information from one or more image sensors 245 and/or other sensors 240.
  • the camera(s) 245 may include an optical sensor capable of capturing infrared, ultraviolet, and/or other wavelengths of light.
  • the sensors 240 may also include a wheel sensor, an inertial measurement unit (IMU), a radio frequency (RF) sensor, a barometer, a sonar emitter/detector, a radar emitter/detector, a microphone or another acoustic sensor, or another sensor that may provide information usable by the processor 220 for movement operations as well as navigation and positioning calculations.
  • the sensors 240 may include contact or pressure sensors that may provide a signal that indicates when the robotic device 200 has made contact with a surface.
  • the payload-securing units 244 may include an actuator motor that drives a gripping and release mechanism and related controls that are responsive to the control unit 210 to grip and release a payload in response to commands from the control unit 210.
  • the power module 230 may include one or more batteries that may provide power to various components, including the processor 220, the sensors 240, the payload-securing unit(s) 244, the camera(s) 245, the output module 250, the input module 260, and the radio module 270.
  • the power module 230 may include energy storage components, such as rechargeable batteries.
  • the processor 220 may be configured with processor-executable instructions to control the charging of the power module 230 (i.e., the storage of harvested energy), such as by executing a charging control algorithm using a charge control circuit.
  • the power module 230 may be configured to manage its own charging.
  • the processor 220 may be coupled to the output module 250, which may output control signals for managing the motors that drive the wheels 202 and other components.
  • the robotic device 200 may be controlled through control of the individual motors of the wheels 202 as the robotic device 200 progresses toward a destination.
  • the processor 220 may receive data from the navigation unit 222 and use such data in order to determine the present position and orientation of the robotic device 200, as well as the appropriate course towards the destination or intermediate sites.
  • the navigation unit 222 may include a GNSS receiver system (e.g., one or more global positioning system (GPS) receivers) enabling the robotic device 200 to navigate using GNSS signals.
  • the navigation unit 222 may be equipped with radio navigation receivers for receiving navigation beacons or other signals from radio nodes, such as navigation beacons (e.g., very high frequency (VHF) omni-directional range (VOR) beacons), Wi-Fi access points, cellular network sites, radio stations, remote computing devices, other robotic devices, etc.
  • the radio module 270 may be configured to receive navigation signals, such as signals from aviation navigation facilities, etc., and provide such signals to the processor 220 and/or the navigation unit 222 to assist in robotic device navigation.
  • the navigation unit 222 may use signals received from recognizable RF emitters (e.g., AM/FM radio stations, Wi-Fi access points, and cellular network devices) on the ground.
  • the radio module 270 may include a modem 274 and a transmit/receive antenna 272.
  • the radio module 270 may be configured to conduct wireless communications with a variety of wireless communication devices (e.g., a wireless communication device (WCD) 290), examples of which include a wireless telephony network device, RU, or cell tower (e.g., the network device 104), a network access point (e.g., the access point 106), a beacon, a smartphone, a tablet, or another computing device with which the robotic device 200 may communicate (such as the network element 110).
  • the processor 220 may establish a bi-directional wireless communication link 294 via the modem 274 and the antenna 272 of the radio module 270 and the wireless communication device 290 via a transmit/receive antenna 292.
  • the radio module 270 may be configured to support multiple connections with different wireless communication devices using different radio access technologies.
  • the wireless communication device 290 may be connected to a server through intermediate access points.
  • the wireless communication device 290 may be a server of a robotic device operator, a third party service (e.g., package delivery, billing, etc.), or a site communication access point.
  • the robotic device 200 may communicate with a server through one or more intermediate communication links, such as a wireless telephony network that is coupled to a wide area network (e.g., the Internet) or other communication devices.
  • the robotic device 200 may include and employ other forms of radio communication, such as mesh connections with other robotic devices or connections to other information sources (e.g., balloons or other stations for collecting and/or distributing weather or other data harvesting information).
  • control unit 210 may be equipped with an input module 260, which may be used for a variety of applications.
  • the input module 260 may receive images or data from an onboard camera or sensor, or may receive electronic signals from other components (e.g., a payload) .
  • FIG. 3 is a component block diagram illustrating a processing device 310 suitable for use in robotic devices implementing various embodiments.
  • the processing device 310 may be configured to be used in a robotic device and may be configured as or including a system-on-chip (SoC) 312.
  • the SoC 312 may include (but is not limited to) a processor 314, a memory 316, a communication interface 318, and a storage memory interface 320.
  • the processing device 310 or the SoC 312 may further include a communication component 322, such as a wired or wireless modem, a storage memory 324, an antenna 326 for establishing a wireless communication link, and/or the like.
  • the processing device 310 or the SoC 312 may further include a hardware interface 328 configured to enable the processor 314 to communicate with and control various components of a robotic device.
  • the processor 314 may include any of a variety of processing devices, for example any number of processor cores.
  • the SoC 312 may include a variety of different types of processors 314 and processor cores, such as a general purpose processor, a central processing unit (CPU), a digital signal processor (DSP), a graphics processing unit (GPU), an accelerated processing unit (APU), a subsystem processor of specific components of the processing device, such as an image processor for a camera subsystem or a display processor for a display, an auxiliary processor, a single-core processor, and a multicore processor.
  • the SoC 312 may further embody other hardware and hardware combinations, such as a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), other programmable logic device, discrete gate logic, transistor logic, performance monitoring hardware, watchdog hardware, and time references.
  • Integrated circuits may be configured such that the components of the integrated circuit reside on a single piece of semiconductor material, such as silicon.
  • the SoC 312 may include one or more processors 314.
  • the processing device 310 may include more than one SoC 312, thereby increasing the number of processors 314 and processor cores.
  • the processing device 310 may also include processors 314 that are not associated with an SoC 312 (i.e., external to the SoC 312).
  • Individual processors 314 may be multicore processors.
  • the processors 314 may each be configured for specific purposes that may be the same as or different from other processors 314 of the processing device 310 or SoC 312.
  • One or more of the processors 314 and processor cores of the same or different configurations may be grouped together.
  • a group of processors 314 or processor cores may be referred to as a multi-processor cluster.
  • the memory 316 of the SoC 312 may be a volatile or non-volatile memory configured for storing data and processor-executable instructions for access by the processor 314.
  • the processing device 310 and/or SoC 312 may include one or more memories 316 configured for various purposes.
  • One or more memories 316 may include volatile memories such as random access memory (RAM) or main memory, or cache memory.
  • the processing device 310 and the SoC 312 may be arranged differently and/or combined while still serving the functions of the various aspects.
  • the processing device 310 and the SoC 312 may not be limited to one of each of the components, and multiple instances of each component may be included in various configurations of the processing device 310.
  • FIG. 4 is a component block diagram illustrating an image capture and processing system 400 of a robotic device (e.g., 102, 200 in FIGS. 1 and 2) suitable for use with various embodiments.
  • the image capture and processing system 400 may be implemented in hardware components and/or software components of the robotic device, the operation of which may be controlled by one or more processors (e.g., the processor 220, the processing device 310, the SoC 312, and/or the like) of the robotic device, or a combination of hardware and software components.
  • the image capture and processing system 400 may be configured by machine-readable or processor-executable instructions that may include one or more instruction modules.
  • the instruction modules may include computer program modules.
  • a camera 406 may capture light of an image 402 that enters through a lens 404.
  • the lens 404 may include a fisheye lens or another similar lens that may be configured to provide a wide image capture angle.
  • the camera 406 may be similar to the camera 103, 206.
  • the camera 406 may provide image data to an image signal processing module 408.
  • the image signal processing module 408 may be configured to provide image information to a visual odometry initialization module 410, which may perform visual odometry initialization of the camera 406 based on the received information.
  • a robotic device pose determination module 412 may be configured to determine robotic device pose information.
  • the robotic device pose determination module 412 may determine robotic device pose information at one or more times while the robotic device maneuvers in a plane (e.g., moving along X and Y axes while remaining at a relatively constant Z axis value).
  • a camera pose determination module 414 may be configured to determine camera pose information.
  • the camera pose determination module 414 may be configured to determine camera pose information at one or more times while the robotic device maneuvers in a plane (e.g., moving along X and Y axes while remaining at a relatively constant Z axis value).
  • the camera pose determination module 414 may be configured to receive robotic device pose information from a robotic system at each of the plurality of times and use that information as part of the process of determining camera pose information.
  • a camera coordinate drift determination module 416 may be configured to use the camera pose information at the plurality of times to determine the change or drift in Z axis pose information. For example, if the robotic device is moving along a flat surface perpendicular to the Z axis (in the coordinate system determined by the visual odometry initialization module 410), then changes in the Z component of the camera pose information represent camera coordinate drift in the Z axis.
  • the camera coordinate drift determination module 416 may also determine whether the determined camera coordinate drift across the plurality of times equals or exceeds an unacceptable drift threshold, which may be an amount of drift that cannot be accommodated in a calibration of the camera-to-robotic device. Said another way, so long as the camera coordinate drift remains less than the unacceptable drift threshold, such drift can be accommodated in a calibration of the camera-to-robotic device.
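The drift test described by the module above reduces to a simple numeric check. The following is a minimal sketch of that check; the pose-array format, the normalization by distance traveled, and the 10% figure are illustrative assumptions, since the patent only requires comparing Z-axis drift against a threshold.

```python
import numpy as np

def z_axis_drift_ok(camera_positions, travel_distance, drift_threshold=0.10):
    """Return True if worst-case Z drift stays below the threshold.

    camera_positions: (N, 3) array of camera [x, y, z] positions reported by
        visual odometry at the plurality of sample times.
    travel_distance: distance driven in the X-Y plane, used here to normalize
        the drift into a fraction comparable to a percentage threshold.
    """
    z = np.asarray(camera_positions)[:, 2]
    drift = np.max(np.abs(z - z[0]))  # worst-case Z excursion over the run
    return (drift / travel_distance) < drift_threshold

# Example: 0.05 m of Z drift over a 2 m straight run -> 2.5% < 10% -> True
positions = [[0.0, 0, 0.00], [0.7, 0, 0.02], [1.4, 0, 0.05], [2.0, 0, 0.03]]
print(z_axis_drift_ok(positions, travel_distance=2.0))
```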
  • a camera transformation/pose matrix determination module 418 may be configured to determine a camera transformation matrix using the camera pose information determined by the camera pose determination module 414. In some embodiments, the camera transformation/pose matrix determination module 418 may be configured to determine a camera pose matrix using the camera pose information determined by the camera pose determination module 414.
  • a robotic device transform matrix determination module 420 may be configured to determine a robotic device transform matrix using the robotic device pose information provided by the robotic device across the plurality of times.
  • a camera-to-robotic device calibration module 422 may be configured to perform a calibration of the camera to the robotic device. In some embodiments, the camera-to-robotic device calibration module 422 may be configured to perform the calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix. In some embodiments, the camera-to-robotic device calibration module 422 may be configured to perform the calibration of the camera to the robotic device using the camera pose matrix and the robotic device pose matrix.
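As a concrete illustration of what the matrix-determination modules above might produce, the sketch below builds 4x4 homogeneous pose matrices from sampled positions and orientations and derives relative-motion transform matrices between consecutive sample times. The quaternion convention and the use of consecutive-pair relative motions are assumptions; the patent does not fix a particular matrix parameterization.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def pose_matrix(position, quaternion_xyzw):
    """Build a 4x4 homogeneous pose matrix from a position and an orientation."""
    T = np.eye(4)
    T[:3, :3] = R.from_quat(quaternion_xyzw).as_matrix()
    T[:3, 3] = position
    return T

def relative_transforms(poses):
    """Transform matrices between consecutive sampled poses: inv(T_i) @ T_{i+1}."""
    return [np.linalg.inv(a) @ b for a, b in zip(poses, poses[1:])]

# robot_motions = relative_transforms(robot_poses)    # robotic device transform matrices
# camera_motions = relative_transforms(camera_poses)  # camera transform matrices
```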
  • FIG. 5 is a process flow diagram illustrating a method 500 of camera-to-robotic device calibration performed by a processing device for use in a robotic device according to various embodiments.
  • the operations of the method 500 may be performed by a processor of a robotic device (e.g., the processor 220, the processing device 310, the SoC 312, and/or the like) .
  • the operations of the method 500 may be appropriate in implementations in which the camera and robot sensors and/or actuators operate in different coordinate systems and hand-to-eye type calibration is required.
  • the processor of the robotic device may perform a visual odometry initialization of the camera to determine a coordinate system of the camera.
  • the processor may capture one or more images from the environment (e.g., of the target image 125) using a camera of the robotic device (e.g., the camera 103, 206).
  • the processor may perform the visual odometry initialization of the camera.
  • the processor may maneuver the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device.
  • the processor may drive the robotic device along a linear path for a distance.
  • the processor may maintain an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet or exceed a deviation threshold in the Z axis of the robotic device coordinate system.
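One simple way to honor that constraint is to clamp the commanded acceleration while driving the calibration path. The sketch below is illustrative only; the specific limit and the link between acceleration and Z-axis deviation (e.g., suspension pitch) are assumptions not specified by the patent.

```python
def limit_acceleration(v_prev, v_target, dt, a_max=0.2):
    """Step velocity toward v_target without exceeding a_max (m/s^2)."""
    dv = max(-a_max * dt, min(a_max * dt, v_target - v_prev))
    return v_prev + dv
```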
  • the processor may determine robotic device pose information at a plurality of times in block 506.
  • the processor also may determine camera pose information at the plurality of times in block 508.
  • in some embodiments, determining the camera pose information at the plurality of times may include determining the camera pose information using the coordinate system of the camera at the plurality of times.
  • the processor also may determine a camera coordinate drift in the Z axis of the coordinate system at the plurality of times in block 510. In some embodiments, the processor may determine an average camera coordinate drift in the Z axis of the coordinate system over some or all of the plurality of times.
  • the processor may determine whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold.
  • the unacceptable drift threshold may be a value of coordinate drift that cannot be accommodated in a calibration of the camera-to-robotic device.
  • an unacceptable drift threshold may be 10% drift in camera coordinate values.
  • the camera coordinate drift threshold may be determined empirically by determining whether a calibration of the camera-to-robotic device can be completed accurately at various values of drift in the camera coordinates, and setting a threshold at a value of coordinate drift at and beyond which camera-to-robotic device calibrations are unsuccessful.
  • in response to determining that the camera coordinate drift at the plurality of times equals or exceeds the unacceptable drift threshold, the processor may again perform the operations of blocks 502–510.
  • in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold, the processor may perform a calibration of the camera to the robotic device using a camera transform matrix and a robotic device transform matrix in block 514.
  • FIG. 6 is a process flow diagram illustrating operations 600 that may be performed by a processing device for use in a robotic device as part of the method 500 of camera-to-robotic device calibration according to various embodiments.
  • the operations 600 may be performed by a processor of a robotic device (e.g., the processor 220, the processing device 310, the SoC 312, and/or the like) .
  • the operations 600 may be appropriate in implementations in which the camera and robot sensors and/or actuators operate in different coordinate systems and hand-to-eye type calibration is required.
  • the processor may perform an actuator-to-camera calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix in block 602.
  • the processor may determine a camera transform matrix using the camera pose information from one or more of the plurality of times in block 604.
  • the processor may determine a robotic device transform matrix using the robotic device pose information from one or more of the plurality of times.
  • the processor may determine a transformation matrix such that the robotic device transform matrix multiplied by the transformation matrix is substantially equal to the transformation matrix multiplied by the camera transform matrix.
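This is the classical hand-eye equation AX = XB, where the A_i are relative robot motions, the B_i are relative camera motions, and X is the sought transformation. The sketch below solves it by linear least squares (a Kronecker-product system for the rotation, then a stacked linear solve for the translation); this particular solver is an assumption, since the patent does not name one.

```python
import numpy as np

def solve_ax_xb(robot_motions, camera_motions):
    """Least-squares X satisfying A_i @ X = X @ B_i for 4x4 motion pairs."""
    M = []
    for A, B in zip(robot_motions, camera_motions):
        Ra, Rb = A[:3, :3], B[:3, :3]
        # vec(Ra Rx - Rx Rb) = (I kron Ra - Rb^T kron I) vec(Rx), column-major vec
        M.append(np.kron(np.eye(3), Ra) - np.kron(Rb.T, np.eye(3)))
    _, _, Vt = np.linalg.svd(np.vstack(M))
    Rx = Vt[-1].reshape(3, 3, order="F")  # null-space vector -> 3x3 matrix
    Rx /= np.cbrt(np.linalg.det(Rx))      # remove the null vector's scale/sign ambiguity
    U, _, Vt2 = np.linalg.svd(Rx)         # project onto the rotation group SO(3)
    Rx = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt2)]) @ Vt2

    # Translation: (Ra - I) t_x = Rx t_b - t_a, stacked over all motion pairs
    C = np.vstack([A[:3, :3] - np.eye(3) for A in robot_motions])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3]
                        for A, B in zip(robot_motions, camera_motions)])
    tx, *_ = np.linalg.lstsq(C, d, rcond=None)

    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

Note that purely planar motion (rotations only about the Z axis) leaves the translation component along that axis weakly constrained, which is one reason the drift monitoring above matters; tested implementations of this step are also available, e.g., OpenCV's cv2.calibrateHandEye.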
  • FIG. 7 is a process flow diagram illustrating a method 700 of camera-to-robotic device calibration that may be performed by a processing device for use in a robotic device according to some embodiments.
  • the operations of the method 700 may be performed by a processor of a robotic device (e.g., the processor 220, the processing device 310, the SoC 312, and/or the like) .
  • the operations of the method 700 may be appropriate in implementations in which the camera and robot sensors and/or actuators operate in the same coordinate system and cross-calibration is required.
  • the processor of the robotic device may perform a visual odometry initialization of the camera to determine a coordinate system of the camera.
  • the processor may capture one or more images from the environment (e.g., of the target image 125) using a camera of the robotic device (e.g., the camera 103, 206).
  • the processor may perform the visual odometry initialization of the camera.
  • the processor may maneuver the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device.
  • the processor may drive the robotic device along a linear path for a distance.
  • the processor may maintain an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold in the Z axis of the robotic device coordinate system.
  • the processor may determine robotic device pose information at a plurality of times in block 706.
  • the processor also may determine camera pose information at the plurality of times in block 708.
  • in some embodiments, determining the camera pose information at the plurality of times may include determining the camera pose information using the coordinate system of the camera at the plurality of times.
  • the processor also may determine a camera coordinate drift in the Z axis of the coordinate system at the plurality of times in block 710. In some embodiments, the processor may determine an average camera coordinate drift in the Z axis of the coordinate system over some or all of the plurality of times.
  • the processor may determine whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold (for example, a threshold of 10% drift).
  • the unacceptable drift threshold may be a value of coordinate drift that cannot be accommodated in a calibration of the camera to the robotic actuator.
  • the camera coordinate drift threshold may be determined empirically by determining whether a calibration of the robotic actuator-to-camera can be completed accurately at various values of drift in the camera coordinates, and setting a threshold at a value of coordinate drift at and beyond which robotic actuator-to-camera calibrations are unsuccessful.
  • in response to determining that the camera coordinate drift at the plurality of times equals or exceeds the unacceptable drift threshold, the processor may again perform the operations of blocks 702–710 as described.
  • in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold, the processor may determine a robotic device pose matrix using the robotic device pose information from one or more of the plurality of times in block 714.
  • the processor may determine robotic device pose information at the location relative to a visual odometry coordinate.
  • the processor may determine the location of the target image (e.g., the target image 125) with respect to a visual odometry coordinate.
  • the processor may perform a visual odometry initialization based on the target image, may capture an image of the target image, and may determine a location of the target image.
  • the processor may determine robotic device pose information at the location relative to the target image.
  • the processor may determine the robotic device pose information at the location relative to a visual odometry coordinate.
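The patent does not specify how the pose relative to the target image is obtained; one common approach is a perspective-n-point (PnP) solve against known target geometry, sketched below. The corner coordinates, intrinsics, and pixel detections are illustrative assumptions.

```python
import cv2
import numpy as np

# 3D corners of the printed target in its own frame (meters), and their
# detected 2D pixel locations in the captured image (values illustrative)
object_points = np.array([[0, 0, 0], [0.2, 0, 0], [0.2, 0.2, 0], [0, 0.2, 0]],
                         dtype=np.float32)
image_points = np.array([[320, 240], [420, 242], [418, 340], [318, 338]],
                        dtype=np.float32)
K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])  # assumed intrinsics

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
R_target_to_cam, _ = cv2.Rodrigues(rvec)
# rvec/tvec give the target's pose in the camera frame; inverting this
# transform yields the camera's pose relative to the target
```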
  • the processor may determine a camera pose matrix at the plurality of times from visual odometry.
  • the processor may perform a calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  • the processor may perform a robotic actuator-to-camera calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  • the processor may determine the transformation matrix such that the robotic device pose matrix multiplied by the transformation matrix is substantially equal to the camera pose matrix.
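Because robot and camera here share one coordinate convention, the relation is AX = B rather than AX = XB: a single transformation T with robot_pose_i @ T approximately equal to camera_pose_i at every sample time. Averaging the per-sample estimates (with a chordal mean for the rotations) is one simple estimator and an assumption; the patent only requires the stated near-equality.

```python
import numpy as np

def solve_at_b(robot_poses, camera_poses):
    """Estimate T such that A_i @ T is approximately B_i over the sampled times."""
    estimates = [np.linalg.inv(A) @ B for A, B in zip(robot_poses, camera_poses)]
    R_sum = sum(T[:3, :3] for T in estimates)
    U, _, Vt = np.linalg.svd(R_sum)  # chordal mean of the rotation estimates
    Rm = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    t = np.mean([T[:3, 3] for T in estimates], axis=0)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = Rm, t
    return T
```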
  • Implementation examples are described in the following paragraphs. While some of the following implementation examples are described in terms of example methods, further example implementations may include: the example methods discussed in the following paragraphs implemented by a robotic device including a processing device configured to perform operations of the example methods; the example methods discussed in the following paragraphs implemented by a robotic device including means for performing functions of the example methods; and the example methods discussed in the following paragraphs implemented as a non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processing device of a robotic device to perform the operations of the example methods.
  • Example 1 A method of camera-to-robotic device calibration performed by a processing device for use in a robotic device, including maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device; while maneuvering the robotic device in the plane defined by the X axis and the Y axis: determining robotic device pose information at a plurality of times; determining camera pose information at the plurality of times; and determining camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times; determining whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold; determining a camera transform matrix using the camera pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold; determining a robotic device transform matrix using the robotic device pose information from one or more of the plurality of times; and performing a calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
  • Example 2 The method of example 1, in which performing the calibration of the camera to the robotic device includes performing a robotic actuator-to-camera calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
  • Example 3 The method of example 2, in which performing the actuator-to-camera calibration of the camera to the robotic device includes determining a transformation matrix in which the robotic device transform matrix multiplied by the transformation matrix is substantially equal to the transformation matrix multiplied by the camera transform matrix.
  • Example 4 The method of any of examples 1-3, further including: while maneuvering the robotic device in the plane defined by the X axis and the Y axis: maintaining an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold.
  • Example 5 The method of any of examples 1-4, further including performing a visual odometry initialization of the camera to determine a coordinate system of the camera.
  • Example 6 A method of camera-to-robotic device calibration performed by a processing device for use in a robotic device, including maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device; while maneuvering the robotic device in the plane defined by the X axis and the Y axis: determining robotic device pose information at a plurality of times; determining camera pose information at the plurality of times; and determining a camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times; determining whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold; determining a robotic device pose matrix using the robotic device pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold; determining robotic device pose information at the location relative to a visual odometry coordinate; determining a camera pose matrix at the plurality of times from visual odometry; and performing a calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  • Example 7 The method of example 6, in which performing the calibration of the camera to the robotic device includes performing a robotic actuator-to-camera calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  • Example 8 The method of any of examples 6 or 7, in which performing the calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix includes determining a transformation matrix in which the robotic device pose matrix multiplied by the transformation matrix is substantially equal to the camera pose matrix.
  • Example 9 The method of any of examples 6–8, further including: while maneuvering the robotic device in the plane defined by the X axis and the Y axis: maintaining an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold.
  • Example 10 The method of any of examples 6–9, further including performing a visual odometry initialization of the camera to determine a coordinate system of the camera.
  • The hardware used to implement the various illustrative logics, logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
  • a general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some operations or methods may be performed by circuitry that is specific to a given function.
  • the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a non-transitory computer-readable storage medium or non-transitory processor-readable storage medium.
  • the operations of a method or algorithm disclosed herein may be embodied in a processor-executable software module or processor-executable instructions, which may reside on a non-transitory computer-readable or processor-readable storage medium.
  • Non-transitory computer-readable or processor-readable storage media may be any storage media that may be accessed by a computer or a processor.
  • non-transitory computer-readable or processor-readable storage media may include RAM, ROM, EEPROM, FLASH memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage smart objects, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer.
  • Disk and disc includes compact disc (CD) , laser disc, optical disc, digital versatile disc (DVD) , floppy disk, and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of non-transitory computer-readable and processor-readable media.
  • the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a non-transitory processor-readable storage medium and/or computer-readable storage medium, which may be incorporated into a computer program product.

Abstract

Methods of camera-to-robotic device calibration may include, while maneuvering a robotic device in a defined plane, determining robotic device pose information at a plurality of times, determining camera pose information at the plurality of times, and determining a camera coordinate drift in a Z axis of the coordinate system at the plurality of times. In response to determining that the camera coordinate drift at the plurality of times does not meet a drift threshold, a processor may determine a camera transform matrix or camera pose matrix using the camera pose information. The processor may determine a robotic device transform matrix or pose matrix using the robotic device pose information from one or more of the plurality of times, and perform a calibration of the camera to the robotic device using the camera transform and robotic device transform matrices, or the camera pose and robotic device pose matrices.

Description

Methods Of Camera-To-Robotic Device Calibration
BACKGROUND
Robotic devices are being developed for a wide range of applications. Robotic devices may be equipped with cameras capable of capturing an image, a sequence of images, or video, and using such data in performing a robotic operation, such as navigation, guiding an actuator, etc. Some robotic devices may be equipped with an image sensor, such as a camera.
Robotic devices may use image data from a camera to perform any number of tasks. To properly make use of such image data to perform operations including maneuvering, object manipulation, and the like, information about the location and position of the robotic device must be coordinated with information about the location and position of the camera, generally referred to as camera-to-robotic device calibration.
SUMMARY
Various embodiments include methods that may be implemented on a processor of a robotic device for calibration of a camera to a robotic device. Various aspects may include maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device; while maneuvering the robotic device in the plane defined by the X axis and the Y axis: determining robotic device pose information at a plurality of times, determining camera pose information at the plurality of times, and determining camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times; determining whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold; determining a camera transform matrix using the camera pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold; determining a robotic device transform matrix using the robotic device pose information from one or more of the plurality of times; and performing a calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
In some aspects, performing the calibration of the camera to the robotic device comprises performing a robotic actuator-to-camera calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix. In some aspects, performing the actuator-to-camera calibration of the camera to the robotic device comprises determining a transformation matrix wherein the robotic device transform matrix multiplied by the transformation matrix is substantially equal to the transformation matrix multiplied by the camera transform matrix.
Some aspects may include, while maneuvering the robotic device in the plane defined by the X axis and the Y axis, maintaining an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold. Some aspects may include performing a visual odometry initialization of the camera to determine a coordinate system of the camera.
Various aspects may include maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device; while maneuvering the robotic device in the plane defined by the X axis and the Y axis: determining robotic device pose information at a plurality of times, determining camera pose information at the plurality of times, and determining a camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times; determining whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold; determining a robotic device pose matrix using the robotic device pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold; determining robotic device pose information at a location relative to a visual odometry coordinate; determining a camera pose matrix at the plurality of times from visual odometry; and performing a calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
In some aspects, performing the calibration of the camera to the robotic device comprises performing a robotic actuator-to-camera calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix. In some aspects, performing the calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix comprises determining a transformation matrix wherein the robotic device pose matrix multiplied by the transformation matrix is substantially equal to the camera pose matrix.
Some aspects may include, while maneuvering the robotic device in the plane defined by the X axis and the Y axis, maintaining an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold. Some aspects may include performing a visual odometry initialization of the camera to determine a coordinate system of the camera.
Further embodiments may include a robotic device configured with processor-executable instructions to perform operations of any of the methods summarized above. Further embodiments may include a processing device for use in a robotic device configured to perform operations of any of the methods summarized above. Further embodiments may include a robotic device including means for performing functions of any of the methods summarized above.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate example embodiments, and together with the general description given above and the detailed description given below, serve to explain the features of various embodiments.
FIG. 1 is a system block diagram of a robotic device operating within a communication system according to various embodiments.
FIG. 2 is a component block diagram illustrating components of a robotic device according to various embodiments.
FIG. 3 is a component block diagram illustrating a processing device suitable for use in robotic devices implementing various embodiments.
FIG. 4 is a component block diagram illustrating components of an image capture and processing system of a robotic device suitable for use with various embodiments.
FIG. 5 is a process flow diagram illustrating a method of camera-to-robotic device calibration performed by a processing device for use in a robotic device according to various embodiments.
FIG. 6 is a process flow diagram illustrating operations that may be performed by a processing device for use in a robotic device as part of the method of camera-to-robotic device calibration according to various embodiments.
FIG. 7 is a process flow diagram illustrating a method of camera-to-robotic device calibration performed by a processing device for use in a robotic device according to some embodiments.
DETAILED DESCRIPTION
Various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and embodiments are for illustrative purposes, and are not intended to limit the scope of the claims.
Various embodiments include methods that may be implemented on a processor of a robotic device for camera-to-robotic device calibration. In some embodiments, the methods may include methods for actuator-to-camera calibration of the camera to the robotic device. In some embodiments, the methods may include methods for cross calibration of the camera to the robotic device.
As used herein, the term “robotic device” refers to one of various types of vehicles, automated and self-propelled machines, and other forms of robots including a camera system and an onboard processing device configured to provide some autonomous or semi-autonomous capabilities. Examples of robotic devices include but are not limited to: factory robotic devices, autonomous robots, aerial vehicles, such as an unmanned aerial vehicle (UAV) ; ground vehicles (e.g., an autonomous or semi-autonomous car, a vacuum robot, etc. ) ; water-based vehicles (i.e., vehicles configured for operation on the surface of the water or under water) ; space-based vehicles (e.g., a spacecraft or space probe) ; and/or some combination thereof. In some embodiments, the robotic device may be manned. In other embodiments, the robotic device may be unmanned. In embodiments in which the robotic device is autonomous, the robotic device may include an onboard computing device configured to maneuver and/or navigate the robotic device without remote operating instructions (i.e., autonomously) , such as from a human operator (e.g., via a remote computing device) . In embodiments in which the robotic device is semi-autonomous, the robotic device may include an onboard computing device configured to receive some information or instructions, such as from a human operator (e.g., via a remote computing device) , and autonomously maneuver and/or navigate the robotic device consistent with the received information or instructions.  A robotic device may include a variety of components and/or payloads that may perform a variety of functions.
Calibrating parameters of a robot vision system is crucial for operations and tasks in which a robotic device may physically interact with its environment. Improper camera-to-robotic device calibration may impede a robot’s ability to navigate or maneuver around, or interact with and manipulate, objects in the environment.
In various embodiments, a processor device of a robotic device may be configured to perform methods of camera-to-robotic device calibration. In some embodiments, the methods may include methods of actuator-to-camera calibration. In some embodiments, the methods may include methods of cross camera calibration. In some embodiments, camera-to-robotic device calibration may include determining a correlation between a coordinate system of a body of the robotic device and a coordinate system of the camera. In some embodiments, a robot may be configured with a robotic arm or other suitable extension on which a camera may be disposed. In some embodiments, the robotic arm may be configured to move relative to the robotic device body, and further may be configured with one or more joints that enable a segment of the robotic arm to move relative to another segment of the robotic arm.
Various embodiments may include maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device. While maneuvering the robotic device in the plane defined by the X axis and the Y axis, the robotic device may determine robotic device pose information at a plurality of times, determine camera pose information at the plurality of times, and determine a camera coordinate drift at the plurality of times in a Z axis of the coordinate system (i.e., the axis perpendicular to the two axes defining the plane on which the robotic device maneuvers). For example, the X axis may be defined as the axis of forward or reverse motion of the robotic device, and the Y axis may be defined as the axis perpendicular to the X axis and extending to the left and right of the robotic device body. The Z axis perpendicular to the X and Y axes may be defined as extending above and below the robotic device body. In some embodiments, the Z axis may be in the direction (or in the opposite direction) of the gravity gradient in the vicinity of the robotic device. The robotic device may determine whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold. For example, the robotic device may determine whether the camera coordinate drift in the Z axis is less than (or less than or equal to) 10%. In response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold, the robotic device may determine a camera transform matrix using the camera pose information. In some embodiments, the robotic device may determine the camera transform matrix using the camera pose information from one or more of the plurality of times. The robotic device may determine a robotic device transform matrix using the robotic device pose information from one or more of the plurality of times. Further, the robotic device may perform a calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix. In some embodiments, prior to maneuvering the robotic device in the plane defined by the X axis and Y axis of the robotic device, the robotic device may perform a visual odometry initialization of the camera to determine a coordinate system of the camera.
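The embodiments do not pin down how the drift check is computed. Purely as an illustrative sketch, one plausible reading (drift as the maximum Z-axis excursion expressed as a fraction of the planar distance traveled over a roughly linear segment) could be implemented as follows; the function and argument names are hypothetical:

```python
import numpy as np

def z_drift_exceeds_threshold(cam_positions, threshold=0.10):
    """Check camera Z-coordinate drift while the robot maneuvers in its
    X-Y plane. cam_positions is an (N, 3) array of camera positions in
    the visual odometry frame at the plurality of sample times. Returns
    True if drift equals or exceeds the unacceptable drift threshold
    (10% by default). Assumes a roughly linear path segment."""
    p = np.asarray(cam_positions, dtype=float)
    z_excursion = np.max(np.abs(p[:, 2] - p[0, 2]))
    planar_dist = np.linalg.norm(p[-1, :2] - p[0, :2])
    drift = z_excursion / max(planar_dist, 1e-9)  # guard against zero travel
    return drift >= threshold
```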
In some embodiments, performing the calibration of the camera to the robotic device may include performing a robotic actuator-to-camera calibration of the camera of the robotic device using the camera transform matrix and the robotic device transform matrix. In such embodiments, the robotic device may determine a transformation matrix in which the robotic device transform matrix multiplied by the transformation matrix is substantially equal to the transformation matrix multiplied by the camera transform matrix. In some embodiments, this may be represented as Aij X = X Bij, in which Aij represents the robotic device transform matrix determined based on robotic device pose information acquired during time i to time j, Bij represents the camera transform matrix determined based on camera pose information determined during time i to time j, and X represents the transformation matrix.
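The embodiments do not mandate a particular solver for Aij X = X Bij. As a non-authoritative sketch of one standard approach, the rotation of X can be recovered by aligning the rotation axes of the paired motions, and the translation by linear least squares. The sketch below assumes 4x4 homogeneous matrices (Aij from robot odometry, Bij from camera visual odometry) and SciPy; all names are illustrative:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def solve_ax_xb(A_list, B_list):
    """Solve A X = X B for X from paired 4x4 relative motions, where
    A_list holds robotic device motions Aij and B_list holds camera
    motions Bij over the same time intervals. Full 6-DoF recovery needs
    rotations about at least two non-parallel axes; pure planar driving
    (rotation about Z only) leaves part of X's translation unobservable."""
    # Rotation: R_A R_X = R_X R_B implies the rotation axes satisfy a = R_X b.
    a = [Rotation.from_matrix(A[:3, :3]).as_rotvec() for A in A_list]
    b = [Rotation.from_matrix(B[:3, :3]).as_rotvec() for B in B_list]
    R_X, _ = Rotation.align_vectors(a, b)  # least-squares fit of R_X b ~ a
    R_X = R_X.as_matrix()
    # Translation: (R_A - I) t_X = R_X t_B - t_A, stacked over all pairs.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    d = np.concatenate([R_X @ B[:3, 3] - A[:3, 3]
                        for A, B in zip(A_list, B_list)])
    t_X, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X
```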
In some embodiments, while maneuvering the robotic device in the plane defined by the X axis and the Y axis, the robotic device may maintain an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet the deviation threshold (e.g., a deviation of 10% or more). In some embodiments, the robotic device may determine whether the deviation of the robotic device in the Z axis is greater than (or greater than or equal to) the deviation threshold. In some embodiments, determining camera pose information at the plurality of times may include determining camera pose information using the coordinate system of the camera at the plurality of times.
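As a hypothetical illustration of the acceleration constraint above (nothing in the source specifies a particular controller), velocity commands might be rate-limited so that abrupt speed changes do not pitch the body and appear as Z-axis deviation:

```python
import numpy as np

def ramp_velocity(v_prev, v_cmd, dt, a_max):
    """Rate-limit a commanded velocity so the magnitude of acceleration
    or deceleration never exceeds a_max (illustrative helper only)."""
    dv = np.clip(v_cmd - v_prev, -a_max * dt, a_max * dt)
    return v_prev + dv
```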
In some embodiments, performing the calibration of the camera to the robotic device may include performing a cross calibration of the camera of the robotic device using the camera pose matrix and the robotic device pose matrix. In some embodiments, the robotic device may initially perform a visual odometry initialization of the camera to determine a coordinate system of the camera. In some embodiments, the robotic device may maneuver the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device. In some embodiments, while maneuvering the robotic device in the plane defined by the X axis and the Y axis, the robotic device may determine robotic device pose information at a plurality of times, determine camera pose information at the plurality of times, and determine a camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times.
The robotic device may determine whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold. In response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold, the robotic device may determine a robotic device pose matrix using the robotic device pose information. In some embodiments, the robotic device may determine the robotic device pose matrix using the robotic device pose information from one or more of the plurality of times. The robotic device may determine robotic device pose information at a location relative to a visual odometry coordinate. The robotic device may determine a camera pose matrix at the plurality of times from visual odometry. The robotic device may perform a calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix. In some embodiments, the calibration may be represented as Ai X = Bi, in which Ai represents the robotic device pose matrix determined based on robotic device pose information determined at time i, Bi represents the camera pose matrix determined based on camera pose information determined at time i, and X represents the transformation matrix.
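Again only as a sketch under stated assumptions: when the poses are expressed so that Ai X = Bi holds at each sample up to noise, X can be estimated per sample as inv(Ai) Bi and the per-sample estimates averaged. The rotation averaging below uses SciPy, and the function name is illustrative:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def solve_ai_x_bi(A_list, B_list):
    """Estimate X from Ai X = Bi given 4x4 robotic device poses Ai and
    camera poses Bi at the plurality of times."""
    Xs = [np.linalg.inv(A) @ B for A, B in zip(A_list, B_list)]
    R_mean = Rotation.from_matrix(np.stack([Xi[:3, :3] for Xi in Xs])).mean()
    X = np.eye(4)
    X[:3, :3] = R_mean.as_matrix()
    X[:3, 3] = np.mean([Xi[:3, 3] for Xi in Xs], axis=0)
    return X
```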
In some embodiments, the robotic device may perform a robotic actuator-to-camera calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix. In some embodiments, while maneuvering the robotic device in the plane defined by the X axis and the Y axis, the robotic device may maintain an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet or exceed (i.e., remains less than) a deviation threshold.
Various embodiments may be implemented within a robotic device operating within a variety of communication systems 100, an example of which is illustrated in FIG. 1. With reference to FIG. 1, the communication system 100 may include a robotic device 102, a network device 104, such as a network node or base station, an access point 106, a communication network 108, and a network element 110. In some embodiments, the robotic device 102 may be equipped with a camera 103. In some embodiments, the camera 103 may include a monocular camera, a binocular camera, or a multi-ocular camera.
The network device 104 and the access point 106 may provide wireless communications to access the communication network 108 over a wired and/or  wireless communication backhaul  116 and 118, respectively. In various communication network implementations or architectures, a network node may be implemented as an aggregated base station, as a disaggregated base station, an integrated access and backhaul (IAB) node, a relay node, a sidelink node, etc. Also, in various communication network implementations or architectures, a network device (or network entity) may be implemented in an aggregated or monolithic base station architecture, or alternatively, in a disaggregated base station architecture, and may include one or more of a central unit (CU) , a distributed unit (DU) , a radio unit (RU) , a Near-Real Time (Near-RT) RAN Intelligent Controller (RIC) , or a Non-Real Time (Non-RT) RIC. The network device 104 may include network nodes or RUs configured to provide wireless communications over a wide area (e.g., macro cells) , as well as small cells, which may include a micro cell, a femto cell, a pico cell, and other similar network access points. Other examples of network devices are also possible.
The access point 106 may include access points configured to provide wireless communications over a relatively smaller area. For example, access points may be WiFi transceivers or hotspots coupled to the Internet. Other examples of access points are also possible.
The robotic device 102 may communicate with the network device 104 over a wireless communication link 112, and with the access point 106 over a wireless communication link 114. The wireless communication links 112 and 114 may include a plurality of carrier signals, frequencies, or frequency bands, each of which may include a plurality of logical channels. The wireless communication links 112 and 114 may utilize one or more radio access technologies (RATs). Examples of RATs that may be used in a wireless communication link include 3GPP Long Term Evolution (LTE), 3G, 4G, 5G, Global System for Mobile Communications (GSM), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Worldwide Interoperability for Microwave Access (WiMAX), Time Division Multiple Access (TDMA), and other cellular mobile telephony communication RATs. Further examples of RATs that may be used in one or more of the various wireless communication links within the communication system 100 include medium range protocols such as Wi-Fi, LTE-U, LTE-Direct, LAA, MuLTEfire, and relatively short range RATs such as ZigBee, Bluetooth, and Bluetooth Low Energy (LE).
The network element 110 may include a network server or another similar network element. The network element 110 may communicate with the communication network 108 over a communication link 122. The robotic device 102 and the network element 110 may communicate via the communication network 108. The network element 110 may provide the robotic device 102 with a variety of information, such as navigation information, weather information, information about local air, ground, and/or sea traffic, movement control instructions, and other information, instructions, or commands relevant to operations of the robotic device 102.
In various embodiments, the robotic device 102 may move in an environment 120. In some embodiments, the robotic device 102 may be configured to perform operations to calibrate the robotic device 102 to the camera 103 to enable the robotic device 102 to maneuver in and interact with the environment 120. In some embodiments, the camera may be disposed on a robotic arm or other suitable extension from the robotic device body. In some embodiments, the robotic arm may be configured to move relative to the robotic device body, and further may be configured with one or more joints that enable a segment of the robotic arm to move relative to another segment of the robotic arm. The movement of the robotic arm and/or any of its segments may be effected by one or more actuators. In some embodiments, the robotic device 102 and the camera 103 may each be configured with a three-dimensional coordinate system (represented in FIG. 1 as X, Y, and Z-axis systems). In some embodiments, camera-to-robotic device calibration may include determining a correlation between the coordinate system of the robotic system (e.g., relative to a body of the robotic device) and the coordinate system of the camera. Some embodiments may include methods of actuator-to-camera calibration. Some embodiments may include methods of cross camera calibration. In some embodiments, the robotic device may make use of one or more images of a target image 125 received using the camera 103.
FIG. 2 is a component block diagram illustrating components of an example robotic device 200 according to various embodiments. Robotic devices may include winged or rotorcraft varieties. Example robotic device 200 is illustrated as a ground vehicle design that utilizes one or more wheels 202 driven by corresponding motors to provide locomotion to the robotic device 200. The illustration of robotic device 200 is not intended to imply or require that various embodiments are limited to ground robotic devices. For example, various embodiments may be used with rotorcraft or winged robotic devices, water-borne robotic devices, and space-based robotic devices.
With reference to FIGS. 1 and 2, the robotic device 200 may be similar to the robotic device 102. The robotic device 200 may include a number of wheels 202, a frame 204, and a camera 206 (e.g., camera 103). The frame 204 may provide structural support for the motors and their associated wheels 202 as well as for the camera 206. In some embodiments, the frame 204 may support the camera 206. Additionally or alternatively, in some embodiments, the frame 204 may support an arm 208 or another suitable extension, which may in turn support the camera 206. In some embodiments, the arm 208, or segments of the arm 208, may be configured to articulate or move by one or more joints, bending elements, or rotating elements. Similarly, the camera 206 may be moveably attached to the arm 208 by a joint element that enables the camera 206 to move relative to the arm 208. For ease of description and illustration, some detailed aspects of the robotic device 200 are omitted, such as wiring, motors, frame structure interconnects, or other features that would be known to one of skill in the art. While the illustrated robotic device 200 has wheels 202, this is merely exemplary and various embodiments may include any variety of components to provide propulsion and maneuvering capabilities, such as treads, paddles, skids, or any combination thereof or of other components.
The robotic device 200 may further include a control unit 210 that may house various circuits and devices used to power and control the operation of the robotic device 200. The control unit 210 may include a processor 220, a power module 230, sensors 240, one or more payload securing units 244, one or more image sensors 245 (e.g., cameras) , an output module 250, an input module 260, and a radio module 270.
The processor 220 may be configured with processor-executable instructions to control travel and other operations of the robotic device 200, including operations of various embodiments. The processor 220 may include or be coupled to a navigation unit 222, a memory 224, a gyro/accelerometer unit 226, and a maneuvering data module 228. The processor 220 and/or the navigation unit 222 may be configured to communicate with a server through a wireless connection (e.g., a cellular data network) to receive data useful in navigation, provide real-time position reports, and assess data.
The maneuvering data module 228 may be coupled to the processor 220 and/or the navigation unit 222, and may be configured to provide travel control-related information such as orientation, attitude, speed, heading, and similar information that the navigation unit 222 may use for navigation purposes, such as dead reckoning between Global Navigation Satellite System (GNSS) position updates. The gyro/accelerometer unit 226 may include an accelerometer, a gyroscope, an inertial sensor, an inertial measurement unit (IMU) , or other similar sensors. The maneuvering data module 228 may include or receive data from the  gyro/accelerometer unit 226 that provides data regarding the orientation and accelerations of the robotic device 200 that may be used in navigation and positioning calculations, as well as providing data used in various embodiments for processing images.
The processor 220 may further receive additional information from one or more image sensors 245 and/or other sensors 240. In some embodiments, the camera(s) 245 may include an optical sensor sensitive to infrared, ultraviolet, and/or other wavelengths of light. The sensors 240 may also include a wheel sensor, an inertial measurement unit (IMU), a radio frequency (RF) sensor, a barometer, a sonar emitter/detector, a radar emitter/detector, a microphone or another acoustic sensor, or another sensor that may provide information usable by the processor 220 for movement operations as well as navigation and positioning calculations. The sensors 240 may include contact or pressure sensors that may provide a signal that indicates when the robotic device 200 has made contact with a surface. The payload-securing units 244 may include an actuator motor that drives a gripping and release mechanism and related controls that are responsive to the control unit 210 to grip and release a payload in response to commands from the control unit 210.
The power module 230 may include one or more batteries that may provide power to various components, including the processor 220, the sensors 240, the payload-securing unit(s) 244, the camera(s) 245, the output module 250, the input module 260, and the radio module 270. In addition, the power module 230 may include energy storage components, such as rechargeable batteries. The processor 220 may be configured with processor-executable instructions to control the charging of the power module 230 (i.e., the storage of harvested energy), such as by executing a charging control algorithm using a charge control circuit. Alternatively or additionally, the power module 230 may be configured to manage its own charging. The processor 220 may be coupled to the output module 250, which may output control signals for managing the motors that drive the wheels 202 and other components.
The robotic device 200 may be controlled through control of the individual motors that drive the wheels 202 as the robotic device 200 progresses toward a destination. The processor 220 may receive data from the navigation unit 222 and use such data in order to determine the present position and orientation of the robotic device 200, as well as the appropriate course towards the destination or intermediate sites. In various embodiments, the navigation unit 222 may include a GNSS receiver system (e.g., one or more global positioning system (GPS) receivers) enabling the robotic device 200 to navigate using GNSS signals. Alternatively or in addition, the navigation unit 222 may be equipped with radio navigation receivers for receiving navigation beacons or other signals from radio nodes, such as navigation beacons (e.g., very high frequency (VHF) omni-directional range (VOR) beacons), Wi-Fi access points, cellular network sites, radio stations, remote computing devices, other robotic devices, etc.
The radio module 270 may be configured to receive navigation signals, such as signals from aviation navigation facilities, etc., and provide such signals to the processor 220 and/or the navigation unit 222 to assist in robotic device navigation. In various embodiments, the navigation unit 222 may use signals received from recognizable RF emitters (e.g., AM/FM radio stations, Wi-Fi access points, and cellular network devices) on the ground.
The radio module 270 may include a modem 274 and a transmit/receive antenna 272. The radio module 270 may be configured to conduct wireless communications with a variety of wireless communication devices (e.g., a wireless communication device (WCD) 290) , examples of which include a wireless telephony network device, RU, or cell tower (e.g., the network device 104) , a network access point (e.g., the access point 106) , a beacon, a smartphone, a tablet, or another computing device with which the robotic device 200 may communicate  (such as the network element 110) . The processor 220 may establish a bi-directional wireless communication link 294 via the modem 274 and the antenna 272 of the radio module 270 and the wireless communication device 290 via a transmit/receive antenna 292. In some embodiments, the radio module 270 may be configured to support multiple connections with different wireless communication devices using different radio access technologies.
In various embodiments, the wireless communication device 290 may be connected to a server through intermediate access points. In an example, the wireless communication device 290 may be a server of a robotic device operator, a third party service (e.g., package delivery, billing, etc. ) , or a site communication access point. The robotic device 200 may communicate with a server through one or more intermediate communication links, such as a wireless telephony network that is coupled to a wide area network (e.g., the Internet) or other communication devices. In some embodiments, the robotic device 200 may include and employ other forms of radio communication, such as mesh connections with other robotic devices or connections to other information sources (e.g., balloons or other stations for collecting and/or distributing weather or other data harvesting information) .
In various embodiments, the control unit 210 may be equipped with an input module 260, which may be used for a variety of applications. For example, the input module 260 may receive images or data from an onboard camera or sensor, or may receive electronic signals from other components (e.g., a payload) .
FIG. 3 is a component block diagram illustrating a processing device 310 suitable for use in robotic devices implementing various embodiments. With reference to FIGS. 1–3, the processing device 310 may be configured to be used in a robotic device and may be configured as or including a system-on-chip (SoC) 312. In some embodiments, a variety of components (e.g., the processor 220, the output module 250, the radio module 270) may be integrated in the processing device 310. The SoC 312 may include (but is not limited to) a processor 314, a memory 316, a  communication interface 318, and a storage memory interface 320. The processing device 310 or the SoC 312 may further include a communication component 322, such as a wired or wireless modem, a storage memory 324, an antenna 326 for establishing a wireless communication link, and/or the like. The processing device 310 or the SoC 312 may further include a hardware interface 328 configured to enable the processor 314 to communicate with and control various components of a robotic device. The processor 314 may include any of a variety of processing devices, for example any number of processor cores.
The term “system-on-chip” (SoC) is used herein to refer to a set of interconnected electronic circuits typically, but not exclusively, including one or more processors (e.g., 314) , a memory (e.g., 316) , and a communication interface (e.g., 318) . The SoC 312 may include a variety of different types of processors 314 and processor cores, such as a general purpose processor, a central processing unit (CPU) , a digital signal processor (DSP) , a graphics processing unit (GPU) , an accelerated processing unit (APU) , a subsystem processor of specific components of the processing device, such as an image processor for a camera subsystem or a display processor for a display, an auxiliary processor, a single-core processor, and a multicore processor. The SoC 312 may further embody other hardware and hardware combinations, such as a field programmable gate array (FPGA) , an application-specific integrated circuit (ASIC) , other programmable logic device, discrete gate logic, transistor logic, performance monitoring hardware, watchdog hardware, and time references. Integrated circuits may be configured such that the components of the integrated circuit reside on a single piece of semiconductor material, such as silicon.
The SoC 312 may include one or more processors 314. The processing device 310 may include more than one SoC 312, thereby increasing the number of processors 314 and processor cores. The processing device 310 may also include processors 314 that are not associated with an SoC 312 (i.e., external to the SoC  312) . Individual processors 314 may be multicore processors. The processors 314 may each be configured for specific purposes that may be the same as or different from other processors 314 of the processing device 310 or SoC 312. One or more of the processors 314 and processor cores of the same or different configurations may be grouped together. A group of processors 314 or processor cores may be referred to as a multi-processor cluster.
The memory 316 of the SoC 312 may be a volatile or non-volatile memory configured for storing data and processor-executable instructions for access by the processor 314. The processing device 310 and/or SoC 312 may include one or more memories 316 configured for various purposes. One or more memories 316 may include volatile memories such as random access memory (RAM) or main memory, or cache memory.
Some or all of the components of the processing device 310 and the SoC 312 may be arranged differently and/or combined while still serving the functions of the various aspects. The processing device 310 and the SoC 312 may not be limited to one of each of the components, and multiple instances of each component may be included in various configurations of the processing device 310.
FIG. 4 is a component block diagram illustrating an image capture and processing system 400 of a robotic device (e.g., 102, 200 in FIGS. 1 and 2) suitable for use with various embodiments. With reference to FIGS. 1–4, the image capture and processing system 400 may be implemented in hardware components and/or software components of the robotic device, the operation of which may be controlled by one or more processors (e.g., the processor 220, the processing device 310, the SoC 312, and/or the like) of the robotic device, or a combination of hardware and software components. The image capture and processing system 400 may be configured by machine-readable or processor-executable instructions that may include one or more instruction modules. The instruction modules may include computer program modules.
A camera 406 may capture light of an image 402 that enters through a lens 404. The lens 404 may include a fisheye lens or another similar lens that may be configured to provide a wide image capture angle. In some embodiments, the camera 406 may be similar to the camera 103, 206. The camera 406 may provide image data to an image signal processing module 408.
The image signal processing module 408 may be configured to provide image information to a visual odometry initialization module 410, which may perform visual odometry initialization of the camera 406 based on the received information.
A robotic device pose determination module 412 may be configured to determine robotic device pose information. In various embodiments, the robotic device pose determination module 412 may determine robotic device pose information at one or more times while the robotic device maneuvers in a plane (e.g., moving along X and Y axes while remaining at a relatively constant Z axis value).
A camera pose determination module 414 may be configured to determine camera pose information. In various embodiments, the camera pose determination module 414 may be configured to determine camera pose information at one or more times while the robotic device maneuvers in a plane (e.g., moving along X and Y axes while remaining at a relatively constant Z axis value). The camera pose determination module 414 may be configured to receive robotic device pose information from the robotic device’s systems at each of the plurality of times and use that information as part of the process of determining camera pose information.
A camera coordinate drift determination module 416 may be configured to use the camera pose information at the plurality of times to determine the change or drift in Z axis pose information. For example, if the robotic device is moving along a flat surface perpendicular to the Z axis (in the coordinate system determined by the visual odometry initialization module 410), then changes in the Z component of the camera pose information represent camera coordinate drift in the Z axis. The camera coordinate drift determination module 416 may also determine whether the determined camera coordinate drift across the plurality of times equals or exceeds an unacceptable drift threshold, which may be an amount of drift that cannot be accommodated in a calibration of the camera to the robotic device. Said another way, so long as the camera coordinate drift remains less than the unacceptable drift threshold, such drift can be accommodated in a calibration of the camera to the robotic device.
In some embodiments, a camera transformation/pose matrix determination module 418 may be configured to determine a camera transformation matrix using the camera pose information determined by the camera pose determination module 414. In some embodiments, the camera transformation/pose matrix determination module 418 may be configured to determine a camera pose matrix using the camera pose information determined by the camera pose determination module 414.
A robotic device transform matrix determination module 420 may be configured to determine a robotic device transform matrix using the robotic device pose information provided by the robotic device across the plurality of times.
A camera-to-robotic device calibration module 422 may be configured to perform a calibration of the camera to the robotic device. In some embodiments, the camera-to-robotic device calibration module 422 may be configured to perform the calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix. In some embodiments, the camera-to-robotic device calibration module 422 may be configured to perform the calibration of the camera to the robotic device using the camera pose matrix and the robotic device pose matrix.
FIG. 5 is a process flow diagram illustrating a method 500 of camera-to-robotic device calibration performed by a processing device for use in a robotic device according to various embodiments. With reference to FIGS. 1–5, the operations of the method 500 may be performed by a processor of a robotic device  (e.g., the processor 220, the processing device 310, the SoC 312, and/or the like) . The operations of the method 500 may be appropriate in implementations in which the camera and robot sensors and/or actuators operate in different coordinate systems and hand-to-eye type calibration is required.
In block 502, the processor of the robotic device may perform a visual odometry initialization of the camera to determine a coordinate system of the camera. For example, the processor may capture one or more images from the environment (e.g., of the target image 125) using a camera of the robotic device (e.g., the camera 103, 206) . Using the one or more images, the processor may perform the visual odometry initialization of the camera.
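The embodiments leave the visual odometry pipeline itself unspecified. For orientation only, a minimal two-view monocular initialization in OpenCV might look like the sketch below; the grayscale input frames, intrinsic matrix K, and function name are assumptions, and the recovered translation is defined only up to scale:

```python
import cv2
import numpy as np

def init_visual_odometry(frame0, frame1, K):
    """Two-view monocular initialization: recover the relative camera
    rotation R and unit-scale translation t that fix the VO frame."""
    orb = cv2.ORB_create(2000)
    kp0, des0 = orb.detectAndCompute(frame0, None)  # grayscale uint8 frames
    kp1, des1 = orb.detectAndCompute(frame1, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des0, des1)
    pts0 = np.float32([kp0[m.queryIdx].pt for m in matches])
    pts1 = np.float32([kp1[m.trainIdx].pt for m in matches])
    E, inliers = cv2.findEssentialMat(pts0, pts1, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts0, pts1, K, mask=inliers)
    return R, t
```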
In block 504, the processor may maneuver the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device. In some embodiments, the processor may drive the robotic device along a linear path for a distance. In some embodiments, the processor may maintain an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet or exceed a deviation threshold in the Z axis of the robotic device coordinate system.
While maneuvering the robotic device in the plane defined by the X and Y axes, the processor may determine robotic device pose information at a plurality of times in block 506. The processor also may determine camera pose information at the plurality of times in block 508. In some embodiments, determining the camera pose information at the plurality of times may include determining camera pose information using the coordinate system of the camera at the plurality of times. The processor also may determine a camera coordinate drift in the Z axis of the coordinate system at the plurality of times in block 510. In some embodiments, the processor may determine an average camera coordinate drift in the Z axis of the coordinate system over some or all of the plurality of times.
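A data-collection loop for blocks 506-510 might resemble the following sketch; robot and vo are hypothetical stand-ins for the platform's wheel-odometry and visual odometry interfaces, and the fixed sampling period is an assumption:

```python
import time

def collect_pose_samples(robot, vo, num_samples=20, period_s=0.25):
    """Record robot and camera poses (4x4 matrices) at a plurality of
    times while the robot maneuvers in its X-Y plane."""
    robot_poses, cam_poses = [], []
    for _ in range(num_samples):
        robot_poses.append(robot.odometry_pose())  # assumed API
        cam_poses.append(vo.camera_pose())         # assumed API
        time.sleep(period_s)  # in practice, timestamp and synchronize
    return robot_poses, cam_poses
```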
In determination block 512, the processor may determine whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold. The unacceptable drift threshold may be a value of coordinate drift that cannot be accommodated in a calibration of the camera to the robotic device. For example, an unacceptable drift threshold may be 10% drift in camera coordinate values. In some embodiments, the camera coordinate drift threshold may be determined empirically by determining whether a calibration of the camera to the robotic device can be completed accurately at various values of drift in the camera coordinates, and setting a threshold at a value of coordinate drift at and beyond which camera-to-robotic device calibrations are unsuccessful.
In response to determining that the camera coordinate drift at the plurality of times meets or exceeds the unacceptable drift threshold (i.e., determination block 512 = “Yes” ) , in other words, the camera coordinate drift is beyond that which can be accommodated in camera-to-robotic device calibrations, the processor may again perform the operations of blocks 502–510.
In response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold (i.e., determination block 512 = “No” ) , in other words, the camera coordinate drift remains within a range that can be accommodated in camera-to-robotic device calibrations, the processor may perform a calibration of the camera to the robotic device using a camera transform matrix and a robotic device transform matrix in block 514.
FIG. 6 is a process flow diagram illustrating operations 600 that may be performed by a processing device for use in a robotic device as part of the method 500 of camera-to-robotic device calibration according to various embodiments. With reference to FIGS. 1–6, the operations 600 may be performed by a processor of a robotic device (e.g., the processor 220, the processing device 310, the SoC 312, and/or the like) . The operations 600 may be appropriate in implementations in  which the camera and robot sensors and/or actuators operate in different coordinate systems and hand-to-eye type calibration is required.
Following the performance of the operations of block 514 (FIG. 5), the processor may perform an actuator-to-camera calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix in block 602. In some embodiments, as part of the operations of performing the actuator-to-camera calibration of the camera, the processor may determine a camera transform matrix using the camera pose information from one or more of the plurality of times in block 604.
In block 606, the processor may determine a robotic device transform matrix using the robotic device pose information from one or more of the plurality of times.
In block 608, the processor may determine a transformation matrix such that the robotic device transform matrix multiplied by the transformation matrix is substantially equal to the transformation matrix multiplied by the camera transform matrix.
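One sanity check, not stated in the source, is to evaluate how well the solved transformation matrix satisfies this relation over all motion pairs; a residual near zero indicates a self-consistent calibration:

```python
import numpy as np

def hand_eye_residual(A_list, B_list, X):
    """Largest Frobenius-norm mismatch of A @ X versus X @ B over all
    paired 4x4 motions (A robot, B camera)."""
    return max(np.linalg.norm(A @ X - X @ B) for A, B in zip(A_list, B_list))
```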
FIG. 7 is a process flow diagram illustrating a method 700 of camera-to-robotic device calibration that may be performed by a processing device for use in a robotic device according to some embodiments. With reference to FIGS. 1–7, the operations of the method 700 may be performed by a processor of a robotic device (e.g., the processor 220, the processing device 310, the SoC 312, and/or the like) . The operations of the method 700 may be appropriate in implementations in which the camera and robot sensors and/or actuators operate in the same coordinate system and cross-calibration is required.
In block 702, the processor of the robotic device may perform a visual odometry initialization of the camera to determine a coordinate system of the camera. For example, the processor may capture one or more images from the environment (e.g., of the target image 125) using a camera of the robotic device  (e.g., the camera 103, 206) . Using the one or more images, the processor may perform the visual odometry initialization of the camera.
In block 704, the processor may maneuver the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device. In some embodiments, the processor may drive the robotic device along a linear path for a distance. In some embodiments, the processor may maintain an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold in the Z axis of the robotic device coordinate system.
While maneuvering the robotic device in the plane defined by the X and Y axes, the processor may determine robotic device pose information at a plurality of times in block 706. The processor also may determine camera pose information at the plurality of times in block 708. In some embodiments, determining the camera pose information at the plurality of times may include determining camera pose information using the coordinate system of the camera at the plurality of times. The processor also may determine a camera coordinate drift in the Z axis of the coordinate system at the plurality of times in block 710. In some embodiments, the processor may determine an average camera coordinate drift in the Z axis of the coordinate system over some or all of the plurality of times.
In determination block 712, the processor may determine whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold (for example, a threshold of 10% drift). The unacceptable drift threshold may be a value of coordinate drift that cannot be accommodated in a calibration of the camera to the robotic actuator. In some embodiments, the camera coordinate drift threshold may be determined empirically by determining whether a robotic actuator-to-camera calibration can be completed accurately at various values of drift in the camera coordinates, and setting a threshold at a value of coordinate drift at and beyond which robotic actuator-to-camera calibrations are unsuccessful.
In response to determining that the camera coordinate drift at the plurality of times meets or exceeds the unacceptable drift threshold (i.e., determination block 712 = “Yes” ) , in other words, the camera coordinate drift is beyond that which can be accommodated in robotic actuator-to-camera calibrations, the processor may again perform the operations of blocks 702–710 as described.
In response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold (i.e., determination block 712 = “No” ) , in other words, the camera coordinate drift remains within a range that can be accommodated in robotic actuator-to-camera calibrations, the processor may determine a robotic device pose matrix using the robotic device pose information from one or more of the plurality of times in block 714.
In block 716, the processor may determine robotic device pose information at the location relative to a visual odometry coordinate. In some embodiments, the processor may determine the location of the target image (e.g., 125) with respect to a visual odometry coordinate. For example, the processor may perform a visual odometry initialization based on the target image, may capture an image of the target image, and may determine a location of the target image. The processor may then determine robotic device pose information at that location relative to the target image and, in turn, relative to the visual odometry coordinate.
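The bookkeeping in block 716 amounts to composing homogeneous transforms; as a hypothetical illustration (both input names are assumptions):

```python
def robot_pose_in_vo(T_target_in_vo, T_robot_in_target):
    """Express the robotic device pose in the visual odometry frame by
    composing the target pose in the VO frame with the robot pose
    observed relative to the target (4x4 NumPy matrices assumed)."""
    return T_target_in_vo @ T_robot_in_target
```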
In block 718, the processor may determine a camera pose matrix at the plurality of times from visual odometry.
In block 720, the processor may perform a calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix. In some embodiments, the processor may perform a robotic actuator-to-camera calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix. In some embodiments, the processor may determine the transformation matrix such that the robotic device pose matrix multiplied by the transformation matrix is substantially equal to the camera pose matrix.
Various embodiments illustrated and described are provided merely as examples to illustrate various features of the claims. However, features shown and described with respect to any given embodiment are not necessarily limited to the associated embodiment and may be used or combined with other embodiments that are shown and described. Further, the claims are not intended to be limited by any one example embodiment. For example, one or more of the operations of the methods and operations 500, 600, and 700 may be substituted for or combined with one or more operations of the methods and operations 500, 600, and 700, and vice versa.
Implementation examples are described in the following paragraphs. While some of the following implementation examples are described in terms of example methods, further example implementations may include: the example methods discussed in the following paragraphs implemented by a robotic device including a processing device configured to perform operations of the example methods; the example methods discussed in the following paragraphs implemented by a robotic device including means for performing functions of the example methods; and the example methods discussed in the following paragraphs implemented as a non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processing device of a robotic device to perform the operations of the example methods.
Example 1. A method of camera-to-robotic device calibration performed by a processing device for use in a robotic device, including maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device; while maneuvering the robotic device in the plane defined by the X axis and the Y axis: determining robotic device pose information at a plurality of times; determining camera pose information at the plurality of times; and determining camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times; determining whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold; determining a camera transform matrix using the camera pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold; determining a robotic device transform matrix using the robotic device pose information from one or more of the plurality of times; and performing a calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
Example 2. The method of example 1, in which performing the calibration of the camera to the robotic device includes performing a robotic actuator-to-camera calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
Example 3. The method of example 2, in which performing the actuator-to-camera calibration of the camera to the robotic device includes determining a transformation matrix in which the robotic device transform matrix multiplied by the transformation matrix is substantially equal to the transformation matrix multiplied by the camera transform matrix.
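One conventional way to solve the A·X = X·B relation stated in example 3, sketched here under the assumption that OpenCV is available (the disclosure does not name a specific solver), uses cv2.calibrateHandEye; at least three pose pairs with distinct rotations are required:

    import cv2
    import numpy as np

    def solve_ax_xb(robot_poses, camera_poses):
        # Solve A @ X = X @ B for X from lists of 4x4 pose matrices.
        # cv2.calibrateHandEye expects gripper-to-base and target-to-camera
        # transforms; visual odometry typically reports camera-to-world
        # poses, so the camera poses are inverted here (a convention
        # assumption, not part of the disclosure).
        cam_inv = [np.linalg.inv(A) for A in camera_poses]
        R_x, t_x = cv2.calibrateHandEye(
            [B[:3, :3] for B in robot_poses],
            [B[:3, 3] for B in robot_poses],
            [A[:3, :3] for A in cam_inv],
            [A[:3, 3] for A in cam_inv],
            method=cv2.CALIB_HAND_EYE_TSAI)
        X = np.eye(4)
        X[:3, :3], X[:3, 3] = R_x, t_x.ravel()
        return X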
Example 4. The method of any of examples 1-3, further including: while maneuvering the robotic device in the plane defined by the X axis and the Y axis: maintaining an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold.
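The acceleration constraint in example 4 could be realized with a simple clamp on the commanded velocity change, sketched below; the limit value and controller interface are assumptions for illustration:

    MAX_ACCEL_MPS2 = 0.5  # assumed limit keeping Z-axis deviation small

    def ramp_velocity(current_velocity, target_velocity, dt):
        # Clamp acceleration/deceleration so the body does not pitch or
        # bounce enough to deviate in the Z axis beyond the threshold.
        accel = (target_velocity - current_velocity) / dt
        accel = max(-MAX_ACCEL_MPS2, min(MAX_ACCEL_MPS2, accel))
        return current_velocity + accel * dt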
Example 5. The method of any of examples 1-4, further including performing a visual odometry initialization of the camera to determine a coordinate system of the camera.
Example 6. A method of camera-to-robotic device calibration performed by a processing device for use in a robotic device, including maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device; while maneuvering the robotic device in the plane defined by the X axis and the Y axis: determining robotic device pose information at a plurality of times; determining camera pose information at the plurality of times; and determining a camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times; determining whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold; determining a robotic device pose matrix using the robotic device pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold; determining robotic device pose information at a location relative to a visual odometry coordinate; determining a camera pose matrix at the plurality of times from visual odometry; and performing a calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
Example 7. The method of example 6, in which performing the calibration of the camera to the robotic device includes performing a robotic actuator-to-camera calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
Example 8. The method of any of examples 6 or 7, in which performing the calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix includes determining a transformation matrix in which the robotic device pose matrix multiplied by the transformation matrix is substantially equal to the camera pose matrix.
Example 9. The method of any of examples 6–8, further including: while maneuvering the robotic device in the plane defined by the X axis and the Y axis: maintaining an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold.
Example 10. The method of any of examples 6–9, further including performing a visual odometry initialization of the camera to determine a coordinate system of the camera.
The foregoing method descriptions and the process flow diagrams are provided merely as illustrative examples and are not intended to require or imply that the operations of various embodiments must be performed in the order presented. As will be appreciated by one of skill in the art, the operations in the foregoing embodiments may be performed in any order. Words such as "thereafter," "then," "next," etc. are not intended to limit the order of the operations; these words are used to guide the reader through the description of the methods. Further, any reference to claim elements in the singular, for example, using the articles "a," "an," or "the" is not to be construed as limiting the element to the singular.
Various illustrative logical blocks, modules, circuits, and algorithm operations described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and operations have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such embodiment decisions should not be interpreted as causing a departure from the scope of the claims.
The hardware used to implement various illustrative logics, logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some operations or methods may be performed by circuitry that is specific to a given function.
In one or more aspects, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a non-transitory computer-readable storage medium or non-transitory processor-readable storage medium. The operations of a method or algorithm disclosed herein may be embodied in a processor-executable software module or processor-executable instructions, which may reside on a non-transitory computer-readable or processor-readable storage medium. Non-transitory computer-readable or processor-readable storage media may be any storage media that may be accessed by a computer or a processor. By way of example but not limitation, such non-transitory computer-readable or processor-readable storage media may include RAM, ROM, EEPROM, FLASH memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of non-transitory computer-readable and processor-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a non-transitory processor-readable storage medium and/or computer-readable storage medium, which may be incorporated into a computer program product.
The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the claims. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the claims. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the following claims and the principles and novel features disclosed herein.

Claims (20)

  1. A method of camera-to-robotic device calibration performed by a processing device for use in a robotic device, comprising:
    maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device;
    while maneuvering the robotic device in the plane defined by the X axis and the Y axis:
    determining robotic device pose information at a plurality of times;
    determining camera pose information at the plurality of times; and
    determining camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times;
    determining whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold;
    determining a camera transform matrix using the camera pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold;
    determining a robotic device transform matrix using the robotic device pose information from one or more of the plurality of times; and
    performing a calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
  2. The method of claim 1, wherein performing the calibration of the camera to the robotic device comprises performing a robotic actuator-to-camera calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
  3. The method of claim 2, wherein performing the actuator-to-camera calibration of the camera to the robotic device comprises determining a transformation matrix wherein the robotic device transform matrix multiplied by the transformation matrix is substantially equal to the transformation matrix multiplied by the camera transform matrix.
  4. The method of claim 1, further comprising:
    while maneuvering the robotic device in the plane defined by the X axis and the Y axis:
    maintaining an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold.
  5. The method of claim 1, further comprising performing a visual odometry initialization of the camera to determine a coordinate system of the camera.
  6. A method of camera-to-robotic device calibration performed by a processing device for use in a robotic device having a camera, comprising:
    maneuvering the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device;
    while maneuvering the robotic device in the plane defined by the X axis and the Y axis:
    determining robotic device pose information at a plurality of times;
    determining camera pose information at the plurality of times; and
    determining a camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times;
    determining whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold;
    determining a robotic device pose matrix using the robotic device pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold;
    determining robotic device pose information at a location relative to a visual odometry coordinate;
    determining a camera pose matrix at the plurality of times from visual odometry; and
    performing a calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  7. The method of claim 6, wherein performing the calibration of the camera to the robotic device comprises performing a robotic actuator-to-camera calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  8. The method of claim 6, wherein performing the calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix comprises determining a transformation matrix wherein the robotic device pose matrix multiplied by the transformation matrix is substantially equal to the camera pose matrix.
  9. The method of claim 6, further comprising:
    while maneuvering the robotic device in the plane defined by the X axis and the Y axis:
    maintaining an acceleration or deceleration of the robotic device such that a deviation of the robotic device in the Z axis does not meet a deviation threshold.
  10. The method of claim 6, further comprising performing a visual odometry initialization of the camera to determine a coordinate system of the camera.
  11. A robotic device, comprising:
    a processor configured with processor-executable instructions to:
    maneuver the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device;
    while maneuvering the robotic device in the plane defined by the X axis and the Y axis:
    determine robotic device pose information at a plurality of times;
    determine camera pose information at the plurality of times; and
    determine camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times;
    determine whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold;
    determine a camera transform matrix using the camera pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold;
    determine a robotic device transform matrix using the robotic device pose information from one or more of the plurality of times; and
    perform a calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
  12. The robotic device of claim 11, wherein the processor is further configured with processor-executable instructions to perform a robotic actuator-to-camera calibration of the camera to the robotic device using the camera transform matrix and the robotic device transform matrix.
  13. The robotic device of claim 12, wherein the processor is further configured with processor-executable instructions to determine a transformation matrix wherein the robotic device transform matrix multiplied by the transformation matrix is substantially equal to the transformation matrix multiplied by the camera transform matrix.
  14. The robotic device of claim 11, wherein the processor is further configured with processor-executable instructions to:
    maintain an acceleration or deceleration of the robotic device while maneuvering the robotic device in the plane defined by the X axis and the Y axis such that a deviation of the robotic device in the Z axis does not meet a deviation threshold.
  15. The robotic device of claim 11, wherein the processor is further configured with processor-executable instructions to perform a visual odometry initialization of the camera to determine a coordinate system of the camera.
  16. A robotic device, comprising:
    a camera; and
    a processor coupled to the camera and configured with processor-executable instructions to:
    maneuver the robotic device in a plane defined by an X axis and a Y axis of a coordinate system of the robotic device;
    while maneuvering the robotic device in the plane defined by the X axis and the Y axis:
    determine robotic device pose information at a plurality of times;
    determine camera pose information at the plurality of times; and
    determine a camera coordinate drift in a Z axis perpendicular to the X and Y axes of the coordinate system at the plurality of times;
    determine whether the camera coordinate drift at the plurality of times equals or exceeds an unacceptable drift threshold;
    determine a robotic device pose matrix using the robotic device pose information from one or more of the plurality of times in response to determining that the camera coordinate drift at the plurality of times does not equal or exceed the unacceptable drift threshold;
    determine robotic device pose information at a location relative to a visual odometry coordinate;
    determine a camera pose matrix at the plurality of times from visual odometry; and
    perform a calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  17. The robotic device of claim 16, wherein the processor is further configured with processor-executable instructions to perform a robotic actuator-to-camera calibration of the camera to the robotic device using the robotic device pose matrix and the camera pose matrix.
  18. The robotic device of claim 16, wherein the processor is further configured with processor-executable instructions to determine a transformation matrix wherein the robotic device pose matrix multiplied by the transformation matrix is substantially equal to the camera pose matrix.
  19. The robotic device of claim 16, wherein the processor is further configured with processor-executable instructions to:
    maintain an acceleration or deceleration of the robotic device while maneuvering the robotic device in the plane defined by the X axis and the Y axis such that a deviation of the robotic device in the Z axis does not meet a deviation threshold.
  20. The robotic device of claim 16, wherein the processor is further configured with processor-executable instructions to perform a visual odometry initialization of the camera to determine a coordinate system of the camera.