US10499038B2 - Method and system for recalibrating sensing devices without familiar targets - Google Patents
- Publication number
- US10499038B2 (application US15/618,767)
- Authority
- US
- United States
- Prior art keywords
- sensing device
- captured
- landmark
- points
- scene
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/51—Indexing; Data structures therefor; Storage structures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/254—Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
Definitions
- the present invention relates generally to the field of calibrating sensing devices, and more particularly to such calibration carried out without the need for predefined calibration targets.
- The term "sensing device" is broadly defined herein as any combination of one or more sensors that are configured to capture three-dimensional data of a scene, either directly or through further processing.
- An exemplary sensing device may include a pair of cameras configured for passive stereo, from which depth data may be derived by comparing images taken from different locations.
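For a rectified stereo pair, the depth derivation mentioned above reduces to triangulation from disparity. The following is a minimal sketch; the focal length and baseline values are illustrative assumptions, not from the patent:

```python
# Hypothetical sketch: depth from a rectified passive-stereo pair.
# f_px (focal length in pixels) and baseline_m (camera separation in
# metres) are assumed illustrative values.
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Triangulate depth Z = f * B / d for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_m / disparity_px

# A point seen 20 px apart by two cameras 10 cm apart, with f = 700 px:
z = depth_from_disparity(700.0, 0.10, 20.0)  # approximately 3.5 metres
```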
- Another example of a sensing device is a structured light sensor, which is configured to receive and analyze reflections of a predefined light pattern projected onto the scene.
- A structured light system includes a radiation source (such as a laser) and a sensor configured to capture the reflections of that source from the scene.
- Yet another example is a single camera for capturing 2D images, together with a measurement device that measures the distance the sensing device travels.
- Such a measurement device can be, but is not limited to, an inertial measurement unit (IMU) or an odometer.
- The term "three-dimensional rig" or "3D Rig" as used herein is defined as any device for mounting at least two cameras, or at least one camera and a radiation source, that together form a 3D system capable of capturing images and videos of a scene.
- A 3D Rig must allow mounting two cameras, or one camera and one radiation source, with a horizontal or vertical offset, and must allow adjusting the cameras about all possible axes.
- The term "calibration," and more specifically "camera calibration," as used herein is defined as the process of adjusting predefined parameters so that the camera, or the sensing device, operates optimally.
- Calibration of a 3D Rig involves the calibration of a plurality of cameras.
- The calibration of a 3D Rig includes the alignment or configuration of several cameras, and in particular the process of identifying deviations of parameters describing the cameras' spatial alignment from predefined tolerances, and remedying the identified deviations. A sensing device is said to be calibrated whenever the estimated calibration parameters reflect the actual calibration parameters, taken at the specified time, within an agreeable margin. It is possible to verify that a device is calibrated by comparing distances measured by the device with distances obtained from unrelated sources (e.g., measured directly).
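The verification described above can be sketched as a simple relative-error test; the 2% margin below is an assumed value, not specified in the patent:

```python
# Hypothetical verification sketch: compare device-measured distances
# against independently obtained reference distances within a margin.
def is_within_margin(device_distances, reference_distances, margin=0.02):
    """True if every relative error is within the (assumed) 2% margin."""
    return all(
        abs(d - r) / r <= margin
        for d, r in zip(device_distances, reference_distances)
    )

is_within_margin([1.01, 2.98], [1.00, 3.00])  # errors of 1% and ~0.7% pass
```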
- The term "landmark" as used herein relates to visual features used by a variety of computer vision applications such as image registration, camera calibration, and object recognition. Using landmarks is advantageous as it offers robustness to lighting conditions as well as the ability to cope with large displacements during registration.
- a landmark comprises both artificial and natural landmarks. Exemplary landmarks may include corners or repetitive patterns in images.
- Some embodiments of the present invention provide a method and a system that enable recalibrating of sensing devices without familiar targets, so that newly generated landmarks may be useful in a future calibration process of the sensing device.
- the method may include: capturing a scene using a sensing device; determining whether or not the device is calibrated; in a case that said sensing device is calibrated, adding at least one new landmark to the known landmarks; in a case that said sensing device is not calibrated, calibrating the sensing device based on the known landmarks.
- The system may have various architectures that include a sensing device which captures images of the scene and further derives 3D data about the scene in some form.
- the method may include: capturing a scene using a sensing device; determining whether the captured scene contains a minimal number of known landmarks sufficient for determining whether the sensing device is calibrated; in a case that the captured scene contains said minimal number of landmarks, checking whether said sensing device is calibrated; in a case that said sensing device is calibrated, adding at least one new landmark to the known landmarks; in a case that said sensing device is not calibrated, calibrating the sensing device based on the known landmarks; and in a case that the captured scene does not contain said minimal number of landmarks, checking calibration of the sensing device without using known landmarks, and in a case that the sensing device is calibrated, adding at least one new landmark to the known landmarks.
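The branching above can be sketched as a small decision function. The function name and the four-landmark threshold are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch of the method's control flow. The threshold of 4
# known landmarks for the calibration check is an assumption.
def recalibration_step(known_landmarks_in_scene, is_calibrated, min_landmarks=4):
    """Return the action taken for one captured scene, per the flow above."""
    if len(known_landmarks_in_scene) >= min_landmarks:
        if is_calibrated:
            return "add_new_landmarks"
        return "calibrate_from_known_landmarks"
    # Not enough known landmarks: landmark-free check (e.g., epipolar test).
    if is_calibrated:
        return "add_new_landmarks"
    return "end"
```

For example, a calibrated device in a scene with enough known landmarks proceeds to enroll new landmarks, while an uncalibrated one is recalibrated from the known ones.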
- Notably, determining whether a sensing device or a 3D rig is fully calibrated requires fewer landmarks than the actual calibration process. Therefore, in a case where a calibrated condition is identified, it is easier to generate more landmarks of the unfamiliar scene.
- FIG. 1 is a block diagram illustrating non-limiting exemplary architectures of a system in accordance with some embodiments of the present invention.
- FIG. 2A is a high level flowchart illustrating a non-limiting exemplary method in accordance with some embodiments of the present invention.
- FIG. 2B is a high level flowchart illustrating another non-limiting exemplary method in accordance with some embodiments of the present invention.
- FIG. 3 is a block diagram illustrating yet another aspect in accordance with some embodiments of the present invention.
- FIG. 1 is a block diagram illustrating an exemplary architecture on which embodiments of the present invention may be implemented.
- System 100 may include a sensing device 110 configured to capture a scene.
- System 100 may further include a landmarks database 120 configured to store landmarks known in the scene (e.g., 10 A, 10 B, and 10 C).
- system 100 may further include a computer processor 130 configured to determine whether or not the device is calibrated, based on the captured image, wherein in a case that said sensing device is calibrated, the computer processor configured to add at least one new landmark to the known landmarks, and wherein in a case that said sensing device is not calibrated, the computer processor is configured to calibrate the sensing device based on the known landmarks.
- system 100 may further include a computer processor 130 configured to determine whether the captured scene contains a minimal number of known landmarks sufficient for determining whether the sensing device is calibrated.
- In some embodiments, computer processor 130 is further configured to carry out the calibration process, whereas in other embodiments a different computer processor is used for the actual calibration process.
- The number of landmarks sufficient for determining whether the sensing device is calibrated is usually lower than the number of landmarks sufficient for actually calibrating it. This important property is exploited herein when determining whether the sensing device is calibrated.
- the computer processor is configured to check whether sensing device 110 is calibrated, wherein in a case that sensing device 110 is calibrated, computer processor 130 is configured to add at least one new landmark 132 to landmarks database 120 , wherein in a case that sensing device 110 is not calibrated, computer processor 130 is configured to calibrate sensing device 110 based on the known landmarks, and wherein in a case that the captured scene does not contain the minimal number of landmarks, computer processor 130 is configured to check calibration of sensing device 110 without using known landmarks (e.g., 10 A, 10 B, and 10 C), and in a case that sensing device 110 is calibrated, computer processor 130 is configured to add at least one new landmark 132 to the landmarks database 120 .
- FIG. 2A is a high level flowchart illustrating a method 200 A for recalibrating sensing devices without familiar targets.
- The method may include: capturing a scene using a sensing device 210; determining whether the sensing device is calibrated 230; and, in a case that the sensing device is calibrated, adding at least one new landmark to the known landmarks 250. In a case that said sensing device is not calibrated, the method includes determining at least some of the captured objects to be objects stored on a database as known landmarks of the scene whose relative positions are known 235, and calibrating the sensing device based on the known landmarks 290.
- In a case that the sensing device is determined to be non-calibrated, it is first checked whether there is a sufficient number of landmarks for calibrating the sensing device 240, and only in a case that the captured scene contains the minimal number of landmarks is the sensing device calibrated based on the known landmarks; otherwise, the process ends 280.
- FIG. 2B is a high level flowchart illustrating a method 200 B for recalibrating sensing devices without familiar targets.
- The method may include: capturing a scene using a sensing device 210; determining whether the captured scene contains a minimal number of known landmarks sufficient for determining whether the sensing device is calibrated 220; in a case that the captured scene contains the minimal number of landmarks, checking whether said sensing device is calibrated 230; in a case that said sensing device is calibrated, adding at least one new landmark to the known landmarks 250; in a case that said sensing device is not calibrated, calibrating the sensing device based on the known landmarks 290; and, in a case that the captured scene does not contain said minimal number of landmarks, checking calibration of the sensing device without using known landmarks 260, and, in a case that the sensing device is calibrated, adding at least one landmark to the known landmarks 250; otherwise, the process ends 270.
- In a case that the sensing device is determined to be non-calibrated, it is first checked whether there is a sufficient number of landmarks for calibrating the sensing device 240, and only in a case that the captured scene contains the minimal number of landmarks is the sensing device calibrated based on the known landmarks; otherwise, the process ends 280.
- Two modes are operable for identifying whether the sensing device is calibrated.
- In the first mode, without the "learned" targets, matching features are determined between the components of the sensing device, and it is checked whether they meet the geometric requirements (such as lying on epipolar lines).
- In the second mode, with learned targets, it is determined which of the learned targets are visible, possibly using standard target recognition. For each of the learned targets, the features from the database are detected in the current frame from the 3D rig with high accuracy, although not all features need to be found. It is then checked how accurately the known features comply with the sensing device's geometric model and its current parameters. If the accuracy is worse than some threshold, the sensing device is deemed not calibrated.
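Both checks reduce to testing whether feature correspondences satisfy a geometric constraint within a threshold. The following is a sketch for a rectified stereo rig, where the epipolar constraint means matched points share an image row; the 1.5 px tolerance and the data are illustrative assumptions:

```python
import numpy as np

# Hypothetical landmark-free check for a rectified stereo rig: matched
# features should obey the epipolar constraint, which here reduces to
# corresponding points lying on the same image row. Tolerance is assumed.
def is_calibrated(matches_left, matches_right, tol_px=1.5):
    """matches_*: (N, 2) arrays of corresponding (x, y) pixel coordinates."""
    row_errors = np.abs(matches_left[:, 1] - matches_right[:, 1])
    return bool(np.median(row_errors) < tol_px)

left = np.array([[100.0, 50.2], [210.0, 80.1], [40.0, 120.0]])
right = np.array([[80.0, 50.0], [190.0, 80.5], [22.0, 119.6]])
# Rows agree to within ~0.5 px, so this rig passes the check.
```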
- the check of calibration is performed using the known landmarks.
- The at least one new landmark is selected based on predefined criteria including at least one of: textured area; uniqueness of appearance; low-complexity structure; size and distribution of the points; and distance from the sensing device.
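One way to combine such criteria is a weighted score; the weights, the [0, 1] normalisation, and the measures below are purely illustrative assumptions:

```python
# Hypothetical scoring of a candidate landmark against the listed criteria.
# All inputs are assumed pre-normalised to [0, 1]; weights are illustrative.
def landmark_score(texture, uniqueness, complexity, spread, distance_m):
    """Higher score = better landmark candidate."""
    nearness = 1.0 / (1.0 + distance_m)  # prefer landmarks near the device
    return (0.3 * texture + 0.3 * uniqueness
            + 0.2 * (1.0 - complexity)   # prefer low-complexity structure
            + 0.1 * spread + 0.1 * nearness)

# A textured, unique corner 1 m away beats a bland, distant patch:
good = landmark_score(0.9, 0.8, 0.2, 0.7, 1.0)
poor = landmark_score(0.2, 0.1, 0.8, 0.3, 10.0)
```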
- the method may further include a step of maintaining a landmarks database configured to store known and newly added landmarks, wherein the database is searchable.
- the database may be configured to further store low resolution versions or other descriptors of the landmarks, for quick retrieval.
- searching the database for a known landmark takes into account various points of views.
- the new landmarks are stored together with metadata (e.g., descriptors or hash functions) for facilitating contextual search of said landmarks.
- The further identified landmarks are stored in a way that allows fast searching through the database, wherein landmarks from the same environment are clustered.
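A minimal sketch of such a clustered, searchable store follows. The set-based descriptors, the Jaccard-overlap match, and all names and thresholds are illustrative assumptions, not the patent's data structures:

```python
from collections import defaultdict

# Hypothetical landmark store: descriptors enable fast matching, and
# landmarks are clustered by environment so a query searches one cluster.
class LandmarkDB:
    def __init__(self):
        self.clusters = defaultdict(list)  # environment id -> landmarks

    def add(self, environment, landmark_id, descriptor):
        self.clusters[environment].append((landmark_id, frozenset(descriptor)))

    def search(self, environment, query_descriptor, min_overlap=0.5):
        """Return ids of landmarks whose descriptors overlap the query enough."""
        q = frozenset(query_descriptor)
        hits = []
        for lm_id, desc in self.clusters[environment]:  # one cluster only
            overlap = len(q & desc) / max(len(q | desc), 1)
            if overlap >= min_overlap:
                hits.append(lm_id)
        return hits

db = LandmarkDB()
db.add("office", "corner-1", {"edge", "corner", "high-contrast"})
db.add("office", "poster-3", {"texture", "planar"})
db.search("office", {"corner", "edge"})  # matches "corner-1"
```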
- FIG. 3 is a block diagram illustrating yet another aspect in accordance with some embodiments of the present invention. An exemplary implementation of the algorithm in accordance with some embodiments of the present invention is described herein.
- Capturing device 110 provides data relating to 2D points in the frame 112, while the landmarks database provides data relating to the landmark 122 in the scene.
- The landmark data 122 is then input to a formula 310 that is tailored to the specific 3D rig or sensing device.
- The formula then maps the landmark data 122 to a calculated 2D point or area on the frame 320, which is compared by a comparator 330 with the data relating to 2D points in the captured frame 112 to yield a delta 340, being the difference between the calculated 2D point on the frame and the data on the 2D point in the frame. Then, in order to minimize delta 340, position 312, angles 314, and calibration parameters 316 are adjusted in various manners known in the art so that delta 340 is minimal. Once delta 340 is minimized, the 3D rig or sensing device is calibrated, and the adjusted calibration parameters are derived and stored.
- the calibration is solved by minimizing an objective function measuring the projection errors.
- Alternatively, it can be solved by using gradient descent.
- the minimizing is carried out by using the Levenberg-Marquardt optimization technique.
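The delta-minimization of FIG. 3 can be illustrated with a toy numerical sketch. The synthetic 3D points, the one-dimensional pinhole model, and the plain Gauss-Newton update below are assumptions for illustration; Levenberg-Marquardt applies the same update with adaptive damping:

```python
import numpy as np

# Toy sketch of minimizing the projection delta: recover a focal length
# and horizontal principal point from synthetic observations. The model
# is linear in these parameters, so Gauss-Newton converges in one step;
# Levenberg-Marquardt would add a damping term to the normal equations.
def project(points_3d, focal, cx):
    """Pinhole projection of x: u = focal * X / Z + cx."""
    return focal * points_3d[:, 0] / points_3d[:, 2] + cx

pts = np.array([[0.5, 0.0, 2.0], [-0.3, 0.0, 1.5], [1.0, 0.0, 4.0]])
observed = project(pts, 700.0, 320.0)          # "captured" 2D points (112)
params = np.array([650.0, 300.0])              # initial calibration guess

for _ in range(5):                             # Gauss-Newton iterations
    delta = project(pts, *params) - observed   # the "delta 340" of FIG. 3
    J = np.column_stack([pts[:, 0] / pts[:, 2], np.ones(len(pts))])
    params -= np.linalg.lstsq(J, delta, rcond=None)[0]

# params converges back to approximately (700, 320)
```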
- Some embodiments of the present invention are not limited to a 3D rig or a sensing device that captures 3D images. They may also be used to calibrate a system that includes a 2D sensing device coupled to an inertial measurement unit (IMU). In such a system, the geometric constraints are given between different frames of the same camera, with the geometry being measured by the IMU.
- In some embodiments, the sensing device is a 2D camera mounted on a moving platform such as a robot, whose motion can be measured from wheel angles and wheel rotations. It should be understood that the aforementioned embodiments should not be regarded as limiting, and further applications may be envisioned to address further use cases.
- Methods of the present invention may be implemented by performing or completing manually, automatically, or a combination thereof, selected steps or tasks.
- the present invention may be implemented in the testing or practice with methods and materials equivalent or similar to those described herein.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Electromagnetism (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
Description
Claims (16)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/618,767 US10499038B2 (en) | 2015-10-01 | 2017-06-09 | Method and system for recalibrating sensing devices without familiar targets |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/872,158 US9681118B2 (en) | 2015-10-01 | 2015-10-01 | Method and system for recalibrating sensing devices without familiar targets |
US15/618,767 US10499038B2 (en) | 2015-10-01 | 2017-06-09 | Method and system for recalibrating sensing devices without familiar targets |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/872,158 Continuation US9681118B2 (en) | 2015-10-01 | 2015-10-01 | Method and system for recalibrating sensing devices without familiar targets |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170280131A1 US20170280131A1 (en) | 2017-09-28 |
US10499038B2 true US10499038B2 (en) | 2019-12-03 |
Family
ID=58422759
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/872,158 Active 2035-12-06 US9681118B2 (en) | 2015-10-01 | 2015-10-01 | Method and system for recalibrating sensing devices without familiar targets |
US15/618,767 Active 2035-11-27 US10499038B2 (en) | 2015-10-01 | 2017-06-09 | Method and system for recalibrating sensing devices without familiar targets |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/872,158 Active 2035-12-06 US9681118B2 (en) | 2015-10-01 | 2015-10-01 | Method and system for recalibrating sensing devices without familiar targets |
Country Status (3)
Country | Link |
---|---|
US (2) | US9681118B2 (en) |
CN (1) | CN108140246A (en) |
WO (1) | WO2017056088A2 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200314411A1 (en) * | 2019-03-29 | 2020-10-01 | Alibaba Group Holding Limited | Synchronizing an illumination sequence of illumination sources with image capture in rolling shutter mode |
US11125581B2 (en) * | 2016-08-01 | 2021-09-21 | Alibaba Technologies (Israel) LTD. | Method and system for calibrating components of an inertial measurement unit (IMU) using scene-captured data |
US11393125B1 (en) * | 2019-12-09 | 2022-07-19 | Gopro, Inc. | Systems and methods for dynamic optical medium calibration |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9681118B2 (en) * | 2015-10-01 | 2017-06-13 | Infinity Augmented Reality Israel Ltd. | Method and system for recalibrating sensing devices without familiar targets |
CN109219785B (en) * | 2016-06-03 | 2021-10-01 | 深圳市大疆创新科技有限公司 | Multi-sensor calibration method and system |
US10666926B1 (en) * | 2017-07-18 | 2020-05-26 | Edge 3 Technologies, Inc. | Residual error mitigation in multiview calibration |
DE102018122411A1 (en) * | 2018-09-13 | 2020-03-19 | Endress+Hauser SE+Co. KG | Process for improving the measurement performance of field devices in automation technology |
US11117591B2 (en) * | 2019-05-08 | 2021-09-14 | Pony Ai Inc. | System and method for recalibration of an uncalibrated sensor |
Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030182072A1 (en) * | 2002-03-19 | 2003-09-25 | Canon Kabushiki Kaisha | Sensor calibration apparatus, sensor calibration method, program, storage medium, information processing method, and information processing apparatus |
US6724922B1 (en) * | 1998-10-22 | 2004-04-20 | Brainlab Ag | Verification of positions in camera images |
US20040167667A1 (en) * | 2002-12-17 | 2004-08-26 | Goncalves Luis Filipe Domingues | Systems and methods for filtering potentially unreliable visual data for visual simultaneous localization and mapping |
US20060188131A1 (en) * | 2005-02-24 | 2006-08-24 | Xiang Zhang | System and method for camera tracking and pose estimation |
CN101214861A (en) | 2007-12-26 | 2008-07-09 | 北京控制工程研究所 | Star sensor attitude determination method at self-determination retrieve rail controlling fault |
US20080167814A1 (en) * | 2006-12-01 | 2008-07-10 | Supun Samarasekera | Unified framework for precise vision-aided navigation |
US20090323121A1 (en) * | 2005-09-09 | 2009-12-31 | Robert Jan Valkenburg | A 3D Scene Scanner and a Position and Orientation System |
US20100176987A1 (en) * | 2009-01-15 | 2010-07-15 | Takayuki Hoshizaki | Method and apparatus to estimate vehicle position and recognized landmark positions using GPS and camera |
US20100238351A1 (en) * | 2009-03-13 | 2010-09-23 | Eyal Shamur | Scene recognition methods for virtual insertions |
US20120121161A1 (en) * | 2010-09-24 | 2012-05-17 | Evolution Robotics, Inc. | Systems and methods for vslam optimization |
CN102937816A (en) | 2012-11-22 | 2013-02-20 | 四川华雁信息产业股份有限公司 | Method and device for calibrating preset position deviation of camera |
US20130057686A1 (en) * | 2011-08-02 | 2013-03-07 | Siemens Corporation | Crowd sourcing parking management using vehicles as mobile sensors |
US20130279779A1 (en) * | 2012-04-19 | 2013-10-24 | General Electric Company | Systems and methods for landmark correction in magnetic resonance imaging |
US20130329944A1 (en) * | 2012-06-12 | 2013-12-12 | Honeywell International Inc. | Tracking aircraft in a taxi area |
US20140313347A1 (en) | 2013-04-23 | 2014-10-23 | Xerox Corporation | Traffic camera calibration update utilizing scene analysis |
US20150304634A1 (en) * | 2011-08-04 | 2015-10-22 | John George Karvounis | Mapping and tracking system |
US20150341628A1 (en) * | 2014-05-21 | 2015-11-26 | GM Global Technology Operations LLC | Method and apparatus for automatic calibration in surrounding view systems |
US20170099477A1 (en) * | 2015-10-01 | 2017-04-06 | Infinity Augmented Reality Israel Ltd. | Method and system for recalibrating sensing devices without familiar targets |
US20180031389A1 (en) * | 2016-08-01 | 2018-02-01 | Infinity Augmented Reality Israel Ltd. | Method and system for calibrating components of an inertial measurement unit (imu) using scene-captured data |
US20180336704A1 (en) * | 2016-02-03 | 2018-11-22 | Sportlogiq Inc. | Systems and Methods for Automated Camera Calibration |
2015
- 2015-10-01 US US14/872,158 patent/US9681118B2/en active Active
2016
- 2016-09-27 CN CN201680057654.5A patent/CN108140246A/en active Pending
- 2016-09-27 WO PCT/IL2016/051061 patent/WO2017056088A2/en active Application Filing
2017
- 2017-06-09 US US15/618,767 patent/US10499038B2/en active Active
Patent Citations (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6724922B1 (en) * | 1998-10-22 | 2004-04-20 | Brainlab Ag | Verification of positions in camera images |
US20030182072A1 (en) * | 2002-03-19 | 2003-09-25 | Canon Kabushiki Kaisha | Sensor calibration apparatus, sensor calibration method, program, storage medium, information processing method, and information processing apparatus |
US20040167667A1 (en) * | 2002-12-17 | 2004-08-26 | Goncalves Luis Filipe Domingues | Systems and methods for filtering potentially unreliable visual data for visual simultaneous localization and mapping |
US8274406B2 (en) * | 2002-12-17 | 2012-09-25 | Evolution Robotics, Inc. | Systems and methods for using multiple hypotheses in a visual simultaneous localization and mapping system |
US20060188131A1 (en) * | 2005-02-24 | 2006-08-24 | Xiang Zhang | System and method for camera tracking and pose estimation |
US20090323121A1 (en) * | 2005-09-09 | 2009-12-31 | Robert Jan Valkenburg | A 3D Scene Scanner and a Position and Orientation System |
US20120206596A1 (en) * | 2006-12-01 | 2012-08-16 | Sri International | Unified framework for precise vision-aided navigation |
US20080167814A1 (en) * | 2006-12-01 | 2008-07-10 | Supun Samarasekera | Unified framework for precise vision-aided navigation |
CN101214861A (en) | 2007-12-26 | 2008-07-09 | 北京控制工程研究所 | Star sensor attitude determination method at self-determination retrieve rail controlling fault |
US20100176987A1 (en) * | 2009-01-15 | 2010-07-15 | Takayuki Hoshizaki | Method and apparatus to estimate vehicle position and recognized landmark positions using GPS and camera |
US20100238351A1 (en) * | 2009-03-13 | 2010-09-23 | Eyal Shamur | Scene recognition methods for virtual insertions |
US20120121161A1 (en) * | 2010-09-24 | 2012-05-17 | Evolution Robotics, Inc. | Systems and methods for vslam optimization |
US20130057686A1 (en) * | 2011-08-02 | 2013-03-07 | Siemens Corporation | Crowd sourcing parking management using vehicles as mobile sensors |
US20150304634A1 (en) * | 2011-08-04 | 2015-10-22 | John George Karvounis | Mapping and tracking system |
US20130279779A1 (en) * | 2012-04-19 | 2013-10-24 | General Electric Company | Systems and methods for landmark correction in magnetic resonance imaging |
US20130329944A1 (en) * | 2012-06-12 | 2013-12-12 | Honeywell International Inc. | Tracking aircraft in a taxi area |
CN102937816A (en) | 2012-11-22 | 2013-02-20 | 四川华雁信息产业股份有限公司 | Method and device for calibrating preset position deviation of camera |
US20140313347A1 (en) | 2013-04-23 | 2014-10-23 | Xerox Corporation | Traffic camera calibration update utilizing scene analysis |
US20150341628A1 (en) * | 2014-05-21 | 2015-11-26 | GM Global Technology Operations LLC | Method and apparatus for automatic calibration in surrounding view systems |
US20170099477A1 (en) * | 2015-10-01 | 2017-04-06 | Infinity Augmented Reality Israel Ltd. | Method and system for recalibrating sensing devices without familiar targets |
US9681118B2 (en) * | 2015-10-01 | 2017-06-13 | Infinity Augmented Reality Israel Ltd. | Method and system for recalibrating sensing devices without familiar targets |
US20170280131A1 (en) * | 2015-10-01 | 2017-09-28 | Infinity Augmented Reality Israel Ltd. | Method and system for recalibrating sensing devices without familiar targets |
US20180336704A1 (en) * | 2016-02-03 | 2018-11-22 | Sportlogiq Inc. | Systems and Methods for Automated Camera Calibration |
US20180031389A1 (en) * | 2016-08-01 | 2018-02-01 | Infinity Augmented Reality Israel Ltd. | Method and system for calibrating components of an inertial measurement unit (imu) using scene-captured data |
US10012517B2 (en) * | 2016-08-01 | 2018-07-03 | Infinity Augmented Reality Israel Ltd. | Method and system for calibrating components of an inertial measurement unit (IMU) using scene-captured data |
US20180283898A1 (en) * | 2016-08-01 | 2018-10-04 | Infinity Augmented Reality Israel Ltd. | Method and system for calibrating components of an inertial measurement unit (imu) using scene-captured data |
Non-Patent Citations (5)
Title |
---|
Author Unknown, "Gradient Descent", Wikipedia, Version Published on Sep. 10, 2015 (Year: 2015). * |
Author Unknown, "Inertial Measurement Unit", Wikipedia, Version Published on Sep. 15, 2015 (Year: 2015). * |
Author Unknown, "Levenberg-Marquardt Algorithm", Wikipedia, Version Published on Aug. 25, 2015 (Year: 2015). * |
Davison et al., "MonoSLAM: Real-Time Single Camera SLAM", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, No. 6, Jun. 2007. |
Office Action dated Jul. 31, 2019 for Chinese Patent Application No. 2016800576545. |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11125581B2 (en) * | 2016-08-01 | 2021-09-21 | Alibaba Technologies (Israel) LTD. | Method and system for calibrating components of an inertial measurement unit (IMU) using scene-captured data |
US20200314411A1 (en) * | 2019-03-29 | 2020-10-01 | Alibaba Group Holding Limited | Synchronizing an illumination sequence of illumination sources with image capture in rolling shutter mode |
US11393125B1 (en) * | 2019-12-09 | 2022-07-19 | Gopro, Inc. | Systems and methods for dynamic optical medium calibration |
US11670004B2 (en) | 2019-12-09 | 2023-06-06 | Gopro, Inc. | Systems and methods for dynamic optical medium calibration |
Also Published As
Publication number | Publication date |
---|---|
WO2017056088A2 (en) | 2017-04-06 |
US20170280131A1 (en) | 2017-09-28 |
CN108140246A (en) | 2018-06-08 |
US9681118B2 (en) | 2017-06-13 |
US20170099477A1 (en) | 2017-04-06 |
WO2017056088A3 (en) | 2017-08-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10499038B2 (en) | Method and system for recalibrating sensing devices without familiar targets | |
CN111210468B (en) | Image depth information acquisition method and device | |
US9858684B2 (en) | Image processing method and apparatus for calibrating depth of depth sensor | |
US9805512B1 (en) | Stereo-based calibration apparatus | |
US8655094B2 (en) | Photogrammetry system and method for determining relative motion between two bodies | |
CN109752003B (en) | Robot vision inertia point-line characteristic positioning method and device | |
US20100315490A1 (en) | Apparatus and method for generating depth information | |
US11985292B1 (en) | Residual error mitigation in multiview calibration | |
KR102522228B1 (en) | Apparatus and method for calibrating vehicle camera system | |
CN112991453B (en) | Calibration parameter verification method and device for binocular camera and electronic equipment | |
US9449378B2 (en) | System and method for processing stereoscopic vehicle information | |
US11527006B2 (en) | System and method for dynamic stereoscopic calibration | |
CN116576850B (en) | Pose determining method and device, computer equipment and storage medium | |
Ye et al. | A calibration trilogy of monocular-vision-based aircraft boresight system | |
JP2010145219A (en) | Movement estimation device and program | |
Chen et al. | Robust homography for real-time image un-distortion | |
KR20140068444A (en) | Apparatus for calibrating cameras using multi-layered planar object image and method thereof | |
Kim | Analysis on the characteristics of camera lens distortion | |
Bräuer-Burchardt et al. | Calibration evaluation and calibration stability monitoring of fringe projection based 3D scanners | |
CN112970249A (en) | Assembly for calibrating a camera and measurement of such an assembly | |
CN112750205A (en) | Plane dynamic detection system and detection method | |
Šolony et al. | Scene reconstruction from kinect motion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INFINITY AUGMENTED REALITY ISRAEL LTD., ISRAEL Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PROTTER, MATAN;KUSHNIR, MOTTI;REEL/FRAME:047559/0898 Effective date: 20151007 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
|
AS | Assignment |
Owner name: ALIBABA TECHNOLOGY (ISRAEL) LTD., ISRAEL Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INFINITY AUGMENTED REALITY ISRAEL LTD.;REEL/FRAME:050873/0634 Effective date: 20191024 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: ALIBABA DAMO (HANGZHOU) TECHNOLOGY CO., LTD., CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ALIBABA TECHNOLOGY (ISRAEL) LTD.;REEL/FRAME:063006/0087 Effective date: 20230314 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FEPP | Fee payment procedure |
Free format text: SURCHARGE FOR LATE PAYMENT, LARGE ENTITY (ORIGINAL EVENT CODE: M1554); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |