CN111780715A - Visual ranging method - Google Patents

Visual ranging method

Info

Publication number
CN111780715A
Authority
CN
China
Prior art keywords
robot
navigation mark
picture
camera
navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010603837.0A
Other languages
Chinese (zh)
Inventor
庄孟文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changzhou Ynen Electrical Co ltd
Original Assignee
Changzhou Ynen Electrical Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changzhou Ynen Electrical Co ltd filed Critical Changzhou Ynen Electrical Co ltd
Priority to CN202010603837.0A priority Critical patent/CN111780715A/en
Publication of CN111780715A publication Critical patent/CN111780715A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00: Measuring distances in line of sight; Optical rangefinders
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/20: Instruments for performing navigational calculations
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 11/00: Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/12: Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00: Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10: Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/10544: Methods or arrangements for sensing record carriers by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K 7/10712: Fixed beam scanning
    • G06K 7/10722: Photodetector array or CCD scanning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00: Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10: Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14: Methods or arrangements for sensing record carriers using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404: Methods for optical code recognition
    • G06K 7/1408: Methods for optical code recognition, the method being specifically adapted for the type of code
    • G06K 7/1417: 2D bar codes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00: Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10: Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14: Methods or arrangements for sensing record carriers using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404: Methods for optical code recognition
    • G06K 7/1439: Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K 7/1443: Methods for optical code recognition including a method step for locating the code in an image

Abstract

The invention relates to a visual ranging method for a mobile robot equipped with a camera, comprising the following steps: 1. the robot creates a camera feature database; 2. during operation, the robot photographs a navigation mark of known size at a fixed focal length or zoom setting of the camera; 3. the robot identifies the navigation mark in the picture and measures its size in the picture; 4. the robot calculates its distance to the navigation mark from the known size of the mark, its measured size in the picture, and the correspondence table relating robot-to-mark distance to in-picture mark size for each fixed-focus or zoom setting. The invention accurately measures the distance between a robot or manipulator and a target object; through this simple and reliable ranging method, the robot can locate itself at any time while executing a task and move stably and reliably to the target point to complete it.

Description

Visual ranging method
Technical Field
The invention relates to the technical field of robots, in particular to a visual ranging method.
Background
Positioning technology is a key technology for mobile robots. A robot must locate itself in real time while executing a task in order to meet its control requirements. Ordinary GPS positioning has a large error, generally more than 1 meter, which clearly cannot satisfy applications that need accurate positioning, such as industrial logistics transport and inspection, or grasping an object whose distance from the robot is uncertain. It is therefore valuable to measure the distance between the robot and a navigation mark, or between a manipulator and a target object, to provide positioning data for the motion of the robot or manipulator. Common ranging methods currently include ultrasonic ranging, laser ranging, and visual ranging.
(1) Ultrasonic ranging and laser ranging
Both ultrasonic and laser ranging calculate the distance between the measured object and the sensor from the time difference between emission and return of a signal. However, the divergence angle of ultrasound is wide: if an object nearer to the emitter lies around the target, ultrasonic ranging measures the distance to that nearer object rather than to the target. The laser's viewing angle, by contrast, is too narrow: keeping the beam accurately aimed at the target while the robot moves is essentially infeasible. In addition, conventional laser rangefinders have long latency, so the robot cannot obtain real-time distance information while moving, and a custom high-speed device is expensive. In practice a lidar is generally used to obtain distance, but lidar trades positioning accuracy for measurement speed, so its accuracy is not very high, and the error grows when ranging distant targets.
(2) Visual ranging
Vision-based distance measurement generally takes one of two forms: monocular ranging and binocular ranging.
Patent CN109544633A, "Object distance measuring method, device and equipment", discloses a monocular visual ranging method: a monocular camera photographs a traffic object, the camera's intrinsic and extrinsic parameters are read, the object's size is read from a preset standard, and the distance between the object and a reference is then calculated. However, even cameras of the same specification differ, for manufacturing reasons, in the pictures they take from the same position; where positioning requirements are high, for example at centimeter- or even millimeter-level error, this difference alone exceeds the error tolerance. Reading the camera's intrinsic parameters (intrinsic matrix and distortion coefficients) and extrinsic parameters (pose relative to other coordinate systems) therefore cannot satisfy accurate ranging.
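For context, monocular methods of this kind rest on the standard pinhole projection relation (a textbook formula, not quoted from the patent): an object of physical width W at distance d, imaged through focal length f expressed in pixel units, appears with width w in the picture, so

```latex
% Standard pinhole projection (textbook relation, not from the patent text):
w = \frac{f\,W}{d} \quad\Longrightarrow\quad d = \frac{f\,W}{w}
```

The per-camera variation this passage objects to enters through f and the distortion coefficients, which is why the method described below substitutes a correspondence table measured on the robot's own camera.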
Patent CN109029363A discloses a target ranging method based on deep learning, which comprises building target databases at different distances, constructing a target ranging model, designing its loss function and training method, and testing the trained model. The target ranging problem is converted into a regression problem and integrated into a target detection model, so that detection and ranging are realized in a single model. According to its specification, the best result of the method matches the ranging accuracy of a binocular camera, with an error of 1.2% to 4.6% at 1 m to 5 m, i.e., up to roughly 23 cm at 5 m; this is several times, even ten times, larger than the centimeter- or millimeter-level error required for positioning robots and manipulators.
Patent CN108508897A discloses a vision-based automatic charging alignment system and method for robots: an active visual guide fitted with a visible- or infrared-light source is installed on the charging dock, an ordinary camera is installed at the charging end of the mobile robot, the camera captures the guide mark in the image, the robot's position and approximate angle relative to the dock are calculated from the mark's position and visual features, and an alignment motion-control algorithm then drives the robot to align accurately with the dock. In this method the light source is prone to damage, which disables the method, and since only the position and an approximate angle relative to the dock are calculated, the distance is not accurate.
Patent CN108279685A discloses a carrier trolley based on visual following and its working method; claim 2 of that patent states that the automatic following mode uses template matching for visual following to determine the distance between the user and the trolley, with the controller matching the LOGO image acquired by the camera in real time against a standard LOGO image. Because the LOGO on work clothes can deform, is generally small, and its size is not measured in combination with the camera's characteristics, the mutual distance cannot be controlled accurately during following.
In summary, the field lacks a simple and reliable method for measuring the accurate distance between a robot or manipulator and a target object, one that lets the robot locate itself at any time while executing a task and walk stably and reliably to the target point to complete the task.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a visual ranging method with which the robot can accurately know the distance between its current position and the target position, and can move stably and reliably to the target position to complete its task.
The technical solution adopted by the invention is as follows: a visual ranging method, involving a mobile robot equipped with a camera, comprising the following steps:
1. The robot creates a camera feature database: the robot photographs navigation marks of known size from different distances, either at the camera's fixed focal length or at various zoom settings; the robot identifies the navigation mark in each picture and measures its size in the picture; and a correspondence table is established between the robot-to-mark distance and the in-picture mark size for each fixed-focus or zoom setting.
2. While working, the robot photographs a navigation mark of known size at a fixed focal length or zoom setting.
3. The robot identifies the navigation mark in the picture and measures its size in the picture.
Further, step 3 means that the robot extracts the key feature of the navigation mark from the picture and calculates the size of that key feature in the picture.
4. The robot calculates its distance to the navigation mark from the known size of the mark, the measured in-picture size, and the correspondence table relating distance to in-picture mark size for the fixed-focus or zoom setting used.
Further, the robot calculates the distance from the known dimensions of the navigation mark and the measured size of its key feature in the picture, via the correspondence table between distance and in-picture mark size at the known focal length.
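As an illustration of steps 2 through 4, the sketch below is a minimal, hypothetical Python implementation; the names FEATURE_DB and distance_from_pixels, the table values, and the use of linear interpolation between calibrated entries are assumptions, not part of the patent (the embodiment later uses exact table hits):

```python
from bisect import bisect_left

# Correspondence table per focal-length setting, as built in step 1:
# {zoom_factor: [(key_feature_pixels, distance_m), ...]}, sorted by pixel size.
# Placeholder values mirroring the embodiment's 60 x 60 cm mark; a real table
# comes from calibration shots taken with the robot's own camera.
FEATURE_DB = {
    1.0: [(200, 4.0), (300, 3.0), (400, 2.0), (500, 1.0), (600, 0.5)],
    2.0: [(300, 4.0), (400, 3.0), (500, 2.0), (600, 1.0)],
}

def distance_from_pixels(zoom: float, px: float) -> float:
    """Look up the robot-to-mark distance from the measured pixel size."""
    table = FEATURE_DB[zoom]
    sizes = [s for s, _ in table]
    i = bisect_left(sizes, px)
    if i == 0:
        return table[0][1]            # beyond the farthest calibrated distance
    if i == len(table):
        return table[-1][1]           # nearer than the closest calibrated distance
    (s0, d0), (s1, d1) = table[i - 1], table[i]
    t = (px - s0) / (s1 - s0)         # interpolate between adjacent entries
    return d0 + t * (d1 - d0)

print(distance_from_pixels(1.0, 300))  # 3.0 m, matching the embodiment below
```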
In the above technical solution, preferably, when the robot photographs navigation marks of known size from different distances at a fixed focal length or zoom setting, the viewing-angle plane of the robot camera is parallel to the plane of the navigation mark, and the center of the mark coincides with the horizontal center line of the picture, the vertical center line of the picture, or the center of the picture.
In the above technical solution, preferably, when the robot photographs a navigation mark of known size at a fixed focal length or zoom setting, the viewing-angle plane of the robot camera is parallel to the plane of the navigation mark, and the center of the mark coincides with the horizontal center line of the picture, the vertical center line of the picture, or the center of the picture.
Further, when the camera photographs the navigation mark, the camera's viewing-angle plane should be parallel to the mark, with the mark's center coinciding with the horizontal center line of the picture, the vertical center line of the picture, or the center of the picture; when this condition is not met, the robot adjusts its motion and recalculates until it is.
More specifically, "not met" means that the positioning error calculated by the robot exceeds the allowable tolerance; for example, the deviation between the mark's center line and the picture's center line exceeds the allowable range.
In the above technical solution, the navigation mark is an identification pattern comprising at least one of a two-dimensional code pattern, a circular pattern, a polygonal pattern, and a color-block pattern.
In the above technical solution, the camera of the mobile robot is mounted on a manipulator of the mobile robot.
In the above technical solution, the method further comprises an electronic map carrying navigation mark information and the distance information between positioning points and navigation marks.
In the above technical solution, instead of creating its own camera feature database, the robot may, according to the technical parameters of its mounted camera, directly import a camera feature database already established with cameras of the same technical parameters.
Further, when the robot imports a database established with cameras of identical technical parameters, it calibrates the data on itself according to the steps of the visual ranging method; this greatly reduces the workload while still guaranteeing ranging accuracy.
The advantage of the invention is that it overcomes the defects in the background art: it accurately measures the distance between the robot or manipulator and the target object, and through a simple and reliable ranging method the robot can locate itself at any time while executing a task and move stably and reliably to the target point to complete it.
Drawings
FIG. 1 is a schematic view of the visual ranging method of the present invention; FIG. 1-1 is a flow chart of the method; FIG. 1-2 is a schematic picture of a navigation mark of known size (assumed to be a 60 × 60 cm square) taken by the robot at 1× fixed focal length.
FIG. 2 shows navigation-mark pictures taken by the robot at 1× fixed focal length; FIG. 2-1 is a picture taken when the camera's viewing-angle plane is not parallel to the navigation mark; FIG. 2-2 is a picture in which the vertical center line of the navigation mark does not coincide with the vertical center line of the picture.
FIG. 3 shows the electronic map of the present invention, containing navigation mark information and the distances between positioning points and navigation marks.
In the figures: A is the starting point, B is a positioning point, G is the target point, and M1, M2, M3 and M4 are navigation marks.
Detailed Description
The invention will now be described in further detail with reference to the drawings and preferred embodiments. The drawings are simplified schematics that illustrate only the basic structure of the invention, and therefore show only the parts relevant to it.
Using the detection patterns at three corners of a two-dimensional code, the code can be rapidly identified in a picture and the coordinates of its center point calculated; this makes it convenient to compare the center-line deviation between the code and the picture and to judge whether the robot needs to adjust its posture. The two-dimensional code pattern is therefore preferred as the navigation mark in this embodiment.
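As a hedged sketch of this step, the snippet below uses OpenCV's QRCodeDetector to locate the code's corner points, then derives the mark's center offset from the picture's center lines and the key-feature size in pixels; mark_center_offset is a hypothetical helper, and the patent does not prescribe any particular detector:

```python
import cv2
import numpy as np

def mark_center_offset(image_path: str):
    """Locate a QR-code navigation mark; return its offset from the picture
    center lines (dx, dy, in pixels) and its key-feature side length."""
    img = cv2.imread(image_path)
    ok, points = cv2.QRCodeDetector().detect(img)
    if not ok:
        return None                          # no mark found in the picture
    corners = np.squeeze(points)             # 4 corner points, shape (4, 2)
    cx, cy = corners.mean(axis=0)            # center of the mark in pixels
    side_px = float(np.linalg.norm(corners[0] - corners[1]))
    h, w = img.shape[:2]
    return cx - w / 2.0, cy - h / 2.0, side_px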
1. A visual ranging method, as shown in FIG. 1-1, involving a mobile robot equipped with a camera and comprising the following steps:
(1) The robot creates a camera feature database, as in the table below.
Distance to navigation mark (60 × 60 cm)   0.5 m     1 m       2 m       3 m       4 m
Key-feature size at 1× fixed focus         600×600   500×500   400×400   300×300   200×200 pixels
Key-feature size at 2× fixed focus         n/a       600×600   500×500   400×400   300×300 pixels
At 1× fixed focal length, the robot takes pictures at 0.5 m, 1 m, 2 m, 3 m and 4 m from a navigation mark (actual size 60 × 60 cm); correspondingly, it recognizes the key-feature sizes of the mark in the pictures as 600 × 600, 500 × 500, 400 × 400, 300 × 300 and 200 × 200 pixels. At 2× fixed focal length, the robot takes pictures at 1 m, 2 m, 3 m and 4 m from the same mark; correspondingly, it recognizes key-feature sizes of 600 × 600, 500 × 500, 400 × 400 and 300 × 300 pixels. A correspondence table between the robot-to-mark distance and the in-picture mark size at the different fixed focal lengths is thus established: the camera feature database.
(2) While the robot is working, the camera photographs a navigation mark of known size at 1× fixed focal length (FIG. 1-2).
(3) The robot extracts the key feature of the navigation mark from the captured picture (1920 × 1080 pixels) and calculates its size as 300 × 300 pixels (FIG. 1-2).
(4) For this picture of the navigation mark (actual size 60 × 60 cm) taken at 1× fixed focal length, the robot computes the in-picture mark size of 300 × 300 pixels and looks up the corresponding shooting distance in the camera feature database: 3 meters. The distance from the robot to the navigation mark is therefore 3 meters.
2. Preferably, when the camera photographs the navigation mark, the camera's viewing-angle plane is parallel to the mark, and the center of the mark coincides with the horizontal center line of the picture, the vertical center line of the picture, or the center of the picture. When the center of the mark does not coincide with any of these, whether the robot needs a motion adjustment is decided from the relationship between the mark's position deviation and the allowable positioning deviation.
Since positioning error causes visual-ranging error, the positioning-accuracy tolerance is specified according to the tolerance of the visual-ranging error.
In this embodiment, to keep the visual-ranging error no greater than 2 cm, the allowable positioning deviation is set to no more than 5 cm.
FIG. 2-1 is a picture of a navigation mark of known size (60 × 60 cm) taken at 1× fixed focal length. The robot extracts the mark's key feature (400 × 400 pixels) from the picture (1920 × 1080 pixels) and finds that the vertical center line of the key feature deviates from the vertical center line of the picture by 20 pixels; at 60 cm / 400 pixels = 0.15 cm per pixel, this converts to an actual deviation of 3 cm, which is within the allowable deviation, so no motion adjustment is needed. In FIG. 2-2, the robot photographs the same mark at 1× fixed focal length, extracts the key feature (500 × 500 pixels) from the picture (1920 × 1080 pixels), and finds that the horizontal center line of the key feature deviates from the horizontal center line of the picture by 50 pixels; at 60 cm / 500 pixels = 0.12 cm per pixel, this converts to 6 cm, which exceeds the allowable deviation, so the robot must adjust its motion and retake the picture before recalculating.
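The pixel-to-centimeter conversion above is simple enough to state as code; this hypothetical snippet (deviation_cm and its constants are illustrative, with values taken from the embodiment's numbers) reproduces both checks:

```python
MARK_CM = 60.0   # real side length of the navigation mark
TOL_CM = 5.0     # allowable positioning deviation in this embodiment

def deviation_cm(mark_px: float, offset_px: float) -> float:
    """Convert a center-line offset in pixels to centimeters, using the
    mark's known size as the scale reference."""
    return offset_px * (MARK_CM / mark_px)  # e.g. 60 cm / 400 px = 0.15 cm/px

print(deviation_cm(400, 20) <= TOL_CM)  # True:  3 cm, no adjustment needed
print(deviation_cm(500, 50) <= TOL_CM)  # False: 6 cm, robot must adjust
```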
3. The electronic map contains the navigation mark information and the distances between positioning points and navigation marks (FIG. 3).
On the electronic map, the robot walks from point A to point B, turns left toward point G when it is 1 meter from navigation mark M1, and stops moving when it is 3 meters from navigation mark M2. Returning from point G to point A, it walks from G to B, turns right toward A when it is 1 meter from navigation mark M3, and stops moving when it is 1 meter from navigation mark M4.
In actual movement, the robot walks from A toward B while photographing navigation mark M1 at 1× fixed focal length. When the key feature of M1 measures 500 × 500 pixels, the corresponding shooting distance in the camera feature database is 1 meter, so the robot is 1 meter from M1 and turns left toward point G. As it travels it continuously photographs and analyzes the mark M2 ahead; because M2 is far away, the robot photographs it at 2× fixed focal length. When the extracted key feature measures 400 × 400 pixels, the corresponding distance in the database is 3 meters, so the robot is 3 meters from M2 and stops moving. The return from point G to point A proceeds in the same way and is not repeated here.
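A hypothetical control loop for the A-to-G leg might look as follows; measure_distance and the motion primitives step_forward, turn_left and stop are assumed placeholders (the patent specifies only the distance triggers), and distance_from_pixels is the lookup sketched earlier:

```python
def measure_distance(robot, mark: str, zoom: float) -> float:
    """Placeholder: photograph `mark` at `zoom`, measure its key feature in
    pixels (robot.shoot_and_measure is hypothetical), and convert via the
    calibration table."""
    px = robot.shoot_and_measure(mark, zoom)
    return distance_from_pixels(zoom, px)

def go_a_to_g(robot):
    # Walk from A toward B, ranging on M1 at 1x fixed focal length,
    # and turn left toward G once M1 reads 1 meter.
    while measure_distance(robot, mark="M1", zoom=1.0) > 1.0:
        robot.step_forward()
    robot.turn_left()
    # Walk toward G, ranging on the distant M2 at 2x fixed focal length,
    # and stop once M2 reads 3 meters.
    while measure_distance(robot, mark="M2", zoom=2.0) > 3.0:
        robot.step_forward()
    robot.stop()
```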
4. Mobile robots of identical configuration with identical cameras can directly import a camera feature database established with a camera of the same parameters.
If a user deploys several mobile robots of identical configuration with identical cameras to execute tasks in the same environment, the user need only create a camera feature database with one camera-equipped robot and then import the established database into the others, saving a great deal of time.
Although the robots share the same configuration and cameras, to guarantee ranging accuracy it is preferable, when directly importing a camera feature database established with cameras of identical technical parameters, to randomly sample a certain proportion of the data on each robot and calibrate it according to the steps of the visual ranging method.
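A sketch of that spot check, under the assumption that the database has the {zoom: [(pixels, distance)]} shape used earlier and that measure_px is a user-supplied routine that re-shoots the mark on this robot (both are assumptions, not from the patent):

```python
import random

def spot_check(feature_db, measure_px, fraction=0.2, tol_px=5):
    """Re-measure a random fraction of imported table entries on this robot
    and return the entries whose pixel sizes drift beyond tolerance."""
    flagged = []
    for zoom, rows in feature_db.items():
        k = max(1, int(len(rows) * fraction))
        for px_expected, dist_m in random.sample(rows, k):
            px_actual = measure_px(zoom, dist_m)  # shoot the mark at dist_m
            if abs(px_actual - px_expected) > tol_px:
                flagged.append((zoom, dist_m, px_expected, px_actual))
    return flagged
```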
While particular embodiments of the invention have been described above, various modifications and alterations to them will become apparent to those skilled in the art from this description without departing from the spirit and scope of the invention.

Claims (9)

1. A visual ranging method, characterized by comprising a mobile robot equipped with a camera and the following steps:
(1) the robot creates a camera feature database: the robot photographs navigation marks of known size from different distances at a fixed focal length or zoom setting of the camera; the robot identifies the navigation mark in each picture and measures its size in the picture; and a correspondence table between the robot-to-mark distances and the in-picture mark sizes is established for each fixed-focus or zoom setting;
(2) while working, the robot photographs a navigation mark of known size with the camera at a fixed focal length or zoom setting;
(3) the robot identifies the navigation mark in the picture and measures its size in the picture;
(4) the robot calculates the distance between itself and the navigation mark from the known size of the mark, its measured size in the picture, and the correspondence table between robot-to-mark distances and in-picture mark sizes for the fixed-focus or zoom setting used.
2. A visual ranging method as claimed in claim 1, characterized in that: in step (1), the viewing-angle plane of the robot camera is parallel to the plane of the navigation mark, and the center of the navigation mark coincides with the horizontal center line of the picture, the vertical center line of the picture, or the center of the picture.
3. A visual ranging method as claimed in claim 1, characterized in that: in step (2), the viewing-angle plane of the robot camera is parallel to the plane of the navigation mark, and the center of the navigation mark coincides with the horizontal center line of the picture, the vertical center line of the picture, or the center of the picture.
4. A visual ranging method as claimed in claim 1, characterized in that: the navigation mark is an identification pattern, and comprises at least one of a two-dimensional code pattern, a circular pattern, a polygonal pattern and a color block pattern.
5. A visual ranging method as claimed in claim 1, characterized in that: the camera of the mobile robot is mounted on a manipulator of the mobile robot.
6. A visual ranging method as claimed in claim 1, characterized in that: the method further comprises an electronic map carrying navigation mark information and the distance information between positioning points and navigation marks.
7. A visual ranging method as claimed in claim 3, characterized in that: when the camera photographs the navigation mark, the viewing-angle plane of the camera is parallel to the navigation mark and the center of the mark coincides with the horizontal center line of the picture, the vertical center line of the picture, or the center of the picture; when these conditions are not met, the robot adjusts its motion and recalculates until they are.
8. A visual ranging method as claimed in claim 1, characterized in that: the created camera feature database is a directly imported database already established with cameras having the same technical parameters.
9. A visual ranging method as claimed in claim 1, characterized in that: the camera feature database imported from cameras of the same parameters is calibrated on the robot itself.
CN202010603837.0A 2020-06-29 2020-06-29 Visual ranging method Pending CN111780715A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010603837.0A CN111780715A (en) 2020-06-29 2020-06-29 Visual ranging method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010603837.0A CN111780715A (en) 2020-06-29 2020-06-29 Visual ranging method

Publications (1)

Publication Number Publication Date
CN111780715A 2020-10-16

Family

ID=72760154

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010603837.0A Pending CN111780715A (en) 2020-06-29 2020-06-29 Visual ranging method

Country Status (1)

Country Link
CN (1) CN111780715A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6058339A (en) * 1996-11-18 2000-05-02 Mitsubishi Denki Kabushiki Kaisha Autonomous guided vehicle guidance device
US20050080512A1 (en) * 2003-09-29 2005-04-14 Critchlow Michael E. System and method for library robotics positional accuracy using parallax viewing
US20110135157A1 (en) * 2009-12-08 2011-06-09 Electronics And Telecommunications Research Institute Apparatus and method for estimating distance and position of object based on image of single camera
CN201993107U (en) * 2011-01-31 2011-09-28 张东 Novel vehicle-mounted range finder
US20140300722A1 (en) * 2011-10-19 2014-10-09 The Regents Of The University Of California Image-based measurement tools
CN104748738A (en) * 2013-12-31 2015-07-01 深圳先进技术研究院 Indoor positioning navigation method and system
CN104197901A (en) * 2014-09-19 2014-12-10 成都翼比特科技有限责任公司 Image distance measurement method based on marker
CN105973236A (en) * 2016-04-26 2016-09-28 乐视控股(北京)有限公司 Indoor positioning or navigation method and device, and map database generation method
US20170315558A1 (en) * 2016-04-28 2017-11-02 Sharp Laboratories of America (SLA), Inc. System and Method for Navigation Assistance
CN106969766A (en) * 2017-03-21 2017-07-21 北京品创智能科技有限公司 A kind of indoor autonomous navigation method based on monocular vision and Quick Response Code road sign
US20190244382A1 (en) * 2018-02-06 2019-08-08 Saudi Arabian Oil Company Computer vision system and method for tank calibration using optical reference line method
CN110017841A (en) * 2019-05-13 2019-07-16 大有智能科技(嘉兴)有限公司 Vision positioning method and its air navigation aid

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113012407A (en) * 2021-02-18 2021-06-22 上海电机学院 Eye screen distance prompt myopia prevention system based on machine vision
CN115077468A (en) * 2022-06-08 2022-09-20 闽江学院 Zooming distance measurement method and device
CN115077468B (en) * 2022-06-08 2024-03-12 闽江学院 Zoom ranging method and device
CN116196109A (en) * 2023-04-27 2023-06-02 北京碧莲盛不剃发植发医疗美容门诊部有限责任公司 Non-shaving hair-planting manipulator based on image recognition

Similar Documents

Publication Publication Date Title
US20210041222A1 (en) Registration of three-dimensional coordinates measured on interior and exterior portions of an object
US9197810B2 (en) Systems and methods for tracking location of movable target object
CN110458961B (en) Augmented reality based system
US11361469B2 (en) Method and system for calibrating multiple cameras
US6559931B2 (en) Three-dimensional (3-D) coordinate measuring method, 3-D coordinate measuring apparatus, and large-structure building method
CN111780715A (en) Visual ranging method
CN105953771B (en) A kind of active theodolite system and measuring method
CN105014678A (en) Robot hand-eye calibration method based on laser range finding
CN1761855A (en) Method and device for image processing in a geodetic measuring device
EP3421930A1 (en) Three-dimensional shape data and texture information generation system, photographing control program, and three-dimensional shape data and texture information generation method
CN112598750A (en) Calibration method and device for road side camera, electronic equipment and storage medium
CN114838668B (en) Tunnel displacement monitoring method and system
CN108958256A (en) A kind of vision navigation method of mobile robot based on SSD object detection model
US20200318964A1 (en) Location information display device and surveying system
JP5019478B2 (en) Marker automatic registration method and system
JPH09311021A (en) Method for measuring three-dimensional shape using light wave range finder
JP2001296124A (en) Method and apparatus for measurement of three- dimensional coordinates
KR101356644B1 (en) System for localization and method thereof
Lange et al. Cost-efficient mono-camera tracking system for a multirotor UAV aimed for hardware-in-the-loop experiments
CN113218392A (en) Indoor positioning navigation method and navigation device
CN112504263A (en) Indoor navigation positioning device based on multi-view vision and positioning method thereof
Chen et al. Research on stability and accuracy of the OptiTrack system based on mean error
Shojaeipour et al. Robot path obstacle locator using webcam and laser emitter
Yang et al. Beam orientation of EAST visible optical diagnostic using a robot-camera system
CN114527471A (en) Non-contact type layering positioning instrument positioning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination