CN107478230B - Trolley navigation system based on visual information - Google Patents

Trolley navigation system based on visual information

Info

Publication number
CN107478230B
CN107478230B (application CN201710670538.7A)
Authority
CN
China
Prior art keywords
trolley
pcduino
axis
server
visual information
Prior art date
Legal status
Active
Application number
CN201710670538.7A
Other languages
Chinese (zh)
Other versions
CN107478230A (en)
Inventor
陈海山
郭中华
苑俊英
Current Assignee
Nanfang College of Sun Yat-sen University
Original Assignee
Nanfang College of Sun Yat-sen University
Priority date
Filing date
Publication date
Application filed by Nanfang College of Sun Yat-sen University
Priority to CN201710670538.7A priority Critical patent/CN107478230B/en
Publication of CN107478230A publication Critical patent/CN107478230A/en
Application granted granted Critical
Publication of CN107478230B publication Critical patent/CN107478230B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0219 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Manufacturing & Machinery (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Navigation (AREA)

Abstract

The invention belongs to the technical field of electronic communication, and particularly relates to a trolley navigation system based on visual information. The system comprises a wireless camera, a server, a wireless signal transmission module and a trolley; the trolley carries a Pcduino and an Arduino; the Arduino is connected with a motor, a steering engine and a crystal oscillator. The invention provides a novel visual-information-based trolley navigation system that solves the technical problem that visual-information navigation systems in the prior art are not accurate enough. As a result, trolley navigation is more accurate and the error rate is lower.

Description

Trolley navigation system based on visual information
Technical Field
The invention belongs to the technical field of electronic communication, and particularly relates to a trolley navigation system based on visual information.
Background
Arduino is a flexible, easy-to-use open-source electronic prototyping platform comprising both hardware and software. It is suited to hobbyists, artists, designers, and anyone interested in "interaction". It can be quickly combined with software such as Adobe Flash, Processing, Max/MSP, Pure Data, and SuperCollider to create interactive works. Arduino can use existing electronic components such as switches, sensors or other control devices, LEDs, stepper motors or other output devices. Arduino can also run independently and interact with software.
Pcduino is a high-performance, cost-effective mini-PC platform that can run PC operating systems such as Ubuntu and Android ICS. It can output video to a television or monitor through its built-in HDMI interface.
Disclosure of Invention
The invention provides a novel visual information-based trolley navigation system, which solves the technical problem that the visual information navigation system in the prior art is not accurate enough in navigation.
The specific technical scheme of the invention is a trolley navigation system based on visual information, comprising a wireless camera, a server, a wireless transmission module and a trolley; the trolley carries a Pcduino and an Arduino; the Arduino is connected with a motor, a steering engine and a first crystal oscillator; the Pcduino is connected with a gyroscope, a magnetometer and a second crystal oscillator. The wireless camera is in wireless communication connection with the server, acquires image information in real time, and transmits the acquired information to the server wirelessly. The server is electrically connected with the wireless transmission module, detects the target and the trolley in the image through OpenCV, and sends the coordinate vectors of the target and the trolley to the Pcduino through the wireless transmission module. The Pcduino is connected with the Arduino through a serial port; after receiving the data, the Pcduino performs the program calculation and then sends instructions to the Arduino over the serial connection. The rotating steering engine continuously adjusts the angle of the trolley so that it moves toward the target; the motor drives the steering engine to accelerate, data from the server are continuously received during acceleration, and the Pcduino adjusts the motor speed and the trolley angle in time according to the received coordinate vectors of the target and the trolley and the feedback from the gyroscope and magnetometer on the trolley, until the trolley arrives in front of the target and navigation is complete.
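As a minimal sketch of the server-side step just described (detecting the target and the trolley with OpenCV and pushing their coordinate vectors to the Pcduino), consider the following Python fragment. The colour-based detection, the Pcduino address, the port number, and the newline-delimited JSON message format are assumptions made for illustration; the patent does not specify them.

```python
import json
import socket

import cv2
import numpy as np


def find_centroid(frame_hsv, lower, upper):
    """Return the (x, y) centroid of the largest blob within the HSV range, or None."""
    mask = cv2.inRange(frame_hsv, np.array(lower), np.array(upper))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])


# Hypothetical HSV colour ranges used to mark the trolley and the target.
TROLLEY_HSV = ((100, 120, 70), (130, 255, 255))
TARGET_HSV = ((0, 120, 70), (10, 255, 255))

cap = cv2.VideoCapture(0)  # wireless camera stream (device index/URL is an assumption)
link = socket.create_connection(("192.168.1.50", 9000))  # Pcduino address/port (assumed)

while True:
    ok, frame = cap.read()
    if not ok:
        continue
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    trolley = find_centroid(hsv, *TROLLEY_HSV)
    target = find_centroid(hsv, *TARGET_HSV)
    if trolley is None or target is None:
        continue  # keep capturing until both the trolley and the target are seen
    msg = {"trolley": trolley, "target": target,
           "vector": (target[0] - trolley[0], target[1] - trolley[1])}
    link.sendall((json.dumps(msg) + "\n").encode())
```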
Three wireless cameras are provided, forming a trinocular wireless camera.
Further, the wireless camera is communicatively connected to the server through a socket.
The zero-offset compensation method of the gyroscope is as follows: the gyroscope starts operating; after waiting for time T, i.e. once the sampling period T of the gyroscope has elapsed, the sensor acquires a raw gyroscope datum Raw, and Sum = Sum + Raw and N = N + 1 are performed; when N = 1000, Gyr_offset = Sum/N is computed and the procedure ends. Here N is the number of samples, T is the sampling period of the gyroscope, Raw is a raw gyroscope datum, Sum is the accumulated sum of the Raw values, and Gyr_offset is the zero-offset compensation output value of the gyroscope.
The output correction method of the magnetometer is as follows. First, the output data of the magnetometer are recorded while the trolley rotates through one full circle; the maximum of the acquired X-axis data is denoted Xmax and the minimum Xmin, and correspondingly the Y-axis maximum is Ymax and the Y-axis minimum is Ymin. The X-axis input is Xin, the Y-axis input is Yin, the X-axis output is Xout, and the Y-axis output is Yout. The X-axis output is corrected according to formula 1-1:
Xout = Xin·Xs + Xb    (1-1)
where Xs is the proportionality coefficient of Xin, and Xb is the offset compensation of Xin, given by formula 1-1-1:
(Formula 1-1-1 is published as an image and is not reproduced here.)
The Y-axis output is corrected according to formula 1-2:
Yout = Yin·Ys + Yb    (1-2)
where Ys is the proportionality coefficient of Yin, given by formula 1-2-2:
(Formula 1-2-2 is published as an image and is not reproduced here.)
and Yb is the offset compensation of Yin, given by formula 1-2-3:
(Formula 1-2-3 is published as an image and is not reproduced here.)
The output correction value of the magnetometer is thereby obtained.
The method by which the Pcduino receives the coordinate vectors of the target and the trolley is as follows: the server judges whether the wireless camera has captured the trolley and the target; if not, it returns to continue capturing; if so, it transmits the trolley and target vectors to the Pcduino by wireless communication, and the Pcduino receives the coordinate vectors of the target and the trolley.
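A corresponding Pcduino-side receive loop might look like the sketch below, which accepts the newline-delimited JSON messages assumed in the server sketch above and extracts the coordinate vectors of the target and the trolley; the listening port is likewise an assumption.

```python
import json
import socket

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("0.0.0.0", 9000))   # port chosen to match the server sketch (assumed)
listener.listen(1)
conn, _ = listener.accept()

buffer = b""
while True:
    chunk = conn.recv(4096)
    if not chunk:
        break                      # connection closed by the server
    buffer += chunk
    while b"\n" in buffer:
        line, buffer = buffer.split(b"\n", 1)
        msg = json.loads(line)
        trolley_xy = msg["trolley"]    # coordinate vector of the trolley
        target_xy = msg["target"]      # coordinate vector of the target
        # ...these coordinates feed the navigation calculation on the Pcduino.
```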
The visual-information-based trolley navigation system of the invention navigates more accurately and with a lower error rate.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings required in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without inventive work; technical solutions directly obtained from the drawings also fall within the protection scope of the present invention.
FIG. 1 is a block diagram of the visual information-based cart navigation system of the present invention.
FIG. 2 is a flowchart of the gyroscope zero offset compensation in the present invention.
FIG. 3 is a flow chart of Pcduino receiving coordinate vectors.
Fig. 4 is a flow chart of a positioning algorithm of the trinocular camera.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, specific embodiments thereof are described in detail below. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. The invention can, however, be embodied in many forms other than those described herein, and those skilled in the art can make many modifications without departing from its spirit and scope.
Embodiment 1. As shown in Fig. 1, the invention provides a trolley navigation system based on visual information, comprising a wireless camera 1, a server 2, a wifi wireless transmission module 3 and a trolley; the trolley carries a Pcduino 4 and an Arduino 6; the Arduino 6 is connected with a motor 5, a steering engine 7 and a first crystal oscillator 8; the Pcduino 4 is connected with a gyroscope 11, a magnetometer 10 and a second crystal oscillator 9. The wireless camera 1 is in wireless communication connection with the server 2, acquires image information in real time, and transmits the acquired information to the server 2 over wifi. The server 2 is electrically connected with the wifi wireless transmission module 3, detects the target and the trolley in the image through OpenCV, and sends the coordinate vectors of the target and the trolley to the Pcduino 4 through the wifi wireless transmission module 3. The Pcduino 4 is connected with the Arduino 6 through a serial port; after receiving the data, the Pcduino 4 performs the program calculation and then sends instructions to the Arduino 6 over the serial connection. The rotating steering engine 7 continuously adjusts the angle of the trolley so that it moves toward the target; the motor 5 drives the steering engine 7 to accelerate, data from the server 2 are continuously received during acceleration, and the Pcduino 4 adjusts the speed of the motor 5 and the angle of the trolley in time according to the received coordinate vectors of the target and the trolley and the feedback from the gyroscope 11 and magnetometer 10 on the trolley, until the trolley arrives in front of the target. The navigation system is simple in structure, data communication is faster, and the navigation information is more accurate.
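The Pcduino-to-Arduino serial link of Embodiment 1 could be exercised with a small sketch such as the following, which sends a speed and steering-angle command over the serial port with pyserial; the device path, baud rate, and the "speed,angle" line format are assumptions, since the patent only states that instructions are sent by serial communication.

```python
import math

import serial  # pyserial


link = serial.Serial("/dev/ttyS1", 115200, timeout=0.1)  # device path and baud rate assumed


def send_command(speed_pwm, steering_deg):
    """Send one 'speed,angle' command line to the Arduino motor/steering controller."""
    link.write(f"{int(speed_pwm)},{int(steering_deg)}\n".encode())


def heading_to_target(trolley_xy, target_xy):
    """Bearing from the trolley to the target, in degrees, from image-plane coordinates."""
    dx = target_xy[0] - trolley_xy[0]
    dy = target_xy[1] - trolley_xy[1]
    return math.degrees(math.atan2(dy, dx))


# Example: steer toward the target reported by the server, at a moderate speed.
send_command(150, heading_to_target((320, 400), (500, 120)))
```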
Embodiment 2. The invention provides a trolley navigation system based on visual information, comprising a wireless camera 1, a server 2, a 4G wireless transmission module 3 and a trolley; the trolley carries a Pcduino 4 and an Arduino 6; the Arduino 6 is connected with a motor 5, a steering engine 7 and a first crystal oscillator 8; the Pcduino 4 is connected with a gyroscope 11, a magnetometer 10 and a second crystal oscillator 9. The wireless camera 1 consists of three wireless cameras forming a trinocular wireless camera, which is communicatively connected with the server 2 through a socket, acquires image information in real time, and transmits the acquired information to the server over a 4G network. The server is electrically connected with the 4G wireless transmission module 3, detects the target and the trolley in the image through OpenCV, and sends the coordinate vectors of the target and the trolley to the Pcduino 4 through the 4G wireless transmission module 3. The Pcduino 4 is connected with the Arduino 6 through a serial port; after receiving the data, the Pcduino 4 performs the program calculation and then sends instructions to the Arduino 6 over the serial connection. The rotating steering engine 7 continuously adjusts the angle of the trolley so that it moves toward the target; the motor 5 drives the steering engine 7 to accelerate, data from the server are continuously received during acceleration, and the Pcduino 4 adjusts the speed of the motor 5 and the angle of the trolley in time according to the received coordinate vectors of the target and the trolley and the feedback from the gyroscope 11 and magnetometer 10 on the trolley, until the trolley arrives in front of the target. The navigation system is simple in structure, data communication is faster, and the navigation information is more accurate. As shown in Fig. 4, experiments show that the error of the trinocular camera positioning method is smaller than that of the monocular camera positioning method: the monocular positioning error lies in the range [0, 7.8], while the trinocular error lies in the range [0, 4.6]. Comparing the total error point by point, the value for the trinocular positioning algorithm is always smaller than that of the monocular algorithm, the total error of the trinocular algorithm fluctuates less, and it remains within the range [5, 8.2]. The trinocular positioning algorithm therefore improves the accuracy of positioning.
Embodiment 3. As shown in Fig. 2, the invention further provides a zero-offset compensation method for the gyroscope 11 in the visual-information-based trolley navigation system, comprising the following steps: first, the gyroscope 11 starts operating; after waiting for time T, i.e. once the sampling period T of the gyroscope has elapsed, the sensor acquires a raw gyroscope datum Raw, and Sum = Sum + Raw and N = N + 1 are performed; when N = 1000, Gyr_offset = Sum/N is computed and the procedure ends. Here N is the number of samples, T is the sampling period of the gyroscope, Raw is a raw gyroscope datum, Sum is the accumulated sum of the Raw values, and Gyr_offset is the zero-offset compensation output value of the gyroscope 11.
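A direct transcription of this averaging procedure into Python is sketched below; read_gyro_raw is a hypothetical driver call, and the sampling-period value is an assumption.

```python
import time

SAMPLE_PERIOD_S = 0.01   # sampling period T (value assumed)
NUM_SAMPLES = 1000       # number of samples N at which the average is taken


def read_gyro_raw():
    """Hypothetical driver call returning one raw gyroscope reading."""
    raise NotImplementedError


def estimate_zero_offset():
    total = 0.0                          # Sum
    for _ in range(NUM_SAMPLES):
        time.sleep(SAMPLE_PERIOD_S)      # wait for sampling time T
        total += read_gyro_raw()         # Sum = Sum + Raw; N = N + 1
    return total / NUM_SAMPLES           # Gyr_offset = Sum / N


# Gyr_offset is later subtracted from raw readings to compensate the zero bias.
```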
Embodiment 4. The invention provides an output correction method for the magnetometer 10 in the visual-information-based trolley navigation system, comprising the following steps. First, the output data of the magnetometer 10 are recorded while the trolley rotates through one full circle; the maximum of the acquired X-axis data is denoted Xmax and the minimum Xmin, and correspondingly the Y-axis maximum is Ymax and the Y-axis minimum is Ymin. The X-axis input is Xin, the Y-axis input is Yin, the X-axis output is Xout, and the Y-axis output is Yout. The X-axis output is corrected according to formula 1-1:
Xout = Xin·Xs + Xb    (1-1)
where Xs is the proportionality coefficient of Xin, and Xb is the offset compensation of Xin, given by formula 1-1-1:
(Formula 1-1-1 is published as an image and is not reproduced here.)
The Y-axis output is corrected according to formula 1-2:
Yout = Yin·Ys + Yb    (1-2)
where Ys is the proportionality coefficient of Yin, given by formula 1-2-2:
(Formula 1-2-2 is published as an image and is not reproduced here.)
and Yb is the offset compensation of Yin, given by formula 1-2-3:
(Formula 1-2-3 is published as an image and is not reproduced here.)
The output correction value of the magnetometer is thereby obtained, and the error of the result is smaller.
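Because formulas 1-1-1, 1-2-2 and 1-2-3 appear only as images in the published text, the sketch below uses the common min/max ("hard-iron") calibration that fits the stated form Xout = Xin·Xs + Xb: the X axis is kept at unit scale, the Y axis is rescaled to the X-axis span, and each axis is re-centred using the recorded extrema. The exact coefficients are therefore an assumption, not the patent's own formulas.

```python
def magnetometer_correction(xmax, xmin, ymax, ymin):
    """Return (xs, xb, ys, yb) so that Xout = Xin*xs + xb and Yout = Yin*ys + yb.

    Assumed standard min/max calibration: unit scale on X, Y rescaled to the
    X-axis span, and both axes re-centred on zero (hard-iron compensation).
    """
    xs = 1.0
    ys = (xmax - xmin) / (ymax - ymin)   # scale the Y span to match the X span
    xb = -xs * (xmax + xmin) / 2.0       # offset compensation for the X axis
    yb = -ys * (ymax + ymin) / 2.0       # offset compensation for the Y axis
    return xs, xb, ys, yb


# Example with extrema recorded while the trolley rotates through one full circle.
xs, xb, ys, yb = magnetometer_correction(xmax=412.0, xmin=-388.0, ymax=505.0, ymin=-295.0)
x_out = 120.0 * xs + xb
y_out = 80.0 * ys + yb
```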
Embodiment 5. As shown in Fig. 3, the method by which the Pcduino 4 receives the coordinate vectors of the target and the trolley in the visual-information-based trolley navigation system is as follows: the server judges whether the trinocular wireless camera 1 has captured the trolley and the target; if not, it returns to continue capturing; if so, it transmits the trolley and target vectors to the Pcduino 4 by wireless communication, and the Pcduino 4 receives the coordinate vectors of the target and the trolley.
The positioning algorithm of the trinocular camera is shown in Fig. 4, and the positioning test of the trinocular camera is as follows. First, 10 uniformly distributed points are selected in the image, one vertex of the positioning area is chosen as the origin, a plane coordinate system is established, the coordinates of the 10 points in the positioning area are measured, and the data are recorded. The trolley is then placed on each of the 10 test points in turn, its position is tracked, and its coordinates under monocular camera positioning are recorded. The trolley is again placed on each of the 10 test points, its position is tracked, and its coordinates under trinocular camera positioning are recorded. Finally, the x-axis error (formula 2-1), the y-axis error (formula 2-2) and the total error (formula 2-3) are calculated for both the monocular and the trinocular camera.
The error formulas are as follows:
Δx = |x - x0|    (2-1)
Δy = |y - y0|    (2-2)
(Formula 2-3, the total error, is published as an image and is not reproduced here.)
From the test results, the xy-axis error of the monocular positioning algorithm is large, remaining in the range [0, 7.8], while the xy-axis error of the trinocular positioning algorithm fluctuates within the range [0, 4.6]. Comparing the total error point by point, the value for the trinocular positioning algorithm is always smaller than that of the monocular positioning algorithm, and the total error of the trinocular algorithm fluctuates less, remaining within the range [5, 8.2]. The trinocular positioning algorithm therefore improves the accuracy of positioning.
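For reference, the per-point errors of formulas 2-1 and 2-2 can be computed as below; formula 2-3 is published only as an image, so the Euclidean combination of the axis errors is assumed for the total error, and the coordinates shown are placeholders rather than the measured test data.

```python
import math


def positioning_errors(measured, reference):
    """Per-point x error (2-1), y error (2-2), and assumed Euclidean total error."""
    errors = []
    for (x, y), (x0, y0) in zip(measured, reference):
        dx = abs(x - x0)                 # Δx = |x - x0|
        dy = abs(y - y0)                 # Δy = |y - y0|
        errors.append((dx, dy, math.hypot(dx, dy)))
    return errors


# Placeholder coordinates for two of the ten test points.
reference = [(10.0, 20.0), (30.0, 40.0)]
monocular = [(12.5, 24.0), (36.1, 43.2)]
trinocular = [(10.8, 21.1), (31.2, 41.0)]

for name, measured in (("monocular", monocular), ("trinocular", trinocular)):
    print(name, positioning_errors(measured, reference))
```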

Claims (6)

1. A trolley navigation system based on visual information, characterized in that: the system comprises a wireless camera (1), a server (2), a wireless transmission module (3) and a trolley; the trolley carries a Pcduino (4) and an Arduino (6); the Arduino is connected with a motor (5), a steering engine (7) and a first crystal oscillator (8); the Pcduino is connected with a gyroscope (11), a magnetometer (10) and a second crystal oscillator (9); the wireless camera (1) is in wireless communication connection with the server (2) and is used for acquiring image information in real time and transmitting the acquired information to the server (2); the server (2) is electrically connected with the wireless transmission module (3), detects the target and the trolley in the image through OpenCV, and sends the coordinate vectors of the target and the trolley to the Pcduino (4) through the wireless transmission module (3); the Pcduino (4) is connected with the Arduino (6) through a serial port, and after receiving the data the Pcduino (4) performs the program calculation and then sends instructions to the Arduino (6) over the serial connection; the steering engine (7) is used for continuously adjusting the angle of the trolley so that the trolley advances toward the target; the motor (5) is used for accelerating the steering engine (7), and data from the server (2) are continuously received during acceleration; and the Pcduino (4) adjusts the speed of the motor (5) and the angle of the trolley in time according to the received coordinate vectors of the target and the trolley and the feedback from the gyroscope (11) and the magnetometer (10) on the trolley, until the trolley arrives in front of the target and navigation is complete.
2. The visual-information-based trolley navigation system of claim 1, wherein three wireless cameras (1) are provided, forming a trinocular wireless camera.
3. The visual-information-based trolley navigation system of claim 2, wherein the trinocular wireless camera is communicatively connected with the server (2) through a socket.
4. The visual-information-based trolley navigation system of claim 3, wherein the zero-offset compensation method of the gyroscope (11) comprises the following steps: first, the gyroscope (11) starts operating; after waiting for time T, i.e. once the sampling period T of the gyroscope has elapsed, the sensor acquires a raw gyroscope datum Raw, and Sum = Sum + Raw and N = N + 1 are performed; then, when N = 1000, Gyr_offset = Sum/N is computed, and the procedure ends.
5. The visual-information-based trolley navigation system of claim 4, wherein the output correction method of the magnetometer (10) comprises the following steps: first, the output data of the magnetometer (10) are recorded while the trolley rotates through one full circle; the maximum of the acquired X-axis data is denoted Xmax and the minimum Xmin, and correspondingly the Y-axis maximum is Ymax and the Y-axis minimum is Ymin; the X-axis input is Xin, the Y-axis input is Yin, the X-axis output is Xout, and the Y-axis output is Yout; the X-axis output is corrected according to formula 1-1:
Xout = Xin·Xs + Xb    (1-1)
where Xs is the proportionality coefficient of Xin, and Xb is the offset compensation of Xin, given by formula 1-1-1:
(Formula 1-1-1 is published as an image and is not reproduced here.)
the Y-axis output is corrected according to formula 1-2:
Yout = Yin·Ys + Yb    (1-2)
where Ys is the proportionality coefficient of Yin, given by formula 1-2-2:
(Formula 1-2-2 is published as an image and is not reproduced here.)
and Yb is the offset compensation of Yin, given by formula 1-2-3:
(Formula 1-2-3 is published as an image and is not reproduced here.)
whereby the output correction value of the magnetometer (10) is obtained.
6. The visual-information-based trolley navigation system of claim 2, wherein the method by which the Pcduino (4) receives the coordinate vectors of the target and the trolley is as follows: the server (2) judges whether the wireless camera (1) has captured the trolley and the target; if not, it returns to continue capturing; if so, it transmits the trolley and target vectors to the Pcduino (4) by wireless communication, and the Pcduino (4) receives the coordinate vectors of the target and the trolley.
CN201710670538.7A 2017-08-08 2017-08-08 Trolley navigation system based on visual information Active CN107478230B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710670538.7A CN107478230B (en) 2017-08-08 2017-08-08 Trolley navigation system based on visual information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710670538.7A CN107478230B (en) 2017-08-08 2017-08-08 Trolley navigation system based on visual information

Publications (2)

Publication Number Publication Date
CN107478230A CN107478230A (en) 2017-12-15
CN107478230B true CN107478230B (en) 2020-12-22

Family

ID=60598990

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710670538.7A Active CN107478230B (en) 2017-08-08 2017-08-08 Trolley navigation system based on visual information

Country Status (1)

Country Link
CN (1) CN107478230B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007219758A (en) * 2006-02-15 2007-08-30 Fujitsu Ten Ltd On-vehicle information unit, and information processing method for on-vehicle unit
CN104257048A (en) * 2014-09-11 2015-01-07 浙江大学 Old people assisting system based on intelligent walking stick
CN105227911A (en) * 2015-09-22 2016-01-06 深圳先进技术研究院 Based on the large data monitoring of sweeping robot and the system and method for drawing
CN106444750A (en) * 2016-09-13 2017-02-22 哈尔滨工业大学深圳研究生院 Two-dimensional code positioning-based intelligent warehousing mobile robot system
CN106708053A (en) * 2017-01-26 2017-05-24 湖南人工智能科技有限公司 Autonomous navigation robot and autonomous navigation method thereof
CN106931945A (en) * 2017-03-10 2017-07-07 上海木爷机器人技术有限公司 Robot navigation method and system


Also Published As

Publication number Publication date
CN107478230A (en) 2017-12-15

Similar Documents

Publication Publication Date Title
US10909721B2 (en) Systems and methods for identifying pose of cameras in a scene
CN112738487B (en) Image projection method, device, equipment and storage medium
WO2017004799A1 (en) Camera configuration on movable objects
JP2019528501A (en) Camera alignment in a multi-camera system
US20080050042A1 (en) Hardware-in-the-loop simulation system and method for computer vision
WO2019047641A1 (en) Method and device for estimating orientation error of onboard camera
US10013761B2 (en) Automatic orientation estimation of camera system relative to vehicle
CN110986930B (en) Equipment positioning method and device, electronic equipment and storage medium
US20180299271A1 (en) Robust vision-inertial pedestrian tracking with heading auto-aligment
CN110796738B (en) Three-dimensional visualization method and device for state tracking of inspection equipment
CN111079079B (en) Data correction method, device, electronic equipment and computer readable storage medium
WO2021088498A1 (en) Virtual object display method and electronic device
CN111612852A (en) Method and apparatus for verifying camera parameters
CN112288825A (en) Camera calibration method and device, electronic equipment, storage medium and road side equipment
CN116079697B (en) Monocular vision servo method, device, equipment and medium based on image
CN110702139A (en) Time delay calibration method and device, electronic equipment and medium
CN107478230B (en) Trolley navigation system based on visual information
CN109500817B (en) 360-Degree vision tracking control system and control method for multi-legged robot
Miksch et al. Homography-based extrinsic self-calibration for cameras in automotive applications
CN111581322B (en) Method, device and equipment for displaying region of interest in video in map window
CN112738404B (en) Electronic equipment control method and electronic equipment
CN107636592B (en) Channel planning method, control end, aircraft and channel planning system
CN110689575B (en) Image collector calibration method, device, equipment and medium
CN102316272B (en) Remote controller control method, apparatus thereof and remote controller
CN101598846B (en) Computer terminal display system and display method for zoom lens target ranging

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant