CN211552867U - Visual navigation system for assisting unmanned trolley - Google Patents

Visual navigation system for assisting unmanned trolley Download PDF

Info

Publication number
CN211552867U
Authority
CN
China
Prior art keywords
module
video
navigation system
image processing
unmanned
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN202020235977.2U
Other languages
Chinese (zh)
Inventor
林永杰
黄紫林
许伦辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202020235977.2U priority Critical patent/CN211552867U/en
Application granted granted Critical
Publication of CN211552867U publication Critical patent/CN211552867U/en

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The utility model discloses a visual navigation system for assisting an unmanned trolley, which comprises a video acquisition module, an image processing module, a storage module, a power supply module, a video output module, a communication module, a direct-current motor driving module, a stepping motor driving module, a speed detection module, a WiFi positioning module, a display module and an indicator light. The video acquisition module converts optical image information into an analog video signal, which is decoded by a video decoder and transmitted to the image processing module. The image processing module processes the images and outputs the processed digital signal to the display through the video output module. The storage module is connected with the speed detection module, and the speed detection module dynamically controls the number of image frames stored according to the running speed of the unmanned trolley. Built with integrated-circuit technology, the utility model offers the advantages of low cost, high integration, accurate detection and a short development cycle, and meets real-time requirements.

Description

Visual navigation system for assisting unmanned trolley
Technical Field
The utility model relates to the field of visual navigation, and in particular to a visual navigation system for assisting an unmanned trolley.
Background
At present, the common navigation modes for unmanned trolleys include laser navigation, visual navigation, inertial navigation and the like. The typical advantage of visual navigation is the large amount of information it obtains, from which a panoramic three-dimensional scene can be constructed to realize automatic navigation. This technique is therefore widely used in the field of unmanned driving.
However, during automatic navigation the system runs autonomously, so it is inconvenient to observe or monitor the unmanned trolley on site, and when automatic navigation cannot continue because the target is lost or for other special reasons, the trolley must also be stopped or reset remotely in time. When tracking requirements change, the target detection and tracking program, the control parameters of the actuators and so on must be modified accordingly. Traditional research experiments on vision-based unmanned trolleys, for example the study on key technologies of autonomous navigation for unmanned vehicles in the document "jiahui group, University of Chinese Academy of Sciences (Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences), 2019", mainly involve algorithm verification and adopt a PC host, a notebook computer, an industrial personal computer and a camera as the vision processing platform, which makes them difficult to use in complex industrial environments. It is therefore necessary to design a small, low-cost visual navigation system for assisting unmanned trolleys based on an embedded system.
SUMMARY OF THE UTILITY MODEL
An object of the utility model is to overcome the deficiencies of the prior art and provide a visual navigation system for assisting an unmanned trolley.
The utility model is realized by at least one of the following technical solutions.
A visual navigation system for assisting an unmanned trolley comprises a video acquisition module, an image processing module, a storage module, a power supply module, a video output module, a communication module, a driving module, a WiFi positioning module, a speed detection module, a display module and an indicator light;
the video acquisition module is used for converting optical image information into an analog video signal, decoding the analog video signal by a video decoder and transmitting the analog video signal to the image processing module;
the image processing module is respectively connected with the video acquisition module, the storage module, the power supply module, the video output module, the communication module, the driving module, the WiFi positioning module, the speed detection module and the indicator light; the image processing module processes the image and outputs the processed digital signal to a display of the display module through the video output module;
the speed detection module preliminarily estimates the travelling speed of the unmanned trolley and transmits the data back to the image processing module, so that the unmanned trolley can be controlled remotely;
the storage module is used for storing data information generated in the process of processing images or videos;
the communication module communicates with nearby unmanned trolleys at close range;
the driving module is mainly used for providing power for the unmanned trolley to advance and controlling the advancing speed of the unmanned trolley;
the WiFi positioning module is used for communicating with surrounding fixed WiFi signal detection nodes;
the display module comprises a display for displaying the image processing effect in real time;
the power supply module comprises a lithium battery and is used for supplying working power to the visual navigation system;
the indicator light is connected with the image processing module and the power supply module and is used for indicating the working state of the visual navigation system of the unmanned trolley.
Furthermore, the video acquisition module comprises two OpenMV visual modules, a first video decoder and a data line, wherein the two OpenMV visual modules are connected with the first video decoder, and the first video decoder is connected with the image processing module through the data line.
Further, the image processing module comprises a core processor and a data line, and the core processor is connected with the display through the video output module.
Further, the core processor is a raspberry pi with a TF card as a storage hard disk.
Furthermore, the video output module comprises a video encoder, a second video decoder and a data line, wherein the video encoder is connected with the second video decoder through the data line, and the second video decoder is connected with the display.
Furthermore, the communication module adopts a USB wireless network card to complete data communication: it communicates with nearby unmanned trolleys at close range, sends its own position to the central server for active avoidance, transmits the real-time state of the unmanned trolley, and broadcasts this information externally when the vehicle is out of control.
Further, the driving module comprises a direct current motor driving module and a stepping motor driving module;
the direct-current motor driving module provides power for the unmanned trolley to move forward and controls its travelling speed; it uses a 25GA-370 high-torque direct-current gear motor;
the stepping motor driving module accurately raises and lowers the video acquisition module so that the unmanned trolley can acquire image information over a wider range; it uses a 42BYGH40-1704A stepping motor.
Furthermore, the stepping motor driving module realizes accurate lifting of the video acquisition module through the stepping motor, so that the unmanned trolley can acquire image information in a wider range.
Furthermore, the WiFi positioning module adopts an ESP8266 serial-port wireless WiFi module; the WiFi positioning module actively communicates with the detection nodes, and the coordinate position of the trolley is obtained by combining measurements from three or more detection nodes.
Furthermore, the speed detection module adopts an A3144E Hall sensor to complete speed detection: the Hall sensor generates a rotation-speed pulse signal, which is photoelectrically conditioned and then transmitted to the core processor to complete the speed detection.
The utility model uses binocular vision to measure the distance of obstacles in simple scenes and to identify lane lines, and controls the motion of the unmanned trolley through the direct-current motor driving module and the stepping motor driving module, thereby realizing assisted navigation. First, the front-scene images obtained by the video acquisition module are processed and feature information is extracted from them; image matching is then performed on this feature information, and the parallax formed by corresponding image points in the two camera modules is calculated. The measured data are transmitted to the core processor which, after receiving the image data in RGB format, executes a binocular vision algorithm and a lane line recognition algorithm so as to identify lane lines and determine the distance between the unmanned trolley and the obstacle.
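By way of illustration only (this is not the concrete implementation disclosed by the utility model), a binocular ranging pipeline of this kind could be sketched on the core processor in Python with OpenCV as follows; the focal length, baseline and image file names are assumed placeholder values, and ORB feature matching stands in for whichever matching scheme the designer chooses.

```python
# Illustrative sketch only: estimate obstacle distance from a rectified, parallel
# stereo pair. Focal length (pixels), baseline (metres) and file names are assumed.
import cv2
import numpy as np

FOCAL_PX = 700.0      # assumed focal length in pixels, from calibration
BASELINE_M = 0.12     # assumed spacing between the two camera modules, metres

def estimate_obstacle_distance(img_left, img_right):
    orb = cv2.ORB_create(500)                             # extract feature information
    kp1, des1 = orb.detectAndCompute(img_left, None)
    kp2, des2 = orb.detectAndCompute(img_right, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    depths = []
    for m in matches[:50]:                                # keep the best matches
        x_left = kp1[m.queryIdx].pt[0]
        x_right = kp2[m.trainIdx].pt[0]
        disparity = x_left - x_right                      # parallax of a matched point
        if disparity > 1.0:
            depths.append(FOCAL_PX * BASELINE_M / disparity)
    return float(np.median(depths)) if depths else None

left = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)       # placeholder image files
right = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)
print("estimated obstacle distance (m):", estimate_obstacle_distance(left, right))
```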
Furthermore, the core processor receives the data returned by the speed detection module and controls the travelling speed of the trolley through the direct-current motor driving module, so that the trolley can run autonomously and avoid obstacles. The running state of the trolley is displayed on the display in real time via the video output module.
Compared with the prior art, the utility model has the following beneficial effects:
1. The utility model adopts an embedded machine vision system that uses camera modules as the image acquisition devices and a Raspberry Pi as the core processor; the distance of obstacles is measured by binocular vision and lane line recognition is further performed, so that the trolley can run autonomously and avoid obstacles. This design greatly reduces the complexity of the system, which is highly integrated, detects accurately, has good real-time performance and offers rich peripheral interfaces, and it can be widely applied in fields such as unmanned trolley navigation and target positioning.
2. The camera modules in the video acquisition module are OpenMV vision modules. Each module carries an OV7725 CMOS image sensor, has rich hardware resources, and provides UART, I2C, SPI, PWM, ADC, DAC and GPIO interfaces for conveniently extending peripheral functions. Its onboard USB interface connects to the OpenMV-IDE integrated development environment to assist with programming, debugging, firmware updates and the like. In rolling-shutter exposure mode it exhibits low noise, a wide spectral response range and fast output.
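As a minimal sketch only (the UART port, baud rate and JPEG quality are assumptions, not values specified by the utility model), an OpenMV module could capture frames from the OV7725 sensor and stream them toward the core processor over UART like this, in MicroPython:

```python
# Minimal OpenMV (MicroPython) sketch: grab frames from the OV7725 sensor and
# stream them over UART. Port, baud rate and quality are assumed, not specified.
import sensor
import time
from pyb import UART

sensor.reset()
sensor.set_pixformat(sensor.RGB565)   # RGB output, matching the RGB data sent onward
sensor.set_framesize(sensor.QVGA)     # 320x240
sensor.skip_frames(time=2000)         # let the sensor settle

uart = UART(3, 115200)                # assumed UART port and baud rate

while True:
    img = sensor.snapshot()
    jpg = img.compress(quality=50)    # JPEG-compress before sending
    uart.write(jpg)
    time.sleep_ms(100)
```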
Drawings
FIG. 1 is a schematic structural diagram of a visual navigation system for assisting an unmanned vehicle according to an embodiment;
FIG. 2 is a schematic structural diagram of a video capture module in this embodiment;
fig. 3 is a schematic flowchart of the navigation operation performed in the present embodiment.
Detailed Description
The present invention will be described in further detail with reference to the following examples and drawings, but the embodiments of the present invention are not limited to these examples.
As shown in fig. 1 and 2, the visual navigation system for assisting the unmanned vehicle comprises a video acquisition module, an image processing module, a storage module, a power supply module, a video output module, a communication module, a direct current motor driving module, a stepping motor driving module, a speed detection module, a WiFi positioning module, a display module and an indicator light; the display module includes a display.
The video acquisition module comprises a first camera module, a second camera module, a first video decoder, data lines and so on. The first and second camera modules acquire optical image information around the unmanned trolley, convert it into an analog video signal, and transmit the signal to the image processing module after decoding by the first video decoder. The first and second camera modules are mounted on either side of the vehicle, and the pair is used to measure the distance between the unmanned trolley and surrounding obstacles through a binocular ranging algorithm.
The camera modules in the video acquisition module are OpenMV machine vision modules. The OpenMV machine vision module is an integrated single-chip vision acquisition and processing module based on an STM32F765 ARM Cortex-M7 core; it can be programmed independently and undertakes all image processing tasks without additional image processing equipment, and its low cost suits the utility model's low-cost requirement.
The image processing module comprises a core processor, data lines and so on; its function is to process the video information obtained after decoding by the video acquisition module and to quickly identify obstacles and lane lines, preventing frontal collision and lane departure.
The core processor in the image processing module is a Raspberry Pi. It uses a TF card as the system's storage hard disk and can carry and run the Debian-based Linux system (Raspbian) officially released by the Raspberry Pi Foundation. The core processor is connected to the video acquisition module, the storage module, the power supply module, the video output module, the communication module, the driving module, the WiFi positioning module, the speed detection module and the indicator light, so as to transfer data and output the processed digital signal to the display through the video output module.
The storage module is used for storing the large amount of data generated while processing images or video. The storage module is also connected to the speed detection module, and the number of image frames stored is dynamically controlled according to the running speed of the unmanned trolley, improving system integration and real-time processing performance.
The storage module uses flash memory.
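A minimal sketch of how the stored frame count could be tied to the measured speed; the speed thresholds and frame rates below are illustrative assumptions rather than values taken from the utility model.

```python
# Illustrative only: choose how many frames per second to store as a function of
# the trolley's measured speed; thresholds and rates are assumed values.
def frames_to_store(speed_m_s: float) -> int:
    if speed_m_s < 0.2:      # nearly stationary: little changes between frames
        return 2
    if speed_m_s < 1.0:      # slow travel
        return 5
    return 10                # faster travel: keep more frames for later review

for v in (0.1, 0.5, 1.5):
    print(v, "m/s ->", frames_to_store(v), "frames/s stored")
```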
The power supply module provides working power for the visual navigation system. It comprises a lithium battery connected to the core processor, which supplies a standby working voltage for the processor when the power module fails or loses power.
The video output module comprises a video encoder, a second video decoder and a data line; the video encoder is connected to the second video decoder through the data line, and the second video decoder is connected to the display, transmitting the image processing result to the display so that the operational reliability of the hardware system can be verified.
The communication module uses a USB wireless network card for data communication; it communicates with nearby unmanned trolleys at close range and sends its own position to the central server for active avoidance. At the same time, it broadcasts the real-time status of the unmanned trolley, for example when the vehicle is out of control.
The communication module is a USB wireless network card. The Raspberry Pi board already provides a driver for the USB port, so the expanded interface can be used directly without installing a USB driver; however, a driver for the wireless network card must be installed, under the USB device directory.
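One hypothetical way to realize the described position and status broadcast over the wireless network card is a simple UDP broadcast; the port number and message layout below are assumptions for illustration only.

```python
# Hypothetical sketch: broadcast the trolley's position and status over the WLAN.
# The port number and JSON message layout are assumptions for illustration.
import json
import socket

def broadcast_state(x: float, y: float, status: str, port: int = 9000) -> None:
    msg = json.dumps({"x": x, "y": y, "status": status}).encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(msg, ("255.255.255.255", port))
    sock.close()

broadcast_state(3.2, 7.8, "out_of_control")   # e.g. announce a fault to nearby trolleys
```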
The direct-current motor driving module provides power for the unmanned trolley to move forward and controls its travelling speed; it uses a 25GA-370 high-torque direct-current gear motor.
The stepping motor driving module uses a stepping motor to accurately raise and lower the video acquisition module so that the unmanned trolley can acquire image information over a wider range; it uses a 42BYGH40-1704A stepping motor. The direct-current motor driving module is connected with the stepping motor driving module.
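As a sketch only, speed control of the DC gear motor and step pulsing of the lift motor could be driven from the Raspberry Pi GPIO as below; the pin numbers, PWM frequency and step timing are assumptions, not values given by the utility model.

```python
# Illustrative sketch: drive the DC gear motor with PWM and pulse the stepper
# driver for camera lifting. Pin numbers and timings are assumed, not specified.
import time
import RPi.GPIO as GPIO

DC_PWM_PIN = 18       # assumed pin wired to the DC motor driver's enable input
STEP_PIN = 23         # assumed step pulse pin of the stepper driver
DIR_PIN = 24          # assumed direction pin of the stepper driver

GPIO.setmode(GPIO.BCM)
GPIO.setup([DC_PWM_PIN, STEP_PIN, DIR_PIN], GPIO.OUT)

pwm = GPIO.PWM(DC_PWM_PIN, 1000)   # 1 kHz PWM on the DC motor driver
pwm.start(0)

def set_speed(duty_percent: float) -> None:
    pwm.ChangeDutyCycle(max(0.0, min(100.0, duty_percent)))

def lift_camera(steps: int, up: bool = True) -> None:
    GPIO.output(DIR_PIN, GPIO.HIGH if up else GPIO.LOW)
    for _ in range(steps):
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(0.001)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(0.001)

set_speed(40)          # cruise at an assumed 40% duty cycle
lift_camera(200)       # raise the camera mast by an assumed 200 steps
```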
The speed detection module uses an A3144E Hall sensor to perform speed detection: the Hall sensor generates a rotation-speed pulse signal, which is photoelectrically conditioned and then transmitted to the core processor to complete the speed detection.
The speed detection module preliminarily estimates the travelling speed of the unmanned trolley and transmits the data back to the image processing module, so that the unmanned trolley can be controlled remotely.
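A minimal sketch of pulse-based speed estimation from the Hall sensor output on a Raspberry Pi GPIO pin; the pin number, pulses per revolution and wheel diameter are assumed values, not figures from the utility model.

```python
# Illustrative sketch: estimate wheel speed from Hall-sensor pulses counted on a
# GPIO pin. Pin number, pulses per revolution and wheel diameter are assumptions.
import math
import time
import RPi.GPIO as GPIO

HALL_PIN = 17                 # assumed GPIO pin receiving the conditioned pulse signal
PULSES_PER_REV = 4            # assumed pulses per wheel revolution
WHEEL_DIAMETER_M = 0.065      # assumed wheel diameter in metres

pulse_count = 0

def _on_pulse(channel):
    global pulse_count
    pulse_count += 1

GPIO.setmode(GPIO.BCM)
GPIO.setup(HALL_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.add_event_detect(HALL_PIN, GPIO.FALLING, callback=_on_pulse)

def measure_speed(window_s: float = 0.5) -> float:
    """Return the estimated ground speed in m/s over a short counting window."""
    global pulse_count
    pulse_count = 0
    time.sleep(window_s)
    revs = pulse_count / PULSES_PER_REV
    return revs * math.pi * WHEEL_DIAMETER_M / window_s

print("speed:", measure_speed(), "m/s")
```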
The WiFi positioning module adopts an ESP8266 serial-port wireless WiFi module; it actively communicates with the surrounding fixed WiFi signal detection nodes, and the coordinate position of the trolley is calculated by combining measurements from three or more detection nodes.
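For illustration, assuming each fixed node yields a distance estimate (for example from RSSI with a path-loss model), the trolley's coordinates can be recovered from three or more nodes by linearised least squares; the node coordinates and distances below are made-up values, not data from the utility model.

```python
# Illustrative sketch: least-squares trilateration from three or more fixed WiFi
# nodes with known coordinates and estimated distances. All numbers are made up.
import numpy as np

def trilaterate(nodes, distances):
    """nodes: list of (x, y); distances: ranges to those nodes in the same units."""
    (x0, y0), d0 = nodes[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(nodes[1:], distances[1:]):
        # Subtracting the first node's circle equation linearises the system.
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return float(sol[0]), float(sol[1])

nodes = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]   # assumed node positions
dists = [7.1, 7.2, 7.0]                          # assumed distance estimates
print("estimated trolley position:", trilaterate(nodes, dists))
```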
The display module is mainly used to transmit the real-time travelling data of the unmanned trolley to the display through the data line so that abnormal conditions can be found in time. The display is a 7-segment digital display, which is simple to use and inexpensive, saving cost.
The indicator light is connected with the image processing module and the power supply module and is used for indicating the working state of the visual navigation system of the unmanned trolley.
Specifically, as shown in fig. 3, the visual navigation system works as follows:
and S1, acquiring image information by using two OpenMV vision modules, and measuring the distance of the obstacle by using a binocular ranging method. The video acquisition module interface is used for connecting the OpenMV visual module and the core processor, processing the image acquired by the OpenMV visual module into an RGB format and then transmitting the RGB format to the core processor.
And S2, the core processor executes the binocular vision algorithm and the lane line recognition algorithm, calculates distances from the image data measured by the left and right OpenMV vision modules, identifies the lane lines, and determines the distance between the unmanned trolley and the obstacle.
The binocular ranging algorithm can be divided into three steps:
A. filtering the image with a Deriche edge-detection operator and extracting edge feature information;
B. obtaining matching elements to form a matching matrix, and obtaining an optimal matching point and a parallax corresponding to the matching point according to a competition rule;
C. calculating the distance of the obstacle according to the camera's perspective projection model, assuming the first and second camera modules are in a parallel, aligned pose.
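Step C amounts to the standard pinhole relation for a parallel, rectified pair: depth Z = f·B/d, where f is the focal length in pixels, B the baseline between the two camera modules and d the disparity. A one-line sketch with assumed calibration values:

```python
# Step C as a one-line computation: Z = f * B / d for a rectified, parallel pair.
# Focal length (pixels), baseline (metres) and disparity (pixels) are assumed values.
def depth_from_disparity(disparity_px: float, focal_px: float = 700.0,
                         baseline_m: float = 0.12) -> float:
    return focal_px * baseline_m / disparity_px

print(depth_from_disparity(30.0))   # ~2.8 m for a 30-pixel disparity
```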
And S3, displaying the running state of the trolley on the display in real time through the transmission of the video output module.
S4, the core processor receives the data returned by the speed detection module and controls the advancing speed of the trolley through the direct current motor driving module;
And S5, the trolley runs along the lane, travelling autonomously and avoiding obstacles, and stops moving when the obstacle distance is less than the preset safe distance.
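A compact, purely illustrative supervisory loop for steps S4 and S5, reusing the hypothetical helpers sketched earlier; the 0.5 m safe distance and duty-cycle values are assumptions rather than parameters disclosed by the utility model.

```python
# Illustrative supervisory step for S4/S5: adjust the DC motor duty cycle from the
# measured speed and stop inside the safe distance. Helper names and the 0.5 m
# threshold are assumptions carried over from the earlier sketches.
SAFE_DISTANCE_M = 0.5

def navigation_step(distance_m, speed_m_s, set_speed):
    """Return the duty cycle commanded for this control step."""
    if distance_m is not None and distance_m < SAFE_DISTANCE_M:
        set_speed(0)                      # S5: stop when inside the safe distance
        return 0
    duty = 40 if speed_m_s < 0.8 else 25  # crude slow-down at higher speeds
    set_speed(duty)                       # S4: command the DC motor driver
    return duty

print(navigation_step(2.8, 0.5, lambda d: None))   # far obstacle -> keep moving
print(navigation_step(0.3, 0.5, lambda d: None))   # close obstacle -> stop
```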
The above embodiments are preferred embodiments of the utility model, but the embodiments of the utility model are not limited thereto; any other change, modification, substitution, combination or simplification that does not depart from the spirit and principle of the utility model shall be regarded as an equivalent replacement and is included within the scope of protection of the utility model.

Claims (10)

1. A visual navigation system for assisting an unmanned trolley is characterized by comprising a video acquisition module, an image processing module, a storage module, a power supply module, a video output module, a communication module, a driving module, a WiFi positioning module, a speed detection module, a display module and an indicator light;
the video acquisition module is used for converting optical image information into an analog video signal, decoding the analog video signal by a video decoder and transmitting the analog video signal to the image processing module;
the image processing module is respectively connected with the video acquisition module, the storage module, the power supply module, the video output module, the communication module, the driving module, the WiFi positioning module, the speed detection module and the indicator light; the image processing module processes the image and outputs the processed digital signal to a display of the display module through the video output module;
the speed detection module preliminarily estimates the advancing speed of the unmanned trolley and transmits data back to the image processing module, so that the aim of remotely controlling the unmanned trolley is fulfilled;
the storage module is used for storing data information generated in the process of processing images or videos;
the communication module is communicated with the peripheral unmanned trolley at a close distance;
the driving module is mainly used for providing power for the unmanned trolley to advance and controlling the advancing speed of the unmanned trolley;
the WiFi positioning module is used for communicating with surrounding fixed WiFi signal detection nodes;
the display module comprises a display for displaying the image processing effect in real time;
the power supply module comprises a lithium battery and is used for supplying working power to the visual navigation system;
the indicating lamp is connected with the image processing module and the power supply module and used for representing the working state of the video navigation system of the unmanned trolley.
2. The visual navigation system for assisting the unmanned vehicle as claimed in claim 1, wherein the video acquisition module comprises two OpenMV visual modules, a first video decoder and a data line, the two OpenMV visual modules are both connected to the first video decoder, and the first video decoder is connected to the image processing module through the data line.
3. The visual navigation system of claim 1, wherein the image processing module comprises a core processor and a data cable, the core processor is connected with the display through the video output module.
4. The visual navigation system for assisting an unmanned trolley according to claim 3, characterised in that the core processor is a Raspberry Pi with a TF card as its storage hard disk.
5. The visual navigation system of claim 1, wherein the video output module comprises a video encoder, a second video decoder and a data line, the video encoder is connected to the second video decoder via the data line, and the second video decoder is connected to the display.
6. The visual navigation system of claim 1, wherein the communication module uses a USB wireless network card to perform data communication, communicates with neighboring unmanned vehicles in close proximity, sends the location of the communication module to the central server for active avoidance, transmits the real-time status of the unmanned vehicle to the display module, and broadcasts the information when the vehicle is out of control.
7. The visual navigation system for an assisted drone vehicle of claim 1, wherein the drive module includes a direct current motor drive module and a stepper motor drive module;
the direct current motor driving module provides power for the unmanned trolley to move forward and controls the moving speed of the unmanned trolley; the direct current motor driving module selects a high-torque direct current speed reducing motor of 25GA-370 model;
the stepping motor driving module realizes accurate lifting of the video acquisition module so that the unmanned trolley can acquire image information in a wider range; the stepping motor driving module selects a stepping motor of 42BYGH40-1704A type.
8. The visual navigation system of claim 7, wherein the step motor driving module realizes accurate lifting of the video capture module by the step motor, so that the unmanned vehicle can capture a wider range of image information.
9. The visual navigation system for assisting the unmanned vehicle as claimed in claim 1, wherein the WiFi positioning module adopts an ESP8266 serial-port wireless WiFi module, the WiFi positioning module actively communicates with the detection nodes, and the coordinate position of the vehicle is obtained by combining measurements from three or more detection nodes.
10. The visual navigation system of claim 1, wherein the speed detection module uses a hall sensor model a3144E to perform speed detection, the hall sensor model a3144E generates a tachometer pulse signal, and the tachometer pulse signal is photoelectrically processed and transmitted to the core processor to perform speed detection.
CN202020235977.2U 2020-02-29 2020-02-29 Visual navigation system for assisting unmanned trolley Expired - Fee Related CN211552867U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202020235977.2U CN211552867U (en) 2020-02-29 2020-02-29 Visual navigation system for assisting unmanned trolley

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202020235977.2U CN211552867U (en) 2020-02-29 2020-02-29 Visual navigation system for assisting unmanned trolley

Publications (1)

Publication Number Publication Date
CN211552867U true CN211552867U (en) 2020-09-22

Family

ID=72494767

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202020235977.2U Expired - Fee Related CN211552867U (en) 2020-02-29 2020-02-29 Visual navigation system for assisting unmanned trolley

Country Status (1)

Country Link
CN (1) CN211552867U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113432597A (en) * 2021-08-02 2021-09-24 华中农业大学 Composite visual navigation system applied to inter-row management of complex outdoor environment
CN113432597B (en) * 2021-08-02 2022-12-13 华中农业大学 Composite visual navigation system applied to inter-row management of complex outdoor environment

Similar Documents

Publication Publication Date Title
WO2020253316A1 (en) Navigation and following system for mobile robot, and navigation and following control method
CN107856667B (en) Parking assist system and method
CN111784748B (en) Target tracking method and device, electronic equipment and mobile carrier
US20170140540A1 (en) Pose estimation apparatus and vacuum cleaner system
CN109737981B (en) Unmanned vehicle target searching device and method based on multiple sensors
WO2021218310A1 (en) Parking method and apparatus, and vehicle
CN107297748B (en) Restaurant service robot system and application
US20200047346A1 (en) Robot and robot system comprising same
CN102692236A (en) Visual milemeter method based on RGB-D camera
CN214520204U (en) Port area intelligent inspection robot based on depth camera and laser radar
CN106774318B (en) Multi-agent interactive environment perception and path planning motion system
CN111813130A (en) Autonomous navigation obstacle avoidance system of intelligent patrol robot of power transmission and transformation station
WO2022016754A1 (en) Multi-machine cooperative vehicle washing system and method based on unmanned vehicle washing device
CN110751336B (en) Obstacle avoidance method and obstacle avoidance device of unmanned carrier and unmanned carrier
CN106851095B (en) Positioning method, device and system
CN211552867U (en) Visual navigation system for assisting unmanned trolley
CN111251271B (en) SLAM robot for constructing and positioning rotary laser radar and indoor map
CN211529000U (en) Unmanned trolley based on laser radar and camera
CN202257269U (en) Wide-angle type binocular vision recognition and positioning device for service robot
CN115933718A (en) Unmanned aerial vehicle autonomous flight technical method integrating panoramic SLAM and target recognition
CN212781778U (en) Intelligent vehicle based on vision SLAM
CN113081525A (en) Intelligent walking aid equipment and control method thereof
CN113084776A (en) Intelligent epidemic prevention robot and system based on vision and multi-sensor fusion
CN218398132U (en) Indoor multifunctional operation robot of transformer substation
CN115237113B (en) Robot navigation method, robot system and storage medium

Legal Events

Date Code Title Description
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200922