CN109459013B - Monitoring navigation machine-mounted label and monitoring navigation system - Google Patents


Info

Publication number
CN109459013B
CN109459013B (application CN201811268395.8A; published as CN109459013A)
Authority
CN
China
Prior art keywords
machine
information
control center
rectangular
monitoring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811268395.8A
Other languages
Chinese (zh)
Other versions
CN109459013A (en
Inventor
Luo Sheng (罗胜)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wenzhou University
Original Assignee
Wenzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wenzhou University filed Critical Wenzhou University
Priority to CN201811268395.8A priority Critical patent/CN109459013B/en
Publication of CN109459013A publication Critical patent/CN109459013A/en
Application granted granted Critical
Publication of CN109459013B publication Critical patent/CN109459013B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K19/00 Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K19/06 Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K19/06009 Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking

Abstract

The embodiment of the invention provides a monitoring navigation machine-mounted tag and a monitoring navigation system. A tag is assigned to each target machine in one-to-one correspondence and carries a tag pattern designed for image recognition by a monitoring camera, so that a control center can compute the position, attitude and speed of the machine from the recognized image and, at the same time, read the machine code from the image. The invention solves the problem of inaccurate machine number identification and navigation positioning in the prior art, with the beneficial effect of more reliable machine identification and navigation.

Description

Monitoring navigation machine-mounted label and monitoring navigation system
Technical Field
The embodiment of the invention relates to the technical field of computer navigation, and in particular to a monitoring navigation machine-mounted tag and a monitoring navigation system.
Background
The navigation system is the "eyes" of the machine, and combining multiple technologies is the direction of future development. The navigation system provides the machine with vector information such as its position, speed and motion attitude in a reference coordinate system, and guides the machine along a specified route; it plays the role of the pilot in a manned system.
In the prior art, each machine carries a self-positioning sensor that reports its own position; the monitoring camera then identifies the machine appearing in the reported position and error range in its image as the machine with the corresponding number. However, if the self-positioning sensor is not accurate enough, the error range becomes too large, several machines may pass through it, and the machine number cannot be determined.
Disclosure of Invention
The embodiment of the invention provides a monitoring navigation machine-mounted tag and a monitoring navigation system, which solve the problem of inaccurate machine navigation positioning in the prior art, so as to achieve the beneficial effect of more reliable machine navigation.
According to a first aspect of an embodiment of the present invention, there is provided a monitoring navigation machine-mounted tag comprising an upper tag area, a coded tag area and a lower tag area:
the upper tag area is rectangular and is formed of n rectangular cells in a first color and a second color arranged side by side;
the lower tag area is rectangular and is formed of n rectangular cells in the first color and the second color arranged side by side;
the upper tag area and the lower tag area are aligned in parallel and together form an attitude tag area; the four corners of the attitude tag area are provided with one, two, three and one first-color rectangular cells respectively; the positions of the four corners of the attitude tag area relative to the monitoring camera are used to characterize the current attitude of the machine carrying the monitoring navigation machine-mounted tag;
the coded tag area is rectangular and is arranged between the upper tag area and the lower tag area; it is formed of m rows, each combining n-2 rectangular cells of the first color and the second color; the n-2 cells in the uppermost row of the coded tag area are aligned with the cells of the upper tag area other than two of them; the n-2 cells in the lowest row of the coded tag area are aligned with the cells of the lower tag area other than two of them; m is a positive integer; the combination of rectangular cells in the coded tag area characterizes the machine code, which is unique and corresponds to the machine carrying the monitoring navigation machine-mounted tag.
Further, a rectangular cell of the first color in the coded tag area represents 1 and a rectangular cell of the second color represents 0; the numeric string obtained by sequentially combining the numbers represented by the cells of the coded tag area characterizes the machine code.
According to a second aspect of the present invention, there is provided a monitoring navigation system comprising a control center, at least one monitoring camera and a monitoring navigation machine-mounted tag disposed on a target machine:
the control center acquires current first position information, first attitude information and first speed information of the target machine;
the control center selects, based on the first position, attitude and speed information, the monitoring camera corresponding to the current position of the target machine;
the control center controls that monitoring camera to acquire, in real time, current second position information, second attitude information and second speed information of the target machine based on the monitoring navigation machine-mounted tag of the target machine;
the control center generates or modifies the navigation path of the target machine based on the second position, attitude and speed information.
The monitoring navigation machine-mounted tag comprises an upper tag area, a coded tag area and a lower tag area:
the upper tag area is rectangular and is formed of n rectangular cells in a first color and a second color arranged side by side;
the lower tag area is rectangular and is formed of n rectangular cells in the first color and the second color arranged side by side;
the upper tag area and the lower tag area are aligned in parallel and together form an attitude tag area; the four corners of the attitude tag area are provided with one, two, three and one first-color rectangular cells respectively; the positions of the four corners of the attitude tag area relative to the monitoring camera are used to characterize the current attitude of the machine carrying the monitoring navigation machine-mounted tag;
the coded tag area is rectangular and is arranged between the upper tag area and the lower tag area; it is formed of m rows, each combining n-2 rectangular cells of the first color and the second color; the n-2 cells in the uppermost row of the coded tag area are aligned with the cells of the upper tag area other than two of them; the n-2 cells in the lowest row of the coded tag area are aligned with the cells of the lower tag area other than two of them; m is a positive integer; the combination of rectangular cells in the coded tag area characterizes the machine code, which is unique and corresponds to the machine carrying the monitoring navigation machine-mounted tag.
Further, the monitoring camera is also used to obtain rough information about obstacles the target machine will encounter if it continues along the current navigation path;
the control center sends the rough obstacle information to the target machine, the target machine acquires specific information about the obstacle, and the control center modifies the navigation path of the target machine based on that specific information.
Further, when there are a plurality of monitoring cameras, the control center is also used to generate or modify the navigation path of the target machine so that it preferentially stays close to the paths formed by the lines connecting the monitoring cameras.
Further, the control center is further configured to:
when the target machine is within the shooting range of some monitoring camera, selecting the monitoring camera corresponding to the current position of the target machine based on the first position information, the first attitude information and the first speed information;
when the target machine is outside the shooting range of every monitoring camera, calculating the approximate current position of the target machine from its historical position, attitude and speed information, and selecting the monitoring camera corresponding to the current position of the target machine based on that approximate position.
Further, the control center is further configured to transmit the approximate position information to each monitoring camera around that position; those monitoring cameras then acquire the target machine information based on the approximate position.
Further, an obstacle avoidance sensor is arranged on the target machine and used for acquiring specific information of the obstacle; the obstacle avoidance sensor is one or a combination of more of an ultrasonic sensor, an infrared sensor, a radar sensor and a camera.
Further, the control center and the machines are located in a closed wireless network whose content unauthorized external devices cannot access; the control center can bridge to external networks to interface with other services.
Further, the machine body carries a tag pattern for image recognition, in one-to-one correspondence with the machine code, for recognition by the monitoring camera.
The embodiment of the invention provides a monitoring navigation machine-mounted tag and a monitoring navigation system. A tag is assigned to each target machine in one-to-one correspondence and carries a tag pattern designed for image recognition by a monitoring camera, so that a control center can compute the position, attitude and speed of the machine from the recognized image and, at the same time, read the machine code from the image. The invention solves the problem of inaccurate machine number identification and navigation positioning in the prior art, with the beneficial effect of more reliable machine identification and navigation.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required by the embodiments or the prior-art descriptions are briefly introduced below. The drawings described below show some embodiments of the present invention; a person skilled in the art may derive other drawings from them without inventive effort.
FIG. 1 is a schematic structural diagram of an embodiment of a surveillance navigation onboard tag according to the present invention;
FIG. 2 is a schematic diagram of a structure of a monitoring navigation system according to an embodiment of the present invention;
fig. 3 is a physical structure schematic diagram of an embodiment of an electronic device according to the present invention.
Description of the embodiments
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are some, but not all, embodiments of the present invention; all other embodiments obtained by a person skilled in the art without inventive effort fall within the scope of the invention. Machine navigation refers to the method or process of determining the position, attitude and speed of a machine while it moves, and draws on disciplines including mathematics, mechanics, optics, electronics, automatic control and computer science. The performance of the navigation system directly determines whether the machine can perform its task, because the machine relies entirely on the navigation signal for automatic movement.
However, in the prior art, each machine carries a self-positioning sensor that reports its own position; the monitoring camera then identifies the machine appearing in the reported position and error range in its image as the machine with the corresponding number. If the self-positioning sensor is not accurate enough, the error range becomes too large, several machines may pass through it, and the machine number cannot be determined.
To solve at least one of these technical problems, the invention provides a monitoring navigation machine-mounted tag and system: a tag is assigned to each target machine in one-to-one correspondence and carries a tag pattern designed for image recognition by the monitoring camera, so that the control center can compute the position, attitude and speed of the machine from the recognized image and, at the same time, read the machine code from the image.
Referring to fig. 1, a schematic structural diagram of an embodiment of a monitoring navigation machine-mounted tag according to the present invention is shown. Overall, the tag includes an upper tag area A11, a coded tag area A13, and a lower tag area A12:
the upper tag area A11 is rectangular and is formed of n rectangular cells in a first color and a second color arranged side by side; for example, the first color and the second color may be red and white, respectively;
the lower tag area A12 is rectangular and is formed of n rectangular cells in the first color and the second color arranged side by side;
the upper tag area A11 and the lower tag area A12 are arranged in parallel and aligned to form an attitude tag area; the four corners of the attitude tag area are provided with one, two, three and one first-color rectangular cells respectively; the positions of the four corners of the attitude tag area relative to the monitoring camera are used to characterize the current attitude of the machine carrying the tag, and also make it easy for the monitoring camera to determine the orientation of the code.
The coded tag area A13 is rectangular and is arranged between the upper tag area A11 and the lower tag area A12; it is formed of m rows, each combining n-2 rectangular cells of the first color and the second color; the n-2 cells in the uppermost row of the coded tag area A13 are aligned with the cells of the upper tag area A11 other than two of them; the n-2 cells in the lowest row of the coded tag area A13 are aligned with the cells of the lower tag area A12 other than two of them; m is a positive integer; the combination of rectangular cells in the coded tag area characterizes the machine code, which is unique and corresponds to the machine carrying the tag.
The first color and the second color only need to be mutually distinguishable; any other combination of two distinguishable cell colors is equivalent and falls within the scope of this scheme, which the embodiment of the invention does not specifically limit. The cells are rectangular because the three attitude parameters of the tag, namely heading angle (yaw), pitch angle (pitch) and roll angle (roll), can be calculated from a rectangle alone.
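As a loose illustration of why a rectangle suffices for the three angles: once the four corner cells have been detected in the image, the in-plane rotation follows from the slope of one edge, and the two out-of-plane tilts can be roughly estimated from the perspective foreshortening of opposite edges. The sketch below is a crude planar approximation with hypothetical names, not the patented algorithm; a production system would fit a homography or solve a PnP problem instead.

```python
import math

def tag_attitude(corners):
    """Crude attitude estimate from the four detected tag corners, given
    as (top-left, top-right, bottom-right, bottom-left) image points.
    Roll comes from the slope of the top edge; pitch and yaw are
    approximated from the foreshortening ratio of opposite edges."""
    tl, tr, br, bl = corners
    # In-plane rotation: angle of the top edge in the image plane.
    roll = math.degrees(math.atan2(tr[1] - tl[1], tr[0] - tl[0]))
    top, bottom = math.dist(tl, tr), math.dist(bl, br)
    left, right = math.dist(tl, bl), math.dist(tr, br)
    # Opposite edges shrink unequally when the tag tilts out of plane;
    # the length ratio gives a rough tilt angle about each axis.
    pitch = math.degrees(math.acos(min(top, bottom) / max(top, bottom)))
    yaw = math.degrees(math.acos(min(left, right) / max(left, right)))
    return roll, pitch, yaw
```

For a fronto-parallel, axis-aligned tag all three angles come out zero, as expected.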
Further, the tag uses four red rectangular cells as its four corners, which facilitates detection of the target machine before it has fully entered the camera's field of view.
Further, based on the identification of the on-board tag, the current position information, speed information, and attitude information of the target machine can be identified.
On the basis of any of the above specific embodiments of the present invention, a monitoring navigation machine-mounted tag is provided in which:
a rectangular cell of the first color in the coded tag area A13 represents 1 and a rectangular cell of the second color represents 0; the numeric string obtained by sequentially combining the numbers represented by the cells of the coded tag area A13 characterizes the machine code.
The specific values 1 and 0 only need to be mutually distinguishable; any other pair of distinguishable numbers is equivalent and falls within the scope of this scheme, which the embodiment of the invention does not specifically limit.
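Under this encoding (first color = 1, second color = 0), decoding the machine code from a recognized cell grid reduces to reading a binary string. The Python sketch below is illustrative only: the function name and the row-major, left-to-right reading order are assumptions, since the text fixes no order beyond "sequentially combining".

```python
def decode_machine_code(grid):
    """Decode a machine code from the coded tag area.

    `grid` is an m x (n-2) matrix of cell values, where 1 marks a
    first-color (e.g. red) cell and 0 a second-color (e.g. white) cell.
    Cells are read row by row, left to right, and the resulting bit
    string is interpreted as an unsigned binary number.
    """
    bits = "".join(str(cell) for row in grid for cell in row)
    return int(bits, 2)
```

With m rows of n-2 cells the scheme distinguishes 2^(m(n-2)) machines, so even a small grid yields a large code space.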
Referring to fig. 2, a schematic diagram of the overall structure of a monitoring navigation system embodying the present invention is shown. Overall, the system comprises a control center A01, at least one monitoring camera A02 and a monitoring navigation machine-mounted tag A03 placed on the target machine:
the control center A01 acquires current first position information, first attitude information and first speed information of the target machine;
the control center A01 selects, based on the first position, attitude and speed information, the monitoring camera A02 corresponding to the current position of the target machine;
the control center A01 controls the monitoring camera A02 to acquire, in real time, current second position information, second attitude information and second speed information of the target machine based on the monitoring navigation machine-mounted tag A03 of the target machine;
the control center A01 generates or modifies the navigation path of the target machine based on the second position, attitude and speed information.
The monitoring navigation machine-mounted tag comprises an upper tag area, a coded tag area and a lower tag area:
the upper tag area is rectangular and is formed of n rectangular cells in a first color and a second color arranged side by side;
the lower tag area is rectangular and is formed of n rectangular cells in the first color and the second color arranged side by side;
the upper tag area and the lower tag area are aligned in parallel and together form an attitude tag area; the four corners of the attitude tag area are provided with one, two, three and one first-color rectangular cells respectively; the positions of the four corners of the attitude tag area relative to the monitoring camera are used to characterize the current attitude of the machine carrying the monitoring navigation machine-mounted tag;
the coded tag area is rectangular and is arranged between the upper tag area and the lower tag area; it is formed of m rows, each combining n-2 rectangular cells of the first color and the second color; the n-2 cells in the uppermost row of the coded tag area are aligned with the cells of the upper tag area other than two of them; the n-2 cells in the lowest row of the coded tag area are aligned with the cells of the lower tag area other than two of them; m is a positive integer; the combination of rectangular cells in the coded tag area characterizes the machine code, which is unique and corresponds to the machine carrying the monitoring navigation machine-mounted tag.
The monitoring navigation method in this embodiment generates or modifies the navigation path of the target machine based on pre-arranged monitoring cameras, in view of the inaccuracy of the machine's own positioning.
Further, the current first position information, first attitude information and first speed information of the target machine are acquired in order to locate the corresponding target machine; they may be reported actively by the target machine or fetched by the control center or server; the embodiment of the present invention does not specifically limit the manner of acquisition.
Still further, the control center or server selects, based on the first position, attitude and speed information, the monitoring camera corresponding to the current position of the target machine.
Still further, the monitoring camera is controlled to acquire, in real time, current second position information, second attitude information and second speed information of the target machine; based on this more accurate information, the control center or server can precisely calculate the position, attitude and speed of the target machine and correct its on-board navigation system.
The colors of the red and white rectangular cells only need to be mutually distinguishable; any other combination of two distinguishable colors is equivalent and falls within the scope of this scheme, which the embodiment of the present invention does not specifically limit.
Further, the tag uses four red rectangular cells as its four corners, which facilitates detection of the target machine before it has fully entered the camera's field of view.
Further, based on the identification of the on-board tag, the current position information, speed information, and attitude information of the target machine can be identified.
On the basis of any of the specific embodiments of the present invention, a monitoring navigation system is provided in which the monitoring camera is also used to obtain rough information about obstacles the target machine will encounter if it continues along the current navigation path;
the control center sends the rough obstacle information to the target machine, the target machine acquires specific information about the obstacle, and the control center modifies the navigation path of the target machine based on that specific information.
When the target machine proceeds along the current navigation path and is about to encounter an obstacle, the monitoring camera, with its larger field of view, can obtain the obstacle information in advance and send it to the target machine; the target machine then acquires real-time obstacle information with its own sensors (for example one or a combination of an ultrasonic sensor, an infrared sensor, a radar sensor and a camera). The obstacle information and the real-time information may include position, altitude, volume, shape and speed information, among others; the embodiment of the present invention does not specifically limit them.
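One minimal way to realize such a local path modification is to push the midpoint of an endangered path leg away from the obstacle. The 2D sketch below is an illustration under that simple clearance rule, with hypothetical names; it is not the patented method, which leaves the modification strategy open.

```python
import math

def local_detour(a, b, obstacle, clearance):
    """If the straight leg a->b passes within `clearance` of the obstacle,
    return an extra waypoint: the leg midpoint pushed perpendicular to the
    leg, on the side away from the obstacle. Return None if the leg is
    already safe. All arguments are planar (x, y) tuples."""
    ax, ay = a
    bx, by = b
    ox, oy = obstacle
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    # Project the obstacle onto the leg, clamped to the segment.
    t = max(0.0, min(1.0, ((ox - ax) * dx + (oy - ay) * dy) / (length * length)))
    px, py = ax + t * dx, ay + t * dy  # closest point on the leg
    dist = math.hypot(ox - px, oy - py)
    if dist >= clearance:
        return None  # leg already keeps enough clearance
    if dist > 1e-9:
        nx, ny = (px - ox) / dist, (py - oy) / dist  # away from obstacle
    else:
        nx, ny = -dy / length, dx / length  # obstacle on the leg: pick a side
    mx, my = (ax + bx) / 2, (ay + by) / 2
    return (mx + nx * clearance, my + ny * clearance)
```

The target machine would splice the returned waypoint into its local path and leave the global path from the control center untouched.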
On the basis of any one of the above embodiments of the present invention, a monitoring navigation system is provided, when there are a plurality of monitoring cameras; the control center is also used for: the navigation path of the target machine is generated or modified by preferentially approaching to the path formed by the connecting lines of the monitoring cameras.
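The preference for paths that hug the camera-to-camera connecting lines can be expressed as a cost that a path planner minimizes. The sketch below is an illustrative assumption (function names and the penalty weight are invented): it scores a candidate path by its geometric length plus a penalty proportional to how far each leg's midpoint strays from the nearest camera link.

```python
import math

def _seg_dist(p, a, b):
    """Distance from point p to the segment a-b (all planar tuples)."""
    px, py = p
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def path_cost(path, camera_links, off_link_weight=3.0):
    """Score a candidate path: total length plus a weighted penalty for how
    far each leg midpoint lies from the nearest camera-to-camera link, so
    paths staying near the links score lower."""
    cost = 0.0
    for a, b in zip(path, path[1:]):
        cost += math.hypot(b[0] - a[0], b[1] - a[1])
        mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
        cost += off_link_weight * min(_seg_dist(mid, u, v) for u, v in camera_links)
    return cost
```

A planner comparing candidate paths under this cost will pick the one running along the camera links whenever the detour it implies is cheaper than the off-link penalty.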
On the basis of any of the specific embodiments of the present invention, a monitoring navigation system is provided in which the control center preferentially generates or modifies the navigation path of the target machine within the shooting range of some monitoring camera.
On the basis of any of the specific embodiments of the present invention, a monitoring navigation system is provided in which the navigation path is generated by the control center, while the target machine can autonomously modify its local navigation path around an obstacle.
On the basis of any one of the specific embodiments of the present invention, there is provided a monitoring navigation system, the control center further being configured to:
when the target machine is within the shooting range of some monitoring camera, selecting the monitoring camera corresponding to the current position of the target machine based on the first position information, the first attitude information and the first speed information;
when the target machine is outside the shooting range of every monitoring camera, calculating the approximate current position of the target machine from its historical position, attitude and speed information, and selecting the monitoring camera corresponding to the current position of the target machine based on that approximate position.
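The history-based estimate can be as simple as dead reckoning from the last observed state. The sketch below (hypothetical names, constant-velocity assumption) predicts an approximate position and picks the monitoring camera nearest to it; the patent does not fix the estimation model, so this is only one plausible reading.

```python
def predict_position(last_pos, last_vel, dt):
    """Dead-reckon the machine's approximate current position from its
    last known position and velocity, assuming constant velocity over
    the dt seconds since the last camera observation."""
    return (last_pos[0] + last_vel[0] * dt, last_pos[1] + last_vel[1] * dt)

def nearest_camera(pos, cameras):
    """Select the camera whose mounting point is closest to the
    estimated position (squared distance suffices for comparison)."""
    return min(cameras, key=lambda c: (c[0] - pos[0]) ** 2 + (c[1] - pos[1]) ** 2)
```

The control center would hand the predicted position to that camera (and its neighbors) so they can search for the tag when the machine re-enters view.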
On the basis of any of the specific embodiments of the present invention, there is provided a monitoring navigation system in which the control center is further configured to transmit the approximate position information to each monitoring camera around that position; those monitoring cameras then acquire the target machine information based on the approximate position.
On the basis of any one of the specific embodiments of the present invention, a monitoring navigation system is provided, and a target machine is provided with an obstacle avoidance sensor for acquiring specific information of an obstacle; the obstacle avoidance sensor is one or a combination of more of an ultrasonic sensor, an infrared sensor, a radar sensor and a camera.
On the basis of any of the specific embodiments of the present invention, a monitoring navigation system is provided in which the control center and the machines are located in a closed wireless network whose content unauthorized external devices cannot access; the control center can bridge to external networks to interface with other services.
On the basis of any of the specific embodiments of the present invention, a monitoring navigation system is provided in which the machine body carries a tag pattern for image recognition, in one-to-one correspondence with the machine code, for recognition by the monitoring camera.
On the basis of any of the above specific embodiments of the present invention, a monitoring navigation system is provided in which the control center is used to:
periodically communicate with each machine, telling it its task path, position, attitude and speed;
at startup: (a) notify each machine that the control center has started, and have each machine report its position, attitude and speed; (b) after obtaining the position, attitude and speed of each machine, accurately calculate them;
during operation: (a) for a machine within the shooting range of some monitoring camera, have the camera located by the machine's position and error range perform target detection; (b) for a machine outside the shooting range of every camera, estimate its position, attitude, speed and error, and notify the nearest camera.
In accordance with any of the above embodiments of the present invention, there is provided a monitoring navigation system in which the machine is configured to:
at startup: (a) notify the control center that the machine has started, and report the machine's code, position and attitude; (b) obtain the closest camera;
during a task: periodically communicate with the control center to report the current position, attitude and speed.
On the basis of any of the specific embodiments of the present invention, a monitoring navigation example is given in which monitoring cameras carry out the navigation: the target machine is located, and position, attitude and speed information is provided to it. The operation flow is as follows.
(1) The user presses a key to issue a machine operation task request;
(2) The service center allows or refuses the task according to the service rules;
(3) If materials are to be sent out, an idle machine with sufficient battery is selected, and the control center sends a task notification to that machine;
(4) The machine goes to the material distribution place;
(5) After the materials are dispensed, the control center gives the machine a departure signal;
(6) The control center gives a global path;
(7) The machine goes to the destination along the global path;
(8) If an obstacle is found on the way, the local path is modified;
(9) The machine reaches the destination and sends a notification to the control center;
(10) After the material handling is finished, a signal is given to the control center;
(11) The task ends.
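The eleven steps above form a linear task life cycle. A minimal sketch, with illustrative state names not taken from the patent, is:

```python
# Each task advances through the states below; transitions are driven
# by user, service-center and control-center signals.
TASK_FLOW = [
    "request",     # (1) user presses a key
    "authorized",  # (2) service center allows the task
    "assigned",    # (3) idle machine with enough battery is chosen
    "at_pickup",   # (4) machine at the material distribution place
    "departed",    # (5) materials dispensed, departure signal given
    "routed",      # (6) control center gives a global path
    "en_route",    # (7)-(8) following the path, replanning locally
    "arrived",     # (9) destination reached, control center notified
    "completed",   # (10)-(11) material handled, task ends
]

def next_state(state):
    """Return the state that follows `state`, or None at the end."""
    i = TASK_FLOW.index(state)
    return TASK_FLOW[i + 1] if i + 1 < len(TASK_FLOW) else None
```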
Fig. 3 is a schematic diagram of the physical structure of an electronic device according to an embodiment of the present invention. As shown in Fig. 3, the electronic device may include: a processor 310, a communication interface (Communications Interface) 320, a memory 330 and a communication bus 340, where the processor 310, the communication interface 320 and the memory 330 communicate with each other via the communication bus 340. The processor 310 may invoke a computer program stored in the memory 330 and executable on the processor 310 to perform the transmission method provided by the above embodiments, which, for example, includes: acquiring current first position information, first attitude information and first speed information of a target machine; selecting a monitoring camera corresponding to the current position of the target machine based on the first position information, the first attitude information and the first speed information; controlling the monitoring camera to acquire current second position information, second attitude information and second speed information of the target machine in real time; and generating or modifying the navigation path of the target machine based on the second position information, the second attitude information and the second speed information.
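One cycle of this transmission method can be sketched as below. This is an assumed, simplified rendering: the camera interface, the "observe" callback and the two-point path stand in for the real detection and path-planning steps, which the patent does not specify at this level.

```python
import math

def navigation_update(target, cameras, destination):
    """One cycle: read the machine's own (first) state, select a
    camera, take the camera's (second, more accurate) state, and
    regenerate the path. All interfaces are illustrative."""
    # 1. current first position / attitude / speed from the machine
    pos1 = target["pos"]
    att1, spd1 = target["attitude"], target["speed"]
    # 2. select the monitoring camera covering the current position
    cam_id = min(cameras,
                 key=lambda c: math.dist(cameras[c]["pos"], pos1))
    # 3. the camera observes the machine: second position information
    pos2 = cameras[cam_id]["observe"](pos1)
    # 4. (re)generate the navigation path from the corrected position
    path = [pos2, destination]
    return cam_id, pos2, path
```

In use, the selected camera is simply the nearest one, and its observation replaces the machine's self-reported position before the path is rebuilt.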
Further, the logic instructions in the memory 330 may be implemented in the form of software functional units and, when sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the embodiments of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
The embodiments of the present invention also provide a non-transitory computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, performing the transmission method provided in the above embodiments, which, for example, includes: acquiring current first position information, first attitude information and first speed information of a target machine; selecting a monitoring camera corresponding to the current position of the target machine based on the first position information, the first attitude information and the first speed information; controlling the monitoring camera to acquire current second position information, second attitude information and second speed information of the target machine in real time; and generating or modifying the navigation path of the target machine based on the second position information, the second attitude information and the second speed information.
The apparatus embodiments described above are merely illustrative; units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, that is, they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the present invention without undue effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or, of course, by means of hardware. Based on this understanding, the foregoing technical solution, in essence, or the part contributing to the prior art, may be embodied in the form of a software product stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk or an optical disk, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (1)

1. A monitoring navigation system, characterized by comprising a control center, at least one monitoring camera, a machine and a monitoring navigation machine-mounted label on the machine, wherein after the monitoring navigation system is started:
(1) The user presses a key to issue a machine operation task request;
(2) The service center allows or refuses the task according to the service rules;
(3) If materials are to be sent out, an idle machine with sufficient battery is selected, and the control center sends a task notification to that machine;
(4) The machine goes to the material distribution place;
(5) After the materials are dispensed, the control center gives the machine a departure signal;
(6) The control center gives a global path;
(7) The machine goes to the destination along the global path;
(8) If an obstacle is found on the way, the local path is modified;
wherein when the control center starts: (1) each machine is notified that the control center has started, and each machine reports its position, attitude and speed; (2) after the position, attitude and speed of each machine are obtained, the position, attitude and speed of the machine are accurately calculated;
wherein when the control center operates: (1) for a machine within the photographing range of any monitoring camera, target detection is performed with the camera located according to the machine's position and positioning error; (2) for a machine outside the photographing range of every monitoring camera, its position, attitude, speed and error are estimated, and the nearest camera is notified;
the control center communicates with each machine at regular intervals to tell the machine its task path, position, attitude and speed;
when the machine is within the photographing range of a certain monitoring camera, the monitoring camera corresponding to the current position of the target machine is selected based on the first position information, the first attitude information and the first speed information;
when the machine is within a range that no monitoring camera can photograph, the current approximate position of the target machine is calculated based on the historical position information, historical attitude information and historical speed information of the target machine, and the monitoring camera corresponding to the current position of the target machine is selected based on the approximate position;
the control center acquires current first position information, first attitude information and first speed information of the target machine;
the control center selects the monitoring camera corresponding to the current position of the target machine based on the first position information, the first attitude information and the first speed information;
the control center controls the monitoring camera to acquire current second position information, second attitude information and second speed information of the target machine in real time, based on the monitoring navigation machine-mounted label corresponding to the target machine;
the control center accurately calculates the position, attitude and speed of the target machine from the more accurate second position information, second attitude information and second speed information, and corrects the on-board navigation system;
the control center generates or modifies the navigation path of the target machine based on the second position information, the second attitude information and the second speed information;
the control center acquires, from the monitoring camera, rough information on obstacles that the target machine may encounter if it continues to move along the current navigation path;
the control center sends the rough obstacle information to the target machine, and the target machine acquires specific information on the obstacle;
wherein the control center modifies the navigation path of the target machine based on the specific obstacle information;
the control center preferentially keeps the generated or modified navigation path of the target machine close to a path formed by connecting a plurality of monitoring cameras;
the control center is also used to send the approximate position information to all monitoring cameras around the approximate position, so that these monitoring cameras acquire information on the target machine based on the approximate position information;
the machine is provided with an obstacle avoidance sensor for acquiring the specific obstacle information; the obstacle avoidance sensor is one or a combination of more of an ultrasonic sensor, an infrared sensor, a radar sensor and a camera;
when the machine starts: (1) the control center is notified that the machine has started, and the machine's code, position and attitude are reported; (2) the nearest camera is acquired;
when the machine performs a task: it communicates with the control center at regular intervals to report its current position, attitude and speed;
the monitoring navigation machine-mounted label comprises an upper label area, a coded label area and a lower label area:
the upper label area is rectangular and is formed by juxtaposing n rectangular grids of a first color and n rectangular grids of a second color;
the lower label area is rectangular and is formed by juxtaposing n rectangular grids of the first color and n rectangular grids of the second color;
the upper label area and the lower label area are aligned in parallel and together form an attitude label area; the four corners of the attitude label area are respectively provided with one first-color rectangular grid, two first-color rectangular grids, three first-color rectangular grids and one first-color rectangular grid; the positions of the four corners of the attitude label area relative to the monitoring camera are used to characterize the current attitude of the machine to which the monitoring navigation machine-mounted label belongs;
the coded label area is rectangular and is arranged between the upper label area and the lower label area; the coded label area is formed by m rows of n-2 rectangular grids, each grid being of the first color or the second color; the n-2 rectangular grids in the uppermost row of the coded label area are aligned with the rectangular grids of the upper label area other than two of them; the n-2 rectangular grids in the lowest row of the coded label area are aligned with the rectangular grids of the lower label area other than two of them; wherein m is a positive integer;
the combination of rectangular grids in the coded label area is used to characterize the machine code, which is unique and corresponds to the machine to which the monitoring navigation machine-mounted label belongs.
CN201811268395.8A 2018-10-29 2018-10-29 Monitoring navigation machine-mounted label and monitoring navigation system Active CN109459013B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811268395.8A CN109459013B (en) 2018-10-29 2018-10-29 Monitoring navigation machine-mounted label and monitoring navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811268395.8A CN109459013B (en) 2018-10-29 2018-10-29 Monitoring navigation machine-mounted label and monitoring navigation system

Publications (2)

Publication Number Publication Date
CN109459013A CN109459013A (en) 2019-03-12
CN109459013B true CN109459013B (en) 2023-05-09

Family

ID=65608713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811268395.8A Active CN109459013B (en) 2018-10-29 2018-10-29 Monitoring navigation machine-mounted label and monitoring navigation system

Country Status (1)

Country Link
CN (1) CN109459013B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5758298A (en) * 1994-03-16 1998-05-26 Deutsche Forschungsanstalt Fur Luft-Und Raumfahrt E.V. Autonomous navigation system for a mobile robot or manipulator
CN102510476A (en) * 2011-10-28 2012-06-20 河海大学 Platform system of video monitoring integration information of network of things
DE102014002311A1 (en) * 2014-02-19 2014-08-14 Daimler Ag Method for navigation of motor vehicle, involves computing two alternative routes by navigation system, and displaying routes to driver for selection and images of route sections of two alternative routes in corresponding sequence
CN106931945A (en) * 2017-03-10 2017-07-07 上海木爷机器人技术有限公司 Robot navigation method and system
CN107368074A (en) * 2017-07-27 2017-11-21 南京理工大学 A kind of autonomous navigation method of robot based on video monitoring
DE102016225608A1 (en) * 2016-12-20 2018-06-21 Robert Bosch Gmbh Location system for a video surveillance system and at least one portable communication module

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1179752A (en) * 1995-03-31 1998-04-22 基维软件程序有限公司 Machine-readable label
US9323250B2 (en) * 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
CN106444780B (en) * 2016-11-10 2019-06-28 速感科技(北京)有限公司 A kind of autonomous navigation method and system of the robot of view-based access control model location algorithm


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design of a new visual navigation system for indoor mobile robots; Zhang Bo, Wang Jianming, Leng Yu, Dou Ruzhen; Industrial Control Computer; Vol. 23, No. 2; pp. 31-33 *

Also Published As

Publication number Publication date
CN109459013A (en) 2019-03-12

Similar Documents

Publication Publication Date Title
US20220234733A1 (en) Aerial Vehicle Smart Landing
KR102221695B1 (en) Apparatus and method for updating high definition map for autonomous driving
US20210397185A1 (en) Object Motion Prediction and Autonomous Vehicle Control
US20200333789A1 (en) Information processing apparatus, information processing method, and medium
CN110765894B (en) Target detection method, device, equipment and computer readable storage medium
CN107329477B (en) Unmanned ship navigation and automatic driving equipment and method thereof
CN109443345B (en) Positioning method and system for monitoring navigation
WO2017218586A1 (en) Methods and systems for reducing false alarms in a robotic device by sensor fusion
CN106541404A (en) A kind of Robot visual location air navigation aid
KR102131377B1 (en) Unmanned Vehicle for monitoring and system including the same
CN203094401U (en) Shipborne automatic identification system (AIS) automatic intelligent collision prevention system adopting sonar detection
CN109459013B (en) Monitoring navigation machine-mounted label and monitoring navigation system
KR102236753B1 (en) Method for setting ground control points using cluster RTK drones
CN113125795A (en) Obstacle speed detection method, device, equipment and storage medium
CN110603503A (en) Control method and device of mobile equipment and storage device
CN109489678B (en) Positioning method and system for monitoring navigation
CN109443346A (en) Monitor navigation methods and systems
WO2022004333A1 (en) Information processing device, information processing system, information processing method, and program
EP3761285A1 (en) Smart object knowledge sharing
WO2023127337A1 (en) Information processing device, information processing method, and program
EP4187501A1 (en) Monitoring system and method
CN115158606B (en) Water surface lifesaving method, system and medium
JP7283580B2 (en) Flying object management device, flying object management method, and program
KR102425372B1 (en) Multiple UAV control systems and control methods for ground control
CN115291607B (en) Autonomous planning generation method and system for unmanned ship path in complex unknown water area

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant