CN108917752B - Unmanned ship navigation method, device, computer equipment and storage medium - Google Patents

Unmanned ship navigation method, device, computer equipment and storage medium

Info

Publication number
CN108917752B
CN108917752B (application CN201810294328.7A)
Authority
CN
China
Prior art keywords
data
initial
point cloud
image
teaching
Prior art date
Legal status
Active
Application number
CN201810294328.7A
Other languages
Chinese (zh)
Other versions
CN108917752A (en)
Inventor
刘明
于洋
叶昊阳
王鲁佳
Current Assignee
Shenzhen Yiqing Innovation Technology Co ltd
Original Assignee
Shenzhen Yiqing Innovation Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Yiqing Innovation Technology Co ltd filed Critical Shenzhen Yiqing Innovation Technology Co ltd
Priority to CN201810294328.7A priority Critical patent/CN108917752B/en
Publication of CN108917752A publication Critical patent/CN108917752A/en
Application granted granted Critical
Publication of CN108917752B publication Critical patent/CN108917752B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10 Navigation by using measurements of speed or acceleration
    • G01C21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Inertial navigation combined with non-inertial navigation instruments

Abstract

The application relates to a navigation method and apparatus for an unmanned ship, a computer device, and a storage medium. The method comprises: acquiring image data, point cloud data, and measurement data of an inertial measurement unit between a first time and a second time; acquiring, from a navigation map, an initial position at the first time and a standard position at the second time; calculating a predicted position of the unmanned ship at the second time from the image data, the point cloud data, the measurement data, and the initial position; and, when the predicted position is inconsistent with the standard position, calculating the position difference between the standard position and the predicted position, generating a control command according to the position difference, and updating the acceleration, attitude, and speed parameters of the unmanned ship according to the control command, the control command instructing the unmanned ship to sail toward the standard position. Positioning the unmanned ship with all three data types (image data, point cloud data, and inertial measurement data) yields more accurate positioning data and thereby realizes navigation of the unmanned ship.

Description

Unmanned ship navigation method, device, computer equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for unmanned ship navigation, a computer device, and a storage medium.
Background
With the development of computer technology, navigation technology has emerged. A traditional unmanned ship navigation device relies mainly on an inertial measurement unit and image data. The inertial measurement unit can calculate the ship's displacement during navigation, but because it accumulates error, its error grows as the ship sails, causing positioning errors and drift away from the intended destination. To make navigation more accurate, image data is added for finer positioning; however, because image data is easily affected by factors such as weather at sea, the positioning data still carries large errors.
Disclosure of Invention
In view of the above, it is necessary to provide an unmanned ship navigation method, an apparatus, a computer device, and a storage medium that improve positioning accuracy.
An unmanned ship navigation method comprising:
acquiring image data, point cloud data and measurement data of an inertia measurement unit between a first moment and a second moment;
acquiring an initial position at a first moment and a standard position at a second moment in a navigation map;
calculating according to the image data, the point cloud data, the measurement data and the initial position between the first moment and the second moment to obtain a predicted position of the unmanned ship at the second moment;
and when the predicted position is inconsistent with the standard position, calculating the position difference between the standard position and the predicted position, generating a control instruction according to the position difference, and updating the acceleration, attitude, and speed parameters of the unmanned ship according to the control instruction, the control instruction instructing the unmanned ship to sail toward the standard position.
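The four claimed steps can be illustrated as one cycle of a minimal 2-D control loop. Everything here is an illustrative assumption, not from the patent: the function name `navigate_step`, the tuple positions, the Euclidean position difference, and the `tolerance` threshold.

```python
import math

def navigate_step(standard_pos, predicted_pos, tolerance=0.5):
    # Compare the predicted position with the standard position; when they
    # disagree by more than the tolerance, derive a control command from
    # the position difference (names and threshold are hypothetical).
    dx = standard_pos[0] - predicted_pos[0]
    dy = standard_pos[1] - predicted_pos[1]
    error = math.hypot(dx, dy)
    if error <= tolerance:
        # Predicted and standard positions agree: no correction needed.
        return None
    # Control command: sail toward the standard position.
    return {"heading": math.atan2(dy, dx), "distance": error}

cmd = navigate_step((10.0, 0.0), (9.0, 1.0))
```

In a real system the returned command would be translated into updated acceleration, attitude, and speed parameters rather than a bare heading.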
In one embodiment, the step of generating a navigation map comprises:
acquiring an initial navigation map;
acquiring teaching data, the teaching data comprising a teaching image data set, a teaching point cloud data set, and a teaching measurement data set, all collected by sensors during the teaching process;
carrying out data synchronization on the teaching image data set, the teaching point cloud data set and the teaching measurement data set according to time consistency;
acquiring matching feature points of each image in a teaching image data set to form a first matching feature point set;
acquiring a matching feature point set of each point cloud data in the teaching point cloud data set to form a second matching feature point set;
calculating according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set and the teaching measurement data set to obtain target position information;
and adding the target position information to the initial navigation map to obtain the navigation map.
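The map-generation steps above can be sketched end to end under some simplifying assumptions: each teaching data set is modelled as a dictionary mapping timestamps to position estimates, set intersection stands in for time-consistency synchronization, and a simple average stands in for the feature-matching computation. `build_navigation_map` and this data layout are hypothetical.

```python
def build_navigation_map(initial_map, image_set, cloud_set, imu_set):
    # Keep only timestamps present in all three teaching sets; this is a
    # stand-in for synchronizing the sets according to time consistency.
    common = sorted(set(image_set) & set(cloud_set) & set(imu_set))
    nav_map = dict(initial_map)
    for t in common:
        img_pos, cloud_pos = image_set[t], cloud_set[t]
        # Stand-in for the feature-point-based position computation:
        # average the image and point-cloud estimates for this moment.
        nav_map[t] = tuple((a + b) / 2.0 for a, b in zip(img_pos, cloud_pos))
    return nav_map

nav = build_navigation_map(
    {},                                      # initial (blank) navigation map
    {0: (0.0, 0.0), 1: (2.0, 0.0)},          # teaching image positions
    {0: (0.0, 2.0), 1: (2.0, 2.0)},          # teaching point cloud positions
    {0: 0, 1: 0, 2: 0},                      # teaching IMU timestamps
)
```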
In one embodiment, calculating according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set and the teaching measurement data set to obtain target position information includes:
calculating according to the teaching image data set, the first matching feature point set and the teaching measurement data set to obtain position information of the unmanned ship corresponding to each image in the teaching image data set, and forming a first position information set;
calculating according to the teaching point cloud data set, the second matching feature point set and the teaching measurement data set to obtain position information of the unmanned ship corresponding to each group of point cloud data in the teaching point cloud data set, and forming a second position information set;
and weighting the first position information and the second position information corresponding to the same moment to obtain the target position information.
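A minimal sketch of that weighting step, assuming fixed scalar weights for the image-derived and point-cloud-derived positions; the patent does not specify how the weights are chosen, so `fuse_positions` and the weight values are illustrative.

```python
def fuse_positions(image_pos, cloud_pos, w_image=0.4, w_cloud=0.6):
    # Weighted combination of two position estimates for the same moment.
    # The weights are assumptions; in practice they might reflect each
    # sensor's estimated accuracy.
    total = w_image + w_cloud
    return tuple((w_image * a + w_cloud * b) / total
                 for a, b in zip(image_pos, cloud_pos))

target = fuse_positions((0.0, 0.0), (10.0, 10.0))
```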
In one embodiment, the first time is the time at which first point cloud data is acquired, the second time is the time at which the next point cloud data is acquired, and obtaining the predicted position of the unmanned ship at the second time by calculating according to the image data, the point cloud data, the measurement data, and the initial position between the first time and the second time includes:
respectively acquiring an initial image and initial point cloud data corresponding to a first moment from the image data and the point cloud data;
acquiring initial measurement data corresponding to the first moment and next measurement data of the initial measurement data from the measurement data, and calculating according to the initial measurement data and the next measurement data to obtain initial prediction displacement and an initial prediction rotation matrix;
taking the next measurement data as initial measurement data, repeating the step of obtaining the initial measurement data and the next measurement data of the initial measurement data from the measurement data, and updating the initial prediction displacement and the initial prediction rotation matrix;
when the next frame of image of the initial image is obtained, calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame of image according to the initial image and the next frame of image;
calculating according to the initial prediction displacement, the initial prediction rotation matrix, the image prediction displacement and the image prediction rotation matrix which are updated at the current moment to obtain a first prediction displacement and a first prediction rotation matrix;
taking the next frame of image as an initial image, and repeatedly performing the steps of calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame of image according to the initial image and the next frame of image until the first prediction displacement and the first prediction rotation matrix are updated;
when next point cloud data corresponding to the initial point cloud data are obtained, calculating point cloud prediction displacement and a point cloud prediction rotation matrix of the unmanned ship corresponding to the initial point cloud data and the next point cloud data according to the initial point cloud data and the next point cloud data;
and calculating to obtain a predicted position according to the first predicted displacement and the first predicted rotation matrix updated at the second moment, the measurement data corresponding to the second moment, the point cloud predicted displacement and the point cloud predicted rotation matrix.
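The repeated "take the next measurement as the initial measurement and update the prediction" steps above amount to dead reckoning between two point-cloud frames. Below is a deliberately simplified 2-D stand-in: the IMU stream is reduced to (yaw rate, forward speed) pairs, and the rotation matrix is reduced to a scalar heading. A full implementation would also fuse the image and point-cloud predictions; all names here are assumptions.

```python
import math

def propagate_pose(x, y, heading, imu_samples, dt):
    # Integrate IMU samples between two point-cloud frames, updating the
    # predicted rotation (heading) and predicted displacement step by step,
    # mirroring the "update the initial prediction" loop in the text.
    for yaw_rate, speed in imu_samples:
        heading += yaw_rate * dt              # incremental rotation update
        x += speed * dt * math.cos(heading)   # displacement in world frame
        y += speed * dt * math.sin(heading)
    return x, y, heading

# Two seconds of straight-ahead sailing at 1 m/s with no turning.
px, py, ph = propagate_pose(0.0, 0.0, 0.0, [(0.0, 1.0), (0.0, 1.0)], 1.0)
```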
In one embodiment, after the obtaining of the predicted position of the unmanned ship at the second time by performing calculation according to the image data, the point cloud data, the measurement data and the initial position between the first time and the second time, the method comprises:
when the predicted position is consistent with the standard position, acquiring image data, point cloud data and measurement data of an inertial measurement unit of the unmanned ship between the second moment and the third moment;
taking the standard position corresponding to the second moment as an initial position, acquiring the position of the third moment in the navigation map as a standard position, executing calculation of displacement data and a rotation matrix between the second moment and the third moment, and determining the predicted position again according to the displacement data and the rotation matrix;
and controlling the navigation of the unmanned ship according to the predicted position and the standard position.
An unmanned ship navigation device comprising:
the data acquisition module is used for acquiring image data, point cloud data and measurement data of the inertial measurement unit between a first moment and a second moment and acquiring an initial position of the first moment and a standard position of the second moment in the navigation map;
the predicted position calculation module is used for calculating according to the image data, the point cloud data, the measurement data and the initial position between the first moment and the second moment to obtain the predicted position of the unmanned ship at the second moment;
and the navigation module is used for calculating the position difference between the standard position and the predicted position when the predicted position is inconsistent with the standard position, generating a control instruction according to the position difference, updating the acceleration, the attitude and the speed parameters of the unmanned ship according to the control instruction, and using the control instruction to instruct the unmanned ship to sail towards the standard position.
In one embodiment, the unmanned ship navigation device further comprises:
the data synchronization module is used for acquiring an initial navigation map and teaching data, wherein the teaching data comprises a teaching image data set, a teaching point cloud data set and a teaching measurement data set, the teaching data is data acquired by a teaching process sensor, and the teaching image data set, the teaching point cloud data set and the teaching measurement data set are subjected to data synchronization according to time consistency;
the feature point acquisition module is used for acquiring matching feature points of each image in the teaching image data set to form a first matching feature point set, acquiring a matching feature point set of each point cloud data in the teaching point cloud data set to form a second matching feature point set;
and the map generation module is used for calculating according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set and the teaching measurement data set to obtain target position information, and adding the target position information to the initial navigation map to obtain the navigation map.
In one embodiment, the unmanned ship navigation device comprises:
the data acquisition module is also used for acquiring image data, point cloud data and measurement data of the inertial measurement unit of the unmanned ship between the second moment and the third moment when the predicted position is consistent with the standard position;
the predicted position calculation module is also used for taking the standard position corresponding to the second moment as an initial position, acquiring the position of the third moment in the navigation map as a standard position, executing calculation of displacement data and a rotation matrix between the second moment and the third moment, and determining the predicted position again according to the displacement data and the rotation matrix;
and the navigation module is also used for controlling the navigation of the unmanned ship according to the predicted position and the standard position.
A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the above unmanned ship navigation method when executing the computer program.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned unmanned ship navigation method.
The unmanned ship navigation method, apparatus, computer device, and storage medium acquire image data, point cloud data, and inertial-measurement-unit measurement data between a first time and a second time, and acquire an initial position at the first time and a standard position at the second time in a navigation map; a predicted position of the unmanned ship at the second time is calculated from the image data, the point cloud data, the measurement data, and the initial position; and when the predicted position is inconsistent with the standard position, the position difference between the standard position and the predicted position is calculated, the acceleration, attitude, and speed parameters of the unmanned ship are updated, and a control instruction is generated from the position difference, the control instruction containing the updated acceleration, attitude, and speed parameters and instructing the unmanned ship to sail toward the standard position. The unmanned ship is positioned using the inertial measurement unit's measurement data, the image data, and the point cloud data over a period of time; its acceleration, attitude, and speed parameters are updated according to the positioning result, and the ship sails to its destination using the updated parameters. Using several different types of measurement data improves the navigation positioning accuracy of the unmanned ship.
Drawings
FIG. 1 is a diagram of an exemplary embodiment of a method for unmanned ship navigation;
FIG. 2 is a schematic flow chart diagram of a method for unmanned ship navigation in one embodiment;
FIG. 3 is a schematic flow chart diagram of a method for unmanned ship navigation in another embodiment;
FIG. 4 is a flowchart illustrating the step of calculating a target position in one embodiment;
FIG. 5 is a flowchart illustrating the step of calculating a predicted position in one embodiment;
FIG. 6 is a schematic flow chart illustrating a method for unmanned ship navigation in yet another embodiment;
FIG. 7 is a block diagram of the architecture of unmanned ship navigation in one embodiment;
FIG. 8 is a block diagram of the structure of the unmanned ship navigation in another embodiment;
FIG. 9 is a block diagram that illustrates the structure of a map generation module in one embodiment;
FIG. 10 is a block diagram of a predicted position calculation module in one embodiment;
FIG. 11 is a diagram illustrating an internal structure of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The unmanned ship navigation method provided by the application can be applied in the environment shown in fig. 1, in which the terminal 102 communicates with the server 104 via a network. The terminal 102 acquires image data, point cloud data, and measurement data of an inertial measurement unit between a first time and a second time, and acquires an initial position at the first time and a standard position at the second time in a navigation map; it calculates a predicted position of the unmanned ship at the second time from the image data, point cloud data, measurement data, and initial position; and, when the predicted position is inconsistent with the standard position, it calculates the position difference between the standard position and the predicted position, updates the acceleration, attitude, and speed parameters of the unmanned ship, and generates a control instruction according to the position difference, the control instruction containing the updated parameters and instructing the unmanned ship to sail toward the standard position.
Alternatively, the terminal 102 sends the acquired image data, point cloud data, and inertial measurement data between the first and second times to the server 104, and the server 104 performs the same steps: it acquires the initial and standard positions from the navigation map, calculates the predicted position, and, when the predicted position is inconsistent with the standard position, computes the position difference, updates the navigation parameters, and generates the control instruction instructing the unmanned ship to sail toward the standard position. The terminal 102 may be, but is not limited to, a personal computer, a notebook computer, a smartphone, or a tablet computer; the server 104 may be implemented as an independent server or as a server cluster composed of multiple servers.
In one embodiment, as shown in fig. 2, an unmanned ship navigation method is provided, which is described by taking the method as an example for being applied to the terminal in fig. 1, and includes the following steps:
step 202, image data, point cloud data and measurement data of the inertial measurement unit between the first moment and the second moment are obtained.
An unmanned ship is a ship that sails without an onboard crew. The image data is a set of images captured by the image capture device on the unmanned ship and contains at least two images. The point cloud data is collected by a laser sensor on the unmanned ship: using laser ranging, the spatial coordinates of sampling points on object surfaces are acquired under a common spatial reference frame, yielding a set of points that expresses the spatial distribution and surface characteristics of targets; the point cloud data contains at least two frames. The measurement data of the inertial measurement unit includes position coordinates, angular velocity, and linear velocity.
Specifically, the terminal acquires the image data captured by the image capture device between the first time and the second time, the point cloud data collected by the laser sensor, and the position coordinates, angular velocity, and linear velocity measured by the inertial measurement unit. Because the capture frequency of the image device, the acquisition frequency of the laser sensor, and the sampling frequency of the inertial measurement unit differ, each sample carries a time label when acquired, and the data can be synchronized through these time labels. In addition, because noise easily arises during acquisition, the image data, point cloud data, and measurement data can be denoised before use to obtain more accurate data.
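Synchronizing streams that run at different rates via their time labels can be done by nearest-timestamp association. The following sketch uses the standard-library `bisect` module; `nearest_sample` and the list-based layout are assumptions (the patent does not prescribe a data structure), and the timestamp list must be sorted.

```python
import bisect

def nearest_sample(timestamps, samples, t):
    # Return the sample whose time label is closest to t; used to associate
    # image, point-cloud, and IMU streams sampled at different frequencies.
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - t))
    return samples[best]
```

For example, an IMU reading taken at t = 0.12 s would be paired with the camera frame time-stamped 0.1 s rather than the one at 0.2 s.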
And step 204, acquiring an initial position at a first moment and a standard position at a second moment in the navigation map.
The navigation map is obtained by processing the data acquired during the teaching process. The initial position is a predefined starting position, one of the positions along the unmanned ship's route; the standard position is one of the destinations along the route.
Specifically, a plurality of target positions are marked in the navigation map. The position of the unmanned ship at the first time is configured according to the navigation map, with one of the marked target positions taken as the position at the first time, and a different marked target position is selected as the standard position at the second time.
And step 206, calculating according to the image data, the point cloud data, the measurement data and the initial position between the first moment and the second moment to obtain the predicted position of the unmanned ship at the second moment.
The predicted position is the position of the unmanned ship at the second time, calculated from the various measurement data. The image data acquired between the first and second times is analyzed to obtain a displacement and rotation matrix of the unmanned ship between the two times. The point cloud data is likewise analyzed to obtain a displacement and rotation matrix, and the measurement data is analyzed to obtain its corresponding displacement and rotation matrix. The displacement and rotation matrices obtained from these three different data types are then jointly optimized so that their error is minimized, yielding more accurate values, and the predicted position of the unmanned ship at the second time is calculated from the initial position and the optimized displacement and rotation matrix.
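The final step above applies the basic rigid-motion relation: the predicted position is the initial position plus the rotated displacement. A 2-D sketch, with the rotation matrix given as nested tuples; `predicted_position` and this layout are illustrative, not the patent's representation.

```python
def predicted_position(initial, displacement, rotation):
    # predicted = initial + R * d, with R a 2x2 rotation matrix given as
    # ((r11, r12), (r21, r22)) and d the optimized displacement vector.
    (r11, r12), (r21, r22) = rotation
    dx, dy = displacement
    return (initial[0] + r11 * dx + r12 * dy,
            initial[1] + r21 * dx + r22 * dy)

identity = ((1.0, 0.0), (0.0, 1.0))
quarter_turn = ((0.0, -1.0), (1.0, 0.0))   # 90-degree rotation
```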
Step 208, when the predicted position is inconsistent with the standard position, the position difference between the standard position and the predicted position is calculated, a control instruction is generated according to the position difference, and the acceleration, attitude, and speed parameters of the unmanned ship are updated according to the control instruction; the control instruction instructs the unmanned ship to sail toward the standard position.
The acceleration comprises an angular acceleration and a linear acceleration, the angular acceleration is used for adjusting the navigation direction of the unmanned ship, and the linear acceleration is used for adjusting the navigation speed of the unmanned ship.
Specifically, the predicted position is matched against the standard position. When the two are inconsistent, that is, when the position and attitude information of the predicted position differ from those of the standard position, the position difference between the standard position and the predicted position is calculated, a control command for steering the unmanned ship is generated from the position difference, and the acceleration, attitude, and speed parameters of the unmanned ship are adjusted according to the control command so that the ship sails toward the standard position.
The unmanned ship navigation method acquires image data, point cloud data, and inertial-measurement-unit measurement data between a first time and a second time; acquires an initial position at the first time and a standard position at the second time in a navigation map; calculates a predicted position of the unmanned ship at the second time from the image data, point cloud data, measurement data, and initial position; and, when the predicted position is inconsistent with the standard position, calculates the position difference between them, generates a control command from the position difference, and updates the acceleration, attitude, and speed parameters of the unmanned ship according to the command, which instructs the ship to sail toward the standard position. The unmanned ship obtains several kinds of measurement data through different sensors, fuses them, and positions itself from the fused data, yielding more accurate positioning.
As shown in fig. 3, in an embodiment, the unmanned ship navigation method further includes:
step S402, an initial navigation map is obtained, teaching data are obtained, the teaching data comprise an image set, a teaching point cloud data set and a teaching measurement data set, and the teaching data are data collected by a teaching process sensor.
Specifically, the terminal acquires a constructed blank map, or a map not yet containing a taught route, as the initial navigation map, and acquires the data collected by each sensor during the unmanned ship's teaching process: the teaching image data set, the teaching point cloud data set, and the measurement data collected by the inertial measurement unit. The raw data collected by the sensors can be used only after further processing, such as denoising and screening.
Step S404, the teaching image data set, the teaching point cloud data set, and the teaching measurement data set are synchronized according to time consistency.
Specifically, data synchronization means establishing correspondences between data according to time order and time consistency. For example, a group of teaching point cloud data acquired at 10:10:00 is associated with the teaching image data and teaching measurement data acquired at 10:10:00.
Step S406, matching feature points of each image in the teaching image data set are acquired to form a first matching feature point set, and matching feature points of each frame of point cloud data in the teaching point cloud data set are acquired to form a second matching feature point set.
Step S408, calculating according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set, and the teaching measurement data set to obtain target position information.
Specifically, feature extraction is performed on each image in the teaching image data set, and features of different images are matched to obtain the matching feature points of each image; these matching feature points form the first matching feature point set. Feature extraction is performed on the teaching point cloud data set in the same way to obtain the feature point sets of the point cloud data at all times, forming the second matching feature point set.
Image position information is calculated from the first matching feature point set and the teaching image data set; point cloud position information is calculated from the second matching feature point set and the teaching point cloud data set; measurement position information is calculated from the teaching measurement data set; and the target position information for each moment is calculated from the image, point cloud, and measurement position information.
Step S410, adding the target position information to the initial navigation map to obtain the navigation map.
Specifically, the target position information calculated from the teaching image data set, the teaching point cloud data set and the teaching measurement data set is added to the initial navigation map to generate the navigation map of step S104, which includes information such as the taught route. The navigation map obtained through the teaching navigation is then labeled with information such as the route, sailing speed and sailing time of the unmanned ship during the teaching navigation. A plurality of target mileage positions are marked in the navigation map; these can serve as important reference positions during navigation, and their position coordinates, corresponding sailing speed, sailing direction and other information are marked in the navigation map and stored.
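As a purely illustrative sketch (the patent does not prescribe any storage format; all field names are assumptions), the annotated navigation map could be represented as a list of labelled waypoints:

```python
# A hypothetical waypoint record for the navigation map: each target
# mileage position stores coordinates plus the sailing speed and heading
# taught during the demonstration run.

def make_waypoint(x, y, speed, heading_deg, timestamp):
    return {"position": (x, y), "speed": speed,
            "heading_deg": heading_deg, "time": timestamp}

navigation_map = [
    make_waypoint(0.0, 0.0, speed=2.0, heading_deg=90.0, timestamp=0.0),
    make_waypoint(0.0, 50.0, speed=2.5, heading_deg=90.0, timestamp=22.0),
]
```

During navigation, such records supply the "standard positions" and associated sailing parameters that the predicted positions are later matched against.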
As shown in fig. 4, in one embodiment, step S408 includes:
Step S4082, calculating according to the teaching image data set, the first matching feature point set and the teaching measurement data set to obtain position information of the unmanned ship corresponding to each image in the teaching image data set, and forming a first position information set.
Step S4084, calculating according to the teaching point cloud data set, the second matching feature point set and the teaching measurement data set to obtain position information of the unmanned ship corresponding to each group of point cloud data in the teaching point cloud data set, and forming a second position information set.
Step S4086, weighting the first position information and the second position information at the same moment to obtain the target position information.
Specifically, the teaching image data set and the first matching feature point set have a corresponding relation: each frame of image corresponds to one or more matching feature points. When calculating the relative position of the unmanned ship between images that share the same matching feature point, the matched feature points can first be screened, and the screening rule can be customized according to actual needs; for example, correspondences between matching feature points and images whose acquisition times are too close together may be discarded. From the correspondence between the matched feature points and the images, the displacement between the several images corresponding to each matching feature point is calculated, giving the displacement and rotation matrix of the unmanned ship corresponding to the matching feature points. The displacement and rotation matrix of the unmanned ship at each moment are also calculated from the teaching measurement data. Using the relation established between the teaching measurement data set and the image data set through the data acquisition time, the measurement unit of the displacement calculated from the image data, and of each rotation angle in the rotation matrix, is determined from the displacement calculated from the teaching measurement data set. The displacements and rotation angles, each carrying a measurement unit, form the first position information set.
The teaching point cloud data set is processed on the same principle as the teaching image data set: the matching feature point set corresponding to the teaching point cloud data set is calculated, and the displacement and rotation matrix of the unmanned ship corresponding to each group of point cloud data at each time point are calculated from the point cloud matching feature point set, the teaching point cloud data set and the teaching measurement data set. These displacements and rotation matrices form the second position information set.
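One way to read the unit-determination step above: image matching alone yields a displacement only up to an unknown scale, and the metric displacement computed from the measurement data fixes that scale. A minimal planar sketch (the norm-ratio rule and numeric values are illustrative assumptions, not the claimed computation):

```python
import math

# The image-derived displacement is unit-less (known only up to scale);
# the IMU-derived displacement over the same interval carries metric
# units. The ratio of their magnitudes gives the scale factor that
# assigns a measurement unit to the visual estimate.

def recover_scale(image_disp, imu_disp):
    """Scale the unit-less visual displacement so its norm matches the IMU's."""
    norm_img = math.hypot(*image_disp)
    norm_imu = math.hypot(*imu_disp)
    s = norm_imu / norm_img
    return (image_disp[0] * s, image_disp[1] * s)

scaled = recover_scale(image_disp=(0.6, 0.8), imu_disp=(3.1, 3.9))
```

The scaled displacement keeps the direction estimated from the images while taking its magnitude (and hence its unit) from the measurement data.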
A correspondence is established by time between the first position information corresponding to the images and the second position information corresponding to the point cloud data; the first position information and the second position information corresponding to the same moment are acquired and weighted to obtain the final target position information. During weighting, the weights corresponding to the first position information and the second position information can be adjusted adaptively or set as fixed parameters. Adaptive adjustment can be made according to factors such as the sailing weather, or a data quality detection model for the image data and point cloud data can be established and the corresponding weights determined according to the quality of the data.
Because the weather is changeable during sailing, the adaptive weight adjustment method adapts better to actual navigation.
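The adaptive weighting described above can be sketched as a quality-based combination; the quality scores and the normalized-weight rule are illustrative assumptions, not the claimed weighting scheme:

```python
# Fuse the image-based and point-cloud-based position estimates with
# weights derived from per-sensor quality scores (e.g. camera quality
# drops in fog). Weights are normalized so they sum to one.

def fuse_positions(pos_image, pos_cloud, q_image, q_cloud):
    w_img = q_image / (q_image + q_cloud)
    w_cld = q_cloud / (q_image + q_cloud)
    return tuple(w_img * a + w_cld * b for a, b in zip(pos_image, pos_cloud))

# In poor visibility the point cloud estimate dominates.
fused = fuse_positions(pos_image=(10.0, 4.0), pos_cloud=(12.0, 6.0),
                       q_image=1.0, q_cloud=3.0)
```

With fixed parameters the same function applies, simply with constant quality scores.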
As shown in fig. 5, in one embodiment, step S206 includes:
step S2062, an initial image and initial point cloud data corresponding to the first time are respectively obtained from the image data and the point cloud data.
Specifically, the first time is the time point at which the first point cloud data is acquired, and the second time is the time at which the next group of point cloud data after the first point cloud data is acquired. Taking the first time as the starting time, the image data and the point cloud data corresponding to the first time are acquired and used as the initial image and the initial point cloud data respectively. It should be noted that the starting time is not necessarily the time at which navigation begins; it can be any time point in the navigation process.
Step S2064, acquiring initial measurement data corresponding to the first moment and next measurement data of the initial measurement data from the measurement data, calculating to obtain initial prediction displacement and an initial prediction rotation matrix according to the initial measurement data and the next measurement data, taking the next measurement data as the initial measurement data, repeatedly entering the step of acquiring the next measurement data of the initial measurement data and the initial measurement data from the measurement data, and updating the initial prediction displacement and the initial prediction rotation matrix.
Specifically, the measurement data are acquired by the inertial measurement unit, and a self-defined mathematical operation can be performed on the data detected by the inertial measurement unit over a period of time to obtain the relative displacement and pose of the unmanned ship between different moments, where the pose can be determined from the displacement and rotation matrix. After the initial measurement data corresponding to the first moment are obtained, the inertial measurement unit continues to detect data. When the next group of measurement data is obtained, a mathematical calculation is performed on the initial measurement data and the next measurement data to obtain the displacement and rotation matrix of the unmanned ship between the first moment and the moment at which the next measurement data are obtained; these are used as the initial prediction displacement and the initial prediction rotation matrix respectively. The process is then repeated continuously: each subsequent group of measurement data yields a new displacement and rotation matrix, which are used to update the initial prediction displacement and the initial prediction rotation matrix.
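The repeated update described above amounts to dead reckoning: each new measurement extends the accumulated displacement and heading. A simplified planar sketch for illustration (real IMU processing would also handle sensor bias and gravity compensation, which are omitted here; the sample rate and values are assumptions):

```python
import math

# Integrate planar IMU samples (forward acceleration, yaw rate) into a
# running predicted displacement and heading, updating as each new
# measurement arrives -- a simplified stand-in for step S2064.

def integrate(samples, dt):
    x = y = v = heading = 0.0
    for accel, yaw_rate in samples:
        heading += yaw_rate * dt          # update pose from the gyro
        v += accel * dt                   # update speed from the accelerometer
        x += v * math.cos(heading) * dt   # extend the predicted displacement
        y += v * math.sin(heading) * dt
    return (x, y), heading

# Two seconds of constant 1 m/s^2 forward acceleration, no turning.
disp, heading = integrate([(1.0, 0.0)] * 20, dt=0.1)
```

Each call with the latest sample plays the role of "taking the next measurement data as the initial measurement data" and refreshing the initial prediction displacement and rotation.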
Step S2066, when the next frame image of the initial image is acquired, the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame image are calculated according to the initial image and the next frame image.
Specifically, when the next frame of image after the initial image is acquired, the features of the initial image and of the next frame of image are extracted and matched to obtain the corresponding matching feature points. From the coordinates of the matching feature points of the two images in the same coordinate system, the displacement and rotation matrix of the unmanned ship between the first moment and the moment at which the next frame of image is acquired are calculated and used as the image prediction displacement and the image prediction rotation matrix respectively.
Step S2068, calculating to obtain a first prediction displacement and a first prediction rotation matrix according to the initial prediction displacement, the initial prediction rotation matrix, the image prediction displacement and the image prediction rotation matrix which are updated at the current moment, taking the next frame of image as the initial image, and repeatedly entering the step of calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame of image according to the initial image and the next frame of image until the first prediction displacement and the first prediction rotation matrix are updated.
Specifically, the current time is the time corresponding to the next frame of image. The initial prediction displacement and initial prediction rotation matrix, as updated from the measurement data of the inertial measurement unit up to the current time, are obtained, and the first prediction displacement and first prediction rotation matrix are calculated from them together with the image prediction displacement and image prediction rotation matrix. The initial prediction displacement and initial prediction rotation matrix carry a measurement unit, while the image prediction displacement and image prediction rotation matrix do not. Because the displacement and rotation angle are obtained from different types of data measured over the same interval, the results calculated from each type of data differ only by a limited error, so the measurement units of the image prediction displacement and the image prediction rotation matrix can be determined from the measurement unit of the inertial measurement unit's data. The next frame of image is then taken as the initial image, and when the frame after it is obtained, the calculation of the displacement and rotation matrix between the initial image and the next frame of image is repeated until an updated first prediction displacement and first prediction rotation matrix are calculated; this process is repeated to keep the first prediction displacement and first prediction rotation matrix up to date.
Step S2070, when next point cloud data corresponding to the initial point cloud data is obtained, calculating a point cloud prediction displacement and a point cloud prediction rotation matrix of the unmanned ship corresponding to the initial point cloud data and the next point cloud data according to the initial point cloud data and the next point cloud data.
Specifically, the initial point cloud data and the next group of point cloud data after it are obtained from the sensor, the acquisition time of the next point cloud data being the second moment. The features of the initial point cloud data and of the next group of point cloud data are extracted and matched to obtain the corresponding matching features. From these matching features, the displacement and rotation matrix of the unmanned ship between the first moment and the second moment are calculated and used as the point cloud prediction displacement and the point cloud prediction rotation matrix respectively.
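The displacement-and-rotation-from-matched-features computation can be illustrated in 2D: given matched points expressed in a common plane, the rotation and translation aligning the first set onto the second have a closed-form least-squares solution. This is a textbook planar rigid-alignment sketch, not the patent's actual estimator:

```python
import math

# Estimate the 2D rotation angle and translation that map matched
# feature points from the first scan onto the second, via centroids
# and a cross/dot sum (planar Kabsch/Procrustes).

def rigid_align_2d(pts_a, pts_b):
    n = len(pts_a)
    cax = sum(p[0] for p in pts_a) / n; cay = sum(p[1] for p in pts_a) / n
    cbx = sum(p[0] for p in pts_b) / n; cby = sum(p[1] for p in pts_b) / n
    s_cross = s_dot = 0.0
    for (ax, ay), (bx, by) in zip(pts_a, pts_b):
        ax, ay, bx, by = ax - cax, ay - cay, bx - cbx, by - cby
        s_dot += ax * bx + ay * by
        s_cross += ax * by - ay * bx
    theta = math.atan2(s_cross, s_dot)          # best-fit rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = cbx - (c * cax - s * cay)              # translation after rotation
    ty = cby - (s * cax + c * cay)
    return theta, (tx, ty)

# Points rotated by 90 degrees and shifted by (1, 0).
theta, t = rigid_align_2d([(1, 0), (0, 1), (-1, 0)],
                          [(1, 1), (0, 0), (1, -1)])
```

A real point cloud pipeline works in 3D (e.g. via SVD) and must first establish the matches; the closed-form recovery of displacement and rotation from matches is the step sketched here.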
And step S2072, calculating to obtain a predicted position according to the first predicted displacement and the first predicted rotation matrix updated at the second moment, the measurement data corresponding to the second moment, the point cloud predicted displacement and the point cloud predicted rotation matrix.
Specifically, the first prediction displacement updated at the second moment and the point cloud prediction displacement are weighted to obtain a corresponding prediction displacement, and the first prediction rotation matrix and the point cloud prediction rotation matrix are weighted to obtain a corresponding prediction rotation matrix. The predicted position is then calculated from the displacement and rotation matrix obtained from the measurement data corresponding to the second moment together with the prediction displacement and prediction rotation matrix.
As shown in fig. 6, in an embodiment, the unmanned ship navigation method further includes:
Step S602, when the predicted position is consistent with the standard position, acquiring image data, point cloud data and measurement data of the inertial measurement unit of the unmanned ship between the second time and the third time.
Step S604, taking the standard position corresponding to the second time as the initial position, acquiring the position at the third time in the navigation map as the standard position, calculating the displacement data and rotation matrix between the second time and the third time, and determining the predicted position again according to the displacement data and the rotation matrix.
Step S606, controlling the navigation of the unmanned ship according to the predicted position and the standard position.
Specifically, when the predicted position corresponding to the second moment is consistent with the standard position obtained from the navigation map, the image data, point cloud data and measurement data of the unmanned ship between the second moment and the third moment are obtained, and the predicted position at the third moment is obtained by repeating the calculation. The predicted position at the third moment is matched against the standard position, and whether to generate a control instruction is determined according to the matching result. The parameters of the unmanned ship are adjusted according to the control instruction so that it sails toward the standard position within the preset time; alternatively, the method returns to step S202, new data are continuously obtained, the standard position is taken as the initial position, and a new standard position is obtained.
In a specific embodiment, for example, the position information corresponding to a first time and a second time is obtained from the navigation map. Between the first time and the second time, the measurement data of the inertial measurement unit are updated 500 times, the image data are updated 50 times, and the point cloud data are updated once. Thus, by the time the next point cloud data arrive, the image data have been updated 50 times, yielding 50 images. The relative position of the unmanned ship between each pair of adjacent images is calculated from the two images, and the relative positions of all adjacent image pairs between the first time and the second time are accumulated to obtain the relative position of the unmanned ship between the first time and the second time. The relative position of the unmanned ship over the same interval is likewise calculated from the measurement data of the inertial measurement unit. The predicted position at the second time is then calculated from the relative position obtained from the accumulated inertial measurement data, the relative position calculated from the image data and the relative position calculated from the point cloud data. The predicted position at the second time is matched against the position information of the second time in the navigation map; according to the matching result, either a control command is generated, or the next position in the navigation map is acquired as the position corresponding to a third time, the various types of measurement data are acquired up to the third time, the unmanned ship is continuously positioned according to the acquired data, and the navigation parameters are determined from the positioning result.
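The accumulation of per-pair relative positions described in this example can be sketched as composing successive relative motions into one total displacement (the motion values are illustrative, not data from the embodiment):

```python
import math

# Chain per-interval relative motions (heading change, forward distance)
# into a total displacement -- mirroring how the 50 inter-image relative
# positions between the first and second times are accumulated.

def compose(relative_motions):
    x = y = heading = 0.0
    for dtheta, dist in relative_motions:
        heading += dtheta
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    return x, y, heading

# Four quarter-turns of 1 m each trace a square back to the start.
x, y, heading = compose([(math.pi / 2, 1.0)] * 4)
```

The same composition applied to the IMU-derived and point-cloud-derived relative motions gives the other two relative-position estimates that are fused into the predicted position.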
Positioning the unmanned ship with the three types of measurement data yields a more accurate positioning result, and accurate positioning is the precondition for good navigation.
In one embodiment, as shown in fig. 7, there is provided an unmanned ship navigation device comprising: a data acquisition module 202, a predicted position calculation module 204, and a navigation module 206, wherein:
The data acquisition module 202 is configured to acquire the image data, point cloud data and measurement data of the inertial measurement unit between a first time and a second time, and to acquire the initial position at the first time and the standard position at the second time in the navigation map.
The predicted position calculation module 204 is used for calculating the predicted position of the unmanned ship at the second moment according to the image data, the point cloud data, the measurement data and the initial position between the first moment and the second moment.
The navigation module 206 is used for calculating the position difference between the standard position and the predicted position when the predicted position is inconsistent with the standard position, generating a control instruction according to the position difference, and updating the acceleration, attitude and speed parameters of the unmanned ship according to the control instruction, where the control instruction is used for instructing the unmanned ship to sail to the standard position.
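The control-instruction generation performed by the navigation module can be illustrated with a simple proportional rule; the gains and the instruction format are assumptions, since the patent leaves the control law unspecified:

```python
import math

# Turn the position difference between the standard and predicted
# positions into a hypothetical control instruction: desired heading
# points at the standard position; desired speed is proportional to
# the remaining distance, capped at a maximum.

def control_instruction(predicted, standard, k_speed=0.5, max_speed=3.0):
    dx = standard[0] - predicted[0]
    dy = standard[1] - predicted[1]
    distance = math.hypot(dx, dy)
    return {
        "heading": math.atan2(dy, dx),        # steer toward the standard position
        "speed": min(k_speed * distance, max_speed),
    }

cmd = control_instruction(predicted=(0.0, 0.0), standard=(3.0, 4.0))
```

A real controller would map the heading and speed commands onto thruster and rudder (acceleration, attitude, speed) setpoints.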
In one embodiment, as shown in fig. 8, the unmanned ship navigation device 200 further includes:
and the data synchronization module 402 is used for acquiring an initial navigation map and teaching data, wherein the teaching data comprises a teaching image data set, a teaching point cloud data set and a teaching measurement data set, the teaching data is data acquired by a teaching process sensor, and the teaching image data set, the teaching point cloud data set and the teaching measurement data set are subjected to data synchronization according to time consistency.
The feature point obtaining module 404 is configured to obtain matching feature points of each image in the teaching image data set to form a first matching feature point set, obtain a matching feature point set of each point cloud data in the teaching point cloud data set, and form a second matching feature point set.
The map generation module 406 is configured to calculate target position information according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set and the teaching measurement data set, and to add the target position information to the initial navigation map to obtain the navigation map.
In one embodiment, as shown in FIG. 9, the map generation module 406 includes:
the first position information calculating unit 4062 is configured to calculate, according to the teaching image data set, the first matching feature point set, and the teaching measurement data set, position information of the unmanned ship corresponding to each image in the teaching image data set, and form a first position information set.
The second position information calculation unit 4064 is configured to calculate, according to the teaching point cloud data set, the second matching feature point set and the teaching measurement data set, the position information of the unmanned ship corresponding to each group of point cloud data in the teaching point cloud data set, forming a second position information set.
The target information calculation unit 4066 is configured to weight the first location information and the second location information at the same time to obtain target location information.
In one embodiment, as shown in FIG. 10, the predicted position calculation module 204 includes:
an initial data obtaining unit 2042, configured to obtain an initial image and initial point cloud data corresponding to the first time from the image data and the point cloud data, respectively.
The measurement data processing unit 2044 is configured to obtain initial measurement data corresponding to the first time from the measurement data and next measurement data of the initial measurement data, calculate an initial prediction displacement and an initial prediction rotation matrix according to the initial measurement data and the next measurement data, use the next measurement data as the initial measurement data, repeat the step of obtaining the next measurement data of the initial measurement data and the initial measurement data from the measurement data, and update the initial prediction displacement and the initial prediction rotation matrix.
The image data processing unit 2046 is configured to, when a next frame image of the initial image is obtained, calculate an image prediction displacement and an image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame image according to the initial image and the next frame image, calculate a first prediction displacement and a first prediction rotation matrix according to the initial prediction displacement, the initial prediction rotation matrix, the image prediction displacement and the image prediction rotation matrix updated at the current time, use the next frame image as the initial image, and repeatedly enter a step of calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame image according to the initial image and the next frame image until the first prediction displacement and the first prediction rotation matrix are updated.
The point cloud data processing unit 2048 is configured to calculate, when the next point cloud data corresponding to the initial point cloud data is obtained, the point cloud prediction displacement and point cloud prediction rotation matrix of the unmanned ship corresponding to the initial point cloud data and the next point cloud data according to the initial point cloud data and the next point cloud data.
The predicted position calculation unit 2050 is configured to calculate the predicted position according to the first prediction displacement and first prediction rotation matrix updated at the second moment, the measurement data corresponding to the second moment, the point cloud prediction displacement and the point cloud prediction rotation matrix.
In one embodiment, the above-mentioned unmanned ship navigation device 200 further includes:
the data obtaining module 202 is further configured to obtain image data, point cloud data, and measurement data of the inertial measurement unit of the unmanned ship between the second time and the third time when the predicted position is consistent with the standard position.
The predicted position calculating module 204 is further configured to use the standard position corresponding to the second time as an initial position, obtain a position of a third time in the navigation map as a standard position, perform calculation of displacement data and a rotation matrix between the second time and the third time, and determine the predicted position again according to the displacement data and the rotation matrix.
And the navigation module 206 is also used for controlling the navigation of the unmanned ship according to the predicted position and the standard position.
For specific limitations on the unmanned ship navigation device, see the limitations on the unmanned ship navigation method above, which are not repeated here. The modules in the unmanned ship navigation device can be wholly or partially implemented in software, hardware or a combination thereof. The modules can be embedded in hardware form in, or be independent of, the processor in the computer device, or can be stored in software form in the memory of the computer device, so that the processor can call and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 11. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a method of unmanned ship navigation. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 11 is merely a block diagram of part of the structure associated with the present solution and does not limit the computer devices to which the present solution may be applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program: the method comprises the steps of obtaining image data, point cloud data and measurement data of an inertia measurement unit between a first moment and a second moment, obtaining an initial position of the first moment and a standard position of the second moment in a navigation map, calculating according to the image data, the point cloud data, the measurement data and the initial position between the first moment and the second moment to obtain a predicted position of the unmanned ship at the second moment, calculating a position difference between the standard position and the predicted position when the predicted position is inconsistent with the standard position, generating a control command according to the position difference, updating acceleration, posture and speed parameters of the unmanned ship according to the control command, and enabling the control command to be used for indicating the unmanned ship to sail to the standard position.
In one embodiment, the processor when executing the computer program further performs the steps of: the method comprises the steps of obtaining an initial navigation map, obtaining teaching data, wherein the teaching data comprises a teaching image data set, a teaching point cloud data set and a teaching measurement data set, the teaching data is data collected by a teaching process sensor, carrying out data synchronization on the teaching image data set, the teaching point cloud data set and the teaching measurement data set according to time consistency, obtaining matching feature points of all images in the teaching image data set to form a first matching feature point set, obtaining a matching feature point set of all point cloud data in the teaching point cloud data set to form a second matching feature point set, calculating according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set and the teaching measurement data set to obtain target position information, and adding the target position information to the initial navigation map to obtain the navigation map.
In one embodiment, the calculating according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set and the teaching measurement data set to obtain the target position information includes: calculating according to the teaching image data set, the first matching feature point set and the teaching measurement data set to obtain position information of the unmanned ship corresponding to each image in the teaching image data set to form a first position information set, calculating according to the teaching point cloud data set, the second matching feature point set and the teaching measurement data set to obtain position information of the unmanned ship corresponding to each group of point cloud data in the teaching point cloud data set to form a second position information set, and weighting the first position information and the second position information at the same time to obtain target position information.
In one embodiment, the calculating according to the image data, the point cloud data, the measurement data and the initial position in the first time and the second time to obtain the predicted position of the unmanned ship at the second time comprises: respectively acquiring an initial image and initial point cloud data corresponding to a first moment from the image data and the point cloud data, acquiring initial measurement data corresponding to the first moment and next measurement data of the initial measurement data from the measurement data, calculating to obtain initial prediction displacement and initial prediction rotation matrix according to the initial measurement data and the next measurement data, taking the next measurement data as the initial measurement data, repeating the step of obtaining the initial measurement data and the next measurement data of the initial measurement data from the measurement data, updating the initial prediction displacement and the initial prediction rotation matrix, when obtaining the next frame image of the initial image, calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame image according to the initial image and the next frame image, calculating to obtain a first prediction displacement and a first prediction rotation matrix according to the initial prediction displacement, the initial prediction rotation matrix, the image prediction displacement and the image prediction rotation matrix which are updated at the current moment, taking the next frame of image as the initial image, and repeatedly entering the step of calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame of image according to the initial image and the next frame of image until the first prediction displacement and the first prediction 
rotation matrix are updated, when the next point cloud data corresponding to the initial point cloud data is obtained, calculating point cloud prediction displacement and point cloud prediction rotation matrix of the unmanned ship corresponding to the initial point cloud data and the next point cloud data according to the initial point cloud data and the next point cloud data, and calculating to obtain a predicted position according to the first predicted displacement and the first predicted rotation matrix updated at the second moment, the measurement data corresponding to the second moment, the point cloud predicted displacement and the point cloud predicted rotation matrix.
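The repeated update of the initial prediction displacement and initial prediction rotation matrix from consecutive IMU measurements, as described above, can be illustrated by a minimal dead-reckoning step. The first-order rotation update and the gravity vector are assumptions for illustration, not details taken from the embodiment:

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector, so that skew(w) @ v == cross(w, v)."""
    wx, wy, wz = w
    return np.array([[0.0, -wz, wy],
                     [wz, 0.0, -wx],
                     [-wy, wx, 0.0]])

def imu_step(R, p, v, accel, gyro, dt, g=np.array([0.0, 0.0, -9.81])):
    """One dead-reckoning update between two consecutive IMU measurements.

    R: current rotation matrix (body to world), p: position, v: velocity,
    accel/gyro: body-frame accelerometer and gyroscope readings, dt: interval.
    Uses a first-order approximation of the rotation increment (assumption).
    """
    # Propagate orientation with a small-angle rotation increment.
    R_next = R @ (np.eye(3) + skew(gyro * dt))
    # Rotate acceleration into the world frame and remove gravity.
    a_world = R @ accel + g
    v_next = v + a_world * dt
    p_next = p + v * dt + 0.5 * a_world * dt**2
    return R_next, p_next, v_next
```

Chaining such steps between the first and second moments yields the accumulated predicted displacement and predicted rotation matrix that the image and point cloud predictions are then fused with.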
In one embodiment, the processor, when executing the computer program, further performs the steps of: when the predicted position is consistent with the standard position, acquiring image data, point cloud data and measurement data of an inertial measurement unit of the unmanned ship between the second moment and a third moment, taking the standard position corresponding to the second moment as the initial position, acquiring the position of the third moment in the navigation map as the standard position, calculating displacement data and a rotation matrix between the second moment and the third moment, determining the predicted position again according to the displacement data and the rotation matrix, and controlling navigation of the unmanned ship according to the predicted position and the standard position.
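The control stage described in these embodiments — comparing the predicted position with the standard position and generating a control command from the position difference — might be sketched as a simple proportional controller. The gain `kp` and the command format are hypothetical; the patent only states that a command is generated from the position difference:

```python
import numpy as np

def control_command(predicted, standard, kp=0.8):
    """Generate a correction command when predicted and standard positions differ.

    Returns None when the positions are consistent (the ship would then advance
    to the next waypoint), otherwise a hypothetical heading/speed command.
    """
    diff = np.asarray(standard, dtype=float) - np.asarray(predicted, dtype=float)
    if np.allclose(diff, 0.0, atol=1e-6):
        return None  # positions consistent: no correction needed
    heading = np.arctan2(diff[1], diff[0])   # course toward the standard position
    speed = kp * np.linalg.norm(diff)        # proportional speed command (assumed law)
    return {"heading": heading, "speed": speed}
```

A downstream actuator layer would translate such a command into the acceleration, attitude and speed parameter updates mentioned in the embodiment.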
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, causes the processor to perform the steps of: acquiring image data, point cloud data and measurement data of an inertial measurement unit between a first moment and a second moment, acquiring an initial position at the first moment and a standard position at the second moment in a navigation map, calculating according to the image data, the point cloud data, the measurement data and the initial position between the first moment and the second moment to obtain a predicted position of the unmanned ship at the second moment, when the predicted position is inconsistent with the standard position, calculating a position difference between the standard position and the predicted position, generating a control command according to the position difference, and updating the acceleration, attitude and speed parameters of the unmanned ship according to the control command, the control command being used for instructing the unmanned ship to sail to the standard position.
In one embodiment, the processor, when executing the computer program, further performs the steps of: the method comprises the steps of obtaining an initial navigation map, obtaining teaching data, wherein the teaching data comprise a teaching image data set, a teaching point cloud data set and a teaching measurement data set, the teaching data are acquired by a teaching process sensor, carrying out data synchronization on the teaching image data set, the teaching point cloud data set and the teaching measurement data set according to time consistency, obtaining matching feature points of all images in the teaching image data set to form a first matching feature point set, obtaining a matching feature point set of all point cloud data in the teaching point cloud data set to form a second matching feature point set, calculating according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set and the teaching measurement data set to obtain target position information, and adding the target position information to the initial navigation map to obtain the navigation map.
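The data synchronization of the teaching image, point cloud and measurement data sets "according to time consistency" can be illustrated by nearest-timestamp matching across the three streams. The 50 ms tolerance below is an assumed value, not taken from the embodiment:

```python
import bisect

def nearest(timestamps, t):
    """Index of the timestamp closest to t in a sorted list of timestamps."""
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    return min(candidates, key=lambda j: abs(timestamps[j] - t))

def synchronize(image_ts, cloud_ts, imu_ts, tolerance=0.05):
    """Pair each image with the nearest point cloud and IMU sample in time.

    Each argument is a sorted list of timestamps (seconds). Returns index
    triples (image, cloud, imu); images without a close enough match in both
    other streams are dropped. The tolerance value is an assumption.
    """
    triples = []
    for k, t in enumerate(image_ts):
        ci = nearest(cloud_ts, t)
        mi = nearest(imu_ts, t)
        if abs(cloud_ts[ci] - t) <= tolerance and abs(imu_ts[mi] - t) <= tolerance:
            triples.append((k, ci, mi))
    return triples
```

Because the camera, lidar and IMU typically run at different rates, such matching leaves each teaching image with exactly one time-consistent point cloud and measurement sample.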
In one embodiment, the calculating according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set and the teaching measurement data set to obtain the target position information includes: calculating according to the teaching image data set, the first matching feature point set and the teaching measurement data set to obtain position information of the unmanned ship corresponding to each image in the teaching image data set to form a first position information set, calculating according to the teaching point cloud data set, the second matching feature point set and the teaching measurement data set to obtain position information of the unmanned ship corresponding to each group of point cloud data in the teaching point cloud data set to form a second position information set, and weighting the first position information and the second position information corresponding to the same moment to obtain the target position information.
In one embodiment, the calculating according to the image data, the point cloud data, the measurement data and the initial position in the first time and the second time to obtain the predicted position of the unmanned ship at the second time comprises: respectively obtaining an initial image and initial point cloud data corresponding to a first time from the image data and the point cloud data, obtaining initial measurement data corresponding to the first time and next measurement data of the initial measurement data from the measurement data, calculating to obtain initial prediction displacement and initial prediction rotation matrix according to the initial measurement data and the next measurement data, taking the next measurement data as the initial measurement data, repeating the step of obtaining the initial measurement data and the next measurement data of the initial measurement data from the measurement data, updating the initial prediction displacement and the initial prediction rotation matrix, when obtaining the next frame image of the initial image, calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame of image according to the initial image and the next frame of image, calculating to obtain a first prediction displacement and a first prediction rotation matrix according to the initial prediction displacement, the initial prediction rotation matrix, the image prediction displacement and the image prediction rotation matrix which are updated at the current moment, taking the next frame of image as the initial image, and repeatedly entering the step of calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame of image according to the initial image and the next frame of image until the first prediction displacement and the first prediction 
rotation matrix are updated, when the next point cloud data corresponding to the initial point cloud data is obtained, calculating point cloud prediction displacement and a point cloud prediction rotation matrix of the unmanned ship corresponding to the initial point cloud data and the next point cloud data according to the initial point cloud data and the next point cloud data, and calculating to obtain a predicted position according to the first predicted displacement and the first predicted rotation matrix updated at the second moment, the measurement data corresponding to the second moment, the point cloud predicted displacement and the point cloud predicted rotation matrix.
In one embodiment, the processor, when executing the computer program, further performs the steps of: when the predicted position is consistent with the standard position, acquiring image data, point cloud data and measurement data of an inertial measurement unit of the unmanned ship between the second moment and a third moment, taking the standard position corresponding to the second moment as the initial position, acquiring the position of the third moment in the navigation map as the standard position, performing the calculation of displacement data and a rotation matrix between the second moment and the third moment, determining the predicted position again according to the displacement data and the rotation matrix, and controlling navigation of the unmanned ship according to the predicted position and the standard position.
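The waypoint-to-waypoint iteration of this embodiment (whenever prediction and standard position agree, the standard position becomes the new initial position and the next moment's map position becomes the new standard position) can be sketched as an outer loop. `estimate_position` and `issue_command` are hypothetical stand-ins for the sensor-fusion and control stages described above:

```python
import math

def navigate(waypoints, estimate_position, issue_command, tol=0.5, max_steps=100):
    """Outer navigation loop over map waypoints (a sketch of the embodiment).

    waypoints: standard positions read from the navigation map, in order.
    estimate_position(): returns the current predicted position (stand-in).
    issue_command(target): steers the ship toward the standard position (stand-in).
    tol: distance within which predicted and standard positions count as
    consistent (assumed value). Returns True once the route is complete.
    """
    i = 0
    for _ in range(max_steps):
        if i >= len(waypoints):
            return True  # all standard positions reached
        predicted = estimate_position()
        standard = waypoints[i]
        if math.dist(predicted, standard) <= tol:
            i += 1  # consistent: this standard position becomes the new initial position
        else:
            issue_command(standard)  # inconsistent: command the ship toward it
    return False
```

In the patent's terms, each loop iteration corresponds to one (first moment, second moment) interval, with the interval sliding forward to (second moment, third moment) once the positions agree.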
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
For the sake of brevity, not all possible combinations of the technical features of the above embodiments are described; however, as long as there is no contradiction between these combinations of technical features, they should be considered to be within the scope of the present disclosure.
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention patent. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method of unmanned ship navigation, the method comprising:
acquiring image data, point cloud data and measurement data of an inertial measurement unit between a first moment and a second moment;
acquiring an initial navigation map;
the method comprises the steps of obtaining teaching data, wherein the teaching data comprise a teaching image data set, a teaching point cloud data set and a teaching measurement data set, and the teaching data are data collected by a teaching process sensor;
carrying out data synchronization on the teaching image data set, the teaching point cloud data set and the teaching measurement data set according to time consistency;
acquiring matching feature points of each image in the teaching image data set to form a first matching feature point set;
acquiring a matching feature point set of each point cloud data in the teaching point cloud data set to form a second matching feature point set;
calculating according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set and the teaching measurement data set to obtain target position information;
adding the target position information to the initial navigation map to obtain a navigation map;
acquiring an initial position at the first moment and a standard position at the second moment in the navigation map;
calculating according to the image data, the point cloud data, the measurement data and the initial position between the first time and the second time to obtain a predicted position of the unmanned ship at the second time;
when the predicted position is inconsistent with the standard position, calculating a position difference between the standard position and the predicted position, generating a control instruction according to the position difference, and updating the acceleration, the attitude and the speed parameters of the unmanned ship according to the control instruction, wherein the control instruction is used for indicating the unmanned ship to sail to the standard position.
2. The method of claim 1, wherein the calculating target position information from the first set of matched feature points, the second set of matched feature points, the teach image data set, the teach point cloud data set, and the teach measurement data set comprises:
calculating position information of the unmanned ship corresponding to each image in the teaching image data set according to the teaching image data set, the first matching feature point set and the teaching measurement data set to form a first position information set;
calculating according to the teaching point cloud data set, the second matching feature point set and the teaching measurement data set to obtain position information of the unmanned ship corresponding to each group of point cloud data in the teaching point cloud data set, and forming a second position information set;
and weighting the first position information and the second position information corresponding to the same moment to obtain the target position information.
3. The method of claim 1, wherein the first moment is the moment at which first point cloud data is acquired and the second moment is the moment at which next point cloud data is acquired, and wherein the calculating according to the image data, the point cloud data, the measurement data and the initial position between the first moment and the second moment to obtain the predicted position of the unmanned ship at the second moment comprises:
respectively acquiring an initial image and initial point cloud data corresponding to the first moment from the image data and the point cloud data;
acquiring initial measurement data corresponding to the first moment and next measurement data of the initial measurement data from the measurement data, and calculating according to the initial measurement data and the next measurement data to obtain an initial prediction displacement and an initial prediction rotation matrix;
taking the next measurement data as initial measurement data, repeating the step of obtaining the initial measurement data and the next measurement data of the initial measurement data from the measurement data, and updating the initial prediction displacement and the initial prediction rotation matrix;
when the next frame image of the initial image is obtained, calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame image according to the initial image and the next frame image;
calculating to obtain a first prediction displacement and a first prediction rotation matrix according to the initial prediction displacement, the initial prediction rotation matrix, the image prediction displacement and the image prediction rotation matrix which are updated at the current moment;
taking the next frame image as the initial image, and repeatedly performing the step of calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame image according to the initial image and the next frame image until the first prediction displacement and the first prediction rotation matrix are updated;
when next point cloud data corresponding to the initial point cloud data are obtained, point cloud prediction displacement and a point cloud prediction rotation matrix of the unmanned ship corresponding to the initial point cloud data and the next point cloud data are calculated according to the initial point cloud data and the next point cloud data;
and calculating to obtain the predicted position according to the first predicted displacement and the first predicted rotation matrix which are updated at the second moment, the measured data corresponding to the second moment, the point cloud predicted displacement and the point cloud predicted rotation matrix.
4. The method of claim 1, wherein, after the calculating according to the image data, the point cloud data, the measurement data and the initial position between the first moment and the second moment to obtain the predicted position of the unmanned ship at the second moment, the method further comprises:
when the predicted position is consistent with the standard position, acquiring image data, point cloud data and measurement data of an inertial measurement unit of the unmanned ship between the second moment and a third moment;
taking the standard position corresponding to the second moment as the initial position, acquiring the position of the third moment from the navigation map as the standard position, executing calculation of displacement data and a rotation matrix between the second moment and the third moment, and determining the predicted position again according to the displacement data and the rotation matrix;
and controlling the navigation of the unmanned ship according to the predicted position and the standard position.
5. An unmanned marine navigation device, the device comprising:
the data acquisition module is used for acquiring image data, point cloud data and measurement data of an inertial measurement unit between a first moment and a second moment, and acquiring an initial position of the first moment and a standard position of the second moment in a navigation map;
the predicted position calculation module is used for calculating according to the image data, the point cloud data, the measurement data and the initial position between the first time and the second time to obtain a predicted position of the unmanned ship at the second time;
the navigation module is used for calculating the position difference between the standard position and the predicted position when the predicted position is inconsistent with the standard position, generating a control instruction according to the position difference, and updating the acceleration, attitude and speed parameters of the unmanned ship according to the control instruction, wherein the control instruction is used for indicating the unmanned ship to sail to the standard position;
the data synchronization module is used for acquiring an initial navigation map and teaching data, wherein the teaching data comprises a teaching image data set, a teaching point cloud data set and a teaching measurement data set, the teaching data is data acquired by a teaching process sensor, and the teaching image data set, the teaching point cloud data set and the teaching measurement data set are subjected to data synchronization according to time consistency;
the feature point acquisition module is used for acquiring matching feature points of each image in the teaching image data set to form a first matching feature point set, acquiring a matching feature point set of each point cloud data in the teaching point cloud data set to form a second matching feature point set;
and the map generation module is used for calculating according to the first matching feature point set, the second matching feature point set, the teaching image data set, the teaching point cloud data set and the teaching measurement data set to obtain target position information, and adding the target position information to the initial navigation map to obtain the navigation map.
6. The apparatus of claim 5, wherein the apparatus comprises:
the data acquisition module is further used for acquiring image data, point cloud data and measurement data of an inertial measurement unit of the unmanned ship between the second moment and a third moment when the predicted position is consistent with the standard position;
the predicted position calculation module is further configured to use a standard position corresponding to the second time as the initial position, obtain a position of the third time in the navigation map as the standard position, perform calculation of displacement data and a rotation matrix between the second time and the third time, and determine the predicted position again according to the displacement data and the rotation matrix;
and the navigation module is also used for controlling the navigation of the unmanned ship according to the predicted position and the standard position.
7. The apparatus of claim 5, wherein the map generation module comprises:
the first position information calculation unit is used for calculating position information of the unmanned ship corresponding to each image in the teaching image data set according to the teaching image data set, the first matching feature point set and the teaching measurement data set to form a first position information set;
the second position information calculation unit is used for calculating position information of the unmanned ship corresponding to each group of point cloud data in the teaching point cloud data set according to the teaching point cloud data set, the second matching feature point set and the teaching measurement data set to form a second position information set;
and the target information calculation unit is used for weighting the first position information and the second position information corresponding to the same moment to obtain the target position information.
8. The apparatus of claim 5, wherein the predicted position calculation module comprises:
the initial data acquisition unit is used for respectively acquiring an initial image and initial point cloud data corresponding to the first moment from the image data and the point cloud data;
the measurement data processing unit is used for acquiring initial measurement data corresponding to the first moment and next measurement data of the initial measurement data from the measurement data, and calculating to obtain an initial prediction displacement and an initial prediction rotation matrix according to the initial measurement data and the next measurement data;
taking the next measurement data as initial measurement data, repeating the step of obtaining the initial measurement data and the next measurement data of the initial measurement data from the measurement data, and updating the initial prediction displacement and the initial prediction rotation matrix;
the image data processing unit is used for calculating image prediction displacement and an image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame image according to the initial image and the next frame image when the next frame image of the initial image is acquired;
calculating to obtain a first prediction displacement and a first prediction rotation matrix according to the initial prediction displacement, the initial prediction rotation matrix, the image prediction displacement and the image prediction rotation matrix which are updated at the current moment;
taking the next frame image as the initial image, and repeatedly performing the step of calculating the image prediction displacement and the image prediction rotation matrix of the unmanned ship corresponding to the initial image and the next frame image according to the initial image and the next frame image until the first prediction displacement and the first prediction rotation matrix are updated;
the point cloud data processing unit is used for calculating point cloud prediction displacement and point cloud prediction rotation matrix of the unmanned ship corresponding to the initial point cloud data and the next point cloud data according to the initial point cloud data and the next point cloud data when the next point cloud data corresponding to the initial point cloud data is obtained;
and the predicted position calculating unit is used for calculating the predicted position according to the first predicted displacement and the first predicted rotation matrix which are updated at the second moment, the measurement data corresponding to the second moment, the point cloud predicted displacement and the point cloud predicted rotation matrix.
9. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 4 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 4.
CN201810294328.7A 2018-03-30 2018-03-30 Unmanned ship navigation method, device, computer equipment and storage medium Active CN108917752B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810294328.7A CN108917752B (en) 2018-03-30 2018-03-30 Unmanned ship navigation method, device, computer equipment and storage medium


Publications (2)

Publication Number Publication Date
CN108917752A CN108917752A (en) 2018-11-30
CN108917752B true CN108917752B (en) 2022-11-11

Family

ID=64402821

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810294328.7A Active CN108917752B (en) 2018-03-30 2018-03-30 Unmanned ship navigation method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN108917752B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112154455B (en) * 2019-09-29 2024-04-26 深圳市大疆创新科技有限公司 Data processing method, equipment and movable platform
CN112859826A (en) * 2019-11-26 2021-05-28 北京京东乾石科技有限公司 Method and apparatus for controlling an automated guided vehicle
CN111829515A (en) * 2020-07-09 2020-10-27 新石器慧通(北京)科技有限公司 Time synchronization method, device, vehicle and storage medium
CN115900639B (en) * 2023-03-08 2023-05-30 深圳市科思科技股份有限公司 Course angle correction method and server applied to cradle head camera on unmanned aerial vehicle
CN116578030B (en) * 2023-05-25 2023-11-24 广州市番高领航科技有限公司 Intelligent control method and system for water inflatable unmanned ship

Citations (5)

Publication number Priority date Publication date Assignee Title
CN104236548A (en) * 2014-09-12 2014-12-24 清华大学 Indoor autonomous navigation method for micro unmanned aerial vehicle
CN105180942A (en) * 2015-09-11 2015-12-23 安科智慧城市技术(中国)有限公司 Autonomous navigation method and device for unmanned ship
CN105182358A (en) * 2014-04-25 2015-12-23 Google Inc. Methods and systems for object detection using laser point clouds
CN206563962U (en) * 2017-03-08 2017-10-17 华南理工大学 Cloud data collection and processing unit based on bi-processor architecture
CN107656545A (en) * 2017-09-12 2018-02-02 武汉大学 A kind of automatic obstacle avoiding searched and rescued towards unmanned plane field and air navigation aid


Non-Patent Citations (1)

Title
IMU/DGPS-assisted vehicle-mounted CCD and laser scanner 3D data acquisition and modeling; Chen Yunfang et al.; Science of Surveying and Mapping (《测绘科学》); 2006-10-20; Vol. 31, No. 05; pp. 8, 78, 92-93 *


Similar Documents

Publication Publication Date Title
CN108917752B (en) Unmanned ship navigation method, device, computer equipment and storage medium
US10636168B2 (en) Image processing apparatus, method, and program
CN109099915B (en) Mobile robot positioning method, mobile robot positioning device, computer equipment and storage medium
CN108731664B (en) Robot state estimation method, device, computer equipment and storage medium
CN110047108B (en) Unmanned aerial vehicle pose determination method and device, computer equipment and storage medium
WO2018056391A1 (en) Method for creating positioning geomagnetism map, position measurement method, noise measurement method, and system for creating positioning geomagnetism map
CN111207762B (en) Map generation method and device, computer equipment and storage medium
CN111144398A (en) Target detection method, target detection device, computer equipment and storage medium
CN111241224B (en) Method, system, computer device and storage medium for target distance estimation
JP2017207456A (en) Attitude estimation device, attitude estimation method, control program, and recording medium
CN111721283B (en) Precision detection method and device for positioning algorithm, computer equipment and storage medium
CN110824496B (en) Motion estimation method, motion estimation device, computer equipment and storage medium
CN114863201A (en) Training method and device of three-dimensional detection model, computer equipment and storage medium
CN114819135A (en) Training method of detection model, target detection method, device and storage medium
CN111178126A (en) Target detection method, target detection device, computer equipment and storage medium
CN111723597A (en) Precision detection method and device of tracking algorithm, computer equipment and storage medium
US20210201011A1 (en) Data processing method for multi-sensor fusion, positioning apparatus and virtual reality device
CN112907663A (en) Positioning method, computer program product, device and system
CN117470224A (en) Optical remote sensing satellite geometric positioning precision improving method, system and equipment
CN114063024A (en) Calibration method and device of sensor, electronic equipment and storage medium
CN114202554A (en) Mark generation method, model training method, mark generation device, model training device, mark method, mark device, storage medium and equipment
CN114882115B (en) Vehicle pose prediction method and device, electronic equipment and storage medium
CN114463504B (en) Method, system and storage medium for reconstructing line side linear elements based on monocular camera
CN112633043B (en) Lane line determining method and device, electronic equipment and storage medium
CN110967016B (en) Off-line planning method and device for aircraft route and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant