CN117690194A - Multi-source AI biodiversity observation method and acquisition system

Multi-source AI biodiversity observation method and acquisition system

Info

Publication number
CN117690194A
CN117690194A (application CN202311681124.6A)
Authority
CN
China
Prior art keywords
gps
unmanned ship
beidou
imu
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311681124.6A
Other languages
Chinese (zh)
Other versions
CN117690194B (en)
Inventor
赵乃峰
王永新
张延伟
刘娜利
庄涛
李文哲
项天远
张兆龙
刘春乐
彭青红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Hongwan Weipeng Information Technology Co ltd
Original Assignee
Beijing Hongwan Weipeng Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Hongwan Weipeng Information Technology Co ltd filed Critical Beijing Hongwan Weipeng Information Technology Co ltd
Priority to CN202311681124.6A
Publication of CN117690194A
Application granted
Publication of CN117690194B
Legal status: Active

Landscapes

  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The present application provides a multi-source AI biodiversity observation method and acquisition system, belonging to the fields of intelligent ecological technology and biodiversity observation. The acquisition system includes an unmanned ship and an edge server. The unmanned ship in the system includes: a water-surface rotatable motion camera, an underwater rotatable motion camera, an audio acquisition device, a communication antenna, a central control unit, a Beidou/GPS positioning unit, an IMU unit, and a storage unit; the edge server in the system performs AI identification of animals and plants. The method can acquire high-quality biological observation data using artificial intelligence and high-precision positioning, provide important parameters for biodiversity evaluation, and supply a basis for scientific protection measures and policy formulation for biodiversity.

Description

Multi-source AI biodiversity observation method and acquisition system
Technical Field
The present application relates to the fields of intelligent ecological technology and biological observation, and in particular to an observation method and an acquisition system for biodiversity.
Background
As a branch of intelligent robotics, unmanned vessels generally integrate intelligent piloting, image acquisition, and transmission functions, and have the advantages of small size, low cost, good maneuverability, and the ability to carry different sensors, so using unmanned vessels to observe and collect information on ecological biodiversity is a very feasible direction. However, existing unmanned vessels have several problems in observation and collection: (1) incomplete acquisition parameters: the acquired data contain only the target video or audio, with no environmental parameters such as real-time geographic position and no real-time time information, so necessary information is missing in subsequent data analysis; (2) inaccurate positioning: inland unmanned vessels often operate in rivers, reservoirs, and small lakes, where the terrain is narrow and more complex than the open sea; near the shore, occlusion by buildings or trees easily causes loss of Beidou or GPS signals, leading to positioning failure or inaccuracy.
Disclosure of Invention
The purpose of this application is to solve the above problems of the prior art. This application provides a multi-source AI method and acquisition system for biodiversity observation that can use artificial intelligence and high-precision positioning to acquire high-quality biological observation data, provide important parameters for biodiversity evaluation, and supply a basis for scientific protection measures and policy formulation for biodiversity.
In order to achieve the above object, the present application proposes an acquisition system for multi-source AI biodiversity observation, the acquisition system comprising an unmanned ship and an edge server. The unmanned ship in the system includes: a water-surface rotatable motion camera, an underwater rotatable motion camera, an audio acquisition device, a communication antenna, a central control unit, a Beidou/GPS positioning unit, an IMU unit, and a storage unit. The communication antenna is fixedly connected to the upper surface of the unmanned ship, and the water-surface rotatable motion camera is mounted above the antenna; a fisheye lens is fixedly installed on the motion camera, and its shooting direction is adjusted through the camera's pan-tilt head. One underwater rotatable motion camera, protruding from the hull cavity, is fixedly installed on each of the left and right sides of the cavity; a fisheye lens is installed on each underwater rotatable motion camera, whose pan-tilt head is adjusted so that the lens faces downward. The audio acquisition device is installed at the rear of the unmanned ship. The Beidou/GPS positioning unit and the IMU unit realize fusion positioning, and the central control unit is connected with the storage unit, the Beidou/GPS positioning unit, the IMU unit, the control system of the unmanned ship, and the control system of the motion cameras. The edge server in the system performs AI identification of animals and plants.
In some embodiments, a Beidou/GPS/IMU fusion positioning mode is adopted to convert the longitude-latitude coordinates (lat, lon) of the Beidou or GPS into two-dimensional plane coordinates (x_Beidou/GPS, y_Beidou/GPS), and to convert the pose data of the IMU into two-dimensional plane coordinates (x_IMU, y_IMU).
In some embodiments, the fusion coordinates (x_fusion, y_fusion) are obtained by weighted fusion of the longitude-latitude coordinates of the Beidou or GPS with the pose data of the IMU, namely:

x_fusion = α·x_Beidou/GPS + β·x_IMU (5)
y_fusion = α·y_Beidou/GPS + β·y_IMU (6)
where α + β = 1 (7)

in which α and β are the weight of the longitude-latitude coordinates and the weight of the IMU pose data, respectively.
In some embodiments, when the working state of the Beidou/GPS is abnormal, i.e. Beidou/GPS positioning fails, the system enters a positioning mode with the IMU as primary and the Beidou/GPS as auxiliary, and the weights are dynamically updated on the IMU update period, namely:

x_fusion = α_abnormal·x_Beidou/GPS + β_abnormal·x_IMU (8)
y_fusion = α_abnormal·y_Beidou/GPS + β_abnormal·y_IMU (9)
where β_abnormal = 1/[1 + κ(t_current − t_IMU_refresh)] (10)
α_abnormal = 1 − β_abnormal (11)

in which t_current is the current time, t_IMU_refresh is the refresh time of the last IMU output, and κ is the time-decay rate coefficient.
In some embodiments, when the working state of the Beidou/GPS is normal, i.e. Beidou/GPS positioning is effective, the system enters a positioning mode with the Beidou/GPS as primary and the IMU as auxiliary, and the weights are dynamically updated on the Beidou/GPS update period, namely:

x_fusion = α_normal·x_Beidou/GPS + β_normal·x_IMU (12)
y_fusion = α_normal·y_Beidou/GPS + β_normal·y_IMU (13)
where α_normal = 1/[1 + κ(t_current − t_Beidou/GPS_refresh)] (14)
β_normal = 1 − α_normal (15)

in which t_current is the current time, t_Beidou/GPS_refresh is the refresh time of the last Beidou/GPS fix, and κ is the time-decay rate coefficient.
In some embodiments, the two-dimensional spherical longitude-latitude coordinates of the Beidou/GPS are converted into two-dimensional plane rectangular coordinates through the Mercator projection, i.e. the conversion from spherical latitude lat and longitude lon to plane abscissa x and ordinate y is realized through formula (1) and formula (2):

x_Beidou/GPS = lat (1)
y_Beidou/GPS = ln(tan(π/4 + lon/2)) (2)
In some embodiments, the pose data output by the IMU are preprocessed based on a sliding weighted filtering algorithm to obtain pose data of the unmanned ship in the x and y directions, from which a five-dimensional vector is constructed: linear acceleration and angular velocity in the x direction, linear acceleration and angular velocity in the y direction, and heading angle; these five-dimensional data are used as the input vector of a neural network model. Then, based on the trained neural network model, the sums of the relative displacements in the x and y directions (x_IMU, y_IMU) are predicted and taken as the output of the neural network model:

x_IMU = ΣΔx_i (3)
y_IMU = ΣΔy_i (4)
The present application also provides a method of multi-source AI biodiversity acquisition using an acquisition system according to any of claims 1 to 7, comprising: (1) data acquisition: video of animals and plants on the water surface is acquired through the water-surface rotatable motion camera and the audio acquisition device, and the fusion positioning information and time information of the unmanned ship are added to the acquired video images as attribute information; video of animals and plants under water is acquired through the underwater rotatable motion camera, and the fusion positioning information and time information of the unmanned ship are likewise added to the acquired video images as attribute information; (2) data transmission: the surface video, the underwater video data, and their attribute information acquired in step (1) are transmitted through the communication antenna to an edge server in the cloud for storage; (3) animal and plant identification: the edge server performs animal and plant identification on the received video; (4) reporting of the identification result: if the identification result matches the target object set by the shore-based control unit, the identification-related results of the identified target object and the positioning information of the corresponding unmanned ship are reported to the designated receiving device in a preset mode; (5) included-angle calculation: after the target object is identified, the edge server calculates the azimuth information of the target object and sends it to the central control unit of the unmanned ship; (6) unmanned-ship attitude adjustment: the central control unit calculates the included angle between the target object and the unmanned ship from the received azimuth information of the target object and the current positioning information of the unmanned ship, determines the heading of the unmanned ship according to this included angle, and controls the differential steering of the unmanned ship through the lower computer; after the attitude of the unmanned ship has been adjusted, it is driven toward the azimuth of the target object; (7) target object acquisition: after reaching the target position, target tracking and video alignment are carried out, and video images and audio data of the target object are collected and stored.
In some embodiments, after receiving the azimuth information of the target object and adjusting the heading attitude of the unmanned ship, the central control unit adjusts the pan-tilt head on the water-surface rotatable motion camera so that the fisheye lens on it faces the azimuth of the target object.
The application has the following advantages:
(1) The present application uses a Beidou/GPS/IMU fusion positioning mode, which can overcome problems such as occlusion by objects and signal loss in the complex acquisition environments in which the unmanned ship operates;
(2) To position the unmanned ship accurately, the present application creatively proposes a fusion positioning calculation method with dynamically adjusted weights, which comprehensively considers factors such as the refresh frequency and accuracy of the positioning signals and dynamically fuses the multi-source positioning information, thereby effectively alleviating the larger positioning errors caused by fixed weights;
(3) The present application provides a scheme for adding positioning and time information to the video attributes, effectively supporting the observation and analysis of the habits and other aspects of rare target species in biodiversity research.
Drawings
FIG. 1 is a schematic view of the unmanned ship of the present application;
FIG. 2 shows a flow chart of a method for multi-source AI biodiversity observation collection.
Detailed Description
The present application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and not limiting thereof. It should be further noted that, for convenience of description, only a part, but not all, of the drawings related to the present application are shown.
The unmanned ship used in the present application has a mass of about 130 kg, a length of about 2.8 m, a width of about 1.5 m, an endurance of about 20 km, a speed adjustable within 0-1.5 m/s, and a communication range exceeding 10 km. Its power supply uses 24 V and 12 V lithium battery packs to power the propulsion system and the main-control and sensor systems, respectively. The unmanned ship supports autonomous cruising, remote control, and other modes; in the autonomous cruising mode, it generally plans a reasonable navigation path according to its working area, target azimuth, and operational requirements, or completes automatic cruising according to the expected heading, GPS track points, route, and so on sent by the shore-based control unit.
As shown in fig. 1, the acquisition system for multi-source AI biodiversity observation described herein includes an unmanned ship and an edge server. The unmanned ship in the system includes: a water-surface rotatable motion camera (2), an underwater rotatable motion camera (5), an audio acquisition device (3), a communication antenna (1), a central control unit (4), a Beidou/GPS positioning unit, an IMU unit, and a storage unit. The communication antenna (1) is fixedly connected to the upper surface of the unmanned ship, and the water-surface rotatable motion camera (2) is mounted above the antenna (1); a fisheye lens is fixedly installed on the camera (2), and its shooting direction is adjusted through the camera's pan-tilt head. One underwater rotatable motion camera (5) is fixedly installed on each of the left and right sides of the lower part of the unmanned ship's hull cavity; a fisheye lens is installed on each underwater camera (5), and its pan-tilt head is adjusted so that the lens faces downward. The audio acquisition device (3) is installed at the rear of the unmanned ship. The Beidou/GPS positioning unit and the IMU unit realize fusion positioning, and the central control unit (4) is connected with the storage unit, the Beidou/GPS positioning unit, the IMU unit, the control system of the unmanned ship, and the control system of the motion cameras. The edge server in the system performs AI identification of animals and plants according to the received multi-source information collected by the unmanned ship.
The unmanned ship needs to acquire its own longitude and latitude while travelling toward a target, and this information currently comes from Beidou or GPS. GPS positioning accuracy is about 5 m and Beidou positioning accuracy is nearly the same, so the positioning accuracy is high. Although the unmanned ship can use GPS to obtain the longitude and latitude of its current position while travelling toward a target fish shoal, the update frequency of Beidou and GPS is low, about 1-10 Hz. Moreover, the unmanned ship mostly operates in inland rivers, reservoirs, small lakes, and similar areas, whose terrain is complex, narrow, and often occluded, so Beidou/GPS signals are easily lost; when they are lost, an IMU and other on-board sensors are used to complete the geographic position estimate. The IMU (Inertial Measurement Unit) has an update frequency of up to 1 kHz and comprises gyroscopes, accelerometers, and magnetometers. The gyroscope measures the angular velocity of the unmanned ship about three axes, from which the hull attitude is calculated; the accelerometer measures the acceleration and position of the unmanned ship; the magnetometer (M-Sensor), also called a geomagnetic or magnetic sensor, measures the strength and direction of the magnetic field at the unmanned ship, from which its position can also be calculated. However, the gyroscope and accelerometer in the IMU accumulate error over time, so a single IMU is not suitable for long-term use. The unmanned ship therefore completes self-positioning by multi-sensor fusion, i.e. the Beidou/GPS/IMU fusion positioning mode, realizing accurate real-time positioning.
The GPS outputs longitude and latitude information in the Earth coordinate system, and the satellite data obtained from the Beidou satellite navigation module, such as longitude, latitude, moving speed, and altitude, are likewise based on the Earth's spherical coordinate system; the coordinate system in which the unmanned ship operates, however, is a two-dimensional plane coordinate system (x, y), so the longitude-latitude information cannot be used directly and must be converted into a two-dimensional plane rectangular coordinate system. According to the Mercator conversion algorithm from the Earth's spherical coordinate system to a two-dimensional plane rectangular coordinate system, the present application converts the satellite longitude-latitude data obtained from the Beidou satellite navigation module, or the longitude-latitude data (lat, lon) obtained from the GPS global positioning system, into two-dimensional plane rectangular coordinates (x_Beidou/GPS, y_Beidou/GPS). The main step of the coordinate conversion is to convert the two-dimensional spherical longitude-latitude coordinates of the Beidou/GPS into two-dimensional plane rectangular coordinates through the Mercator projection, i.e. the conversion from spherical latitude lat and longitude lon to plane abscissa x and ordinate y is realized through formula (1) and formula (2).
x_Beidou/GPS = lat (1)
y_Beidou/GPS = ln(tan(π/4 + lon/2)) (2)
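For illustration, the following is a minimal Python sketch of formulas (1) and (2). It assumes the latitude and longitude are supplied in degrees and converted to radians first, and, like the formulas above, omits the scaling by the Earth's radius that a full Mercator projection would apply; the function name is illustrative, not the patent's.

```python
import math

def beidou_gps_to_plane(lat_deg: float, lon_deg: float) -> tuple[float, float]:
    """Convert a Beidou/GPS latitude/longitude fix to two-dimensional
    plane rectangular coordinates per formulas (1) and (2)."""
    lat = math.radians(lat_deg)  # degree input is an assumption; formulas use radians
    lon = math.radians(lon_deg)
    x = lat                                        # formula (1): x_Beidou/GPS = lat
    y = math.log(math.tan(math.pi / 4 + lon / 2))  # formula (2)
    return x, y
```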
The pose data output by the IMU unit include acceleration, angular velocity, heading angle, and so on, and these also need to be converted into the two-dimensional rectangular-coordinate representation. The present application performs this conversion with an AI approach. First, the pose data output by the IMU are preprocessed based on a sliding weighted filtering algorithm to obtain pose data of the unmanned ship in the x and y directions, forming five-dimensional data: linear acceleration and angular velocity in the x direction, linear acceleration and angular velocity in the y direction, and heading angle; these five-dimensional data are used as the input vector of a neural network model. Then, based on the trained neural network model, the sums of the relative displacements in the x and y directions (x_IMU, y_IMU) are predicted and taken as the output of the neural network model:

x_IMU = ΣΔx_i (3)
y_IMU = ΣΔy_i (4)
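As an illustrative sketch of this preprocessing-plus-prediction pipeline (the filter weights, network size, and tanh activation are all assumptions, since the patent specifies neither the filter parameters nor the network architecture; W1/b1/W2/b2 stand in for trained parameters):

```python
import numpy as np

def sliding_weighted_filter(samples: np.ndarray, weights: np.ndarray) -> float:
    """Weighted average over the most recent len(weights) samples --
    one simple reading of the sliding weighted filtering step."""
    window = samples[-len(weights):]
    return float(np.dot(window, weights) / weights.sum())

def imu_feature_vector(ax, ay, wx, wy, heading, weights) -> np.ndarray:
    """Five-dimensional input vector: filtered linear acceleration and
    angular velocity in x, the same in y, plus the heading angle."""
    return np.array([
        sliding_weighted_filter(ax, weights),  # x linear acceleration
        sliding_weighted_filter(wx, weights),  # x angular velocity
        sliding_weighted_filter(ay, weights),  # y linear acceleration
        sliding_weighted_filter(wy, weights),  # y angular velocity
        heading,                               # heading (course) angle
    ])

class DisplacementNet:
    """Tiny feed-forward model mapping the 5-D vector to the sums of
    relative displacements (x_IMU, y_IMU) of formulas (3) and (4)."""
    def __init__(self, W1, b1, W2, b2):        # trained parameters
        self.W1, self.b1, self.W2, self.b2 = W1, b1, W2, b2

    def predict(self, v: np.ndarray) -> tuple[float, float]:
        h = np.tanh(self.W1 @ v + self.b1)     # hidden layer
        x_imu, y_imu = self.W2 @ h + self.b2   # two-dimensional output
        return float(x_imu), float(y_imu)
```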
The Beidou/GPS positioning data and the IMU pose data are weighted and fused to obtain positioning data in two-dimensional plane coordinates, denoted (x_fusion, y_fusion):

x_fusion = α·x_Beidou/GPS + β·x_IMU (5)
y_fusion = α·y_Beidou/GPS + β·y_IMU (6)
where α + β = 1 (7)

in which α and β are the weight of the longitude-latitude coordinates and the weight of the IMU pose data, respectively.
To avoid the larger positioning errors caused by fixed weights, the present application performs the weighted fusion of the positioning data with a dynamic weight-updating method in which the positioning confidence decays with the time elapsed since the last refresh. Specifically, when the working state of the Beidou/GPS is abnormal, i.e. Beidou/GPS positioning fails, the system enters a positioning mode with the IMU as primary and the Beidou/GPS as auxiliary; the weights are then dynamically updated on the IMU update period, namely:

x_fusion = α_abnormal·x_Beidou/GPS + β_abnormal·x_IMU (8)
y_fusion = α_abnormal·y_Beidou/GPS + β_abnormal·y_IMU (9)
where β_abnormal = 1/[1 + κ(t_current − t_IMU_refresh)] (10)
α_abnormal = 1 − β_abnormal (11)

in which t_current is the current time, t_IMU_refresh is the refresh time of the last IMU output, and κ is the time-decay rate coefficient.

When the working state of the Beidou/GPS is normal, i.e. Beidou/GPS positioning is effective, the system enters a positioning mode with the Beidou/GPS as primary and the IMU as auxiliary; the weights are then dynamically updated on the Beidou/GPS update period, namely:

x_fusion = α_normal·x_Beidou/GPS + β_normal·x_IMU (12)
y_fusion = α_normal·y_Beidou/GPS + β_normal·y_IMU (13)
where α_normal = 1/[1 + κ(t_current − t_Beidou/GPS_refresh)] (14)
β_normal = 1 − α_normal (15)

in which t_Beidou/GPS_refresh is the refresh time of the last Beidou/GPS fix.
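The two operating modes differ only in which source's refresh time drives the decay, so they collapse into one function. A minimal sketch of the dynamic-weight fusion of formulas (8)-(15), assuming κ and the Beidou/GPS health flag are supplied by the caller:

```python
def fuse_position(x_gps: float, y_gps: float, x_imu: float, y_imu: float,
                  gps_ok: bool, t_now: float,
                  t_gps_refresh: float, t_imu_refresh: float,
                  kappa: float = 0.5) -> tuple[float, float]:
    """Dynamic-weight fusion per formulas (8)-(15): the primary source's
    weight decays as 1/[1 + kappa*dt] with the age of its last refresh."""
    if gps_ok:   # normal state: Beidou/GPS primary, formulas (12)-(15)
        alpha = 1.0 / (1.0 + kappa * (t_now - t_gps_refresh))  # (14)
        beta = 1.0 - alpha                                     # (15)
    else:        # abnormal state: IMU primary, formulas (8)-(11)
        beta = 1.0 / (1.0 + kappa * (t_now - t_imu_refresh))   # (10)
        alpha = 1.0 - beta                                     # (11)
    x_fusion = alpha * x_gps + beta * x_imu                    # (8)/(12)
    y_fusion = alpha * y_gps + beta * y_imu                    # (9)/(13)
    return x_fusion, y_fusion
```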
The method for collecting the multi-source information based on the collection system for multi-source AI biodiversity observation is shown in fig. 2, and specifically comprises the following steps:
the method comprises the steps of (1) carrying out video acquisition on animals and plants on the water surface through a water surface rotatable moving camera (2) and an audio acquisition device (3), and adding fusion positioning information of an unmanned ship at the moment, time information at the moment and the like into an acquired video image to serve as attribute information, so that subsequent research and analysis or data backtracking and other processing are facilitated. The underwater animals and plants are subjected to video acquisition through an underwater rotatable motion camera (5), and then fusion positioning information of an unmanned ship, time information and the like at the time are added into an acquired video image to serve as attribute information.
(2) At a set time interval, the surface video and underwater video data collected in step (1), together with their attribute information such as positioning information, time information, and file size, are transmitted through the communication antenna (1) to the edge server in the cloud for storage. The edge server also performs animal and plant identification on the received video; for the surface video, the video and audio streams may be separated, AI identification applied to each, and the two results integrated to obtain the animal and plant identification result.
(3) After receiving the data uploaded by the unmanned ship, the edge server verifies the integrity of the received data and, once verification succeeds, sends confirmation information to the unmanned ship; upon receiving this confirmation, the unmanned ship deletes the successfully uploaded files locally to prevent its storage space from overflowing (a sketch of this verify-and-acknowledge exchange follows).
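A minimal sketch of the exchange, assuming a SHA-256 digest as the integrity check (the patent says only "integrity verification"; the digest choice and function names are ours):

```python
import hashlib
import os

def verify_upload(path: str, reported_sha256: str) -> bool:
    """Edge-server side: recompute the file digest and compare it with
    the digest reported by the unmanned ship."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            h.update(chunk)
    return h.hexdigest() == reported_sha256

def on_ack_received(local_path: str) -> None:
    """Ship side: delete the successfully uploaded file upon confirmation,
    preventing the local storage space from overflowing."""
    os.remove(local_path)
```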
(4) If the identification result matches a target object set by the shore-based control unit, the results related to the identification of that target object, such as the image frames containing it and the corresponding positioning of the unmanned ship, are reported to the designated receiving device in a preset mode.
(5) After the target object is identified, the edge server calculates the azimuth information of the target object and sends it to the central control unit of the unmanned ship. The azimuth information is computed on the edge server by combining acoustic localization from the audio with the real-time fusion positioning of the unmanned ship, realizing cooperative sound-ship calculation and sound-light cross-verification, which makes the azimuth information of the target object more accurate.
(6) The central control unit (4) calculates the included angle between the target object and the unmanned ship from the received azimuth information of the target object and the current positioning information of the unmanned ship, determines the heading of the unmanned ship according to this included angle (one possible formulation of the angle is sketched below), and controls the differential steering of the unmanned ship through the lower computer; after the attitude of the unmanned ship has been adjusted, it is driven toward the azimuth of the target object, and the pan-tilt head on the water-surface rotatable motion camera is adjusted so that the fisheye lens on it faces the azimuth of the target object.
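The patent does not give the included-angle formula; one plausible reading, in the plane coordinate frame of the fusion positioning, is the signed difference between the bearing to the target and the current heading:

```python
import math

def included_angle(x_ship: float, y_ship: float,
                   x_target: float, y_target: float,
                   heading_rad: float) -> float:
    """Signed angle in radians, in [-pi, pi), between the bearing from
    the ship to the target and the ship's current heading; the
    atan2-based bearing is an assumed implementation, not the patent's."""
    bearing = math.atan2(y_target - y_ship, x_target - x_ship)
    return (bearing - heading_rad + math.pi) % (2 * math.pi) - math.pi
```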
(7) After reaching the target position, target tracking and video alignment are carried out, and video images and audio data of the target object are collected and stored.
The foregoing description covers only preferred embodiments of the present application and is not intended to limit it. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described therein or substitute equivalents for some of their technical features. Any modification or equivalent substitution made within the spirit and principles of the present application shall fall within its scope of protection.

Claims (10)

1. An acquisition system for multi-source AI biodiversity observation, the system comprising an unmanned ship and an edge server, wherein the unmanned ship in the system includes: a water-surface rotatable motion camera, an underwater rotatable motion camera, an audio acquisition device, a communication antenna, a central control unit, a Beidou/GPS positioning unit, an IMU unit, and a storage unit; the communication antenna is fixedly connected to the upper surface of the unmanned ship, and the water-surface rotatable motion camera is mounted above the antenna; a fisheye lens is fixedly installed on the water-surface rotatable motion camera, and its shooting direction can be adjusted by adjusting the camera's pan-tilt head; one underwater rotatable motion camera is fixedly installed on each of the left and right sides of the hull cavity of the unmanned ship; a fisheye lens is installed on each underwater rotatable motion camera and can be made to face downward by adjusting the underwater camera's pan-tilt head; the audio acquisition device is installed at the rear of the unmanned ship; the Beidou/GPS positioning unit and the IMU unit realize fusion positioning, and the central control unit is connected with the storage unit, the Beidou/GPS positioning unit, the IMU unit, the control system of the unmanned ship, and the control system of the motion cameras; and the edge server in the system performs AI identification of animals and plants according to the received multi-source information collected by the unmanned ship.
2. The acquisition system according to claim 1, characterized in that a Beidou/GPS/IMU fusion positioning mode is adopted to convert the longitude-latitude coordinates (lat, lon) of the Beidou or GPS into two-dimensional plane coordinates (x_Beidou/GPS, y_Beidou/GPS), and to convert the pose data of the IMU into two-dimensional plane coordinates (x_IMU, y_IMU).
3. an acquisition system according to claim 2, characterized in that the fusion coordinates (x Fusion of ,y Fusion of ) The GPS navigation system is obtained by carrying out weighted fusion on longitude and latitude coordinates of Beidou or GPS and pose data of the IMU.
4. The acquisition system according to claim 3, characterized in that when the working state of the Beidou/GPS is abnormal, i.e. Beidou/GPS positioning fails, the system enters a positioning mode with the IMU as primary and the Beidou/GPS as auxiliary, and the weights are dynamically updated on the IMU update period, namely:

x_fusion = α_abnormal·x_Beidou/GPS + β_abnormal·x_IMU (8)
y_fusion = α_abnormal·y_Beidou/GPS + β_abnormal·y_IMU (9)
where β_abnormal = 1/[1 + κ(t_current − t_IMU_refresh)] (10)
α_abnormal = 1 − β_abnormal (11)

in which t_current is the current time, t_IMU_refresh is the refresh time of the last IMU output, κ is the time-decay rate coefficient, and α_abnormal and β_abnormal are respectively the weight of the longitude-latitude coordinates and the weight of the IMU pose data when the working state of the Beidou/GPS is abnormal.
5. The acquisition system according to claim 3, characterized in that when the working state of the Beidou/GPS is normal, i.e. Beidou/GPS positioning is effective, the system enters a positioning mode with the Beidou/GPS as primary and the IMU as auxiliary, and the weights are dynamically updated on the Beidou/GPS update period, namely:

x_fusion = α_normal·x_Beidou/GPS + β_normal·x_IMU (12)
y_fusion = α_normal·y_Beidou/GPS + β_normal·y_IMU (13)
where α_normal = 1/[1 + κ(t_current − t_Beidou/GPS_refresh)] (14)
β_normal = 1 − α_normal (15)

in which t_current is the current time, t_Beidou/GPS_refresh is the refresh time of the last Beidou/GPS fix, κ is the time-decay rate coefficient, and α_normal and β_normal are respectively the weight of the longitude-latitude coordinates and the weight of the IMU pose data when the working state of the Beidou/GPS is normal.
6. The acquisition system according to claim 3, characterized in that the conversion from the two-dimensional spherical longitude-latitude coordinates of the Beidou/GPS to two-dimensional plane rectangular coordinates is performed using the following formulas (1) and (2), i.e. the conversion from spherical latitude lat and longitude lon to plane abscissa x and ordinate y:

x_Beidou/GPS = lat (1)
y_Beidou/GPS = ln(tan(π/4 + lon/2)) (2).
7. The acquisition system according to claim 3, characterized in that the pose data output by the IMU are preprocessed based on a sliding weighted filtering algorithm to obtain pose data of the unmanned ship in the x and y directions, from which a five-dimensional vector is constructed: linear acceleration and angular velocity in the x direction, linear acceleration and angular velocity in the y direction, and heading angle; these five-dimensional data are used as the input vector of a neural network model; then, based on the trained neural network model, the sums of the relative displacements in the x and y directions (x_IMU, y_IMU) are predicted, and the prediction is taken as the output of the neural network model.
8. A method of multi-source AI biodiversity acquisition using an acquisition system according to any of claims 1 to 7, comprising: (1) data acquisition: video of animals and plants on the water surface is acquired through the water-surface rotatable motion camera and the audio acquisition device, and the fusion positioning information and time information of the unmanned ship are added to the acquired video images as attribute information; video of animals and plants under water is acquired through the underwater rotatable motion camera, and the fusion positioning information and time information of the unmanned ship are likewise added to the acquired video images as attribute information; (2) data transmission: the surface video, the underwater video data, and their attribute information acquired in step (1) are transmitted through the communication antenna to an edge server in the cloud for storage; (3) animal and plant identification: the edge server performs animal and plant identification on the received video; and (4) reporting of the identification result: if the identification result matches the target object set by the shore-based control unit, the identification-related results of the identified target object and the positioning information of the corresponding unmanned ship are reported to the designated receiving device in a preset mode.
9. The acquisition method of claim 8, further comprising: (5) calculating an included angle: after the target object is identified, the edge server calculates the azimuth information of the target object and sends the azimuth information of the target object to the central control unit of the unmanned ship; (6) adjusting the attitude of the unmanned ship: the central control unit calculates an included angle between the target object and the unmanned ship according to the received azimuth information of the target object and the current positioning information of the unmanned ship, determines the heading of the unmanned ship according to the included angle, controls the differential steering movement of the unmanned ship through the lower computer, and drives the unmanned ship to move towards the azimuth of the target object after the posture of the unmanned ship is adjusted; (7) target object acquisition: after reaching the target position, target tracking and video alignment are carried out, and video images and audio data of the target object are collected and stored.
10. The acquisition method according to claim 8, wherein, after receiving the azimuth information of the target object and adjusting the heading attitude of the unmanned ship, the central control unit adjusts the pan-tilt head on the water-surface rotatable motion camera so that the fisheye lens on it faces the azimuth of the target object.
CN202311681124.6A 2023-12-08 2023-12-08 Multi-source AI biodiversity observation method and acquisition system Active CN117690194B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311681124.6A CN117690194B (en) 2023-12-08 2023-12-08 Multi-source AI biodiversity observation method and acquisition system


Publications (2)

Publication Number Publication Date
CN117690194A 2024-03-12
CN117690194B (en) 2024-06-07

Family

ID=90131207

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311681124.6A Active CN117690194B (en) 2023-12-08 2023-12-08 Multi-source AI biodiversity observation method and acquisition system

Country Status (1)

Country Link
CN (1) CN117690194B (en)


Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100456041B1 (en) * 2004-07-28 2004-11-10 (주)지앤제이 Equipment for Collecting Global Positioning Information
CN106873578A (en) * 2017-04-27 2017-06-20 南通大学 Unmanned operation intelligence boat equipment and control system
CN108303988A (en) * 2018-03-28 2018-07-20 大连海事大学 A kind of the target identification tracing system and its working method of unmanned boat
CN207908979U (en) * 2018-03-28 2018-09-25 大连海事大学 A kind of target identification tracing system of unmanned boat
CN110243358A (en) * 2019-04-29 2019-09-17 武汉理工大学 The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion
CN111897350A (en) * 2020-07-28 2020-11-06 谈斯聪 Underwater robot device, and underwater regulation and control management optimization system and method
CN113124864A (en) * 2021-04-19 2021-07-16 江苏虹湾威鹏信息技术有限公司 Water surface navigation method adopting machine vision and inertial navigation fusion
CN113985419A (en) * 2021-10-22 2022-01-28 中国科学院合肥物质科学研究院 Water surface robot cooperative obstacle detection and avoidance method and system
CN114046792A (en) * 2022-01-07 2022-02-15 陕西欧卡电子智能科技有限公司 Unmanned ship water surface positioning and mapping method, device and related components
CN115236714A (en) * 2022-05-24 2022-10-25 芯跳科技(广州)有限公司 Multi-source data fusion positioning method, device and equipment and computer storage medium
WO2023226155A1 (en) * 2022-05-24 2023-11-30 芯跳科技(广州)有限公司 Multi-source data fusion positioning method and apparatus, device, and computer storage medium
CN116448100A (en) * 2023-03-10 2023-07-18 华南理工大学 Multi-sensor fusion type offshore unmanned ship SLAM method
CN116540696A (en) * 2023-04-13 2023-08-04 北京工业大学 Multi-mode obstacle avoidance system based on multi-sensor fusion of unmanned ship of ROS
CN116642468A (en) * 2023-05-31 2023-08-25 交通运输部天津水运工程科学研究所 Unmanned aerial vehicle aerial photography and unmanned ship based underwater integrated scanning method
CN117075158A (en) * 2023-08-23 2023-11-17 哈尔滨工业大学 Pose estimation method and system of unmanned deformation motion platform based on laser radar

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHAO Yandong; TU Jiayan: "Research on an intelligent inspection, surveying and mapping system for forest regions based on the BeiDou satellite navigation system" (基于北斗卫星导航系统的林区智能巡检测绘系统研究), Transactions of the Chinese Society for Agricultural Machinery (农业机械学报), no. 07, 23 May 2018 (2018-05-23) *

Also Published As

Publication number Publication date
CN117690194B (en) 2024-06-07

Similar Documents

Publication Publication Date Title
US10942028B2 (en) Video sensor fusion and model based virtual and augmented reality systems and methods
US11328155B2 (en) Augmented reality labels systems and methods
Bao et al. Integrated navigation for autonomous underwater vehicles in aquaculture: A review
US10802141B2 (en) Water temperature overlay systems and methods
Kinsey et al. A survey of underwater vehicle navigation: Recent advances and new challenges
Whitcomb et al. Advances in underwater robot vehicles for deep ocean exploration: Navigation, control, and survey operations
CN108227751A (en) The landing method and system of a kind of unmanned plane
CN109911188A (en) The bridge machinery UAV system of non-satellite navigator fix environment
CA2977597A1 (en) Method and apparatus for target relative guidance
US11396354B2 (en) Covert underwater navigation via polarimetry
CN111829512A (en) AUV navigation positioning method and system based on multi-sensor data fusion
Shao et al. The application of AUV navigation based on adaptive extended Kalman filter
WO2018102772A1 (en) System and method for augmented reality comprising labels
CN114046777A (en) Underwater optical imaging system and method suitable for large-range shallow sea coral reef drawing
CN117690194B (en) Multi-source AI biodiversity observation method and acquisition system
US20230400302A1 (en) Systems and methods for measuring water capacity of polar lakes
CN114659496B (en) Method for monitoring inclination of shipborne Beidou all-in-one machine
Quraishi et al. Easily deployable underwater acoustic navigation system for multi-vehicle environmental sampling applications
Ballard Mapping the mid-ocean ridge
Conte et al. Data gathering in underwater archaeology by means of a remotely operated vehicle
US20220221297A1 (en) Waypoint timeline user interface systems and methods
CN114964245B (en) Unmanned aerial vehicle vision reconnaissance positioning method
Glotzbach et al. Navigation in Marine Robotics: Methods, Classification and State of the Art
CN114018255B (en) Intelligent integrated navigation method, system, equipment and medium of underwater glider
Bachmann et al. Design and evaluation of an integrated GPS/INS system for shallow-water AUV navigation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant