CN110393165B - Open sea aquaculture net cage bait feeding method based on automatic bait feeding boat

Open sea aquaculture net cage bait feeding method based on automatic bait feeding boat

Info

Publication number
CN110393165B
CN110393165B (application CN201910625650.8A)
Authority
CN
China
Prior art keywords
bait casting
feeding
aquaculture net
net cage
boat
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910625650.8A
Other languages
Chinese (zh)
Other versions
CN110393165A (en)
Inventor
曲梦瑶
陈俊华
林躜
黄方平
宋瑞银
姜楚华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Institute of Technology of ZJU
Original Assignee
Ningbo Institute of Technology of ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Institute of Technology of ZJU filed Critical Ningbo Institute of Technology of ZJU
Priority to CN201910625650.8A
Publication of CN110393165A
Application granted
Publication of CN110393165B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K61/00 Culture of aquatic animals
    • A01K61/80 Feeding devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B63 SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B35/00 Vessels or similar floating structures specially adapted for specific purposes and not otherwise provided for
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B63 SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B35/00 Vessels or similar floating structures specially adapted for specific purposes and not otherwise provided for
    • B63B2035/006 Unmanned surface vessels, e.g. remotely controlled
    • B63B2035/007 Unmanned surface vessels, e.g. remotely controlled autonomously operating
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/80 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in fisheries management
    • Y02A40/81 Aquaculture, e.g. of fish

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Mechanical Engineering (AREA)
  • Ocean & Marine Engineering (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Zoology (AREA)
  • Animal Husbandry (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Engineering & Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a feeding method for aquaculture net cages, in particular a method for feeding open sea aquaculture net cages based on an automatic bait casting boat. The method comprises the following steps: first, the navigation and obstacle avoidance module plans the route of the automatic bait casting boat from the GPS information of the aquaculture net cage and of the boat acquired by the GPS/IMU module, and the control module drives the boat toward the net cage; second, the visual identification module identifies the bait casting marker on the net cage and tracks and locates it in real time; third, from the real-time position of the marker, the control module calculates the effective parking position of the boat and, with the navigation and obstacle avoidance module adjusting the route plan in real time, steers the boat to that position in an effective bait casting attitude; the control module then drives the bait casting device to cast bait accurately at the effective parking position, completing one effective bait casting operation. The method solves the technical problem of achieving accurate bait casting at open sea aquaculture net cages with an automatic bait casting boat.

Description

Open sea aquaculture net cage bait feeding method based on automatic bait feeding boat
Technical Field
The invention relates to a feeding method for an aquaculture net cage, in particular to a feeding method for an open sea aquaculture net cage based on an automatic feeding boat.
Background
Modern marine aquaculture is gradually shifting toward open sea cage culture, and large numbers of HDPE floating cages are used for rearing marine products.
Open sea cage culture currently relies mainly on manned bait casting boats, which has two drawbacks: first, the labor cost of manual bait casting is high; second, given the harsh open sea environment, the personal safety of the operators on a manned bait casting boat cannot be guaranteed.
As unmanned surface vessel technology matures, the payload capabilities of unmanned ships continue to expand. In aquaculture, unmanned bait casting boats are gradually replacing manual bait casting; compared with manual operation, a remotely controlled unmanned bait casting boat reduces the manpower required and removes the safety risk of working on the water. Using accurate GPS positioning, an unmanned bait casting boat performs quantitative, fixed-point bait casting according to the feeding requirements. For example, published Chinese patent application No. 201910328839.0 discloses an unmanned bait casting boat and a method for controlling it.
However, existing unmanned bait casting boats are only suitable for the case in which the bait casting positions of the aquaculture net cages are preset and do not change during operation. For open sea aquaculture net cages, factors such as ocean currents and waves cause the cages to drift continuously and randomly within a certain range, so existing unmanned bait casting boats and bait casting methods obviously cannot achieve accurate fixed-point, quantitative bait casting.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and to provide a method for feeding open sea aquaculture net cages based on an automatic bait casting boat.
In order to achieve this purpose, the invention provides an open sea aquaculture net cage bait casting method based on an automatic bait casting boat. The boat carries a bait casting system consisting of a GPS/IMU module, an environment perception module, a navigation and obstacle avoidance module, a visual identification module, a control module and a bait casting device;
the GPS/IMU module acquires the GPS information of the aquaculture net cage and of the automatic bait casting boat, together with the hull attitude information of the boat;
the environment perception module acquires obstacle information; for example, underwater obstacles can be detected with sonar and surface obstacles with lidar;
the navigation and obstacle avoidance module plans the route of the automatic bait casting boat from the obstacle information provided by the environment perception module and the GPS information of the net cage and the boat provided by the GPS/IMU module;
the visual identification module identifies the bait casting marker on the aquaculture net cage and tracks and locates it in real time;
the control module calculates the effective parking position of the automatic bait casting boat from the real-time position of the bait casting marker and controls the boat so that it reaches that position in an effective bait casting attitude; the effective parking position is a position at which the hull lies within the effective throwing radius corresponding to the bait casting marker and the longitudinal centre plane of the hull is parallel to the tangential direction of the cage at the selected marker, the effective throwing radius being determined by the mounting position of the bait casting device on the hull and by its casting range;
and the bait casting device carries out the bait casting itself.
In addition, before the method is carried out, bait casting markers must be installed on the aquaculture net cage in advance. Each marker is a brightly coloured sphere whose size is chosen according to the effective working distance of the binocular camera in the visual identification module.
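As a rough illustration (not taken from the patent) of how the sphere size could be related to the binocular camera's effective working distance, a pinhole-camera estimate is sketched below; the focal length in pixels, the minimum pixel coverage and the working distance are all assumed values.

```python
# Back-of-envelope sketch: under a pinhole model, a sphere of diameter D at range Z
# appears roughly f_px * D / Z pixels wide (f_px = focal length in pixels), so the
# smallest usable diameter at the maximum working distance is:

def min_marker_diameter(max_range_m: float, focal_px: float, min_pixels: int = 20) -> float:
    """Smallest sphere diameter (m) that still spans min_pixels at max_range_m."""
    return min_pixels * max_range_m / focal_px

# illustrative numbers only: f ≈ 1400 px, detection out to 15 m, 20 px minimum coverage
print(round(min_marker_diameter(15.0, 1400.0), 2))   # -> 0.21 m
```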
The specific bait casting steps are as follows:
First, the navigation and obstacle avoidance module plans the route of the automatic bait casting boat from the GPS information of the aquaculture net cage and of the boat acquired by the GPS/IMU module, the GPS information of the net cage being preset; the control module drives the boat toward the net cage, avoiding obstacles according to the obstacle information acquired in real time by the environment perception module;
Second, the visual identification module identifies the bait casting marker on the aquaculture net cage and tracks and locates it in real time, in the following specific steps (a hedged code sketch of the detection steps is given after this procedure):
S01, the camera is calibrated to obtain its intrinsic and extrinsic parameters, from which the depth of each pixel and the correspondence between pixel coordinates and the world coordinate system are obtained;
S02, a frame is selected from the left-eye colour image sequence of the binocular camera and denoised as pre-processing;
S03, RGB feature thresholds are set and colour segmentation is performed; regions reaching the thresholds become candidate regions;
S04, edges of the candidate regions are extracted, arcs are fitted piecewise, arcs of the same type are clustered and re-fitted, and the circular target in the image is selected;
S05, the pixel coordinates of the circle centre are mapped into the world coordinate system to determine the distance and direction of the bait casting marker;
S06, weighted histograms of the RGB and geometric features of the bait casting marker are built, Kalman filtering predicts the position of the marker, feature values are matched within the predicted region, and the best-matching region becomes the target position in the next frame;
S07, S06 is repeated to track and locate the bait casting marker in real time;
Third, from the real-time position of the bait casting marker, the control module computes the effective throwing radius corresponding to the marker and determines the effective parking position, at which the longitudinal centre plane of the hull is parallel to the tangential direction at the marker; working with the navigation and obstacle avoidance module, which adjusts the route plan in real time, it controls the automatic bait casting boat so that it reaches the effective parking position in an effective bait casting attitude. The effective throwing radius is determined by the mounting position of the bait casting device on the hull and by its casting range;
Next, the control module drives the bait casting device to cast bait accurately at the effective parking position, completing one effective bait casting operation;
Finally, using the GPS information of the aquaculture net cages and of the boat acquired by the GPS/IMU module, the navigation and obstacle avoidance module plans a new route and the control module drives the automatic bait casting boat either toward the next aquaculture net cage to continue feeding or back on its return voyage.
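As a concrete but hedged illustration of steps S02-S05 above, the Python/OpenCV sketch below segments a brightly coloured marker with an RGB threshold, keeps the most circular candidate region, and back-projects the circle centre using the stereo depth map. The thresholds, the camera matrix K and the depth map are assumed inputs, and the contour-circularity test stands in for the piecewise arc fitting of S04.

```python
import cv2
import numpy as np

def detect_marker(left_bgr, depth_map, K,
                  lower_bgr=(0, 0, 150), upper_bgr=(80, 80, 255)):
    """Return ((u, v), radius_px, point_cam) for the best circular candidate, or None."""
    blurred = cv2.GaussianBlur(left_bgr, (5, 5), 0)                        # S02: denoise
    mask = cv2.inRange(blurred, np.array(lower_bgr), np.array(upper_bgr))  # S03: colour threshold

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for c in contours:                                                     # S04 (simplified): most circular blob
        area = cv2.contourArea(c)
        if area < 50:
            continue
        (u, v), r = cv2.minEnclosingCircle(c)
        circularity = area / (np.pi * r * r + 1e-9)
        if circularity > 0.7 and (best is None or area > best[0]):
            best = (area, (u, v), r)
    if best is None:
        return None

    (u, v), r = best[1], best[2]
    z = float(depth_map[int(v), int(u)])                                   # S05: depth at the circle centre
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    point_cam = np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])        # back-project into the camera frame
    return (u, v), r, point_cam
```

The camera-frame point can then be mapped into the world frame with the extrinsic parameters, following the coordinate transformations described below.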
In the above step, in which the visual identification module identifies the bait casting marker of the aquaculture net cage and tracks and locates it in real time, the depth measurement process of the binocular camera is as follows (a hedged code sketch follows this list):
1. The binocular camera is calibrated to obtain the intrinsic and extrinsic parameters of the two cameras and their homography (rectification) matrices;
2. The original images are rectified according to the calibration result, so that the two rectified images lie in the same plane and are row-aligned;
3. The pixels of the two rectified images are matched;
4. The depth of each pixel is computed from the matching result, giving a depth map.
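A minimal sketch of this four-step depth pipeline in Python/OpenCV is given below, assuming the stereo calibration results (K1, D1, K2, D2, R, T, image_size) have already been obtained, for example from cv2.stereoCalibrate, and using OpenCV's semi-global block matcher as one possible pixel-matching choice.

```python
import cv2
import numpy as np

def depth_from_stereo(left_gray, right_gray, K1, D1, K2, D2, R, T, image_size):
    # step 2: rectify both views so that corresponding pixels lie on the same row
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, image_size, R, T)
    m1x, m1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, image_size, cv2.CV_32FC1)
    m2x, m2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, image_size, cv2.CV_32FC1)
    rect_l = cv2.remap(left_gray, m1x, m1y, cv2.INTER_LINEAR)
    rect_r = cv2.remap(right_gray, m2x, m2y, cv2.INTER_LINEAR)

    # step 3: match pixels between the rectified images (SGBM returns disparity in 1/16 px units)
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disparity = matcher.compute(rect_l, rect_r).astype(np.float32) / 16.0

    # step 4: depth from disparity; reprojectImageTo3D applies the Q matrix from rectification
    points_3d = cv2.reprojectImageTo3D(disparity, Q)
    return points_3d[:, :, 2]   # per-pixel depth map (Z in the left rectified camera frame)
```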
In addition, the correspondence between the pixel coordinates and the world coordinate system is as follows:
$$
Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= \begin{bmatrix} \tfrac{1}{dx} & 0 & u_0 \\ 0 & \tfrac{1}{dy} & v_0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R & t \\ 0^{\mathrm T} & 1 \end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}
\quad \text{(formula 1)}
$$
the imaging process of the camera involves four coordinate systems: world coordinate system, camera coordinate system, image coordinate system, pixel coordinate system.
World coordinate system:
The absolute coordinate system of the objective three-dimensional world, also called the objective coordinate system. Because the camera is placed somewhere in three-dimensional space, the world coordinate system is needed as a reference to describe the position of the camera and of any other object in the scene; its coordinates are written (X, Y, Z).
Camera coordinate system (optical center coordinate system):
The optical centre of the camera is the origin, the Xc-axis and Yc-axis are parallel to the x-axis and y-axis of the image coordinate system respectively, and the optical axis of the camera is the Zc-axis; coordinates are written (Xc, Yc, Zc).
Image coordinate system:
The centre of the CCD image plane is the origin, and the x-axis and y-axis are parallel to the two perpendicular edges of the image plane; coordinates are written (x, y). The image coordinate system expresses the location of a pixel in the image in physical units (e.g. millimetres).
Pixel coordinate system:
The top-left corner of the CCD image plane is the origin, and the u-axis and v-axis are parallel to the x-axis and y-axis of the image coordinate system; coordinates are written (u, v). The image acquired by the digital camera is first formed as an analog electrical signal and then converted into a digital image by analog-to-digital conversion. Each image is stored as an M × N array, where the value of each element in the M rows and N columns represents the grey level at that image point. Each element is called a pixel, and the pixel coordinate system is the image coordinate system expressed in pixel units.
Suppose an object point in the real world has world coordinates (X, Y, Z) and that a picture taken by the camera shows it at pixel coordinates (u, v). Let its image coordinates be (x, y) and its camera coordinates be (Xc, Yc, Zc). The transformations between the coordinate systems are as follows:
the conversion relationship between the pixel coordinate system and the image coordinate system is as follows:
$$
u = \frac{x}{dx} + u_0, \qquad v = \frac{y}{dy} + v_0
$$
Written in homogeneous coordinates and matrix form, this becomes formula 3:
$$
\begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= \begin{bmatrix} \tfrac{1}{dx} & 0 & u_0 \\ 0 & \tfrac{1}{dy} & v_0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
\quad \text{(formula 3)}
$$
where (u0, v0) are the coordinates of the origin of the image coordinate system in the pixel coordinate system, and dx and dy are the physical dimensions of a pixel in the x and y directions of the image plane, respectively.
The relationship between the image coordinate system and the camera coordinate system is as follows:
$$
x = \frac{f X_c}{Z_c}, \qquad y = \frac{f Y_c}{Z_c}
$$
where f is the focal length (distance of the image plane from the origin of the camera coordinate system).
The above relationship, written in homogeneous coordinates and matrix form, gives formula 5:
$$
Z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
= \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}
\quad \text{(formula 5)}
$$
the relationship between the camera coordinate system and the world coordinate system is as follows:
$$
\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}
= \begin{bmatrix} R & t \\ 0^{\mathrm T} & 1 \end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}
\quad \text{(formula 6)}
$$
where R is a 3 × 3 orthogonal rotation matrix and t is a three-dimensional translation vector.
Combining formulas 3, 5 and 6 gives the relationship between the pixel coordinate system and the world coordinate system stated as formula 1 above.
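As a worked sketch of these relationships (illustrative values only), the function below inverts formulas 3, 5 and 6 in turn: it maps a pixel (u, v) with known camera-frame depth Zc back to world coordinates using the intrinsics (f, dx, dy, u0, v0) and the extrinsics (R, t).

```python
import numpy as np

def pixel_to_world(u, v, Zc, f, dx, dy, u0, v0, R, t):
    # formula 3 inverted: pixel coordinates -> image coordinates (physical units)
    x = (u - u0) * dx
    y = (v - v0) * dy
    # formula 5 inverted: image coordinates -> camera coordinates
    Xc = x * Zc / f
    Yc = y * Zc / f
    P_cam = np.array([Xc, Yc, Zc])
    # formula 6 inverted: camera -> world, since P_cam = R @ P_world + t
    return R.T @ (P_cam - t)

# illustrative extrinsics: camera frame coincides with the world frame
R, t = np.eye(3), np.zeros(3)
print(pixel_to_world(700, 400, Zc=5.0, f=0.004, dx=3e-6, dy=3e-6, u0=640, v0=360, R=R, t=t))
```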
Compared with the prior art, the open sea aquaculture net cage feeding method based on an automatic bait casting boat can accurately feed multiple net cages in an open sea fishing area, simplifies the open sea aquaculture process, saves manpower and automates open sea net cage aquaculture; it is particularly suitable for open sea aquaculture with HDPE floating net cages.
Drawings
FIG. 1 is a logic block diagram of the visual identification module;
FIG. 2 is a schematic diagram of the spatial positions of the automatic bait casting boat and the corresponding effective parking positions.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
The method for feeding open sea aquaculture net cages based on an automatic bait casting boat uses an automatic bait casting boat that carries a bait casting system consisting of a GPS/IMU module, an environment perception module, a navigation and obstacle avoidance module, a visual identification module, a control module and a bait casting device;
the GPS/IMU module acquires the GPS information of the aquaculture net cage and of the automatic bait casting boat, together with the hull attitude information of the boat;
the environment perception module acquires obstacle information; in this embodiment, underwater obstacles are detected with sonar and surface obstacles with lidar;
the navigation and obstacle avoidance module plans the route of the automatic bait casting boat from the obstacle information provided by the environment perception module and the GPS information of the net cage and the boat provided by the GPS/IMU module;
the visual identification module identifies the bait casting marker on the aquaculture net cage and tracks and locates it in real time;
the control module calculates the effective parking position of the automatic bait casting boat from the real-time position of the bait casting marker and controls the boat so that it reaches that position in an effective bait casting attitude; the effective parking position is a position at which the hull lies within the effective throwing radius corresponding to the bait casting marker and the longitudinal centre plane of the hull is parallel to the tangential direction of the cage at the selected marker, the effective throwing radius being determined by the mounting position of the bait casting device on the hull and by its casting range;
and the bait casting device carries out the bait casting itself.
In addition, before the method is carried out, a bait casting marker must be installed on the aquaculture net cage in advance; in this embodiment the marker is a brightly coloured sphere whose size is chosen according to the effective working distance of the binocular camera in the visual identification module.
The specific bait casting steps are as follows:
First, the navigation and obstacle avoidance module plans the route of the automatic bait casting boat from the GPS information of the aquaculture net cage and of the boat acquired by the GPS/IMU module, the GPS information of the net cage being preset; the control module drives the boat toward the net cage, avoiding obstacles according to the obstacle information acquired in real time by the environment perception module;
Second, the visual identification module identifies the bait casting marker on the aquaculture net cage and tracks and locates it in real time, in the following specific steps:
S01, the camera is calibrated to obtain its intrinsic and extrinsic parameters, from which the depth of each pixel and the correspondence between pixel coordinates and the world coordinate system are obtained;
S02, a frame is selected from the left-eye colour image sequence of the binocular camera and denoised as pre-processing;
S03, RGB feature thresholds are set and colour segmentation is performed; regions reaching the thresholds become candidate regions;
S04, edges of the candidate regions are extracted, arcs are fitted piecewise, arcs of the same type are clustered and re-fitted, and the circular target in the image is selected;
S05, the pixel coordinates of the circle centre are mapped into the world coordinate system to determine the distance and direction of the bait casting marker;
S06, weighted histograms of the RGB and geometric features of the bait casting marker are built, Kalman filtering predicts the position of the marker, feature values are matched within the predicted region, and the best-matching region becomes the target position in the next frame (a hedged sketch of this prediction step is given after the procedure);
S07, S06 is repeated to track and locate the bait casting marker in real time;
Third, from the real-time position of the bait casting marker, the control module computes the effective throwing radius corresponding to the marker and determines the effective parking position, at which the longitudinal centre plane of the hull is parallel to the tangential direction at the marker; working with the navigation and obstacle avoidance module, which adjusts the route plan in real time, it controls the automatic bait casting boat so that it reaches the effective parking position in an effective bait casting attitude. The effective throwing radius is determined by the mounting position of the bait casting device on the hull and by its casting range;
Next, the control module drives the bait casting device to cast bait accurately at the effective parking position, completing one effective bait casting operation;
Finally, using the GPS information of the aquaculture net cages and of the boat acquired by the GPS/IMU module, the navigation and obstacle avoidance module plans a new route and the control module drives the automatic bait casting boat either toward the next aquaculture net cage to continue feeding or back on its return voyage.
If the automatic bait casting boat proceeds to the next aquaculture net cage to continue feeding, the above steps are repeated until the boat reaches the next effective parking position in an effective bait casting attitude and completes accurate bait casting there.
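The prediction part of step S06 can be illustrated with a constant-velocity Kalman filter over the marker's pixel position. The sketch below is a hedged example using cv2.KalmanFilter: the state and noise settings are assumed values, and the histogram matching that selects the best region around the prediction is only indicated by comments.

```python
import cv2
import numpy as np

def make_marker_kf(dt=1.0):
    kf = cv2.KalmanFilter(4, 2)                     # state (u, v, du, dv), measurement (u, v)
    kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                    [0, 1, 0, dt],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], dtype=np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], dtype=np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    return kf

kf = make_marker_kf()
kf.statePost = np.array([[320.0], [240.0], [0.0], [0.0]], dtype=np.float32)  # first detection
kf.errorCovPost = np.eye(4, dtype=np.float32)

for (u, v) in [(324, 241), (329, 243), (333, 244)]:   # detected centres in later frames
    predicted = kf.predict()                          # predicted centre of the search window
    # match the RGB / geometric-feature histograms inside a window around
    # (predicted[0], predicted[1]) and take the best-matching region as the detection
    kf.correct(np.array([[u], [v]], dtype=np.float32))
```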
The calculation of the effective parking position is illustrated below, taking a bait casting marker ahead and to starboard of the hull as an example:
As shown in FIG. 2(a), suppose the bait casting marker is known to lie at an angle θ1 to starboard at a distance L1. Then the point at an angle θ1+θ2 to starboard at a distance L3 is the effective parking position computed at this moment; at that point the effective throwing radius corresponding to the bait casting marker is larger than L2, i.e. a boat parked at this position can complete an effective bait casting. The control module therefore adjusts the yaw angle of the boat according to θ1+θ2 in combination with the navigation and obstacle avoidance module;
As shown in FIG. 2(b), if the spatial position of the bait casting marker acquired next is directly ahead of the boat at a distance L1, the point at an angle θ2 to starboard at a distance L3 is the effective parking position computed at this moment, and the effective throwing radius corresponding to the marker is larger than L2; the control module then readjusts the yaw angle of the boat according to θ2 in combination with the navigation and obstacle avoidance module;
As shown in FIG. 2(c), if the spatial position of the bait casting marker acquired next is at an angle θ1 to port at a distance L1, the point at an angle θ2−θ1 to starboard at a distance L3 is the effective parking position computed at this moment, and the effective throwing radius corresponding to the marker is larger than L2; the control module then readjusts the yaw angle of the boat according to θ2−θ1 in combination with the navigation and obstacle avoidance module;
As shown in FIG. 2(d), when the boat captures the bait casting marker for the last time in the video sequence, that is, when the marker is about to leave the camera's field of view, the heading of the boat is finely adjusted according to the distance and bearing of the marker in that last capture, ensuring that the hull comes to rest at an effective parking position in an effective attitude.
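The berthing geometry described above can be sketched as follows, under assumptions that are not taken from the patent: the hull frame has x forward and y to starboard, the net cage is approximated as a circle whose centre position in the hull frame is known, and the boat is to stop at a standoff distance (chosen smaller than the effective throwing radius) from the marker with its heading parallel to the cage tangent at the marker.

```python
import math

def berthing_target(theta, L1, cage_centre, standoff):
    """Marker seen at bearing theta (rad, +starboard) and range L1 in the hull frame.
    Returns the stop point and the yaw change needed to head along the cage tangent."""
    marker = (L1 * math.cos(theta), L1 * math.sin(theta))       # marker position, hull frame

    # outward normal of the circular cage at the marker (from cage centre through marker)
    nx, ny = marker[0] - cage_centre[0], marker[1] - cage_centre[1]
    norm = math.hypot(nx, ny) or 1e-9
    nx, ny = nx / norm, ny / norm

    # stop outside the cage, standoff metres from the marker along the outward normal
    target = (marker[0] + standoff * nx, marker[1] + standoff * ny)

    # heading along one of the two tangent directions (the sign choice is arbitrary here);
    # since the hull frame x-axis is the current heading, this angle is also the yaw change
    yaw_change = math.atan2(nx, -ny)
    return target, yaw_change

# illustrative call: marker 12 m away, 20 degrees to starboard, assumed cage centre location
print(berthing_target(math.radians(20), 12.0, cage_centre=(18.0, 10.0), standoff=4.0))
```

In the method itself the yaw is re-adjusted each time a new marker observation arrives, as in cases (a) to (d) above; the sketch shows only a single such update.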

Claims (1)

1. An open sea aquaculture net cage bait casting method based on an automatic bait casting boat, the automatic bait casting boat carrying a bait casting system consisting of a GPS/IMU module, an environment perception module, a navigation and obstacle avoidance module, a visual identification module, a control module and a bait casting device, and the bait casting marker on the aquaculture net cage being a brightly coloured sphere whose size is chosen according to the effective working distance of the binocular camera in the visual identification module; the method is characterized by comprising the following steps:
First, from the GPS information of the aquaculture net cage and of the automatic bait casting boat acquired by the GPS/IMU module, the navigation and obstacle avoidance module plans the route of the boat; the control module drives the boat toward the net cage, avoiding obstacles according to the obstacle information acquired in real time by the environment perception module;
Second, the visual identification module identifies the bait casting marker on the aquaculture net cage and tracks and locates it in real time, in the following specific steps:
S01, the camera is calibrated to obtain its intrinsic and extrinsic parameters, from which the depth of each pixel and the correspondence between pixel coordinates and the world coordinate system are obtained;
S02, a frame is selected from the left-eye colour image sequence of the binocular camera and denoised as pre-processing;
S03, RGB feature thresholds are set and colour segmentation is performed; regions reaching the thresholds become candidate regions;
S04, edges of the candidate regions are extracted, arcs are fitted piecewise, arcs of the same type are clustered and re-fitted, and the circular target in the image is selected;
S05, the pixel coordinates of the circle centre are mapped into the world coordinate system to determine the distance and direction of the bait casting marker;
S06, weighted histograms of the RGB and geometric features of the bait casting marker are built, Kalman filtering predicts the position of the marker, feature values are matched within the predicted region, and the best-matching region becomes the target position in the next frame;
S07, S06 is repeated to track and locate the bait casting marker in real time;
Third, from the real-time position of the bait casting marker, the control module computes the effective throwing radius corresponding to the marker and determines the effective parking position, at which the longitudinal centre plane of the hull is parallel to the tangential direction at the marker; working with the navigation and obstacle avoidance module, which adjusts the route plan in real time, it controls the automatic bait casting boat so that it reaches the effective parking position in an effective bait casting attitude. The effective throwing radius is determined by the mounting position of the bait casting device on the hull and by its casting range;
Next, the control module drives the bait casting device to cast bait accurately at the effective parking position, completing one effective bait casting operation;
Finally, using the GPS information of the aquaculture net cages and of the boat acquired by the GPS/IMU module, the navigation and obstacle avoidance module plans a new route and the control module drives the automatic bait casting boat either toward the next aquaculture net cage to continue feeding or back on its return voyage.
CN201910625650.8A 2019-07-11 2019-07-11 Open sea aquaculture net cage bait feeding method based on automatic bait feeding boat Active CN110393165B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910625650.8A CN110393165B (en) 2019-07-11 2019-07-11 Open sea aquaculture net cage bait feeding method based on automatic bait feeding boat

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910625650.8A CN110393165B (en) 2019-07-11 2019-07-11 Open sea aquaculture net cage bait feeding method based on automatic bait feeding boat

Publications (2)

Publication Number Publication Date
CN110393165A CN110393165A (en) 2019-11-01
CN110393165B (en) 2021-06-25

Family

ID=68324564

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910625650.8A Active CN110393165B (en) 2019-07-11 2019-07-11 Open sea aquaculture net cage bait feeding method based on automatic bait feeding boat

Country Status (1)

Country Link
CN (1) CN110393165B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111897350A (en) * 2020-07-28 2020-11-06 谈斯聪 Underwater robot device, and underwater regulation and control management optimization system and method
CN115616597A (en) * 2022-09-14 2023-01-17 长春理工大学 Unmanned ship fog-penetrating imaging obstacle avoidance device and using method thereof

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202583169U (en) * 2012-05-17 2012-12-05 马炼堪 Water surface pollution detecting and tracking robot
BR202013028855Y1 (en) * 2013-11-08 2019-12-31 Work Station Comercio De Pecas Ltda Me provision introduced in stand up board
CN107316319B (en) * 2017-05-27 2020-07-10 北京小鸟看看科技有限公司 Rigid body tracking method, device and system
CN109795629A (en) * 2019-01-22 2019-05-24 深兰科技(上海)有限公司 A kind of unmanned cargo ship
CN109993113B (en) * 2019-03-29 2023-05-02 东北大学 Pose estimation method based on RGB-D and IMU information fusion

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102779347A (en) * 2012-06-14 2012-11-14 清华大学 Method and device for tracking and locating target for aircraft
CN105225251A (en) * 2015-09-16 2016-01-06 三峡大学 Over the horizon movement overseas target based on machine vision identifies and locating device and method fast
WO2017143589A1 (en) * 2016-02-26 2017-08-31 SZ DJI Technology Co., Ltd. Systems and methods for visual target tracking
CN107128445A (en) * 2017-04-06 2017-09-05 北京臻迪科技股份有限公司 A kind of unmanned boat
CN109407697A (en) * 2018-09-20 2019-03-01 北京机械设备研究所 A kind of unmanned plane pursuit movement goal systems and method based on binocular distance measurement
CN109479787A (en) * 2018-11-19 2019-03-19 华南农业大学 A kind of unmanned feeding ship and feeding method of navigating
CN109859202A (en) * 2019-02-18 2019-06-07 哈尔滨工程大学 A kind of deep learning detection method based on the tracking of USV water surface optical target

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on a bait casting system based on fish school feeding behaviour; Jia Chenggong et al.; Jixie Gongchengshi (Mechanical Engineer); 2017-08-31 (No. 8); pp. 23-25, 28 *

Also Published As

Publication number Publication date
CN110393165A (en) 2019-11-01

Similar Documents

Publication Publication Date Title
CN107844750B (en) Water surface panoramic image target detection and identification method
CN109283538B (en) Marine target size detection method based on vision and laser sensor data fusion
CN110414396B (en) Unmanned ship perception fusion algorithm based on deep learning
KR102530691B1 (en) Device and method for monitoring a berthing
US11900668B2 (en) System and method for identifying an object in water
CN113850848B (en) Marine multi-target long-term detection and tracking method based on cooperation of unmanned ship carrying navigation radar and visual image
CN113627473B (en) Multi-mode sensor-based water surface unmanned ship environment information fusion sensing method
CN110393165B (en) Open sea aquaculture net cage bait feeding method based on automatic bait feeding boat
KR102265980B1 (en) Device and method for monitoring ship and port
CN107527368B (en) Three-dimensional space attitude positioning method and device based on two-dimensional code
KR102530847B1 (en) Method and device for monitoring harbor and ship
KR102520844B1 (en) Method and device for monitoring harbor and ship considering sea level
CN110673622B (en) Unmanned aerial vehicle automatic carrier landing guiding method and system based on visual images
CN116087982A (en) Marine water falling person identification and positioning method integrating vision and radar system
CN109202911B (en) Three-dimensional positioning method for cluster amphibious robot based on panoramic vision
Woo et al. Obstacle avoidance and target search of an Autonomous Surface Vehicle for 2016 Maritime RobotX challenge
CN113933828A (en) Unmanned ship environment self-adaptive multi-scale target detection method and system
CN112862862B (en) Aircraft autonomous oil receiving device based on artificial intelligence visual tracking and application method
Xing et al. Quadrotor vision-based localization for amphibious robots in amphibious area
KR20220055556A (en) Device and method for monitoring ship and port
CN105785990B (en) Ship mooring system and obstacle recognition method based on panoramic looking-around
CN117470249B (en) Ship anti-collision method and system based on laser point cloud and video image fusion perception
CN113190047B (en) Unmanned aerial vehicle group path recognition method based on two-dimensional plane
JP2023103836A (en) Water area object detection system, and ship and surrounding object detection system
Kushnerik et al. Small AUV docking algorithms near dock unit based on visual data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant