CN112651958A - Aquaculture environment real-time online monitoring method based on Internet of things and artificial intelligence and big data cloud platform - Google Patents


Info

Publication number
CN112651958A
CN112651958A
Authority
CN
China
Prior art keywords
sub
reflection
time period
floater
regulation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011640007.1A
Other languages
Chinese (zh)
Inventor
卓喜龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202011640007.1A
Publication of CN112651958A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/583 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/02 Agriculture; Fishing; Mining
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30181 Earth observation

Abstract

The real-time online monitoring method for the aquaculture environment based on the Internet of Things and artificial intelligence, together with the big data cloud platform, monitors and analyzes the water surface environment of aquaculture through the water surface image processing module, the regional environment monitoring module and the processing server, solving the one-sidedness of aquaculture environment monitoring results and effectively avoiding the damage to water quality caused by floaters remaining on the water surface for too long. Meanwhile, through careful analysis of the reflection images in the collected images and the fog concentration in the environment of each sub-region, the reflection coefficient of each reflection sub-region in each reflection time period and the visible coefficient of each sub-region in each acquisition time period are counted, and the cameras of the sub-regions that need adjustment are automatically adjusted accordingly, which greatly improves the efficiency of aquaculture environment monitoring while effectively guaranteeing the authenticity and reference value of the monitoring results.

Description

Aquaculture environment real-time online monitoring method based on Internet of things and artificial intelligence and big data cloud platform
Technical Field
The invention belongs to the technical field of aquaculture environment monitoring, and relates to an aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence and a big data cloud platform.
Background
With the rapid development of the economy and the improvement of people's living standards, the scale of the aquaculture industry keeps expanding, and the industry has a very important influence on the economic development of coastal areas; monitoring of the aquaculture environment is therefore also very important.
Current aquaculture environment monitoring technology concentrates on detecting internal environmental factors such as water pH value, temperature and dissolved oxygen, while neglecting the influence of external environmental factors on aquaculture. The detection results of existing technology are therefore one-sided, so the authenticity and reference value of aquaculture environment monitoring cannot be guaranteed, and monitoring efficiency cannot be effectively improved. To solve these problems, an aquaculture environment real-time online monitoring method and a big data cloud platform based on the Internet of Things and artificial intelligence are proposed for aquaculture water surface environment monitoring.
Disclosure of Invention
The invention aims to provide an aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence and a big data cloud platform.
The purpose of the invention can be realized by the following technical scheme:
in a first aspect, the invention provides an aquaculture environment real-time online monitoring method based on the internet of things and artificial intelligence, which needs to use an aquaculture environment real-time online monitoring system based on the internet of things and artificial intelligence in a specific implementation process, and the system comprises an area dividing module, a water surface image acquisition module, a water surface image processing module, a floater environment acquisition module, an area environment detection module, an intelligent regulation and control module, an information preprocessing module, an analysis and processing module, a database, a remote control center and a background display terminal;
the analysis and processing module is respectively connected with the water surface image processing module, the region environment detection module, the database, the intelligent regulation and control module, the remote control center and the background display terminal, the water surface image acquisition module is respectively connected with the region division module and the water surface image processing module, and the floater environment acquisition module is respectively connected with the water surface image processing module and the information preprocessing module;
the area division module is used for carrying out area division on the aquaculture water surface area according to a plane net format to obtain divided sub-areas, numbering the sub-areas according to a preset sequence, and sequentially marking the sub-areas as 1,2,. once, j,. m;
the water surface image acquisition module comprises a plurality of monitoring devices which are respectively arranged in each subregion, and according to a preset acquisition time period, the cameras in the monitoring devices are used for acquiring the water surface images of each subregion to obtain the water surface images of each subregion in each acquisition time period, so that a water surface image set F of each subregion in each acquisition time period is constructedrt(Fr1,Fr2,...Fri,...Fr6) R represents the number of each sub-region, r is 1,2, a.. j.. m, t represents an acquisition time period, and t is 1,2, a.. i.. 6, and the water surface image set of each sub-region in each acquisition time period is sent to the water surface image processing module;
the water surface image processing module is used for receiving a water surface image set of each sub-region in each acquisition time period sent by the water surface image acquisition module, wherein the water surface image processing module comprises a water surface floater image processing module and a water surface reflection image processing module;
the water surface floater image processing module is used for processing the collected water surface images of all sub-areas in all the collection time periods so as to judge whether floaters exist in the images, and the specific processing steps are as follows:
a1, acquiring a water surface image of each sub-area in each acquisition time period;
a2, comparing the collected water surface image of each sub-region in each acquisition time period with the standard water surface image of the sub-region and analyzing whether an abnormality exists; if the water surface image of a sub-region in a certain acquisition time period is abnormal, marking the sub-region as a floater sub-region, and recording the number of the floater sub-region and the acquisition time period in which the floater appears;
a3, focusing the water surface image corresponding to the sub-area of the floating object on the area where the floating object is located, further extracting the characteristics of the floating object, simultaneously obtaining the geographic position of the floating object, and sending the geographic position of the floating object to a floating object environment acquisition module;
a4, sending the characteristics of the floater, the number of the floater sub-region and the occurrence time of the floater to the analysis and processing module;
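Steps a1 to a4 amount to comparing each captured frame against a stored standard image. A minimal sketch of the abnormality check in step a2, assuming grayscale images and hypothetical difference/area thresholds (the patent does not specify the comparison algorithm):

```python
import numpy as np

# Sketch of step a2: flag a sub-region as a "floater sub-region" when its
# current water-surface image differs from the stored standard image by more
# than an assumed pixel-difference threshold over an assumed minimum area.

def detect_floater(current, standard, diff_threshold=30, area_threshold=50):
    """Return (has_floater, mask) for two grayscale uint8 arrays."""
    diff = np.abs(current.astype(np.int16) - standard.astype(np.int16))
    mask = diff > diff_threshold          # pixels deviating from the standard
    return bool(mask.sum() > area_threshold), mask

standard = np.full((100, 100), 120, dtype=np.uint8)   # calm-water reference
current = standard.copy()
current[40:50, 40:50] = 250                            # bright floating object
has_floater, mask = detect_floater(current, standard)
```

The boolean mask then gives the region of interest for step a3 (feature extraction and geolocation of the floater).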
the water surface reflection image processing module is used for processing the water surface image of each subarea in each acquisition time period so as to judge whether reflection exists in the image, counting the area of the water surface outline in the water surface image of each subarea in each acquisition time period, analyzing whether abnormality exists, if the water surface image of a subarea in a certain acquisition time period has a non-water surface outline, marking the subarea as a reflection subarea, counting the number of the reflection subarea, marking the acquisition time period as a reflection time period, further extracting the water surface image corresponding to the number of the reflection subarea, marking the water surface image as a reflection image, and sending the number of each reflection subarea in each reflection time period and each reflection image to the analysis and processing;
the floater environment acquisition module is used for receiving the geographical position of each floater sent by the water surface floater image processing module, and comprises a plurality of floater environment acquisition units which are used for acquiring the wind speed and the wind direction in the environment parameters of the subareas of the floaters so as to send the geographical position of each floater and the wind speed and the wind direction corresponding to each floater subarea to the information preprocessing module;
the regional environment detection module comprises a plurality of fog sensors for monitoring fog concentration in each sub-region of each acquisition time period, each fog sensor is respectively arranged on the surface of each regional camera, and each acquisition time is carried out through each fog sensorMonitoring the fog concentration in each subregion in each time period, acquiring the fog concentration in each subregion in each acquisition time period, and further constructing a fog concentration set W in each subregion in each acquisition time periodrt(Wr1,Wr2,...Wri,...Wr6) R represents the number of each sub-region, t represents an acquisition time period, and t is 1,2,. i,. 6, and the mist concentration set in each sub-region of each acquisition time period is sent to an analysis and processing module;
the information preprocessing module is used for receiving the geographic position of each floater sent by the floater environment acquisition module and the wind speed and the wind direction corresponding to each floater sub-region, recording the wind speed of each floater sub-region as V, counting the distance between the position of a worker and the geographic position of each floater, acquiring the predicted time of the worker to reach the geographic position of the floater, recording the predicted time as T, predicting the moving distance of the floater when the worker reaches the position according to the wind speed of each floater sub-region and the predicted time of the worker to reach the region, predicting the position of the floater according to the wind direction of each floater sub-region and the predicted moving distance of the floater, and sending the predicted position of the floater to the remote control center;
the analysis and processing module is used for receiving the characteristics of the floater, the number of the subarea of the floater and the occurrence time of the floater, which are sent by the image processing module of the floater on the water surface, comparing the characteristics of the floater with the characteristics of the floater types stored in the database, further screening out the floater type corresponding to the floater in the subarea of the floater, and sending the floater type corresponding to the floater, the occurrence time of the floater and the number of the subarea of the floater to the remote control center;
the analysis and processing module is used for receiving the numbers of the light reflecting sub-regions in the light reflecting time periods and the light reflecting images sent by the water surface light reflecting image processing module, acquiring the areas of the light reflecting regions in the light reflecting images according to the proportion of the light reflecting regions in the light reflecting images of the light reflecting sub-regions in the light reflecting time periods to the light reflecting images, and recording the areas as Xrt, further counting the reflection coefficient of the reflection sub-area in each reflection time periodComparing the reflection coefficient of the sub-region of the light reflection in each reflection time period with the standard reflection coefficient corresponding to the sub-region, if the reflection coefficient of a certain reflection time period of a certain reflection sub-region is greater than the standard reflection coefficient threshold of the sub-region, marking the sub-region of the light reflection as a reflection regulation sub-region, marking the reflection time period as a reflection regulation time period, counting the number of the reflection regulation sub-region and the corresponding reflection regulation time period, and further sending the number of the reflection regulation sub-region, the reflection regulation time period corresponding to the reflection regulation sub-region and the reflection coefficient of the reflection regulation sub-region corresponding to the reflection regulation time period to the intelligent regulation module;
the analysis and processing module is used for receiving the fog concentration set of each subarea of each acquisition time period sent by the regional environment detection module, comparing the fog concentration of each subarea of each acquisition time period with the standard fog concentration corresponding to each subarea of each acquisition time period, and constructing the fog concentration comparison set delta W of each subarea of each acquisition time periodrt(ΔWr1,ΔWr2,...ΔWri,...ΔWr6),ΔWrt represents the difference value between the fog concentration of the r-th area in the t-th collection time period and the standard fog concentration corresponding to the r-th area in the t-th collection time period, r represents a number, t represents a collection time period, and t is 1,2,. i,. 6, then counting the visible coefficient of each sub-area in each acquisition time period, comparing the visible coefficient of each sub-area in each acquisition time period with the sub-area standard visible coefficient, if the visible coefficient of a certain acquisition time period of a certain sub-area is less than the sub-area standard visible coefficient, marking the sub-area as a focal length regulation sub-area, the collection time period is recorded as a focal length regulation time period, at this time, the number of the focal length regulation subarea and the corresponding focal length regulation time period are counted, then sending the number of the focal length regulating and controlling subarea, the focal length regulating and controlling time period corresponding to the focal length regulating and controlling subarea and the visual coefficient corresponding to the focal length regulating and controlling subarea in the focal length regulating and controlling time period to an intelligent regulating and controlling module;
the method comprises the steps of counting the reflection coefficient of each reflection sub-area in each reflection time period and the visual coefficient of each sub-area in each collection time period to further count the comprehensive influence coefficient of environmental monitoring, and sending the comprehensive influence coefficient of environmental monitoring to a background terminal;
the intelligent regulation and control module is used for receiving the number of the reflection regulation and control sub-region sent by the analysis and processing module, the reflection regulation and control time period corresponding to the reflection regulation and control sub-region and the reflection coefficient corresponding to the reflection regulation and control sub-region in the reflection regulation and control time period, and the angle regulation and control mode corresponding to the reflection coefficient is called from the database according to the reflection coefficient corresponding to the reflection regulation and control sub-region in the reflection regulation and control time period, so that the camera corresponding to the reflection regulation and control sub-region is called to automatically regulate and control in the reflection regulation and control time period according to the angle regulation and control mode corresponding to the reflection coefficient of the reflection regulation and control sub;
meanwhile, the intelligent control module is also used for receiving, analyzing and processing the number of the focal length control subarea sent by the module, the focal length control time period corresponding to the focal length control subarea and the visual coefficient of the focal length control subarea corresponding to the focal length control time period, and controlling the focal length control mode corresponding to the visual coefficient from the database according to the visual coefficient of the focal length control subarea corresponding to the focal length control time period, so as to control the camera corresponding to the focal length control subarea to perform automatic control in the focal length control time period according to the focal length control mode corresponding to the visual coefficient of the focal length control subarea in the focal length control time period;
the database is used for storing standard water surface images corresponding to all sub-regions in all acquisition time periods, the variety characteristics of all floaters, the standard light reflection coefficient of each sub-region, the standard fog concentration of each sub-region, the standard visible coefficient of each sub-region, the focus regulation and control mode corresponding to each visible coefficient and the angle regulation and control mode corresponding to each light reflection coefficient;
the remote control center is used for receiving the floater type, the floater occurrence time, the estimated position of the floater and the floater subarea number which are sent by the analysis and processing module and the information and processing module and correspond to the floater, and then dispatching related staff for processing;
the background terminal is used for receiving the environmental monitoring comprehensive influence coefficient sent by the analysis and processing module and displaying the environmental monitoring comprehensive influence coefficient in real time;
when the real-time online monitoring method for the aquaculture environment based on the Internet of things and artificial intelligence is adopted, the method comprises the following steps;
s1, dividing culture water surface areas: carrying out region division on the culture water surface to obtain each divided sub-region;
s2, water surface image acquisition: installing monitoring equipment in each subregion, and acquiring water surface images of each subregion through each monitoring equipment;
s3, water surface image processing: extracting images of reflected light and floats appearing in the collected water surface images of the subregions, processing the images of the reflected light and the floats to obtain images of the floats of the subregions in each collection time period and images of the reflected light of the subregions in each collection time period, and further obtaining the wind direction and the wind speed of the region where the floats are located, the position of the floats, characteristic images of the floats, the time for the floats to appear and the numbers of the subregions of the floats;
s4, floater environment acquisition: collecting the wind speed and wind direction in the environment parameters of the floater sub-region;
s5, monitoring regional environment: carrying out environment monitoring on each subregion to obtain the fog concentration of each subregion in each acquisition time period;
s6, information preprocessing: acquiring the estimated position of the floater according to the wind direction and the wind speed of the area where the floater is located and the position of the floater;
s7, analysis and processing: analyzing and processing the characteristic images of the floaters in each sub-area of each acquisition time period, the reflection images of the reflection sub-areas of each reflection time period and the fog concentration of each sub-area of each acquisition time period to obtain the floaters type, the numbers of the reflection regulation sub-areas, the reflection regulation time periods corresponding to the reflection regulation sub-areas, the reflection coefficients corresponding to the reflection regulation sub-areas in the reflection regulation time periods, the numbers of the focus regulation sub-areas, the focus regulation time periods corresponding to the focus regulation sub-areas and the visual coefficients corresponding to the focus regulation sub-areas in the focus regulation time periods;
s8, intelligent regulation: according to the reflection coefficient of the reflection regulation sub-region in its reflection regulation time period and the visible coefficient of the focal length regulation sub-region in its focal length regulation time period, the angle regulation mode corresponding to the reflection coefficient and the focal length regulation mode corresponding to the visible coefficient are respectively retrieved from the database, and each camera that needs regulation is automatically regulated in the corresponding reflection regulation time period and focal length regulation time period according to the corresponding angle regulation mode and focal length regulation mode;
s9, remote control: dispatching related workers to process according to the obtained floating object type corresponding to the floating object, the occurrence time of the floating object, the number of the sub-area of the floating object and the estimated position of the floating object;
s10, data display: the environmental monitoring comprehensive influence coefficient is counted according to the reflection coefficient of each reflection sub-region in each reflection time period and the visible coefficient of each sub-region in each acquisition time period, and displayed in real time on the background display terminal.
Furthermore, the monitoring devices comprise a plurality of cameras and a plurality of GPS locators, wherein each camera is respectively installed above the water surface of each sub-area, each sub-area corresponds to one camera for collecting images of the water surface of each sub-area, and each GPS locator is respectively installed inside each camera for acquiring the position of the area where each camera is located.
Furthermore, the plurality of floater environment acquisition units are a plurality of ultrasonic wind speed sensors, wherein each ultrasonic wind speed sensor is respectively installed on a fixing support corresponding to each camera and used for monitoring the wind direction and the wind speed of each subarea.
Further, the estimated moving distance of the floater is calculated as L = V × T, where V represents the wind speed of the region where the floater is located, T represents the time the worker is expected to need to reach the region where the floater is located, and L represents the estimated moving distance of the floater.
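The drift prediction described above can be sketched directly; the planar coordinate system and the wind-direction convention (degrees measured counter-clockwise from east) are assumptions for illustration, not conventions stated in the patent:

```python
import math

# Sketch of the drift prediction: L = V * T gives the estimated travel
# distance, and the wind direction (degrees, 0 = east, counter-clockwise,
# an assumed convention) gives the displacement components.

def predict_float_position(x, y, wind_speed, eta_seconds, wind_dir_deg):
    distance = wind_speed * eta_seconds            # L = V * T
    rad = math.radians(wind_dir_deg)
    return x + distance * math.cos(rad), y + distance * math.sin(rad)

# Floater at (10 m, 20 m), wind 0.5 m/s blowing due east, worker ETA 60 s:
px, py = predict_float_position(10.0, 20.0, 0.5, 60.0, 0.0)
```

The predicted coordinates correspond to the "estimated position of the floater" that the information preprocessing module sends to the remote control center.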
Further, the calculation formula of the reflection coefficient of each reflection sub-region in each reflection time period is

Φrt = Xrt / Grt

where Φrt represents the reflection coefficient corresponding to the r-th reflection sub-region in the t-th reflection time period, Grt represents the area of the water surface reflection image corresponding to the r-th reflection sub-region in the t-th reflection time period, and Xrt represents the area of the reflecting region corresponding to the r-th reflection sub-region in the t-th reflection time period.
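A minimal sketch of the reflection-coefficient check follows. Treating the coefficient as the ratio of glare area Xrt to image area Grt is an assumption here, since the original formula appears only as an embedded image, and the threshold value is hypothetical:

```python
# Sketch: compute a reflection coefficient as the fraction of the image area
# occupied by glare (an assumed reconstruction of the patent's formula) and
# flag sub-regions that exceed their standard threshold for angle regulation.

def reflection_coefficient(reflect_area, image_area):
    return reflect_area / image_area

def needs_angle_regulation(reflect_area, image_area, standard_threshold):
    return reflection_coefficient(reflect_area, image_area) > standard_threshold

phi = reflection_coefficient(1500.0, 10000.0)    # 15% of the frame is glare
flag = needs_angle_regulation(1500.0, 10000.0, 0.10)
```

A flagged sub-region and its time period would then be forwarded to the intelligent regulation module, as in the module description above.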
Further, the calculation formula of the visible coefficient of each sub-region in each acquisition time period is

ψrt = Wrt^standard / Wrt

where ψrt represents the visible coefficient corresponding to the r-th sub-region in the t-th acquisition time period, Wrt represents the fog concentration of the r-th sub-region in the t-th acquisition time period, and Wrt^standard represents the standard fog concentration corresponding to the r-th sub-region in the t-th acquisition time period.
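A similar sketch for the visibility check; the ratio form (standard fog concentration over measured fog concentration) is an assumed reconstruction, since the original formula is an embedded image, and all numeric values are hypothetical:

```python
# Sketch: a visibility coefficient that falls as the measured fog
# concentration rises above the standard value (assumed reconstruction),
# flagging sub-regions whose coefficient drops below the standard visible
# coefficient for focal-length regulation.

def visible_coefficient(fog_measured, fog_standard):
    return fog_standard / fog_measured

def needs_focus_regulation(fog_measured, fog_standard, standard_visible):
    return visible_coefficient(fog_measured, fog_standard) < standard_visible

psi = visible_coefficient(2.0, 1.0)     # fog at twice the standard level
flag = needs_focus_regulation(2.0, 1.0, 0.8)
```

As with the reflection check, a flagged sub-region and time period would be sent on to the intelligent regulation module.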
Further, the calculation formula of the environmental monitoring comprehensive influence coefficient is

λ = (1 / (6m)) × Σ (r = 1 to m) Σ (t = 1 to 6) (Φrt + ψrt)

where m represents the total number of sub-regions and r represents the number of each sub-region.
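The aggregation can be sketched as an average over all sub-regions and time periods; the averaging rule is an assumption, since the original formula is an embedded image:

```python
# Sketch: aggregate per-sub-region, per-period reflection coefficients (phi)
# and visible coefficients (psi) into one comprehensive influence coefficient
# by averaging over all (r, t) pairs (assumed aggregation rule).

def comprehensive_coefficient(phi, psi):
    """phi, psi: dicts keyed by (r, t) with matching keys."""
    total = sum(phi[k] + psi[k] for k in phi)
    return total / (2 * len(phi))

phi = {(1, 1): 0.1, (1, 2): 0.3}
psi = {(1, 1): 0.9, (1, 2): 0.7}
lam = comprehensive_coefficient(phi, psi)
```

This single number is what the background display terminal would show in real time in step s10.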
In a second aspect, the invention further provides a big data cloud platform, which includes a processor, a machine-readable storage medium, and a network interface, where the machine-readable storage medium, the network interface, and the processor are connected through a bus system, the network interface is used for being communicatively connected with at least one aquaculture environment real-time online monitoring terminal, the machine-readable storage medium is used for storing programs, instructions, or codes, and the processor is used for executing the programs, instructions, or codes in the machine-readable storage medium to execute the aquaculture environment real-time online monitoring method based on the internet of things and artificial intelligence.
In view of this, the present invention has the following advantageous effects:
(1) According to the real-time online monitoring method for the aquaculture environment based on the Internet of Things and artificial intelligence, the water surface image processing module, the regional environment detection module and the processing server are combined to finely monitor and analyze the water surface and the water surface regional environment of aquaculture, solving the one-sidedness of aquaculture environment monitoring results, effectively guaranteeing the authenticity and reference value of the monitoring results, and greatly improving monitoring efficiency.
(2) In the information preprocessing module, the invention estimates the moving distance of the floater by the time the staff arrives at the area, by analyzing the wind direction and wind speed of the area where the floater is located and the staff's estimated arrival time, and thereby estimates the position of the floater. This provides accurate position information for the staff, effectively improves their working efficiency, and effectively avoids the damage to water quality caused by floaters remaining on the water surface for too long.
(3) By automatically regulating the angle and focal length of the monitoring devices in the intelligent regulation module, the invention greatly improves detection efficiency during monitoring, and effectively avoids image defects or blurred images caused by reflection or excessively high fog concentration in the environment of the device, thereby greatly improving the stability of the monitoring results.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a flow chart of the steps of the method of the present invention;
FIG. 2 is a schematic diagram showing the connection of the modules of the system of the present invention;
FIG. 3 is a schematic diagram of the connection of the water surface image processing module according to the present invention.
Detailed Description
While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.
Referring to fig. 1, in a first aspect, the invention provides an aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence, which comprises the following steps:
s1, dividing culture water surface areas: carrying out region division on the culture water surface to obtain each divided sub-region;
s2, water surface image acquisition: installing monitoring equipment in each subregion, and acquiring water surface images of each subregion through each monitoring equipment;
s3, water surface image processing: extracting images of reflected light and floats appearing in the collected water surface images of the subregions, processing the images of the reflected light and the floats to obtain images of the floats of the subregions in each collection time period and images of the reflected light of the subregions in each collection time period, and further obtaining the wind direction and the wind speed of the region where the floats are located, the position of the floats, characteristic images of the floats, the time for the floats to appear and the numbers of the subregions of the floats;
s4, floater environment acquisition: collecting the wind speed and the wind direction in the environmental parameters of the sub-area where the floater is located;
s5, monitoring regional environment: carrying out environment monitoring on each subregion to obtain the fog concentration of each subregion in each acquisition time period;
s6, information preprocessing: acquiring the estimated position of the floater according to the wind direction and the wind speed of the area where the floater is located and the position of the floater;
s7, analysis and processing: analyzing and processing the characteristic images of the floaters in each sub-area of each acquisition time period, the reflection images of the reflection sub-areas of each reflection time period and the fog concentration of each sub-area of each acquisition time period to obtain the floaters type, the numbers of the reflection regulation sub-areas, the reflection regulation time periods corresponding to the reflection regulation sub-areas, the reflection coefficients corresponding to the reflection regulation sub-areas in the reflection regulation time periods, the numbers of the focus regulation sub-areas, the focus regulation time periods corresponding to the focus regulation sub-areas and the visual coefficients corresponding to the focus regulation sub-areas in the focus regulation time periods;
s8, intelligent regulation and control: according to the reflection coefficient corresponding to the reflection regulation and control sub-region in the reflection regulation and control time period and the visual coefficient corresponding to the focal length regulation and control sub-region in the focal length regulation and control time period, respectively calling the angle regulation and control mode corresponding to the reflection coefficient and the focal length regulation and control mode corresponding to the visual coefficient from the database, and automatically carrying out angle regulation and control and focal length regulation and control on each camera needing regulation according to the corresponding mode in the corresponding reflection regulation and control time period and focal length regulation and control time period;
s9, remote control: dispatching related workers to process according to the obtained floating object type corresponding to the floating object, the occurrence time of the floating object, the number of the sub-area of the floating object and the estimated position of the floating object;
s10, data display: and counting the comprehensive influence coefficient of the environmental monitoring according to the reflection coefficient of each reflection sub-region in each reflection time period and the visual coefficient of each sub-region in each acquisition time period, and displaying in real time in a background.
Referring to fig. 2 to 3, in the implementation process of the real-time online monitoring method for aquaculture environment based on internet of things and artificial intelligence, an aquaculture environment real-time online monitoring system based on internet of things and artificial intelligence is needed, and the system comprises a region dividing module, a water surface image acquisition module, a water surface image processing module, a floater environment acquisition module, a region environment detection module, an intelligent control module, an information preprocessing module, an analysis and processing module, a database, a remote control center and a background display terminal;
the analysis and processing module is respectively connected with the water surface image processing module, the region environment detection module, the database, the intelligent regulation and control module, the remote control center and the background display terminal, the water surface image acquisition module is respectively connected with the region division module and the water surface image processing module, and the floater environment acquisition module is respectively connected with the water surface image processing module and the information preprocessing module;
the area division module is used for carrying out area division on the aquaculture water surface area according to a planar net format to obtain divided sub-areas, numbering the sub-areas according to a preset sequence, and sequentially marking the sub-areas as 1, 2, ..., j, ..., m;
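The planar-net division of step S1 can be sketched as follows. This is a minimal illustration only: the patent specifies neither the surface geometry nor the grid dimensions, so the rectangular bounds and the `rows`/`cols` parameters here are assumptions; only the row-major numbering 1..m follows the description.

```python
# Sketch of the planar-grid region division (step S1). The rectangular water
# surface and the grid dimensions (rows, cols) are illustrative assumptions;
# the patent only specifies a planar-net division numbered 1..m in sequence.

def divide_water_surface(width_m, height_m, rows, cols):
    """Divide a rectangular surface into rows*cols numbered sub-regions.

    Returns a list of (number, x_min, y_min, x_max, y_max) tuples,
    numbered 1..m in row-major order.
    """
    cell_w = width_m / cols
    cell_h = height_m / rows
    sub_regions = []
    number = 1
    for row in range(rows):
        for col in range(cols):
            sub_regions.append((number,
                                col * cell_w, row * cell_h,
                                (col + 1) * cell_w, (row + 1) * cell_h))
            number += 1
    return sub_regions

# A 100 m x 60 m surface split into a 3x4 grid yields m = 12 sub-regions:
regions = divide_water_surface(100.0, 60.0, rows=3, cols=4)
```

Each camera of the water surface image acquisition module would then be associated with one of these numbered cells.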
the water surface image acquisition module comprises a plurality of monitoring devices which are respectively arranged in each subregion, and according to a preset acquisition time period, the cameras in the monitoring devices are used for acquiring the water surface images of each subregion to obtain the water surface images of each subregion in each acquisition time period, so that a water surface image set F of each subregion in each acquisition time period is constructedrt(Fr1,Fr2,...Fri,...Fr6) R represents the number of each sub-region, r is 1,2, a.. j.. m, t represents an acquisition time period, and t is 1,2, a.. i.. 6, and the water surface image set of each sub-region in each acquisition time period is sent to the water surface image processing module;
the water surface image processing module is used for receiving the water surface image sets of all sub-regions in all the acquisition time periods sent by the water surface image acquisition module, wherein the water surface image processing module comprises a water surface floater image processing module and a water surface reflection image processing module;
the water surface floater image processing module is used for processing the collected water surface images of all sub-areas in all the collection time periods so as to judge whether floaters exist in the images, and the specific processing steps are as follows:
a1, acquiring a water surface image of each sub-area in each acquisition time period;
a2, comparing the acquired water surface image of each sub-region in each acquisition time period with the standard water surface image of the sub-region, analyzing whether an abnormality exists, if the water surface image of a sub-region in a certain acquisition time period is abnormal, marking the sub-region as a float sub-region, and counting the number of the float sub-region when the float appears in the acquisition time period;
a3, focusing the water surface image corresponding to the sub-area of the floating object on the area where the floating object is located, further extracting the characteristics of the floating object, simultaneously obtaining the geographic position of the floating object, and sending the geographic position of the floating object to a floating object environment acquisition module;
a4, sending the characteristics of the floater, the number of the subarea of the floater and the occurrence time of the floater to a processing and analyzing module;
the floater environment acquisition module is used for receiving the geographical position of each floater sent by the water surface floater image processing module, and comprises a plurality of floater environment acquisition units which are used for acquiring the wind speed and the wind direction in the environment parameters of the subareas of the floaters so as to send the geographical position of each floater and the wind speed and the wind direction corresponding to each floater subarea to the information preprocessing module;
the information preprocessing module is used for receiving the geographical position of each floater and the wind speed and wind direction corresponding to each floater sub-region sent by the floater environment acquisition module, recording the wind speed of each floater sub-region as V, and counting the distance between the position of a worker and the geographical position of each floater, thereby obtaining the estimated time T for the worker to reach the geographical position of the floater. The moving distance of the floater by the time the worker arrives is estimated according to the wind speed of the floater sub-region and the estimated arrival time, wherein the estimated distance calculation formula is L = V × T, V represents the wind speed of the region where the floater is located, T represents the estimated time for the worker to reach the region where the floater is located, and L represents the estimated moving distance of the floater. The position of the floater is then estimated according to the wind direction of the floater sub-region and the estimated moving distance, and the estimated position of the floater is sent to the remote control center. In this embodiment, the wind direction and wind speed of the area where the floater is located and the estimated arrival time are accurately analyzed by the information preprocessing module to obtain the estimated position of the floater, which provides accurate position information for workers, effectively improves their working efficiency, and effectively reduces the damage to water quality caused by the floater remaining on the water surface for too long;
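The drift estimate L = V × T can be sketched as below. The formula itself is from the description; treating the floater as moving at full wind speed is the patent's own simplification. The planar coordinate system and the wind-direction convention used here are assumptions added for illustration.

```python
import math

# Sketch of the information-preprocessing estimate: drift distance L = V * T
# (wind speed times the worker's estimated travel time), displaced along the
# wind direction. The coordinate convention below is an assumption: the wind
# direction is the direction the wind blows TOWARD, in degrees measured
# counter-clockwise from the positive x-axis.

def estimate_float_position(x, y, wind_speed_mps, wind_dir_deg, travel_time_s):
    """Project the floater's current position (x, y) along the wind direction
    by L = V * T and return the estimated position on arrival."""
    drift = wind_speed_mps * travel_time_s          # L = V * T
    angle = math.radians(wind_dir_deg)
    return (x + drift * math.cos(angle), y + drift * math.sin(angle))

# A floater at (10, 20) with a 0.5 m/s wind along the positive x-axis and a
# 120 s estimated arrival time drifts L = 60 m:
estimate_float_position(10.0, 20.0, 0.5, 0.0, 120.0)
```

The resulting estimated position is what the module would forward to the remote control center for dispatching workers.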
the analysis and processing module is used for receiving the characteristics of the floater, the number of the subarea of the floater and the occurrence time of the floater, which are sent by the image processing module of the floater on the water surface, comparing the characteristics of the floater with the characteristics of the floater types stored in the database, further screening out the floater type corresponding to the floater in the subarea of the floater, and sending the floater type corresponding to the floater, the occurrence time of the floater and the number of the subarea of the floater to the remote control center;
the remote control center is used for receiving the floater type corresponding to the floater, the floater occurrence time, the estimated position of the floater and the number of the floater sub-area sent by the analysis and processing module and the information preprocessing module, and dispatching related staff for processing;
the water surface reflection image processing module is used for processing the water surface image of each sub-area in each acquisition time period so as to judge whether reflection exists in the image, counting the area of the water surface outline in the water surface image of each sub-area in each acquisition time period, and analyzing whether an abnormality exists; if a non-water-surface outline exists in the water surface image of a sub-area in a certain acquisition time period, the sub-area is marked as a reflection sub-area, the number of the reflection sub-area is counted, and the acquisition time period is marked as a reflection time period; the water surface image corresponding to the number of the reflection sub-area is further extracted and marked as a reflection image, and the number of each reflection sub-area in each reflection time period and each reflection image are sent to the analysis and processing module;
the analysis and processing module is used for receiving the numbers of the reflection sub-regions in each reflection time period and the reflection images sent by the water surface reflection image processing module, acquiring the area of the reflection region in each reflection image according to the proportion of the reflection region in the reflection image of each reflection sub-region in each reflection time period, recording it as X_rt, and further counting the reflection coefficient of each reflection sub-region in each reflection time period, wherein the calculation formula of the reflection coefficient of the reflection sub-region in each reflection time period is

φ_rt = X_rt / G_rt

wherein φ_rt represents the reflection coefficient corresponding to the r-th reflection sub-region in the t-th reflection time period, G_rt represents the area of the water surface reflection image corresponding to the r-th reflection sub-region in the t-th reflection time period, and X_rt represents the area of the reflection region corresponding to the r-th reflection sub-region in the t-th reflection time period. The reflection coefficient of each reflection sub-region in each reflection time period is compared with the standard reflection coefficient corresponding to the sub-region; if the reflection coefficient of a certain reflection sub-region in a certain reflection time period is greater than the standard reflection coefficient threshold value of the sub-region, the reflection sub-region is marked as a reflection regulation and control sub-region, the number of the reflection regulation and control sub-region and the corresponding reflection regulation and control time period are counted, and the number of the reflection regulation and control sub-region, the corresponding reflection regulation and control time period and the corresponding reflection coefficient are sent to the intelligent regulation and control module;
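The reflection-coefficient check can be sketched as follows. The original equation is an image placeholder in the source, so the ratio form φ_rt = X_rt / G_rt is a reconstruction from the surrounding symbol definitions; the sample areas and the 0.2 threshold are illustrative assumptions.

```python
# Sketch of the reflection-coefficient check: phi_rt = X_rt / G_rt, the
# reflecting area as a fraction of the sub-region's water-surface image
# area, compared against the sub-region's standard threshold. All numeric
# values below are illustrative assumptions.

def reflection_coefficient(reflect_area, image_area):
    """phi_rt = X_rt / G_rt for one reflection sub-region and time period."""
    return reflect_area / image_area

def needs_angle_regulation(reflect_area, image_area, standard_threshold):
    """Mark the sub-region for angle regulation when phi_rt exceeds the
    sub-region's standard reflection-coefficient threshold."""
    return reflection_coefficient(reflect_area, image_area) > standard_threshold

phi = reflection_coefficient(reflect_area=120.0, image_area=480.0)  # 0.25
needs_angle_regulation(120.0, 480.0, standard_threshold=0.2)        # flagged
```

A flagged sub-region and its time period would then be handed to the intelligent regulation and control module, which retrieves the matching angle regulation mode from the database.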
the intelligent regulation and control module is used for receiving the number of the reflection regulation and control sub-region, the corresponding reflection regulation and control time period and the corresponding reflection coefficient sent by the analysis and processing module, and calling the angle regulation and control mode corresponding to the reflection coefficient from the database, so that the camera corresponding to the reflection regulation and control sub-region is automatically regulated in the reflection regulation and control time period according to the angle regulation and control mode corresponding to the reflection coefficient of the reflection regulation and control sub-region;
the regional environment detection module comprises a plurality of fog sensors for monitoring the fog concentration in each sub-region in each acquisition time period. Each fog sensor is respectively arranged on the surface of the camera in each region, and the fog concentration in each sub-region in each acquisition time period is monitored through each fog sensor to obtain the fog concentration of each sub-region in each acquisition time period, so that a fog concentration set W_rt(W_r1, W_r2, ..., W_ri, ..., W_r6) of each sub-region in each acquisition time period is constructed, wherein r represents the number of each sub-region, t represents an acquisition time period, and t = 1, 2, ..., i, ..., 6. The fog concentration set in each sub-region of each acquisition time period is sent to the analysis and processing module;
the analysis and processing module is used for receiving the fog concentration set of each sub-region in each acquisition time period sent by the regional environment detection module, comparing the fog concentration of each sub-region in each acquisition time period with the corresponding standard fog concentration, and constructing a fog concentration comparison set ΔW_rt(ΔW_r1, ΔW_r2, ..., ΔW_ri, ..., ΔW_r6), wherein ΔW_rt represents the difference value between the fog concentration of the r-th region in the t-th acquisition time period and the corresponding standard fog concentration, r represents a number, t represents an acquisition time period, and t = 1, 2, ..., i, ..., 6. The visual coefficient of each sub-region in each acquisition time period is then counted, wherein the calculation formula of the visual coefficient of each sub-region in each acquisition time period is

η_rt = 1 − ΔW_rt / W_rt^standard

wherein η_rt represents the visual coefficient corresponding to the r-th sub-region in the t-th acquisition time period, and W_rt^standard represents the standard fog concentration corresponding to the r-th sub-region in the t-th acquisition time period. The visual coefficient of each sub-region in each acquisition time period is compared with the standard visual coefficient of the sub-region; if the visual coefficient of a certain sub-region in a certain acquisition time period is smaller than the standard visual coefficient of the sub-region, the sub-region is marked as a focal length regulation and control sub-region and the acquisition time period is recorded as a focal length regulation and control time period. The number of the focal length regulation and control sub-region and the corresponding focal length regulation and control time period are counted, and the number of the focal length regulation and control sub-region, the corresponding focal length regulation and control time period and the visual coefficient of the focal length regulation and control sub-region in the corresponding time period are sent to the intelligent regulation and control module;
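The fog-based visibility check can be sketched as below. The original visual-coefficient equation is an image placeholder in the source, so the form η_rt = 1 − ΔW_rt / W_rt^standard is a reconstruction consistent with the surrounding symbols (higher excess fog gives a lower coefficient); it should be read as an assumption, as should all numeric values.

```python
# Sketch of the fog-based visual coefficient. The formula
#   eta_rt = 1 - dW_rt / W_std
# is a reconstruction (assumption), where dW_rt is the excess of measured fog
# concentration over the sub-region's standard concentration W_std. A
# sub-region whose eta_rt falls below its standard visual coefficient is
# marked for focal-length regulation.

def visual_coefficient(fog_concentration, standard_fog):
    """eta_rt, clamped to [0, 1]; fog at or below standard gives eta = 1."""
    delta = fog_concentration - standard_fog       # dW_rt
    return min(1.0, max(0.0, 1.0 - delta / standard_fog))

def needs_focus_regulation(fog_concentration, standard_fog, standard_visual):
    """True when the sub-region's visual coefficient drops below standard."""
    return visual_coefficient(fog_concentration, standard_fog) < standard_visual

visual_coefficient(12.0, 10.0)                 # dW = 2, eta = 0.8
needs_focus_regulation(12.0, 10.0, 0.9)        # 0.8 < 0.9 -> flagged
```

As with reflection, a flagged sub-region and time period would be forwarded to the intelligent regulation and control module, which retrieves the matching focal-length regulation mode from the database.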
the intelligent regulation and control module is also used for receiving the number of the focal length regulation and control sub-region, the corresponding focal length regulation and control time period and the corresponding visual coefficient sent by the analysis and processing module, and calling the focal length regulation and control mode corresponding to the visual coefficient from the database, so that the camera corresponding to the focal length regulation and control sub-region is automatically regulated in the focal length regulation and control time period according to the focal length regulation and control mode corresponding to the visual coefficient of the focal length regulation and control sub-region, thereby improving the stability of the monitoring result;
the method comprises the steps of counting the reflection coefficient of each reflection sub-area in each reflection time period and the visual coefficient of each sub-area in each collection time period to further count the comprehensive influence coefficient of environmental monitoring, and sending the comprehensive influence coefficient of environmental monitoring to a background terminal;
the database is used for storing standard water surface images corresponding to all sub-regions in all acquisition time periods, the variety characteristics of all floaters, the standard light reflection coefficient of each sub-region, the standard fog concentration of each sub-region, the standard visible coefficient of each sub-region, the focus regulation and control mode corresponding to each visible coefficient and the angle regulation and control mode corresponding to each light reflection coefficient;
the background terminal is used for receiving the environmental monitoring comprehensive influence coefficient sent by the analysis and processing module and displaying the environmental monitoring comprehensive influence coefficient in real time.
In a second aspect, the invention further provides a big data cloud platform, which includes a processor, a machine-readable storage medium, and a network interface, where the machine-readable storage medium, the network interface, and the processor are connected through a bus system, the network interface is used for being communicatively connected with at least one aquaculture environment real-time online monitoring terminal, the machine-readable storage medium is used for storing programs, instructions, or codes, and the processor is used for executing the programs, instructions, or codes in the machine-readable storage medium to execute the aquaculture environment real-time online monitoring method based on the internet of things and artificial intelligence.
The foregoing is merely exemplary and illustrative of the principles of the present invention and various modifications, additions and substitutions of the specific embodiments described herein may be made by those skilled in the art without departing from the principles of the present invention or exceeding the scope of the claims set forth herein.

Claims (8)

1. An aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence is characterized in that: the aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence needs to use an aquaculture environment real-time online monitoring system based on the Internet of things and artificial intelligence in the specific implementation process, and the system comprises a region dividing module, a water surface image acquisition module, a water surface image processing module, a floater environment acquisition module, a region environment detection module, an intelligent regulation and control module, an information preprocessing module, an analysis and processing module, a database, a remote control center and a background display terminal;
the analysis and processing module is respectively connected with the water surface image processing module, the region environment detection module, the database, the intelligent regulation and control module, the remote control center and the background display terminal, the water surface image acquisition module is respectively connected with the region division module and the water surface image processing module, and the floater environment acquisition module is respectively connected with the water surface image processing module and the information preprocessing module;
the area division module is used for carrying out area division on the aquaculture water surface area according to a planar net format to obtain divided sub-areas, numbering the sub-areas according to a preset sequence, and sequentially marking the sub-areas as 1, 2, ..., j, ..., m;
the water surface image acquisition module comprises a plurality of monitoring devices which are respectively arranged in each sub-region, and according to a preset acquisition time period, the cameras in the monitoring devices are used for acquiring the water surface images of each sub-region to obtain the water surface images of each sub-region in each acquisition time period, so that a water surface image set F_rt(F_r1, F_r2, ..., F_ri, ..., F_r6) of each sub-region in each acquisition time period is constructed, wherein r represents the number of each sub-region, r = 1, 2, ..., j, ..., m, t represents an acquisition time period, and t = 1, 2, ..., i, ..., 6, and the water surface image set of each sub-region in each acquisition time period is sent to the water surface image processing module;
the water surface image processing module is used for receiving a water surface image set of each sub-region in each acquisition time period sent by the water surface image acquisition module, wherein the water surface image processing module comprises a water surface floater image processing module and a water surface reflection image processing module;
the water surface floater image processing module is used for processing the collected water surface images of all sub-areas in all the collection time periods so as to judge whether floaters exist in the images, and the specific processing steps are as follows:
a1, acquiring a water surface image of each sub-area in each acquisition time period;
a2, comparing the acquired water surface image of each sub-region in each acquisition time period with the standard water surface image of the sub-region, analyzing whether an abnormality exists, if the water surface image of a sub-region in a certain acquisition time period is abnormal, marking the sub-region as a float sub-region, and counting the number of the float sub-region when the float appears in the acquisition time period;
a3, focusing the water surface image corresponding to the sub-area of the floating object on the area where the floating object is located, further extracting the characteristics of the floating object, simultaneously obtaining the geographic position of the floating object, and sending the geographic position of the floating object to a floating object environment acquisition module;
a4, sending the characteristics of the floater, the number of the subarea of the floater and the occurrence time of the floater to a processing and analyzing module;
the water surface reflection image processing module is used for processing the water surface image of each sub-area in each acquisition time period so as to judge whether reflection exists in the image, counting the area of the water surface outline in the water surface image of each sub-area in each acquisition time period, and analyzing whether an abnormality exists; if the water surface image of a sub-area in a certain acquisition time period has a non-water-surface outline, the sub-area is marked as a reflection sub-area, the number of the reflection sub-area is counted, and the acquisition time period is marked as a reflection time period; the water surface image corresponding to the number of the reflection sub-area is further extracted and marked as a reflection image, and the number of each reflection sub-area in each reflection time period and each reflection image are sent to the analysis and processing module;
the floater environment acquisition module is used for receiving the geographical position of each floater sent by the water surface floater image processing module, and comprises a plurality of floater environment acquisition units which are used for acquiring the wind speed and the wind direction in the environment parameters of the subareas of the floaters so as to send the geographical position of each floater and the wind speed and the wind direction corresponding to each floater subarea to the information preprocessing module;
the regional environment detection module comprises a plurality of fog sensors for monitoring the fog concentration in each sub-region in each acquisition time period, each fog sensor is respectively arranged on the surface of each regional camera, and the fog concentration in each sub-region in each acquisition time period is monitored through each fog sensor to obtain the fog concentration of each sub-region in each acquisition time period, so that a fog concentration set W_rt(W_r1, W_r2, ..., W_ri, ..., W_r6) of each sub-region in each acquisition time period is constructed, wherein r denotes the number of each sub-region, t denotes an acquisition time period, and t = 1, 2, ..., i, ..., 6, and the fog concentration set in each sub-region of each acquisition time period is sent to the analysis and processing module;
the information preprocessing module is used for receiving the geographic position of each floater sent by the floater environment acquisition module and the wind speed and the wind direction corresponding to each floater sub-region, recording the wind speed of each floater sub-region as V, counting the distance between the position of a worker and the geographic position of each floater, acquiring the predicted time of the worker to reach the geographic position of the floater, recording the predicted time as T, predicting the moving distance of the floater when the worker reaches the position according to the wind speed of each floater sub-region and the predicted time of the worker to reach the region, predicting the position of the floater according to the wind direction of each floater sub-region and the predicted moving distance of the floater, and sending the predicted position of the floater to the remote control center;
the analysis and processing module is used for receiving the characteristics of the floater, the number of the subarea of the floater and the occurrence time of the floater, which are sent by the image processing module of the floater on the water surface, comparing the characteristics of the floater with the characteristics of the floater types stored in the database, further screening out the floater type corresponding to the floater in the subarea of the floater, and sending the floater type corresponding to the floater, the occurrence time of the floater and the number of the subarea of the floater to the remote control center;
the analysis and processing module is used for receiving the numbers of the reflection sub-regions in each reflection time period and the reflection images sent by the water surface reflection image processing module, acquiring the area of the reflection region in each reflection image according to the proportion of the reflection region in the reflection image of each reflection sub-region in each reflection time period, recording it as X_rt, and further counting the reflection coefficient of each reflection sub-region in each reflection time period. The reflection coefficient of each reflection sub-region in each reflection time period is compared with the standard reflection coefficient corresponding to the sub-region; if the reflection coefficient of a certain reflection sub-region in a certain reflection time period is greater than the standard reflection coefficient threshold of the sub-region, the reflection sub-region is recorded as a reflection regulation and control sub-region and the reflection time period is recorded as a reflection regulation and control time period. At this moment, the number of the reflection regulation and control sub-region and the corresponding reflection regulation and control time period are counted, and the number of the reflection regulation and control sub-region, the corresponding reflection regulation and control time period and the reflection coefficient of the reflection regulation and control sub-region in the reflection regulation and control time period are sent to the intelligent regulation and control module;
the analysis and processing module is also used for receiving the fog concentration set of each sub-region in each acquisition time period sent by the regional environment detection module, comparing the fog concentration of each sub-region in each acquisition time period with the corresponding standard fog concentration, and constructing the fog concentration comparison set ΔWr(ΔWr1, ΔWr2, ..., ΔWrt, ..., ΔWr6) of each sub-region, wherein ΔWrt represents the difference between the fog concentration of the r-th sub-region in the t-th acquisition time period and the standard fog concentration corresponding to the r-th sub-region in the t-th acquisition time period, r represents the sub-region number, t represents the acquisition time period, and t = 1, 2, ..., 6; then counting the visible coefficient of each sub-region in each acquisition time period and comparing it with the standard visible coefficient of the sub-region; if the visible coefficient of a certain sub-region in a certain acquisition time period is less than the standard visible coefficient of the sub-region, recording the sub-region as a focal length regulation sub-region and recording the acquisition time period as a focal length regulation time period; counting the numbers of the focal length regulation sub-regions and the corresponding focal length regulation time periods; and sending the numbers of the focal length regulation sub-regions, the focal length regulation time periods corresponding to the focal length regulation sub-regions and the visible coefficients of the focal length regulation sub-regions in the focal length regulation time periods to the intelligent regulation module;
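The fog-based screening above can be sketched in the same style. The filing's exact visible-coefficient formula is an image in the original, so the ratio eta = standard_fog / measured_fog used here is an assumption (it falls as fog thickens, matching the "less than standard" trigger); names are illustrative.

```python
# Illustrative sketch of the fog screening above. The visible coefficient
# eta = standard_fog / measured_fog is an assumption, not the filing's formula.

def screen_fog(fog, fog_std, visible_std):
    """fog, fog_std: {(r, t): concentration}; visible_std: {r: standard visible coefficient}.
    Returns the (r, t, eta) triples whose visible coefficient is below standard."""
    flagged = []
    for (r, t), w in sorted(fog.items()):
        eta = fog_std[(r, t)] / w       # assumed visible coefficient of sub-region r, period t
        if eta < visible_std[r]:        # below standard -> focal-length regulation needed
            flagged.append((r, t, eta))
    return flagged
```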
the analysis and processing module is also used for counting the reflection coefficient of each reflection sub-region in each reflection time period and the visible coefficient of each sub-region in each acquisition time period, further counting the comprehensive influence coefficient of environmental monitoring, and sending the comprehensive influence coefficient of environmental monitoring to the background terminal;
the intelligent regulation module is used for receiving the numbers of the reflection regulation sub-regions, the reflection regulation time periods corresponding to the reflection regulation sub-regions and the reflection coefficients of the reflection regulation sub-regions in the reflection regulation time periods sent by the analysis and processing module, calling the angle regulation mode corresponding to each reflection coefficient from the database, and controlling the camera corresponding to each reflection regulation sub-region to regulate automatically in the reflection regulation time period according to the angle regulation mode corresponding to the reflection coefficient of the reflection regulation sub-region in the reflection regulation time period;
meanwhile, the intelligent regulation module is also used for receiving the numbers of the focal length regulation sub-regions, the focal length regulation time periods corresponding to the focal length regulation sub-regions and the visible coefficients of the focal length regulation sub-regions in the focal length regulation time periods sent by the analysis and processing module, calling the focal length regulation mode corresponding to each visible coefficient from the database, and controlling the camera corresponding to each focal length regulation sub-region to regulate automatically in the focal length regulation time period according to the focal length regulation mode corresponding to the visible coefficient of the focal length regulation sub-region in the focal length regulation time period;
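The database lookup of a regulation mode keyed by a coefficient can be illustrated with an in-memory table. The bands and mode names below are invented for illustration, since the filing does not enumerate the stored angle or focal length regulation modes.

```python
# Hypothetical stand-in for the database lookup of a regulation mode:
# each entry maps an upper coefficient bound to a mode name.

ANGLE_MODES = [(0.2, "tilt_5_deg"), (0.4, "tilt_10_deg"), (1.0, "tilt_20_deg")]

def pick_mode(coefficient, mode_table):
    """Return the first mode whose upper bound covers the coefficient,
    clamping to the strongest mode when the coefficient exceeds all bounds."""
    for upper, mode in mode_table:
        if coefficient <= upper:
            return mode
    return mode_table[-1][1]
```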
the database is used for storing standard water surface images corresponding to all sub-regions in all acquisition time periods, the variety characteristics of all floaters, the standard light reflection coefficient of each sub-region, the standard fog concentration of each sub-region, the standard visible coefficient of each sub-region, the focus regulation and control mode corresponding to each visible coefficient and the angle regulation and control mode corresponding to each light reflection coefficient;
the remote control center is used for receiving the floater type corresponding to the floater, the floater occurrence time, the estimated position of the floater and the number of the sub-region where the floater is located, sent by the analysis and processing module and the information preprocessing module, and dispatching related staff for processing;
the background terminal is used for receiving the environmental monitoring comprehensive influence coefficient sent by the analysis and processing module and displaying the environmental monitoring comprehensive influence coefficient in real time;
when the aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence is adopted, the method comprises the following steps:
s1, dividing culture water surface areas: carrying out region division on the culture water surface to obtain each divided sub-region;
s2, water surface image acquisition: installing monitoring equipment in each subregion, and acquiring water surface images of each subregion through each monitoring equipment;
s3, water surface image processing: extracting images of the reflected light and the floaters appearing in the collected water surface images of each sub-region, and processing the extracted images to obtain the floater images of each sub-region in each acquisition time period and the reflection images of each sub-region in each acquisition time period, and further obtaining the wind direction and wind speed of the area where the floater is located, the position of the floater, the characteristic image of the floater, the time at which the floater appears and the number of the sub-region where the floater is located;
s4, floater environment collection: collecting the wind speed and the wind direction in the environment parameters of the sub-region where the floater is located;
s5, monitoring regional environment: carrying out environment monitoring on each subregion to obtain the fog concentration of each subregion in each acquisition time period;
s6, information preprocessing: acquiring the estimated position of the floater according to the wind direction and the wind speed of the area where the floater is located and the position of the floater;
s7, analysis and processing: analyzing and processing the characteristic images of the floaters in each sub-region in each acquisition time period, the reflection images of the reflection sub-regions in each reflection time period and the fog concentration of each sub-region in each acquisition time period to obtain the floater type, the numbers of the reflection regulation sub-regions, the reflection regulation time periods corresponding to the reflection regulation sub-regions, the reflection coefficients corresponding to the reflection regulation sub-regions in the reflection regulation time periods, the numbers of the focal length regulation sub-regions, the focal length regulation time periods corresponding to the focal length regulation sub-regions and the visible coefficients corresponding to the focal length regulation sub-regions in the focal length regulation time periods;
s8, intelligent regulation: according to the reflection coefficient corresponding to the reflection regulation sub-region in the reflection regulation time period and the visible coefficient corresponding to the focal length regulation sub-region in the focal length regulation time period, respectively calling the angle regulation mode corresponding to the reflection coefficient and the focal length regulation mode corresponding to the visible coefficient from the database, and automatically regulating each camera that needs reflection regulation or focal length regulation according to the corresponding angle regulation mode and focal length regulation mode in the corresponding reflection regulation time period and focal length regulation time period;
s9, remote control: dispatching related workers to process according to the obtained floating object type corresponding to the floating object, the occurrence time of the floating object, the number of the sub-area of the floating object and the estimated position of the floating object;
s10, data display: and counting the comprehensive influence coefficient of the environmental monitoring according to the reflection coefficient of each reflection sub-region in each reflection time period and the visual coefficient of each sub-region in each acquisition time period, and displaying in real time in a background.
2. The aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence as claimed in claim 1, characterized in that: the monitoring devices comprise a plurality of cameras and a plurality of GPS locators, wherein the cameras are respectively installed above the water surface of each sub-region, each sub-region corresponds to one camera for collecting images of the water surface of the sub-region, and the GPS locators are respectively installed inside the cameras for acquiring the position of the area where each camera is located.
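The one-camera-per-sub-region layout with an embedded GPS locator described in this claim can be modeled, for illustration, as a simple registry. Device IDs and coordinates below are hypothetical.

```python
# Illustrative model of claim 2's device layout: one camera per sub-region,
# each with an internal GPS locator. IDs and coordinates are hypothetical.
from dataclasses import dataclass

@dataclass
class SubRegionDevice:
    region_no: int   # sub-region number
    camera_id: str   # camera installed above this sub-region's water surface
    gps: tuple       # (lat, lon) reported by the locator inside the camera

registry = {d.region_no: d for d in [
    SubRegionDevice(1, "cam-01", (30.25, 120.16)),
    SubRegionDevice(2, "cam-02", (30.26, 120.17)),
]}
```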
3. The aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence as claimed in claim 1, characterized in that: the floater environment acquisition units are a plurality of ultrasonic wind speed sensors, each of which is installed on the fixed support corresponding to each camera and used for monitoring the wind direction and wind speed of each sub-region.
4. The aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence as claimed in claim 1, characterized in that: the calculation formula of the estimated moving distance of the floater is L = V × T, wherein V represents the wind speed of the area where the floater is located, T represents the expected time for the worker to reach the area where the floater is located, and L represents the estimated moving distance of the floater.
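As a minimal sketch of applying L = V × T to estimate the floater's position, assuming the floater drifts with the wind. The bearing convention (degrees measured from east), the flat x/y coordinates, and the function name are assumptions for illustration, not from the filing.

```python
# Sketch of claim 4's L = V * T applied to estimate the floater's position.
# Bearing convention and flat coordinates are assumptions for illustration.
import math

def estimated_drift(position, wind_speed, wind_dir_deg, arrival_time):
    """position: (x, y); wind_speed: V; arrival_time: T (expected time for the
    worker to reach the area). Returns the estimated floater position."""
    distance = wind_speed * arrival_time   # L = V * T
    rad = math.radians(wind_dir_deg)
    x, y = position
    return (x + distance * math.cos(rad), y + distance * math.sin(rad))
```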
5. The aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence as claimed in claim 1, characterized in that: the calculation formula of the reflection coefficient of each reflection sub-region in each reflection time period is

Φrt = Xrt / Grt

wherein Φrt represents the reflection coefficient corresponding to the r-th reflection sub-region in the t-th reflection time period, Grt represents the area of the water surface reflection image corresponding to the r-th reflection sub-region in the t-th reflection time period, and Xrt represents the area of the reflection region corresponding to the r-th reflection sub-region in the t-th reflection time period.
6. The aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence as claimed in claim 1, characterized in that: the calculation formula of the visible coefficient of each sub-region in each acquisition time period is

ηrt = Wrt^standard / Wrt

wherein ηrt represents the visible coefficient corresponding to the r-th sub-region in the t-th acquisition time period, Wrt represents the fog concentration of the r-th sub-region in the t-th acquisition time period, and Wrt^standard represents the standard fog concentration corresponding to the r-th sub-region in the t-th acquisition time period.
7. The aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence as claimed in claim 1, characterized in that: the calculation formula of the comprehensive influence coefficient of the environmental monitoring is given as a formula image in the original filing; it is computed from the reflection coefficient of each reflection sub-region in each reflection time period and the visible coefficient of each sub-region in each acquisition time period, wherein m represents the number of sub-regions and r represents the number of each sub-region.
8. A big data cloud platform, characterized in that: the big data cloud platform comprises a processor, a machine-readable storage medium and a network interface, wherein the machine-readable storage medium, the network interface and the processor are connected through a bus system, the network interface is used for communication connection with at least one aquaculture environment real-time online monitoring terminal, the machine-readable storage medium is used for storing programs, instructions or codes, and the processor is used for executing the programs, instructions or codes in the machine-readable storage medium so as to execute the aquaculture environment real-time online monitoring method based on the Internet of things and artificial intelligence according to any one of claims 1 to 7.
CN202011640007.1A 2020-12-31 2020-12-31 Aquaculture environment real-time online monitoring method based on Internet of things and artificial intelligence and big data cloud platform Withdrawn CN112651958A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011640007.1A CN112651958A (en) 2020-12-31 2020-12-31 Aquaculture environment real-time online monitoring method based on Internet of things and artificial intelligence and big data cloud platform


Publications (1)

Publication Number Publication Date
CN112651958A true CN112651958A (en) 2021-04-13

Family

ID=75366957



Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114677033A (en) * 2022-04-06 2022-06-28 湖北康协生物科技有限公司 Laboratory cultivation data full-period process intelligent monitoring analysis management system
CN114677033B (en) * 2022-04-06 2023-12-01 湖北康协生物科技有限公司 Laboratory cultivation data full-period flow intelligent monitoring analysis management system


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20210413)