TWI778762B - Systems and methods for intelligent aquaculture estimation of the number of fish


Info

Publication number
TWI778762B
Authority
TW
Taiwan
Prior art keywords
fish
neural network
pixel
training
probability
Prior art date
Application number
TW110131193A
Other languages
Chinese (zh)
Other versions
TW202309776A (en)
Inventor
林志勳
余仁翔
張欽圳
駱揚
陳昱恩
陳冠竹
鄭錫齊
冉繁華
盧晃瑩
鄭智湧
謝易錚
林士勛
藍珣毓
Original Assignee
國立臺灣海洋大學 (National Taiwan Ocean University)
Priority date
Filing date
Publication date
Application filed by 國立臺灣海洋大學 (National Taiwan Ocean University)
Priority to TW110131193A
Application granted
Publication of TWI778762B
Publication of TW202309776A

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A — TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 — Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/80 — Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in fisheries management
    • Y02A40/81 — Aquaculture, e.g. of fish

Landscapes

  • Farming Of Fish And Shellfish (AREA)
  • Mechanical Means For Catching Fish (AREA)

Abstract

A technology for estimating the total number of fish in a fish farm is disclosed. A plurality of training examples consisting of consecutive frames of fish school images are annotated with strong, weak, or no labels; a training criterion is set to train a neural network into a neural network whose target function converts fish school images into fish density maps; and a beta-binomial mixture model, constructed from prior information about the fish farm and the fish species, is used to estimate the total number of fish in the farm.

Description

Method and system for estimating the number of fish in intelligent aquaculture

The present invention relates to the estimation of fish numbers in artificial-intelligence aquaculture, and in particular to a method and system for training a neural network into a target probability function neural network capable of generating density maps used to estimate the number of fish, and for automatically estimating the total number of fish farmed in a farm area.

As the global population grows and wild fishery stocks decline sharply, the supply of aquatic products faces severe challenges. Developing aquaculture is therefore one of the necessary means of meeting the growing demand for aquatic products, and an important way of protecting wild stocks from extinction. However, the environmental pollution caused by aquaculture is also a concern. Applying smart technology to monitor water quality and the behavior of the farmed stock allows the culture environment and the growth and health of the fish to be monitored while reducing labor and feed waste. Besides lowering farming costs such as feed consumption and mortality, avoiding excess feeding also reduces water pollution, lowers the risk of fish disease, and protects the aquatic environment.

Effectively applying smart technologies such as artificial intelligence to the problems faced by the aquaculture industry is therefore a very important topic in modern aquaculture. Among the many applications, fish school monitoring is a key technology for building a smart farming system. Non-contact monitoring, which combines computers with various sensors or imaging equipment, disturbs or harms the farmed stock only minimally and is well suited to aquaculture. Monitoring systems therefore need to be combined with artificial intelligence to accurately count the farmed stock and to analyze its behavior, appetite, and so on, so as to significantly improve farming efficiency and reduce farming costs.

Some techniques that use artificial intelligence to monitor fish schools are known, but they suffer from high cost and limited accuracy. For example, when annotating images for training, the center pixel occupied by every fish must be marked, no fish may be missed or mislabeled, and overlapping fish are difficult to handle. Alternatively, each fish school image is cut into blocks and the number of fish in each block is recorded, but many fish are then cut apart, which hurts accuracy, or the annotation effort increases; all of this leads to high cost.

More accurate and lower-cost artificial intelligence techniques are therefore needed for applications such as fish school monitoring and fish farming.

In view of the above, the present invention provides a low-cost, highly accurate method and system for estimating the number of farmed fish using artificial intelligence.

According to one aspect of the present invention, a neural network training method is provided for producing a density map used to estimate the number of fish farmed in a farm area, using many frames of fish school images captured by at least two imaging devices as learning samples. The method comprises: a labeling step, in which each learning sample is annotated with strong labels, weak labels, or no labels, dividing it into strongly labeled pixel regions that clearly contain exactly one fish or no fish at all, weakly labeled pixel regions that contain at least two fish, and unlabeled pixel regions; and a training step, in which a target probability function is generated from the probabilistic relationship of the strongly labeled pixel regions to the learning sample they belong to, and consistency between the respective probabilistic relationships of the weakly labeled and unlabeled pixel regions and the corresponding results computed with the target probability function is used as the training criterion to teach the neural network to become the target probability function neural network, thereby producing density maps that can be used to estimate the number of fish.

According to another aspect of the present invention, a method for estimating the number of farmed fish using artificial intelligence is provided for estimating the total number of fish farmed in a farm area. The method comprises: acquiring multiple frames of fish school images with at least two imaging devices; converting the frames into corresponding density maps with the trained target probability function neural network; processing the density maps acquired simultaneously by the imaging devices, according to how the devices are arranged, so that they appear to be taken from the same viewing angle; using connected component labeling to mark, in each of the simultaneously acquired and processed density maps, the connected regions of pixels whose probability values exceed a predetermined threshold; dividing each of the simultaneously acquired density maps with marked connected regions into a predetermined number of blocks of a×b pixels (a and b being natural numbers), ordering the blocks, taking the larger connected-region count among blocks of the same order as the representative count, and summing the representative counts to obtain a representative total; and, on the basis of the binomial distribution, designing a mixture binomial distribution model according to the fish species and the conditions of the farm area, and estimating the total number of farmed fish from several representative totals obtained at different times and from predetermined parameter values representing each condition.

According to another aspect of the present invention, an artificial intelligence system for estimating the number of farmed fish is provided for estimating the total number of fish farmed in a farm area. The system comprises: at least two imaging modules for photographing the farmed fish school; a neural network module containing the trained target probability function neural network; a processing unit; and a communication unit. The neural network module and the processing unit cooperate to convert the captured fish school images into fish school density maps and to estimate the total number of farmed fish using the fish number estimation method according to the present invention.

Preferably, the imaging devices are sonar imaging devices. In addition, at least one of the neural network module and the processing unit may be integrated with at least one imaging module. At least one of the neural network module and the processing unit may be located in the cloud, in the Internet of Things, or in the farming equipment.

The artificial intelligence fish number estimation method and system according to the present invention allow learning samples to be annotated accurately at low cost, significantly reducing the cost of applying artificial intelligence, and, when artificial intelligence is used in aquaculture, significantly improving farming efficiency and reducing farming costs.

100: cage-net farm area
101a: sonar imaging device
101b: sonar imaging device
200: neural network training method
201: labeling step
202: training step
302: individual fish body
304: background outline
306: fixed facility outline
308: weakly labeled region
310: weakly labeled region
312: weakly labeled region
500: fish number estimation method
501: connected region counting step
502: step of estimating the total number of fish with the mixture binomial model
800: artificial intelligence fish estimation system
801: imaging module
802: neural network module
803: processing unit
804: communication unit

The features, aspects, and advantages of the present invention will be better understood by reading the detailed description with reference to the accompanying drawings, which are not necessarily to scale and which generally emphasize the principles of the various aspects of this disclosure. In the drawings: FIG. 1 is a view showing the arrangement of sonar imaging devices in an aquaculture farm area according to an embodiment of the present invention; FIG. 2 is a block diagram showing the neural network training process according to an embodiment of the present invention; FIG. 3A is a fish school image acquired by a sonar imaging device, and FIG. 3B is the annotated fish school image corresponding to FIG. 3A; FIG. 4 is a fish school density map; FIG. 5 is a block diagram illustrating the fish number estimation method according to an embodiment of the present invention; FIG. 6 is a view illustrating the image partitioning used in an embodiment of the present invention; FIGS. 7A, 7B, 7C, and 7D show beta probability distributions for different conditions; and FIG. 8 is a block diagram illustrating the artificial intelligence fish estimation system of the present invention.

Aquaculture is carried out in particular sites such as ponds or net cages, and the cultured stock may be various fish, shrimp, shellfish, and so on. In this specification, fish cultured in net cages are taken as the example. Effective and precise monitoring of the cultured fish is the key to efficient farming. In general, fish school monitoring includes at least: (1) growth monitoring, such as measuring body length and body width; (2) counting, that is, automatically counting the number or distribution of fish swimming freely in the farm area; and (3) behavior monitoring, that is, automatically observing, analyzing, and recognizing the feeding behavior, appetite, disturbance, health status, and so on of the fish school. Among these applications, counting and the monitoring of behaviors such as appetite are the hardest to measure and analyze accurately, and they are also the key to whether aquaculture can be efficient and low-cost. The present invention, completed through the inventors' long-term research, significantly improves the accuracy of fish counting and accurately analyzes and recognizes behaviors such as feeding and appetite, thereby greatly improving farming efficiency and lowering farming costs. In the following, embodiments of the present invention are described in detail with reference to the drawings, but it should be understood that the description is given by way of example to aid understanding of the spirit and principles of the invention, not to limit it.

In general, when artificial intelligence is applied, it must first be trained with a predetermined number of samples; once training reaches the predetermined goal, the parameters and other data or techniques obtained from training are used to produce the artificial intelligence for the specific application. The samples are usually images or graphics of the target to be processed, and the techniques commonly used for training are neural networks and the like. Because the present invention applies artificial intelligence to aquaculture, clear images must be obtained in turbid or dimly lit water for learning and training, so that the number or distribution of fish and behaviors such as appetite can be estimated or judged automatically and accurately. Ordinary optical cameras usually cannot capture images clear enough for analysis in this setting, so equipment capable of capturing clear underwater images must be used.

According to embodiments of the present invention, sonar imaging devices are preferably used to capture the dynamics of the fish school in the farm area, although any device capable of clearly imaging underwater may be used and the invention is not limited to sonar imaging devices. When sonar imaging devices are used to monitor the fish school, the horizontal and vertical angular ranges of the fan-shaped beam, the effective range resolution, and so on are considered in order to obtain the desired fish school images. Because the fish move dynamically in the farm area, even identical sonar imaging devices placed at different positions produce different results owing to these inherent limitations and to the movement of the fish. According to an embodiment of the present invention, at least two sonar imaging devices are preferably deployed; for example, as shown in FIG. 1, one sonar imaging device is arranged every 180 degrees around the net cage farm area 100, giving two sonar imaging devices 101a and 101b in total. A sonar imaging device may of course be deployed every 120 degrees or at other predetermined angular intervals.

Because of the movement of the large number of fish in the farm area and the influence of fixed installations such as the sonar imaging devices and the net cage facilities, fish frequently overlap, occlude one another, or swim momentarily into and out of the sonar detection range, and there is interference from echoes of the facilities and of air bubbles, from the fan-shaped fields of view or inherent limitations of the different sonar imaging devices, from noise, and so on; as a result, the acquired sonar images contain omissions, misdetections, and other errors. When such images are used as training samples, they must therefore first be preprocessed to eliminate or significantly reduce the factors that impair discrimination or training, so that effective training results meeting the intended purpose can be obtained and fish body length, fish numbers, behavior, and so on can then be analyzed automatically and precisely. It is worth noting that, according to embodiments of the present invention, the fish school images are acquired from multiple sonar imaging viewpoints.

As mentioned above, when multiple frames of fish school images are used as learning samples, problems such as overlapping and occluded fish and fish momentarily swimming in and out make learning difficult. According to the present invention, these difficulties are overcome, and learning accuracy and efficiency are improved, by annotating and classifying each frame so that the whole frame is divided into several labeled regions. The neural network is taught with the classified, labeled samples, and an evaluation criterion is set, for example by designing a loss function, so that the network's learned responses to the differently labeled regions become consistent. Regarding learning schemes, neural network learning can in general be divided into unsupervised learning, which learns from entirely unlabeled regions; semi-supervised learning, which can learn from both unlabeled and labeled regions; and supervised learning, which learns from labeled regions. In artificial intelligence training, whether training is complete is usually judged by whether the statistical or probabilistic behavior the learning sample ought to exhibit and the network's output for that same sample as input reach the expected similarity or consistency. Embodiments of the present invention are described concretely below to aid a better understanding of its spirit and principles.

To estimate the number of fish, the fish school distribution must first be estimated precisely and a corresponding density map produced. FIG. 2 is a block diagram showing the neural network training process 200 of the artificial intelligence training phase according to an embodiment of the present invention. As shown in FIG. 2, the training process 200 mainly comprises a labeling step 201 and a training step 202, each of which is described in detail below.

In labeling step 201, each of the multiple frames of fish school images serving as learning samples is annotated so that every pixel belongs to one of several label classes. According to an embodiment of the present invention, three kinds of annotation are used for each frame: no label, weak label, and strong label. A strong label marks a fish whose presence is certain with a single point, and marks a region certain to contain no fish with a polygon outlining that region. A weak label marks, with an enclosing rectangle, a region in which at least n (n ≥ 2) and at most cn (c > 1) fish appear but in which the positions of some fish cannot be determined. Pixels or regions that are considered uncertain, or that closely resemble many regions already labeled, are left unlabeled. In the following, regions annotated with strong labels are called strongly labeled pixel regions, regions annotated with weak labels are called weakly labeled pixel regions, and unannotated regions are called unlabeled pixel regions. Thus, according to an embodiment of the present invention, an annotated frame can be divided into strongly labeled pixel regions, weakly labeled pixel regions, and unlabeled pixel regions.

The labeling step 201 of this embodiment is described concretely with reference to FIGS. 3A and 3B. FIG. 3A is a fish school image acquired by the sonar imaging device 101, and FIG. 3B is the corresponding annotated image. First, for strong labels: each image region in FIG. 3A that can clearly be recognized as an individual fish is marked with a single point and assigned label class f1. For example, all individual fish bodies 302 recognized in FIG. 3A are marked with dots and classified as f1; each labeled fish is thus represented by one dot carrying label class f1, producing the annotation shown in FIG. 3B. To distinguish the fish school from the background, locations in FIG. 3A that can clearly be judged to be background or fixed installations are outlined with multi-point polygons, giving the background outline 304 and the fixed facility outline 306 in FIG. 3B, which are assigned label classes B (background) and F (fixed facility) respectively; these are also strong labels. Although the strongly labeled pixel regions are all strongly labeled, the different label classes still distinguish the areas with fish from those without, and the total number of fish in the strongly labeled regions can be computed clearly and easily. A weak label denotes a rectangle-marked region in which at least n (n ≥ 2) and at most cn (c > 1) fish appear, so it is clear that at least some number of fish is present; if that number is, for example, at least 2, 5, or 10, the region is assigned class f2, f5, or f10 respectively, and with c set to 2 class f2 means 2 to 4 fish, f5 means 5 to 10 fish, and so on. In this way, the weakly labeled regions 308, 310, and 312 of FIG. 3B are obtained. Pixels or regions whose class cannot be determined, or that closely resemble regions already labeled many times, need not be labeled by force, which avoids labeling errors. According to embodiments of the present invention, the strong and unlabeled annotation modes already save annotation cost, and the weak annotation mode saves it even more markedly, because individual fish need not be marked; only the outline of a region containing a number of fish within a predetermined range is marked.

As described above, every pixel of the image frames acquired by the sonar devices 101a and 101b belongs to a strongly labeled pixel region, a weakly labeled pixel region, or an unlabeled pixel region. The number of label classes and their meanings given above are only for ease of explanation, are not intended to limit what the label classes represent, and may be changed as conditions require. In the manner described, an annotation file is produced that records the address of each pixel, its label type, and the associated label class. For example, the annotation file of a 480×640 image may record that the (m×n)-th pixel is a strong pixel of class f1, the (k×m)-th pixel is a weak pixel of class f2, the ((k+2)×(m+2))-th pixel is unlabeled, and so on. In the following, for brevity, the terms strong label, weak label, and unlabeled refer both to the annotation mode and to the assignment of the corresponding label class.
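The patent does not fix a file format for these records; the following is a minimal Python sketch, under assumed and purely illustrative field names, of how strong point labels, strong background or facility polygons, and weak rectangles with their label classes (f1, B, F, f2, and so on) might be stored for one 480×640 frame.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FrameAnnotation:
    """Illustrative container for one annotated sonar frame; not part of the disclosure."""
    frame_id: str
    size: Tuple[int, int] = (480, 640)                                   # (height, width)
    strong_points: List[Tuple[int, int]] = field(default_factory=list)   # one point per clearly visible fish (class f1)
    strong_polygons: List[dict] = field(default_factory=list)            # fish-free regions, class "B" (background) or "F" (facility)
    weak_boxes: List[dict] = field(default_factory=list)                 # rectangles known to hold between n and c*n fish

ann = FrameAnnotation(frame_id="frame_000123")
ann.strong_points.append((212, 348))                                     # pixel (row, col) of one identified fish
ann.strong_polygons.append({"label": "B", "vertices": [(0, 0), (0, 60), (80, 60), (80, 0)]})
ann.weak_boxes.append({"label": "f2", "n_min": 2, "n_max": 4, "box": (100, 150, 140, 210)})  # (top, left, bottom, right)

# Every pixel not covered by a strong point, strong polygon, or weak box is implicitly unlabeled.
print(len(ann.strong_points), "strong fish points,", len(ann.weak_boxes), "weak boxes")
```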

Before training step 202 is described, the meaning of the fish school density map is briefly explained with reference to FIG. 4. A fish school density map (hereinafter, density map) is a fish distribution map corresponding to an image frame, generated from the probability distribution of a fish appearing at each pixel of that frame. From a probabilistic viewpoint, a density map is equivalent to a probability distribution grid whose total number of cells equals the total number of pixels (the resolution) of the fish school image and whose cell values represent the probability of a fish appearing there.

In training step 202, a training criterion is set to train the neural network toward results that most closely approximate real application. For each fish school image frame used as a learning sample, a corresponding fish school density map is generated with the following formula (1) as a learning reference; formula (1) was proposed by Lempitsky to express the density map of specific annotated objects in a region F:

$$D(p) \;=\; \sum_{k} g\!\left(p;\,p_k,\,\Sigma\right), \qquad p \in F \tag{1}$$

where D(p) is the density map over the region F, p denotes a pixel position, p_k denotes the position of the k-th annotated object, and g(·; μ, Σ) denotes a Gaussian density with mean μ and covariance matrix Σ.
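As a concrete illustration, a reference density map in the sense of formula (1) can be built by placing one Gaussian kernel at each strongly labeled fish point. The sketch below assumes an isotropic covariance and uses NumPy only; the kernel width is an illustrative choice rather than a value taken from the patent.

```python
import numpy as np

def reference_density_map(points, height, width, sigma=3.0):
    """Sum of 2-D Gaussian kernels centered at the annotated fish points, in the spirit of formula (1)."""
    ys, xs = np.mgrid[0:height, 0:width]
    density = np.zeros((height, width), dtype=np.float64)
    for (py, px) in points:
        sq_dist = (ys - py) ** 2 + (xs - px) ** 2
        density += np.exp(-sq_dist / (2.0 * sigma ** 2)) / (2.0 * np.pi * sigma ** 2)
    return density

D = reference_density_map([(212, 348), (220, 360)], height=480, width=640)
print(round(float(D.sum()), 3))   # each kernel integrates to roughly 1, so the sum approximates the fish count
```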

According to an embodiment of the present invention, the following formula (2) is designed to produce the density map corresponding to each fish school image frame:

$$\hat{D}(p) \;=\; P(Y=1\mid X,\Theta)\;*\;g\!\left(p;\,\mathbf{0},\,\Sigma\right) \tag{2}$$

where Y denotes all pixels y_i of the whole image frame, * denotes the convolution operation, g(p; 0, Σ) denotes the Gaussian density function whose mean and covariance matrix are 0 and Σ respectively, and P(Y=1|X,Θ) denotes the probability of fish over the whole image frame.
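Formula (2) turns the per-pixel fish probabilities produced by the network into a density map by convolving them with a zero-mean Gaussian. The minimal SciPy sketch below uses random stand-in data for the probability map and an assumed, isotropic smoothing width.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
prob_map = rng.random((480, 640))            # stand-in for P(y_i = 1 | X, Theta) over one frame

# Convolution with a Gaussian density g(p; 0, Sigma); an isotropic Sigma is assumed here.
density_map = gaussian_filter(prob_map, sigma=3.0)
print(density_map.shape)
```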

The fish school density map produced with formula (1) serves as the comparison reference during training; it is compared with the density map produced with formula (2) in order to evaluate the training. As noted above, the fish in the strongly labeled regions of a frame are the most clearly countable, so their probability distribution is the most reliable and can be used to evaluate the fish probability distribution of the other pixel regions. Therefore, when the training criterion is set, it is considered whether the probabilistic relationship of each learning sample's strongly labeled regions to the density map produced in training and to the reference density map reaches a predetermined level of agreement or consistency (hereinafter, probability-comparison consistency), and this consistency is used to evaluate whether learning on the other labeled regions has achieved the training goal; in other words, from the viewpoint of probability-comparison consistency, the other labeled pixel regions must agree with the strongly labeled pixel regions or fall within a predetermined range. In addition, in view of the probabilistic relationship of the strongly labeled regions to the reference density map, a probability function that best reflects the fish probability distribution of a fish school image frame is designed and used to train the neural network to produce fish school density maps according to formula (2). This is explained further below.

According to embodiments of the present invention, the neural network used for training may, for example, be of U-Net architecture, P-Net architecture, or another architecture. Preferably, the probability function P(y_i|X,Θ) is taken as the learning target of the neural network, which is trained into a P(y_i|X,Θ) neural network, where P(y_i|X,Θ) ≥ 0, X denotes a single fish school image frame, y_i denotes the i-th pixel of the whole frame, and Θ denotes the neural network parameters. P(y_i=1|X,Θ) denotes the probability that pixel i is labeled as containing a fish, and P(y_i=0|X,Θ) denotes the probability that pixel i is labeled as not containing a fish, that is, P(y_i=1|X,Θ) + P(y_i=0|X,Θ) = 1.
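The patent names U-Net and P-Net as possible backbones but does not commit to a specific architecture. The toy fully convolutional network below, written with PyTorch, only illustrates the input and output contract of the target probability function P(y_i|X,Θ), namely one probability per pixel of a single-channel sonar frame; it is an assumed sketch, not the disclosed network.

```python
import torch
import torch.nn as nn

class PixelProbabilityNet(nn.Module):
    """Toy fully convolutional network emitting P(y_i = 1 | X, Theta) for every pixel."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1),
        )

    def forward(self, x):
        return torch.sigmoid(self.body(x))        # probabilities in [0, 1], one per pixel

net = PixelProbabilityNet()
frame = torch.rand(1, 1, 480, 640)                # one single-channel sonar frame
prob_map = net(frame)
print(prob_map.shape)                             # torch.Size([1, 1, 480, 640])
```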

Next, in the spirit of the training criterion, a loss function is defined, and minimization of the loss function indicates that training is complete and meets the expected goal. According to an embodiment of the present invention, the loss function L(Θ|Θ') is set as follows:

L(Θ|Θ') = [term (3-1): sum over the strongly labeled pixel set S] + α · [term (3-2): sum over the set Ω of all pixels] + β · [term (3-3): sum over the weakly labeled pixel set W]    (3)

where S denotes the set of all strongly labeled pixels, W denotes the set of weakly labeled pixels, Ω denotes the set of all pixels in one image, Θ' denotes the neural network parameters of the previous iteration, Θ denotes the current neural network parameters, and α > 0 and β > 0 are parameters that adjust the weights of the three terms.

To simplify the description, loss function (3) is divided into three terms, (3-1), (3-2), and (3-3): term (3-1) computes the consistency between the reference density map D(p) over the strongly labeled pixel regions and the fish density map P(y_i|X,Θ); term (3-2), weighted by α, applies semi-supervised learning to the set of pixels that are not strongly labeled so that their learned response becomes consistent with that of the strongly labeled pixels; and term (3-3), weighted by β, requires the training result over each weakly labeled pixel region to satisfy the constraint that the region contains at least n (n ≥ 2) and at most cn (c > 1) fish. Here α and β are the weight parameters.
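The exact form of loss (3) is reproduced in the source only as an image. The NumPy sketch below is therefore an illustrative reading of the three terms described above: (3-1) agreement with the reference density on strongly labeled pixels, (3-2) consistency between the current and previous-iteration predictions over all pixels, and (3-3) a penalty when the predicted count inside a weak box leaves the interval [n, cn]. The squared-error penalties are assumed choices made only for illustration.

```python
import numpy as np

def semi_supervised_loss(pred, prev_pred, ref_density, strong_mask, weak_boxes, alpha=1.0, beta=1.0):
    """Illustrative three-term loss in the spirit of equation (3).

    pred, prev_pred : per-pixel fish probabilities from the current / previous iteration
    ref_density     : reference density map D(p) built from the strong annotations
    strong_mask     : boolean mask of strongly labeled pixels (set S)
    weak_boxes      : list of (top, left, bottom, right, n_min, n_max) weak regions (set W)
    """
    # (3-1) agreement with the reference density on strongly labeled pixels
    term_strong = np.sum((pred[strong_mask] - ref_density[strong_mask]) ** 2)

    # (3-2) consistency of the current prediction with the previous iteration on every pixel
    term_consistency = np.sum((pred - prev_pred) ** 2)

    # (3-3) each weak box must contain between n_min and n_max fish
    term_weak = 0.0
    for (t, l, b, r, n_min, n_max) in weak_boxes:
        count = pred[t:b, l:r].sum()
        term_weak += max(0.0, n_min - count) ** 2 + max(0.0, count - n_max) ** 2

    return term_strong + alpha * term_consistency + beta * term_weak

# Tiny smoke test with random stand-in data.
rng = np.random.default_rng(1)
pred, prev = rng.random((64, 64)), rng.random((64, 64))
ref = np.zeros((64, 64))
mask = np.zeros((64, 64), dtype=bool); mask[10:20, 10:20] = True
print(round(float(semi_supervised_loss(pred, prev, ref, mask, [(30, 30, 40, 40, 2, 4)])), 3))
```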

In training step 202, the neural network is taught to evaluate the loss function iteratively and by minimization. The iterative computation uses the fish posterior probability function learned so far (that is, the currently trained neural network) to compute, for every pixel that is not strongly labeled, the probability that it is fish and the probability that it is not; as a result, a pixel considered fish in the previous iteration tends even more toward fish in the next iteration, and a pixel considered not fish tends even more toward not fish. It is worth noting that at the start of the iterations the network does not even fit the strongly labeled pixels. During training, the original images and the annotation files produced for them are used as input, and the neural network performs the learning computation according to the loss function as described until loss (3) is minimized; the weight parameters α, β and the learning parameters Θ obtained at that point are stored as the target weight parameters and the target neural parameters, and the network that has completed training in this way is the P(y_i|X,Θ) neural network. In other words, any untrained neural network of the same type, given the target neural parameters Θ and the target weight parameters α and β, becomes the P(y_i|X,Θ) neural network.

The above explains how, according to the training criterion, the loss function and the target function are designed and used to train the neural network so that learning on each labeled pixel region reaches the expected probability-comparison consistency, achieving the training goal and yielding a neural network suited to the application. Accordingly, by using a neural network of the same type as the trained one and loading the relevant parameters, such as the neural parameters Θ and the target weight parameters α and β, the P(y_i|X,Θ) neural network is obtained, which automatically converts any fish school image frame into a corresponding density map for estimating the total number of fish.

Next, with reference to FIG. 5, a fish number estimation method 500 according to an embodiment of the present invention is described for estimating the total number of fish farmed in a farm. As shown in FIG. 5, the method 500 comprises a connected-region counting step 501 and a step 502 of estimating the total number of fish with a mixture binomial model.

In step 501, the fish school images acquired by each imaging device are fed into the trained P(y_i|X,Θ) neural network described above to produce two-dimensional fish school density maps corresponding to the input images. Since, in this embodiment, two sonar imaging devices 101 arranged 180 degrees apart capture the images of the field, at any given moment there is an image from sonar imaging device 101a and an image from 101b, so two density maps corresponding to those two frames are produced. To make the two density maps appear as if produced from the same viewing angle, one of them is rotated according to the deployment of the sonar imaging devices 101 while the other is left unrotated; in this embodiment, for example, one density map is rotated 180 degrees. Then, using a method such as connected component labeling, the connected regions of pixels whose fish response intensity (probability value) exceeds a predetermined threshold are found in each of the two density maps. Concretely, starting from a pixel whose probability exceeds the threshold, its eight immediate neighbors (up, down, left, right, and the four diagonals) are examined; neighbors that also exceed the threshold are added to the region, the search then continues from those neighbors in the same way, and the process stops when no further pixel exceeding the threshold adjoins the region. The starting pixel together with the pixels found in this way forms one connected region. From this description it is clear that a connected region represents at least one fish in that region. According to this embodiment, if n connected regions are found in a density map, the map is taken to contain n fish; in other words, with the criterion that one connected region corresponds to one fish, the number of connected regions found in a density map represents the number of fish.
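A minimal sketch of the connected-region count for one density map, assuming SciPy's connected component labeling with 8-connectivity and an arbitrary probability threshold:

```python
import numpy as np
from scipy import ndimage

def count_connected_regions(density_map, threshold=0.5):
    """Count 8-connected regions of pixels whose probability exceeds the threshold."""
    binary = density_map > threshold
    structure = np.ones((3, 3), dtype=int)       # 8-neighborhood, as described in step 501
    _, num_regions = ndimage.label(binary, structure=structure)
    return num_regions

demo = np.zeros((10, 10)); demo[1:3, 1:3] = 0.9; demo[6:8, 6:9] = 0.8
print(count_connected_regions(demo))             # -> 2
```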

Then, for the farm area within the image, taking a block of a×b pixels as a basic sub-unit, each of the two simultaneously acquired density maps is divided over the farm area, a field of K×L pixels as shown in FIG. 6, into (K/a)×(L/b) basic sub-units. For example, if the field is 400 pixels wide and 400 pixels high and is divided into 80×80-pixel sub-units, 5 (= 400/80) blocks across by 5 (= 400/80) blocks down, or 25 blocks, are obtained: block 1 spans pixels 0 to 79 in width by 0 to 79 in height, block 2 spans 80 to 159 by 0 to 79, and so on up to block 25 spanning 320 to 399 by 320 to 399, giving 25 density sub-units of 80×80 pixels. The blocks obtained in this way are numbered from left to right and top to bottom. Then, for the two maps, the larger connected-region count of each pair of corresponding blocks is taken and all such counts are summed, giving the total connected-region count for the same moment, field, and viewing angle. For example, density maps A and B are acquired at the same time and one of them has been rotated; A and B are divided into blocks a1, a2, ..., a25 and b1, b2, ..., b25 respectively. If a1 contains 3 connected regions and b1 contains 5, the 5 connected regions of b1 are taken as representative; in this way 25 representative values are found and summed, and the resulting total represents the total connected-region count, that is, the total number of fish, counted for that moment, field, and viewing angle. A sketch of this block-wise combination is given below.
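The sketch below assumes both density maps have already been aligned to the same viewing angle and that connected-region counts are taken per 80×80-pixel block, as in the example above; it reuses SciPy's connected component labeling shown earlier, and all numbers are illustrative.

```python
import numpy as np
from scipy import ndimage

def per_block_region_counts(density_map, block=80, threshold=0.5):
    """Connected-region count for every block x block tile, ordered left-to-right, top-to-bottom."""
    h, w = density_map.shape
    counts = []
    for top in range(0, h, block):
        for left in range(0, w, block):
            tile = density_map[top:top + block, left:left + block] > threshold
            _, n = ndimage.label(tile, structure=np.ones((3, 3), dtype=int))
            counts.append(n)
    return np.array(counts)

def combined_total(map_a, map_b, block=80, threshold=0.5):
    """Take the larger count of corresponding blocks from the two views and sum them."""
    a = per_block_region_counts(map_a, block, threshold)
    b = per_block_region_counts(map_b, block, threshold)
    return int(np.maximum(a, b).sum())

rng = np.random.default_rng(2)
view_a, view_b = rng.random((400, 400)), rng.random((400, 400))
print(combined_total(view_a, view_b, threshold=0.995))
```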

Next, step 502 is described. In this step, a mixture Beta-binomial distribution model is used to estimate the total number of fish in the farm area more precisely. Because sonar images suffer from scanning blind spots, overlap, occlusion, and similar problems, not every fish can be observed easily. Suppose there are m fish in the culture pond in total; if the event of each fish being observed is a Bernoulli trial with probability p, the probability of observing n fish can be assumed to follow a binomial distribution with parameters m and p:

$$P(n \mid m, p) \;=\; \binom{m}{n}\,p^{\,n}\,(1-p)^{\,m-n}$$
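As a numerical illustration of this observation model and of the Beta-binomial mixture described next, the sketch below marginalizes each condition's detection probability p_k under its Beta(α_k, β_k) prior, giving one Beta-binomial term per condition, and selects the total count m that maximizes the resulting likelihood over a grid. This is a simplified stand-in for the EM-based maximum likelihood procedure in the text; the equal component weights, the observed counts, and the candidate grid are assumptions made only for the example, while the four Beta prior parameter pairs are those named in the embodiment.

```python
import numpy as np
from scipy.stats import betabinom

def estimate_total_fish(observed_counts, prior_params, m_candidates):
    """Pick the total count m maximizing a Beta-binomial mixture likelihood.

    observed_counts : per-frame fish counts n_i obtained from the density maps
    prior_params    : list of (alpha_k, beta_k) Beta priors, one per condition k
    m_candidates    : candidate values of the true total m to evaluate
    """
    observed_counts = np.asarray(observed_counts)
    best_m, best_loglik = None, -np.inf
    for m in m_candidates:
        if m < observed_counts.max():
            continue
        # Equal-weight mixture over the c conditions; each component is Beta-binomial.
        per_component = np.array([betabinom.pmf(observed_counts, m, a, b) for a, b in prior_params])
        loglik = np.log(per_component.mean(axis=0) + 1e-300).sum()
        if loglik > best_loglik:
            best_m, best_loglik = m, loglik
    return best_m

counts = [41, 55, 38, 62, 47, 58]                 # example per-frame counts, not measured data
priors = [(2, 2), (2, 4), (2, 4), (2, 7)]         # the four conditions named in the embodiment
print(estimate_total_fish(counts, priors, m_candidates=range(60, 400)))
```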
However, because sonar images exhibit varying degrees of occlusion and similar problems, the event of each fish being observed corresponds to different Bernoulli trials under different occlusion conditions; moreover, these conditions differ between farm sites and between fish species. The fish count estimated from a density map can therefore be modeled with binomial distributions, but these must be mixed with other distributions that account for such factors in order to estimate the number of fish more precisely. That is, on the basis of the binomial distribution above and considering Bernoulli trials under different conditions c (c ≥ 1), a mixture Beta-binomial distribution model is designed and, combined with the EM algorithm (Expectation-Maximization algorithm) and the maximum likelihood estimation principle, uses the N fish counts n_i, i = 1, ..., N, estimated from the density maps to estimate the number of fish m in the culture pond more accurately. The complete likelihood function for estimating the unknowns of the mixture Beta-binomial distribution model can be defined as follows:
Figure 110131193-A0101-12-0014-3
However, due to the problems of different degrees of fish shadowing in sonar images, the observed events of each fish will be different Bernoulli tests under different shadowing and other problems. In addition, in different fields or different fish species , the situation will be different. Therefore, the population of fish estimated by the density map can be composed of a binomial distribution, but must be mixed with other distributions that take into account these factors to more accurately estimate the number of fish. That is to say, based on the above binomial distribution, considering different conditions c ( c
Figure 110131193-A0101-12-0014-26
1) In the Bernoulli experiment below, a mixed Beta-binomial distribution model is designed, with the EM algorithm (Expectation-Maximization Algorithm), through the Maximum Likelihood Estimation Principle, according to N strokes Use the estimated number of fish n i , i =1,..., N from the density map to more accurately estimate the number of fish m in the culture pond. The Complete Likelihood Function for estimating the unknowns of a mixed beta-(Beta-) binomial distribution model can be defined as follows:

Figure 110131193-A0101-12-0015-4
Figure 110131193-A0101-12-0015-4

在上述這個似然函數Lm,p k ,z ik 為未知數,其中m為魚池中的總魚隻數,p k 為第k種情況下魚被觀測到的機率,z ik 值為0或1, k=1,...,c並且

Figure 110131193-A0101-12-0015-5
z ik 是用來指示第i筆資料是在c種狀況中哪一個 狀況的潛在變數(Latent Variable)。另外,在上述似然函數LN,c,n i k ,β k 為是觀察到的值或者是設定的參數,皆是已知值,其中N為根據不同時刻成像設備取得的魚群影像之資料筆數,c為魚被觀察到的狀況數,n i 為第i筆使用密度圖估計出的魚隻數量,α k k 為有關p k 的先驗機率(Prior Probability)之Beta分布的α,β參數。透過EM演算法,最大化似然函數L可求得所有未知數。根據養殖場域、魚種等條件,透過設定Beta分布的α k k 參數來設定為第k種情況下魚被觀測到的機率p k 的先驗機率。舉例而言,在本實施例中,由於本實施例是雙聲納系統,因此我們設定聲納探測魚群會有以下4(c=4)種情況:兩支聲納都觀測到很多魚,在此情況下,整體而言每隻魚被觀察到的事件皆是機率為p 1的白努利試驗;第一視角聲納觀測到很多魚並且第二視角聲納觀測到很少魚,在此情況下,整體而言每隻魚被觀察到的事件皆是機率為p 2的白努利試驗;第一視角聲納觀測到很少魚 並且第二視角聲納觀測到很多魚,在此情況下,整體而言每隻魚被觀察到的事件皆是機率為p 3的白努利試驗;以及兩支聲納都觀測到很少魚,在此情況下,整體而言每隻魚魚被觀察到的事件皆是機率為p 4的白努利試驗。因此,舉例而言,在本實施例中,我們將有關p 1,p 2,p 3,p 4的先驗機率貝他(Beta)分布之α,β參數分別設定為(2,2),(2,4),(2,4),(2,7),其背後意義分別代表p 1先驗機率為0.5附近較高,其先驗機率圖如圖7A所示,p 2p 3先驗機率為0.25附近較高,其先驗機率圖如圖7B和7C所示,p 4先驗機率0.125附近較高,其先驗機率圖如圖7D所示。將步驟501得到之連通區物件計數結果代入混合雙項分布模型進行計算,可以得到最終的魚隻數量估測結果。具體而言,魚群的種類、及養殖場域的條件,都會影響到密度圖中魚存在的機率,因此,除了單純的二項分布之外,還需考慮這些因素,因此,根據本發明會以二項分布為基礎,並與考慮這些因素的其它方程式相合併,產生混合二項分布式以更精準地估算魚隻數量。 In the above likelihood function L , m , p k , z ik are unknowns, where m is the total number of fish in the fish pond, p k is the probability of the fish being observed in the kth case, and the value of z ik is 0 or 1, k=1,...,c and
Figure 110131193-A0101-12-0015-5
, zik is a latent variable used to indicate which of the c conditions the i -th data is. In addition, in the above likelihood function L , N , c , n i , α k , β k are the observed values or the set parameters, which are all known values, where N is the fish school obtained by the imaging device at different times The number of image data, c is the number of fish observed, n i is the number of fish estimated by the i -th density map, α k , β k is the prior probability (Prior Probability) of p k The alpha, beta parameters of the beta distribution. Through the EM algorithm, all unknowns can be obtained by maximizing the likelihood function L. According to the conditions of the breeding field, fish species, etc., by setting the α k , β k parameters of the Beta distribution, the prior probability of the observed probability p k of the fish in the kth case is set. For example, in this embodiment, since this embodiment is a dual sonar system, we set the following 4 (c=4) situations for the sonar to detect fish schools: both sonars observe a lot of fish. In this case, the events observed for each fish as a whole are Bernoulli trials with probability p 1 ; many fish are observed by the first-view sonar and few fish are observed by the second-view sonar, where In this case, the events observed by each fish as a whole are Bernoulli trials with probability p 2 ; few fish are observed by the first-view sonar and many fish are observed by the second-view sonar, in this case , the events observed by each fish as a whole are Bernoulli trials with probability p 3 ; and both sonars observed few fish, in which case each fish was observed as a whole The observed events are all Bernoulli trials with probability p 4 . Therefore, for example, in this embodiment, we set the α and β parameters of the prior probability Beta distribution of p 1 , p 2 , p 3 , and p 4 to be (2, 2), respectively, (2,4), (2,4), (2,7), the meaning behind them respectively represents that the prior probability of p 1 is relatively high around 0.5, and the prior probability diagram is shown in Figure 7A, p 2 and p 3 The prior probability is higher around 0.25, and its prior probability maps are shown in Figures 7B and 7C, and the prior probability of p 4 is higher around 0.125, whose prior probability maps are shown in Figure 7D. Substitute the count result of objects in the connected area obtained in step 501 into the mixed binomial distribution model for calculation, and the final fish quantity estimation result can be obtained. Specifically, the type of fish and the conditions of the breeding field will affect the probability of fish in the density map. Therefore, in addition to the simple binomial distribution, these factors also need to be considered. Therefore, according to the present invention, the Based on the binomial distribution and combined with other equations that take into account these factors, a hybrid binomial distribution is produced to more accurately estimate fish populations.

具體而言,根據本發明,考慮魚群的種類、及養殖場域的條件,並具體地以聲納成像設備取得魚群影像的各種狀況來涵蓋這些考量因素。因此,根據慮及這些不同狀況下之白努利試驗,根據本發明,設計出混合貝他二項式分布模型,並據以產生混合Beta-二項分布模型未知數的完整似然函數,以更進一步精準地估算魚隻數量。 Specifically, according to the present invention, the types of fish and the conditions of the breeding area are considered, and these considerations are covered by the various conditions of the fish images obtained by the sonar imaging device. Therefore, according to the Bernoulli test taking into account these different conditions, according to the present invention, a mixed beta binomial distribution model is designed, and the complete likelihood function of the unknowns of the mixed beta-binomial distribution model is generated accordingly, so as to be more Further accurate estimates of fish populations.

FIG. 8 is a block diagram illustrating the artificial intelligence fish estimation system 800 of the present invention. As shown in FIG. 8, the system 800 comprises an imaging module 801, a neural network module 802, a processing unit 803, and a communication unit 804. The imaging module 801 may, for example, be a sonar imaging device that captures fish school images and passes them to the neural network module 802. The neural network module 802 is provided with the function neural network holding the trained neural network parameter values Θ obtained through the artificial intelligence training, and may have a two-dimensional fish density map generation function built in; for example, the density map generation function may be formula (2) above, P(Y=1|X,Θ)*g(p;0,Σ). The neural network module 802 automatically converts the images sent by the imaging device 801 into the corresponding density maps and passes them to the processing unit 803. The processing unit 803 can estimate the total number of farmed fish using the fish number estimation method according to the present invention. The communication unit 804 can communicate with external units over various protocols such as LoRa, Wi-Fi, or GSM; the external unit may be a cloud network, the Internet of Things, farming equipment, and so on.

Preferably, the units or modules of the system 800 according to the present invention may be formed together in a single device, or placed separately, individually or in any combination. For example, the imaging module 801, the neural network module 802, and the communication unit 804 may be integrated into a fish school imaging device with artificial intelligence, while the processing unit 803 is located in the cloud, in the Internet of Things, on a surface vehicle, or in other computer equipment with communication capability; alternatively, the imaging module 801 and the communication unit 804 are installed together in the farm area, while the neural network module 802 and the processing unit 803 are located in the cloud, in the Internet of Things, on a surface vehicle, or in other computer equipment with communication capability.

The neural network training method, the artificial intelligence fish number estimation method, and the artificial intelligence fish estimation system according to the present invention have been described above by way of example. From the description, those skilled in the art will appreciate that the present invention has significant advantages and effects over the prior art; for example, based on the automatically obtained estimates and without disturbing the fish school, it can be decided effectively whether to change the culture environment, adjust the feeding amount or formula, take measures to promote the health of the fish, and so on.

Although preferred embodiments of the present invention have been described above, they are given for illustration only and should not be construed as limiting the scope of the invention. Those skilled in the art can make many modifications without departing from the spirit of the invention, and the appended claims cover all such modifications that fall within the scope and spirit of the invention.

500: fish number estimation method

501: connected region counting step

502:使用混合二項式估算魚隻總數步驟 502: Estimating the total number of fish using mixed binomial steps

Claims (11)

1. A neural network training method for generating a density map used to estimate the number of farmed fish in a farm area, using a plurality of frames of fish-school images captured by at least two imaging devices as learning samples, the method comprising the following steps:
a labeling step of labeling each learning sample with strong labels, weak labels, or no labels, so as to distinguish strongly labeled pixel regions that clearly contain exactly one fish or contain no fish at all, weakly labeled pixel regions that contain at least two fish, and unlabeled pixel regions; and
a training step of generating a target probability function according to the probability relationship of the strongly labeled pixel regions with respect to the learning samples to which they belong, and training a neural network to become the target-probability-function neural network by using, as the training criterion, that the respective probability relationships of the weakly labeled pixel regions and of the unlabeled pixel regions with respect to the learning samples to which they belong reach the expected consistency with the corresponding results computed from the target probability function, thereby generating density maps usable for estimating the number of fish.

2. The training method of claim 1, wherein the target probability function is P(y_i | X, Θ), where X denotes a single image frame, y_i denotes the i-th pixel of the frame, Θ denotes the neural network parameters, and P(y_i = 1 | X, Θ) denotes the probability that pixel i is labeled as containing a fish; and wherein a loss function L(Θ | Θ') is designed according to the training criterion as:
L(Θ | Θ') = [equation reproduced only as an image in the original publication (110131193-A0305-02-0021-1)]
where S denotes the set of all strongly labeled pixel regions, W denotes the set formed by the weakly labeled pixel regions, Ω denotes the set of all pixels in an image, Θ denotes the current neural network parameters, Θ' denotes the neural network parameters of the previous iteration, D(p_i) is the reference density map function, P_g(y_i = 1 | X, Θ) is the target density map function, and α > 0 and β > 0 are weight parameters;
wherein the loss function is computed iteratively until it is minimized, and the neural network parameters Θ and the weight parameters α and β obtained at the minimum are taken as the target neural network parameters and weight parameters, whereby the neural network becomes the target-probability-function neural network.
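The exact form of L(Θ | Θ') appears only as an equation image in the published claim and is not reproduced above. As a hedged sketch of the kind of objective described — a cross-entropy term over the strongly labeled pixels S against the reference density D(p_i), an α-weighted term over the weakly labeled regions W, and a β-weighted consistency term over all pixels Ω that uses the previous-iteration prediction (parameters Θ') as a self-training target — the following PyTorch-style code shows one plausible instantiation. The function name, the exact per-term expressions, and the treatment of the weak regions are assumptions made for illustration, not the claimed formula.

# Hedged sketch of a strong/weak/unlabeled pixel-wise loss; NOT the exact claimed L(theta | theta').
import torch
import torch.nn.functional as F

def weakly_supervised_loss(p, p_prev, d_ref, strong_mask, weak_masks, alpha=1.0, beta=1.0):
    # p          : (H, W) current prediction P(y_i = 1 | X, theta)
    # p_prev     : (H, W) prediction from the previous iteration (parameters theta'), treated as fixed
    # d_ref      : (H, W) reference density map D(p_i), used only on strongly labeled pixels
    # strong_mask: (H, W) bool mask, pixels in S (exactly one fish or no fish at all)
    # weak_masks : list of (H, W) bool masks, one per weakly labeled region in W
    eps = 1e-7
    p = p.clamp(eps, 1 - eps)

    # Strongly labeled pixels: cross-entropy against the reference density map.
    strong_term = F.binary_cross_entropy(p[strong_mask], d_ref[strong_mask], reduction="sum")

    # Weakly labeled regions (at least two fish): encourage each region to carry fish probability mass.
    weak_term = sum(-torch.log(p[m].mean().clamp(eps, 1.0)) for m in weak_masks)

    # All pixels: consistency with the previous-iteration prediction (self-training / EM-style term).
    consistency_term = F.binary_cross_entropy(p, p_prev.detach(), reduction="sum")

    return strong_term + alpha * weak_term + beta * consistency_term

Such a loss would then be minimized iteratively over Θ, with Θ' refreshed between iterations, mirroring the iterative minimization recited in claim 2.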
3. The training method of claim 1, wherein the at least two imaging devices are two sonar imaging devices arranged 180 degrees apart.

4. A method for estimating the number of farmed fish using artificial intelligence, for estimating the total number of fish farmed in a farm area, the method comprising:
obtaining a plurality of frames of fish-school images using at least two imaging devices;
converting the plurality of frames of fish-school images into a plurality of corresponding density maps using the target-probability-function neural network trained by the training method of claim 1;
processing the density maps simultaneously obtained by the respective imaging devices, according to the arrangement of the at least two imaging devices, so that they appear as if obtained from the same viewing angle;
marking, by connected-component labeling, the connected regions of pixels whose probability values exceed a predetermined threshold in each of the processed density maps simultaneously obtained by the respective imaging devices;
dividing each of the simultaneously obtained density maps with the marked connected regions into a predetermined number of blocks in units of a×b pixels and ordering the blocks, taking the larger number of connected regions among blocks of the same order as the representative number, and summing the representative numbers to obtain a representative total, a and b being natural numbers; and
designing a mixture binomial distribution model based on the binomial distribution according to the fish species and the conditions of the farm area, and estimating the total number of farmed fish using a plurality of representative totals obtained at different times and predetermined parameter values representing the respective conditions.

5. The estimation method of claim 4, wherein the binomial distribution model is a mixture Beta-binomial distribution model, and the total number of farmed fish is obtained by maximizing the complete likelihood function of the mixture Beta-binomial distribution model shown below:
L(m, p_k, z_ik | n_i, α_k, β_k) = [equation reproduced only as an image in the original publication (110131193-A0305-02-0023-2)]
where m denotes the total number of farmed fish in the farm area to be estimated, p_k is the probability that a fish is observed under the k-th farm-area condition, z_ik takes the value 0 or 1, k = 1, ..., c, c is the total number of farm-area conditions, and the z_ik satisfy the constraint reproduced as an image in the original publication (110131193-A0305-02-0023-3); z_ik is a latent variable indicating which of the c conditions the i-th observation belongs to; N, c, n_i, α_k, and β_k are all known values, where N is the number of fish-school image records obtained by the imaging devices at different times, n_i is the number of fish estimated from the i-th density map, and α_k and β_k are the α and β parameters of the Beta distribution of the prior probability of p_k.
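The counting procedure recited in claim 4 — threshold the density map, label the connected regions, split each map into blocks of a×b pixels, take the larger per-block region count between the two simultaneously acquired (and already view-aligned) density maps, and sum over blocks — can be sketched as follows. The threshold and block size are arbitrary example values, scipy.ndimage.label is used for connected-component labeling, and the alignment to a common viewing angle is assumed to have been performed beforehand.

# Illustrative sketch of the per-frame counting step of claim 4 (inputs: two aligned density maps
# acquired at the same instant). Threshold and block size are example values only.
import numpy as np
from scipy import ndimage

def block_counts(density, threshold=0.5, block=(32, 32)):
    # Count labeled connected regions above `threshold` inside each a x b block.
    labeled, _ = ndimage.label(density > threshold)       # connected-component labeling
    a, b = block
    h, w = labeled.shape
    counts = []
    for top in range(0, h, a):
        for left in range(0, w, b):
            patch = labeled[top:top + a, left:left + b]
            counts.append(len(np.unique(patch[patch > 0])))  # regions touching this block
    return np.array(counts)

def representative_total(density_view1, density_view2, **kw):
    c1, c2 = block_counts(density_view1, **kw), block_counts(density_view2, **kw)
    return int(np.maximum(c1, c2).sum())   # larger count per same-order block, then summed

# Example usage with random maps standing in for neural-network output:
rng = np.random.default_rng(0)
d1, d2 = rng.random((240, 320)), rng.random((240, 320))
print(representative_total(d1, d2, threshold=0.9))

The per-frame representative totals produced in this way would then feed the mixture model recited in claim 5.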
6. The estimation method of claim 4, wherein the at least two imaging devices are two sonar imaging devices arranged 180 degrees apart.

7. The estimation method of claim 6, wherein the total number c of farm-area conditions is equal to 4.

8. An artificial-intelligence system for estimating the number of farmed fish, for estimating the total number of fish farmed in a farm area, the system comprising:
at least two imaging modules for capturing images of the fish school in the farm area;
a neural network module comprising the target-probability-function neural network trained by the training method of claim 1;
a processing unit; and
a communication unit,
wherein the neural network module and the processing unit operate in cooperation with each other to convert the captured images of the farmed fish school into fish-school density maps and to estimate the total number of farmed fish using the estimation method of any one of claims 4 to 7.

9. The system of claim 8, wherein the at least two imaging modules comprise sonar imaging devices.

10. The system of claim 8, wherein at least one of the neural network module and the processing unit is disposed in the cloud, the Internet of Things, or aquaculture equipment.

11. The system of claim 8, wherein at least one of the neural network module and the processing unit is integrated with at least one of the imaging modules.
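The mixture Beta-binomial likelihood of claim 5 is likewise published only as an equation image. Under the usual formulation — each per-frame count n_i is a Binomial(m, p) draw whose success probability p follows one of c Beta(α_k, β_k) priors, with z_ik indicating the active farm condition — each count has a Beta-binomial mixture likelihood, and m can be estimated by maximizing that likelihood over a grid of candidate values. The sketch below assumes this standard form and, for simplicity, uses equal mixture weights and the marginal rather than the complete likelihood; it is an illustration, not the claimed estimator.

# Hedged sketch: estimating the total fish count m from per-frame counts n_i under a mixture of
# Beta-binomial components (one per farm condition). Assumes the standard Beta-binomial mixture
# form; the exact claimed likelihood is given only as an equation image in the patent.
import numpy as np
from scipy.stats import betabinom

def log_likelihood(m, counts, alphas, betas, weights=None):
    k = len(alphas)
    weights = np.full(k, 1.0 / k) if weights is None else np.asarray(weights)
    ll = 0.0
    for n in counts:
        # Mixture over the c farm conditions for each observation.
        comp = [w * betabinom.pmf(n, m, a, b) for w, a, b in zip(weights, alphas, betas)]
        ll += np.log(max(sum(comp), 1e-300))
    return ll

def estimate_total(counts, alphas, betas, m_grid):
    lls = [log_likelihood(m, counts, alphas, betas) for m in m_grid]
    return int(m_grid[int(np.argmax(lls))])

# Example: c = 4 conditions with assumed (made-up) prior parameters and per-frame counts n_i.
counts = [812, 790, 845, 760, 830]                 # N = 5 density-map counts (illustrative)
alphas, betas = [2, 3, 4, 5], [8, 7, 6, 5]          # illustrative Beta prior parameters
print(estimate_total(counts, alphas, betas, np.arange(900, 3000, 10)))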
TW110131193A 2021-08-24 2021-08-24 Systems and methods for intelligent aquaculture estimation of the number of fish TWI778762B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW110131193A TWI778762B (en) 2021-08-24 2021-08-24 Systems and methods for intelligent aquaculture estimation of the number of fish


Publications (2)

Publication Number Publication Date
TWI778762B true TWI778762B (en) 2022-09-21
TW202309776A TW202309776A (en) 2023-03-01

Family

ID=84958210

Family Applications (1)

Application Number Title Priority Date Filing Date
TW110131193A TWI778762B (en) 2021-08-24 2021-08-24 Systems and methods for intelligent aquaculture estimation of the number of fish

Country Status (1)

Country Link
TW (1) TWI778762B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117409368A (en) * 2023-10-31 2024-01-16 大连海洋大学 Real-time analysis method for shoal gathering behavior and shoal starvation behavior based on density distribution

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI562724B (en) * 2015-12-28 2016-12-21 Inventec Corp Intelligent aquarium system with cultivation analyzing and method thereof
CN108875644A (en) * 2018-06-21 2018-11-23 四川盈乾建设工程有限公司 A kind of aquatic animal density survey method
US20190094356A1 (en) * 2015-04-20 2019-03-28 Navico Holding As Methods and apparatuses for constructing a 3d sonar image of objects in an underwater environment
CN112213962A (en) * 2020-08-21 2021-01-12 四川渔光物联技术有限公司 Intelligent feeding system and method based on growth model and sonar feedback
CN112418124A (en) * 2020-11-30 2021-02-26 国电大渡河枕头坝发电有限公司 Intelligent fish monitoring method based on video images


Also Published As

Publication number Publication date
TW202309776A (en) 2023-03-01

Similar Documents

Publication Publication Date Title
Li et al. Nonintrusive methods for biomass estimation in aquaculture with emphasis on fish: a review
CN110147771B (en) Sow lateral-lying posture real-time detection system based on sow key part and environment combined partition
Monkman et al. Using machine vision to estimate fish length from images using regional convolutional neural networks
Labao et al. Cascaded deep network systems with linked ensemble components for underwater fish detection in the wild
Kellenberger et al. 21 000 birds in 4.5 h: efficient large‐scale seabird detection with machine learning
CN111339912B (en) Method and system for recognizing cattle and sheep based on remote sensing image
CN114241031B (en) Fish body ruler measurement and weight prediction method and device based on double-view fusion
Hayes et al. Drones and deep learning produce accurate and efficient monitoring of large-scale seabird colonies
CN110307903B (en) Method for dynamically measuring non-contact temperature of specific part of poultry
US20220004760A1 (en) Splash detection for surface splash scoring
CN115830490A (en) Multi-target tracking and behavior statistical method for herd health pigs
TWI778762B (en) Systems and methods for intelligent aquaculture estimation of the number of fish
Wang et al. Vision-based in situ monitoring of plankton size spectra via a convolutional neural network
CN114627554A (en) Automatic aquaculture feeding centralized management method and system for aquatic products
CN115908268A (en) Method and device for measuring biomass of underwater fish body in real time
Isa et al. CNN transfer learning of shrimp detection for underwater vision system
CN114612454A (en) Fish feeding state detection method
TW202309823A (en) Intelligent aquaculture: systems and methods for assessment of the appetite of fish
Newlands et al. Measurement of the size, shape and structure of Atlantic bluefin tuna schools in the open ocean
Jovanović et al. Splash detection in fish Plants surveillance videos using deep learning
Boussarie et al. BichiCAM, an underwater automated video tracking system for the study of migratory dynamics of benthic diadromous species in streams
Salas et al. Reducing error and increasing reliability of wildlife counts from citizen science surveys: Counting Weddell Seals in the Ross Sea from satellite images
Wolfenkoehler et al. Viability of side-scan sonar to enumerate Paddlefish, a large pelagic freshwater fish, in rivers and reservoirs
Lestari et al. Segmentation of seagrass (Enhalus acoroides) using deep learning mask R-CNN algorithm
Lin et al. A Real-Time Counting Method of Fish based on the Instance Segmentation

Legal Events

Date Code Title Description
GD4A Issue of patent certificate for granted invention patent