TWI820870B - Methods and devices for biological monitoring by automated systems based on artificial intelligence - Google Patents
Methods and devices for biological monitoring by automated systems based on artificial intelligence
- Publication number
- TWI820870B (application TW111131036A)
- Authority
- TW
- Taiwan
- Prior art keywords
- track
- unit
- artificial intelligence
- host
- automated system
- Prior art date
Landscapes
- Measuring And Recording Apparatus For Diagnosis (AREA)
- Housing For Livestock And Birds (AREA)
Abstract
A method and device for monitoring organisms with an artificial-intelligence-based automated system. An autonomous mobile device with artificial intelligence and edge computing capability is placed beside a biological breeding cage rack to detect and manage the condition of the experimental animals inside the breeding cages. The captured images of the animals are computed or analyzed and the results are returned to a central management platform, achieving automated monitoring and care management of the animals. The conditions inside the breeding cages can thereby be standardized and disturbance to the animals' lives reduced, improving the quality of laboratory animal care. Besides replacing manual cage-by-cage inspection rounds, the system also raises the abnormality detection rate and the level of contamination control, achieving the purpose of automated monitoring and care management of organisms.
Description
The present invention relates to a method and device for monitoring organisms, and in particular to a method and device for monitoring organisms with an automated system based on artificial intelligence.
Laboratory animal care currently faces three major problems:
1. Manual care of laboratory animals is time-consuming and detects few problems: daily care is performed by staff walking the rooms and observing the animals. Routine observation is required twice a day, once in the morning and once in the afternoon, and covers counting the animals, checking whether any animal is injured or dead, and checking whether the housing environment is suitable. Inspecting cage by cage in this manual way is not only time-consuming; because there are many cages, the abnormality detection rate is low and the condition of the animals in a cage often cannot be known in real time, leaving animals in distress and animal welfare unfulfilled.
2. Contamination control and light-cycle constraints in the animal room: because of contamination control, personnel must gown up before entering the animal room, so staff cannot continuously observe animal activity for long periods, and even once inside the time available for observation is very limited; long-term, continuous observation records have therefore always been lacking. A further consideration is that, to simulate the animals' normal daily rhythm, the lighting in the animal room is automatically controlled to 12 hours of light and 12 hours of darkness, and entering the room at night requires special equipment and training so as not to disturb the animals' light cycle. Night-time care therefore cannot be carried out, and data on the night-time activity patterns of the experimental animals cannot be accumulated.
3. Cage conditions cannot be standardized: because inspection rounds are performed manually and each person judges cage conditions by a different standard, there is often no uniform standard for the condition of a given cage. Cage changes are therefore usually done for an entire animal room at the same time, yet animals by nature dislike being disturbed, and every cage change is a stressful stimulus for them. Reducing cage-change interventions while preserving the animals' quality of life would be the best model for laboratory animal care.
After careful consideration, prototype testing, and continuous improvement, the inventor has developed a simple and practical improved approach that in particular overcomes the shortcomings described above.
The present invention discloses a method for monitoring organisms with an artificial-intelligence-based automated system, comprising the following steps. Step 1: acquire a plurality of image data through an autonomous mobile device to form an image data set, where the autonomous mobile device comprises a host unit and a sensing unit and is placed beside a biological breeding cage rack. Step 2: apply a motion detection algorithm and a key-frame extraction algorithm to the image data of the image data set to capture the required frames and convert them into static images. Step 3: use a computer equipped with graphical user interface software to mark the coordinates of the target objects in the static images, forming label data that is stored on the computer; graphics algorithms can assist so that the label data is annotated quickly, forming a labeled data set for training a machine learning machine. Step 4: feed the labeled data set into the machine learning machine and build a recognition model with a machine learning algorithm. Step 5: deploy the recognition model on the host unit. Step 6: process new image data acquired by the sensing unit of the autonomous mobile device through Steps 2 and 3 to form a new labeled data set, input it to the machine learning machine, and compare it against the recognition model to obtain recognition results describing the animals and the state of their housing environment; the results are simultaneously transmitted to a central management platform to serve as the information source for monitoring and abnormality notification, thereby achieving automated monitoring and care management of the animals.
The present invention further discloses a device for monitoring organisms with an artificial-intelligence-based automated system, installed on a biological breeding cage rack that holds a plurality of biological breeding cages. The device comprises: a track unit comprising at least one track set, the track set having a first track arranged on one side of the breeding cages; a drive unit electrically connected to the track unit, the drive unit having a driver for the track set that drives the track set; a host unit having at least one host for the first track, the bottom of the host carrying a first connecting plate which is assembled with a second connecting plate located on the track set; and a sensing unit having a sensor for the host, the sensor being electrically connected to the host and arranged at one end of the host.
With the above method and structure, the host reciprocates laterally or axially along the first track, so that the sensor on the host can observe the activity of the experimental animals in the breeding cages continuously and over long periods. Cage conditions can thus be standardized and only the cages that actually need changing are changed, reducing disturbance to the animals' lives and improving the quality of laboratory animal care. In addition to eliminating manual inspection rounds, the abnormality detection rate and the level of contamination control are also improved, achieving automated monitoring and care management of the animals.
Referring to Figures 1 and 6, a method for monitoring organisms with an artificial-intelligence-based automated system is disclosed, comprising the following steps:
Step 1 (S1): a plurality of image data are acquired through an autonomous mobile device to form an image data set; the autonomous mobile device is placed beside a biological breeding cage rack 200.
The image data may be static images or video recordings.
The autonomous mobile device may be a linear track set, a self-propelled vehicle, or a drone. It comprises a host unit 40 and a sensing unit 50. The host unit 40 is an edge-computing host comprising a central processing unit (CPU), memory, a graphics processing unit (GPU), peripheral I/O interfaces, wireless transmission, a data storage unit, and a power supply unit. The sensing unit 50 comprises a camera module, an infrared detector, a thermometer, a hygrometer, a directional microphone, a vibration sensor, a pressure sensor, or any combination thereof.
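For illustration only (the patent does not prescribe a software interface), the minimal sketch below shows one way the edge-computing host could bundle readings from these sensors into a single record before analysis; all field names and the bundle_reading helper are assumptions.

```python
# Minimal sketch, assuming a Python runtime on the edge-computing host.
# Field names and bundle_reading() are illustrative, not taken from the patent.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class SensorSample:
    cage_id: str                        # which breeding cage 210 the reading belongs to
    frame_path: str                     # image captured by the camera module
    temperature_c: Optional[float] = None
    humidity_pct: Optional[float] = None
    sound_db: Optional[float] = None    # directional microphone level
    vibration: Optional[float] = None
    pressure: Optional[float] = None
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

def bundle_reading(cage_id: str, frame_path: str, **readings) -> SensorSample:
    """Collect whatever sensors are installed; absent sensors simply stay None."""
    return SensorSample(cage_id=cage_id, frame_path=frame_path, **readings)

sample = bundle_reading("cage-210-03", "frames/cage-210-03_0001.jpg",
                        temperature_c=22.5, humidity_pct=55.0)
```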
The image data set covers animal characteristics (e.g. age, coat color), environmental conditions (e.g. a tilted cage rack, low feed level, bedding wetted by excrement or a leaking water bottle), and animal behaviors (e.g. eating, climbing, fighting, mating, abnormal activity level).
Step 2 (S2): for the image data of the image data set, motion detection algorithms and key-frame extraction algorithms are used to capture the frames of reference value and convert them into static images. Images produced by different models of camera module can further be normalized with computer vision algorithms to improve data consistency, which helps raise the training speed and accuracy of a machine learning machine.
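The patent does not name a particular motion-detection or key-frame algorithm; as a hedged illustration, the sketch below uses simple frame differencing with OpenCV to keep only frames whose change exceeds a threshold, then resizes and rescales them so that different camera modules produce consistent static images. The threshold and output size are assumptions.

```python
# Illustrative sketch only; assumes OpenCV (cv2) is available on the host.
import cv2

def extract_key_frames(video_path, diff_threshold=12.0, out_size=(640, 480)):
    """Keep frames whose mean pixel change versus the previous frame exceeds a threshold."""
    cap = cv2.VideoCapture(video_path)
    key_frames, prev_gray = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            motion_score = cv2.absdiff(gray, prev_gray).mean()   # crude motion measure
            if motion_score > diff_threshold:
                norm = cv2.resize(frame, out_size)               # spatial normalization
                norm = cv2.normalize(norm, None, 0, 255, cv2.NORM_MINMAX)
                key_frames.append(norm)
        prev_gray = gray
    cap.release()
    return key_frames
```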
Step 3 (S3): referring to Figures 2 to 5, a computer equipped with graphical user interface (GUI) software is used to mark, through the GUI software, the coordinates of the target objects in the static images, forming label data that is stored on the computer. In this embodiment of the invention the target objects are the experimental animals and the items in the environment that need to be labeled, and the computer may be a desktop or a notebook computer. Graphics algorithms can additionally assist the annotators so that the label data is completed faster, forming a labeled data set for training the machine learning machine. The labeled data set serves an object recognition module, an image segmentation module, an animal entity recognition module, and an animal behavior recognition module, and the user selects which module to use according to the target objects or the monitoring purpose.
The object recognition module marks the position of each labeled item with a bounding box A, as shown in Figure 2, and stores the annotations in the object recognition module. The image segmentation module marks, item by item, the image pixels belonging to the object contour B, the object body C, and the background D, as shown in Figure 3, and stores the annotations in the image segmentation module. The animal entity recognition module marks the regions of the facial landmarks P1 to P7 on each object's face image, as shown in Figure 4, and stores the annotations in the animal entity recognition module. The animal behavior recognition module marks the target joint points (or decision points) E of each object in the image, as shown in Figure 5, and stores the annotations in the animal behavior recognition module.
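The patent leaves the on-disk annotation format open; purely as an illustration, one labeled record covering all four modules might look like the JSON-style structure below. All keys and coordinates are invented examples, not the patent's format.

```python
# Hypothetical annotation record for one static image.
import json

record = {
    "image": "frames/cage-210-03_0001.jpg",
    "object_detection": [                      # bounding box A: [x, y, width, height]
        {"label": "mouse", "bbox": [120, 88, 64, 42]},
    ],
    "segmentation": {                          # contour B / body C pixels vs. background D
        "mouse_polygon": [[122, 90], [180, 92], [178, 128], [124, 126]],
    },
    "facial_landmarks": {                      # P1-P7 as (x, y) points on the face
        "P1": [130, 95], "P2": [136, 94], "P3": [142, 96], "P4": [133, 100],
        "P5": [139, 100], "P6": [135, 105], "P7": [137, 108],
    },
    "joint_points": {                          # target joint points E for behaviour recognition
        "nose": [131, 96], "left_forepaw": [140, 120], "tail_base": [175, 125],
    },
}

print(json.dumps(record, indent=2))
```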
Step 4 (S4): the data of the labeled data set is input into the machine learning machine, and a recognition model is built through a machine learning algorithm.
Step 5 (S5): the recognition model is deployed on the host unit 40.
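The patent does not specify how the model is packaged for the host unit 40. As one hedged possibility, a trained PyTorch model could be exported to ONNX so that a lightweight runtime on the edge-computing host can execute it; the stand-in network and file name below are assumptions.

```python
# Deployment sketch under the assumption of a PyTorch model and an ONNX runtime on the host.
import torch
import torchvision

model = torchvision.models.resnet18(weights=None, num_classes=4)  # stand-in recognition model
model.eval()
dummy = torch.randn(1, 3, 480, 640)           # one normalized camera frame (NCHW)
torch.onnx.export(model, dummy, "recognition_model.onnx",
                  input_names=["frame"], output_names=["scores"])
```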
Step 6 (S6): new image data acquired by the sensing unit 50 of the autonomous mobile device is processed through Steps 2 and 3 to form a new labeled data set, which is input to the machine learning machine and compared against the recognition model; this yields recognition results describing the animals and the state of their housing environment. The results are simultaneously transmitted to a central management platform, which may be a local computer or a cloud virtual machine, to serve as the information source for monitoring and abnormality notification, thereby achieving automated monitoring and care management of the animals.
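Neither the transport protocol nor the message format for the central management platform is specified in the patent. Continuing the deployment sketch above, the hedged example below runs the exported model on a new frame and posts only abnormal results to an assumed HTTP endpoint; the URL, class names, and payload fields are illustrative.

```python
# Inference-and-report sketch; assumes onnxruntime and requests are installed.
import numpy as np
import onnxruntime as ort
import requests

session = ort.InferenceSession("recognition_model.onnx")
LABELS = ["normal", "low_feed", "wet_bedding", "injured_animal"]   # assumed classes

def classify_and_report(frame_bgr: np.ndarray, cage_id: str,
                        platform_url: str = "http://central-platform.local/api/events"):
    x = frame_bgr.astype(np.float32).transpose(2, 0, 1)[None] / 255.0   # HWC -> NCHW
    scores = session.run(None, {"frame": x})[0][0]
    result = {"cage_id": cage_id,
              "label": LABELS[int(scores.argmax())],
              "score": float(scores.max())}
    if result["label"] != "normal":            # only abnormal states trigger a notification
        requests.post(platform_url, json=result, timeout=5)
    return result
```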
In Step 6, recognition results flagged as abnormal are reviewed by an administrator, who decides whether they need to be re-labeled and re-classified, forming a new labeled data set. This new labeled data set is continuously fed back into the machine learning machine to retrain the recognition model, improving its accuracy.
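As a sketch of this human-in-the-loop step (the review workflow itself is not detailed in the patent), flagged predictions could be queued for manual re-labeling and the corrected items collected as new training samples; the file layout and field names are assumptions.

```python
# Hedged sketch of the relabel-and-retrain feedback loop.
import json
from pathlib import Path
from typing import List

def queue_for_review(result: dict, frame_path: str, review_dir: str = "review_queue") -> None:
    """Store an abnormal or low-confidence prediction for the administrator to inspect."""
    Path(review_dir).mkdir(exist_ok=True)
    out = Path(review_dir) / (Path(frame_path).stem + ".json")
    out.write_text(json.dumps({"frame": frame_path, "prediction": result}, indent=2))

def collect_reviewed_labels(review_dir: str = "review_queue") -> List[dict]:
    """Files to which the reviewer has added a 'corrected_label' become new training data."""
    new_samples = []
    for f in Path(review_dir).glob("*.json"):
        item = json.loads(f.read_text())
        if "corrected_label" in item:
            new_samples.append(item)
    return new_samples
```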
The machine learning algorithm in Step 4 can vary with the application, ranging from rule-based methods such as clustering or support vector machines (SVM) to learning-based algorithms, the latter dominated by deep learning built on neural network architectures.
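As a minimal example of the classical end of that range, the sketch below trains an SVM on toy feature vectors with scikit-learn; in practice the features would come from the labeled data set, and a deep-learning detector would typically replace this for the image-based modules. The synthetic data is only a placeholder.

```python
# Toy SVM baseline; assumes scikit-learn and NumPy are available.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))        # e.g. motion energy, object area, brightness features
y = rng.integers(0, 2, size=200)     # 0 = normal cage, 1 = abnormal cage (toy labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("toy accuracy:", clf.score(X_test, y_test))
```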
Referring to Figures 6 to 10, the present invention further discloses a first embodiment of a device 100 for monitoring organisms with an artificial-intelligence-based automated system, installed on a biological breeding cage rack 200 that holds a plurality of biological breeding cages 210. The device 100 comprises the following.
A track unit 10 comprises at least one track set 11. The track set 11 has a first track 12, which is arranged on one side of the biological breeding cages 210 and may be set horizontally or vertically.
A drive unit 20 is electrically connected to the track unit 10 and provides a driver 21 for the first track 12. In this embodiment of the invention the driver 21 may be a motor or a pneumatic cylinder; it is arranged at one end of the first track 12 and drives the first track 12.
A support unit 30 provides at least one support frame 31 for the first track 12; one end of the support frame 31 is fixed to the biological breeding cage rack 200 and the other end is fixed to the first track 12.
A host unit 40 provides at least one host 41 for the first track 12, and the host 41 is mounted on the first track 12. The host 41 is an edge-computing host comprising a central processing unit (CPU), memory, a graphics processing unit (GPU), peripheral I/O interfaces, wireless transmission, and a data storage unit. A first connecting plate 42 is provided at the bottom of the host 41 and is assembled with a second connecting plate 43 located on the first track 12. In this embodiment of the invention the first connecting plate 42 carries a plurality of sliders 421, and the second connecting plate 43 has a plurality of slide grooves 431 corresponding to the sliders 421; the sliders 421 engage the slide grooves 431 so that the host 41 is mounted on the first track 12 and can move along it. One end of the host 41 is further connected to an adjustment frame 44, which comprises a front frame body 45 and a rear frame body 46. The two sides of the front frame body 45 are each connected to the rear frame body 46 by a first fastener 47, and the rear frame body 46 is connected to the host 41 by a plurality of second fasteners 48. By loosening the first fasteners 47 the front frame body 45 can be rotated relative to the rear frame body 46, and by loosening the second fasteners 48 the position of the rear frame body 46 can be adjusted up and down relative to the host 41.
A sensing unit 50 provides a sensor 51 for the host 41. The sensor 51 comprises a camera module, an infrared detector, a thermometer, a hygrometer, a directional microphone, a vibration sensor, a pressure sensor, or any combination thereof; it is electrically connected to the host 41 and mounted on the front frame body 45 of the host 41.
A power supply unit 60 provides a power supply module 61 for the host 41; the power supply module 61 supplies the electric power required by the host 41 and the sensor 51.
Referring to Figures 11 and 12 together with Figure 7, the driver 21 drives the first track 12 and thereby moves the host 41, which reciprocates laterally or axially along the first track 12, so that the sensor 51 on the host 41 can observe the activity of the experimental animals in the breeding cages 210 continuously and over long periods. Cage conditions can thus be standardized and only the breeding cages 210 that actually need changing are changed, reducing disturbance to the animals' lives and improving the quality of laboratory animal care. In addition to eliminating manual inspection rounds, the abnormality detection rate and the level of contamination control are also improved.
Referring to Figures 13 to 16, a second embodiment of the device of the present invention is disclosed. It differs from the first embodiment in that the support unit 30 further comprises a bearing frame 32, which can be erected around the breeding cage rack 200 or on one side of it. The bearing frame 32 is formed of a plurality of support rods 33 so that the frame as a whole is roughly an inverted U or a rectangle. The track set 11 has the first track 12, two second tracks 13, and a synchronizing rod 14. The second tracks 13 are located on the two sides of the bearing frame 32 and are each fixed to the corresponding support rods 33, and the driver 21 is mounted on one of the second tracks 13. The first track 12 is fixed to each second track 13 by two mounting brackets 34, and the two ends of the synchronizing rod 14 are fixed to the respective second tracks 13. The driver 21 thus drives the corresponding second track 13 and the synchronizing rod 14 simultaneously; the synchronizing rod 14 drives the other second track 13, so that the mounting brackets 34 move synchronously in the same direction and carry the first track 12 with them, making the first track 12 together with the host 41 reciprocate along the long axis of the second tracks 13. A further driver 21 may also be provided on the first track 12 to drive the first track 12 at the same time, so that the host 41 can simultaneously move along the first track 12, improving the convenience of the present invention in use.
Referring to Figure 17, a third embodiment of the device of the present invention is disclosed. It differs from the first embodiment in that the first connecting plate 42 carries a plurality of magnetic attachment pieces 422 and the second connecting plate 43 has a plurality of magnetic slots 432 corresponding to them; the magnetic pieces 422 attach to the magnetic slots 432 so that the host 41 is mounted on the first track 12 and can move along it.
Referring to Figure 18, a fourth embodiment of the device of the present invention is disclosed. It differs from the first embodiment in that the two sides of the first connecting plate 42 each carry a press piece 423 and the second connecting plate 43 has two press holes 433 corresponding to them; the press pieces 423 engage the press holes 433 so that the host 41 is mounted on the first track 12 and can move along it.
Referring to Figure 19, a fifth embodiment of the device of the present invention is disclosed. It differs from the first embodiment in that the two sides of the first connecting plate 42 each carry a snap fastener 424 and the second connecting plate 43 has two snap holes 434 corresponding to them; the snap fasteners 424 latch into the snap holes 434 so that the host 41 is mounted on the first track 12 and can move along it.
In summary, the present invention offers a clear and practical improvement over comparable products, and no identical structure has been found in domestic or foreign technical literature on structures of this kind. The present invention therefore satisfies the requirements for an invention patent, and this application is filed in accordance with the law.
The above is merely one preferred feasible embodiment of the present invention; all equivalent structural changes made in accordance with the description and claims of the present invention shall fall within the patent scope of the present invention.
Reference numerals: 200: biological breeding cage rack; 210: biological breeding cage; 100: device for monitoring organisms with an artificial-intelligence-based automated system; 10: track unit; 11: track set; 12: first track; 13: second track; 14: synchronizing rod; 20: drive unit; 21: driver; 30: support unit; 31: support frame; 32: bearing frame; 33: support rod; 34: mounting bracket; 40: host unit; 41: host; 42: first connecting plate; 421: slider; 422: magnetic attachment piece; 423: press piece; 424: snap fastener; 43: second connecting plate; 431: slide groove; 432: magnetic slot; 433: press hole; 434: snap hole; 44: adjustment frame; 45: front frame body; 46: rear frame body; 47: first fastener; 48: second fastener; 50: sensing unit; 51: sensor; 60: power supply unit; 61: power supply module; A: bounding box; B: object contour; C: object body; D: background; E: target joint point; S1–S6: Steps 1–6; P1–P7: facial landmarks.
[Figure 1] Flow diagram of the present invention. [Figure 2] Schematic diagram of the object recognition module of the present invention. [Figure 3] Schematic diagram of the image segmentation module of the present invention. [Figure 4] Schematic diagram of the animal entity recognition module of the present invention. [Figure 5] Schematic diagram of the animal behavior recognition module of the present invention. [Figure 6] Perspective view of the first embodiment of the present invention. [Figure 7] Partial enlarged view of the first embodiment. [Figure 8] Schematic diagram of the host of the first embodiment assembled on the first track. [Figure 9] Partial exploded view of the first embodiment. [Figure 10] Schematic diagram of the sensor operating on the host in the first embodiment. [Figure 11] Schematic diagram of the host of the first embodiment reciprocating along a horizontally arranged first track. [Figure 12] Schematic diagram of the host of the first embodiment reciprocating along a vertically arranged first track. [Figure 13] Perspective view of the second embodiment of the present invention. [Figure 14] Schematic diagram of the first track of the second embodiment reciprocating up and down. [Figure 15] Schematic diagram of the first track of the second embodiment reciprocating left and right. [Figure 16] Schematic diagram of the host of the second embodiment moving left and right on the first track and up and down on the second tracks. [Figure 17] Schematic diagram of the host of the third embodiment assembled on the first track. [Figure 18] Schematic diagram of the host of the fourth embodiment assembled on the first track. [Figure 19] Schematic diagram of the host of the fifth embodiment assembled on the first track.
S1: Step 1
S2: Step 2
S3: Step 3
S4: Step 4
S5: Step 5
S6: Step 6
Claims (17)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW111131036A TWI820870B (en) | 2022-08-17 | 2022-08-17 | Methods and devices for biological monitoring by automated systems based on artificial intelligence |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW111131036A TWI820870B (en) | 2022-08-17 | 2022-08-17 | Methods and devices for biological monitoring by automated systems based on artificial intelligence |
Publications (2)
Publication Number | Publication Date |
---|---|
TWI820870B true TWI820870B (en) | 2023-11-01 |
TW202409878A TW202409878A (en) | 2024-03-01 |
Family
ID=89722287
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
TW111131036A TWI820870B (en) | 2022-08-17 | 2022-08-17 | Methods and devices for biological monitoring by automated systems based on artificial intelligence |
Country Status (1)
Country | Link |
---|---|
TW (1) | TWI820870B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109255297A (en) * | 2018-08-06 | 2019-01-22 | 百度在线网络技术(北京)有限公司 | animal state monitoring method, terminal device, storage medium and electronic equipment |
TW202121199A (en) * | 2019-11-26 | 2021-06-01 | 臺中榮民總醫院 | System and method for detecting and classifying animal behaviors capable of effectively reducing the cost of human determination and the error of manual determination |
CN113223035A (en) * | 2021-06-07 | 2021-08-06 | 南京农业大学 | Intelligent inspection system for cage-rearing chickens |
- 2022-08-17: TW application TW111131036A filed; granted as patent TWI820870B (active)
Also Published As
Publication number | Publication date |
---|---|
TW202409878A (en) | 2024-03-01 |