CN106204537B - Live pig image segmentation method in a complex environment - Google Patents

Live pig image segmentation method in a complex environment

Info

Publication number
CN106204537B
CN106204537B (application number CN201610493686.1A)
Authority
CN
China
Prior art keywords
image
live pig
obtains
segmented
light source
Prior art date
Legal status
Active
Application number
CN201610493686.1A
Other languages
Chinese (zh)
Other versions
CN106204537A (en)
Inventor
饶秀勤
宋晨波
应义斌
Current Assignee
Hefei Shenmu Information Technology Co.,Ltd.
Original Assignee
Zhejiang University ZJU
Priority date
Filing date
Publication date
Application filed by Zhejiang University (ZJU)
Priority to CN201610493686.1A
Publication of CN106204537A
Application granted
Publication of CN106204537B
Active legal status (current)
Anticipated expiration legal status

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a live pig image segmentation method for complex environments. Live pig images are acquired in real time, a light source reference point is obtained, and a rectangular coordinate system is established. The live pig image to be segmented is converted to grayscale to obtain a live pig grayscale image, which is further processed to obtain a live pig gradient map; segmentation and extraction then yield an initial segmentation image. A closing operation is applied to the initial segmentation image and small connected regions are removed to obtain a secondary segmentation image. The object centroid, object area, illumination intensity and distance are then computed, a biased dilation structuring element is constructed, and a shadow compensation coefficient is calculated; the secondary segmentation image is processed to obtain a fourth segmentation image, and the fourth segmentation image is combined with the secondary segmentation image to obtain a fifth segmentation image. The method can effectively segment the pig body in live pig images taken in relatively complex pig house environments, can process data continuously in batches, and has a degree of self-adjustment capability, making it suitable for image processing of continuous, long-term live pig monitoring images and helping to realize automatic monitoring of live pigs.

Description

Live pig image segmentation method in a complex environment
Technical field
The present invention relates to a live pig image segmentation method, and in particular to a live pig image segmentation method for complex environments.
Background technique
Pig breeding is gradually shifting from traditional extensive farming to more environmentally friendly intensive farming. To make pig breeding more efficient, it is increasingly necessary to monitor pigs and detect their physiological state. Traditionally, pigs have been monitored manually; because the pig house environment is hot and foul-smelling, long-term exposure is harmful to the health of staff, and staff cannot keep watch at all times, so responses to abnormal conditions of the pigs are slow. Using a machine vision system to acquire live pig images and applying image processing techniques to detect the state of the pigs is therefore a better alternative to manual monitoring.
In recent years, owing to technological progress, research on animal state detection using machine vision has made considerable progress (Wu Zhilei. Pig abnormal breathing detection based on machine vision [D]. Jiangsu University, 2011). However, most studies use a relatively simple environment, or one that has been substantially modified to facilitate image segmentation and subsequent processing, and the data used are often short-term, so that variations in the images are small and the image processing methods employed are relatively simple. The environments of many current pig farms are complex, and an effective image segmentation method for extracting the pig body is lacking.
Summary of the invention
To solve the problems existing in the background art, the purpose of the present invention is to provide a live pig image segmentation method for complex environments. A primary segmentation image is obtained from the live pig image by background differencing and threshold segmentation, and its centroid together with the light source information of the image is then used to perform shadow compensation, thereby obtaining the segmented live pig image and avoiding the failure of traditional image processing methods in complex environments.
The technical solution adopted by the present invention to solve this technical problem comprises the following steps:
1) A camera is mounted at a fixed position and live pig images are acquired in real time, and a light source reference point CL is obtained by image processing;
2) A rectangular coordinate system is established with the vertical direction of the image to be segmented as the x-axis and the horizontal direction as the y-axis;
3) The live pig image to be segmented is converted to grayscale to obtain a live pig grayscale image I, the background image BK is subtracted from the live pig grayscale image I to obtain a difference image DI, and the live pig grayscale image I is processed to obtain a live pig gradient map;
4) Pixels whose gray value in the difference image is greater than a gray threshold and whose gradient value in the live pig gradient map is greater than a gradient threshold are labeled 1, and the remaining pixels are labeled 0, giving an initial segmentation image Bw1; in a specific implementation the gray threshold and the gradient threshold may be taken as 40 and 0.8 respectively.
5) A closing operation is applied to the initial segmentation image Bw1 using a morphological structuring element M, and connected regions whose area is smaller than an area threshold are then removed, giving a secondary segmentation image Bw2; in a specific implementation the area threshold may be taken as 50000.
6) For the secondary segmentation image Bw2, the object centroid CB and the object area AB of its target object are calculated, and the gray value at the light source reference point CL is taken as the illumination intensity L;
7) The distance D is calculated, a biased dilation structuring element MS is constructed, and a shadow compensation coefficient S is calculated;
8) The secondary segmentation image Bw2 of step 5) is processed using the results of step 7) to obtain a fourth segmentation image Bw4, and the fourth segmentation image Bw4 is combined with the secondary segmentation image Bw2 and further processed to obtain a fifth segmentation image Bw5 as the final segmentation result.
The live pig image is a top-view or side-view photograph of a live pig.
The light source reference point CL in step 1) is obtained as follows: live pig images covering a full 24-hour day are converted to grayscale images, the variance of the gray value of each pixel over the 24 hours is calculated, and the position of the pixel with the largest variance is taken as the light source reference point CL.
In step 3), the gradient of each pixel in the live pig grayscale image I is specifically computed by convolving the image with the Laplacian operator l, thus obtaining the live pig gradient map, where Grad(i, j) is the gradient value of the pixel with row coordinate i and column coordinate j, f(i, j) is its gray value, l is the Laplacian operator, m is the row coordinate within the operator and n is the column coordinate within the operator; the Laplacian operator l is a fixed matrix kernel.
In step 7), the shadow angle α and the distance D between the object centroid CB and the light source reference point CL are calculated, where α is the shadow angle, D is the distance between the object centroid CB and the light source reference point CL, x_CB and y_CB are the x and y coordinates of the object centroid CB, and x_CL and y_CL are the x and y coordinates of the light source reference point CL.
The biased dilation structuring element MS in step 7) is constructed as follows: an all-zero matrix is created whose side length is the rounded value of (D/10 + 1); taking the matrix center as the origin, a ray is drawn toward the lower left so that the angle between the ray and the horizontal direction of the image equals the shadow angle α, and the matrix elements intersected by the ray are set to 1, giving the biased dilation structuring element MS.
The shadow angle α is the angle between the line connecting the light source reference point CL and the object centroid CB and the horizontal direction of the image.
The shadow compensation coefficient S in step 7) is computed from the following quantities: S_0 is the baseline compensation coefficient; S_AB is the reference area coefficient, which scales the size term, with AB the object area; a_AB is the area attenuation coefficient, which controls the influence of area changes on the compensation coefficient; S_D is the reference distance coefficient, which scales the distance term; a_D is the distance attenuation coefficient, which controls the influence of distance changes on the compensation coefficient; S_L is the baseline illumination coefficient, which scales the light intensity. The specific values of these coefficients depend on the actual conditions.
Step 8) specifically comprises:
8.1) The secondary segmentation image Bw2 of step 5) is dilated with the biased dilation structuring element MS of step 7) to obtain a third segmentation image Bw3; the live pig grayscale image I is masked with the third segmentation image Bw3 to obtain a secondary live pig grayscale image; the secondary segmentation image Bw2 is inverted and used to mask the secondary live pig grayscale image to obtain a third live pig grayscale image; the third live pig grayscale image is multiplied by the shadow compensation coefficient S of step 7), the background image BK is subtracted, and threshold segmentation is performed; the resulting image plus the secondary segmentation image Bw2 is the fourth segmentation image Bw4.
8.2) If the fourth segmentation image Bw4 is identical to the secondary segmentation image Bw2, the next step is carried out; otherwise the secondary segmentation image Bw2 is replaced by the fourth segmentation image Bw4 and step 8.1) is repeated until the fourth segmentation image Bw4 is identical to the secondary segmentation image Bw2;
8.3) A morphological operation is applied to the fourth segmentation image Bw4 using the morphological structuring element M again to obtain a fifth segmentation image Bw5 as the final segmentation result.
The morphological structuring element M used in steps 5) and 8.3) is a fixed matrix kernel.
The background image BK in steps 3) and 8.1) is the initial background image when the first frame is processed, and it is updated in real time at each subsequent frame to serve as the background image for the next frame: the fifth segmentation image Bw5 obtained after processing the current frame is dilated with a circular structuring element (a structuring element whose border is 0 and whose circular region is 1) to obtain a background update template Bw6; the background update template Bw6 is inverted and used to mask the live pig grayscale image I, giving a first intermediate mask image; the background update template Bw6 is used to mask the background image BK of the current frame, giving a second intermediate mask image; the first and second intermediate mask images are added to obtain the new background image BK used for processing the next frame.
The initial background image is obtained as follows: 10 consecutive seconds of live pig images are converted to grayscale images, and the mean gray value of each pixel over the 10 seconds is calculated to obtain the initial background image.
The beneficial effects of the present invention are:
The present invention obtains a primary segmentation image from the live pig image by background differencing and threshold segmentation, and then uses its centroid and the light source information of the image to perform shadow compensation, thereby obtaining the segmented live pig image and avoiding the failure of traditional image processing methods in complex environments.
The method of the present invention can effectively segment the pig body in live pig images captured in relatively complex pig house environments, can process data continuously in batches, and has a degree of self-adjustment capability; it is suitable for image processing of continuous, long-term live pig monitoring images and helps to realize automatic monitoring of live pigs.
Detailed description of the invention
Fig. 1 is a flow block diagram of the processing method of the present invention.
Fig. 2 is an original image processed by the present invention.
Fig. 3 is the secondary segmentation image produced by the present invention.
Fig. 4 is a schematic diagram of the angle and distance used in the shadow compensation of the present invention.
Fig. 5 is an example of the biased dilation structuring element used by the present invention.
Fig. 6 is the final segmentation image produced by the present invention.
Specific embodiment
The present invention will be further described with reference to the accompanying drawings and examples.
As shown in Fig. 1, the detailed process of the embodiment of the present invention is as follows:
1) A camera is mounted at a fixed position and the top-view live pig image shown in Fig. 2 is acquired in real time, and the light source reference point CL and the initial background image BK are obtained:
Live pig images covering a full 24-hour day are converted to grayscale images, the variance of the gray value of each pixel over the 24 hours is calculated, and the position of the pixel with the largest variance is taken as the light source reference point CL.
10 consecutive seconds of live pig images are converted to grayscale images, and the mean gray value of each pixel over the 10 seconds is calculated and used as the background image BK.
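Both reference quantities can be computed directly from stacks of grayscale frames. A minimal sketch in Python with NumPy follows; the frame-stack arrays and function names are illustrative only, not part of the patent.

```python
import numpy as np

def light_source_reference_point(frames_24h):
    """frames_24h: array of shape (N, H, W), grayscale frames spanning 24 hours."""
    variance = frames_24h.astype(np.float64).var(axis=0)        # per-pixel variance over time
    row, col = np.unravel_index(np.argmax(variance), variance.shape)
    return row, col                                              # light source reference point CL

def initial_background(frames_10s):
    """frames_10s: array of shape (M, H, W), grayscale frames covering 10 consecutive seconds."""
    return frames_10s.astype(np.float64).mean(axis=0)            # initial background image BK
```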
2) A rectangular coordinate system is established with the vertical direction of the image to be segmented as the x-axis and the horizontal direction as the y-axis;
3) The live pig image to be segmented is converted to grayscale to obtain the live pig grayscale image I; the background image BK is subtracted from the live pig grayscale image I to obtain the difference image DI, and the gradient of each pixel in the live pig grayscale image I is calculated by convolution with the Laplacian operator l to obtain the live pig gradient map, where Grad(i, j) is the gradient value of the pixel with row coordinate i and column coordinate j, f(i, j) is its gray value, l is the Laplacian operator, m is the row coordinate within the operator and n is the column coordinate within the operator; the Laplacian operator l is a fixed matrix kernel.
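A plausible reconstruction of the omitted gradient formula and kernel, assuming an absolute-value convolution with the common 4-connected Laplacian kernel (the kernel actually specified in the patent may differ), is:

```latex
\mathrm{Grad}(i,j) \;=\; \Bigl|\sum_{m}\sum_{n} f(i+m,\, j+n)\; l(m,n)\Bigr|,
\qquad
l \;=\; \begin{pmatrix} 0 & 1 & 0 \\ 1 & -4 & 1 \\ 0 & 1 & 0 \end{pmatrix}
```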
4) Pixels whose gray value in the difference image is greater than the gray threshold and whose gradient value in the live pig gradient map is greater than the gradient threshold are labeled 1, and the remaining pixels are labeled 0; in this specific implementation the gray threshold and the gradient threshold are taken as 40 and 0.8 respectively, giving the initial segmentation image Bw1.
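A minimal sketch of steps 3) and 4) in Python with NumPy and SciPy, using the thresholds suggested above and the assumed Laplacian kernel; the function and variable names are illustrative only.

```python
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=np.float64)   # assumed 4-connected kernel

def initial_segmentation(gray_I, background_BK, gray_thresh=40, grad_thresh=0.8):
    """Steps 3) and 4): difference image, gradient map, and dual thresholding."""
    diff_DI = gray_I.astype(np.float64) - background_BK             # difference image DI
    grad = np.abs(convolve(gray_I.astype(np.float64), LAPLACIAN))   # live pig gradient map
    bw1 = (diff_DI > gray_thresh) & (grad > grad_thresh)            # initial segmentation Bw1
    return bw1.astype(np.uint8)
```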
5) A closing operation is applied to the initial segmentation image Bw1 using the morphological structuring element M, and connected regions whose area is smaller than the area threshold are removed; in this specific implementation the area threshold is taken as 50000, giving the secondary segmentation image Bw2, as shown in Fig. 3. For ease of observation, the black part of the figure corresponds to pixels labeled 1 (the foreground obtained by segmentation) and the white part corresponds to pixels labeled 0.
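A minimal sketch of step 5) using OpenCV; the shape and size of the structuring element M are assumptions, since the patent gives M only as a fixed matrix.

```python
import cv2
import numpy as np

def secondary_segmentation(bw1, struct_M=None, area_thresh=50000):
    """Step 5): morphological closing with M, then removal of small connected regions."""
    if struct_M is None:
        # the patent specifies a fixed matrix M; a square element is assumed here
        struct_M = np.ones((15, 15), dtype=np.uint8)
    closed = cv2.morphologyEx(bw1.astype(np.uint8), cv2.MORPH_CLOSE, struct_M)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(closed, connectivity=8)
    bw2 = np.zeros_like(closed)
    for k in range(1, n):                                    # label 0 is the background
        if stats[k, cv2.CC_STAT_AREA] >= area_thresh:
            bw2[labels == k] = 1
    return bw2
```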
6) For the secondary segmentation image Bw2, the object centroid CB and the object area AB of its target object are calculated, and the gray value at the light source reference point CL is taken as the illumination intensity L;
7) As shown in Fig. 4, the shadow angle α and the distance D between the object centroid CB and the light source reference point CL are calculated, where α is the shadow angle, D is the distance between the object centroid CB and the light source reference point CL, x_CB and y_CB are the x and y coordinates of the object centroid CB, and x_CL and y_CL are the x and y coordinates of the light source reference point CL.
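A straightforward reconstruction of the omitted formula from the symbol definitions above, assuming the standard Euclidean distance and the arctangent of the coordinate differences (the exact pairing inside the arctangent depends on the patent's axis convention, in which the x-axis is vertical), is:

```latex
D = \sqrt{(x_{CB}-x_{CL})^{2} + (y_{CB}-y_{CL})^{2}},
\qquad
\alpha = \arctan\frac{\,y_{CB}-y_{CL}\,}{\,x_{CB}-x_{CL}\,}
```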
A biased dilation structuring element MS is then constructed: an all-zero matrix is created whose side length is the rounded value of (D/10 + 1); taking the matrix center as the origin, a ray is drawn toward the lower left so that the angle between the ray and the horizontal direction of the image equals the shadow angle α, and the matrix elements intersected by the ray are set to 1, giving the biased dilation structuring element MS. Fig. 5 shows an example of the structuring element generated when α is 45° and D is 100.
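A minimal sketch of the construction of MS; the rasterization of the ray and the rounding convention are assumptions.

```python
import numpy as np

def biased_structuring_element(D, alpha_deg):
    """Step 7): biased dilation structuring element MS.
    An all-zero square matrix of side round(D/10 + 1); a ray from the center toward the
    lower left at the shadow angle alpha marks the elements that are set to 1."""
    side = int(round(D / 10 + 1))
    ms = np.zeros((side, side), dtype=np.uint8)
    cx = cy = side // 2                                  # matrix center as origin
    alpha = np.deg2rad(alpha_deg)
    for r in np.arange(0, side, 0.5):                    # march along the ray in small steps
        col = int(round(cx - r * np.cos(alpha)))         # toward the left
        row = int(round(cy + r * np.sin(alpha)))         # toward the bottom
        if 0 <= row < side and 0 <= col < side:
            ms[row, col] = 1
        else:
            break
    return ms
```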
The shadow compensation coefficient S is then calculated, where S_0 is the baseline compensation coefficient; S_AB is the reference area coefficient, which scales the size term, with AB the object area; a_AB is the area attenuation coefficient, which controls the influence of area changes on the compensation coefficient; S_D is the reference distance coefficient, which scales the distance term; a_D is the distance attenuation coefficient, which controls the influence of distance changes on the compensation coefficient; S_L is the baseline illumination coefficient, which scales the light intensity. In this calculation the baseline compensation coefficient S_0 is taken as 3, the reference area coefficient S_AB as 10000, the area attenuation coefficient a_AB as 2, the reference distance coefficient S_D as 1000, the distance attenuation coefficient a_D as 0.5, and the baseline illumination coefficient S_L as 150.
8) The secondary segmentation image Bw2 of step 5) is dilated with the biased dilation structuring element MS of step 7) to obtain the third segmentation image Bw3; the live pig grayscale image I is masked with the third segmentation image Bw3 to obtain a secondary live pig grayscale image; the secondary segmentation image Bw2 is inverted and used to mask the secondary live pig grayscale image to obtain a third live pig grayscale image; the third live pig grayscale image is multiplied by the shadow compensation coefficient S of step 7), the background image BK is subtracted, and threshold segmentation is performed; the resulting image plus the secondary segmentation image Bw2 is the fourth segmentation image Bw4.
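A minimal sketch of one pass of this step; the threshold applied after compensation (assumed here to equal the gray threshold of step 4) and the helper names are illustrative only.

```python
import cv2
import numpy as np

def shadow_compensated_pass(bw2, gray_I, background_BK, ms, S, gray_thresh=40):
    """One pass of step 8.1): dilate Bw2 with MS, mask, compensate shadows, re-threshold."""
    bw3 = cv2.dilate(bw2.astype(np.uint8), ms)                   # third segmentation image Bw3
    secondary_gray = gray_I.astype(np.float64) * bw3             # mask I with Bw3
    third_gray = secondary_gray * (1 - bw2)                      # mask with inverted Bw2
    compensated = third_gray * S - background_BK                 # shadow compensation and background subtraction
    bw4 = ((compensated > gray_thresh) | (bw2 > 0)).astype(np.uint8)   # re-threshold, then add Bw2
    return bw4
```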
If the fourth segmentation image Bw4 is identical to the secondary segmentation image Bw2, the next step is carried out; otherwise the secondary segmentation image Bw2 is replaced by the fourth segmentation image Bw4 and step 8.1) is repeated until the fourth segmentation image Bw4 is identical to the secondary segmentation image Bw2.
A morphological operation is then applied to the fourth segmentation image Bw4 using the morphological structuring element M again to obtain the fifth segmentation image Bw5 as the final segmentation result, shown in Fig. 6. Compared with Fig. 3, the shaded side of the pig body is clearly segmented into the foreground, and the overall outline is also fuller.
The morphological structuring element M used in steps 5) and 8.3) of the embodiment is a fixed matrix kernel.
The background image BK in steps 3) and 8) of the embodiment is the initial background image when the first frame is processed, and it is updated in real time at each subsequent frame to serve as the background image for the next frame: the fifth segmentation image Bw5 obtained after processing the current frame is dilated with a circular structuring element (a structuring element whose border is 0 and whose circular region is 1) to obtain the background update template Bw6; the background update template Bw6 is inverted and used to mask the live pig grayscale image I, giving a first intermediate mask image; the background update template Bw6 is used to mask the background image BK of the current frame, giving a second intermediate mask image; the first and second intermediate mask images are added to obtain the new background image BK used for processing the next frame, so as to adapt the image segmentation to the continuously changing complex environment.
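A minimal sketch of this background update; the radius of the circular structuring element is an assumed value, since the patent does not fix its size.

```python
import cv2
import numpy as np

def update_background(bw5, gray_I, background_BK, radius=25):
    """Background update: keep the old background inside the dilated pig region,
    refresh it from the current frame elsewhere. The radius is an assumed value."""
    circle = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (2 * radius + 1, 2 * radius + 1))
    bw6 = cv2.dilate(bw5.astype(np.uint8), circle)               # background update template Bw6
    first = gray_I.astype(np.float64) * (1 - bw6)                # current frame outside the pig region
    second = background_BK.astype(np.float64) * bw6              # old background inside the pig region
    return first + second                                        # new background image BK
```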

Claims (9)

1. A live pig image segmentation method in a complex environment, characterized by comprising:
1) mounting a camera at a fixed position, acquiring live pig images in real time, and obtaining a light source reference point CL by image processing;
2) establishing a rectangular coordinate system with the vertical direction of the image to be segmented as the x-axis and the horizontal direction as the y-axis;
3) converting the live pig image to be segmented to grayscale to obtain a live pig grayscale image I, subtracting a background image BK from the live pig grayscale image I to obtain a difference image DI, and processing the live pig grayscale image I to obtain a live pig gradient map;
4) labeling as 1 the pixels whose gray value in the difference image is greater than a gray threshold and whose gradient value in the live pig gradient map is greater than a gradient threshold, and labeling the remaining pixels as 0, to obtain an initial segmentation image Bw1;
5) applying a closing operation to the initial segmentation image Bw1 using a morphological structuring element M, and then removing connected regions whose area is smaller than an area threshold, to obtain a secondary segmentation image Bw2;
6) for the secondary segmentation image Bw2, calculating the object centroid CB and the object area AB of its target object, and taking the gray value at the light source reference point CL as the illumination intensity L;
7) calculating the distance D, constructing a biased dilation structuring element MS, and calculating a shadow compensation coefficient S;
wherein in step 7) the shadow angle α and the distance D between the object centroid CB and the light source reference point CL are calculated, α being the shadow angle, D the distance between the object centroid CB and the light source reference point CL, x_CB and y_CB the x and y coordinates of the object centroid CB, and x_CL and y_CL the x and y coordinates of the light source reference point CL;
8) processing the secondary segmentation image Bw2 of step 5) using the results of step 7) to obtain a fourth segmentation image Bw4, and further processing the fourth segmentation image Bw4 in combination with the secondary segmentation image Bw2 to obtain a fifth segmentation image Bw5 as the final segmentation result.
2. The live pig image segmentation method in a complex environment according to claim 1, characterized in that the live pig image is a top-view or side-view photograph of a live pig.
3. The live pig image segmentation method in a complex environment according to claim 1, characterized in that the light source reference point CL in step 1) is obtained as follows: live pig images covering a full 24-hour day are converted to grayscale images, the variance of the gray value of each pixel over the 24 hours is calculated, and the position of the pixel with the largest variance is taken as the light source reference point CL.
4. The live pig image segmentation method in a complex environment according to claim 1, characterized in that in step 3) the gradient of each pixel in the live pig grayscale image I is computed by convolution with the Laplacian operator l to obtain the live pig gradient map, where Grad(i, j) is the gradient value of the pixel with row coordinate i and column coordinate j, f(i, j) is its gray value, l is the Laplacian operator, m is the row coordinate within the operator and n is the column coordinate within the operator; and the Laplacian operator l is a fixed matrix kernel.
5. The live pig image segmentation method in a complex environment according to claim 1, characterized in that the biased dilation structuring element MS in step 7) is constructed as follows: an all-zero matrix is created whose side length is the rounded value of (D/10 + 1); taking the matrix center as the origin, a ray is drawn toward the lower left so that the angle between the ray and the horizontal direction of the image equals the shadow angle α, and the matrix elements intersected by the ray are set to 1, giving the biased dilation structuring element MS.
6. The live pig image segmentation method in a complex environment according to claim 1 or 5, characterized in that the shadow angle α is the angle between the line connecting the light source reference point CL and the object centroid CB and the horizontal direction of the image.
7. The live pig image segmentation method in a complex environment according to claim 1, characterized in that the shadow compensation coefficient S in step 7) is computed from the following quantities: S_0, the baseline compensation coefficient; S_AB, the reference area coefficient, with AB the object area; a_AB, the area attenuation coefficient; S_D, the reference distance coefficient; a_D, the distance attenuation coefficient; and S_L, the baseline illumination coefficient.
8. The live pig image segmentation method in a complex environment according to claim 1, characterized in that step 8) specifically comprises:
8.1) dilating the secondary segmentation image Bw2 of step 5) with the biased dilation structuring element MS of step 7) to obtain a third segmentation image Bw3; masking the live pig grayscale image I with the third segmentation image Bw3 to obtain a secondary live pig grayscale image; inverting the secondary segmentation image Bw2 and using it to mask the secondary live pig grayscale image to obtain a third live pig grayscale image; multiplying the third live pig grayscale image by the shadow compensation coefficient S of step 7), subtracting the background image BK, and performing threshold segmentation, the resulting image plus the secondary segmentation image Bw2 being the fourth segmentation image Bw4;
8.2) if the fourth segmentation image Bw4 is identical to the secondary segmentation image Bw2, proceeding to the next step; otherwise replacing the secondary segmentation image Bw2 with the fourth segmentation image Bw4 and repeating 8.1) until the fourth segmentation image Bw4 is identical to the secondary segmentation image Bw2;
8.3) applying a morphological operation to the fourth segmentation image Bw4 using the morphological structuring element M again to obtain the fifth segmentation image Bw5 as the final segmentation result.
9. The live pig image segmentation method in a complex environment according to claim 8, characterized in that the morphological structuring element M in steps 5) and 8.3) is a fixed matrix kernel; and
the background image BK in steps 3) and 8.1) is an initial background image when the first frame is processed and is updated in real time at each subsequent frame to serve as the background image for the next frame: the fifth segmentation image Bw5 obtained after processing the current frame is dilated with a circular structuring element to obtain a background update template Bw6; the background update template Bw6 is inverted and used to mask the live pig grayscale image I, giving a first intermediate mask image; the background update template Bw6 is used to mask the background image BK of the current frame, giving a second intermediate mask image; and the first and second intermediate mask images are added to obtain the new background image BK used for processing the next frame.
CN201610493686.1A 2016-06-24 2016-06-24 Live pig image segmentation method in a complex environment Active CN106204537B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610493686.1A CN106204537B (en) 2016-06-24 2016-06-24 Live pig image segmentation method in a complex environment

Publications (2)

Publication Number Publication Date
CN106204537A CN106204537A (en) 2016-12-07
CN106204537B (en) 2019-05-21

Family

ID=57462364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610493686.1A Active CN106204537B (en) 2016-06-24 2016-06-24 Live pig image segmentation method in a complex environment

Country Status (1)

Country Link
CN (1) CN106204537B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107986127A (en) * 2017-11-20 2018-05-04 Jiangsu Province Special Equipment Safety Supervision and Inspection Institute A method for detecting trapped passengers in an elevator
CN109886984B (en) * 2019-01-22 2021-01-08 Zhejiang University Accurate image segmentation method using foreground-background gray difference and a deep learning network
CN109886271B (en) * 2019-01-22 2021-01-26 Zhejiang University Accurate image segmentation method integrating a deep learning network and improved edge detection
CN110539312A (en) * 2019-08-29 2019-12-06 Nanjing Yuzhi Intelligent Technology Co., Ltd. Efficient and accurate livestock and poultry meat cutting robot

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005034618A1 (en) * 2003-10-10 2005-04-21 Ab Svenska Mätanalys Method and device for the monitoring of pigs
CN104252709A (en) * 2014-07-14 2014-12-31 Jiangsu University Multiple-target foreground detection method for top-view group-housed pigs under a complicated background
CN104504704A (en) * 2014-12-24 2015-04-08 Jiangsu University Multiple top-view herded pig target extraction method with adaptive block-wise multi-threshold segmentation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20211123

Address after: 230000 room 1201, building 1, University Sanchuang Park, No. 478, Jiulong Road, economic and Technological Development Zone, Hefei, Anhui Province

Patentee after: Hefei Shenmu Information Technology Co.,Ltd.

Address before: 310027 No. 38, Zhejiang Road, Hangzhou, Zhejiang, Xihu District

Patentee before: ZHEJIANG University

TR01 Transfer of patent right