WO2023276226A1 - Crop row detection system, agricultural machine including crop row detection system, and crop row detection method - Google Patents
Crop row detection system, agricultural machine including crop row detection system, and crop row detection method
- Publication number
- WO2023276226A1 (PCT application PCT/JP2022/004547)
- Authority
- WO
- WIPO (PCT)
Classifications
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B69/00—Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
- A01B69/001—Steering by means of optical assistance, e.g. television cameras
- A01B69/007—Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow
- A01B69/008—Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow automatic
Definitions
- The present disclosure relates to a crop row detection system, an agricultural machine including the crop row detection system, and a crop row detection method.
- Vision guidance systems are being developed that detect crop rows or ridges in a field using an imaging device such as a camera, and control the travel of a work vehicle along the detected crop rows or ridges.
- Patent Document 1 discloses a working machine that travels along ridges in cultivated land where crops are planted on ridges formed in rows.
- Patent Document 1 describes generating a planar projective-transformed image after binarizing an original image obtained by photographing the cultivated land obliquely from above with an in-vehicle camera.
- In Patent Document 1, a large number of rotated images with different orientations are generated by rotating the planar projective-transformed image, in order to detect a working passage between ridges.
- In such vision-based systems, however, detection accuracy may decrease due to disturbance factors such as sunlight conditions.
- the present disclosure provides a crop row detection system, an agricultural machine including the crop row detection system, and a crop row detection method that can solve such problems.
- A crop row detection system in an exemplary, non-limiting embodiment includes an imaging device that is mounted on an agricultural machine, photographs the ground on which the agricultural machine travels, and acquires time-series color images including at least a portion of the ground, and a processing device that performs image processing on the time-series color images.
- The processing device generates, from the time-series color images, an enhanced image in which the color of the crop rows to be detected is enhanced; generates, from the enhanced image, a top-view image of the ground as seen from above, classified into first pixels whose crop-row color index value is equal to or greater than a threshold and second pixels whose index value is less than the threshold; and determines the positions of the edge lines of the crop rows based on the index values of the first pixels.
- An agricultural machine in an exemplary, non-limiting embodiment includes the crop row detection system described above, a traveling device including steered wheels, and an automatic steering device that controls the steering angle of the steered wheels based on the positions of the edge lines of the crop rows determined by the crop row detection system.
- A crop row detection method in an exemplary, non-limiting embodiment is a computer-implemented method that causes a computer to: acquire, from an imaging device attached to an agricultural machine, time-series color images including at least part of the ground; generate, from the time-series color images, an enhanced image in which the color of the crop rows to be detected is enhanced; generate, from the enhanced image, a top-view image of the ground as seen from above, classified into first pixels whose crop-row color index value is equal to or greater than a threshold and second pixels whose index value is less than the threshold; and determine the positions of the edge lines of the crop rows based on the index values of the first pixels.
- a generic or specific aspect of the present disclosure can be realized by an apparatus, system, method, integrated circuit, computer program, or computer-readable non-transitory storage medium, or any combination thereof.
- a computer-readable storage medium may include both volatile and non-volatile storage media.
- A device may consist of a plurality of devices. When a device is composed of two or more devices, those devices may be arranged within a single apparatus, or may be divided between two or more separate apparatuses.
- FIG. 1 is a diagram schematically showing how an imaging device attached to an agricultural machine captures an image of the ground;
- FIG. 2 is a perspective view schematically showing the relationship between a body coordinate system Σb and a camera coordinate system Σc fixed with respect to the agricultural machine, and a world coordinate system Σw fixed with respect to the ground;
- FIG. 3 is a top view schematically showing a portion of a field in which a plurality of crop rows are provided on the ground;
- FIG. 4 is a diagram schematically showing an example of an image acquired by the imaging device of the agricultural machine shown in FIG. 3;
- FIG. 5 is a top view schematically showing a state in which the position and orientation (angle in the yaw direction) of the agricultural machine have been adjusted;
- FIG. 6 is a diagram showing an example of an image acquired by the imaging device of the agricultural machine in the state of FIG. 5;
- FIG. 7 is a block diagram showing a basic configuration example of a crop row detection system according to an embodiment of the present disclosure;
- FIG. 8 is a block diagram schematically showing a configuration example of a processing device according to an embodiment of the present disclosure;
- FIG. 9 is a monochrome image corresponding to one frame of the time-series color images acquired by an on-vehicle camera mounted on a tractor;
- FIG. 10 is a diagram showing an enhanced image obtained from the image of FIG. 9;
- FIG. 11 is a histogram of the excess green index (ExG) in the image of FIG. 10;
- FIG. 12 is a diagram showing an example of a top-view image (bird's-eye view image) classified into first pixels (for example, crop pixels) and second pixels (background pixels);
- FIG. 13 is a perspective view schematically showing the positional relationship between each of the camera coordinate systems Σc1 and Σc2 and the reference plane Re;
- FIG. 14 is a schematic diagram showing an example in which the direction of the crop rows and the direction of the scanning lines in the top-view image are parallel;
- FIG. 15 is a diagram schematically showing an example of an integrated-value histogram obtained for the top-view image of FIG. 14;
- FIG. 16 is a schematic diagram showing an example in which the direction of the crop rows and the direction of the scanning lines in the top-view image intersect;
- FIG. 17 is a diagram schematically showing an example of an integrated-value histogram obtained for the top-view image of FIG. 16;
- FIG. 18 is a flow chart illustrating an example algorithm by which the processing device determines crop row edge lines in an embodiment of the present disclosure;
- FIG. 19 is a diagram showing an integrated-value histogram obtained from the top-view image of FIG. 12;
- FIG. 20 is a block diagram showing processing executed by a processing device according to an embodiment of the present disclosure;
- FIG. 21 is a diagram for explaining how a top-view image is divided into a plurality of blocks;
- FIG. 22 is a diagram schematically showing the relationship between the position of a scanning line and the integrated value of index values in each block in FIG. 21;
- FIG. 23 is a diagram showing an example of a crop row center in each block in FIG. 22 and an approximate line for the crop row centers;
- FIG. 24 is a top view showing an example of edge lines of crop rows determined based on the approximate lines of FIG. 23;
- FIG. 26 is a diagram schematically showing the relationship between the position of a scanning line and the integrated value (histogram) of index values in each block in FIG. 25;
- FIG. 27 is a diagram showing an example of a crop row center in each block in FIG. 26 and an approximate line for the crop row centers;
- FIG. 28 is a top view showing an example of edge lines of crop rows determined based on the approximate curve of FIG. 27;
- FIG. 29 is a perspective view showing an example of the appearance of an agricultural machine according to an embodiment of the present disclosure;
- FIG. 30 is a side view schematically showing an example of the agricultural machine with a work implement attached;
- FIG. 31 is a block diagram showing an example of a schematic configuration of the agricultural machine and the work implement.
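The block-wise procedure sketched in FIGS. 21 to 28 (divide the top-view image into blocks, locate a crop row center in each block from the integrated index values along scanning lines, then fit an approximate line through the centers) can be illustrated roughly as follows. This is a minimal NumPy sketch on a synthetic index image; the image contents, block count, and function name are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

def row_centers_by_block(index_img, n_blocks=4):
    """Split the top-view index image into horizontal bands (blocks) and,
    in each block, take the index-weighted centroid of the column sums
    as the crop row center for that band."""
    h, w = index_img.shape
    rows_mid, centers = [], []
    for b in range(n_blocks):
        band = index_img[b * h // n_blocks:(b + 1) * h // n_blocks]
        col = band.sum(axis=0)                 # integrated value per scanning line
        if col.sum() == 0:
            continue                           # no crop pixels in this block
        centers.append((col * np.arange(w)).sum() / col.sum())
        rows_mid.append((b + 0.5) * h / n_blocks)
    return np.array(rows_mid), np.array(centers)

# Synthetic index image: a single crop row whose center drifts to the right.
img = np.zeros((40, 60))
for y in range(40):
    c = 20 + y // 8                            # center column drifts with depth
    img[y, c - 3:c + 4] = 1.0

ys, cs = row_centers_by_block(img, n_blocks=4)
slope, intercept = np.polyfit(ys, cs, 1)       # approximate line through centers
```

Shifting the fitted line left and right by half the row width would then give candidate edge lines, in the spirit of FIGS. 23 and 24.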
- Agricultural machinery in the present disclosure broadly includes machines that perform basic agricultural work such as “plowing”, “planting”, and “harvesting” in fields.
- Agricultural machines are machines having functions and structures for performing agricultural work such as tillage, sowing, pest control, fertilization, planting of crops, or harvesting on the ground in a field. These agricultural operations are sometimes referred to as “ground operations” or simply “operations.”
- the agricultural machine does not need to be equipped with a travel device for moving itself, and may travel by being attached to or towed by another vehicle equipped with a travel device.
- A work vehicle such as a tractor may function alone as an "agricultural machine," or the work machine (implement) attached to or towed by the work vehicle and the work vehicle as a whole may be regarded as one "agricultural machine."
- Examples of agricultural machines include tractors, riding management machines, vegetable transplanters, mowers, and field mobile robots.
- the crop row detection system in this embodiment includes an imaging device attached to an agricultural machine.
- the imaging device is fixed to the agricultural machine so as to photograph the ground on which the agricultural machine travels and acquire time-series color images including at least a portion of the ground.
- FIG. 1 schematically shows how an imaging device 120 attached to an agricultural machine 100 such as a tractor or a riding management machine captures an image of the ground 10 .
- the agricultural machine 100 includes a vehicle body 110 that can travel, and an imaging device 120 is fixed to the vehicle body 110 .
- FIG. 1 shows a body coordinate system ⁇ b having mutually orthogonal Xb-, Yb-, and Zb-axes.
- the body coordinate system ⁇ b is a coordinate system fixed to the agricultural machine 100, and the origin of the body coordinate system ⁇ b can be set near the center of gravity of the agricultural machine 100, for example. In the drawing, for ease of viewing, the origin of the body coordinate system ⁇ b is shown as being positioned outside the agricultural machine 100 .
- the Xb axis coincides with the traveling direction (the direction of arrow F) when the agricultural machine 100 travels straight.
- the Yb axis coincides with the rightward direction when the positive direction of the Xb axis is viewed from the coordinate origin, and the Zb axis coincides with the vertically downward direction.
- the imaging device 120 is, for example, an in-vehicle camera having a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) image sensor.
- the imaging device 120 in this embodiment is, for example, a monocular camera capable of capturing moving images at a frame rate of 3 frames per second (fps) or higher.
- FIG. 2 is a perspective view schematically showing the relationship between the body coordinate system ⁇ b, the camera coordinate system ⁇ c of the imaging device 120, and the world coordinate system ⁇ w fixed to the ground 10 described above.
- the camera coordinate system ⁇ c has mutually orthogonal Xc, Yc, and Zc axes
- the world coordinate system ⁇ w has mutually orthogonal Xw, Yw, and Zw axes.
- the Xw-axis and Yw-axis of the world coordinate system ⁇ w are on the reference plane Re extending along the ground 10 .
- the imaging device 120 is attached to a predetermined position of the agricultural machine 100 so as to face a predetermined direction. Therefore, the position and orientation of the camera coordinate system ⁇ c with respect to the body coordinate system ⁇ b are fixed in a known state.
- the Zc axis of the camera coordinate system ⁇ c is on the camera optical axis ⁇ 1.
- the camera optical axis ⁇ 1 is inclined from the traveling direction F of the agricultural machine 100 toward the ground 10, and the depression angle ⁇ is greater than 0°.
- a travel direction F of the agricultural machine 100 is generally parallel to the ground 10 on which the agricultural machine 100 travels.
- the depression angle ⁇ can be set, for example, within a range of 0° or more and 60° or less. When the position where the imaging device 120 is attached is close to the ground 10, the angle of depression ⁇ may be set to a negative value, in other words, the orientation of the camera optical axis ⁇ 1 may be set to have a positive elevation angle.
- As the agricultural machine 100 travels, the body coordinate system Σb and the camera coordinate system Σc translate with respect to the world coordinate system Σw, and may also rotate with respect to it.
- In the following description, for simplicity, it is assumed that the agricultural machine 100 does not rotate in the pitch or roll directions and moves substantially parallel to the ground 10.
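To make the geometry concrete, here is a minimal pinhole-projection sketch in NumPy: a camera at an assumed height h above the reference plane Re, pitched down by a depression angle phi, projecting ground points into the image. The axis conventions follow the text (Xw forward, Yw right, Zw vertically downward); the values of h, phi, the focal length, and the principal point are illustrative, not from the disclosure.

```python
import numpy as np

def project_ground_point(px, py, h=1.5, phi=np.deg2rad(30), f=500.0,
                         cu=320.0, cv=240.0):
    """Project a ground point (px forward, py right, on plane Zw = 0) into
    image coordinates (u, v) for a camera at height h above the ground with
    depression angle phi. World axes: Xw forward, Yw right, Zw downward."""
    # vector from the camera center (0, 0, -h) to the ground point, in world coords
    p = np.array([px, py, h])
    # camera basis expressed in world coordinates:
    #   Zc: optical axis tilted down by phi; Xc: image right; Yc: image down
    zc = np.array([np.cos(phi), 0.0, np.sin(phi)])
    xc = np.array([0.0, 1.0, 0.0])
    yc = np.cross(zc, xc)
    # coordinates in the camera frame, then pinhole projection
    x, y, z = p @ xc, p @ yc, p @ zc
    return f * x / z + cu, f * y / z + cv
```

With these conventions, increasing the forward distance px drives the projected point toward the horizon (smaller v), which is why parallel crop rows appear to converge toward a vanishing point in the captured image.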
- FIG. 3 is a top view schematically showing a portion of a field in which a plurality of rows of crops 12 are provided on the ground 10.
- the crop row 12 is a row formed by continuously planting crops in one direction on the ground 10 of a field.
- a crop row 12 is a collection of crops planted on a ridge of a field.
- The shape of a crop row can be complicated, depending on the shapes of the crops and how the crops are arranged.
- the width of the crop row 12 varies as the crop grows.
- Each intermediate region 14 is a region sandwiched between two opposing edge lines E between two adjacent crop rows 12 .
- In some cases, a plurality of crop rows 12 are formed on one ridge; that is, a plurality of crop rows 12 exist between adjacent ridge rows.
- In such cases, the edge lines E of the crop rows 12 positioned at the edges in the width direction of the ridge serve as the references for the intermediate region 14; that is, the intermediate region 14 lies between those edge lines E, out of all the edge lines E of the crop rows 12.
- Since the intermediate region 14 functions as an area (work passage) through which the wheels of the agricultural machine 100 pass, the "intermediate region" may also be referred to as a "work passage."
- the "edge line" of the crop row means a reference line segment (which may include curved lines) for defining the target route when the agricultural machine travels.
- Such reference line segments can be defined as the ends of a strip-shaped area (working path) through which the wheels of the agricultural machine are allowed to pass.
- FIG. 3 schematically shows one agricultural machine 100 that is entering a field in which a row of crops 12 is provided.
- This agricultural machine 100 includes left and right front wheels 104F and left and right rear wheels 104R as traveling devices, and pulls a working machine (implement) 300 .
- The front wheels 104F are steered wheels.
- the work passages 14 located on both sides of the one row of crops 12 located in the center are marked with thick dashed arrows L and R, respectively.
- The front wheels 104F and the rear wheels 104R of the agricultural machine 100 are required to move along the arrows L and R in the work passages 14 so as not to step on the crop rows 12.
- If the edge lines E of the crop rows 12 can be detected, the steering and travel of the agricultural machine 100 can be controlled so that the front wheels 104F and the rear wheels 104R move along the arrows L and R in the work passages 14. Controlling the steering and travel of the agricultural machine 100 based on the edge lines E of the crop rows in this way may be called "row following travel control."
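To make row following travel control concrete, here is a hypothetical proportional steering rule, not the control law of the disclosure: given the lateral positions of the two edge lines bounding the work passage in the body coordinate system (Yb positive to the right, vehicle at y = 0), steer toward the center of the passage.

```python
def steering_command(left_edge_y, right_edge_y, kp=0.8, max_angle=0.6):
    """Proportional steering-angle command (radians) driving the vehicle
    toward the centerline of the work passage bounded by two edge lines.
    Positions are lateral offsets in the body frame; the vehicle is at y = 0.
    kp and max_angle are illustrative tuning values."""
    center = 0.5 * (left_edge_y + right_edge_y)   # passage centerline offset
    angle = kp * center                           # steer toward the centerline
    return max(-max_angle, min(max_angle, angle)) # saturate the steering angle
```

A real implementation would also account for heading error and vehicle kinematics; this sketch only shows how edge-line positions can feed a steering angle.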
- FIG. 4 is a diagram schematically showing an example of an image 40 acquired by the imaging device 120 of the agricultural machine 100 shown in FIG.
- a plurality of rows of crops 12 running parallel on the ground 10 and an intermediate area (working path) 14 theoretically meet at a vanishing point P 0 on the horizon 11 .
- The vanishing point P0 is located on the right side of the image 40 because, as shown in FIG. 3, the traveling direction F of the agricultural machine 100 is inclined with respect to the direction in which the crop rows 12 extend (the direction parallel to the arrow C).
- Even so, by the method described later, the crop rows 12 can be accurately detected from such an image 40 and the edge lines E of the crop rows 12 can be determined. Based on the edge lines E, a route (target route) along which the agricultural machine 100 should travel can then be appropriately generated.
- Automatic steering can then be used to control the travel of the agricultural machine 100 so that the front wheels 104F and the rear wheels 104R move along the arrows L and R in the work passages 14 (row following travel control).
- a positioning system such as GNSS.
- FIG. 5 is a top view schematically showing a state in which the position and orientation (angle in the yaw direction) of the agricultural machine 100 are adjusted by steering the agricultural machine 100 so as to reduce the positional error with respect to the target path (arrow C).
- FIG. 6 is a diagram showing an example of an image 40 acquired by the imaging device 120 of the agricultural machine 100 in such a state.
- the front wheels 104F and the rear wheels 104R of the agricultural machine 100 in the state of FIG. 5 are positioned on the lines indicated by the arrows L and R in the work passage 14, respectively.
- While the agricultural machine 100 travels, the automatic steering device in the agricultural machine 100 controls the steering angle of the front wheels 104F so that the front wheels 104F and the rear wheels 104R do not deviate from the work passages 14.
- A crop row detection system 1000 includes, as shown in the block diagram, an imaging device 120 and a processing device 122.
- The processing device 122 may be connected, for example, to an automatic steering device 124 included in the agricultural machine 100.
- the automatic steering device 124 is included in, for example, an automatic driving device that controls travel of the agricultural machine 100 .
- the processing device 122 can be realized by an electronic control unit (ECU) for image recognition.
- the ECU is an in-vehicle computer.
- the processing device 122 is connected to the imaging device 120 by a serial signal line such as a wire harness so as to receive image data output by the imaging device 120 .
- a part of the image recognition processing executed by the processing device 122 may be executed inside the imaging device 120 (inside the camera module).
- FIG. 8 is a block diagram showing a hardware configuration example of the processing device 122.
- the processing device 122 includes a processor 20 , a ROM (Read Only Memory) 22 , a RAM (Random Access Memory) 24 , a communication device 26 and a storage device 28 . These components are interconnected via bus 30 .
- the processor 20 is a semiconductor integrated circuit and is also called a central processing unit (CPU) or a microprocessor.
- Processor 20 may include a graphics processing unit (GPU).
- the processor 20 sequentially executes a computer program describing a predetermined group of instructions stored in the ROM 22, and implements processing necessary for crop row detection according to the present disclosure.
- a part or all of the processor 20 may be an FPGA (Field Programmable Gate Array) equipped with a CPU, an ASIC (Application Specific Integrated Circuit), or an ASSP (Application Specific Standard Product).
- the communication device 26 is an interface for data communication between the processing device 122 and an external computer.
- the communication device 26 can perform wired communication such as CAN (Controller Area Network), or wireless communication conforming to the Bluetooth (registered trademark) standard and/or the Wi-Fi (registered trademark) standard.
- the storage device 28 can store images acquired from the imaging device 120 or image data in the middle of processing.
- Examples of storage device 28 include a hard disk drive or non-volatile semiconductor memory.
- The hardware configuration of the processing device 122 is not limited to the above example. Some or all of the processing device 122 need not be installed on the agricultural machine 100. By using the communication device 26, one or more computers located outside the agricultural machine 100 can function as part or all of the processing device 122; for example, a networked server computer may function as part or all of the processing device 122. Alternatively, a computer mounted on the agricultural machine 100 may perform all the functions required of the processing device 122.
- such a processing device 122 acquires time-series color images from the imaging device 120 and performs operations S1, S2, and S3 described below.
- S1: Generate, from the time-series color images, an enhanced image in which the color of the crop rows to be detected is enhanced.
- S2: Generate, from the enhanced image, a top-view image of the ground viewed from above, classified into first pixels whose crop-row color index value is equal to or greater than a threshold and second pixels whose index value is less than the threshold.
- S3: Determine the positions of the edge lines of the crop rows based on the index values of the first pixels.
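Operations S1 to S3 can be sketched end to end with NumPy on a synthetic image. Otsu's method is used here as one plausible way to choose the threshold; the disclosure only requires that some threshold separate the first and second pixels, and the colors, image size, and edge-extraction heuristic below are illustrative assumptions.

```python
import numpy as np

def excess_green(rgb):
    """S1: enhance the crop-row color. ExG = 2g - r - b on normalized values."""
    rgb = rgb.astype(float)
    s = rgb.sum(axis=2)
    s[s == 0] = 1.0                        # avoid division by zero on black pixels
    r, g, b = rgb[..., 0] / s, rgb[..., 1] / s, rgb[..., 2] / s
    return 2 * g - r - b

def otsu_threshold(x, bins=256):
    """Pick a threshold separating first (crop) and second (background) pixels
    by maximizing between-class variance (Otsu's method)."""
    hist, bin_edges = np.histogram(x, bins=bins)
    centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])
    p = hist / hist.sum()
    omega = np.cumsum(p)                   # class-0 probability up to each bin
    mu = np.cumsum(p * centers)            # class-0 mean mass up to each bin
    denom = omega * (1.0 - omega)
    denom[denom == 0] = np.nan             # exclude degenerate splits
    sigma_b = (mu[-1] * omega - mu) ** 2 / denom
    return centers[np.nanargmax(sigma_b)]

# Synthetic color image standing in for a top view: two green crop rows on soil.
img = np.zeros((40, 60, 3), dtype=np.uint8)
img[...] = (120, 80, 40)                   # soil background
img[:, 10:20] = (40, 160, 40)              # crop row 1
img[:, 35:45] = (40, 160, 40)              # crop row 2

exg = excess_green(img)                    # S1
crop_mask = exg >= otsu_threshold(exg)     # S2: first vs second pixels

# S3: integrate index values along vertical scanning lines; the boundaries of
# the high-integral runs give candidate edge-line positions of each crop row.
col_score = np.where(crop_mask, exg, 0.0).sum(axis=0)
in_row = col_score > 0.5 * col_score.max()
edge_cols = np.flatnonzero(np.diff(in_row.astype(int)))
```

On this synthetic image, the detected boundaries fall at the column positions where the painted crop rows begin and end.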
- a time-series color image is a collection of images acquired by the imaging device 120 in a time-series manner. Each image is composed of a group of pixels in frame units. For example, if the imaging device 120 outputs images at a frame rate of 30 frames/second, the processing device 122 may acquire new images at intervals of approximately 33 milliseconds.
- The speed at which an agricultural machine 100 such as a tractor travels in a field is relatively low compared to that of ordinary automobiles traveling on public roads, and can be, for example, about 10 kilometers per hour or less. At 10 kilometers per hour, the distance traveled in about 33 milliseconds is roughly 9 centimeters.
- the processing device 122 may acquire images at intervals of, for example, 100 to 300 milliseconds, and does not need to process all frame images captured by the imaging device 120 .
- the acquisition cycle of images to be processed by the processing device 122 may be automatically changed by the processing device 122 according to the travel speed of the agricultural machine 100 .
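The relationship between travel speed, processing interval, and ground distance per processed frame is simple arithmetic; a small helper (hypothetical, for illustration only) makes the figures above easy to check:

```python
def travel_per_frame_cm(speed_kmh, frame_interval_ms):
    """Ground distance covered between two processed frames, in centimeters."""
    speed_m_per_s = speed_kmh / 3.6                    # km/h -> m/s
    return speed_m_per_s * (frame_interval_ms / 1000.0) * 100.0
```

At 10 km/h a 33 ms frame interval corresponds to about 9 cm of travel, while a 300 ms processing interval corresponds to about 83 cm, which is why the processing cycle can be relaxed relative to the camera frame rate.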
- FIG. 9 is an image corresponding to one frame of image 40 in time-series color images acquired by an imaging device (a monocular camera in this example) mounted on agricultural machinery.
- the image in FIG. 9 shows rows of crops (rows of crops) planted in rows on the ground of a field.
- rows of crops are arranged substantially parallel and evenly spaced on the ground, and the camera optical axis of the imaging device faces the direction of travel of the agricultural machine.
- The camera optical axis need not be parallel to the traveling direction of the agricultural machine; it suffices that the camera optical axis intersects the ground ahead of the agricultural machine in the traveling direction.
- the mounting position of the imaging device is not limited to this example. When a plurality of imaging devices are attached to the agricultural machine, some of the imaging devices may face the camera optical axis in the direction opposite to the direction of travel or in the direction crossing the direction of travel.
- the processing device 122 in FIG. 7 generates an image (enhanced image) in which the color of the crop row to be detected is enhanced, based on the time-series color images acquired from the imaging device 120 .
- Crops contain chlorophyll in order to receive sunlight (white light) and perform photosynthesis. Chlorophyll absorbs less green light than red and blue light, so the spectrum of sunlight reflected by crops exhibits relatively high values in the green wavelength range compared to the spectrum of sunlight reflected by the soil surface. As a result, crop color generally has a strong green component, and the "crop row color" is typically green. However, as will be described later, the "crop row color" is not limited to green.
- the image sensor in the imaging device 120 has a large number of photodetection cells arranged in rows and columns.
- Each photodetection cell corresponds to a picture element (pixel) of the image and includes an R subpixel that detects the intensity of red light, a G subpixel that detects the intensity of green light, and a B subpixel that detects the intensity of blue light.
- the output of light detected by the R sub-pixel, G sub-pixel, and B sub-pixel in each photodetector cell will be called the R value, G value, and B value, respectively.
- the R value, G value, and B value may be collectively referred to as "pixel value" or "RGB value”.
- the enhanced image in which the color of the crop row is emphasized is an image in which the RGB value of each pixel of the color image acquired by the imaging device is converted into a pixel value that gives a relatively large weight to the G value.
- Such a conversion of pixel values for generating an enhanced image is defined, for example, as "(2×G value − R value − B value)/(R value + G value + B value)", where the denominator (R value + G value + B value) is a normalization factor.
- using the normalized values r = R value/(R value + G value + B value), g = G value/(R value + G value + B value), and b = B value/(R value + G value + B value), this conversion can be written as the Excess Green Index, ExG = 2g − r − b.
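- as an illustrative sketch (not code from the patent), the ExG computation above can be written with NumPy as follows; the function name and the H×W×3 RGB array layout are assumptions:

```python
import numpy as np

def excess_green(image_rgb):
    """Per-pixel Excess Green Index, ExG = 2g - r - b, using the
    normalized values r, g, b defined above.

    image_rgb: H x W x 3 array of R, G, B values.
    Returns an H x W float array; green (crop) pixels score high.
    """
    img = image_rgb.astype(np.float64)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    total = r + g + b
    total[total == 0] = 1.0  # avoid division by zero for black pixels
    rn, gn, bn = r / total, g / total, b / total  # normalized r, g, b
    return 2.0 * gn - rn - bn
```

A pure green pixel yields ExG = 2, while soil-colored pixels yield values near or below zero, which is what the enhanced image 42 visualizes.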
- FIG. 10 is a diagram showing an enhanced image 42 obtained by converting the RGB values in the image of FIG. 9 into "2g − r − b".
- in FIG. 10, pixels in which "r + b" is relatively small compared to g are displayed brightly, and pixels in which "r + b" is relatively large compared to g are displayed darkly.
- by this conversion, an image (enhanced image) 42 in which the color of the crop rows to be detected (in this example, "green") is enhanced is obtained.
- the brighter pixels in the image of FIG. 10 are pixels with a relatively large green component and belong to crop regions.
- instead of ExG, indices such as the green-red vegetation index, (G value − R value)/(G value + R value), may be used as the "color index value" that emphasizes the color of the crop.
- when the imaging device can also function as an infrared camera, the NDVI (Normalized Difference Vegetation Index) may be used as the "index value of the color of the crop row".
- each ridge of the crop rows may be covered with a mulching sheet called "mulch".
- in such a case, the "color of the crop row" is the color of the objects arranged in a row over the crops. Specifically, when the color of the sheet is black, an achromatic color, the "crop row color" means "black". Likewise, if the color of the sheet is red, the "color of the crop row" means "red". Thus, the "color of the crop row" means not only the color of the crops themselves but also the color of the area defining the crop row (a color distinguishable from the color of the soil surface).
- the HSV color space is a color space composed of three components of hue (Hue), saturation (Saturation), and lightness (Value).
- In operation S2, the processing unit 122 generates, from the enhanced image 42, a top view image classified into first pixels having a crop row color index value equal to or greater than a threshold and second pixels having an index value less than the threshold.
- a top view image is an image viewed from above the ground.
- FIG. 11 is a histogram of the green excess index (ExG) in the enhanced image 42 of FIG. 10. The horizontal axis of the histogram is the green excess index (ExG), and the vertical axis is the number of pixels in the image (corresponding to the frequency of occurrence).
- FIG. 11 shows a dashed line indicating the threshold Th calculated by the discriminant analysis algorithm. Pixels of the enhanced image 42 are classified into two classes by this threshold Th.
- the occurrence frequency of pixels whose green excess index (ExG) is equal to or greater than the threshold is shown, and these pixels are estimated to belong to the crop class.
- the occurrence frequency of pixels whose green excess index (ExG) is less than the threshold is also shown, and these pixels are estimated to belong to the background class, such as soil.
- the first pixels, whose index values are greater than or equal to the threshold, correspond to "crop pixels".
- the second pixels whose index values are less than the threshold correspond to "background pixels".
- the background pixels correspond to objects other than the object to be detected, such as the surface of the soil, and the aforementioned intermediate region (working passage) 14 may be constituted by the background pixels.
- the threshold determination method is not limited to the above example, and the threshold may be determined using, for example, another method using machine learning.
- By selecting the first pixels, the detection target area can be extracted from the enhanced image 42. Further, by setting the pixel value of each "second pixel" to zero or removing the second-pixel data from the image data, the area other than the detection target can be masked. When determining the area to be masked, pixels that locally show a high green excess index (ExG) value may be treated as noise and included in the mask area.
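- the discriminant analysis algorithm mentioned above is commonly implemented as Otsu's method; the following is a generic NumPy sketch (not the patent's code) that returns a threshold Th maximizing the between-class variance:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Discriminant-analysis (Otsu) threshold for a set of index values
    such as ExG. Pixels with value >= Th form the first (crop) class,
    pixels below Th the second (background) class."""
    hist, edges = np.histogram(np.ravel(values), bins=bins)
    prob = hist.astype(np.float64) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(prob)            # cumulative weight of the lower class
    mu = np.cumsum(prob * centers)  # cumulative mean
    mu_t = mu[-1]                   # total mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    # The split maximizing between-class variance gives the threshold.
    return centers[np.nanargmax(sigma_b2)]
```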
- FIG. 12 is a diagram showing an example of a top view image 44 seen from above the ground, classified into first pixels and second pixels.
- a top-view image 44 in FIG. 12 is an image created from the enhanced image 42 in FIG. 10 by an image conversion technique, which will be described later.
- in FIG. 12, the second pixels, whose crop row color index value (in this example, the green excess index) is less than the threshold Th, are displayed as black pixels (pixels whose lightness is set to zero).
- the area formed by the second pixels is mainly the area where the surface of the soil on the ground can be seen.
- black triangular areas exist at the left and right corners in contact with the lower side. These triangular areas correspond to areas that were not shown in the enhanced image 42 of FIG. 10.
- The preprocessing may include processing other than that described above.
- a top-view image 44 in FIG. 12 is a bird's-eye view image of the reference plane Re parallel to the ground viewed from directly above in the normal direction of the reference plane Re.
- This bird's-eye view image can be generated from the enhanced image 42 of FIG. 10 by homography transformation (planar projective transformation).
- a homographic transformation is a type of geometric transformation that can transform a point on one plane in three-dimensional space to a point on any other plane.
- FIG. 13 shows the positional relationship between the camera coordinate system Σc1 of the imaging device in the first orientation (position and orientation: pose), the camera coordinate system Σc2 of the imaging device in the second orientation, and the reference plane Re.
- the camera coordinate system Σc1 is tilted so that its Zc axis obliquely intersects the reference plane Re.
- the imaging device in the first posture corresponds to the imaging device attached to the agricultural machine.
- the Zc axis of the camera coordinate system Σc2 is orthogonal to the reference plane Re.
- the camera coordinate system Σc2 is arranged such that the reference plane Re can be obtained as a bird's-eye view image viewed from directly above in the normal direction of the reference plane Re.
- a virtual image plane Im1 exists at a position separated by the focal length of the camera on the Zc axis from the origin O1 of the camera coordinate system Σc1.
- the image plane Im1 is orthogonal to the Zc axis and the camera optical axis.
- a pixel position on the image plane Im1 is defined by an image coordinate system having mutually orthogonal u and v axes.
- the coordinates of points P1 and P2 located on the reference plane Re are (X1, Y1, Z1) and (X2, Y2, Z2), respectively, in the world coordinate system Σw.
- the reference plane Re is set so as to extend along the ground.
- Points P1 and P2 on the reference plane Re are transformed into points p1 and p2 on the image plane Im1 of the imaging device in the first posture, respectively, by perspective projection of the pinhole camera model.
- points p1 and p2 are located at pixel locations indicated by coordinates (u1, v1) and (u2, v2), respectively.
- a virtual image plane Im2 exists at a position separated by the focal length of the camera on the Zc axis from the origin O2 of the camera coordinate system Σc2.
- the image plane Im2 is parallel to the reference plane Re.
- a pixel position on the image plane Im2 is defined by an image coordinate system having mutually orthogonal u* and v* axes.
- Points P1 and P2 on the reference plane Re are transformed into points p1* and p2*, respectively, on the image plane Im2 by perspective projection.
- points p1* and p2* are located at pixel locations indicated by coordinates (u1*, v1*) and (u2*, v2*), respectively.
- Given the positional relationship of the camera coordinate systems Σc1 and Σc2 with respect to the reference plane Re (world coordinate system Σw), a homography transformation maps any point (u, v) on the image plane Im1 to the corresponding point (u*, v*) on the image plane Im2. When the coordinates of points are expressed in a homogeneous coordinate system, such a homography transformation is defined by a transformation matrix H of 3 rows × 3 columns.
- the contents of the transformation matrix H are defined by the numerical values h11, h12, ..., h32 (with h33 normalized to 1).
- the contents of the transformation matrices H1 and H2 depend on the reference plane Re, so when the position of the reference plane Re changes, the contents of the transformation matrix H also change.
- By using such a homography transformation, it is possible to generate a top view image of the ground from the image of the ground acquired by the imaging device in the first posture (the imaging device attached to the agricultural machine).
- specifically, the coordinates of an arbitrary point on the image plane Im1 of the imaging device 120 can be converted into the coordinates of the corresponding point on the image plane Im2 of a virtual imaging device having a predetermined orientation with respect to the reference plane Re.
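- as an illustrative sketch of this step (not the patent's implementation), the matrix H can be estimated from four or more point correspondences, such as (p1, p1*) and (p2, p2*), by the direct linear transform, and then applied in homogeneous coordinates:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 matrix H mapping src points (u, v) on image
    plane Im1 to dst points (u*, v*) on image plane Im2 (direct linear
    transform, least squares); h33 is normalized to 1."""
    rows = []
    for (u, v), (x, y) in zip(np.asarray(src, float), np.asarray(dst, float)):
        rows.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        rows.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    # Solution: right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    """Map N x 2 points through H using homogeneous coordinates."""
    pts = np.asarray(pts, float)
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]
```

Warping every pixel of the enhanced image through such an H is what produces the bird's-eye top view image 44.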
- the processing device 122 executes a software program based on the above algorithm to generate a bird's-eye view image of the ground 10 viewed from above, from the time-series color images or from preprocessed versions of those images.
- the ground 10 may have irregularities such as ridges, furrows, and grooves. In such cases, the reference plane Re may be displaced upward from the bottoms of such irregularities. The displacement distance can be set appropriately according to the unevenness of the ground 10 on which the crops are planted.
- when the posture of the agricultural machine changes on uneven ground, the posture of the imaging device 120 changes, and the contents of the transformation matrix H1 change accordingly. If the roll and pitch rotation angles of the vehicle body 110 are measured by the IMU, the transformation matrix H1 and the transformation matrix H can be corrected according to the posture change of the imaging device.
- after generating, by the method described above, the top view image viewed from above the ground and classified into first pixels having a crop row color index value equal to or greater than the threshold and second pixels having an index value less than the threshold, the processing device 122 in this embodiment performs operation S3.
- In operation S3, the processing unit 122 determines the positions of the edge lines of the crop rows based on the index values of the first pixels. Specifically, the index values of the first pixels (pixels whose color index value is equal to or greater than the threshold) are integrated along a plurality of scanning lines in the top view image.
- FIG. 14 is an example of a top view image 44 in which three crop rows 12 are shown.
- the crop row 12 direction is parallel to the image vertical direction (v-axis direction).
- FIG. 14 shows a large number of scanning lines (broken lines) S parallel to the image vertical direction (v-axis direction).
- the processing device 122 integrates index values of pixels positioned on a plurality of scanning lines S for each scanning line S to obtain an integrated value.
- FIG. 15 is a diagram schematically showing the relationship (histogram of integrated values) between the positions of the scanning lines S and the integrated values of the index values obtained for the top view image of FIG.
- the horizontal axis of FIG. 15 indicates the position of the scanning line S in the image horizontal direction (u-axis direction).
- for a scanning line S that crosses many first pixels (crop pixels), the integrated value is large. Conversely, for a scanning line S that mainly crosses second pixels (background pixels) belonging to the intermediate area (work path) 14 between the crop rows 12, the integrated value is small. Note that in this embodiment the intermediate region (working path) 14 is masked and the index value of each second pixel is zero.
- in this example, the position of the edge line of each crop row 12 is the position of the scanning line S having an integrated value equal to 80% of the peak integrated value of that crop row 12.
- in this embodiment, the index values of the crop row color are integrated along each scanning line S; the number of first pixels is not simply counted on a binarized top view image based on the classification into first and second pixels. If the number of first pixels were counted, scattered fallen leaves or weeds would increase the count. Accumulating the color index values of the first pixels instead of counting them suppresses erroneous determination due to fallen leaves and weeds and improves the robustness of crop row detection.
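- a simplified NumPy sketch of the integration along vertical scanning lines and the 80%-of-peak edge rule (assuming the background has already been masked to zero; names are illustrative):

```python
import numpy as np

def edge_lines_from_topview(index_img, frac=0.8):
    """Integrate crop-row color index values along vertical scanning
    lines (one per image column) and, within each contiguous run of
    nonzero integrated values, place the edge lines where the profile
    first and last reach `frac` (e.g. 0.8) of that run's peak.

    Returns a list of (left, right) column positions, one per crop row.
    """
    profile = index_img.sum(axis=0)  # integrated value per scanning line
    above = profile > 0
    starts = np.flatnonzero(above & ~np.roll(above, 1))
    ends = np.flatnonzero(above & ~np.roll(above, -1))
    edges = []
    for s, e in zip(starts, ends):
        run = profile[s:e + 1]
        ok = np.flatnonzero(run >= frac * run.max())
        edges.append((int(s + ok[0]), int(s + ok[-1])))
    return edges
```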
- FIG. 16 is an example of a top view image 44 in which the crop row 12 extends obliquely.
- the direction in which crop row 12 extends in image 40 acquired by imaging device 120 may be slanted to the right or left in the image.
- in the example of FIG. 16, the direction of the crop rows 12 is inclined from the image vertical direction (v-axis direction).
- FIG. 16 also shows a large number of scanning lines (broken lines) S parallel to the image vertical direction (v-axis direction).
- when the processor 122 integrates the index values of the pixels located on the plurality of scanning lines S for each scanning line S, a histogram of integrated values as shown in FIG. 17 is obtained.
- FIG. 17 is a diagram schematically showing the relationship between the position of the scanning line S and the integrated value of the index values obtained for the top view image of FIG. From this histogram the edge lines of the crop row 12 cannot be determined.
- FIG. 18 is a flowchart showing an example of a procedure for searching for the direction (angle) of the scanning line S parallel to the direction of the crop row 12 by changing the direction (angle) of the scanning line S.
- In step S10, the direction (angle) of the scanning line S is set. The angle θ is measured clockwise with respect to the u-axis of the image coordinate system (see FIGS. 14 and 16).
- the search range of the angle θ can be set to, for example, 60 to 120 degrees, with an angle step of, for example, 1 degree. In that case, the angle θ of the scanning line S is set to 60, 61, 62, ..., 120 degrees in turn.
- In step S12, the index values are integrated for the pixels on the scanning lines S extending in the direction of each angle θ, and a histogram of integrated values is created. The histogram shows a different distribution depending on the angle θ.
- In step S14, from the plurality of histograms obtained in this way, the angle θ of the scanning line S is found that produces a histogram, as shown in FIG. 15, in which the boundaries between convex and concave portions are steep and the crop rows 12 are clearly separated from the intermediate regions 14.
- In step S16, the edge lines of each crop row 12 are determined from the peak values of the histogram corresponding to the angle θ obtained in step S14. For example, the position of a scanning line S having an integrated value 0.8 times the peak can be taken as an edge line.
- alternatively, for each angle θ of the scanning line S, a characteristic quantity (for example, the depth of recesses, the height of convex portions, or differential values of the envelope) may be calculated from the waveform of the histogram of integrated values, and whether the direction of the crop row 12 and the direction of the scanning line S are parallel may be determined based on that characteristic quantity.
- the method for obtaining the angle θ is not limited to the above example. If the direction in which the crop rows extend is known by measurement, the orientation of the agricultural machine may be measured by an inertial measurement unit (IMU) mounted on the agricultural machine 100 to determine the angle θ with respect to the direction in which the crop rows extend.
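- one way to implement the search in steps S10 through S16 (an illustrative approach, not the patent's code) is to shear the image so that candidate scanning lines at angle θ become vertical, and score each integrated-value profile's sharpness; the variance score used here is one possible characteristic quantity:

```python
import numpy as np

def sweep_scanline_angle(index_img, angles_deg=range(60, 121)):
    """For each candidate angle theta (degrees, clockwise from the
    u-axis), shift each image row so that scanning lines at that angle
    become vertical, integrate index values per column, and keep the
    angle whose profile is sharpest (highest variance). The true crop
    row direction yields the steepest peaks and valleys."""
    h, w = index_img.shape
    best_theta, best_profile, best_score = None, None, -np.inf
    for theta in angles_deg:
        slope = 1.0 / np.tan(np.radians(theta))  # horizontal drift per row
        profile = np.zeros(w)
        for y in range(h):
            profile += np.roll(index_img[y], -int(round(y * slope)))
        score = profile.var()
        if score > best_score:
            best_theta, best_profile, best_score = theta, profile, score
    return best_theta, best_profile
```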
- FIG. 19 is a diagram showing an example of an integrated value histogram created from the top view image of FIG.
- the position of the edge line E is the position of the scanning line having an integrated value 0.8 times the peak value of the convex portion located in the center of the histogram.
- toward the left and right ends of the histogram, the peaks of the convex portions become lower and broader. This is because image distortion is small at the center of the top view image and increases away from the center to the left and right, and because the black triangular areas located on both sides lower the integrated values. However, the crop row that should be detected accurately is at or near the center of the image, so distortion in regions near the left and right ends of the top view image can be ignored.
- FIG. 20 is a block diagram showing a series of processes executed by the processing device 122 in this embodiment.
- by performing image acquisition 32, enhanced image generation 33, crop row extraction 34, and homography transformation 35, the processor 122 can obtain a top view image 44 such as that shown in FIG. 12.
- Processing unit 122 may also perform scanline orientation determination 36 and edgeline location determination 37 to obtain the location of the edgelines of the crop row.
- the processing device 122 or the path generation device that obtains the information indicating the position of the edge line from the processing device 122 can perform the target path generation 38 of the agricultural machine based on the edge line.
- the target path can be generated in such a way that the wheels of the agricultural machine are kept in the intermediate area (workpath) 14 between the edge lines E.
- the target path can be generated so that the central portion of the tire in the width direction passes through the middle of two edge lines positioned at both ends of the intermediate region (working passage) 14 . According to such a target route, even if the agricultural machine deviates from the target route by several centimeters during travel, it is possible to reduce the possibility that the tires will enter the crop row.
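- a minimal sketch of this rule (function and argument names are illustrative): the target lateral position at each step along the path is the midpoint of the two edge lines E bounding the intermediate area (work path) 14:

```python
def target_path_from_edge_lines(left_edge_u, right_edge_u):
    """Target lateral positions placed midway between the left and right
    edge lines, keeping the tire center as far as possible from both
    adjacent crop rows."""
    return [(l + r) / 2.0 for l, r in zip(left_edge_u, right_edge_u)]
```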
- it was confirmed that, according to the embodiments of the present disclosure, crop rows can be detected with high accuracy while suppressing the effects of weather conditions such as front light, backlight, fine weather, cloudy weather, and fog, and of sunlight conditions that change depending on the working hours.
- high-accuracy detection was confirmed regardless of crop type (cabbage, broccoli, radish, carrot, lettuce, Chinese cabbage, etc.), growth state (from seedling to mature), presence of disease, presence of fallen leaves and weeds, and changes in soil color.
- in the example described above, the homography transformation is performed after the step of obtaining the threshold for binarization and extracting the crop area using pixels equal to or above the threshold. However, the step of extracting crop regions may instead be performed after the homography transformation. That is, homography transformation 35 may be performed between enhanced image generation 33 and crop row extraction 34, or between image acquisition 32 and enhanced image generation 33.
- FIG. 21 is a diagram for explaining a method of dividing part or all of the top view image into a plurality of blocks and determining the position of the edge line for each of the plurality of blocks.
- the processing device 122 divides part or all of the top view image 44 into a plurality of blocks and determines the position of the edge line E of the crop row 12 for each block. In the illustrated example, the three blocks B1, B2, and B3 each have a belt shape continuous in the horizontal direction of the image. The processing device 122 can thus determine the edge lines of the crop rows based on belt-shaped blocks extending in a direction different from the traveling direction of the agricultural machine 100.
- FIG. 22 is a diagram schematically showing the relationship (integrated value histogram) between the position of the scanning line S and the integrated value of the index values in each of the blocks B1, B2, and B3 of the top view image in FIG.
- in this example, the scanning lines S used for integration are always parallel to the image vertical direction; the index values are integrated block by block, and the direction (angle) of the scanning line S does not need to be changed.
- by shortening the length of the scanning lines S (limiting each to one block), the areas of second pixels (background pixels) caused by the intermediate areas (work paths) 14 can be detected appropriately even if the crop rows 12 extend obliquely. Therefore, there is no need to change the angle of the scanning line S.
- Both ends of the arrow W in FIG. 22 indicate the positions of the crop row edge lines determined in each of the blocks B1, B2, and B3.
- in FIG. 22, the direction of the crop rows 12 is slanted with respect to the direction of the scanning lines S. Therefore, when the scanning line position showing a value 0.8 times the peak of the integrated value histogram is adopted as the position of the edge line E of a crop row 12, such edge lines E correspond to the two ends of the "width" W passing near the center of the crop row 12 in each of the blocks B1, B2, and B3.
- FIG. 23 shows the crop row center Wc in each of the blocks B1, B2, and B3 in FIG. 22.
- the crop row center Wc is obtained from the center of the arrow W that defines the edge line of the crop row obtained from the integrated value histogram in FIG. 22, and is located at the center of each block in the image vertical direction.
- FIG. 23 shows an example of the approximation line 12C for the crop row center Wc belonging to the same crop row 12.
- the approximation line 12C is, for example, a straight line obtained so as to minimize the root mean square distance (error) from the plurality of crop row centers Wc of each crop row 12 .
- Such an approximation line 12C corresponds to a line passing through the center of the crop row 12.
- FIG. 24 is a top view showing an example of the edge lines E of the crop rows 12 determined from the approximation lines 12C of FIG. 23.
- the two edge lines E associated with each crop row 12 are spaced equal to the length of the arrow W and equidistant from the approximation line 12C.
- according to this method, the edge lines E of the crop rows 12 can be obtained with a smaller amount of calculation, without changing the direction (angle) of the scanning lines.
- the length of each block in the image vertical direction can be set to correspond to a distance of 1 to 2 meters on the ground, for example.
- in the above example, one image is divided into three blocks to obtain the integrated value histograms, but the number of blocks may be four or more.
- the shape of the block is not limited to the above example.
- a block can have a strip shape that is continuous in either the horizontal direction of the image or the vertical direction of the image in the top view image.
- the processing device 122 can divide the image into belt-shaped blocks extending in a direction different from the traveling direction of the agricultural machine 100 and determine the edge lines of the crop rows.
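- the block-wise procedure of FIGS. 21 to 24 can be sketched as follows (illustrative names, not the patent's code): fit the approximation line 12C through the per-block crop row centers Wc by least squares, then offset it by half the row width W to obtain the two edge lines E. With deg=1 the fit is a straight line; a higher degree (for example a cubic, deg=3) can follow curved rows:

```python
import numpy as np

def crop_row_edges_from_blocks(block_centers, width, deg=1):
    """Fit the approximation line through per-block crop row centers
    (least squares) and derive the two edge lines offset by width/2.

    block_centers: sequence of (u, v) centers, one per block.
    Returns (center, left, right): np.poly1d giving u as a function of v.
    """
    pts = np.asarray(block_centers, dtype=float)
    u, v = pts[:, 0], pts[:, 1]
    coeffs = np.polyfit(v, u, deg=deg)  # minimizes squared error in u
    center = np.poly1d(coeffs)
    half = width / 2.0
    # Shift only the constant term to move the line sideways.
    left = np.poly1d(np.r_[coeffs[:-1], coeffs[-1] - half])
    right = np.poly1d(np.r_[coeffs[:-1], coeffs[-1] + half])
    return center, left, right
```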
- FIG. 25 schematically shows how the crop row 12 of the top view image 44 includes a curved portion.
- FIG. 26 schematically shows the integrated value histograms in each of the blocks B1, B2, and B3 of the top view image 44 of FIG. 25.
- FIG. 27 is a diagram showing an example of the crop row centers Wc in each of the blocks B1, B2, and B3 in FIG. 26 and the approximation line 12C for those crop row centers Wc.
- the approximation line 12C in this example is a curve (for example, a higher-order curve such as a cubic curve) obtained so as to minimize the root mean square of the distances (errors) from the crop row centers Wc of each crop row 12. Such an approximation line 12C corresponds to a curved line passing through the center of a crop row 12 having curved portions.
- FIG. 28 is a top view showing an example of the edge lines E of the crop rows 12 determined from the approximation lines of FIG. 27. The edge lines E are generated in a manner similar to that described with reference to FIG. 24: the two edge lines E associated with each crop row 12 are spaced by the length of the arrow W and equidistant from the approximation line 12C.
- when the top view image is divided into a plurality of blocks and a histogram of integrated values is generated for each block, the direction of the crop rows becomes easier to determine; even if the direction changes partway along a row, the changed direction can be obtained.
- Any of the crop row detection methods described above can be implemented on a computer by causing the computer to perform the desired operations.
- the agricultural machine in this embodiment includes the crop row detection system described above.
- this agricultural machine includes a control system that performs control for realizing automatic steering operation.
- a control system is a computer system that includes a storage device and a controller, and is configured to control steering, travel, and other operations of agricultural machinery.
- in this normal automatic steering mode, the controller locates the agricultural machine by means of the positioning device and, based on a previously generated target path, steers the agricultural machine so that it travels along the target path. Specifically, the steering angle of the steered wheels (for example, the front wheels) of the agricultural machine is controlled so that the work vehicle travels along the target route in the field.
- the agricultural machine according to the present embodiment is equipped with an automatic steering device configured not only to operate in such a normal automatic steering mode, but also to travel automatically, by "row following travel control", in a field provided with rows of crops.
- the positioning device has, for example, a GNSS receiver.
- Such positioning devices can determine the location of work vehicles based on signals from GNSS satellites.
- even if the positioning device can measure the position of the agricultural machine with high accuracy, the space between the rows of crops is narrow, and depending on how the crops are planted and their growing conditions, running devices such as the wheels of the agricultural machine may stray into the crop rows.
- the automatic steering device included in the agricultural machine in the embodiment of the present disclosure is configured to control the steering angle of the steering wheel based on the position of the edge line of the crop row determined by the crop row detection system.
- the processing device of the crop row detection system can monitor the positional relationship between the edge line of the crop row and the steering wheel based on the time-series color images. If a position error signal is generated from this positional relationship, the automatic steering system of the agricultural machine can appropriately adjust the steering angle so as to reduce the position error signal.
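- a minimal proportional-control sketch of this adjustment (gains and the saturation limit are illustrative values, not taken from the patent):

```python
def steering_angle_command(lateral_error_m, heading_error_rad,
                           k_lat=0.5, k_head=1.0, max_angle_rad=0.6):
    """Row-following sketch: turn the position error between the crop
    row edge line and the wheels into a front-wheel steering angle
    command, saturated at a mechanical limit."""
    angle = -k_lat * lateral_error_m - k_head * heading_error_rad
    return max(-max_angle_rad, min(max_angle_rad, angle))
```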
- FIG. 29 is a perspective view showing an example of the appearance of the agricultural machine 100 according to this embodiment.
- FIG. 30 is a side view schematically showing an example of agricultural machine 100 with work implement 300 attached.
- the agricultural machine 100 in this embodiment is an agricultural tractor (working vehicle) with a working machine 300 attached.
- Agricultural machine 100 is not limited to a tractor, and work machine 300 need not be attached.
- the crop row detection technology according to the present disclosure can exhibit excellent effects when used in small tending machines and vegetable transplanters employed for inter-row work such as ridge setting, intertillage, hilling, weeding, top-dressing, and pest control.
- the agricultural machine 100 in this embodiment includes an imaging device 120, a positioning device 130, and an obstacle sensor 136. Although one obstacle sensor 136 is illustrated in FIG. 29, the obstacle sensors 136 may be provided at a plurality of locations on the agricultural machine 100 .
- the agricultural machine 100 includes a vehicle body 110, a prime mover (engine) 102, and a transmission 103.
- the vehicle body 110 is provided with tires 104 (wheels) and a cabin 105 .
- Tires 104 include a pair of front wheels 104F and a pair of rear wheels 104R.
- a driver's seat 107 , a steering device 106 , an operation terminal 200 , and a group of switches for operation are provided inside the cabin 105 .
- Either the front wheels 104F or the rear wheels 104R may be crawlers instead of tires.
- the agricultural machine 100 may be a four-wheel drive vehicle having four tires 104 as driving wheels, or a two-wheel drive vehicle having a pair of front wheels 104F or a pair of rear wheels 104R as driving wheels.
- the positioning device 130 in this embodiment includes a GNSS receiver.
- the GNSS receiver includes an antenna for receiving signals from GNSS satellites and processing circuitry for determining the position of agricultural machine 100 based on the signals received by the antenna.
- the positioning device 130 receives GNSS signals transmitted from GNSS satellites and performs positioning based on the GNSS signals.
- GNSS is a general term for satellite positioning systems such as GPS (Global Positioning System), QZSS (Quasi-Zenith Satellite System, eg, Michibiki), GLONASS, Galileo, and BeiDou.
- the positioning device 130 in this embodiment is provided in the upper part of the cabin 105, but may be provided in another position.
- the positioning device 130 can further supplement the position data using signals from an inertial measurement unit (IMU).
- the IMU can measure tilts and minute movements of the agricultural machine 100 .
- Positioning performance can be improved by using data obtained by the IMU to supplement position data based on GNSS signals.
- an obstacle sensor 136 is provided at the rear portion of the vehicle body 110.
- the obstacle sensor 136 may be arranged at a portion other than the rear portion of the vehicle body 110.
- one or more obstacle sensors 136 may be provided anywhere on the sides, front, and cabin 105 of the vehicle body 110 .
- the obstacle sensor 136 detects objects existing around the agricultural machine 100 .
- Obstacle sensor 136 may comprise, for example, a laser scanner or ultrasonic sonar.
- the obstacle sensor 136 outputs a signal indicating the presence of an obstacle when an object is present closer than a predetermined distance from the obstacle sensor 136 .
- a plurality of obstacle sensors 136 may be provided at different positions on the vehicle body of the agricultural machine 100 . For example, multiple laser scanners and multiple ultrasonic sonars may be placed at different locations on the vehicle body. By providing such a large number of obstacle sensors 136, blind spots in monitoring obstacles around the agricultural machine 100 can be reduced.
- the prime mover 102 is, for example, a diesel engine.
- An electric motor may be used instead of the diesel engine.
- the transmission 103 can change the driving force and the moving speed of the agricultural machine 100 by shifting.
- the transmission 103 can also switch between forward and reverse of the agricultural machine 100 .
- the steering device 106 includes a steering wheel, a steering shaft connected to the steering wheel, and a power steering device that assists steering by the steering wheel.
- The front wheels 104F are steerable wheels; changing their steering angle (also referred to as the "turning angle") changes the traveling direction of the agricultural machine 100.
- the steering angle of the front wheels 104F can be changed by operating the steering wheel.
- The power steering device includes a hydraulic device or an electric motor that supplies an assist force for changing the steering angle of the front wheels 104F. When automatic steering is performed, the steering angle is automatically adjusted by the power of the hydraulic device or the electric motor under the control of the control device arranged in the agricultural machine 100.
- a coupling device 108 is provided at the rear portion of the vehicle body 110 .
- The coupling device 108 includes, for example, a three-point support device (also called a "three-point link" or "three-point hitch"), a PTO (Power Take-Off) shaft, a universal joint, and a communication cable.
- the work machine 300 can be attached to and detached from the agricultural machine 100 by the coupling device 108 .
- the coupling device 108 can control the position or attitude of the working machine 300 by elevating the three-point linkage by, for example, a hydraulic device.
- power can be sent from the agricultural machine 100 to the working machine 300 via the universal joint.
- the agricultural machine 100 can cause the work machine 300 to perform a predetermined work while pulling the work machine 300 .
- the coupling device may be provided in front of vehicle body 110 . In that case, a working machine can be connected to the front of the agricultural machine 100 .
- a working machine 300 shown in FIG. 30 is, for example, a rotary cultivator.
- The work machine 300 may be any implement that is towed by or attached to a work vehicle such as a tractor and that can perform inter-furrow work such as ridging, intertillage, hilling, weeding, top dressing, and pest control while the vehicle travels along rows of crops.
- FIG. 31 is a block diagram showing an example of the schematic configurations of the agricultural machine 100 and the working machine 300. The agricultural machine 100 and the working machine 300 can communicate with each other via a communication cable included in the coupling device 108.
- The agricultural machine 100 in the example of FIG. 31 includes an imaging device 120, a positioning device 130, an obstacle sensor 136, an operation terminal 200, a drive device 140, a steering wheel sensor 150, a steering angle sensor 152, a control system 160, a communication interface (IF) 190, an operation switch group 210, and a buzzer 220.
- The positioning device 130 includes a GNSS receiver 131 and an inertial measurement unit (IMU) 135.
- Control system 160 includes storage device 170 and control device 180 .
- The control device 180 includes a plurality of electronic control units (ECUs) 181 to 187.
- Work machine 300 includes a drive device 340 , a control device 380 , and a communication interface (IF) 390 .
- FIG. 31 shows constituent elements that are relatively highly relevant to the automatic steering or automatic traveling operation of the agricultural machine 100, and illustration of other constituent elements is omitted.
- the positioning device 130 performs positioning of the agricultural machine 100 using GNSS.
- The positioning device 130 includes an RTK (Real-Time Kinematic) receiver. In RTK positioning, correction signals transmitted from a reference station are used in addition to the GNSS signals transmitted from multiple GNSS satellites.
- the reference station can be installed around the field on which the agricultural machine 100 runs (for example, within 10 km from the agricultural machine 100).
- the reference station generates a correction signal based on GNSS signals received from multiple GNSS satellites and transmits the correction signal to the positioning device 130 .
- a GNSS receiver 131 in the positioning device 130 receives GNSS signals transmitted from a plurality of GNSS satellites.
- the positioning device 130 performs positioning by calculating the position of the agricultural machine 100 based on the GNSS signals and correction signals.
- By using RTK-GNSS, it is possible to perform positioning with an accuracy of, for example, an error of several centimeters.
- Position information, including latitude, longitude, and altitude, is obtained through RTK-GNSS high-precision positioning.
- the positioning method is not limited to RTK-GNSS, and any positioning method (interferometric positioning method, relative positioning method, etc.) that can obtain position information with required accuracy can be used.
- positioning may be performed using VRS (Virtual Reference Station) or DGPS (Differential Global Positioning System).
- the IMU 135 is equipped with a 3-axis acceleration sensor and a 3-axis gyroscope.
- the IMU 135 may include an orientation sensor, such as a 3-axis geomagnetic sensor.
- the IMU 135 functions as a motion sensor and can output signals indicating various quantities such as acceleration, speed, displacement, and attitude of the agricultural machine 100 .
- The positioning device 130 can estimate the position and orientation of the agricultural machine 100 more accurately by using the signals output from the IMU 135 in addition to the GNSS signals and correction signals. The signals output from the IMU 135 may also be used to correct or complement the position calculated from the GNSS signals and correction signals.
- The IMU 135 outputs signals at a higher frequency than GNSS signals, so the position and orientation of the agricultural machine 100 can be measured at a correspondingly higher frequency (e.g., 10 Hz or higher).
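As a schematic illustration of how higher-rate IMU output can fill in between lower-rate GNSS fixes, the following one-dimensional sketch dead-reckons from acceleration at the IMU rate and pulls the estimate toward each GNSS fix. The simple blend gain and the 1-D model are assumptions made purely for illustration; an actual positioning device would typically run a Kalman filter.

```python
# Schematic 1-D illustration of supplementing low-rate GNSS fixes with
# high-rate IMU data. The blend gain below is an illustrative assumption;
# a real positioning device would typically use a Kalman filter instead.
class GnssImuFuser:
    def __init__(self, gain=0.2):
        self.x = 0.0    # estimated position (1-D for simplicity)
        self.v = 0.0    # estimated velocity
        self.gain = gain

    def imu_step(self, accel, dt):
        """High-rate update: dead-reckon position from measured acceleration."""
        self.v += accel * dt
        self.x += self.v * dt
        return self.x

    def gnss_fix(self, measured_x):
        """Low-rate update: pull the dead-reckoned estimate toward the GNSS fix."""
        self.x += self.gain * (measured_x - self.x)
        return self.x
```

Between GNSS fixes, `imu_step` would be called at 10 Hz or more, while `gnss_fix` is called only when a new fix arrives.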
- Instead of the IMU 135, a separate 3-axis acceleration sensor and 3-axis gyroscope may be provided.
- the IMU 135 may be provided as a separate device from the positioning device 130 .
- the positioning device 130 may include other types of sensors in addition to the GNSS receiver 131 and IMU 135. Depending on the environment in which the agricultural machine 100 runs, the position and orientation of the agricultural machine 100 can be estimated with high accuracy based on the data from these sensors.
- The drive device 140 includes various components necessary for running the agricultural machine 100 and driving the work machine 300, such as the prime mover 102, the transmission 103, a differential including a differential lock mechanism, the steering device 106, and the coupling device 108.
- Prime mover 102 includes an internal combustion engine, such as a diesel engine, for example.
- Drive system 140 may include an electric motor for traction instead of or in addition to the internal combustion engine.
- the steering wheel sensor 150 measures the rotation angle of the steering wheel of the agricultural machine 100.
- The steering angle sensor 152 measures the steering angle of the front wheels 104F, which are the steered wheels. The values measured by the steering wheel sensor 150 and the steering angle sensor 152 are used for steering control by the control device 180.
- the storage device 170 includes one or more storage media such as flash memory or magnetic disk.
- the storage device 170 stores various data generated by each sensor and the control device 180 .
- the data stored in the storage device 170 may include map data of the environment in which the agricultural machine 100 travels and data of target routes for automatic steering.
- the storage device 170 also stores a computer program that causes each ECU in the control device 180 to execute various operations described later.
- Such a computer program can be provided to the agricultural machine 100 via a storage medium (such as a semiconductor memory or an optical disk) or an electric communication line (such as the Internet).
- Such computer programs may be sold as commercial software.
- the control device 180 includes multiple ECUs.
- the plurality of ECUs include an ECU 181 for image recognition, an ECU 182 for speed control, an ECU 183 for steering control, an ECU 184 for automatic steering control, an ECU 185 for work machine control, an ECU 186 for display control, and an ECU 187 for buzzer control.
- the image recognition ECU 181 functions as a processing device for the crop row detection system.
- ECU 182 controls the speed of agricultural machine 100 by controlling prime mover 102 , transmission 103 and brakes included in drive system 140 .
- the ECU 183 controls steering of the agricultural machine 100 by controlling a hydraulic device or an electric motor included in the steering device 106 based on the measurement value of the steering wheel sensor 150 .
- ECU 184 performs calculations and controls for realizing automatic steering operation based on signals output from positioning device 130 , steering wheel sensor 150 , and steering angle sensor 152 .
- the ECU 184 sends an instruction to change the steering angle to the ECU 183 .
- the ECU 183 changes the steering angle by controlling the steering device 106 in response to the command.
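The computation performed in this chain, from the ECU 184 deriving a steering-angle command to the ECU 183 realizing it, can be pictured as combining the deviation from the target route into a clamped steering command. The gains and limit below are hypothetical values for illustration, not values from the disclosure.

```python
# Hypothetical sketch of the steering-command computation by the ECU 184:
# combine lateral deviation from the target route and heading error into a
# steering-angle command, clamped to the mechanical range realizable by the
# steering device under the ECU 183. Gains and limit are illustrative only.
def steering_command(lateral_error_m, heading_error_rad,
                     k_lat=0.5, k_head=1.0, max_angle_rad=0.6):
    angle = k_lat * lateral_error_m + k_head * heading_error_rad
    return max(-max_angle_rad, min(max_angle_rad, angle))  # clamp to range
```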
- ECU 185 controls the operation of coupling device 108 in order to cause work machine 300 to perform a desired operation.
- ECU 185 also generates a signal for controlling the operation of work machine 300 and transmits the signal to work machine 300 via communication IF 190 .
- ECU 186 controls the display of operation terminal 200 .
- the ECU 186 causes the display device of the operation terminal 200 to display various displays such as a map of the field, the detected rows of crops, the position and target route of the agricultural machine 100 on the map, pop-up notifications, setting screens, and the like.
- ECU 187 controls the output of a warning sound by buzzer 220 .
- control device 180 realizes driving by manual steering or automatic steering.
- control device 180 controls drive device 140 based on the position of agricultural machine 100 measured or estimated by positioning device 130 and the target route stored in storage device 170 .
- the control device 180 causes the agricultural machine 100 to travel along the target route.
- The image recognition ECU 181 determines the edge lines of the detected crop rows and generates a target route based on these edge lines.
- the controller 180 performs operations according to this target path.
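One plausible way to turn detected edge lines into such a target path, sketched under the assumption that each block of the image yields one pair of left/right edge positions in ground-plane coordinates, is to fit a straight line through the midpoints between the edges. The per-block input format and the least-squares fit are assumptions, not details from the disclosure.

```python
# Sketch: derive a target route from per-block edge-line detections by fitting
# a straight line through the midpoints between the left and right edge lines.
import numpy as np

def target_route(edge_pairs):
    """edge_pairs: (y, x_left, x_right) per block, in ground-plane coordinates.
    Returns (slope, intercept) of the fitted midline x = slope * y + intercept."""
    ys = np.array([y for y, _, _ in edge_pairs], dtype=float)
    mids = np.array([(xl + xr) / 2.0 for _, xl, xr in edge_pairs])
    slope, intercept = np.polyfit(ys, mids, 1)  # least-squares line fit
    return float(slope), float(intercept)
```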
- a plurality of ECUs included in the control device 180 can communicate with each other according to a vehicle bus standard such as CAN (Controller Area Network).
- Although each of the ECUs 181 to 187 is shown as an individual block in FIG. 31, the function of each may be realized by a plurality of ECUs.
- an in-vehicle computer that integrates at least part of the functions of the ECUs 181 to 187 may be provided.
- the control device 180 may include ECUs other than the ECUs 181 to 187, and an arbitrary number of ECUs may be provided according to functions.
- Each ECU has a control circuit that includes one or more processors.
- the communication IF 190 is a circuit that communicates with the communication IF 390 of the working machine 300 .
- The communication IF 190 exchanges signals conforming to an ISOBUS standard, such as ISOBUS-TIM, with the communication IF 390 of the work machine 300. This allows the work machine 300 to be caused to perform a desired operation, and allows information to be acquired from the work machine 300.
- the communication IF 190 may communicate with an external computer via a wired or wireless network.
- the external computer may be, for example, a server computer in a farming support system that centrally manages information on fields on the cloud and utilizes data on the cloud to support agriculture.
- the operation terminal 200 is a terminal for a user to perform operations related to traveling of the agricultural machine 100 and operation of the working machine 300, and is also called a virtual terminal (VT).
- Operating terminal 200 may include a display device, such as a touch screen, and/or one or more buttons.
- Using the operation terminal 200, the user can perform various operations such as switching the automatic steering mode on/off, switching cruise control on/off, setting the initial position of the agricultural machine 100, setting the target route, recording or editing a map, switching between 2WD and 4WD, switching the differential lock on/off, and switching the work machine 300 on/off. At least some of these operations can also be performed by operating the operation switch group 210. Display on the operation terminal 200 is controlled by the ECU 186.
- the buzzer 220 is an audio output device that emits a warning sound to notify the user of an abnormality.
- the buzzer 220 emits a warning sound, for example, when the agricultural machine 100 deviates from the target route by a predetermined distance or more during automatic steering operation.
- a similar function may be realized by a speaker of the operation terminal 200 instead of the buzzer 220 .
- The buzzer 220 is controlled by the ECU 187.
- the drive device 340 in the work machine 300 performs operations necessary for the work machine 300 to perform a predetermined work.
- Drive device 340 includes a device, such as a hydraulic device, an electric motor, or a pump, depending on the application of work machine 300 .
- Controller 380 controls the operation of drive 340 .
- the control device 380 causes the driving device 340 to perform various operations in response to signals transmitted from the agricultural machine 100 via the communication IF 390 .
- a signal corresponding to the state of work machine 300 can also be transmitted from communication IF 390 to agricultural machine 100 .
- the agricultural machine 100 may be a working vehicle that operates unmanned and automatically. In that case, the agricultural machine 100 does not need to be provided with components necessary only for manned operation, such as a cabin, a driver's seat, a steering wheel, and an operation terminal.
- the unmanned work vehicle may perform operations similar to those in the above-described embodiments by autonomous travel or remote control by a user.
- a system that provides various functions in the embodiments can be retrofitted to agricultural machines that do not have those functions.
- Such systems can be manufactured and sold independently of agricultural machinery.
- Computer programs used in such systems may also be manufactured and sold independently of agricultural machinery.
- the computer program may be provided by being stored in a non-transitory computer-readable storage medium, for example.
- Computer programs may also be provided by download via telecommunications lines (eg, the Internet).
- the technology of the present disclosure can be applied, for example, to agricultural machines such as ride-on maintenance machines, vegetable transplanters, and tractors.
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Soil Sciences (AREA)
- Environmental Sciences (AREA)
- Guiding Agricultural Machines (AREA)
- Image Processing (AREA)
Abstract
Description
A crop row detection system and a crop row detection method according to an exemplary first embodiment of the present disclosure will be described.
(S1) From time-series color images, generate an enhanced image in which the color of the crop rows to be detected is emphasized.
(S2) From the enhanced image, generate a top view image of the ground as seen from above, in which pixels are classified into first pixels whose index value for the color of the crop row is equal to or greater than a threshold and second pixels whose index value is less than the threshold.
(S3) Determine the positions of the edge lines of the crop rows based on the index values of the first pixels.
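A minimal sketch of steps S1 to S3 follows. The specific choices here are assumptions, not fixed by the disclosure: the color index is the excess-green index 2G − R − B, the threshold is a given constant, the top view projection is taken as already applied (or the camera as looking straight down), and the edge lines are placed where the per-scan-line integral first and last reaches half of its peak.

```python
# Minimal sketch of steps S1-S3. Assumed (not fixed by the disclosure):
# excess-green index 2G - R - B as the color index, a constant threshold,
# and a 50%-of-peak rule for locating the edge lines of the crop row.
import numpy as np

def enhance(rgb):
    """S1: emphasize the crop color (excess-green index per pixel)."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    return 2.0 * g - r - b

def classify(index_img, threshold):
    """S2 (classification): first pixels keep their index value, others become 0."""
    return np.where(index_img >= threshold, index_img, 0.0)

def edge_lines(top_view):
    """S3: integrate index values along vertical scan lines (columns) and return
    the first and last scan lines whose integral is at least half the peak."""
    hist = top_view.sum(axis=0)          # one integrated value per scan line
    cols = np.flatnonzero(hist >= 0.5 * hist.max())
    return (int(cols[0]), int(cols[-1])) if cols.size else None
```

In this sketch each image column serves as one scan line; the 50%-of-peak rule is just one possible choice of the "predetermined positions on both sides of the peak".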
Next, an embodiment of an agricultural machine including the crop row detection system of the present disclosure will be described.
Claims (12)
- A crop row detection system comprising: an imaging device that is attached to an agricultural machine and captures images of the ground on which the agricultural machine travels to acquire time-series color images each including at least part of the ground; and a processing device that performs image processing on the time-series color images, wherein the processing device: generates, from the time-series color images, an enhanced image in which the color of a crop row to be detected is emphasized; generates, from the enhanced image, a top view image of the ground as seen from above, in which pixels are classified into first pixels whose index value for the color of the crop row is equal to or greater than a threshold and second pixels whose index value is less than the threshold; and determines positions of edge lines of the crop row based on the index values of the first pixels.
- The crop row detection system according to claim 1, wherein the processing device integrates the index values of the first pixels along a plurality of scan lines of the top view image to obtain integrated values, creates a histogram that associates the positions of the scan lines with the integrated values, and determines the positions of the edge lines of the crop row based on the histogram.
- The crop row detection system according to claim 2, wherein the processing device refers to the histogram and determines the positions of the edge lines of the crop row from predetermined positions on both sides of a peak of the integrated values.
- The crop row detection system according to any one of claims 1 to 3, wherein the processing device divides part or all of the top view image into a plurality of blocks and determines the positions of the edge lines for each of the plurality of blocks.
- The crop row detection system according to claim 4, wherein the plurality of blocks have band shapes that are continuous in either the image horizontal direction or the image vertical direction within the top view image, and the processing device determines the edge lines of the crop row based on band shapes in a direction different from the traveling direction of the agricultural machine.
- The crop row detection system according to claim 4 or 5, wherein the processing device determines the direction in which the crop row extends based on the positions of the edge lines in each of the plurality of blocks.
- The crop row detection system according to any one of claims 1 to 6, wherein the top view image is a bird's-eye view image of a reference plane along the ground as seen from directly above in the normal direction of the reference plane, and the processing device generates the bird's-eye view image from the time-series color images, or from preprocessed images of the time-series color images, by homography transformation.
- The crop row detection system according to claim 7, wherein the reference plane is displaced upward from the bottoms of the irregularities of the ground by a predetermined distance set according to the irregularities of the ground on which the crops are planted.
- The crop row detection system according to any one of claims 1 to 8, wherein the processing device generates and outputs a target route based on the positions of the edge lines of the crop row.
- An agricultural machine comprising: the crop row detection system according to any one of claims 1 to 9; a traveling device including steered wheels; and an automatic steering device that controls the steering angle of the steered wheels based on the positions of the edge lines of the crop row determined by the crop row detection system.
- The agricultural machine according to claim 10, wherein the processing device of the crop row detection system monitors the positional relationship between the edge lines of the crop row and the steered wheels based on the time-series color images and gives a position error signal to the automatic steering device.
- A computer-implemented crop row detection method comprising causing a computer to execute: acquiring, from an imaging device attached to an agricultural machine, time-series color images of the ground on which the agricultural machine travels, each including at least part of the ground; generating, from the time-series color images, an enhanced image in which the color of a crop row to be detected is emphasized; generating, from the enhanced image, a top view image of the ground as seen from above, in which pixels are classified into first pixels whose index value for the color of the crop row is equal to or greater than a threshold and second pixels whose index value is less than the threshold; and determining positions of edge lines of the crop row based on the index values of the first pixels.
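The homography transformation recited in claim 7 maps image coordinates onto the ground reference plane. The sketch below applies such a transformation to individual points; the 3x3 matrix H is assumed to be obtained beforehand from camera calibration (intrinsics plus the camera's position and attitude relative to the ground), and the function itself is generic.

```python
# Illustrative application of a homography to map image pixel coordinates onto
# the ground reference plane, as in the bird's-eye view generation of claim 7.
# The 3x3 matrix H is assumed to come from a prior camera calibration step.
import numpy as np

def apply_homography(H, points):
    """Map an Nx2 array of pixel coordinates to plane coordinates via H."""
    pts = np.hstack([np.asarray(points, dtype=float), np.ones((len(points), 1))])
    mapped = pts @ np.asarray(H).T
    return mapped[:, :2] / mapped[:, 2:3]    # perspective divide
```

Warping the whole image (rather than points) with the same H yields the bird's-eye view image used for the scan-line histogram.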
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020237042729A KR20240007667A (ko) | 2021-06-29 | 2022-02-04 | 작물열 검출 시스템, 작물열 검출 시스템을 구비하는 농업 기계, 및 작물열 검출 방법 |
EP22832394.5A EP4335265A1 (en) | 2021-06-29 | 2022-02-04 | Crop row detection system, agricultural machine equipped with crop row detection system, and crop row detection method |
JP2023531366A JPWO2023276226A1 (ja) | 2021-06-29 | 2022-02-04 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021107919 | 2021-06-29 | ||
JP2021-107919 | 2021-06-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023276226A1 true WO2023276226A1 (ja) | 2023-01-05 |
Family
ID=84692236
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/004547 WO2023276226A1 (ja) | 2021-06-29 | 2022-02-04 | 作物列検出システム、作物列検出システムを備える農業機械、および、作物列検出方法 |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4335265A1 (ja) |
JP (1) | JPWO2023276226A1 (ja) |
KR (1) | KR20240007667A (ja) |
WO (1) | WO2023276226A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0620031A (ja) * | 1992-06-30 | 1994-01-28 | Kubota Corp | 作業領域検出装置 |
JPH09201110A (ja) * | 1996-01-26 | 1997-08-05 | Kubota Corp | 作業車の方向検出装置、走行状態表示装置、及び走行制御装置 |
JPH09224417A (ja) * | 1996-02-27 | 1997-09-02 | Kubota Corp | 作業車の補助装置 |
JP2006101816A (ja) * | 2004-10-08 | 2006-04-20 | Univ Of Tokyo | 操向制御方法及び装置 |
US20070001096A1 (en) * | 2005-07-01 | 2007-01-04 | Jiantao Wei | Method and system for vehicular guidance using a crop image |
JP2016208871A (ja) | 2015-04-30 | 2016-12-15 | 国立大学法人 鹿児島大学 | 作業機及びその制御方法 |
- 2022
- 2022-02-04 EP EP22832394.5A patent/EP4335265A1/en active Pending
- 2022-02-04 JP JP2023531366A patent/JPWO2023276226A1/ja active Pending
- 2022-02-04 KR KR1020237042729A patent/KR20240007667A/ko unknown
- 2022-02-04 WO PCT/JP2022/004547 patent/WO2023276226A1/ja active Application Filing
Non-Patent Citations (3)
Title |
---|
OKAMOTO HIROSHI, SHUN-ICHI HATA, MUNEHIRO TAKAI: "Crop-Row Detector for Row-following Control Systems (part 1); Comparison and Evaluation of Detecting Systems", JOURNAL OF THE JAPANESE SOCIETY OF AGRICULTURAL MACHINERY, vol. 61, no. 6, 1 November 1999 (1999-11-01), pages 159 - 167, XP093018518, DOI: 10.11357/jsam1937.61.6_159 * |
OKAMOTO HIROSHI, SHUN-ICHI HATA, MUNEHIRO TAKAI: "Visual Sensor for Crop-Row Following Robot", LECTURE ABSTRACTS OF THE 58TH JAPANESE SOCIETY OF AGRICULTURAL MACHINERY, 1 April 1999 (1999-04-01), pages 231 - 231, XP093018519, DOI: 10.11357/jsam1937.61.Supplement_231 * |
PONNAMBALAM VIGNESH RAJA, BAKKEN MARIANNE, MOORE RICHARD J. D., GLENN OMHOLT GJEVESTAD JON, JOHAN FROM PÅL: "Autonomous Crop Row Guidance Using Adaptive Multi-ROI in Strawberry Fields", SENSORS, vol. 20, no. 18, 14 September 2020 (2020-09-14), pages 5249, XP093018520, DOI: 10.3390/s20185249 * |
Also Published As
Publication number | Publication date |
---|---|
EP4335265A1 (en) | 2024-03-13 |
JPWO2023276226A1 (ja) | 2023-01-05 |
KR20240007667A (ko) | 2024-01-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8855405B2 (en) | System and method for detecting and analyzing features in an agricultural field for vehicle guidance | |
US7792622B2 (en) | Method and system for vehicular guidance using a crop image | |
US8712144B2 (en) | System and method for detecting crop rows in an agricultural field | |
US7684916B2 (en) | Method and system for vehicular guidance using a crop image | |
CN110243372B (zh) | 基于机器视觉的智能农机导航系统及方法 | |
US7570783B2 (en) | Method and system for vehicular guidance using a crop image | |
US8433483B2 (en) | Method and system for vehicular guidance with respect to harvested crop | |
US8737720B2 (en) | System and method for detecting and analyzing features in an agricultural field | |
US7580549B2 (en) | Method and system for vehicular guidance using a crop image | |
CA3233542A1 (en) | Vehicle row follow system | |
JP2006101816A (ja) | 操向制御方法及び装置 | |
WO2023276226A1 (ja) | 作物列検出システム、作物列検出システムを備える農業機械、および、作物列検出方法 | |
WO2023276228A1 (ja) | 列検出システム、列検出システムを備える農業機械、および、列検出方法 | |
WO2023276227A1 (ja) | 列検出システム、列検出システムを備える農業機械、および、列検出方法 | |
WO2024095993A1 (ja) | 列検出システム、列検出システムを備える農業機械、および列検出方法 | |
WO2023120182A1 (ja) | 農業機械 | |
WO2023120183A1 (ja) | 農業機械 | |
WO2023127437A1 (ja) | 農業機械 | |
US11981336B2 (en) | Vehicle row follow system | |
US20240130263A1 (en) | Row detection system, agricultural machine having a row detection system, and method of row detection | |
WO2024095802A1 (ja) | 走行制御システム、作業車両および走行制御方法 | |
US20210185882A1 (en) | Use Of Aerial Imagery For Vehicle Path Guidance And Associated Devices, Systems, And Methods | |
WO2024004574A1 (ja) | 作業車両、制御方法およびコンピュータプログラム | |
WO2023243514A1 (ja) | 作業車両、および作業車両の制御方法 | |
WO2024004575A1 (ja) | 作業車両、および作業車両の制御方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22832394 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023531366 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022832394 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20237042729 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020237042729 Country of ref document: KR |
|
ENP | Entry into the national phase |
Ref document number: 2022832394 Country of ref document: EP Effective date: 20231207 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2301008313 Country of ref document: TH |
|
NENP | Non-entry into the national phase |
Ref country code: DE |