CN110582198B - Harvester - Google Patents

Info

Publication number
CN110582198B
CN110582198B (application CN201880029905.8A)
Authority
CN
China
Prior art keywords
weed
unit
recognition
threshing
output data
Prior art date
Legal status
Active
Application number
CN201880029905.8A
Other languages
Chinese (zh)
Other versions
CN110582198A (en)
Inventor
高原一浩
宫下隼辅
石见宪一
Current Assignee
Kubota Corp
Original Assignee
Kubota Corp
Priority date
Filing date
Publication date
Priority claimed from JP2017123439A external-priority patent/JP6854713B2/en
Priority claimed from JP2017123440A external-priority patent/JP7068781B2/en
Priority claimed from JP2017123441A external-priority patent/JP6765349B2/en
Application filed by Kubota Corp
Publication of CN110582198A
Application granted
Publication of CN110582198B

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B: SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B69/00: Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
    • A01B69/007: Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow
    • A01B69/008: Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow, automatic
    • A01D: HARVESTING; MOWING
    • A01D41/00: Combines, i.e. harvesters or mowers combined with threshing devices
    • A01D41/12: Details of combines
    • A01D41/127: Control or measuring arrangements specially adapted for combines
    • A01D41/1278: Control or measuring arrangements specially adapted for combines for automatic steering
    • A01D61/00: Elevators or conveyors for binders or combines
    • A01D69/00: Driving mechanisms or parts thereof for harvesters or mowers
    • A01F: PROCESSING OF HARVESTED PRODUCE; HAY OR STRAW PRESSES; DEVICES FOR STORING AGRICULTURAL OR HORTICULTURAL PRODUCE
    • A01F12/00: Parts or details of threshing apparatus
    • A01F12/46: Mechanical grain conveyors

Abstract

A combine harvester is provided with: a threshing depth adjusting member; a threshing depth control unit (620) that performs threshing depth adjustment control based on the length of the harvested grain stalks using the threshing depth adjusting member; a machine body position calculation unit (66) that calculates the machine body position; an image recognition module (5) that estimates a weed growth area in the captured image acquired by the imaging unit (70) and outputs recognition output data indicating the estimated weed growth area; a weed position information generation unit (51) that generates weed position information indicating the position of the weed growth area on a map, based on the recognition output data and the machine body position at the time point when the captured image was acquired; and a work travel control unit that determines the timing at which the weeds pass through the threshing depth adjusting member and performs weed entry control while the weeds pass through the threshing depth adjusting member.

Description

Harvester
Technical Field
The present invention relates to a harvester for harvesting standing grain stalks while traveling on a farm.
Background
Conventional harvesters are provided with various functions to improve work performance and work efficiency.
For example, as shown in patent document 1, a conventional combine harvester is provided with a threshing depth detection sensor for detecting the position of the head of a grain conveying stalk conveyed from a harvesting unit to a threshing device, and is configured to control a threshing depth adjustment member for adjusting the threshing depth in the threshing device based on detection information of the threshing depth detection sensor, thereby threshing the conveying stalk at a target threshing depth.
In addition, in crop harvesting work performed with a harvester in a farm, attention must be paid to locally varying crop growth states and to obstacles, including people. For example, the combine harvester according to patent document 2 includes a television camera that captures an image of the grain stalks in front of the harvesting portion and an image processing device. The image processing device compares the image from the television camera with previously stored images showing various standing states of rice stalks, and thereby detects the specific state of the stalks. For example, when the image processing device detects that part of the stalks in front of the harvesting section has lodged, the raking reel is tilted down toward the lodged stalks, which improves the cutting performance for the lodged stalks. The combine harvester according to patent document 3 is likewise provided with a television camera and an image processing device; when it detects that no grain stalks are present in front of the harvesting unit, it regards the machine as having reached unplanted ground, cuts off the power transmission to the cutter, raises the cutter, and reduces the traveling speed. This avoids the cutter striking a ridge or the like when the combine turns on the unplanted area at the edge of the field, and allows that turn to be performed smoothly.
Further, the combine harvester according to patent document 4 is equipped with a camera that photographs the rear of the machine body, and recognizes the rows of grain stalks cut by the cutting unit from the photographed image. The heading of the machine body is controlled based on the deviation between the identified plant rows and the machine body so that each grass divider passes between adjacent plant rows.
Documents of the prior art
Patent document
Patent document 1: Japanese Unexamined Patent Publication No. H08-172867
Patent document 2: Japanese Unexamined Patent Publication No. H11-155340
Patent document 3: Japanese Unexamined Patent Publication No. H11-137062
Patent document 4: Japanese Unexamined Patent Publication No. 2006-121952
Disclosure of Invention
Problems to be solved by the invention
When the harvested grain stalks harvested by the harvesting unit are conveyed to the threshing device, the technique of threshing at an appropriate threshing depth through adjustment based on the threshing depth detection sensor contributes to improved threshing performance. However, in some farms, weeds longer than the planted grain stalks grow mixed in around the planted stalks; such weeds are cut by the harvesting unit and detected by the threshing depth detection sensor. As a result, the threshing depth adjusting means adjusts the threshing depth in the threshing device to match the length of the weeds, and the threshing performance for the harvested grain stalks is lowered.
In view of such circumstances, a combine harvester capable of avoiding as much as possible a reduction in threshing performance of planted stalks even in a farm where weeds locally grow is desired.
In the combine harvester according to patent document 2 or patent document 3, the state of the grain straw in front of the harvesting portion is detected using an image processing technique, and the operation of the working equipment is controlled based on the detection result. However, the actual position of the detected straw on the map in the farm is not taken into account. Therefore, even if the problematic straw is detected during the operation travel and the avoidance control for solving the problem is performed, if the area indicating the problematic straw state is far from the combine, the avoidance control becomes too early or too late. Further, it is also difficult to appropriately determine the timing at which the avoidance control should be terminated.
In view of such circumstances, a harvester that efficiently assists harvesting work using a captured image captured by a capturing unit provided in a vehicle body is also desired.
In the combine harvesters of patent documents 2, 3 and 4, the travel device and the working devices are controlled based on detection results obtained from a captured image acquired by an on-board camera. However, when imaging is performed by a camera mounted on a combine traveling on a farm, various situations arise in which the captured image becomes unsuitable. For example, sudden vibrations may be transmitted to the on-board camera because the travel surface of the farm is not as flat as a road. The direction of the sun relative to the on-board camera may fluctuate frequently because the machine repeatedly changes direction during work travel. Dust may also be raised during cutting and other work. If the working devices or the travel device are controlled based on such an unsuitable captured image, the work travel becomes inappropriate and satisfactory results cannot be obtained.
In order to suppress such a problem, a technique capable of eliminating as much as possible an inappropriate captured image in the support of the harvesting operation using the captured image is desired.
The present invention aims to further improve the performance of various operations and the efficiency of operations on the premise of the above-described desire.
Means for solving the problems
A harvester according to an embodiment of the present invention is a combine harvester that harvests standing grain stalks while traveling on a farm, the harvester including: a harvesting unit that harvests the standing grain stalks from the farm; a grain stalk conveying device that conveys the harvested grain stalks from the harvesting unit to a threshing device; a threshing depth adjusting member provided in the grain stalk conveying device; a threshing depth control unit that performs threshing depth adjustment control based on the length of the harvested grain stalks using the threshing depth adjusting member; a machine body position calculation unit that calculates a machine body position as the map coordinates of the machine body based on positioning data from a satellite positioning module; an imaging unit that is provided in the machine body and images the farm during harvesting work; an image recognition module that receives image data of captured images sequentially and continuously acquired by the imaging unit, estimates a weed growth area in the captured images, and outputs recognition output data indicating the estimated weed growth area; a weed position information generating unit that generates weed position information indicating the position of the weed growth area on a map, based on the machine body position at the time point when the captured image was acquired and the recognition output data; and a work travel control unit that determines the timing at which weeds harvested in the weed growth area pass through the threshing depth adjusting member, and performs weed entry control while the weeds pass through the threshing depth adjusting member.
In the harvester according to this embodiment of the present invention, when a weed is present in a captured image, the weed growth area is estimated by the image recognition module from the image data of that captured image. Since the machine body position, expressed in map coordinates at the time point when the captured image was acquired, is calculated by the machine body position calculating unit, weed position information indicating the position of the weed growth area on the map is generated from the machine body position and the recognition output data indicating the weed growth area. Because the machine body position is available as map coordinates, the timing at which the weeds pass through the threshing depth adjusting member can be determined by taking into account the harvesting position of the weeds relative to the machine body and the conveyance time of the weeds from the harvesting position to the threshing depth adjusting member. By performing special weed entry control during the passage of the weeds through the threshing depth adjusting member based on this determined timing, a reduction in threshing performance for planted grain stalks mixed with weeds can be suppressed even in a farm where weeds grow locally.
A harvester according to another embodiment of the present invention is a combine harvester that harvests standing grain stalks while traveling on a farm, the harvester including: a harvesting unit that harvests the standing grain stalks from the farm; a grain stalk conveying device that conveys the harvested grain stalks from the harvesting unit to a threshing device; a threshing depth adjusting member provided in the grain stalk conveying device; a threshing depth control unit that performs threshing depth adjustment control based on the length of the harvested grain stalks using the threshing depth adjusting member; a machine body position calculation unit that calculates a machine body position as the map coordinates of the machine body based on positioning data from a satellite positioning module; an imaging unit that is provided in the machine body and images the farm during harvesting work; an image recognition module that receives image data of captured images sequentially and continuously acquired by the imaging unit, estimates a weed growth area in the captured images, and outputs recognition output data indicating the estimated weed growth area; a weed position information generating unit that generates weed position information indicating the position of the weed growth area on a map, based on the machine body position at the time point when the captured image was acquired and the recognition output data; and a work travel control unit that performs weed entry control while the harvesting unit passes through the weed growth area.
In a harvester according to an embodiment of the invention, weed entry control is performed during passage of the harvesting portion through the weed growth area. In this configuration, it is not necessary to consider the transport time of weeds and the like, and therefore there is an advantage that control becomes simple.
Further, the timing at which the harvesting unit passes through the weed growth area can be accurately obtained from the machine body position on the map calculated based on the positioning data from the satellite positioning module, the distance between the machine body position and the harvesting unit, and the position of the weed growth area on the map included in the weed position information. Therefore, in a preferred embodiment of the present invention, the timing at which the harvesting unit passes through the weed growth area is determined based on the weed position information and the machine body position calculated by the machine body position calculating unit.
If weeds taller than the planted grain stalks enter the threshing depth adjusting member, the threshing depth adjusting member treats the weeds as harvested grain stalks, and the threshing depth control unit therefore adjusts the threshing depth to match the length of the weeds. The adjusted threshing depth is then unsuitable for the harvested grain stalks. Since the length of the harvested grain stalks does not vary that rapidly, it is preferable, when weeds enter the threshing depth adjusting member, to temporarily interrupt the threshing depth control without changing the threshing depth. Therefore, in a preferred embodiment of the present invention, the threshing depth adjustment control is interrupted by performing the weed entry control.
Further, since weeds and planted grain stalks grow mixed together, when harvesting work is performed on the weed growth area, the amount of material to be processed, including the weeds and the harvested grain stalks, increases, and the load on working devices such as the harvesting unit, the grain stalk conveying device, and the threshing device increases. To avoid such an overload, it is preferable to reduce the vehicle speed. Therefore, in a preferred embodiment of the present invention, the vehicle speed is reduced by performing the weed entry control.
A harvester according to an embodiment of the present invention is a harvester that harvests a crop while traveling on a farm, the harvester including: a machine body position calculation unit that calculates a machine body position as the map coordinates of the machine body based on positioning data from a satellite positioning module; an imaging unit that is provided in the machine body and images the farm during harvesting work; an image recognition module that receives image data of captured images sequentially and continuously acquired by the imaging unit, estimates an existing region in which a recognition target object exists in the captured image, and outputs recognition output data including the existing region and the estimation probability at the time the existing region was estimated; and a recognition target position information generating unit that generates recognition target position information indicating the position of the recognition target object on a map, based on the machine body position at the time point when the captured image was acquired and the recognition output data.
In the harvester according to an embodiment of the present invention, the image recognition module that estimates the presence region in the captured image in which the recognition target exists based on the image data as the captured image is configured by using a technique such as a neural network (including deep learning) or reinforcement learning, for example. Further, since the estimation probability when estimating the presence area can be simultaneously output, optimal control using the probability value can be performed. Further, the body position calculating unit calculates a body position indicated by map coordinates at the time point when the captured image is acquired, and generates recognition object position information indicating a position of the recognition object on the map based on the body position and recognition output data indicating a presence area of the recognition object. With this configuration, it is possible to perform control of the harvesting work support in consideration of the estimation probability of the region where the recognition target object is estimated to exist and the position of the recognition target object on the map, and as a result, in consideration of the distance between the recognition target object and the harvesting machine.
In the captured image of the farm captured by the imaging unit, the resolution of the recognition object farther from the imaging unit is lower than the resolution of the recognition object closer to the imaging unit due to the far-near relationship.
Therefore, the recognition reliability of the recognition object captured at a position far from the image capturing unit is lower than that of the recognition object captured at a position near to the image capturing unit. In contrast, in a preferred embodiment of the present invention, the apparatus is configured such that: the estimated probability of the recognition target object is reduced as the recognition target object is farther from the imaging unit in the captured image.
Since captured images are acquired at intervals that are short relative to the travel speed of the harvester, a plurality of pieces of recognition target position information are usually generated for the same recognition target object. It is therefore possible to statistically process the estimation probabilities included in these pieces of position information relating to the same recognition target object, and thereby generate more reliable recognition target position information. The statistical operation here refers to an arithmetic average, a weighted average, a median operation, or the like, that is, an operation that derives a more reliable data value from a plurality of data values. By applying such a statistical operation to the estimation probabilities, more appropriate recognition target position information can be derived from the plurality of pieces of recognition target position information.
Therefore, in a preferred embodiment of the present invention, the structure is: storing a plurality of pieces of the identification target position information, and correcting the stored plurality of pieces of the identification target position information based on a result of a statistical operation of the estimated probabilities included in the corresponding identification output data.
Although the captured image acquired by the imaging unit provided in the machine body is the input source of the image recognition module, the captured image may, depending on the imaging conditions, become an image unsuitable for recognizing the recognition target object. As a result, even if the recognition target object is recognized, the estimation probability of the recognition output data is lowered, so it is not preferable to use that recognition output data as it is for subsequent processing. Therefore, a preferred embodiment of the present invention includes: a data storage unit that temporarily stores, in time-series order, the recognition output data sequentially output from the image recognition module as a recognition output data string; and a data determination unit that determines recognition output data whose estimation probability is lower, by a predetermined level or more, than the estimation probabilities of the preceding and following recognition output data in the recognition output data string to be inappropriate recognition output data. The recognition output data determined to be inappropriate may be deleted, or its estimation probability may be replaced with an estimation probability interpolated from the preceding and following recognition output data.
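As a concrete illustration of this determination and the interpolation option, the following Python sketch reduces each item of recognition output data to a time-stamped estimation probability; the threshold DROP_LEVEL stands in for the "predetermined level", whose actual value the patent leaves open, and all names are illustrative rather than taken from the embodiment.

```python
from dataclasses import dataclass
from typing import List

DROP_LEVEL = 0.3  # hypothetical "predetermined level"; the patent does not fix the value

@dataclass
class RecognitionOutput:
    timestamp: float     # acquisition time of the captured image
    probability: float   # estimation probability of the presence region
    regions: list        # presence regions (rectangles) estimated in the image

def mark_inappropriate(seq: List[RecognitionOutput]) -> List[bool]:
    """Flag entries whose probability drops by DROP_LEVEL or more below
    both the preceding and the following entry of the stored data string."""
    flags = [False] * len(seq)
    for i in range(1, len(seq) - 1):
        prev_p, next_p = seq[i - 1].probability, seq[i + 1].probability
        if (prev_p - seq[i].probability >= DROP_LEVEL and
                next_p - seq[i].probability >= DROP_LEVEL):
            flags[i] = True
    return flags

def interpolate_or_delete(seq: List[RecognitionOutput], flags: List[bool],
                          delete: bool = False) -> List[RecognitionOutput]:
    """Either drop flagged entries or replace their probability with the
    mean of the neighbouring (appropriate) probabilities."""
    if delete:
        return [r for r, bad in zip(seq, flags) if not bad]
    for i, bad in enumerate(flags):
        if bad:
            seq[i].probability = (seq[i - 1].probability + seq[i + 1].probability) / 2.0
    return seq
```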
Recognition target objects that are important for a harvester harvesting crops while traveling on a farm are people, lodged grain stalks, weeds, ridges, and the like. Recognition of a person, who is an obstacle to work travel, triggers obstacle avoidance control or an obstacle alarm notification, and is therefore important for safe travel in the farm. Recognition of lodged grain stalks or weeds is used for lodged-stalk control or weed control in the work control, and is therefore important for high-quality harvesting work. Recognition of a ridge detects the boundary line of the farm, and is therefore important for turning control on the headland (the unplanted area at the edge of the farm).
A harvester according to an embodiment of the present invention includes: a machine body position calculation unit that calculates a machine body position as the map coordinates of the machine body based on positioning data from a satellite positioning module; a travel track calculation unit that calculates the travel track of the machine body from the machine body position; an imaging unit that is provided in the machine body and images the farm during harvesting work; an image recognition module that receives image data of captured images sequentially and continuously acquired by the imaging unit, estimates an existing region in which a recognition target object exists in the captured image, and outputs recognition output data including the existing region and the estimation probability at the time the existing region was estimated; and a data determination unit that predicts, based on the travel track, the range in which the existing region in the preceding captured image should be located in the next captured image, and determines the recognition output data based on the next captured image to be inappropriate recognition output data when the difference between the estimation probability of the existing region in the next captured image that overlaps this range and the estimation probability of the existing region in the preceding captured image is equal to or greater than a predetermined allowable amount.
When a recognition target object appears ahead while the harvester is traveling for work, a plurality of captured images in which the recognition target object is captured are usually acquired. Therefore, for each of such a plurality of captured images, the existence region of the recognition target object is estimated, and the recognition output data including the estimated probability thereof is output. When the recognition target object recognized in each of the plurality of captured images is substantially a stationary object, the recognition target object exists at a different position in each captured image according to the body position in which each captured image is captured, that is, according to the travel locus. Therefore, the range in which the presence region of the recognition object in the captured image should be located can be predicted based on the travel locus.
In a case where the existence region of the recognition target object is not within the range predicted in this way, or in a case where the estimated probability thereof is greatly different from that based on other captured images even if the existence region is within the predicted range, the captured image is regarded as inappropriate. Further, if the captured image is not appropriate, it is obvious that the recognition output data outputted based on the captured image is also not appropriate. In this way, it is advantageous to predict a range in which the existing region in the last captured image should be located in the next captured image based on the travel locus, and to determine the recognition output data based on the next captured image as inappropriate recognition output data when the difference between the estimated probability of the existing region in the next captured image that overlaps with the range and the estimated probability of the existing region in the last captured image is equal to or greater than a predetermined allowable amount.
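The determination described above can be sketched as follows, under the simplifying assumption that the existing regions of the preceding and next captured images have already been converted into common map coordinates using the travel track, so that a stationary recognition target should reappear within a small margin of its previous footprint; MARGIN and ALLOWED_DIFF are illustrative values, not taken from the patent.

```python
from dataclasses import dataclass

ALLOWED_DIFF = 0.25   # illustrative allowable probability difference
MARGIN = 0.5          # metres added around the previous footprint (illustrative)

@dataclass
class MapRegion:
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    probability: float   # estimation probability of this existing region

def predicted_range(prev: MapRegion) -> MapRegion:
    # A stationary recognition target should reappear at almost the same map
    # position; the travel track is already reflected in the map conversion.
    return MapRegion(prev.x_min - MARGIN, prev.x_max + MARGIN,
                     prev.y_min - MARGIN, prev.y_max + MARGIN, prev.probability)

def overlaps(a: MapRegion, b: MapRegion) -> bool:
    return not (a.x_max < b.x_min or b.x_max < a.x_min or
                a.y_max < b.y_min or b.y_max < a.y_min)

def is_inappropriate(prev: MapRegion, nxt: MapRegion) -> bool:
    rng = predicted_range(prev)
    if overlaps(rng, nxt):
        # region is where the travel track predicts it, so compare probabilities
        return abs(nxt.probability - prev.probability) >= ALLOWED_DIFF
    return True   # region did not appear within the predicted range
```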
A preferred embodiment of the present invention includes a data correction unit that replaces the estimation probability of the inappropriate recognition output data with an estimation probability interpolated from the estimation probabilities of the preceding and following recognition output data. With this configuration, the estimation probability of recognition output data determined to be inappropriate is corrected by interpolation from the estimation probabilities of the recognition output data before and after it, so that the inappropriate recognition output data can be replaced with appropriate recognition output data and used. This is particularly effective when the number of recognition output data items concerning the same recognition target object is small.
Here too, recognition target objects that are important for a harvester harvesting crops while traveling on a farm are people, lodged grain stalks, weeds, ridges, and the like. Recognition of a person, an obstacle to work travel, can trigger obstacle avoidance control or an obstacle alarm notification, and is therefore important for safe travel in the farm. Recognition of lodged grain stalks or weeds is used for lodged-stalk control or weed control in the work control, and is therefore important for high-quality harvesting work. Recognition of a ridge detects the boundary line of the farm, and is therefore important for turning control on the headland at the edge of the farm.
Drawings
Fig. 1 is an overall side view of the combine harvester according to embodiment 1.
Fig. 2 is a side view of the harvesting portion in embodiment 1.
Fig. 3 is an explanatory diagram of threshing depth control in embodiment 1.
Fig. 4 is a functional block diagram showing a control system of the combine harvester according to embodiment 1.
Fig. 5 is an explanatory diagram schematically showing a flow of generation of recognition output data by the image recognition module in embodiment 1.
Fig. 6 is a flowchart showing a flow of control for calculating the weed position from the captured image including the weed growth area and the body position and interrupting the threshing depth control in embodiment 1.
Fig. 7 is a schematic diagram showing a weed map obtained by mapping the weed location in embodiment 1.
Fig. 8 is an overall side view of the combine harvester according to embodiment 2.
Fig. 9 is a functional block diagram showing a control system of the combine harvester according to embodiment 2.
Fig. 10 is an explanatory diagram schematically showing a flow of generation of recognition output data by the image recognition module in embodiment 2.
Fig. 11 is a data flow chart showing a flow of data when generating recognition target position information from a captured image in embodiment 2.
Fig. 12 is a schematic diagram showing a weed map obtained by mapping weed position information, which is one of identification target position information, in embodiment 2.
Fig. 13 is an overall side view of the combine harvester according to embodiment 3.
Fig. 14 is a functional block diagram showing a control system of the combine harvester according to embodiment 3.
Fig. 15 is an explanatory diagram schematically showing a flow of generation of recognition output data by the image recognition module in embodiment 3.
Fig. 16 is a data flow chart showing a flow of data when generating recognition target position information from a captured image in embodiment 3.
Fig. 17 is an explanatory diagram schematically illustrating data processing for determining inappropriate identification output data in embodiment 3.
Fig. 18 is a schematic diagram showing a weed map obtained by mapping weed position information, which is one of identification target position information in embodiment 3.
Detailed Description
Hereinafter, embodiments of a combine harvester, as an example of a harvester according to the present invention, will be described with reference to the drawings. In each embodiment, the front-rear direction of the machine body 1 is defined along the machine body traveling direction in the working state. The direction indicated by the symbol (F) in fig. 1 is the front side of the machine body, and the direction indicated by the symbol (B) in fig. 1 is the rear side of the machine body. The left-right direction of the machine body 1 is defined as viewed in the machine body traveling direction.
[ embodiment 1]
As shown in fig. 1, in the combine harvester according to embodiment 1, a harvesting unit 2 is connected to the front portion of a machine body 1, which includes a pair of right and left crawler travel devices 10, so as to be capable of being raised and lowered about a transverse axis X. A threshing device 11 and a grain tank 12 for storing grain are provided at the rear of the machine body 1, arranged side by side in the transverse width direction of the machine body. A cab 14 covering the operator's station is provided at the front right portion of the machine body 1, and a driving engine 15 is provided below the cab 14.
As shown in fig. 1, the threshing device 11 receives into its interior the harvested grain stalks harvested by the harvesting unit 2 and conveyed rearward, and threshes the ear tip side in the threshing cylinder 113 while the roots of the stalks are pinched and conveyed by the threshing feed chain 111 and the pinching guide rail 112. Grain sorting of the threshed material is then performed by a sorting unit provided below the threshing cylinder 113, and the sorted grain is conveyed to the grain tank 12 and stored. Although not described in detail, a grain discharge device 13 is provided for discharging the grain stored in the grain tank 12 to the outside.
The harvesting section 2 includes a plurality of raising devices 21 for raising the lodged standing grain stalks, pusher-type cutting devices 22 for cutting the roots of the raised standing grain stalks, a grain stalk conveying device 23, and the like. The straw conveyor 23 conveys the cut straws in the vertical posture, which are cut into roots, to the starting end of the threshing feed chain 111 of the threshing device 11 located on the rear side of the machine body while gradually changing the vertical posture to the horizontal posture.
The straw conveying device 23 includes a converging conveying unit 231 for conveying a plurality of cut straws cut by the cutting device 22 while gathering them at the center in the cutting width direction, a root holding conveying device 232 for holding the roots of the gathered cut straws and conveying them backward, a head end locking conveying device 233 for locking and conveying the head ends of the cut straws, and a supply conveying device 234 for guiding the roots of the cut straws from the end of the root holding conveying device 232 to the threshing supply chain 111.
As shown in fig. 2, the plant root holding and conveying device 232 is supported by the support frame of the harvesting section 2 so as to be swingable around the horizontal axis. The root gripping and conveying device 232 is operated to swing up and down by driving the operating member 235, and is provided so that the conveying end portion is changed in position in the stalk length direction of the grain stalks with respect to the supply and conveying device 234 in accordance with the swing operation. The driving operation member 235 has an electric motor 236 for adjusting the threshing depth (hereinafter referred to as a threshing depth motor) as a driving source. The driving operation member 235 includes an operation rod 237 pushed or pulled by the threshing depth motor 236. The lower end of the operation rod 237 is pivotally connected to the middle part of the plant root holding and conveying device 232.
When the conveying terminal end portion of the root holding and conveying device 232 moves away from the supply conveying device 234, the harvested grain stalks are transferred to the supply conveying device 234 with the grip position of the supply conveying device 234 on the stalks shifted toward the ear tip side relative to the root grip position of the root holding and conveying device 232. As a result, the depth (threshing depth) to which the harvested grain stalks enter the threshing device 11 becomes shallower (shallow threshing side).
When the conveying terminal end portion of the root holding and conveying device 232 approaches the supply conveying device 234, the harvested grain stalks are transferred to the supply conveying device 234 with the grip position of the supply conveying device 234 close to the root grip position of the root holding and conveying device 232. As a result, the threshing depth of the harvested grain stalks with respect to the threshing device 11 becomes deeper (deep threshing side).
By changing the posture of the root holding and conveying device 232 in this way, the threshing depth of the harvested grain stalks with respect to the threshing device 11 can be changed. That is, the threshing depth adjusting means 3 capable of changing the threshing depth of the harvested grain stalks with respect to the threshing device 11 is constituted by the root holding and conveying device 232 and the drive operation means 235.
As shown in fig. 3, the combine harvester according to embodiment 1 includes a contact type straw length detection device 30 for detecting the straw length of the harvested straws conveyed by the straw conveyance device 23, and a threshing depth control unit 620. The threshing depth control unit 620 performs threshing depth adjustment control for adjusting the threshing depth based on the detection result of the straw length detection device 30. In this embodiment, the threshing depth control unit 620 controls the threshing depth motor 236 so that the threshing depth of the harvested grain stalks relative to the threshing device 11 is maintained within a target setting range.
As shown in fig. 3, the stalk length detecting device 30 is configured such that a pair of swing-type sensor arms 32 and 33 are provided on a main body housing 31, which is a device main body formed in a substantially bottomless box shape opened downward, and the pair of sensor arms 32 and 33 detect the stalk length by coming into contact with the ear tip side portion of the harvested grain stalks. The pair of sensor arms 32 and 33 hang downward from the main body housing 31, with their upper portions supported by the housing, spaced apart in the stalk length direction of the conveyed harvested stalks. Each of the sensor arms 32 and 33 is supported so as to be swingable in the front-rear direction (corresponding to the conveying direction of the harvested stalks) about a horizontal axis provided inside the main body housing 31, and tends to return to a downward reference posture.
The upper base end side portions of the pair of sensor arms 32 and 33 are provided with detection switches 34 and 35, respectively, and the detection switches 34 and 35 are turned on when the sensor arms 32 and 33 swing by a set amount or more from the reference posture by contacting with the conveyed mowing stalk, and are turned off when the swing amount of the sensor arms 32 and 33 from the reference posture is smaller than the set amount.
The outputs of the pair of detection switches 34 and 35 are input to the threshing depth control unit 620. The threshing depth control unit 620 controls the operation of the threshing depth motor 236 so that the detection switch 35 located on the ear tip side is turned off and the detection switch 34 located on the root side is turned on.
That is, the threshing depth control unit 620 operates the threshing depth motor 236 so that the plant root holding conveyor 232 moves to the shallow threshing side when both the pair of detection switches 34, 35 are in the on state. The threshing depth control unit 620 operates the threshing depth motor 236 so that the plant root holding and conveying device 232 moves to the deep threshing side when both the pair of detection switches 34 and 35 are in the off state. Further, when the detection switch 35 located on the ear tip side among the pair of detection switches 34, 35 is in the off state and the detection switch 34 located on the plant root side is in the on state, the threshing depth control unit 620 stops the operation of the threshing depth motor 236 and maintains the state.
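The three-state behaviour described above can be summarised in a small sketch; the returned command names are illustrative, and in the actual embodiment the threshing depth control unit 620 drives the threshing depth motor 236 directly.

```python
def threshing_depth_command(root_side_on: bool, ear_side_on: bool) -> str:
    """Decide the threshing depth motor action from the two detection switches
    (34: root side, 35: ear tip side), as in the standard control described above.
    Command names are illustrative, not taken from the patent."""
    if root_side_on and ear_side_on:
        return "move_shallow"   # stalks reach too far in: shift to the shallow threshing side
    if not root_side_on and not ear_side_on:
        return "move_deep"      # stalks do not reach far enough: shift to the deep threshing side
    if root_side_on and not ear_side_on:
        return "hold"           # target state: stop the threshing depth motor
    return "hold"               # ear-side only ON is not expected; keep the current depth
```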
As shown in fig. 1, an imaging unit 70 including a color camera is provided at the front end of the ceiling of the cab 14. The front-rear direction extension of the imaging field of view of the imaging unit 70 extends from the front end region of the cutting unit 2 to almost the horizon. The width of the imaging field of view is from about 10 meters to several tens of meters. The captured image obtained by the imaging unit 70 is converted into image data and transmitted to a control system of the combine.
The imaging unit 70 images the farm during harvesting work. The control system of the combine harvester has a function of recognizing a weed growth area as the recognition target object based on the image data sent from the imaging unit 70. In fig. 1, the normal standing grain stalk group is denoted by the symbol Z0, and the weed growth area is denoted by the symbol Z1.
Further, a satellite positioning module 80 is disposed on the ceiling of the cab 14. The satellite positioning module 80 includes a satellite antenna for receiving GNSS (global navigation satellite system) signals (including GPS signals). To supplement the satellite navigation performed by the satellite positioning module 80, an inertial navigation unit incorporating a gyro acceleration sensor or a magnetic direction sensor is built into the satellite positioning module 80. Of course, the inertial navigation unit may be arranged elsewhere. In fig. 1, the satellite positioning module 80 is drawn at the rear of the ceiling of the cab 14 for convenience of illustration, but it is preferably disposed closer to the center of the machine body than the front end of the ceiling, as close as possible to a position directly above the left-right center of the cutting device 22.
Fig. 4 shows a functional block diagram of a control system built inside the body 1 of the combine harvester. The control system according to this embodiment is configured by a large number of wiring networks such as an electronic control unit called an ECU, various operating devices, a sensor group, a switch group, and an in-vehicle LAN for data transmission therebetween. The notification device 91 is a device for notifying a driver or the like of a work travel state or various warnings, and is a buzzer, a lamp, a speaker, a display, or the like. The communication unit 92 is used in the control system of the combine harvester to exchange data with the cloud computer system 100 or the portable communication terminal 200 provided at a remote location. Here, the portable communication terminal 200 is a tablet computer operated by a monitor (including a driver) at a work travel site. The control unit 6 is a core element of the control system, and is shown as an aggregate of a plurality of ECUs. The positioning data obtained by the satellite positioning module 80 and the image data obtained by the imaging unit 70 are input to the control unit 6 through a wiring network.
The control unit 6 includes an output processing unit 6B and an input processing unit 6A as input/output interfaces. The output processing unit 6B is connected to a vehicle travel device group 7A and a work equipment device group 7B. The vehicle travel device group 7A includes control devices related to vehicle travel, such as an engine control device, a shift control device, a brake control device, and a steering control device. The work equipment device group 7B includes power control devices in the harvesting unit 2, the threshing device 11, the grain discharging device 13, the grain stalk conveying device 23, the threshing depth adjusting means 3, and the like.
The traveling system detection sensor group 8A, the work system detection sensor group 8B, and the like are connected to the input processing unit 6A. The running system detection sensor group 8A includes sensors for detecting the states of an engine revolution number adjusting means, an accelerator pedal, a brake pedal, a shift operation means, and the like. The work system detection sensor group 8B includes sensors for detecting the device states in the harvesting section 2, the threshing device 11, the grain discharging device 13, the straw conveying device 23, and the state of the straw or grain. Further, the work system detection sensor group 8B further includes the detection switches 34 and 35 in the threshing depth adjusting means 3 described above.
The control unit 6 includes an image recognition module 5, a data processing module 50, a work travel control module 60 as a work travel control unit, a machine body position calculation unit 66, a notification unit 67, and a weed position calculation unit 68.
The notification portion 67 generates notification data based on instructions or the like received from the respective functional portions of the control unit 6, and supplies to the notification device 91. The body position calculating unit 66 calculates the body position as the map coordinates (or the farm coordinates) of the body 1 based on the positioning data sequentially transmitted from the satellite positioning module 80. The weed position calculating unit 68 of this embodiment determines the timing at which the weeds mowed by the mowing unit 2 pass through the threshing depth adjusting means 3, based on the machine body position, which is usually the antenna position, calculated by the machine body position calculating unit 66, the position of the weed growth area on the map calculated by the data processing module 50, and the conveying speed of the grain and stalk conveyor 23.
The combine harvester according to the embodiment can travel by both automatic travel (automatic steering) and manual travel (manual steering). The work travel control module 60 includes an automatic work travel command unit 63 and a travel route setting unit 64 in addition to the travel control unit 61 and the work control unit 62. A running mode switch (not shown) for selecting either an automatic running mode for running by automatic steering or a manual steering mode for running by manual steering is provided in the cab 14. By operating the running mode switch, it is possible to realize a transition from the manual steering running to the automatic steering running, or a transition from the automatic steering running to the manual steering running.
The travel control unit 61 has an engine control function, a steering control function, a vehicle speed control function, and the like, and supplies travel control signals to the vehicle travel device group 7A. The work control unit 62 supplies work control signals to the work equipment device group 7B to control the operation of the harvesting unit 2, the threshing device 11, the grain discharging device 13, the grain stalk conveying device 23, and the like. The work control unit 62 also includes the threshing depth control unit 620 described with reference to fig. 3.
When the manual steering mode is selected, the travel control unit 61 generates a control signal based on the operation of the driver, and controls the vehicle travel equipment group 7A. When the automatic steering mode is selected, the running control section 61 controls the steering-related vehicle running equipment group 7A or the vehicle running equipment group 7A related to the vehicle speed based on the automatic running command supplied from the automatic work running command section 63.
The travel route setting unit 64 expands the travel route for automatic travel created in any one of the control unit 6, the mobile communication terminal 200, the cloud computer system 100, and the like into a memory. The travel route expanded in the memory is sequentially used as a target travel route in automatic travel. Even with manual travel, the travel path can be used to guide the combine along the travel path.
More specifically, the automatic work travel command unit 63 generates an automatic steering command and a vehicle speed command, and supplies them to the travel control unit 61. The automatic steering command is generated so as to eliminate the azimuth offset and the positional offset between the travel route set by the travel route setting unit 64 and the vehicle position calculated by the body position calculating unit 66. The vehicle speed command is generated based on a preset vehicle speed value. Further, the automatic work travel command section 63 provides a work device operation command to the work control section 62 in accordance with the vehicle position or the traveling state of the vehicle.
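A minimal sketch of how such an automatic steering command could be formed from the positional offset and azimuth offset is shown below; the proportional gains and sign conventions are assumptions, since the patent does not specify the control law.

```python
import math

K_POS, K_HEADING = 0.8, 1.5   # illustrative gains; not specified in the patent

def steering_command(body_x, body_y, body_heading,
                     route_x, route_y, route_heading):
    """Compute a steering value intended to reduce both the positional offset and
    the azimuth offset between the set travel route and the calculated body position."""
    # signed lateral offset of the body from the target route (left of the route is positive)
    dx, dy = body_x - route_x, body_y - route_y
    lateral = -dx * math.sin(route_heading) + dy * math.cos(route_heading)
    # heading error wrapped into (-pi, pi]
    heading_error = (route_heading - body_heading + math.pi) % (2 * math.pi) - math.pi
    return -K_POS * lateral + K_HEADING * heading_error
```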
The image data of the captured image sequentially acquired by the image capturing unit 70 in succession is input to the image recognition module 5. The image recognition module 5 estimates a presence region where the recognition target object exists in the captured image, and outputs, as a recognition result, recognition output data including the presence region and an estimation probability at the time of estimation of the presence region. The image recognition module 5 is constructed using a neural network technique employing deep learning.
Fig. 5 and 6 show a flow of generation of recognition output data by the image recognition module 5. The pixel values of the RGB image data are input to the image recognition module 5 as input values. In this embodiment, the estimated recognition object is a weed. Therefore, the recognition output data as the recognition result contains the weed growth area represented by the rectangle and the estimated probability at which the weed growth area is estimated.
In fig. 5, the estimation result is shown schematically, and the weed growth area is represented by a rectangular box given the symbol F1. Since each weed growth area is defined by four corner points, the coordinate positions of the four corner points of each rectangle on the captured image are also included in the estimation result. Of course, if no weed as the recognition target is estimated, no weed growth area is output, and the estimation probability is zero.
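For illustration, the recognition output data described here might be represented by a structure such as the following sketch; the field names are assumptions, and only the content named in the text (the four corner points of each rectangle and its estimation probability, with an empty result when no weed is estimated) is modelled.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]   # (u, v) pixel coordinates on the captured image

@dataclass
class WeedRegion:
    corners: Tuple[Point, Point, Point, Point]  # four corner points of the rectangle F1
    probability: float                          # estimation probability of this region

@dataclass
class RecognitionOutputData:
    capture_time: float                                       # when the captured image was acquired
    regions: List[WeedRegion] = field(default_factory=list)   # empty list: no weed estimated, probability zero
```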
In this embodiment, the image recognition module 5 sets the internal parameters such that the estimated probability of the recognition target (weed) is reduced as the recognition target is farther from the imaging unit 70 in the captured image. This makes it possible to strictly recognize the recognition target object in the imaging area where the resolution is lowered by being away from the imaging unit 70, thereby reducing erroneous recognition.
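One way to realise such distance-dependent attenuation, shown purely as an illustration, is a post-processing weight applied to the estimation probability; in the embodiment the behaviour is built into the internal parameters of the image recognition module 5, and the exponential form and half-distance used here are assumptions.

```python
def attenuate(probability: float, distance_m: float, half_dist_m: float = 15.0) -> float:
    """Reduce the estimation probability for regions farther from the imaging unit.
    half_dist_m (the distance at which the weight halves) is an illustrative value."""
    weight = 0.5 ** (distance_m / half_dist_m)
    return probability * weight
```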
The data processing module 50 processes the recognition output data output from the image recognition module 5. As shown in fig. 4 and 6, the data processing module 50 of this embodiment includes a weed position information generating unit 51 and a statistical processing unit 52.
The weed position information generating unit 51 generates weed position information indicating the position of the recognition target object on the map based on the recognition output data and the machine body position at the time point when the captured image was acquired. The position on the map of the weed presence area included in the recognition output data is obtained by converting the coordinate positions (camera coordinate positions) of the four corner points of the rectangle representing the weeds on the captured image into coordinates on the map.
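A simplified sketch of this conversion is given below, assuming a flat farm surface, a pre-calibrated image-to-ground homography, and a machine body pose (position and heading) taken from the positioning data; the calibration and all function names are assumptions, not part of the patent.

```python
import math

def body_to_map(x_body, y_body, body_x, body_y, body_heading):
    """Rotate and translate a point given in body coordinates (x forward, y left)
    into map coordinates using the body position from the positioning data."""
    mx = body_x + x_body * math.cos(body_heading) - y_body * math.sin(body_heading)
    my = body_y + x_body * math.sin(body_heading) + y_body * math.cos(body_heading)
    return mx, my

def pixel_to_body(u, v, ground_homography):
    """Project an image pixel onto the (assumed flat) farm surface using a
    pre-calibrated 3x3 homography; the calibration itself is outside the patent text."""
    h = ground_homography
    w = h[2][0] * u + h[2][1] * v + h[2][2]
    x = (h[0][0] * u + h[0][1] * v + h[0][2]) / w
    y = (h[1][0] * u + h[1][1] * v + h[1][2]) / w
    return x, y

def weed_region_on_map(corners_px, ground_homography, body_x, body_y, body_heading):
    """Convert the four corner points of a weed rectangle into map coordinates."""
    return [body_to_map(*pixel_to_body(u, v, ground_homography),
                        body_x, body_y, body_heading)
            for (u, v) in corners_px]
```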
The imaging unit 70 acquires the captured images at predetermined time intervals, for example, at intervals of 0.5 second, and inputs the image data to the image recognition module 5, so that the image recognition module 5 also outputs recognition output data at the same time intervals. Therefore, when a weed enters the imaging field of the imaging unit 70, the presence area for the same weed is included in the plurality of identification output data. As a result, information on the positions of a plurality of weeds with respect to the same weed can be obtained. In this case, the estimated probability included in the identification output data as each piece of raw data, that is, the estimated probability of the weed existing region (weed growth region) included in the weed position information is often different in value due to the difference in the positional relationship between the imaging unit 70 and the weeds.
Therefore, in this embodiment, such a plurality of pieces of weed position information are stored, and statistical operations are performed on the estimated probabilities included in each of the stored plurality of pieces of weed position information. A representative value of the estimated probability group is obtained by using a statistical operation of estimated probabilities for a plurality of pieces of identification target position information. The representative value is used to correct the position information of the plurality of recognition objects to be the optimal recognition object position information. An example of such correction is to obtain an arithmetic average value, a weighted average value, or an intermediate value of each estimated probability as a reference value (representative value), obtain a logical sum of presence areas (weed growth areas) having estimated probabilities equal to or higher than the reference value, and generate corrected weed position information in which the presence areas are set as optimal presence areas. Of course, other statistical operations can be used to generate a highly reliable weed location information.
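The correction described above might look like the following sketch, in which each record of weed position information is reduced to an axis-aligned rectangle in map coordinates, the arithmetic mean of the estimation probabilities serves as the reference value, and the logical sum of the presence areas at or above the reference is approximated by their bounding rectangle; these representational choices are assumptions, not specified in the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class WeedPositionInfo:
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    probability: float   # estimation probability carried over from the recognition output data

def corrected_weed_area(records: List[WeedPositionInfo]) -> WeedPositionInfo:
    """Derive one corrected weed growth area from several stored records
    for the same weed (records is assumed to be non-empty)."""
    reference = sum(r.probability for r in records) / len(records)   # arithmetic mean as reference value
    selected = [r for r in records if r.probability >= reference]
    # logical sum (union) of the selected presence areas, approximated by a bounding rectangle
    return WeedPositionInfo(
        x_min=min(r.x_min for r in selected),
        x_max=max(r.x_max for r in selected),
        y_min=min(r.y_min for r in selected),
        y_max=max(r.y_max for r in selected),
        probability=reference,
    )
```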
As shown in fig. 6, the weed position information indicating the position on the map of the weed growth area thus obtained is supplied to the weed position calculating unit 68. The machine body position calculated as map coordinates by the machine body position calculating unit 66 is also supplied to the weed position calculating unit 68. The weed position calculating unit 68 determines the timing at which the weeds harvested by the harvesting unit 2 pass through the threshing depth adjusting means 3, based on the weed position information, the machine body position, and the conveying speed of the grain stalk conveying device 23. While the weeds pass through the threshing depth adjusting means 3, the weed position calculating unit 68 supplies a weed entry flag to the threshing depth control unit 620.
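How the weed position calculating unit 68 might derive the interval during which the weed entry flag is raised can be sketched as follows; the distances, the conveying path length, and the safety margin are illustrative parameters, not values given in the patent.

```python
def weed_entry_interval(dist_to_weed_start_m, dist_to_weed_end_m,
                        vehicle_speed_mps, conveyor_path_m, conveyor_speed_mps,
                        margin_s=0.5):
    """Return (t_on, t_off), the time window (measured from now) during which the
    weed entry flag is supplied to the threshing depth control unit.
    conveyor_path_m is the conveying distance from the cutting position to the
    threshing depth adjusting member; margin_s is an illustrative safety margin."""
    delay = conveyor_path_m / conveyor_speed_mps            # cutting point -> adjusting member
    t_cut_start = dist_to_weed_start_m / vehicle_speed_mps  # front edge of the weed area is cut
    t_cut_end = dist_to_weed_end_m / vehicle_speed_mps      # rear edge of the weed area is cut
    return t_cut_start + delay - margin_s, t_cut_end + delay + margin_s
```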
The threshing depth control unit 620 has a standard control mode and a weed entry control mode, and normally selects the standard control mode to execute the above-described threshing depth control during the working travel.
However, if the weed entry flag is supplied from the weed position calculating section 68, the standard control mode is switched to the weed entry control mode, and weed entry control is performed. In this embodiment, if weed entry control is performed, threshing depth control is interrupted and the vehicle speed is also reduced. Of course, in the weed entry control, only either the interruption of the threshing depth control or the reduction of the vehicle speed may be performed.
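A compact sketch of this mode switching is shown below; the class, method, and command names are illustrative, and the standard-mode branch reuses the same three-state switch logic described earlier.

```python
class ThreshingDepthController:
    """Minimal sketch of the mode switching described above; names are illustrative."""

    def __init__(self, travel_control):
        self.mode = "standard"
        self.travel_control = travel_control   # assumed to expose request_speed_reduction()

    def on_weed_entry_flag(self, flag_active: bool) -> None:
        # raised by the weed position calculating unit while weeds pass
        # through the threshing depth adjusting member
        self.mode = "weed_entry" if flag_active else "standard"

    def update(self, root_side_on: bool, ear_side_on: bool) -> str:
        if self.mode == "weed_entry":
            # weed entry control: interrupt threshing depth adjustment and
            # reduce the vehicle speed (either measure may also be used alone)
            self.travel_control.request_speed_reduction()
            return "hold"
        # standard control mode: same three-state logic as the earlier sketch
        if root_side_on and ear_side_on:
            return "move_shallow"
        if not root_side_on and not ear_side_on:
            return "move_deep"
        return "hold"
```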
The weed position information generated by the weed position information generating unit 51 can be mapped as shown in fig. 7 so that it can be displayed in an easily viewable form. Fig. 7 illustrates a weed map in which the weed position information is mapped. When the weed position information includes weed growth areas having different estimation probabilities, the weed growth areas can be represented, as shown in fig. 7, by patterns that distinguish predetermined ranges of the estimation probability values.
[ embodiment 2]
As shown in fig. 8, in the combine harvester, a cutting portion 2002 is connected to the front portion of a machine body 2001, which is provided with a pair of right and left crawler travel devices 2010, so as to be capable of being raised and lowered about a transverse axis X. The rear part of the machine body 2001 is provided with a threshing device 2011 and a grain tank 2012 for storing grains, arranged side by side in the transverse width direction of the machine body. A cab 2014 covering the driving section is provided in the front right portion of the machine body 2001, and a driving engine 2015 is provided below the cab 2014.
As shown in fig. 8, the threshing device 2011 receives the harvested grain stalks harvested and fed backward by the harvesting unit 2002 into the inside, and performs threshing in the threshing cylinder 2113 while pinching the roots of the grain stalks by the threshing feed chain 2111 and the pinching guide rails 2112. Then, grain sorting processing for the threshing processed product is performed by a sorting unit provided below the threshing cylinder 2113, and the grain thus sorted is transported to a grain tank 2012 and stored. Although not described in detail, the grain discharging device 2013 is provided to discharge the grains stored in the grain tank 2012 to the outside.
The harvesting section 2002 includes a plurality of raising devices 2021 for raising lodged standing grain stalks, a pusher-type cutting device 2022 for cutting the roots of the raised standing grain stalks, a grain stalk conveying device 2023, and the like. The grain stalk conveying device 2023 conveys the harvested grain stalks, whose roots have been cut, toward the starting end of the threshing feed chain 2111 of the threshing device 2011 located on the rear side of the machine body while gradually changing them from a vertical posture to a horizontal posture.
The straw conveying device 2023 includes a converging conveyor 2231 for collecting a plurality of harvested straws cut by the cutting device 2022 to the center in the harvesting width direction and conveying them, a root holding conveyor 2232 for holding the roots of the collected harvested straws and conveying them backward, a head engaging conveyor 2233 for engaging and conveying the head end side of the harvested straws, and a feeding conveyor 2234 for guiding the roots of the harvested straws from the end of the root holding conveyor 2232 to the threshing feed chain 2111.
An imaging unit 2070 provided with a color camera is provided at the front end of the top of the cockpit 2014.
In this embodiment, the imaging field of view of the imaging unit 2070 extends in the front-rear direction from the distal end region of the cutting unit 2002 to approximately the horizon. The width of the imaging field of view ranges from about 10 meters to several tens of meters. The captured images obtained by the imaging unit 2070 are converted into image data and transmitted to the control system of the combine harvester.
The imaging unit 2070 images the farm during harvesting work, but various objects are present in the farm as imaging targets. The control system of the combine harvester has a function of recognizing a specific object as an object to be recognized based on image data transmitted from the imaging unit 2070. Fig. 8 schematically shows a normal standing grain straw group indicated by a symbol Z0, a weed group indicated by a symbol Z1 and extending higher than the standing grain straw, an lodging grain straw group indicated by a symbol Z2, and a person indicated by a symbol Z3 as the recognition target object.
A satellite positioning module 2080 is also disposed on the top of the cockpit 2014. The satellite positioning module 2080 includes a satellite antenna for receiving GNSS (global navigation satellite system) signals (including GPS signals). To supplement the satellite navigation performed by the satellite positioning module 2080, an inertial navigation unit incorporating a gyro acceleration sensor and a magnetic azimuth sensor is built into the satellite positioning module 2080. Of course, the inertial navigation unit can also be arranged elsewhere. In fig. 8, for convenience of drawing, the satellite positioning module 2080 is disposed at the rear of the top of the cockpit 2014, but it is preferably disposed at a position closer to the center of the body than to the front end of the top so as to be as close as possible to a position directly above the right and left center portions of the cutting device 2022.
Fig. 9 shows a functional block diagram of a control system of the combine harvester. The control system according to this embodiment is configured by a large number of wiring networks such as an electronic control unit called an ECU, various operating devices, a sensor group, a switch group, and an in-vehicle LAN for data transmission therebetween. The notification device 2091 is a device for notifying a driver or the like of a work travel state or various warnings, and is a buzzer, a lamp, a speaker, a display, or the like. The communication unit 2092 is used in the control system of the combine harvester to exchange data with the cloud computer system 2100 or the portable communication terminal 2200 provided at a remote location. Here, the portable communication terminal 2200 is a tablet computer operated by a monitor (including a driver) at the work travel site.
The control unit 2006 is a core element of the control system, and is shown as an aggregate of a plurality of ECUs. The positioning data from the satellite positioning module 2080 and the image data from the imaging unit 2070 are input to the control unit 2006 via a wiring network.
The control unit 2006 includes an output processing unit 2006B and an input processing unit 2006A as input/output interfaces. The output processing unit 2006B is connected to the vehicle travel facility group 2007A and the work equipment facility group 2007B. The vehicle travel device group 2007A includes control devices related to vehicle travel, such as an engine control device, a shift control device, a brake control device, a steering control device, and the like. The working equipment group 2007B includes a power control device in the harvesting unit 2002, the threshing device 2011, the grain discharge device 2013, the grain straw conveying device 2023, and the like.
The travel system detection sensor group 2008A and the work system detection sensor group 2008B are connected to the input processing unit 2006A. The travel system detection sensor group 2008A includes sensors for detecting states of an engine revolution number adjusting tool, an accelerator pedal, a brake pedal, a shift operation tool, and the like. The work system detection sensor group 2008B includes sensors for detecting the device states of the harvesting portion 2002, the threshing device 2011, the grain discharge device 2013, the grain straw transport device 2023, and the state of the grain straw or grain.
The control unit 2006 includes a work travel control module 2060, an image recognition module 2005, a data processing module 2050, a body position calculation unit 2066, and a notification unit 2067.
The notification portion 2067 generates notification data based on the instruction or the like received from each functional portion of the control unit 2006, and supplies it to the notification device 2091. The body position calculation unit 2066 calculates the body position as the map coordinates (or farm coordinates) of the body 2001 based on the positioning data sequentially transmitted from the satellite positioning module 2080.
The combine harvester of the embodiment can travel by both automatic travel (automatic steering) and manual travel (manual steering). The work travel control module 2060 includes an automatic work travel command unit 2063 and a travel route setting unit 2064 in addition to the travel control unit 2061 and the work control unit 2062. A travel mode switch (not shown) for selecting either an automatic travel mode for traveling by automatic steering or a manual steering mode for traveling by manual steering is provided in the cabin 2014. By operating the running mode switch, it is possible to shift from the manual steering running to the automatic steering running, or from the automatic steering running to the manual steering running.
The travel control unit 2061 has an engine control function, a steering control function, a vehicle speed control function, and the like, and supplies a travel control signal to the vehicle travel equipment group 2007A. The work control unit 2062 supplies a work control signal to the work equipment group 2007B to control the operations of the harvesting unit 2002, the threshing device 2011, the grain discharge device 2013, the grain straw conveying device 2023, and the like.
When the manual steering mode is selected, the travel control unit 2061 generates a control signal based on the operation of the driver, and controls the vehicle travel equipment group 2007A. When the automatic steering mode is selected, the running control unit 2061 controls the steering-related vehicle running equipment group 2007A or the vehicle running equipment group 2007A related to the vehicle speed based on the automatic running command provided by the automatic operation running command unit 2063.
The travel route setting unit 2064 expands, into a memory, the travel route for automatic travel created in any of the control unit 2006, the portable communication terminal 2200, the cloud computer system 2100, and the like. The travel route expanded in the memory is used in turn as the target travel route for automatic travel. Even during manual travel, this travel route can be used to guide the combine harvester along it.
More specifically, the automatic work travel command unit 2063 generates an automatic steering command and a vehicle speed command, and supplies these commands to the travel control unit 2061. The automatic steering command is generated so as to eliminate the azimuth offset and the position offset between the travel route set by the travel route setting unit 2064 and the vehicle position calculated by the body position calculating unit 2066. The vehicle speed command is generated based on a preset vehicle speed value. Further, the automatic work travel command unit 2063 supplies a work device operation command to the work control unit 2062 in accordance with the vehicle position or the traveling state of the vehicle.
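Conceptually, the automatic steering command is a feedback term that drives the position offset and the azimuth offset toward zero. The fragment below (Python; the gains and the linear control law are illustrative assumptions, not the control law of the patent) sketches the idea.

```python
def automatic_steering_command(position_offset_m: float,
                               azimuth_offset_rad: float,
                               k_position: float = 0.4,
                               k_azimuth: float = 1.2) -> float:
    """Return a target steering angle (rad) that reduces both offsets.

    position_offset_m: signed lateral distance from the target travel route.
    azimuth_offset_rad: signed heading error relative to the route direction.
    """
    return -(k_position * position_offset_m + k_azimuth * azimuth_offset_rad)
```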
The image data of the captured image sequentially acquired by the image capturing unit 2070 is input to the image recognition module 2005. The image recognition module 2005 estimates a presence region where the recognition target object exists in the captured image, and outputs recognition output data including the presence region and an estimation probability at the time of the presence region being estimated, as a recognition result. The image recognition module 2005 is constructed using neural network technology employing deep learning.
Fig. 10 and 11 show a flow of generation of recognition output data by the image recognition module 2005. The pixel values of the RGB image data are input to the image recognition module 2005 as input values. In the illustration of fig. 10, the estimated recognition objects are weeds, lodging straws, and humans. Therefore, the recognition output data as the recognition result contains the existence region of weeds (hereinafter referred to as weed region) and its estimated probability, the existence region of fallen straws (hereinafter referred to as fallen straw region) and its estimated probability, and the existence region of people (hereinafter referred to as people region) and its estimated probability.
In fig. 10, the estimation result is modeled, and the weed area is represented by a rectangular frame given a symbol F1, the lodging stalk area is represented by a rectangular frame given a symbol F2, and the human area is represented by a rectangular frame given a symbol F3. The respective regions are linked to their estimated probabilities. Although the weed area, the lodging straw area, and the human area are defined by four corner points, respectively, the coordinate positions of the four corner points of each rectangle on the captured image are also included in the estimation result. Of course, if the recognition target object is not estimated, the existence region of the recognition target object is not output, and the estimation probability thereof is zero.
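One plausible shape for this recognition output data (Python dataclasses; the field names are assumptions made for illustration) is shown below: a rectangular presence region defined by its four corner points, the class label, and the estimated probability.

```python
from dataclasses import dataclass
from typing import List, Tuple

Corner = Tuple[float, float]  # pixel coordinates in the captured image

@dataclass
class PresenceRegion:
    label: str             # "weed", "lodged_stalk" or "person"
    corners: List[Corner]  # the four corner points of the rectangular frame
    probability: float     # estimated probability; 0.0 when nothing was estimated

@dataclass
class RecognitionOutputData:
    capture_time_s: float
    regions: List[PresenceRegion]
```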
In this embodiment, the image recognition module 2005 sets its internal parameters such that the estimated probability of a recognition target decreases as the recognition target is located farther from the imaging unit 2070 in the captured image. As a result, recognition targets located in imaging regions whose resolution is reduced by the distance from the imaging unit 2070 are judged more strictly, which reduces erroneous recognition.
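This distance-dependent strictness can be pictured as a penalty applied to the raw network score before it is reported as the estimated probability. The sketch below (Python; the linear falloff and its coefficient are assumptions — the patent only states that the probability is made to decrease with distance) illustrates the idea.

```python
def distance_adjusted_probability(raw_score: float,
                                  distance_from_camera_m: float,
                                  falloff_per_m: float = 0.01) -> float:
    """Reduce the estimated probability for recognition targets far from the imaging unit,
    so that low-resolution, distant detections are accepted less readily."""
    return max(0.0, raw_score * (1.0 - falloff_per_m * distance_from_camera_m))
```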
The data processing module 2050 processes the recognition output data output from the image recognition module 2005. As shown in fig. 9 and 11, the data processing module 2050 of this embodiment includes a recognition target position information generating unit 2051, a statistical processing unit 2052, a data storage unit 2053, a data determining unit 2054, and a data correcting unit 2055.
The recognition target position information generating unit 2051 generates recognition target position information indicating the position of the recognition target on the map, based on the position of the body at the time when the captured image is acquired and the recognition output data. The position of the recognition target (weed, lodging straw, person) included in the recognition output data on the map where the recognition target exists is obtained by converting the coordinate positions (camera coordinate positions) on the captured image of the four corner points of the rectangle representing the existence region (weed region, lodging straw region, person region) of the recognition target into coordinates on the map.
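Once the camera calibration has turned a corner point's pixel coordinates into a ground offset relative to the machine body (that projection step is assumed here and not shown), converting it to map coordinates is a plain rotation and translation by the machine body pose, as in the following sketch (Python).

```python
import math
from typing import Tuple

def body_relative_to_map(offset_forward_left_m: Tuple[float, float],
                         body_position_m: Tuple[float, float],
                         body_heading_rad: float) -> Tuple[float, float]:
    """Convert a (forward, left) ground offset of one rectangle corner into map coordinates."""
    forward, left = offset_forward_left_m
    x, y = body_position_m
    map_x = x + forward * math.cos(body_heading_rad) - left * math.sin(body_heading_rad)
    map_y = y + forward * math.sin(body_heading_rad) + left * math.cos(body_heading_rad)
    return map_x, map_y
```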
The imaging unit 2070 acquires captured images at predetermined time intervals, for example every 0.5 second, and inputs the image data to the image recognition module 2005, so the image recognition module 2005 also outputs recognition output data at the same time intervals. Therefore, while a recognition target is within the imaging field of view of the imaging unit 2070, a presence region for that same recognition target is included in a plurality of pieces of recognition output data. As a result, a plurality of pieces of recognition target position information can be obtained for the same recognition target. In this case, the estimated probability included in each piece of underlying recognition output data, that is, the estimated probability of the presence region of the recognition target included in each piece of recognition target position information, often differs in value because the positional relationship between the imaging unit 2070 and the recognition target differs from image to image.
Therefore, in this embodiment, such a plurality of pieces of identification target position information are stored, and statistical calculation is performed on the estimated probability included in each of the plurality of pieces of stored identification target position information.
The statistical processing unit 2052 obtains a representative value of the estimated probability group by using statistical calculation of the estimated probabilities of the plurality of pieces of identification target position information. Using the representative value, it is possible to correct the position information of the plurality of recognition objects to one optimum recognition object position information (recognition object corrected position information). An example of such correction is to obtain an arithmetic average value, a weighted average value, or an intermediate value of each estimated probability as a reference value (representative value), obtain a logical sum of presence regions having estimated probabilities equal to or higher than the reference value, and generate corrected recognition object position information in which the presence regions are set as optimal presence regions. Of course, it is also possible to generate one piece of highly reliable identification object position information using other statistical calculations. That is, the plurality of pieces of recognition target position information are corrected based on the result of statistical calculation of the estimated probability included in the recognition output data corresponding to the pieces of recognition target position information.
By using the recognition target position information (weed position information, lodged grain stalk position information, person position information) obtained in this way, which indicates the position on the map of the presence region (weed region, lodged grain stalk region, person region) of each recognition target, travel work control and warning notification set in advance for each region are performed when weeds, lodged grain stalks, or persons are recognized.
As described above, in the present invention, the estimated probability output from the image recognition module 2005 is important for generating the final recognition target position information. Therefore, in this embodiment, the data processing module 2050 includes a recognition output data evaluation function for examining reliability of the estimation probability of the recognition output data output from the image recognition module 2005 and determining recognition output data having an estimation probability that cannot be relied upon as inappropriate recognition output data.
As shown in fig. 9 and 11, the recognition output data evaluation function is realized by the data storage unit 2053, the data determination unit 2054, and the data correction unit 2055. The data storage unit 2053 temporarily stores the recognition output data sequentially output from the image recognition module 2005 as a recognition output data string over time. The data determination unit 2054 compares the estimated probability of the recognition output data to be evaluated with the estimated probabilities of temporally neighboring recognition output data in the recognition output data string, and determines recognition output data whose estimated probability is lower by a predetermined level or more as inappropriate recognition output data.
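A simple way to express this determination (Python; the 50% drop threshold is borrowed from the worked example in embodiment 3 and is otherwise an assumption) is to scan the stored recognition output data string and flag entries whose estimated probability falls far below that of their temporal neighbor.

```python
from typing import List

def flag_inappropriate(probabilities: List[float], allowed_drop_ratio: float = 0.5) -> List[bool]:
    """Mark recognition output data whose estimated probability drops, relative to the
    preceding entry in the data string, by the allowed ratio or more."""
    flags = [False] * len(probabilities)
    for i in range(1, len(probabilities)):
        prev_p, cur_p = probabilities[i - 1], probabilities[i]
        if prev_p > 0.0 and (prev_p - cur_p) / prev_p >= allowed_drop_ratio:
            flags[i] = True
    return flags
```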
Since the captured images used to generate the recognition output data are taken during the working travel of the combine harvester, a captured image may be unclear because of a sudden movement of the machine or the like. Furthermore, since the combine harvester changes direction near the ridges, the imaging direction may change abruptly; consequently, the same recognition target may be captured both in front lighting and in backlighting. Recognition output data having an unexpectedly low estimated probability is generated by such changes in the imaging conditions, and such data, that is, inappropriate recognition output data, can be extracted by the recognition output data evaluation function.
When the number of pieces of recognition output data is small, the estimated probability of recognition output data determined to be inappropriate may be corrected by interpolation using the estimated probabilities of the preceding and following recognition output data. In this way, the inappropriate recognition output data is corrected so that it becomes usable, which has the advantage of securing the amount of data. The data correction unit 2055 performs this interpolation correction. Of course, when recognition output data determined to be inappropriate is simply discarded, interpolation correction is unnecessary and the data correction unit 2055 can be omitted.
The recognition target position information generated by the recognition target position information generating unit 2051 can be mapped as shown in fig. 12 so that it can be displayed in a visually intuitive form. Fig. 12 illustrates a weed map in which the weed position information is mapped. When the weed position information includes weed presence regions having different estimated probabilities, the weed presence regions can be drawn with patterns that distinguish predetermined ranges of the estimated probability values, as shown in fig. 12.
[ embodiment 3]
As shown in fig. 13, in the combine harvester according to embodiment 3, a cutting section 3002 is connected to the front portion of a machine body 3001, which includes a pair of right and left crawler travel devices 3010, so as to be capable of being raised and lowered about a transverse axis X. The rear part of the machine body 3001 includes a threshing device 3011 and a grain tank 3012 for storing grains, arranged side by side in the transverse width direction of the machine body. A cab 3014 covering the driving section is provided in the front right portion of the machine body 3001, and a driving engine 3015 is provided below the cab 3014.
As shown in fig. 13, the threshing device 3011 receives the harvested grain stalks harvested and transported rearward by the harvesting unit 3002 into the interior, and while transporting the roots of the grain stalks between the threshing feed chain 3111 and the holding guide rail 3112, the threshing device performs threshing on the front ear side in the threshing cylinder 3113. Then, a grain sorting process for the threshing processed matter is performed by a sorting unit provided below the threshing cylinder 3113, and the grains thus sorted are transported to a grain tank 3012 and stored. Although not described in detail, the grain tank 3012 is provided with a grain discharge device 3013 for discharging grains stored in the grain tank to the outside.
The harvesting unit 3002 includes a plurality of raising devices 3021 for raising lodged standing grain stalks, a pusher-type cutting device 3022 for cutting the roots of the raised standing grain stalks, a grain stalk conveying device 3023, and the like. The grain stalk conveying device 3023 conveys the harvested grain stalks, whose roots have been cut, toward the starting end of the threshing feed chain 3111 of the threshing device 3011 located on the rear side of the machine body while gradually changing them from a vertical posture to a horizontal posture.
The straw conveyor 3023 includes a converging conveyor 3231 for collecting a plurality of harvested straws cut by the cutting device 3022 at the center in the cutting width direction and conveying the harvested straws, a root holding conveyor 3232 for holding roots of the collected harvested straws and conveying the harvested straws backward, a head end locking conveyor 3233 for locking and conveying the head end side of the harvested straws, and a supply conveyor 3234 for guiding the roots of the harvested straws from the end of the root holding conveyor 3232 to the threshing feed chain 3111.
An imaging unit 3070 including a color camera is provided at the front end of the top of the cockpit 3014.
In this embodiment, the imaging field of view of the imaging section 3070 extends in the front-rear direction from the distal end region of the cutting section 3002 to approximately the horizon. The width of the imaging field of view ranges from about 10 meters to several tens of meters. The captured images sequentially acquired by the imaging section 3070 are converted into image data and transmitted to the control system of the combine harvester.
The imaging unit 3070 images the farm during the harvesting operation, but various objects exist as imaging targets in the farm. The control system of the combine harvester has a function of recognizing a specific object as a recognition target object based on the image data transmitted from the imaging unit 3070. Fig. 13 schematically shows a normal standing grain straw group indicated by a symbol Z0, a weed group indicated by a symbol Z1 and extending higher than the standing grain straw, an lodging grain straw group indicated by a symbol Z2, and a person indicated by a symbol Z3 as the recognition target object.
A satellite positioning module 3080 is also provided on top of the cockpit 3014. The satellite positioning module 3080 includes an antenna for a satellite for receiving a GNSS (global navigation satellite system) signal (including a GPS signal). In order to complement the satellite navigation performed by the satellite positioning module 3080, an inertial navigation unit in which a gyro acceleration sensor or a magnetic azimuth sensor is embedded is mounted in the satellite positioning module 3080. Of course, the inertial navigation unit can be arranged elsewhere. In fig. 13, for convenience of drawing, the satellite positioning module 3080 is disposed at the rear of the ceiling of the cockpit 3014, but it is preferably disposed at a position closer to the center of the body than to the front end of the ceiling so as to be as close as possible to a position directly above the right and left center portions of the cutting device 3022.
Fig. 14 shows a functional block diagram of a control system of the combine harvester. The control system according to this embodiment is configured by a large number of wiring networks such as an electronic control unit called an ECU, various operating devices, a sensor group, a switch group, and an in-vehicle LAN for data transmission therebetween. The notification device 3091 is a device for notifying a driver or the like of a work travel state or various warnings, and is a buzzer, a lamp, a speaker, a display, or the like. The communication unit 3092 is used for the control system of the combine harvester to exchange data with the cloud computer system 3100 or the mobile communication terminal 3200 provided at a remote location. Here, the portable communication terminal 3200 is a tablet computer operated by a monitor (including a driver) at a work travel site.
The control unit 3006 is a core element of the control system, and is shown as an aggregate of a plurality of ECUs. The positioning data from the satellite positioning module 3080 and the image data from the photographing part 3070 are input to the control unit 3006 through a wiring network.
The control unit 3006 includes an output processing unit 3006B and an input processing unit 3006A as input/output interfaces. The output processing unit 3006B is connected to the vehicle travel device group 3007A and the work equipment device group 3007B. The vehicle travel device group 3007A includes control devices related to vehicle travel, such as an engine control device, a shift control device, a brake control device, a steering control device, and the like. The work equipment device group 3007B includes power control equipment in the harvesting unit 3002, the threshing device 3011, the grain discharge device 3013, the grain stalk conveying device 3023, and the like.
The input processing unit 3006A is connected to the traveling system detection sensor group 3008A and the work system detection sensor group 3008B. The traveling system detection sensor group 3008A includes sensors for detecting the states of an engine revolution number adjusting means, an accelerator pedal, a brake pedal, a shift operation means, and the like. The work system detection sensor group 3008B includes sensors for detecting the state of the devices in the harvesting section 3002, the threshing device 3011, the grain discharging device 3013, the grain straw conveying device 3023, and the state of the grain straw or grain.
The control unit 3006 includes a work travel control module 3060, an image recognition module 3005, a data processing module 3050, a body position calculation unit 3066, a notification unit 3067, and a travel track calculation unit 3068.
The notification unit 3067 generates notification data based on commands or the like received from the respective functional units of the control unit 3006, and supplies it to the notification device 3091. The body position calculation unit 3066 calculates the machine body position as the map coordinates (or farm coordinates) of the machine body 3001 based on the positioning data sequentially transmitted from the satellite positioning module 3080. The travel track calculation unit 3068 calculates the travel track of the machine body 3001 from the machine body positions successively calculated by the body position calculation unit 3066. Further, the travel track calculation unit 3068 can add, to the travel track, an expected travel track immediately ahead of the machine body 3001 based on the latest machine body position, the steering angle of the steering control device at that time, or the target travel route.
The combine harvester of the embodiment can travel by both automatic travel (automatic steering) and manual travel (manual steering). The work travel control module 3060 includes an automatic work travel command unit 3063 and a travel route setting unit 3064 in addition to the travel control unit 3061 and the work control unit 3062. A travel mode switch (not shown) for selecting either an automatic travel mode for traveling by automatic steering or a manual steering mode for traveling by manual steering is provided in the cockpit 3014. By operating the running mode switch, it is possible to shift from the manual steering running to the automatic steering running, or from the automatic steering running to the manual steering running.
The travel control unit 3061 has an engine control function, a steering control function, a vehicle speed control function, and the like, and supplies a travel control signal to the vehicle travel device group 3007A. The work control unit 3062 provides work control signals to the work equipment set 3007B to control the operation of the harvesting unit 3002, the threshing device 3011, the grain discharging device 3013, the grain straw conveying device 3023, and the like.
When the manual steering mode is selected, the travel control unit 3061 generates a control signal based on the operation of the driver, and controls the vehicle travel device group 3007A. When the automatic steering mode is selected, the travel control unit 3061 controls the steering-related vehicle travel device group 3007A or the vehicle travel device group 3007A related to the vehicle speed based on the automatic travel command provided by the automatic work travel command unit 3063.
The travel route setting unit 3064 expands, into a memory, the travel route for automatic travel created in any of the control unit 3006, the portable communication terminal 3200, the cloud computer system 3100, and the like. The travel route expanded in the memory is used in turn as the target travel route for automatic travel. Even during manual travel, this travel route can be used to guide the combine harvester along it.
More specifically, the automatic work travel command unit 3063 generates an automatic steering command and a vehicle speed command, and supplies them to the travel control unit 3061. The automatic steering command is generated so as to eliminate the azimuth offset and the position offset between the travel path set by the travel path setting unit 3064 and the vehicle position calculated by the body position calculation unit 3066. The vehicle speed command is generated based on a preset vehicle speed value. Further, the automatic work travel command unit 3063 provides a work apparatus operation command to the work control unit 3062 based on the vehicle position or the travel state of the vehicle.
The image data of the captured image sequentially acquired by the imaging unit 3070 is input to the image recognition module 3005. The image recognition module 3005 estimates a presence region where the recognition target object exists in the captured image, and outputs recognition output data including the presence region and an estimated probability at the time of the presence region being estimated, as a recognition result. The image recognition module 3005 is constructed using neural network technology employing deep learning.
Fig. 15 and 16 show a flow of generation of recognition output data by the image recognition module 3005. The pixel values of the RGB image data are input to the image recognition module 3005 as input values. In the illustration of fig. 15, the estimated recognition objects are weeds, lodging straws, and humans. Therefore, the recognition output data as the recognition result contains the existence region of weeds (hereinafter referred to as weed region) and its estimated probability, the existence region of fallen straws (hereinafter referred to as fallen straw region) and its estimated probability, and the existence region of people (hereinafter referred to as people region) and its estimated probability.
In fig. 15, the estimation result is modeled, and the weed area is represented by a rectangular frame given a symbol F1, the lodging straw area is represented by a rectangular frame given a symbol F2, and the human area is represented by a rectangular frame given a symbol F3. The respective regions are linked to their estimated probabilities. Although the weed area, the lodging straw area, and the human area are defined by four corner points, respectively, the coordinate positions of the four corner points of each rectangle on the captured image are also included in the estimation result. Of course, if the recognition target object is not estimated, the existence region of the recognition target object is not output and the estimation probability thereof is zero.
The presence region and estimated probability of the recognition target output from the image recognition module 3005 as recognition output data are important for generating the final recognition target position information. However, in imaging by a camera provided on a combine harvester traveling on a farm, the captured image may become inappropriate because of a sudden swing of the imaging unit 3070, momentary backlighting, dust crossing the imaging field of view of the imaging unit 3070, and the like. In such cases, recognition output data having an unexpectedly low estimated probability may be output, or recognition output data in which the recognition target cannot be estimated and the estimated probability is zero may be output. Therefore, the data processing module 3050 that processes the recognition output data output from the image recognition module 3005 has, as a preprocessing function, a recognition output data evaluation function that examines the reliability of the recognition output data output from the image recognition module 3005 and determines unreliable recognition output data as inappropriate recognition output data.
As shown in fig. 16, the recognition output data evaluation function is realized by the data storage unit 3053, the data determination unit 3054, and the data correction unit 3055 of the data processing module 3050. The data storage unit 3053 temporarily stores the recognition output data successively output from the image recognition module 3005 as a recognition output data string over time. The data determination unit 3054 predicts, based on the travel track calculated by the travel track calculation unit 3068, the range (hereinafter also referred to as the prediction range) in which the presence region estimated by the image recognition module 3005 in the last captured image should be located in the next captured image. The data determination unit 3054 then compares the estimated probability of the presence region overlapping the prediction range in the next captured image with the estimated probability of the presence region in the last captured image. When this comparison shows that the difference between the two estimated probabilities is equal to or greater than a predetermined allowable amount, the recognition output data based on the next captured image is determined as inappropriate recognition output data. Alternatively, recognition output data whose estimated probability is equal to or lower than a predetermined value (for example, 0.5 or less) may be determined as inappropriate recognition output data.
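The sketch below (Python; the box representation, the pixel shift derived from the travel track, and the 0.5 thresholds are illustrative assumptions) shows how the prediction range and the two determination criteria described above could be combined.

```python
from typing import Tuple

Box = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) in image pixels

def predict_range(previous_box: Box, forward_shift_px: float) -> Box:
    """Shift the previous presence region by the image-plane displacement that the
    travel track implies between two consecutive captured images."""
    x0, y0, x1, y1 = previous_box
    return (x0, y0 + forward_shift_px, x1, y1 + forward_shift_px)

def boxes_overlap(a: Box, b: Box) -> bool:
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def is_inappropriate(prev_probability: float, next_probability: float,
                     prev_box: Box, next_box: Box, forward_shift_px: float,
                     allowed_difference: float = 0.5,
                     minimum_probability: float = 0.5) -> bool:
    """Apply the two criteria: an absolute probability at or below the minimum, or a
    large probability drop for the region overlapping the prediction range."""
    if next_probability <= minimum_probability:
        return True
    prediction = predict_range(prev_box, forward_shift_px)
    if boxes_overlap(prediction, next_box):
        return (prev_probability - next_probability) >= allowed_difference
    return False
```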
Hereinafter, the data processing for determining inappropriate recognition output data will be described with reference to fig. 17. In this example, the estimated recognition target is a weed.
(Step 01) From the captured image acquired by the imaging unit 3070 (#01a), the weed region surrounded by the rectangular frame indicated by the symbol F1 is recognized by the image recognition module 3005, and its estimated probability is 0.75 (#01b). An estimated probability exceeding 0.7 is regarded as highly reliable. Next, based on the position of the rectangular frame in the captured image and the travel track calculated by the travel track calculation unit 3068, the expected range PA in which the weed region should be located in the next captured image is estimated (#01c).
Next, (Step 02) the image recognition module 3005 recognizes the weed region surrounded by the rectangular frame indicated by the symbol F1 from the captured image acquired by the imaging unit 3070 (#02a), with an estimated probability of 0.8 (#02b). The recognized weed region overlaps the expected range PA. Based on the position of this rectangular frame and the travel track at this time, the expected range PA in which the weed region should be located in the next captured image is estimated (#02c).
Further, (Step 03) based on the next captured image (#03a), the image recognition module 3005 estimates a region where weeds are present, but its estimated probability is only 0.2 (#03b). When the estimated probability is lower than 0.5 in this way, the reliability of the position of the rectangular frame representing the weed region is also low. Therefore, instead of using the position of the rectangular frame of this low-reliability recognition output data, it is preferable to use the position of the rectangular frame and the travel track from Step 02 to estimate the expected range PA at this point (#03c).
Further, (Step 04) based on the next captured image (#04a), the image recognition module 3005 estimates a region where weeds are present, and recognizes the weed region surrounded by the rectangular frame indicated by the symbol F1 with an estimated probability of 0.9 (#04b). The recognized weed region overlaps the expected range PA.
When the image recognition module 3005 outputs recognition output data as described above, the estimated probability in Step 03 drops from the preceding estimated probability of 0.8 to 0.2, and this drop is equal to or greater than the predetermined allowable amount (for example, a rate of change of 50% or more), so the data determination unit 3054 determines the recognition output data of Step 03 to be inappropriate recognition output data.
When the data determination unit 3054 determines that the recognition output data of Step 03 is inappropriate recognition output data, the data correction unit 3055 calculates an interpolated estimated probability for the inappropriate recognition output data using the estimated probabilities of the preceding and following appropriate recognition output data. Here, the arithmetic mean is used as the simplest interpolation operation, giving an interpolated estimated probability of 0.85. The estimated probability of the inappropriate recognition output data is replaced with this interpolated estimated probability, so that the data can be used as appropriate recognition output data in subsequent processing.
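The figures used in Steps 01 to 04 can be verified directly; the short check below (Python) reproduces the 50% rate-of-change criterion and the interpolated value of 0.85.

```python
probabilities = [0.75, 0.8, 0.2, 0.9]   # estimated probabilities of Steps 01-04

# Step 03 drops from 0.8 to 0.2: the rate of change (0.8 - 0.2) / 0.8 = 0.75 is 50% or more,
# so the recognition output data of Step 03 is treated as inappropriate.
assert (probabilities[1] - probabilities[2]) / probabilities[1] >= 0.5

# Interpolation correction with the preceding and following appropriate values (Steps 02 and 04):
interpolated = (probabilities[1] + probabilities[3]) / 2.0
print(round(interpolated, 2))   # 0.85, the value substituted for 0.2
```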
As described above, in this embodiment, when the number of pieces of recognition output data is small, the estimated probability of recognition output data determined to be inappropriate can be corrected by interpolation using the estimated probabilities of the preceding and following recognition output data. This makes the inappropriate recognition output data usable after appropriate correction, which has the advantage of securing the amount of data. Of course, when recognition output data determined to be inappropriate is simply discarded, interpolation correction is unnecessary and the data correction unit 3055 can be omitted.
The data processing module 3050 of this embodiment also has a function of generating recognition target position information from the recognition output data evaluated by the above-described recognition output data evaluation function.
This function is realized by the recognition object position information generation unit 3051 and the statistical processing unit 3052 of the data processing module 3050.
The identification object position information is information indicating the position of the identification object on the map. The recognition object position information generation unit 3051 generates recognition object position information from the recognition output data and the body position at the time when the captured image is acquired. The position of the recognition target object (weed, lodging straw, person) included in the recognition output data on the map where the recognition target object exists is obtained by converting the coordinate position (camera coordinate position) on the captured image of the four corner points of the rectangle indicating the existence area (weed area, lodging straw area, person area) of the recognition target object into the coordinate on the map.
The imaging unit 3070 acquires captured images at predetermined time intervals, for example every 0.5 second, and inputs the image data to the image recognition module 3005, so the image recognition module 3005 also outputs recognition output data at the same time intervals. Therefore, while a recognition target is within the imaging field of view of the imaging unit 3070, a presence region for that same recognition target is included in a plurality of pieces of recognition output data. As a result, a plurality of pieces of recognition target position information can be obtained for the same recognition target. In this case, the estimated probability included in each piece of underlying recognition output data, that is, the estimated probability of the presence region of the recognition target included in each piece of recognition target position information, often differs in value because the positional relationship between the imaging unit 3070 and the recognition target differs from image to image.
Therefore, in this embodiment, as shown in fig. 16, such a plurality of pieces of identification target position information are stored, and statistical calculation is performed on the estimated probability included in each of the plurality of pieces of stored identification target position information. The statistical processing unit 3052 obtains a representative value of the estimated probability groups by using statistical calculation of estimated probabilities of the plurality of pieces of recognition target position information. The plurality of pieces of identification object position information can be corrected to one piece of optimum identification object position information (identification object corrected position information) using the representative value. An example of such correction is to obtain an arithmetic average value, a weighted average value, or an intermediate value of each estimated probability as a reference value (representative value), obtain a logical sum of existing regions having estimated probabilities equal to or higher than the reference value, and generate corrected recognition object position information in which the existing regions are set as optimal existing regions. Of course, it is also possible to generate one piece of identification target position information with high reliability using other statistical calculations. That is, the plurality of pieces of recognition target position information are corrected based on the result of statistical calculation of the estimated probability included in the recognition output data corresponding to the pieces of recognition target position information.
By using the recognition object position information (weed position information, lodging stalk position information, human position information) indicating the position on the map of the presence area (weed area, lodging stalk area, human area) of the recognition object thus obtained, the predetermined running work control and warning notification are performed at the time of recognition of weeds, lodging stalks, and humans, respectively.
Further, the recognition target position information generated by the recognition target position information generating unit 3051 can be mapped as shown in fig. 18 so that it can be displayed in a visually intuitive form. Fig. 18 illustrates a weed map in which the weed position information is mapped. When the weed position information includes weed presence regions having different estimated probabilities, the weed presence regions can be drawn with patterns that distinguish predetermined ranges of the estimated probability values, as shown in fig. 18.
Note that the configurations disclosed in the above-described embodiments (including the other embodiments; the same applies below) can be applied in combination with the configurations disclosed in other embodiments as long as no contradiction arises. The embodiments disclosed in this specification are exemplary; embodiments of the present invention are not limited to them and can be modified as appropriate within a range that does not depart from the object of the present invention.
[ other embodiments ]
(1) In the above-described embodiments, the weed group extending higher than the planted grain stalks is set as the recognition target recognized by the image recognition module (5, 2005, 3005), but other recognition targets such as a lodged grain stalk group or a person may also be set. In that case, the work travel control module (60, 2060, 3060) is configured to perform the necessary control in response to recognition of a lodged grain stalk group or a person.
(2) In the above-described embodiment, the image recognition module (5, 2005, 3005) is constructed using a deep learning type neural network technique. Alternatively, image recognition modules (5, 2005, 3005) constructed using other machine learning techniques may be employed.
(3) In the above-described embodiment, the image recognition module (5, 2005, 3005), the data processing module (50, 2050, 3050), and the weed position calculation unit 68 are embedded in the control unit (6, 2006, 3006) of the combine harvester, but a part or all of them can be built in a control unit independent of the combine harvester, for example, a portable communication terminal (200, 2200, 3200) or the like.
(4) The functional units shown in fig. 4, 9, 14, and 16 are mainly distinguished for the purpose of explanation. Actually, each functional unit may be combined with another functional unit, or may be further divided into a plurality of functional units.
Industrial applicability of the invention
The present invention can be applied not only to a combine harvester for harvesting rice, wheat, and the like, but also to a combine harvester for harvesting corn and other crops, or a harvester for harvesting carrots and the like.
Description of the reference symbols
3: threshing depth adjusting component
5: image recognition module
6: control unit
23: grain stalk conveying device
236: electric motor for adjusting (motor)
237: operating rod
30: long detection device of stalk
34: detection switch
35: detection switch
50: data processing module
51: weed position information generating unit
52: statistical processing unit
60: operation running control module (operation running control part)
61: running control unit
63: automatic work travel command unit
620: threshing depth control unit
66: body position calculating section
68: weed position calculating section
70: image pickup unit
80: satellite positioning module
91: notification device
92: communication unit
2053: data storage unit
2054: data determination unit
2055: data correction unit
2051: recognition object position information generating unit

Claims (4)

1. A harvester which harvests standing grain stalks while traveling on a farm, comprising:
a harvesting part for harvesting the standing grain stalks from the farm;
a grain stalk conveying device for conveying the cut grain stalks from the harvesting part to the threshing device;
a threshing depth adjusting member provided to the grain and straw conveying device;
a threshing depth control unit that performs threshing depth adjustment control based on the length of the harvested straw using the threshing depth adjustment member;
an organism position calculation unit that calculates an organism position as a map coordinate of the organism based on the positioning data from the satellite positioning module;
an imaging unit which is provided in the machine body and which images the farm during harvesting operation;
an image recognition module that receives image data of captured images successively and continuously acquired by the imaging unit, estimates a weed growth area in the captured images, and outputs recognition output data indicating the estimated weed growth area;
a weed position information generating unit that generates weed position information indicating a position of the weed growth area on a map, based on the body position at the time point when the captured image is acquired and the recognition output data;
a weed position calculating section that calculates a timing at which weeds harvested in the weed growth area pass through the threshing depth adjusting member, based on the weed position information, the machine body position, and the conveying speed of the grain stalk conveying device; and
a work travel control unit that performs weed entry control while the weeds pass through the threshing depth adjusting member, based on the timing calculated by the weed position calculating section.
2. A harvester which can harvest standing grain stalks while traveling on a farm, the harvester comprising:
a harvesting unit that harvests the standing grain stalks from the farm;
a grain stalk conveying device for conveying the cut grain stalks from the harvesting unit to the threshing device;
a threshing depth adjusting member provided to the grain and straw conveying device;
a threshing depth control unit that performs threshing depth adjustment control based on the length of the harvested straw using the threshing depth adjustment member;
a body position calculation unit that calculates a body position as a map coordinate of the body based on the positioning data from the satellite positioning module;
an imaging unit which is provided in the machine body and which images the farm during harvesting operation;
an image recognition module that receives image data of captured images successively and continuously acquired by the imaging unit, estimates a weed growth area in the captured images, and outputs recognition output data indicating the estimated weed growth area;
a weed position information generating unit that generates weed position information indicating a position of the weed growth area on a map, based on the body position at the time point when the captured image is acquired and the recognition output data; and
a work travel control unit that calculates a timing at which the harvesting unit passes through the weed growth area based on the weed position information and the machine body position calculated by the body position calculation unit, and performs weed entry control while the harvesting unit passes through the weed growth area based on the calculated timing.
3. A harvester according to claim 1 or 2,
wherein the threshing depth adjustment control is interrupted by performing the weed entry control.
4. A harvester according to claim 1 or 2,
wherein the vehicle speed is reduced by performing the weed entry control.
Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5832767A (en) * 1981-08-21 1983-02-25 新島 伸児 Lower leg mounting tool
JP2510564B2 (en) * 1987-04-03 1996-06-26 Hitachi, Ltd. Image feature detection method
JPH08172867A (en) 1994-12-26 1996-07-09 Kubota Corp Threshing depth control device of combine harvester
JP3812029B2 (en) * 1997-01-28 2006-08-23 井関農機株式会社 Depth control device for combine etc.
JPH11137062A (en) 1997-11-10 1999-05-25 Yanmar Agricult Equip Co Ltd Control device of conventional combine harvester
JPH11155340A (en) 1997-11-25 1999-06-15 Yanmar Agricult Equip Co Ltd General-purpose combine
JP4057196B2 (en) * 1999-06-16 2008-03-05 ヤンマー農機株式会社 Combine harvester mapping device
JP2004133498A (en) * 2002-10-08 2004-04-30 Sakae Shibusawa Precise agricultural method information management system
JP2006121952A (en) 2004-10-27 2006-05-18 Iseki & Co Ltd Combine harvester
JP2009284808A (en) * 2008-05-29 2009-12-10 Iseki & Co Ltd Grain culm feed control system for combine harvester
JP2010252722A (en) * 2009-04-27 2010-11-11 Kubota Corp Combine harvester
JP5626056B2 (en) * 2011-03-18 2014-11-19 富士通株式会社 Crop image processing program, crop image processing method, and crop image processing apparatus
US9013579B2 (en) * 2011-06-16 2015-04-21 Aisin Seiki Kabushiki Kaisha Vehicle surrounding-area monitoring apparatus
WO2013078328A2 (en) * 2011-11-22 2013-05-30 Precision Planting Llc Stalk sensor apparatus, systems, and methods
JP6116173B2 (en) 2012-09-26 2017-04-19 株式会社クボタ Farm management system
KR102113297B1 (en) * 2012-09-26 2020-05-21 가부시끼 가이샤 구보다 Ground work vehicle, ground work vehicle management system, and ground work information display method
US9074571B1 (en) * 2013-12-17 2015-07-07 Ford Global Technologies, Llc Vehicle and method of controlling an engine auto-stop and restart
KR102339667B1 (en) * 2014-03-26 2021-12-14 얀마 파워 테크놀로지 가부시키가이샤 Autonomous travel working vehicle
JP2016049102A (en) * 2014-08-29 2016-04-11 株式会社リコー Farm field management system, farm field management method, and program
JP2016086668A (en) * 2014-10-30 2016-05-23 井関農機株式会社 combine
JP6566833B2 (en) * 2015-10-20 2019-08-28 ヤンマー株式会社 Mapping system, mapping apparatus and computer program

Also Published As

Publication number Publication date
WO2018235486A1 (en) 2018-12-27
KR102589076B1 (en) 2023-10-16
KR20200014735A (en) 2020-02-11
CN110582198A (en) 2019-12-17

Similar Documents

Publication Publication Date Title
CN110582198B (en) Harvester
US11170547B2 (en) Combine, method of generating field farming map, program for generating the field farming map and storage medium recording the field farming map generating program
JP6854713B2 (en) combine
JP7381402B2 (en) automatic driving system
JP7068781B2 (en) Harvester
EP3939404A1 (en) System and method of assisted or automated grain unload synchronization
EP3939406A1 (en) System and method of assisted or automated grain unload synchronization
CN113766824B (en) Harvester, obstacle determination program, recording medium, and obstacle determination method
EP3939407A1 (en) System and method of assisted or automated grain unload synchronization
EP3939405A1 (en) System and method of assisted or automated grain unload synchronization
CN113766826A (en) Agricultural working machine, automatic travel system, program, recording medium having program recorded thereon, and method
US20220212602A1 (en) Harvester, System, Program, Recording Medium, and Method
JP2020178619A (en) Agricultural work machine
EP3939408A1 (en) System and method of assisted or automated grain unload synchronization
CN113923977B (en) Automatic travel system, agricultural machine, recording medium, and method
CN113727597B (en) Agricultural machinery such as harvester
WO2020262287A1 (en) Farm operation machine, autonomous travel system, program, recording medium in which program is recorded, and method
CN116437801A (en) Work vehicle, crop state detection system, crop state detection method, crop state detection program, and recording medium having recorded the crop state detection program
JP6765349B2 (en) Harvester
US20240065160A1 (en) System for determining a crop edge and self-propelled harvester
EP3939409A1 (en) System and method of assisted or automated grain unload synchronization
EP3939410A1 (en) System and method of assisted or automated grain unload synchronization
EP3939403A1 (en) System and method of assisted or automated unload synchronization
KR20230074717A (en) Harvester
CN116456820A (en) Agricultural machine, agricultural machine control program, recording medium on which agricultural machine control program is recorded, and agricultural machine control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant