WO2018211621A1 - Farm work apparatus, farm work management system, and program - Google Patents
Farm work apparatus, farm work management system, and program
- Publication number
- WO2018211621A1 (PCT/JP2017/018517)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- farm work
- information
- unit
- farm
- work
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D45/00—Harvesting of standing crops
- A01D45/007—Harvesting of standing crops of asparagus
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B69/00—Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
- A01B69/001—Steering by means of optical assistance, e.g. television cameras
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D46/00—Picking of fruits, vegetables, hops, or the like; Devices for shaking trees or shrubs
- A01D46/30—Robotic devices for individually picking crops
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M21/00—Apparatus for the destruction of unwanted vegetation, e.g. weeds
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M21/00—Apparatus for the destruction of unwanted vegetation, e.g. weeds
- A01M21/02—Apparatus for mechanical destruction
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M7/00—Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
- A01M7/0089—Regulating or controlling systems
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M99/00—Subject matter not provided for in other groups of this subclass
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/0278—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M21/00—Apparatus for the destruction of unwanted vegetation, e.g. weeds
- A01M21/04—Apparatus for destruction by steam, chemicals, burning, or electricity
- A01M21/043—Apparatus for destruction by steam, chemicals, burning, or electricity by chemicals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
- G06T2207/30188—Vegetation; Agriculture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C3/00—Registering or indicating the condition or the working of machines or other apparatus, other than vehicles
- G07C3/08—Registering or indicating the production of the machine either with or without registering working or idle time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
Definitions
- the present invention relates to a farm work apparatus, a farm work management system, and a program.
- Patent Document 1 discloses a data collection device for an agricultural machine that is attached to the machine and transmits machine-side data to a management terminal via a communication line. A data file is automatically created in the storage device of the management terminal, and the file name includes at least one of the work date, a machine identification code, an operator name, and a field number.
- The present invention has been made in view of the above circumstances, and one of its objects is to provide a farm work apparatus, a farm work management system, and a program that can appropriately manage farm work information and, by utilizing that information, enable smooth execution of future farm work and business.
- A farm work apparatus according to an embodiment of the present invention includes: a first imaging device; a farm work determination unit that determines, based on first image information obtained by the first imaging device capturing an image of a farm work target, whether to perform the farm work on the target; a farm work execution unit that executes the farm work based on the determination result of the farm work determination unit; a farm work information generation unit that generates farm work information including the result of the farm work; a measurement unit that measures the position of the target; and a farm work information management unit that manages the farm work information and position information indicating the measured position.
- The farm work apparatus may further include an abnormality determination unit that determines whether the target is abnormal based on the first image information and second image information serving as a reference image including the target.
- The measurement unit may measure the dimensions of the target based on the first image information, and the farm work determination unit may determine whether to perform the farm work on the target based on a comparison between the measured dimensions and a predetermined threshold and on the presence or absence of an abnormality.
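The determination described above can be sketched as follows. This is an illustrative sketch only: the 25 cm threshold, the function name, and its parameters are assumptions for illustration, not values taken from the specification.

```python
# Illustrative sketch of the farm work determination: harvest only when
# the measured dimension exceeds a threshold and no abnormality
# (nonstandard product, disease) was detected. The 25 cm threshold is
# an assumption, not a value from the specification.

HARVEST_LENGTH_CM = 25.0  # assumed harvest threshold

def decide_farm_work(measured_length_cm: float, is_abnormal: bool) -> bool:
    """Return True when the farm work (e.g. harvesting) should be performed."""
    return measured_length_cm >= HARVEST_LENGTH_CM and not is_abnormal

# Example: a 30 cm healthy spear is harvested; diseased or short ones are not.
print(decide_farm_work(30.0, False))  # True
print(decide_farm_work(30.0, True))   # False
print(decide_farm_work(10.0, False))  # False
```

With this shape, the farm work execution unit would act only when the function returns True.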
- the farm work information management unit may record time information indicating a time when the target position is measured in association with the farm work information and the position information.
- The farm work apparatus may further include an arm device for performing the farm work, and the farm work execution unit may control the operation of the arm device based on the position of the target measured by the measurement unit.
- The arm device may include a second imaging device, and the farm work execution unit may control the operation of the arm device based on third image information obtained by the second imaging device imaging the target.
- The farm work apparatus may further include a drive device that moves the farm work apparatus along a predetermined route, and the drive device may control the movement of the farm work apparatus based on whether the farm work determination unit has finished determining, for every target included in the first image information, whether to perform the farm work on that target.
- the first imaging device may detect a marker disposed on the path, and the drive device may control movement of the farm work apparatus based on the detected marker.
- the position of the target measured by the measurement unit may include a position where the first imaging device images the target.
- A farm work management system according to an embodiment of the present invention includes the farm work apparatus according to any one of claims 1 to 9 and a farm work management server. The farm work apparatus transmits the farm work information indicating the result of the farm work and the position information to the farm work management server, and the farm work management server includes a farm work prediction unit that predicts the contents of the farm work to be executed on the target based on the farm work information and the position information.
- The farm work management system may further include a portable terminal including an output unit, and the farm work management server may further include an output information generation unit that generates output information for outputting the farm work information and the position information in association with each other in a predetermined output mode. The output unit outputs the farm work information and the position information in the predetermined output mode based on the output information.
- A program according to an embodiment of the present invention causes a farm work apparatus including an imaging device to function as: a farm work determination unit that determines, based on image information obtained by the imaging device imaging a farm work target, whether to perform the farm work on the target; a farm work execution unit that executes the farm work based on the determination result of the farm work determination unit; a farm work information generation unit that generates farm work information including the result of the farm work; a measurement unit that measures the position of the target; and a farm work information management unit that manages the farm work information and position information indicating the measured position.
- In this specification, “unit”, “apparatus”, and “system” do not simply mean physical means; the functions of a “unit”, “apparatus”, or “system” may also be realized by software. Furthermore, the functions of one “unit”, “apparatus”, or “system” may be realized by two or more physical means or apparatuses, and the functions of two or more “units”, “apparatuses”, or “systems” may be realized by one physical means or apparatus.
- the farm work information can be appropriately managed, and future farm work and business can be smoothly executed by utilizing the farm work information.
- An example of the farm work information table recorded in the recording device of the farm work apparatus according to one embodiment of the present invention.
- A block diagram showing an example of the schematic configuration of the farm work management server according to one embodiment of the present invention.
- A flowchart showing an example of the farm work processing flow according to one embodiment of the present invention.
- A flowchart showing an example of the farm work determination processing flow according to one embodiment of the present invention.
- A flowchart showing an example of the farm work execution processing flow according to one embodiment of the present invention.
- FIG. 1 is a block diagram showing an example of a schematic configuration of a farm work management system according to an embodiment of the present invention.
- The farm work management system 100 exemplarily includes: the farm work apparatus 1, which performs farm work in the field FF and manages the results of the farm work; the farm work management server 5, which predicts the contents of farm work to be executed in the future based on the farm work results managed by the farm work apparatus 1; and the portable terminal 3 held by the user U engaged in the farm work.
- The “farm work” includes, for example, harvesting asparagus A (the target) arranged on the ridge R, spraying a chemical on the field FF, and weeding the field FF.
- The “target” of the farm work is not limited to a crop such as asparagus A, and may include a ground portion such as the ridge R formed in the field FF or a predetermined space in the field FF.
- the farm work apparatus 1, the portable terminal 3, and the farm work management server 5 are connected to each other via a predetermined communication network N so as to be able to communicate with each other.
- the farm work apparatus 1 receives an instruction for performing farm work in the farm work apparatus 1 from the portable terminal 3 via the communication network N.
- The farm work apparatus 1 harvests, for example, asparagus A cultivated on the ridge R using the arm device 16.
- the farm work device 1 generates farm work information including the result of farm work, and transmits at least the farm work information and position information to the farm work management server 5 via the communication network N.
- the farm work management server 5 transmits at least the farm work information and the position information to the mobile terminal 3 via the communication network N.
- the portable terminal 3 receives at least the farm work information and the position information from the farm work management server 5 via the communication network N, and outputs them at the output unit 7.
- the farm work information can be properly managed, and future farm work and business can be smoothly executed by utilizing the farm work information.
- The user U can confirm the result of the farm work performed by the farm work apparatus 1 on the output unit 7 of the portable terminal 3.
- The portable terminal 3 may be a transmissive glasses-type terminal employing AR (Augmented Reality) technology. In this case, when the user U wears the glasses-type terminal and looks at the ridge R, highlighting such as displaying an area of the ridge R that includes an unharvested portion is executed, so that the user U can easily identify unharvested locations.
- the communication network N may be a combination of the Internet, a packet communication network, and a line communication network.
- The communication among the farm work apparatus 1, the portable terminal 3, and the farm work management server 5 may use a wireless network and may also include a wired network.
- The number of farm work apparatuses 1, portable terminals 3, and farm work management servers 5 is arbitrary, and there may be two or more of each.
- FIG. 2 is a block diagram illustrating an example of a schematic configuration of a farm work apparatus according to an embodiment of the present invention.
- The farm work apparatus 1 exemplarily includes: a GPS sensor 12 that detects the position of the farm work apparatus 1 based on GPS signals transmitted from GPS satellites (not shown); a camera 14 (first imaging device) that captures an image of the farm work target; an arm device 16 for performing the farm work; a recording device 18 that records information necessary for each process executed by the farm work apparatus 1; a communication device 20 that transmits and receives various information to and from the portable terminal 3 and the farm work management server 5 shown in FIG. 1; a drive device 22 that moves the farm work apparatus 1 along a predetermined route; and a central processing unit 10 connected to each of the above components.
- Each of the above-described constituent elements may be a single element or may include a plurality of elements.
- the central processing unit 10 controls each of the above components in order to execute processing for managing farm work information.
- the central processing unit 10 is, for example, a CPU, MPU, or the like, and operates according to a program stored in the recording device 18.
- The central processing unit 10 functionally includes: a crop measurement unit 30 (measurement unit) that measures the position, height, depth, and the like of the target; an abnormality determination unit 32 that determines abnormalities of the target, including whether the target is a nonstandard product and whether the target has a disease; a farm work determination unit 34 that determines whether to perform the farm work on the target based on image information (first image information) obtained by the camera 14 imaging the farm work target; a farm work execution unit 36 that executes the farm work based on the determination result of the farm work determination unit 34; a farm work information generation unit 38 that generates farm work information including the result of the farm work; and a farm work information management unit 40 that manages the farm work information generated by the farm work information generation unit 38.
- Nonstandard products are also referred to as off-grade products.
- the recording device 18 records the farm work information generated by the farm work information generation unit 38.
- The recording device 18 records the farm work information as, for example, the farm work information tables shown in FIGS. 3 to 6.
- The recording device 18 records farm work information, including whether the target was harvested and the harvest amount, in association with the farm work position (position information), including the position where the target is cultivated, and the date and time when the farm work position was measured.
- The recording device 18 also records farm work information including the direction in which the farm work apparatus 1 shown in FIG. 1 harvested the target and image information including the harvested target, in association with the farm work position and the date and time when the farm work position was measured.
- The recording device 18 also records farm work information including whether the target was harvested and the amount of nonstandard or diseased targets, in association with the farm work position and the date and time when the farm work position was measured.
- The recording device 18 also records farm work information including the amount of chemical applied, in association with the position where the chemical was applied as the farm work position and the date and time when that position was measured.
- The recording device 18 may also record, as farm work information, the weight of removed weeds in association with the position of the removed weeds (or the position of the farm work apparatus 1 that executed the removal) and the date and time when the weeds were removed.
- The farm work information tables shown in FIGS. 3 to 6 may be recorded in the recording device 18 in association with each other. A farm work information table may also be configured by combining at least two of the farm work information tables shown in FIGS. 3 to 6.
- the farm work position may be the position of the farm work apparatus 1 detected by the GPS sensor 12 shown in FIG.
- the farm work position may be the position of each of a plurality of objects.
- Each position of the plurality of targets may be, for example, information indicating whether the crop is disposed nearer to (in front of) or farther from (behind) the farm work apparatus 1 on the ridge.
- The direction information in the farm work information table shown in FIG. 4 is expressed as an angle relative to a predetermined reference direction. When harvesting each of a plurality of targets, a direction may be associated with each target; when harvesting a target group including a plurality of targets, a direction may be associated with each group.
- The image information in the farm work information table shown in FIG. 4 may be image information including a single target or image information including a plurality of targets.
- The image information may also be image information showing the farm work apparatus 1 spraying the chemical, or image information showing the situation after the farm work apparatus 1 has sprayed the chemical.
- The recording device 18 is configured by an information recording medium such as a ROM, a RAM, or a hard disk, and holds programs executed by the central processing unit 10.
- the recording device 18 also operates as a work memory for the central processing unit 10.
- The program stored in the recording device 18 may be provided, for example, by being downloaded from outside the farm work apparatus 1 via a network, or may be provided via a computer-readable information recording medium such as a CD-ROM or DVD-ROM.
- the communication device 20 receives, for example, an instruction for performing farm work from the mobile terminal 3 via the communication network N as shown in FIG. On the other hand, the communication device 20 transmits, for example, the farm work information tables shown in FIGS. 3 to 6 to the farm work management server 5 via the communication network N.
- The drive device 22 includes, for example, moving wheels supported by left and right axles, and can move the apparatus forward and backward and turn it by rotating the moving wheels.
- FIG. 7 is a block diagram illustrating an example of a schematic configuration of a farm work management server according to an embodiment of the present invention.
- The farm work management server 5 exemplarily includes: a communication unit 50 that transmits and receives various information to and from the farm work apparatus 1 and the portable terminal 3 illustrated in FIG. 1; an information processing unit 52 that executes information processing based on the farm work information; and a recording unit 54 that stores the farm work information transmitted from the farm work apparatus 1 shown in FIG. 1, the farm work information table AT (see FIGS. 3 to 6), and the field map M showing the field FF.
- The field map M is used to set the farm work execution range on the portable terminal 3 shown in FIG. 1 and to control the movement range when the farm work apparatus 1 actually executes the farm work.
- the recording unit 54 may record various programs related to various information processing executed by the information processing unit 52.
- The information processing unit 52 functionally includes: a farm work prediction unit 60 that predicts the contents of future farm work based on the farm work information and the position information indicating the position of the target, which are transmitted from the farm work apparatus 1 shown in FIG. 1 and received by the communication unit 50; and an output information generation unit 62 that generates output information for outputting the farm work information and the position information indicating the target position in association with each other in a predetermined output mode.
- The farm work prediction unit 60 predicts the contents of future farm work in view of the state of the crop by referring to the image information of the crop (target) included in the farm work information. For example, the growth status of each asparagus is grasped from an image of the asparagus group, and the harvest time, the expected future harvest amount, and the like are estimated as examples of the contents of future farm work.
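As a minimal sketch of the kind of estimate the farm work prediction unit 60 might produce, the following assumes roughly linear growth and a 25 cm harvest height; both the growth model and the threshold are illustrative assumptions, not values from the specification.

```python
# Illustrative sketch: estimate days until harvest from two successive
# height measurements, assuming linear growth. The 25 cm harvest height
# and the linear model are assumptions for illustration only.

def days_until_harvest(height_yesterday_cm: float,
                       height_today_cm: float,
                       harvest_height_cm: float = 25.0) -> float:
    growth_per_day = height_today_cm - height_yesterday_cm
    if growth_per_day <= 0:
        raise ValueError("no measurable growth")
    remaining = max(harvest_height_cm - height_today_cm, 0.0)
    return remaining / growth_per_day

# A spear that grew from 12 cm to 15 cm in a day (3 cm/day) needs
# 10 more cm, i.e. about 3.3 days.
print(round(days_until_harvest(12.0, 15.0), 2))  # 3.33
```

A real prediction unit would aggregate such per-spear estimates over the whole asparagus group to project the harvest amount.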
- the output information may include output information for outputting the prediction result of the farm work prediction unit 60 in a predetermined output mode.
- the output information is transmitted to the mobile terminal 3 via the communication unit 50 in response to a request from the mobile terminal 3 illustrated in FIG. 1 and is output from the output unit 7 of the mobile terminal 3.
- FIG. 8 is a flowchart showing an example of a farm work processing flow according to an embodiment of the present invention.
- FIG. 9 is a diagram illustrating an example of farm work performed by the farm work apparatus in the farm field according to the embodiment of the present invention.
- Step S1 The user U shown in FIG. 1 sets, in the portable terminal 3, the farm work range in which the farm work apparatus 1 performs farm work.
- The portable terminal 3 acquires the field map M recorded in the recording unit 54 of the farm work management server 5 shown in FIG. 7.
- As shown in FIG. 9, the user U sets, in the portable terminal 3, the routes L1, L3, and L5 along which the farm work apparatus 1 moves in the field FF, the start point P1 at which the farm work apparatus 1 starts work, and the end point P7 at which it ends work.
- Step S3 The drive device 22 of the farm work apparatus 1 shown in FIG. 2 controls the farm work apparatus 1 so that it moves along the set route L1 in the field FF.
- For example, the farm work apparatus 1 moves about 10 cm, stops, and performs farm work as described later. When the farm work is completed, it moves about 10 cm again, and this process is repeated.
- The drive device 22 may be configured to control the movement of the farm work apparatus 1 based on whether the farm work determination unit 34 shown in FIG. 2 has finished determining, for all targets included in the image information (first image information) from the camera 14, whether to perform the farm work.
- The movement amount of the farm work apparatus 1 has been described above as about 10 cm, but is not limited thereto; the movement amount may be set as appropriate based on the size and arrangement position of the harvest target, the lens size of the camera 14, and the like.
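The stop-and-go movement of step S3 can be sketched as follows. The 10 cm step comes from the text above, while the route length and the work callback are illustrative assumptions.

```python
# Sketch of the stop-and-go movement in step S3: advance a fixed step,
# stop, perform farm work, and repeat until the end of the route.
# The 10 cm step is from the text; everything else is illustrative.

STEP_CM = 10.0

def traverse_route(route_length_cm: float, do_farm_work) -> int:
    """Move in STEP_CM increments, doing farm work at each stop.
    Returns the number of stops made."""
    travelled, stops = 0.0, 0
    while travelled < route_length_cm:
        travelled = min(travelled + STEP_CM, route_length_cm)
        do_farm_work()  # stop and perform farm work, then move again
        stops += 1
    return stops

print(traverse_route(100.0, lambda: None))  # 10 stops on a 1 m route
```

The step size could instead be derived from the target size and the camera field of view, as the passage above notes.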
- Step S5 The farm work apparatus 1 performs farm work.
- the farm work execution unit 36 illustrated in FIG. 2 performs the farm work based on the determination result of the farm work determination unit 34 that determines whether to perform the farm work on the target.
- The processing of the farm work determination unit 34 will be described in detail with reference to FIGS. 10 and 11.
- The processing of the farm work execution unit 36 will likewise be described in detail below.
- The farm work includes, for example, harvesting a target arranged on the ridge R1, spraying a chemical on the field FF, and weeding the field FF.
- the farm work information generation unit 38 of the farm work apparatus 1 shown in FIG. 2 generates farm work information (see FIGS. 3 to 6) including the results of the farm work.
- the crop measurement unit 30 measures the position where the farm work is performed.
- the position where the farm work is performed includes, for example, the position of the farm work apparatus 1 that performs the farm work or the position of the target arranged in the farm field FF.
- The crop measurement unit 30 measures the position of the farm work apparatus 1 performing the farm work based on the position information detected by the GPS sensor 12 shown in FIG. 2. The crop measurement unit 30 also obtains the position of a target arranged in the field FF from the position information of the farm work apparatus 1 detected by the GPS sensor 12 and image information obtained by the camera 14 imaging the target.
- For example, the crop measurement unit 30 estimates the approximate position of the target arranged in the field FF based on the position information of the farm work apparatus 1 detected by the GPS sensor 12 shown in FIG. 2, and may then specify the exact position of the target based on the image information obtained by imaging the target.
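The two-stage position estimate described above (a coarse GPS fix refined by an offset recovered from the camera image) can be sketched as follows. The flat-earth metres-to-degrees conversion and the coordinate values are illustrative simplifications, not part of the specification.

```python
# Sketch of the two-stage target position estimate: the GPS sensor
# gives the apparatus position, and a camera-derived local offset
# (in metres) refines it into the target position. The flat-earth
# conversion below is an illustrative simplification.
import math

METRES_PER_DEG_LAT = 111_320.0  # approximate, near the surface

def target_position(apparatus_lat: float, apparatus_lon: float,
                    offset_east_m: float, offset_north_m: float) -> tuple[float, float]:
    """Apply a camera-derived local offset (metres) to the GPS fix."""
    lat = apparatus_lat + offset_north_m / METRES_PER_DEG_LAT
    lon = apparatus_lon + offset_east_m / (
        METRES_PER_DEG_LAT * math.cos(math.radians(apparatus_lat)))
    return lat, lon

# A target 0.5 m east and 0.2 m north of the apparatus:
lat, lon = target_position(33.2635, 130.3009,
                           offset_east_m=0.5, offset_north_m=0.2)
```

Sub-metre offsets change the coordinates only in the sixth decimal place, which is why a GPS fix alone gives the approximate position and the image supplies the refinement.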
- The farm work information management unit 40 records the farm work information, the position information indicating the position of the target measured by the crop measurement unit 30, and the time information indicating the time when the position of the target was measured in association with each other, as in the farm work information tables shown in FIGS. 3 to 6.
- The communication device 20 shown in FIG. 2 transmits the farm work information table recorded in the recording device 18 to the farm work management server 5 shown in FIG. 1 at a predetermined timing.
- The communication device 20 may transmit the farm work information table periodically, may transmit it every time farm work is performed, or may transmit it when a series of farm work is completed.
- Step S7 The camera 14 (first imaging device) of the farm work apparatus 1 shown in FIG. 2 detects the marker M1 in the field FF.
- Step S9 The drive device 22 of the farm work apparatus 1 shown in FIG. 2 controls the movement of the farm work apparatus 1 based on the marker M1 detected by the camera 14. For example, by detecting the marker M1, the farm work apparatus 1 determines that it has arrived at the point P3 at the edge of the field FF. The farm work apparatus 1 turns, moves to the path L3, and resumes farm work on the ridge R3. Next, by detecting the marker M3, the farm work apparatus 1 determines that it has arrived at the point P5 at the edge of the field FF. The drive device 22 then turns the farm work apparatus 1, moves it to the path L5, and resumes the farm work on the ridge R5.
- Step S11 When the farm work apparatus 1 arrives at the end point P7 of the set route L5, the farm work apparatus 1 ends the farm work.
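The marker-driven traversal of steps S7 to S11 could be modelled roughly as follows, with `detect_marker` and `do_work` as stand-ins for the camera 14 and the work routine (the real apparatus turns physically; this sketch only records the turn).

```python
def run_route(paths, detect_marker, do_work):
    """Traverse the paths (e.g. L1, L3, L5) in order; a marker detected
    at the end of each path triggers a turn onto the next path, and the
    route ends after the last path, mirroring steps S7 to S11."""
    actions = []
    for i, path in enumerate(paths):
        do_work(path)                      # farm work along this path
        actions.append(("work", path))
        if i < len(paths) - 1 and detect_marker(path):
            actions.append(("turn", paths[i + 1]))  # edge marker found
    return actions
```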
- FIG. 10 is a flowchart illustrating an example of a farm work determination processing flow according to an embodiment of the present invention.
- FIG. 11 is a diagram illustrating an example of image information obtained by imaging by the camera of the farm work apparatus according to the embodiment of the present invention.
- Step S21 As illustrated in FIG. 11, the camera 14 of the farm work apparatus 1 illustrated in FIG. 2 acquires an image (first image information) by capturing an asparagus group that is a target of farm work.
- Step S23 The abnormality determination unit 32 of the farm work apparatus 1 illustrated in FIG. 2 compares the image (first image information) acquired in step S21 with a reference image (second image information) that includes asparagus, the target of the farm work, recorded for example in the recording device 18 of the farm work apparatus 1.
- more specifically, the abnormality determination unit 32 uses an artificial intelligence model to execute the abnormality determination.
- the artificial intelligence model is trained so that it can accurately determine abnormalities by repeatedly learning from the results of abnormality determinations made in the past.
- in the artificial intelligence model, a large number of images (reference images) used in past abnormality determinations are accumulated.
- by referring to this model for the image acquired in step S21, the abnormality determination unit 32 can determine the presence or absence of an asparagus abnormality, as described below.
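The patent leaves the learning method unspecified; as one hedged illustration of the idea of accumulating labelled past images and judging a new image against them, a toy nearest-reference model (operating on assumed precomputed feature vectors, not raw images) might look like this:

```python
class ReferenceModel:
    """Toy stand-in for the described model: past judgement images
    accumulate as labelled references, and a new image is judged by its
    nearest reference in an assumed feature space."""

    def __init__(self):
        self.references = []  # list of (feature_vector, label) pairs

    def learn(self, features, label):
        # Accumulate the outcome of a past abnormality determination.
        self.references.append((tuple(features), label))

    def judge(self, features):
        # Return the label of the closest stored reference.
        def sq_dist(ref):
            return sum((a - b) ** 2 for a, b in zip(ref[0], features))
        return min(self.references, key=sq_dist)[1]
```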
- Step S25 The abnormality determination unit 32 determines whether or not there is an asparagus abnormality based on the comparison result of step S23.
- for example, the reference images include good asparagus worth harvesting, and the abnormality determination unit 32 compares an image of the asparagus to be harvested captured by the camera 14 with a reference image containing good asparagus, to determine whether the asparagus actually scheduled for harvest is a good product.
- the reference images may also include non-standard asparagus not worth harvesting, and
- the abnormality determination unit 32 may be configured to compare an image of the asparagus to be harvested captured by the camera 14 with a reference image containing non-standard asparagus, to determine whether the asparagus actually scheduled for harvest is a non-standard product.
- for example, when the color of the asparagus to be harvested differs from the good-product color (for example, green) included in the reference image, the abnormality determination unit 32 determines that the asparagus has some disease. Likewise, when the shape of the asparagus to be harvested has a large curved portion, it differs from the good-product shape (for example, a straight shape) included in the reference image, so the abnormality determination unit 32 determines that it is abnormal.
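The color and shape checks above can be reduced to a small sketch; the color label, the curvature score, and the limit value are illustrative features standing in for the image comparison the patent describes.

```python
def judge_abnormality(color, max_curvature,
                      good_color="green", curvature_limit=0.2):
    """Return True if the stalk is judged abnormal. The real unit
    compares captured images with reference images; this sketch uses
    two assumed features (a color label and a curvature score)."""
    if color != good_color:
        return True   # color differs from good product -> possible disease
    if max_curvature > curvature_limit:
        return True   # shape differs from the straight good-product shape
    return False
```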
- Step S27 The crop measuring unit 30 of the farm work apparatus 1 shown in FIG. 2 measures the cultivation position, height, and depth of asparagus based on image information obtained by the camera 14 imaging asparagus.
- the crop measurement unit 30 may measure the asparagus cultivation position and the like based on the irradiation wave of a measurement laser (not shown) included in the farm work apparatus 1 and the reflected wave from the asparagus.
- the farm work apparatus 1 may be comprised so that the cultivation position, height, and depth of asparagus may be measured using both the camera 14 and the measurement laser.
- FIG. 12 is a diagram showing an example of image information in which the two asparagus stalks A1 and A2 included in the broken-line portion of FIG. 11 are displayed enlarged.
- the crop measurement unit 30 measures, for example, the XYZ coordinates (positions) of the tip and the end (root) of each of the asparagus stalks A1 and A2 based on the image information from the camera 14, and can measure the difference D between the positions of asparagus A1 and asparagus A2.
- in addition, the crop measurement unit 30 can measure the height H (dimension) of asparagus A1 and the width W (dimension) of asparagus A1.
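Given the tip and root XYZ coordinates described above, the derived quantities might be computed as follows; treating height as the vertical extent and D as a horizontal root-to-root distance is an assumption for illustration, since the patent does not define them formally.

```python
import math

def stalk_metrics(tip, root):
    """From XYZ coordinates of a stalk's tip and root, derive an
    illustrative height (vertical extent) and straight-line length."""
    height = abs(tip[2] - root[2])
    length = math.dist(tip, root)  # Euclidean distance (Python 3.8+)
    return height, length

def stalk_separation(root_a, root_b):
    """Horizontal distance D between the root positions of two stalks."""
    return math.hypot(root_a[0] - root_b[0], root_a[1] - root_b[1])
```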
- Step S29 The crop measurement unit 30 determines the state of the asparagus by comparing the measured dimensions with a set threshold value.
- for example, when the measured height of asparagus A1 is 30 cm, it exceeds the set threshold (26 cm), so the crop measurement unit 30 determines that the asparagus has grown enough to be worth harvesting.
- on the other hand, when the measured height of asparagus A1 is 15 cm, it is below the set threshold (26 cm), so the crop measurement unit 30 determines that the asparagus has not grown enough to be worth harvesting.
- the crop measurement unit 30 may also determine the state of the asparagus by comparing the measured width of the asparagus with a set threshold value.
- the "threshold value" is not limited to a threshold set by the user and may be a predetermined threshold value.
- Step S31 The farm work determination unit 34 of the farm work apparatus 1 illustrated in FIG. 2 determines whether or not to harvest the asparagus based on the determination result output by the abnormality determination unit 32 and the determination result output by the crop measurement unit 30. For example, when the determination result of the abnormality determination unit 32 is "no abnormality" and the determination result of the crop measurement unit 30 is "26 cm or more: sufficiently grown", it determines "harvest" in the sense that the asparagus is good enough for shipping. When the determination result of the abnormality determination unit 32 is "abnormality present" and the determination result of the crop measurement unit 30 is "26 cm or less: not sufficiently grown", it determines "harvest" in the sense that the asparagus will not be worth shipping even if it grows further; in this case, the asparagus is discarded after harvesting.
- similarly, when the determination result of the abnormality determination unit 32 is "abnormality present" and the determination result of the crop measurement unit 30 is "26 cm or more: sufficiently grown", it determines "harvest" in the sense that the asparagus is not worth shipping, and the asparagus is again discarded after harvesting. Finally, when the determination result of the abnormality determination unit 32 is "no abnormality" and the determination result of the crop measurement unit 30 is "26 cm or less: not sufficiently grown", the farm work determination unit 34 determines "do not harvest", in the sense of waiting until the asparagus scheduled for harvest grows enough to be worth harvesting, since it is neither a non-standard product nor diseased.
- in this way, the farm work determination unit 34 determines whether or not to harvest the target by comprehensively considering the determination result of the abnormality determination unit 32 and the determination result of the crop measurement unit 30. It can therefore make a more accurate determination than when whether or not to perform the farm work is decided from only one of the two determination results.
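The four-case decision of step S31 can be written directly as a small decision table; the 26 cm threshold follows the example in the text, while the return labels are illustrative.

```python
def decide_harvest(abnormal, height_cm, threshold_cm=26.0):
    """Combine the abnormality result and the growth measurement, as the
    farm work determination unit 34 is described to do in step S31."""
    grown = height_cm >= threshold_cm
    if not abnormal and grown:
        return "harvest"             # good product, ready for shipping
    if not abnormal and not grown:
        return "wait"                # healthy but still growing
    return "harvest-and-discard"     # abnormal: remove regardless of size
```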
- FIG. 13 is a flowchart illustrating an example of a farm work execution process flow according to an embodiment of the present invention.
- FIG. 14 is a diagram illustrating an example of the farm work performed by the farm work apparatus in the farm field according to the embodiment of the present invention.
- Step S41 As described in step S29 of FIG. 10, the crop measurement unit 30 illustrated in FIG. 2 measures the cultivation position, height, and depth of aspara A5.
- next, based on the measurement result of the crop measurement unit 30, the farm work execution unit 36 sets an approach route for the arm device 16 up to harvesting asparagus A5, for example a route from the current position of the arm device 16 to the position on the ridge R of the field FF shown in FIG. 14 where asparagus A5 is arranged.
- the approach route may include a route from the three-dimensional coordinate position of the arm device 16 to the three-dimensional coordinate position indicating the cutting point of asparagus A5 on the ridge R of the field FF shown in FIG. 14.
- the approach route may be generated and then set by the farm work execution unit 36, or the crop measurement unit 30 may generate the approach route based on the measurement result and the farm work execution unit 36 may set it.
- Step S43 Image information (third image information) is acquired by the camera 14 (second imaging device) mounted on the arm device 16, for example.
- because it is mounted on the arm device 16, the camera 14 (second imaging device) can measure the actual approach route of the arm device 16 more accurately than a camera mounted on the main body of the farm work apparatus 1.
- Step S45 Based on the acquired image information, the farm work execution unit 36 determines whether the approach route of the arm device 16 is different from the approach route set in step S41. When the approach route of the arm device 16 is not different from the approach route set in step S41 (in the case of No), the process proceeds to step S49. On the other hand, when the approach route of the arm device 16 is different from the approach route set in step S41 (in the case of Yes), the process proceeds to step S47.
- Step S47 The farm work execution unit 36 controls the approach route of the arm device 16 based on the image information from the camera 14. For example, if the approach route of the arm device 16 differs from the approach route set in step S41, asparagus A5 might not be harvested reliably, so the approach route of the arm device 16 is corrected to the proper route to ensure that asparagus A5 is harvested. For example, when the approach route includes a route from the three-dimensional coordinate position of the arm device 16 to the three-dimensional coordinate position indicating the cutting point of asparagus A5 on the ridge R of the field FF shown in FIG. 14, the farm work execution unit 36 may be configured to correct the deviation of the cutting point based on the image information from the camera 14.
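The deviation check and correction of steps S45 and S47 could be sketched as follows; the tolerance value and the re-aiming policy (simply adopting the observed cutting point) are assumptions for illustration.

```python
def correct_cut_point(planned_cut, observed_cut, tolerance=0.01):
    """If the cutting point seen by the arm-mounted camera deviates from
    the planned 3-D cutting point by more than a tolerance (metres,
    value chosen for the example), return the corrected target and a
    flag; otherwise keep the planned point, mirroring steps S45/S47."""
    deviation = max(abs(p - o) for p, o in zip(planned_cut, observed_cut))
    if deviation > tolerance:
        return observed_cut, True    # corrected: re-aim at observed point
    return planned_cut, False        # within tolerance: no correction
```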
- Step S49 The farm work execution unit 36 uses the arm device 16 to harvest asparagus A5.
- Step S51 The farm work execution unit 36 stores the asparagus A5 harvested using the arm device 16 in the storage S.
- as described above, the farm work information can be appropriately managed by associating it with the position of the target of the farm work, and utilizing the farm work information allows future farm work and related business to be carried out smoothly.
- in addition, the farm work determination unit 34 determines whether or not to harvest the target by comprehensively considering the determination result of the abnormality determination unit 32 and the determination result of the crop measurement unit 30. It can therefore make a more accurate determination than when the farm work determination unit 34 decides whether to perform the farm work based on only one of the two determination results.
- the output information generation unit 62 is provided in the farm work management server 5.
- however, the output information generation unit may instead be included in the portable terminal 3 shown in FIG. 1.
- in that case, the farm work management server 5 transmits the farm work information to the mobile terminal 3, the mobile terminal 3 generates the output information, and the output unit 7 of the mobile terminal 3 outputs the generated output information.
- the farm work prediction unit 60 is provided in the farm work management server 5.
- however, the farm work prediction unit 60 may be included in the mobile terminal 3 or the farm work apparatus 1.
- further, at least one of the crop measurement unit 30, the abnormality determination unit 32, the farm work determination unit 34, the farm work execution unit 36, the farm work information generation unit 38, and the farm work information management unit 40 may be provided in the farm work management server 5 or the mobile terminal 3.
- in other words, the components of the farm work apparatus 1, the portable terminal 3, and the farm work management server 5 may be provided in different devices, as long as no contradiction arises in the processing content.
- in the above embodiment, the farm work determination unit 34 determines whether or not to perform the farm work based on both the determination result of the crop measurement unit 30 and the determination result of the abnormality determination unit 32.
- however, the farm work determination unit 34 may be configured to determine whether or not to perform the farm work based on only one of the determination result of the crop measurement unit 30 and the determination result of the abnormality determination unit 32.
- the farm work information, the position information, and the date / time information are transmitted from the farm work apparatus 1 to the portable terminal 3 via the farm work management server 5.
- the farm work information may be transmitted directly from the farm work apparatus 1 to the mobile terminal 3.
- in the above embodiment, the farm work information, the position information, and the date/time information are output by the output unit 7 of the portable terminal 3 shown in FIG. 1.
- the farm work information, the position information, and the date / time information may be output by an output unit (not shown) included in the farm work apparatus 1.
Landscapes
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Environmental Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Zoology (AREA)
- Insects & Arthropods (AREA)
- Pest Control & Pesticides (AREA)
- Wood Science & Technology (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Geometry (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Game Theory and Decision Science (AREA)
- Medical Informatics (AREA)
- Business, Economics & Management (AREA)
- Quality & Reliability (AREA)
- Health & Medical Sciences (AREA)
- Soil Sciences (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Guiding Agricultural Machines (AREA)
- Harvesting Machines For Specific Crops (AREA)
Abstract
Description
FIG. 1 is a block diagram showing an example of the schematic configuration of a farm work management system according to an embodiment of the present invention. As shown in FIG. 1, the farm work management system 100 illustratively includes a farm work apparatus 1 that performs farm work in a field FF and manages the results of the farm work, a farm work management server 5 that manages the results of the farm work performed by the farm work apparatus 1 and predicts the content of farm work to be performed in the future based on those results, and a portable terminal 3 held by a user U engaged in the farm work.
FIG. 2 is a block diagram showing an example of the schematic configuration of the farm work apparatus according to an embodiment of the present invention. As shown in FIG. 2, the farm work apparatus 1 illustratively includes a GPS sensor 12 that detects the position of the farm work apparatus 1 based on GPS signals transmitted from GPS satellites (not shown), a camera 14 (first imaging device) that images the target of farm work, an arm device 16 for performing the farm work, a recording device 18 that records the information necessary for each process executed by the farm work apparatus 1, a communication device 20 that transmits and receives various information to and from the portable terminal 3 and the farm work management server 5 shown in FIG. 1, a drive device 22 that controls the farm work apparatus 1 so that it moves along a predetermined route, and a central processing unit 10 connected to each of these components. Each of the above components may be a single component or may include a plurality of components. The camera 14 includes, for example, at least one of a TOF (Time Of Flight) distance image sensor using a pulsed laser, which is resistant to ambient light, and a camera equipped with a CCD (Charge-Coupled Device).
FIG. 7 is a block diagram showing an example of the schematic configuration of the farm work management server according to an embodiment of the present invention. As shown in FIG. 7, the farm work management server 5 illustratively includes a communication unit 50 that transmits and receives various information to and from the farm work apparatus 1 and the portable terminal 3 shown in FIG. 1, an information processing unit 52 that executes information processing based on the farm work information, and a recording unit 54 that stores the farm work information transmitted from the farm work apparatus 1 shown in FIG. 1 as a farm work information table AT (see FIGS. 3 to 6) together with a field map M representing the field FF shown in FIGS. 1 and 9. Here, as described later, the field map M is used, for example, when setting the range in which farm work is to be performed on the portable terminal 3 shown in FIG. 1 and when controlling the movement range of the farm work apparatus 1 while it actually performs farm work. The recording unit 54 may also record various programs related to the information processing executed by the information processing unit 52.
The overall flow of the farm work processing according to an embodiment of the present invention will be described with reference to FIGS. 8 and 9. FIG. 8 is a flowchart showing an example of the farm work processing flow according to an embodiment of the present invention. FIG. 9 is a diagram showing an example of the farm work performed by the farm work apparatus in the field according to an embodiment of the present invention.
The user U shown in FIG. 1 sets, on the portable terminal 3, the farm work range in which the farm work apparatus 1 performs farm work. For example, the portable terminal 3 acquires the field map information M recorded in the recording unit 54 of the farm work management server 5 shown in FIG. 7. Then, as shown in FIG. 9, the user U sets on the portable terminal 3 the routes L1, L3, and L5 along which the farm work apparatus 1 moves in the field FF, as well as the start point P1 at which the farm work apparatus 1 starts work and the end point P7 at which it finishes.
The drive device 22 of the farm work apparatus 1 shown in FIG. 2 controls the farm work apparatus 1 so that it moves through the field FF along the set route L1. First, the farm work apparatus 1 moves, for example, about 10 cm, stops, and performs farm work as described later. When the farm work is completed, it again moves about 10 cm and performs farm work, repeating this cycle. The drive device 22 may be configured to control the movement of the farm work apparatus 1 after the farm work determination unit 34 shown in FIG. 2 has determined, for every target included in the image information (first image information) from the camera 14, whether or not to perform farm work. Although the movement amount of the farm work apparatus 1 is described above as about 10 cm, it is not limited to this and may be set appropriately based on, for example, the size and arrangement of the harvest targets and the size of the lens of the camera 14.
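The move-stop-work cycle along route L1 can be sketched as a simple loop; the path length and the fixed 10 cm step are illustrative, since the text notes the step would in practice depend on crop size, arrangement, and the camera lens.

```python
def work_pass(path_length_m, step_m=0.1):
    """Advance about 10 cm at a time, stopping to perform farm work at
    each position, until the path is covered."""
    stops = []
    travelled = 0.0
    while travelled < path_length_m:
        stops.append(round(travelled, 3))  # position where work is performed
        travelled += step_m
    return stops
```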
The farm work apparatus 1 performs the farm work. For example, the farm work execution unit 36 shown in FIG. 2 performs the farm work based on the determination result of the farm work determination unit 34, which determines whether or not to perform the farm work on the target. The processing of the farm work determination unit 34 is described in detail with reference to FIGS. 10 to 12, and the processing of the farm work execution unit 36 with reference to FIGS. 13 and 14. The farm work includes, for example, harvesting targets arranged on the ridge R1, spraying a chemical on the field FF, and weeding the field FF.
The camera 14 (first imaging device) of the farm work apparatus 1 shown in FIG. 2 detects the marker M1 in the field FF.
The drive device 22 of the farm work apparatus 1 shown in FIG. 2 controls the movement of the farm work apparatus 1 based on the marker M1 detected by the camera 14. For example, by detecting the marker M1, the farm work apparatus 1 determines that it has arrived at the point P3 at the edge of the field FF. The farm work apparatus 1 turns, moves to the route L3, and resumes farm work on the ridge R3. Next, by detecting the marker M3, the farm work apparatus 1 determines that it has arrived at the point P5 at the edge of the field FF. The drive device 22 then turns the farm work apparatus 1, moves it to the route L5, and resumes farm work on the ridge R5.
When the farm work apparatus 1 arrives at the end point P7 of the set route L5, the farm work apparatus 1 ends the farm work.
The flow of the farm work determination processing according to an embodiment of the present invention will be described with reference to FIGS. 10 to 12. FIG. 10 is a flowchart showing an example of the farm work determination processing flow according to an embodiment of the present invention. FIG. 11 is a diagram showing an example of image information obtained by imaging with the camera of the farm work apparatus according to an embodiment of the present invention.
As shown in FIG. 11, the camera 14 of the farm work apparatus 1 shown in FIG. 2 acquires an image (first image information) by imaging a group of asparagus that is the target of farm work.
The abnormality determination unit 32 of the farm work apparatus 1 shown in FIG. 2 compares the image (first image information) acquired in step S21 with a reference image (second image information) that includes asparagus, the target of the farm work, recorded for example in the recording device 18 of the farm work apparatus 1. More specifically, the abnormality determination unit 32 uses an artificial intelligence model to execute the abnormality determination. The artificial intelligence model is trained so that it can accurately determine abnormalities by repeatedly learning from the results of abnormality determinations made in the past. A large number of images (reference images) used in past abnormality determinations are accumulated in the artificial intelligence model. By referring to this model for the image acquired in step S21, the abnormality determination unit 32 can determine the presence or absence of an asparagus abnormality, as described later.
The abnormality determination unit 32 determines whether or not there is an asparagus abnormality based on the comparison result of step S23. For example, the reference images include good asparagus worth harvesting, and the abnormality determination unit 32 compares an image of the asparagus to be harvested captured by the camera 14 with a reference image containing good asparagus to determine whether the asparagus actually scheduled for harvest is a good product. The reference images may also include non-standard asparagus not worth harvesting, and the abnormality determination unit 32 may be configured to compare an image of the asparagus to be harvested captured by the camera 14 with a reference image containing non-standard asparagus to determine whether the asparagus actually scheduled for harvest is a non-standard product.
The crop measurement unit 30 of the farm work apparatus 1 shown in FIG. 2 measures the cultivation position, height, and depth of the asparagus based on image information obtained by the camera 14 imaging the asparagus. The crop measurement unit 30 may also measure the asparagus cultivation position and the like based on the irradiation wave of a measurement laser (not shown) included in the farm work apparatus 1 and the reflected wave from the asparagus. The farm work apparatus 1 may also be configured to measure the cultivation position, height, and depth of the asparagus using both the camera 14 and the measurement laser.
The crop measurement unit 30 determines the state of the asparagus by comparing the measured dimensions with a set threshold value. When the measured height of asparagus A1 is 30 cm, it exceeds the set threshold (26 cm), so the crop measurement unit 30 determines that the asparagus has grown enough to be worth harvesting. On the other hand, when the measured height of asparagus A1 is 15 cm, it is below the set threshold (26 cm), so the crop measurement unit 30 determines that the asparagus has not grown enough to be worth harvesting. The crop measurement unit 30 may also determine the state of the asparagus by comparing the measured width of the asparagus with a set threshold value. The "threshold value" is not limited to a threshold set by the user and may be a predetermined threshold value.
The farm work determination unit 34 of the farm work apparatus 1 shown in FIG. 2 determines whether or not to harvest the asparagus based on the determination result output by the abnormality determination unit 32 and the determination result output by the crop measurement unit 30. For example, when the determination result of the abnormality determination unit 32 is "no abnormality" and the determination result of the crop measurement unit 30 is "26 cm or more: sufficiently grown", it determines "harvest" in the sense that the asparagus is good enough for shipping. When the determination result of the abnormality determination unit 32 is "abnormality present" and the determination result of the crop measurement unit 30 is "26 cm or less: not sufficiently grown", it determines "harvest" in the sense that the asparagus will not be worth shipping even if it grows further; in this case, the asparagus is discarded after harvesting. Similarly, when the determination result of the abnormality determination unit 32 is "abnormality present" and the determination result of the crop measurement unit 30 is "26 cm or more: sufficiently grown", it determines "harvest" in the sense that the asparagus is not worth shipping, and the asparagus is again discarded after harvesting. Finally, when the determination result of the abnormality determination unit 32 is "no abnormality" and the determination result of the crop measurement unit 30 is "26 cm or less: not sufficiently grown", the farm work determination unit 34 determines "do not harvest", in the sense of waiting until the asparagus scheduled for harvest grows enough to be worth harvesting, since it is neither a non-standard product nor diseased.
The farm work execution processing flow according to an embodiment of the present invention will be described with reference to FIGS. 13 and 14. FIG. 13 is a flowchart showing an example of the farm work execution processing flow according to an embodiment of the present invention. FIG. 14 is a diagram showing an example of the farm work performed by the farm work apparatus in the field according to an embodiment of the present invention.
As described in step S29 of FIG. 10, the crop measurement unit 30 shown in FIG. 2 measures the cultivation position, height, and depth of asparagus A5. Next, based on the measurement result of the crop measurement unit 30, the farm work execution unit 36 sets an approach route for the arm device 16 up to harvesting asparagus A5, for example a route from the current position of the arm device 16 to the position on the ridge R of the field FF shown in FIG. 14 where asparagus A5 is arranged. The approach route may include a route from the three-dimensional coordinate position of the arm device 16 to the three-dimensional coordinate position indicating the cutting point of asparagus A5 on the ridge R of the field FF shown in FIG. 14. The approach route may be generated and then set by the farm work execution unit 36, or the crop measurement unit 30 may generate the approach route based on the measurement result and the farm work execution unit 36 may set it.
For example, image information (third image information) is acquired by the camera 14 (second imaging device) mounted on the arm device 16. Because it is mounted on the arm device 16, the camera 14 (second imaging device) can measure the actual approach route of the arm device 16 more accurately than a camera mounted on the main body of the farm work apparatus 1.
The farm work execution unit 36 determines, based on the acquired image information, whether or not the approach route of the arm device 16 differs from the approach route set in step S41. When the approach route of the arm device 16 does not differ from the approach route set in step S41 (No), the process proceeds to step S49. On the other hand, when the approach route of the arm device 16 differs from the approach route set in step S41 (Yes), the process proceeds to step S47.
The farm work execution unit 36 controls the approach route of the arm device 16 based on the image information from the camera 14. For example, if the approach route of the arm device 16 differs from the approach route set in step S41, asparagus A5 might not be harvested reliably, so the approach route of the arm device 16 is corrected to the proper route to ensure that asparagus A5 is harvested. For example, when the approach route includes a route from the three-dimensional coordinate position of the arm device 16 to the three-dimensional coordinate position indicating the cutting point of asparagus A5 on the ridge R of the field FF shown in FIG. 14, the farm work execution unit 36 may be configured to correct the deviation of the cutting point based on the image information from the camera 14.
The farm work execution unit 36 harvests asparagus A5 using the arm device 16.
The farm work execution unit 36 stores the asparagus A5 harvested using the arm device 16 in the storage S.
The above embodiments are intended to facilitate understanding of the present invention and are not to be interpreted as limiting the present invention. The present invention may be modified and improved without departing from its spirit (for example, by combining the embodiments or omitting part of the configuration of an embodiment), and the present invention also includes equivalents thereof.
Claims (12)
- A farm work apparatus comprising: a first imaging device; a farm work determination unit that determines whether or not to perform farm work on a target of the farm work based on first image information obtained by the first imaging device imaging the target; a farm work execution unit that performs the farm work based on a determination result of the farm work determination unit; a farm work information generation unit that generates farm work information including a result of the farm work; a measurement unit that measures a position of the target; and a farm work information management unit that manages the farm work information and position information indicating the measured position.
- The farm work apparatus according to claim 1, further comprising an abnormality determination unit that determines presence or absence of an abnormality of the target based on the first image information and second image information serving as a reference image including the target.
- The farm work apparatus according to claim 2, wherein the measurement unit measures a dimension of the target based on the first image information, and the farm work determination unit determines whether or not to perform the farm work on the target based on a result of comparing the measured dimension with a set threshold value and on the presence or absence of the abnormality.
- The farm work apparatus according to any one of claims 1 to 3, wherein the farm work information management unit records time information indicating a time at which the position of the target was measured, in association with the farm work information and the position information.
- The farm work apparatus according to any one of claims 1 to 4, further comprising an arm device for performing the farm work, wherein the farm work execution unit controls the operation of the arm device based on the position of the target measured by the measurement unit.
- The farm work apparatus according to claim 5, wherein the arm device includes a second imaging device, and the farm work execution unit controls the operation of the arm device based on third image information obtained by the second imaging device imaging the target.
- The farm work apparatus according to any one of claims 1 to 6, further comprising a drive device that controls the farm work apparatus so that it moves along a predetermined route, wherein the drive device controls movement of the farm work apparatus after the farm work determination unit has determined whether or not to perform the farm work on the target included in the first image information.
- The farm work apparatus according to claim 7, wherein the first imaging device detects a marker arranged on the route, and the drive device controls the movement of the farm work apparatus based on the detected marker.
- The farm work apparatus according to any one of claims 1 to 8, wherein the position of the target measured by the measurement unit includes a position at which the first imaging device imaged the target.
- A farm work management system comprising: the farm work apparatus according to any one of claims 1 to 9; and a farm work management server, wherein the farm work apparatus transmits the farm work information indicating the result of the farm work and the position information to the farm work management server, and the farm work management server includes a farm work prediction unit that predicts, based on the farm work information and the position information, the content of the farm work to be performed on the target.
- The farm work management system according to claim 10, further comprising a portable terminal including an output unit, wherein the farm work management server further includes an output information generation unit that generates output information for outputting the farm work information and the position information in association with each other in a predetermined output mode, and the output unit outputs the farm work information and the position information in the predetermined output mode based on the output information.
- A program that causes a farm work apparatus including an imaging device to function as: a farm work determination unit that determines whether or not to perform farm work on a target of the farm work based on image information obtained by the imaging device imaging the target; a farm work execution unit that performs the farm work based on a determination result of the farm work determination unit; a farm work information generation unit that generates farm work information including a result of the farm work; a measurement unit that measures a position of the target; and a farm work information management unit that manages the farm work information and position information indicating the measured position.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2017414991A AU2017414991B2 (en) | 2017-05-17 | 2017-05-17 | Agricultural work apparatus, agricultural work management system, and program |
CN201780090814.0A CN110662417A (zh) | 2017-05-17 | 2017-05-17 | 农作业装置、农作业管理系统以及程序 |
US16/613,295 US11632907B2 (en) | 2017-05-17 | 2017-05-17 | Agricultural work apparatus, agricultural work management system, and program |
JP2019518663A JP6853591B2 (ja) | 2017-05-17 | 2017-05-17 | 農作業装置、農作業管理システム、及びプログラム |
EP17909702.7A EP3626043B1 (en) | 2017-05-17 | 2017-05-17 | Agricultural work apparatus, agricultural work management system, and program |
PCT/JP2017/018517 WO2018211621A1 (ja) | 2017-05-17 | 2017-05-17 | 農作業装置、農作業管理システム、及びプログラム |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2017/018517 WO2018211621A1 (ja) | 2017-05-17 | 2017-05-17 | 農作業装置、農作業管理システム、及びプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018211621A1 true WO2018211621A1 (ja) | 2018-11-22 |
Family
ID=64273502
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/018517 WO2018211621A1 (ja) | 2017-05-17 | 2017-05-17 | 農作業装置、農作業管理システム、及びプログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US11632907B2 (ja) |
EP (1) | EP3626043B1 (ja) |
JP (1) | JP6853591B2 (ja) |
CN (1) | CN110662417A (ja) |
AU (1) | AU2017414991B2 (ja) |
WO (1) | WO2018211621A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021261294A1 (ja) * | 2020-06-24 | 2021-12-30 | 株式会社クボタ | 農業用ロボット |
JP2022006700A (ja) * | 2020-06-24 | 2022-01-13 | 株式会社クボタ | 農業用ロボット |
JP2022006699A (ja) * | 2020-06-24 | 2022-01-13 | 株式会社クボタ | 農業用ロボット |
KR20230030314A (ko) * | 2021-08-25 | 2023-03-06 | 국민대학교산학협력단 | 작물 재배 장치 |
JP7545933B2 (ja) | 2021-06-03 | 2024-09-05 | 日立チャネルソリューションズ株式会社 | 農業支援システム及び移動体 |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7223977B2 (ja) * | 2019-09-13 | 2023-02-17 | パナソニックIpマネジメント株式会社 | 収穫ロボットシステム |
US20210368680A1 (en) * | 2020-05-28 | 2021-12-02 | Automated Harvesting Solutions, LLC | Real-time speed adjustment for harvesting crops |
US11768187B2 (en) | 2020-05-28 | 2023-09-26 | Automated Harvesting Solutions, LLC | Harvester for selectively and robotically harvesting crops |
TR2021003997A1 (tr) * | 2021-03-01 | 2022-09-21 | Move On Teknoloji̇ Anoni̇m Şi̇rketi̇ | Otonom traktör si̇stemi̇ |
CN118476527B (zh) * | 2024-05-20 | 2024-10-01 | 哈尔滨工业大学 | 一种多自由度激光除草机器人 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06141649A (ja) * | 1991-08-29 | 1994-05-24 | Iseki & Co Ltd | 果菜類収穫機等の果菜把握方式 |
JPH06231329A (ja) | 1993-02-01 | 1994-08-19 | Iseki & Co Ltd | 農業機械のデ−タ収集装置 |
JP2004203514A (ja) * | 2002-12-24 | 2004-07-22 | Sakae Shibusawa | 農産物評価システム |
JP2009131223A (ja) * | 2007-11-30 | 2009-06-18 | Nagasaki Prefecture | アスパラガス切断可否自動判定装置 |
JP2011229406A (ja) * | 2010-04-23 | 2011-11-17 | Ihi Corp | 自動収穫装置 |
JP2013074807A (ja) * | 2011-09-29 | 2013-04-25 | Kubota Corp | 圃場情報生成システム |
JP2013235461A (ja) * | 2012-05-09 | 2013-11-21 | Kubota Corp | 農業機械における圃場データ収集システム |
JP2014183841A (ja) * | 2013-02-19 | 2014-10-02 | Muroran Institute Of Technology | 植物自動収穫機、植物自動収穫プログラムおよび植物自動収穫方法 |
WO2016009752A1 (ja) * | 2014-07-16 | 2016-01-21 | 株式会社リコー | 情報処理装置、制御信号の生産方法、情報処理システム、プログラム |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4379001B2 (ja) * | 2003-05-28 | 2009-12-09 | 井関農機株式会社 | コンバインの刈取作業方法 |
US7765780B2 (en) * | 2003-12-12 | 2010-08-03 | Vision Robotics Corporation | Agricultural robot system and method |
JP4650669B2 (ja) * | 2004-11-04 | 2011-03-16 | 富士ゼロックス株式会社 | 動体認識装置 |
DE102007008330B4 (de) * | 2007-02-16 | 2016-03-24 | Christoph Neubauer | Vorrichtung und Verfahren zum Ernten von Spargel |
JP5119392B2 (ja) | 2007-02-26 | 2013-01-16 | 井関農機株式会社 | 果実収穫ロボットとイチゴ栽培施設 |
JP5294173B2 (ja) * | 2008-09-18 | 2013-09-18 | 独立行政法人農業・食品産業技術総合研究機構 | 果菜類の果房収穫装置及び果菜類の選択収穫方法 |
JP5810494B2 (ja) * | 2010-09-07 | 2015-11-11 | 株式会社ニコン | 植物栽培システム、植物栽培プラント、収穫装置及び植物栽培方法 |
CN103763515B (zh) * | 2013-12-24 | 2017-08-11 | 浙江工业大学 | 一种基于机器学习的视频异常检测方法 |
WO2015138820A1 (en) * | 2014-03-12 | 2015-09-17 | ClearMark Systems, LLC | System and method for authentication |
WO2016009688A1 (ja) * | 2014-07-16 | 2016-01-21 | 株式会社リコー | システム、機械、制御方法、プログラム |
US9313944B1 (en) * | 2014-12-03 | 2016-04-19 | Cnh Industrial America Llc | System and method for agriculture using a seed tape |
ES2540676B2 (es) * | 2015-02-25 | 2016-02-08 | Universidad De Granada | Procedimiento y sistema de guiado para la recolección automática de producto hortícola basado en modelado digital 3D |
CN104809732B (zh) * | 2015-05-07 | 2017-06-20 | 山东鲁能智能技术有限公司 | 一种基于图像比对的电力设备外观异常检测方法 |
CN105619741B (zh) * | 2016-03-28 | 2018-01-23 | 浙江工业大学 | 一种基于Tegra K1的模具智能检测方法 |
CN105976397B (zh) * | 2016-04-28 | 2019-03-26 | 西安电子科技大学 | 一种目标跟踪方法 |
US9898688B2 (en) * | 2016-06-01 | 2018-02-20 | Intel Corporation | Vision enhanced drones for precision farming |
-
2017
- 2017-05-17 EP EP17909702.7A patent/EP3626043B1/en active Active
- 2017-05-17 WO PCT/JP2017/018517 patent/WO2018211621A1/ja unknown
- 2017-05-17 AU AU2017414991A patent/AU2017414991B2/en active Active
- 2017-05-17 CN CN201780090814.0A patent/CN110662417A/zh active Pending
- 2017-05-17 JP JP2019518663A patent/JP6853591B2/ja active Active
- 2017-05-17 US US16/613,295 patent/US11632907B2/en active Active
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06141649A (ja) * | 1991-08-29 | 1994-05-24 | Iseki & Co Ltd | 果菜類収穫機等の果菜把握方式 |
JPH06231329A (ja) | 1993-02-01 | 1994-08-19 | Iseki & Co Ltd | 農業機械のデ−タ収集装置 |
JP2004203514A (ja) * | 2002-12-24 | 2004-07-22 | Sakae Shibusawa | 農産物評価システム |
JP2009131223A (ja) * | 2007-11-30 | 2009-06-18 | Nagasaki Prefecture | アスパラガス切断可否自動判定装置 |
JP2011229406A (ja) * | 2010-04-23 | 2011-11-17 | Ihi Corp | 自動収穫装置 |
JP2013074807A (ja) * | 2011-09-29 | 2013-04-25 | Kubota Corp | 圃場情報生成システム |
JP2013235461A (ja) * | 2012-05-09 | 2013-11-21 | Kubota Corp | 農業機械における圃場データ収集システム |
JP2014183841A (ja) * | 2013-02-19 | 2014-10-02 | Muroran Institute Of Technology | 植物自動収穫機、植物自動収穫プログラムおよび植物自動収穫方法 |
WO2016009752A1 (ja) * | 2014-07-16 | 2016-01-21 | 株式会社リコー | 情報処理装置、制御信号の生産方法、情報処理システム、プログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3626043A4 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021261294A1 (ja) * | 2020-06-24 | 2021-12-30 | 株式会社クボタ | 農業用ロボット |
JP2022006700A (ja) * | 2020-06-24 | 2022-01-13 | 株式会社クボタ | 農業用ロボット |
JP2022006699A (ja) * | 2020-06-24 | 2022-01-13 | 株式会社クボタ | 農業用ロボット |
JP7408496B2 (ja) | 2020-06-24 | 2024-01-05 | 株式会社クボタ | 農業用ロボット |
JP7471931B2 (ja) | 2020-06-24 | 2024-04-22 | 株式会社クボタ | 農業用ロボット |
JP7545933B2 (ja) | 2021-06-03 | 2024-09-05 | 日立チャネルソリューションズ株式会社 | 農業支援システム及び移動体 |
KR20230030314A (ko) * | 2021-08-25 | 2023-03-06 | 국민대학교산학협력단 | 작물 재배 장치 |
KR102613439B1 (ko) | 2021-08-25 | 2023-12-12 | 국민대학교산학협력단 | 작물 재배 장치 |
Also Published As
Publication number | Publication date |
---|---|
EP3626043A1 (en) | 2020-03-25 |
JP6853591B2 (ja) | 2021-03-31 |
US11632907B2 (en) | 2023-04-25 |
AU2017414991A1 (en) | 2019-12-05 |
AU2017414991B2 (en) | 2021-10-21 |
JPWO2018211621A1 (ja) | 2020-03-12 |
EP3626043A4 (en) | 2021-01-13 |
CN110662417A (zh) | 2020-01-07 |
US20210076570A1 (en) | 2021-03-18 |
EP3626043B1 (en) | 2024-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018211621A1 (ja) | 農作業装置、農作業管理システム、及びプログラム | |
US9813512B2 (en) | Systems and methods for efficiently generating a geospatial data map for use in agricultural operations | |
US9696162B2 (en) | Mission and path planning using images of crop wind damage | |
CN109197278B (zh) | 作业策略的确定方法及装置、药物喷洒策略的确定方法 | |
US8417534B2 (en) | Automated location-based information recall | |
US20160019560A1 (en) | Agricultural situational awareness tool | |
JP5944805B2 (ja) | コンバイン及びコンバインの管理システム | |
EP3158409B1 (en) | Garden visualization and mapping via robotic vehicle | |
WO2016103067A1 (en) | Garden mapping and planning via robotic vehicle | |
WO2021089813A2 (en) | System for measuring and interpreting a force | |
JP6499570B2 (ja) | 茎数計測システム、及び、それを用いた農作管理システム | |
US11983934B2 (en) | Machine-learned obstruction detection in a farming machine | |
JP5806997B2 (ja) | 農作業情報管理装置及び農作業情報管理システム | |
KR20160076317A (ko) | 병해충 발생 예측 장치 및 방법 | |
JP7068747B2 (ja) | コンピュータシステム、作物生育支援方法及びプログラム | |
JP7546452B2 (ja) | 情報処理装置、情報処理方法、及び、プログラム | |
US20220392214A1 (en) | Scouting functionality emergence | |
JP6755209B2 (ja) | 収量情報表示システム | |
EP4187344B1 (en) | Work machine distance prediction and action control | |
JP2022181163A (ja) | 情報処理装置、情報処理方法、およびプログラム | |
US20240273712A1 (en) | Agricultural field analysis system for generating a field digital twin | |
NL2028679B1 (en) | A vision system for providing data related to the plant morphology of a plant using deep learning, as well as a corresponding method. | |
JP7570212B2 (ja) | 情報処理装置、情報処理方法及びプログラム | |
JP7510305B2 (ja) | 作業管理システム、作業管理方法及び作業管理プログラム | |
US20220398841A1 (en) | Information processing device, information processing method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17909702 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2019518663 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2017414991 Country of ref document: AU Date of ref document: 20170517 Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2017909702 Country of ref document: EP Effective date: 20191217 |