WO2024080461A1 - Autonomously traveling disaster prevention device, and control method therefor - Google Patents

Autonomously traveling disaster prevention device, and control method therefor

Info

Publication number
WO2024080461A1
Authority
WO
WIPO (PCT)
Prior art keywords
driving
disaster prevention
processor
module
prevention device
Prior art date
Application number
PCT/KR2023/003823
Other languages
French (fr)
Korean (ko)
Inventor
이준수
윤삼진
Original Assignee
주식회사 웨이브에이아이
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 웨이브에이아이
Publication of WO2024080461A1 publication Critical patent/WO2024080461A1/en

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots

Definitions

  • The present invention relates to an autonomous driving disaster prevention device and a control method therefor, and more specifically to an autonomous driving disaster prevention device that sets a target point from an image captured by a stereo camera, corrects its posture while driving autonomously, and performs a set disaster prevention task when crops are identified, and to a control method therefor.
  • A smart farm refers to a type of intelligent farm that minimizes the need for human labor by combining information and communication technology (ICT) with farming technology to automate it.
  • According to smart farm technology, the temperature, humidity, sunlight, carbon dioxide, soil condition, and so on of a crop cultivation facility are measured and analyzed using Internet of Things (IoT) technology, and the growth status of the crops is analyzed. Based on the analysis results, an automated system can be operated to grow crops in an optimal growth environment, and remote management is also possible through mobile devices such as smartphones.
  • Such smart farms can be installed in various forms and at various costs depending on how the system is configured; however, because they usually require substantial investment, research has recently been conducted on open-field smart farms, which are cheaper to install than greenhouse-type smart farms.
  • Open land is land that is not covered by a roof; in agriculture, it refers to rice paddies and fields without installed facilities.
  • A smart farm system may be installed on unleveled open ground (e.g., orchards or highland vegetable fields) due to the nature of open fields, and even ground that has been leveled may become uneven again under the effects of wind (typhoons), rain (heavy rain), or sunlight (heat waves).
  • The present invention was conceived to address the problems described above. An object of the present invention, according to one aspect, is to provide an autonomous driving disaster prevention device that sets a target point from an image captured by a stereo camera, corrects its posture while driving autonomously, and performs a set disaster prevention task when crops are identified, and a control method therefor.
  • An autonomous driving disaster prevention device includes: a stereo camera that photographs the area ahead; a sensor module that detects the posture of the disaster prevention device; a driving module that moves the disaster prevention device; a disaster prevention module that operates the disaster prevention equipment; a memory; and a processor operatively coupled to the stereo camera, sensor module, driving module, disaster prevention module, and memory. The processor runs an executable program stored in the memory to set a target point from the image captured by the stereo camera, and then sets a driving direction and drives the driving module; when an obstacle is detected, it performs evasive driving while correcting the driving direction according to the posture change input from the sensor module; and when crops are recognized in the captured image, it operates the disaster prevention module.
  • In the present invention, the processor sets virtual boundary lines along the crops planted on both sides of the driving space in the captured image and then extends the virtual boundary lines to set the target point.
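When the crop rows are straight, the two extended boundary lines meet at a single image point, so the target point can be found as a line intersection. A minimal sketch of that computation is below; the point-plus-direction line representation, the function name, and all coordinates are illustrative assumptions, not details taken from the patent.

```python
# Sketch: estimate a target point as the intersection of the two
# virtual boundary lines fitted along the crop rows on either side
# of the driving space (image coordinates, origin at top-left).

def line_intersection(p1, d1, p2, d2):
    """Intersect two 2D lines given as point + direction vectors.

    Solves p1 + t*d1 == p2 + s*d2 for t and returns the point.
    Returns None when the lines are (near-)parallel.
    """
    (x1, y1), (dx1, dy1) = p1, d1
    (x2, y2), (dx2, dy2) = p2, d2
    denom = dx1 * dy2 - dy1 * dx2          # 2D cross product of directions
    if abs(denom) < 1e-9:
        return None                        # parallel rows: no intersection
    t = ((x2 - x1) * dy2 - (y2 - y1) * dx2) / denom
    return (x1 + t * dx1, y1 + t * dy1)

# Left crop row slants up-right, right crop row slants up-left; they
# meet ahead of the vehicle at the target point.
left = ((0.0, 480.0), (1.0, -1.0))
right = ((640.0, 480.0), (-1.0, -1.0))
print(line_intersection(*left, *right))  # → (320.0, 160.0)
```

In a real system the two lines would come from fitting the detected crop-row pixels, but the intersection step itself is this simple.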
  • In the present invention, the processor detects a target mark in the captured image and sets it as the target point.
  • In the present invention, the processor resets the driving direction, which has been adjusted according to posture changes, to the driving direction toward the target point at a set time interval.
  • The present invention further includes a communication module operatively connected to the processor, and the processor receives operation commands and work commands through the communication module and exchanges operation information and work information with a management terminal.
  • In the present invention, when the processor has performed evasive driving, it transmits the driving information together with the captured image to the management terminal.
  • A control method of an autonomous driving disaster prevention device includes the steps of: the processor running an executable program according to an operation command and receiving captured images from a stereo camera; the processor setting a target point from the captured image and then setting a driving direction; the processor driving a driving module based on the driving direction; the processor performing evasive driving when an obstacle is detected while driving the driving module; the processor correcting the driving direction according to the posture change input from the sensor module while driving the driving module; and the processor operating the disaster prevention module when crops are recognized in the captured image.
  • In the step of setting the driving direction, the processor sets virtual boundary lines along the crops planted on both sides of the driving space in the captured image, extends the virtual boundary lines to set the target point, and then sets the driving direction.
  • In the step of setting the driving direction, the processor detects a target mark in the captured image, sets it as the target point, and then sets the driving direction.
  • The present invention further includes a step of the processor resetting the driving direction, which has been adjusted according to posture changes, to the driving direction toward the target point at a set time interval.
  • The present invention further includes a step of the processor receiving operation commands and work commands through a communication module and exchanging operation information and work information with the management terminal.
  • In the step of exchanging information with the management terminal, when the processor has performed evasive driving, it transmits the driving information together with the captured image to the management terminal.
  • The autonomous driving disaster prevention device and its control method set a target point from an image captured by a stereo camera, correct the posture while driving autonomously, and perform a set disaster prevention task when crops are identified. As a result, even if the driving space in an unleveled open-field smart farm is damaged or unstable, disaster prevention work can be performed while driving stably, increasing the disaster prevention effect.
  • FIG. 1 is a block diagram showing an autonomous driving disaster prevention device according to an embodiment of the present invention.
  • Figure 2 is an example diagram of setting a target point in an autonomous driving disaster prevention device according to an embodiment of the present invention.
  • Figure 3 is a flowchart for explaining a control method of an autonomous driving disaster prevention device according to an embodiment of the present invention.
  • Figure 1 is a block diagram showing an autonomous driving disaster prevention device according to an embodiment of the present invention, and Figure 2 is an example diagram of setting a target point in the autonomous driving disaster prevention device according to an embodiment of the present invention.
  • As shown in Figure 1, the autonomous driving disaster prevention device may include a stereo camera 10, a sensor module 20, a communication module 30, a processor 40, a driving module 50, a disaster prevention module 60, and a memory 70.
  • The stereo camera 10 captures the area in front of the disaster prevention device through two cameras and can estimate near and far distances from the captured images.
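Distance from a stereo pair follows the standard pinhole-stereo relation, depth = focal length × baseline / disparity. The sketch below shows only that relation with illustrative numbers; the patent does not give the camera's calibration, so the focal length and baseline here are assumptions.

```python
# Sketch: distance to a point seen in both stereo views, from the
# horizontal disparity between the two images.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """depth = f * B / d (pinhole stereo); inf when no disparity."""
    if disparity_px <= 0:
        return float("inf")          # point too far to measure
    return focal_px * baseline_m / disparity_px

# e.g. 700 px focal length, 12 cm baseline, 42 px disparity
print(depth_from_disparity(700.0, 0.12, 42.0))  # → 2.0 (metres)
```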
  • The sensor module 20 may include a gyro sensor that detects changes in the three-dimensional posture of the disaster prevention device, such as tilting to the left or right.
  • The driving module 50 can move the disaster prevention device forward, backward, left, and right.
  • The disaster prevention module 60 can operate the disaster prevention equipment to spray water and chemicals.
  • The memory 70 stores an executable program for driving and operating the disaster prevention device.
  • The processor 40 is operatively coupled to the stereo camera 10, the sensor module 20, the driving module 50, the disaster prevention module 60, and the memory 70.
  • The processor 40 runs the executable program stored in the memory 70 to set a target point from the image captured by the stereo camera 10, and then sets the driving direction and drives the driving module 50.
  • As shown in Figure 2(a), the processor 40 can set virtual boundary lines along the crops planted on both sides of the driving space in the captured image, extend the virtual boundary lines, and set the point where they meet as the target point.
  • Alternatively, a target mark (TM) may be displayed at the end point of the driving space; the processor can then identify the target mark (TM) in the captured image and set it as the target point.
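One simple way a painted target mark could be localized is by taking the centroid of the pixels matching the mark's color. The toy binary "image" and function below are illustrative assumptions; a real detector would run color or shape matching on the actual camera frames.

```python
# Sketch: locate a target mark (TM) as the centroid of matching pixels.
# The image is a toy grid where 1 marks TM-colored pixels.

def find_mark_centroid(img):
    """Centroid (row, col) of all pixels equal to 1, or None if absent."""
    hits = [(r, c) for r, row in enumerate(img)
                   for c, v in enumerate(row) if v == 1]
    if not hits:
        return None                      # mark not visible in this frame
    rows = sum(r for r, _ in hits) / len(hits)
    cols = sum(c for _, c in hits) / len(hits)
    return (rows, cols)

img = [[0, 0, 0, 0],
       [0, 1, 1, 0],
       [0, 1, 1, 0],
       [0, 0, 0, 0]]
print(find_mark_centroid(img))  # → (1.5, 1.5)
```

The returned image coordinate would then serve directly as the target point for setting the driving direction.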
  • After setting the target point, the processor 40 sets the driving direction based on the target point and drives the driving module 50, enabling the device to drive in a straight line along the center of the driving space.
  • When an obstacle is detected, the processor 40 performs evasive driving and can drive the driving module 50 while correcting the driving direction according to the posture change input from the sensor module 20.
  • The processor 40 may reset the driving direction, which has been adjusted according to posture changes, to the driving direction toward the target point at a set time interval.
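The point of the periodic reset is that a heading integrated from gyro readings drifts, while the camera-derived target heading does not. The sketch below illustrates that idea only; the step counts and drift rate are invented numbers, not values from the patent.

```python
# Sketch: a dead-reckoned heading accumulates gyro drift, so every
# reset_period steps it is snapped back to the heading toward the
# camera-derived target point.

def drive(initial_heading, gyro_drift_per_step, target_heading,
          steps, reset_period):
    heading = initial_heading
    history = []
    for step in range(1, steps + 1):
        heading += gyro_drift_per_step        # accumulated attitude error
        if step % reset_period == 0:
            heading = target_heading          # periodic reset to target
        history.append(heading)
    return history

print(drive(0.0, 0.5, 0.0, steps=6, reset_period=3))
# → [0.5, 1.0, 0.0, 0.5, 1.0, 0.0]
```

Without the reset the heading would grow without bound (0.5, 1.0, 1.5, …), which is the cumulative-error path deviation the patent describes.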
  • The processor 40 can perform disaster prevention work by operating the disaster prevention module 60 when crops are recognized in the captured image according to a disaster prevention command.
  • The device also includes a communication module 30 operatively connected to the processor 40, through which data can be transmitted to and received from a management terminal 80.
  • The processor 40 can receive operation commands and work commands through the communication module 30 and exchange operation information and work information with the management terminal 80.
  • When evasive driving has been performed, the processor 40 may transmit the driving information together with the captured image to the management terminal 80 so that the reason for the evasive driving can be determined.
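Bundling the driving information with a reference to the captured frame in one message lets the management terminal reconstruct why a detour happened. The sketch below shows one possible payload; every field name and value is an illustrative assumption, since the patent does not specify a message format.

```python
# Sketch: an evasive-driving report sent to the management terminal,
# pairing driving info with the captured image that triggered it.
import json

def make_evasion_report(position, heading_deg, frame_id):
    """Serialize driving info plus a captured-frame reference."""
    return json.dumps({
        "event": "evasive_driving",
        "position": position,        # where the detour started
        "heading_deg": heading_deg,  # heading at that moment
        "frame": frame_id,           # identifier of the captured image
    }, sort_keys=True)

print(make_evasion_report([12.5, 3.0], 87.0, "img_0042"))
```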
  • In this way, a target point is set from an image captured by the stereo camera, the posture is corrected, and the set disaster prevention task is performed when crops are identified while driving autonomously. Even if the driving space in an unleveled open-field smart farm is damaged or unstable, disaster prevention work can be performed while driving stably, increasing the disaster prevention effect.
  • Figure 3 is a flowchart for explaining a control method of an autonomous driving disaster prevention device according to an embodiment of the present invention.
  • First, the processor 40 runs the executable program according to an operation command and receives captured images from the stereo camera 10 (S10).
  • After receiving the captured image in step S10, the processor 40 sets a target point from the captured image (S20).
  • As shown in Figure 2(a), the processor 40 can set virtual boundary lines along the crops planted on both sides of the driving space in the captured image, extend the virtual boundary lines, and set the point where they meet as the target point.
  • Alternatively, a target mark (TM) may be displayed at the end point of the driving space; the processor can then identify the target mark (TM) in the captured image and set it as the target point.
  • After setting the target point in step S20, the processor 40 sets the driving direction based on the target point (S30).
  • The processor 40 then drives the driving module 50 based on the driving direction so that the device can drive in a straight line along the center of the driving space (S40).
  • When the driving module 50 is driven in step S40 to move the disaster prevention device and an obstacle is detected in the captured image, the processor 40 performs evasive driving (S50).
  • While driving the driving module 50, the processor 40 corrects the driving direction according to the posture change input from the sensor module 20 (S60).
  • In other words, the processor 40 performs evasive driving and can drive the driving module 50 while correcting the driving direction according to the posture change input from the sensor module 20.
  • In addition, the processor 40 may reset the driving direction, which has been adjusted according to posture changes, to the driving direction toward the target point at a set time interval.
  • While the disaster prevention device autonomously drives along the driving space, the processor 40 can perform disaster prevention work by operating the disaster prevention module when crops are recognized in the captured image according to a disaster prevention command (S70).
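The S10–S70 sequence can be summarized as one decision pass per camera frame. In the sketch below the perception results (obstacle, crop, target heading, attitude change) are assumed to be already extracted from the frame; the dictionary layout and all names are illustrative assumptions, not the patent's API.

```python
# Sketch of one pass of the Fig. 3 control loop over a pre-analyzed
# camera frame: evade on obstacles (S50), steer toward the target
# point using the reported attitude change (S60), spray on crops (S70).

def control_step(frame, heading_deg):
    """Return the list of actions for this frame."""
    actions = []
    if frame["obstacle"]:                          # S50: evasive driving
        actions.append("evade")
    # S60: steering error = target heading minus the heading after the
    # attitude change reported by the sensor module
    error = frame["target_heading"] - (heading_deg + frame["attitude_change"])
    actions.append(("steer", error))
    if frame["crop"]:                              # S70: operate spray module
        actions.append("spray")
    return actions

frame = {"obstacle": False, "crop": True,
         "target_heading": 0.0, "attitude_change": 2.0}
print(control_step(frame, 0.0))  # → [('steer', -2.0), 'spray']
```

A tilt of +2° away from the target heading yields a −2° steering correction, which is the posture-based correction the flowchart describes in step S60.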
  • Meanwhile, the processor 40 can receive operation commands and work commands through the communication module 30 and exchange operation information and work information with the management terminal 80.
  • When evasive driving has been performed, the processor 40 may transmit the driving information together with the captured image to the management terminal 80 so that the reason for the evasive driving can be determined.
  • In this way, a target point is set from an image captured by the stereo camera, the posture is corrected, and the set disaster prevention task is performed when crops are identified while driving autonomously.
  • Implementations described herein may be implemented, for example, as a method or process, a device, a software program, a data stream, or a signal. Even if discussed only in the context of a single form of implementation (e.g., only as a method), the features discussed may also be implemented in other forms (e.g., as a device or a program).
  • The device may be implemented with appropriate hardware, software, firmware, etc.
  • The method may be implemented in a device such as a processor, which generally refers to a processing device including a computer, microprocessor, integrated circuit, or programmable logic device. Processors also include communication devices such as computers, cell phones, portable/personal digital assistants (PDAs), and other devices that facilitate the communication of information between end users.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Environmental Sciences (AREA)
  • Wood Science & Technology (AREA)
  • Pest Control & Pesticides (AREA)
  • Insects & Arthropods (AREA)
  • Zoology (AREA)
  • Electromagnetism (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Alarm Systems (AREA)

Abstract

Disclosed are an autonomously traveling disaster prevention device and a control method therefor. The autonomously traveling disaster prevention device of the present invention is characterized by comprising: a stereo camera which captures an image of an area in front of the disaster prevention device; a sensor module which detects the bearing of the disaster prevention device; a travel module which moves the disaster prevention device; a disaster prevention module which drives the disaster prevention device; memory; and a processor operatively coupled to the stereo camera, sensor module, travel module, disaster prevention module, and memory, wherein the processor drives the travel module by running an executable program stored in the memory, setting a destination in the image captured by the stereo camera, and then setting a travel direction, corrects the travel direction in response to bearing changes input from the sensor module while performing avoidance travel when obstacles are detected, and activates the disaster prevention module when crops are recognized in the captured image.

Description

Autonomous driving disaster prevention device and control method therefor

The present invention relates to an autonomous driving disaster prevention device and a control method therefor, and more specifically to an autonomous driving disaster prevention device that sets a target point from an image captured by a stereo camera, corrects its posture while driving autonomously, and performs a set disaster prevention task when crops are identified, and to a control method therefor.
In general, a smart farm refers to a type of intelligent farm that minimizes the need for human labor by combining information and communication technology (ICT) with farming technology to automate it.
According to smart farm technology, the temperature, humidity, sunlight, carbon dioxide, soil condition, and so on of a crop cultivation facility are measured and analyzed using Internet of Things (IoT) technology, and the growth status of the crops is analyzed. Based on the analysis results, an automated system can be operated to grow crops in an optimal growth environment, and remote management is also possible through mobile devices such as smartphones.
It is also possible to automatically spray fertilizers and pesticides for pest control.
Meanwhile, such smart farms can be installed in various forms and at various costs depending on how the system is configured; however, because they usually require substantial investment, research has recently been conducted on open-field smart farms, which are cheaper to install than greenhouse-type smart farms.
Open land is land that is not covered by a roof; in agriculture, it refers to rice paddies and fields without installed facilities.
To manage a smart farm, it is desirable to level the ground first. In the case of an open-field smart farm, however, the system may be installed on unleveled open ground (e.g., orchards or highland vegetable fields) due to the nature of open fields, and even ground that has been leveled may become uneven again under the effects of wind (typhoons), rain (heavy rain), or sunlight (heat waves).
The background technology of the present invention is disclosed in Korean Patent No. 10-1556301 (registered on September 22, 2015; device for controlling harmful elements in crops using an unmanned automatic robot).
In the case of an unmanned transport device driving through such an open-field smart farm, its posture may change continuously as it drives over the uneven driving space, and it may perform evasive driving when the driving space has been damaged or deformed by a typhoon or heavy rain. If the posture is not restored during driving, cumulative errors cause the device to deviate from its driving path, which reduces the effectiveness of any disaster prevention work performed.
The present invention was conceived to address the problems described above. An object of the present invention, according to one aspect, is to provide an autonomous driving disaster prevention device that sets a target point from an image captured by a stereo camera, corrects its posture while driving autonomously, and performs a set disaster prevention task when crops are identified, and a control method therefor.
An autonomous driving disaster prevention device according to one aspect of the present invention includes: a stereo camera that photographs the area ahead; a sensor module that detects the posture of the disaster prevention device; a driving module that moves the disaster prevention device; a disaster prevention module that operates the disaster prevention equipment; a memory; and a processor operatively coupled to the stereo camera, sensor module, driving module, disaster prevention module, and memory. The processor runs an executable program stored in the memory to set a target point from the image captured by the stereo camera, and then sets a driving direction and drives the driving module; when an obstacle is detected, it performs evasive driving while correcting the driving direction according to the posture change input from the sensor module; and when crops are recognized in the captured image, it operates the disaster prevention module.
In the present invention, the processor sets virtual boundary lines along the crops planted on both sides of the driving space in the captured image and then extends the virtual boundary lines to set the target point.
In the present invention, the processor detects a target mark in the captured image and sets it as the target point.
In the present invention, the processor resets the driving direction, which has been adjusted according to posture changes, to the driving direction toward the target point at a set time interval.
The present invention further includes a communication module operatively connected to the processor, and the processor receives operation commands and work commands through the communication module and exchanges operation information and work information with a management terminal.
In the present invention, when the processor has performed evasive driving, it transmits the driving information together with the captured image to the management terminal.
A control method of an autonomous driving disaster prevention device according to another aspect of the present invention includes the steps of: the processor running an executable program according to an operation command and receiving captured images from a stereo camera; the processor setting a target point from the captured image and then setting a driving direction; the processor driving a driving module based on the driving direction; the processor performing evasive driving when an obstacle is detected while driving the driving module; the processor correcting the driving direction according to the posture change input from the sensor module while driving the driving module; and the processor operating the disaster prevention module when crops are recognized in the captured image.
In the present invention, in the step of setting the driving direction, the processor sets virtual boundary lines along the crops planted on both sides of the driving space in the captured image, extends the virtual boundary lines to set the target point, and then sets the driving direction.
In the present invention, in the step of setting the driving direction, the processor detects a target mark in the captured image, sets it as the target point, and then sets the driving direction.
The present invention further includes a step of the processor resetting the driving direction, which has been adjusted according to posture changes, to the driving direction toward the target point at a set time interval.
The present invention further includes a step of the processor receiving operation commands and work commands through a communication module and exchanging operation information and work information with the management terminal.
In the present invention, in the step of exchanging information with the management terminal, when the processor has performed evasive driving, it transmits the driving information together with the captured image to the management terminal.
The autonomous driving disaster prevention device and control method therefor according to one aspect of the present invention set a target point from an image captured by a stereo camera, correct the posture while driving autonomously, and perform a set disaster prevention task when crops are identified. As a result, even if the driving space in an unleveled open-field smart farm is damaged or unstable, disaster prevention work can be performed while driving stably, increasing the disaster prevention effect.
Figure 1 is a block diagram showing an autonomous driving disaster prevention device according to an embodiment of the present invention.
Figure 2 is an example diagram of setting a target point in an autonomous driving disaster prevention device according to an embodiment of the present invention.
Figure 3 is a flowchart for explaining a control method of an autonomous driving disaster prevention device according to an embodiment of the present invention.
Hereinafter, an autonomous driving disaster prevention device and a control method therefor according to the present invention will be described with reference to the accompanying drawings. In this process, the thicknesses of lines and the sizes of components shown in the drawings may be exaggerated for clarity and convenience of explanation. In addition, the terms described below are defined in consideration of their functions in the present invention and may vary depending on the intention or custom of the user or operator. Therefore, these terms should be defined based on the content throughout this specification.
Figure 1 is a block diagram showing an autonomous driving disaster prevention device according to an embodiment of the present invention, and Figure 2 is an example diagram of setting a target point in the autonomous driving disaster prevention device according to an embodiment of the present invention.
도 1에 도시된 바와 같이 본 발명의 일 실시예에 따른 자율주행 방재장치는, 스테레오 카메라(10), 센서모듈(20), 주행모듈(50), 방재모듈(60), 메모리(70) 및 프로세서(40)를 비롯하여 통신모듈(30)을 포함할 수 있다. As shown in Figure 1, the autonomous driving disaster prevention device according to an embodiment of the present invention includes a stereo camera 10, a sensor module 20, a driving module 50, a disaster prevention module 60, a memory 70, and It may include a processor 40 and a communication module 30.
스테레오 카메라(10)는 방재장치의 전방을 두 대의 카메라를 통해 촬영하여 촬영영상으로부터 원근거리를 감지할 수 있다. The stereo camera 10 can capture the front of the disaster prevention device through two cameras and detect the distance from the captured image.
센서모듈(20)은 방재장치의 좌우 쏠림 등 3차원 자세의 변화를 감지하는 자이로센서를 포함할 수 있다. The sensor module 20 may include a gyro sensor that detects changes in three-dimensional posture, such as the left or right tilt of the disaster prevention device.
The driving module 50 can move the disaster prevention device forward, backward, left, and right.
The disaster prevention module 60 drives the disaster prevention equipment to spray water and chemical agents.
The memory 70 stores an execution program for driving and operating the disaster prevention device.
The processor 40 is operatively coupled to the stereo camera 10, the sensor module 20, the driving module 50, the disaster prevention module 60, and the memory 70.
Accordingly, the processor 40 runs the execution program stored in the memory 70, sets a target point from the image captured by the stereo camera 10, and then sets a driving direction to drive the driving module 50.
Here, as shown in Figure 2 (a), the processor 40 may set virtual boundary lines along the crops planted on both sides of the driving space in the captured image, extend the virtual boundary lines, and set their point of intersection as the target point.
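The point where the two extended boundary lines meet is the vanishing point of the crop rows in the image, i.e., an ordinary line-line intersection. A minimal sketch of that geometric step, assuming each boundary has already been fitted as two image points (the coordinates in the usage note are illustrative):

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of the line through p1, p2 with the line through p3, p4.

    Each argument is an (x, y) image point. Returns None when the two
    boundary lines are parallel (no finite vanishing point).
    """
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    x4, y4 = p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-9:
        return None  # parallel boundary lines
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
```

For example, with a 640-pixel-wide image and symmetric crop rows fitted through (0, 480)-(200, 240) on the left and (640, 480)-(440, 240) on the right, the lines meet at (320, 96): centered horizontally, as expected when the device sits in the middle of the row.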
Alternatively, as shown in Figure 2 (b), a target mark (TM) may be placed at the end of the driving space; the processor then identifies the target mark (TM) in the captured image and sets it as the target point.
After setting the target point in this way, the processor 40 sets the driving direction based on the target point and drives the driving module 50 so that the device travels in a straight line along the center of the driving space.
After the driving direction has been set from the target point, if the driving space is damaged or an obstacle is detected while the driving module 50 is moving the device, the processor 40 performs evasive driving and drives the driving module 50 while correcting the driving direction according to the posture change input from the sensor module 20.
At this time, the processor 40 may, at a set time period, reset the driving direction derived from posture changes to the driving direction given by the target point.
That is, when the set time elapses while the driving direction is being corrected according to posture changes, the processor sets a new target point and resets the driving direction, preventing the driving direction from drifting due to accumulated sensor error and enabling more stable and accurate driving.
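The periodic reset can be sketched as a heading estimator that dead-reckons on the gyro rate but re-anchors to the vision-derived heading at a fixed interval, so that integration error never accumulates for longer than one period. The reset interval and variable names below are illustrative assumptions, not values from this disclosure:

```python
RESET_PERIOD_S = 5.0  # assumed re-anchoring interval, in seconds


class HeadingEstimator:
    """Integrates the gyro rate, but periodically re-anchors to the
    vision-derived (target-point) heading so gyro drift cannot accumulate."""

    def __init__(self, vision_heading):
        self.heading = vision_heading
        self.elapsed = 0.0

    def update(self, gyro_rate, dt, vision_heading):
        self.heading += gyro_rate * dt   # dead-reckoned heading (drifts)
        self.elapsed += dt
        if self.elapsed >= RESET_PERIOD_S:
            self.heading = vision_heading  # reset by the target-point heading
            self.elapsed = 0.0
        return self.heading
```

With a biased gyro reporting a constant 0.01 rad/s while the vision heading stays at 0, the estimate drifts to 0.04 rad after four seconds and snaps back to 0 at the five-second reset, bounding the error.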
Meanwhile, when crops are recognized in the captured image under a disaster prevention command, the processor 40 can operate the disaster prevention module 60 to perform disaster prevention.
In addition, the device includes a communication module 30 operatively connected to the processor 40, through which it can transmit and receive data to and from a management terminal 80.
Accordingly, the processor 40 can receive operation commands and work commands through the communication module 30 and exchange operation information and work information with the management terminal 80.
When evasive driving has been performed, the processor 40 can transmit the operation information together with the captured image to the management terminal 80 so that the reason for the evasive driving can be determined.
As described above, according to the autonomous driving disaster prevention device of an embodiment of the present invention, a target point is set from an image captured by a stereo camera, the posture is corrected during autonomous driving, and a set disaster prevention task is performed when crops are identified. Even in an unleveled open-field smart farm where the driving space is damaged and unstable, the device can travel stably while performing disaster prevention work, thereby increasing the disaster prevention effect.
Figure 3 is a flowchart explaining a control method of an autonomous driving disaster prevention device according to an embodiment of the present invention.
As shown in Figure 3, in the control method of the autonomous driving disaster prevention device according to an embodiment of the present invention, the processor 40 first runs the execution program according to an operation command and receives a captured image from the stereo camera 10 (S10).
After receiving the captured image in step S10, the processor 40 sets a target point from the captured image (S20).
Here, as shown in Figure 2 (a), the processor 40 may set virtual boundary lines along the crops planted on both sides of the driving space in the captured image, extend the virtual boundary lines, and set their point of intersection as the target point.
Alternatively, as shown in Figure 2 (b), a target mark (TM) may be placed at the end of the driving space; the processor then identifies the target mark (TM) in the captured image and sets it as the target point.
After setting the target point in step S20, the processor 40 sets the driving direction based on the target point (S30).
After setting the driving direction in step S30, the processor 40 drives the driving module 50 based on the driving direction so that the device travels in a straight line along the center of the driving space (S40).
If an obstacle is detected in the captured image while the driving module 50 is driven to move the disaster prevention device in step S40, the processor performs evasive driving (S50).
In addition, while moving the device in step S40 or performing evasive driving in step S50, the processor 40 corrects the driving direction according to the posture change input from the sensor module 20 (S60).
As described above, after the driving direction has been set from the target point, if the driving space is damaged or an obstacle is detected while the driving module 50 is moving the device, the processor 40 performs evasive driving and drives the driving module 50 while correcting the driving direction according to the posture change input from the sensor module 20.
At this time, the processor 40 may, at a set time period, reset the driving direction derived from posture changes to the driving direction given by the target point.
In this way, while the disaster prevention device drives autonomously along the driving space, the processor 40 can perform disaster prevention by operating the disaster prevention module when crops are recognized in the captured image under a disaster prevention command (S70).
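Steps S10 through S70 can be condensed into a single perception-act iteration. The sketch below uses a plain dict as a stand-in for the perception results and the actuator log; every helper name, key, and gain value is a hypothetical illustration, not the disclosed implementation:

```python
def control_step(frame, robot):
    """One iteration of the S10-S70 loop (hypothetical stand-in logic).

    frame: dict of per-image perception results (target point, flags).
    robot: dict holding a correction term and an actuator log.
    """
    target_x, _ = frame["target_point"]        # S10-S20: image in, target point out
    error = target_x - frame["width"] / 2      # S30: offset from the image center
    if frame.get("obstacle"):                  # S50: evasive driving on obstacle
        robot["log"].append("avoid")
    steering = -0.01 * error + robot["gyro_correction"]  # S40/S60: drive + posture fix
    robot["log"].append(("drive", round(steering, 3)))
    if frame.get("crop"):                      # S70: spray when a crop is recognized
        robot["log"].append("spray")
    return robot
```

A centered target point yields zero steering and, if a crop is in view, a spray action; an off-center target or detected obstacle changes the logged actions accordingly.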
Meanwhile, the processor 40 can receive operation commands and work commands through the communication module 30 and exchange operation information and work information with the management terminal 80.
When evasive driving has been performed, the processor 40 can transmit the operation information together with the captured image to the management terminal 80 so that the reason for the evasive driving can be determined.
As described above, according to the control method of the autonomous driving disaster prevention device of an embodiment of the present invention, a target point is set from an image captured by a stereo camera, the posture is corrected during autonomous driving, and a set disaster prevention task is performed when crops are identified. Even in an unleveled open-field smart farm where the driving space is damaged and unstable, the device can travel stably while performing disaster prevention work, thereby increasing the disaster prevention effect.
Implementations described herein may be realized, for example, as a method or process, an apparatus, a software program, a data stream, or a signal. Even if discussed only in the context of a single form of implementation (for example, only as a method), the features discussed may also be implemented in other forms (for example, as an apparatus or a program). An apparatus may be implemented in appropriate hardware, software, firmware, and the like. A method may be implemented in an apparatus such as a processor, which refers generally to a processing device including a computer, a microprocessor, an integrated circuit, or a programmable logic device. Processors also include communication devices such as computers, cell phones, portable/personal digital assistants ("PDAs"), and other devices that facilitate the communication of information between end users.
The present invention has been described with reference to the embodiments shown in the drawings, but these are merely illustrative; those of ordinary skill in the art will understand that various modifications and other equivalent embodiments are possible therefrom.
Therefore, the true scope of technical protection of the present invention should be determined by the claims below.

Claims (12)

  1. An autonomous driving disaster prevention device, comprising:
    a stereo camera that captures images of the area ahead;
    a sensor module that detects the posture of the disaster prevention device;
    a driving module that moves the disaster prevention device;
    a disaster prevention module that drives the disaster prevention device;
    a memory; and
    a processor operatively coupled to the stereo camera, the sensor module, the driving module, the disaster prevention module, and the memory,
    wherein the processor runs an execution program stored in the memory to set a target point from an image captured by the stereo camera and then sets a driving direction to drive the driving module; performs evasive driving when an obstacle is detected, while correcting the driving direction according to a posture change input from the sensor module; and operates the disaster prevention module when crops are recognized in the captured image.
  2. The autonomous driving disaster prevention device of claim 1, wherein the processor sets virtual boundary lines along crops planted on both sides of a driving space in the captured image and then extends the virtual boundary lines to set the target point.
  3. The autonomous driving disaster prevention device of claim 1, wherein the processor detects a target mark in the captured image to set the target point.
  4. The autonomous driving disaster prevention device of claim 1, wherein the processor, at a set time period, resets the driving direction derived from posture changes to the driving direction given by the target point.
  5. The autonomous driving disaster prevention device of claim 1, further comprising a communication module operatively connected to the processor,
    wherein the processor receives operation commands and work commands through the communication module and transmits and receives operation information and work information to and from a management terminal.
  6. The autonomous driving disaster prevention device of claim 5, wherein, when the processor has performed evasive driving, the processor transmits operation information together with the captured image to the management terminal.
  7. A control method of an autonomous driving disaster prevention device, the method comprising:
    running, by a processor, an execution program according to an operation command and receiving a captured image from a stereo camera;
    setting, by the processor, a target point from the captured image and then setting a driving direction;
    driving, by the processor, a driving module based on the driving direction;
    performing, by the processor, evasive driving when an obstacle is detected while the driving module is driven;
    correcting, by the processor, the driving direction according to a posture change input from a sensor module while the driving module is driven; and
    operating, by the processor, a disaster prevention module when crops are recognized in the captured image.
  8. The control method of claim 7, wherein setting the driving direction comprises the processor setting virtual boundary lines along crops planted on both sides of a driving space in the captured image, extending the virtual boundary lines to set the target point, and then setting the driving direction.
  9. The control method of claim 7, wherein setting the driving direction comprises the processor detecting a target mark in the captured image to set the target point and then setting the driving direction.
  10. The control method of claim 7, further comprising the processor, at a set time period, resetting the driving direction derived from posture changes to the driving direction given by the target point.
  11. The control method of claim 7, further comprising the processor receiving operation commands and work commands through a communication module and transmitting and receiving operation information and work information to and from a management terminal.
  12. The control method of claim 11, wherein the transmitting and receiving to and from the management terminal comprises, when the processor has performed evasive driving, transmitting operation information together with the captured image to the management terminal.
PCT/KR2023/003823 2022-10-11 2023-03-22 Autonomously traveling disaster prevention device, and control method therefor WO2024080461A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020220129568A KR102593906B1 (en) 2022-10-11 2022-10-11 Automatic driving disaster prevention apparatus and control method thereof
KR10-2022-0129568 2022-10-11

Publications (1)

Publication Number Publication Date
WO2024080461A1 true WO2024080461A1 (en) 2024-04-18

Family

ID=88508803

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/003823 WO2024080461A1 (en) 2022-10-11 2023-03-22 Autonomously traveling disaster prevention device, and control method therefor

Country Status (2)

Country Link
KR (1) KR102593906B1 (en)
WO (1) WO2024080461A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2847673B2 (en) * 1990-11-07 1999-01-20 日本輸送機株式会社 Body guidance method for automatic guided vehicles
KR20190103524A (en) * 2018-02-13 2019-09-05 코가플렉스 주식회사 Autonomous driving devise and method
KR20210051969A (en) * 2019-10-31 2021-05-10 재단법인대구경북과학기술원 System for controlling vehicle for use of agriculture
KR20210069816A (en) * 2019-12-04 2021-06-14 충남대학교산학협력단 Attached tpye-agricultural working path detecting appatatus
KR20220135667A (en) * 2021-03-31 2022-10-07 이성호 Robot for pest control in greenhouse

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102272389B1 (en) * 2019-06-07 2021-07-02 이학률 Spraying device of autonomous driving type unmanned control vehicle


Also Published As

Publication number Publication date
KR102593906B1 (en) 2023-10-26

Similar Documents

Publication Publication Date Title
Pilarski et al. The demeter system for automated harvesting
CN109676602B (en) Self-adaptive calibration method, system, equipment and storage medium of walking robot
US20070269114A1 (en) Vision guidance system and method for identifying the position of crop rows in a field
WO2020103109A1 (en) Map generation method and device, drone and storage medium
Escobar‐Alvarez et al. R‐ADVANCE: rapid adaptive prediction for vision‐based autonomous navigation, control, and evasion
WO2019041266A1 (en) Path planning method, aircraft, and flight system
WO2020103108A1 (en) Semantic generation method and device, drone and storage medium
Hiremath et al. Image-based particle filtering for navigation in a semi-structured agricultural environment
JP2023537080A (en) METHOD AND APPARATUS FOR ADJUSTMENT OF SHELF POSITION AND POSTURE BY MOBILE ROBOT
CN109901594A (en) A kind of localization method and system of weed-eradicating robot
CN113848208B (en) Plant phenotype platform and control system thereof
WO2024080461A1 (en) Autonomously traveling disaster prevention device, and control method therefor
BR102022001736A2 (en) METHOD AND DETECTION SYSTEM FOR VEHICLES
KR101568853B1 (en) Greenhouse environment measurement device of Self-moving type having an ultrasonic sensor on the side
CN111679666A (en) Greenhouse management system based on mobile robot
US20220012494A1 (en) Intelligent multi-visual camera system and method
CN114128673A (en) Meat pigeon accurate feeding method based on mixed deep neural network
CN116576859A (en) Path navigation method, operation control method and related device
CN109685851B (en) Hand-eye calibration method, system, equipment and storage medium of walking robot
Li et al. Minimum‐time row transition control of a vision‐guided agricultural robot
CN108175337B (en) Sweeping robot and walking method thereof
KR20240049986A (en) Apparatus for identifying location of transfer device in smart farm and method thereof
CN114967495A (en) Orchard virtual simulation inspection system and method based on Internet of things cloud control platform
KR102269362B1 (en) Method and system for controlling robot for providing management service for plant in building
CN110794902B (en) Inspection method and device for breeding room, computer equipment and storage medium