CN111552385A - Grass condition identification method based on touch screen detection - Google Patents

Grass condition identification method based on touch screen detection

Info

Publication number
CN111552385A
CN111552385A
Authority
CN
China
Prior art keywords
touch screen
detection data
screen module
grass
grass condition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010346506.3A
Other languages
Chinese (zh)
Inventor
刘瑜
Current Assignee
Hangzhou Jingyi Intelligent Science and Technology Co Ltd
Original Assignee
Hangzhou Jingyi Intelligent Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Jingyi Intelligent Science and Technology Co Ltd filed Critical Hangzhou Jingyi Intelligent Science and Technology Co Ltd
Priority to CN202010346506.3A
Publication of CN111552385A
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01V: GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V9/00: Prospecting or detecting by methods not provided for in groups G01V1/00 - G01V8/00

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geophysics (AREA)
  • Human Computer Interaction (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a grass condition identification method based on touch screen detection. The system comprises a motion platform with a grass cutting device mounted at the bottom; a processor is arranged inside the motion platform, and a strip-shaped touch screen module is arranged at the bottom, covering the width of the vehicle body and positioned higher than the grass cutting device. The touch screen module is connected with the processor, which runs a grass condition identification algorithm comprising the following steps: (1) initial stage: the processor reads the initial detection data of the touch screen module and arranges it into a linked list L0; (2) working stage: (2-1) the processor reads the detection data of the touch screen module at fixed time intervals and arranges it into Li; (2-2) the initial detection data L0 is subtracted from the detection data Li, giving the current effective detection data Di; (2-3) the effective touch area Si is calculated; (2-4) the grass condition is judged according to the effective area Si.

Description

Grass condition identification method based on touch screen detection
Technical Field
The patent relates to a grass condition identification method based on touch screen detection, and belongs to the technical field of mobile robots.
Background
With the development of computer technology, computer input has evolved through four stages: from the original paper tape to the keyboard, then the mouse, and finally touch. Touch screen technology allows more people to use computers. The touch screen is essentially a sensor, consisting of a touch detection component and a touch screen controller. The touch detection component is mounted in front of the display screen; it detects the position touched by the user and sends it to the touch screen controller. The touch screen controller receives touch information from the detection component, converts it into touch-point coordinates, and sends the coordinates to the CPU, while also receiving and executing commands sent by the CPU.
The touch screen is an interactive input device: a user can control the computer simply by touching a position on the screen with a finger or stylus. Touch screen technology is therefore simple to operate and flexible to use. It originated in the 1970s and was first applied by the United States military; the technology then gradually shifted to civilian use. With the development of electronic and network technology and the popularization of Internet applications, a new generation of touch screen technologies and products emerged, and their many advantages, such as robustness, fast response, space savings, and ease of interaction, were widely recognized. This most natural human-computer interaction technology has since spread into many fields and, beyond personal portable information products, is widely applied in home appliances, public information systems (e.g., business inquiry terminals in e-government, banking, hospitals, and electric power), electronic games, communication equipment, office automation equipment, information collection equipment, and industrial equipment. The technology greatly benefits users and has become a highly attractive multimedia interaction medium.
The intelligent mowing robot can complete daily lawn maintenance without human participation, avoiding time-consuming and laborious work. With limited environmental information, however, the robot can only adopt a random walking path, which has a high repetition rate and low efficiency; how to optimize the working path and improve working efficiency has therefore become a key research problem.
Disclosure of Invention
To address these problems, this patent introduces touch screen technology into the technical field of mowing robots and provides a grass condition identification method based on touch screen detection, which identifies the lushness of the lawn while the motion platform moves and provides information for efficient path planning.
The technical scheme adopted by the patent for solving the technical problem is as follows:
the grass condition identification method based on touch screen detection comprises a moving platform with a grass cutting device arranged at the bottom, a processor is arranged in the moving platform, a strip-shaped touch screen module is arranged at the bottom of the moving platform, covers the width of a vehicle body and is higher than the grass cutting device, the touch screen module is connected with the processor, a grass condition identification algorithm is arranged in the processor, and the grass condition identification algorithm comprises the following steps:
(1) Initial stage: the processor reads the initial detection data of the touch screen module and arranges it into a linked list L0 = {(x0_1, y0_1, s0_1), (x0_2, y0_2, s0_2), ..., (x0_n, y0_n, s0_n)}, where (x0_k, y0_k) is the center coordinate of the k-th touch position, s0_k is its area, n ≤ N, and N is the maximum number of multi-point detections;
(2) Working stage:
(2-1) the processor reads the detection data of the touch screen module at fixed time intervals and arranges it into Li = {(xi_1, yi_1, si_1), (xi_2, yi_2, si_2), ..., (xi_j, yi_j, si_j)}, j ≤ N;
(2-2) the initial detection data L0 is subtracted from the detection data Li, giving the current effective detection data Di = {(xi_1, yi_1, si_1), (xi_2, yi_2, si_2), ..., (xi_(j-n), yi_(j-n), si_(j-n))};
(2-3) the effective touch area is calculated as Si = si_1 + si_2 + ... + si_(j-n), the sum of the touch areas in Di;
(2-4) if the effective area Si > T, the grass condition is judged to be lush; if the effective area Si < T, it is judged to be sparse; if Si = 0, it is judged to be already cut, where T is an empirical threshold.
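The working-stage steps above can be sketched in a few lines of Python. This is an illustrative reconstruction, not code from the patent; the function names, the exact center-matching rule, and the example values are all assumptions.

```python
# Hypothetical sketch of working-stage steps (2-2) through (2-4).
# Touch points are (x, y, area) tuples, as in the linked lists L0 and Li.

def effective_points(current, baseline):
    """Step (2-2): drop points whose center already appeared in L0."""
    base_centers = {(x, y) for x, y, _ in baseline}
    return [(x, y, s) for x, y, s in current if (x, y) not in base_centers]

def effective_area(points):
    """Step (2-3): Si, the sum of the remaining touch areas."""
    return sum(s for _, _, s in points)

def classify(area, threshold):
    """Step (2-4): map Si to one of the three grass states."""
    if area == 0:
        return "cut"
    return "lush" if area > threshold else "sparse"

# Example: one baseline point (stuck dirt) plus two fresh grass contacts.
baseline = [(1.0, 2.0, 0.5)]
current = [(1.0, 2.0, 0.5), (4.0, 1.0, 3.0), (6.5, 0.8, 2.0)]
grass = effective_points(current, baseline)
state = classify(effective_area(grass), threshold=4.0)  # Si = 5.0 -> "lush"
```

A real implementation would likely match centers with a tolerance rather than exact equality, since grass blades shift between samples; exact matching is used here only to keep the sketch short.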
The touch screen module is set to be a capacitive touch screen module.
The touch screen module is an infrared touch screen module.
The beneficial effects of this patent are mainly as follows: the scheme is based on mature technical means, solves the problem of grass condition identification, provides useful information for path planning, and can improve the working efficiency of the mowing robot.
Drawings
FIG. 1 is an exterior view of a motion platform;
fig. 2 is a bottom schematic view of the motion platform.
Detailed Description
The invention is further described below with reference to the accompanying drawings:
referring to fig. 1-2, the grass condition identification method based on touch screen detection comprises a moving platform 1 with a grass cutting device 3 mounted at the bottom, wherein the moving platform 1 can be provided with a driving wheel 2, and can realize straight movement, backward movement and rotation at any angle.
A processor is arranged inside the motion platform 1, and a strip-shaped touch screen module 4 is arranged at the bottom of the motion platform 1; the touch screen module 4 is a capacitive touch screen module. Capacitive touch screens have many advantages: they respond to touch alone and require no pressure to generate a signal, which suits the small contact force of grass blades; they need to be calibrated only once, or not at all, after production; and they are wear-resistant, long-lived, low-cost, and capable of multi-point detection.
To cover the maximum travel range, the touch screen module 4 is arranged to span the width of the vehicle body, and to distinguish uncut grass from grass that has already been cut, the touch screen module 4 is mounted higher than the grass cutting device 3.
The touch screen module 4 is connected with the processor, and the processor can read the data of the touch screen module 4 and arrange the data into a data format easy to process.
The processor is internally provided with a grass condition identification algorithm, and the grass condition identification algorithm comprises the following steps:
(1) Initial stage: the processor reads the initial detection data of the touch screen module 4 and arranges it into a linked list L0 = {(x0_1, y0_1, s0_1), (x0_2, y0_2, s0_2), ..., (x0_n, y0_n, s0_n)}, where (x0_k, y0_k) is the center coordinate of the k-th touch position, s0_k is its area, n ≤ N, and N is the maximum number of multi-point detections;
because of the working environment, the touch screen module 4 often has soil or grass adhered to it, and this step can detect these positions.
(2) The working stage is as follows:
(2-1) The processor reads the detection data of the touch screen module 4 at fixed time intervals and arranges it into Li = {(xi_1, yi_1, si_1), (xi_2, yi_2, si_2), ..., (xi_j, yi_j, si_j)}, j ≤ N;
This step performs data detection during operation. To increase the detection sensitivity of the touch screen module 4, operation can be scheduled for the morning, when humidity is high.
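The fixed-interval sampling of step (2-1) can be sketched as a simple polling loop. The read_touch_points() driver call, the handler interface, and the 0.5 s interval are illustrative assumptions, not details from the patent.

```python
import time

def sample_loop(read_touch_points, handle, interval_s=0.5, max_samples=10):
    """Read the detection data Li at fixed intervals and pass each list
    of (x, y, area) tuples to the grass-condition algorithm."""
    for _ in range(max_samples):
        handle(read_touch_points())  # hand one sample Li to the algorithm
        time.sleep(interval_s)
```

In practice the loop would run for as long as the mower operates; max_samples only keeps the sketch bounded.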
(2-2) The initial detection data L0 is subtracted from the detection data Li, giving the current effective detection data Di = {(xi_1, yi_1, si_1), (xi_2, yi_2, si_2), ..., (xi_(j-n), yi_(j-n), si_(j-n))};
This step removes the points already detected in step (1), as follows: when (xi_j, yi_j) is the same as some (x0_k, y0_k), the tuple (xi_j, yi_j, si_j) is deleted from Li.
(2-3) The effective touch area is calculated as Si = si_1 + si_2 + ... + si_(j-n), the sum of the touch areas in Di;
The effective area Si is the contact area between grass blades or stalks and the touch screen module 4; it is directly related to the lushness of the lawn and can be used to describe that lushness quantitatively.
(2-4) If the effective area Si > T, the grass condition is judged to be lush; if the effective area Si < T, it is judged to be sparse; if Si = 0, it is judged to be already cut, where T is an empirical threshold.
An empirical threshold T is set to classify the degree of lushness into three states, lush, sparse, and already cut, providing information for path planning.
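The patent does not specify how the path planner consumes the three states; as one hedged illustration, each state could be mapped to a mowing decision like this (the action names are assumptions):

```python
def next_action(state):
    """Map a grass state from step (2-4) to a hypothetical planner decision."""
    actions = {
        "lush": "mow this cell now",
        "sparse": "mow at reduced priority",
        "cut": "skip this cell",
    }
    return actions[state]
```

Skipping already-cut cells is what would let the planner avoid the high repetition rate of a random walking path.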

Claims (3)

1. The grass condition identification method based on touch screen detection comprises a moving platform with a grass cutting device arranged at the bottom, wherein a processor is arranged in the moving platform, and the grass condition identification method is characterized in that: the bottom of the motion platform is provided with a strip-shaped touch screen module, the touch screen module covers the width of a vehicle body and is higher than the mowing device, the touch screen module is connected with the processor, a grass condition identification algorithm is arranged in the processor, and the grass condition identification algorithm comprises the following steps:
(1) Initial stage: the processor reads the initial detection data of the touch screen module and arranges it into a linked list L0 = {(x0_1, y0_1, s0_1), (x0_2, y0_2, s0_2), ..., (x0_n, y0_n, s0_n)}, where (x0_k, y0_k) is the center coordinate of the k-th touch position, s0_k is its area, n ≤ N, and N is the maximum number of multi-point detections;
(2) Working stage:
(2-1) the processor reads the detection data of the touch screen module at fixed time intervals and arranges it into Li = {(xi_1, yi_1, si_1), (xi_2, yi_2, si_2), ..., (xi_j, yi_j, si_j)}, j ≤ N;
(2-2) the initial detection data L0 is subtracted from the detection data Li, giving the current effective detection data Di = {(xi_1, yi_1, si_1), (xi_2, yi_2, si_2), ..., (xi_(j-n), yi_(j-n), si_(j-n))};
(2-3) the effective touch area is calculated as Si = si_1 + si_2 + ... + si_(j-n), the sum of the touch areas in Di;
(2-4) if the effective area Si > T, the grass condition is judged to be lush; if the effective area Si < T, it is judged to be sparse; if Si = 0, it is judged to be already cut, where T is an empirical threshold.
2. A grass condition recognition method based on touch screen detection according to claim 1, characterized in that: the touch screen module is set to be a capacitive touch screen module.
3. A grass condition recognition method based on touch screen detection according to claim 1, characterized in that: the touch screen module is an infrared touch screen module.
CN202010346506.3A 2020-04-27 2020-04-27 Grass condition identification method based on touch screen detection Withdrawn CN111552385A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010346506.3A CN111552385A (en) 2020-04-27 2020-04-27 Grass condition identification method based on touch screen detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010346506.3A CN111552385A (en) 2020-04-27 2020-04-27 Grass condition identification method based on touch screen detection

Publications (1)

Publication Number Publication Date
CN111552385A true CN111552385A (en) 2020-08-18

Family

ID=72003237

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010346506.3A Withdrawn CN111552385A (en) 2020-04-27 2020-04-27 Grass condition identification method based on touch screen detection

Country Status (1)

Country Link
CN (1) CN111552385A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112293037A (en) * 2020-09-28 2021-02-02 深圳拓邦股份有限公司 Method for detecting lawn growth state by mowing robot and mowing robot
CN113016765A (en) * 2021-03-09 2021-06-25 厦门理工学院 Weeding system for greenhouse planting and control method thereof



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200818