CN111766944A - Human-plant interaction device - Google Patents

Human-plant interaction device Download PDF

Info

Publication number
CN111766944A
Authority
CN
China
Prior art keywords
plant
gesture
human
interaction device
control instruction
Prior art date
Legal status
Pending
Application number
CN202010458890.6A
Other languages
Chinese (zh)
Inventor
周磊晶
周莹
陈英俏
钱东
刘建华
黄思
傅苇航
Current Assignee
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202010458890.6A
Publication of CN111766944A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user

Abstract

The invention discloses a human-plant interaction device, which comprises: a plant; a gesture acquisition unit electrically connected to the plant, which acquires the different electrical signals generated when a person touches the plant with different gesture actions; a control unit, which identifies the electrical signal and generates a corresponding control instruction according to the identification result; and an execution unit, which executes the corresponding action according to the control instruction. The human-plant interaction device provided by the invention interacts with people by exploiting the plant body's force response and conductivity; its operation is simple and it can provide a better user experience.

Description

Human-plant interaction device
Technical Field
The invention relates to the technical field of artificial intelligence and intelligent interaction, and in particular to a human-plant interaction device.
Background
With the development of science and technology, human-computer interaction between people and intelligent devices is becoming increasingly common. In recent years pervasive computing has been a research hotspot in the field of human-computer interaction; its core idea is to integrate technology into everyday life so that people can interact with everyday objects at any time, in any place and in any way.
The relationship between humans and plants is of great importance. Indoor plants bring people fresh oxygen and create a relaxing environment. At the same time, as companion organisms, plants can bring satisfaction, relieve stress and aid recovery; they benefit people in many ways.
Intelligent planting devices are no longer unfamiliar to the public. Chinese patent document CN208273683U discloses an artificial-intelligence planting device in which a host remotely controls several sub-devices over a self-defined communication protocol. The body of each sub-device is an open container that can hold soil: its top has an opening providing water storage, water spraying and atomization, its middle contains a water-storage trough, its bottom houses a fully enclosed control-system assembly, and its inner wall carries a sensor assembly and a watering/atomization device. In its Internet-based mode the host supports a Wi-Fi connection and controls the sub-devices through an app, so that users can monitor the data in the sub-devices in real time even while travelling and can water or mist the plants through them. Chinese patent document CN111011045A discloses an intelligent flowerpot comprising an upward-opening pot body with an outer shell and an inner shell, a water-storage cavity formed between them, a water-injection port in the outer shell and soil inside the inner shell, together with a base, a detection mechanism, a water-replenishing mechanism, a soil-loosening mechanism, a controller and a mobile terminal; its advantage is remote-controlled watering and soil loosening for all-round plant management.
Touch is an important channel through which humans acquire external information; interaction through touch provides rich cues of material, temperature and hardness, and sensory stimulation that other senses cannot supply. In existing intelligent planting products, however, the experience that touch brings to people is not fully exploited during interaction.
Disclosure of Invention
The invention provides a human-plant interaction device that captures gesture information when a person touches a plant and performs the corresponding action according to that gesture information.
The specific technical solution of the invention is as follows:
A human-plant interaction device, comprising:
a plant;
a gesture acquisition unit electrically connected to the plant, which acquires the different electrical signals generated when a person touches the plant with different gesture actions;
a control unit, which identifies the electrical signal and generates a corresponding control instruction according to the identification result;
and an execution unit, which executes the corresponding action according to the control instruction.
The gesture acquisition unit is a swept-frequency capacitive sensor, which comprises:
a swept-frequency signal generation module, which generates a swept-frequency signal and applies it to the plant;
a capacitance receiving module, realized by the person touching the plant, through which the gesture action of touching the plant is taken as the input for the plant;
and an electrical signal acquisition module, which captures the capacitance change on the plant and obtains the electrical signal through filtering and demodulation.
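A minimal Python sketch of this acquisition chain is given below. It is an illustration only, not the patented hardware: the series resistance, capacitance values and sweep range are assumptions used to show how a per-frequency amplitude response (the capacitive profile) becomes the electrical signal handed to the control unit.

```python
import numpy as np

def capacitive_profile(touch_capacitance_pf, freqs_hz):
    """Amplitude of a series-RC voltage divider whose capacitance to ground is
    changed by the person touching the plant (simplified assumed model)."""
    r = 1e6                                    # assumed series resistance, ohms
    c = touch_capacitance_pf * 1e-12           # capacitance to ground, farads
    omega = 2 * np.pi * np.asarray(freqs_hz, dtype=float)
    z_c = 1.0 / (omega * c)                    # capacitor impedance magnitude
    return z_c / np.sqrt(r ** 2 + z_c ** 2)    # per-frequency amplitude response

freqs = np.linspace(1e3, 3.5e6, 200)           # illustrative sweep: 1 kHz to 3.5 MHz
baseline = capacitive_profile(10.0, freqs)     # plant alone (assumed 10 pF)
pinch = capacitive_profile(60.0, freqs)        # person pinching a leaf (assumed 60 pF)
electrical_signal = pinch - baseline           # change in the capacitive profile
```

Different grips couple different amounts of capacitance into the circuit, so each gesture shifts the profile in a characteristic way; that shift is what the classifier later distinguishes.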
Plant electrical signals are important physiological signals related to the plant's physiological processes and to the information transmitted inside the plant body; they are the plant's response to environmental stimuli. With the development of modern weak-signal detection and processing techniques and the wide application of computer-based data acquisition in agriculture, extracting and analysing weak plant electrical signals has become feasible and scientifically sound.
In the invention, the plant electrically connected to the gesture acquisition unit serves as a conductive element. The gesture acquisition unit applies an electrical signal of known frequency to the plant; when a person touches the plant, this signal changes, the gesture acquisition unit identifies the person's touch gesture from the change, and the execution unit is controlled to make the corresponding response.
In the invention the plant serves as the interactive input device, and interaction between person and plant is realized through touch. Compared with traditional human-computer interaction the operation is simple, and a better user experience can be provided.
In the invention the gesture acquisition unit is connected to the plant through a wire buried in the soil, exploiting the conductivity of the soil; this is simple to implement and causes no harm to the plant.
The control unit comprises:
a gesture database, which stores the mapping between different gesture actions and their corresponding electrical signals;
a control instruction library, which stores the mapping between different gesture actions and their corresponding control instructions;
and a gesture classifier, which searches the gesture database with the electrical signal acquired by the gesture acquisition unit to find the gesture action corresponding to that signal, and then searches the control instruction library with the found gesture action to obtain the corresponding control instruction.
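The two lookups performed by the control unit can be pictured with the following Python sketch; the gesture names and instruction identifiers are assumptions for illustration and do not appear in the patent text.

```python
# Illustrative control-unit lookups; gesture names and instruction identifiers are assumed.
GESTURE_TO_INSTRUCTION = {
    "pinch": "VIEW_ENVIRONMENT",   # e.g. show soil humidity, temperature, light intensity
    "grab": "WATER_PLANT",
    "hold": "SUPPLEMENT_LIGHT",
}

def control_unit(signal_features, gesture_classifier):
    """gesture_classifier embodies the gesture database: a trained signal-to-gesture mapping."""
    gesture = gesture_classifier.predict([signal_features])[0]   # electrical signal -> gesture action
    return GESTURE_TO_INSTRUCTION.get(gesture)                   # gesture action -> control instruction
```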
The gesture classifier comprises:
a feature extraction module, which extracts features from the electrical signal using the fast Fourier transform (FFT) to obtain a feature vector of the electrical signal;
and a support vector machine, which classifies the feature vector of the electrical signal and matches the corresponding gesture action through the gesture database.
A machine-learning optimization algorithm is used to extract the key features of the electrical signal so that it can be identified more accurately.
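A hedged sketch of this feature-extraction and classification step is shown below, assuming the feature vector is the FFT magnitude spectrum of the acquired signal and the classifier is a scikit-learn SVM; the bin count and normalization are illustrative choices, not values specified by the patent.

```python
import numpy as np
from sklearn.svm import SVC

def extract_features(raw_signal, n_bins=64):
    """Feature vector: the first n_bins magnitudes of the signal's FFT, normalized."""
    spectrum = np.abs(np.fft.rfft(np.asarray(raw_signal, dtype=float)))[:n_bins]
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)

def recognize_gesture(raw_signal, trained_svm: SVC):
    """Classify one acquired signal into a gesture label with the trained SVM."""
    return trained_svm.predict([extract_features(raw_signal)])[0]
```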
The gesture actions include pinching, touching, grabbing, holding, kneading and other gestures.
In one technical solution, the execution unit is an intelligent flowerpot. According to the corresponding control instruction, the flowerpot detects the growing-environment information of the plant planted in it and presents the detected information visually or by voice broadcast.
The growing-environment information may include soil humidity, temperature, light intensity and the like.
Furthermore, the intelligent flowerpot can water the plant planted in it and control its temperature and lighting according to the corresponding control instructions.
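A minimal execution-unit sketch for the flowerpot follows; the instruction names and the sensor/actuator interface are hypothetical stand-ins for a real hardware layer and are not taken from the patent.

```python
def flowerpot_execute(instruction, sensors, actuators):
    """Dispatch a control instruction to the flowerpot.
    sensors: dict of name -> zero-argument read callables (hypothetical interface).
    actuators: dict of name -> callables driving the hardware (hypothetical interface)."""
    if instruction == "VIEW_ENVIRONMENT":
        info = {name: read() for name, read in sensors.items()}
        print(info)                          # stand-in for visualization or voice broadcast
    elif instruction == "WATER_PLANT":
        actuators["pump"](seconds=5)         # assumed watering actuator
    elif instruction == "SUPPLEMENT_LIGHT":
        actuators["grow_light"](on=True)     # assumed lighting actuator

# Toy usage with stubbed sensors:
flowerpot_execute(
    "VIEW_ENVIRONMENT",
    sensors={"soil_humidity": lambda: 0.42, "temperature_c": lambda: 23.5, "light_lux": lambda: 310},
    actuators={},
)
```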
In another technical solution, the execution unit is a smart speaker, which performs the corresponding action according to the corresponding control instruction.
For example, the smart speaker can turn itself on or off, adjust the volume or switch functions according to the control instructions for switching on or off, adjusting the volume and switching functions.
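A comparable sketch for the smart-speaker execution unit is given below; the instruction identifiers and the internal state are assumptions made for illustration only.

```python
class SmartSpeaker:
    """Toy smart-speaker state machine driven by control instructions (names assumed)."""
    def __init__(self):
        self.on = False
        self.volume = 5
        self.functions = ["music", "radio", "podcast"]
        self.current = 0

    def execute(self, instruction):
        if instruction == "TOGGLE_POWER":
            self.on = not self.on
        elif instruction == "VOLUME_UP" and self.on:
            self.volume = min(self.volume + 1, 10)
        elif instruction == "VOLUME_DOWN" and self.on:
            self.volume = max(self.volume - 1, 0)
        elif instruction == "SWITCH_FUNCTION" and self.on:
            self.current = (self.current + 1) % len(self.functions)

speaker = SmartSpeaker()
speaker.execute("TOGGLE_POWER")   # gesture mapped to "turn on the smart speaker"
speaker.execute("VOLUME_UP")
```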
Compared with the prior art, the invention has the following beneficial effects:
(1) through direct contact between person and plant, combined with a machine-learning-optimized classification algorithm, the human-plant interaction device can accurately identify the gesture signal generated when a person touches the plant;
(2) the human-plant interaction device interacts with human touch by exploiting the plant body's force response and conductivity, which is unique and provides a reference for future modes of human-plant interaction;
(3) the human-plant interaction device breaks with the traditional human-computer interaction method: the plant is used as the interactive input device and interaction is carried out through touch. Compared with traditional human-computer interaction the operation is simple and the effect obvious, a better user experience can be provided, and the device has broad market prospects and high practical value.
Drawings
FIG. 1 is a schematic diagram of an embodiment of a human-plant interaction device;
FIG. 2 is a schematic diagram of a swept-frequency capacitive sensor;
FIG. 3 is a schematic diagram of the process of acquiring and recognizing gesture actions.
Detailed Description
The invention is described in further detail below with reference to the drawings and embodiments, which are intended to aid understanding of the invention without limiting it in any way.
As shown in FIG. 1, a human-plant interaction device according to an embodiment of the invention comprises:
a plant;
a swept-frequency capacitive sensor (Touché sensor) electrically connected to the plant, which acquires the different electrical signals generated by the plant when a person touches it with different gesture actions;
a control unit: the swept-frequency capacitive sensor transmits the acquired electrical signal wirelessly to the control unit; the SVM (support vector machine) classifier of the control unit extracts features from the electrical signal, matches the corresponding gesture action through the gesture database according to the extracted feature vector, matches the corresponding control instruction through the control instruction library according to that gesture action, and the control unit then transmits the control instruction to the terminal device;
and a terminal device, which reacts according to the control instruction.
First, the device acquires the capacitance signal generated when the user touches the plant; this process is triggered by human action. Using swept-frequency capacitive sensing, the frequency-dependent change in the sensing loop formed by the person and the plant is captured, and the signal is passed to the gesture recognition unit. The machine-learning algorithm of the gesture recognition unit processes the signal, extracts its key features and recognizes it accurately. A sample set is generated to build the gesture database, and the signals are classified with an SVM algorithm in combination with the gesture database, establishing the basis for real-time gesture recognition.
In actual operation, the device judges the gesture from the signal values and obtains the control instruction corresponding to that gesture. One instruction makes the intelligent flowerpot display environmental values of the plant such as soil humidity and temperature, while other gestures trigger the plant-care system to water the plant, supplement its lighting, and so on.
The swept-frequency capacitive sensor electrically connected to the plant is configured as shown in FIG. 2; the user interacts by touching a plant connected to the touch-sensor board. Since the plant itself is conductive and the soil is also conductive, the wire works simply by being buried in the soil, without being inserted into the plant. Swept-frequency capacitive sensing processes not a single datum but multi-dimensional data generated at different frequencies, and the resulting sampled signal is the capacitive profile of the touch interaction. The signal acquisition part consists of three modules: a swept-frequency signal generation module, a resonant network module and an envelope detection module. The swept-frequency signal generation module produces square-wave signals of varying frequency; the resonant network module converts the continuous variation of frequency into a variation of amplitude; and the envelope detection module detects the envelope signal to obtain the gesture signal to be acquired.
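The envelope-detection stage can be pictured with the short Python sketch below. It simulates only the last step of the chain under stated assumptions: a square-wave carrier whose amplitude has already been shaped by the resonant network is rectified and low-pass filtered to recover the envelope; the carrier frequency, sampling rate and filter cutoff are illustrative values, not parameters from the patent.

```python
import numpy as np
from scipy.signal import butter, filtfilt, square

fs = 1_000_000                                               # assumed sampling rate, 1 MHz
t = np.arange(0, 0.02, 1 / fs)                               # 20 ms window
envelope_true = 0.5 + 0.4 * np.sin(2 * np.pi * 50 * t)       # amplitude imposed by the resonant network
modulated = envelope_true * square(2 * np.pi * 100_000 * t)  # 100 kHz square-wave carrier

def detect_envelope(signal, fs_hz, cutoff_hz=500.0):
    """Rectify the modulated carrier and low-pass filter it to recover the envelope."""
    b, a = butter(4, cutoff_hz / (fs_hz / 2), btype="low")
    return filtfilt(b, a, np.abs(signal))

gesture_signal = detect_envelope(modulated, fs)              # approximates envelope_true
```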
FIG. 3 shows the process of acquiring and recognizing gesture actions. After the gesture signal has been acquired, its key features are extracted with a machine-learning optimization algorithm, and the extracted features are then classified and recognized. Given a training sample set D = {(x1, y1), (x2, y2), (x3, y3), ..., (xm, ym)} with yi ∈ {-1, +1}, where x denotes a feature vector, y a class label, and (x, y) one gesture sample, a separating hyperplane is sought in the sample space based on the training set D that separates samples of different classes, choosing the hyperplane with the best tolerance to local perturbations of the training samples. After training on D with the SVM, the gesture database is obtained.
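A hedged sketch of this training step with scikit-learn is shown below; the feature dimension and the randomly generated sample set D are purely illustrative stand-ins for real touch recordings, with the binary labels -1/+1 matching the definition above.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))               # 200 illustrative feature vectors (e.g. FFT features)
y = rng.choice([-1, 1], size=200)            # binary gesture labels y_i in {-1, +1}

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
classifier = SVC(kernel="rbf", C=1.0)        # maximum-margin separating hyperplane
classifier.fit(X_train, y_train)             # training on D yields the gesture database
print("hold-out accuracy:", classifier.score(X_test, y_test))
```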
A person touches the leaves and is thus in direct contact with the plant; swept-frequency capacitive sensing turns the touch into digital gesture information; the SVM classifier, together with the gesture database, judges the person's gesture in real time and obtains the corresponding control instruction, which finally controls the terminal device.
Example 1
In this embodiment the terminal device is the intelligent flowerpot. The flowerpot can detect the growing-environment information of the plant planted in it, for example soil humidity, temperature and light intensity, and can display this information. In addition, the intelligent flowerpot can water the plant planted in it, supplement its lighting, and so on, according to the instructions.
A person touches the plant planted in the intelligent flowerpot with a specific gesture, and this specific gesture corresponds to the control instruction "view the plant's growing environment". The swept-frequency capacitive sensor receives the electrical signal of this gesture and transmits it to the control unit; the control unit identifies the corresponding control instruction "view the plant's growing environment"; and, according to this instruction, the intelligent flowerpot detects the plant's soil humidity, temperature and light intensity and displays them.
After reading the soil humidity, temperature, light intensity and other information, the person can touch the plant with the required gesture as needed, thereby completing actions such as watering the plant or supplementing its lighting.
Example 2
In this embodiment the terminal device is a smart speaker. According to the on/off control instruction, the volume-adjustment control instruction and the function-switching control instruction, the smart speaker completes the corresponding actions of turning on or off, adjusting the volume, switching functions, and so on.
A person touches the plant with a specific gesture, and this specific gesture corresponds to the control instruction "turn on the smart speaker". The swept-frequency capacitive sensor receives the electrical signal of this gesture and transmits it to the control unit; the control unit identifies the corresponding control instruction "turn on the smart speaker"; and the smart speaker switches on according to that instruction.
The person can also control the smart speaker by touching the plant with the gesture actions corresponding to control instructions such as volume adjustment and function switching.
The embodiments described above illustrate the technical solutions and advantages of the invention. It should be understood that they are only specific embodiments of the invention and do not limit it; any modification, supplement or equivalent substitution made within the scope of the principles of the invention shall fall within the protection scope of the invention.

Claims (9)

1. A human-plant interaction device, comprising:
a plant;
a gesture acquisition unit electrically connected to the plant, which acquires the different electrical signals generated when a person touches the plant with different gesture actions;
a control unit, which identifies the electrical signal and generates a corresponding control instruction according to the identification result;
and an execution unit, which executes the corresponding action according to the control instruction.
2. The human-plant interaction device according to claim 1, wherein the gesture acquisition unit is a swept-frequency capacitive sensor.
3. The human-plant interaction device according to claim 2, wherein the swept-frequency capacitive sensor comprises:
a swept-frequency signal generation module, which generates a swept-frequency signal and applies it to the plant;
a capacitance receiving module, realized by the person touching the plant, through which the gesture action of touching the plant is taken as the input for the plant;
and an electrical signal acquisition module, which captures the capacitance change on the plant and obtains the electrical signal through filtering and demodulation.
4. The human-plant interaction device according to claim 1, wherein the control unit comprises:
a gesture database, which stores the mapping between different gesture actions and their corresponding electrical signals;
a control instruction library, which stores the mapping between different gesture actions and their corresponding control instructions;
and a gesture classifier, which searches the gesture database with the electrical signal acquired by the gesture acquisition unit to find the gesture action corresponding to that signal, and then searches the control instruction library with the found gesture action to obtain the corresponding control instruction.
5. The human-plant interaction device according to claim 1, wherein the gesture classifier comprises:
a feature extraction module, which extracts features from the electrical signal using the fast Fourier transform to obtain a feature vector of the electrical signal;
and a support vector machine, which classifies the feature vector of the electrical signal and matches the corresponding gesture action through the gesture database.
6. The human-plant interaction device according to any one of claims 1 to 5, wherein the gesture actions comprise pinching, touching, grabbing, holding, kneading and other gestures.
7. The human-plant interaction device according to claim 1, wherein the execution unit is an intelligent flowerpot; according to the corresponding control instruction, the intelligent flowerpot detects the growing-environment information of the plant planted in it and presents the detected information visually or by voice broadcast.
8. The human-plant interaction device according to claim 7, wherein the intelligent flowerpot waters the plant planted in it and controls its temperature and lighting according to the corresponding control instructions.
9. The human-plant interaction device according to claim 1, wherein the execution unit is a smart speaker; the smart speaker turns on or off, adjusts the volume and switches functions according to the control instructions for switching on or off, adjusting the volume and switching functions.
CN202010458890.6A 2020-05-26 2020-05-26 Human-plant interaction device Pending CN111766944A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010458890.6A CN111766944A (en) 2020-05-26 2020-05-26 Human-plant interaction device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010458890.6A CN111766944A (en) 2020-05-26 2020-05-26 Human-plant interaction device

Publications (1)

Publication Number Publication Date
CN111766944A true CN111766944A (en) 2020-10-13

Family

ID=72719620

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010458890.6A Pending CN111766944A (en) 2020-05-26 2020-05-26 Human-plant interaction device

Country Status (1)

Country Link
CN (1) CN111766944A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106601219A (en) * 2016-03-25 2017-04-26 浙江大学 Musical instrument for plant music playing and control method of musical instrument
CN206506920U (en) * 2016-11-04 2017-09-22 北京花花草草科技有限公司 A kind of intelligent flowerpot
CN207443583U (en) * 2017-08-10 2018-06-05 深圳市翠园科技有限公司 Plant interactive device and plant interaction systems
CN207266220U (en) * 2017-09-27 2018-04-20 广东万龙智能电子有限公司 A kind of flowerpot type Baffle Box of Bluetooth with wireless doorbell
CN109917834A (en) * 2017-12-12 2019-06-21 杨杰 A kind of plant monitoring method and device
CN108363323A (en) * 2018-03-12 2018-08-03 王珏 Self-powered animals and plants intelligence system
CN111198613A (en) * 2020-01-20 2020-05-26 江南大学 Music control device and method using bioelectric current

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
宝威科技: "SFCS technology turns plants into a multi-touch controller", 《HTTP://WWW.SINOBRAVE.COM/A/22/1806.HTML》 *

Similar Documents

Publication Publication Date Title
US10061389B2 (en) Gesture recognition system and gesture recognition method
CN103529762B (en) A kind of intelligent home furnishing control method based on sensor technology and system
CN103730116B (en) Intelligent watch realizes the system and method that intelligent home device controls
CN109857251A (en) Gesture identification control method, device, storage medium and the equipment of intelligent appliance
CN108711430A (en) Audio recognition method, smart machine and storage medium
CN102932212A (en) Intelligent household control system based on multichannel interaction manner
CN103948398B (en) Be applicable to the heart sound location segmentation method of android system
CN105353881A (en) Gesture recognition method and system based on RFID (radio frequency identification devices)
CN206117701U (en) Domestic appliance and control system thereof
CN103248703A (en) Automatic monitoring system and method for live pig action
Davies et al. Deep neural networks for appliance transient classification
CN104217718A (en) Method and system for voice recognition based on environmental parameter and group trend data
CN104123930A (en) Guttural identification method and device
CN108981115A (en) A kind of air purifier and control method based on machine learning
CN111597869A (en) Human activity recognition method based on grouping residual error joint space learning
CN111766944A (en) Human-plant interaction device
CN201812394U (en) Popular science device based on biological feature recognition technology
CN115438691A (en) Small sample gesture recognition method based on wireless signals
CN208335743U (en) A kind of intelligent robot Semantic interaction system based on white light communication and the cognition of class brain
CN107241846A (en) A kind of intelligent desk lamp control device and intelligent identification Method
Patel et al. Hand Gesture based Home Control Device using IoT.
CN111048092B (en) Voice control system and method of electronic toilet
CN108803880A (en) Control device based on brain signal and method
CN107861398A (en) The system and intelligent domestic system of a kind of voice control electric appliance
CN103050032A (en) Intelligent pointer

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201013