CN110248229A - Human-computer interaction method, system, and working method of the system - Google Patents

Human-computer interaction method, system, and working method of the system

Info

Publication number
CN110248229A
CN110248229A (application CN201910534679.5A)
Authority
CN
China
Prior art keywords
module
man
identification
gesture
main control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910534679.5A
Other languages
Chinese (zh)
Inventor
龚陈龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201910534679.5A
Publication of CN110248229A
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 - Structure of client; Structure of client peripherals
    • H04N 21/422 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42203 - Input-only peripherals connected to specially adapted client devices: sound input device, e.g. microphone
    • H04N 21/42204 - User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Selective Calling Equipment (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

The present invention provides a human-computer interaction method, a system, and a working method of the system. The human-computer interaction method includes: acquiring human-computer interaction information and recognizing it; confirming the operation intention expressed by the recognized information; and converting the confirmed operation intention into a corresponding control instruction that is sent to an actuating end for execution. Compared with the related art, the present invention controls a television set or set-top box through human-computer interaction, which effectively avoids the limitations of a remote controller and brings convenience to daily life. Moreover, when a gesture or body movement cannot be recognized, the intention can still be identified through voice information, so the degree of intelligence is higher.

Description

Human-computer interaction method, system, and working method of the system
Technical field
The present invention relates to the technical field of intelligent control, and more particularly to a human-computer interaction method, a system, and a working method of the system.
Background technique
Human-computer interaction refers to the process of information exchange between a person and a computer, carried out in a certain dialogue language and with a certain interaction mode in order to accomplish a given task.
In daily life, the control instructions for a television set mostly consist of channel switching, volume adjustment, and powering on or off. These instructions are usually issued through a remote controller. When a control instruction needs to be sent to the television set, the user often has to look around for the remote controller, which wastes time, delays the instruction, and brings inconvenience to daily life.
Therefore, a human-computer interaction method, a system, and a working method of the system are needed to solve the above problems.
Summary of the invention
The purpose of the present invention is to provide a human-computer interaction method that is easy to operate, a system, and a working method of the system.
The technical scheme of the present invention is a human-computer interaction method comprising the following steps (an illustrative sketch follows the list):
acquiring human-computer interaction information and recognizing it;
confirming the operation intention expressed by the recognized human-computer interaction information;
converting the confirmed operation intention into a corresponding control instruction and sending it to an actuating end for execution.
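A minimal sketch of these three steps in Python; the recognizer, the confidence threshold, and the send_to_actuating_end callable are illustrative assumptions, not part of the patent:

```python
# Sketch of the three claimed steps: acquire/recognize, confirm intention,
# convert to a control instruction and send it. All names are assumptions.

def acquire_and_recognize(raw_sensor_data):
    """Step 1: acquire interaction information and recognize it.
    A real system would run the gesture/voice recognizer here; we fake it."""
    return ("volume_up", 0.92)          # (recognized label, confidence)

def confirm_intention(label, confidence, threshold=0.8):
    """Step 2: confirm the operation intention; below the threshold the
    intention counts as 'not clear' and nothing is confirmed."""
    return label if confidence >= threshold else None

def execute(intention, send_to_actuating_end):
    """Step 3: convert the confirmed intention into a control instruction
    and hand it to the actuating end (e.g. a TV behind a USB HID link)."""
    if intention is not None:
        send_to_actuating_end({"instruction": intention.upper()})

# Example run with print() standing in for the actuating end.
execute(confirm_intention(*acquire_and_recognize(b"...")), print)
```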
Preferably, the human-computer interaction information includes gesture or body-movement information.
Preferably, the recognition of gesture or body-movement information is realized through microwave standing-wave direction detection and/or infrared or pyroelectric (thermal) infrared human-body sensing.
Preferably, the gestures or body movements can be preset, different gestures or body movements correspond to different control instructions, and a body-movement control library is established.
Preferably, the human-computer interaction information includes voice information, which is acquired and recognized by a speech recognition module with noise reduction and audio amplification.
Preferably, the control instruction controls the actuating end through a key-function mapping based on the HID protocol.
The present invention also provides a human-computer interaction system implementing the above human-computer interaction method, comprising:
a main control chip;
module A: a recognition module signal-connected to the main control chip, for recognizing gestures or body movements;
module B: a recognition module signal-connected to the main control chip, for recognizing voice information;
module C: connected to the main control chip and provided as a USB interface;
an actuating end: signal-connected to the main control chip through module C.
Preferably, module A includes a microwave sensor and an infrared human-body sensor.
Preferably, module B is a MIC sound pickup. Such a pickup is designed for the MIC-level interface of a network camera and can also be connected to the MIC port of a PC. Its working principle and circuit are essentially the same as those of a headset microphone used with a computer or mobile phone; the main difference lies in the housing. A so-called passive pickup is not truly passive, because it uses an electret microphone, which is usually powered through a 2200-ohm resistor.
The present invention also provides a working method of the above human-computer interaction system, comprising the following steps:
(1) the user makes the gesture or body movement for the desired operation;
(2) module A acquires the specific gesture or body-movement information;
(3) the main control chip recognizes the gesture or body-movement information acquired by module A and, once the operation intention is clear, sends the control instruction corresponding to that intention to the actuating end, which executes the corresponding action;
(4) when the main control chip determines that the recognized gesture or body-movement information does not express a clear operation intention, module C transmits a control instruction to the actuating end to activate its mute function, and module B is started;
(5) module B prompts the user to state the intended operation, allowing the user to issue voice information;
(6) module B acquires the voice information, which is recognized by the main control chip;
(7) the main control chip controls module C to upload the voice information to a speech recognition cloud; after the relevant voice is matched and recognized, the result is sent to the actuating end, which executes the corresponding action.
Compared with the related art, the beneficial effects of the present invention are as follows: the television set or set-top box is controlled through human-computer interaction, which effectively avoids the limitations of a remote controller and brings convenience to daily life. When a gesture or body movement cannot be recognized, the intention can still be identified through voice information, so the degree of intelligence is higher. With the arrival of the 5G era, the invention brings a new interaction experience to users and allows people to control devices through human-computer interaction more quickly and conveniently.
Detailed description of the invention
Fig. 1 is a structural schematic diagram of the human-computer interaction system provided by the present invention;
Fig. 2 is a schematic workflow diagram of the human-computer interaction system provided by the present invention.
Specific embodiment
The present invention is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
The present invention provides a human-computer interaction method, comprising:
acquiring human-computer interaction information and recognizing it; the human-computer interaction information includes gesture or body-movement information, and the recognition of gesture or body-movement information is realized through microwave standing-wave direction detection and/or infrared or pyroelectric (thermal) infrared human-body sensing.
The human-computer interaction information further includes voice information, which is acquired and recognized by a speech recognition module with noise reduction and audio amplification.
The gestures or body movements can be preset, different gestures or body movements correspond to different control instructions, and a body-movement and voice control library is established.
The method further comprises: confirming the operation intention expressed by the recognized human-computer interaction information; and converting the confirmed operation intention into a corresponding control instruction. The control instruction controls the actuating end through a key-function mapping based on the HID protocol, so that the actuating end executes the corresponding action.
As shown in Fig. 1, the human-computer interaction system provided by the present invention includes a main control chip, module A, module B, module C, and an actuating end.
The main control chip is a microcontroller unit (MCU).
Module A is signal-connected to the main control chip and serves as the recognition module for gestures or body movements. Module A includes a microwave sensor and an infrared human-body sensor or pyroelectric infrared sensor, and realizes recognition of a person's gestures or body movements based on microwave standing-wave direction detection and infrared human-body sensing. Module A also has artificial-intelligence self-learning capability, i.e. a deep-learning module that obtains recognition capability, algorithm-based compensation, and self-learning through cloud computing. As the user keeps using this interaction mode, the recognition of the user's gestures and body movements becomes better and better and the accuracy rate keeps rising, which effectively reduces misrecognition and misoperation.
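One simple way to picture this self-learning behaviour is a per-gesture confidence threshold that adapts to user feedback; this is purely an assumption for illustration, since the patent only states that accuracy improves with use and that the computation is backed by a cloud server:

```python
# Sketch of the self-learning idea: keep per-gesture recognition statistics
# (synced with a cloud back end in the real system) and adapt the confidence
# threshold per user. The update rule below is an assumption.

from collections import defaultdict

class AdaptiveGestureRecognizer:
    def __init__(self, base_threshold=0.80):
        self.threshold = defaultdict(lambda: base_threshold)
        self.stats = defaultdict(lambda: {"accepted": 0, "corrected": 0})

    def accept(self, gesture, confidence):
        """Accept the gesture only above its current per-user threshold."""
        return confidence >= self.threshold[gesture]

    def feedback(self, gesture, was_correct):
        """After the user confirms or corrects an action, nudge the threshold:
        frequent corrections make the recognizer more conservative."""
        s = self.stats[gesture]
        s["accepted" if was_correct else "corrected"] += 1
        error_rate = s["corrected"] / (s["accepted"] + s["corrected"])
        self.threshold[gesture] = min(0.95, max(0.6, 0.8 + 0.15 * error_rate - 0.05))

r = AdaptiveGestureRecognizer()
r.feedback("swipe_up", was_correct=True)
print(r.accept("swipe_up", 0.82))   # True with the slightly relaxed threshold
```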
Module B is signal-connected to the main control chip and serves as the recognition module for voice information. Its main function is speech recognition, and it includes a 360-degree omnidirectional MIC pickup and noise-reduction technology.
Module C is connected to the main control chip and is a USB interface; the actuating end is signal-connected to the main control chip through module C. The main functions of module C are: supplying power to modules A and B; establishing the connection and data transmission with the actuating end; and performing key-function mapping control of the actuating end (a television set, set-top box, etc.) through the HID protocol.
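As an illustration of such key-function mapping, the sketch below sends standard 8-byte HID boot-keyboard reports, assuming the main control chip exposes its USB link to the television as a HID keyboard gadget (for example /dev/hidg0 on a Linux-based controller). Which usage code the television or set-top box honours for each instruction is an assumption here, not something the patent specifies:

```python
# Key-function mapping over the HID protocol: each control instruction is
# sent as an 8-byte boot-keyboard report (modifier, reserved, six key slots).
# Usage IDs come from the HID keyboard/keypad page; the instruction-to-key
# assignment below is an assumption.

HID_USAGE = {
    "VOLUME_UP":    0x80,   # Keyboard Volume Up
    "VOLUME_DOWN":  0x81,   # Keyboard Volume Down
    "MUTE":         0x7F,   # Keyboard Mute
    "CHANNEL_UP":   0x52,   # Up Arrow, assumed to map to channel up on the TV
    "CHANNEL_DOWN": 0x51,   # Down Arrow, assumed to map to channel down
}

def hid_report(usage_id: int) -> bytes:
    """Build a boot-keyboard report with a single pressed key (or none)."""
    return bytes([0x00, 0x00, usage_id, 0x00, 0x00, 0x00, 0x00, 0x00])

def send_instruction(instruction: str, hid_device: str = "/dev/hidg0") -> None:
    """Write a key-press report followed by a key-release report."""
    with open(hid_device, "wb") as dev:
        dev.write(hid_report(HID_USAGE[instruction]))   # press
        dev.write(hid_report(0x00))                     # release

# Example (requires a configured USB HID gadget endpoint):
# send_instruction("VOLUME_UP")
```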
The microwave sensor and the infrared sensor or pyroelectric infrared sensor in module A project an infrared matrix; the user's movements and their speed produce corresponding data variations in this matrix. According to a corresponding algorithm, the magnitude of these variations, together with the velocity and acceleration within the infrared matrix, yields a motion vector. This vector, combined with historical data held by the cloud server, is analyzed intelligently by a fuzzy recognition algorithm to obtain the user's true intention, and the corresponding control instruction is then issued. The immersive experience of the microwave sensor and infrared matrix means that the user's body motion is fully immersed in the microwave standing wave and the infrared dot matrix. This dot matrix is a spatial volume in which the infrared points are arranged according to a certain fluctuation rule, rather than a linear or planar structure, so it can faithfully reflect the user's operation intention. In addition, the artificial-intelligence back end continuously learns and evaluates the user's habits, so the success rate of module A becomes higher and higher with use.
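One plausible reading of the vector computation above is that consecutive infrared-matrix frames are compared and the shift of the activity centroid yields the motion vector; the concrete algorithm is not disclosed, so the frame-differencing sketch below is an assumption:

```python
# Sketch: derive a coarse motion vector from two infrared-matrix frames by
# comparing the centroids of their activity.

def activity_centroid(frame):
    """Centroid of per-cell intensity in a 2-D matrix of readings."""
    total = x_sum = y_sum = 0.0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            total += value
            x_sum += x * value
            y_sum += y * value
    if total == 0:
        return None
    return (x_sum / total, y_sum / total)

def motion_vector(prev_frame, next_frame):
    """Vector between the activity centroids of consecutive frames."""
    a, b = activity_centroid(prev_frame), activity_centroid(next_frame)
    if a is None or b is None:
        return (0.0, 0.0)
    return (b[0] - a[0], b[1] - a[1])

prev = [[0, 0, 0], [5, 1, 0], [0, 0, 0]]   # activity on the left
nxt  = [[0, 0, 0], [0, 1, 5], [0, 0, 0]]   # activity shifted right
print(motion_vector(prev, nxt))             # roughly (+1.67, 0.0) -> "swipe right"
```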
The speech recognition module of module B has noise-reduction and audio-amplification functions. It can collect voice at a distance of 1.8 to 3 meters at normal speaking volume, package the sampled speech into 16-bit, 32 kHz packets, and transmit them through the USB port connected to the television set (or set-top box). On first connection, the USB interface and the television set (or set-top box) complete a generic pairing according to the standard HID protocol. The USB interface then uploads the relevant data through the device (television set, set-top box, etc.) to the device's own speech recognition cloud; the recognition result for the voice is returned to the device (television set or set-top box), and human-computer interaction control of the device is thereby realized.
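A sketch of the described transmission path, assuming the 16-bit, 32 kHz PCM speech is packaged into fixed-size packets and posted to a speech-recognition cloud over HTTP; the endpoint URL, packet length, and header names are assumptions, since the patent only specifies the sample format and the USB/HID transport:

```python
# Sketch: package 16-bit / 32 kHz PCM speech into fixed-size packets and
# upload them for cloud recognition. The endpoint URL, packet size, and
# header names are assumptions; only the sample format comes from the text.

import json
import urllib.request

SAMPLE_RATE_HZ = 32_000
BYTES_PER_SAMPLE = 2            # 16-bit PCM
PACKET_MS = 100                 # assumed packet length
PACKET_BYTES = SAMPLE_RATE_HZ * BYTES_PER_SAMPLE * PACKET_MS // 1000

def packets(pcm: bytes):
    """Split a PCM buffer into fixed-size packets for transmission."""
    for offset in range(0, len(pcm), PACKET_BYTES):
        yield pcm[offset:offset + PACKET_BYTES]

def upload_for_recognition(pcm: bytes, url="https://example.invalid/asr"):
    """POST each packet to a (hypothetical) speech-recognition cloud and
    return the decoded result of the final response."""
    result = None
    for chunk in packets(pcm):
        req = urllib.request.Request(
            url, data=chunk,
            headers={"Content-Type": "application/octet-stream",
                     "X-Sample-Rate": str(SAMPLE_RATE_HZ)})
        with urllib.request.urlopen(req) as resp:
            result = json.loads(resp.read().decode("utf-8"))
    return result   # e.g. {"text": "volume up"} mapped to a control instruction

# result = upload_for_recognition(pcm_bytes)   # hypothetical endpoint
```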
As shown in Fig. 2, the working method of the human-computer interaction system provided by the present invention comprises the following steps (a control-flow sketch follows the list):
(1) the user makes the gesture or body movement for the desired operation;
(2) module A acquires the specific gesture or body-movement information;
(3) the main control chip recognizes the gesture or body-movement information acquired by module A and, once the operation intention is clear, sends the control instruction corresponding to that intention to the actuating end (television set or set-top box), which executes the corresponding action;
(4) when the main control chip determines that the recognized gesture or body-movement information does not express a clear operation intention, module C transmits a control instruction to the actuating end to activate its mute function, and module B is started;
(5) module B prompts the user to state the intended operation, allowing the user to issue voice information;
(6) module B acquires the voice information, which is recognized by the main control chip;
(7) the main control chip controls module C to upload the voice information to the speech recognition cloud; after the relevant voice is matched and recognized, the result is sent to the actuating end (television set or set-top box), which executes the corresponding action.
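The following sketch mirrors steps (1) to (7); the module objects and their method names are stand-ins, since the patent describes behaviour rather than an API:

```python
# Control-flow sketch of steps (1)-(7); every method name is an assumption.

def interaction_cycle(module_a, module_b, module_c, main_control):
    # (1)-(2) the user gestures; module A captures the gesture / body movement
    gesture = module_a.capture()

    # (3) the main control chip recognizes it and checks whether the intention is clear
    intention = main_control.recognize_gesture(gesture)
    if intention is not None:
        module_c.send_to_actuating_end(intention)          # TV / set-top box acts
        return intention

    # (4) unclear intention: mute the actuating end and start module B
    module_c.send_to_actuating_end("MUTE")
    module_b.start()

    # (5)-(6) prompt the user and capture the spoken request
    module_b.prompt("Please say what you want to do")
    speech = module_b.capture()

    # (7) upload through module C to the speech-recognition cloud, then execute
    intention = module_c.upload_to_cloud(speech)
    module_c.send_to_actuating_end(intention)
    return intention

if __name__ == "__main__":
    class Stub:                      # trivial stand-ins so the sketch runs
        def capture(self): return "wave"
        def start(self): pass
        def prompt(self, msg): print(msg)
        def send_to_actuating_end(self, instruction): print("->", instruction)
        def upload_to_cloud(self, speech): return "VOLUME_UP"
    class MCU:
        def recognize_gesture(self, gesture): return None  # force the voice fallback
    interaction_cycle(Stub(), Stub(), Stub(), MCU())
```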
In standby, module A identifies the user's direction and position through the microwave sensor and the infrared human-body sensor. The user triggers the microwave sensor and infrared human-body sensor in module A with a specific gesture or body movement and then operates with the preset gestures or body movements. The preset movements can be set according to the user's own habits, so that the user builds a personal body-language control library; for example, left/right and up/down gestures replace the channel and volume controls of the television set (or set-top box). Meanwhile, a specific (configurable) gesture can start the voice recognition function of module B; after receiving the user's voice instruction, module B works and controls the actuating end (television set, set-top box, etc.).
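A small sketch of such a personal control library, with preset channel/volume gestures, user-defined additions, and a configurable gesture that hands over to module-B voice recognition; all gesture names and instruction codes here are assumptions:

```python
# Sketch of a personal body-language control library: preset left/right and
# up/down gestures map to channel and volume controls, the user may register
# custom gestures, and one configurable gesture starts voice recognition.

DEFAULT_GESTURE_LIBRARY = {
    "swipe_right": "CHANNEL_UP",
    "swipe_left":  "CHANNEL_DOWN",
    "swipe_up":    "VOLUME_UP",
    "swipe_down":  "VOLUME_DOWN",
}
VOICE_TRIGGER_GESTURE = "palm_hold"     # the configurable "start voice" gesture

def build_library(user_habits=None):
    """Merge the presets with the user's own habitual movements."""
    library = dict(DEFAULT_GESTURE_LIBRARY)
    library.update(user_habits or {})
    return library

def handle_gesture(gesture, library, start_voice, send_instruction):
    if gesture == VOICE_TRIGGER_GESTURE:
        start_voice()                              # hand over to module B
    elif gesture in library:
        send_instruction(library[gesture])         # e.g. CHANNEL_UP over HID

# A user who prefers a circular motion for mute registers it once:
lib = build_library({"circle": "MUTE"})
handle_gesture("circle", lib, start_voice=lambda: None, send_instruction=print)
```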
The above description is only an embodiment of the present invention and is not intended to limit its scope. Any equivalent structure or equivalent process transformation made using the contents of the specification and accompanying drawings, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of the present invention.

Claims (10)

1. A human-computer interaction method, characterized by comprising:
acquiring human-computer interaction information and recognizing it;
confirming the operation intention expressed by the recognized human-computer interaction information;
converting the confirmed operation intention into a corresponding control instruction and sending it to an actuating end for execution.
2. The human-computer interaction method according to claim 1, characterized in that the human-computer interaction information includes gesture or body-movement information.
3. The human-computer interaction method according to claim 2, characterized in that the recognition of gesture or body-movement information is realized through microwave standing-wave direction detection and/or infrared or pyroelectric (thermal) infrared human-body sensing.
4. The human-computer interaction method according to claim 2, characterized in that the gestures or body movements can be preset, different gestures or body movements correspond to different control instructions, and a body-movement control library is established.
5. The human-computer interaction method according to claim 1, characterized in that the human-computer interaction information includes voice information, which is acquired and recognized by a speech recognition module with noise reduction and audio amplification.
6. The human-computer interaction method according to claim 1, characterized in that the control instruction performs key-function mapping control of the actuating end through the HID protocol.
7. A human-computer interaction system implementing the human-computer interaction method according to any one of claims 1 to 6, characterized by comprising:
a main control chip;
module A: a recognition module signal-connected to the main control chip, for recognizing gestures or body movements;
module B: a recognition module signal-connected to the main control chip, for recognizing voice information;
module C: connected to the main control chip and provided as a USB interface;
an actuating end: signal-connected to the main control chip through module C.
8. The human-computer interaction system according to claim 7, characterized in that module A includes a microwave sensor and an infrared human-body sensor.
9. The human-computer interaction system according to claim 7, characterized in that module B is a MIC sound pickup.
10. A working method of the human-computer interaction system according to any one of claims 7 to 9, characterized by comprising the following steps:
(1) the user makes the gesture or body movement for the desired operation;
(2) module A acquires the specific gesture or body-movement information;
(3) the main control chip recognizes the gesture or body-movement information acquired by module A and, once the operation intention is clear, sends the control instruction corresponding to that intention to the actuating end, which executes the corresponding action;
(4) when the main control chip determines that the recognized gesture or body-movement information does not express a clear operation intention, module C transmits a control instruction to the actuating end to activate its mute function, and module B is started;
(5) module B prompts the user to state the intended operation, allowing the user to issue voice information;
(6) module B acquires the voice information, which is recognized by the main control chip;
(7) the main control chip controls module C to upload the voice information to the speech recognition cloud; after the relevant voice is matched and recognized, the result is sent to the actuating end, which executes the corresponding action.
CN201910534679.5A 2019-06-18 2019-06-18 Human-computer interaction method, system, and working method of the system Pending CN110248229A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910534679.5A CN110248229A (en) 2019-06-18 2019-06-18 A kind of working method of man-machine interaction method, system and the system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910534679.5A CN110248229A (en) 2019-06-18 2019-06-18 A kind of working method of man-machine interaction method, system and the system

Publications (1)

Publication Number Publication Date
CN110248229A true CN110248229A (en) 2019-09-17

Family

ID=67888263

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910534679.5A Pending CN110248229A (en) 2019-06-18 2019-06-18 A kind of working method of man-machine interaction method, system and the system

Country Status (1)

Country Link
CN (1) CN110248229A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011065652A (en) * 2004-05-14 2011-03-31 Honda Motor Co Ltd Sign based man-machine interaction
CN108762512A (en) * 2018-08-17 2018-11-06 浙江核聚智能技术有限公司 Human-computer interaction device, method and system
CN109410940A (en) * 2018-12-05 2019-03-01 湖北安心智能科技有限公司 A kind of man-machine interaction method and system based on indication control board

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110824989A (en) * 2019-11-11 2020-02-21 路邦科技授权有限公司 Head sensing control device for controlling robot


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190917