CN110286771A - Interaction method and device, intelligent robot, electronic equipment and storage medium - Google Patents
- Publication number
- CN110286771A (application CN201910578601.3A)
- Authority
- CN
- China
- Prior art keywords
- user
- interaction
- intelligent robot
- interactive
- interaction data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/226—Procedures used during a speech recognition process, e.g. man-machine dialogue using non-speech characteristics
- G10L2015/227—Procedures used during a speech recognition process, e.g. man-machine dialogue using non-speech characteristics of the speaker; Human-factor methodology
Abstract
The invention provides an interaction method, an interaction device, an intelligent robot, electronic equipment, and a storage medium. The method is executed by the intelligent robot, which is provided with an interaction unit and a display unit that are separately arranged and carry out data communication with each other; the interaction unit comprises a camera, and the display unit comprises a display screen. The method comprises the steps of: collecting an environment image using the camera; when a user gazing at the intelligent robot appears in the environment image, executing a preset interaction process to determine features of the user; determining interaction data corresponding to the user features; and, based on the corresponding interaction data, interacting with the user using the interaction unit in combination with the display screen. The invention can carry out interaction and display synchronously, improves interaction efficiency, realizes targeted interaction according to the interaction habits of different users, meets the personalized interaction requirements of users, and improves the intelligent interaction effect of the intelligent robot.
Description
Technical field
The present invention relates to the technical field of electronic equipment, and in particular to an interaction method and device, an intelligent robot, electronic equipment, and a storage medium.
Background technique
With the development of artificial intelligence (AI) technology, intelligent robots have emerged and bring great convenience to daily life. To enrich the application scenarios of intelligent robots, related technologies have developed intelligent robots that provide services in public places.
In the related art, an intelligent robot usually interacts with the users in its environment in a generic way: for example, it plays pre-recorded generic inquiry voices to question the users, or displays an interactive interface showing generic inquiry text. In this mode, the robot cannot interact in a targeted way according to the interaction habits of different users, and cannot meet users' personalized interaction demands.
Summary of the invention
The present invention aims to solve at least some of the technical problems in the related art.
To this end, an object of the present invention is to provide an interaction method and device, an intelligent robot, electronic equipment, and a storage medium, which can carry out interaction and display synchronously, improve interaction efficiency, realize targeted interaction according to the interaction habits of different users, meet the personalized interaction demands of users, and improve the intelligent interaction effect of the intelligent robot.
To achieve the above objectives, a first-aspect embodiment of the present invention proposes an interaction method. The method is executed by an intelligent robot equipped with an interaction unit and a display unit that are separately arranged and carry out data communication with each other; the interaction unit includes a camera, and the display unit includes a display screen. The method comprises: collecting an environment image using the camera; when a user gazing at the intelligent robot appears in the environment image, executing a preset interaction process to determine user features; determining interaction data corresponding to the user features; and, based on the corresponding interaction data, interacting with the user using the interaction unit in combination with the display screen.
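The claimed steps can be sketched as a short loop. This is a minimal illustrative sketch only: the `Robot` stub and all function names are assumptions, since the patent does not specify an implementation language or API.

```python
# Minimal sketch of the claimed interaction flow. The Robot stub and
# all names are illustrative assumptions, not part of the patent.

class Robot:
    """Toy stand-in for the interaction unit + display unit pair."""

    def __init__(self, profiles):
        self.profiles = profiles      # user feature -> interaction data
        self.shown = None             # what the display screen last showed

    def find_gazing_user(self, image):
        # Placeholder: the patent uses face recognition + a gaze check here.
        return image.get("gazing_user")

    def preset_interaction(self, user):
        # Placeholder: determine a voice feature or body feature of the user.
        return user["feature"]

    def match_interaction_data(self, feature):
        return self.profiles.get(feature, "generic greeting")

    def interact(self, data):
        self.shown = data             # display screen presents the data

def run_interaction(robot, image):
    user = robot.find_gazing_user(image)      # precondition of step S102
    if user is None:
        return False
    feature = robot.preset_interaction(user)  # determine user features
    data = robot.match_interaction_data(feature)
    robot.interact(data)                      # interaction unit + display screen
    return True
```

The point of the sketch is the ordering the claim fixes: feature determination is triggered only once a gazing user is found, and the matched interaction data drives both interaction and display.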
In some embodiments, executing the preset interaction process to determine the user features comprises:
executing a voice interaction process to determine a user voice feature as the user feature; or
executing the preset interaction process to identify a body feature of the user as the user feature, the body feature including a facial feature and/or a fingerprint feature.
In some embodiments, the interaction data is associated with a first feature, and determining the interaction data corresponding to the user feature comprises:
determining whether a first feature matching the user feature exists;
if the matching first feature exists, taking the interaction data associated with the matching first feature as the corresponding interaction data;
if no matching first feature exists, guiding the user to input a user feature;
generating, in combination with preset interaction rules, interaction data corresponding to the user feature;
associating the user feature with the corresponding interaction data, and supplementing the existing interaction data and associated first features according to the associated user feature and corresponding interaction data.
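The match-or-register branch above can be sketched as follows. The similarity measure and the 0.8 threshold are illustrative assumptions; the patent does not fix a concrete matching rule.

```python
# Sketch of matching a user feature against stored "first features".
# The toy similarity function and the 0.8 threshold are assumptions.

def similarity(a, b):
    # Toy similarity over equal-length feature vectors (1.0 = identical).
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def get_interaction_data(user_feature, store, rules, threshold=0.8):
    # store: list of (first_feature, interaction_data) associations
    for first_feature, data in store:
        if similarity(user_feature, first_feature) >= threshold:
            return data                     # a matching first feature exists
    # No match: generate data from the preset rules, then supplement
    # the store with the new association.
    data = rules(user_feature)
    store.append((user_feature, data))
    return data
```

Supplementing the store on a miss is what lets the robot recognize the same user on later visits and serve personalized interaction data.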
In some embodiments, when a user gazing at the intelligent robot appears in the environment image, executing the preset interaction process to determine the user features comprises:
when there are multiple users, determining the continuous duration for which each user gazes at the intelligent robot, each duration being less than or equal to a time threshold;
determining a target duration from the multiple durations;
executing the preset interaction process to determine the user features of the user to whom the target duration belongs.
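The multi-user selection step above might be realized as follows. Note the assumption: the patent leaves the selection rule open, and taking the longest eligible gaze duration as the target duration is only one plausible reading.

```python
# Sketch of picking the target user among several gazers. Taking the
# longest gaze duration within the time threshold is an assumed rule;
# the patent does not specify how the target duration is chosen.

def pick_target(durations, threshold):
    # durations: {user_id: continuous gaze duration in seconds}
    eligible = {u: d for u, d in durations.items() if d <= threshold}
    if not eligible:
        return None
    # The user owning the target duration undergoes the preset
    # interaction process.
    return max(eligible, key=eligible.get)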
In some embodiments, when a user gazing at the intelligent robot appears in the environment image, executing the preset interaction process to determine the user features comprises:
when there are multiple users, playing a pre-recorded voice, the pre-recorded voice being used to guide a target user to input voice data;
determining position information of the target user who uttered the voice data;
executing, based on the position information, the preset interaction process to determine the features of the target user.
In some embodiments, executing the preset interaction process based on the position information to determine the features of the target user comprises:
adjusting the orientation of the interaction unit according to the position information;
executing the preset interaction process using the interaction unit after the orientation adjustment, to determine the features of the target user.
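The orientation-adjustment step can be sketched numerically. Computing a pan angle from a 2-D position offset is an assumed concrete form; the patent only says the orientation is adjusted "according to the position information".

```python
# Sketch of turning the interaction unit toward the located speaker.
# Reducing "position information" to a 2-D offset and a pan angle is
# an illustrative assumption.

import math

def pan_angle(speaker_xy, robot_xy=(0.0, 0.0)):
    """Angle (degrees) the interaction unit should rotate to face the
    speaker, measured from the robot's +x axis."""
    dx = speaker_xy[0] - robot_xy[0]
    dy = speaker_xy[1] - robot_xy[1]
    return math.degrees(math.atan2(dy, dx))
```

After rotation, the camera and microphone of the interaction unit face the target user, so the subsequent feature capture applies to that user rather than to bystanders.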
With the interaction method proposed by the first-aspect embodiment of the present invention, when a user gazing at the intelligent robot appears in the environment image, a preset interaction process is executed to determine the user features, interaction data corresponding to the user features is determined, and the user is interacted with based on the corresponding interaction data; moreover, the interaction and display functions are carried by different screens, so interaction and display can proceed synchronously, which improves interaction efficiency, realizes targeted interaction according to the interaction habits of different users, meets the personalized interaction demands of users, and improves the intelligent interaction effect of the intelligent robot.
To achieve the above objectives, a second-aspect embodiment of the present invention proposes an interaction device. The device is applied to an intelligent robot equipped with an interaction unit and a display unit that are separately arranged and carry out data communication with each other; the interaction unit includes a camera, and the display unit includes a display screen. The device comprises: an acquisition module, which collects an environment image via the camera; an execution module, which executes a preset interaction process to determine user features when a user gazing at the intelligent robot appears in the environment image; a first determining module, which determines interaction data corresponding to the user features; and an interaction module, which interacts with the user via the interaction unit in combination with the display screen, based on the corresponding interaction data.
In some embodiments, the execution module comprises:
a voice feature submodule, which executes a voice interaction process to determine a user voice feature as the user feature; or
a body feature submodule, which executes the preset interaction process to identify a body feature of the user as the user feature, the body feature including a facial feature and/or a fingerprint feature.
In some embodiments, the interaction data is associated with a first feature, and the interaction device further includes:
a second determining module, which determines whether a first feature matching the user feature exists and, when the matching first feature exists, takes the interaction data associated with the matching first feature as the corresponding interaction data;
a guiding module, which guides the user to input a user feature when no matching first feature exists;
a generation module, which generates interaction data corresponding to the user feature in combination with preset interaction rules;
an association module, which associates the user feature with the corresponding interaction data and supplements the existing interaction data and associated first features according to the associated user feature and corresponding interaction data.
In some embodiments, the execution module is further configured to: when there are multiple users, determine the continuous duration for which each user gazes at the intelligent robot, each duration being less than or equal to a time threshold; determine a target duration from the multiple durations; and execute the preset interaction process to determine the user features of the user to whom the target duration belongs.
In some embodiments, the execution module is further configured to:
when there are multiple users, play a pre-recorded voice, the pre-recorded voice being used to guide a target user to input voice data;
determine position information of the target user who uttered the voice data;
execute, based on the position information, the preset interaction process to determine the features of the target user.
In some embodiments, the execution module is further configured to:
adjust the orientation of the interaction unit according to the position information;
execute the preset interaction process using the interaction unit after the orientation adjustment, to determine the features of the target user.
With the interaction device proposed by the second-aspect embodiment of the present invention, when a user gazing at the intelligent robot appears in the environment image, a preset interaction process is executed to determine the user features, interaction data corresponding to the user features is determined, and the user is interacted with based on the corresponding interaction data; moreover, the interaction and display functions are carried by different screens, so interaction and display can proceed synchronously, which improves interaction efficiency, realizes targeted interaction according to the interaction habits of different users, meets the personalized interaction demands of users, and improves the intelligent interaction effect of the intelligent robot.
To achieve the above objectives, a third-aspect embodiment of the present invention proposes electronic equipment comprising a housing, a processor, a memory, a circuit board, and a power circuit, wherein the circuit board is arranged inside the space enclosed by the housing, and the processor and the memory are arranged on the circuit board; the power circuit supplies power to each circuit or device of the electronic equipment; the memory stores executable program code; and the processor, by reading the executable program code stored in the memory, runs a program corresponding to the executable program code so as to execute: collecting an environment image; when a user gazing at the intelligent robot appears in the environment image, executing a preset interaction process to determine user features; determining interaction data corresponding to the user features; and interacting with the user based on the corresponding interaction data.
With the electronic equipment proposed by the third aspect of the present invention, when a user gazing at the intelligent robot appears in the environment image, a preset interaction process is executed to determine the user features, interaction data corresponding to the user features is determined, and the user is interacted with based on the corresponding interaction data; moreover, the interaction and display functions are carried by different screens, so interaction and display can proceed synchronously, which improves interaction efficiency, realizes targeted interaction according to the interaction habits of different users, meets the personalized interaction demands of users, and improves the intelligent interaction effect of the intelligent robot.
To achieve the above objectives, a fourth-aspect embodiment of the present invention proposes a computer-readable storage medium; when instructions in the storage medium are executed by a processor of electronic equipment, the electronic equipment is enabled to execute the interaction method proposed by the first-aspect embodiment of the present invention.
With the computer-readable storage medium proposed by the fourth-aspect embodiment of the present invention, when a user gazing at the intelligent robot appears in the environment image, a preset interaction process is executed to determine the user features, interaction data corresponding to the user features is determined, and the user is interacted with based on the corresponding interaction data; moreover, the interaction and display functions are carried by different screens, so interaction and display can proceed synchronously, which improves interaction efficiency, realizes targeted interaction according to the interaction habits of different users, meets the personalized interaction demands of users, and improves the intelligent interaction effect of the intelligent robot.
To achieve the above objectives, a fifth-aspect embodiment of the present invention proposes an intelligent robot, comprising: a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor, when executing the program, implements the interaction method proposed by the first-aspect embodiment of the present invention.
With the intelligent robot proposed by the fifth-aspect embodiment of the present invention, when a user gazing at the intelligent robot appears in the environment image, a preset interaction process is executed to determine the user features, interaction data corresponding to the user features is determined, and the user is interacted with based on the corresponding interaction data; moreover, the interaction and display functions are carried by different screens, so interaction and display can proceed synchronously, which improves interaction efficiency, realizes targeted interaction according to the interaction habits of different users, meets the personalized interaction demands of users, and improves the intelligent interaction effect of the intelligent robot.
Additional aspects and advantages of the present invention will be set forth in part in the following description, and will in part become apparent from the description or be learned through practice of the invention.
Brief description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of the embodiments in conjunction with the accompanying drawings, in which:
Fig. 1 is a flow diagram of an interaction method proposed by an embodiment of the present invention;
Fig. 2 is a structural schematic diagram of the intelligent robot in an embodiment of the present invention;
Fig. 3 is a flow diagram of an interaction method proposed by another embodiment of the present invention;
Fig. 4 is a flow diagram of an interaction method proposed by yet another embodiment of the present invention;
Fig. 5 is a structural schematic diagram of an interaction device proposed by an embodiment of the present invention;
Fig. 6 is a structural schematic diagram of an interaction device proposed by another embodiment of the present invention;
Fig. 7 is a structural schematic diagram of electronic equipment proposed by an embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements, or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary; they are intended to explain the present invention and are not to be construed as limiting it. On the contrary, the embodiments of the present invention include all changes, modifications, and equivalents falling within the spirit and scope of the appended claims.
In order to solve the technical problem that, in the related art, an intelligent robot cannot interact in a targeted way according to the interaction habits of different users and cannot meet users' personalized interaction demands, an embodiment of the present invention provides an interaction method: when a user gazing at the intelligent robot appears in the environment image, a preset interaction process is executed to determine the user features, interaction data corresponding to the user features is determined, and the user is interacted with based on the corresponding interaction data; moreover, the interaction and display functions are carried by different screens, so interaction and display can proceed synchronously, which improves interaction efficiency, realizes targeted interaction according to the interaction habits of different users, meets the personalized interaction demands of users, and improves the intelligent interaction effect of the intelligent robot.
Fig. 1 is a flow diagram of an interaction method proposed by an embodiment of the present invention.
In the embodiment of the present invention, the method is executed by an intelligent robot, where the intelligent robot can be any device, instrument, or machine with computing and processing capability.
In the embodiment of the present invention, the intelligent robot can be configured to trigger execution of the interaction method upon receiving a start instruction from an administrator; alternatively, the intelligent robot can be configured to intelligently detect the time and automatically trigger execution of the interaction method when the detected time falls within a preset time range. For example, when the flow of people in a public place is relatively large during the noon period, the intelligent robot can be configured to execute the interaction method in real time. The configuration is flexible and highly intelligent.
Referring to Fig. 2, a structural schematic diagram of the intelligent robot in the embodiment of the present invention, the intelligent robot includes an interaction unit 21 and a display unit 22. The interaction unit 21 is used to interact with the user and includes a camera 211; the display unit 22 is used for display and includes a display screen 221. The interaction unit 21 and the display unit 22 are separately arranged and carry out data communication with each other.
The interaction unit 21 and the display unit 22 may be arranged one above the other, or side by side; Fig. 2 takes the vertical arrangement of the interaction unit 21 and the display unit 22 as an example, without limitation.
In the embodiment of the present invention, by separately providing the interaction unit and the display unit, a smaller screen can serve as the interaction unit through which the user interacts, and a larger screen can serve as the display unit, so that the display screen of the display unit can be made larger without being limited by the size of the interaction unit, which is conducive to display. Moreover, while the interaction unit interacts, it can also interact with the user in combination with the display screen: for example, the display screen shows the interaction data while the interaction unit interacts with the user. By carrying the interaction and display functions on different screens, interaction and display can proceed synchronously, improving interaction efficiency.
Since interactive unit can not only be shown, also need carry out information collection, it usually needs use touch screen, higher cost,
And the screen of display unit only needs to be shown, it is only necessary to be set as common display screen, it is touch screen, this reality that no setting is required
It applies in example and undertakes interaction by different screen with the function of showing, be not necessarily to biggish touch screen, reduce into a certain extent
This.
Referring to Fig. 1, this method comprises:
S101: collect an environment image using the camera.
The image corresponding to the environment in which the intelligent robot is located may be referred to as the environment image; the environment image may specifically be an environment picture or an environment video image, without limitation.
During specific execution, the embodiment of the present invention can obtain the environment image by shooting directly with the camera arranged on the intelligent robot, which, on the basis of guaranteeing the interactive function of the intelligent robot, improves the timeliness of environment image collection and guarantees the interaction effect from the angle of timeliness.
In the embodiment of the present invention, the camera of the intelligent robot can be configured to collect environment images in real time, or to collect an environment image at preset time intervals, without limitation.
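The interval-based option can be sketched as a simple capture-gating check; the interval value and clock handling are illustrative assumptions.

```python
# Sketch of the "collect every preset interval" option described
# above. The interval value and timestamps are illustrative.

def should_capture(now, last_capture, interval):
    """True when at least `interval` seconds have passed since the
    last shot, or when no shot has been taken yet."""
    return last_capture is None or now - last_capture >= interval
```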
While shooting the environment image, the time range corresponding to the collected environment image can also be recorded in real time; the time range can be used to trace the environment image back to its source.
In the embodiment of the present invention, collecting the environment image not only realizes some common monitoring functions but can also assist subsequent interaction, which expands the application functions of the intelligent robot and enables interaction in combination with the actual environment conditions of the usage scenario, so that the interaction is more targeted.
In the embodiment of the present invention, an image processing module can also be provided in the intelligent robot; when the intelligent robot obtains the environment image through the camera, the environment image can be transmitted to the image processing module in real time, and the image processing module carries out the subsequent image recognition.
S102: when a user gazing at the intelligent robot appears in the environment image, execute a preset interaction process to determine the user features.
In some embodiments, the image processing module uses a face image recognition algorithm in the related art to identify whether a portrait appears in the environment image. Further, the eye gaze angle of the identified portrait can be detected to obtain a target angle, and the target angle is compared with a preset angle threshold (the preset angle threshold can be set based on experience); when the target angle is within the range of the preset angle threshold, it is determined that the portrait identified in the environment image is gazing at the intelligent robot.
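The gaze test just described reduces to a threshold comparison. The 15-degree default below is an assumption for illustration; the patent only says the threshold is set from experience.

```python
# Sketch of the gaze check: compare the detected eye gaze angle
# against a preset angle threshold. The 15-degree default is an
# assumed, experience-style value.

def is_gazing(target_angle_deg, threshold_deg=15.0):
    """Treat the user as gazing at the robot when the detected gaze
    angle deviates from straight-at-camera by no more than the
    threshold (in either direction)."""
    return abs(target_angle_deg) <= threshold_deg
```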
In other embodiments, a depth information detection module can also be provided for the intelligent robot. When the image processing module identifies a portrait in the environment image using a face image recognition algorithm in the related art, the depth information detection module further determines the depth information of the portrait's eyes, and the depth information is then analyzed to determine whether the portrait is gazing at the intelligent robot.
In the embodiment of the present invention, it is considered that a user who is not gazing at the intelligent robot may not intend to interact with it. Therefore, in order to guarantee that the interaction is targeted while also ensuring its timeliness, the present invention triggers execution of the preset interaction process to determine the user features only when a user gazing at the intelligent robot appears in the environment image.
The user feature is used to uniquely identify the identity of the user. The user feature can be certain biological features of the user, or an identity corresponding to certain biological features of the user; for example, during the initial collection of the user's biological features, a corresponding identifier is generated as the user's identity according to the collected biological features in combination with an identifier generation rule, without limitation.
The embodiment of the present invention takes certain biological features of the user as examples of the user feature, such as a facial contour feature, a voiceprint feature, or a pupil feature, without limitation.
In some embodiments, in executing the preset interaction process to determine the user features, a voice interaction process can be executed to determine a user voice feature as the user feature, which effectively simplifies the user's input steps, lowers the learning cost of the interaction as much as possible, improves user stickiness, and improves the user experience.
For example, a voice broadcast module can be preset in the intelligent robot; when a user gazing at the intelligent robot appears in the environment image, playing of a pre-recorded inquiry voice is triggered. The inquiry voice can be preset by the factory program of the intelligent robot, or set by the administrator of the usage scenario of the intelligent robot, without limitation.
The query voice is, for example, "Would you please say a sentence?" to guide the user to record a segment of speech; of course, the query voice may take any other possible form, without limitation.
In specific execution, while playing the pre-recorded query voice, the intelligent robot may start an audio monitoring device (for example, a microphone device) and detect in real time, via the audio monitoring device, whether an answer voice responding to the query voice is received.
In specific execution, a speech recognition module may also be preset in the intelligent robot. When the answer voice is picked up via the audio monitoring device, the answer voice may be sent to the speech recognition module in real time, and the speech recognition module analyzes the answer voice using speech analysis techniques in the related art to determine the user's voiceprint feature.
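The query-and-listen flow above can be sketched as follows; the prompt text follows the example in the description, while the hash-based stand-in for voiceprint extraction and all function names are illustrative assumptions:

```python
# Sketch of the query-voice flow: play a prompt, wait for an answer, and hand
# the answer to a recognizer that returns a voiceprint feature. The prompt
# text and the digest-based "voiceprint" are placeholders for real modules.
import hashlib

def play_query_voice():
    return "Would you please say a sentence?"  # stands in for TTS playback

def extract_voiceprint(answer_audio: bytes) -> str:
    # A real module would compute spectral embeddings; a digest is a stand-in.
    return hashlib.sha256(answer_audio).hexdigest()[:16]

def voice_interaction_process(listen):
    """listen() simulates the microphone; returns None when nothing is heard."""
    prompt = play_query_voice()
    answer = listen()
    if answer is None:
        return prompt, None
    return prompt, extract_voiceprint(answer)

prompt, feature = voice_interaction_process(lambda: b"hello robot")
```

The `listen` callable abstracts the real-time audio monitoring device so the flow can be exercised without hardware.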
In other embodiments, during execution of the preset interaction process to determine the user feature, a physical feature of the user may instead be identified and used as the user feature. The physical feature includes a facial feature and/or a fingerprint feature; the facial feature is, for example, a pupil feature. It can be seen that the method of determining the user feature in the embodiment of the present invention is flexible, and a suitable manner can be chosen according to actual usage demands.
In specific execution, a biological feature recognition module may be preset in the intelligent robot so that, upon detecting that a user gazing at the intelligent robot appears in the ambient image, the physical feature of the user is identified and used as the user feature.
S103: Determine interaction data corresponding to the user feature.
The interaction data may be used to describe the user's usage habit information, preference information, and the like with respect to the manipulation functions of the intelligent robot.
For example, when using the greeting function of the intelligent robot, user A wishes the intelligent robot to greet in English, while user B wishes the intelligent robot to greet in Japanese; or, when using the business handling function of the intelligent robot, user A wishes the intelligent robot to assist handling with business flow A, while user B wishes the intelligent robot to assist handling with business flow B. In this case, the wish matched with the user feature of user A — that the intelligent robot greet in English, or that it assist handling with business flow A — may be referred to as the corresponding interaction data.
The correspondence between user features and interaction data may be obtained in advance by collecting the usage habit information, preference information, and the like of a large number of users, analyzed based on big data, and stored.
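A minimal sketch of this feature-to-data lookup, with invented entries mirroring users A and B from the example above:

```python
# Sketch of S103: a pre-built store maps a learned user feature to that
# user's interaction data (habits / preferences). The keys and entries are
# invented examples mirroring users A and B in the text.
INTERACTION_STORE = {
    "feature_A": {"greeting_language": "English", "business_flow": "A"},
    "feature_B": {"greeting_language": "Japanese", "business_flow": "B"},
}

def lookup_interaction_data(user_feature):
    """Return the stored interaction data, or None for an unknown feature."""
    return INTERACTION_STORE.get(user_feature)

data_a = lookup_interaction_data("feature_A")
```

In the embodiments below, the None case corresponds to the branch where no matched first feature exists and the user must be guided to enroll.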
S104: Based on the corresponding interaction data, interact with the user using the interaction unit in combination with the display screen.
For example, if the wish matched with the user feature of user A is, as above, that the intelligent robot greet in English, or that the intelligent robot assist handling with business flow A, then the intelligent robot may be controlled to greet in English, or controlled to assist handling with business flow A.
In the present embodiment, when a user gazing at the intelligent robot appears in the ambient image, a preset interaction process is executed to determine the user feature, interaction data corresponding to the user feature is determined, and interaction with the user is performed based on the corresponding interaction data. Moreover, interaction and display are undertaken by different screens, so that interaction and display can proceed synchronously, improving interaction efficiency, realizing targeted interaction according to the interaction habits of different users, meeting the personalized interaction demands of users, and improving the intelligent interaction effect of the intelligent robot.
Fig. 3 is a flow diagram of the interaction method proposed by another embodiment of the present invention.
Referring to Fig. 3, the method comprises:
S301: Acquire an ambient image using the camera.
For the explanation of S301, refer to the above embodiment.
S302: When there are multiple users, determine the duration for which each user gazes at the intelligent robot, each duration being less than or equal to a time threshold.
This step in the present embodiment takes into account that, when identifying user features, there may be one user facing the intelligent robot, or there may be multiple, i.e., a multi-user scenario in which several users gaze at the intelligent robot. At this time, it is possible that only one of the users needs to interact, or that multiple users all have an interaction demand.
Therefore, in order to interact in sequence, the present invention determines the duration for which each user gazes at the intelligent robot. Since a user with a longer gaze duration may have a stronger interaction wish, the user feature of the user with the stronger interaction wish can be identified preferentially.
The time threshold is a threshold for controlling the intelligent robot to respond to user interaction. The threshold may be calibrated in advance by the factory program of the intelligent robot, or may be set by the administrator of the intelligent robot according to actual usage demands, without limitation.
The time threshold is, for example, 15 s.
In specific execution, for example, the duration for which each user gazes at the intelligent robot is detected within 15 s, and then the user with the stronger interaction wish is determined from the multiple durations.
S303: Determine a target duration from the multiple durations.
The target duration is the maximum value among the multiple durations.
It can be understood that, in combination with actual usage scenarios, a user with a longer gaze duration generally has a stronger interaction wish.
Therefore, in the present embodiment, the maximum duration among the multiple durations may be determined and used as the target duration, and then the user feature of the user to whom the target duration belongs is determined. This realizes controlling the intelligent robot to interact in combination with actual usage scenarios, and can accurately determine the user with the stronger interaction wish, ensuring the interaction effect.
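The duration-based target selection of S302-S303 can be sketched as follows; the 15 s threshold follows the example in the text, and the data structures are illustrative:

```python
# Sketch of S302-S303: keep only gaze durations within the time threshold,
# then pick the user with the longest duration as the interaction target.
# The 15 s threshold follows the example in the text; durations are invented.
TIME_THRESHOLD_S = 15.0

def pick_target_user(durations, threshold=TIME_THRESHOLD_S):
    """durations: dict user_id -> gaze duration in seconds.

    Returns the user owning the target (maximum) duration, or None when no
    duration falls within the threshold.
    """
    valid = {u: d for u, d in durations.items() if d <= threshold}
    if not valid:
        return None
    return max(valid, key=valid.get)

target = pick_target_user({"u1": 4.2, "u2": 9.7, "u3": 6.1})
```

Breaking ties and handling near-equal durations are left open, as the text does not specify them.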
S304: Execute the preset interaction process to determine the user feature of the user to whom the target duration belongs.
S305: Determine interaction data corresponding to the user feature.
S306: Based on the corresponding interaction data, interact with the user using the interaction unit in combination with the display screen.
For the explanation of S304-S306, refer to the above embodiment.
As another example, when there are multiple users, a pre-recorded voice is played; the pre-recorded voice is used to guide a target user to input voice data. The position information of the target user who spoke the voice data is determined, and based on the position information, the preset interaction process is executed to determine the feature of the target user. When executing the preset interaction process based on the position information to determine the feature of the target user, the orientation of the interaction unit may be adjusted specifically according to the position information, and the preset interaction process is then executed using the interaction unit after the orientation adjustment, to determine the feature of the target user.
For example, if there are multiple current users, the pre-recorded voice may be played: "Will the user who needs to interact please say a sentence." The built-in audio monitoring device is then started to monitor the voice input by a user. After the voice input by a user is picked up, the user to whom the voice belongs may be located in combination with the voice using sound source localization techniques in the related art, obtaining the position information. The position information may then be analyzed to obtain the relative direction between the user and the intelligent robot, and the orientation of the interaction unit is adjusted to that relative direction, so that the preset interaction process is executed using the interaction unit after the orientation adjustment, to determine the feature of the target user.
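The orientation adjustment driven by sound source localization can be sketched as follows; the localization itself is assumed to have already produced a speaker position, and all coordinates and class names are illustrative:

```python
# Sketch of the sound-source step: from an estimated speaker position,
# compute the bearing relative to the robot and "rotate" the interaction
# unit to face it. Coordinates and the robot pose are illustrative.
import math

def bearing_to(robot_xy, speaker_xy):
    """Angle (degrees) from the robot to the speaker, measured from the +x axis."""
    dx = speaker_xy[0] - robot_xy[0]
    dy = speaker_xy[1] - robot_xy[1]
    return math.degrees(math.atan2(dy, dx))

class InteractionUnit:
    def __init__(self):
        self.heading_deg = 0.0

    def face(self, bearing_deg):
        self.heading_deg = bearing_deg  # a real unit would drive a motor here

unit = InteractionUnit()
unit.face(bearing_to((0.0, 0.0), (1.0, 1.0)))
```

The microphone-array localization that produces `speaker_xy` is outside this sketch; only the direction adjustment described in the text is shown.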
It can be seen that targeted identification can be realized in the embodiment of the present invention: when there are multiple users, the target user may be determined by means of sound source localization, and the orientation of the interaction unit is then adjusted to the relative direction to determine the feature of the target user. The user with the stronger interaction wish is determined accurately, and the rotation of the interaction unit facilitates the target user's operation, improving user experience. A better interaction effect can thus be obtained, improving the intelligent control effect of the intelligent robot.
In the present embodiment, when there are multiple users, the duration for which each user gazes at the intelligent robot is determined, each duration being less than or equal to the time threshold; the target duration is determined from the multiple durations, and the preset interaction process is executed to determine the user feature of the user to whom the target duration belongs. This realizes controlling the intelligent robot to interact in combination with actual usage scenarios, and can accurately determine the user with the stronger interaction wish, ensuring the interaction effect.
Fig. 4 is a flow diagram of the interaction method proposed by another embodiment of the present invention.
Referring to Fig. 4, the method comprises:
S401: Acquire an ambient image using the camera.
S402: When a user gazing at the intelligent robot appears in the ambient image, execute the preset interaction process to determine the user feature.
For the explanation of S401 and S402, refer to the above embodiment; details are not repeated here.
S403: Determine whether there is a first feature matching the user feature, each piece of interaction data being associated with a first feature.
In specific execution, the intelligent robot may perform a retrieval in a local storage module based on the user feature; candidate interaction data and the first feature associated with each piece of interaction data may be stored in advance in the local storage module, wherein the first feature is the user feature, learned in advance, of the user to whom the interaction data belongs. Alternatively, a determination request may be sent to a server; the candidate interaction data and the first feature associated with each piece of interaction data may be stored in advance in the server, so that the server determines, based on the determination request, whether there is a first feature matching the user feature, and feeds the determination result back to the intelligent robot.
S404: If the matched first feature exists, use the interaction data associated with the matched first feature as the corresponding interaction data.
In specific execution, if a first feature matching the user feature exists, this shows that the intelligent robot has already learned the interaction data of the user currently gazing at it. At this point, the interaction data associated with the matched first feature can be read directly and used as the corresponding interaction data, and the corresponding interaction data is directly adopted to interact with the user. This ensures good interaction efficiency, and since the intelligent robot has already learned the interaction data of the user currently gazing at it, interacting with the retrieved interaction data improves the intelligent interaction effect.
S405: Based on the corresponding interaction data, interact with the user using the interaction unit in combination with the display screen.
For the explanation of S405, refer to the above embodiment; details are not repeated here.
S406: If no matched first feature exists, guide the user to input a user feature.
For example, the user may be guided to input the user feature in the form of voice interaction or text interaction. A pre-recorded voice may be played: "Would you please gaze at the camera!" Then, upon detecting that the user gazes at the camera of the intelligent robot, the pupil feature of the user is acquired and used as the user feature. Alternatively, a text may be displayed: "Would you please place your finger on the fingerprint identifier!" to guide the user to input a fingerprint feature as the user feature.
S407: Generate interaction data corresponding to the user feature in combination with a preset interaction rule.
The preset interaction rule is, for example, to simulate some interaction scenario in advance (for example, a greeting interaction scenario) and guide the user to input corresponding feedback information (for example, voice information determining whether the user accepts the language type of the greeting played by the intelligent robot). If the language type of the greeting played by the intelligent robot is English, but it is determined that the user's feedback is voice information in Chinese, then Chinese voice information is used as the corresponding interaction data, without limitation.
S408: Associate the user feature with the corresponding interaction data, and supplement the existing interaction data and associated first features according to the associated user feature and the corresponding interaction data.
For example, when generating the interaction data corresponding to the user feature in combination with the preset interaction rule, some interaction scenario may be simulated in advance (for example, a greeting interaction scenario), the user is guided to input corresponding feedback information (for example, voice information fed back by the user with respect to the manner of greeting played by the intelligent robot), the feedback information is analyzed to obtain the interaction data, and the interaction data is associated with the user feature input under guidance, so as to supplement the existing interaction data and associated first features.
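S407-S408 can be sketched as follows, using the English/Chinese greeting rule from the example; the function names and store layout are illustrative:

```python
# Sketch of S407-S408: run a simulated greeting scenario, derive interaction
# data from the user's feedback language, then associate it with the newly
# enrolled user feature and supplement the store. The rule mirrors the
# English/Chinese example in the text; everything else is illustrative.
def generate_interaction_data(played_language, feedback_language):
    # If the user answers in a different language, record that preference.
    if feedback_language != played_language:
        return {"greeting_language": feedback_language}
    return {"greeting_language": played_language}

def supplement_store(store, user_feature, interaction_data):
    """Append a new (first feature, interaction data) association."""
    store.append({"first_feature": user_feature,
                  "interaction_data": interaction_data})
    return store

store = []
data = generate_interaction_data("English", "Chinese")
supplement_store(store, "vp_new_user", data)
```

Once supplemented, the store can serve the S403 lookup directly on the user's next visit, which is the dynamic update the embodiment describes.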
In the present embodiment, when it is determined that a first feature matching the user feature exists, the interaction data associated with the matched first feature is directly read and used as the corresponding interaction data, and the corresponding interaction data is directly adopted to interact with the user. This ensures good interaction efficiency, and since the intelligent robot has already learned the interaction data of the user currently gazing at it, interacting with the retrieved interaction data improves the intelligent interaction effect. When it is determined that no first feature matching the user feature exists, the user is guided to input a user feature, the interaction data corresponding to the user feature is generated in combination with the preset interaction rule, the user feature is associated with the corresponding interaction data, and the existing interaction data and associated first features are supplemented according to the associated user feature and the corresponding interaction data. This realizes dynamic supplement and update of the interaction data learned by the intelligent robot, so as to better meet the personalized interaction demands of users and improve the intelligent interaction effect of the intelligent robot.
Fig. 5 is a structural schematic diagram of the interaction device proposed by one embodiment of the present invention.
The interaction device is arranged inside the intelligent robot.
The intelligent robot is equipped with an interaction unit and a display unit; the interaction unit and the display unit are separately arranged, and data communication is carried out between them; the interaction unit includes a camera, and the display unit includes a display screen.
Referring to Fig. 5, the interaction device 500 comprises:
an acquisition module 501, for acquiring an ambient image via the camera;
an execution module 502, for executing a preset interaction process to determine a user feature when a user gazing at the intelligent robot appears in the ambient image;
a first determining module 503, for determining interaction data corresponding to the user feature;
an interaction module 504, for interacting with the user via the interaction unit in combination with the display screen, based on the corresponding interaction data.
Optionally, in some embodiments, referring to Fig. 6, the execution module 502 comprises:
a voice feature submodule 5021, for executing a voice interaction process to determine a voiceprint feature of the user as the user feature; or,
a physical feature submodule 5022, for executing the preset interaction process to identify a physical feature of the user as the user feature, the physical feature including: a facial feature and/or a fingerprint feature.
Optionally, in some embodiments, each piece of interaction data is associated with a first feature, and the interaction device 500 further comprises:
a second determining module 505, for determining whether there is a first feature matching the user feature, and when the matched first feature exists, using the interaction data associated with the matched first feature as the corresponding interaction data;
a guiding module 506, for guiding the user to input a user feature when no matched first feature exists;
a generation module 507, for generating interaction data corresponding to the user feature in combination with a preset interaction rule;
an association module 508, for associating the user feature with the corresponding interaction data, and supplementing the existing interaction data and associated first features according to the associated user feature and the corresponding interaction data.
Optionally, in some embodiments, the execution module 502 is further configured to:
when there are multiple users, determine a duration for which each user gazes at the intelligent robot, each duration being less than or equal to a time threshold; determine a target duration from the multiple durations; and execute the preset interaction process to determine the user feature of the user to whom the target duration belongs.
Optionally, in some embodiments, the execution module 502 is further configured to:
when there are multiple users, play a pre-recorded voice, the pre-recorded voice being used to guide a target user to input voice data;
determine position information of the target user who spoke the voice data;
and based on the position information, execute the preset interaction process to determine a feature of the target user.
Optionally, in some embodiments, the execution module 502 is further configured to:
adjust an orientation of the interaction unit according to the position information;
and execute the preset interaction process using the interaction unit after the orientation adjustment, to determine the feature of the target user.
It should be noted that the explanations of the interaction method embodiments in the foregoing Fig. 1-Fig. 4 embodiments also apply to the interaction device 500 of this embodiment; the realization principle is similar, and details are not repeated here.
In the present embodiment, when a user gazing at the intelligent robot appears in the ambient image, a preset interaction process is executed to determine the user feature, interaction data corresponding to the user feature is determined, and interaction with the user is performed based on the corresponding interaction data. Moreover, interaction and display are undertaken by different screens, so that interaction and display can proceed synchronously, improving interaction efficiency, realizing targeted interaction according to the interaction habits of different users, meeting the personalized interaction demands of users, and improving the intelligent interaction effect of the intelligent robot.
Fig. 7 is a structural schematic diagram of the electronic equipment proposed by one embodiment of the present invention.
Referring to Fig. 7, the electronic equipment 700 may include one or more of the following components: a processor 701, a memory 702, a power circuit 703, a multimedia component 704, an audio component 705, an input/output (I/O) interface 706, a sensor component 707, and a communication component 708.
The power circuit 703 is for supplying power to each circuit or device of the mobile terminal; the memory 702 is for storing executable program code; and the processor 701, by reading the executable program code stored in the memory 702, runs a program corresponding to the executable program code, for executing the following steps:
acquiring an ambient image;
when a user gazing at the intelligent robot appears in the ambient image, executing a preset interaction process to determine a user feature;
determining interaction data corresponding to the user feature;
interacting with the user based on the corresponding interaction data.
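Taken together, the four steps the processor executes can be sketched as one cycle; every component below is a stub standing in for the real modules:

```python
# End-to-end sketch of the processor's four steps: acquire the ambient image,
# detect a gazing user, determine the user feature, look up interaction data,
# and interact. Every component is an illustrative stub.
def run_interaction_cycle(camera, gaze_detector, feature_extractor, store, actuator):
    image = camera()                   # step 1: acquire ambient image
    user = gaze_detector(image)        # gazing user, or None if nobody gazes
    if user is None:
        return None                    # no interaction is triggered
    feature = feature_extractor(user)  # step 2: preset interaction process
    data = store.get(feature)          # step 3: interaction data for feature
    return actuator(data)              # step 4: interact based on the data

result = run_interaction_cycle(
    camera=lambda: "frame-0",
    gaze_detector=lambda img: "user-1",
    feature_extractor=lambda u: "vp_1",
    store={"vp_1": {"greeting_language": "English"}},
    actuator=lambda d: f"greet in {d['greeting_language']}",
)
```

Passing the modules as callables keeps the cycle testable without hardware; a real robot would bind the camera, recognizer, storage, and actuator modules described in the embodiments.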
It should be noted that the explanations of the interaction method embodiments in the foregoing Fig. 1-Fig. 4 embodiments also apply to the electronic equipment 700 of this embodiment; the realization principle is similar, and details are not repeated here.
In the present embodiment, when a user gazing at the intelligent robot appears in the ambient image, a preset interaction process is executed to determine the user feature, interaction data corresponding to the user feature is determined, and interaction with the user is performed based on the corresponding interaction data. Moreover, interaction and display are undertaken by different screens, so that interaction and display can proceed synchronously, improving interaction efficiency, realizing targeted interaction according to the interaction habits of different users, meeting the personalized interaction demands of users, and improving the intelligent interaction effect of the intelligent robot.
In order to realize the above embodiments, the present invention further proposes a computer-readable storage medium; when the instructions in the storage medium are executed by the processor of the electronic equipment, the electronic equipment is enabled to carry out the interaction method proposed by the above embodiments.
In order to realize the above embodiments, the present invention further proposes an intelligent robot, including a memory, a processor, and a computer program stored in the memory and runnable on the processor; when the processor executes the program, the interaction method proposed by the foregoing embodiments of the present invention is realized.
It should be noted that, in the description of the present invention, the terms "first", "second", etc. are used for description purposes only and cannot be interpreted as indicating or implying relative importance. In addition, in the description of the present invention, unless otherwise indicated, "multiple" means two or more.
Any process or method description in a flow chart, or otherwise described herein, may be understood as representing a module, segment, or portion of code including one or more executable instructions for realizing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present invention includes other realizations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved; this should be understood by those skilled in the art to which the embodiments of the present invention belong.
It should be appreciated that each part of the present invention may be realized with hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be realized with software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if realized with hardware, as in another embodiment, realization may be with any one of the following techniques well known in the art, or a combination thereof: a discrete logic circuit having logic gate circuits for realizing logic functions on data signals, an application-specific integrated circuit with suitable combinational logic gate circuits, a programmable gate array (PGA), a field-programmable gate array (FPGA), etc.
Those skilled in the art can understand that all or part of the steps carried by the methods of the above embodiments can be completed by instructing relevant hardware through a program; the program can be stored in a computer-readable storage medium, and the program, when executed, includes one of the steps of the method embodiments or a combination thereof.
In addition, each functional unit in each embodiment of the present invention may be integrated in one processing module, or each unit may exist alone physically, or two or more units may be integrated in one module. The above integrated module may be realized in the form of hardware, or may be realized in the form of a software functional module. If the integrated module is realized in the form of a software functional module and is sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, etc.
In the description of this specification, descriptions with reference to the terms "one embodiment", "some embodiments", "an example", "a specific example", or "some examples" mean that specific features, structures, materials, or characteristics described in conjunction with the embodiment or example are included in at least one embodiment or example of the present invention. In the present specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although the embodiments of the present invention have been shown and described above, it can be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; those of ordinary skill in the art can change, modify, replace, and vary the above embodiments within the scope of the present invention.
Claims (10)
1. An interaction method, characterized in that the method is executed by an intelligent robot, the intelligent robot being equipped with an interaction unit and a display unit, the interaction unit and the display unit being separately arranged, data communication being carried out between the interaction unit and the display unit, the interaction unit including a camera, and the display unit including a display screen, the method comprising:
acquiring an ambient image using the camera;
when a user gazing at the intelligent robot appears in the ambient image, executing a preset interaction process to determine a user feature;
determining interaction data corresponding to the user feature;
based on the corresponding interaction data, interacting with the user using the interaction unit in combination with the display screen.
2. The interaction method according to claim 1, characterized in that the executing a preset interaction process to determine a user feature comprises:
executing a voice interaction process to determine a voiceprint feature of the user as the user feature; or,
executing the preset interaction process to identify a physical feature of the user as the user feature, the physical feature including: a facial feature and/or a fingerprint feature.
3. The interaction method according to claim 2, characterized in that each piece of interaction data is associated with a first feature, and the determining interaction data corresponding to the user feature comprises:
determining whether there is a first feature matching the user feature;
if the matched first feature exists, using the interaction data associated with the matched first feature as the corresponding interaction data;
if the matched first feature does not exist, guiding the user to input a user feature;
generating interaction data corresponding to the user feature in combination with a preset interaction rule;
associating the user feature with the corresponding interaction data, and supplementing the existing interaction data and associated first features according to the associated user feature and the corresponding interaction data.
4. The interaction method according to claim 1, characterized in that the executing a preset interaction process to determine a user feature when a user gazing at the intelligent robot appears in the ambient image comprises:
when there are multiple users, determining a duration for which each user gazes at the intelligent robot, each duration being less than or equal to a time threshold;
determining a target duration from the multiple durations;
executing the preset interaction process to determine the user feature of the user to whom the target duration belongs.
5. The interaction method according to claim 1, characterized in that the executing a preset interaction process to determine a user feature when a user gazing at the intelligent robot appears in the ambient image comprises:
when there are multiple users, playing a pre-recorded voice, the pre-recorded voice being used to guide a target user to input voice data;
determining position information of the target user who spoke the voice data;
based on the position information, executing the preset interaction process to determine a feature of the target user.
6. The interaction method according to claim 5, characterized in that the executing the preset interaction process based on the position information to determine a feature of the target user comprises:
adjusting an orientation of the interaction unit according to the position information;
executing the preset interaction process using the interaction unit after the orientation adjustment, to determine the feature of the target user.
7. An interaction device, wherein the device is provided in an intelligent robot, the intelligent robot is equipped with an interaction unit and a display unit, the interaction unit and the display unit are arranged separately and perform data communication with each other, the interaction unit comprises a camera, and the display unit comprises a display screen, the device comprising:
an acquisition module, for acquiring an ambient image via the camera;
an execution module, for executing a preset interaction process to determine user characteristics when a user gazing at the intelligent robot is present in the ambient image;
a first determining module, for determining interaction data corresponding to the user characteristics; and
an interaction module, for interacting with the user via the interaction unit in conjunction with the display screen, based on the corresponding interaction data.
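The four modules of claim 7 form a simple pipeline: acquire an image, detect a gazing user's characteristics, look up matching interaction data, and present it on the display screen. A minimal sketch of that chaining, with every class, parameter, and callable name being illustrative (the patent only names the modules and their responsibilities):

```python
class InteractionDevice:
    """Chains the four modules of claim 7; all names are illustrative."""

    def __init__(self, capture, detect_gazing_user, lookup_interaction_data, show):
        self.capture = capture                                  # acquisition module
        self.detect_gazing_user = detect_gazing_user            # execution module
        self.lookup_interaction_data = lookup_interaction_data  # first determining module
        self.show = show                                        # interaction module

    def run_once(self) -> bool:
        image = self.capture()
        features = self.detect_gazing_user(image)
        if features is None:
            return False  # no user is gazing at the robot; do nothing
        self.show(self.lookup_interaction_data(features))
        return True
```

Injecting the four stages as callables keeps the sketch testable without real camera or display hardware.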
8. An electronic device, comprising a housing, a processor, a memory, a circuit board and a power supply circuit, wherein the circuit board is arranged inside the space enclosed by the housing, and the processor and the memory are arranged on the circuit board; the power supply circuit is configured to supply power to each circuit or component of the electronic device; the memory is configured to store executable program code; and the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to execute:
acquiring an ambient image;
when a user gazing at the intelligent robot is present in the ambient image, executing a preset interaction process to determine user characteristics;
determining interaction data corresponding to the user characteristics; and
interacting with the user based on the corresponding interaction data.
9. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the interaction method according to any one of claims 1-6.
10. An intelligent robot, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the interaction method according to any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910578601.3A CN110286771A (en) | 2019-06-28 | 2019-06-28 | Interaction method and device, intelligent robot, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110286771A true CN110286771A (en) | 2019-09-27 |
Family
ID=68020163
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910578601.3A Pending CN110286771A (en) | 2019-06-28 | 2019-06-28 | Interaction method and device, intelligent robot, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110286771A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110991249A (en) * | 2019-11-04 | 2020-04-10 | 支付宝(杭州)信息技术有限公司 | Face detection method, face detection device, electronic equipment and medium |
CN111710046A (en) * | 2020-06-05 | 2020-09-25 | 北京有竹居网络技术有限公司 | Interaction method and device and electronic equipment |
CN112655000A (en) * | 2020-04-30 | 2021-04-13 | 华为技术有限公司 | In-vehicle user positioning method, vehicle-mounted interaction method, vehicle-mounted device and vehicle |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010182287A (en) * | 2008-07-17 | 2010-08-19 | Steven C Kays | Intelligent adaptive design |
CN102609089A (en) * | 2011-01-13 | 2012-07-25 | 微软公司 | Multi-state model for robot and user interaction |
CN102610035A (en) * | 2012-04-05 | 2012-07-25 | 广州广电运通金融电子股份有限公司 | Financial self-service device and anti-peeping system and anti-peeping method thereof |
US20140247208A1 (en) * | 2013-03-01 | 2014-09-04 | Tobii Technology Ab | Invoking and waking a computing device from stand-by mode based on gaze detection |
US20170318019A1 (en) * | 2016-04-29 | 2017-11-02 | John C. Gordon | Gaze-based authentication |
CN108068121A (en) * | 2017-12-22 | 2018-05-25 | 达闼科技(北京)有限公司 | A kind of man-machine interaction control method, device and robot |
WO2018156912A1 (en) * | 2017-02-27 | 2018-08-30 | Tobii Ab | System for gaze interaction |
CN108733208A (en) * | 2018-03-21 | 2018-11-02 | 北京猎户星空科技有限公司 | The I-goal of smart machine determines method and apparatus |
EP3399427A1 (en) * | 2017-05-03 | 2018-11-07 | Karlsruher Institut für Technologie | Method and system for the quantitative measurement of mental stress of an individual user |
CN109015593A (en) * | 2018-09-21 | 2018-12-18 | 中新智擎科技有限公司 | A kind of advertisement robot and its advertisement placement method |
CN109144262A (en) * | 2018-08-28 | 2019-01-04 | 广东工业大学 | A kind of man-machine interaction method based on eye movement, device, equipment and storage medium |
CN109840804A (en) * | 2019-01-21 | 2019-06-04 | 深圳市丰巢科技有限公司 | Third party's information displaying method, device, equipment and storage medium |
Non-Patent Citations (1)
Title |
---|
XU ZHENGUO et al.: "New-generation human-computer interaction: the status, types and educational applications of natural user interfaces, with a preliminary outlook on brain-computer interface technology", Journal of Distance Education (远程教育杂志), no. 04, 17 July 2018 (2018-07-17) * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109844854B (en) | Word Stream Annotation | |
US8847884B2 (en) | Electronic device and method for offering services according to user facial expressions | |
EP3627290A1 (en) | Device-facing human-computer interaction method and system | |
WO2017122900A1 (en) | Apparatus and method for operating personal agent | |
US20210249012A1 (en) | Systems and methods for operating an output device | |
CN110286771A (en) | Interaction method and device, intelligent robot, electronic equipment and storage medium | |
US20180077095A1 (en) | Augmentation of Communications with Emotional Data | |
US20050255434A1 (en) | Interactive virtual characters for training including medical diagnosis training | |
US10409324B2 (en) | Glass-type terminal and method of controlling the same | |
JP2020039029A (en) | Video distribution system, video distribution method, and video distribution program | |
US20210043106A1 (en) | Technology based learning platform for persons having autism | |
CN110520820A (en) | Mixed reality display system and mixed reality display terminal | |
WO2020148920A1 (en) | Information processing device, information processing method, and information processing program | |
US20220364829A1 (en) | Equipment detection using a wearable device | |
JP4845183B2 (en) | Remote dialogue method and apparatus | |
CN109739353A (en) | A kind of virtual reality interactive system identified based on gesture, voice, Eye-controlling focus | |
Strauß et al. | Wizard-of-Oz Data Collection for Perception and Interaction in Multi-User Environments. | |
JP2018180503A (en) | Public speaking assistance device and program | |
US10635665B2 (en) | Systems and methods to facilitate bi-directional artificial intelligence communications | |
KR102383793B1 (en) | Method, apparatus and system for managing and controlling concentration of user of registered extended reality device | |
CN116088675A (en) | Virtual image interaction method, related device, equipment, system and medium | |
CN113689530A (en) | Method and device for driving digital person and electronic equipment | |
CN110266806A (en) | Content pushing method and device and electronic equipment | |
CN112820265A (en) | Speech synthesis model training method and related device | |
CN112671632A (en) | Intelligent earphone system based on face recognition and information interaction and/or social contact method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||