CN105511608A - Intelligent robot based interaction method and device, and intelligent robot - Google Patents


Info

Publication number
CN105511608A
CN105511608A (application CN201510857357.6A; granted publication CN105511608B)
Authority
CN
China
Prior art keywords
information
man-machine interaction
input information
interaction input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510857357.6A
Other languages
Chinese (zh)
Other versions
CN105511608B (en)
Inventor
韦克礼
王辰
郭家
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Guangnian Infinite Technology Co ltd
Original Assignee
Beijing Guangnian Wuxian Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Guangnian Wuxian Technology Co Ltd filed Critical Beijing Guangnian Wuxian Technology Co Ltd
Priority to CN201510857357.6A
Publication of CN105511608A
Application granted
Publication of CN105511608B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16: Sound input; sound output
    • G06F 3/167: Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Abstract

The invention relates to an interaction method and device based on an intelligent robot, and to an intelligent robot. The method comprises the following steps: obtaining human-computer interaction input information; determining, according to the human-computer interaction input information, whether a preset active interaction condition is satisfied; and if so, generating and outputting corresponding human-computer interaction output information according to the human-computer interaction input information and a preset active interaction model. In the prior art, the user and the robot have only a single passive interaction mode; the new method solves this problem by providing a mode that combines active and passive interaction, thereby making the interaction between the robot and the user more diversified, intelligent, and humanized, and improving the robot's user experience.

Description

Interaction method and device based on an intelligent robot, and intelligent robot
Technical field
The present invention relates to the field of human-computer interaction technology, and in particular to an interaction method and device based on an intelligent robot, and to an intelligent robot.
Background technology
With the rapid development of science and technology, it has become an inevitable trend for robots to enter human society. Like the computer thirty years ago, the development and application of robots will profoundly influence and change every aspect of people's life and work. Whether a robot can interact harmoniously with users is the key to whether it can integrate harmoniously into human society.
When a user appropriately and actively participates in the behavior control of a robot through specific interaction channels, the robot runs more efficiently than in an unmanned, fully autonomous mode; this process requires human-computer interaction technology.
Existing robot systems (such as intelligent robot dialogue systems) all adopt the traditional passive interaction mode, which cannot meet users' increasingly complex interaction demands on robot systems.
Summary of the invention
To solve the above problems, the present invention provides an interaction method based on an intelligent robot, the method comprising:
obtaining human-computer interaction input information;
determining, according to the human-computer interaction input information, whether a preset active interaction condition is satisfied;
when the preset active interaction condition is satisfied, generating and outputting corresponding human-computer interaction output information according to the human-computer interaction input information, based on a preset active interaction model.
According to one embodiment of the present invention, in the method, when the preset active interaction condition is not satisfied, corresponding human-computer interaction output information is generated and output according to the human-computer interaction input information, based on a preset passive interaction model.
According to one embodiment of the present invention, the human-computer interaction input information comprises any one or more of the following:
voice information, tactile information, visual information, olfactory information, gustatory information, and machine perception information.
According to one embodiment of the present invention, the step of determining, according to the human-computer interaction input information, whether the preset active interaction condition is satisfied comprises:
judging whether the human-computer interaction input information is boot information, wake-up information, or preset touch operation information;
if so, determining that the human-computer interaction input information satisfies the preset active interaction condition.
According to one embodiment of the present invention, the step of determining, according to the human-computer interaction input information, whether the preset active interaction condition is satisfied comprises:
judging whether no human-computer interaction has taken place between the user and the robot within a preset duration;
if so, determining that the human-computer interaction input information satisfies the preset active interaction condition.
According to one embodiment of the present invention, when the method generates and outputs the human-computer interaction output information based on the preset active interaction model, it first generates and outputs response information to the input information, and subsequently generates and outputs active interaction information.
The present invention also provides an interaction device based on an intelligent robot, the device comprising:
an interaction information acquisition module for obtaining human-computer interaction input information;
an active interaction condition judgment module for determining, according to the human-computer interaction input information, whether a preset active interaction condition is satisfied;
an active interaction module for, when the preset active interaction condition is satisfied, generating and outputting corresponding human-computer interaction output information according to the human-computer interaction input information, based on a preset active interaction model.
According to one embodiment of the present invention, the device further comprises:
a passive interaction module for, when the preset active interaction condition is not satisfied, generating and outputting corresponding human-computer interaction output information according to the human-computer interaction input information, based on a preset passive interaction model.
According to one embodiment of the present invention, the human-computer interaction input information comprises any one or more of the following:
voice information, tactile information, visual information, olfactory information, gustatory information, and machine perception information.
According to one embodiment of the present invention, when determining according to the human-computer interaction input information whether the preset active interaction condition is satisfied, the active interaction condition judgment module:
judges whether the human-computer interaction input information is boot information, wake-up information, or preset touch operation information;
and if so, determines that the human-computer interaction input information satisfies the preset active interaction condition.
According to one embodiment of the present invention, when determining according to the human-computer interaction input information whether the preset active interaction condition is satisfied, the active interaction condition judgment module:
judges whether no human-computer interaction has taken place between the user and the robot within a preset duration;
and if so, determines that the human-computer interaction input information satisfies the preset active interaction condition.
According to one embodiment of the present invention, when the active interaction module generates and outputs the human-computer interaction output information based on the preset active interaction model, it first generates and outputs response information to the input information, and subsequently generates and outputs active interaction information.
The present invention also provides an intelligent robot, the intelligent robot comprising:
an interaction information acquisition device for obtaining human-computer interaction input information;
a data processing device, connected with the interaction information acquisition device, for judging whether the human-computer interaction input information satisfies a preset active interaction condition and, if so, generating human-computer interaction feedback information according to the human-computer interaction input information based on a preset active interaction model;
an output feedback device, connected with the data processing device, for producing a corresponding output signal according to the human-computer interaction feedback information.
According to one embodiment of the present invention, the interaction information acquisition device comprises any one or more of the following:
a voice information acquisition unit, a tactile information acquisition unit, a visual information acquisition unit, an olfactory information acquisition unit, a gustatory information acquisition unit, and a machine perception information acquisition unit.
According to one embodiment of the present invention, the data processing device is configured to judge whether the human-computer interaction input information is boot information, wake-up information, or preset touch operation information, and if so, to confirm that the human-computer interaction input information satisfies the preset active interaction condition.
According to one embodiment of the present invention, the data processing device is configured to detect whether the duration during which no human-computer interaction has taken place between the user and the robot reaches a preset duration, and if so, to confirm that the human-computer interaction input information satisfies the preset active interaction condition.
According to one embodiment of the present invention, the data processing device is configured to, when the human-computer interaction input information does not satisfy the preset active interaction condition, generate human-computer interaction feedback information according to the human-computer interaction input information based on a preset passive interaction model.
The interaction method and system and the intelligent robot provided by the present invention free the user and the robot from the single passive interaction mode of the prior art and extend it to a mode in which active and passive interaction are combined. This interaction mode makes the interaction between the robot and the user more diversified, intelligent, and humanized, thereby improving the user experience of the robot.
Other features and advantages of the present invention will be set forth in the following description, partly becoming apparent from the description or being understood by implementing the present invention. The objects and other advantages of the present invention can be realized and obtained through the structures specifically pointed out in the description, claims, and accompanying drawings.
Accompanying drawing explanation
To illustrate more clearly the technical solutions of the embodiments of the present invention or of the prior art, the accompanying drawings required in the description of the embodiments or of the prior art are briefly introduced below:
Fig. 1 is a flow chart of an interaction method based on an intelligent robot according to an embodiment of the present invention;
Fig. 2 is a flow chart of an interaction method based on an intelligent robot according to an embodiment of the present invention;
Fig. 3 is a structural schematic diagram of an interaction device based on an intelligent robot according to an embodiment of the present invention;
Fig. 4 is a structural schematic diagram of an intelligent robot according to an embodiment of the present invention;
Fig. 5 is a structural schematic diagram of an interaction information acquisition device according to an embodiment of the present invention.
Detailed description of the embodiments
The embodiments of the present invention are described in detail below with reference to the drawings and examples, so that how the present invention applies technical means to solve technical problems and achieve technical effects can be fully understood and implemented. It should be noted that, as long as no conflict arises, the embodiments of the present invention and the features of the embodiments can be combined with one another, and the resulting technical solutions all fall within the protection scope of the present invention.
Meanwhile, in the following description, many specific details are set forth for illustrative purposes to provide a thorough understanding of the embodiments of the present invention. However, it will be apparent to those skilled in the art that the present invention can also be implemented without these details or in ways other than those described here.
In addition, the steps shown in the flow charts of the drawings can be performed in a computer system executing, for example, a set of computer-executable instructions, and, although a logical order is shown in the flow charts, in some cases the steps shown or described can be performed in an order different from the one given here.
Current traditional human-computer interaction systems all adopt a passive interaction mode to realize information interaction between the user and the intelligent robot. In the passive interaction mode, the user first actively initiates the interaction and sends the relevant interaction information to the robot. For example, the user asks the robot a question by voice; the interaction information the user sends to the robot is then the corresponding voice signal.
The robot obtains the interaction information input by the user through its own perception system and processes this interaction information to obtain the user's intention. Subsequently, the robot obtains a corresponding result according to the obtained intention (for example, retrieving knowledge information related to the interaction information) and converts this result into a corresponding output to feed back to the user.
It can thus be seen that the existing human-computer interaction mode is a single, passive one: the robot can only passively respond to the interaction information input by the user (for example, the user can only ask questions, and the robot retrieves and outputs answers according to the question content), and this mode obviously cannot meet users' increasingly complex interaction demands on robots.
In view of the above problems of existing human-computer interaction methods, this embodiment provides a new interaction method based on an intelligent robot; Fig. 1 shows the flow chart of the method.
As shown in Fig. 1, the interaction method based on an intelligent robot provided by this embodiment first obtains human-computer interaction input information in step S101. To make the human-computer interaction function of the intelligent robot more complete, in this embodiment the human-computer interaction input information obtainable in step S101 preferably comprises: voice information, tactile information, visual information, olfactory information, gustatory information, machine perception information, and the like.
It should be noted that, in other embodiments of the present invention, according to actual needs, the human-computer interaction input information obtainable by the intelligent robot can be any one or more of the items listed above, and can also comprise other unlisted reasonable items; the present invention is not limited thereto.
After obtaining the human-computer interaction input information, the method judges in step S102 whether this human-computer interaction input information satisfies a preset active interaction condition. If it does, the method, in step S103, generates and outputs corresponding human-computer interaction output information according to this input information based on a preset active interaction model; if it does not, the method, in step S104, generates and outputs corresponding human-computer interaction output information according to this input information based on a preset passive interaction model.
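The S101-S104 flow above can be sketched as follows. This is an illustrative reduction only, not the patent's implementation: the function names, the dictionary-based input representation, and the placeholder models are all hypothetical.

```python
# Illustrative sketch of the S101-S104 decision flow. All names and the
# dict-based input format are invented; the patent prescribes no API.

def is_active_condition(info: dict) -> bool:
    """S102: check the preset active interaction conditions."""
    # Condition type 1: boot, wake-up, or designated touch input.
    if info.get("type") in ("boot", "wake_up", "designated_touch"):
        return True
    # Condition type 2: no interaction for at least the preset duration.
    if info.get("idle_seconds", 0) >= 5:
        return True
    return False

def active_model(info: dict) -> str:
    # Placeholder: a real model would generate a proactive utterance.
    return "What can I do for you?"

def passive_model(info: dict) -> str:
    # Placeholder: a real model would answer the user's query.
    return "answer to " + info.get("text", "")

def interact(info: dict) -> str:
    """S101 has already produced `info`; choose branch S103 or S104."""
    if is_active_condition(info):               # S103: active model
        return "active: " + active_model(info)
    return "passive: " + passive_model(info)    # S104: passive model

print(interact({"type": "wake_up"}))               # takes the active branch
print(interact({"type": "voice", "text": "hi"}))   # takes the passive branch
```

The point of the sketch is only the branching structure: one acquisition step, one condition check, and two mutually exclusive generation paths.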
In this embodiment, to make the interaction process more efficient and rapid, the obtained human-computer interaction input information is also preprocessed in step S102, to obtain data information better suited to computer processing.
Specifically, when the human-computer interaction input information input by the user is voice information, the method in step S102 converts the user's input voice information into text information using Automatic Speech Recognition (ASR) technology, or performs voiceprint recognition or the like on the input voice information.
When the human-computer interaction input information input by the user is visual information, the method in step S102 converts the visual information input by the user into corresponding image information through related techniques such as deep learning, so as to recognize the contour of an object, the space it occupies, the distance to the object, and changes in the object's contour and/or spatial position, etc. Deep learning is a technique originating from artificial neural networks; for example, a multilayer perceptron containing many hidden layers is one kind of deep learning structure.
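As a minimal illustration of the "multilayer perceptron with many hidden layers" mentioned above, the sketch below runs a forward pass through a tiny MLP with fixed, invented weights. A real vision pipeline would of course train much larger networks on image data; this only shows the layered structure.

```python
import math

def layer(inputs, weights, biases):
    """One fully connected layer with a sigmoid activation."""
    return [
        1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
        for ws, b in zip(weights, biases)
    ]

def mlp_forward(x):
    # Two hidden layers plus an output layer; all weights are invented
    # for illustration, not learned from data.
    h1 = layer(x, [[0.5, -0.2], [0.3, 0.8]], [0.0, -0.1])
    h2 = layer(h1, [[1.0, -1.0], [0.4, 0.6]], [0.1, 0.0])
    out = layer(h2, [[0.7, -0.5]], [0.0])
    return out[0]

score = mlp_forward([0.9, 0.1])
print(round(score, 3))  # a sigmoid output, always in (0, 1)
```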
When the human-computer interaction input information input by the user is tactile information, the method in step S102 obtains related data information by perceiving the user's touch state and/or temperature. Specifically, in this embodiment, whether the user touches the intelligent robot can be determined by touch sensors and/or temperature sensors distributed on the housing of the intelligent robot; at the same time, if necessary, the specific location where the user touches the intelligent robot can also be determined by these touch sensors and/or temperature sensors.
Of course, in other embodiments of the present invention, the provided method can also preprocess the obtained human-computer interaction input information in other reasonable ways; the present invention is not limited thereto.
The human-computer interaction method provided by this embodiment judges in step S102 whether the human-computer interaction input information is boot information, wake-up information, or preset touch operation information. If so, it determines that this human-computer interaction input information satisfies the preset active interaction condition, and the method then executes step S103 to generate and output corresponding human-computer interaction output information according to the input information, based on the preset active interaction model.
Specifically, assuming the intelligent robot is named "Turing", when the user says "Turing" to the intelligent robot, the user's input is wake-up information. After receiving this wake-up information, the intelligent robot judges that this information satisfies the preset active interaction condition and initiates active interaction according to the preset active interaction model. For example, the intelligent robot can switch to the working state and output the voice "What do you want with me?".
If the user shakes the intelligent robot or knocks it over, the intelligent robot obtains corresponding tactile and/or motion information through tactile and/or motion perception. After receiving this information, the intelligent robot can determine that it satisfies the preset active interaction condition and initiates active interaction according to the preset active interaction model. For example, the intelligent robot can correspondingly output the voice "I'm getting dizzy, please stop shaking me" or "Why did you knock me over?".
In this embodiment, through tactile perception, the intelligent robot can also judge whether the user's tactile input is a designated button; if so, it can likewise initiate active interaction. For example, when the user presses the power-on button, the intelligent robot starts up and initiates interaction, outputting, for example, "Nice to see you again; is there anything I can help you with?".
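The three triggers just described (wake-up name, power-on button, designated touch) can be sketched as one classification routine; the names, the "Turing" wake word, and the canned responses below simply follow the examples above and are otherwise hypothetical.

```python
# Hypothetical sketch: mapping preprocessed inputs to active-interaction
# triggers, following the examples in the text.
WAKE_NAMES = {"turing"}  # the robot's name in the example above

def classify_input(asr_text: str = "", button: str = "", touch: str = "") -> str:
    """Return which active-interaction trigger, if any, the input matches."""
    if asr_text.strip().lower() in WAKE_NAMES:
        return "wake_up"            # user called the robot by name
    if button == "power_on":
        return "boot"               # designated power-on button
    if touch in ("shake", "knock_over"):
        return "designated_touch"   # tactile / motion trigger
    return "none"

RESPONSES = {
    "wake_up": "What do you want with me?",
    "boot": "Nice to see you again; is there anything I can help you with?",
    "designated_touch": "I'm getting dizzy, please stop shaking me.",
}

trigger = classify_input(asr_text="Turing")
print(RESPONSES.get(trigger, ""))  # prints the wake-up response
```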
Meanwhile, in the human-computer interaction method provided by this embodiment, it can also be detected whether no human-computer interaction has taken place between the user and the intelligent robot within a preset duration. If so, it is judged that the current human-computer interaction input information (namely blank information lasting the preset duration) satisfies the preset active interaction condition, and the intelligent robot initiates active interaction, so as to strengthen the interaction between the intelligent robot and the user.
Specifically, as shown in Fig. 2, after obtaining human-computer interaction input information in step S201, the method monitors the human-computer interaction state according to the input information obtained in step S201 in step S202, and judges in step S203, according to the human-computer interaction state, whether the duration during which no human-computer interaction has taken place between the user and the intelligent robot reaches the preset duration. If the preset duration is reached, corresponding human-computer interaction output information is output in step S204 based on the preset active interaction model, to initiate active interaction.
For example, when the user has neither made a voice input to the intelligent robot within 5 seconds nor input tactile, visual, or other information, the intelligent robot actively initiates a conversation and outputs given content in voice form (such as "Shall we chat?" or "Why are you ignoring me?").
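The idle-timeout logic of steps S202-S204 can be sketched with timestamps. The 5-second limit follows the example above; the class and method names are invented for illustration.

```python
import time

IDLE_LIMIT = 5.0  # preset duration in seconds, per the example above

class InteractionMonitor:
    """Tracks the time of the last user input (S202) and decides whether
    the idle active-interaction condition is met (S203)."""

    def __init__(self):
        self.last_input = time.monotonic()

    def record_input(self):
        # Call whenever any voice / tactile / visual input arrives.
        self.last_input = time.monotonic()

    def idle_triggered(self, now=None) -> bool:
        now = time.monotonic() if now is None else now
        return (now - self.last_input) >= IDLE_LIMIT

m = InteractionMonitor()
start = m.last_input
assert not m.idle_triggered(now=start + 1.0)  # still within 5 s
if m.idle_triggered(now=start + 6.0):         # S204: initiate actively
    print("Shall we chat?")
```

Passing `now` explicitly makes the condition testable without actually waiting; a running robot would simply call `idle_triggered()` periodically.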
It should be pointed out that, in other embodiments of the present invention, the method can also adopt other reasonable ways to trigger the intelligent robot to initiate active interaction; the present invention is not limited thereto. For example, in one embodiment of the present invention, it can also be judged whether the conversation between the user and the intelligent robot carries an active interaction condition; if so, the intelligent robot performs active interaction. Specifically, when the user asks the intelligent robot "Have you eaten?", the intelligent robot, after answering the user's question (for example answering "I have"), can initiate active interaction (for example, the intelligent robot can then ask back "And have you eaten?").
As can be seen from the above description, the interaction method based on an intelligent robot provided by the present invention frees the intelligent robot and the user from the single passive interaction mode of the prior art and extends it to a mode in which active and passive interaction are combined. This interaction mode makes the interaction between the robot and the user more diversified, intelligent, and humanized, thereby improving the user experience of the robot.
The present invention also provides an interaction device based on an intelligent robot; Fig. 3 shows the structural schematic diagram of this device in this embodiment.
As shown in Fig. 3, the interaction device based on an intelligent robot provided by this embodiment comprises: an interaction information acquisition module 301, an active interaction condition judgment module 302, and an active interaction module 303. The interaction information acquisition module 301 is used to obtain human-computer interaction input information. Specifically, in this embodiment, the human-computer interaction input information obtainable by the interaction information acquisition module 301 preferably comprises: voice information, tactile information, visual information, olfactory information, gustatory information, machine perception information, and the like.
It should be noted that, in other embodiments of the present invention, according to actual needs, the human-computer interaction input information obtainable by the intelligent robot can be any one or more of the items listed above, and can also comprise other unlisted reasonable items; the present invention is not limited thereto.
The active interaction condition judgment module 302 is used to determine, according to the above human-computer interaction input information, whether the preset active interaction condition is satisfied. If it is satisfied, the active interaction module 303 generates and outputs corresponding human-computer interaction output information according to this input information based on the preset active interaction model; if it is not satisfied, a passive interaction module 304 generates and outputs corresponding human-computer interaction output information according to this input information based on the preset passive interaction model.
In this embodiment, to make the interaction process more efficient and rapid, the active interaction condition judgment module 302 also preprocesses the obtained human-computer interaction input information, to obtain data information better suited to computer processing. The principle and process by which the active interaction condition judgment module 302 processes the human-computer interaction input information are the same as those of step S102 above, and are not repeated here.
The present invention also provides an intelligent robot; this intelligent robot adopts the above interaction method for human-computer interaction, and Fig. 4 shows the structural schematic diagram of this intelligent robot in this embodiment.
As shown in Fig. 4, the human-computer interaction system provided by this embodiment comprises: an interaction information acquisition device 401, a data processing device 402, and an output feedback device 403. The interaction information acquisition device 401 is used to obtain human-computer interaction input information. To make the human-computer interaction function of the robot more complete, as shown in Fig. 5, the interaction information acquisition device 401 provided by this embodiment comprises: a voice information acquisition unit 501, a tactile information acquisition unit 502, a visual information acquisition unit 503, an olfactory information acquisition unit 504, a gustatory information acquisition unit 505, a machine perception information acquisition unit 506, and the like.
The voice information acquisition unit 501, tactile information acquisition unit 502, and visual information acquisition unit 503 can all be realized directly with relatively mature sensors, and are not described further here.
With the development of science and technology, research on the vision, touch, and hearing mechanisms of humans and animals has matured, and corresponding human-like perception systems for intelligent robots have also made significant progress. However, because smell is a relatively secondary human perception, current research on the olfactory mechanism is limited. Yet, in view of the subjectivity of human olfactory identification and the harshness of some working environments, studying humanoid olfactory functions for robots that can imitate, and in some respects even replace, human smell has become inevitable. Therefore, the interaction information acquisition device 401 provided by this embodiment is configured with an olfactory information acquisition unit 504.
In this embodiment, the olfactory information acquisition unit 504 is realized using the gas sensors of an electronic nose. An electronic nose is a device that realizes biological olfaction through electronic equipment; it can distinguish and judge the types of component gases in a pure gas or a gas mixture. Specific gases are selected as the recognition objects of robot olfaction. When gas flows through the gas sensor array, the physical characteristics of the sensors change and the voltage signal values output by the sensors change accordingly; the signals are then preprocessed correspondingly, for example by signal amplification, noise elimination, and signal conversion. The response curve output by each sensor differs for different target gases, so the combined output of multiple gas sensors to a specific smell constitutes a response spectrum with characteristic features. A recognition result (i.e., the current smell information) can then be obtained from this characteristic response spectrum in combination with a suitable recognition algorithm, thereby realizing humanoid smell.
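The response-spectrum recognition just described can be sketched as nearest-neighbour matching of a sensor-array reading against stored reference spectra. The sensor values and smell labels below are invented for illustration; a real electronic nose would use calibrated data and a stronger recognition algorithm.

```python
import math

# Hypothetical reference response spectra: one voltage vector per known
# smell, with one component per gas sensor in the array.
REFERENCE_SPECTRA = {
    "coffee":  [0.9, 0.2, 0.4, 0.1],
    "alcohol": [0.3, 0.8, 0.1, 0.6],
    "smoke":   [0.5, 0.5, 0.9, 0.2],
}

def recognize_smell(reading):
    """Match a preprocessed sensor-array reading to the closest stored
    response spectrum (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(REFERENCE_SPECTRA, key=lambda s: dist(reading, REFERENCE_SPECTRA[s]))

print(recognize_smell([0.85, 0.25, 0.35, 0.15]))  # closest to "coffee"
```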
As everyone knows, taste is divided into basic taste and compound taste, sour, sweet, bitter, peppery, fragrant, fresh, is saltyly called as basic taste, and the mankind come the basic taste of perception and compound taste by tongue.Compound taste is different from tertiary colour, and the compound between some tastes is insignificant, and the compound taste that the mankind can distinguish simultaneously is also limited.For some compound taste, namely the mankind allow to distinguish, also clearly cannot be described, and can only evaluate with some fuzzy vocabulary.In addition, the mankind are that different people is also not quite similar for the perception of same taste by obtaining the trial test study of smelling for the identification capability of taste.Therefore the realization of the machine sense of taste, except needs rely on highly sensitive taste sensor, also needs to introduce the method such as machine learning and fuzzy diagnosis, and solves the problem such as the acquisition of sense of taste knowledge and the expression of compound taste.
Computational intelligence is the computer simulation of biological intelligence; it mainly comprises artificial neural networks, fuzzy systems, and evolutionary computation. Because computational intelligence has learning ability, it can not only record known measurement information but also exhibits strong abstraction and associative-memory capabilities. The man-machine interactive system of the present embodiment therefore fuses computational-intelligence methods with related machine-learning methods, exploiting their complementary advantages to identify and classify taste signals. Specifically, the present embodiment combines computational intelligence with rough set theory, statistical learning theory, Bayesian probability models, fuzzy cluster analysis, and support vector machines (SVM) to process the related data acquired by the taste information acquiring unit 505, thereby achieving reliable recognition of taste.
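To make the idea of classifying taste-sensor readings with fuzzy membership degrees concrete, here is a toy sketch. It stands in for the SVM/fuzzy-clustering fusion the text describes with a much simpler nearest-centroid classifier; all training samples, labels, and readings are invented for illustration.

```python
# Illustrative taste-signal classification: learn one centroid per taste
# label from (invented) sensor samples, then assign fuzzy membership
# degrees by inverse distance. A real system would fuse SVMs, fuzzy
# cluster analysis, and Bayesian models as described in the text.

TRAINING = {                     # taste label -> sample sensor readings
    "sweet": [[0.9, 0.1, 0.2], [0.8, 0.2, 0.1]],
    "sour":  [[0.1, 0.9, 0.3], [0.2, 0.8, 0.2]],
    "salty": [[0.2, 0.3, 0.9], [0.1, 0.2, 0.8]],
}

def centroid(samples):
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

CENTROIDS = {label: centroid(s) for label, s in TRAINING.items()}

def classify_taste(reading):
    """Return (best label, fuzzy membership degrees summing to ~1)."""
    inv = {lab: 1.0 / (1e-9 + sum((a - b) ** 2
                                  for a, b in zip(reading, c)) ** 0.5)
           for lab, c in CENTROIDS.items()}
    total = sum(inv.values())
    memberships = {lab: v / total for lab, v in inv.items()}
    return max(memberships, key=memberships.get), memberships

label, degrees = classify_taste([0.85, 0.15, 0.15])
print(label)  # sweet
```

The fuzzy membership vector mirrors the observation above that compound tastes often admit only a graded, fuzzy description rather than a crisp label.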
Referring again to Fig. 4, data processing device 402 is connected with interactive information acquisition device 401 and judges whether the man-machine interaction input information delivered by interactive information acquisition device 401 meets a preset active interaction condition. If the man-machine interaction input information meets the preset active interaction condition, corresponding man-machine interaction feedback information is generated from the input information based on a preset active interaction model; otherwise, the feedback information is generated from the input information based on a preset passive interaction model.
Output feedback device 403 is connected with data processing device 402 and generates corresponding output information from the man-machine interaction feedback information delivered by data processing device 402. Specifically, in different embodiments of the invention, output feedback device 403 may be implemented with a display, with other devices such as a loudspeaker, or with any reasonable combination of the above; the present invention is not limited in this respect.
In the present embodiment, to make the interactive process more efficient and rapid, data processing device 402 also preprocesses the obtained man-machine interaction input information so as to obtain data in a form better suited to computer processing.
Specifically, when the man-machine interaction input information entered by the user is voice information, data processing device 402 converts the input voice information into text using Automatic Speech Recognition (ASR) technology, or performs voiceprint recognition or similar processing on it, and subsequently processes the resulting natural language using Natural Language Processing (NLP) technology.
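The ASR-then-NLP path above can be sketched as a two-stage pipeline. Both stages are stubs invented for illustration: a real system would call an actual speech-recognition engine and a full NLP toolkit, and the intent table below is hypothetical.

```python
# Hypothetical two-stage preprocessing pipeline: speech -> text (ASR),
# then text -> normalized structure with a coarse intent (NLP).
# Both stages are stand-ins; no real engine is invoked.

def asr_transcribe(audio_bytes):
    """Stub for an automatic-speech-recognition engine."""
    return "hello robot please play some music"   # pretend transcript

def nlp_process(text):
    """Toy normalization plus keyword-based intent extraction."""
    tokens = text.lower().split()
    intents = {"play": "media.play", "weather": "info.weather"}
    for tok in tokens:
        if tok in intents:
            return {"text": text, "intent": intents[tok]}
    return {"text": text, "intent": "chat.default"}

result = nlp_process(asr_transcribe(b"\x00\x01"))
print(result["intent"])  # media.play
```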
Because the principles and procedures by which data processing device 402 handles each type of man-machine interaction input information are the same as those set forth for step S102 above, they are not repeated here.
The data processing device 402 of the present embodiment judges whether the man-machine interaction input information is boot-up information, wake-up information, or preset touch operation information. If so, it determines that the man-machine interaction input information meets the preset active interaction condition, and generates corresponding man-machine interaction feedback information from the input information based on the preset active interaction model; otherwise, it determines that the input information does not meet the preset active interaction condition and generates the feedback information from the input information based on the preset passive interaction model.
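The branch just described reduces to a small decision rule. The sketch below is an assumption-laden illustration: the input-type labels and the feedback strings are invented, and the two models are stubbed as string results.

```python
# Minimal sketch of the decision rule above: boot-up, wake-up, and
# preset touch inputs trigger the active interaction model; everything
# else falls back to the passive model. Labels are assumptions.

ACTIVE_TRIGGERS = {"boot", "wake", "preset_touch"}

def generate_feedback(input_type, payload=""):
    if input_type in ACTIVE_TRIGGERS:
        # active interaction model: the robot initiates the exchange
        return "active:greeting"
    # passive interaction model: the robot only responds to the user
    return f"passive:reply_to:{payload}"

print(generate_feedback("boot"))             # active:greeting
print(generate_feedback("voice", "hello"))   # passive:reply_to:hello
```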
In the present embodiment, through tactile perception, the man-machine interactive system can also judge whether the user's touch input corresponds to a designated button; if so, it likewise initiates active interaction. For example, when the user presses the power-on button, the robot starts up and initiates touch interaction, for instance with a greeting such as "Hello, is there anything I can help you with?".
Meanwhile, in the man-machine interactive system of the present embodiment, data processing device 402 can also detect, via interactive information acquisition device 401, whether no man-machine interaction has taken place between the user and the robot within a preset duration. If so, it determines that the current man-machine interaction input information (namely, blank information lasting the preset duration) meets the preset active interaction condition, and the man-machine interactive system initiates active interaction.
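The idle-timeout check can be sketched with a small monitor class. The 300-second threshold is an assumed value, not one specified by the patent.

```python
# Sketch of the idle-timeout check: if no interaction has occurred for
# a preset duration, the blank interval itself satisfies the active
# interaction condition. The threshold is illustrative only.

import time

class IdleMonitor:
    def __init__(self, preset_duration=300.0):   # seconds, assumed value
        self.preset_duration = preset_duration
        self.last_interaction = time.monotonic()

    def record_interaction(self):
        """Call whenever any man-machine interaction is observed."""
        self.last_interaction = time.monotonic()

    def should_initiate(self, now=None):
        """True when the blank interval reaches the preset duration."""
        now = time.monotonic() if now is None else now
        return (now - self.last_interaction) >= self.preset_duration

m = IdleMonitor(preset_duration=300.0)
print(m.should_initiate(now=m.last_interaction + 301))  # True
print(m.should_initiate(now=m.last_interaction + 10))   # False
```

Using `time.monotonic()` rather than wall-clock time keeps the interval measurement immune to system clock adjustments.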
It is pointed out that in other embodiments of the invention, data processing equipment 402 can also adopt other rational methods to judge whether and meet initiatively mutual condition, the present invention is not limited thereto.Such as in one embodiment of the invention, data processing equipment 402 can also judge whether the session of user and intelligent robot has initiatively mutual condition, if having initiatively mutual condition, it is mutual that intelligent robot then carries out active.Particularly, can be when user is to intelligent robot inquiry " you have had a meal ", intelligent robot after the inquiry (such as answering " I ate ") of answering user, can be initiated initiatively mutual (such as intelligent robot can then be inquired " you have eaten ").
As can be seen from the foregoing description, the intelligent robot provided by the present invention is no longer limited, as existing intelligent robots are, to a single passive interaction mode; instead, interaction extends to a combination of active and passive modes. This interactive mode makes the interaction between the robot and the user more diversified, intelligent, and humanized, thereby improving the user experience of the robot.
It should be understood that the disclosed embodiments of the invention are not limited to the particular structures, processing steps, or materials disclosed herein, but extend to their equivalents as would be understood by those of ordinary skill in the relevant art. It should also be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting.
Although the above examples illustrate the principles of the present invention in one or more applications, it will be apparent to those skilled in the art that various modifications in form, detail of usage, and implementation can be made without the exercise of inventive effort and without departing from the principles and concepts of the invention. Accordingly, the invention is limited only by the appended claims.

Claims (17)

1. An interaction method based on an intelligent robot, characterized in that the method comprises:
obtaining man-machine interaction input information;
determining, according to the man-machine interaction input information, whether a preset active interaction condition is met; and
when the preset active interaction condition is met, generating and outputting, based on a preset active interaction model, corresponding man-machine interaction output information according to the man-machine interaction input information.
2. The method as claimed in claim 1, characterized in that, in the method, when the preset active interaction condition is not met, corresponding man-machine interaction output information is generated and output according to the man-machine interaction input information based on a preset passive interaction model.
3. The method as claimed in claim 1 or 2, characterized in that the man-machine interaction input information comprises any one or more of the following:
voice information, tactile information, visual information, olfactory information, taste information, and machine perception information.
4. The method as claimed in any one of claims 1 to 3, characterized in that the step of determining, according to the man-machine interaction input information, whether the preset active interaction condition is met comprises:
judging whether the man-machine interaction input information is boot-up information, wake-up information, or preset touch operation information; and
if so, determining that the man-machine interaction input information meets the preset active interaction condition.
5. The method as claimed in any one of claims 1 to 4, characterized in that the step of determining, according to the man-machine interaction input information, whether the preset active interaction condition is met comprises:
judging whether no man-machine interaction has taken place between the user and the robot within a preset duration; and
if so, determining that the man-machine interaction input information meets the preset active interaction condition.
6. The method as claimed in any one of claims 1 to 5, characterized in that, when the method generates and outputs the man-machine interaction output information based on the preset active interaction model, a response to the input information is generated and output first, and active interaction information is generated and output subsequently.
7. An interaction device based on an intelligent robot, characterized in that the device comprises:
an interactive information acquisition module for obtaining man-machine interaction input information;
an active interaction condition judgment module for determining, according to the man-machine interaction input information, whether a preset active interaction condition is met; and
an active interaction module for, when the preset active interaction condition is met, generating and outputting, based on a preset active interaction model, corresponding man-machine interaction output information according to the man-machine interaction input information.
8. The device as claimed in claim 7, characterized in that the device further comprises:
a passive interaction module for, when the preset active interaction condition is not met, generating and outputting corresponding man-machine interaction output information according to the man-machine interaction input information based on a preset passive interaction model.
9. The device as claimed in claim 7 or 8, characterized in that the man-machine interaction input information comprises any one or more of the following:
voice information, tactile information, visual information, olfactory information, taste information, and machine perception information.
10. The device as claimed in any one of claims 7 to 9, characterized in that, when determining according to the man-machine interaction input information whether the preset active interaction condition is met, the active interaction condition judgment module:
judges whether the man-machine interaction input information is boot-up information, wake-up information, or preset touch operation information; and
if so, determines that the man-machine interaction input information meets the preset active interaction condition.
11. The device as claimed in any one of claims 7 to 10, characterized in that, when determining according to the man-machine interaction input information whether the preset active interaction condition is met, the active interaction condition judgment module:
judges whether no man-machine interaction has taken place between the user and the robot within a preset duration; and
if so, determines that the man-machine interaction input information meets the preset active interaction condition.
12. The device as claimed in any one of claims 7 to 11, characterized in that, when the active interaction module generates and outputs the man-machine interaction output information based on the preset active interaction model, it first generates and outputs a response to the input information, and subsequently generates and outputs active interaction information.
13. An intelligent robot, characterized in that the intelligent robot comprises:
an interactive information acquisition device for obtaining man-machine interaction input information;
a data processing device connected with the interactive information acquisition device, for judging whether the man-machine interaction input information meets a preset active interaction condition and, if so, generating man-machine interaction feedback information according to the man-machine interaction input information based on a preset active interaction model; and
an output feedback device connected with the data processing device, for generating a corresponding output signal according to the man-machine interaction feedback information.
14. The intelligent robot as claimed in claim 13, characterized in that the interactive information acquisition device comprises any one or more of the following:
a voice information acquiring unit, a tactile information acquiring unit, a visual information acquiring unit, an olfactory information acquiring unit, a taste information acquiring unit, and a machine perception information acquiring unit.
15. The intelligent robot as claimed in claim 13 or 14, characterized in that the data processing device is configured to judge whether the man-machine interaction input information is boot-up information, wake-up information, or preset touch operation information, and if so, to confirm that the man-machine interaction input information meets the preset active interaction condition.
16. The intelligent robot as claimed in any one of claims 13 to 15, characterized in that the data processing device is configured to detect whether the duration during which no man-machine interaction has taken place between the user and the robot reaches a preset duration, and if so, to confirm that the man-machine interaction input information meets the preset active interaction condition.
17. The intelligent robot as claimed in any one of claims 13 to 16, characterized in that the data processing device is configured to, when the man-machine interaction input information does not meet the preset active interaction condition, generate man-machine interaction feedback information according to the man-machine interaction input information based on a preset passive interaction model.
CN201510857357.6A 2015-11-30 2015-11-30 Exchange method and device, intelligent robot based on intelligent robot Active CN105511608B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510857357.6A CN105511608B (en) 2015-11-30 2015-11-30 Exchange method and device, intelligent robot based on intelligent robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510857357.6A CN105511608B (en) 2015-11-30 2015-11-30 Exchange method and device, intelligent robot based on intelligent robot

Publications (2)

Publication Number Publication Date
CN105511608A true CN105511608A (en) 2016-04-20
CN105511608B CN105511608B (en) 2018-12-25

Family

ID=55719658

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510857357.6A Active CN105511608B (en) 2015-11-30 2015-11-30 Exchange method and device, intelligent robot based on intelligent robot

Country Status (1)

Country Link
CN (1) CN105511608B (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105798918A (en) * 2016-04-29 2016-07-27 北京光年无限科技有限公司 Interactive method and device for intelligent robot
CN105898487A (en) * 2016-04-28 2016-08-24 北京光年无限科技有限公司 Interaction method and device for intelligent robot
CN105913039A (en) * 2016-04-26 2016-08-31 北京光年无限科技有限公司 Visual-and-vocal sense based dialogue data interactive processing method and apparatus
CN105945949A (en) * 2016-06-01 2016-09-21 北京光年无限科技有限公司 Information processing method and system for intelligent robot
CN106022294A (en) * 2016-06-01 2016-10-12 北京光年无限科技有限公司 Intelligent robot-oriented man-machine interaction method and intelligent robot-oriented man-machine interaction device
CN106096717A (en) * 2016-06-03 2016-11-09 北京光年无限科技有限公司 Information processing method and system towards intelligent robot
CN106182007A (en) * 2016-08-09 2016-12-07 北京光年无限科技有限公司 A kind of card for intelligent robot pauses processing method and processing device
WO2016206647A1 (en) * 2015-06-26 2016-12-29 北京贝虎机器人技术有限公司 System for controlling machine apparatus to generate action
CN106372195A (en) * 2016-08-31 2017-02-01 北京光年无限科技有限公司 Human-computer interaction method and device for intelligent robot
CN106426203A (en) * 2016-11-02 2017-02-22 旗瀚科技有限公司 Communication system and method of active trigger robot
CN106462255A (en) * 2016-06-29 2017-02-22 深圳狗尾草智能科技有限公司 A method, system and robot for generating interactive content of robot
CN106502382A (en) * 2016-09-21 2017-03-15 北京光年无限科技有限公司 Active exchange method and system for intelligent robot
CN106537293A (en) * 2016-06-29 2017-03-22 深圳狗尾草智能科技有限公司 Method and system for generating robot interactive content, and robot
CN106537425A (en) * 2016-06-29 2017-03-22 深圳狗尾草智能科技有限公司 Method and system for generating robot interaction content, and robot
CN106559321A (en) * 2016-12-01 2017-04-05 竹间智能科技(上海)有限公司 The method and system of dynamic adjustment dialog strategy
CN106648114A (en) * 2017-01-12 2017-05-10 长春大学 Interactive model of tongue machine and device
CN106648074A (en) * 2016-11-25 2017-05-10 合肥优智领英智能科技有限公司 Man-machine interaction method of intelligent robot
CN106662932A (en) * 2016-07-07 2017-05-10 深圳狗尾草智能科技有限公司 Method, system and robot for recognizing and controlling household appliances based on intention
CN106716934A (en) * 2016-12-23 2017-05-24 深圳前海达闼云端智能科技有限公司 Chat interaction method and apparatus, and electronic device thereof
CN106774837A (en) * 2016-11-23 2017-05-31 河池学院 A kind of man-machine interaction method of intelligent robot
CN106970630A (en) * 2017-05-23 2017-07-21 上海棠棣信息科技股份有限公司 A kind of robot actively provides method and device, the robot of service
CN107147618A (en) * 2017-04-10 2017-09-08 北京猎户星空科技有限公司 A kind of user registering method, device and electronic equipment
CN107294837A (en) * 2017-05-22 2017-10-24 北京光年无限科技有限公司 Engaged in the dialogue interactive method and system using virtual robot
WO2018006380A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Human-machine interaction system, device, and method for robot
CN107885756A (en) * 2016-09-30 2018-04-06 华为技术有限公司 Dialogue method, device and equipment based on deep learning
CN107977072A (en) * 2017-07-28 2018-05-01 北京物灵智能科技有限公司 What a kind of robot used form method, forms expert system and electronic equipment
CN108133259A (en) * 2017-12-14 2018-06-08 深圳狗尾草智能科技有限公司 The system and method that artificial virtual life is interacted with the external world
CN108181612A (en) * 2017-12-22 2018-06-19 达闼科技(北京)有限公司 Determine the method and relevant apparatus of microphone beam profile angle
CN108214513A (en) * 2018-01-23 2018-06-29 深圳狗尾草智能科技有限公司 Multi-dimensional robot degree responds exchange method and device
CN108229640A (en) * 2016-12-22 2018-06-29 深圳光启合众科技有限公司 The method, apparatus and robot of emotion expression service
CN109343897A (en) * 2018-08-09 2019-02-15 北京云迹科技有限公司 Awakening method and device for robot
CN109857929A (en) * 2018-12-29 2019-06-07 北京光年无限科技有限公司 A kind of man-machine interaction method and device for intelligent robot
CN109949807A (en) * 2019-03-13 2019-06-28 常州市贝叶斯智能科技有限公司 A kind of the intelligent robot interactive system and method for body composition detection and analysis
CN110176161A (en) * 2019-06-19 2019-08-27 上海思依暄机器人科技股份有限公司 A kind of artificial intelligence Teaching Experiment Box and its experiment control method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101441513A (en) * 2008-11-26 2009-05-27 北京科技大学 System for performing non-contact type human-machine interaction by vision
CN103000052A (en) * 2011-09-16 2013-03-27 上海先先信息科技有限公司 Man-machine interactive spoken dialogue system and realizing method thereof
CN103246879A (en) * 2013-05-13 2013-08-14 苏州福丰科技有限公司 Expression-recognition-based intelligent robot system
CN103268150A (en) * 2013-05-13 2013-08-28 苏州福丰科技有限公司 Intelligent robot management and control system and intelligent robot management and control method on basis of facial expression recognition
CN103956128A (en) * 2014-05-09 2014-07-30 东华大学 Intelligent active advertising platform based on somatosensory technology
CN105068661A (en) * 2015-09-07 2015-11-18 百度在线网络技术(北京)有限公司 Man-machine interaction method and system based on artificial intelligence

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101441513A (en) * 2008-11-26 2009-05-27 北京科技大学 System for performing non-contact type human-machine interaction by vision
CN103000052A (en) * 2011-09-16 2013-03-27 上海先先信息科技有限公司 Man-machine interactive spoken dialogue system and realizing method thereof
CN103246879A (en) * 2013-05-13 2013-08-14 苏州福丰科技有限公司 Expression-recognition-based intelligent robot system
CN103268150A (en) * 2013-05-13 2013-08-28 苏州福丰科技有限公司 Intelligent robot management and control system and intelligent robot management and control method on basis of facial expression recognition
CN103956128A (en) * 2014-05-09 2014-07-30 东华大学 Intelligent active advertising platform based on somatosensory technology
CN105068661A (en) * 2015-09-07 2015-11-18 百度在线网络技术(北京)有限公司 Man-machine interaction method and system based on artificial intelligence

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016206647A1 (en) * 2015-06-26 2016-12-29 北京贝虎机器人技术有限公司 System for controlling machine apparatus to generate action
CN105913039A (en) * 2016-04-26 2016-08-31 北京光年无限科技有限公司 Visual-and-vocal sense based dialogue data interactive processing method and apparatus
CN105898487B (en) * 2016-04-28 2019-02-19 北京光年无限科技有限公司 A kind of exchange method and device towards intelligent robot
CN105898487A (en) * 2016-04-28 2016-08-24 北京光年无限科技有限公司 Interaction method and device for intelligent robot
CN105798918A (en) * 2016-04-29 2016-07-27 北京光年无限科技有限公司 Interactive method and device for intelligent robot
CN105798918B (en) * 2016-04-29 2018-08-21 北京光年无限科技有限公司 A kind of exchange method and device towards intelligent robot
CN105945949A (en) * 2016-06-01 2016-09-21 北京光年无限科技有限公司 Information processing method and system for intelligent robot
CN106022294A (en) * 2016-06-01 2016-10-12 北京光年无限科技有限公司 Intelligent robot-oriented man-machine interaction method and intelligent robot-oriented man-machine interaction device
CN106096717B (en) * 2016-06-03 2018-08-14 北京光年无限科技有限公司 Information processing method towards intelligent robot and system
CN106096717A (en) * 2016-06-03 2016-11-09 北京光年无限科技有限公司 Information processing method and system towards intelligent robot
CN106537293A (en) * 2016-06-29 2017-03-22 深圳狗尾草智能科技有限公司 Method and system for generating robot interactive content, and robot
CN106462255A (en) * 2016-06-29 2017-02-22 深圳狗尾草智能科技有限公司 A method, system and robot for generating interactive content of robot
CN106537425A (en) * 2016-06-29 2017-03-22 深圳狗尾草智能科技有限公司 Method and system for generating robot interaction content, and robot
WO2018000267A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method for generating robot interaction content, system, and robot
WO2018000260A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method for generating robot interaction content, system, and robot
WO2018000261A1 (en) * 2016-06-29 2018-01-04 深圳狗尾草智能科技有限公司 Method and system for generating robot interaction content, and robot
WO2018006380A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Human-machine interaction system, device, and method for robot
CN106662932A (en) * 2016-07-07 2017-05-10 深圳狗尾草智能科技有限公司 Method, system and robot for recognizing and controlling household appliances based on intention
WO2018006372A1 (en) * 2016-07-07 2018-01-11 深圳狗尾草智能科技有限公司 Method and system for controlling household appliance on basis of intent recognition, and robot
CN106182007A (en) * 2016-08-09 2016-12-07 北京光年无限科技有限公司 A kind of card for intelligent robot pauses processing method and processing device
CN106372195A (en) * 2016-08-31 2017-02-01 北京光年无限科技有限公司 Human-computer interaction method and device for intelligent robot
CN106502382A (en) * 2016-09-21 2017-03-15 北京光年无限科技有限公司 Active exchange method and system for intelligent robot
CN106502382B (en) * 2016-09-21 2020-01-14 北京光年无限科技有限公司 Active interaction method and system for intelligent robot
US11449678B2 (en) 2016-09-30 2022-09-20 Huawei Technologies Co., Ltd. Deep learning based dialog method, apparatus, and device
CN107885756B (en) * 2016-09-30 2020-05-08 华为技术有限公司 Deep learning-based dialogue method, device and equipment
CN107885756A (en) * 2016-09-30 2018-04-06 华为技术有限公司 Dialogue method, device and equipment based on deep learning
CN106426203A (en) * 2016-11-02 2017-02-22 旗瀚科技有限公司 Communication system and method of active trigger robot
CN106774837A (en) * 2016-11-23 2017-05-31 河池学院 A kind of man-machine interaction method of intelligent robot
CN106648074A (en) * 2016-11-25 2017-05-10 合肥优智领英智能科技有限公司 Man-machine interaction method of intelligent robot
CN106559321A (en) * 2016-12-01 2017-04-05 竹间智能科技(上海)有限公司 The method and system of dynamic adjustment dialog strategy
CN108229640A (en) * 2016-12-22 2018-06-29 深圳光启合众科技有限公司 The method, apparatus and robot of emotion expression service
CN108229640B (en) * 2016-12-22 2021-08-20 山西翼天下智能科技有限公司 Emotion expression method and device and robot
CN106716934B (en) * 2016-12-23 2020-08-04 深圳前海达闼云端智能科技有限公司 Chat interaction method and device and electronic equipment thereof
CN106716934A (en) * 2016-12-23 2017-05-24 深圳前海达闼云端智能科技有限公司 Chat interaction method and apparatus, and electronic device thereof
CN106648114B (en) * 2017-01-12 2023-11-14 长春大学 Tongue machine interaction model and device
CN106648114A (en) * 2017-01-12 2017-05-10 长春大学 Interactive model of tongue machine and device
US11568876B2 (en) 2017-04-10 2023-01-31 Beijing Orion Star Technology Co., Ltd. Method and device for user registration, and electronic device
CN107147618A (en) * 2017-04-10 2017-09-08 北京猎户星空科技有限公司 A kind of user registering method, device and electronic equipment
CN107294837A (en) * 2017-05-22 2017-10-24 北京光年无限科技有限公司 Engaged in the dialogue interactive method and system using virtual robot
CN106970630B (en) * 2017-05-23 2019-12-06 浙江孚宝智能科技有限公司 method and device for actively providing service by robot and robot
CN106970630A (en) * 2017-05-23 2017-07-21 上海棠棣信息科技股份有限公司 A kind of robot actively provides method and device, the robot of service
CN107977072A (en) * 2017-07-28 2018-05-01 北京物灵智能科技有限公司 What a kind of robot used form method, forms expert system and electronic equipment
CN107977072B (en) * 2017-07-28 2021-06-08 北京物灵智能科技有限公司 Formation method for robot, formation expert system and electronic equipment
CN108133259A (en) * 2017-12-14 2018-06-08 深圳狗尾草智能科技有限公司 The system and method that artificial virtual life is interacted with the external world
CN108181612A (en) * 2017-12-22 2018-06-19 达闼科技(北京)有限公司 Determine the method and relevant apparatus of microphone beam profile angle
CN108181612B (en) * 2017-12-22 2019-05-21 达闼科技(北京)有限公司 Determine the method and relevant apparatus of microphone beam profile angle
CN108214513A (en) * 2018-01-23 2018-06-29 深圳狗尾草智能科技有限公司 Multi-dimensional robot degree responds exchange method and device
CN109343897A (en) * 2018-08-09 2019-02-15 北京云迹科技有限公司 Awakening method and device for robot
CN109857929B (en) * 2018-12-29 2021-06-15 北京光年无限科技有限公司 Intelligent robot-oriented man-machine interaction method and device
CN109857929A (en) * 2018-12-29 2019-06-07 北京光年无限科技有限公司 A kind of man-machine interaction method and device for intelligent robot
CN109949807A (en) * 2019-03-13 2019-06-28 常州市贝叶斯智能科技有限公司 A kind of the intelligent robot interactive system and method for body composition detection and analysis
CN110176161A (en) * 2019-06-19 2019-08-27 上海思依暄机器人科技股份有限公司 A kind of artificial intelligence Teaching Experiment Box and its experiment control method and device

Also Published As

Publication number Publication date
CN105511608B (en) 2018-12-25

Similar Documents

Publication Publication Date Title
CN105511608A (en) Intelligent robot based interaction method and device, and intelligent robot
CN105843381B (en) Data processing method for realizing multi-modal interaction and multi-modal interaction system
US11670324B2 (en) Method for predicting emotion status and robot
CN108000526B (en) Dialogue interaction method and system for intelligent robot
CN105598972B (en) A kind of robot system and exchange method
US8321221B2 (en) Speech communication system and method, and robot apparatus
CN108009573B (en) Robot emotion model generation method, emotion model and interaction method
WO2018000259A1 (en) Method and system for generating robot interaction content, and robot
CN106022294B (en) Intelligent robot-oriented man-machine interaction method and device
Savery et al. A survey of robotics and emotion: Classifications and models of emotional interaction
CN104881108A (en) Intelligent man-machine interaction method and device
WO2018006374A1 (en) Function recommending method, system, and robot based on automatic wake-up
US11531881B2 (en) Artificial intelligence apparatus for controlling auto stop system based on driving information and method for the same
WO2018000267A1 (en) Method for generating robot interaction content, system, and robot
CN108628908B (en) Method, device and electronic equipment for classifying user question-answer boundaries
CN111858861A (en) Question-answer interaction method based on picture book and electronic equipment
KR20210033809A (en) Control server and method for controlling robot using artificial neural network, and the robot implementing the same
Takagi Interactive evolutionary computation for analyzing human awareness mechanisms
WO2018000261A1 (en) Method and system for generating robot interaction content, and robot
CN106598241A (en) Interactive data processing method and device for intelligent robot
Hao et al. Proposal of initiative service model for service robot
CN117193524A (en) Man-machine interaction system and method based on multi-mode feature fusion
Abdallah et al. Smart assistant robot for smart home management
WO2018000260A1 (en) Method for generating robot interaction content, system, and robot
Najjar et al. Constructivist ambient intelligent agent for smart environments

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20190211

Address after: 518064 Turing Robot 1404 Mango Net Building, Haitianyi Road, Nanshan District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen Lightyear Turing Technology Co.,Ltd.

Address before: 100000 Fourth Floor Ivy League Youth Venture Studio No. 193, Yuquan Building, No. 3 Shijingshan Road, Shijingshan District, Beijing

Patentee before: Beijing Guangnian Infinite Technology Co.,Ltd.

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20240401

Address after: Room 193, Ivy League Youth Entrepreneurship Studio, 4th Floor, Yuquan Building, No. 3 Shijingshan Road, Shijingshan District, Beijing, 100049

Patentee after: Beijing Guangnian Infinite Technology Co.,Ltd.

Country or region after: China

Address before: 518064 Turing Robot 1404 Mango Net Building, Haitianyi Road, Nanshan District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Lightyear Turing Technology Co.,Ltd.

Country or region before: China