Summary of the Invention
In view of this, embodiments of the utility model provide a robot that expands the forms of human-machine interaction available to the robot and improves the intelligence of that interaction.
An embodiment of the utility model provides a robot, including:
a detector arranged on the robot body, for detecting an interactive operation;
a central control element, for controlling a feedback device to perform a feedback behavior according to the interactive operation and the position of the detector corresponding to the interactive operation; and
a feedback device, for performing the feedback behavior.
Optionally, the detector includes at least one touch sensor and/or at least one vibration sensor arranged at different positions on the robot body;
the touch sensor is used to detect the direction and/or speed of a touch-and-slide operation;
the vibration sensor is used to detect the number and/or force of tap operations.
Optionally, the feedback device includes at least one of an audio player, a display screen, and a moving component.
Optionally, the feedback device includes the moving component, and the at least one touch sensor includes a touch sensor arranged under the display screen, for detecting the direction of a touch-and-slide operation;
the central control element is further configured to control the moving component according to the direction of the touch-and-slide operation, so that the robot walks in the corresponding direction.
Optionally, the robot also includes:
Memory for storing various feedback behavior.
Optionally, the robot further includes:
an image acquisition device arranged on the robot body, for acquiring an image of an interactive object; and
an image recognizer, for recognizing the image of the interactive object to obtain features of the interactive object;
the central control element is further configured to determine the feedback behavior according to the interactive operation, the trigger position of the interactive operation, and the features of the interactive object.
In the robot provided by the utility model, one or more detectors are arranged on the robot body to detect the interactive operations that the user triggers on the robot. Because different detector positions and different interactive operations can be preset to correspond to different feedback behaviors, after the central control element receives interactive operation information sent by a detector, it can determine the corresponding feedback behavior based on the interactive operation and the preset position of the corresponding detector, and control the feedback device to perform that feedback behavior. With this scheme, the robot can realize human-machine interaction based on both the interactive operation the user triggers on the body and the position where the operation is triggered, so that interaction is no longer limited to voice and touch-screen interaction, thereby expanding the robot's interactive forms.
Embodiments
To make the purpose, technical solutions, and advantages of the embodiments of the utility model clearer, the technical solutions in the embodiments of the utility model are described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the utility model. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the utility model without creative effort fall within the protection scope of the utility model.
Fig. 1 is a schematic structural diagram of Embodiment 1 of the robot provided by an embodiment of the utility model. As shown in Fig. 1, the robot includes a detector 10, a central control element 20, and a feedback device 30.
The detector 10 is arranged on the robot body 1 and is used to detect the user's interactive operation.
The central control element 20 is arranged inside the body 1 and is used to control the feedback device 30 to perform a feedback behavior according to the interactive operation detected by the detector 10 and the position of the detector 10 corresponding to that interactive operation.
The feedback device 30 is connected to the central control element 20 and performs the feedback behavior under the control of the central control element 20.
In the embodiments of the utility model, the central control element 20 may be implemented by one or more application-specific integrated circuits (ASIC), digital signal processors (DSP), digital signal processing devices (DSPD), programmable logic devices (PLD), field-programmable gate arrays (FPGA), microcontrollers, microprocessors, or other electronic components.
Optionally, in practical applications, the detector 10 includes at least one touch sensor 110 and/or at least one vibration sensor 120 arranged at different positions on the robot body 1. The touch sensor 110 is used to detect the direction and/or speed of a touch-and-slide operation; the vibration sensor 120 is used to detect the number and/or force of tap operations. The positions where detectors are arranged on the robot body 1 may be, but are not limited to, the head 11, the shoulders 12, the belly 13, and so on. For example, two vibration sensors 120 may be arranged on the shoulders 12, and multiple touch sensors 110 may be arranged on the head 11 and the belly 13, respectively. It can be understood that, to achieve personalized interaction, the positions of the detectors on the robot body may be determined from statistics on the operating positions habitually used by a large number of users, or may be set empirically.
It should be pointed out that the touch sensor 110 and the vibration sensor 120 are merely exemplary; other types of sensors are conceivable. Any sensor capable of detecting an interactive operation of human-machine interaction may likewise be included in the detector 10, belongs to the same concept as the utility model, and falls within the protection scope of the utility model.
In practical applications, in order to prompt the user as to which interactive operations can be performed on the robot, corresponding prompt information may be affixed near the positions where the detectors are arranged on the robot body, or prompt information may be shown on a display screen of the robot, such as a touch screen arranged on the robot's belly.
When the user has performed a corresponding interactive operation on the robot according to the prompt information, the detector 10 at the corresponding position detects the interactive operation information triggered by the user, such as the number and force of taps, and sends the detected interactive operation information to the central control element 20. Based on the received interactive operation information and the position of the detector 10, the central control element 20 determines the corresponding feedback behavior and controls the feedback device 30 to perform it, thereby feeding back to the user the robot's response to the triggered interactive operation.
Specifically, when transmitting the detected interactive operation information to the central control element 20, the detector 10 also sends its own identifier, such as its device model, to the central control element 20. Based on the device model, the central control element 20 can determine the position of the detector 10 on the robot body 1.
Specifically, in the embodiments of the utility model, a memory 40 may be provided in the robot, and a correspondence between detector identifiers and detector positions may be stored in the memory 40, so that the central control element 20 can perform the above determination of position.
Various feedback behaviors may also be stored in the memory 40, where the storage index of each feedback behavior may be expressed as an encoding of the detector position and the interactive operation information. Thus, after receiving the interactive operation information and determining the detector position, the central control element 20 encodes the interactive operation information (such as the direction and speed of a touch-and-slide operation, or the number and force of taps) together with the detector position to obtain a coding result, and uses that coding result as an index to look up the memory 40 and obtain the corresponding feedback behavior. It can therefore be understood that the central control element 20 includes the relevant components for implementing this encoding logic.
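The encoded-index lookup described above can be sketched as follows. This is a minimal illustration under assumed data, not the patented implementation: the device models, position names, encoding scheme, and feedback entries are all hypothetical.

```python
# Hypothetical sketch of the central control element's two-step lookup:
# detector identifier -> position, then (position, operation info) -> feedback behavior.

# Correspondence between detector identifiers (e.g. device models) and positions,
# as would be stored in the memory 40. All entries are illustrative.
DETECTOR_POSITIONS = {
    "VS-01": "left_shoulder",   # vibration sensor
    "VS-02": "right_shoulder",  # vibration sensor
    "TS-01": "belly",           # touch sensor
    "TS-02": "head",            # touch sensor
}

# Feedback behaviors indexed by the encoding of detector position and operation info.
FEEDBACK_BEHAVIORS = {
    ("belly", "slide", "up"): ("action", "walk_forward_50cm"),
    ("left_shoulder", "tap", 2): ("sound", "greeting.wav"),
}

def encode(position, op_type, op_value):
    """Encode detector position and interactive operation info into a storage index."""
    return (position, op_type, op_value)

def determine_feedback(detector_id, op_type, op_value):
    """Resolve the detector's position, encode the index, and look up the feedback behavior."""
    position = DETECTOR_POSITIONS[detector_id]
    return FEEDBACK_BEHAVIORS.get(encode(position, op_type, op_value))

print(determine_feedback("TS-01", "slide", "up"))  # ('action', 'walk_forward_50cm')
print(determine_feedback("VS-01", "tap", 2))       # ('sound', 'greeting.wav')
```

A tuple key stands in for the coding result here; any scheme that maps the same (position, operation) pair to the same index would serve.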
In this embodiment, the feedback behaviors through which the robot interacts with the user may include at least one of a sound feedback behavior, an image feedback behavior, and an action feedback behavior. That is, one or more kinds of feedback data, namely sounds, images, and action sequences, may be stored in the memory 40. Correspondingly, the feedback device 30 optionally includes at least one of an audio player 310, a display screen 320, and a moving component 330.
The audio player 310 is connected to the central control element 20 and, under its control, plays the sound used for feedback in response to a sound feedback behavior, so that the user promptly obtains interaction feedback audibly. The display screen 320 is connected to the central control element 20 and, under its control, displays the image used for feedback in response to an image feedback behavior, so that the user promptly obtains interaction feedback visually. The moving components 330 include but are not limited to the robot's legs, arms, and head; they are connected to the central control element 20 and, under its control, perform the action used for feedback in response to an action feedback behavior, so that the user promptly perceives interaction feedback from the robot's limb movements.
Several examples are given below to illustrate the implementation of interaction feedback in the embodiments of the utility model:
For example, the feedback device 30 includes the moving component 330, and the detector 10 includes multiple touch sensors 110 arranged on the robot's belly, such as multiple touch sensors 110 arranged under the display screen 320, for detecting the direction of a touch-and-slide operation. When the multiple touch sensors 110 detect that the user has triggered a slide operation in a certain direction, the central control element 20 controls the moving component 330 according to the detected slide direction, so that the robot walks in the corresponding direction. In this example, the multiple touch sensors 110 may be arranged under the display screen 320 or at positions adjacent to the display screen 320; when they are arranged under the display screen 320, the display screen 320 acquires a touch detection function, and the user can control the robot's walking through it.
Specifically, after the multiple touch sensors 110 (arranged, without limitation, on the belly 13) detect the user's slide operation and its direction, they send the direction of the slide operation and their own device identifiers to the central control element 20. The central control element 20 determines the position where the slide operation occurred according to the device identifiers, encodes that position together with the direction of the slide operation to obtain the storage index, and looks up the corresponding feedback behavior, such as "move forward 50 centimeters", where the forward direction corresponds to the direction of the slide operation. The central control element 20 then controls the legs of the moving component 330 so that the robot moves forward 50 centimeters; that is, the user's touch operation causes the robot to perform the action feedback behavior of "moving forward 50 centimeters".
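The slide-to-walk handling above can be sketched as follows. The direction names, the 50-centimeter distance, and the MovingComponent interface are illustrative assumptions, not the actual firmware interface.

```python
# Hypothetical sketch: mapping a detected slide direction on the belly touch
# sensors to a walking command for the moving component 330.

SLIDE_TO_HEADING = {
    "up": "forward",
    "down": "backward",
    "left": "left",
    "right": "right",
}

class MovingComponent:
    """Stand-in for the robot's legs; records the walk commands it receives."""
    def __init__(self):
        self.log = []

    def walk(self, heading, distance_cm):
        self.log.append((heading, distance_cm))

def on_slide(direction, legs, distance_cm=50):
    """Central-control handler: turn a slide direction into a walk command."""
    heading = SLIDE_TO_HEADING.get(direction)
    if heading is None:
        return None  # unrecognized gesture: no action feedback
    legs.walk(heading, distance_cm)
    return heading

legs = MovingComponent()
on_slide("up", legs)
print(legs.log)  # [('forward', 50)]
```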
As another example, the feedback device 30 includes the audio player 310, and the detector 10 includes at least one vibration sensor 120 arranged on the robot's shoulder, for detecting the user's tap operation. When the vibration sensor detects that the user has triggered a tap operation and has detected the number and force of the taps, it sends the detected information to the central control element 20; the central control element 20 determines the corresponding audio file according to the detected information and the position where the tap operation occurred, and controls the audio player 310 to play that audio file. In addition, the detector 10 may also include multiple touch sensors 110 arranged on the robot's belly, such as multiple touch sensors 110 arranged below the display screen 320, for detecting the direction of a touch-and-slide operation. When it is detected that the user has triggered a slide operation in a certain direction, the direction of the slide operation is determined, and the central control element 20 controls the volume at which the audio player 310 plays the audio file according to that direction. For example, when the detected slide direction is from the top of the display screen to the bottom, the playback volume of the audio player 310 is reduced; conversely, when the detected slide direction is from the bottom of the display screen to the top, the playback volume of the audio player 310 is increased. In this way, control of the feedback device 30 by the detector 10 is realized.
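The slide-direction volume control just described can be sketched as follows. The step size, the 0-100 volume range, and the AudioPlayer interface are illustrative assumptions.

```python
# Hypothetical sketch: a downward slide lowers the playback volume by one step,
# an upward slide raises it, with the level clamped to a fixed range.

class AudioPlayer:
    """Stand-in for the audio player 310 with a clamped volume level."""
    def __init__(self, volume=50):
        self.volume = volume

    def adjust(self, delta):
        # Keep the volume within the assumed 0-100 range.
        self.volume = max(0, min(100, self.volume + delta))

def on_volume_slide(direction, player, step=10):
    """Central-control handler: map a slide direction to a volume change."""
    if direction == "top_to_bottom":
        player.adjust(-step)
    elif direction == "bottom_to_top":
        player.adjust(+step)

player = AudioPlayer(volume=50)
on_volume_slide("top_to_bottom", player)
print(player.volume)  # 40
on_volume_slide("bottom_to_top", player)
on_volume_slide("bottom_to_top", player)
print(player.volume)  # 60
```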
As yet another example, the feedback device 30 includes the display screen 320, and the detector 10 includes multiple touch sensors 110 arranged on the robot's belly, for detecting the user's touch operation. When the multiple touch sensors 110 detect that the user has triggered a slide operation and have detected the slide speed, they send the detected information to the central control element 20. Based on the positions of the multiple touch sensors 110 and the detected touch information, the central control element 20 determines the corresponding feedback behavior, such as a picture file, and then controls the display screen 320 to display that picture file. The slide speed above is only an example; in practice, the detected touch operation information may also include, for example, touch duration, coverage area, and so on. The picture file is, for example, a smiley-face picture, a text prompt picture, or the like.
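The selection of an image feedback behavior from touch information can be sketched as follows. The speed threshold and picture file names are illustrative assumptions; touch duration or coverage area could be handled the same way.

```python
# Hypothetical sketch: choosing a picture file for the display screen 320
# from the detected slide speed.

def pick_picture(slide_speed_cm_s):
    """Map a detected slide speed to an image feedback behavior (picture file)."""
    if slide_speed_cm_s >= 20.0:
        return "surprised_face.png"  # fast slide: surprised expression
    return "smiley_face.png"         # gentle slide: smiling expression

print(pick_picture(5.0))   # smiley_face.png
print(pick_picture(30.0))  # surprised_face.png
```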
In this embodiment, one or more detectors are arranged on the robot body to detect the interactive operations that the user triggers on the robot. Because different detector positions and different interactive operation behaviors can be preset to correspond to different feedback behaviors, after the central control element receives interactive operation information sent by a detector, it can determine the corresponding feedback behavior based on the interactive operation and the preset position of the corresponding detector, and control the feedback device to perform that feedback behavior. With this scheme, the robot can realize human-machine interaction based on both the interactive operation behavior the user triggers on the body and the position of that operation, so that human-machine interaction is no longer limited to voice and touch interaction on a display screen, thereby expanding the robot's interactive forms.
Fig. 2 is a schematic structural diagram of Embodiment 2 of the robot provided by an embodiment of the utility model. As shown in Fig. 2, on the basis of the embodiment shown in Fig. 1, the robot further includes an image acquisition device 50 and an image recognizer 60.
The image acquisition device 50 is arranged on the robot body 1 and is used to acquire an image of the interactive object; the image recognizer 60 is connected to the central control element 20 and the image acquisition device 50 and is used to recognize the image of the interactive object, so as to obtain features of the interactive object.
The image acquisition device 50 includes but is not limited to a camera 510, which is generally arranged at a high position on the robot body 1, such as the head 11, and is specifically used to acquire image information of the user interacting with the robot, that is, image information of the interactive object. In this embodiment, the image information of the interactive object may be acquired by photographing or filming the interactive object; once acquired, the image information is sent to the image recognizer 60.
The image recognizer 60 may be an electronic device with an image recognition function, specifically used to recognize the image of the interactive object so as to obtain its features. In this embodiment, the features of the interactive object include but are not limited to the sex of the interactive object (male, female), the age of the interactive object (adult, child), and so on.
In this embodiment, targeted interaction feedback can be achieved by recognizing the features of the interactive object. Specifically, in addition to determining the feedback behavior based on the interactive operation detected by the detector 10 and the trigger position of that operation, the central control element 20 also determines the feedback behavior in combination with the features of the interactive object, thereby achieving interaction feedback personalized to the interactive object.
For example, a child stands in front of the robot and touches its belly (triggering a touch interactive operation). The image acquisition device 50 photographs the child, and the image recognizer 60 recognizes the captured photo and obtains the feature of the interactive object: a child. The central control element 20 then, according to the touch interactive operation detected by the touch sensor 110 arranged on the robot's belly 13 and the specific position where the touch interactive operation occurred (the belly 13), combined with the feature of the interactive object, controls the relevant feedback device 30 to perform the corresponding feedback behavior, for example, producing a laugh, saying the dialogue "Hello, little friend", and extending an arm to shake hands. In this embodiment, based on the recognition of the interactive object's features, the detection of the interactive operation triggered by the interactive object, and the determination of the position of that operation, targeted interaction feedback can be realized, improving the interactive intelligence of the robot.
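The feature-aware determination described in this example can be sketched as follows: the feedback behavior is keyed on the interactive operation, its trigger position, and the recognized feature of the interactive object. All table entries, file names, and action names are illustrative assumptions.

```python
# Hypothetical sketch of personalized feedback: the lookup key combines the
# operation type, its trigger position, and the recognized object feature.

PERSONALIZED_FEEDBACK = {
    ("touch", "belly", "child"): [
        ("sound", "laugh.wav"),
        ("sound", "hello_little_friend.wav"),
        ("action", "extend_arm_handshake"),
    ],
    ("touch", "belly", "adult"): [
        ("sound", "hello.wav"),
    ],
}

def determine_personalized_feedback(operation, position, feature):
    """Return the list of feedback behaviors for this operation, position, and feature."""
    return PERSONALIZED_FEEDBACK.get((operation, position, feature), [])

behaviors = determine_personalized_feedback("touch", "belly", "child")
print(len(behaviors))  # 3
```

The same touch at the same position thus yields different feedback for a child than for an adult, which is the personalization the embodiment describes.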
Through the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and of course also by hardware. Based on this understanding, the above technical solutions, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in each embodiment or in some parts of the embodiments.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the utility model, not to limit them. Although the utility model has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or replace some of the technical features with equivalents; and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the utility model.