Specific embodiments
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
Fig. 1 is a structural schematic diagram of robot embodiment one provided by the embodiments of the present application. As shown in Fig. 1, the robot includes: a fuselage body 10, a head assembly 11, a position detector 12, and a controller 13.
The head assembly 11 is rotatably connected to the fuselage body 10. The position detector 12 may be mounted on the fuselage body 10 or on the head assembly 11; the embodiments of the present application impose no restriction in this respect. In the following description, the technical solution of the embodiments of the present application is explained in detail taking the case where the position detector 12 is mounted on the fuselage body 10 as an example.
The position detector 12 is configured to detect the orientation of an interactive object; the controller 13 is configured to control the rotation of the head assembly so that the head assembly faces the orientation of the interactive object.
Optionally, the controller 13 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), microcontrollers, microprocessors, or other electronic components.
In the embodiments of the present application, the position detector 12 is used to obtain location information of the interactive object. Optionally, the location information may include the distance of the interactive object relative to the robot and/or the orientation of the interactive object relative to the robot. The position detector 12 may be one or more of a microphone array, an image recognizer, a range sensor, an infrared sensor, a laser sensor, and an ultrasonic sensor.
Optionally, a microphone array (MIC array) is provided on the fuselage body 10. A MIC array is a group of omnidirectional microphones located at different positions in space and arranged in a regular geometric pattern; it is a device for spatially sampling a spatial sound signal, so the collected signal contains spatial position information about the source. In the embodiments of the present application, the location information of the interactive object is obtained through sound source localization by the MIC array, so that the controller 13, after receiving the location information, configures the rotation parameters of the head assembly 11. For example, if the MIC array detects that the interactive object is within a small angular range on the left side of the robot body, the current rotation parameter of the head assembly may be set to turn left by 5 degrees; if the MIC array detects that the interactive object is within a large angular range on the left side of the robot body, the current rotation parameter of the fuselage body may be set to turn left by 15-30 degrees.
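For illustration only (not part of the application), the mapping from a coarse MIC-array localization result to a rotation command could be sketched as follows. The 10-degree threshold and the function name are assumptions; the 5-degree and 15-30-degree values mirror the example above:

```python
def rotation_command(azimuth_deg):
    """Map a sound-source azimuth (negative = left of the robot body,
    positive = right) to a coarse rotation command: small offsets move
    only the head assembly, large offsets move the fuselage body."""
    side = "left" if azimuth_deg < 0 else "right"
    magnitude = abs(azimuth_deg)
    if magnitude <= 10:                # small angular range -> head assembly
        return ("head", side, 5)
    else:                              # large angular range -> fuselage body
        return ("fuselage", side, 20)  # e.g. within the 15-30 degree band

print(rotation_command(-7))   # small range on the left
print(rotation_command(-40))  # large range on the left
```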
The image recognizer may be an integrated device with capabilities such as capturing, image recognition, and computation. By acquiring images or a video stream containing a face with a video camera or camera, and detecting and tracking the face in the images, the interactive object can be found within the visual range of the camera and its location information obtained. Based on the influence that the shooting angle of the image recognizer relative to the interactive object has on the imaging effect, the location information of the interactive object relative to the robot can be determined by computation on the captured image; for the computation process, reference may be made to the related art, which is not repeated in this embodiment.
The range sensor, infrared sensor, laser sensor, and ultrasonic sensor are position detectors that achieve localization through distance measurement and relative orientation angles. In general, detection of the above-mentioned distance and orientation angles can be realized by combining multiple sensors. The working process is as follows: multiple position detectors simultaneously emit position detection signals; the signals are reflected after reaching the measured object; upon receiving the reflected signals, the position detectors record the round-trip time of the position detection signals, and the location information of the measured object is calculated from the propagation speed of the signals; at the same time, the orientation of the measured object is obtained by comprehensive analysis of the distance detection results of the multiple range sensors.
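The round-trip-time calculation described above reduces to distance = propagation speed × round-trip time / 2. A minimal sketch, assuming an ultrasonic sensor and a sound speed of about 343 m/s (the function name is illustrative, not from the application):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def distance_from_round_trip(round_trip_s, speed=SPEED_OF_SOUND):
    """The detection signal travels to the measured object and back, so
    the one-way distance is half the path covered in the round-trip time."""
    return speed * round_trip_s / 2.0

# A 10 ms round trip corresponds to about 1.715 m.
print(distance_from_round_trip(0.010))
```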
In this embodiment, the head assembly 11 and the fuselage body 10 of the robot are rotatably connected. Specifically, the connection structure may consist of a stepper motor, a reducer, and a gear assembly: the stepper motor drives the gear, and the driven gear in turn drives the head assembly to rotate, so that the controller 13 can control the rotation of the robot head assembly by controlling the stepper motor.
After receiving the orientation of the interactive object, the controller 13 judges whether the orientation of the interactive object falls within a preset orientation range. When the orientation of the interactive object falls within the preset orientation range, the controller 13 determines, according to the orientation of the interactive object, the direction in which the head assembly 11 should rotate and the angle by which it should rotate. The preset orientation range can be understood as follows: the angle between the interactive object and the robot belongs to a certain value range, and this value range can be adjusted according to user requirements and the rotation characteristics of the robot head assembly.
For example, taking the front of the fuselage body 10 of the robot as 0 degrees, the above range values can be labeled as follows: the front is labeled +0°, the direction 90 degrees to the right (directly right) is labeled +90°, the direction 90 degrees to the left (directly left) is labeled -90°, and the direction 180 degrees to the left/right (directly behind) is labeled -0°. The value range of the preset orientation range may then be [-90°, +0] or [+0, 90°]. That is, when the angle between the interactive object and the robot lies within [-90°, +0] or [+0, 90°], the first adjustment parameter is determined. It should be noted that the "-" and "+" signs above only denote orientations (front, rear, left, right) and carry no mathematical meaning of positive or negative. The use of exact numbers for the above angles is only an example; in practical applications, any angle range may be used, and the embodiments of the present application impose no restriction.
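For illustration only (not part of the application), the preset-orientation-range check can be modeled with an ordinary signed-degree convention (front = 0°, right positive, left negative, values normalized to [-180°, 180°)) rather than the application's ±0° labeling; the function and parameter names are hypothetical:

```python
def within_preset_range(azimuth_deg, half_width_deg=90.0):
    """True if the interactive object lies in the frontal sector
    [-half_width, +half_width] of the fuselage body.  Azimuths use a
    plain signed convention: front = 0, right positive, left negative,
    normalized to [-180, 180)."""
    azimuth_deg = (azimuth_deg + 180.0) % 360.0 - 180.0
    return -half_width_deg <= azimuth_deg <= half_width_deg

print(within_preset_range(-75))   # in the frontal sector: rotate the head assembly
print(within_preset_range(-125))  # outside the sector: another mechanism is needed
```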
It is worth noting that, in the embodiments of the present application, in order to preserve the robot's appearance and keep its features closer to those of a "person", the orientation information above (front, directly behind, directly left, directly right) is defined relative to the fuselage body 10 of the robot, so as to prevent an overly exaggerated rotation of the robot head from making the interactive object uncomfortable.
Specifically, the controller 13 may control the robot head assembly 11 by setting a first adjustment parameter for the head assembly 11, where the first adjustment parameter includes the adjustment direction and adjustment angle of the robot head assembly. After determining the first adjustment parameter, the controller 13 controls the head assembly 11 of the robot to rotate according to the first adjustment parameter, so that the robot can be made to face the interactive object according to the actual position of the interactive object for human-computer interaction, improving the user experience.
Optionally, the control of the head assembly 11 by the controller 13 based on the orientation of the interactive object may be realized on the basis of a preset correspondence between different orientations of the interactive object and different first adjustment parameters. Preferably, a linear relationship may be set between the orientation of the interactive object and the first adjustment parameter. For example, when the interactive object is within [-90°, +0] or [+0, 90°] of the robot, the rotation of the robot head can be adjusted within [-90°, +0] or [+0, 90°] according to the specific position of the interactive object. For example, if the interactive object is located in the -75° direction to the front-left of the robot, the controller 13 controls the rotation target direction of the head assembly 11 to be the -75° direction; if the robot currently faces -25°, the controller 13 need only control the robot to turn left by 50° to rotate to the target direction and interact with the interactive object face to face.
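The -75°/-25° example above amounts to subtracting the current heading from the target heading. A sketch under a plain signed-degree convention (negative = turn left, positive = turn right); the function name is illustrative:

```python
def head_turn(target_deg, current_deg):
    """Return the signed rotation (negative = left, positive = right)
    that takes the head from its current heading to the target heading,
    choosing the shorter direction around the circle."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0

# Target -75 deg while currently facing -25 deg -> turn left 50 deg.
print(head_turn(-75, -25))
```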
Consider the following scene: while the robot is in dialogue with a first interactive object at 25° to the front-left (-25°), the fuselage body 10 of the robot faces the front (+0°) and the face points in the -25° direction. In this situation, the position detector 12 detects a second interactive object at 20° to the front-right (+20°). The controller 13 calculates the angle difference between the current direction of the robot head assembly and the target direction of the next rotation, and determines the first adjustment parameter of the robot head assembly accordingly. For example, in the above scene, the controller 13 determines that the first adjustment parameter of the robot head assembly is to turn right by 45° (+45°). Alternatively, after the controller 13 controls the robot head assembly 11 to return to its home position (aligned with the robot fuselage body), it determines that the first adjustment parameter of the robot head assembly is to turn right by 20° (+20°).
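The two alternatives in the scene above (rotate directly from -25° to +20°, or first home the head to +0° and then rotate to +20°) can be sketched as follows; the names are illustrative and the convention is plain signed degrees:

```python
def direct_turn(target_deg, current_deg):
    """Single rotation from the current head heading to the target."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0

def home_then_turn(target_deg, current_deg, home_deg=0.0):
    """First return the head to the home position (aligned with the
    fuselage body), then rotate from home to the target; returns the
    two successive rotations."""
    return (direct_turn(home_deg, current_deg), direct_turn(target_deg, home_deg))

print(direct_turn(20, -25))     # +45: turn right 45 degrees in one motion
print(home_then_turn(20, -25))  # (+25, +20): home first, then turn right 20
```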
In practical applications, in addition to achieving the interactive effect of facing the interactive object by controlling the rotation of the head assembly 11, the entire fuselage body of the robot may also be further controlled to face the interactive object in order to better approximate an anthropomorphic interaction.
Thus, in an optional embodiment, as shown in Fig. 2, the robot of the embodiments of the present application may further include a walking chassis 14. The walking chassis 14 may be installed at the foot position of the robot body and, under the control of the controller 13, can drive the fuselage body of the robot to rotate by a fixed angle toward a specific direction, or move and walk along a path under the control of the controller 13. The walking chassis 14 may realize the rotation function through a structure composed of a stepper motor, a reducer, and a gear assembly, and may realize the movable walking function through a movable frame composed of a stepper motor and rolling wheels; of course, the embodiments of the present application impose no restriction here.
It should be understood that when the purpose of making the robot face the interactive object is achieved by controlling the walking chassis 14, the head assembly 11 may be kept fixed relative to the fuselage body 10, and the purpose achieved solely by controlling the rotation and movement of the walking chassis 14. Of course, the head assembly 11 and the walking chassis 14 may also be controlled simultaneously to the same end.
In the case where the robot is made to face the interactive object by controlling the walking chassis 14: after the controller 13 receives the orientation of the interactive object, if it judges that the orientation does not fall within the preset orientation range, the controller 13 determines a second adjustment parameter of the walking chassis according to the orientation, where the second adjustment parameter includes the motion track, rotation angle, and rotation direction of the walking chassis 14. Continuing the example above, when the value range of the preset orientation range is [-90°, +0] or [+0, 90°], if the angle between the interactive object and the robot is judged to be outside [-90°, +0] and [+0, 90°], for example within [-90°, -0°] or [-0°, +90°], the controller 13 determines the second adjustment parameter.
Optionally, the control of the walking chassis 14 by the controller 13 based on the orientation of the interactive object may be realized on the basis of a preset correspondence between different orientations of the interactive object and different second adjustment parameters. Preferably, a linear relationship may be set between the orientation of the interactive object and the second adjustment parameter, that is, the orientation value of the interactive object and the orientation value of the robot rotation correspond one to one. For example, if the interactive object is located in the -125° direction to the rear-left of the robot, the controller 13 controls the rotation target direction of the walking chassis 14 to be the -125° direction; if the robot currently faces -25°, the controller 13 need only control the walking chassis of the robot to turn left by 100° to rotate to the target direction and interact with the interactive object face to face.
Optionally, determining the first adjustment parameter and the second adjustment parameter according to the orientation of the interactive object may also take the following feasible form: partition the value of the orientation angle between the robot and the interactive object, and set a fixed rotation parameter of the head assembly or the walking chassis for each partition. For example, continuing the example above, the orientation angle is divided into four intervals: [-90°, +0], [+0, 90°], [-90°, -0], [-0, 90°]. When the interactive object is within [-90°, +0], the adjustment parameter of the robot head assembly is set to rotate 45° to the left (-45°); when the interactive object is within [+0, 90°], the adjustment parameter of the robot head assembly is set to rotate 45° to the right (+45°); when the interactive object is within [-90°, -0], the adjustment parameter of the robot walking chassis is set to 135° to the left (-135°); when the interactive object is within [-0, 90°], the adjustment parameter of the robot walking chassis is set to 135° to the right (+135°). Of course, the above values are only examples and do not constitute a limitation on the embodiments of the present application.
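For illustration only (not part of the application), the four-interval partition above can be modeled as a lookup keyed on the sector containing the interactive object. This sketch uses plain signed degrees (front 0°, rear ±180°), so the application's rear intervals [-90°, -0] and [-0, 90°] are represented here as [-180°, -90°) and (90°, 180°]; all names and values mirror the example:

```python
def fixed_adjustment(azimuth_deg):
    """Return (actuator, signed rotation) for the sector containing the
    interactive object: head +/-45 deg for the frontal sectors, walking
    chassis +/-135 deg for the rear sectors."""
    a = (azimuth_deg + 180.0) % 360.0 - 180.0  # normalize to [-180, 180)
    if -90.0 <= a <= 0.0:
        return ("head", -45.0)
    if 0.0 < a <= 90.0:
        return ("head", +45.0)
    if a < -90.0:
        return ("chassis", -135.0)
    return ("chassis", +135.0)

print(fixed_adjustment(-30))   # frontal left -> head turns left 45
print(fixed_adjustment(-170))  # rear left    -> chassis turns left 135
```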
In particular, when the orientation angle between the robot and the interactive object detected by the position detector 12 is within [-90°, -0] or [-0, 90°] and the head assembly 11 of the robot already has a certain rotation angle, the controller 13 first controls the robot head assembly 11 to return to its home position and then determines the second adjustment parameter of the walking chassis 14. Thus, after the walking chassis 14 drives the robot body to rotate to the target angle, the face of the robot can directly face the interactive object, bringing the user a good interactive experience. Of course, the homing process of the robot head assembly 11 may also be carried out simultaneously with the rotation process of the walking chassis 14; the embodiments of the present application impose no restriction here.
Optionally, in order to further improve the control precision of making the robot face the interactive object, the embodiments of the present application also provide a scheme for accurately adjusting the robot rotation based on detection of the face region of the interactive object. Specifically, an image recognizer may be provided on the robot fuselage body to acquire and recognize the face region of the interactive object. After the location information of the interactive object, i.e., its distance and orientation, is measured by a position detector 12 such as a MIC array, if the image recognizer recognizes the face region of the interactive object, the face position of the interactive object, i.e., its orientation and distance relative to the robot, is further confirmed accurately, so that the result detected by the position detector can be fine-tuned and corrected to obtain more accurate location information of the interactive object.
In one situation, after the position detector 12 determines the orientation information of the interactive object, if the image recognizer does not recognize the face region of the interactive object, it can be judged that the interactive object is facing sideways or away from the robot. At this point, the controller 13 determines the second adjustment parameter of the walking chassis 14. Preferably, the second adjustment parameter may be set as follows: taking the interactive object as the center and the distance between the robot and the interactive object as the radius, a circle is drawn as the motion track of the walking chassis 14, which the chassis follows until the image recognizer recognizes the face region of the interactive object.
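The search track described above, circling the interactive object at the current distance until a face is seen, can be sketched as a sequence of waypoints on that circle (a geometric illustration only; the names are hypothetical):

```python
import math

def circle_track(center_xy, radius, steps=8):
    """Waypoints on a circle of the given radius around the interactive
    object; the walking chassis would follow these until the image
    recognizer reports a face region."""
    cx, cy = center_xy
    return [(cx + radius * math.cos(2 * math.pi * k / steps),
             cy + radius * math.sin(2 * math.pi * k / steps))
            for k in range(steps)]

track = circle_track((0.0, 0.0), 1.5, steps=4)
print(track)  # 4 waypoints, each 1.5 m from the object
```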
Optionally, the robot of the embodiments of the present application further includes an audio player 15, which is used to play voice signals during a voice dialogue between the robot and the interactive object. Correspondingly, after obtaining the distance of the interactive object relative to the robot, the controller 13 controls the volume of the audio player 15 according to the distance information of the interactive object. For example, when the interactive object is far away, sound transmission attenuation is large, and the controller 13 can increase the volume of the audio player 15; conversely, when the interactive object is nearby, sound transmission attenuation is small, and the controller 13 can decrease the volume of the audio player 15. Thus, the robot can communicate with the interactive object at a reasonable volume, further improving user comfort during interaction.
Optionally, the robot of the embodiments of the present application further includes a noise detector 16, which may be implemented by a sound pressure meter or a MIC array. The sound of the audio player 15 is easily affected by noise in the environment; therefore, before controlling the volume of the audio player 15, the controller 13 first obtains the noise detection result of the noise detector 16, and then controls the volume of the audio player according to the detection result and the distance of the interactive object relative to the robot detected by the position detector. This further improves the precision of volume control and the human-computer interaction capability of the robot.
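A minimal sketch of distance- and noise-aware volume control follows; the linear gain model and all constants are assumptions for illustration, not values from the application:

```python
def player_volume(distance_m, noise_db,
                  base=30.0, per_meter=10.0, per_noise_db=0.5, max_volume=100.0):
    """Raise the playback volume with distance (to offset transmission
    attenuation) and with ambient noise measured by the noise detector,
    clamped to the player's range."""
    volume = base + per_meter * distance_m + per_noise_db * noise_db
    return max(0.0, min(max_volume, volume))

print(player_volume(1.0, 40.0))  # nearby, moderate noise
print(player_volume(4.0, 70.0))  # far away, loud environment -> louder (clamped)
```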
In this embodiment, the orientation of the interactive object is detected by the position detector, and the controller determines the adjustment parameters of the robot head assembly and walking chassis based on the orientation information, so as to adjust the rotation directions of the robot head assembly and walking chassis. This enables the head assembly and walking chassis of the robot to rotate and/or move in different directions according to changes in the position of the interactive object. With the robot directly facing the interactive object, the sound of the interactive object can be acquired directionally, improving speech collection quality and thereby achieving better human-computer interaction and enhancing the robot's human-computer interaction capability. At the same time, the controller determines the volume of the robot's audio player based on the distance of the interactive object relative to the robot and the noise level of the human-computer interaction environment, further improving the robot's interaction capability.
Fig. 3 is a flowchart of robot control method embodiment one provided by the embodiments of the present application. As shown in Fig. 3, the method includes the following steps:
Step 101: detect the orientation of the interactive object.
Step 102: judge whether the orientation falls within the preset orientation range; if so, execute step 103; otherwise, execute step 104.
Step 103: control the head assembly of the robot to rotate so that the head assembly of the robot faces the orientation of the interactive object.
Step 104: control the fuselage body of the robot to rotate and/or move so that the head assembly of the robot faces the orientation of the interactive object.
In this embodiment, for the detection of the orientation information of the interactive object and the adjustment process of the robot head assembly, reference may be made to the description of the embodiment shown in Fig. 1, and details are not repeated here.
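For illustration only (not part of the application), steps 101-104 can be sketched as a single dispatch function; the actuator command names are placeholders and the angle convention is plain signed degrees:

```python
def control_step(azimuth_deg, half_width_deg=90.0):
    """Steps 101-104: given the detected orientation of the interactive
    object, rotate the head assembly when the orientation is within the
    preset range, otherwise rotate/move the fuselage body."""
    a = (azimuth_deg + 180.0) % 360.0 - 180.0    # step 101 result, normalized
    if -half_width_deg <= a <= half_width_deg:   # step 102
        return ("rotate_head", a)                # step 103
    return ("rotate_body", a)                    # step 104

print(control_step(-60))  # within range: head assembly rotates
print(control_step(150))  # out of range: fuselage body rotates/moves
```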
In this embodiment, the orientation of the interactive object relative to the robot is detected by the position detector, and the controller adjusts the head assembly and/or walking chassis of the robot based on the orientation information, so that during human-computer interaction the head assembly of the robot can be face-to-face with the interactive object, enhancing the robot's human-computer interaction capability.
Fig. 4 is a flowchart of robot control method embodiment two provided by the embodiments of the present application. As shown in Fig. 4, the robot control method includes the following steps:
Step 201: detect the distance of the interactive object relative to the robot and/or the noise decibel value within a preset range.
Step 202: determine the volume of the robot's audio player according to the distance and/or the noise decibel value within the preset range.
Step 203: control the audio player to adjust the volume according to the determined volume.
In this embodiment, for the detection process above and the control of the volume of the audio player, reference may be made to the description of the embodiment shown in Fig. 1, and details are not repeated here.
It should be noted that the devices corresponding to Fig. 1 and Fig. 2 can execute any one of, or any combination of, the embodiments corresponding to Fig. 3 and Fig. 4. For example, the devices corresponding to Fig. 1 and Fig. 2 can execute a combined embodiment of Fig. 3 and Fig. 4. In one situation, the orientation of the interactive object is detected by the position detector, and the controller determines the first adjustment parameter of the robot head assembly based on the orientation, so as to adjust the head assembly of the robot and enable it to rotate in different directions according to changes in the orientation of the interactive object. In another situation, the orientation of the interactive object is detected by the position detector, and the controller determines the second adjustment parameter of the robot walking chassis based on the orientation, so as to adjust the walking chassis of the robot and enable it to rotate and move in different directions according to changes in the position of the interactive object. At the same time, the controller controls the volume of the robot's audio player based on the distance of the interactive object relative to the robot detected by the position detector. Executing the combination of the above embodiments further enhances the robot's human-computer interaction capability and improves the user experience.
The device embodiments described above are merely illustrative, where the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement them without creative effort.
Through the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be realized by means of software plus a necessary general hardware platform, and naturally also by hardware. Based on this understanding, the above technical solutions, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product may be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in each embodiment or in certain parts of the embodiments.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some of the technical features therein; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.