CN106335071B - Robot and robot control method - Google Patents

Robot and robot control method

Info

Publication number
CN106335071B
CN106335071B (application CN201611000091.4A / CN201611000091A)
Authority
CN
China
Prior art keywords
robot
interactive object
orientation
head assembly
adjustment parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611000091.4A
Other languages
Chinese (zh)
Other versions
CN106335071A (en)
Inventor
蒋化冰
孙斌
吴礼银
康力方
李小山
张干
赵亮
邹武林
徐浩明
廖凯
齐鹏举
方园
李兰
米万珠
舒剑
吴琨
管伟
罗璇
罗承雄
张海建
马晨星
张俊杰
谭舟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Mrobot Technology Co ltd
Shanghai Mumu Jucong Robot Technology Co ltd
Original Assignee
Shanghai Mumuju Fir Robot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Mumuju Fir Robot Technology Co Ltd filed Critical Shanghai Mumuju Fir Robot Technology Co Ltd
Priority to CN201611000091.4A priority Critical patent/CN106335071B/en
Publication of CN106335071A publication Critical patent/CN106335071A/en
Application granted granted Critical
Publication of CN106335071B publication Critical patent/CN106335071B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 — Controls for manipulators
    • B25J13/08 — Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088 — Controls for manipulators by means of sensing devices with position, velocity or acceleration sensors
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 — Manipulators not otherwise provided for
    • B25J11/0005 — Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 — Programme-controlled manipulators
    • B25J9/16 — Programme controls
    • B25J9/1656 — Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671 — Programme controls characterised by simulation, either to verify an existing program or to create and verify a new program; CAD/CAM-oriented, graphic-oriented programming systems

Abstract

This application discloses a robot and a robot control method. The robot includes a fuselage body, a head assembly, a position detector, and a controller. The head assembly is rotatably connected to the fuselage body; the position detector detects the orientation where an interactive object is located; and the controller controls rotation of the head assembly so that the head assembly faces that orientation. With this solution, the robot's head assembly can turn in the corresponding direction as the interactive object's position changes, extending the robot's human-machine interaction capability.

Description

Robot and robot control method
Technical field
The embodiments of the present invention relate to the field of mobile robots, and in particular to a robot and a robot control method.
Background
In recent years, robotics and artificial-intelligence research have advanced continuously, and intelligent robots play an increasingly important role in human life. As people's needs grow, more humanized robots will gradually become the favorites of the robotics field.
Users hope that robots can be more humanized — in particular, that robots can come closer to the features of a "person" during interaction with humans. At present, however, the interactive functions of most robots are relatively simple: most robots merely execute corresponding actions based on a user's voice or touch-input commands.
Summary of the invention
In view of this, the embodiments of the present application provide a robot that expands the robot's human-machine interaction capability.
An embodiment of the present application provides a robot, comprising: a fuselage body, a head assembly, a position detector, and a controller;
wherein the head assembly is rotatably connected to the fuselage body;
the position detector is configured to detect the orientation where an interactive object is located; and
the controller is configured to control rotation of the head assembly so that the head assembly faces the orientation where the interactive object is located.
An embodiment of the present application also provides a robot control method, comprising:
detecting the orientation of an interactive object relative to the robot; and
if the orientation is within a pre-configured orientation range, controlling rotation of the robot's head assembly so that the head assembly faces the orientation where the interactive object is located.
In the robot and robot control method provided by the present application, the position detector detects the orientation where the interactive object is located, and the controller determines the rotation direction and rotation angle of the robot's head assembly based on that orientation. The head assembly can therefore rotate in different directions as the interactive object's position changes, extending the robot's human-machine interaction capability.
Brief description of the drawings
To explain the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a structural schematic diagram of robot embodiment one provided by an embodiment of the present application;
Fig. 2 is another structural schematic diagram of robot embodiment one provided by an embodiment of the present application;
Fig. 3 is a flowchart of robot control method embodiment one provided by an embodiment of the present application;
Fig. 4 is a flowchart of robot control method embodiment two provided by an embodiment of the present application.
Detailed description of embodiments
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Fig. 1 is a structural schematic diagram of robot embodiment one provided by an embodiment of the present application. As shown in Fig. 1, the robot includes: a fuselage body 10, a head assembly 11, a position detector 12, and a controller 13.
The head assembly 11 is rotatably connected to the fuselage body 10, and the position detector 12 may be mounted on the fuselage body 10 or on the head assembly 11; the embodiment of the present application places no restriction on this. In the detailed description that follows, the technical solution is explained taking the case where the position detector 12 is mounted on the fuselage body 10 as an example.
The position detector 12 detects the orientation where the interactive object is located; the controller 13 controls rotation of the head assembly so that the head assembly faces that orientation.
Optionally, the controller 13 may be implemented with one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), microcontrollers, microprocessors, or other electronic components.
In the embodiment of the present application, the position detector 12 obtains position information of the interactive object. Optionally, the position information may include the distance of the interactive object relative to the robot and/or the orientation of the interactive object relative to the robot. The position detector 12 may be one or more of a microphone array, an image recognizer, a range sensor, an infrared sensor, a laser sensor, and an ultrasonic sensor.
Optionally, a microphone array (MIC array) is provided on the fuselage body 10. A MIC array is a group of omnidirectional microphones located at different points in space and arranged in a regular geometry; it samples an acoustic signal spatially, so the collected signal carries the source's spatial position information. In the embodiment of the present application, the position information of the interactive object is obtained by sound-source localization with the MIC array, and the controller 13, after receiving that position information, configures the rotation parameters of the head assembly 11. For example, if the MIC array detects that the interactive object lies within a small angle range to the left of the robot body, the head assembly's rotation may be set to turn left through, say, 5 degrees; if the MIC array detects that the interactive object lies within a large angle range to the left, the fuselage body's rotation may be set to turn left through, say, 15-30 degrees.
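The mapping from a MIC-array localization result to a rotation command can be sketched as below. This is a minimal illustration, not part of the patent: the function name, the 10° small-angle threshold, and the 30° cap are assumptions for the example; only the 5° small-correction value comes from the text.

```python
def head_rotation_for_azimuth(azimuth_deg, small_angle=10.0):
    """Map a detected azimuth (negative = left, positive = right, 0 = front)
    to a (direction, angle) rotation command. Thresholds are illustrative."""
    direction = "left" if azimuth_deg < 0 else "right"
    if abs(azimuth_deg) <= small_angle:
        angle = 5.0  # small correction for near-frontal sources, as in the text
    else:
        angle = min(abs(azimuth_deg), 30.0)  # cap large turns at 30 degrees
    return direction, angle
```

In practice the thresholds would be tuned to the localization accuracy of the particular array.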
The image recognizer may be an integrated device with capture, image recognition, and computation capabilities. By acquiring images or a video stream containing a face with a video camera or camera, and detecting and tracking the face in the images, the interactive object can be found within the camera's visual range and its position information obtained. Because the shooting angle of the image recognizer relative to the interactive object affects how the image is formed, the position of the interactive object relative to the robot can be determined by computation on the captured images; that computation can be realized with related existing techniques and is not repeated in this embodiment.
The range sensor, infrared sensor, laser sensor, and ultrasonic sensor are position detectors that realize localization through distance measurement and relative bearing angles. In general, the detection of distance and bearing angle can be realized by combining multiple such sensors. The working process is as follows: multiple position detectors simultaneously emit a detection signal; the signal is reflected when it reaches the measured object; each detector records the signal's round-trip time after receiving the echo, and the distance to the measured object is computed from the signal's propagation speed. At the same time, the orientation of the measured object is obtained by jointly analyzing the distance results of the multiple sensors.
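The distance computation described above — half the round-trip time multiplied by the propagation speed — can be written out directly. A minimal sketch; the function name is invented, and the speed of sound is used as the propagation speed on the assumption of an ultrasonic sensor.

```python
SPEED_OF_SOUND = 343.0  # m/s, approximate value in air at room temperature

def echo_distance(round_trip_s, speed=SPEED_OF_SOUND):
    """Distance from a time-of-flight echo: the signal travels out and back,
    so the one-way distance is half the round trip."""
    return speed * round_trip_s / 2.0
```

For a laser or infrared sensor the same formula applies with the speed of light substituted.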
In the present embodiment, the head assembly 11 is rotatably connected to the fuselage body 10. Specifically, the connecting structure may consist of a stepper motor, a reducer, and a gear assembly: the stepper motor drives a gear, and the driven gear in turn rotates the head assembly, so the controller 13 can control rotation of the robot's head assembly by controlling the stepper motor.
After the controller 13 receives the orientation where the interactive object is located, it judges whether that orientation meets a pre-configured orientation range. When it does, the controller 13 determines, from the orientation, the direction in which the head assembly 11 should rotate and the angle through which it should rotate. The pre-configured orientation range can be understood as follows: the angle between the interactive object and the robot belongs to a certain value range, which can be adjusted according to user needs and the rotation capability of the robot's head assembly.
For example, taking the front of the robot's fuselage body 10 as the reference, the range values may be labeled as follows: directly ahead is labeled +0°, 90 degrees to the right (directly right) is labeled +90°, 90 degrees to the left (directly left) is labeled -90°, and 180 degrees to either side (directly behind) is labeled -0°. The value range in the pre-configured orientation range may then be [-90°, +0] or [+0, 90°]; that is, the first adjustment parameter is determined when the angle between the interactive object and the robot lies within [-90°, +0] or [+0, 90°]. It should be noted that the "-" and "+" symbols above only denote front/rear and left/right orientations and do not carry their mathematical meaning. The exact angle values above are only examples; in practice, any angle range can be realized, and the embodiment of the present application places no restriction on this.
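The range check described above can be sketched as follows. Because the patent's +0/-0 labels distinguish the front and rear hemispheres — something an ordinary number cannot express — this hypothetical helper carries the hemisphere as a separate flag; the function name and flag are assumptions for illustration only.

```python
def choose_adjustment(azimuth_deg, in_front):
    """Pick which adjustment applies under the patent's labeling:
    front hemisphere ([-90°,+0] / [+0,90°]) -> first adjustment (head),
    rear hemisphere  ([-90°,-0] / [-0,+90°]) -> second adjustment (chassis).
    `in_front` stands in for the +0/-0 sign; angles outside +/-90 degrees
    are treated as rear here."""
    if in_front and -90.0 <= azimuth_deg <= 90.0:
        return "first"   # rotate the head assembly
    return "second"      # rotate and/or move the walking chassis
```

A real implementation might instead use a single 0-360° angle and compare against interval bounds.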
It is worth noting that, in the embodiment of the present application, to preserve the robot's appearance and keep it closer to the features of a "person", orientation terms such as directly ahead, directly behind, directly left, and directly right above are defined relative to the robot's fuselage body 10, so as to avoid exaggerated head rotations that would make the interactive object uncomfortable.
Specifically, the controller 13 can control the robot's head assembly 11 by setting a first adjustment parameter for it; the first adjustment parameter includes the adjustment direction and adjustment angle of the head assembly. After determining the first adjustment parameter, the controller 13 controls the head assembly 11 to rotate according to it. The robot can thus face the interactive object according to the object's actual position for human-computer interaction, improving the user experience.
Optionally, the controller 13's control of the head assembly 11 based on the orientation of the interactive object can be realized through a preset correspondence between different orientations and different first adjustment parameters. Preferably, the orientation of the interactive object and the first adjustment parameter can be made linearly related within some range. For example, when the interactive object is within [-90°, +0] or [+0, 90°] of the robot, the head's rotation target can track the object's specific position within that range: if the interactive object is at -75° to the front-left, the controller 13 sets the head assembly 11's target direction to -75°; if the head currently faces -25°, the controller 13 controls it to turn 50° to the left, after which it faces the target direction and can interact with the interactive object face to face.
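The turn computation in the example above (target -75°, current heading -25°, turn 50° left) is just a signed difference. A minimal sketch; the function name and the sign convention (negative = left, matching the patent's labels) are illustrative.

```python
def head_turn(current_deg, target_deg):
    """Signed turn needed to bring the head from its current heading
    to the target orientation; negative delta means turn left."""
    delta = target_deg - current_deg
    direction = "left" if delta < 0 else "right"
    return direction, abs(delta)
```

The same arithmetic covers the two-object scene below: retargeting from -25° to +20° gives a 45° right turn.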
Consider the following scene: the robot is in dialogue with a first interactive object at 25° to the front-left (-25°); its fuselage body 10 faces forward (+0°) and its face is turned to -25°. The position detector 12 then detects a second interactive object at 20° to the front-right (+20°). The controller 13 computes the angular difference between the head assembly's current direction and the next rotation's target direction, and determines the head assembly's first adjustment parameter accordingly: in the scene above, a turn of 45° to the right (+45°). Alternatively, after the controller 13 first returns the head assembly 11 to its home position (aligned with the fuselage body), the first adjustment parameter becomes a turn of 20° to the right (+20°).
In practical application, besides having the robot face the interactive object by rotating the head assembly 11, the entire fuselage body can also be further controlled to face the interactive object for an even more lifelike interaction effect.
Optionally, therefore, as shown in Fig. 2, the robot of the embodiment of the present application may further include a walking chassis 14 mounted at the foot of the robot body. Under the control of the controller 13, the walking chassis 14 can drive the robot's fuselage body to rotate a fixed angle toward a given direction, or move and walk along a given path. The walking chassis 14 can realize its rotation function with a structure composed of a stepper motor, a reducer, and a gear assembly, and its movement function with a movable frame composed of a stepper motor and rollers; of course, the embodiment of the present application places no restriction on this.
It should be understood that when the goal of having the robot face the interactive object is achieved by controlling the walking chassis 14, the head assembly 11 can be kept fixed relative to the fuselage body 10, with the rotation and movement of the walking chassis 14 alone achieving the goal. Of course, the head assembly 11 and the walking chassis 14 can also be controlled simultaneously to achieve it.
When the robot is to face the interactive object by controlling the walking chassis 14: after the controller 13 receives the orientation where the interactive object is located, if it judges that the orientation does not meet the pre-configured orientation range, the controller 13 determines a second adjustment parameter of the walking chassis from the orientation, where the second adjustment parameter includes the motion track, rotation angle, and rotation direction of the walking chassis 14. Continuing the example above, when the value range in the pre-configured orientation range is [-90°, +0] or [+0, 90°], the controller 13 determines the second adjustment parameter if the angle between the interactive object and the robot lies outside that range — for example, within [-90°, -0°] or [-0°, +90°].
Optionally, the controller 13's control of the walking chassis 14 based on the orientation of the interactive object can be realized through a preset correspondence between different orientations and different second adjustment parameters. Preferably, the orientation and the second adjustment parameter can be made linearly related within some range, i.e., the orientation value of the interactive object corresponds one-to-one with the robot's target rotation value. For example, if the interactive object is at -125° to the rear-left, the controller 13 sets the walking chassis 14's target direction to -125°; if the robot currently faces -25°, the controller 13 controls the walking chassis to turn 100° to the left, after which the robot faces the target direction and can interact with the interactive object face to face.
Optionally, determining the first and second adjustment parameters from the orientation of the interactive object can also be done as follows: partition the range of orientation angles between the robot and the interactive object, and set a fixed rotation parameter of the head assembly or walking chassis for each partition. Continuing the example above, divide the orientation angles into four intervals: [-90°, +0], [+0, 90°], [-90°, -0], [-0, 90°]. For example, when the interactive object is within [-90°, +0], set the head assembly's adjustment parameter to a 45° turn to the left (-45°); within [+0, 90°], a 45° turn to the right (+45°); within [-90°, -0], set the walking chassis's adjustment parameter to 135° to the left (-135°); within [-0, 90°], 135° to the right (+135°). Of course, these values are only examples and do not limit the embodiment of the present application.
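The four-interval partition scheme can be sketched as a lookup table. The hemisphere/side keys and names below are assumptions standing in for the patent's [-90°,+0] / [+0,90°] / [-90°,-0] / [-0,90°] intervals; the angle values are the examples given in the text.

```python
# Illustrative partition table: (hemisphere, side) -> (actuator, signed turn in degrees)
PARTITIONS = {
    ("front", "left"):  ("head",    -45.0),
    ("front", "right"): ("head",    +45.0),
    ("rear",  "left"):  ("chassis", -135.0),
    ("rear",  "right"): ("chassis", +135.0),
}

def fixed_adjustment(hemisphere, side):
    """Fixed-rotation variant of the partition scheme: each interval maps
    to one actuator and one preset turn, regardless of the exact angle."""
    return PARTITIONS[(hemisphere, side)]
```

The trade-off versus the linear mapping is coarser aim in exchange for a trivially simple controller.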
In particular, when the position detector 12 detects that the orientation angle between the robot and the interactive object lies within [-90°, -0] or [-0, 90°], and the head assembly 11 already has some rotation angle, the controller 13 first controls the head assembly 11 back to its home position and then determines the second adjustment parameter of the walking chassis 14. After the walking chassis 14 drives the fuselage body to rotate to the target angle, the robot's face can directly front the interactive object, giving the user a good interactive experience. Of course, the homing of the head assembly 11 can also proceed simultaneously with the rotation of the walking chassis 14; the embodiment of the present application places no restriction on this.
Optionally, to further improve the precision with which the robot is controlled to face the interactive object, the embodiment of the present application also provides a scheme that finely adjusts the robot's rotation based on detection of the interactive object's face region. Specifically, an image recognizer can be provided on the robot's fuselage body to acquire and recognize the face region of the interactive object. After a position detector 12 such as the MIC array has measured the interactive object's position information — its distance and orientation — the image recognizer, if it recognizes the interactive object's face region, further confirms the precise face location, i.e., the orientation and distance relative to the robot, so that the position detector's result can be fine-tuned and corrected, yielding more accurate position information of the interactive object.
In one situation, after the position detector 12 has determined the orientation of the interactive object, if the image recognizer does not recognize the object's face region, it can be concluded that the interactive object is facing sideways or has its back to the robot; at this point, the controller 13 determines a second adjustment parameter for the walking chassis 14. Preferably, the second adjustment parameter can be set so that the motion track of the walking chassis 14 is a circle centered on the interactive object with radius equal to the distance between the robot and the interactive object, traversed until the image recognizer recognizes the interactive object's face region.
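The circular motion track described above can be sketched as a set of waypoints for the chassis to follow around the interactive object. The function, the waypoint count, and the parameterization are assumed for illustration; the patent specifies only the circle itself (center on the object, radius equal to the robot-object distance).

```python
import math

def circle_waypoints(cx, cy, radius, n=8):
    """n evenly spaced waypoints on the circle of given radius centered
    at (cx, cy); the chassis would visit them until a face is detected."""
    return [(cx + radius * math.cos(2 * math.pi * k / n),
             cy + radius * math.sin(2 * math.pi * k / n))
            for k in range(n)]
```

In a real controller the loop would stop early as soon as the image recognizer reports a face.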
Optionally, the robot of the embodiment of the present application further includes an audio player 15, which plays the voice signal during voice dialogue between the robot and the interactive object. Accordingly, after obtaining the distance of the interactive object relative to the robot, the controller 13 controls the volume of the audio player 15 according to that distance. For example, when the interactive object is far away, sound attenuates more in transmission, and the controller 13 can increase the audio player 15's volume; conversely, when the interactive object is close, attenuation is small, and the controller 13 can decrease the volume. The robot can thus converse with the interactive object at a reasonable volume, further improving the user's comfort during the interaction.
Optionally, the robot of the embodiment of the present application further includes a noise detector 16, which can be realized with a sound-pressure meter or a MIC array. The sound of the audio player 15 is easily affected by ambient noise; therefore, before controlling the audio player's volume, the controller 13 first obtains the detection result of the noise detector 16, and then controls the audio player's volume according to both that result and the distance of the interactive object relative to the robot as detected by the position detector. This further improves the precision of volume control and the robot's human-machine interaction capability.
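The combined distance-and-noise volume control can be sketched with a simple linear law: volume rises with listener distance and with ambient noise, up to a ceiling. Every coefficient below is invented for illustration; the patent specifies the inputs and the qualitative behavior, not a formula.

```python
def playback_volume(distance_m, noise_db,
                    base=40.0, per_meter=5.0, per_noise_db=0.5, ceiling=90.0):
    """Toy volume law: louder when the listener is farther away or the
    room is noisier, capped at a ceiling to protect the speaker."""
    vol = base + per_meter * distance_m + per_noise_db * noise_db
    return min(vol, ceiling)
```

A production controller would likely also smooth the output over time to avoid audible volume jumps.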
In the present embodiment, the position detector detects the orientation where the interactive object is located, and the controller determines from that orientation the adjustment parameters of the robot's head assembly and walking chassis, adjusting their rotation directions so that they can rotate and/or move in different directions as the interactive object's position changes. With the robot facing the interactive object, the object's voice can be captured directionally, improving voice-collection quality for better human-computer interaction and extending the robot's human-machine interaction capability. At the same time, the controller determines the audio player's volume based on the interactive object's distance relative to the robot and the noise level of the interaction environment, further improving the robot's interactive capability.
Fig. 3 is a flowchart of robot control method embodiment one provided by an embodiment of the present application. As shown in Fig. 3, the method includes the following steps:
Step 101: detect the orientation where the interactive object is located.
Step 102: judge whether the orientation is within the pre-configured orientation range; if so, execute step 103; otherwise, execute step 104.
Step 103: control rotation of the robot's head assembly so that the head assembly faces the orientation where the interactive object is located.
Step 104: control rotation and/or movement of the robot's fuselage body so that the head assembly faces the orientation where the interactive object is located.
In the present embodiment, for the detection of the orientation where the interactive object is located and the adjustment of the robot's head assembly, refer to the description of the embodiment shown in Fig. 1; details are not repeated here.
In the present embodiment, the position detector detects the orientation of the interactive object relative to the robot, and the controller adjusts the robot's head assembly and/or walking chassis based on that orientation, so that during human-computer interaction the robot's head assembly can be face to face with the interactive object, extending the robot's human-machine interaction capability.
Fig. 4 is a flowchart of robot control method embodiment two provided by an embodiment of the present application. As shown in Fig. 4, the robot control method includes the following steps:
Step 201: detect the distance of the interactive object relative to the robot and/or the noise decibel value within a preset range.
Step 202: determine the volume of the robot's audio player according to the distance and/or the noise decibel value within the preset range.
Step 203: control the audio player to adjust its volume according to the determined volume.
In the present embodiment, for the detection of the orientation where the interactive object is located and the control of the audio player's volume, refer to the description of the embodiment shown in Fig. 1; details are not repeated here.
It should be noted that the devices corresponding to Fig. 1 and Fig. 2 can execute any one of, or any combination of, the embodiments corresponding to Fig. 3 and Fig. 4. For example, in one situation, the position detector detects the orientation where the interactive object is located and the controller determines the head assembly's first adjustment parameter from that orientation, adjusting the head assembly so that it can rotate in different directions as the interactive object's orientation changes. In another situation, the controller determines the walking chassis's second adjustment parameter from the orientation, adjusting the walking chassis so that it can rotate and move in different directions as the interactive object's position changes. At the same time, the controller controls the audio player's volume based on the interactive object's distance relative to the robot as detected by the position detector. Executing the above embodiments in combination further extends the robot's human-machine interaction capability and improves the user experience.
The apparatus embodiments described above are merely exemplary. The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement this without creative labor.
Through the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be realized by means of software plus a necessary general hardware platform, and naturally also by hardware. Based on this understanding, the technical solution — or the part of it that contributes over the prior art — can be embodied in the form of a software product. The computer software product may be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes instructions that cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods described in the embodiments or in certain parts of the embodiments.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or substitute equivalents for some of the technical features; such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (8)

1. A robot for interacting with an interactive object, comprising: a fuselage main body, a head assembly, a position detector, and a controller;
wherein the head assembly is rotatably connected to the fuselage main body;
the position detector is configured to detect the orientation of the interactive object;
the controller is configured to control the rotation of the head assembly so that the head assembly faces the orientation of the interactive object;
the robot further comprises a walking chassis, the fuselage main body being connected to the walking chassis; the walking chassis realizes its rotation function by a structure formed of a stepping motor, a reducer, and a gear assembly; the controller is further configured to control the walking chassis to rotate and/or move so that the fuselage main body faces the orientation of the interactive object;
wherein the controller is specifically configured to: when the orientation of the interactive object is between [-90°, +0] or [+0, 90°], determine a first adjustment parameter and control the head assembly to rotate according to the first adjustment parameter; when the orientation of the interactive object is between [-90°, -0] or [-0, +90°], determine a second adjustment parameter and control the walking chassis to rotate and/or move according to the second adjustment parameter;
wherein the first adjustment parameter is linearly related to the orientation of the interactive object; the second adjustment parameter is linearly related to the orientation of the interactive object; or the second adjustment parameter is: a circle drawn with the interactive object as the center and the distance between the robot and the interactive object as the radius, serving as the motion track of the walking chassis, until an image recognizer in the position detector recognizes the face region of the interactive object.
2. The robot according to claim 1, wherein the image recognizer is specifically configured to capture and recognize the face region of the interactive object;
the controller is further configured to adjust the rotation angle of the head assembly according to the recognized face image.
3. The robot according to claim 1, wherein the robot further comprises an audio player; the position detector is further configured to detect the distance between the interactive object and the robot;
the controller is further configured to adjust the volume of the audio player according to the distance.
4. The robot according to claim 3, wherein the robot further comprises an audio player and a noise detector; the controller is further configured to adjust the volume of the audio player according to the detection result of the noise detector and the distance.
5. The robot according to any one of claims 1-4, wherein the position detector comprises:
one or more of a microphone array, an image recognizer, an infrared sensor, a laser sensor, and an ultrasonic sensor.
6. A robot control method, comprising:
detecting the orientation of an interactive object relative to the robot;
if the orientation is within a pre-configured orientation range, controlling the head assembly of the robot to rotate so that the head assembly of the robot faces the orientation of the interactive object, comprising: when the orientation of the interactive object is between [-90°, +0] or [+0, 90°], determining a first adjustment parameter and controlling the head assembly to rotate according to the first adjustment parameter;
if the orientation is not within the pre-configured orientation range, controlling the fuselage main body of the robot to rotate and/or move so that the head assembly of the robot faces the orientation of the interactive object, comprising: when the orientation of the interactive object is between [-90°, -0] or [-0, +90°], determining a second adjustment parameter and controlling the walking chassis of the robot to rotate and/or move according to the second adjustment parameter;
wherein the first adjustment parameter is linearly related to the orientation of the interactive object; the second adjustment parameter is linearly related to the orientation of the interactive object; or the second adjustment parameter is: a circle drawn with the interactive object as the center and the distance between the robot and the interactive object as the radius, serving as the motion track of the walking chassis, until an image recognizer installed in the robot recognizes the face region of the interactive object.
7. The method according to claim 6, further comprising:
detecting the distance of the interactive object relative to the robot; and adjusting, according to the distance, the volume of an audio player during interaction of the robot.
8. The method according to claim 7, further comprising:
detecting a noise decibel value within a preset range of the robot, and adjusting, according to the noise decibel value and the distance, the volume of the audio player during interaction of the robot.
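The volume adjustment of claims 7 and 8 (distance-based, optionally combined with an ambient-noise decibel value) can be illustrated as follows. The base volume, gains, and noise floor are hypothetical example values; the patent claims the combination of inputs, not any particular formula.

```python
def adjust_volume(distance_m, noise_db, base_volume=50.0,
                  distance_gain=10.0, noise_floor_db=40.0, noise_gain=0.5):
    """Volume grows with listener distance (claim 7) and, above a noise
    floor, with the ambient noise decibel value (claim 8).

    All parameters are illustrative assumptions, not values from the patent.
    """
    volume = base_volume + distance_gain * distance_m
    if noise_db > noise_floor_db:
        # Louder environments get a proportional boost above the floor.
        volume += noise_gain * (noise_db - noise_floor_db)
    # Clamp to a 0-100 volume scale.
    return max(0.0, min(100.0, volume))
```

A quieter room at the same distance thus yields a lower output volume, while a distant listener in a noisy hall pushes the volume toward its cap.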
CN201611000091.4A 2016-11-14 2016-11-14 Robot and robot control method Active CN106335071B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611000091.4A CN106335071B (en) 2016-11-14 2016-11-14 Robot and robot control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611000091.4A CN106335071B (en) 2016-11-14 2016-11-14 Robot and robot control method

Publications (2)

Publication Number Publication Date
CN106335071A CN106335071A (en) 2017-01-18
CN106335071B true CN106335071B (en) 2019-03-15

Family

ID=57841911

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611000091.4A Active CN106335071B (en) 2016-11-14 2016-11-14 Robot and robot control method

Country Status (1)

Country Link
CN (1) CN106335071B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106671110A (en) * 2017-02-10 2017-05-17 苏州库浩斯信息科技有限公司 Robot head travel control method and system
KR20180096078A (en) * 2017-02-20 2018-08-29 엘지전자 주식회사 Module type home robot
JP6720950B2 (en) * 2017-11-13 2020-07-08 株式会社安川電機 Laser processing method, controller and robot system
CN108724173B (en) * 2017-12-04 2020-11-03 北京猎户星空科技有限公司 Robot motion control method, device and equipment and robot
CN109895140A (en) * 2017-12-10 2019-06-18 湘潭宏远电子科技有限公司 A kind of robotically-driven trigger device
CN108340379A (en) * 2018-02-10 2018-07-31 佛山市建金建电子科技有限公司 A kind of multi-functional guest-meeting robot
CN108733083B (en) * 2018-03-21 2022-03-25 北京猎户星空科技有限公司 Robot rotation control method and device, robot and storage medium
CN108724176B (en) * 2018-03-21 2021-03-05 北京猎户星空科技有限公司 Robot rotation control method and device, robot and storage medium
CN108733084A (en) * 2018-03-21 2018-11-02 北京猎户星空科技有限公司 Control method, device, robot and the storage medium of revolute
CN108614583A (en) * 2018-04-28 2018-10-02 北京小米移动软件有限公司 The control method of intelligent terminal and intelligent terminal
CN110767246B (en) * 2018-07-26 2022-08-02 深圳市优必选科技有限公司 Noise processing method and device and robot
CN110858426A (en) * 2018-08-24 2020-03-03 深圳市神州云海智能科技有限公司 Method and device for interaction between lottery robot and user and lottery robot
CN109648561A (en) * 2018-12-30 2019-04-19 深圳市普渡科技有限公司 Robot interactive method
CN109571502A (en) * 2018-12-30 2019-04-05 深圳市普渡科技有限公司 Robot allocator
CN112140118B (en) * 2019-06-28 2022-05-31 北京百度网讯科技有限公司 Interaction method, device, robot and medium
CN111246339B (en) * 2019-12-31 2021-12-07 上海景吾智能科技有限公司 Method and system for adjusting pickup direction, storage medium and intelligent robot
CN111618856B (en) * 2020-05-27 2021-11-05 山东交通学院 Robot control method and system based on visual excitation points and robot

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4394602B2 (en) * 2005-04-20 2010-01-06 富士通株式会社 Service robot
CN101295016B (en) * 2008-06-13 2011-04-27 河北工业大学 Sound source independent searching and locating method
CN201570225U (en) * 2009-10-16 2010-09-01 陈文椿 Control device capable of judging approaching distance of human bodies to execute different actions
CN102302858A (en) * 2010-08-26 2012-01-04 东莞龙昌数码科技有限公司 Family entertainment robot
CN105425806A (en) * 2015-12-25 2016-03-23 深圳先进技术研究院 Human body detection and tracking method and device of mobile robot
CN105690391A (en) * 2016-04-12 2016-06-22 上海应用技术学院 Service robot and control method thereof
CN105959872B (en) * 2016-04-21 2019-07-02 歌尔股份有限公司 Intelligent robot and Sounnd source direction discriminating conduct for intelligent robot
CN105975930A (en) * 2016-05-04 2016-09-28 南靖万利达科技有限公司 Camera angle calibration method during robot speech localization process
CN206170100U (en) * 2016-11-14 2017-05-17 上海木爷机器人技术有限公司 Robot

Also Published As

Publication number Publication date
CN106335071A (en) 2017-01-18

Similar Documents

Publication Publication Date Title
CN106335071B (en) Robot and robot control method
US9538156B2 (en) System and method for providing 3D sound
US9753119B1 (en) Audio and depth based sound source localization
US10659670B2 (en) Monitoring system and control method thereof
CN104240606B (en) The adjusting method of display device and display device viewing angle
Nakadai et al. Improvement of recognition of simultaneous speech signals using av integration and scattering theory for humanoid robots
Jiang et al. Real-time vibration source tracking using high-speed vision
CN109318243A (en) A kind of audio source tracking system, method and the clean robot of vision robot
Zhong et al. Active binaural localization of multiple sound sources
EP3529009A1 (en) Human-tracking robot
CN206170100U (en) Robot
CN106426180A (en) Robot capable of carrying out intelligent following based on face tracking
US20160286133A1 (en) Control Method, Control Device, and Control Equipment
Plinge et al. Multi-speaker tracking using multiple distributed microphone arrays
Gala et al. Realtime active sound source localization for unmanned ground robots using a self-rotational bi-microphone array
Liu et al. Azimuthal source localization using interaural coherence in a robotic dog: modeling and application
CN112614508A (en) Audio and video combined positioning method and device, electronic equipment and storage medium
CN106346475A (en) Robot and robot control method
Liu et al. 3D audio-visual speaker tracking with a two-layer particle filter
Martinson et al. Auditory evidence grids
KR101172354B1 (en) Sound source localization device using rotational microphone array and sound source localization method using the same
Plinge et al. Geometry calibration of distributed microphone arrays exploiting audio-visual correspondences
CN209579577U (en) A kind of the audio source tracking system and clean robot of vision robot
Martinson et al. Robotic discovery of the auditory scene
Tepljakov et al. Sound localization and processing for inducing synesthetic experiences in Virtual Reality

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20190201

Address after: Room 2340, Building 2, Lane 1800, Xinyang Highway, Fengxian District, Shanghai 201400

Applicant after: SHANGHAI MUMU JUCONG ROBOT TECHNOLOGY Co.,Ltd.

Address before: Room 402, No. 33 Guangshun Road, Shanghai 200336

Applicant before: SHANGHAI MROBOT TECHNOLOGY Co.,Ltd.

Effective date of registration: 20190201

Address after: Room 402, No. 33 Guangshun Road, Shanghai 200336

Applicant after: SHANGHAI MROBOT TECHNOLOGY Co.,Ltd.

Address before: Room 402, No. 33 Guangshun Road, Shanghai 200336

Applicant before: SHANGHAI MUYE ROBOT TECHNOLOGY Co.,Ltd.

GR01 Patent grant
GR01 Patent grant