CN113319869B - Welcome robot system with emotion interaction function - Google Patents

Welcome robot system with emotion interaction function

Info

Publication number
CN113319869B
Authority
CN
China
Prior art keywords
driving
arm
steering engine
head
lip
Prior art date
Legal status
Active
Application number
CN202110713166.8A
Other languages
Chinese (zh)
Other versions
CN113319869A (en)
Inventor
何苗
毕健
胡方超
路世青
罗茗楠
刘泽明
Current Assignee
Chongqing University of Technology
Original Assignee
Chongqing University of Technology
Priority date
Filing date
Publication date
Application filed by Chongqing University of Technology filed Critical Chongqing University of Technology
Priority to CN202110713166.8A priority Critical patent/CN113319869B/en
Publication of CN113319869A publication Critical patent/CN113319869A/en
Application granted granted Critical
Publication of CN113319869B publication Critical patent/CN113319869B/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/0005Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/008Manipulators for service tasks
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The invention discloses a welcome robot system with an emotion interaction function. The system comprises a humanoid main body, humanoid double arms, a humanoid head and a mobile chassis, together with a power supply unit, a voice recognition control unit, a double-arm control unit, a head control unit and a mobile chassis control unit. The power supply unit supplies power to each control unit. The voice recognition control unit recognizes a user's voice command and converts it into control signals that are output to the double-arm control unit, the head control unit and the mobile chassis control unit respectively. The double-arm control unit controls the actions of the humanoid double arms; the head control unit controls the expressions of the humanoid head; and the mobile chassis control unit controls the motion of the mobile chassis. The welcome robot can conduct voice-based emotional interaction with the user, and can change its actions and expressions according to the user's voice commands, thereby improving the user experience.

Description

Welcome robot system with emotion interaction function
Technical Field
The invention relates to the technical field of robots, in particular to a welcome robot system with an emotion interaction function.
Background
China entered the field of service robots relatively late. At present there are few domestic service robot products: many laboratory prototypes remain far from practical industrialization, no service robot enterprises of significant scale have formed, and only a few small enterprises manufacture education and entertainment robot products. In recent years, however, with the support of the national "863" Program and the Tenth Five-Year Plan, China has carried out substantial work in service robot research and product development and has achieved notable results.
With the steady improvement of living standards and the rapid development of science and technology, robots are gradually becoming part of everyday life. The welcome robot is a common type of service robot.
The welcome robot is a high-tech product integrating voice recognition and intelligent motion technologies, and involves multiple disciplines such as information science, mechanics, materials and aesthetics. Welcome robots are now widely used in catering, hotels, campuses and similar settings, where they can improve the user's impression. However, current welcome robots tend to have a single function: they cannot conduct voice-based emotional interaction with the user, nor can they change their actions and expressions according to the user's instructions. This inevitably reduces the user experience, because good emotional interaction between the user and the welcome robot becomes impossible.
Disclosure of Invention
Aiming at the defects in the prior art, the technical problem to be solved by the invention is: how to provide a welcome robot system with an emotion interaction function that can conduct voice-based emotional interaction with a user and can change its actions and expressions according to the user's voice commands, thereby improving the user experience.
In order to solve the technical problems, the invention adopts the following technical scheme:
a welcome robot system with emotion interaction function comprises a human-simulated main body, human-simulated double arms, a human-simulated head and a mobile chassis, and further comprises a power supply unit, a voice recognition control unit, a double-arm control unit, a head control unit and a mobile chassis control unit;
the power supply unit is used for respectively supplying power to the voice recognition control unit, the double-arm control unit, the head control unit and the mobile chassis control unit;
the voice recognition control unit is arranged on the humanoid main body and used for recognizing a voice command of a user, and simultaneously converting the recognized voice command of the user into control signals to be respectively output to the double-arm control unit, the head control unit and the mobile chassis control unit;
the double-arm control unit is arranged on the humanoid double arms and is used for controlling the actions of the humanoid double arms according to the control signals output by the voice recognition control unit;
the head control unit is arranged on the human-simulated head and is used for controlling the expression and the action of the human-simulated head according to the control signal output by the voice recognition control unit;
the mobile chassis control unit is arranged on the mobile chassis and is used for controlling the movement of the mobile chassis according to the control signal output by the voice recognition control unit.
The working principle of the invention is as follows: when a user issues a voice control command, the voice recognition control unit recognizes the command, converts it into control signals, and outputs them to the double-arm control unit, the head control unit and the mobile chassis control unit respectively. The double-arm control unit controls the actions of the humanoid double arms according to the received control signal, producing humanoid movements; the head control unit controls the facial movements of the humanoid head, producing humanoid expressions; and the mobile chassis control unit controls the motion of the mobile chassis according to the received control signal. In this way, when the user issues a voice control command to the welcome robot, the robot makes the corresponding actions and expressions and performs the corresponding movement. The welcome robot therefore has a good voice interaction function, its intelligent interactivity is improved, and the user experience is improved.
In conclusion, the welcome robot can interact with the user in a voice emotion mode, and meanwhile, the action and expression change can be carried out according to the voice instruction of the user, so that the user experience is improved.
Preferably, the voice recognition control unit comprises a voice acquisition module, a voice recognition chip, a voice main control chip and a main body control chip;
the output end of the voice acquisition module is connected with the data input end of the voice recognition chip, and the voice acquisition module is used for acquiring a user voice instruction, converting the acquired user voice instruction into an electric signal and outputting the electric signal to the voice recognition chip;
the output end of the voice recognition chip is connected with the data input end of the voice main control chip, and the voice recognition chip is used for recognizing the voice instruction transmitted by the voice acquisition module and outputting the recognized result to the voice main control chip;
the output end of the voice main control chip is respectively in data connection with the main body control chip and the mobile chassis control unit, the voice main control chip outputs serial port data and response audio according to the recognition result of the voice command by the voice recognition chip and respectively outputs the serial port data to the main body control chip and the mobile chassis control unit, the mobile chassis control unit controls the motion of the mobile chassis according to the serial port data output by the voice main control chip, and the signal output end of the main body control chip is also respectively connected with the double-arm control unit and the head control unit data receiving end, so that the double-arm control unit and the head control unit respectively control the motion of the double arms of the humanoid and the expression of the head of the humanoid according to the signal output by the main body control chip.
Thus, when a user sends a voice control instruction, the voice acquisition module acquires the voice control instruction sent by the user, converts the acquired user instruction into an electric signal and outputs the electric signal to the voice recognition chip for further recognition processing, the voice recognition chip further outputs a result obtained after recognizing the voice control instruction sent by the user to the voice main control chip, the voice main control chip further outputs serial port data and response audio of the result, the serial port data output by the voice main control chip is further output to the mobile chassis control unit and the main body control chip, the mobile chassis control unit controls the motion of the mobile chassis according to the received serial port data, the main body control chip further sends a control signal to the double-arm control unit and the head control unit according to the received serial port data, the double-arm control unit controls the humanoid arms to make humanoid behaviors according with the voice control instruction of the user, and the head control unit controls the humanoid heads to make humanoid facial expressions according with the voice control instruction of the user according to the received control signal, so that the welcome robot can make corresponding actions according with the voice control instruction of the user to improve the interaction performance.
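The dispatch flow just described can be sketched as follows. The command phrases, unit names and action codes are hypothetical, since the patent does not specify a command set; the sketch only illustrates one serial-data record fanning out to the three control units.

```python
# Hypothetical sketch of the command-dispatch flow: the voice main control
# chip emits one record per recognized command, and the main body control
# chip fans it out to the arm, head and chassis units. All phrases and
# action codes below are illustrative assumptions, not from the patent.

COMMAND_TABLE = {
    "wave hello": {"arms": "WAVE", "head": "SMILE", "chassis": "STOP"},
    "come here":  {"arms": "IDLE", "head": "NEUTRAL", "chassis": "FORWARD"},
}

DEFAULT_ACTIONS = {"arms": "IDLE", "head": "NEUTRAL", "chassis": "STOP"}

def dispatch(recognized_text):
    """Map a recognized voice command to per-unit control signals."""
    return COMMAND_TABLE.get(recognized_text, DEFAULT_ACTIONS)
```

An unrecognized phrase falls back to a safe idle/stop state, mirroring the idea that the robot only acts on commands it has actually recognized.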
Preferably, the voice recognition control unit further includes an audio decoding chip, an audio memory card, a speaker and a display module. The input end of the display module is connected with the output end of the main body control chip to display the interaction information in real time. Audio files with set contents are stored in the audio memory card. The output end of the voice main control chip is connected with the input end of the audio memory card to read the stored audio files; the input end of the audio decoding chip is connected with the output end of the audio memory card to decode the read audio file and convert it into an electric signal; and the output end of the audio decoding chip is connected with the speaker, which outputs the read audio file as sound.
Therefore, after a user's voice command has been processed by the voice acquisition module, the voice recognition chip and the voice main control chip, the voice main control chip selects the response audio: the corresponding audio file is read from the audio memory card and output to the audio decoding chip, which decodes it and converts it into an electric signal for the speaker, and the speaker plays the response aloud. Meanwhile, the display module receives signals from the main body control chip and displays the interaction information to the user in real time.
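The response-audio path above amounts to a lookup from a recognition result to a pre-recorded track on the audio memory card. The track file names and result labels in this sketch are assumptions for illustration; the patent only states that audio files with set contents are stored on the card.

```python
# Hypothetical sketch of response-audio selection: each recognition result
# maps to a pre-recorded file on the audio memory card, which the decoding
# chip then plays through the speaker. File names are illustrative.

RESPONSE_TRACKS = {
    "greet":   "0001.mp3",  # e.g. "Welcome!"
    "goodbye": "0002.mp3",  # e.g. "See you next time!"
}

def response_track(result, default="0000.mp3"):
    """Return the audio file name stored for a recognition result."""
    return RESPONSE_TRACKS.get(result, default)
```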
Preferably, the dual-arm control unit includes a dual-arm control chip and a dual-arm driving module for driving the humanoid dual-arm to move, the dual-arm control chip stores set humanoid dual-arm movement parameters, an input end of the dual-arm control chip is connected to a data output end of the main body control chip to receive control signals of the main body control chip and call the correspondingly stored humanoid dual-arm movement parameters according to the received control signals of the main body control chip, an output end of the dual-arm control chip is connected to a data input end of the dual-arm driving module to send control commands to the dual-arm driving module according to the called humanoid dual-arm movement parameters, and the dual-arm driving module controls the movement of the humanoid dual-arm according to the control commands sent by the dual-arm control chip.
Thus, when the user's voice control command requires the humanoid double arms to act, the main body control chip outputs a control signal to the double-arm control chip, in which the set action parameters of the humanoid double arms are stored. The double-arm control chip processes the received control signal to obtain the arm action parameters corresponding to the command, compares them with the prestored action parameters, and extracts the best-matching set. The extracted action parameters are then converted into control signals and output to the double-arm driving module, which controls the action of the humanoid double arms accordingly, so that the humanoid double arms act in accordance with the user's voice control command.
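The "compare and extract the best match" step can be sketched as a nearest-neighbour lookup over stored parameter sets. The gesture names, joint vectors and the L1 distance metric are all assumptions for illustration; the patent does not specify how the match is scored.

```python
# Illustrative sketch of the best-match lookup: the double-arm control chip
# compares requested joint parameters against pre-stored humanoid arm action
# parameters and selects the closest stored set. Names, angle vectors and
# the distance metric are assumptions, not from the patent.

STORED_GESTURES = {
    "wave":  [90, 45, 0, 10],  # hypothetical joint angles in degrees
    "point": [45, 0, 90, 0],
    "rest":  [0, 0, 0, 0],
}

def best_match(requested):
    """Return the stored gesture whose joint vector is closest (L1 distance)."""
    def distance(name):
        return sum(abs(a - b) for a, b in zip(requested, STORED_GESTURES[name]))
    return min(STORED_GESTURES, key=distance)
```

The selected gesture's parameters would then be converted into servo commands for the double-arm driving module.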
Preferably, the humanoid double arms comprise two humanoid arms which are respectively distributed on two sides of the humanoid main body, the double-arm driving module comprises arm steering engine driving chips which are positioned on each humanoid arm and arm driving steering engines which are positioned at a plurality of joints of the humanoid arms, the input ends of the arm steering engine driving chips are connected with the data output ends of the double-arm control chips, and a plurality of output interfaces of the arm steering engine driving chips are respectively connected with one arm driving steering engine, so that the arm steering engine driving chips can simultaneously send control signals to the plurality of arm driving steering engines.
Therefore, when the humanoid arm is controlled, the arm steering engine driving chip receives a control signal from the double-arm control chip and then drives the arm driving steering engines at the different joints accordingly. Because a human arm has many joints and complex movements, arm driving steering engines are arranged at several joints of the humanoid arm and are controlled in a unified way by a single arm steering engine driving chip. On one hand this reduces the number of chips needed to control the arm driving steering engines; on the other hand it makes the motion control of all joints of the humanoid arm more coordinated and accurate.
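The one-driver-many-servos arrangement can be sketched as a single driver object holding one channel per joint and updating several joints in one call. The channel names and the 0-180 degree servo range are assumptions; the patent does not name the driver chip or the servo model.

```python
# Minimal sketch of a steering-engine driver commanding several joint servos
# at once, as described above. Channel names and the 0-180 degree range are
# illustrative assumptions.

class ServoDriver:
    def __init__(self, channels):
        # Start every joint centered at 90 degrees.
        self.angles = {ch: 90 for ch in channels}

    def set_pose(self, pose):
        """Command several joints in one call, clamping to the servo range."""
        for channel, angle in pose.items():
            if channel not in self.angles:
                raise KeyError("unknown channel: " + channel)
            self.angles[channel] = max(0, min(180, angle))
```

Updating all joints through one driver is what keeps the arm's joints coordinated: a single pose command moves every affected servo together instead of joint by joint.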
Preferably, the head control unit includes a head control chip and a head driving module for driving the human-simulated head to perform expression motions, the head control chip stores preset human-simulated head expression motion parameters, an input end of the head control chip is connected with a data output end of the main body control chip to receive control signals of the main body control chip and call corresponding human-simulated head expression motion parameters according to the received control signals of the main body control chip, an output end of the head control chip is connected with a data input end of the head driving module to send control instructions to the head driving module according to the called human-simulated head expression motion parameters, and the head driving module controls the human-simulated head expression motions according to the control instructions sent by the head control chip.
Therefore, when the user's voice control command requires the humanoid head to perform an expression, the main body control chip outputs a control signal to the head control chip, in which the set expression action parameters of the humanoid head are stored. The head control chip processes the received control signal to obtain the expression parameters corresponding to the command, compares them with the prestored expression action parameters, and extracts the best-matching set. The extracted parameters are then converted into control signals and output to the head driving module, which controls the expression of the humanoid head accordingly, so that the humanoid head performs expressions in accordance with the user's voice control command.
Preferably, the human-simulated head comprises eyebrows, eyelids, eyeballs, mouths, jaws and necks, the head driving module comprises a head steering engine driving chip and a plurality of head driving steering engines which are positioned on the human-simulated head and used for driving the eyebrows, the eyelids, the eyeballs, the mouths, the jaws and the necks to move respectively, the input end of the head steering engine driving chip is connected with the data output end of the head control chip, and a plurality of output interfaces of the head steering engine driving chip are respectively connected with one head driving steering engine so that the head steering engine driving chip can simultaneously send control signals to the head driving steering engines.
Therefore, when the expression action of the human-simulated head is controlled, the head steering engine driving chip receives a control signal from the head control chip, and then the head steering engine driving chip controls the head driving steering engines at different positions to move according to the received control signal.
Preferably, the mobile chassis control unit includes a chassis control chip and a chassis driving module for driving the mobile chassis to move, an input end of the chassis control chip is connected to a data output end of the voice main control chip to receive serial port data output by the voice main control chip, an output end of the chassis control chip is connected to a data receiving end of the chassis driving module to send a control instruction to the chassis driving module according to the serial port data output by the voice main control chip, and the chassis driving module controls the mobile chassis to move according to the received control instruction.
Therefore, when the user's voice control command requires the chassis to move, the voice main control chip outputs serial port data to the chassis control chip, which in turn sends a control instruction to the chassis driving module; the chassis driving module then controls the motion of the moving chassis, so that the moving chassis moves in accordance with the user's voice control command.
Preferably, the chassis driving module includes a motor driving chip and four mobile motors distributed around the mobile chassis, an input end of the motor driving chip is connected with a data output end of the chassis control chip, and output ends of the motor driving chip are respectively connected with four mobile motors electrically, so that the motor driving chip can simultaneously send control instructions to the four mobile motors, each mobile motor is further respectively connected with a rotating wheel and a speed measuring piece for collecting rotating speed signals of the mobile motors, and an output end of the speed measuring piece is further connected with the chassis control chip in a data connection manner, so that the collected rotating speed signals of the mobile motors are output to the chassis control chip.
Therefore, four moving motors are arranged around the moving chassis, each fitted with a rotating wheel, and the chassis moves by the rotation of these wheels. When the moving chassis needs to move, the motor driving chip receives a control signal from the chassis control chip and sends control instructions to the four moving motors, which drive the four rotating wheels to move the chassis. During the movement, the speed measuring piece continuously measures the rotating speed of each moving motor; the deviation between the measured speed and the target speed set in the program is fed to a PID algorithm, which adjusts the rotating speed of the moving motor so that its actual speed matches the target speed.
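The PID speed regulation described above can be sketched as follows. The patent names the PID algorithm but gives no gains, time step or units, so the values here are illustrative assumptions.

```python
# Hedged sketch of PID speed regulation: the chassis control chip compares
# the measured wheel speed against the target speed and corrects the motor
# command from the error. Gains and time step are illustrative assumptions.

class SpeedPID:
    def __init__(self, kp=0.6, ki=0.1, kd=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured, dt=0.01):
        """Return a motor-command correction from the speed error."""
        error = target - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Called once per control cycle with the tachometer reading, the correction is positive when the wheel is too slow and negative when it is too fast, driving the actual speed toward the target.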
Preferably, the mobile chassis control unit further comprises a following obstacle avoidance module, the following obstacle avoidance module comprises a human body infrared sensor and a plurality of ultrasonic sensors, the ultrasonic sensors and the human body infrared sensor are connected with the chassis control chip, the ultrasonic sensors are respectively located on the front side, the rear side, the left side and the right side of the position above the humanoid main body, the human body infrared sensor and the ultrasonic sensors located on the front side of the position above the humanoid main body are used for detecting the position information of the human body and transmitting the detected position information of the human body to the chassis control chip, the chassis control chip controls the chassis driving module to drive the mobile chassis to move along with the human body according to the received position information of the human body, the ultrasonic sensors located on the rear side, the left side and the right side of the humanoid main body are used for detecting obstacles on the path corresponding to the side and transmitting the detected position information of the obstacles to the chassis control chip, and the chassis control chip controls the chassis driving module to drive the mobile chassis to avoid the obstacles according to the received position information of the obstacles to move.
Therefore, with the following obstacle avoidance module, the welcome robot has two moving modes: a voice control mode and an autonomous following mode. In the voice control mode, the robot moves according to the user's voice commands. In the autonomous following mode, the ultrasonic sensor and the human body infrared sensor located at the front of the position above the humanoid main body detect the position of the person in front and transmit it to the chassis control chip, which sends control instructions to the chassis driving module so that the moving chassis follows the person. The ultrasonic sensors at the rear, left and right sides above the humanoid main body detect obstacles in the corresponding directions and transmit the detected obstacle information to the chassis control chip, which then controls the moving chassis through the chassis driving module to avoid the obstacles while moving. Autonomous following and automatic obstacle avoidance are thereby achieved in the autonomous following mode.
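One decision step of the autonomous following mode can be sketched as below: the front person reading drives following, while any nearby side/rear obstacle takes priority. The distance thresholds (in centimeters) and action names are assumptions; the patent does not give numeric values.

```python
# Illustrative decision logic for the autonomous following mode: follow the
# person detected in front, but let a nearby obstacle on any other side take
# priority. Thresholds and action names are assumptions, not from the patent.

FOLLOW_MIN_CM = 50    # keep at least this distance from the person
OBSTACLE_MIN_CM = 30  # anything closer than this blocks the path

def follow_step(front_person_cm, side_obstacles_cm):
    """Decide one chassis action from the front and side sensor readings."""
    if any(d < OBSTACLE_MIN_CM for d in side_obstacles_cm.values()):
        return "avoid"
    if front_person_cm is None:
        return "stop"  # the infrared sensor detects no person
    if front_person_cm > FOLLOW_MIN_CM:
        return "forward"
    return "stop"      # close enough; hold position
```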
Drawings
FIG. 1 is a system block diagram of a greeting robot system with emotion interaction function according to the invention;
FIG. 2 is a structural block diagram of a mobile chassis in the greeting robot system with emotion interaction function;
FIG. 3 is a schematic structural diagram of a human-simulated main body, human-simulated double arms and a mobile chassis in the greeting robot system with emotion interaction function of the invention;
FIG. 4 is an enlarged schematic view at A in FIG. 3;
FIG. 5 is a schematic structural diagram of a humanoid arm in the greeting robot system with emotion interaction function (finger not shown);
FIG. 6 is a schematic front side structure diagram of a human head simulation in the greeting robot system with emotion interaction function according to the invention;
FIG. 7 is a schematic diagram of the rear side structure of a humanoid head in the welcome robot system with emotion interaction function (the lower jaw and the neck are removed);
FIG. 8 is a schematic structural diagram of another perspective of the back side of the humanoid head in the welcome robot system with emotion interaction function according to the invention (the lower jaw and the neck are removed);
FIG. 9 is a schematic front side structure diagram of a position of an eye mechanism in a human head simulated in the greeting robot system with emotion interaction function of the invention;
FIG. 10 is a schematic diagram of a rear side structure of a guest-greeting robot system with emotion interaction function at a position of an eye mechanism in a human head simulation;
FIG. 11 is a schematic structural diagram of one of the angles of view at the positions of an upper eyelid component, a lower eyelid component and an eyeball component in a human head simulated in the welcome robot system with emotion interaction function according to the invention;
FIG. 12 is a schematic structural diagram of another view angle at the positions of an upper eyelid component and a lower eyelid component and an eyeball component in the human head simulated welcome robot system with emotion interaction function according to the invention;
FIG. 13 is a schematic structural diagram of an eyeball component in a human head simulated in the greeting robot system with the emotion interaction function;
FIG. 14 is a schematic structural diagram of a neck mechanism in a human head simulated in the welcome robot system with emotion interaction function according to the invention.
Description of reference numerals: 62. right eyeball up-down connecting rod; 63. left eyeball supporting column; 64. left eyeball rotating piece; 65. left eyeball head piece; 66. left spherical joint; 67. right eyeball supporting column; 68. right eyeball rotating piece; 69. right eyeball piece; 70. left eyeball supporting plate; 71. right eyeball supporting plate; 72. neck framework; 73. first neck driving steering engine; 74. second neck driving steering engine; 75. third neck driving steering engine; 76. humanoid main body; 77. humanoid double arms; 78. moving chassis; 79. rotating wheel; 80. moving motor; 81. speed measuring piece; 82. arm driving steering engine; 83. humanoid arm.
Detailed Description
The invention will be further explained with reference to the drawings and the embodiments.
As shown in fig. 1 to 4, a guest greeting robot system with emotion interaction function comprises a humanoid main body 76, humanoid double arms 77, a humanoid head, a mobile chassis 78, a power supply unit, a voice recognition control unit, a double-arm control unit, a head control unit and a mobile chassis control unit;
the power supply unit is used for respectively supplying power to the voice recognition control unit, the double-arm control unit, the head control unit and the mobile chassis control unit;
the voice recognition control unit is arranged on the humanoid body 76 and used for recognizing the voice command of the user, converting the recognized voice command of the user into control signals and outputting the control signals to the double-arm control unit, the head control unit and the mobile chassis control unit respectively;
the double-arm control unit is arranged on the humanoid double arms 77 and is used for controlling the actions of the humanoid double arms 77 according to the control signals output by the voice recognition control unit;
the head control unit is arranged on the human-simulated head and is used for controlling the expression and the action of the human-simulated head according to the control signal output by the voice recognition control unit;
the mobile chassis control unit is mounted on the mobile chassis 78 and is used to control the movement of the mobile chassis 78 according to the control signal output by the voice recognition control unit.
The working principle of the invention is as follows: when a user issues a voice control instruction, the voice recognition control unit recognizes the instruction, converts it into control signals and outputs them to the double-arm control unit, the head control unit and the mobile chassis control unit respectively. The double-arm control unit controls the humanoid double arms 77 according to the received control signal to produce humanoid behavior actions, the head control unit controls the facial expression actions of the humanoid head to produce humanoid expressions, and the mobile chassis control unit controls the movement of the mobile chassis 78. In this way, when the user issues a voice control instruction to the greeting robot, the robot makes the corresponding actions and expressions and performs the corresponding movement, giving the greeting robot a good voice interaction function, improving its intelligent interactivity and improving the user experience.
In this embodiment, the voice recognition control unit includes a voice acquisition module, a voice recognition chip, a voice main control chip and a main body control chip;
the voice acquisition module is used for acquiring a user voice instruction, converting the acquired user voice instruction into an electric signal and outputting the electric signal to the voice recognition chip; in this embodiment, the voice collecting module is a micro microphone and is installed above the humanoid main body 76, so as to collect the voice command of the user conveniently.
The output end of the voice recognition chip is connected with the data input end of the voice main control chip, and the voice recognition chip is used for recognizing the voice instruction transmitted by the voice acquisition module and outputting the recognized result to the voice main control chip; in this embodiment, the speech recognition chip uses LD3320 chip and ASR technology to recognize the user speech command.
The output end of the voice main control chip is in data connection with the main body control chip and the mobile chassis control unit respectively. The voice main control chip outputs serial port data and response audio according to the recognition result of the voice recognition chip, and outputs the serial port data to the main body control chip and the mobile chassis control unit; the mobile chassis control unit controls the motion of the mobile chassis 78 according to this serial port data. The signal output end of the main body control chip is also connected with the data receiving ends of the double-arm control unit and the head control unit respectively, so that the double-arm control unit and the head control unit control the action of the humanoid double arms 77 and the expression of the humanoid head according to the signals output by the main body control chip. Because the main body control chip sends control commands to the head control unit and the double-arm control unit simultaneously, coordination of robot motion and posture control is ensured. In this embodiment, the voice main control chip adopts an STC11L16XE chip.
Thus, when a user sends a voice control instruction, the voice acquisition module collects it, converts it into an electric signal and outputs it to the voice recognition chip for recognition. The voice recognition chip outputs the recognition result to the voice main control chip, which in turn outputs serial port data and response audio. The serial port data is output to the mobile chassis control unit and the main body control chip: the mobile chassis control unit controls the motion of the mobile chassis 78 according to the received serial port data, while the main body control chip sends control signals to the double-arm control unit and the head control unit. The double-arm control unit controls the humanoid double arms 77 to make humanoid behavior actions conforming to the user's voice control instruction, and the head control unit controls the humanoid head to make matching humanoid facial actions, so that the greeting robot makes corresponding actions and expressions according to the user's voice control instruction.
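The dispatch step described above can be sketched as a simple lookup: the voice main control chip maps each recognized instruction to the serial data sent onward to the main body control chip and the mobile chassis control unit. The command names and byte values below are illustrative assumptions, not values given in the patent:

```python
# Hypothetical sketch of the command-dispatch step performed by the voice
# main control chip. Each recognized keyword maps to one serial frame for
# the main body control chip ("body") and one for the chassis control unit
# ("chassis"). All names and byte codes are illustrative.
COMMANDS = {
    "wave":    {"body": b"\x01\x10", "chassis": b"\x00\x00"},  # arm gesture only
    "forward": {"body": b"\x00\x00", "chassis": b"\x02\x01"},  # chassis moves forward
    "greet":   {"body": b"\x01\x11", "chassis": b"\x02\x02"},  # gesture plus approach
}

def dispatch(keyword):
    """Return the (body, chassis) serial frames for a recognized keyword,
    or None if the keyword is not a known command."""
    cmd = COMMANDS.get(keyword)
    if cmd is None:
        return None  # unrecognized instruction: take no action
    return cmd["body"], cmd["chassis"]
```

In this sketch an unrecognized keyword simply produces no motion, which matches the patent's scheme of acting only on recognized voice instructions.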
In this embodiment, the voice recognition control unit further includes an audio decoding chip, an audio memory card, a speaker and a display module. An input end of the display module is connected with an output end of the main body control chip so as to display interactive information in real time. Audio files with set contents are stored in the audio memory card, and an output end of the voice main control chip is connected with an input end of the audio memory card so as to read the stored audio files. An input end of the audio decoding chip is connected with an output end of the audio memory card so as to decode the read audio file and convert it into an electrical signal for output, and the output end of the audio decoding chip is connected with the speaker so that the sound of the read audio file is output through the speaker. Specifically, the display module is a display screen obliquely arranged on the front face of the humanoid main body 76, forming an included angle of 30 degrees with the vertical direction at a height of 1.3 m above the movable chassis 78, and is used for displaying interactive information.
Therefore, after a user voice command is processed by the voice acquisition module, the voice recognition chip and the voice main control chip, the voice main control chip outputs the response audio, which causes the corresponding audio file to be read from the audio memory card. The read audio file is output to the audio decoding chip, which decodes it and converts it into an electric signal output to the loudspeaker; the loudspeaker then plays the sound of the read audio file. Meanwhile, the display module receives the signal of the main body control chip and displays the user's interactive information in real time.
In this embodiment, the audio decoding chip is a VG009 chip. The miniature microphone converts voice signals into electric signals, and its output end is connected with the input end of the voice recognition chip. The voice recognition chip recognizes voice instructions based on ASR (automatic speech recognition) technology, and its output end is connected with the voice main control chip. The voice main control chip controls serial port data output and response audio output according to the voice instructions provided by the voice recognition chip; its serial port data output end is respectively connected with the serial port data input ends of the main body control chip and the chassis control unit. The audio decoding chip acquires the response audio signal, then reads the corresponding audio file in the audio memory card, and the response audio is output by the loudspeaker.
In this embodiment, dual-arm control unit includes a dual-arm control chip and a dual-arm driving module for driving humanoid dual-arm 77, where the dual-arm control chip stores set motion parameters of humanoid dual-arm 77, an input end of the dual-arm control chip is connected to a data output end of the main body control chip to receive a control signal of the main body control chip and call correspondingly stored motion parameters of humanoid dual-arm 77 according to the received control signal of the main body control chip, an output end of the dual-arm control chip is connected to a data input end of the dual-arm driving module to send a control instruction to the dual-arm driving module according to the called motion parameters of humanoid dual-arm 77, and the dual-arm driving module controls motions of humanoid dual-arm 77 according to the control instruction sent by the dual-arm control chip.
Thus, when a user voice control instruction requires the humanoid double arms 77 to act, the main body control chip outputs a control signal to the double-arm control chip, which stores the set action parameters of the humanoid double arms 77. The double-arm control chip processes the received control signal, compares it with the prestored action parameters of the humanoid double arms 77 and extracts the matching action parameters, then converts the extracted parameter information into corresponding control signals and outputs them to the double-arm driving module, which controls the action of the humanoid double arms 77 accordingly. The humanoid double arms 77 thereby act according to the user's voice control instruction.
As shown in fig. 5, in this embodiment, the humanoid double arms 77 include two humanoid arms 83 respectively distributed on two sides of the humanoid main body 76, the double-arm driving module includes an arm steering engine driving chip located on each humanoid arm 83 and arm driving steering engines 82 located at multiple joints of the humanoid arm 83, an input end of the arm steering engine driving chip is connected with a data output end of the double-arm control chip, and multiple output interfaces of the arm steering engine driving chip are respectively connected with one arm driving steering engine 82, so that the arm steering engine driving chip can simultaneously send control signals to the multiple arm driving steering engines 82.
Therefore, when the humanoid arm 83 is controlled, the arm steering engine driving chip receives control signals from the double-arm control chip, and then the arm steering engine driving chip controls the arm driving steering engines 82 at different positions to move according to the received control signals, because the joints of the human arm are more and the movement is complex, the arm driving steering engines 82 are arranged at a plurality of joints of the humanoid arm 83, and the arm driving steering engines 82 at different positions are uniformly controlled by the arm steering engine driving chip, on one hand, the number of chips for controlling the arm driving steering engines 82 is reduced, and on the other hand, the movement control of each joint of the humanoid arm 83 is more coordinated and accurate.
In this specific embodiment, each humanoid arm 83 has 10 arm driving steering engines 82 in total. The humanoid arm 83 comprises an arm and a hand, and the vertical distance from the humanoid arm 83 to the moving chassis 78 is 1.4 m. The arm contains 5 arm driving steering engines 82, the hand has 5 fingers, and each finger is controlled by 1 arm driving steering engine 82, so that humanoid actions can be realized flexibly. The arm steering engine driving chip has 16 PWM output interfaces and can control at most 16 steering engines; the 10 arm driving steering engines 82 on the humanoid arm 83 are connected with 10 of the PWM output interfaces of the arm steering engine driving chip. During an action, the double-arm control chip sends an action control signal to the arm steering engine driving chip, which converts it into PWM signals and controls the 10 arm driving steering engines 82 to rotate in coordination, producing humanoid action groups and improving the interactive experience.
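The angle-to-PWM conversion performed by the arm steering engine driving chip can be sketched as follows. The 0.5–2.5 ms pulse range at a 50 Hz frame rate and the 180-degree travel are typical hobby-servo assumptions, not values stated in the patent:

```python
def servo_pulse_us(angle_deg, min_us=500, max_us=2500, travel_deg=180):
    """Map a joint angle to a PWM pulse width in microseconds.
    Assumes a typical hobby steering engine expecting 0.5-2.5 ms
    pulses at 50 Hz; the angle is clamped to the servo's travel."""
    angle_deg = max(0, min(travel_deg, angle_deg))
    return min_us + (max_us - min_us) * angle_deg / travel_deg

def arm_pose_to_pulses(joint_angles):
    """Convert the 10 joint angles of one humanoid arm (5 arm joints plus
    5 finger joints) into the 10 PWM pulse widths that the 16-channel
    steering-engine driver chip would output on its channels."""
    assert len(joint_angles) == 10, "one humanoid arm has 10 steering engines"
    return [servo_pulse_us(a) for a in joint_angles]
```

A full "action group" would then be a timed sequence of such 10-element poses sent to the driver chip one after another.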
In this embodiment, the mobile chassis control unit includes a chassis control chip and a chassis driving module for driving the mobile chassis 78 to move, an input end of the chassis control chip is connected to a data output end of the voice main control chip to receive serial data output by the voice main control chip, an output end of the chassis control chip is connected to the chassis driving module to send a control instruction to the chassis driving module according to the serial data output by the voice main control chip, and the chassis driving module controls the mobile chassis 78 to move according to the received control instruction. Specifically, the chassis control chip sends a control command to the chassis drive module through a PID algorithm and controls the stable movement of the mobile chassis 78.
Therefore, when a voice control instruction sent by a user needs to move the chassis 78, the voice main control chip outputs a control signal to the chassis control chip, and at the moment, the chassis control chip outputs the control signal to the chassis driving module, so that the chassis driving module controls the movement of the movable chassis 78 according to the control signal. Thereby achieving the effect that the moving chassis 78 moves according to the user voice control instruction.
In this embodiment, the chassis driving module includes a motor driving chip, and four moving motors 80 distributed around the moving chassis 78, the input end of the motor driving chip is connected with the data output end of the chassis control chip, and the output end of the motor driving chip is electrically connected with the four moving motors 80 respectively, so that the motor driving chip can send a control instruction to the four moving motors 80 simultaneously, a rotating wheel 79 and a speed measuring part 81 for collecting the rotating speed signal of the moving motors 80 are further connected to each moving motor 80 respectively, and the output end of the speed measuring part 81 is also in data connection with the chassis control chip, so that the rotating speed signal of the collected moving motors 80 is output to the chassis control chip. Specifically, the speed measurement part 81 is an AB phase magnetic encoder with filtering, the AB phase magnetic encoder can measure the positive and negative rotation direction of the motor, the speed measurement precision is high, the power inductance filtering is increased, the battery interference can be effectively reduced, the AB phase magnetic encoder can convert the rotating speed of the moving motor 80 into the pulse number, the output end of the AB phase magnetic encoder is connected with the pulse acquisition end of the chassis control chip, and the closed-loop movement control of the moving chassis 78 can be realized after the chassis control chip processes the pulse signal.
Thus, four moving motors 80 are arranged around the moving chassis 78, rotating wheels 79 are arranged on the moving motors 80, the moving chassis 78 is moved by the rotation of the rotating wheels 79, when the moving chassis 78 needs to move, a motor driving chip receives a control signal from a chassis control chip and sends a control instruction to the four moving motors 80 according to the control signal, so that the moving motors 80 rotate and further drive the four rotating wheels 79 to rotate to realize the moving effect of the moving chassis 78, meanwhile, in the moving process of the moving chassis 78, a speed measuring part 81 continuously detects a rotating speed signal of the moving motors 80 and forms deviation with a target speed set in a program, the rotating speed of the moving motors 80 can be adjusted according to the deviation by utilizing a PID algorithm, and further, the actual speed of the moving motors 80 is the same as the target speed.
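The closed-loop speed regulation described above can be sketched as a minimal PID update: each control period, the chassis control chip forms the deviation between the target speed and the encoder-measured speed and adjusts the motor command accordingly. The gains and output limit below are illustrative assumptions, not values from the patent:

```python
class SpeedPID:
    """Minimal positional PID loop for one moving motor. The measured
    speed would come from the AB-phase magnetic encoder as a pulse count
    per control period; gains and limits here are illustrative."""
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit   # clamp for the motor command
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, target, measured):
        """One control step: return the clamped motor command."""
        err = target - measured
        self.integral += err
        deriv = err - self.prev_err
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        # Clamp so the command never exceeds the drive's limits.
        return max(-self.out_limit, min(self.out_limit, out))
```

Run once per encoder sampling period, this drives the actual motor speed toward the programmed target speed, as the embodiment describes.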
In this embodiment, the rotating wheels 79 are all Mecanum wheels, and the four Mecanum wheels are mounted in an O-shaped configuration. The O-shaped mounting of the omnidirectional wheels exploits the changing friction between the Mecanum wheel rollers and the ground, so that the chassis control chip can better control the omnidirectional displacement of the greeting robot and efficiently realize omnidirectional movements such as translation, turning in place, advancing and retreating.
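For reference, the inverse kinematics that such a four-Mecanum-wheel chassis typically uses can be sketched as below. The sign convention and the geometry values (half-length lx, half-width ly, wheel radius r) are illustrative assumptions, since the patent does not give them:

```python
def mecanum_wheel_speeds(vx, vy, wz, lx=0.2, ly=0.15, r=0.05):
    """Inverse kinematics for four Mecanum wheels under one common
    O-mount sign convention. vx: forward speed (m/s), vy: leftward
    speed (m/s), wz: yaw rate (rad/s). Returns the wheel angular
    speeds (front-left, front-right, rear-left, rear-right) in rad/s.
    Geometry values are illustrative, not taken from the patent."""
    k = lx + ly
    fl = (vx - vy - k * wz) / r
    fr = (vx + vy + k * wz) / r
    rl = (vx + vy - k * wz) / r
    rr = (vx - vy + k * wz) / r
    return fl, fr, rl, rr
```

Pure forward motion drives all four wheels equally, while sideways translation and in-place turning come from opposing wheel speeds, which is what makes the O-mounted chassis omnidirectional.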
In this embodiment, the main body control chip, the double-arm control chip, the head control chip and the chassis control chip all adopt the STM32F103ZET6 chip, which has low power consumption, abundant peripheral interfaces, excellent real-time performance and low development cost.
In this embodiment, the mobile chassis control unit further comprises a following obstacle avoidance module, which comprises a human body infrared sensor and a plurality of ultrasonic sensors, all connected with the chassis control chip. The ultrasonic sensors are respectively located on the front, rear, left and right sides of the position above the humanoid main body 76. Specifically, the following obstacle avoidance module comprises 6 ultrasonic sensors in total: 3 on the front side above the humanoid main body (1 facing directly forward in the middle, and the other 2 on the two sides, each arranged at an angle of 45 degrees), 1 on the rear side, 1 on the left side and 1 on the right side. The human body infrared sensor and the 3 ultrasonic sensors on the front side above the humanoid main body 76 are used to detect human position information and transmit it to the chassis control chip, which controls the chassis driving module to drive the moving chassis 78 to follow the human motion according to the received position information. The 3 ultrasonic sensors on the rear, left and right sides of the humanoid main body 76 are used to detect obstacles on the moving path of the corresponding side and transmit the detected obstacle position information to the chassis control chip, which controls the chassis driving module to drive the moving chassis 78 to avoid the obstacles.
Thus, by arranging the following obstacle avoidance module, the greeting robot has two moving modes: a voice control mode and an autonomous following mode. In the voice control mode, the greeting robot moves according to the user's voice commands. In the autonomous following mode, the ultrasonic sensors and the human body infrared sensor located on the front side above the humanoid main body 76 detect the position of the human body in front and transmit this information to the chassis control chip, which sends a control command to the chassis driving module so that it drives the moving chassis 78 to follow the human body. The ultrasonic sensors located on the rear, left and right sides above the humanoid main body 76 also detect obstacle information on the corresponding sides and transmit it to the chassis control chip, which then controls the moving chassis 78 through the chassis driving module to avoid obstacles while moving. The effects of autonomous following and automatic obstacle avoidance in the autonomous following mode are thereby achieved.
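The following and obstacle avoidance decision described above can be sketched as a simple rule-based step. The parameter names, distance thresholds and returned action labels below are illustrative assumptions, not taken from the patent:

```python
def follow_step(pir_detected, front, left45, right45,
                follow_dist=0.8, stop_dist=0.3):
    """One hypothetical decision step of the autonomous following mode.
    pir_detected: human body infrared sensor output; front, left45,
    right45: distances (m) from the three front ultrasonic sensors.
    Thresholds are illustrative, not from the patent."""
    if not pir_detected:
        return "stop"          # no person detected: hold position
    if front < stop_dist:
        return "stop"          # person or obstacle dangerously close
    if left45 < stop_dist:
        return "turn_right"    # obstacle on the front-left: steer away
    if right45 < stop_dist:
        return "turn_left"     # obstacle on the front-right: steer away
    if front > follow_dist:
        return "forward"       # close the gap to the followed person
    return "hold"              # within the comfortable following band
```

The chassis control chip would run such a step each sensor cycle and translate the chosen action into Mecanum wheel commands.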
In this embodiment, the humanoid main body 76 is an aluminum profile frame 20 cm long, 30 cm wide and 130 cm high, with two layers of partition plates arranged on the frame: the first (upper) partition plate is 15 cm below the top of the frame, and the second partition plate is 20 cm below the first. The voice recognition control unit, except for the display module, is located on the first partition plate; the display module is located on the front side of the frame between the two partition plates; and the power supply unit, the following obstacle avoidance module and the human body infrared sensor are located on the second partition plate.
In this embodiment, the head control unit includes a head control chip and a head driving module for driving the human-simulated head to perform expression motions, the head control chip stores a set human-simulated head expression motion parameter, an input end of the head control chip is connected to a data output end of the main body control chip to receive a control signal of the main body control chip, and calls a corresponding human-simulated head expression motion parameter according to the received control signal of the main body control chip, an output end of the head control chip is connected to a data input end of the head driving module to send a control instruction to the head driving module according to the called human-simulated head expression motion parameter, and the head driving module controls the human-simulated head expression motions according to the control instruction sent by the head control chip.
Therefore, when a user voice control instruction requires the humanoid head to perform an expression action, the main body control chip outputs a control signal to the head control chip, which stores the set expression action parameters of the humanoid head. The head control chip processes the received control signal, compares it with the prestored expression action parameters of the humanoid head, extracts the best-matching expression action parameters, converts the extracted parameter information into corresponding control signals and outputs them to the head driving module, which controls the expression actions of the humanoid head accordingly. The humanoid head thereby performs expression actions according to the user's voice control instruction.
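The stored expression parameter table can be sketched as a mapping from a control code to a set of head steering engine angles for the eyebrows, eyelids, jaw and mouth corners. The expression names and angle values below are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch of the expression parameters stored in the head
# control chip: each control code selects one pre-stored set of head
# steering-engine angles (degrees). Names and values are illustrative.
EXPRESSIONS = {
    "smile":    {"brow": 10,  "upper_lid": 80, "jaw": 15, "mouth_corner": 25},
    "surprise": {"brow": 35,  "upper_lid": 95, "jaw": 40, "mouth_corner": 5},
    "anger":    {"brow": -20, "upper_lid": 60, "jaw": 5,  "mouth_corner": -15},
}

def head_pose(code):
    """Look up the stored expression action parameters for a control
    code, or None if the code matches no stored expression."""
    return EXPRESSIONS.get(code)
```

The head driving module would then convert each angle in the selected set into a PWM signal for the corresponding head driving steering engine.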
In this embodiment, the humanoid head comprises eyebrows, eyelids, eyeballs, a mouth, a chin and a neck. The head driving module comprises a head steering engine driving chip and a plurality of head driving steering engines located on the humanoid head for driving the eyebrows, eyelids, eyeballs, mouth, chin and neck respectively. The input end of the head steering engine driving chip is connected with the data output end of the head control chip, and a plurality of output interfaces of the head steering engine driving chip are each connected with one head driving steering engine, so that the head steering engine driving chip can send control signals to the plurality of head driving steering engines simultaneously.
Therefore, when the expression action of the human-simulated head is controlled, the head steering engine driving chip receives a control signal from the head control chip, and then the head steering engine driving chip controls the head driving steering engines at different positions to move according to the received control signal.
As shown in fig. 6 to fig. 14, in the present embodiment, the humanoid head further includes a head skeleton 1 and a neck skeleton 72, the head skeleton 1 is provided with an eye mechanism and a mouth mechanism, and the neck skeleton 72 is provided with a neck mechanism;
the eye mechanism comprises an eyebrow assembly, an upper eyelid assembly, a lower eyelid assembly and an eyeball assembly, the eyebrow assembly comprises an eyebrow 2, a first eyebrow driving unit and a second eyebrow driving unit, the eyebrow of the eyebrow 2 is connected with the first eyebrow driving unit, so that the first eyebrow driving unit can drive the eyebrow 2 to rotate around the eyebrow tip of the eyebrow, and the second eyebrow driving unit is connected with the first eyebrow driving unit, so that the second eyebrow driving unit can drive the eyebrow of the eyebrow 2 to vertically move through the first eyebrow driving unit; wherein, the eyebrow 2 is made of soft material.
The upper eyelid component and the lower eyelid component comprise a left eyelid, a left eyelid driving unit, a right eyelid and a right eyelid driving unit, the left eyelid is connected with the left eyelid driving unit so that the left eyelid driving unit can drive the left eyelid to close and open, and the right eyelid is connected with the right eyelid driving unit so that the right eyelid driving unit can drive the right eyelid to close and open;
the eyeball assembly comprises a left eyeball 33, a right eyeball 36, a left and right eyeball driving unit and an up and down eyeball driving unit, wherein the left and right eyeball driving unit is simultaneously connected with the left eyeball 33 and the right eyeball 36 so as to drive the left eyeball 33 and the right eyeball 36 to synchronously move leftwards or rightwards, and the up and down eyeball driving unit is simultaneously connected with the left eyeball 33 and the right eyeball 36 so as to drive the left eyeball 33 and the right eyeball 36 to synchronously move upwards or downwards;
the mouth mechanism comprises a jaw assembly and a lip assembly, the jaw assembly comprises a jaw 3 and a jaw driving unit, and the jaw driving unit is connected with the jaw 3 and is used for driving the jaw 3 to realize opening and closing motions; the lip assembly comprises a lip unit, a lip driving unit and a mouth corner driving unit, the lip unit comprises a mouth lip part and a mouth corner part, the lip driving unit is connected with the lip part and used for driving the mouth lip part to move up and down, and the mouth corner driving unit is connected with the mouth corner part and used for driving the mouth corner part to move up and down;
the neck mechanism comprises a neck and a neck driving unit, and the neck driving unit is connected with the neck and used for driving the neck to rotate.
Thus, when the humanoid head is in use and the eyebrow 2 needs to be controlled, the first eyebrow driving unit rotates the eyebrow 2 around its eyebrow tip, simulating the contracted (frowning) and relaxed states of a human eyebrow, while the second eyebrow driving unit drives the brow of the eyebrow 2 to move vertically through the first eyebrow driving unit, realizing the raised and drooping states of the eyebrow 2. Through the joint control of the first eyebrow driving unit and the second eyebrow driving unit, the eyebrow 2 can therefore simulate human eyebrow states under a variety of emotional states.
When the opening and closing states of the left eyelid and the right eyelid need to be controlled, the left eyelid driving unit controls the left eyelid to be closed or opened, and the right eyelid driving unit controls the right eyelid to be closed or opened, so that the closing or opening states of the left eyelid and the right eyelid can be controlled independently, and complex actions such as single eye blinking and the like can be further realized.
When the left eyeball 33 and the right eyeball 36 need to be controlled to move up and down or left and right, the eyeball left-right driving unit synchronously drives the left eyeball 33 and the right eyeball 36 to move left or right at the same time, and the eyeball up-down driving unit synchronously drives them to move up or down at the same time.
When the jaw 3 needs to be controlled, the jaw driving unit controls the opening and closing movement of the jaw 3. Meanwhile, the lip part and the mouth corner part of the lip unit are respectively driven by the lip driving unit and the mouth corner driving unit: the lip driving unit drives the lip part to move up and down, and the mouth corner driving unit drives the mouth corner part to move up and down, so that the lip part and the mouth corner part can be controlled more accurately.
When the neck movement needs to be controlled, the neck driving unit drives the neck to rotate so as to realize the rotation action of the human neck.
In conclusion, by controlling the eyebrows 2, the upper and lower eyelids, the left eyeball 33, the right eyeball 36, the jaw 3, the lip part, the mouth corner part and the neck, complex human expression states such as anger, surprise, fear, joy, disgust, sadness and smiling can be simulated, and the control of each part of the head structure is more accurate, so that the simulated human expression states are more vivid.
In this embodiment, the eyebrow 2 assembly further includes an eyebrow mount 28, the eyebrow mount 28 is fixedly connected to the head frame 1, and an eyebrow tip of the eyebrow 2 is rotatably connected to the eyebrow mount 28, so that the eyebrow 2 can rotate around the eyebrow tip, the first eyebrow driving unit includes a first eyebrow driving steering engine 29 and an eyebrow rotating link which are longitudinally arranged, one end of the eyebrow rotating link close to the first eyebrow driving steering engine 29 is fixedly connected to a rotating shaft of the first eyebrow driving steering engine 29, and one end of the eyebrow rotating link far away from the first eyebrow driving steering engine 29 is connected to the eyebrow tip of the eyebrow 2, so that when the first eyebrow driving steering engine 29 rotates, the eyebrow tip of the eyebrow 2 can be driven to rotate around the eyebrow tip by the eyebrow rotating link;
The second eyebrow driving unit comprises a second eyebrow driving steering engine 40 arranged along the axial direction, which is fixedly connected on the head skeleton 1 through a steering engine mounting piece. A steering engine outer sleeve 41 is sleeved on the first eyebrow driving steering engine 29, and a first connecting sleeve and a second connecting sleeve are respectively arranged on the two axial sides of the steering engine outer sleeve 41. The first connecting sleeve is connected with the rotating shaft of the second eyebrow driving steering engine 40; a first support frame 30 is further arranged on the head skeleton 1 at the position corresponding to the second connecting sleeve, and the second connecting sleeve is rotatably connected with the first support frame 30, so that the second eyebrow driving steering engine 40 can drive the first eyebrow driving steering engine 29 to rotate through the steering engine outer sleeve 41.
Therefore, when the eyebrow 2 needs to rotate about its tip, the first eyebrow driving steering engine 29 rotates and drives the eyebrow rotating link, which in turn drives the brow head of the eyebrow 2 to rotate about the eyebrow tip; the contracted (knitted-brow) and relaxed states of the eyebrow 2 can thus be controlled through the first eyebrow driving steering engine 29.
When the upper part of the eyebrow 2 needs to be lifted or lowered, the second eyebrow driving steering engine 40 rotates, and its rotating shaft drives the steering engine outer sleeve 41 to rotate vertically through the first connecting sleeve. Because the first eyebrow driving steering engine 29 is mounted on the steering engine outer sleeve 41, it rotates together with the sleeve, and through the eyebrow rotating link it drives the brow head of the eyebrow 2 to move vertically, lifting or lowering the upper part of the eyebrow 2; the lifted or drooped state of the eyebrow 2 can thus be controlled through the second eyebrow driving steering engine 40.
In summary, by jointly controlling the first eyebrow driving steering engine 29 and the second eyebrow driving steering engine 40, the eyebrow 2 can imitate human expressions in a variety of states: for example, the brow-head region of the eyebrow 2 moves down when a person is angry and moves up when a person is sad.
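By way of illustration only, the joint control of the two eyebrow steering engines described above can be sketched as a mapping from a named emotion to a pair of angle commands. The servo naming, the angle values and the emotion table below are assumptions for demonstration and are not taken from the patent.

```python
# Illustrative sketch only; angle signs and the emotion table are assumptions.

def eyebrow_angles(emotion):
    """Return (brow_head_deg, lift_deg): commands for the first eyebrow
    driving steering engine 29 (rotates the brow head about the eyebrow tip)
    and the second eyebrow driving steering engine 40 (lifts or droops the
    upper part of the eyebrow). 0 degrees is the relaxed neutral pose."""
    table = {
        "neutral": (0, 0),
        "angry": (-20, -10),   # brow head moves down, upper eyebrow droops
        "sad": (15, 10),       # brow head moves up, upper eyebrow lifts
    }
    if emotion not in table:
        raise ValueError(f"unknown emotion: {emotion}")
    return table[emotion]
```

Additional expressions would extend the table with further angle pairs for the same two steering engines.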
In this embodiment, the upper and lower eyelid assembly further includes a left eyelid fixing frame 37, a right eyelid fixing frame 38 and an eyelid fixing column 39. The left eyelid fixing frame 37 is fixedly connected to the head skeleton 1 on the left eyelid side, the right eyelid fixing frame 38 is fixedly connected to the head skeleton 1 on the right eyelid side, and the eyelid fixing column 39 is fixedly connected to the head skeleton 1 between the left eyelid and the right eyelid. The left eyelid includes a left upper eyelid 31 and a left lower eyelid 32, and the right eyelid includes a right upper eyelid 34 and a right lower eyelid 35. The side of the left upper eyelid 31 close to the left eyelid fixing frame 37 is rotatably connected to that frame through a left upper eyelid connecting piece, and the side of the left lower eyelid 32 close to the left eyelid fixing frame 37 is rotatably connected to that frame through a left lower eyelid connecting piece; the side of the right upper eyelid 34 close to the right eyelid fixing frame 38 is rotatably connected to that frame through a right upper eyelid connecting piece, and the side of the right lower eyelid 35 close to the right eyelid fixing frame 38 is rotatably connected to that frame through a right lower eyelid connecting piece; the inner sides of the four eyelids are rotatably connected to the eyelid fixing column 39 through middle upper and middle lower eyelid connecting pieces.
Thus, the left eyelid fixing frame 37, the right eyelid fixing frame 38 and the eyelid fixing column 39 provide rotary support for the left and right eyelids: the two ends of the left upper eyelid 31 are rotatably connected to the left eyelid fixing frame 37 and the eyelid fixing column 39 through the left upper eyelid connecting piece and the middle upper eyelid connecting piece respectively, and the two ends of the left lower eyelid 32 are rotatably connected to the left eyelid fixing frame 37 and the eyelid fixing column 39 through the left lower eyelid connecting piece and the middle lower eyelid connecting piece respectively; the two ends of the right upper eyelid 34 are rotatably connected to the right eyelid fixing frame 38 and the eyelid fixing column 39 through the right upper eyelid connecting piece and the middle upper eyelid connecting piece respectively, and the two ends of the right lower eyelid 35 are rotatably connected to the right eyelid fixing frame 38 and the eyelid fixing column 39 through the right lower eyelid connecting piece and the middle lower eyelid connecting piece respectively.
In this embodiment, the left eyelid driving unit includes a left upper eyelid driving steering engine 42 and a left lower eyelid driving steering engine 45 fixedly connected to the head skeleton 1. A left upper eyelid rudder arm 43 is connected to the rotating shaft of the left upper eyelid driving steering engine 42, a left upper eyelid driving connecting rod 44 is connected to the left upper eyelid rudder arm 43, and the end of the left upper eyelid driving connecting rod 44 far from the left upper eyelid rudder arm 43 is connected with the left upper eyelid 31, so that when the left upper eyelid rudder arm 43 rotates, the left upper eyelid driving connecting rod 44 can drive the left upper eyelid 31 to close or open; a left lower eyelid rudder arm 46 is connected to the rotating shaft of the left lower eyelid driving steering engine 45, a left lower eyelid driving connecting rod 47 is connected to the left lower eyelid rudder arm 46, and the end of the left lower eyelid driving connecting rod 47 far from the left lower eyelid rudder arm 46 is connected with the left lower eyelid 32, so that when the left lower eyelid rudder arm 46 rotates, the left lower eyelid driving connecting rod 47 can drive the left lower eyelid 32 to close or open;
the right eyelid driving unit includes a right upper eyelid driving steering engine 48 and a right lower eyelid driving steering engine 51 fixedly connected to the head skeleton 1. A right upper eyelid rudder arm 49 is connected to the rotating shaft of the right upper eyelid driving steering engine 48, a right upper eyelid driving connecting rod 50 is connected to the right upper eyelid rudder arm 49, and the end of the right upper eyelid driving connecting rod 50 far from the right upper eyelid rudder arm 49 is connected with the right upper eyelid 34, so that when the right upper eyelid rudder arm 49 rotates, the right upper eyelid driving connecting rod 50 can drive the right upper eyelid 34 to close or open; a right lower eyelid rudder arm 52 is connected to the rotating shaft of the right lower eyelid driving steering engine 51, a right lower eyelid driving connecting rod 53 is connected to the right lower eyelid rudder arm 52, and the end of the right lower eyelid driving connecting rod 53 far from the right lower eyelid rudder arm 52 is connected with the right lower eyelid 35, so that when the right lower eyelid rudder arm 52 rotates, the right lower eyelid driving connecting rod 53 can drive the right lower eyelid 35 to close or open.
Thus, to close or open the left upper eyelid 31, the left upper eyelid driving steering engine 42 rotates; its rotation is transmitted through the left upper eyelid rudder arm 43 to the left upper eyelid driving connecting rod 44, which drives the left upper eyelid 31, so closing and opening of the left upper eyelid 31 are controlled by the forward or reverse rotation of the left upper eyelid driving steering engine 42;
similarly, to close or open the left lower eyelid 32, the left lower eyelid driving steering engine 45 rotates; its rotation is transmitted through the left lower eyelid rudder arm 46 to the left lower eyelid driving connecting rod 47, which drives the left lower eyelid 32, so closing and opening of the left lower eyelid 32 are controlled by the forward or reverse rotation of the left lower eyelid driving steering engine 45;
to close or open the right upper eyelid 34, the right upper eyelid driving steering engine 48 rotates; its rotation is transmitted through the right upper eyelid rudder arm 49 to the right upper eyelid driving connecting rod 50, which drives the right upper eyelid 34, so closing and opening of the right upper eyelid 34 are controlled by the forward or reverse rotation of the right upper eyelid driving steering engine 48;
similarly, to close or open the right lower eyelid 35, the right lower eyelid driving steering engine 51 rotates; its rotation is transmitted through the right lower eyelid rudder arm 52 to the right lower eyelid driving connecting rod 53, which drives the right lower eyelid 35, so closing and opening of the right lower eyelid 35 are controlled by the forward or reverse rotation of the right lower eyelid driving steering engine 51.
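As an illustrative sketch only, a blink under this four-servo arrangement amounts to sending one "close" command frame followed by one "open" frame to the four eyelid driving steering engines. The servo naming and the closing angle below are assumptions, not values from the patent.

```python
# Illustrative sketch only; servo names and the closing angle are assumptions.

def blink_commands(close_deg=30):
    """Return the ordered command frames for one blink: all four eyelid
    driving steering engines (42, 45, 48, 51) first move to the closed
    angle, then all return to 0 (open)."""
    servos = ["left_upper_42", "left_lower_45", "right_upper_48", "right_lower_51"]
    close_frame = {s: close_deg for s in servos}
    open_frame = {s: 0 for s in servos}
    return [close_frame, open_frame]
```

Because each eyelid has its own steering engine, asymmetric motions such as a one-eyed wink would simply omit the other side's entries from the frames.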
In this embodiment, a left eyeball carrier plate 70 is further fixedly connected to the left eyeball 33, left eyeball connecting portions are symmetrically formed on the two axial sides of the left eyeball carrier plate 70, and a left eyeball support post 63 is connected between the two left eyeball connecting portions. A left eyeball rotating member 64 is further connected to the left eyeball support post 63, and a left ball-head piece 65 is connected to the end of the left eyeball rotating member 64 far from the left eyeball support post 63. The left ball-head piece 65 includes a left threaded portion and a left ball-head portion; the left threaded portion is in threaded connection with the left eyeball rotating member 64, and a left spherical joint 66 is rotatably connected at the left ball-head portion. The end of the left spherical joint 66 close to the left ball-head portion is formed with a left spherical groove for rotatable connection with the left ball-head portion, and the end of the left spherical joint 66 far from the left ball-head portion is fixedly connected to the head skeleton 1;
a right eyeball carrier plate 71 is likewise fixedly connected to the right eyeball 36, right eyeball connecting portions are symmetrically formed on the two axial sides of the right eyeball carrier plate 71, and a right eyeball support post 67 is connected between the two right eyeball connecting portions. A right eyeball rotating member 68 is further connected to the right eyeball support post 67, and a right ball-head piece 69 is connected to the end of the right eyeball rotating member 68 far from the right eyeball support post 67. The right ball-head piece 69 includes a right threaded portion and a right ball-head portion; the right threaded portion is in threaded connection with the right eyeball rotating member 68, and a right spherical joint is rotatably connected at the right ball-head portion. The end of the right spherical joint close to the right ball-head portion is formed with a right spherical groove for rotatable connection with the right ball-head portion, and the end of the right spherical joint far from the right ball-head portion is fixedly connected to the head skeleton 1.
Thus, when the left eyeball 33 rotates left and right, it synchronously drives the left eyeball carrier plate 70, the left eyeball support post 63, the left eyeball rotating member 64 and the left ball-head piece 65 to rotate, and the left ball-head portion of the left ball-head piece 65 rotates in the left spherical groove of the left spherical joint 66; since the left spherical joint 66 is fixedly connected to the head skeleton 1, the left eyeball 33 is rotatably supported and mounted on the head skeleton 1.
Similarly, when the right eyeball 36 rotates left and right, it synchronously drives the right eyeball carrier plate 71, the right eyeball support post 67, the right eyeball rotating member 68 and the right ball-head piece 69 to rotate, and the right ball-head portion of the right ball-head piece 69 rotates in the right spherical groove of the right spherical joint; since the right spherical joint is fixedly connected to the head skeleton 1, the right eyeball 36 is rotatably supported and mounted on the head skeleton 1.
In this embodiment, the left-right eyeball driving unit includes a left-right eyeball driving steering engine 54. A left-right eyeball rudder arm 55 is connected to the rotating shaft of the left-right eyeball driving steering engine 54, and a left-right eyeball driving connecting rod is provided at the end of the left-right eyeball rudder arm 55 far from the steering engine. A left eyeball connecting rod 56 is connected to the end of the left-right eyeball driving connecting rod close to the left eyeball 33, and the end of the left eyeball connecting rod 56 far from the driving connecting rod is connected with the left eyeball carrier plate 70, so that when the left-right eyeball driving steering engine 54 rotates, it can drive the left eyeball 33 to move left and right through the left eyeball connecting rod 56; a right eyeball connecting rod 57 is connected to the end of the left-right eyeball driving connecting rod close to the right eyeball 36, and the end of the right eyeball connecting rod 57 far from the driving connecting rod is connected with the right eyeball carrier plate 71, so that when the left-right eyeball driving steering engine 54 rotates, it can drive the right eyeball 36 to move left and right through the right eyeball connecting rod 57;
the up-down eyeball driving unit includes an up-down eyeball driving steering engine 58. An up-down eyeball rudder arm 59 is connected to the rotating shaft of the up-down eyeball driving steering engine 58, and an up-down eyeball driving connecting rod 60 is provided at the end of the up-down eyeball rudder arm 59 far from the steering engine. A left eyeball up-down connecting rod 61 is connected to the end of the up-down eyeball driving connecting rod 60 close to the left eyeball 33, and the end of the left eyeball up-down connecting rod 61 far from the driving connecting rod 60 is connected with the lower end of the left eyeball carrier plate 70, so that when the up-down eyeball driving steering engine 58 rotates, it can drive the left eyeball 33 to move up and down through the left eyeball up-down connecting rod 61; a right eyeball up-down connecting rod 62 is connected to the end of the up-down eyeball driving connecting rod 60 close to the right eyeball 36, and the end of the right eyeball up-down connecting rod 62 far from the driving connecting rod 60 is connected with the lower end of the right eyeball carrier plate 71, so that when the up-down eyeball driving steering engine 58 rotates, it can drive the right eyeball 36 to move up and down through the right eyeball up-down connecting rod 62.
Thus, to control the left-right movement of the left eyeball 33 and the right eyeball 36, the left-right eyeball driving steering engine 54 is started; its rotation drives the left-right eyeball rudder arm 55, which drives the left-right eyeball driving connecting rod, which in turn transmits the motion to the left eyeball connecting rod 56 and the right eyeball connecting rod 57. The left eyeball connecting rod 56 drives the left eyeball 33 through the left eyeball carrier plate 70, and the right eyeball connecting rod 57 drives the right eyeball 36 through the right eyeball carrier plate 71, so the single left-right eyeball driving steering engine 54 synchronously controls the left-right movement of both eyeballs.
Similarly, to control the up-down movement of the left eyeball 33 and the right eyeball 36, the up-down eyeball driving steering engine 58 is started; it drives the up-down eyeball rudder arm 59, which drives the up-down eyeball driving connecting rod 60, which in turn transmits the motion to the left eyeball up-down connecting rod 61 and the right eyeball up-down connecting rod 62. The left eyeball up-down connecting rod 61 drives the left eyeball 33 through the left eyeball carrier plate 70, and the right eyeball up-down connecting rod 62 drives the right eyeball 36 through the right eyeball carrier plate 71, so the single up-down eyeball driving steering engine 58 synchronously controls the up-down movement of both eyeballs.
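The two-servo gaze arrangement above can be sketched as follows: one angle command per axis, applied to both eyeballs at once because each axis shares a single linkage. This is an illustrative sketch only; the travel limits and the normalized-coordinate convention are assumptions, not values from the patent.

```python
# Illustrative sketch only; the angular travel limits are assumptions.

def gaze_to_servo(x, y, max_h=25.0, max_v=15.0):
    """Map a normalized gaze target (x, y), each in [-1, 1], to angle
    commands for the left-right eyeball driving steering engine 54 and
    the up-down eyeball driving steering engine 58. One servo per axis
    drives BOTH eyeballs through the shared connecting rods."""
    clamp = lambda v: max(-1.0, min(1.0, v))  # keep targets inside range
    return {"left_right_54": clamp(x) * max_h, "up_down_58": clamp(y) * max_v}
```

A face-tracking camera, for example, could feed the detected face offset straight into `x` and `y` so the robot's gaze follows a visitor.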
In the present embodiment, the lip part includes an upper lip and a lower lip;
the lip drive unit comprises an upper left lip drive steering engine 10, an upper right lip drive steering engine 12, a lower left lip drive steering engine 14 and a lower right lip drive steering engine 16, an upper left lip rudder horn 11 is connected to a rotating shaft of the upper left lip drive steering engine 10, the upper left lip rudder horn 11 is far away from one end of the upper left lip drive steering engine 10 and is connected with the left side of an upper lip, so that the upper left lip drive steering engine 10 can drive the left position of the upper lip to move upwards or downwards through the upper left lip rudder horn 11 when rotating, an upper right lip rudder horn 13 is connected to a rotating shaft of the upper right lip drive steering engine 12, one end of the upper right lip drive steering engine 12, which is far away from the upper right lip, is connected with the right side of the upper right lip drive steering engine 12, so that the upper right lip drive steering engine 12 can drive the right position of the upper lip to move upwards or downwards through the upper right lip rudder horn 13 when rotating, a left lower lip drive steering engine horn 15 is connected with the lower left lip drive steering engine horn 17, so that the lower left lip drive steering engine 15 and the lower left lip drive steering engine 16 and the lower left lip drive steering engine can drive steering engine 17 or right lip drive steering engine 16 when rotating, the lower left lip drive steering engine 16.
Therefore, to control the left side of the upper lip, the left upper lip driving steering engine 10 is started; it transmits the motion to the left upper lip rudder horn 11, which drives the left side of the upper lip, and the direction of rotation of the steering engine determines whether the left side of the upper lip moves up or down;
similarly, to control the right side of the upper lip, the right upper lip driving steering engine 12 is started; it transmits the motion to the right upper lip rudder horn 13, which drives the right side of the upper lip, and the direction of rotation of the steering engine determines whether the right side of the upper lip moves up or down.
To control the left side of the lower lip, the left lower lip driving steering engine 14 is started; it transmits the motion to the left lower lip rudder horn 15, which drives the left side of the lower lip, and the direction of rotation of the steering engine determines whether the left side of the lower lip moves up or down;
similarly, to control the right side of the lower lip, the right lower lip driving steering engine 16 is started; it transmits the motion to the right lower lip rudder horn 17, which drives the right side of the lower lip, and the direction of rotation of the steering engine determines whether the right side of the lower lip moves up or down.
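As an illustrative sketch only, a mouth-opening pose under this four-servo lip arrangement raises the two upper-lip steering engines and lowers the two lower-lip steering engines by proportional amounts. The scale factors below are assumptions, not values from the patent.

```python
# Illustrative sketch only; the angle scale factors are assumptions.

def mouth_open_pose(amount):
    """amount in [0, 1]: 0 fully closed, 1 fully open. The upper-lip
    steering engines (10, 12) raise the upper lip while the lower-lip
    steering engines (14, 16) lower the lower lip; each side is addressed
    separately, so asymmetric shapes are also possible."""
    a = max(0.0, min(1.0, amount))          # clamp to the valid range
    up, down = 20.0 * a, -25.0 * a          # assumed travel per unit amount
    return {"upper_left_10": up, "upper_right_12": up,
            "lower_left_14": down, "lower_right_16": down}
```

Sequencing such poses over time would give simple lip-sync motion during the robot's spoken responses.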
In this embodiment, the mouth corner portion includes a left mouth corner and a right mouth corner, and the mouth corner driving unit includes a left upper mouth corner driving steering engine 18, a left lower mouth corner driving steering engine 20, a right upper mouth corner driving steering engine 24 and a right lower mouth corner driving steering engine 26. A left upper mouth corner rudder arm is connected to the rotating shaft of the left upper mouth corner driving steering engine 18, and a left upper mouth corner driving connecting rod 19 is connected to the end of that rudder arm far from the steering engine; a left lower mouth corner rudder arm is connected to the rotating shaft of the left lower mouth corner driving steering engine 20, and a left lower mouth corner driving connecting rod 21 is connected to the end of that rudder arm far from the steering engine; the free ends of the left upper mouth corner driving connecting rod 19 and the left lower mouth corner driving connecting rod 21 are rotatably connected to each other and then connected with the left mouth corner.
Likewise, a right upper mouth corner rudder arm is connected to the rotating shaft of the right upper mouth corner driving steering engine 24, and a right upper mouth corner driving connecting rod 25 is connected to the end of that rudder arm far from the steering engine; a right lower mouth corner rudder arm is connected to the rotating shaft of the right lower mouth corner driving steering engine 26, and a right lower mouth corner driving connecting rod 27 is connected to the end of that rudder arm far from the steering engine; the free ends of the right upper mouth corner driving connecting rod 25 and the right lower mouth corner driving connecting rod 27 are rotatably connected to each other and then connected with the right mouth corner.
Therefore, to control the left mouth corner, the rotation of the left upper mouth corner driving steering engine 18 is transmitted through the left upper mouth corner rudder arm to the left upper mouth corner driving connecting rod 19, and the rotation of the left lower mouth corner driving steering engine 20 is transmitted through the left lower mouth corner rudder arm to the left lower mouth corner driving connecting rod 21; since the two connecting rods are rotatably connected to each other and then connected with the left mouth corner, the left upper mouth corner driving steering engine 18 and the left lower mouth corner driving steering engine 20 jointly control the movement of the left mouth corner.
Similarly, to control the right mouth corner, the rotation of the right upper mouth corner driving steering engine 24 is transmitted through the right upper mouth corner rudder arm to the right upper mouth corner driving connecting rod 25, and the rotation of the right lower mouth corner driving steering engine 26 is transmitted through the right lower mouth corner rudder arm to the right lower mouth corner driving connecting rod 27; since the two connecting rods are rotatably connected to each other and then connected with the right mouth corner, the right upper mouth corner driving steering engine 24 and the right lower mouth corner driving steering engine 26 jointly control the movement of the right mouth corner.
In conclusion, since the left and right mouth corners are controlled by four steering engines, more complex expressions can be simulated, and the simulated expressions come closer to real human expressions.
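By way of illustration only, the four-servo mouth corner control above can be sketched as a table of angle pairs, one pair per corner; the expression names and angle values are assumptions for demonstration, not from the patent.

```python
# Illustrative sketch only; the expression table values are assumptions.

def corner_pose(expression):
    """Return, for each mouth corner, the angle pair for its two driving
    steering engines: (18, 20) for the left corner and (24, 26) for the
    right. Symmetric expressions mirror the same pair to both sides."""
    table = {"neutral": (0, 0), "smile": (15, 5), "frown": (-15, -5)}
    upper, lower = table[expression]
    return {"left": {"upper_18": upper, "lower_20": lower},
            "right": {"upper_24": upper, "lower_26": lower}}
```

An asymmetric expression such as a smirk would simply assign different pairs to the `left` and `right` entries.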
In this embodiment, the upper lip includes a first upper lip 4, a second upper lip 5 and a third upper lip 6, and the lower lip includes a first lower lip 7, a second lower lip 8 and a third lower lip 9. The second upper lip 5 is arranged axially, and its two axial ends are connected with the left upper lip rudder horn 11 and the right upper lip rudder horn 13 respectively; the end of the first upper lip 4 close to the second upper lip 5 is connected on the left upper lip rudder horn 11, and the end of the third upper lip 6 close to the second upper lip 5 is connected on the right upper lip rudder horn 13. The second lower lip 8 is arranged parallel to the second upper lip 5, and its two axial ends are connected with the left lower lip rudder horn 15 and the right lower lip rudder horn 17 respectively; the end of the first lower lip 7 close to the second lower lip 8 is connected on the left lower lip rudder horn 15, and the end of the third lower lip 9 close to the second lower lip 8 is connected on the right lower lip rudder horn 17. The ends of the first upper lip 4 and the third upper lip 6 far from the second upper lip 5, and the ends of the first lower lip 7 and the third lower lip 9 far from the second lower lip 8, are inclined toward the mouth corners.
Thus, by dividing the upper lip into the first upper lip 4, the second upper lip 5 and the third upper lip 6, and the lower lip into the first lower lip 7, the second lower lip 8 and the third lower lip 9, the mouth corners and lips can be controlled more precisely.
In this embodiment, the lower jaw driving unit includes a left lower jaw driving steering engine 22 and a right lower jaw driving steering engine 23, a rotating shaft of the left lower jaw driving steering engine 22 is connected to the left side of the lower jaw 3, and a rotating shaft of the right lower jaw driving steering engine 23 is connected to the right side of the lower jaw 3.
Thus, when the lower jaw 3 is controlled, the rotating shaft of the left lower jaw driving steering engine 22 directly drives the left side of the lower jaw 3, and the rotating shaft of the right lower jaw driving steering engine 23 directly drives the right side of the lower jaw 3.
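As an illustrative sketch only, the two jaw steering engines can be commanded together. The sign inversion on one side below is an assumption: if the two steering engine shafts face each other across the jaw, mirrored commands would be needed for both to rotate the jaw the same way; the patent text does not state the shaft orientations.

```python
# Illustrative sketch only; the right-side sign inversion is an assumption.

def jaw_commands(open_deg):
    """Return the commands for the left (22) and right (23) lower jaw
    driving steering engines for a given jaw opening angle in degrees."""
    return {"left_22": open_deg, "right_23": -open_deg}
```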
In this embodiment, the neck driving unit includes a first neck driving steering engine 73, a second neck driving steering engine 74 and a third neck driving steering engine 75; the first neck driving steering engine 73 drives the neck to rotate axially, the second neck driving steering engine 74 drives the neck to rotate vertically, and the third neck driving steering engine 75 is connected with the head skeleton 1 to drive the head skeleton 1 to rotate about the neck axis.
When the neck is controlled, the first neck driving steering engine 73, the second neck driving steering engine 74 and the third neck driving steering engine 75 each produce rotation of the neck in a different direction, thereby achieving a human-like neck motion.
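The three neck degrees of freedom above can be sketched as a clamped pose command, one angle per steering engine. This is an illustrative sketch only; the ±45 degree travel limit is an assumption, not a value from the patent.

```python
# Illustrative sketch only; the +/-45 degree travel limit is an assumption.

def neck_pose(neck_yaw=0.0, neck_pitch=0.0, head_yaw=0.0, limit=45.0):
    """Clamp and return the three neck commands: steering engine 73 turns
    the neck about its axis, 74 pitches it vertically, and 75 turns the
    head skeleton 1 about the neck axis."""
    clamp = lambda v: max(-limit, min(limit, v))
    return {"neck_yaw_73": clamp(neck_yaw),
            "neck_pitch_74": clamp(neck_pitch),
            "head_yaw_75": clamp(head_yaw)}
```

A greeting gesture such as a nod would then be a short sequence of poses varying only `neck_pitch`.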
The control system of the welcome robot is unitized: each control unit has its own main control chip, and the main control chips of the units communicate through serial ports to transmit control signals, giving high signal transmission efficiency without mutual interference. Each unit is programmed independently with a unified communication interface, which reduces development difficulty; in addition, when a unit of the robot fails, only that unit needs to be overhauled rather than testing the whole system, which greatly facilitates future maintenance.
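The patent states only that the unit master chips communicate over serial ports through a unified interface; the byte layout below (start byte, unit id, command, length, payload, checksum) is an assumed example of what such a framed serial protocol could look like, not the patent's actual protocol.

```python
# Illustrative sketch only; this frame layout is an assumption, not the
# protocol disclosed in the patent.

def encode_frame(unit_id, command, payload=b""):
    """Build one serial frame: 0xAA | unit | cmd | len | payload | sum."""
    body = bytes([unit_id, command, len(payload)]) + payload
    checksum = sum(body) & 0xFF  # 8-bit additive checksum over the body
    return b"\xaa" + body + bytes([checksum])

def decode_frame(frame):
    """Parse a frame produced by encode_frame, verifying the checksum."""
    if frame[0] != 0xAA:
        raise ValueError("bad start byte")
    unit_id, command, length = frame[1], frame[2], frame[3]
    payload = frame[4:4 + length]
    if (sum(frame[1:4 + length]) & 0xFF) != frame[4 + length]:
        raise ValueError("bad checksum")
    return unit_id, command, payload
```

With such framing, the main body control chip could address the dual-arm, head, and mobile chassis units by `unit_id` over their respective serial links, and a corrupted byte would be caught by the checksum rather than driving a servo to a wrong angle.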
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit them; those skilled in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the present invention without departing from their spirit and scope, all of which should be covered by the claims of the present invention.

Claims (7)

1. A welcome robot system with emotion interaction function comprises a human-simulated main body, human-simulated double arms, a human-simulated head and a mobile chassis, and is characterized by further comprising a power supply unit, a voice recognition control unit, a double-arm control unit, a head control unit and a mobile chassis control unit;
the power supply unit is used for respectively supplying power to the voice recognition control unit, the double-arm control unit, the head control unit and the mobile chassis control unit;
the voice recognition control unit is arranged on the humanoid main body and used for recognizing a voice command of a user, and simultaneously converting the recognized voice command of the user into control signals to be respectively output to the double-arm control unit, the head control unit and the mobile chassis control unit;
the double-arm control unit is arranged on the humanoid double arms and is used for controlling the actions of the humanoid double arms according to the control signals output by the voice recognition control unit;
the head control unit is arranged on the human-simulated head and is used for controlling the expression and the action of the human-simulated head according to the control signal output by the voice recognition control unit;
the mobile chassis control unit is arranged on the mobile chassis and is used for controlling the movement of the mobile chassis according to the control signal output by the voice recognition control unit;
the voice recognition control unit comprises a voice acquisition module, a voice recognition chip, a voice main control chip and a main body control chip;
the output end of the voice acquisition module is connected with the data input end of the voice recognition chip, and the voice acquisition module is used for acquiring a user voice instruction, converting the acquired user voice instruction into an electric signal and outputting the electric signal to the voice recognition chip;
the output end of the voice recognition chip is connected with the data input end of the voice main control chip, and the voice recognition chip is used for recognizing the voice instruction transmitted by the voice acquisition module and outputting the recognized result to the voice main control chip;
the output end of the voice main control chip is respectively in data connection with the main body control chip and the mobile chassis control unit, the voice main control chip outputs serial port data and response audio according to the recognition result of a voice instruction by the voice recognition chip and respectively outputs the serial port data to the main body control chip and the mobile chassis control unit, the mobile chassis control unit controls the motion of the mobile chassis according to the serial port data output by the voice main control chip, and the signal output end of the main body control chip is also respectively connected with the data receiving ends of the double-arm control unit and the head control unit, so that the double-arm control unit and the head control unit respectively control the motion of the humanoid double arms and the expression of the humanoid head according to the signal output by the main body control chip;
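The claim describes only the signal flow from the voice main control chip, not any encoding. As a minimal illustrative sketch (the command names, byte values, and audio file names below are assumptions for illustration, not taken from the patent), the chip's role can be modeled as a lookup that maps a recognized voice command to the serial payloads for the main body control chip and the mobile chassis control unit, plus a response audio file:

```python
# Hypothetical dispatch table for the voice main control chip.
# All command names and payload bytes are illustrative assumptions.
COMMAND_TABLE = {
    "greet":   {"body": b"\x01", "chassis": b"\x00", "audio": "greet.wav"},
    "follow":  {"body": b"\x00", "chassis": b"\x02", "audio": "follow.wav"},
    "goodbye": {"body": b"\x03", "chassis": b"\x00", "audio": "bye.wav"},
}

def dispatch(recognized: str):
    """Map a recognized command to (body payload, chassis payload, audio file).

    Returns None for unrecognized commands, which are simply ignored.
    """
    entry = COMMAND_TABLE.get(recognized)
    if entry is None:
        return None
    return entry["body"], entry["chassis"], entry["audio"]
```

In a real system the two payloads would be written to separate serial ports (one to the main body control chip, one to the chassis control chip) while the audio file name is handed to the playback path described in claim 2.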
the head control unit comprises a head control chip and a head driving module used for driving the human-simulated head to perform expression action, the head control chip stores set expression action parameters of the human-simulated head, the input end of the head control chip is connected with the data output end of the main body control chip to receive control signals of the main body control chip and call the corresponding expression action parameters of the human-simulated head according to the received control signals of the main body control chip, the output end of the head control chip is connected with the data input end of the head driving module to send control instructions to the head driving module according to the called expression action parameters of the human-simulated head, and the head driving module controls the expression action of the human-simulated head according to the control instructions sent by the head control chip;
the humanoid head comprises eyebrows, eyelids, eyeballs, a mouth, a jaw and a neck; the head driving module comprises a head steering engine driving chip and a plurality of head driving steering engines which are located on the humanoid head and are respectively used for driving the eyebrows, the eyelids, the eyeballs, the mouth, the jaw and the neck to move; the input end of the head steering engine driving chip is connected with the data output end of the head control chip, and each of a plurality of output interfaces of the head steering engine driving chip is connected with one head driving steering engine, so that the head steering engine driving chip can simultaneously send control signals to the plurality of head driving steering engines;
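The claim does not specify how the head control chip turns a stored expression into per-servo commands. One plausible sketch (the expression names, channel numbers, and angles below are invented for illustration) is a table of joint angles that the head control chip fans out as (driver channel, angle) pairs for the head steering engine driving chip:

```python
# Hypothetical per-expression servo angle tables; values are assumptions.
EXPRESSIONS = {
    "smile":    {"brow": 95,  "upper_eyelid": 80,  "eyeball": 90,
                 "mouth": 120, "jaw": 100, "neck": 90},
    "surprise": {"brow": 130, "upper_eyelid": 110, "eyeball": 90,
                 "mouth": 140, "jaw": 130, "neck": 90},
}

# Assumed mapping from head part to driver-chip output channel.
CHANNELS = {"brow": 0, "upper_eyelid": 1, "eyeball": 2,
            "mouth": 3, "jaw": 4, "neck": 5}

def expression_frame(name: str) -> list:
    """Translate a stored expression into sorted (channel, angle) pairs
    ready to be streamed to the steering engine driving chip."""
    params = EXPRESSIONS[name]
    return sorted((CHANNELS[part], angle) for part, angle in params.items())
```

Because one driving chip addresses all head servos through separate output interfaces, a whole expression can be latched in a single frame rather than joint by joint.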
the humanoid head further comprises a head framework and a neck framework; an eye mechanism and a mouth mechanism are arranged on the head framework, and a neck mechanism is arranged on the neck framework; the eye mechanism comprises an eyebrow assembly, an upper eyelid assembly, a lower eyelid assembly and an eyeball assembly; the eyebrow assembly comprises an eyebrow, a first eyebrow driving unit and a second eyebrow driving unit; the eyebrow head of the eyebrow is connected with the first eyebrow driving unit, so that the first eyebrow driving unit can drive the eyebrow to rotate around the eyebrow tip of the eyebrow, and the second eyebrow driving unit is connected with the first eyebrow driving unit, so that the second eyebrow driving unit can drive the eyebrow head of the eyebrow to move vertically through the first eyebrow driving unit; the eyebrow assembly further comprises an eyebrow mounting seat fixedly connected to the head framework, and the eyebrow tip of the eyebrow is rotatably connected to the eyebrow mounting seat so that the eyebrow can rotate around the eyebrow tip; the first eyebrow driving unit comprises a longitudinally arranged first eyebrow driving steering engine and an eyebrow rotating connecting rod, one end of the eyebrow rotating connecting rod is connected with the rotating shaft of the first eyebrow driving steering engine and the other end is connected with the eyebrow head, so that when the first eyebrow driving steering engine rotates, it drives the eyebrow to rotate around the eyebrow tip through the eyebrow rotating connecting rod;
the second eyebrow driving unit comprises a second eyebrow driving steering engine arranged along the axial direction, and the second eyebrow driving steering engine is fixedly connected to the head framework through a steering engine fixing piece; a steering engine outer sleeve is arranged outside the first eyebrow driving steering engine, and a first connecting sleeve and a second connecting sleeve are respectively arranged on the two axial sides of the steering engine outer sleeve; the first connecting sleeve is connected with the rotating shaft of the second eyebrow driving steering engine, a first supporting frame is further arranged on the head framework at a position corresponding to the second connecting sleeve, and the second connecting sleeve is rotatably connected with the first supporting frame, so that the second eyebrow driving steering engine can drive the first eyebrow driving steering engine to rotate through the steering engine outer sleeve;
the lip part comprises an upper lip and a lower lip; the lip driving unit comprises a left upper lip driving steering engine, a right upper lip driving steering engine, a left lower lip driving steering engine and a right lower lip driving steering engine; a left upper lip rudder horn is connected to the rotating shaft of the left upper lip driving steering engine, and the end of the left upper lip rudder horn far away from the left upper lip driving steering engine is connected with the left side of the upper lip, so that when the left upper lip driving steering engine rotates, the left side of the upper lip is driven to move upwards or downwards through the left upper lip rudder horn; a right upper lip rudder horn is connected to the rotating shaft of the right upper lip driving steering engine, and the end of the right upper lip rudder horn far away from the right upper lip driving steering engine is connected with the right side of the upper lip, so that when the right upper lip driving steering engine rotates, the right side of the upper lip is driven to move upwards or downwards through the right upper lip rudder horn; a left lower lip rudder horn and a right lower lip rudder horn are likewise connected to the rotating shafts of the left lower lip driving steering engine and the right lower lip driving steering engine respectively and are connected with the left side and the right side of the lower lip, so that the corresponding side of the lower lip is driven to move upwards or downwards when the corresponding steering engine rotates;
the lip driving unit further comprises a left upper mouth corner driving steering engine, a left lower mouth corner driving steering engine, a right upper mouth corner driving steering engine and a right lower mouth corner driving steering engine; a left upper mouth corner rudder horn is connected to the rotating shaft of the left upper mouth corner driving steering engine, and the end of the left upper mouth corner rudder horn far away from the left upper mouth corner driving steering engine is connected with a left upper mouth corner driving connecting rod; a left lower mouth corner rudder horn is connected to the rotating shaft of the left lower mouth corner driving steering engine, and the end of the left lower mouth corner rudder horn far away from the left lower mouth corner driving steering engine is connected with a left lower mouth corner driving connecting rod; the end of the left upper mouth corner driving connecting rod far away from the left upper mouth corner rudder horn and the end of the left lower mouth corner driving connecting rod far away from the left lower mouth corner rudder horn are rotatably connected with each other and then connected with the left mouth corner; a right upper mouth corner rudder horn is connected to the rotating shaft of the right upper mouth corner driving steering engine, and the end of the right upper mouth corner rudder horn far away from the right upper mouth corner driving steering engine is connected with a right upper mouth corner driving connecting rod; a right lower mouth corner rudder horn is connected to the rotating shaft of the right lower mouth corner driving steering engine, and the end of the right lower mouth corner rudder horn far away from the right lower mouth corner driving steering engine is connected with a right lower mouth corner driving connecting rod; the end of the right upper mouth corner driving connecting rod far away from the right upper mouth corner rudder horn and the end of the right lower mouth corner driving connecting rod far away from the right lower mouth corner rudder horn are rotatably connected with each other and then connected with the right mouth corner;
the upper lip comprises a first upper lip, a second upper lip and a third upper lip, and the lower lip comprises a first lower lip, a second lower lip and a third lower lip; the second upper lip is axially arranged, and its two axial ends are respectively connected with the left upper lip rudder horn and the right upper lip rudder horn; the end of the first upper lip close to the second upper lip is connected to the left upper lip rudder horn, and the end of the third upper lip close to the second upper lip is connected to the right upper lip rudder horn; the second lower lip is parallel to the second upper lip, and its two axial ends are respectively connected with the left lower lip rudder horn and the right lower lip rudder horn; the end of the first lower lip close to the second lower lip is connected to the left lower lip rudder horn, and the end of the third lower lip close to the second lower lip is connected to the right lower lip rudder horn; the end of the first upper lip far away from the second upper lip extends obliquely downwards towards the outer side, the end of the first lower lip far away from the second lower lip extends obliquely upwards towards the outer side, and these two ends are connected with each other to form a left lip corner; the end of the third upper lip far away from the second upper lip extends obliquely downwards towards the outer side, the end of the third lower lip far away from the second lower lip extends obliquely upwards towards the outer side, and these two ends are connected with each other to form a right lip corner;
the upper eyelid assembly and the lower eyelid assembly each comprise a left eyelid, a left eyelid driving unit, a right eyelid and a right eyelid driving unit; the left eyelid is connected with the left eyelid driving unit so that the left eyelid driving unit can drive the left eyelid to close and open, and the right eyelid is connected with the right eyelid driving unit so that the right eyelid driving unit can drive the right eyelid to close and open; the eyeball assembly comprises a left eyeball, a right eyeball, an eyeball left-right driving unit and an eyeball up-down driving unit; the eyeball left-right driving unit is connected with both the left eyeball and the right eyeball so as to drive the left eyeball and the right eyeball to move synchronously leftwards or rightwards, and the eyeball up-down driving unit is connected with both the left eyeball and the right eyeball so as to drive the left eyeball and the right eyeball to move synchronously upwards or downwards.
2. The greeting robot system with emotion interaction function as claimed in claim 1, wherein the voice recognition control unit further includes an audio decoding chip, an audio memory card, a speaker and a display module, an input end of the display module is connected to an output end of the main body control chip for displaying interaction information in real time, an audio file with set content is stored in the audio memory card, an output end of the voice main control chip is connected to an input end of the audio memory card for reading the audio file stored in the audio memory card, an input end of the audio decoding chip is connected to an output end of the audio memory card for decoding the read audio file and converting the decoded audio file into an electrical signal for outputting, and an output end of the audio decoding chip is connected to the speaker for outputting the read audio file in sound through the speaker.
3. The greeting robot system with emotion interaction function as claimed in claim 1, wherein the dual-arm control unit includes a dual-arm control chip and a dual-arm driving module for driving the humanoid dual-arm, the dual-arm control chip stores set humanoid dual-arm motion parameters, an input end of the dual-arm control chip is connected to a data output end of the main body control chip to receive the control signal of the main body control chip and call the corresponding stored humanoid dual-arm motion parameters according to the received control signal of the main body control chip, an output end of the dual-arm control chip is connected to a data input end of the dual-arm driving module to send control commands to the dual-arm driving module according to the called humanoid dual-arm motion parameters, and the dual-arm driving module controls the humanoid dual-arm motion according to the control commands sent by the dual-arm control chip.
4. The welcome robot system according to claim 3, wherein the humanoid dual arms comprise two humanoid arms respectively distributed at two sides of the humanoid main body, the dual-arm driving module comprises an arm steering engine driving chip located on each humanoid arm and an arm driving steering engine located at a plurality of joints of the humanoid arm, an input end of the arm steering engine driving chip is connected with a data output end of the dual-arm control chip, and a plurality of output interfaces of the arm steering engine driving chip are respectively connected with one arm driving steering engine, so that the arm steering engine driving chip can simultaneously send control signals to the plurality of arm driving steering engines.
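The claims leave open how the stored dual-arm motion parameters become smooth joint trajectories. A common approach, sketched here under the assumption that each arm pose is a list of joint angles in degrees (the patent does not state this representation), is to linearly interpolate between the current pose and the target pose before streaming frames to the arm steering engine driving chip:

```python
def interpolate_pose(current, target, steps):
    """Generate intermediate joint-angle frames from `current` to `target`.

    `current` and `target` are equal-length lists of joint angles in
    degrees (an assumed representation); `steps` frames are returned,
    the last of which equals the target pose.
    """
    frames = []
    for i in range(1, steps + 1):
        t = i / steps  # interpolation fraction in (0, 1]
        frames.append([c + (g - c) * t for c, g in zip(current, target)])
    return frames
```

Streaming such frames at a fixed rate to each joint's driving steering engine produces a motion that ramps smoothly instead of snapping to the target angles.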
5. The welcome robot system with emotion interaction function according to claim 1, wherein the mobile chassis control unit comprises a chassis control chip and a chassis driving module for driving the mobile chassis to move, an input end of the chassis control chip is connected with a data output end of the voice main control chip to receive serial port data output by the voice main control chip, an output end of the chassis control chip is connected with a data receiving end of the chassis driving module to send a control instruction to the chassis driving module according to the serial port data output by the voice main control chip, and the chassis driving module controls the mobile chassis to move according to the received control instruction.
6. The welcome robot system according to claim 5, wherein the chassis driving module comprises a motor driving chip and four moving motors distributed around the mobile chassis; an input end of the motor driving chip is connected with a data output end of the chassis control chip, and output ends of the motor driving chip are electrically connected with the four moving motors respectively, so that the motor driving chip can send control commands to the four moving motors simultaneously; each moving motor is further connected with a rotating wheel and a speed measuring piece for collecting the rotating speed signal of that moving motor, and an output end of each speed measuring piece is connected with the chassis control chip so as to output the collected rotating speed signals to the chassis control chip.
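Feeding each motor's measured speed back to the chassis control chip implies closed-loop speed control, though the claim names no control law. A minimal PI speed loop is one conventional choice; the gains, the RPM units, and the normalized PWM clamp below are illustrative assumptions:

```python
class SpeedLoop:
    """Minimal PI speed controller: the measured wheel speed from the
    speed measuring piece feeds back into the duty cycle the motor
    driving chip outputs. Gains and limits are illustrative, not from
    the patent."""

    def __init__(self, kp=0.5, ki=0.1):
        self.kp, self.ki = kp, ki
        self.integral = 0.0

    def update(self, target_rpm, measured_rpm, dt=0.01):
        """One control tick: return a duty cycle clamped to [-1, 1]."""
        error = target_rpm - measured_rpm
        self.integral += error * dt
        duty = self.kp * error + self.ki * self.integral
        return max(-1.0, min(1.0, duty))
```

Running one such loop per wheel lets the chassis hold a commanded speed even as load changes, which pure open-loop PWM cannot guarantee.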
7. The welcome robot system with emotion interaction function as claimed in claim 6, wherein the mobile chassis control unit further comprises a following obstacle avoidance module; the following obstacle avoidance module comprises a human body infrared sensor and a plurality of ultrasonic sensors, and the ultrasonic sensors and the human body infrared sensor are all connected with the chassis control chip; the plurality of ultrasonic sensors are respectively located on the front side, the rear side, the left side and the right side of the position above the humanoid main body; the human body infrared sensor and the ultrasonic sensor located on the front side are used for detecting human body position information and transmitting the detected human body position information to the chassis control chip, and the chassis control chip controls the chassis driving module to drive the mobile chassis to follow the human body according to the received human body position information; the ultrasonic sensors located on the rear side, the left side and the right side are used for detecting obstacles on the corresponding lateral moving paths and transmitting the detected obstacle position information to the chassis control chip, and the chassis control chip controls the chassis driving module to drive the mobile chassis to move so as to avoid the obstacles according to the received obstacle position information.
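The follow/avoid behavior can be summarized as a small priority of rules over the sensor readings. The distance thresholds and action names below are assumptions for illustration; the patent states only that the chassis follows a detected person using the infrared and front ultrasonic sensors and steers away from obstacles detected laterally (the rear sensor, which would gate reversing in the same way, is omitted here for brevity):

```python
def chassis_command(ir_detects_person, front_cm, left_cm, right_cm,
                    follow_cm=80, stop_cm=30):
    """Decide a chassis motion from sensor readings.

    Thresholds (follow at >80 cm, stop inside 30 cm) and the returned
    action names are hypothetical, chosen only to illustrate the
    rule priority described in the claim.
    """
    if ir_detects_person and front_cm > follow_cm:
        return "forward"      # person detected ahead: keep following
    if front_cm <= stop_cm:
        return "stop"         # too close to the person or an obstacle
    if left_cm <= stop_cm:
        return "veer_right"   # lateral obstacle on the left
    if right_cm <= stop_cm:
        return "veer_left"    # lateral obstacle on the right
    return "hold"             # nothing to follow, nothing to avoid
```

The chassis control chip would evaluate such rules each control tick and translate the chosen action into per-wheel speed targets for the speed loops of claim 6.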
CN202110713166.8A 2021-06-25 2021-06-25 Welcome robot system with emotion interaction function Active CN113319869B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110713166.8A CN113319869B (en) 2021-06-25 2021-06-25 Welcome robot system with emotion interaction function

Publications (2)

Publication Number Publication Date
CN113319869A CN113319869A (en) 2021-08-31
CN113319869B true CN113319869B (en) 2023-04-07

Family

ID=77424741

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110713166.8A Active CN113319869B (en) 2021-06-25 2021-06-25 Welcome robot system with emotion interaction function

Country Status (1)

Country Link
CN (1) CN113319869B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115122350B (en) * 2022-07-05 2024-04-09 广东工业大学 Bionic expression robot head device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104536374A (en) * 2014-11-19 2015-04-22 曾洪鑫 Robot mouth shape control mechanism and system
JP2017047494A (en) * 2015-09-01 2017-03-09 株式会社国際電気通信基礎技術研究所 Android robot control system, device, program and method
CN110103234A (en) * 2019-04-30 2019-08-09 广东工业大学 A kind of robot with humanoid facial expression
CN112265007A (en) * 2020-10-27 2021-01-26 广东富利盛仿生机器人股份有限公司 Head of robot with human-face-expression imitation

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9277021B2 (en) * 2009-08-21 2016-03-01 Avaya Inc. Sending a user associated telecommunication address
CN202569495U (en) * 2012-04-13 2012-12-05 谷逍驰 Machine head capable of controlling expression demonstration through speech
CN103413113A (en) * 2013-01-15 2013-11-27 上海大学 Intelligent emotional interaction method for service robot
CN105234945A (en) * 2015-09-29 2016-01-13 塔米智能科技(北京)有限公司 Welcome robot based on network voice dialog and somatosensory interaction
CN106078762B (en) * 2016-07-23 2018-11-06 芜湖哈特机器人产业技术研究院有限公司 Ten neck mechanisms of six-degree-of-freedom humanoid robot
CN106078746A (en) * 2016-08-08 2016-11-09 许志林 A kind of robot control system
CN106393127A (en) * 2016-08-29 2017-02-15 昆山塔米机器人有限公司 Robot capable of simulating human facial expressions
CN106695849A (en) * 2017-02-17 2017-05-24 宁波韦尔德斯凯勒智能科技有限公司 Sensing, controlling and performing system for greeting security service robot
CN106695839A (en) * 2017-03-02 2017-05-24 青岛中公联信息科技有限公司 Bionic intelligent robot for toddler education
CN107520849A (en) * 2017-07-25 2017-12-29 北京联合大学 A kind of SCM Based Voice command robot expression display device
CN107378971A (en) * 2017-09-08 2017-11-24 南京阿凡达机器人科技有限公司 A kind of Study of Intelligent Robot Control system
TWI658377B (en) * 2018-02-08 2019-05-01 佳綸生技股份有限公司 Robot assisted interaction system and method thereof
CN210025312U (en) * 2019-04-30 2020-02-07 广东工业大学 Humanoid facial expression robot
CN210500284U (en) * 2019-05-13 2020-05-12 广西电网有限责任公司南宁供电局 Multifunctional welcome robot for electric power service
CN112936308A (en) * 2021-03-03 2021-06-11 常州龙源智能机器人科技有限公司 Head structure of humanoid expression robot and robot

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Fangchao Hu. A combined clustering and image mapping based point cloud segmentation for 3D object detection. 2018 Chinese Control And Decision Conference (CCDC). 2018, full text. *
Yu Junwei. Structural design and finite element analysis of a cable-driven upper-limb rehabilitation robot. Journal of Mechanical Transmission. 2018, full text. *
Yang Xiaolong. A survey of facial expression recognition. Digital Technology & Application. 2018, full text. *

Similar Documents

Publication Publication Date Title
CN101474481B (en) Emotional robot system
CN102699914B (en) Robot
CN101570019B (en) Robot with humanoid facial expression
CN101458778B (en) Control method of artificial head robot
CN112265007B (en) Head of robot with human-face-expression imitation
CN102566474A (en) Interaction system and method for robot with humanoid facial expressions, and face detection and tracking method
CN113319869B (en) Welcome robot system with emotion interaction function
CN205721625U (en) A kind of expression robot interactive system
CN112936308A (en) Head structure of humanoid expression robot and robot
CN109822590A (en) A kind of robot eyes telecontrol equipment and control method
CN215471165U (en) Head structure of expression robot
CN115533932A (en) Mechanical structure of robot with multiple motion control points and expressions
CN206883648U (en) Robot
CN205750354U (en) A kind of expression robot
CN207578420U (en) Highly emulated robot head construction
CN115122350B (en) Bionic expression robot head device
CN202607678U (en) Eyeball movement mechanism for bionic-robot
CN206066468U (en) A kind of anthropomorphic robot control device
CN212241060U (en) Deformation dancing robot imitating orchid mantis
CN211030007U (en) Robot facial expression control mechanism and robot with same
CN210025294U (en) Mouth mechanism driven by flexible cable
CN211806165U (en) Robot expression simulation mechanism
CN213674145U (en) Head of robot with human-face-expression imitation
CN212794971U (en) Robot eye mechanism
CN111531553A (en) Humanoid robot head based on connecting rod smooth expression control mechanism

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant