CN217728742U - Intelligent interactive robot - Google Patents

Intelligent interactive robot

Info

Publication number
CN217728742U
Authority
CN
China
Prior art keywords
main controller
module
base
driving motor
interactive robot
Prior art date
Legal status
Active
Application number
CN202220974973.5U
Other languages
Chinese (zh)
Inventor
李银
苗云龙
位瑞英
卓俊鸿
张映辉
王娟
Current Assignee
Shaoguan University
Original Assignee
Shaoguan University
Priority date
Filing date
Publication date
Application filed by Shaoguan University
Priority to CN202220974973.5U
Application granted
Publication of CN217728742U


Landscapes

  • Toys (AREA)

Abstract

The intelligent interactive robot of the utility model comprises a motion unit and a voice interaction unit, wherein the voice interaction unit is arranged on the motion unit. The motion unit comprises a base, a main controller, a driving motor and a driving wheel, wherein the main controller and the driving motor are arranged on the base; the main controller is used for controlling the rotation of the driving motor. The voice interaction unit comprises an ESP32 module, a microphone and a loudspeaker which are arranged on the base, and the microphone and the loudspeaker are electrically connected with the ESP32 module; the microphone is used for receiving the voice uttered by a user and transmitting it to the ESP32 module, the ESP32 module is used for converting the received voice into a voice reply signal and transmitting it to the loudspeaker, and the loudspeaker is used for converting the voice reply signal into sound. Compared with the prior art, the intelligent interactive robot of the utility model includes a motion unit, by means of which it can move, making it more convenient to use.

Description

Intelligent interactive robot
Technical Field
The utility model relates to the field of robots, and in particular to an intelligent interactive robot.
Background
With the development of science and technology, robots are becoming increasingly common. However, many intelligent interactive robots currently on the market can only carry out voice communication and cannot move by themselves; their functions are limited and cannot meet diverse interaction requirements.
SUMMARY OF THE UTILITY MODEL
Based on this, the utility model aims to provide an intelligent interactive robot that can move by itself.
The intelligent interactive robot comprises a motion unit and a voice interaction unit, wherein the voice interaction unit is arranged on the motion unit; the motion unit comprises a base, a main controller, a driving motor and a driving wheel, wherein the main controller and the driving motor are arranged on the base, the driving wheel is sleeved on a shaft of the driving motor, and the driving motor is electrically connected with the main controller; the main controller is used for controlling the rotation of the driving motor; the voice interaction unit comprises an ESP32 module, a microphone and a loudspeaker which are arranged on the base, and the microphone and the loudspeaker are electrically connected with the ESP32 module; the microphone is used for receiving the voice uttered by a user and transmitting it to the ESP32 module, the ESP32 module is used for converting the received voice into a voice reply signal and transmitting it to the loudspeaker, and the loudspeaker is used for converting the voice reply signal into sound.
Compared with the prior art, the intelligent interactive robot of the utility model includes a motion unit, by means of which it can move, making it more convenient to use.
Furthermore, the number of the driving motors is two, and the two driving motors are respectively arranged on two sides of the base; the number of the driving wheels is two, and the two driving wheels are respectively arranged on the shafts of the two driving motors.
Further, the motion unit also comprises a bull's eye wheel which is arranged below the front side of the base.
Further, the ESP32 module is electrically connected to the main controller, and the ESP32 module is configured to send a control command to the main controller, and the main controller controls the rotation of the driving motor according to the received control command.
Further, the main controller is an STM32 module.
Furthermore, the motion unit also comprises an infrared tracking module and a guide pattern, the infrared tracking module is arranged at the bottom of the base, the guide pattern is arranged on the ground, and the infrared tracking module is electrically connected with the main controller; the infrared tracking module is used for detecting the guide pattern arranged on the ground and transmitting a pattern detection signal to the main controller, and the main controller controls the driving motor to rotate according to the received pattern detection signal.
Furthermore, the motion unit also comprises an ultrasonic module, the ultrasonic module is arranged above the base and is electrically connected with the main controller; the ultrasonic module is used for detecting the distance of surrounding objects and transmitting a distance detection signal to the main controller, and the main controller controls the driving motor to rotate according to the received distance detection signal.
Further, the motion unit also comprises a steering engine (a servo); the steering engine is arranged on the base, the ultrasonic module is connected with the shaft of the steering engine, and the steering engine is electrically connected with the main controller; the main controller is used for controlling the steering engine to rotate, thereby adjusting the detection direction of the ultrasonic module.
Further, the intelligent interactive robot also comprises a character recognition unit, wherein the character recognition unit comprises an OpenMV module and a display which are arranged on the base, and the OpenMV module and the display are respectively electrically connected with the main controller; the OpenMV module is used for acquiring images, recognizing characters in the images and transmitting the character recognition results to the main controller, and the main controller is used for controlling the display to display the character recognition results.
Further, the display is a dot matrix display module.
Drawings
Fig. 1 is a schematic diagram of the circuit structure of the intelligent interactive robot of the present utility model;
Fig. 2 is an external structural view of the intelligent interactive robot of the present utility model;
Fig. 3 is an exploded view of the intelligent interactive robot of the present utility model.
In the figure: 10. a motion unit; 11. a base; 12. a main controller; 13. a drive motor; 14. a drive wheel; 15. a bull's eye wheel; 16. an infrared tracking module; 18. an ultrasonic module; 19. a steering engine; 20. a voice interaction unit; 21. an ESP32 module; 22. a microphone; 23. a speaker; 30. a character recognition unit; 31. an OpenMV module; 32. a display; 40. a battery.
Detailed Description
The following are specific embodiments of the present utility model, and the technical solutions of the present utility model will be further described with reference to the accompanying drawings, but the present utility model is not limited to these embodiments.
In the description of the present utility model, it is to be understood that the terms "center", "longitudinal", "lateral", "up", "down", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience and simplicity of description; they do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore are not to be construed as limiting the present utility model.
It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present.
As shown in fig. 1-3, the intelligent interactive robot of the present invention includes a motion unit 10 and a voice interaction unit 20, wherein the voice interaction unit 20 is disposed on the motion unit 10 and can move along with the motion unit 10.
The motion unit 10 comprises a base 11, a main controller 12, a driving motor 13 and a driving wheel 14; the main controller 12 and the driving motor 13 are arranged on the base 11, the driving wheel 14 is sleeved on the shaft of the driving motor 13, and the driving motor 13 is electrically connected with the main controller 12. The main controller 12 is used to control the rotation of the driving motor 13 so that the base 11 can move. In one particular embodiment, the main controller 12 is an STM32 module.
The voice interaction unit 20 comprises an ESP32 module 21, a microphone 22 and a speaker 23 which are arranged on the base 11; the microphone 22 and the speaker 23 are respectively electrically connected with the ESP32 module 21. The microphone 22 is used for receiving the voice uttered by the user, converting it into a voice signal and transmitting the voice signal to the ESP32 module 21; the ESP32 module 21 is used for converting the received voice signal into a voice reply signal and transmitting the voice reply signal to the speaker 23; and the speaker 23 is used for converting the voice reply signal into sound, so as to realize a voice conversation between the user and the intelligent interactive robot.
In order to enable the motion unit 10 to steer, in one embodiment two driving motors 13 are provided, arranged respectively on the two sides of the base 11; the number of driving wheels 14 is likewise two, each sleeved on the shaft of one of the driving motors 13. The main controller 12 controls the two driving motors 13 to rotate at different speeds or in different directions, so that steering is realized, as sketched below.
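By way of illustration only (the following sketch is not part of the utility model; the function name, speed range and clamping rule are editorial assumptions), the differential steering described above can be expressed as deriving a left and a right wheel speed from a desired forward speed and turn rate:

```python
# Illustrative only: differential steering for the two driving motors.
def differential_drive(forward: float, turn: float, max_speed: float = 1.0):
    """Return (left, right) wheel speeds, each within [-max_speed, max_speed].

    forward: desired forward speed, -1.0 .. 1.0
    turn:    desired turn rate, -1.0 (full left) .. 1.0 (full right)
    """
    left = forward + turn
    right = forward - turn
    # Scale both wheels down together if either exceeds the allowed range,
    # so the ratio between them (and hence the turning radius) is preserved.
    scale = max(abs(left), abs(right), max_speed) / max_speed
    return left / scale, right / scale


if __name__ == "__main__":
    print(differential_drive(0.8, 0.4))  # veer right: left wheel faster than right
    print(differential_drive(0.0, 1.0))  # opposite directions: spin in place
```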
In order to make the movement of the motion unit 10 smoother, in a preferred embodiment the motion unit 10 further includes a bull's-eye wheel 15, which is arranged below the front side of the base 11; in other embodiments, a universal wheel may be used instead of the bull's-eye wheel 15.
In order to make the motion unit 10 move according to the user's voice commands, in a preferred embodiment the ESP32 module 21 is electrically connected to the main controller 12; the ESP32 module 21 is used for converting the voice signal into a control command and sending the control command to the main controller 12, and the main controller 12 controls the rotation of the driving motor 13 according to the received control command, so that voice control is realized.
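The utility model does not specify how the ESP32 module 21 encodes control commands for the main controller 12. As a purely illustrative sketch, one simple possibility is a single-byte command per recognized keyword, sent over a serial link such as a UART; the keyword table and command bytes below are hypothetical:

```python
# Illustrative only: mapping a recognized voice keyword to a one-byte drive
# command. The keyword table, command bytes and serial transport are not
# defined by the utility model.
VOICE_COMMANDS = {
    "forward": b"F",
    "back": b"B",
    "left": b"L",
    "right": b"R",
    "stop": b"S",
}


def voice_to_command(recognized_text: str) -> bytes:
    """Return the drive command for the first matching keyword, else stop."""
    text = recognized_text.lower()
    for keyword, command in VOICE_COMMANDS.items():
        if keyword in text:
            return command
    return b"S"


# The ESP32 module would write the returned byte to the serial link connected
# to the main controller, which switches on it to drive the motors.
print(voice_to_command("please drive forward"))  # -> b'F'
```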
In order to enable the motion unit 10 to move along a set path, in a preferred embodiment the motion unit 10 further includes an infrared tracking module 16 and a guide pattern. The infrared tracking module 16 is disposed at the bottom of the base 11, the guide pattern is laid on the ground, and the infrared tracking module 16 is electrically connected to the main controller 12. The infrared tracking module 16 is used for detecting the guide pattern on the ground and transmitting a pattern detection signal to the main controller 12, and the main controller 12 controls the rotation of the driving motor 13 according to the received pattern detection signal. Specifically, the infrared tracking module 16 includes an infrared transmitter and an infrared receiver, and the guide pattern includes a black line following the set path. Infrared light emitted by the transmitter irradiates the ground; the receiver receives the reflected light (the black line reflects far less infrared light than the surrounding ground) and converts its intensity into the pattern detection signal transmitted to the main controller 12. The main controller 12 determines the degree of deviation of the motion unit 10 from the intensity change of the received pattern detection signal and controls the rotation of the driving motor 13 accordingly to correct the position deviation, so that the motion unit 10 follows the black line of the guide pattern.
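As a purely illustrative sketch of the deviation-correction idea described above (the sensor count, normalisation and sign convention are assumptions, not taken from the utility model), a two-sensor line-following rule can be written as:

```python
# Illustrative only: a two-sensor line-following correction. Readings are
# assumed to be reflectance values normalised to 0..1, where a low value
# means the sensor is over the dark guide line.
def line_follow_correction(left_reading: float, right_reading: float) -> float:
    """Return a turn correction in -1.0 .. 1.0 (positive = steer right)."""
    # If the left sensor sees the line (low reading), the robot has drifted
    # to the right of the line, so steer left (negative), and vice versa.
    deviation = left_reading - right_reading
    return max(-1.0, min(1.0, deviation))


# Combined with the differential-drive sketch above:
#   left_speed, right_speed = differential_drive(0.5, line_follow_correction(l, r))
print(line_follow_correction(0.2, 0.9))  # -> -0.7 (steer left, back onto the line)
```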
The motion unit 10 may encounter obstacles while moving, and a collision with an obstacle can easily damage its components and affect the normal operation of the intelligent interactive robot. In a preferred embodiment, the motion unit 10 therefore further comprises an ultrasonic module 18, which is disposed above the base 11 and electrically connected to the main controller 12. The ultrasonic module 18 is used for detecting the distance to surrounding objects and transmitting a distance detection signal to the main controller 12, and the main controller 12 controls the rotation of the driving motor 13 according to the received distance detection signal, so that the motion unit 10 can avoid the obstacle. Specifically, the ultrasonic module 18 includes an ultrasonic transmitter and an ultrasonic receiver: the transmitter emits ultrasonic waves toward the object to be detected, the waves are reflected back to the receiver, and the main controller 12 calculates the distance to the object from the time difference between transmission and reception of the ultrasonic waves.
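The time-of-flight calculation described above can be illustrated as follows; the speed of sound and the stop threshold are assumed values, not taken from the utility model:

```python
# Illustrative only: the time-of-flight distance calculation.
SPEED_OF_SOUND_M_PER_S = 343.0  # dry air at roughly 20 degrees Celsius


def echo_time_to_distance(echo_time_s: float) -> float:
    """One-way distance in metres from the measured round-trip echo time.

    The pulse travels to the object and back, hence the division by two.
    """
    return SPEED_OF_SOUND_M_PER_S * echo_time_s / 2.0


def obstacle_ahead(echo_time_s: float, threshold_m: float = 0.30) -> bool:
    """True when the detected object is closer than the stop threshold."""
    return echo_time_to_distance(echo_time_s) < threshold_m


print(echo_time_to_distance(0.002))  # a 2 ms round trip is about 0.343 m
print(obstacle_ahead(0.001))         # about 0.17 m away -> True, stop or turn
```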
In order to increase the detection range of the ultrasonic module 18, in a preferred embodiment the motion unit 10 further comprises a steering engine 19; the steering engine 19 is arranged on the base 11, the ultrasonic module 18 is connected to the shaft of the steering engine 19, and the steering engine 19 is electrically connected with the main controller 12. The main controller 12 is used for controlling the rotation of the steering engine 19, thereby adjusting the detection direction of the ultrasonic module 18. In other embodiments, the motion unit 10 may instead include a stepping motor disposed on the base 11; the ultrasonic module 18 is connected to the shaft of the stepping motor, the stepping motor is electrically connected to the main controller 12, and the main controller 12 controls the rotation of the stepping motor, thereby adjusting the detection direction of the ultrasonic module 18.
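As a purely illustrative sketch of sweeping the ultrasonic module 18 with the steering engine 19 to widen the detection range, the following code points the sensor at a set of angles and returns the clearest direction; the angle set and the read_distance_at() callable are hypothetical stand-ins for the actual servo and sensor drivers:

```python
# Illustrative only: sweep the ultrasonic sensor across several servo angles
# and return the direction with the most free space.
def scan_for_clearest_direction(read_distance_at, angles_deg=(-60, -30, 0, 30, 60)):
    """Measure the distance at each angle; return (best_angle, distance)."""
    readings = {angle: read_distance_at(angle) for angle in angles_deg}
    best_angle = max(readings, key=readings.get)
    return best_angle, readings[best_angle]


# Example with a fake sensor: an obstacle straight ahead, open space to the right.
fake_scene = {-60: 1.2, -30: 1.0, 0: 0.2, 30: 1.5, 60: 2.0}
print(scan_for_clearest_direction(lambda angle: fake_scene[angle]))  # -> (60, 2.0)
```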
In order to further enrich the modes of interaction and realize character recognition and display functions, in a preferred embodiment the intelligent interactive robot further includes a character recognition unit 30. The character recognition unit 30 comprises an OpenMV module 31 and a display 32 disposed on the base 11, and the OpenMV module 31 and the display 32 are respectively electrically connected to the main controller 12. The OpenMV module 31 is used for acquiring an image, recognizing the characters in the image and transmitting the character recognition result to the main controller 12, and the main controller 12 is used for controlling the display 32 to display the character recognition result. In one particular embodiment, the display 32 is a dot matrix display module.
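The utility model does not specify how the OpenMV module 31 passes the character recognition result to the main controller 12. As a purely illustrative sketch, a simple length-prefixed frame over a serial link could be used; the header byte and framing below are hypothetical:

```python
# Illustrative only: a length-prefixed frame for passing a recognised text
# string from the vision module to the main controller over a serial link.
def frame_text_result(text: str) -> bytes:
    """Wrap a recognised string as [0xAA][length][UTF-8 payload]."""
    payload = text.encode("utf-8")[:255]  # one-byte length field
    return bytes([0xAA, len(payload)]) + payload


def parse_text_result(frame: bytes) -> str:
    """Inverse of frame_text_result(), as the receiving side might run it."""
    if frame[0] != 0xAA:
        raise ValueError("unexpected frame header")
    length = frame[1]
    return frame[2:2 + length].decode("utf-8")


print(parse_text_result(frame_text_result("HELLO")))  # -> HELLO
```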
In order to free the robot from the constraint of a power supply cable so that it can move over a wider range, in a preferred embodiment the utility model further includes a battery 40. The battery 40 supplies power to the main controller 12, the driving motor 13, the infrared tracking module 16, the ultrasonic module 18, the steering engine 19, the ESP32 module 21, the microphone 22, the speaker 23, the OpenMV module 31 and the display 32.
Compared with the prior art, the beneficial effects of the utility model are as follows:
1. it includes a motion unit and can therefore move by itself, making it more convenient to use;
2. an ultrasonic module is provided to avoid collisions with obstacles;
3. a character recognition unit is further included, which adds a character recognition and display mode of interaction and improves the user's interactive experience.
The character recognition, voice interaction, voice control, infrared tracking and ultrasonic distance measurement methods involved in the above embodiments are all prior art; the utility model does not relate to improvements of these methods.
The above examples represent only some embodiments of the present utility model; although they are described in specific detail, they are not to be construed as limiting its scope. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the present utility model, and these all fall within its scope of protection.

Claims (10)

1. An intelligent interactive robot, which is characterized in that:
the device comprises a motion unit (10) and a voice interaction unit (20), wherein the voice interaction unit (20) is arranged on the motion unit (10);
the motion unit (10) comprises a base (11), a main controller (12), a driving motor (13) and a driving wheel (14), wherein the main controller (12) and the driving motor (13) are arranged on the base (11), the driving wheel (14) is sleeved on a shaft of the driving motor (13), and the driving motor (13) is electrically connected with the main controller (12); the main controller (12) is used for controlling the rotation of the driving motor (13);
the voice interaction unit (20) comprises an ESP32 module (21), a microphone (22) and a loudspeaker (23) which are arranged on the base (11), and the microphone (22) and the loudspeaker (23) are electrically connected with the ESP32 module (21); the microphone (22) is used for receiving voice emitted by a user and transmitting the voice to the ESP32 module (21), the ESP32 module (21) is used for transmitting a voice reply signal to the loudspeaker (23), and the loudspeaker (23) is used for converting the voice reply signal into sound.
2. The intelligent interactive robot of claim 1, wherein:
the two driving motors (13) are respectively arranged on two sides of the base (11); the number of the driving wheels (14) is two, and the two driving wheels are respectively arranged on the shafts of the two driving motors (13).
3. The intelligent interactive robot of claim 2, wherein:
the movement unit (10) further comprises a bull's-eye wheel (15), and the bull's-eye wheel (15) is arranged below the front side of the base (11).
4. The intelligent interactive robot of claim 1, wherein:
the ESP32 module (21) is electrically connected with the main controller (12), the ESP32 module (21) is used for sending a control command to the main controller (12), and the main controller (12) controls the rotation of the driving motor (13) according to the received control command.
5. The intelligent interactive robot of claim 1, wherein:
the main controller (12) is an STM32 module.
6. The intelligent interactive robot of claim 1, wherein:
the motion unit (10) further comprises an infrared tracing module (16) and a guide pattern, the infrared tracing module (16) is arranged at the bottom of the base (11), the guide pattern is arranged on the ground, and the infrared tracing module (16) is electrically connected with the main controller (12); the infrared tracing module (16) is used for detecting the guide pattern arranged on the ground and transmitting a pattern detection signal to the main controller (12), and the main controller (12) controls the rotation of the driving motor (13) according to the received pattern detection signal.
7. The intelligent interactive robot of claim 1, wherein:
the motion unit (10) further comprises an ultrasonic module (18), the ultrasonic module (18) is arranged above the base (11), and the ultrasonic module (18) is electrically connected with the main controller (12); the ultrasonic module (18) is used for detecting the distance of surrounding objects and transmitting distance detection signals to the main controller (12), and the main controller (12) controls the driving motor (13) to rotate according to the received distance detection signals.
8. The intelligent interactive robot of claim 7, wherein:
the motion unit (10) further comprises a steering engine (19), the steering engine (19) is arranged on the base (11), the ultrasonic module (18) is connected with a shaft of the steering engine (19), and the steering engine (19) is electrically connected with the main controller (12); the main controller (12) is used for controlling the steering engine (19) to rotate, so that the detection direction of the ultrasonic module (18) is adjusted.
9. The intelligent interactive robot of claim 1, wherein:
the device is characterized by further comprising a character recognition unit (30), wherein the character recognition unit (30) comprises an OpenMV module (31) and a display (32) which are arranged on the base (11), and the OpenMV module (31) and the display (32) are respectively and electrically connected with the main controller (12); the OpenMV module (31) is used for acquiring images, identifying characters in the images and transmitting character identification results to the main controller (12), and the main controller (12) is used for controlling the display (32) to display the character identification results.
10. The intelligent interactive robot of claim 9, wherein:
the display (32) is a dot matrix display module.
CN202220974973.5U 2022-04-25 2022-04-25 Intelligent interactive robot Active CN217728742U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202220974973.5U 2022-04-25 2022-04-25 Intelligent interactive robot

Publications (1)

Publication Number Publication Date
CN217728742U 2022-11-04

Family

ID=83818471

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202220974973.5U Intelligent interactive robot 2022-04-25 2022-04-25

Country Status (1)

Country Link
CN (1) CN217728742U (en)

Similar Documents

Publication Publication Date Title
US8272466B2 (en) Robot
US20180081367A1 (en) Method and system for automatically charging robot
CN106338999A (en) Intelligent following anti-collision dolly and anti-collision method thereof
CN106406316A (en) Autonomous charging system and charging method thereof for intelligent home accompanying robot
JP4024683B2 (en) Communication robot
US10986459B2 (en) Sound production device, display system, and sound production method
CN106003069A (en) Robot
CN109514566A (en) A kind of intelligent monitoring machine people based on raspberry pie
CN101134147B (en) Intelligent type telecontrol system
CN205507543U (en) Intelligent vehicle control system
CN211529000U (en) Unmanned trolley based on laser radar and camera
CN217728742U (en) Intelligent interactive robot
JP4677593B2 (en) Communication robot
CN208903133U (en) A kind of embedded guide vehicle based on panoramic vision and speech recognition
CN210072416U (en) Intelligent following movement device
CN206115278U (en) Family's intelligence accompany and attend to robot from main charge system
CN215219970U (en) Machine vision omnidirectional movement training platform
CN212623752U (en) Trolley obstacle avoidance device based on 51 single chip microcomputer
CN208993511U (en) The positioning device that charges and system
CN109444867B (en) Underwater positioning and communication system and method
CN212341739U (en) Automatic following system for balance car and balance car
CN212460379U (en) Electric control system of automatic pilot ship
CN211761560U (en) Multifunctional intelligent robot
CN107632287B (en) Sound positioning method
CN207008406U (en) Trackless navigating robot control device

Legal Events

Date Code Title Description
GR01 Patent grant