CN203300127U - Children teaching and monitoring robot - Google Patents

Children teaching and monitoring robot

Info

Publication number
CN203300127U
Authority
CN
China
Prior art keywords
module
robot
host computer
servo
digital signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2013203578438U
Other languages
Chinese (zh)
Inventor
杨鸿武
王海燕
刘平和
甘振业
裴东
王全洲
徐世鹏
王鑫
陈丽君
赵学深
谭等泰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwest Normal University
Original Assignee
Northwest Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwest Normal University filed Critical Northwest Normal University
Priority to CN2013203578438U priority Critical patent/CN203300127U/en
Application granted granted Critical
Publication of CN203300127U publication Critical patent/CN203300127U/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • Toys (AREA)

Abstract

The utility model provides a children teaching and monitoring robot, comprising a voice processing module, a host computer, a serial communication module, a digital signal processor and a lower computer motion control module which are connected in sequence. The host computer is further connected with a touch screen display module and a human target tracking module; the digital signal processor is further connected with a sensor module, a GSM module and a mechanical arm module. The robot is mainly used for teaching, accompanying, entertaining and monitoring children, serving as a private tutor and housekeeper. It fuses the image depth and color information captured by a Kinect to detect and track a target, controls the mechanical arm by driving its servos, and uses a smoke sensor and a gas sensor to monitor the safety of the indoor environment; when an abnormality occurs, a short message is sent to the owner through the GSM module.

Description

Children teaching and monitoring robot
Technical field
The utility model belongs to the field of intelligent robotics and relates to a robot designed for children. It can serve as a private tutor providing teaching, entertainment and companionship; it can also be used in kindergartens as an educational robot that interacts with children, enriching the learning atmosphere and letting children gain knowledge in a relaxed environment. More specifically, the utility model relates to a children teaching and monitoring robot.
Background technology
In recent years, with the rapid development of computers, microelectronics and information technology, robotics has matured and robots have become increasingly intelligent, finding wide application in fields such as ocean development, space exploration, industrial and agricultural production, military affairs, community service and entertainment. Robots are developing toward intelligence and diversity, and intelligent robots are coming ever closer to our daily lives.
The smart home robot has become an important direction of robot development and is currently a research focus both at home and abroad. With the development of the smart home, people's expectations of intelligent robots keep rising. Carrying out voice dialogue and voice interaction with a robot is a key feature of an intelligent system. Adding speech technology to a robot creates a vivid and interesting interactive atmosphere: the robot can perform corresponding actions, be controlled by voice, take spoken text input, or hold a simple dialogue. Images and text are the most direct ways for people to obtain information, and adding these capabilities to a robot lets users conveniently obtain the information they want.
Early childhood is the enlightenment stage of life, and what a child learns and perceives at this stage has a vital effect on later development. However, many parents are busy with work and do not have much time to accompany their children in learning and play. As a result, many children are left at home alone, which affects their growth and psychology; moreover, when something unexpected happens, a child alone may be unable to protect himself.
Summary of the invention
The purpose of the utility model is to provide a children teaching and monitoring robot that integrates teaching, entertainment, companionship and monitoring, accompanies children who stay at home alone, and at the same time performs security monitoring of the child's environment.
To achieve the above purpose, the technical scheme adopted by the utility model is: a children teaching and monitoring robot comprising a voice processing module, a host computer, a serial communication module, a digital signal processor and a lower computer motion control module connected in sequence; the host computer is further connected with a touch screen display module and a human target tracking module; the digital signal processor is further connected with a sensor module, a GSM module and a mechanical arm module.
The children teaching and monitoring robot of the utility model supports voice interaction and touch screen interaction, and uses its sensors together with the GSM module to monitor safety hazards such as indoor fire and gas leakage. It integrates teaching, entertainment, companionship and monitoring, letting children learn and grow in a relaxed and safe environment and stimulating their desire for knowledge.
Description of drawings
Fig. 1 is a structural diagram of the robot of the utility model.
Fig. 2 is a control schematic of the infrared touch screen.
Fig. 3 is a schematic diagram of voice interaction between a person and the robot.
Fig. 4 is a flow diagram of the human target tracking algorithm that fuses Kinect depth image information and color information.
Fig. 5 is a working schematic of the serial communication module.
Fig. 6 shows the relationship between PWM pulse width and servo output shaft angle.
Fig. 7 is a schematic diagram of data transmission between the digital signal processor and the GSM module.
Fig. 8 is a schematic diagram of the lower computer motion control module.
Fig. 9 shows PWM waveforms of different pulse widths.
In Fig. 1: 1. touch screen display module; 2. display; 3. infrared touch screen; 4. loudspeaker; 5. speech synthesis unit; 6. voice processing module; 7. microphone array; 8. human target tracking module; 9. human target detection and tracking unit; 10. Kinect camera; 11. speech recognition unit; 12. host computer; 13. serial communication module; 14. smoke sensor; 15. gas sensor; 16. GSM module; 17. motor encoder; 18. DC motor; 19. driver module; 20. wrist joint servo; 21. elbow joint servo; 22. shoulder joint servo; 23. digital signal processor.
Embodiment
The utility model is described in detail below in conjunction with the drawings and specific embodiments.
As shown in Fig. 1, the children teaching and monitoring robot of the utility model comprises a voice processing module 6, a host computer 12, a serial communication module 13, a digital signal processor 23 and a driver module 19 connected in sequence. The host computer 12 is further connected with a touch screen display module 1 and a human target tracking module 8. The digital signal processor 23 is further connected with a smoke sensor 14, a gas sensor 15, a GSM module 16, a motor encoder 17, a wrist joint servo 20, an elbow joint servo 21 and a shoulder joint servo 22. The driver module 19 is connected with the DC motor 18 and the motor encoder 17.
The touch screen display module 1 comprises a display 2 and an infrared touch screen 3; the infrared touch screen 3 is connected with the host computer 12. The voice processing module 6 comprises a speech synthesis unit 5 connected to a loudspeaker 4, and a speech recognition unit 11 connected to a microphone array 7; the speech synthesis unit 5 and the speech recognition unit 11 are each connected with the host computer 12. The human target tracking module 8 comprises a human target detection and tracking unit 9 connected to a Kinect camera 10; the Kinect camera 10 is connected with the host computer 12.
The motor encoder 17, the DC motor 18 and the driver module 19 form the lower computer motion control module, which controls the motion of the robot.
The wrist joint servo 20, the elbow joint servo 21 and the shoulder joint servo 22 form the mechanical arm module, which performs the corresponding arm actions.
The sensors used in this robot are not limited to the smoke sensor 14 and the gas sensor 15; sensors with other functions can be installed as required.
The digital signal processor (DSP) 23 is a TMS320F2812.
The touch screen display module 1 consists of a touch screen and a display 2. A touch screen system generally comprises a touch screen controller and a touch detection device. The main function of the touch screen controller is to receive touch information from the touch detection device, convert it into contact coordinates and send them to the CPU; it can also receive and execute commands sent by the CPU. The touch detection device is generally mounted at the front of the display; its main function is to detect the user's touch position and transmit it to the touch screen controller. The main types of touch screen currently available are resistive, capacitive, infrared and surface acoustic wave.
The utility model adopts an IRTOUCH series infrared touch screen, model E17D03U-30; it is 17 inches, connects over USB, and is convenient to use and develop for. An infrared touch screen works by lining the edges of the screen with infrared emitting tubes and infrared receiving tubes in one-to-one correspondence, forming a grid of infrared beams across the screen surface. When the user touches the screen, the finger blocks the horizontal and vertical beams passing through that position, changing the infrared power received by the receiving tubes in both directions; from this change in reception the controller can determine where the touch occurred and judge the position of the touch point on the screen. The infrared touch screen consists of a controller, an emitting circuit and a receiving circuit. In operation, the microprocessor in the controller drives a shift-latch circuit that, through the address and data lines, turns on the infrared emitting tubes in sequence while addressing the corresponding receiving tubes. When a touch occurs, the finger or other object blocks the horizontal and vertical beams through that position. Finding a blocked beam during its scan, the microprocessor judges that a touch may have occurred, immediately switches to scanning the other axis, and if a beam on the other axis is also blocked, the touch is confirmed; the positions of the two blocked beams are reported to the host, and the position of the touch point on the screen is computed from them. The control schematic of the infrared touch screen is shown in Fig. 2. The infrared touch screen 3 is connected directly to the host computer (PC) 12 over USB; the touch screen controller detects the touched position in coordinate form and sends it to the host computer 12, which determines the input and then displays the teaching material according to the selected content.
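The scan-and-report logic described above can be sketched compactly. The following C++ fragment is illustrative only: the patent does not disclose the controller firmware, and the beamBlocked() helper and grid dimensions are assumptions standing in for the emitter/receiver addressing hardware.

```cpp
// Illustrative sketch of the infrared grid scan; helpers and beam counts
// are assumptions, not taken from the patent.
#include <optional>
#include <utility>

constexpr int kColumns = 64, kRows = 48;        // assumed beam counts

extern bool beamBlocked(char axis, int index);  // pulse emitter, sample receiver

// Scan both axes; a touch exists only if a beam is blocked on each axis.
std::optional<std::pair<int, int>> scanTouch()
{
    int x = -1, y = -1;
    for (int i = 0; i < kColumns && x < 0; ++i)
        if (beamBlocked('X', i)) x = i;         // blocked vertical beam
    if (x < 0) return std::nullopt;             // no touch candidate

    for (int j = 0; j < kRows && y < 0; ++j)
        if (beamBlocked('Y', j)) y = j;         // blocked horizontal beam
    if (y < 0) return std::nullopt;             // single-axis block: not a touch

    return std::make_pair(x, y);                // coordinates reported to the host
}
```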
After the touch screen is fitted to the display and connected to a USB port of the host, the IRTOUCH infrared touch screen driver must be downloaded; once the driver is installed, the touch screen properties and parameters can be set and calibrated. The touch screen interface is designed with the IRTOUCH software kit, first simulated and debugged on the computer and then downloaded to the touch screen. The utility model provides a main interface and a control interface. The main interface is the teaching guidance screen: following the on-screen prompts, the user selects the teaching material and the content to learn, including 300 Tang poems, elementary English words, Chinese nursery rhymes, English nursery rhymes and children's stories. The control interface displays the current teaching material according to the selection and handles the jump to the next step. The robot uses the touch screen as its head. After the robot is started, the touch screen displays the teaching content selection interface. When the user touches the screen with a finger or another object, the touch screen controller detects the touched position in coordinate form and sends it to the host computer over USB, which determines the input and displays the teaching material according to the selected content. For example, children can select ancient poems, nursery rhymes or English words to learn through the touch screen, or have the robot read ancient poems aloud, sing nursery rhymes and tell stories.
In general, man-machine voice communication takes two forms. In the first, the person talks and the machine listens: this is the "artificial ear", i.e. speech recognition (SR). In the second, the machine talks and the person listens: this is the "artificial mouth", i.e. speech synthesis (Text To Speech, TTS). Based on the microphone array 7 of the Kinect camera 10 and Microsoft's Speech SDK, the utility model adds speech recognition and speech synthesis to the robot. The robot can understand spoken commands such as "read 'Thoughts on a Quiet Night'", "turn left", "track" and "play a song", and act accordingly; it can also hold a simple dialogue, answering questions such as "What is your name?" and "How old are you?". In addition, the robot can read stories and English words aloud, play nursery rhymes and tell stories as selected by the user.
Speech recognition studies how to use a computer to extract useful information from a human speech signal and determine its linguistic meaning. Its basic principle is to process the input speech and compare it against a speech model library to obtain a recognition result. A speech recognition system has two main stages: training and recognition. In the training stage, the user inputs training utterances several times; after preprocessing and feature extraction the system obtains feature vectors, and the feature modeling module builds a reference model library from the training utterances. In the recognition stage, the feature vectors of the input speech are compared against the patterns in the reference model library by similarity measurement, and the class of the most similar pattern is output as the candidate recognition result. Speech synthesis is the process of converting text into speech. HMM-based statistical parametric speech synthesis is likewise divided into a training stage and a synthesis stage, and can be regarded as the inverse of speech recognition: in the training stage, HMM models are built for the excitation and vocal tract parameters of speech; in the synthesis stage, parameters are predicted from the models and a parametric synthesizer produces the final speech.
The built-in microphone array of the Kinect can be used for speech recognition. The Kinect audio system uses a four-element linear microphone array, arranged in a line or an "L" shape, and employs advanced noise suppression, echo cancellation and beamforming to identify the current sound source, effectively suppressing background noise and improving the recognition rate. Using Microsoft's Kinect for Windows SDK and programming in C++ with Visual Studio 2010, speech recognition can be added to the robot so that it understands spoken commands, acts accordingly, and holds simple conversations. Microsoft Speech SDK is Microsoft's speech software kit and includes speech recognition and synthesis engine components; with it, speech synthesis can be implemented in software, so that the robot can speak, reading ancient poems, English words, stories and nursery rhymes aloud. Speech input from the microphone is recognized on the PC and the corresponding control operation is executed; when synthesized speech is needed, the speech synthesized by the PC is played through the loudspeaker.
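As a concrete illustration of the synthesis side, here is a minimal C++ sketch using the SAPI 5 COM interface that underlies the Microsoft Speech SDK; the spoken text is a placeholder for a poem selected on screen, and error handling is reduced to the essentials.

```cpp
// Minimal SAPI 5 text-to-speech sketch (Windows; link sapi.lib and ole32.lib).
#include <windows.h>
#include <sapi.h>

int main()
{
    if (FAILED(::CoInitialize(NULL)))   // initialize COM for this thread
        return 1;

    ISpVoice* pVoice = NULL;
    HRESULT hr = ::CoCreateInstance(CLSID_SpVoice, NULL, CLSCTX_ALL,
                                    IID_ISpVoice, (void**)&pVoice);
    if (SUCCEEDED(hr))
    {
        // Speak synchronously; the text stands in for a selected poem.
        pVoice->Speak(L"Thoughts on a Quiet Night, by Li Bai.", SPF_DEFAULT, NULL);
        pVoice->Release();
    }
    ::CoUninitialize();
    return 0;
}
```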
Fig. 3 shows the voice interaction between a person and the robot. For a simple dialogue, the speech input through the microphone array is recognized by the host computer, the matching reply is found in the speech library, and the reply is played back through speech synthesis. For voice-controlled motion, the host computer translates the recognized command into a predefined command signal and sends it to the lower computer over the RS232 serial port to control the motors. When the user wants selected touch content read aloud, the infrared touch screen sends the touched coordinates to the host computer over USB, and the host computer reads the selected content aloud through speech synthesis. For example, a child can say "dance", and the robot performs a dance with preset mechanical arm movements.
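To make the "predefined command signal" concrete, the sketch below shows one way the host computer could map recognized phrases onto one-byte command codes before sending them over the serial port; the enum values and phrases are assumptions, not codes disclosed in the patent.

```cpp
// Illustrative mapping from recognized phrases to one-byte motion commands
// (the codes and phrases are assumptions, not values from the patent).
#include <cstdint>
#include <map>
#include <string>

enum class MotionCmd : uint8_t {
    Stop = 0x00, Forward = 0x01, TurnLeft = 0x02,
    TurnRight = 0x03, Track = 0x04, Dance = 0x05
};

MotionCmd translate(const std::string& phrase)
{
    static const std::map<std::string, MotionCmd> table = {
        {"turn left",  MotionCmd::TurnLeft},
        {"turn right", MotionCmd::TurnRight},
        {"tracking",   MotionCmd::Track},
        {"dance",      MotionCmd::Dance},
    };
    auto it = table.find(phrase);
    return it == table.end() ? MotionCmd::Stop : it->second;
}
```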
Target detection refers to extracting a moving target from a video stream in real time, separating the moving target from the complex background and filtering out irrelevant information in the image. Target tracking refers to detecting, extracting, identifying and following a target in the image to obtain its motion parameters, such as velocity, position, trajectory and acceleration, so that further analysis and processing can yield the parameters of the moving target and support higher-level processing.
After obtaining depth image information and color image information of the human target through the Kinect camera 10, the robot tracks the human target with the Camshift algorithm. When a child interacts with the robot and speaks the command "track target" into the Kinect microphone array, the host computer 12 recognizes the command and sends the command signal to the lower computer to control the motors, so the robot can follow and accompany the child.
The Kinect has three lenses: the middle lens is an RGB color camera for capturing color images, and the lenses on either side form a 3D structured light depth sensor, consisting of an infrared emitter and an infrared CMOS camera, for capturing depth data (the distance from objects in the scene to the camera). The SDK lets programmers develop applications easily in C++, C# or Visual Basic with Microsoft Visual Studio 2010. The Camshift algorithm is a non-parametric iterative algorithm, built around the Meanshift algorithm, that searches for the extremum (peak) of a probability distribution. Its basic idea is to run the Meanshift computation on every frame of the video and to use the result of the previous frame, namely the size and center of the search window, as the initial search window of the Meanshift algorithm for the next frame, iterating in this way until a threshold is reached and the computation stops.
The robot uses the color and depth information obtained by the Kinect to extend the Camshift algorithm into three-dimensional space. It fuses the depth information of the human target in three-dimensional space, uses a Kalman filter to predict the target's position and eliminate the interference of the complex background in the projection map, and finally computes the centroid position of the human body in the corresponding depth image, realizing human target tracking that fuses depth and color information. First, the color image captured by the Kinect is transformed into the HSV color space and the H component, which characterizes the object's color, is extracted; depth information is fused in to build the object model. For each pixel of the H channel of the video image, the target model is queried to obtain the back-projection map of the target. According to the target's position in the previous frame, pixels in the back-projection map that cannot belong to the target are set to 0; the target's position is then searched for in the back-projection map, and the obtained target position is combined with the depth image to compute the centroid of the whole human body in the depth image, realizing human target tracking. The flow diagram is shown in Fig. 4.
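The color-tracking core of this pipeline can be sketched with OpenCV (the patent does not name an implementation library, so this choice is an assumption); the depth fusion and Kalman prediction steps described above are omitted, leaving only the hue back-projection and Camshift search.

```cpp
// Hue-based Camshift tracking sketch; a webcam stands in for the Kinect
// color stream, and the initial window is assumed to cover the target.
#include <opencv2/opencv.hpp>

int main()
{
    cv::VideoCapture cap(0);              // stand-in for the Kinect color stream
    cv::Rect window(200, 100, 120, 200);  // assumed initial search window

    cv::Mat frame, hsv, hue, mask, hist, backproj;
    int hsize = 30;                       // hue histogram bins
    float hranges[] = {0, 180};
    const float* phranges = hranges;

    cap >> frame;
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
    // keep only reasonably saturated, bright pixels
    cv::inRange(hsv, cv::Scalar(0, 60, 32), cv::Scalar(180, 255, 255), mask);
    int ch[] = {0, 0};
    hue.create(hsv.size(), hsv.depth());
    cv::mixChannels(&hsv, 1, &hue, 1, ch, 1);

    // hue histogram of the initial target region: the "object model"
    cv::Mat roi(hue, window), maskroi(mask, window);
    cv::calcHist(&roi, 1, 0, maskroi, hist, 1, &hsize, &phranges);
    cv::normalize(hist, hist, 0, 255, cv::NORM_MINMAX);

    for (;;)
    {
        cap >> frame;
        if (frame.empty()) break;
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        cv::mixChannels(&hsv, 1, &hue, 1, ch, 1);
        cv::calcBackProject(&hue, 1, 0, hist, backproj, &phranges);

        // the previous window seeds the next search, as the description says
        cv::RotatedRect box = cv::CamShift(backproj, window,
            cv::TermCriteria(cv::TermCriteria::EPS | cv::TermCriteria::COUNT, 10, 1));
        cv::ellipse(frame, box, cv::Scalar(0, 0, 255), 2);
        cv::imshow("tracking", frame);
        if (cv::waitKey(30) == 27) break;   // Esc quits
    }
    return 0;
}
```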
The robot communicates over an RS232 serial port. RS232 is an asynchronous interface: the receiver and the transmitter each have their own clock. The robot receives signals from the host computer 12 and forwards them over the RS232 serial link to the lower computer motion control module, which controls the motion. The working schematic of the serial communication module is shown in Fig. 5.
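On the host computer side, opening and configuring such a link might look like the following Win32 sketch; the port name and the 9600-8-N-1 settings are assumptions, since the patent does not state them.

```cpp
// Win32 serial port sketch: open, configure, and send one command byte.
#include <windows.h>

HANDLE openSerial(const wchar_t* port = L"COM1")   // assumed port name
{
    HANDLE h = ::CreateFileW(port, GENERIC_READ | GENERIC_WRITE,
                             0, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) return h;

    DCB dcb = {0};
    dcb.DCBlength = sizeof(dcb);
    ::GetCommState(h, &dcb);
    dcb.BaudRate = CBR_9600;   // assumed baud rate
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    ::SetCommState(h, &dcb);
    return h;
}

bool sendCommand(HANDLE h, unsigned char cmd)
{
    DWORD written = 0;
    return ::WriteFile(h, &cmd, 1, &written, NULL) && written == 1;
}
```

A recognized command byte from the mapping sketched earlier would then be written with sendCommand().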
The mechanical arm consists mainly of the shoulder joint servo 22, the elbow joint servo 21 and the wrist joint servo 20. A servo (also called a servomotor) is a position (angle) actuator, suitable for control systems that require the angle to change continuously and hold; in microelectromechanical systems and model aircraft it is a basic output actuator. Its working principle is as follows: a PWM signal enters the signal modulation chip through the receiver channel and produces a DC bias voltage. Inside the servo there is a reference circuit that generates a reference signal with a period of 20 ms and a width of 1.5 ms; the DC bias voltage is compared with the voltage of the potentiometer to obtain a voltage difference, and the sign of this difference, fed to the motor driver chip, determines the direction of rotation. At a given motor speed, the motor drives the potentiometer through a cascaded reduction gear until the voltage difference is 0 and the motor stops. The control signal of the servo is thus a PWM signal, and the position of the servo is changed by varying the duty cycle: when the pulse width of the square wave changes, the angle of the servo shaft changes in proportion to the pulse width. The relationship between the servo output shaft angle and the input pulse width is shown in Fig. 6. The PC directs the DSP to generate PWM waves of different pulse widths, which drive the servos connected to the DSP to rotate to different angles, thereby controlling the motion of the mechanical arm.
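The pulse-width-to-angle relationship of Fig. 6 reduces to a linear mapping. The sketch below computes a timer compare value for a requested joint angle; the 0.5-2.5 ms pulse range, the tick count and setCompare() are assumptions abstracting the TMS320F2812 event-manager registers.

```cpp
// Map a joint angle to a PWM compare value for a 20 ms servo frame.
#include <cstdint>

constexpr double kFrameMs  = 20.0;   // servo PWM period from the description
constexpr double kMinPulse = 0.5;    // assumed pulse width at 0 degrees (ms)
constexpr double kMaxPulse = 2.5;    // assumed pulse width at 180 degrees (ms)
constexpr uint32_t kPeriodTicks = 60000;  // assumed timer ticks per 20 ms frame

extern void setCompare(int channel, uint32_t ticks);  // writes the compare register

void setServoAngle(int channel, double degrees)
{
    if (degrees < 0.0)   degrees = 0.0;     // clamp to the servo's travel
    if (degrees > 180.0) degrees = 180.0;
    double pulseMs = kMinPulse + (kMaxPulse - kMinPulse) * degrees / 180.0;
    uint32_t ticks = static_cast<uint32_t>(pulseMs / kFrameMs * kPeriodTicks);
    setCompare(channel, ticks);             // pulse width proportional to angle
}
```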
A GSM (Global System for Mobile Communications) module integrates a GSM radio frequency chip, a baseband processing chip, memory and a power amplifier on one circuit board, providing GSM radio frequency processing, baseband processing and standard interfaces. A microcontroller can communicate with the GSM module through the RS232 serial port and control it with standard AT commands to implement various wireless communication functions. This robot adopts the TC35 wireless GSM module produced by Siemens and, driven by the DSP, implements SMS (short message service) communication. The TC35 consists of six main parts: a GSM baseband processor, a GSM radio module, a power supply module (ASIC), flash memory, a ZIF connector and an antenna interface. Its 40 pins, brought out through a ZIF (Zero Insertion Force) connector, fall into five groups: power supply, data input/output, SIM card, audio interface and control. The data interface of the TC35 uses asynchronous serial transmission. The DSP TMS320F2812 has two asynchronous serial communication interfaces (SCI), each with an external transmit-output pin (SCITXD) and a receive-input pin (SCIRXD) multiplexed onto general-purpose I/O ports; connecting an SCI module of the DSP to the serial interface of the TC35 enables data transmission between the TC35 and the DSP, as shown in Fig. 7. Using the AT command set defined in GSM 07.05, short messages can be sent, received, searched and managed conveniently and compactly. The message content is set first and the GSM module is initialized; when the sensor module detects abnormal conditions, the DSP sends the short message content to the GSM module 16 over the serial port with AT commands, and it is finally delivered to the user's mobile phone. The key to this communication is how the TMS320F2812 controls the TC35 with AT commands. SMS messages can be sent and received in three modes: Block mode, AT-command-based Text mode and AT-command-based PDU (Protocol Data Unit) mode. Chinese short messages are sent and received in PDU format with a maximum of 70 Chinese characters, encoded in UNICODE; as long as the software reads and writes the corresponding data format, short messages can be received or sent in PDU format. The main AT commands related to short messages are: AT+CMGF, set the short message mode; AT+CMGS, send a short message; AT+CNMI, set the new message indication; AT+CMGR, read a short message; AT+CMGD, delete a short message.
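The AT command exchange can be sketched as follows. For brevity this uses Text mode (AT+CMGF=1) rather than the PDU mode the description requires for Chinese text, and sciPuts()/delayMs() are assumed wrappers around the DSP's SCI driver; a real implementation would parse the modem's responses instead of waiting fixed delays.

```cpp
// Sketch of sending an alarm SMS through the TC35 with AT commands.
extern void sciPuts(const char* s);   // transmit a string over the SCI port
extern void delayMs(unsigned ms);     // busy-wait helper

void sendAlarmSms(const char* number, const char* text)
{
    sciPuts("AT+CMGF=1\r");           // select Text mode
    delayMs(500);                     // crude wait for the "OK" response

    sciPuts("AT+CMGS=\"");            // start message submission
    sciPuts(number);
    sciPuts("\"\r");
    delayMs(500);                     // wait for the "> " prompt

    sciPuts(text);                    // message body
    const char ctrlZ[] = {0x1A, 0};   // Ctrl-Z terminates and sends
    sciPuts(ctrlZ);
}
```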
This robot adopts the smoke sensor MQ-2 and the gas sensor MQ-4 to detect indoor fire and gas leakage respectively. The MQ-4 gas sensor is highly sensitive to methane and natural gas, responds and recovers quickly, and has a long service life and reliable stability. The resistance of the gas sensor decreases approximately linearly with indoor gas concentration, so the sensor current is approximately linear in the concentration; combined with a potentiometer, the concentration signal is converted into a voltage signal, and the voltage collected from the sensor is converted into a digital value through an A/D conversion channel of the DSP and processed. The MQ-2 is similar in structure and detects indoor smoke concentration. Both the smoke sensor and the gas sensor are connected to the A/D conversion interface of the DSP; the A/D channel converts the analog voltage collected by the sensors into a digital signal, and if the computed smoke or combustible gas concentration reaches a set critical value, a signal is raised and the DSP sends a message to the owner through the GSM module for handling. That is, through the SCI (asynchronous serial) link between the DSP and the GSM module, data can be exchanged in both directions, and when a sensor detects abnormal conditions, the GSM module is controlled with AT commands to send a short message to the owner.
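Putting the pieces together, the monitoring loop might look like this sketch; readAdc() and the two critical values are assumptions standing in for the TMS320F2812 ADC result registers and the calibrated thresholds, and sendAlarmSms() is the helper from the GSM sketch above.

```cpp
// Sketch of the sensor polling loop on the DSP side; the channel numbers,
// thresholds and phone number are placeholders, not values from the patent.
extern unsigned readAdc(int channel);   // 12-bit A/D conversion result
void sendAlarmSms(const char* number, const char* text);  // see GSM sketch

constexpr int      kSmokeChannel = 0;     // MQ-2 smoke sensor channel
constexpr int      kGasChannel   = 1;     // MQ-4 gas sensor channel
constexpr unsigned kSmokeLimit   = 2500;  // assumed critical ADC values
constexpr unsigned kGasLimit     = 2200;

void pollSensors()
{
    if (readAdc(kSmokeChannel) > kSmokeLimit)
        sendAlarmSms("+8613800000000", "Alarm: smoke detected indoors");
    if (readAdc(kGasChannel) > kGasLimit)
        sendAlarmSms("+8613800000000", "Alarm: combustible gas detected");
}
```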
The motion of the robot is produced mainly by having the DSP generate PWM waves of different pulse widths to control the rotation of the connected DC motors, while the values collected from the motor encoders attached to the motors are used to regulate speed. The lower computer motion control module adopts the DSP TMS320F2812 as its main control chip and through this module implements the motion control of the robot's motors. The TMS320F28xx is a DSP family from TI; it is among the most advanced and powerful 32-bit fixed-point DSP chips on the market and is designed specifically for control applications. The connection diagram of the lower computer motion control module is shown in Fig. 8. The DSP collects the motor encoder signal, computes the speed from it and outputs a PWM wave; by adjusting the duty cycle of the PWM wave, it regulates the speed of the DC motor. Adjusting the duty cycle of the DSP's PWM output changes the duty cycle of the voltage across the DC motor armature and hence its average value, thereby controlling the speed of the motor. Under pulsed drive, the speed increases while the motor is powered and decreases gradually while it is not; by switching the power on and off according to a rule, the motor speed can be controlled. PWM waveforms of different widths are shown in Fig. 9.
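A minimal version of this closed loop is sketched below: the encoder delta over a fixed control period measures the speed, and a PI update adjusts the PWM duty cycle. The gains and the two hardware helpers are assumptions abstracting the DSP's encoder capture and PWM registers.

```cpp
// Sketch of the speed loop: encoder ticks per control period are compared
// with the target, and a PI controller adjusts the PWM duty cycle.
extern int  readEncoderDelta();        // encoder ticks since the last call
extern void setPwmDuty(double duty);   // 0.0 .. 1.0 of the PWM period

static double integral = 0.0;
constexpr double kP = 0.0008, kI = 0.0001;   // assumed controller gains

void speedLoop(int targetTicksPerPeriod)
{
    int error = targetTicksPerPeriod - readEncoderDelta();
    integral += error;

    double duty = kP * error + kI * integral;  // positional PI law
    if (duty < 0.0) duty = 0.0;                // clamp to the valid range
    if (duty > 1.0) duty = 1.0;
    setPwmDuty(duty);
}
```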
Through the image depth and color information collected by the Kinect camera 10, the robot detects and tracks a target and accompanies children. In addition, the robot monitors the indoor environment: through the smoke sensor 14 and the gas sensor 15 it can monitor safety hazards such as indoor fire and gas leakage, and when an abnormality occurs it sends a message to the owner through the GSM module 16, so that the owner can deal with it in time.

Claims (7)

1. A children teaching and monitoring robot, characterized in that it comprises a voice processing module (6), a host computer (12), a serial communication module (13), a digital signal processor (23) and a lower computer motion control module connected in sequence; the host computer (12) is further connected with a touch screen display module (1) and a human target tracking module (8); and the digital signal processor (23) is further connected with a sensor module, a GSM module (16) and a mechanical arm module.
2. The children teaching and monitoring robot according to claim 1, characterized in that the lower computer motion control module comprises a driver module (19), a DC motor (18) and a motor encoder (17) connected in sequence, and the driver module (19) and the motor encoder (17) are each connected with the digital signal processor (23).
3. The children teaching and monitoring robot according to claim 1, characterized in that the mechanical arm module comprises a wrist joint servo (20), an elbow joint servo (21) and a shoulder joint servo (22), each connected with the digital signal processor (23).
4. The children teaching and monitoring robot according to claim 1, 2 or 3, characterized in that the digital signal processor (23) is a DSP TMS320F2812.
5. The children teaching and monitoring robot according to claim 1, characterized in that the touch screen display module (1) comprises a display (2) and an infrared touch screen (3), and the infrared touch screen (3) is connected with the host computer (12).
6. The children teaching and monitoring robot according to claim 1, characterized in that the voice processing module (6) comprises a speech synthesis unit (5) connected to a loudspeaker (4) and a speech recognition unit (11) connected to a microphone array (7); the speech synthesis unit (5) and the speech recognition unit (11) are each connected with the host computer (12).
7. The children teaching and monitoring robot according to claim 1, characterized in that the human target tracking module (8) comprises a human target detection and tracking unit (9) connected to a Kinect camera (10), and the Kinect camera (10) is connected with the host computer (12).
CN2013203578438U 2013-06-18 2013-06-18 Children teaching and monitoring robot Expired - Fee Related CN203300127U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2013203578438U CN203300127U (en) 2013-06-18 2013-06-18 Children teaching and monitoring robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2013203578438U CN203300127U (en) 2013-06-18 2013-06-18 Children teaching and monitoring robot

Publications (1)

Publication Number Publication Date
CN203300127U true CN203300127U (en) 2013-11-20

Family

ID=49576123

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2013203578438U Expired - Fee Related CN203300127U (en) 2013-06-18 2013-06-18 Children teaching and monitoring robot

Country Status (1)

Country Link
CN (1) CN203300127U (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104575141A (en) * 2015-01-20 2015-04-29 三峡大学 Man-computer interaction auxiliary classroom teaching aid
CN105702106A (en) * 2016-04-22 2016-06-22 广东技术师范学院 Teaching robot control system
CN105719519A (en) * 2016-04-27 2016-06-29 深圳前海勇艺达机器人有限公司 Robot with graded teaching function
CN105741626A (en) * 2016-05-16 2016-07-06 苏州金建达智能科技有限公司 Education robot arm containing touch screen
CN106020057A (en) * 2016-07-19 2016-10-12 东莞市优陌儿智护电子科技有限公司 Nursing robot
CN106054897A (en) * 2016-07-18 2016-10-26 旗瀚科技有限公司 Robot capable of performing human body following
CN106313072A (en) * 2016-10-12 2017-01-11 南昌大学 Humanoid robot based on leap motion of Kinect
CN106647423A (en) * 2015-10-28 2017-05-10 中国移动通信集团公司 Intelligent following shooting method and intelligent following shooting device
CN106781745A (en) * 2017-02-08 2017-05-31 深圳凯达通光电科技有限公司 A kind of home-use intelligent child teaching robot
CN107363842A (en) * 2017-07-28 2017-11-21 长沙师范学院 A kind of children, which give pleasure to, teaches monitoring robot and its human body target tracking algorithm
CN108582117A (en) * 2018-07-12 2018-09-28 朱明来 Robot is followed based on Kinect sensor
CN109719741A (en) * 2019-01-08 2019-05-07 颜睿毅 A kind of interaction robot method for safe operation
US11580972B2 (en) * 2019-04-26 2023-02-14 Fanuc Corporation Robot teaching device


Similar Documents

Publication Publication Date Title
CN203300127U (en) Children teaching and monitoring robot
Xie et al. Accelerometer-based hand gesture recognition by neural network and similarity matching
US8552983B2 (en) Intelligent robotic interface input device
KR101576148B1 (en) System and method for the multidimensional evaluation of gestures
Li et al. A web-based sign language translator using 3d video processing
CN109933191B (en) Gesture recognition and control method and system
US20110199292A1 (en) Wrist-Mounted Gesture Device
CN102789218A (en) Zigbee smart home system based on multiple controllers
Kaholokula Reusing ambient light to recognize hand gestures
JP7375748B2 (en) Information processing device, information processing method, and program
Wang et al. Wheeled robot control based on gesture recognition using the Kinect sensor
US20200269421A1 (en) Information processing device, information processing method, and program
Verdadero et al. Hand gesture recognition system as an alternative interface for remote controlled home appliances
Francis et al. Significance of hand gesture recognition systems in vehicular automation-a survey
Christian et al. Hand gesture recognition and infrared information system
Wilson Sensor-and recognition-based input for interaction
Swapna et al. Hand gesture recognition system for numbers using thresholding
Hatwar et al. Home automation system based on gesture recognition system
PreetiDhiman et al. Voice Operated Intelligent Fire Extinguisher Vehicle
Kurian et al. Visual Gesture-Based Home Automation
Luo Research on gesture recognition based on YOLOv5
Melnyk et al. Towards computer assisted international sign language recognition system: a systematic survey
KR20150124009A (en) Coaching System Of Robot Using Hand Movement
Lee et al. Fabrication of the kinect remote-controlled cars and planning of the motion interaction courses
Wang et al. A practical service robot system for greeting guests

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20131120

Termination date: 20140618

EXPY Termination of patent right or utility model