CN206475183U - Robot - Google Patents

Robot

Info

Publication number
CN206475183U
CN206475183U (Application CN201621230925.6U)
Authority
CN
China
Prior art keywords
robot
feedback
interactive
interactive operation
control element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201621230925.6U
Other languages
Chinese (zh)
Inventor
蒋化冰
孙斌
吴礼银
康力方
李小山
张干
赵亮
邹武林
徐浩明
廖凯
齐鹏举
方园
李兰
米万珠
舒剑
吴琨
管伟
罗璇
罗承雄
张海建
马晨星
张俊杰
谭舟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Noah Wood Robot Technology Co ltd
Shanghai Zhihui Medical Technology Co ltd
Shanghai Zhihuilin Medical Technology Co ltd
Original Assignee
Shanghai Muye Robot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Muye Robot Technology Co Ltd filed Critical Shanghai Muye Robot Technology Co Ltd
Priority to CN201621230925.6U priority Critical patent/CN206475183U/en
Application granted granted Critical
Publication of CN206475183U publication Critical patent/CN206475183U/en
Status: Active


Abstract

The utility model discloses a robot. The robot includes: a detector arranged on the robot body for detecting an interactive operation; a central control element that controls a feedback device to perform a feedback behavior according to the interactive operation and the location of the detector corresponding to that operation; and a feedback device that performs the feedback behavior. With this scheme, the robot can realize human-computer interaction based on the interactive operation behavior a user triggers on the body and the position of that operation, so that human-computer interaction is no longer limited to voice and touch interaction on a display screen, extending the robot's interactive forms.

Description

Robot
Technical field
The utility model belongs to the field of mobile robots, and in particular relates to a robot.
Background technology
In recent years, robot technology and artificial intelligence research have continued to deepen, and intelligent robots play an increasingly important role in human life. As people's demands grow, robots with stronger interactive functions will gradually become the favorites of the robotics field.
At this stage, however, most robots make their interactive responses according to voice instructions input by the user or touch instructions input on a display screen. These interactive functions are essentially designed from the angle of service logic; anthropomorphic interactive functions are comparatively lacking, so the human-computer interaction forms of current robots are overly monotonous.
Summary of the utility model
In view of this, embodiments of the utility model provide a robot, to expand the robot's human-computer interaction forms and improve the intelligence of human-computer interaction.
An embodiment of the utility model provides a robot, including:
a detector arranged on the robot body, for detecting an interactive operation;
a central control element that controls a feedback device to perform a feedback behavior according to the interactive operation and the location of the detector corresponding to the interactive operation; and
a feedback device that performs the feedback behavior.
Optionally, the detector includes at least one touch sensor arranged at different locations on the robot body, and/or at least one vibration sensor;
the touch sensor is used to detect the direction and/or speed of a touch slide;
the vibration sensor is used to detect the number and/or force of tapping operations.
Optionally, the feedback device includes at least one of an audio player, a display screen, and a moving part.
Optionally, the feedback device includes the moving part, and the at least one touch sensor includes a touch sensor arranged under the display screen, for detecting the direction of a touch slide;
the central control element is further used to control the moving part according to the direction of the touch slide, so that the robot walks in the corresponding direction.
Optionally, the robot also includes:
a memory for storing various feedback behaviors.
Optionally, the robot also includes:
an image acquisition device arranged on the robot body, for acquiring an image of an interactive object; and
an image recognizer that recognizes the image of the interactive object and obtains the features of the interactive object;
the central control element is further used to determine the feedback behavior according to the interactive operation, the trigger position of the interactive operation, and the features of the interactive object.
In the robot provided by the utility model, one or more detectors are arranged on the robot body so that the interactive operations a user triggers on the robot can be detected by the detectors. Because different detector locations and different interactive operation behaviors can be preset to correspond to different feedback behaviors, after the central control element receives interactive operation information sent by a detector, it can determine the corresponding feedback behavior based on the interactive operation and the preset location of the corresponding detector, and control the feedback device to perform that feedback behavior. With this scheme, the robot can realize human-computer interaction based on the interactive operation behavior the user triggers on the body and the position of that operation, so that human-computer interaction is no longer limited to voice and touch interaction on a display screen, extending the robot's interactive forms.
Brief description of the drawings
In order to explain the technical schemes of the utility model embodiments or of the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the utility model; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a structural schematic diagram of robot embodiment one provided by an embodiment of the utility model;
Fig. 2 is a structural schematic diagram of robot embodiment two provided by an embodiment of the utility model.
Detailed description of the embodiments
To make the purpose, technical schemes, and advantages of the utility model embodiments clearer, the technical schemes in the utility model embodiments are described clearly and completely below in conjunction with the accompanying drawings of the utility model embodiments. Obviously, the described embodiments are only some of the embodiments of the utility model, not all of them. Based on the embodiments of the utility model, all other embodiments obtained by those of ordinary skill in the art without creative effort belong to the protection scope of the utility model.
Fig. 1 is a structural schematic diagram of robot embodiment one provided by an embodiment of the utility model. As shown in Fig. 1, the robot includes a detector 10, a central control element 20, and a feedback device 30.
The detector 10 is arranged on the robot body 1 and is used to detect a user's interactive operation.
The central control element 20 is arranged inside the body 1 and is used to control the feedback device 30 to perform a feedback behavior according to the interactive operation detected by the detector 10 and the position of the detector 10 corresponding to that interactive operation.
The feedback device 30 is connected to the central control element 20 and performs the feedback behavior under its control.
In the utility model embodiments, the central control element 20 can be realized by an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a microcontroller, a microprocessor, or other electronic components.
Optionally, in practical applications, the detector 10 includes at least one touch sensor 110 arranged at different locations on the robot body 1, and/or at least one vibration sensor 120. The touch sensor 110 is used to detect the direction and/or speed of a touch slide; the vibration sensor 120 is used to detect the number and force of tapping operations. The locations of the detectors on the robot body 1 can be, but are not limited to, the head 11, the shoulders 12, the belly 13, and so on. For example, two vibration sensors 120 are arranged at the shoulders 12, and multiple touch sensors 110 are arranged at the head 11 and the belly 13 respectively. It can be understood that, to achieve an anthropomorphic interaction effect, the locations of the detectors on the robot body can be determined from statistics of the positions a large number of users habitually touch on the robot, or can be set from experience.
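For illustration only, a minimal sketch of such a sensor layout follows; the registry, device IDs, and dataclass below are hypothetical and are not defined by the patent:

```python
from dataclasses import dataclass

# Hypothetical sensor registry; locations and device IDs are illustrative only.
@dataclass(frozen=True)
class Sensor:
    device_id: str      # identifier the detector reports to the central control element
    kind: str           # "touch" or "vibration"
    location: str       # mounting position on the robot body

SENSOR_LAYOUT = [
    Sensor("vib-shoulder-L", "vibration", "shoulder"),
    Sensor("vib-shoulder-R", "vibration", "shoulder"),
    Sensor("touch-head-1", "touch", "head"),
    Sensor("touch-belly-1", "touch", "belly"),
    Sensor("touch-belly-2", "touch", "belly"),
]

# Lookup table used later to resolve a reported device ID to its location.
LOCATION_BY_ID = {s.device_id: s.location for s in SENSOR_LAYOUT}
```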
It should be pointed out that the touch sensor 110 and the vibration sensor 120 are exemplary in nature; other types of sensors are conceivable. As long as a sensor can detect the interactive operations of human-computer interaction, it can also be included in the protection scope of the detector 10, belongs to the same concept as the utility model, and should fall within the protection scope of the utility model.
In practical applications, in order to prompt the user as to which interactive operations can be performed on the robot, corresponding prompt information can be provided near the locations where the detectors are arranged on the robot body, or prompt information can be displayed on the robot's display screen, for example a touch screen arranged on the robot's belly.
When the user performs a corresponding interactive operation on the robot according to the above prompt information, the detector 10 at the relevant position detects the interactive operation information triggered by the user, such as the number and force of taps, and sends the detected interactive operation information to the central control element 20. Based on the received interactive operation information and the position of that detector 10, the central control element 20 determines the corresponding feedback behavior and controls the feedback device 30 to perform it, feeding back to the user the robot's response to the interactive operation the user triggered.
Specifically, when transmitting the detected interactive operation information to the central control element 20, the detector 10 also sends its own identifier, for example the device model of the detector 10. Based on this identifier, the central control element 20 can determine the location of the detector 10 on the robot body 1.
Specifically, in the utility model embodiments, a memory 40 can be provided in the robot, and the correspondence between detector identifiers and detector locations can be stored in the memory 40, so that the central control element 20 can realize the above determination of the location.
Various feedback behaviors can also be stored in the memory 40, where the storage index of each feedback behavior can be expressed as the result of encoding the detector location and the interactive operation information. Thus, after the central control element 20 receives the interactive operation information and determines the detector location, it encodes the interactive operation information (such as the direction and speed of a touch slide, or the number and force of taps) together with the detector location to obtain an encoding result, and uses that encoding result as an index to search the memory 40 and retrieve the corresponding feedback behavior. It can therefore be understood that the central control element 20 includes the relevant components for realizing this encoding logic.
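The patent does not specify the encoding scheme; the following is a minimal sketch of the idea, assuming a simple tuple key over coarsened operation parameters. All function names, bucket thresholds, and stored behaviors here are illustrative, not taken from the patent:

```python
# Hypothetical encoding of (detector location, interactive operation) into a
# lookup key, and retrieval of the preset feedback behavior from "memory 40".
def encode(location: str, operation: dict) -> tuple:
    if operation["type"] == "slide":
        # Coarsen speed into buckets so similar slides map to one behavior.
        speed_bucket = "fast" if operation["speed"] > 0.5 else "slow"
        return (location, "slide", operation["direction"], speed_bucket)
    if operation["type"] == "tap":
        force_bucket = "hard" if operation["force"] > 1.0 else "soft"
        return (location, "tap", min(operation["count"], 3), force_bucket)
    return (location, operation["type"])

# Preset feedback behaviors keyed by encoding result (contents illustrative).
FEEDBACK_MEMORY = {
    ("belly", "slide", "up", "slow"): ("move", "forward", 0.5),   # walk 50 cm
    ("shoulder", "tap", 2, "soft"):   ("audio", "greeting.wav"),
    ("head", "slide", "left", "fast"): ("display", "smiley.png"),
}

def lookup_feedback(location: str, operation: dict):
    """What the central control element does on receiving detector data."""
    return FEEDBACK_MEMORY.get(encode(location, operation))
```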
In this embodiment, the feedback behaviors with which the robot gives the user interaction feedback can include at least one of a sound feedback behavior, an image feedback behavior, and an action feedback behavior. That is, one or more kinds of feedback data among sounds, images, and action sequences can be stored in the memory 40. Accordingly, the feedback device 30 optionally includes at least one of an audio player 310, a display screen 320, and a moving part 330.
The audio player 310 is connected to the central control element 20 and, under its control, plays the sound used as feedback in response to a sound feedback behavior, so that the user promptly obtains interaction feedback acoustically. The display screen 320 is connected to the central control element 20 and, under its control, displays the image used as feedback in response to an image feedback behavior, so that the user promptly obtains interaction feedback visually. The moving parts 330 include but are not limited to the robot's legs, arms, head, and so on; they are connected to the central control element 20 and, under its control, perform the action used as feedback in response to an action feedback behavior, so that the user promptly experiences interaction feedback from the robot's limb movements.
The implementation of interaction feedback in the utility model embodiments is illustrated with several examples below:
For example, the feedback device 30 includes the moving parts 330, and the detector 10 includes multiple touch sensors 110 arranged on the robot's belly, for example multiple touch sensors 110 arranged under the display screen 320, used to detect the direction of a touch slide. When these touch sensors 110 detect that the user has triggered a slide operation in some direction, the central control element 20 controls the moving parts 330 according to the detected slide direction, making the robot walk in the corresponding direction. In this example, the multiple touch sensors 110 can be arranged either under the display screen 320 or at positions adjacent to the display screen 320. When they are arranged under the display screen 320, the display screen 320 has a touch detection function, and the user can control the robot's travel through the display screen 320.
Specifically, when the multiple touch sensors 110 arranged at (but not limited to) the belly 13 detect the user's slide operation and its direction, they send the direction of the slide operation and their own device identifiers to the central control element 20. The central control element 20 determines where the slide operation occurred according to the device identifiers, encodes the occurrence position and the direction of the slide operation to obtain a storage index, and queries the memory to determine the corresponding feedback behavior, such as "advance forward 50 centimetres", where "forward" corresponds to the direction of the slide operation. The central control element 20 then controls the legs among the moving parts 330 to make the robot advance forward 50 centimetres; that is, the user's touch operation causes the robot to perform the action feedback behavior "advance forward 50 centimetres".
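A sketch of this control path, reusing the hypothetical LOCATION_BY_ID and lookup_feedback from the sketches above; the walk handler is likewise an assumption, not the patent's interface:

```python
# Hypothetical handler: a slide detected on the belly sensors becomes a walk
# command for the leg moving parts.
def on_slide(device_id: str, direction: str, speed: float) -> None:
    location = LOCATION_BY_ID[device_id]                 # e.g. "belly"
    behavior = lookup_feedback(location, {"type": "slide",
                                          "direction": direction,
                                          "speed": speed})
    if behavior and behavior[0] == "move":
        _, heading, distance_m = behavior
        walk(heading, distance_m)                        # drive the legs

def walk(heading: str, distance_m: float) -> None:
    # Placeholder for the real motion controller.
    print(f"walking {heading} for {distance_m} m")

# Example: an upward slide on a belly sensor makes the robot walk forward 0.5 m.
on_slide("touch-belly-1", "up", speed=0.2)
```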
For another example, the feedback device 30 includes the audio player 310, and the detector 10 includes at least one vibration sensor 120 arranged at the robot's shoulder, for detecting the user's tapping operations. When a vibration sensor detects that the user has triggered a tapping operation and detects the number and force of the taps, it sends the detected information to the central control element 20; the central control element 20 determines the corresponding sound file according to the detected information and the position where the tapping operation occurred, and controls the audio player 310 to play that sound file. In addition, the detector 10 can also include multiple touch sensors 110 arranged on the robot's belly, for example arranged below the display screen 320, for detecting the direction of a touch slide. When they detect that the user has triggered a slide operation in some direction, the direction of the slide operation is determined, and the central control element 20 controls the playback volume of the audio player 310 according to that direction while a sound file is playing: for example, when the detected slide direction is from the top of the display screen to its bottom, the playback volume of the audio player 310 is reduced; conversely, when the detected slide direction is from the bottom of the display screen to its top, the playback volume of the audio player 310 is increased. Control of the feedback device 30 through the detector 10 is thereby realized.
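A minimal sketch of this volume rule, with an assumed player object standing in for the audio player 310 (the class and step size are illustrative):

```python
# Hypothetical volume rule: vertical slides on the belly touch panel adjust
# playback volume while a sound file is playing.
class AudioPlayerStub:
    def __init__(self) -> None:
        self.volume = 0.5                     # 0.0 .. 1.0

    def adjust(self, delta: float) -> None:
        self.volume = max(0.0, min(1.0, self.volume + delta))

player = AudioPlayerStub()

def on_vertical_slide(direction: str) -> None:
    if direction == "down":                   # top of screen -> bottom
        player.adjust(-0.1)                   # reduce volume
    elif direction == "up":                   # bottom of screen -> top
        player.adjust(+0.1)                   # increase volume
```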
As yet another example, the feedback device 30 includes the display screen 320, and the detector 10 includes multiple touch sensors 110 arranged at the robot's belly, for detecting the user's touch operations. When the multiple touch sensors 110 detect that the user has triggered a slide operation and detect the speed of the slide, they send the detected information to the central control element 20. Based on the positions of these touch sensors 110 and the touch information they detected, the central control element 20 determines the corresponding feedback behavior, for example a picture file, and then controls the display screen 320 to display that picture file. The above sliding speed is only an example; in practice, the detected touch operation information can also include, for example, the touch duration, the coverage area, and so on. The picture file can be, for example, a smiley-face picture, a text-prompt picture, and so on.
In this embodiment, one or more detectors are arranged on the robot body so that the interactive operations the user triggers on the robot can be detected by the detectors. Because different detector locations and different interactive operation behaviors can be preset to correspond to different feedback behaviors, after the central control element receives the interactive operation information sent by a detector, it can determine the corresponding feedback behavior based on the interactive operation and the preset location of the corresponding detector, and control the feedback device to perform that feedback behavior. With this scheme, the robot can realize human-computer interaction based on the interactive operation behavior the user triggers on the body and the position of that operation, so that human-computer interaction is no longer limited to voice and touch interaction on a display screen, extending the robot's interactive forms.
Fig. 2 is a structural schematic diagram of robot embodiment two provided by an embodiment of the utility model. As shown in Fig. 2, on the basis of the embodiment shown in Fig. 1, the robot also includes an image acquisition device 50 and an image recognizer 60.
The image acquisition device 50 is arranged on the robot body 1 and is used to acquire an image of the interactive object; the image recognizer 60 is connected to the central control element 20 and the image acquisition device 50, and is used to recognize the image of the interactive object in order to obtain the features of the interactive object.
The image acquisition device 50 includes but is not limited to a camera 510, which is generally arranged at a high position on the robot body 1, such as at the head 11, and is specifically used to acquire image information of the user interacting with the robot, i.e. image information of the interactive object. In this embodiment, the image information of the interactive object can be acquired by photographing or filming the interactive object; after the image information of the interactive object is collected, it is sent to the image recognizer 60.
The image recognizer 60 can be an electronic device with an image recognition function, specifically used to recognize the image of the interactive object in order to obtain the features of the interactive object. In this embodiment, the features of the interactive object include but are not limited to the sex of the interactive object (male, female), the age of the interactive object (adult, child), and so on.
In this embodiment, by recognizing the features of the interactive object, targeted interaction feedback can be realized. The central control element 20 determines the feedback behavior not only based on the interactive operation detected by the detector 10 and the trigger position of that operation, but also in combination with the features of the interactive object, thereby realizing personalized interaction feedback for the interactive object.
For example, a child standing in front of the robot touches the robot's belly (triggering a touch interactive operation). The image acquisition device 50 photographs the child, and the image recognizer 60 recognizes the captured photo and determines that the feature of the interactive object is "child". The central control element 20 then, according to the touch interactive operation detected by the touch sensors 110 arranged at the robot's belly 13 and the specific position where the touch interactive operation occurred (the belly 13), combined with the features of the interactive object, controls the relevant feedback devices 30 to perform the corresponding feedback behaviors, such as emitting a laugh, saying the dialogue "Hello, little friend", and at the same time extending an arm to shake hands. In this embodiment, based on the feature recognition of the interactive object, the detection of the interactive operation triggered by the interactive object, and the determination of the position of that operation, targeted interaction feedback can be realized and the interactive intelligence of the robot improved.
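Extending the earlier hypothetical lookup, the recognized feature of the interactive object can simply join the key alongside the location and operation; again a sketch, with illustrative labels and behaviors:

```python
# Hypothetical personalized lookup: the recognized feature of the interactive
# object ("child" / "adult") joins the key alongside location and operation.
PERSONALIZED_MEMORY = {
    ("belly", "touch", "child"): [("audio", "laugh.wav"),
                                  ("audio", "hello_little_friend.wav"),
                                  ("move", "extend_arm_handshake")],
    ("belly", "touch", "adult"): [("audio", "greeting.wav")],
}

def personalized_feedback(location: str, op_type: str, feature: str):
    # Fall back to an empty behavior list if no personalized match is stored.
    return PERSONALIZED_MEMORY.get((location, op_type, feature), [])
```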
Through the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be realized by software plus a necessary general hardware platform, and of course also by hardware. Based on this understanding, the above technical schemes, or rather the part that contributes beyond the prior art, can be embodied in the form of a software product. This computer software product can be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc, and includes a number of instructions to cause a computer device (which can be a personal computer, a server, a network device, and so on) to perform the methods described in each embodiment or in some parts of the embodiments.
Finally, it should be noted that the above embodiments are only used to illustrate the technical schemes of the utility model, not to limit them. Although the utility model has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that they can still modify the technical schemes described in the foregoing embodiments or replace some of the technical features with equivalents, and that such modifications or replacements do not depart the essence of the corresponding technical schemes from the spirit and scope of the technical schemes of the embodiments of the utility model.

Claims (5)

1. A robot, characterized by including:
a detector arranged on the robot body, for detecting an interactive operation;
a central control element that controls a feedback device to perform a feedback behavior according to the interactive operation and the location of the detector corresponding to the interactive operation; and
a feedback device that performs the feedback behavior;
an image acquisition device arranged on the robot body, for acquiring an image of an interactive object; and
an image recognizer that recognizes the image of the interactive object and obtains the features of the interactive object, the image recognizer being connected to the central control element and the image acquisition device;
the central control element is further used to determine the feedback behavior according to the interactive operation, the trigger position of the interactive operation, and the features of the interactive object.
2. The robot according to claim 1, characterized in that the detector includes at least one touch sensor arranged at different locations on the robot body, and/or at least one vibration sensor;
the touch sensor is used to detect the direction and/or speed of a touch slide;
the vibration sensor is used to detect the number and/or force of tapping operations.
3. The robot according to claim 2, characterized in that the feedback device includes at least one of an audio player, a display screen, and a moving part.
4. The robot according to claim 3, characterized in that the feedback device includes the moving part, and the at least one touch sensor includes a touch sensor arranged under the display screen, for detecting the direction of a touch slide;
the central control element is further used to control the moving part according to the direction of the touch slide, so that the robot walks in the corresponding direction.
5. The robot according to any one of claims 1 to 4, characterized by also including:
a memory for storing various feedback behaviors.
CN201621230925.6U 2016-11-16 2016-11-16 Robot Active CN206475183U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201621230925.6U CN206475183U (en) 2016-11-16 2016-11-16 Robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201621230925.6U CN206475183U (en) 2016-11-16 2016-11-16 Robot

Publications (1)

Publication Number Publication Date
CN206475183U true CN206475183U (en) 2017-09-08

Family

ID=59757196

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201621230925.6U Active CN206475183U (en) 2016-11-16 2016-11-16 Robot

Country Status (1)

Country Link
CN (1) CN206475183U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106393113A (en) * 2016-11-16 2017-02-15 上海木爷机器人技术有限公司 Robot and interactive control method for robot
CN108687768A (en) * 2018-04-02 2018-10-23 深圳臻迪信息技术有限公司 One kind is paddled robot and robot data inputting method of paddling


Similar Documents

Publication Publication Date Title
US10126826B2 (en) System and method for interaction with digital devices
CN106393113A (en) Robot and interactive control method for robot
US8421634B2 (en) Sensing mechanical energy to appropriate the body for data input
CN112926423B (en) Pinch gesture detection and recognition method, device and system
US20170083086A1 (en) Human-Computer Interface
CN102016765A (en) Method and system of identifying a user of a handheld device
JP2017529635A5 (en)
CN109933206B (en) Finger non-contact drawing method and system based on Leap Motion
TW201430633A (en) Tactile feedback system and method for providing tactile feedback
CN105324736A (en) Techniques for touch and non-touch user interaction input
CN112383805A (en) Method for realizing man-machine interaction at television end based on human hand key points
CN206475183U (en) Robot
CN107787478A (en) Content item is selected in user interface display
CN106601217A (en) Interactive-type musical instrument performing method and device
Vyas et al. Gesture recognition and control
CN111142663B (en) Gesture recognition method and gesture recognition system
CN103376884B (en) Man-machine interaction method and its device
CN111860086A (en) Gesture recognition method, device and system based on deep neural network
Nigam et al. A complete study of methodology of hand gesture recognition system for smart homes
Lin et al. Action recognition for human-marionette interaction
Jamaludin et al. Dynamic hand gesture to text using leap motion
CN111050266B (en) Method and system for performing function control based on earphone detection action
Thakar et al. Hand gesture controlled gaming application
WO2019134606A1 (en) Terminal control method, device, storage medium, and electronic apparatus
Vasanthagokul et al. Virtual Mouse to Enhance User Experience and Increase Accessibility

Legal Events

Date Code Title Description
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 200336 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Patentee after: SHANGHAI MROBOT TECHNOLOGY Co.,Ltd.

Address before: 200336 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Patentee before: SHANGHAI MUYE ROBOT TECHNOLOGY Co.,Ltd.

CP01 Change in the name or title of a patent holder

Address after: 200336 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Patentee after: Shanghai Zhihui Medical Technology Co.,Ltd.

Address before: 200336 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Patentee before: SHANGHAI MROBOT TECHNOLOGY Co.,Ltd.

Address after: 200336 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Patentee after: Shanghai zhihuilin Medical Technology Co.,Ltd.

Address before: 200336 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Patentee before: Shanghai Zhihui Medical Technology Co.,Ltd.

CP03 Change of name, title or address

Address after: 202150 room 205, zone W, second floor, building 3, No. 8, Xiushan Road, Chengqiao Town, Chongming District, Shanghai (Shanghai Chongming Industrial Park)

Patentee after: Shanghai Noah Wood Robot Technology Co.,Ltd.

Address before: 200336 402 rooms, No. 33, No. 33, Guang Shun Road, Shanghai

Patentee before: Shanghai zhihuilin Medical Technology Co.,Ltd.