CN104199321A - Emotion interacting type vehicle-mounted robot - Google Patents


Info

Publication number
CN104199321A
CN104199321A (Application CN201410394549.3A)
Authority
CN
China
Prior art keywords
vehicle
mechanical arm
arm assembly
mobile device
mounted mechanical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410394549.3A
Other languages
Chinese (zh)
Inventor
刘松珍
郭海锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201410394549.3A
Publication of CN104199321A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an implementation method for an emotion-interactive vehicle-mounted robot. The robot may take an integrated form or a detached form; in either case it consists of a mobile device, a vehicle-mounted mechanical-arm assembly and a power supply. When the mobile device is connected to the mechanical-arm assembly, the two combine into an integrated emotion-interactive vehicle-mounted robot; when the mobile device is separated from the assembly, they form a detached emotion-interactive vehicle-mounted robot. The vehicle-mounted robot exhibits moods like a person and can interact with the user through emotional responses.

Description

Emotion-interactive vehicle-mounted robot
Technical field
The present invention relates to techniques for combining a mobile device with a robot, and in particular to an emotion-interactive vehicle-mounted robot.
Background technology
To date, the number of smartphone users in China has exceeded 500 million, and applications built around the smartphone keep emerging in an endless stream. Most of them, however, remain simple Apps; application products that combine the smartphone with hardware are comparatively rare.
Moreover, whether in an existing pure App product or in a product combined with hardware, the smartphone serves merely as the screen of a "microcomputer". For the user, what has changed is only the convenience of carrying and viewing; in essence, the mode of interaction between computer and person has not changed since the "computer" was born.
Compared with the traditional computer, however, the smartphone has unique advantages: portability, a rich set of sensors, and round-the-clock connectivity over the cellular network, which make it a distinctive mobile terminal device. How to exploit these characteristics more fully, combine the smartphone more closely with people, give end users a friendlier experience, and in particular pair it with hardware so that its capabilities are brought into full play for efficiency and enjoyment in people's work and life, urgently calls for careful research and development.
In recent years a few inventors have paid attention to smartphone applications and proposed schemes built around the smartphone. CN201210387268.6 discloses a surveillance robot for the home-security field with a smartphone as its main body; it mainly comprises a smartphone and a robot platform and realizes functions such as intelligent monitoring and remote human-computer interactive control. CN201210189512.8 discloses a smartphone-based Rubik's-cube-solving robot: the smartphone collects the cube's color information, computes the restoration sequence for the scrambled cube, and sends the solution in real time over Bluetooth to the robot's control module, whose execution unit restores the third-order cube. CN201210337637.0 discloses a sweeping robot navigated by a smartphone: external information is obtained through the phone's sensors, avoiding the need to mount additional navigation, positioning and sensor hardware on the robot itself, so the scheme is simple to implement and sweeps efficiently. CN201010543931.8 discloses a robot human-computer interaction system based on the iPhone, building a portable general-purpose interaction system on the smartphone so that convenient and efficient remote robot teleoperation can be realized anywhere, anytime, without extra equipment. CN201310160936.6 discloses an information-driven emotional vehicle-mounted robot that expresses various information concisely through anthropomorphic and/or zoomorphic expressions accompanied by corresponding sounds, so that the user understands everything at a glance and a listen. To realize it, the user only needs to obtain the robot body and install an App on the smartphone to own an emotional vehicle-mounted robot providing customized, personalized information services.
As can be seen from the above, most existing inventions focus on using the smartphone as a controller, substituting it for the hardware computing unit used in conventional robot development. This reduces hardware development cost but brings little functional improvement; such work remains at the level of "hardware replacement", with a low degree of "intelligence".
The object of the present invention, starting from users' actual needs, is to use the smartphone as the computing device and design a vehicle-mounted mechanical-arm assembly that combines with it, so as to construct a vehicle-mounted robot that can be assembled either in an integrated form or in a detached form, and that, like a person, has moods and can interact with people through emotional responses. Unlike existing inventions in which the smartphone merely serves as the computing and control device, the present invention not only designs the vehicle-mounted mechanical-arm assembly but, more importantly, designs in depth the robot's interactive functions and their implementation so that the robot can interact with the user; this is the invention's outstanding and substantial contribution.
Summary of the invention
The problem the embodiments of the present invention address is to provide an emotion-interactive vehicle-mounted robot, so as to overcome the single form of existing vehicle-mounted equipment and its unfriendly mode of interaction with people. Based on the technical scheme of the present invention, any user can turn a smartphone into an emotion-interactive vehicle-mounted robot by means of the vehicle-mounted mechanical-arm assembly of the invention. For example, when the user places the smartphone on the mechanical-arm assembly, the two immediately combine into an integrated emotion-interactive vehicle-mounted robot that can produce various expressions, actions, sounds, text and pictures/videos.
To achieve these goals, the present invention provides an implementation method for an emotion-interactive vehicle-mounted robot, covering an integrated form and a detached form, each composed of a mobile device, a vehicle-mounted mechanical-arm assembly and a power supply.
The emotion-interactive vehicle-mounted robot is characterized in that it is composed of three parts: a mobile device, a vehicle-mounted mechanical-arm assembly and a power supply;
The vehicle-mounted mechanical-arm assembly is either placed on the dashboard top surface, or stuck to the windshield, or hung at an air-conditioning outlet, or hung on the interior rear-view mirror;
The vehicle-mounted mechanical-arm assembly consists of external components and internal components; the external components include, but are not limited to, a base, a mechanical arm and a gripper; the mechanical arm is formed of one or more arm segments connected by joints; one end of the mechanical arm is connected to the base and the other end to the gripper;
The power supply is either built into the mechanical-arm assembly or placed outside it as a stand-alone device; it powers the mechanical-arm assembly and may simultaneously serve as a mobile power bank;
After the gripper senses the mobile device in front of it, it seizes and releases the device by opening and closing, or clings to and holds the device by sticking or suction;
After the gripper has fixed the mobile device, it tilts the device forward or backward, or rotates it clockwise or counterclockwise, or holds it still;
Data are transmitted between the mobile device and the mechanical-arm assembly in a wired or wireless manner;
The mechanical-arm assembly receives instructions sent by the mobile device and acts by driving the base to rotate and/or driving the mechanical arm and/or driving the gripper;
While the mechanical-arm assembly performs an action, the App interface on the mobile device either displays the robot's mood by way of facial expressions, or displays neither expression nor mood;
Emotion-interactive vehicle-mounted robots communicate with one another wirelessly.
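The disclosure leaves the wired/wireless instruction path between the mobile device and the mechanical-arm assembly unspecified. As a purely hypothetical illustration, a minimal length-prefixed command frame for such a link might look like the sketch below; the component codes, the JSON layout and all names are assumptions, not part of the patent.

```python
import json
import struct

# Hypothetical codes for the three driven parts named in the text.
COMPONENT_BASE, COMPONENT_ARM, COMPONENT_GRIPPER = 0, 1, 2

def encode_command(component: int, action: str, speed: float = 0.0,
                   angle: float = 0.0) -> bytes:
    """Pack one arm-assembly command as a length-prefixed JSON frame."""
    payload = json.dumps({"component": component, "action": action,
                          "speed": speed, "angle": angle}).encode("utf-8")
    # 2-byte big-endian length prefix so frames can be delimited on a
    # serial or Bluetooth byte stream
    return struct.pack(">H", len(payload)) + payload

def decode_command(frame: bytes) -> dict:
    """Unpack a frame produced by encode_command."""
    (length,) = struct.unpack(">H", frame[:2])
    return json.loads(frame[2:2 + length].decode("utf-8"))

cmd = encode_command(COMPONENT_BASE, "rotate_cw", speed=30.0, angle=90.0)
print(decode_command(cmd)["action"])  # rotate_cw
```

The length prefix is one conventional way to delimit messages on a stream transport; any real product could equally use a binary field layout or an existing RPC framing.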
The beneficial effects of the present invention are:
(1) With the technical scheme provided by the invention, the user only needs to obtain the vehicle-mounted mechanical-arm assembly and install the App on a smartphone to own an emotion-interactive vehicle-mounted robot;
(2) The user can set parameters through the App to control and interact with the robot; the robot can perceive information about the external environment and give corresponding emotional responses to what it perceives;
(3) The invention follows the principle of simplicity, expressing various information concisely through anthropomorphic and/or zoomorphic expressions accompanied by corresponding sounds, so that the user understands everything at a glance and a listen.
Brief description of the drawings
Fig. 1 is a module diagram of an integrated emotion-interactive vehicle-mounted robot according to one embodiment of the invention;
Fig. 2 is a module diagram of a detached emotion-interactive vehicle-mounted robot according to one embodiment of the invention;
Fig. 3 shows a typical application procedure according to one embodiment of the invention.
Embodiment
The preferred embodiments of the present invention provide an emotion-interactive vehicle-mounted robot. The robot is composed of a mobile device, a vehicle-mounted mechanical-arm assembly and a power supply, and in concrete implementation can take either an integrated form or a detached form. Each robot powers its mechanical-arm assembly from a built-in or an external power supply; when the mobile device and the robot body are combined into the integrated robot, the power supply can also charge the mobile device at the same time.
The emotion-interactive vehicle-mounted robot is divided into an integrated type and a detached type;
In the integrated type, the mobile device is directly connected to the mechanical-arm assembly and combined with it into one body, forming the integrated emotion-interactive vehicle-mounted robot;
In the detached type, the mobile device is not directly connected to the mechanical-arm assembly; the two remain separate, forming the detached emotion-interactive vehicle-mounted robot.
The mobile device refers to an intelligent portable computing device that has an operating system and on which third-party Apps can be installed, including but not limited to mobile devices running iOS, Android or Windows;
The internal components of the vehicle-mounted mechanical-arm assembly include, but are not limited to, a servo group, a motor group, transmissions, sensors, a chip and a printed circuit board;
The servo group and/or motor group are responsible for driving the base, the mechanical arm and the gripper to act;
Driven by the servo group and/or motor group, the base of the mechanical-arm assembly rotates clockwise or counterclockwise within a 360-degree range at a given speed and angle;
Driven by the servo group and/or motor group, the mechanical arm moves freely under given control instructions;
Driven by the servo group and/or motor group, the gripper opens and closes under given control instructions, and at the same time tilts forward, tilts backward, rotates clockwise or rotates counterclockwise.
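The patent does not state how "given speed and angle" commands reach the servos. The sketch below shows one conventional realization with hobby servos, assuming a standard 500-2500 µs pulse-width range over 180 degrees of travel and a fixed control tick; both figures are assumptions, not values from the disclosure.

```python
def servo_pulse_us(angle_deg: float,
                   min_us: int = 500, max_us: int = 2500,
                   travel_deg: float = 180.0) -> int:
    """Map a target joint angle to a hobby-servo PWM pulse width (µs)."""
    angle = max(0.0, min(travel_deg, angle_deg))  # clamp to travel range
    return round(min_us + (max_us - min_us) * angle / travel_deg)

def base_rotation_steps(target_deg: float, speed_deg_per_s: float,
                        tick_s: float = 0.02):
    """Break a base rotation into per-tick angle setpoints at a given speed,
    so the rotation proceeds smoothly rather than jumping to the target."""
    steps, pos = [], 0.0
    step = speed_deg_per_s * tick_s
    while pos < target_deg:
        pos = min(target_deg, pos + step)
        steps.append(pos)
    return steps

print(servo_pulse_us(90))  # 1500
```

Interpolating setpoints per control tick, as `base_rotation_steps` does, is a common way to honor a commanded rotation speed on position-controlled servos.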
The power supply provides electric energy to the mechanical-arm assembly either in the form of a battery or in the form of an external power source; the battery is either built into the mechanical-arm assembly or mounted on it.
The mood pictures displayed by the mobile-device App comprise anthropomorphic facial expressions and/or sounds and/or pictures and/or videos and/or text;
The anthropomorphic facial expressions include, but are not limited to, happiness, anger, sadness, joy, fatigue, sleepiness, weariness and hunger;
The robot's mood is embodied in the App's expressions and/or sounds and/or pictures and/or videos and/or text and/or in the actions performed by the mechanical-arm assembly.
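How each mood maps onto the listed output channels is left open by the text. A hypothetical lookup table in its spirit could look like this; the mood names, asset file names and arm actions are all invented for illustration.

```python
# Hypothetical mapping from a mood state to the output channels named in
# the text: facial expression, sound, and mechanical-arm action.
MOOD_OUTPUTS = {
    "happy": {"expression": "smile", "sound": "chime.wav",
              "arm_action": "nod_forward"},
    "tired": {"expression": "droopy_eyes", "sound": "yawn.wav",
              "arm_action": "lean_back"},
    "angry": {"expression": "frown", "sound": "grumble.wav",
              "arm_action": "shake"},
}

def express(mood: str) -> dict:
    """Return the channel -> asset/action plan for a mood; unknown moods
    fall back to a neutral expression with no sound or arm action."""
    return MOOD_OUTPUTS.get(mood, {"expression": "neutral"})

print(express("happy")["arm_action"])  # nod_forward
```

A table of this shape keeps the expression assets, sounds and arm motions for one mood in a single place, which is convenient when the set of moods grows.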
Figs. 1 and 2 show embodiments of the invention; Fig. 1 is the module diagram of the integrated emotion-interactive vehicle-mounted robot of the invention.
Reference 101 denotes the integrated emotion-interactive vehicle-mounted robot, which comprises mobile device 11, vehicle-mounted mechanical-arm assembly 12 and power supply 13. Power supply 13 can be built into mechanical-arm assembly 12 and integrated with it.
Mechanical-arm assembly 12 fixes mobile device 11 onto itself by the gripper, combining the two into the integrated robot, whose mood is embodied in the App's expressions and/or sounds and/or pictures and/or videos and/or text and/or in the actions performed by the assembly;
Mobile device 11 passes the App's instructions to mechanical-arm assembly 12 in a wired or wireless manner, and the two can exchange data bidirectionally over the same link;
Mechanical-arm assembly 12 acts according to the instructions sent by the App on mobile device 11;
The App on mobile device 11 gives emotional responses according to the information and/or events it perceives;
The information perceived by the App includes, but is not limited to, information from the mobile device's own sensors, information from sensors in the mechanical-arm assembly, information sent by other emotion-interactive vehicle-mounted robots, and information sent by a server.
Once mobile device 11 and mechanical-arm assembly 12 are connected, the device is charged either by the assembly's built-in power supply or by its external power supply.
Upon receiving an instruction from the App, the servo group and/or motor group of mechanical-arm assembly 12 drive the base through the transmission to rotate clockwise and/or counterclockwise within a 360-degree range at a given speed and angle;
Upon receiving an instruction from the App, the servo group and/or motor group also drive the mechanical arm through the transmission to act about its joints;
Upon receiving an instruction from the App, the servo group and/or motor group also drive the gripper through the transmission to tilt forward and/or backward and/or rotate clockwise and/or counterclockwise at a given speed and acceleration, or to open or close;
The emotional responses the App on mobile device 11 gives to perceived information and/or events comprise anthropomorphic facial expressions and/or sounds and/or pictures and/or videos and/or text and/or actions performed by the mechanical-arm assembly.
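The perceive-then-respond loop described above can be sketched as a small dispatcher, under the assumption that each perceived item carries a source and a kind; the class names, event kinds and handlers below are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Event:
    source: str   # e.g. "phone_sensor", "arm_sensor", "peer_robot", "server"
    kind: str
    value: float = 0.0

class EmotionEngine:
    """Dispatch perceived events to registered emotive responses."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[Event], str]] = {}

    def on(self, kind: str, handler: Callable[[Event], str]) -> None:
        """Register the response given when an event of this kind occurs."""
        self._handlers[kind] = handler

    def perceive(self, event: Event) -> str:
        """Return the mood for an event; unknown kinds stay neutral."""
        handler = self._handlers.get(event.kind)
        return handler(event) if handler else "neutral"

engine = EmotionEngine()
engine.on("engine_start", lambda e: "happy")
engine.on("harsh_braking",
          lambda e: "startled" if e.value > 0.5 else "neutral")
print(engine.perceive(Event("phone_sensor", "engine_start")))  # happy
```

The returned mood string could then be fed to the expression/sound/arm-action channels; the four source strings mirror the four information sources the text enumerates.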
Fig. 2 is the module diagram of the detached emotion-interactive vehicle-mounted robot of the invention.
Detached emotion-interactive vehicle-mounted robot 201 is composed of mobile device 11 and assembly 202, where assembly 202 comprises vehicle-mounted mechanical-arm assembly 12 and power supply 13. In this form, mobile device 11 and mechanical-arm assembly 12 are separate and unconnected.
Mobile device 11 leaves the gripper and separates from mechanical-arm assembly 12; in this separate form the two constitute the detached robot, whose mood is embodied in the App's expressions and/or sounds and/or pictures and/or videos and/or text and/or in the actions performed by the assembly;
Mobile device 11 passes the App's instructions to the mechanical-arm assembly in a wired or wireless manner, and the two exchange data bidirectionally over the same link;
Mechanical-arm assembly 12 acts according to the instructions sent by the App on the mobile device;
The App on mobile device 11 gives emotional responses according to the information and/or events it perceives.
Upon receiving an instruction from the App, the servo group and/or motor group of mechanical-arm assembly 12 drive the base through the transmission to rotate clockwise and/or counterclockwise within a 360-degree range at a given speed and angle;
Upon receiving an instruction from the App, the servo group and/or motor group also drive the mechanical arm through the transmission to act about its joints;
Upon receiving an instruction from the App, the servo group and/or motor group also drive the gripper through the transmission to tilt forward and/or backward and/or rotate clockwise and/or counterclockwise at a given speed and acceleration, or to open or close.
The emotional responses the App gives to perceived information and/or events comprise anthropomorphic facial expressions and/or sounds and/or pictures and/or videos and/or text and/or actions performed by the mechanical-arm assembly;
The information perceived by the App includes, but is not limited to, information from the mobile device's own sensors, information from sensors in the mechanical-arm assembly, information sent by other emotion-interactive vehicle-mounted robots, and information sent by a server.
Fig. 3 shows a typical application procedure of an embodiment of the invention.
To further illustrate a concrete application of the invention, the typical scenario shown in Fig. 3 is narrated, taking a smartphone as the typical mobile device. In this scenario, mechanical-arm assembly 12 lies on the dashboard top surface and is powered by a built-in or an external power supply.
Suppose the user starts the engine to begin a trip. After the user starts the engine at 301, mechanical-arm assembly 12 perceives the engine start and, through the servo group and/or motor group, drives the mechanical arm to stretch forward at 302 while the gripper opens. The user then places the smartphone in front of the gripper; after the gripper senses the smartphone at 303, it closes automatically and seizes it. The gripper can now perform actions while holding the phone, such as tilting it forward or backward; at the same time the smartphone App starts and responds interactively to the user with expressions. As the user drives, the smartphone perceives information through its own sensors and/or the sensors in the mechanical-arm assembly, and/or information sent by other emotion-interactive vehicle-mounted robots, and/or information sent by the server, and processes what it perceives into events. When an event of some class occurs, the robot perceives the event information at 304 and launches the emotional response assigned to that class at 305. The response may be presented through some combination of the modes 306 to 311: triggering the mechanical-arm assembly to act (306) and/or changing the expression shown by the App (307) and/or playing a corresponding sound (308) and/or displaying related text (309) and/or displaying a related picture (310) and/or displaying a related video (311). Similar situations keep occurring during the drive, and the robot keeps giving similar emotional responses. When the user arrives at the destination and shuts off the engine at 312, the robot perceives the engine stop and, at 313, retracts the mechanical arm and gripper, folding them onto the base; the user removes the smartphone at 314, completing the trip.
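The trip sequence above can be read as a small state machine over the numbered steps. The sketch below encodes it with illustrative state and signal names; the patent itself prescribes no such encoding.

```python
# Transitions follow the narrated flow: engine start -> arm extends and
# gripper opens -> phone gripped -> emotive responses during the trip ->
# engine off -> arm retracts -> phone removed. Names are invented.
TRANSITIONS = {
    ("idle", "engine_start"): "arm_extended",      # 301 -> 302
    ("arm_extended", "phone_detected"): "active",  # 303: gripper closes
    ("active", "event"): "active",                 # 304-311: responses
    ("active", "engine_off"): "retracting",        # 312 -> 313
    ("retracting", "phone_removed"): "idle",       # 314
}

def step(state: str, signal: str) -> str:
    """Advance the machine; unknown (state, signal) pairs leave it unchanged."""
    return TRANSITIONS.get((state, signal), state)

state = "idle"
for signal in ["engine_start", "phone_detected", "event",
               "engine_off", "phone_removed"]:
    state = step(state, signal)
print(state)  # idle
```

Modeling the flow this way makes it explicit that, for example, an engine-off signal is ignored before a phone has been gripped.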
The present invention has been described through specific embodiments. Those skilled in the art will appreciate that, without departing from the invention, various transformations and equivalent substitutions may be made, and various modifications may be made for particular situations or conditions without departing from its scope. The invention is therefore not limited to the disclosed specific embodiments and shall cover all embodiments falling within the scope of the claims.

Claims (10)

1. An emotion-interactive vehicle-mounted robot, characterized in that it is composed of three parts: a mobile device, a vehicle-mounted mechanical-arm assembly and a power supply;
The vehicle-mounted mechanical-arm assembly is either placed on the dashboard top surface, or stuck to the windshield, or hung at an air-conditioning outlet, or hung on the interior rear-view mirror;
The vehicle-mounted mechanical-arm assembly consists of external components and internal components; the external components include, but are not limited to, a base, a mechanical arm and a gripper; the mechanical arm is formed of one or more arm segments connected by joints; one end of the mechanical arm is connected to the base and the other end to the gripper;
The power supply is either built into the mechanical-arm assembly or placed outside it as a stand-alone device; it powers the mechanical-arm assembly and may simultaneously serve as a mobile power bank;
After the gripper senses the mobile device in front of it, it seizes and releases the device by opening and closing, or clings to and holds the device by sticking or suction;
After the gripper has fixed the mobile device, it tilts the device forward or backward, or rotates it clockwise or counterclockwise, or holds it still;
Data are transmitted between the mobile device and the mechanical-arm assembly in a wired or wireless manner;
The mechanical-arm assembly receives instructions sent by the mobile device and acts by driving the base to rotate and/or driving the mechanical arm and/or driving the gripper;
While the mechanical-arm assembly performs an action, the App interface on the mobile device either displays the robot's mood by way of facial expressions, or displays neither expression nor mood;
Emotion-interactive vehicle-mounted robots communicate with one another wirelessly.
2. The emotion-interactive vehicle-mounted robot of claim 1, characterized in that:
The emotion-interactive vehicle-mounted robot is divided into an integrated type and a detached type;
In the integrated type, the mobile device is directly connected to the mechanical-arm assembly and combined with it into one body, forming the integrated emotion-interactive vehicle-mounted robot;
In the detached type, the mobile device is not directly connected to the mechanical-arm assembly; the two remain separate, forming the detached emotion-interactive vehicle-mounted robot.
3. The emotion-interactive vehicle-mounted robot of claim 1, characterized in that:
The mobile device refers to an intelligent portable computing device that has an operating system and on which third-party Apps can be installed, including but not limited to mobile devices running iOS, Android or Windows;
The internal components of the mechanical-arm assembly include, but are not limited to, a servo group, a motor group, transmissions, sensors, a chip and a printed circuit board;
The servo group and/or motor group are responsible for driving the base, the mechanical arm and the gripper to act;
Driven by the servo group and/or motor group, the base rotates clockwise or counterclockwise within a 360-degree range at a given speed and angle;
Driven by the servo group and/or motor group, the mechanical arm moves freely under given control instructions;
Driven by the servo group and/or motor group, the gripper opens and closes under given control instructions, and at the same time tilts forward, tilts backward, rotates clockwise or rotates counterclockwise.
4. The emotion-interactive vehicle-mounted robot of claim 1, characterized in that:
The power supply provides electric energy to the mechanical-arm assembly either in the form of a battery or in the form of an external power source; the battery is either built into the mechanical-arm assembly or mounted on it.
5. The emotion-interacting vehicle-mounted robot as claimed in claim 1, characterized in that:
The mood pictures displayed by the mobile device App comprise anthropomorphic facial expressions and/or sound and/or pictures and/or video and/or text;
The anthropomorphic facial expressions include but are not limited to joy, anger, sorrow, happiness, tiredness, and hunger;
The mood of the emotion-interacting vehicle-mounted robot is embodied by the App's facial expressions and/or sound and/or pictures and/or video and/or text, and/or by the actions performed by the vehicle-mounted mechanical arm assembly.
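The claim's "and/or" wording means one mood can be embodied through several output channels at once. A minimal sketch of such a mood-to-channel table, with mood names and channel values invented for illustration (the patent specifies neither):

```python
# Hypothetical mapping from a mood label to multimodal outputs: an on-screen
# expression, a sound clip, display text, and a mechanical-arm action.
MOOD_TABLE = {
    "joy":    {"expression": "smile", "sound": "chime_up.wav",
               "text": "Hello!", "arm_action": "wave"},
    "sorrow": {"expression": "frown", "sound": "chime_down.wav",
               "text": "...", "arm_action": "droop"},
    "tired":  {"expression": "half_closed_eyes", "sound": "yawn.wav",
               "text": "So sleepy", "arm_action": "slow_nod"},
}

def embody(mood: str, channels=("expression", "arm_action")) -> dict:
    """Select outputs for the requested channels (the claims' 'and/or')."""
    outputs = MOOD_TABLE.get(mood, {})
    return {ch: outputs[ch] for ch in channels if ch in outputs}

print(embody("joy"))  # {'expression': 'smile', 'arm_action': 'wave'}
```

An unknown mood simply produces no outputs, leaving the robot's current display unchanged.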
6. An integrated emotion-interacting vehicle-mounted robot, characterized in that:
The vehicle-mounted mechanical arm assembly fixes the mobile device onto itself by means of the gripper; the mobile device combines with the vehicle-mounted mechanical arm assembly to form an integrated emotion-interacting vehicle-mounted robot, whose mood is embodied by the App's facial expressions and/or sound and/or pictures and/or video and/or text, and/or by the actions performed by the vehicle-mounted mechanical arm assembly;
The mobile device performs bidirectional data transfer with the vehicle-mounted mechanical arm assembly in a wired or wireless manner; the mobile device passes the instructions sent by the App to the vehicle-mounted mechanical arm assembly;
The vehicle-mounted mechanical arm assembly performs actions according to the instructions sent by the mobile device App;
The mobile device App makes emotive responses according to perceived information and/or events;
After the mobile device and the vehicle-mounted mechanical arm assembly are connected, the mobile device is charged either by the assembly's built-in power supply or by the assembly's external power supply.
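The bidirectional wired-or-wireless link above carries App instructions one way and responses the other. The claims do not fix a frame format, so the following sketch assumes a simple JSON frame with a sequence number and an acknowledgement from the arm side:

```python
import json

# Hypothetical frame format: the claims only require a bidirectional link
# (wired or wireless) over which the App's instructions reach the arm.
def encode_instruction(seq: int, target: str, action: str, **params) -> bytes:
    frame = {"seq": seq, "target": target, "action": action, "params": params}
    return json.dumps(frame).encode("utf-8")

def decode_instruction(data: bytes) -> dict:
    return json.loads(data.decode("utf-8"))

def ack(frame: dict) -> bytes:
    # Bidirectional: the arm assembly answers with an acknowledgement frame
    # echoing the instruction's sequence number.
    return json.dumps({"seq": frame["seq"], "status": "ok"}).encode("utf-8")

wire = encode_instruction(1, "base", "rotate_cw", speed=30, angle=90)
frame = decode_instruction(wire)
print(decode_instruction(ack(frame)))  # {'seq': 1, 'status': 'ok'}
```

JSON over a byte stream works equally over USB, Bluetooth, or Wi-Fi, matching the claim's indifference to wired versus wireless transport.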
7. The integrated emotion-interacting vehicle-mounted robot as claimed in claim 6, characterized in that:
After the vehicle-mounted mechanical arm assembly receives an instruction sent by the App, the servo group and/or motor group drive the base through the transmission mechanism to rotate clockwise and/or counterclockwise within a 360-degree range at a given speed and angle;
After the vehicle-mounted mechanical arm assembly receives an instruction sent by the App, the servo group and/or motor group also drive the mechanical arm through the transmission mechanism to perform actions with its joints as axes;
After the vehicle-mounted mechanical arm assembly receives an instruction sent by the App, the servo group and/or motor group also drive the gripper through the transmission mechanism to pitch forward and/or tilt backward and/or rotate clockwise and/or rotate counterclockwise at a given speed and acceleration, or drive the gripper to open or close;
The mobile device App makes emotive responses according to perceived information and/or events, including anthropomorphic facial expressions and/or sound and/or pictures and/or video and/or text and/or actions performed by the vehicle-mounted mechanical arm assembly;
The information perceived by the mobile device App includes but is not limited to information obtained by the mobile device's own sensors, information obtained by the sensors in the vehicle-mounted mechanical arm assembly, and information sent by other emotion-interacting vehicle-mounted robots.
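The perception step above fuses three information sources: the phone's own sensors, the arm assembly's sensors, and messages from other robots. A rule-based sketch of that mapping, with all event names, thresholds, and mood labels invented for illustration:

```python
# Hypothetical perception-to-emotion rules. The claims only say the App
# responds emotively to information from the phone's sensors, the arm
# assembly's sensors, or another robot; these rules are assumptions.
def emotive_response(readings: dict) -> str:
    """Map fused sensor readings to a mood label."""
    if readings.get("arm_collision"):                # arm-assembly sensor
        return "startled"
    if readings.get("phone_accel_g", 0.0) > 0.6:     # phone accelerometer
        return "worried"                             # e.g. harsh braking
    if readings.get("peer_mood") == "joy":           # message from a peer robot
        return "joy"
    return "calm"

print(emotive_response({"phone_accel_g": 0.8}))  # worried
```

A production system would likely replace these fixed thresholds with a learned classifier, but the rule table makes the claim's three input sources explicit.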
8. A split emotion-interacting vehicle-mounted robot, characterized in that:
The mobile device is detached from the gripper and separated from the vehicle-mounted mechanical arm assembly; with the mobile device and the assembly in separate form, they constitute a split emotion-interacting vehicle-mounted robot, whose mood is embodied by the App's facial expressions and/or sound and/or pictures and/or video and/or text, and/or by the actions performed by the vehicle-mounted mechanical arm assembly;
The mobile device performs bidirectional data transfer with the vehicle-mounted mechanical arm assembly in a wired or wireless manner; the mobile device passes the instructions sent by the App to the vehicle-mounted mechanical arm assembly;
The vehicle-mounted mechanical arm assembly performs actions according to the instructions sent by the mobile device App;
The mobile device App makes emotive responses according to perceived information and/or events.
9. The split emotion-interacting vehicle-mounted robot as claimed in claim 8, characterized in that:
After the vehicle-mounted mechanical arm assembly receives an instruction sent by the App, the servo group and/or motor group drive the base through the transmission mechanism to rotate clockwise and/or counterclockwise within a 360-degree range at a given speed and angle;
After the vehicle-mounted mechanical arm assembly receives an instruction sent by the App, the servo group and/or motor group also drive the mechanical arm through the transmission mechanism to perform actions with its joints as axes;
After the vehicle-mounted mechanical arm assembly receives an instruction sent by the App, the servo group and/or motor group also drive the gripper through the transmission mechanism to pitch forward and/or tilt backward and/or rotate clockwise and/or rotate counterclockwise at a given speed and acceleration, or drive the gripper to open or close.
10. The split emotion-interacting vehicle-mounted robot as claimed in claim 8, characterized in that:
The mobile device App makes emotive responses according to perceived information and/or events, including anthropomorphic facial expressions and/or sound and/or pictures and/or video and/or text and/or actions performed by the vehicle-mounted mechanical arm assembly;
The information perceived by the mobile device App includes but is not limited to information obtained by the mobile device's own sensors, information obtained by the sensors in the vehicle-mounted mechanical arm assembly, and information sent by other emotion-interacting vehicle-mounted robots.
CN201410394549.3A 2014-08-07 2014-08-07 Emotion interacting type vehicle-mounted robot Pending CN104199321A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410394549.3A CN104199321A (en) 2014-08-07 2014-08-07 Emotion interacting type vehicle-mounted robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410394549.3A CN104199321A (en) 2014-08-07 2014-08-07 Emotion interacting type vehicle-mounted robot

Publications (1)

Publication Number Publication Date
CN104199321A true CN104199321A (en) 2014-12-10

Family

ID=52084622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410394549.3A Pending CN104199321A (en) 2014-08-07 2014-08-07 Emotion interacting type vehicle-mounted robot

Country Status (1)

Country Link
CN (1) CN104199321A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105382826A (en) * 2015-11-27 2016-03-09 四川巨鑫机器人信息技术有限公司 Vehicle-mounted type robot
CN108919804A (en) * 2018-07-04 2018-11-30 广东猪兼强互联网科技有限公司 A kind of intelligent vehicle Unmanned Systems
CN109710055A (en) * 2017-12-15 2019-05-03 蔚来汽车有限公司 The interaction control method of vehicle intelligent interactive system and vehicle-mounted interactive terminal
CN110641476A (en) * 2019-08-16 2020-01-03 广汽蔚来新能源汽车科技有限公司 Interaction method and device based on vehicle-mounted robot, controller and storage medium
CN113515060A (en) * 2021-07-05 2021-10-19 上海仙塔智能科技有限公司 Control script processing method and device, electronic equipment and storage medium


Similar Documents

Publication Publication Date Title
CN103324100B (en) A kind of emotion on-vehicle machines people of information-driven
CN104199321A (en) Emotion interacting type vehicle-mounted robot
CN110139732B (en) Social robot with environmental control features
CN103263094B (en) Intelligent induction glove system
CN103786061A (en) Vehicular robot device and system
CN108237918A (en) Vehicle and its control method
CN105264452A (en) Multi-purposed self-propelled device
CN104170360A (en) Intelligent response method of user equipment, and user equipment
CN105900074A (en) Method and apparatus for screen sharing
CN203689077U (en) Intelligent service robot
US10057676B2 (en) Wearable wirelessly controlled enigma system
US11922809B2 (en) Non-visual outputs for a smart ring
CN205968983U (en) Intelligent robot and control system thereof
Tombeng et al. Smart car: Digital controlling system using android smartwatch voice recognition
CN206421194U (en) A kind of Intelligent gesture controlling switch
CN106020459A (en) Intelligent spectacles as well as manipulation method and manipulation system of intelligent spectacles
CN106707512A (en) Intelligent AR (Augmented Reality) system with low power consumption and intelligent AR glasses
CN104317298A (en) Emotional interaction type mobile phone robot
CN205787669U (en) A kind of Smart Home robot
CN105867640A (en) Smart glasses and control method and control system of smart glasses
CN204256394U (en) A kind of affective interaction type on-vehicle machines people
CN102024345A (en) Domestic teaching mobile point-to-read robot
US20200009741A1 (en) Method for managing modular robot and robot thereof
US20230095484A1 (en) Electronic ink display for smart ring
Sergeyeva et al. Development of a Wi-Fi controlled mobile video device on the Arduino NANO basis

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20141210