CN108919804A - Intelligent vehicle unmanned driving system
- Publication number: CN108919804A (application CN201810726257.3A; granted as CN108919804B)
- Authority: CN (China)
- Prior art keywords: robot, user, affective state, indicate
- Prior art date: 2018-07-04
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
Abstract
The present invention provides an intelligent vehicle unmanned driving system comprising a radar mounted outside the vehicle, a human-vehicle interaction robot arranged inside the vehicle, and a vehicle control device. The radar is used to obtain information about obstacles in front of the vehicle, the human-vehicle interaction robot is used by the intelligent driving system to interact with the user, and the vehicle control device controls the vehicle according to the obstacle information and the interaction. Beneficial effects of the present invention: an intelligent vehicle unmanned driving system is provided in which human-vehicle interaction is carried out by the interaction robot, improving the user's driving experience.
Description
Technical field
The present invention relates to the technical field of intelligent driving, and in particular to an intelligent vehicle unmanned driving system.
Background technique
With social development and economic progress, various intelligent driving systems have appeared. However, existing intelligent driving systems interact with the user unsatisfactorily, resulting in a poor user experience.
With the all-round development of artificial intelligence, robotics has also entered a period of rapid growth, and robots are gradually being applied in many fields of intelligent human-computer interaction and cooperation.
Summary of the invention
In view of the above problems, the present invention aims to provide an intelligent vehicle unmanned driving system.
The purpose of the present invention is achieved by the following technical solution:
An intelligent vehicle unmanned driving system is provided, comprising a radar mounted outside the vehicle, a human-vehicle interaction robot arranged inside the vehicle, and a vehicle control device. The radar is used to obtain information about obstacles in front of the vehicle, the human-vehicle interaction robot is used by the intelligent driving system to interact with the user, and the vehicle control device is used to control the vehicle according to the obstacle information and the interaction.
Beneficial effects of the present invention: an intelligent vehicle unmanned driving system is provided in which human-vehicle interaction is carried out by a human-vehicle interaction robot, improving the user's driving experience.
Detailed description of the invention
The present invention is further described below with reference to the accompanying drawings. The embodiments in the drawings do not constitute any limitation of the invention; for those of ordinary skill in the art, other drawings can be obtained from the following drawings without creative effort.
Fig. 1 is a structural schematic diagram of the invention.
Reference numerals: radar 1, human-vehicle interaction robot 2, vehicle control device 3.
Specific embodiment
The invention is further described with the following embodiments.
Referring to Fig. 1, the intelligent vehicle unmanned driving system of this embodiment comprises a radar 1 mounted outside the vehicle, a human-vehicle interaction robot 2 arranged inside the vehicle, and a vehicle control device 3. The radar 1 is used to obtain information about obstacles in front of the vehicle, the human-vehicle interaction robot 2 is used by the intelligent driving system to interact with the user, and the vehicle control device 3 is used to control the vehicle according to the obstacle information and the interaction.
This embodiment provides an intelligent vehicle unmanned driving system in which human-vehicle interaction is carried out by the human-vehicle interaction robot, improving the user's driving experience.
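To make the division of responsibilities concrete, the following is a minimal Python sketch of how the radar, the human-vehicle interaction robot and the vehicle control device described above could be wired together; every class name, method name and threshold in it (for example Radar.scan, VehicleControlDevice.apply and the 10 m braking distance) is an illustrative assumption, not part of the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Obstacle:
    distance_m: float      # distance ahead of the vehicle
    bearing_deg: float     # angle relative to the vehicle heading

class Radar:
    """Hypothetical stand-in for radar 1: reports obstacles in front of the vehicle."""
    def scan(self) -> List[Obstacle]:
        return []  # a real sensor driver would populate this list

class InteractionRobot:
    """Hypothetical stand-in for robot 2: returns the latest interaction result."""
    def interact(self) -> dict:
        return {"user_command": None, "user_emotion": None}

class VehicleControlDevice:
    """Hypothetical stand-in for device 3: fuses obstacle info and interaction state."""
    def apply(self, obstacles: List[Obstacle], interaction: dict) -> str:
        if any(o.distance_m < 10.0 for o in obstacles):
            return "brake"          # obstacle too close: slow the vehicle down
        if interaction.get("user_command") == "stop":
            return "pull_over"      # honour an explicit user request
        return "cruise"

def control_step(radar: Radar, robot: InteractionRobot, ctrl: VehicleControlDevice) -> str:
    """One control cycle: sense, interact, decide."""
    return ctrl.apply(radar.scan(), robot.interact())
```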
Preferably, the human-vehicle interaction robot 2 comprises a first processing subsystem, a second processing subsystem and a third processing subsystem. The first processing subsystem is used to obtain external environment information and comprises a microphone and a high-definition camera; the microphone is used to obtain the user's voice information, and the high-definition camera is used to obtain the user's facial information. The second processing subsystem is used to carry out voice interaction with the user according to the voice information, and the third processing subsystem is used to carry out affective interaction with the user according to the facial information.
The second processing subsystem comprises a recognition module, a synthesis module and a playback module. The recognition module is used to extract the user's voice information and translate it into a machine-recognizable binary representation, the synthesis module is used to convert text information into voice information, and the playback module is used to play the converted voice.
In this preferred embodiment, the human-vehicle interaction robot achieves good interaction between the robot and the user, and the second processing subsystem achieves accurate voice interaction between the robot and the user.
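The recognition–synthesis–playback pipeline of the second processing subsystem can be sketched as below; the function names (recognize, synthesize, play) and the placeholder bodies are assumptions standing in for real speech-recognition and speech-synthesis engines.

```python
def recognize(audio_frames: bytes) -> str:
    """Recognition module: turn the user's speech into text (placeholder logic)."""
    return audio_frames.decode("utf-8", errors="ignore")  # stand-in for a real ASR engine

def synthesize(text: str) -> bytes:
    """Synthesis module: turn reply text into an audio waveform (placeholder logic)."""
    return text.encode("utf-8")  # stand-in for a real TTS engine

def play(waveform: bytes) -> None:
    """Playback module: send the synthesized voice to the loudspeaker (placeholder)."""
    print(f"[playing {len(waveform)} bytes of audio]")

def voice_interaction(audio_frames: bytes, respond) -> None:
    # respond() maps recognized text to reply text; it stands in for the dialogue logic
    play(synthesize(respond(recognize(audio_frames))))
```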
Preferably, the third processing subsystem comprises a first processing module, a second processing module, a third processing module and a fourth processing module. The first processing module is used to establish an emotional space model, the second processing module determines emotional energy according to the emotional space model, the third processing module is used to obtain the user's emotion from the facial information, and the fourth processing module makes the robot produce a corresponding emotional change according to the user's emotion.
The first processing module is used to establish the emotional space model as follows: a two-dimensional emotional space model is established whose dimensions are the pleasure degree and the activation degree; the pleasure degree represents how pleasant the emotion is, and the activation degree represents how strongly the emotion is activated;

The set of affective states of the robot is expressed as: PL = {PL_1, PL_2, ..., PL_n};

In the above formula, PL_i denotes the i-th affective state of the robot, i = 1, 2, ..., n, and n denotes the number of affective states of the robot. Each robot affective state is described as a point in the two-dimensional emotional space: (a_i, b_i), where a_i denotes the pleasure degree of the i-th affective state of the robot and b_i denotes its activation degree;

The set of affective states of the user is expressed as: GW = {GW_1, GW_2, ..., GW_m};

In the above formula, GW_j denotes the j-th affective state of the user, j = 1, 2, ..., m, and m denotes the number of affective states of the user. Each user affective state is described as a point in the two-dimensional emotional space: (a_j, b_j), where a_j denotes the pleasure degree of the j-th affective state of the user and b_j denotes its activation degree;
In this preferred embodiment, establishing a two-dimensional emotional space model allows affective states to be expressed accurately while reducing the amount of computation and improving computational efficiency, laying a good foundation for subsequent interaction.
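A minimal sketch of the two-dimensional emotional space model, representing each affective state as a (pleasure, activation) point; the concrete state names, coordinate values and the [-1, 1] range are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class AffectiveState:
    name: str
    pleasure: float    # a: pleasure degree, assumed here to lie in [-1, 1]
    activation: float  # b: activation degree, assumed here to lie in [-1, 1]

# Example robot state set PL (states and coordinates are made up for illustration)
robot_states: List[AffectiveState] = [
    AffectiveState("calm",    pleasure=0.2,  activation=-0.4),
    AffectiveState("happy",   pleasure=0.8,  activation=0.5),
    AffectiveState("alert",   pleasure=-0.1, activation=0.7),
    AffectiveState("annoyed", pleasure=-0.6, activation=0.4),
]
```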
Preferably, the second processing module determines the emotional energy according to the emotional space model, specifically as follows: the driving source of all psychological activity is defined as the mental capacity, denoted UA: UA = KW_1 + KW_2;

In the above formula, KW_1 denotes the free mental capacity generated spontaneously under suitable conditions, KW_1 = δ_1 × UA, and KW_2 denotes the controlled mental capacity generated under external stimulation, KW_2 = δ_2 × UA, where δ_1 denotes the psychological arousal degree, δ_2 denotes the psychological suppression degree, δ_1, δ_2 ∈ [0, 1], and δ_1 + δ_2 = 1;

The mental capacity of an emotion is determined according to the emotional space model: UA = y × D × (a + b);

In the above formula, D denotes the emotional intensity, y denotes the emotion coefficient, and a and b respectively denote the pleasure degree and the activation degree of the affective state;

The emotional energy is determined by the following formula: KW_q = KW_1 + μ × KW_2 = (1 − δ_2 + μδ_2) × y × D × (a + b);

In the above formula, KW_q denotes the emotional energy and μ denotes the psychological emotion excitation parameter, μ ∈ [0, 1];
In this preferred embodiment, the second processing module defines the mental capacity and the emotional energy, which helps improve the robot's interaction performance and lays a foundation for subsequent interaction.
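The emotional-energy formula above translates directly into a small function; the parameter names and the example values in the trailing comment are assumptions for illustration.

```python
def emotional_energy(a: float, b: float, D: float, y: float,
                     delta2: float, mu: float) -> float:
    """KW_q = (1 - δ2 + μ·δ2) · y · D · (a + b), as given in the description.

    a, b   : pleasure and activation degree of the affective state
    D, y   : emotional intensity and emotion coefficient
    delta2 : psychological suppression degree, in [0, 1]
    mu     : psychological emotion excitation parameter, in [0, 1]
    """
    if not (0.0 <= delta2 <= 1.0 and 0.0 <= mu <= 1.0):
        raise ValueError("delta2 and mu must lie in [0, 1]")
    return (1.0 - delta2 + mu * delta2) * y * D * (a + b)

# Illustrative call (all numbers are made up):
# emotional_energy(a=0.8, b=0.5, D=1.0, y=0.6, delta2=0.3, mu=0.5)
```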
Preferably, the fourth processing module makes the robot produce a corresponding emotional change according to the user's emotion, specifically as follows: when the robot's current affective state is the same as the user's affective state, the robot's affective state does not change, but the robot's emotional energy is doubled;

When the robot's current affective state differs from the user's affective state, the robot's next affective state will change. The next affective state is related not only to the robot's current affective state but is also closely bound up with the user's affective state. Let the robot's current affective state be PL_i(a_i, b_i), i ∈ {1, 2, ..., n}, the user's affective state be GW_j(a_j, b_j), j ∈ {1, 2, ..., m}, and any possible affective state of the robot at the next moment be PL_k(a_k, b_k), k ∈ {1, 2, ..., n}, with i ≠ j ≠ k;

The feature vector of the transfer from the current affective state to the user's affective state is computed as PA_1 = (a_j − a_i, b_j − b_i), the feature vector of the transfer from the current affective state to any possible affective state as PA_2 = (a_k − a_i, b_k − b_i), and the feature vector of the transfer from the user's affective state to any possible affective state as PA_3 = (a_k − a_j, b_k − b_j); the emotional transfer function TZ is then determined from these vectors (the formula itself is not reproduced in the available text).

The emotional transfer function is minimized, and the affective state PL_z(a_z, b_z), z ∈ k, at which the transfer function takes its minimum is obtained; this affective state is taken as the robot's state at the next moment.
In this preferred embodiment, the fourth processing module uses a mathematical method that allows the robot to simulate the generation and variation of human emotions and to follow the rules by which human emotions change, satisfying the user's emotional needs during driving. When the robot's current affective state is the same as the user's affective state, the robot's emotional energy increases; when the robot's current affective state differs from the user's affective state, the robot's next affective state will change. The transfer function relates the robot's current affective state to the user's affective state and is then used to determine the type of the robot's next affective state, improving the robot's interaction capability.
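Because the transfer-function formula itself is not reproduced in the text, the following sketch substitutes a simple stand-in criterion: each candidate next state is scored by the combined Euclidean length of the transfer vectors PA_2 and PA_3, and the minimum is chosen. That scoring rule, like the reuse of the AffectiveState class from the earlier sketch, is an assumption for illustration and is not the patent's actual TZ function.

```python
import math
from typing import List, Optional

def next_robot_state(robot_states: List[AffectiveState],
                     current: AffectiveState,
                     user: AffectiveState) -> AffectiveState:
    """Pick the robot's next affective state following the scheme described above."""
    # Same state as the user: keep the state (the emotional energy would be doubled elsewhere).
    if (current.pleasure, current.activation) == (user.pleasure, user.activation):
        return current

    best: Optional[AffectiveState] = None
    best_score = math.inf
    for cand in robot_states:
        if cand == current:
            continue  # the next state must differ from the current one (i != k)
        # PA_2: transfer from the current state to the candidate state
        pa2 = (cand.pleasure - current.pleasure, cand.activation - current.activation)
        # PA_3: transfer from the user's state to the candidate state
        pa3 = (cand.pleasure - user.pleasure, cand.activation - user.activation)
        # Stand-in score: total transfer distance (NOT the patent's TZ formula)
        score = math.hypot(*pa2) + math.hypot(*pa3)
        if score < best_score:
            best, best_score = cand, score
    return best if best is not None else current
```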
Driving tests using the intelligent vehicle unmanned driving system of the present invention were carried out with 5 users, namely user 1, user 2, user 3, user 4 and user 5, and the driving safety and user satisfaction were recorded. Compared with an existing intelligent driving system, the beneficial effects shown in the table below were obtained:
 | Driving safety improvement | User satisfaction improvement
---|---|---
User 1 | 29% | 27%
User 2 | 27% | 26%
User 3 | 26% | 26%
User 4 | 25% | 24%
User 5 | 24% | 22%
Finally, it should be noted that the above embodiments are merely illustrative of the technical solution of the present invention and do not limit its scope of protection. Although the present invention has been described in detail with reference to preferred embodiments, those skilled in the art should understand that modifications or equivalent replacements can be made to the technical solution of the present invention without departing from the essence and scope of the technical solution of the present invention.
Claims (6)
1. An intelligent vehicle unmanned driving system, characterized by comprising a radar mounted outside the vehicle, a human-vehicle interaction robot arranged inside the vehicle, and a vehicle control device, wherein the radar is used to obtain information about obstacles in front of the vehicle, the human-vehicle interaction robot is used by the intelligent driving system to interact with the user, and the vehicle control device is used to control the vehicle according to the obstacle information and the interaction.
2. The intelligent vehicle unmanned driving system according to claim 1, characterized in that the human-vehicle interaction robot comprises a first processing subsystem, a second processing subsystem and a third processing subsystem, wherein the first processing subsystem is used to obtain external environment information and comprises a microphone and a high-definition camera, the microphone is used to obtain the user's voice information, the high-definition camera is used to obtain the user's facial information, the second processing subsystem is used to carry out voice interaction with the user according to the voice information, and the third processing subsystem is used to carry out affective interaction with the user according to the facial information.
3. The intelligent vehicle unmanned driving system according to claim 2, characterized in that the second processing subsystem comprises a recognition module, a synthesis module and a playback module, wherein the recognition module is used to extract the user's voice information and translate it into a machine-recognizable binary representation, the synthesis module is used to convert text information into voice information, and the playback module is used to play the converted voice.
4. The intelligent vehicle unmanned driving system according to claim 3, characterized in that the third processing subsystem comprises a first processing module, a second processing module, a third processing module and a fourth processing module, wherein the first processing module is used to establish an emotional space model, the second processing module determines emotional energy according to the emotional space model, the third processing module is used to obtain the user's emotion from the facial information, and the fourth processing module makes the robot produce a corresponding emotional change according to the user's emotion;

The first processing module is used to establish the emotional space model as follows: a two-dimensional emotional space model is established whose dimensions are the pleasure degree and the activation degree; the pleasure degree represents how pleasant the emotion is, and the activation degree represents how strongly the emotion is activated;

The set of affective states of the robot is expressed as: PL = {PL_1, PL_2, ..., PL_n};

In the above formula, PL_i denotes the i-th affective state of the robot, i = 1, 2, ..., n, and n denotes the number of affective states of the robot; each robot affective state is described as a point in the two-dimensional emotional space: (a_i, b_i), where a_i denotes the pleasure degree of the i-th affective state of the robot and b_i denotes its activation degree;

The set of affective states of the user is expressed as: GW = {GW_1, GW_2, ..., GW_m};

In the above formula, GW_j denotes the j-th affective state of the user, j = 1, 2, ..., m, and m denotes the number of affective states of the user; each user affective state is described as a point in the two-dimensional emotional space: (a_j, b_j), where a_j denotes the pleasure degree of the j-th affective state of the user and b_j denotes its activation degree.
5. The intelligent vehicle unmanned driving system according to claim 4, characterized in that the second processing module determines the emotional energy according to the emotional space model, specifically as follows: the driving source of all psychological activity is defined as the mental capacity, denoted UA: UA = KW_1 + KW_2;

In the above formula, KW_1 denotes the free mental capacity generated spontaneously under suitable conditions, KW_1 = δ_1 × UA, and KW_2 denotes the controlled mental capacity generated under external stimulation, KW_2 = δ_2 × UA, where δ_1 denotes the psychological arousal degree, δ_2 denotes the psychological suppression degree, δ_1, δ_2 ∈ [0, 1], and δ_1 + δ_2 = 1;

The mental capacity of an emotion is determined according to the emotional space model: UA = y × D × (a + b);

In the above formula, D denotes the emotional intensity, y denotes the emotion coefficient, and a and b respectively denote the pleasure degree and the activation degree of the affective state;

The emotional energy is determined by the following formula: KW_q = KW_1 + μ × KW_2 = (1 − δ_2 + μδ_2) × y × D × (a + b);

In the above formula, KW_q denotes the emotional energy and μ denotes the psychological emotion excitation parameter, μ ∈ [0, 1].
6. The intelligent vehicle unmanned driving system according to claim 5, characterized in that the fourth processing module is used to make the robot produce a corresponding emotional change according to the user's emotion, specifically as follows: when the robot's current affective state is the same as the user's affective state, the robot's affective state does not change, but the robot's emotional energy is doubled;

When the robot's current affective state differs from the user's affective state, the robot's next affective state will change; the next affective state is related not only to the robot's current affective state but is also closely bound up with the user's affective state; let the robot's current affective state be PL_i(a_i, b_i), i ∈ {1, 2, ..., n}, the user's affective state be GW_j(a_j, b_j), j ∈ {1, 2, ..., m}, and any possible affective state of the robot at the next moment be PL_k(a_k, b_k), k ∈ {1, 2, ..., n}, with i ≠ j ≠ k;

The feature vector of the transfer from the current affective state to the user's affective state is computed as PA_1 = (a_j − a_i, b_j − b_i), the feature vector of the transfer from the current affective state to any possible affective state as PA_2 = (a_k − a_i, b_k − b_i), and the feature vector of the transfer from the user's affective state to any possible affective state as PA_3 = (a_k − a_j, b_k − b_j); the emotional transfer function TZ is then determined from these vectors (the formula itself is not reproduced in the available text);

The emotional transfer function is minimized, and the affective state PL_z(a_z, b_z), z ∈ k, at which the transfer function takes its minimum is obtained; this affective state is taken as the robot's state at the next moment.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810726257.3A CN108919804B (en) | 2018-07-04 | 2018-07-04 | Intelligent vehicle unmanned system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108919804A (en) | 2018-11-30
CN108919804B (en) | 2022-02-25
Family
ID=64425077
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810726257.3A Active CN108919804B (en) | 2018-07-04 | 2018-07-04 | Intelligent vehicle unmanned system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108919804B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111538335A (en) * | 2020-05-15 | 2020-08-14 | 深圳国信泰富科技有限公司 | Anti-collision method of driving robot |
CN113433874A (en) * | 2021-07-21 | 2021-09-24 | 广东工业大学 | 5G-based unmanned ship comprehensive control management system and method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060149428A1 (en) * | 2005-01-05 | 2006-07-06 | Kim Jong H | Emotion-based software robot for automobiles |
CN101571930A (en) * | 2008-04-30 | 2009-11-04 | 悠进机器人股份公司 | Robot capable of interacting with human |
CN103324100A (en) * | 2013-05-02 | 2013-09-25 | 郭海锋 | Emotion vehicle-mounted robot driven by information |
CN104199321A (en) * | 2014-08-07 | 2014-12-10 | 刘松珍 | Emotion interacting type vehicle-mounted robot |
CN108009573A (en) * | 2017-11-24 | 2018-05-08 | 北京物灵智能科技有限公司 | A kind of robot emotion model generating method, mood model and exchange method |
Also Published As
Publication number | Publication date |
---|---|
CN108919804B (en) | 2022-02-25 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication |
 | SE01 | Entry into force of request for substantive examination |
2022-01-30 | TA01 | Transfer of patent application right | Address after: No. 7, Hongbei Road, Economic Development Zone, Guye District, Tangshan City, Hebei Province, 063100; applicant after: Tangshan Dehui Aviation Equipment Co., Ltd. Address before: Room 9213-9215, Building 9, No. 200, Yuangang Road, Tianhe District, Guangzhou, Guangdong, 510000; applicant before: GUANGDONG ZHUJIANQIANG INTERNET TECHNOLOGY Co., Ltd.
 | GR01 | Patent grant |