CN113380252A - User interaction method, interaction system and computer program product for vehicle - Google Patents

User interaction method, interaction system and computer program product for vehicle

Info

Publication number
CN113380252A
Authority
CN
China
Prior art keywords
user
information
vehicle
characteristic information
interaction method
Prior art date
Legal status
Pending
Application number
CN202110650820.5A
Other languages
Chinese (zh)
Inventor
赵玥
Current Assignee
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Priority to CN202110650820.5A priority Critical patent/CN113380252A/en
Publication of CN113380252A publication Critical patent/CN113380252A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/24 Speech recognition using non-acoustical features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/26 Speech to text systems

Abstract

The invention provides a user interaction method for a vehicle, comprising the following steps: step S1, acquiring image information of a user and/or the user's belongings in the vehicle by means of a vehicle-mounted camera device (1); step S2, obtaining appearance characteristic information about the appearance of the user and/or the belongings from the image information; and step S3, outputting a greeting associated with the appearance characteristic information to the user. An interaction system of a vehicle for carrying out the user interaction method according to the invention is also proposed, the interaction system comprising a camera device (1), an output device (2) and a controller (3) connected to the camera device (1) and the output device (2). The invention also relates to a corresponding computer program product. By means of the invention, the vehicle can actively address the user with a more personalized and more anthropomorphic greeting that relates to the user himself or herself, thereby improving the user experience during human-vehicle interaction.

Description

User interaction method, interaction system and computer program product for vehicle
Technical Field
The present invention relates to the field of vehicles, and in particular, to a user interaction method for a vehicle, a corresponding interaction system, and a computer program product.
Background
Vehicles have become a common means of transportation in people's daily lives. As the frequency and duration of vehicle use increase, users' demands on vehicles are no longer limited to simply being carried from one place to another; instead, vehicles are expected to provide more functions.
Many manufacturers are therefore working to develop and design intelligent functions that can be installed in vehicles and are easy for users to use and operate. For example, a vehicle may be equipped with an in-vehicle intelligent voice function, which uses voice interaction to enable comparatively efficient human-vehicle interaction.
However, such in-vehicle intelligent voice functions typically require the user to wake them up actively by voice. That is, the user must actively provide input while the machine only passively responds. This interaction process results in a poor user experience and also suffers from problems such as low wake-word recognition accuracy and false wake-ups.
In addition, this kind of interaction is limited to verbal communication with the user and provides only relatively formulaic feedback based on the very limited information that the user supplies by voice.
The interaction between vehicle and user therefore still leaves much to be desired in terms of user experience.
Disclosure of Invention
The object of the invention is to provide an improved way for a vehicle to interact with its user and thereby improve the user experience. In particular, this way of interacting enables the vehicle to actively address the user with a more personalized, more anthropomorphic greeting that relates to the user himself or herself.
According to a first aspect of the invention, a user interaction method of a vehicle is provided, wherein the user interaction method comprises: step S1, acquiring image information of a user and/or the user's belongings in the vehicle by means of a vehicle-mounted camera device; step S2, obtaining appearance characteristic information about the appearance of the user and/or the belongings from the image information; and step S3, outputting a greeting associated with the appearance characteristic information to the user.
Optionally, the user's belongings include objects and/or pets that the user carries with him.
Optionally, the appearance characteristic information comprises type information and/or color information determined by the appearance of the user and/or his belongings.
Optionally, the appearance characteristic information comprises at least one of the following information:
information about clothing worn by the user, such as information about the user's clothing, hat, shoes, or accessories;
information about the appearance of the user, such as information about the makeup or hair of the user;
information about a bag or other object carried by the user; and
information about a pet carried by the user.
Optionally, in step S2, the appearance characteristic information is obtained from the image information on the basis of a machine learning model.
Optionally, step S2 further comprises: determining the greeting content to be output from the acquired appearance characteristic information on the basis of a machine learning model.
Optionally, step S2 comprises: determining the greeting content to be output on the basis of appearance characteristic information acquired over a certain period of time, for example on the basis of differences between the appearance characteristic information acquired over that period.
Optionally, in step S3, the greeting is output to the user through a voice output device and/or the greeting is output to the user through a display device.
Optionally, the user interaction method further comprises step S4: activating further on-board systems in association with the appearance characteristic information, the further on-board systems comprising, for example, a navigation system and/or an on-board shopping system.
Optionally, step S4 includes at least one of the following sub-steps:
asking the user whether navigation to a corresponding sports venue is required, if the appearance characteristic information shows that the user's belongings include sporting goods;
asking whether navigation to a workplace is required, if the appearance characteristic information shows that the user is wearing work clothes;
asking whether to navigate to a nearby pet hospital, if the appearance characteristic information shows that the user is accompanied by a pet; and
activating the in-vehicle shopping system, if the appearance characteristic information shows that the user is wearing a particular type of clothing and lacks a matching accessory.
According to a second aspect of the invention, there is provided a computer program product comprising computer program instructions, wherein the computer program instructions, when executed by one or more processors, enable the processors to perform a user interaction method according to the invention.
According to a third aspect of the invention, an interaction system of a vehicle for carrying out the user interaction method according to the invention is provided, wherein the interaction system comprises a camera device, an output device and a controller connected to the camera device and the output device, wherein the camera device is arranged to acquire image information of a user and/or the user's belongings within the vehicle; the controller is arranged to obtain appearance characteristic information about the appearance of the user and/or the belongings from the image information; and the output device is arranged to output a greeting associated with the appearance characteristic information to the user.
The invention has the following beneficial effect: by means of the camera device, the user's current appearance characteristic information can be acquired, and on the basis of this information the vehicle can actively address the user with a more personalized and more anthropomorphic intelligent greeting, thereby improving the user experience.
Drawings
The principles, features and advantages of the present invention may be better understood by describing the invention in more detail below with reference to the accompanying drawings. The drawings comprise:
FIG. 1 schematically illustrates a flow chart of a user interaction method for a vehicle according to an exemplary embodiment of the present invention;
FIG. 2 schematically illustrates an interactive system of a vehicle according to an exemplary embodiment of the invention; and
FIG. 3 schematically shows a flow chart of a user interaction method according to an exemplary embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings and exemplary embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the scope of the invention.
Fig. 1 schematically shows a flow chart of a user interaction method of a vehicle according to an exemplary embodiment of the present invention. Fig. 2 schematically shows an interaction system of a vehicle according to an exemplary embodiment of the present invention. The concept of the present invention is explained in detail below with reference to fig. 1 and 2.
As shown in fig. 1, the user interaction method according to the present invention includes:
step S1, acquiring image information of a user and/or the user's belongings in the vehicle by means of the vehicle-mounted camera device 1;
step S2, obtaining appearance characteristic information about the appearance of the user and/or the belongings from the image information; and
step S3, outputting a greeting associated with the appearance characteristic information to the user.
In this way, the camera device 1 can capture the user's current appearance characteristic information, and on the basis of this information the vehicle can actively address the user with a personalized intelligent greeting, thereby improving the user experience. It should be understood that the users in the vehicle may include a driver and/or passengers.
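Purely for illustration, the following minimal sketch shows how steps S1 to S3 could be chained in software; the class and function names are hypothetical and the feature extraction is stubbed with fixed data, since the invention does not prescribe a concrete implementation.

```python
# Minimal sketch of the S1-S3 pipeline, for illustration only. All names
# (AppearanceFeature, acquire_image, extract_features, greet, MockCamera)
# are hypothetical, and the feature extraction is stubbed with fixed data
# because the patent leaves the concrete recognition model open.

from dataclasses import dataclass
from typing import List


@dataclass
class AppearanceFeature:
    category: str  # e.g. "clothing", "belonging", "pet"
    type: str      # e.g. "dress", "yoga mat", "corgi"
    color: str     # e.g. "red"


def acquire_image(camera_device) -> bytes:
    """Step S1: obtain image information from the in-vehicle camera device."""
    return camera_device.read_frame()


def extract_features(image: bytes) -> List[AppearanceFeature]:
    """Step S2: derive appearance characteristic information from the image
    (in practice via a machine learning model; stubbed here)."""
    return [AppearanceFeature("clothing", "dress", "red")]


def greet(features: List[AppearanceFeature]) -> str:
    """Step S3: compose a greeting associated with the appearance features."""
    if not features:
        return "Hello, nice to see you!"
    f = features[0]
    return f"Hello! That {f.color} {f.type} really suits you today."


class MockCamera:
    def read_frame(self) -> bytes:
        return b"<image bytes>"


if __name__ == "__main__":
    image = acquire_image(MockCamera())
    print(greet(extract_features(image)))
```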
The user interaction method may be started once it is detected that a user has entered the vehicle; whether a user has entered the vehicle can be detected, for example, from the open/closed state of a vehicle door, from the weight on a vehicle seat, or from the image captured by the camera device 1. Alternatively or additionally, the user interaction method may also be started once it is detected that the driver has started the vehicle.
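A small sketch of this trigger logic is given below; the boolean sensor flags are hypothetical placeholders for the door, seat, camera and ignition signals.

```python
# Illustrative trigger check for starting the user interaction method.
# The boolean sensor flags are hypothetical; real signals would come from
# the vehicle's door, seat and camera sensors and the ignition state.

def should_start_interaction(door_opened_and_closed: bool,
                             seat_occupied: bool,
                             person_in_camera_image: bool,
                             vehicle_started: bool) -> bool:
    """Start once a user is detected in the vehicle, or (alternatively or
    additionally) once the driver has started the vehicle."""
    user_detected = door_opened_and_closed and (seat_occupied or person_in_camera_image)
    return user_detected or vehicle_started


# Example: the door was opened/closed and the seat sensor reports weight -> start.
print(should_start_interaction(True, True, False, False))  # True
```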
The user interaction method may be carried out by means of an interaction system according to the invention. As shown in fig. 2, the interaction system comprises a camera device 1, an output device 2 and a controller 3 connected to the camera device 1 and the output device 2.
The camera device 1 is arranged to acquire image information of a user and/or the user's belongings in the vehicle. It is arranged, for example, above the front windshield of the vehicle so that images of users in the vehicle can be captured conveniently. The image information may include picture information and/or video information. The user's belongings may include objects and/or pets that the user carries along.
The controller 3 is arranged to obtain appearance characteristic information about the appearance of the user and/or the belongings from the image information.
Optionally, the appearance characteristic information may include at least one of the following:
information about clothing worn by the user, such as information about the user's clothing, hat, shoes, or accessories;
information about the appearance of the user, such as information about the makeup or hair of the user;
information about a bag or other object carried by the user; and
information about a pet carried by the user.
In particular, the appearance characteristic information may comprise type information and/or color information determined by the appearance of the user and/or his or her belongings. For example, the appearance characteristic information may include: the color of the clothing worn by the user, the user's style of dress (e.g. sporty, casual, business, etc.), the type of sporting goods the user carries (e.g. yoga mat, badminton racket, soccer ball, etc.), or the breed of pet the user brings along.
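Purely for illustration, the sketch below derives one such piece of type information, the user's style of dress, from detected garment labels; the label set and the mapping are assumptions and not part of the invention.

```python
# Illustrative derivation of one piece of type information, the user's
# style of dress, from detected garment labels. The label set and the
# mapping are assumptions, not part of the patent.

from typing import List

STYLE_BY_GARMENT = {
    "suit": "business",
    "blazer": "business",
    "hoodie": "casual",
    "jeans": "casual",
    "tracksuit": "sporty",
    "running shoes": "sporty",
}


def dress_style(garment_labels: List[str]) -> str:
    """Derive the user's style of dress from detected garment types."""
    votes = [STYLE_BY_GARMENT[g] for g in garment_labels if g in STYLE_BY_GARMENT]
    return max(set(votes), key=votes.count) if votes else "unknown"


print(dress_style(["suit", "running shoes", "blazer"]))  # -> business
```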
Optionally, the greeting content to be output is determined on the basis of appearance characteristic information acquired over a certain period of time, for example on the basis of differences between the appearance characteristic information acquired over that period. Based on such differences, greetings like the following may be output: "You look even more beautiful in red today", "Your earrings suit your eyes even better than yesterday's" or "Your new look goes very well with the suit". In this way, the output greeting can be made more human.
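The sketch below illustrates this idea of comparing appearance characteristic information across acquisitions and greeting on the basis of what changed; the feature keys and the greeting texts are illustrative assumptions.

```python
# Illustrative difference-based greeting: compare today's appearance
# characteristic information with the information stored from a previous
# ride. Feature keys and greeting texts are assumptions.

from typing import Dict, Optional


def difference_greeting(today: Dict[str, str],
                        previous: Optional[Dict[str, str]]) -> str:
    """Greet on the basis of what changed since the last acquisition."""
    if not previous:
        return "Hello, nice to see you!"
    changed = {k: v for k, v in today.items() if previous.get(k) != v}
    if "clothing_color" in changed:
        return f"You look even more beautiful in {changed['clothing_color']} today."
    if "hairstyle" in changed:
        return "Your new look goes very well with your outfit."
    return "Good to see you again!"


print(difference_greeting({"clothing_color": "red", "hairstyle": "bob"},
                          {"clothing_color": "blue", "hairstyle": "bob"}))
```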
Optionally, the controller 3 may obtain the appearance characteristic information from the image information on the basis of a machine learning model. The controller 3 may also determine the greeting content to be output from the acquired appearance characteristic information on the basis of a machine learning model.
The output device 2 is arranged to output a greeting associated with the appearance characteristic information to the user. The output device 2 may be a display device, for example the central control screen of the vehicle, by means of which the greeting can be output to the user in text form. Alternatively or additionally, the output device 2 may comprise a voice output device by means of which the greeting can be output to the user as speech.
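For illustration, the sketch below dispatches a greeting to a text display and/or a voice output channel; both device classes are placeholders that merely print.

```python
# Illustrative output stage: the greeting may be shown as text on a display
# (e.g. the central control screen) and/or spoken via a voice output device.
# Both device classes are placeholders that simply print here.

from typing import Optional


class DisplayOutput:
    def show(self, text: str) -> None:
        print(f"[central control screen] {text}")


class VoiceOutput:
    def speak(self, text: str) -> None:
        print(f"[voice output] {text}")


def output_greeting(greeting: str,
                    display: Optional[DisplayOutput] = None,
                    voice: Optional[VoiceOutput] = None) -> None:
    """Output the greeting on whichever channels are available."""
    if display is not None:
        display.show(greeting)
    if voice is not None:
        voice.speak(greeting)


output_greeting("Hello! That red dress really suits you today.",
                display=DisplayOutput(), voice=VoiceOutput())
```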
Fig. 3 schematically shows a flow chart of a user interaction method according to an exemplary embodiment of the present invention. Compared with the embodiment shown in fig. 1, the user interaction method shown in fig. 3 additionally comprises step S4: activating further on-board systems in association with the appearance characteristic information, the further on-board systems comprising, for example, a navigation system and/or an on-board shopping system.
Optionally, step S4 includes at least one of the following sub-steps:
asking the user whether navigation to a corresponding sports venue is required, if the appearance characteristic information shows that the user's belongings include sporting goods;
asking whether navigation to a workplace is required, if the appearance characteristic information shows that the user is wearing work clothes;
asking whether to navigate to a nearby pet hospital, if the appearance characteristic information shows that the user is accompanied by a pet; and
activating the in-vehicle shopping system, if the appearance characteristic information shows that the user is wearing a particular type of clothing and lacks a matching accessory.
The sports venue and the workplace may be specific venues and workplaces set in advance, or corresponding venues and workplaces that the user has previously searched for. For example, if the appearance characteristic information shows that the user is wearing a suit, the interaction system may ask: "Do you need to navigate to your company?" After receiving a positive answer from the user, the navigation system navigates to the company address that was previously entered. As another example, if the appearance characteristic information shows that the user's belongings include a yoga mat, the interaction system may ask: "Do you need to navigate to the yoga studio?" After a positive answer from the user, the navigation system navigates to the address of the yoga studio. Or, for example, if the appearance characteristic information shows that the user is glamorously dressed but lacks a matching accessory such as earrings or a necklace, or that the user is wearing sports clothing but lacks a wristband, the in-vehicle shopping system can be activated and shopping information for the relevant accessories can be displayed, so that the user can conveniently choose to make a purchase.
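The following sketch maps appearance characteristic information to the follow-up actions of step S4 described above; the feature encoding, the saved destinations and the rule set are illustrative assumptions based only on these examples.

```python
# Illustrative rules for step S4: map appearance characteristic information
# to follow-up actions. The feature encoding, saved destinations and rule
# set are assumptions based only on the examples given above.

from typing import List, Optional


def plan_followup(features: List[dict],
                  saved_workplace: Optional[str] = None,
                  saved_sports_venue: Optional[str] = None) -> Optional[str]:
    """Return a question or action prompt for the navigation or shopping system."""
    types = {f["type"] for f in features}
    categories = {f["category"] for f in features}

    sporting_goods = {"yoga mat", "badminton racket", "soccer ball"}
    if sporting_goods & types and saved_sports_venue:
        return f"Do you need to navigate to {saved_sports_venue}?"
    if "suit" in types and saved_workplace:
        return f"Do you need to navigate to {saved_workplace}?"
    if "pet" in categories:
        return "Do you want to navigate to a nearby pet hospital?"
    if "evening dress" in types and "earrings" not in types:
        return "Open the in-vehicle shopping system for matching accessories?"
    return None


# Example: a yoga mat was detected and a yoga studio address is saved.
print(plan_followup([{"category": "belonging", "type": "yoga mat"}],
                    saved_sports_venue="the yoga studio"))
```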
The invention also relates to a computer program product comprising computer program instructions which, when executed by one or more processors, cause the processors to perform the user interaction method according to the invention. The computer program instructions may be stored in a computer-readable storage medium, which may include, for example, high-speed random access memory and may also include non-volatile memory, such as a hard disk, an internal memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another solid-state storage device. The processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
Although specific embodiments of the invention have been described herein in detail, they have been presented for purposes of illustration only and are not to be construed as limiting the scope of the invention. Various substitutions, alterations, and modifications may be devised without departing from the spirit and scope of the present invention.

Claims (10)

1. A user interaction method of a vehicle, wherein the user interaction method comprises:
step S1, acquiring image information of a user and/or the user's belongings in the vehicle by means of a vehicle-mounted camera device (1);
step S2, obtaining appearance characteristic information about the appearance of the user and/or the belongings from the image information; and
step S3, outputting a greeting associated with the appearance characteristic information to the user.
2. The user interaction method of claim 1,
the personal belongings of the user comprise objects and/or pets carried by the user; and/or
The appearance characteristic information includes type information and/or color information determined by the appearance of the user and/or his belongings.
3. The user interaction method according to claim 1 or 2, wherein the appearance feature information comprises at least one of the following information:
information about clothing worn by the user, such as information about the user's clothing, hat, shoes, or accessories;
information about the appearance of the user, such as information about the makeup or hair of the user;
information about a bag or other object carried by the user; and
information about a pet carried by the user.
4. The user interaction method of any of claims 1-3,
In step S2, the appearance characteristic information is obtained from the image information on the basis of a machine learning model; and/or
Step S2 further comprises: determining the greeting content to be output from the acquired appearance characteristic information on the basis of a machine learning model.
5. The user interaction method of any of claims 1-4,
step S2 includes: greeting content to be output is determined based on appearance characteristic information acquired over a period of time, for example, based on differences between appearance characteristic information acquired over a period of time.
6. The user interaction method of any of claims 1-5,
in step S3, the greeting is output to the user through the voice output device and/or the greeting is output to the user through the display device.
7. The user interaction method of any of claims 1-6,
the user interaction method further includes step S4: in association with the appearance characteristic information, further on-board systems are activated, which comprise, for example, a navigation system and/or an on-board shopping system.
8. The user interaction method according to claim 7, wherein step S4 includes at least one of the following sub-steps:
asking the user whether navigation to a corresponding sports venue is required, if the appearance characteristic information shows that the user's belongings include sporting goods;
asking whether navigation to a workplace is required, if the appearance characteristic information shows that the user is wearing work clothes;
asking whether to navigate to a nearby pet hospital, if the appearance characteristic information shows that the user is accompanied by a pet; and
activating the in-vehicle shopping system, if the appearance characteristic information shows that the user is wearing a particular type of clothing and lacks a matching accessory.
9. A computer program product comprising computer program instructions, wherein the computer program instructions, when executed by one or more processors, enable the processors to perform the user interaction method of any one of claims 1-8.
10. An interaction system of a vehicle for carrying out the user interaction method according to any one of claims 1-8, wherein the interaction system comprises a camera device (1), an output device (2), and a controller (3) connected to the camera device (1) and the output device (2), wherein,
the camera device (1) is arranged to acquire image information of a user and/or the user's belongings in the vehicle;
the controller (3) is arranged to obtain appearance characteristic information about the appearance of the user and/or the belongings from the image information; and
the output device (2) is arranged to output a greeting associated with the appearance characteristic information to the user.
CN202110650820.5A (filed 2021-06-11, priority date 2021-06-11) User interaction method, interaction system and computer program product for vehicle. Status: Pending. Publication: CN113380252A (en)

Priority Applications (1)

CN202110650820.5A (priority date 2021-06-11, filing date 2021-06-11): User interaction method, interaction system and computer program product for vehicle

Publications (1)

CN113380252A, published 2021-09-10

Family

ID=77573794

Family Applications (1)

CN202110650820.5A (pending), priority date 2021-06-11, filing date 2021-06-11: User interaction method, interaction system and computer program product for vehicle

Country Status (1)

CN: CN113380252A (en)


Legal Events

PB01: Publication